Sample records for sizing analysis tool

  1. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.

  2. "PowerUp"!: A Tool for Calculating Minimum Detectable Effect Sizes and Minimum Required Sample Sizes for Experimental and Quasi-Experimental Design Studies

    ERIC Educational Resources Information Center

    Dong, Nianbo; Maynard, Rebecca

    2013-01-01

    This paper and the accompanying tool are intended to complement existing supports for conducting power analysis by offering a tool based on the framework of Minimum Detectable Effect Sizes (MDES) formulae that can be used in determining sample size requirements and in estimating minimum detectable effect sizes for a range of individual- and…
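
    For a sense of the calculation the tool automates, the sketch below computes the MDES for the simplest case, an individual-level randomized experiment with a covariate explaining R^2 of the outcome variance, using Bloom-style multiplier logic. The function name, defaults, and rough degrees-of-freedom handling are illustrative, not PowerUp!'s exact formulae.

        from scipy.stats import t

        def mdes_individual(n, p_treat=0.5, r2=0.0, alpha=0.05, power=0.80):
            """Sketch of an MDES calculation for a simple individual-level RCT.
            Not PowerUp!'s exact formulae; degrees of freedom are handled roughly."""
            df = n - 2 - (1 if r2 > 0 else 0)
            multiplier = t.ppf(1 - alpha / 2, df) + t.ppf(power, df)  # two-tailed
            se = ((1 - r2) / (p_treat * (1 - p_treat) * n)) ** 0.5
            return multiplier * se

        print(round(mdes_individual(n=400), 3))  # roughly 0.28 SD with no covariates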

  3. Development of the ECLSS Sizing Analysis Tool and ARS Mass Balance Model Using Microsoft Excel

    NASA Technical Reports Server (NTRS)

    McGlothlin, E. P.; Yeh, H. Y.; Lin, C. H.

    1999-01-01

    The development of a Microsoft Excel-compatible Environmental Control and Life Support System (ECLSS) sizing analysis "tool" for conceptual design of Mars human exploration missions makes it possible for a user to choose a certain technology in the corresponding subsystem. This tool estimates the mass, volume, and power requirements of every technology in a subsystem and the system as a whole. Furthermore, to verify that a design sized by the ECLSS Sizing Tool meets the mission requirements and integrates properly, mass balance models that solve for component throughputs of such ECLSS systems as the Water Recovery System (WRS) and Air Revitalization System (ARS) must be developed. The ARS Mass Balance Model will be discussed in this paper.

  4. Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT)

    NASA Technical Reports Server (NTRS)

    Brown, Cheryl B.; Conger, Bruce C.; Miranda, Bruno M.; Bue, Grant C.; Rouen, Michael N.

    2007-01-01

    An effort was initiated by NASA/JSC in 2001 to develop an Extravehicular Activity System Sizing Analysis Tool (EVAS_SAT) for the sizing of Extravehicular Activity System (EVAS) architectures and related studies. Its intent was to support space suit development efforts and to aid in conceptual designs for future human exploration missions. Its basis was the Life Support Options Performance Program (LSOPP), a spacesuit and portable life support system (PLSS) sizing program developed for NASA/JSC circa 1990. EVAS_SAT estimates the mass, power, and volume characteristics for user-defined EVAS architectures, including Suit Systems, Airlock Systems, Tools and Translation Aids, and Vehicle Support equipment. The tool has undergone annual changes and has been updated as new data have become available. Certain sizing algorithms have been developed based on industry standards, while others are based on the LSOPP sizing routines. The sizing algorithms used by EVAS_SAT are preliminary. Because EVAS_SAT was designed for use by members of the EVA community, familiarity with the subsystems and with the interpretation of results is assumed on the part of the intended user group. The current EVAS_SAT is operated within Microsoft Excel 2003 using a Visual Basic interface system.

  5. Association between hospital size and quality improvement for pharmaceutical services.

    PubMed

    Nau, David P; Garber, Mathew C; Lipowski, Earlene E; Stevenson, James G

    2004-01-15

    The relationship between hospital size and quality improvement (QI) for pharmaceutical services was studied. A questionnaire on QI was sent to hospital pharmacy directors in Michigan and Florida in 2002. The questionnaire included items on QI lead-team composition, QI tools, QI training, and QI culture. Usable responses were received from 162 (57%) of 282 pharmacy directors. Pharmacy QI lead teams were present in 57% of institutions, with larger teams in large hospitals (≥300 patients). Only two QI tools were used by a majority of hospitals: root-cause analysis (62%) and flow charts (66%). Small hospitals (<50 patients) were less likely than medium-sized hospitals (50-299 patients) and large hospitals to use several QI tools, including control charts, cause-and-effect diagrams, root-cause analysis, flow charts, and histograms. Large hospitals were more likely than small and medium-sized hospitals to use root-cause analysis and control charts. There was no relationship between hospital size and the frequency with which physician or patient satisfaction with pharmaceutical services was measured. There were no differences in QI training or QI culture across hospital size categories. A survey suggested that a majority of hospital pharmacies in Michigan and Florida have begun to adopt QI techniques but that most are not using rigorous QI tools. Pharmacies in large hospitals had more QI lead-team members and were more likely to use certain QI tools, but there was no relationship between hospital size and satisfaction measurements, QI training, or QI culture.

  6. Performance and Sizing Tool for Quadrotor Biplane Tailsitter UAS

    NASA Astrophysics Data System (ADS)

    Strom, Eric

    The Quadrotor-Biplane-Tailsitter (QBT) configuration is the basis for a mechanically simplistic rotorcraft capable of both long-range, high-speed cruise as well as hovering flight. This work presents the development and validation of a set of preliminary design tools built specifically for this aircraft to enable its further development, including: a QBT weight model, preliminary sizing framework, and vehicle analysis tools. The preliminary sizing tool presented here shows the advantage afforded by QBT designs in missions with aggressive cruise requirements, such as offshore wind turbine inspections, wherein transition from a quadcopter configuration to a QBT allows for a 5:1 trade of battery weight for wing weight. A 3D, unsteady panel method utilizing a nonlinear implementation of the Kutta-Joukowsky condition is also presented as a means of computing aerodynamic interference effects and, through the implementation of rotor, body, and wing geometry generators, is prepared for coupling with a comprehensive rotor analysis package.

  7. In response to 'Can sugars be produced from fatty acids? A test case for pathway analysis tools'.

    PubMed

    Faust, Karoline; Croes, Didier; van Helden, Jacques

    2009-12-01

    In their article entitled 'Can sugars be produced from fatty acids? A test case for pathway analysis tools' de Figueiredo and co-authors assess the performance of three pathway prediction tools (METATOOL, PathFinding and Pathway Hunter Tool) using the synthesis of glucose-6-phosphate (G6P) from acetyl-CoA in humans as a test case. We think that this article is biased for three reasons: (i) the metabolic networks used as input for the respective tools were of very different sizes; (ii) the 'assessment' is restricted to two study cases; (iii) developers are inherently more skilled at using their own tools than those developed by other people. We extended the analyses led by de Figueiredo and clearly show that the apparent superior performance of their tool (METATOOL) is partly due to the differences in input network sizes. We also see a conceptual problem in the comparison of tools that serve different purposes. In our opinion, metabolic path finding and elementary mode analysis are answering different biological questions, and should be considered as complementary rather than competitive approaches. Supplementary data are available at Bioinformatics online.

  8. CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.

    PubMed

    Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi

    2015-10-26

    Orientation and the degree of isotropy are important in many biological systems such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability and reliability of the analyses. Software tools are not readily available for this purpose and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of orientation and also size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy and size of target structures. The analysis can be further tuned to focus on targets of particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool and by comparisons with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant against noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated general good agreement between computational and manual results while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images obtained of different cell types using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment and observed changes in cellular orientation in response to stretching. CytoSpectre, a versatile, easy-to-use software tool for spectral analysis of microscopy images was developed. The tool is compatible with most 2D images and can be used to analyze targets at different scales. We expect the tool to be useful in diverse applications dealing with structures whose orientation and size distributions are of interest. While designed for the biological field, the software could also be useful in non-biological applications.
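
    The core idea, estimating orientation from the power spectrum, can be illustrated in a few lines. The sketch below is a minimal numpy version of that idea only; CytoSpectre itself adds windowing, band-pass separation of cellular and subcellular scales, and robust circular statistics, none of which are reproduced here.

        import numpy as np

        def dominant_orientation(img):
            """Estimate the dominant orientation of elongated structures in a 2D
            grayscale image from its power spectrum (sketch of the spectral idea;
            filtering and robust circular statistics omitted). Returns degrees."""
            img = img - img.mean()                         # remove the DC component
            power = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
            h, w = img.shape
            v, u = np.meshgrid(np.fft.fftshift(np.fft.fftfreq(h)),
                               np.fft.fftshift(np.fft.fftfreq(w)), indexing="ij")
            theta = np.arctan2(v, u)                       # angle of each frequency sample
            # power-weighted mean of the doubled angle (axial data)
            c = np.sum(power * np.cos(2 * theta))
            s = np.sum(power * np.sin(2 * theta))
            freq_angle = 0.5 * np.arctan2(s, c)
            # spectral energy of a striped pattern lies perpendicular to the stripes
            return np.degrees(freq_angle + np.pi / 2) % 180

        stripes = np.sin(2 * np.pi * np.arange(256) / 16)[None, :] * np.ones((256, 1))
        print(dominant_orientation(stripes))               # vertical stripes -> ~90 degrees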

  9. Study of heat generation and cutting force according to minimization of grain size (500 nm to 180 nm) of WC ball endmill using FEM

    NASA Astrophysics Data System (ADS)

    Byeon, J. H.; Ahmed, F.; Ko, T. J.; lee, D. K.; Kim, J. S.

    2018-03-01

    As the industry develops, miniaturization and refinement of products are important issues. Precise machining is required for cutting, which is a typical method of machining a product. The factor determining the workability of the cutting process is the material of the tool. Tool materials include carbon tool steel, alloy tool steel, high-speed steel, cemented carbide, and ceramics. In the case of a carbide material, the smaller the particle size, the better the mechanical properties, with higher hardness, strength, and toughness. The specific heat, density, and thermal diffusivity also change as the particle size of the material becomes finer. In this study, finite element analysis was performed to investigate the change in heat generation and cutting power depending on the physical properties (specific heat, density, thermal diffusivity) of the tool material. The thermal conductivity coefficient was obtained by measuring the thermal diffusivity, specific heat, and density of the finer-grained material (180 nm) and of the conventionally sized material (0.05 μm). The coefficient of thermal conductivity was calculated as 61.33 for the 180 nm class material and 46.13 for the 0.05 μm class material. As a result of finite element analysis using these values, the average cutting temperature of the finer-grained material (180 nm) was 532.75 °C and that of the existing material (0.05 μm) was 572.75 °C. Cutting power was also compared, but the difference was not significant. Therefore, if the thermal conductivity is increased through particle refinement, the surface quality can be improved and tool life prolonged by lowering the temperature generated in the tool during machining, without greatly affecting the cutting power.
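
    The conductivity values quoted above follow from the measured quantities through the standard relation (units are not stated in the abstract; in SI the result is in W m^-1 K^-1):

        k = \alpha \, \rho \, c_p

    where \alpha is the thermal diffusivity, \rho the density, and c_p the specific heat; this is how the 61.33 and 46.13 values for the two grades were obtained from the measured properties and then used as input to the finite element model.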

  10. Dust control effectiveness of drywall sanding tools.

    PubMed

    Young-Corbett, Deborah E; Nussbaum, Maury A

    2009-07-01

    In this laboratory study, four drywall sanding tools were evaluated in terms of dust generation rates in the respirable and thoracic size classes. In a repeated measures study design, 16 participants performed simulated drywall finishing tasks with each of four tools: (1) ventilated sander, (2) pole sander, (3) block sander, and (4) wet sponge. Dependent variables of interest were thoracic and respirable breathing zone dust concentrations. Analysis by Friedman's Test revealed that the ventilated drywall sanding tool produced significantly less dust, of both size classes, than did the other three tools. The pole and wet sanders produced significantly less dust of both size classes than did the block sander. The block sander, the most commonly used tool in drywall finishing operations, produced significantly more dust of both size classes than did the other three tools. When compared with the block sander, the other tools offer substantial dust reduction. The ventilated tool reduced respirable concentrations by 88% and thoracic concentrations by 85%. The pole sander reduced respirable concentrations by 58% and thoracic by 50%. The wet sander produced reductions of 60% and 47% in the respirable and thoracic classes, respectively. Wet sponge sanders and pole sanders are effective at reducing breathing-zone dust concentrations; however, based on its superior dust control effectiveness, the ventilated sander is the recommended tool for drywall finishing operations.

  11. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

    VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL also includes two novel features: the first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA as the sample size increases for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
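
    The variogram concept at the heart of VARS can be shown with a toy directional variogram. The sketch below is only that underlying concept, not VARS-TOOL's star-based sampling, IVARS integration, PLHS, or bootstrapping, and the test model and sample sizes are illustrative.

        import numpy as np

        def directional_variogram(f, dim, i, h_values, n=2000, seed=0):
            """Sketch of a variogram-style sensitivity measure for parameter i of a
            model f on the unit hypercube:
                gamma_i(h) = 0.5 * E[(f(x + h*e_i) - f(x))^2]
            Only the underlying concept of VARS, not the VARS-TOOL algorithm."""
            rng = np.random.default_rng(seed)
            gammas = []
            for h in h_values:
                x = rng.random((n, dim))
                x[:, i] = rng.uniform(0.0, 1.0 - h, n)   # keep x_i + h inside [0, 1]
                x_shift = x.copy()
                x_shift[:, i] += h
                gammas.append(0.5 * np.mean((f(x_shift) - f(x)) ** 2))
            return np.array(gammas)

        # Toy model: the second parameter has a stronger (nonlinear) effect.
        model = lambda x: x[:, 0] + 5.0 * x[:, 1] ** 2
        for i in range(2):
            print(i, directional_variogram(model, dim=2, i=i, h_values=[0.1, 0.3, 0.5]))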

  12. Simultaneous Aerodynamic and Structural Design Optimization (SASDO) for a 3-D Wing

    NASA Technical Reports Server (NTRS)

    Gumbert, Clyde R.; Hou, Gene J.-W.; Newman, Perry A.

    2001-01-01

    The formulation and implementation of an optimization method called Simultaneous Aerodynamic and Structural Design Optimization (SASDO) is shown as an extension of the Simultaneous Aerodynamic Analysis and Design Optimization (SAADO) method. It is extended by the inclusion of structure element sizing parameters as design variables and Finite Element Method (FEM) analysis responses as constraints. The method aims to reduce the computational expense incurred in performing shape and sizing optimization using state-of-the-art Computational Fluid Dynamics (CFD) flow analysis, FEM structural analysis, and sensitivity analysis tools. SASDO is applied to a simple, isolated, 3-D wing in inviscid flow. Results show that the method finds the same local optimum as a conventional optimization method, with some reduction in the computational cost and without significant modifications to the analysis tools.

  13. A probabilistic sizing tool and Monte Carlo analysis for entry vehicle ablative thermal protection systems

    NASA Astrophysics Data System (ADS)

    Mazzaracchio, Antonio; Marchetti, Mario

    2010-03-01

    Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool, which uses the Monte Carlo technique, allows a simulation to run over a stochastic series. This performs an uncertainty and sensitivity analysis, which estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems. They are proposed as an alternative to traditional approaches, such as the Root-Sum-Square method. The developed tool was verified by comparing the results with those from previous work on thermal protection system probabilistic sizing methodologies, which are based on an industry-standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins for sizing heat shields currently proposed for vehicles using rigid aeroshells for future aerocapture missions at Neptune, and to identify the major sources of uncertainty in the material response.
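
    The Monte Carlo sizing pattern described above can be sketched with a deliberately crude surrogate in place of the implicit ablation and thermal-response model: sample the uncertain inputs, compute a bondline temperature, and grow the thickness until the probability of staying under the limit meets a target. All distributions, property values, and the adiabatic-soak surrogate below are illustrative assumptions, not the paper's models.

        import numpy as np

        rng = np.random.default_rng(1)

        def bondline_temp(thickness, heat_load, rho, cp, t_init=300.0):
            """Very crude surrogate: the absorbed heat load soaks adiabatically into
            the insulation layer (lumped capacitance). The paper uses a full implicit
            ablation/thermal-response model; this only shows the Monte Carlo pattern."""
            return t_init + heat_load / (rho * cp * thickness)

        def prob_within_limit(thickness, t_limit=560.0, n=100_000):
            # Assumed (illustrative) uncertainty distributions for the inputs.
            heat_load = rng.normal(4.0e6, 5.0e5, n)      # J/m^2 absorbed heat load
            rho = rng.normal(280.0, 15.0, n)             # kg/m^3 insulation density
            cp = rng.normal(1200.0, 80.0, n)             # J/(kg K) specific heat
            return np.mean(bondline_temp(thickness, heat_load, rho, cp) < t_limit)

        # Walk the thickness up until the bondline requirement is met with ~3-sigma
        # probability, instead of stacking worst-case (Root-Sum-Square style) margins.
        for thickness in np.arange(0.03, 0.12, 0.01):
            p = prob_within_limit(thickness)
            print(f"{thickness:.2f} m -> P(T_bond < limit) = {p:.4f}")
            if p >= 0.9987:
                break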

  14. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.

  15. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932
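
    The pooling that Meta-Essentials performs by default can be sketched directly from the abstract's description: DerSimonian-Laird estimation of the between-study variance with a Knapp-Hartung adjusted confidence interval. The snippet below is a minimal version of that combination with made-up study data; it is not Meta-Essentials' spreadsheet implementation.

        import numpy as np
        from scipy.stats import t

        def random_effects_meta(effects, variances, alpha=0.05):
            """DerSimonian-Laird random-effects pooling with the Knapp-Hartung
            adjustment for the confidence interval of the overall effect."""
            y = np.asarray(effects, float)
            v = np.asarray(variances, float)
            k = len(y)
            w = 1.0 / v                                   # fixed-effect weights
            mu_fe = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - mu_fe) ** 2)              # Cochran's Q
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - (k - 1)) / c)            # DL between-study variance
            w_re = 1.0 / (v + tau2)
            mu = np.sum(w_re * y) / np.sum(w_re)
            # Knapp-Hartung variance estimate and t-based confidence interval
            var_kh = np.sum(w_re * (y - mu) ** 2) / ((k - 1) * np.sum(w_re))
            half = t.ppf(1 - alpha / 2, k - 1) * np.sqrt(var_kh)
            return mu, tau2, (mu - half, mu + half)

        # Hypothetical study-level effect sizes (e.g. Hedges' g) and their variances.
        print(random_effects_meta([0.30, 0.55, 0.12, 0.40], [0.04, 0.02, 0.05, 0.03]))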

  16. Unleashing the Power of Distributed CPU/GPU Architectures: Massive Astronomical Data Analysis and Visualization Case Study

    NASA Astrophysics Data System (ADS)

    Hassan, A. H.; Fluke, C. J.; Barnes, D. G.

    2012-09-01

    Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in dataset size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g. calculating a histogram or computing data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such dataset sizes, which exceed today's single machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with a goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure, handling both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a “software as a service” manner will reduce the total cost of ownership, provide an easy to use tool to the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure.
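
    The histogram example above comes down to computing partial results per data chunk and merging them, which is the general pattern such a framework distributes across GPU and CPU nodes. A minimal single-machine sketch of that merge pattern follows; the chunk generator stands in for data streamed from disk or a large spectral cube.

        import numpy as np

        def chunked_histogram(chunks, bins):
            """Accumulate a histogram (and running min/max) over data that never fits
            in memory at once, by merging each chunk's partial result into the total."""
            total = np.zeros(len(bins) - 1, dtype=np.int64)
            dmin, dmax = np.inf, -np.inf
            for chunk in chunks:
                counts, _ = np.histogram(chunk, bins=bins)
                total += counts
                dmin, dmax = min(dmin, chunk.min()), max(dmax, chunk.max())
            return total, dmin, dmax

        # Stand-in for chunks streamed from disk or a data cube.
        rng = np.random.default_rng(0)
        chunks = (rng.normal(size=1_000_000) for _ in range(10))
        bins = np.linspace(-5, 5, 101)
        counts, dmin, dmax = chunked_histogram(chunks, bins)
        print(counts.sum(), dmin, dmax)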

  17. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    NASA Technical Reports Server (NTRS)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  18. CAT 2 - An improved version of Cryogenic Analysis Tools for online and offline monitoring and analysis of large size cryostats

    NASA Astrophysics Data System (ADS)

    Pagliarone, C. E.; Uttaro, S.; Cappelli, L.; Fallone, M.; Kartal, S.

    2017-02-01

    CAT, Cryogenic Analysis Tools, is a software package developed using the LabVIEW and ROOT environments to analyze the performance of large size cryostats, where many parameters, input, and control variables need to be acquired and studied at the same time. The present paper describes how CAT works and the main improvements achieved in the new version, CAT 2. New Graphical User Interfaces have been developed to make the full package more user-friendly, and a process of resource optimization has been carried out. The offline analysis of the full cryostat performance is available both through the ROOT command-line interface and through the new graphical interfaces.

  19. Using Microsoft PowerPoint as an Astronomical Image Analysis Tool

    NASA Astrophysics Data System (ADS)

    Beck-Winchatz, Bernhard

    2006-12-01

    Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following:
    - Variations in the distances to the sun and moon from their angular sizes
    - Magnetic declination from images of shadows
    - Diameter of the moon from lunar eclipse images
    - Sizes of lunar craters
    - Orbital radii of the Jovian moons and mass of Jupiter
    - Supernova and comet searches
    - Expansion rate of the universe from images of distant galaxies
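
    Most of the distance and size items in the list rest on the small-angle approximation, with the angular size \theta measured on the image in pixels and converted via a known plate scale or reference object:

        D \approx d \, \theta \qquad (\theta \text{ in radians})

    where D is the physical size and d the distance; for a fixed D such as the Moon's diameter, the ratio of angular sizes measured at two epochs gives the inverse ratio of the distances directly.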

  20. Tool for Sizing Analysis of the Advanced Life Support System

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsie Jannivine; Brown, Cheryl B.; Jeng, Frank J.

    2005-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) is a computer model for sizing and analyzing designs of environmental-control and life support systems (ECLSS) for spacecraft and surface habitats involved in the exploration of Mars and the Moon. It performs conceptual designs of advanced life support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water, and process wastes, in order to reduce the need for resource resupply. By assuming steady-state operations, ALSSAT is a means of investigating combinations of such subsystem technologies and thereby assisting in determining the most cost-effective technology combination available. In fact, ALSSAT can perform sizing analysis of ALS subsystems that operate either dynamically or at steady state. Using the Microsoft Excel spreadsheet software with the Visual Basic programming language, ALSSAT has been developed to perform multiple-case trade studies based on the calculated ECLSS mass, volume, power, and Equivalent System Mass, as well as parametric studies by varying the input parameters. ALSSAT's modular format is specifically designed for ease of future maintenance and upgrades.
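
    Equivalent System Mass (ESM), the trade metric mentioned above, is conventionally the hardware mass plus infrastructure-equivalent penalties for volume, power, cooling, and crew time. The sketch below shows that usual form only; the equivalency factors and subsystem numbers are placeholders, not ALSSAT's internal values.

        def equivalent_system_mass(mass_kg, volume_m3, power_kw, cooling_kw,
                                   crewtime_hr_per_yr, duration_yr,
                                   veq=66.7, peq=237.0, ceq=60.0, cteq=1.0):
            """Equivalent System Mass in the usual ALS form:
                ESM = M + V*Veq + P*Peq + C*Ceq + CT*D*CTeq
            The equivalency factors (kg/m^3, kg/kW, kg/(crew-hr)) are mission
            dependent; the defaults here are placeholders, not ALSSAT's values."""
            return (mass_kg
                    + volume_m3 * veq
                    + power_kw * peq
                    + cooling_kw * ceq
                    + crewtime_hr_per_yr * duration_yr * cteq)

        # Illustrative comparison of two candidate water-recovery technologies.
        print(equivalent_system_mass(450, 1.2, 0.9, 0.9, 40, 1.5))
        print(equivalent_system_mass(380, 1.8, 1.4, 1.4, 25, 1.5))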

  1. Multi-Scale Sizing of Lightweight Multifunctional Spacecraft Structural Components

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.

    2005-01-01

    This document is the final report for the project entitled, "Multi-Scale Sizing of Lightweight Multifunctional Spacecraft Structural Components," funded under the NRA entitled "Cross-Enterprise Technology Development Program" issued by the NASA Office of Space Science in 2000. The project was funded in 2001, and spanned a four year period from March, 2001 to February, 2005. Through enhancements to and synthesis of unique, state of the art structural mechanics and micromechanics analysis software, a new multi-scale tool has been developed that enables design, analysis, and sizing of advance lightweight composite and smart materials and structures from the full vehicle, to the stiffened structure, to the micro (fiber and matrix) scales. The new software tool has broad, cross-cutting value to current and future NASA missions that will rely on advanced composite and smart materials and structures.

  2. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students. © 2017 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.

  3. TMA Vessel Segmentation Based on Color and Morphological Features: Application to Angiogenesis Research

    PubMed Central

    Fernández-Carrobles, M. Milagro; Tadeo, Irene; Bueno, Gloria; Noguera, Rosa; Déniz, Oscar; Salido, Jesús; García-Rojo, Marcial

    2013-01-01

    Given that angiogenesis and lymphangiogenesis are strongly related to prognosis in neoplastic and other pathologies, and that many methods exist that provide different results, we aim to construct a morphometric tool allowing us to measure different aspects of the shape and size of vascular vessels in a complete and accurate way. The tool presented is based on vessel closing, which is an essential property for properly characterizing the size and shape of vascular and lymphatic vessels. The method is fast and accurate, improving on existing tools for angiogenesis analysis. The tool also improves the accuracy of vascular density measurements, since the set of endothelial cells forming a vessel is considered as a single object. PMID:24489494

  4. A new tool called DISSECT for analysing large genomic data sets using a Big Data approach

    PubMed Central

    Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert

    2015-01-01

    Large-scale genetic and genomic data are increasingly available, and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex traits analysis, we present DISSECT. DISSECT is new, freely available software that exploits the distributed-memory parallel computational architectures of compute clusters to perform a wide range of genomic and epidemiologic analyses, which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010
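
    The mixed-linear-model analysis referred to above has, in its standard genomic (GBLUP-type) form, the structure below. DISSECT's contribution is making the fitting tractable at this sample size, not the model itself, and the parameterization shown is the generic one rather than necessarily DISSECT's exact formulation:

        y = X\beta + Zu + e, \qquad u \sim N(0, \sigma_u^2 G), \qquad e \sim N(0, \sigma_e^2 I)

    where y is the vector of phenotypes, \beta the fixed effects, u the random genetic effects with genomic relationship matrix G computed from the SNP genotypes, and phenotype prediction for new individuals follows from the estimated \hat{u}.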

  5. U.S. Geological Survey ArcMap Sediment Classification tool

    USGS Publications Warehouse

    O'Malley, John

    2007-01-01

    The U.S. Geological Survey (USGS) ArcMap Sediment Classification tool is a custom toolbar that extends the Environmental Systems Research Institute, Inc. (ESRI) ArcGIS 9.2 Desktop application to aid in the analysis of seabed sediment classification. The tool uses as input either a point data layer with field attributes containing the percentage of gravel, sand, silt, and clay, or four raster data layers representing a percentage of sediment (0-100%) for the various grain-size fractions: sand, gravel, silt, and clay. This tool is designed to analyze the percent of sediment at a given location and classify the sediments according to either the Folk (1954, 1974) or the Shepard (1954) classification scheme as modified by Schlee (1973). The sediment analysis tool is based upon the USGS SEDCLASS program (Poppe et al., 2004).
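
    The rule-based idea behind such a classifier can be sketched as below. The thresholds follow only the broad outline of a Shepard-style ternary scheme (one component dominant at 75% or more); they are not the actual Folk or Shepard/Schlee decision logic implemented in SEDCLASS, and gravel is ignored for brevity.

        def simple_sediment_class(sand, silt, clay):
            """Toy ternary classifier for sand/silt/clay percentages.
            Only an illustration of the ternary-diagram idea; the real Folk and
            Shepard(-Schlee) schemes have more classes and handle gravel separately."""
            total = sand + silt + clay
            sand, silt, clay = (100.0 * x / total for x in (sand, silt, clay))
            if sand >= 75:
                return "sand"
            if silt >= 75:
                return "silt"
            if clay >= 75:
                return "clay"
            # no single dominant component: name by the two largest fractions
            adjective = {"sand": "sandy", "silt": "silty", "clay": "clayey"}
            parts = sorted([("sand", sand), ("silt", silt), ("clay", clay)],
                           key=lambda p: p[1], reverse=True)
            return f"{adjective[parts[1][0]]} {parts[0][0]}"

        print(simple_sediment_class(82, 10, 8))   # -> sand
        print(simple_sediment_class(40, 45, 15))  # -> sandy silt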

  6. Effect of High-Frequency Transcranial Magnetic Stimulation on Craving in Substance Use Disorder: A Meta-Analysis.

    PubMed

    Maiti, Rituparna; Mishra, Biswa Ranjan; Hota, Debasish

    2017-01-01

    Repetitive transcranial magnetic stimulation (rTMS), a noninvasive, neuromodulatory tool, has been used to reduce craving in different substance use disorders. There are some studies that have reported conflicting and inconclusive results; therefore, this meta-analysis was conducted to evaluate the effect of high-frequency rTMS on craving in substance use disorder and to investigate the reasons behind the inconsistency across the studies. The authors searched clinical trials from MEDLINE, Cochrane databases, and International Clinical Trials Registry Platform. The PRISMA guidelines, as well as recommended meta-analysis practices, were followed in the selection process, analysis, and reporting of the findings. The effect estimate used was the standardized mean difference (Hedge's g), and heterogeneity across the considered studies was explored using subgroup analyses. The quality assessment was done using the Cochrane risk of bias tool, and sensitivity analysis was performed to check the influences on effect size by statistical models. After screening and assessment of eligibility, finally 10 studies were included for meta-analysis, which includes six studies on alcohol and four studies on nicotine use disorder. The random-model analysis revealed a pooled effect size of 0.75 (95% CI=0.29 to 1.21, p=0.001), whereas the fixed-model analysis showed a large effect size of 0.87 (95% CI=0.63 to 1.12, p<0.00001). Subgroup analysis for alcohol use disorder showed an effect size of -0.06 (95% CI=-0.89 to 0.77, p=0.88). In the case of nicotine use disorder, random-model analysis revealed an effect size of 1.00 (95% CI=0.48 to 1.55, p=0.0001), whereas fixed-model analysis also showed a large effect size of 0.96 (95% CI=0.71 to 1.22). The present meta-analysis identified a beneficial effect of high-frequency rTMS on craving associated with nicotine use disorder but not alcohol use disorder.

  7. Hybrid Wing Body Planform Design with Vehicle Sketch Pad

    NASA Technical Reports Server (NTRS)

    Wells, Douglas P.; Olson, Erik D.

    2011-01-01

    The objective of this paper was to provide an update on NASA's current tools for design and analysis of hybrid wing body (HWB) aircraft, with an emphasis on Vehicle Sketch Pad (VSP). NASA started HWB analysis using the Flight Optimization System (FLOPS). That capability is enhanced using Phoenix Integration's ModelCenter®. ModelCenter enables multifidelity analysis tools to be linked as an integrated structure. Two major components are linked to FLOPS as an example: a planform discretization tool and VSP. The planform discretization tool ensures the planform is smooth and continuous. VSP is used to display the output geometry. This example shows that a smooth and continuous HWB planform can be displayed as a three-dimensional model and rapidly sized and analyzed.

  8. ALSSAT Version 6.0

    NASA Technical Reports Server (NTRS)

    Yeh, Hue-Hsia; Brown, Cheryl; Jeng, Frank

    2012-01-01

    Advanced Life Support Sizing Analysis Tool (ALSSAT) at the time of this reporting has been updated to version 6.0. A previous version was described in Tool for Sizing Analysis of the Advanced Life Support System (MSC-23506), NASA Tech Briefs, Vol. 29, No. 12 (December 2005), page 43. To recapitulate: ALSSAT is a computer program for sizing and analyzing designs of environmental-control and life-support systems for spacecraft and surface habitats to be involved in exploration of Mars and the Moon. Of particular interest for analysis by ALSSAT are conceptual designs of advanced life-support (ALS) subsystems that utilize physicochemical and biological processes to recycle air and water and process human wastes to reduce the need for resource resupply. ALSSAT is a means of investigating combinations of such subsystem technologies featuring various alternative conceptual designs and thereby assisting in determining which combination is most cost-effective. ALSSAT version 6.0 has been improved over previous versions in several respects, including the following additions: an interface for reading sizing data from an ALS database, computational models of redundant regenerative CO2 and Moisture Removal Amine Swing Beds (CAMRAS) for CO2 removal, an upgrade of the Temperature & Humidity Control subsystem's Common Cabin Air Assembly to a detailed sizing model, and an upgrade of the Food-management subsystem.

  9. Integrating Transportation Modeling and Desktop GIS: A Practical and Affordable Analysis Tool for Small and Medium Sized Communities

    DOT National Transportation Integrated Search

    1998-09-16

    This paper and presentation discuss some of the benefits of integrating travel demand models and desktop GIS (ArcInfo and ArcView for PCs) as a cost-effective and staff-saving tool, as well as specific improvements to transportation planning m...

  10. SlideJ: An ImageJ plugin for automated processing of whole slide images.

    PubMed

    Della Mea, Vincenzo; Baroni, Giulia L; Pilutti, David; Di Loreto, Carla

    2017-01-01

    The digital slide, or Whole Slide Image, is a digital image, acquired with specific scanners, that represents a complete tissue sample or cytological specimen at the microscopic level. While Whole Slide Image analysis is recognized as among the most interesting opportunities, the typical size of such images (up to gigapixels) can be very demanding in terms of memory requirements. Thus, while algorithms and tools for processing and analysis of single microscopic field images are available, Whole Slide Image size makes the direct use of such tools prohibitive or impossible. In this work a plugin for ImageJ, named SlideJ, is proposed with the objective of seamlessly extending the application of image analysis algorithms implemented in ImageJ for single microscopic field images to whole digital slide analysis. The plugin has been complemented by examples of macros in the ImageJ scripting language to demonstrate its use in concrete situations.
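
    SlideJ itself is an ImageJ (Java) plugin, but the tiling strategy it implements can be sketched in a few lines of Python using the OpenSlide bindings: walk the slide tile by tile so only one tile is ever in memory, and apply the per-tile analysis of interest. The tile size, pyramid level, file name, and the mean-intensity "analysis" below are illustrative.

        import numpy as np
        import openslide   # assumes the OpenSlide Python bindings are installed

        def analyse_slide(path, tile=2048, level=0):
            """Apply a per-tile analysis to a whole-slide image without loading it
            entirely into memory. Here the 'analysis' is just a per-tile mean
            intensity; any single-field algorithm could be dropped in instead."""
            slide = openslide.OpenSlide(path)
            width, height = slide.level_dimensions[level]
            results = []
            for y in range(0, height, tile):
                for x in range(0, width, tile):
                    w, h = min(tile, width - x), min(tile, height - y)
                    region = slide.read_region((x, y), level, (w, h)).convert("L")
                    results.append(((x, y), float(np.asarray(region).mean())))
            slide.close()
            return results

        # e.g. analyse_slide("specimen.svs") -> [((0, 0), 212.4), ((2048, 0), 198.7), ...]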

  11. SlideJ: An ImageJ plugin for automated processing of whole slide images

    PubMed Central

    Baroni, Giulia L.; Pilutti, David; Di Loreto, Carla

    2017-01-01

    The digital slide, or Whole Slide Image, is a digital image, acquired with specific scanners, that represents a complete tissue sample or cytological specimen at the microscopic level. While Whole Slide Image analysis is recognized as among the most interesting opportunities, the typical size of such images (up to gigapixels) can be very demanding in terms of memory requirements. Thus, while algorithms and tools for processing and analysis of single microscopic field images are available, Whole Slide Image size makes the direct use of such tools prohibitive or impossible. In this work a plugin for ImageJ, named SlideJ, is proposed with the objective of seamlessly extending the application of image analysis algorithms implemented in ImageJ for single microscopic field images to whole digital slide analysis. The plugin has been complemented by examples of macros in the ImageJ scripting language to demonstrate its use in concrete situations. PMID:28683129

  12. Dietary assessment in minority ethnic groups: a systematic review of instruments for portion-size estimation in the United Kingdom

    PubMed Central

    Almiron-Roig, Eva; Aitken, Amanda; Galloway, Catherine

    2017-01-01

    Context: Dietary assessment in minority ethnic groups is critical for surveillance programs and for implementing effective interventions. A major challenge is the accurate estimation of portion sizes for traditional foods and dishes. Objective: The aim of this systematic review was to assess records published up to 2014 describing a portion-size estimation element (PSEE) applicable to the dietary assessment of UK-residing ethnic minorities. Data sources, selection, and extraction: Electronic databases, internet sites, and theses repositories were searched, generating 5683 titles, from which 57 eligible full-text records were reviewed. Data analysis: Forty-two publications about minority ethnic groups (n = 20) or autochthonous populations (n = 22) were included. The most common PSEEs (47%) were combination tools (eg, food models and portion-size lists), followed by portion-size lists in questionnaires/guides (19%) and image-based and volumetric tools (17% each). Only 17% of PSEEs had been validated against weighed data. Conclusions: When developing ethnic-specific dietary assessment tools, it is important to consider customary portion sizes by sex and age, traditional household utensil usage, and population literacy levels. Combining multiple PSEEs may increase accuracy, but such methods require validation. PMID:28340101

  13. Laser Surface Modification of H13 Die Steel using Different Laser Spot Sizes

    NASA Astrophysics Data System (ADS)

    Aqida, S. N.; Naher, S.; Brabazon, D.

    2011-05-01

    This paper presents a laser surface modification process for AISI H13 tool steel using three laser spot sizes, with the aim of achieving reduced grain size and surface roughness. A Rofin DC-015 diffusion-cooled CO2 slab laser was used to process AISI H13 tool steel samples. Samples of 10 mm diameter were sectioned to 100 mm length in order to process a predefined circumferential area. The parameters selected for examination were laser peak power, overlap percentage and pulse repetition frequency (PRF). A metallographic study and image analysis were carried out to measure the grain size, and the modified surface roughness was measured using a two-dimensional surface profilometer. From the metallographic study, the smallest grain sizes measured on the laser-modified surface were between 0.51 μm and 2.54 μm. The minimum surface roughness, Ra, recorded was 3.0 μm. This surface roughness of the modified die steel is similar to the surface quality of cast products. The correlation between grain size and hardness followed the Hall-Petch relationship. The potential increase in surface hardness represents an important means of sustaining tooling life.
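
    The Hall-Petch relationship referred to above, in its usual form (d is the mean grain diameter, \sigma_0 and k_y are material constants):

        \sigma_y = \sigma_0 + k_y \, d^{-1/2}

    so yield strength, and correspondingly hardness, increases as the grain size d decreases, which is consistent with the refined 0.51-2.54 μm grains measured on the laser-modified surface.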

  14. A thin-plate spline analysis of the face and tongue in obstructive sleep apnea patients.

    PubMed

    Pae, E K; Lowe, A A; Fleetham, J A

    1997-12-01

    The shape characteristics of the face and tongue in obstructive sleep apnea (OSA) patients were investigated using thin-plate (TP) splines. A relatively new analytic tool, the TP spline method provides a means of size normalization and image analysis. When shape is one's main concern, the various sizes of a biologic structure may be a source of statistical noise. More seriously, a strong size effect could mask underlying, actual attributes of the disease. A set of size-normalized data in the form of coordinates was generated from cephalograms of 80 male subjects. The TP spline method envisioned the differences in the shape of the face and tongue between OSA patients and nonapneic subjects, and those between the upright and supine body positions. In accordance with OSA severity, the hyoid bone and the submental region were positioned inferiorly and the fourth vertebra was relocated posteriorly with respect to the mandible. This caused a fanlike configuration of the lower part of the face and neck in the sagittal plane in both upright and supine body positions. TP splines revealed tongue deformations caused by a body position change. Overall, the new morphometric tool adopted here was found to be viable for the analysis of morphologic changes.

  15. Planetarium instructional efficacy: A research synthesis

    NASA Astrophysics Data System (ADS)

    Brazell, Bruce D.

    The purpose of the current study was to explore the instructional effectiveness of the planetarium in astronomy education using meta-analysis. A review of the literature revealed 46 studies related to planetarium efficacy. However, only 19 of the studies satisfied selection criteria for inclusion in the meta-analysis. Selected studies were then subjected to coding procedures, which extracted information such as subject characteristics, experimental design, and outcome measures. From these data, 24 effect sizes were calculated in the area of student achievement and five effect sizes were determined in the area of student attitudes using reported statistical information. Mean effect sizes were calculated for both the achievement and the attitude distributions. Additionally, each effect size distribution was subjected to homogeneity analysis. The attitude distribution was found to be homogeneous with a mean effect size of -0.09, which was not significant, p = .2535. The achievement distribution was found to be heterogeneous with a statistically significant mean effect size of +0.28, p < .05. Since the achievement distribution was heterogeneous, the analog to the ANOVA procedure was employed to explore variability in this distribution in terms of the coded variables. The analog to the ANOVA procedure revealed that the variability introduced by the coded variables did not fully explain the variability in the achievement distribution beyond subject-level sampling error under a fixed effects model. Therefore, a random effects model analysis was performed which resulted in a mean effect size of +0.18, which was not significant, p = .2363. However, a large random effect variance component was determined indicating that the differences between studies were systematic and yet to be revealed. The findings of this meta-analysis showed that the planetarium has been an effective instructional tool in astronomy education in terms of student achievement. However, the meta-analysis revealed that the planetarium has not been a very effective tool for improving student attitudes towards astronomy.

  16. Computer-assisted adjuncts for aneurysmal morphologic assessment: toward more precise and accurate approaches

    NASA Astrophysics Data System (ADS)

    Rajabzadeh-Oghaz, Hamidreza; Varble, Nicole; Davies, Jason M.; Mowla, Ashkan; Shakir, Hakeem J.; Sonig, Ashish; Shallwani, Hussain; Snyder, Kenneth V.; Levy, Elad I.; Siddiqui, Adnan H.; Meng, Hui

    2017-03-01

    Neurosurgeons currently base most of their treatment decisions for intracranial aneurysms (IAs) on morphological measurements made manually from 2D angiographic images. These measurements tend to be inaccurate because 2D measurements cannot capture the complex geometry of IAs and because manual measurements are variable depending on the clinician's experience and opinion. Incorrect morphological measurements may lead to inappropriate treatment strategies. In order to improve the accuracy and consistency of morphological analysis of IAs, we have developed an image-based computational tool, AView. In this study, we quantified the accuracy of computer-assisted adjuncts of AView for aneurysmal morphologic assessment by performing measurement on spheres of known size and anatomical IA models. AView has an average morphological error of 0.56% in size and 2.1% in volume measurement. We also investigate the clinical utility of this tool on a retrospective clinical dataset and compare size and neck diameter measurement between 2D manual and 3D computer-assisted measurement. The average error was 22% and 30% in the manual measurement of size and aneurysm neck diameter, respectively. Inaccuracies due to manual measurements could therefore lead to wrong treatment decisions in 44% and inappropriate treatment strategies in 33% of the IAs. Furthermore, computer-assisted analysis of IAs improves the consistency in measurement among clinicians by 62% in size and 82% in neck diameter measurement. We conclude that AView dramatically improves accuracy for morphological analysis. These results illustrate the necessity of a computer-assisted approach for the morphological analysis of IAs.

  17. Federal metering data analysis needs and existing tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henderson, Jordan W.; Fowler, Kimberly M.

    2015-07-01

    Agencies have been working to improve their metering data collection, management, and analysis efforts over the last decade (since EPAct 2005) and will continue to address these challenges as new requirements and data needs come into place. Unfortunately there is no “one-size-fits-all” solution. As agencies continue to expand their capabilities to use metered consumption data to reduce resource use and improve operations, the hope is that shared knowledge will empower others to follow suit. This paper discusses Federal metering data analysis needs and some existing tools.

  18. Small Launch Vehicle Trade Space Definition: Development of a Zero Level Mass Estimation Tool with Trajectory Validation

    NASA Technical Reports Server (NTRS)

    Waters, Eric D.

    2013-01-01

    Recent high-level interest in the capability of small launch vehicles has placed significant demand on determining the trade space these vehicles occupy. This has led to the development of a zero-level analysis tool that can quickly determine the minimum expected vehicle gross liftoff weight (GLOW) in terms of vehicle stage specific impulse (Isp) and propellant mass fraction (pmf) for any given payload value. Drawing on extensive Earth-to-orbit trajectory experience, the total delta-v the vehicle must achieve can be estimated, including relevant loss terms. This foresight into expected losses allows for more specific assumptions relating to the initial estimates of thrust-to-weight values for each stage. The tool was further validated against a trajectory model, in this case the Program to Optimize Simulated Trajectories (POST), to determine whether the initial sizing delta-v was adequate to meet payload expectations. Presented here is a description of how the tool is set up and the approach the analyst must take when using it. Expected outputs, which depend on the type of small launch vehicle being sized, are also displayed. The method of validation is discussed, as well as where the sizing tool fits into the vehicle design process.
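
    The kind of closed-form, zero-level relation such a tool sweeps can be written down for a single stage directly from the rocket equation and the definition of propellant mass fraction. The sketch below does exactly that for an assumed upper-stage case; the payload, delta-v, Isp, and pmf values are illustrative, and the real tool adds staging, loss estimation, and thrust-to-weight assumptions.

        import math

        G0 = 9.80665  # m/s^2

        def stage_glow(payload_kg, dv_ms, isp_s, pmf):
            """Zero-level GLOW estimate for a single stage from the rocket equation:
                mass ratio   MR      = exp(dv / (g0 * Isp))
                propellant   m_p     = m0 * (1 - 1/MR)
                inert (dry)  m_inert = m_p * (1 - pmf) / pmf
                closure      m0      = MR * payload / (1 - (MR - 1) * (1 - pmf) / pmf)
            A sketch of the closed-form relation a zero-level tool sweeps over Isp
            and pmf; not the actual tool, which adds losses, staging and T/W."""
            mr = math.exp(dv_ms / (G0 * isp_s))
            denom = 1.0 - (mr - 1.0) * (1.0 - pmf) / pmf
            if denom <= 0.0:
                return float("inf")        # this Isp/pmf combination cannot close
            return mr * payload_kg / denom

        # Sweep a small trade space for a 20 kg payload stage needing ~4.7 km/s.
        for isp in (280.0, 310.0):
            for pmf in (0.85, 0.90):
                print(isp, pmf, round(stage_glow(20.0, 4700.0, isp, pmf), 1))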

  19. Improvements in Thermal Protection Sizing Capabilities for TCAT: Conceptual Design for Advanced Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Olds, John R.; Izon, Stephen James

    2002-01-01

    The Thermal Calculation Analysis Tool (TCAT), originally developed for the Space Systems Design Lab at the Georgia Institute of Technology, is a conceptual design tool capable of integrating aeroheating analysis into conceptual reusable launch vehicle design. It provides Thermal Protection System (TPS) unit thicknesses and acreage percentages based on the geometry of the vehicle and a reference trajectory to be used in calculation of the total cost and weight of the vehicle design. TCAT has proven to be reasonably accurate at calculating the TPS unit weights for in-flight trajectories; however, it does not have the capability of sizing TPS materials above cryogenic fuel tanks for ground hold operations. During ground hold operations, the vehicle is held for a brief period (generally about two hours) during which heat transfer from the TPS materials to the cryogenic fuel occurs. If too much heat is extracted from the TPS material, the surface temperature may fall below the freezing point of water, thereby freezing any condensation that may be present at the surface of the TPS. Condensation or ice on the surface of the vehicle is potentially hazardous to the mission and can also damage the TPS. It is questionable whether or not the TPS thicknesses provided by the aeroheating analysis would be sufficiently thick to insulate the surface of the TPS from the heat transfer to the fuel. Therefore, a design tool has been developed that is capable of sizing TPS materials at these cryogenic fuel tank locations to augment TCAT's TPS sizing capabilities.

  20. Thermal-Structural Optimization of Integrated Cryogenic Propellant Tank Concepts for a Reusable Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Waters, W. Allen; Singer, Thomas N.; Haftka, Raphael T.

    2004-01-01

    A next generation reusable launch vehicle (RLV) will require thermally efficient and light-weight cryogenic propellant tank structures. Since these tanks will be weight-critical, analytical tools must be developed to aid in sizing the thickness of insulation layers and structural geometry for optimal performance. Finite element method (FEM) models of the tank and insulation layers were created to analyze the thermal performance of the cryogenic insulation layer and thermal protection system (TPS) of the tanks. The thermal conditions of ground-hold and re-entry/soak-through for a typical RLV mission were used in the thermal sizing study. A general-purpose nonlinear FEM analysis code, capable of using temperature and pressure dependent material properties, was used as the thermal analysis code. Mechanical loads from ground handling and proof-pressure testing were used to size the structural geometry of an aluminum cryogenic tank wall. Nonlinear deterministic optimization and reliability optimization techniques were the analytical tools used to size the geometry of the isogrid stiffeners and thickness of the skin. The results from the sizing study indicate that a commercial FEM code can be used for thermal analyses to size the insulation thicknesses where the temperature and pressure were varied. The results from the structural sizing study show that using combined deterministic and reliability optimization techniques can obtain alternate and lighter designs than the designs obtained from deterministic optimization methods alone.

  1. Vehicle Design Evaluation Program (VDEP). A computer program for weight sizing, economic, performance and mission analysis of fuel-conservative aircraft, multibodied aircraft and large cargo aircraft using both JP and alternative fuels

    NASA Technical Reports Server (NTRS)

    Oman, B. H.

    1977-01-01

    The NASA Langley Research Center vehicle design evaluation program (VDEP-2) was expanded by (1) incorporating into the program a capability to conduct preliminary design studies on subsonic commercial transport type aircraft using both JP and such alternate fuels as hydrogen and methane; (2) incorporating an aircraft detailed mission and performance analysis capability; and (3) developing and incorporating an external loads analysis capability. The resulting computer program (VDEP-3) provides a preliminary design tool that enables the user to perform integrated sizing, structural analysis, and cost studies on subsonic commercial transport aircraft. Both versions of the VDEP-3 program, designated Preliminary Analysis VDEP-3 and Detailed Analysis VDEP, utilize the same vehicle sizing subprogram, which includes a detailed mission analysis capability as well as a geometry and weight analysis for multibodied configurations.

  2. Real-time particle size analysis using focused beam reflectance measurement as a process analytical technology tool for a continuous granulation-drying-milling process.

    PubMed

    Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C

    2013-06-01

    Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between D20, D50, and D80 length-weighted chord length and sieve particle size was observed, with a p value of <0.0001 and an R² of 0.886. A central composite response surface statistical design was used to evaluate the effect of granulator screw speed and Comil® impeller speed on the length-weighted chord length distribution (CLD) and particle size distribution (PSD) determined by FBRM and nested sieve analysis, respectively. The effect of granulator speed and mill speed on bulk density, tapped density, Compressibility Index, and Flowability Index was also investigated. An inline FBRM probe placed below the Comil® generated chord lengths and CLD data at designated times. The collection of the milled samples for sieve analysis and PSD evaluation was coordinated with the timing of the FBRM determinations. Both FBRM and sieve analysis resulted in similar bimodal distributions for all ten manufactured batches studied. Within the experimental space studied, the granulator screw speed (650-850 rpm) and Comil® impeller speed (1,000-2,000 rpm) did not have a significant effect on CLD, PSD, bulk density, tapped density, Compressibility Index, or Flowability Index (p value > 0.05).
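    As an illustration of the kind of calibration reported above (relating FBRM chord-length percentiles to sieve particle size through a linear model and R²), the sketch below fits such a relationship to made-up paired measurements; the values are hypothetical and are not the study's data.

    ```python
    # A hedged sketch of relating an FBRM chord-length percentile to sieve particle
    # size with a simple linear model and R^2. The paired values are invented.
    import numpy as np

    chord_d50 = np.array([85., 102., 118., 131., 149., 160.])   # micrometres (FBRM)
    sieve_d50 = np.array([150., 180., 212., 240., 270., 295.])  # micrometres (sieves)

    slope, intercept = np.polyfit(chord_d50, sieve_d50, 1)
    pred = slope * chord_d50 + intercept
    r2 = 1 - np.sum((sieve_d50 - pred) ** 2) / np.sum((sieve_d50 - sieve_d50.mean()) ** 2)
    print(f"sieve D50 ~ {slope:.2f} * chord D50 + {intercept:.1f}  (R^2 = {r2:.3f})")
    ```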

  3. Effect Size Measures for Mediation Models: Quantitative Strategies for Communicating Indirect Effects

    ERIC Educational Resources Information Center

    Preacher, Kristopher J.; Kelley, Ken

    2011-01-01

    The statistical analysis of mediation effects has become an indispensable tool for helping scientists investigate processes thought to be causal. Yet, in spite of many recent advances in the estimation and testing of mediation effects, little attention has been given to methods for communicating effect size and the practical importance of those…

  4. Design sensitivity analysis and optimization tool (DSO) for sizing design applications

    NASA Technical Reports Server (NTRS)

    Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa

    1992-01-01

    The DSO tool, a structural design software system that provides the designer with a graphics-based, menu-driven design environment to perform easy design optimization for general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including database, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software packages have been integrated in the DSO to support the design procedures. Instead of parameterizing an FEM, design parameters are defined on a geometric model associated with physical quantities, and the continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.

  5. Sizing and Lifecycle Cost Analysis of an Ares V Composite Interstage

    NASA Technical Reports Server (NTRS)

    Mann, Troy; Smeltzer, Stan; Grenoble, Ray; Mason, Brian; Rosario, Sev; Fairbairn, Bob

    2012-01-01

    The Interstage Element of the Ares V launch vehicle was sized using a commercially available structural sizing software tool. Two different concepts were considered, a metallic design and a composite design. Both concepts were sized using similar levels of analysis fidelity and included the influence of design details on each concept. Additionally, the impact of the different manufacturing techniques and failure mechanisms for composite and metallic construction were considered. Significant details were included in analysis models of each concept, including penetrations for human access, joint connections, as well as secondary loading effects. The designs and results of the analysis were used to determine lifecycle cost estimates for the two Interstage designs. Lifecycle cost estimates were based on industry provided cost data for similar launch vehicle components. The results indicated that significant mass as well as cost savings are attainable for the chosen composite concept as compared with a metallic option.

  6. Automatic differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. AD is assessed as a tool for engineering design. The forward and reverse modes of AD, their computing requirements, as well as approaches to implementing AD are discussed. The application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation is also discussed. The observation is made that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available; in some instances, AD may be the alternative to consider in lieu of analytical sensitivity analysis.
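    The chain-rule propagation that AD implements can be illustrated with a minimal forward-mode sketch using dual numbers. This is a generic Python illustration of the technique, not one of the AD tools assessed in the paper.

    ```python
    # Forward-mode AD sketch: a dual number carries a value and its derivative, and
    # each overloaded operation applies the chain rule to both parts.
    import math

    class Dual:
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der
        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.der + other.der)
        __radd__ = __add__
        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            # Product rule: (uv)' = u'v + uv'
            return Dual(self.val * other.val, self.der * other.val + self.val * other.der)
        __rmul__ = __mul__

    def dsin(x):
        # Chain rule for sin: d/dx sin(u) = cos(u) * u'
        return Dual(math.sin(x.val), math.cos(x.val) * x.der)

    def f(x):
        return 3 * x * x + dsin(x)

    x = Dual(1.5, 1.0)            # seed derivative dx/dx = 1
    y = f(x)
    print(y.val, y.der)           # f(1.5) and f'(1.5) = 6*1.5 + cos(1.5)
    ```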

  7. Common Bolted Joint Analysis Tool

    NASA Technical Reports Server (NTRS)

    Imtiaz, Kauser

    2011-01-01

    Common Bolted Joint Analysis Tool (comBAT) is an Excel/VB-based bolted joint analysis/optimization program that lays out a systematic foundation for an inexperienced or seasoned analyst to determine fastener size, material, and assembly torque for a given design. Analysts are able to perform numerous what-if scenarios within minutes to arrive at an optimal solution. The program evaluates input design parameters, performs joint assembly checks, and steps through numerous calculations to arrive at several key margins of safety for each member in a joint. It also checks for joint gapping, provides fatigue calculations, and generates joint diagrams for a visual reference. Optimum fastener size and material, as well as correct torque, can then be provided. Analysis methodology, equations, and guidelines are provided throughout the solution sequence so that this program does not become a "black box" for the analyst. There are built-in databases that reduce the legwork required by the analyst. Each step is clearly identified, and results are provided in number format as well as in color-coded, spelled-out words to draw user attention. The three key features of the software are robust technical content, innovative and user-friendly I/O, and a large database. The program addresses every aspect of bolted joint analysis and proves to be an instructional tool at the same time. It saves analysis time, has intelligent messaging features, and catches operator errors in real time.
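    As a hedged illustration of one of the checks such a program steps through, the sketch below computes a simple bolt-tension margin of safety with preload and a joint load-sharing factor, following common textbook practice. The function, factor of safety, and numbers are illustrative assumptions, not comBAT's internal method.

    ```python
    # Not comBAT's method: a hedged sketch of a classic bolt axial-tension check with
    # preload and load sharing between the bolt and the clamped members.

    def bolt_tension_margin(preload, ext_load, stiffness_factor, allowable, factor_of_safety=1.4):
        """Margin of safety for bolt axial tension.

        stiffness_factor is the fraction of the external load carried by the bolt;
        the remainder is relieved at the clamped members.
        """
        bolt_load = preload + stiffness_factor * ext_load * factor_of_safety
        return allowable / bolt_load - 1.0

    # Illustrative numbers only (N): 10 kN preload, 4 kN applied load, 30% load share,
    # 20 kN allowable tensile load for the chosen fastener.
    print(f"MS = {bolt_tension_margin(10e3, 4e3, 0.3, 20e3):+.2f}")
    ```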

  8. IVisTMSA: Interactive Visual Tools for Multiple Sequence Alignments.

    PubMed

    Pervez, Muhammad Tariq; Babar, Masroor Ellahi; Nadeem, Asif; Aslam, Naeem; Naveed, Nasir; Ahmad, Sarfraz; Muhammad, Shah; Qadri, Salman; Shahid, Muhammad; Hussain, Tanveer; Javed, Maryam

    2015-01-01

    IVisTMSA is a software package of seven graphical tools for multiple sequence alignments. MSApad is an editing and analysis tool. It can load 409% more data than Jalview, STRAP, CINEMA, and Base-by-Base. MSA comparator allows the user to visualize consistent and inconsistent regions of reference and test alignments of more than 21-MB size in less than 12 seconds. MSA comparator is 5,200% and more than 40% more efficient than the BAliBASE C program and FastSP, respectively. MSA reconstruction tool provides graphical user interfaces for four popular aligners and allows the user to load several sequence files at a time. FASTA generator converts seven formats of alignments of unlimited size into FASTA format in a few seconds. MSA ID calculator calculates the identity matrix of more than 11,000 sequences with a sequence length of 2,696 base pairs in less than 100 seconds. Tree and Distance Matrix calculation tools generate a phylogenetic tree and a distance matrix, respectively, using neighbor joining, percent identity, and the BLOSUM62 matrix.

  9. Terminal Restriction Fragment Length Polymorphism Analysis Program, a Web-Based Research Tool for Microbial Community Analysis

    PubMed Central

    Marsh, Terence L.; Saxman, Paul; Cole, James; Tiedje, James

    2000-01-01

    Rapid analysis of microbial communities has proven to be a difficult task. This is due, in part, to both the tremendous diversity of the microbial world and the high complexity of many microbial communities. Several techniques for community analysis have emerged over the past decade, and most take advantage of the molecular phylogeny derived from 16S rRNA comparative sequence analysis. We describe a web-based research tool located at the Ribosomal Database Project web site (http://www.cme.msu.edu/RDP/html/analyses.html) that facilitates microbial community analysis using terminal restriction fragment length polymorphism of 16S ribosomal DNA. The analysis function (designated TAP T-RFLP) permits the user to perform in silico restriction digestions of the entire 16S sequence database and derive terminal restriction fragment sizes, measured in base pairs, from the 5′ terminus of the user-specified primer to the 3′ terminus of the restriction endonuclease target site. The output can be sorted and viewed either phylogenetically or by size. It is anticipated that the site will guide experimental design as well as provide insight into interpreting results of community analysis with terminal restriction fragment length polymorphisms. PMID:10919828
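    The in silico digest the site performs can be illustrated with a short sketch: locate the primer, find the first downstream restriction site, and report the terminal fragment length in base pairs. The sequence, primer, and enzyme site below are invented for illustration; the actual TAP T-RFLP analysis runs against the full 16S sequence database.

    ```python
    # A hedged sketch of the in silico T-RFLP idea: measure the distance from the
    # 5' terminus of a primer match to the 3' terminus of the first downstream
    # restriction site. Sequence, primer, and site are made up for illustration.

    def terminal_fragment_size(seq, primer, site):
        """Length in bp from the 5' terminus of the primer to the end of the enzyme site."""
        start = seq.find(primer)
        if start < 0:
            return None                      # primer does not anneal in silico
        cut = seq.find(site, start + len(primer))
        if cut < 0:
            return None                      # no restriction site downstream
        return cut + len(site) - start       # distance to the 3' end of the target site

    seq = "GGATCCATGCAAGTCGAACGGTTTAAAGCTTCCGGAATTCACGT"
    print(terminal_fragment_size(seq, primer="ATGCAAGTCGAACG", site="GAATTC"))
    ```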

  10. Analytical Formulation for Sizing and Estimating the Dimensions and Weight of Wind Turbine Hub and Drivetrain Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Parsons, T.; King, R.

    This report summarizes the theory, verification, and validation of a new sizing tool for wind turbine drivetrain components, the Drivetrain Systems Engineering (DriveSE) tool. DriveSE calculates the dimensions and mass properties of the hub, main shaft, main bearing(s), gearbox, bedplate, transformer if up-tower, and yaw system. The level of fidelity for each component varies depending on whether semiempirical parametric or physics-based models are used. The physics-based models have internal iteration schemes based on system constraints and design criteria. Every model is validated against available industry data or finite-element analysis. The verification and validation results show that the models reasonably capture primary drivers for the sizing and design of major drivetrain components.

  11. Analysis of the Surface of Deposited Copper After Electroerosion Treatment

    NASA Astrophysics Data System (ADS)

    Ablyaz, T. R.; Simonov, M. Yu.; Shlykov, E. S.

    2018-03-01

    An electron microscope analysis of the surface of deposited copper is performed after a profiling-piercing electroerosion treatment. The deposited copper is treated with steel, duralumin, and copper electrode tools at different pulse energies. Treatment with the duralumin electrode produces a web-like structure and cubic-morphology polyhedral dimples about 10 μm in size on the treated surface. The main components of the surface treated with the steel electrode are developed polyhedral dimples 10-50 μm in size. After treatment with the copper electrode, the main components of the treated surface are large polyhedral dimples about 30-80 μm in size.

  12. Increasing conclusiveness of clinical breath analysis by improved baseline correction of multi capillary column - ion mobility spectrometry (MCC-IMS) data.

    PubMed

    Szymańska, Ewa; Tinnevelt, Gerjen H; Brodrick, Emma; Williams, Mark; Davies, Antony N; van Manen, Henk-Jan; Buydens, Lutgarde M C

    2016-08-05

    Current challenges of clinical breath analysis include large data size and non-clinically relevant variations observed in exhaled breath measurements, which should be urgently addressed with competent scientific data tools. In this study, three different baseline correction methods are evaluated within a previously developed data size reduction strategy for multi capillary column - ion mobility spectrometry (MCC-IMS) datasets. Introduced for the first time in breath data analysis, the Top-hat method is presented as the optimum baseline correction method. A refined data size reduction strategy is employed in the analysis of a large breathomic dataset on a healthy and respiratory disease population. New insights into MCC-IMS spectra differences associated with respiratory diseases are provided, demonstrating the additional value of the refined data analysis strategy in clinical breath analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
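    As a hedged illustration of the Top-hat idea named above, the sketch below removes a slowly varying baseline from a synthetic 1D spectrum using a morphological white top-hat filter. The data and structuring-element width are illustrative assumptions, not the authors' MCC-IMS pipeline.

    ```python
    # Minimal Top-hat baseline-correction sketch on a synthetic 1D spectrum: the white
    # top-hat subtracts the morphological opening of the signal, removing baselines
    # that vary more slowly than the structuring element while preserving narrow peaks.
    import numpy as np
    from scipy.ndimage import white_tophat

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 2000)
    baseline = 0.5 + 0.1 * x                                   # slow drift
    peaks = (np.exp(-0.5 * ((x - 3.0) / 0.05) ** 2)
             + 0.7 * np.exp(-0.5 * ((x - 7.0) / 0.08) ** 2))   # two narrow peaks
    signal = baseline + peaks + 0.01 * rng.standard_normal(x.size)

    # The structuring element (151 samples here, an illustrative choice) must be wider
    # than the peaks but narrower than the scale of the baseline variation.
    corrected = white_tophat(signal, size=151)
    print(f"max before: {signal.max():.2f}, max after baseline removal: {corrected.max():.2f}")
    ```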

  13. The PEDro scale had acceptably high convergent validity, construct validity, and interrater reliability in evaluating methodological quality of pharmaceutical trials.

    PubMed

    Yamato, Tie Parma; Maher, Chris; Koes, Bart; Moseley, Anne

    2017-06-01

    The Physiotherapy Evidence Database (PEDro) scale has been widely used to investigate methodological quality in physiotherapy randomized controlled trials; however, its validity has not been tested for pharmaceutical trials. The aim of this study was to investigate the validity and interrater reliability of the PEDro scale for pharmaceutical trials. The reliability was also examined for the Cochrane Back and Neck (CBN) Group risk of bias tool. This is a secondary analysis of data from a previous study. We considered randomized placebo-controlled trials evaluating any pain medication for chronic spinal pain or osteoarthritis. Convergent validity was evaluated by correlating the PEDro score with the summary score of the CBN risk of bias tool. The construct validity was tested using a linear regression analysis to determine the degree to which the total PEDro score is associated with treatment effect sizes, journal impact factor, and the summary score for the CBN risk of bias tool. The interrater reliability was estimated using the Prevalence and Bias Adjusted Kappa (PABAK) coefficient and 95% confidence interval (CI) for the PEDro scale and the CBN risk of bias tool. Fifty-three trials were included, with 91 treatment effect sizes included in the analyses. The correlation between the PEDro scale and the CBN risk of bias tool was 0.83 (95% CI 0.76-0.88) after adjusting for reliability, indicating strong convergence. The PEDro score was inversely associated with effect sizes, significantly associated with the summary score for the CBN risk of bias tool, and not associated with the journal impact factor. The interrater reliability for each item of the PEDro scale and the CBN risk of bias tool was at least substantial for most items (>0.60). The intraclass correlation coefficient for the PEDro score was 0.80 (95% CI 0.68-0.88), and for the CBN risk of bias tool it was 0.81 (95% CI 0.69-0.88). There was evidence for the convergent and construct validity of the PEDro scale when used to evaluate the methodological quality of pharmacological trials. Both risk of bias tools have acceptably high interrater reliability. Copyright © 2017 Elsevier Inc. All rights reserved.
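    As a brief illustration of the PABAK statistic used above for interrater reliability, the sketch below computes it for two hypothetical raters scoring a binary item; for binary ratings PABAK equals 2*p_o - 1, where p_o is the observed proportion of agreement. The ratings are made up, not the study's data.

    ```python
    # Hedged sketch: Prevalence and Bias Adjusted Kappa (PABAK) for two raters scoring
    # a binary item (criterion satisfied / not satisfied). PABAK = 2 * p_o - 1.
    def pabak(rater_a, rater_b):
        agree = sum(a == b for a, b in zip(rater_a, rater_b))
        p_o = agree / len(rater_a)
        return 2.0 * p_o - 1.0

    rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
    rater_b = [1, 1, 0, 0, 0, 1, 1, 0, 1, 1]
    print(f"PABAK = {pabak(rater_a, rater_b):.2f}")   # 0.80 for 9/10 agreement
    ```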

  14. JacketSE: An Offshore Wind Turbine Jacket Sizing Tool; Theory Manual and Sample Usage with Preliminary Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, Rick

    This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparing industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.

  15. PFGE MAPPER and PFGE READER: two tools to aid in the analysis and data input of pulse field gel electrophoresis maps.

    PubMed Central

    Shifman, M. A.; Nadkarni, P.; Miller, P. L.

    1992-01-01

    Pulse field gel electrophoresis mapping is an important technique for characterizing large segments of DNA. We have developed two tools to aid in the construction of pulse field electrophoresis gel maps: PFGE READER which stores experimental conditions and calculates fragment sizes and PFGE MAPPER which constructs pulse field gel electrophoresis maps. PMID:1482898

  16. When product designers use perceptually based color tools

    NASA Astrophysics Data System (ADS)

    Bender, Walter R.

    1998-07-01

    Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to give guidance to their selection of seasonal palettes for use in production of the private-label merchandise of a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.

  17. When product designers use perceptually based color tools

    NASA Astrophysics Data System (ADS)

    Bender, Walter R.

    2001-01-01

    Palette synthesis and analysis tools have been built based upon a model of color experience. This model adjusts formal compositional elements such as hue, value, chroma, and their contrasts, as well as size and proportion. Clothing and household product designers were given these tools to guide their selection of seasonal palettes in the production of the private-label merchandise in a large retail chain. The designers chose base palettes. Accents to these palettes were generated with and without the aid of the color tools. These palettes are compared by using perceptual metrics and interviews. The results are presented.

  18. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information.

    PubMed

    Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L

    2013-02-12

    Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.

  19. Hand tool permits shrink sizing of assembled tubing

    NASA Technical Reports Server (NTRS)

    Millett, A.; Odor, M.

    1966-01-01

    Portable tool sizes tubing ends without disassembling the tubing installation. The shrink sizing tool is clamped to the tubing and operated by a ratchet wrench. A gear train forces the tubing end against an appropriate die or mandrel to effect the sizing.

  20. Automated extraction and analysis of rock discontinuity characteristics from 3D point clouds

    NASA Astrophysics Data System (ADS)

    Bianchetti, Matteo; Villa, Alberto; Agliardi, Federico; Crosta, Giovanni B.

    2016-04-01

    A reliable characterization of fractured rock masses requires an exhaustive geometrical description of discontinuities, including orientation, spacing, and size. These are required to describe discontinuum rock mass structure, perform Discrete Fracture Network and DEM modelling, or provide input for rock mass classification or equivalent continuum estimate of rock mass properties. Although several advanced methodologies have been developed in the last decades, a complete characterization of discontinuity geometry in practice is still challenging, due to scale-dependent variability of fracture patterns and difficult accessibility to large outcrops. Recent advances in remote survey techniques, such as terrestrial laser scanning and digital photogrammetry, allow a fast and accurate acquisition of dense 3D point clouds, which promoted the development of several semi-automatic approaches to extract discontinuity features. Nevertheless, these often need user supervision on algorithm parameters which can be difficult to assess. To overcome this problem, we developed an original Matlab tool, allowing fast, fully automatic extraction and analysis of discontinuity features with no requirements on point cloud accuracy, density and homogeneity. The tool consists of a set of algorithms which: (i) process raw 3D point clouds, (ii) automatically characterize discontinuity sets, (iii) identify individual discontinuity surfaces, and (iv) analyse their spacing and persistence. The tool operates in either a supervised or unsupervised mode, starting from an automatic preliminary exploration data analysis. The identification and geometrical characterization of discontinuity features is divided in steps. First, coplanar surfaces are identified in the whole point cloud using K-Nearest Neighbor and Principal Component Analysis algorithms optimized on point cloud accuracy and specified typical facet size. Then, discontinuity set orientation is calculated using Kernel Density Estimation and principal vector similarity criteria. Poles to points are assigned to individual discontinuity objects using easy custom vector clustering and Jaccard distance approaches, and each object is segmented into planar clusters using an improved version of the DBSCAN algorithm. Modal set orientations are then recomputed by cluster-based orientation statistics to avoid the effects of biases related to cluster size and density heterogeneity of the point cloud. Finally, spacing values are measured between individual discontinuity clusters along scanlines parallel to modal pole vectors, whereas individual feature size (persistence) is measured using 3D convex hull bounding boxes. Spacing and size are provided both as raw population data and as summary statistics. The tool is optimized for parallel computing on 64bit systems, and a Graphic User Interface (GUI) has been developed to manage data processing, provide several outputs, including reclassified point clouds, tables, plots, derived fracture intensity parameters, and export to modelling software tools. We present test applications performed both on synthetic 3D data (simple 3D solids) and real case studies, validating the results with existing geomechanical datasets.
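    The plane-fitting step described above (identifying coplanar surfaces via Principal Component Analysis) can be sketched in a few lines: the eigenvector of a point patch's covariance with the smallest eigenvalue is the facet normal. The following generic Python sketch is illustrative only and is not the authors' Matlab tool.

    ```python
    # Hedged sketch of PCA plane fitting for a local patch of a 3D point cloud: the
    # smallest-variance direction (last right singular vector) is the facet normal.
    import numpy as np

    def facet_normal(points):
        """points: (N, 3) array. Returns the unit normal and the plane centroid."""
        centroid = points.mean(axis=0)
        centered = points - centroid
        # Total least squares: the singular vector with the smallest singular value
        # is orthogonal to the best-fit plane.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        normal = vt[-1]
        return normal / np.linalg.norm(normal), centroid

    # Synthetic near-planar patch for illustration
    rng = np.random.default_rng(1)
    pts = rng.uniform(-1, 1, size=(200, 3))
    pts[:, 2] = 0.3 * pts[:, 0] - 0.2 * pts[:, 1] + 0.01 * rng.standard_normal(200)
    n, c = facet_normal(pts)
    print("facet normal:", np.round(n, 3))
    ```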

  1. Performance Analysis of Scientific and Engineering Applications Using MPInside and TAU

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Mehrotra, Piyush; Taylor, Kenichi Jun Haeng; Shende, Sameer Suresh; Biswas, Rupak

    2010-01-01

    In this paper, we present performance analysis of two NASA applications using performance tools like Tuning and Analysis Utilities (TAU) and SGI MPInside. MITgcmUV and OVERFLOW are two production-quality applications used extensively by scientists and engineers at NASA. MITgcmUV is a global ocean simulation model, developed by the Estimating the Circulation and Climate of the Ocean (ECCO) Consortium, for solving the fluid equations of motion using the hydrostatic approximation. OVERFLOW is a general-purpose Navier-Stokes solver for computational fluid dynamics (CFD) problems. Using these tools, we analyze the MPI functions (MPI_Sendrecv, MPI_Bcast, MPI_Reduce, MPI_Allreduce, MPI_Barrier, etc.) with respect to message size of each rank, time consumed by each function, and how ranks communicate. MPI communication is further analyzed by studying the performance of MPI functions used in these two applications as a function of message size and number of cores. Finally, we present the compute time, communication time, and I/O time as a function of the number of cores.

  2. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    NASA Technical Reports Server (NTRS)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use
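    The file-based wrapper pattern described above can be sketched generically: rewrite tagged fields in a text input deck, run the analysis executable, and parse a response value from its output. The file names, tags, and executable below are hypothetical placeholders; RCOTOOLS' actual handling of NDARC and CAMRAD II files is more involved.

    ```python
    # A generic sketch of a file-based wrapper (not RCOTOOLS itself): rewrite tagged
    # "name = value" fields in a text input deck, run the analysis code, and pull one
    # response value back out of its output file. "template.inp", "case.inp",
    # "case.out", "./analysis_code", and "gross_weight" are hypothetical placeholders.
    import re
    import subprocess
    from pathlib import Path

    def write_input(template: str, values: dict, path: Path) -> None:
        """Copy the template, substituting new numbers for each named design variable."""
        text = Path(template).read_text()
        for name, value in values.items():
            text = re.sub(rf"({re.escape(name)}\s*=\s*)[-+\dEe.]+", rf"\g<1>{value}", text)
        path.write_text(text)

    def read_response(output_file: Path, key: str) -> float:
        """Scan the output for a line like 'key = 12345.6' and return the number."""
        match = re.search(rf"{re.escape(key)}\s*=\s*([-+\dEe.]+)", output_file.read_text())
        if match is None:
            raise ValueError(f"{key} not found in {output_file}")
        return float(match.group(1))

    def run_case(design_vars: dict) -> float:
        case_input = Path("case.inp")
        write_input("template.inp", design_vars, case_input)
        subprocess.run(["./analysis_code", str(case_input)], check=True)
        return read_response(Path("case.out"), key="gross_weight")

    # Example: one optimizer iteration might call
    #   run_case({"disk_loading": 10.5, "rotor_radius": 7.2})
    ```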

  3. Applications of Earth Observations for Fisheries Management: An analysis of socioeconomic benefits

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Kiefer, D. A.; Turner, W.

    2013-12-01

    This paper will discuss the socioeconomic impacts of a project applying Earth observations and models to support management and conservation of tuna and other marine resources in the eastern Pacific Ocean. A project team created a software package that produces statistical analyses and dynamic maps of habitat for pelagic ocean biota. The tool integrates sea surface temperature and chlorophyll imagery from MODIS, ocean circulation models, and other data products. The project worked with the Inter-American Tropical Tuna Commission, which issues fishery management information, such as stock assessments, for the eastern Pacific region. The Commission uses the tool and broader habitat information to produce better estimates of stock and thus improve their ability to identify species that could be at risk of overfishing. The socioeconomic analysis quantified the relative value that Earth observations contributed to accurate stock size assessments through improvements in calculating population size. The analysis team calculated the first-order economic costs of a fishery collapse (or shutdown), and they calculated the benefits of improved estimates that reduce the uncertainty of stock size and thus reduce the risk of fishery collapse. The team estimated that the project reduced the probability of collapse of different fisheries, and the analysis generated net present values of risk mitigation. USC led the project with sponsorship from the NASA Earth Science Division's Applied Sciences Program, which conducted the socioeconomic impact analysis. The paper will discuss the project and focus primarily on the analytic methods, impact metrics, and the results of the socioeconomic benefits analysis.

  4. Automating Structural Analysis of Spacecraft Vehicles

    NASA Technical Reports Server (NTRS)

    Hrinda, Glenn A.

    2004-01-01

    A major effort within NASA's vehicle analysis discipline has been to automate structural analysis and sizing optimization during conceptual design studies of advanced spacecraft. Traditional spacecraft structural sizing has involved detailed finite element analysis (FEA) requiring large degree-of-freedom (DOF) finite element models (FEM). Creation and analysis of these models can be time consuming and limit model size during conceptual designs. The goal is to find an optimal design that meets the mission requirements but produces the lightest structure. A structural sizing tool called HyperSizer has been successfully used in the conceptual design phase of a reusable launch vehicle and planetary exploration spacecraft. The program couples with FEA to enable system level performance assessments and weight predictions including design optimization of material selections and sizing of spacecraft members. The software's analysis capabilities are based on established aerospace structural methods for strength, stability and stiffness that produce adequately sized members and reliable structural weight estimates. The software also helps to identify potential structural deficiencies early in the conceptual design so changes can be made without wasted time. HyperSizer's automated analysis and sizing optimization increases productivity and brings standardization to a systems study. These benefits will be illustrated in examining two different types of conceptual spacecraft designed using the software. A hypersonic air breathing, single stage to orbit (SSTO), reusable launch vehicle (RLV) will be highlighted as well as an aeroshell for a planetary exploration vehicle used for aerocapture at Mars. By showing the two different types of vehicles, the software's flexibility will be demonstrated with an emphasis on reducing aeroshell structural weight. Member sizes, concepts and material selections will be discussed as well as analysis methods used in optimizing the structure. Analysis based on the HyperSizer structural sizing software will be discussed. Design trades required to optimize structural weight will be presented.

  5. Measurements of Size Resolved Organic Particulate Mass Using An On-line Aerosol Mass Spectrometer (ams) Laboratory Validation; Analysis Tool Development; and Interpretation of Field Data

    NASA Astrophysics Data System (ADS)

    Alfarra, M. R.; Coe, H.; Allan, J. D.; Bower, K. N.; Garforth, A. A.; Canagaratna, M.; Worsnop, D.

    The aerosol mass spectrometer (AMS) is a quantitative instrument designed to deliver real-time, size-resolved chemical composition of the volatile and semi-volatile aerosol fractions. The AMS response to a wide range of organic compounds has been experimentally characterized and has been shown to compare well with standard libraries of 70 eV electron impact ionization mass spectra. These results will be presented. Due to the scanning nature of the quadrupole mass spectrometer, the AMS provides the averaged composition of an ensemble of particles rather than single-particle composition. However, the mass spectra measured by the AMS are reproducible and similar to those of standard libraries, so analysis tools can be developed on large mass spectral libraries that can provide chemical composition information about the type of organic compounds in the aerosol. One such tool is presented and compared with laboratory measurements of single-species and mixed-component organic particles by the AMS. We will then discuss the applicability of these tools to interpreting field AMS data obtained in a range of experiments at different sites in the UK and Canada. The data will be combined with other measurements to show the behaviour of the organic aerosol fraction in urban and sub-urban environments.

  6. A simulation model for risk assessment of turbine wheels

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Hage, Richard T.

    1991-01-01

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  7. A simulation model for risk assessment of turbine wheels

    NASA Astrophysics Data System (ADS)

    Safie, Fayssal M.; Hage, Richard T.

    A simulation model has been successfully developed to evaluate the risk of the Space Shuttle auxiliary power unit (APU) turbine wheels for a specific inspection policy. Besides being an effective tool for risk/reliability evaluation, the simulation model also allows the analyst to study the trade-offs between wheel reliability, wheel life, inspection interval, and rejection crack size. For example, in the APU application, sensitivity analysis results showed that the wheel life limit has the least effect on wheel reliability when compared to the effect of the inspection interval and the rejection crack size. In summary, the simulation model developed represents a flexible tool to predict turbine wheel reliability and study the risk under different inspection policies.

  8. Improved Aerodynamic Analysis for Hybrid Wing Body Conceptual Design Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2012-01-01

    This paper provides an overview of ongoing efforts to develop, evaluate, and validate different tools for improved aerodynamic modeling and systems analysis of Hybrid Wing Body (HWB) aircraft configurations. Results are being presented for the evaluation of different aerodynamic tools including panel methods, enhanced panel methods with viscous drag prediction, and computational fluid dynamics. Emphasis is placed on proper prediction of aerodynamic loads for structural sizing as well as viscous drag prediction to develop drag polars for HWB conceptual design optimization. Data from transonic wind tunnel tests at the Arnold Engineering Development Center's 16-Foot Transonic Tunnel was used as a reference data set in order to evaluate the accuracy of the aerodynamic tools. Triangularized surface data and Vehicle Sketch Pad (VSP) models of an X-48B 2% scale wind tunnel model were used to generate input and model files for the different analysis tools. In support of ongoing HWB scaling studies within the NASA Environmentally Responsible Aviation (ERA) program, an improved finite element based structural analysis and weight estimation tool for HWB center bodies is currently under development. Aerodynamic results from these analyses are used to provide additional aerodynamic validation data.

  9. Automatic Differentiation as a tool in engineering design

    NASA Technical Reports Server (NTRS)

    Barthelemy, Jean-Francois M.; Hall, Laura E.

    1992-01-01

    Automatic Differentiation (AD) is a tool that systematically implements the chain rule of differentiation to obtain the derivatives of functions calculated by computer programs. In this paper, it is assessed as a tool for engineering design. The paper discusses the forward and reverse modes of AD, their computing requirements, and approaches to implementing AD. It continues with the application of two different tools to two medium-size structural analysis problems to generate sensitivity information typically necessary in an optimization or design situation. The paper concludes with the observation that AD is to be preferred to finite differencing in most cases, as long as sufficient computer storage is available.

  10. The Impact of Size and Specialisation on Universities' Department Performance: A DEA Analysis Applied to Austrian Universities

    ERIC Educational Resources Information Center

    Leitner, Karl-Heinz; Prikoszovits, Julia; Schaffhauser-Linzatti, Michaela; Stowasser, Rainer; Wagner, Karin

    2007-01-01

    This paper explores the performance efficiency of natural and technical science departments at Austrian universities using Data Envelopment Analysis (DEA). We present DEA as an alternative tool for benchmarking and ranking the assignment of decision-making units (organisations and organisational units). The method applies a multiple input and…

  11. Space Launch System Upper Stage Technology Assessment

    NASA Technical Reports Server (NTRS)

    Holladay, Jon; Hampton, Bryan; Monk, Timothy

    2014-01-01

    The Space Launch System (SLS) is envisioned as a heavy-lift vehicle that will provide the foundation for future beyond low-Earth orbit (LEO) exploration missions. Previous studies have been performed to determine the optimal configuration for the SLS and the applicability of commercial off-the-shelf in-space stages for Earth departure. Currently NASA is analyzing the concept of a Dual Use Upper Stage (DUUS) that will provide LEO insertion and Earth departure burns. This paper will explore candidate in-space stages based on the DUUS design for a wide range of beyond LEO missions. Mission payloads will range from small robotic systems up to human systems with deep space habitats and landers. Mission destinations will include cislunar space, Mars, Jupiter, and Saturn. Given these wide-ranging mission objectives, a vehicle-sizing tool has been developed to determine the size of an Earth departure stage based on the mission objectives. The tool calculates masses for all the major subsystems of the vehicle including propellant loads, avionics, power, engines, main propulsion system components, tanks, pressurization system and gases, primary structural elements, and secondary structural elements. The tool uses an iterative sizing algorithm to determine the resulting mass of the stage. Any input into one of the subsystem sizing routines or the mission parameters can be treated as a parametric sweep or as a distribution for use in Monte Carlo analysis. Taking these factors together allows for multi-variable, coupled analysis runs. To increase confidence in the tool, the results have been verified against two point-of-departure designs of the DUUS. The tool has also been verified against Apollo moon mission elements and other manned space systems. This paper will focus on trading key propulsion technologies including chemical, Nuclear Thermal Propulsion (NTP), and Solar Electric Propulsion (SEP). All of the key performance inputs and relationships will be presented and discussed in light of the various missions. For each mission there are several trajectory options and each will be discussed in terms of delta-v required and transit duration. Each propulsion system will be modeled, sized, and judged based on their applicability to the whole range of beyond LEO missions. Criteria for scoring will include the resulting dry mass of the stage, resulting propellant required, time to destination, and an assessment of key enabling technologies. In addition to the larger metrics, this paper will present the results of several coupled sensitivity studies. The ultimate goals of these tools and studies are to provide NASA with the most mass-, technology-, and cost-effective in-space stage for its future exploration missions.
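    The iterative sizing algorithm described above can be illustrated with a minimal fixed-point loop: guess a dry mass, size the propellant with the rocket equation, update the dry mass from a structural fraction, and repeat until the masses close. The structural-fraction model and numbers below are illustrative assumptions, not the NASA tool's subsystem models.

    ```python
    # Hedged sketch of an iterative in-space stage sizing loop (not the NASA tool):
    # rocket equation for propellant, simple structural fraction for dry mass.
    import math

    def size_stage(payload, dv, isp, struct_frac=0.12, g0=9.80665, tol=1e-6):
        ve = isp * g0
        dry = 0.1 * payload                      # initial guess
        for _ in range(200):
            mass_ratio = math.exp(dv / ve)
            # Propellant so that (payload + dry + prop) / (payload + dry) = mass_ratio
            prop = (payload + dry) * (mass_ratio - 1.0)
            new_dry = struct_frac * prop         # dry mass scales with propellant load
            if abs(new_dry - dry) < tol:
                return {"dry": new_dry, "prop": prop, "gross": payload + new_dry + prop}
            dry = new_dry
        raise RuntimeError("stage sizing did not converge")

    # Illustrative numbers: 30 t payload, 3.2 km/s departure burn, Isp 450 s.
    print(size_stage(payload=30000.0, dv=3200.0, isp=450.0))
    ```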

  12. Generic Modeling of a Life Support System for Process Technology Comparison

    NASA Technical Reports Server (NTRS)

    Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.

    1993-01-01

    This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.

  13. Agreement between clinical estimation and a new quantitative analysis by Photoshop software in fundus and angiographic image variables.

    PubMed

    Ramezani, Alireza; Ahmadieh, Hamid; Azarmina, Mohsen; Soheilian, Masoud; Dehghan, Mohammad H; Mohebbi, Mohammad R

    2009-12-01

    To evaluate the validity of a new method for the quantitative analysis of fundus or angiographic images using Photoshop 7.0 (Adobe, USA) software by comparing it with clinical evaluation. Four hundred and eighteen fundus and angiographic images of diabetic patients were evaluated by three retina specialists and then by computing using Photoshop 7.0 software. Four variables were selected for comparison: amount of hard exudates (HE) on color pictures, amount of HE on red-free pictures, severity of leakage, and the size of the foveal avascular zone (FAZ). The coefficients of agreement (Kappa) between the two methods in the amount of HE on color and red-free photographs were 85% (0.69) and 79% (0.59), respectively. The agreement for severity of leakage was 72% (0.46). In the two methods for the evaluation of FAZ size using the magic wand and lasso software tools, the agreement was 54% (0.09) and 89% (0.77), respectively. Agreement in the estimation of FAZ size by the magnetic lasso tool was excellent and was almost as good in the quantification of HE on color and red-free images. Considering the agreement of this new technique for the measurement of variables in fundus images using Photoshop software with the clinical evaluation, this method seems to have sufficient validity to be used for the quantitative analysis of HE, leakage, and FAZ size on the angiograms of diabetic patients.

  14. Cost analysis of objective resident cataract surgery assessments.

    PubMed

    Nandigam, Kiran; Soh, Jonathan; Gensheimer, William G; Ghazi, Ahmed; Khalifa, Yousuf M

    2015-05-01

    To compare 8 ophthalmology resident surgical training tools to determine which is most cost effective. University of Rochester Medical Center, Rochester, New York, USA. Retrospective evaluation of technology. A cost-analysis model was created to compile all relevant costs of running each tool in a medium-sized ophthalmology program. Quantitative cost estimates were obtained based on the cost of tools, the cost of time in evaluations, and supply and maintenance costs. For wet laboratory simulation, Eyesi was the least expensive cataract surgery simulation method; however, it is only capable of evaluating simulated cataract surgery rehearsal and requires supplementation with other evaluative methods for operating room performance and for noncataract wet lab training and evaluation. The most expensive training tool was the Eye Surgical Skills Assessment Test (ESSAT). The 2 most affordable methods for resident evaluation in operating room performance were the Objective Assessment of Skills in Intraocular Surgery (OASIS) and the Global Rating Assessment of Skills in Intraocular Surgery (GRASIS). Cost-based analysis of ophthalmology resident surgical training tools is needed so that residency programs can implement tools that are valid, reliable, objective, and cost effective. There is no perfect training system at this time. Copyright © 2015 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  15. Irena : tool suite for modeling and analysis of small-angle scattering.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilavsky, J.; Jemian, P.

    2009-04-01

    Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
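    One of the basic analyses such a suite offers, the Guinier fit, can be sketched directly: in the dilute, low-q limit, ln I(q) ≈ ln I0 - (Rg²/3) q², so a straight-line fit of ln I against q² yields the radius of gyration. The sketch below uses synthetic data and is a generic illustration, not Irena's Igor Pro implementation.

    ```python
    # Hedged Guinier-fit sketch on synthetic small-angle scattering data.
    import numpy as np

    rg_true, i0_true = 25.0, 1000.0                       # Angstrom, arbitrary units
    q = np.linspace(0.005, 1.2 / rg_true, 40)             # keep q*Rg within ~1.2
    intensity = i0_true * np.exp(-(q * rg_true) ** 2 / 3.0)
    intensity *= 1 + 0.01 * np.random.default_rng(2).standard_normal(q.size)

    # Linear fit of ln I versus q^2: slope = -Rg^2 / 3, intercept = ln I0.
    slope, intercept = np.polyfit(q ** 2, np.log(intensity), 1)
    rg_fit = np.sqrt(-3.0 * slope)
    print(f"Rg = {rg_fit:.1f} A, I0 = {np.exp(intercept):.0f}")
    ```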

  16. Measuring the costs and benefits of conservation programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Einhorn, M.A.

    1985-07-25

    A step-by-step analysis of the effects of utility-sponsored conservation promoting programs begins by identifying several factors which will reduce a program's effectiveness. The framework for measuring cost savings and designing a conservation program needs to consider the size of appliance subsidies, what form incentives should take, and how will customer behavior change as a result of incentives. Continual reevaluation is necessary to determine whether to change the size of rebates or whether to continue the program. Analytical tools for making these determinations are improving as conceptual breakthroughs in econometrics permit more rigorous analysis. 5 figures.

  17. How many holes is too many? A prototype tool for estimating mosquito entry risk into damaged bed nets.

    PubMed

    Sutcliffe, James; Ji, Xin; Yin, Shaoman

    2017-08-01

    Insecticide-treated bed nets (ITNs) have played an integral role in malaria reduction but how insecticide depletion and accumulating physical damage affect ITN performance is poorly understood. More accurate methods are needed to assess damage to bed nets so that they can be designed, deployed and replaced optimally. Video recordings of female Anopheles gambiae in near approach (1-½ cm) to occupied untreated rectangular bed nets in a laboratory study were used to quantify the amount of mosquito activity (appearances over time) around different parts of the net, the per-appearance probability of a mosquito coming close to holes of different sizes (hole encounter) and the per-encounter probability of mosquitoes passing through holes of different sizes (hole passage). Appearance frequency on different parts of the net reflected previously reported patterns: the area of the net under greatest mosquito pressure was the roof, followed by the bottom 30 cm of the sides, followed by the 30 cm area immediately above this, followed by the upper two-thirds of the sides. The ratio of activity in these areas was (respectively) 250:33:5:1. Per-appearance probability of hole encounter on all parts of the net was strongly predicted by a factor combining hole perimeter and area. Per-encounter probability of hole passage, in turn, was strongly predicted by hole width. For a given width, there was a 20% greater risk of passage through holes on the roof than holes on the sides. Appearance, encounter and passage predictors correspond to various mosquito behaviours that have previously been described and are combined into a prototype mosquito entry risk tool that predicts mosquito entry rates for nets with various amounts of damage. Scenarios that use the entry risk tool to test the recommendations of the WHOPES proportionate hole index (pHI) suggest that the pHI hole size categories and failure to account for hole location likely sometimes lead to incorrect conclusions about net serviceability that could be avoided by using an entry risk tool of the form presented here instead. Practical methods of collecting hole position, shape and size information for bed net assessments using the tool in the field are discussed and include using image analysis and on-line geometric analysis tools.
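    A minimal sketch of how the three factors described above (appearance, encounter, passage) could be combined into a per-net entry risk is given below. The region weights mirror the reported 250:33:5:1 activity ratio, but the encounter and passage functions are placeholder forms, not the fitted models from the study.

    ```python
    # Hedged sketch combining appearance, encounter, and passage factors into a
    # relative entry risk for a damaged net. Functional forms and coefficients are
    # placeholders, NOT the study's fitted predictors.
    import math

    REGION_WEIGHT = {"roof": 250, "lower_30cm": 33, "mid_30cm": 5, "upper_sides": 1}

    def encounter_prob(perimeter_cm, area_cm2):
        # Placeholder: encounter chance grows with a combined perimeter/area metric.
        return 1.0 - math.exp(-0.002 * (perimeter_cm + math.sqrt(area_cm2)))

    def passage_prob(width_cm, region):
        # Placeholder: wider holes are easier to pass; roof holes get a 20% bump.
        base = min(1.0, 0.1 * width_cm)
        return min(1.0, base * (1.2 if region == "roof" else 1.0))

    def relative_entry_risk(holes):
        """holes: list of dicts with region, width_cm, perimeter_cm, area_cm2."""
        total_weight = sum(REGION_WEIGHT.values())
        risk = 0.0
        for h in holes:
            appearance = REGION_WEIGHT[h["region"]] / total_weight
            risk += (appearance
                     * encounter_prob(h["perimeter_cm"], h["area_cm2"])
                     * passage_prob(h["width_cm"], h["region"]))
        return risk

    holes = [{"region": "roof", "width_cm": 4, "perimeter_cm": 12, "area_cm2": 10},
             {"region": "lower_30cm", "width_cm": 8, "perimeter_cm": 24, "area_cm2": 40}]
    print(f"relative entry risk: {relative_entry_risk(holes):.3f}")
    ```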

  18. Getting the most out of RNA-seq data analysis.

    PubMed

    Khang, Tsung Fei; Lau, Ching Yee

    2015-01-01

    Background. A common research goal in transcriptome projects is to find genes that are differentially expressed in different phenotype classes. Biologists might wish to validate such gene candidates experimentally, or use them for downstream systems biology analysis. Producing a coherent differential gene expression analysis from RNA-seq count data requires an understanding of how numerous sources of variation, such as the replicate size, the hypothesized biological effect size, and the specific method for making differential expression calls, interact. We believe an explicit demonstration of such interactions in real RNA-seq data sets is of practical interest to biologists. Results. Using two large public RNA-seq data sets, one representing a strong and the other a mild biological effect size, we simulated different replicate size scenarios and tested the performance of several commonly used methods for calling differentially expressed genes in each of them. We found that, when the biological effect size was mild, RNA-seq experiments should focus on experimental validation of differentially expressed gene candidates. Importantly, at least triplicates must be used, and the differentially expressed genes should be called using methods with high positive predictive value (PPV), such as NOISeq or GFOLD. In contrast, when the biological effect size was strong, differentially expressed genes mined from unreplicated experiments using NOISeq, ASC and GFOLD had between 30% and 50% mean PPV, an increase of more than 30-fold compared to the cases of mild biological effect size. Among methods with good PPV performance, having triplicates or more substantially improved mean PPV to over 90% for GFOLD, 60% for DESeq2, 50% for NOISeq, and 30% for edgeR. At a replicate size of six, we found DESeq2 and edgeR to be reasonable methods for calling differentially expressed genes at the systems level, as their PPV and sensitivity trade-off was superior to the other methods'. Conclusion. When the biological effect size is weak, systems-level investigation is not possible using RNA-seq data, and no meaningful result can be obtained in unreplicated experiments. Nonetheless, NOISeq or GFOLD may yield limited numbers of gene candidates with good validation potential when triplicates or more are available. When the biological effect size is strong, NOISeq and GFOLD are effective tools for detecting differentially expressed genes in unreplicated RNA-seq experiments for qPCR validation. When triplicates or more are available, GFOLD is a sharp tool for identifying high-confidence differentially expressed genes for targeted qPCR validation; for downstream systems-level analysis, combined results from DESeq2 and edgeR are useful.

  19. A RAND Analysis Tool for Intelligence, Surveillance, and Reconnaissance: The Collections Operations Model

    DTIC Science & Technology

    2008-01-01

    …appropriate; scan cycle, emission interval, or emission probability; frequency bands; relative angular size of main and side lobes (for directional signals); and the effective radiated power of each radiative lobe. … choices based on its own perceptions. An agent has autonomy. In this report, "behaviors" are individual scripts, programs, instructions, or decision…

  20. Analysis of significantly mutated genes as a clinical tool for the diagnosis in a case of lung cancer.

    PubMed

    Miyashita, Yoshihiro; Hirotsu, Yosuke; Tsutsui, Toshiharu; Higashi, Seishi; Sogami, Yusuke; Kakizaki, Yumiko; Goto, Taichiro; Amemiya, Kenji; Oyama, Toshio; Omata, Masao

    2017-01-01

    Bronchoendoscopic examination is not necessarily a comfortable procedure and is limited by its sensitivity, which depends on the location and size of the tumor lesion. Patients with a non-diagnostic bronchoendoscopic examination often undergo further invasive examinations. A non-invasive diagnostic tool for lung cancer is therefore desired. A 72-year-old man had a 3.0 cm × 2.5 cm mass lesion in segment B1 of the right lung. Cytological examination of sputum, bronchial washing, and curetted samples was "negative" in all cases. We confirmed a diagnosis of lung cancer pathologically after resection of the right upper lung lobe, and also obtained concordant results by genomic analysis of the cytologically negative airway samples collected before the operation. Genetic analysis showed that the mutational profiles of the resected specimens and the airway samples were identical. These data indicate that next-generation sequencing (NGS) may yield a diagnostic tool to conduct "precision medicine".

  1. Children's estimates of food portion size: the development and evaluation of three portion size assessment tools for use with children.

    PubMed

    Foster, E; Matthews, J N S; Lloyd, J; Marshall, L; Mathers, J C; Nelson, M; Barton, K L; Wrieden, W L; Cornelissen, P; Harris, J; Adamson, A J

    2008-01-01

    A number of methods have been developed to assist subjects in providing an estimate of portion size, but their application in improving portion size estimation by children has not been investigated systematically. The aim was to develop portion size assessment tools for use with children and to assess the accuracy of children's estimates of portion size using the tools. The tools were food photographs, food models and an interactive portion size assessment system (IPSAS). Children (n 201), aged 4-16 years, were supplied with known quantities of food to eat, in school. Food leftovers were weighed. Children estimated the amount of each food using each tool, 24 h after consuming the food. The age-specific portion sizes represented were based on portion sizes consumed by children in a national survey. Significant differences were found between the accuracy of estimates made using the three tools. Children of all ages performed well using the IPSAS and the food photographs. The accuracy and precision of estimates made using the food models were poor. For all tools, estimates of the amount of food served were more accurate than estimates of the amount consumed. Issues relating to the reporting of leftover food, which affect estimates of the amounts of food actually consumed, require further study. The IPSAS has shown potential for the assessment of dietary intake in children. Before practical application in the assessment of children's dietary intake, the tool would need to be expanded to cover a wider range of foods and to be validated in a 'real-life' situation.

  2. Efficient, Multi-Scale Designs Take Flight

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Engineers can solve aerospace design problems faster and more efficiently with a versatile software product that performs automated structural analysis and sizing optimization. Collier Research Corporation's HyperSizer Structural Sizing Software is a design, analysis, and documentation tool that increases productivity and standardization for a design team. Based on established aerospace structural methods for strength, stability, and stiffness, HyperSizer can be used all the way from conceptual design to in-service support. The software originated from NASA's efforts to automate its capability to perform aircraft strength analyses, structural sizing, and weight prediction and reduction. With a strategy to combine finite element analysis with an automated design procedure, NASA's Langley Research Center led the development of a software code known as ST-SIZE from 1988 to 1995. Collier Research employees were principal developers of the code along with Langley researchers. The code evolved into one that could analyze the strength and stability of stiffened panels constructed of any material, including lightweight, fiber-reinforced composites.

  3. Size and shape measurement in contemporary cephalometrics.

    PubMed

    McIntyre, Grant T; Mossey, Peter A

    2003-06-01

    The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in judging the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.

  4. Visualizing Phylogenetic Treespace Using Cartographic Projections

    NASA Astrophysics Data System (ADS)

    Sundberg, Kenneth; Clement, Mark; Snell, Quinn

    Phylogenetic analysis is becoming an increasingly important tool for biological research. Applications include epidemiological studies, drug development, and evolutionary analysis. Phylogenetic search is a known NP-Hard problem. The size of the data sets which can be analyzed is limited by the exponential growth in the number of trees that must be considered as the problem size increases. A better understanding of the problem space could lead to better methods, which in turn could lead to the feasible analysis of more data sets. We present a definition of phylogenetic tree space and a visualization of this space that shows significant exploitable structure. This structure can be used to develop search methods capable of handling much larger datasets.

  5. A Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing (SAPE)

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2009-01-01

    SAPE is a Python-based multidisciplinary analysis tool for systems analysis of planetary entry, descent, and landing (EDL) for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. The purpose of SAPE is to provide a variable-fidelity capability for conceptual and preliminary analysis within the same framework. SAPE includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and structural sizing. SAPE uses the Python language, a platform-independent, open-source language, for integration and for the user interface. The development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE runs on Microsoft Windows and Apple Mac OS X and has been partially tested on Linux.

  6. Feasibility analysis of a Commercial HPWH with CO2 Refrigerant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nawaz, Kashif; Shen, Bo; Elatar, Ahmed F.

    2017-02-12

    A scoping-level analysis has been conducted to establish the feasibility of using CO2 as the refrigerant in a commercial heat pump water heater (HPWH) for U.S. applications. The DOE/ORNL Heat Pump Design Model (HPDM) was used for the assessment, calibrated with data from a Japanese heat pump water heater (Sanden) that uses CO2 as refrigerant. A CFD modeling tool was used to further refine the HPDM tank model. After calibration, the model was used to simulate the performance of commercial HPWHs using CO2 and R-134a (baseline). The parametric analysis concluded that compressor discharge pressure and water temperature stratification are critical parameters for the system. For comparable performance, the compressor size and water-heater size can be significantly different for R-134a and CO2 HPWHs. The proposed design, deploying a gas-cooler configuration, not only exceeds the Energy Star Energy Factor criterion of 2.20 but is also comparable to some of the most efficient products on the market using conventional refrigerants.

  7. Real-time feedback control of twin-screw wet granulation based on image analysis.

    PubMed

    Madarász, Lajos; Nagy, Zsombor Kristóf; Hoffer, István; Szabó, Barnabás; Csontos, István; Pataki, Hajnalka; Démuth, Balázs; Szabó, Bence; Csorba, Kristóf; Marosi, György

    2018-06-04

    The present paper reports the first dynamic image analysis-based feedback control of a continuous twin-screw wet granulation process. Granulation of a blend of lactose and starch was selected as the model process. The size and size distribution of the obtained particles were successfully monitored by a process camera coupled with image analysis software developed by the authors. Validation of the developed system showed that the particle size analysis tool can determine the size of the granules with an error of less than 5 µm. The next step was to implement real-time feedback control of the process by controlling the liquid feeding rate of the pump through a PC, based on the particle size results determined in real time. After the establishment of the feedback control, the system could correct different real-life disturbances, creating a Process Analytically Controlled Technology (PACT) that guarantees real-time monitoring and control of granule quality. In the event of changes or undesirable trends in the particle size, the system can automatically compensate for the effect of disturbances, ensuring proper product quality. This kind of quality assurance approach is especially important in the case of continuous pharmaceutical technologies.
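
    A minimal sketch of the kind of feedback loop described above, assuming a simple proportional controller that trims the pump's liquid feeding rate from a measured median granule size; the gain, setpoint and rate limits are invented for illustration and are not the authors' control law.

      # Illustrative only: proportional feedback on liquid feed rate from measured
      # median granule size (d50). Setpoint, gain and limits are hypothetical.

      def update_feed_rate(current_rate_ml_min, measured_d50_um,
                           target_d50_um=400.0, gain=0.002,
                           min_rate=1.0, max_rate=20.0):
          """Proportional correction: oversized granules -> reduce liquid feed."""
          error = target_d50_um - measured_d50_um
          new_rate = current_rate_ml_min + gain * error
          return max(min_rate, min(max_rate, new_rate))

      # Example: granules measured too large (d50 = 450 um) -> feed rate is lowered.
      print(update_feed_rate(10.0, 450.0))  # 9.9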

  8. Patient-specific surgical planning and hemodynamic computational fluid dynamics optimization through free-form haptic anatomy editing tool (SURGEM).

    PubMed

    Pekkan, Kerem; Whited, Brian; Kanter, Kirk; Sharma, Shiva; de Zelicourt, Diane; Sundareswaran, Kartik; Frakes, David; Rossignac, Jarek; Yoganathan, Ajit P

    2008-11-01

    The first version of an anatomy editing/surgical planning tool (SURGEM) targeting anatomical complexity and patient-specific computational fluid dynamics (CFD) analysis is presented. Novel three-dimensional (3D) shape editing concepts and human-shape interaction technologies have been integrated to facilitate interactive surgical morphology alterations, grid generation and CFD analysis. In order to implement "manual hemodynamic optimization" at the surgical planning phase for patients with congenital heart defects, these tools are applied to design and evaluate possible modifications of patient-specific anatomies. In this context, anatomies involve complex geometric topologies and tortuous 3D blood flow pathways with multiple inlets and outlets. These tools make it possible to freely deform the lumen surface and to bend and position baffles through real-time, direct manipulation of the 3D models with both hands, thus eliminating the tedious and time-consuming phase of entering the desired geometry using traditional computer-aided design (CAD) systems. The 3D models of the modified anatomies are seamlessly exported and meshed for patient-specific CFD analysis. Free-formed anatomical modifications are quantified using an in-house skeletonization-based cross-sectional geometry analysis tool. The hemodynamic performance of the systematically modified anatomies is compared with the original anatomy using CFD. The CFD results showed the relative importance of surgically created features such as pouch size, vena cava to pulmonary artery (PA) flare and PA stenosis. An interactive surgical-patch size estimator is also introduced. The combined design/analysis cycle time is used for comparing and optimizing surgical plans, and improvements are tabulated. The reduced cost of the patient-specific shape design and analysis process made it possible to envision large clinical studies to assess the validity of predictive patient-specific CFD simulations. In this paper, model anatomical design studies are performed on a total of eight different complex patient-specific anatomies. Using SURGEM, more than 30 new anatomical designs (or candidate configurations) were created, and the corresponding user times are presented. CFD performance for eight of these candidate configurations is also presented.

  9. PyHLA: tests for the association between HLA alleles and diseases.

    PubMed

    Fan, Yanhui; Song, You-Qiang

    2017-02-06

    Recently, several tools have been designed for human leukocyte antigen (HLA) typing using single nucleotide polymorphism (SNP) array and next-generation sequencing (NGS) data. These tools provide high-throughput and cost-effective approaches for identifying HLA types, so tools for downstream association analysis are highly desirable. Although several tools have been designed for multi-allelic marker association analysis, they were either designed only for microsatellite markers and do not scale well with increasing data volumes, or they were designed for large-scale data but provide a limited number of tests. We have developed a Python package called PyHLA, which implements several methods for HLA association analysis, to fill this gap. PyHLA is a tailor-made, easy-to-use, and flexible tool designed specifically for the association analysis of HLA types imputed from genome-wide genotyping and NGS data. PyHLA provides functions for association analysis, zygosity tests, and interaction tests between HLA alleles and diseases. Monte Carlo permutation and several methods for multiple-testing correction have also been implemented. PyHLA provides a convenient and powerful tool for HLA analysis: existing methods have been integrated and desired methods have been added. Furthermore, PyHLA is applicable to small and large sample sizes and can finish the analysis in a timely manner on a personal computer across different platforms. PyHLA is implemented in Python and is free, open-source software distributed under the GPLv2 license. The source code, tutorial, and examples are available at https://github.com/felixfan/PyHLA.
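
    The sketch below is not the PyHLA API; it is a hedged illustration of one building block the abstract mentions, a Monte Carlo permutation test for the association between carriage of a single HLA allele and disease status. The data and function name are hypothetical.

      # Hedged illustration (not PyHLA code): Monte Carlo permutation test for the
      # association between carriage of one HLA allele and case/control status.
      import random

      def permutation_test(carrier, is_case, n_perm=10000, seed=0):
          """carrier, is_case: parallel lists of 0/1 per subject."""
          rng = random.Random(seed)
          n_cases = sum(is_case)

          def freq_diff(labels):
              case_car = sum(c for c, l in zip(carrier, labels) if l)
              ctrl_car = sum(c for c, l in zip(carrier, labels) if not l)
              return case_car / n_cases - ctrl_car / (len(labels) - n_cases)

          observed = freq_diff(is_case)
          labels = list(is_case)
          hits = 0
          for _ in range(n_perm):
              rng.shuffle(labels)                      # permute case/control labels
              if abs(freq_diff(labels)) >= abs(observed):
                  hits += 1
          return observed, (hits + 1) / (n_perm + 1)   # two-sided empirical p-value

      carrier = [1] * 30 + [0] * 20 + [1] * 10 + [0] * 40   # hypothetical carriage data
      status = [1] * 50 + [0] * 50                          # 50 cases, 50 controls
      print(permutation_test(carrier, status))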

  10. Update on HCDstruct - A Tool for Hybrid Wing Body Conceptual Design and Structural Optimization

    NASA Technical Reports Server (NTRS)

    Gern, Frank H.

    2015-01-01

    HCDstruct is a Matlab® based software tool to rapidly build a finite element model for structural optimization of hybrid wing body (HWB) aircraft at the conceptual design level. The tool uses outputs from a Flight Optimization System (FLOPS) performance analysis together with a conceptual outer mold line of the vehicle, e.g. created by Vehicle Sketch Pad (VSP), to generate a set of MSC Nastran® bulk data files. These files can readily be used to perform a structural optimization and weight estimation using Nastran’s® Solution 200 multidisciplinary optimization solver. Initially developed at NASA Langley Research Center to perform increased fidelity conceptual level HWB centerbody structural analyses, HCDstruct has grown into a complete HWB structural sizing and weight estimation tool, including a fully flexible aeroelastic loads analysis. Recent upgrades to the tool include the expansion to a full wing tip-to-wing tip model for asymmetric analyses like engine out conditions and dynamic overswings, as well as a fully actuated trailing edge, featuring up to 15 independently actuated control surfaces and twin tails. Several example applications of the HCDstruct tool are presented.

  11. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information

    PubMed Central

    2013-01-01

    Background Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. Results We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient’s clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Conclusions Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. Virtual Slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934 PMID:23402499

  12. Decorrelation correction for nanoparticle tracking analysis of dilute polydisperse suspensions in bulk flow

    NASA Astrophysics Data System (ADS)

    Hartman, John; Kirby, Brian

    2017-03-01

    Nanoparticle tracking analysis, a multiprobe single particle tracking technique, is a widely used method to quickly determine the concentration and size distribution of colloidal particle suspensions. Many popular tools remove non-Brownian components of particle motion by subtracting the ensemble-average displacement at each time step, which is termed dedrifting. Though critical for accurate size measurements, dedrifting is shown here to introduce significant biasing error and can fundamentally limit the dynamic range of particle size that can be measured for dilute heterogeneous suspensions such as biological extracellular vesicles. We report a more accurate estimate of particle mean-square displacement, which we call decorrelation analysis, that accounts for correlations between individual and ensemble particle motion, which are spuriously introduced by dedrifting. Particle tracking simulation and experimental results show that this approach more accurately determines particle diameters for low-concentration polydisperse suspensions when compared with standard dedrifting techniques.
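
    To make the quantities concrete, the following sketch computes a per-particle mean-square displacement after standard dedrifting (subtracting the ensemble-average displacement at each time step); it illustrates the estimator being corrected, not the paper's decorrelation analysis itself, and the simulated drift and noise levels are arbitrary.

      # Sketch under stated assumptions: MSD after standard dedrifting, not the
      # paper's decorrelation-corrected estimator.
      import numpy as np

      def dedrifted_msd(tracks, lag):
          """tracks: array (n_particles, n_frames, 2) of x,y positions; lag in frames."""
          steps = np.diff(tracks, axis=1)               # per-frame displacements
          drift = steps.mean(axis=0, keepdims=True)     # ensemble-average displacement per frame
          corrected = np.cumsum(steps - drift, axis=1)  # drift-subtracted trajectories
          disp = corrected[:, lag:, :] - corrected[:, :-lag, :]
          return float(np.mean(np.sum(disp ** 2, axis=-1)))

      rng = np.random.default_rng(0)
      brownian = rng.normal(scale=0.1, size=(200, 100, 2))   # diffusive steps
      flow = np.array([0.5, 0.0])                            # hypothetical bulk-flow drift per frame
      tracks = np.cumsum(brownian + flow, axis=1)
      print(dedrifted_msd(tracks, lag=10))  # close to 2 * sigma^2 * lag = 0.2 once drift is removed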

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salama, A.; Mikhail, M.

    Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation; (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, and evaluation of density and size partition characteristics and attrition curves; and (3) generation of graphics output. The Separation ChARacteristics Estimation (SCARE) software packages were developed to balance raw density or size separation data; both density and size separation data are considered. The generated balanced data can take balanced or normalized forms. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The software packages described in this paper are valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).

  14. Bioelectrical impedance analysis: A new tool for assessing fish condition

    USGS Publications Warehouse

    Hartman, Kyle J.; Margraf, F. Joseph; Hafs, Andrew W.; Cox, M. Keith

    2015-01-01

    Bioelectrical impedance analysis (BIA) is commonly used in human health and nutrition fields but has only recently been considered as a potential tool for assessing fish condition. Once BIA is calibrated, it estimates fat/moisture levels and energy content without the need to kill fish. Despite the promise held by BIA, published studies have been divided on whether BIA can provide accurate estimates of body composition in fish. In cases where BIA was not successful, the models lacked the range of fat levels or sample sizes we determined were needed for model success (a range of dry fat levels of 29%, n = 60, yielding an R² of 0.8). A reduced range of fat levels requires an increased sample size to achieve that benchmark; therefore, standardization of methods is needed. Here we discuss standardized methods based on a decade of research, identify sources of error, discuss where BIA is headed, and suggest areas for future research.

  15. Analysis of copy number variants by three detection algorithms and their association with body size in horses.

    PubMed

    Metzger, Julia; Philipp, Ute; Lopes, Maria Susana; da Camara Machado, Artur; Felicetti, Michela; Silvestrelli, Maurizio; Distl, Ottmar

    2013-07-18

    Copy number variants (CNVs) have been shown to play an important role in genetic diversity of mammals and in the development of many complex phenotypic traits. The aim of this study was to perform a standard comparative evaluation of CNVs in horses using three different CNV detection programs and to identify genomic regions associated with body size in horses. Analysis was performed using the Illumina Equine SNP50 genotyping beadchip for 854 horses. CNVs were detected by three different algorithms, CNVPartition, PennCNV and QuantiSNP. Comparative analysis revealed 50 CNVs that affected 153 different genes mainly involved in sensory perception, signal transduction and cellular components. Genome-wide association analysis for body size showed highly significant deleted regions on ECA1, ECA8 and ECA9. Homologous regions to the detected CNVs on ECA1 and ECA9 have also been shown to be correlated with human height. Comparative analysis of CNV detection algorithms was useful to increase the specificity of CNV detection but had certain limitations dependent on the detection tool. GWAS revealed genome-wide associated CNVs for body size in horses.

  16. Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.

    PubMed

    Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B

    2017-03-30

    Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge severely slowing down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing the analysis of large-scale genomic data to be performed within R. To address large data size, the packages use memory-mapping to access data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
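
    The packages themselves are written in R; the Python sketch below only illustrates the memory-mapping idea they rely on, processing a file-backed genotype-like matrix in column blocks so the full matrix never has to fit in RAM. The dimensions and file name are arbitrary.

      # Conceptual sketch (the packages are R): block-wise access to a memory-mapped
      # genotype-like matrix stored on disk instead of in RAM.
      import numpy as np

      n_ind, n_snp = 1_000, 10_000                     # hypothetical dimensions
      fm = np.memmap("genotypes.dat", dtype=np.int8, mode="w+", shape=(n_ind, n_snp))
      fm[:, :100] = np.random.randint(0, 3, size=(n_ind, 100))  # write a block of 0/1/2 calls
      fm.flush()

      # Re-open read-only and compute per-SNP allele frequencies block by block.
      fm = np.memmap("genotypes.dat", dtype=np.int8, mode="r", shape=(n_ind, n_snp))
      block = 1_000
      freqs = np.concatenate([fm[:, j:j + block].mean(axis=0) / 2.0
                              for j in range(0, n_snp, block)])
      print(freqs.shape)  # (10000,)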

  17. P-TRAP: a Panicle TRAit Phenotyping tool.

    PubMed

    A L-Tam, Faroq; Adam, Helene; Anjos, António dos; Lorieux, Mathias; Larmande, Pierre; Ghesquière, Alain; Jouannic, Stefan; Shahbazkia, Hamid Reza

    2013-08-29

    In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field operator, expert verification and well-known academic methods.

  18. P-TRAP: a Panicle Trait Phenotyping tool

    PubMed Central

    2013-01-01

    Background In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. Results This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and can be used with different platforms (the user-friendly Graphical User Interface (GUI) uses Netbeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful, practical and collects much more data than human operators. Conclusions P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. In short, P-TRAP offers both efficient results and a user-friendly environment for experiments. The experimental results showed very good accuracy compared to field operator, expert verification and well-known academic methods. PMID:23987653

  19. Launch vehicle design and GNC sizing with ASTOS

    NASA Astrophysics Data System (ADS)

    Cremaschi, Francesco; Winter, Sebastian; Rossi, Valerio; Wiegand, Andreas

    2018-03-01

    The European Space Agency (ESA) is currently involved in several activities related to launch vehicle designs (Future Launcher Preparatory Program, Ariane 6, VEGA evolutions, etc.). Within these activities, ESA has identified the importance of developing a simulation infrastructure capable of supporting the multi-disciplinary design and preliminary guidance navigation and control (GNC) design of different launch vehicle configurations. Astos Solutions has developed the multi-disciplinary optimization and launcher GNC simulation and sizing tool (LGSST) under ESA contract. The functionality is integrated in the Analysis, Simulation and Trajectory Optimization Software for space applications (ASTOS) and is intended to be used from the early design phases up to phase B1 activities. ASTOS shall enable the user to perform detailed vehicle design tasks and assessment of GNC systems, covering all aspects of rapid configuration and scenario management, sizing of stages, trajectory-dependent estimation of structural masses, rigid and flexible body dynamics, navigation, guidance and control, worst case analysis, launch safety analysis, performance analysis, and reporting.

  20. Optimization Issues with Complex Rotorcraft Comprehensive Analysis

    NASA Technical Reports Server (NTRS)

    Walsh, Joanne L.; Young, Katherine C.; Tarzanin, Frank J.; Hirsh, Joel E.; Young, Darrell K.

    1998-01-01

    This paper investigates the use of the general purpose automatic differentiation (AD) tool called Automatic Differentiation of FORTRAN (ADIFOR) as a means of generating sensitivity derivatives for use in Boeing Helicopter's proprietary comprehensive rotor analysis code (VII). ADIFOR transforms an existing computer program into a new program that performs a sensitivity analysis in addition to the original analysis. In this study both the pros (exact derivatives, no step-size problems) and cons (more CPU, more memory) of ADIFOR are discussed. The size (based on the number of lines) of the VII code after ADIFOR processing increased by 70 percent and resulted in substantial computer memory requirements at execution. The ADIFOR derivatives took about 75 percent longer to compute than the finite-difference derivatives. However, the ADIFOR derivatives are exact and are not functions of step-size. The VII sensitivity derivatives generated by ADIFOR are compared with finite-difference derivatives. The ADIFOR and finite-difference derivatives are used in three optimization schemes to solve a low vibration rotor design problem.
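
    ADIFOR itself transforms Fortran source; the short Python sketch below merely illustrates why automatic differentiation yields exact derivatives with no step-size dependence, using forward-mode dual numbers on a toy function (all names here are illustrative).

      # Illustrative only (not ADIFOR): forward-mode automatic differentiation with
      # dual numbers, showing exact derivatives versus step-size-dependent finite differences.
      class Dual:
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val, self.der * o.val + self.val * o.der)  # product rule
          __rmul__ = __mul__

      def f(x):                       # any code path built from + and * works unchanged
          return 3 * x * x + 2 * x + 1

      x = Dual(2.0, 1.0)              # seed derivative dx/dx = 1
      print(f(x).val, f(x).der)       # 17.0 and the exact derivative 14.0

      h = 1e-1                        # finite differences depend on the step size h
      print((f(Dual(2.0 + h)).val - f(Dual(2.0)).val) / h)   # 14.3 (truncation error)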

  1. Inertial focusing of microparticles and its limitations

    NASA Astrophysics Data System (ADS)

    Cruz, FJ; Hooshmand Zadeh, S.; Wu, ZG; Hjort, K.

    2016-10-01

    Microfluidic devices are useful tools for healthcare, biological and chemical analysis and materials synthesis amongst fields that can benefit from the unique physics of these systems. In this paper we studied inertial focusing as a tool for hydrodynamic sorting of particles by size. Theory and experimental results are provided as a background for a discussion on how to extend the technology to submicron particles. Different geometries and dimensions of microchannels were designed and simulation data was compared to the experimental results.

  2. The effectiveness of tools used to evaluate successful critical decision making skills for applicants to healthcare graduate educational programs: a systematic review.

    PubMed

    Benham, Brian; Hawley, Diane

    2015-05-15

    Students leave healthcare academic programs for a variety of reasons. When they attrite, it is disappointing for the student as well as their faculty. Advanced practice nursing and other healthcare professions require not only extensive academic preparation, but also the ability to critically evaluate patient care situations. The ability to critically evaluate a situation is not innate. Critical decision making skills are high-level skills that are difficult to assess. For the purpose of this review, critical decision making and critical thinking skills refer to the same constructs and will be referred to globally as critical decision making skills. The objective of this review was to identify the effectiveness of tools used to evaluate critical decision making skills for applicants to healthcare graduate educational programs. Types of participants: Adult (18 years of age or older) applicants, students enrolled and/or recent graduates (within one year of completion) of healthcare graduate educational programs. Types of interventions: This review considered studies that evaluated the utilization of unique tools as well as standard tools, such as the Graduate Record Examination or grade point average, to evaluate critical decision making skills in graduate healthcare program applicants. Types of studies: Experimental and non-experimental studies were considered for inclusion. Types of outcomes: Successful quantitative evaluations based on specific field-of-study standards. The search strategy aimed to find both published and unpublished studies. Studies published in English after 1969 were considered for inclusion in this review. Databases that included both published and unpublished (grey) literature were searched. Additionally, reference lists from all articles retrieved were examined for articles for inclusion. Selected papers were assessed by two independent reviewers using standardized critical appraisal instruments from the Joanna Briggs Institute. Any disagreement between reviewers was resolved through discussion or with a third reviewer. Data were extracted independently by each reviewer from the papers included in the review using a Microsoft Excel spreadsheet. Extracted data included study type, 'r' values, number of subjects and reported 'p' values, indexed by author, year and study title. The meta-analysis was performed using the method for effect size analysis from Hunter and Schmidt. The syntax for the equations was transposed into a Microsoft Excel spreadsheet for data entry, analysis and graph creation. No articles or papers addressing unique tools for ascertaining critical decision making skills met the inclusion criteria. Standard tools, which were represented in the literature, assess critical decision making skills via prediction of academic and clinical success, which indicates the presence of critical decision making skills in graduate healthcare students. A total of 16 studies addressing standard tools were included in this review. All were retrospective case series studies. The date range for the included studies was 1970 to 2009. The strongest relationship was the correlation between undergraduate grade point average and graduate grade point average (small effect size, with an 'r' value of 0.27 and a credibility interval (CrI) of 0.18-0.37). The second strongest relationship was between the Graduate Record Examination's verbal section and graduate grade point average (small effect size, with an 'r' value of 0.24 and a CrI of 0.11-0.37). An applicant's undergraduate GPA therefore has the strongest correlation with graduate healthcare program success of the indicators analyzed (r = 0.27, small effect size), and the next best predictor was the GRE Verbal score (r = 0.24, small effect size). All of the variables carried positive correlations with graduate success, though with smaller effect sizes. This review supports the continued use of traditional indicators of graduate school potential, the undergraduate grade point average and the various sections of the Graduate Record Examination, for the selection of graduate healthcare applicants. Primary studies should be funded and performed to assess the use of unique tools in assessing critical thinking in graduate healthcare students.
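
    For readers unfamiliar with the Hunter and Schmidt approach named above, the sketch below shows a bare-bones sample-size-weighted meta-analysis of correlations with an 80% credibility interval; the study values are invented and this is not the review's spreadsheet.

      # Hedged sketch of a "bare-bones" Hunter & Schmidt meta-analysis of correlations.
      import math

      def hunter_schmidt(r_values, n_values):
          k = len(r_values)
          n_total = sum(n_values)
          r_bar = sum(n * r for n, r in zip(n_values, r_values)) / n_total      # N-weighted mean r
          var_r = sum(n * (r - r_bar) ** 2 for n, r in zip(n_values, r_values)) / n_total
          n_mean = n_total / k
          var_e = (1 - r_bar ** 2) ** 2 / (n_mean - 1)                          # sampling-error variance
          var_rho = max(var_r - var_e, 0.0)                                     # residual (true) variance
          sd_rho = math.sqrt(var_rho)
          cri = (r_bar - 1.28 * sd_rho, r_bar + 1.28 * sd_rho)                  # 80% credibility interval
          return r_bar, var_rho, cri

      # Hypothetical correlations between undergraduate GPA and graduate GPA
      print(hunter_schmidt([0.31, 0.22, 0.28, 0.25], [120, 85, 200, 150]))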

  3. GUIDELINES FOR THE APPLICATION OF SEM/EDX ANALYTICAL TECHNIQUES FOR FINE AND COARSE PM SAMPLES

    EPA Science Inventory

    Scanning Electron Microscopy (SEM) coupled with Energy-Dispersive X-ray analysis (EDX) is a powerful tool in the characterization and source apportionment of environmental particulate matter (PM), providing size, chemistry, and morphology of particles as small as a few tenths ...

  4. MetaGenyo: a web tool for meta-analysis of genetic association studies.

    PubMed

    Martorell-Marugan, Jordi; Toro-Dominguez, Daniel; Alarcon-Riquelme, Marta E; Carmona-Saez, Pedro

    2017-12-16

    Genetic association studies (GAS) aim to evaluate the association between genetic variants and phenotypes. In the last few years, the number of studies of this type has increased exponentially, but the results are not always reproducible because of problems in experimental design, low sample sizes and other methodological errors. In this field, meta-analysis techniques are becoming very popular tools to combine results across studies, increase statistical power and resolve discrepancies. A meta-analysis summarizes research findings, increases statistical power and enables the identification of genuine associations between genotypes and phenotypes. Meta-analysis techniques are increasingly used in GAS, but the number of published meta-analyses containing errors is also increasing. Although there are several software packages that implement meta-analysis, none of them is specifically designed for genetic association studies, and in most cases their use requires advanced programming or scripting expertise. We have developed MetaGenyo, a web tool for meta-analysis in GAS. MetaGenyo implements a complete and comprehensive workflow that can be executed in an easy-to-use environment without programming knowledge. MetaGenyo has been developed to guide users through the main steps of a GAS meta-analysis, covering the Hardy-Weinberg test, statistical association for different genetic models, analysis of heterogeneity, testing for publication bias, subgroup analysis and robustness testing of the results. MetaGenyo is a useful tool for conducting comprehensive genetic association meta-analyses. The application is freely available at http://bioinfo.genyo.es/metagenyo/ .
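
    As an illustration of the core calculation such a workflow automates (and not MetaGenyo's own code), the sketch below pools per-study odds ratios from 2x2 allele-count tables with fixed-effect, inverse-variance weighting; the counts are invented.

      # Illustrative only: fixed-effect, inverse-variance pooling of odds ratios.
      import math

      def log_or_and_var(a, b, c, d):
          """a,b: case allele counts (risk/other); c,d: control allele counts."""
          log_or = math.log((a * d) / (b * c))
          var = 1 / a + 1 / b + 1 / c + 1 / d          # Woolf variance of the log OR
          return log_or, var

      def fixed_effect_pool(tables):
          effects = [log_or_and_var(*t) for t in tables]
          weights = [1 / v for _, v in effects]
          pooled = sum(w * e for (e, _), w in zip(effects, weights)) / sum(weights)
          se = math.sqrt(1 / sum(weights))
          ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
          return math.exp(pooled), ci

      studies = [(60, 140, 45, 155), (80, 220, 70, 230), (30, 70, 20, 80)]  # made-up counts
      print(fixed_effect_pool(studies))                # pooled OR and 95% CI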

  5. Resource depletion through primate stone technology

    PubMed Central

    Tan, Amanda; Haslam, Michael; Kulik, Lars; Proffitt, Tomos; Malaivijitnond, Suchinda; Gumert, Michael

    2017-01-01

    Tool use has allowed humans to become one of the most successful species. However, tool-assisted foraging has also pushed many of our prey species to extinction or endangerment, a technology-driven process thought to be uniquely human. Here, we demonstrate that tool-assisted foraging on shellfish by long-tailed macaques (Macaca fascicularis) in Khao Sam Roi Yot National Park, Thailand, reduces prey size and prey abundance, with more pronounced effects where the macaque population size is larger. We compared availability, sizes and maturation stages of shellfish between two adjacent islands inhabited by different-sized macaque populations and demonstrate potential effects on the prey reproductive biology. We provide evidence that once technological macaques reach a large enough group size, they enter a feedback loop – driving shellfish prey size down with attendant changes in the tool sizes used by the monkeys. If this pattern continues, prey populations could be reduced to a point where tool-assisted foraging is no longer beneficial to the macaques, which in return may lessen or extinguish the remarkable foraging technology employed by these primates. PMID:28884681

  6. On the use of cartographic projections in visualizing phylo-genetic tree space

    PubMed Central

    2010-01-01

    Phylogenetic analysis is becoming an increasingly important tool for biological research. Applications include epidemiological studies, drug development, and evolutionary analysis. Phylogenetic search is a known NP-Hard problem. The size of the data sets which can be analyzed is limited by the exponential growth in the number of trees that must be considered as the problem size increases. A better understanding of the problem space could lead to better methods, which in turn could lead to the feasible analysis of more data sets. We present a definition of phylogenetic tree space and a visualization of this space that shows significant exploitable structure. This structure can be used to develop search methods capable of handling much larger data sets. PMID:20529355

  7. Equation-free analysis of agent-based models and systematic parameter determination

    NASA Astrophysics Data System (ADS)

    Thomas, Spencer A.; Lloyd, David J. B.; Skeldon, Anne C.

    2016-12-01

    Agent-based models (ABMs) are increasingly used in social science, economics, mathematics, biology and computer science to describe time-dependent systems in circumstances where a description in terms of equations is difficult. Yet few tools are currently available for the systematic analysis of ABM behaviour. Numerical continuation and bifurcation analysis is a well-established tool for the study of deterministic systems. Recently, equation-free (EF) methods have been developed to extend numerical continuation techniques to systems where the dynamics are described at a microscopic scale and continuation of a macroscopic property of the system is considered. To date, the practical use of EF methods has been limited by (1) the overhead of application-specific implementation, (2) the laborious configuration of problem-specific parameters, and (3) large ensemble sizes (potentially) leading to computationally restrictive run-times. In this paper we address these issues with our tool for the EF continuation of stochastic systems, which includes algorithms to systematically configure problem-specific parameters and enhance robustness to noise. Our tool is generic, can be applied to any 'black-box' simulator, and determines the essential EF parameters prior to EF analysis. Robustness is significantly improved using our convergence-constraint with corrector-repeat (C3R) method. This algorithm automatically detects outliers based on the dynamics of the underlying system, enabling both an order-of-magnitude reduction in ensemble size and continuation of systems at much higher levels of noise than classical approaches. We demonstrate our method with application to several ABM models, revealing parameter dependence, bifurcation and stability analysis of these complex systems and giving a deep understanding of the dynamical behaviour of the models in a way that is not otherwise easily obtainable. In each case we demonstrate our systematic parameter determination stage for configuring the system-specific EF parameters.

  8. Ion mobility-mass spectrometry as a tool to investigate protein-ligand interactions.

    PubMed

    Göth, Melanie; Pagel, Kevin

    2017-07-01

    Ion mobility-mass spectrometry (IM-MS) is a powerful tool for the simultaneous analysis of mass, charge, size, and shape of ionic species. It allows the characterization of even low-abundant species in complex samples and is therefore particularly suitable for the analysis of proteins and their assemblies. In the last few years even complex and intractable species have been investigated successfully with IM-MS and the number of publications in this field is steadily growing. This trend article highlights recent advances in which IM-MS was used to study protein-ligand complexes and in particular focuses on the catch and release (CaR) strategy and collision-induced unfolding (CIU). Graphical Abstract Native mass spectrometry and ion mobility-mass spectrometry are versatile tools to follow the stoichiometry, energetics, and structural impact of protein-ligand binding.

  9. Habitat Design Optimization and Analysis

    NASA Technical Reports Server (NTRS)

    SanSoucie, Michael P.; Hull, Patrick V.; Tinker, Michael L.

    2006-01-01

    Long-duration surface missions to the Moon and Mars will require habitats for the astronauts. The materials chosen for the habitat walls play a direct role in the protection against the harsh environments found on the surface. Choosing the best materials, their configuration, and the amount required is extremely difficult due to the immense size of the design region. Advanced optimization techniques are necessary for habitat wall design. Standard optimization techniques are not suitable for problems with such large search spaces; therefore, a habitat design optimization tool utilizing genetic algorithms has been developed. Genetic algorithms use a "survival of the fittest" philosophy, where the most fit individuals are more likely to survive and reproduce. This habitat design optimization tool is a multi-objective formulation of structural analysis, heat loss, radiation protection, and meteoroid protection. This paper presents the research and development of this tool.
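
    The following toy sketch, which is not the NASA tool, shows the selection, crossover and mutation loop that the "survival of the fittest" description refers to, applied to a made-up objective over hypothetical wall-layer thicknesses.

      # Minimal genetic-algorithm sketch with an invented single-objective cost.
      import random

      def fitness(layers):               # toy objective: shielding benefit minus mass penalty
          return sum(min(t, 2.0) for t in layers) - 0.3 * sum(layers)

      def evolve(n_layers=4, pop_size=30, generations=100, seed=1):
          rng = random.Random(seed)
          pop = [[rng.uniform(0.0, 5.0) for _ in range(n_layers)] for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=fitness, reverse=True)
              survivors = pop[: pop_size // 2]                 # fittest half survives
              children = []
              while len(children) < pop_size - len(survivors):
                  a, b = rng.sample(survivors, 2)
                  cut = rng.randrange(1, n_layers)
                  child = a[:cut] + b[cut:]                    # one-point crossover
                  if rng.random() < 0.2:                       # mutation
                      child[rng.randrange(n_layers)] += rng.gauss(0, 0.5)
                  children.append(child)
              pop = survivors + children
          return max(pop, key=fitness)

      print(evolve())   # layer thicknesses approach the 2.0 "knee" of the toy objective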

  10. Rapid fabrication of embossing tools for the production of polymeric microfluidic devices for bioanalytical applications

    NASA Astrophysics Data System (ADS)

    Ford, Sean M.; McCandless, Andrew B.; Liu, Xuezhu; Soper, Steven A.

    2001-09-01

    In this paper we present embossing tools that were fabricated using both UV and X-ray lithography. The embossing tools created were used to emboss microfluidic channels for bioanalytical applications. Specifically, two tools were fabricated. One, using X-ray lithography, was fabricated for electrophoretic separations in DNA restriction fragment analysis. A second tool, fabricated using SU8, was designed for micro-PCR applications. The depths of both tools were approximately 100 micrometers. Both tools were made by directly electroforming nickel on a stainless steel base. Fabrication time for the tool made using X-ray lithography was less than one week and largely depended on the availability of the X-ray source. The SU8 embossing tool was fabricated in less than 24 hours. The resulting nickel electroforms from both processes were extremely robust and did not fail under the embossing conditions required for PMMA and/or polycarbonate. Some problems removing SU8 after electroforming were seen for smaller gaps between nickel structures.

  11. The value of job analysis, job description and performance.

    PubMed

    Wolfe, M N; Coggins, S

    1997-01-01

    All companies, regardless of size, are faced with the same employment concerns. Efficient personnel management requires the use of three human resource techniques--job analysis, job description and performance appraisal. These techniques and tools are not for large practices only. Small groups can obtain the same benefits by employing these performance control measures. Job analysis allows for the development of a compensation system. Job descriptions summarize the most important duties. Performance appraisals help reward outstanding work.

  12. Lunar Outpost Technologies Breakeven Study

    NASA Technical Reports Server (NTRS)

    Perka, Alan

    2008-01-01

    This viewgraph presentation compares several Lunar Outpost (LO) life support technology combinations, evaluates the combinations for two clothing options (disposable clothing versus laundering soiled clothing), and evaluates the use of the Advanced Life Support Sizing and Analysis Tool (ALSSAT) to estimate Equivalent System Mass (ESM).

  13. Benchmarking and Modeling of a Conventional Mid-Size Car Using ALPHA (SAE Paper 2015-01-1140)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...

  14. StrAuto: automation and parallelization of STRUCTURE analysis.

    PubMed

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa, including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational load of this analysis, it does not fully automate the use of replicate STRUCTURE runs required for downstream inference of the optimal K. There is a pressing need for a tool that can deploy population structure analysis on high-performance computing clusters. We present an updated version of the popular Python program StrAuto to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach, in addition to implementing parallel computation - a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and is available for download from http://strauto.popgen.org .
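
    As a sketch of the post-processing step mentioned above (not StrAuto's source code), the following computes the Evanno ΔK statistic from replicate STRUCTURE log-likelihoods, pairing replicate runs by index for the finite differences; the log-likelihood values are invented.

      # Hedged sketch: Evanno delta-K, i.e. mean(|L''(K)|) / sd(L(K)) over replicate runs.
      from statistics import mean, stdev

      def evanno_delta_k(lnprob):
          """lnprob: dict K -> list of ln P(D) values from replicate runs."""
          ks = sorted(lnprob)
          delta = {}
          for i in range(1, len(ks) - 1):
              k_prev, k, k_next = ks[i - 1], ks[i], ks[i + 1]
              second = [abs((lnprob[k_next][r] - lnprob[k][r]) -
                            (lnprob[k][r] - lnprob[k_prev][r]))
                        for r in range(len(lnprob[k]))]        # |L''(K)| per replicate
              delta[k] = mean(second) / stdev(lnprob[k])       # delta-K
          return delta

      # Hypothetical replicate log-likelihoods for K = 1..4 (5 runs each)
      runs = {1: [-5200, -5210, -5195, -5205, -5199],
              2: [-4300, -4310, -4295, -4305, -4302],
              3: [-4280, -4290, -4275, -4288, -4283],
              4: [-4279, -4330, -4270, -4310, -4295]}
      print(evanno_delta_k(runs))   # delta-K peaks at the best-supported K (here K = 2)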

  15. Mars Reconnaissance Orbiter Uplink Analysis Tool

    NASA Technical Reports Server (NTRS)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; Hwang, Pauline

    2008-01-01

    This software analyzes Mars Reconnaissance Orbiter (MRO) orbital geometry with respect to Mars Exploration Rover (MER) contact windows, and is the first tool of its kind designed specifically to support MRO-MER interface coordination. Prior to this automated tool, the analysis was done manually with Excel and the UNIX command line, and the process took approximately 30 minutes for each analysis. The current automated analysis takes less than 30 seconds. The tool resides on the flight machine and uses a PHP interface that performs the entire analysis of the input files, taking into account one-way light time from another input file. Input files are copied over to the proper directories and are dynamically read into the tool's interface. The user can then choose the corresponding input files based on the time frame desired for analysis. After submission of the Web form, the tool merges the two files into a single, time-ordered listing of events for both spacecraft. The times are converted to the same reference time (Earth Transmit Time) by reading in a light time file and performing the calculations necessary to shift the time formats. The program also has the ability to vary the size of the keep-out window on the main page of the analysis tool by inputting a custom time for padding each MRO event time. The parameters on the form are read in and passed to the second page for analysis. Everything is fully coded in PHP and can be accessed by anyone with access to the machine via a Web page. This uplink tool will continue to be used for the duration of the MER mission's needs for X-band uplinks. Future missions can also use the tool to check overflight times as well as potential site observation times. Adaptation of the input files to the proper format, and of the keep-out window times, would allow for other analyses. Any operations task that uses the idea of keep-out windows will have a use for this program.
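
    The tool itself is written in PHP; the Python sketch below only illustrates the two operations described, merging per-spacecraft event lists into one time-ordered listing and flagging MER contact windows that fall inside a padded keep-out window around each MRO event. Field names, times and the padding value are assumptions.

      # Conceptual sketch only: merge time-ordered event lists and apply a padded
      # keep-out window check. All event data here are invented.
      from datetime import datetime, timedelta

      def merge_events(events_a, events_b):
          """Each event: (datetime in Earth Transmit Time, spacecraft, description)."""
          return sorted(events_a + events_b, key=lambda e: e[0])

      def conflicts(mro_events, mer_windows, pad_minutes=5):
          pad = timedelta(minutes=pad_minutes)
          hits = []
          for start, end, label in mer_windows:
              for t, _, desc in mro_events:
                  if start - pad <= t <= end + pad:        # inside padded keep-out window
                      hits.append((label, desc, t))
          return hits

      mro = [(datetime(2008, 3, 1, 12, 2), "MRO", "orbit maneuver"),
             (datetime(2008, 3, 1, 13, 30), "MRO", "science observation")]
      mer_events = [(datetime(2008, 3, 1, 12, 0), "MER-B", "relay pass start")]
      print(merge_events(mro, mer_events))

      mer_windows = [(datetime(2008, 3, 1, 12, 0), datetime(2008, 3, 1, 12, 10), "MER-B relay pass")]
      print(conflicts(mro, mer_windows, pad_minutes=5))    # the 12:02 MRO event is flagged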

  16. Exploring radar and lightning variables associated with the Lightning Jump. Can we predict the size of the hail?

    NASA Astrophysics Data System (ADS)

    Farnell, C.; Rigo, T.; Pineda, N.

    2018-04-01

    Severe weather regularly hits the Lleida Plain (western part of Catalonia, NE Iberian Peninsula), causing important damage to local agriculture. To support severe weather surveillance tasks, the Meteorological Service of Catalonia (SMC) implemented the Lightning Jump (LJ) algorithm as an operational warning tool in 2016, after an exhaustive validation phase of several months. The present study delves into the relationship between Lightning Jump alerts and hail occurrence through the analysis of lightning and radar variables at the moment the warning is issued. Overall, the study consisted of the analysis of 149 cases, grouped into two categories according to hail size, small and large hail, with a threshold of 2 cm in diameter. The thunderstorms related to large hail presented remarkable differences in some of the variables analysed, which could help forecast the size of hail when the LJ alert is triggered. Moreover, other variables made it possible to observe and corroborate how the LJ algorithm behaves during the 13 min before the warning is triggered.

  17. Laser cutting of various materials: Kerf width size analysis and life cycle assessment of cutting process

    NASA Astrophysics Data System (ADS)

    Yilbas, Bekir Sami; Shaukat, Mian Mobeen; Ashraf, Farhan

    2017-08-01

    Laser cutting of various materials, including Ti-6Al-4V alloy, steel 304, Inconel 625, and alumina, is carried out to assess the kerf width size variation along the cut section. A life cycle assessment is carried out to determine the environmental impact of the laser cutting in terms of the material wasted during the cutting process. The kerf width size is formulated and predicted using a lumped parameter analysis and is measured from the experiments. The influence of laser output power and laser cutting speed on the kerf width size variation is analyzed using scanning electron and optical microscopy. In the experiments, high pressure nitrogen assisting gas is used to prevent oxidation reactions in the cutting section. It is found that the kerf width size predicted from the lumped parameter analysis agrees well with the experimental data. The kerf width size variation increases with increasing laser output power; however, this behavior reverses with increasing laser cutting speed. The life cycle assessment reveals that material selection for laser cutting is critical from an environmental protection point of view. Inconel 625 contributes the most to the environmental damage; however, recycling the laser cutting waste reduces this contribution.

  18. Classification Algorithms for Big Data Analysis, a Map Reduce Approach

    NASA Astrophysics Data System (ADS)

    Ayma, V. A.; Ferreira, R. S.; Happ, P.; Oliveira, D.; Feitosa, R.; Costa, G.; Plaza, A.; Gamba, P.

    2015-03-01

    For many years, the scientific community has been concerned with how to increase the accuracy of different classification methods, and major achievements have been made so far. Besides this issue, the increasing amount of data generated every day by remote sensors raises further challenges to be overcome. In this work, a tool within the scope of the InterIMAGE Cloud Platform (ICP), an open-source, distributed framework for automatic image interpretation, is presented. The tool, named ICP: Data Mining Package, is able to perform supervised classification procedures on huge amounts of data, usually referred to as big data, on a distributed infrastructure using Hadoop MapReduce. The tool has four classification algorithms implemented, taken from WEKA's machine learning library, namely: Decision Trees, Naïve Bayes, Random Forest and Support Vector Machines (SVM). The results of an experimental analysis using an SVM classifier on data sets of different sizes for different cluster configurations demonstrate the potential of the tool, as well as aspects that affect its performance.

  19. Web-Based Phylogenetic Assignment Tool for Analysis of Terminal Restriction Fragment Length Polymorphism Profiles of Microbial Communities

    PubMed Central

    Kent, Angela D.; Smith, Dan J.; Benson, Barbara J.; Triplett, Eric W.

    2003-01-01

    Culture-independent DNA fingerprints are commonly used to assess the diversity of a microbial community. However, relating species composition to community profiles produced by community fingerprint methods is not straightforward. Terminal restriction fragment length polymorphism (T-RFLP) is a community fingerprint method in which phylogenetic assignments may be inferred from the terminal restriction fragment (T-RF) sizes through the use of web-based resources that predict T-RF sizes for known bacteria. The process quickly becomes computationally intensive due to the need to analyze profiles produced by multiple restriction digests and the complexity of profiles generated by natural microbial communities. A web-based tool is described here that rapidly generates phylogenetic assignments from submitted community T-RFLP profiles based on a database of fragments produced by known 16S rRNA gene sequences. Users have the option of submitting a customized database generated from unpublished sequences or from a gene other than the 16S rRNA gene. This phylogenetic assignment tool allows users to employ T-RFLP to simultaneously analyze microbial community diversity and species composition. An analysis of the variability of bacterial species composition throughout the water column in a humic lake was carried out to demonstrate the functionality of the phylogenetic assignment tool. This method was validated by comparing the results generated by this program with results from a 16S rRNA gene clone library. PMID:14602639
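
    As an illustration of the core matching step described above, the short Python sketch below assigns observed terminal restriction fragment sizes to candidate taxa whose predicted fragment sizes agree, within a user-set tolerance, across every restriction digest. The database entries, enzyme names and tolerance are hypothetical placeholders, not the tool's actual 16S rRNA gene database or algorithm.

        # Hypothetical database: taxon -> {enzyme: predicted T-RF size in base pairs}
        PREDICTED = {
            "Taxon A": {"HhaI": 204, "MspI": 490},
            "Taxon B": {"HhaI": 370, "MspI": 148},
            "Taxon C": {"HhaI": 206, "MspI": 488},
        }

        def assign(observed, tol=1.0):
            """observed: {enzyme: list of T-RF sizes from one community profile}."""
            hits = []
            for taxon, frags in PREDICTED.items():
                # Require a fragment within +/- tol bp in every digest used
                if all(any(abs(obs - size) <= tol for obs in observed.get(enzyme, []))
                       for enzyme, size in frags.items()):
                    hits.append(taxon)
            return hits

        profile = {"HhaI": [204.6, 370.4], "MspI": [489.8, 147.5]}
        print(assign(profile))   # -> ['Taxon A', 'Taxon B']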

  20. fMRat: an extension of SPM for a fully automatic analysis of rodent brain functional magnetic resonance series.

    PubMed

    Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel

    2016-05-01

    The purpose of this study was to develop a multi-platform automatic software tool for full processing of fMRI rodent studies. Existing tools require the usage of several different plug-ins, a significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks(®)) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the somatosensorial primary cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat .

  1. High-resolution, submicron particle size distribution analysis using gravitational-sweep sedimentation.

    PubMed Central

    Mächtle, W

    1999-01-01

    Sedimentation velocity is a powerful tool for the analysis of complex solutions of macromolecules. However, sample turbidity imposes an upper limit to the size of molecular complexes currently amenable to such analysis. Furthermore, the breadth of the particle size distribution, combined with possible variations in the density of different particles, makes it difficult to analyze extremely complex mixtures. These same problems are faced in the polymer industry, where dispersions of latices, pigments, lacquers, and emulsions must be characterized. There is a rich history of methods developed for the polymer industry finding use in the biochemical sciences. Two such methods are presented. These use analytical ultracentrifugation to determine the density and size distributions for submicron-sized particles. Both methods rely on Stokes' equations to estimate particle size and density, whereas turbidity, corrected using Mie's theory, provides the concentration measurement. The first method uses the sedimentation time in dispersion media of different densities to evaluate the particle density and size distribution. This method works provided the sample is chemically homogeneous. The second method splices together data gathered at different sample concentrations, thus permitting the high-resolution determination of the size distribution of particle diameters ranging from 10 to 3000 nm. By increasing the rotor speed exponentially from 0 to 40,000 rpm over a 1-h period, size distributions may be measured for extremely broadly distributed dispersions. Presented here is a short history of particle size distribution analysis using the ultracentrifuge, along with a description of the newest experimental methods. Several applications of the methods are provided that demonstrate the breadth of its utility, including extensions to samples containing nonspherical and chromophoric particles. PMID:9916040
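
    The sizing step referred to above rests on Stokes' law applied to sedimentation in a centrifugal field. The Python sketch below shows that relationship for a single particle travelling from radius r0 to r at a fixed rotor speed; the densities, viscosity and geometry are illustrative values, and the full gravitational-sweep and Mie turbidity corrections of the paper are not reproduced.

        import math

        def stokes_diameter(t_s, r0_m, r_m, rpm, rho_p, rho_f, eta):
            """Diameter (m) of a sphere sedimenting from radius r0_m to r_m in t_s seconds.

            rho_p, rho_f: particle and fluid densities (kg/m^3); eta: viscosity (Pa*s).
            """
            omega = 2.0 * math.pi * rpm / 60.0          # rotor speed in rad/s
            return math.sqrt(18.0 * eta * math.log(r_m / r0_m)
                             / ((rho_p - rho_f) * omega ** 2 * t_s))

        # Example: a polymer latex (1050 kg/m^3) in water after 10 min at 40,000 rpm
        d = stokes_diameter(t_s=600, r0_m=0.060, r_m=0.072, rpm=40000,
                            rho_p=1050.0, rho_f=998.0, eta=1.0e-3)
        print(f"estimated particle diameter: {d * 1e9:.0f} nm")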

  2. Scalable Performance Environments for Parallel Systems

    NASA Technical Reports Server (NTRS)

    Reed, Daniel A.; Olson, Robert D.; Aydt, Ruth A.; Madhyastha, Tara M.; Birkett, Thomas; Jensen, David W.; Nazief, Bobby A. A.; Totty, Brian K.

    1991-01-01

    As parallel systems expand in size and complexity, the absence of performance tools for these parallel systems exacerbates the already difficult problems of application program and system software performance tuning. Moreover, given the pace of technological change, we can no longer afford to develop ad hoc, one-of-a-kind performance instrumentation software; we need scalable, portable performance analysis tools. We describe an environment prototype based on the lessons learned from two previous generations of performance data analysis software. Our environment prototype contains a set of performance data transformation modules that can be interconnected in user-specified ways. It is the responsibility of the environment infrastructure to hide details of module interconnection and data sharing. The environment is written in C++ with the graphical displays based on X windows and the Motif toolkit. It allows users to interconnect and configure modules graphically to form an acyclic, directed data analysis graph. Performance trace data are represented in a self-documenting stream format that includes internal definitions of data types, sizes, and names. The environment prototype supports the use of head-mounted displays and sonic data presentation in addition to the traditional use of visual techniques.

  3. NREL's System Advisor Model Simplifies Complex Energy Analysis (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2015-01-01

    NREL has developed a tool -- the System Advisor Model (SAM) -- that can help decision makers analyze cost, performance, and financing of any size grid-connected solar, wind, or geothermal power project. Manufacturers, engineering and consulting firms, research and development firms, utilities, developers, venture capital firms, and international organizations use SAM for end-to-end analysis that helps determine whether and how to make investments in renewable energy projects.

  4. A high-throughput label-free nanoparticle analyser.

    PubMed

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M; Ruoslahti, Erkki; Cleland, Andrew N

    2011-05-01

    Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.

  5. Machine learning and data science in soft materials engineering

    NASA Astrophysics Data System (ADS)

    Ferguson, Andrew L.

    2018-01-01

    In many branches of materials science it is now routine to generate data sets of such large size and dimensionality that conventional methods of analysis fail. Paradigms and tools from data science and machine learning can provide scalable approaches to identify and extract trends and patterns within voluminous data sets, perform guided traversals of high-dimensional phase spaces, and furnish data-driven strategies for inverse materials design. This topical review provides an accessible introduction to machine learning tools in the context of soft and biological materials by ‘de-jargonizing’ data science terminology, presenting a taxonomy of machine learning techniques, and surveying the mathematical underpinnings and software implementations of popular tools, including principal component analysis, independent component analysis, diffusion maps, support vector machines, and relative entropy. We present illustrative examples of machine learning applications in soft matter, including inverse design of self-assembling materials, nonlinear learning of protein folding landscapes, high-throughput antimicrobial peptide design, and data-driven materials design engines. We close with an outlook on the challenges and opportunities for the field.
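
    As a small, generic illustration of one of the tools surveyed above, the Python sketch below applies principal component analysis to a synthetic high-dimensional data set and reports the variance captured by the leading components; the data and dimensions are invented and do not come from the review.

        import numpy as np

        rng = np.random.default_rng(0)
        # Hypothetical data: 500 configurations described by 50 correlated features
        latent = rng.normal(size=(500, 2))
        mixing = rng.normal(size=(2, 50))
        X = latent @ mixing + 0.1 * rng.normal(size=(500, 50))

        Xc = X - X.mean(axis=0)                     # center the data
        cov = np.cov(Xc, rowvar=False)              # 50 x 50 covariance matrix
        evals, evecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
        order = np.argsort(evals)[::-1]
        explained = evals[order] / evals.sum()
        scores = Xc @ evecs[:, order[:2]]           # projection onto the top 2 components

        print("variance explained by PC1, PC2:", np.round(explained[:2], 3))
        print("reduced representation shape:", scores.shape)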

  6. Parametric studies and orbital analysis for an electric orbit transfer vehicle space flight demonstration

    NASA Astrophysics Data System (ADS)

    Avila, Edward R.

    The Electric Insertion Transfer Experiment (ELITE) is an Air Force Advanced Technology Transition Demonstration which is being executed as a cooperative Research and Development Agreement between the Phillips Lab and TRW. The objective is to build, test, and fly a solar-electric orbit transfer and orbit maneuvering vehicle, as a precursor to an operational electric orbit transfer vehicle (EOTV). This paper surveys some of the analysis tools used to do parametric studies and discusses the study results. The primary analysis tool was the Electric Vehicle Analyzer (EVA) developed by the Phillips Lab and modified by The Aerospace Corporation. It uses a simple orbit averaging approach to model low-thrust transfer performance, and runs in a PC environment. The assumptions used in deriving the EVA math model are presented. This tool and others surveyed were used to size the solar array power required for the spacecraft, and develop a baseline mission profile that meets the requirements of the ELITE mission.

  7. Machine learning and data science in soft materials engineering.

    PubMed

    Ferguson, Andrew L

    2018-01-31

    In many branches of materials science it is now routine to generate data sets of such large size and dimensionality that conventional methods of analysis fail. Paradigms and tools from data science and machine learning can provide scalable approaches to identify and extract trends and patterns within voluminous data sets, perform guided traversals of high-dimensional phase spaces, and furnish data-driven strategies for inverse materials design. This topical review provides an accessible introduction to machine learning tools in the context of soft and biological materials by 'de-jargonizing' data science terminology, presenting a taxonomy of machine learning techniques, and surveying the mathematical underpinnings and software implementations of popular tools, including principal component analysis, independent component analysis, diffusion maps, support vector machines, and relative entropy. We present illustrative examples of machine learning applications in soft matter, including inverse design of self-assembling materials, nonlinear learning of protein folding landscapes, high-throughput antimicrobial peptide design, and data-driven materials design engines. We close with an outlook on the challenges and opportunities for the field.

  8. Tools, information sources, and methods used in deciding on drug availability in HMOs.

    PubMed

    Barner, J C; Thomas, J

    1998-01-01

    The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.

  9. Meta-analyzing dependent correlations: an SPSS macro and an R script.

    PubMed

    Cheung, Shu Fai; Chan, Darius K-S

    2014-06-01

    The presence of dependent correlation is a common problem in meta-analysis. Cheung and Chan (2004, 2008) have shown that samplewise-adjusted procedures perform better than the more commonly adopted simple within-sample mean procedures. However, samplewise-adjusted procedures have rarely been applied in meta-analytic reviews, probably due to the lack of suitable ready-to-use programs. In this article, we compare the samplewise-adjusted procedures with existing procedures to handle dependent effect sizes, and present the samplewise-adjusted procedures in a way that will make them more accessible to researchers conducting meta-analysis. We also introduce two tools, an SPSS macro and an R script, that researchers can apply to their meta-analyses; these tools are compatible with existing meta-analysis software packages.

  10. BioPig: Developing Cloud Computing Applications for Next-Generation Sequence Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhatia, Karan; Wang, Zhong

    Next-generation sequencing is producing ever larger data sizes with a growth rate outpacing Moore's Law. The data deluge has made many of the current sequence analysis tools obsolete because they do not scale with data. Here we present BioPig, a collection of cloud computing tools to scale data analysis and management. Pig is a flexible data scripting language that uses Apache's Hadoop data structure and MapReduce framework to process very large data files in parallel and combine the results. BioPig extends Pig with sequence analysis capability. We will show the performance of BioPig on a variety of bioinformatics tasks, including screening sequence contaminants, Illumina QA/QC, and gene discovery from metagenome data sets, using the Rumen metagenome as an example.

  11. ALSSAT Development Status

    NASA Technical Reports Server (NTRS)

    Yeh, H. Y. Jannivine; Brown, Cheryl B.; Jeng, Frank F.; Anderson, Molly; Ewert, Michael K.

    2009-01-01

    The development of the Advanced Life Support (ALS) Sizing Analysis Tool (ALSSAT) using Microsoft(Registered TradeMark) Excel was initiated by the Crew and Thermal Systems Division (CTSD) of Johnson Space Center (JSC) in 1997 to support the ALS and Exploration Offices in Environmental Control and Life Support System (ECLSS) design and studies. It aids the user in performing detailed sizing of the ECLSS for different combinations of the Exploration Life Support (ELS) regenerative system technologies. This analysis tool will assist the user in performing ECLSS preliminary design and trade studies as well as system optimization efficiently and economically. The latest ALSSAT-related publication, in ICES 2004, detailed ALSSAT's development status, including the completion of all six ELS Subsystems (ELSS), namely, the Air Management Subsystem, the Biomass Subsystem, the Food Management Subsystem, the Solid Waste Management Subsystem, the Water Management Subsystem, and the Thermal Control Subsystem, and two external interfaces, the Extravehicular Activity and the Human Accommodations. Since 2004, many more regenerative technologies in the ELSS have been implemented in ALSSAT. ALSSAT has also been used for the ELS Research and Technology Development Metric Calculation for FY02 through FY06. It was also used to conduct the Lunar Outpost Metric calculation for FY08 and was integrated as part of a Habitat Model developed at Langley Research Center to support the Constellation program. This paper will give an update on the analysis tool's current development status as well as present the analytical results of one of the trade studies that was performed.

  12. Shape Variation in Aterian Tanged Tools and the Origins of Projectile Technology: A Morphometric Perspective on Stone Tool Function

    PubMed Central

    Iovita, Radu

    2011-01-01

    Background Recent findings suggest that the North African Middle Stone Age technocomplex known as the Aterian is both much older than previously assumed, and certainly associated with fossils exhibiting anatomically modern human morphology and behavior. The Aterian is defined by the presence of ‘tanged’ or ‘stemmed’ tools, which have been widely assumed to be among the earliest projectile weapon tips. The present study systematically investigates morphological variation in a large sample of Aterian tools to test the hypothesis that these tools were hafted and/or used as projectile weapons. Methodology/Principal Findings Both classical morphometrics and Elliptical Fourier Analysis of tool outlines are used to show that the shape variation in the sample exhibits size-dependent patterns consistent with a reduction of the tools from the tip down, with the tang remaining intact. Additionally, the process of reduction led to increasing side-to-side asymmetries as the tools got smaller. Finally, a comparison of shape-change trajectories between Aterian tools and Late Paleolithic arrowheads from the North German site of Stellmoor reveal significant differences in terms of the amount and location of the variation. Conclusions/Significance The patterns of size-dependent shape variation strongly support the functional hypothesis of Aterian tools as hafted knives or scrapers with alternating active edges, rather than as weapon tips. Nevertheless, the same morphological patterns are interpreted as one of the earliest evidences for a hafting modification, and for the successful combination of different raw materials (haft and stone tip) into one implement, in itself an important achievement in the evolution of hominin technologies. PMID:22216161

  13. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
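
    A minimal sketch of the two-parameter Weibull strength model that underlies this kind of probabilistic sizing is given below, together with the effect of a proof-test screen under the idealization of time-independent strength; the characteristic strength, Weibull modulus and load levels are illustrative numbers, not Astrium data or the ESA/ESTEC methodology itself.

        import math

        def weibull_pof(sigma, sigma0, m):
            """Two-parameter Weibull probability of failure at applied stress sigma (MPa)."""
            return 1.0 - math.exp(-((sigma / sigma0) ** m))

        def pof_after_proof_test(sigma, sigma_proof, sigma0, m):
            """Failure probability at sigma, conditional on having survived a proof test."""
            if sigma <= sigma_proof:
                return 0.0          # idealization: no subcritical crack growth after proof
            surv_service = 1.0 - weibull_pof(sigma, sigma0, m)
            surv_proof = 1.0 - weibull_pof(sigma_proof, sigma0, m)
            return 1.0 - surv_service / surv_proof

        sigma0, m = 300.0, 10.0     # illustrative characteristic strength and Weibull modulus
        print("P_f at 120 MPa, no proof test     :", weibull_pof(120.0, sigma0, m))
        print("P_f at 120 MPa after 150 MPa proof:", pof_after_proof_test(120.0, 150.0, sigma0, m))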

  14. Setting up a proper power spectral density (PSD) and autocorrelation analysis for material and process characterization

    NASA Astrophysics Data System (ADS)

    Rutigliani, Vito; Lorusso, Gian Francesco; De Simone, Danilo; Lazzarino, Frederic; Rispens, Gijsbert; Papavieros, George; Gogolides, Evangelos; Constantoudis, Vassilios; Mack, Chris A.

    2018-03-01

    Power spectral density (PSD) analysis is playing an increasingly critical role in the understanding of line-edge roughness (LER) and linewidth roughness (LWR) in a variety of applications across the industry. It is an essential step in obtaining an unbiased LWR estimate, as well as an extremely useful tool for process and material characterization. However, the PSD estimate can be affected by both random and systematic artifacts caused by image acquisition and measurement settings, which could irremediably alter its information content. In this paper, we report on the impact of various setting parameters (smoothing image processing filters, pixel size, and SEM noise levels) on the PSD estimate. We also discuss the use of the PSD analysis tool in a variety of cases. Looking beyond the basic roughness estimate, we use PSD and autocorrelation analysis to characterize resist blur [1] as well as low- and high-frequency roughness content, and we apply this technique to guide the EUV material stack selection. Our results clearly indicate that, if properly used, the PSD methodology is a very sensitive tool to investigate material and process variations.
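
    The sketch below illustrates, in generic Python, how a line-edge PSD can be estimated from digitized edge positions and how a 3-sigma roughness follows from the area under the PSD; the synthetic edge, pixel size and noise level are invented, and the paper's bias-removal and filtering settings are not reproduced.

        import numpy as np

        rng = np.random.default_rng(1)
        n, pixel = 2048, 1.0                                # edge points, pixel size in nm
        # Synthetic edge: correlated roughness plus white SEM noise (both hypothetical)
        edge = np.convolve(rng.normal(size=n), np.ones(8) / 8, mode="same")
        edge += 0.3 * rng.normal(size=n)
        edge -= edge.mean()

        psd = (np.abs(np.fft.rfft(edge)) ** 2) * pixel / n  # one-sided periodogram
        freq = np.fft.rfftfreq(n, d=pixel)                  # spatial frequency (1/nm)
        df = freq[1] - freq[0]

        sigma2 = 2.0 * psd[1:].sum() * df                   # Parseval: variance from the PSD
        print("3-sigma roughness estimate (nm):", round(3.0 * np.sqrt(sigma2), 2))
        print("high-frequency plateau (noise floor):", round(psd[-50:].mean(), 4))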

  15. A Feasibility Study on a Parallel Mechanism for Examining the Space Shuttle Orbiter Payload Bay Radiators

    NASA Technical Reports Server (NTRS)

    Roberts, Rodney G.; LopezdelCastillo, Eduardo

    1996-01-01

    The goal of the project was to develop the necessary analysis tools for a feasibility study of a cable-suspended robot system for examining the Space Shuttle orbiter payload bay radiators. These tools were developed to address design issues such as workspace size, tension requirements on the cables, the necessary accuracy and resolution requirements, and the stiffness and movement requirements of the system. This report describes the mathematical models for studying the inverse kinematics, statics, and stiffness of the robot. Each model is described by a matrix. The manipulator Jacobian was also related to the stiffness matrix, which characterized the stiffness of the system. Analysis tools were then developed based on the singular value decomposition (SVD) of the corresponding matrices. It was demonstrated how the SVD can be used to quantify the robot's performance and to provide insight into different design issues.
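
    The SVD-based measures mentioned above can be illustrated with a few lines of Python; the Jacobian below is a random stand-in for the cable robot's actual model, so the numbers are meaningless except as an example of the quantities (singular values, condition number, manipulability) one would extract.

        import numpy as np

        rng = np.random.default_rng(42)
        J = rng.normal(size=(6, 8))               # hypothetical 6-DOF platform driven by 8 cables

        U, s, Vt = np.linalg.svd(J)
        condition_number = s.max() / s.min()      # isotropy of force/velocity transmission
        manipulability = np.prod(s)               # volume measure of the velocity ellipsoid

        print("singular values :", np.round(s, 3))
        print("condition number:", round(condition_number, 2))
        print("manipulability  :", round(manipulability, 3))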

  16. Integrating advanced visualization technology into the planetary Geoscience workflow

    NASA Astrophysics Data System (ADS)

    Huffman, John; Forsberg, Andrew; Loomis, Andrew; Head, James; Dickson, James; Fassett, Caleb

    2011-09-01

    Recent advances in computer visualization have allowed us to develop new tools for analyzing the data gathered during planetary missions, which is important, since these data sets have grown exponentially in recent years to tens of terabytes in size. As part of the Advanced Visualization in Solar System Exploration and Research (ADVISER) project, we utilize several advanced visualization techniques created specifically with planetary image data in mind. The Geoviewer application allows real-time active stereo display of images, which in aggregate have billions of pixels. The ADVISER desktop application platform allows fast three-dimensional visualization of planetary images overlain on digital terrain models. Both applications include tools for easy data ingest and real-time analysis in a programmatic manner. Incorporation of these tools into our everyday scientific workflow has proved important for scientific analysis, discussion, and publication, and enabled effective and exciting educational activities for students from high school through graduate school.

  17. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.

  18. Prospects of photonic nanojets for precise exposure on microobjects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geints, Yu. E., E-mail: ygeints@iao.ru; Zuev Institute of Atmospheric Optics, SB Russian Academy of Sciences, Acad. Zuev Square 1, Tomsk, 634021; Panina, E. K., E-mail: pek@iao.ru

    We report on a new optical tool for precise manipulation of various microobjects. This tool is referred to as a “photonic nanojet” (PJ) and corresponds to a specific spatially localized, high-intensity area formed near micron-sized transparent spherical dielectric particles illuminated by visible laser radiation. A descriptive analysis of the morphological shapes of photonic nanojets is presented. The PJ shape characterization is based on numerical calculations of the near-field distribution according to the Mie theory and accounts for jet dimensions and shape complexity.

  19. Young Children's Computer Skills Development from Kindergarten to Third Grade

    ERIC Educational Resources Information Center

    Sackes, Mesut; Trundle, Kathy Cabe; Bell, Randy L.

    2011-01-01

    This investigation explores young children's computer skills development from kindergarten to third grade using the Early Childhood Longitudinal Study-Kindergarten (ECLS-K) dataset. The sample size of the study was 8642 children. Latent growth curve modeling analysis was used as an analytical tool to examine the development of children's computer…

  20. The application of nirvana to silvicultural studies

    Treesearch

    Chi-Leung So; Thomas Elder; Leslie Groom; John S. Kush; Jennifer Myszewski; Todd Shupe

    2006-01-01

    Previous results from this laboratory have shown that near infrared (NIR) spectroscopy, coupled with multivariate analysis, can be a powerful tool for the prediction of wood quality. While wood quality measurements are of utility, their determination can be both time and labor intensive, thus limiting their use where large sample sizes are concerned. This paper will...

  1. In-line monitoring of a pharmaceutical blending process using FT-Raman spectroscopy.

    PubMed

    Vergote, G J; De Beer, T R M; Vervaet, C; Remon, J P; Baeyens, W R G; Diericx, N; Verpoort, F

    2004-03-01

    FT-Raman spectroscopy (in combination with a fibre optic probe) was evaluated as an in-line tool to monitor a blending process of diltiazem hydrochloride pellets and paraffinic wax beads. The mean square of differences (MSD) between two consecutive spectra was used to identify the time required to obtain a homogeneous mixture. A traditional end-sampling thief probe was used to collect samples, followed by HPLC analysis to verify the Raman data. Large variations were seen in the FT-Raman spectra logged during the initial minutes of the blending process using a binary mixture (ratio: 50/50, w/w) of diltiazem pellets and paraffinic wax beads (particle size: 800-1200 microm). The MSD-profiles showed that a homogeneous mixture was obtained after about 15 min blending. HPLC analysis confirmed these observations. The Raman data showed that the mixing kinetics depended on the particle size of the material and on the mixing speed. The results of this study proved that FT-Raman spectroscopy can be successfully implemented as an in-line monitoring tool for blending processes.
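
    A minimal Python sketch of the mean square of differences (MSD) criterion described above follows; the simulated spectra, mixing time and noise level are invented, but they show the behaviour the study exploits: the MSD between consecutive spectra drops to a noise plateau once the blend stops changing.

        import numpy as np

        rng = np.random.default_rng(0)
        channels, noise_sd = 1200, 0.02
        pure_a, pure_b = rng.normal(size=channels), rng.normal(size=channels)

        def spectrum_at(t, t_mix=15.0):
            """Hypothetical blend spectrum: composition settles to 50/50 after ~t_mix minutes."""
            frac = 0.5 * min(t / t_mix, 1.0)
            return (1 - frac) * pure_a + frac * pure_b + noise_sd * rng.normal(size=channels)

        spectra = [spectrum_at(t) for t in range(0, 31)]   # one spectrum per minute
        noise_floor = 2 * noise_sd ** 2                    # expected MSD from detector noise alone
        for t in range(1, len(spectra)):
            msd = np.mean((spectra[t] - spectra[t - 1]) ** 2)
            flag = "  <- blend homogeneous" if msd < 2 * noise_floor else ""
            print(f"t = {t:2d} min, MSD = {msd:.4f}{flag}")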

  2. Examination of a high resolution laser optical plankton counter and FlowCAM for measuring plankton concentration and size

    NASA Astrophysics Data System (ADS)

    Kydd, Jocelyn; Rajakaruna, Harshana; Briski, Elizabeta; Bailey, Sarah

    2018-03-01

    Many commercial ships will soon begin to use treatment systems to manage their ballast water and reduce the global transfer of harmful aquatic organisms and pathogens in accordance with upcoming International Maritime Organization regulations. As a result, rapid and accurate automated methods will be needed to monitor compliance of ships' ballast water. We examined two automated particle counters for monitoring organisms ≥ 50 μm in minimum dimension, a High Resolution Laser Optical Plankton Counter (HR-LOPC) and a Flow Cytometer with digital imaging Microscope (FlowCAM), in comparison to traditional (manual) microscopy, considering plankton concentration, size frequency distributions and particle size measurements. The automated tools tended to underestimate particle concentration compared to standard microscopy, but gave similar results in terms of relative abundance of individual taxa. For most taxa, particle size measurements generated by the FlowCAM ABD (Area Based Diameter) were more similar to microscope measurements than those by the FlowCAM ESD (Equivalent Spherical Diameter), though there was a mismatch in size estimates for some organisms between the FlowCAM ABD and the microscope due to orientation and complex morphology. When a single problematic taxon is very abundant, the resulting size frequency distribution curves can become skewed, as was observed with Asterionella in this study. In particular, special consideration is needed when utilizing automated tools to analyse samples containing colonial species. Re-analysis of the size frequency distributions with Asterionella removed from the FlowCAM and microscope data resulted in more similar curves across methods, with FlowCAM ABD having the best fit compared to the microscope, although microscope concentration estimates were still significantly higher than estimates from the other methods. The results of our study indicate that both automated tools can generate particle size frequency distributions that could be particularly useful if correction factors can be developed for known differences in well-studied aquatic ecosystems.
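
    The gap between area-based and caliper-type size estimates for elongated particles can be reproduced with a toy example. In the Python sketch below, a synthetic elliptical mask stands in for a FlowCAM image, the area-based diameter (ABD) is the diameter of the circle of equal area, and a bounding-box diagonal is used as a rough maximum-caliper proxy; the exact FlowCAM ESD definition is not reproduced here.

        import numpy as np

        yy, xx = np.mgrid[0:200, 0:200]            # 1 um per pixel assumed
        mask = ((xx - 100) / 80.0) ** 2 + ((yy - 100) / 20.0) ** 2 <= 1.0   # 160 x 40 um ellipse

        area = mask.sum()                          # projected area in um^2
        abd = 2.0 * np.sqrt(area / np.pi)          # area-based diameter
        ys, xs = np.nonzero(mask)
        caliper = np.hypot(xs.max() - xs.min(), ys.max() - ys.min())  # rough max-caliper proxy

        print(f"area-based diameter (ABD): {abd:.1f} um")
        print(f"max caliper proxy        : {caliper:.1f} um")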

  3. Visual-haptic integration with pliers and tongs: signal “weights” take account of changes in haptic sensitivity caused by different tools

    PubMed Central

    Takahashi, Chie; Watt, Simon J.

    2014-01-01

    When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the “weight” given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different “gains” between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber's law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modeled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimizing the design of visual-haptic devices. PMID:24592245
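
    The optimal cue-combination rule tested in this study can be written in a few lines; the sketch below weights each size estimate by its inverse variance, so a tool gain that (hypothetically) doubles the haptic noise shifts the predicted weight toward vision. The sizes and noise levels are made-up numbers, not the study's data.

        def combine(visual_size, haptic_size, sigma_v, sigma_h):
            """Minimum-variance linear combination of two size estimates."""
            w_v = (1.0 / sigma_v ** 2) / (1.0 / sigma_v ** 2 + 1.0 / sigma_h ** 2)
            w_h = 1.0 - w_v
            combined = w_v * visual_size + w_h * haptic_size
            combined_sigma = (1.0 / (1.0 / sigma_v ** 2 + 1.0 / sigma_h ** 2)) ** 0.5
            return combined, w_v, combined_sigma

        # Equally reliable cues vs. a tool gain that doubles the haptic noise
        print(combine(50.0, 54.0, sigma_v=2.0, sigma_h=2.0))   # visual weight 0.5
        print(combine(50.0, 54.0, sigma_v=2.0, sigma_h=4.0))   # visual weight 0.8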

  4. DMT-TAFM: a data mining tool for technical analysis of futures market

    NASA Astrophysics Data System (ADS)

    Stepanov, Vladimir; Sathaye, Archana

    2002-03-01

    Technical analysis of financial markets describes many patterns of market behavior. For practical use, all these descriptions need to be adjusted for each particular trading session. In this paper, we develop a data mining tool for technical analysis of the futures markets (DMT-TAFM), which dynamically generates rules based on the notion of price pattern similarity. The tool consists of three main components. The first component provides visualization of data series on a chart with different ranges, scales, and chart sizes and types. The second component constructs pattern descriptions using sets of polynomials. The third component specifies the training set for mining, defines the similarity notion, and searches for a set of similar patterns. DMT-TAFM is useful for preparing the data, and then revealing and systemizing statistical information about similar patterns found in any type of historical price series. We performed experiments with our tool on three decades of trading data for a hundred types of futures. Our results for this data set show that we can prove or disprove many well-known patterns based on real data, as well as reveal new ones, and use the set of relatively consistent patterns found during data mining to develop better futures trading strategies.

  5. Video analysis of the flight of a model aircraft

    NASA Astrophysics Data System (ADS)

    Tarantino, Giovanni; Fazio, Claudio

    2011-11-01

    A video-analysis software tool has been employed in order to measure the steady-state values of the kinematics variables describing the longitudinal behaviour of a radio-controlled model aircraft during take-off, climbing and gliding. These experimental results have been compared with the theoretical steady-state configurations predicted by the phugoid model for longitudinal flight. A comparison with the parameters and performance of the full-size aircraft has also been outlined.

  6. A meta-analysis of mismatch negativity in children with attention deficit-hyperactivity disorders.

    PubMed

    Cheng, Chia-Hsiung; Chan, Pei-Ying S; Hsieh, Yu-Wei; Chen, Kuan-Fu

    2016-01-26

    Mismatch negativity (MMN) is an optimal neurophysiological signal to assess the integrity of auditory sensory memory and involuntary attention switch. The generation of MMN is independent of overt behavioral requirements, concentration or motivation, and thus serves as a suitable tool to study the perceptual function in children with attention deficit-hyperactivity disorders (ADHD). It remains unclear whether ADHD children showed altered MMN responses. Therefore we performed a meta-analysis of peer-reviewed MMN studies that had targeted both typically developed and ADHD children to examine the pooled effect size. The published articles between 1990 and 2014 were searched in PubMed, Medline, Cochrane, and CINAHL. The mean effect size and a 95% confidence interval (CI) were estimated. Six studies, consisting of 10 individual investigations, were included in the final analysis. A significant effect size of 0.28 was found (p=0.028, 95% CI at 0.03-0.53). These results were also free from publication bias or heterogeneity. In conclusion, our meta-analysis results suggest ADHD children demonstrated a reduced MMN amplitude compared to healthy controls. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
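
    For readers unfamiliar with the pooling step, the Python sketch below shows a generic inverse-variance (fixed-effect) combination of study effect sizes with a 95% confidence interval; the five (effect size, standard error) pairs are invented and are not the ten investigations analysed in the paper.

        import math

        studies = [  # (effect size, standard error) - hypothetical values
            (0.35, 0.18), (0.10, 0.22), (0.42, 0.25), (0.20, 0.15), (0.31, 0.30),
        ]

        weights = [1.0 / se ** 2 for _, se in studies]
        pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
        pooled_se = math.sqrt(1.0 / sum(weights))
        ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

        print(f"pooled effect size = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")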

  7. Analysis of the influence of the aperture size on the differences of L *a *b chromatic coordinates in a spectrocolorimeter

    NASA Astrophysics Data System (ADS)

    Medina-Marquez, J.; Balderas-Mata, S. E.; Flores, Jorge L.

    2016-09-01

    This work studies the influence of the aperture size on the measurements of the L*a*b chromatic coordinates in spectrocolorimeters, in particular the Macbeth 7000A® spectrocolorimeter with an illumination/detection geometry of d/8°. This is of importance because many industrial laboratories use this instrument. The study gives invaluable insight into the variations of the measured chromatic coordinates across the visible spectrum range for three different aperture sizes: 2.5 cm (AL), 1 cm (AM), and 0.5 cm (AS). The measurements are carried out on 13 Reference Materials (RMs), diffusers with different hues, under the following conditions: specular component included (SCI), ultraviolet component excluded (UVex), D65 illuminant, and 2° observer. The analysis and quantification of the data were done using statistical tools such as variance analysis and Mendel parameters. In this work the analysis of these measurements, as well as the methodology that quantifies the accuracy and precision of the method, i.e., repeatability and reproducibility, are presented.

  8. The objective assessment of experts' and novices' suturing skills using an image analysis program.

    PubMed

    Frischknecht, Adam C; Kasten, Steven J; Hamstra, Stanley J; Perkins, Noel C; Gillespie, R Brent; Armstrong, Thomas J; Minter, Rebecca M

    2013-02-01

    To objectively assess suturing performance using an image analysis program and to provide validity evidence for this assessment method by comparing experts' and novices' performance. In 2009, the authors used an image analysis program to extract objective variables from digital images of suturing end products obtained during a previous study involving third-year medical students (novices) and surgical faculty and residents (experts). Variables included number of stitches, stitch length, total bite size, travel, stitch orientation, total bite-size-to-travel ratio, and symmetry across the incision ratio. The authors compared all variables between groups to detect significant differences and two variables (total bite-size-to-travel ratio and symmetry across the incision ratio) to ideal values. Five experts and 15 novices participated. Experts' and novices' performances differed significantly (P < .05) with large effect sizes attributable to experience (Cohen d > 0.8) for total bite size (P = .009, d = 1.5), travel (P = .045, d = 1.1), total bite-size-to-travel ratio (P < .0001, d = 2.6), stitch orientation (P = .014,d = 1.4), and symmetry across the incision ratio (P = .022, d = 1.3). The authors found that a simple computer algorithm can extract variables from digital images of a running suture and rapidly provide quantitative summative assessment feedback. The significant differences found between groups confirm that this system can discriminate between skill levels. This image analysis program represents a viable training tool for objectively assessing trainees' suturing, a foundational skill for many medical specialties.
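
    The effect-size statistic reported above (Cohen's d with a pooled standard deviation) is easy to compute directly; the Python sketch below does so for two small groups of hypothetical scores, which are not the study's measurements.

        import statistics as stats

        def cohens_d(group1, group2):
            """Cohen's d using the pooled standard deviation of the two groups."""
            n1, n2 = len(group1), len(group2)
            s1, s2 = stats.variance(group1), stats.variance(group2)
            pooled_sd = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
            return (stats.mean(group1) - stats.mean(group2)) / pooled_sd

        experts = [0.92, 0.88, 0.95, 0.90, 0.93]          # hypothetical ratio scores
        novices = [0.70, 0.78, 0.65, 0.74, 0.69, 0.72]
        print(f"Cohen's d = {cohens_d(experts, novices):.2f}  (d > 0.8 is a large effect)")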

  9. Body size distributions signal a regime shift in a lake ecosystem

    USGS Publications Warehouse

    Spanbauer, Trisha; Allen, Craig R.; Angeler, David G.; Eason, Tarsha; Fritz, Sherilyn C.; Garmestani, Ahjond S.; Nash, Kirsty L.; Stone, Jeffery R.; Stow, Craig A.; Sundstrom, Shana M.

    2016-01-01

    Communities of organisms, from mammals to microorganisms, have discontinuous distributions of body size. This pattern of size structuring is a conservative trait of community organization and is a product of processes that occur at multiple spatial and temporal scales. In this study, we assessed whether body size patterns serve as an indicator of a threshold between alternative regimes. Over the past 7000 years, the biological communities of Foy Lake (Montana, USA) have undergone a major regime shift owing to climate change. We used a palaeoecological record of diatom communities to estimate diatom sizes, and then analysed the discontinuous distribution of organism sizes over time. We used Bayesian classification and regression tree models to determine that all time intervals exhibited aggregations of sizes separated by gaps in the distribution and found a significant change in diatom body size distributions approximately 150 years before the identified ecosystem regime shift. We suggest that discontinuity analysis is a useful addition to the suite of tools for the detection of early warning signals of regime shifts.

  10. On the repeated measures designs and sample sizes for randomized controlled trials.

    PubMed

    Tango, Toshiro

    2016-04-01

    For the analysis of longitudinal or repeated measures data, generalized linear mixed-effects models provide a flexible and powerful tool to deal with heterogeneity among subject response profiles. However, the typical statistical design adopted in usual randomized controlled trials is an analysis of covariance type analysis using a pre-defined pair of "pre-post" data, in which pre-(baseline) data are used as a covariate for adjustment together with other covariates. Then, the major design issue is to calculate the sample size or the number of subjects allocated to each treatment group. In this paper, we propose a new repeated measures design and sample size calculations combined with generalized linear mixed-effects models that depend not only on the number of subjects but on the number of repeated measures before and after randomization per subject used for the analysis. The main advantages of the proposed design combined with the generalized linear mixed-effects models are (1) it can easily handle missing data by applying the likelihood-based ignorable analyses under the missing at random assumption and (2) it may lead to a reduction in sample size, compared with the simple pre-post design. The proposed designs and the sample size calculations are illustrated with real data arising from randomized controlled trials. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Evaluating quantitative 3-D image analysis as a design tool for low enriched uranium fuel compacts for the transient reactor test facility: A preliminary study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kane, J. J.; van Rooyen, I. J.; Craft, A. E.

    In this study, 3-D image analysis, when combined with a non-destructive examination technique such as X-ray computed tomography (CT), provides a highly quantitative tool for the investigation of a material's structure. In this investigation 3-D image analysis and X-ray CT were combined to analyze the microstructure of a preliminary subsized fuel compact for the Transient Reactor Test Facility's low enriched uranium conversion program, to assess the feasibility of the combined techniques for use in the optimization of the fuel compact fabrication process. The quantitative image analysis focused on determining the size and spatial distribution of the surrogate fuel particles and the size, shape, and orientation of voids within the compact. Additionally, the maximum effect of microstructural features on heat transfer through the carbonaceous matrix of the preliminary compact was estimated. The surrogate fuel particles occupied 0.8% of the compact by volume with a log-normal distribution of particle sizes with a mean diameter of 39 μm and a standard deviation of 16 μm. Roughly 39% of the particles had a diameter greater than the specified maximum particle size of 44 μm, suggesting that the particles agglomerate during fabrication. The local volume fraction of particles also varies significantly within the compact, although uniformities appear to be evenly dispersed throughout the analysed volume. The voids produced during fabrication were on average plate-like in nature with their major axis oriented perpendicular to the compaction direction of the compact. Finally, the microstructure, mainly the large preferentially oriented voids, may cause a small degree of anisotropy in the thermal diffusivity within the compact. α∥/α⊥, the ratio of thermal diffusivities parallel and perpendicular to the compaction direction, is expected to be no less than 0.95, with an upper bound of 1.
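
    A minimal sketch of the size-distribution summary reported above is given below: it fits a log-normal to a set of particle diameters and reports the fraction above the 44 μm specification limit. The diameters are simulated to roughly match the reported mean and spread; they are not the CT data, so the printed fraction will differ from the paper's 39%.

        import numpy as np

        rng = np.random.default_rng(3)
        mean_d, sd_d = 39.0, 16.0                     # target mean and spread in um
        sigma = np.sqrt(np.log(1.0 + (sd_d / mean_d) ** 2))
        mu = np.log(mean_d) - 0.5 * sigma ** 2
        diameters = rng.lognormal(mean=mu, sigma=sigma, size=5000)

        log_d = np.log(diameters)
        print("fitted geometric mean (um):", round(float(np.exp(log_d.mean())), 1))
        print("fraction above 44 um spec :", round(float((diameters > 44.0).mean()), 3))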

  12. Evaluating quantitative 3-D image analysis as a design tool for low enriched uranium fuel compacts for the transient reactor test facility: A preliminary study

    DOE PAGES

    Kane, J. J.; van Rooyen, I. J.; Craft, A. E.; ...

    2016-02-05

    In this study, 3-D image analysis, when combined with a non-destructive examination technique such as X-ray computed tomography (CT), provides a highly quantitative tool for the investigation of a material's structure. In this investigation 3-D image analysis and X-ray CT were combined to analyze the microstructure of a preliminary subsized fuel compact for the Transient Reactor Test Facility's low enriched uranium conversion program, to assess the feasibility of the combined techniques for use in the optimization of the fuel compact fabrication process. The quantitative image analysis focused on determining the size and spatial distribution of the surrogate fuel particles and the size, shape, and orientation of voids within the compact. Additionally, the maximum effect of microstructural features on heat transfer through the carbonaceous matrix of the preliminary compact was estimated. The surrogate fuel particles occupied 0.8% of the compact by volume with a log-normal distribution of particle sizes with a mean diameter of 39 μm and a standard deviation of 16 μm. Roughly 39% of the particles had a diameter greater than the specified maximum particle size of 44 μm, suggesting that the particles agglomerate during fabrication. The local volume fraction of particles also varies significantly within the compact, although uniformities appear to be evenly dispersed throughout the analysed volume. The voids produced during fabrication were on average plate-like in nature with their major axis oriented perpendicular to the compaction direction of the compact. Finally, the microstructure, mainly the large preferentially oriented voids, may cause a small degree of anisotropy in the thermal diffusivity within the compact. α∥/α⊥, the ratio of thermal diffusivities parallel and perpendicular to the compaction direction, is expected to be no less than 0.95, with an upper bound of 1.

  13. Installation/Removal Tool for Screw-Mounted Components

    NASA Technical Reports Server (NTRS)

    Ash, J. P.

    1984-01-01

    Tweezerlike tool simplifies installation of screws in places reached only through narrow openings. With changes in size and shape, basic tool concept applicable to mounting and dismounting of transformers, sockets, terminal strips and mechanical parts. Inexpensive tool fabricated as needed by bending two pieces of steel wire. Exact size and shape selected to suit part manipulated and nature of inaccessible mounting space.

  14. An opportunity cost approach to sample size calculation in cost-effectiveness analysis.

    PubMed

    Gafni, A; Walter, S D; Birch, S; Sendi, P

    2008-01-01

    The inclusion of economic evaluations as part of clinical trials has led to concerns about the adequacy of trial sample size to support such analysis. The analytical tool of cost-effectiveness analysis is the incremental cost-effectiveness ratio (ICER), which is compared with a threshold value (lambda) as a method to determine the efficiency of a health-care intervention. Accordingly, many of the methods suggested for calculating the sample size requirements for the economic component of clinical trials are based on the properties of the ICER. However, use of the ICER and a threshold value as a basis for determining efficiency has been shown to be inconsistent with the economic concept of opportunity cost. As a result, the validity of the ICER-based approaches to sample size calculations can be challenged. Alternative methods for determining improvements in efficiency that do not depend upon ICER values have been presented in the literature. In this paper, we develop an opportunity cost approach to calculating sample size for economic evaluations alongside clinical trials, and illustrate the approach using a numerical example. We compare the sample size requirement of the opportunity cost method with that of the ICER threshold method. In general, either method may yield the larger required sample size. However, the opportunity cost approach, although simple to use, has additional data requirements. We believe that the additional data requirements represent a small price to pay for being able to perform an analysis consistent with both the concept of opportunity cost and the problem faced by decision makers. Copyright (c) 2007 John Wiley & Sons, Ltd.
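
    For context, the threshold decision rule the paper critiques can be stated in a few lines of Python; the costs, effects and lambda below are illustrative numbers only, not data from the paper or an endorsement of the rule.

        def icer(cost_new, effect_new, cost_old, effect_old):
            """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
            return (cost_new - cost_old) / (effect_new - effect_old)

        cost_new, effect_new = 14000.0, 4.2      # hypothetical new intervention
        cost_old, effect_old = 9000.0, 3.9       # hypothetical comparator
        lam = 50000.0                            # threshold value (lambda) per unit of effect

        ratio = icer(cost_new, effect_new, cost_old, effect_old)
        decision = "adopt" if ratio < lam else "reject"
        print(f"ICER = {ratio:,.0f} per unit of effect -> {decision} under the threshold rule")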

  15. Comparison Through Image Analysis Between Al Foams Produced Using Two Different Methods

    NASA Astrophysics Data System (ADS)

    Boschetto, A.; Campana, F.; Pilone, D.

    2014-02-01

    Several methods are available for making metal foams. They allow to tailor their mechanical, thermal, acoustic, and electrical properties for specific applications by varying the relative density as well as the cell size and morphology. Foams have a very heterogeneous structure so that their properties may show a large scatter. In this paper, an aluminum foam produced by means of foaming of powder compacts and another one prepared via the infiltration process were analyzed and compared. Image analysis has been used as a useful tool to determine size, morphology, and distribution of cells in both foams and to correlate cell morphology with the considered manufacturing process. The results highlighted that cell size and morphology are strictly dependent upon the manufacturing method. This paper shows how some standard 2D morphological indicators may be usefully adopted to characterize foams whose structure derives from the specific manufacturing process.

  16. Technology Combination Analysis Tool (TCAT) for Active Debris Removal

    NASA Astrophysics Data System (ADS)

    Chamot, B.; Richard, M.; Salmon, T.; Pisseloup, A.; Cougnet, C.; Axthelm, R.; Saunder, C.; Dupont, C.; Lequette, L.

    2013-08-01

    This paper presents the work of the Swiss Space Center EPFL within the CNES-funded OTV-2 study. In order to find the most performant Active Debris Removal (ADR) mission architectures and technologies, a tool was developed to design and compare ADR spacecraft and to plan ADR campaigns to remove large debris. Two types of architectures are considered to be efficient: the Chaser (a single-debris spacecraft) and the Mothership/Kits (a multiple-debris spacecraft). Both are able to perform controlled re-entry. The tool includes modules to optimise the launch dates and the order of capture, to design missions and spacecraft, and to select launch vehicles. The propulsion, power and structure subsystems are sized by the tool using high-level parametric models, whilst the other subsystems are defined by their mass and power consumption. Final results are still under investigation by the consortium, but two concrete examples of the tool's outputs are presented in the paper.

  17. LLIMAS: Revolutionizing integrating modeling and analysis at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Doyle, Keith B.; Stoeckel, Gerhard P.; Rey, Justin J.; Bury, Mark E.

    2017-08-01

    MIT Lincoln Laboratory's Integrated Modeling and Analysis Software (LLIMAS) enables the development of novel engineering solutions for advanced prototype systems through unique insights into engineering performance and interdisciplinary behavior to meet challenging size, weight, power, environmental, and performance requirements. LLIMAS is a multidisciplinary design optimization tool that wraps numerical optimization algorithms around an integrated framework of structural, thermal, optical, stray light, and computational fluid dynamics analysis capabilities. LLIMAS software is highly extensible and has developed organically across a variety of technologies including laser communications, directed energy, photometric detectors, chemical sensing, laser radar, and imaging systems. The custom software architecture leverages the capabilities of existing industry standard commercial software and supports the incorporation of internally developed tools. Recent advances in LLIMAS's Structural-Thermal-Optical Performance (STOP), aeromechanical, and aero-optical capabilities as applied to Lincoln prototypes are presented.

  18. Subvisible (2-100 μm) Particle Analysis During Biotherapeutic Drug Product Development: Part 1, Considerations and Strategy.

    PubMed

    Narhi, Linda O; Corvari, Vincent; Ripple, Dean C; Afonina, Nataliya; Cecchini, Irene; Defelippis, Michael R; Garidel, Patrick; Herre, Andrea; Koulov, Atanas V; Lubiniecki, Tony; Mahler, Hanns-Christian; Mangiagalli, Paolo; Nesta, Douglas; Perez-Ramirez, Bernardo; Polozova, Alla; Rossi, Mara; Schmidt, Roland; Simler, Robert; Singh, Satish; Spitznagel, Thomas M; Weiskopf, Andrew; Wuchner, Klaus

    2015-06-01

    Measurement and characterization of subvisible particles (defined here as those ranging in size from 2 to 100 μm), including proteinaceous and nonproteinaceous particles, is an important part of every stage of protein therapeutic development. The tools used and the ways in which the information generated is applied depends on the particular product development stage, the amount of material, and the time available for the analysis. In order to compare results across laboratories and products, it is important to harmonize nomenclature, experimental protocols, data analysis, and interpretation. In this manuscript on perspectives on subvisible particles in protein therapeutic drug products, we focus on the tools available for detection, characterization, and quantification of these species and the strategy around their application. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  19. Multidisciplinary Conceptual Design for Reduced-Emission Rotorcraft

    NASA Technical Reports Server (NTRS)

    Silva, Christopher; Johnson, Wayne; Solis, Eduardo

    2018-01-01

    Python-based wrappers for OpenMDAO are used to integrate disparate software for practical conceptual design of rotorcraft. The suite of tools which are connected thus far include aircraft sizing, comprehensive analysis, and parametric geometry. The tools are exercised to design aircraft with aggressive goals for emission reductions relative to fielded state-of-the-art rotorcraft. Several advanced reduced-emission rotorcraft are designed and analyzed, demonstrating the flexibility of the tools to consider a wide variety of potentially transformative vertical flight vehicles. To explore scale effects, aircraft have been sized for 5, 24, or 76 passengers in their design missions. Aircraft types evaluated include tiltrotor, single-main-rotor, coaxial, and side-by-side helicopters. Energy and drive systems modeled include Lithium-ion battery, hydrogen fuel cell, turboelectric hybrid, and turboshaft drive systems. Observations include the complex nature of the trade space for this simple problem, with many potential aircraft design and operational solutions for achieving significant emission reductions. Also interesting is that achieving greatly reduced emissions may not require exotic component technologies, but may be achieved with a dedicated design objective of reducing emissions.

  20. Superior model for fault tolerance computation in designing nano-sized circuit systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, N. S. S., E-mail: narinderjit@petronas.com.my; Muthuvalu, M. S., E-mail: msmuthuvalu@gmail.com; Asirvadam, V. S., E-mail: vijanth-sagayan@petronas.com.my

    2014-10-24

    As CMOS technology scales down to the nanometre regime, reliability turns out to be a decisive subject in the design methodology of nano-sized circuit systems. As a result, several computational approaches have been developed to compute and evaluate the reliability of desired nano-electronic circuits. The process of computing reliability becomes very troublesome and time consuming as the computational complexity builds up with the desired circuit size. Therefore, being able to measure reliability quickly and accurately is fast becoming necessary in designing modern logic integrated circuits. For this purpose, the paper firstly looks into the development of an automated reliability evaluation tool based on the generalization of the Probabilistic Gate Model (PGM) and Boolean Difference-based Error Calculator (BDEC) models. The Matlab-based tool allows users to significantly speed up the task of reliability analysis for a very large number of nano-electronic circuits. Secondly, using the developed automated tool, the paper presents a comparative study of reliability computation and evaluation by the PGM and BDEC models for different implementations of circuits with the same functionality. Based on the reliability analysis, BDEC gives exact and transparent reliability measures, but as the complexity of the same-functionality circuits with respect to gate error increases, the reliability measure by BDEC tends to be lower than the reliability measure by PGM. The lower reliability measure by BDEC is explained in this paper using the distribution of different signal input patterns over time for same-functionality circuits. Simulation results conclude that the reliability measure by BDEC depends not only on faulty gates but also on circuit topology, the probability of input signals being one or zero, and the probability of error on signal lines.
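
    A minimal sketch of the Probabilistic Gate Model idea referred to above, assuming a single gate whose output is inverted with probability eps and whose inputs are independently correct with given probabilities. It is not the authors' Matlab tool; real PGM implementations propagate these probabilities through entire netlists.

        from itertools import product

        def gate_output_reliability(gate_fn, correct_inputs, input_reliabilities, eps):
            """Probability that one gate's output is correct under a simple PGM.

            gate_fn             -- ideal Boolean function of the gate
            correct_inputs      -- tuple of fault-free input values (0/1)
            input_reliabilities -- probability each input carries its correct value
            eps                 -- probability the gate itself inverts its output
            """
            correct_out = gate_fn(*correct_inputs)
            p_logic_correct = 0.0
            # Enumerate every combination of correct/flipped inputs.
            for flips in product([0, 1], repeat=len(correct_inputs)):
                prob = 1.0
                actual = []
                for bit, flipped, r in zip(correct_inputs, flips, input_reliabilities):
                    prob *= (1 - r) if flipped else r
                    actual.append(bit ^ flipped)
                if gate_fn(*actual) == correct_out:
                    p_logic_correct += prob
            # The gate error flips the output with probability eps.
            return p_logic_correct * (1 - eps) + (1 - p_logic_correct) * eps

        nand = lambda a, b: 1 - (a & b)
        # Two inputs whose correct value is 1, each delivered correctly with
        # probability 0.99, and a gate error probability of 0.05.
        print(gate_output_reliability(nand, (1, 1), (0.99, 0.99), 0.05))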

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, Eric J

    The ResStock analysis tool is helping states, municipalities, utilities, and manufacturers identify which home upgrades save the most energy and money. Across the country there's a vast diversity in the age, size, construction practices, installed equipment, appliances, and resident behavior of the housing stock, not to mention the range of climates. These variations have hindered the accuracy of predicting savings for existing homes. Researchers at the National Renewable Energy Laboratory (NREL) developed ResStock. It's a versatile tool that takes a new approach to large-scale residential energy analysis by combining large public and private data sources, statistical sampling, detailed subhourly building simulations, and high-performance computing. This combination achieves unprecedented granularity and, most importantly, accuracy in modeling the diversity of the single-family housing stock.

  2. Handle grip span for optimising finger-specific force capability as a function of hand size.

    PubMed

    Lee, Soo-Jin; Kong, Yong-Ku; Lowe, Brian D; Song, Seongho

    2009-05-01

    Five grip spans (45 to 65 mm) were tested to evaluate the effects of handle grip span and user's hand size on maximum grip strength, individual finger force and subjective ratings of comfort using a computerised digital dynamometer with independent finger force sensors. Forty-six males participated and were assigned to three hand size groups (small, medium, large) according to their hand length. In general, results showed that the 55- and 50-mm grip spans were rated as the most comfortable sizes and showed the largest grip strength (433.6 N and 430.8 N, respectively), whereas the 65-mm grip span handle was rated as the least comfortable size and showed the least grip strength. With regard to the interaction effect of grip span and hand size, small- and medium-hand participants rated the 50- to 55-mm grip spans as the most preferred and the 65-mm grip span as the least preferred, whereas large-hand participants rated the 55- to 60-mm grip spans as the most preferred and the 45-mm grip span as the least preferred. Normalised grip span (NGS) ratios (29% and 27%), defined as the ratio of handle grip span to the user's hand length, were obtained and applied to suggest handle grip spans that maximise subjective comfort as well as gripping force according to the user's hand size. In the analysis of individual finger force, the middle finger showed the highest contribution (37.5%) to the total finger force, followed by the ring (28.7%), index (20.2%) and little (13.6%) fingers. In addition, each finger was observed to have a different optimal grip span for exerting maximum force, resulting in a bow-contoured handle (the grip span of the handle at the centre is larger than at the ends) for two-handle hand tools. Thus, based on this study, the grip spans of two-handle hand tools may be designed according to the users' hand/finger anthropometrics to maximise subjective ratings and performance. Results obtained in this study will provide guidelines for hand tool designers and manufacturers for designing grip spans of two-handle tools, which can maximise handle comfort and performance.
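
    A worked example of the normalised grip span ratio described above. The 27-29% ratios come from the study; the 185 mm hand length is an assumed illustrative value.

        def recommended_grip_span(hand_length_mm, ngs_ratio=0.28):
            """Grip span suggested as a fixed fraction of hand length (NGS ratio)."""
            return hand_length_mm * ngs_ratio

        # A hypothetical 185 mm hand length with the 27-29% range from the study:
        for ratio in (0.27, 0.29):
            print(f"NGS {ratio:.0%}: {recommended_grip_span(185, ratio):.0f} mm grip span")
        # -> roughly 50-54 mm, consistent with the preferred 50- to 55-mm spans above.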

  3. Combining microwave resonance technology to multivariate data analysis as a novel PAT tool to improve process understanding in fluid bed granulation.

    PubMed

    Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk

    2011-08-01

    A set of 192 fluid bed granulation batches at industrial scale was monitored in-line using microwave resonance technology (MRT) to determine the moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect was evidenced that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of a PLS that a relation between the particle size and the MRT measurements can be quantitatively defined, highlighting a potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built in during process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.

  4. New insights in morphological analysis for managing activated sludge systems.

    PubMed

    Oliveira, Pedro; Alliet, Marion; Coufort-Saudejaud, Carole; Frances, Christine

    2018-06-01

    In activated sludge (AS) process, the impact of the operational parameters on process efficiency is assumed to be correlated with the sludge properties. This study provides a better insight into these interactions by subjecting a laboratory-scale AS system to a sequence of operating condition modifications enabling typical situations of a wastewater treatment plant to be represented. Process performance was assessed and AS floc morphology (size, circularity, convexity, solidity and aspect ratio) was quantified by measuring 100,000 flocs per sample with an automated image analysis technique. Introducing 3D distributions, which combine morphological properties, allowed the identification of a filamentous bulking characterized by a floc population shift towards larger sizes and lower solidity and circularity values. Moreover, a washout phenomenon was characterized by smaller AS flocs and an increase in their solidity. Recycle ratio increase and COD:N ratio decrease both promoted a slight reduction of floc sizes and a constant evolution of circularity and convexity values. The analysis of the volume-based 3D distributions turned out to be a smart tool to combine size and shape data, allowing a deeper understanding of the dynamics of floc structure under process disturbances.

  5. One-step analysis of DNA/chitosan complexes by field-flow fractionation reveals particle size and free chitosan content.

    PubMed

    Ma, Pei Lian; Buschmann, Michael D; Winnik, Françoise M

    2010-03-08

    The composition of samples obtained upon complexation of DNA with chitosan was analyzed by asymmetrical flow field flow fractionation (AF4) with online UV-visible, multiangle light scattering (MALS), and dynamic light scattering (DLS) detectors. A chitosan labeled with rhodamine B to facilitate UV-vis detection of the polycation was complexed with DNA under conditions commonly used for transfection (chitosan glucosamine to DNA phosphate molar ratio of 5). AF4 analysis revealed that 73% of the chitosan-rhodamine remained free in the dispersion and that the DNA/chitosan complexes had a broad size distribution ranging from 20 to 160 nm in hydrodynamic radius. The accuracy of the data was assessed by comparison with data from batch-mode DLS and scanning electron microscopy. The AF4 combined with DLS allowed the characterization of small particles that were not detected by conventional batch-mode DLS. The AF4 analysis will prove to be an important tool in the field of gene therapy because it readily provides, in a single measurement, three important physicochemical parameters of the complexes: the amount of unbound polycation, the hydrodynamic size of the complexes, and their size distribution.

  6. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    PubMed

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot detection tool. The spot detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.

  7. Efficient genotype compression and analysis of large genetic variation datasets

    PubMed Central

    Layer, Ryan M.; Kindlon, Neil; Karczewski, Konrad J.; Quinlan, Aaron R.

    2015-01-01

    Genotype Query Tools (GQT) is a new indexing strategy that expedites analyses of genome variation datasets in VCF format based on sample genotypes, phenotypes and relationships. GQT’s compressed genotype index minimizes decompression for analysis, and performance relative to existing methods improves with cohort size. We show substantial (up to 443 fold) performance gains over existing methods and demonstrate GQT’s utility for exploring massive datasets involving thousands to millions of genomes. PMID:26550772
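
    A conceptual sketch of the genotype-centric indexing idea behind tools like GQT: one carrier bitmap per variant, packed to bits so whole cohorts can be scanned with bitwise operations. The toy matrix and packing scheme below are illustrative only and do not reproduce GQT's actual index format or compression.

        import numpy as np

        # Genotypes coded as the count of alternate alleles (0, 1, 2),
        # for 4 variants x 6 samples (toy data).
        genotypes = np.array([
            [0, 1, 0, 2, 0, 1],
            [0, 0, 0, 0, 0, 0],
            [1, 1, 2, 1, 0, 0],
            [0, 0, 0, 1, 0, 0],
        ], dtype=np.uint8)

        # One boolean bitmap of carriers per variant, packed into bytes.
        carriers = genotypes > 0
        packed = np.packbits(carriers, axis=1)
        counts = np.unpackbits(packed, axis=1, count=carriers.shape[1]).sum(axis=1)
        print(counts)   # number of carrier samples per variant: [3 0 4 1]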

  8. Multigrid Techniques for Highly Indefinite Equations

    NASA Technical Reports Server (NTRS)

    Shapira, Yair

    1996-01-01

    A multigrid method for the solution of finite difference approximations of elliptic PDE's is introduced. A parallelizable version of it, suitable for two and multi level analysis, is also defined, and serves as a theoretical tool for deriving a suitable implementation for the main version. For indefinite Helmholtz equations, this analysis provides a suitable mesh size for the coarsest grid used. Numerical experiments show that the method is applicable to diffusion equations with discontinuous coefficients and highly indefinite Helmholtz equations.

  9. Filtering Essays by Means of a Software Tool: Identifying Poor Essays

    ERIC Educational Resources Information Center

    Seifried, Eva; Lenhard, Wolfgang; Spinath, Birgit

    2017-01-01

    Writing essays and receiving feedback can be useful for fostering students' learning and motivation. When faced with large class sizes, it is desirable to identify students who might particularly benefit from feedback. In this article, we tested the potential of Latent Semantic Analysis (LSA) for identifying poor essays. A total of 14 teaching…

  10. Designing a Qualitative Data Collection Strategy (QDCS) for Africa - Phase 1: A Gap Analysis of Existing Models, Simulations, and Tools Relating to Africa

    DTIC Science & Technology

    2012-06-01

    generalized behavioral model characterized after the fictional Seldon equations (the one elaborated upon by Isaac Asimov in the 1951 novel, The...Foundation). Asimov described the Seldon equations as essentially statistical models with historical data of a sufficient size and variability that they

  11. Evaluating Classified MODIS Satellite Imagery as a Stratification Tool

    Treesearch

    Greg C. Liknes; Mark D. Nelson; Ronald E. McRoberts

    2004-01-01

    The Forest Inventory and Analysis (FIA) program of the USDA Forest Service collects forest attribute data on permanent plots arranged on a hexagonal network across all 50 states and Puerto Rico. Due to budget constraints, sample sizes sufficient to satisfy national FIA precision standards are seldom achieved for most inventory variables unless the estimation process is...

  12. Estimation of portion size in children's dietary assessment: lessons learnt.

    PubMed

    Foster, E; Adamson, A J; Anderson, A S; Barton, K L; Wrieden, W L

    2009-02-01

    Assessing the dietary intake of young children is challenging. In any 1 day, children may have several carers responsible for providing them with their dietary requirements, and once children reach school age, traditional methods such as weighing all items consumed become impractical. As an alternative to weighed records, food portion size assessment tools are available to assist subjects in estimating the amounts of foods consumed. Existing food photographs designed for use with adults and based on adult portion sizes have been found to be inappropriate for use with children. This article presents a review and summary of a body of work carried out to improve the estimation of portion sizes consumed by children. Feasibility work was undertaken to determine the accuracy and precision of three portion size assessment tools; food photographs, food models and a computer-based Interactive Portion Size Assessment System (IPSAS). These tools were based on portion sizes served to children during the National Diet and Nutrition Survey. As children often do not consume all of the food served to them, smaller portions were included in each tool for estimation of leftovers. The tools covered 22 foods, which children commonly consume. Children were served known amounts of each food and leftovers were recorded. They were then asked to estimate both the amount of food that they were served and the amount of any food leftover. Children were found to estimate food portion size with an accuracy approaching that of adults using both the food photographs and IPSAS. Further development is underway to increase the number of food photographs and to develop IPSAS to cover a much wider range of foods and to validate the use of these tools in a 'real life' setting.

  13. A Hybrid Parallel Strategy Based on String Graph Theory to Improve De Novo DNA Assembly on the TianHe-2 Supercomputer.

    PubMed

    Zhang, Feng; Liao, Xiangke; Peng, Shaoliang; Cui, Yingbo; Wang, Bingqiang; Zhu, Xiaoqian; Liu, Jie

    2016-06-01

    The de novo assembly of DNA sequences is increasingly important for biological research in the genomic era. More than one decade after the Human Genome Project, some challenges still exist and new solutions are being explored to improve the de novo assembly of genomes. The string graph assembler (SGA), based on string graph theory, is a new method/tool developed to address these challenges. In this paper, based on an in-depth analysis of SGA, we prove that SGA-based sequence de novo assembly is an NP-complete problem. According to our analysis, SGA outperforms other similar methods/tools in memory consumption, but costs much more time, of which 60-70% is spent on index construction. Upon this analysis, we introduce a hybrid parallel optimization algorithm and implement this algorithm in the TianHe-2's parallel framework. Simulations are performed with different datasets. For data of small size the optimized solution is 3.06 times faster than before, and for data of medium size it is 1.60 times faster. The results demonstrate an evident performance improvement, with linear scalability for parallel FM-index construction. These results thus contribute significantly to improving the efficiency of de novo assembly of DNA sequences.

  14. Experimental evaluation of tool run-out in micro milling

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta

    2018-05-01

    This paper deals with the micro milling cutting process, focusing attention on tool run-out measurement. In fact, among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research is aimed at developing an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges. The cutting edge phase measurement is based on force signal analysis. The developed procedure has been tested on data coming from micro milling experimental tests performed on a Ti6Al4V sample. The results showed that the developed procedure can be successfully used for tool run-out estimation.

  15. Metamodeling as a tool to size vegetative filter strips for surface runoff pollution control in European watersheds.

    NASA Astrophysics Data System (ADS)

    Lauvernet, Claire; Muñoz-Carpena, Rafael; Carluer, Nadia

    2015-04-01

    In Europe, a significant presence of contaminants is found in surface water, partly due to pesticide applications. Vegetative filter strips or buffer zones (VFS), often located along rivers, are a common best management practice (BMP) to reduce non-point source pollution of water by reducing surface runoff. However, they need to be adapted to the agro-ecological and climatic conditions, both in terms of position and size, in order to be efficient. The TOPPS-PROWADIS project involves European experts and stakeholders to develop and recommend BMPs to reduce pesticide transfer by drift or runoff in several European countries. In this context, IRSTEA developed a guide accompanying the use of different tools, which allows designing site-specific VFS by simulating their efficiency to limit transfers using the mechanistic model VFSMOD. This method, while very complete, assumes that the user provides detailed field knowledge and data, which are not always easily available. The aim of this study is to assist buffer sizing with a single tool using a reduced set of parameters, adapted to the information available to end-users. To compensate for the lack of real data in many practical applications, a set of virtual scenarios was selected to encompass a large range of agro-pedo-climatic conditions in Europe, considering both the upslope agricultural field and the VFS characteristics. As a first step, in this work we present scenarios based on the climate of north-western France, consisting of different rainfall intensities and durations, hillslope lengths and slopes, humidity conditions, a large set of field rainfall/runoff characteristics for the contributing area, and several shallow water table depths and soil types for the VFS. The sizing method based on the mechanistic model VFSMOD was applied to all these scenarios, and a global sensitivity analysis (GSA) of the VFS optimal length was performed for all the input parameters in order to understand their influence and interactions, and to set priorities for data collection and management. Based on the GSA results, we compared several mathematical methods to compute the metamodel, and then validated it on an agricultural watershed with real data in the north-west of France. The analysis procedure allows for a robust and validated metamodel, before extending it to other climatic conditions in order to make application to a large range of European watersheds possible. The tool will allow comparison of field scenarios and help validate and improve existing VFS placements and sizing.
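
    A minimal sketch of the global sensitivity analysis step described above, assuming the SALib package and a placeholder function standing in for the VFSMOD-based optimal-length computation; the variable names, bounds and model form are illustrative, not those of the study.

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        # Three illustrative inputs only; the study varied many more agro-pedo-climatic factors.
        problem = {
            "num_vars": 3,
            "names": ["rain_intensity", "hillslope_length", "water_table_depth"],
            "bounds": [[5.0, 60.0], [50.0, 400.0], [0.5, 3.0]],
        }

        def placeholder_vfs_length(x):
            """Stand-in for the VFSMOD-based optimal filter-strip length (not the real model)."""
            rain, slope_len, wt_depth = x
            return 0.05 * rain + 0.02 * slope_len + 4.0 / wt_depth + 0.001 * rain * slope_len

        X = saltelli.sample(problem, 1024)               # Saltelli sampling design
        Y = np.array([placeholder_vfs_length(x) for x in X])
        Si = sobol.analyze(problem, Y)                   # first-order and total Sobol indices
        print(dict(zip(problem["names"], np.round(Si["S1"], 2))))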

  16. Analysis of morphological variability and heritability in the head of the Argentine Black and White Tegu (Salvator merianae): undisturbed vs. disturbed environments.

    PubMed

    Imhoff, Carolina; Giri, Federico; Siroski, Pablo; Amavet, Patricia

    2018-04-01

    The heterogeneity of biotic and abiotic factors influencing fitness produces selective pressures that promote local adaptation and divergence among different populations of the same species. In order for adaptations to be maintained through evolutionary time, heritable genetic variation controlling the expression of the morphological features under selection is necessary. Here we compare the morphological shape variability and size of the cephalic region of Salvator merianae specimens from undisturbed environments to those of individuals from disturbed environments, and estimate heritability for shape and size using geometric morphometric and quantitative genetics tools. The results of these analyses indicated that there are statistically significant differences in shape and size between populations from the two environments. Possibly, one of the main determinants of cephalic shape and size is adaptation to the characteristics of the environment and to the trophic niche. Individuals from disturbed environments have a cephalic region with less shape variation and also have a larger centroid size when compared to individuals from undisturbed environments. The high heritability values obtained for shape and size in dorsal view and right side view indicate that these phenotypic characters have a great capacity to respond to the selection pressures to which they are subjected. The data obtained here could be used as an important tool when establishing guidelines for plans for the sustainable use and conservation of S. merianae and other species living in disturbed areas. Copyright © 2018 Elsevier GmbH. All rights reserved.

  17. Detection of copy number variations in epilepsy using exome data.

    PubMed

    Tsuchida, N; Nakashima, M; Kato, M; Heyman, E; Inui, T; Haginoya, K; Watanabe, S; Chiyonobu, T; Morimoto, M; Ohta, M; Kumakura, A; Kubota, M; Kumagai, Y; Hamano, S-I; Lourenco, C M; Yahaya, N A; Ch'ng, G-S; Ngu, L-H; Fattal-Valevski, A; Weisz Hubshman, M; Orenstein, N; Marom, D; Cohen, L; Goldberg-Stern, H; Uchiyama, Y; Imagawa, E; Mizuguchi, T; Takata, A; Miyake, N; Nakajima, H; Saitsu, H; Miyatake, S; Matsumoto, N

    2018-03-01

    Epilepsies are common neurological disorders and genetic factors contribute to their pathogenesis. Copy number variations (CNVs) are increasingly recognized as an important etiology of many human diseases including epilepsy. Whole-exome sequencing (WES) is becoming a standard tool for detecting pathogenic mutations and has recently been applied to detecting CNVs. Here, we analyzed 294 families with epilepsy using WES, and focused on 168 families with no causative single nucleotide variants in known epilepsy-associated genes to further validate CNVs using two different CNV detection tools applied to the WES data. We confirmed 18 pathogenic CNVs, and 2 deletions and 2 duplications at chr15q11.2 of clinically unknown significance. Of note, we were able to identify small CNVs less than 10 kb in size, which might be difficult to detect by conventional microarray. We revealed 2 cases with pathogenic CNVs that one of the 2 CNV detection tools failed to find, suggesting that using different CNV tools is recommended to increase the diagnostic yield. Considering the relatively high discovery rate of CNVs (18 out of 168 families, 10.7%) and the successful detection of CNVs <10 kb in size, CNV detection by WES may be able to substitute for, or at least complement, conventional microarray analysis. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  18. Integrated Microfluidic System for Size-Based Selection and Trapping of Giant Vesicles.

    PubMed

    Kazayama, Yuki; Teshima, Tetsuhiko; Osaki, Toshihisa; Takeuchi, Shoji; Toyota, Taro

    2016-01-19

    Vesicles composed of phospholipids (liposomes) have attracted interest as artificial cell models and have been widely studied to explore lipid-lipid and lipid-protein interactions. However, the size dispersity of liposomes prepared by conventional methods was a major problem that inhibited their use in high-throughput analyses based on monodisperse liposomes. In this study, we developed an integrative microfluidic device that enables both the size-based selection and trapping of liposomes. This device consists of hydrodynamic selection and trapping channels in series, which made it possible to successfully produce an array of more than 60 monodisperse liposomes from a polydisperse liposome suspension with a narrow size distribution (the coefficient of variation was less than 12%). Using this device, we successfully observed a size-dependent response of the liposomes to sequential osmotic stimuli, which had not been clarified so far. Our device will be a powerful tool to facilitate the statistical analysis of liposome dynamics.

  19. Computational Analysis of Material Flow During Friction Stir Welding of AA5059 Aluminum Alloys

    NASA Astrophysics Data System (ADS)

    Grujicic, M.; Arakere, G.; Pandurangan, B.; Ochterbeck, J. M.; Yen, C.-F.; Cheeseman, B. A.; Reynolds, A. P.; Sutton, M. A.

    2012-09-01

    Workpiece material flow and stirring/mixing during the friction stir welding (FSW) process are investigated computationally. Within the numerical model of the FSW process, the FSW tool is treated as a Lagrangian component while the workpiece material is treated as an Eulerian component. The employed coupled Eulerian/Lagrangian computational analysis of the welding process was of a two-way thermo-mechanical character (i.e., frictional-sliding/plastic-work dissipation is taken to act as a heat source in the thermal-energy balance equation) while temperature is allowed to affect mechanical aspects of the model through temperature-dependent material properties. The workpiece material (AA5059, a solid-solution strengthened and strain-hardened aluminum alloy) is represented using a modified version of the classical Johnson-Cook model (within which the strain-hardening term is augmented to take into account the effect of dynamic recrystallization) while the FSW tool material (AISI H13 tool steel) is modeled as an isotropic linear-elastic material. Within the analysis, the effects of some of the key FSW process parameters are investigated (e.g., weld pitch, tool tilt-angle, and the tool pin-size). The results pertaining to the material flow during FSW are compared with their experimental counterparts. It is found that, for the most part, the experimentally observed material-flow characteristics are reproduced within the current FSW-process model.

  20. An Approximate Ablative Thermal Protection System Sizing Tool for Entry System Design

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Braun, Robert D.

    2005-01-01

    A computer tool to perform entry vehicle ablative thermal protection systems sizing has been developed. Two options for calculating the thermal response are incorporated into the tool. One, an industry-standard, high-fidelity ablation and thermal response program was integrated into the tool, making use of simulated trajectory data to calculate its boundary conditions at the ablating surface. Second, an approximate method that uses heat of ablation data to estimate heat shield recession during entry has been coupled to a one-dimensional finite-difference calculation that calculates the in-depth thermal response. The in-depth solution accounts for material decomposition, but does not account for pyrolysis gas energy absorption through the material. Engineering correlations are used to estimate stagnation point convective and radiative heating as a function of time. The sizing tool calculates recovery enthalpy, wall enthalpy, surface pressure, and heat transfer coefficient. Verification of this tool is performed by comparison to past thermal protection system sizings for the Mars Pathfinder and Stardust entry systems and calculations are performed for an Apollo capsule entering the atmosphere at lunar and Mars return speeds.
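
    A heavily simplified sketch of the approximate sizing option described above: explicit one-dimensional in-depth conduction combined with a heat-of-ablation estimate of surface recession. The material properties, heat pulse and numerical settings are illustrative assumptions, not Pathfinder or Stardust values, and recession is tracked separately from the fixed grid for brevity.

        import numpy as np

        # Illustrative material and heating values only.
        rho, cp, k = 280.0, 1500.0, 0.4          # density [kg/m3], specific heat, conductivity
        q_star     = 25.0e6                      # effective heat of ablation [J/kg]
        eps, sigma = 0.85, 5.670e-8              # surface emissivity, Stefan-Boltzmann constant
        q_dot = lambda t: 5.0e5 * np.exp(-((t - 30.0) / 15.0) ** 2)   # heat pulse [W/m2]

        L, n = 0.05, 50                          # virgin thickness [m], number of nodes
        dx   = L / (n - 1)
        dt   = 0.1 * dx**2 * rho * cp / k        # conservative explicit time step
        T    = np.full(n, 300.0)                 # initial temperature [K]
        recession, t = 0.0, 0.0

        while t < 120.0:
            q = q_dot(t)
            recession += q / (rho * q_star) * dt          # heat-of-ablation recession estimate
            Tn = T.copy()
            # Explicit in-depth conduction: heated, reradiating front face; adiabatic back face.
            Tn[1:-1] = T[1:-1] + k * dt / (rho * cp * dx**2) * (T[2:] - 2*T[1:-1] + T[:-2])
            Tn[0] = T[0] + dt / (rho * cp * dx) * (q - eps*sigma*T[0]**4 - k*(T[0] - T[1]) / dx)
            Tn[-1] = Tn[-2]
            T, t = Tn, t + dt

        print(f"estimated recession: {recession*1e3:.2f} mm, back-face T: {T[-1]:.1f} K")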

  1. An Approximate Ablative Thermal Protection System Sizing Tool for Entry System Design

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Braun, Robert D.

    2006-01-01

    A computer tool to perform entry vehicle ablative thermal protection systems sizing has been developed. Two options for calculating the thermal response are incorporated into the tool. One, an industry-standard, high-fidelity ablation and thermal response program was integrated into the tool, making use of simulated trajectory data to calculate its boundary conditions at the ablating surface. Second, an approximate method that uses heat of ablation data to estimate heat shield recession during entry has been coupled to a one-dimensional finite-difference calculation that calculates the in-depth thermal response. The in-depth solution accounts for material decomposition, but does not account for pyrolysis gas energy absorption through the material. Engineering correlations are used to estimate stagnation point convective and radiative heating as a function of time. The sizing tool calculates recovery enthalpy, wall enthalpy, surface pressure, and heat transfer coefficient. Verification of this tool is performed by comparison to past thermal protection system sizings for the Mars Pathfinder and Stardust entry systems and calculations are performed for an Apollo capsule entering the atmosphere at lunar and Mars return speeds.

  2. A guide to understanding meta-analysis.

    PubMed

    Israel, Heidi; Richter, Randy R

    2011-07-01

    With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of the components of the technique. We describe what meta-analysis is; what heterogeneity is and how it affects meta-analysis; effect size; the modeling techniques of meta-analysis; and the strengths and weaknesses of meta-analysis. Also included are common components such as forest plot interpretation and the software that may be used, special cases for meta-analysis such as subgroup analysis, individual patient data, and meta-regression, and a discussion of criticisms.
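
    A minimal sketch of the core calculations behind a meta-analysis: inverse-variance pooling under fixed- and random-effects (DerSimonian-Laird) models, with Cochran's Q and I² as heterogeneity measures. The five effect sizes and variances are hypothetical.

        import numpy as np

        def meta_analysis(effects, variances):
            """Inverse-variance pooling with fixed- and random-effects (DerSimonian-Laird) models."""
            y, v = np.asarray(effects, float), np.asarray(variances, float)
            w = 1.0 / v
            fixed = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - fixed) ** 2)                     # Cochran's Q
            df = len(y) - 1
            i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # I^2 heterogeneity (%)
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - df) / c)                        # between-study variance
            w_re = 1.0 / (v + tau2)
            random = np.sum(w_re * y) / np.sum(w_re)
            return fixed, random, q, i2, tau2

        # Hypothetical standardized mean differences and their variances from five trials.
        print(meta_analysis([0.30, 0.55, 0.10, 0.42, 0.25], [0.02, 0.05, 0.03, 0.04, 0.02]))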

  3. Databases and Web Tools for Cancer Genomics Study

    PubMed Central

    Yang, Yadong; Dong, Xunong; Xie, Bingbing; Ding, Nan; Chen, Juan; Li, Yongjun; Zhang, Qian; Qu, Hongzhu; Fang, Xiangdong

    2015-01-01

    Publicly-accessible resources have promoted the advance of scientific discovery. The era of genomics and big data has brought the need for collaboration and data sharing in order to make effective use of this new knowledge. Here, we describe the web resources for cancer genomics research and rate them on the basis of the diversity of cancer types, sample size, omics data comprehensiveness, and user experience. The resources reviewed include data repository and analysis tools; and we hope such introduction will promote the awareness and facilitate the usage of these resources in the cancer research community. PMID:25707591

  4. Optimization of an Advanced Hybrid Wing Body Concept Using HCDstruct Version 1.2

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Gern, Frank H.

    2016-01-01

    Hybrid Wing Body (HWB) aircraft concepts continue to be promising candidates for achieving the simultaneous fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project. In order to evaluate the projected benefits, improvements in structural analysis at the conceptual design level were necessary; thus, NASA researchers developed the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) tool to perform aeroservoelastic structural optimizations of advanced HWB concepts. In this paper, the authors present substantial updates to the HCDstruct tool and related analysis, including: the addition of four inboard and eight outboard control surfaces and two all-movable tail/rudder assemblies, providing a full aeroservoelastic analysis capability; the implementation of asymmetric load cases for structural sizing applications; and a methodology for minimizing control surface actuation power using NASTRAN SOL 200 and HCDstruct's aeroservoelastic finite-element model (FEM).

  5. Preliminary design methods for fiber reinforced composite structures employing a personal computer

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1986-01-01

    The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
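
    A minimal sketch of the classical lamination theory step described above: the reduced ply stiffness is rotated to the laminate axes, summed into the in-plane stiffness matrix A, and inverted to give an effective modulus for a symmetric laminate. The ply properties and layup are illustrative assumptions, and the sketch ignores bending-extension coupling, which vanishes for symmetric laminates.

        import numpy as np

        def q_matrix(e1, e2, nu12, g12):
            """Reduced stiffness matrix of a unidirectional ply (plane stress)."""
            nu21 = nu12 * e2 / e1
            d = 1 - nu12 * nu21
            return np.array([[e1 / d, nu12 * e2 / d, 0],
                             [nu12 * e2 / d, e2 / d, 0],
                             [0, 0, g12]])

        def q_bar(q, theta_deg):
            """Transform the ply stiffness to the laminate axes (Reuter-corrected form)."""
            t = np.radians(theta_deg)
            c, s = np.cos(t), np.sin(t)
            T = np.array([[c*c, s*s, 2*c*s],
                          [s*s, c*c, -2*c*s],
                          [-c*s, c*s, c*c - s*s]])
            R = np.diag([1.0, 1.0, 2.0])
            return np.linalg.inv(T) @ q @ R @ T @ np.linalg.inv(R)

        # Illustrative carbon/epoxy ply properties (GPa) and a symmetric [0/45/-45/90]s layup.
        q = q_matrix(140.0, 10.0, 0.3, 5.0)
        angles, t_ply = [0, 45, -45, 90, 90, -45, 45, 0], 0.125e-3
        A = sum(q_bar(q, a) * t_ply for a in angles)       # in-plane stiffness matrix [GPa*m]
        h = t_ply * len(angles)
        Ex = 1.0 / (h * np.linalg.inv(A)[0, 0])            # effective laminate modulus in x
        print(f"Effective Ex ≈ {Ex:.1f} GPa")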

  6. Initial sequencing and comparative analysis of the mouse genome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waterston, Robert H.; Lindblad-Toh, Kerstin; Birney, Ewan

    2002-12-15

    The sequence of the mouse genome is a key informational tool for understanding the contents of the human genome and a key experimental tool for biomedical research. Here, we report the results of an international collaboration to produce a high-quality draft sequence of the mouse genome. We also present an initial comparative analysis of the mouse and human genomes, describing some of the insights that can be gleaned from the two sequences. We discuss topics including the analysis of the evolutionary forces shaping the size, structure and sequence of the genomes; the conservation of large-scale synteny across most of the genomes; the much lower extent of sequence orthology covering less than half of the genomes; the proportions of the genomes under selection; the number of protein-coding genes; the expansion of gene families related to reproduction and immunity; the evolution of proteins; and the identification of intraspecies polymorphism.

  7. Grain size statistics and depositional pattern of the Ecca Group sandstones, Karoo Supergroup in the Eastern Cape Province, South Africa

    NASA Astrophysics Data System (ADS)

    Baiyegunhi, Christopher; Liu, Kuiwu; Gwavava, Oswald

    2017-11-01

    Grain size analysis is a vital sedimentological tool used to unravel the hydrodynamic conditions, mode of transportation and deposition of detrital sediments. In this study, detailed grain-size analysis was carried out on thirty-five sandstone samples from the Ecca Group in the Eastern Cape Province of South Africa. Grain-size statistical parameters, bivariate analysis, linear discriminant functions, Passega diagrams and log-probability curves were used to reveal the depositional processes, sedimentation mechanisms and hydrodynamic energy conditions, and to discriminate different depositional environments. The grain-size parameters show that most of the sandstones are very fine to fine grained, moderately well sorted, mostly near-symmetrical and mesokurtic in nature. The abundance of very fine to fine grained sandstones indicates the dominance of a low-energy environment. The bivariate plots show that the samples are mostly grouped, except for the Prince Albert samples, which show a scattered trend due either to a mixture of two modes in equal proportion in bimodal sediments or to good sorting in unimodal sediments. The linear discriminant function analysis is dominantly indicative of turbidity current deposits under shallow marine environments for samples from the Prince Albert, Collingham and Ripon Formations, while the samples from the Fort Brown Formation are lacustrine or deltaic deposits. The C-M plots indicated that the sediments were deposited mainly by suspension and saltation, and graded suspension. Visher diagrams show that saltation is the major process of transportation, followed by suspension.
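
    A minimal sketch of the graphic grain-size statistics (Folk and Ward, 1957) commonly used in studies like this one; the sieve data are hypothetical, and the percentiles are interpolated from the cumulative curve in phi units.

        import numpy as np

        def folk_ward(phi_sizes, weights):
            """Graphic mean, sorting, skewness and kurtosis (Folk & Ward 1957) from a
            grain-size distribution given in phi units with weight-percent per class."""
            order = np.argsort(phi_sizes)
            phi = np.asarray(phi_sizes, float)[order]
            cum = np.cumsum(np.asarray(weights, float)[order])
            cum = 100.0 * cum / cum[-1]
            p = lambda pct: np.interp(pct, cum, phi)        # phi value at a cumulative percent
            p5, p16, p25, p50, p75, p84, p95 = map(p, (5, 16, 25, 50, 75, 84, 95))
            mean = (p16 + p50 + p84) / 3
            sorting = (p84 - p16) / 4 + (p95 - p5) / 6.6
            skew = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
                    + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
            kurt = (p95 - p5) / (2.44 * (p75 - p25))
            return mean, sorting, skew, kurt

        # Hypothetical sieve data: phi class midpoints and weight-% retained.
        print(folk_ward([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0],
                        [2.0, 8.0, 20.0, 30.0, 25.0, 10.0, 5.0]))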

  8. The application of simulation modeling to the cost and performance ranking of solar thermal power plants

    NASA Technical Reports Server (NTRS)

    Rosenberg, L. S.; Revere, W. R.; Selcuk, M. K.

    1981-01-01

    Small solar thermal power systems (up to 10 MWe in size) were tested. The solar thermal power plant ranking study was performed to aid in experiment activity and support decisions for the selection of the most appropriate technological approach. The cost and performance were determined for insolation conditions by utilizing the Solar Energy Simulation computer code (SESII). This model optimizes the size of the collector field and energy storage subsystem for given engine generator and energy transport characteristics. The development of the simulation tool, its operation, and the results achieved from the analysis are discussed.

  9. Approximation Model Building for Reliability & Maintainability Characteristics of Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.; Brown, Richard W.

    2000-01-01

    This paper describes the development of parametric models for estimating operational reliability and maintainability (R&M) characteristics for reusable vehicle concepts, based on vehicle size and technology support level. A R&M analysis tool (RMAT) and response surface methods are utilized to build parametric approximation models for rapidly estimating operational R&M characteristics such as mission completion reliability. These models, which approximate RMAT, can then be utilized for fast analysis of operational requirements, for lifecycle cost estimating, and for multidisciplinary design optimization.
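
    A minimal sketch of the response-surface idea described above: a quadratic polynomial surrogate fitted by least squares to a handful of runs of the detailed tool. The design variables, run data and model form are illustrative assumptions, not RMAT output.

        import numpy as np
        from itertools import combinations_with_replacement

        def quadratic_design_matrix(X):
            """Full quadratic response-surface terms: 1, x_i, x_i*x_j."""
            cols = [np.ones(len(X))]
            for i in range(X.shape[1]):
                cols.append(X[:, i])
            for i, j in combinations_with_replacement(range(X.shape[1]), 2):
                cols.append(X[:, i] * X[:, j])
            return np.column_stack(cols)

        # Hypothetical runs: vehicle dry mass [t] and technology support level [0-1].
        X = np.array([[40, 0.2], [40, 0.8], [60, 0.2], [60, 0.8], [50, 0.5],
                      [45, 0.35], [55, 0.65], [50, 0.2], [50, 0.8]], float)
        # Hypothetical mission-completion reliability from the detailed R&M tool.
        y = np.array([0.975, 0.988, 0.962, 0.981, 0.978, 0.977, 0.981, 0.970, 0.985])

        A = quadratic_design_matrix(X)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)        # least-squares surrogate fit
        predict = lambda x: quadratic_design_matrix(np.atleast_2d(x)) @ coef
        print(f"Predicted reliability at (52 t, 0.6): {predict([52, 0.6])[0]:.4f}")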

  10. A tool to measure whether business management capacity in general practice impacts on the quality of chronic illness care.

    PubMed

    Holton, Christine H; Proudfoot, Judith G; Jayasinghe, Upali W; Grimm, Jane; Bubner, Tanya K; Winstanley, Julie; Harris, Mark F; Beilby, Justin J

    2010-11-01

    Our aim was to develop a tool to identify specific features of the business and financial management of practices that facilitate better quality care for chronic illness in primary care. Domains of management were identified, resulting in the development of a structured interview tool that was administered in 97 primary care practices in Australia. Interview items were screened and subjected to factor analysis, subscales identified and the overall model fit determined. The instrument's validity was assessed against another measure of quality of care. Analysis provided a four-factor solution containing 21 items, which explained 42.5% of the variance in the total scores. The factors related to administrative processes, human resources, marketing analysis and business development. All scores increased significantly with practice size. The business development subscale and total score were higher for rural practices. There was a significant correlation between the business development subscale and quality of care. The indicators of business and financial management in the final tool appear to be useful predictors of the quality of care. The instrument may help inform policy regarding the structure of general practice and implementation of a systems approach to chronic illness care. It can provide information to practices about areas for further development.

  11. Exploratory Study of Web-Based Planning and Mobile Text Reminders in an Overweight Population

    PubMed Central

    Murray, Peter; Cobain, Mark; Chinapaw, Mai; van Mechelen, Willem; Hurling, Robert

    2011-01-01

    Background: Forming specific health plans can help translate good intentions into action. Mobile text reminders can further enhance the effects of planning on behavior. Objective: Our aim was to explore the combined impact of a Web-based, fully automated planning tool and mobile text reminders on intention to change saturated fat intake, self-reported saturated fat intake, and portion size changes over 4 weeks. Methods: Of 1013 men and women recruited online, 858 were randomly allocated to 1 of 3 conditions: a planning tool (PT), combined planning tool and text reminders (PTT), and a control group. All outcome measures were assessed by online self-reports. Analysis of covariance was used to analyze the data. Results: Participants allocated to the PT (mean saturated fat 3.6, mean coping planning 3) and PTT (mean saturated fat 3.5, mean coping planning 3.1) groups reported a lower consumption of high-fat foods (F2,571 = 4.74, P = .009) and higher levels of coping planning (F2,571 = 7.22, P < .001) than the control group (mean saturated fat 3.9, mean coping planning 2.8). Participants in the PTT condition also reported smaller portion sizes of high-fat foods (mean 2.8; F2,569 = 4.12, P = .0) than the control group (mean portions 3.1). The reduction in portion size was driven primarily by the male participants in the PTT group (P = .003). We found no significant group differences in terms of percentage saturated fat intake, intentions, action planning, self-efficacy, or feedback on the intervention. Conclusions: These findings support the use of Web-based tools and mobile technologies to change dietary behavior. The combination of a fully automated Web-based planning tool with mobile text reminders led to lower self-reported consumption of high-fat foods and greater reductions in portion sizes than in a control condition. Trial Registration: International Standard Randomized Controlled Trial Number (ISRCTN): 61819220; http://www.controlled-trials.com/ISRCTN61819220 (Archived by WebCite at http://www.webcitation.org/63YiSy6R8) PMID:22182483

  12. Finite-Time and -Size Scalings in the Evaluation of Large Deviation Functions. Numerical Analysis in Continuous Time

    NASA Astrophysics Data System (ADS)

    Guevara Hidalgo, Esteban; Nemoto, Takahiro; Lecomte, Vivien

    Rare trajectories of stochastic systems are important to understand because of their potential impact. However, their properties are by definition difficult to sample directly. Population dynamics provides a numerical tool allowing their study, by means of simulating a large number of copies of the system, which are subjected to a selection rule that favors the rare trajectories of interest. However, such algorithms are plagued by finite-simulation-time and finite-population-size effects that can render their use delicate. Using the continuous-time cloning algorithm, we analyze the finite-time and finite-size scalings of estimators of the large deviation functions associated with the distribution of the rare trajectories. We use these scalings in order to propose a numerical approach which allows us to extract the infinite-time and infinite-size limit of these estimators.

  13. 24. INTERIOR VIEW, WILLIAM GRAY AT SIZING GUAGE ADJACENT TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. INTERIOR VIEW, WILLIAM GRAY AT SIZING GUAGE ADJACENT TO BRADLEY HAMMER; NOTE THIS IS THE SAME TOOL AS BEING FORGED ABOVE - Warwood Tool Company, Foot of Nineteenth Street, Wheeling, Ohio County, WV

  14. Witnessing of Cheating-in-Exams Behavior and Factors Sustaining Integrity

    ERIC Educational Resources Information Center

    Starovoytova, Diana; Arimi, Milton

    2017-01-01

    This study is part of a larger research project on cheating at the School of Engineering (SOE). The study design used a descriptive survey approach and a document analysis. A confidential self-report questionnaire was designed and used as the main instrument for this study, with a sample size of 100 subjects and a response rate of 95%. The tool was…

  15. Delivering Integrated Services. Models for Facilitating Change in Small and Mid-Sized Firms. Business Assistance Tools.

    ERIC Educational Resources Information Center

    Mitchell, Stephen M.

    This guide draws on case studies to identify lessons for small and midsized firms who wish to improve the quality of their services and facilitate change. Following an introduction, section 2 describes the context in which the research was undertaken after a needs analysis was conducted of small and midsized businesses and service providers, and…

  16. An automated field phenotyping pipeline for application in grapevine research.

    PubMed

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-02-26

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale.

  17. An Automated Field Phenotyping Pipeline for Application in Grapevine Research

    PubMed Central

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-01-01

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision of phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency in plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data of two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale. PMID:25730485

  18. GenomicTools: a computational platform for developing high-throughput analytics in genomics.

    PubMed

    Tsirigos, Aristotelis; Haiminen, Niina; Bilal, Erhan; Utro, Filippo

    2012-01-15

    Recent advances in sequencing technology have resulted in the dramatic increase of sequencing data, which, in turn, requires efficient management of computational resources, such as computing time, memory requirements as well as prototyping of computational pipelines. We present GenomicTools, a flexible computational platform, comprising both a command-line set of tools and a C++ API, for the analysis and manipulation of high-throughput sequencing data such as DNA-seq, RNA-seq, ChIP-seq and MethylC-seq. GenomicTools implements a variety of mathematical operations between sets of genomic regions thereby enabling the prototyping of computational pipelines that can address a wide spectrum of tasks ranging from pre-processing and quality control to meta-analyses. Additionally, the GenomicTools platform is designed to analyze large datasets of any size by minimizing memory requirements. In practical applications, where comparable, GenomicTools outperforms existing tools in terms of both time and memory usage. The GenomicTools platform (version 2.0.0) was implemented in C++. The source code, documentation, user manual, example datasets and scripts are available online at http://code.google.com/p/ibm-cbc-genomic-tools.

  19. The Kinematic Analysis of Flat Leverage Mechanism of the Third Class

    NASA Astrophysics Data System (ADS)

    Zhauyt, A.; Mamatova, G.; Abdugaliyeva, G.; Alipov, K.; Sakenova, A.; Alimbetov, A.

    2017-10-01

    When designing flat link mechanisms of high classes, it is necessary, after the block diagrams and link linear dimensions have been defined, to perform strength calculations of the links, i.e. to rationally choose their forms and determine their cross-section sizes. This work offers an algorithm for determining the link lengths of mechanisms of high classes (MHC) and their metric parameters by successive approximation. In this paper, educational and research software named GIM is presented. This software has been developed with the aim of addressing the difficulties students usually encounter when facing up to the kinematic analysis of mechanisms. A deep understanding of kinematic analysis is necessary to go a step further into the design and synthesis of mechanisms. In order to support and complement the theoretical lectures, the GIM software is used during practical exercises, serving as a complementary educational tool reinforcing the knowledge acquired by the students.

  20. De novo comparative transcriptome analysis of genes involved in fruit morphology of pumpkin cultivars with extreme size difference and development of EST-SSR markers.

    PubMed

    Xanthopoulou, Aliki; Ganopoulos, Ioannis; Psomopoulos, Fotis; Manioudaki, Maria; Moysiadis, Theodoros; Kapazoglou, Aliki; Osathanunkul, Maslin; Michailidou, Sofia; Kalivas, Apostolos; Tsaftaris, Athanasios; Nianiou-Obeidat, Irini; Madesis, Panagiotis

    2017-07-30

    The genetic basis of fruit size and shape was investigated for the first time in Cucurbita species and genetic loci associated with fruit morphology have been identified. Although extensive genomic resources are available at present for tomato (Solanum lycopersicum), cucumber (Cucumis sativus), melon (Cucumis melo) and watermelon (Citrullus lanatus), genomic databases for Cucurbita species are limited. Recently, our group reported the generation of pumpkin (Cucurbita pepo) transcriptome databases from two contrasting cultivars with extreme fruit sizes. In the current study we used these databases to perform comparative transcriptome analysis in order to identify genes with potential roles in fruit morphology and fruit size. Differential Gene Expression (DGE) analysis between cv. 'Munchkin' (small-fruit) and cv. 'Big Moose' (large-fruit) revealed a variety of candidate genes associated with fruit morphology with significant differences in gene expression between the two cultivars. In addition, we have set the framework for generating EST-SSR markers, which discriminate different C. pepo cultivars and show transferability to related Cucurbitaceae species. The results of the present study will contribute to both further understanding the molecular mechanisms regulating fruit morphology and furthermore identifying the factors that determine fruit size. Moreover, they may lead to the development of molecular marker tools for selecting genotypes with desired morphological traits. Copyright © 2017. Published by Elsevier B.V.

  1. Automated measurement of diatom size

    USGS Publications Warehouse

    Spaulding, Sarah A.; Jewson, David H.; Bixby, Rebecca J.; Nelson, Harry; McKnight, Diane M.

    2012-01-01

    Size analysis of diatom populations has not been widely considered, but it is a potentially powerful tool for understanding diatom life histories, population dynamics, and phylogenetic relationships. However, measuring cell dimensions on a light microscope is a time-consuming process. An alternative technique has been developed using digital flow cytometry on a FlowCAM® (Fluid Imaging Technologies) to capture hundreds, or even thousands, of images of a chosen taxon from a single sample in a matter of minutes. Up to 30 morphological measures may be quantified through post-processing of the high resolution images. We evaluated FlowCAM size measurements, comparing them against measurements from a light microscope. We found good agreement between measurement of apical cell length in species with elongated, straight valves, including small Achnanthidium minutissimum (11–21 µm) and large Didymosphenia geminata (87–137 µm) forms. However, a taxon with curved cells, Hannaea baicalensis (37–96 µm), showed differences of ~4 µm between the two methods. Discrepancies appear to be influenced by the choice of feret or geodesic measurement for asymmetric cells. We describe the operating conditions necessary for analysis of size distributions and present suggestions for optimal instrument conditions for size analysis of diatom samples using the FlowCAM. The increased speed of data acquisition through use of imaging flow cytometers like the FlowCAM is an essential step for advancing studies of diatom populations.

  2. Effect of crumb cellular structure characterized by image analysis on cake softness.

    PubMed

    Dewaest, Marine; Villemejane, Cindy; Berland, Sophie; Neron, Stéphane; Clement, Jérôme; Verel, Aliette; Michon, Camille

    2018-06-01

    Sponge cake is a cereal product characterized by an aerated crumb and appreciated for its softness. When formulating such a product, it is useful to be able to characterize the crumb structure using image analysis and to build knowledge about the effects of the crumb cellular structure on the mechanical properties that contribute to softness. An image analysis method based on mathematical morphology was adapted from the one developed for bread crumb. In order to evaluate its ability to discriminate cellular structures, series of cakes were prepared using two rather similar emulsifiers but also using flours with different aging times before use. The mechanical properties of the crumbs of these different cakes were also characterized. This allowed a cell structure classification taking into account cell size and homogeneity, but also cell wall thickness and the number of holes in the walls. Interestingly, the cellular structure differences had a larger impact on the Young's modulus of the aerated crumb than did the wall firmness. Increasing the aging time of flour before use leads to the production of firmer crumbs due to coarser and less homogeneous cellular structures. Changing the composition of the emulsifier may change the cellular structure and, depending on the type of structural change, have an impact on the firmness of the crumb. Cellular structure rather than cell wall firmness was found to impact cake crumb firmness. The new fast and automated tool for cake crumb structure analysis allows quick detection of any change in cell size or homogeneity, as well as in cell wall thickness and the number of holes in the walls (openness degree). To obtain a softer crumb, the options appear to be to decrease the cell size and the cell wall thickness and/or to increase the openness degree. It is then possible to easily evaluate the effects of ingredients (flour composition, emulsifier …) or changes in the process on the crumb structure and thus its softness. Moreover, this image analysis is a very efficient tool for quality control. © 2017 Wiley Periodicals, Inc.

  3. Evaluation of tools for highly variable gene discovery from single-cell RNA-seq data.

    PubMed

    Yip, Shun H; Sham, Pak Chung; Wang, Junwen

    2018-02-21

    Traditional RNA sequencing (RNA-seq) allows the detection of gene expression variations between two or more cell populations through differentially expressed gene (DEG) analysis. However, genes that contribute to cell-to-cell differences are not discoverable with RNA-seq because RNA-seq samples are obtained from a mixture of cells. Single-cell RNA-seq (scRNA-seq) allows the detection of gene expression in each cell. With scRNA-seq, highly variable gene (HVG) discovery allows the detection of genes that contribute strongly to cell-to-cell variation within a homogeneous cell population, such as a population of embryonic stem cells. This analysis is implemented in many software packages. In this study, we compare seven HVG methods from six software packages, including BASiCS, Brennecke, scLVM, scran, scVEGs and Seurat. Our results demonstrate that reproducibility in HVG analysis requires a larger sample size than DEG analysis. Discrepancies between methods and potential issues in these tools are discussed and recommendations are made.
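
    A hedged illustration of the simplest flavor of HVG selection (ranking genes by squared coefficient of variation after library-size normalization); the packages compared above each implement more sophisticated variants, so the Python sketch below is illustrative only, and the normalization factor and cutoff are assumptions.

      # Naive highly-variable-gene selection on a cells x genes count matrix (sketch).
      import numpy as np

      def top_hvgs(counts, n_top=500):
          """Rank genes by squared coefficient of variation of normalized counts."""
          norm = counts / counts.sum(axis=1, keepdims=True) * 1e4   # simple library-size scaling
          mean = norm.mean(axis=0)
          var = norm.var(axis=0)
          cv2 = np.divide(var, mean ** 2, out=np.zeros_like(var), where=mean > 0)
          return np.argsort(cv2)[::-1][:n_top]                      # indices of the top genes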

  4. CAS-viewer: web-based tool for splicing-guided integrative analysis of multi-omics cancer data.

    PubMed

    Han, Seonggyun; Kim, Dongwook; Kim, Youngjun; Choi, Kanghoon; Miller, Jason E; Kim, Dokyoon; Lee, Younghee

    2018-04-20

    The Cancer Genome Atlas (TCGA) project is a public resource that provides transcriptomic, DNA sequence, methylation, and clinical data for 33 cancer types. Transforming the large size and high complexity of TCGA cancer genome data into integrated knowledge can be useful to promote cancer research. Alternative splicing (AS) is a key regulatory mechanism of genes in human cancer development and in the interaction with epigenetic factors. Therefore, AS-guided integration of existing TCGA data sets will make it easier to gain insight into the genetic architecture of cancer risk and related outcomes. There are already existing tools for analyzing and visualizing alternative mRNA splicing patterns in large-scale RNA-seq experiments. However, these existing web-based tools are limited to analyzing one type of TCGA data set at a time, such as transcriptomic information alone. We implemented CAS-viewer (integrative analysis of Cancer genome data based on Alternative Splicing), a web-based tool leveraging multi-cancer omics data from TCGA. It illustrates alternative mRNA splicing patterns along with methylation, miRNAs, and SNPs, and then provides an analysis tool to link differential transcript expression ratios to methylation, miRNA, and splicing regulatory elements for 33 cancer types. Moreover, one can analyze AS patterns together with clinical data to identify potential transcripts associated with different survival outcomes for each cancer. CAS-viewer is a web-based application for transcript isoform-driven integration of multi-omics data in multiple cancer types and will aid in the visualization and possible discovery of biomarkers for cancer by integrating multi-omics data from TCGA.

  5. ParCAT: A Parallel Climate Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high-dimensional data sets. With single-run data sets growing into the tens, hundreds, and even thousands of gigabytes, parallel computing tools are becoming a necessity for analyzing and comparing climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs, and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather on reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command-line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. ParCAT is implemented in C to provide efficient file IO. The file IO operations in the toolkit use the parallel-netcdf library; this enables the code to use the parallel IO capabilities of modern HPC systems. Analysis that currently requires an estimated 12+ hours with the traditional CCSM Land Model Diagnostics Package can now be performed in as little as 30 minutes on a single desktop workstation, and in a few minutes for relatively small jobs on modern HPC systems such as ORNL's Jaguar.
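
    As a rough illustration of the kind of reduction ParCAT performs (this is not ParCAT's code), the Python sketch below computes a time-mean field for two runs and their difference from NetCDF files; the file names, the variable name "tas", and the use of the netCDF4 package are assumptions.

      # Sketch: reduce two climate runs to a time-mean difference map and a CSV summary.
      import numpy as np
      from netCDF4 import Dataset  # assumes the netCDF4 package is installed

      def time_mean(path, varname="tas"):
          """Read a (time, lat, lon) variable and collapse the time axis."""
          with Dataset(path) as ds:
              data = np.asarray(ds.variables[varname][:])
          return data.mean(axis=0)

      run_a = time_mean("run_a.nc")   # hypothetical file names
      run_b = time_mean("run_b.nc")
      diff = run_a - run_b            # spatial map of the mean difference between runs
      print("global mean difference:", float(diff.mean()))
      np.savetxt("diff_summary.csv", diff, delimiter=",")  # CSV output for other tools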

  6. On-line Tool Wear Detection on DCMT070204 Carbide Tool Tip Based on Noise Cutting Audio Signal using Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Prasetyo, T.; Amar, S.; Arendra, A.; Zam Zami, M. K.

    2018-01-01

    This study develops an on-line detection system to predict the wear of the DCMT070204 tool tip during the cutting of a workpiece. The machine used in this research is a CNC ProTurn 9000 cutting ST42 steel cylinders. The audio signal was captured using a microphone placed on the tool post and recorded in Matlab at a sampling rate of 44.1 kHz with a frame size of 1024 samples. The recorded data set consists of 110 records derived from the audio signal while cutting with a normal tool and with a worn tool. Signal features were then extracted in the frequency domain using the Fast Fourier Transform, and feature selection was performed based on correlation analysis. Tool wear classification was carried out using an artificial neural network with the 33 selected input features, trained with the back-propagation method. Classification performance testing yielded an accuracy of 74%.
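
    The pipeline described above (frame the audio, take FFT magnitudes, select features by correlation, classify with a back-propagation network) can be sketched in Python as follows; the hidden-layer size, the toy data, and the use of scikit-learn's MLPClassifier are illustrative assumptions, not the authors' implementation.

      # Illustrative sketch of the described wear-detection pipeline (toy data).
      import numpy as np
      from sklearn.neural_network import MLPClassifier  # back-propagation MLP

      def fft_features(signal, frame_len=1024):
          """Split the signal into frames and return per-frame FFT magnitude spectra."""
          n_frames = len(signal) // frame_len
          frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
          return np.abs(np.fft.rfft(frames, axis=1))       # (n_frames, frame_len // 2 + 1)

      def select_features(X, y, k=33):
          """Keep the k spectral bins most correlated with the wear label."""
          corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
          return np.argsort(corr)[::-1][:k]

      # In practice X would come from fft_features() on the 110 recorded cuts; toy data here.
      rng = np.random.default_rng(0)
      X = np.abs(rng.normal(size=(110, 513)))
      y = rng.integers(0, 2, size=110)                      # 0 = normal tool, 1 = worn tool
      idx = select_features(X, y)
      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X[:, idx], y)
      print("training accuracy:", clf.score(X[:, idx], y))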

  7. Fast probabilistic file fingerprinting for big data

    PubMed Central

    2013-01-01

    Background Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. Results We present an efficient method for calculating file uniqueness for large scientific data files, that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Conclusions Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff. PMID:23445565
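
    A minimal Python sketch of the sampling idea (hash a fixed number of randomly chosen blocks instead of the whole file); the block size, sample count, and the use of the file size as a deterministic seed are assumptions for illustration and do not reproduce the pfff tool's actual algorithm.

      # Sketch: probabilistic file fingerprint from randomly sampled blocks.
      import hashlib
      import os
      import random

      def probabilistic_fingerprint(path, n_samples=1024, block=64):
          size = os.path.getsize(path)
          h = hashlib.sha256()
          h.update(str(size).encode())        # file size is folded into the fingerprint
          rng = random.Random(size)           # deterministic offsets for a given size
          with open(path, "rb") as f:
              for _ in range(n_samples):
                  f.seek(rng.randrange(max(size - block, 1)))
                  h.update(f.read(block))
          return h.hexdigest()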

  8. SU-E-J-92: CERR: New Tools to Analyze Image Registration Precision.

    PubMed

    Apte, A; Wang, Y; Oh, J; Saleh, Z; Deasy, J

    2012-06-01

    To present new tools in CERR (the Computational Environment for Radiotherapy Research) to analyze image registration, along with other software updates and additions. CERR continues to be a key environment (cited more than 129 times to date) for numerous RT-research studies involving outcomes modeling, prototyping algorithms for segmentation and registration, experiments with phantom dosimetry, IMRT research, etc. Image registration is one of the key technologies required in many research studies. CERR has been interfaced with popular image registration frameworks such as Plastimatch and ITK. Once the images have been auto-registered, CERR provides tools to analyze the accuracy of registration using the following innovative approaches: (1) Distance Discordance Histograms (DDH), described in detail in a separate paper, and (2) 'MirrorScope', explained as follows: for any view plane the 2-D image is broken up into a 2-D grid of medium-sized squares. Each square contains a right half, which is the reference image, and a left half, which is the mirror-flipped version of the overlay image. The user can increase or decrease the size of this grid to control the resolution of the analysis. Other updates to CERR include tools to extract image and dosimetric features programmatically and store them in a central database, and tools to interface with statistical analysis software such as SPSS and the Matlab Statistics Toolbox. MirrorScope was compared on various examples, including 'perfect' registration examples and 'artificially translated' registrations. For 'perfect' registrations, the patterns obtained within each grid element are symmetric and are easily recognized visually as aligned. For registrations that are off, the grid elements located in the regions of imperfection show asymmetric patterns that are easily recognized. The new updates to CERR further increase its utility for RT-research. MirrorScope is a visually intuitive method for monitoring the accuracy of image registration that improves on the visual confusion of standard methods. © 2012 American Association of Physicists in Medicine.
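
    One plausible reading of the MirrorScope construction, sketched with numpy for a grayscale image pair (the grid size and the exact mirroring convention are assumptions; this is not the CERR implementation):

      # Sketch: composite where each grid square shows the reference image in its
      # right half and a mirrored strip of the registered overlay in its left half,
      # so a perfect registration yields a symmetric pattern about the midline.
      import numpy as np

      def mirrorscope(reference, overlay, square=64):
          assert reference.shape == overlay.shape
          out = reference.copy()                      # border remainders keep the reference
          h, w = reference.shape
          half = square // 2
          for r in range(0, h - square + 1, square):
              for c in range(0, w - square + 1, square):
                  mid = c + half
                  out[r:r + square, mid:c + square] = reference[r:r + square, mid:c + square]
                  out[r:r + square, c:mid] = overlay[r:r + square, mid:c + square][:, ::-1]
          return out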

  9. Design of the VISITOR Tool: A Versatile ImpulSive Interplanetary Trajectory OptimizeR

    NASA Technical Reports Server (NTRS)

    Corpaccioli, Luca; Linskens, Harry; Komar, David R.

    2014-01-01

    The design of trajectories for interplanetary missions represents one of the most complex and important problems to solve during conceptual space mission design. To facilitate conceptual mission sizing activities, it is essential to obtain sufficiently accurate trajectories in a fast and repeatable manner. To this end, the VISITOR tool was developed. This tool modularly augments a patched-conic MGA-1DSM model with a mass model, launch window analysis, and the ability to simulate more realistic arrival and departure operations. This was implemented in MATLAB, exploiting the built-in optimization tools and vector analysis routines. The chosen optimization strategy combines a grid search with a pattern search, an iterative variable-grid method. A genetic algorithm can be selectively used to improve search space pruning, at the cost of losing the repeatability of the results and increased computation time. The tool was validated against seven flown missions: the average total mission ΔV offset from the nominal trajectory was 9.1%, which was reduced to 7.3% when using the genetic algorithm at the cost of an increase in computation time by a factor of 5.7. It was found that VISITOR was well-suited for the conceptual design of interplanetary trajectories, while also facilitating future improvements due to its modular structure.

  10. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  11. Vector production in an academic environment: a tool to assess production costs.

    PubMed

    Boeke, Aaron; Doumas, Patrick; Reeves, Lilith; McClurg, Kyle; Bischof, Daniela; Sego, Lina; Auberry, Alisha; Tatikonda, Mohan; Cornetta, Kenneth

    2013-02-01

    Generating gene and cell therapy products under good manufacturing practices is a complex process. When determining the cost of these products, researchers must consider the large number of supplies used for manufacturing and the personnel and facility costs to generate vector and maintain a cleanroom facility. To facilitate cost estimates, the Indiana University Vector Production Facility teamed with the Indiana University Kelley School of Business to develop a costing tool that, in turn, provides pricing. The tool is designed in Microsoft Excel and is customizable to meet the needs of other core facilities. It is available from the National Gene Vector Biorepository. The tool allows cost determinations using three different costing methods and was developed in an effort to meet the A21 circular requirements for U.S. core facilities performing work for federally funded projects. The costing tool analysis reveals that the cost of vector production does not have a linear relationship with batch size. For example, increasing the production from 9 to 18 liters of a retroviral vector product increases total cost by a modest factor of 1.2 rather than doubling it. The analysis discussed in this article will help core facilities and investigators plan a cost-effective strategy for gene and cell therapy production.

  12. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareani, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation technologies (V&V) on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to do a controlled experiment to compare formal methods based tools to testing on a realistic industrial-size example where the emphasis was on collecting as much data on the performance of the tools and the participants as possible. The paper includes a description of the Rover code that was analyzed, the tools used as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results can not be generalized, but we believe it can still serve as a valuable point of reference for future studies of this kind. It did confirm the belief we had that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  13. Vector Production in an Academic Environment: A Tool to Assess Production Costs

    PubMed Central

    Boeke, Aaron; Doumas, Patrick; Reeves, Lilith; McClurg, Kyle; Bischof, Daniela; Sego, Lina; Auberry, Alisha; Tatikonda, Mohan

    2013-01-01

    Generating gene and cell therapy products under good manufacturing practices is a complex process. When determining the cost of these products, researchers must consider the large number of supplies used for manufacturing and the personnel and facility costs to generate vector and maintain a cleanroom facility. To facilitate cost estimates, the Indiana University Vector Production Facility teamed with the Indiana University Kelley School of Business to develop a costing tool that, in turn, provides pricing. The tool is designed in Microsoft Excel and is customizable to meet the needs of other core facilities. It is available from the National Gene Vector Biorepository. The tool allows cost determinations using three different costing methods and was developed in an effort to meet the A21 circular requirements for U.S. core facilities performing work for federally funded projects. The costing tool analysis reveals that the cost of vector production does not have a linear relationship with batch size. For example, increasing the production from 9 to 18 liters of a retroviral vector product increases total cost by a modest factor of 1.2 rather than doubling it. The analysis discussed in this article will help core facilities and investigators plan a cost-effective strategy for gene and cell therapy production. PMID:23360377

  14. Conceptual Design and Structural Optimization of NASA Environmentally Responsible Aviation (ERA) Hybrid Wing Body Aircraft

    NASA Technical Reports Server (NTRS)

    Quinlan, Jesse R.; Gern, Frank H.

    2016-01-01

    Simultaneously achieving the fuel consumption and noise reduction goals set forth by NASA's Environmentally Responsible Aviation (ERA) project requires innovative and unconventional aircraft concepts. In response, advanced hybrid wing body (HWB) aircraft concepts have been proposed and analyzed as a means of meeting these objectives. For the current study, several HWB concepts were analyzed using the Hybrid wing body Conceptual Design and structural optimization (HCDstruct) analysis code. HCDstruct is a medium-fidelity finite element based conceptual design and structural optimization tool developed to fill the critical analysis gap existing between lower order structural sizing approaches and detailed, often finite element based sizing methods for HWB aircraft concepts. Whereas prior versions of the tool used a half-model approach in building the representative finite element model, a full wing-tip-to-wing-tip modeling capability was recently added to HCDstruct, which alleviated the symmetry constraints at the model centerline in place of a free-flying model and allowed for more realistic center body, aft body, and wing loading and trim response. The latest version of HCDstruct was applied to two ERA reference cases, including the Boeing Open Rotor Engine Integration On an HWB (OREIO) concept and the Boeing ERA-0009H1 concept, and results agreed favorably with detailed Boeing design data and related Flight Optimization System (FLOPS) analyses. Following these benchmark cases, HCDstruct was used to size NASA's ERA HWB concepts and to perform a related scaling study.

  15. Fully automated analysis of multi-resolution four-channel micro-array genotyping data

    NASA Astrophysics Data System (ADS)

    Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.

    2006-03-01

    We present a fully automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron resolution, with sizes up to 80 MB per channel). The system was developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining genotypes of multiple genetic markers in individuals. It plays an important role in the ongoing shift from traditional medical treatments toward personalized genetic medicine, i.e., individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required for the prospective practical use of microarray-based genetic testing for predicting disease susceptibilities and drug effects in clinical practice, which requires a turn-around time compatible with clinical decision-making. In this paper we have developed a fully automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.

  16. Development of Response Surface Models for Rapid Analysis & Multidisciplinary Optimization of Launch Vehicle Design Concepts

    NASA Technical Reports Server (NTRS)

    Unal, Resit

    1999-01-01

    Multidisciplinary design optimization (MDO) is an important step in the design and evaluation of launch vehicles, since it has a significant impact on performance and life-cycle cost. The objective in MDO is to search the design space to determine the values of design parameters that optimize the performance characteristics subject to system constraints. The Vehicle Analysis Branch (VAB) at NASA Langley Research Center has computerized analysis tools in many of the disciplines required for the design and analysis of launch vehicles. Vehicle performance characteristics can be determined by the use of these computerized analysis tools. The next step is to optimize the system performance characteristics subject to multidisciplinary constraints. However, most of the complex sizing and performance evaluation codes used for launch vehicle design are stand-alone tools, operated by disciplinary experts. They are, in general, difficult to integrate and use directly for MDO. An alternative has been to utilize response surface methodology (RSM) to obtain polynomial models that approximate the functional relationships between performance characteristics and design variables. These approximation models, called response surface models, are then used to integrate the disciplines using mathematical programming methods for efficient system-level design analysis, MDO, and fast sensitivity simulations. A second-order response surface model of the form given below has been commonly used in RSM, since in many cases it can provide an adequate approximation, especially if the region of interest is sufficiently limited.
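
    For reference, the standard second-order response surface model (presumably the form the abstract alludes to; the notation below is the conventional one, not taken from the paper) is

      y = \beta_0 + \sum_{i=1}^{k} \beta_i x_i + \sum_{i=1}^{k} \beta_{ii} x_i^2 + \sum_{i<j} \beta_{ij} x_i x_j + \epsilon

    where y is a performance characteristic, the x_i are the k design variables, the \beta coefficients are estimated by least squares from a designed set of analysis runs, and \epsilon is the approximation error.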

  17. Environmental control and life support system analysis tools for the Space Station era

    NASA Technical Reports Server (NTRS)

    Blakely, R. L.; Rowell, L. F.

    1984-01-01

    This paper describes the concept of an evolving emulation, simulation, sizing, and technology assessment program (ESSTAP) which can be used effectively for the various functional disciplines (structures, power, ECLSS, etc.), beginning with the initial system selection and conceptual design processes and continuing on through the mission operation and growth phases of the Space Station, for the purpose of minimizing overall program costs. It discusses the basic requirements for these tools, as currently envisioned for the Environmental Control and Life Support System (ECLSS), identifying their intended and potential uses and applications, and presents examples and the status of several representative tools. The development and applications of a Space Station Atmospheric Revitalization Subsystem (ARS) demonstration model to be used for concept verification are also discussed.

  18. Determination and representation of electric charge distributions associated with adverse weather conditions

    NASA Technical Reports Server (NTRS)

    Rompala, John T.

    1992-01-01

    Algorithms are presented for determining the size and location of electric charges which model storm systems and lightning strikes. The analysis utilizes readings from a grid of ground-level field mills and geometric constraints on parameters to arrive at a representative set of charges. This set is used to generate three-dimensional graphical depictions of the charge set as well as contour maps of the ground-level electrical environment over the grid. The composite analytic and graphic package is demonstrated and evaluated using controlled input data and archived data from a storm system. The results demonstrate the package's utility as: an operational tool in appraising adverse weather conditions; a research tool in studies of topics such as storm structure, storm dynamics, and lightning; and a tool in designing and evaluating grid systems.

  19. Development and Demonstration of a Computational Tool for the Analysis of Particle Vitiation Effects in Hypersonic Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Perkins, Hugh Douglas

    2010-01-01

    In order to improve the understanding of particle vitiation effects in hypersonic propulsion test facilities, a quasi-one dimensional numerical tool was developed to efficiently model reacting particle-gas flows over a wide range of conditions. Features of this code include gas-phase finite-rate kinetics, a global porous-particle combustion model, mass, momentum and energy interactions between phases, and subsonic and supersonic particle drag and heat transfer models. The basic capabilities of this tool were validated against available data or other validated codes. To demonstrate the capabilities of the code a series of computations were performed for a model hypersonic propulsion test facility and scramjet. Parameters studied were simulated flight Mach number, particle size, particle mass fraction and particle material.

  20. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of planetary entry, descent, and landing (EDL) is multidisciplinary in nature. SAPE, a systems-analysis tool for planetary EDL, improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and an interface for structural sizing.

  1. Analysis of Polder Polarization Measurements During Astex and Eucrex Experiments

    NASA Technical Reports Server (NTRS)

    Chen, Hui; Han, Qingyuan; Chou, Joyce; Welch, Ronald M.

    1997-01-01

    Polarization is more sensitive than intensity to cloud microstructure such as the particle size and shape, and multiple scattering does not wash out features in polarization as effectively as it does in the intensity. Polarization measurements, particularly in the near IR, are potentially a valuable tool for cloud identification and for studies of the microphysics of clouds. The POLDER instrument is designed to provide wide field of view bidirectional images in polarized light. During the ASTEX-SOFIA campaign on June 12th, 1992, over the Atlantic Ocean (near the Azores Islands), images of homogeneous thick stratocumulus cloud fields were acquired. During the EUCREX'94 (April, 1994) campaign, the POLDER instrument was flying over the region of Brittany (France), taking observations of cirrus clouds. This study involves model studies and data analysis of POLDER observations. Both models and data analysis show that POLDER can be used to detect cloud thermodynamic phases. Model results show that polarized reflection in the λ = 0.86 µm band is sensitive to cloud droplet sizes but not to cloud optical thickness. Comparison between model and data analysis reveals that cloud droplet sizes during ASTEX are about 5 microns, which agrees very well with the results of in situ measurements (4-5 microns). Knowing the retrieved cloud droplet sizes, the total reflected intensity of the POLDER measurements then can be used to retrieve cloud optical thickness. The close agreement between data analysis and model results during ASTEX also suggests the homogeneity of the cloud layer during that campaign.

  2. TOFSIMS-P: a web-based platform for analysis of large-scale TOF-SIMS data.

    PubMed

    Yun, So Jeong; Park, Ji-Won; Choi, Il Ju; Kang, Byeongsoo; Kim, Hark Kyun; Moon, Dae Won; Lee, Tae Geol; Hwang, Daehee

    2011-12-15

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) has been a useful tool to profile secondary ions from the near-surface region of specimens with its high molecular specificity and submicrometer spatial resolution. However, TOF-SIMS analysis of even a moderately large number of samples has been hampered by the lack of tools for automatically analyzing the huge amount of TOF-SIMS data. Here, we present a computational platform to automatically identify and align peaks, find discriminatory ions, build a classifier, and construct networks describing differential metabolic pathways. To demonstrate the utility of the platform, we analyzed 43 data sets generated from seven gastric cancer and eight normal tissues using TOF-SIMS. A total of 87,138 ions were detected from the 43 data sets by TOF-SIMS. We selected and then aligned 1286 ions. Among them, we found 66 ions discriminating gastric cancer tissues from normal ones. Using these 66 ions, we then built a partial least squares-discriminant analysis (PLS-DA) model, resulting in a misclassification error rate of 0.024. Finally, network analysis of the 66 ions showed dysregulation of amino acid metabolism in the gastric cancer tissues. The results show that the proposed framework was effective in analyzing TOF-SIMS data from a moderately large number of samples, resulting in discrimination of gastric cancer tissues from normal tissues and identification of biomarker candidates associated with amino acid metabolism.
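
    As a hedged illustration of the final classification step only (not the platform's implementation), a PLS-DA model on a tissues-by-ions intensity matrix can be fit and evaluated in Python roughly as follows; the toy matrix, the two-component choice, and the leave-one-out scheme are assumptions.

      # Sketch: PLS-DA on a samples x selected-ions matrix with leave-one-out error.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import LeaveOneOut

      X = np.random.default_rng(1).normal(size=(15, 66))   # 15 tissues x 66 ions (toy data)
      y = np.array([1] * 7 + [0] * 8)                      # 1 = gastric cancer, 0 = normal

      errors = 0
      for train, test in LeaveOneOut().split(X):
          pls = PLSRegression(n_components=2).fit(X[train], y[train])
          pred = int(pls.predict(X[test]).ravel()[0] >= 0.5)
          errors += int(pred != y[test][0])
      print("misclassification error rate:", errors / len(y))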

  3. The importance of integrated left atrial evaluation: From hypertension to heart failure with preserved ejection fraction.

    PubMed

    Beltrami, Matteo; Palazzuoli, Alberto; Padeletti, Luigi; Cerbai, Elisabetta; Coiro, Stefano; Emdin, Michele; Marcucci, Rossella; Morrone, Doralisa; Cameli, Matteo; Savino, Ketty; Pedrinelli, Roberto; Ambrosio, Giuseppe

    2018-02-01

    Functional analysis and measurement of the left atrium are an integral part of cardiac evaluation, and they represent a key element during non-invasive analysis of diastolic function in patients with hypertension (HT) and/or heart failure with preserved ejection fraction (HFpEF). However, diastolic dysfunction remains quite elusive regarding classification, and atrial size and function are two key factors for left ventricular (LV) filling evaluation. Chronic left atrial (LA) remodelling is the final step of chronic intra-cavitary pressure overload, and it accompanies increased neurohormonal, proarrhythmic and prothrombotic activities. In this systematic review, we aim to propose a multi-modality approach for LA geometry and function analysis, which integrates diastolic flow with LA characteristics and remodelling through application of both traditional and new diagnostic tools. The most important studies published in the literature on LA size, function and diastolic dysfunction in patients with HFpEF, HT and/or atrial fibrillation (AF) are considered and discussed. In HFpEF and HT, pulsed and tissue Doppler assessments are useful tools to estimate LV filling pressure, atrio-ventricular coupling and LV relaxation, but they need to be enriched with LA evaluation in terms of morphology and function. An integrated evaluation should also be applied to patients with a high arrhythmic risk, in whom eccentric LA remodelling and higher LA stiffness are associated with a greater AF risk. Evaluation of LA size, volume, function and structure is mandatory in the management of patients with HT, HFpEF and AF. A multi-modality approach could provide additional information, identifying subjects with more severe LA remodelling. Left atrium assessment deserves an accurate study within the cardiac imaging approach, and optimised measurements with established cut-offs need to be better recognised through multicentre studies. © 2017 John Wiley & Sons Ltd.

  4. Integrated Modeling Tools for Thermal Analysis and Applications

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Needels, Laura; Papalexandris, Miltiadis

    1999-01-01

    Integrated modeling of spacecraft systems is a rapidly evolving area in which multidisciplinary models are developed to design and analyze spacecraft configurations. These models are especially important in the early design stages, where rapid trades between subsystems can substantially impact design decisions. Integrated modeling is one of the cornerstones of two of NASA's planned missions in the Origins Program -- the Next Generation Space Telescope (NGST) and the Space Interferometry Mission (SIM). Common modeling tools for control design and opto-mechanical analysis have recently emerged and are becoming increasingly widely used. A discipline that has been somewhat less integrated, but is nevertheless of critical concern for high precision optical instruments, is thermal analysis and design. A major factor contributing to this mild estrangement is that the modeling philosophies and objectives for structural and thermal systems typically do not coincide. Consequently the tools that are used in these disciplines suffer a degree of incompatibility, each having developed along its own evolutionary path. Although standard thermal tools have worked relatively well in the past, integration with other disciplines requires revisiting modeling assumptions and solution methods. Over the past several years we have been developing a MATLAB-based integrated modeling tool called IMOS (Integrated Modeling of Optical Systems) which integrates many aspects of the structural, optical, control and dynamical analysis disciplines. Recent efforts have included developing a thermal modeling and analysis capability, which is the subject of this article. Currently, the IMOS thermal suite contains steady state and transient heat equation solvers, and the ability to set up the linear conduction network from an IMOS finite element model. The IMOS code generates linear conduction elements associated with plates and beams/rods of the thermal network directly from the finite element structural model. Conductances for temperature-varying materials are accommodated. This capability both streamlines the process of developing the thermal model from the finite element model, and also makes the structural and thermal models compatible in the sense that each structural node is associated with a thermal node. This is particularly useful when the purpose of the analysis is to predict structural deformations due to thermal loads. The steady state solver uses a restricted step size Newton method, and the transient solver is an adaptive step size implicit method applicable to general differential algebraic systems. Temperature dependent conductances and capacitances are accommodated by the solvers. In addition to discussing the modeling and solution methods, applications in which the thermal modeling is "in the loop" with sensitivity analysis, optimization and optical performance, drawn from our experiences with the Space Interferometry Mission (SIM) and the Next Generation Space Telescope (NGST), are presented.

  5. A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit

    NASA Technical Reports Server (NTRS)

    Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.

    2016-01-01

    Suboptimal suit fit is a known risk factor for crewmember shoulder injury. Suit fit assessment is however prohibitively time consuming and cannot be generalized across wide variations of body shapes and poses. In this work, we have developed a new design tool based on the statistical analysis of body shape scans. This tool is aimed at predicting the skin deformation and shape variations for any body size and shoulder pose for a target population. This new process, when incorporated with CAD software, will enable virtual suit fit assessments, predictively quantifying the contact volume, and clearance between the suit and body surface at reduced time and cost.

  6. Evaluating biomarkers for prognostic enrichment of clinical trials.

    PubMed

    Kerr, Kathleen F; Roth, Jeremy; Zhu, Kehao; Thiessen-Philbrook, Heather; Meisner, Allison; Wilson, Francis Perry; Coca, Steven; Parikh, Chirag R

    2017-12-01

    A potential use of biomarkers is to assist in prognostic enrichment of clinical trials, where only patients at relatively higher risk for an outcome of interest are eligible for the trial. We investigated methods for evaluating biomarkers for prognostic enrichment. We identified five key considerations when evaluating a biomarker and a screening threshold for prognostic enrichment: (1) clinical trial sample size, (2) calendar time to enroll the trial, (3) total patient screening costs and total per-patient trial costs, (4) generalizability of trial results, and (5) ethical evaluation of trial eligibility criteria. Items (1)-(3) are amenable to quantitative analysis. We developed the Biomarker Prognostic Enrichment Tool for evaluating biomarkers for prognostic enrichment at varying levels of screening stringency. We demonstrate that both modestly prognostic and strongly prognostic biomarkers can improve trial metrics using Biomarker Prognostic Enrichment Tool. Biomarker Prognostic Enrichment Tool is available as a webtool at http://prognosticenrichment.com and as a package for the R statistical computing platform. In some clinical settings, even biomarkers with modest prognostic performance can be useful for prognostic enrichment. In addition to the quantitative analysis provided by Biomarker Prognostic Enrichment Tool, investigators must consider the generalizability of trial results and evaluate the ethics of trial eligibility criteria.
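
    A minimal sketch of the kind of calculation behind items (1) and (3): given the outcome rate among biomarker-screened patients and an assumed treatment effect, compute the per-arm sample size for a two-proportion comparison and the number of patients who must be screened. The formula is the standard normal-approximation sample size for comparing two proportions; the event rates, eligible fraction, and relative risk below are hypothetical, and this is not the Biomarker Prognostic Enrichment Tool code.

      # Sketch: trial size and screening burden with and without prognostic enrichment.
      from scipy.stats import norm

      def per_arm_n(p_control, relative_risk, alpha=0.05, power=0.9):
          """Normal-approximation sample size per arm for comparing two proportions."""
          p_treat = p_control * relative_risk
          z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
          variance = p_control * (1 - p_control) + p_treat * (1 - p_treat)
          return z ** 2 * variance / (p_control - p_treat) ** 2

      # Hypothetical: screening at a biomarker threshold raises the event rate from
      # 10% (all-comers, everyone eligible) to 20%, with 40% of screened patients eligible.
      for p_event, eligible in [(0.10, 1.0), (0.20, 0.40)]:
          n = per_arm_n(p_event, relative_risk=0.75)
          print(f"event rate {p_event:.0%}: enroll {2 * n:.0f}, screen {2 * n / eligible:.0f}")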

  7. Using GIS to analyze animal movements in the marine environment

    USGS Publications Warehouse

    Hooge, Philip N.; Eichenlaub, William M.; Solomon, Elizabeth K.; Kruse, Gordon H.; Bez, Nicolas; Booth, Anthony; Dorn, Martin W.; Hills, Susan; Lipcius, Romuald N.; Pelletier, Dominique; Roy, Claude; Smith, Stephen J.; Witherell, David B.

    2001-01-01

    Advanced methods for analyzing animal movements have been little used in the aquatic research environment compared to the terrestrial. In addition, despite obvious advantages of integrating geographic information systems (GIS) with spatial studies of animal movement behavior, movement analysis tools have not been integrated into GIS for either aquatic or terrestrial environments. We therefore developed software that integrates one of the most commonly used GIS programs (ArcView®) with a large collection of animal movement analysis tools. This application, the Animal Movement Analyst Extension (AMAE), can be loaded as an extension to ArcView® under multiple operating system platforms (PC, Unix, and Mac OS). It contains more than 50 functions, including parametric and nonparametric home range analyses, random walk models, habitat analyses, point and circular statistics, tests of complete spatial randomness, tests for autocorrelation and sample size, point and line manipulation tools, and animation tools. This paper describes the use of these functions in analyzing animal location data; some limited examples are drawn from a sonic-tracking study of Pacific halibut (Hippoglossus stenolepis) in Glacier Bay, Alaska. The extension is available on the Internet at www.absc.usgs.gov/glba/gistools/index.htm.
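
    As a small, generic illustration of one of the simpler analyses listed (a minimum convex polygon home range), and not the AMAE extension itself, the Python sketch below uses simulated coordinates:

      # Sketch: minimum convex polygon (MCP) home-range area from relocation points.
      import numpy as np
      from scipy.spatial import ConvexHull

      points = np.random.default_rng(2).normal(size=(200, 2)) * 500   # hypothetical x/y in metres
      hull = ConvexHull(points)
      print("MCP area (m^2):", round(hull.volume, 1))   # for 2-D input, .volume is the area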

  8. Particle size analysis on density, surface morphology and specific capacitance of carbon electrode from rubber wood sawdust

    NASA Astrophysics Data System (ADS)

    Taer, E.; Kurniasih, B.; Sari, F. P.; Zulkifli, Taslim, R.; Sugianto, Purnama, A.; Apriwandi, Susanti, Y.

    2018-02-01

    A particle size analysis for supercapacitor carbon electrodes made from rubber wood sawdust (SGKK) has been carried out successfully. The electrode particle size was evaluated with respect to properties such as density, degree of crystallinity, surface morphology and specific capacitance. The variations in particle size were produced by different treatments in the grinding and sieving process. The sample particle sizes were 53-100 µm ground for 20 h (SA), 38-53 µm ground for 20 h (SB), and < 38 µm with grinding times of 40 h (SC) and 80 h (SD), respectively. All of the samples were activated with 0.4 M KOH solution. The carbon electrodes were carbonized at a temperature of 600 °C in an N2 gas environment and then activated in CO2 gas at a temperature of 900 °C for 2 h. The densities for each particle-size variation were 1.034 g cm⁻³, 0.849 g cm⁻³, 0.892 g cm⁻³ and 0.982 g cm⁻³, respectively. The morphological study showed that the particles were most closely spaced for the 38-53 µm (SB) particle size. The electrochemical properties of the supercapacitor cells were investigated using electrochemical methods such as impedance spectroscopy and charge-discharge at constant current with a Solartron 1280 instrument. The electrochemical test results show that the SB sample, with a particle size of 38-53 µm, produces supercapacitor cells with optimum capacitive performance.

  9. ASTROS: A multidisciplinary automated structural design tool

    NASA Technical Reports Server (NTRS)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  10. Cutting tool form compensation system and method

    DOEpatents

    Barkman, W.E.; Babelay, E.F. Jr.; Klages, E.J.

    1993-10-19

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed. 9 figures.
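
    A hedged sketch of the kind of edge-location and tool-center computation the abstract describes (threshold a high-contrast image, collect edge points, fit a circle by least squares); the algebraic circle fit shown is a generic choice for illustration, not necessarily the patented method.

      # Sketch: locate tool-edge points in a backlit image and fit a circle to
      # recover an effective tool center and radius. Illustrative only.
      import numpy as np

      def edge_points(image, threshold=128):
          """For each column, return the first row where the dark tool silhouette begins."""
          pts = []
          for col in range(image.shape[1]):
              rows = np.flatnonzero(image[:, col] < threshold)
              if rows.size:
                  pts.append((float(col), float(rows[0])))
          return np.array(pts)

      def fit_circle(pts):
          """Algebraic (Kasa) least-squares circle fit: returns (cx, cy, radius)."""
          x, y = pts[:, 0], pts[:, 1]
          A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
          b = x ** 2 + y ** 2
          (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
          return cx, cy, np.sqrt(c + cx ** 2 + cy ** 2)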

  11. Cutting tool form compensation system and method

    DOEpatents

    Barkman, William E.; Babelay, Jr., Edwin F.; Klages, Edward J.

    1993-01-01

    A compensation system for a computer-controlled machining apparatus having a controller and including a cutting tool and a workpiece holder which are movable relative to one another along a preprogrammed path during a machining operation utilizes a camera and a vision computer for gathering information at a preselected stage of a machining operation relating to the actual shape and size of the cutting edge of the cutting tool and for altering the preprogrammed path in accordance with detected variations between the actual size and shape of the cutting edge and an assumed size and shape of the cutting edge. The camera obtains an image of the cutting tool against a background so that the cutting tool and background possess contrasting light intensities, and the vision computer utilizes the contrasting light intensities of the image to locate points therein which correspond to points along the actual cutting edge. Following a series of computations involving the determining of a tool center from the points identified along the tool edge, the results of the computations are fed to the controller where the preprogrammed path is altered as aforedescribed.

  12. Analysis of In-Route Wireless Charging for the Shuttle System at Zion National Park

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meintz, Andrew; Prohaska, Robert; Konan, Arnaud

    System right-sizing is critical to implementation of wireless power transfer (WPT) for electric vehicles (EVs). This study will analyze potential WPT scenarios for the electrification of shuttle buses at Zion National Park utilizing a modelling tool developed by NREL called WPTSim. This tool uses second-by-second speed, location, and road grade data from the conventional shuttles in operation to simulate the incorporation of WPT at fine granularity. Vehicle power and state of charge are simulated over the drive cycle to evaluate potential system designs. The required battery capacity is determined based on the rated power at a variable number of charging locations. The outcome of this work is an analysis of the design tradeoffs for the electrification of the shuttle fleet with wireless charging versus conventional overnight charging.
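
    A toy Python sketch of the kind of second-by-second energy bookkeeping such a simulation performs (traction power demand, battery state of charge, opportunistic in-route charging); the power profile, charger behaviour, and battery and charger ratings are hypothetical, and this is not WPTSim.

      # Sketch: battery state of charge over a drive cycle with in-route charging.
      import numpy as np

      def simulate_soc(power_kw, at_charger, battery_kwh=60.0, charger_kw=50.0, soc0=0.9):
          """power_kw: traction power per second; at_charger: True where WPT is available."""
          soc, trace = soc0, []
          for p, charging in zip(power_kw, at_charger):
              net_kw = p - (charger_kw if charging else 0.0)       # charging offsets demand
              soc = min(soc - net_kw / 3600.0 / battery_kwh, 1.0)  # 1-second time step
              trace.append(soc)
          return np.array(trace)

      # Hypothetical 1-hour cycle: ~20 kW average demand, charging only while stopped.
      rng = np.random.default_rng(3)
      power = np.clip(rng.normal(20, 15, size=3600), 0, None)
      soc = simulate_soc(power, at_charger=(power == 0))
      print("minimum state of charge over the cycle:", round(float(soc.min()), 3))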

  13. Two-Dimensional Neutronic and Fuel Cycle Analysis of the Transatomic Power Molten Salt Reactor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Betzler, Benjamin R.; Powers, Jeffrey J.; Worrall, Andrew

    2017-01-15

    This status report presents the results from the first phase of the collaboration between Transatomic Power Corporation (TAP) and Oak Ridge National Laboratory (ORNL) to provide neutronic and fuel cycle analysis of the TAP core design through the Department of Energy Gateway for Accelerated Innovation in Nuclear, Nuclear Energy Voucher program. The TAP design is a molten salt reactor using movable moderator rods to shift the neutron spectrum in the core from mostly epithermal at beginning of life to thermal at end of life. Additional developments in the ChemTriton modeling and simulation tool provide the critical moderator-to-fuel ratio searches and time-dependent parameters necessary to simulate the continuously changing physics in this complex system. Results from simulations with these tools show agreement with TAP-calculated performance metrics for core lifetime, discharge burnup, and salt volume fraction, verifying the viability of reducing actinide waste production with this design. Additional analyses of time step sizes, mass feed rates and enrichments, and isotopic removals provide additional information to make informed design decisions. This work further demonstrates capabilities of ORNL modeling and simulation tools for analysis of molten salt reactor designs and strongly positions this effort for the upcoming three-dimensional core analysis.

  14. Imaging mass spectrometry data reduction: automated feature identification and extraction.

    PubMed

    McDonnell, Liam A; van Remoortere, Alexandra; de Velde, Nico; van Zeijl, René J M; Deelder, André M

    2010-12-01

    Imaging MS now enables the parallel analysis of hundreds of biomolecules, spanning multiple molecular classes, which allows tissues to be described by their molecular content and distribution. When combined with advanced data analysis routines, tissues can be analyzed and classified based solely on their molecular content. Such molecular histology techniques have been used to distinguish regions with differential molecular signatures that could not be distinguished using established histologic tools. However, its potential to provide an independent, complementary analysis of clinical tissues has been limited by the very large file sizes and large number of discrete variables associated with imaging MS experiments. Here we demonstrate data reduction tools, based on automated feature identification and extraction, for peptide, protein, and lipid imaging MS, using multiple imaging MS technologies, that reduce data loads and the number of variables by >100×, and that highlight highly-localized features that can be missed using standard data analysis strategies. It is then demonstrated how these capabilities enable multivariate analysis on large imaging MS datasets spanning multiple tissues. Copyright © 2010 American Society for Mass Spectrometry. Published by Elsevier Inc. All rights reserved.
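
    As a generic illustration of feature-based reduction (not the authors' pipeline): pick peaks on the dataset mean spectrum and keep only those channels for every pixel, shrinking the pixels-by-channels matrix to a few tens or hundreds of features. The array sizes and prominence threshold in the Python sketch below are arbitrary.

      # Sketch: reduce an imaging-MS data matrix (pixels x m/z channels) to picked peaks.
      import numpy as np
      from scipy.signal import find_peaks

      def reduce_to_peaks(spectra, prominence=5.0):
          """spectra: 2-D array, one row per pixel. Returns (peak_indices, reduced_matrix)."""
          mean_spectrum = spectra.mean(axis=0)
          peaks, _ = find_peaks(mean_spectrum, prominence=prominence)
          return peaks, spectra[:, peaks]

      # Toy data: 500 pixels x 10,000 channels with a few planted peaks.
      rng = np.random.default_rng(4)
      cube = rng.poisson(2.0, size=(500, 10000)).astype(float)
      cube[:, 100::250] += 50.0                    # plant some artificial peaks
      idx, reduced = reduce_to_peaks(cube)
      print("channels kept:", reduced.shape[1], "of", cube.shape[1])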

  15. Evaluation of the Effectiveness of Stormwater Decision Support Tools for Infrastructure Selection and the Barriers to Implementation

    NASA Astrophysics Data System (ADS)

    Spahr, K.; Hogue, T. S.

    2016-12-01

    Selecting the most appropriate green, gray, and/or hybrid system for stormwater treatment and conveyance can prove challenging to decision makers across all scales, from site managers to large municipalities. To help streamline the selection process, a multi-disciplinary team of academics and professionals is developing an industry standard for selecting and evaluating the most appropriate stormwater management technology for different regions. To make the tool more robust and comprehensive, life-cycle cost assessment and optimization modules will be included to evaluate non-monetized and ecosystem benefits of selected technologies. Initial work includes surveying advisory board members based in cities that use existing decision support tools in their infrastructure planning process. These surveys will qualify the decisions currently being made and identify challenges within the current planning process across a range of hydroclimatic regions and city sizes. Analysis of social and other non-technical barriers to adoption of the existing tools is also being performed, with identification of regional differences and institutional challenges. Surveys will also gauge the regional appropriateness of certain stormwater technologies based on experiences in implementing stormwater treatment and conveyance plans. In addition to compiling qualitative data on existing decision support tools, a technical review of the components of each decision support tool will be performed. Gaps in each tool's analysis, such as the lack of certain critical functionalities, will be identified, and ease of use will be evaluated. Conclusions drawn from both the qualitative and quantitative analyses will be used to inform the development of the new decision support tool and its eventual dissemination.

  16. [Application of asymmetrical flow field-flow fractionation for size characterization of low density lipoprotein in egg yolk plasma].

    PubMed

    Zhang, Wenhui; Cai, Chunxue; Wang, Jing; Mao, Zhen; Li, Yueqiu; Ding, Liang; Shen, Shigang; Dou, Haiyang

    2017-08-08

    A home-made asymmetrical flow field-flow fractionation (AF4) system, coupled online with an ultraviolet/visible (UV/Vis) detector, was employed for the separation and size characterization of low density lipoprotein (LDL) in egg yolk plasma. Under conditions close to the natural state of egg yolk, the effects of cross flow rate, sample loading, and membrane type on the size distribution of LDL were investigated. Under the optimal operating conditions, AF4-UV/Vis provides the size distribution of LDL. Moreover, the precision of the AF4-UV/Vis method proposed in this work for the analysis of LDL in egg yolk plasma was evaluated. The intra-day precisions were 1.3% and 1.9% (n=7) and the inter-day precisions were 2.4% and 2.3% (n=7) for the elution peak height and elution peak area of LDL, respectively. The results reveal that AF4-UV/Vis is a useful tool for the separation and size characterization of LDL in egg yolk plasma.
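
    A short sketch, under assumed placeholder values, of how the intra-day precision (relative standard deviation, %) of repeated peak heights and areas could be computed for n = 7 injections; the numbers are invented, not the paper's data.

    ```python
    # Sketch of a precision (RSD, %) calculation for n = 7 repeated injections.
    import numpy as np

    peak_height = np.array([0.512, 0.505, 0.498, 0.509, 0.503, 0.507, 0.511])  # a.u.
    peak_area   = np.array([12.3, 12.1, 12.5, 12.2, 12.4, 12.0, 12.3])         # a.u.*min

    def rsd_percent(x):
        """Relative standard deviation in percent (sample standard deviation)."""
        return 100.0 * np.std(x, ddof=1) / np.mean(x)

    print(f"RSD height: {rsd_percent(peak_height):.1f}%")
    print(f"RSD area:   {rsd_percent(peak_area):.1f}%")
    ```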

  17. Evaluation of the internal and external responsiveness of the Pressure Ulcer Scale for Healing (PUSH) tool for assessing acute and chronic wounds.

    PubMed

    Choi, Edmond P H; Chin, Weng Yee; Wan, Eric Y F; Lam, Cindy L K

    2016-05-01

    To examine the internal and external responsiveness of the Pressure Ulcer Scale for Healing (PUSH) tool for assessing the healing progress of acute and chronic wounds. It is important to establish the responsiveness of instruments used in conducting wound care assessments to ensure that they are able to capture changes in wound healing accurately over time. Prospective longitudinal observational study. The key study instrument was the PUSH tool. Internal responsiveness was assessed using paired t-tests and effect size statistics. External responsiveness was assessed using multiple linear regression. All new patients with at least one eligible acute or chronic wound, enrolled in the Nurse and Allied Health Clinic-Wound Care programme between 1 December 2012 and 31 March 2013, were included for analysis (N = 541). Overall, the PUSH tool was able to detect statistically significant changes in wound healing between baseline and discharge. The effect size statistics were large. The internal responsiveness of the PUSH tool was confirmed in patients with a variety of different wound types including venous ulcers, pressure ulcers, neuropathic ulcers, burns and scalds, skin tears, surgical wounds and traumatic wounds. After controlling for age, gender and wound type, subjects in the 'wound improved but not healed' group had a smaller change in PUSH scores than those in the 'wound healed' group. Subjects in the 'wound static or worsened' group had the smallest change in PUSH scores. The external responsiveness was confirmed. The internal and external responsiveness of the PUSH tool confirmed that it can be used to track the healing progress of both acute and chronic wounds. © 2016 The Authors. Journal of Advanced Nursing Published by John Wiley & Sons Ltd.
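
    The sketch below illustrates the internal-responsiveness calculation named above (a paired t-test plus an effect-size statistic) on simulated baseline and discharge PUSH scores; Cohen's d for paired data is used as one common effect-size choice and is an assumption, not necessarily the statistic the authors computed.

    ```python
    # Sketch of internal responsiveness: paired t-test and effect size on
    # simulated baseline vs. discharge PUSH scores (lower score = more healed).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    baseline  = rng.normal(12.0, 3.0, size=200)            # hypothetical PUSH scores
    discharge = baseline - rng.normal(6.0, 2.0, size=200)  # healing lowers the score

    t_stat, p_value = stats.ttest_rel(baseline, discharge)

    diff = baseline - discharge
    effect_size = diff.mean() / diff.std(ddof=1)   # Cohen's d for paired samples

    print(f"paired t = {t_stat:.2f}, p = {p_value:.3g}, d = {effect_size:.2f}")
    ```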

  18. Integrated Design Engineering Analysis (IDEA) Environment - Aerodynamics, Aerothermodynamics, and Thermal Protection System Integration Module

    NASA Technical Reports Server (NTRS)

    Kamhawi, Hilmi N.

    2011-01-01

    This report documents the work performed from March 2010 to October 2011. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative, object-oriented, multidisciplinary, distributed environment that uses the Adaptive Modeling Language (AML) as its underlying framework. This report focuses on the work done to extend the aerodynamics and aerothermodynamics modules using S/HABP, CBAERO, PREMIN and LANMIN. It also details the work done integrating EXITS as the TPS sizing tool.

  19. Repliscan: a tool for classifying replication timing regions.

    PubMed

    Zynda, Gregory J; Song, Jawon; Concia, Lorenzo; Wear, Emily E; Hanley-Bowdoin, Linda; Thompson, William F; Vaughn, Matthew W

    2017-08-07

    Replication timing experiments that use label incorporation and high throughput sequencing produce peaked data similar to ChIP-Seq experiments. However, the differences in experimental design, coverage density, and possible results make traditional ChIP-Seq analysis methods inappropriate for use with replication timing. To accurately detect and classify regions of replication across the genome, we present Repliscan. Repliscan robustly normalizes, automatically removes outlying and uninformative data points, and classifies Repli-seq signals into discrete combinations of replication signatures. The quality control steps and self-fitting methods make Repliscan generally applicable and more robust than previous methods that classify regions based on thresholds. Repliscan is simple and effective to use on organisms with different genome sizes. Even with analysis window sizes as small as 1 kilobase, reliable profiles can be generated with as little as 2.4x coverage.

  20. Treatment Effect on Recidivism for Juveniles Who Have Sexually Offended: a Multilevel Meta-Analysis.

    PubMed

    Ter Beek, Ellis; Spruit, Anouk; Kuiper, Chris H Z; van der Rijken, Rachel E A; Hendriks, Jan; Stams, Geert Jan J M

    2018-04-01

    The current study investigated the effect on recidivism of treatment aimed at juveniles who have sexually offended. It also assessed the potential moderating effects of type of recidivism and several treatment, participant and study characteristics. In total, 14 published and unpublished primary studies, making use of a comparison group and reporting official recidivism rates, were included in a multilevel meta-analysis. This resulted in 77 effect sizes and 1726 participants. A three-level meta-analytic model was used to calculate the combined effect sizes (Cohen's d) and to perform moderator analyses. Study quality was assessed with the EPHPP Quality Assessment Tool for Quantitative Studies. A moderate effect size was found (d = 0.37), indicating that the treatment groups achieved an estimated relative reduction in recidivism of 20.5% compared with the comparison groups. However, after controlling for publication bias, a significant treatment effect was no longer found. Type of recidivism did not moderate the effect of treatment, indicating that treatment was equally effective for all types of recidivism. Also, no moderating effects of participant or treatment characteristics were found. Regarding study characteristics, shorter follow-up times showed a trend toward larger effect sizes, and effect sizes calculated from proportions were larger than those calculated from the mean frequency of offending. Implications for future research and clinical practice are discussed.
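
    As a hedged illustration of the building blocks such a meta-analysis pools, the sketch below computes Cohen's d and its approximate sampling variance from two hypothetical group summaries; the three-level model itself and the study-specific effect-size calculations are not reproduced here.

    ```python
    # Sketch: standardized mean difference (Cohen's d) and its sampling variance
    # for one hypothetical study; these per-study values are what a multilevel
    # meta-analytic model would subsequently combine.
    import math

    def cohens_d(m1, sd1, n1, m2, sd2, n2):
        """Cohen's d and its approximate sampling variance for two groups."""
        s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
        d = (m1 - m2) / s_pooled
        var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
        return d, var_d

    # Hypothetical treatment vs. comparison group (mean offence frequency).
    d, var_d = cohens_d(m1=1.2, sd1=1.0, n1=60, m2=1.6, sd2=1.1, n2=55)
    print(f"d = {d:.2f}, var(d) = {var_d:.3f}")
    ```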

  1. Advanced Transportation System Studies Technical Area 2 (TA-2) Heavy Lift Launch Vehicle Development Contract. Volume 2; Technical Results

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The purpose of the Advanced Transportation System Studies (ATSS) Technical Area 2 (TA-2) Heavy Lift Launch Vehicle Development contract was to provide advanced launch vehicle concept definition and analysis to assist NASA in the identification of future launch vehicle requirements. Contracted analysis activities included vehicle sizing and performance analysis, subsystem concept definition, propulsion subsystem definition (foreign and domestic), ground operations and facilities analysis, and life cycle cost estimation. This document is Volume 2 of the final report for the contract. It provides documentation of selected technical results from various TA-2 analysis activities, including a detailed narrative description of the SSTO concept assessment results, a user's guide for the associated SSTO sizing tools, an SSTO turnaround assessment report, an executive summary of the ground operations assessments performed during the first year of the contract, a configuration-independent vehicle health management system requirements report, a copy of all major TA-2 contract presentations, a copy of the FLO launch vehicle final report, and references to Pratt & Whitney's TA-2 sponsored final reports regarding the identification of Russian main propulsion technologies.

  2. MIQuant – Semi-Automation of Infarct Size Assessment in Models of Cardiac Ischemic Injury

    PubMed Central

    Esteves, Tiago; de Pina, Maria de Fátima; Guedes, Joana G.; Freire, Ana; Quelhas, Pedro; Pinto-do-Ó, Perpétua

    2011-01-01

    Background The cardiac regenerative potential of newly developed therapies is traditionally evaluated in rodent models of surgically induced myocardial ischemia. A generally accepted key parameter for determining the success of the applied therapy is the infarct size. Although regarded as the gold standard method for infarct size estimation in heart ischemia, histological planimetry is time-consuming and highly variable amongst studies. The purpose of this work is to contribute towards the standardization and simplification of infarct size assessment by providing free access to a novel semi-automated software tool, named MIQuant. Methodology/Principal Findings Mice were subjected to permanent coronary artery ligation and the size of chronic infarcts was estimated by area and midline-length methods, using manual planimetry and MIQuant. Repeatability and reproducibility of MIQuant scores were verified. The validation showed high correlation (r = 0.981 for midline length; r = 0.970 for area) and agreement (Bland-Altman analysis), free from bias for midline length and with negligible bias of 1.21% to 3.72% for area quantification. Further analysis demonstrated that MIQuant reduced the time spent on the analysis by 4.5-fold and, importantly, that MIQuant effectiveness is independent of user proficiency. The results indicate that MIQuant can be regarded as a better alternative to manual measurement. Conclusions We conclude that MIQuant is reliable and easy-to-use software for infarct size quantification. The widespread use of MIQuant will contribute towards the standardization of infarct size assessment across studies and, therefore, to the systematization of the evaluation of the cardiac regenerative potential of emerging therapies. PMID:21980376
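
    A brief sketch, on simulated infarct-size measurements, of the agreement statistics mentioned above: Pearson correlation plus Bland-Altman bias and limits of agreement between manual planimetry and a semi-automated score.

    ```python
    # Sketch of a correlation and Bland-Altman agreement analysis between two
    # measurement methods; the infarct-size values are simulated placeholders.
    import numpy as np

    rng = np.random.default_rng(2)
    manual = rng.uniform(10, 60, size=40)                   # % infarct, manual
    software = manual + rng.normal(0.5, 2.0, size=40)       # % infarct, semi-automated

    r = np.corrcoef(manual, software)[0, 1]

    diff = software - manual
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)                           # limits of agreement

    print(f"r = {r:.3f}, bias = {bias:.2f}%, "
          f"LoA = [{bias - loa:.2f}, {bias + loa:.2f}]%")
    ```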

  3. A Summary of the NASA Design Environment for Novel Vertical Lift Vehicles (DELIVER) Project

    NASA Technical Reports Server (NTRS)

    Theodore, Colin R.

    2018-01-01

    The number of new markets and use cases being developed for vertical take-off and landing vehicles continues to grow rapidly, including the highly publicized urban air taxi and package delivery applications. There is an equally rapidly growing variety of novel vehicle configurations and sizes being proposed to serve these new markets. The challenge for vehicle designers is that there is currently no easy and consistent way to go from a compelling mission or use case to a vehicle that is best configured and sized for that particular mission, because the availability of accurate and validated conceptual design tools for these novel vehicle types and sizes has not kept pace with the new markets and vehicles themselves. The Design Environment for Novel Vertical Lift Vehicles (DELIVER) project was formulated to address this design challenge by demonstrating that current conceptual design tools, which have been used for decades to design and size conventional rotorcraft, can be applied to these novel vehicle types, configurations and sizes. In addition to demonstrating the applicability of current design and sizing tools to novel vehicle configurations and sizes, DELIVER also demonstrated the addition of the key transformational technologies of noise, autonomy, and hybrid-electric and all-electric propulsion to the vehicle conceptual design process. Noise is key for community acceptance, autonomy is key for efficient, reliable and safe operations, and electrification of the propulsion system is a key enabler for these new vehicle types and sizes. This paper provides a summary of the DELIVER project and shows the applicability of current conceptual design and sizing tools to novel vehicle configurations and sizes proposed for urban air taxi and package delivery applications.

  4. Evidence based management of polyps of the gall bladder: A systematic review of the risk factors of malignancy.

    PubMed

    Bhatt, Nikita R; Gillis, Amy; Smoothey, Craig O; Awan, Faisal N; Ridgway, Paul F

    2016-10-01

    There are no evidence-based guidelines to dictate when gallbladder polyps (GBPs) of varying sizes should be resected. The aims were to identify factors that accurately predict malignant disease in GBPs and to provide an evidence-based algorithm for their management. A systematic review following PRISMA guidelines was performed using the terms "gallbladder polyps" AND "polypoid lesion of gallbladder", from January 1993 to September 2013. Inclusion criteria required a histopathological report or follow-up of 2 years. The RTI-IB tool was used for quality analysis. The correlation between GBP size and malignant potential was analysed using Euclidean distance; a logistic mixed effects model was used to assess independent risk factors for malignancy. Fifty-three articles were included in the review. Data from 21 studies were pooled for analysis. The optimum size cut-off for resection of GBPs was 10 mm. The probability of malignancy is approximately zero at sizes <4.15 mm. Patient age >50 years, sessile polyps and single polyps were independent risk factors for malignancy. For polyps sized 4-10 mm, a risk assessment model was formulated. This review and analysis provides an evidence-based algorithm for the management of GBPs. Longitudinal studies are needed to better understand the behaviour of polyps <10 mm, which are not at high risk of malignancy but may change over time. Copyright © 2016 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.

  5. Study of mesoporous CdS-quantum-dot-sensitized TiO2 films by using X-ray photoelectron spectroscopy and AFM

    PubMed Central

    Wojcieszak, Robert; Raj, Gijo

    2014-01-01

    Summary CdS quantum dots were grown on mesoporous TiO2 films by successive ionic layer adsorption and reaction processes in order to obtain CdS particles of various sizes. AFM analysis shows that the growth of the CdS particles is a two-step process. The first step is the formation of new crystallites at each deposition cycle. In the next step the pre-deposited crystallites grow to form larger aggregates. Special attention is paid to the estimation of the CdS particle size by X-ray photoelectron spectroscopy (XPS). Alongside the classical characterization methods, the XPS model is described in detail. To validate the XPS model, the results are compared with those obtained from AFM analysis and with the evolution of the band gap energy of the CdS nanoparticles as obtained by UV–vis spectroscopy. The results show that XPS is a powerful tool for estimating the CdS particle size. In addition, a very good correlation was found between the number of deposition cycles and the particle size. PMID:24605274

  6. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach

    PubMed Central

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-01-01

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance from different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools of various sizes but similar weld quality. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared the fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design. PMID:28773800

  7. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach.

    PubMed

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-08-09

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance from different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools of various sizes but similar weld quality. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared the fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design.

  8. Depth reversals in stereoscopic displays driven by apparent size

    NASA Astrophysics Data System (ADS)

    Sacher, Gunnar; Hayes, Amy; Thornton, Ian M.; Sereno, Margaret E.; Malony, Allen D.

    1998-04-01

    In visual scenes, depth information is derived from a variety of monocular and binocular cues. When in conflict, a monocular cue is sometimes able to override the binocular information. We examined the accuracy of relative depth judgments in orthographic, stereoscopic displays and found that perceived relative size can override binocular disparity as a depth cue in a situation where the relative size information is itself generated from disparity information, not from retinal size difference. A size discrimination task confirmed the assumption that disparity information was perceived and used to generate apparent size differences. The tendency for the apparent size cue to override disparity information can be modulated by varying the strength of the apparent size cue. In addition, an analysis of reaction times provides supporting evidence for this novel depth reversal effect. We believe that human perception must be regarded as an important component of stereoscopic applications. Hence, if applications are to be effective and accurate, it is necessary to take into account the richness and complexity of the human visual perceptual system that interacts with them. We discuss implications of this and similar research for human performance in virtual environments, the design of visual presentations for virtual worlds, and the design of visualization tools.

  9. Engine System Model Development for Nuclear Thermal Propulsion

    NASA Technical Reports Server (NTRS)

    Nelson, Karl W.; Simpson, Steven P.

    2006-01-01

    In order to design, analyze, and evaluate conceptual Nuclear Thermal Propulsion (NTP) engine systems, an improved NTP design and analysis tool has been developed. The NTP tool utilizes the Rocket Engine Transient Simulation (ROCETS) system tool and many of the routines from the Enabler reactor model found in Nuclear Engine System Simulation (NESS). Improved non-nuclear component models and an external shield model were added to the tool. With the addition of a nearly complete system reliability model, the tool will provide performance, sizing, and reliability data for NERVA-Derived NTP engine systems. A new detailed reactor model is also being developed and will replace Enabler. The new model will allow more flexibility in reactor geometry and include detailed thermal hydraulics and neutronics models. A description of the reactor, component, and reliability models is provided. Another key feature of the modeling process is the use of comprehensive spreadsheets for each engine case. The spreadsheets include individual worksheets for each subsystem with data, plots, and scaled figures, making the output very useful to each engineering discipline. Sample performance and sizing results with the Enabler reactor model are provided including sensitivities. Before selecting an engine design, all figures of merit must be considered including the overall impacts on the vehicle and mission. Evaluations based on key figures of merit of these results and results with the new reactor model will be performed. The impacts of clustering and external shielding will also be addressed. Over time, the reactor model will be upgraded to design and analyze other NTP concepts with CERMET and carbide fuel cores.

  10. Developing robust recurrence plot analysis techniques for investigating infant respiratory patterns.

    PubMed

    Terrill, Philip I; Wilson, Stephen; Suresh, Sadasivam; Cooper, David M

    2007-01-01

    Recurrence plot analysis is a useful non-linear analysis tool. There are still no well-formalised procedures for carrying out this analysis on measured physiological data, and systematising the analysis is often difficult. In this paper, recurrence-based embedding is compared with radius-based embedding by studying a logistic attractor and measured breathing data collected from sleeping human infants. Recurrence-based embedding appears to be a more robust method of carrying out a recurrence analysis when attractor size is likely to differ between datasets. In the infant breathing data, the radius measure calculated at a fixed recurrence, scaled by the average respiratory period, allows accurate discrimination of active sleep from quiet sleep states (AUC=0.975, Sn=0.98, Sp=0.94).
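
    The sketch below illustrates, on a logistic-map series, one way to implement recurrence-based embedding as described above: the threshold radius is chosen to yield a fixed recurrence rate rather than being fixed in advance, so the measure remains comparable when attractor size differs between datasets. The delay, embedding dimension, and target recurrence rate are illustrative assumptions, not the paper's settings.

    ```python
    # Sketch of a recurrence analysis with a radius chosen to give a fixed
    # recurrence rate (5% of point pairs), demonstrated on a logistic map.
    import numpy as np

    def delay_embed(x, dim=3, tau=5):
        """Time-delay embedding of a 1-D series into dim-dimensional vectors."""
        n = len(x) - (dim - 1) * tau
        return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

    def radius_at_recurrence(x, dim=3, tau=5, recurrence=0.05):
        """Radius giving the requested fraction of recurrent point pairs."""
        emb = delay_embed(x, dim, tau)
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        dists = d[np.triu_indices_from(d, k=1)]
        return np.quantile(dists, recurrence)

    # Example series: a logistic map, as used above to compare embeddings.
    x = np.empty(600)
    x[0] = 0.4
    for i in range(1, len(x)):
        x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])

    print(f"radius at 5% recurrence: {radius_at_recurrence(x):.4f}")
    ```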

  11. Improvement of the tool life of a micro-end mill using nano-sized SiC/Ni electroplating method.

    PubMed

    Park, Shinyoung; Kim, Kwang-Su; Roh, Ji Young; Jang, Gyu-Beom; Ahn, Sung-Hoon; Lee, Caroline Sunyong

    2012-04-01

    The tool life and mechanical properties of a tungsten carbide micro-end-mill were improved by electroplating nano-sized SiC particles (<100 nm), whose hardness is similar to that of diamond, into a nickel-based matrix. A co-electroplating method using SiC and Ni particles was applied to the surface of the micro-end-mill. Organic additives (saccharin and ammonium chloride) were added to a Watts bath to increase the density of the nickel matrix and to smooth the co-electroplated surface. The morphology and composition of the coated nano-sized SiC particles were measured using scanning electron microscopy and energy-dispersive spectrometry. With the Ni/SiC co-electroplated layer applied, the hardness and friction coefficient improved by 50%. Nano-sized SiC particles at 7 wt% were deposited on the surface of the micro-end mill, while the Ni matrix was smoothed by the organic additives. The tool life of the micro-end mill with the Ni/SiC co-electroplated coating was at least 25% longer than that of existing micro-end mills without the coating. Thus, nano-sized SiC/Ni coating by electroplating significantly improves the mechanical properties of tungsten carbide micro-end mills.

  12. NASA Tech Briefs, December 2005

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Topics covered include: Video Mosaicking for Inspection of Gas Pipelines; Shuttle-Data-Tape XML Translator; Highly Reliable, High-Speed, Unidirectional Serial Data Links; Data-Analysis System for Entry, Descent, and Landing; Hybrid UV Imager Containing Face-Up AlGaN/GaN Photodiodes; Multiple Embedded Processors for Fault-Tolerant Computing; Hybrid Power Management; Magnetometer Based on Optoelectronic Microwave Oscillator; Program Predicts Time Courses of Human/ Computer Interactions; Chimera Grid Tools; Astronomer's Proposal Tool; Conservative Patch Algorithm and Mesh Sequencing for PAB3D; Fitting Nonlinear Curves by Use of Optimization Techniques; Tool for Viewing Faults Under Terrain; Automated Synthesis of Long Communication Delays for Testing; Solving Nonlinear Euler Equations With Arbitrary Accuracy; Self-Organizing-Map Program for Analyzing Multivariate Data; Tool for Sizing Analysis of the Advanced Life Support System; Control Software for a High-Performance Telerobot; Java Radar Analysis Tool; Architecture for Verifiable Software; Tool for Ranking Research Options; Enhanced, Partially Redundant Emergency Notification System; Close-Call Action Log Form; Task Description Language; Improved Small-Particle Powders for Plasma Spraying; Bonding-Compatible Corrosion Inhibitor for Rinsing Metals; Wipes, Coatings, and Patches for Detecting Hydrazines; Rotating Vessels for Growing Protein Crystals; Oscillating-Linear-Drive Vacuum Compressor for CO2; Mechanically Biased, Hinged Pairs of Piezoelectric Benders; Apparatus for Precise Indium-Bump Bonding of Microchips; Radiation Dosimetry via Automated Fluorescence Microscopy; Multistage Magnetic Separator of Cells and Proteins; Elastic-Tether Suits for Artificial Gravity and Exercise; Multichannel Brain-Signal-Amplifying and Digitizing System; Ester-Based Electrolytes for Low-Temperature Li-Ion Cells; Hygrometer for Detecting Water in Partially Enclosed Volumes; Radio-Frequency Plasma Cleaning of a Penning Malmberg Trap; Reduction of Flap Side Edge Noise - the Blowing Flap; and Preventing Accidental Ignition of Upper-Stage Rocket Motors.

  13. The theory of interface slicing

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    Interface slicing is a new tool which was developed to facilitate reuse-based software engineering, by addressing the following problems, needs, and issues: (1) size of systems incorporating reused modules; (2) knowledge requirements for program modification; (3) program understanding for reverse engineering; (4) module granularity and domain management; and (5) time and space complexity of conventional slicing. The definition of a form of static program analysis called interface slicing is addressed.

  14. Detection of genomic rearrangements in cucumber using genomecmp software

    NASA Astrophysics Data System (ADS)

    Kulawik, Maciej; Pawełkowicz, Magdalena Ewa; Wojcieszek, Michał; Pląder, Wojciech; Nowak, Robert M.

    2017-08-01

    Comparative genomics, driven by the growing amount of genome sequence information available in databases, is a rapidly evolving science. A simple comparison of general genome features such as genome size, number of genes, and chromosome number presents an entry point into comparative genomic analysis. Here we present the utility of the new tool genomecmp for finding rearrangements across compared sequences and its applications in plant comparative genomics.

  15. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis.

    PubMed

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-12-13

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample-size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performance across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast consistently underperformed across all sub-datasets of the different benchmark data. Moreover, an interactive web tool for comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study could serve as useful guidance for the selection of suitable normalization methods in analyzing LC/MS based metabolomics data.
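
    As an illustration of one of the best-performing methods named above, the sketch below implements probabilistic quotient normalization (PQN) on a simulated samples-by-features intensity matrix; it is a generic textbook implementation, not code from the MetaPre server.

    ```python
    # Sketch of probabilistic quotient normalization (PQN): each sample is
    # divided by the median quotient between its features and a reference
    # (median) spectrum. The intensity matrix is simulated, not benchmark data.
    import numpy as np

    rng = np.random.default_rng(3)
    intensities = rng.lognormal(mean=2.0, sigma=0.5, size=(20, 300))  # samples x features
    dilution = rng.uniform(0.5, 2.0, size=(20, 1))                    # simulated dilution
    data = intensities * dilution

    reference = np.median(data, axis=0)                   # reference spectrum
    quotients = data / reference                          # feature-wise quotients
    factors = np.median(quotients, axis=1, keepdims=True) # per-sample dilution estimate
    normalized = data / factors                           # PQN-normalized matrix

    print("estimated vs. simulated dilution (first 5 samples):")
    print(np.round(np.c_[factors[:5], dilution[:5]], 2))
    ```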

  16. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis

    PubMed Central

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-01-01

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample-size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performance across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast consistently underperformed across all sub-datasets of the different benchmark data. Moreover, an interactive web tool for comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study could serve as useful guidance for the selection of suitable normalization methods in analyzing LC/MS based metabolomics data. PMID:27958387

  17. Diagnostic concordance between mobile interfaces and conventional workstations for emergency imaging assessment.

    PubMed

    Venson, José Eduardo; Bevilacqua, Fernando; Berni, Jean; Onuki, Fabio; Maciel, Anderson

    2018-05-01

    Mobile devices and software are now available with sufficient computing power, speed and complexity to allow for real-time interpretation of radiology exams. In this paper, we perform a multivariable user study that investigates the concordance of image-based diagnoses provided using mobile devices on the one hand and conventional workstations on the other. We performed a between-subjects task analysis using CT, MRI and radiography datasets. Moreover, we investigated the adequacy of the screen size, image quality, usability and the availability of the tools necessary for the analysis. Radiologists, members of several teams, participated in the experiment under real work conditions. A total of 64 studies with 93 main diagnoses were analyzed. Our results showed that 56 cases were classified with complete concordance (87.69%), 5 cases with almost complete concordance (7.69%) and 1 case (1.56%) with partial concordance. Only 2 studies presented discordance between the reports (3.07%). The main reason for those disagreements was the lack of a multiplanar reconstruction tool in the mobile viewer. Screen size and image quality had no direct impact on the mobile diagnosis process. We concluded that, for images from emergency modalities, a mobile interface provides accurate interpretation and swift response, which could benefit patients' healthcare. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. The choice of prior distribution for a covariance matrix in multivariate meta-analysis: a simulation study.

    PubMed

    Hurtado Rúa, Sandra M; Mazumdar, Madhu; Strawderman, Robert L

    2015-12-30

    Bayesian meta-analysis is an increasingly important component of clinical research, with multivariate meta-analysis a promising tool for studies with multiple endpoints. Model assumptions, including the choice of priors, are crucial aspects of multivariate Bayesian meta-analysis (MBMA) models. In a given model, two different prior distributions can lead to different inferences about a particular parameter. A simulation study was performed in which the impact of families of prior distributions for the covariance matrix of a multivariate normal random effects MBMA model was analyzed. Inferences about effect sizes were not particularly sensitive to prior choice, but the related covariance estimates were. A few families of prior distributions with small relative biases, tight mean squared errors, and close to nominal coverage for the effect size estimates were identified. Our results demonstrate the need for sensitivity analysis and suggest some guidelines for choosing prior distributions in this class of problems. The MBMA models proposed here are illustrated in a small meta-analysis example from the periodontal field and a medium meta-analysis from the study of stroke. Copyright © 2015 John Wiley & Sons, Ltd.
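
    A small prior-predictive sketch of the kind of sensitivity question studied above: covariance matrices are drawn from two different inverse-Wishart priors and the implied priors on the between-endpoint correlation are compared. The specific prior settings are assumptions chosen for illustration, not the families evaluated in the simulation study.

    ```python
    # Sketch of a prior sensitivity check: compare the correlation implied by
    # two inverse-Wishart priors on a 2x2 random-effects covariance matrix.
    import numpy as np
    from scipy.stats import invwishart

    def implied_correlations(df, scale, n=5000, seed=0):
        """Sample covariance matrices and return the implied correlations."""
        draws = invwishart(df=df, scale=scale).rvs(size=n, random_state=seed)
        return draws[:, 0, 1] / np.sqrt(draws[:, 0, 0] * draws[:, 1, 1])

    rho_vague = implied_correlations(df=3,  scale=np.eye(2))       # diffuse prior
    rho_tight = implied_correlations(df=20, scale=19 * np.eye(2))  # informative prior

    print(f"vague prior: correlation IQR {np.percentile(rho_vague, [25, 75]).round(2)}")
    print(f"tight prior: correlation IQR {np.percentile(rho_tight, [25, 75]).round(2)}")
    ```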

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suresh, Niraj; Stephens, Sean A.; Adams, Lexor

    Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications for climate change and forest management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving the plant. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. Our group at the Environmental Molecular Sciences Laboratory (EMSL) has developed an XCT-based tool to image and quantitatively analyze plant root structures in their native soil environment. XCT data collected on a Prairie dropseed (Sporobolus heterolepis) specimen was used to visualize its root structure. A combination of the open-source software RooTrak and DDV was employed to segment the root from the soil and to calculate its isosurface, respectively. Our own computer script, named 3DRoot-SV, was developed and used to calculate root volume and surface area from a triangular mesh. The process, utilizing a unique combination of tools from imaging to quantitative root analysis, including the 3DRoot-SV computer script, is described.
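
    A minimal sketch of the kind of mesh calculation attributed to the 3DRoot-SV script: surface area from triangle cross products and enclosed volume via signed tetrahedra (divergence theorem), demonstrated on a unit-cube mesh rather than a root isosurface. This is not the EMSL code.

    ```python
    # Sketch: surface area and enclosed volume of a closed, consistently
    # oriented triangular mesh (here a unit cube made of 12 triangles).
    import numpy as np

    vertices = np.array([[0,0,0],[1,0,0],[1,1,0],[0,1,0],
                         [0,0,1],[1,0,1],[1,1,1],[0,1,1]], dtype=float)
    faces = np.array([[0,2,1],[0,3,2],[4,5,6],[4,6,7],   # bottom, top
                      [0,1,5],[0,5,4],[1,2,6],[1,6,5],   # sides
                      [2,3,7],[2,7,6],[3,0,4],[3,4,7]])

    a, b, c = vertices[faces[:, 0]], vertices[faces[:, 1]], vertices[faces[:, 2]]

    cross = np.cross(b - a, c - a)
    surface_area = 0.5 * np.linalg.norm(cross, axis=1).sum()

    # Signed tetrahedron volumes against the origin; their sum gives the
    # enclosed volume for a watertight, consistently oriented mesh.
    volume = np.abs(np.einsum("ij,ij->i", a, np.cross(b, c)).sum()) / 6.0

    print(f"surface area = {surface_area:.3f}, volume = {volume:.3f}")
    ```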

  20. An Experimental Study of Cutting Performances of Worn Picks

    NASA Astrophysics Data System (ADS)

    Dogruoz, Cihan; Bolukbasi, Naci; Rostami, Jamal; Acar, Cemil

    2016-01-01

    The best means of assessing rock cuttability and the efficiency of the cutting process in mechanical excavation is specific energy (SE), measured in full-scale rock cutting tests. This is especially true for the application of roadheaders, which are often fitted with drag-type cutting tools. Radial picks or drag bits are changed during operation as they reach a certain amount of wear and become blunt. In this study, full-scale cutting tests in different sedimentary rock types with bits having various degrees of wear were used to evaluate the influence of bit wear on cutting forces and specific energy. The relationship between the amount of wear, as represented by the size of the wear flats at the tip of the bit, and cutting forces as well as specific energy was examined. The influence of various rock properties, such as mineral content, uniaxial compressive strength, tensile strength, indentation index, Shore hardness, Schmidt hammer hardness, and density, on the SE required for cutting at different levels of tool wear was also studied. Preliminary analysis of the data shows that the mean cutting forces increase 2-3 times and SE 4-5 times when cutting with a 4 mm wear flat compared to cutting with new or sharp wedge-shaped bits. The grain size distribution of the muck produced when cutting different rock types at different levels of bit wear was analyzed and discussed. Best-fit prediction models for SE, based on statistical analysis of the laboratory test results, are introduced. The models can be used for estimating the performance of mechanical excavators using radial tools, especially roadheaders, continuous miners and longwall drum shearers.
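
    A short sketch of the specific-energy bookkeeping behind the comparison above: SE is the cutting work divided by the volume of rock removed in a full-scale cut. All force, length, and density values below are invented placeholders.

    ```python
    # Sketch of a specific-energy (SE) calculation for one full-scale cut.
    mean_cutting_force_kN = 8.5      # average force on the pick (assumed)
    cutting_length_m = 1.0           # length of the cut (assumed)
    muck_mass_kg = 14.0              # rock removed in the cut (assumed)
    rock_density_kg_m3 = 2600.0      # assumed rock density

    volume_m3 = muck_mass_kg / rock_density_kg_m3
    work_kJ = mean_cutting_force_kN * cutting_length_m          # kN*m = kJ
    specific_energy_MJ_m3 = (work_kJ / 1000.0) / volume_m3      # MJ per m^3

    print(f"SE = {specific_energy_MJ_m3:.2f} MJ/m^3")
    ```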

  1. Assessing the effect of elevated carbon dioxide on soil carbon: a comparison of four meta-analyses.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hungate, B. A.; van Groenigen, K.; Six, J.

    2009-08-01

    Soil is the largest reservoir of organic carbon (C) in the terrestrial biosphere and soil C has a relatively long mean residence time. Rising atmospheric carbon dioxide (CO2) concentrations generally increase plant growth and C input to soil, suggesting that soil might help mitigate atmospheric CO2 rise and global warming. But to what extent mitigation will occur is unclear. The large size of the soil C pool not only makes it a potential buffer against rising atmospheric CO2, but also makes it difficult to measure changes amid the existing background. Meta-analysis is one tool that can overcome the limited power of single studies. Four recent meta-analyses addressed this issue but reached somewhat different conclusions about the effect of elevated CO2 on soil C accumulation, especially regarding the role of nitrogen (N) inputs. Here, we assess the extent of differences between these conclusions and propose a new analysis of the data. The four meta-analyses included different studies, derived different effect size estimates from common studies, used different weighting functions and metrics of effect size, and used different approaches to address nonindependence of effect sizes. Although all factors influenced the mean effect size estimates and subsequent inferences, the approach to independence had the largest influence. We recommend that meta-analysts critically assess and report choices about effect size metrics and weighting functions, and criteria for study selection and independence. Such decisions need to be justified carefully because they affect the basis for inference. Our new analysis, with a combined data set, confirms that the effect of elevated CO2 on net soil C accumulation increases with the addition of N fertilizers. Although the effect at low N inputs was not significant, statistical power to detect biogeochemically important effect sizes at low N is limited, even with meta-analysis, suggesting the continued need for long-term experiments.

  2. Integration of a Portfolio-based Approach to Evaluate Aerospace R and D Problem Formulation Into a Parametric Synthesis Tool

    NASA Astrophysics Data System (ADS)

    Oza, Amit R.

    The focus of this study is to improve R&D effectiveness in aerospace and defense planning during the early stages of the product development lifecycle. Emphasis is on: correct formulation of a decision problem, with special attention to the data relationships between the individual design problem and the system capability required to size the aircraft; understanding of the objective and subjective acquisition-strategy data requirements needed to arrive at a balanced analysis and/or "correct" mix of technology projects; understanding of the outputs that can be created from the technology analysis; and methods the researcher can use to effectively support decisions at the acquisition and conceptual design levels through a research and development portfolio strategy. The primary objectives of this study are to: (1) determine what strategy should be used to initialize conceptual design parametric sizing processes during requirements analysis for the materiel solution analysis stage of the product development lifecycle, when utilizing data already constructed in the latter phase, while working with a generic database-management-system synthesis tool integration architecture for aircraft design; and (2) assess how these new data relationships can contribute to innovative decision-making when solving acquisition hardware/technology portfolio problems. As such, an automated composable problem formulation system is developed to consider data interactions for the system architecture that manages acquisition pre-design concept refinement portfolio management and conceptual design parametric sizing requirements. The research includes a way to:

    • Formalize the data storage and implement the data relationship structure with a system architecture automated through a database management system.
    • Allow for composable modeling, in terms of level of hardware abstraction, for the product model, mission model, and operational constraint model data blocks in the pre-design stages.
    • Allow the product model, mission model, and operational constraint model to be cross-referenced with a generic aircraft synthesis capability to identify disciplinary analysis methods and processes.
    • Allow for matching, comparison, and balancing of the aircraft hardware portfolio against the associated developmental and technology risk metrics.
    • Allow for visualization of the technology portfolio decision space.

    The problem formulation architecture is finally implemented and verified for a generic hypersonic vehicle research demonstrator, where a portfolio of hardware technologies is measured for developmental and technology risks, prioritized by the researcher's risk constraints, and the generated data are delivered to a novel aircraft synthesis tool to confirm vehicle feasibility.

  3. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high-fidelity, geometry-based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, the outer mold line, and the detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., which facilitated rapid finite element analysis, sizing studies and weight optimization. The high-quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications to the structural design of a conventional aircraft and of a high-altitude long-endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for the design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high-fidelity analysis codes could also be applied to the design of vehicles for the NASA Exploration and Space Science Mission projects.

  4. Interactive Visualization to Advance Earthquake Simulation

    NASA Astrophysics Data System (ADS)

    Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn

    2008-04-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, to evaluate the underlying models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth’s surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method’s strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations.

  5. Alssat Development Status and Its Applications in Trade Studies

    NASA Technical Reports Server (NTRS)

    Yeh, H. Y. (Jannivine); Brown, Cheryl B.; Jeng, Frank F.; Lin, Chin H.; Ewert, Michael K.

    2004-01-01

    The development of the Advanced Life Support (ALS) Sizing Analysis Tool (ALSSAT) using Microsoft® Excel was initiated by the Crew and Thermal Systems Division (CTSD) of Johnson Space Center (JSC) in 1997 to support the ALS and Exploration Offices in Environmental Control and Life Support System (ECLSS) design and studies. It aids the user in performing detailed sizing of the ECLSS based on suggested default values or user inputs for different combinations of the ALS regenerative system technologies (Ref. 1, 2). This analysis tool will assist the user in performing ECLSS preliminary design and trade studies as well as system optimization efficiently and economically. Since ALSSAT's latest publication in ICES 2001 (Ref. 1) describing the development of ALSSAT with its Air Revitalization Subsystem (ARS), Water Management Subsystem (WMS), and Biomass Subsystem (Biomass) mass balance sheets, ALSSAT has been expanded to include mass balance and sizing models for the remaining three ALS subsystems, namely, the Solid Waste Management Subsystem (SWMS), the Food Management Subsystem (FMS), and the Thermal Control Subsystem (TCS). The external interfaces, including the Extravehicular Activities (EVA) and Human Accommodations (HA), were implemented into ALSSAT in 2002. The overall mass balance sheet, which integrates the six ALS subsystems and the external interfaces applicable to the ECLSS, was also developed. In 2003, ALSSAT was upgraded to include the consideration of redundancy and contingency options in the ECLSS, as well as more ALS regenerative technology selections. ALSSAT has been used for the Metric Calculation for FY02 and FY03 (Ref. 3). Several trade studies were conducted in 2003. The analytical results will be presented in this paper.

  6. Habitability Designs for Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Woolford, Barbara

    2006-01-01

    NASA's space human factors team is contributing to the habitability of the Crew Exploration Vehicle (CEV), which will take crews to low Earth orbit, and dock there with additional vehicles to go on to the moon's surface. They developed a task analysis for operations and for self-sustenance (sleeping, eating, hygiene), and estimated the volumes required for performing the various tasks and for the associated equipment, tools and supplies. Rough volumetric mockups were built for crew evaluations. Trade studies were performed to determine the size and location of windows. The habitability analysis also contributes to developing concepts of operations by identifying constraints on crew time. Recently completed studies provided stowage concepts, tools for assessing lighting constraints, and approaches to medical procedure development compatible with the tight space and absence of gravity. New work will be initiated to analyze design concepts and verify that equipment and layouts do meet requirements.

  7. Effects of research tool patents on biotechnology innovation in a developing country: A case study of South Korea

    PubMed Central

    Kang, Kyung-Nam; Ryu, Tae-Kyu; Lee, Yoon-Sik

    2009-01-01

    Background Concerns have recently been raised about the negative effects of patents on innovation. In this study, the effects of patents on innovations in the Korean biotech SMEs (small and medium-sized entrepreneurs) were examined using survey data and statistical analysis. Results The survey results of this study provided some evidence that restricted access problems have occurred even though their frequency was not high. Statistical analysis revealed that difficulties in accessing patented research tools were not negatively correlated with the level of innovation performance and attitudes toward the patent system. Conclusion On the basis of the results of this investigation in combination with those of previous studies, we concluded that although restricted access problems have occurred, this has not yet deterred innovation in Korea. However, potential problems do exist, and the effects of restricted access should be constantly scrutinized. PMID:19321013

  8. Analysis of In-Route Wireless Charging for the Shuttle System at Zion National Park

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meintz, Andrew; Prohaska, Robert; Konan, Arnaud

    System right-sizing is critical to implementation of wireless power transfer (WPT) for electric vehicles. This study will analyze potential WPT scenarios for the electrification of shuttle buses at Zion National Park utilizing a modelling tool developed by the National Renewable Energy Laboratory called WPTSim. This tool uses second-by-second speed, location, and road grade data from the conventional shuttles in operation to simulate the incorporation of WPT at fine granularity. Vehicle power and state of charge are simulated over the drive cycle to evaluate potential system designs. The required battery capacity is determined based on the rated power at a variable number of charging locations. The outcome of this work is an analysis of the design tradeoffs for the electrification of the shuttle fleet with wireless charging versus conventional overnight charging.
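
    The sketch below illustrates, with assumed parameter values, the kind of second-by-second state-of-charge bookkeeping a tool like WPTSim performs over a drive cycle with in-route charging stops; it is not the NREL code.

    ```python
    # Sketch of second-by-second battery state-of-charge (SOC) integration over
    # a drive cycle with wireless charging at designated stops. All values are
    # assumptions for illustration.
    battery_capacity_kWh = 60.0
    charger_power_kW = 50.0
    soc_kWh = battery_capacity_kWh

    # Hypothetical drive cycle: (traction power demand in kW, at a charging stop?)
    drive_cycle = [(25.0, False)] * 600 + [(0.0, True)] * 60 + [(30.0, False)] * 600

    min_soc_kWh = soc_kWh
    for power_kW, at_stop in drive_cycle:                  # one entry per second
        soc_kWh -= power_kW / 3600.0                       # discharge while driving
        if at_stop:
            soc_kWh = min(soc_kWh + charger_power_kW / 3600.0, battery_capacity_kWh)
        min_soc_kWh = min(min_soc_kWh, soc_kWh)

    print(f"final SOC: {soc_kWh:.2f} kWh, minimum SOC: {min_soc_kWh:.2f} kWh")
    ```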

  9. Effects of research tool patents on biotechnology innovation in a developing country: a case study of South Korea.

    PubMed

    Kang, Kyung-Nam; Ryu, Tae-Kyu; Lee, Yoon-Sik

    2009-03-26

    Concerns have recently been raised about the negative effects of patents on innovation. In this study, the effects of patents on innovations in the Korean biotech SMEs (small and medium-sized entrepreneurs) were examined using survey data and statistical analysis. The survey results of this study provided some evidence that restricted access problems have occurred even though their frequency was not high. Statistical analysis revealed that difficulties in accessing patented research tools were not negatively correlated with the level of innovation performance and attitudes toward the patent system. On the basis of the results of this investigation in combination with those of previous studies, we concluded that although restricted access problems have occurred, this has not yet deterred innovation in Korea. However, potential problems do exist, and the effects of restricted access should be constantly scrutinized.

  10. Validation of the World Health Organization tool for situational analysis to assess emergency and essential surgical care at district hospitals in Ghana.

    PubMed

    Osen, Hayley; Chang, David; Choo, Shelly; Perry, Henry; Hesse, Afua; Abantanga, Francis; McCord, Colin; Chrouser, Kristin; Abdullah, Fizan

    2011-03-01

    The World Health Organization (WHO) Tool for Situational Analysis to Assess Emergency and Essential Surgical Care (hereafter called the WHO Tool) has been used in more than 25 countries and is the largest effort to assess surgical care in the world. However, it has not yet been independently validated. Test-retest reliability is one way to validate the degree to which test instruments are free from random error. The aim of the present field study was to determine the test-retest reliability of the WHO Tool. The WHO Tool was mailed to 10 district hospitals in Ghana. Written instructions were provided along with a letter from the Ghana Health Services requesting the hospital administrator to complete the survey tool. After ensuring delivery and completion of the forms, the study team readministered the WHO Tool during an on-site visit less than 1 month later. The results of the two administrations were compared to calculate kappa statistics for each of the 152 questions in the WHO Tool. The kappa statistic is a statistical measure of the degree of agreement above what would be expected based on chance alone. Ten hospitals were surveyed twice over a short interval (i.e., less than 1 month). Weighted and unweighted kappa statistics were calculated for the 152 questions. The median unweighted kappa for the entire survey was 0.43 (interquartile range 0-0.84). The infrastructure section (24 questions) had a median kappa of 0.81; the human resources section (13 questions) had a median kappa of 0.77; the surgical procedures section (67 questions) had a median kappa of 0.00; and the emergency surgical equipment section (48 questions) had a median kappa of 0.81. Hospital capacity survey questions related to infrastructure characteristics had high reliability. However, questions related to process of care had poor reliability and may benefit from supplemental data gathered by direct observation. Limitations of the study include the small sample size: 10 district hospitals in a single country. The consistent and high correlations calculated from the field testing in the present analysis suggest that the WHO Tool for Situational Analysis is a reliable tool where it measures structure and setting, but it should be revised for measuring process of care.
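
    A minimal sketch of the unweighted kappa statistic used above to quantify test-retest agreement for a single survey item between the mailed and on-site administrations; the yes/no answers are invented for illustration.

    ```python
    # Sketch of Cohen's unweighted kappa for one survey item answered twice.
    from collections import Counter

    first  = ["yes", "yes", "no", "yes", "no", "no", "yes", "yes", "no", "yes"]
    second = ["yes", "no",  "no", "yes", "no", "yes", "yes", "yes", "no", "yes"]

    def cohens_kappa(a, b):
        """Agreement above chance between two sets of categorical ratings."""
        n = len(a)
        categories = sorted(set(a) | set(b))
        observed = sum(x == y for x, y in zip(a, b)) / n
        freq_a, freq_b = Counter(a), Counter(b)
        expected = sum(freq_a[c] * freq_b[c] for c in categories) / n**2
        return (observed - expected) / (1.0 - expected)

    print(f"kappa = {cohens_kappa(first, second):.2f}")
    ```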

  11. Inverse Statistics and Asset Allocation Efficiency

    NASA Astrophysics Data System (ADS)

    Bolgorian, Meysam

    In this paper, the effect of the investment horizon on the efficiency of portfolio selection is examined using inverse statistics analysis. Inverse statistics analysis, also known as the probability distribution of exit times, is a general tool for determining the distribution of the time at which a stochastic process first exits a given zone. This analysis was used in Refs. 1 and 2 for studying financial return time series. The distribution provides an optimal investment horizon, which determines the most likely horizon for gaining a specific return. Using samples of stocks from the Tehran Stock Exchange (TSE), an emerging market, and the S&P 500, a developed market, the effect of the optimal investment horizon on asset allocation is assessed. It is found that taking the optimal investment horizon into account in the TSE leads to greater efficiency for large portfolios, while for stocks selected from the S&P 500, regardless of portfolio size, this strategy not only fails to produce more efficient portfolios, but longer investment horizons actually provide greater efficiency.
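
    A brief sketch of the exit-time (inverse statistics) calculation described above, run on a synthetic random-walk return series: for a target return level, the first time the cumulative log-return leaves the band is recorded, and the mode of the resulting distribution plays the role of the optimal investment horizon. The return parameters and the ±5% band are assumptions.

    ```python
    # Sketch of an inverse-statistics (exit-time) distribution for a +/-5% band,
    # using a synthetic random-walk log-return series in place of market data.
    import numpy as np

    rng = np.random.default_rng(4)
    log_returns = rng.normal(0.0005, 0.02, size=100_000)    # synthetic daily returns
    rho = 0.05                                              # +/-5% target return

    exit_times = []
    start = 0
    while start < len(log_returns):
        cum = np.cumsum(log_returns[start:start + 2000])    # cap the search window
        hits = np.nonzero(np.abs(cum) >= rho)[0]
        if hits.size == 0:
            break
        exit_times.append(hits[0] + 1)                      # waiting time in days
        start += hits[0] + 1                                # continue after the exit

    counts = np.bincount(exit_times)
    print(f"most probable exit time: {counts.argmax()} days "
          f"(from {len(exit_times)} exits)")
    ```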

  12. Scanning electron microscopical and cross-sectional analysis of extraterrestrial carbonaceous nanoglobules

    NASA Astrophysics Data System (ADS)

    Garvie, Laurence A. J.; Baumgardner, Grant; Buseck, Peter R.

    2008-05-01

    Carbonaceous nanoglobules are ubiquitous in carbonaceous chondrite (CC) meteorites. The Tagish Lake (C2) meteorite is particularly intriguing in containing an abundance of nanoglobules, with a wider range of forms and sizes than encountered in other CC meteorites. Previous studies by transmission electron microscopy (TEM) have provided a wealth of information on chemistry and structure. In this study low voltage scanning electron microscopy (SEM) was used to characterize the globule forms and external structures. The internal structure of the globules was investigated after sectioning by focused ion beam (FIB) milling. The FIB-SEM analysis shows that the globules range from solid to hollow. Some hollow globules show a central open core, with adjoining smaller cores. The FIB with an SEM is a valuable tool for the analysis of extraterrestrial materials, even of sub-micron-sized "soft" carbonaceous particles. The rapid site-specific cross-sectioning capabilities of the FIB allow the preservation of the internal morphology of the nanoglobules, with minimal damage or alteration of the unsectioned areas.

  13. Challenges in Resolution for IC Failure Analysis

    NASA Astrophysics Data System (ADS)

    Martinez, Nick

    1999-10-01

    Resolution is becoming more and more of a challenge in the world of Failure Analysis in integrated circuits. This is a result of the ongoing size reduction in microelectronics. Determining the cause of a failure depends upon being able to find the responsible defect. The time it takes to locate a given defect is extremely important so that proper corrective actions can be taken. The limits of current microscopy tools are being pushed. With sub-micron feature sizes and even smaller killing defects, optical microscopes are becoming obsolete. With scanning electron microscopy (SEM), the resolution is high but the voltage involved can make these small defects transparent due to the large mean-free path of incident electrons. In this presentation, I will give an overview of the use of inspection methods in Failure Analysis and show example studies of my work as an Intern student at Texas Instruments. 1. Work at Texas Instruments, Stafford, TX, was supported by TI. 2. Work at Texas Tech University, was supported by NSF Grant DMR9705498.

  14. Human factors model concerning the man-machine interface of mining crewstations

    NASA Technical Reports Server (NTRS)

    Rider, James P.; Unger, Richard L.

    1989-01-01

    The U.S. Bureau of Mines is developing a computer model to analyze the human factors aspect of mining machine operator compartments. The model will be used as a research tool and as a design aid. It will have the capability to perform the following: simulated anthropometric or reach assessment, visibility analysis, illumination analysis, structural analysis of the protective canopy, operator fatigue analysis, and computation of an ingress-egress rating. The model will make extensive use of graphics to simplify data input and output. Two-dimensional orthographic projections of the machine and its operator compartment are digitized and the data rebuilt into a three-dimensional representation of the mining machine. Anthropometric data from either an individual or any size population may be used. The model is intended for use by equipment manufacturers and mining companies during initial design work on new machines. In addition to its use in machine design, the model should prove helpful as an accident investigation tool and for determining the effects of machine modifications made in the field on the critical areas of visibility and control reachability.

  15. Tips and tricks for preparing lampbrush chromosome spreads from Xenopus tropicalis oocytes.

    PubMed

    Penrad-Mobayed, May; Kanhoush, Rasha; Perrin, Caroline

    2010-05-01

    Due to their large size and fine organization, lampbrush chromosomes (LBCs) of amphibian oocytes have for decades been one of the favorite tools of biologists for the analysis of transcriptional and post-transcriptional processes at the cytological level. The emergence of the diploid amphibian Xenopus tropicalis as a model organism for vertebrate developmental genetics and the accumulation of sequence data made available by its recent genomic sequencing strongly revive interest in LBCs as a powerful tool to study genes expressed during oogenesis. We describe here a detailed protocol for preparing LBCs from X. tropicalis oocytes and give practical advice to encourage a large number of researchers to become familiar with these chromosomes.

  16. Allogeneic cell therapy bioprocess economics and optimization: downstream processing decisions.

    PubMed

    Hassan, Sally; Simaria, Ana S; Varadaraju, Hemanthram; Gupta, Siddharth; Warren, Kim; Farid, Suzanne S

    2015-01-01

    To develop a decisional tool to identify the most cost effective process flowsheets for allogeneic cell therapies across a range of production scales. A bioprocess economics and optimization tool was built to assess competing cell expansion and downstream processing (DSP) technologies. Tangential flow filtration was generally more cost-effective for the lower cells/lot achieved in planar technologies and fluidized bed centrifugation became the only feasible option for handling large bioreactor outputs. DSP bottlenecks were observed at large commercial lot sizes requiring multiple large bioreactors. The DSP contribution to the cost of goods/dose ranged between 20-55%, and 50-80% for planar and bioreactor flowsheets, respectively. This analysis can facilitate early decision-making during process development.

  17. Assessing therapeutic relevance of biologically interesting, ampholytic substances based on their physicochemical and spectral characteristics with chemometric tools

    NASA Astrophysics Data System (ADS)

    Judycka, U.; Jagiello, K.; Bober, L.; Błażejowski, J.; Puzyn, T.

    2018-06-01

    Chemometric tools were applied to investigate the biological behaviour of ampholytic substances in relation to their physicochemical and spectral properties. Results of the Principal Component Analysis suggest that the size of molecules and their electronic and spectral characteristics are the key properties required to predict the therapeutic relevance of the compounds examined. These properties were used for developing the structure-activity classification model. The classification model allows assessing the therapeutic behaviour of ampholytic substances solely on the basis of descriptor values that can be obtained computationally. Thus, the prediction is possible without the necessity of carrying out time-consuming and expensive laboratory tests, which is its main advantage.
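
    A minimal sketch of the workflow described above, assuming a descriptor matrix and activity labels: principal components are extracted from the standardized descriptors and fed to a simple classifier. The data are randomly generated stand-ins, and logistic regression is only a placeholder for whatever classification model the authors actually built.

```python
# Illustrative sketch only: PCA on a descriptor matrix followed by a
# simple classifier, mirroring the workflow described above.  The
# descriptor values and "therapeutic relevance" labels are random
# stand-ins, not data from the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 12))                    # 60 compounds x 12 computed descriptors
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)    # toy activity label

model = make_pipeline(StandardScaler(), PCA(n_components=3), LogisticRegression())
model.fit(X, y)
print("training accuracy:", model.score(X, y))
print("explained variance ratios:", model.named_steps["pca"].explained_variance_ratio_)
```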

  18. Nanohole optical tweezers in heterogeneous mixture analysis

    NASA Astrophysics Data System (ADS)

    Hacohen, Noa; Ip, Candice J. X.; Laxminarayana, Gurunatha K.; DeWolf, Timothy S.; Gordon, Reuven

    2017-08-01

    Nanohole optical trapping is a tool that has been shown to analyze proteins at the single molecule level using pure samples. The next step is to detect and study single molecules with dirty samples. We demonstrate that using our double nanohole optical tweezing configuration, single particles in an egg white solution can be classified when trapped. Different sized molecules provide different signal variations in their trapped state, allowing the proteins to be statistically characterized. Root mean squared variation and trap stiffness are methods used on trapped signals to distinguish between the different proteins. This method to isolate and determine single molecules in heterogeneous samples provides huge potential to become a reliable tool for use within biomedical and scientific communities.

  19. ServAR: An augmented reality tool to guide the serving of food.

    PubMed

    Rollo, Megan E; Bucher, Tamara; Smith, Shamus P; Collins, Clare E

    2017-05-12

    Accurate estimation of food portion size is a difficult task. Visual cues are important mediators of portion size and therefore technology-based aids may assist consumers when serving and estimating food portions. The current study evaluated the usability and impact on estimation error of standard food servings of a novel augmented reality food serving aid, ServAR. Participants were randomised into one of three groups: 1) no information/aid (control); 2) verbal information on standard serving sizes; or 3) ServAR, an aid which overlayed virtual food servings over a plate using a tablet computer. Participants were asked to estimate the standard serving sizes of nine foods (broccoli, carrots, cauliflower, green beans, kidney beans, potato, pasta, rice, and sweetcorn) using validated food replicas. Wilcoxon signed-rank tests compared median served weights of each food to reference standard serving size weights. Percentage error was used to compare the accuracy of serving size estimation between the three groups. All participants also performed a usability test using the ServAR tool to guide the serving of one randomly selected food. Ninety adults (78.9% female; mean (95% CI) age 25.8 (24.9-26.7) years; BMI 24.2 (23.2-25.2) kg/m2) completed the study. The median servings were significantly different from the reference portions for five foods in the ServAR group, compared to eight foods in the information only group and seven foods for the control group. The cumulative proportion of total estimations per group within ±10%, ±25% and ±50% of the reference portion was greater for those using ServAR (30.7, 65.2 and 90.7%, respectively), compared to the information only group (19.6, 47.4 and 77.4%) and the control group (10.0, 33.7 and 68.9%). Participants generally found the ServAR tool easy to use and agreed that it showed potential to support optimal portion size selection. However, some refinements to the ServAR tool are required to improve the user experience. Use of the augmented reality tool improved the accuracy and consistency of estimating standard serve sizes compared to the information only and control conditions. ServAR demonstrates potential as a practical tool to guide the serving of food. Further evaluation across a broad range of foods, portion sizes and settings is warranted.
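
    The two headline statistics can be reproduced schematically as follows; the 75 g reference serve and the ten served weights are invented values, used only to show the form of the percentage-error and Wilcoxon signed-rank calculations.

```python
# Hypothetical re-creation of the two statistics reported above:
# percentage error of each served weight relative to a reference
# standard serve, and a Wilcoxon signed-rank test of the served
# weights against that reference.  All weights are invented.
import numpy as np
from scipy.stats import wilcoxon

reference_g = 75.0                                  # assumed standard serve (g)
served_g = np.array([70, 82, 77, 64, 90, 73, 79, 68, 85, 74], dtype=float)

pct_error = 100.0 * (served_g - reference_g) / reference_g
within_25 = np.mean(np.abs(pct_error) <= 25) * 100

stat, p = wilcoxon(served_g - reference_g)          # H0: median difference is zero
print(f"median served = {np.median(served_g):.0f} g, p = {p:.3f}")
print(f"{within_25:.0f}% of estimates within +/-25% of the reference serve")
```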

  20. Toyz: A framework for scientific analysis of large datasets and astronomical images

    NASA Astrophysics Data System (ADS)

    Moolekamp, F.; Mamajek, E.

    2015-11-01

    As the size of images and data products derived from astronomical data continues to increase, new tools are needed to visualize and interact with that data in a meaningful way. Motivated by our own astronomical images taken with the Dark Energy Camera (DECam), we present Toyz, an open source Python package for viewing and analyzing images and data stored on a remote server or cluster. Users connect to the Toyz web application via a web browser, making it a convenient tool for students to visualize and interact with astronomical data without having to install any software on their local machines. In addition, it provides researchers with an easy-to-use tool that allows them to browse the files on a server, quickly view very large images (>2 Gb) taken with DECam and other cameras with a large FOV, and create their own visualization tools that can be added as extensions to the default Toyz framework.

  1. Morphometric Assessment of Convergent Tool Technology and Function during the Early Middle Palaeolithic: The Case of Payre, France

    PubMed Central

    Détroit, Florent; Coudenneau, Aude; Moncel, Marie-Hélène

    2016-01-01

    There appears to be little doubt as to the existence of an intentional technological resolve to produce convergent tools during the Middle Palaeolithic. However, the use of these pieces as pointed tools is still subject to debate: i.e., handheld tool vs. hafted tool. Present-day technological analysis has begun to apply new methodologies in order to quantify shape variability and to decipher the role of the morphology of these pieces in relation to function; for instance, geometric morphometric analyses have recently been applied with successful results. This paper presents a study of this type of analysis on 37 convergent tools from level Ga of Payre site (France), dated to MIS 8–7. These pieces are non-standardized knapping products produced by discoidal and orthogonal core technologies. Moreover, macro-wear studies attest to various activities on diverse materials with no evidence of hafting or projectile use. The aim of this paper is to test the geometric morphometric approach on non-standardized artefacts applying the Elliptical Fourier analysis (EFA) to 3D contours and to assess the potential relationship between size and shape, technology and function. This study is innovative in that it is the first time that this method, considered to be a valuable complement for describing technological and functional attributes, is applied to 3D contours of lithic products. Our results show that this methodology ensures a very good degree of accuracy in describing shape variations of the sharp edges of technologically non-standardized convergent tools. EFA on 3D contours indicates variations in deviations of the outline along the third dimension (i.e., dorso-ventrally) and yields quantitative and insightful information on the actual shape variations of tools. Several statistically significant relationships are found between shape variation and use-wear attributes, though the results emphasize the large variability of the shape of the convergent tools, which, in general, does not show a strong direct association with technological features and function. This is in good agreement with the technological context of this chronological period, characterized by a wide diversity of non-standardized tools adapted to multipurpose functions for varied subsistence activities. PMID:27191164

  2. Morphometric Assessment of Convergent Tool Technology and Function during the Early Middle Palaeolithic: The Case of Payre, France.

    PubMed

    Chacón, M Gema; Détroit, Florent; Coudenneau, Aude; Moncel, Marie-Hélène

    2016-01-01

    There appears to be little doubt as to the existence of an intentional technological resolve to produce convergent tools during the Middle Palaeolithic. However, the use of these pieces as pointed tools is still subject to debate: i.e., handheld tool vs. hafted tool. Present-day technological analysis has begun to apply new methodologies in order to quantify shape variability and to decipher the role of the morphology of these pieces in relation to function; for instance, geometric morphometric analyses have recently been applied with successful results. This paper presents a study of this type of analysis on 37 convergent tools from level Ga of Payre site (France), dated to MIS 8-7. These pieces are non-standardized knapping products produced by discoidal and orthogonal core technologies. Moreover, macro-wear studies attest to various activities on diverse materials with no evidence of hafting or projectile use. The aim of this paper is to test the geometric morphometric approach on non-standardized artefacts applying the Elliptical Fourier analysis (EFA) to 3D contours and to assess the potential relationship between size and shape, technology and function. This study is innovative in that it is the first time that this method, considered to be a valuable complement for describing technological and functional attributes, is applied to 3D contours of lithic products. Our results show that this methodology ensures a very good degree of accuracy in describing shape variations of the sharp edges of technologically non-standardized convergent tools. EFA on 3D contours indicates variations in deviations of the outline along the third dimension (i.e., dorso-ventrally) and yields quantitative and insightful information on the actual shape variations of tools. Several statistically significant relationships are found between shape variation and use-wear attributes, though the results emphasize the large variability of the shape of the convergent tools, which, in general, does not show a strong direct association with technological features and function. This is in good agreement with the technological context of this chronological period, characterized by a wide diversity of non-standardized tools adapted to multipurpose functions for varied subsistence activities.

  3. Laser-induced incandescence of titania nanoparticles synthesized in a flame

    NASA Astrophysics Data System (ADS)

    Cignoli, F.; Bellomunno, C.; Maffi, S.; Zizak, G.

    2009-09-01

    Laser-induced incandescence experiments were carried out in a flame reactor during titania nanoparticle synthesis. The structure of the reactor employed allowed for a rather smooth particle growth along the flame axis, with limited mixing of different size particles. Particle incandescence was excited by the 4th harmonic of a Nd:YAG laser. The radiation emitted from the particles was recorded in time and checked by spectral analysis. Results were compared with measurements from transmission electron microscopy of samples taken at the same locations probed by incandescence. This was done covering a portion of the flame length within which a particle size growth of a factor of about four was detected. The incandescence decay time was found to increase monotonically with particle size. The attainment of a process control tool in nanoparticle flame synthesis appears to be realistic.

  4. Family history tools in primary care: does one size fit all?

    PubMed

    Wilson, B J; Carroll, J C; Allanson, J; Little, J; Etchegary, H; Avard, D; Potter, B K; Castle, D; Grimshaw, J M; Chakraborty, P

    2012-01-01

    Family health history (FHH) has potential value in many health care settings. This review discusses the potential uses of FHH information in primary care and the need for tools to be designed accordingly. We developed a framework in which the attributes of FHH tools are mapped against these different purposes. It contains 7 attributes mapped against 5 purposes. In considering different FHH tool purposes, it is apparent that different attributes become more or less important, and that tools for different purposes require different implementation and evaluation strategies. The context in which a tool is used is also relevant to its effectiveness. For FHH tools, it is unlikely that 'one size fits all', although appreciation of different purposes, users and contexts should facilitate the development of different applications from single FHH platforms. Copyright © 2012 S. Karger AG, Basel.

  5. Intentional defect array wafers: their practical use in semiconductor control and monitoring systems

    NASA Astrophysics Data System (ADS)

    Emami, Iraj; McIntyre, Michael; Retersdorf, Michael

    2003-07-01

    In the competitive world of semiconductor manufacturing today, control of the process and manufacturing equipment is paramount to the success of the business. Consistent with the need for rapid development of process technology is a need for development with respect to equipment control, including defect metrology tools. Historical control methods for defect metrology tools included a raw count of defects detected on a characterized production or test wafer, with little or no regard to the attributes of the detected defects. Over time, these characterized wafers degrade with multiple passes on the tools and handling, requiring the tool owner to create and characterize new samples periodically. With the complex engineering software analysis systems used today, there is a strong reliance on the accuracy of defect size, location, and classification in order to provide the best value when correlating in-line data to sort data. Intentional Defect Array (IDA) wafers were designed and manufactured at International Sematech (ISMT) in Austin, Texas, and are a product of collaboration between ISMT member companies and suppliers of advanced defect inspection equipment. These wafers provide the user with known defect types and sizes in predetermined locations across the entire wafer. The wafers are designed to incorporate several desired flows and use critical dimensions consistent with current and future technology nodes. This paper briefly describes the design of the IDA wafer and details many practical applications in the control of advanced defect inspection equipment.

  6. Assessing attachment in school-aged children: Do the School-Age Assessment of Attachment and Family Drawings work together as complementary tools?

    PubMed

    Carr-Hopkins, Rebecca; De Burca, Calem; Aldridge, Felicity A

    2017-07-01

    Our goal was to identify an assessment package that could improve treatment planning for troubled children and their families. To assess the validity of our tools, we tested the relations among the School-Age Assessment of Attachment, the Family Drawing and children's risk status. We used the Dynamic-Maturational Model of Attachment and Adaptation to interpret the assessments in the hope of identifying a gradient of risk, and explore whether a new coding method improved the validity of Family Drawings and their utility as a tool to complement the School-Age Assessment of Attachment. The participants were 89 children, aged between 5 and 12 years; 32 children were involved with mental health services or child protection. Each child completed a School-Age Assessment of Attachment and a Family Drawing. Both assessments differentiated between clinical and normative referrals with moderate effect sizes when dichotomizing risk versus non-risk attachment. When the analysis incorporated a gradient of six attachment classifications, the effect sizes decreased, but specificity of risk increased. The School-Age Assessment of Attachment had greater validity for discriminating risk, and type of risk, than the Family Drawings. With a School-Age Assessment of Attachment and family history, the Family Drawing can provide information about distress that some children do not provide verbally. Integration of the two assessment tools alongside information about parental and family functioning appears to be the key to formulating children's problems.

  7. Chitosan-Graphene Oxide 3D scaffolds as Promising Tools for Bone Regeneration in Critical-Size Mouse Calvarial Defects.

    PubMed

    Hermenean, Anca; Codreanu, Ada; Herman, Hildegard; Balta, Cornel; Rosu, Marcel; Mihali, Ciprian Valentin; Ivan, Alexandra; Dinescu, Sorina; Ionita, Mariana; Costache, Marieta

    2017-11-30

    Limited self-regenerating capacity of the human skeleton makes the reconstruction of critical size bone defects a significant challenge for clinical practice. Aimed at regenerating bone tissues, this study was designed to investigate osteogenic differentiation, along with the bone repair capacity, of 3D chitosan (CHT) scaffolds enriched with graphene oxide (GO) in a critical-sized mouse calvarial defect. Histopathological/histomorphometry and scanning electron microscopy (SEM) analysis of the implants revealed a larger amount of new bone in the CHT/GO-filled defects compared with CHT alone (p < 0.001). When combined with GO, CHT scaffolds synergistically promoted the increase of alkaline phosphatase activity in both in vitro and in vivo experiments. This enhanced osteogenesis was corroborated with increased expression of bone morphogenetic protein (BMP) and Runx-2 up to week 4 post-implantation, which showed that GO facilitates the differentiation of osteoprogenitor cells. Meanwhile, osteogenesis was promoted by GO at the late stage as well, as indicated by the up-regulation of osteopontin and osteocalcin at week 8, with both markers overexpressed at week 18. Our data suggest that the CHT/GO biomaterial could represent a promising tool for the reconstruction of large bone defects, without using exogenous living cells or growth factors.

  8. The Effects of Mobile-Computer-Supported Collaborative Learning: Meta-Analysis and Critical Synthesis.

    PubMed

    Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh

    2017-08-01

    One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial contributions to collaborative learning is still scarce. The present meta-analysis, which included 48 peer-reviewed journal articles and doctoral dissertations written over a 16-year period (2000-2015) involving 5,294 participants, revealed that mCSCL has produced meaningful improvements for collaborative learning, with an overall mean effect size of 0.516. Moderator variables, such as domain subject, group size, teaching method, intervention duration, and reward method were related to different effect sizes. The results provided implications for future research and practice, such as suggestions on how to appropriately use the functionalities of mobile devices, how to best leverage mCSCL through effective group learning mechanisms, and what outcome variables should be included in future studies to fully elucidate the process and products of mCSCL.

  9. The Effects of Mobile-Computer-Supported Collaborative Learning: Meta-Analysis and Critical Synthesis

    PubMed Central

    Sung, Yao-Ting; Yang, Je-Ming; Lee, Han-Yueh

    2017-01-01

    One of the trends in collaborative learning is using mobile devices for supporting the process and products of collaboration, which has been forming the field of mobile-computer-supported collaborative learning (mCSCL). Although mobile devices have become valuable collaborative learning tools, evaluative evidence for their substantial contributions to collaborative learning is still scarce. The present meta-analysis, which included 48 peer-reviewed journal articles and doctoral dissertations written over a 16-year period (2000–2015) involving 5,294 participants, revealed that mCSCL has produced meaningful improvements for collaborative learning, with an overall mean effect size of 0.516. Moderator variables, such as domain subject, group size, teaching method, intervention duration, and reward method were related to different effect sizes. The results provided implications for future research and practice, such as suggestions on how to appropriately use the functionalities of mobile devices, how to best leverage mCSCL through effective group learning mechanisms, and what outcome variables should be included in future studies to fully elucidate the process and products of mCSCL. PMID:28989193

  10. Cognitive rehabilitation in schizophrenia: a quantitative analysis of controlled studies.

    PubMed

    Krabbendam, Lydia; Aleman, André

    2003-09-01

    Cognitive rehabilitation is now recognized as an important tool in the treatment of schizophrenia, and findings in this area are emerging rapidly. There is a need for a systematic review of the effects of the different training programs. To review quantitatively the controlled studies on cognitive rehabilitation in schizophrenia for the effect of training on performance on tasks other than those practiced in the training procedure. A meta-analysis was conducted on 12 controlled studies of cognitive rehabilitation in schizophrenia taking into account the effects of type of rehabilitation approach (rehearsal or strategy learning) and duration of training. The mean weighted effect size was 0.45, with a 95% confidence interval from 0.26 to 0.64. Effect sizes differed slightly, depending on rehabilitation approach, in favor of strategy learning, but this difference did not reach statistical significance. Duration of training did not influence effect size. Cognitive rehabilitation can improve task performance in patients with schizophrenia and this effect is apparent on tasks outside those practiced during the training procedure. Future studies should include more real-world outcomes and perform longitudinal evaluations.

  11. Human Fear Chemosignaling: Evidence from a Meta-Analysis.

    PubMed

    de Groot, Jasper H B; Smeets, Monique A M

    2017-10-01

    Alarm pheromones are widely used in the animal kingdom. Notably, there are 26 published studies (N = 1652) highlighting a human capacity to communicate fear, stress, and anxiety via body odor from one person (66% males) to another (69% females). The question is whether the findings of this literature reflect a true effect, and what the average effect size is. These questions were answered by combining traditional meta-analysis with novel meta-analytical tools, p-curve analysis and p-uniform, techniques that could indicate whether findings are likely to reflect a true effect based on the distribution of P-values. A traditional random-effects meta-analysis yielded a small-to-moderate effect size (Hedges' g: 0.36, 95% CI: 0.31-0.41), p-curve analysis showed evidence diagnostic of a true effect (ps < 0.0001), and there was no evidence for publication bias. This meta-analysis did not assess the internal validity of the current studies; yet, the combined results illustrate the statistical robustness of a field in human olfaction dealing with the human capacity to communicate certain emotions (fear, stress, anxiety) via body odor. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
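
    The random-effects pooling step (not the p-curve or p-uniform analyses) can be sketched with the DerSimonian-Laird estimator; the per-study Hedges' g values and sampling variances below are invented for illustration.

```python
# Sketch of DerSimonian-Laird random-effects pooling on invented
# per-study Hedges' g values and variances; illustrates the pooling
# step only, not p-curve or p-uniform.
import numpy as np

g = np.array([0.45, 0.30, 0.55, 0.20, 0.40, 0.35])   # hypothetical study effects
v = np.array([0.04, 0.05, 0.06, 0.03, 0.05, 0.04])   # their sampling variances

w = 1.0 / v                                           # fixed-effect weights
g_fixed = np.sum(w * g) / np.sum(w)
Q = np.sum(w * (g - g_fixed) ** 2)                    # heterogeneity statistic
df = len(g) - 1
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)                         # between-study variance

w_star = 1.0 / (v + tau2)                             # random-effects weights
g_re = np.sum(w_star * g) / np.sum(w_star)
se = np.sqrt(1.0 / np.sum(w_star))
print(f"pooled g = {g_re:.2f}, 95% CI = ({g_re - 1.96*se:.2f}, {g_re + 1.96*se:.2f})")
```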

  12. Interactive Visual Analysis within Dynamic Ocean Models

    NASA Astrophysics Data System (ADS)

    Butkiewicz, T.

    2012-12-01

    The many observation and simulation based ocean models available today can provide crucial insights for all fields of marine research and can serve as valuable references when planning data collection missions. However, the increasing size and complexity of these models makes leveraging their contents difficult for end users. Through a combination of data visualization techniques, interactive analysis tools, and new hardware technologies, the data within these models can be made more accessible to domain scientists. We present an interactive system that supports exploratory visual analysis within large-scale ocean flow models. The currents and eddies within the models are illustrated using effective, particle-based flow visualization techniques. Stereoscopic displays and rendering methods are employed to ensure that the user can correctly perceive the complex 3D structures of depth-dependent flow patterns. Interactive analysis tools are provided which allow the user to experiment through the introduction of their customizable virtual dye particles into the models to explore regions of interest. A multi-touch interface provides natural, efficient interaction, with custom multi-touch gestures simplifying the otherwise challenging tasks of navigating and positioning tools within a 3D environment. We demonstrate the potential applications of our visual analysis environment with two examples of real-world significance: Firstly, an example of using customized particles with physics-based behaviors to simulate pollutant release scenarios, including predicting the oil plume path for the 2010 Deepwater Horizon oil spill disaster. Secondly, an interactive tool for plotting and revising proposed autonomous underwater vehicle mission pathlines with respect to the surrounding flow patterns predicted by the model; as these survey vessels have extremely limited energy budgets, designing more efficient paths allows for greater survey areas.

  13. The Role of Dog Population Management in Rabies Elimination—A Review of Current Approaches and Future Opportunities

    PubMed Central

    Taylor, Louise H.; Wallace, Ryan M.; Balaram, Deepashree; Lindenmayer, Joann M.; Eckery, Douglas C.; Mutonono-Watkiss, Beryl; Parravani, Ellie; Nel, Louis H.

    2017-01-01

    Free-roaming dogs and rabies transmission are integrally linked across many low-income countries, and large unmanaged dog populations can be daunting to rabies control program planners. Dog population management (DPM) is a multifaceted concept that aims to improve the health and well-being of free-roaming dogs, reduce problems they may cause, and may also aim to reduce dog population size. In theory, DPM can facilitate more effective rabies control. Community engagement focused on promoting responsible dog ownership and better veterinary care could improve the health of individual animals and dog vaccination coverage, thus reducing rabies transmission. Humane DPM tools, such as sterilization, could theoretically reduce dog population turnover and size, allowing rabies vaccination coverage to be maintained more easily. However, it is important to understand local dog populations and community attitudes toward them in order to determine whether and how DPM might contribute to rabies control and which DPM tools would be most successful. In practice, there is very limited evidence of DPM tools achieving reductions in the size or turnover of dog populations in canine rabies-endemic areas. Different DPM tools are frequently used together and combined with rabies vaccinations, but full impact assessments of DPM programs are not usually available, and therefore, evaluation of tools is difficult. Surgical sterilization is the most frequently documented tool and has successfully reduced dog population size and turnover in a few low-income settings. However, DPM programs are mostly conducted in urban settings and are usually not government funded, raising concerns about their applicability in rural settings and sustainability over time. Technical demands, costs, and the time necessary to achieve population-level impacts are major barriers. Given their potential value, we urgently need more evidence of the effectiveness of DPM tools in the context of canine rabies control. Cheaper, less labor-intensive tools for dog sterilization will be extremely valuable in realizing the potential benefits of reduced population turnover and size. No one DPM tool will fit all situations, but if DPM objectives are achieved dog populations may be stabilized or even reduced, facilitating higher dog vaccination coverages that will benefit rabies elimination efforts. PMID:28740850

  14. Visual readability analysis: how to make your writings easier to read.

    PubMed

    Oelke, Daniela; Spretke, David; Stoffel, Andreas; Keim, Daniel A

    2012-05-01

    We present a tool that is specifically designed to support a writer in revising a draft version of a document. In addition to showing which paragraphs and sentences are difficult to read and understand, we assist the reader in understanding why this is the case. This requires features that are expressive predictors of readability, and are also semantically understandable. In the first part of the paper, we, therefore, discuss a semiautomatic feature selection approach that is used to choose appropriate measures from a collection of 141 candidate readability features. In the second part, we present the visual analysis tool VisRA, which allows the user to analyze the feature values across the text and within single sentences. Users can choose between different visual representations accounting for differences in the size of the documents and the availability of information about the physical and logical layout of the documents. We put special emphasis on providing as much transparency as possible to ensure that the user can purposefully improve the readability of a sentence. Several case studies are presented that show the wide range of applicability of our tool. Furthermore, an in-depth evaluation assesses the quality of the measure and investigates how well users do in revising a text with the help of the tool.

  15. Phase transition of charged-AdS black holes and quasinormal modes: A time domain analysis

    NASA Astrophysics Data System (ADS)

    Chabab, M.; El Moumni, H.; Iraoui, S.; Masmar, K.

    2017-10-01

    In this work, we investigate the time evolution of a massless scalar perturbation around small and large RN-AdS4 black holes for the purpose of probing the thermodynamic phase transition. We show that below the critical point the scalar perturbation decays faster as the black hole size increases, in both the small and large black hole phases. Our analysis of the time profile of the quasinormal modes reveals a sharp distinction between the behaviors of the two phases, providing a reliable tool to probe the black hole phase transition. However, at the critical point P=Pc, as the black hole size increases, we note that the damping time increases and the perturbation decays faster, and the oscillation frequencies rise in both the small and large black hole phases. In this case the time evolution approach fails to track the AdS4 black hole phase transition.

  16. Spin Testing of Superalloy Disks With Dual Grain Structure

    NASA Technical Reports Server (NTRS)

    Hefferman, Tab M.

    2006-01-01

    This 24-month program was a joint effort between Allison Advanced Development Company (AADC), General Electric Aircraft (GEAE), and NASA Glenn Research Center (GRC). AADC led the disk and spin hardware design and analysis utilizing existing Rolls-Royce turbine disk forging tooling. Testing focused on spin testing four disks: two supplied by GEAE and two by AADC. The two AADC disks were made of Alloy 10, and each was subjected to a different heat treat process: one producing dual microstructure with coarse grain size at the rim and fine grain size at the bore and the other produced single fine grain structure throughout. The purpose of the spin tests was to provide data for evaluation of the impact of dual grain structure on disk overspeed integrity (yielding) and rotor burst criteria. The program culminated with analysis and correlation of the data to current rotor overspeed criteria and advanced criteria required for dual structure disks.

  17. Analyses of the Integration of Carbon Dioxide Removal Assembly, Compressor, Accumulator and Sabatier Carbon Dioxide Reduction Assembly

    NASA Technical Reports Server (NTRS)

    Jeng, Frank F.; Lafuse, Sharon; Smith, Frederick D.; Lu, Sao-Dung; Knox, James C.; Campbell, Melissa L.; Scull, Timothy D.; Green, Steve

    2010-01-01

    A tool has been developed by the Sabatier Team for analyzing/optimizing the CO2 removal assembly, CO2 compressor size, its operation logic, water generation from the Sabatier reactor, utilization of CO2 from crew metabolic output, and H2 from the oxygen generation assembly. Tests were conducted using the CDRA/simulation compressor set-up at MSFC in 2003. Analysis of test data has validated the CO2 desorption rate profile, CO2 compressor performance, CO2 recovery, and CO2 vacuum vent in CDRA desorption. Optimization of the compressor size and compressor operation logic for an integrated closed air revitalization system is being conducted by the Sabatier Team.
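
    For orientation, a back-of-envelope stoichiometric balance of the Sabatier reaction (CO2 + 4H2 -> CH4 + 2H2O) shows the kind of water-generation estimate such a tool produces; the assumed crew CO2 output of about 1 kg per crew-member per day is a round illustrative number, not a figure from the paper.

```python
# Back-of-envelope Sabatier mass balance, illustrating the kind of
# water-generation estimate described above.  The crew CO2 output of
# 1.0 kg per crew-member per day is an assumed round number.
CO2, H2, CH4, H2O = 44.01, 2.016, 16.04, 18.02   # molar masses, g/mol

co2_kg_per_day = 4 * 1.0                          # 4 crew x ~1 kg CO2/day (assumed)
mol_co2 = co2_kg_per_day * 1000 / CO2

# CO2 + 4 H2 -> CH4 + 2 H2O
h2_required_kg = mol_co2 * 4 * H2 / 1000
water_kg       = mol_co2 * 2 * H2O / 1000
methane_kg     = mol_co2 * 1 * CH4 / 1000

print(f"H2 required : {h2_required_kg:.2f} kg/day")
print(f"Water made  : {water_kg:.2f} kg/day")
print(f"CH4 vented  : {methane_kg:.2f} kg/day")
```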

  18. Personal computer study of finite-difference methods for the transonic small disturbance equation

    NASA Technical Reports Server (NTRS)

    Bland, Samuel R.

    1989-01-01

    Calculation of unsteady flow phenomena requires careful attention to the numerical treatment of the governing partial differential equations. The personal computer provides a convenient and useful tool for the development of meshes, algorithms, and boundary conditions needed to provide time accurate solution of these equations. The one-dimensional equation considered provides a suitable model for the study of wave propagation in the equations of transonic small disturbance potential flow. Numerical results for effects of mesh size, extent, and stretching, time step size, and choice of far-field boundary conditions are presented. Analysis of the discretized model problem supports these numerical results. Guidelines for suitable mesh and time step choices are given.
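
    In the same spirit, a first-order upwind scheme for the linear advection equation u_t + c u_x = 0 (a stand-in model equation, not the transonic small disturbance equation itself) shows how the choice of time step relative to mesh size governs stability; the periodic boundary used here is a simplification of the far-field conditions discussed in the paper.

```python
# Minimal 1D finite-difference experiment: first-order upwind scheme
# for u_t + c u_x = 0, showing how the CFL number (time step vs. mesh
# size) controls stability.  Periodic boundary stands in for the
# far-field conditions studied in the paper.
import numpy as np

def upwind_advection(nx=200, cfl=0.8, c=1.0, t_end=0.5):
    dx = 1.0 / nx
    dt = cfl * dx / c
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    u = np.exp(-200 * (x - 0.3) ** 2)                # Gaussian pulse initial condition
    for _ in range(int(t_end / dt)):
        u = u - c * dt / dx * (u - np.roll(u, 1))    # upwind difference, periodic BC
    return x, u

for cfl in (0.5, 1.0, 1.2):                          # CFL > 1 is unstable, as expected
    _, u = upwind_advection(cfl=cfl)
    print(f"CFL = {cfl}: max |u| after advection = {np.abs(u).max():.3g}")
```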

  19. Small numbers, disclosure risk, security, and reliability issues in Web-based data query systems.

    PubMed

    Rudolph, Barbara A; Shah, Gulzar H; Love, Denise

    2006-01-01

    This article describes the process for developing consensus guidelines and tools for releasing public health data via the Web and highlights approaches leading agencies have taken to balance disclosure risk with public dissemination of reliable health statistics. An agency's choice of statistical methods for improving the reliability of released data for Web-based query systems is based upon a number of factors, including query system design (dynamic analysis vs preaggregated data and tables), population size, cell size, data use, and how data will be supplied to users. The article also describes those efforts that are necessary to reduce the risk of disclosure of an individual's protected health information.

  20. NAP: The Network Analysis Profiler, a web tool for easier topological analysis and comparison of medium-scale biological networks.

    PubMed

    Theodosiou, Theodosios; Efstathiou, Georgios; Papanikolaou, Nikolas; Kyrpides, Nikos C; Bagos, Pantelis G; Iliopoulos, Ioannis; Pavlopoulos, Georgios A

    2017-07-14

    Nowadays, due to the technological advances of high-throughput techniques, Systems Biology has seen a tremendous growth of data generation. With network analysis, looking at biological systems at a higher level in order to better understand a system, its topology and the relationships between its components is of great importance. Gene expression, signal transduction, protein/chemical interactions, and biomedical literature co-occurrences are a few of the examples captured in biological network representations, where nodes represent certain bioentities and edges represent the connections between them. Today, many tools for network visualization and analysis are available. Nevertheless, most of them are standalone applications that often (i) burden users with computing and calculation time depending on the network's size and (ii) focus on handling, editing and exploring a network interactively. While such functionality is of great importance, limited efforts have been made towards the comparison of the topological analysis of multiple networks. Network Analysis Provider (NAP) is a comprehensive web tool to automate network profiling and intra/inter-network topology comparison. It is designed to bridge the gap between network analysis, statistics, graph theory and partially visualization in a user-friendly way. It is freely available and aims to become a very appealing tool for the broader community. It hosts a great plethora of topological analysis methods such as node and edge rankings. A few of its powerful characteristics are: its ability to enable easy profile comparisons across multiple networks, find their intersection and provide users with simplified, high quality plots of any of the offered topological characteristics against any other within the same network. It is written in R and Shiny, is based on the igraph library, and is able to handle medium-scale weighted/unweighted, directed/undirected and bipartite graphs. NAP is available at http://bioinformatics.med.uoc.gr/NAP .
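
    NAP itself is an R/Shiny application built on igraph; purely as an illustration of the kind of per-network topological profile and cross-network comparison it automates, the following Python/networkx sketch profiles two toy graphs and intersects their edge sets.

```python
# Illustrative Python/networkx sketch of per-network topological
# profiling and cross-network comparison (NAP itself is R/Shiny + igraph).
import networkx as nx

def topological_profile(g):
    """A few of the node/edge-level summaries a profiler might report."""
    degree = dict(g.degree())
    return {
        "nodes": g.number_of_nodes(),
        "edges": g.number_of_edges(),
        "density": nx.density(g),
        "avg_clustering": nx.average_clustering(g),
        "top_degree_nodes": sorted(degree, key=degree.get, reverse=True)[:3],
    }

# Two toy networks to compare side by side.
net_a = nx.erdos_renyi_graph(50, 0.08, seed=1)
net_b = nx.barabasi_albert_graph(50, 2, seed=1)

for name, g in [("network A", net_a), ("network B", net_b)]:
    print(name, topological_profile(g))

# Intersection of the two edge sets, as in a cross-network comparison view.
print("shared edges:", len(set(net_a.edges()) & set(net_b.edges())))
```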

  1. The decomposition of deformation: New metrics to enhance shape analysis in medical imaging.

    PubMed

    Varano, Valerio; Piras, Paolo; Gabriele, Stefano; Teresi, Luciano; Nardinocchi, Paola; Dryden, Ian L; Torromeo, Concetta; Puddu, Paolo E

    2018-05-01

    In landmark-based Shape Analysis, size is measured, in most cases, with Centroid Size. Changes in shape are decomposed into affine and non-affine components. Furthermore, the non-affine component can in turn be decomposed into a series of local deformations (partial warps). If the extent of deformation between two shapes is small, the difference between Centroid Size and m-Volume increment is barely appreciable. In medical imaging applied to soft tissues, bodies can undergo very large deformations, involving large changes in size. The cardiac example analyzed in the present paper shows changes in m-Volume that can reach 60%. We show here that standard Geometric Morphometrics tools (landmarks, Thin Plate Spline, and the related decomposition of the deformation) can be generalized to better describe the very large deformations of biological tissues, without losing a synthetic description. In particular, the classical decomposition of the space tangent to the shape space into affine and non-affine components is enriched to also include the change in size, in order to give a complete description of the tangent space to the size-and-shape space. The proposed generalization is formulated by means of a new Riemannian metric describing the change in size as a change in m-Volume rather than a change in Centroid Size. This leads to a redefinition of some aspects of Kendall's size-and-shape space without losing Kendall's original formulation. This new formulation is discussed by means of simulated examples using 2D and 3D platonic shapes, as well as a real example from clinical 3D echocardiographic data. We demonstrate that our decomposition-based approaches discriminate very effectively between healthy subjects and patients affected by Hypertrophic Cardiomyopathy. Copyright © 2018 Elsevier B.V. All rights reserved.
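
    The contrast between the two size measures can be illustrated directly: Centroid Size of a landmark configuration versus an m-Volume proxy, here taken as the convex-hull volume of the same 3D landmarks. The coordinates below are synthetic, not echocardiographic data.

```python
# Sketch contrasting the two size measures discussed above: Centroid
# Size of a landmark configuration versus an m-Volume proxy (here the
# convex-hull volume of the same 3D landmarks).  Coordinates are synthetic.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(2)
landmarks = rng.normal(size=(20, 3))               # 20 landmarks in 3D

def centroid_size(x):
    return np.sqrt(np.sum((x - x.mean(axis=0)) ** 2))

def hull_volume(x):
    return ConvexHull(x).volume

inflated = landmarks * np.array([1.6, 1.0, 1.0])   # anisotropic "dilation"
for name, shape in [("baseline", landmarks), ("dilated", inflated)]:
    print(f"{name}: centroid size = {centroid_size(shape):.2f}, "
          f"hull volume = {hull_volume(shape):.2f}")
```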

  2. The effect of grinding on the mechanical behavior of Y-TZP ceramics: A systematic review and meta-analyses.

    PubMed

    Pereira, G K R; Fraga, S; Montagner, A F; Soares, F Z M; Kleverlaan, C J; Valandro, L F

    2016-10-01

    The aim of this study was to systematically review the literature to assess the effect of grinding on the mechanical properties, structural stability and superficial characteristics of Y-TZP ceramics. The MEDLINE via PubMed and Web of Science (ISI - Web of Knowledge) electronic databases were searched, including peer-reviewed publications in the English language with no publication year limit. From 342 potentially eligible studies, 73 were selected for full-text analysis, 30 were included in the systematic review, and 20 were considered in the meta-analysis. Two reviewers independently selected the studies, extracted the data, and assessed the risk of bias. Statistical analyses were performed using RevMan 5.1, with a random effects model, at a significance level of 0.05. A descriptive analysis considering phase transformation, Y-TZP grain size, Vickers hardness, residual stress and aging of all included studies was executed. Four outcomes were considered in the meta-analyses (factor: grinding x as-sintered) in global and subgroup analyses (grinding tool, grit size and cooling) for flexural strength and roughness (Ra) data. A significant difference (p<0.05) was observed in the global analysis for strength, favoring as-sintered; subgroup analyses revealed that different parameters lead to different effects on strength. In the global analysis for roughness, a significant difference (p<0.05) was observed between conditions, favoring grinding; subgroup analyses revealed that different parameters also lead to different effects on roughness. High heterogeneity was found in some comparisons. Generally, grinding promotes a decrease in strength and an increase in roughness of Y-TZP ceramics. However, the use of a grinding tool that allows greater accuracy of movement (i.e., contra-angle hand-pieces coupled to slow-speed turbines), a small grit size (<50μm) and the use of plenty of coolant seem to be the main factors to decrease defect introduction and allow the occurrence of the toughening transformation mechanism, decreasing the risk of a deleterious impact on Y-TZP mechanical properties. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Droplet sizing instrumentation used for icing research: Operation, calibration, and accuracy

    NASA Technical Reports Server (NTRS)

    Hovenac, Edward A.

    1989-01-01

    The accuracy of the Forward Scattering Spectrometer Probe (FSSP) is determined using laboratory tests, wind tunnel comparisons, and computer simulations. Operation in an icing environment is discussed and a new calibration device for the FSSP (the rotating pinhole) is demonstrated to be a valuable tool. Operation of the Optical Array Probe is also presented along with a calibration device (the rotating reticle) which is suitable for performing detailed analysis of that instrument.

  4. Advanced Concepts for Composite Structure Joints and Attachment Fittings. Volume I. Design and Evaluation.

    DTIC Science & Technology

    1981-11-01

    interlaminar tension). The analysis is also influenced by other factors such as bolt location, washer/bolt size, fastener pattern, laminate thickness, corner...to reduce the cost of tooling were also studied. These include: * Pultrusion dies for under $5,000 * Stable, accurate, low-cost chopped-fiber phenolic ...fittings were state-of-the-art methods developed for laminated composite plates, shells, beams, and columns as used in analyses of discontinuities, edge

  5. WalkThrough Example Procedures for MAMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph

    This documentation is a growing set of walk-through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities of the MAMA software, but addresses the use of many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.

  6. Technology Challenges in Small UAV Development

    NASA Technical Reports Server (NTRS)

    Logan, Michael J.; Vranas, Thomas L.; Motter, Mark; Shams, Qamar; Pollock, Dion S.

    2005-01-01

    Development of highly capable small UAVs present unique challenges for technology protagonists. Size constraints, the desire for ultra low cost and/or disposable platforms, lack of capable design and analysis tools, and unique mission requirements all add to the level of difficulty in creating state-of-the-art small UAVs. This paper presents the results of several small UAV developments, the difficulties encountered, and proposes a list of technology shortfalls that need to be addressed.

  7. Experiences running NASTRAN on the Microvax 2 computer

    NASA Technical Reports Server (NTRS)

    Butler, Thomas G.; Mitchell, Reginald S.

    1987-01-01

    The MicroVAX operates NASTRAN so well that the only detectable difference in its operation compared to an 11/780 VAX is in the execution time. On the modest installation described here, the engineer has all of the tools he needs to do an excellent job of analysis. System configuration decisions, system sizing, preparation of the system disk, definition of user quotas, installation, monitoring of system errors, and operation policies are discussed.

  8. Body Size Is a Significant Predictor of Congruency in Species Richness Patterns: A Meta-Analysis of Aquatic Studies

    PubMed Central

    Velghe, Katherine; Gregory-Eaves, Irene

    2013-01-01

    Biodiversity losses over the next century are predicted to result in alterations of ecosystem functions that are on par with other major drivers of global change. Given the seriousness of this issue, there is a need to effectively monitor global biodiversity. Because performing biodiversity censuses of all taxonomic groups is prohibitively costly, indicator groups have been studied to estimate the biodiversity of different taxonomic groups. Quantifying cross-taxon congruence is a method of evaluating the assumption that the diversity of one taxonomic group can be used to predict the diversity of another. To improve the predictive ability of cross-taxon congruence in aquatic ecosystems, we evaluated whether body size, measured as the ratio of average body length between organismal groups, is a significant predictor of their cross-taxon biodiversity congruence. To test this hypothesis, we searched the published literature and screened for studies that used species richness correlations as their metric of cross-taxon congruence. We extracted 96 correlation coefficients from 16 studies, which encompassed 784 inland water bodies. With these correlation coefficients, we conducted a categorical meta-analysis, grouping data based on the body size ratio of organisms. Our results showed that cross-taxon congruence is variable among sites and between different groups (r values ranging between −0.53 and 0.88). In addition, our quantitative meta-analysis demonstrated that organisms most similar in body size showed stronger species richness correlations than organisms which differed increasingly in size (adjusted r2 = 0.94, p = 0.02). We propose that future studies applying biodiversity indicators in aquatic ecosystems consider functional traits such as body size, so as to increase their success at predicting the biodiversity of taxonomic groups where cost-effective conservation tools are needed. PMID:23468903
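
    The categorical pooling of correlation coefficients can be sketched with a Fisher z-transform: coefficients are transformed, averaged within body-size-ratio groups, and back-transformed. The coefficients and group labels below are invented, not the 96 extracted values.

```python
# Sketch of categorical pooling of correlation coefficients: Fisher
# z-transform, average within body-size-ratio groups, back-transform.
# The coefficients and group assignments are invented.
import numpy as np

# (body-size-ratio group, cross-taxon species-richness correlation)
records = [
    ("similar size", 0.72), ("similar size", 0.60), ("similar size", 0.55),
    ("10-100x",      0.35), ("10-100x",      0.20), ("10-100x",      0.42),
    (">100x",        0.05), (">100x",       -0.10), (">100x",        0.18),
]

groups = {}
for group, r in records:
    groups.setdefault(group, []).append(np.arctanh(r))   # Fisher z

for group, zs in groups.items():
    pooled_r = np.tanh(np.mean(zs))                      # back-transform mean z
    print(f"{group:>12}: pooled r = {pooled_r:.2f} (n = {len(zs)})")
```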

  9. TeraStitcher - A tool for fast automatic 3D-stitching of teravoxel-sized microscopy images

    PubMed Central

    2012-01-01

    Background Further advances in modern microscopy are leading to teravoxel-sized tiled 3D images at high resolution, thus increasing the dimension of the stitching problem by at least two orders of magnitude. The existing software solutions do not seem adequate to address the additional requirements arising from these datasets, such as the minimization of memory usage and the need to process just a small portion of data. Results We propose a free and fully automated 3D stitching tool designed to match the special requirements coming out of teravoxel-sized tiled microscopy images that is able to stitch them in a reasonable time even on workstations with limited resources. The tool was tested on teravoxel-sized whole mouse brain images with micrometer resolution and it was also compared with the state-of-the-art stitching tools on megavoxel-sized publicly available datasets. This comparison confirmed that the solutions we adopted are suited for stitching very large images and also perform well on datasets with different characteristics. Indeed, some of the algorithms embedded in other stitching tools could be easily integrated in our framework if they turned out to be more effective on other classes of images. To this purpose, we designed a software architecture which separates the strategies that use memory resources efficiently from the algorithms which may depend on the characteristics of the acquired images. Conclusions TeraStitcher is a free tool that enables the stitching of teravoxel-sized tiled microscopy images even on workstations with relatively limited resources of memory (<8 GB) and processing power. It exploits the knowledge of approximate tile positions and uses ad-hoc strategies and algorithms designed for such very large datasets. The produced images can be saved into a multiresolution representation to be efficiently retrieved and processed. We provide TeraStitcher both as a standalone application and as a plugin of the free software Vaa3D. PMID:23181553

  10. Spark and HPC for High Energy Physics Data Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sehrish, Saba; Kowalkowski, Jim; Paterno, Marc

    A full High Energy Physics (HEP) data analysis is divided into multiple data reduction phases. Processing within these phases is extremely time consuming, therefore intermediate results are stored in files held in mass storage systems and referenced as part of large datasets. This processing model limits what can be done with interactive data analytics. Growth in the size and complexity of experimental datasets, along with emerging big data tools, is beginning to cause changes to the traditional ways of doing data analyses. Use of big data tools for HEP analysis looks promising, mainly because extremely large HEP datasets can be represented and held in memory across a system, and accessed interactively by encoding an analysis using high-level programming abstractions. The mainstream tools, however, are not designed for scientific computing or for exploiting the available HPC platform features. We use an example from the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) in Geneva, Switzerland. The LHC is the highest energy particle collider in the world. Our use case focuses on searching for new types of elementary particles explaining Dark Matter in the universe. We use HDF5 as our input data format, and Spark to implement the use case. We show the benefits and limitations of using Spark with HDF5 on Edison at NERSC.
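
    Purely as a toy illustration of the Spark-on-HDF5 pattern described above (the file path, dataset names, and selection cuts are hypothetical, and the paper's actual CMS analysis is far more involved), event columns might be read with h5py and filtered through a Spark DataFrame as follows.

```python
# Toy illustration only: read a few event-level columns from an HDF5
# file with h5py, hand them to Spark as a DataFrame, and apply a simple
# selection.  The file path and dataset names ("events/...") are hypothetical.
import h5py
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hep-toy-analysis").getOrCreate()

with h5py.File("events.h5", "r") as f:              # hypothetical input file
    met = f["events/missing_et"][:]                  # hypothetical dataset names
    njet = f["events/n_jets"][:]

rows = [(float(m), int(n)) for m, n in zip(met, njet)]
df = spark.createDataFrame(rows, schema=["missing_et", "n_jets"])

# Example dark-matter-style selection: large missing energy, few jets.
selected = df.filter((df.missing_et > 200.0) & (df.n_jets <= 2))
print("events passing selection:", selected.count())
spark.stop()
```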

  11. The development of internet based ship design support system for small and medium sized shipyards

    NASA Astrophysics Data System (ADS)

    Shin, Sung-Chul; Lee, Soon-Sup; Kang, Dong-Hoon; Lee, Kyung-Ho

    2012-03-01

    In this paper, a prototype ship basic planning system is implemented for small and medium sized shipyards based on internet technology and the concurrent engineering concept. The system is designed from user requirements; consequently, a standardized development environment and tools are selected. These tools are used during system development to define and evaluate core application technologies. The system will contribute to increasing the competitiveness of small and medium sized shipyards in the 21st century industrial environment.

  12. [Study of the reliability in one dimensional size measurement with digital slit lamp microscope].

    PubMed

    Wang, Tao; Qi, Chaoxiu; Li, Qigen; Dong, Lijie; Yang, Jiezheng

    2010-11-01

    To study the reliability of the digital slit lamp microscope as a tool for quantitative analysis in one-dimensional size measurement. Three single-blinded observers acquired and repeatedly measured images of 4.00 mm and 10.00 mm targets on a vernier caliper, simulating the human pupil and cornea diameter, under a China-made digital slit lamp microscope at objective magnifications of 4×, 10×, 16×, 25× and 40× for the 4.00 mm target and 4×, 10× and 16× for the 10.00 mm target, respectively. The correctness and precision of the measurements were compared. For the 4.00 mm images, the average values measured by the three investigators lay between 3.98 and 4.06 mm. For the 10.00 mm images, the average values fell within 10.00 to 10.04 mm. For the 4.00 mm images, except for A4, B25, C16 and C25, significant differences were found between the measured values and the true value. For the 10.00 mm images, except for A10, significant differences were found between the measured values and the true value. When the same investigator measured the same size at different magnifications, all results differed significantly across magnifications except investigator A's measurements of the 10.00 mm dimension. When the same size was compared among investigators, the 4.00 mm measurements at 4× magnification showed no significant difference among investigators; the remaining results differed significantly. The coefficient of variation of all measurement results was less than 5%, and the coefficient of variation decreased as magnification increased. One-dimensional size measurement with the digital slit lamp microscope has good reliability, but reliability analysis should be performed before it is used for quantitative analysis, to reduce systematic errors.
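
    A minimal sketch of the kind of reliability statistics reported above (mean, comparison against the true length, coefficient of variation), using invented repeat measurements of a nominal 4.00 mm target; this is not the study's data or analysis code.

```python
# Sketch of the reliability statistics described above, on made-up repeat
# measurements of a nominal 4.00 mm target.
import numpy as np
from scipy import stats

true_value = 4.00
measurements = np.array([4.02, 4.01, 4.03, 3.99, 4.04, 4.02, 4.00, 4.03])  # mm (invented)

mean = measurements.mean()
cv = measurements.std(ddof=1) / mean * 100                      # coefficient of variation, %
t_stat, p_value = stats.ttest_1samp(measurements, true_value)   # measured vs. true value

print(f"mean = {mean:.3f} mm, CV = {cv:.2f} %, p = {p_value:.3f}")
```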

  13. Asymptotic analysis of SPTA-based algorithms for no-wait flow shop scheduling problem with release dates.

    PubMed

    Ren, Tao; Zhang, Chuan; Lin, Lin; Guo, Meiting; Xie, Xionghang

    2014-01-01

    We address the scheduling problem for a no-wait flow shop to optimize total completion time with release dates. Using the tool of asymptotic analysis, we prove that the objective values of two SPTA-based algorithms converge to the optimal value for sufficiently large problems. To further enhance the performance of the SPTA-based algorithms, an improvement scheme based on local search is provided for moderate-scale problems. A new lower bound is presented for evaluating the asymptotic optimality of the algorithms. Numerical simulations demonstrate the effectiveness of the proposed algorithms.
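
    The paper's SPTA-based algorithms are not reproduced here, but a minimal sketch of a shortest-processing-time-style list schedule for a no-wait flow shop with release dates, evaluated by total completion time, may help make the setting concrete; the instance data and the exact dispatching rule are assumptions.

```python
# Sketch of an SPT-style ("shortest total processing time among released jobs")
# list schedule for a no-wait flow shop with release dates. This illustrates the
# setting only; it is not the paper's SPTA algorithms.
def no_wait_spt_schedule(p, r):
    """p[j][k]: processing time of job j on machine k; r[j]: release date of job j."""
    n, m = len(p), len(p[0])
    total = [sum(p[j]) for j in range(n)]
    prefix = [[sum(p[j][:k]) for k in range(m + 1)] for j in range(n)]

    def min_gap(i, j):
        # smallest start-time gap between consecutive jobs i -> j so that job j
        # never waits between machines and never collides with job i
        return max(prefix[i][k + 1] - prefix[j][k] for k in range(m))

    unscheduled = set(range(n))
    order, total_completion = [], 0
    t, last, last_start = 0, None, 0     # t = earliest time a new job could start
    while unscheduled:
        ready = [j for j in unscheduled if r[j] <= t]
        if not ready:
            t = min(r[j] for j in unscheduled)
            continue
        j = min(ready, key=lambda x: total[x])               # SPT among released jobs
        start = max(t, r[j]) if last is None else max(r[j], last_start + min_gap(last, j))
        total_completion += start + total[j]                 # completion of job j
        order.append(j)
        unscheduled.remove(j)
        last, last_start, t = j, start, start
    return order, total_completion

# toy instance: 3 jobs, 2 machines
p = [[3, 2], [1, 4], [2, 2]]
r = [0, 1, 0]
print(no_wait_spt_schedule(p, r))
```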

  14. Asymptotic Analysis of SPTA-Based Algorithms for No-Wait Flow Shop Scheduling Problem with Release Dates

    PubMed Central

    Ren, Tao; Zhang, Chuan; Lin, Lin; Guo, Meiting; Xie, Xionghang

    2014-01-01

    We address the scheduling problem for a no-wait flow shop to optimize total completion time with release dates. Using the tool of asymptotic analysis, we prove that the objective values of two SPTA-based algorithms converge to the optimal value for sufficiently large problems. To further enhance the performance of the SPTA-based algorithms, an improvement scheme based on local search is provided for moderate-scale problems. A new lower bound is presented for evaluating the asymptotic optimality of the algorithms. Numerical simulations demonstrate the effectiveness of the proposed algorithms. PMID:24764774

  15. Three-dimensional murine airway segmentation in micro-CT images

    NASA Astrophysics Data System (ADS)

    Shi, Lijun; Thiesse, Jacqueline; McLennan, Geoffrey; Hoffman, Eric A.; Reinhardt, Joseph M.

    2007-03-01

    Thoracic imaging for small animals has emerged as an important tool for monitoring pulmonary disease progression and therapy response in genetically engineered animals. Micro-CT is becoming the standard thoracic imaging modality in small animal imaging because it can produce high-resolution images of the lung parenchyma, vasculature, and airways. Segmentation, measurement, and visualization of the airway tree is an important step in pulmonary image analysis. However, manual analysis of the airway tree in micro-CT images can be extremely time-consuming since a typical dataset is usually on the order of several gigabytes in size. Automated and semi-automated tools for micro-CT airway analysis are desirable. In this paper, we propose an automatic airway segmentation method for in vivo micro-CT images of the murine lung and validate our method by comparing the automatic results to manual tracing. Our method is based primarily on grayscale morphology. The results show good visual matches between manually segmented and automatically segmented trees. The average true positive volume fraction compared to manual analysis is 91.61%. The overall runtime for the automatic method is on the order of 30 minutes per volume compared to several hours to a few days for manual analysis.
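
    The exact pipeline is not given in the abstract, so the following is only a hedged sketch of a grayscale-morphology segmentation step in the same spirit: a grayscale closing fills thin dark lumens, the difference image highlights them, and the connected component containing a trachea seed is kept. The structuring-element radius, threshold and synthetic volume are assumptions, not the authors' parameters.

```python
# Generic sketch of a grayscale-morphology airway segmentation step (not the
# authors' exact pipeline): close the image to fill thin dark lumens, take the
# difference to highlight them, threshold, and keep the component containing a
# seed voxel placed in the trachea.
import numpy as np
from scipy import ndimage
from skimage import morphology

def segment_airways(volume, seed, radius=3, diff_threshold=200):
    """volume: 3D CT array (air is dark); seed: (z, y, x) voxel inside the trachea."""
    closed = morphology.closing(volume, morphology.ball(radius))   # grayscale closing
    lumen_response = closed - volume        # large where thin dark structures were filled
    candidate = lumen_response > diff_threshold
    labels, _ = ndimage.label(candidate)
    airway_label = labels[seed]
    return labels == airway_label           # binary airway mask

# toy usage on a synthetic volume containing one dark tube
vol = np.full((40, 64, 64), 1000, dtype=np.int16)
vol[:, 30:34, 30:34] = -900                 # synthetic airway lumen
mask = segment_airways(vol, seed=(20, 32, 32))
print(mask.sum(), "voxels segmented")
```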

  16. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies.

    PubMed

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-03-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system.

  17. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies

    PubMed Central

    Dols, W. Stuart.; Emmerich, Steven J.; Polidoro, Brian J.

    2016-01-01

    Building energy analysis tools are available in many forms that provide the ability to address a broad spectrum of energy-related issues in various combinations. Often these tools operate in isolation from one another, making it difficult to evaluate the interactions between related phenomena and interacting systems, forcing oversimplified assumptions to be made about various phenomena that could otherwise be addressed directly with another tool. One example of such interdependence is the interaction between heat transfer, inter-zone airflow and indoor contaminant transport. In order to better address these interdependencies, the National Institute of Standards and Technology (NIST) has developed an updated version of the multi-zone airflow and contaminant transport modelling tool, CONTAM, along with a set of utilities to enable coupling of the full CONTAM model with the TRNSYS simulation tool in a more seamless manner and with additional capabilities that were previously not available. This paper provides an overview of these new capabilities and applies them to simulating a medium-size office building. These simulations address the interaction between whole-building energy, airflow and contaminant transport in evaluating various ventilation strategies including natural and demand-controlled ventilation. Practical Application CONTAM has been in practical use for many years allowing building designers, as well as IAQ and ventilation system analysts, to simulate the complex interactions between building physical layout and HVAC system configuration in determining building airflow and contaminant transport. It has been widely used to design and analyse smoke management systems and evaluate building performance in response to chemical, biological and radiological events. While CONTAM has been used to address design and performance of buildings implementing energy conserving ventilation systems, e.g., natural and hybrid, this new coupled simulation capability will enable users to apply the tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system. PMID:27099405

  18. [Association between productivity, list size, patient and practice characteristics in general practice].

    PubMed

    Olsen, Kim Rose; Sørensen, Torben Højmark; Gyrd-Hansen, Dorte

    2010-04-19

    Due to a shortage of general practitioners, it may be necessary to improve productivity. We assess the association between productivity, list size, and patient and practice characteristics. A regression approach is used to perform a productivity analysis based on national register data and survey data for 1,758 practices. Practices are divided into four groups according to list size and productivity, and statistical tests are used to assess differences in patient and practice characteristics. There is a significant, positive correlation between list size and productivity (p < 0.01). Nevertheless, 19% of the practices have a list size below and a productivity above the mean sample values. These practices have relatively demanding patients (older, low socioeconomic status, high use of pharmaceuticals), are frequently located in areas with limited access to specialized care, and have a low use of assisting personnel. 13% of the practices have a list size above and a productivity below the mean sample values. These practices have relatively less demanding patients, are located in areas with good access to specialized care, and have a high use of assisting personnel. Patient and practice characteristics have a substantial influence on both productivity and list size. Adjusting list size to external factors seems to be an effective tool to increase productivity in general practice.

  19. Effectiveness of patient simulation in nursing education: meta-analysis.

    PubMed

    Shin, Sujin; Park, Jin-Hwa; Kim, Jung-Hee

    2015-01-01

    The use of simulation as an educational tool is becoming increasingly prevalent in nursing education, and a variety of simulators are utilized. Based on the results of these studies, nursing facilitators must find ways to promote effective learning among students in clinical practice and classrooms. To identify the best available evidence about the effects of patient simulation in nursing education through a meta-analysis. This study explores quantitative evidence published in the electronic databases: EBSCO, Medline, ScienceDirect, and ERIC. Using a search strategy, we identified 2503 potentially relevant articles. Twenty studies were included in the final analysis. We found significant post-intervention improvements in various domains for participants who received simulation education compared to the control groups, with a pooled random-effects standardized mean difference of 0.71, which is a medium-to-large effect size. In the subgroup analysis, we found that simulation education in nursing had benefits, in terms of effect sizes, when the effects were evaluated through performance, the evaluation outcome was psychomotor skills, the subject of learning was clinical, learners were clinical nurses and senior undergraduate nursing students, and simulators were high fidelity. These results indicate that simulation education demonstrated medium to large effect sizes and could guide nurse educators with regard to the conditions under which patient simulation is more effective than traditional learning methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
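
    For readers unfamiliar with how such pooled estimates are formed, a minimal sketch of DerSimonian-Laird random-effects pooling of standardized mean differences follows; the per-study effect sizes and variances are invented and are not the review's data.

```python
# Sketch of DerSimonian-Laird random-effects pooling of standardized mean
# differences, the kind of model behind a pooled SMD. Study data are invented.
import numpy as np

yi = np.array([0.45, 0.90, 0.60, 1.10, 0.30])   # per-study SMDs (invented)
vi = np.array([0.04, 0.09, 0.05, 0.12, 0.06])   # per-study variances (invented)

w = 1.0 / vi                                     # fixed-effect weights
y_fixed = np.sum(w * yi) / np.sum(w)
Q = np.sum(w * (yi - y_fixed) ** 2)              # heterogeneity statistic
df = len(yi) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (vi + tau2)                         # random-effects weights
pooled = np.sum(w_re * yi) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled SMD = {pooled:.2f}, 95% CI = ({lo:.2f}, {hi:.2f}), tau^2 = {tau2:.3f}")
```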

  20. Effect of preventive zinc supplementation on linear growth in children under 5 years of age in developing countries: a meta-analysis of studies for input to the lives saved tool

    PubMed Central

    2011-01-01

    Introduction Zinc plays an important role in cellular growth, cellular differentiation and metabolism. The results of previous meta-analyses evaluating the effect of zinc supplementation on linear growth are inconsistent. We have updated and evaluated the available evidence according to Grading of Recommendations, Assessment, Development and Evaluation (GRADE) criteria and tried to explain the differences in results of the previous reviews. Methods A literature search was done on PubMed, the Cochrane Library, the IZiNCG database and WHO regional databases using different terms for zinc and linear growth (height). Data were abstracted in a standardized form and analyzed in two ways, i.e., as weighted mean difference (effect size) and as pooled mean difference for absolute increment in length in centimeters. Random effects models were used for these pooled estimates. We give our recommendations for the effectiveness of zinc supplementation in the form of the absolute increment in length (cm) in the zinc-supplemented group compared to control, for input to the Lives Saved Tool (LiST). Results There were thirty-six studies assessing the effect of zinc supplementation on linear growth in children < 5 years from developing countries. In eleven of these studies, zinc was given in combination with other micronutrients (iron, vitamin A, etc.). The final effect size after pooling all the data sets (zinc ± iron, etc.) showed a significant positive effect of zinc supplementation on linear growth [effect size: 0.13 (95% CI 0.04, 0.21), random model] in developing countries. A subgroup analysis excluding those data sets where zinc was supplemented in combination with iron showed a more pronounced effect of zinc supplementation on linear growth [weighted mean difference 0.19 (95% CI 0.08, 0.30), random model]. A subgroup analysis of studies that reported the actual increase in length (cm) showed that a dose of 10 mg zinc/day for a duration of 24 weeks led to a net gain of 0.37 (±0.25) cm in the zinc-supplemented group compared to placebo. This estimate is recommended for inclusion in the Lives Saved Tool (LiST) model. Conclusions Zinc supplementation has a significant positive effect on linear growth, especially when administered alone, and should be included in national strategies to reduce stunting in children < 5 years of age in developing countries. PMID:21501440

  1. Emission sources estimation of size-segregated suburban aerosols measured in continental part of Balkan region using PMF5.0 multivariate receptor model

    NASA Astrophysics Data System (ADS)

    Petrovic, Srdjan; Đuričić-Milanković, Jelena; Anđelković, Ivan; Pantelić, Ana; Gambaro, Andrea; Đorđević, Dragana

    2017-04-01

    Size-segregated particulate matter was collected with Berner low-pressure cascade impactors in the size ranges 0.27 ≤ Dp ≤ 0.53 μm, 0.53 ≤ Dp ≤ 1.06 μm, 1.06 ≤ Dp ≤ 2.09 μm, 2.09 ≤ Dp ≤ 4.11 μm, 4.11 ≤ Dp ≤ 8.11 μm and 8.11 ≤ Dp ≤ 16 μm. Forty-eight-hour size-segregated particulate matter samples of atmospheric aerosols at a suburban site of Belgrade were collected over two years (2012 to 2013). ICP-MS was used to quantify the following elements: Ag, Al, As, Ba, Be, Ca, Cd, Co, Cr, Cu, Fe, K, Hg, Na, Ni, Mg, Mn, Mo, Pb, Se, Sb, Ti, Tl, V and Zn. To determine the number of sources and their fingerprints, the EPA PMF 5.0 multivariate receptor tool was used. Error estimation methods (bootstrap, displacement, and bootstrap enhanced by displacement) applied to the obtained solutions enabled proper detection of the number and types of sources. The analysis indicated the existence of four main sources contributing to air pollution in the suburban area of Belgrade.
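
    EPA PMF 5.0 is a dedicated receptor-modelling tool; as a rough, freely scriptable stand-in for the underlying idea (factoring a non-negative samples-by-species concentration matrix into source contributions and source profiles), plain non-negative matrix factorization can be sketched as below. This ignores PMF's measurement-uncertainty weighting and error estimation, and all data are synthetic.

```python
# Rough illustration of the receptor-modelling idea: factor a non-negative
# (samples x species) concentration matrix into source contributions and source
# profiles. Plain NMF is only a stand-in for EPA PMF 5.0, not a replacement.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
n_samples, n_species, n_sources = 120, 25, 4
true_G = rng.gamma(2.0, 1.0, (n_samples, n_sources))        # contributions
true_F = rng.gamma(1.0, 1.0, (n_sources, n_species))        # profiles (fingerprints)
X = (true_G @ true_F + rng.normal(0, 0.05, (n_samples, n_species))).clip(min=0)

model = NMF(n_components=n_sources, init="nndsvda", max_iter=1000, random_state=0)
G = model.fit_transform(X)       # estimated source contributions per sample
F = model.components_            # estimated source profiles per species
print("reconstruction error:", round(model.reconstruction_err_, 3))
```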

  2. Global Sensitivity Analysis with Small Sample Sizes: Ordinary Least Squares Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Michael J.; Liu, Wei; Sivaramakrishnan, Raghu

    2016-12-21

    A new version of global sensitivity analysis is developed in this paper. This new version, coupled with tools from statistics, machine learning, and optimization, can devise small sample sizes that allow for the accurate ordering of sensitivity coefficients for the 10-30 most sensitive chemical reactions in complex chemical-kinetic mechanisms, and is particularly useful for studying the chemistry in realistic devices. A key part of the paper is the calibration of these small samples. Because these small sample sizes are developed for use in realistic combustion devices, the calibration is done over the ranges of conditions in such devices, with a test case being the operating conditions of a compression ignition engine studied earlier. Compression ignition engines operate under low-temperature combustion conditions with quite complicated chemistry, making this calibration difficult and leading to the possibility of false positives and false negatives in the ordering of the reactions. An important aspect of the paper is therefore showing how to handle the trade-off between false positives and false negatives using ideas from the multiobjective optimization literature. The combination of the new global sensitivity method and the calibration yields sample sizes approximately a factor of 10 smaller than were available with our previous algorithm.
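
    The calibrated procedure of the paper is not reproduced here, but the basic idea of ordering inputs by an ordinary-least-squares surrogate fitted to a small sample can be sketched as follows; a toy response stands in for a chemical-kinetic mechanism, and the sample size and ranking rule are assumptions.

```python
# Sketch of the basic idea: draw a small sample over the input ranges, fit an
# ordinary-least-squares surrogate, and rank inputs by standardized coefficients.
# A toy response stands in for a chemical-kinetic mechanism.
import numpy as np

rng = np.random.default_rng(7)
n_samples, n_inputs = 60, 10                      # deliberately small sample
X = rng.uniform(-1.0, 1.0, (n_samples, n_inputs))

def toy_model(x):                                 # stand-in for, e.g., ignition delay
    return 3.0 * x[:, 0] - 2.0 * x[:, 3] + 0.5 * x[:, 7] + rng.normal(0, 0.1, len(x))

y = toy_model(X)
A = np.column_stack([np.ones(n_samples), X])      # add intercept
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
sensitivity = np.abs(coef[1:]) * X.std(axis=0) / y.std()   # standardized coefficients
ranking = np.argsort(sensitivity)[::-1]
print("inputs ranked by sensitivity:", ranking[:5])
```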

  3. Inferring species roles in metacommunity structure from species co-occurrence networks

    PubMed Central

    Borthagaray, Ana I.; Arim, Matías; Marquet, Pablo A.

    2014-01-01

    A long-standing question in community ecology is what determines the identity of species that coexist across local communities or metacommunity assembly. To shed light upon this question, we used a network approach to analyse the drivers of species co-occurrence patterns. In particular, we focus on the potential roles of body size and trophic status as determinants of metacommunity cohesion because of their link to resource use and dispersal ability. Small-sized individuals at low-trophic levels, and with limited dispersal potential, are expected to form highly linked subgroups, whereas large-size individuals at higher trophic positions, and with good dispersal potential, will foster the spatial coupling of subgroups and the cohesion of the whole metacommunity. By using modularity analysis, we identified six modules of species with similar responses to ecological conditions and high co-occurrence across local communities. Most species either co-occur with species from a single module or are connectors of the whole network. Among the latter are carnivorous species of intermediate body size, which by virtue of their high incidence provide connectivity to otherwise isolated communities playing the role of spatial couplers. Our study also demonstrates that the incorporation of network tools to the analysis of metacommunity ecology can help unveil the mechanisms underlying patterns and processes in metacommunity assembly. PMID:25143039
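
    A hedged sketch of the modularity step described above, using greedy modularity maximization in networkx on an invented co-occurrence network, is given below; species names, edge weights and the connector criterion are assumptions rather than the study's data or method.

```python
# Generic sketch of module detection on a species co-occurrence network
# (invented data) using greedy modularity maximization. Species whose neighbours
# span several modules would be candidate "spatial couplers".
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [  # (species_a, species_b, number of local communities where both occur)
    ("sp1", "sp2", 8), ("sp1", "sp3", 7), ("sp2", "sp3", 6),
    ("sp4", "sp5", 9), ("sp5", "sp6", 8), ("sp4", "sp6", 7),
    ("sp3", "sp4", 2), ("carnivore", "sp2", 4), ("carnivore", "sp5", 4),
]
G = nx.Graph()
G.add_weighted_edges_from(edges)

modules = greedy_modularity_communities(G, weight="weight")
for i, module in enumerate(modules):
    print(f"module {i}: {sorted(module)}")

# species whose neighbours span more than one module act as connectors
membership = {sp: i for i, m in enumerate(modules) for sp in m}
for sp in G:
    spanned = {membership[nb] for nb in G[sp]}
    if len(spanned) > 1:
        print(sp, "connects modules", spanned)
```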

  4. (Sample) Size Matters: Best Practices for Defining Error in Planktic Foraminiferal Proxy Records

    NASA Astrophysics Data System (ADS)

    Lowery, C.; Fraass, A. J.

    2016-02-01

    Paleoceanographic research is a vital tool to extend modern observational datasets and to study the impact of climate events for which there is no modern analog. Foraminifera are one of the most widely used tools for this type of work, both as paleoecological indicators and as carriers for geochemical proxies. However, the use of microfossils as proxies for paleoceanographic conditions brings about a unique set of problems. This is primarily due to the fact that groups of individual foraminifera, which usually live about a month, are used to infer average conditions for time periods ranging from hundreds to tens of thousands of years. Because of this, adequate sample size is very important for generating statistically robust datasets, particularly for stable isotopes. In the early days of stable isotope geochemistry, instrumental limitations required hundreds of individual foraminiferal tests to return a value. This had the fortunate side-effect of smoothing any seasonal to decadal changes within the planktic foram population. With the advent of more sensitive mass spectrometers, smaller sample sizes have now become standard. While this has many advantages, the use of smaller numbers of individuals to generate a data point has lessened the amount of time averaging in the isotopic analysis and decreased precision in paleoceanographic datasets. With fewer individuals per sample, the differences between individual specimens will result in larger variation, and therefore error, and less precise values for each sample. Unfortunately, most (the authors included) do not make a habit of reporting the error associated with their sample size. We have created an open-source model in R to quantify the effect of sample sizes under various realistic and highly modifiable parameters (calcification depth, diagenesis in a subset of the population, improper identification, vital effects, mass, etc.). For example, a sample in which only 1 in 10 specimens is diagenetically altered can be off by >0.3‰ δ18O VPDB, or 1°C. Here, we demonstrate the use of this tool to quantify error in micropaleontological datasets, and suggest best practices for minimizing error when generating stable isotope data with foraminifera.
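
    The authors' model is written in R; the following is a simplified Python sketch of the same idea, showing how the number of individuals per sample and a diagenetically altered fraction affect the bias and spread of the sample-mean value. All parameter values are invented.

```python
# Simplified Python sketch of the idea behind the authors' R model: simulate how
# sample size and a diagenetically altered fraction affect the sample-mean d18O.
import numpy as np

rng = np.random.default_rng(42)

def simulate_sample_means(n_individuals, n_trials=5000,
                          true_mean=0.0, seasonal_sd=0.5,
                          altered_fraction=0.1, alteration_offset=3.0):
    means = np.empty(n_trials)
    for t in range(n_trials):
        d18o = rng.normal(true_mean, seasonal_sd, n_individuals)
        altered = rng.random(n_individuals) < altered_fraction
        d18o[altered] += alteration_offset          # diagenetic overprint
        means[t] = d18o.mean()
    return means

for n in (5, 10, 30, 100):
    m = simulate_sample_means(n)
    print(f"n = {n:3d}: bias = {m.mean():+.2f} permil, 2 sigma = {2 * m.std():.2f} permil")
```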

  5. Body size distributions signal a regime shift in a lake ...

    EPA Pesticide Factsheets

    Communities of organisms, from mammals to microorganisms, have discontinuous distributions of body size. This pattern of size structuring is a conservative trait of community organization and is a product of processes that occur at multiple spatial and temporal scales. In this study, we assessed whether body size patterns serve as an indicator of a threshold between alternative regimes. Over the past 7000 years, the biological communities of Foy Lake (Montana, USA) have undergone a major regime shift owing to climate change. We used a palaeoecological record of diatom communities to estimate diatom sizes, and then analysed the discontinuous distribution of organism sizes over time. We used Bayesian classification and regression tree models to determine that all time intervals exhibited aggregations of sizes separated by gaps in the distribution and found a significant change in diatom body size distributions approximately 150 years before the identified ecosystem regime shift. We suggest that discontinuity analysis is a useful addition to the suite of tools for the detection of early warning signals of regime shifts.

  6. Exergy analysis of large-scale helium liquefiers: Evaluating design trade-offs

    NASA Astrophysics Data System (ADS)

    Thomas, Rijo Jacob; Ghosh, Parthasarathi; Chowdhury, Kanchan

    2014-01-01

    It is known that larger heat exchanger area, more expanders with higher efficiency, and a more involved configuration with a multi-pressure compression system increase the plant efficiency of a helium liquefier. However, they involve higher capital investment and larger size. Using the simulation software Aspen Hysys v 7.0 and exergy analysis as the tool of analysis, the authors have attempted to identify various trade-offs in selecting the number of stages, the pressure levels in the compressor, the cold-end configuration, the heat exchanger surface area, the maximum allowable pressure drop in heat exchangers, the efficiency of expanders, the parallel/series connection of expanders, etc. Use of more efficient cold ends reduces the number of refrigeration stages and the size of the plant. For achieving reliability along with performance, a configuration combining an expander and a Joule-Thomson valve is found to be a better choice for the cold end. Use of a multi-pressure system is relevant only when the number of refrigeration stages is more than 5. Arranging expanders in series reduces the number of expanders as well as the heat exchanger size at a slight expense of plant efficiency. A superior heat exchanger (having less pressure drop per unit heat transfer area) results in only a 5% increase in plant performance even when it has 100% more heat exchanger surface area.

  7. Exploring Antarctic Land Surface Temperature Extremes Using Condensed Anomaly Databases

    NASA Astrophysics Data System (ADS)

    Grant, Glenn Edwin

    Satellite observations have revolutionized the Earth Sciences and climate studies. However, data and imagery continue to accumulate at an accelerating rate, and efficient tools for data discovery, analysis, and quality checking lag behind. In particular, studies of long-term, continental-scale processes at high spatiotemporal resolutions are especially problematic. The traditional technique of downloading an entire dataset and using customized analysis code is often impractical or consumes too many resources. The Condensate Database Project was envisioned as an alternative method for data exploration and quality checking. The project's premise was that much of the data in any satellite dataset is unneeded and can be eliminated, compacting massive datasets into more manageable sizes. Dataset sizes are further reduced by retaining only anomalous data of high interest. Hosting the resulting "condensed" datasets in high-speed databases enables immediate availability for queries and exploration. Proof of the project's success relied on demonstrating that the anomaly database methods can enhance and accelerate scientific investigations. The hypothesis of this dissertation is that the condensed datasets are effective tools for exploring many scientific questions, spurring further investigations and revealing important information that might otherwise remain undetected. This dissertation uses condensed databases containing 17 years of Antarctic land surface temperature anomalies as its primary data. The study demonstrates the utility of the condensate database methods by discovering new information. In particular, the process revealed critical quality problems in the source satellite data. The results are used as the starting point for four case studies, investigating Antarctic temperature extremes, cloud detection errors, and the teleconnections between Antarctic temperature anomalies and climate indices. The results confirm the hypothesis that the condensate databases are a highly useful tool for Earth Science analyses. Moreover, the quality checking capabilities provide an important method for independent evaluation of dataset veracity.

  8. Discrimination of particulate matter emission sources using stochastic methods

    NASA Astrophysics Data System (ADS)

    Szczurek, Andrzej; Maciejewska, Monika; Wyłomańska, Agnieszka; Sikora, Grzegorz; Balcerek, Michał; Teuerle, Marek

    2016-12-01

    Particulate matter (PM) is one of the criteria pollutants that has been determined to be harmful to public health and the environment. For this reason, the ability to recognize its emission sources is very important. There are a number of measurement methods that allow PM to be characterized in terms of concentration, particle size distribution, and chemical composition. All this information is useful for establishing a link between the dust found in the air, its emission sources, and its influence on humans as well as the environment. However, these methods are typically quite sophisticated and not applicable outside laboratories. In this work, we considered a PM emission source discrimination method based on continuous measurements of PM concentration with a relatively cheap instrument and stochastic analysis of the obtained data. The stochastic analysis focuses on the temporal variation of PM concentration and involves two steps: (1) recognition of the category of distribution for the data, i.e. stable or the domain of attraction of a stable distribution, and (2) finding the best-matching distribution out of the Gaussian, stable and normal-inverse Gaussian (NIG) distributions. We examined six PM emission sources. They were associated with material processing in an industrial environment, namely machining and welding of aluminum, forged carbon steel and plastic with various tools. As shown by the obtained results, PM emission sources may be distinguished based on the statistical distribution of PM concentration variations. The major factor responsible for the differences detectable with our method was the type of material processing and the tool applied. When different materials were processed with the same tool, distinguishing emission sources was difficult. For successful discrimination it was crucial to consider size-segregated mass fraction concentrations. In our opinion the presented approach is very promising and deserves further study and development.
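
    A hedged sketch of the distribution-matching step (fitting Gaussian and normal-inverse Gaussian models to concentration increments and comparing goodness of fit) is given below; the data are synthetic, and the stable-law classification and size-segregated fractions of the real method are omitted.

```python
# Generic sketch of the distribution-matching step: fit Gaussian and
# normal-inverse Gaussian (NIG) models to PM concentration increments and
# compare goodness of fit on synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
increments = stats.norminvgauss.rvs(a=1.5, b=0.4, loc=0.0, scale=1.0,
                                    size=2000, random_state=rng)

fits = {
    "gaussian": (stats.norm, stats.norm.fit(increments)),
    "nig": (stats.norminvgauss, stats.norminvgauss.fit(increments)),
}
for name, (dist, params) in fits.items():
    ks = stats.kstest(increments, dist.cdf, args=params)
    print(f"{name:8s}: KS statistic = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
```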

  9. Screening hypochromism (sieve effect) in red blood cells: a quantitative analysis

    PubMed Central

    Razi Naqvi, K.

    2014-01-01

    Multiwavelength UV-visible spectroscopy, Kramers-Kronig analysis, and several other experimental and theoretical tools have been applied over the last several decades to fathom absorption and scattering of light by suspensions of micron-sized pigmented particles, including red blood cells, but a satisfactory quantitative analysis of the difference between the absorption spectra of suspension of intact and lysed red blood cells is still lacking. It is stressed that such a comparison is meaningful only if the pertinent spectra are free from, or have been corrected for, scattering losses, and it is shown that Duysens’ theory can, whereas that of Vekshin cannot, account satisfactorily for the observed hypochromism of suspensions of red blood cells. PMID:24761307

  10. Screening hypochromism (sieve effect) in red blood cells: a quantitative analysis.

    PubMed

    Razi Naqvi, K

    2014-04-01

    Multiwavelength UV-visible spectroscopy, Kramers-Kronig analysis, and several other experimental and theoretical tools have been applied over the last several decades to fathom absorption and scattering of light by suspensions of micron-sized pigmented particles, including red blood cells, but a satisfactory quantitative analysis of the difference between the absorption spectra of suspension of intact and lysed red blood cells is still lacking. It is stressed that such a comparison is meaningful only if the pertinent spectra are free from, or have been corrected for, scattering losses, and it is shown that Duysens' theory can, whereas that of Vekshin cannot, account satisfactorily for the observed hypochromism of suspensions of red blood cells.

  11. Automated detection and analysis of particle beams in laser-plasma accelerator simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ushizima, Daniela Mayumi; Geddes, C.G.; Cormier-Michel, E.

    Numerical simulations of laser-plasma wakefield (particle) accelerators model the acceleration of electrons trapped in plasma oscillations (wakes) left behind when an intense laser pulse propagates through the plasma. The goal of these simulations is to better understand the processes involved in plasma wake generation and how electrons are trapped and accelerated by the wake. Such accelerators offer high accelerating gradients, potentially reducing the size and cost of new accelerators, which motivates efforts to understand and develop them. One operating regime of interest is where a trapped subset of electrons loads the wake and forms an isolated group of accelerated particles with low spread in momentum and position, desirable characteristics for many applications. The electrons trapped in the wake may be accelerated to high energies, with the plasma gradient in the wake reaching up to a gigaelectronvolt per centimeter. High-energy electron accelerators power intense radiation sources from X-rays to terahertz, and are used in many applications including medical radiotherapy and imaging. To extract information from the simulation about the quality of the beam, a typical approach is to examine plots of the entire dataset, visually determining the adequate parameters necessary to select a subset of particles, which is then further analyzed. This procedure requires laborious examination of massive data sets over many time steps using several plots, a routine that is infeasible for large data collections. Demand for automated analysis is growing along with the volume and size of simulations. Current 2D LWFA simulation datasets are typically between 1 GB and 100 GB in size, but simulations in 3D are of the order of terabytes. The increase in the number of datasets and in dataset sizes leads to a need for automatic routines to recognize particle patterns as particle bunches (beams of electrons) for subsequent analysis. Because of the growth in dataset size, the application of machine learning techniques for scientific data mining is increasingly considered. In plasma simulations, Bagherjeiran et al. presented a comprehensive report on applying graph-based techniques for orbit classification. They used the KAM classifier to label points and components in single and multiple orbits. Love et al. conducted an image space analysis of coherent structures in plasma simulations. They used a number of segmentation and region-growing techniques to isolate regions of interest in orbit plots. Both approaches analyzed particle accelerator data, targeting the system dynamics in terms of particle orbits. However, they did not address particle dynamics as a function of time or inspect the behavior of bunches of particles. Ruebel et al. addressed the visual analysis of massive laser wakefield acceleration (LWFA) simulation data using interactive procedures to query the data. Sophisticated visualization tools were provided to inspect the data manually. Ruebel et al. integrated these tools into the visualization and analysis system VisIt, in addition to utilizing efficient data management based on HDF5, H5Part, and the index/query tool FastBit. Ruebel et al. also proposed automatic beam path analysis using a suite of methods to classify particles in simulation data and to analyze their temporal evolution. To enable researchers to accurately define particle beams, the method computes a set of measures based on the path of particles relative to the distance of the particles to a beam.
To achieve good performance, this framework uses an analysis pipeline designed to quickly reduce the amount of data that needs to be considered in the actual path distance computation. As part of this process, region-growing methods are utilized to detect particle bunches at single time steps. Efficient data reduction is essential to enable automated analysis of large data sets as described in the next section, where data reduction methods are steered to the particular requirements of our clustering analysis. Previously, we described the application of a set of algorithms to automate the data analysis and classification of particle beams in LWFA simulation data, identifying locations with a high density of high-energy particles. These algorithms detected high-density locations (nodes) in each time step, i.e. maximum points on the particle distribution for only one spatial variable. Each node was correlated to a node in previous or later time steps by linking these nodes according to a pruned minimum spanning tree (PMST). We call the PMST representation a 'lifetime diagram', a graphical tool that shows the temporal evolution of high-density groups of particles in the longitudinal direction across the time series. Electron bunch compactness was described by another step of the processing, designed to partition each time step, using fuzzy clustering, into a fixed number of clusters.

  12. AN Fitting Reconditioning Tool

    NASA Technical Reports Server (NTRS)

    Lopez, Jason

    2011-01-01

    A tool was developed to repair or replace AN fittings on the shuttle external tank (ET). (The AN thread is a type of fitting used to connect flexible hoses and rigid metal tubing that carry fluid. It is a U.S. military-derived specification agreed upon by the Army and Navy, hence AN.) The tool is used on a drill and is guided by a pilot shaft that follows the inside bore. The cutting edge of the tool is a standard-size replaceable insert. In the typical Post Launch Maintenance/Repair process for the AN fittings, the six fittings are removed from the ET's GUCP (ground umbilical carrier plate) for reconditioning. The fittings are inspected for damage to the sealing surface per standard operations maintenance instructions. When damage is found on the sealing surface, the condition is documented. A new AN reconditioning tool is set up to cut and remove the surface damage. The fitting is then inspected to verify it still meets drawing requirements. The tool features a cone-shaped interior at 36.5°, and may be set to a precise angle with go/no-go gauges to ensure that the cutting edge can be adjusted as it wears down. One tool, one setting block, and one go/no-go gauge were fabricated. At the time of this reporting, the tool has reconditioned/returned to spec 36 AN fittings with 100-percent success (no leakage). This tool provides a quick solution for repairing a leaky AN fitting, and it could easily be modified with different-sized pilot shafts to fit different-sized fittings.

  13. Expanding fluvial remote sensing to the riverscape: Mapping depth and grain size on the Merced River, California

    NASA Astrophysics Data System (ADS)

    Richardson, Ryan T.

    This study builds upon recent research in the field of fluvial remote sensing by applying techniques for mapping physical attributes of rivers. Depth, velocity, and grain size are primary controls on the types of habitat present in fluvial ecosystems. This thesis focuses on expanding fluvial remote sensing to larger spatial extents and sub-meter resolutions, which will increase our ability to capture the spatial heterogeneity of habitat at a resolution relevant to individual salmonids and an extent relevant to species. This thesis consists of two chapters, one focusing on expanding the spatial extent over which depth can be mapped using Optimal Band Ratio Analysis (OBRA) and the other developing general relations for mapping grain size from three-dimensional topographic point clouds. The two chapters are independent but connected by the overarching goal of providing scientists and managers more useful tools for quantifying the amount and quality of salmonid habitat via remote sensing. The OBRA chapter highlights the true power of remote sensing to map depths from hyperspectral images as a central component of watershed scale analysis, while also acknowledging the great challenges involved with increasing spatial extent. The grain size mapping chapter establishes the first general relations for mapping grain size from roughness using point clouds. These relations will significantly reduce the time needed in the field by eliminating the need for independent measurements of grain size for calibrating the roughness-grain size relationship and thus making grain size mapping with SFM more cost effective for river restoration and monitoring. More data from future studies are needed to refine these relations and establish their validity and generality. In conclusion, this study adds to the rapidly growing field of fluvial remote sensing and could facilitate river research and restoration.
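
    As a hedged illustration of the band-ratio idea behind OBRA (not the thesis workflow), the sketch below searches all band pairs for the log-ratio that best predicts field-measured depths on synthetic data; the band count, attenuation coefficient and depths are assumptions.

```python
# Minimal sketch of the Optimal Band Ratio Analysis (OBRA) idea: for every pair
# of spectral bands, regress field depths against ln(b_i / b_j) and keep the
# pair with the highest R^2. Spectra and depths are synthetic.
import numpy as np

rng = np.random.default_rng(11)
n_points, n_bands = 200, 12
depth = rng.uniform(0.2, 3.0, n_points)                 # field depths (m), synthetic
spectra = rng.uniform(0.05, 0.2, (n_points, n_bands))
spectra[:, 4] *= np.exp(-0.8 * depth)                   # make one band depth-sensitive

best = None
for i in range(n_bands):
    for j in range(n_bands):
        if i == j:
            continue
        x = np.log(spectra[:, i] / spectra[:, j])
        slope, intercept = np.polyfit(x, depth, 1)
        r2 = np.corrcoef(x, depth)[0, 1] ** 2
        if best is None or r2 > best[0]:
            best = (r2, i, j, slope, intercept)

r2, i, j, slope, intercept = best
print(f"best ratio: band {i}/band {j}, R^2 = {r2:.2f}, depth = {slope:.2f}*X + {intercept:.2f}")
```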

  14. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    PubMed

    Thomas, Philipp; Matuschek, Hannes; Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license.
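
    For orientation, the standard textbook form of van Kampen's ansatz and the Linear Noise Approximation used by such tools can be written as below; the notation (stoichiometric matrix S, macroscopic rate vector f, system size Omega) is generic and not taken from the paper.

```latex
% Standard form of van Kampen's ansatz and the Linear Noise Approximation (LNA),
% in generic notation (S: stoichiometric matrix, f: macroscopic rates,
% Omega: system size). This is textbook material, not text from the paper.
\[
  \frac{\mathbf{n}}{\Omega} = \boldsymbol{\phi}(t) + \Omega^{-1/2}\,\boldsymbol{\epsilon},
  \qquad
  \frac{d\boldsymbol{\phi}}{dt} = S\,\mathbf{f}(\boldsymbol{\phi}),
\]
\[
  \frac{d\Sigma}{dt} = J\,\Sigma + \Sigma\,J^{\mathsf T} + D,
  \qquad
  J_{ik} = \frac{\partial\,[S\,\mathbf{f}]_i}{\partial \phi_k},
  \qquad
  D = S\,\mathrm{diag}\!\big(\mathbf{f}(\boldsymbol{\phi})\big)\,S^{\mathsf T},
\]
% so to leading order the fluctuations are Gaussian, with covariance Sigma/Omega
% about the solution of the macroscopic rate equations.
```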

  15. Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    PubMed Central

    Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen’s system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA’s performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with circadian rhythms. The software iNA is freely available as executable binaries for Linux, MacOSX and Microsoft Windows, as well as the full source code under an open source license. PMID:22723865

  16. What Is the Proportion of Studies Reporting Patient and Practitioner Satisfaction with Software Support Tools Used in the Management of Knee Pain and Is This Related to Sample Size, Effect Size, and Journal Impact Factor?

    PubMed

    Bright, Philip; Hambly, Karen

    2017-12-21

    E-health software tools have been deployed in managing knee conditions. Reporting of patient and practitioner satisfaction in studies of e-health usage has not been widely explored. The objective of this review was to identify studies describing patient and practitioner satisfaction with software use concerning knee pain. A computerized search was undertaken: four electronic databases were searched from January 2007 until January 2017. Key words were decision dashboard, clinical decision, Web-based resource, evidence support, and knee. Full texts were scanned for effect size reporting and satisfaction scales from participants and practitioners. Binary regression was run, with impact factor and sample size as predictors and indicators for satisfaction and effect size reporting as dependent variables. Seventy-seven articles were retrieved; 37 studies were included in the final analysis. Ten studies reported patient satisfaction ratings (27.8%); a single study reported both patient and practitioner satisfaction (2.8%). Randomized controlled trials were the most common design (35%) and knee osteoarthritis the most prevalent condition (38%). Electronic patient-reported outcome measures and Web-based training were the most common interventions. No significant dependency was found within the regression models (p > 0.05). The proportion of studies reporting patient satisfaction was low, and practitioner satisfaction was poorly represented. There may be implications for the suitability of administering e-health; a medium for capturing further meta-evidence needs to be established and used as best practice for future studies in this area. This is the first review of its kind to address patient and practitioner satisfaction with knee e-health.

  17. Tool for Generation of MAC/GMC Representative Unit Cell for CMC/PMC Analysis

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Pineda, Evan J.

    2016-01-01

    This document describes a recently developed analysis tool that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) 4.0. This tool is especially useful in analyzing ceramic matrix composites (CMCs), where higher fidelity with improved accuracy of local response is needed. The tool, however, can be used for analyzing polymer matrix composites (PMCs) as well. MAC/GMC 4.0 is a composite material and laminate analysis software developed at NASA Glenn Research Center. The software package has been built around the concept of the generalized method of cells (GMC). The computer code is developed with a user friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermomechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that generates a number of different user-defined repeating unit cells (RUCs). In addition, the code has provisions for generation of a MAC/GMC-compatible input text file that can be merged with any MAC/GMC input file tailored to analyze composite materials. Although the primary intention was to address the three different constituents and phases that are usually present in CMCs-namely, fibers, matrix, and interphase-it can be easily modified to address two-phase polymer matrix composite (PMC) materials where an interphase is absent. Currently, the tool capability includes generation of RUCs for square packing, hexagonal packing, and random fiber packing as well as RUCs based on actual composite micrographs. All these options have the fibers modeled as having a circular cross-sectional area. In addition, a simplified version of RUC is provided where the fibers are treated as having a square cross section and are distributed randomly. This RUC facilitates a speedy analysis using the higher fidelity version of GMC known as HFGMC. The first four mentioned options above support uniform subcell discretization. The last one has variable subcell sizes due to the primary intention of keeping the RUC size to a minimum to gain the speed ups using the higher fidelity version of MAC. The code is implemented within the MATLAB (The Mathworks, Inc., Natick, MA) developmental framework; however, a standalone application that does not need a priori MATLAB installation is also created with the aid of the MATLAB compiler.

  18. Powder compression mechanics of spray-dried lactose nanocomposites.

    PubMed

    Hellrup, Joel; Nordström, Josefina; Mahlin, Denny

    2017-02-25

    The aim of this study was to investigate the structural impact of nanofiller incorporation on the powder compression mechanics of spray-dried lactose. The lactose was co-spray-dried with three different nanofillers, that is, cellulose nanocrystals, sodium montmorillonite and fumed silica, which led to low-micron-sized nanocomposite particles with varying structure and morphology. The powder compression mechanics of the nanocomposites and of physical mixtures of the neat spray-dried components were evaluated by a rational evaluation method with compression analysis as a tool, using the Kawakita equation and the Shapiro-Konopicky-Heckel equation. Particle rearrangement dominated the initial compression profiles due to the small particle size of the materials. The strong contribution of particle rearrangement in the materials with fumed silica continued throughout the whole compression profile, which prevented an in-depth material characterization. However, the lactose/cellulose nanocrystal and lactose/sodium montmorillonite nanocomposites demonstrated a high yield pressure compared with the physical mixtures, indicating increased particle hardness upon composite formation. This increase likely has to do with reinforcement of the nanocomposite particles by skeleton formation of the nanoparticles. In summary, the rational evaluation of mechanical properties by powder compression analysis proved to be a valuable tool for this type of spray-dried composite material, unless the material demonstrates particle rearrangement throughout the whole compression profile. Copyright © 2016 Elsevier B.V. All rights reserved.
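
    For reference, the standard forms of the two compression equations named above are sketched below in generic notation; the parameter symbols are conventional and no values from the study are implied.

```latex
% Standard forms of the two compression models named above, in generic notation
% (C: relative volume reduction, P: applied pressure, D: relative density).
\[
  \text{Kawakita:}\qquad
  C = \frac{V_0 - V}{V_0} = \frac{a\,b\,P}{1 + b\,P},
  \qquad
  \frac{P}{C} = \frac{P}{a} + \frac{1}{a\,b},
\]
\[
  \text{Shapiro--Konopicky--Heckel:}\qquad
  \ln\!\left(\frac{1}{1 - D}\right) = K\,P + A,
\]
% where a and b are the Kawakita parameters (a reflecting total engaged
% compressibility), 1/K is commonly read as the yield pressure, and A reflects
% initial packing and rearrangement.
```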

  19. Data mining for rapid prediction of facility fit and debottlenecking of biomanufacturing facilities.

    PubMed

    Yang, Yang; Farid, Suzanne S; Thornhill, Nina F

    2014-06-10

    Higher titre processes can pose facility fit challenges in legacy biopharmaceutical purification suites with capacities originally matched to lower titre processes. Bottlenecks caused by mismatches in equipment sizes, combined with process fluctuations upon scale-up, can result in discarding expensive product. This paper describes a data mining decisional tool for rapid prediction of facility fit issues and debottlenecking of biomanufacturing facilities exposed to batch-to-batch variability and higher titres. The predictive tool comprised advanced multivariate analysis techniques to interrogate Monte Carlo stochastic simulation datasets that mimicked batch fluctuations in cell culture titres, step yields and chromatography eluate volumes. A decision tree classification method, CART (classification and regression tree) was introduced to explore the impact of these process fluctuations on product mass loss and reveal the root causes of bottlenecks. The resulting pictorial decision tree determined a series of if-then rules for the critical combinations of factors that lead to different mass loss levels. Three different debottlenecking strategies were investigated involving changes to equipment sizes, using higher capacity chromatography resins and elution buffer optimisation. The analysis compared the impact of each strategy on mass output, direct cost of goods per gram and processing time, as well as consideration of extra capital investment and space requirements. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
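
    A hedged sketch of the general approach (Monte Carlo batches with fluctuating titre, yield and eluate volume, labelled against a hypothetical capacity limit and fed to a classification tree that exposes if-then rules) follows; all numbers, the capacity value and the volume scaling are invented.

```python
# Generic sketch of the approach described above: Monte Carlo batches with
# fluctuating titre, step yield and eluate volume are labelled by whether they
# exceed a hypothetical fixed-size downstream tank, and a classification tree
# recovers if-then rules for mass loss. All numbers are invented.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(5)
n_batches = 2000
titre = rng.normal(5.0, 1.0, n_batches)            # g/L
step_yield = rng.normal(0.85, 0.05, n_batches)     # fraction
eluate_volume = rng.normal(1.2, 0.2, n_batches)    # relative to nominal

tank_capacity = 1.5                                 # hypothetical volume limit
product_volume = eluate_volume * titre / 5.0        # crude volume scaling
mass_loss = (product_volume > tank_capacity).astype(int)

X = np.column_stack([titre, step_yield, eluate_volume])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, mass_loss)
print(export_text(tree, feature_names=["titre", "step_yield", "eluate_volume"]))
```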

  20. Tissue enrichment analysis for C. elegans genomics.

    PubMed

    Angeles-Albores, David; N Lee, Raymond Y; Chan, Juancarlos; Sternberg, Paul W

    2016-09-13

    Over the last ten years, there has been explosive development in methods for measuring gene expression. These methods can identify thousands of genes altered between conditions, but understanding these datasets and forming hypotheses based on them remains challenging. One way to analyze these datasets is to associate ontologies (hierarchical, descriptive vocabularies with controlled relations between terms) with genes and to look for enrichment of specific terms. Although the Gene Ontology (GO) is available for Caenorhabditis elegans, it does not include anatomical information. We have developed a tool for identifying enrichment of C. elegans tissues among gene sets and a website GUI where users can access this tool. Since a common drawback of ontology enrichment analyses is their verbosity, we developed a very simple filtering algorithm to reduce the ontology size by an order of magnitude. We adjusted these filters and validated our tool using a set of 30 gold standards from Expression Cluster data in WormBase. We show that our tool can discriminate between embryonic and larval tissues and can identify tissues down to the single-cell level. We used our tool to identify multiple neuronal tissues that are down-regulated due to pathogen infection in C. elegans. Our Tissue Enrichment Analysis (TEA) tool can be found within WormBase and can be downloaded using Python's standard pip installer. It tests a slimmed-down C. elegans tissue ontology for enrichment of specific terms and provides users with a text and graphic representation of the results.
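
    The TEA implementation and its API are not reproduced here; as a generic illustration of the underlying term-enrichment test, a hypergeometric tail probability on invented annotation counts can be computed as follows.

```python
# Generic sketch of a term-enrichment test (hypergeometric tail probability),
# not the TEA implementation or its API. Annotation counts are invented.
from scipy.stats import hypergeom

def term_enrichment(n_genome, n_term, n_list, n_overlap):
    """P(overlap >= n_overlap) for a term annotated to n_term of n_genome genes,
    given a query list of n_list genes with n_overlap hits."""
    return hypergeom.sf(n_overlap - 1, n_genome, n_term, n_list)

# e.g. 20,000 genes, a tissue term annotated to 300 of them, and a query list of
# 150 genes of which 12 carry the term
p = term_enrichment(20000, 300, 150, 12)
print(f"enrichment p-value = {p:.2e}")
```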

  1. Percutaneous Radiofrequency Ablation of Colorectal Cancer Liver Metastases: Factors Affecting Outcomes—A 10-year Experience at a Single Center

    PubMed Central

    Shady, Waleed; Petre, Elena N.; Gonen, Mithat; Erinjeri, Joseph P.; Brown, Karen T.; Covey, Anne M.; Alago, William; Durack, Jeremy C.; Maybody, Majid; Brody, Lynn A.; Siegelbaum, Robert H.; D’Angelica, Michael I.; Jarnagin, William R.; Solomon, Stephen B.; Kemeny, Nancy E.

    2016-01-01

    Purpose To identify predictors of oncologic outcomes after percutaneous radiofrequency ablation (RFA) of colorectal cancer liver metastases (CLMs) and to describe and evaluate a modified clinical risk score (CRS) adapted for ablation as a patient stratification and prognostic tool. Materials and Methods This study consisted of a HIPAA-compliant institutional review board–approved retrospective review of data in 162 patients with 233 CLMs treated with percutaneous RFA between December 2002 and December 2012. Contrast material–enhanced CT was used to assess technique effectiveness 4–8 weeks after RFA. Patients were followed up with contrast-enhanced CT every 2–4 months. Overall survival (OS) and local tumor progression–free survival (LTPFS) were calculated from the time of RFA by using the Kaplan-Meier method. Log-rank tests and Cox regression models were used for univariate and multivariate analysis to identify predictors of outcomes. Results Technique effectiveness was 94% (218 of 233). Median LTPFS was 26 months. At univariate analysis, predictors of shorter LTPFS were tumor size greater than 3 cm (P < .001), ablation margin size of 5 mm or less (P < .001), high modified CRS (P = .009), male sex (P = .03), and no history of prior hepatectomy (P = .04) or hepatic arterial infusion chemotherapy (P = .01). At multivariate analysis, only tumor size greater than 3 cm (P = .01) and margin size of 5 mm or less (P < .001) were independent predictors of shorter LTPFS. Median and 5-year OS were 36 months and 31%. At univariate analysis, predictors of shorter OS were tumor size larger than 3 cm (P = .005), carcinoembryonic antigen level greater than 30 ng/mL (P = .003), high modified CRS (P = .02), and extrahepatic disease (EHD) (P < .001). At multivariate analysis, tumor size greater than 3 cm (P = .006) and more than one site of EHD (P < .001) were independent predictors of shorter OS. Conclusion Tumor size of less than 3 cm and ablation margins greater than 5 mm are essential for satisfactory local tumor control. Tumor size of more than 3 cm and the presence of more than one site of EHD are associated with shorter OS. © RSNA, 2015 PMID:26267832
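
    The survival analysis workflow reported here (Kaplan-Meier estimates and Cox proportional hazards models) can be outlined with the open-source lifelines package. The tiny per-lesion table below is invented for illustration; it only mirrors the two predictors retained at multivariate analysis.

    ```python
    import pandas as pd
    from lifelines import KaplanMeierFitter, CoxPHFitter

    # Hypothetical per-lesion data: follow-up (months), progression flag and
    # the two predictors retained at multivariate analysis in the paper.
    df = pd.DataFrame({
        "months":        [6, 12, 18, 26, 30, 40, 44, 60],
        "progressed":    [1, 1, 0, 1, 0, 1, 0, 0],
        "tumor_gt_3cm":  [1, 0, 1, 1, 0, 0, 1, 0],
        "margin_le_5mm": [1, 1, 0, 0, 1, 0, 1, 0],
    })

    kmf = KaplanMeierFitter().fit(df["months"], df["progressed"])
    print(kmf.median_survival_time_)     # Kaplan-Meier estimate of median LTPFS

    cph = CoxPHFitter().fit(df, duration_col="months", event_col="progressed")
    cph.print_summary()                  # hazard ratios for the two predictors
    ```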

  2. Monitoring the Stability of Perfluorocarbon Nanoemulsions by Cryo-TEM Image Analysis and Dynamic Light Scattering

    PubMed Central

    Grapentin, Christoph; Barnert, Sabine; Schubert, Rolf

    2015-01-01

    Perfluorocarbon nanoemulsions (PFC-NE) are disperse systems consisting of nanoscale liquid perfluorocarbon droplets stabilized by an emulsifier, usually phospholipids. Perfluorocarbons are chemically inert and non-toxic substances that are exhaled after in vivo administration. The manufacture of PFC-NE can be done at large scale by means of high pressure homogenization or microfluidization. Originally investigated as oxygen carriers for cases of severe blood loss, their application nowadays is more focused on their use as marker agents in 19F Magnetic Resonance Imaging (19F MRI). 19F is scarce in organisms, and thus PFC-NE are a promising tool for highly specific and non-invasive imaging of inflammation via 19F MRI. Neutrophils, monocytes and macrophages phagocytize PFC-NE and subsequently migrate to inflamed tissues. This technique has proven feasible in numerous disease models in mice, rabbits and mini pigs. Translation to clinical trials in humans requires the development of a stable nanoemulsion whose droplet size is well characterized over a long storage time. Usually dynamic light scattering (DLS) is applied as the standard method for determining particle sizes in the nanometer range. Our study uses a second method, analysis of transmission electron microscopy images of cryo-fixed samples (Cryo-TEM), to evaluate the stability of PFC-NE in comparison to DLS. Four nanoemulsions of different composition were observed for one year. The results indicate that DLS alone cannot reveal the changes in particle size and can even give a misleadingly positive estimate of stability. The combination with Cryo-TEM images gives more insight into the particulate evolution, both techniques supporting one another. The study is one further step in the development of analytical tools for the evaluation of a clinically applicable perfluorooctylbromide nanoemulsion. PMID:26098661
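
    The caution about DLS follows from intensity weighting: scattering intensity rises steeply with droplet diameter, so a numerous small-droplet population can disappear from an intensity-weighted mean while dominating the number-weighted one. A minimal numerical illustration, assuming a synthetic bimodal population and a simple d**6 (Rayleigh-regime) weighting:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical bimodal droplet population (diameters in nm): many small
    # droplets plus a minority of large ones, as Cryo-TEM might reveal.
    diameters = np.concatenate([
        rng.normal(60, 8, 900),     # small population
        rng.normal(180, 20, 100),   # large population
    ])

    number_mean = diameters.mean()

    # DLS intensity scales roughly with d**6, so the intensity-weighted mean
    # is dominated by the large droplets and hides the small population.
    w = diameters ** 6
    intensity_weighted_mean = (w * diameters).sum() / w.sum()

    print(f"number-weighted mean:    {number_mean:6.1f} nm")
    print(f"intensity-weighted mean: {intensity_weighted_mean:6.1f} nm")
    ```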

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horiike, S.; Okazaki, Y.

    This paper describes a performance estimation tool developed for modeling and simulation of open distributed energy management systems to support their design. The approach of discrete event simulation with detailed models is considered for efficient performance estimation. The tool includes basic models constituting a platform, e.g., Ethernet, communication protocol, operating system, etc. Application software is modeled by specifying CPU time, disk access size, communication data size, etc. Different types of system configurations for various system activities can be easily studied. Simulation examples show how the tool is utilized for the efficient design of open distributed energy management systems.

  4. Increasing marketability and profitability of product line thru PATRAN and NASTRAN

    NASA Technical Reports Server (NTRS)

    Hyatt, Art

    1989-01-01

    Starting with the design objective, the operational cycle life of the Swaging Tool was increased. To accomplish this increase in cycle life without increasing the size or weight of the tool would be an engineering achievement. However, not only was the operational cycle life increased by 2 to 10 times, but simultaneously the size and weight of the Swaging Tool were decreased by about 50 percent. This accomplishment now becomes an outstanding engineering achievement. This achievement was only possible because of the computerized PATRAN, NASTRAN and MEDUSA programs.

  5. Morphometric Identification of Queens, Workers and Intermediates in In Vitro Reared Honey Bees (Apis mellifera).

    PubMed

    De Souza, Daiana A; Wang, Ying; Kaftanoglu, Osman; De Jong, David; Amdam, Gro V; Gonçalves, Lionel S; Francoy, Tiago M

    2015-01-01

    In vitro rearing is an important and useful tool for honey bee (Apis mellifera L.) studies. However, it often results in intercastes between queens and workers, which are normally not seen in hive-reared bees, except when larvae older than three days are grafted for queen rearing. Morphological classification (queen versus worker or intercaste) of bees produced by this method can be subjective and generally depends on size differences. Here, we propose an alternative method for caste classification of female honey bees reared in vitro, based on weight at emergence, ovariole number, spermatheca size, and size and shape features of the head, mandible and basitarsus. Morphological measurements were made with both traditional morphometric and geometric morphometric techniques. The classifications were performed by principal component analysis, using naturally developed queens and workers as controls. First, the analysis included all the characters. Subsequently, a new analysis was made without the information about ovariole number and spermatheca size. Geometric morphometrics was less dependent on ovariole number and spermatheca information for caste and intercaste identification. This is useful, since acquiring information concerning these reproductive structures requires time-consuming dissection, and they are not accessible when abdomens have been removed for molecular assays or in dried specimens. Additionally, geometric morphometrics divided intercastes into more discrete phenotype subsets. We conclude that geometric morphometrics is superior to traditional morphometric techniques for identification and classification of honey bee castes and intermediates.
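
    The classification step (principal component analysis of morphometric measurements, with naturally reared queens and workers as reference groups) can be sketched with scikit-learn. The measurements and values below are hypothetical; only the project-and-compare pattern is taken from the abstract.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Hypothetical morphometric matrix: rows are bees, columns are measurements
    # (e.g. emergence weight, head width, mandible length, basitarsus length).
    rng = np.random.default_rng(42)
    queens  = rng.normal([200, 3.8, 2.1, 2.0], 0.1, size=(20, 4))
    workers = rng.normal([110, 3.4, 1.8, 2.4], 0.1, size=(20, 4))
    unknown = rng.normal([160, 3.6, 1.9, 2.2], 0.1, size=(10, 4))  # in vitro reared

    pca = PCA(n_components=2).fit(np.vstack([queens, workers]))    # controls only
    q_scores, w_scores, u_scores = (pca.transform(x) for x in (queens, workers, unknown))

    # Individuals falling between the control clusters on PC1 would be flagged
    # as intercastes; here we simply report the PC1 ranges of each group.
    for name, s in [("queens", q_scores), ("workers", w_scores), ("in vitro", u_scores)]:
        print(name, "PC1 range:", s[:, 0].min().round(1), "to", s[:, 0].max().round(1))
    ```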

  6. Design of experiments-based monitoring of critical quality attributes for the spray-drying process of insulin by NIR spectroscopy.

    PubMed

    Maltesen, Morten Jonas; van de Weert, Marco; Grohganz, Holger

    2012-09-01

    Moisture content and aerodynamic particle size are critical quality attributes for spray-dried protein formulations. In this study, spray-dried insulin powders intended for pulmonary delivery were produced applying design of experiments methodology. Near infrared spectroscopy (NIR) in combination with preprocessing and multivariate analysis in the form of partial least squares projections to latent structures (PLS) was used to correlate the spectral data with moisture content and aerodynamic particle size measured by a time-of-flight principle. PLS models predicting the moisture content were based on the chemical information of the water molecules in the NIR spectrum. Models yielded prediction errors (RMSEP) between 0.39% and 0.48%, with thermogravimetric analysis used as the reference method. The PLS models predicting the aerodynamic particle size were based on baseline offset in the NIR spectra and yielded prediction errors between 0.27 and 0.48 μm. The morphology of the spray-dried particles had a significant impact on the predictive ability of the models. Good predictive models could be obtained for spherical particles with a calibration error (RMSECV) of 0.22 μm, whereas wrinkled particles resulted in much less robust models with a Q² of 0.69. Based on the results of this study, NIR is a suitable tool for process analysis of the spray-drying process and for control of moisture content and particle size, in particular for smooth and spherical particles.
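
    The chemometric core of the study is PLS regression from NIR spectra to a reference value, with RMSEP as the figure of merit. A minimal sketch with scikit-learn, assuming synthetic spectra in which a pseudo water band scales with moisture content (none of the values are taken from the paper):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error

    # Hypothetical NIR data: 60 spray-dried batches x 500 wavelength channels,
    # with moisture content (%) as the reference value (e.g. from TGA).
    rng = np.random.default_rng(0)
    moisture = rng.uniform(1.0, 6.0, 60)
    water_band = np.exp(-0.5 * ((np.arange(500) - 350) / 20.0) ** 2)  # pseudo peak
    spectra = moisture[:, None] * water_band + rng.normal(0, 0.05, (60, 500))

    X_cal, X_test, y_cal, y_test = train_test_split(spectra, moisture, random_state=1)

    pls = PLSRegression(n_components=3).fit(X_cal, y_cal)
    rmsep = mean_squared_error(y_test, pls.predict(X_test).ravel()) ** 0.5
    print(f"RMSEP = {rmsep:.2f} % moisture")
    ```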

  7. Morphometric Identification of Queens, Workers and Intermediates in In Vitro Reared Honey Bees (Apis mellifera)

    PubMed Central

    De Souza, Daiana A; Wang, Ying; Kaftanoglu, Osman; De Jong, David; Amdam, Gro V; Gonçalves, Lionel S; Francoy, Tiago M

    2015-01-01

    In vitro rearing is an important and useful tool for honey bee (Apis mellifera L.) studies. However, it often results in intercastes between queens and workers, which are normally not seen in hive-reared bees, except when larvae older than three days are grafted for queen rearing. Morphological classification (queen versus worker or intercaste) of bees produced by this method can be subjective and generally depends on size differences. Here, we propose an alternative method for caste classification of female honey bees reared in vitro, based on weight at emergence, ovariole number, spermatheca size, and size and shape features of the head, mandible and basitarsus. Morphological measurements were made with both traditional morphometric and geometric morphometric techniques. The classifications were performed by principal component analysis, using naturally developed queens and workers as controls. First, the analysis included all the characters. Subsequently, a new analysis was made without the information about ovariole number and spermatheca size. Geometric morphometrics was less dependent on ovariole number and spermatheca information for caste and intercaste identification. This is useful, since acquiring information concerning these reproductive structures requires time-consuming dissection, and they are not accessible when abdomens have been removed for molecular assays or in dried specimens. Additionally, geometric morphometrics divided intercastes into more discrete phenotype subsets. We conclude that geometric morphometrics is superior to traditional morphometric techniques for identification and classification of honey bee castes and intermediates. PMID:25894528

  8. Advances in two photon scanning and scanless microscopy technologies for functional neural circuit imaging.

    PubMed

    Schultz, Simon R; Copeland, Caroline S; Foust, Amanda J; Quicke, Peter; Schuck, Renaud

    2017-01-01

    Recent years have seen substantial developments in technology for imaging neural circuits, raising the prospect of large scale imaging studies of neural populations involved in information processing, with the potential to lead to step changes in our understanding of brain function and dysfunction. In this article we will review some key recent advances: improved fluorophores for single cell resolution functional neuroimaging using a two photon microscope; improved approaches to the problem of scanning active circuits; and the prospect of scanless microscopes which overcome some of the bandwidth limitations of current imaging techniques. These advances in technology for experimental neuroscience have in themselves led to technical challenges, such as the need for the development of novel signal processing and data analysis tools in order to make the most of the new experimental tools. We review recent work in some active topics, such as region of interest segmentation algorithms capable of demixing overlapping signals, and new highly accurate algorithms for calcium transient detection. These advances motivate the development of new data analysis tools capable of dealing with spatial or spatiotemporal patterns of neural activity, that scale well with pattern size.

  9. Desktop microsimulation: a tool to improve efficiency in the medical office practice.

    PubMed

    Montgomery, James B; Linville, Beth A; Slonim, Anthony D

    2013-01-01

    Because the economic crisis in the United States continues to have an impact on healthcare organizations, industry leaders must optimize their decision making. Discrete-event computer simulation is a quality tool with a demonstrated track record of improving the precision of analysis for process redesign. However, the use of simulation to consolidate practices and design efficiencies into an unfinished medical office building was a unique task. A discrete-event computer simulation package was used to model the operations and forecast future results for four orthopedic surgery practices. The scenarios were created to allow an evaluation of the impact of process change on the output variables of exam room utilization, patient queue size, and staff utilization. The model helped with decisions regarding space allocation and efficient exam room use by demonstrating the impact of process changes in patient queues at check-in/out, x-ray, and cast room locations when compared to the status quo model. The analysis impacted decisions on facility layout, patient flow, and staff functions in this newly consolidated practice. Simulation was found to be a useful tool for process redesign and decision making even prior to building occupancy. © 2011 National Association for Healthcare Quality.
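
    Discrete-event models of this kind can be prototyped with the open-source simpy package (a hedged stand-in here, not the commercial simulation software used in the study). The sketch models patients queueing for a fixed number of exam rooms; the arrival interval, visit length and room count are arbitrary scenario values.

    ```python
    import statistics
    import simpy

    WAITS = []

    def patient(env, exam_rooms):
        arrival = env.now
        with exam_rooms.request() as room:        # queue for a free exam room
            yield room
            WAITS.append(env.now - arrival)       # minutes spent waiting
            yield env.timeout(20)                 # visit length, minutes

    def arrivals(env, exam_rooms, interarrival=7):
        while True:
            yield env.timeout(interarrival)
            env.process(patient(env, exam_rooms))

    env = simpy.Environment()
    exam_rooms = simpy.Resource(env, capacity=3)  # scenario variable to change
    env.process(arrivals(env, exam_rooms))
    env.run(until=8 * 60)                         # one 8-hour clinic day

    print(f"patients seen: {len(WAITS)}, mean wait: {statistics.mean(WAITS):.1f} min")
    ```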

  10. Advances in two photon scanning and scanless microscopy technologies for functional neural circuit imaging

    PubMed Central

    Schultz, Simon R.; Copeland, Caroline S.; Foust, Amanda J.; Quicke, Peter; Schuck, Renaud

    2017-01-01

    Recent years have seen substantial developments in technology for imaging neural circuits, raising the prospect of large scale imaging studies of neural populations involved in information processing, with the potential to lead to step changes in our understanding of brain function and dysfunction. In this article we will review some key recent advances: improved fluorophores for single cell resolution functional neuroimaging using a two photon microscope; improved approaches to the problem of scanning active circuits; and the prospect of scanless microscopes which overcome some of the bandwidth limitations of current imaging techniques. These advances in technology for experimental neuroscience have in themselves led to technical challenges, such as the need for the development of novel signal processing and data analysis tools in order to make the most of the new experimental tools. We review recent work in some active topics, such as region of interest segmentation algorithms capable of demixing overlapping signals, and new highly accurate algorithms for calcium transient detection. These advances motivate the development of new data analysis tools capable of dealing with spatial or spatiotemporal patterns of neural activity, that scale well with pattern size. PMID:28757657

  11. Critical Pitfalls in the use of BRAF Mutation as a Diagnostic Tool in Thyroid Nodules: a Case Report.

    PubMed

    Kuhn, Elisabetta; Ragazzi, Moira; Zini, Michele; Giordano, Davide; Nicoli, Davide; Piana, Simonetta

    2016-09-01

    Thyroid fine-needle aspiration (FNA) cytology is the primary tool for the diagnostic evaluation of thyroid nodules. BRAF mutation analysis is employed as an ancillary tool in indeterminate cases, as recommended by the American Thyroid Association management guidelines. Here, we report the case of a 73-year-old woman who presented with an 8-mm, ill-defined left thyroid nodule. FNA was reported as "suspicious for papillary thyroid carcinoma". BRAF mutation status was analyzed, and a somatic BRAF (V600E) mutation was identified. The patient underwent a total thyroidectomy. At histological examination, the nodule was composed of Langerhans cells admixed with many eosinophils. A final diagnosis of Langerhans cell histiocytosis of the thyroid was made. Our case emphasizes the critical diagnostic pitfalls of using BRAF (V600E) mutation analysis in thyroid FNA. Notably, BRAF (V600E) mutation is common in melanoma, colorectal carcinoma, lung carcinoma, ovarian carcinoma, brain tumors, hairy cell leukemia, multiple myeloma, and histiocytoses. Therefore, in cases of indeterminate FNA with BRAF (V600E)-mutated unclassifiable atypical cells, the possibility of a localization of histiocytosis or a secondary thyroid malignancy should be taken into account.

  12. Surface modification of AISI H13 tool steel by laser cladding with NiTi powder

    NASA Astrophysics Data System (ADS)

    Norhafzan, B.; Aqida, S. N.; Chikarakara, E.; Brabazon, D.

    2016-04-01

    This paper presents laser cladding of NiTi powder on an AISI H13 tool steel surface for enhancement of surface properties. The cladding process was conducted using a Rofin DC-015 diffusion-cooled CO2 laser system with a wavelength of 10.6 µm. NiTi powder was pre-placed on the H13 tool steel surface. The laser beam was focused to a spot size of 90 µm on the sample surface. Laser parameters were set to 1515 and 1138 W peak power, 18 and 24 % duty cycle, and 2300-3500 Hz laser pulse repetition frequency. Hardness of the modified layer was characterized with a Wilson hardness tester. Metallographic study and chemical composition analysis were conducted using a field emission scanning electron microscope and energy-dispersive X-ray spectrometry (EDXS). Results showed that the hardness of the NiTi clad layer was three times that of the substrate material. The EDXS analysis detected NiTi phase presence in the modified layer of up to 9.8 wt%. The metallographic study showed strong metallurgical bonding between the substrate and the modified layer. These findings are significant for increasing both the hardness and erosion resistance of high-wear components and extending their lifetime.

  13. A catalog of automated analysis methods for enterprise models.

    PubMed

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered research into automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  14. An architecture for genomics analysis in a clinical setting using Galaxy and Docker

    PubMed Central

    Digan, W; Countouris, H; Barritault, M; Baudoin, D; Laurent-Puig, P; Blons, H; Burgun, A

    2017-01-01

    Next-generation sequencing is used on a daily basis to perform molecular analysis to determine subtypes of disease (e.g., in cancer) and to assist in the selection of the optimal treatment. Clinical bioinformatics handles the manipulation of the data generated by the sequencer, from generation to analysis and interpretation. Reproducibility and traceability are crucial issues in a clinical setting. We have designed an approach based on Docker container technology and Galaxy, the popular open-source bioinformatics analysis support software. Our solution simplifies the deployment of a small-size analytical platform and simplifies the process for the clinician. From the technical point of view, the tools embedded in the platform are isolated and versioned through Docker images. Alongside the Galaxy platform, we also introduce the AnalysisManager, a solution that allows single-click analysis for biologists and leverages standardized bioinformatics application programming interfaces. We added a Shiny/R interactive environment to ease the visualization of the outputs. The platform relies on containers and ensures data traceability by recording analytical actions and by associating inputs and outputs of the tools with the EDAM ontology through ReGaTe. The source code is freely available on GitHub at https://github.com/CARPEM/GalaxyDocker. PMID:29048555

  15. An architecture for genomics analysis in a clinical setting using Galaxy and Docker.

    PubMed

    Digan, W; Countouris, H; Barritault, M; Baudoin, D; Laurent-Puig, P; Blons, H; Burgun, A; Rance, B

    2017-11-01

    Next-generation sequencing is used on a daily basis to perform molecular analysis to determine subtypes of disease (e.g., in cancer) and to assist in the selection of the optimal treatment. Clinical bioinformatics handles the manipulation of the data generated by the sequencer, from generation to analysis and interpretation. Reproducibility and traceability are crucial issues in a clinical setting. We have designed an approach based on Docker container technology and Galaxy, the popular open-source bioinformatics analysis support software. Our solution simplifies the deployment of a small-size analytical platform and simplifies the process for the clinician. From the technical point of view, the tools embedded in the platform are isolated and versioned through Docker images. Alongside the Galaxy platform, we also introduce the AnalysisManager, a solution that allows single-click analysis for biologists and leverages standardized bioinformatics application programming interfaces. We added a Shiny/R interactive environment to ease the visualization of the outputs. The platform relies on containers and ensures data traceability by recording analytical actions and by associating inputs and outputs of the tools with the EDAM ontology through ReGaTe. The source code is freely available on GitHub at https://github.com/CARPEM/GalaxyDocker. © The Author 2017. Published by Oxford University Press.

  16. ARM Data File Standards Version: 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kehoe, Kenneth; Beus, Sherman; Cialella, Alice

    2014-04-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth atmosphere in diverse climate regimes. The result is a diverse collection of data sets containing observational and derived data, currently accumulating at a rate of 30 TB of data and 150,000 different files per month (http://www.archive.arm.gov/stats/storage2.html). Continuing the current processing while scaling it to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever-growing volumes of data. They will also enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and facilitate development of future capabilities for delivering data on demand that can be tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy that includes required and recommended standards.

  17. Factor Analysis of the Modified Sexual Adjustment Questionnaire-Male

    PubMed Central

    Wilmoth, Margaret C.; Hanlon, Alexandra L.; Ng, Lit Soo; Bruner, Debra W.

    2015-01-01

    Background and Purpose The Sexual Adjustment Questionnaire (SAQ) is used in National Cancer Institute–sponsored clinical trials as an outcome measure for sexual functioning. The tool was revised to meet the need for a clinically useful, theory-based outcome measure for use in both research and clinical settings. This report describes the modifications and validity testing of the modified Sexual Adjustment Questionnaire-Male (mSAQ-Male). Methods This secondary analysis of data from a large Radiation Therapy Oncology Group trial employed principal axis factor analytic techniques in estimating the validity of the revised tool. The sample size was 686; most subjects were White, older than 60 years, with a high school education and a Karnofsky performance scale (KPS) score greater than 90. Results A 16-item, 3-factor solution resulted from the factor analysis. The mSAQ-Male was also found to be sensitive to changes in physical sexual functioning as measured by the KPS. Conclusion The mSAQ-Male is a valid self-report measure of sexuality that can be used clinically to detect changes in male sexual functioning. PMID:25255676

  18. Forensic microradiology: micro-computed tomography (Micro-CT) and analysis of patterned injuries inside of bone.

    PubMed

    Thali, Michael J; Taubenreuther, Ulrike; Karolczak, Marek; Braun, Marcel; Brueschweiler, Walter; Kalender, Willi A; Dirnhofer, Richard

    2003-11-01

    When a knife is stabbed into bone, it leaves an impression in the bone. The characteristics of that impression (shape, size, etc.) may indicate the type of tool used to produce the patterned injury. Until now it has been impossible in the forensic sciences to document such damage precisely and non-destructively. Micro-computed tomography (Micro-CT) offers an opportunity to analyze patterned injuries of tool marks made in bone. Using high-resolution Micro-CT and computer software, detailed analysis of three-dimensional (3D) architecture has recently become feasible and allows microstructural 3D bone information to be collected. With adequate viewing software, data from a 2D slice in an arbitrary plane can be extracted from 3D datasets. Using such software as a "digital virtual knife," the examiner can interactively section and analyze the 3D sample. Analysis of the bone injury revealed that Micro-CT provides an opportunity to correlate a bone injury with an injury-causing instrument. Even broken knife tips can be graphically and non-destructively assigned to a suspect weapon.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palanisamy, Giri

    The U.S. Department of Energy (DOE)’s Atmospheric Radiation Measurement (ARM) Climate Research Facility performs routine in situ and remote-sensing observations to provide a detailed and accurate description of the Earth atmosphere in diverse climate regimes. The result is a huge archive of diverse data sets containing observational and derived data, currently accumulating at a rate of 30 terabytes (TB) of data and 150,000 different files per month (http://www.archive.arm.gov/stats/). Continuing the current processing while scaling it to even larger sizes is extremely important to the ARM Facility and requires consistent metadata and data standards. The standards described in this document will enable development of automated analysis and discovery tools for the ever-growing data volumes. They will enable consistent analysis of the multiyear data, allow for development of automated monitoring and data health status tools, and allow future capabilities for delivering data on demand that can be tailored explicitly to user needs. This analysis ability will only be possible if the data follow a minimum set of standards. This document proposes a hierarchy of required and recommended standards.

  20. An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation

    NASA Astrophysics Data System (ADS)

    Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi

    2015-04-01

    Rockfalls are frequent instability processes in road cuts, open pit mines and quarries, steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter of both their orientations and strength parameters. The attitude and persistency of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, the rock block will eventually split into several fragments during its propagation downhill due to impacts with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. This tool includes common modes of motion for falling boulders based on the previous literature. The final tool is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development will be simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after a prior adjustment of the parameters. After the adjustment of the model parameters to a given area, a simulation could be performed to obtain maps of kinetic energy, frequency, stopping density and passing heights. This GIS-based tool and the analysis of fragmentation laws using data collected from recent rockfalls are being developed within the RockRisk Project (2014-2016). This project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).
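
    As a hedged illustration of the fragmentation idea, the sketch below splits an unstable block into fragments whose relative sizes are drawn from an assumed power-law distribution and converts each fragment's mass and impact velocity into kinetic energy. The distribution, exponent and values are illustrative assumptions only; the project calibrates its own fragmentation laws from field data.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def fragment_masses(block_mass_kg, n_fragments, exponent=1.5):
        """Split a block into n fragments whose relative sizes follow a power
        law (an assumption made here purely for illustration)."""
        w = rng.pareto(exponent, n_fragments) + 1.0
        return block_mass_kg * w / w.sum()

    def kinetic_energy_kj(mass_kg, velocity_ms):
        return 0.5 * mass_kg * velocity_ms ** 2 / 1000.0

    block = 2000.0                   # kg, kinematically unstable volume
    v_at_impact = 15.0               # m/s, taken from a propagation model
    for m in fragment_masses(block, 4):
        print(f"fragment {m:7.1f} kg -> {kinetic_energy_kj(m, v_at_impact):7.1f} kJ")
    ```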

  1. Integrated solar energy system optimization

    NASA Astrophysics Data System (ADS)

    Young, S. K.

    1982-11-01

    The computer program SYSOPT, intended as a tool for optimizing the subsystem sizing, performance, and economics of integrated wind and solar energy systems, is presented. The modular structure of the methodology additionally allows simulations when the solar subsystems are combined with conventional technologies, e.g., a utility grid. Hourly energy/mass flow balances are computed for interconnection points, yielding optimized sizing and time-dependent operation of the various subsystems. The program requires meteorological data, such as insolation, diurnal and seasonal variations, and wind speed at the hub height of a wind turbine, all of which can be taken from simulations such as the TRNSYS program. Examples are provided for the optimization of a solar-powered (wind turbine and parabolic trough-Rankine generator) desalination plant, and a design analysis for a solar-powered greenhouse.

  2. Strain-Based Damage Determination Using Finite Element Analysis for Structural Health Management

    NASA Technical Reports Server (NTRS)

    Hochhalter, Jacob D.; Krishnamurthy, Thiagaraja; Aguilo, Miguel A.

    2016-01-01

    A damage determination method is presented that relies on in-service strain sensor measurements. The method employs a gradient-based optimization procedure combined with the finite element method for solution to the forward problem. It is demonstrated that strains, measured at a limited number of sensors, can be used to accurately determine the location, size, and orientation of damage. Numerical examples are presented to demonstrate the general procedure. This work is motivated by the need to provide structural health management systems with a real-time damage characterization. The damage cases investigated herein are characteristic of point-source damage, which can attain critical size during flight. The procedure described can be used to provide prognosis tools with the current damage configuration.
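
    The inverse problem described (finding the damage location and size that best explain strains measured at a few sensors) reduces to gradient-based least squares wrapped around a forward model. The sketch below uses SciPy with a smooth surrogate standing in for the finite element forward solve; the sensor layout, strain model and parameter values are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Hypothetical surrogate for the finite element forward problem: predicted
    # strains at 8 sensors as a smooth function of damage (x, y, size).
    SENSORS = np.array([[i % 4, i // 4] for i in range(8)], dtype=float)

    def predicted_strains(params):
        x, y, size = params
        d = np.linalg.norm(SENSORS - [x, y], axis=1)
        return 1e-3 * (1.0 + size * np.exp(-d))   # strain concentration near damage

    # "Measured" strains generated from a known damage state plus sensor noise
    true_params = np.array([2.2, 0.7, 0.8])
    measured = predicted_strains(true_params) + np.random.default_rng(0).normal(0, 1e-6, 8)

    # Gradient-based least-squares identification of the damage parameters
    objective = lambda p: np.sum((predicted_strains(p) - measured) ** 2)
    result = minimize(objective, x0=[1.0, 1.0, 0.1], method="L-BFGS-B")
    print("identified damage (x, y, size):", result.x.round(2))
    ```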

  3. Critical appraisal of fundamental items in approved clinical trial research proposals in Mashhad University of Medical Sciences

    PubMed Central

    Shakeri, Mohammad-Taghi; Taghipour, Ali; Sadeghi, Masoumeh; Nezami, Hossein; Amirabadizadeh, Ali-Reza; Bonakchi, Hossein

    2017-01-01

    Background: Writing, designing, and conducting a clinical trial research proposal has an important role in achieving valid and reliable findings. Thus, this study aimed at critically appraising fundamental information in approved clinical trial research proposals in Mashhad University of Medical Sciences (MUMS) from 2008 to 2014. Methods: This cross-sectional study was conducted on all 935 approved clinical trial research proposals in MUMS from 2008 to 2014. A valid, reliable, comprehensive, simple, and usable checklist consisting of 11 main items, developed in sessions with biostatisticians and methodologists, was used as the research tool. Agreement between the reviewers of the proposals, who were responsible for data collection, was assessed during 3 sessions, and the kappa statistic calculated at the last session was 97%. Results: More than 60% of the research proposals had a methodologist consultant; moreover, the type of study or study design was specified in almost all of them (98%). Appropriateness of study aims with hypotheses was not observed in a significant number of research proposals (585 proposals, 62.6%). The required sample size for 66.8% of the approved proposals was based on a sample size formula; however, in 25% of the proposals, the sample size formula was not in accordance with the study design. The data collection tool was not selected appropriately in 55.2% of the approved research proposals. The type and method of randomization were unknown in 21% of the proposals, and dealing with missing data had not been described in most of them (98%). Inclusion and exclusion criteria were fully and adequately explained in 92% of the proposals. Moreover, 44% and 31% of the research proposals were moderate and weak in rank, respectively, with respect to the correctness of the statistical analysis methods. Conclusion: Findings of the present study revealed that a large portion of the approved proposals were highly biased or ambiguous with respect to randomization, blinding, dealing with missing data, data collection tools, sampling methods, and statistical analysis. Thus, it is essential to consult and collaborate with a methodologist in all parts of a proposal to control the possible and specific biases in clinical trials. PMID:29445703
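
    The sample size checks described rest on standard closed-form formulae. A minimal example for a two-sided, two-sample comparison of means, assuming normal approximations and equal group sizes (the numbers are illustrative):

    ```python
    from math import ceil
    from scipy.stats import norm

    def n_per_group(delta, sigma, alpha=0.05, power=0.80):
        """Sample size per group for a two-sided, two-sample comparison of means:
        n = 2 * (z_{1-alpha/2} + z_{power})^2 * sigma^2 / delta^2."""
        z_a = norm.ppf(1 - alpha / 2)
        z_b = norm.ppf(power)
        return ceil(2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2)

    # e.g. detecting a 5-unit difference with SD 12 at 80% power
    print(n_per_group(delta=5, sigma=12))   # about 91 participants per group
    ```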

  4. Critical appraisal of fundamental items in approved clinical trial research proposals in Mashhad University of Medical Sciences.

    PubMed

    Shakeri, Mohammad-Taghi; Taghipour, Ali; Sadeghi, Masoumeh; Nezami, Hossein; Amirabadizadeh, Ali-Reza; Bonakchi, Hossein

    2017-01-01

    Background: Writing, designing, and conducting a clinical trial research proposal has an important role in achieving valid and reliable findings. Thus, this study aimed at critically appraising fundamental information in approved clinical trial research proposals in Mashhad University of Medical Sciences (MUMS) from 2008 to 2014. Methods: This cross-sectional study was conducted on all 935 approved clinical trial research proposals in MUMS from 2008 to 2014. A valid, reliable, comprehensive, simple, and usable checklist consisting of 11 main items, developed in sessions with biostatisticians and methodologists, was used as the research tool. Agreement between the reviewers of the proposals, who were responsible for data collection, was assessed during 3 sessions, and the kappa statistic calculated at the last session was 97%. Results: More than 60% of the research proposals had a methodologist consultant; moreover, the type of study or study design was specified in almost all of them (98%). Appropriateness of study aims with hypotheses was not observed in a significant number of research proposals (585 proposals, 62.6%). The required sample size for 66.8% of the approved proposals was based on a sample size formula; however, in 25% of the proposals, the sample size formula was not in accordance with the study design. The data collection tool was not selected appropriately in 55.2% of the approved research proposals. The type and method of randomization were unknown in 21% of the proposals, and dealing with missing data had not been described in most of them (98%). Inclusion and exclusion criteria were fully and adequately explained in 92% of the proposals. Moreover, 44% and 31% of the research proposals were moderate and weak in rank, respectively, with respect to the correctness of the statistical analysis methods. Conclusion: Findings of the present study revealed that a large portion of the approved proposals were highly biased or ambiguous with respect to randomization, blinding, dealing with missing data, data collection tools, sampling methods, and statistical analysis. Thus, it is essential to consult and collaborate with a methodologist in all parts of a proposal to control the possible and specific biases in clinical trials.

  5. Combining gas-phase electrophoretic mobility molecular analysis (GEMMA), light scattering, field flow fractionation and cryo electron microscopy in a multidimensional approach to characterize liposomal carrier vesicles

    PubMed Central

    Gondikas, Andreas; von der Kammer, Frank; Hofmann, Thilo; Marchetti-Deschmann, Martina; Allmaier, Günter; Marko-Varga, György; Andersson, Roland

    2017-01-01

    For drug delivery, characterization of liposomes regarding size, particle number concentration, occurrence of low-sized liposome artefacts and drug encapsulation is of importance for understanding their pharmacodynamic properties. In our study, we aimed to demonstrate the applicability of the nano Electrospray Gas-Phase Electrophoretic Mobility Molecular Analyser (nES GEMMA) as a suitable technique for analyzing these parameters. We measured number-based particle concentrations, identified differences in size between nominally identical liposomal samples, and detected the presence of low-diameter material which yielded bimodal particle size distributions. Subsequently, we compared these findings to dynamic light scattering (DLS) data and results from light scattering experiments coupled to Asymmetric Flow Field-Flow Fractionation (AF4), the latter improving the detectability of smaller particles in polydisperse samples due to a size separation step prior to detection. However, the bimodal size distribution could not be detected due to method-inherent limitations. In contrast, cryo transmission electron microscopy corroborated the nES GEMMA results. Hence, gas-phase electrophoresis proved to be a versatile tool for liposome characterization, as it could analyze both vesicle size and size distribution. Finally, a correlation of nES GEMMA results with cell viability experiments was carried out to demonstrate the importance of liposome batch-to-batch control, as low-sized sample components possibly impact cell viability. PMID:27639623

  6. Discussion about the use of the volume specific surface area (VSSA) as a criterion to identify nanomaterials according to the EU definition. Part two: experimental approach.

    PubMed

    Lecloux, André J; Atluri, Rambabu; Kolen'ko, Yury V; Deepak, Francis Leonard

    2017-10-12

    The first part of this study was dedicated to modelling the influence of particle shape, porosity and particle size distribution on volume specific surface area (VSSA) values in order to check the applicability of this concept to the identification of nanomaterials according to the European Commission Recommendation. In this second part, experimental VSSA values are obtained for various samples from nitrogen adsorption isotherms, and these values are used as a screening tool to identify and classify nanomaterials. These identification results are compared to the identification based on the criterion of 50% of particles with a size below 100 nm, applied to the experimental particle size distributions obtained by analysis of electron microscopy images of the same materials. It is concluded that the experimental VSSA values are able to identify nanomaterials, without false negative identification, if they have a mono-modal particle size, if the adsorption data cover the relative pressure range from 0.001 to 0.65 and if a simple, qualitative image of the particles by transmission or scanning electron microscopy is available to define their shape. The experimental conditions needed to obtain reliable adsorption data, as well as the way to analyze the adsorption isotherms, are described and discussed in some detail in order to help the reader in using the experimental VSSA criterion. To obtain the experimental VSSA values, the BET surface area can be used for non-porous particles, but for porous, nanostructured or coated nanoparticles, only the external surface of the particles, obtained by a modified t-plot approach, should be considered to determine the experimental VSSA and to avoid false positive identification of nanomaterials, only the external surface area being related to the particle size. Finally, the availability of experimental VSSA values together with particle size distributions obtained by electron microscopy gave the opportunity to check the representativeness of the two models described in the first part of this study. They were also used to calculate VSSA values, and these calculated values were compared to the experimental results. For narrow particle size distributions, both models give similar VSSA values quite comparable to the experimental ones. But when the particle size distribution broadens or becomes bi- or multi-modal, as theoretically predicted, one model leads to VSSA values higher than the experimental ones while the other most often leads to VSSA values lower than the experimental ones. The experimental VSSA approach thus appears to be a reliable, simple screening tool to identify nano- and non-nano-materials. The modelling approach cannot be used as a formal identification tool but could be useful to screen for potential effects of shape, polydispersity and size, for example to compare various possible nanoforms.
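
    Once a surface area (BET, or the external t-plot surface for porous particles) and a skeletal density are available, the VSSA screening value itself is a one-line calculation. A minimal sketch with illustrative values, using the 60 m2/cm3 screening threshold referred to in the EC Recommendation:

    ```python
    def vssa_m2_per_cm3(surface_area_m2_per_g, skeletal_density_g_per_cm3):
        """Volume specific surface area: VSSA = SSA (m2/g) x density (g/cm3)."""
        return surface_area_m2_per_g * skeletal_density_g_per_cm3

    # Illustrative values: 30 m2/g external surface area, density 2.2 g/cm3
    vssa = vssa_m2_per_cm3(30.0, 2.2)
    flag = "nanomaterial candidate" if vssa > 60 else "not flagged by VSSA"
    print(f"VSSA = {vssa:.0f} m2/cm3 -> {flag}")
    ```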

  7. Improving Flood Risk Management for California's Central Valley: How the State Developed a Toolbox for Large, System-wide Studies

    NASA Astrophysics Data System (ADS)

    Pingel, N.; Liang, Y.; Bindra, A.

    2016-12-01

    More than 1 million Californians live and work in the floodplains of the Sacramento-San Joaquin Valley where flood risks are among the highest in the nation. In response to this threat to people, property and the environment, the Department of Water Resources (DWR) has been called to action to improve flood risk management. This has transpired through significant advances in development of flood information and tools, analysis, and planning. Senate Bill 5 directed DWR to prepare the Central Valley Flood Protection Plan (CVFPP) and update it every 5 years. A key component of this aggressive planning approach is answering the question: What is the current flood risk, and how would proposed improvements change flood risk throughout the system? Answering this question is a substantial challenge due to the size and complexity of the watershed and flood control system. The watershed is roughly 42,000 sq mi, and flows are controlled by numerous reservoirs, bypasses, and levees. To overcome this challenge, the State invested in development of a comprehensive analysis "tool box" through various DWR programs. Development of the tool box included: collection of hydro-meteorological, topographic, geotechnical, and economic data; development of rainfall-runoff, reservoir operation, hydraulic routing, and flood risk analysis models; and development of specialized applications and computing schemes to accelerate the analysis. With this toolbox, DWR is analyzing flood hazard, flood control system performance, exposure and vulnerability of people and property to flooding, consequence of flooding for specific events, and finally flood risk for a range of CVFPP alternatives. Based on the results, DWR will put forward a State Recommended Plan in the 2017 CVFPP. Further, the value of the analysis tool box extends beyond the CVFPP. It will serve as a foundation for other flood studies for years to come and has already been successfully applied for inundation mapping to support emergency response, reservoir operation analysis, and others.

  8. LoRTE: Detecting transposon-induced genomic variants using low coverage PacBio long read sequences.

    PubMed

    Disdero, Eric; Filée, Jonathan

    2017-01-01

    Population genomic analysis of transposable elements has greatly benefited from recent advances in sequencing technologies. However, the short length of the reads and the propensity of transposable elements to nest in highly repeated regions of genomes limit the efficiency of bioinformatic tools when Illumina or 454 technologies are used. Fortunately, long read sequencing technologies generating read lengths that may span the entire length of full transposons are now available. However, existing TE population genomic software was not designed to handle long reads, and the development of new dedicated tools is needed. LoRTE is the first tool able to use PacBio long read sequences to identify transposon deletions and insertions between a reference genome and genomes of different strains or populations. Tested against simulated and genuine Drosophila melanogaster PacBio datasets, LoRTE appears to be a reliable and broadly applicable tool to study the dynamics and evolutionary impact of transposable elements using low coverage, long read sequences. LoRTE is an efficient and accurate tool to identify structural genomic variants caused by TE insertion or deletion. LoRTE is available for download at http://www.egce.cnrs-gif.fr/?p=6422.

  9. MsViz: A Graphical Software Tool for In-Depth Manual Validation and Quantitation of Post-translational Modifications.

    PubMed

    Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo

    2017-08-04

    Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.

  10. Performance analysis and kernel size study of the Lynx real-time operating system

    NASA Technical Reports Server (NTRS)

    Liu, Yuan-Kwei; Gibson, James S.; Fernquist, Alan R.

    1993-01-01

    This paper analyzes the Lynx real-time operating system (LynxOS), which has been selected as the operating system for the Space Station Freedom Data Management System (DMS). The features of LynxOS are compared to those of other Unix-based operating systems (OS). The tools for measuring the performance of LynxOS, which include a high-speed digital timer/counter board, a device driver program, and an application program, are analyzed. The timings for interrupt response, process creation and deletion, threads, semaphores, shared memory, and signals are measured. The memory size of the DMS Embedded Data Processor (EDP) is limited. In addition, virtual memory is not suitable for real-time applications because page swap timing may not be deterministic. Therefore, the DMS software, including LynxOS, has to fit in the main memory of an EDP. To reduce the LynxOS kernel size, the following steps are taken: analyzing the factors that influence the kernel size; identifying the modules of LynxOS that may not be needed in an EDP; adjusting the system parameters of LynxOS; reconfiguring the device drivers used in LynxOS; and analyzing the symbol table. The reductions in kernel disk size, kernel memory size, and total kernel size from each step mentioned above are listed and analyzed.

  11. How Haptic Size Sensations Improve Distance Perception

    PubMed Central

    Battaglia, Peter W.; Kersten, Daniel; Schrater, Paul R.

    2011-01-01

    Determining distances to objects is one of the most ubiquitous perceptual tasks in everyday life. Nevertheless, it is challenging because the information from a single image confounds object size and distance. Though our brains frequently judge distances accurately, the underlying computations employed by the brain are not well understood. Our work illuminates these computations by formulating a family of probabilistic models that encompass a variety of distinct hypotheses about distance and size perception. We compare these models' predictions to a set of human distance judgments in an interception experiment and use Bayesian analysis tools to quantitatively select the best hypothesis on the basis of its explanatory power and robustness over experimental data. The central question is whether, and how, human distance perception incorporates size cues to improve accuracy. Our conclusions are: 1) humans incorporate haptic object size sensations for distance perception, 2) the incorporation of haptic sensations is suboptimal given their reliability, 3) humans use environmentally accurate size and distance priors, 4) distance judgments are produced by perceptual “posterior sampling”. In addition, we compared our model's estimated sensory and motor noise parameters with previously reported measurements in the perceptual literature and found good correspondence between them. Taken together, these results represent a major step forward in establishing the computational underpinnings of human distance perception and the role of size information. PMID:21738457
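
    The size-distance ambiguity, and the benefit of a haptic size cue, can be illustrated with a toy sampling scheme: visual angle alone confounds size and distance, but combined with a noisy haptic size estimate it yields a usable distance posterior. The noise levels and the flat-prior sampling below are illustrative assumptions, not the family of models compared in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Geometry: visual angle (rad) ~ object size / distance, so an image alone
    # confounds size and distance; a haptic size sensation breaks the ambiguity.
    true_size, true_distance = 0.05, 2.0                                # metres
    visual_angle = true_size / true_distance + rng.normal(0, 0.002)     # noisy image cue
    haptic_size  = true_size + rng.normal(0, 0.005)                     # noisy haptic cue

    # Sampling over (size, angle): draw sizes from the haptic likelihood,
    # angles from the visual likelihood, then invert angle = size / distance.
    size_samples  = rng.normal(haptic_size, 0.005, 10000)
    angle_samples = rng.normal(visual_angle, 0.002, 10000)
    distance_samples = size_samples / angle_samples

    print(f"posterior mean distance: {distance_samples.mean():.2f} m "
          f"(true {true_distance} m)")
    ```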

  12. Point Analysis in Java applied to histological images of the perforant pathway: a user's account.

    PubMed

    Scorcioni, Ruggero; Wright, Susan N; Patrick Card, J; Ascoli, Giorgio A; Barrionuevo, Germán

    2008-01-01

    The freeware Java tool Point Analysis in Java (PAJ), created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (x2 objective) comprised the entire perforant pathway, while the high magnification set (x100 objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis for down-sampled data sets and a friendly interface with automated plot drawings. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ.

  13. A general computation model based on inverse analysis principle used for rheological analysis of W/O rapeseed and soybean oil emulsions

    NASA Astrophysics Data System (ADS)

    Vintila, Iuliana; Gavrus, Adinel

    2017-10-01

    The present research paper proposes the validation of a rigorous computation model used as a numerical tool to identify the rheological behavior of complex W/O emulsions. Considering a three-dimensional description of a general viscoplastic flow, the thermo-mechanical equations used to identify the rheological laws of fluids or soft materials from global experimental measurements are detailed. Analyses are conducted for complex W/O emulsions, which generally exhibit Bingham behavior, using a power-law shear stress-strain rate dependency and an improved analytical model. Experimental results are investigated for the rheological behavior of crude and refined rapeseed/soybean oils and four types of corresponding W/O emulsions with different physical-chemical compositions. The rheological behavior model was correlated with the thermo-mechanical analysis of a plane-plane rheometer, oil content, chemical composition, particle size and emulsifier concentration. The parameters of the rheological laws describing the behavior of the industrial oils and the concentrated W/O emulsions were computed from estimated shear stresses using a non-linear regression technique and from experimental torques using the inverse analysis tool designed by A. Gavrus (1992-2000).
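
    The forward ingredient of such an inverse analysis is a yield-stress power law relating shear stress to shear rate. A minimal sketch fitting a Herschel-Bulkley form (which reduces to a Bingham model when n = 1) to hypothetical rheometer data by non-linear regression:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def herschel_bulkley(shear_rate, tau0, k, n):
        """Yield-stress power law: tau = tau0 + k * gamma_dot**n."""
        return tau0 + k * shear_rate ** n

    # Hypothetical rheometer data for a W/O emulsion (shear rate 1/s, stress Pa)
    gamma_dot = np.array([0.1, 0.5, 1, 5, 10, 50, 100])
    tau = np.array([12.4, 13.5, 14.5, 20.4, 26.1, 59.0, 91.1])

    params, _ = curve_fit(herschel_bulkley, gamma_dot, tau, p0=[10.0, 2.0, 0.8])
    tau0, k, n = params
    print(f"yield stress = {tau0:.1f} Pa, consistency = {k:.2f} Pa.s^n, index n = {n:.2f}")
    ```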

  14. Epiviz: a view inside the design of an integrated visual analysis software for genomics

    PubMed Central

    2015-01-01

    Background Computational and visual data analysis for genomics has traditionally involved a combination of tools and resources, of which the most ubiquitous consist of genome browsers, focused mainly on integrative visualization of large numbers of big datasets, and computational environments, focused on data modeling of a small number of moderately sized datasets. Workflows that involve the integration and exploration of multiple heterogeneous data sources, small and large, public and user specific, have been poorly addressed by these tools. In our previous work, we introduced Epiviz, which bridges the gap between the two types of tools, simplifying these workflows. Results In this paper we expand on the design decisions behind Epiviz and introduce a series of new advanced features that further support the type of interactive exploratory workflow we have targeted. We discuss three ways in which Epiviz advances the field of genomic data analysis: 1) it brings code to interactive visualizations at various levels; 2) it takes the first steps in the direction of collaborative data analysis by incorporating user plugins from source control providers, as well as by allowing analysis states to be shared among the scientific community; 3) it combines established analysis features that have never before been available simultaneously in a genome browser. In our discussion section, we present security implications of the current design, as well as a series of limitations and future research steps. Conclusions Since many of the design choices of Epiviz are novel in genomics data analysis, this paper serves both as a record of our own approaches and lessons learned, and as a starting point for future efforts in the same direction for the genomics community. PMID:26328750

  15. Novel Tool Selection in Left Brain-Damaged Patients With Apraxia of Tool Use: A Study of Three Cases.

    PubMed

    Osiurak, François; Granjon, Marine; Bonnevie, Isabelle; Brogniart, Joël; Mechtouff, Laura; Benoit, Amandine; Nighoghossian, Norbert; Lesourd, Mathieu

    2018-05-01

    Recent evidence indicates that some left brain-damaged (LBD) patients have difficulty using familiar tools because of an inability to reason about physical object properties. A fundamental issue is to understand the residual capacity of those LBD patients for tool selection. Three LBD patients with tool use disorders, three right brain-damaged (RBD) patients, and six matched healthy controls performed a novel tool selection task, which consisted of extracting a target from a box by selecting the relevant tool among eight, four, or two tools. Three criteria (size, rigidity, shape) were manipulated to create relevant and irrelevant tools. LBD patients selected a greater number of irrelevant tools and had more difficulty solving the task than RBD patients and controls. All participants committed more errors when selecting relevant tools based on rigidity and shape than on size. In some LBD patients, the difficulties persisted even in the 2-Choice condition. Our findings confirm that tool use disorders result from impaired technical reasoning, leading patients to have difficulty selecting tools based on their physical properties. We also go further by showing that these difficulties can decrease as the choice set is reduced, at least for some properties, opening new avenues for rehabilitation programs. (JINS, 2018, 24, 524-529).

  16. MultiMap: A Tool to Automatically Extract and Analyse Spatial Microscopic Data From Large Stacks of Confocal Microscopy Images

    PubMed Central

    Varando, Gherardo; Benavides-Piccione, Ruth; Muñoz, Alberto; Kastanauskaite, Asta; Bielza, Concha; Larrañaga, Pedro; DeFelipe, Javier

    2018-01-01

    The development of 3D visualization and reconstruction methods to analyse microscopic structures at different levels of resolution is of great importance for defining brain microorganization and connectivity. MultiMap is a new tool that allows the visualization, 3D segmentation and quantification of fluorescent structures selectively in the neuropil from large stacks of confocal microscopy images. The major contribution of this tool is the possibility of easily navigating and creating regions of interest of any shape and size within a large brain area, which are then automatically 3D segmented and quantified to determine the density of puncta in the neuropil. As a proof of concept, we focused on the analysis of glutamatergic and GABAergic presynaptic axon terminals in the mouse hippocampal region to demonstrate its use as a tool to provide putative excitatory and inhibitory synaptic maps. The segmentation and quantification method has been validated on expert-labeled images of the mouse hippocampus and on two benchmark datasets, obtaining results comparable to the expert detections. PMID:29875639

  18. Bayes factor design analysis: Planning for compelling evidence.

    PubMed

    Schönbrodt, Felix D; Wagenmakers, Eric-Jan

    2018-02-01

    A sizeable literature exists on the use of frequentist power analysis in the null-hypothesis significance testing (NHST) paradigm to facilitate the design of informative experiments. In contrast, there is almost no literature that discusses the design of experiments when Bayes factors (BFs) are used as a measure of evidence. Here we explore Bayes Factor Design Analysis (BFDA) as a useful tool to design studies for maximum efficiency and informativeness. We elaborate on three possible BF designs: (a) a fixed-n design; (b) an open-ended Sequential Bayes Factor (SBF) design, where researchers can test after each participant and can stop data collection whenever there is strong evidence for either H1 or H0; and (c) a modified SBF design that defines a maximal sample size at which data collection is stopped regardless of the current state of evidence. We demonstrate how the properties of each design (i.e., expected strength of evidence, expected sample size, expected probability of misleading evidence, expected probability of weak evidence) can be evaluated using Monte Carlo simulations, and we equip researchers with the necessary information to compute their own Bayesian design analyses.
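
    A compact Monte Carlo sketch of the SBF design idea follows. It assumes a simple one-sample z-test Bayes factor with known variance and a normal prior on the effect under H1 (the paper uses default t-test Bayes factors; the design-analysis logic is the same), and it estimates expected sample size and the probability of misleading evidence under a hypothetical true effect.

```python
# Monte Carlo sketch of a Sequential Bayes Factor design analysis.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def bf10(x, tau=1.0, sigma=1.0):
    """Bayes factor for H1 (mu ~ N(0, tau^2)) vs H0 (mu = 0), known sigma."""
    n, xbar = len(x), np.mean(x)
    return norm.pdf(xbar, 0, np.sqrt(tau**2 + sigma**2 / n)) / \
           norm.pdf(xbar, 0, np.sqrt(sigma**2 / n))

def run_sbf(true_delta, n_min=10, n_max=200, bound=10.0):
    """Add one observation at a time; stop at BF >= bound, BF <= 1/bound, or n_max."""
    x = list(rng.normal(true_delta, 1.0, n_min))
    while True:
        bf = bf10(np.array(x))
        if bf >= bound or bf <= 1 / bound or len(x) >= n_max:
            return len(x), bf
        x.append(rng.normal(true_delta, 1.0))

# Design analysis under a hypothetical true effect of delta = 0.4.
results = [run_sbf(0.4) for _ in range(2000)]
ns = np.array([n for n, _ in results])
bfs = np.array([bf for _, bf in results])
print("expected sample size:", round(ns.mean(), 1))
print("P(strong evidence for H1):", np.mean(bfs >= 10))
print("P(misleading evidence for H0):", np.mean(bfs <= 0.1))
```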

  19. A Guerilla Guide to Common Problems in ‘Neurostatistics’: Essential Statistical Topics in Neuroscience

    PubMed Central

    Smith, Paul F.

    2017-01-01

    Effective inferential statistical analysis is essential for high quality studies in neuroscience. However, recently, neuroscience has been criticised for the poor use of experimental design and statistical analysis. Many of the statistical issues confronting neuroscience are similar to other areas of biology; however, there are some that occur more regularly in neuroscience studies. This review attempts to provide a succinct overview of some of the major issues that arise commonly in the analyses of neuroscience data. These include: the non-normal distribution of the data; inequality of variance between groups; extensive correlation in data for repeated measurements across time or space; excessive multiple testing; inadequate statistical power due to small sample sizes; pseudo-replication; and an over-emphasis on binary conclusions about statistical significance as opposed to effect sizes. Statistical analysis should be viewed as just another neuroscience tool, which is critical to the final outcome of the study. Therefore, it needs to be done well and it is a good idea to be proactive and seek help early, preferably before the study even begins. PMID:29371855

  1. Optimization of an RNA-Seq Differential Gene Expression Analysis Depending on Biological Replicate Number and Library Size

    PubMed Central

    Lamarre, Sophie; Frasse, Pierre; Zouine, Mohamed; Labourdette, Delphine; Sainderichin, Elise; Hu, Guojian; Le Berre-Anton, Véronique; Bouzayen, Mondher; Maza, Elie

    2018-01-01

    RNA-Seq is a widely used technology that allows an efficient genome-wide quantification of gene expression, for example for differential expression (DE) analysis. After a brief review of the main issues, methods and tools related to the DE analysis of RNA-Seq data, this article focuses on the impact of both the replicate number and the library size on such analyses. While the main drawback of previous relevant studies is their lack of generality, we conducted both an analysis of a two-condition experiment (with eight biological replicates per condition), to compare our results with previous benchmark studies, and a meta-analysis of 17 experiments with up to 18 biological conditions, eight biological replicates and 100 million (M) reads per sample. As a global trend, we concluded that the replicate number has a larger impact than the library size on the power of the DE analysis, except for low-expressed genes, for which both parameters seem to have the same impact. Our study also provides new insights for practitioners aiming to enhance their experimental designs. For instance, by analyzing both the sensitivity and specificity of the DE analysis, we showed that the optimal threshold to control the false discovery rate (FDR) is approximately 2^-r, where r is the replicate number. Furthermore, we showed that the false positive rate (FPR) is rather well controlled by all three studied R packages: DESeq, DESeq2, and edgeR. We also analyzed the impact of both the replicate number and library size on gene ontology (GO) enrichment analysis. Interestingly, we concluded that increases in the replicate number and library size tend to enhance the sensitivity and specificity, respectively, of the GO analysis. Finally, we recommend that RNA-Seq practitioners produce a pilot data set to rigorously analyze the power of their experimental design, or use a public data set similar to the data set they will obtain. For individuals working on tomato research, on the basis of the meta-analysis, we recommend at least four biological replicates per condition and 20 M reads per sample to be almost sure of obtaining about 1000 DE genes if they exist. PMID:29491871
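
    The toy simulation below illustrates, in a very simplified form, why the replicate number matters: it estimates the power to detect a 2-fold change for a single gene from negative-binomial counts using a Welch t-test on log counts. It is not the paper's DESeq2/edgeR pipeline, and all parameter values are assumed for illustration; the 2^-r rule of thumb is printed alongside.

```python
# Toy power-vs-replicates illustration for a single RNA-Seq gene.
import numpy as np
from scipy.stats import nbinom, ttest_ind

rng = np.random.default_rng(0)

def nb_counts(mean, dispersion, size):
    """Draw negative-binomial counts parameterised by mean and dispersion."""
    n = 1.0 / dispersion          # shape parameter (var = mean + dispersion * mean**2)
    p = n / (n + mean)
    return nbinom.rvs(n, p, size=size, random_state=rng)

def power(replicates, mean=100, fold=2.0, dispersion=0.1, n_sim=2000, alpha=0.05):
    hits = 0
    for _ in range(n_sim):
        a = np.log2(nb_counts(mean, dispersion, replicates) + 1)
        b = np.log2(nb_counts(mean * fold, dispersion, replicates) + 1)
        if ttest_ind(a, b, equal_var=False).pvalue < alpha:
            hits += 1
    return hits / n_sim

for r in (2, 4, 8):
    # The paper's rule of thumb pairs r replicates with an FDR threshold ~ 2**-r.
    print(f"replicates = {r}: power ~ {power(r):.2f}, suggested FDR ~ {2.0**-r:.3f}")
```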

  2. Analysis of Giga-size Earth Observation Data in Open Source GRASS GIS 7 - from Desktop to On-line Solutions.

    NASA Astrophysics Data System (ADS)

    Stepinski, T. F.; Mitasova, H.; Jasiewicz, J.; Neteler, M.; Gebbert, S.

    2014-12-01

    GRASS GIS is a leading open-source GIS for geospatial analysis and modeling. In addition to being used as a desktop GIS, it also serves as a processing engine for high-performance geospatial computing in diverse disciplines. The newly released GRASS GIS 7 supports big data analysis, including a temporal framework, image segmentation, watershed analysis, synchronized 2D/3D animations and many other capabilities. This presentation will focus on new GRASS GIS 7-powered tools for geoprocessing giga-size earth observation (EO) data using spatial pattern analysis. Pattern-based analysis connects to human visual perception of space and makes geoprocessing of giga-size EO data possible in an efficient and robust manner. GeoPAT is a collection of GRASS GIS 7 modules that fully integrates procedures for pattern representation of EO data and pattern-similarity calculations with standard GIS tasks such as mapping, map overlay, segmentation, classification (Fig. 1a) and change detection. GeoPAT works very well on a desktop, but it also underpins several GeoWeb applications (http://sil.uc.edu/) that allow users to analyze selected EO datasets without the need to download them. The GRASS GIS 7 temporal framework and high-resolution visualizations will be illustrated using time series of giga-size, lidar-based digital elevation models representing the dynamics of North Carolina barrier islands over the past 15 years. The temporal framework supports efficient raster and vector data series analysis and simplifies data input for visual analysis of dynamic landscapes (Fig. 1b), allowing users to rapidly identify vulnerable locations, changes in the built environment and eroding coastlines. Numerous improvements in GRASS GIS 7 were implemented to support terabyte-size data processing for reconstruction of MODIS land surface temperature (LST) at 250 m resolution using multiple regressions and PCA (Fig. 1c). The new MODIS LST series (http://gis.cri.fmach.it/eurolst/) includes four maps per day since 2000 and provides improved data for epidemiological predictions, viticulture, assessment of urban heat islands and numerous other applications. The presentation will conclude with an outline of future developments in big data interfaces to further enhance web-based GRASS GIS data analysis.

  3. Electrical failure debug using interlayer profiling method

    NASA Astrophysics Data System (ADS)

    Yang, Thomas; Shen, Yang; Zhang, Yifan; Sweis, Jason; Lai, Ya-Chieh

    2017-03-01

    It is well known that as technology nodes move to smaller sizes, the number of design rules increases, design structures become more regular, and the number of manufacturing process steps increases as well. Normal inspection tools can only monitor hard failures on a single layer; electrical failures caused by interlayer misalignment can only be detected through testing. This paper presents a workflow that uses pattern-analysis interlayer profiling techniques to turn multilayer physical information into grouped, linked parameter values. Using this data analysis flow combined with an electrical model allows us to find critical regions on a layout for yield learning.

  4. Shuttle cryogenics supply system. Optimization study. Volume 5 B-4: Programmers manual for space shuttle orbit injection analysis (SOPSA)

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A computer program for space shuttle orbit injection propulsion system analysis (SOPSA) is described to show the operational characteristics and the computer system requirements. The program was developed as an analytical tool to aid in the preliminary design of propellant feed systems for the space shuttle orbiter main engines. The primary purpose of the program is to evaluate the propellant tank ullage pressure requirements imposed by the need to accelerate propellants rapidly during the engine start sequence. The SOPSA program will generate parametric feed system pressure histories and weight data for a range of nominal feedline sizes.

  5. Profiling analysis of low molecular weight heparins by multiple heart-cutting two dimensional chromatography with quadrupole time-of-flight mass spectrometry.

    PubMed

    Ouyang, Yilan; Zeng, Yangyang; Rong, Yinxiu; Song, Yue; Shi, Lv; Chen, Bo; Yang, Xinlei; Xu, Naiyu; Linhardt, Robert J; Zhang, Zhenqing

    2015-09-01

    Low molecular weight heparins (LMWHs) are polydisperse and microheterogeneous mixtures of polysaccharides used as anticoagulant drugs. Profiling analysis is important for obtaining deeper insights into the structure of LMWHs. Previous oligosaccharide mapping methods offer relatively low resolution and are unable to show an entire picture of the structural complexity of LMWHs. In the current study, a profiling method was developed relying on multiple heart-cutting, two-dimensional, ultrahigh-performance liquid chromatography with quadrupole time-of-flight mass spectrometry. This represents an efficient, automated, and robust approach for profiling LMWHs. Using size-exclusion chromatography and ion-pairing reversed-phase chromatography in a two-dimensional separation, LMW components of different sizes, and LMW components of the same size but with different charges and polarities, can be resolved, providing a more complete picture of a LMWH. Structural information on each component was then obtained with quadrupole time-of-flight mass spectrometry. More than 80 and 120 oligosaccharides were observed and unambiguously assigned from the LMWHs nadroparin and enoxaparin, respectively. This method might be useful for quality control of LMWHs and as a powerful tool for heparin-related glycomics.

  6. Sensitivity analysis and metamodeling of a toolchain of models to help size vegetative filter strips in a watershed.

    NASA Astrophysics Data System (ADS)

    Lauvernet, Claire; Noll, Dorothea; Muñoz-Carpena, Rafael; Carluer, Nadia

    2014-05-01

    In Europe, environmental agencies report a significant presence of contaminants in surface water, partly due to pesticide applications. Vegetative filter strips (VFS), often located along rivers, are a common tool among other buffer zones for reducing non-point source pollution of water by reducing surface runoff. However, to be efficient they need to be adapted to the agro-pedo-climatic conditions, both in terms of position and size. This is one of the roles of the TOPPS-PROWADIS project, which brings together European experts and stakeholders to develop and recommend Best Management Practices (BMPs) to reduce pesticide transfer by drift or runoff in several European countries. In this context, Irstea developed a guide accompanying the use of different tools, which allows VFS to be designed by simulating their efficiency in limiting transfers. The user must define both a scenario of incoming surface runoff and the buffer zone characteristics. First, the contributing zone (area, length, slope) is derived from the topography with a GIS tool, HydroDem; second, the runoff hydrograph entering the buffer zone is generated from a rainfall hyetograph typical of the area, using Curve Number theory and taking soil characteristics into account. The optimal VFS width is then deduced for a given desired efficiency (for example, 70% runoff reduction) using the VFSMOD model, which simulates the transfer of water, suspended matter and pesticides inside a vegetative filter strip. The results also indicate whether this kind of buffer zone is relevant in that situation (if the required width is too large, another type of buffer zone, such as a constructed wetland, may be more appropriate). This method assumes that the user supplies a substantial amount of field knowledge and data, which are not always easily available. To compensate for the lack of real data, a set of virtual scenarios was tested that is intended to cover a large range of agro-pedo-climatic conditions in Europe, considering both the upslope agricultural field and the VFS characteristics. These scenarios are based on: two types of climate (northern and south-western France), different rainfall intensities and durations, different hillslope lengths and slopes, different moisture conditions, four soil types (silt loam, sandy loam, clay loam, sandy clay loam) and two crops (wheat and corn) for the contributing area, and two water table depths (1 m and 2.5 m) and four soil types for the VFS. The sizing method was applied to all these scenarios, and a sensitivity analysis of the optimal VFS width was performed for all the input parameters in order to understand their influence and to identify those requiring special care. Based on that sensitivity analysis, a metamodel has been developed. The idea is to simplify the whole toolchain and make it possible to perform the buffer sizing with a single tool and a smaller set of parameters, given the information available to end users. We first compared several mathematical methods for computing the metamodel and then validated them on an agricultural watershed with real data in north-western France.
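
    The sketch below shows the general metamodelling idea under hypothetical inputs: a surrogate (here a Gaussian process, one of several possible choices) is fitted to scenario descriptors and the optimal VFS width that the full HydroDem + VFSMOD toolchain would return. The descriptors, data and width formula are purely illustrative, not the authors' actual parameterisation.

```python
# Gaussian-process surrogate ("metamodel") of an expensive sizing toolchain.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(42)

# Hypothetical design of experiments: each row is one virtual scenario.
# Columns: rainfall intensity (mm/h), hillslope length (m), slope (%), soil class id.
X = rng.uniform([10, 50, 1, 0], [60, 400, 12, 3], size=(80, 4))

# Stand-in for the toolchain output (optimal width in m); in practice this
# column would come from running VFSMOD for every virtual scenario.
y = 2 + 0.15 * X[:, 0] + 0.02 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.5, 80)

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X, y)

# Predict the optimal width (with uncertainty) for a new, unseen scenario.
new_scenario = np.array([[35.0, 200.0, 6.0, 1.0]])
width, std = gp.predict(new_scenario, return_std=True)
print(f"surrogate optimal width: {width[0]:.1f} m (+/- {std[0]:.1f} m)")
```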

  7. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time‐to‐Event Analysis

    PubMed Central

    Gong, Xiajing; Hu, Meng

    2018-01-01

    Additional value can potentially be created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featuring a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance, as assessed by the concordance index, and in identifying the preset influential variables for high-dimensional data. The prediction performance of ML-based methods is also less sensitive to data size and censoring rates than that of the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. PMID:29536640
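
    For readers unfamiliar with the evaluation metric used above, the sketch below fits a Cox proportional-hazards model to simulated time-to-event data with a nonlinear predictor effect and scores it with the concordance index. It uses the lifelines Python package (an assumption of this sketch, not a tool used in the paper) and does not reproduce the paper's ML comparators.

```python
# Cox model + concordance index on simulated time-to-event data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.utils import concordance_index

rng = np.random.default_rng(7)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)

# Nonlinear effect of x1 (which a linear Cox model cannot fully capture),
# plus independent random censoring.
hazard = np.exp(0.8 * x1**2 + 0.5 * x2)
event_time = rng.exponential(1.0 / hazard)
censor_time = rng.exponential(2.0, size=n)
df = pd.DataFrame({
    "T": np.minimum(event_time, censor_time),
    "E": (event_time <= censor_time).astype(int),
    "x1": x1,
    "x2": x2,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")

# Higher partial hazard means shorter expected survival, hence the minus sign.
risk = cph.predict_partial_hazard(df)
print("concordance index:", round(concordance_index(df["T"], -risk, df["E"]), 3))
```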

  8. Carbon footprint analysis as a tool for energy and environmental management in small and medium-sized enterprises

    NASA Astrophysics Data System (ADS)

    Giama, E.; Papadopoulos, A. M.

    2018-01-01

    The reduction of carbon emissions has become a top priority in the decision-making process for governments and companies, with the strict European legislative framework being a major driving force behind this effort. On the other hand, many companies face difficulties in estimating their footprint and in linking the results derived from environmental evaluation processes with an integrated energy management strategy, which would eventually lead to energy-efficient and cost-effective solutions. The paper highlights the need for companies to establish integrated environmental management practices, with tools such as carbon footprint analysis to monitor the energy performance of production processes. Concepts and methods are analysed, and selected indicators are presented by means of benchmarking, monitoring and reporting of the results so that they can be used effectively by the companies. The study is based on data from more than 90 Greek small and medium-sized enterprises, followed by a comprehensive discussion of cost-effective and realistic energy-saving measures.

  9. Max Tech Efficiency Electric HPWH with low-GWP Halogenated Refrigerant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nawaz, Kashif; Shen, Bo; Elatar, Ahmed F.

    A scoping-level analysis was conducted to determine the maximum performance of an electric heat pump water heater (HPWH) with low-GWP refrigerants (hydrofluoroolefins (HFO), hydrofluorocarbons (HFC), and blends). A baseline heat pump water heater (GE GeoSpring) deploying R-134a was analyzed first using the DOE/ORNL Heat Pump Design Model (HPDM) modeling tool. The model was calibrated using experimental data to match the water temperature stratification in the tank, first hour rating, energy factor and coefficient of performance. A CFD modeling tool was used to further refine the HPDM tank model. After calibration, the model was used to simulate the performance of alternative refrigerants. The parametric analysis concluded that, with appropriate selection of equipment size and condenser tube wrap configuration, the overall performance of emerging low-GWP refrigerants for HPWH applications not only exceeds the Energy Star Energy Factor criterion (i.e., 2.20) but is also comparable to some of the most efficient products on the market.

  10. Parametric Study of Biconic Re-Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Steele, Bryan; Banks, Daniel W.; Whitmore, Stephen A.

    2007-01-01

    An optimization based on hypersonic aerodynamic performance and volumetric efficiency was performed for a range of biconic configurations. Both axisymmetric and quasi-axisymmetric geometries (bent and flattened) were analyzed. The aerodynamic optimization was based on hypersonic simple incidence-angle analysis tools. The range of configurations included those suitable for a lunar return trajectory with a lifting aerocapture at Earth and an overall volume that could support a nominal crew. The results yielded five configurations that had acceptable aerodynamic performance and met the overall geometry and size limitations.

  11. Overview of SDCM - The Spacecraft Design and Cost Model

    NASA Technical Reports Server (NTRS)

    Ferebee, Melvin J.; Farmer, Jeffery T.; Andersen, Gregory C.; Flamm, Jeffery D.; Badi, Deborah M.

    1988-01-01

    The Spacecraft Design and Cost Model (SDCM) is a computer-aided design and analysis tool for synthesizing spacecraft configurations, integrating their subsystems, and generating information concerning on-orbit servicing and costs. SDCM uses a bottom-up method in which the cost and performance parameters for subsystem components are first calculated; the model then sums the contributions from individual components in order to obtain an estimate of the sizes and costs for each candidate configuration within a selected spacecraft system. An optimum spacecraft configuration can then be selected.

  12. Large scale 70mm photography for range resources analysis in the Western United States. [Casa Grande, Arizona, Mercury, Nevada, and Mojave Desert]

    NASA Technical Reports Server (NTRS)

    Tueller, P. T.

    1977-01-01

    Large scale 70mm aerial photography is a valuable supplementary tool for rangeland studies. A wide assortment of applications was developed, ranging from vegetation mapping to assessing environmental impact on rangelands. Color and color infrared stereo pairs are useful for effectively sampling sites limited by ground accessibility. They allow an increased sample size at similar or lower cost than ground sampling techniques and provide a permanent record.

  13. SNPmplexViewer--toward a cost-effective traceability system

    PubMed Central

    2011-01-01

    Background Beef traceability has become mandatory in many regions of the world and is typically achieved through the use of unique numerical codes on ear tags and animal passports. DNA-based traceability uses the animal's own DNA code to identify it and the products derived from it. Using SNaPshot, a primer-extension-based method, a multiplex of 25 SNPs in a single reaction has been used to reduce the expense of genotyping a panel of SNPs useful for identity control. Findings To further decrease SNaPshot's cost, we introduced the Perl script SNPmplexViewer, which facilitates the analysis of trace files for reactions performed without the use of fluorescent size standards. SNPmplexViewer automatically aligns reference and target trace electropherograms, run with and without fluorescent size standards, respectively. SNPmplexViewer produces a modified target trace file containing a normalised trace in which the reference size standards are embedded. SNPmplexViewer also outputs aligned images of the two electropherograms together with a difference profile. Conclusions Modified trace files generated by SNPmplexViewer enable genotyping of SNaPshot reactions performed without fluorescent size standards using common fragment-sizing software packages. SNPmplexViewer's normalised output may also improve the genotyping software's performance. Thus, SNPmplexViewer is a general, free tool enabling the reduction of SNaPshot's cost as well as fast viewing and comparison of trace electropherograms for fragment analysis. SNPmplexViewer is available at http://cowry.agri.huji.ac.il/cgi-bin/SNPmplexViewer.cgi. PMID:21600063
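
    The core alignment step can be illustrated with a hedged toy example (this is not the SNPmplexViewer Perl code): the lag between a reference trace run with size standards and a target trace run without them is estimated by maximising their cross-correlation, after which the target can be shifted so that the reference calibration applies to it. The traces below are synthetic Gaussian peaks.

```python
# Toy 1-D electropherogram alignment via cross-correlation.
import numpy as np

def align_traces(reference, target):
    """Return the target rolled to best match the reference, plus the roll applied."""
    ref = (reference - reference.mean()) / reference.std()
    tgt = (target - target.mean()) / target.std()
    corr = np.correlate(ref, tgt, mode="full")
    shift = corr.argmax() - (len(tgt) - 1)   # negative shift: target lags the reference
    return np.roll(target, shift), shift

# Synthetic traces: four Gaussian peaks; the target is delayed by 40 data points.
x = np.linspace(0, 60, 3000)
reference = np.exp(-((x[:, None] - np.array([10, 25, 40, 55]))**2) / 0.1).sum(axis=1)
target = np.roll(reference, 40) + np.random.default_rng(3).normal(0, 0.01, x.size)

aligned, shift = align_traces(reference, target)
print("roll applied to realign target:", shift)   # expected: -40
```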

  14. Provenance and depositional environment of epi-shelf lake sediment from Schirmacher Oasis, East Antarctica, vis-à-vis scanning electron microscopy of quartz grain, size distribution and chemical parameters

    NASA Astrophysics Data System (ADS)

    Shrivastava, Prakash K.; Asthana, Rajesh; Roy, Sandip K.; Swain, Ashit K.; Dharwadkar, Amit

    2012-07-01

    The scientific study of quartz grains is a powerful tool in deciphering the depositional environment and mode of transportation of sediments, and ultimately the origin and classification of sediments. Surface microfeatures, angularity, chemical features, and grain-size analysis of quartz grains, collectively reveal the sedimentary and physicochemical processes that acted on the grains during different stages of their geological history. Here, we apply scanning electron microscopic (SEM) analysis to evaluating the sedimentary provenance, modes of transport, weathering characteristics, alteration, and sedimentary environment of selected detrital quartz grains from the peripheral part of two epi-shelf lakes (ESL-1 and ESL-2) of the Schirmacher Oasis of East Antarctica. Our study reveals that different styles of physical weathering, erosive signatures, and chemical precipitation variably affected these quartz grains before final deposition as lake sediments. Statistical analysis (central tendencies, sorting, skewness, and kurtosis) indicates that these quartz-bearing sediments are poorly sorted glaciofluvial sediments. Saltation and suspension seem to have been the two dominant modes of transportation, and chemical analysis of these sediments indicates a gneissic provenance.
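
    The abstract's grain-size statistics (central tendency, sorting, skewness, kurtosis) can be computed in several ways; the hedged sketch below uses one common convention, the Folk & Ward (1957) graphical measures derived from a cumulative grain-size curve in phi units. The percentile data are hypothetical, and the paper does not state which exact formulas were used.

```python
# Folk & Ward-style graphical grain-size statistics from a cumulative curve.
import numpy as np

# Hypothetical cumulative curve: phi values and cumulative weight percent.
phi = np.array([-1, 0, 1, 2, 3, 4, 5, 6], dtype=float)
cum_pct = np.array([2, 8, 22, 45, 68, 84, 94, 100], dtype=float)

def phi_at(p):
    """Phi value at cumulative percentile p, by linear interpolation."""
    return np.interp(p, cum_pct, phi)

p5, p16, p25, p50, p75, p84, p95 = (phi_at(p) for p in (5, 16, 25, 50, 75, 84, 95))

mean_phi = (p16 + p50 + p84) / 3
sorting = (p84 - p16) / 4 + (p95 - p5) / 6.6        # values above ~1 indicate poor sorting
skewness = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
            + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
kurtosis = (p95 - p5) / (2.44 * (p75 - p25))

print(f"mean = {mean_phi:.2f} phi, sorting = {sorting:.2f}, "
      f"skewness = {skewness:.2f}, kurtosis = {kurtosis:.2f}")
```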

  15. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action

    PubMed Central

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748
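
    A minimal sketch of the core step behind (Md)RQA is shown below in Python rather than the paper's MATLAB: build a recurrence matrix from a multidimensional time series using Euclidean distances and a fixed radius, then compute the recurrence rate. The further RQA measures in the paper's Appendix (determinism, mean diagonal line length, etc.) are not reproduced, and the three-dimensional signal is hypothetical.

```python
# Recurrence matrix and recurrence rate for a multidimensional time series.
import numpy as np

def recurrence_matrix(signal, radius):
    """signal: (T, d) array of a d-dimensional time series."""
    diffs = signal[:, None, :] - signal[None, :, :]
    dist = np.linalg.norm(diffs, axis=-1)
    return (dist <= radius).astype(int)

def recurrence_rate(rm):
    """Fraction of recurrent points, excluding the main diagonal."""
    t = rm.shape[0]
    return (rm.sum() - t) / (t * (t - 1))

rng = np.random.default_rng(0)
t = np.linspace(0, 20, 400)
# Hypothetical 3-dimensional group-level signal (e.g., three participants).
signal = np.column_stack([np.sin(t), np.sin(t + 0.4), np.sin(t + 0.9)])
signal += rng.normal(0, 0.05, signal.shape)

rm = recurrence_matrix(signal, radius=0.3)
print("recurrence rate:", round(recurrence_rate(rm), 3))
```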

  17. Multiset singular value decomposition for joint analysis of multi-modal data: application to fingerprint analysis

    NASA Astrophysics Data System (ADS)

    Emge, Darren K.; Adalı, Tülay

    2014-06-01

    As the availability and use of imaging methodologies continues to increase, there is a fundamental need to jointly analyze data collected from multiple modalities. This analysis is further complicated when the size or resolution of the images differ, implying that the observation lengths of each modality can vary widely. To address this expanding landscape, we introduce the multiset singular value decomposition (MSVD), which can perform a joint analysis on any number of modalities regardless of their individual observation lengths. Simulations show the inter-modal relationships across the different modalities that are revealed by the MSVD. We apply the MSVD to forensic fingerprint analysis, showing that MSVD joint analysis successfully identifies relevant similarities for further analysis, significantly reducing the processing time required. This reduction takes the technique from a laboratory method to a useful forensic tool with applications across the law enforcement and security regimes.

  18. A systems engineering analysis of three-point and four-point wind turbine drivetrain configurations

    DOE PAGES

    Guo, Yi; Parsons, Tyler; Dykes, Katherine; ...

    2016-08-24

    This study compares the impact of drivetrain configuration on the mass and capital cost of a series of wind turbines ranging from 1.5 MW to 5.0 MW power ratings for both land-based and offshore applications. The analysis is performed with a new physics-based drivetrain analysis and sizing tool, Drive Systems Engineering (DriveSE), which is part of the Wind-Plant Integrated System Design & Engineering Model. DriveSE uses physics-based relationships to size all major drivetrain components according to given rotor loads simulated based on International Electrotechnical Commission design load cases. The model's sensitivity to input loads that contain a high degree of variability was analyzed. Aeroelastic simulations are used to calculate the rotor forces and moments imposed on the drivetrain for each turbine design. DriveSE is then used to size all of the major drivetrain components for each turbine for both three-point and four-point configurations. The simulation results quantify the trade-offs in mass and component costs for the different configurations. On average, a 16.7% decrease in total nacelle mass can be achieved when using a three-point drivetrain configuration, resulting in a 3.5% reduction in turbine capital cost. This analysis is driven by extreme loads and does not consider fatigue; thus, the effects of configuration choices on reliability and serviceability are not captured. Furthermore, a first-order estimate of the sizing, dimensioning and costing of major drivetrain components is made, which can be used in larger system studies that consider trade-offs between subsystems such as the rotor, drivetrain and tower.

  20. Optimal Sizing Tool for Battery Storage in Grid Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-09-24

    The battery storage sizing tool developed at Pacific Northwest National Laboratory can be used to evaluate economic performance and determine the optimal size of battery storage in different use cases, considering multiple power system applications. The considered use cases include i) utility-owned battery storage and ii) battery storage behind the customer meter. The power system applications of energy storage include energy arbitrage, balancing services, T&D deferral, outage mitigation, demand charge reduction, etc. Most existing solutions consider only one or two grid services simultaneously, such as balancing services and energy arbitrage. ES-select, developed by Sandia and KEMA, is able to consider multiple grid services, but it stacks the grid services based on priorities instead of co-optimizing them. This tool is the first to provide co-optimization for systematic and local grid services.
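
    As a deliberately simplified sketch of the sizing idea (not the PNNL tool, and considering only a single service), the example below sweeps candidate battery energy capacities, estimates annual arbitrage value with a naive charge-low/discharge-high daily dispatch against a synthetic price series, and picks the size with the best net benefit. Price, cost and efficiency figures are assumptions; a real co-optimization would dispatch against several services simultaneously.

```python
# Toy battery sizing by sweeping candidate capacities against arbitrage value.
import numpy as np

rng = np.random.default_rng(0)
hours = 24 * 365
# Hypothetical hourly energy price ($/MWh) with a daily cycle plus noise.
price = 30 + 15 * np.sin(np.arange(hours) * 2 * np.pi / 24) + rng.normal(0, 5, hours)

def annual_arbitrage_value(capacity_mwh, power_mw=0.5, efficiency=0.85):
    """Naive daily dispatch: charge in the cheapest hours, discharge in the dearest."""
    fill_hours = int(np.ceil(capacity_mwh / power_mw))   # hours needed for one full cycle
    value = 0.0
    for day in price.reshape(-1, 24):
        order = np.argsort(day)
        buy_cost = day[order[:fill_hours]].sum() * power_mw
        sell_revenue = day[order[-fill_hours:]].sum() * power_mw * efficiency
        value += sell_revenue - buy_cost
    return value

annualized_cost_per_mwh = 8_000           # assumed annualized capital cost, $/MWh-yr
candidates = np.arange(0.5, 4.01, 0.5)    # candidate energy capacities (MWh)
net = [annual_arbitrage_value(c) - annualized_cost_per_mwh * c for c in candidates]
best = candidates[int(np.argmax(net))]
print(f"best candidate size: {best:.1f} MWh (net benefit ${max(net):,.0f}/yr)")
```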

  1. Characterization of Nanocomposites by Thermal Analysis

    PubMed Central

    Corcione, Carola Esposito; Frigione, Mariaenrica

    2012-01-01

    In materials research, the development of polymer nanocomposites (PN) is rapidly emerging as a multidisciplinary research field with results that could broaden the applications of polymers to many different industries. PN are polymer matrices (thermoplastics, thermosets or elastomers) that have been reinforced with small quantities of nano-sized particles, preferably characterized by high aspect ratios, such as layered silicates and carbon nanotubes. Thermal analysis (TA) is a useful tool to investigate a wide variety of properties of polymers and it can be also applied to PN in order to gain further insight into their structure. This review illustrates the versatile applications of TA methods in the emerging field of polymer nanomaterial research, presenting some examples of applications of differential scanning calorimetry (DSC), thermogravimetric analysis (TGA), dynamic mechanical thermal analysis (DMTA) and thermal mechanical analysis (TMA) for the characterization of nanocomposite materials.

  2. Benefits and applications of interdisciplinary digital tools for environmental meta-reviews and analyses

    NASA Astrophysics Data System (ADS)

    Grubert, Emily; Siders, Anne

    2016-09-01

    Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
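
    The medium-resolution, topic-modelling approach described above can be sketched with standard tooling, as below (scikit-learn's CountVectorizer plus LatentDirichletAllocation). The tiny corpus stands in for the ~8000 titles and abstracts of the case study; the actual tools and corpus used by the authors may differ.

```python
# Minimal topic-modelling sketch for a corpus of abstracts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "water scarcity and irrigation policy in arid regions",
    "life cycle assessment of hydraulic fracturing water use",
    "community adaptation to coastal flooding and sea level rise",
    "groundwater depletion under agricultural intensification",
    "stakeholder perceptions of managed retreat after floods",
    "energy-water nexus of shale gas development",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(abstracts)             # document-term matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(dtm)

terms = vectorizer.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    top = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"topic {k}:", ", ".join(top))
```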

  3. The coming of age of the first hybrid metrology software platform dedicated to nanotechnologies (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Foucher, Johann; Labrosse, Aurelien; Dervillé, Alexandre; Zimmermann, Yann; Bernard, Guilhem; Martinez, Sergio; Grönqvist, Hanna; Baderot, Julien; Pinzan, Florian

    2017-03-01

    The development and integration of new materials and structures at the nanoscale require multiple parallel characterizations in order to control mostly physico-chemical properties as a function of the application. Among these are physical properties such as size, shape, specific surface area, aspect ratio, agglomeration/aggregation state, size distribution, surface morphology/topography, structure (including crystallinity and defect structure) and solubility, and chemical properties such as structural formula/molecular structure, composition (including degree of purity, known impurities or additives), phase identity, surface chemistry (composition, charge, tension, reactive sites, physical structure, photocatalytic properties, zeta potential) and hydrophilicity/lipophilicity. Depending on the final material formulation (aerosol, powder, nanostructuration…) and the industrial application (semiconductor, cosmetics, chemistry, automotive…), a fleet of complementary characterization equipment must be used in synergy for accurate process tuning and high production yield. This synergy between equipment, so-called hybrid metrology, consists in using the strength of each technique in order to reduce the global uncertainty for better and faster process control. The only way to succeed in this exercise is to use a data fusion methodology. In this paper, we will introduce the work that has been done to create the first generic hybrid metrology software platform dedicated to nanotechnology process control. The first part will be dedicated to process flow modeling related to a fleet of metrology tools. The second part will introduce the concept of an entity model, which describes the various parameters that have to be extracted. The entity model is fed with data analysis as a function of the application (automatic or semi-automated analysis). The final part will introduce two ways of doing data fusion on real data coming from imaging (SEM, TEM, AFM) and non-imaging (SAXS) techniques. The first approach is dedicated to high-level fusion, which is the art of combining various populations of results from homogeneous or heterogeneous tools, taking into account the precision and repeatability of each of them to obtain a new, more accurate result. The second approach is dedicated to deep-level fusion, which is the art of combining raw data from various tools in order to create new raw data. We will introduce a new concept of a virtual tool creator based on deep-level fusion. As a conclusion, we will discuss the implementation of hybrid metrology in a semiconductor environment for advanced process control.

  4. Rapid prototyping of nanofluidic systems using size-reduced electrospun nanofibers for biomolecular analysis.

    PubMed

    Park, Seung-Min; Huh, Yun Suk; Szeto, Kylan; Joe, Daniel J; Kameoka, Jun; Coates, Geoffrey W; Edel, Joshua B; Erickson, David; Craighead, Harold G

    2010-11-05

    Biomolecular transport in nanofluidic confinement offers various means to investigate the behavior of biomolecules in their native aqueous environments and to develop tools for diverse single-molecule manipulations. Recently, a number of simple nanofluidic fabrication techniques have been demonstrated that utilize electrospun nanofibers as a backbone structure. These techniques are limited by the arbitrary dimensions of the resulting nanochannels due to the random nature of electrospinning. Here, a new method for fabricating nanofluidic systems from size-reduced electrospun nanofibers is reported and demonstrated. This method uses the scanned electrospinning technique to generate oriented sacrificial nanofibers and exposes these nanofibers to harsh, but isotropic, etching/heating environments to reduce their cross-sectional dimension. The creation of various nanofluidic systems as small as 20 nm is demonstrated, and practical examples of single-biomolecule handling, such as DNA elongation in nanochannels and fluorescence correlation spectroscopic analysis of biomolecules passing through nanochannels, are provided.

  5. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    PubMed

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main contribution of this article is the development of an open-source, free to use tool that encapsulates several well-known methods for the estimation of the input function and the quantification of dynamic PET FDG studies. Some alternative strategies are also proposed and implemented in the tool for the segmentation of blood pools and parameter estimation. The tool was tested on phantoms with encouraging results that suggest that even bloodless estimators could provide a viable alternative to blood sampling for quantification using graphical analysis. The open tool is a promising opportunity for collaboration among investigators and further validation on real studies.

  6. Identification of the condition of crops based on geospatial data embedded in graph databases

    NASA Astrophysics Data System (ADS)

    Idziaszek, P.; Mueller, W.; Górna, K.; Okoń, P.; Boniecki, P.; Koszela, K.; Fojud, A.

    2017-07-01

    The Web application presented here supports plant production and works with the Neo4j graph database shell to support the assessment of the condition of crops on the basis of geospatial data, including raster and vector data. The adoption of a graph database as a tool to store and manage the data, including geospatial data, is fully justified for agricultural holdings that have a wide range of crop types and sizes. In addition, the authors tested the use of Microsoft Cognitive Services within the application to enable image analysis using the provided services. The application was designed using ASP.NET MVC technology and a wide range of leading IT tools.

  7. Consequences of theory level choice evaluated with new tools from QTAIM and the stress tensor for a dipeptide conformer

    NASA Astrophysics Data System (ADS)

    Li, Jiahui; Xu, Tianlv; Ping, Yang; van Mourik, Tanja; Früchtl, Herbert; Kirk, Steven R.; Jenkins, Samantha

    2018-03-01

    QTAIM and the stress tensor were used to provide a detailed analysis of the topology of the molecular graph and of the BCP and bond-path properties, including the newly introduced helicity length H, of a Tyr-Gly dipeptide conformer subjected to a torsion, with four levels of theory (MP2, M06-2X, B3LYP-D3 and B3LYP) and a modest-sized basis set, 6-31+G(d). Structural effects and bonding properties are quantified and reflect differences in the BSSE and the lack of dispersion effects in the B3LYP calculations. The helicity length H demonstrated that MP2 produced a unique response to the torsion, suggesting future use as a diagnostic tool.

  8. Extended wavelet transformation to digital holographic reconstruction: application to the elliptical, astigmatic Gaussian beams.

    PubMed

    Remacha, Clément; Coëtmellec, Sébastien; Brunel, Marc; Lebrun, Denis

    2013-02-01

    Wavelet analysis provides an efficient tool in numerous signal processing problems and has been implemented in optical processing techniques, such as in-line holography. This paper proposes an improvement of this tool for the case of an elliptical, astigmatic Gaussian (AEG) beam. We show that this mathematical operator allows reconstructing an image of a spherical particle without compression of the reconstructed image, which increases the accuracy of the 3D location of particles and of their size measurement. To validate the performance of this operator we have studied the diffraction pattern produced by a particle illuminated by an AEG beam. This study used mutual intensity propagation, and the particle is defined as a chirped Gaussian sum. The proposed technique was applied and the experimental results are presented.

  9. Structural design of the Sandia 34-M Vertical Axis Wind Turbine

    NASA Astrophysics Data System (ADS)

    Berg, D. E.

    Sandia National Laboratories, as the lead DOE laboratory for Vertical Axis Wind Turbine (VAWT) development, is currently designing a 34-meter diameter Darrieus-type VAWT. This turbine will be a research test bed which provides a focus for advancing technology and validating design and fabrication techniques in a size range suitable for utility use. Structural data from this machine will allow structural modeling to be refined and verified for a turbine on which the gravity effects and stochastic wind loading are significant. Performance data from it will allow aerodynamic modeling to be refined and verified. The design effort incorporates Sandia's state-of-the-art analysis tools in the design of a complete machine. The analytic tools used in this design are discussed and the conceptual design procedure is described.

  10. Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.

    PubMed

    Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi

    2017-05-01

    Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
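
    One of the sensitivity analyses recommended above, a leave-one-out check, can be sketched in a few lines: recompute a fixed-effect, inverse-variance-weighted pooled estimate with each study removed in turn to see how much any single (possibly non-independent) effect size drives the result. The effect sizes and variances below are hypothetical.

```python
# Leave-one-out sensitivity analysis for a fixed-effect meta-analysis.
import numpy as np

effects = np.array([0.31, 0.45, 0.12, 0.52, 0.08, 0.39])    # e.g. Hedges' g per study
variances = np.array([0.02, 0.05, 0.01, 0.08, 0.03, 0.04])  # sampling variances

def pooled(es, v):
    """Inverse-variance-weighted mean and its standard error."""
    w = 1.0 / v
    return np.sum(w * es) / np.sum(w), np.sqrt(1.0 / np.sum(w))

overall, se = pooled(effects, variances)
print(f"overall estimate: {overall:.3f} (SE {se:.3f})")

for i in range(len(effects)):
    keep = np.arange(len(effects)) != i
    est, _ = pooled(effects[keep], variances[keep])
    print(f"without study {i + 1}: {est:.3f} (shift {est - overall:+.3f})")
```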

  11. Comparison of normalization methods for differential gene expression analysis in RNA-Seq experiments

    PubMed Central

    Maza, Elie; Frasse, Pierre; Senin, Pavel; Bouzayen, Mondher; Zouine, Mohamed

    2013-01-01

    In recent years, RNA-Seq technologies have become a powerful tool for transcriptome studies. However, computational methods dedicated to the analysis of high-throughput sequencing data are yet to be standardized. In particular, it is known that the choice of normalization procedure leads to great variability in the results of differential gene expression analysis. The present study compares the most widespread normalization procedures and proposes a novel one aimed at removing an inherent bias of the studied transcriptomes related to their relative size. Comparisons of the normalization procedures are performed on real and simulated data sets. Analyses of real RNA-Seq data sets, performed with all the different normalization methods, show that only 50% of the significantly differentially expressed genes are common to all methods. This result highlights the influence of the normalization step on the differential expression analysis. Analyses of real and simulated data sets give similar results, showing three groups of procedures with the same behavior. The group including the novel method, named “Median Ratio Normalization” (MRN), gives the lowest number of false discoveries. Within this group, the MRN method is less sensitive to modification of parameters related to the relative size of the transcriptomes, such as the number of down- and upregulated genes and the gene expression levels. The newly proposed MRN method efficiently deals with the intrinsic bias resulting from the relative size of the studied transcriptomes. Validation with real and simulated data sets confirmed that MRN is more consistent and robust than existing methods. PMID:26442135
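
    The sketch below implements a median-of-ratios style normalization for raw RNA-Seq counts, in the spirit of the proposed MRN method and of DESeq's size factors; the exact MRN definition in the paper may differ in detail, so treat this as illustrative.

```python
# Median-of-ratios size factors for a genes-by-samples count matrix.
import numpy as np

def median_ratio_size_factors(counts):
    """counts: (genes, samples) array of raw read counts."""
    expressed = np.all(counts > 0, axis=1)            # genes with no zero count
    log_counts = np.log(counts[expressed].astype(float))
    log_ref = log_counts.mean(axis=1)                 # log geometric-mean pseudo-reference
    log_ratios = log_counts - log_ref[:, None]        # per-gene ratio of sample to reference
    return np.exp(np.median(log_ratios, axis=0))      # per-sample size factors

counts = np.array([[100, 200, 150],
                   [ 50, 110,  80],
                   [ 20,  35,  25],
                   [500, 990, 700]])
sf = median_ratio_size_factors(counts)
print("size factors:", np.round(sf, 2))
print("normalized counts:\n", np.round(counts / sf, 1))
```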

  12. An efficient annotation and gene-expression derivation tool for Illumina Solexa datasets.

    PubMed

    Hosseini, Parsa; Tremblay, Arianne; Matthews, Benjamin F; Alkharouf, Nadim W

    2010-07-02

    An Illumina flow cell with all eight lanes occupied produces well over a terabyte's worth of images, with gigabytes of reads following sequence alignment. The ability to translate such reads into meaningful annotation is therefore of great concern and importance; one can easily be flooded by such a volume of textual, unannotated data irrespective of read quality or size. CASAVA, an optional analysis tool for Illumina sequencing experiments, provides INDEL detection, SNP information, and allele calling. Extracting a measure of gene expression in the form of tag counts from such analyses, and furthermore annotating the corresponding reads, is therefore of significant value. We developed TASE (Tag counting and Analysis of Solexa Experiments), a rapid tag-counting and annotation software tool specifically designed for Illumina CASAVA sequencing datasets. Developed in Java and deployed using the jTDS JDBC driver and a SQL Server backend, TASE provides an extremely fast means of calculating gene expression through tag counts while annotating sequenced reads with each gene's presumed function, from any given CASAVA build. Such a build is generated for both DNA and RNA sequencing. Analysis is broken into two distinct components: DNA sequence or read concatenation, followed by tag counting and annotation. The end result is output containing the homology-based functional annotation and the respective gene expression measure, signifying how many times sequenced reads were found within the genomic ranges of functional annotations. TASE is a powerful tool for annotating a given Illumina Solexa sequencing dataset. Our results indicate that both homology-based annotation and tag-count analysis are completed in very efficient times, allowing researchers to delve deep into a given CASAVA build and maximize the information extracted from a sequencing dataset. TASE is specially designed to translate the sequence data in a CASAVA build into functional annotations while producing corresponding gene expression measurements, and does so in an ultrafast and highly efficient manner, whether the analysis involves a single-read or paired-end sequencing experiment. TASE is a user-friendly and freely available application, allowing rapid analysis and annotation of any given Illumina Solexa sequencing dataset.
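
    As a toy illustration of the tag-counting step described above (counting how many sequenced reads fall within the genomic ranges of functional annotations), the sketch below uses invented gene ranges and read positions; TASE itself is a Java/SQL Server application operating on full CASAVA builds.

    ```python
    def count_tags(read_positions, gene_annotations):
        """Tally how many read positions fall inside each annotated gene range on a
        single reference sequence. Names, ranges and positions are invented."""
        counts = {name: 0 for name, _, _ in gene_annotations}
        for pos in read_positions:
            for name, start, end in gene_annotations:
                if start <= pos <= end:
                    counts[name] += 1
        return counts

    # Hypothetical annotation (name, start, end) and aligned read start positions.
    annotation = [("geneA", 100, 500), ("geneB", 800, 1200)]
    reads = [120, 130, 450, 900, 910, 915, 2000]
    expression = count_tags(reads, annotation)   # {'geneA': 3, 'geneB': 3}
    ```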

  13. Tomography of epidermal growth factor receptor binding to fluorescent Affibody in vivo studied with magnetic resonance guided fluorescence recovery in varying orthotopic glioma sizes

    NASA Astrophysics Data System (ADS)

    Holt, Robert W.; Demers, Jennifer-Lynn H.; Sexton, Kristian J.; Gunn, Jason R.; Davis, Scott C.; Samkoe, Kimberley S.; Pogue, Brian W.

    2015-02-01

    The ability to image targeted tracer binding to epidermal growth factor receptor (EGFR) was studied in vivo in orthotopically grown glioma tumors of different sizes. The binding potential was quantified using a dual-tracer approach, which employs a fluorescently labeled peptide targeted to EGFR and a reference tracer with similar pharmacokinetic properties but no specific binding, to estimate the relative bound fraction from kinetic compartment modeling. The recovered values of binding potential did not vary significantly as a function of tumor size (1 to 33 mm3), suggesting that binding potential may be consistent in the U251 tumors regardless of size or stage after implantation. However, the fluorescence yield of the targeted fluorescent tracers in the tumor was affected significantly by tumor size, suggesting that dual-tracer imaging helps account for variations in absolute uptake, which plague single-tracer imaging techniques. Ex vivo analysis showed relatively high spatial heterogeneity in each tumor that cannot be resolved by tomographic techniques. Nonetheless, the dual-tracer tomographic technique is a powerful tool for longitudinal bulk estimation of receptor binding.
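
    A rough sketch of the dual-tracer idea is given below: with a targeted tracer and an untargeted reference tracer of similar pharmacokinetics, a simplified late-time ratio yields an estimate of the bound fraction. The actual study fits a kinetic compartment model; the ratio form, the synthetic curves and the choice of "late" window here are illustrative assumptions only.

    ```python
    import numpy as np

    def ratiometric_binding_potential(targeted, reference):
        """Simplified late-time ratiometric estimate of binding potential from
        dual-tracer uptake curves: BP ~ targeted/reference - 1 once the two tracers
        reach a quasi-equilibrium. A rough stand-in for the compartment-model fit."""
        targeted = np.asarray(targeted, dtype=float)
        reference = np.asarray(reference, dtype=float)
        late = slice(len(targeted) // 2, None)   # crude choice: second half of the curve
        return float(np.mean(targeted[late] / reference[late]) - 1.0)

    # Synthetic example: targeted tracer retained by binding, reference washing out.
    t = np.linspace(0, 60, 61)                                  # minutes
    reference_curve = 1.0 * np.exp(-t / 20.0) + 0.05
    targeted_curve = reference_curve + 0.4 * (1 - np.exp(-t / 10.0)) * np.exp(-t / 200.0)
    bp_estimate = ratiometric_binding_potential(targeted_curve, reference_curve)
    ```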

  14. Further Improvement of the RITS Code for Pulsed Neutron Bragg-edge Transmission Imaging

    NASA Astrophysics Data System (ADS)

    Sato, H.; Watanabe, K.; Kiyokawa, K.; Kiyanagi, R.; Hara, K. Y.; Kamiyama, T.; Furusaka, M.; Shinohara, T.; Kiyanagi, Y.

    The RITS code is a unique and powerful tool for whole-pattern fitting analysis of Bragg-edge transmission spectra. However, it has had two major problems, and we have proposed methods to overcome them. The first issue is the difference in crystallite size values between the diffraction and Bragg-edge analyses. We found that the reason was a differing definition of the crystal structure factor, which affects the crystallite size because the crystallite size is deduced from the primary extinction effect, itself dependent on the crystal structure factor. After the algorithm change, the crystallite sizes obtained by RITS approached those obtained by Rietveld analyses of diffraction data much more closely, improving from 155% to 110% of the Rietveld values. The second issue is correction for the effect of background neutrons scattered from the specimen. Through neutron transport simulation studies, we found that the background components consist of forward Bragg scattering, double backward Bragg scattering, and thermal diffuse scattering. RITS with the background correction function developed through these simulation studies could reconstruct various simulated and experimental transmission spectra well, but the refined crystalline microstructural parameters were often distorted. It is therefore recommended to reduce the background by improving the experimental conditions.

  15. Passive vs. Parachute System Architecture for Robotic Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Maddock, Robert W.; Henning, Allen B.; Samareh, Jamshid A.

    2016-01-01

    The Multi-Mission Earth Entry Vehicle (MMEEV) is a flexible vehicle concept based on the Mars Sample Return (MSR) EEV design which can be used in the preliminary sample return mission study phase to parametrically investigate any trade space of interest to determine the best entry vehicle design approach for that particular mission concept. In addition to the trade space dimensions often considered (e.g. entry conditions, payload size and mass, vehicle size, etc.), the MMEEV trade space considers whether it might be more beneficial for the vehicle to utilize a parachute system during descent/landing or to be fully passive (i.e. not use a parachute). In order to evaluate this trade space dimension, a simplified parachute system model has been developed based on inputs such as vehicle size/mass, payload size/mass and landing requirements. This model works in conjunction with analytical approximations of a mission trade space dataset provided by the MMEEV System Analysis for Planetary EDL (M-SAPE) tool to help quantify the differences between an active (with parachute) and a passive (no parachute) vehicle concept.

  16. FMAj: a tool for high content analysis of muscle dynamics in Drosophila metamorphosis.

    PubMed

    Kuleesha, Yadav; Puah, Wee Choo; Lin, Feng; Wasser, Martin

    2014-01-01

    During metamorphosis in Drosophila melanogaster, larval muscles undergo two different developmental fates; one population is removed by cell death, while the other persistent subset undergoes morphological remodeling and survives to adulthood. Thanks to the ability to perform live imaging of muscle development in transparent pupae and the power of genetics, metamorphosis in Drosophila can be used as a model to study the regulation of skeletal muscle mass. However, time-lapse microscopy generates sizeable image data that require new tools for high throughput image analysis. We performed targeted gene perturbation in muscles and acquired 3D time-series images of muscles in metamorphosis using laser scanning confocal microscopy. To quantify the phenotypic effects of gene perturbations, we designed the Fly Muscle Analysis tool (FMAj) which is based on the ImageJ and MySQL frameworks for image processing and data storage, respectively. The image analysis pipeline of FMAj contains three modules. The first module assists in adding annotations to time-lapse datasets, such as genotypes, experimental parameters and temporal reference points, which are used to compare different datasets. The second module performs segmentation and feature extraction of muscle cells and nuclei. Users can provide annotations to the detected objects, such as muscle identities and anatomical information. The third module performs comparative quantitative analysis of muscle phenotypes. We applied our tool to the phenotypic characterization of two atrophy related genes that were silenced by RNA interference. Reduction of Drosophila Tor (Target of Rapamycin) expression resulted in enhanced atrophy compared to control, while inhibition of the autophagy factor Atg9 caused suppression of atrophy and enlarged muscle fibers of abnormal morphology. FMAj enabled us to monitor the progression of atrophic and hypertrophic phenotypes of individual muscles throughout metamorphosis. We designed a new tool to visualize and quantify morphological changes of muscles in time-lapse images of Drosophila metamorphosis. Our in vivo imaging experiments revealed that evolutionarily conserved genes involved in Tor signalling and autophagy, perform similar functions in regulating muscle mass in mammals and Drosophila. Extending our approach to a genome-wide scale has the potential to identify new genes involved in muscle size regulation.

  17. FMAj: a tool for high content analysis of muscle dynamics in Drosophila metamorphosis

    PubMed Central

    2014-01-01

    Background During metamorphosis in Drosophila melanogaster, larval muscles undergo two different developmental fates; one population is removed by cell death, while the other persistent subset undergoes morphological remodeling and survives to adulthood. Thanks to the ability to perform live imaging of muscle development in transparent pupae and the power of genetics, metamorphosis in Drosophila can be used as a model to study the regulation of skeletal muscle mass. However, time-lapse microscopy generates sizeable image data that require new tools for high throughput image analysis. Results We performed targeted gene perturbation in muscles and acquired 3D time-series images of muscles in metamorphosis using laser scanning confocal microscopy. To quantify the phenotypic effects of gene perturbations, we designed the Fly Muscle Analysis tool (FMAj) which is based on the ImageJ and MySQL frameworks for image processing and data storage, respectively. The image analysis pipeline of FMAj contains three modules. The first module assists in adding annotations to time-lapse datasets, such as genotypes, experimental parameters and temporal reference points, which are used to compare different datasets. The second module performs segmentation and feature extraction of muscle cells and nuclei. Users can provide annotations to the detected objects, such as muscle identities and anatomical information. The third module performs comparative quantitative analysis of muscle phenotypes. We applied our tool to the phenotypic characterization of two atrophy related genes that were silenced by RNA interference. Reduction of Drosophila Tor (Target of Rapamycin) expression resulted in enhanced atrophy compared to control, while inhibition of the autophagy factor Atg9 caused suppression of atrophy and enlarged muscle fibers of abnormal morphology. FMAj enabled us to monitor the progression of atrophic and hypertrophic phenotypes of individual muscles throughout metamorphosis. Conclusions We designed a new tool to visualize and quantify morphological changes of muscles in time-lapse images of Drosophila metamorphosis. Our in vivo imaging experiments revealed that evolutionarily conserved genes involved in Tor signalling and autophagy, perform similar functions in regulating muscle mass in mammals and Drosophila. Extending our approach to a genome-wide scale has the potential to identify new genes involved in muscle size regulation. PMID:25521203

  18. Process Parameters Optimization in Single Point Incremental Forming

    NASA Astrophysics Data System (ADS)

    Gulati, Vishal; Aryal, Ashmin; Katyal, Puneet; Goswami, Amitesh

    2016-04-01

    This work aims to optimize the formability and surface roughness of parts formed by the single-point incremental forming process for an Aluminium-6063 alloy. The tests are based on Taguchi's L18 orthogonal array, selected on the basis of the degrees of freedom (DOF). The tests were carried out on a vertical machining center (DMC70V) using CAD/CAM software (SolidWorks V5/MasterCAM). Two levels of tool radius and three levels each of sheet thickness, step size, tool rotational speed, feed rate and lubrication were considered as the input process parameters, with wall angle and surface roughness as the process responses. The influential process parameters for formability and surface roughness were identified with the help of statistical tools (response tables, main effect plots and ANOVA). The parameter with the greatest influence on both formability and surface roughness is lubrication. For formability, the influence in descending order is lubrication, followed by tool rotational speed, feed rate, sheet thickness, step size and tool radius; for surface roughness, it is lubrication, followed by feed rate, step size, tool radius, sheet thickness and tool rotational speed. The predicted optimal values for the wall angle and surface roughness are 88.29° and 1.03225 µm. The confirmation experiments were conducted three times, and the wall angle and surface roughness were found to be 85.76° and 1.15 µm, respectively.
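
    For readers unfamiliar with Taguchi analysis, the standard signal-to-noise ratios used to rank such process parameters can be computed as below ("larger is better" for wall angle, "smaller is better" for surface roughness); the replicate values shown are invented, not data from this study.

    ```python
    import numpy as np

    def sn_larger_is_better(replicates):
        """Taguchi signal-to-noise ratio for 'larger is better' responses,
        e.g. wall angle: S/N = -10 * log10(mean(1/y^2))."""
        y = np.asarray(replicates, dtype=float)
        return -10.0 * np.log10(np.mean(1.0 / y**2))

    def sn_smaller_is_better(replicates):
        """Taguchi S/N for 'smaller is better' responses, e.g. surface roughness:
        S/N = -10 * log10(mean(y^2))."""
        y = np.asarray(replicates, dtype=float)
        return -10.0 * np.log10(np.mean(y**2))

    # Hypothetical repeated measurements from one L18 trial.
    wall_angle_sn = sn_larger_is_better([85.8, 86.1, 85.5])    # degrees
    roughness_sn = sn_smaller_is_better([1.12, 1.18, 1.09])    # micrometres
    ```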

  19. Ecological and social correlates of chimpanzee tool use.

    PubMed

    Sanz, Crickette M; Morgan, David B

    2013-11-19

    The emergence of technology has been suggested to coincide with scarcity of staple resources that led to innovations in the form of tool-assisted strategies to diversify or augment typical diets. We examined seasonal patterns of several types of tool use exhibited by a chimpanzee (Pan troglodytes) population residing in central Africa, to determine whether their technical skills provided access to fallback resources when preferred food items were scarce. Chimpanzees in the Goualougo Triangle exhibit a diverse repertoire of tool behaviours, many of which are exhibited throughout the year. Further, they have developed specific tool sets to overcome the issues of accessibility to particular food items. Our conclusion is that these chimpanzees use a sophisticated tool technology to cope with seasonal changes in relative food abundance and gain access to high-quality foods. Subgroup sizes were smaller in tool using contexts than other foraging contexts, suggesting that the size of the social group may not be as important in promoting complex tool traditions as the frequency and type of social interactions. Further, reports from other populations and species showed that tool use may occur more often in response to ecological opportunities and relative profitability of foraging techniques than scarcity of resources.

  20. Ecological and social correlates of chimpanzee tool use

    PubMed Central

    Sanz, Crickette M.; Morgan, David B.

    2013-01-01

    The emergence of technology has been suggested to coincide with scarcity of staple resources that led to innovations in the form of tool-assisted strategies to diversify or augment typical diets. We examined seasonal patterns of several types of tool use exhibited by a chimpanzee (Pan troglodytes) population residing in central Africa, to determine whether their technical skills provided access to fallback resources when preferred food items were scarce. Chimpanzees in the Goualougo Triangle exhibit a diverse repertoire of tool behaviours, many of which are exhibited throughout the year. Further, they have developed specific tool sets to overcome the issues of accessibility to particular food items. Our conclusion is that these chimpanzees use a sophisticated tool technology to cope with seasonal changes in relative food abundance and gain access to high-quality foods. Subgroup sizes were smaller in tool using contexts than other foraging contexts, suggesting that the size of the social group may not be as important in promoting complex tool traditions as the frequency and type of social interactions. Further, reports from other populations and species showed that tool use may occur more often in response to ecological opportunities and relative profitability of foraging techniques than scarcity of resources. PMID:24101626

  1. Conducting Creativity Brainstorming Sessions in Small and Medium-Sized Enterprises Using Computer-Mediated Communication Tools

    NASA Astrophysics Data System (ADS)

    Murthy, Uday S.

    A variety of Web-based low cost computer-mediated communication (CMC) tools are now available for use by small and medium-sized enterprises (SME). These tools invariably incorporate chat systems that facilitate simultaneous input in synchronous electronic meeting environments, allowing what is referred to as “electronic brainstorming.” Although prior research in information systems (IS) has established that electronic brainstorming can be superior to face-to-face brainstorming, there is a lack of detailed guidance regarding how CMC tools should be optimally configured to foster creativity in SMEs. This paper discusses factors to be considered in using CMC tools for creativity brainstorming and proposes recommendations for optimally configuring CMC tools to enhance creativity in SMEs. The recommendations are based on lessons learned from several recent experimental studies on the use of CMC tools for rich brainstorming tasks that require participants to invoke domain-specific knowledge. Based on a consideration of the advantages and disadvantages of the various configuration options, the recommendations provided can form the basis for selecting a CMC tool for creativity brainstorming or for creating an in-house CMC tool for the purpose.

  2. Anthropometry and Biomechanics Facility Presentation to Open EVA Research Forum

    NASA Technical Reports Server (NTRS)

    Rajulu, Sudhakar

    2017-01-01

    NASA is required to accommodate individuals who fall within a 1st to 99th percentile range on a variety of critical dimensions. The hardware the crew interacts with must therefore be designed and verified to allow these selected individuals to complete critical mission tasks safely and at an optimal performance level. Until now, designers have been provided simpler univariate critical dimensional analyses. The multivariate characteristics of intra-individual and inter-individual size variation must be accounted for, since an individual who is 1st percentile in one body dimension will not be 1st percentile in all other dimensions. A more simplistic approach, assuming every measurement of an individual will fall within the same percentile range, can lead to a model that does not represent realistic members of the population. In other words, there is no '1st percentile female' or '99th percentile male', and designing for these unrealistic body types can lead to hardware issues down the road. Furthermore, due to budget considerations, designers are normally limited to providing only 1 size of a prototype suit, thus requiring other possible means to ensure that a given suit architecture would yield the necessary suit sizes to accommodate the entire user population. Fortunately, modeling tools can be used to more accurately model the types of human body sizes and shapes that will be encountered in a population. Anthropometry toolkits have been designed with a variety of capabilities, including grouping the population into clusters based on critical dimensions, providing percentile information given test subject measurements, and listing measurement ranges for critical dimensions in the 1st-99th percentile range. These toolkits can be combined with full body laser scans to allow designers to build human models that better represent the astronaut population. More recently, some rescaling and reposing capabilities have been developed, to allow reshaping of these static laser scans in more representative postures, such as an abducted shoulder. All of the hardware designed for use with the crew must be sized to accommodate the user population, but the interaction between subject size and hardware fit is complicated with multi-component, complex systems like a space suit. Again, prototype suits are normally only provided in a limited size range, and suited testing is an expensive endeavor; both of these factors limit the number and size of people who can be used to benchmark a spacesuit. However, modeling tools for assessing suit-human interaction can allow potential issues to be modeled and visualized. These types of modeling tools can be used for analysis of a larger combination of anthropometries and hardware types than could feasibly be done with actual human subjects and physical mockups.

  3. Lot quality assurance sampling to monitor supplemental immunization activity quality: an essential tool for improving performance in polio endemic countries.

    PubMed

    Brown, Alexandra E; Okayasu, Hiromasa; Nzioki, Michael M; Wadood, Mufti Z; Chabot-Couture, Guillaume; Quddus, Arshad; Walker, George; Sutter, Roland W

    2014-11-01

    Monitoring the quality of supplementary immunization activities (SIAs) is a key tool for polio eradication. Regular monitoring data, however, are often unreliable, showing high coverage levels in virtually all areas, including those with ongoing virus circulation. To address this challenge, lot quality assurance sampling (LQAS) was introduced in 2009 as an additional tool to monitor SIA quality. Now used in 8 countries, LQAS provides a number of programmatic benefits: identifying areas of weak coverage quality with statistical reliability, differentiating areas of varying coverage with greater precision, and allowing for trend analysis of campaign quality. LQAS also accommodates changes to survey format, interpretation thresholds, evaluations of sample size, and data collection through mobile phones to improve timeliness of reporting and allow for visualization of campaign quality. LQAS becomes increasingly important to address remaining gaps in SIA quality and help focus resources on high-risk areas to prevent the continued transmission of wild poliovirus. © Crown copyright 2014.
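
    A minimal sketch of the LQAS logic is shown below: a lot passes if the number of unvaccinated children found in a fixed-size sample does not exceed a decision value, and the binomial operating characteristic shows how well the rule separates high- from low-coverage areas. The sample size and decision value used here are illustrative, not the programme's official parameters.

    ```python
    from math import comb

    def lot_passes(n_unvaccinated_found, decision_value):
        """LQAS classification rule: accept the lot (area) as adequately covered if
        the number of unvaccinated children found does not exceed the decision value."""
        return n_unvaccinated_found <= decision_value

    def probability_of_acceptance(sample_size, decision_value, true_coverage):
        """Operating characteristic of the rule under simple random sampling:
        P(lot accepted) at a given true coverage, from the binomial distribution."""
        p_miss = 1.0 - true_coverage
        return sum(comb(sample_size, k) * p_miss**k * (1.0 - p_miss)**(sample_size - k)
                   for k in range(decision_value + 1))

    # Illustrative (not programme-official) parameters: sample 60 children per lot,
    # accept if at most 5 are unvaccinated.
    p_pass_good_lot = probability_of_acceptance(60, 5, 0.95)   # high for well-covered lots
    p_pass_weak_lot = probability_of_acceptance(60, 5, 0.80)   # low for poorly covered lots
    ```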

  4. Lunar Habitat Optimization Using Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    SanScoucie, M. P.; Hull, P. V.; Tinker, M. L.; Dozier, G. V.

    2007-01-01

    Long-duration surface missions to the Moon and Mars will require bases to accommodate habitats for the astronauts. Transporting the materials and equipment required to build the necessary habitats is costly and difficult. The materials chosen for the habitat walls play a direct role in protection against hazards of the surface environment such as radiation and meteoroid impacts. Choosing the best materials, their configuration, and the amount required is extremely difficult due to the immense size of the design region. Clearly, an optimization method is warranted for habitat wall design. Standard optimization techniques are not suitable for problems with such large search spaces; therefore, a habitat wall design tool utilizing genetic algorithms (GAs) has been developed. GAs use a "survival of the fittest" philosophy in which the most fit individuals are more likely to survive and reproduce. This habitat design optimization tool is a multiobjective formulation covering up-mass, heat loss, structural analysis, meteoroid impact protection, and radiation protection. This Technical Publication presents the research and development of this tool as well as a technique for finding the optimal GA search parameters.
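
    The sketch below shows the basic GA loop the abstract alludes to (selection of the fittest, crossover, mutation, elitism) applied to a toy two-layer wall problem with made-up mass and shielding coefficients; it is a single-objective illustration, not the multiobjective tool described in the publication.

    ```python
    import random

    def genetic_search(fitness, bounds, pop_size=30, generations=200, mutation_rate=0.2):
        """Minimal real-coded genetic algorithm sketch: truncation selection among the
        fitter half, uniform crossover, Gaussian mutation, and elitism."""
        def random_individual():
            return [random.uniform(lo, hi) for lo, hi in bounds]

        population = [random_individual() for _ in range(pop_size)]
        for _ in range(generations):
            ranked = sorted(population, key=fitness)          # lower fitness = better
            next_population = ranked[:2]                      # elitism: keep the two best
            while len(next_population) < pop_size:
                parent_a, parent_b = random.sample(ranked[: pop_size // 2], 2)
                child = [a if random.random() < 0.5 else b    # uniform crossover
                         for a, b in zip(parent_a, parent_b)]
                child = [min(hi, max(lo, g + random.gauss(0.0, 0.1 * (hi - lo))))
                         if random.random() < mutation_rate else g
                         for g, (lo, hi) in zip(child, bounds)]
                next_population.append(child)
            population = next_population
        return min(population, key=fitness)

    # Toy wall-design problem: minimize areal mass of a two-layer wall while keeping a
    # notional shielding score above a threshold (all coefficients are made up).
    def wall_fitness(design):
        t_aluminum, t_polyethylene = design                   # layer thicknesses, cm
        mass = 2.70 * t_aluminum + 0.95 * t_polyethylene      # g/cm^2
        shielding = 0.8 * t_aluminum + 1.5 * t_polyethylene   # arbitrary protection units
        return mass + (1000.0 if shielding < 5.0 else 0.0)    # penalty if under-shielded

    best_design = genetic_search(wall_fitness, bounds=[(0.0, 10.0), (0.0, 10.0)])
    ```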

  5. Do persistently fast-growing juveniles contribute disproportionately to population growth? A new analysis tool for matrix models and its application to rainforest trees.

    PubMed

    Zuidema, Pieter A; Brienen, Roel J W; During, Heinjo J; Güneralp, Burak

    2009-11-01

    Plants and animals often exhibit strong and persistent growth variation among individuals within a species. Persistently fast-growing individuals have a higher chance of reaching reproductive size, do so at a younger age, and therefore contribute disproportionately to population growth (lambda). Here we introduce a new approach to quantify this "fast-growth effect." We propose using age-size-structured matrix models in which persistently fast and slow growers are distinguished as they occur in relatively young and old age classes for a given size category. Life-cycle pathways involving fast growth can then be identified, and their contribution to lambda is quantified through loop analysis. We applied this approach to an example species, the tropical rainforest tree Cedrela odorata, that shows persistent growth variation among individuals. Loop analysis showed that juvenile trees reaching the 10-cm diameter class at below-median age contributed twice as much to lambda as slow juvenile growers. Fast growth to larger-diameter categories also contributed disproportionately to lambda. The results were robust to changes in parameter values and life-history trade-offs. These results show that the fast-growth effect can be strong in long-lived species. Persistent growth differences among individuals should therefore be accommodated for in demographic models and life-history studies.
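
    The quantities underlying this kind of analysis can be illustrated with a small projection-matrix example: lambda is the dominant eigenvalue of the matrix, and the elasticities that loop analysis sums along life-cycle pathways follow from the stable stage distribution and reproductive values. The matrix values below are invented, not the Cedrela odorata parameterization.

    ```python
    import numpy as np

    # Toy 3-stage projection matrix (columns: current stage, rows: next-year stage).
    A = np.array([
        [0.55, 0.00, 4.00],   # stasis of stage 1; fecundity of stage 3
        [0.30, 0.70, 0.00],   # growth 1 -> 2; stasis of stage 2
        [0.00, 0.25, 0.90],   # growth 2 -> 3; stasis of stage 3
    ])

    vals, right_vecs = np.linalg.eig(A)
    k = int(np.argmax(vals.real))
    lam = float(vals.real[k])                        # population growth rate, lambda
    w = np.abs(right_vecs[:, k].real)
    w /= w.sum()                                     # stable stage distribution

    vals_t, left_vecs = np.linalg.eig(A.T)
    v = np.abs(left_vecs[:, np.argmax(vals_t.real)].real)   # reproductive values

    # Elasticities: proportional contributions of each transition to lambda. Loop
    # analysis sums these along closed life-cycle pathways (fast vs. slow growth loops).
    sensitivity = np.outer(v, w) / (v @ w)
    elasticity = sensitivity * A / lam               # entries sum to 1
    ```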

  6. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    PubMed

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    Microsponges drug delivery system (MDDC) was prepared by double emulsion-solvent-diffusion technique using rotor-stator homogenization. Quality by design (QbD) concept was implemented for the development of MDDC with potential to be incorporated into semisolid dosage form (gel). Quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and Critical process parameters (CPP) were identified using quality risk management (QRM) tool, failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis along with literature data, product and process knowledge and understanding. FMECA identified amount of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and water ratio in primary/multiple emulsions as CMA and rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between identified CPP and particle size as CQA was described in the design space using design of experiments - one-factor response surface method. Obtained results from statistically designed experiments enabled establishment of mathematical models and equations that were used for detailed characterization of influence of identified CPP upon MDDC particle size and particle size distribution and their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Interpreting Meta-Analyses of Genome-Wide Association Studies

    PubMed Central

    Han, Buhm; Eskin, Eleazar

    2012-01-01

    Meta-analysis is an increasingly popular tool for combining multiple genome-wide association studies in a single analysis to identify associations with small effect sizes. The effect sizes between studies in a meta-analysis may differ and these differences, or heterogeneity, can be caused by many factors. If heterogeneity is observed in the results of a meta-analysis, interpreting the cause of heterogeneity is important because the correct interpretation can lead to a better understanding of the disease and a more effective design of a replication study. However, interpreting heterogeneous results is difficult. The standard approach of examining the association p-values of the studies does not effectively predict if the effect exists in each study. In this paper, we propose a framework facilitating the interpretation of the results of a meta-analysis. Our framework is based on a new statistic representing the posterior probability that the effect exists in each study, which is estimated utilizing cross-study information. Simulations and application to the real data show that our framework can effectively segregate the studies predicted to have an effect, the studies predicted to not have an effect, and the ambiguous studies that are underpowered. In addition to helping interpretation, the new framework also allows us to develop a new association testing procedure taking into account the existence of effect. PMID:22396665
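
    For context, a conventional inverse-variance meta-analysis with Cochran's Q and I^2 is sketched below; the framework proposed in the paper builds on such summaries by adding a per-study posterior probability that the effect exists, which is not implemented here. The effect sizes and standard errors are invented.

    ```python
    import numpy as np

    def fixed_effect_meta(betas, standard_errors):
        """Inverse-variance fixed-effect meta-analysis with Cochran's Q and I^2.
        A generic summary sketch, not the per-study posterior framework of the paper."""
        b = np.asarray(betas, dtype=float)
        se = np.asarray(standard_errors, dtype=float)
        w = 1.0 / se**2
        beta_pooled = float(np.sum(w * b) / np.sum(w))
        se_pooled = float(np.sqrt(1.0 / np.sum(w)))
        q = float(np.sum(w * (b - beta_pooled) ** 2))        # Cochran's Q
        df = len(b) - 1
        i_squared = max(0.0, (q - df) / q) if q > 0 else 0.0  # heterogeneity fraction
        return beta_pooled, se_pooled, q, i_squared

    # Example: log odds ratios and standard errors from five hypothetical GWAS.
    beta, se, q, i2 = fixed_effect_meta([0.10, 0.12, 0.02, 0.15, -0.01],
                                        [0.04, 0.05, 0.06, 0.05, 0.07])
    ```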

  8. Pressure Mapping and Efficiency Analysis of an EPPLER 857 Hydrokinetic Turbine

    NASA Astrophysics Data System (ADS)

    Clark, Tristan

    A conceptual energy ship is presented to provide renewable energy. The ship, driven by the wind, drags a hydrokinetic turbine through the water. The power generated is used to run electrolysis on board, and the resultant hydrogen is taken back to shore to be used as an energy source. The basin efficiency of the hydrokinetic turbine (HTK), defined as power divided by the product of thrust and velocity, plays a vital role in this process. In order to extract the maximum allowable power from the flow, the blades need to be optimized. The structural analysis of the blade is important, as the blade will undergo high pressure loads from the water. A procedure for analysis of a preliminary hydrokinetic turbine blade design is developed. The blade was designed by a non-optimized Blade Element Momentum Theory (BEMT) code. Six simulations were run, with varying mesh resolution, turbulence models, and flow region size. The procedure provides a detailed explanation of the entire process, from geometry and mesh generation to post-processing analysis tools. The efficiency results from the simulations are used to study the influence of mesh resolution, flow region size, and turbulence model, and are compared to the BEMT model design targets. Static pressure maps are created that can be used for structural analysis of the blades.

  9. Two worlds collide: Image analysis methods for quantifying structural variation in cluster molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steenbergen, K. G., E-mail: kgsteen@gmail.com; Gaston, N.

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.

  10. Two worlds collide: image analysis methods for quantifying structural variation in cluster molecular dynamics.

    PubMed

    Steenbergen, K G; Gaston, N

    2014-02-14

    Inspired by methods of remote sensing image analysis, we analyze structural variation in cluster molecular dynamics (MD) simulations through a unique application of the principal component analysis (PCA) and Pearson Correlation Coefficient (PCC). The PCA analysis characterizes the geometric shape of the cluster structure at each time step, yielding a detailed and quantitative measure of structural stability and variation at finite temperature. Our PCC analysis captures bond structure variation in MD, which can be used to both supplement the PCA analysis as well as compare bond patterns between different cluster sizes. Relying only on atomic position data, without requirement for a priori structural input, PCA and PCC can be used to analyze both classical and ab initio MD simulations for any cluster composition or electronic configuration. Taken together, these statistical tools represent powerful new techniques for quantitative structural characterization and isomer identification in cluster MD.
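
    A compact sketch of the two ingredients is given below: PCA of atomic positions (equivalently, the gyration tensor) to characterize cluster shape at a time step, and a Pearson correlation between thresholded distance matrices as a stand-in for the bond-pattern comparison. The coordinates and bond cutoff are arbitrary illustrations, not taken from the paper.

    ```python
    import numpy as np

    def pairwise_distances(xyz):
        """All interatomic distances for one snapshot of atomic coordinates (N x 3)."""
        xyz = np.asarray(xyz, dtype=float)
        return np.linalg.norm(xyz[:, None, :] - xyz[None, :, :], axis=-1)

    def shape_from_positions(xyz):
        """PCA of atomic positions: eigenvalues of the gyration tensor give the
        cluster's extent along its principal axes; tracking them over a trajectory
        quantifies structural variation at finite temperature."""
        xyz = np.asarray(xyz, dtype=float)
        centered = xyz - xyz.mean(axis=0)
        gyration = centered.T @ centered / len(xyz)          # 3 x 3 tensor
        evals = np.sort(np.linalg.eigvalsh(gyration))[::-1]
        asphericity = evals[0] - 0.5 * (evals[1] + evals[2])
        return evals, asphericity

    def bond_pattern_correlation(dists_a, dists_b, cutoff):
        """Pearson correlation between two thresholded bond patterns (e.g. two time
        steps of the same cluster); a simple stand-in for the PCC analysis above."""
        a = (np.asarray(dists_a) < cutoff).astype(float).ravel()
        b = (np.asarray(dists_b) < cutoff).astype(float).ravel()
        return float(np.corrcoef(a, b)[0, 1])

    # Example with random coordinates for a 13-atom cluster (illustration only).
    rng = np.random.default_rng(0)
    snap_a = rng.normal(size=(13, 3))
    snap_b = snap_a + rng.normal(scale=0.1, size=(13, 3))
    evals_a, asphericity_a = shape_from_positions(snap_a)
    similarity = bond_pattern_correlation(pairwise_distances(snap_a),
                                          pairwise_distances(snap_b), cutoff=1.5)
    ```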

  11. Transient hydrodynamic finite-size effects in simulations under periodic boundary conditions

    NASA Astrophysics Data System (ADS)

    Asta, Adelchi J.; Levesque, Maximilien; Vuilleumier, Rodolphe; Rotenberg, Benjamin

    2017-06-01

    We use lattice-Boltzmann and analytical calculations to investigate transient hydrodynamic finite-size effects induced by the use of periodic boundary conditions. These effects are inevitable in simulations at the molecular, mesoscopic, or continuum levels of description. We analyze the transient response to a local perturbation in the fluid and obtain the local velocity correlation function via linear response theory. This approach is validated by comparing the finite-size effects on the steady-state velocity with the known results for the diffusion coefficient. We next investigate the full time dependence of the local velocity autocorrelation function. We find at long times a crossover between the expected t^(-3/2) hydrodynamic tail and an oscillatory exponential decay, and study the scaling with the system size of the crossover time, exponential rate and amplitude, and oscillation frequency. We interpret these results from the analytic solution of the compressible Navier-Stokes equation for the slowest modes, which are set by the system size. The present work not only provides a comprehensive analysis of hydrodynamic finite-size effects in bulk fluids, which arise regardless of the level of description and simulation algorithm, but also establishes the lattice-Boltzmann method as a suitable tool to investigate such effects in general.

  12. Comprehensive performance comparison of high-resolution array platforms for genome-wide Copy Number Variation (CNV) analysis in humans.

    PubMed

    Haraksingh, Rajini R; Abyzov, Alexej; Urban, Alexander Eckehart

    2017-04-24

    High-resolution microarray technology is routinely used in basic research and clinical practice to efficiently detect copy number variants (CNVs) across the entire human genome. A new generation of arrays combining high probe densities with optimized designs will comprise essential tools for genome analysis in the coming years. We systematically compared the genome-wide CNV detection power of all 17 available array designs from the Affymetrix, Agilent, and Illumina platforms by hybridizing the well-characterized genome of 1000 Genomes Project subject NA12878 to all arrays, and performing data analysis using both manufacturer-recommended and platform-independent software. We benchmarked the resulting CNV call sets from each array using a gold standard set of CNVs for this genome derived from 1000 Genomes Project whole genome sequencing data. The arrays tested comprise both SNP and aCGH platforms with varying designs and contain between ~0.5 to ~4.6 million probes. Across the arrays CNV detection varied widely in number of CNV calls (4-489), CNV size range (~40 bp to ~8 Mbp), and percentage of non-validated CNVs (0-86%). We discovered strikingly strong effects of specific array design principles on performance. For example, some SNP array designs with the largest numbers of probes and extensive exonic coverage produced a considerable number of CNV calls that could not be validated, compared to designs with probe numbers that are sometimes an order of magnitude smaller. This effect was only partially ameliorated using different analysis software and optimizing data analysis parameters. High-resolution microarrays will continue to be used as reliable, cost- and time-efficient tools for CNV analysis. However, different applications tolerate different limitations in CNV detection. Our study quantified how these arrays differ in total number and size range of detected CNVs as well as sensitivity, and determined how each array balances these attributes. This analysis will inform appropriate array selection for future CNV studies, and allow better assessment of the CNV-analytical power of both published and ongoing array-based genomics studies. Furthermore, our findings emphasize the importance of concurrent use of multiple analysis algorithms and independent experimental validation in array-based CNV detection studies.

  13. A simplified economic filter for open-pit mining and heap-leach recovery of copper in the United States

    USGS Publications Warehouse

    Long, Keith R.; Singer, Donald A.

    2001-01-01

    Determining the economic viability of mineral deposits of various sizes and grades is a critical task in all phases of mineral supply, from land-use management to mine development. This study evaluates two simple tools for estimating the economic viability of porphyry copper deposits mined by open-pit, heap-leach methods when only limited information on these deposits is available. These two methods are useful for evaluating deposits that either (1) are undiscovered deposits predicted by a mineral resource assessment, or (2) have been discovered but for which little data has been collected or released. The first tool uses ordinary least-squares regression analysis of cost and operating data from selected deposits to estimate a predictive relationship between mining rate, itself estimated from deposit size, and capital and operating costs. The second method uses cost models developed by the U.S. Bureau of Mines (Camm, 1991), updated using appropriate cost indices. We find that the cost model method works best for estimating capital costs and the empirical model works best for estimating operating costs for mines to be developed in the United States.
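
    The empirical tool can be illustrated with a two-step power-law fit in log space: mining rate regressed on deposit size, then cost regressed on mining rate. The numbers below are invented placeholders, not the USGS regression coefficients.

    ```python
    import numpy as np

    # Hypothetical (deposit tonnage, daily mining rate, capital cost) data; the study
    # fits relationships of this form from real mine data, ours are made up.
    tonnage = np.array([50e6, 120e6, 300e6, 600e6])        # tonnes of ore
    mining_rate = np.array([15e3, 40e3, 90e3, 160e3])      # tonnes per day
    capital_cost = np.array([120e6, 260e6, 520e6, 900e6])  # US dollars

    # Power-law fits (y = a * x^b) via ordinary least squares in log space.
    b_rate, a_rate = np.polyfit(np.log(tonnage), np.log(mining_rate), 1)
    b_cost, a_cost = np.polyfit(np.log(mining_rate), np.log(capital_cost), 1)

    def predict_capital_cost(deposit_tonnes):
        rate = np.exp(a_rate) * deposit_tonnes**b_rate     # step 1: rate from deposit size
        return np.exp(a_cost) * rate**b_cost               # step 2: cost from mining rate

    estimate = predict_capital_cost(200e6)
    ```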

  14. An open-pattern droplet-in-oil planar array for single cell analysis based on sequential inkjet printing technology.

    PubMed

    Wang, Chenyu; Liu, Wenwen; Tan, Manqing; Sun, Hongbo; Yu, Yude

    2017-07-01

    Cellular heterogeneity represents a fundamental principle of cell biology for which a readily available single-cell research tool is urgently required. Here, we present a novel method combining cell-sized well arrays with sequential inkjet printing. Briefly, K562 cells in phosphate-buffered saline were captured at high efficiency (74.5%) in a cell-sized well as a "primary droplet" and sealed using fluorinated oil. Then, piezoelectric inkjet printing technology was adapted to precisely inject the cell lysis buffer and the fluorogenic substrate, fluorescein-di-β-D-galactopyranoside, as a "secondary droplet" to penetrate the sealing oil and fuse with the "primary droplet." We thereby successfully measured the intracellular β-galactosidase activity of K562 cells at the single-cell level. Our method is the first to simultaneously accommodate a high single-cell occupancy rate and sequential addition of reagents while retaining an open structure. We believe that the feasibility and flexibility of our method will enhance its use as a universal single-cell research tool as well as accelerate the adoption of inkjet printing in the study of cellular heterogeneity.

  15. Demonstration of FBRM as process analytical technology tool for dewatering processes via CST correlation.

    PubMed

    Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R

    2014-07-01

    The current challenges associated with the design and operation of net-energy positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests - the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers, that spanned a range of polymer size and charge properties, was measured using both the FBRM and CST tests. Analysis of the data revealed a decreasing monotonic trend; the samples that had the highest percent removal of particles less than 50 microns in size as determined by FBRM had the lowest CST values. A subset of the best performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Advanced bioanalytics for precision medicine.

    PubMed

    Roda, Aldo; Michelini, Elisa; Caliceti, Cristiana; Guardigli, Massimo; Mirasoli, Mara; Simoni, Patrizia

    2018-01-01

    Precision medicine is a new paradigm that combines diagnostic, imaging, and analytical tools to produce accurate diagnoses and therapeutic interventions tailored to the individual patient. This approach stands in contrast to the traditional "one size fits all" concept, according to which researchers develop disease treatments and preventions for an "average" patient without considering individual differences. The "one size fits all" concept has led to many ineffective or inappropriate treatments, especially for pathologies such as Alzheimer's disease and cancer. Now, precision medicine is receiving massive funding in many countries, thanks to its social and economic potential in terms of improved disease prevention, diagnosis, and therapy. Bioanalytical chemistry is critical to precision medicine. This is because identifying an appropriate tailored therapy requires researchers to collect and analyze information on each patient's specific molecular biomarkers (e.g., proteins, nucleic acids, and metabolites). In other words, precision diagnostics is not possible without precise bioanalytical chemistry. This Trend article highlights some of the most recent advances, including massive analysis of multilayer omics, and new imaging technique applications suitable for implementing precision medicine. Graphical abstract Precision medicine combines bioanalytical chemistry, molecular diagnostics, and imaging tools for performing accurate diagnoses and selecting optimal therapies for each patient.

  17. Enhanced and selective optical trapping in a slot-graphite photonic crystal.

    PubMed

    Krishnan, Aravind; Huang, Ningfeng; Wu, Shao-Hua; Martínez, Luis Javier; Povinelli, Michelle L

    2016-10-03

    Applicability of optical trapping tools for nanomanipulation is limited by the available laser power and trap efficiency. We utilized the strong confinement of light in a slot-graphite photonic crystal to develop high-efficiency parallel trapping over a large area. The stiffness is 35 times higher than our previously demonstrated on-chip, near field traps. We demonstrate the ability to trap both dielectric and metallic particles of sub-micron size. We find that the growth kinetics of nanoparticle arrays on the slot-graphite template depends on particle size. This difference is exploited to selectively trap one type of particle out of a binary colloidal mixture, creating an efficient optical sieve. This technique has rich potential for analysis, diagnostics, and enrichment and sorting of microscopic entities.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Guanguang; Chen, Baowei; Zhang, Rui

    In this work, new strategies involving organic bases were evaluated for depolymerizing lignin into smaller molecular fragments in aqueous medium. NaOH, an inorganic base, was also investigated as a reference. Lignin samples in their full natural form were used for the study. As research tools to unravel the complexity of the macromolecular lignin structure and its bulky molecular size, size exclusion chromatography and high-resolution mass spectrometric analysis, typically used for protein characterization, were used to follow the progress of lignin depolymerization by measuring the molecular weight distribution of the products and determining the key molecular fingerprints, respectively. The results show that sodium phenoxide and guanidine carbonate are effective catalysts for lignin depolymerization. A synergism is observed between H2O2 and the organic base, which is strongest with guanidine carbonate.

  19. Analysis of Basis Weight Uniformity of Microfiber Nonwovens and Its Impact on Permeability and Filtration Properties

    NASA Astrophysics Data System (ADS)

    Amirnasr, Elham

    It is widely recognized that nonwoven basis weight non-uniformity affects various properties of nonwovens. However, few studies can be found on this topic. The development of uniformity definitions and measurement methods, and the study of their impact on web properties such as filtration behavior and air permeability, would be beneficial both in industrial applications and in academia: they could serve as quality control tools and would provide insights into nonwoven behaviors that cannot be explained by average values alone. We therefore pursue the development of an optical analytical tool for quantifying nonwoven web basis weight uniformity. The quadrant method and clustering analysis were utilized in an image analysis scheme to help define "uniformity" and its spatial variation. Implementing the quadrant method in an image analysis system allows the establishment of a uniformity index that can be used to quantify the degree of uniformity. Clustering analysis was also adapted and verified using uniform and random simulated images with known parameters, and the number of clusters and cluster properties such as size, membership and density were determined. We used these measurement methods to evaluate the uniformity of nonwovens produced with different processes and investigated the impact of uniformity on filtration and permeability. The results show that the uniformity index computed from the quadrant method covers a useful range of non-uniformity for nonwoven webs. Clustering analysis was also applied to reference nonwovens of known visual uniformity; the results suggest that cluster size is a promising uniformity parameter, with non-uniform nonwovens showing larger cluster sizes than uniform nonwovens. A relationship was then sought between web properties and the uniformity index as a web characteristic. To this end, the filtration properties, air permeability, solidity and uniformity index of meltblown and spunbond samples were measured. The filtration tests show some deviation between theoretical and experimental filtration efficiency when different definitions of fiber diameter are considered; this deviation can arise from variation in basis weight non-uniformity, so an appropriate theory is required to predict the variation of filtration efficiency with respect to the non-uniformity of nonwoven filter media. The air permeability tests showed a relationship between the quadrant-method uniformity index and the measured properties: air permeability decreases as the uniformity index of the nonwoven web increases.
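
    The quadrant (quadrat) idea can be sketched as follows: tile the basis-weight image, then use the variability of the per-quadrat means as a (non-)uniformity index. The exact index definition in this work may differ; the coefficient of variation is used here purely for illustration.

    ```python
    import numpy as np

    def quadrant_uniformity_index(image, grid=(8, 8)):
        """Tile a basis-weight (grey-level) image into quadrats and return the
        coefficient of variation of the per-quadrat means; larger values indicate a
        less uniform web. Illustrative definition, not necessarily the thesis's."""
        img = np.asarray(image, dtype=float)
        quadrat_means = np.array([block.mean()
                                  for row in np.array_split(img, grid[0], axis=0)
                                  for block in np.array_split(row, grid[1], axis=1)])
        return float(quadrat_means.std() / quadrat_means.mean())

    # Synthetic example: a uniform web versus one with a heavy (high basis weight) patch.
    rng = np.random.default_rng(1)
    uniform_web = rng.normal(100.0, 2.0, size=(256, 256))
    patchy_web = uniform_web.copy()
    patchy_web[:64, :64] += 20.0
    ui_uniform = quadrant_uniformity_index(uniform_web)
    ui_patchy = quadrant_uniformity_index(patchy_web)    # noticeably larger
    ```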

  20. Biomimetic Dissolution: A Tool to Predict Amorphous Solid Dispersion Performance.

    PubMed

    Puppolo, Michael M; Hughey, Justin R; Dillon, Traciann; Storey, David; Jansen-Varnum, Susan

    2017-11-01

    The presented study describes the development of a membrane permeation non-sink dissolution method that can provide analysis of complete drug speciation and emulate the in vivo performance of poorly water-soluble Biopharmaceutical Classification System class II compounds. The designed membrane permeation methodology permits evaluation of free/dissolved/unbound drug from amorphous solid dispersion formulations with the use of a two-cell apparatus, biorelevant dissolution media, and a biomimetic polymer membrane. It offers insight into oral drug dissolution, permeation, and absorption. Amorphous solid dispersions of felodipine were prepared by hot melt extrusion and spray drying techniques and evaluated for in vitro performance. Prior to ranking performance of extruded and spray-dried felodipine solid dispersions, optimization of the dissolution methodology was performed for parameters such as agitation rate, membrane type, and membrane pore size. The particle size and zeta potential were analyzed during dissolution experiments to understand drug/polymer speciation and supersaturation sustainment of felodipine solid dispersions. Bland-Altman analysis was performed to measure the agreement or equivalence between dissolution profiles acquired using polymer membranes and porcine intestines and to establish the biomimetic nature of the treated polymer membranes. The utility of the membrane permeation dissolution methodology is seen during the evaluation of felodipine solid dispersions produced by spray drying and hot melt extrusion. The membrane permeation dissolution methodology can suggest formulation performance and be employed as a screening tool for selection of candidates to move forward to pharmacokinetic studies. Furthermore, the presented model is a cost-effective technique.
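
    The Bland-Altman comparison mentioned above reduces to a bias and 95% limits of agreement on the paired differences, as in the sketch below; the profile values are hypothetical.

    ```python
    import numpy as np

    def bland_altman(method_a, method_b):
        """Bland-Altman agreement between two paired measurement series (e.g. percent
        drug released through the polymer membrane vs. porcine intestine at matched
        time points). Returns the mean difference (bias) and 95% limits of agreement."""
        a = np.asarray(method_a, dtype=float)
        b = np.asarray(method_b, dtype=float)
        diffs = a - b
        bias = float(diffs.mean())
        half_width = 1.96 * float(diffs.std(ddof=1))
        return bias, (bias - half_width, bias + half_width)

    # Hypothetical paired dissolution values (% released) for the two barriers.
    membrane = [5.1, 12.3, 20.4, 31.0, 38.2, 44.9]
    intestine = [4.8, 12.9, 19.5, 30.2, 39.0, 46.1]
    bias, limits = bland_altman(membrane, intestine)
    ```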

  1. Concept design theory and model for multi-use space facilities: Analysis of key system design parameters through variance of mission requirements

    NASA Astrophysics Data System (ADS)

    Reynerson, Charles Martin

    This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered as revenue producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates potential return on investments, initial investment requirements and number of years to return on the initial investment. Example cases are analyzed for both performance and cost driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability for multiple space business park markets.
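
    The mass-driven cost logic can be sketched as a simple payback calculation: an initial investment proportional to facility mass, recovered by the net annual revenue. All coefficients below are placeholders, not the values calibrated in this research.

    ```python
    def payback_years(facility_mass_kg, build_cost_per_kg, launch_cost_per_kg,
                      annual_revenue, annual_operating_cost):
        """Rough-order payback estimate for a mass-driven facility cost model: the
        initial investment scales with facility mass and is recovered by the net
        annual revenue. All coefficients are placeholders, not calibrated values."""
        initial_investment = facility_mass_kg * (build_cost_per_kg + launch_cost_per_kg)
        net_annual = annual_revenue - annual_operating_cost
        if net_annual <= 0:
            return float("inf")                   # the facility never pays back
        return initial_investment / net_annual

    # Example: a 100 t facility at $80k/kg development plus $10k/kg launch,
    # earning $2.0B/yr against $0.5B/yr operating cost -> about 6 years to payback.
    years = payback_years(100_000, 80_000, 10_000, 2.0e9, 0.5e9)
    ```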

  2. Combining gas-phase electrophoretic mobility molecular analysis (GEMMA), light scattering, field flow fractionation and cryo electron microscopy in a multidimensional approach to characterize liposomal carrier vesicles.

    PubMed

    Urey, Carlos; Weiss, Victor U; Gondikas, Andreas; von der Kammer, Frank; Hofmann, Thilo; Marchetti-Deschmann, Martina; Allmaier, Günter; Marko-Varga, György; Andersson, Roland

    2016-11-20

    For drug delivery, characterization of liposomes regarding size, particle number concentration, occurrence of low-sized liposome artefacts and drug encapsulation is of importance for understanding their pharmacodynamic properties. In our study, we aimed to demonstrate the applicability of the nano Electrospray Gas-Phase Electrophoretic Mobility Molecular Analyser (nES GEMMA) as a suitable technique for analyzing these parameters. We measured number-based particle concentrations, identified differences in size between nominally identical liposomal samples, and detected the presence of low-diameter material which yielded bimodal particle size distributions. Subsequently, we compared these findings to dynamic light scattering (DLS) data and results from light scattering experiments coupled to Asymmetric Flow-Field Flow Fractionation (AF4), the latter improving the detectability of smaller particles in polydisperse samples due to a size separation step prior to detection. However, the bimodal size distribution could not be detected due to method-inherent limitations. In contrast, cryo transmission electron microscopy corroborated the nES GEMMA results. Hence, gas-phase electrophoresis proved to be a versatile tool for liposome characterization as it could analyze both vesicle size and size distribution. Finally, a correlation of nES GEMMA results with cell viability experiments was carried out to demonstrate the importance of liposome batch-to-batch control, as low-sized sample components possibly impact cell viability. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Data and Tools | Energy Analysis | NREL

    Science.gov Websites

    NREL develops energy analysis data and tools, organized into collections including Data Products, Technology and Performance Analysis Tools, Energy Systems Analysis Tools, and Economic and Financial Analysis Tools.

  4. Optimization of Gate, Runner and Sprue in Two-Plate Family Plastic Injection Mould

    NASA Astrophysics Data System (ADS)

    Amran, M. A.; Hadzley, M.; Amri, S.; Izamshah, R.; Hassan, A.; Samsi, S.; Shahir, K.

    2010-03-01

    This paper describes the optimization of the size of the gate, runner and sprue in a two-plate family plastic injection mould. An Electronic Cash Register (ECR) plastic product was used in this study; it consists of three components: a top casing, a bottom casing and a paper holder. The objectives of this paper are to find the optimum size of the gate, runner and sprue, to locate the optimum layout of the cavities, and to recognize the defect problems caused by incorrectly sized gates, runners and sprues. Three types of software were used in this study: Unigraphics as the CAD tool for 3D modeling, Rhinoceros as the post-processing tool for designing the gate, runner and sprue, and Moldex as the simulation tool for analyzing the plastic flow. As a result, modifications were made to the size of the feeding system and the location of the cavities to eliminate short-shot, over-filling and weld-line problems in the two-plate family plastic injection mould.

  5. Verification of the Icarus Material Response Tool

    NASA Technical Reports Server (NTRS)

    Schroeder, Olivia; Palmer, Grant; Stern, Eric; Schulz, Joseph; Muppidi, Suman; Martin, Alexandre

    2017-01-01

    Due to the complex physics encountered during reentry, material response solvers are used for two main purposes: to improve the understanding of the physical phenomena, and to design and size thermal protection systems (TPS). Icarus is a three-dimensional, unstructured material response tool intended to be used for design while maintaining the flexibility to easily implement physical models as needed. Because TPS selection and sizing are critical, it is of the utmost importance that design tools be extensively verified and validated before use. Verification tests aim at ensuring that the numerical schemes and equations are implemented correctly, through comparison to analytical solutions and grid convergence tests.

  6. Manufacture and evaluation of 3-dimensional printed sizing tools for use during intraoperative breast brachytherapy.

    PubMed

    Walker, Joshua M; Elliott, David A; Kubicky, Charlotte D; Thomas, Charles R; Naik, Arpana M

    2016-01-01

    Three-dimensional (3D) printing has emerged as a promising modality for the production of medical devices. Here we describe the design, production, and implementation of a series of sizing tools for use in an intraoperative breast brachytherapy program. These devices were produced using a commercially available low-cost 3D printer and software, and their implementation resulted in an immediate decrease in consumable costs without affecting the quality of care or the speed of delivery. This work illustrates the potential of 3D printing to revolutionize the field of medical devices, enabling physicians to rapidly develop and prototype novel tools.

  7. Subset Analysis of a Multicenter, Randomized Controlled Trial to Compare Magnifying Chromoendoscopy with Endoscopic Ultrasonography for Stage Diagnosis of Early Stage Colorectal Cancer.

    PubMed

    Yamada, Tomonori; Shimura, Takaya; Ebi, Masahide; Hirata, Yoshikazu; Nishiwaki, Hirotaka; Mizushima, Takashi; Asukai, Koki; Togawa, Shozo; Takahashi, Satoru; Joh, Takashi

    2015-01-01

    Our recent prospective study found equivalent accuracy of magnifying chromoendoscopy (MC) and endoscopic ultrasonography (EUS) for diagnosing the invasion depth of colorectal cancer (CRC); however, whether these tools show diagnostic differences in categories such as tumor size and morphology remains unclear. Hence, we conducted detailed subset analysis of the prospective data. In this multicenter, prospective, comparative trial, a total of 70 patients with early, flat CRC were enrolled from February 2011 to December 2012, and the results of 66 lesions were finally analyzed. Patients were randomly allocated to primary MC followed by EUS or to primary EUS followed by MC. Diagnoses of invasion depth by each tool were divided into intramucosal to slight submucosal invasion (invasion depth <1000 μm) and deep submucosal invasion (invasion depth ≥1000 μm), and then compared with the final pathological diagnosis by an independent pathologist blinded to clinical data. To standardize diagnoses among examiners, this trial was started after achievement of a mean κ value of ≥0.6 which was calculated from the average of κ values between each pair of participating endoscopists. Both MC and EUS showed similar diagnostic outcomes, with no significant differences in prediction of invasion depth in subset analyses according to tumor size, location, and morphology. Lesions that were consistently diagnosed as Tis/T1-SMS or ≥T1-SMD with both tools revealed accuracy of 76-78%. Accuracy was low in borderline lesions with irregular pit pattern in MC and distorted findings of the third layer in EUS (MC, 58.5%; EUS, 50.0%). MC and EUS showed the same limited accuracy for predicting invasion depth in all categories of early CRC. Since the irregular pit pattern in MC, distorted findings to the third layer in EUS and inconsistent diagnosis between both tools were associated with low accuracy, further refinements or even novel methods are still needed for such lesions. University hospital Medical Information Network Clinical Trials Registry UMIN 000005085.
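    For context, the pre-trial standardization step above hinges on pairwise inter-rater agreement. A minimal sketch of how pairwise Cohen's κ values could be computed and averaged is shown below; the rating data, category labels and rater names are hypothetical and not taken from the trial.

        from itertools import combinations
        from collections import Counter

        def cohens_kappa(a, b):
            """Cohen's kappa for two raters' categorical calls on the same lesions."""
            n = len(a)
            observed = sum(x == y for x, y in zip(a, b)) / n
            ca, cb = Counter(a), Counter(b)
            expected = sum(ca[c] * cb[c] for c in set(a) | set(b)) / n**2
            return (observed - expected) / (1 - expected)

        # Hypothetical invasion-depth calls by three endoscopists
        ratings = {
            "A": ["<1000um", ">=1000um", "<1000um", "<1000um", ">=1000um"],
            "B": ["<1000um", ">=1000um", ">=1000um", "<1000um", ">=1000um"],
            "C": ["<1000um", "<1000um", "<1000um", "<1000um", ">=1000um"],
        }
        pairwise = [cohens_kappa(ratings[i], ratings[j])
                    for i, j in combinations(ratings, 2)]
        print(f"mean pairwise kappa = {sum(pairwise) / len(pairwise):.2f}")  # trial threshold: >= 0.6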

  8. Rapid Active Sampling Package

    NASA Technical Reports Server (NTRS)

    Peters, Gregory

    2010-01-01

    A field-deployable, battery-powered Rapid Active Sampling Package (RASP), originally designed for sampling strong materials during lunar and planetary missions, shows strong utility for terrestrial geological use. The technology is proving to be simple and effective for sampling and processing strong materials. Although it originally was intended for planetary and lunar applications, the RASP is very useful as a powered hand tool for geologists and the mining industry to quickly sample and process rocks in the field on Earth. The RASP allows geologists to surgically acquire samples of rock for later laboratory analysis. This tool, roughly the size of a wrench, allows the user to cut away swaths of weathering rinds, revealing pristine rock surfaces for observation and subsequent sampling with the same tool. RASPing deeper (~3.5 cm) exposes single rock strata in situ. Where a geologist's hammer can only expose unweathered layers of rock, the RASP can do the same, and then has the added ability to capture and process samples into powder with particle sizes less than 150 microns, making them easier to analyze by XRD/XRF (X-ray diffraction/X-ray fluorescence). The tool uses a rotating rasp bit (or two counter-rotating bits) that resides inside or above the catch container. The container has an open slot to allow the bit to extend outside the container and to allow cuttings to enter and be caught. When the slot and rasp bit are in contact with a substrate, the bit is plunged into it in a matter of seconds to reach pristine rock. A user in the field may sample a rock multiple times at multiple depths in minutes, instead of having to cut out huge, heavy rock samples for transport back to a lab for analysis. Because of the speed and accuracy of the RASP, hundreds of samples can be taken in one day. RASP-acquired samples are small and easily carried. A user can characterize more area in less time than by using conventional methods. The field-deployable RASP used a NiCad rechargeable battery. Power usage was less than 1 Wh/cm³ even when sampling strong basalts, so many samples could be taken on a single battery charge.

  9. The verification of printability about marginal defects and the detectability at the inspection tool in sub 50nm node

    NASA Astrophysics Data System (ADS)

    Lee, Hyemi; Jeong, Goomin; Seo, Kangjun; Kim, Sangchul; Kim, Changreol

    2008-05-01

    As mask design rules shrink, defects have become one of the main factors reducing mask yield, and the defect size that must be controlled during mask manufacturing keeps getting smaller. According to the 2007 ITRS roadmap, the controlled defect size on a mask is 46 nm at the 57 nm node and 36 nm at the 45 nm node. However, the development of inspection machines has lagged behind the pace of photolithography development. Mask manufacturing is generally divided into three parts. The first part is patterning the mask; the second is inspecting the pattern and repairing defects, for which transmitted-light inspection tools are normally used and remain the most trusted of the inspection tools developed to date. The final part is shipping the mask after qualifying its issue points and weak points. Issue points on a mask are qualified using AIMS (Aerial Image Measurement System). This system, however, carries an inherent source of error: AIMS measures issue points based on the inspection results, which assumes that defects printed on a wafer are larger than the minimum size detectable by the inspection tools and that the inspection tool detects nearly all defects. Even though no tools exist to detect the 46 nm and 36 nm defects specified by the ITRS roadmap, this assumption is applied when manufacturing 57 nm and 45 nm devices. We therefore fabricated a programmed-defect mask containing various defect types, such as spots, clear extensions, dark extensions and CD variation, on L/S (line and space), C/H (contact hole) and active patterns at the 55 nm and 45 nm nodes. The programmed-defect mask was inspected with a transmitted-light inspection tool and measured with AIMS 45-193i, and the marginal defects were compared between the inspection tool and AIMS. In this way we could verify whether the defect size that the ITRS roadmap suggests must be controlled on a mask is appropriate. The results could also indicate which inspection tools, among transmitted-light, reflected-light and aerial-image types, are appropriate for next-generation devices.

  10. ShapeRotator: An R tool for standardized rigid rotations of articulated three-dimensional structures with application for geometric morphometrics.

    PubMed

    Vidal-García, Marta; Bandara, Lashi; Keogh, J Scott

    2018-05-01

    The quantification of complex morphological patterns typically involves comprehensive shape and size analyses, usually obtained by gathering morphological data from all the structures that capture the phenotypic diversity of an organism or object. Articulated structures are a critical component of overall phenotypic diversity, but data gathered from these structures are difficult to incorporate into modern analyses because of the complexities associated with jointly quantifying 3D shape in multiple structures. While there are existing methods for analyzing shape variation in articulated structures in two-dimensional (2D) space, these methods do not work in 3D, a rapidly growing area of capability and research. Here, we describe a simple geometric rigid rotation approach that removes the effect of random translation and rotation, enabling the morphological analysis of 3D articulated structures. Our method is based on Cartesian coordinates in 3D space, so it can be applied to any morphometric problem that also uses 3D coordinates (e.g., spherical harmonics). We demonstrate the method by applying it to a landmark-based dataset for analyzing shape variation using geometric morphometrics. We have developed an R tool (ShapeRotator) so that the method can be easily implemented in the commonly used R package geomorph and MorphoJ software. This method will be a valuable tool for 3D morphological analyses in articulated structures by allowing an exhaustive examination of shape and size diversity.
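    As a rough illustration of the underlying idea (a generic Kabsch/SVD alignment, not the ShapeRotator implementation itself), arbitrary translation and rotation can be removed from a 3D landmark configuration by centering the coordinates and applying a least-squares rigid rotation; the landmark arrays below are hypothetical.

        import numpy as np

        def rigid_align(moving, reference):
            """Rotate and translate one 3D landmark set onto another (no scaling)."""
            mov_c = moving - moving.mean(axis=0)        # remove translation
            ref_c = reference - reference.mean(axis=0)
            u, _, vt = np.linalg.svd(mov_c.T @ ref_c)   # cross-covariance of configurations
            d = np.sign(np.linalg.det(u @ vt))          # guard against reflections
            rot = u @ np.diag([1.0, 1.0, d]) @ vt       # proper rotation matrix
            return mov_c @ rot + reference.mean(axis=0)

        # Hypothetical 4-landmark configurations (rows = landmarks, columns = x, y, z)
        ref = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
        turn = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])  # 90-degree rotation
        mov = ref @ turn + 5.0                                        # rotated and shifted copy
        print(np.allclose(rigid_align(mov, ref), ref))                # True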

  11. A web-based subsetting service for regional scale MODIS land products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SanthanaVannan, Suresh K; Cook, Robert B; Holladay, Susan K

    2009-12-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) sensor has provided valuable information on various aspects of the Earth System since March 2000. The spectral, spatial, and temporal characteristics of MODIS products have made them an important data source for analyzing key science questions relating to Earth System processes at regional, continental, and global scales. The size of the MODIS product and native HDF-EOS format are not optimal for use in field investigations at individual sites (100 - 100 km or smaller). In order to make MODIS data readily accessible for field investigations, the NASA-funded Distributed Active Archive Center (DAAC) for Biogeochemical Dynamics at Oak Ridge National Laboratory (ORNL) has developed an online system that provides MODIS land products in an easy-to-use format and in file sizes more appropriate to field research. This system provides MODIS land products data in a nonproprietary comma delimited ASCII format and in GIS compatible formats (GeoTIFF and ASCII grid). Web-based visualization tools are also available as part of this system and these tools provide a quick snapshot of the data. Quality control tools and a multitude of data delivery options are available to meet the demands of various user communities. This paper describes the important features and design goals for the system, particularly in the context of data archive and distribution for regional scale analysis. The paper also discusses the ways in which data from this system can be used for validation, data intercomparison, and modeling efforts.

  12. Integrated continuous bioprocessing: Economic, operational, and environmental feasibility for clinical and commercial antibody manufacture.

    PubMed

    Pollock, James; Coffman, Jon; Ho, Sa V; Farid, Suzanne S

    2017-07-01

    This paper presents a systems approach to evaluating the potential of integrated continuous bioprocessing for monoclonal antibody (mAb) manufacture across a product's lifecycle from preclinical to commercial manufacture. The economic, operational, and environmental feasibility of alternative continuous manufacturing strategies were evaluated holistically using a prototype UCL decisional tool that integrated process economics, discrete-event simulation, environmental impact analysis, operational risk analysis, and multiattribute decision-making. The case study focused on comparing whole bioprocesses that used either batch, continuous or a hybrid combination of batch and continuous technologies for cell culture, capture chromatography, and polishing chromatography steps. The cost of goods per gram (COG/g), E-factor, and operational risk scores of each strategy were established across a matrix of scenarios with differing combinations of clinical development phase and company portfolio size. The tool outputs predict that the optimal strategy for early phase production and small/medium-sized companies is the integrated continuous strategy (alternating tangential flow filtration (ATF) perfusion, continuous capture, continuous polishing). However, the top ranking strategy changes for commercial production and companies with large portfolios to the hybrid strategy with fed-batch culture, continuous capture and batch polishing from a COG/g perspective. The multiattribute decision-making analysis highlighted that if the operational feasibility was considered more important than the economic benefits, the hybrid strategy would be preferred for all company scales. Further considerations outside the scope of this work include the process development costs required to adopt continuous processing. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:854-866, 2017. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.

  13. Integrated continuous bioprocessing: Economic, operational, and environmental feasibility for clinical and commercial antibody manufacture

    PubMed Central

    Pollock, James; Coffman, Jon; Ho, Sa V.

    2017-01-01

    This paper presents a systems approach to evaluating the potential of integrated continuous bioprocessing for monoclonal antibody (mAb) manufacture across a product's lifecycle from preclinical to commercial manufacture. The economic, operational, and environmental feasibility of alternative continuous manufacturing strategies were evaluated holistically using a prototype UCL decisional tool that integrated process economics, discrete‐event simulation, environmental impact analysis, operational risk analysis, and multiattribute decision‐making. The case study focused on comparing whole bioprocesses that used either batch, continuous or a hybrid combination of batch and continuous technologies for cell culture, capture chromatography, and polishing chromatography steps. The cost of goods per gram (COG/g), E‐factor, and operational risk scores of each strategy were established across a matrix of scenarios with differing combinations of clinical development phase and company portfolio size. The tool outputs predict that the optimal strategy for early phase production and small/medium‐sized companies is the integrated continuous strategy (alternating tangential flow filtration (ATF) perfusion, continuous capture, continuous polishing). However, the top ranking strategy changes for commercial production and companies with large portfolios to the hybrid strategy with fed‐batch culture, continuous capture and batch polishing from a COG/g perspective. The multiattribute decision‐making analysis highlighted that if the operational feasibility was considered more important than the economic benefits, the hybrid strategy would be preferred for all company scales. Further considerations outside the scope of this work include the process development costs required to adopt continuous processing. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:854–866, 2017 PMID:28480535

  14. Distributing File-Based Data to Remote Sites Within the BABAR Collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gowdy, Stephen J.

    BABAR [1] uses two formats for its data: Objectivity database and root [2] files. This poster concerns the distribution of the latter--for Objectivity data see [3]. The BABAR analysis data is stored in root files--one per physics run and analysis selection channel--maintained in a large directory tree. Currently BABAR has more than 4.5 TBytes in 200,000 root files. This data is (mostly) produced at SLAC, but is required for analysis at universities and research centers throughout the US and Europe. Two basic problems confront us when we seek to import bulk data from SLAC to an institute's local storage via the network. We must determine which files must be imported (depending on the local site requirements and which files have already been imported), and we must make the optimum use of the network when transferring the data. Basic ftp-like tools (ftp, scp, etc.) do not attempt to solve the first problem. More sophisticated tools like rsync [4], the widely-used mirror/synchronization program, compare local and remote file systems, checking for changes (based on file date, size and, if desired, an elaborate checksum) in order to only copy new or modified files. However rsync allows for only limited file selection. Also when, as in BABAR, an extremely large directory structure must be scanned, rsync can take several hours just to determine which files need to be copied. Although rsync (and scp) provides on-the-fly compression, it does not allow us to optimize the network transfer by using multiple streams, adjusting the TCP window size, or separating encrypted authentication from unencrypted data channels.

  15. The albatross plot: A novel graphical tool for presenting results of diversely reported studies in a systematic review

    PubMed Central

    Jones, Hayley E.; Martin, Richard M.; Lewis, Sarah J.; Higgins, Julian P.T.

    2017-01-01

    Abstract Meta‐analyses combine the results of multiple studies of a common question. Approaches based on effect size estimates from each study are generally regarded as the most informative. However, these methods can only be used if comparable effect sizes can be computed from each study, and this may not be the case due to variation in how the studies were done or limitations in how their results were reported. Other methods, such as vote counting, are then used to summarize the results of these studies, but most of these methods are limited in that they do not provide any indication of the magnitude of effect. We propose a novel plot, the albatross plot, which requires only a 1‐sided P value and a total sample size from each study (or equivalently a 2‐sided P value, direction of effect and total sample size). The plot allows an approximate examination of underlying effect sizes and the potential to identify sources of heterogeneity across studies. This is achieved by drawing contours showing the range of effect sizes that might lead to each P value for given sample sizes, under simple study designs. We provide examples of albatross plots using data from previous meta‐analyses, allowing for comparison of results, and an example from when a meta‐analysis was not possible. PMID:28453179
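    To make the contour idea concrete, here is a minimal sketch (an illustration under stated assumptions, not the authors' code) in which each study is assumed to report a Pearson correlation tested with the Fisher z approximation. Plotting each study's P value against its sample size and overlaying such curves for a few fixed effect sizes gives the essence of an albatross plot.

        import numpy as np
        from scipy.stats import norm

        def one_sided_p(r, n):
            """One-sided P value for a correlation r at sample size n (Fisher z approximation)."""
            return norm.sf(np.arctanh(r) * np.sqrt(n - 3))

        def contour_r(p, n):
            """Effect size (correlation) implied by a one-sided P value p at sample size n."""
            return np.tanh(norm.isf(p) / np.sqrt(n - 3))

        assert abs(one_sided_p(contour_r(0.05, 100), 100) - 0.05) < 1e-9  # round trip

        # Hypothetical studies reporting only a P value and a total sample size
        for p, n in [(0.04, 50), (0.20, 120), (0.001, 400)]:
            print(f"n = {n:3d}, one-sided p = {p:.3f} -> implied |r| ~ {contour_r(p, n):.2f}")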

  16. Automated Defect and Correlation Length Analysis of Block Copolymer Thin Film Nanopatterns

    PubMed Central

    Murphy, Jeffrey N.; Harris, Kenneth D.; Buriak, Jillian M.

    2015-01-01

    Line patterns produced by lamellae- and cylinder-forming block copolymer (BCP) thin films are of widespread interest for their potential to enable nanoscale patterning over large areas. In order for such patterning methods to effectively integrate with current technologies, the resulting patterns need to have low defect densities, and be produced in a short timescale. To understand whether a given polymer or annealing method might potentially meet such challenges, it is necessary to examine the evolution of defects. Unfortunately, few tools are readily available to researchers, particularly those engaged in the synthesis and design of new polymeric systems with the potential for patterning, to measure defects in such line patterns. To this end, we present an image analysis tool, which we have developed and made available, to measure the characteristics of such patterns in an automated fashion. Additionally we apply the tool to six cylinder-forming polystyrene-block-poly(2-vinylpyridine) polymers thermally annealed to explore the relationship between the size of each polymer and measured characteristics including line period, line-width, defect density, line-edge roughness (LER), line-width roughness (LWR), and correlation length. Finally, we explore the line-edge roughness, line-width roughness, defect density, and correlation length as a function of the image area sampled to determine each in a more rigorous fashion. PMID:26207990

  17. ScanIndel: a hybrid framework for indel detection via gapped alignment, split reads and de novo assembly.

    PubMed

    Yang, Rendong; Nelson, Andrew C; Henzler, Christine; Thyagarajan, Bharat; Silverstein, Kevin A T

    2015-12-07

    Comprehensive identification of insertions/deletions (indels) across the full size spectrum from second generation sequencing is challenging due to the relatively short read length inherent in the technology. Different indel calling methods exist but are limited in detection to specific sizes with varying accuracy and resolution. We present ScanIndel, an integrated framework for detecting indels with multiple heuristics including gapped alignment, split reads and de novo assembly. Using simulation data, we demonstrate ScanIndel's superior sensitivity and specificity relative to several state-of-the-art indel callers across various coverage levels and indel sizes. ScanIndel yields higher predictive accuracy with lower computational cost compared with existing tools for both targeted resequencing data from tumor specimens and high coverage whole-genome sequencing data from the human NIST standard NA12878. Thus, we anticipate ScanIndel will improve indel analysis in both clinical and research settings. ScanIndel is implemented in Python, and is freely available for academic use at https://github.com/cauyrd/ScanIndel.

  18. Efficient Bayesian mixed model analysis increases association power in large cohorts

    PubMed Central

    Loh, Po-Ru; Tucker, George; Bulik-Sullivan, Brendan K; Vilhjálmsson, Bjarni J; Finucane, Hilary K; Salem, Rany M; Chasman, Daniel I; Ridker, Paul M; Neale, Benjamin M; Berger, Bonnie; Patterson, Nick; Price, Alkes L

    2014-01-01

    Linear mixed models are a powerful statistical tool for identifying genetic associations and avoiding confounding. However, existing methods are computationally intractable in large cohorts, and may not optimize power. All existing methods require time cost O(MN²) (where N = #samples and M = #SNPs) and implicitly assume an infinitesimal genetic architecture in which effect sizes are normally distributed, which can limit power. Here, we present a far more efficient mixed model association method, BOLT-LMM, which requires only a small number of O(MN)-time iterations and increases power by modeling more realistic, non-infinitesimal genetic architectures via a Bayesian mixture prior on marker effect sizes. We applied BOLT-LMM to nine quantitative traits in 23,294 samples from the Women’s Genome Health Study (WGHS) and observed significant increases in power, consistent with simulations. Theory and simulations show that the boost in power increases with cohort size, making BOLT-LMM appealing for GWAS in large cohorts. PMID:25642633
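    As a hedged sketch of the modeling idea (notation mine, not reproduced from the paper), the infinitesimal and non-infinitesimal variants of the mixed model differ only in the prior placed on the per-marker effect sizes:

        y = X\beta + \epsilon, \qquad \epsilon \sim \mathcal{N}(0,\, \sigma_e^2 I)

        \text{infinitesimal prior:}\quad \beta_m \sim \mathcal{N}\!\left(0,\, \sigma_g^2 / M\right)

        \text{mixture prior:}\quad \beta_m \sim \pi\, \mathcal{N}\!\left(0,\, \sigma_1^2\right) + (1 - \pi)\, \mathcal{N}\!\left(0,\, \sigma_2^2\right)

    The spike-and-slab-like mixture lets a small fraction of markers carry large effects, which is what yields the reported gain in association power over the Gaussian (infinitesimal) assumption.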

  19. Computed Tomography to Estimate the Representative Elementary Area for Soil Porosity Measurements

    PubMed Central

    Borges, Jaqueline Aparecida Ribaski; Pires, Luiz Fernando; Belmont Pereira, André

    2012-01-01

    Computed tomography (CT) is a technique that provides images of different solid and porous materials. CT could be an ideal tool to study representative sizes of soil samples because of the noninvasive characteristic of this technique. The scrutiny of such representative elementary sizes (RESs) has been the target of attention of many researchers in the soil physics field owing to the strong relationship between physical properties and size of the soil sample. In the current work, data from gamma-ray CT were used to assess RES in measurements of soil porosity (ϕ). For statistical analysis, a study of the full width at half maximum (FWHM) of the fitted distribution of ϕ at different areas (1.2 to 1162.8 mm²) selected inside the tomographic images was proposed herein. The results obtained point out that samples with a section area corresponding to at least 882.1 mm² were the ones that provided representative values of ϕ for the studied Brazilian tropical soil. PMID:22666133
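    A minimal sketch of the FWHM criterion as described, using synthetic porosity values in place of the tomographic data (the Gaussian fit and the numbers below are assumptions for illustration): fit a normal distribution to the porosity values computed over equally sized subareas and report FWHM = 2*sqrt(2*ln 2)*sigma; the representative elementary area is the smallest area at which the FWHM stops narrowing appreciably.

        import numpy as np

        def porosity_fwhm(porosity_values):
            """FWHM of a normal fit to porosity values measured over equally sized subareas."""
            sigma = np.std(porosity_values, ddof=1)
            return 2.0 * np.sqrt(2.0 * np.log(2.0)) * sigma

        rng = np.random.default_rng(0)
        # Hypothetical: porosity scatter shrinks as the sampled subarea grows
        for area_mm2, spread in [(1.2, 0.12), (100.0, 0.05), (882.1, 0.015)]:
            phi = rng.normal(0.45, spread, size=200)
            print(f"area {area_mm2:7.1f} mm^2 -> FWHM {porosity_fwhm(phi):.3f}")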

  20. Detection, Tracking and Analysis of Turbulent Spots and Other Coherent Structures in Unsteady Transition

    NASA Technical Reports Server (NTRS)

    Lewalle, Jacques; Ashpis, David (Technical Monitor)

    2000-01-01

    Transition on turbine blades is an important factor in the determination of eventual flow separation and engine performance. The phenomenon is strongly affected by unsteady flow conditions (wake passing). It is likely that some physics of unsteadiness should be included in advanced models, but it is unclear which properties would best embody this information. In this paper, we use a GEAE experimental database in unsteady transition to test some tools of spot identification, tracking and characterization. In this preliminary study, we identify some parameters that appear to be insensitive to wake passing effects, such as convection speed, and others more likely to require unsteady modeling. The main findings are that wavelet duration can be used as a measure of spot size, and that spot energy density is most closely correlated to the wake passing. The energy density is also correlated to spot size, but spot size appears unrelated to the phase angle. Recommendations are made for further study.

  1. Scattering from phase-separated vesicles. I. An analytical form factor for multiple static domains

    DOE PAGES

    Heberle, Frederick A.; Anghel, Vinicius N. P.; Katsaras, John

    2015-08-18

    This is the first in a series of studies considering elastic scattering from laterally heterogeneous lipid vesicles containing multiple domains. Unique among biophysical tools, small-angle neutron scattering can in principle give detailed information about the size, shape and spatial arrangement of domains. A general theory for scattering from laterally heterogeneous vesicles is presented, and the analytical form factor for static domains with arbitrary spatial configuration is derived, including a simplification for uniformly sized round domains. The validity of the model, including series truncation effects, is assessed by comparison with simulated data obtained from a Monte Carlo method. Several aspects of the analytical solution for scattering intensity are discussed in the context of small-angle neutron scattering data, including the effect of varying domain size and number, as well as solvent contrast. Finally, the analysis indicates that effects of domain formation are most pronounced when the vesicle's average scattering length density matches that of the surrounding solvent.

  2. Ion-size dependent electroosmosis of viscoelastic fluids in microfluidic channels with interfacial slip

    NASA Astrophysics Data System (ADS)

    Mukherjee, Siddhartha; Goswami, Prakash; Dhar, Jayabrata; Dasgupta, Sunando; Chakraborty, Suman

    2017-07-01

    We report a study on the ion-size dependent electroosmosis of viscoelastic fluids in microfluidic channels with interfacial slip. Here, we derive an analytical solution for the potential distribution in a parallel plate microchannel, where the effects of finite sized ionic species are taken into account by invoking the free energy formalism. Following this, a purely electroosmotic flow of a simplified Phan-Thien-Tanner (sPTT) fluid is considered. For the sPTT model, linear, quadratic, and exponential kernels are chosen for the stress coefficient function describing its viscoelastic nature across various ranges of Deborah number. The theoretical framework presented in our analysis has been successfully compared with experimental results available in the literature. We believe that the implications of the considered effects on the net volumetric throughput will not only provide a deeper theoretical insight to interpret the electrokinetic data in the presence of ionic species but also serve as a fundamental design tool for novel electrokinetically driven lab-on-a-chip biofluidic devices.

  3. LandEx - Fast, FOSS-Based Application for Query and Retrieval of Land Cover Patterns

    NASA Astrophysics Data System (ADS)

    Netzel, P.; Stepinski, T.

    2012-12-01

    The amount of satellite-based spatial data is continuously increasing, making the development of efficient data search tools a priority. The bulk of existing research on searching satellite-gathered data concentrates on images and is based on the concept of Content-Based Image Retrieval (CBIR); however, available solutions are not efficient and robust enough to be put to use as deployable web-based search tools. Here we report on the development of a practical, deployable tool that searches a classified map rather than raw images. LandEx (Landscape Explorer) is a GeoWeb-based tool for Content-Based Pattern Retrieval (CBPR) of patterns contained within the National Land Cover Dataset 2006 (NLCD2006). The USGS-developed NLCD2006 is derived from Landsat multispectral images; it covers the entire conterminous U.S. with a resolution of 30 meters/pixel and it depicts 16 land cover classes. The size of NLCD2006 is about 16 Gpixels (161,000 x 100,000 pixels). LandEx is a multi-tier GeoWeb application based on Open Source Software. Main components are: GeoExt/OpenLayers (user interface), GeoServer (OGC WMS, WCS and WPS server), and GRASS (calculation engine). LandEx performs search using a query-by-example approach: the user selects a reference scene (exhibiting a chosen pattern of land cover classes) and the tool produces, in real time, a map indicating the degree of similarity between the reference pattern and all local patterns across the U.S. The scene pattern is encapsulated by a 2D histogram of classes and sizes of single-class clumps. Pattern similarity is based on the notion of mutual information. The resultant similarity map can be viewed and navigated in a web browser, or it can be downloaded as a GeoTIFF file for more in-depth analysis. LandEx is available at http://sil.uc.edu
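    The similarity measure is described only as being "based on the notion of mutual information" between 2D class/clump-size histograms. One standard way to realize that idea, sketched below purely as an illustration (the histograms and the exact measure used by LandEx are assumptions), is the mutual information between a scene indicator and the histogram bins, which equals the Jensen-Shannon divergence between the two normalized histograms.

        import numpy as np

        def histogram_similarity(h1, h2, eps=1e-12):
            """Similarity of two 2D pattern histograms: 1 minus the Jensen-Shannon
            divergence (in bits), i.e. the mutual information between 'which scene'
            and 'which bin' when the two scenes are equally weighted."""
            p = h1.ravel() / h1.sum()
            q = h2.ravel() / h2.sum()
            m = 0.5 * (p + q)
            kl = lambda a, b: np.sum(a * np.log2((a + eps) / (b + eps)))
            return 1.0 - (0.5 * kl(p, m) + 0.5 * kl(q, m))

        # Hypothetical 2D histograms over (land-cover class, clump-size bin)
        rng = np.random.default_rng(1)
        reference = rng.random((16, 8))
        candidate = reference + 0.1 * rng.random((16, 8))
        print(f"similarity = {histogram_similarity(reference, candidate):.3f}")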

  4. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease.

    PubMed

    Tuszynski, Tobias; Rullmann, Michael; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Gertz, Hermann-Josef; Hesse, Swen; Seese, Anita; Lobsien, Donald; Sabri, Osama; Barthel, Henryk

    2016-06-01

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of those tools are poorly investigated so far in analyzing amyloid PET data, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis.
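    For illustration, the statistics reported above (SUVRs with a cerebellar reference, Mann-Whitney U tests, Cohen's d and Pearson correlations) could be reproduced on one's own VOI data roughly as follows; the arrays are hypothetical and are not data from the study.

        import numpy as np
        from scipy.stats import mannwhitneyu, pearsonr

        def suvr(voi_uptake, cerebellar_uptake):
            """Standardized uptake value ratio with the cerebellar cortex as reference."""
            return voi_uptake / cerebellar_uptake

        def cohens_d(a, b):
            """Effect size based on the pooled standard deviation."""
            na, nb = len(a), len(b)
            pooled = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                             / (na + nb - 2))
            return (np.mean(a) - np.mean(b)) / pooled

        rng = np.random.default_rng(42)
        # Hypothetical composite neocortex SUVRs for 10 AD patients and 10 healthy controls
        ad = suvr(rng.normal(1.9, 0.25, 10), rng.normal(1.0, 0.05, 10))
        hc = suvr(rng.normal(1.3, 0.20, 10), rng.normal(1.0, 0.05, 10))
        u, p = mannwhitneyu(ad, hc, alternative="two-sided")
        print(f"Mann-Whitney U = {u:.0f}, p = {p:.4f}, Cohen's d = {cohens_d(ad, hc):.2f}")

        # Correlation between SUVRs from manually drawn and automatically generated VOIs
        manual = np.concatenate([ad, hc])
        automated = manual + rng.normal(0.0, 0.1, manual.size)
        r, p_r = pearsonr(manual, automated)
        print(f"Pearson r = {r:.2f} (p = {p_r:.3g})")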

  5. The phenotype of cancer cell invasion controlled by fibril diameter and pore size of 3D collagen networks.

    PubMed

    Sapudom, Jiranuwat; Rubner, Stefan; Martin, Steve; Kurth, Tony; Riedel, Stefanie; Mierke, Claudia T; Pompe, Tilo

    2015-06-01

    The behavior of cancer cells is strongly influenced by the properties of extracellular microenvironments, including topology, mechanics and composition. As topological and mechanical properties of the extracellular matrix are hard to access and control for in-depth studies of underlying mechanisms in vivo, defined biomimetic in vitro models are needed. Herein we show, how pore size and fibril diameter of collagen I networks distinctively regulate cancer cell morphology and invasion. Three-dimensional collagen I matrices with a tight control of pore size, fibril diameter and stiffness were reconstituted by adjustment of concentration and pH value during matrix reconstitution. At first, a detailed analysis of topology and mechanics of matrices using confocal laser scanning microscopy, image analysis tools and force spectroscopy indicate pore size and not fibril diameter as the major determinant of matrix elasticity. Secondly, by using two different breast cancer cell lines (MDA-MB-231 and MCF-7), we demonstrate collagen fibril diameter--and not pore size--to primarily regulate cell morphology, cluster formation and invasion. Invasiveness increased and clustering decreased with increasing fibril diameter for both, the highly invasive MDA-MB-231 cells with mesenchymal migratory phenotype and the MCF-7 cells with amoeboid migratory phenotype. As this behavior was independent of overall pore size, matrix elasticity is shown to be not the major determinant of the cell characteristics. Our work emphasizes the complex relationship between structural-mechanical properties of the extracellular matrix and invasive behavior of cancer cells. It suggests a correlation of migratory and invasive phenotype of cancer cells in dependence on topological and mechanical features of the length scale of single fibrils and not on coarse-grained network properties. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Comparison of known food weights with image-based portion-size automated estimation and adolescents' self-reported portion size.

    PubMed

    Lee, Christina D; Chae, Junghoon; Schap, TusaRebecca E; Kerr, Deborah A; Delp, Edward J; Ebert, David S; Boushey, Carol J

    2012-03-01

    Diet is a critical element of diabetes self-management. An emerging area of research is the use of images for dietary records using mobile telephones with embedded cameras. These tools are being designed to reduce user burden and to improve accuracy of portion-size estimation through automation. The objectives of this study were to (1) assess the error of automatically determined portion weights compared to known portion weights of foods and (2) to compare the error between automation and human. Adolescents (n = 15) captured images of their eating occasions over a 24 h period. All foods and beverages served were weighed. Adolescents self-reported portion sizes for one meal. Image analysis was used to estimate portion weights. Data analysis compared known weights, automated weights, and self-reported portions. For the 19 foods, the mean ratio of automated weight estimate to known weight ranged from 0.89 to 4.61, and 9 foods were within 0.80 to 1.20. The largest error was for lettuce and the most accurate was strawberry jam. The children were fairly accurate with portion estimates for two foods (sausage links, toast) using one type of estimation aid and two foods (sausage links, scrambled eggs) using another aid. The automated method was fairly accurate for two foods (sausage links, jam); however, the 95% confidence intervals for the automated estimates were consistently narrower than human estimates. The ability of humans to estimate portion sizes of foods remains a problem and a perceived burden. Errors in automated portion-size estimation can be systematically addressed while minimizing the burden on people. Future applications that take over the burden of these processes may translate to better diabetes self-management. © 2012 Diabetes Technology Society.

  7. Flaw depth sizing using guided waves

    NASA Astrophysics Data System (ADS)

    Cobb, Adam C.; Fisher, Jay L.

    2016-02-01

    Guided wave inspection technology is most often applied as a survey tool for pipeline inspection, where relatively low frequency ultrasonic waves, compared to those used in conventional ultrasonic nondestructive evaluation (NDE) methods, propagate along the structure; discontinuities cause a reflection of the sound back to the sensor for flaw detection. Although the technology can be used to accurately locate a flaw over long distances, the flaw sizing performance, especially for flaw depth estimation, is much poorer than other, local NDE approaches. Estimating flaw depth, as opposed to other parameters, is of particular interest for failure analysis of many structures. At present, most guided wave technologies estimate the size of the flaw based on the reflected signal amplitude from the flaw compared to a known geometry reflection, such as a circumferential weld in a pipeline. This process, however, requires many assumptions to be made, such as weld geometry and flaw shape. Furthermore, it is highly dependent on the amplitude of the flaw reflection, which can vary based on many factors, such as attenuation and sensor installation. To improve sizing performance, especially depth estimation, and do so in a way that is not strictly amplitude dependent, this paper describes an approach to estimate the depth of a flaw based on a multimodal analysis. This approach eliminates the need of using geometric reflections for calibration and can be used for both pipeline and plate inspection applications. To verify the approach, a test set was manufactured on plate specimens with flaws of different widths and depths ranging from 5% to 100% of total wall thickness; 90% of these flaws were sized to within 15% of their true value. A description of the initial multimodal sizing strategy and results will be discussed.

  8. A Bayesian nonparametric method for prediction in EST analysis

    PubMed Central

    Lijoi, Antonio; Mena, Ramsés H; Prünster, Igor

    2007-01-01

    Background Expressed sequence tags (ESTs) analyses are a fundamental tool for gene identification in organisms. Given a preliminary EST sample from a certain library, several statistical prediction problems arise. In particular, it is of interest to estimate how many new genes can be detected in a future EST sample of given size and also to determine the gene discovery rate: these estimates represent the basis for deciding whether to proceed sequencing the library and, in case of a positive decision, a guideline for selecting the size of the new sample. Such information is also useful for establishing sequencing efficiency in experimental design and for measuring the degree of redundancy of an EST library. Results In this work we propose a Bayesian nonparametric approach for tackling statistical problems related to EST surveys. In particular, we provide estimates for: a) the coverage, defined as the proportion of unique genes in the library represented in the given sample of reads; b) the number of new unique genes to be observed in a future sample; c) the discovery rate of new genes as a function of the future sample size. The Bayesian nonparametric model we adopt conveys, in a statistically rigorous way, the available information into prediction. Our proposal has appealing properties over frequentist nonparametric methods, which become unstable when prediction is required for large future samples. EST libraries, previously studied with frequentist methods, are analyzed in detail. Conclusion The Bayesian nonparametric approach we undertake yields valuable tools for gene capture and prediction in EST libraries. The estimators we obtain do not feature the kind of drawbacks associated with frequentist estimators and are reliable for any size of the additional sample. PMID:17868445

  9. Dealing with the Conflicting Results of Psycholinguistic Experiments: How to Resolve Them with the Help of Statistical Meta-analysis.

    PubMed

    Rákosi, Csilla

    2018-01-22

    This paper proposes the use of the tools of statistical meta-analysis as a method of conflict resolution with respect to experiments in cognitive linguistics. With the help of statistical meta-analysis, the effect size of similar experiments can be compared, a well-founded and robust synthesis of the experimental data can be achieved, and possible causes of any divergence(s) in the outcomes can be revealed. This application of statistical meta-analysis offers a novel method of how diverging evidence can be dealt with. The workability of this idea is exemplified by a case study dealing with a series of experiments conducted as non-exact replications of Thibodeau and Boroditsky (PLoS ONE 6(2):e16782, 2011. https://doi.org/10.1371/journal.pone.0016782 ).

  10. Size, weight and position: ion mobility spectrometry and imaging MS combined.

    PubMed

    Kiss, András; Heeren, Ron M A

    2011-03-01

    Size, weight and position are three of the most important parameters that describe a molecule in a biological system. Ion mobility spectrometry is capable of separating molecules on the basis of their size or shape, whereas imaging mass spectrometry is an effective tool to measure the molecular weight and spatial distribution of molecules. Recent developments in both fields enabled the combination of the two technologies. As a result, ion-mobility-based imaging mass spectrometry is gaining more and more popularity as a (bio-)analytical tool enabling the determination of the size, weight and position of several molecules simultaneously on biological surfaces. This paper reviews the evolution of ion-mobility-based imaging mass spectrometry and provides examples of its application in analytical studies of biological surfaces.

  11. Relationships among charcoal particles from modern lacustrine sediments and remotely sensed fire events

    NASA Astrophysics Data System (ADS)

    López-Pérez, M.; Correa-Metrio, A.

    2013-05-01

    Analysis of charcoal particles from lacustrine sediments is a useful tool to understand fire regimes through time, and their relationships with climate and vegetation. However, the extent of the relationship between charcoal particles and their origin in terms of the spatial and temporal extent of the fire events is poorly known in the tropics. Modern sediments were collected from lakes in the Yucatan Peninsula and Central Mexico, 51 and 22 lakes respectively, to analyze their charcoal concentration and its relationships with modern fire events. The number of modern fire events was derived from the public source Fire Information for Resource Management System (FIRMS) for concentric spatial rings that ranged from 1 to 30 km in radius. The association between charcoal and fires was evaluated through the construction of linear models to explain charcoal concentration as a function of the number of fires recorded. Additionally, charcoal particles were stratified according to size to determine the association between fire distance and charcoal size classes. The relationship between total charcoal concentration and fire events was stronger for central Mexico than for the Yucatan Peninsula, which is probably the result of differences in vegetation cover. The highest determination coefficients were obtained for charcoal particle sizes ranging between 0.2 and 0.8 mm², and for fire event distances of between 0 and 15 km from the lake. Overall, the analyses presented here offer useful tools to quantitatively and spatially reconstruct past regional fire dynamics in Central Mexico and the Yucatan Peninsula.

  12. System analysis tools for an ELT at ESO

    NASA Astrophysics Data System (ADS)

    Mueller, Michael; Koch, Franz

    2006-06-01

    Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a single component of the telescope, like the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper presents several software tools developed by the European Southern Observatory (ESO) which focus on this system approach in the analyses: Using modal results of a finite element analysis, the SMI-toolbox allows easy generation of structural models with different sizes and levels of accuracy for control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and for interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition to this, a sparse state-space model object was developed for Matlab to gain computational efficiency and reduce memory requirements by exploiting the sparsity pattern of both the structural models and the control architecture. As a result, these tools allow an integrated model to be built in order to reliably simulate interactions, cross-coupling effects and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS; it performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.
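    To illustrate how modal FE results are typically turned into control-oriented structural models of selectable size (a generic sketch, not the SMI-toolbox itself), each retained mode with natural frequency w_i and damping ratio z_i contributes one sparse 2x2 block:

        import numpy as np
        from scipy import sparse

        def modal_state_space(omegas, zetas, phi_in, phi_out):
            """Assemble sparse (A, B, C) state-space matrices from FE modal data.
            omegas, zetas: natural frequencies [rad/s] and damping ratios of retained modes;
            phi_in, phi_out: modal participation factors at the input/output locations."""
            blocks = [np.array([[0.0, 1.0],
                                [-w**2, -2.0 * z * w]]) for w, z in zip(omegas, zetas)]
            A = sparse.block_diag(blocks, format="csr")
            B = sparse.csr_matrix(np.array([[0.0, p] for p in phi_in]).reshape(-1, 1))
            C = sparse.csr_matrix(np.array([[p, 0.0] for p in phi_out]).reshape(1, -1))
            return A, B, C

        # Hypothetical three-mode truncation of a large telescope-structure FE model
        A, B, C = modal_state_space(omegas=[12.0, 40.0, 95.0],
                                    zetas=[0.005, 0.005, 0.01],
                                    phi_in=[0.8, 0.3, 0.1],
                                    phi_out=[1.0, -0.4, 0.2])
        print(A.shape, B.shape, C.shape, "nnz(A) =", A.nnz)

    Keeping only the modes relevant to a given analysis is what makes models of different sizes and levels of accuracy cheap to generate and to couple with the control design.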

  13. Arabidopsis phenotyping through Geometric Morphometrics.

    PubMed

    Manacorda, Carlos A; Asurmendi, Sebastian

    2018-06-18

    Recently, much technical progress was achieved in the field of plant phenotyping. High-throughput platforms and the development of improved algorithms for rosette image segmentation make it now possible to extract shape and size parameters for genetic, physiological and environmental studies on a large scale. The development of low-cost phenotyping platforms and freeware resources make it possible to widely expand phenotypic analysis tools for Arabidopsis. However, objective descriptors of shape parameters that could be used independently of platform and segmentation software used are still lacking and shape descriptions still rely on ad hoc or even sometimes contradictory descriptors, which could make comparisons difficult and perhaps inaccurate. Modern geometric morphometrics is a family of methods in quantitative biology proposed to be the main source of data and analytical tools in the emerging field of phenomics studies. Based on the location of landmarks (corresponding points) over imaged specimens and by combining geometry, multivariate analysis and powerful statistical techniques, these tools offer the possibility to reproducibly and accurately account for shape variations amongst groups and measure them in shape distance units. Here, a particular scheme of landmarks placement on Arabidopsis rosette images is proposed to study shape variation in the case of viral infection processes. Shape differences between controls and infected plants are quantified throughout the infectious process and visualized. Quantitative comparisons between two unrelated ssRNA+ viruses are shown and reproducibility issues are assessed. Combined with the newest automated platforms and plant segmentation procedures, geometric morphometric tools could boost phenotypic features extraction and processing in an objective, reproducible manner.

  14. MobilomeFINDER: web-based tools for in silico and experimental discovery of bacterial genomic islands

    PubMed Central

    Ou, Hong-Yu; He, Xinyi; Harrison, Ewan M.; Kulasekara, Bridget R.; Thani, Ali Bin; Kadioglu, Aras; Lory, Stephen; Hinton, Jay C. D.; Barer, Michael R.; Rajakumar, Kumar

    2007-01-01

    MobilomeFINDER (http://mml.sjtu.edu.cn/MobilomeFINDER) is an interactive online tool that facilitates bacterial genomic island or ‘mobile genome’ (mobilome) discovery; it integrates the ArrayOme and tRNAcc software packages. ArrayOme utilizes a microarray-derived comparative genomic hybridization input data set to generate ‘inferred contigs’ produced by merging adjacent genes classified as ‘present’. Collectively these ‘fragments’ represent a hypothetical ‘microarray-visualized genome (MVG)’. ArrayOme permits recognition of discordances between physical genome and MVG sizes, thereby enabling identification of strains rich in microarray-elusive novel genes. Individual tRNAcc tools facilitate automated identification of genomic islands by comparative analysis of the contents and contexts of tRNA sites and other integration hotspots in closely related sequenced genomes. Accessory tools facilitate design of hotspot-flanking primers for in silico and/or wet-science-based interrogation of cognate loci in unsequenced strains and analysis of islands for features suggestive of foreign origins; island-specific and genome-contextual features are tabulated and represented in schematic and graphical forms. To date we have used MobilomeFINDER to analyse several Enterobacteriaceae, Pseudomonas aeruginosa and Streptococcus suis genomes. MobilomeFINDER enables high-throughput island identification and characterization through increased exploitation of emerging sequence data and PCR-based profiling of unsequenced test strains; subsequent targeted yeast recombination-based capture permits full-length sequencing and detailed functional studies of novel genomic islands. PMID:17537813

  15. Pilot Study of an Open-source Image Analysis Software for Automated Screening of Conventional Cervical Smears.

    PubMed

    Sanyal, Parikshit; Ganguli, Prosenjit; Barui, Sanghita; Deb, Prabal

    2018-01-01

    The Pap-stained cervical smear is a screening tool for cervical cancer. Commercial systems are used for automated screening of liquid-based cervical smears. However, there is no image analysis software used for conventional cervical smears. The aim of this study was to develop and test the diagnostic accuracy of software for the analysis of conventional smears. The software was developed using the Python programming language and open source libraries. It was standardized with images from the Bethesda Interobserver Reproducibility Project. One hundred and thirty images from smears which had been reported as Negative for Intraepithelial Lesion or Malignancy (NILM), and 45 images where some abnormality had been reported, were collected from the archives of the hospital. The software was then tested on the images. The software was able to segregate images based on overall nuclear:cytoplasmic ratio, coefficient of variation (CV) in nuclear size, nuclear membrane irregularity, and clustering. 68.88% of abnormal images were flagged by the software, as well as 19.23% of NILM images. The major difficulties faced were segmentation of overlapping cell clusters and separation of neutrophils. The software shows potential as a screening tool for conventional cervical smears; however, further refinement of the technique is required.
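    A minimal sketch of the kind of per-image features described above (overall nuclear:cytoplasmic ratio and coefficient of variation in nuclear size), computed here from hypothetical segmented areas rather than the study's own segmentation pipeline; the flagging thresholds are illustrative only.

        import numpy as np

        def nc_ratio(nuclear_areas, cytoplasm_areas):
            """Overall nuclear:cytoplasmic area ratio for one image."""
            return np.sum(nuclear_areas) / np.sum(cytoplasm_areas)

        def nuclear_size_cv(nuclear_areas):
            """Coefficient of variation (%) of nuclear areas; high values suggest anisonucleosis."""
            return 100.0 * np.std(nuclear_areas, ddof=1) / np.mean(nuclear_areas)

        # Hypothetical segmented areas (in pixels) for the cells of a single image
        nuclei = np.array([310, 295, 920, 350, 1100, 330])        # two enlarged nuclei
        cytoplasm = np.array([2900, 3100, 2400, 3000, 2100, 2800])
        flagged = nc_ratio(nuclei, cytoplasm) > 0.15 or nuclear_size_cv(nuclei) > 40.0
        print(f"N:C = {nc_ratio(nuclei, cytoplasm):.2f}, "
              f"CV = {nuclear_size_cv(nuclei):.1f}%, flagged: {flagged}")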

  16. Estimating the Diets of Animals Using Stable Isotopes and a Comprehensive Bayesian Mixing Model

    PubMed Central

    Hopkins, John B.; Ferguson, Jake M.

    2012-01-01

    Using stable isotope mixing models (SIMMs) as a tool to investigate the foraging ecology of animals is gaining popularity among researchers. As a result, statistical methods are rapidly evolving and numerous models have been produced to estimate the diets of animals—each with their benefits and their limitations. Deciding which SIMM to use is contingent on factors such as the consumer of interest, its food sources, sample size, the familiarity a user has with a particular framework for statistical analysis, or the level of inference the researcher desires to make (e.g., population- or individual-level). In this paper, we provide a review of commonly used SIMM models and describe a comprehensive SIMM that includes all features commonly used in SIMM analysis and two new features. We used data collected in Yosemite National Park to demonstrate IsotopeR's ability to estimate dietary parameters. We then examined the importance of each feature in the model and compared our results to inferences from commonly used SIMMs. IsotopeR's user interface (in R) will provide researchers a user-friendly tool for SIMM analysis. The model is also applicable for use in paleontology, archaeology, and forensic studies as well as estimating pollution inputs. PMID:22235246
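    For reference, the core of most SIMMs, IsotopeR included, is a mass-balance constraint of roughly the following form (a simplified sketch; IsotopeR layers additional features such as concentration dependence, discrimination and measurement-error structures on top of it):

        \delta^{J}_{\mathrm{mix}} = \sum_{k=1}^{K} p_k \left( \delta^{J}_{k} + \Delta^{J}_{k} \right),
        \qquad \sum_{k=1}^{K} p_k = 1, \quad p_k \ge 0

    Here δJk is the isotope value of source k for isotope J, ΔJk its diet-tissue discrimination factor, and p_k the dietary proportion to be estimated, typically with a Dirichlet-style prior in the Bayesian setting.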

  17. Quantitative Analysis Of Three-dimensional Branching Systems From X-ray Computed Microtomography Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKinney, Adriana L.; Varga, Tamas

    Branching structures such as lungs, blood vessels and plant roots play a critical role in life. Growth, structure, and function of these branching structures have an immense effect on our lives. Therefore, quantitative size information on such structures in their native environment is invaluable for studying their growth and the effect of the environment on them. X-ray computed tomography (XCT) has been an effective tool for in situ imaging and analysis of branching structures. We developed a costless tool that approximates the surface and volume of branching structures. Our methodology of noninvasive imaging, segmentation and extraction of quantitative information is demonstrated through the analysis of a plant root in its soil medium from 3D tomography data. XCT data collected on a grass specimen was used to visualize its root structure. A suite of open-source software was employed to segment the root from the soil and determine its isosurface, which was used to calculate its volume and surface. This methodology of processing 3D data is applicable to other branching structures even when the structure of interest is of similar x-ray attenuation to its environment and difficulties arise with sample segmentation.
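    A minimal open-source sketch of the segmentation-to-quantification step described above, using generic scikit-image calls on a synthetic volume (an illustration, not the authors' exact workflow or thresholds):

        import numpy as np
        from skimage import measure

        # Synthetic stand-in for a reconstructed XCT volume: a bright sphere in a duller matrix
        z, y, x = np.ogrid[-32:32, -32:32, -32:32]
        volume = np.where(x**2 + y**2 + z**2 < 20**2, 1.0, 0.2)
        volume += np.random.default_rng(0).normal(0.0, 0.02, volume.shape)

        # Segment the structure from its medium with a simple global threshold
        mask = volume > 0.6
        voxel_size = 0.05                          # mm per voxel edge (hypothetical)
        volume_mm3 = mask.sum() * voxel_size**3    # volume by voxel counting

        # Isosurface via marching cubes, then mesh surface area
        verts, faces, _, _ = measure.marching_cubes(volume, level=0.6,
                                                    spacing=(voxel_size,) * 3)
        surface_mm2 = measure.mesh_surface_area(verts, faces)
        print(f"volume ~ {volume_mm3:.2f} mm^3, surface ~ {surface_mm2:.2f} mm^2")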

  18. Recent advances in the development and application of nanoelectrodes.

    PubMed

    Fan, Yunshan; Han, Chu; Zhang, Bo

    2016-10-07

    Nanoelectrodes have key advantages compared to electrodes of conventional size and are the tool of choice for numerous applications in both fundamental electrochemistry research and bioelectrochemical analysis. This Minireview summarizes recent advances in the development, characterization, and use of nanoelectrodes in nanoscale electroanalytical chemistry. Methods of nanoelectrode preparation include laser-pulled glass-sealed metal nanoelectrodes, mass-produced nanoelectrodes, carbon nanotube based and carbon-filled nanopipettes, and tunneling nanoelectrodes. Several new topics of their recent application are covered, which include the use of nanoelectrodes for electrochemical imaging at ultrahigh spatial resolution, imaging with nanoelectrodes and nanopipettes, electrochemical analysis of single cells, single enzymes, and single nanoparticles, and the use of nanoelectrodes to understand single nanobubbles.

  19. Non-parametric correlative uncertainty quantification and sensitivity analysis: Application to a Langmuir bimolecular adsorption model

    NASA Astrophysics Data System (ADS)

    Feng, Jinchao; Lansford, Joshua; Mironenko, Alexander; Pourkargar, Davood Babaei; Vlachos, Dionisios G.; Katsoulakis, Markos A.

    2018-03-01

    We propose non-parametric methods for both local and global sensitivity analysis of chemical reaction models with correlated parameter dependencies. The developed mathematical and statistical tools are applied to a benchmark Langmuir competitive adsorption model on a close packed platinum surface, whose parameters, estimated from quantum-scale computations, are correlated and are limited in size (small data). The proposed mathematical methodology employs gradient-based methods to compute sensitivity indices. We observe that ranking influential parameters depends critically on whether or not correlations between parameters are taken into account. The impact of uncertainty in the correlation and the necessity of the proposed non-parametric perspective are demonstrated.
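
    For illustration only, the sketch below computes gradient-based local sensitivity indices for a simplified single-site Langmuir competitive-adsorption coverage by central finite differences; it ignores the parameter correlations that are the focus of the paper, and the equilibrium constants and partial pressures are assumed values.

      import numpy as np

      # Simplified Langmuir competitive adsorption coverage for species A:
      # theta_A = K_A * p_A / (1 + K_A * p_A + K_B * p_B)
      def coverage_A(params, p_A=0.5, p_B=0.5):
          K_A, K_B = params
          return K_A * p_A / (1.0 + K_A * p_A + K_B * p_B)

      def local_sensitivity(f, params, rel_step=1e-4):
          """Local sensitivity indices d f / d ln(K_i), estimated by central differences."""
          params = np.asarray(params, dtype=float)
          indices = np.empty_like(params)
          for i, k in enumerate(params):
              up, down = params.copy(), params.copy()
              up[i] = k * (1 + rel_step)
              down[i] = k * (1 - rel_step)
              indices[i] = (f(up) - f(down)) / (2 * rel_step)   # equals K_i * df/dK_i
          return indices

      print(local_sensitivity(coverage_A, [2.0, 5.0]))   # assumed equilibrium constants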

  20. Application of the GRC Stirling Convertor System Dynamic Model

    NASA Technical Reports Server (NTRS)

    Regan, Timothy F.; Lewandowski, Edward J.; Schreiber, Jeffrey G. (Technical Monitor)

    2004-01-01

    The GRC Stirling Convertor System Dynamic Model (SDM) has been developed to simulate the dynamic performance of power systems incorporating free-piston Stirling convertors. This paper discusses its use in evaluating system dynamics and other system concerns. Detailed examples show the use of the model in evaluating off-nominal operating conditions. The many degrees of freedom in both the mechanical and electrical domains inherent in the Stirling convertor, together with the nonlinear dynamics, make simulation an attractive analysis tool in conjunction with classical analysis. Application of the SDM to studying the effect of the size of the resonant circuit quality factor (commonly referred to as Q) in the various resonant mechanical and electrical sub-systems is discussed.
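
    As a hedged aside on the quantity discussed above, the snippet below computes the resonant frequency and quality factor Q of a simple mass-spring-damper resonator; the parameter values are placeholders and are not taken from the GRC convertor or the SDM.

      import math

      # Illustrative quality-factor calculation for a mass-spring-damper resonator,
      # the kind of figure examined for each resonant sub-system.
      m = 0.30      # moving mass, kg (placeholder)
      k = 3.0e4     # spring stiffness, N/m (placeholder)
      c = 6.0       # damping coefficient, N*s/m (placeholder)

      omega_0 = math.sqrt(k / m)       # natural frequency, rad/s
      Q = math.sqrt(k * m) / c         # quality factor of the resonance
      print(f"f0 = {omega_0 / (2 * math.pi):.1f} Hz, Q = {Q:.1f}")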

  1. Annotate-it: a Swiss-knife approach to annotation, analysis and interpretation of single nucleotide variation in human disease

    PubMed Central

    2012-01-01

    The increasing size and complexity of exome/genome sequencing data requires new tools for clinical geneticists to discover disease-causing variants. Bottlenecks in identifying the causative variation include poor cross-sample querying, constantly changing functional annotation and not considering existing knowledge concerning the phenotype. We describe a methodology that facilitates exploration of patient sequencing data towards identification of causal variants under different genetic hypotheses. Annotate-it facilitates handling, analysis and interpretation of high-throughput single nucleotide variant data. We demonstrate our strategy using three case studies. Annotate-it is freely available and test data are accessible to all users at http://www.annotate-it.org. PMID:23013645

  2. System Modeling of Lunar Oxygen Production: Mass and Power Requirements

    NASA Technical Reports Server (NTRS)

    Steffen, Christopher J.; Freeh, Joshua E.; Linne, Diane L.; Faykus, Eric W.; Gallo, Christopher A.; Green, Robert D.

    2007-01-01

    A systems analysis tool for estimating the mass and power requirements for a lunar oxygen production facility is introduced. The individual modeling components involve the chemical processing and cryogenic storage subsystems needed to process a beneficiated regolith stream into liquid oxygen via ilmenite reduction. The power can be supplied from one of six different fission reactor-converter systems. A baseline system analysis, capable of producing 15 metric tons of oxygen per annum, is presented. The influence of reactor-converter choice was seen to have a small but measurable impact on the system configuration and performance. Finally, the mission concept of operations can have a substantial impact upon individual component size and power requirements.
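
    A back-of-the-envelope Python sketch of the kind of mass-flow arithmetic such a sizing tool performs is given below; the ilmenite stoichiometry follows FeTiO3 + H2 -> Fe + TiO2 + H2O (one oxygen atom extracted per formula unit, recovered by electrolysis of the water), while the ilmenite mass fraction and process yield are assumptions, not values from the cited model.

      # Back-of-the-envelope mass flow for an ilmenite-reduction oxygen plant.
      M_ILMENITE = 151.7   # g/mol FeTiO3
      M_O = 16.0           # g/mol of oxygen extracted per formula unit

      o2_per_year_kg = 15_000.0    # 15 metric tons per annum (baseline rate in the abstract)
      ilmenite_fraction = 0.10     # assumed ilmenite mass fraction in beneficiated regolith
      process_yield = 0.9          # assumed overall conversion efficiency

      ilmenite_needed = o2_per_year_kg * (M_ILMENITE / M_O) / process_yield
      regolith_needed = ilmenite_needed / ilmenite_fraction
      print(f"ilmenite: {ilmenite_needed / 1000:.0f} t/yr, regolith feed: {regolith_needed / 1000:.0f} t/yr")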

  3. The Challenges of Analyzing Behavioral Response Study Data: An Overview of the MOCHA (Multi-study OCean Acoustics Human Effects Analysis) Project.

    PubMed

    Harris, Catriona M; Thomas, Len; Sadykova, Dina; DeRuiter, Stacy L; Tyack, Peter L; Southall, Brandon L; Read, Andrew J; Miller, Patrick J O

    2016-01-01

    This paper describes the MOCHA project which aims to develop novel approaches for the analysis of data collected during Behavioral Response Studies (BRSs). BRSs are experiments aimed at directly quantifying the effects of controlled dosages of natural or anthropogenic stimuli (typically sound) on marine mammal behavior. These experiments typically result in low sample size, relative to variability, and so we are looking at a number of studies in combination to maximize the gain from each one. We describe a suite of analytical tools applied to BRS data on beaked whales, including a simulation study aimed at informing future experimental design.
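
    As a generic illustration of the simulation-for-design idea mentioned above (not the MOCHA methodology itself), the sketch below estimates statistical power versus the number of tagged animals for a paired before/after comparison, with an assumed effect size, noise level and test.

      import numpy as np
      from scipy import stats

      # Toy design-stage power simulation: how often would a paired test detect an
      # assumed exposure effect for a given number of animals?
      rng = np.random.default_rng(1)

      def simulated_power(n_animals, effect=0.8, sd=1.5, n_sim=2000, alpha=0.05):
          detections = 0
          for _ in range(n_sim):
              baseline = rng.normal(0.0, sd, n_animals)    # pre-exposure behaviour metric
              exposed = rng.normal(effect, sd, n_animals)  # during-exposure metric
              _, p = stats.ttest_rel(exposed, baseline)
              if p < alpha:
                  detections += 1
          return detections / n_sim

      for n in (5, 10, 20):    # candidate sample sizes
          print(n, simulated_power(n))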

  4. Factors that affect micro-tooling features created by direct printing approach

    NASA Astrophysics Data System (ADS)

    Kumbhani, Mayur N.

    The current market demands faster production of smaller and better products in shorter times. Traditional high-rate manufacturing processes such as hot embossing, injection molding, and compression molding use tooling to replicate features on products. Products in the biomedical, electronics, optical, and microfluidic fields are being miniaturized continually, so there is a constant need for cheaper, faster-to-produce tooling that can be used by existing manufacturing processes. Micron-scale tooling features have traditionally been manufactured by processes such as micro-machining and electrical discharge machining (EDM). Because producing smaller features this way is difficult and production cycle times are long, additive manufacturing approaches such as selective laser sintering (SLS), inkjet printing (3DP), and fused deposition modeling (FDM) have been proposed; most of these approaches can produce net-shaped products from metals, ceramics, or polymers. Several attempts have been made to produce tooling features using additive manufacturing, but most of the resulting tooling was not cost effective and its reported life cycle was short. This research developed a method to produce tooling features by a direct printing approach in which a highly filled feedstock is dispensed onto a substrate. Different natural binders (guar gum, xanthan gum, and sodium carboxymethyl cellulose (NaCMC)) and their combinations were evaluated. The best binder combination was then used to evaluate the effect of metal particle size (316L stainless steel at 3 μm, 316 stainless steel at 45 μm, and 304 stainless steel at 45 μm) on feature quality. Finally, the effects of direct printing process variables, such as dispensing tip internal diameter (500 μm and 333 μm) at different printing speeds, were evaluated.

  5. Identification of Shiga-Toxigenic Escherichia coli outbreak isolates by a novel data analysis tool after matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Christner, Martin; Dressler, Dirk; Andrian, Mark; Reule, Claudia; Petrini, Orlando

    2017-01-01

    The fast and reliable characterization of bacterial and fungal pathogens plays an important role in infectious disease control and tracking of outbreak agents. DNA-based methods are the gold standard for epidemiological investigations, but they are still comparatively expensive and time-consuming. Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) is a fast, reliable and cost-effective technique now routinely used to identify clinically relevant human pathogens. It has been used for subspecies differentiation and typing, but its use for epidemiological tasks, e.g., outbreak investigations, is often hampered by the complexity of data analysis. We have analysed publicly available MALDI-TOF mass spectra from a large outbreak of Shiga-Toxigenic Escherichia coli in northern Germany using a general-purpose software tool for the analysis of complex biological data. The software was challenged with depauperate spectra and reduced learning group sizes to mimic poor spectrum quality and scarcity of reference spectra at the onset of an outbreak. With high-quality formic acid extraction spectra, the software's built-in classifier accurately identified outbreak-related strains using as few as 10 reference spectra (99.8% sensitivity, 98.0% specificity). Selective variation of processing parameters showed impaired marker peak detection and reduced classification accuracy in samples with high background noise or artificially reduced peak counts. However, the software consistently identified mass signals suitable for a highly reliable marker-peak-based classification approach (100% sensitivity, 99.5% specificity), even from low-quality direct deposition spectra. The study demonstrates that general-purpose data analysis tools can effectively be used for the analysis of bacterial mass spectra.
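
    The sketch below is a toy version of a marker-peak-based classification rule, not the commercial software's classifier; the marker masses and tolerance are placeholders chosen for illustration.

      import numpy as np

      # An isolate is called "outbreak-related" if its peak list contains peaks near
      # assumed marker masses (m/z, in Da).
      MARKER_MZ = [6258.0, 9060.0]   # hypothetical marker masses
      TOLERANCE = 3.0                # +/- Da window for peak matching

      def has_peak(peak_mz, target, tol=TOLERANCE):
          return np.any(np.abs(np.asarray(peak_mz) - target) <= tol)

      def classify(peak_mz):
          hits = sum(has_peak(peak_mz, m) for m in MARKER_MZ)
          return "outbreak-related" if hits == len(MARKER_MZ) else "unrelated"

      # Example peak list as produced by a peak-picking step (m/z values only)
      print(classify([4365.2, 6257.1, 7274.5, 9061.8]))   # -> outbreak-related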

  6. Can we trust the calculation of texture indices of CT images? A phantom study.

    PubMed

    Caramella, Caroline; Allorant, Adrien; Orlhac, Fanny; Bidault, Francois; Asselain, Bernard; Ammari, Samy; Jaranowski, Patricia; Moussier, Aurelie; Balleyguier, Corinne; Lassau, Nathalie; Pitre-Champagnat, Stephanie

    2018-04-01

    Texture analysis is an emerging tool in the field of medical imaging analysis. However, many issues have been raised regarding its use in assessing patient images, and it is crucial to harmonize and standardize this new imaging measurement tool. This study was designed to evaluate the reliability of texture indices of CT images on a phantom, including a reproducibility study, to assess the discriminatory capacity of indices potentially relevant in CT medical images, and to determine their redundancy. For the reproducibility and discriminatory analysis, eight identical CT acquisitions were performed on a phantom including one homogeneous insert and two similar heterogeneous inserts. Texture indices were selected for their high reproducibility and capability of discriminating different textures. For the redundancy analysis, 39 acquisitions of the same phantom were performed using varying acquisition parameters, and a correlation matrix was used to explore the 2 × 2 relationships. LIFEx software was used to explore 34 different parameters, including first-order and texture indices. Only eight of the 34 indices exhibited high reproducibility and discriminated the textures from each other. Skewness and kurtosis from the histogram were independent of the other six indices but were intercorrelated with each other; the other six indices were correlated with one another to varying degrees (entropy, dissimilarity, and contrast of the co-occurrence matrix, contrast of the Neighborhood Gray Level Difference Matrix, SZE, and ZLNU of the Gray-Level Size Zone Matrix). Care should be taken when using texture analysis as a tool to characterize CT images because changes in quantitation may be primarily due to internal variability rather than real physio-pathological effects. Some texture indices appear to be sufficiently reliable and capable of discriminating close textures on CT images. © 2018 American Association of Physicists in Medicine.
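
    A minimal sketch of the redundancy step, assuming pandas and synthetic stand-in measurements, is shown below: build the pairwise correlation matrix across acquisitions and flag index pairs whose absolute correlation exceeds a chosen threshold.

      import numpy as np
      import pandas as pd

      # Hypothetical table: rows = 39 phantom acquisitions, columns = texture indices.
      rng = np.random.default_rng(0)
      base = rng.normal(size=39)
      df = pd.DataFrame({
          "entropy": base + rng.normal(scale=0.1, size=39),
          "dissimilarity": base + rng.normal(scale=0.1, size=39),
          "skewness": rng.normal(size=39),
      })

      # Pairwise (2 x 2) correlation matrix; pairs with |r| above a chosen threshold
      # are treated as redundant so only one index of each pair need be kept.
      corr = df.corr()
      redundant = [(a, b) for a in corr.columns for b in corr.columns
                   if a < b and abs(corr.loc[a, b]) > 0.8]
      print(corr.round(2))
      print("redundant pairs:", redundant)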

  7. Fundamental radiological and geometric performance of two types of proton beam modulated discrete scanning systems.

    PubMed

    Farr, J B; Dessy, F; De Wilde, O; Bietzer, O; Schönenberg, D

    2013-07-01

    The purpose of this investigation was to compare and contrast the measured fundamental properties of two new types of modulated proton scanning systems. This provides a basis for clinical expectations based on the scanned beam quality and a benchmark for computational models. Because the relatively small beam and fast scanning made characterization challenging, a secondary purpose was to develop and apply new approaches where necessary. The following performance aspects of the proton scanning systems were investigated: beamlet alignment, static in-air beamlet size and shape, scanned in-air penumbra, scanned fluence map accuracy, geometric alignment of the scanning system to isocenter, maximum field size, lateral and longitudinal field uniformity of a 1 L cubic uniform field, output stability over time, gantry angle invariance, monitoring system linearity, and reproducibility. A range of detectors was used: film, ionization chambers, lateral multielement and longitudinal multilayer ionization chambers, and a scintillation screen combined with a digital video camera. Characterization of the scanned fluence maps was performed with a software analysis tool. The resulting measurements and analysis indicated that the two types of delivery system performed within specification for the aspects investigated. Significant differences were observed between the two types of scanning system: one exhibits a smaller spot size and associated penumbra than the other, with the difference smallest at maximum energy and increasing as the energy decreases. Additionally, the large-spot system showed an increase in dose precision to a static target with layer rescanning, whereas the small-spot system did not. The measured results from the two types of modulated scanning system were consistent with their designs under the conditions tested. The most significant difference between the systems was their proton spot size and associated resolution, which depend on the magnetic optics and vacuum length. The need for and benefit of multielement detectors and high-resolution sensors were also shown. The use of a fluence map analytical software tool was particularly effective in characterizing the dynamic proton energy-layer scanning.

  8. IMG/M: integrated genome and metagenome comparative data analysis system

    DOE PAGES

    Chen, I-Min A.; Markowitz, Victor M.; Chu, Ken; ...

    2016-10-13

    The Integrated Microbial Genomes with Microbiome Samples (IMG/M: https://img.jgi.doe.gov/m/) system contains annotated DNA and RNA sequence data of (i) archaeal, bacterial, eukaryotic and viral genomes from cultured organisms, (ii) single cell genomes (SCG) and genomes from metagenomes (GFM) from uncultured archaea, bacteria and viruses and (iii) metagenomes from environmental, host associated and engineered microbiome samples. Sequence data are generated by DOE's Joint Genome Institute (JGI), submitted by individual scientists, or collected from public sequence data archives. Structural and functional annotation is carried out by JGI's genome and metagenome annotation pipelines. A variety of analytical and visualization tools provide support for examining and comparing IMG/M's datasets. IMG/M allows open access interactive analysis of publicly available datasets, while manual curation, submission and access to private datasets and computationally intensive workspace-based analysis require login/password access to its expert review (ER) companion system (IMG/M ER: https://img.jgi.doe.gov/mer/). Since the last report published in the 2014 NAR Database Issue, IMG/M's dataset content has tripled in terms of number of datasets and overall protein coding genes, while its analysis tools have been extended to cope with the rapid growth in the number and size of datasets handled by the system.

  9. Topological characterization and early detection of bifurcations and chaos in complex systems using persistent homology.

    PubMed

    Mittal, Khushboo; Gupta, Shalabh

    2017-05-01

    Early detection of bifurcations and chaos and understanding their topological characteristics are essential for safe and reliable operation of various electrical, chemical, physical, and industrial processes. However, the presence of non-linearity and high-dimensionality in system behavior makes this analysis a challenging task. The existing methods for dynamical system analysis provide useful tools for anomaly detection (e.g., Bendixson-Dulac and Poincare-Bendixson criteria can detect the presence of limit cycles); however, they do not provide a detailed topological understanding about system evolution during bifurcations and chaos, such as the changes in the number of subcycles and their positions, lifetimes, and sizes. This paper addresses this research gap by using topological data analysis as a tool to study system evolution and develop a mathematical framework for detecting the topological changes in the underlying system using persistent homology. Using the proposed technique, topological features (e.g., number of relevant k-dimensional holes, etc.) are extracted from nonlinear time series data which are useful for deeper analysis of the system behavior and early detection of bifurcations and chaos. When applied to a Logistic map, a Duffing oscillator, and a real life Op-amp based Jerk circuit, these features are shown to accurately characterize the system dynamics and detect the onset of chaos.
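
    A minimal sketch of this kind of analysis, assuming the ripser package for persistent homology, is given below: it builds a two-dimensional delay embedding of a logistic-map time series and reports the number and longest lifetime of one-dimensional (loop) features for a few parameter values.

      import numpy as np
      from ripser import ripser  # assumes the 'ripser' package is installed

      def logistic_series(r, n=400, x0=0.4, burn=200):
          """Iterate the logistic map x -> r*x*(1-x) and return n post-transient values."""
          x, out = x0, []
          for i in range(n + burn):
              x = r * x * (1 - x)
              if i >= burn:
                  out.append(x)
          return np.array(out)

      def h1_summary(r):
          series = logistic_series(r)
          cloud = np.column_stack([series[:-1], series[1:]])   # 2-D delay embedding
          dgm_h1 = ripser(cloud)['dgms'][1]                    # persistence diagram, dimension 1
          lifetimes = dgm_h1[:, 1] - dgm_h1[:, 0] if len(dgm_h1) else np.array([])
          longest = float(lifetimes.max()) if len(lifetimes) else 0.0
          return len(lifetimes), longest

      for r in (3.2, 3.5, 3.9):   # period-2, period-4 and chaotic regimes
          print(r, h1_summary(r))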

  10. IMG/M: integrated genome and metagenome comparative data analysis system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, I-Min A.; Markowitz, Victor M.; Chu, Ken

    The Integrated Microbial Genomes with Microbiome Samples (IMG/M: https://img.jgi.doe.gov/m/) system contains annotated DNA and RNA sequence data of (i) archaeal, bacterial, eukaryotic and viral genomes from cultured organisms, (ii) single cell genomes (SCG) and genomes from metagenomes (GFM) from uncultured archaea, bacteria and viruses and (iii) metagenomes from environmental, host associated and engineered microbiome samples. Sequence data are generated by DOE's Joint Genome Institute (JGI), submitted by individual scientists, or collected from public sequence data archives. Structural and functional annotation is carried out by JGI's genome and metagenome annotation pipelines. A variety of analytical and visualization tools provide support for examining and comparing IMG/M's datasets. IMG/M allows open access interactive analysis of publicly available datasets, while manual curation, submission and access to private datasets and computationally intensive workspace-based analysis require login/password access to its expert review (ER) companion system (IMG/M ER: https://img.jgi.doe.gov/mer/). Since the last report published in the 2014 NAR Database Issue, IMG/M's dataset content has tripled in terms of number of datasets and overall protein coding genes, while its analysis tools have been extended to cope with the rapid growth in the number and size of datasets handled by the system.

  11. IMG/M: integrated genome and metagenome comparative data analysis system

    PubMed Central

    Chen, I-Min A.; Markowitz, Victor M.; Chu, Ken; Palaniappan, Krishna; Szeto, Ernest; Pillay, Manoj; Ratner, Anna; Huang, Jinghua; Andersen, Evan; Huntemann, Marcel; Varghese, Neha; Hadjithomas, Michalis; Tennessen, Kristin; Nielsen, Torben; Ivanova, Natalia N.; Kyrpides, Nikos C.

    2017-01-01

    The Integrated Microbial Genomes with Microbiome Samples (IMG/M: https://img.jgi.doe.gov/m/) system contains annotated DNA and RNA sequence data of (i) archaeal, bacterial, eukaryotic and viral genomes from cultured organisms, (ii) single cell genomes (SCG) and genomes from metagenomes (GFM) from uncultured archaea, bacteria and viruses and (iii) metagenomes from environmental, host associated and engineered microbiome samples. Sequence data are generated by DOE's Joint Genome Institute (JGI), submitted by individual scientists, or collected from public sequence data archives. Structural and functional annotation is carried out by JGI's genome and metagenome annotation pipelines. A variety of analytical and visualization tools provide support for examining and comparing IMG/M's datasets. IMG/M allows open access interactive analysis of publicly available datasets, while manual curation, submission and access to private datasets and computationally intensive workspace-based analysis require login/password access to its expert review (ER) companion system (IMG/M ER: https://img.jgi.doe.gov/mer/). Since the last report published in the 2014 NAR Database Issue, IMG/M's dataset content has tripled in terms of number of datasets and overall protein coding genes, while its analysis tools have been extended to cope with the rapid growth in the number and size of datasets handled by the system. PMID:27738135

  12. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last five years we have created the community collaboratory Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources, and data, and an online platform to support collaborative efforts. As the community (currently more than 6000 active users, from an estimated total community of comparable size) embeds the collaboratory's tools into educational and research workflows, it has become imperative to: a) redesign tools as robust, open-source, reusable software for online and offline use and enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; and c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development and redevelopment is twofold: first, to apply software engineering best practices and exploit new hardware such as multi-core and graphics processing units; second, to enhance capabilities for inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source development to facilitate community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Using such tools requires many observation-driven datasets, e.g., digital elevation models of topography, satellite imagery, and field observations of deposits. These data are often maintained in private repositories and shared informally by "sneaker-net". As a partial solution, we tested mechanisms based on the iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g., Pegasus) to support the complex data and computing workflows needed for tasks such as uncertainty quantification in hazard analysis with physical models.

  13. Acceptability and potential effectiveness of commercial portion control tools amongst people with obesity.

    PubMed

    Almiron-Roig, Eva; Domínguez, Angélica; Vaughan, David; Solis-Trapala, Ivonne; Jebb, Susan A

    2016-12-01

    Exposure to large portion sizes is a risk factor for obesity. Specifically designed tableware may modulate how much is eaten and help with portion control. We examined the experience of using a guided crockery set (CS) and a calibrated serving spoon set (SS) by individuals trying to manage their weight. Twenty-nine obese adults who had completed 7-12 weeks of a community weight-loss programme were invited to use both tools for 2 weeks each, in a crossover design, with minimal health professional contact. A paper-based questionnaire was used to collect data on acceptance, perceived changes in portion size, frequency, and type of meal when the tool was used. Scores describing acceptance, ease of use and perceived effectiveness were derived from five-point Likert scales from which binary indicators (high/low) were analysed using logistic regression. Mean acceptance, ease of use and perceived effectiveness were moderate to high (3·7-4·4 points). Tool type did not have an impact on indicators of acceptance, ease of use and perceived effectiveness (P>0·32 for all comparisons); 55 % of participants used the CS on most days v. 21 % for the SS. The CS was used for all meals, whereas the SS was mostly used for evening meals. Self-selected portion sizes increased for vegetables and decreased for chips and potatoes with both tools. Participants rated both tools as equally acceptable, easy to use and with similar perceived effectiveness. Formal trials to evaluate the impact of such tools on weight control are warranted.
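
    For illustration, a sketch of the dichotomize-then-model step described above is given below, using synthetic Likert scores and scikit-learn's logistic regression; it is not the authors' analysis, and the data and variable names are placeholders.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Dichotomise five-point Likert scores into high (>= 4) / low and model the
      # binary indicator against tool type (synthetic stand-in data).
      rng = np.random.default_rng(3)
      likert = rng.integers(1, 6, size=58)    # 29 participants x 2 tools, scores 1-5
      tool = np.repeat([0, 1], 29)            # 0 = crockery set, 1 = serving spoon set
      high = (likert >= 4).astype(int)

      model = LogisticRegression().fit(tool.reshape(-1, 1), high)
      odds_ratio = np.exp(model.coef_[0][0])
      print(f"odds ratio (SS vs CS) = {odds_ratio:.2f}")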

  14. Formula hybrid SAE.

    DOT National Transportation Integrated Search

    2013-09-01

    User-friendly tools are needed for undergraduates to learn about component sizing, powertrain integration, and control strategies for student competitions involving hybrid vehicles. A TK Solver tool was developed at the University of Idaho for th...

  15. Analysis of Carbohydrate-Carbohydrate Interactions Using Sugar-Functionalized Silicon Nanoparticles for Cell Imaging.

    PubMed

    Lai, Chian-Hui; Hütter, Julia; Hsu, Chien-Wei; Tanaka, Hidenori; Varela-Aramburu, Silvia; De Cola, Luisa; Lepenies, Bernd; Seeberger, Peter H

    2016-01-13

    Protein-carbohydrate binding depends on multivalent ligand display, which is even more important for low-affinity carbohydrate-carbohydrate interactions. Detection and analysis of these low-affinity multivalent binding events are technically challenging. We describe the synthesis of dual-fluorescent sugar-capped silicon nanoparticles that proved to be an attractive tool for the analysis of low-affinity interactions. These ultrasmall NPs, with sizes of around 4 nm, can be used for NMR quantification of the coupled sugars. The silicon nanoparticles are employed to measure the interaction between the cancer-associated glycosphingolipids GM3 and Gg3 and the associated KD value by surface plasmon resonance experiments. Cell binding studies, to investigate the biological relevance of these carbohydrate-carbohydrate interactions, also benefit from these fluorescent sugar-capped nanoparticles.
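
    A sketch of a steady-state SPR affinity fit of the form R_eq = Rmax*C/(KD + C), using scipy's curve_fit on synthetic responses, is given below; it illustrates how a KD might be extracted, not the authors' exact procedure, and all numbers are placeholders.

      import numpy as np
      from scipy.optimize import curve_fit

      def steady_state(C, Rmax, KD):
          """Steady-state SPR binding isotherm: R_eq = Rmax * C / (KD + C)."""
          return Rmax * C / (KD + C)

      # Synthetic concentration series (M) and noisy responses (RU)
      conc = np.array([1, 2, 5, 10, 20, 50, 100.0]) * 1e-6
      resp = steady_state(conc, Rmax=120.0, KD=15e-6) \
             + np.random.default_rng(2).normal(0, 2, conc.size)

      popt, _ = curve_fit(steady_state, conc, resp, p0=(100.0, 1e-5))
      print(f"Rmax = {popt[0]:.1f} RU, KD = {popt[1] * 1e6:.1f} uM")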

  16. MITHRA 1.0: A full-wave simulation tool for free electron lasers

    NASA Astrophysics Data System (ADS)

    Fallahi, Arya; Yahaghi, Alireza; Kärtner, Franz X.

    2018-07-01

    Free-electron lasers (FELs) provide intense, coherent and bright radiation in the hard X-ray regime. Because of the low wall-plug efficiency of FEL facilities, it is crucial to develop complete and accurate simulation tools for optimizing the FEL interaction. The highly sophisticated dynamics involved in the FEL process has been the main obstacle to developing general simulation tools for this problem. We present a numerical algorithm based on the finite-difference time-domain/particle-in-cell (FDTD/PIC) method in a Lorentz-boosted coordinate system that performs a full-wave simulation of the FEL process. The developed software offers a suitable tool for analyzing FEL interactions without the usual approximations. A coordinate transformation to the bunch rest frame brings the very different length scales of the bunch size, the optical wavelength and the undulator period to values of the same order. Consequently, FDTD/PIC simulations in conjunction with efficient parallelization techniques make full-wave simulation feasible with the available computational resources. Several free-electron laser examples are analyzed with the developed software; the results are benchmarked against standard FEL codes and discussed in detail.

  17. Haemocompatibility of iron oxide nanoparticles synthesized for theranostic applications: a high-sensitivity microfluidic tool

    NASA Astrophysics Data System (ADS)

    Rodrigues, Raquel O.; Bañobre-López, Manuel; Gallo, Juan; Tavares, Pedro B.; Silva, Adrián M. T.; Lima, Rui; Gomes, Helder T.

    2016-07-01

    The poor heating efficiency of most reported magnetic nanoparticles (MNPs), together with the lack of comprehensive biocompatibility and haemodynamic studies, hampers the adoption of multifunctional nanoparticles as the next generation of therapeutic bio-agents in medicine. The present work reports the synthesis and characterization, with special focus on biological and toxicological compatibility, of superparamagnetic nanoparticles with diameters around 18 nm, suitable for theranostic applications (i.e., simultaneous diagnosis and therapy of cancer). To gain more insight into the complex interaction between nanoparticles and the red blood cell (RBC) membrane, the deformability of human RBCs in contact with MNPs was assessed for the first time with a microfluidic extensional approach and used as an indicator of haematological disorders, in comparison with a conventional haematological test, the haemolysis assay. The microfluidic results highlight the potential of this tool over traditional haemolysis analysis: it detected small increases in the rigidity of the blood cells even when traditional haemotoxicology analysis showed no significant alteration (haemolysis rates lower than 2%). The detected rigidity is attributed to wrapping of the small MNPs by the RBC bilayer membrane, which is directly related to MNP size, shape and composition. The proposed microfluidic tool adds a new dimension to nanomedicine as a high-sensitivity technique for understanding the biological impact of nanoparticles developed for clinical applications.

  18. Smartphone Cortex Controlled Real-Time Image Processing and Reprocessing for Concentration Independent LED Induced Fluorescence Detection in Capillary Electrophoresis.

    PubMed

    Szarka, Mate; Guttman, Andras

    2017-10-17

    We present the application of a smartphone-anatomy-based technology in the field of liquid-phase bioseparations, particularly capillary electrophoresis. A simple capillary electrophoresis system was built with LED-induced fluorescence detection and a credit-card-sized minicomputer to prove the concept of real-time fluorescence imaging (a zone-adjustable time-lapse fluorescence image processor) and separation control. The system was evaluated by analyzing under- and overloaded aminopyrenetrisulfonate (APTS)-labeled oligosaccharide samples. The open-source-software-based image processing tool allowed undistorted signal modulation (reprocessing) when the signal was unsuitable for the actual detection system settings (too low or too high). The novel smart detection tool for fluorescently labeled biomolecules greatly expands the dynamic range and enables retrospective correction of injections with unsuitable signal levels without the need to repeat the analysis.
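
    As a rough sketch of the zone-based time-lapse processing idea, assuming only NumPy and stand-in frame data, the snippet below averages the intensity inside an adjustable detection window frame by frame to build an electropherogram-like trace; frame rate, window coordinates and data are all placeholders.

      import numpy as np

      # Stand-in stack of time-lapse frames: (time, height, width)
      frames = np.random.rand(300, 120, 160)
      zone = (slice(50, 70), slice(75, 85))    # adjustable detection window (rows, cols)

      # Mean fluorescence intensity inside the zone, per frame -> electropherogram trace
      trace = frames[(slice(None),) + zone].mean(axis=(1, 2))
      time_s = np.arange(len(trace)) / 10.0    # assumed 10 frames per second
      peak_index = int(trace.argmax())
      print(f"apex at t = {time_s[peak_index]:.1f} s, intensity = {trace[peak_index]:.3f}")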

  19. Information systems as a quality management tool in clinical laboratories

    NASA Astrophysics Data System (ADS)

    Schmitz, Vanessa; Rosecler Bez el Boukhari, Marta

    2007-11-01

    This article describes information systems as a quality management tool in clinical laboratories. The quality of laboratory analyses is of fundamental importance for health professionals in aiding appropriate diagnosis and treatment. Information systems allow the automation of internal quality management processes, using standard sample tests, Levey-Jennings charts and Westgard multirule analysis. This simplifies evaluation and interpretation of quality tests and reduces the possibility of human error. This study proposes the development of an information system with appropriate functions and costs for the automation of internal quality control in small and medium-sized clinical laboratories. To this end, it evaluates the functions and usability of two commercial software products designed for this purpose, identifying the positive features of each, so that these can be taken into account during the development of the proposed system.
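
    A minimal sketch of the multirule idea, implementing just two common Westgard rules (1-3s and 2-2s) against an assumed control mean and SD, is given below; a production system would cover the full rule set and per-analyte configuration.

      import numpy as np

      # Two common Westgard rules applied to a control series. The target mean and SD
      # are assumed to come from previously established QC runs.
      def westgard_flags(values, mean, sd):
          z = (np.asarray(values, dtype=float) - mean) / sd
          flags = []
          if np.any(np.abs(z) > 3):                    # 1-3s: one point beyond +/-3 SD
              flags.append("1-3s")
          for i in range(len(z) - 1):                  # 2-2s: two consecutive points beyond +/-2 SD
              if (z[i] > 2 and z[i + 1] > 2) or (z[i] < -2 and z[i + 1] < -2):
                  flags.append("2-2s")
                  break
          return flags or ["in control"]

      print(westgard_flags([101, 99, 105, 107, 98], mean=100, sd=2.0))  # -> ['1-3s', '2-2s']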

  20. Intravital Microscopic Interrogation of Peripheral Taste Sensation

    NASA Astrophysics Data System (ADS)

    Choi, Myunghwan; Lee, Woei Ming; Yun, Seok Hyun

    2015-03-01

    Intravital microscopy is a powerful tool in neuroscience but has not been adapted to the taste sensory organ due to anatomical constraints. Here we developed an imaging window to facilitate microscopic access to the murine tongue in vivo. Real-time two-photon microscopy allowed the visualization of the three-dimensional microanatomy of the intact tongue mucosa and the functional activity of taste cells in response to topically administered tastants in live mice. Video microscopy also showed the calcium activity of taste cells elicited by small-sized tastants in the blood circulation. Molecular kinetic analysis suggested that intravascular taste sensation takes place at the microvilli on the apical side of taste cells after diffusion of the molecules through the pericellular capillaries and tight junctions in the taste bud. Our results demonstrate the capabilities and utility of this new tool for taste research in vivo.
