Sample records for simple analytical tools

  1. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous-energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
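
    The simplest such analytic benchmark follows directly from the one-group constants; as a standard reactor-physics relation (not a formula quoted from the record itself), the infinite-medium multiplication factor is

      k_\infty = \frac{\nu \Sigma_f}{\Sigma_a},

    and a Monte Carlo run with a one-group ACE file should reproduce this value to within statistical uncertainty, which is what makes such files useful for code verification.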

  2. Analytical Tools in School Finance Reform.

    ERIC Educational Resources Information Center

    Johns, R. L.

    This paper discusses the problem of analyzing variations in the educational opportunities provided by different school districts and describes how to assess the impact of school finance alternatives through use of various analytical tools. The author first examines relatively simple analytical methods, including calculation of per-pupil…

  3. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency-testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor long-term analytical stability as a supplement to internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes run on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of analytical practice.
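
    A minimal sketch of the median-monitoring idea described above (column names and target value are illustrative assumptions, not taken from the paper; the desirable-bias specification 0.25·sqrt(CVI² + CVG²) is the standard biological-variation formula):

      import numpy as np
      import pandas as pd

      def monthly_median_check(results: pd.DataFrame, target: float,
                               cv_i: float, cv_g: float) -> pd.DataFrame:
          """Flag months whose patient-result median drifts beyond the
          desirable bias specification derived from biological variation.

          results: columns 'date' (datetime) and 'value' (analyte result).
          cv_i, cv_g: within- and between-subject biological CVs (%).
          """
          desirable_bias = 0.25 * np.sqrt(cv_i**2 + cv_g**2)  # percent
          monthly = results.set_index('date')['value'].resample('M').median()
          deviation = 100.0 * (monthly - target) / target
          return pd.DataFrame({'median': monthly,
                               'deviation_pct': deviation,
                               'stable': deviation.abs() <= desirable_bias})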

  4. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    ERIC Educational Resources Information Center

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  5. Analytical Tools Interface for Landscape Assessments

    EPA Science Inventory

    Environmental management practices are trending away from simple, local-scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to implemen...

  6. How to Recognize Success and Failure: Practical Assessment of an Evolving, First-Semester Laboratory Program Using Simple, Outcome-Based Tools

    ERIC Educational Resources Information Center

    Gron, Liz U.; Bradley, Shelly B.; McKenzie, Jennifer R.; Shinn, Sara E.; Teague, M. Warfield

    2013-01-01

    This paper presents the use of simple, outcome-based assessment tools to design and evaluate the first semester of a new introductory laboratory program created to teach green analytical chemistry using environmental samples. This general chemistry laboratory program, like many introductory courses, has a wide array of stakeholders within and…

  7. Analytical determination of space station response to crew motion and design of suspension system for microgravity experiments

    NASA Technical Reports Server (NTRS)

    Liu, F. C.

    1986-01-01

    The objective of this investigation is to determine analytically the acceleration produced by crew motion in an orbiting space station and to define design parameters for the suspension system of microgravity experiments. A simple structural model for simulation of the IOC space station is proposed. Mathematical formulation of this model provides engineers with a simple and direct tool for designing an effective suspension system.

  8. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted great interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have proven to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared with available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and the applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that underlie their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  9. ANALYTICAL TOOL INTERFACE FOR LANDSCAPE ASSESSMENTS (ATIILA): AN ARCVIEW EXTENSION FOR THE ANALYSIS OF LANDSCAPE PATTERNS, COMPOSITION, AND STRUCTURE

    EPA Science Inventory

    Environmental management practices are trending away from simple, local- scale assessments toward complex, multiple-stressor regional assessments. Landscape ecology provides the theory behind these assessments while geographic information systems (GIS) supply the tools to impleme...

  10. Analytical and multibody modeling for the power analysis of standing jumps.

    PubMed

    Palmieri, G; Callegari, M; Fioretti, S

    2015-01-01

    Two methods for the power analysis of standing jumps are proposed and compared in this article. The first method is based on a simple analytical formulation which requires as input the coordinates of the center of gravity at three specified instants of the jump. The second method is based on a multibody model that simulates the jumps by processing the data obtained by a three-dimensional (3D) motion capture system and the dynamometric measurements obtained by the force platforms. The multibody model is developed with OpenSim, an open-source software package which provides tools for the kinematic and dynamic analyses of 3D human body models. The study is focused on two of the typical tests used to evaluate the muscular activity of the lower limbs: the counter movement jump and the standing long jump. The comparison between the results obtained by the two methods confirms that the proposed analytical formulation is correct and represents a simple tool suitable for a preliminary analysis of the total mechanical work and mean power exerted in standing jumps.

  11. Simplified, inverse, ejector design tool

    NASA Technical Reports Server (NTRS)

    Dechant, Lawrence J.

    1993-01-01

    A simple lumped-parameter-based inverse design tool has been developed which provides flow-path geometry and entrainment estimates subject to operational, acoustic, and design constraints. These constraints are manifested through specification of primary mass flow rate or ejector thrust, fully-mixed exit velocity, and static pressure matching. Fundamentally, integral forms of the conservation equations coupled with the specified design constraints are combined to yield an easily invertible linear system in terms of the flow-path cross-sectional areas. Entrainment is computed by back substitution. Initial comparisons with experimental and analogous one-dimensional methods show good agreement. Thus, this simple inverse design code provides an analytically based, preliminary design tool with direct application to High Speed Civil Transport (HSCT) design studies.
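
    As a purely illustrative sketch of the inverse-design idea (the coefficients below are hypothetical stand-ins, not the paper's conservation-law coefficients): once the constraints are linear in the flow-path areas, the design reduces to one matrix solve followed by back substitution:

      import numpy as np

      # Hypothetical linearized constraint system A @ areas = b, where the
      # unknowns are ejector flow-path cross-sectional areas and b encodes
      # the specified thrust, exit velocity, and static-pressure-matching
      # constraints (illustrative numbers only).
      A = np.array([[1.0, 1.0],     # e.g. mass conservation
                    [0.4, 1.6]])    # e.g. momentum / pressure matching
      b = np.array([2.5, 3.1])

      areas = np.linalg.solve(A, b)        # invert the linear system
      entrainment = areas[1] / areas[0]    # back-substitute for an entrainment-like ratio
      print(areas, entrainment)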

  12. Using Modern Solid-State Analytical Tools for Investigations of an Advanced Carbon Capture Material: Experiments for the Inorganic Chemistry Laboratory

    ERIC Educational Resources Information Center

    Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai

    2016-01-01

    A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…

  13. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    ERIC Educational Resources Information Center

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…

  14. Introduction, comparison, and validation of Meta‐Essentials: A free and simple tool for meta‐analysis

    PubMed Central

    van Rhee, Henk; Hak, Tony

    2017-01-01

    We present a new tool for meta‐analysis, Meta‐Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta‐analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta‐Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta‐analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp‐Hartung adjustment of the DerSimonian‐Laird estimator. However, more advanced meta‐analysis methods such as meta‐analytical structural equation modelling and meta‐regression with multiple covariates are not available. In summary, Meta‐Essentials may prove a valuable resource for meta‐analysts, including researchers, teachers, and students. PMID:28801932

  15. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    NASA Astrophysics Data System (ADS)

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
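
    For a single-bin counting experiment with expected signal s, expected background b, and observed count n, the frequentist ingredient of the CLs construction reduces to Poisson tail probabilities. This is a textbook sketch, not OPTHYLIC's hybrid implementation, which additionally marginalises systematic uncertainties:

      from scipy.stats import poisson

      def cls_counting(n_obs: int, s: float, b: float) -> float:
          """CLs for a simple counting experiment (no systematics).

          CLs = CL_{s+b} / CL_b, with p-values taken as the Poisson
          probability of observing <= n_obs events.
          """
          cl_sb = poisson.cdf(n_obs, s + b)   # p-value under signal + background
          cl_b = poisson.cdf(n_obs, b)        # p-value under background only
          return cl_sb / cl_b

      # A signal hypothesis is excluded at 95% confidence when CLs < 0.05.
      print(cls_counting(n_obs=3, s=6.0, b=2.5))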

  16. Simple Parametric Model for Airfoil Shape Description

    NASA Astrophysics Data System (ADS)

    Ziemkiewicz, David

    2017-12-01

    We show a simple, analytic equation describing a class of two-dimensional shapes well suited for representation of aircraft airfoil profiles. Our goal was to create a description characterized by a small number of parameters with easily understandable meaning, providing a tool to alter the shape with optimization procedures as well as manual tweaks by the designer. The generated shapes are well suited for numerical analysis with 2D flow solving software such as XFOIL.
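
    The record does not reproduce the paper's equation; as a flavour of what a low-parameter analytic airfoil description looks like, here is the classic NACA four-digit thickness distribution, which likewise drives the shape through a single intuitive parameter (thickness t as a fraction of chord):

      import numpy as np

      def naca_thickness(x: np.ndarray, t: float) -> np.ndarray:
          """Half-thickness of a NACA four-digit airfoil at chordwise
          positions x in [0, 1], for maximum thickness ratio t."""
          return 5.0 * t * (0.2969 * np.sqrt(x) - 0.1260 * x
                            - 0.3516 * x**2 + 0.2843 * x**3
                            - 0.1015 * x**4)

      x = np.linspace(0.0, 1.0, 101)
      y = naca_thickness(x, t=0.12)   # 12%-thick symmetric section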

  17. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

    Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules in biological and clinical samples. To overcome this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. We also showed that TAD could achieve LOD improvements averaging 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355
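
    TAD is a variant of standard addition; the classical standard-addition readout, sketched below with hypothetical numbers, fits signal against spiked concentration and recovers the endogenous concentration from the x-intercept:

      import numpy as np

      # Hypothetical spiked concentrations (same units as the analyte)
      # and the corresponding measured signal intensities.
      spiked = np.array([0.0, 1.0, 2.0, 4.0])
      signal = np.array([0.52, 1.49, 2.55, 4.48])

      slope, intercept = np.polyfit(spiked, signal, 1)  # linear response model
      endogenous = intercept / slope   # |x-intercept| = original concentration
      print(f"estimated endogenous concentration: {endogenous:.2f}")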

  18. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

    Matrix-assisted laser desorption/ionization (MALDI) has proven an effective tool for fast and accurate determination of many molecules. However, detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules in biological and clinical samples. To overcome this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. We also showed that TAD could achieve LOD improvements averaging 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications.

  19. Comparison of maximum runup through analytical and numerical approaches for different fault parameters estimates

    NASA Astrophysics Data System (ADS)

    Kanoglu, U.; Wronna, M.; Baptista, M. A.; Miranda, J. M. A.

    2017-12-01

    The one-dimensional analytical runup theory in combination with near-shore synthetic waveforms is a promising tool for tsunami rapid early warning systems. Its application in realistic cases with complex bathymetry and initial wave conditions from inverse modelling has shown that maximum runup values can be estimated reasonably well. In this study we generate simplified bathymetry domains that resemble realistic near-shore features. We investigate the sensitivity of the analytical runup formulae to variations in fault source parameters and near-shore bathymetric features. To do this, we systematically vary the fault-plane parameters to compute the initial tsunami wave condition. Subsequently, we use the initial conditions to run the numerical tsunami model on a coupled system of four nested grids and compare the results to the analytical estimates. Variation of the dip angle of the fault plane showed that the analytical estimates differ by less than 10% for angles of 5-45 degrees in a simple bathymetric domain. These results show that the use of analytical formulae for fast runup estimates is a very promising approach in a simple bathymetric domain and might be implemented in hazard mapping and early warning.

  20. Reducing the barriers against analytical epidemiological studies in investigations of local foodborne disease outbreaks in Germany - a starter kit for local health authorities.

    PubMed

    Werber, D; Bernard, H

    2014-02-27

    Thousands of infectious food-borne disease outbreaks (FBDO) are reported annually to the European Food Safety Authority within the framework of the zoonoses Directive (2003/99/EC). Most recognised FBDO occur locally following point-source exposure, but only a few are investigated using analytical epidemiological studies. In Germany, and probably also in other countries of the European Union, this seems to be particularly true for those investigated by local health authorities. Analytical studies, usually cohort studies or case–control studies, are a powerful tool to identify suspect food vehicles. Therefore, from a public health and food safety perspective, their more frequent use is highly desirable. We have developed a small toolbox consisting of a strategic concept and a simple software tool for data entry and analysis, with the objective of increasing the use of analytical studies in the investigation of local point-source FBDO in Germany.

  21. Implementation and Use of the Reference Analytics Module of LibAnswers

    ERIC Educational Resources Information Center

    Flatley, Robert; Jensen, Robert Bruce

    2012-01-01

    Academic libraries have traditionally collected reference statistics using hash marks on paper. Although efficient and simple, this method is not an effective way to capture the complexity of reference transactions. Several electronic tools are now available to assist libraries with collecting often elusive reference data--among them homegrown…

  22. Engaging or Distracting: Children's Tablet Computer Use in Education

    ERIC Educational Resources Information Center

    McEwen, Rhonda N.; Dubé, Adam K.

    2015-01-01

    Communications studies and psychology offer analytical and methodological tools that when combined have the potential to bring novel perspectives on human interaction with technologies. In this study of children using simple and complex mathematics applications on tablet computers, cognitive load theory is used to answer the question: how…

  23. Moscow Test Well, INEL Oversight Program: Aqueous geochemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCurry, M.; Fromm, J.; Welhan, J.

    1992-09-29

    This report presents a summary and interpretation of data gathered during sampling of the Moscow Test Well at Moscow, Idaho during April and May of 1992. The principal objectives of this chemical survey were to validate sampling procedures with a new straddle-packer sampling tool in a previously hydrologically well-characterized and simple sampling environment, and to compare analytical results from two independent labs for reproducibility. Analytes included a wide range of metals, anions, nutrients, BNAs, and VOCs. Secondary objectives included analysis of waters from a large distilled-water tank (utilized for all field laboratory purposes as "pure" stock water), of water which passed through a steamer used to clean the packer, and of rinsates from the packer tool itself before it was lowered into the test well. Analyses were also obtained of blanks and spikes for data validation purposes.

  24. Introduction, comparison, and validation of Meta-Essentials: A free and simple tool for meta-analysis.

    PubMed

    Suurmond, Robert; van Rhee, Henk; Hak, Tony

    2017-12-01

    We present a new tool for meta-analysis, Meta-Essentials, which is free of charge and easy to use. In this paper, we introduce the tool and compare its features to other tools for meta-analysis. We also provide detailed information on the validation of the tool. Although free of charge and simple, Meta-Essentials automatically calculates effect sizes from a wide range of statistics and can be used for a wide range of meta-analysis applications, including subgroup analysis, moderator analysis, and publication bias analyses. The confidence interval of the overall effect is automatically based on the Knapp-Hartung adjustment of the DerSimonian-Laird estimator. However, more advanced meta-analysis methods such as meta-analytical structural equation modelling and meta-regression with multiple covariates are not available. In summary, Meta-Essentials may prove a valuable resource for meta-analysts, including researchers, teachers, and students.

  25. Air quality climate in the Columbia River Basin.

    Treesearch

    Sue A. Ferguson

    1998-01-01

    Aspects of climate that influence air quality in the Columbia River basin of the Northwestern United States are described. A few, relatively simple, analytical tools were developed to show the spatial and temporal patterns of mean-monthly mixing heights, precipitation scavenging, upper level and surface trajectory winds, and drought that inhibit pollution uptake. Also...

  26. Analytical expressions for the nonlinear interference in dispersion managed transmission coherent optical systems

    NASA Astrophysics Data System (ADS)

    Qiao, Yaojun; Li, Ming; Yang, Qiuhong; Xu, Yanfei; Ji, Yuefeng

    2015-01-01

    Closed-form expressions for the nonlinear interference of dense wavelength-division-multiplexed (WDM) systems with dispersion-managed transmission (DMT) are derived. We carry out a simulative validation by addressing an ample and significant set of Nyquist-WDM systems based on polarization-multiplexed quadrature phase-shift keying (PM-QPSK) subcarriers at a baud rate of 32 Gbaud per channel. Simulation results show that the simple closed-form analytical expressions provide an effective tool for quick and accurate prediction of system performance in DMT coherent optical systems.

  27. Increasing Access and Usability of Remote Sensing Data: The NASA Protected Area Archive

    NASA Technical Reports Server (NTRS)

    Geller, Gary N.

    2004-01-01

    Although remote sensing data are now widely available, much of it at low or no cost, many managers of protected conservation areas do not have the expertise or tools to view or analyze it; thus, access by the protected-area management community is effectively blocked. The Protected Area Archive will increase access to remote sensing data by creating collections of satellite images of protected areas and packaging them with simple-to-use visualization and analytical tools. The user can easily locate the area and image of interest on a map, then display, roam, and zoom the image. A set of simple tools will be provided so the user can explore the data and employ it to assist in management and monitoring of their area. The 'Phase 1' version requires only a Windows-based computer and basic computer skills, and may be of particular help to protected area managers in developing countries.

  28. A simple geometrical model describing shapes of soap films suspended on two rings

    NASA Astrophysics Data System (ADS)

    Herrmann, Felix J.; Kilvington, Charles D.; Wildenberg, Rebekah L.; Camacho, Franco E.; Walecki, Wojciech J.; Walecki, Peter S.; Walecki, Eve S.

    2016-09-01

    We measured and analysed the stability of two types of soap films suspended on two rings using a simple model based on conical frusta, where we use the common definition of a conical frustum as the portion of a cone that lies between two parallel planes cutting it. Using the frusta-based model we reproduced well-known results for catenoid surfaces with and without a central disk. We present for the first time a simple conical-frusta-based spreadsheet model of the soap surface. This very simple, elementary, geometrical model produces results that match the experimental data and known exact analytical solutions surprisingly well. The experiment and the spreadsheet model can be used as a powerful teaching tool for pre-calculus and geometry students.
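
    A minimal Python analogue of the spreadsheet model described above (ring radii, separation, and slice count are illustrative assumptions): stack conical frusta between the two rings and minimize the total lateral area over the intermediate radii, which converges toward the catenoid profile:

      import numpy as np
      from scipy.optimize import minimize

      R1, R2, H, N = 1.0, 1.0, 0.6, 40   # ring radii, separation, frusta count
      dz = H / N

      def film_area(inner_radii: np.ndarray) -> float:
          """Total lateral area of N stacked frusta joining the two rings."""
          r = np.concatenate(([R1], inner_radii, [R2]))
          slant = np.sqrt(np.diff(r)**2 + dz**2)
          return float(np.sum(np.pi * (r[:-1] + r[1:]) * slant))

      guess = np.full(N - 1, 0.9)        # start near a cylinder
      result = minimize(film_area, guess, method='L-BFGS-B')
      print(result.fun)                  # compare with the exact catenoid area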

  29. Simple method for the determination of personal care product ingredients in lettuce by ultrasound-assisted extraction combined with solid-phase microextraction followed by GC-MS.

    PubMed

    Cabrera-Peralta, Jerónimo; Peña-Alvarez, Araceli

    2018-05-01

    A simple method for the simultaneous determination of personal care product ingredients (galaxolide, tonalide, oxybenzone, 4-methylbenzyliden camphor, padimate-o, 2-ethylhexyl methoxycinnamate, octocrylene, triclosan, and methyl triclosan) in lettuce by ultrasound-assisted extraction combined with solid-phase microextraction followed by gas chromatography with mass spectrometry was developed. Lettuce was directly extracted by ultrasound-assisted extraction with methanol; this extract was combined with water, extracted by solid-phase microextraction in immersion mode, and analyzed by gas chromatography with mass spectrometry. Good linear relationships (25-250 ng/g, R² > 0.9702) and low detection limits (1.0-25 ng/g) were obtained for the analytes, along with acceptable precision for almost all analytes (RSDs < 20%). The validated method was applied for the determination of personal care product ingredients in commercial lettuce and in lettuces grown in soil and irrigated with the analytes, identifying the target analytes in the leaves and roots of the latter. This procedure is a miniaturized and environmentally friendly proposal which can be a useful tool for quality analysis in lettuce.

  30. Single Particle-Inductively Coupled Plasma Mass Spectroscopy Analysis of Metallic Nanoparticles in Environmental Samples with Large Dissolved Analyte Fractions.

    PubMed

    Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I

    2016-10-18

    There is an increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved signal. While sample dilution is often necessary to achieve the low analyte concentrations required for SP-ICPMS analysis, and to reduce matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate one, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution-series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution-series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.

  31. Big Data Tools as Applied to ATLAS Event Data

    NASA Astrophysics Data System (ADS)

    Vukotic, I.; Gardner, R. W.; Bryant, L. A.

    2017-10-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Logfiles, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide-area data-access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and the associated analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data. Such modes would simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning environments and tools like Spark, Jupyter, R, SciPy, Caffe, TensorFlow, etc. Machine learning challenges such as the Higgs Boson Machine Learning Challenge and the Tracking challenge, event viewers (VP1, ATLANTIS, ATLASrift), and still-to-be-developed educational and outreach tools would be able to access the data through a simple REST API. In this preliminary investigation we focus on derived xAOD data sets, which are much smaller than the primary xAODs, having containers, variables, and events of interest to a particular analysis. Encouraged by the performance of Elasticsearch for the ADC analytics platform, we developed an algorithm for indexing derived xAOD event data. We have made an appropriate document mapping and have imported a full set of standard model W/Z datasets. We compare the disk-space efficiency of this approach to that of standard ROOT files and the performance in simple cut-flow data analysis, and we present preliminary results on its scaling characteristics with different numbers of clients, query complexity, and size of the data retrieved.
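
    A minimal sketch of the event-indexing idea, assuming the elasticsearch-py 8.x client and a local cluster; the flattened event fields below are hypothetical illustrations, not the actual xAOD document mapping:

      from elasticsearch import Elasticsearch

      es = Elasticsearch("http://localhost:9200")   # assumed local cluster

      # Hypothetical flattened event record from a derived xAOD:
      event = {
          "run_number": 284500,
          "event_number": 187113442,
          "met": 41.7,                 # missing transverse energy [GeV]
          "electrons": [{"pt": 52.1, "eta": -0.44, "phi": 1.92}],
      }
      es.index(index="xaod-derived", document=event)

      # A cut-flow-style selection then becomes a simple range query:
      hits = es.search(index="xaod-derived",
                       query={"range": {"met": {"gte": 40.0}}})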

  32. A Simplified Shuttle Payload Thermal Analyzer (SSPTA) program

    NASA Technical Reports Server (NTRS)

    Bartoszek, J. T.; Huckins, B.; Coyle, M.

    1979-01-01

    A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.

  33. Combined distribution functions: A powerful tool to identify cation coordination geometries in liquid systems

    NASA Astrophysics Data System (ADS)

    Sessa, Francesco; D'Angelo, Paola; Migliorati, Valentina

    2018-01-01

    In this work we have developed an analytical procedure to identify metal ion coordination geometries in liquid media based on the calculation of Combined Distribution Functions (CDFs) starting from Molecular Dynamics (MD) simulations. CDFs provide a fingerprint which can be easily and unambiguously assigned to a reference polyhedron. The CDF analysis has been tested on five systems and has proven to reliably identify the correct geometries of several ion coordination complexes. This tool is simple and general and can be efficiently applied to different MD simulations of liquid systems.
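
    The record does not define the specific descriptor pair; a generic sketch of a combined distribution function as a joint histogram of two per-configuration geometric descriptors (the arrays below are simulated stand-ins for quantities extracted from an MD trajectory) might look like:

      import numpy as np

      # Hypothetical per-configuration descriptors from an MD trajectory:
      # an ion-ligand distance and a ligand-ion-ligand angle.
      rng = np.random.default_rng(0)
      distance = rng.normal(2.4, 0.1, 10_000)   # angstrom
      angle = rng.normal(90.0, 8.0, 10_000)     # degrees

      # The combined distribution function is the joint histogram; its peak
      # pattern forms a fingerprint assignable to a reference polyhedron.
      cdf, d_edges, a_edges = np.histogram2d(
          distance, angle, bins=(60, 60), density=True)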

  34. From near-infrared and Raman to surface-enhanced Raman spectroscopy: progress, limitations and perspectives in bioanalysis.

    PubMed

    Dumont, Elodie; De Bleye, Charlotte; Sacré, Pierre-Yves; Netchacovitch, Lauranne; Hubert, Philippe; Ziemons, Eric

    2016-05-01

    Over recent decades, growing environmental concern has driven the expansion of green-chemistry analytical tools. Vibrational spectroscopy, belonging to this class of analytical tools, is particularly interesting given its numerous advantages, such as fast data acquisition and no sample preparation. In this context, near-infrared, Raman and especially surface-enhanced Raman spectroscopy (SERS) have gained interest in many fields, including bioanalysis. The two former techniques only allow the analysis of concentrated compounds in simple matrices, whereas the emergence of SERS extended the performance of vibrational spectroscopy to very sensitive and selective analyses. Complex SERS substrates have also been developed enabling biomarker measurements, paving the way for SERS immunoassays. In this paper, the strengths and weaknesses of these techniques are highlighted with a focus on recent progress.

  35. Forensic collection of trace chemicals from diverse surfaces with strippable coatings.

    PubMed

    Jakubowski, Michael J; Beltis, Kevin J; Drennan, Paul M; Pindzola, Bradford A

    2013-11-07

    Surface sampling for chemical analysis plays a vital role in environmental monitoring, industrial hygiene, homeland security and forensics. The standard surface sampling tool, a simple cotton gauze pad, is failing to meet the needs of the community as analytical techniques become more sensitive and the variety of analytes increases. In previous work, we demonstrated the efficacy of non-destructive, conformal, spray-on strippable coatings for chemical collection from simple glass surfaces. Here we expand that work by presenting chemical collection at a low spiking level (0.1 g m⁻²) from a diverse array of common surfaces (painted metal, engineering plastics, painted wallboard and concrete) using strippable coatings. The collection efficiency of the strippable coatings is compared to and far exceeds that of gauze pads. Collection from concrete, a particular challenge for wipes like gauze, averaged 73% over eight chemically diverse compounds for the strippable coatings, whereas gauze averaged 10%.

  36. Entropy generation in Gaussian quantum transformations: applying the replica method to continuous-variable quantum information theory

    NASA Astrophysics Data System (ADS)

    Gagatsos, Christos N.; Karanikas, Alexandros I.; Kordas, Georgios; Cerf, Nicolas J.

    2016-02-01

    In spite of their simple description in terms of rotations or symplectic transformations in phase space, quadratic Hamiltonians such as those modelling the most common Gaussian operations on bosonic modes remain poorly understood in terms of entropy production. For instance, determining the quantum entropy generated by a Bogoliubov transformation is notably a hard problem, with generally no known analytical solution, while it is vital to the characterisation of quantum communication via bosonic channels. Here we overcome this difficulty by adapting the replica method, a tool borrowed from statistical physics and quantum field theory. We exhibit a first application of this method to continuous-variable quantum information theory, where it enables accessing entropies in an optical parametric amplifier. As an illustration, we determine the entropy generated by amplifying a binary superposition of the vacuum and a Fock state, which yields a surprisingly simple, yet unknown analytical expression.

  37. Ultimate Longitudinal Strength of Composite Ship Hulls

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangming; Huang, Lingkai; Zhu, Libao; Tang, Yuhang; Wang, Anwen

    2017-01-01

    A simple analytical model to estimate the longitudinal strength of composite-material ship hulls under buckling, material failure and ultimate collapse is presented in this paper. Ship hulls are regarded as assemblies of stiffened panels, which are idealized as groups of plate-stiffener combinations. The ultimate strain of the plate-stiffener combination is predicted under buckling or material failure with composite beam-column theory. The effects of initial imperfection of the ship hull and eccentricity of load are included. Corresponding longitudinal strengths of the ship hull are derived in a straightforward manner. A longitudinally framed ship hull made of symmetrically stacked unidirectional plies under sagging is analyzed. The results indicate that the present analytical results agree well with the FEM results. The initial deflection of the ship hull and eccentricity of load can dramatically reduce its bending capacity. The proposed formulations provide a simple but useful tool for longitudinal strength estimation in practical design.

  38. Burden Calculator: a simple and open analytical tool for estimating the population burden of injuries.

    PubMed

    Bhalla, Kavi; Harrison, James E

    2016-04-01

    Burden of disease and injury methods can be used to summarise and compare the effects of conditions in terms of disability-adjusted life years (DALYs). Burden estimation methods are not inherently complex; however, as commonly implemented, they involve complex modelling and estimation. Our objective was to provide a simple and open-source software tool that allows estimation of incidence DALYs due to injury, given data on the incidence of deaths and non-fatal injuries. The tool includes a default set of estimation parameters, which can be replaced by users. The tool was written in Microsoft Excel, and all calculations and values can be seen and altered by users. The parameter sets currently used in the tool are based on published sources. The tool is available without charge online at http://calculator.globalburdenofinjuries.org. To use the tool with the supplied parameter sets, users need only paste a table of population and injury case data organised by age, sex and external cause of injury into a specified location in the tool. Estimated DALYs can be read or copied from tables and figures in another part of the tool. In some contexts, a simple and user-modifiable burden calculator may be preferable to undertaking a more complex study to estimate the burden of disease. The tool and the parameter sets required for its use can be improved by user innovation, by studies comparing DALYs estimated in this way and in other ways, and by shared experience of its use.
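
    A minimal sketch of the undiscounted DALY arithmetic such a calculator performs (the disability weight, duration, and life expectancy below are placeholder values, not the tool's default parameter set):

      def dalys(deaths: int, life_expectancy: float,
                cases: int, disability_weight: float,
                duration_years: float) -> float:
          """Incidence-based DALYs = YLL + YLD (no discounting or age-weighting)."""
          yll = deaths * life_expectancy                    # years of life lost
          yld = cases * disability_weight * duration_years  # years lived with disability
          return yll + yld

      # Placeholder example: 10 deaths at mean residual life expectancy 40 y,
      # 500 non-fatal injuries with weight 0.2 lasting 0.5 y on average.
      print(dalys(10, 40.0, 500, 0.2, 0.5))   # -> 450.0 DALYs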

  39. The use of selective adsorbents in capillary electrophoresis-mass spectrometry for analyte preconcentration and microreactions: a powerful three-dimensional tool for multiple chemical and biological applications.

    PubMed

    Guzman, N A; Stubbs, R J

    2001-10-01

    Much attention has recently been directed to the development and application of online sample preconcentration and microreactions in capillary electrophoresis using selective adsorbents based on chemical or biological specificity. The basic principle involves two interacting chemical or biological systems with high selectivity and affinity for each other. These molecular interactions in nature usually involve noncovalent and reversible chemical processes. Properly bound to a solid support, an "affinity ligand" can selectively adsorb a "target analyte" found in a simple or complex mixture at a wide range of concentrations. As a result, the isolated analyte is enriched and highly purified. When this affinity technique, allowing noncovalent chemical interactions and biochemical reactions to occur, is coupled on-line to high-resolution capillary electrophoresis and mass spectrometry, a powerful tool of chemical and biological information is created. This paper describes the concept of biological recognition and affinity interaction on-line with high-resolution separation, the fabrication of an "analyte concentrator-microreactor", optimization conditions of adsorption and desorption, the coupling to mass spectrometry, and various applications of clinical and pharmaceutical interest.

  40. I'll take that to go: Big data bags and minimal identifiers for exchange of large, complex datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chard, Kyle; D'Arcy, Mike; Heavner, Benjamin D.

    Big data workflows often require the assembly and exchange of complex, multi-element datasets. For example, in biomedical applications, the input to an analytic pipeline can be a dataset consisting of thousands of images and genome sequences assembled from diverse repositories, requiring a description of the contents of the dataset in a concise and unambiguous form. Typical approaches to creating datasets for big data workflows assume that all data reside in a single location, requiring costly data marshaling and permitting errors of omission and commission because dataset members are not explicitly specified. We address these issues by proposing simple methods and tools for assembling, sharing, and analyzing large and complex datasets that scientists can easily integrate into their daily workflows. These tools combine a simple and robust method for describing data collections (BDBags), data descriptions (Research Objects), and simple persistent identifiers (Minids) to create a powerful ecosystem of tools and services for big data analysis and sharing. We present these tools and use biomedical case studies to illustrate their use for the rapid assembly, sharing, and analysis of large datasets.
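
    A minimal sketch of the underlying BagIt packaging step using the bagit-python library (directory name and metadata are illustrative; the BDBag and Minid tooling layers additional structure on top of plain bags):

      import bagit

      # Package an existing directory of dataset files in place as a BagIt bag;
      # checksums in the generated manifests make the contents verifiable.
      bag = bagit.make_bag("my_dataset", {"Source-Organization": "Example Lab"})

      # Later, anyone holding the bag can confirm it is complete and unaltered.
      bag.validate()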

  41. Heat as a groundwater tracer in shallow and deep heterogeneous media: Analytical solution, spreadsheet tool, and field applications

    USGS Publications Warehouse

    Kurylyk, Barret L.; Irvine, Dylan J.; Carey, Sean K.; Briggs, Martin A.; Werkema, Dale D.; Bonham, Mariah

    2017-01-01

    Groundwater flow advects heat, and thus, the deviation of subsurface temperatures from an expected conduction‐dominated regime can be analysed to estimate vertical water fluxes. A number of analytical approaches have been proposed for using heat as a groundwater tracer, and these have typically assumed a homogeneous medium. However, heterogeneous thermal properties are ubiquitous in subsurface environments, both at the scale of geologic strata and at finer scales in streambeds. Herein, we apply the analytical solution of Shan and Bodvarsson (2004), developed for estimating vertical water fluxes in layered systems, in two new environments distinct from previous vadose zone applications. The utility of the solution for studying groundwater‐surface water exchange is demonstrated using temperature data collected from an upwelling streambed with sediment layers, and a simple sensitivity analysis using these data indicates the solution is relatively robust. Also, a deeper temperature profile recorded in a borehole in South Australia is analysed to estimate deeper water fluxes. The analytical solution is able to match observed thermal gradients, including the change in slope at sediment interfaces. Results indicate that not accounting for layering can yield errors in the magnitude and even direction of the inferred Darcy fluxes. A simple automated spreadsheet tool (Flux‐LM) is presented to allow users to input temperature and layer data and solve the inverse problem to estimate groundwater flux rates from shallow (e.g., <1 m) or deep (e.g., up to 100 m) profiles. The solution is not transient, and thus, it should be cautiously applied where diel signals propagate or in deeper zones where multi‐decadal surface signals have disturbed subsurface thermal regimes.
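
    For the homogeneous single-layer case that underlies such methods (the classic steady-state solution of Bredehoeft and Papadopulos, 1965, not the layered Shan and Bodvarsson solution the paper applies), the temperature profile between two depths of fixed temperature is exponential in the thermal Peclet number; a sketch:

      import numpy as np

      def steady_profile(z: np.ndarray, L: float, T0: float, TL: float,
                         q: float, k: float,
                         rho_c_w: float = 4.18e6) -> np.ndarray:
          """Steady 1-D temperature between z=0 (T0) and z=L (TL) with
          vertical Darcy flux q (m/s, positive downward, assumed nonzero),
          bulk thermal conductivity k (W/m/K), and water volumetric heat
          capacity rho_c_w (J/m^3/K)."""
          pe = q * rho_c_w * L / k                     # thermal Peclet number
          return T0 + (TL - T0) * np.expm1(pe * z / L) / np.expm1(pe)

      z = np.linspace(0.0, 1.0, 50)
      T = steady_profile(z, L=1.0, T0=10.0, TL=14.0, q=1e-7, k=1.8)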

  42. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help to understand the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy to use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated models of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.

  43. Exploring the Potential of Predictive Analytics and Big Data in Emergency Care.

    PubMed

    Janke, Alexander T; Overbeek, Daniel L; Kocher, Keith E; Levy, Phillip D

    2016-02-01

    Clinical research often focuses on resource-intensive causal inference, whereas the potential of predictive analytics with constantly increasing big data sources remains largely unexplored. Basic prediction, divorced from causal inference, is much easier with big data. Emergency care may benefit from this simpler application of big data. Historically, predictive analytics have played an important role in emergency care as simple heuristics for risk stratification. These tools generally follow a standard approach: parsimonious criteria, easy computability, and independent validation with distinct populations. Simplicity in a prediction tool is valuable, but technological advances make it no longer a necessity. Emergency care could benefit from clinical predictions built using data science tools with abundant potential input variables available in electronic medical records. Patients' risks could be stratified more precisely with large pools of data and lower resource requirements for comparing each clinical encounter to those that came before it, benefiting clinical decisionmaking and health systems operations. The largest value of predictive analytics comes early in the clinical encounter, in which diagnostic and prognostic uncertainty are high and resource-committing decisions need to be made. We propose an agenda for widening the application of predictive analytics in emergency care. Throughout, we express cautious optimism because there are myriad challenges related to database infrastructure, practitioner uptake, and patient acceptance. The quality of routinely compiled clinical data will remain an important limitation. Complementing big data sources with prospective data may be necessary if predictive analytics are to achieve their full potential to improve care quality in the emergency department.

  44. A simple and sensitive method for the determination of fibric acids in the liver by liquid chromatography.

    PubMed

    Karahashi, Minako; Fukuhara, Hiroto; Hoshina, Miki; Sakamoto, Takeshi; Yamazaki, Tohru; Mitsumoto, Atsushi; Kawashima, Yoichi; Kudo, Naomi

    2014-01-01

    Fibrates are used in biochemical and pharmacological studies as bioactive tools. Nevertheless, most studies have lacked information concerning the concentrations of fibric acids working inside tissues because a simple and sensitive method is not available for their quantitation. This study aimed to develop a simple and sensitive bioanalytical method for the quantitation of clofibric, bezafibric and fenofibric acids in samples of very small portions of tissues. Fibric acids were extracted into n-hexane-ethyl acetate from tissue homogenates (10 mg of liver, kidney or muscle) or serum (100 µL) and were derivatized with 4-bromomethyl-6,7-dimethoxycoumarin, followed by HPLC with fluorescence detection. These compounds were separated isocratically on a reversed phase with acetonitrile-water. Standard analytical curves were linear over the concentration range of 0.2-20 nmol/10 mg of liver. Precision and accuracy were within acceptable limits. Recovery from liver homogenates ranged from 93.03 to 112.29%. This method enabled the quantitation of fibric acids in 10 mg of liver from rats treated with clofibric acid, bezafibric acid or fenofibrate. From these analytical data, it became clear that there was no large difference in ratio of acyl-CoA oxidase 1 (Acox1) mRNA level to fibric acid content in the liver among the three fibric acids, suggesting that these three fibric acids have similar potency to increase expression of the Acox1 gene, which is a target of peroxisome proliferator-activated receptor α. Thus, the proposed method is a simple, sensitive and reliable tool for the quantitation of fibric acids working in vivo inside livers.

  45. An analytic performance model of disk arrays and its application

    NASA Technical Reports Server (NTRS)

    Lee, Edward K.; Katz, Randy H.

    1991-01-01

    As disk arrays become widely used, tools for understanding and analyzing their performance become increasingly important. In particular, performance models can be invaluable in both configuring and designing disk arrays. Accurate analytic performance models are desirable over other types of models because they can be quickly evaluated, are applicable under a wide range of system and workload parameters, and can be manipulated by a range of mathematical techniques. Unfortunately, analytical performance models of disk arrays are difficult to formulate due to the presence of queuing and fork-join synchronization; a disk array request is broken up into independent disk requests which must all complete to satisfy the original request. We develop, validate, and apply an analytic performance model for disk arrays. We derive simple equations for approximating their utilization, response time, and throughput. We then validate the analytic model via simulation and investigate the accuracy of each approximation used in deriving the analytical model. Finally, we apply the analytical model to derive an equation for the optimal unit of data striping in disk arrays.
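
    The record does not reproduce the derived equations; as a flavour of the kind of closed-form approximations involved, here are the elementary single-server queueing relations on which such models build (generic M/M/1 results, not the paper's disk-array equations, which also handle fork-join synchronization):

      def mm1_metrics(arrival_rate: float, service_time: float) -> dict:
          """Elementary M/M/1 approximations: utilization, mean response
          time, and throughput for a single disk treated as one server."""
          rho = arrival_rate * service_time        # utilization (must be < 1)
          if rho >= 1.0:
              raise ValueError("queue is unstable: utilization >= 1")
          response = service_time / (1.0 - rho)    # mean time in system
          return {"utilization": rho,
                  "response_time": response,
                  "throughput": arrival_rate}

      print(mm1_metrics(arrival_rate=80.0, service_time=0.010))  # 80 IOPS, 10 ms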

  46. Determining absolute protein numbers by quantitative fluorescence microscopy.

    PubMed

    Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry

    2014-01-01

    Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance, and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers: fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition.
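
    A minimal sketch of the ratiometric comparison step (names and intensities are illustrative assumptions): the copy number of the protein of interest follows from the ratio of its fluorescence intensity to that of an in-cell standard of known copy number:

      def ratiometric_count(i_unknown: float, i_standard: float,
                            n_standard: float) -> float:
          """Estimate protein copy number from the ratio of background-corrected
          fluorescence intensities to a standard of known copy number."""
          return n_standard * (i_unknown / i_standard)

      # Illustrative values: a fluorescent standard of 300 copies.
      print(ratiometric_count(i_unknown=1.8e4, i_standard=2.7e4, n_standard=300))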

  47. Structural analyses for the modification and verification of the Viking aeroshell

    NASA Technical Reports Server (NTRS)

    Stephens, W. B.; Anderson, M. S.

    1976-01-01

    The Viking aeroshell is an extremely lightweight flexible shell structure that has undergone thorough buckling analyses in the course of its development. The analytical tools and modeling technique required to reveal the structural behavior are presented. Significant results are given which illustrate the complex failure modes not usually observed in simple models and analyses. Both shell-of-revolution analysis for the pressure loads and thermal loads during entry and a general shell analysis for concentrated tank loads during launch were used. In many cases fixes or alterations to the structure were required, and the role of the analytical results in determining these modifications is indicated.

  48. Analytical thermal model for end-pumped solid-state lasers

    NASA Astrophysics Data System (ADS)

    Cini, L.; Mackenzie, J. I.

    2017-12-01

    Fundamentally power-limited by thermal effects, the design challenge for end-pumped "bulk" solid-state lasers depends upon knowledge of the temperature gradients within the gain medium. We have developed analytical expressions that can be used to model the temperature distribution and thermal-lens power in end-pumped solid-state lasers. Enabled by the inclusion of a temperature-dependent thermal conductivity, applicable from cryogenic to elevated temperatures, typical pumping distributions are explored and the results compared with accepted models. Key insights are gained through these analytical expressions, such as the dependence of the peak temperature rise on the boundary thermal conductance to the heat sink. Our generalized expressions provide simple and time-efficient tools for parametric optimization of the heat distribution in the gain medium based upon the material and pumping constraints.
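
    One standard device for handling a temperature-dependent conductivity analytically (a textbook technique consistent with, though not necessarily identical to, the paper's approach) is the Kirchhoff transform, which maps the nonlinear steady heat equation onto a linear one:

      \theta(T) = \frac{1}{k_0}\int_{T_0}^{T} k(T')\,\mathrm{d}T',
      \qquad
      \nabla\cdot\big(k(T)\,\nabla T\big) + Q = 0
      \;\Longrightarrow\;
      k_0\,\nabla^2\theta + Q = 0 .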

  49. Full quantum mechanical analysis of atomic three-grating Mach–Zehnder interferometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanz, A.S., E-mail: asanz@iff.csic.es; Davidović, M.; Božić, M.

    2015-02-15

    Atomic three-grating Mach–Zehnder interferometry constitutes an important tool to probe fundamental aspects of the quantum theory. There is, however, a remarkable gap in the literature between the oversimplified models and the robust numerical simulations used to describe the corresponding experiments. Consequently, the former usually lead to paradoxical scenarios, such as the wave–particle dual behavior of atoms, while the latter make it difficult to analyse the data in simple terms. Here these issues are tackled by means of a simple grating working model consisting of evenly-spaced Gaussian slits. As is shown, this model suffices to explore and explain such experiments both analytically and numerically, giving a good account of the full atomic journey inside the interferometer, and hence helping to demystify the physics involved. More specifically, it provides a clear and unambiguous picture of the wavefront splitting that takes place inside the interferometer, illustrating how the momentum along each emerging diffraction order is well defined even though the wave function itself still displays a rather complex shape. To this end, the local transverse momentum is also introduced in this context as a reliable analytical tool. The splitting, apart from being a key issue to understand atomic Mach–Zehnder interferometry, also demonstrates at a fundamental level how wave and particle aspects are always present in the experiment, without incurring any contradiction or interpretive paradox. On the other hand, at a practical level, the generality and versatility of the model and methodology presented make them suitable to attack analogous problems in a simple manner after convenient tuning.

    Highlights:
      • A simple model is proposed to analyze experiments based on atomic Mach–Zehnder interferometry.
      • The model can be easily handled both analytically and computationally.
      • A theoretical analysis based on the combination of the position and momentum representations is considered.
      • Wave and particle aspects are shown to coexist within the same experiment, thus removing the old wave–corpuscle dichotomy.
      • A good agreement between numerical simulations and experimental data is found without appealing to best-fit procedures.

  10. A simple and effective method for detecting precipitated proteins in MALDI-TOF MS.

    PubMed

    Oshikane, Hiroyuki; Watabe, Masahiko; Nakaki, Toshio

    2018-04-01

MALDI-TOF MS has developed rapidly into an essential analytical tool for the life sciences. Cinnamic acid derivatives are generally employed in routine molecular weight determinations of intact proteins using MALDI-TOF MS. However, a protein of interest may precipitate when mixed with the matrix solution, perhaps preventing MS detection. We herein provide a simple approach that enables the MS detection of such precipitated protein species by means of a "direct deposition method": loading the precipitate directly onto the sample plate. It is thus expected to improve routine MS analysis of intact proteins. Copyright © 2018. Published by Elsevier Inc.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cottam, Joseph A.; Blaha, Leslie M.

Systems have biases. Their interfaces naturally guide a user toward specific patterns of action. For example, modern word-processors and spreadsheets are both capable of wrapping text, checking spelling, storing tables, and calculating formulas. You could write a paper in a spreadsheet or do simple business modeling in a word-processor. However, their interfaces naturally communicate which function they are designed for. Visual analytic interfaces also have biases. In this paper, we outline why simple Markov models are a plausible tool for investigating that bias and how they might be applied. We also discuss some anticipated difficulties in such modeling and touch briefly on what some Markov model extensions might provide.
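
    To make the proposal concrete, here is a toy sketch of what such a model could look like: interface actions as Markov states, a hypothetical transition matrix for a word-processor-like interface, and the stationary distribution as one possible summary of the bias. States and probabilities are invented for illustration.

        import numpy as np

        states = ["type", "format", "table", "formula"]
        # hypothetical transition probabilities between user actions
        P = np.array([[0.70, 0.20, 0.05, 0.05],
                      [0.50, 0.40, 0.05, 0.05],
                      [0.60, 0.20, 0.15, 0.05],
                      [0.50, 0.20, 0.10, 0.20]])

        # stationary distribution: left eigenvector of P with eigenvalue 1
        vals, vecs = np.linalg.eig(P.T)
        pi = np.real(vecs[:, np.argmin(abs(vals - 1.0))])
        pi /= pi.sum()
        for s, p in zip(states, pi):
            print(f"{s:8s} {p:.3f}")   # mass concentrates on the favored actions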

  12. Comparative Chemometric Analysis for Classification of Acids and Bases via a Colorimetric Sensor Array.

    PubMed

    Kangas, Michael J; Burks, Raychelle M; Atwater, Jordyn; Lukowicz, Rachel M; Garver, Billy; Holmes, Andrea E

    2018-02-01

With the increasing availability of digital imaging devices, colorimetric sensor arrays are rapidly becoming a simple yet effective tool for the identification and quantification of various analytes. Colorimetric arrays combine data from many colorimetric sensors, and the multidimensional nature of the resulting data necessitates chemometric analysis. Herein, an 8-sensor colorimetric array was used to analyze selected acidic and basic samples (0.5-10 M) to determine which chemometric methods are best suited for the classification and quantification of analytes within clusters. PCA, HCA, and LDA were used to visualize the data set. All three methods showed well-separated clusters for each of the acid or base analytes and moderate separation between analyte concentrations, indicating that the sensor array can be used to identify and quantify samples. Furthermore, PCA could be used to determine which sensors showed the most effective analyte identification. LDA, KNN, and HQI were used for identification of analyte and concentration. HQI and KNN correctly identified the analytes in all cases, while LDA identified 95 of 96 analytes correctly. Additional studies demonstrated that controlling for solvent and image effects was unnecessary for all chemometric methods utilized in this study.
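
    The workflow described maps directly onto standard open-source tooling. The sketch below runs PCA for visualization and cross-validates LDA and KNN classifiers, with synthetic 8-sensor responses standing in for the real array data and scikit-learn standing in for the authors' software.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_per_class, n_sensors = 30, 8
        centers = rng.uniform(0, 255, size=(4, n_sensors))     # 4 mock analytes
        X = np.vstack([c + rng.normal(0, 8, (n_per_class, n_sensors))
                       for c in centers])
        y = np.repeat(np.arange(4), n_per_class)

        pca = PCA(n_components=2).fit(X)
        scores = pca.transform(X)            # 2-D coordinates for cluster plots
        print("variance kept by 2 PCs:", round(pca.explained_variance_ratio_.sum(), 3))

        for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                          ("KNN", KNeighborsClassifier(n_neighbors=3))]:
            acc = cross_val_score(clf, X, y, cv=5).mean()
            print(f"{name} cross-validated accuracy: {acc:.2f}")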

  13. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    PubMed

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of Middle Pomerania in the northern part of Poland can easily be observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that the micro-TLC-based analytical approach can be applied as an effective method for the search for internal standard (IS) substances. Generally, the described methodology can be applied for fast fractionation or screening of a whole range of target substances as well as for chemo-taxonomic studies and fingerprinting of complex mixtures present in biological or environmental samples. Due to the low consumption of eluent (usually 0.3-1 mL/run), mainly composed of water-alcohol binary mixtures, this method can be considered an environmentally friendly, green-chemistry-focused analytical tool, supplementary to analytical protocols involving column chromatography or planar micro-fluidic devices. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. SU-FF-T-668: A Simple Algorithm for Range Modulation Wheel Design in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, X; Nazaryan, Vahagn; Gueye, Paul

    2009-06-01

Purpose: To develop a simple algorithm for designing the range modulation wheel needed to generate a very smooth spread-out Bragg peak (SOBP) for proton therapy. Method and Materials: A simple algorithm has been developed to generate the weight factors of the pristine Bragg peaks that compose a smooth SOBP in proton therapy. We used a modified analytical Bragg peak function, based on the Geant4 Monte Carlo simulation toolkit, as the pristine Bragg peak input to our algorithm. A simple MATLAB(R) quadratic programming routine was introduced to optimize the cost function in our algorithm. Results: We found that the existing analytical Bragg peak function cannot be used directly as the pristine Bragg peak depth-dose input in the optimization of the weight factors, since this model does not take into account the scattering introduced by the range shifts used to modify the proton beam energies. We performed Geant4 simulations for a proton energy of 63.4 MeV with a 1.08 cm SOBP for the variation of the pristine Bragg peaks composing this SOBP, and modified the existing analytical Bragg peak functions for their peak heights, ranges R_0, and Gaussian energy spreads σ_E. We found that 19 pristine Bragg peaks are enough to achieve an SOBP flatness of 1.5%, the best flatness reported in the literature. Conclusion: This work develops a simple algorithm to generate the weight factors used to design a range modulation wheel that generates a smooth SOBP in proton radiation therapy. We have found that a moderate number of pristine Bragg peaks is enough to generate an SOBP with flatness of less than 2%. The algorithm can potentially generate a database, stored with the treatment plan, to produce a clinically acceptable SOBP.
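
    The weight-factor optimization itself can be illustrated with a short sketch. Here a crude Gaussian-peak stand-in replaces the modified analytical Bragg peak function, and scipy's non-negative least squares replaces the MATLAB quadratic program; the 19 peaks over a 1.08 cm modulation width mirror the abstract's setup, but the depth-dose model and all numbers are assumptions.

        import numpy as np
        from scipy.optimize import nnls

        z = np.linspace(0, 3.5, 400)                     # depth [cm]
        ranges = np.linspace(2.0, 3.08, 19)              # 19 peaks spanning the SOBP

        def pristine(z, R, sigma=0.05):
            # toy depth-dose: slow rise, then a Gaussian peak near the range R
            rise = (z / R)**0.6
            peak = 3.0 * np.exp(-(z - R)**2 / (2 * sigma**2))
            return np.where(z < R, rise + peak, peak)

        D = np.column_stack([pristine(z, R) for R in ranges])   # dose matrix
        target = np.where((z >= 2.0) & (z <= 3.08), 1.0, 0.0)   # flat SOBP goal
        mask = z >= 1.5                                         # fit region

        w, residual = nnls(D[mask], target[mask])               # nonnegative weights
        sobp = D @ w
        plateau = sobp[(z >= 2.1) & (z <= 3.0)]
        print(f"flatness: {(plateau.max() - plateau.min()) / plateau.mean():.1%}")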

  15. The dynamics of coastal models

    USGS Publications Warehouse

    Hearn, Clifford J.

    2008-01-01

    Coastal basins are defined as estuaries, lagoons, and embayments. This book deals with the science of coastal basins using simple models, many of which are presented in either analytical form or Microsoft Excel or MATLAB. The book introduces simple hydrodynamics and its applications, from the use of simple box and one-dimensional models to flow over coral reefs. The book also emphasizes models as a scientific tool in our understanding of coasts, and introduces the value of the most modern flexible mesh combined wave-current models. Examples from shallow basins around the world illustrate the wonders of the scientific method and the power of simple dynamics. This book is ideal for use as an advanced textbook for graduate students and as an introduction to the topic for researchers, especially those from other fields of science needing a basic understanding of the basic ideas of the dynamics of coastal basins.

  16. A Review of Numerical Simulation and Analytical Modeling for Medical Devices Safety in MRI

    PubMed Central

    Kabil, J.; Belguerras, L.; Trattnig, S.; Pasquier, C.; Missoffe, A.

    2016-01-01

Summary Objectives To review past and present challenges and ongoing trends in numerical simulation for MRI (Magnetic Resonance Imaging) safety evaluation of medical devices. Methods A wide literature review of numerical and analytical simulation of simple or complex medical devices in MRI electromagnetic fields shows the evolution through time and a growing concern for MRI safety over the years. Major issues and achievements are described, as well as current trends and perspectives in this research field. Results Numerical simulation of medical devices is constantly evolving, supported by now well-established calculation methods. Implants with simple geometry can often be simulated in a computational human model, but one issue remaining today is the experimental validation of these human models. A great concern is to assess RF heating of implants too complex to be simulated traditionally, like pacemaker leads. Thus, ongoing research focuses on alternative hybrid methods, both numerical and experimental, for example the transfer function method. For the static field and gradient fields, analytical models can be used for dimensioning simple implant shapes, but are limited for complex geometries that cannot be studied with simplifying assumptions. Conclusions Numerical simulation is an essential tool for MRI safety testing of medical devices. The main issues remain the accuracy of simulations compared to real life and the study of complex devices; but as the research field is constantly evolving, some promising ideas are now under investigation to take up these challenges. PMID:27830244

  17. Sensitive analytical method for simultaneous analysis of some vasoconstrictors with highly overlapped analytical signals

    NASA Astrophysics Data System (ADS)

    Nikolić, G. S.; Žerajić, S.; Cakić, M.

    2011-10-01

Multivariate calibration is a powerful mathematical tool that can be applied in analytical chemistry when the analytical signals are highly overlapped. A method using partial least squares regression is proposed for the simultaneous spectrophotometric determination of adrenergic vasoconstrictors in a decongestive solution containing two active components: phenylephrine hydrochloride and trimazoline hydrochloride. These sympathomimetic agents are frequently associated in pharmaceutical formulations against the common cold. The proposed method, which is simple and rapid, offers the advantages of sensitivity and a wide range of determinations without the need for extraction of the vasoconstrictors. In order to minimize the optimal factors necessary to obtain the calibration matrix by multivariate calibration, different parameters were evaluated. The adequate selection of the spectral regions proved to be important for the number of factors. In order to simultaneously quantify both hydrochlorides among the excipients, the spectral region between 250 and 290 nm was selected. The recovery for the vasoconstrictors was 98-101%. The developed method was applied to the assay of two decongestive pharmaceutical preparations.
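
    A minimal sketch of the multivariate-calibration idea follows: partial least squares regression resolving two analytes whose synthetic UV bands overlap heavily in the 250-290 nm window. The spectra, concentrations, and noise level are invented, and scikit-learn's PLSRegression stands in for whatever software the authors used.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        wl = np.linspace(250, 290, 81)                   # nm, region used in the paper
        band = lambda c, s: np.exp(-(wl - c)**2 / (2 * s**2))
        s1, s2 = band(268, 9), band(273, 9)              # strongly overlapped pure spectra

        C = rng.uniform(0.1, 1.0, size=(25, 2))          # calibration concentrations
        A = C @ np.vstack([s1, s2]) + rng.normal(0, 1e-3, (25, wl.size))

        pls = PLSRegression(n_components=2).fit(A, C)    # calibration matrix
        c_true = np.array([[0.40, 0.70]])
        a_mix = c_true @ np.vstack([s1, s2])             # an "unknown" mixture spectrum
        print("predicted:", pls.predict(a_mix).round(3)) # ~ [0.40, 0.70]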

  18. Analytic expressions for Atomic Layer Deposition: coverage, throughput, and materials utilization in cross-flow, particle coating, and spatial ALD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yanguas-Gil, Angel; Elam, Jeffrey W.

    2014-05-01

In this work, the authors present analytic models for atomic layer deposition (ALD) in three common experimental configurations: cross-flow, particle coating, and spatial ALD. These models, based on the plug-flow and well-mixed approximations, allow us to determine the minimum dose times and materials utilization for all three configurations. A comparison between the three models shows that throughput and precursor utilization can each be expressed by universal equations, in which the particularity of the experimental system is contained in a single parameter related to the residence time of the precursor in the reactor. For the case of cross-flow reactors, the authors show how simple analytic expressions for the reactor saturation profiles agree well with experimental results. Consequently, the analytic model can be used to extract information about the ALD surface chemistry (e.g., the reaction probability) by comparing the analytic and experimental saturation profiles, providing a useful tool for characterizing new and existing ALD processes. (C) 2014 American Vacuum Society

  19. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it to control both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.

  20. Selection by consequences, behavioral evolution, and the Price equation.

    PubMed

    Baum, William M

    2017-05-01

    Price's equation describes evolution across time in simple mathematical terms. Although it is not a theory, but a derived identity, it is useful as an analytical tool. It affords lucid descriptions of genetic evolution, cultural evolution, and behavioral evolution (often called "selection by consequences") at different levels (e.g., individual vs. group) and at different time scales (local and extended). The importance of the Price equation for behavior analysis lies in its ability to precisely restate selection by consequences, thereby restating, or even replacing, the law of effect. Beyond this, the equation may be useful whenever one regards ontogenetic behavioral change as evolutionary change, because it describes evolutionary change in abstract, general terms. As an analytical tool, the behavioral Price equation is an excellent aid in understanding how behavior changes within organisms' lifetimes. For example, it illuminates evolution of response rate, analyses of choice in concurrent schedules, negative contingencies, and dilemmas of self-control. © 2017 Society for the Experimental Analysis of Behavior.
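
    For reference, the standard covariance form of the Price equation (general knowledge, not quoted from this paper) in LaTeX notation, where w_i is the fitness of type i, z_i its trait value, and bars denote population means; the first term captures selection and the second transmission:

        \Delta \bar{z}
          = \underbrace{\frac{\operatorname{Cov}(w_i, z_i)}{\bar{w}}}_{\text{selection}}
          + \underbrace{\frac{\operatorname{E}(w_i \, \Delta z_i)}{\bar{w}}}_{\text{transmission}}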

  1. Analytical expression for Risken-Nummedal-Graham-Haken instability threshold in quantum cascade lasers.

    PubMed

    Vukovic, N; Radovanovic, J; Milanovic, V; Boiko, D L

    2016-11-14

    We have obtained a closed-form expression for the threshold of Risken-Nummedal-Graham-Haken (RNGH) multimode instability in a Fabry-Pérot (FP) cavity quantum cascade laser (QCL). This simple analytical expression is a versatile tool that can easily be applied in practical situations which require analysis of QCL dynamic behavior and estimation of its RNGH multimode instability threshold. Our model for a FP cavity laser accounts for the carrier coherence grating and carrier population grating as well as their relaxation due to carrier diffusion. In the model, the RNGH instability threshold is analyzed using a second-order bi-orthogonal perturbation theory and we confirm our analytical solution by a comparison with the numerical simulations. In particular, the model predicts a low RNGH instability threshold in QCLs. This agrees very well with experimental data available in the literature.

  2. The design, analysis, and testing of a low-budget wind-tunnel flutter model with active aerodynamic controls

    NASA Technical Reports Server (NTRS)

    Bolding, R. M.; Stearman, R. O.

    1976-01-01

A low-budget flutter model incorporating active aerodynamic controls for flutter suppression studies was designed as both an educational and research tool to study the interfering-lifting-surface flutter phenomenon in the form of a swept wing-tail configuration. A flutter suppression mechanism was demonstrated on a simple semirigid three-degree-of-freedom flutter model of this configuration employing an active stabilator control, and was then verified analytically using a doublet-lattice lifting-surface code and the model's measured mass, mode shapes, and frequencies in a flutter analysis. Preliminary studies were sufficiently encouraging to extend the analysis to the larger-degree-of-freedom AFFDL wing-tail flutter model, where additional analytical flutter suppression studies indicated significant gains in flutter margins could be achieved. The analytical and experimental design of a flutter suppression system for the AFFDL model is presented along with the results of a preliminary passive flutter test.

  3. A new multi-step technique with differential transform method for analytical solution of some nonlinear variable delay differential equations.

    PubMed

    Benhammouda, Brahim; Vazquez-Leal, Hector

    2016-01-01

This work presents an analytical solution of some nonlinear delay differential equations (DDEs) with variable delays. Such DDEs are difficult to treat numerically and cannot be solved by existing general-purpose codes. A new method of steps combined with the differential transform method (DTM) is proposed as a powerful tool to solve these DDEs. This method reduces the DDEs to ordinary differential equations that are then solved by the DTM. Furthermore, we show that the solutions can be improved by the Laplace-Padé resummation method. Two examples are presented to show the efficiency of the proposed technique. The main advantage of this technique is that it possesses a simple procedure based on a few straightforward steps and can be combined with any analytical method other than the DTM, such as the homotopy perturbation method.
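
    The method of steps itself is easy to demonstrate. The sketch below applies it to a hypothetical variable-delay equation y'(t) = -y(t - t/2) with history y(t) = 1 for t <= 0, using plain forward-Euler integration in place of the paper's DTM series on each step; the equation and all numbers are illustrative.

        import numpy as np

        dt, t_end = 0.001, 5.0
        ts, ys = [0.0], [1.0]        # solution computed so far

        def delayed(t):
            s = t / 2.0              # delayed argument t - tau(t), with tau(t) = t/2
            if s <= 0.0:
                return 1.0           # history function: y(t) = 1 for t <= 0
            return float(np.interp(s, ts, ys))   # s < t, so this part is already known

        t, y = 0.0, 1.0
        while t < t_end:             # forward Euler stands in for the DTM series
            y += dt * (-delayed(t))  # y'(t) = -y(t - tau(t))
            t += dt
            ts.append(t); ys.append(y)

        print(f"y({t_end}) = {y:.4f}")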

  4. The use of a quartz crystal microbalance as an analytical tool to monitor particle/surface and particle/particle interactions under dry ambient and pressurized conditions: a study using common inhaler components.

    PubMed

    Turner, N W; Bloxham, M; Piletsky, S A; Whitcombe, M J; Chianella, I

    2016-12-19

Metered dose inhalers (MDI) and multidose powder inhalers (MDPI) are commonly used for the treatment of chronic obstructive pulmonary diseases and asthma. Currently, analytical tools to monitor particle/particle and particle/surface interactions within MDI and MDPI at the macro-scale do not exist. A simple tool capable of measuring such interactions would ultimately enable quality control of MDI and MDPI, producing remarkable benefits for the pharmaceutical industry and the users of inhalers. In this paper, we have investigated whether a quartz crystal microbalance (QCM) could become such a tool. A QCM was used to measure particle/particle and particle/surface interactions on the macroscale by adding small amounts of MDPI components, in powder form, into a gas stream. The subsequent interactions with materials on the surface of the QCM sensor were analyzed. Following this, the sensor was used to measure fluticasone propionate, a typical MDI active ingredient, in a pressurized gas system to assess its interactions with different surfaces under conditions mimicking the manufacturing process. In both types of experiments the QCM was capable of discriminating interactions of different components and surfaces. The results have demonstrated that the QCM is a suitable platform for monitoring macro-scale interactions and could possibly become a tool for quality control of inhalers.

  5. Thin silica shell coated Ag assembled nanostructures for expanding generality of SERS analytes

    PubMed Central

    Kang, Yoo-Lee; Lee, Minwoo; Kang, Homan; Kim, Jaehi; Pham, Xuan-Hung; Kim, Tae Han; Hahm, Eunil; Lee, Yoon-Sik; Jeong, Dae Hong

    2017-01-01

Surface-enhanced Raman scattering (SERS) provides a unique non-destructive spectroscopic fingerprint for chemical detection. However, intrinsic differences in the affinity of analyte molecules for the metal surface hinder the use of SERS as a universal quantitative detection tool for various analyte molecules simultaneously. This must be overcome while keeping the analyte molecules in close proximity to the metal surface. Moreover, assembled metal nanoparticle (NP) structures might be more beneficial for sensitive and reliable detection of chemicals than single-NP structures. For this purpose, here we introduce thin silica-coated and assembled Ag NPs (SiO2@Ag@SiO2 NPs) for the simultaneous and quantitative detection of chemicals that have different intrinsic affinities for silver metal. These SiO2@Ag@SiO2 NPs could detect the SERS peaks of aniline and 4-aminothiophenol (4-ATP) from their mixture with limits of detection (LOD) of 93 ppm and 54 ppb, respectively. The E-field distribution as a function of interparticle distance was simulated using discrete dipole approximation (DDA) calculations to gain insight into the enhanced scattering of these thin silica-coated Ag NP assemblies. These NPs were successfully applied to detect aniline in river water and tap water. The results suggest that SiO2@Ag@SiO2 NP-based SERS detection systems can be used as a simple and universal detection tool for environmental pollutants and food safety. PMID:28570633

  6. Distributed Parameter Analysis of Pressure and Flow Disturbances in Rocket Propellant Feed Systems

    NASA Technical Reports Server (NTRS)

    Dorsch, Robert G.; Wood, Don J.; Lightner, Charlene

    1966-01-01

    A digital distributed parameter model for computing the dynamic response of propellant feed systems is formulated. The analytical approach used is an application of the wave-plan method of analyzing unsteady flow. Nonlinear effects are included. The model takes into account locally high compliances at the pump inlet and at the injector dome region. Examples of the calculated transient and steady-state periodic responses of a simple hypothetical propellant feed system to several types of disturbances are presented. Included are flow disturbances originating from longitudinal structural motion, gimbaling, throttling, and combustion-chamber coupling. The analytical method can be employed for analyzing developmental hardware and offers a flexible tool for the calculation of unsteady flow in these systems.

  7. Analytical Studies on the Synchronization of a Network of Linearly-Coupled Simple Chaotic Systems

    NASA Astrophysics Data System (ADS)

    Sivaganesh, G.; Arulgnanam, A.; Seethalakshmi, A. N.; Selvaraj, S.

    2018-05-01

We present explicit generalized analytical solutions for a network of linearly-coupled simple chaotic systems. Analytical solutions are obtained for the normalized state equations of a network of linearly-coupled systems driven by a common chaotic drive system. Two-parameter bifurcation diagrams revealing the various hidden synchronization regions, such as complete, phase and phase-lag synchronization, are identified using the analytical results. The synchronization dynamics and their stability are studied using phase portraits and the master stability function, respectively. Further, experimental results for linearly-coupled simple chaotic systems are presented to confirm the analytical results. The synchronization dynamics of a network of chaotic systems studied analytically is reported for the first time.

  8. Linear regression metamodeling as a tool to summarize and present simulation model results.

    PubMed

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
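
    The core of the approach fits in a few lines. In the sketch below, a made-up decision model (not the paper's cancer cure model) is evaluated over 10,000 PSA draws and its net monetary benefit is regressed on standardized inputs, so the intercept approximates the base-case outcome and the coefficients rank parameter influence.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 10_000
        p_cure = rng.beta(8, 12, n)          # sampled input parameters (PSA draws)
        cost = rng.gamma(9, 1_000, n)
        utility = rng.normal(0.8, 0.05, n)

        # hypothetical outcome: net monetary benefit of one treatment option
        nmb = 50_000 * p_cure * utility - cost

        X = np.column_stack([p_cure, cost, utility])
        Xs = (X - X.mean(0)) / X.std(0)                  # standardize the inputs
        A = np.column_stack([np.ones(n), Xs])
        coef, *_ = np.linalg.lstsq(A, nmb, rcond=None)   # the metamodel

        print(f"intercept (base-case NMB estimate): {coef[0]:,.0f}")
        for name, b in zip(["p_cure", "cost", "utility"], coef[1:]):
            print(f"{name:8s} standardized effect: {b:,.0f}")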

  9. Preliminary Upper Estimate of Peak Currents in Transcranial Magnetic Stimulation at Distant Locations from a TMS Coil

    PubMed Central

    Makarov, Sergey N.; Yanamadala, Janakinadh; Piazza, Matthew W.; Helderman, Alex M.; Thang, Niang S.; Burnham, Edward H.; Pascual-Leone, Alvaro

    2016-01-01

    Goals Transcranial magnetic stimulation (TMS) is increasingly used as a diagnostic and therapeutic tool for numerous neuropsychiatric disorders. The use of TMS might cause whole-body exposure to undesired induced currents in patients and TMS operators. The aim of the present study is to test and justify a simple analytical model known previously, which may be helpful as an upper estimate of eddy current density at a particular distant observation point for any body composition and any coil setup. Methods We compare the analytical solution with comprehensive adaptive mesh refinement-based FEM simulations of a detailed full-body human model, two coil types, five coil positions, about 100,000 observation points, and two distinct pulse rise times, thus providing a representative number of different data sets for comparison, while also using other numerical data. Results Our simulations reveal that, after a certain modification, the analytical model provides an upper estimate for the eddy current density at any location within the body. In particular, it overestimates the peak eddy currents at distant locations from a TMS coil by a factor of 10 on average. Conclusion The simple analytical model tested in the present study may be valuable as a rapid method to safely estimate levels of TMS currents at different locations within a human body. Significance At present, safe limits of general exposure to TMS electric and magnetic fields are an open subject, including fetal exposure for pregnant women. PMID:26685221

  10. DNA Electrochemistry and Electrochemical Sensors for Nucleic Acids.

    PubMed

    Ferapontova, Elena E

    2018-06-12

    Sensitive, specific, and fast analysis of nucleic acids (NAs) is strongly needed in medicine, environmental science, biodefence, and agriculture for the study of bacterial contamination of food and beverages and genetically modified organisms. Electrochemistry offers accurate, simple, inexpensive, and robust tools for the development of such analytical platforms that can successfully compete with other approaches for NA detection. Here, electrode reactions of DNA, basic principles of electrochemical NA analysis, and their relevance for practical applications are reviewed and critically discussed.

  11. Optimized theory for simple and molecular fluids.

    PubMed

    Marucho, M; Montgomery Pettitt, B

    2007-03-28

An optimized closure approximation for both simple and molecular fluids is presented. A smooth interpolation between the Percus-Yevick and hypernetted chain closures is optimized by minimizing the free energy self-consistently with respect to the interpolation parameter(s). The molecular version is derived from a refinement of the method for simple fluids. In doing so, a method is proposed which appropriately couples an optimized closure with the variant of the diagrammatically proper integral equation recently introduced by this laboratory [K. M. Dyer et al., J. Chem. Phys. 123, 204512 (2005)]. The simplicity of the expressions involved in this proposed theory has allowed the authors to obtain an analytic expression for the approximate excess chemical potential. This is shown to be an efficient tool to estimate, from first principles, the numerical value of the interpolation parameters defining the aforementioned closure. As a preliminary test, representative models for simple fluids and homonuclear diatomic Lennard-Jones fluids were analyzed, obtaining site-site correlation functions in excellent agreement with simulation data.

  12. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases designed to support new forms of unstructured or semi-structured data, as opposed to traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights into big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  13. Simple, rapid, and environmentally friendly method for the separation of isoflavones using ultra-high performance supercritical fluid chromatography.

    PubMed

    Wu, Wenjie; Zhang, Yuan; Wu, Hanqiu; Zhou, Weie; Cheng, Yan; Li, Hongna; Zhang, Chuanbin; Li, Lulu; Huang, Ying; Zhang, Feng

    2017-07-01

Isoflavones are natural substances that exhibit hormone-like pharmacological activities. The separation of isoflavones remains an analytical challenge because of their similar structures. We show that ultra-high performance supercritical fluid chromatography can be an appropriate tool to achieve the fast separation of 12 common dietary isoflavones. Among the five tested columns, the Torus DEA column was found to be the most effective for the separation of these isoflavones. The impact of individual parameters on the retention time and separation factor was evaluated. These parameters were optimized to develop a simple, rapid, and green method for the separation of the 12 target analytes. The separation took only 12.91 min using gradient elution with methanol as the organic modifier and formic acid as an additive. These isoflavones were determined with limits of quantitation ranging from 0.10 to 0.50 μg/mL, which is sufficient for reliable determination in various matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Numerics made easy: solving the Navier-Stokes equation for arbitrary channel cross-sections using Microsoft Excel.

    PubMed

    Richter, Christiane; Kotz, Frederik; Giselbrecht, Stefan; Helmer, Dorothea; Rapp, Bastian E

    2016-06-01

The fluid mechanics of microfluidics is distinctly simpler than the fluid mechanics of macroscopic systems. In macroscopic systems, effects such as non-laminar flow, convection, gravity, etc. need to be accounted for, all of which can usually be neglected in microfluidic systems. Still, there exists only a very limited selection of channel cross-sections for which the Navier-Stokes equation for pressure-driven Poiseuille flow can be solved analytically. From these equations, velocity profiles as well as flow rates can be calculated. However, whenever a cross-section is not highly symmetric (rectangular, elliptical or circular), the Navier-Stokes equation can usually not be solved analytically. In all of these cases, numerical methods are required. However, in many instances it is not necessary to turn to complex numerical solver packages for deriving, e.g., the velocity profile of a more complex microfluidic channel cross-section. In this paper, a simple spreadsheet analysis tool (here: Microsoft Excel) will be used to implement a simple numerical scheme which allows solving the Navier-Stokes equation for arbitrary channel cross-sections.
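
    The same scheme is simple enough to sketch outside a spreadsheet. For fully developed pressure-driven flow, the Navier-Stokes equations reduce to a Poisson problem, d2u/dy2 + d2u/dz2 = (1/mu) dp/dx with u = 0 on the walls, which the following Python stand-in solves by Jacobi iteration on a masked grid; the L-shaped cross-section and all numbers are illustrative assumptions.

        import numpy as np

        h = 1e-5                     # grid spacing [m]
        ny, nz = 60, 40              # bounding box of the cross-section
        mu, dpdx = 1e-3, -1e4        # viscosity [Pa s], pressure gradient [Pa/m]

        # mask = True inside the channel; carve an L-shaped cross-section
        inside = np.ones((ny, nz), bool)
        inside[30:, 20:] = False     # remove a quadrant -> non-analytic shape
        inside[0, :] = inside[-1, :] = inside[:, 0] = inside[:, -1] = False

        u = np.zeros((ny, nz))
        for _ in range(20_000):      # Jacobi sweeps, exactly what spreadsheet cells do
            u_new = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                            np.roll(u, 1, 1) + np.roll(u, -1, 1) - h * h * dpdx / mu)
            u = np.where(inside, u_new, 0.0)   # no-slip walls enforce u = 0

        Q = u.sum() * h * h          # volumetric flow rate from the velocity field
        print(f"peak velocity {u.max():.4f} m/s, flow rate {Q:.3e} m^3/s")

    Each grid cell holds the average of its four neighbors plus a source term, which is why the iteration can be expressed directly as spreadsheet cell formulas.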

  15. Early visual analysis tool using magnetoencephalography for treatment and recovery of neuronal dysfunction.

    PubMed

    Rasheed, Waqas; Neoh, Yee Yik; Bin Hamid, Nor Hisham; Reza, Faruque; Idris, Zamzuri; Tang, Tong Boon

    2017-10-01

    Functional neuroimaging modalities play an important role in deciding the diagnosis and course of treatment of neuronal dysfunction and degeneration. This article presents an analytical tool with visualization by exploiting the strengths of the MEG (magnetoencephalographic) neuroimaging technique. The tool automates MEG data import (in tSSS format), channel information extraction, time/frequency decomposition, and circular graph visualization (connectogram) for simple result inspection. For advanced users, the tool also provides magnitude squared coherence (MSC) values allowing personalized threshold levels, and the computation of default model from MEG data of control population. Default model obtained from healthy population data serves as a useful benchmark to diagnose and monitor neuronal recovery during treatment. The proposed tool further provides optional labels with international 10-10 system nomenclature in order to facilitate comparison studies with EEG (electroencephalography) sensor space. Potential applications in epilepsy and traumatic brain injury studies are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
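
    The magnitude squared coherence underlying the connectogram can be computed with standard tools. The sketch below uses scipy's Welch-based coherence estimate on two synthetic signals standing in for MEG channels; thresholding the result (e.g. against a control-population model) would yield the edges drawn in the connectogram.

        import numpy as np
        from scipy.signal import coherence

        fs = 1000.0
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(3)
        common = np.sin(2 * np.pi * 10 * t)                 # shared 10 Hz rhythm
        ch1 = common + 0.5 * rng.standard_normal(t.size)
        ch2 = 0.8 * common + 0.5 * rng.standard_normal(t.size)

        f, Cxy = coherence(ch1, ch2, fs=fs, nperseg=1024)   # magnitude squared coherence
        print(f"MSC at ~10 Hz: {Cxy[np.argmin(abs(f - 10))]:.2f}")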

  16. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem are successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may result in a reduction of uncertainties in the aeroservoelastic model and increased flight safety.

  17. Combination of nano-material enrichment and dead-end filtration for uniform and rapid sample preparation in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Wu, Zengnan; Khan, Mashooq; Mao, Sifeng; Lin, Ling; Lin, Jin-Ming

    2018-05-01

Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is a fast analysis tool for the detection of a wide range of analytes. However, the heterogeneous distribution of the matrix/analyte cocrystal, variation in signal intensity, and poor experimental reproducibility at different locations of the same spot make quantitative analysis difficult. In this work, carbon nanotubes (CNTs) were employed as an adsorbent for analyte and matrix on a conductive porous membrane serving as a novel mass target plate. The sample pretreatment step was achieved by enrichment and dead-end filtration, followed by drying via solid-liquid separation. This approach enables the homogeneous distribution of analyte in the matrix, good shot-to-shot reproducibility in signals, and quantitative detection of peptide and protein at different concentrations with correlation coefficients (R²) of 0.9920 and 0.9909, respectively. The simple preparation of samples in a short time, uniform distribution of analyte, easy quantitative detection, and high reproducibility make this technique useful and may diversify the application of MALDI-MS for quantitative detection of a variety of proteins. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Streamflow variability and optimal capacity of run-of-river hydropower plants

    NASA Astrophysics Data System (ADS)

    Basso, S.; Botter, G.

    2012-10-01

    The identification of the capacity of a run-of-river plant which allows for the optimal utilization of the available water resources is a challenging task, mainly because of the inherent temporal variability of river flows. This paper proposes an analytical framework to describe the energy production and the economic profitability of small run-of-river power plants on the basis of the underlying streamflow regime. We provide analytical expressions for the capacity which maximize the produced energy as a function of the underlying flow duration curve and minimum environmental flow requirements downstream of the plant intake. Similar analytical expressions are derived for the capacity which maximize the economic return deriving from construction and operation of a new plant. The analytical approach is applied to a minihydro plant recently proposed in a small Alpine catchment in northeastern Italy, evidencing the potential of the method as a flexible and simple design tool for practical application. The analytical model provides useful insight on the major hydrologic and economic controls (e.g., streamflow variability, energy price, costs) on the optimal plant capacity and helps in identifying policy strategies to reduce the current gap between the economic and energy optimizations of run-of-river plants.
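
    A numerical counterpart to this optimization reads as follows; it is an assumption-laden sketch, not the authors' expressions. Given a (here synthetic) streamflow record and a minimum environmental flow, it scans plant capacities for the one maximizing mean energy production.

        import numpy as np

        rng = np.random.default_rng(4)
        q = rng.gamma(2.0, 5.0, 20_000)      # synthetic daily streamflows [m^3/s]
        q_mef = 2.0                          # minimum environmental flow [m^3/s]

        def mean_power(capacity, rho_g_head_eta=8.5e3):
            # rho*g*head*efficiency per unit flow (~1 m head, eta ~ 0.87, assumed)
            usable = np.clip(q - q_mef, 0.0, capacity)   # divertible flow series
            return rho_g_head_eta * usable.mean()        # mean power [W]

        caps = np.linspace(0.5, 40.0, 200)
        best = caps[np.argmax([mean_power(c) for c in caps])]
        print(f"energy-optimal capacity = {best:.1f} m^3/s")

    In the analytical framework, this scan is replaced by closed-form expressions in terms of the flow duration curve; the sketch simply makes the trade-off visible: capacity beyond the flows that actually occur adds cost without adding energy.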

  19. RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.

    PubMed

    Varghese, Blesson; Patel, Ishan; Barker, Adam

    2015-01-01

Large-scale ad hoc analytics of genomic data is popular using the R programming language, supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs have benefitted from on-demand computing and storage, their scalability, and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticists can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimum effort, and how both the resources and the data required by a job can be managed. An open-source, light-weight framework for executing R scripts using Bioconductor packages, referred to as 'RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data, and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.

  20. Development of a quality assessment tool for systematic reviews of observational studies (QATSO) of HIV prevalence in men having sex with men and associated risk behaviours

    PubMed Central

    Wong, William CW; Cheung, Catherine SK; Hart, Graham J

    2008-01-01

    Background Systematic reviews based on the critical appraisal of observational and analytic studies on HIV prevalence and risk factors for HIV transmission among men having sex with men are very useful for health care decisions and planning. Such appraisal is particularly difficult, however, as the quality assessment tools available for use with observational and analytic studies are poorly established. Methods We reviewed the existing quality assessment tools for systematic reviews of observational studies and developed a concise quality assessment checklist to help standardise decisions regarding the quality of studies, with careful consideration of issues such as external and internal validity. Results A pilot version of the checklist was developed based on epidemiological principles, reviews of study designs, and existing checklists for the assessment of observational studies. The Quality Assessment Tool for Systematic Reviews of Observational Studies (QATSO) Score consists of five items: External validity (1 item), reporting (2 items), bias (1 item) and confounding factors (1 item). Expert opinions were sought and it was tested on manuscripts that fulfil the inclusion criteria of a systematic review. Like all assessment scales, QATSO may oversimplify and generalise information yet it is inclusive, simple and practical to use, and allows comparability between papers. Conclusion A specific tool that allows researchers to appraise and guide study quality of observational studies is developed and can be modified for similar studies in the future. PMID:19014686

  1. Detection of UV-treatment effects on plankton by rapid analytic tools for ballast water compliance monitoring immediately following treatment

    NASA Astrophysics Data System (ADS)

    Bradie, Johanna; Gianoli, Claudio; He, Jianjun; Lo Curto, Alberto; Stehouwer, Peter; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Non-indigenous species seriously threaten native biodiversity. To reduce establishments, the International Maritime Organization established the Convention for the Control and Management of Ships' Ballast Water and Sediments which limits organism concentrations at discharge under regulation D-2. Most ships will comply by using on-board treatment systems to disinfect their ballast water. Port state control officers will need simple, rapid methods to detect compliance. Appropriate monitoring methods may be dependent on treatment type, since different treatments will affect organisms by a variety of mechanisms. Many indicative tools have been developed, but must be examined to ensure the measured variable is an appropriate signal for the response of the organisms to the applied treatment. We assessed the abilities of multiple analytic tools to rapidly detect the effects of a ballast water treatment system based on UV disinfection. All devices detected a large decrease in the concentrations of vital organisms ≥ 50 μm and organisms < 10 μm (mean 82.7-99.7% decrease across devices), but results were more variable for the ≥ 10 to < 50 μm size class (mean 9.0-99.9% decrease across devices). Results confirm the necessity to choose tools capable of detecting the damage inflicted on living organisms, as examined herein for UV-C treatment systems.

  2. Development of a new analytical tool for assessing the mutagen 2-methyl-1,4-dinitro-pyrrole in meat products by LC-ESI-MS/MS.

    PubMed

    Molognoni, Luciano; Daguer, Heitor; de Sá Ploêncio, Leandro Antunes; Yotsuyanagi, Suzana Eri; da Silva Correa Lemos, Ana Lucia; Joussef, Antonio Carlos; De Dea Lindner, Juliano

    2018-08-01

    The use of sorbate and nitrite in meat processing may lead to the formation of 2-methyl-1,4-dinitro-pyrrole (DNMP), a mutagenic compound. This work was aimed at developing and validating an analytical method for the quantitation of DNMP by liquid chromatography-tandem mass spectrometry. Full validation was performed in accordance to Commission Decision 2002/657/EC and method applicability was checked in several samples of meat products. A simple procedure, with low temperature partitioning solid-liquid extraction, was developed. The nitrosation during the extraction was monitored by the N-nitroso-DL-pipecolic acid content. Chromatographic separation was achieved in 8 min with di-isopropyl-3-aminopropyl silane bound to hydroxylated silica as stationary phase. Samples of bacon and cooked sausage yielded the highest concentrations of DNMP (68 ± 3 and 50 ± 3 μg kg -1 , respectively). The developed method proved to be a reliable, selective, and sensitive tool for DNMP measurements in meat products. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. New Technologies for Studying Biofilms

    PubMed Central

    FRANKLIN, MICHAEL J.; CHANG, CONNIE; AKIYAMA, TATSUYA; BOTHNER, BRIAN

    2016-01-01

    Bacteria have traditionally been studied as single-cell organisms. In laboratory settings, aerobic bacteria are usually cultured in aerated flasks, where the cells are considered essentially homogenous. However, in many natural environments, bacteria and other microorganisms grow in mixed communities, often associated with surfaces. Biofilms are comprised of surface-associated microorganisms, their extracellular matrix material, and environmental chemicals that have adsorbed to the bacteria or their matrix material. While this definition of a biofilm is fairly simple, biofilms are complex and dynamic. Our understanding of the activities of individual biofilm cells and whole biofilm systems has developed rapidly, due in part to advances in molecular, analytical, and imaging tools and the miniaturization of tools designed to characterize biofilms at the enzyme level, cellular level, and systems level. PMID:26350329

  4. Post hoc support vector machine learning for impedimetric biosensors based on weak protein-ligand interactions.

    PubMed

    Rong, Y; Padron, A V; Hagerty, K J; Nelson, N; Chi, S; Keyhani, N O; Katz, J; Datta, S P A; Gomes, C; McLamore, E S

    2018-04-30

    Impedimetric biosensors for measuring small molecules based on weak/transient interactions between bioreceptors and target analytes are a challenge for detection electronics, particularly in field studies or in the analysis of complex matrices. Protein-ligand binding sensors have enormous potential for biosensing, but achieving accuracy in complex solutions is a major challenge. There is a need for simple post hoc analytical tools that are not computationally expensive, yet provide near real time feedback on data derived from impedance spectra. Here, we show the use of a simple, open source support vector machine learning algorithm for analyzing impedimetric data in lieu of using equivalent circuit analysis. We demonstrate two different protein-based biosensors to show that the tool can be used for various applications. We conclude with a mobile phone-based demonstration focused on the measurement of acetone, an important biomarker related to the onset of diabetic ketoacidosis. In all conditions tested, the open source classifier was capable of performing as well as, or better, than the equivalent circuit analysis for characterizing weak/transient interactions between a model ligand (acetone) and a small chemosensory protein derived from the tsetse fly. In addition, the tool has a low computational requirement, facilitating use for mobile acquisition systems such as mobile phones. The protocol is deployed through Jupyter notebook (an open source computing environment available for mobile phone, tablet or computer use) and the code was written in Python. For each of the applications, we provide step-by-step instructions in English, Spanish, Mandarin and Portuguese to facilitate widespread use. All codes were based on scikit-learn, an open source software machine learning library in the Python language, and were processed in Jupyter notebook, an open-source web application for Python. The tool can easily be integrated with the mobile biosensor equipment for rapid detection, facilitating use by a broad range of impedimetric biosensor users. This post hoc analysis tool can serve as a launchpad for the convergence of nanobiosensors in planetary health monitoring applications based on mobile phone hardware.
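
    In the spirit of the paper's open-source, scikit-learn-based tool, a minimal classifier over impedance-derived feature vectors might look as follows. The two-class synthetic spectra and every parameter here are invented stand-ins, not the published protocol.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(5)
        freqs = np.logspace(1, 5, 40)                    # Hz

        def spectrum(shift):
            # toy impedance magnitude of an RC-like interface; "shift" moves the corner
            return 1e3 / np.sqrt(1 + (freqs / (1e3 * shift))**2)

        X = np.vstack([spectrum(s) + rng.normal(0, 5, freqs.size)
                       for s in np.r_[rng.normal(1.0, 0.05, 60),    # class 0: blank
                                      rng.normal(1.3, 0.05, 60)]])  # class 1: analyte
        y = np.r_[np.zeros(60), np.ones(60)]

        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(Xtr, ytr)
        print(f"held-out accuracy: {clf.score(Xte, yte):.2f}")

    Because the classifier consumes raw spectra directly, no equivalent-circuit fit is needed, which is the computational saving the abstract emphasizes.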

  5. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze the analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparison of objects according to their data profile (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods, or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate the analytical performance taking into account all indicators simultaneously, thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which are used simultaneously for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently (2) a "distance" to the reference laboratory, and (3) a classification due to the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
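
    The ordinal comparison at the heart of the method is easy to sketch. Below, each laboratory carries a data profile (|bias|, standard deviation, |skewness|); laboratory i is ranked above j only if it is at least as good on every indicator and strictly better on at least one, and profiles with no such relation remain incomparable. All data are invented.

        import numpy as np

        labs = ["A", "B", "C", "D"]
        # columns: |mean - reference|, standard deviation, |skewness|
        profile = np.array([[0.1, 0.30, 0.2],
                            [0.2, 0.25, 0.1],
                            [0.1, 0.40, 0.5],
                            [0.3, 0.50, 0.6]])

        def dominates(i, j):
            # better or equal on all indicators, strictly better on at least one
            return np.all(profile[i] <= profile[j]) and np.any(profile[i] < profile[j])

        for i in range(len(labs)):
            for j in range(len(labs)):
                if dominates(i, j):
                    print(f"lab {labs[i]} ranks above lab {labs[j]}")
        # pairs related in neither direction stay incomparable, which is the
        # hallmark of a partial (rather than total) order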

  6. Quantitative characterization of edge enhancement in phase contrast x-ray imaging.

    PubMed

    Monnin, P; Bulling, S; Hoszowska, J; Valley, J F; Meuli, R; Verdun, F R

    2004-06-01

    The aim of this study was to model the edge enhancement effect in in-line holography phase contrast imaging. A simple analytical approach was used to quantify refraction and interference contrasts in terms of beam energy and imaging geometry. The model was applied to predict the peak intensity and frequency of the edge enhancement for images of cylindrical fibers. The calculations were compared with measurements, and the relationship between the spatial resolution of the detector and the amplitude of the phase contrast signal was investigated. Calculations using the analytical model were in good agreement with experimental results for nylon, aluminum and copper wires of 50 to 240 microm diameter, and with numerical simulations based on Fresnel-Kirchhoff theory. A relationship between the defocusing distance and the pixel size of the image detector was established. This analytical model is a useful tool for optimizing imaging parameters in phase contrast in-line holography, including defocusing distance, detector resolution and beam energy.

  7. Introducing Simple Detection of Bioavailable Arsenic at Rafaela (Santa Fe Province, Argentina) Using the ARSOlux Biosensor.

    PubMed

    Siegfried, Konrad; Hahn-Tomer, Sonja; Koelsch, Andreas; Osterwalder, Eva; Mattusch, Juergen; Staerk, Hans-Joachim; Meichtry, Jorge M; De Seta, Graciela E; Reina, Fernando D; Panigatti, Cecilia; Litter, Marta I; Harms, Hauke

    2015-05-21

    Numerous articles have reported the occurrence of arsenic in drinking water in Argentina, and the resulting health effects in severely affected regions of the country. Arsenic in drinking water in Argentina is largely naturally occurring due to elevated background content of the metalloid in volcanic sediments, although, in some regions, mining can contribute. While the origin of arsenic release has been discussed extensively, the problem of drinking water contamination has not yet been solved. One key step in progress towards mitigation of problems related with the consumption of As-containing water is the availability of simple detection tools. A chemical test kit and the ARSOlux biosensor were evaluated as simple analytical tools for field measurements of arsenic in the groundwater of Rafaela (Santa Fe, Argentina), and the results were compared with ICP-MS and HPLC-ICP-MS measurements. A survey of the groundwater chemistry was performed to evaluate possible interferences with the field tests. The results showed that the ARSOlux biosensor performed better than the chemical field test, that the predominant species of arsenic in the study area was arsenate and that arsenic concentration in the studied samples had a positive correlation with fluoride and vanadium, and a negative one with calcium and iron.

  8. A green and facile approach for synthesizing imine to develop optical biosensor for wide range detection of bilirubin in human biofluids.

    PubMed

    Ellairaja, Sundaram; Shenbagavalli, Kathiravan; Ponmariappan, Sarkaraisamy; Vasantha, Vairathevar Sivasamy

    2017-05-15

    Bilirubin, a key biomarker for jaundice, needs a better analytical tool for its clinical diagnosis. A novel and simple fluorescent platform based on (2,2'-((1E,1'E)-((6-bromopyridine-2,3-diyl) bis(azanylylidene)) bis(methanylylidene diphenol) (BAMD) was designed. BAMD showed a remarkable fluorescence intensity, with a very good quantum yield of 0.85 and a lifetime of 870 ps. Hence, it was applied to the determination of bilirubin using both colorimetric and fluorimetric techniques at physiological and basic pH. Under optimized experimental conditions, the probe detects bilirubin selectively in the presence of other interfering biomolecules and metal ions. The linear range of detection is 1 pM-500 µM at pH 7.4, and the LOD is 2.8 and 3.3 pM at pH 7.4 and 9.0, respectively, among the lowest reported so far. The probe detects bilirubin through a FRET mechanism. The practical application of the probe was successfully tested in human blood and urine samples. Based on all the above advantages, this simple idea can be applied to design a simple clinical diagnostic tool for jaundice. Copyright © 2016. Published by Elsevier B.V.

  9. Introducing Simple Detection of Bioavailable Arsenic at Rafaela (Santa Fe Province, Argentina) Using the ARSOlux Biosensor

    PubMed Central

    Siegfried, Konrad; Hahn-Tomer, Sonja; Koelsch, Andreas; Osterwalder, Eva; Mattusch, Juergen; Staerk, Hans-Joachim; Meichtry, Jorge M.; De Seta, Graciela E.; Reina, Fernando D.; Panigatti, Cecilia; Litter, Marta I.; Harms, Hauke

    2015-01-01

    Numerous articles have reported the occurrence of arsenic in drinking water in Argentina, and the resulting health effects in severely affected regions of the country. Arsenic in drinking water in Argentina is largely naturally occurring due to elevated background content of the metalloid in volcanic sediments, although, in some regions, mining can contribute. While the origin of arsenic release has been discussed extensively, the problem of drinking water contamination has not yet been solved. One key step towards the mitigation of problems related to the consumption of As-containing water is the availability of simple detection tools. A chemical test kit and the ARSOlux biosensor were evaluated as simple analytical tools for field measurements of arsenic in the groundwater of Rafaela (Santa Fe, Argentina), and the results were compared with ICP-MS and HPLC-ICP-MS measurements. A survey of the groundwater chemistry was performed to evaluate possible interferences with the field tests. The results showed that the ARSOlux biosensor performed better than the chemical field test, that the predominant species of arsenic in the study area was arsenate, and that the arsenic concentration in the studied samples had a positive correlation with fluoride and vanadium, and a negative one with calcium and iron. PMID:26006123

  10. Software Models Impact Stresses

    NASA Technical Reports Server (NTRS)

    Hanshaw, Timothy C.; Roy, Dipankar; Toyooka, Mark

    1991-01-01

    Generalized Impact Stress Software designed to assist engineers in predicting stresses caused by variety of impacts. Program straightforward, simple to implement on personal computers, "user friendly", and handles variety of boundary conditions applied to struck body being analyzed. Applications include mathematical modeling of motions and transient stresses of spacecraft, analysis of slamming of piston, of fast valve shutoffs, and play of rotating bearing assembly. Provides fast and inexpensive analytical tool for analysis of stresses and reduces dependency on expensive impact tests. Written in FORTRAN 77. Requires use of commercial software package PLOT88.

  11. Calculation of Excavation Force for ISRU on Lunar Surface

    NASA Technical Reports Server (NTRS)

    Zeng, Xiangwu (David); Burnoski, Louis; Agui, Juan H.; Wilkinson, Allen

    2007-01-01

    Accurately predicting the excavation force that will be encountered by digging tools on the lunar surface is a crucial element of in-situ resource utilization (ISRU). Based on principles of soil mechanics, this paper develops an analytical model that is relatively simple to apply and uses soil parameters that can be determined by traditional soil strength tests. The influence of important parameters on the excavation force is investigated. The results are compared with those predicted by other available theories. Results of preliminary soil tests on lunar simulant are also reported.
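
    The paper's specific model is not reproduced here; as a hedged illustration of the kind of soil-mechanics calculation involved, the sketch below evaluates a Reece-style fundamental earthmoving equation, one common analytical form for excavation force, with hypothetical regolith parameters and N-factors:

```python
def excavation_force(depth, width, density, cohesion, surcharge,
                     N_gamma, N_c, N_q, g=1.62):
    """
    Reece-style fundamental earthmoving equation (a common analytical form,
    not necessarily the model of this paper):
        F = (rho * g * d^2 * N_gamma + c * d * N_c + q * d * N_q) * w
    The N-factors encode tool geometry and soil/tool friction angles.
    g defaults to lunar gravity, 1.62 m/s^2.
    """
    return (density * g * depth**2 * N_gamma
            + cohesion * depth * N_c
            + surcharge * depth * N_q) * width

# Hypothetical lunar-regolith-like parameters, for illustration only.
F = excavation_force(depth=0.05, width=0.10, density=1700.0,
                     cohesion=500.0, surcharge=0.0,
                     N_gamma=5.0, N_c=10.0, N_q=6.0)
print(f"predicted horizontal force ~ {F:.1f} N")
```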

  12. [Electronic data processing-assisted bookkeeping and accounting system at the Düsseldorf Institute of Forensic Medicine].

    PubMed

    Bonte, W; Bonte, I

    1989-01-01

    In 1985 we reported on the usefulness of a simple home computer (here: a Commodore C 64) for scientific work. This paper demonstrates that such an instrument can also be an appropriate tool for the entire accountancy of a medicolegal institute. Presented are self-designed programs which deal with the following matters: compilation of monthly performance reports, calculation of services for clinical care, typing of analytical results and brief interpretations, typing of liquidations, and clearing of proceeds from written expertises and autopsies against administration and staff.

  13. Free web-based modelling platform for managed aquifer recharge (MAR) applications

    NASA Astrophysics Data System (ADS)

    Stefan, Catalin; Junghanns, Ralf; Glaß, Jana; Sallwey, Jana; Fatkhutdinov, Aybulat; Fichtner, Thomas; Barquero, Felix; Moreno, Miguel; Bonilla, José; Kwoyiga, Lydia

    2017-04-01

    Managed aquifer recharge represents a valuable instrument for sustainable water resources management. The concept implies purposeful infiltration of surface water into the underground for later recovery or environmental benefits. Over the decades, MAR schemes have been successfully installed worldwide for a variety of reasons: maximizing the natural storage capacity of aquifers, physical aquifer management, water quality management, and ecological benefits. The INOWAS-DSS platform provides a collection of free web-based tools for planning, management and optimization of the main components of MAR schemes. The tools are grouped into 13 specific applications that cover the most relevant challenges encountered at MAR sites, from both quantitative and qualitative perspectives. The applications include, among others, the optimization of MAR site location, the assessment of saltwater intrusion, the restoration of groundwater levels in overexploited aquifers, the maximization of the natural storage capacity of aquifers, the improvement of water quality, the design and operational optimization of MAR schemes, clogging development, and risk assessment. The platform contains a collection of about 35 web-based tools of various degrees of complexity, which are either included in application-specific workflows or used as standalone modelling instruments. Among them are simple tools derived from data mining and empirical equations, analytical groundwater-related equations, as well as complex numerical flow and transport models (MODFLOW, MT3DMS and SEAWAT). Up to now, the simulation core of the INOWAS-DSS, which is based on the finite-difference groundwater flow model MODFLOW, is implemented and runs on the web. A scenario analyser helps to easily set up and evaluate new management options as well as future developments such as land-use and climate change, and to compare them with previous scenarios. Additionally, simple tools such as analytical equations to assess saltwater intrusion are already running online. Besides the simulation tools, a web-based database is under development where geospatial and time-series data can be stored, managed, and processed. Furthermore, a web-based information system containing user guides for the various developed tools and applications, as well as basic information on MAR and related topics, is published and will be regularly expanded as new tools are implemented. The INOWAS-DSS, including its simulation tools, database and information system, provides an extensive framework to manage, plan and optimize MAR facilities. As the INOWAS-DSS is open-source software accessible via the internet using standard web browsers, it offers new ways for data sharing and collaboration among various partners and decision makers.
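
    As an illustration of the "analytical groundwater-related equations" category, the sketch below evaluates the classical Ghyben-Herzberg relation for the fresh/salt water interface; whether INOWAS-DSS implements this exact formula is an assumption:

```python
def ghyben_herzberg_depth(head, rho_fresh=1000.0, rho_salt=1025.0):
    """
    Ghyben-Herzberg relation: depth z of the fresh/salt water interface
    below sea level as a function of the freshwater head h above sea level:
        z = rho_f / (rho_s - rho_f) * h   (~ 40 * h for typical densities)
    """
    return rho_fresh / (rho_salt - rho_fresh) * head

# Raising the water table through MAR pushes the interface down ~40x as far.
for h in (0.25, 0.5, 1.0):
    print(f"head {h:4.2f} m -> interface depth {ghyben_herzberg_depth(h):6.1f} m")
```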

  14. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols relating to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. A high-throughput label-free nanoparticle analyser.

    PubMed

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M; Ruoslahti, Erkki; Cleland, Andrew N

    2011-05-01

    Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.

  16. Real-Time XRD Studies of Li-O2 Electrochemical Reaction in Nonaqueous Lithium-Oxygen Battery.

    PubMed

    Lim, Hyunseob; Yilmaz, Eda; Byon, Hye Ryung

    2012-11-01

    Understanding of the electrochemical processes in the rechargeable Li-O2 battery has suffered from the lack of proper analytical tools, especially for the identification of the chemical species and the number of electrons involved in the discharge/recharge process. Here we present a simple and straightforward analytical method for simultaneously attaining chemical and quantitative information on Li2O2 (the discharge product) and byproducts using in situ XRD measurements. By real-time monitoring of the solid-state Li2O2 peak area, the accurate efficiency of Li2O2 formation and the number of electrons can be evaluated during full discharge. Furthermore, by observing the sequential area change of the Li2O2 peak during recharge, we found, for the first time, a nonlinear Li2O2 decomposition rate in an ether-based electrolyte.

  17. Modeling the Mousetrap Car

    NASA Astrophysics Data System (ADS)

    Jumper, William D.

    2012-03-01

    Many high school and introductory college physics courses make use of mousetrap car projects and competitions as a way of providing an engaging hands-on learning experience incorporating Newton's laws, conversion of potential to kinetic energy, dissipative forces, and rotational mechanics. Presented here is a simple analytical and finite element spreadsheet model for a typical mousetrap car, as shown in Fig. 1. It is hoped that the model will provide students with a tool for designing or modifying the designs of their cars, provide instructors with a means to ensure students close the loop between physical principles and an understanding of their car's speed and distance performance, and stimulate in students at an early stage an appreciation for the merits of computer modeling as an aid in understanding and tackling otherwise analytically intractable problems so common in today's professional world.
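
    A minimal sketch of such a model, with all car parameters hypothetical: a lever of length L pays string out from an axle drum of radius r_a, the drive wheels have radius r_w, and the spring torque decays linearly with lever angle; Newton's second law is integrated by forward Euler:

```python
# Minimal mousetrap-car sketch (all parameters hypothetical), not the
# author's spreadsheet model: spring torque tau(theta) decays linearly,
# string tension tau/L is geared to the ground by the ratio r_a/r_w.
import math

tau0, theta_max = 0.35, math.pi          # N*m at full wind, usable lever sweep
L, r_a, r_w = 0.20, 0.004, 0.05          # lever arm, axle drum, wheel radii (m)
m, f_roll = 0.15, 0.01                   # car mass (kg), rolling resistance (N)

x, v, theta, dt = 0.0, 0.0, 0.0, 1e-3
while v >= 0.0:
    if theta < theta_max:                 # spring still driving
        tau = tau0 * (1.0 - theta / theta_max)
        F_drive = (tau / L) * (r_a / r_w) # string tension geared to the ground
    else:
        F_drive = 0.0                     # coasting phase
    a = (F_drive - f_roll) / m
    v += a * dt
    x += v * dt
    theta += (v * dt / r_w) * (r_a / L)   # lever angle tracks string payout
print(f"predicted range ~ {x:.1f} m")
```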

  18. Versatile electrophoresis-based self-test platform.

    PubMed

    Guijt, Rosanne M

    2015-03-01

    Lab on a Chip technology offers the possibility to extract chemical information from a complex sample in a simple, automated way without the need for a laboratory setting. In the health care sector, this chemical information could be used as a diagnostic tool, for example to inform dosing. In this issue, the research underpinning a family of electrophoresis-based point-of-care devices for self-testing of ionic analytes in various sample matrices is described [Electrophoresis 2015, 36, 712-721]. Hardware, software, and methodological changes made to improve the overall analytical performance in terms of accuracy, precision, detection limit, and reliability are discussed. In addition to the main focus on lithium monitoring, new applications are included, among them the use of the platform for veterinary purposes and for sodium and creatinine measurements. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts of organic solvents (only a few microliters), or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  20. Can the analyte-triggered asymmetric autocatalytic Soai reaction serve as a universal analytical tool for measuring enantiopurity and assigning absolute configuration?

    PubMed

    Welch, Christopher J; Zawatzky, Kerstin; Makarov, Alexey A; Fujiwara, Satoshi; Matsumoto, Arimasa; Soai, Kenso

    2016-12-20

    An investigation is reported on the use of the autocatalytic enantioselective Soai reaction, known to be influenced by the presence of a wide variety of chiral materials, as a generic tool for measuring the enantiopurity and absolute configuration of any substance. Good generality for the reaction across a small group of test analytes was observed, consistent with literature reports suggesting a diversity of compound types that can influence the stereochemical outcome of this reaction. Some trends in the absolute sense of stereochemical enrichment were noted, suggesting the possible utility of the approach for assigning absolute configuration to unknown compounds, by analogy to closely related species with known outcomes. Considerable variation was observed in the triggering strength of different enantiopure materials, an undesirable characteristic when dealing with mixtures containing minor impurities with strong triggering strength in the presence of major components with weak triggering strength. A strong tendency of the reaction toward an 'all or none' type of behavior makes the reaction most sensitive for detecting enantioenrichment close to zero. Consequently, the ability to discern modest from excellent enantioselectivity was relatively poor. While these properties limit the ability to obtain precise enantiopurity measurements in a simple single addition experiment, prospects may exist for more complex experimental setups that may potentially offer improved performance.

  1. Is the Jeffreys' scale a reliable tool for Bayesian model comparison in cosmology?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nesseris, Savvas; García-Bellido, Juan, E-mail: savvas.nesseris@uam.es, E-mail: juan.garciabellido@uam.es

    2013-08-01

    We are entering an era where progress in cosmology is driven by data, and alternative models will have to be compared and ruled out according to some consistent criterion. The most conservative and widely used approach is Bayesian model comparison. In this paper we explicitly calculate the Bayes factors for all models that are linear with respect to their parameters. We do this in order to test the so-called Jeffreys' scale and determine analytically how accurate its predictions are in a simple case where we fully understand and can calculate everything analytically. We also discuss the case of nested models, e.g. one with M₁ parameters and another with M₂ ⊇ M₁ parameters, and we derive analytic expressions for both the Bayes factor and the Figure of Merit, defined as the inverse area of the model parameters' confidence contours. With all this machinery and the use of an explicit example we demonstrate that the threshold nature of Jeffreys' scale is not a "one size fits all" reliable tool for model comparison and that it may lead to biased conclusions.
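
    For models linear in their parameters with Gaussian noise and Gaussian priors, the evidence is itself a Gaussian in the data, so Bayes factors come out in closed form. The sketch below (synthetic data, hypothetical priors) illustrates this general fact without reproducing the authors' specific setup:

```python
# Analytic Bayesian evidence for models linear in their parameters:
# d = A @ theta + noise, noise ~ N(0, C), prior theta ~ N(0, P).
# Marginalizing theta gives d ~ N(0, A P A^T + C), so ln Z is closed-form.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
d = 1.0 + 0.5 * x + rng.normal(0.0, 0.1, x.size)   # synthetic data: a line

def log_evidence(A, d, noise_sigma=0.1, prior_sigma=10.0):
    C = noise_sigma**2 * np.eye(len(d))            # noise covariance
    P = prior_sigma**2 * np.eye(A.shape[1])        # prior covariance
    return multivariate_normal(mean=np.zeros(len(d)),
                               cov=A @ P @ A.T + C).logpdf(d)

A1 = np.column_stack([np.ones_like(x), x])         # model 1: constant + slope
A2 = np.column_stack([np.ones_like(x), x, x**2])   # model 2 (nested): + x^2
lnB12 = log_evidence(A1, d) - log_evidence(A2, d)
# Typically > 0 here: the Occam penalty disfavours the unneeded quadratic.
print(f"ln(B12) = {lnB12:.2f}")
```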

  2. Validation of the replica trick for simple models

    NASA Astrophysics Data System (ADS)

    Shinzato, Takashi

    2018-04-01

    We discuss the replica analytic continuation using several simple models in order to prove mathematically the validity of the replica analysis, which is used in a wide range of fields related to large-scale complex systems. While replica analysis consists of two analytical techniques—the replica trick (or replica analytic continuation) and the thermodynamical limit (and/or order parameter expansion)—we focus our study on replica analytic continuation, which is the mathematical basis of the replica trick. We apply replica analysis to solve a variety of analytical models, and examine the properties of replica analytic continuation. Based on the positive results for these models we propose that replica analytic continuation is a robust procedure in replica analysis.

  3. Customisation of the exome data analysis pipeline using a combinatorial approach.

    PubMed

    Pattnaik, Swetansu; Vaidyanathan, Srividya; Pooja, Durgad G; Deepak, Sa; Panda, Binay

    2012-01-01

    The advent of next generation sequencing (NGS) technologies has revolutionised the way biologists produce, analyse and interpret data. Although NGS platforms provide a cost-effective way to discover genome-wide variants from a single experiment, variants discovered by NGS need follow-up validation due to the high error rates associated with various sequencing chemistries. Recently, whole exome sequencing has been proposed as an affordable option compared to whole genome runs, but it still requires follow-up validation of all the novel exomic variants. Customarily, a consensus approach is used to overcome the systematic errors inherent to the sequencing technology, alignment and post-alignment variant detection algorithms. However, this approach warrants the use of multiple sequencing chemistries, multiple alignment tools and multiple variant callers, which may not be viable in terms of time and money for individual investigators with limited informatics know-how. Biologists often lack the requisite training to deal with the huge amount of data produced by NGS runs and face difficulty in choosing from the list of freely available analytical tools for NGS data analysis. Hence, there is a need to customise the NGS data analysis pipeline to preferentially retain true variants by minimising the incidence of false positives, and to make the choice of the right analytical tools easier. To this end, we have sampled different freely available tools used at the alignment and post-alignment stages, suggesting the use of the most suitable combination determined by a simple framework of pre-existing metrics to create significant datasets.

  4. Meta-analyzing dependent correlations: an SPSS macro and an R script.

    PubMed

    Cheung, Shu Fai; Chan, Darius K-S

    2014-06-01

    The presence of dependent correlation is a common problem in meta-analysis. Cheung and Chan (2004, 2008) have shown that samplewise-adjusted procedures perform better than the more commonly adopted simple within-sample mean procedures. However, samplewise-adjusted procedures have rarely been applied in meta-analytic reviews, probably due to the lack of suitable ready-to-use programs. In this article, we compare the samplewise-adjusted procedures with existing procedures to handle dependent effect sizes, and present the samplewise-adjusted procedures in a way that will make them more accessible to researchers conducting meta-analysis. We also introduce two tools, an SPSS macro and an R script, that researchers can apply to their meta-analyses; these tools are compatible with existing meta-analysis software packages.

  5. Evaluation of plasma proteomic data for Alzheimer disease state classification and for the prediction of progression from mild cognitive impairment to Alzheimer disease.

    PubMed

    Llano, Daniel A; Devanarayan, Viswanath; Simon, Adam J

    2013-01-01

    Previous studies that have examined the potential for plasma markers to serve as biomarkers for Alzheimer disease (AD) have studied single analytes and focused on the amyloid-β and τ isoforms and have failed to yield conclusive results. In this study, we performed a multivariate analysis of 146 plasma analytes (the Human DiscoveryMAP v 1.0 from Rules-Based Medicine) in 527 subjects with AD, mild cognitive impairment (MCI), or cognitively normal elderly subjects from the Alzheimer's Disease Neuroimaging Initiative database. We identified 4 different proteomic signatures, each using 5 to 14 analytes, that differentiate AD from control patients with sensitivity and specificity ranging from 74% to 85%. Five analytes were common to all 4 signatures: apolipoprotein A-II, apolipoprotein E, serum glutamic oxaloacetic transaminase, α-1-microglobulin, and brain natriuretic peptide. None of the signatures adequately predicted progression from MCI to AD over a 12- and 24-month period. A new panel of analytes, optimized to predict MCI to AD conversion, was able to provide 55% to 60% predictive accuracy. These data suggest that a simple panel of plasma analytes may provide an adjunctive tool to differentiate AD from controls, may provide mechanistic insights to the etiology of AD, but cannot adequately predict MCI to AD conversion.

  6. Analytical techniques for characterization of cyclodextrin complexes in aqueous solution: a review.

    PubMed

    Mura, Paola

    2014-12-01

    Cyclodextrins are cyclic oligosaccharides endowed with a hydrophilic outer surface and a hydrophobic inner cavity, able to form inclusion complexes with a wide variety of guest molecules, positively affecting their physicochemical properties. In particular, in the pharmaceutical field, cyclodextrin complexation is mainly used to increase the aqueous solubility and dissolution rate of poorly soluble drugs, and to enhance their bioavailability and stability. Analytical characterization of host-guest interactions is of fundamental importance for fully exploiting the potential benefits of complexation, helping in selection of the most appropriate cyclodextrin. The assessment of the actual formation of a drug-cyclodextrin inclusion complex and its full characterization is not a simple task and often requires the use of different analytical methods, whose results have to be combined and examined together. The purpose of the present review is to give, as much as possible, a general overview of the main analytical tools which can be employed for the characterization of drug-cyclodextrin inclusion complexes in solution, with emphasis on their respective potential merits, disadvantages and limits. Further, the applicability of each examined technique is illustrated and discussed by specific examples from literature. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. MALDI-TOF MS identification of anaerobic bacteria: assessment of pre-analytical variables and specimen preparation techniques.

    PubMed

    Hsu, Yen-Michael S; Burnham, Carey-Ann D

    2014-06-01

    Matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) has emerged as a tool for identifying clinically relevant anaerobes. We evaluated the analytical performance characteristics of the Bruker Microflex with Biotyper 3.0 software system for identification of anaerobes and examined the impact of direct formic acid (FA) treatment and other pre-analytical factors on MALDI-TOF MS performance. A collection of 101 anaerobic bacteria was evaluated, including Clostridium spp., Propionibacterium spp., Fusobacterium spp., Bacteroides spp., and other anaerobic bacteria of clinical relevance. The results of our study indicate that an on-target extraction with 100% FA improves the rate of accurate identification without introducing misidentification (P<0.05). In addition, we modified the reporting cutoffs for the Biotyper "score" yielding acceptable identification, and found that a score of ≥1.700 can maximize the rate of identification. Of interest, MALDI-TOF MS can correctly identify anaerobes grown in suboptimal conditions, such as on selective culture media and following oxygen exposure. In conclusion, we report on a number of simple and cost-effective pre- and post-analytical modifications that could enhance MALDI-TOF MS identification of anaerobic bacteria. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Validation of the SINDA/FLUINT code using several analytical solutions

    NASA Technical Reports Server (NTRS)

    Keller, John R.

    1995-01-01

    The Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA/FLUINT) code has often been used to determine the transient and steady-state response of various thermal and fluid flow networks. While this code is an often used design and analysis tool, the validation of this program has been limited to a few simple studies. For the current study, the SINDA/FLUINT code was compared to four different analytical solutions. The thermal analyzer portion of the code (conduction and radiative heat transfer, SINDA portion) was first compared to two separate solutions. The first comparison examined a semi-infinite slab with a periodic surface temperature boundary condition. Next, a small, uniform temperature object (lumped capacitance) was allowed to radiate to a fixed temperature sink. The fluid portion of the code (FLUINT) was also compared to two different analytical solutions. The first study examined a tank filling process by an ideal gas in which there is both control volume work and heat transfer. The final comparison considered the flow in a pipe joining two infinite reservoirs of pressure. The results of all these studies showed that for the situations examined here, the SINDA/FLUINT code was able to match the results of the analytical solutions.
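
    For reference, the first of these benchmarks has a classical closed form; the sketch below evaluates the exact periodic solution for a semi-infinite slab (material parameters hypothetical):

```python
# Exact solution for a semi-infinite slab whose surface temperature
# oscillates as T(0,t) = T_m + A*cos(omega*t). With depth x the wave
# decays and lags:
#   T(x,t) = T_m + A * exp(-x/delta) * cos(omega*t - x/delta),
#   delta = sqrt(2*alpha/omega)   (penetration depth).
import math

def slab_temperature(x, t, T_mean=300.0, A=50.0,
                     omega=2 * math.pi / 86400.0, alpha=1e-6):
    """alpha is the thermal diffusivity (m^2/s); daily forcing by default."""
    delta = math.sqrt(2.0 * alpha / omega)
    return T_mean + A * math.exp(-x / delta) * math.cos(omega * t - x / delta)

# Amplitude decay with depth for a soil-like material under a daily cycle.
delta = math.sqrt(2.0 * 1e-6 / (2 * math.pi / 86400.0))
for x in (0.0, 0.05, 0.10, 0.20):
    print(f"x = {x:4.2f} m: amplitude = {50.0 * math.exp(-x / delta):5.1f} K")
```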

  9. The stationary sine-Gordon equation on metric graphs: Exact analytical solutions for simple topologies

    NASA Astrophysics Data System (ADS)

    Sabirov, K.; Rakhmanov, S.; Matrasulov, D.; Susanto, H.

    2018-04-01

    We consider the stationary sine-Gordon equation on metric graphs with simple topologies. Exact analytical solutions are obtained for different vertex boundary conditions. It is shown that the method can be extended for tree and other simple graph topologies. Applications of the obtained results to branched planar Josephson junctions and Josephson junctions with tricrystal boundaries are discussed.

  10. A simple method to calculate first-passage time densities with arbitrary initial conditions

    NASA Astrophysics Data System (ADS)

    Nyberg, Markus; Ambjörnsson, Tobias; Lizana, Ludvig

    2016-06-01

    Numerous applications all the way from biology and physics to economics depend on the density of first crossings over a boundary. Motivated by the lack of general purpose analytical tools for computing first-passage time densities (FPTDs) for complex problems, we propose a new simple method based on the independent interval approximation (IIA). We generalise previous formulations of the IIA to include arbitrary initial conditions as well as to deal with discrete time and non-smooth continuous time processes. We derive a closed form expression for the FPTD in z and Laplace-transform space to a boundary in one dimension. Two classes of problems are analysed in detail: discrete time symmetric random walks (Markovian) and continuous time Gaussian stationary processes (Markovian and non-Markovian). Our results are in good agreement with Langevin dynamics simulations.
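
    The IIA itself is not reproduced here; the sketch below computes the exact benchmark it can be checked against in the simplest case it covers, the first-passage density of a discrete-time symmetric random walk, via the classical hitting-time theorem:

```python
# First-passage time density of a symmetric +/-1 random walk from 0 to a
# boundary at a > 0, via the hitting-time theorem:
#   P(T_a = n) = (a / n) * P(S_n = a),  P(S_n = a) = C(n, (n+a)/2) / 2^n,
# valid for n >= a with n and a of the same parity.
from math import comb
import random

def fptd(a, n):
    """Exact first-passage probability at step n for boundary a > 0."""
    if n < a or (n - a) % 2:
        return 0.0
    return (a / n) * comb(n, (n + a) // 2) / 2**n

# Quick Monte Carlo check, boundary a = 3.
random.seed(1)
a, n_max, trials = 3, 25, 200_000
hits = [0] * (n_max + 1)
for _ in range(trials):
    s = 0
    for n in range(1, n_max + 1):
        s += random.choice((-1, 1))
        if s == a:
            hits[n] += 1
            break
for n in (3, 5, 7, 9):
    print(f"n={n}: exact {fptd(a, n):.4f}  simulated {hits[n] / trials:.4f}")
```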

  11. Determination of hydrazine in drinking water: Development and multivariate optimization of a rapid and simple solid phase microextraction-gas chromatography-triple quadrupole mass spectrometry protocol.

    PubMed

    Gionfriddo, Emanuela; Naccarato, Attilio; Sindona, Giovanni; Tagarelli, Antonio

    2014-07-04

    In this work, the capabilities of solid phase microextraction were exploited in a fully optimized SPME-GC-QqQ-MS analytical approach for hydrazine assay. A rapid and easy method was obtained by a simple derivatization reaction with propyl chloroformate and pyridine carried out directly in the water samples, followed by automated SPME analysis in the same vial without further sample handling. The affinity of the different derivatized compounds towards five commercially available SPME coatings was evaluated in order to achieve the best extraction efficiency. GC analyses were carried out using a GC-QqQ-MS instrument in selected reaction monitoring (SRM) acquisition mode, which allowed high specificity to be achieved by selecting appropriate precursor-product ion couples, improving the capability of analyte identification. The multivariate approach of experimental design was crucial in order to optimize the derivatization reaction, the SPME process and the tandem mass spectrometry parameters. Accuracy of the proposed protocol, tested at 60, 200 and 800 ng L⁻¹, provided satisfactory values (114.2%, 83.6% and 98.6%, respectively), whereas precision (RSD%) at the same concentration levels was 10.9%, 7.9% and 7.7%, respectively. Limits of detection and quantification of 4.4 and 8.3 ng L⁻¹ were obtained. The reliable application of the proposed protocol to real drinking water samples confirmed its suitability as an analytical tool for routine analyses. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Thermodynamics of Gas Turbine Cycles with Analytic Derivatives in OpenMDAO

    NASA Technical Reports Server (NTRS)

    Gray, Justin; Chin, Jeffrey; Hearn, Tristan; Hendricks, Eric; Lavelle, Thomas; Martins, Joaquim R. R. A.

    2016-01-01

    A new equilibrium thermodynamics analysis tool was built based on the CEA method using the OpenMDAO framework. The new tool provides forward and adjoint analytic derivatives for use with gradient-based optimization algorithms. The new tool was validated against the original CEA code to ensure an accurate analysis, and the analytic derivatives were validated against finite-difference approximations. Performance comparisons between analytic and finite-difference methods showed a significant speed advantage for the analytic methods. To further test the new analysis tool, a sample optimization was performed to find the optimal air-fuel equivalence ratio maximizing combustion temperature for a range of different pressures. Collectively, the results demonstrate the viability of the new tool to serve as the thermodynamic backbone for future work on a full propulsion modeling tool.
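
    The CEA-based tool itself is not shown; as a hedged illustration of the analytic-derivative pattern in OpenMDAO, the toy component below declares its own partials and checks them against finite differences (the component and variable names are invented):

```python
# Toy OpenMDAO ExplicitComponent with hand-coded analytic partials,
# illustrating the pattern the abstract describes (not the CEA tool).
import openmdao.api as om

class Parabola(om.ExplicitComponent):
    def setup(self):
        self.add_input('x', val=0.0)
        self.add_output('f', val=0.0)
        self.declare_partials('f', 'x')       # analytic, not finite-differenced

    def compute(self, inputs, outputs):
        outputs['f'] = (inputs['x'] - 3.0)**2

    def compute_partials(self, inputs, partials):
        partials['f', 'x'] = 2.0 * (inputs['x'] - 3.0)   # df/dx, closed form

prob = om.Problem()
prob.model.add_subsystem('comp', Parabola(), promotes=['*'])
prob.setup()
prob.set_val('x', 1.0)
prob.run_model()
# Validate the analytic derivative against a finite-difference approximation,
# mirroring the validation step described in the abstract.
prob.check_partials(compact_print=True)
```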

  13. Investigating Analytic Tools for e-Book Design in Early Literacy Learning

    ERIC Educational Resources Information Center

    Roskos, Kathleen; Brueck, Jeremy; Widman, Sarah

    2009-01-01

    Toward the goal of better e-book design to support early literacy learning, this study investigates analytic tools for examining design qualities of e-books for young children. Three research-based analytic tools related to e-book design were applied to a mixed genre collection of 50 e-books from popular online sites. Tool performance varied…

  14. An Approximate Solution to the Equation of Motion for Large-Angle Oscillations of the Simple Pendulum with Initial Velocity

    ERIC Educational Resources Information Center

    Johannessen, Kim

    2010-01-01

    An analytic approximation of the solution to the differential equation describing the oscillations of a simple pendulum at large angles and with initial velocity is discussed. In the derivation, a sinusoidal approximation has been applied, and an analytic formula for the large-angle period of the simple pendulum is obtained, which also includes…
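
    For comparison with any such approximation, the exact large-angle period is known in closed form through the complete elliptic integral of the first kind:

```python
# Exact large-angle pendulum period: T = (4/omega0) * K(k), k = sin(theta0/2).
# SciPy's ellipk takes the parameter m = k^2.
import numpy as np
from scipy.special import ellipk

g, L = 9.81, 1.0
omega0 = np.sqrt(g / L)
T_small = 2 * np.pi / omega0                      # small-angle period

for theta0_deg in (10, 45, 90, 150):
    theta0 = np.radians(theta0_deg)
    T_exact = 4.0 / omega0 * ellipk(np.sin(theta0 / 2.0)**2)
    print(f"theta0 = {theta0_deg:3d} deg: T/T0 = {T_exact / T_small:.4f}")
```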

  15. A graphical approach to radio frequency quadrupole design

    NASA Astrophysics Data System (ADS)

    Turemen, G.; Unel, G.; Yasatekin, B.

    2015-07-01

    The design of a radio frequency quadrupole, an important section of all ion accelerators, and the calculation of its beam dynamics properties can be achieved using existing computational tools. These programs, originally designed in the 1980s, show effects of aging in their user interfaces and in their output. The authors believe there is room for improvement, both in design techniques using a graphical approach and in the amount of analytical calculation done before going into CPU-burning finite element analysis techniques. Additionally, an emphasis on the graphical method of controlling the evolution of the relevant parameters using the drag-to-change paradigm is bound to be beneficial to the designer. A computer code, named DEMIRCI, has been written in C++ to demonstrate these ideas. This tool has been used in the design of the Turkish Atomic Energy Authority (TAEK)'s 1.5 MeV proton beamline at the Saraykoy Nuclear Research and Training Center (SANAEM). DEMIRCI starts with a simple analytical model, calculates the RFQ behavior and produces 3D design files that can be fed to a milling machine. The paper discusses the experience gained during the design process of the SANAEM Project Prometheus (SPP) RFQ and underlines some of DEMIRCI's capabilities.

  16. A Quantum-Like View to a Generalized Two Players Game

    NASA Astrophysics Data System (ADS)

    Bagarello, F.

    2015-10-01

    This paper considers the possibility of using some quantum tools in decision-making strategies. In particular, we consider here a dynamical open quantum system helping two players to take their decisions in a specific context. We see that, within our approach, the final choices of the players do not depend in general on their initial mental states, but are driven essentially by the environment which interacts with them. The model proposed here also considers interactions of a different nature between the two players, and it is simple enough to allow for an analytical solution of the equations of motion.

  17. On a computational model of building thermal dynamic response

    NASA Astrophysics Data System (ADS)

    Jarošová, Petra; Vala, Jiří

    2016-07-01

    Development and exploitation of advanced materials, structures and technologies in civil engineering, both for buildings with carefully controlled interior temperature and for common residential houses, together with new European and national directives and technical standards, stimulate the development of rather complex and robust, but sufficiently simple and inexpensive, computational tools supporting their design and the optimization of energy consumption. This paper demonstrates the possibility of reconciling such seemingly contradictory requirements, using a simplified non-stationary thermal model of a building motivated by the analogy with the analysis of electric circuits; certain semi-analytical forms of solutions come from the method of lines.
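
    A minimal sketch of the electric-circuit analogy the model is motivated by, reduced to a single thermal resistance and capacitance with a closed-form response (all parameter values hypothetical):

```python
# Single-zone building as one thermal resistance R (K/W) to the exterior
# and one capacitance C (J/K):
#   C dT/dt = (T_out - T)/R + Q,
# with closed-form response for constant inputs:
#   T(t) = T_ss + (T0 - T_ss) * exp(-t/(R*C)),  T_ss = T_out + R*Q.
import math

R, C = 0.005, 2.0e7                  # hypothetical envelope resistance, mass
T_out, Q, T0 = -5.0, 3000.0, 20.0    # exterior temp, heating power, initial T

T_ss = T_out + R * Q                 # steady-state interior temperature
tau = R * C                          # building time constant (seconds)
for hours in (0, 6, 12, 24, 48):
    t = hours * 3600.0
    T = T_ss + (T0 - T_ss) * math.exp(-t / tau)
    print(f"t = {hours:2d} h: T = {T:5.2f} degC (tau = {tau / 3600:.0f} h)")
```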

  18. Classical problems in computational aero-acoustics

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C.

    1996-01-01

    In relation to the expected problems in the development of computational aeroacoustics (CAA), the preliminary applications were to classical problems where the known analytical solutions could be used to validate the numerical results. Such comparisons were used to overcome the numerical problems inherent in these calculations. Comparisons were made between the various numerical approaches to the problems such as direct simulations, acoustic analogies and acoustic/viscous splitting techniques. The aim was to demonstrate the applicability of CAA as a tool in the same class as computational fluid dynamics. The scattering problems that occur are considered and simple sources are discussed.

  19. Nuclear magnetic resonance and high-performance liquid chromatography techniques for the characterization of bioactive compounds from Humulus lupulus L. (hop).

    PubMed

    Bertelli, Davide; Brighenti, Virginia; Marchetti, Lucia; Reik, Anna; Pellati, Federica

    2018-06-01

    Humulus lupulus L. (hop) represents one of the most cultivated crops, it being a key ingredient in the brewing process. Many health-related properties have been described for hop extracts, making this plant gain more interest in the field of pharmaceutical and nutraceutical research. Among the analytical tools available for the phytochemical characterization of plant extracts, quantitative nuclear magnetic resonance (qNMR) represents a new and powerful technique. In this ambit, the present study was aimed at the development of a new, simple, and efficient qNMR method for the metabolite fingerprinting of bioactive compounds in hop cones, taking advantage of the novel ERETIC 2 tool. To the best of our knowledge, this is the first attempt to apply this method to complex matrices of natural origin, such as hop extracts. The qNMR method set up in this study was applied to the quantification of both prenylflavonoids and bitter acids in eight hop cultivars. The performance of this analytical method was compared with that of HPLC-UV/DAD, which represents the most frequently used technique in the field of natural product analysis. The quantitative data obtained for hop samples by means of the two aforementioned techniques highlighted that the amount of bioactive compounds was slightly higher when qNMR was applied, although the order of magnitude of the values was the same. The accuracy of qNMR was comparable to that of the chromatographic method, thus proving to be a reliable tool for the analysis of these secondary metabolites in hop extracts. Graphical abstract: extraction and analytical methods applied in this work for the analysis of bioactive compounds in Humulus lupulus L. (hop) cones.

  20. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    1990-01-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30 K (−405 °F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings and composite gusset/clip joints, all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75 mm square). Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited, and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce the required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on the developed techniques and properties.

  1. TRAC, a collaborative computer tool for tracer-test interpretation

    NASA Astrophysics Data System (ADS)

    Gutierrez, A.; Klinka, T.; Thiéry, D.; Buscarlet, E.; Binet, S.; Jozja, N.; Défarge, C.; Leclerc, B.; Fécamp, C.; Ahumada, Y.; Elsass, J.

    2013-05-01

    Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results through hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to comply with various field contexts. This computer program, called TRAC, is very light and simple, allowing the user to add his own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration of a site operation is based on considering the hydrodynamic and hydrodispersive features of groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.
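
    One classical analytical solution of the kind such a tool assembles (hedged: not necessarily TRAC's exact formulation) is the 1-D advection-dispersion response to an instantaneous tracer injection:

```python
# 1-D advection-dispersion of an instantaneous tracer injection in uniform
# flow:  C(x,t) = M / sqrt(4*pi*D*t) * exp(-(x - v*t)^2 / (4*D*t)),
# with M the injected mass per unit cross-sectional area of flow,
# v the seepage velocity and D the longitudinal dispersion coefficient.
import math

def tracer_concentration(x, t, M=1.0, v=1e-4, D=1e-6):
    """Concentration at distance x (m) and time t (s); units follow M."""
    return M / math.sqrt(4 * math.pi * D * t) * math.exp(
        -(x - v * t)**2 / (4 * D * t))

# Breakthrough curve 10 m downstream for hypothetical v and D
# (peak expected near x/v ~ 28 h).
for hours in (12, 24, 28, 32, 48):
    t = hours * 3600.0
    print(f"t = {hours:2d} h: C = {tracer_concentration(10.0, t):8.4f}")
```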

  2. Carbon Nanomaterial Based Biosensors for Non-Invasive Detection of Cancer and Disease Biomarkers for Clinical Diagnosis

    PubMed Central

    Tung, Thanh Tran

    2017-01-01

    The early diagnosis of diseases, e.g., Parkinson's and Alzheimer's disease, diabetes, and various types of cancer, and monitoring the response of patients to therapy play a critical role in clinical treatment; therefore, there is intensive research into the determination of many clinical analytes. In order to achieve point-of-care sensing in clinical practice, sensitive, selective, cost-effective, simple, reliable, and rapid analytical methods are required. Biosensors have become essential tools in biomarker sensing, in which electrode material and architecture play critical roles in achieving sensitive and stable detection. Carbon nanomaterials in the form of particles/dots, tubes/wires, and sheets have recently become indispensable elements of biosensor platforms due to their excellent mechanical, electronic, and optical properties. This review summarizes developments in this lucrative field by presenting the major biosensor types and the variability of sensor platforms in biomedical applications. PMID:28825646

  3. Experimental and analytical study of frictional anisotropy of nanotubes

    NASA Astrophysics Data System (ADS)

    Riedo, Elisa; Gao, Yang; Li, Tai-De; Chiu, Hsiang-Chih; Kim, Suenne; Klinke, Christian; Tosatti, Erio

    The frictional properties of carbon and boron nitride nanotubes (NTs) are very important in a variety of applications, including composite materials, carbon fibers, and micro/nano-electromechanical systems. Atomic force microscopy (AFM) is a powerful tool for investigating, with nanoscale resolution, the frictional properties of individual NTs. Here, we report on an experimental AFM study of the frictional properties of different types of supported nanotubes. We also propose a quantitative model to describe and then predict the frictional properties of nanotubes sliding on a substrate along (longitudinal friction) or perpendicular to (transverse friction) their axis. This model provides a simple but general analytical relationship that describes the acquired experimental data well. As an example of potential applications, this experimental method combined with the proposed model can guide the design of better NT-ceramic composites, or the self-assembly of nanotubes on a surface in a given direction. M. Lucas et al., Nature Materials 8, 876-881 (2009).

  4. From pixel to voxel: a deeper view of biological tissue by 3D mass spectral imaging

    PubMed Central

    Ye, Hui; Greer, Tyler; Li, Lingjun

    2011-01-01

    Three dimensional mass spectral imaging (3D MSI) is an exciting field that grants the ability to study a broad mass range of molecular species, ranging from small molecules to large proteins, by creating lateral and vertical distribution maps of select compounds. Although the general premise behind 3D MSI is simple, factors such as the choice of ionization method, sample handling, software considerations and many others must be taken into account for the successful design of a 3D MSI experiment. This review provides a brief overview of ionization methods, sample preparation, software types and technological advancements driving 3D MSI research of a wide range of low- to high-mass analytes. Future perspectives in this field are also provided, concluding that continuous developments of this powerful analytical tool promise ever-growing applications in the biomedical field. PMID:21320052

  5. Qualitative carbonyl profile in coffee beans through GDME-HPLC-DAD-MS/MS for coffee preliminary characterization.

    PubMed

    Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A

    2018-05-01

    In this work, an analytical methodology for the characterization of volatile carbonyl compounds in green and roasted coffee beans was developed. The methodology relied on a recent and simple sample preparation technique, gas diffusion microextraction, for extraction of the samples' volatiles, followed by HPLC-DAD-MS/MS analysis. The experimental conditions, in terms of extraction temperature and extraction time, were studied. A carbonyl compound profile was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of the reported literature, in relation to different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, human health implications, post-harvest coffee processing and roasting. The applied methodology proved to be a powerful analytical tool for coffee characterization, as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069

  7. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.

  8. TLD efficiency calculations for heavy ions: an analytical approach

    DOE PAGES

    Boscolo, Daria; Scifoni, Emanuele; Carlino, Antonio; ...

    2015-12-18

    The use of thermoluminescent dosimeters (TLDs) in heavy charged particle dosimetry is limited by their non-linear dose response curve and by their response dependence on the radiation quality. Thus, in order to use TLDs with particle beams, a model that can reproduce the behavior of these detectors under different conditions is needed. Here a new, simple and completely analytical algorithm for the calculation of the relative TL-efficiency depending on the ion charge Z and energy E is presented. In addition, the detector response is evaluated starting from the single-ion case, where the computed effectiveness values have been compared with experimental data as well as with predictions from a different method. The main advantage of this approach is that, being fully analytical, it is computationally fast and can be efficiently integrated into treatment planning verification tools. The calculated efficiency values have then been implemented in the treatment planning code TRiP98, and dose calculations on a macroscopic target irradiated with an extended carbon ion field have been performed and verified against experimental data.

  9. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormone analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Fabricating Simple Wax Screen-Printing Paper-Based Analytical Devices to Demonstrate the Concept of Limiting Reagent in Acid- Base Reactions

    ERIC Educational Resources Information Center

    Namwong, Pithakpong; Jarujamrus, Purim; Amatatongchai, Maliwan; Chairam, Sanoe

    2018-01-01

    In this article, a low-cost, simple, and rapid fabrication of paper-based analytical devices (PADs) using a wax screen-printing method is reported here. The acid-base reaction is implemented in the simple PADs to demonstrate to students the chemistry concept of a limiting reagent. When a fixed concentration of base reacts with a gradually…

  11. 33 CFR 385.33 - Revisions to models and analytical tools.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Management District, and other non-Federal sponsors shall rely on the best available science including models..., and assessment of projects. The selection of models and analytical tools shall be done in consultation... system-wide simulation models and analytical tools used in the evaluation and assessment of projects, and...

  12. Simple and accurate methods for quantifying deformation, disruption, and development in biological tissues

    PubMed Central

    Boyle, John J.; Kume, Maiko; Wyczalkowski, Matthew A.; Taber, Larry A.; Pless, Robert B.; Xia, Younan; Genin, Guy M.; Thomopoulos, Stavros

    2014-01-01

    When mechanical factors underlie growth, development, disease or healing, they often function through local regions of tissue where deformation is highly concentrated. Current optical techniques to estimate deformation can lack precision and accuracy in such regions due to challenges in distinguishing a region of concentrated deformation from an error in displacement tracking. Here, we present a simple and general technique for improving the accuracy and precision of strain estimation and an associated technique for distinguishing a concentrated deformation from a tracking error. The strain estimation technique improves accuracy relative to other state-of-the-art algorithms by directly estimating strain fields without first estimating displacements, resulting in a very simple method and low computational cost. The technique for identifying local elevation of strain enables for the first time the successful identification of the onset and consequences of local strain concentrating features such as cracks and tears in a highly strained tissue. We apply these new techniques to demonstrate a novel hypothesis in prenatal wound healing. More generally, the analytical methods we have developed provide a simple tool for quantifying the appearance and magnitude of localized deformation from a series of digital images across a broad range of disciplines. PMID:25165601

  13. Erratum: A Simple, Analytical Model of Collisionless Magnetic Reconnection in a Pair Plasma

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Zenitani, Seiji; Kuznetsova, Masha; Klimas, Alex

    2011-01-01

    The following describes a list of errata in our paper, "A simple, analytical model of collisionless magnetic reconnection in a pair plasma." It supersedes an earlier erratum. We recently discovered an error in the derivation of the outflow-to-inflow density ratio.

  14. The Role of Wakes in Modelling Tidal Current Turbines

    NASA Astrophysics Data System (ADS)

    Conley, Daniel; Roc, Thomas; Greaves, Deborah

    2010-05-01

    The eventual proper development of arrays of Tidal Current Turbines (TCT) will require a balance which maximizes power extraction while minimizing environmental impacts. Idealized analytical analogues and simple 2-D models are useful tools for investigating questions of a general nature but do not represent practical tools for application to realistic cases. Some form of 3-D numerical simulation will be required for such applications, and the current project is designed to develop a numerical decision-making tool for use in planning large-scale TCT projects. The project is predicated on the use of an existing regional ocean modelling framework (the Regional Ocean Modelling System - ROMS) which is modified to enable the user to account for the effects of TCTs. In such a framework, where mixing processes are highly parametrized, the fidelity of the quantitative results is critically dependent on the parameter values utilized. In light of the early stage of TCT development and the lack of field-scale measurements, the calibration of such a model is problematic. In the absence of explicit calibration data sets, the device wake structure has been identified as an efficient feature for model calibration. This presentation will discuss efforts to design an appropriate calibration scheme focused on wake decay; the motivation for this approach, the techniques applied, validation results from simple test cases, and limitations will be presented.
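
    For a feel for the wake-decay quantity proposed for calibration, a classical top-hat wake model (Jensen) with assumed parameters; this is an idealized analogue, not the ROMS parameterization developed in the project.

    ```python
    import numpy as np

    # Jensen (top-hat) wake model: an idealized analogue of turbine wake decay.
    def jensen_deficit(x, r0=5.0, ct=0.8, k=0.04):
        """Fractional velocity deficit at downstream distance x (m).
        r0: rotor radius (m), ct: thrust coefficient, k: wake decay constant."""
        return (1.0 - np.sqrt(1.0 - ct)) / (1.0 + k * x / r0) ** 2

    u0 = 2.0   # free-stream tidal current (m/s), assumed
    for x in (10, 50, 100, 200):
        u = u0 * (1.0 - jensen_deficit(float(x)))
        print(f"x = {x:4d} m: wake velocity ~ {u:.2f} m/s")
    ```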

  15. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    PubMed

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not mastered. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines to user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebook to perform analysis tasks or reproduce research results much more easily.
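
    A minimal sketch of the notebook-to-app pattern described, using pandas and ipywidgets as one common tool choice; the data and the analysis function are placeholders, not the authors' pipeline.

    ```python
    # Run inside a Jupyter Notebook: turns a plain analysis function into a
    # small interactive app with a slider, no web programming required.
    import pandas as pd
    from ipywidgets import interact

    df = pd.DataFrame({"age": [34, 51, 29, 62, 45],
                       "stay_days": [3, 8, 2, 12, 5]})   # placeholder data

    def summarize(min_age=30):
        # Filter and summarize; stands in for a real analytics pipeline step.
        return df[df["age"] >= min_age].describe()

    interact(summarize, min_age=(20, 70, 5))   # renders a slider above the output
    ```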

  16. Nanomaterial-Based Sensing and Biosensing of Phenolic Compounds and Related Antioxidant Capacity in Food.

    PubMed

    Della Pelle, Flavio; Compagnone, Dario

    2018-02-04

    Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value through their well-known health benefits, their technological role, and their marketing appeal. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenols (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are covered. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. However, particular attention has been paid to the success of the application in real samples, in addition to the NMs. In particular, the discussion has been focused on methods/devices presenting, in the opinion of the authors, clear advancement in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate how NM-based approaches represent valid alternatives to classical methods for polyphenol analysis, and are mature enough to be integrated into rapid food quality assessment in the lab or directly in the field.

  17. Nanomaterial-Based Sensing and Biosensing of Phenolic Compounds and Related Antioxidant Capacity in Food

    PubMed Central

    2018-01-01

    Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value through their well-known health benefits, their technological role, and their marketing appeal. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenols (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are covered. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. However, particular attention has been paid to the success of the application in real samples, in addition to the NMs. In particular, the discussion has been focused on methods/devices presenting, in the opinion of the authors, clear advancement in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate how NM-based approaches represent valid alternatives to classical methods for polyphenol analysis, and are mature enough to be integrated into rapid food quality assessment in the lab or directly in the field. PMID:29401719

  18. Neural-network quantum state tomography

    NASA Astrophysics Data System (ADS)

    Torlai, Giacomo; Mazzola, Guglielmo; Carrasquilla, Juan; Troyer, Matthias; Melko, Roger; Carleo, Giuseppe

    2018-05-01

    The experimental realization of increasingly complex synthetic quantum systems calls for the development of general theoretical methods to validate and fully exploit quantum resources. Quantum state tomography (QST) aims to reconstruct the full quantum state from simple measurements, and therefore provides a key tool to obtain reliable analytics [1-3]. However, exact brute-force approaches to QST place a high demand on computational resources, making them unfeasible for anything except small systems [4,5]. Here we show how machine learning techniques can be used to perform QST of highly entangled states with more than a hundred qubits, to a high degree of accuracy. We demonstrate that machine learning allows one to reconstruct traditionally challenging many-body quantities—such as the entanglement entropy—from simple, experimentally accessible measurements. This approach can benefit existing and future generations of devices ranging from quantum computers to ultracold-atom quantum simulators [6-8].
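
    For a sense of what QST reconstructs, a single-qubit linear-inversion sketch; the paper's neural-network approach is what makes many-qubit analogues of this tractable, and the measured averages below are invented.

    ```python
    import numpy as np

    # Linear-inversion QST for one qubit: rho = (I + <X>X + <Y>Y + <Z>Z) / 2.
    I = np.eye(2, dtype=complex)
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)

    # Hypothetical measurement averages for a state near (|0> + |1>)/sqrt(2),
    # whose ideal values would be (<X>, <Y>, <Z>) = (1, 0, 0):
    ex, ey, ez = 0.98, 0.01, -0.02

    rho = 0.5 * (I + ex * X + ey * Y + ez * Z)
    print(np.round(rho, 3))
    print("purity:", np.real(np.trace(rho @ rho)))
    ```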

  19. A Simple Numerical Procedure for the Simulation of "Lifelike" Linear-Sweep Voltammograms

    NASA Astrophysics Data System (ADS)

    Bozzini, Benedetto P.

    2000-01-01

    Practical linear-sweep voltammograms seldom resemble the theoretical ones shown in textbooks. This is because several phenomena (activation, mass transport, ohmic resistance) control the kinetics over different potential ranges scanned during the potential sweep. These effects are generally treated separately in the didactic literature, yet they have never been "assembled" in a way that allows the educational use of real experiments. This makes linear-sweep voltammetric experiments almost unusable in the teaching of physical chemistry. A simple approach to the classroom description of "lifelike" experimental results is proposed in this paper. Analytical expressions of linear sweep voltammograms are provided. The actual numerical evaluations can be carried out with a pocket calculator. Two typical examples are executed and comparison with experimental data is described. This approach to teaching electrode kinetics has proved an effective tool to provide students with an insight into the effects of electrochemical parameters and operating conditions.
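
    The paper's composite expressions are not reproduced here, but one standard pocket-calculator-level ingredient of reversible linear-sweep theory is the Randles-Sevcik peak current; a sketch with illustrative electrode parameters.

    ```python
    import math

    # Randles-Sevcik peak current for a reversible linear-sweep wave at 25 C:
    # ip = 2.69e5 * n^1.5 * A * C * sqrt(D * v)
    # n: electrons, A: electrode area (cm^2), C: bulk conc. (mol/cm^3),
    # D: diffusion coefficient (cm^2/s), v: scan rate (V/s); ip in amperes.
    def randles_sevcik(n, area, conc, diff, scan_rate):
        return 2.69e5 * n**1.5 * area * conc * math.sqrt(diff * scan_rate)

    for v in (0.01, 0.05, 0.1, 0.5):   # ip should grow with sqrt(v)
        ip = randles_sevcik(n=1, area=0.07, conc=1e-6, diff=7e-6, scan_rate=v)
        print(f"v = {v:5.2f} V/s: ip = {ip * 1e6:.2f} uA")
    ```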

  20. Exploring the Dynamics of Cell Processes through Simulations of Fluorescence Microscopy Experiments

    PubMed Central

    Angiolini, Juan; Plachta, Nicolas; Mocskos, Esteban; Levi, Valeria

    2015-01-01

    Fluorescence correlation spectroscopy (FCS) methods are powerful tools for unveiling the dynamical organization of cells. For simple cases, such as molecules passively moving in a homogeneous media, FCS analysis yields analytical functions that can be fitted to the experimental data to recover the phenomenological rate parameters. Unfortunately, many dynamical processes in cells do not follow these simple models, and in many instances it is not possible to obtain an analytical function through a theoretical analysis of a more complex model. In such cases, experimental analysis can be combined with Monte Carlo simulations to aid in interpretation of the data. In response to this need, we developed a method called FERNET (Fluorescence Emission Recipes and Numerical routines Toolkit) based on Monte Carlo simulations and the MCell-Blender platform, which was designed to treat the reaction-diffusion problem under realistic scenarios. This method enables us to set complex geometries of the simulation space, distribute molecules among different compartments, and define interspecies reactions with selected kinetic constants, diffusion coefficients, and species brightness. We apply this method to simulate single- and multiple-point FCS, photon-counting histogram analysis, raster image correlation spectroscopy, and two-color fluorescence cross-correlation spectroscopy. We believe that this new program could be very useful for predicting and understanding the output of fluorescence microscopy experiments. PMID:26039162
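
    A toy version of the underlying idea, far simpler than FERNET's reaction-diffusion scenarios: particles diffusing through a Gaussian detection spot, with the resulting intensity trace autocorrelated afterwards (all parameters illustrative).

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_part, n_steps, dt = 200, 20000, 1e-4   # particles, time steps, step (s)
    D, w, box = 1.0, 0.3, 5.0                # diffusion (um^2/s), spot waist, box (um)

    pos = rng.uniform(-box / 2, box / 2, size=(n_part, 2))
    trace = np.empty(n_steps)
    for t in range(n_steps):
        pos += rng.normal(0.0, np.sqrt(2 * D * dt), size=pos.shape)
        pos = (pos + box / 2) % box - box / 2        # periodic boundaries
        r2 = (pos ** 2).sum(axis=1)
        trace[t] = np.exp(-2 * r2 / w ** 2).sum()    # Gaussian detection profile

    # FFT-based autocorrelation of the fluctuations, G(tau), biased estimator
    f = trace - trace.mean()
    fhat = np.fft.rfft(f, n=2 * n_steps)
    acf = np.fft.irfft(fhat * np.conj(fhat))[:n_steps] / (trace.mean() ** 2 * n_steps)
    print("G(0) ~", acf[0])   # roughly the inverse mean particle number in the spot
    ```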

  1. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    ERIC Educational Resources Information Center

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  2. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    PubMed

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as the assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, which were described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first one with metrological variables and the second one with "green" variables as input data. Both HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure concerning green analytical chemistry principles and analytical performance merits.
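
    At its core, the HDT builds a partial order from pairwise dominance: one procedure sits above another only if it is at least as good on every variable and strictly better on at least one. A minimal sketch with hypothetical scores (higher = better; not the paper's 11 variables).

    ```python
    # Hasse-diagram dominance test over multi-criteria scores (higher = better).
    procedures = {
        "GC-MS":    (8, 6, 7),   # hypothetical scores on three criteria
        "HPLC-FLD": (7, 6, 9),
        "LC-MS/MS": (9, 8, 9),
    }

    def dominates(a, b):
        """a dominates b: >= everywhere and > somewhere."""
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    names = list(procedures)
    for a in names:
        for b in names:
            if a != b and dominates(procedures[a], procedures[b]):
                print(f"{a} dominates {b}")

    maximal = [a for a in names
               if not any(dominates(procedures[b], procedures[a]) for b in names if b != a)]
    print("top of the diagram:", maximal)
    ```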

  3. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions

    NASA Astrophysics Data System (ADS)

    Donahue, William; Newhauser, Wayne D.; Ziegler, James F.

    2016-09-01

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u⁻¹ to 450 MeV u⁻¹ or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.
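
    The authors' 6-parameter model is not reproduced here; for a sense of what a simple analytical range model looks like, the classical Bragg-Kleeman rule R = alpha * E^p, with commonly quoted approximate fit values for protons in water.

    ```python
    # Bragg-Kleeman range-energy rule: R = alpha * E**p
    # For protons in water, alpha ~ 0.0022 cm/MeV^p and p ~ 1.77 are commonly
    # quoted fit values (approximate; not the paper's 6-parameter model).
    def proton_range_cm(energy_mev, alpha=0.0022, p=1.77):
        return alpha * energy_mev ** p

    for e in (70, 150, 230):   # typical therapeutic proton energies (MeV)
        print(f"{e} MeV -> ~{proton_range_cm(e):.1f} cm in water")
    ```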

  4. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions.

    PubMed

    Donahue, William; Newhauser, Wayne D; Ziegler, James F

    2016-09-07

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u⁻¹ to 450 MeV u⁻¹ or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.

  5. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

    This paper describes the first scalable implementation of a text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.

  6. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature contains almost no discussion of Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  7. Simultaneous determination of gabapentin, pregabalin, vigabatrin, and topiramate in plasma by HPLC with fluorescence detection.

    PubMed

    Martinc, Boštjan; Roškar, Robert; Grabnar, Iztok; Vovk, Tomaž

    2014-07-01

    Therapeutic drug monitoring (TDM) of antiepileptic drugs (AEDs) has been recognized as a useful tool in the management of epilepsy. We developed a simple analytical method for simultaneous determination of four second-generation AEDs, including gabapentin (GBP), pregabalin (PGB), vigabatrin (VGB), and topiramate (TOP). Analytes were extracted from human plasma using universal solid phase extraction, derivatized with 4-chloro-7-nitrobenzofurazan (NBD-Cl) and analyzed by HPLC with fluorescence detection. Using mass spectrometry we confirmed that NBD-Cl reacts with the sulfamate group of TOP similarly to the amine group of the other three analytes. The method is linear (r² > 0.998) across the investigated analytical ranges (0.375-30.0 μg/mL for GBP, PGB, and VGB; 0.50-20.0 μg/mL for TOP). Intraday and interday precision do not exceed 9.40%. The accuracy is from 95.6% to 106%. The recovery is higher than 80.6%, and the lower limit of quantification is at least 0.5 μg/mL. The method is selective and robust. For TOP determination the method was compared to a previously published method and the results obtained by the two methods were in good agreement. The developed method is suitable for routine TDM. Copyright © 2014 Elsevier B.V. All rights reserved.
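
    As an illustration of the linearity figures reported (r² > 0.998), a sketch that fits a calibration line over the paper's GBP range; the detector responses are invented.

    ```python
    import numpy as np

    # Linearity check for a calibration curve, as in method validation.
    conc = np.array([0.375, 0.75, 1.5, 3.0, 7.5, 15.0, 30.0])   # ug/mL (GBP range)
    resp = np.array([0.41, 0.80, 1.62, 3.18, 7.9, 15.6, 31.1])  # response (a.u., invented)

    slope, intercept = np.polyfit(conc, resp, 1)
    pred = slope * conc + intercept
    r2 = 1.0 - np.sum((resp - pred) ** 2) / np.sum((resp - resp.mean()) ** 2)
    print(f"slope={slope:.3f}, intercept={intercept:.3f}, r^2={r2:.4f}")  # want > 0.998
    ```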

  8. Making advanced analytics work for you.

    PubMed

    Barton, Dominic; Court, David

    2012-10-01

    Senior leaders who write off the move toward big data as a lot of big talk are making, well, a big mistake. So argue McKinsey's Barton and Court, who worked with dozens of companies to figure out how to translate advanced analytics into nuts-and-bolts practices that affect daily operations on the front lines. The authors offer a useful guide for leaders and managers who want to take a deliberative approach to big data, but who also want to get started now. First, companies must identify the right data for their business, seek to acquire the information creatively from diverse sources, and secure the necessary IT support. Second, they need to build analytics models that are tightly focused on improving performance, making the models only as complex as business goals demand. Third, and most important, companies must transform their capabilities and culture so that the analytical results can be implemented from the C-suite to the front lines. That means developing simple tools that everyone in the organization can understand and teaching people why the data really matter. Embracing big data is as much about changing mind-sets as it is about crunching numbers. Executed with the right care and flexibility, this cultural shift could have payoffs that are, well, bigger than you expect.

  9. Determination of transformation products of unsymmetrical dimethylhydrazine in water using vacuum-assisted headspace solid-phase microextraction.

    PubMed

    Orazbayeva, Dina; Kenessov, Bulat; Psillakis, Elefteria; Nassyrova, Dayana; Bektassov, Marat

    2018-06-22

    A new, sensitive and simple method based on vacuum-assisted headspace solid-phase microextraction (Vac-HSSPME) followed by gas chromatography-mass spectrometry (GC-MS) is proposed for the quantification of rocket fuel unsymmetrical dimethylhydrazine (UDMH) transformation products in water samples. The target transformation products were: pyrazine, 1-methyl-1H-pyrazole, N-nitrosodimethylamine, N,N-dimethylformamide, 1-methyl-1H-1,2,4-triazole, 1-methyl-imidazole and 1H-pyrazole. For these analytes and within shorter sampling times, Vac-HSSPME yielded detection limits (0.5-100 ng L⁻¹) 3-10 times lower than those reported for regular HSSPME. Vac-HSSPME sampling for 30 min at 50 °C yielded the best combination of analyte responses and their standard deviations (<15%). 1-Formyl-2,2-dimethylhydrazine and formamide were discarded because of the poor precision and accuracy when using Vac-HSSPME. The recoveries for the rest of the analytes ranged between 80 and 119%. The modified Mininert valve and Thermogreen septum could be used for automated extraction as it ensured stable analyte signals even after long waiting times (>24 h). Finally, multiple Vac-HSSPME proved to be an efficient tool for controlling the matrix effect and quantifying UDMH transformation products. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Urinary 24-h creatinine excretion in adults and its use as a simple tool for the estimation of daily urinary analyte excretion from analyte/creatinine ratios in populations.

    PubMed

    Johner, S A; Boeing, H; Thamm, M; Remer, T

    2015-12-01

    The assessment of urinary excretion of specific nutrients (e.g. iodine, sodium) is frequently used to monitor a population's nutrient status. However, when only spot urines are available, there is always a risk of hydration-status-dependent dilution effects and related misinterpretations. The aim of the present study was to establish mean values of 24-h creatinine excretion widely applicable for an appropriate estimation of 24-h excretion rates of analytes from spot urines in adults. Twenty-four-hour creatinine excretion from the formerly representative cross-sectional German VERA Study (n=1463, 20-79 years old) was analysed. Linear regression analysis was performed to identify the most important influencing factors of creatinine excretion. In a subsample of the German DONALD Study (n=176, 20-29 years old), the applicability of the 24-h creatinine excretion values of VERA for the estimation of 24-h sodium and iodine excretion from urinary concentration measurements was tested. In the VERA Study, mean 24-h creatinine excretion was 15.4 mmol per day in men and 11.1 mmol per day in women, significantly dependent on sex, age, body weight and body mass index. Based on the established 24-h creatinine excretion values, mean 24-h iodine and sodium excretions could be estimated from respective analyte/creatinine concentrations, with average deviations <10% compared with the actual 24-h means. The present mean values of 24-h creatinine excretion are suggested as a useful tool to derive realistic hydration-status-independent average 24-h excretion rates from urinary analyte/creatinine ratios. We propose to apply these creatinine reference means routinely in biomarker-based studies aiming at characterizing the nutrient or metabolite status of adult populations by simply measuring metabolite/creatinine ratios in spot urines.
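
    The estimation step itself is simple arithmetic: multiply the spot-urine analyte/creatinine concentration ratio by the reference 24-h creatinine excretion. A sketch using the study's reported means (15.4 mmol/day for men, 11.1 mmol/day for women); the spot-urine values are hypothetical.

    ```python
    # Estimate 24-h analyte excretion from a spot-urine analyte/creatinine ratio.
    # Reference 24-h creatinine means from the VERA Study: 15.4 mmol/day (men),
    # 11.1 mmol/day (women). Spot-urine concentrations below are hypothetical.
    CREATININE_24H = {"male": 15.4, "female": 11.1}   # mmol/day

    def estimate_24h_excretion(analyte_conc, creatinine_conc, sex):
        """Both concentrations from the same spot urine, in mmol/L."""
        ratio = analyte_conc / creatinine_conc    # mmol analyte per mmol creatinine
        return ratio * CREATININE_24H[sex]        # mmol/day

    # Example: spot urine with 110 mmol/L sodium and 9.5 mmol/L creatinine
    print(f"{estimate_24h_excretion(110.0, 9.5, 'male'):.0f} mmol Na/day (estimated)")
    ```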

  11. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  12. A simple method for estimating frequency response corrections for eddy covariance systems

    Treesearch

    W. J. Massman

    2000-01-01

    A simple analytical formula is developed for estimating the frequency attenuation of eddy covariance fluxes due to sensor response, path-length averaging, sensor separation, signal processing, and flux averaging periods. Although it is an approximation based on flat terrain cospectra, this analytical formula should have broader applicability than just flat-terrain...
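
    Massman's actual formula is not reproduced here; the underlying computation is an attenuation integral of a sensor transfer function against a reference cospectrum. A generic numerical sketch, assuming a first-order sensor response and a simple model cospectral shape:

    ```python
    import numpy as np

    # Flux attenuation = integral of (cospectrum * transfer function) divided
    # by the unattenuated integral. Cospectral shape and time constant assumed.
    def integral(y, x):
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

    f = np.logspace(-4, 2, 4000)      # frequency grid (Hz)
    fx = 0.08                         # cospectral peak scale (Hz), assumed
    co = (2.0 / (np.pi * fx)) / (1.0 + (f / fx) ** 2)   # normalized model cospectrum

    tau = 0.3                         # effective first-order time constant (s)
    H = 1.0 / (1.0 + (2.0 * np.pi * f * tau) ** 2)      # squared first-order gain

    ratio = integral(co * H, f) / integral(co, f)
    print(f"measured/true flux ratio ~ {ratio:.3f}; correction ~ {1.0 / ratio:.3f}")
    ```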

  13. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TE) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  14. Understanding Business Analytics

    DTIC Science & Technology

    2015-01-05

    Analytics have been used in organizations for a variety of reasons for quite some time, ranging from the simple (generating and understanding business analytics...) ... How well these two components are orchestrated will determine the level of success an organization has.

  15. 3D inelastic analysis methods for hot section components

    NASA Technical Reports Server (NTRS)

    Dame, L. T.; Chen, P. C.; Hartle, M. S.; Huang, H. T.

    1985-01-01

    The objective is to develop analytical tools capable of economically evaluating the cyclic time dependent plasticity which occurs in hot section engine components in areas of strain concentration resulting from the combination of both mechanical and thermal stresses. Three models were developed. A simple model performs time dependent inelastic analysis using the power law creep equation. The second model is the classical model of Professors Walter Haisler and David Allen of Texas A and M University. The third model is the unified model of Bodner, Partom, et al. All models were customized for linear variation of loads and temperatures with all material properties and constitutive models being temperature dependent.
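
    As a sketch of the first model's ingredient, the power-law (Norton) creep equation integrated at constant stress and temperature; the constants below are illustrative, not calibrated material data.

    ```python
    import math

    # Power-law (Norton) creep: strain_rate = A * sigma^n * exp(-Q / (R * T)).
    A, n, Q = 5.0e-6, 5.0, 300e3   # pre-factor, stress exponent, activation (J/mol)
    R = 8.314                      # gas constant (J/mol/K)

    def creep_rate(sigma_mpa, temp_k):
        return A * sigma_mpa ** n * math.exp(-Q / (R * temp_k))

    sigma, T, hours = 200.0, 1100.0, 1000.0   # constant stress (MPa), temp (K), time
    strain = creep_rate(sigma, T) * hours * 3600.0
    print(f"rate = {creep_rate(sigma, T):.3e} 1/s, strain after {hours:.0f} h = {strain:.4f}")
    ```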

  16. Photoacoustic spectroscopy of condensed matter

    NASA Technical Reports Server (NTRS)

    Somoano, R. B.

    1978-01-01

    Photoacoustic spectroscopy is a new analytical tool that provides a simple nondestructive technique for obtaining information about the electronic absorption spectrum of samples such as powders, semisolids, gels, and liquids. It can also be applied to samples which cannot be examined by conventional optical methods. Numerous applications of this technique in the field of inorganic and organic semiconductors, biology, and catalysis have been described. Among the advantages of photoacoustic spectroscopy, the signal is almost insensitive to light scattering by the sample and information can be obtained about nonradiative deactivation processes. Signal saturation, which can modify the intensity of individual absorption bands in special cases, is a drawback of the method.

  17. Raman spectroscopic investigation of thorium dioxide-uranium dioxide (ThO₂-UO₂) fuel materials.

    PubMed

    Rao, Rekha; Bhagat, R K; Salke, Nilesh P; Kumar, Arun

    2014-01-01

    Raman spectroscopic investigations were carried out on proposed nuclear fuel thorium dioxide-uranium dioxide (ThO₂-UO₂) solid solutions and simulated fuels based on ThO₂-UO₂. Raman spectra of ThO₂-UO₂ solid solutions exhibited two-mode behavior in the entire composition range. Variations in mode frequencies and relative intensities of Raman modes enabled estimation of composition, defects, and oxygen stoichiometry in these compounds that are essential for their application. The present study shows that Raman spectroscopy is a simple, promising analytical tool for nondestructive characterization of this important class of nuclear fuel materials.

  18. Design/analysis of the JWST ISIM bonded joints for survivability at cryogenic temperatures

    NASA Astrophysics Data System (ADS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    2005-08-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite adhesively bonded joints at the cryogenic temperature of 30 K (-405 °F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings and composite gusset/clip joints all bonded to hybrid composite tubes (75 mm square) made with M55J/954-6 and T300/954-6 prepregs. Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty in testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on developed techniques and properties.

  19. Next-generation confirmatory disease diagnostics

    NASA Astrophysics Data System (ADS)

    Lin, Robert; Gerver, Rachel; Karns, Kelly; Apori, Akwasi A.; Denisin, Aleksandra K.; Herr, Amy E.

    2014-06-01

    Microfluidic tools are advancing capabilities in screening diagnostics for use in near-patient settings. Here, we review three case studies to illustrate the flexibility and analytical power offered by microanalytical tools. We first overview a near-patient tool for detection of protein markers found in cerebrospinal fluid (CSF), as a means to identify the presence of cerebrospinal fluid in nasal mucus, an indication that CSF is leaking into the nasal cavity. Microfluidic design allowed integration of several up-stream preparatory steps and rapid, specific completion of the human CSF protein assay. Second, we overview a tear-fluid-based assay for lactoferrin (Lf), a protein produced in the lacrimal gland and then secreted into tear fluid. Tear Lf is a putative biomarker for primary Sjögren's syndrome (SS). A critical contribution of this and related work is the measurement of Lf even in light of well-known and significant matrix interactions and losses during tear fluid collection and preparation. Lastly, we review a microfluidic barcode platform that enables rapid measurement of multiple infectious disease biomarkers in human sera. The assay presents a new approach to multiplexed biomarker detection, yet in a simple straight microchannel, thus providing a streamlined, simplified microanalytical platform, as is relevant to robust operation in diagnostic settings. We view microfluidic design and analytical chemistry as the basis for emerging, sophisticated assays that will advance not just screening diagnostic technology, but confirmatory assays, sample preparation and handling, and thus introduction and utilization of new biomarkers and assay formats.

  20. A highly specific competitive direct enzyme immunoassay for sterigmatocystin as a tool for rapid immunochemotaxonomic differentiation of mycotoxigenic Aspergillus species.

    PubMed

    Wegner, S; Bauer, J I; Dietrich, R; Märtlbauer, E; Usleber, E; Gottschalk, C; Gross, M

    2017-02-01

    A simplified method to produce specific polyclonal rabbit antibodies against sterigmatocystin (STC) was established, using a STC-glycolic acid-ether derivative (STC-GE) conjugated to keyhole limpet haemocyanin (immunogen). The competitive direct enzyme immunoassay (EIA) established for STC had a detection limit (20% binding inhibition) of 130 pg ml⁻¹. The test was highly specific for STC, with minor cross-reactivity with O-methylsterigmatocystin (OMSTC, 0.87%) and negligible reactivity with aflatoxins (<0.02%). STC-EIA was used in combination with a previously developed specific EIA for aflatoxins (<0.1% cross-reactivity with STC and OMSTC), to study the STC/aflatoxin production profiles of reference strains of Aspergillus species. This immunochemotaxonomic procedure was found to be a convenient tool to identify STC- or aflatoxin-producing strains. The carcinogenic mycotoxin sterigmatocystin (STC) is produced by several Aspergillus species, either alone or together with aflatoxins. Here, we report a very simple and straightforward procedure to obtain highly sensitive and specific anti-STC antibodies, and their use in the first ever real STC-specific competitive direct enzyme immunoassay (EIA). In combination with a previous EIA for aflatoxins, this study for the first time demonstrates the potential of a STC/aflatoxin EIA pair for what is branded as 'immunochemotaxonomic' identification of mycotoxigenic Aspergillus species. This new analytical tool enhances analytical possibilities for differential analysis of STC and aflatoxins. © 2016 The Society for Applied Microbiology.

  1. Strehl ratio: a tool for optimizing optical nulls and singularities.

    PubMed

    Hénault, François

    2015-07-01

    In this paper, a set of radial and azimuthal phase functions with a null Strehl ratio is reviewed; a null Strehl ratio is equivalent to generating a central extinction in the image plane of an optical system. The study is conducted in the framework of Fraunhofer scalar diffraction, and is oriented toward practical cases where optical nulls or singularities are produced by deformable mirrors or phase plates. The identified solutions reveal unexpected links with the zeros of type-J Bessel functions of integer order. They include linear azimuthal phase ramps giving birth to an optical vortex, azimuthally modulated phase functions, and circular phase gratings (CPGs). It is found in particular that the CPG radiometric efficiency could be significantly improved by the null Strehl ratio condition. Simple design rules for rescaling and combining the different phase functions are also defined. Finally, the described analytical solutions could also serve as starting points for an automated searching software tool.
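
    For a uniformly illuminated pupil with a phase-only aberration, the Strehl ratio reduces to S = |<exp(i*phi)>|^2 averaged over the pupil, so an integer azimuthal phase ramp averages to zero. A numerical check (a sketch, not the paper's derivations):

    ```python
    import numpy as np

    # Strehl ratio for a phase-only pupil: S = |mean over pupil of exp(i*phi)|^2.
    n = 1024
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    r, theta = np.hypot(x, y), np.arctan2(y, x)
    pupil = r <= 1.0

    def strehl(phi):
        return abs(np.exp(1j * phi[pupil]).mean()) ** 2

    print("flat phase:   ", strehl(np.zeros_like(r)))   # -> 1.0
    print("vortex m = 1: ", strehl(theta))              # -> ~0 (optical vortex)
    print("vortex m = 3: ", strehl(3 * theta))          # -> ~0
    ```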

  2. Preliminary design methods for fiber reinforced composite structures employing a personal computer

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1986-01-01

    The objective of this project was to develop a user-friendly interactive computer program to be used as an analytical tool by structural designers. Its intent was to do preliminary, approximate stress analysis to help select or verify sizing choices for composite structural members. The approach to the project was to provide a subroutine which uses classical lamination theory to predict an effective elastic modulus for a laminate of arbitrary material and ply orientation. This effective elastic modulus can then be used in a family of other subroutines which employ the familiar basic structural analysis methods for isotropic materials. This method is simple and convenient to use but only approximate, as is appropriate for a preliminary design tool which will be subsequently verified by more sophisticated analysis. Additional subroutines have been provided to calculate laminate coefficient of thermal expansion and to calculate ply-by-ply strains within a laminate.
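
    A minimal sketch of the classical-lamination-theory step such a subroutine performs: build the ply stiffness matrix, rotate it per ply, sum the in-plane A matrix, and extract an effective modulus. The material constants and layup are illustrative, and a symmetric laminate is assumed.

    ```python
    import numpy as np

    def q_matrix(e1, e2, g12, nu12):
        """Reduced stiffness Q of a unidirectional ply (plane stress), GPa."""
        nu21 = nu12 * e2 / e1
        d = 1.0 - nu12 * nu21
        return np.array([[e1 / d, nu12 * e2 / d, 0.0],
                         [nu12 * e2 / d, e2 / d, 0.0],
                         [0.0, 0.0, g12]])

    def qbar(q, theta_deg):
        """Rotate Q to laminate axes: Qbar = T^-1 Q T^-T (T = stress transform)."""
        c, s = np.cos(np.radians(theta_deg)), np.sin(np.radians(theta_deg))
        T = np.array([[c * c, s * s, 2 * c * s],
                      [s * s, c * c, -2 * c * s],
                      [-c * s, c * s, c * c - s * s]])
        Tinv = np.linalg.inv(T)
        return Tinv @ q @ Tinv.T

    # Illustrative carbon/epoxy constants (GPa), symmetric quasi-isotropic layup
    q = q_matrix(e1=170.0, e2=9.0, g12=4.8, nu12=0.3)
    plies, t_ply = [0, 45, -45, 90, 90, -45, 45, 0], 0.125e-3   # angles (deg), m

    A = sum(qbar(q, ang) for ang in plies) * t_ply * 1e9   # in-plane stiffness (N/m)
    h = t_ply * len(plies)
    Ex_eff = 1.0 / (h * np.linalg.inv(A)[0, 0])            # effective modulus (Pa)
    print(f"effective Ex ~ {Ex_eff / 1e9:.1f} GPa")
    ```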

  3. A digital future for the history of psychology?

    PubMed

    Green, Christopher D

    2016-08-01

    This article discusses the role that digital approaches to the history of psychology are likely to play in the near future. A tentative hierarchy of digital methods is proposed. A few examples are briefly described: a digital repository, a simple visualization using a ready-made online database and tools, and more complex visualizations requiring the assembly of the database and, possibly, the analytic tools by the researcher. The relationship of digital history to the old "New Economic History" (Cliometrics) is considered. The question of whether digital history and traditional history need be at odds or, instead, might complement each other is woven throughout. The rapidly expanding territory of digital humanistic research outside of psychology is briefly discussed. Finally, the challenging current employment trends in history and the humanities more broadly are considered, along with the role that digital skills might play in mitigating those factors for prospective academic workers. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  4. Simulation of green roof runoff under different substrate depths and vegetation covers by coupling a simple conceptual and a physically based hydrological model.

    PubMed

    Soulis, Konstantinos X; Valiantzas, John D; Ntoulas, Nikolaos; Kargas, George; Nektarios, Panayiotis A

    2017-09-15

    In spite of the well-known green roof benefits, their widespread adoption in the management practices of urban drainage systems requires the use of adequate analytical and modelling tools. In the current study, green roof runoff modeling was accomplished by developing, testing, and jointly using a simple conceptual model and a physically based numerical simulation model utilizing HYDRUS-1D software. The use of such an approach combines the advantages of the conceptual model, namely simplicity, low computational requirements, and ability to be easily integrated in decision support tools, with the capacity of the physically based simulation model to be easily transferred to conditions and locations other than those used for calibrating and validating it. The proposed approach was evaluated with an experimental dataset that included various green roof covers (either succulent plants - Sedum sediforme, or xerophytic plants - Origanum onites, or bare substrate without any vegetation) and two substrate depths (either 8 cm or 16 cm). Both the physically based and the conceptual models matched the observed hydrographs very closely. In general, the conceptual model performed better than the physically based simulation model, but the overall performance of both models was sufficient in most cases, as revealed by the Nash-Sutcliffe Efficiency index, which was generally greater than 0.70. Finally, it was showcased how a physically based and a simple conceptual model can be jointly used, allowing the simple conceptual model to be applied to a wider set of conditions than the available experimental data and to support green roof design. Copyright © 2017 Elsevier Ltd. All rights reserved.
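
    Model performance in the study was judged with the Nash-Sutcliffe Efficiency, which is straightforward to compute; a sketch with invented hydrograph values:

    ```python
    import numpy as np

    def nse(observed, simulated):
        """Nash-Sutcliffe Efficiency: 1 is perfect, 0 matches the mean of obs."""
        observed, simulated = np.asarray(observed), np.asarray(simulated)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
            (observed - observed.mean()) ** 2)

    obs = [0.0, 0.4, 1.8, 3.1, 2.2, 1.0, 0.3]   # observed runoff (hypothetical, L/s)
    sim = [0.1, 0.5, 1.6, 2.9, 2.4, 0.9, 0.4]   # simulated runoff
    print(f"NSE = {nse(obs, sim):.3f}")          # > 0.70 would meet the paper's benchmark
    ```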

  5. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...
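
    The superposition idea behind analytic element models can be illustrated in a few lines: closed-form well solutions added to a uniform background flow. This toy sketch (Thiem wells, homogeneous transmissivity) is not the WhAEM implementation.

    ```python
    import numpy as np

    # Superposition of analytic elements: uniform flow plus Thiem wells.
    # Discharge potential Phi(x, y) = -qx0*x + sum_i Qi/(2*pi) * ln(r_i / R)
    T = 100.0     # transmissivity (m^2/day), assumed homogeneous
    qx0 = 0.5     # uniform-flow discharge per unit width (m^2/day)
    R = 1000.0    # reference distance (m)
    wells = [((0.0, 0.0), 500.0), ((300.0, 100.0), 250.0)]   # (location, m^3/day)

    def head(x, y):
        phi = -qx0 * x
        for (xw, yw), q in wells:
            r = np.hypot(x - xw, y - yw)
            phi += q / (2.0 * np.pi) * np.log(r / R)   # negative near a pumping well
        return phi / T   # head relative to the reference (m)

    print(f"head at (100, 50): {head(100.0, 50.0):.2f} m (relative)")
    ```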

  6. Advances in aptamer screening and small molecule aptasensors.

    PubMed

    Kim, Yeon Seok; Gu, Man Bock

    2014-01-01

    It has been 20 years since aptamers and SELEX (systematic evolution of ligands by exponential enrichment) were first described independently by Andrew Ellington and Larry Gold. Based on the great advantages of aptamers, numerous aptamers have been isolated for various targets and actively applied as therapeutic and analytical tools. Over 2,000 papers related to aptamers or SELEX have been published, attesting to the wide usefulness and applicability of aptamers. SELEX methods have been modified or re-created over the years to enable aptamer isolation with higher affinity and selectivity in more labor- and time-efficient manners, including automation. Initially, most studies of aptamers focused on protein targets, which have physiological functions in the body, and on their applications as therapeutic agents or receptors for diagnostics. However, aptamers for small molecules such as organic or inorganic compounds, drugs, antibiotics, or metabolites have not been studied sufficiently, despite the ever-increasing need for rapid and simple analytical methods for various chemical targets in the fields of medical diagnostics, environmental monitoring, food safety, and national defense against targets including chemical warfare. This review focuses not only on recent advances in aptamer screening methods but also on their analytical applications for small molecules.

  7. Development and application of accurate analytical models for single active electron potentials

    NASA Astrophysics Data System (ADS)

    Miller, Michelle; Jaron-Becker, Agnieszka; Becker, Andreas

    2015-05-01

    The single active electron (SAE) approximation is a theoretical model frequently employed to study scenarios in which inner-shell electrons may productively be treated as frozen spectators to a physical process of interest, and accurate analytical approximations for these potentials are sought as a useful simulation tool. Density functional theory is often used to construct a SAE potential, requiring that a further approximation for the exchange-correlation functional be enacted. In this study, we employ the Krieger, Li, and Iafrate (KLI) modification to the optimized-effective-potential (OEP) method to reduce the complexity of the problem to the straightforward solution of a system of linear equations through simple arguments regarding the behavior of the exchange-correlation potential in regions where a single orbital dominates. We employ this method for the solution of atomic and molecular potentials, and use the resultant curve to devise a systematic construction for highly accurate and useful analytical approximations for several systems. Supported by the U.S. Department of Energy (Grant No. DE-FG02-09ER16103), and the U.S. National Science Foundation (Graduate Research Fellowship, Grants No. PHY-1125844 and No. PHY-1068706).

  8. Development of a new semi-analytical model for cross-borehole flow experiments in fractured media

    USGS Publications Warehouse

    Roubinet, Delphine; Irving, James; Day-Lewis, Frederick D.

    2015-01-01

    Analysis of borehole flow logs is a valuable technique for identifying the presence of fractures in the subsurface and estimating properties such as fracture connectivity, transmissivity and storativity. However, such estimation requires the development of analytical and/or numerical modeling tools that are well adapted to the complexity of the problem. In this paper, we present a new semi-analytical formulation for cross-borehole flow in fractured media that links transient vertical-flow velocities measured in one or a series of observation wells during hydraulic forcing to the transmissivity and storativity of the fractures intersected by these wells. In comparison with existing models, our approach presents major improvements in terms of computational expense and potential adaptation to a variety of fracture and experimental configurations. After derivation of the formulation, we demonstrate its application in the context of sensitivity analysis for a relatively simple two-fracture synthetic problem, as well as for field-data analysis to investigate fracture connectivity and estimate fracture hydraulic properties. These applications provide important insights regarding (i) the strong sensitivity of fracture property estimates to the overall connectivity of the system; and (ii) the non-uniqueness of the corresponding inverse problem for realistic fracture configurations.

  9. Method development and qualification of capillary zone electrophoresis for investigation of therapeutic monoclonal antibody quality.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2016-10-01

    Capillary electrophoresis techniques are widely used in analytical biotechnology. Different electrophoretic techniques are very adequate tools to monitor size- and charge heterogeneities of protein drugs. Method descriptions and development studies of capillary zone electrophoresis (CZE) have been described in the literature. Most of them are performed based on the classical one-factor-at-a-time (OFAT) approach. In this study a very simple method development approach is described for capillary zone electrophoresis: a "two-phase-four-step" approach is introduced which allows a rapid, iterative method development process and can serve as a good platform for CZE methods. In every step the current analytical target profile and an appropriate control strategy were established to monitor the current stage of development. A very good platform was established to investigate intact and digested protein samples. A commercially available monoclonal antibody was chosen as the model protein for the method development study. The CZE method was qualified after the development process and the results are presented. The analytical system stability was represented by the calculated RSD% values of area percentage and migration time of the selected peaks (<0.8% and <5%) during the intermediate precision investigation. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. A multi-purpose tool for food inspection: Simultaneous determination of various classes of preservatives and biogenic amines in meat and fish products by LC-MS.

    PubMed

    Molognoni, Luciano; Daguer, Heitor; de Sá Ploêncio, Leandro Antunes; De Dea Lindner, Juliano

    2018-02-01

    This paper describes an innovative, fast and multipurpose method for the chemical inspection of meat and fish products by liquid chromatography-tandem mass spectrometry. Solid-liquid extraction and low-temperature partitioning were applied to 17 analytes, which included large bacteriocins (3.5 kDa) and small molecules (organic acids, heterocyclic compounds, polyene macrolides, alkyl esters of p-hydroxybenzoic acid, and aromatic and aliphatic biogenic amines and polyamines). Chromatographic separation was achieved in 10 min, using a stationary phase of di-isopropyl-3-aminopropyl silane bound to hydroxylated silica. Method validation was in accordance with Commission Decision 2002/657/EC. Linear ranges were 1.25-10.0 mg kg⁻¹ (natamycin and parabens), 2.50-10.0 mg kg⁻¹ (sorbate and nisin), 25.0-200 mg kg⁻¹ (biogenic amines, hexamethylenetetramine, benzoic and lactic acids), and 50.0-400 mg kg⁻¹ (citric acid). Expanded measurement uncertainty (U) was estimated by single-laboratory validation combined with modeling in two calculation approaches: internal (U = 5%) and external standardization (U = 24%). Method applicability was checked on 89 real samples among raw, cooked, dry fermented and cured products, yielding acceptable recoveries. Many regulatory issues were revealed, corroborating the need for enhancement of the current analytical methods. This simple method dispenses with additional extraction procedures and therefore reduces costs over time. It is suitable for routine analysis as a screening or confirmatory tool for both qualitative and quantitative results, replacing many time-consuming analytical procedures. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. HPLC-ESI-QTOF-MS as a powerful analytical tool for characterising phenolic compounds in olive-leaf extracts.

    PubMed

    Quirantes-Piné, Rosa; Lozano-Sánchez, Jesús; Herrero, Miguel; Ibáñez, Elena; Segura-Carretero, Antonio; Fernández-Gutiérrez, Alberto

    2013-01-01

    Olea europaea L. leaves may be considered a cheap, easily available natural source of phenolic compounds. In a previous study we evaluated the possibility of obtaining bioactive phenolic compounds from olive leaves by pressurised liquid extraction (PLE) for their use as natural anti-oxidants. The alimentary use of these kinds of extract makes comprehensive knowledge of their composition essential. To undertake a comprehensive characterisation of two olive-leaf extracts obtained by PLE using high-performance liquid chromatography coupled to electrospray ionisation and quadrupole time-of-flight mass spectrometry (HPLC-ESI-QTOF-MS). Olive leaves were extracted by PLE using ethanol and water as extraction solvents at 150°C and 200°C respectively. Separation was carried out in a HPLC system equipped with a C₁₈-column working in a gradient elution programme coupled to ESI-QTOF-MS operating in negative ion mode. This analytical platform was able to detect 48 compounds and tentatively identify 31 different phenolic compounds in these extracts, including secoiridoids, simple phenols, flavonoids, cinnamic-acid derivatives and benzoic acids. Lucidumoside C was also identified for the first time in olive leaves. The coupling of HPLC-ESI-QTOF-MS led to the in-depth characterisation of the olive-leaf extracts on the basis of mass accuracy, true isotopic pattern and tandem mass spectrometry (MS/MS) spectra. We may conclude therefore that this analytical tool is very valuable in the study of phenolic compounds in plant matrices. Copyright © 2012 John Wiley & Sons, Ltd.

  12. Concept design theory and model for multi-use space facilities: Analysis of key system design parameters through variance of mission requirements

    NASA Astrophysics Data System (ADS)

    Reynerson, Charles Martin

    This research has been performed to create concept design and economic feasibility data for space business parks. A space business park is a commercially run multi-use space station facility designed for use by a wide variety of customers. Both space hardware and crew are considered revenue-producing payloads. Examples of commercial markets may include biological and materials research, processing, and production, space tourism habitats, and satellite maintenance and resupply depots. This research develops a design methodology and an analytical tool to create feasible preliminary design information for space business parks. The design tool is validated against a number of real facility designs. Appropriate model variables are adjusted to ensure that statistical approximations are valid for subsequent analyses. The tool is used to analyze the effect of various payload requirements on the size, weight and power of the facility. The approach for the analytical tool was to input potential payloads as simple requirements, such as volume, weight, power, crew size, and endurance. In creating the theory, basic principles are used and combined with parametric estimation of data when necessary. Key system parameters are identified for overall system design. Typical ranges for these key parameters are identified based on real human spaceflight systems. To connect the economics to design, a life-cycle cost model is created based upon facility mass. This rough cost model estimates the potential return on investment, the initial investment requirements, and the number of years to recoup the initial investment. Example cases are analyzed for both performance- and cost-driven requirements for space hotels, microgravity processing facilities, and multi-use facilities. In combining both engineering and economic models, a design-to-cost methodology is created for more accurately estimating the commercial viability of multiple space business park markets.
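
    A mass-based life-cycle cost and payback calculation of the kind described can be sketched as follows; the power-law cost estimating relationship and every number are hypothetical placeholders, not the dissertation's calibrated model.

    ```python
    # Mass-based parametric cost and payback sketch (all values hypothetical).
    def facility_cost_musd(mass_kg, coeff=0.15, exponent=0.85):
        """Notional cost estimating relationship: cost ($M) = coeff * mass^exponent."""
        return coeff * mass_kg ** exponent

    def payback_years(mass_kg, annual_revenue_musd, annual_ops_musd):
        net = annual_revenue_musd - annual_ops_musd
        return facility_cost_musd(mass_kg) / net if net > 0 else float("inf")

    mass = 80000.0   # facility mass (kg), hypothetical
    print(f"initial investment ~ ${facility_cost_musd(mass):.0f}M")
    print(f"payback ~ {payback_years(mass, 320.0, 140.0):.1f} years")
    ```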

  13. Simple quasi-analytical holonomic homogenization model for the non-linear analysis of in-plane loaded masonry panels: Part 1, meso-scale

    NASA Astrophysics Data System (ADS)

    Milani, G.; Bertolesi, E.

    2017-07-01

    A simple quasi-analytical holonomic homogenization approach for the non-linear analysis of in-plane loaded masonry walls is presented. The elementary cell (REV) is discretized with 24 triangular elastic constant-stress elements (bricks) and non-linear interfaces (mortar). A holonomic behavior with softening is assumed for the mortar. It is shown how the mechanical problem in the unit cell is characterized by very few displacement variables and how the homogenized stress-strain behavior can be evaluated semi-analytically.

  14. The transfer of analytical procedures.

    PubMed

    Ermer, J; Limberger, M; Lis, K; Wätzig, H

    2013-11-01

    Analytical method transfers are certainly among the most discussed topics in the GMP regulated sector. However, they are surprisingly little regulated in detail. General information is provided by USP, WHO, and ISPE in particular. Most recently, the EU emphasized the importance of analytical transfer by including it in their draft of the revised GMP Guideline. In this article, an overview and comparison of these guidelines is provided. The key to success for method transfers is excellent communication between the sending and receiving units. In order to facilitate this communication, procedures, flow charts and checklists for responsibilities, success factors, transfer categories, the transfer plan and report, strategies in case of failed transfers, and tables with acceptance limits are provided here, together with a comprehensive glossary. Potential pitfalls are described such that they can be avoided. In order to assure an efficient and sustainable transfer of analytical procedures, a practically relevant and scientifically sound evaluation with corresponding acceptance criteria is crucial. Various strategies and statistical tools such as significance tests, absolute acceptance criteria, and equivalence tests are thoroughly described and compared in detail, with examples. Significance tests should be avoided. The success criterion is not statistical significance, but rather analytical relevance. Depending on a risk assessment of the analytical procedure in question, statistical equivalence tests are recommended, because they include both a practically relevant acceptance limit and a direct control of the statistical risks. However, for lower risk procedures, a simple comparison of the transfer performance parameters to absolute limits is also regarded as sufficient. Copyright © 2013 Elsevier B.V. All rights reserved.
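
    The recommended equivalence test can be run as a 90% confidence interval of the mean difference checked against the acceptance limits, which is equivalent to two one-sided tests; a sketch with hypothetical transfer data and an assumed limit:

    ```python
    import numpy as np
    from scipy import stats

    # Equivalence test for a method transfer: the 90% CI of the mean difference
    # must lie within +/- delta. Data and the acceptance limit are hypothetical.
    sending   = np.array([99.8, 100.2, 99.5, 100.4, 99.9, 100.1])   # % label claim
    receiving = np.array([99.1, 100.0, 99.4, 99.8, 99.6, 99.9])
    delta = 2.0                                                     # acceptance limit (%)

    d = receiving.mean() - sending.mean()
    se = np.sqrt(sending.var(ddof=1) / len(sending)
                 + receiving.var(ddof=1) / len(receiving))
    df = len(sending) + len(receiving) - 2          # simple df approximation
    ci = d + np.array([-1, 1]) * stats.t.ppf(0.95, df) * se   # 90% two-sided CI

    print(f"mean difference = {d:.2f}%, 90% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
    print("equivalent" if -delta < ci[0] and ci[1] < delta else "not equivalent")
    ```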

  15. WetDATA Hub: Democratizing Access to Water Data to Accelerate Innovation through Data Visualization, Predictive Analytics and Artificial Intelligence Applications

    NASA Astrophysics Data System (ADS)

    Sarni, W.

    2017-12-01

    Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3) global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. ROADMAP: * A portal (www.wetdata.org) to provide stakeholders with tools and resources to understand water risks. * Initial activities providing education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan. * Leveraging the Western States Water Council Water Data Exchange database. * Development of visualization, predictive analytics, and AI tools to engage stakeholders and provide actionable data and information. TOOLS: Education: information on water issues and risks at the local, state, national, and global scales. Visualizations: data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive Analytics: access to publicly available water databases, machine learning to develop water availability forecasting tools, and time-lapse imagery to support city and urban planning.

  16. Using Laboratory Homework to Facilitate Skill Integration and Assess Understanding in Intermediate Physics Courses

    NASA Astrophysics Data System (ADS)

    Johnston, Marty; Jalkio, Jeffrey

    2013-04-01

    By the time students reach the intermediate-level physics courses, they have been exposed to a broad set of analytical, experimental, and computational skills. However, their ability to independently integrate these skills into the study of a physical system is often weak. To address this weakness, and to assess their understanding of the underlying physical concepts, we have introduced laboratory homework into lecture-based, junior-level theoretical mechanics and electromagnetics courses. A laboratory homework set replaces a traditional one and emphasizes the analysis of a single system. In an exercise, students use analytical and computational tools to predict the behavior of a system and design a simple measurement to test their model. The laboratory portion of the exercises is straightforward, and the emphasis is on concept integration and application. The short student reports we collect have revealed misconceptions that were not apparent in reviewing traditional homework and test problems. Work continues on refining the current problems and expanding the problem sets.

  17. μ-PADs for detection of chemical warfare agents.

    PubMed

    Pardasani, Deepak; Tak, Vijay; Purohit, Ajay K; Dubey, D K

    2012-12-07

    Conventional methods for the detection of chemical warfare agents (CWAs) based on chromogenic reactions are time- and solvent-intensive. The development of cost-, time-, and solvent-effective microfluidic paper-based analytical devices (μ-PADs) for the detection of nerve and vesicant agents is described. The detection of analytes was based upon their reactions with rhodamine hydroxamate and para-nitrobenzyl pyridine, producing red and blue colours, respectively. Reactions were optimized on the μ-PADs to produce limits of detection (LODs) as low as 100 μM for sulfur mustard in aqueous samples. Results were quantified with the help of a simple desktop scanner and Photoshop software. Sarin gave a linear response in the two concentration ranges of 20-100 mM and 100-500 mM, whereas the response of sulfur mustard was linear in the concentration range of 10-75 mM. Results were precise enough to establish the μ-PADs as a valuable tool for security personnel fighting chemical terrorism.
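
    The scanner-based readout amounts to extracting a colour intensity from each detection spot and interpolating against a calibration curve. A minimal sketch of that kind of workflow, in which flat synthetic patches stand in for cropped regions of the scan and all calibration numbers are invented:

        # Minimal sketch of scanner-based colorimetric quantification. Flat
        # synthetic patches stand in for cropped regions of the scan; all
        # calibration numbers are invented.
        import numpy as np

        def spot_intensity(rgb):
            """Mean inverted green-channel value (red spots absorb green light)."""
            return 255.0 - rgb[:, :, 1].mean()

        def patch(green_level):
            """Uniform stand-in for a cropped spot image."""
            return np.full((20, 20, 3), [180.0, green_level, 160.0], dtype=float)

        cal_conc = np.array([10.0, 25.0, 50.0, 75.0])   # mM
        cal_int = np.array([spot_intensity(patch(g)) for g in (237, 215, 175, 136)])
        slope, intercept = np.polyfit(cal_conc, cal_int, 1)

        unknown = spot_intensity(patch(190))
        print(f"estimated concentration ≈ {(unknown - intercept) / slope:.0f} mM")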

  18. Hawkeye and AMOS: visualizing and assessing the quality of genome assemblies

    PubMed Central

    Schatz, Michael C.; Phillippy, Adam M.; Sommer, Daniel D.; Delcher, Arthur L.; Puiu, Daniela; Narzisi, Giuseppe; Salzberg, Steven L.; Pop, Mihai

    2013-01-01

    Since its launch in 2004, the open-source AMOS project has released several innovative DNA sequence analysis applications including: Hawkeye, a visual analytics tool for inspecting the structure of genome assemblies; the Assembly Forensics and FRCurve pipelines for systematically evaluating the quality of a genome assembly; and AMOScmp, the first comparative genome assembler. These applications have been used to assemble and analyze dozens of genomes ranging in complexity from simple microbial species through mammalian genomes. Recent efforts have been focused on enhancing support for new data characteristics brought on by second- and now third-generation sequencing. This review describes the major components of AMOS in light of these challenges, with an emphasis on methods for assessing assembly quality and the visual analytics capabilities of Hawkeye. These interactive graphical aspects are essential for navigating and understanding the complexities of a genome assembly, from the overall genome structure down to individual bases. Hawkeye and AMOS are available open source at http://amos.sourceforge.net. PMID:22199379

  19. A novel multiple headspace extraction gas chromatographic method for measuring the diffusion coefficient of methanol in water and in olive oil.

    PubMed

    Zhang, Chun-Yun; Chai, Xin-Sheng

    2015-03-13

    A novel method for the determination of the diffusion coefficient (D) of methanol in water and in olive oil has been developed. Based on multiple headspace extraction gas chromatography (MHE-GC), the methanol released from the liquid sample of interest in a closed sample vial was determined in a stepwise fashion. A theoretical model was derived to establish the relationship between the diffusion coefficient and the GC signals from MHE-GC measurements. The results showed that the present method has excellent precision (RSD < 1%) in the linear fitting procedure and good accuracy for the diffusion coefficients of methanol in both water and olive oil, when compared with data reported in the literature. The present method is simple and practical and can be a valuable tool for determining the diffusion coefficients of volatile analytes migrating from food and beverage packaging materials into food simulants, both in research studies and in actual applications. Copyright © 2015 Elsevier B.V. All rights reserved.
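
    In MHE-GC generally, successive extractions deplete the analyte so that peak areas decay roughly exponentially with extraction number, and the linear fit the authors report is performed on ln(area) versus step. The paper's specific relation between the fitted slope and D is not reproduced here; a sketch of the generic fitting step, with invented peak areas:

        # Generic MHE-GC fitting step: successive extractions deplete the
        # analyte, peak areas decay roughly exponentially with extraction
        # number, and ln(area) vs step is fitted by a line. The paper's own
        # relation between the fitted slope and D is not reproduced here;
        # the peak areas below are invented.
        import numpy as np

        steps = np.arange(1, 7)                                         # extraction number i
        areas = np.array([1520.0, 1180.0, 905.0, 702.0, 545.0, 420.0])  # GC peak areas

        slope, intercept = np.polyfit(steps, np.log(areas), 1)
        r = np.corrcoef(steps, np.log(areas))[0, 1]
        print(f"decay constant q = {-slope:.4f} per step, R^2 = {r ** 2:.5f}")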

  20. Partial least-squares with residual bilinearization for the spectrofluorimetric determination of pesticides. A solution of the problems of inner-filter effects and matrix interferents.

    PubMed

    Piccirilli, Gisela N; Escandar, Graciela M

    2006-09-01

    This paper demonstrates for the first time the power of a chemometric second-order algorithm for predicting, in a simple way and using spectrofluorimetric data, the concentration of analytes in the presence of both the inner-filter effect and unsuspected species. The simultaneous determination of the systemic fungicides carbendazim and thiabendazole was achieved and used to discuss the scope of the applied second-order chemometric tools: parallel factor analysis (PARAFAC) and partial least-squares with residual bilinearization (PLS/RBL). The chemometric study was performed using fluorescence excitation-emission matrices obtained after the extraction of the analytes onto a C18-membrane surface. The ability of PLS/RBL to recognize and overcome the significant changes produced by thiabendazole in both the excitation and emission spectra of carbendazim is demonstrated. The high performance of the selected PLS/RBL method was established by determining both pesticides in artificial and real samples.

  1. Transient well flow in leaky multiple-aquifer systems

    NASA Astrophysics Data System (ADS)

    Hemker, C. J.

    1985-10-01

    A previously developed eigenvalue analysis approach to groundwater flow in leaky multiple aquifers is used to derive exact solutions for transient well flow problems in leaky and confined systems comprising any number of aquifers. Equations are presented for the drawdown distribution in systems of infinite extent, caused by wells penetrating one or more of the aquifers completely and discharging each layer at a constant rate. Since the solution obtained may be regarded as a combined analytical-numerical technique, a type of one-dimensional modelling can be applied to find approximate solutions for several complicating conditions. Numerical evaluations are presented as time-drawdown curves and include the effects of storage in the aquitard, unconfined conditions, partially penetrating wells, and stratified aquifers. The outcome of calculations for relatively simple systems compares very well with corresponding published results. The proposed multilayer solution can be a valuable tool in aquifer test evaluation, as it provides the analytical expression required to enable the application of existing computer methods to the determination of aquifer characteristics.

  2. Fast Quantum Algorithm for Predicting Descriptive Statistics of Stochastic Processes

    NASA Technical Reports Server (NTRS)

    Williams, Colin P.

    1999-01-01

    Stochastic processes are used as a modeling tool in several sub-fields of physics, biology, and finance. Analytic understanding of the long-term behavior of such processes is only tractable for very simple types of stochastic processes, such as Markovian processes. However, in real-world applications more complex stochastic processes often arise. In physics, the complicating factor might be nonlinearities; in biology it might be memory effects; and in finance it might be the non-random intentional behavior of participants in a market. In the absence of analytic insight, one is forced to understand these more complex stochastic processes via numerical simulation techniques. In this paper we present a quantum algorithm for performing such simulations. In particular, we show how a quantum algorithm can predict arbitrary descriptive statistics (moments) of N-step stochastic processes in just O(sqrt(N)) time. That is, the quantum complexity is the square root of the classical complexity for performing such simulations. This is a significant speedup in comparison to the current state of the art.
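
    The classical baseline the abstract compares against is direct simulation, whose cost grows linearly with the number of steps N. A minimal sketch of that baseline, with a biased random walk standing in for the process (the quantum algorithm itself is not reproducible in a few lines):

        # Classical baseline for comparison: estimating moments of an N-step
        # process by direct simulation, whose cost grows linearly with N.
        # A biased random walk stands in for the stochastic process.
        import numpy as np

        rng = np.random.default_rng(0)
        N, trials = 1000, 2000
        steps = rng.choice([-1.0, 1.0], size=(trials, N), p=[0.45, 0.55])
        finals = steps.sum(axis=1)            # end state of each simulated path

        for k in (1, 2, 3):
            print(f"moment {k}: {np.mean(finals ** k):.4g}")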

  3. Investigating the causes of low detectability of pesticides in fruits and vegetables analysed by high-performance liquid chromatography - Time-of-flight.

    PubMed

    Muehlwald, S; Buchner, N; Kroh, L W

    2018-03-23

    Because of the high number of possible pesticide residues and their chemical complexity, it is necessary to develop methods which cover a broad range of pesticides. In this work, a qualitative multi-screening method for pesticides was developed using HPLC-ESI-Q-TOF. 110 pesticides were chosen for the creation of a personal compound database and library (PCDL). The MassHunter Qualitative Analysis software from Agilent Technologies was used to identify the analytes. The software parameter settings were optimised to produce a low number of false positive as well as false negative results. The method was validated for 78 selected pesticides; however, the validation criteria were not fulfilled for 45 analytes. Due to this result, investigations were undertaken to elucidate the reasons for the low detectability. It could be demonstrated that the three main causes of signal suppression were the co-eluting matrix (matrix effect), the low sensitivity of the analyte in standard solution, and the fragmentation of the analyte in the ion source (in-source collision-induced dissociation). In this paper different examples are discussed showing that the impact of these three causes differs for each analyte. For example, it is possible that an analyte with low signal intensity and intense fragmentation in the ion source is detectable in a difficult matrix, whereas an analyte with high sensitivity and low fragmentation is not detectable in a simple matrix. Additionally, it could be shown that in-source fragments are a helpful tool for unambiguous identification. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.

  5. A simple analytical aerodynamic model of Langley Winged-Cone Aerospace Plane concept

    NASA Technical Reports Server (NTRS)

    Pamadi, Bandu N.

    1994-01-01

    A simple three-degree-of-freedom analytical aerodynamic model of the Langley Winged-Cone Aerospace Plane concept is presented in a form suitable for simulation, trajectory optimization, and guidance and control studies. The analytical model is especially suitable for methods based on variational calculus. Analytical expressions are presented for the lift, drag, and pitching moment coefficients from subsonic to hypersonic Mach numbers and for angles of attack up to +/- 20 deg. The model has break points at Mach numbers of 1.0, 1.4, 4.0, and 6.0. Across these Mach number break points, the lift, drag, and pitching moment coefficients are made continuous, but their derivatives are not. There are no break points in angle of attack. The effect of control surface deflection is not considered. The present analytical model compares well with APAS calculations and wind tunnel test data for most angles of attack and Mach numbers.
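
    The break-point behaviour described above, coefficients continuous in value but not in slope across the Mach break points, is exactly what piecewise-linear interpolation produces. A sketch with invented anchor values, not the model's actual coefficients:

        # Piecewise-linear interpolation reproduces the break-point behaviour
        # described above: coefficient values are continuous across the Mach
        # break points, their derivatives are not. Anchor values are invented.
        import numpy as np

        mach_breaks = np.array([0.3, 1.0, 1.4, 4.0, 6.0])  # model break points (plus a subsonic anchor)
        cl_alpha = np.array([3.5, 4.8, 4.2, 2.4, 1.9])     # hypothetical lift-curve slopes, per rad

        def lift_curve_slope(mach):
            return np.interp(mach, mach_breaks, cl_alpha)

        for m in (0.8, 1.0, 1.2, 2.0, 5.0):
            print(f"M = {m}: CL_alpha ≈ {lift_curve_slope(m):.2f} per rad")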

  6. Theory of linear sweep voltammetry with diffuse charge: Unsupported electrolytes, thin films, and leaky membranes

    NASA Astrophysics Data System (ADS)

    Yan, David; Bazant, Martin Z.; Biesheuvel, P. M.; Pugh, Mary C.; Dawson, Francis P.

    2017-03-01

    Linear sweep and cyclic voltammetry techniques are important tools for electrochemists and have a variety of applications in engineering. Voltammetry has classically been treated with the Randles-Sevcik equation, which assumes an electroneutral supported electrolyte. In this paper, we provide a comprehensive mathematical theory of voltammetry in electrochemical cells with unsupported electrolytes and for other situations where diffuse charge effects play a role, and present analytical and simulated solutions of the time-dependent Poisson-Nernst-Planck equations with generalized Frumkin-Butler-Volmer boundary conditions for a 1:1 electrolyte and a simple reaction. Using these solutions, we construct theoretical and simulated current-voltage curves for liquid and solid thin films, membranes with fixed background charge, and cells with blocking electrodes. The full range of dimensionless parameters is considered, including the dimensionless Debye screening length (scaled to the electrode separation), Damkohler number (ratio of characteristic diffusion and reaction times), and dimensionless sweep rate (scaled to the thermal voltage per diffusion time). The analysis focuses on the coupling of Faradaic reactions and diffuse charge dynamics, although capacitive charging of the electrical double layers is also studied, for early time transients at reactive electrodes and for nonreactive blocking electrodes. Our work highlights cases where diffuse charge effects are important in the context of voltammetry, and illustrates which regimes can be approximated using simple analytical expressions and which require more careful consideration.
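
    For contrast with the diffuse-charge theory, the classical Randles-Sevcik peak current for a reversible couple in a supported (electroneutral) electrolyte is i_p = 0.4463 nFAC sqrt(nFvD/RT). A quick computation with typical illustrative values:

        # The classical Randles-Sevcik peak current the abstract contrasts
        # with, for a reversible couple in a supported electrolyte:
        #   i_p = 0.4463 * n*F*A*C * sqrt(n*F*v*D / (R*T)).
        # Numbers are typical illustrative values.
        import math

        F, R, T = 96485.0, 8.314, 298.0   # C/mol, J/(mol K), K
        n = 1        # electrons transferred
        A = 1e-4     # electrode area, m^2 (1 cm^2)
        D = 1e-9     # diffusion coefficient, m^2/s
        C = 1.0      # bulk concentration, mol/m^3 (1 mM)
        v = 0.1      # sweep rate, V/s

        i_p = 0.4463 * n * F * A * C * math.sqrt(n * F * v * D / (R * T))
        print(f"peak current ≈ {i_p * 1e6:.0f} µA")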

  7. The effect of stimulus strength on the speed and accuracy of a perceptual decision.

    PubMed

    Palmer, John; Huk, Alexander C; Shadlen, Michael N

    2005-05-02

    Both the speed and the accuracy of a perceptual judgment depend on the strength of the sensory stimulation. When stimulus strength is high, accuracy is high and response time is fast; when stimulus strength is low, accuracy is low and response time is slow. Although the psychometric function is well established as a tool for analyzing the relationship between accuracy and stimulus strength, the corresponding chronometric function for the relationship between response time and stimulus strength has not received as much consideration. In this article, we describe a theory of perceptual decision making based on a diffusion model. In it, a decision is based on the additive accumulation of sensory evidence over time to a bound. Combined with simple scaling assumptions, the proportional-rate and power-rate diffusion models predict simple analytic expressions for both the chronometric and psychometric functions. In a series of psychophysical experiments, we show that this theory accounts for response time and accuracy as a function of both stimulus strength and speed-accuracy instructions. In particular, the results demonstrate a close coupling between response time and accuracy. The theory is also shown to subsume the predictions of Piéron's Law, a power function dependence of response time on stimulus strength. The theory's analytic chronometric function allows one to extend theories of accuracy to response time.
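
    In the form commonly quoted for the proportional-rate diffusion model, drift scales with stimulus strength C, evidence accumulates to bounds +/-A (with unit diffusion coefficient), and a residual time tR absorbs non-decision processing; the psychometric and chronometric functions are then P(C) = 1/(1 + exp(-2AkC)) and RT(C) = (A/kC) tanh(AkC) + tR. A sketch with invented parameter values:

        # Psychometric and chronometric functions in the form commonly quoted
        # for the proportional-rate diffusion model (unit diffusion
        # coefficient): drift = k*C, bounds +/-A, residual time tR.
        # Parameter values are invented for illustration.
        import math

        def psychometric(C, k, A):
            return 1.0 / (1.0 + math.exp(-2.0 * A * k * C))

        def chronometric(C, k, A, tR):
            if C == 0:
                return A ** 2 + tR   # limit of the decision time as drift -> 0
            return (A / (k * C)) * math.tanh(A * k * C) + tR

        for C in (0.032, 0.064, 0.128, 0.256, 0.512):
            print(f"C = {C:.3f}: P(correct) = {psychometric(C, 15.0, 1.0):.3f}, "
                  f"mean RT = {chronometric(C, 15.0, 1.0, 0.35):.3f} s")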

  8. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    The rise of innovative unmanned aeronautical systems and the emergence of commercial space activities have resulted in a number of relatively new aerospace organizations that are designing innovative systems and solutions. These organizations use a variety of commercial off-the-shelf and in-house-developed simulation and analysis tools including 6-degree-of-freedom (6-DOF) flight simulation tools. The increased affordability of computing capability has made high-fidelity flight simulation practical for all participants. Verification of the tools' equations-of-motion and environment models (e.g., atmosphere, gravitation, and geodesy) is desirable to assure accuracy of results. However, aside from simple textbook examples, minimal verification data exists in open literature for 6-DOF flight simulation problems. This assessment compared multiple solution trajectories to a set of verification check-cases that covered atmospheric and exo-atmospheric (i.e., orbital) flight. Each scenario consisted of predefined flight vehicles, initial conditions, and maneuvers. These scenarios were implemented and executed in a variety of analytical and real-time simulation tools. This tool-set included simulation tools in a variety of programming languages based on modified flat-Earth, round-Earth, and rotating oblate spheroidal Earth geodesy and gravitation models, and independently derived equations-of-motion and propagation techniques. The resulting simulated parameter trajectories were compared by over-plotting and difference-plotting to yield a family of solutions. In total, seven simulation tools were exercised.

  9. An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation

    NASA Astrophysics Data System (ADS)

    Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi

    2015-04-01

    Rockfalls are frequent instability processes on road cuts, in open-pit mines and quarries, and on steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and with the scatter of their orientations and strength parameters. The attitude and persistence of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, a rock block may eventually split into several fragments during its propagation downhill due to impacts with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. This tool includes common modes of motion for falling boulders based on the previous literature. The tool is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development will be simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after a prior adjustment of the parameters. After the adjustment of the model parameters to a given area, a simulation can be performed to obtain maps of kinetic energy, frequency, stopping density, and passing heights. This GIS-based tool and the analysis of fragmentation laws using data collected from recent rockfalls have been developed within the RockRisk Project (2014-2016). This project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).
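
    Since the project calibrates its fragmentation laws from field data, the following is only a toy sketch of the bookkeeping a fragmentation module needs: on impact a block splits into fragments that conserve mass, and the kinetic energy surviving the impact is redistributed in proportion to fragment mass. The random split and the loss fraction are purely illustrative:

        # Toy bookkeeping for a fragmentation event: fragments conserve mass,
        # and the kinetic energy surviving the impact is redistributed in
        # proportion to fragment mass. The random split and the loss fraction
        # are purely illustrative; the project calibrates real fragmentation
        # laws from field data.
        import numpy as np

        rng = np.random.default_rng(7)

        def fragment(block_mass_kg, kinetic_energy_j, n_fragments=4, energy_loss=0.3):
            w = rng.dirichlet(np.ones(n_fragments))   # random mass fractions summing to 1
            return w * block_mass_kg, (1.0 - energy_loss) * kinetic_energy_j * w

        masses, energies = fragment(block_mass_kg=1200.0, kinetic_energy_j=5.0e5)
        for i, (m, e) in enumerate(zip(masses, energies), start=1):
            print(f"fragment {i}: {m:7.1f} kg, {e / 1e3:6.1f} kJ")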

  10. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.

  11. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996
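
    Of the three models the tool offers, Holt-Winters exponential smoothing is the workhorse for seasonal test volumes. A sketch of the same kind of forecast using the statsmodels library and a synthetic monthly series (real use would load historic volumes):

        # One of the tool's three models, reproduced with statsmodels:
        # Holt-Winters smoothing with additive trend and multiplicative
        # seasonality. The monthly series is synthetic; real use would load
        # historic test volumes.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        months = pd.date_range("2014-01-01", periods=36, freq="MS")
        trend = np.linspace(1000.0, 1400.0, 36)
        season = 1.0 + 0.15 * np.sin(2.0 * np.pi * np.arange(36) / 12.0)
        volumes = pd.Series(trend * season, index=months)

        fit = ExponentialSmoothing(volumes, trend="add", seasonal="mul",
                                   seasonal_periods=12).fit()
        print(fit.forecast(6).round(0))   # next six months of predicted demand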

  12. Preliminary Validation of Direct Detection of Foot-And-Mouth Disease Virus within Clinical Samples Using Reverse Transcription Loop-Mediated Isothermal Amplification Coupled with a Simple Lateral Flow Device for Detection

    PubMed Central

    Waters, Ryan A.; Fowler, Veronica L.; Armson, Bryony; Nelson, Noel; Gloster, John; Paton, David J.; King, Donald P.

    2014-01-01

    Rapid, field-based diagnostic assays are desirable tools for the control of foot-and-mouth disease (FMD). Current approaches involve either: 1) detection of FMD virus (FMDV) with immunochromatographic antigen lateral flow devices (LFDs), which have relatively low analytical sensitivity, or 2) portable RT-qPCR, which has high analytical sensitivity but is expensive. Loop-mediated isothermal amplification (LAMP) may provide a platform upon which to develop field-based assays without these drawbacks. The objective of this study was to modify an FMDV-specific reverse transcription-LAMP (RT-LAMP) assay to enable detection of dual-labelled LAMP products with an LFD, and to evaluate simple sample processing protocols without nucleic acid extraction. The limit of detection of this assay was demonstrated to be equivalent to that of a laboratory-based real-time RT-qPCR assay and to have a 10,000-fold higher analytical sensitivity than the FMDV-specific antigen LFD currently used in the field. Importantly, this study demonstrated that FMDV RNA could be detected from epithelial suspensions without the need for prior RNA extraction, utilising a rudimentary heat source for amplification. Once optimised, this RT-LAMP-LFD protocol was able to detect multiple serotypes from field epithelial samples, in addition to detecting FMDV in the air surrounding infected cattle, pigs and sheep, including pre-clinical detection. This study describes the development and evaluation of an assay format which may be used as a future basis for rapid and low-cost detection of FMDV. In addition, it provides "proof of concept" for the future use of LAMP assays to tackle other challenging diagnostic scenarios encompassing veterinary and human health. PMID:25165973

  13. Headspace-SPME-GC/MS as a simple cleanup tool for sensitive 2,6-diisopropylphenol analysis from lipid emulsions and adaptable to other matrices.

    PubMed

    Pickl, Karin E; Adamek, Viktor; Gorges, Roland; Sinner, Frank M

    2011-07-15

    Due to increased regulatory requirements, the interaction of active pharmaceutical ingredients with various surfaces and solutions during production and storage is gaining interest in the pharmaceutical research field, in particular with respect to the development of new formulations and new packaging material and the evaluation of cleaning processes. Experimental adsorption/absorption studies, as well as the study of cleaning processes, require sophisticated analytical methods with high sensitivity for the drug of interest. In the case of 2,6-diisopropylphenol, a small lipophilic drug which is typically formulated as a lipid emulsion for intravenous injection, a highly sensitive method in the μg/l concentration range, suitable for a variety of different sample matrices including lipid emulsions, is needed. We present a headspace solid-phase microextraction (HS-SPME) approach as a simple cleanup procedure for sensitive 2,6-diisopropylphenol quantification from diverse matrices, choosing a lipid emulsion as the most challenging matrix with regard to complexity. By combining the simple and straightforward HS-SPME sample pretreatment with an optimized GC-MS quantification method, a robust and sensitive method for 2,6-diisopropylphenol was developed. This method shows excellent sensitivity in the low μg/l concentration range (5-200 μg/l), good accuracy (94.8-98.8%), and good precision (intra-day precision 0.1-9.2%, inter-day precision 2.0-7.7%). The method can be easily adapted to other, less complex matrices such as water or swab extracts. Hence, the presented method holds the potential to serve as a single, simple analytical procedure for 2,6-diisopropylphenol analysis in various types of samples, such as required in, e.g., adsorption/absorption studies, which typically deal with a variety of different surfaces (steel, plastic, glass, etc.) and solutions/matrices including lipid emulsions. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Low-cost electron-gun pulser for table-top maser experiments

    NASA Astrophysics Data System (ADS)

    Grinberg, V.; Jerby, E.; Shahadi, A.

    1995-04-01

    A simple 10 kV electron-gun pulser for small-scale maser experiments is presented. This low-cost pulser has operated successfully in various table-top cyclotron-resonance maser (CRM) and free-electron maser (FEM) experiments. It consists of a low-voltage capacitor bank, an SCR control circuit, and a transformer bank (car ignition coils) connected directly to the e-gun. The pulser produces a current of 3 A at 10 kV in a Gaussian-like pulse of 1 ms width. The voltage sweep during the pulse provides a useful tool to locate resonances of CRM and FEM interactions. Analytical expressions for the pulser design and experimental measurements are presented.

  15. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
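
    The sigma-metric named above is a simple ratio of the allowable total error to the observed analytical error at a medical decision concentration. A minimal sketch with illustrative values (the thresholds in the comment are the conventional Six Sigma benchmarks):

        # The sigma-metric named above: sigma = (TEa - |bias|) / CV, with the
        # allowable total error TEa, bias, and imprecision CV all in percent at
        # a medical decision concentration. Values are illustrative.
        def sigma_metric(tea_pct, bias_pct, cv_pct):
            return (tea_pct - abs(bias_pct)) / cv_pct

        sigma = sigma_metric(tea_pct=10.0, bias_pct=1.5, cv_pct=2.0)
        # conventionally: sigma >= 6 is world class, sigma < 3 calls for redesign
        print(f"sigma = {sigma:.2f}")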

  16. Bridgman growth of semiconductors

    NASA Technical Reports Server (NTRS)

    Carlson, F. M.

    1985-01-01

    The purpose of this study was to improve the understanding of the transport phenomena that occur in the directional solidification of alloy semiconductors. In particular, emphasis was placed on the strong role of convection in the melt. Analytical solutions were not deemed possible for such an involved problem. Accordingly, a numerical model of the process was developed to simulate the transport. This translates into solving the partial differential equations of energy, mass, species, and momentum transfer subject to various boundary and initial conditions. A finite element method with simple elements was initially chosen. This simulation tool will enable the crystal grower to systematically identify and modify the important design factors within her control to produce better crystals.

  17. Cosmological Perturbation Theory and the Spherical Collapse model - I. Gaussian initial conditions

    NASA Astrophysics Data System (ADS)

    Fosalba, Pablo; Gaztanaga, Enrique

    1998-12-01

    We present a simple and intuitive approximation for solving the perturbation theory (PT) of small cosmic fluctuations. We consider only the spherically symmetric or monopole contribution to the PT integrals, which yields the exact result for tree-graphs (i.e. at leading order). We find that the non-linear evolution in Lagrangian space is then given by a simple local transformation over the initial conditions, although it is not local in Eulerian space. This transformation is found to be described by the spherical collapse (SC) dynamics, as it is the exact solution in the shearless (and therefore local) approximation in Lagrangian space. Taking advantage of this property, it is straightforward to derive the one-point cumulants, xi_J, for both the unsmoothed and smoothed density fields to arbitrary order in the perturbative regime. To leading order this reproduces, and provides us with a simple explanation for, the exact results obtained by Bernardeau. We then show that the SC model leads to accurate estimates for the next corrective terms when compared with the results derived in the exact perturbation theory making use of loop calculations. The agreement is within a few per cent for the hierarchical ratios S_J = xi_J/xi_2^(J-1). We compare our analytic results with N-body simulations, which turn out to be in very good agreement up to scales where sigma ~ 1. A similar treatment is presented to estimate higher-order corrections in the Zel'dovich approximation. These results represent a powerful and readily usable tool to produce analytical predictions that describe the gravitational clustering of large-scale structure in the weakly non-linear regime.

  18. A Novel Shape Parameterization Approach

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper presents a novel parameterization approach for complex shapes suitable for a multidisciplinary design optimization application. The approach consists of two basic concepts: (1) parameterizing the shape perturbations rather than the geometry itself and (2) performing the shape deformation by means of the soft objects animation algorithms used in computer graphics. Because the formulation presented in this paper is independent of grid topology, we can treat computational fluid dynamics and finite element grids in a similar manner. The proposed approach is simple, compact, and efficient. Also, the analytical sensitivity derivatives are easily computed for use in a gradient-based optimization. This algorithm is suitable for low-fidelity (e.g., linear aerodynamics and equivalent laminated plate structures) and high-fidelity analysis tools (e.g., nonlinear computational fluid dynamics and detailed finite element modeling). This paper contains the implementation details of parameterizing for planform, twist, dihedral, thickness, and camber. The results are presented for a multidisciplinary design optimization application consisting of nonlinear computational fluid dynamics, detailed computational structural mechanics, performance, and a simple propulsion module.
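
    Concept (1) above, parameterizing perturbations rather than the geometry itself, can be sketched in a few lines: each design variable scales a smooth basis field evaluated at the baseline grid points, so CFD and finite element grids deform identically. The basis function below is a hypothetical twist-like field, purely for illustration:

        # Sketch of perturbation-based parameterization: new surface =
        # baseline + sum_i w_i * B_i(baseline). The grid never needs to know
        # how the basis fields were constructed, so CFD and FEM grids deform
        # the same way. The basis field here is hypothetical.
        import numpy as np

        def deform(baseline_xyz, basis_funcs, weights):
            out = baseline_xyz.astype(float).copy()
            for w, basis in zip(weights, basis_funcs):
                out += w * basis(baseline_xyz)
            return out

        def twist_basis(xyz):
            # hypothetical smooth field: z-displacement growing with span (y) and chord (x)
            dz = xyz[:, 1] ** 2 * xyz[:, 0]
            return np.stack([np.zeros_like(dz), np.zeros_like(dz), dz], axis=1)

        grid = np.random.default_rng(1).random((5, 3))   # stand-in for CFD or FEM nodes
        print(deform(grid, [twist_basis], weights=[0.05]))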

  19. Multidisciplinary Aerodynamic-Structural Shape Optimization Using Deformation (MASSOUD)

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2000-01-01

    This paper presents a multidisciplinary shape parameterization approach. The approach consists of two basic concepts: (1) parameterizing the shape perturbations rather than the geometry itself and (2) performing the shape deformation by means of the soft object animation algorithms used in computer graphics. Because the formulation presented in this paper is independent of grid topology, we can treat computational fluid dynamics and finite element grids in the same manner. The proposed approach is simple, compact, and efficient. Also, the analytical sensitivity derivatives are easily computed for use in a gradient-based optimization. This algorithm is suitable for low-fidelity (e.g., linear aerodynamics and equivalent laminate plate structures) and high-fidelity (e.g., nonlinear computational fluid dynamics and detailed finite element modeling) analysis tools. This paper contains the implementation details of parameterizing for planform, twist, dihedral, thickness, camber, and free-form surface. Results are presented for a multidisciplinary application consisting of nonlinear computational fluid dynamics, detailed computational structural mechanics, and a simple performance module.

  20. Multidisciplinary Aerodynamic-Structural Shape Optimization Using Deformation (MASSOUD)

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2000-01-01

    This paper presents a multidisciplinary shape parameterization approach. The approach consists of two basic concepts: (1) parameterizing the shape perturbations rather than the geometry itself and (2) performing the shape deformation by means of the soft object animation algorithms used in computer graphics. Because the formulation presented in this paper is independent of grid topology, we can treat computational fluid dynamics and finite element grids in a similar manner. The proposed approach is simple, compact, and efficient. Also, the analytical sensitivity derivatives are easily computed for use in a gradient-based optimization. This algorithm is suitable for low-fidelity (e.g., linear aerodynamics and equivalent laminated plate structures) and high-fidelity (e.g., nonlinear computational fluid dynamics and detailed finite element modeling) analysis tools. This paper contains the implementation details of parameterizing for planform, twist, dihedral, thickness, camber, and free-form surface. Results are presented for a multidisciplinary design optimization application consisting of nonlinear computational fluid dynamics, detailed computational structural mechanics, and a simple performance module.

  1. A simple, analytical, axisymmetric microburst model for downdraft estimation

    NASA Technical Reports Server (NTRS)

    Vicroy, Dan D.

    1991-01-01

    A simple analytical microburst model was developed for use in estimating vertical winds from horizontal wind measurements. It is an axisymmetric, steady-state model that uses shaping functions to satisfy the mass continuity equation and simulate boundary-layer effects. The model is defined through four variables: the radius and altitude of the maximum horizontal wind, a shaping-function variable, and a scale factor. The model agrees closely with a high-fidelity analytical model and with measured data, particularly in the radial direction and at lower altitudes. At higher altitudes, the model tends to overestimate the wind magnitude relative to the measured data.
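
    The core idea is that, for axisymmetric incompressible flow, the continuity equation (1/r) d(ru)/dr + dw/dz = 0 lets the vertical wind be recovered by integrating the radial divergence of the modelled horizontal wind upward from the ground. A numerical sketch with invented shaping functions, not the model's actual ones:

        # Sketch of the estimation idea: for axisymmetric incompressible flow,
        #   (1/r) d(r*u)/dr + dw/dz = 0,
        # so w(r, z) = -integral_0^z (1/r) d(r*u)/dr dz'. The shaping functions
        # below are invented stand-ins, not the model's actual ones.
        import numpy as np

        R, ZM, U = 1000.0, 300.0, 20.0   # radius/altitude of max wind (m), max wind (m/s); assumed

        def u_wind(r, z):
            radial = (r / R) * np.exp(1.0 - r / R)                       # invented radial shaping
            vertical = np.exp(-z / (2.0 * ZM)) - np.exp(-2.0 * z / ZM)   # invented boundary-layer shaping
            return U * radial * vertical

        r, dr = 600.0, 1.0
        zs = np.linspace(0.0, 600.0, 601)
        dz = zs[1] - zs[0]
        # radial divergence (1/r) d(r*u)/dr by central differences, then integrate upward
        div = ((r + dr) * u_wind(r + dr, zs) - (r - dr) * u_wind(r - dr, zs)) / (2.0 * dr * r)
        w = -np.cumsum(div) * dz
        print(f"vertical wind at r = {r:.0f} m, z = {zs[-1]:.0f} m: {w[-1]:.1f} m/s (negative = downdraft)")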

  2. Simple analytical model of a thermal diode

    NASA Astrophysics Data System (ADS)

    Kaushik, Saurabh; Kaushik, Sachin; Marathe, Rahul

    2018-05-01

    Recently, much attention has been given to the manipulation of heat by constructing thermal devices such as thermal diodes, transistors, and logic gates. Many of the proposed models have an asymmetry which leads to the desired effect, and the presence of non-linear interactions among the particles is also essential. Such models, however, lack analytical understanding. Here we propose a simple, analytically solvable model of a thermal diode. Our model consists of classical spins in contact with multiple heat baths and constant external magnetic fields. Interestingly, the magnetic field is the only parameter required to achieve heat rectification.

  3. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    PubMed

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype open-source, web-based software analytical tool generated statistical disproportionality data-mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities and provided subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drug and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end-user satisfaction.
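
    The abstract does not spell out its scoring formula, but a standard disproportionality score of the kind such tools compute is the proportional reporting ratio (PRR) over a 2x2 table of counts. A sketch with invented counts:

        # A standard disproportionality score of the kind such tools compute:
        # the proportional reporting ratio (PRR) from a 2x2 table of citation
        # counts. The counts are invented; the paper's exact scoring method is
        # not reproduced here.
        import math

        a, b = 40, 960      # target drug: citations with / without the target event
        c, d = 200, 48800   # all other drugs: citations with / without the target event

        prr = (a / (a + b)) / (c / (c + d))
        se_log = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
        lo95 = math.exp(math.log(prr) - 1.96 * se_log)
        hi95 = math.exp(math.log(prr) + 1.96 * se_log)
        print(f"PRR = {prr:.1f} (95% CI {lo95:.1f}-{hi95:.1f})")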

  4. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will engender the development of additional model drivers by third-party scientists.

  5. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  6. Miniaturized Temperature-Controlled Planar Chromatography (Micro-TLC) as a Versatile Technique for Fast Screening of Micropollutants and Biomarkers Derived from Surface Water Ecosystems and During Technological Processes of Wastewater Treatment.

    PubMed

    Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2017-07-01

    There is increasing interest in the development of simple analytical systems enabling the fast screening of target components in complex samples. A number of newly invented protocols are based on quasi-separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, the quantification of target components can be performed mainly due to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography can work as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea of Middle Pomerania in the northern part of Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates (under visible light, fluorescence, and fluorescence quenching conditions, and using the visualization reagent phosphomolybdic acid) enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful and inexpensive, and can be considered a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular-mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly, green chemistry analytical tool. The described analytical protocol can be complementary to those involving classical column chromatography (HPLC) or various planar microfluidic devices.

  7. DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...

  8. Horizontal lifelines - review of regulations and simple design method considering anchorage rigidity.

    PubMed

    Galy, Bertrand; Lan, André

    2018-03-01

    Among the many occupational risks construction workers encounter every day, falling from a height is the most dangerous. The objective of this article is to propose a simple analytical design method for horizontal lifelines (HLLs) that considers anchorage flexibility. The article presents a short review of the standards and regulations/acts/codes concerning HLLs in Canada, the USA, and Europe. A static analytical approach is proposed that accounts for anchorage flexibility. The analytical results are compared with a series of 42 dynamic fall tests and with a SAP2000 numerical model. The experimental results show that the analytical method is slightly conservative and overestimates the line tension in most cases, by a maximum of 17%. The static SAP2000 results show a maximum difference of 2.1% from the analytical method. The analytical method is accurate enough to safely design HLLs, and quick-design abaci are provided to allow the engineer to make rapid on-site verifications if needed.
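
    For orientation, the rigid-anchorage statics of an HLL are simple: for a taut line of span L carrying a single arrested fall (point load P) at midspan with sag s, moment equilibrium gives a horizontal tension H = PL/(4s). The article's method additionally corrects for anchorage flexibility, which this sketch does not attempt; the numbers are illustrative:

        # Rigid-anchorage statics only: taut line of span L, arrested fall P at
        # midspan, sag s; moment equilibrium gives H = P*L/(4*s). Illustrative
        # numbers; the article's anchorage-flexibility correction is omitted.
        import math

        P = 6.0e3   # maximum arrest force on the line, N (assumed)
        L = 12.0    # span between anchorages, m
        s = 1.0     # midspan sag under load, m

        H = P * L / (4.0 * s)        # horizontal component of line tension
        T = math.hypot(H, P / 2.0)   # resultant tension at each anchorage
        print(f"H = {H/1e3:.1f} kN, anchorage tension T = {T/1e3:.1f} kN")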

  9. Simultaneous Voltammetric Determination of Acetaminophen and Isoniazid (Hepatotoxicity-Related Drugs) Utilizing Bismuth Oxide Nanorod Modified Screen-Printed Electrochemical Sensing Platforms.

    PubMed

    Mahmoud, Bahaa G; Khairy, Mohamed; Rashwan, Farouk A; Banks, Craig E

    2017-02-07

    To address recent incidents of hepatotoxicity related to drugs, a new analytical tool for the continuous determination of these drugs in human fluids is required. Electrochemical analytical methods offer an effective, rapid, and simple tool for the on-site determination of various organic and inorganic species. However, the design of a sensitive, selective, stable, and reproducible sensor is still a major challenge. In the present manuscript, a facile, one-pot hydrothermal synthesis of bismuth oxide (Bi2O2.33) nanostructures (nanorods) was developed. These BiO nanorods were cast onto disposable graphite screen-printed electrodes (BiO-SPEs), allowing the ultrasensitive determination of acetaminophen (APAP) in the presence of its common interference isoniazid (INH), both of which are found in drug samples. The BiO-SPEs exhibited strong electrocatalytic activity toward the sensing of APAP and INH, with an enhanced analytical signal (voltammetric peak) over that achievable at unmodified (bare) SPEs. The electroanalytical sensing of APAP and INH is possible with accessible linear ranges of 0.5-1250 μM and 5-1760 μM, with limits of detection (3σ) of 30 nM and 1.85 μM, respectively. The stability, reproducibility, and repeatability of the BiO-SPEs were also investigated, and the BiO-SPEs were evaluated for the sensing of APAP and INH in human serum, urine, saliva, and tablet samples. The results presented in this paper demonstrate that BiO-SPE sensing platforms are a potential candidate for the accurate determination of APAP and INH in human fluids and pharmaceutical formulations.

  10. Experimental and analytical tools for evaluation of Stirling engine rod seal behavior

    NASA Technical Reports Server (NTRS)

    Krauter, A. I.; Cheng, H. S.

    1979-01-01

    The first year of a two year experimental and analytical program is reported. The program is directed at the elastohydrodynamic behavior of sliding elastomeric rod seals for the Stirling engine. During the year, experimental and analytical tools were developed for evaluating seal leakage, seal friction, and the fluid film thickness at the seal/cylinder interface.

  11. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  12. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  13. A simple rapid process for semi-automated brain extraction from magnetic resonance images of the whole mouse head.

    PubMed

    Delora, Adam; Gonzales, Aaron; Medina, Christopher S; Mitchell, Adam; Mohed, Abdul Faheem; Jacobs, Russell E; Bearer, Elaine L

    2016-01-15

    Magnetic resonance imaging (MRI) is a well-developed technique in neuroscience. Limitations in applying MRI to rodent models of neuropsychiatric disorders include the large number of animals required to achieve statistical significance, and the paucity of automation tools for the critical early step in processing, brain extraction, which prepares brain images for alignment and voxel-wise statistics. This novel timesaving automation of template-based brain extraction ("skull-stripping") is capable of quickly and reliably extracting the brain from large numbers of whole head images in a single step. The method is simple to install and requires minimal user interaction. This method is equally applicable to different types of MR images. Results were evaluated with Dice and Jacquard similarity indices and compared in 3D surface projections with other stripping approaches. Statistical comparisons demonstrate that individual variation of brain volumes are preserved. A downloadable software package not otherwise available for extraction of brains from whole head images is included here. This software tool increases speed, can be used with an atlas or a template from within the dataset, and produces masks that need little further refinement. Our new automation can be applied to any MR dataset, since the starting point is a template mask generated specifically for that dataset. The method reliably and rapidly extracts brain images from whole head images, rendering them useable for subsequent analytical processing. This software tool will accelerate the exploitation of mouse models for the investigation of human brain disorders by MRI. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION - PROJECT SUMMARY

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...

  15. An Analytical State Transition Matrix for Orbits Perturbed by an Oblate Spheroid

    NASA Technical Reports Server (NTRS)

    Mueller, A. C.

    1977-01-01

    An analytical state transition matrix and its inverse, which include the short period and secular effects of the second zonal harmonic, were developed from the nonsingular PS satellite theory. The fact that the independent variable in the PS theory is not time is in no respect disadvantageous, since any explicit analytical solution must be expressed in the true or eccentric anomaly. This is shown to be the case for the simple conic matrix. The PS theory allows for a concise, accurate, and algorithmically simple state transition matrix. The improvement over the conic matrix ranges from 2 to 4 digits accuracy.

  16. Mining Mathematics in Textbook Lessons

    ERIC Educational Resources Information Center

    Ronda, Erlina; Adler, Jill

    2017-01-01

    In this paper, we propose an analytic tool for describing the mathematics made available to learn in a "textbook lesson". The tool is an adaptation of the Mathematics Discourse in Instruction (MDI) analytic tool that we developed to analyze what is made available to learn in teachers' lessons. Our motivation to adapt the use of the MDI…

  17. Fire behavior modeling-a decision tool

    Treesearch

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  18. Guidance for the Design and Adoption of Analytic Tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bandlow, Alisa

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper summarizes guidelines, lessons learned, and existing research on what analysts want and on how to better understand which tools they do and do not need.

  19. Helicase-dependent isothermal amplification: a novel tool in the development of molecular-based analytical systems for rapid pathogen detection.

    PubMed

    Barreda-García, Susana; Miranda-Castro, Rebeca; de-Los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J; Lobo-Castañón, María Jesús

    2018-01-01

    Highly sensitive testing of nucleic acids is essential to improve the detection of pathogens, which pose a major threat for public health worldwide. Currently available molecular assays, mainly based on PCR, have a limited utility in point-of-need control or resource-limited settings. Consequently, there is a strong interest in developing cost-effective, robust, and portable platforms for early detection of these harmful microorganisms. Since its description in 2004, isothermal helicase-dependent amplification (HDA) has been successfully applied in the development of novel molecular-based technologies for rapid, sensitive, and selective detection of viruses and bacteria. In this review, we highlight relevant analytical systems using this simple nucleic acid amplification methodology that takes place at a constant temperature and that is readily compatible with microfluidic technologies. Different strategies for monitoring HDA amplification products are described. In addition, we present technological advances for integrating sample preparation, HDA amplification, and detection. Future perspectives and challenges toward point-of-need use not only for clinical diagnosis but also in food safety testing and environmental monitoring are also discussed. Graphical Abstract: Expanding the analytical toolbox for the detection of DNA sequences specific of pathogens with isothermal helicase dependent amplification (HDA).

  20. Constraints on the nuclear equation of state from nuclear masses and radii in a Thomas-Fermi meta-modeling approach

    NASA Astrophysics Data System (ADS)

    Chatterjee, D.; Gulminelli, F.; Raduta, Ad. R.; Margueron, J.

    2017-12-01

    The question of correlations among empirical equation of state (EoS) parameters constrained by nuclear observables is addressed in a Thomas-Fermi meta-modeling approach. A recently proposed meta-modeling for the nuclear EoS in nuclear matter is augmented with a single finite size term to produce a minimal unified EoS functional able to describe the smooth part of the nuclear ground state properties. This meta-model can reproduce the predictions of a large variety of models, and interpolate continuously between them. An analytical approximation to the full Thomas-Fermi integrals is further proposed giving a fully analytical meta-model for nuclear masses. The parameter space is sampled and filtered through the constraint of nuclear mass reproduction with Bayesian statistical tools. We show that this simple analytical meta-modeling has a predictive power on masses, radii, and skins comparable to full Hartree-Fock or extended Thomas-Fermi calculations with realistic energy functionals. The covariance analysis on the posterior distribution shows that no physical correlation is present between the different EoS parameters. Concerning nuclear observables, a strong correlation between the slope of the symmetry energy and the neutron skin is observed, in agreement with previous studies.
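
    As a schematic of the Bayesian filtering step described above, the toy sketch below samples a two-parameter liquid-drop-like mass formula uniformly and keeps only draws that reproduce pseudo-data within a tolerance. The model, parameter ranges, and 5 MeV tolerance are illustrative assumptions standing in for the full Thomas-Fermi meta-model.

        import numpy as np

        rng = np.random.default_rng(7)

        def toy_masses(params, A):
            # Stand-in for the analytical mass meta-model:
            # binding energy E(A) = a_v*A - a_s*A**(2/3)  (MeV)
            a_v, a_s = params[..., :1], params[..., 1:]
            return a_v * A - a_s * A ** (2.0 / 3.0)

        A = np.array([40.0, 90.0, 208.0])                 # toy mass numbers
        data = toy_masses(np.array([15.6, 17.2]), A)      # pseudo-data from "true" values

        # Uniform prior sampling, then filter on reproduction of the masses.
        samples = rng.uniform([14.0, 15.0], [17.0, 19.0], size=(100_000, 2))
        pred = toy_masses(samples, A)
        keep = np.all(np.abs(pred - data) < 5.0, axis=1)  # 5 MeV tolerance (illustrative)
        posterior = samples[keep]
        print(posterior.shape[0], "samples accepted")
        print("posterior means:", posterior.mean(axis=0))
        # correlation induced between the two parameters by the mass constraint
        print("a_v/a_s correlation:", np.corrcoef(posterior.T)[0, 1])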

  1. Liquid chromatography-tandem mass spectrometry analysis of perfluorooctane sulfonate and perfluorooctanoic Acid in fish fillet samples.

    PubMed

    Paiano, Viviana; Fattore, Elena; Carrà, Andrea; Generoso, Caterina; Fanelli, Roberto; Bagnati, Renzo

    2012-01-01

    Perfluorooctane sulfonate (PFOS) and perfluorooctanoic acid (PFOA) are persistent contaminants which can be found in environmental and biological samples. A new and fast analytical method is described here for the analysis of these compounds in the edible part of fish samples. The method uses a simple liquid extraction by sonication, followed by direct determination using liquid chromatography-tandem mass spectrometry (LC-MS/MS). The linearity of the instrumental response was good, with average regression coefficients of 0.9971 and 0.9979 for PFOS and PFOA, respectively, and the coefficients of variation (CV) of the method ranged from 8% to 20%. Limits of detection (LOD) were 0.04 ng/g for both analytes, and recoveries were 90% for PFOS and 76% for PFOA. The method was applied to samples of homogenized fillets of wild and farmed fish from the Mediterranean Sea. Most of the samples showed little or no contamination by PFOS and PFOA, and the highest concentrations detected among the fish species analyzed were 5.96 ng/g and 1.89 ng/g, respectively. The developed analytical methodology can be used as a tool to monitor and assess human exposure to perfluorinated compounds through seafood consumption.

  2. Quantitative imaging for discovery and assembly of the metabo-regulome

    PubMed Central

    Okumoto, Sakiko; Takanaga, Hitomi; Frommer, Wolf B.

    2009-01-01

    Little is known about regulatory networks that control metabolic flux in plant cells. Detailed understanding of regulation is crucial for synthetic biology. The difficulty of measuring metabolites with cellular and subcellular precision is a major roadblock. New tools have been developed for monitoring extracellular, cytosolic, organellar and vacuolar ion and metabolite concentrations with a time resolution of milliseconds to hours. Genetically encoded sensors allow quantitative measurement of steady-state concentrations of ions, signaling molecules and metabolites and their respective changes over time. Fluorescence resonance energy transfer (FRET) sensors exploit conformational changes in polypeptides as a proxy for analyte concentrations. Subtle effects of analyte binding on the conformation of the recognition element are translated into a FRET change between two fused green fluorescent protein (GFP) variants, enabling simple monitoring of analyte concentrations using fluorimetry or fluorescence microscopy. Fluorimetry provides information averaged over cell populations, while microscopy detects differences between cells or populations of cells. The genetically encoded sensors can be targeted to subcellular compartments or the cell surface. Confocal microscopy ultimately permits observation of gradients or local differences within a compartment. The FRET assays can be adapted to high-throughput analysis to screen mutant populations in order to systematically identify signaling networks that control individual steps in metabolic flux. PMID:19138219

  3. A simple framework for relating variations in runoff to variations in climatic conditions and catchment properties

    NASA Astrophysics Data System (ADS)

    Roderick, Michael L.; Farquhar, Graham D.

    2011-12-01

    We use the Budyko framework to calculate catchment-scale evapotranspiration (E) and runoff (Q) as a function of two climatic factors, precipitation (P) and evaporative demand (Eo = 0.75 times the pan evaporation rate), and a third parameter that encodes the catchment properties (n) and modifies how P is partitioned between E and Q. This simple theory accurately predicted the long-term evapotranspiration (E) and runoff (Q) for the Murray-Darling Basin (MDB) in southeast Australia. We extend the theory by developing a simple and novel analytical expression for the effects on E and Q of small perturbations in P, Eo, and n. The theory predicts that a 10% change in P, with all else constant, would result in a 26% change in Q in the MDB. Future climate scenarios (2070-2099) derived using Intergovernmental Panel on Climate Change AR4 climate model output highlight the diversity of projections for P (±30%) with a correspondingly large range in projections for Q (±80%) in the MDB. We conclude with a qualitative description about the impact of changes in catchment properties on water availability and focus on the interaction between vegetation change, increasing atmospheric [CO2], and fire frequency. We conclude that the modern version of the Budyko framework is a useful tool for making simple and transparent estimates of changes in water availability.
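
    The sensitivity claim above can be reproduced qualitatively in a few lines of code, assuming the Mezentsev-Choudhury-Yang form of the Budyko curve commonly used in such analyses; the P, Eo and n values below are placeholders, not Murray-Darling Basin data.

        def budyko_E(P, Eo, n):
            # Mezentsev-Choudhury-Yang form: E = P*Eo / (P**n + Eo**n)**(1/n)
            return P * Eo / (P**n + Eo**n) ** (1.0 / n)

        def runoff(P, Eo, n):
            # long-term water balance: Q = P - E
            return P - budyko_E(P, Eo, n)

        P, Eo, n = 500.0, 1400.0, 2.0        # placeholder values (mm/yr, mm/yr, -)
        Q0 = runoff(P, Eo, n)
        Q1 = runoff(1.10 * P, Eo, n)         # +10% precipitation, all else constant
        print(f"Q changes by {100.0 * (Q1 - Q0) / Q0:.0f}%")

    With these placeholder values, a 10% increase in P amplifies to roughly a 30% increase in Q, the same kind of several-fold amplification the paper quantifies for the MDB.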

  4. Near-Earth object intercept trajectory design for planetary defense

    NASA Astrophysics Data System (ADS)

    Vardaxis, George; Wie, Bong

    2014-08-01

    Tracking the orbits of asteroids and planning asteroid missions have ceased to be a simple exercise and become more of a necessity as the number of identified potentially hazardous near-Earth asteroids increases. Several software tools, such as Mystic, MALTO, Copernicus, SNAP, OTIS, and GMAT, have been developed by NASA for spacecraft trajectory optimization and mission design. This paper further expands upon the development and validation of an Asteroid Mission Design Software Tool (AMiDST), through the use of approach and post-encounter orbital variations and analytic keyhole theory. Combining these new capabilities with a high-precision orbit propagator, this paper describes example mission trajectory designs using AMiDST as applied to the fictitious asteroid 2013 PDC-E. During the 2013 IAA Planetary Defense Conference, the asteroid 2013 PDC-E was used for an exercise in which participants simulated the decision-making process for developing deflection and civil defense responses to a hypothetical asteroid threat.

  5. Planungsmodelle und Planungsmethoden. Anhaltspunkte zur Strukturierung und Gestaltung von Planungsprozessen

    NASA Astrophysics Data System (ADS)

    Diller, Christian; Karic, Sarah; Oberding, Sarah

    2017-06-01

    This article addresses the question of the phases of the political planning process in which planners apply their methodological toolset. To that end, the results of a research project are presented, which were obtained by examining planning cases reported in scholarly journals. It is first discussed which model of the planning process is best suited to reflect the cases considered, and how it relates to models of the political process. It is then analyzed which types of planning methods are applied in the several stages of the planning process. The central findings: although complex, many planning processes can be adequately depicted by a linear model with predominantly simple feedback loops. Even in times of the communicative turn, planners should take care to apply not only communicative methods but also the classical analytical-rational methods; these are helpful especially for understanding the political process before and after the actual planning phase.

  6. Non-Gated Laser Induced Breakdown Spectroscopy Provides a Powerful Segmentation Tool on Concomitant Treatment of Characteristic and Continuum Emission

    PubMed Central

    Dasari, Ramachandra Rao; Barman, Ishan; Gundawar, Manoj Kumar

    2014-01-01

    We demonstrate the application of non-gated laser induced breakdown spectroscopy (LIBS) for characterization and classification of organic materials with similar chemical composition. While use of such a system introduces a substantial continuum background in the spectral dataset, we show that appropriate treatment of the continuum and characteristic emission results in accurate discrimination of pharmaceutical formulations of similar stoichiometry. Specifically, our results suggest that near-perfect classification can be obtained by employing suitable multivariate analysis on the acquired spectra, without prior removal of the continuum background. Indeed, we conjecture that pre-processing in the form of background removal may introduce spurious features in the signal. Our findings in this report significantly advance the prior results in time-integrated LIBS application and suggest the possibility of a portable, non-gated LIBS system as a process analytical tool, given its simple instrumentation needs, real-time capability, and lack of sample preparation requirements. PMID:25084522
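
    The abstract does not spell out the multivariate algorithm; as one plausible reading, the sketch below runs a generic PCA-plus-linear-SVM pipeline on synthetic spectra that include a continuum background, skipping any background removal as the paper recommends. All data and pipeline choices are assumptions for illustration.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        # Synthetic stand-in: 60 spectra x 500 channels, two formulations whose
        # characteristic lines sit on the same broad continuum background.
        channels = np.arange(500)
        continuum = np.exp(-channels / 300.0)
        class_a = continuum + 0.1 * np.exp(-(channels - 120) ** 2 / 20.0)
        class_b = continuum + 0.1 * np.exp(-(channels - 130) ** 2 / 20.0)
        X = np.vstack([class_a + 0.02 * rng.standard_normal((30, 500)),
                       class_b + 0.02 * rng.standard_normal((30, 500))])
        y = np.array([0] * 30 + [1] * 30)

        clf = make_pipeline(PCA(n_components=10), SVC(kernel="linear"))
        print(cross_val_score(clf, X, y, cv=5).mean())   # classification accuracy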

  7. SAM Radiochemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.

  8. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL (EPA/600/SR-94/210)

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a groundwater flo...

  9. Hybridization chain reaction: a versatile molecular tool for biosensing, bioimaging, and biomedicine.

    PubMed

    Bi, Sai; Yue, Shuzhen; Zhang, Shusheng

    2017-07-17

    Developing powerful, simple and low-cost DNA amplification techniques is of great significance to bioanalysis and biomedical research. Thus far, many signal amplification strategies have been developed, such as polymerase chain reaction (PCR), rolling circle amplification (RCA), and DNA strand displacement amplification (SDA). In particular, hybridization chain reaction (HCR), a type of toehold-mediated strand displacement (TMSD) reaction, has attracted great interest because of its enzyme-free nature, isothermal conditions, simple protocols, and excellent amplification efficiency. In a typical HCR, an analyte initiates the cross-opening of two DNA hairpins, yielding nicked double helices that are analogous to alternating copolymers. As an efficient amplification platform, HCR has been utilized for the sensitive detection of a wide variety of analytes, including nucleic acids, proteins, small molecules, and cells. In recent years, more complicated sets of monomers have been designed to develop nonlinear HCR, such as branched HCR and even dendritic systems, achieving quadratic and exponential growth mechanisms. In addition, HCR has attracted enormous attention in the fields of bioimaging and biomedicine, including applications in fluorescence in situ hybridization (FISH) imaging, live cell imaging, and targeted drug delivery. In this review, we introduce the fundamentals of HCR and examine the visualization and analysis techniques for HCR products in detail. The most recent HCR developments in biosensing, bioimaging, and biomedicine are subsequently discussed with selected examples. Finally, the review provides insight into the challenges and future perspectives of HCR.

  10. Monitoring chemical reactions by low-field benchtop NMR at 45 MHz: pros and cons.

    PubMed

    Silva Elipe, Maria Victoria; Milburn, Robert R

    2016-06-01

    Monitoring chemical reactions is the key to controlling chemical processes, and NMR can provide support here. High-field NMR gives detailed structural information on chemical compounds and reactions; however, it is expensive and complex to operate. Conversely, low-field NMR instruments are simple and relatively inexpensive alternatives. While low-field NMR does not provide the detailed information that high-field instruments do, as a result of the smaller chemical shift dispersion and complex secondary coupling, it remains of practical value as a process analytical technology (PAT) tool and is complementary to other established methods, such as ReactIR and Raman spectroscopy. We have tested a picoSpin-45 (currently under ThermoFisher Scientific) benchtop NMR instrument to monitor three types of reactions by 1D (1) H NMR: a Fischer esterification, a Suzuki cross-coupling, and the formation of an oxime. The Fischer esterification is a relatively simple reaction run at high concentration and served as proof of concept. The Suzuki coupling is an example of a more complex, commonly used reaction involving overlapping signals. Finally, the oxime formation involved a reaction in two phases that cannot be monitored by other PAT tools. Here, we discuss the pros and cons of monitoring these reactions at a low field of 45 MHz by 1D (1) H NMR. Copyright © 2015 John Wiley & Sons, Ltd.

  11. Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data

    NASA Astrophysics Data System (ADS)

    Jern, Mikael

    Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels, aimed at measuring, through statistical indicators, economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use collaborative GeoAnalytics tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet: for example, dynamic web-enabled animation that enables statisticians to explore temporal, spatial and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues, and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of domain experts and are precious in a creative analytic reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing and sharing gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.

  12. Modeling of human operator dynamics in simple manual control utilizing time series analysis. [tracking (position)

    NASA Technical Reports Server (NTRS)

    Agarwal, G. C.; Osafo-Charles, F.; Oneill, W. D.; Gottlieb, G. L.

    1982-01-01

    Time series analysis is applied to model human operator dynamics in pursuit and compensatory tracking modes. The normalized residual criterion is used as a one-step analytical tool to encompass the processes of identification, estimation, and diagnostic checking. A parameter constraining technique is introduced to develop more reliable models of human operator dynamics. The human operator is adequately modeled by a second order dynamic system both in pursuit and compensatory tracking modes. In comparing the data sampling rates, 100 msec between samples is adequate and is shown to provide better results than 200 msec sampling. The residual power spectrum and eigenvalue analysis show that the human operator is not a generator of periodic characteristics.
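
    In the spirit of the time-series identification described above, the sketch below fits a second-order discrete-time model to simulated tracking data by ordinary least squares; the simulated system and model structure are illustrative assumptions, not the study's records.

        import numpy as np

        rng = np.random.default_rng(2)
        # Simulate a stable second-order ARX process:
        # y[k] = a1*y[k-1] + a2*y[k-2] + b*u[k-1] + e[k]
        N = 1000
        u = rng.standard_normal(N)
        y = np.zeros(N)
        a1_true, a2_true, b_true = 1.5, -0.7, 0.5
        for k in range(2, N):
            y[k] = (a1_true * y[k-1] + a2_true * y[k-2]
                    + b_true * u[k-1] + 0.05 * rng.standard_normal())

        # Least-squares estimate of (a1, a2, b) from the regression matrix.
        Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
        theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
        print(theta)   # should be close to (1.5, -0.7, 0.5)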

  13. Optimal low thrust geocentric transfer. [mission analysis computer program

    NASA Technical Reports Server (NTRS)

    Edelbaum, T. N.; Sackett, L. L.; Malchow, H. L.

    1973-01-01

    A computer code which will rapidly calculate time-optimal low thrust transfers is being developed as a mission analysis tool. The final program will apply to NEP or SEP missions and will include a variety of environmental effects. The current program assumes constant acceleration. The oblateness effect and shadowing may be included. Detailed state and costate equations are given for the thrust effect, oblateness effect, and shadowing. A simple but adequate model yields analytical formulas for power degradation due to the Van Allen radiation belts for SEP missions. The program avoids the classical singularities by the use of equinoctial orbital elements. Kryloff-Bogoliuboff averaging is used to facilitate rapid calculation. Results for selected cases using the current program are given.
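
    For context, the classical Edelbaum closed-form estimate for a constant-acceleration, circle-to-circle low-thrust transfer with plane change (due to the first author) can be written down directly. The sketch below is that textbook formula, not the mission analysis program itself, and the orbit values are placeholders.

        import math

        MU_EARTH = 3.986004418e14   # gravitational parameter, m^3/s^2

        def edelbaum_delta_v(r0, r1, delta_i_deg):
            # Edelbaum's delta-v between circular orbits with inclination change:
            # dv = sqrt(v0^2 - 2*v0*v1*cos(pi*di/2) + v1^2)
            v0 = math.sqrt(MU_EARTH / r0)    # initial circular speed
            v1 = math.sqrt(MU_EARTH / r1)    # final circular speed
            di = math.radians(delta_i_deg)
            return math.sqrt(v0**2 - 2.0 * v0 * v1 * math.cos(math.pi * di / 2.0) + v1**2)

        # Placeholder example: 7000 km LEO to GEO radius with a 28.5 deg plane change.
        print(edelbaum_delta_v(7.0e6, 4.2164e7, 28.5) / 1000.0, "km/s")

    With these placeholders the transfer comes out near 5.8 km/s, the familiar order of magnitude for a low-thrust LEO-to-GEO spiral with plane change.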

  14. A quantitative witness for Greenberger-Horne-Zeilinger entanglement.

    PubMed

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.

  15. Stochastic dynamics of cholera epidemics

    NASA Astrophysics Data System (ADS)

    Azaele, Sandro; Maritan, Amos; Bertuzzo, Enrico; Rodriguez-Iturbe, Ignacio; Rinaldo, Andrea

    2010-05-01

    We describe the predictions of an analytically tractable stochastic model for cholera epidemics following a single initial outbreak. The exact model relies on a set of assumptions that may restrict the generality of the approach and yet provides a realm of powerful tools and results. Without resorting to the depletion of susceptible individuals, as usually assumed in deterministic susceptible-infected-recovered models, we show that a simple stochastic equation for the number of ill individuals provides a mechanism for the decay of the epidemics occurring on the typical time scale of seasonality. The model is shown to provide a reasonably accurate description of the empirical data of the 2000/2001 cholera epidemic which took place in the KwaZulu-Natal Province, South Africa, with possibly notable epidemiological implications.
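
    As a toy illustration of a single-variable stochastic description of this kind (not the authors' exact model), the sketch below integrates a linear birth-death equation for the infected count with a decaying transmission rate using the Euler-Maruyama scheme; all rates are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        dt, days = 0.01, 365.0
        steps = int(days / dt)
        I = np.empty(steps)
        I[0] = 100.0                      # initial outbreak size (illustrative)
        gamma = 0.10                      # removal rate, 1/day (illustrative)
        for k in range(1, steps):
            beta = 0.12 * np.exp(-k * dt / 120.0)   # decaying transmission (toy seasonality)
            drift = (beta - gamma) * I[k - 1]
            noise = 0.5 * np.sqrt(max(I[k - 1], 0.0)) * rng.standard_normal()
            I[k] = max(I[k - 1] + drift * dt + noise * np.sqrt(dt), 0.0)
        print(I.max(), I[-1])   # rises for ~3 weeks, then decays over the season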

  16. Direct determination of trace phthalate esters in alcoholic spirits by spray-inlet microwave plasma torch ionization tandem mass spectrometry.

    PubMed

    Miao, Meng; Zhao, Gaosheng; Xu, Li; Dong, Junguo; Cheng, Ping

    2018-03-01

    A direct analytical method based on spray-inlet microwave plasma torch tandem mass spectrometry was applied to simultaneously determine 4 phthalate esters (PAEs), namely, benzyl butyl phthalate, diethyl phthalate, dipentyl phthalate, and dodecyl phthalate, with extremely high sensitivity in spirits without sample treatment. Among the 4 brands of spirit products, 3 kinds of PAE compounds were directly determined at very low concentrations, from 1.30 to 114 ng/g. Compared with other online and off-line methods, the spray-inlet microwave plasma torch tandem mass spectrometry technique is extremely simple, rapid, sensitive, and highly efficient, providing an ideal screening tool for PAEs in spirits. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Hydrolyzable tannins with the hexahydroxydiphenoyl unit and the m-depsidic link: HPLC-DAD-MS identification and model synthesis.

    PubMed

    Arapitsas, Panagiotis; Menichetti, Stefano; Vincieri, Franco F; Romani, Annalisa

    2007-01-10

    This study was designed to develop efficient analytical tools for the difficult HPLC-DAD-MS identification of hydrolyzable tannins in natural tissue extracts. Throughout the study of the spectroscopic characteristics of properly synthesized stereodefined standards, it was observed that the UV-vis spectra of compounds with the m-depsidic link showed a characteristic shoulder at 300 nm, consistent with the simple glucogalloyl esters, whereas compounds with the hexahydroxydiphenoyl (HHDP) unit gave a diagnostic fragmentation pattern, caused by a spontaneous lactonization in the mass spectrometer. These observations were confirmed by HPLC-DAD-MS analyses of tannic acid and raspberry extracts, which are rich in hydrolyzable tannins with the m-depsidic link and the HHDP unit, respectively.

  18. Applications of capillary electrophoresis in characterizing recombinant protein therapeutics.

    PubMed

    Zhao, Shuai Sherry; Chen, David D Y

    2014-01-01

    The use of recombinant protein for therapeutic applications has increased significantly in the last three decades. The heterogeneity of these proteins, often caused by the complex biosynthesis pathways and the subsequent PTMs, poses a challenge for drug characterization to ensure its safety, quality, integrity, and efficacy. CE, with its simple instrumentation, superior separation efficiency, small sample consumption, and short analysis time, is a well-suited analytical tool for therapeutic protein characterization. Different separation modes, including CIEF, SDS-CGE, CZE, and CE-MS, provide complementary information of the proteins. The CE applications for recombinant therapeutic proteins from the year 2000 to June 2013 are reviewed and technical concerns are discussed in this article. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Information integration for a sky survey by data warehousing

    NASA Astrophysics Data System (ADS)

    Luo, A.; Zhang, Y.; Zhao, Y.

    The virtualization service of the data system for the sky survey LAMOST is very important for astronomers. The service needs to integrate information from data collections, catalogs, and references, and to support simple federation of a set of distributed files and associated metadata. Data warehousing has existed for several years and has demonstrated superiority over traditional relational database management systems by providing novel indexing schemes that support efficient on-line analytical processing (OLAP) of large databases. Relational database systems such as Oracle now support warehouse capabilities, including extensions to the SQL language for OLAP operations, and a number of metadata management tools have been created. The information integration of LAMOST by applying data warehousing aims to effectively provide data and knowledge on-line.

  20. A quantitative witness for Greenberger-Horne-Zeilinger entanglement

    PubMed Central

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger–type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties. PMID:23267431

  1. Teaching Theory Construction With Initial Grounded Theory Tools: A Reflection on Lessons and Learning.

    PubMed

    Charmaz, Kathy

    2015-12-01

    This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.

  2. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  3. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...

  4. Web Analytics

    EPA Pesticide Factsheets

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  5. Consistent Yokoya-Chen Approximation to Beamstrahlung(LCC-0010)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peskin, M

    2004-04-22

    I reconsider the Yokoya-Chen approximate evolution equation for beamstrahlung and modify it slightly to generate simple, consistent analytical approximations for the electron and photon energy spectra. I compare these approximations to previous ones, and to simulation data.

  6. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
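
    A toy sketch of the two atomic operators named above, selection and aggregation, over a minimal node-link structure; the dict-based representation and attribute names are assumptions for illustration, not the paper's formal algebra.

        from collections import defaultdict

        # A tiny node-link graph: nodes carry attributes, edges are (src, dst) pairs.
        nodes = {
            "n1": {"country": "A", "value": 10},
            "n2": {"country": "A", "value": 5},
            "n3": {"country": "B", "value": 7},
        }
        edges = [("n1", "n2"), ("n2", "n3")]

        def select(nodes, predicate):
            # Selection: keep nodes whose attributes satisfy a predicate.
            return {k: v for k, v in nodes.items() if predicate(v)}

        def aggregate(nodes, key, reducer=sum):
            # Aggregation: group nodes by an attribute and reduce a value.
            groups = defaultdict(list)
            for attrs in nodes.values():
                groups[attrs[key]].append(attrs["value"])
            return {g: reducer(vals) for g, vals in groups.items()}

        print(select(nodes, lambda a: a["value"] > 6))   # keeps n1 and n3
        print(aggregate(nodes, "country"))               # {'A': 15, 'B': 7}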

  7. COBRA ATD multispectral camera response model

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response of five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves which were generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data-fitting techniques were applied to these measured response curves to obtain nonlinear expressions that estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. This COBRA camera response model proved to be very accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. The model was subsequently incorporated into the COBRA sensor performance evaluation and research analysis modeling toolbox in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response to measured experimental data are presented.
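
    The report's functional form is not given here; as a hedged illustration of the data-fitting step described, the sketch below fits a hypothetical power-law response D = a*(E*G*t)^gamma + d to synthetic calibration data with SciPy.

        import numpy as np
        from scipy.optimize import curve_fit

        def response(x, a, gamma, d):
            # Hypothetical response: digitized output vs. irradiance*gain*exposure.
            return a * np.power(x, gamma) + d

        rng = np.random.default_rng(4)
        x = np.linspace(1.0, 100.0, 50)       # combined irradiance*gain*exposure (arb.)
        y = response(x, 12.0, 0.7, 30.0) + rng.normal(0.0, 2.0, x.size)

        popt, _ = curve_fit(response, x, y, p0=[10.0, 1.0, 0.0])
        print(popt)   # recovered (a, gamma, d), near (12.0, 0.7, 30.0)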

  8. Multivariate curve resolution-assisted determination of pseudoephedrine and methamphetamine by HPLC-DAD in water samples.

    PubMed

    Vosough, Maryam; Mohamedian, Hadi; Salemi, Amir; Baheri, Tahmineh

    2015-02-01

    In the present study, a simple strategy based on solid-phase extraction (SPE) with a cation exchange sorbent (Finisterre SCX) followed by fast high-performance liquid chromatography (HPLC) with diode array detection, coupled with chemometric tools, has been proposed for the determination of methamphetamine and pseudoephedrine in ground water and river water. At first, the HPLC and SPE conditions were optimized and the analytical performance of the method was determined. In the case of ground water, determination of the analytes was successfully performed through univariate calibration curves. For the river water sample, multivariate curve resolution-alternating least squares (MCR-ALS) was implemented and the second-order advantage was achieved in samples containing uncalibrated interferences and uncorrected background signals. The calibration curves showed good linearity (r^2 > 0.994). The limits of detection for pseudoephedrine and methamphetamine were 0.06 and 0.08 μg/L, and the average recovery values were 104.7 and 102.3% in river water, respectively. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Solvent signal suppression for high-resolution MAS-DNP

    NASA Astrophysics Data System (ADS)

    Lee, Daniel; Chaudhari, Sachin R.; De Paëpe, Gaël

    2017-05-01

    Dynamic nuclear polarization (DNP) has become a powerful tool to substantially increase the sensitivity of high-field magic angle spinning (MAS) solid-state NMR experiments. The addition of dissolved hyperpolarizing agents usually results in the presence of solvent signals that can overlap and obscure those of interest from the analyte. Here, two methods are proposed to suppress DNP solvent signals: a Forced Echo Dephasing experiment (FEDex) and TRAnsfer of Populations in DOuble Resonance Echo Dephasing (TRAPDORED) NMR. These methods reintroduce a heteronuclear dipolar interaction that is specific to the solvent, thereby forcing a dephasing of recoupled solvent spins and leaving acquired NMR spectra free of associated resonance overlap with the analyte. The potency of these methods is demonstrated on sample types common to MAS-DNP experiments, namely a frozen solution (of L-proline) and a powdered solid (progesterone), both containing deuterated glycerol as a DNP solvent. The proposed methods are efficient, simple to implement, compatible with other NMR experiments, and extendable beyond spectral editing of DNP solvents alone. The sensitivity gains from MAS-DNP in conjunction with FEDex or TRAPDORED then permit rapid and uninterrupted sample analysis.

  10. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to the possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
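
    A minimal sketch of the comparison logic such a spreadsheet tool could implement, assuming a table of serum potassium results with sender and date columns; the column names, synthetic values, and any action limit are illustrative assumptions.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(5)
        # Synthetic potassium results: hospital lab (controlled pre-analytics)
        # vs. two sending practices; practice_B mimics delayed centrifugation.
        df = pd.DataFrame({
            "sender": np.repeat(["hospital", "practice_A", "practice_B"], 200),
            "date": pd.Timestamp("2024-01-01")
                    + pd.to_timedelta(rng.integers(0, 180, 600), unit="D"),
            "potassium": np.concatenate([rng.normal(4.2, 0.4, 200),
                                         rng.normal(4.2, 0.4, 200),
                                         rng.normal(4.7, 0.5, 200)]),
        })

        monthly = (df.set_index("date")
                     .groupby("sender")["potassium"]
                     .resample("ME")            # month-end bins ("M" on older pandas)
                     .median()
                     .unstack(0))
        deviation_pct = (monthly.sub(monthly["hospital"], axis=0)
                         / monthly["hospital"] * 100.0)
        print(deviation_pct.round(1))           # flag senders beyond, e.g., +/-5%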

  11. EPA Tools and Resources Webinar: EPA’s Environmental Sampling and Analytical Methods for Environmental Remediation and Recovery

    EPA Pesticide Factsheets

    EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.

  12. PyFolding: Open-Source Graphing, Simulation, and Analysis of the Biophysical Properties of Proteins.

    PubMed

    Lowe, Alan R; Perez-Riba, Albert; Itzhaki, Laura S; Main, Ewan R G

    2018-02-06

    For many years, curve-fitting software has been heavily utilized to fit simple models to various types of biophysical data. Although such software packages are easy to use for simple functions, they are often expensive and present substantial impediments to applying more complex models or for the analysis of large data sets. One field that is reliant on such data analysis is the thermodynamics and kinetics of protein folding. Over the past decade, increasingly sophisticated analytical models have been generated, but without simple tools to enable routine analysis. Consequently, users have needed to generate their own tools or otherwise find willing collaborators. Here we present PyFolding, a free, open-source, and extensible Python framework for graphing, analysis, and simulation of the biophysical properties of proteins. To demonstrate the utility of PyFolding, we have used it to analyze and model experimental protein folding and thermodynamic data. Examples include: 1) multiphase kinetic folding fitted to linked equations, 2) global fitting of multiple data sets, and 3) analysis of repeat protein thermodynamics with Ising model variants. Moreover, we demonstrate how PyFolding is easily extensible to novel functionality beyond applications in protein folding via the addition of new models. Example scripts to perform these and other operations are supplied with the software, and we encourage users to contribute notebooks and models to create a community resource. Finally, we show that PyFolding can be used in conjunction with Jupyter notebooks as an easy way to share methods and analysis for publication and among research teams. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
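
    To give a flavor of the kind of model fitting such a framework automates, the self-contained sketch below fits a two-state equilibrium unfolding curve with SciPy. It deliberately does not guess PyFolding's actual API; the thermodynamic values and data are synthetic.

        import numpy as np
        from scipy.optimize import curve_fit

        RT = 8.314e-3 * 298.0   # kJ/mol at 25 C

        def frac_unfolded(D, dG_h2o, m):
            # Two-state unfolding: dG(D) = dG_h2o - m*D (kJ/mol), D = denaturant (M).
            dG = dG_h2o - m * D
            return 1.0 / (1.0 + np.exp(dG / RT))

        rng = np.random.default_rng(6)
        D = np.linspace(0.0, 8.0, 40)
        y = frac_unfolded(D, 20.0, 5.0) + rng.normal(0.0, 0.02, D.size)

        popt, _ = curve_fit(frac_unfolded, D, y, p0=[15.0, 4.0])
        print(popt)   # recovered (dG_h2o, m), near (20.0, 5.0)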

  13. Analytical technique to address terrorist threats by chemical weapons of mass destruction

    NASA Astrophysics Data System (ADS)

    Dempsey, Patrick M.

    1997-01-01

    Terrorism is no longer an issue without effect on the American mind. We now live with the same concerns and fears that have long been commonplace in other developed and third-world countries. Citizens of other countries have long lived with the specter of terrorism, and now the U.S. needs to be concerned about and prepared for terrorist activities. Terrorists have the ability to cause great destructive effects by focusing their efforts on unaware and unprepared civilian populations. Attacks can range from simple explosives to sophisticated nuclear, chemical, and biological weapons. Intentional releases of hazardous chemicals or chemical warfare agents pose a great threat because of their ready availability and/or ease of production, and their ability to cause widespread damage. As this battlefront changes from defined conflicts and enemies to unnamed terrorists, we must implement the proper analytical tools to provide a fast and efficient response. Each chemical used in a terrorist weapon leaves behind a chemical signature that can be used to identify the materials involved and possibly lead investigators to the source and to those responsible. New tools that provide fast and accurate detection of battlefield chemical and biological agent attack are emerging. Gas chromatography/mass spectrometry (GC/MS) is one of these tools and has found increasing use by the military in responding to chemical agent attacks. As the technology becomes smaller and more portable, it can be used by law enforcement personnel to identify suspected terrorist releases and to help prepare the response: defining contaminated areas for evacuation and safety, identifying the proper treatment of exposed or affected civilians, and suggesting decontamination and cleanup procedures.

  14. Test of a potential link between analytic and nonanalytic category learning and automatic, effortful processing.

    PubMed

    Tracy, J I; Pinsk, M; Helverson, J; Urban, G; Dietz, T; Smith, D J

    2001-08-01

    The link between automatic and effortful processing and nonanalytic and analytic category learning was evaluated in a sample of 29 college undergraduates using declarative memory, semantic category search, and pseudoword categorization tasks. Automatic and effortful processing measures were hypothesized to be associated with nonanalytic and analytic categorization, respectively. Results suggested that, contrary to prediction, strong criterion-attribute (analytic) responding on the pseudoword categorization task was associated with strong automatic, implicit memory encoding of frequency-of-occurrence information. Data are discussed in terms of the possibility that criterion-attribute category knowledge, once established, may be expressed with few attentional resources. The data indicate that attention resource requirements, even for the same stimuli and task, vary depending on the category rule system utilized. Also, the automaticity emerging from familiarity with analytic category exemplars is very different from the automaticity arising from extensive practice on a semantic category search task. The data do not support any simple mapping of analytic and nonanalytic forms of category learning onto the automatic and effortful processing dichotomy, and they challenge simple models of brain asymmetries for such procedures. Copyright 2001 Academic Press.

  15. [Model of Analysis and Prevention of Accidents - MAPA: tool for operational health surveillance].

    PubMed

    de Almeida, Ildeberto Muniz; Vilela, Rodolfo Andrade de Gouveia; da Silva, Alessandro José Nunes; Beltran, Sandra Lorena

    2014-12-01

    The analysis of work-related accidents is important for accident surveillance and prevention. Current methods of analysis seek to overcome reductionist views that see these occurrences as simple events explained by operator error. The objective of this paper is to analyze the Model of Analysis and Prevention of Accidents (MAPA) and its use in monitoring interventions, highlighting aspects experienced in the use of the tool. The descriptive analytical method was used, introducing the steps of the model. To illustrate contributions and/or difficulties, cases where the tool was used in the context of service were selected. MAPA integrates theoretical approaches that have already been tried in studies of accidents by providing useful conceptual support from the data collection stage through the conclusion and intervention stages. Besides revealing weaknesses of the traditional approach, it helps identify organizational determinants, such as management failings, system design, and safety management involved in the accident. The main challenges lie in users' grasp of the concepts, in exploring organizational aspects upstream in the chain of decisions or at higher levels of the hierarchy, and in intervening to change the determinants of these events.

  16. MOLEonline: a web-based tool for analyzing channels, tunnels and pores (2018 update).

    PubMed

    Pravda, Lukáš; Sehnal, David; Toušek, Dominik; Navrátilová, Veronika; Bazgier, Václav; Berka, Karel; Svobodová Vareková, Radka; Koca, Jaroslav; Otyepka, Michal

    2018-04-30

    MOLEonline is an interactive, web-based application for the detection and characterization of channels (pores and tunnels) within biomacromolecular structures. The updated version of MOLEonline overcomes limitations of the previous version by incorporating the recently developed LiteMol Viewer visualization engine and providing a simple, fully interactive user experience. The application enables two modes of calculation: one is dedicated to the analysis of channels, while the other was specifically designed for transmembrane pores. As the application can use both PDB and mmCIF formats, it can be leveraged to analyze a wide spectrum of biomacromolecular structures, e.g., stemming from NMR, X-ray and cryo-EM techniques. The tool is interconnected with other bioinformatics tools (e.g., PDBe, CSA, ChannelsDB, OPM, UniProt) to help with both setup and the analysis of acquired results. MOLEonline provides unprecedented analytics for the detection and structural characterization of channels, as well as information about their numerous physicochemical features. Here we present the application of MOLEonline for structural analyses of α-hemolysin and transient receptor potential mucolipin 1 (TRPML1) pores. The MOLEonline application is freely available via the Internet at https://mole.upol.cz.

  17. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    This report provides an overview of Total Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge.

  18. Development of Advanced Life Prediction Tools for Elastic-Plastic Fatigue Crack Growth

    NASA Technical Reports Server (NTRS)

    Gregg, Wayne; McGill, Preston; Swanson, Greg; Wells, Doug; Throckmorton, D. A. (Technical Monitor)

    2001-01-01

    The objective of this viewgraph presentation is to develop a systematic approach to improving the fracture control process, including analytical tools, standards, guidelines, and awareness. Analytical tools for elastic-plastic fracture analysis are a particular need: this regime is currently handled empirically for the Space Shuttle External Tank (ET) by simulated service testing of pre-cracked panels.

  19. The Role of Nanoparticle Design in Determining Analytical Performance of Lateral Flow Immunoassays.

    PubMed

    Zhan, Li; Guo, Shuang-Zhuang; Song, Fayi; Gong, Yan; Xu, Feng; Boulware, David R; McAlpine, Michael C; Chan, Warren C W; Bischof, John C

    2017-12-13

    Rapid, simple, and cost-effective diagnostics are needed to improve healthcare at the point of care (POC). However, the most widely used POC diagnostic, the lateral flow immunoassay (LFA), is ∼1000 times less sensitive and has a smaller analytical range than laboratory tests, requiring a confirmatory test to establish truly negative results. Here, a rational and systematic strategy is used to design the LFA contrast label (i.e., gold nanoparticles) to improve the analytical sensitivity, analytical detection range, and antigen quantification of LFAs. Specifically, we discovered that the size (30, 60, or 100 nm) of the gold nanoparticles is a main contributor to the LFA analytical performance through both the degree of receptor interaction and the ultimate visual or thermal contrast signals. Using the optimal LFA design, we demonstrated the ability to improve the analytical sensitivity by 256-fold and expand the analytical detection range from 3 log10 to 6 log10 for diagnosing patients with inflammatory conditions by measuring C-reactive protein. This work demonstrates that, with appropriate design of the contrast label, a simple and commonly used diagnostic technology can compete with more expensive state-of-the-art laboratory tests.

  20. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  1. ThinkHazard!: an open-source, global tool for understanding hazard information

    NASA Astrophysics Data System (ADS)

    Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Nunez, Ariel; Deparday, Vivien; Saito, Keiko; Murnane, Richard; Balog, Simone

    2016-04-01

    Rapid and simple access to added-value natural hazard and disaster risk information is a key issue for various stakeholders of the development and disaster risk management (DRM) domains. Accessing available data often requires specialist knowledge of heterogeneous data, which are often highly technical and can be difficult for non-specialists in DRM to find and exploit. Thus, availability, accessibility and processing of these information sources are crucial issues, and an important reason why many development projects suffer significant impacts from natural hazards. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) is currently developing a new open-source tool to address this knowledge gap: ThinkHazard! The main aim of the ThinkHazard! project is to develop an analytical tool dedicated to facilitating improvements in knowledge and understanding of natural hazards among non-specialists in DRM. It also aims at providing users with relevant guidance and information on handling the threats posed by the natural hazards present in a chosen location. Furthermore, all aspects of this tool will be open and transparent, in order to give users enough information to understand its operational principles. In this presentation, we will explain the technical approach behind the tool, which translates state-of-the-art probabilistic natural hazard data into understandable hazard classifications and practical recommendations. We will also demonstrate the functionality of the tool, and discuss limitations from a scientific as well as an operational perspective.

  2. Forgetfulness can help you win games.

    PubMed

    Burridge, James; Gao, Yu; Mao, Yong

    2015-09-01

    We present a simple game model where agents with different memory lengths compete for finite resources. We show by simulation and analytically that an instability exists at a critical memory length, and as a result, different memory lengths can compete and coexist in a dynamical equilibrium. Our analytical formulation makes a connection to statistical urn models, and we show that temperature is mirrored by the agent's memory. Our simple model of memory may be incorporated into other game models with implications that we briefly discuss.

  3. Simple functionalization method for single conical pores with a polydopamine layer

    NASA Astrophysics Data System (ADS)

    Horiguchi, Yukichi; Goda, Tatsuro; Miyahara, Yuji

    2018-04-01

    Resistive pulse sensing (RPS) is an interesting analytical system in which micro- to nanosized pores are used to evaluate particles or small analytes. Recently, molecular immobilization techniques to improve the performance of RPS have been reported. The problem in functionalization for RPS is that molecular immobilization by chemical reaction is restricted by the pore material type. Herein, a simple functionalization is performed using mussel-inspired polydopamine as an intermediate layer to connect the pore material with functional molecules.

  4. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  5. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation, to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, in which known patterns can be mined for; moreover, with a human in the loop, it can bring domain knowledge and subject matter expertise to bear. Visual analytics has not been widely applied to this domain. In this paper, we focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that must be addressed before broad adoption of these kinds of tools can occur, and offers next steps to overcome these challenges.

  6. Experimental Validation of Lightning-Induced Electromagnetic (Indirect) Coupling to Short Monopole Antennas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crull, E W; Brown Jr., C G; Perkins, M P

    2008-07-30

    For short monopoles in this low-power case, it has been shown that a simple circuit model is capable of accurately predicting the shape and magnitude of the antenna response to lightning-generated electric field coupling effects, provided that the elements of the circuit model have accurate values. Numerical EM simulation can be used to provide more accurate values for the circuit elements than the simple analytical formulas, since the analytical formulas are used outside their region of validity. However, even with the approximate analytical formulas the simple circuit model produces reasonable results, which would improve if more accurate analytical models were used. This report discusses the coupling analysis approaches taken to understand the interaction between a time-varying EM field and a short monopole antenna, within the context of lightning safety for nuclear weapons at DOE facilities. It describes the validation of a simple circuit model in a laboratory study, undertaken to understand the indirect coupling of energy into a part and the resulting voltage. Results show that in this low-power case, the circuit model predicts peak voltages within approximately 32% using circuit component values obtained from analytical formulas, and within about 13% using circuit component values obtained from numerical EM simulation. We note that the analytical formulas are used outside their region of validity. First, the antenna is insulated rather than a bare wire, and there are possible fringing field effects near the termination of the outer conductor that the formula does not take into account. Also, the effective height formula is for a monopole directly over a ground plane, while in the time-domain measurement setup the monopole is elevated above the ground plane by about 1.5 inches (refer to Figure 5).
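
    To make the circuit picture concrete, the following sketch integrates the response of such a model: the incident field drives an open-circuit voltage V_s(t) = h_eff * E(t) behind the antenna capacitance C_a, which forms a high-pass divider with a resistive load. The component values and pulse shape are illustrative assumptions, not the report's measured parameters.

```python
import math

# Minimal sketch of a short-monopole circuit model of the kind validated in
# the report: a Thevenin source V_s(t) = h_eff * E(t) behind the antenna
# capacitance C_a, driving a resistive load R. All values are assumptions.

H_EFF = 0.05      # effective height (m), assumed
C_A = 10e-12      # antenna capacitance (F), assumed
R_LOAD = 50.0     # load resistance (ohm), assumed
DT = 1e-11        # integration time step (s)

def e_field(t):
    """Toy double-exponential pulse standing in for a lightning E-field."""
    return 1e4 * (math.exp(-t / 5e-8) - math.exp(-t / 5e-9))

v_r, t, peak = 0.0, 0.0, 0.0
v_s_prev = H_EFF * e_field(0.0)
for _ in range(20000):
    t += DT
    v_s = H_EFF * e_field(t)
    # forward-Euler step of the high-pass response:
    # dV_R/dt = dV_s/dt - V_R / (R * C_a)
    v_r += (v_s - v_s_prev) - DT * v_r / (R_LOAD * C_A)
    v_s_prev = v_s
    peak = max(peak, abs(v_r))

print(f"peak load voltage = {peak:.3f} V")
```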

  7. Pilot testing of SHRP 2 reliability data and analytical products: Minnesota.

    DOT National Transportation Integrated Search

    2015-01-01

    The Minnesota pilot site has undertaken an effort to test data and analytical tools developed through the Strategic Highway Research Program (SHRP) 2 Reliability focus area. The purpose of these tools is to facilitate the improvement of travel time reliability...

  8. Characterization of echoes: A Dyson-series representation of individual pulses

    NASA Astrophysics Data System (ADS)

    Correia, Miguel R.; Cardoso, Vitor

    2018-04-01

    The ability to detect and scrutinize gravitational waves from the merger and coalescence of compact binaries opens up the possibility to perform tests of fundamental physics. One such test concerns the dark nature of compact objects: are they really black holes? It was recently pointed out that the absence of horizons, while keeping the external geometry very close to that of General Relativity, would manifest itself in a series of echoes in gravitational wave signals. The observation of echoes by LIGO/Virgo or upcoming facilities would likely inform us about quantum gravity effects or unseen types of matter. Detection of such signals is in principle feasible with relatively simple tools but would benefit enormously from accurate templates. Here we analytically individualize each echo waveform and show that it can be written as a Dyson series, for arbitrary effective potential and boundary conditions. We further apply the formalism to explicitly determine the echoes of a simple toy model: the Dirac delta potential. Our results allow one to read off a few known features of echoes and may find application in modeling for data analysis.
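
    The qualitative structure described, a primary ringdown followed by successively delayed and damped copies, is easy to visualize. The sketch below is purely illustrative and does not implement the paper's Dyson-series formalism; the frequency, damping time, delay and reflectivity are assumptions.

```python
import numpy as np

# Illustrative sketch only: superpose successively delayed and damped copies
# of a ringdown pulse to show the qualitative echo structure. This is not
# the paper's Dyson-series construction; all parameters are assumptions.

t = np.linspace(0.0, 400.0, 8000)
f, tau = 0.3, 10.0       # ringdown frequency and damping time (assumed)
t_echo, R = 60.0, 0.6    # round-trip delay and effective reflectivity (assumed)

def ringdown(t0):
    """Damped sinusoid switched on at t0."""
    return np.where(t >= t0,
                    np.exp(-(t - t0) / tau) * np.sin(2 * np.pi * f * (t - t0)),
                    0.0)

signal = ringdown(0.0)
for n in range(1, 6):                  # add the first five echoes
    signal += (R ** n) * ringdown(n * t_echo)

print("amplitude of 3rd echo relative to primary =", R ** 3)
```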

  9. Moment expansion for ionospheric range error

    NASA Technical Reports Server (NTRS)

    Mallinckrodt, A.; Reich, R.; Parker, H.; Berbert, J.

    1972-01-01

    On a plane earth, the ionospheric or tropospheric range error depends only on the total refractivity content, or zeroth moment, of the refracting layer and the elevation angle. On a spherical earth, however, the dependence is more complex, so for more accurate results it has been necessary to resort to complex ray-tracing calculations. A simple, high-accuracy alternative to the ray-tracing calculation is presented. By appropriate expansion of the angular dependence in the ray-tracing integral in a power series in height, an expression is obtained for the range error in terms of a simple function of elevation angle, E, at the expansion height and of the mth moment of the refractivity, N, distribution about the expansion height. The rapidity of convergence depends heavily on the choice of expansion height. For expansion heights in the neighborhood of the centroid of the layer (300-490 km), the expansion to m = 2 (three terms) gives results accurate to about 0.4% at E = 10 deg. As an analytic tool, the expansion affords some insight into the influence of layer shape on range errors in special problems.
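
    The moments the expansion combines are straightforward to compute. The sketch below evaluates the zeroth through second moments of an assumed Chapman-layer refractivity profile about a candidate expansion height; the elevation-dependent coefficient functions that multiply these moments are not given in the abstract and are therefore omitted. Note that the first moment vanishes at the centroid, which is one way to see why centroid-region expansion heights converge quickly.

```python
import numpy as np

# Sketch: compute the refractivity moments M_m about an expansion height h0,
# the quantities the abstract's expansion combines with elevation-dependent
# coefficients. The Chapman-layer profile and all numbers are assumptions;
# the coefficient functions themselves are not reproduced here.

h = np.linspace(80e3, 1200e3, 5000)            # height grid (m)
h_peak, H = 350e3, 60e3                        # layer peak and scale height (assumed)
z = (h - h_peak) / H
N = 1e2 * np.exp(0.5 * (1 - z - np.exp(-z)))   # model refractivity profile

h0 = np.trapz(h * N, h) / np.trapz(N, h)       # centroid as expansion height
moments = [np.trapz(N * (h - h0) ** m, h) for m in range(3)]

print("expansion height (centroid): %.0f km" % (h0 / 1e3))
for m, M in enumerate(moments):
    print(f"moment M_{m} = {M:.3e}")           # M_1 is ~0 about the centroid
```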

  10. [Developments in preparation and experimental method of solid phase microextraction fibers].

    PubMed

    Yi, Xu; Fu, Yujie

    2004-09-01

    Solid phase microextraction (SPME) is a simple and effective adsorption and desorption technique, which concentrates volatile or nonvolatile compounds from liquid samples or from the headspace of samples. SPME is compatible with analyte separation and detection by gas chromatography, high performance liquid chromatography, and other instrumental methods. It provides many advantages, such as a wide linear range, low solvent and sample consumption, short analysis times, low detection limits, and simple apparatus. The theory of SPME is introduced, covering both equilibrium and non-equilibrium theory. Recent developments in fiber preparation methods and related experimental techniques are discussed. In addition to commercial fibers, newly developed fabrication techniques, such as sol-gel coating, electrodeposition, carbon-based adsorption, and high-temperature epoxy immobilization, are presented. The choice of extraction mode, selection of the fiber coating, optimization of operating conditions, method sensitivity and precision, and automation of the system are taken into consideration in the analytical process of SPME. Finally, a brief perspective on the future of SPME is offered.

  11. Fabrication of Protein Microparticles and Microcapsules with Biomolecular Tools

    NASA Astrophysics Data System (ADS)

    Cheung, Kwan Yee; Lai, Kwok Kei; Mak, Wing Cheung

    2018-05-01

    Microparticles have attracted much attention for medical, analytical and biological applications. The calcium carbonate (CaCO3) templating method, with the advantages of narrow size distribution, controlled morphology and good biocompatibility, has been widely used for the synthesis of various protein-based microparticles. Although the CaCO3 template is biocompatible, most conventional methods for creating stable protein microparticles are driven by chemical crosslinking reagents, which may induce harmful effects and remain undesirable, especially for biomedical or clinical applications. In this article, we demonstrate the fabrication of protein microparticles and microcapsules with an innovative method that uses biomolecular tools such as enzymes and affinity molecules to trigger the assembly of protein molecules within a porous CaCO3 template, followed by a template removal step. We demonstrate the enzyme-assisted fabrication of collagen microparticles triggered by transglutaminase, as well as the affinity-assisted fabrication of BSA-biotin/avidin microcapsules triggered by the biotin-avidin affinity interaction. Owing to the different protein assembly mechanisms, the collagen microparticles appear as solid-structured particles, while the BSA-biotin/avidin microcapsules exhibit a hollow-structured morphology. The fabrication procedures are simple and robust, allowing protein microparticles or microcapsules to be produced under mild conditions at physiological pH and temperature. In addition, the microparticle morphologies, protein compositions and assembly mechanisms were studied. Our technology provides a facile approach to designing and fabricating protein microparticles and microcapsules that are useful in the areas of biomaterials, pharmaceuticals and analytical chemistry.

  12. Low-parachor solvents extraction and thermostated micro-thin-layer chromatography separation for fast screening and classification of spirulina from pharmaceutical formulations and food samples.

    PubMed

    Zarzycki, Paweł K; Zarzycka, Magdalena B; Clifton, Vicki L; Adamski, Jerzy; Głód, Bronisław K

    2011-08-19

    The goal of this paper is to demonstrate the separation and detection capability of an eco-friendly micro-TLC technique for the classification of spirulina and selected herbs from pharmaceutical and food products. Target compounds were extracted using relatively low-parachor liquids. A number of spirulina samples originating from pharmaceutical formulations and food products were isolated using a simple one-step extraction with a small volume of methanol, acetone or tetrahydrofuran. Herb samples rich in chlorophyll dyes were analyzed as reference materials. Quantitative data derived from micro-plates under visible light conditions and after iodine staining were explored using chemometric tools, including cluster analysis and principal components analysis. Using this method we could easily distinguish genuine spirulina from non-spirulina samples, as well as fresh from expired commercial products; furthermore, we could identify some biodegradation peaks appearing in micro-TLC profiles. This methodology can be applied as a fast screening or fingerprinting tool for the classification of genuine spirulina and herb samples, and in particular may be used commercially for rapid quality-control screening of products. Furthermore, this approach allows low-cost fractionation of target substances, including cyanobacteria pigments, in raw biological or environmental samples for preliminary chemotaxonomic investigations. Due to the low consumption of mobile phase (usually less than 1 mL per run), this method can be considered an environmentally friendly analytical tool and may be an alternative to fingerprinting protocols based on HPLC instruments and to simple separation systems involving planar micro-fluidic or micro-chip devices. Copyright © 2011 Elsevier B.V. All rights reserved.
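
    The chemometric step follows a standard pattern, sketched below on synthetic data: each row stands in for a densitometric micro-TLC lane profile, PCA compresses the profiles, and k-means separates the two product classes. The peak positions, library choices and all numbers are assumptions, not the paper's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Sketch of the chemometric step described above, on synthetic data: each row
# is a densitometric lane profile (intensity vs. migration distance); PCA
# compresses the profiles and k-means recovers the sample classes.

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 200)

def profile(peaks):
    """Sum of Gaussian bands plus noise, standing in for a scanned TLC lane."""
    y = sum(a * np.exp(-((x - c) / 0.02) ** 2) for a, c in peaks)
    return y + rng.normal(0, 0.02, x.size)

genuine = [profile([(1.0, 0.3), (0.6, 0.55)]) for _ in range(10)]
expired = [profile([(0.5, 0.3), (0.6, 0.55), (0.4, 0.8)]) for _ in range(10)]

scores = PCA(n_components=2).fit_transform(np.vstack(genuine + expired))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print(labels)   # the two product classes should fall into separate clusters
```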

  13. Analytical estimation of annual runoff distribution in ungauged seasonally dry basins based on a first order Taylor expansion of the Fu's equation

    NASA Astrophysics Data System (ADS)

    Caracciolo, D.; Deidda, R.; Viola, F.

    2017-11-01

    The assessment of the mean annual runoff and its interannual variability in a basin is the first and fundamental task for several activities related to water resources management and water quality analysis. The scarcity of observed runoff data is a common problem worldwide, so runoff estimation in ungauged basins is still an open question. In this context, the main aim of this work is to propose and test a simple tool able to estimate the probability distribution of annual surface runoff in ungauged river basins in arid and semi-arid areas, using a simplified Fu parameterization of the Budyko curve at regional scale. Starting from a method recently developed to derive the distribution of annual runoff under the assumption of negligible inter-annual change in basin water storage, we here generalize the application to any catchment where the parameter of Fu's curve is known. Specifically, we provide a closed-form expression for the annual runoff distribution as a function of the mean and standard deviation of annual rainfall and potential evapotranspiration, and the Fu parameter. The proposed method is based on a first-order Taylor expansion of Fu's equation and allows calculation of the probability density function of annual runoff in seasonally dry arid and semi-arid regions around the world, taking advantage of simple, easy-to-find climatic data and the many studies estimating the Fu parameter worldwide. The computational simplicity of the proposed tool makes it a valuable supporting tool in the field of water resources assessment for practitioners, regional agencies and authorities.
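
    A minimal sketch of the idea, under assumptions of our own (independent and roughly normal annual P and PET, numerical rather than closed-form derivatives, invented numbers): runoff follows from Fu's curve, Q = P - E with E = P(1 + phi - (1 + phi^w)^(1/w)) and phi = PET/P, and a first-order Taylor expansion about the climatic means propagates rainfall and PET variability to runoff.

```python
import math

# Minimal sketch (assumed notation, not the paper's closed forms): annual
# runoff from Fu's form of the Budyko curve, plus a first-order Taylor
# expansion about the climatic means to get the runoff distribution.

def runoff(P, PET, w):
    """Annual runoff Q = P - E from Fu's equation (units: mm/yr)."""
    phi = PET / P
    E = P * (1.0 + phi - (1.0 + phi ** w) ** (1.0 / w))
    return P - E

def runoff_distribution(mu_P, sd_P, mu_PET, sd_PET, w, dP=1.0):
    """Mean and st.dev. of annual runoff, first-order Taylor expansion,
    with P and PET treated as independent (an assumption of this sketch)."""
    q0 = runoff(mu_P, mu_PET, w)
    dq_dP = (runoff(mu_P + dP, mu_PET, w) - runoff(mu_P - dP, mu_PET, w)) / (2 * dP)
    dq_dPET = (runoff(mu_P, mu_PET + dP, w) - runoff(mu_P, mu_PET - dP, w)) / (2 * dP)
    return q0, math.hypot(dq_dP * sd_P, dq_dPET * sd_PET)

# Example: hypothetical semi-arid basin (mm/yr); Fu parameter w assumed
# taken from a regional study.
mean_q, sd_q = runoff_distribution(500.0, 90.0, 1400.0, 120.0, w=2.6)
print(f"annual runoff approx N({mean_q:.0f}, {sd_q:.0f}^2) mm/yr")
```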

  14. SAM Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery.

  15. FPI: FM Success through Analytics

    ERIC Educational Resources Information Center

    Hickling, Duane

    2013-01-01

    The APPA Facilities Performance Indicators (FPI) is perhaps one of the most powerful analytical tools that institutional facilities professionals have at their disposal. It is a diagnostic facilities performance management tool that addresses the essential questions that facilities executives must answer to effectively perform their roles. It…

  16. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged: different science communities each apply their own selection of tools to the data to produce scientific results, and the experiences of one group infrequently help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of a particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this contribution summarizes the results and indicates a direction for future infusion attempts.

  17. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  18. High-Throughput Incubation and Quantification of Agglutination Assays in a Microfluidic System.

    PubMed

    Castro, David; Conchouso, David; Kodzius, Rimantas; Arevalo, Arpys; Foulds, Ian G

    2018-06-04

    In this paper, we present a two-phase microfluidic system capable of incubating and quantifying microbead-based agglutination assays. The microfluidic system is based on a simple fabrication solution, which requires only laboratory tubing filled with carrier oil, driven by negative pressure using a syringe pump. We provide a user-friendly interface in which a pipette is used to insert single droplets of 1.25-µL volume into a system that is continuously running and therefore works entirely on demand, without the need for stopping, resetting or washing the system. These assays are incubated by highly efficient passive mixing with a sample-to-answer time of 2.5 min, a 5-10-fold improvement over traditional agglutination assays. We study system parameters such as channel length, incubation time and flow speed to select optimal assay conditions, using the streptavidin-biotin interaction as a model analyte quantified by optical image processing. We then investigate the effect of varying both analyte and microbead concentrations, reaching a minimum detection limit of 100 ng/mL. The system can be both low- and high-throughput, depending on the rate at which assays are inserted. In our experiments, we were able to produce throughputs of 360 assays per hour by simple manual pipetting, which could be increased even further by automation and parallelization. Agglutination assays are a versatile tool, capable of detecting an ever-growing catalog of infectious diseases, proteins and metabolites. A system such as this one is a step towards high-throughput microfluidic diagnostic solutions with widespread adoption. The development of analytical techniques in the microfluidic format, such as the one presented in this work, is also an important step towards continuously monitoring the performance and microfluidic outputs of organ-on-chip devices.

  19. Development of FEA Models to Study Contusion Patterning in Layered Tissue and the Shaft Loaded Blister Test

    NASA Astrophysics Data System (ADS)

    Giuffre, Christopher James

    In the natural world there is no such thing as a perfectly sharp edge; whether through wear or machining imprecision, at the macroscopic scale all edges have curvature. This curvature can have a significant impact when comparing results with theory. Both numerical and analytic models for the contact of an object with a sharp edge predict infinite stresses, which are not present in the physical world. It is for this reason that the influence of rounded edges must be studied to better understand how they affect model response. Using a commercially available finite element package, this influence is studied in two different problems: how edge geometry affects the shape of a contusion (bruise), and the accuracy of analytic models for the shaft loaded blister test (SLBT). The contusion study presents work that can help medical examiners determine whether the object in question was capable of causing the contusions present. Using a simple layered tissue model representing a generic location on the human body, a sweep of objects with different edge properties is studied using a simple strain-based injury metric. This analysis examines the role that contact area and energy play in the formation, location, and shape of the resulting contusion. In studying the SLBT with finite element analysis and cohesive zone modeling, the assessment of various analytic models provides insight into how to accurately measure the fracture energy in both simulation and experiment. This gives insight into the interactions between a film, the substrate it is bonded to, and the loading plug. In addition, parametric studies are used to examine potential experimental designs and enable future work in this field. The final product of this project provides tools and insight for future study of the effect rounded edges have on contact, and this work enables more focused studies within desired regimes of interest.

  20. SAM Pathogen Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target pathogen analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select pathogens.

  1. Vitamin K

    USDA-ARS?s Scientific Manuscript database

    A wide range of analytical techniques are available for the detection, quantitation, and evaluation of vitamin K in foods. The methods vary from simple to complex depending on extraction, separation, identification and detection of the analyte. Among the extraction methods applied for vitamin K analysis...

  2. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which it is derived, is composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. These geochemical data are acquired chiefly through analytical techniques. Due to progress in the development of new analytical techniques, many hitherto intractable petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. The various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the analytical techniques that have helped to understand the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. Development of computer-based analytical tool for assessing physical protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to explore likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool is capable of analyzing the most critical path and quantifying the probability of effectiveness of the system as a performance measure.

  4. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will use a three-dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object-oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.

  5. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to explore likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool is capable of analyzing the most critical path and quantifying the probability of effectiveness of the system as a performance measure.
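
    A toy version of the path analysis such a tool performs is sketched below on a hypothetical facility graph whose edges carry detection probabilities. Effectiveness is deliberately reduced here to cumulative detection along a path; real tools such as EASI also weigh delay against response-force time. The layout and probabilities are invented for illustration.

```python
# Toy sketch of network-based adversary path analysis: the facility is a
# graph whose edges carry a detection probability, and the most critical
# path is the one the system is least likely to interrupt. Reducing
# effectiveness to cumulative detection is a simplifying assumption.

GRAPH = {                       # hypothetical facility layout
    "offsite":  [("fence", 0.2), ("gate", 0.5)],
    "fence":    [("yard", 0.3)],
    "gate":     [("yard", 0.7)],
    "yard":     [("building", 0.6)],
    "building": [("vault", 0.9)],
    "vault":    [],
}

def all_paths(node, target, seen=()):
    """Yield (remaining path, probability of evading every sensor on it)."""
    if node == target:
        yield [], 1.0
    for nxt, p_detect in GRAPH[node]:
        if nxt not in seen:
            for path, p_miss in all_paths(nxt, target, seen + (node,)):
                yield [nxt] + path, p_miss * (1.0 - p_detect)

paths = [(["offsite"] + p, 1.0 - p_miss)
         for p, p_miss in all_paths("offsite", "vault")]
worst = min(paths, key=lambda x: x[1])   # lowest interruption probability
print("most critical path:", " -> ".join(worst[0]),
      f"(P_interrupt = {worst[1]:.2f})")
```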

  6. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  7. SAM Biotoxin Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target biotoxin analytes in environmental samples can use this online query tool to identify analytical methods included in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select biotoxins.

  8. SAM Chemical Methods Query

    EPA Pesticide Factsheets

    Laboratories measuring target chemical, radiochemical, pathogen, and biotoxin analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery.

  9. Using Learning Analytics to Support Engagement in Collaborative Writing

    ERIC Educational Resources Information Center

    Liu, Ming; Pardo, Abelardo; Liu, Li

    2017-01-01

    Online collaborative writing tools provide an efficient way to complete a writing task. However, existing tools only focus on technological affordances and ignore the importance of social affordances in a collaborative learning environment. This article describes a learning analytics system that analyzes writing behaviors, and creates…

  10. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  11. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    ERIC Educational Resources Information Center

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  12. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
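
    One concrete way to score step-ordering recall, among the measures discussed above, is a rank correlation between the logged process and the analyst's recalled ordering. The sketch below uses Kendall's tau; choosing this particular metric is our illustration, not the paper's prescription.

```python
from itertools import combinations

# Score an analyst's recalled step ordering against the logged ground truth
# with Kendall's tau: +1 means identical order, -1 means fully reversed.
# Using tau here is our choice for illustration, not the paper's method.

def kendall_tau(truth, recalled):
    rank = {step: i for i, step in enumerate(recalled)}
    concordant = discordant = 0
    for a, b in combinations(truth, 2):   # truth lists steps in actual order
        if rank[a] < rank[b]:
            concordant += 1
        else:
            discordant += 1
    n = len(truth)
    return (concordant - discordant) / (n * (n - 1) / 2)

actual = ["load data", "filter", "pivot", "chart", "annotate"]
recall = ["load data", "pivot", "filter", "chart", "annotate"]
print(f"ordering accuracy (tau) = {kendall_tau(actual, recall):+.2f}")  # +0.80
```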

  13. Analytical model for BTEX natural attenuation in the presence of fuel ethanol and its anaerobic metabolite acetate.

    PubMed

    da Silva, Marcio L B; Gomez, Diego E; Alvarez, Pedro J J

    2013-03-01

    Flow-through column studies were conducted to mimic the natural attenuation of ethanol and BTEX mixtures, and to consider potential inhibitory effects of ethanol and its anaerobic metabolite acetate on BTEX biodegradation. Results were analyzed using a one-dimensional analytical model developed from consecutive-reaction differential equations based on first-order kinetics. The decrease in pH due to acetogenesis was also modeled, using charge balance equations under CaCO3 dissolution conditions. A delay in BTEX removal was observed and simulated in the presence of ethanol and acetate. Acetate was the major volatile fatty acid intermediate produced during anaerobic ethanol biodegradation (accounting for about 58% of the volatile fatty acid mass), as suggested by the model fit. Acetate accumulation (up to 1.1 g/L) near the source zone contributed to a pH decrease of almost one unit. The anaerobic degradation of ethanol (2 g/L influent concentration) at the source zone produced methane at concentrations exceeding its solubility (about 26 mg/L). Overall, this simple analytical model adequately described ethanol degradation, acetate accumulation and methane production patterns, suggesting that it could be used as a screening tool to simulate lag times in BTEX biodegradation, changes in groundwater pH and methane generation following ethanol-blended fuel releases. Copyright © 2012 Elsevier B.V. All rights reserved.
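
    The reaction chain at the core of such a model has a classical closed-form batch solution, sketched below for ethanol to acetate under first-order kinetics. The rate constants and the 58% carbon routing are illustrative assumptions, and the paper's full model additionally couples the kinetics to one-dimensional transport.

```python
import numpy as np

# Batch-mode sketch of the reaction chain underlying the model described:
# ethanol -> acetate (-> methane) as consecutive first-order reactions.
# Rate constants and the 58% routing fraction are illustrative assumptions;
# the paper's model also couples these kinetics to 1-D transport.

k1, k2 = 0.10, 0.03     # 1/day, assumed first-order rate constants
Y_ACETATE = 0.58        # fraction of ethanol routed via acetate (assumed)
C0 = 2000.0             # mg/L ethanol at the source

t = np.linspace(0.0, 120.0, 400)
ethanol = C0 * np.exp(-k1 * t)
# classical series solution for the intermediate of A -> B -> C
acetate = Y_ACETATE * C0 * k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))

i = int(np.argmax(acetate))
print(f"peak acetate = {acetate[i]:.0f} mg/L at t = {t[i]:.0f} days")
```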

  14. Fabrication of chitosan-silver nanoparticle hybrid 3D porous structure as a SERS substrate for biomedical applications

    NASA Astrophysics Data System (ADS)

    Jung, Gyeong-Bok; Kim, Ji-Hye; Burm, Jin Sik; Park, Hun-Kuk

    2013-05-01

    We propose a simple, low-cost, large-area, and functional surface-enhanced Raman scattering (SERS) substrate for biomedical applications. The SERS substrate, a chitosan-silver nanoparticle (chitosan-Ag NP) hybrid 3D porous structure, was fabricated by a simple one-step method, with the chitosan used as a template for Ag NP deposition. SERS enhancement by the chitosan-Ag NP substrate was experimentally verified using rhodamine B as an analyte. Thiolated single-stranded DNA targeting an atopic dermatitis genetic marker (the chemokine CCL17) was also measured, at a concentration as low as 5 pM. We successfully designed a novel SERS substrate with silver-nanoparticle-hybridized 3D porous chitosan that has the potential to become a highly sensitive and selective tool for biomedical applications.

  15. [Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].

    PubMed

    Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou

    2014-08-01

    In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy dispersive spectroscopy), EPMA (electron probe microanalysis), and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis and high sensitivity. It is now a powerful complement to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analysts will use this powerful tool, and LA-ICP-MS may become as prominent in the elemental analysis field as LIBS (laser-induced breakdown spectroscopy).

  16. Stress Analysis of Beams with Shear Deformation of the Flanges

    NASA Technical Reports Server (NTRS)

    Kuhn, Paul

    1937-01-01

    This report discusses the fundamental action of shear deformation of the flanges on the basis of simplifying assumptions. The theory is developed to the point of giving analytical solutions for simple cases of beams and of skin-stringer panels under axial load. Strain-gage tests on a tension panel and on a beam corresponding to these simple cases are described and the results are compared with analytical results. For wing beams, an approximate method of applying the theory is given. As an alternative, the construction of a mechanical analyzer is advocated.

  17. Measurement of very low amounts of arsenic in soils and waters: is ICP-MS the indispensable analytical tool?

    NASA Astrophysics Data System (ADS)

    López-García, Ignacio; Marín-Hernández, Juan Jose; Perez-Sirvent, Carmen; Hernandez-Cordoba, Manuel

    2017-04-01

    The toxicity of arsenic and its wide distribution in nature need no emphasis nowadays, and the value of reliable analytical tools for arsenic determination at very low levels is clear. Leaving aside atomic fluorescence spectrometers specifically designed for this purpose, the task is currently carried out using inductively coupled plasma mass spectrometry (ICP-MS), a powerful but expensive technique that is not available in all laboratories. However, as the recent literature clearly shows, similar or even better analytical performance for the determination of several elements can be achieved by replacing the ICP-MS instrument with an AAS spectrometer (which is commonly present in any laboratory and involves low acquisition and maintenance costs), provided that a simple microextraction step is used to preconcentrate the sample. This communication reports the optimization and results of a new analytical procedure based on this idea and focused on the determination of very low concentrations of arsenic in waters and in extracts from soils and sediments. The procedure is based on a micro-solid-phase extraction process for the separation and preconcentration of arsenic that uses magnetic particles covered with silver nanoparticles functionalized with the sodium salt of 2-mercaptoethane-sulphonate (MESNa). This composite is easily prepared in the laboratory. After the sample is treated with a small amount (only a few milligrams) of the magnetic material, the solid phase is separated by means of a magnetic field and then introduced into an electrothermal atomizer (ETAAS) for arsenic determination. The preconcentration factor is close to 200, with a detection limit below 0.1 µg L-1 arsenic. Speciation of As(III) and As(V) can be achieved by means of two extractions carried out at different acidities. The results for total arsenic were verified using certified reference materials. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) and to the Spanish MINECO (Project CTQ2015-68049-R) for financial support.

  18. Analytic Simulation of the Elastic Waves Propagation in the Neighborhood of Fluid Filled Wells with Monopole Sources

    NASA Astrophysics Data System (ADS)

    Ávila-Carrera, R.; Sánchez-Sesma, F. J.; Spurlin, James H.; Valle-Molina, C.; Rodríguez-Castellanos, A.

    2014-09-01

    An analytic formulation to understand the scattering, diffraction and attenuation of elastic waves in the neighborhood of fluid-filled wells is presented. An important, and not widely exploited, technique for carefully investigating wave propagation in exploration wells is the logging of sonic waveforms. Fundamental decisions and production planning in petroleum reservoirs are made by interpretation of such recordings. Nowadays, geophysicists and engineers face problems related to acquisition and interpretation under the complex conditions associated with open-hole measurements. A crucial problem that directly affects the response of sonic logs is the eccentricity of the measuring tool with respect to the center of the borehole. Even with the use of centralizers, this simple variation dramatically changes the physical conditions of wave propagation around the well. Recent works in the numerical field have reported advanced studies in modeling and simulation of acoustic wave propagation around wells, including complex heterogeneities and anisotropy. However, no analytical efforts have been made to formally understand wireline sonic logging measurements acquired with borehole-eccentered tools. In this paper, Graf's addition theorem was used to describe monopole sources in terms of solutions of the wave equation. The formulation was developed from the three-dimensional discrete wave-number method in the frequency domain. The cylindrical Bessel functions of the third kind and order zero were re-derived to obtain a simplified set of equations projected onto a two-dimensional plane-space for displacements and stresses. This new and condensed analytic formulation allows the straightforward calculation of all converted modes and their visualization in the time domain via Fourier synthesis. The main aim was to obtain spectral surfaces of transfer functions and synthetic seismograms that may be useful for understanding the wave motion produced by the eccentricity of the source, and for explaining in detail the newly arising borehole propagation modes. Finally, time histories and amplitude spectra for relevant examples are presented, and the validation of time traces using the spectral element method is reported.

  19. New directions in photonics simulation: Lanczos recursion and finite-difference time-domain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hawkins, R.J.; McLeod, R.R.; Kallman, J.S.

    1992-06-01

    Computational Integrated Photonics (CIP) is the area of computational physics that treats the propagation of light in optical fibers and in integrated optical circuits. The purpose of integrated photonics simulation is to develop the computational tools that will support the design of photonic and optoelectronic integrated devices. CIP has, in general, two thrusts: (1) predictive models of photonic device behavior that can be used reliably to enhance significantly the speed with which designs are optimized for development applications, and (2) furthering our ability to describe the linear and nonlinear processes that occur, and can be exploited, in real photonic devices. Experimental integrated optics has been around for over a decade, with much of the work during this period centered on proof-of-principle devices that could be described using simple analytic and numerical models. Recent advances in material growth, photolithography, and device complexity have conspired to reduce significantly the number of devices that can be designed with simple models and to increase dramatically the interest in CIP. In the area of device design, CIP is viewed as critical to understanding device behavior and to optimization. In the area of propagation physics, CIP is an important tool in the study of nonlinear processes in integrated optical devices and fibers. In this talk I will discuss two of the new directions we have been investigating in CIP: Lanczos recursion and finite-difference time-domain.

  20. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Disc (TED) technology is presented. TED comprises a number of attractive features, such as track independence, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications: microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of the system. The analytical usefulness of TED technology is thus demonstrated, describing how to exploit this tool to develop truly integrated analytical systems that provide solutions within the point-of-care framework.

  1. UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.

    PubMed

    Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel

    2013-09-01

    In this short communication, UV-Vis spectrophotometry was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was found between absorbance and the concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could serve as a Process Analytical Tool (PAT) in biorefineries employing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
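
    Quantification of this kind reduces to a Beer-Lambert linear calibration at a chosen wavelength. The sketch below fits such a calibration and inverts it for an unknown; the standard concentrations and absorbances are made-up numbers, not the paper's data.

```python
import numpy as np

# Sketch of the quantification step described: a Beer-Lambert linear
# calibration of absorbance against lignin concentration at one of the
# reported wavelengths. The standards below are made-up numbers.

conc = np.array([0.0, 20.0, 40.0, 60.0, 80.0])     # mg/L lignin standards
a280 = np.array([0.01, 0.22, 0.41, 0.63, 0.82])    # absorbance at 280 nm

slope, intercept = np.polyfit(conc, a280, 1)       # least-squares line
r2 = np.corrcoef(conc, a280)[0, 1] ** 2
print(f"A = {slope:.4f}*C + {intercept:.3f}  (R^2 = {r2:.4f})")

unknown_abs = 0.50                                 # measured sample absorbance
print(f"unknown = {(unknown_abs - intercept) / slope:.1f} mg/L")
```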

  2. Purification of Single-Stranded cDNA Based on RNA Degradation Treatment and Adsorption Chromatography.

    PubMed

    Trujillo-Esquivel, Elías; Franco, Bernardo; Flores-Martínez, Alberto; Ponce-Noyola, Patricia; Mora-Montes, Héctor M

    2016-08-02

    Analysis of gene expression is a common research tool to study networks controlling gene expression, the role of genes with unknown function, and environmentally induced responses of organisms. Most of the analytical tools used to analyze gene expression rely on accurate cDNA synthesis and quantification to obtain reproducible and quantifiable results. Thus far, most commercial kits for isolation and purification of cDNA target double-stranded molecules, which do not accurately represent the abundance of transcripts. In the present report, we provide a simple and fast method to purify single-stranded cDNA, exhibiting high purity and yield. This method is based on the treatment with RNase H and RNase A after cDNA synthesis, followed by separation in silica spin-columns and ethanol precipitation. In addition, our method avoids the use of DNase I to eliminate genomic DNA from RNA preparations, which improves cDNA yield. As a case report, our method proved to be useful in the purification of single-stranded cDNA from the pathogenic fungus Sporothrix schenckii.

  3. Environmental screening tools for assessment of infrastructure plans based on biodiversity preservation and global warming (PEIT, Spain)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia-Montero, Luis G., E-mail: luisgonzaga.garcia@upm.e; Lopez, Elena, E-mail: elopez@caminos.upm.e; Monzon, Andres, E-mail: amonzon@caminos.upm.e

    Most Strategic Environmental Assessment (SEA) research has been concerned with SEA as a procedure, and there have been relatively few developments and tests of analytical methodologies. The first stage of the SEA is the 'screening', which is the process whereby a decision is taken on whether or not SEA is required for a particular programme or plan. The effectiveness of screening and SEA procedures will depend on how well the assessment fits into the planning from the early stages of the decision-making process. However, it is difficult to prepare the environmental screening for an infrastructure plan involving a whole country.more » To be useful, such methodologies must be fast and simple. We have developed two screening tools which would make it possible to estimate promptly the overall impact an infrastructure plan might have on biodiversity and global warming for a whole country, in order to generate planning alternatives, and to determine whether or not SEA is required for a particular infrastructure plan.« less

  4. Biomedical semantics in the Semantic Web

    PubMed Central

    2011-01-01

    The Semantic Web offers an ideal platform for representing and linking biomedical information, which is a prerequisite for the development and application of analytical tools to address problems in data-intensive areas such as systems biology and translational medicine. As for any new paradigm, the adoption of the Semantic Web offers opportunities and poses questions and challenges to the life sciences scientific community: which technologies in the Semantic Web stack will be more beneficial for the life sciences? Is biomedical information too complex to benefit from simple interlinked representations? What are the implications of adopting a new paradigm for knowledge representation? What are the incentives for the adoption of the Semantic Web, and who are the facilitators? Is there going to be a Semantic Web revolution in the life sciences? We report here a few reflections on these questions, following discussions at the SWAT4LS (Semantic Web Applications and Tools for Life Sciences) workshop series, of which this Journal of Biomedical Semantics special issue presents selected papers from the 2009 edition, held in Amsterdam on November 20th. PMID:21388570

  5. Biomedical semantics in the Semantic Web.

    PubMed

    Splendiani, Andrea; Burger, Albert; Paschke, Adrian; Romano, Paolo; Marshall, M Scott

    2011-03-07

    The Semantic Web offers an ideal platform for representing and linking biomedical information, which is a prerequisite for the development and application of analytical tools to address problems in data-intensive areas such as systems biology and translational medicine. As for any new paradigm, the adoption of the Semantic Web offers opportunities and poses questions and challenges to the life sciences scientific community: which technologies in the Semantic Web stack will be more beneficial for the life sciences? Is biomedical information too complex to benefit from simple interlinked representations? What are the implications of adopting a new paradigm for knowledge representation? What are the incentives for the adoption of the Semantic Web, and who are the facilitators? Is there going to be a Semantic Web revolution in the life sciences? We report here a few reflections on these questions, following discussions at the SWAT4LS (Semantic Web Applications and Tools for Life Sciences) workshop series, of which this Journal of Biomedical Semantics special issue presents selected papers from the 2009 edition, held in Amsterdam on November 20th.

  6. A Simple Analytical Model for Magnetization and Coercivity of Hard/Soft Nanocomposite Magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jihoon; Hong, Yang-Ki; Lee, Woncheol

    Here, we present a simple analytical model to estimate the magnetization (σs) and intrinsic coercivity (Hci) of a hard/soft nanocomposite magnet using the mass fraction. Previously proposed models are based on the volume fraction of the hard phase of the composite, but it is difficult to measure the volume of the hard- or soft-phase material of a composite. We synthesized Sm2Co7/Fe-Co, MnAl/Fe-Co, MnBi/Fe-Co, and BaFe12O19/Fe-Co composites for characterization of their σs and Hci. The experimental results are in good agreement with the present model. Therefore, this analytical model can be extended to predict the maximum energy product (BH)max of hard/soft composites.

  7. A Simple Analytical Model for Magnetization and Coercivity of Hard/Soft Nanocomposite Magnets

    DOE PAGES

    Park, Jihoon; Hong, Yang-Ki; Lee, Woncheol; ...

    2017-07-10

    Here, we present a simple analytical model to estimate the magnetization (σs) and intrinsic coercivity (Hci) of a hard/soft nanocomposite magnet using the mass fraction. Previously proposed models are based on the volume fraction of the hard phase of the composite, but it is difficult to measure the volume of the hard- or soft-phase material of a composite. We synthesized Sm2Co7/Fe-Co, MnAl/Fe-Co, MnBi/Fe-Co, and BaFe12O19/Fe-Co composites for characterization of their σs and Hci. The experimental results are in good agreement with the present model. Therefore, this analytical model can be extended to predict the maximum energy product (BH)max of hard/soft composites.
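
    The abstract does not give the model's functional forms, so the sketch below substitutes the simplest mass-weighted mixing rule for the composite magnetization as an explicitly assumed stand-in; the phase values are likewise only illustrative.

```python
# Minimal sketch, not the authors' exact expressions: a mass-fraction mixing
# rule for the saturation magnetization of a hard/soft composite. The linear
# mass-weighted rule and the phase values below are assumptions chosen for
# illustration; the abstract does not give the paper's functional forms.

def composite_sigma_s(w_hard, sigma_hard, sigma_soft):
    """sigma_s of the composite from the hard-phase mass fraction w_hard."""
    return w_hard * sigma_hard + (1.0 - w_hard) * sigma_soft

# Illustrative values: a hard phase (e.g. MnBi-like, ~75 emu/g) blended with
# a soft Fe-Co-like phase (~230 emu/g).
for w in (1.0, 0.9, 0.8, 0.7):
    print(f"w_hard = {w:.1f}: sigma_s = {composite_sigma_s(w, 75.0, 230.0):.0f} emu/g")
```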

  8. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of a growing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of the data analytics tool and technique requirements that would support specific ESDA goal types. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.

  9. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  10. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  11. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and Fire... used to support the life safety equivalency evaluation? Analytical and empirical tools, including fire models and grading schedules such as the Fire Safety Evaluation System (Alternative Approaches to Life...

  12. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    ERIC Educational Resources Information Center

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  13. Feasibility model of a high reliability five-year tape transport. Volume 3: Appendices. [detailed drawing and analytical tools used in analyses

    NASA Technical Reports Server (NTRS)

    Meyers, A. P.; Davidson, W. A.; Gortowski, R. C.

    1973-01-01

    Detailed drawings of the five year tape transport are presented. Analytical tools used in the various analyses are described. These analyses include: tape guidance, tape stress over crowned rollers, tape pack stress program, response (computer) program, and control system electronics description.

  14. Challenges and Opportunities in Analysing Students Modelling

    ERIC Educational Resources Information Center

    Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín

    2017-01-01

    Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…

  15. Equity Analytics: A Methodological Approach for Quantifying Participation Patterns in Mathematics Classroom Discourse

    ERIC Educational Resources Information Center

    Reinholz, Daniel L.; Shah, Niral

    2018-01-01

    Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…

  16. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    PubMed Central

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, data resources and analytical tools that can be used to examine disease associations of ARS/AIMPs have been lacking. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARS/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation and phosphorylation data of ARS/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploration of disease association of ARS/AIMPs, identification of disease-associated ARS/AIMP interactors and reconstruction of ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding potential roles of ARS/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  17. The challenge of big data in public health: an opportunity for visual analytics.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.

  18. The Challenge of Big Data in Public Health: An Opportunity for Visual Analytics

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2014-01-01

    Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data’s volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research. PMID:24678376

  19. Determination of Carbonyl Compounds in Cork Agglomerates by GDME-HPLC-UV: Identification of the Extracted Compounds by HPLC-MS/MS.

    PubMed

    Brandão, Pedro Francisco; Ramos, Rui Miguel; Almeida, Paulo Joaquim; Rodrigues, José António

    2017-02-08

    A new approach is proposed for the extraction and determination of carbonyl compounds in solid samples, such as wood or cork materials. Cork products are used as building materials due to their singular characteristics; however, little is known about their aldehyde emission potential and content. Sample preparation was done using a gas-diffusion microextraction (GDME) device for the direct extraction of volatile aldehydes and derivatization with 2,4-dinitrophenylhydrazine. Analytical determination of the extracts was done by HPLC-UV, with detection at 360 nm. The developed methodology proved to be a reliable tool for aldehyde determination in cork agglomerate samples, with suitable method features. Mass spectrometry studies were performed for each sample, which enabled the identification, in the extracts, of the derivatization products of a total of 13 aldehydes (formaldehyde, acetaldehyde, furfural, propanal, 5-methylfurfural, butanal, benzaldehyde, pentanal, hexanal, trans-2-heptenal, heptanal, octanal, and trans-2-nonenal) and 4 ketones (3-hydroxy-2-butanone, acetone, cyclohexanone, and acetophenone). The new analytical methodology thus proved to be both consistent for the identification and determination of aldehydes in cork agglomerates and a very simple and straightforward procedure.

  20. Human health risk assessment: models for predicting the effective exposure duration of on-site receptors exposed to contaminated groundwater.

    PubMed

    Baciocchi, Renato; Berardi, Simona; Verginelli, Iason

    2010-09-15

    Clean-up of contaminated sites is usually based on a risk-based approach for the definition of the remediation goals, which relies on the well-known ASTM-RBCA standard procedure. In this procedure, migration of contaminants is described through simple analytical models and the source contaminants' concentration is assumed constant throughout the entire exposure period, i.e. 25-30 years. The latter assumption may often prove over-protective of human health, leading to unrealistically low remediation goals. The aim of this work is to propose an alternative model taking into account source depletion, while keeping the original simplicity and analytical form of the ASTM-RBCA approach. The results obtained by the application of this model are compared with those provided by the traditional ASTM-RBCA approach, by a model based on the source depletion algorithm of the RBCA ToolKit software, and by a numerical model, allowing assessment of its suitability for inclusion in risk analysis procedures. The results discussed in this work are limited to on-site exposure to contaminated water by ingestion, but the proposed approach can be extended to other exposure pathways. Copyright 2010 Elsevier B.V. All rights reserved.
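
    The effect the authors target can be illustrated schematically (a sketch assuming a hypothetical first-order source decay; this is not the paper's analytical model or the RBCA ToolKit algorithm):

      import numpy as np

      # Compare the constant-source ASTM-RBCA assumption with a simple
      # first-order source-depletion model (illustrative sketch only).
      C0 = 1.0          # initial source concentration (normalized)
      k = 0.15          # hypothetical first-order depletion rate, 1/yr
      T = 25.0          # exposure duration, yr
      t = np.linspace(0.0, T, 1000)

      C_constant = np.full_like(t, C0)          # classical assumption
      C_depleting = C0 * np.exp(-k * t)         # depleting source

      # Average concentration seen by the receptor over the exposure period:
      print("constant source :", np.trapz(C_constant, t) / T)   # 1.0
      print("depleting source:", np.trapz(C_depleting, t) / T)  # ~0.26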

  1. Kinetic modeling and fitting software for interconnected reaction schemes: VisKin.

    PubMed

    Zhang, Xuan; Andrews, Jared N; Pedersen, Steen E

    2007-02-15

    Reaction kinetics for complex, highly interconnected kinetic schemes are modeled using analytical solutions to a system of ordinary differential equations. The algorithm employs standard linear algebra methods that are implemented using MATLAB functions in a Visual Basic interface. A graphical user interface for simple entry of reaction schemes facilitates comparison of a variety of reaction schemes. To ensure microscopic balance, graph theory algorithms are used to determine violations of thermodynamic cycle constraints. Analytical solutions based on linear differential equations result in fast comparisons of first-order kinetic rates and amplitudes as a function of changing ligand concentrations. For analysis of higher-order kinetics, we also implemented a solution using numerical integration. To determine rate constants from experimental data, fitting algorithms that adjust rate constants to fit the model to imported data were implemented using the Levenberg-Marquardt algorithm or Broyden-Fletcher-Goldfarb-Shanno methods. We have included the ability to carry out global fitting of data sets obtained at varying ligand concentrations. These tools are combined in a single package, which we have dubbed VisKin, to guide and analyze kinetic experiments. The software is available online for use on PCs.
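
    The core analytical-solution idea for first-order, interconnected schemes can be sketched generically (not VisKin's code; rate constants are invented): the concentration vector obeys dc/dt = K·c, so c(t) = expm(K·t)·c(0).

      import numpy as np
      from scipy.linalg import expm

      # First-order scheme A <-> B -> C as the linear ODE system dc/dt = K c.
      k1, k_1, k2 = 2.0, 0.5, 1.0        # hypothetical rate constants, 1/s

      K = np.array([
          [-k1,          k_1,  0.0],     # d[A]/dt
          [ k1, -(k_1 + k2),   0.0],     # d[B]/dt
          [0.0,           k2,  0.0],     # d[C]/dt
      ])

      c0 = np.array([1.0, 0.0, 0.0])     # start with pure A

      for t in (0.0, 0.5, 2.0, 10.0):
          c = expm(K * t) @ c0           # analytical (matrix-exponential) solution
          print(f"t={t:5.1f}  A={c[0]:.3f}  B={c[1]:.3f}  C={c[2]:.3f}")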

  2. Capillary Flow in Containers of Polygonal Section: Theory and Experiment

    NASA Technical Reports Server (NTRS)

    Weislogel, Mark M.; Rame, Enrique (Technical Monitor)

    2001-01-01

    An improved understanding of the large-length-scale capillary flows arising in a low-gravity environment is critical to the engineering community concerned with the design and analysis of spacecraft fluids management systems. Because a significant portion of liquid behavior in spacecraft is capillary dominated, it is natural to consider designs that best exploit the spontaneous character of such flows. In the present work, a recently verified asymptotic analysis is extended to approximate spontaneous capillary flows in a large class of cylindrical containers of irregular polygonal section experiencing a step reduction in gravitational acceleration. Drop tower tests are conducted using partially filled irregular triangular containers for comparison with the theoretical predictions. The degree to which the experimental data agree with the theory is a testament to the robustness of the basic analytical assumption of predominantly parallel flow. As a result, the closed-form analytical expressions presented serve as simple, accurate tools for predicting bulk flow characteristics essential to practical low-g system design and analysis. Equations for predicting corner wetting rates, total container flow rates, and transient surface shapes are provided that are relevant also to terrestrial applications such as capillary flow in porous media.

  3. An Application of X-Ray Fluorescence as Process Analytical Technology (PAT) to Monitor Particle Coating Processes.

    PubMed

    Nakano, Yoshio; Katakuse, Yoshimitsu; Azechi, Yasutaka

    2018-06-01

    An attempt was made to apply X-ray fluorescence (XRF) analysis as a process analytical technology (PAT) to evaluate a small-particle coating process. The XRF analysis was used to monitor the coating level in the small-particle coating process in an at-line manner. A small-particle coating process usually consists of multiple coating steps. This study used simple coated particles prepared by a first coating of a model compound (DL-methionine) and a second coating of talc on spherical microcrystalline cellulose cores. Particles with these two coating layers are sufficient to demonstrate the small-particle coating process. The results showed that the XRF signals played different roles: the signals from the first coating (layering) and the second coating (mask coating) track the extent of coating through different mechanisms. Furthermore, coating of particles of different sizes was investigated to evaluate size effects in these coating processes. From these results, it was concluded that XRF can be used as a PAT in monitoring particle coating processes and can become a powerful tool in pharmaceutical manufacturing.

  4. A hybrid finite element-transfer matrix model for vibroacoustic systems with flat and homogeneous acoustic treatments.

    PubMed

    Alimonti, Luca; Atalla, Noureddine; Berry, Alain; Sgard, Franck

    2015-02-01

    Practical vibroacoustic systems involve passive acoustic treatments consisting of highly dissipative media such as poroelastic materials. The numerical modeling of such systems at low to mid frequencies typically relies on substructuring methodologies based on finite element models. Namely, the master subsystems (i.e., structural and acoustic domains) are described by a finite set of uncoupled modes, whereas condensation procedures are typically preferred for the acoustic treatments. However, although accurate, such a methodology is computationally expensive when real-life applications are considered. A potential reduction of the computational burden could be obtained by approximating the effect of the acoustic treatment on the master subsystems without introducing physical degrees of freedom. To do that, the treatment has to be assumed homogeneous, flat, and of infinite lateral extent. Under these hypotheses, simple analytical tools like the transfer matrix method can be employed. In this paper, a hybrid finite element-transfer matrix methodology is proposed. The impact of the limiting assumptions inherent in the analytical framework is assessed for the case of plate-cavity systems involving flat and homogeneous acoustic treatments. The results prove that the hybrid model can capture the qualitative behavior of the vibroacoustic system while reducing the computational effort.
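
    The transfer-matrix ingredient can be sketched for the simplest textbook case, a rigid-backed equivalent-fluid layer at normal incidence (illustrative only; the paper's hybrid FE-TM coupling and material models are more involved, and the effective properties below are invented):

      import numpy as np

      # Normal-incidence absorption of a rigid-backed equivalent-fluid layer
      # via the transfer matrix method (standard textbook sketch).
      rho0, c0 = 1.21, 343.0          # air density (kg/m^3) and sound speed (m/s)
      Z0 = rho0 * c0                  # characteristic impedance of air

      def layer_matrix(k, Zc, d):
          """2x2 transfer matrix of a fluid layer of thickness d relating
          [pressure, velocity] on its two faces."""
          return np.array([[np.cos(k * d),           1j * Zc * np.sin(k * d)],
                           [1j * np.sin(k * d) / Zc, np.cos(k * d)]])

      omega = 2 * np.pi * 1000.0                    # 1 kHz
      rho_eff = rho0 * (1.5 - 0.8j)                 # hypothetical complex density
      K_eff = rho0 * c0**2 * (1.1 - 0.2j)           # hypothetical bulk modulus
      k_eff = omega * np.sqrt(rho_eff / K_eff)      # complex wavenumber
      Zc = np.sqrt(rho_eff * K_eff)                 # complex impedance

      T = layer_matrix(k_eff, Zc, d=0.05)           # 5 cm layer
      Zs = T[0, 0] / T[1, 0]                        # rigid backing: v_back = 0
      R = (Zs - Z0) / (Zs + Z0)                     # reflection coefficient
      print("absorption coefficient:", 1 - abs(R)**2)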

  5. Simulation of laser generated ultrasound with application to defect detection

    NASA Astrophysics Data System (ADS)

    Pantano, A.; Cerniglia, D.

    2008-06-01

    Laser generated ultrasound holds substantial promise for use as a tool for defect detection in remote inspection thanks to its ability to produce frequencies in the MHz range, enabling fine spatial resolution of defects. Despite the potential impact of laser generated ultrasound in many areas of science and industry, robust tools for studying the phenomenon are lacking and thus limit the design and optimization of non-destructive testing and evaluation techniques. The propagation of laser generated ultrasound in complex structures is an intricate phenomenon and is extremely hard to analyze. Only simple geometries can be studied analytically. Numerical techniques found in the literature have proved to be limited in their applicability by the frequencies in the MHz range and the very short wavelengths involved. The objective of this research is to prove that by using an explicit integration rule together with diagonal element mass matrices, instead of the almost universally adopted implicit integration rule, to integrate the equations of motion in a dynamic analysis, it is possible to efficiently and accurately solve ultrasound wave propagation problems with frequencies in the MHz range travelling in relatively large bodies. Presented results on NDE testing of rails demonstrate that the proposed FE technique can provide a valuable tool for studying laser generated ultrasound propagation.
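
    The advocated scheme (explicit central-difference time stepping with a diagonal mass matrix, so no system solve is needed) can be sketched on a toy 1D bar instead of the authors' rail model:

      import numpy as np

      # Explicit central-difference integration of 1D wave propagation with a
      # lumped (diagonal) mass matrix; toy problem, parameters are illustrative.
      E, rho = 210e9, 7850.0         # steel, Pa and kg/m^3
      c = np.sqrt(E / rho)           # wave speed, ~5170 m/s
      L, n = 1.0, 1000               # bar length (m), number of nodes
      dx = L / (n - 1)
      dt = 0.9 * dx / c              # CFL-limited stable time step

      u_prev = np.zeros(n)
      u = np.zeros(n)
      u[0] = 1e-6                    # small initial kick at the left end

      for _ in range(2000):
          # Diagonal mass => acceleration is purely local, no linear solve.
          lap = np.zeros(n)
          lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
          u_next = 2 * u - u_prev + (c * dt)**2 * lap
          u_next[0] = u_next[-1] = 0.0   # fixed ends after the initial kick
          u_prev, u = u, u_next

      print("max displacement magnitude:", np.abs(u).max())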

  6. Development of a software tool to support chemical and biological terrorism intelligence analysis

    NASA Astrophysics Data System (ADS)

    Hunt, Allen R.; Foreman, William

    1997-01-01

    AKELA has developed a software tool which uses a systems analytic approach to model the critical processes which support the acquisition of biological and chemical weapons by terrorist organizations. This tool has four major components. The first is a procedural expert system which describes the weapon acquisition process. It shows the relationship between the stages a group goes through to acquire and use a weapon, and the activities in each stage required to be successful. It applies to both state-sponsored and small-group acquisition. An important part of this expert system is an analysis of the acquisition process which is embodied in a list of observables of weapon acquisition activity. These observables are cues for intelligence collection. The second component is a detailed glossary of technical terms which helps analysts with a non-technical background understand the potential relevance of collected information. The third component is a linking capability which shows where technical terms apply to the parts of the acquisition process. The final component is a simple, intuitive user interface which shows a picture of the entire process at a glance and lets the user move quickly to get more detailed information. This paper explains each of these four model components.

  7. Quality Indicators for Learning Analytics

    ERIC Educational Resources Information Center

    Scheffel, Maren; Drachsler, Hendrik; Stoyanov, Slavi; Specht, Marcus

    2014-01-01

    This article proposes a framework of quality indicators for learning analytics that aims to standardise the evaluation of learning analytics tools and to provide a mean to capture evidence for the impact of learning analytics on educational practices in a standardised manner. The criteria of the framework and its quality indicators are based on…

  8. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools into pharmaceutical and biopharmaceutical environments, thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  9. Automated clustering-based workload characterization

    NASA Technical Reports Server (NTRS)

    Pentakalos, Odysseas I.; Menasce, Daniel A.; Yesha, Yelena

    1996-01-01

    The demands placed on the mass storage systems at various federal agencies and national laboratories are continuously increasing in intensity. This forces system managers to constantly monitor the system, evaluate the demand placed on it, and tune it appropriately using either heuristics based on experience or analytic models. Performance models require an accurate workload characterization. This can be a laborious and time consuming process. It became evident from our experience that a tool is necessary to automate the workload characterization process. This paper presents the design and discusses the implementation of a tool for workload characterization of mass storage systems. The main features of the tool discussed here are: (1) Automatic support for peak-period determination. Histograms of system activity are generated and presented to the user for peak-period determination; (2) Automatic clustering analysis. The data collected from the mass storage system logs is clustered using clustering algorithms and tightness measures to limit the number of generated clusters; (3) Reporting of varied file statistics. The tool computes several statistics on file sizes such as average, standard deviation, minimum, maximum, frequency, as well as average transfer time. These statistics are given on a per-cluster basis; (4) Portability. The tool can easily be used to characterize the workload in mass storage systems of different vendors. The user needs to specify, through a simple log description language, how a specific log should be interpreted. The rest of this paper is organized as follows. Section two presents basic concepts in workload characterization as they apply to mass storage systems. Section three describes clustering algorithms and tightness measures. The following section presents the architecture of the tool. Section five presents some results of workload characterization using the tool. Finally, section six presents some concluding remarks.
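
    The clustering step can be sketched on synthetic log records (illustrative only; the tool's own clustering algorithms and tightness measures are described in the paper, and sklearn's k-means is used here merely as a stand-in):

      import numpy as np
      from sklearn.cluster import KMeans

      # Toy clustering of file-transfer records by (log10 file size, transfer
      # time). Synthetic data stands in for real mass-storage logs.
      rng = np.random.default_rng(0)
      small = np.column_stack([rng.normal(6, 0.5, 300),    # ~MB files
                               rng.normal(0.5, 0.1, 300)])
      large = np.column_stack([rng.normal(10, 0.6, 100),   # ~GB files
                               rng.normal(30, 5.0, 100)])
      X = np.vstack([small, large])                        # [log10(bytes), seconds]

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
      for c in range(2):
          members = X[km.labels_ == c]
          print(f"cluster {c}: n={len(members)}, "
                f"mean log10(size)={members[:, 0].mean():.2f}, "
                f"mean transfer time={members[:, 1].mean():.1f} s")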

  10. A Simple Analytic Model for Estimating Mars Ascent Vehicle Mass and Performance

    NASA Technical Reports Server (NTRS)

    Woolley, Ryan C.

    2014-01-01

    The Mars Ascent Vehicle (MAV) is a crucial component in any sample return campaign. In this paper we present a universal model for a two-stage MAV along with the analytic equations and simple parametric relationships necessary to quickly estimate MAV mass and performance. Ascent trajectories can be modeled as two-burn transfers from the surface with appropriate loss estimations for finite burns, steering, and drag. Minimizing lift-off mass is achieved by balancing optimized staging and an optimized path-to-orbit. This model allows designers to quickly find optimized solutions and to see the effects of design choices.
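
    A hedged sketch of such a sizing model (ideal rocket equation per stage plus a lumped delta-v loss allowance; all parameter values are hypothetical, not the paper's):

      import numpy as np

      # Two-stage vehicle sizing with the ideal rocket equation.
      g0 = 9.80665                   # m/s^2
      dv_total = 4300.0 + 350.0      # orbit speed + assumed gravity/drag losses, m/s
      isp = 285.0                    # hypothetical solid-motor Isp, s
      eps = 0.12                     # stage structural fraction m_dry/m_stage
      m_pay = 14.0                   # kg carried to orbit, e.g. a sample container

      m_above = m_pay
      for dv in (dv_total / 2, dv_total / 2):        # even split, upper stage first
          r = np.exp(dv / (g0 * isp))                # Tsiolkovsky mass ratio m0/mf
          if r * eps >= 1.0:
              raise ValueError("stage infeasible for this Isp/structure combination")
          # Stage mass (propellant + structure) needed to push everything above it:
          m_stage = m_above * (r - 1.0) / (1.0 - r * eps)
          m_above += m_stage

      print(f"estimated gross lift-off mass: {m_above:.1f} kg")   # ~109 kg here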

  12. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and perform trial-and-error guesses on the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source, so that users can further analyze their data in other analytic tools.

  13. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    ERIC Educational Resources Information Center

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  14. On the analytical form of the Earth's magnetic attraction expressed as a function of time

    NASA Technical Reports Server (NTRS)

    Carlheim-Gyllenskold, V.

    1983-01-01

    An attempt is made to express the Earth's magnetic attraction in simple analytical form using observations during the 16th to 19th centuries. Observations of the magnetic inclination in the 16th and 17th centuries are discussed.

  15. Evaluation of manometric temperature measurement, a process analytical technology tool for freeze-drying: part II measurement of dry-layer resistance.

    PubMed

    Tang, Xiaolin Charlie; Nail, Steven L; Pikal, Michael J

    2006-01-01

    The purpose of this work was to study the factors that may cause systematic errors in the manometric temperature measurement (MTM) procedure used to determine product dry-layer resistance to vapor flow. Product temperature and dry-layer resistance were obtained using MTM software installed on a laboratory freeze-dryer. The MTM resistance values were compared with the resistance values obtained using the "vial method." The product dry-layer resistances obtained by MTM, assuming a fixed temperature difference (ΔT; 2 °C), were lower than the actual values, especially when the product temperatures and sublimation rates were low, but with ΔT determined from the pressure rise data, more accurate results were obtained. MTM resistance values were generally lower than the values obtained with the vial method, particularly whenever freeze-drying was conducted under conditions that produced large variations in product temperature (i.e., low shelf temperature, low chamber pressure, and without thermal shields). In an experiment designed to magnify temperature heterogeneity, MTM resistance values were much lower than the simple average of the product resistances. However, in experiments where product temperatures were homogeneous, good agreement between MTM and vial-method resistances was obtained. The reason for the low MTM resistance problem is the fast vapor pressure rise from a few "warm" edge vials or vials with low resistance. With proper use of thermal shields, and the evaluation of ΔT from the data, MTM resistance data are accurate. Thus, the MTM method for determining dry-layer resistance is a useful tool for freeze-drying process analytical technology.

  16. Spacecraft formation control using analytical finite-duration approaches

    NASA Astrophysics Data System (ADS)

    Ben Larbi, Mohamed Khalil; Stoll, Enrico

    2018-03-01

    This paper derives a control concept for formation flight (FF) applications assuming circular reference orbits. The paper focuses on a general impulsive control concept for FF which is then extended to the more realistic case of non-impulsive thrust maneuvers. The control concept uses a description of the FF in relative orbital elements (ROE) instead of the classical Cartesian description, since the ROE provide direct insight into key aspects of the relative motion and are particularly suitable for relative orbit control purposes and collision avoidance analysis. Although Gauss' variational equations were first derived to offer a mathematical tool for processing orbit perturbations, they are suitable for several different applications. If the perturbation acceleration is due to a control thrust, Gauss' variational equations show the effect of such a control thrust on the Keplerian orbital elements. Integrating Gauss' variational equations offers a direct relation between velocity increments in the local vertical local horizontal frame and the subsequent change of Keplerian orbital elements. For proximity operations, these equations can be generalized from describing the motion of a single spacecraft to the description of the relative motion of two spacecraft. This will be shown for impulsive and finite-duration maneuvers. Based on that, an analytical tool to estimate the error induced through impulsive maneuver planning is presented. The resulting control schemes are simple and effective and thus also suitable for on-board implementation. Simulations show that the proposed concept improves the timing of the thrust maneuver executions and thus reduces the residual error of the formation control.
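
    The building block described here, the first-order effect of an impulse on near-circular orbital elements obtained by integrating Gauss' variational equations, can be sketched as follows (standard near-circular approximations; not the paper's full ROE control scheme):

      import numpy as np

      mu = 3.986004418e14             # Earth GM, m^3/s^2

      def delta_elements(a, i, u, dv_r, dv_t, dv_n):
          """Element changes for an impulse (radial, along-track, cross-track)
          applied at argument of latitude u on a near-circular orbit."""
          n = np.sqrt(mu / a**3)      # mean motion
          da  = 2.0 * dv_t / n
          dex = (dv_r * np.sin(u) + 2.0 * dv_t * np.cos(u)) / (n * a)
          dey = (-dv_r * np.cos(u) + 2.0 * dv_t * np.sin(u)) / (n * a)
          di  = dv_n * np.cos(u) / (n * a)
          dOm = dv_n * np.sin(u) / (n * a * np.sin(i))
          return da, dex, dey, di, dOm

      # 0.1 m/s along-track burn at u = 0 on a ~700-km orbit raises a by ~190 m:
      print(delta_elements(a=7078e3, i=np.radians(98), u=0.0,
                           dv_r=0.0, dv_t=0.1, dv_n=0.0))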

  17. Rapid Design of Gravity Assist Trajectories

    NASA Technical Reports Server (NTRS)

    Carrico, J.; Hooper, H. L.; Roszman, L.; Gramling, C.

    1991-01-01

    Several International Solar Terrestrial Physics (ISTP) missions require the design of complex gravity-assisted trajectories in order to investigate the interaction of the solar wind with the Earth's magnetic field. These trajectories present a formidable trajectory design and optimization problem. The philosophy and methodology that enable an analyst to design and analyse such trajectories are discussed. The so-called 'floating end point' targeting, which allows the inherently nonlinear multiple-body problem to be solved with simple linear techniques, is described. The combination of floating end point targeting and analytic approximations with a Newton-method targeter to achieve trajectory design goals quickly, even for the very sensitive double lunar swingby trajectories used by the ISTP missions, is demonstrated. A multiconic orbit integration scheme allows fast and accurate orbit propagation. A prototype software tool, Swingby, built for trajectory design and launch window analysis, is described.

  18. Tuning the spectral emittance of α-SiC open-cell foams up to 1300 K with their macro porosity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rousseau, B., E-mail: benoit.rousseau@univ-nantes.fr; Guevelou, S.; Mekeze-Monthe, A.

    2016-06-15

    A simple and robust analytical model is used to finely predict the spectral emittance in air up to 1300 K of α-SiC open-cell foams constituted of optically thick struts. The model integrates both the chemical composition and the macro-porosity and is valid only if the foams have volumes larger than the Representative Elementary Volumes required for determining their emittance. Infrared emission spectroscopy carried out on a doped silicon carbide single crystal, combined with homemade numerical tools based on 3D meshed images (a Monte Carlo ray-tracing code and a foam generator), makes it possible to understand the exact role of the cell network in emittance. Finally, one can tune the spectral emittance of α-SiC foams up to 1300 K by simply changing their porosity.

  19. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include the Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully in the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods is investigated in the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively.
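
    The multivariate-calibration idea can be sketched generically with PLS on synthetic overlapping "spectra" (stand-ins for the three drugs; peak shapes, noise, and concentrations are invented, not the paper's data):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      wl = np.linspace(200, 400, 201)                     # wavelength grid, nm

      def band(center, width):
          return np.exp(-0.5 * ((wl - center) / width) ** 2)

      # Hypothetical pure-component spectra (AML/VAL/HCT stand-ins):
      pure = np.vstack([band(250, 15), band(280, 20), band(320, 12)])

      C_train = rng.uniform(0, 1, size=(40, 3))           # training concentrations
      X_train = C_train @ pure + rng.normal(0, 0.002, (40, wl.size))

      pls = PLSRegression(n_components=3).fit(X_train, C_train)

      c_true = np.array([[0.3, 0.6, 0.1]])                # unknown mixture
      x_new = c_true @ pure + rng.normal(0, 0.002, (1, wl.size))
      print("true:", c_true.ravel(),
            "predicted:", pls.predict(x_new).ravel().round(3))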

  20. Athermal operation of multi-section slotted tunable lasers.

    PubMed

    Wallace, M J; O'Reilly Meehan, R; Enright, R; Bello, F; McCloskey, D; Barabadi, B; Wang, E N; Donegan, J F

    2017-06-26

    Two distinct athermal bias-current procedures based on thermal tuning are demonstrated for a low-cost, monolithic, three-section slotted single-mode laser, achieving mode-hop-free wavelength stability of ±0.04 nm / 5 GHz over a temperature range of 8-47 °C. This is the first time that athermal performance has been demonstrated for a three-section slotted laser with simple fabrication, and it is well within the 50 GHz grid spacing specified for DWDM systems. This performance is similar to experiments on more complex DS-DBR lasers, indicating that strong athermal performance can be achieved using our lower-cost three-section devices. An analytical model and thermoreflectance measurements provide further insight into the operation of multi-section lasers and lay the foundation for an accurate predictive tool for optimising such devices for athermal operation.

  1. Assessment of Trading Partners for China's Rare Earth Exports Using a Decision Analytic Approach

    PubMed Central

    He, Chunyan; Lei, Yalin; Ge, Jianping

    2014-01-01

    Chinese rare earth export policies currently result in accelerating the depletion of these resources. Thus adopting an optimal export trade selection strategy is crucial to determining and ultimately identifying the ideal trading partners. This paper introduces a multi-attribute decision-making methodology which is then used to select the optimal trading partner. In the method, an evaluation criteria system is established to assess the seven top trading partners along three dimensions: political relationships, economic benefits and industrial security. Specifically, a simple additive weighting model derived from an additive utility function is utilized to calculate, rank and select alternatives. Results show that Japan would be the optimal trading partner for Chinese rare earths. The criteria evaluation method of trading partners for China's rare earth exports provides the Chinese government with a tool to enhance rare earth industrial policies. PMID:25051534
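
    A minimal sketch of the simple additive weighting (SAW) step (partner scores and criterion weights below are invented for illustration; they are not the paper's data):

      import numpy as np

      partners = ["Japan", "USA", "Germany", "France"]
      # columns: political relationship, economic benefit, industrial security
      scores = np.array([
          [0.8, 0.9, 0.7],
          [0.6, 0.8, 0.5],
          [0.7, 0.6, 0.8],
          [0.7, 0.5, 0.6],
      ])
      weights = np.array([0.3, 0.4, 0.3])       # criterion weights, sum to 1

      # Normalize each criterion to [0, 1] (max normalization), then weight.
      norm = scores / scores.max(axis=0)
      totals = norm @ weights
      for name, s in sorted(zip(partners, totals), key=lambda p: -p[1]):
          print(f"{name:8s} {s:.3f}")           # highest total ranks first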

  2. Adiabatic topological quantum computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev’s surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  3. Adiabatic topological quantum computing

    DOE PAGES

    Cesare, Chris; Landahl, Andrew J.; Bacon, Dave; ...

    2015-07-31

    Topological quantum computing promises error-resistant quantum computation without active error correction. However, there is a worry that during the process of executing quantum gates by braiding anyons around each other, extra anyonic excitations will be created that will disorder the encoded quantum information. Here, we explore this question in detail by studying adiabatic code deformations on Hamiltonians based on topological codes, notably Kitaev’s surface codes and the more recently discovered color codes. We develop protocols that enable universal quantum computing by adiabatic evolution in a way that keeps the energy gap of the system constant with respect to the computation size and introduces only simple local Hamiltonian interactions. This allows one to perform holonomic quantum computing with these topological quantum computing systems. The tools we develop allow one to go beyond numerical simulations and understand these processes analytically.

  4. Electrochemical Sensing, Photocatalytic and Biological Activities of ZnO Nanoparticles: Synthesis via Green Chemistry Route

    NASA Astrophysics Data System (ADS)

    Yadav, L. S. Reddy; Archana, B.; Lingaraju, K.; Kavitha, C.; Suresh, D.; Nagabhushana, H.; Nagaraju, G.

    2016-05-01

    In this paper, we have successfully synthesized ZnO nanoparticles (NPs) via the solution combustion method using sugarcane juice as a novel fuel. The structure and morphology of the synthesized ZnO NPs have been analyzed using various analytical tools. The synthesized ZnO NPs exhibit excellent photocatalytic activity for the degradation of methylene blue dye, indicating that the ZnO NPs are potential photocatalytic semiconductor materials. The synthesized ZnO NPs also show good electrochemical sensing of dopamine. ZnO NPs exhibit significant bactericidal activity against Klebsiella aerogenes, Pseudomonas aeruginosa, Escherichia coli and Staphylococcus aureus using the agar well diffusion method. Furthermore, the ZnO NPs show good antioxidant activity by potentially scavenging 2,2-diphenyl-1-picrylhydrazyl (DPPH) radicals. The above studies clearly demonstrate the versatile applications of ZnO synthesized by a simple eco-friendly route.

  5. Mechanical break junctions: enormous information in a nanoscale package.

    PubMed

    Natelson, Douglas

    2012-04-24

    Mechanical break junctions, particularly those in which a metal tip is repeatedly moved in and out of contact with a metal film, have provided many insights into electronic conduction at the atomic and molecular scale, most often by averaging over many possible junction configurations. This averaging throws away a great deal of information, and Makk et al. in this issue of ACS Nano demonstrate that, with both simulated and real experimental data, more sophisticated two-dimensional analysis methods can reveal information otherwise obscured in simple histograms. As additional measured quantities come into play in break junction experiments, including thermopower, noise, and optical response, these more sophisticated analytic approaches are likely to become even more powerful. While break junctions are not directly practical for useful electronic devices, they are incredibly valuable tools for unraveling the electronic transport physics relevant for ultrascaled nanoelectronics.

  6. Numerical study of read scheme in one-selector one-resistor crossbar array

    NASA Astrophysics Data System (ADS)

    Kim, Sungho; Kim, Hee-Dong; Choi, Sung-Jin

    2015-12-01

    A comprehensive numerical circuit analysis of read schemes for a one-selector one-resistor (1S1R) resistance-change memory crossbar array is carried out. Three schemes (the ground, V/2, and V/3 schemes) are compared with each other in terms of sensing margin and power consumption. Without the aid of a complex analytical approach or SPICE-based simulation, a simple numerical iteration method is developed to simulate all current flows and node voltages within a crossbar array. Understanding such phenomena is essential in successfully evaluating the electrical specifications of selectors for suppressing intrinsic drawbacks of crossbar arrays, such as sneak current paths and series line resistance problems. This method provides a quantitative tool for the accurate analysis of crossbar arrays and provides guidelines for developing an optimal read scheme, array configuration, and selector device specifications.
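
    A back-of-the-envelope version of the read-margin question (a toy estimate that ignores line resistance and uses a crude selector model; the paper's numerical iteration solves the full network):

      # Worst-case V/2 read-scheme estimate for an N x N crossbar (toy model).
      N = 128                 # array size
      V_read = 1.0            # volts on the selected cell
      R_lrs, R_hrs = 1e4, 1e6 # low/high resistance states, ohms

      def cell_current(v, r, selectivity=1.0):
          """Crude selector model: the current of a half-selected cell (at
          V_read/2) is suppressed by 'selectivity' relative to Ohm's law."""
          if abs(v - V_read) < 1e-12:
              return V_read / r                       # fully selected cell
          return (V_read / 2) / r / selectivity       # half-selected leakage

      # Signal of the selected cell (worst case: reading HRS) vs. total leakage
      # of the 2(N-1) half-selected cells on its row and column (all in LRS).
      for sel in (1.0, 1e4):                          # 1.0 ~ "no selector"
          i_signal = cell_current(V_read, R_hrs)
          i_sneak = 2 * (N - 1) * cell_current(V_read / 2, R_lrs, sel)
          print(f"selectivity {sel:g}: signal {i_signal:.2e} A, "
                f"sneak {i_sneak:.2e} A, ratio {i_signal / i_sneak:.3g}")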

  7. Plotting equation for gaussian percentiles and a spreadsheet program for generating probability plots

    USGS Publications Warehouse

    Balsillie, J.H.; Donoghue, J.F.; Butler, K.M.; Koch, J.L.

    2002-01-01

    Two-dimensional plotting tools can be of invaluable assistance in analytical scientific pursuits, and have been widely used in the analysis and interpretation of sedimentologic data. We consider, in this work, the use of arithmetic probability paper (APP). Most statistical computer applications do not allow for the generation of APP plots because of the apparent intractable nonlinearity of the percentile (or probability) axis of the plot. We have solved this problem by identifying equations for determining the plotting positions of Gaussian percentiles (or probabilities), so that APP plots can easily be computer generated. An EXCEL example is presented, and a programmed, simple-to-use EXCEL application template is hereby made publicly available, whereby a complete granulometric analysis, including data listing, moment measure calculations, and frequency and cumulative APP plots, is automatically produced.
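
    The key trick, mapping plotting positions to Gaussian z-scores so the probability axis becomes linear, can be sketched in a few lines (a generic sketch using the Hazen plotting position; the paper derives its own plotting-position equation and ships an EXCEL template):

      import numpy as np
      from scipy.stats import norm

      # Toy grain-size sample in phi units (invented data):
      grain_phi = np.sort(np.array([1.2, 1.7, 1.9, 2.1, 2.2, 2.4, 2.6, 3.0]))
      n = grain_phi.size
      p = (np.arange(1, n + 1) - 0.5) / n      # Hazen plotting positions
      z = norm.ppf(p)                          # percentile -> Gaussian z-score

      # On (z, grain_phi) axes a Gaussian sample plots as a straight line;
      # the slope estimates sigma and the intercept estimates the mean.
      slope, intercept = np.polyfit(z, grain_phi, 1)
      print(f"graphic mean ~ {intercept:.2f} phi, graphic sigma ~ {slope:.2f} phi")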

  8. Assessment of trading partners for China's rare earth exports using a decision analytic approach.

    PubMed

    He, Chunyan; Lei, Yalin; Ge, Jianping

    2014-01-01

    Chinese rare earth export policies currently result in accelerating the depletion of these resources. Thus adopting an optimal export trade selection strategy is crucial to determining and ultimately identifying the ideal trading partners. This paper introduces a multi-attribute decision-making methodology which is then used to select the optimal trading partner. In the method, an evaluation criteria system is established to assess the seven top trading partners along three dimensions: political relationships, economic benefits and industrial security. Specifically, a simple additive weighting model derived from an additive utility function is utilized to calculate, rank and select alternatives. Results show that Japan would be the optimal trading partner for Chinese rare earths. The criteria evaluation method of trading partners for China's rare earth exports provides the Chinese government with a tool to enhance rare earth industrial policies.

  9. minimega

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Fritz, John Floren

    2013-08-27

    Minimega is a simple emulytics platform for creating testbeds of networked devices. The platform consists of easily deployable tools to facilitate bringing up large networks of virtual machines including Windows, Linux, and Android. Minimega attempts to allow experiments to be brought up quickly with nearly no configuration. Minimega also includes tools for simple cluster management, as well as tools for creating Linux based virtual machine images.

  10. Phonon scattering in nanoscale systems: lowest order expansion of the current and power expressions

    NASA Astrophysics Data System (ADS)

    Paulsson, Magnus; Frederiksen, Thomas; Brandbyge, Mads

    2006-04-01

    We use the non-equilibrium Green's function method to describe the effects of phonon scattering on the conductance of nano-scale devices. Useful and accurate approximations are developed that provide both (i) computationally simple formulas for large systems and (ii) simple analytical models. In addition, the simple models can be used to fit experimental data and provide physical parameters.

  11. On the Application of Euler Deconvolution to the Analytic Signal

    NASA Astrophysics Data System (ADS)

    Fedi, M.; Florio, G.; Pasteka, R.

    2005-05-01

    In recent years, papers on Euler deconvolution (ED) have used formulations that account for the unknown background field, allowing the structural index (N) to be treated as an unknown to be solved for together with the source coordinates. Among them, Hsu (2002) and Fedi and Florio (2002) independently pointed out that using an adequate m-order derivative of the field, instead of the field itself, allows solving for both N and the source position. For the same reason, Keating and Pilkington (2004) proposed the ED of the analytic signal. A function analyzed by ED must be homogeneous but also harmonic, because it must be possible to compute its vertical derivative, as is well known from potential field theory. Huang et al. (1995) demonstrated that the analytic signal is a homogeneous function, but, for instance, it is rather obvious that the magnetic field modulus (corresponding to the analytic signal of a gravity field) is not a harmonic function (e.g., Grant & West, 1965). Thus, a straightforward application of ED to the analytic signal is not possible, because the vertical derivative of this function cannot be computed correctly with standard potential-field analysis tools. In this note we theoretically and empirically check what kinds of errors are caused in ED by this incorrect assumption about the harmonicity of the analytic signal. We discuss results on profile and map synthetic data, and use a simple method to compute the vertical derivative of non-harmonic functions measured on a horizontal plane. Our main conclusions are: 1. To approximate a correct evaluation of the vertical derivative of a non-harmonic function, it is useful to compute it by finite differences, using upward continuation. 2. The errors in the vertical derivative computed as if the analytic signal were harmonic reflect mainly on the structural index estimate; these errors can mislead an interpretation even though the depth estimates are almost correct. 3. Consistent estimates of depth and S.I. are instead obtained by using a finite-difference vertical derivative of the analytic signal. 4. Analysis of a case history confirms the strong error in the estimation of the structural index if the analytic signal is treated as a harmonic function.
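
    Conclusion 1 can be sketched concretely: a finite-difference vertical derivative obtained by upward-continuing the measured profile by a small step (an illustrative sketch on a toy anomaly; grid, source, and step size are invented):

      import numpy as np

      def upward_continue(f, dx, dz):
          """Continue a profile upward by dz with the wavenumber-domain
          operator exp(-|k| dz)."""
          k = 2 * np.pi * np.fft.fftfreq(f.size, d=dx)
          return np.fft.ifft(np.fft.fft(f) * np.exp(-np.abs(k) * dz)).real

      def vertical_derivative(f, dx, dz):
          # One-sided finite difference, z positive upward (away from sources).
          return (upward_continue(f, dx, dz) - f) / dz

      # Toy anomaly: a simple decaying profile over a source at depth z0.
      x = np.linspace(-50.0, 50.0, 512)
      dx = x[1] - x[0]
      z0 = 5.0
      field = 1.0 / (x**2 + z0**2)

      dfdz = vertical_derivative(field, dx, dz=0.5)
      print("min/max of df/dz:", dfdz.min(), dfdz.max())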

  12. Development of a simple intensified fermentation strategy for growth of Magnetospirillum gryphiswaldense MSR-1: Physiological responses to changing environmental conditions.

    PubMed

    Fernández-Castané, Alfred; Li, Hong; Thomas, Owen R T; Overton, Tim W

    2018-06-01

    The development of a simple pH-stat fed-batch fermentation strategy for the production of Magnetospirillum gryphiswaldense MSR-1 and magnetosomes (nanoscale magnetic organelles with biotechnological applications) is described. Flow cytometry was exploited as a powerful analytical tool for process development, enabling rapid monitoring of cell morphology, physiology and polyhydroxyalkanoate production. The pH-stat fed-batch growth strategy was developed by varying the concentrations of the carbon source (lactic acid) and the alternative electron acceptor (sodium nitrate) in the feed. Growth conditions were optimized on the basis of biomass concentration, cellular magnetism (indicative of magnetosome production), and intracellular iron concentration. The highest biomass concentration and cellular iron content achieved were an optical density at 565 nm of 15.5 (equivalent to 4.2 g DCW/L) and 33.1 mg iron/g DCW, respectively. This study demonstrates the importance of analyzing bacterial physiology during fermentation development and will potentially aid the industrial production of magnetosomes, which can be used in a wide range of biotechnology and healthcare applications. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  13. Study on the potential application of salivary inorganic anions in clinical diagnosis by capillary electrophoresis coupled with contactless conductivity detection.

    PubMed

    Guo, Lin; Wang, Yu; Zheng, Yiliang; Huang, Zhipeng; Cheng, Yiyuan; Ye, Jiannong; Chu, Qingcui; Huang, Dongping

    2016-03-01

    A capillary electrophoresis approach with capacitively coupled contactless conductivity detection has been developed for the determination of inorganic metabolites (thiocyanate, nitrite and nitrate) in human saliva. Field-amplified sample injection, a simple sample stacking technique, was used for online preconcentration of the above inorganic anions. A selective separation of the target anions from other coexisting constituents present in saliva could be obtained within 14 min in a 10 mmol/L His-90 mmol/L HAc buffer (pH 3.70) at a separation voltage of -18 kV. The limits of detection and limits of quantification of the three analytes were within the ranges of 3.1-4.9 ng/mL (S/N=3) and 10-16 ng/mL (S/N=10), respectively. The average recovery data were in the range of 81-108% at three different concentrations. This method provides a simple, rapid and direct approach for metabolite analyses of nitric oxide and cyanide based on noninvasive saliva samples, and presents a potential fast screening tool for clinical testing. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. A simple analytical method for determining the atmospheric dispersion of upward-directed high velocity releases

    NASA Astrophysics Data System (ADS)

    Palazzi, E.

    The evaluation of the atmospheric dispersion of a cloud arising from a sudden release of flammable or toxic materials is an essential tool for properly designing flares, vents and other safety devices and for quantifying the potential risk related to existing ones or arising from the various kinds of accidents which can occur in chemical plants. Among the methods developed to treat the important case of upward-directed jets, Hoehne's procedure for determining the behaviour and extent of the flammability zone is extensively utilized, particularly for petrochemical plants. In a previous study, a substantial simplification of this procedure was achieved by correlating the experimental data with an empirical formula, making it possible to obtain a mathematical description of the boundaries of the flammable cloud. Following a theoretical approach, a more general model is developed in the present work, applicable to the various kinds of design problems and/or risk evaluations regarding upward-directed releases from high-velocity sources. It is also demonstrated that the model gives conservative results if applied outside the range of Hoehne's experimental conditions. Moreover, with simple modifications, the same approach could easily be applied to the atmospheric dispersion of releases directed in any direction.

  15. Keeping It Simple: Can We Estimate Malting Quality Potential Using an Isothermal Mashing Protocol and Common Laboratory Instrumentation?

    USDA-ARS?s Scientific Manuscript database

    Current methods for generating malting quality metrics have been developed largely to support commercial malting and brewing operations, providing accurate, reproducible analytical data to guide malting and brewing production. Infrastructure to support these analytical operations often involves sub...

  16. Analytical evaluation of current starch methods used in the international sugar industry: Part I

    USDA-ARS?s Scientific Manuscript database

    Several analytical starch methods currently exist in the international sugar industry that are used to prevent or mitigate starch-related processing challenges as well as assess the quality of traded end-products. These methods use simple iodometric chemistry, mostly potato starch standards, and uti...

  17. IBM's Health Analytics and Clinical Decision Support.

    PubMed

    Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W

    2014-08-15

    This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described, and examples of the role of each resource are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive, but support, healthcare transformation.

  18. Analytical analysis of the temporal asymmetry between seawater intrusion and retreat

    NASA Astrophysics Data System (ADS)

    Rathore, Saubhagya Singh; Zhao, Yue; Lu, Chunhui; Luo, Jian

    2018-01-01

    The quantification of timescales associated with the movement of the seawater-freshwater interface is useful for developing effective management strategies for controlling seawater intrusion (SWI). In this study, for the first time, we derive an explicit analytical solution for the timescales of SWI and seawater retreat (SWR) in a confined, homogeneous coastal aquifer system under the quasi-steady assumption, based on a classical sharp-interface solution for approximating freshwater outflow rates into the sea. The flow continuity and hydrostatic equilibrium across the interface are identified as two primary mechanisms governing timescales of the interface movement driven by an abrupt change in discharge rates or hydraulic heads at the inland boundary. Through theoretical analysis, we quantified the dependence of interface-movement timescales on porosity, hydraulic conductivity, aquifer thickness, aquifer length, density ratio, and boundary conditions. Predictions from the analytical solution closely agreed with those from numerical simulations. In addition, we define a temporal asymmetry index (the ratio of the SWI timescale to the SWR timescale) to represent the resilience of the coastal aquifer in response to SWI. The developed analytical solutions provide a simple tool for the quick assessment of SWI and SWR timescales and reveal that the temporal asymmetry between SWI and SWR mainly relies on the initial and final values of the freshwater flux at the inland boundary, and is weakly affected by aquifer parameters. Furthermore, we theoretically examined the log-linear relationship between the timescale and the freshwater flux at the inland boundary, and found that the relationship may be approximated by two linear functions with slopes of -2 and -1 for large changes in the boundary flux for SWI and SWR, respectively.

  19. Therapeutic drug monitoring of beta-lactam antibiotics - Influence of sample stability on the analysis of piperacillin, meropenem, ceftazidime and flucloxacillin by HPLC-UV.

    PubMed

    Pinder, Nadine; Brenner, Thorsten; Swoboda, Stefanie; Weigand, Markus A; Hoppe-Tichy, Torsten

    2017-09-05

    Therapeutic drug monitoring (TDM) is a useful tool to optimize antibiotic therapy. Increasing interest in alternative dosing strategies for beta-lactam antibiotics, e.g. continuous or prolonged infusion, requires a feasible analytical method for quantification of these antimicrobial agents. However, pre-analytical issues, including sample handling and stability, must be considered to provide valuable analytical results. For the simultaneous determination of piperacillin, meropenem, ceftazidime and flucloxacillin, a high performance liquid chromatography (HPLC) method including protein precipitation was established, utilizing ertapenem as internal standard. Long-term stability of stock solutions and plasma samples was monitored. Furthermore, whole-blood stability of the analytes in heparinized blood tubes was investigated, comparing storage under ambient conditions and at 2-8°C. Calibration ranges of 5-200 μg/ml (piperacillin, ceftazidime, flucloxacillin) and 2-200 μg/ml (meropenem) were linear with r² > 0.999; precision and inaccuracy were <9% and <11%, respectively. The successfully validated HPLC assay was applied to clinical samples and stability investigations. At -80°C, plasma samples were stable for 9 months (piperacillin, meropenem) or 13 months (ceftazidime, flucloxacillin). Concentrations of the four beta-lactam antibiotics in whole-blood tubes were found to remain within specifications for 8 h when stored at 2-8°C, but not at room temperature. The presented method is a rapid and simple option for routine TDM of piperacillin, meropenem, ceftazidime and flucloxacillin. Whereas long-term storage of beta-lactam samples at -80°C is possible for at least 9 months, whole-blood tubes should be kept refrigerated until analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
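
    As an illustration of the linearity criterion reported above, a minimal sketch on synthetic data; the concentration range and the r² > 0.999 acceptance limit come from the abstract, while the response values and the peak-area-ratio model are assumptions:

    ```python
    import numpy as np

    # Hypothetical calibration levels (ug/ml) spanning the validated
    # 5-200 ug/ml range, with synthetic peak-area ratios (analyte/IS).
    conc = np.array([5, 10, 25, 50, 100, 150, 200], dtype=float)
    ratio = np.array([0.051, 0.099, 0.252, 0.498, 1.004, 1.497, 2.010])

    # Ordinary least-squares line: ratio = slope * conc + intercept
    slope, intercept = np.polyfit(conc, ratio, 1)

    # Coefficient of determination r^2 for the fitted line
    pred = slope * conc + intercept
    ss_res = np.sum((ratio - pred) ** 2)
    ss_tot = np.sum((ratio - ratio.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot

    print(f"slope={slope:.5f}, intercept={intercept:.5f}, r2={r2:.5f}")
    assert r2 > 0.999, "calibration fails the linearity criterion"
    ```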

  20. The Superior Lambert Algorithm

    NASA Astrophysics Data System (ADS)

    der, G.

    2011-09-01

    Lambert algorithms are used extensively for initial orbit determination, mission planning, space debris correlation, and missile targeting, just to name a few applications. Due to the significance of the Lambert problem in astrodynamics, Gauss, Battin, Godal, Lancaster, Gooding, Sun and many others (References 1 to 15) have provided numerous formulations leading to various analytic solutions and iterative methods. Most Lambert algorithms and their computer programs can only work within one revolution, break down or converge slowly when the transfer angle is near zero or 180 degrees, and their multi-revolution limitations are either ignored or barely addressed. Despite claims of robustness, many Lambert algorithms fail without notice, and the users seldom have a clue why. The DerAstrodynamics lambert2 algorithm, which is based on the analytic solution formulated by Sun, works for any number of revolutions and converges rapidly at any transfer angle. It provides significant capability enhancements over every other Lambert algorithm in use today, including improved speed, accuracy, robustness, and multi-revolution capabilities as well as implementation simplicity. Additionally, the lambert2 algorithm provides a powerful tool for solving the angles-only problem, which involves three lines of sight captured by optical sensors or systems such as the Air Force Space Surveillance System (AFSSS), without the artificial singularities pointed out by Gooding in Reference 16. The analytic solution is derived from the extended Godal’s time equation by Sun, while the iterative method of solution is that of Laguerre, modified for robustness. The Keplerian solution of a Lambert algorithm can be extended to include the non-Keplerian terms of the Vinti algorithm via a simple targeting technique (References 17 to 19). Accurate analytic non-Keplerian trajectories can be predicted for satellites and ballistic missiles, while performing at least 100 times faster than most numerical integration methods.
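
    The abstract credits Laguerre's iterative method, modified for robustness, as the root-finder. A generic scalar Laguerre step is sketched below, demonstrated on Kepler's equation rather than on Sun's time equation itself; the choice n = 5 and the abs() guard inside the square root are illustrative robustness tweaks, not details taken from the paper:

    ```python
    import math

    def laguerre_root(f, fp, fpp, x0, n=5.0, tol=1e-12, max_iter=50):
        """Find a root of f(x) = 0 with Laguerre's method for scalar equations.

        Iterates x <- x - n*f / (f' +/- sqrt(|(n-1)^2 f'^2 - n(n-1) f f''|)),
        choosing the sign that maximizes the denominator magnitude.
        """
        x = x0
        for _ in range(max_iter):
            F, Fp, Fpp = f(x), fp(x), fpp(x)
            if abs(F) < tol:
                return x
            disc = math.sqrt(abs((n - 1.0) ** 2 * Fp ** 2 - n * (n - 1.0) * F * Fpp))
            denom = Fp + disc if abs(Fp + disc) > abs(Fp - disc) else Fp - disc
            x -= n * F / denom
        return x

    # Demo: Kepler's equation E - e*sin(E) = M for e = 0.7, M = 1.2 rad.
    e, M = 0.7, 1.2
    E = laguerre_root(lambda E: E - e * math.sin(E) - M,
                      lambda E: 1.0 - e * math.cos(E),
                      lambda E: e * math.sin(E),
                      x0=M)
    print(f"E = {E:.12f}, residual = {E - e*math.sin(E) - M:.2e}")
    ```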

  1. Harnessing Scientific Literature Reports for Pharmacovigilance

    PubMed Central

    Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-01-01

    Objectives: We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers’ capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. Methods: A prototype, open source, web-based, software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. Results: All usability test participants cited the tool’s ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool’s automated literature search relative to a manual ‘all fields’ PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Conclusions: Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction. PMID:28326432
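
    Disproportionality signal scores of the kind mentioned above are conventionally computed from a 2x2 contingency table of drug-event co-mention counts. A minimal sketch using the proportional reporting ratio (PRR), one common choice; the tool's exact scoring statistic is not specified in the abstract, and the counts below are hypothetical:

    ```python
    import math

    def prr(a, b, c, d):
        """Proportional reporting ratio for a drug-event pair.

        a: citations indexed with the drug AND the event
        b: citations indexed with the drug but not the event
        c: citations indexed with the event but not the drug
        d: citations indexed with neither
        """
        prr_value = (a / (a + b)) / (c / (c + d))
        # Approximate 95% CI on the log scale (standard Wald interval)
        se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
        lo = math.exp(math.log(prr_value) - 1.96 * se)
        hi = math.exp(math.log(prr_value) + 1.96 * se)
        return prr_value, lo, hi

    # Hypothetical counts for one candidate drug-ADE pair from MeSH indexing
    print(prr(a=40, b=1960, c=600, d=97400))  # PRR ~ 3.3
    ```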

  2. Integrative workflows for metagenomic analysis

    PubMed Central

    Ladoukakis, Efthymios; Kolisis, Fragiskos N.; Chatziioannou, Aristotelis A.

    2014-01-01

    The rapid evolution of sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. These technologies constitute a combination of high-throughput analytical protocols, coupled to delicate measuring techniques, that can discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e., Sanger sequencing). From a bioinformatic perspective, this means many gigabytes of data are generated from each single sequencing experiment, rendering data management, and even storage, critical bottlenecks for the overall analytical endeavor. The complexity is further aggravated by the versatility of the available processing steps, represented by the numerous bioinformatic tools that are essential for each analytical task in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or for the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires substantial computational resources, for which the sole realistic solution is the utilization of cloud computing infrastructures. In this review article we discuss the different integrative bioinformatic solutions available, which address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control, and annotation of metagenomic data, embracing various major sequencing technologies and applications. PMID:25478562

  3. Simulating ground water-lake interactions: Approaches and insights

    USGS Publications Warehouse

    Hunt, R.J.; Haitjema, H.M.; Krohelski, J.T.; Feinstein, D.T.

    2003-01-01

    Approaches for modeling lake-ground water interactions have evolved significantly from early simulations that used fixed lake stages specified as constant head to sophisticated LAK packages for MODFLOW. Although model input can be complex, the LAK package capabilities and output are superior to methods that rely on a fixed lake stage, and compare well to other simple methods where lake stage can be calculated. Regardless of the approach, guidelines presented here for model grid size, location of three-dimensional flow, and extent of vertical capture can facilitate the construction of appropriately detailed models that simulate important lake-ground water interactions without adding unnecessary complexity. In addition to MODFLOW approaches, lake simulation has been formulated in terms of analytic elements. The analytic element lake package showed acceptable agreement with a published LAK1 problem, even though there were differences in the total lake conductance and the number of layers used in the two models. The grid size used in the original LAK1 problem, however, violated a grid size guideline presented in this paper. Grid sensitivity analyses demonstrated that an appreciable discrepancy in the distribution of stream and lake flux was related to the large grid size used in the original LAK1 problem. This artifact is expected regardless of which MODFLOW LAK package is used. When the grid size was reduced, the finite-difference formulation approached the analytic element results. These insights and guidelines can help ensure that the proper lake simulation tool is being selected and applied.

  4. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are lesser-known techniques that may also prove useful.

  5. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    ERIC Educational Resources Information Center

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb, or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced: a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  6. Analytical Tools for Affordability Analysis

    DTIC Science & Technology

    2015-05-01

    Report by David Tate, Cost Analysis and Research Division, Institute for Defense Analyses (4850 Mark Center Drive, Alexandria, VA 22311-1882). The recoverable fragments describe cost-estimating models for affordability analysis, including unit cost as a function of learning and production rate (Womer) and learning-with-forgetting models in which learning depreciates over time (Benkard).

  7. Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft

    DTIC Science & Technology

    2013-03-01

    This study addresses operational analysis of time-optimal maneuvering for the Singapore-developed X-SAT imaging spacecraft. The analysis is facilitated through the use of AGI’s Systems Tool Kit (STK) software and an Analytic Hierarchy Process (AHP)-based assessment.

  8. Social Capital: An Analytical Tool for Exploring Lifelong Learning and Community Development. CRLRA Discussion Paper.

    ERIC Educational Resources Information Center

    Kilpatrick, Sue; Field, John; Falk, Ian

    The possibility of using the concept of social capital as an analytical tool for exploring lifelong learning and community development was examined. The following were among the topics considered: (1) differences between definitions of the concept of social capital that are based on collective benefit and those that define social capital as a…

  9. The Metaphorical Department Head: Using Metaphors as Analytic Tools to Investigate the Role of Department Head

    ERIC Educational Resources Information Center

    Paranosic, Nikola; Riveros, Augusto

    2017-01-01

    This paper reports the results of a study that examined the ways a group of department heads in Ontario, Canada, describe their role. Despite their ubiquity and importance, department heads have been seldom investigated in the educational leadership literature. The study uses the metaphor as an analytic tool to examine the ways participants talked…

  10. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

    This project is envisioned as a foundation for future work by NASIC analysts, who will use the tools identified in this study. Though the study took all three categories of analytical capabilities into account, most (90%) of the SRA team’s effort focused on identifying and analyzing tools.

  11. Analysis of anabolic androgenic steroids in urine by full-capillary sample injection combined with a sweeping CE stacking method.

    PubMed

    Wang, Chun-Chi; Cheng, Shu-Fang; Cheng, Hui-Ling; Chen, Yen-Ling

    2013-02-01

    This study describes an on-line stacking CE approach using sweeping with whole-capillary sample filling for analyzing five anabolic androgenic steroids in urine samples. The five anabolic steroids detected were androstenedione, testosterone, epitestosterone, boldenone, and clostebol. Anabolic androgenic steroids are abused in sport doping because they can promote muscle growth; therefore, a sensitive detection method is needed for monitoring the urine samples of athletes. In this research, a reliable stacking capillary electrophoresis method was established for the analysis of anabolic steroids in urine. After liquid-liquid extraction with n-hexane, the supernatant was dried and reconstituted with 30 mM phosphate buffer (pH 5.00) and loaded into the capillary by hydrodynamic injection (10 psi, 99.9 s). Stacking and separation were accomplished simultaneously at -20 kV in phosphate buffer (30 mM, pH 5.0) containing 100 mM sodium dodecyl sulfate and 40% methanol. During method validation, calibration curves were linear (r≥0.990) over a range of 50-1,000 ng/mL for the five analytes. In the evaluation of precision and accuracy, the absolute values of the RSD and the RE in the intra-day (n=3) and inter-day (n=5) analyses were all less than 6.6%. The limit of detection for the five analytes was 30 ng/mL (S/N=5, sampling 99.9 s at 10 psi). Compared with simple MEKC, this stacking method provided a 108- to 175-fold increase in sensitivity. This simple and sensitive stacking method could be used as a powerful tool for monitoring the illegal use of doping agents.

  12. An immunochromatographic assay for rapid and direct detection of 3-amino-5-morpholino-2-oxazolidone (AMOZ) in meat and feed samples.

    PubMed

    Li, Shuqun; Song, Juan; Yang, Hong; Cao, Biyun; Chang, Huafang; Deng, Anping

    2014-03-15

    Furaltadone (FTD) is a type of nitrofuran and has been banned in many countries as a veterinary drug in food-producing animals owing to its potential carcinogenicity and mutagenicity. FTD is unstable in vivo, rapidly metabolizing to 3-amino-5-morpholinomethyl-2-oxazolidinone (AMOZ); thus AMOZ can be used as an indicator of illegal usage of FTD. Usually, for the determination of nitrofurans, the analyte is a derivative of the metabolite rather than the metabolite itself. In this study, based on a monoclonal antibody (mAb) against AMOZ, a competitive immunochromatographic assay (ICA) using a colloidal gold-mAb probe was developed for rapid and direct detection of AMOZ, without a derivatization step, in meat and feed samples. The intensity of the red color in the test line is inversely related to the analyte concentration, and the visual detection limit was found to be 10 ng mL⁻¹. The assay was simple and convenient to perform because the tedious and time-consuming derivatization step was avoided; ICA detection was completed within 10 min. The ICA strips could be used for 7 weeks at room temperature without significant loss of activity. AMOZ-spiked samples were detected by ICA and confirmed by enzyme-linked immunosorbent assay, and the results of the two methods were in good agreement. The proposed ICA provides a feasible tool for simple, sensitive, rapid, convenient and semi-quantitative detection of AMOZ in meat and feed samples on site. To our knowledge, this is the first report of an ICA for direct detection of AMOZ. © 2013 Society of Chemical Industry.

  13. A SIMPLE, EFFICIENT SOLUTION OF FLUX-PROFILE RELATIONSHIPS IN THE ATMOSPHERIC SURFACE LAYER

    EPA Science Inventory

    This note describes a simple scheme for analytical estimation of the surface layer similarity functions from state variables. What distinguishes this note from the many previous papers on this topic is that this method is specifically targeted for numerical models where simplici...

  14. Facilitating Research and Learning in Petrology and Geochemistry through Classroom Applications of Remotely Operable Research Instrumentation

    NASA Astrophysics Data System (ADS)

    Ryan, J. G.

    2012-12-01

    Bringing the use of cutting-edge research tools into student classroom experiences has long been a popular educational strategy in the geosciences and other STEM disciplines. The NSF CCLI and TUES programs have funded a large number of projects that placed research-grade instrumentation at educational institutions for instructional use and for supporting undergraduate research activities. While student and faculty response to these activities has largely been positive, a range of challenges exists related to their educational effectiveness. Many of the obstacles these approaches have faced relate to the "scaling up" of research mentoring experiences (e.g., providing training and time for use for an entire classroom of students, as opposed to one or two), and to time tradeoffs between providing technical training for effective instrument use and covering course content. The biggest challenge has often been simple logistics: a single instrument, housed in a different space, is difficult to integrate effectively into instructional activities. My CCLI-funded project sought primarily to knock down the logistical obstacles to research instrument use by taking advantage of remote instrument operation technologies, which allow the in-classroom use of networked analytical tools. Remote use of electron microprobe and SEM instruments of the Florida Center for Analytical Electron Microscopy (FCAEM) in Miami, FL was integrated into two geoscience courses at USF in Tampa, FL. Remote operation permitted the development of whole-class laboratory exercises to familiarize students with the tools, their function, and their capabilities, and it allowed students to collect high-quality chemical and image data on their own prepared samples in the classroom during laboratory periods. These activities improve student engagement in the course, appear to improve learning of key concepts in mineralogy and petrology, and have led to students pursuing independent research projects, as well as requesting additional Geology elective courses offering similar kinds of experiences. I have sustained these activities post-project via student lab fees to pay for in-class microprobe time.

  15. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  16. Analytical Computation of Energy-Energy Correlation at Next-to-Leading Order in QCD [The Energy-Energy Correlation at Next-to-Leading Order in QCD, Analytically]

    DOE PAGES

    Dixon, Lance J.; Luo, Ming-xing; Shtabovenko, Vladyslav; ...

    2018-03-09

    Here, the energy-energy correlation (EEC) between two detectors in e+e− annihilation was computed analytically at leading order in QCD almost 40 years ago, and numerically at next-to-leading order (NLO) starting in the 1980s. We present the first analytical result for the EEC at NLO, which is remarkably simple, and facilitates analytical study of the perturbative structure of the EEC. We provide the expansion of the EEC in the collinear and back-to-back regions through next-to-leading power, information which should aid resummation in these regions.
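
    For reference, the EEC is conventionally defined as (standard definition from the literature, not quoted from this record):

    $$ \mathrm{EEC}(\chi) \;=\; \sum_{a,b} \int \frac{E_a E_b}{Q^2}\, \delta(\cos\theta_{ab} - \cos\chi)\, \frac{d\sigma_{e^+e^- \to a\,b\,X}}{\sigma_{\mathrm{tot}}} $$

    where the sum runs over pairs of detected particles, E_a and E_b are their energies, Q is the center-of-mass energy, and \theta_{ab} is the angle between them.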

  17. Analytical Computation of Energy-Energy Correlation at Next-to-Leading Order in QCD [The Energy-Energy Correlation at Next-to-Leading Order in QCD, Analytically]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dixon, Lance J.; Luo, Ming-xing; Shtabovenko, Vladyslav

    Here, the energy-energy correlation (EEC) between two detectors in e+e− annihilation was computed analytically at leading order in QCD almost 40 years ago, and numerically at next-to-leading order (NLO) starting in the 1980s. We present the first analytical result for the EEC at NLO, which is remarkably simple, and facilitates analytical study of the perturbative structure of the EEC. We provide the expansion of the EEC in the collinear and back-to-back regions through next-to-leading power, information which should aid resummation in these regions.

  18. Identification of "At Risk" Students Using Learning Analytics: The Ethical Dilemmas of Intervention Strategies in a Higher Education Institution

    ERIC Educational Resources Information Center

    Lawson, Celeste; Beer, Colin; Rossi, Dolene; Moore, Teresa; Fleming, Julie

    2016-01-01

    Learning analytics is an emerging field in which sophisticated analytic tools are used to inform and improve learning and teaching. Researchers within a regional university in Australia identified an association between interaction and student success in online courses and subsequently developed a learning analytics system aimed at informing…

  19. Tools for studying dry-cured ham processing by using computed tomography.

    PubMed

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt content, water content and water activity (a(w)) during processing is of special interest. In this paper, predictive models for salt content (R² = 0.960 and RMSECV = 0.393), water content (R² = 0.912 and RMSECV = 1.751), and a(w) (R² = 0.906 and RMSECV = 0.008), covering the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w), in terms of both content and distribution, throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
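
    The abstract reports regression statistics (R², RMSECV) typical of multivariate calibration. A minimal sketch of this style of model-building on synthetic data, using partial least squares via scikit-learn; PLS is an assumed choice here, and all predictor names and numbers are invented:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)

    # Synthetic predictors: mean CT attenuation (HU) of a ham region at two
    # tube voltages; synthetic response: salt content (% w/w).
    X = rng.normal(loc=[60.0, 80.0], scale=5.0, size=(40, 2))
    salt = 0.05 * X[:, 0] + 0.02 * X[:, 1] + rng.normal(0.0, 0.2, 40)

    pls = PLSRegression(n_components=2)
    pred = cross_val_predict(pls, X, salt, cv=5).ravel()

    # Cross-validated figures of merit, analogous to those in the abstract
    rmsecv = np.sqrt(np.mean((salt - pred) ** 2))
    r2 = 1 - np.sum((salt - pred) ** 2) / np.sum((salt - salt.mean()) ** 2)
    print(f"R2 = {r2:.3f}, RMSECV = {rmsecv:.3f}")
    ```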

  20. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    NASA Astrophysics Data System (ADS)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the on-going development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space are experienced as a dynamic entity whose spatial properties may vary from one part of the space to another; therefore, the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices, with certain properties in each slice, thus becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is being developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool would be useful for assisting design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how spatial properties change dynamically throughout the space and allows the prediction of potential design problems. Integrating the multi-slice analytical tool in a BIM-based design process could thereby assist architects to generate better designs and to avoid unnecessary costs that are often caused by failure to identify problems during design development stages.

  1. A simple formula for the effective complex conductivity of periodic fibrous composites with interfacial impedance and applications to biological tissues

    NASA Astrophysics Data System (ADS)

    Bisegna, Paolo; Caselli, Federica

    2008-06-01

    This paper presents a simple analytical expression for the effective complex conductivity of a periodic hexagonal arrangement of conductive circular cylinders embedded in a conductive matrix, with interfaces exhibiting a capacitive impedance. This composite material may be regarded as an idealized model of a biological tissue comprising tubular cells, such as skeletal muscle. The asymptotic homogenization method is adopted, and the corresponding local problem is solved by resorting to Weierstrass elliptic functions. The effectiveness of the present analytical result is proved by convergence analysis and comparison with finite-element solutions and existing models.

  2. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  3. From Complex to Simple: Interdisciplinary Stochastic Models

    ERIC Educational Resources Information Center

    Mazilu, D. A.; Zamora, G.; Mazilu, I.

    2012-01-01

    We present two simple, one-dimensional, stochastic models that lead to a qualitative understanding of very complex systems from biology, nanoscience and social sciences. The first model explains the complicated dynamics of microtubules, stochastic cellular highways. Using the theory of random walks in one dimension, we find analytical expressions…

  4. A simple analytical thermo-mechanical model for liquid crystal elastomer bilayer structures

    NASA Astrophysics Data System (ADS)

    Cui, Yun; Wang, Chengjun; Sim, Kyoseung; Chen, Jin; Li, Yuhang; Xing, Yufeng; Yu, Cunjiang; Song, Jizhou

    2018-02-01

    The bilayer structure, consisting of a thermal-responsive liquid crystal elastomer (LCE) and another polymer material with stretchable heaters, has attracted much attention in applications of soft actuators and soft robots due to its ability to generate large deformations when subjected to heat stimuli. A simple analytical thermo-mechanical model, accounting for the non-uniform temperature/strain distribution along the thickness direction, is established for this type of bilayer structure. The analytical predictions of the temperature and bending curvature radius agree well with finite element analysis and experiments. The influences of the LCE thickness and the heat generation power on the bending deformation of the bilayer structure are fully investigated. It is shown that a thinner LCE layer and a higher heat generation power yield larger bending deformations. These results may aid the design of soft actuators and soft robots involving thermal-responsive LCEs.
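
    For orientation, the classical Timoshenko bimorph analysis (an assumed simplification with a uniform mismatch strain, unlike the non-uniform through-thickness profile treated in the paper) gives the bending curvature of a two-layer strip as:

    $$ \kappa \;=\; \frac{6\,\Delta\varepsilon\,(1+m)^2}{h\left[3(1+m)^2 + (1+mn)\left(m^2 + \frac{1}{mn}\right)\right]}, \qquad m = \frac{h_1}{h_2}, \quad n = \frac{E_1}{E_2} $$

    where \Delta\varepsilon is the actuation (mismatch) strain between the layers, h = h_1 + h_2 is the total thickness, E_1 and E_2 are the layer moduli, and the bending radius is 1/\kappa.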

  5. Using geovisual analytics in Google Earth to understand disease distribution: a case study of campylobacteriosis in the Czech Republic (2008-2012).

    PubMed

    Marek, Lukáš; Tuček, Pavel; Pászto, Vít

    2015-01-28

    Visual analytics aims to connect the processing power of information technologies and the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most data contain a spatial component, so the need for geovisual tools and methods arises. One can either develop one's own system, where the dissemination of findings and usability might be problematic, or utilize a widespread and well-known platform. The aim of this paper is to demonstrate the applicability of Google Earth™ software as a tool for geovisual analytics that helps in understanding the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, employed in order to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics alone, we created a set of interactive visualisations in order to explore and communicate the results of the analyses to a wider audience. The geovisual analytics identified periodic patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We show that Google Earth™ software is a usable tool for the geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and intuitive, with space-time visualisation capabilities, animations, and easy communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data in a form suitable for the geovisual analytics itself.
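
    The KML delivery step above is straightforward to reproduce; a minimal sketch that writes one time-stamped placemark using only Python's standard library (coordinates, date, names and file name are hypothetical):

    ```python
    import xml.etree.ElementTree as ET

    KML_NS = "http://www.opengis.net/kml/2.2"
    ET.register_namespace("", KML_NS)  # write xmlns without a prefix

    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")

    pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = "Weekly incidence, municipality X"
    ts = ET.SubElement(pm, f"{{{KML_NS}}}TimeStamp")
    ET.SubElement(ts, f"{{{KML_NS}}}when").text = "2010-03-01"
    pt = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    # KML coordinates use lon,lat[,alt] ordering
    ET.SubElement(pt, f"{{{KML_NS}}}coordinates").text = "17.2509,49.5938"

    ET.ElementTree(kml).write("incidence.kml", xml_declaration=True, encoding="UTF-8")
    ```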

  6. Application of analytical quality by design principles for the determination of alkyl p-toluenesulfonates impurities in Aprepitant by HPLC. Validation using total-error concept.

    PubMed

    Zacharis, Constantinos K; Vastardi, Elli

    2018-02-20

    In the research presented we report the development of a simple and robust liquid chromatographic method for the quantification of two genotoxic alkyl sulphonate impurities (namely methyl p-toluenesulfonate and isopropyl p-toluenesulfonate) in Aprepitant API substances using the Analytical Quality by Design (AQbD) approach. Following the steps of the AQbD protocol, the selected critical method attributes (CMAs) were the separation criteria between the critical peak pairs, the analysis time and the peak efficiencies of the analytes. The critical method parameters (CMPs) included the flow rate, the gradient slope and the acetonitrile content at the first step of the gradient elution program. Multivariate experimental designs, namely Plackett-Burman and Box-Behnken designs, were conducted sequentially for factor screening and optimization of the method parameters. The optimal separation conditions were estimated using the desirability function. The method was fully validated in the range of 10-200% of the target concentration limit of the analytes using the "total error" approach. Accuracy profiles - a graphical decision-making tool - were constructed using the results of the validation procedures. The β-expectation tolerance intervals did not exceed the acceptance criteria of ±10%, meaning that 95% of future results will fall within the defined bias limits. The relative bias ranged between -1.3% and 3.8% for both analytes, while the RSD values for repeatability and intermediate precision were less than 1.9% in all cases. The achieved limit of detection (LOD) and limit of quantification (LOQ) were adequate for the specific purpose and found to be 0.02% (corresponding to 48 μg g⁻¹ in sample) for both methyl and isopropyl p-toluenesulfonate. As proof-of-concept, the validated method was successfully applied to the analysis of several Aprepitant batches, indicating that this methodology could be used for routine quality control analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
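
    For the β-expectation tolerance intervals used in the accuracy profiles, the simplest one-sample normal form coincides with a prediction interval for one future result; a sketch under that simplifying assumption (the validation literature typically uses a more involved variance-components model, and the recovery values below are hypothetical):

    ```python
    import math
    from scipy import stats

    def beta_expectation_interval(results, beta=0.95):
        """Two-sided beta-expectation tolerance interval, simple normal model.

        mean +/- t_{(1+beta)/2, n-1} * s * sqrt(1 + 1/n): the interval
        expected to contain a proportion beta of future single results.
        """
        n = len(results)
        mean = sum(results) / n
        s = math.sqrt(sum((x - mean) ** 2 for x in results) / (n - 1))
        k = stats.t.ppf((1 + beta) / 2, df=n - 1) * math.sqrt(1 + 1 / n)
        return mean - k * s, mean + k * s

    # Hypothetical recoveries (%) at one concentration level
    print(beta_expectation_interval([99.2, 100.5, 98.8, 101.1, 99.9, 100.3]))
    ```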

  7. A new approach for downscaling of electromembrane extraction as a lab on-a-chip device followed by sensitive Red-Green-Blue detection.

    PubMed

    Baharfar, Mahroo; Yamini, Yadollah; Seidi, Shahram; Arain, Muhammad Balal

    2018-05-30

    A new design of electromembrane extraction (EME) as a lab-on-a-chip device was proposed for the extraction and determination of phenazopyridine as the model analyte. The extraction procedure coupled EME with a packed sorbent: the analyte was extracted, under an electrical field applied across a membrane sheet impregnated with nitrophenyl octyl ether (NPOE), into an acceptor phase, and was then absorbed on a strong cation exchanger sorbent. The designed chip contained separate spiral channels for the donor and acceptor phases, featuring embedded platinum electrodes to enhance extraction efficiency. The selected donor and acceptor phases were 0 mM HCl and 100 mM HCl, respectively. The on-chip electromembrane extraction was carried out at 70 V for 50 min. The analysis was carried out in two modes: a simple Red-Green-Blue (RGB) image analysis and a conventional HPLC-UV system. After absorption of the analyte on the solid phase, its color changed, and a digital picture of the sorbent was taken for the RGB analysis. The parameters affecting the performance of the chip device, comprising the EME and solid phase microextraction steps, were identified and optimized. The accumulation of the analyte on the solid phase showed excellent sensitivity, and a limit of detection (LOD) lower than 1.0 μg L⁻¹ was achieved by image analysis using a smartphone. The device also offered acceptable intra- and inter-assay RSDs (<10%). The calibration curves were linear within the ranges of 10-1000 μg L⁻¹ and 30-1000 μg L⁻¹ (r² > 0.9969) for HPLC-UV and RGB analysis, respectively. To investigate the applicability of the method in complicated matrices, urine samples of patients being treated with phenazopyridine were analyzed.
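
    A minimal sketch of the RGB readout step: average the channel intensities over a cropped region of the smartphone photo and convert through a linear calibration (file name, crop box, the choice of the red channel, and the calibration constants are all assumptions):

    ```python
    import numpy as np
    from PIL import Image

    # Load the photo of the sorbent bed and crop to the region of interest
    img = Image.open("sorbent.jpg").convert("RGB")
    roi = np.asarray(img.crop((100, 100, 200, 200)), dtype=float)

    # Mean intensity of each channel over the ROI
    r_mean, g_mean, b_mean = roi.reshape(-1, 3).mean(axis=0)
    print(f"R={r_mean:.1f}, G={g_mean:.1f}, B={b_mean:.1f}")

    # Hypothetical linear calibration: analyte (ug/L) vs. red-channel response
    slope, intercept = -0.85, 210.0   # from standards, assumed values
    conc = (r_mean - intercept) / slope
    print(f"estimated concentration: {conc:.1f} ug/L")
    ```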

  8. Using Presentation Software to Flip an Undergraduate Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Fitzgerald, Neil; Li, Luisa

    2015-01-01

    An undergraduate analytical chemistry course has been adapted to a flipped course format. Course content was provided by video clips, text, graphics, audio, and simple animations organized as concept maps using the cloud-based presentation platform, Prezi. The advantages of using Prezi to present course content in a flipped course format are…

  9. Data Acquisition Programming (LabVIEW): An Aid to Teaching Instrumental Analytical Chemistry.

    ERIC Educational Resources Information Center

    Gostowski, Rudy

    A course was developed at Austin Peay State University (Tennessee) which offered an opportunity for hands-on experience with the essential components of modern analytical instruments. The course aimed to provide college students with the skills necessary to construct a simple model instrument, including the design and fabrication of electronic…

  10. Operational Environmental Assessment

    DTIC Science & Technology

    1988-09-01

    Only fragments of this report are recoverable: organizational listings (Chemistry Branch, Physical Chemistry Branch; Analytical Research Division with its Analytical Systems, Methodology Research, and Spectroscopy Branches); a note that the electromagnetic frequency spectrum includes radio frequencies, infrared, visible light, ultraviolet, X-rays and gamma rays (in ascending order of frequency); and a list of toxins (verruculogen, aflatrem, picrotoxin, ciguatoxin, and simple trichothecene mycotoxins: T-2 toxin, T-2 tetraol, neosolaniol, nivalenol, deoxynivalenol, verrucarol).

  11. Numerical Simulation of the Perrin-Like Experiments

    ERIC Educational Resources Information Center

    Mazur, Zygmunt; Grech, Dariusz

    2008-01-01

    A simple model of the random Brownian walk of a spherical mesoscopic particle in viscous liquids is proposed. The model can be solved analytically and simulated numerically. The analytic solution gives the known Einstein-Smoluchowski diffusion law ⟨r²⟩ = 2Dt, where the diffusion constant D is expressed by the mass and geometry of a…
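
    A minimal numerical check of the quoted diffusion law in one dimension (all parameter values are arbitrary, chosen only for illustration):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    D = 0.5        # diffusion constant (arbitrary units), assumed
    dt = 1e-3      # time step
    n_steps = 1000
    n_walkers = 2000

    # 1D Brownian increments with variance 2*D*dt, accumulated into paths
    steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_walkers, n_steps))
    x = np.cumsum(steps, axis=1)

    t = dt * np.arange(1, n_steps + 1)
    msd = (x ** 2).mean(axis=0)

    # Compare the simulated MSD growth rate with the prediction 2*D
    print(f"simulated slope ~ {np.polyfit(t, msd, 1)[0]:.3f}, expected {2*D:.3f}")
    ```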

  12. Quantitative Ultrasound-Assisted Extraction for Trace-Metal Determination: An Experiment for Analytical Chemistry

    ERIC Educational Resources Information Center

    Lavilla, Isela; Costas, Marta; Pena-Pereira, Francisco; Gil, Sandra; Bendicho, Carlos

    2011-01-01

    Ultrasound-assisted extraction (UAE) is introduced to upper-level analytical chemistry students as a simple strategy focused on sample preparation for trace-metal determination in biological tissues. Nickel extraction in seafood samples and quantification by electrothermal atomic absorption spectrometry (ETAAS) are carried out by a team of four…

  13. A simple analytical model of coupled single flow channel over porous electrode in vanadium redox flow battery with serpentine flow channel

    NASA Astrophysics Data System (ADS)

    Ke, Xinyou; Alexander, J. Iwan D.; Prahl, Joseph M.; Savinell, Robert F.

    2015-08-01

    A simple analytical model of a layered system comprised of a single passage of a serpentine flow channel and a parallel underlying porous electrode (or porous layer) is proposed. The analytical model is derived from the Navier-Stokes equations in the flow channel and the Darcy-Brinkman model in the porous layer. Continuity of flow velocity and of normal stress is applied at the interface between the flow channel and the porous layer. The effects of the inlet volumetric flow rate, the thickness of the flow channel and the thickness of a typical carbon-fiber-paper porous layer on the volumetric flow rate within the porous layer are studied. The maximum current density based on the electrolyte volumetric flow rate is predicted and found to be consistent with reported numerical simulations. For a mean inlet flow velocity of 33.3 cm s⁻¹, the analytical maximum current density is estimated to be 377 mA cm⁻², which compares favorably with the experimental result of ∼400 mA cm⁻² reported by others.
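
    The flow-based limit on current density can be bounded by simple Faraday stoichiometry: if every ion carried by the electrolyte flow reacts, i_max = nFcQ/A. A sketch with assumed operating values (illustrative only, not the paper's geometry or concentration):

    ```python
    F = 96485.0          # Faraday constant, C/mol

    # Assumed operating values for a lab-scale vanadium redox flow battery
    n_e = 1              # electrons per vanadium redox event
    conc = 1500.0        # vanadium concentration, mol/m^3 (1.5 M)
    flow = 1.0e-7        # electrolyte volumetric flow rate, m^3/s
    area = 2.5e-3        # projected electrode area, m^2 (5 cm x 5 cm)

    # Stoichiometric bound: all delivered ions react at the electrode
    i_max = n_e * F * conc * flow / area   # A/m^2
    print(f"stoichiometric limit: {i_max / 10:.0f} mA/cm^2")
    ```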

  14. Boundary condition determined wave functions for the ground states of one- and two-electron homonuclear molecules

    NASA Astrophysics Data System (ADS)

    Patil, S. H.; Tang, K. T.; Toennies, J. P.

    1999-10-01

    Simple analytical wave functions satisfying appropriate boundary conditions are constructed for the ground states of one- and two-electron homonuclear molecules. Both the asymptotic condition when one electron is far away and the cusp condition when the electron coalesces with a nucleus are satisfied by the proposed wave function. For H2+, the resulting wave function is almost identical to the Guillemin-Zener wave function, which is known to give very good energies. For the two-electron systems H2 and He2++, the additional electron-electron cusp condition is rigorously accounted for by a simple analytic correlation function which has the correct behavior not only for r12→0 and r12→∞ but also for R→0 and R→∞, where r12 is the interelectronic distance and R the internuclear distance. Energies obtained from these simple wave functions agree within 2×10⁻³ a.u. with the results of the most sophisticated variational calculations for all R and for all systems studied. This demonstrates that rather simple physical considerations can be used to derive very accurate wave functions for simple molecules, thereby avoiding laborious numerical variational calculations.
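
    For reference, the Guillemin-Zener trial function for H2+ mentioned above has the symmetrized two-center form (standard form from the literature, not quoted from this record):

    $$ \psi_{\mathrm{GZ}}(r_a, r_b) \;=\; e^{-\alpha r_a - \beta r_b} \;+\; e^{-\beta r_a - \alpha r_b} $$

    where r_a and r_b are the electron-nucleus distances and \alpha, \beta are variational parameters; \beta = 0 recovers the LCAO form, while \alpha = \beta gives a single united-atom-like exponential.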

  15. Learner Dashboards a Double-Edged Sword? Students' Sense-Making of a Collaborative Critical Reading and Learning Analytics Environment for Fostering 21st-Century Literacies

    ERIC Educational Resources Information Center

    Pei-Ling Tan, Jennifer; Koh, Elizabeth; Jonathan, Christin; Yang, Simon

    2017-01-01

    The affordances of learning analytics (LA) tools and solutions are being increasingly harnessed for enhancing 21st century pedagogical and learning strategies and outcomes. However, use cases and empirical understandings of students' experiences with LA tools and environments aimed at fostering 21st century literacies, especially in the K-12…

  16. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  17. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    ERIC Educational Resources Information Center

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  18. Search Analytics: Automated Learning, Analysis, and Search with Open Source

    NASA Astrophysics Data System (ADS)

    Hundman, K.; Mattmann, C. A.; Hyon, J.; Ramirez, P.

    2016-12-01

    The sheer volume of unstructured scientific data makes comprehensive human analysis impossible, resulting in missed opportunities to identify relationships, trends, gaps, and outliers. As the open source community continues to grow, tools like Apache Tika, Apache Solr, Stanford's DeepDive, and Data-Driven Documents (D3) can help address this challenge. With a focus on journal publications and conference abstracts, often in the form of PDF and Microsoft Office documents, we've initiated an exploratory NASA Advanced Concepts project aiming to use the aforementioned open source text analytics tools to build a data-driven justification for the HyspIRI Decadal Survey mission. We call this capability Search Analytics, and it fuses and augments these open source tools to enable the automatic discovery and extraction of salient information. In the case of HyspIRI, a hyperspectral infrared imager mission, key findings resulted from the extractions and visualizations of relationships from thousands of unstructured scientific documents. The relationships include links between satellites (e.g. Landsat 8), domain-specific measurements (e.g. spectral coverage) and subjects (e.g. invasive species). Using the above open source tools, Search Analytics mined and characterized a corpus of information that would be infeasible for a human to process. More broadly, Search Analytics offers insights into various scientific and commercial applications enabled through missions and instrumentation with specific technical capabilities. For example, the following phrases were extracted in close proximity within a publication: "In this study, hyperspectral images…with high spatial resolution (1 m) were analyzed to detect cutleaf teasel in two areas. …Classification of cutleaf teasel reached a users accuracy of 82 to 84%." Without reading a single paper, we can use Search Analytics to automatically identify that a 1 m spatial resolution provides a cutleaf teasel detection user's accuracy of 82-84%, which could have tangible, direct downstream implications for crop protection. Automatically assimilating this information expedites and supplements human analysis, and, ultimately, Search Analytics and its foundation of open source tools will result in more efficient scientific investment and research.
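
    A minimal sketch of the text-extraction front end with the Apache Tika Python bindings (the tika package talks to a local Tika server it spawns on first use; the file name and the downstream keyword filter are hypothetical):

    ```python
    from tika import parser   # pip install tika

    # Extract text and metadata from one publication PDF
    parsed = parser.from_file("hyspiri_study.pdf")

    print(parsed["metadata"].get("Content-Type"))
    text = parsed["content"] or ""

    # Crude downstream step: flag sentences mentioning spatial resolution
    for sentence in text.split("."):
        if "spatial resolution" in sentence.lower():
            print(sentence.strip())
    ```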

  19. All-organic microelectromechanical systems integrating specific molecular recognition--a new generation of chemical sensors.

    PubMed

    Ayela, Cédric; Dubourg, Georges; Pellet, Claude; Haupt, Karsten

    2014-09-03

    Cantilever-type all-organic microelectromechanical systems based on molecularly imprinted polymers for specific analyte recognition are used as chemical sensors. They are produced by a simple spray-coating-shadow-masking process. Analyte binding to the cantilever generates a measurable change in its resonance frequency. This allows label-free detection by direct mass sensing of low-molecular-weight analytes at nanomolar concentrations. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
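
    The mass readout implied above follows from the harmonic-oscillator relation f = (1/2π)√(k/m); for an effective point mass, the bound mass is Δm = (k/4π²)(1/f₁² − 1/f₀²). A sketch with assumed cantilever values (not taken from the paper):

    ```python
    import math

    k = 0.35            # effective spring constant, N/m (assumed)
    f0 = 28_000.0       # resonance frequency before binding, Hz (assumed)
    f1 = 27_994.0       # resonance frequency after analyte binding, Hz

    # Point-mass loading model: delta_m = k/(4*pi^2) * (1/f1^2 - 1/f0^2)
    dm = k / (4 * math.pi ** 2) * (1.0 / f1 ** 2 - 1.0 / f0 ** 2)
    print(f"bound mass ~ {dm * 1e15:.2f} pg")   # kg -> pg
    ```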

  20. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    PubMed

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

    In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the rate-limiting adsorption kinetics, which are diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling microliter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model of the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: the Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude, and eliminates the washing steps required by standard immunoassays. Copyright © 2014 Elsevier B.V. All rights reserved.
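
    Because the device is well mixed, the bead-surface reaction can be described by ordinary Langmuir binding kinetics instead of a diffusion-limited model; a sketch of that reduced description (rate constants and concentration are assumed, not the authors' fitted values):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    k_on = 1.0e5     # association rate, 1/(M*s), assumed
    k_off = 1.0e-3   # dissociation rate, 1/s, assumed
    C = 1.0e-9       # analyte concentration held constant by perfusion, M

    def theta_dot(t, theta):
        """Fractional occupancy of microsphere binding sites, well-mixed."""
        return k_on * C * (1.0 - theta[0]) - k_off * theta[0]

    sol = solve_ivp(theta_dot, (0.0, 3600.0), [0.0], dense_output=True)
    t = np.linspace(0.0, 3600.0, 7)
    print(np.round(sol.sol(t)[0], 4))   # occupancy approaching C/(C + K_D)
    ```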

  1. Automation effects in a multiloop manual control system

    NASA Technical Reports Server (NTRS)

    Hess, R. A.; Mcnally, B. D.

    1986-01-01

    An experimental and analytical study was undertaken to investigate human interaction with a simple multiloop manual control system in which the human's activity was systematically varied by changing the level of automation. The system simulated was the longitudinal dynamics of a hovering helicopter. The automation systems stabilized vehicle responses from attitude to velocity to position, and also provided for display automation in the form of a flight director. The control-loop structure resulting from the task definition can be considered a simple stereotype of a hierarchical control system. The experimental study was complemented by an analytical modeling effort which utilized simple crossover models of the human operator. It was shown that such models can be extended to the description of multiloop tasks involving preview and precognitive human operator behavior. The existence of time-optimal manual control behavior was established for these tasks, and the role which internal models may play in establishing human-machine performance was discussed.
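
    The crossover models referred to above approximate the combined pilot-plus-vehicle open-loop transfer function near the crossover frequency as (McRuer's classical form, quoted from the general literature rather than from this report):

    $$ Y_p(j\omega)\,Y_c(j\omega) \;\approx\; \frac{\omega_c}{j\omega}\, e^{-j\omega\tau_e} $$

    where Y_p is the human-operator describing function, Y_c the controlled-element dynamics, \omega_c the crossover frequency, and \tau_e the effective time delay.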

  2. Spectral properties of thermal fluctuations on simple liquid surfaces below shot-noise levels.

    PubMed

    Aoki, Kenichiro; Mitsui, Takahisa

    2012-07-01

    We study the spectral properties of thermal fluctuations on simple liquid surfaces, sometimes called ripplons. Analytical properties of the spectral function are investigated and shown to be composed of regions with simple analytic behavior with respect to the frequency or the wave number. The derived expressions are compared to spectral measurements performed orders of magnitude below shot-noise levels, achieved using a novel noise-reduction method. The agreement between the theory of thermal surface fluctuations and the experiment is found to be excellent, elucidating the spectral properties of the surface fluctuations. The measurement method requires only a relatively small sample, both spatially (a few μm) and temporally (~20 s). The method also requires relatively weak light power (~0.5 mW), so it has a broad range of applicability, including local measurements, investigations of time-dependent phenomena, and noninvasive measurements.
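
    For context, ripplons on a deep layer of a simple liquid follow the capillary-gravity dispersion relation (a textbook result, not the paper's full spectral function, which also accounts for viscous damping):

    $$ \omega^2(k) \;=\; g k + \frac{\sigma}{\rho}\, k^3 $$

    with \sigma the surface tension, \rho the density, and g the gravitational acceleration.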

  3. Experimental evaluation of expendable supersonic nozzle concepts

    NASA Technical Reports Server (NTRS)

    Baker, V.; Kwon, O.; Vittal, B.; Berrier, B.; Re, R.

    1990-01-01

    Exhaust nozzles for expendable supersonic turbojet engine missile propulsion systems are required to be simple, short and compact, in addition to having good broad-range thrust-minus-drag performance. A series of convergent-divergent nozzle scale model configurations were designed and wind tunnel tested for a wide range of free stream Mach numbers and nozzle pressure ratios. The models included fixed geometry and simple variable exit area concepts. The experimental and analytical results show that the fixed geometry configurations tested have inferior off-design thrust-minus-drag performance in the transonic Mach range. A simple variable exit area configuration called the Axi-Quad nozzle, combining features of both axisymmetric and two-dimensional convergent-divergent nozzles, performed well over a broad range of operating conditions. Analytical predictions of the flow pattern as well as overall performance of the nozzles, using a fully viscous, compressible CFD code, compared very well with the test data.

  4. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    ERIC Educational Resources Information Center

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  5. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  6. IBM’s Health Analytics and Clinical Decision Support

    PubMed Central

    Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.

    2014-01-01

    Objectives: This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods: Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results: There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion: Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive, but support, healthcare transformation. PMID:25123736

  7. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for the translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.

  8. Analytical Computation of Energy-Energy Correlation at Next-to-Leading Order in QCD

    NASA Astrophysics Data System (ADS)

    Dixon, Lance J.; Luo, Ming-xing; Shtabovenko, Vladyslav; Yang, Tong-Zhi; Zhu, Hua Xing

    2018-03-01

    The energy-energy correlation (EEC) between two detectors in e+e- annihilation was computed analytically at leading order in QCD almost 40 years ago, and numerically at next-to-leading order (NLO) starting in the 1980s. We present the first analytical result for the EEC at NLO, which is remarkably simple, and facilitates analytical study of the perturbative structure of the EEC. We provide the expansion of the EEC in the collinear and back-to-back regions through next-to-leading power, information which should aid resummation in these regions.
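    For orientation, the EEC observable analyzed here is conventionally defined (up to normalization conventions, which vary between papers) as an energy-weighted cross-section integral over pairs of detected particles separated by angle χ:

    ```latex
    \mathrm{EEC}(\chi)
      = \frac{1}{\sigma_{\mathrm{tot}}}
        \sum_{a,b} \int \mathrm{d}\sigma \,
        \frac{E_a E_b}{Q^2} \,
        \delta(\cos\chi - \cos\theta_{ab})
    ```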

  9. New software solutions for analytical spectroscopists

    NASA Astrophysics Data System (ADS)

    Davies, Antony N.

    1999-05-01

    Analytical spectroscopists must be computer literate to carry out the tasks assigned to them effectively. This requirement has often met resistance within organizations: insufficient funds to equip staff properly, a lack of will to deliver the essential training, and a basic reluctance among staff to learn the new techniques required for computer-assisted analysis. In the past these problems were compounded by seriously flawed software being sold for spectroscopic applications. Owing to the limited market for such complex products, the analytical spectroscopist was often faced with buying incomplete and unstable tools if the price was to remain reasonable. Long product lead times meant spectrometer manufacturers often ended up offering systems running under outdated and sometimes obscure operating systems. Not only did this require special staff training for each instrument, with knowledge gained on one system not transferable to the neighbouring system, but these spectrometers were often capable of running only in stand-alone mode, cut off from the rest of the laboratory environment. Fortunately, a number of developments in recent years have substantially changed this depressing picture. A true multi-tasking operating system with a simple graphical user interface, Microsoft Windows NT4, has been widely introduced into the spectroscopic computing environment, providing a desktop operating system that has proved more stable and robust and that demands better programming techniques of software vendors. The opening up of the Internet has provided an easy way to access new tools for data handling and has forced a substantial re-think about results delivery (for example, Chemical MIME types and IUPAC spectroscopic data-exchange standards). Improved computing power and cheaper hardware now allow large spectroscopic data sets to be handled without too many problems, including the ability to carry out chemometric operations in minutes rather than hours. Fast networks now enable data analysis of even multi-dimensional spectroscopic data sets remote from the measuring instrument. A strong tendency towards a more unified, substantially friendlier graphical user interface allows even inexperienced users to become rapidly acquainted with complex mathematical analyses. Some examples of new spectroscopic software products are given to demonstrate these points and to highlight the ease of integration into a modern analytical spectroscopy workplace.

  10. Data Intensive Computing on Amazon Web Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Magana-Zook, S. A.

    The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).
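    As a minimal, hypothetical sketch of the kind of archive-scale query such a cluster supports (the path, file format, and column name below are placeholders, not the GMP data layout):

    ```python
    # Count waveform records per station across a large seismic metadata
    # archive on HDFS, using Spark (illustrative only).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("seismic-archive-scan").getOrCreate()

    # Placeholder path and schema; the real archive holds >300 TB of signals.
    df = spark.read.parquet("hdfs:///archives/seismic/waveform_index")

    counts = (df.groupBy("station_code")          # assumed column name
                .count()
                .orderBy("count", ascending=False))
    counts.show(20)
    spark.stop()
    ```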

  11. Using an analytical geometry method to improve tiltmeter data presentation

    USGS Publications Warehouse

    Su, W.-J.

    2000-01-01

    The tiltmeter is a useful tool for geologic and geotechnical applications. To obtain full benefit from the tiltmeter, easy and accurate data presentations should be used. Unfortunately, the most commonly used method for tilt data reduction now may yield inaccurate and low-resolution results. This article describes a simple, accurate, and high-resolution approach developed at the Illinois State Geological Survey for data reduction and presentation. The orientation of tiltplates is determined first by using a trigonometric relationship, followed by a matrix transformation, to obtain the true amount of rotation change of the tiltplate at any given time. The mathematical derivations used for the determination and transformation are then coded into an integrated PC application by adapting the capabilities of commercial spreadsheet, database, and graphics software. Examples of data presentation from tiltmeter applications in studies of landfill covers, characterizations of mine subsidence, and investigations of slope stability are also discussed.
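    A minimal sketch of the geometric core of such a reduction, assuming tilt components measured along the instrument axes and a known tiltplate azimuth (the function, sign convention, and values are illustrative, not the ISGS implementation):

    ```python
    import numpy as np

    def tilt_to_geographic(tilt_x, tilt_y, azimuth_deg):
        """Rotate tilt components from instrument axes into geographic
        (north/east) axes with a 2-D rotation matrix."""
        a = np.radians(azimuth_deg)   # plate orientation from the trig step
        rot = np.array([[np.cos(a), -np.sin(a)],
                        [np.sin(a),  np.cos(a)]])
        return rot @ np.array([tilt_x, tilt_y])

    # Example: 12 and -5 microradians in instrument axes, azimuth 30 degrees.
    north_tilt, east_tilt = tilt_to_geographic(12.0, -5.0, 30.0)
    ```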

  12. Energy distribution from vertical impact of a three-dimensional solid body onto the flat free surface of an ideal fluid

    NASA Astrophysics Data System (ADS)

    Scolan, Y.-M.; Korobkin, A. A.

    2003-02-01

    Hydrodynamic impact phenomena are three-dimensional in nature, and naval architects need more advanced tools than a simple strip theory to calculate impact loads at the preliminary design stage. Three-dimensional analytical solutions have been obtained with the help of the so-called inverse Wagner problem, as discussed by Scolan and Korobkin in 2001. The approach by Wagner provides a consistent way to evaluate the flow caused by a blunt body entering liquid through its free surface. However, this approach does not account for the spray jets and gives no estimate of the energy carried away from the main flow by the jets. Clear insight into the jet formation is required. Wagner provided certain elements of the answer for two-dimensional configurations. On the basis of those results, the energy distribution pattern is analysed for three-dimensional configurations in the present paper.

  13. Simulation of wind turbine wakes using the actuator line technique

    PubMed Central

    Sørensen, Jens N.; Mikkelsen, Robert F.; Henningson, Dan S.; Ivanell, Stefan; Sarmast, Sasan; Andersen, Søren J.

    2015-01-01

    The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today widely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison with experimental measurements of the wake characteristics of the flow around a three-bladed model wind turbine, the development of a simple analytical formula for determining the near-wake length behind a wind turbine and a detailed investigation of wake structures based on proper orthogonal decomposition analysis of numerically generated snapshots of the wake. PMID:25583862

  14. A three-dimensional actuated origami-inspired transformable metamaterial with multiple degrees of freedom

    NASA Astrophysics Data System (ADS)

    Overvelde, Johannes T. B.; de Jong, Twan A.; Shevchenko, Yanina; Becerra, Sergio A.; Whitesides, George M.; Weaver, James C.; Hoberman, Chuck; Bertoldi, Katia

    2016-03-01

    Reconfigurable devices, whose shape can be drastically altered, are central to expandable shelters, deployable space structures, reversible encapsulation systems and medical tools and robots. All these applications require structures whose shape can be actively controlled, both for deployment and to conform to the surrounding environment. While most current reconfigurable designs are application specific, here we present a mechanical metamaterial with tunable shape, volume and stiffness. Our approach exploits a simple modular origami-like design consisting of rigid faces and hinges, which are connected to form a periodic structure consisting of extruded cubes. We show both analytically and experimentally that the transformable metamaterial has three degrees of freedom, which can be actively deformed into numerous specific shapes through embedded actuation. The proposed metamaterial can be used to realize transformable structures with arbitrary architectures, highlighting a robust strategy for the design of reconfigurable devices over a wide range of length scales.

  15. Chemical reacting flows

    NASA Technical Reports Server (NTRS)

    Mularz, Edward J.; Sockol, Peter M.

    1987-01-01

    Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal air stream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at Lewis to better understand chemical reacting flows with the long term goal of establishing these reliable computer codes. The approach to understanding chemical reacting flows is to look at separate, simpler parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, research on the fluid mechanics associated with chemical reacting flows was initiated. The chemistry of fuel-air combustion is also being studied. Finally, the phenomenon of turbulence-combustion interaction is being investigated. This presentation will highlight research, both experimental and analytical, in each of these three major areas.

  16. Chemical reacting flows

    NASA Technical Reports Server (NTRS)

    Mularz, Edward J.; Sockol, Peter M.

    1990-01-01

    Future aerospace propulsion concepts involve the combustion of liquid or gaseous fuels in a highly turbulent internal airstream. Accurate predictive computer codes which can simulate the fluid mechanics, chemistry, and turbulence-combustion interaction of these chemical reacting flows will be a new tool that is needed in the design of these future propulsion concepts. Experimental and code development research is being performed at LeRC to better understand chemical reacting flows with the long-term goal of establishing these reliable computer codes. Our approach to understand chemical reacting flows is to look at separate, more simple parts of this complex phenomenon as well as to study the full turbulent reacting flow process. As a result, we are engaged in research on the fluid mechanics associated with chemical reacting flows. We are also studying the chemistry of fuel-air combustion. Finally, we are investigating the phenomenon of turbulence-combustion interaction. Research, both experimental and analytical, is highlighted in each of these three major areas.

  17. ZnO supported CoFe2O4 nanophotocatalysts for the mineralization of Direct Blue 71 in aqueous environments.

    PubMed

    Sathishkumar, Panneerselvam; Pugazhenthiran, Nalenthiran; Mangalaraja, Ramalinga Viswanathan; Asiri, Abdullah M; Anandan, Sambandam

    2013-05-15

    In this study, an attempt was made to combine magnetic and photocatalytic properties in a single semiconductor material to enhance the degradation efficiency and recyclability of magnetic nanophotocatalysts. CoFe2O4 and CoFe2O4-loaded ZnO nanoparticles were prepared by a simple co-precipitation method, characterized using various analytical tools, and examined for visible-light-assisted photocatalytic activity. The CoFe2O4/ZnO nanocatalyst coupled with the electron acceptor peroxomonosulphate (PMS) showed a 1.69-fold enhancement in the mineralization of Direct Blue 71 (a triazo dye; DB71) within 5 h. The enhanced decolorization was due to the production of a larger number of non-selective, reactive free radicals at the catalyst surface. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Prediction of sound absorption in rigid porous media with the lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    da Silva, Andrey Ricardo; Mareze, Paulo; Brandão, Eric

    2016-02-01

    In this work, sound absorption phenomena associated with viscous shear stress within rigid porous media are investigated with a simple isothermal lattice Boltzmann BGK model. Simulations are conducted for different macroscopic material properties, such as sample thickness and porosity, and the results are compared with the exact analytical solution for materials with a slit-like structure in terms of acoustic impedance and sound absorption coefficient. The numerical results agree very well with the exact solution, particularly for the sound absorption coefficient. The small deviations found in the low-frequency limit for the real part of the acoustic impedance are attributed to the ratio between the thicknesses of the slit and the viscous boundary layer. The results suggest that the lattice Boltzmann method can be a very compelling numerical tool for simulating viscous sound absorption phenomena in the time domain, particularly due to its computational simplicity when compared with traditional continuum-based techniques.
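    For reference, the isothermal lattice BGK update underlying such simulations takes the standard form (τ is the relaxation time, f_i^eq the equilibrium distribution, c_i the lattice velocities):

    ```latex
    f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\; t + \Delta t) - f_i(\mathbf{x}, t)
      = -\frac{\Delta t}{\tau}
        \left[ f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\mathbf{x}, t) \right]
    ```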

  19. Analyte-Triggered DNA-Probe Release from a Triplex Molecular Beacon for Nanopore Sensing.

    PubMed

    Guo, Bingyuan; Sheng, Yingying; Zhou, Ke; Liu, Quansheng; Liu, Lei; Wu, Hai-Chen

    2018-03-26

    A new nanopore sensing strategy based on a triplex molecular beacon was developed for the detection of specific DNA or multivalent proteins. The sensor is composed of a triplex-forming molecular beacon and a stem-forming DNA component that is modified with a host-guest complex. Upon target DNA hybridizing with the molecular beacon loop, or multivalent proteins binding to the recognition elements on the stem, the DNA probe is released and produces highly characteristic current signals when translocated through α-hemolysin. The frequency of the current signatures can be used to quantify the concentrations of the target molecules. This sensing approach provides a simple, quick, and modular tool for the detection of specific macromolecules with high sensitivity and excellent selectivity. It may find useful applications in point-of-care diagnostics with a portable nanopore kit in the future. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. A charge-based model of Junction Barrier Schottky rectifiers

    NASA Astrophysics Data System (ADS)

    Latorre-Rey, Alvaro D.; Mudholkar, Mihir; Quddus, Mohammed T.; Salih, Ali

    2018-06-01

    A new charge-based model of the electric field distribution in Junction Barrier Schottky (JBS) diodes is presented, based on a description of the charge-sharing effect between the vertical Schottky junction and the lateral pn-junctions that constitute the active cell of the device. In our model, the inherently 2-D problem is transformed into a simple but accurate 1-D problem with a closed analytical solution that captures the reshaping and reduction of the electric field profile responsible for the improved electrical performance of these devices, while preserving physically meaningful expressions that depend on relevant device parameters. The model is validated by comparing calculated electric field profiles with drift-diffusion simulations of a JBS device, showing good agreement. Even though other, fully 2-D models provide higher accuracy, they lack physical insight, making the proposed model a useful tool for device design.

  1. Theoretical predictor for candidate structure assignment from IMS data of biomolecule-related conformational space.

    PubMed

    Schenk, Emily R; Nau, Frederic; Fernandez-Lima, Francisco

    2015-06-01

    The ability to correlate experimental ion mobility data with candidate structures from theoretical modeling provides a powerful analytical and structural tool for the characterization of biomolecules. In the present paper, a theoretical workflow is described to generate and assign candidate structures for experimental trapped ion mobility and H/D exchange (HDX-TIMS-MS) data following molecular dynamics simulations and statistical filtering. The applicability of the theoretical predictor is illustrated for a peptide and protein example with multiple conformations and kinetic intermediates. The described methodology yields a low computational cost and a simple workflow by incorporating statistical filtering and molecular dynamics simulations. The workflow can be adapted to different IMS scenarios and CCS calculators for a more accurate description of the IMS experimental conditions. For the case of the HDX-TIMS-MS experiments, molecular dynamics in the "TIMS box" accounts for a better sampling of the molecular intermediates and local energy minima.

  2. Investigating anomalous transport of electrolytes in charged porous media

    NASA Astrophysics Data System (ADS)

    Skjøde Bolet, Asger Johannes; Mathiesen, Joachim

    2017-04-01

    Surface charge is known to play an important role in microfluidic devices when dealing with electrolytes and their transport properties. Similarly, surface charge could play a role in transport through porous rock with submicron pore sizes. Estimates of streaming potentials and electro-osmotic effects are mostly made in simple geometries, using both analytical and numerical tools; it is unclear at present how realistic complex geometries modify the dynamics. Our work has focused on numerical studies of the full three-dimensional Stokes-Poisson-Nernst-Planck problem for electrolyte transport in porous rock. As the numerical implementation, we use a finite-element solver built on the FEniCS project code base, which can solve both for a steady-state configuration and for the full transient. In the presentation, we will show our results on anomalous transport due to electrokinetic effects such as the streaming potential and the electro-osmotic effect.
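    As a minimal sketch of how such a solver is assembled on the FEniCS code base, here is only the electrostatic (Poisson) piece of the coupled Stokes-Poisson-Nernst-Planck system, on a toy unit-square mesh with an assumed uniform charge density rather than the authors' pore geometries:

    ```python
    from dolfin import (UnitSquareMesh, FunctionSpace, TrialFunction,
                        TestFunction, Function, DirichletBC, Constant,
                        dot, grad, dx, solve)

    mesh = UnitSquareMesh(32, 32)        # toy stand-in for a pore geometry
    V = FunctionSpace(mesh, "P", 1)

    u, v = TrialFunction(V), TestFunction(V)
    rho = Constant(1.0)                  # assumed (nondimensional) charge density
    a = dot(grad(u), grad(v)) * dx       # weak form of -laplace(phi) = rho
    L = rho * v * dx

    bc = DirichletBC(V, Constant(0.0), "on_boundary")   # grounded boundary
    phi = Function(V)
    solve(a == L, phi, bc)               # electrostatic potential field
    ```

    In the full problem this solve is coupled to Nernst-Planck ion transport and Stokes flow at each step.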

  3. A simple closed-form solution for assessing concentration uncertainty

    NASA Astrophysics Data System (ADS)

    de Barros, F. P. J.; Fiori, Aldo; Bellin, Alberto

    2011-12-01

    We propose closed-form approximate solutions for the moments of a nonreactive tracer that can be used in applications such as risk analysis. This is in line with the tenet that analytical solutions provide useful information, at minimum cost, during initial site characterization efforts and can serve as a preliminary screening tool when used with prior knowledge. We show that, with the help of a few assumptions, the first-order solutions of the concentration moments proposed by Fiori and Dagan (2000) can be further simplified to assume a form similar to well-known deterministic solutions, thereby facilitating their use in applications. A highly anisotropic formation is assumed, and we neglect the transverse components of the two-particle correlation trajectory. The proposed solution compares well with the work of Fiori and Dagan while retaining the simplicity of use of existing solutions for homogeneous porous media.

  4. Spacecraft self-contamination due to back-scattering of outgas products

    NASA Technical Reports Server (NTRS)

    Robertson, S. J.

    1976-01-01

    The back-scattering of outgas contamination near an orbiting spacecraft due to intermolecular collisions was analyzed. Analytical tools were developed for making reasonably accurate quantitative estimates of the outgas contamination return flux, given a knowledge of the pertinent spacecraft and orbit conditions. Two basic collision mechanisms were considered: (1) collisions involving only outgas molecules (self-scattering) and (2) collisions between outgas molecules and molecules in the ambient atmosphere (ambient-scattering). For simplicity, the geometry was idealized to a uniformly outgassing sphere and to a disk oriented normal to the freestream. The method of solution involved an integration of an approximation of the Boltzmann kinetic equation known as the BGK (or Krook) model equation. Results were obtained in the form of simple equations relating outgas return flux to spacecraft and orbit parameters. Results were compared with previous analyses based on more simplistic models of the collision processes.

  5. Laser writing of single-crystalline gold substrates for surface enhanced Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Singh, Astha; Sharma, Geeta; Ranjan, Neeraj; Mittholiya, Kshitij; Bhatnagar, Anuj; Singh, B. P.; Mathur, Deepak; Vasa, Parinda

    2017-07-01

    Surface enhanced Raman scattering (SERS) spectroscopy, a powerful contemporary tool for studying low-concentration analytes via surface plasmon induced enhancement of local electric field, is of utility in biochemistry, material science, threat detection, and environmental studies. We have developed a simple, fast, scalable, and relatively low-cost optical method of fabricating and characterizing large-area, reusable and broadband SERS substrates with long storage lifetime. We use tightly focused, intense infra-red laser pulses to write gratings on single-crystalline, Au (1 1 1) gold films on mica which act as SERS substrates. Our single-crystalline SERS substrates compare favourably, in terms of surface quality and roughness, to those fabricated in poly-crystalline Au films. Tests show that our SERS substrates have the potential of detecting urea and 1,10-phenantroline adulterants in milk and water, respectively, at 0.01 ppm (or lower) concentrations.

  6. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide.

    PubMed

    Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS), while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully to the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated over the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively. Copyright © 2013 Elsevier B.V. All rights reserved.
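    A minimal sketch of the multivariate-calibration step using scikit-learn's PLS implementation (the spectra and concentrations below are synthetic stand-ins for the measured UV data; the component count is illustrative):

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.random((30, 200))   # 30 calibration spectra x 200 wavelengths (synthetic)
    Y = rng.random((30, 3))     # AML, VAL, HCT concentrations (synthetic)

    pls = PLSRegression(n_components=5)   # latent variables chosen by validation
    pls.fit(X, Y)
    predicted = pls.predict(X[:2])        # all three analytes predicted at once
    ```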

  7. Dipolar filtered magic-sandwich-echoes as a tool for probing molecular motions using time domain NMR

    NASA Astrophysics Data System (ADS)

    Filgueiras, Jefferson G.; da Silva, Uilson B.; Paro, Giovanni; d'Eurydice, Marcel N.; Cobo, Márcio F.; deAzevedo, Eduardo R.

    2017-12-01

    We present a simple 1H NMR approach for characterizing intermediate- to fast-regime molecular motions using 1H time-domain NMR at low magnetic field. The method is based on a Goldman-Shen dipolar filter (DF) followed by a Mixed Magic Sandwich Echo (MSE). The dipolar filter suppresses the signals arising from molecular segments with sub-kHz mobility, so only signals from mobile segments are detected. Thus, the temperature dependence of the signal intensities directly evidences the onset of molecular motions with rates above the kHz range. The DF-MSE signal intensity is described by an analytical function based on the Anderson-Weiss theory, from which parameters related to the molecular motion (e.g. correlation times and activation energy) can be estimated when performing experiments as a function of temperature. Furthermore, we propose the use of Tikhonov regularization for estimating the width of the distribution of correlation times.
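    A minimal sketch of the Tikhonov step mentioned above: given a kernel matrix K (which here would follow from the Anderson-Weiss model) and measured intensities b, the distribution of correlation times x is recovered by augmented least squares (shapes and the smoothing weight are illustrative):

    ```python
    import numpy as np

    def tikhonov(K, b, lam):
        """Solve min ||K x - b||^2 + lam^2 ||x||^2 via an augmented system."""
        n = K.shape[1]
        K_aug = np.vstack([K, lam * np.eye(n)])
        b_aug = np.concatenate([b, np.zeros(n)])
        x, *_ = np.linalg.lstsq(K_aug, b_aug, rcond=None)
        return x
    ```

    Practical inversions usually add a non-negativity constraint on x as well.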

  8. MALDI-TOF mass spectrometry in the clinical mycology laboratory: identification of fungi and beyond.

    PubMed

    Posteraro, Brunella; De Carolis, Elena; Vella, Antonietta; Sanguinetti, Maurizio

    2013-04-01

    MALDI-TOF mass spectrometry (MS) is becoming essential in most clinical microbiology laboratories throughout the world. Its successful use is mainly attributable to the low operational costs, the universality and flexibility of detection, and the specificity and speed of analysis. Based on characteristic protein spectra obtained from intact cells - by means of simple, rapid and reproducible preanalytical and analytical protocols - MALDI-TOF MS allows a highly discriminatory identification of yeasts and filamentous fungi starting from colonies. When used early, direct identification of yeasts from positive blood cultures has the potential to greatly shorten turnaround times and to improve laboratory diagnosis of fungemia. More recently, though still in its infancy, MALDI-TOF MS has been used to perform strain typing and to determine antifungal drug susceptibility. In this article, the authors discuss how the MALDI-TOF MS technology is destined to become a powerful tool for routine mycological diagnostics.

  9. One-point fitting of the flux density produced by a heliostat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collado, Francisco J.

    Accurate and simple models for the flux density reflected by an isolated heliostat should be one of the basic tools for the design and optimization of solar power tower systems. In this work, the ability and the accuracy of the Universidad de Zaragoza (UNIZAR) and the DLR (HFCAL) flux density models to fit actual energetic spots are checked against heliostat energetic images measured at the Plataforma Solar de Almeria (PSA). Both fully analytic models are able to fit the spot acceptably with only one-point fitting, i.e., the measured maximum flux. As a practical validation of this one-point fitting, the intercept percentage of the measured images, i.e., the percentage of the energetic spot sent by the heliostat that reaches the receiver surface, is compared with the intercept calculated through the UNIZAR and HFCAL models. As main conclusions, the UNIZAR and HFCAL models could be quite appropriate tools for design and optimization, provided the energetic images from the heliostats to be used in the collector field have been analyzed beforehand. Also note that the HFCAL model is much simpler and slightly more accurate than the UNIZAR model. (author)
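    A sketch of the one-point-fitting idea under a circular-Gaussian flux assumption (the HFCAL model is Gaussian-based, but the exact parameterization here is simplified and illustrative; the numbers are made up):

    ```python
    import numpy as np

    def sigma_from_peak(total_power, peak_flux):
        """For a circular-Gaussian spot, P = 2*pi*sigma^2*F_peak, so the one
        measured maximum flux fixes the spot width sigma."""
        return np.sqrt(total_power / (2.0 * np.pi * peak_flux))

    def flux(r, peak_flux, sigma):
        return peak_flux * np.exp(-r**2 / (2.0 * sigma**2))

    sigma = sigma_from_peak(total_power=45e3, peak_flux=6.5e5)  # W, W/m^2
    ```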

  10. Ecotracer: analyzing concentration of contaminants and radioisotopes in an aquatic spatial-dynamic food web model.

    PubMed

    Walters, William J; Christensen, Villy

    2018-01-01

    Ecotracer is a tool in the Ecopath with Ecosim (EwE) software package used to simulate and analyze the transport of contaminants such as methylmercury or radiocesium through aquatic food webs. Ecotracer solves the contaminant dynamic equations simultaneously with the biomass dynamic equations in Ecosim/Ecospace. In this paper, we give a detailed description of the Ecotracer module and analyze its performance on two problems of differing complexity. Ecotracer was modified from previous versions to model contaminant excretion more accurately, and new numerical integration algorithms were implemented to increase accuracy and robustness. To test the mathematical robustness of the computational algorithm, Ecotracer was run on a simple problem for which an analytical solution is known. These results demonstrated the effectiveness of the program numerics. A much more complex model, the release of the cesium radionuclide 137Cs from the Fukushima Dai-ichi nuclear accident, was also modeled and analyzed. A comparison of the Ecotracer results with sampled 137Cs measurements in the coastal ocean area around Fukushima shows the promise of the tool but also highlights some important limitations. Copyright © 2017 Elsevier Ltd. All rights reserved.
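    A minimal sketch of the kind of contaminant balance integrated alongside the biomass dynamics, reduced here to a single compartment with constant uptake and first-order excretion (illustrative only; not Ecotracer's full equation set):

    ```python
    from scipy.integrate import solve_ivp

    def contaminant(t, c, uptake=0.10, excretion=0.02):
        # dC/dt = intake from water and food - first-order excretion loss
        return [uptake - excretion * c[0]]

    sol = solve_ivp(contaminant, t_span=(0.0, 365.0), y0=[0.0], max_step=1.0)
    final_burden = sol.y[0, -1]   # approaches uptake/excretion = 5.0
    ```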

  11. 3-D discrete analytical ridgelet transform.

    PubMed

    Helbert, David; Carré, Philippe; Andres, Eric

    2006-12-01

    In this paper, we propose an implementation of the 3-D Ridgelet transform: the 3-D discrete analytical Ridgelet transform (3-D DART). This transform uses the Fourier strategy for the computation of the associated 3-D discrete Radon transform. The innovative step is the definition of a discrete 3-D transform within discrete analytical geometry theory, by the construction of 3-D discrete analytical lines in the Fourier domain. We propose two types of 3-D discrete lines: 3-D discrete radial lines going through the origin defined from their orthogonal projections, and 3-D planes covered with 2-D discrete line segments. These discrete analytical lines have a parameter called arithmetical thickness, allowing us to define a 3-D DART adapted to a specific application. Indeed, the 3-D DART representation is not orthogonal; it is associated with a flexible redundancy factor. The 3-D DART has a very simple forward/inverse algorithm that provides an exact reconstruction without any iterative method. In order to illustrate the potential of this new discrete transform, we apply the 3-D DART and its extension to the Local-DART (with smooth windowing) to the denoising of 3-D images and color video. These experimental results show that simple thresholding of the 3-D DART coefficients is efficient.

  12. A new disposable electrode for electrochemical study of leukemia K562 cells and anticancer drug sensitivity test.

    PubMed

    Yu, Chunmei; Zhu, Zhenkun; Wang, Li; Wang, Qiuhong; Bao, Ning; Gu, Haiying

    2014-03-15

    Developing cost-effective and simple analysis tools is of vital importance for practical applications in bioanalysis. In this work, a new disposable electrochemical cell sensor with low cost and simple fabrication was proposed to study the electrochemical behavior of leukemia K562 cells and the effect of anticancer drugs on cell viability. The analytical device was integrated by using ITO glass as the substrate of working electrodes and paper as the electrolytic cell. The cyclic voltammetry of the K562 cells at the disposable electrode exhibited an irreversible anodic peak and the peak current is proportional to the cell number. This anodic peak is attributed to the oxidation of guanine in cells involving two protons per transfer of two electrons. For the drug sensitivity tests, arsenic trioxide and cyclophosphamide were added to cell culture media. As a result, the electrochemical responses of the K562 cells decreased significantly. The cytotoxicity curves and results obtained corresponded well with the results of CCK-8 assays. In comparison to conventional methods, the proposed method is simple, rapid and inexpensive. More importantly, the developed sensor is supposed to be a single-use disposable device and electrodes were prepared "as new" for each experiment. We think that such disposable electrodes with these characteristics are suitable for experimental study with cancer cells or other types of pathogens for disease diagnosis, drug selection and on-site monitoring. © 2013 Elsevier B.V. All rights reserved.

  13. Equivalent circuit models for interpreting impedance perturbation spectroscopy data

    NASA Astrophysics Data System (ADS)

    Smith, R. Lowell

    2004-07-01

    As in-situ structural integrity monitoring disciplines mature, there is a growing need to process sensor/actuator data efficiently in real time. Although smaller, faster embedded processors will contribute to this, it is also important to develop straightforward, robust methods to reduce the overall computational burden for practical applications of interest. This paper addresses the use of equivalent circuit modeling techniques for inferring structure attributes monitored using impedance perturbation spectroscopy. In pioneering work about ten years ago significant progress was associated with the development of simple impedance models derived from the piezoelectric equations. Using mathematical modeling tools currently available from research in ultrasonics and impedance spectroscopy is expected to provide additional synergistic benefits. For purposes of structural health monitoring the objective is to use impedance spectroscopy data to infer the physical condition of structures to which small piezoelectric actuators are bonded. Features of interest include stiffness changes, mass loading, and damping or mechanical losses. Equivalent circuit models are typically simple enough to facilitate the development of practical analytical models of the actuator-structure interaction. This type of parametric structure model allows raw impedance/admittance data to be interpreted optimally using standard multiple, nonlinear regression analysis. One potential long-term outcome is the possibility of cataloging measured viscoelastic properties of the mechanical subsystems of interest as simple lists of attributes and their statistical uncertainties, whose evolution can be followed in time. Equivalent circuit models are well suited for addressing calibration and self-consistency issues such as temperature corrections, Poisson mode coupling, and distributed relaxation processes.
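    A minimal sketch of evaluating one standard equivalent circuit for a bonded piezoelectric element, the Butterworth-Van Dyke model (the parameter values are placeholders; structural stiffness, mass, and damping changes would shift R1, L1, and C1):

    ```python
    import numpy as np

    def z_bvd(freq_hz, r1, l1, c1, c0):
        """Impedance of a Butterworth-Van Dyke circuit: a series R1-L1-C1
        motional branch in parallel with the static capacitance C0."""
        s = 2j * np.pi * freq_hz
        z_motional = r1 + s * l1 + 1.0 / (s * c1)
        z_static = 1.0 / (s * c0)
        return (z_motional * z_static) / (z_motional + z_static)

    f = np.linspace(50e3, 150e3, 1001)    # sweep spanning the ~89 kHz resonance
    z = z_bvd(f, r1=120.0, l1=8e-3, c1=0.4e-9, c0=5e-9)
    ```

    Fitting measured impedance spectra to such a model by nonlinear regression yields the parameter lists and uncertainties described above.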

  14. Haze Gray Paint and the U.S. Navy: A Procurement Process Review

    DTIC Science & Technology

    2017-12-01

    The research encompasses both qualitative and quantitative analytical tools utilizing historical demand data for Silicone Alkyd ... inventory level of 1K Polysiloxane in support of the fleet. As discussed in the Summary section, this research used a qualitative and a quantitative approach to analyze the Polysiloxane ...

  15. Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools

    DTIC Science & Technology

    2014-01-14

    Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools. Final Technical Report SERC-2014-TR-041-1, January 14, 2014. Sponsored by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51). SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology.

  16. Visualization and Analytics Software Tools for Peregrine System |

    Science.gov Websites

    Visualization and analytics software tools available for the Peregrine system include: R, a language and environment for statistical computing and graphics (see the R web site for more information); FastX, used for remote visualization of OpenGL-based applications (see the FastX page for more information); and ParaView, an open-source data analysis and visualization application.

  17. Dynamic Vision for Control

    DTIC Science & Technology

    2006-07-27

    The goal of this project was to develop analytical and computational tools to make vision a viable sensor for control. We have proposed the framework of stereoscopic segmentation, where multiple images of the same objects were jointly processed to extract geometry.

  18. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    DTIC Science & Technology

    2011-03-28

    Working Paper 3: Research Methods Discussion for the Study Team. Generating empirical materials in grounded theory ... research I have conducted using these methods. Cited works include "... Survey and a Case Study," Kjeller, Norway: FFI, and Glaser, B. G. & Strauss, A. L. (1967), "The Discovery of Grounded Theory."

  19. A computational method for optimizing fuel treatment locations

    Treesearch

    Mark A. Finney

    2006-01-01

    Modeling and experiments have suggested that spatial fuel treatment patterns can influence the movement of large fires. On simple theoretical landscapes consisting of two fuel types (treated and untreated) optimal patterns can be analytically derived that disrupt fire growth efficiently (i.e. with less area treated than random patterns). Although conceptually simple,...

  20. Demonstrating Success: Web Analytics and Continuous Improvement

    ERIC Educational Resources Information Center

    Loftus, Wayne

    2012-01-01

    As free and low-cost Web analytics tools become more sophisticated, libraries' approach to user analysis can become more nuanced and precise. Tracking appropriate metrics with a well-formulated analytics program can inform design decisions, demonstrate the degree to which those decisions have succeeded, and thereby inform the next iteration in the…

  1. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  2. Transport of a decay chain in homogenous porous media: analytical solutions.

    PubMed

    Bauer, P; Attinger, S; Kinzelbach, W

    2001-06-01

    With the aid of integral transforms, analytical solutions for the transport of a decay chain in homogeneous porous media are derived. Unidirectional steady-state flow and radial steady-state flow in single and multiple porosity media are considered. At least in the Laplace domain, all solutions can be written in closed analytical formulae. In part, the solutions can also be inverted analytically; if not, analytical calculation of the steady-state concentration distributions, evaluation of temporal moments and numerical inversion are still possible. Formulae for several simple boundary conditions are given and visualized in this paper. The derived novel solutions are widely applicable and are very useful for the validation of numerical transport codes.
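    The generic 1-D governing equation for member i of such a chain (uniform flow with dispersion coefficient D, velocity v, and decay constants λ_i; retardation factors are omitted for brevity) reads:

    ```latex
    \frac{\partial c_i}{\partial t}
      = D \frac{\partial^2 c_i}{\partial x^2}
      - v \frac{\partial c_i}{\partial x}
      - \lambda_i c_i + \lambda_{i-1} c_{i-1},
    \qquad i = 1, 2, \dots \quad (\lambda_0 c_0 \equiv 0)
    ```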

  3. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  4. MI-Sim: A MATLAB package for the numerical analysis of microbial ecological interactions.

    PubMed

    Wade, Matthew J; Oakley, Jordan; Harbisher, Sophie; Parker, Nicholas G; Dolfing, Jan

    2017-01-01

    Food webs and other classes of ecological network motifs are a means of describing feeding relationships between consumers and producers in an ecosystem. They have application across scales, differing only in the underlying characteristics of the organisms and substrates describing the system. Mathematical modelling, using mechanistic approaches to describe the dynamic behaviour and properties of the system through sets of ordinary differential equations, has been used extensively in ecology. Models allow simulation of the dynamics of the various motifs, and their numerical analysis provides a greater understanding of the interplay between the system components and their intrinsic properties. We have developed the MI-Sim software for use with MATLAB to allow a rigorous and rapid numerical analysis of several common ecological motifs. MI-Sim contains a series of the most commonly used motifs such as cooperation, competition and predation. It does not require detailed knowledge of mathematical analytical techniques and is offered as a single graphical user interface containing all input and output options. The tools available in the current version of MI-Sim include model simulation, steady-state existence and stability analysis, and basin of attraction analysis. The software includes seven ecological interaction motifs and seven growth function models. Unlike other system analysis tools, MI-Sim is designed as a simple and user-friendly tool specific to ecological population-type models, allowing for rapid assessment of their dynamical and behavioural properties.
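    MI-Sim itself is a MATLAB package; as a language-neutral sketch of one motif it analyzes, here is a simple predation model (Lotka-Volterra growth and predation with a conversion efficiency and predator mortality; parameter values are arbitrary):

    ```python
    from scipy.integrate import solve_ivp

    def predation(t, y, r=1.0, a=0.5, e=0.3, m=0.2):
        prey, predator = y
        dprey = r * prey - a * prey * predator           # growth minus predation
        dpred = e * a * prey * predator - m * predator   # conversion minus mortality
        return [dprey, dpred]

    sol = solve_ivp(predation, (0.0, 100.0), [1.0, 0.5], max_step=0.05)
    ```

    Steady-state and stability analysis of such motifs proceeds from the Jacobian of this right-hand side.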

  5. SEADS 3.0 Sectoral Energy/Employment Analysis and Data System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roop, Joseph M.; Anderson, David A.; Schultz, Robert W.

    2007-12-17

    SEADS 3.0, the Sectoral Energy/Employment Analysis and Data System, is a revision and upgrade of SEADS-PC, a software package designed for the analysis of policy that can be described by modifying the final demands of consumers, businesses, or governments (Roop et al., 1995). If a question can be formulated so that its implications can be translated into changes in final demands for goods and services, then SEADS 3.0 provides a quick and easy tool to assess preliminary impacts. And SEADS 3.0 should be considered just that: a quick and easy way to get preliminary results. Often a thorough answer, even to such a simple question as, “What would be the effect on U.S. energy use and employment if the Federal Government doubled R&D expenditures?”, requires a more sophisticated analytical framework than the input-output structure embedded in SEADS 3.0. The tool uses a static input-output model to assess the impacts of changes in final demands first on industry output, and then on employment and energy use. The employment and energy impacts are derived by multiplying the industry outputs (derived from the changed final demands) by industry-specific energy and employment coefficients. The tool also allows for the specification of regional or state employment impacts, though this option is not available for energy impacts.
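    The core static input-output calculation can be sketched in a few lines: solve the Leontief system (I - A)x = d for industry output, then apply per-unit coefficients (the 2-sector matrix and coefficients below are invented for illustration):

    ```python
    import numpy as np

    A = np.array([[0.20, 0.30],       # hypothetical technical coefficients
                  [0.10, 0.40]])
    d = np.array([100.0, 50.0])       # change in final demand by sector

    x = np.linalg.solve(np.eye(2) - A, d)   # industry output: (I - A) x = d

    energy_per_output = np.array([0.5, 1.2])   # assumed energy coefficients
    jobs_per_output = np.array([8.0, 3.0])     # assumed employment coefficients
    energy_impact = energy_per_output * x
    employment_impact = jobs_per_output * x
    ```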

  6. Designing instruction to support mechanical reasoning: Three alternatives in the simple machines learning environment

    NASA Astrophysics Data System (ADS)

    McKenna, Ann Frances

    2001-07-01

    Creating a classroom environment that fosters a productive learning experience and engages students in the learning process is a complex endeavor. A classroom environment is dynamic and requires a unique synergy among students, teacher, classroom artifacts and events to achieve robust understanding and knowledge integration. This dissertation addresses this complex issue by developing, implementing, and investigating the simple machines learning environment (SIMALE) to support students' mechanical reasoning and understanding. SIMALE was designed to support reflection, collaborative learning, and to engage students in generative learning through multiple representations of concepts and successive experimentation and design activities. Two key components of SIMALE are an original web-based software tool and hands-on Lego activities. A research study consisting of three treatment groups was created to investigate the benefits of hands-on and web-based computer activities on students' analytic problem solving ability, drawing/modeling ability, and conceptual understanding. The study was conducted with two populations of students that represent a diverse group with respect to gender, ethnicity, academic achievement and social/economic status. One population of students in this dissertation study participated from the Mathematics, Engineering, and Science Achievement (MESA) program that serves minorities and under-represented groups in science and mathematics. The second group was recruited from the Academic Talent Development Program (ATDP) that is an academically competitive outreach program offered through the University of California at Berkeley. Results from this dissertation show success of the SIMALE along several dimensions. First, students in both populations achieved significant gains in analytic problem solving ability, drawing/modeling ability, and conceptual understanding. Second, significant differences that were found on pre-test measures were eliminated on post-test measures. Specifically, female students scored significantly lower than males on the overall pre-tests but scored as well as males on the same post-test measures. MESA students also scored significantly lower than ATDP students on pre-test measures but both populations scored equally well on the post-tests. This dissertation has therefore shown the SIMALE to support a collaborative, reflective, and generative learning environment. Furthermore, the SIMALE clearly contributes to students' mechanical reasoning and understanding of simple machines concepts for a diverse population of students.

  7. Surface enhanced Raman spectroscopy based nanoparticle assays for rapid, point-of-care diagnostics

    NASA Astrophysics Data System (ADS)

    Driscoll, Ashley J.

    Nucleotide and immunoassays are important tools for disease diagnostics. Many of the current laboratory-based analytical diagnostic techniques require multiple assay steps and long incubation times before results are acquired. In the development of bioassays designed for detecting the emergence and spread of diseases in point-of-care (POC) and remote settings, more rapid and portable analytical methods are necessary. Nanoparticles provide simple and reproducible synthetic methods for the preparation of substrates that can be applied in colloidal assays, providing gains in kinetics due to miniaturization and plasmonic substrates for surface enhanced spectroscopies. Specifically, surface enhanced Raman spectroscopy (SERS) is finding broad application as a signal transduction method in immunological and nucleotide assays due to the production of narrow spectral peaks from the scattering molecules and the potential for simultaneous multiple analyte detection. The application of SERS to a no-wash, magnetic capture assay for the detection of West Nile Virus Envelope and Rift Valley Fever Virus N antigens is described. The platform utilizes colloid based capture of the target antigen in solution, magnetic collection of the immunocomplexes and acquisition of SERS spectra by a handheld Raman spectrometer. The reagents for a core-shell nanoparticle, SERS based assay designed for the capture of target microRNA implicated in acute myocardial infarction are also characterized. Several new, small molecule Raman scatterers are introduced and used to analyze the enhancing properties of the synthesized gold coated-magnetic nanoparticles. Nucleotide and immunoassay platforms have shown improvements in speed and analyte capture through the miniaturization of the capture surface and particle-based capture systems can provide a route to further surface miniaturization. A reaction-diffusion model of the colloidal assay platform is presented to understand the interplay of system parameters such as particle diameter, initial analyte concentration and dissociation constants. The projected sensitivities over a broad range of assay conditions are examined and the governing regime of particle systems reported. The results provide metrics in the design of more robust analytics that are of particular interest for POC diagnostics.

  8. A Comprehensive Analytical Solution of the Nonlinear Pendulum

    ERIC Educational Resources Information Center

    Ochs, Karlheinz

    2011-01-01

    In this paper, an analytical solution for the differential equation of the simple but nonlinear pendulum is derived. This solution is valid for any time and is not limited to any special initial instance or initial values. Moreover, this solution holds if the pendulum swings over or not. The method of approach is based on Jacobi elliptic functions…
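    The solution can be evaluated numerically; a sketch for release from rest at amplitude θ₀, using the common Jacobi-elliptic form sin(θ/2) = k·sn(K − ω₀t; k) with k = sin(θ₀/2) (one standard convention; the paper's parameterization may differ):

    ```python
    import numpy as np
    from scipy.special import ellipj, ellipk

    g, length = 9.81, 1.0
    omega0 = np.sqrt(g / length)
    theta0 = np.radians(120.0)        # large amplitude, released from rest
    k = np.sin(theta0 / 2.0)
    m = k**2                          # SciPy's elliptic parameter is m = k^2
    K = ellipk(m)
    period = 4.0 * K / omega0         # exact large-amplitude period

    t = np.linspace(0.0, 2.0 * period, 500)
    sn, cn, dn, ph = ellipj(K - omega0 * t, m)
    theta = 2.0 * np.arcsin(k * sn)   # theta(0) = theta0, valid for all t
    ```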

  9. Distributed Generation Interconnection Collaborative | NREL

    Science.gov Websites

    Analytical Methods for Interconnection: Many utilities and jurisdictions are seeking the right screening and analytical methods and tools to meet their reliability needs, reduce paperwork, and improve customer service.

  10. Use of Speech Analyses within a Mobile Application for the Assessment of Cognitive Impairment in Elderly People.

    PubMed

    Konig, Alexandra; Satt, Aharon; Sorin, Alex; Hoory, Ran; Derreumaux, Alexandre; David, Renaud; Robert, Phillippe H

    2018-01-01

    Various types of dementia and Mild Cognitive Impairment (MCI) are manifested as irregularities in human speech and language, which have proven to be strong predictors of disease presence and progression. Therefore, automatic speech analytics provided by a mobile application may be a useful tool in providing additional indicators for the assessment and detection of early-stage dementia and MCI. 165 participants (subjects with subjective cognitive impairment (SCI), MCI patients, Alzheimer's disease (AD) and mixed dementia (MD) patients) were recorded with a mobile application while performing several short vocal cognitive tasks during a regular consultation. These tasks included verbal fluency, picture description, counting down and a free speech task. The voice recordings were processed in two steps: in the first step, vocal markers were extracted using speech signal processing techniques; in the second, the vocal markers were tested to assess their 'power' to distinguish between SCI, MCI, AD and MD. The second step included training automatic classifiers for detecting MCI and AD, based on machine learning methods, and testing the detection accuracy. The fluency and free speech tasks obtained the highest accuracy rates in classifying AD vs. MD vs. MCI vs. SCI. Using the data, we demonstrated classification accuracy as follows: SCI vs. AD = 92% accuracy; SCI vs. MD = 92% accuracy; SCI vs. MCI = 86% accuracy and MCI vs. AD = 86%. Our results indicate the potential value of vocal analytics and the use of a mobile application for accurate automatic differentiation between SCI, MCI and AD. This tool can provide the clinician with meaningful information for the assessment and monitoring of people with MCI and AD based on a non-invasive, simple and low-cost method. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
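    A minimal sketch of the second processing step, training and cross-validating a classifier on extracted vocal markers (the features, labels, and choice of SVM are synthetic placeholders; the study's actual feature set and classifiers are not reproduced here):

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.random((165, 20))          # 165 participants x 20 vocal markers (synthetic)
    y = rng.integers(0, 2, size=165)   # e.g. SCI (0) vs. AD (1) labels (synthetic)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    accuracy = cross_val_score(clf, X, y, cv=5).mean()
    ```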

  11. Robust Classification of Small-Molecule Mechanism of Action Using a Minimalist High-Content Microscopy Screen and Multidimensional Phenotypic Trajectory Analysis

    PubMed Central

    Twarog, Nathaniel R.; Low, Jonathan A.; Currier, Duane G.; Miller, Greg; Chen, Taosheng; Shelat, Anang A.

    2016-01-01

    Phenotypic screening through high-content automated microscopy is a powerful tool for evaluating the mechanism of action of candidate therapeutics. Despite more than a decade of development, however, high content assays have yielded mixed results, identifying robust phenotypes in only a small subset of compound classes. This has led to a combinatorial explosion of assay techniques, analyzing cellular phenotypes across dozens of assays with hundreds of measurements. Here, using a minimalist three-stain assay and only 23 basic cellular measurements, we developed an analytical approach that leverages informative dimensions extracted by linear discriminant analysis to evaluate similarity between the phenotypic trajectories of different compounds in response to a range of doses. This method enabled us to visualize biologically-interpretable phenotypic tracks populated by compounds of similar mechanism of action, cluster compounds according to phenotypic similarity, and classify novel compounds by comparing them to phenotypically active exemplars. Hierarchical clustering applied to 154 compounds from over a dozen different mechanistic classes demonstrated tight agreement with published compound mechanism classification. Using 11 phenotypically active mechanism classes, classification was performed on all 154 compounds: 78% were correctly identified as belonging to one of the 11 exemplar classes or to a different unspecified class, with accuracy increasing to 89% when less phenotypically active compounds were excluded. Importantly, several apparent clustering and classification failures, including rigosertib and 5-fluoro-2’-deoxycytidine, instead revealed more complex mechanisms or off-target effects verified by more recent publications. These results show that a simple, easily replicated, minimalist high-content assay can reveal subtle variations in the cellular phenotype induced by compounds and can correctly predict mechanism of action, as long as the appropriate analytical tools are used. PMID:26886014
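    A minimal sketch of the dimensionality-reduction step described above: projecting per-well measurements onto linear-discriminant axes, in which dose-ordered phenotypic trajectories can then be traced (the data are synthetic; only the 23-measurement width mirrors the assay):

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    X = rng.random((600, 23))            # wells x 23 basic cellular measurements
    y = rng.integers(0, 11, size=600)    # mechanism-class labels (synthetic)

    lda = LinearDiscriminantAnalysis(n_components=3)
    X_disc = lda.fit_transform(X, y)     # informative phenotypic dimensions

    # Trajectories: order each compound's wells by dose in this reduced space.
    ```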

  13. Liquid chromatographic determination of sennosides in Cassia angustifolia leaves.

    PubMed

    Srivastava, Alpuna; Pandey, Richa; Verma, Ram K; Gupta, Madan M

    2006-01-01

    A simple liquid chromatographic method was developed for the determination of sennosides B and A in leaves of Cassia angustifolia. These compounds were extracted from leaves with a mixture of methanol-water (70 + 30, v/v) after defatting with hexane. Analyte separation and quantitation were achieved by gradient reversed-phase liquid chromatography and UV absorbance at 270 nm using a photodiode array detector. The method involves the use of an RP-18 Lichrocart reversed-phase column (5 microm, 125 x 4.0 mm id) and a binary gradient mobile-phase profile. The various other aspects of analysis, namely, peak purity, similarity, recovery, repeatability, and robustness, were validated. Average recoveries of 98.5 and 98.6%, with coefficients of variation of 0.8 and 0.3%, were obtained by spiking the sample solution with standard solutions at 3 different concentrations (60, 100, and 200 microg/mL). Detection limits were 10 microg/mL for sennoside B and 35 microg/mL for sennoside A, present in the sample solution. The quantitation limits were 28 and 100 microg/mL. The analytical method was applied to a large number of senna leaf samples. The new method provides a reliable tool for rapid screening of C. angustifolia samples in large numbers, which is needed in breeding/genetic engineering and genetic mapping experiments.
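
    A hedged sketch of the spike-recovery arithmetic summarized above (mean percent recovery and its coefficient of variation); the measured values are invented placeholders, not the paper's raw data.

    ```python
    # Spike-recovery statistics: mean percent recovery and its CV.
    import numpy as np

    def recovery_stats(measured, spiked):
        """Mean % recovery of spiked analyte and its coefficient of variation."""
        rec = 100.0 * np.asarray(measured, float) / np.asarray(spiked, float)
        return rec.mean(), 100.0 * rec.std(ddof=1) / rec.mean()

    mean_rec, cv = recovery_stats(measured=[59.2, 98.4, 197.1],  # microg/mL found
                                  spiked=[60.0, 100.0, 200.0])   # microg/mL added
    print(f"mean recovery {mean_rec:.1f}%, CV {cv:.1f}%")
    ```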

  14. Thawing as a critical pre-analytical step in the lipidomic profiling of plasma samples: New standardized protocol.

    PubMed

    Pizarro, Consuelo; Arenzana-Rámila, Irene; Pérez-del-Notario, Nuria; Pérez-Matute, Patricia; González-Sáiz, José María

    2016-03-17

    Lipid profiling is a promising tool for the discovery and subsequent identification of biomarkers associated with various diseases. However, data quality is quite dependent on the pre-analytical methods employed. To date, potential confounding factors that may affect lipid metabolite levels after the thawing of plasma for biomarker exploration studies have not been thoroughly evaluated. In this study, by means of experimental design methodology, we performed the first in-depth examination of the ways in which thawing conditions affect lipid metabolite levels. After the optimization stage, we concluded that temperature, sample volume and the thawing method were the determining factors that had to be exhaustively controlled in the thawing process to ensure the quality of biomarker discovery. Best thawing conditions were found to be: 4 °C, with 0.25 mL of human plasma and ultrasound (US) thawing. The new US proposed thawing method was quicker than the other methods we studied, allowed more features to be identified and increased the signal of the lipids. In view of its speed, efficiency and detectability, the US thawing method appears to be a simple, economical method for the thawing of plasma samples, which could easily be applied in clinical laboratories before lipid profiling studies. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Analytical performances of d-ROMs test and BAP test in canine plasma. Definition of the normal range in healthy Labrador dogs.

    PubMed

    Pasquini, A; Luchetti, E; Marchetti, V; Cardini, G; Iorio, E L

    2008-02-01

    A high level of ROS (Reactive Oxygen Species), due to increased production of oxidant species and/or decreased efficacy of the antioxidant system, can lead to oxidative stress, an emerging health risk factor involved in aging and in many diseases, including inflammatory, infectious and degenerative disorders, in both humans and animals. In recent years, several assay panels have been developed to evaluate the overall oxidative balance by concomitantly assessing ROS production and antioxidant system capability. In this report, validation trials of the d-ROMs (Reactive Oxygen Metabolites-derived compounds) and BAP (Biological Antioxidant Potential) tests in the canine species are described, and specific reference ranges are calculated in a Labrador population. The results of linearity, precision and accuracy trials show that both tests exhibit good to excellent analytical performance. The possibility of measuring oxidative stress in vivo with simple, cheap and accurate tests, the d-ROMs test and the BAP test, provides veterinarians with a very suitable tool for monitoring oxidative stress and for correctly choosing antioxidant supplementation in diseases proven to be related to oxidative stress in animals, particularly dogs. Further studies will be useful to confirm this possibility.

  16. Rapid determination of amino acids in neonatal blood samples based on derivatization with isobutyl chloroformate followed by solid-phase microextraction and gas chromatography/mass spectrometry.

    PubMed

    Deng, Chunhui; Li, Ning; Zhang, Xiangmin

    2004-01-01

    The purpose of this study was to develop a simple, rapid and sensitive analytical method for determination of amino acids in neonatal blood samples. The developed method involves the employment of derivatization and a solid-phase microextraction (SPME) technique together with gas chromatography/mass spectrometry (GC/MS). Amino acids in blood samples were derivatized by a mixture of isobutyl chloroformate, methanol and pyridine, and the N(O,S)-alkoxycarbonyl alkyl esters thus formed were headspace extracted by a SPME fiber. Finally, the extracted analytes on the fiber were desorbed and detected by GC/MS in electron impact (EI) mode. L-Valine, L-leucine, L-isoleucine, L-phenylalanine and L-tyrosine in blood samples were quantitatively analyzed by measurement of the corresponding N(O,S)-alkoxycarbonyl alkyl esters using an external standard method. SPME conditions were optimized, and the method was validated. The method was applied to diagnosis of neonatal phenylketonuria (PKU) and maple syrup urine disease (MSUD) by the analyses of five amino acids in blood samples. The results showed that the proposed method is a potentially powerful tool for simultaneous screening for neonatal PKU and MSUD. Copyright (c) 2004 John Wiley & Sons, Ltd.

  17. DMSO Assisted Electrospray Ionization for the Detection of Small Peptide Hormones in Urine by Dilute-and-Shoot-Liquid-Chromatography-High Resolution Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Judák, Péter; Grainger, Janelle; Goebel, Catrin; Van Eenoo, Peter; Deventer, Koen

    2017-08-01

    The mobile-phase additive dimethyl sulfoxide (DMSO) has been described as a useful tool to enhance electrospray ionization (ESI) of peptides and proteins. So far, this technique has mainly been used in proteomic/peptide research, and its applicability in a routine clinical laboratory setting (i.e., doping control analysis) has not yet been described. This work provides a simple, easy-to-implement screening method for the detection of doping-relevant small peptides (GHRPs, GnRHs, GHS, and vasopressin analogues) with molecular weight less than 2 kDa, applying DMSO in the mobile phase. The gain in sensitivity was sufficient to inject the urine samples after a 2-fold dilution step, omitting a time-consuming sample preparation. The employed analytical procedure was validated for the qualitative determination of 36 compounds, including 13 metabolites. The detection limits (LODs) ranged between 50 and 1000 pg/mL and were compliant with the 2 ng/mL minimum detection level required by the World Anti-Doping Agency (WADA) for all the target peptides. To demonstrate the feasibility of the work, urine samples obtained from patients who had been treated with desmopressin or leuprolide, as well as urine samples that had been declared adverse analytical findings, were analyzed.

  18. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for quantitation of Benazepril alone and in combination with Amlodipine.

    PubMed

    Farouk, M; Elaziz, Omar Abd; Tawakkol, Shereen M; Hemdan, A; Shehata, Mostafa A

    2014-04-05

    Four simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the determination of Benazepril (BENZ) alone and in combination with Amlodipine (AML) in pharmaceutical dosage form. The first method is pH-induced difference spectrophotometry, where BENZ can be measured in the presence of AML, as it shows maximum absorption at 237 nm and 241 nm in 0.1N HCl and 0.1N NaOH, respectively, while AML shows no wavelength shift in either solvent. The second method is the new Extended Ratio Subtraction Method (EXRSM) coupled to the Ratio Subtraction Method (RSM) for determination of both drugs in commercial dosage form. The third and fourth methods are multivariate calibration methods: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines; the standard curves were found to be linear in the range of 2-30 μg/mL for BENZ in the difference and extended ratio subtraction spectrophotometric methods, and 5-30 μg/mL for AML in the EXRSM method, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits. Copyright © 2013 Elsevier B.V. All rights reserved.
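
    As a schematic of the multivariate-calibration idea (not the published method or dataset), the sketch below builds synthetic two-component spectra and fits a two-component PLS model that recovers the concentrations of a mixed "unknown"; the band positions, concentration ranges, and noise level are all assumptions.

    ```python
    # Illustrative PLS calibration on synthetic two-component spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    wl = np.linspace(220, 320, 101)                     # mock wavelength axis, nm
    pure_benz = np.exp(-0.5 * ((wl - 237) / 8) ** 2)    # mock BENZ band
    pure_aml = np.exp(-0.5 * ((wl - 290) / 20) ** 2)    # mock AML band

    # Training mixtures: concentrations inside the validated linear ranges.
    conc = rng.uniform(low=[2.0, 5.0], high=[30.0, 30.0], size=(25, 2))
    spectra = conc @ np.vstack([pure_benz, pure_aml])
    spectra += rng.normal(scale=0.01, size=spectra.shape)   # instrument noise

    pls = PLSRegression(n_components=2).fit(spectra, conc)
    unknown = 10.0 * pure_benz + 20.0 * pure_aml
    print(pls.predict(unknown[None, :]))                 # expect roughly [10, 20]
    ```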

  19. Prediction of down-gradient impacts of DNAPL source depletion using tracer techniques: Laboratory and modeling validation

    NASA Astrophysics Data System (ADS)

    Jawitz, J. W.; Basu, N.; Chen, X.

    2007-05-01

    Interwell application of coupled nonreactive and reactive tracers through aquifer contaminant source zones enables quantitative characterization of aquifer heterogeneity and contaminant architecture. Parameters obtained from tracer tests are presented here in a Lagrangian framework that can be used to predict the dissolution of nonaqueous phase liquid (NAPL) contaminants. Nonreactive tracers are commonly used to provide information about travel time distributions in hydrologic systems. Reactive tracers have more recently been introduced as a tool to quantify the amount of NAPL contaminant present within the tracer swept volume. Our group has extended reactive tracer techniques to also characterize NAPL spatial distribution heterogeneity. By conceptualizing the flow field through an aquifer as a collection of streamtubes, the aquifer hydrodynamic heterogeneities may be characterized by a nonreactive tracer travel time distribution, and NAPL spatial distribution heterogeneity may be similarly described using reactive travel time distributions. The combined statistics of these distributions are used to derive a simple analytical solution for contaminant dissolution. This analytical solution, and the tracer techniques used for its parameterization, were validated both numerically and experimentally. Illustrative applications are presented from numerical simulations using the multiphase flow and transport simulator UTCHEM, and laboratory experiments of surfactant-enhanced NAPL remediation in two-dimensional flow chambers.

  20. A Simple Sonication Improves Protein Signal in Matrix-Assisted Laser Desorption Ionization Imaging

    NASA Astrophysics Data System (ADS)

    Lin, Li-En; Su, Pin-Rui; Wu, Hsin-Yi; Hsu, Cheng-Chih

    2018-02-01

    Proper matrix application is crucial for obtaining high-quality matrix-assisted laser desorption ionization (MALDI) mass spectrometry imaging (MSI). Solvent-free sublimation was introduced as an approach to homogeneous coating that gives small crystal sizes of the organic matrix. However, sublimation has lower extraction efficiency of analytes. Here, we show that a simple sonication step after the hydration step in the standard sublimation protocol significantly enhances the sensitivity of MALDI MSI. This modified procedure uses a common laboratory ultrasonicator to immobilize the analytes from tissue sections without noticeable delocalization. Improved imaging quality, with additional peaks above 10 kDa in the spectra, was thus obtained upon sonication treatment.

  1. Electrochemistry and analytical determination of lysergic acid diethylamide (LSD) via adsorptive stripping voltammetry.

    PubMed

    Merli, Daniele; Zamboni, Daniele; Protti, Stefano; Pesavento, Maria; Profumo, Antonella

    2014-12-01

    Lysergic acid diethylamide (LSD) is hardly detectable and quantifiable in biological samples because of its low active dose. Although several analytical tests are available, routine analysis of this drug is rarely performed. In this article, we report a simple and accurate method for the determination of LSD, based on adsorptive stripping voltammetry in DMF/tetrabutylammonium perchlorate, with a linear range of 1-90 ng L(-1) for deposition times of 50 s. An LOD of 1.4 ng L(-1) and an LOQ of 4.3 ng L(-1) were found. The method can also be applied to biological samples after a simple extraction with 1-chlorobutane. Copyright © 2014 Elsevier B.V. All rights reserved.
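
    Detection and quantitation limits like those quoted above can be estimated from a calibration line in one standard way (LOD about 3.3 s/slope, LOQ about 10 s/slope, with s the residual standard deviation); the sketch below uses invented calibration points, not the paper's data, and is only one of several accepted estimators.

    ```python
    # Calibration-based detection-limit estimate: LOD ~ 3.3*s/slope,
    # LOQ ~ 10*s/slope, with s the residual standard deviation of the fit.
    import numpy as np

    conc = np.array([1.0, 10.0, 25.0, 50.0, 75.0, 90.0])   # ng/L, invented
    signal = 0.42 * conc + np.array([0.3, -0.2, 0.4, -0.3, 0.2, -0.1])  # mock currents

    slope, intercept = np.polyfit(conc, signal, 1)
    residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)
    print(f"LOD ~ {3.3 * residual_sd / slope:.1f} ng/L, "
          f"LOQ ~ {10 * residual_sd / slope:.1f} ng/L")
    ```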

  2. Ground temperature measurement by PRT-5 for MAPS experiment

    NASA Technical Reports Server (NTRS)

    Gupta, S. K.; Tiwari, S. N.

    1978-01-01

    A simple algorithm and computer program were developed for determining the actual surface temperature from the effective brightness temperature as measured remotely by a radiation thermometer called PRT-5. This procedure allows the computation of atmospheric correction to the effective brightness temperature without performing detailed radiative transfer calculations. Model radiative transfer calculations were performed to compute atmospheric corrections for several values of the surface and atmospheric parameters individually and in combination. Polynomial regressions were performed between the magnitudes or deviations of these parameters and the corresponding computed corrections to establish simple analytical relations between them. Analytical relations were also developed to represent combined correction for simultaneous variation of parameters in terms of their individual corrections.
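
    The regression step described above is easy to picture: fit a low-order polynomial relating a parameter's deviation from its reference value to the computed brightness-temperature correction, then evaluate the polynomial instead of re-running the radiative-transfer model. The numbers below are invented stand-ins for the model results.

    ```python
    # Polynomial regression of correction vs. parameter deviation (sketch).
    import numpy as np

    water_vapor_dev = np.linspace(-1.0, 1.0, 9)     # deviation, e.g. g/cm^2
    correction_K = 0.8 * water_vapor_dev + 0.15 * water_vapor_dev ** 2  # mock output

    fit = np.poly1d(np.polyfit(water_vapor_dev, correction_K, deg=2))
    print(fit(0.5))   # correction (K) for a +0.5 deviation, no model re-run needed
    ```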

  3. Collector modulation in high-voltage bipolar transistor in the saturation mode: Analytical approach

    NASA Astrophysics Data System (ADS)

    Dmitriev, A. P.; Gert, A. V.; Levinshtein, M. E.; Yuferev, V. S.

    2018-04-01

    A simple analytical model is developed, capable of replacing the numerical solution of a system of nonlinear partial differential equations by solving a simple algebraic equation when analyzing the collector resistance modulation of a bipolar transistor in the saturation mode. In this approach, the leakage of the base current into the emitter and the recombination of non-equilibrium carriers in the base are taken into account. The data obtained are in good agreement with the results of numerical calculations and make it possible to describe both the motion of the front of the minority carriers and the steady state distribution of minority carriers across the collector in the saturation mode.

  4. Simple and Sensitive Paper-Based Device Coupling Electrochemical Sample Pretreatment and Colorimetric Detection.

    PubMed

    Silva, Thalita G; de Araujo, William R; Muñoz, Rodrigo A A; Richter, Eduardo M; Santana, Mário H P; Coltro, Wendell K T; Paixão, Thiago R L C

    2016-05-17

    We report the development of a simple, portable, low-cost, high-throughput visual colorimetric paper-based analytical device for the detection of procaine in seized cocaine samples. The interference of most common cutting agents found in cocaine samples was verified, and a novel electrochemical approach was used for sample pretreatment in order to increase the selectivity. Under the optimized experimental conditions, a linear analytical curve was obtained for procaine concentrations ranging from 5 to 60 μmol L(-1), with a detection limit of 0.9 μmol L(-1). The accuracy of the proposed method was evaluated using seized cocaine samples and an addition and recovery protocol.

  5. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives for the MCDA. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir-bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choices are in accordance with green chemistry principles. The PROMETHEE ranking results were compared with those of more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. Conversely, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
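
    To make the ranking machinery concrete, below is a compact PROMETHEE II sketch using the simplest ("usual") step preference function; the decision table, weights, and criterion directions are invented placeholders, not the questionnaire-derived values used in the study.

    ```python
    # Compact PROMETHEE II: net outranking flows with the "usual" criterion.
    import numpy as np

    def promethee_ii(X, weights, maximize):
        """Net outranking flows; higher flow = better alternative."""
        n = X.shape[0]
        signed = np.where(maximize, X, -X)    # orient every criterion as 'maximize'
        pi = np.zeros((n, n))                 # aggregated preference of a over b
        for a in range(n):
            for b in range(n):
                if a != b:
                    pi[a, b] = weights @ (signed[a] > signed[b]).astype(float)
        return (pi.sum(axis=1) - pi.sum(axis=0)) / (n - 1)

    # Toy table: 4 procedures x 3 criteria (recovery %, cost, solvent mL).
    X = np.array([[95.0, 120.0,  5.0],
                  [90.0,  40.0,  1.0],
                  [98.0, 300.0, 50.0],
                  [85.0,  30.0,  0.5]])
    weights = np.array([0.4, 0.3, 0.3])          # must sum to 1
    maximize = np.array([True, False, False])    # cost and solvent are minimized
    print(promethee_ii(X, weights, maximize))    # rank by descending net flow
    ```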

  6. A Comprehensive Tool and Analytical Pathway for Differential Molecular Profiling and Biomarker Discovery

    DTIC Science & Technology

    2014-10-20

    [Abstract not recoverable from the source record; only OCR fragments survive. The fragments describe an interface for plotting data by attribute tags, e.g., Strain (AKR, B6, BALB_B) and MUP Protein (Intact, Denatured). Report AFRL-RH-WP-TR-2014-0131; Distribution A, approved for public release.]

  7. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we have developed a suite of analytical tools to support an integrated, data-driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available, along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  8. Modeling of the Global Water Cycle - Analytical Models

    Treesearch

    Yongqiang Liu; Roni Avissar

    2005-01-01

    Both numerical and analytical models of coupled atmosphere and its underlying ground components (land, ocean, ice) are useful tools for modeling the global and regional water cycle. Unlike complex three-dimensional climate models, which need very large computing resources and involve a large number of complicated interactions often difficult to interpret, analytical...

  9. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    NASA Astrophysics Data System (ADS)

    Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-02-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.

  10. Using Interactive Data Visualizations for Exploratory Analysis in Undergraduate Genomics Coursework: Field Study Findings and Guidelines

    PubMed Central

    Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan

    2016-01-01

    Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625

  11. Designing dipolar recoupling and decoupling experiments for biological solid-state NMR using interleaved continuous wave and RF pulse irradiation.

    PubMed

    Bjerring, Morten; Jain, Sheetal; Paaske, Berit; Vinther, Joachim M; Nielsen, Niels Chr

    2013-09-17

    Rapid developments in solid-state NMR methodology have boosted this technique into a highly versatile tool for structural biology. The invention of increasingly advanced rf pulse sequences that take advantage of better hardware and sample preparation have played an important part in these advances. In the development of these new pulse sequences, researchers have taken advantage of analytical tools, such as average Hamiltonian theory or lately numerical methods based on optimal control theory. In this Account, we focus on the interplay between these strategies in the systematic development of simple pulse sequences that combines continuous wave (CW) irradiation with short pulses to obtain improved rf pulse, recoupling, sampling, and decoupling performance. Our initial work on this problem focused on the challenges associated with the increasing use of fully or partly deuterated proteins to obtain high-resolution, liquid-state-like solid-state NMR spectra. Here we exploit the overwhelming presence of (2)H in such samples as a source of polarization and to gain structural information. The (2)H nuclei possess dominant quadrupolar couplings which complicate even the simplest operations, such as rf pulses and polarization transfer to surrounding nuclei. Using optimal control and easy analytical adaptations, we demonstrate that a series of rotor synchronized short pulses may form the basis for essentially ideal rf pulse performance. Using similar approaches, we design (2)H to (13)C polarization transfer experiments that increase the efficiency by one order of magnitude over standard cross polarization experiments. We demonstrate how we can translate advanced optimal control waveforms into simple interleaved CW and rf pulse methods that form a new cross polarization experiment. This experiment significantly improves (1)H-(15)N and (15)N-(13)C transfers, which are key elements in the vast majority of biological solid-state NMR experiments. In addition, we demonstrate how interleaved sampling of spectra exploiting polarization from (1)H and (2)H nuclei can substantially enhance the sensitivity of such experiments. Finally, we present systematic development of (1)H decoupling methods where CW irradiation of moderate amplitude is interleaved with strong rotor-synchronized refocusing pulses. We show that these sequences remove residual cross terms between dipolar coupling and chemical shielding anisotropy more effectively and improve the spectral resolution over that observed in current state-of-the-art methods.

  12. Exact analytical modeling of lightwave propagation in planar media with arbitrarily graded index profiles

    NASA Astrophysics Data System (ADS)

    Krapez, J.-C.

    2018-02-01

    Applying the Darboux transformation in the optical-depth space allows building infinite chains of exact analytical solutions of the electromagnetic (EM) fields in planar 1D-graded dielectrics. As a matter of fact, infinite chains of solvable admittance profiles (e.g. refractive-index profiles, in the case of non-magnetic materials), together with the related EM fields, are simultaneously and recursively obtained. The whole procedure has received the name "PROFIDT method" for PROperty and FIeld Darboux Transformation method. By repeating the Darboux transformations we can find progressively more complex profiles and their EM solutions. An alternative is to stop after the first step and settle for a particular class of four-parameter admittance profiles that were dubbed "sech(ξ)-type". These profiles are highly flexible. For this reason, they can be used as elementary bricks for building and modeling profiles of arbitrary shape. In addition, the corresponding transfer matrix involves only elementary functions. The sub-class of "sech(ξ)-type" profiles with horizontal end-slopes (S-shaped functions) is particularly interesting: these can be used for high-level modeling of piecewise-sigmoidal refractive-index profiles encountered in various photonic devices such as matching layers, antireflection layers, rugate filters, chirped mirrors and photonic crystals. These simple analytical tools also allow exploring the fascinating properties of a new kind of structure, namely smooth quasicrystals. They can also be applied to model the propagation of other types of waves in graded media, such as acoustic waves and electric waves in tapered transmission lines.

  13. Nanofabrication of densely packed metal-polymer arrays for surface-enhanced Raman spectrometry.

    PubMed

    De Jesús, M A; Giesfeldt, K S; Oran, J M; Abu-Hatab, N A; Lavrik, N V; Sepaniak, M J

    2005-12-01

    A key element in improving the analytical capabilities of surface-enhanced Raman spectroscopy (SERS) resides in the performance characteristics of the SERS-active substrate. Variables such as shape, size, and homogeneous distribution of the metal nanoparticles throughout the substrate surface are important in the design of more analytically sensitive and reliable substrates. Electron-beam lithography (EBL) has emerged as a powerful tool for the systematic fabrication of substrates with periodic nanoscale features. EBL also allows the rational design of nanoscale features that are optimized to the frequency of the Raman laser source. In this work, the efficiency of EBL-fabricated substrates is studied by measuring the relative SERS signals of Rhodamine 6G and 1,10-phenanthroline adsorbed on a series of cubic, elliptical, and hexagonal nanopatterned pillars of ma-N 2403 directly coated by physical vapor deposition with 25 nm films of Ag or Au. The raw analyte SERS signals, and signals normalized to metal nanoparticle surface area or numbers of loci, are used to study the effects of nanoparticle morphology on the performance of a rapidly created, diverse collection of substrates. For the excitation wavelength used, the nanoparticle size, geometry, and orientation of the particle primary axis relative to the excitation polarization vector, and particularly the density of nanoparticles, are shown to strongly influence substrate performance. A correlation between the inverse of the magnitude of the laser backscatter passed by the spectrometer and the SERS activities of the various substrate patterns is also noted and provides a simple means to evaluate possible efficient coupling of the excitation radiation to localized surface plasmons for Raman enhancement.

  14. Fast and simultaneous monitoring of organic pollutants in a drinking water treatment plant by a multi-analyte biosensor followed by LC-MS validation.

    PubMed

    Rodriguez-Mozaz, Sara; de Alda, Maria J López; Barceló, Damià

    2006-04-15

    This work describes the application of an optical biosensor (RIver ANALyser, RIANA) to the simultaneous analysis of three relevant environmental organic pollutants, namely, the pesticides atrazine and isoproturon and the estrogen estrone, in real water samples. This biosensor is based on an indirect inhibition immunoassay which takes place at a chemically modified optical transducer chip. The spatially resolved modification of the transducer surface allows the simultaneous determination of selected target analytes by means of "total internal reflection fluorescence" (TIRF). The performance of the immunosensor method developed was evaluated against a well-accepted traditional method based on solid-phase extraction followed by liquid chromatography-mass spectrometry (LC-MS). The chromatographic method was superior in terms of linearity, sensitivity and accuracy, and the biosensor method in terms of repeatability, speed, cost and automation. The application of both methods in parallel to determine the occurrence and removal of atrazine, isoproturon and estrone throughout the treatment process (sand filtration, ozonation, activated carbon filtration and chlorination) in a waterworks showed an overestimation of results in the case of the biosensor, which was partially attributed to matrix and cross-reactivity effects, in spite of the addition of ovalbumin to the sample to minimize matrix interferences. Based on the comparative performance of both techniques, the biosensor emerges as a suitable tool for fast, simple and automated screening of water pollutants without sample pretreatment. To the authors' knowledge, this is the first description of the application of the biosensor RIANA in the multi-analyte configuration to the regular monitoring of pollutants in a waterworks.

  15. Plastid: nucleotide-resolution analysis of next-generation sequencing and genomics data.

    PubMed

    Dunn, Joshua G; Weissman, Jonathan S

    2016-11-22

    Next-generation sequencing (NGS) informs many biological questions with unprecedented depth and nucleotide resolution. These assays have created a need for analytical tools that enable users to manipulate data nucleotide-by-nucleotide robustly and easily. Furthermore, because many NGS assays encode information jointly within multiple properties of read alignments - for example, in ribosome profiling, the locations of ribosomes are jointly encoded in alignment coordinates and length - analytical tools are often required to extract the biological meaning from the alignments before analysis. Many assay-specific pipelines exist for this purpose, but there remains a need for user-friendly, generalized, nucleotide-resolution tools that are not limited to specific experimental regimes or analytical workflows. Plastid is a Python library designed specifically for nucleotide-resolution analysis of genomics and NGS data. As such, Plastid is designed to extract assay-specific information from read alignments while retaining generality and extensibility to novel NGS assays. Plastid represents NGS and other biological data as arrays of values associated with genomic or transcriptomic positions, and contains configurable tools to convert data from a variety of sources to such arrays. Plastid also includes numerous tools to manipulate even discontinuous genomic features, such as spliced transcripts, with nucleotide precision. Plastid automatically handles conversion between genomic and feature-centric coordinates, accounting for splicing and strand, freeing users of burdensome accounting. Finally, Plastid's data models use consistent and familiar biological idioms, enabling even beginners to develop sophisticated analytical workflows with minimal effort. Plastid is a versatile toolkit that has been used to analyze data from multiple NGS assays, including RNA-seq, ribosome profiling, and DMS-seq. It forms the genomic engine of our ORF annotation tool, ORF-RATER, and is readily adapted to novel NGS assays. Examples, tutorials, and extensive documentation can be found at https://plastid.readthedocs.io .

  16. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data

    PubMed Central

    2011-01-01

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper. PMID:21410968

  17. Web GIS in practice IX: a demonstration of geospatial visual analytics using Microsoft Live Labs Pivot technology and WHO mortality data.

    PubMed

    Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin

    2011-03-16

    The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.

  18. Google Analytics – Index of Resources

    EPA Pesticide Factsheets

    Find how-to and best practice resources and training for accessing and understanding EPA's Google Analytics (GA) tools, including how to create reports that will help you improve and maintain the web areas you manage.

  19. Vertical and pitching resonance of train cars moving over a series of simple beams

    NASA Astrophysics Data System (ADS)

    Yang, Y. B.; Yau, J. D.

    2015-02-01

    The resonant response, including both vertical and pitching motions, of an undamped sprung mass unit moving over a series of simple beams is studied by a semi-analytical approach. For a sprung mass that is very small compared with the beam, we first simplify the sprung mass as a constant moving force and obtain the response of the beam in closed form. With this, we then solve for the response of the sprung mass passing over a series of simple beams, and validate the solution by an independent finite element analysis. To evaluate the pitching resonance, we consider the cases of a two-axle model and a coach model traveling over rough rails supported by a series of simple beams. The resonance of a train car is characterized by the fact that its response continues to build up, as it travels over more and more beams. For train cars with long axle intervals, the vertical acceleration induced by pitching resonance dominates the peak response of the train traveling over a series of simple beams. The present semi-analytical study allows us to grasp the key parameters involved in the primary/sub-resonant responses. Other phenomena of resonance are also discussed in the exemplar study.

  20. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    NASA Astrophysics Data System (ADS)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel material having hardness above 45 HRC (Hardness-Rockwell C) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance of hard turning. Various models of hard turning by a cubic boron nitride tool have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validations of the steady-state flank and crater wear model, Usui's wear model, force models based on oblique cutting theory, the extended Lee and Shaffer force model, and models of chip formation and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be used according to user requirements in hard turning.

  1. An analytical approach to γ-ray self-shielding effects for radioactive bodies encountered in nuclear decommissioning scenarios.

    PubMed

    Gamage, K A A; Joyce, M J

    2011-10-01

    A novel analytical approach is described that accounts for self-shielding of γ radiation in decommissioning scenarios. The approach is developed with plutonium-239, cobalt-60 and caesium-137 as examples; stainless steel and concrete have been chosen as the media for cobalt-60 and caesium-137, respectively. The analytical methods have been compared with MCNPX 2.6.0 simulations. A simple, linear correction factor relates the analytical results to the simulated estimates. This has the potential to greatly simplify the estimation of self-shielding effects in decommissioning activities. Copyright © 2011 Elsevier Ltd. All rights reserved.
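
    A far cruder illustration of the self-shielding idea (not the paper's analytical approach) is the average escape probability of photons born uniformly through a slab, attenuated along the straight-ahead direction only; the attenuation coefficient below is an assumed round value for 662 keV Cs-137 photons in concrete.

    ```python
    # First-order slab self-shielding: average escape probability for photons
    # born uniformly through the thickness, straight-ahead attenuation only.
    import numpy as np

    def slab_self_shielding(mu_cm, thickness_cm, n=10_000):
        depth = np.linspace(0.0, thickness_cm, n)   # birth depths
        # Numerical mean of exp(-mu*x); analytically (1 - e^{-mu t}) / (mu t).
        return np.exp(-mu_cm * depth).mean()

    # e.g. 662 keV Cs-137 photons in concrete, mu ~ 0.2 cm^-1 (assumed value)
    print(slab_self_shielding(mu_cm=0.20, thickness_cm=10.0))
    ```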

  2. Sample injection and electrophoretic separation on a simple laminated paper based analytical device.

    PubMed

    Xu, Chunxiu; Zhong, Minghua; Cai, Longfei; Zheng, Qingyu; Zhang, Xiaojun

    2016-02-01

    We describe a strategy to perform multistep operations on a simple laminated paper-based separation device by using electrokinetic flow to manipulate the fluids. A laminated crossed-channel paper-based separation device was fabricated by cutting a filter paper sheet followed by lamination. Multiple functional units, including sample loading, sample injection, and electrophoretic separation, were integrated on a single paper-based analytical device for the first time by applying potentials at different reservoirs for sample, sample waste, buffer, and buffer waste. As a proof-of-concept demonstration, a mixed sample solution containing carmine and sunset yellow was loaded in the sampling channel and then injected into the separation channel, followed by electrophoretic separation, by adjusting the potentials applied at the four terminals of the sampling and separation channels. The effects of buffer pH, buffer concentration, channel width, and separation time on the resolution of the electrophoretic separation were studied. This strategy may be used to perform multistep operations such as reagent dilution, sample injection, mixing, reaction, and separation on a single microfluidic paper-based analytical device, which is very attractive for building micro total analysis systems on microfluidic paper-based analytical devices. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Review of Thawing Time Prediction Models Depending on Process Conditions and Product Characteristics

    PubMed Central

    Kluza, Franciszek; Spiess, Walter E. L.; Kozłowicz, Katarzyna

    2016-01-01

    Determining thawing times of frozen foods is a challenging problem, as the thermophysical properties of the product change during thawing. A number of calculation models and solutions have been developed. The proposed solutions range from relatively simple analytical equations based on a number of assumptions to a group of empirical approaches that sometimes require complex calculations. In this paper, analytical, empirical and graphical models are presented and critically reviewed. The conditions of solution, limitations and possible applications of the models are discussed. The graphical and semi-graphical models are derived from numerical methods. Using numerical methods is not always possible, as running the calculations takes time, and the specialized software and equipment are not always cheap. For these reasons, the application of analytical-empirical models is more useful for engineering. It is demonstrated that there is no simple, accurate and feasible analytical method for thawing time prediction. Consequently, simplified methods are needed for thawing time estimation of agricultural and food products. The review reveals the need for further improvement of the existing solutions or development of new ones that will enable accurate determination of thawing time within a wide range of practical heat transfer conditions during processing. PMID:27904387

  4. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: among other gaps, they do not support real-time co-design, they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists, and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  5. Modern analytical chemistry in the contemporary world

    NASA Astrophysics Data System (ADS)

    Šíma, Jan

    2016-12-01

    Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of sorcery in which analytical chemists, working as modern wizards, handle magical black boxes able to provide fascinating results. However, this view is evidently improper and misleading. Therefore, the position of modern analytical chemistry among the sciences and in the contemporary world is discussed. Its interdisciplinary character and the necessity of collaboration between analytical chemists and other experts in order to effectively solve the actual problems of human society and the environment are emphasized. The importance of analytical method validation in order to obtain accurate and precise results is highlighted. Invalid results are not only useless; they can often even be fatal (e.g., in clinical laboratories). The curriculum of analytical chemistry at schools and universities is discussed; it is argued that it should be much broader than traditional equilibrium chemistry coupled with a simple description of individual analytical methods. Above all, the teaching of analytical chemistry should closely connect theory and practice.

  6. Dip-strip method for monitoring environmental contamination of aflatoxin in food and feed: use of a portable aflatoxin detection kit.

    PubMed

    Sashidhar, R B

    1993-10-01

    Aflatoxin contamination of food and feed has gained global significance due to its deleterious effects on human and animal health and its importance in international trade. The potential of aflatoxin as a carcinogen, mutagen, teratogen, and immunosuppressive agent is well documented. The problem of aflatoxin contamination of food and feed has led to the enactment of various legislation. However, meaningful strategies for implementing this legislation are limited by the nonavailability of simple, cost-effective methods for screening and detection of aflatoxin under field conditions. Keeping in mind the analytical constraints in developing countries, a simple-to-operate, rapid, reliable, and cost-effective portable aflatoxin detection kit has been developed. The important components of the kit include a hand-held UV lamp (365 nm, 4 W output), a solvent blender (12,000 rpm) for toxin extraction, and adsorbent-coated dip-strips (polyester film) for detecting and quantifying aflatoxin. Analysis of variance indicates that there were no significant differences between various batches of dip-strips (p > 0.05). The minimum detection limit for aflatoxin B1 was 10 ppb per spot. The kit may find wide application as a research tool in public health laboratories, environmental monitoring agencies, and the poultry industry.

  7. A Comprehensive Physical Impedance Model of Polymer Electrolyte Fuel Cell Cathodes in Oxygen-free Atmosphere.

    PubMed

    Obermaier, Michael; Bandarenka, Aliaksandr S; Lohri-Tymozhynsky, Cyrill

    2018-03-21

    Electrochemical impedance spectroscopy (EIS) is an indispensable tool for non-destructive operando characterization of Polymer Electrolyte Fuel Cells (PEFCs). However, in order to interpret the PEFC's impedance response and understand the phenomena revealed by EIS, numerous semi-empirical or purely empirical models are used. In this work, a relatively simple model for PEFC cathode catalyst layers in the absence of oxygen has been developed, in which all the equivalent-circuit parameters have a complete physical meaning. It is based on: (i) experimental quantification of the catalyst layer pore radii, (ii) application of De Levie's analytical formula to calculate the response of a single pore, (iii) approximating the ionomer distribution within every pore, (iv) accounting for the specific adsorption of sulfonate groups and (v) accounting for a small H2 crossover through ~15 μm ionomer membranes. The derived model has effectively only 6 independent fitting parameters, each with a clear physical meaning. It was used to investigate the cathode catalyst layer and the double-layer capacitance at the interface between the ionomer/membrane and the Pt electrocatalyst. The model has demonstrated excellent results in fitting and interpreting impedance data under different relative humidities. A simple script enabling fitting of impedance data is provided as supporting information.

  8. A simple capacitive method to evaluate ethanol fuel samples

    NASA Astrophysics Data System (ADS)

    Vello, Tatiana P.; de Oliveira, Rafael F.; Silva, Gustavo O.; de Camargo, Davi H. S.; Bufon, Carlos C. B.

    2017-02-01

    Ethanol is a biofuel used worldwide. However, the presence of excessive water, introduced either during the distillation process or by fraudulent adulteration, is a major concern in the use of ethanol fuel. High water levels may cause engine malfunction, in addition to being considered illegal. Here, we describe the development of a simple, fast and accurate platform based on nanostructured sensors to evaluate ethanol samples. The device fabrication is facile, based on standard microfabrication and thin-film deposition methods. The sensor operation relies on capacitance measurements employing a parallel-plate capacitor containing a conformational aluminum oxide (Al2O3) thin layer (15 nm). The sensor operates over the full water concentration range, i.e., from approximately 0% to 100% vol. of water in ethanol, with water traces being detectable down to 0.5% vol. These characteristics make the proposed device unique with respect to other platforms. Finally, the good agreement between the sensor response and analyses performed by gas chromatography of ethanol biofuel endorses the accuracy of the proposed method. Due to the full operation range, the reported sensor has the technological potential for use as a point-of-care analytical tool at gas stations or in the chemical, pharmaceutical, and beverage industries, to mention a few.
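
    The sensing principle can be illustrated with the parallel-plate formula: the ethanol-water mixture's permittivity rises with water content, so a measured capacitance maps to % water. The linear mixing rule, plate geometry, and permittivity values below are simplifying assumptions for illustration, not the authors' calibration.

    ```python
    # Capacitive water-in-ethanol estimate via a crude linear mixing rule.
    EPS0 = 8.854e-12                       # vacuum permittivity, F/m
    EPS_ETHANOL, EPS_WATER = 24.5, 80.1    # assumed relative permittivities

    def capacitance(frac_water, area_m2=1e-4, gap_m=1e-4):
        eps_mix = EPS_ETHANOL + frac_water * (EPS_WATER - EPS_ETHANOL)
        return EPS0 * eps_mix * area_m2 / gap_m

    def water_fraction(c_measured, area_m2=1e-4, gap_m=1e-4):
        eps_mix = c_measured * gap_m / (EPS0 * area_m2)
        return (eps_mix - EPS_ETHANOL) / (EPS_WATER - EPS_ETHANOL)

    c = capacitance(0.25)   # forward model for a 25% vol. water sample
    print(f"C = {c * 1e12:.0f} pF -> {100 * water_fraction(c):.0f}% vol. water")
    ```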

  9. Dip-strip method for monitoring environmental contamination of aflatoxin in food and feed: use of a portable aflatoxin detection kit.

    PubMed Central

    Sashidhar, R B

    1993-01-01

    Aflatoxin contamination of food and feed has gained global significance due to its deleterious effects on human and animal health and its importance in international trade. The potential of aflatoxin as a carcinogen, mutagen, teratogen, and immunosuppressive agent is well documented. The problem of aflatoxin contamination of food and feed has led to the enactment of various legislation. However, meaningful strategies for implementing this legislation are limited by the nonavailability of simple, cost-effective methods for screening and detection of aflatoxin under field conditions. Keeping in mind the analytical constraints in developing countries, a simple-to-operate, rapid, reliable, and cost-effective portable aflatoxin detection kit has been developed. The important components of the kit include a hand-held UV lamp (365 nm, 4 W output), a solvent blender (12,000 rpm) for toxin extraction, and adsorbent-coated dip-strips (polyester film) for detecting and quantifying aflatoxin. Analysis of variance indicates that there were no significant differences between various batches of dip-strips (p > 0.05). The minimum detection limit for aflatoxin B1 was 10 ppb per spot. The kit may find wide application as a research tool in public health laboratories, environmental monitoring agencies, and the poultry industry. PMID:8143644

  10. Effect of Parametric Dichotomic Markov Noise on the Properties of Chaotic Transitions in Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Gac, J. M.; Żebrowski, J. J.

    A chaotic transition occurs when a continuous change in one of the parameters of a system causes a discontinuous change in the properties of the system's chaotic attractor. Such phenomena are present in many dynamical systems in which chaotic behavior occurs. The best known of these transitions are the period-doubling bifurcation cascade, intermittency, and crises. The effect of dichotomous Markov noise (DMN) on the properties of systems with chaotic transitions is discussed. DMN is a very simple two-valued stochastic process, with constant transition rates between the two states. In spite of its simplicity, this kind of noise is a very powerful tool for describing various phenomena present in many physical, chemical or biological systems. Many interesting phenomena induced by DMN are known; however, there has been no research on the effect of this kind of noise on intermittency or crises. We present the change in the mean laminar phase length and in the laminar phase length distribution caused by DMN modulating the parameters of a system with intermittency, as well as the modification of the mean lifetime on the pre-crisis attractor in the case of a boundary crisis. The results obtained analytically are compared with numerical simulations for several simple dynamical systems.
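
    For concreteness, dichotomous Markov noise is easy to simulate exactly with exponential waiting times between switches; the rates and the two noise values below are arbitrary illustration choices, not parameters from the study.

    ```python
    # Exact simulation of dichotomous Markov noise (two-state jump process).
    import numpy as np

    def dmn_trajectory(t_end, rate_up, rate_down, values=(-1.0, +1.0), seed=0):
        rng = np.random.default_rng(seed)
        t, state = 0.0, 0                   # start in the values[0] state
        times, samples = [0.0], [values[0]]
        while t < t_end:
            rate = rate_up if state == 0 else rate_down
            t += rng.exponential(1.0 / rate)    # waiting time in current state
            state = 1 - state
            times.append(min(t, t_end))
            samples.append(values[state])
        return np.array(times), np.array(samples)

    times, noise = dmn_trajectory(t_end=100.0, rate_up=0.5, rate_down=2.0)
    # Time-weighted mean; stationary value is (v0/rate_up + v1/rate_down)
    # divided by (1/rate_up + 1/rate_down), i.e. -0.6 for these choices.
    print(f"mean ~ {np.average(noise[:-1], weights=np.diff(times)):.2f}")
    ```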

  11. Analyzing Discourse Processing Using a Simple Natural Language Processing Tool

    ERIC Educational Resources Information Center

    Crossley, Scott A.; Allen, Laura K.; Kyle, Kristopher; McNamara, Danielle S.

    2014-01-01

    Natural language processing (NLP) provides a powerful approach for discourse processing researchers. However, there remains a notable degree of hesitation by some researchers to consider using NLP, at least on their own. The purpose of this article is to introduce and make available a "simple" NLP (SiNLP) tool. The overarching goal of…

  12. Development and validation of a simple high-performance liquid chromatography analytical method for simultaneous determination of phytosterols, cholesterol and squalene in parenteral lipid emulsions.

    PubMed

    Novak, Ana; Gutiérrez-Zamora, Mercè; Domenech, Lluís; Suñé-Negre, Josep M; Miñarro, Montserrat; García-Montoya, Encarna; Llop, Josep M; Ticó, Josep R; Pérez-Lozano, Pilar

    2018-02-01

    A simple analytical method for simultaneous determination of phytosterols, cholesterol and squalene in lipid emulsions was developed, owing to increased interest in their clinical effects. Method development was based on commonly used stationary phases (C18, C8 and phenyl) and mobile phases (mixtures of acetonitrile, methanol and water) under isocratic conditions. Differences in stationary phases resulted in peak overlapping or coelution of different peaks. The best separation of all analyzed compounds was achieved on a Zorbax Eclipse XDB C8 column (150 × 4.6 mm, 5 μm; Agilent) with ACN-H2O-MeOH, 80:19.5:0.5 (v/v/v). In order to achieve a shorter time of analysis, the method was further optimized and a gradient separation was established. The optimized analytical method was validated and tested for routine use in lipid emulsion analyses. Copyright © 2017 John Wiley & Sons, Ltd.

  13. Water Conservation Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ian Metzger, Jesse Dean

    2010-12-31

    This software requires inputs of simple water-fixture inventory information and calculates the water, energy, and cost benefits of various retrofit opportunities. The tool includes water conservation measures for low-flow toilets, low-flow urinals, low-flow faucets, and low-flow showerheads. It calculates water savings, energy savings, demand reduction, and cost savings, along with building life-cycle cost metrics: simple payback, discounted payback, net present value, and savings-to-investment ratio. In addition, the tool displays the environmental benefits of a project.
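
    The life-cycle cost metrics listed above are standard; a minimal sketch of how they can be computed is shown below. This is not the tool's actual code, and the example inputs (installation cost, savings, discount rate, lifetime) are illustrative assumptions.

      # Simple payback, discounted payback, net present value (NPV), and
      # savings-to-investment ratio (SIR) for a retrofit with a one-time
      # installation cost and constant annual savings.
      def retrofit_metrics(install_cost, annual_savings, discount_rate, years):
          simple_payback = install_cost / annual_savings
          npv, cumulative, discounted_payback = -install_cost, 0.0, None
          for t in range(1, years + 1):
              pv = annual_savings / (1.0 + discount_rate) ** t
              npv += pv
              cumulative += pv
              if discounted_payback is None and cumulative >= install_cost:
                  discounted_payback = t
          sir = (npv + install_cost) / install_cost  # PV of savings / investment
          return simple_payback, discounted_payback, npv, sir

      # Example: a $1,200 low-flow fixture retrofit saving $350/yr, 15-yr life.
      print(retrofit_metrics(1200.0, 350.0, 0.03, 15))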

  14. Simple waves in a two-component Bose-Einstein condensate

    NASA Astrophysics Data System (ADS)

    Ivanov, S. K.; Kamchatnov, A. M.

    2018-04-01

    We study the dynamics of so-called simple waves in a two-component Bose-Einstein condensate. The evolution of the condensate is described by the Gross-Pitaevskii equations, which for these simple-wave solutions reduce to a system of ordinary differential equations coinciding with those derived by Ovsyannikov for two-layer fluid dynamics. We solve the Ovsyannikov system for two typical situations of large and small difference between the interspecies and intraspecies nonlinear interaction constants. Our analytic results are confirmed by numerical simulations.
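
    For reference, the coupled Gross-Pitaevskii equations for a two-component condensate take the standard form (written here in a generic normalization; the paper's conventions may differ):

      i\hbar\,\partial_t\psi_j = \Big(-\frac{\hbar^2}{2m}\nabla^2
          + \sum_{k=1,2} g_{jk}\,|\psi_k|^2\Big)\psi_j, \qquad j = 1, 2,

    where the g_{jj} are the intraspecies and g_{12} the interspecies interaction constants whose relative magnitude distinguishes the two regimes treated in the paper.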

  15. Spin Seebeck effect in a simple ferromagnet near Tc: a Ginzburg-Landau approach

    NASA Astrophysics Data System (ADS)

    Adachi, Hiroto; Yamamoto, Yutaka; Ichioka, Masanori

    2018-04-01

    A time-dependent Ginzburg-Landau theory is used to examine the longitudinal spin Seebeck effect in a simple ferromagnet in the vicinity of the Curie temperature Tc. It is shown analytically that the spin Seebeck signal is proportional to the magnetization near Tc, a result in line with previous numerical findings. It is argued that the present result can be tested experimentally using a simple magnetic system such as EuO/Pt or EuS/Pt.
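
    A standard relaxational (model-A) form of time-dependent Ginzburg-Landau dynamics, written here as an assumed generic version rather than the authors' exact equations, is

      \partial_t m = -\Gamma\,\frac{\delta F}{\delta m} + \xi, \qquad
      F[m] = \int d^3r\,\Big[a\,(T - T_c)\,m^2 + \tfrac{b}{2}\,m^4 + c\,(\nabla m)^2\Big],

    for which the equilibrium magnetization below T_c scales as m \propto (T_c - T)^{1/2}; a spin Seebeck signal proportional to m then vanishes with this same square-root exponent at the transition.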

  16. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  17. Applying Pragmatics Principles for Interaction with Visual Analytics.

    PubMed

    Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac

    2018-01-01

    Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.

  18. Design of Friction Stir Spot Welding Tools by Using a Novel Thermal-Mechanical Approach

    PubMed Central

    Su, Zheng-Ming; Qiu, Qi-Hong; Lin, Pai-Chen

    2016-01-01

    A simple thermal-mechanical model for friction stir spot welding (FSSW) was developed to obtain similar weld performance from different weld tools. Use of the thermal-mechanical model and a combined approach enabled the design of weld tools of various sizes but similar quality. Three weld tools for weld radii of 4, 5, and 6 mm were made to join 6061-T6 aluminum sheets. Performance evaluations of the three weld tools compared fracture behavior, microstructure, micro-hardness distribution, and welding temperature of welds in lap-shear specimens. For welds made by the three weld tools under identical processing conditions, failure loads were approximately proportional to tool size. Failure modes, microstructures, and micro-hardness distributions were similar. Welding temperatures correlated with frictional heat generation rate densities. Because the three weld tools sufficiently met all design objectives, the proposed approach is considered a simple and feasible guideline for preliminary tool design. PMID:28773800
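
    The scaling of heat input with tool size can be illustrated with the generic sliding-friction estimate below; this is an assumed textbook-style model, not the paper's thermal-mechanical model, and the material parameters are illustrative.

      import math

      def frictional_heat_rate(mu, p, omega, radius):
          # Total frictional heating over a flat circular shoulder:
          # Q = integral_0^R (mu * p * omega * r) * 2*pi*r dr
          #   = (2/3) * pi * mu * p * omega * R^3
          return (2.0 / 3.0) * math.pi * mu * p * omega * radius ** 3

      mu, p = 0.4, 50e6                  # friction coefficient; contact pressure [Pa]
      omega = 2.0 * math.pi * 1500 / 60  # 1500 rpm in rad/s
      for r_mm in (4, 5, 6):             # the three tool radii from the study
          q = frictional_heat_rate(mu, p, omega, r_mm * 1e-3)
          print(f"R = {r_mm} mm: Q ~ {q:.0f} W")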
