Sample records for analysis of an analytical application

  1. Recent advancements in chemical luminescence-based lab-on-chip and microfluidic platforms for bioanalysis.

    PubMed

    Mirasoli, Mara; Guardigli, Massimo; Michelini, Elisa; Roda, Aldo

    2014-01-01

    Miniaturization of analytical procedures through microchips, lab-on-a-chip or micro total analysis systems is one of the most recent trends in chemical and biological analysis. These systems are designed to perform all the steps in an analytical procedure, with the advantages of low sample and reagent consumption, fast analysis, reduced costs, and the possibility of extra-laboratory application. A range of detection technologies has been employed in miniaturized analytical systems, but most applications have relied on fluorescence and electrochemical detection. Chemical luminescence (which includes chemiluminescence, bioluminescence, and electrogenerated chemiluminescence) represents an alternative detection principle that offers comparable (or better) analytical performance and easier implementation in miniaturized analytical devices. Nevertheless, chemical luminescence-based devices represent only a small fraction of the microfluidic devices reported in the literature, and until now no review has focused on them. Here we review the most relevant applications (since 2009) of miniaturized analytical devices based on chemical luminescence detection. After a brief overview of the main chemical luminescence systems and of recent technological advancements in their implementation in miniaturized analytical devices, analytical applications are reviewed according to the nature of the device (microfluidic chips, microchip electrophoresis, lateral flow- and paper-based devices) and the type of application (micro-flow injection assays, enzyme assays, immunoassays, gene probe hybridization assays, cell assays, whole-cell biosensors). Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Micro-optics for microfluidic analytical applications.

    PubMed

    Yang, Hui; Gijs, Martin A M

    2018-02-19

    This critical review summarizes developments in the integration of micro-optical elements with microfluidic platforms for facilitating detection and automation of bio-analytical applications. Micro-optical elements, made by a variety of microfabrication techniques, contribute advantageously to the performance of an analytical system, especially when the latter has microfluidic features. Indeed, the easy integration of optical control and detection modules with microfluidic technology helps to bridge the gap between the macroscopic world and chip-based analysis, paving the way for automated and high-throughput applications. We start the discussion with an introduction to microfluidic systems and micro-optical components, as well as aspects of their integration. We continue with a detailed description of different microfluidic and micro-optics technologies and their applications, with an emphasis on the realization of optical waveguides and microlenses. The review continues with sections highlighting the advantages of integrated micro-optical components in microfluidic systems for tackling a variety of analytical problems, including cytometry, nucleic acid and protein detection, cell biology, and chemical analysis applications.

  3. Chemiluminescence microarrays in analytical chemistry: a critical review.

    PubMed

    Seidel, Michael; Niessner, Reinhard

    2014-09-01

    Multi-analyte immunoassays on microarrays and on multiplex DNA microarrays have been described for quantitative analysis of small organic molecules (e.g., antibiotics, drugs of abuse, small molecule toxins), proteins (e.g., antibodies or protein toxins), and microorganisms, viruses, and eukaryotic cells. In analytical chemistry, multi-analyte detection by use of analytical microarrays has become an innovative research topic because of the possibility of generating several sets of quantitative data for different analyte classes in a short time. Chemiluminescence (CL) microarrays are powerful tools for rapid multiplex analysis of complex matrices. A wide range of applications for CL microarrays is described in the literature dealing with analytical microarrays. The motivation for this review is to summarize the current state of CL-based analytical microarrays. Combining analysis of different compound classes on CL microarrays reduces analysis time, cost of reagents, and use of laboratory space. Applications are discussed, with examples from food safety, water safety, environmental monitoring, diagnostics, forensics, toxicology, and biosecurity. The potential and limitations of research on multiplex analysis by use of CL microarrays are discussed in this review.

  4. Visual analytics for aviation safety: A collaborative approach to sensemaking

    NASA Astrophysics Data System (ADS)

    Wade, Andrew

    Visual analytics, the "science of analytical reasoning facilitated by interactive visual interfaces", is more than just visualization. Understanding the human reasoning process is essential for designing effective visualization tools and providing correct analyses. This thesis describes the evolution, application and evaluation of a new method for studying analytical reasoning that we have labeled paired analysis. Paired analysis combines subject matter experts (SMEs) and tool experts (TEs) in an analytic dyad, here used to investigate aircraft maintenance and safety data. The method was developed and evaluated using interviews, pilot studies and analytic sessions during an internship at the Boeing Company. By enabling a collaborative approach to sensemaking that can be captured by researchers, paired analysis yielded rich data on human analytical reasoning that can be used to support analytic tool development and analyst training. Keywords: visual analytics, paired analysis, sensemaking, Boeing, collaborative analysis.

  5. Application of Learning Analytics Using Clustering Data Mining for Students' Disposition Analysis

    ERIC Educational Resources Information Center

    Bharara, Sanyam; Sabitha, Sai; Bansal, Abhay

    2018-01-01

    Learning Analytics (LA) is an emerging field in which sophisticated analytic tools are used to improve learning and education. It draws from, and is closely tied to, a series of other fields of study like business intelligence, web analytics, academic analytics, educational data mining, and action analytics. The main objective of this research…

  6. 10 CFR 503.34 - Inability to comply with applicable environmental requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...

  7. 10 CFR 503.34 - Inability to comply with applicable environmental requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...

  8. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications

    PubMed Central

    Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.

    2018-01-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069

  9. SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.

    PubMed

    Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D

    2017-04-01

    The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.
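The multi-level modularity described in the SOCRAT abstracts above (modules that interact through a shared specification rather than direct coupling) can be illustrated with a minimal service-registry sketch. SOCRAT itself is a JavaScript web platform; the Python class and service names below are illustrative assumptions, not its actual API.

```python
class ModuleRegistry:
    """Service-registry sketch: modules publish named capabilities and
    discover each other through the registry instead of direct imports,
    which eases re-use and extension of individual modules."""

    def __init__(self):
        self._services = {}

    def register(self, name, fn):
        """Publish a callable under a dotted service name."""
        self._services[name] = fn

    def call(self, name, *args, **kwargs):
        """Invoke a service registered by another module."""
        return self._services[name](*args, **kwargs)


# two independent "modules" wired together only via the registry
registry = ModuleRegistry()
registry.register("stats.mean", lambda xs: sum(xs) / len(xs))
registry.register(
    "viz.describe",
    lambda xs: f"{len(xs)} points, mean {registry.call('stats.mean', xs):.1f}",
)
```

Because the visualization module never imports the statistics module, either one can be swapped for an alternative implementation registered under the same name.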

  10. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    PubMed Central

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted great interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have proven to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared with available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and the applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that underlie their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed. PMID:22315551

  11. An Application of Social Network Analysis on Military Strategy, System Networks and the Phases of War

    DTIC Science & Technology

    2015-03-26

    Air Force Institute of Technology thesis, report no. AFIT-ENS-MS-15-M-117; the work is not subject to copyright protection in the United States. The record excerpt preserves only the title and citation fragments, e.g. reference [29]: J. D. Guzman, R. F. Deckro, M. J. Robbins, J. F. Morris and N. A. Ballester, "An Analytical Comparison of Social Network Measures," IEEE.

  12. Ratio of sequential chromatograms for quantitative analysis and peak deconvolution: Application to standard addition method and process monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.

    1990-08-01

    This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and used to calculate a ratio value equal to the ratio of the concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and of an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
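The ratio-chromatogram calculation described above (point-by-point ratio over the pure-elution window, combined by a variance-weighted average) can be sketched numerically. The noise model and window choice below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ratio_value(chrom_a, chrom_b, window, noise_sd):
    """Variance-weighted ratio of two baseline-corrected chromatograms
    over a pure-elution window (assumes equal white noise on both traces)."""
    r = chrom_b[window] / chrom_a[window]               # point-by-point ratio
    # propagated variance of each ratio point: larger near the peak tails
    var_r = (noise_sd ** 2 / chrom_a[window] ** 2) * (1.0 + r ** 2)
    w = 1.0 / var_r                                     # inverse-variance weights
    return np.sum(w * r) / np.sum(w)

# synthetic Gaussian peak; the second injection is 1.5x more concentrated
t = np.linspace(0.0, 10.0, 500)
peak = np.exp(-((t - 5.0) ** 2) / 0.5)
window = peak > 0.1                                     # pure-elution region
rv = ratio_value(1.0 * peak, 1.5 * peak, window, noise_sd=1e-3)
```

The weighting downplays points where the denominator chromatogram is small, mirroring the signal-to-noise compensation described in the abstract.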

  13. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis.

    PubMed

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions.

  14. IoT Big-Data Centred Knowledge Granule Analytic and Cluster Framework for BI Applications: A Case Base Analysis

    PubMed Central

    Chang, Hsien-Tsung; Mishra, Nilamadhab; Lin, Chung-Chih

    2015-01-01

    The current rapid growth of Internet of Things (IoT) in various commercial and non-commercial sectors has led to the deposition of large-scale IoT data, of which the time-critical analytic and clustering of knowledge granules represent highly thought-provoking application possibilities. The objective of the present work is to inspect the structural analysis and clustering of complex knowledge granules in an IoT big-data environment. In this work, we propose a knowledge granule analytic and clustering (KGAC) framework that explores and assembles knowledge granules from IoT big-data arrays for a business intelligence (BI) application. Our work implements neuro-fuzzy analytic architecture rather than a standard fuzzified approach to discover the complex knowledge granules. Furthermore, we implement an enhanced knowledge granule clustering (e-KGC) mechanism that is more elastic than previous techniques when assembling the tactical and explicit complex knowledge granules from IoT big-data arrays. The analysis and discussion presented here show that the proposed framework and mechanism can be implemented to extract knowledge granules from an IoT big-data array in such a way as to present knowledge of strategic value to executives and enable knowledge users to perform further BI actions. PMID:26600156

  15. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  16. Climate Analytics as a Service

    NASA Technical Reports Server (NTRS)

    Schnase, John L.; Duffy, Daniel Q.; McInerney, Mark A.; Webster, W. Phillip; Lee, Tsengdar J.

    2014-01-01

    Climate science is a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). CAaaS combines high-performance computing and data-proximal analytics with scalable data management, cloud computing virtualization, the notion of adaptive analytics, and a domain-harmonized API to improve the accessibility and usability of large collections of climate data. MERRA Analytic Services (MERRA/AS) provides an example of CAaaS. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of key climate variables. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, CAaaS is providing the agility required to meet our customers' increasing and changing data management and data analysis needs.
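The MapReduce analytics mentioned above reduce to a map step over data partitions and an associative reduce step. This is a generic sketch, not the MERRA/AS API; the chunked-temperature example is a made-up illustration.

```python
from functools import reduce

def mapreduce_mean(chunks):
    """MapReduce-style average of a variable stored in distributed chunks:
    map each chunk to (sum, count), reduce pairs by addition, then finalize."""
    mapped = [(sum(chunk), len(chunk)) for chunk in chunks]                 # map
    total, count = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), mapped)  # reduce
    return total / count

# e.g. daily temperatures split across storage nodes
mean_t = mapreduce_mean([[14.0, 15.5], [13.0, 16.5, 16.0]])
```

Because the reduce step is associative, the pairwise combination can run in parallel across nodes before a final merge, which is what makes the pattern suit large climate collections.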

  17. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
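The effect of input correlations on output uncertainty can be seen in the standard first-order (delta-method) propagation formula, Var(y) ≈ gᵀ Σ g, where g is the gradient of the model at the input means and Σ the input covariance. This is a textbook approximation used as an illustration, not the specific analytic method of the paper.

```python
import numpy as np

def first_order_variance(grad, cov):
    """Delta-method variance of y = f(x) at the input means:
    Var(y) ≈ gᵀ Σ g, with g the gradient and Σ the input covariance."""
    g = np.asarray(grad, dtype=float)
    return float(g @ np.asarray(cov, dtype=float) @ g)

# toy model y = 2*x1 + 3*x2, unit input variances, correlation 0.5
grad = [2.0, 3.0]
var_indep = first_order_variance(grad, np.eye(2))                # independence assumed
var_corr = first_order_variance(grad, [[1.0, 0.5], [0.5, 1.0]])  # correlation included
```

Here ignoring the correlation gives 2² + 3² = 13, while the correlated case adds the cross term 2·(2)(3)(0.5) = 6, so assuming independence understates the output variance by almost a third.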

  18. Analysis of human tissues by total reflection X-ray fluorescence. Application of chemometrics for diagnostic cancer recognition

    NASA Astrophysics Data System (ADS)

    Benninghoff, L.; von Czarnowski, D.; Denkhaus, E.; Lemke, K.

    1997-07-01

    For the determination of trace element distributions of more than 20 elements in malignant and normal tissues of the human colon, tissue samples (approx. 400 mg wet weight) were digested with 3 ml of nitric acid (sub-boiled quality) by use of an autoclave system. The accuracy of measurements has been investigated by using certified materials. The analytical results were evaluated by using a spreadsheet program to give an overview of the element distribution in cancerous samples and in normal colon tissues. A further application, cluster analysis of the analytical results, was introduced to demonstrate the possibility of classification for cancer diagnosis. To confirm the results of cluster analysis, multivariate three-way principal component analysis was performed. Additionally, microtome frozen sections (10 μm) were prepared from the same tissue samples to compare the analytical results, i.e. the mass fractions of elements, according to the preparation method and to exclude systematic errors depending on the inhomogeneity of the tissues.
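The classification step described above groups tissue samples by their trace-element profiles. The paper used cluster analysis and three-way principal component analysis; the sketch below substitutes a minimal numpy-only two-group k-means on standardized profiles, with entirely synthetic element data.

```python
import numpy as np

def cluster_tissues(X, iters=20):
    """Two-group clustering of per-sample element profiles
    (k-means sketch; the study itself used hierarchical clustering and PCA)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)         # z-score each element
    D = np.linalg.norm(Z[:, None] - Z[None], axis=2)
    i, j = np.unravel_index(D.argmax(), D.shape)     # farthest pair as seeds
    centers = Z[[i, j]]
    for _ in range(iters):
        labels = np.linalg.norm(Z[:, None] - centers[None], axis=2).argmin(axis=1)
        centers = np.array([Z[labels == k].mean(axis=0) for k in (0, 1)])
    return labels

# synthetic profiles: 5 normal and 5 tumour samples, 3 "elements" each
rng = np.random.default_rng(1)
normal = rng.normal([1.0, 2.0, 3.0], 0.05, size=(5, 3))
tumour = rng.normal([4.0, 1.0, 6.0], 0.05, size=(5, 3))
labels = cluster_tissues(np.vstack([normal, tumour]))
```

Standardizing first matters because element mass fractions span very different scales; without it the most abundant element would dominate the distance metric.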

  19. Integrated Multi-Scale Data Analytics and Machine Learning for the Distribution Grid and Building-to-Grid Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, Emma M.; Hendrix, Val; Chertkov, Michael

    This white paper introduces the application of advanced data analytics to the modernized grid. In particular, we consider the field of machine learning and where it is, and is not, useful for the distribution grid and the buildings interface. While analytics in general is a growing field of interest, and often seen as the golden goose in the burgeoning distribution grid industry, its application is often limited by communications infrastructure or the lack of a focused technical application. Overall, the linkage of analytics to purposeful application in the grid space has been limited. In this paper we consider the field of machine learning as a subset of analytical techniques, and discuss its ability and limitations to enable the future distribution grid and the building-to-grid interface. To that end, we also consider the potential for mixing distributed and centralized analytics and the pros and cons of these approaches. Machine learning is a subfield of computer science that studies and constructs algorithms that can learn from data, make predictions, and improve forecasts. Incorporation of machine learning in grid monitoring and analysis tools may have the potential to solve data and operational challenges that result from the increasing penetration of distributed and behind-the-meter energy resources. There is an exponentially expanding volume of measured data being generated on the distribution grid which, with appropriate application of analytics, may be transformed into intelligible, actionable information that can be provided to the right actors (such as grid and building operators) at the appropriate time to enhance grid or building resilience, efficiency, and operations against various metrics or goals, such as total carbon reduction or other economic benefit to customers. While some basic analysis of these data streams can provide a wealth of information, computational and human limits on performing the analysis are becoming significant as data volumes and multi-objective concerns grow. Efficient applications of analysis and machine learning are therefore being considered.

  20. Analytical and numerical analysis of charge carriers extracted by linearly increasing voltage in a metal-insulator-semiconductor structure relevant to bulk heterojunction organic solar cells

    NASA Astrophysics Data System (ADS)

    Yumnam, Nivedita; Hirwa, Hippolyte; Wagner, Veit

    2017-12-01

    Analysis of charge extraction by linearly increasing voltage is conducted on metal-insulator-semiconductor capacitors in a structure relevant to organic solar cells. For this analysis, an analytical model is developed and used to determine the conductivity of the active layer. Numerical simulations of the transient current were performed to confirm the applicability of our analytical model and of other analytical models in the literature. Our analysis is applied to poly(3-hexylthiophene) (P3HT):phenyl-C61-butyric acid methyl ester (PCBM), which allows the electron and hole mobilities to be determined independently. A combination of experimental data analysis and numerical simulations reveals the effect of trap states on the transient current and shows where this contribution is crucial for data analysis.

  1. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    PubMed Central

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest from the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey covers the state of the art from 2012 up to 2017. PMID:29850370

  2. Moving your laboratories to the field – Advantages and limitations of the use of field portable instruments in environmental sample analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gałuszka, Agnieszka, E-mail: Agnieszka.Galuszka@ujk.edu.pl; Migaszewski, Zdzisław M.; Namieśnik, Jacek

    The recent rapid progress in the technology of field portable instruments has increased their application in environmental sample analysis. These instruments offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet–visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis offers the possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on instrument performance, and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. Highlights: • Field portable instruments are widely used in environmental sample analysis. • Field portable instruments are indispensable for analysis in emergency response. • Miniaturization of field portable instruments reduces resource consumption. • In situ analysis is in agreement with green analytical chemistry principles. • Performance requirements in field analysis stimulate technological progress.

  3. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
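The core idea above (propagating parameter uncertainty through an analytical failure model to obtain a failure probability) can be sketched with a generic Monte Carlo stress-strength calculation. The distributions and their parameters below are invented for illustration; they are not the report's turbine-blade or flaw-propagation models.

```python
import numpy as np

def failure_probability(n_samples=200_000, seed=42):
    """Monte Carlo sketch of probabilistic failure assessment for a
    stress-strength failure mode: failure occurs when the sampled
    demand exceeds the sampled capacity."""
    rng = np.random.default_rng(seed)
    strength = rng.normal(500.0, 25.0, n_samples)   # capacity, MPa (assumed)
    stress = rng.normal(400.0, 30.0, n_samples)     # demand, MPa (assumed)
    return float(np.mean(stress >= strength))

p_fail = failure_probability()
```

In the PFA setting the sampled quantities would be parameters of a physics-based fatigue model rather than raw stress and strength, and the resulting distribution would then be updated against test and flight experience.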

  4. Synthesis and characterization of an organic reagent 4-(6-bromo-2-benzothiazolylazo) pyrogallol and its analytical application.

    PubMed

    Naser, N A; Kahdim, K H; Taha, D N

    2012-01-01

    The organic reagent 4-(6-bromo-2-benzothiazolylazo)pyrogallol (4-Br-BTAP) was synthesized by a coupling reaction of diazotized 2-amino-6-bromobenzothiazole with pyrogallol and purified by recrystallization from ethanol. The synthesized product was analyzed and characterized by melting point, elemental analysis, IR and ¹H-NMR. Dissociation constants of the reagent were determined by a spectrophotometric method, and the absorption spectra of 4-Br-BTAP in solvents of different polarities were investigated. The analytical application of 4-Br-BTAP was established with Cu(II) and Pd(II).

  5. Space Station Common Berthing Mechanism, a multi-body simulation application

    NASA Technical Reports Server (NTRS)

    Searle, Ian

    1993-01-01

    This paper discusses an application of multi-body dynamic analysis conducted at the Boeing Company in connection with the Space Station (SS) Common Berthing Mechanism (CBM). After introducing the hardware and the analytical objectives, we focus on some of the day-to-day computational issues associated with this type of analysis.

  6. Incorporating uncertainty regarding applicability of evidence from meta-analyses into clinical decision making.

    PubMed

    Kriston, Levente; Meister, Ramona

    2014-03-01

    Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
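    The resampling scheme described above can be sketched in a few lines. The effect sizes, variances, and trial inclusion probabilities below are hypothetical, not taken from the paper: on each draw, every trial is retained with its decision-maker-assigned applicability probability, and the retained trials are pooled, yielding a frequency distribution of the pooled estimate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical trials: effect estimates, variances, and decision-maker-
# assigned inclusion probabilities reflecting trial-level applicability.
effects   = np.array([0.30, 0.45, 0.10, 0.60])
variances = np.array([0.02, 0.05, 0.01, 0.08])
p_include = np.array([0.9, 0.5, 1.0, 0.3])

def pooled_fixed_effect(e, v):
    """Inverse-variance (fixed-effect) pooled estimate."""
    w = 1.0 / v
    return np.sum(w * e) / np.sum(w)

# Adaptive meta-analysis: resample the trial set with unequal probabilities
# and pool the retained trials on each draw.
draws = []
for _ in range(5000):
    keep = rng.random(effects.size) < p_include
    if keep.any():
        draws.append(pooled_fixed_effect(effects[keep], variances[keep]))
draws = np.array(draws)

print(f"median pooled effect: {np.median(draws):.3f}")
print(f"90% interval: [{np.percentile(draws, 5):.3f}, "
      f"{np.percentile(draws, 95):.3f}]")
```

The spread of `draws` reflects the decision maker's applicability uncertainty in addition to sampling error, which is the individually tailored output the method aims for.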

  7. An Introduction to MAMA (Meta-Analysis of MicroArray data) System.

    PubMed

    Zhang, Zhe; Fenstermacher, David

    2005-01-01

    Analyzing microarray data across multiple experiments has been proven advantageous. To support this kind of analysis, we are developing a software system called MAMA (Meta-Analysis of MicroArray data). MAMA utilizes a client-server architecture with a relational database on the server-side for the storage of microarray datasets collected from various resources. The client-side is an application running on the end user's computer that allows the user to manipulate microarray data and analytical results locally. MAMA implementation will integrate several analytical methods, including meta-analysis within an open-source framework offering other developers the flexibility to plug in additional statistical algorithms.

  8. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    PubMed

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostics aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of the practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. With this second part, a more applied flavor is taken, and its intended goal is summarizing the current state-of-the-art of analytical LIBS, providing a contemporary snapshot of LIBS applications, and highlighting new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve the sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done for consolidating the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy

  9. Flow chemistry vs. flow analysis.

    PubMed

    Trojanowicz, Marek

    2016-01-01

    The flow mode of conducting chemical syntheses facilitates chemical processes through on-line analytical monitoring of the occurring reactions, the application of solid-supported reagents to minimize downstream processing, and computerized control systems to perform multi-step sequences. These are exactly the attributes of flow analysis, which has held a solid place in modern analytical chemistry for the last several decades. The following review, based on 131 references to original papers as well as pre-selected reviews, presents the basic aspects, selected instrumental achievements and developmental directions of the rapidly growing field of continuous flow chemical synthesis. Interestingly, many of these might also be employed in the development of new methods in flow analysis. In this paper, examples of the application of flow analytical measurements for on-line monitoring of flow syntheses are indicated, and perspectives for a wider application of real-time analytical measurements are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. An assessment of envelope-based demodulation in case of proximity of carrier and modulation frequencies

    NASA Astrophysics Data System (ADS)

    Shahriar, Md Rifat; Borghesani, Pietro; Randall, R. B.; Tan, Andy C. C.

    2017-11-01

    Demodulation is a necessary step in the field of diagnostics to reveal faults whose signatures appear as an amplitude and/or frequency modulation. The Hilbert transform has conventionally been used to calculate the analytic signal required in the demodulation process. However, the carrier and modulation frequencies must meet the conditions set by the Bedrosian identity for the Hilbert transform to be applicable to demodulation. This condition, which essentially requires the carrier frequency to be sufficiently higher than the frequencies of the modulation harmonics, is satisfied in many traditional diagnostic applications (e.g. vibration analysis of gear and bearing faults) owing to the order-of-magnitude ratio between the carrier and modulation frequencies. However, the diversification of diagnostic approaches and applications has produced cases (e.g. electrical signature analysis-based diagnostics) where the carrier frequency is in close proximity to the modulation frequency, challenging the applicability of the Bedrosian theorem. This work presents an analytic study quantifying the error introduced by Hilbert transform-based demodulation when the Bedrosian identity is not satisfied, and proposes a mitigation strategy to combat the error. An experimental study is also carried out to verify the analytical results. The error analysis sets a confidence limit on the estimated modulation (both shape and magnitude) obtained through Hilbert transform-based demodulation when the Bedrosian theorem is violated. The proposed mitigation strategy is found effective in combating the demodulation error arising in this scenario, thus extending the applicability of Hilbert transform-based demodulation.
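    The role of the Bedrosian condition is easy to demonstrate numerically. The following self-contained sketch (not the authors' code; signal parameters are illustrative) extracts an amplitude envelope via an FFT-based analytic signal: recovery is near-exact when the carrier is well above the modulation frequency, but a large error appears when the carrier falls below it.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (one-sided spectrum, as in a Hilbert
    transformer): zero out negative frequencies, double positive ones."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

fs = 1000.0
t = np.arange(0, 1.0, 1 / fs)
f_m = 5.0
m = 1.0 + 0.5 * np.sin(2 * np.pi * f_m * t)     # true amplitude modulation

# Case 1: carrier well above modulation -- Bedrosian condition satisfied.
x1 = m * np.cos(2 * np.pi * 100.0 * t)
err1 = np.max(np.abs(np.abs(analytic_signal(x1)) - m))

# Case 2: carrier below modulation frequency -- Bedrosian condition violated.
x2 = m * np.cos(2 * np.pi * 4.0 * t)
err2 = np.max(np.abs(np.abs(analytic_signal(x2)) - m))

print(f"max envelope error, f_c >> f_m: {err1:.2e}")
print(f"max envelope error, f_c <  f_m: {err2:.3f}")
```

In the violated case, part of the modulation spectrum folds across zero frequency, so the one-sided spectrum no longer equals that of the ideal complex envelope, which is exactly the error source the paper quantifies.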

  11. Capacitive chemical sensor

    DOEpatents

    Manginell, Ronald P; Moorman, Matthew W; Wheeler, David R

    2014-05-27

    A microfabricated capacitive chemical sensor can be used as an autonomous chemical sensor or as an analyte-sensitive chemical preconcentrator in a larger microanalytical system. The capacitive chemical sensor detects changes in sensing film dielectric properties, such as the dielectric constant, conductivity, or dimensionality. These changes result from the interaction of a target analyte with the sensing film. This capability provides a low-power, self-heating chemical sensor suitable for remote and unattended sensing applications. The capacitive chemical sensor also enables a smart, analyte-sensitive chemical preconcentrator. After sorption of the sample by the sensing film, the film can be rapidly heated to release the sample for further analysis. Therefore, the capacitive chemical sensor can optimize the sample collection time prior to release to enable the rapid and accurate analysis of analytes by a microanalytical system.

  12. Evaluation of the matrix effect on gas chromatography--mass spectrometry with carrier gas containing ethylene glycol as an analyte protectant.

    PubMed

    Fujiyoshi, Tomoharu; Ikami, Takahito; Sato, Takashi; Kikukawa, Koji; Kobayashi, Masato; Ito, Hiroshi; Yamamoto, Atsushi

    2016-02-19

    The consequences of matrix effects in GC are a major concern in pesticide residue analysis. The aim of this study was to evaluate the applicability of an analyte protectant generator in pesticide residue analysis using a GC-MS system. The technique is based on the continuous introduction of ethylene glycol into the carrier gas. Ethylene glycol as an analyte protectant effectively compensated for the matrix effects in agricultural product extracts. All peak intensities were increased by this technique without affecting GC-MS performance. Calibration curves obtained with the ethylene glycol technique on GC-MS systems with various degrees of contamination were compared, and similar response enhancements were observed. These results suggest a convenient multi-residue GC-MS method using an analyte protectant generator instead of the conventional compensation method for matrix-induced response enhancement, which adds a mixture of analyte protectants to both neat and sample solutions. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Integration of electrochemistry in micro-total analysis systems for biochemical assays: recent developments.

    PubMed

    Xu, Xiaoli; Zhang, Song; Chen, Hui; Kong, Jilie

    2009-11-15

    Micro-total analysis systems (microTAS) integrate different analytical operations, such as sample preparation, separation and detection, into a single microfabricated device. With the outstanding advantages of low cost, satisfactory analytical efficiency and flexibility in design, highly integrated and miniaturized devices built on the microTAS concept have gained widespread application, especially in biochemical assays. Electrochemistry is quite compatible with microanalytical systems for biochemical assays because of attractive merits such as simplicity, rapidity, high sensitivity, reduced power consumption, and sample/reagent economy. This review presents recent developments in the integration of electrochemistry in microdevices for biochemical assays, covering ingenious microelectrode designs, fabrication methods, and the versatility of electrochemical techniques. Practical applications of such integrated microsystems in biochemical assays focus on in situ analysis, point-of-care testing and portable devices. Electrochemical techniques are well suited to microsystems, since electrochemical elements are easily microfabricated and a high degree of integration with multiple analytical functions can be achieved at low cost. Such integrated microsystems will play an increasingly important role in the analysis of small-volume biochemical samples. Work is in progress toward new microdevice designs and applications.

  14. Manufacturing data analytics using a virtual factory representation.

    PubMed

    Jain, Sanjay; Shao, Guodong; Shin, Seung-Jun

    2017-01-01

    Large manufacturers have been using simulation to support decision-making for design and production. With the advancement of technologies and the emergence of big data, however, simulation can also be utilised to perform and support data analytics for associated performance gains. This requires not only significant model development expertise but also substantial data collection and analysis effort. This paper presents an approach, within the frameworks of Design Science Research Methodology and prototyping, to address the challenge of increasing the use of modelling, simulation and data analytics in manufacturing by reducing the development effort. Manufacturing simulation models are presented both as data analytics applications in their own right and as support for other data analytics applications, serving as data generators and as a validation tool. The virtual factory concept is presented as the vehicle for manufacturing modelling and simulation. A virtual factory goes beyond traditional simulation models of factories by including multi-resolution modelling capabilities, thus allowing analysis at varying levels of detail. A path is proposed for implementing the virtual factory concept that builds on developments in technologies and standards. A virtual machine prototype is provided as a demonstration of the use of a virtual representation for manufacturing data analytics.

  15. Web-Based Visual Analytics for Social Media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bruce, Joseph R.; Dowson, Scott T.

    Social media provides a rich source of data that reflects current trends and public opinion on a multitude of topics. The data can be harvested from Twitter, Facebook, blogs, and other social applications. The high rate of adoption of social media has created a domain with an ever-expanding volume of data, making it difficult to use the raw data for analysis. Information visual analytics is key to drawing out features of interest in social media. The Scalable Reasoning System is an application that couples a back-end server performing analysis algorithms with an intuitive front-end visualization to allow for investigation. We provide a componentized system that can be rapidly adapted to customer needs so that the information they are most interested in is brought to their attention through the application. To this end, we have developed a social media application for use by emergency operations for the city of Seattle to show current weather and traffic trends, which are important for their tasks.

  16. Non-volatile analysis in fruits by laser resonant ionization spectrometry: application to resveratrol (3,5,4'-trihydroxystilbene) in grapes

    NASA Astrophysics Data System (ADS)

    Montero, C.; Orea, J. M.; Soledad Muñoz, M.; Lobo, R. F. M.; González Ureña, A.

    A technique coupling laser desorption (LD) with resonance-enhanced multiphoton ionisation (REMPI) and time-of-flight mass spectrometry (TOFMS) for trace analysis of non-volatile compounds is presented. Its essential features are: (a) an enhanced desorption yield due to mixing metal powder with the analyte during sample preparation; (b) high resolution, great sensitivity and a low detection limit due to laser resonant ionisation and mass spectrometric detection. Application to the resveratrol content of grapes demonstrated the capability of the analytical method, with a sensitivity of 0.2 pg per single laser shot and a detection limit of 5 ppb.

  17. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338

  18. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
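    One simple way to operationalize a noise-based analytical threshold is to characterize background read counts from negative controls and set the threshold several standard deviations above their mean. This is a sketch under an assumed Poisson noise model with illustrative numbers, not the authors' exact definition:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical background-noise read counts at non-allelic positions,
# collected from negative-control sequencing runs.
noise_counts = rng.poisson(lam=6.0, size=500)

# Analytical threshold: mean noise plus k standard deviations, so that
# signals below the threshold are indistinguishable from background noise.
k = 3
threshold = noise_counts.mean() + k * noise_counts.std(ddof=1)
print(f"analytical threshold: {threshold:.1f} reads")

# A locus is called only if its read count exceeds the threshold.
def is_called(read_count, threshold):
    return read_count > threshold

print(is_called(40, threshold), is_called(5, threshold))
```

The choice of `k` trades false detections against dropout of true low-level alleles, which is why a defensible, documented derivation of the threshold matters for forensic admissibility.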

  19. A Single Molecular Beacon Probe Is Sufficient for the Analysis of Multiple Nucleic Acid Sequences

    PubMed Central

    Gerasimova, Yulia V.; Hayson, Aaron; Ballantyne, Jack; Kolpashchikov, Dmitry M.

    2010-01-01

    Molecular beacon (MB) probes are dual-labeled hairpin-shaped oligodeoxyribonucleotides that are extensively used for real-time detection of specific RNA/DNA analytes. In the MB probe, the loop fragment is complementary to the analyte; therefore, a unique probe is required for the analysis of each new analyte sequence. The conjugation of an oligonucleotide with two dyes and the subsequent purification procedures add to the cost of MB probes, thus limiting their application in multiplex formats. Here we demonstrate how one MB probe can be used for the analysis of an arbitrary nucleic acid. The approach takes advantage of two oligonucleotide adaptor strands, each of which contains a fragment complementary to the analyte and a fragment complementary to the MB probe. The presence of the analyte leads to association of the MB probe and the two DNA strands in a quadripartite complex, whose formation the MB probe fluorescently reports. In this design, the MB does not bind the analyte directly; therefore, the MB sequence is independent of the analyte. In this study, one universal MB probe was used to genotype three human polymorphic sites. This approach promises to reduce the cost of multiplex real-time assays and improve the accuracy of single-nucleotide polymorphism genotyping. PMID:20665615
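    The adaptor-strand idea can be sketched as a sequence-design exercise: each adaptor concatenates a fragment complementary to half of the analyte with a fragment complementary to half of the beacon loop, so the analyte and beacon are bridged without direct hybridization. All sequences below are hypothetical, and strand orientation (5'/3') is deliberately glossed over; this is an illustration of the complementarity logic, not the authors' probe design.

```python
# Map each base to its Watson-Crick complement.
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(seq):
    """Reverse complement of a DNA sequence given 5'->3'."""
    return seq.translate(COMPLEMENT)[::-1]

def design_adaptors(analyte, mb_loop):
    """Split analyte and beacon loop in half; each adaptor binds one half
    of the analyte and one half of the beacon loop, so the quadripartite
    complex forms only when the analyte is present."""
    a_half, b_half = analyte[:len(analyte) // 2], analyte[len(analyte) // 2:]
    mb_a, mb_b = mb_loop[:len(mb_loop) // 2], mb_loop[len(mb_loop) // 2:]
    adaptor_1 = reverse_complement(a_half) + reverse_complement(mb_a)
    adaptor_2 = reverse_complement(mb_b) + reverse_complement(b_half)
    return adaptor_1, adaptor_2

analyte = "ATGCCGTTAGCAGGTT"   # hypothetical target sequence
mb_loop = "GCTAGCTAGGCATCGA"   # hypothetical universal beacon loop
a1, a2 = design_adaptors(analyte, mb_loop)
print(a1, a2)
```

Because only the cheap, unlabeled adaptor strands change per target, the expensive dual-labeled beacon is reused across assays, which is the cost advantage the abstract describes.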

  20. [Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].

    PubMed

    Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou

    2014-08-01

    In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis) and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS may become a star of the elemental analysis field, just like LIBS (laser-induced breakdown spectroscopy).

  1. RE-EVALUATION OF APPLICABILITY OF AGENCY SAMPLE HOLDING TIMES

    EPA Science Inventory

    Holding times are the length of time a sample can be stored after collection and prior to analysis without significantly affecting the analytical results. Holding times vary with the analyte, sample matrix, and analytical methodology used to quantify the analyte's concentration. ...

  2. Microfluidic-Based Platform for Universal Sample Preparation and Biological Assays Automation for Life-Sciences Research and Remote Medical Applications

    NASA Astrophysics Data System (ADS)

    Brassard, D.; Clime, L.; Daoud, J.; Geissler, M.; Malic, L.; Charlebois, D.; Buckley, N.; Veres, T.

    2018-02-01

    An innovative centrifugal microfluidic universal platform is presented for the remote automation of bio-analytical assays required in life-sciences research and medical applications, including the purification and analysis of cellular and circulating markers from body fluids.

  3. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Disc (TED) technology is presented. TED technology comprises a number of attractive features, such as track independency, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of the system and to show how this tool can be exploited to develop truly integrated analytical systems that provide solutions within the point-of-care framework.

  4. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development, culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches, including the various techniques, detection systems, and automation tools that are available for effective separation and for enhanced selectivity and sensitivity in the quantitation of many analytes. The intention of this review is to cover the key areas where analytical method development becomes necessary during different stages of the drug discovery research and development process. The key areas covered, with relevant case studies, include: (a) simultaneous assay of the parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for the determination of multiple drugs used in combating a disease; (c) analytical measurement of chirality aspects in pharmacokinetic, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefit and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism- and permeability-related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations, including simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of the parent compound and/or phase II metabolite(s) via direct or indirect approaches; and (k) applicability to the analysis of multiple compounds in select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples is provided, covering separation conditions, validation aspects and applicable conclusions. A limited discussion is provided on the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, key elements such as internal standard selection and likely issues of mass detection, matrix effects, chiral aspects, etc. are provided for consideration during method development.

  5. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...

  6. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...

  7. 40 CFR Appendix B to Part 136 - Definition and Procedure for the Determination of the Method Detection Limit-Revision 1.11

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... that the analyte concentration is greater than zero and is determined from analysis of a sample in a given matrix containing the analyte. Scope and Application This procedure is designed for applicability to a wide variety of sample types ranging from reagent (blank) water containing analyte to wastewater...
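    The Appendix B calculation itself is short: at least seven replicate spiked samples are analyzed, and the MDL is the replicate standard deviation multiplied by the one-tailed 99% Student's t value for n−1 degrees of freedom. A sketch with hypothetical replicate results:

```python
import math

# 40 CFR 136 Appendix B style calculation: MDL = t(n-1, 0.99) * s, where s
# is the standard deviation of at least seven replicate spiked analyses.
replicates = [1.9, 2.1, 2.0, 2.3, 1.8, 2.2, 2.0]   # hypothetical results, ug/L

n = len(replicates)
mean = sum(replicates) / n
s = math.sqrt(sum((x - mean) ** 2 for x in replicates) / (n - 1))

# One-tailed 99% Student's t values by degrees of freedom (from t-tables).
T_99 = {6: 3.143, 7: 2.998, 8: 2.896}
mdl = T_99[n - 1] * s
print(f"s = {s:.3f} ug/L, MDL = {mdl:.3f} ug/L")
```

The replicate values here are invented for illustration; in practice the spiking level must be chosen near the expected MDL, per the procedure's scope.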

  8. Tunable lasers and their application in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Steinfeld, J. I.

    1975-01-01

    The impact that laser techniques might have on chemical analysis is examined. Absorption, scattering, and heterodyne detection are considered. Particular emphasis is placed on the advantages of using frequency-tunable sources, and dye solution lasers are regarded as the outstanding example of this type of laser. Types of spectroscopy that can be carried out with lasers are discussed, along with the ultimate sensitivity or minimum detectable concentration of molecules that can be achieved with each method. Analytical applications include laser microprobe analysis, remote sensing, and instrumental methods such as laser-Raman spectroscopy, atomic absorption/fluorescence spectrometry, fluorescence assay techniques, optoacoustic spectroscopy, and polarization measurements. The application of lasers to spectroscopic methods of analysis would seem to be a rewarding field both for research in analytical chemistry and for investment in instrument manufacturing.

  9. Hyperspectral imaging for non-contact analysis of forensic traces.

    PubMed

    Edelman, G J; Gaston, E; van Leeuwen, T G; Cullen, P J; Aalders, M C G

    2012-11-30

    Hyperspectral imaging (HSI) integrates conventional imaging and spectroscopy, to obtain both spatial and spectral information from a specimen. This technique enables investigators to analyze the chemical composition of traces and simultaneously visualize their spatial distribution. HSI offers significant potential for the detection, visualization, identification and age estimation of forensic traces. The rapid, non-destructive and non-contact features of HSI mark its suitability as an analytical tool for forensic science. This paper provides an overview of the principles, instrumentation and analytical techniques involved in hyperspectral imaging. We describe recent advances in HSI technology motivating forensic science applications, e.g. the development of portable and fast image acquisition systems. Reported forensic science applications are reviewed. Challenges are addressed, such as the analysis of traces on backgrounds encountered in casework, concluded by a summary of possible future applications. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  10. State of practice and emerging application of analytical techniques of nuclear forensic analysis: highlights from the 4th Collaborative Materials Exercise of the Nuclear Forensics International Technical Working Group (ITWG)

    DOE PAGES

    Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.

    2016-09-16

    The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize the exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging application of analytical techniques of nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.

  12. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies

    PubMed Central

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-01-01

    Background The complexity of current injury-related health issues demands the usage of diverse and massive data sets for comprehensive analyses, and application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, availability of new visual analytic methods and tools, and progress on information technology provide an opportunity for shaping the next generation of injury surveillance. Objective To introduce data visualisation conceptual bases, and propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. Methods The paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Results Application of the visual analytic and visualisation platform is presented as a solution for improving access to heterogeneous data sources, enhancing data exploration and analysis, communicating data effectively, and supporting decision-making. Conclusions Application of data visualisation concepts and the visual analytic platform could play a key role in shaping the next generation of injury surveillance. The platform could improve data use, analytic capacity, and the ability to effectively communicate findings and key messages. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. PMID:26728006

  13. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned.

    PubMed

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation's largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using the common definitions established for the Uniform Data System (UDS) by the Health Resources and Services Administration. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. 
At a population level, apparent underreporting of a number of diagnoses, specifically obesity and heart disease, was also evident in the results of the data quality exercise, for both the EHR-derived and stack analytic results. Data awareness, that is, an appreciation of the importance of data integrity, data hygiene, and the potential uses of data, needs to be prioritized and developed by health centers and other healthcare organizations if analytics are to be used in an effective manner to support strategic objectives. While this analysis was conducted exclusively with community health center organizations, its conclusions and recommendations may be more broadly applicable.
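
    The definitional inconsistencies the exercise surfaced can be illustrated with a toy example. The visit table and the "UDS-style" patient definition below (at least one face-to-face visit in the reporting year) are invented simplifications for illustration, not the actual UDS specification:

```python
# Invented visit data: the same table yields different "patient" counts
# depending on which definition of "patient" is applied.
visits = [
    {"patient_id": 1, "year": 2015, "type": "office"},
    {"patient_id": 1, "year": 2015, "type": "phone"},
    {"patient_id": 2, "year": 2015, "type": "phone"},   # phone-only contact
    {"patient_id": 3, "year": 2014, "type": "office"},  # outside reporting year
    {"patient_id": 4, "year": 2015, "type": "office"},
]

# Loose local EHR definition: anyone with any record
ehr_patients = {v["patient_id"] for v in visits}

# Stricter UDS-style definition (simplified): a face-to-face visit in 2015
uds_patients = {v["patient_id"] for v in visits
                if v["year"] == 2015 and v["type"] == "office"}
```

    Here the two definitions disagree by a factor of two, which is exactly the kind of discrepancy a parallel EHR-versus-analytic-stack query can expose.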

  14. Deployment of Analytics into the Healthcare Safety Net: Lessons Learned

    PubMed Central

    Hartzband, David; Jacobs, Feygele

    2016-01-01

    Background As payment reforms shift healthcare reimbursement toward value-based payment programs, providers need the capability to work with data of greater complexity, scope and scale. This will in many instances necessitate a change in understanding of the value of data, and the types of data needed for analysis to support operations and clinical practice. It will also require the deployment of different infrastructure and analytic tools. Community health centers, which serve more than 25 million people and together form the nation’s largest single source of primary care for medically underserved communities and populations, are expanding and will need to optimize their capacity to leverage data as new payer and organizational models emerge. Methods To better understand existing capacity and help organizations plan for the strategic and expanded uses of data, a project was initiated that deployed contemporary, Hadoop-based, analytic technology into several multi-site community health centers (CHCs) and a primary care association (PCA) with an affiliated data warehouse supporting health centers across the state. An initial data quality exercise was carried out after deployment, in which a number of analytic queries were executed using both the existing electronic health record (EHR) applications and, in parallel, the analytic stack. Each organization carried out the EHR analysis using the definitions typically applied for routine reporting. The analysis deploying the analytic stack was carried out using the common definitions established for the Uniform Data System (UDS) by the Health Resources and Services Administration. In addition, interviews with health center leadership and staff were completed to understand the context for the findings. Results The analysis uncovered many challenges and inconsistencies with respect to the definition of core terms (patient, encounter, etc.), data formatting, and missing, incorrect and unavailable data. 
At a population level, apparent underreporting of a number of diagnoses, specifically obesity and heart disease, was also evident in the results of the data quality exercise, for both the EHR-derived and stack analytic results. Conclusion Data awareness, that is, an appreciation of the importance of data integrity, data hygiene, and the potential uses of data, needs to be prioritized and developed by health centers and other healthcare organizations if analytics are to be used in an effective manner to support strategic objectives. While this analysis was conducted exclusively with community health center organizations, its conclusions and recommendations may be more broadly applicable. PMID:28210424

  15. Moving your laboratories to the field--Advantages and limitations of the use of field portable instruments in environmental sample analysis.

    PubMed

    Gałuszka, Agnieszka; Migaszewski, Zdzisław M; Namieśnik, Jacek

    2015-07-01

    The recent rapid progress in the technology of field portable instruments has increased their applications in environmental sample analysis. These instruments offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have their portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet-visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis gives a possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on the instrument performance and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  17. Line-Focused Optical Excitation of Parallel Acoustic Focused Sample Streams for High Volumetric and Analytical Rate Flow Cytometry.

    PubMed

    Kalb, Daniel M; Fencl, Frank A; Woods, Travis A; Swanson, August; Maestas, Gian C; Juárez, Jaime J; Edwards, Bruce S; Shreve, Andrew P; Graves, Steven W

    2017-09-19

    Flow cytometry provides highly sensitive multiparameter analysis of cells and particles but has been largely limited to the use of a single focused sample stream. This limits the analytical rate to ∼50K particles/s and the volumetric rate to ∼250 μL/min. Despite the analytical prowess of flow cytometry, there are applications where these rates are insufficient, such as rare cell analysis in high cellular backgrounds (e.g., circulating tumor cells and fetal cells in maternal blood), detection of cells/particles in large dilute samples (e.g., water quality, urine analysis), or high-throughput screening applications. Here we report a highly parallel acoustic flow cytometer that uses an acoustic standing wave to focus particles into 16 parallel analysis points across a 2.3 mm wide optical flow cell. A line-focused laser and wide-field collection optics are used to excite and collect the fluorescence emission of these parallel streams onto a high-speed camera for analysis. With this instrument format and fluorescent microsphere standards, we obtain analysis rates of 100K/s and flow rates of 10 mL/min, while maintaining optical performance comparable to that of a commercial flow cytometer. The results with our initial prototype instrument demonstrate that the integration of key parallelizable components, including the line-focused laser, particle focusing using multinode acoustic standing waves, and a spatially arrayed detector, can increase analytical and volumetric throughputs by orders of magnitude in a compact, simple, and cost-effective platform. Such instruments will be of great value to applications in need of high-throughput yet sensitive flow cytometry analysis.

  18. Biomolecular logic systems: applications to biosensors and bioactuators

    NASA Astrophysics Data System (ADS)

    Katz, Evgeny

    2014-05-01

    The paper presents an overview of recent advances in biosensors and bioactuators based on the biocomputing concept. Novel biosensors digitally process multiple biochemical signals through Boolean logic networks of coupled biomolecular reactions and produce output in the form of a YES/NO response. Compared to traditional single-analyte sensing devices, the biocomputing approach enables high-fidelity multi-analyte biosensing, particularly beneficial for biomedical applications. Multi-signal digital biosensors thus promise advances in rapid diagnosis and treatment of diseases by processing complex patterns of physiological biomarkers. Specifically, they can provide timely detection of and alerts to medical emergencies, along with an immediate therapeutic intervention. Application of the biocomputing concept has been successfully demonstrated for systems performing logic analysis of biomarkers corresponding to different injuries, particularly exemplified for liver injury. Wide-ranging applications of multi-analyte digital biosensors in medicine, environmental monitoring and homeland security are anticipated. "Smart" bioactuators, for example for signal-triggered drug release, were designed by interfacing switchable electrodes and biocomputing systems. Integration of novel biosensing and bioactuating systems with biomolecular information processing systems holds promise for further scientific advances and numerous practical applications.
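
    The Boolean-logic biosensing idea can be sketched in a few lines. The two liver-injury biomarkers (ALT, LDH), their cutoff values, and the AND-gate wiring below are illustrative assumptions for the generic concept, not details taken from the paper:

```python
# Illustrative sketch of a digital (YES/NO) multi-analyte biosensor.
# Biomarker cutoffs are invented for illustration only.
def digitize(value, threshold):
    """Map an analyte concentration onto logic 0/1 against a cutoff."""
    return 1 if value >= threshold else 0

def liver_injury_alert(alt_u_per_l, ldh_u_per_l):
    """AND gate on two hypothetical liver-injury biomarkers (ALT, LDH):
    output is YES (1) only when both analytes exceed their cutoffs."""
    alt = digitize(alt_u_per_l, 200.0)   # illustrative cutoff, U/L
    ldh = digitize(ldh_u_per_l, 400.0)   # illustrative cutoff, U/L
    return alt and ldh

# A YES output could trigger a bioactuator, e.g. signal-triggered drug release.
alert = liver_injury_alert(350.0, 600.0)
```

    In the biochemical realization the gate is a network of coupled enzymatic reactions rather than software, but the input/output logic is the same.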

  19. Q-controlled amplitude modulation atomic force microscopy in liquids: An analysis

    NASA Astrophysics Data System (ADS)

    Hölscher, H.; Schwarz, U. D.

    2006-08-01

    An analysis of amplitude modulation atomic force microscopy in liquids is presented with respect to the application of the Q-Control technique. The equation of motion is solved by numerical and analytic methods with and without Q-Control in the presence of a simple model interaction force adequate for many liquid environments. In addition, the authors give an explicit analytical formula for the tip-sample indentation showing that higher Q factors reduce the tip-sample force. It is found that Q-Control suppresses unwanted deformations of the sample surface, leading to the enhanced image quality reported in several experimental studies.
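
    The Q-Control effect can be reproduced with a minimal point-mass cantilever model (all parameters are dimensionless and illustrative; this sketches the generic mechanism, not the authors' analysis). A quarter-period delayed feedback term approximates the pi/2 phase shift of Q-Control and changes the effective quality factor to roughly Q_eff = Q / (1 - G*Q):

```python
import numpy as np

w0 = 2 * np.pi   # resonance frequency (arbitrary units)
Q = 2.0          # low natural Q, as in liquid
Ad = 1.0         # drive amplitude

def steady_amplitude(G, cycles=100, steps_per_cycle=1000):
    """Integrate z'' + (w0/Q) z' + w0^2 z = w0^2 Ad cos(w0 t) - G w0^2 z(t - T/4)
    with semi-implicit Euler; return the steady-state oscillation amplitude."""
    dt = (2 * np.pi / w0) / steps_per_cycle
    delay = steps_per_cycle // 4          # T/4 delay ~ pi/2 phase shift
    hist = [0.0] * delay                  # ring buffer of past deflections
    z, v = 0.0, 0.0
    n = cycles * steps_per_cycle
    amp = 0.0
    for i in range(n):
        zd = hist[i % delay]              # deflection a quarter period ago
        a = (-(w0 / Q) * v - w0**2 * z
             + w0**2 * Ad * np.cos(w0 * i * dt) - G * w0**2 * zd)
        v += a * dt
        z += v * dt
        hist[i % delay] = z
        if i > n - steps_per_cycle:       # peak over the final cycle
            amp = max(amp, abs(z))
    return amp

amp_off = steady_amplitude(G=0.0)  # ~ Q * Ad at resonance, no Q-Control
amp_on = steady_amplitude(G=0.3)   # larger: feedback reduces effective damping
```

    With G = 0.3 the effective quality factor rises from 2 to roughly Q/(1 - GQ) = 5; for a fixed setpoint amplitude the drive, and with it the tip-sample force, can then be reduced accordingly.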

  20. A theoretical analysis of the free vibrations of ring- and/or stringer-stiffened elliptical cylinders with arbitrary end conditions. Volume 1: Analytical derivation and applications

    NASA Technical Reports Server (NTRS)

    Boyd, D. E.; Rao, C. K. P.

    1973-01-01

    The derivation and application of a Rayleigh-Ritz modal vibration analysis are presented for ring and/or stringer stiffened noncircular cylindrical shells with arbitrary end conditions. Comparisons with previous results from experimental and analytical studies showed this method of analysis to be accurate for a variety of end conditions. Results indicate a greater effect of rings on natural frequencies than of stringers.

  1. Structural analysis and design of multivariable control systems: An algebraic approach

    NASA Technical Reports Server (NTRS)

    Tsay, Yih Tsong; Shieh, Leang-San; Barnett, Stephen

    1988-01-01

    The application of algebraic system theory to the design of controllers for multivariable (MV) systems is explored analytically using an approach based on state-space representations and matrix-fraction descriptions. Chapters are devoted to characteristic lambda matrices and canonical descriptions of MIMO systems; spectral analysis, divisors, and spectral factors of nonsingular lambda matrices; feedback control of MV systems; and structural decomposition theories and their application to MV control systems.

  2. Specialized data analysis of SSME and advanced propulsion system vibration measurements

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi

    1993-01-01

    The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.

  3. Analytical and between-subject variation of thrombin generation measured by calibrated automated thrombography on plasma samples.

    PubMed

    Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona

    2018-05-01

    The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has a considerable analytical variation, and we therefore investigated the analytical and between-subject variation of CAT systematically. Moreover, we assessed the application of an internal standard for normalization to diminish variation. 20 healthy volunteers donated one blood sample each, which was subsequently centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined over eight runs, in which plasma from the same seven volunteers was processed in triplicate, and for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis and subsequently used for normalization. The total analytical variation for TG analysis performed with PPPlow reagent is 3-14% and 9-13% for PPP reagent. This variation can be reduced slightly by using an internal standard, but mainly for ETP (endogenous thrombin potential). The between-subject variation is higher when using PPPlow than PPP, and this variation is considerably higher than the analytical variation. TG thus has a rather high inherent analytical variation, but it is considerably lower than the between-subject variation when PPPlow is used as the reagent.
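
    The internal-standard normalization can be sketched numerically. The simulation below uses invented numbers (a ~8% run-to-run bias and ~3% within-run scatter, loosely mirroring the reported ranges) and shows how dividing each run's ETP values by that run's internal-standard response removes the shared systematic effect:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated ETP values (nM*min) for 7 donors, triplicates on 8 runs;
# donor values, bias and noise levels are illustrative assumptions.
true_etp = np.array([1200.0, 1450.0, 980.0, 1600.0, 1100.0, 1350.0, 1500.0])
plate_bias = rng.normal(1.0, 0.08, size=8)        # systematic run-to-run effect
noise = rng.normal(1.0, 0.03, size=(8, 7, 3))     # within-run scatter
measured = true_etp[None, :, None] * plate_bias[:, None, None] * noise

# Internal standard: plasma from one extra donor measured on every run
istd = 1300.0 * plate_bias * rng.normal(1.0, 0.03, size=8)

def cv_percent(x):
    """Coefficient of variation in %, over all replicates and runs."""
    return x.std(ddof=1) / x.mean() * 100.0

raw_cv = np.array([cv_percent(measured[:, d, :]) for d in range(7)])

# Normalize each run by its internal-standard response (relative to the mean)
normalized = measured / (istd / istd.mean())[:, None, None]
norm_cv = np.array([cv_percent(normalized[:, d, :]) for d in range(7)])
```

    Normalization removes the shared run bias but re-injects the internal standard's own scatter, which is consistent with the abstract's finding of only a modest reduction.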

  4. Spring back of infinite honeycomb sheets beyond plastic deformation

    NASA Astrophysics Data System (ADS)

    Bonfanti, A.; Bhaskar, A.

    2015-02-01

    Cellular structures are promising for applications where high stiffness and strength are required with the minimal use of material. They are often used in applications where the plastic deformation plays an important role, such as those involving crashworthiness, energy absorption, and stents. The elastic analysis of a honeycomb sheet has been carried out in the past [1]. The present analysis extends this classical work in the elasto-plastic regime. Recoil analysis due to elastic recovery is absent from the published literature. This work aims to develop an analytical model to calculate the spring back for a simplified case, that of an infinite honeycomb sheet. An elastic-perfectly plastic material model is assumed. The recoil for a clamped beam with a load and moment applied at the free edge is analytically calculated first. This is carried out by relating the stress distribution of the cross section to the final deformed shape. The part corresponding to the elastic contribution is subsequently subtracted in order to obtain the final configuration after the external load is removed. This simple elasto-plastic analysis is then incorporated into the analysis of an infinite sheet made of uniform hexagonal cells. The translational symmetry of the lattice is exploited along with the analysis of a beam under tip loading through to plastic stage and recoil. The final shape of the struts upon the removal of the remote stress is completely determined by the plastic deformation which cannot be recovered. The expression for the beam thus obtained is then used to build an analytical model for an infinite honeycomb sheet loaded in both directions.
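
    The recoil step for a single clamped strut can be sketched as follows. The rectangular cross-section, material values, and load level are illustrative, and the elastic-perfectly plastic moment-curvature law is the standard textbook result rather than the authors' derivation; the residual shape is the loaded deflection minus the purely elastic recovery F L^3 / (3 E I):

```python
import numpy as np

E, sy = 70e9, 100e6            # Young's modulus and yield stress, Pa (illustrative)
L, b, h = 5e-3, 1e-3, 0.5e-3   # strut length and rectangular cross-section, m
I = b * h**3 / 12
My = sy * b * h**2 / 6         # first-yield moment
kap_y = My / (E * I)           # curvature at first yield

F = 1.3 * My / L               # tip load: root moment 1.3*My (plastic, below the 1.5*My collapse limit)

x = np.linspace(0.0, L, 2001)
M = F * (L - x)                # bending moment along the clamped strut
# Elastic-perfectly plastic section: M = 1.5*My*(1 - (kap_y/kappa)^2 / 3) beyond yield
kappa = np.where(M <= My, M / (E * I),
                 kap_y / np.sqrt(np.maximum(3.0 - 2.0 * M / My, 1e-12)))

def trapz(y, x):
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

delta_loaded = trapz(kappa * (L - x), x)   # tip deflection under load (unit-load theorem)
delta_elastic = F * L**3 / (3 * E * I)     # elastic spring back recovered on unloading
delta_residual = delta_loaded - delta_elastic
```

    Subtracting the elastic contribution in this way leaves the permanently deformed configuration; the paper assembles the sheet-level model from such strut solutions via the lattice's translational symmetry.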

  5. Comparison of univariate and multivariate calibration for the determination of micronutrients in pellets of plant materials by laser induced breakdown spectrometry

    NASA Astrophysics Data System (ADS)

    Braga, Jez Willian Batista; Trevizan, Lilian Cristina; Nunes, Lidiane Cristina; Rufini, Iolanda Aparecida; Santos, Dário, Jr.; Krug, Francisco José

    2010-01-01

    The application of laser induced breakdown spectrometry (LIBS) to the direct analysis of plant materials is a great challenge that still needs effort for its development and validation. To this end, a series of experimental approaches has been carried out in order to show that LIBS can be used as an alternative to wet acid digestion based methods for the analysis of agricultural and environmental samples. The large amount of information provided by LIBS spectra for these complex samples increases the difficulties for selecting the most appropriate wavelengths for each analyte. Some applications have suggested that improvements in both accuracy and precision can be achieved by the application of multivariate calibration to LIBS data when compared to univariate regression developed with line emission intensities. In the present work, the performance of univariate and multivariate calibration, based on partial least squares regression (PLSR), was compared for the analysis of pellets of plant materials made from an appropriate mixture of cryogenically ground samples with cellulose as the binding agent. The development of a specific PLSR model for each analyte and the selection of spectral regions containing only lines of the analyte of interest were the best conditions for the analysis. In this particular application, these models showed a similar performance, but PLSR seemed to be more robust due to a lower occurrence of outliers in comparison to the univariate method. Data suggest that efforts dealing with sample presentation and fitness of standards for LIBS analysis must be made in order to fulfill the boundary conditions for matrix-independent development and validation.
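
    The univariate-versus-PLSR comparison can be sketched on simulated spectra. The hypothetical analyte emission line overlapped by a matrix line below is invented (no real LIBS data), and the compact NIPALS PLS1 implementation stands in for the paper's PLSR models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated pellet spectra: analyte line at 0.30, overlapping matrix line at 0.35
n, p = 40, 60
wl = np.linspace(0.0, 1.0, p)
analyte_line = np.exp(-((wl - 0.30) ** 2) / 0.002)
matrix_line = np.exp(-((wl - 0.35) ** 2) / 0.005)
c = rng.uniform(1.0, 10.0, n)                  # analyte concentration
m = rng.uniform(0.0, 5.0, n)                   # interferent level
X = (np.outer(c, analyte_line) + np.outer(m, matrix_line)
     + 0.01 * rng.standard_normal((n, p)))

# Univariate calibration: intensity at the analyte peak vs concentration
peak = int(np.argmax(analyte_line))
slope, intercept = np.polyfit(X[:, peak], c, 1)
c_uni = slope * X[:, peak] + intercept

# PLS1 (NIPALS) calibration on the full spectrum
def pls1_fit(X, y, ncomp):
    Xm, ym = X.mean(axis=0), y.mean()
    Xc, yc = X - Xm, y - ym
    W, P, Q = [], [], []
    for _ in range(ncomp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)
        t = Xc @ w
        tt = t @ t
        pv, q = Xc.T @ t / tt, yc @ t / tt
        Xc, yc = Xc - np.outer(t, pv), yc - q * t
        W.append(w); P.append(pv); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    B = W @ np.linalg.solve(P.T @ W, np.array(Q))
    return Xm, ym, B

Xm, ym, B = pls1_fit(X, c, ncomp=2)
c_pls = (X - Xm) @ B + ym

rmse_uni = float(np.sqrt(np.mean((c_uni - c) ** 2)))
rmse_pls = float(np.sqrt(np.mean((c_pls - c) ** 2)))
```

    Because the matrix line leaks into the analyte channel, the single-wavelength model is biased, while the two-component PLS1 model separates the two spectral contributions.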

  6. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
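
    The core PFA idea of propagating parameter uncertainty through an analytical failure model into a failure-probability estimate can be sketched with a toy Monte Carlo. The Paris-law crack-growth model with a constant stress-intensity range, and all distributions below, are illustrative assumptions, not the report's engineering models:

```python
import numpy as np

rng = np.random.default_rng(2)

N = 100_000
# Uncertain model parameters (illustrative distributions)
C = 10.0 ** rng.normal(-11.0, 0.3, N)   # Paris coefficient
m = rng.normal(3.0, 0.1, N)             # Paris exponent
dK = rng.normal(8.0, 0.8, N)            # stress-intensity range, MPa*sqrt(m)
a0, af = 1e-4, 5e-3                     # initial and critical crack sizes, m

# da/dN = C * dK^m with dK held constant => cycles to grow a0 -> af
Nf = (af - a0) / (C * dK ** m)

mission = 2e5                            # cycles per mission (illustrative)
p_fail = float(np.mean(Nf < mission))    # Monte Carlo failure probability
```

    In the full methodology this model-based failure-probability distribution would then be updated with whatever test and flight experience is available.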

  7. Design and analysis of composite structures with stress concentrations

    NASA Technical Reports Server (NTRS)

    Garbo, S. P.

    1983-01-01

    An overview of an analytic procedure which can be used to provide comprehensive stress and strength analysis of composite structures with stress concentrations is given. The methodology provides designer/analysts with a user-oriented procedure which, within acceptable engineering accuracy, accounts for the effects of a wide range of application design variables. The procedure permits the strength of arbitrary laminate constructions under general bearing/bypass load conditions to be predicted with only unnotched unidirectional strength and stiffness input data required. Included is a brief discussion of the relevancy of this analysis to the design of primary aircraft structure; an overview of the analytic procedure with theory/test correlations; and an example of the use and interaction of this strength analysis relative to the design of high-load transfer bolted composite joints.

  8. Food adulteration analysis without laboratory prepared or determined reference food adulterant values.

    PubMed

    Kalivas, John H; Georgiou, Constantinos A; Moira, Marianna; Tsafaras, Ilias; Petrakis, Eleftherios A; Mousdis, George A

    2014-04-01

    Quantitative analysis of food adulterants is an important health and economic issue, and the analysis needs to be fast and simple. Spectroscopy has significantly reduced analysis time. However, it still requires the preparation of analyte calibration samples matrix-matched to the prediction samples, which can be laborious and costly. Reported in this paper is the application of a newly developed pure component Tikhonov regularization (PCTR) process that does not require laboratory-prepared calibration samples or reference analysis methods and, hence, is a greener calibration method. The PCTR method requires only an analyte pure component spectrum and non-analyte spectra. As a food analysis example, synchronous fluorescence spectra of extra virgin olive oil samples adulterated with sunflower oil are used. Results are shown to be better than those obtained using ridge regression with reference calibration samples. The flexibility of PCTR allows reference samples to be included, and the method is generic for use with other instrumental methods and food products. Copyright © 2013 Elsevier Ltd. All rights reserved.
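
    The pure-component Tikhonov construction can be sketched as a stacked least-squares problem: the regression vector is asked to return 1 for the analyte's pure-component spectrum and 0 for non-analyte spectra, with a Tikhonov penalty for stability. The simulated Gaussian "spectra" and weights below are illustrative assumptions, not the paper's olive-oil data:

```python
import numpy as np

rng = np.random.default_rng(3)

p = 80
wl = np.linspace(0.0, 1.0, p)
pure = np.exp(-((wl - 0.4) ** 2) / 0.004)   # analyte pure-component spectrum
nonanalyte = np.vstack([                    # spectra known to contain no analyte
    np.exp(-((wl - 0.6) ** 2) / 0.010),
    np.exp(-((wl - 0.2) ** 2) / 0.008),
])

tau, lam, alpha = 1.0, 10.0, 0.1
# Stacked system:  tau * (pure . b) = tau,  lam * (N b) = 0,  alpha * b = 0
A = np.vstack([tau * pure[None, :], lam * nonanalyte, alpha * np.eye(p)])
y = np.concatenate([[tau], np.zeros(nonanalyte.shape[0] + p)])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict adulteration level in mixture "samples"
c_true = np.array([0.1, 0.3, 0.7])
samples = (np.outer(c_true, pure)
           + rng.uniform(0.0, 1.0, (3, 1)) * nonanalyte[0]
           + 0.005 * rng.standard_normal((3, p)))
c_hat = samples @ b
```

    No concentration-referenced calibration set enters the model: only the pure spectrum and spectra known to be analyte-free, which is the sense in which the approach is "greener".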

  9. Neutron radiative capture methods for surface elemental analysis

    USGS Publications Warehouse

    Trombka, J.I.; Senftle, F.; Schmadebeck, R.

    1970-01-01

    Both an accelerator and a 252Cf neutron source have been used to induce characteristic gamma radiation from extended soil samples. To demonstrate the method, measurements of the neutron-induced radiative capture and activation gamma rays have been made with both Ge(Li) and NaI(Tl) detectors. Because of the possible application to space flight geochemical analysis, it is believed that NaI(Tl) detectors must be used. Analytical procedures have been developed to obtain both qualitative and semiquantitative results from an interpretation of the measured NaI(Tl) pulse-height spectrum. Experimental results and the analytic procedure are presented. © 1970.

  10. Data visualisation in surveillance for injury prevention and control: conceptual bases and case studies.

    PubMed

    Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F

    2016-04-01

    The complexity of current injury-related health issues demands the usage of diverse and massive data sets for comprehensive analyses, and application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, availability of new visual analytic methods and tools, and progress on information technology provide an opportunity for shaping the next generation of injury surveillance. To introduce data visualisation conceptual bases, and propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. The paper introduces data visualisation conceptual bases, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Application of the visual analytic and visualisation platform is presented as a solution for improving access to heterogeneous data sources, enhancing data exploration and analysis, communicating data effectively, and supporting decision-making. Application of data visualisation concepts and the visual analytic platform could play a key role in shaping the next generation of injury surveillance. The platform could improve data use, analytic capacity, and the ability to effectively communicate findings and key messages. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  11. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. 
Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for the translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics, and others yet to emerge on the postgenomics horizon.

  12. Current applications of high-resolution mass spectrometry for the analysis of new psychoactive substances: a critical review.

    PubMed

    Pasin, Daniel; Cawley, Adam; Bidny, Sergei; Fu, Shanlin

    2017-10-01

    The proliferation of new psychoactive substances (NPS) in recent years has resulted in the development of numerous analytical methods for the detection and identification of known and unknown NPS derivatives. High-resolution mass spectrometry (HRMS) has been identified as the method of choice for broad screening of NPS in a wide range of analytical contexts because of its ability to measure accurate masses using data-independent acquisition (DIA) techniques. Additionally, it has shown promise for non-targeted screening strategies that have been developed in order to detect and identify novel analogues without the need for certified reference materials (CRMs) or comprehensive mass spectral libraries. This paper reviews the applications of HRMS for the analysis of NPS in forensic drug chemistry and analytical toxicology. It provides an overview of sample preparation procedures in addition to data acquisition, instrumental analysis, and data processing techniques. Furthermore, it gives an overview of the current state of non-targeted screening strategies with discussion on future directions and perspectives of this technique. Graphical abstract: Missing the bullseye - a graphical representation of non-targeted screening. Image courtesy of Christian Alonzo.

  13. Application of conductive polymer analysis for wood and woody plant identifications

    Treesearch

    A. Dan Wilson; D.G. Lester; Charisse S. Oberle

    2005-01-01

    An electronic aroma detection (EAD) technology known as conductive polymer analysis (CPA) was evaluated as a means of identifying and discriminating woody samples of angiosperms and gymnosperms using an analytical instrument (electronic nose) that characterizes the aroma profiles of volatiles released from excised wood into sampled headspace. The instrument measures...

  14. Species authentication and geographical origin discrimination of herbal medicines by near infrared spectroscopy: A review.

    PubMed

    Wang, Pei; Yu, Zhiguo

    2015-10-01

    Near infrared (NIR) spectroscopy as a rapid and nondestructive analytical technique, integrated with chemometrics, is a powerful process analytical tool for the pharmaceutical industry and is becoming an attractive complementary technique for herbal medicine analysis. This review mainly focuses on the recent applications of NIR spectroscopy in species authentication of herbal medicines and their geographical origin discrimination.

  15. Exact synchronization bound for coupled time-delay systems.

    PubMed

    Senthilkumar, D V; Pesquera, Luis; Banerjee, Santo; Ortín, Silvia; Kurths, J

    2013-04-01

    We obtain an exact bound for synchronization in coupled time-delay systems using the generalized Halanay inequality for the general case of time-dependent delay, coupling, and coefficients. Furthermore, we show that the same analysis is applicable to both uni- and bidirectionally coupled time-delay systems with an appropriate evolution equation for their synchronization manifold, which can also be defined for different types of synchronization. The exact synchronization bound assures an exponential stabilization of the synchronization manifold which is crucial for applications. The analytical synchronization bound is independent of the nature of the modulation and can be applied to any time-delay system satisfying a Lipschitz condition. The analytical results are corroborated numerically using the Ikeda system.
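    The Halanay-type argument underlying such bounds can be illustrated in its classical constant-coefficient form (a textbook statement added here for context; the paper itself treats the general case of time-dependent delay, coupling and coefficients):

```latex
% If the synchronization error \Delta(t) obeys, for constants a > b > 0,
\[
  \dot{\Delta}(t) \;\le\; -a\,\Delta(t) \;+\; b \sup_{t-\tau \le s \le t} \Delta(s),
\]
% then the error decays exponentially,
\[
  \Delta(t) \;\le\; \Big(\sup_{-\tau \le s \le 0} \Delta(s)\Big)\, e^{-\gamma t},
\]
% where \gamma > 0 is the unique positive root of
% \gamma = a - b\, e^{\gamma \tau}.
```

    The exponential rate γ is what guarantees exponential stabilization of the synchronization manifold, as the abstract emphasizes.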

  16. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  17. APPLICATION OF THE MASTER ANALYTICAL SCHEME TO POLAR ORGANICS IN DRINKING WATER

    EPA Science Inventory

    EPA's Master Analytical Scheme (MAS) for Organic Compounds in Water provides for comprehensive qualitative-quantitative analysis of gas chromatographable organics in many types of water. The paper emphasizes the analysis of polar and ionic organics, the more water soluble compoun...

  18. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    PubMed Central

    Hancu, Gabriel; Simon, Brigitta; Rusu, Aura; Mircia, Eleonora; Gyéresi, Árpád

    2013-01-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques had rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are electrophoretically neutral; consequently, separations by classic capillary zone electrophoresis, in which separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration; the micelles thus formed undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis. PMID:24312804
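    The differential partitioning described above is commonly quantified by the MEKC retention factor (the standard Terabe formulation, given here as general background rather than taken from the paper):

```latex
\[
  k \;=\; \frac{t_R - t_0}{t_0\left(1 - t_R / t_{mc}\right)}
\]
% t_0 : migration time of an unretained (electroosmotic flow) marker
% t_R : migration time of the analyte
% t_{mc} : migration time of the micellar-phase marker
% All neutral analytes elute within the window t_0 < t_R < t_{mc}.
```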

  19. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparison of objects based on their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable for the comparison of analytical methods, or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate the analytical performance taking into account all indicators simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which simultaneously are used for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, (2) a "distance" to the Reference laboratory and (3) a classification based on the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
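    The core comparison step of a partial order can be sketched in a few lines (a minimal illustration with hypothetical indicator values; the paper's actual methodology, including the "distance" and "peculiar points" concepts, goes well beyond this):

```python
# Minimal sketch of a partial order over laboratories, using three
# hypothetical indicators where smaller is better: |mean - reference|,
# standard deviation, and |skewness|. (Values below are invented.)
labs = {
    "Lab1": (0.1, 0.05, 0.2),
    "Lab2": (0.3, 0.04, 0.1),
    "Lab3": (0.4, 0.10, 0.5),
}

def dominates(a, b):
    """a <= b in the partial order iff a is no worse on every indicator."""
    return all(x <= y for x, y in zip(a, b))

# All comparable ordered pairs; incomparable pairs (e.g. Lab1 vs Lab2,
# which disagree across indicators) are simply absent from the relation.
relation = [(p, q) for p in labs for q in labs
            if p != q and dominates(labs[p], labs[q])]
print(relation)  # Lab1 and Lab2 each dominate Lab3; Lab1 vs Lab2 incomparable
```

    Incomparability is exactly the feature that a single linear scale cannot express, which is the abstract's motivation for the approach.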

  20. Apparatus and method for performing microfluidic manipulations for chemical analysis and synthesis

    DOEpatents

    Ramsey, J. Michael

    2000-01-01

    A microchip laboratory system and method provide fluid manipulations for a variety of applications, including sample injection for microchip chemical separations. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis and electrochromatography are performed in channels formed in the substrate. Analytes are loaded into a four-way intersection of channels by electrokinetically pumping the analyte through the intersection, followed by switching of the potentials to force an analyte plug into the separation channel.

  1. Apparatus and method for performing microfluidic manipulations for chemical analysis and synthesis

    DOEpatents

    Ramsey, J. Michael

    2000-01-01

    A microchip laboratory system and method provide fluid manipulations for a variety of applications, including sample injection for microchip chemical separations. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis and electrochromatography are performed in channels formed in the substrate. Analytes are loaded into a four-way intersection of channels by electrokinetically pumping the analyte through the intersection, followed by switching of the potentials to force an analyte plug into the separation channel.

  2. Apparatus and method for performing microfluidic manipulations for chemical analysis and synthesis

    DOEpatents

    Ramsey, J. Michael

    2002-01-01

    A microchip laboratory system and method provide fluid manipulations for a variety of applications, including sample injection for microchip chemical separations. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis and electrochromatography are performed in channels formed in the substrate. Analytes are loaded into a four-way intersection of channels by electrokinetically pumping the analyte through the intersection, followed by switching of the potentials to force an analyte plug into the separation channel.

  3. Apparatus and method for performing microfluidic manipulations for chemical analysis and synthesis

    DOEpatents

    Ramsey, J. Michael

    1999-01-01

    A microchip laboratory system and method provide fluid manipulations for a variety of applications, including sample injection for microchip chemical separations. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis and electrochromatography are performed in channels formed in the substrate. Analytes are loaded into a four-way intersection of channels by electrokinetically pumping the analyte through the intersection, followed by switching of the potentials to force an analyte plug into the separation channel.

  4. Apparatus and method for performing microfluidic manipulations for chemical analysis and synthesis

    DOEpatents

    Ramsey, J.M.

    1999-01-12

    A microchip laboratory system and method provide fluid manipulations for a variety of applications, including sample injection for microchip chemical separations. The microchip is fabricated using standard photolithographic procedures and chemical wet etching, with the substrate and cover plate joined using direct bonding. Capillary electrophoresis and electrochromatography are performed in channels formed in the substrate. Analytes are loaded into a four-way intersection of channels by electrokinetically pumping the analyte through the intersection, followed by switching of the potentials to force an analyte plug into the separation channel. 46 figs.

  5. Investigation of colloidal graphite as a matrix for matrix-assisted laser desorption/ionisation mass spectrometry of low molecular weight analytes.

    PubMed

    Warren, Alexander D; Conway, Ulric; Arthur, Christopher J; Gates, Paul J

    2016-07-01

    The analysis of low molecular weight compounds by matrix-assisted laser desorption/ionisation mass spectrometry is problematic due to the interference and suppression of analyte ionisation by the matrices typically employed - which are themselves low molecular weight compounds. The application of colloidal graphite is demonstrated here as an easy-to-use matrix that can promote the ionisation of a wide range of analytes, including low molecular weight organic compounds, complex natural products and inorganic complexes. Analyte ionisation with colloidal graphite is compared with traditional organic matrices along with various other sources of graphite (e.g. graphite rods and charcoal pencils). Factors such as ease of application, spectral reproducibility, spot longevity, spot-to-spot reproducibility and spot homogeneity (through single spot imaging) are explored. For some analytes, considerable matrix suppression effects are observed, resulting in spectra completely devoid of matrix ions. We also report the observation of radical molecular ions [M]−● in the negative ion mode, particularly with some aromatic analytes. Copyright © 2016 John Wiley & Sons, Ltd.

  6. DOSY Analysis of Micromolar Analytes: Resolving Dilute Mixtures by SABRE Hyperpolarization.

    PubMed

    Reile, Indrek; Aspers, Ruud L E G; Tyburn, Jean-Max; Kempf, James G; Feiters, Martin C; Rutjes, Floris P J T; Tessari, Marco

    2017-07-24

    DOSY is an NMR spectroscopy technique that resolves resonances according to the analytes' diffusion coefficients. It has found use in correlating NMR signals and estimating the number of components in mixtures. Applications of DOSY in dilute mixtures are, however, held back by excessively long measurement times. We demonstrate herein, how the enhanced NMR sensitivity provided by SABRE hyperpolarization allows DOSY analysis of low-micromolar mixtures, thus reducing the concentration requirements by at least 100-fold. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
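    The diffusion coefficients that DOSY resolves are extracted by fitting the signal decay across gradient strengths to the Stejskal-Tanner relation (standard pulsed-field-gradient NMR background, included for context rather than taken from this paper):

```latex
\[
  I(g) \;=\; I_0 \exp\!\left[-D\,\gamma^2 g^2 \delta^2
  \left(\Delta - \tfrac{\delta}{3}\right)\right]
\]
% I(g) : signal intensity at gradient strength g
% D : diffusion coefficient of the analyte
% \gamma : gyromagnetic ratio of the observed nucleus
% \delta : gradient pulse duration, \Delta : diffusion delay
```

    Each gradient-encoded spectrum must be acquired with adequate signal-to-noise for the fit to converge, which is why SABRE's sensitivity enhancement translates directly into shorter DOSY measurements at low concentration.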

  7. Defining the challenges of the Modern Analytical Laboratory (CPSA USA 2014): the risks and reality of personalized healthcare.

    PubMed

    Weng, Naidong; Needham, Shane; Lee, Mike

    2015-01-01

    The 17th Annual Symposium on Clinical and Pharmaceutical Solutions through Analysis (CPSA), 29 September-2 October 2014, was held at the Sheraton Bucks County Hotel, Langhorne, PA, USA. CPSA USA 2014 brought together the various analytical fields defining the challenges of the modern analytical laboratory. Ongoing discussions focused on the future application of bioanalysis and other disciplines to support investigational new drug (IND) and new drug application (NDA) submissions, the clinical diagnostics and pathology laboratory personnel who support patient sample analysis, and the clinical researchers who provide insights into new biomarkers within the context of the modern laboratory and personalized medicine.

  8. Application of ion chromatography in pharmaceutical and drug analysis.

    PubMed

    Jenke, Dennis

    2011-08-01

    Ion chromatography (IC) has developed and matured into an important analytical methodology in a number of diverse applications and industries, including pharmaceuticals. This manuscript provides a review of IC applications for the determinations of active and inactive ingredients, excipients, degradation products, and impurities relevant to pharmaceutical analyses and thus serves as a resource for investigators looking for insights into the use of the IC methodology in this field of application.

  9. Macro elemental analysis of food samples by nuclear analytical technique

    NASA Astrophysics Data System (ADS)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analytical technique compared with other detection methods, and is therefore applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions; the determination of Ca and K content in various foods therefore needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with two other analytical techniques: neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The comparison of methods served as a cross-check of the analysis results and helped overcome the limitations of the three methods. Analysis results showed that the Ca contents found in food using EDXRF and AAS were not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between these results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
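    The agreement statistic reported above is the Pearson correlation of paired results from two methods; it can be computed in pure Python as follows (the paired Ca values below are hypothetical, not the study's data):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired Ca results (mg/100 g) from two methods:
edxrf = [120.0, 85.0, 200.0, 150.0, 95.0]
aas   = [118.0, 88.0, 195.0, 155.0, 93.0]
r = pearson(edxrf, aas)
print(round(r, 4))  # close to 1, i.e. strong method agreement
```

    A correlation near 1, combined with a non-significant paired test on the differences, is what supports using one method as an alternative to the other.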

  10. Applications of derivatization reactions to trace organic compounds during sample preparation based on pressurized liquid extraction.

    PubMed

    Carro, Antonia M; González, Paula; Lorenzo, Rosa A

    2013-06-28

    Pressurized liquid extraction (PLE) is an exhaustive technique used for the extraction of analytes from solid samples. Temperature, pressure, solvent type and volume, and the addition of other reagents notably influence the efficiency of the extraction. The analytical applications of this technique can be improved by coupling with appropriate derivatization reactions. The aim of this review is to discuss the recent applications of the sequential combination of PLE with derivatization and the approaches that involve simultaneous extraction and in situ derivatization. The potential of the latest developments to the trace analysis of environmental, food and biological samples is also analyzed. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Multi-way chemometric methodologies and applications: a central summary of our research work.

    PubMed

    Wu, Hai-Long; Nie, Jin-Fang; Yu, Yong-Jie; Yu, Ru-Qin

    2009-09-14

    Multi-way data analysis and tensorial calibration are gaining widespread acceptance with the rapid development of modern analytical instruments. In recent years, our group, working in the State Key Laboratory of Chemo/Biosensing and Chemometrics at Hunan University, has carried out extensive scientific research work in this area, such as building more canonical symbol systems, seeking the inner mathematical cyclic symmetry property of trilinear or multilinear decomposition, suggesting a series of multi-way calibration algorithms, exploring the rank estimation of three-way trilinear data arrays, and analyzing different application systems. In the present paper, an overview spanning second-order to third-order data, covering theories and applications in analytical chemistry, is presented.

  12. Applications of reversible covalent chemistry in analytical sample preparation.

    PubMed

    Siegel, David

    2012-12-07

    Reversible covalent chemistry (RCC) adds another dimension to commonly used sample preparation techniques like solid-phase extraction (SPE), solid-phase microextraction (SPME), molecular imprinted polymers (MIPs) or immuno-affinity cleanup (IAC): chemical selectivity. By selecting analytes according to their covalent reactivity, sample complexity can be reduced significantly, resulting in enhanced analytical performance for low-abundance target analytes. This review gives a comprehensive overview of the applications of RCC in analytical sample preparation. The major reactions covered include reversible boronic ester formation, thiol-disulfide exchange and reversible hydrazone formation, targeting analyte groups like diols (sugars, glycoproteins and glycopeptides, catechols), thiols (cysteinyl-proteins and cysteinyl-peptides) and carbonyls (carbonylated proteins, mycotoxins). Their applications range from low abundance proteomics to reversible protein/peptide labelling to antibody chromatography to quantitative and qualitative food analysis. In discussing the potential of RCC, a special focus is on the conditions and restrictions of the utilized reaction chemistry.

  13. Porous Au-Ag Nanospheres with High-Density and Highly Accessible Hotspots for SERS Analysis.

    PubMed

    Liu, Kai; Bai, Yaocai; Zhang, Lei; Yang, Zhongbo; Fan, Qikui; Zheng, Haoquan; Yin, Yadong; Gao, Chuanbo

    2016-06-08

    Colloidal plasmonic metal nanoparticles have enabled surface-enhanced Raman scattering (SERS) for a variety of analytical applications. While great efforts have been made to create hotspots for amplifying Raman signals, it remains a great challenge to ensure their high density and accessibility for improved sensitivity of the analysis. Here we report a dealloying process for the fabrication of porous Au-Ag alloy nanoparticles containing abundant inherent hotspots, which were encased in ultrathin hollow silica shells so that the need of conventional organic capping ligands for stabilization is eliminated, producing colloidal plasmonic nanoparticles with clean surface and thus high accessibility of the hotspots. As a result, these novel nanostructures show excellent SERS activity with an enhancement factor of ∼1.3 × 10⁷ on a single-particle basis (off-resonant condition), promising high applicability in many SERS-based analytical and biomedical applications.
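    The single-particle enhancement factor quoted above is conventionally defined by comparing substrate-enhanced and reference Raman signals on a per-molecule basis (the standard SERS definition, given for context):

```latex
\[
  \mathrm{EF} \;=\;
  \frac{I_{\mathrm{SERS}} / N_{\mathrm{SERS}}}
       {I_{\mathrm{ref}} / N_{\mathrm{ref}}}
\]
% I_SERS, I_ref : Raman intensities with and without the plasmonic substrate
% N_SERS, N_ref : numbers of probed molecules in each measurement
```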

  14. Environmental and forensic applications of field-portable GC-MS: an overview.

    PubMed

    Eckenrode, B A

    2001-06-01

    GC-MS can provide the most reliable analytical information for many types of organic analyses. As field-portable GC-MS analytical systems have evolved, their application scenarios have diversified as well. With the development of rugged fieldable systems, these instruments have been demonstrated to be usable in the harsh environment of the jungle and in chemical demilitarization or military reconnaissance situations. Continuous unattended operation of a GC-MS for 12- or 24-hour monitoring applications in the field has been shown to be possible. A real-time algorithm strategy is proposed, which can be developed to aid the advancement of field-portable mass spectrometry applied to chemical warfare agent analysis in military vehicles and can be used to raise the standard for field data quality. Each of these capabilities is discussed with the intent of reviewing analysis situations that can be expanded because of developments in field GC-MS instrumentation.

  15. Bio-analytical applications of mid-infrared spectroscopy using silver halide fiber-optic probes

    NASA Astrophysics Data System (ADS)

    Heise, H. M.; Küpper, L.; Butvina, L. N.

    2002-10-01

    Infrared spectroscopy has proved to be a powerful method for the study of various biomedical samples, in particular for in-vitro analysis in the clinical laboratory and for non-invasive diagnostics. In general, the analysis of biofluids such as whole blood, urine, microdialysates and bioreactor broth media takes advantage of the fact that a multitude of analytes can be quantified simultaneously and rapidly without the need for reagents. Progress in the quality of infrared silver halide fibers has enabled us to construct several flexible fiber-optic probes of different geometries, which are particularly suitable for the measurement of small biosamples. Recent trends show that dry-film measurements by mid-infrared spectroscopy could revolutionize analytical tools in the clinical chemistry laboratory, and an example is given. Infrared diagnostic tools show promising potential for patients, and minimally invasive blood glucose assays and skin tissue pathology in particular should not be overlooked as applications of mid-infrared fiber-based probes. Other applications include the measurement of skin samples, including penetration studies of vitamins and constituents of cosmetic cream formulations. A further field is the micro-domain analysis of biopsy samples from bog-mummified corpses, and recent results on the chemistry of dermis and hair samples are reported. Another field of application, for which results are reported, is food analysis and bioreactor monitoring.

  16. Stakeholder perspectives on decision-analytic modeling frameworks to assess genetic services policy.

    PubMed

    Guzauskas, Gregory F; Garrison, Louis P; Stock, Jacquie; Au, Sylvia; Doyle, Debra Lochner; Veenstra, David L

    2013-01-01

    Genetic services policymakers and insurers often make coverage decisions in the absence of complete evidence of clinical utility and under budget constraints. We evaluated genetic services stakeholder opinions on the potential usefulness of decision-analytic modeling to inform coverage decisions, and asked them to identify genetic tests for decision-analytic modeling studies. We presented an overview of decision-analytic modeling to members of the Western States Genetic Services Collaborative Reimbursement Work Group and state Medicaid representatives and conducted directed content analysis and an anonymous survey to gauge their attitudes toward decision-analytic modeling. Participants also identified and prioritized genetic services for prospective decision-analytic evaluation. Participants expressed dissatisfaction with current processes for evaluating insurance coverage of genetic services. Some participants expressed uncertainty about their comprehension of decision-analytic modeling techniques. All stakeholders reported openness to using decision-analytic modeling for genetic services assessments. Participants were most interested in application of decision-analytic concepts to multiple-disorder testing platforms, such as next-generation sequencing and chromosomal microarray. Decision-analytic modeling approaches may provide a useful decision tool to genetic services stakeholders and Medicaid decision-makers.

  17. Application of nanotechnology in miniaturized systems and its use for advanced analytics and diagnostics - an updated review.

    PubMed

    Sandetskaya, Natalia; Allelein, Susann; Kuhlmeier, Dirk

    2013-12-01

    A combination of Micro-Electro-Mechanical Systems and nanoscale structures allows for the creation of novel miniaturized devices, which broaden the boundaries of diagnostic approaches. Some materials possess unique properties at the nanolevel that differ from those of the bulk materials. In the last few years these properties, as well as methods for the production, design and operation of nanoobjects, have become a focus of interest for many researchers. Intensive research and development work has resulted in numerous inventions exploiting nanotechnology in miniaturized systems. Modern technical and laboratory equipment allows for the precise control of such devices, making them suitable for sensitive and accurate detection of analytes. The current review highlights recent patents in the field of nanotechnology in microdevices applicable to medical, environmental or food analysis. The paper covers the structural and functional basis of such systems and describes specific embodiments in three principal branches: the application of nanoparticles, nanofluidics, and nanosensors in miniaturized systems for advanced analytics and diagnostics. This overview is an update of an earlier review article.

  18. Performance implications from sizing a VM on multi-core systems: A data analytic application's view

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Horey, James L; Begoli, Edmon

    In this paper, we present a quantitative performance analysis of data analytics applications running on multi-core virtual machines. Such environments form the core of cloud computing. In addition, data analytics applications, such as Cassandra and Hadoop, are becoming increasingly popular on cloud computing platforms. This convergence necessitates a better understanding of the performance and cost implications of such hybrid systems. For example, the very first step in hosting applications in virtualized environments requires the user to configure the number of virtual processors and the size of memory. To understand the performance implications of this step, we benchmarked three Yahoo Cloud Serving Benchmark (YCSB) workloads in a virtualized multi-core environment. Our measurements indicate that the performance of Cassandra for YCSB workloads does not heavily depend on the processing capacity of a system, while the size of the data set relative to allocated memory is critical to performance. We also identified a strong relationship between the running time of workloads and various hardware events (last-level cache loads, misses, and CPU migrations). From this analysis, we provide several suggestions to improve the performance of data analytics applications running in cloud computing environments.

  19. Advantages and Challenges of Dried Blood Spot Analysis by Mass Spectrometry Across the Total Testing Process.

    PubMed

    Zakaria, Rosita; Allen, Katrina J; Koplin, Jennifer J; Roche, Peter; Greaves, Ronda F

    2016-12-01

    Through the introduction of advanced analytical techniques and improved throughput, the scope of dried blood spot testing utilising mass spectrometric methods has broadly expanded. Clinicians and researchers have become very enthusiastic about the potential applications of dried blood spot based mass spectrometric methods. Analysts, on the other hand, face challenges of sensitivity, reproducibility and overall accuracy of dried blood spot quantification. In this review, we aim to bring together these two facets to discuss the advantages and current challenges of non-newborn-screening applications of dried blood spot quantification by mass spectrometry. To address these aims we performed a keyword search of the PubMed and MEDLINE online databases in conjunction with individual manual searches to gather information. Keywords for the initial search included "blood spot" and "mass spectrometry", while excluding "newborn" and "neonate"; in addition, searches were restricted to English-language, human-specific records, with no time period limit applied. As a result of these selection criteria, 194 references were identified for review. For presentation, this information is divided into: (1) clinical applications; and (2) analytical considerations across the total testing process, comprising pre-analytical, analytical and post-analytical considerations. DBS analysis using MS is now broadly applied, with drug monitoring for both therapeutic and toxicological analysis being the most extensively reported. Several parameters can affect the accuracy of DBS measurement, and further bridging experiments are required to develop adjustment rules for comparability between dried blood spot measures and the equivalent serum/plasma values. Likewise, the establishment of independent reference intervals for the dried blood spot sample matrix is required.

  20. CZAEM USER'S GUIDE: MODELING CAPTURE ZONES OF GROUND-WATER WELLS USING ANALYTIC ELEMENTS

    EPA Science Inventory

The computer program CZAEM is designed for elementary capture zone analysis, and is based on the analytic element method. CZAEM is applicable to confined and/or unconfined flow in shallow aquifers; the Dupuit-Forchheimer assumption is adopted. CZAEM supports the following analyt...

  1. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  2. Determination of pesticides in sewage sludge from an agro-food industry using QuEChERS extraction followed by analysis with liquid chromatography-tandem mass spectrometry.

    PubMed

    Ponce-Robles, Laura; Rivas, Gracia; Esteban, Belen; Oller, Isabel; Malato, Sixto; Agüera, Ana

    2017-10-01

An analytical method was developed and validated for the determination of ten pesticides in sewage sludge coming from an agro-food industry. The method was based on the application of Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) extraction for solid sewage sludge and SPE extraction for the sludge aqueous phase, followed by liquid chromatography (LC) coupled to hybrid quadrupole/linear ion trap mass spectrometry (QqLIT-MS). The QuEChERS method was reported 14 years ago and nowadays is mainly applied to the analysis of pesticides in food. More recent applications have been reported in other matrices, such as sewage sludge, but the complexity of the matrix makes optimization of the cleanup step necessary to improve the efficiency of the analysis. With this aim, several dispersive solid-phase extraction cleanup sorbents were tested, and C18 + PSA was chosen as the d-SPE sorbent. The proposed method was satisfactorily validated for most compounds investigated, showing recoveries higher than 80% in most cases, with the only exception of prochloraz (71%) at the low concentration level. Limits of quantification were lower than 40 ng L⁻¹ in the aqueous phase and below 40 ng g⁻¹ in the solid phase for the majority of the analytes. The method was applied to solid sludge and the sludge aqueous phase coming from an agro-food industry which processes fruits and vegetables. Graphical abstract Application of LC/MS/MS advanced analytical techniques for determination of pesticides contained in sewage sludge.
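A minimal sketch of the recovery check typically used in this kind of method validation. The compound concentrations below are hypothetical illustrations, not values from the study; only the 70-120% acceptance window is a common validation convention.

```python
# Sketch of a spike-recovery check used in method validation (illustrative;
# the spiked and measured values below are hypothetical, not from the study).
def recovery_percent(measured, spiked):
    """Recovery (%) = measured concentration / spiked concentration * 100."""
    return 100.0 * measured / spiked

spiked = {"imazalil": 50.0, "prochloraz": 50.0}      # ng/g, hypothetical
measured = {"imazalil": 43.5, "prochloraz": 35.5}    # ng/g, hypothetical

for analyte, conc in measured.items():
    rec = recovery_percent(conc, spiked[analyte])
    # Common acceptance window for trace-level pesticide methods: 70-120%
    status = "acceptable" if 70.0 <= rec <= 120.0 else "out of range"
    print(f"{analyte}: {rec:.0f}% ({status})")
```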

  3. Utility of the summation chromatographic peak integration function to avoid manual reintegrations in the analysis of targeted analytes

    USDA-ARS?s Scientific Manuscript database

    As sample preparation and analytical techniques have improved, data handling has become the main limitation in automated high-throughput analysis of targeted chemicals in many applications. Conventional chromatographic peak integration functions rely on complex software and settings, but untrustwor...
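The summation function mentioned above can be pictured as integrating the baseline-corrected signal over a fixed retention-time window, rather than relying on peak-detection settings. The sketch below is an illustration of that idea on synthetic data; the function name, window bounds, and Gaussian test peak are all assumptions, not the manuscript's implementation.

```python
import numpy as np

# Illustrative summation-window integration: sum the baseline-corrected
# signal over a fixed retention-time window around the expected elution
# time, avoiding per-peak start/stop detection settings.
def summation_integrate(time, signal, t_start, t_stop):
    mask = (time >= t_start) & (time <= t_stop)
    t_win, s_win = time[mask], signal[mask]
    # Estimate the baseline from the window edges and subtract it.
    baseline = 0.5 * (s_win[0] + s_win[-1])
    dt = t_win[1] - t_win[0]              # assumes uniform sampling
    return float(np.sum(s_win - baseline) * dt)

time = np.linspace(0, 10, 1001)                      # retention time, min
signal = 100 * np.exp(-((time - 5.0) ** 2) / 0.02)   # synthetic Gaussian peak
area = summation_integrate(time, signal, 4.5, 5.5)   # ~25 signal*min
```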

  4. PROVIDING PLANT DATA ANALYTICS THROUGH A SEAMLESS DIGITAL ENVIRONMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bly, Aaron; Oxstrand, Johanna

As technology continues to evolve and become more integrated into a worker's daily routine in the nuclear power industry, the need for easy access to data becomes a priority. Not only does the need for data increase, but the amount of data collected increases. In most cases the data is collected and stored in various software applications, many of which are legacy systems that do not offer any option to access the data except through the application's user interface. Furthermore, the data gets grouped in "silos" according to work function and not necessarily by subject. Hence, in order to access all the information needed for a particular task or analysis, one may have to access multiple applications to gather all the data needed. The industry and the research community have identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the analysis requested, and present the result to the user with minimal or no effort by the user. In addition, the nuclear utilities have identified the need for research focused on data analytics. The effort should develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases, for use in developing improved business analytics. Idaho National Laboratory is leading such an effort, conducted in close collaboration with vendors, nuclear utilities, the Institute of Nuclear Power Operations, and the Electric Power Research Institute. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that will display the output of the analysis in a straightforward, easy-to-consume manner. This paper describes the study and the initial results.

  5. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    PubMed

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects that are analytical procedures applied to furan determination in food samples. They are described by 10 variables referring to their analytical performance and environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. Assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is also shown how the information obtained from the two tools is complementary. The applicability of combining grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
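The ranking step described above, TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution), can be sketched in a few lines. The small 3x3 decision matrix, weights, and criterion directions below are illustrative assumptions, not the paper's 22-procedure, 10-variable dataset.

```python
import numpy as np

# Minimal TOPSIS sketch for ranking analytical procedures by multiple
# criteria (standard textbook algorithm; data here is illustrative).
def topsis(matrix, weights, benefit):
    X = np.asarray(matrix, dtype=float)
    # Vector-normalize each criterion column, then apply weights.
    V = X / np.linalg.norm(X, axis=0) * weights
    # Ideal best/worst per criterion: max for benefit, min for cost criteria.
    best = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_best = np.linalg.norm(V - best, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    return d_worst / (d_best + d_worst)   # closeness: higher = better

# Columns: recovery (%), solvent use (mL), cost (a.u.); rows: procedures.
scores = topsis([[95, 10, 3], [80, 2, 1], [90, 5, 2]],
                weights=np.array([0.5, 0.3, 0.2]),
                benefit=np.array([True, False, False]))
ranking = np.argsort(scores)[::-1]        # best procedure first
```

Here the low-solvent, low-cost procedure (row 2) ranks first despite its lower recovery, because the cost criteria carry half the total weight.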

  6. Recent development of electrochemiluminescence sensors for food analysis.

    PubMed

    Hao, Nan; Wang, Kun

    2016-10-01

Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, sometimes their applications are limited because of the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassay, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating some selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, residual drugs, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection for various food contaminants in complex matrices. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.

  7. Experimental study and analytical model of deformation of magnetostrictive films as applied to mirrors for x-ray space telescopes.

    PubMed

    Wang, Xiaoli; Knapp, Peter; Vaynman, S; Graham, M E; Cao, Jian; Ulmer, M P

    2014-09-20

    The desire for continuously gaining new knowledge in astronomy has pushed the frontier of engineering methods to deliver lighter, thinner, higher quality mirrors at an affordable cost for use in an x-ray observatory. To address these needs, we have been investigating the application of magnetic smart materials (MSMs) deposited as a thin film on mirror substrates. MSMs have some interesting properties that make the application of MSMs to mirror substrates a promising solution for making the next generation of x-ray telescopes. Due to the ability to hold a shape with an impressed permanent magnetic field, MSMs have the potential to be the method used to make light weight, affordable x-ray telescope mirrors. This paper presents the experimental setup for measuring the deformation of the magnetostrictive bimorph specimens under an applied magnetic field, and the analytical and numerical analysis of the deformation. As a first step in the development of tools to predict deflections, we deposited Terfenol-D on the glass substrates. We then made measurements that were compared with the results from the analytical and numerical analysis. The surface profiles of thin-film specimens were measured under an external magnetic field with white light interferometry (WLI). The analytical model provides good predictions of film deformation behavior under various magnetic field strengths. This work establishes a solid foundation for further research to analyze the full three-dimensional deformation behavior of magnetostrictive thin films.

  8. Optical eye simulator for laser dazzle events.

    PubMed

    Coelho, João M P; Freitas, José; Williamson, Craig A

    2016-03-20

An optical simulator of the human eye and its application to laser dazzle events are presented. The simulator combines optical design software (ZEMAX) with a scientific programming language (MATLAB) and allows the user to implement and analyze a dazzle scenario using practical, real-world parameters. Contrary to conventional analytical glare analysis, this work uses ray tracing and scattering models and parameters for each optical element of the eye. The theoretical background of each such element is presented in relation to the model. The overall simulator's calibration, validation, and performance analysis are achieved by comparison with a simpler model based upon CIE disability glare data. Results demonstrate that this kind of advanced optical eye simulation can be used to represent laser dazzle and has the potential to extend the range of applicability of analytical models.

  9. Advances in Molecular Rotational Spectroscopy for Applied Science

    NASA Astrophysics Data System (ADS)

    Harris, Brent; Fields, Shelby S.; Pulliam, Robin; Muckle, Matt; Neill, Justin L.

    2017-06-01

Advances in chemical sensitivity and robust, solid-state designs for microwave/millimeter-wave instrumentation compel the expansion of molecular rotational spectroscopy as a research tool into applied science. Molecular rotational spectroscopy is perhaps most familiar in air analysis. Those techniques are included in our presentation of a broader application space for materials analysis using Fourier Transform Molecular Rotational Resonance (FT-MRR) spectrometers. There are potentially transformative advantages for direct gas analysis of complex mixtures, determination of unknown evolved gases with parts-per-trillion detection limits in solid materials, and unambiguous chiral determination. The introduction of FT-MRR as an alternative detection principle for analytical chemistry has created a ripe research space for the development of new analytical methods and sampling equipment to fully enable FT-MRR. We present the current state of purpose-built FT-MRR instrumentation and the latest application measurements that make use of new sampling methods.

  10. [Construction of NIRS-based process analytical system for production of salvianolic acid for injection and relative discussion].

    PubMed

    Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang

    2016-10-01

Currently, near infrared spectroscopy (NIRS) is considered an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection is introduced. The design of the process analytical system is described in detail, including the selection of monitored processes and testing mode, and potential risks that should be avoided. Moreover, the development of the related technologies is also presented, including the establishment of monitoring methods for the elution of polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the authors' research and work experience, several issues in the application of NIRS to process monitoring and control in TCM production are then raised, and some potential solutions are also discussed. The issues include building the technical team for the process analytical system, the design of the process analytical system in the manufacture of TCM products, standardization of the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospect for the application of NIRS in the TCM industry is put forward. Copyright© by the Chinese Pharmaceutical Association.

  11. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

The most important objectives frequently encountered in bioanalytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article surveys several analytical techniques for the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The study revealed that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques, protein binding, and drastic interferences from the complex matrices of real samples such as blood plasma and serum.

  12. Automated Solid Phase Extraction (SPE) LC/NMR Applied to the Structural Analysis of Extractable Compounds from a Pharmaceutical Packaging Material of Construction.

    PubMed

    Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C

    2013-01-01

    The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. 
Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, while mass spectrometry-based analytical techniques have limitations for this application, newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.

  13. CENTRIFUGAL VIBRATION TEST OF RC PILE FOUNDATION

    NASA Astrophysics Data System (ADS)

    Higuchi, Shunichi; Tsutsumiuchi, Takahiro; Otsuka, Rinna; Ito, Koji; Ejiri, Joji

To evaluate the seismic performance of underground or foundation structures, their nonlinear responses must be clarified by soil-structure interaction analysis. In this research, centrifuge shake table tests of a reinforced concrete pile foundation installed in liquefiable ground were conducted. Finite element analyses of the tests were then conducted to confirm the applicability of the analytical method by comparing the experimental and analytical results.

  14. Cell bioprocessing in space - Applications of analytical cytology

    NASA Technical Reports Server (NTRS)

    Todd, P.; Hymer, W. C.; Goolsby, C. L.; Hatfield, J. M.; Morrison, D. R.

    1988-01-01

    Cell bioprocessing experiments in space are reviewed and the development of on-board cell analytical cytology techniques that can serve such experiments is discussed. Methods and results of experiments involving the cultivation and separation of eukaryotic cells in space are presented. It is suggested that an advanced cytometer should be developed for the quantitative analysis of large numbers of specimens of suspended eukaryotic cells and bioparticles in experiments on the Space Station.

  15. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    NASA Technical Reports Server (NTRS)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  16. Managing Technical and Cost Uncertainties During Product Development in a Simulation-Based Design Environment

    NASA Technical Reports Server (NTRS)

    Karandikar, Harsh M.

    1997-01-01

    An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.

  17. Modern data science for analytical chemical data - A comprehensive review.

    PubMed

    Szymańska, Ewa

    2018-10-22

Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety and velocity. New methodologies, approaches and methods are being proposed not only by chemometrics but also by other data science communities to extract relevant information from big datasets and provide their value to different applications. Besides the common goal of big data analysis, different perspectives on and terms for big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data science fields together with their data type-specific and generic challenges. Firstly, common data science terms used in different data science fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects like assessing data quality, selecting data pre-processing strategies, data visualization and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets briefly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Enzyme-based logic gates and circuits-analytical applications and interfacing with electronics.

    PubMed

    Katz, Evgeny; Poghossian, Arshak; Schöning, Michael J

    2017-01-01

    The paper is an overview of enzyme-based logic gates and their short circuits, with specific examples of Boolean AND and OR gates, and concatenated logic gates composed of multi-step enzyme-biocatalyzed reactions. Noise formation in the biocatalytic reactions and its decrease by adding a "filter" system, converting convex to sigmoid response function, are discussed. Despite the fact that the enzyme-based logic gates are primarily considered as components of future biomolecular computing systems, their biosensing applications are promising for immediate practical use. Analytical use of the enzyme logic systems in biomedical and forensic applications is discussed and exemplified with the logic analysis of biomarkers of various injuries, e.g., liver injury, and with analysis of biomarkers characteristic of different ethnicity found in blood samples on a crime scene. Interfacing of enzyme logic systems with modified electrodes and semiconductor devices is discussed, giving particular attention to the interfaces functionalized with signal-responsive materials. Future perspectives in the design of the biomolecular logic systems and their applications are discussed in the conclusion. Graphical Abstract Various applications and signal-transduction methods are reviewed for enzyme-based logic systems.
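The gate-plus-filter idea above can be pictured numerically: a raw enzyme AND gate has a convex response (partial inputs still leak some output), and a sigmoid "filter" suppresses that leakage so only the (1,1) input crosses the logic threshold. The sketch below is a generic illustration, not the paper's chemistry; the product-response gate, Hill-type filter, and the 0.2 residual-signal level for logic-0 inputs are all assumptions.

```python
# Illustrative enzyme AND gate with a sigmoid noise filter (generic model,
# not the specific biocatalytic cascades reviewed in the paper).
def and_gate_raw(a, b):
    return a * b                      # convex response: noise leaks through

def hill_filter(y, k=0.5, n=4):
    return y**n / (k**n + y**n)       # sigmoid: ~0 below k, ~1 above k

THRESHOLD = 0.5
truth_table = {}
for inputs in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    # Logic-0 modeled as a small residual (noise) signal, logic-1 as 1.0.
    a, b = (0.2 if x == 0 else 1.0 for x in inputs)
    out = hill_filter(and_gate_raw(a, b))
    truth_table[inputs] = 1 if out > THRESHOLD else 0
```

Without the filter, the (0,1) and (1,0) cases would still output 0.2 of the full signal; the sigmoid pushes them to effectively zero, which is the noise-reduction role the review describes.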

  19. On the Modeling and Management of Cloud Data Analytics

    NASA Astrophysics Data System (ADS)

    Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni

A new era is dawning where vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good run-time performance to data analytics workloads is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their patterns of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.
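The Map-Reduce paradigm used as the running example above can be sketched in-process: map emits (key, value) pairs, a shuffle groups them by key, and reduce aggregates each group. The word-count example and all names below are illustrative, not from the paper.

```python
from collections import defaultdict
from itertools import chain

# Minimal in-process sketch of the Map-Reduce pattern (word count).
def map_phase(record):
    for word in record.split():
        yield word, 1                 # emit (key, value) pairs

def shuffle(pairs):
    groups = defaultdict(list)        # group all values by key
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    return key, sum(values)           # aggregate each key's group

records = ["cloud data analytics", "data analytics in the cloud"]
pairs = chain.from_iterable(map_phase(r) for r in records)
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
```

In a real cluster the map and reduce calls run on different machines and the shuffle moves data over the network, which is exactly where the placement and migration questions studied in the paper arise.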

  20. Recent trends in sorption-based sample preparation and liquid chromatography techniques for food analysis.

    PubMed

    V Soares Maciel, Edvaldo; de Toffoli, Ana Lúcia; Lanças, Fernando Mauro

    2018-04-20

The accelerated rise of the world's population has increased food consumption, demanding more rigor in the control of residues and contaminants in food-based products marketed for human consumption. In view of the complexity of most food matrices, including fruits, vegetables, different types of meat, beverages, among others, a sample preparation step is important to provide more reliable results when combined with HPLC separations. An adequate sample preparation step before the chromatographic analysis is mandatory for obtaining higher precision and accuracy and improving the extraction of the target analytes, one of the priorities in analytical chemistry. The recent discovery of new materials such as ionic liquids, graphene-derived materials, molecularly imprinted polymers, restricted access media, magnetic nanoparticles, and carbonaceous nanomaterials has provided high sensitivity and selectivity in an extensive variety of applications. These materials, as well as their several possible combinations, have been demonstrated to be highly appropriate for the extraction of different analytes in complex samples such as food products. The main characteristics and applications of these new materials in food analysis are presented and discussed in this paper. Another topic discussed in this review covers the main advantages and limitations of sample preparation microtechniques, as well as their off-line and on-line combination with HPLC for food analysis. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Qualitative Analysis, with Periodicity, for "Real" Solutions.

    ERIC Educational Resources Information Center

    Rich, Ronald L.

    1984-01-01

    Presents an outline of group separations for a nonhydrogen sulfide analytical scheme applicable to all metallic elements (Bromide scheme). Also presents another outline of an abbreviated and modified version (Iodide scheme) designed for emphasis on nutritionally important metals, with special attention to 10 cations. (JM)

  2. Twenty-first century macro-trends in the institutional fabric of science: bibliometric monitoring and analysis.

    PubMed

    Tijssen, Robert J W; Winnink, Jos

Some say that world science has become more 'applied', or at least more 'application-oriented', in recent years. Replacing the ill-defined distinction between 'basic research' and 'applied research', we introduce 'research application orientation' domains as an alternative conceptual and analytical framework for examining research output growth patterns. To distinguish possible developmental trajectories we define three institutional domains: 'university', 'industry', 'hospitals'. Our macro-level bibliometric analysis takes a closer look at general trends within and across some 750 of the world's largest research-intensive universities. To correct for database changes, our time-series analysis was applied to both a fixed journal set (same research journals and conference proceedings over time) and a dynamic journal set (changing set of publication outlets). We find that output growth in the 'hospital research orientation' has significantly outpaced the other two application domains, especially since 2006/2007. This happened mainly because of the introduction of new publication outlets in the WoS, but also partially because some universities, especially in China, seem to have become more visible in this domain. Our analytical approach needs further broadening and deepening to provide a more definitive answer whether hospitals and the medical sector are becoming increasingly dominant as a domain of scientific knowledge production and an environment for research applications.

  3. Laser Ablation in situ (U-Th-Sm)/He and U-Pb Double-Dating of Apatite and Zircon: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    McInnes, B.; Danišík, M.; Evans, N.; McDonald, B.; Becker, T.; Vermeesch, P.

    2015-12-01

    We present a new laser-based technique for rapid, quantitative and automated in situ microanalysis of U, Th, Sm, Pb and He for applications in geochronology, thermochronometry and geochemistry (Evans et al., 2015). This novel capability permits a detailed interrogation of the time-temperature history of rocks containing apatite, zircon and other accessory phases by providing both (U-Th-Sm)/He and U-Pb ages (+trace element analysis) on single crystals. In situ laser microanalysis offers several advantages over conventional bulk crystal methods in terms of safety, cost, productivity and spatial resolution. We developed and integrated a suite of analytical instruments including a 193 nm ArF excimer laser system (RESOlution M-50A-LR), a quadrupole ICP-MS (Agilent 7700s), an Alphachron helium mass spectrometry system and swappable flow-through and ultra-high vacuum analytical chambers. The analytical protocols include the following steps: mounting/polishing in PFA Teflon using methods similar to those adopted for fission track etching; laser He extraction and analysis using a 2 s ablation at 5 Hz and 2-3 J/cm2 fluence; He pit volume measurement using atomic force microscopy; and U-Th-Sm-Pb (plus optional trace element) analysis using traditional laser ablation methods. The major analytical challenges for apatite include the low U, Th and He contents relative to zircon and the elevated common Pb content. On the other hand, apatite typically has less extreme and less complex zoning of parent isotopes (primarily U and Th). A freeware application has been developed for determining (U-Th-Sm)/He ages from the raw analytical data, and Iolite software was used for U-Pb age and trace element determination.
In situ double-dating has successfully replicated conventional U-Pb and (U-Th)/He age variations in xenocrystic zircon from the diamondiferous Ellendale lamproite pipe, Western Australia, and increased zircon analytical throughput by a factor of 50 over conventional methods. Reference: Evans, N.J., McInnes, B.I.A., McDonald, B., Becker, T., Vermeesch, P., Danišík, M., Shelley, M., Marillo-Sialer, E. and Patterson, D. (2015). An in situ technique for (U-Th-Sm)/He and U-Pb double dating. Journal of Analytical Atomic Spectrometry, 30, 1636-1645.
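
    The (U-Th-Sm)/He age equation that such software must solve has no closed-form solution for t, so it is inverted numerically. The freeware application itself is not described here; the following is only a minimal sketch of the underlying computation, using standard decay constants and a simple bisection solver (function names and inputs are illustrative assumptions):

```python
import math

# Standard decay constants (1/yr) for 238U, 235U, 232Th, 147Sm
L238, L235, L232, L147 = 1.55125e-10, 9.8485e-10, 4.9475e-11, 6.539e-12

def he_produced(t, u238, u235, th232, sm147):
    """Radiogenic 4He produced after t years, in the same units as the
    parent nuclide amounts; 8, 7, 6 and 1 are the alpha yields per decay chain."""
    return (8 * u238 * (math.exp(L238 * t) - 1)
            + 7 * u235 * (math.exp(L235 * t) - 1)
            + 6 * th232 * (math.exp(L232 * t) - 1)
            + 1 * sm147 * (math.exp(L147 * t) - 1))

def he_age(he, u238, u235, th232, sm147, tol=1.0):
    """Invert the age equation for t by bisection; he_produced is monotone in t."""
    lo, hi = 0.0, 4.6e9  # bracket the age between zero and Earth's age
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if he_produced(mid, u238, u235, th232, sm147) < he:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    A round trip (synthesize He for a known age, then invert) recovers the input age to within the bisection tolerance. Alpha-ejection and zonation corrections, which a production tool must apply, are omitted from this sketch.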

  4. Restructuring the rotor analysis program C-60

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The continuing evolution of the rotary wing industry demands increasing analytical capabilities. To keep up with this demand, software must be structured to accommodate change. The approach discussed for meeting this demand is to restructure an existing analysis. The motivational factors, basic principles, application techniques, and practical lessons from experience with this restructuring effort are reviewed.

  5. A Review of Psychometric Data Analysis and Applications in Modelling of Academic Achievement in Tertiary Education

    ERIC Educational Resources Information Center

    Gray, Geraldine; McGuinness, Colm; Owende, Philip; Carthy, Aiden

    2014-01-01

    Increasing college participation rates and diversity in the student population are posing a challenge to colleges in their attempts to help learners achieve their full academic potential. Learning analytics is an evolving discipline with capability for educational data analysis that could enable better understanding of the learning process, and…

  6. Application of Haddon's matrix in qualitative research methodology: an experience in burns epidemiology.

    PubMed

    Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza

    2012-01-01

    Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussion. Haddon's matrix was used to develop an interview guide and was also applied during the analysis phase. The main analysis clusters were pre-event level/human (including risky behaviors, belief and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment and event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology at both the data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may possibly be useful for prevention or future quantitative research.

  7. Application of near infrared spectroscopy to the analysis and fast quality assessment of traditional Chinese medicinal products

    PubMed Central

    Zhang, Chao; Su, Jinghua

    2014-01-01

    Near infrared spectroscopy (NIRS) has been widely applied in both qualitative and quantitative analysis. There is growing interest in its application to traditional Chinese medicine (TCM) and a review of recent developments in the field is timely. To present an overview of recent applications of NIRS to the identification, classification and analysis of TCM products, studies describing the application of NIRS to TCM products are classified into those involving qualitative and quantitative analysis. In addition, the application of NIRS to the detection of illegal additives and the rapid assessment of quality of TCMs by fast inspection are also described. This review covers over 100 studies emphasizing the application of NIRS in different fields. Furthermore, basic analytical principles and specific examples are used to illustrate the feasibility and effectiveness of NIRS in pattern identification. NIRS provides an effective and powerful tool for the qualitative and quantitative analysis of TCM products. PMID:26579382

  8. Application of non-traditional stable isotopes in analytical ecogeochemistry assessed by MC ICP-MS--A critical review.

    PubMed

    Irrgeher, Johanna; Prohaska, Thomas

    2016-01-01

    Analytical ecogeochemistry is an evolving scientific field dedicated to the development of analytical methods and tools and their application to ecological questions. Traditional stable isotopic systems have been widely explored and have undergone continuous development during the last century. The variations of the isotopic composition of light elements (H, O, N, C, and S) have provided the foundation of stable isotope analysis followed by the analysis of traditional geochemical isotope tracers (e.g., Pb, Sr, Nd, Hf). Questions in a considerable diversity of scientific fields have been addressed, many of which can be assigned to the field of ecogeochemistry. Over the past 15 years, other stable isotopes (e.g., Li, Zn, Cu, Cl) have emerged gradually as novel tools for the investigation of scientific topics that arise in ecosystem research and have enabled novel discoveries and explorations. These systems are often referred to as non-traditional isotopes. The small isotopic differences of interest that are increasingly being addressed for a growing number of isotopic systems represent a challenge to the analytical scientist and push the limits of today's instruments constantly. This underlines the importance of a metrologically sound concept of analytical protocols and procedures and a solid foundation of data processing strategies and uncertainty considerations before these small isotopic variations can be interpreted in the context of applied ecosystem research. This review focuses on the development of isotope research in ecogeochemistry, the requirements for successful detection of small isotopic shifts, and highlights the most recent and innovative applications in the field.

  9. [Blood sampling using "dried blood spot": a clinical biology revolution underway?].

    PubMed

    Hirtz, Christophe; Lehmann, Sylvain

    2015-01-01

    Blood testing using the dried blood spot (DBS) has been used in clinical analysis since the 1960s, mainly within the framework of neonatal screening (the Guthrie test). Since then, numerous analytes such as nucleic acids, small molecules or lipids have been successfully measured on DBS. While this pre-analytical method represents an interesting alternative to classic blood sampling, its routine use is still limited. We review here the different clinical applications of blood sampling on DBS and assess its future place, supported by new methods of analysis such as liquid chromatography-mass spectrometry (LC-MS).

  10. Role of Mass Spectrometry in Clinical Endocrinology.

    PubMed

    Ketha, Siva S; Singh, Ravinder J; Ketha, Hemamalini

    2017-09-01

    The advent of mass spectrometry into the clinical laboratory has led to an improvement in clinical management of several endocrine diseases. Liquid chromatography tandem mass spectrometry found some of its first clinical applications in the diagnosis of inborn errors of metabolism, in quantitative steroid analysis, and in drug analysis laboratories. Mass spectrometry assays offer analytical sensitivity and specificity that is superior to immunoassays for many analytes. This article highlights several areas of clinical endocrinology that have witnessed the use of liquid chromatography tandem mass spectrometry to improve clinical outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business, requiring different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  12. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business, requiring different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  13. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  14. Advantages and Challenges of Dried Blood Spot Analysis by Mass Spectrometry Across the Total Testing Process

    PubMed Central

    Zakaria, Rosita; Allen, Katrina J.; Koplin, Jennifer J.; Roche, Peter

    2016-01-01

    Introduction Through the introduction of advanced analytical techniques and improved throughput, the scope of dried blood spot testing utilising mass spectrometric methods has broadly expanded. Clinicians and researchers have become very enthusiastic about the potential applications of dried blood spot based mass spectrometry. Analysts, on the other hand, face challenges of sensitivity, reproducibility and overall accuracy of dried blood spot quantification. In this review, we aim to bring together these two facets to discuss the advantages and current challenges of non-newborn screening applications of dried blood spot quantification by mass spectrometry. Methods To address these aims we performed a key word search of the PubMed and MEDLINE online databases in conjunction with individual manual searches to gather information. Keywords for the initial search included "blood spot" and "mass spectrometry", while excluding "newborn" and "neonate". In addition, searches were restricted to English-language and human studies. There was no time period limit applied. Results As a result of these selection criteria, 194 references were identified for review. For presentation, this information is divided into: 1) clinical applications; and 2) analytical considerations across the total testing process (pre-analytical, analytical and post-analytical). Conclusions DBS analysis using MS applications is now broadly applied, with drug monitoring for both therapeutic and toxicological analysis being the most extensively reported. Several parameters can affect the accuracy of DBS measurement, and further bridge experiments are required to develop adjustment rules for comparability between dried blood spot measures and the equivalent serum/plasma values. Likewise, the establishment of independent reference intervals for the dried blood spot sample matrix is required. PMID:28149263

  15. Review of methodological and experimental LIBS techniques for coal analysis and their application in power plants in China

    NASA Astrophysics Data System (ADS)

    Zhao, Yang; Zhang, Lei; Zhao, Shu-Xia; Li, Yu-Fang; Gong, Yao; Dong, Lei; Ma, Wei-Guang; Yin, Wang-Bao; Yao, Shun-Chun; Lu, Ji-Dong; Xiao, Lian-Tuan; Jia, Suo-Tang

    2016-12-01

    Laser-induced breakdown spectroscopy (LIBS) is an emerging analytical spectroscopy technique. This review presents the main recent developments in China regarding the implementation of LIBS for coal analysis. The paper mainly focuses on the progress of the past few years in the fundamentals, data pretreatment, calibration models, and experimental issues of LIBS and its application to coal analysis. Many important domestic studies focusing on coal quality analysis have been conducted. For example, a proposed novel hybrid quantification model can provide more reproducible quantitative analytical results; the model obtained average absolute errors (AREs) of 0.42%, 0.05%, 0.07%, and 0.17% for carbon, hydrogen, volatiles, and ash, respectively, and 0.07 MJ/kg for heat value. Atomic/ionic emission lines and molecular bands, such as CN and C2, have been employed to generate more accurate analysis results, achieving an ARE of 0.26% and a 0.16% limit of detection (LOD) for the prediction of unburned carbon in fly ashes. Both laboratory and on-line LIBS apparatuses have been developed for field application in coal-fired power plants. We consider that both the accuracy and the repeatability of the elemental and proximate analysis of coal have increased significantly, and further efforts will be devoted to realizing large-scale commercialization of coal quality analyzers in China.
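
    The figure of merit quoted above (ARE, reported in the same units as the measured quantity, e.g. wt% for proximate analysis) is straightforward to compute; a minimal sketch with hypothetical predicted and reference values:

```python
def average_absolute_error(predicted, reference):
    """Mean absolute deviation of predictions from reference values,
    in the same units as the inputs (e.g. wt% for carbon content)."""
    if len(predicted) != len(reference):
        raise ValueError("predicted and reference must have equal length")
    return sum(abs(p - r) for p, r in zip(predicted, reference)) / len(predicted)
```

    For instance, carbon predictions of 80.4% and 79.6% against a reference of 80.0% give an ARE of 0.4%.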

  16. A Study of the Applicability of Atomic Emission Spectroscopy (AES), Fourier Transform Infrared (FT-IR) Spectroscopy, Direct Reading and Analytical Ferrography on High Performance Aircraft Engine Lubricating Oils

    DTIC Science & Technology

    1998-01-01

    Ferrography on High Performance Aircraft Engine Lubricating Oils. Allison M. Toms, Sharon O. Hem, Tim Yarborough, Joint Oil Analysis Program Technical... turbine engines by spectroscopy (AES and FT-IR) and direct reading and analytical ferrography. A statistical analysis of the data collected is... presented. Key Words: analytical ferrography; atomic emission spectroscopy; condition monitoring; direct reading ferrography; Fourier transform infrared

  17. Recent trends in analytical methods and separation techniques for drugs of abuse in hair.

    PubMed

    Baciu, T; Borrull, F; Aguilar, C; Calull, M

    2015-01-26

    Hair analysis of drugs of abuse has been a subject of growing interest from a clinical, social and forensic perspective for years because of the broad time detection window after intake in comparison to urine and blood analysis. Over the last few years, hair analysis has gained increasing attention and recognition for the retrospective investigation of drug abuse in a wide variety of contexts, shown by the large number of applications developed. This review aims to provide an overview of the state of the art and the latest trends used in the literature from 2005 to the present in the analysis of drugs of abuse in hair, with a special focus on separation analytical techniques and their hyphenation with mass spectrometry detection. The most recently introduced sample preparation techniques are also addressed in this paper. The main strengths and weaknesses of all of these approaches are critically discussed by means of relevant applications. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
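
    Among the normalization strategies such a review covers, total-intensity (constant-sum) normalization is one of the simplest: each sample's metabolite intensities are divided by that sample's total signal, so samples with different total amounts become comparable. A minimal sketch, assuming a samples-by-metabolites intensity matrix (more robust schemes, such as probabilistic quotient normalization, exist):

```python
import numpy as np

def total_sum_normalize(X):
    """Scale each row (sample) of a samples x metabolites intensity
    matrix so that its intensities sum to 1."""
    X = np.asarray(X, dtype=float)
    return X / X.sum(axis=1, keepdims=True)
```

    After normalization, a metabolite's value is its fraction of the sample's total signal rather than a raw intensity, which removes sample-to-sample variation in total amount at the cost of coupling all features to each other.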

  19. ExaSAT: An exascale co-design tool for performance modeling

    DOE PAGES

    Unat, Didem; Chan, Cy; Zhang, Weiqun; ...

    2015-02-09

    One of the emerging challenges to designing HPC systems is understanding and projecting the requirements of exascale applications. In order to determine the performance consequences of different hardware designs, analytic models are essential because they can provide fast feedback to the co-design centers and chip designers without costly simulations. However, current attempts to analytically model program performance typically rely on the user manually specifying a performance model. Here we introduce the ExaSAT framework that automates the extraction of parameterized performance models directly from source code using compiler analysis. The parameterized analytic model enables quantitative evaluation of a broad range of hardware design trade-offs and software optimizations on a variety of different performance metrics, with a primary focus on data movement as a metric. Finally, we demonstrate the ExaSAT framework's ability to perform deep code analysis of a proxy application from the Department of Energy Combustion Co-design Center to illustrate its value to the exascale co-design process. ExaSAT analysis provides insights into the hardware and software trade-offs and lays the groundwork for exploring a more targeted set of design points using cycle-accurate architectural simulators.
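
    ExaSAT's compiler-extracted models are far richer than can be shown here, but the flavor of a parameterized analytic performance model with data movement as the primary metric can be sketched with a simple roofline-style bound (all parameter names and values below are hypothetical, not ExaSAT's actual model):

```python
def predicted_kernel_time(flops, bytes_moved, peak_flops, bandwidth):
    """Lower bound on kernel runtime: the kernel can go no faster than
    either its compute or its data movement allows (perfect overlap assumed)."""
    return max(flops / peak_flops, bytes_moved / bandwidth)
```

    For a kernel moving 1 TB at 1 TB/s while performing 2 TFLOP on a 1 PFLOP/s machine, the model predicts a 1 s bandwidth-bound runtime, signalling that data movement, not compute, constrains this design point.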

  20. Teaching Analytical Method Development in an Undergraduate Instrumental Analysis Course

    ERIC Educational Resources Information Center

    Lanigan, Katherine C.

    2008-01-01

    Method development and assessment, central components of carrying out chemical research, require problem-solving skills. This article describes a pedagogical approach for teaching these skills through the adaptation of published experiments and application of group-meeting style discussions to the curriculum of an undergraduate instrumental…

  1. Future Roles for Autonomous Vertical Lift in Disaster Relief and Emergency Response

    NASA Technical Reports Server (NTRS)

    Young, Larry A.

    2006-01-01

    System analysis concepts are applied to the assessment of potential collaborative contributions of autonomous system and vertical lift (a.k.a. rotorcraft, VTOL, powered-lift, etc.) technologies to the important, and perhaps underemphasized, application domain of disaster relief and emergency response. In particular, an analytic framework is outlined whereby system design functional requirements for an application domain can be derived from defined societal good goals and objectives.

  2. Methods for determination of inorganic substances in water and fluvial sediments

    USGS Publications Warehouse

    Fishman, Marvin J.; Friedman, Linda C.

    1989-01-01

    Chapter A1 of the laboratory manual contains methods used by the U.S. Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, the total recoverable and total concentrations of constituents in water-suspended sediment samples, and the recoverable and total concentrations of constituents in samples of bottom material. The introduction to the manual includes essential definitions and a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including the accuracy and precision of analyses, the use of standard-reference water samples, and the operation of an effective quality-assurance program. Methods for sample preparation and pretreatment are given also. A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods of these techniques are arranged alphabetically by constituent. For each method, the general topics covered are the application, the principle of the method, the interferences, the apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 126 methods are given for the determination of 70 inorganic constituents and physical properties of water, suspended sediment, and bottom material.
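
    The manual's rule of reporting results to an appropriate number of significant figures can be mechanized; a small hypothetical helper (not part of the manual) for rounding a value to n significant figures:

```python
import math

def round_sig(x, n):
    """Round x to n significant figures."""
    if x == 0:
        return 0.0
    # Position of the last significant digit relative to the decimal point
    decimals = n - 1 - int(math.floor(math.log10(abs(x))))
    return round(x, decimals)
```

    For example, rounding 123456 to three significant figures gives 123000, and 0.012345 gives 0.0123.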

  3. Methods for determination of inorganic substances in water and fluvial sediments

    USGS Publications Warehouse

    Fishman, Marvin J.; Friedman, Linda C.

    1985-01-01

    Chapter A1 of the laboratory manual contains methods used by the Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, total recoverable and total concentrations of constituents in water-suspended sediment samples, and recoverable and total concentrations of constituents in samples of bottom material. Essential definitions are included in the introduction to the manual, along with a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including accuracy and precision of analyses, the use of standard reference water samples, and the operation of an effective quality assurance program. Methods for sample preparation and pretreatment are given also. A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods involving these techniques are arranged alphabetically according to constituent. For each method given, the general topics covered are application, principle of the method, interferences, apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 125 methods are given for the determination of 70 different inorganic constituents and physical properties of water, suspended sediment, and bottom material.

  4. Research education: findings of a study of teaching-learning research using multiple analytical perspectives.

    PubMed

    Vandermause, Roxanne; Barbosa-Leiker, Celestina; Fritz, Roschelle

    2014-12-01

    This multimethod, qualitative study provides results for educators of nursing doctoral students to consider. Combining the expertise of an empirical analytical researcher (who uses statistical methods) and an interpretive phenomenological researcher (who uses hermeneutic methods), the authors designed a course that would place doctoral students in the midst of multiparadigmatic discussions while learning fundamental research methods. Field notes and iterative analytical discussions led to patterns and themes that highlight the value of this innovative pedagogical application. Using content analysis and interpretive phenomenological approaches, and working together with one of the students, the authors analyzed data from field notes recorded in real time over the period the course was offered. This article describes the course and the study analysis, and offers the pedagogical experience as transformative. A link to a sample syllabus is included in the article. The results encourage nurse educators of doctoral nursing students to focus educational practice on multiple methodological perspectives. Copyright 2014, SLACK Incorporated.

  5. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accurate quantitation with external calibration, the lack of any requirement for identical reference materials, high precision and accuracy when properly validated, and the ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experimental evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product research and other areas. PMID:22482996
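
    The core qHNMR relationship is that a 1H signal's integral is proportional to the number of contributing nuclei and the molar amount of the compound. With an internal calibrant (external calibration follows the same proportionality), the analyte mass follows from relative integrals; a minimal sketch with illustrative argument names:

```python
def qhnmr_mass(I_a, I_cal, N_a, N_cal, M_a, M_cal, m_cal, purity_cal=1.0):
    """Analyte mass from 1H integrals measured relative to a calibrant.

    I_*: peak integrals; N_*: number of protons contributing to each signal;
    M_*: molar masses (g/mol); m_cal: weighed calibrant mass;
    purity_cal: calibrant purity as a fraction.
    """
    return (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * m_cal * purity_cal
```

    For example, a signal twice as large as the calibrant's but arising from twice as many protons, with equal molar masses, corresponds to the same mass as the calibrant.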

  6. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new database approaches designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide the information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and frame of reference for a systematic approach. 
This talk will provide insights into big data analytics methods in the context of science within various communities, and offer different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically but also scientifically feasible. The lighthouse goal of the group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  7. Instrumental Analysis in Environmental Chemistry - Liquid and Solid Phase Detection Systems

    ERIC Educational Resources Information Center

    Stedman, Donald H.; Meyers, Philip A.

    1974-01-01

    This is the second of two reviews dealing with analytical methods applicable to environmental chemistry. Methods are discussed under gas, liquid, or solid depending upon the state of the analyte during detection. (RH)

  8. An Analysis of Machine- and Human-Analytics in Classification.

    PubMed

    Tam, Gary K L; Kothari, Vivek; Chen, Min

    2017-01-01

    In this work, we present a study that traces the technical and cognitive processes in two visual analytics applications to a common theoretic model of soft knowledge that may be added into a visual analytics process for constructing a decision-tree model. Both case studies involved the development of classification models based on the "bag of features" approach. Both compared a visual analytics approach using parallel coordinates with a machine-learning approach using information theory. Both found that the visual analytics approach had some advantages over the machine learning approach, especially when sparse datasets were used as the ground truth. We examine various possible factors that may have contributed to such advantages, and collect empirical evidence for supporting the observation and reasoning of these factors. We propose an information-theoretic model as a common theoretic basis to explain the phenomena exhibited in these two case studies. Together we provide interconnected empirical and theoretical evidence to support the usefulness of visual analytics.

  9. Multivariate analysis in the pharmaceutical industry: enabling process understanding and improvement in the PAT and QbD era.

    PubMed

    Ferreira, Ana P; Tobyn, Mike

    2015-01-01

    In the pharmaceutical industry, chemometrics is rapidly establishing itself as a tool that can be used at every step of product development and beyond: from early development to commercialization. This set of multivariate analysis methods allows the extraction of information contained in large, complex data sets, thus contributing to increased product and process understanding, which is at the core of the Food and Drug Administration's Process Analytical Technology (PAT) Guidance for Industry and the International Conference on Harmonisation's Pharmaceutical Development guideline (Q8). This review is aimed at providing pharmaceutical industry professionals with an introduction to multivariate analysis and how it is being adopted and implemented by companies in the transition from "quality-by-testing" to "quality-by-design". It starts with an introduction to multivariate analysis and the two methods most commonly used: principal component analysis and partial least squares regression, their advantages, common pitfalls and requirements for their effective use. That is followed by an overview of the diverse areas of application of multivariate analysis in the pharmaceutical industry: from the development of real-time analytical methods to definition of the design space and control strategy, from formulation optimization during development to the application of quality-by-design principles to improve manufacture of existing commercial products.

  10. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; Ilie, marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-T distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms coupled with the Student-T distribution can encompass the exact solution.
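The core of a Student-T uncertainty estimate of this kind is a confidence interval built from a small number of repeated runs. The following is a minimal sketch, not the paper's actual procedure; the run values and the critical t-value are illustrative assumptions.

```python
import math
import statistics

def t_confidence_interval(samples, t_value):
    """Mean and half-width of a confidence interval using Student's t.

    t_value is the two-sided critical value for len(samples) - 1 degrees
    of freedom at the chosen confidence level (looked up from a table).
    """
    n = len(samples)
    mean = statistics.mean(samples)
    s = statistics.stdev(samples)          # sample standard deviation
    half_width = t_value * s / math.sqrt(n)
    return mean, half_width

# Hypothetical centreline velocities (m/s) from five CFD runs with
# perturbed inputs; t = 2.776 is the 95% critical value for 4 dof.
runs = [1.02, 0.98, 1.01, 0.99, 1.00]
mean, hw = t_confidence_interval(runs, t_value=2.776)
```

The exact analytical solution would then be checked against the band `mean ± hw`.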

  11. Miniaturised wireless smart tag for optical chemical analysis applications.

    PubMed

    Steinberg, Matthew D; Kassal, Petar; Tkalčec, Biserka; Murković Steinberg, Ivana

    2014-01-01

    A novel miniaturised photometer has been developed as an ultra-portable and mobile analytical chemical instrument. The low-cost photometer presents a paradigm shift in mobile chemical sensor instrumentation because it is built around a contactless smart card format. The photometer tag is based on the radio-frequency identification (RFID) smart card system, which provides short-range wireless data and power transfer between the photometer and a proximal reader, and which allows the reader to also energise the photometer by near field electromagnetic induction. RFID is set to become a key enabling technology of the Internet-of-Things (IoT), hence devices such as the photometer described here will enable numerous mobile, wearable and vanguard chemical sensing applications in the emerging connected world. In the work presented here, we demonstrate the characterisation of a low-power RFID wireless sensor tag with an LED/photodiode-based photometric input. The performance of the wireless photometer has been tested through two different model analytical applications. The first is photometry in solution, where colour intensity as a function of dye concentration was measured. The second is an ion-selective optode system in which potassium ion concentrations were determined by using previously well characterised bulk optode membranes. The analytical performance of the wireless photometer smart tag is clearly demonstrated by these optical absorption-based analytical experiments, with excellent data agreement to a reference laboratory instrument. © 2013 Elsevier B.V. All rights reserved.
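An LED/photodiode photometric measurement of the kind described reduces to the Beer-Lambert law. The sketch below is a generic illustration under assumed readings and an assumed calibration slope, not the authors' firmware.

```python
import math

def absorbance(i_sample, i_blank):
    """Absorbance from photodiode readings: A = -log10(I / I0)."""
    return -math.log10(i_sample / i_blank)

def concentration(a, slope):
    """Beer-Lambert calibration A = slope * c, so c = A / slope.

    slope (absorbance per mM) would come from a calibration series
    measured on the same LED/photodiode pair.
    """
    return a / slope

# Hypothetical counts: blank transmits 1000, the dye solution 250.
a = absorbance(250.0, 1000.0)
c = concentration(a, slope=0.30)   # concentration in mM
```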

  12. An investigation of the feasibility of improving oculometer data analysis through application of advanced statistical techniques

    NASA Technical Reports Server (NTRS)

    Rana, D. S.

    1980-01-01

    The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.

  13. Accelerator-based analytical technique in the evaluation of some Nigeria’s natural minerals: Fluorite, tourmaline and topaz

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Ige, O. A.; Mazzoli, C.; Ceccato, D.; Akintunde, J. A.; De Poli, M.; Moschini, G.

    2005-10-01

    For the first time, the complementary accelerator-based analytical techniques of PIXE and electron microprobe analysis (EMPA) were employed for the characterization of some of Nigeria's natural minerals, namely fluorite, tourmaline and topaz. These minerals occur in different areas in Nigeria. The minerals are mainly used as gemstones and for other scientific and technological applications and therefore are very important. There is a need to characterize them to know the quality of these gemstones and to update the geochemical data on them, geared towards useful applications. PIXE analysis was carried out using the 1.8 MeV collimated proton beam from the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy. The novel results, which show many elements at different concentrations in these minerals, are presented and discussed.

  14. Airborne chemistry: acoustic levitation in chemical analysis.

    PubMed

    Santesson, Sabina; Nilsson, Staffan

    2004-04-01

    This review with 60 references describes a unique path to miniaturisation, that is, the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitation of small volumes of sample by means of a levitation technique can be used as a way to avoid solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 microL volume range and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.

  15. Fifty years of solid-phase extraction in water analysis--historical development and overview.

    PubMed

    Liska, I

    2000-07-14

    The use of an appropriate sample handling technique is a must in an analysis of organic micropollutants in water. The efforts to use a solid phase for the recovery of analytes from a water matrix prior to their detection have a long history. Since the first experimental trials using activated carbon filters that were performed 50 years ago, solid-phase extraction (SPE) has become an established sample preparation technique. The initial experimental applications of SPE resulted in widespread use of this technique in current water analysis and also in the adoption of SPE into standardized analytical methods. During the decades of its evolution, chromatographers became aware of the advantages of SPE and, despite many innovations that appeared in the last decade, new SPE developments are still expected in the future. A brief overview of 50 years of the history of the use of SPE in organic trace analysis of water is given in the present paper.

  16. Recent advances and applications of gas chromatography vacuum ultraviolet spectroscopy.

    PubMed

    Santos, Inês C; Schug, Kevin A

    2017-01-01

    The vacuum ultraviolet spectrophotometer was developed recently as an alternative to existing gas chromatography detectors. This detector measures the absorption of gas-phase chemical species in the range of 120-240 nm, where all chemical compounds present unique absorption spectra. Therefore, qualitative analysis can be performed, and quantification follows standard Beer-Lambert law principles. Different fields of application, such as petrochemical, food, and environmental analysis, have been explored. Commonly demonstrated is the capability for facile deconvolution of co-eluting analytes. The concept of additive absorption for co-eluting analytes has also been advanced for classification and speciation of complex mixtures using a data treatment procedure termed time interval deconvolution. Furthermore, pseudo-absolute quantitation can be performed for system diagnosis, as well as potentially calibrationless quantitation. In this manuscript an overview of these features, the vacuum ultraviolet spectrophotometer instrumentation, and performance capabilities is given. A discussion of the applications of the vacuum ultraviolet detector is provided by describing and discussing the papers published since 2014. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  18. Possibilities of Utilizing the Method of Analytical Hierarchy Process Within the Strategy of Corporate Social Business

    NASA Astrophysics Data System (ADS)

    Drieniková, Katarína; Hrdinová, Gabriela; Naňo, Tomáš; Sakál, Peter

    2010-01-01

    The paper deals with the analysis of the theory of corporate social responsibility, risk management and the exact method of the analytic hierarchy process that is used in decision-making processes. Chapters 2 and 3 present experience with the application of the method in formulating the stakeholders' strategic goals within Corporate Social Responsibility (CSR) and simultaneously its utilization in minimizing environmental risks. The major benefit of this paper is the application of the Analytic Hierarchy Process (AHP).
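The computational core of AHP is deriving priority weights from a pairwise-comparison matrix. A minimal sketch follows, using the geometric-mean approximation rather than the principal-eigenvector method; the criteria and judgments are hypothetical, not taken from the paper.

```python
import math

def ahp_priorities(matrix):
    """Priority weights from a reciprocal pairwise-comparison matrix,
    via the geometric-mean (logarithmic least squares) approximation."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]   # row geometric means
    total = sum(geo)
    return [g / total for g in geo]                          # normalise to sum 1

# Hypothetical 3-criterion comparison on Saaty's 1-9 scale:
# environmental risk vs. stakeholder impact vs. cost.
pairwise = [
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
weights = ahp_priorities(pairwise)  # largest weight ranks the criterion first
```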

  19. Measurement of Henry's Law Constants Using Internal Standards: A Quantitative GC Experiment for the Instrumental Analysis or Environmental Chemistry Laboratory

    ERIC Educational Resources Information Center

    Ji, Chang; Boisvert, Susanne M.; Arida, Ann-Marie C.; Day, Shannon E.

    2008-01-01

    An internal standard method applicable to undergraduate instrumental analysis or environmental chemistry laboratory has been designed and tested to determine the Henry's law constants for a series of alkyl nitriles. In this method, a mixture of the analytes and an internal standard is prepared and used to make a standard solution (organic solvent)…

  20. Advances in silver ion chromatography for the analysis of fatty acids and triacylglycerols-2001 to 2011.

    PubMed

    Momchilova, Svetlana M; Nikolova-Damyanova, Boryana M

    2012-01-01

    An effort is made to critically present the achievements in silver ion chromatography during the last decade. Novelties in columns, mobile-phase compositions and detectors are described. Recent applications of silver ion chromatography in the analysis of fatty acids and triacylglycerols are presented while stressing novel analytical strategies or new objects. The tendencies in the application of the method in complementary ways with reversed-phase chromatography, chiral chromatography and, especially, mass detection are outlined.

  1. Planning and Evaluation of New Academic Library Services by Means of Web-Based Conjoint Analysis

    ERIC Educational Resources Information Center

    Decker, Reinhold; Hermelbracht, Antonia

    2006-01-01

    New product development is an omnipresent challenge to modern libraries in the information age. Therefore, we present the design and selected results of a comprehensive research project aiming at the systematic and user-oriented planning of academic library services by means of conjoint analysis. The applicability of the analytical framework used…

  2. Analytical capabilities and services of Lawrence Livermore Laboratory's General Chemistry Division. [Methods available at Lawrence Livermore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutmacher, R.; Crawford, R.

    This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.

  3. Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis

    NASA Technical Reports Server (NTRS)

    Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.

    2012-01-01

    MapReduce is an approach to high-performance analytics that may be useful to data intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern Era Retrospective-Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data intensive analytic workflows.
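The averaging operation described fits MapReduce naturally: mappers emit partial (sum, count) pairs per spatial bin and reducers combine them. The following single-process sketch illustrates the idea only; the region keys and values are made up and do not correspond to actual MERRA structure.

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (region, (partial_sum, count)) for each record.

    Records stand in for gridded reanalysis values keyed by a spatial bin.
    """
    for region, value in records:
        yield region, (value, 1)

def reduce_phase(pairs):
    """Reduce: merge partial (sum, count) pairs, then finish the average."""
    acc = defaultdict(lambda: [0.0, 0])
    for region, (s, c) in pairs:
        acc[region][0] += s
        acc[region][1] += c
    return {region: s / c for region, (s, c) in acc.items()}

# Hypothetical temperature samples (K) in two spatial bins.
records = [("north", 250.0), ("north", 254.0), ("south", 290.0)]
averages = reduce_phase(map_phase(records))
```

In a real Hadoop job the reduce step would also run per key on partial sums from many mappers, which is why (sum, count) rather than a finished mean is emitted.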

  4. On the line-shape analysis of Compton profiles and its application to neutron scattering

    NASA Astrophysics Data System (ADS)

    Romanelli, G.; Krzystyniak, M.

    2016-05-01

    Analytical properties of Compton profiles are used in order to simplify the analysis of neutron Compton scattering experiments. In particular, the possibility to fit the difference of Compton profiles is discussed as a way to greatly decrease the level of complexity of the data treatment, making the analysis easier, faster and more robust. In the context of the novel method proposed, two mathematical models describing the shapes of differenced Compton profiles are discussed: the simple Gaussian approximation for a harmonic and isotropic local potential, and an analytical Gauss-Hermite expansion for an anharmonic or anisotropic potential. The method is applied to data collected by the VESUVIO spectrometer at the ISIS neutron and muon pulsed source (UK) on copper and aluminium samples at ambient and low temperatures.
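The two line-shape models named above can be written explicitly. The following is a standard-notation sketch, not reproduced from the paper; σ is the width parameter of the momentum distribution and c_n are expansion coefficients.

```latex
% Gaussian approximation (harmonic, isotropic local potential):
J(y) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{y^{2}}{2\sigma^{2}}\right)

% Gauss-Hermite expansion (anharmonic or anisotropic potential),
% adding even Hermite-polynomial corrections to the Gaussian:
J(y) = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\left(-\frac{y^{2}}{2\sigma^{2}}\right)
       \left[\,1 + \sum_{n \ge 2}\frac{c_{n}}{2^{2n}\,n!}\,
       H_{2n}\!\left(\frac{y}{\sqrt{2}\,\sigma}\right)\right]
```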

  5. Rotor/Wing Interactions in Hover

    NASA Technical Reports Server (NTRS)

    Young, Larry A.; Derby, Michael R.

    2002-01-01

    Hover predictions of tiltrotor aircraft are hampered by the lack of accurate and computationally efficient models for rotor/wing interactional aerodynamics. This paper summarizes the development of an approximate, potential flow solution for the rotor-on-rotor and wing-on-rotor interactions. This analysis is based on actuator disk and vortex theory and the method of images. The analysis is applicable for out-of-ground-effect predictions. The analysis is particularly suited for aircraft preliminary design studies. Flow field predictions from this simple analytical model are validated against experimental data from previous studies. The paper concludes with an analytical assessment of the influence of rotor-on-rotor and wing-on-rotor interactions. This assessment examines the effect of rotor-to-wing offset distance, wing sweep, wing span, and flaperon incidence angle on tiltrotor inflow and performance.

  6. Portable Solid Phase Micro-Extraction Coupled with Ion Mobility Spectrometry System for On-Site Analysis of Chemical Warfare Agents and Simulants in Water Samples

    PubMed Central

    Yang, Liu; Han, Qiang; Cao, Shuya; Yang, Jie; Yang, Junchao; Ding, Mingyu

    2014-01-01

    On-site analysis is an efficient approach to facilitate analysis at the location of the system under investigation as it can result in more accurate, more precise and quickly available analytical data. In our work, a novel self-made thermal desorption based interface was fabricated to couple solid-phase microextraction with ion mobility spectrometry for on-site water analysis. The portable interface can be connected with the front-end of an ion mobility spectrometer directly without other modifications. The analytical performance was evaluated via the extraction of chemical warfare agents and simulants in water samples. Several parameters including ionic strength and extraction time have been investigated in detail. The application of the developed method afforded satisfactory recoveries ranging from 72.9% to 114.4% when applied to the analysis of real water samples. PMID:25384006

  7. Green analytical chemistry introduction to chloropropanols determination at no economic and analytical performance costs?

    PubMed

    Jędrkiewicz, Renata; Orłowski, Aleksander; Namieśnik, Jacek; Tobiszewski, Marek

    2016-01-15

    In this study we perform a ranking of analytical procedures for 3-monochloropropane-1,2-diol determination in soy sauces by the PROMETHEE method. Multicriteria decision analysis was performed for three different scenarios - metrological, economic and environmental - by application of different weights to the decision-making criteria. All three scenarios indicate the capillary electrophoresis-based procedure as the most preferable. Apart from that, the details of the ranking results differ between the three scenarios. A second run of rankings was done for scenarios that include only the metrological, economic or environmental criteria, neglecting the others. These results show that the green analytical chemistry-based selection correlates with the economic criteria, while there is no correlation with the metrological ones. This is an implication that green analytical chemistry can be brought into laboratories without analytical performance costs, and it is even supported by economic reasons. Copyright © 2015 Elsevier B.V. All rights reserved.
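The PROMETHEE ranking described boils down to weighted pairwise outranking flows. The sketch below implements PROMETHEE II with the simplest ("usual") preference function; the alternatives, scores and the weighting scenario are invented for illustration and are not the study's data.

```python
def promethee_net_flows(scores, weights):
    """PROMETHEE II net outranking flows.

    scores[a][j]: performance of alternative a on criterion j (higher is
    better; criteria to be minimised should be negated beforehand).
    weights[j]: criterion weights summing to 1. The 'usual' preference
    function is assumed: full preference for any positive difference.
    """
    n = len(scores)
    flows = []
    for a in range(n):
        net = 0.0
        for b in range(n):
            if a == b:
                continue
            # weighted preference of a over b, minus b over a
            pref_ab = sum(w for j, w in enumerate(weights)
                          if scores[a][j] > scores[b][j])
            pref_ba = sum(w for j, w in enumerate(weights)
                          if scores[b][j] > scores[a][j])
            net += pref_ab - pref_ba
        flows.append(net / (n - 1))
    return flows

# Hypothetical procedures scored on metrological, economic and
# environmental criteria, under an environment-heavy weighting scenario.
scores = [[0.9, 0.4, 0.8],   # e.g. capillary electrophoresis
          [0.7, 0.8, 0.5],   # e.g. GC-MS
          [0.6, 0.6, 0.4]]   # e.g. HPLC
weights = [0.2, 0.2, 0.6]
flows = promethee_net_flows(scores, weights)  # highest net flow ranks first
```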

  8. Microemulsion Electrokinetic Chromatography.

    PubMed

    Buchberger, Wolfgang

    2016-01-01

    Microemulsion electrokinetic chromatography (MEEKC) is a special mode of capillary electrophoresis employing a microemulsion as carrier electrolyte. Analytes may partition between the aqueous phase of the microemulsion and its oil droplets which act as a pseudostationary phase. The technique is well suited for the separation of neutral species, in which case charged oil droplets (obtained by addition of an anionic or cationic surfactant) are present. A single set of separation parameters may be sufficient for separation of a wide range of analytes belonging to quite different chemical classes. Fine-tuning of resolution and analysis time may be achieved by addition of organic solvents, by changes in the nature of the surfactants (and cosurfactants) used to stabilize the microemulsion, or by various additives that may undergo some additional interactions with the analytes. Besides the separation of neutral analytes (which may be the most important application area of MEEKC), it can also be employed for cationic and/or anionic species. In this chapter, MEEKC conditions are summarized that have proven their reliability for routine analysis. Furthermore, the mechanisms encountered in MEEKC allow an efficient on-capillary preconcentration of analytes, so that the problem of poor concentration sensitivity of ultraviolet absorbance detection is circumvented.

  9. X-Graphs: Language and Algorithms for Heterogeneous Graph Streams

    DTIC Science & Technology

    2017-09-01

    INTRODUCTION; METHODS, ASSUMPTIONS, AND PROCEDURES: Software Abstractions for Graph Analytic Applications; High-Performance Platforms for Graph Processing ... data is stored in a distributed file system ... implementations of novel methods for network analysis: several methods for detection of overlapping communities, personalized PageRank, node embeddings into a d

  10. Analysis of THG modes for femtosecond laser pulse

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Sidorov, Pavel S.

    2017-05-01

    THG is used nowadays in many practical applications, such as substance diagnostics and imaging of biological objects. With the development of new materials and technologies (for example, photonic crystals), attention to the analysis of the THG process grows. Therefore, understanding the features of THG is a current problem. Earlier we developed a new analytical approach, based on using a problem invariant, for constructing an analytical solution of the THG process. It should be stressed that we did not use the basic non-depletion approximation for the fundamental wave. Nevertheless, a long-pulse-duration approximation and a plane-wave approximation were applied. The analytical solution demonstrates, in particular, an optical bistability property (and many other frequency-tripling regimes) for the third harmonic generation process. But, obviously, this approach does not reflect the influence of medium dispersion on the frequency tripling. Therefore, in this paper we analyze the THG efficiency of a femtosecond laser pulse taking into account second-order dispersion effects, as well as the effects of self- and cross-modulation of the interacting waves on the frequency conversion process. The analysis is made using computer simulation based on Schrödinger equations describing the process under consideration.

  11. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions.
The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
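The Kreisselmeier-Steinhauser approach mentioned above aggregates several objectives or constraints into one smooth envelope function. A minimal sketch follows; the normalised objective values and the draw-down factor rho are illustrative assumptions, not data from the study.

```python
import math

def ks_function(values, rho=50.0):
    """Kreisselmeier-Steinhauser envelope of several functions.

    KS(g) = g_max + (1/rho) * ln(sum(exp(rho * (g_i - g_max)))),
    a smooth, differentiable upper bound on max(g_i) that tightens
    as rho grows. Shifting by g_max keeps exp() from overflowing.
    """
    g_max = max(values)
    total = sum(math.exp(rho * (g - g_max)) for g in values)
    return g_max + math.log(total) / rho

# Hypothetical normalised objectives (drag, boom loudness, stress ratio).
objectives = [0.82, 0.75, 0.80]
ks = ks_function(objectives, rho=50.0)  # slightly above max(objectives)
```

Minimising the single value KS(g) then drives down the worst of the competing objectives while remaining differentiable for gradient-based optimizers.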

  12. A Generalized Approach to Forensic Dye Identification: Development and Utility of Reference Libraries.

    PubMed

    Groves, Ethan; Palenik, Skip; Palenik, Christopher S

    2018-04-18

    While color is arguably the most important optical property of evidential fibers, the actual dyestuffs responsible for its expression in them are, in forensic trace evidence examinations, rarely analyzed and still less often identified. This is due, primarily, to the exceedingly small quantities of dye present in a single fiber as well as to the fact that dye identification is a challenging analytical problem, even when large quantities are available for analysis. Among the practical reasons for this are the wide range of dyestuffs available (and the even larger number of trade names), the low total concentration of dyes in the finished product, the limited amount of sample typically available for analysis in forensic cases, and the complexity of the dye mixtures that may exist within a single fiber. Literature on the topic of dye analysis is often limited to a specific method, subset of dyestuffs, or an approach that is not applicable given the constraints of a forensic analysis. Here, we present a generalized approach to dye identification that (1) combines several robust analytical methods, (2) is broadly applicable to a wide range of dye chemistries, application classes, and fiber types, and (3) can be scaled down to forensic casework-sized samples. The approach is based on the development of a reference collection of 300 commercially relevant textile dyes that have been characterized by a variety of microanalytical methods (HPTLC, Raman microspectroscopy, infrared microspectroscopy, UV-Vis spectroscopy, and visible microspectrophotometry). Although there is no single approach that is applicable to all dyes on every type of fiber, a combination of these analytical methods has been applied using a reproducible approach that permits the use of reference libraries to constrain the identity of and, in many cases, identify the dye (or dyes) present in a textile fiber sample.

  13. Applications of fiber-optics-based nanosensors to drug discovery.

    PubMed

    Vo-Dinh, Tuan; Scaffidi, Jonathan; Gregas, Molly; Zhang, Yan; Seewaldt, Victoria

    2009-08-01

    Fiber-optic nanosensors are fabricated by heating and pulling optical fibers to yield sub-micron diameter tips and have been used for in vitro analysis of individual living mammalian cells. Immobilization of bioreceptors (e.g., antibodies, peptides, DNA) selective to targeting analyte molecules of interest provides molecular specificity. Excitation light can be launched into the fiber, and the resulting evanescent field at the tip of the nanofiber can be used to excite target molecules bound to the bioreceptor molecules. The fluorescence or surface-enhanced Raman scattering produced by the analyte molecules is detected using an ultra-sensitive photodetector. This article provides an overview of the development and application of fiber-optic nanosensors for drug discovery. The nanosensors provide minimally invasive tools to probe subcellular compartments inside single living cells for health effect studies (e.g., detection of benzopyrene adducts) and medical applications (e.g., monitoring of apoptosis in cells treated with anticancer drugs).

  14. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-09-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing numbers of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis relating the multiple visualisation challenges to a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.

  15. Review: visual analytics of climate networks

    NASA Astrophysics Data System (ADS)

    Nocke, T.; Buschmann, S.; Donges, J. F.; Marwan, N.; Schulz, H.-J.; Tominski, C.

    2015-04-01

    Network analysis has become an important approach in studying complex spatiotemporal behaviour within geophysical observation and simulation data. This new field produces increasing amounts of large geo-referenced networks to be analysed. Particular focus lies currently on the network analysis of the complex statistical interrelationship structure within climatological fields. The standard procedure for such network analyses is the extraction of network measures in combination with static standard visualisation methods. Existing interactive visualisation methods and tools for geo-referenced network exploration are often either not known to the analyst or their potential is not fully exploited. To fill this gap, we illustrate how interactive visual analytics methods in combination with geovisualisation can be tailored for visual climate network investigation. Therefore, the paper provides a problem analysis, relating the multiple visualisation challenges with a survey undertaken with network analysts from the research fields of climate and complex systems science. Then, as an overview for the interested practitioner, we review the state-of-the-art in climate network visualisation and provide an overview of existing tools. As a further contribution, we introduce the visual network analytics tools CGV and GTX, providing tailored solutions for climate network analysis, including alternative geographic projections, edge bundling, and 3-D network support. Using these tools, the paper illustrates the application potentials of visual analytics for climate networks based on several use cases including examples from global, regional, and multi-layered climate networks.
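
    The "complex statistical interrelationship structure within climatological fields" analysed by such networks is commonly obtained by thresholding pairwise correlations between grid-point time series. A minimal sketch on synthetic data (the two-cluster toy field and the 0.5 threshold are assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "climatological field": 6 grid points, 200 time steps.
# Points 0-2 share one driver signal, points 3-5 another.
t = rng.standard_normal((2, 200))
series = np.vstack([t[0] + 0.3 * rng.standard_normal((3, 200)),
                    t[1] + 0.3 * rng.standard_normal((3, 200))])

corr = np.corrcoef(series)                            # pairwise Pearson correlations
adj = (np.abs(corr) > 0.5) & ~np.eye(6, dtype=bool)   # threshold, drop self-loops

degree = adj.sum(axis=1)                              # a basic network measure
print(degree)
```

    Network measures such as this degree sequence are the quantities that the reviewed visual analytics tools then map onto geographic space.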

  16. A comparative analysis of modified binders : original asphalts and materials extracted from existing pavements.

    DOT National Transportation Integrated Search

    2010-01-18

    This research demonstrated the application of gel permeation chromatography (GPC) as an analytical tool to ascertain the amounts of polymer modifiers in polymer modified asphalt cements, which are soluble in eluting GPC solvents. The technique wa...

  17. A Comparative Analysis of Modified Binders : Original Asphalt and Material Extracted from Existing Pavement

    DOT National Transportation Integrated Search

    2010-01-18

    This research demonstrated the application of gel permeation chromatography (GPC) as an analytical tool to ascertain the amounts of polymer modifiers in polymer modified asphalt cements, which are soluble in eluting GPC solvents. The technique was ap...

  18. Qualitative research in healthcare: an introduction to grounded theory using thematic analysis.

    PubMed

    Chapman, A L; Hadfield, M; Chapman, C J

    2015-01-01

    In today's NHS, qualitative research is increasingly important as a method of assessing and improving quality of care. Grounded theory has developed as an analytical approach to qualitative data over the last 40 years. It is primarily an inductive process whereby theoretical insights are generated from data, in contrast to deductive research where theoretical hypotheses are tested via data collection. Grounded theory has been one of the main contributors to the acceptance of qualitative methods in a wide range of applied social sciences. The influence of grounded theory as an approach is, in part, based on its provision of an explicit framework for analysis and theory generation. Furthermore, the stress upon grounding research in the reality of participants has also given it credence in healthcare research. As with all analytical approaches, grounded theory has drawbacks and limitations. It is important to have an understanding of these in order to assess the applicability of this approach to healthcare research. In this review we outline the principles of grounded theory, and focus on thematic analysis as the analytical approach used most frequently in grounded theory studies, with the aim of providing clinicians with the skills to critically review studies using this methodology.

  19. From pre-registration to publication: a non-technical primer for conducting a meta-analysis to synthesize correlational data

    PubMed Central

    Quintana, Daniel S.

    2015-01-01

    Meta-analysis synthesizes a body of research investigating a common research question. Outcomes from meta-analyses provide a more objective and transparent summary of a research area than traditional narrative reviews. Moreover, they are often used to support research grant applications, guide clinical practice, and direct health policy. The aim of this article is to provide a practical and non-technical guide for psychological scientists that outlines the steps involved in planning and performing a meta-analysis of correlational datasets. I provide a supplementary R script to demonstrate each analytical step described in the paper, which is readily adaptable for researchers to use for their analyses. While the worked example is the analysis of a correlational dataset, the general meta-analytic process described in this paper is applicable for all types of effect sizes. I also emphasize the importance of meta-analysis protocols and pre-registration to improve transparency and help avoid unintended duplication. An improved understanding of this tool will not only help scientists to conduct their own meta-analyses but also improve their evaluation of published meta-analyses. PMID:26500598

  20. From pre-registration to publication: a non-technical primer for conducting a meta-analysis to synthesize correlational data.

    PubMed

    Quintana, Daniel S

    2015-01-01

    Meta-analysis synthesizes a body of research investigating a common research question. Outcomes from meta-analyses provide a more objective and transparent summary of a research area than traditional narrative reviews. Moreover, they are often used to support research grant applications, guide clinical practice, and direct health policy. The aim of this article is to provide a practical and non-technical guide for psychological scientists that outlines the steps involved in planning and performing a meta-analysis of correlational datasets. I provide a supplementary R script to demonstrate each analytical step described in the paper, which is readily adaptable for researchers to use for their analyses. While the worked example is the analysis of a correlational dataset, the general meta-analytic process described in this paper is applicable for all types of effect sizes. I also emphasize the importance of meta-analysis protocols and pre-registration to improve transparency and help avoid unintended duplication. An improved understanding of this tool will not only help scientists to conduct their own meta-analyses but also improve their evaluation of published meta-analyses.
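
    The analytical core of a correlational meta-analysis can be sketched in a few lines: convert each r to Fisher's z, pool with inverse-variance weights, and back-transform the pooled z. The record's supplementary script is in R; the fixed-effect Python sketch below, with made-up study values, is an independent illustration rather than the author's code (a random-effects model, as typically recommended, would additionally estimate between-study variance).

```python
import math

def meta_analyze_correlations(studies):
    """Fixed-effect meta-analysis of Pearson correlations via the Fisher
    r-to-z transform; each study is weighted by the inverse of
    var(z) = 1/(n - 3). studies: list of (r, n) tuples. Returns pooled r."""
    num = den = 0.0
    for r, n in studies:
        z = math.atanh(r)        # Fisher z transform of r
        w = n - 3                # inverse-variance weight
        num += w * z
        den += w
    return math.tanh(num / den)  # back-transform pooled z to r

# Hypothetical studies: (correlation, sample size).
pooled = meta_analyze_correlations([(0.30, 50), (0.45, 120), (0.25, 80)])
print(round(pooled, 3))
```

    Pooling on the z scale rather than on r itself keeps the sampling distribution approximately normal with a variance that depends only on n.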

  1. Application of a Portable Multi-Analyte Biosensor for Organic Acid Determination in Silage.

    PubMed

    Pilas, Johanna; Yazici, Yasemen; Selmer, Thorsten; Keusgen, Michael; Schöning, Michael J

    2018-05-08

    Multi-analyte biosensors may offer the opportunity to perform cost-effective and rapid analysis with reduced sample volume, as compared to electrochemical biosensing of each analyte individually. This work describes the development of an enzyme-based biosensor system for multi-parametric determination of four different organic acids. The biosensor array comprises five working electrodes for simultaneous sensing of ethanol, formate, d-lactate, and l-lactate, and an integrated counter electrode. Storage stability of the biosensor was evaluated under different conditions (stored at +4 °C in buffer solution and dry at −21 °C, +4 °C, and room temperature) over a period of 140 days. After repeated and regular application, the individual sensing electrodes exhibited the best stability when stored at −21 °C. Furthermore, measurements in silage samples (maize and sugarcane silage) were conducted with the portable biosensor system. Comparison with a conventional photometric technique demonstrated successful employment for rapid monitoring of complex media.

  2. Application of a Portable Multi-Analyte Biosensor for Organic Acid Determination in Silage

    PubMed Central

    Pilas, Johanna; Yazici, Yasemen; Selmer, Thorsten; Keusgen, Michael

    2018-01-01

    Multi-analyte biosensors may offer the opportunity to perform cost-effective and rapid analysis with reduced sample volume, as compared to electrochemical biosensing of each analyte individually. This work describes the development of an enzyme-based biosensor system for multi-parametric determination of four different organic acids. The biosensor array comprises five working electrodes for simultaneous sensing of ethanol, formate, d-lactate, and l-lactate, and an integrated counter electrode. Storage stability of the biosensor was evaluated under different conditions (stored at +4 °C in buffer solution and dry at −21 °C, +4 °C, and room temperature) over a period of 140 days. After repeated and regular application, the individual sensing electrodes exhibited the best stability when stored at −21 °C. Furthermore, measurements in silage samples (maize and sugarcane silage) were conducted with the portable biosensor system. Comparison with a conventional photometric technique demonstrated successful employment for rapid monitoring of complex media. PMID:29738487

  3. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Beaver, Justin M; Bogen II, Paul L.

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to the high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.
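
    The macroscale term-frequency view described above can be approximated by aggregating token counts over consecutive time windows. The function and toy message stream below are hypothetical illustrations, not part of Matisse:

```python
from collections import Counter

def term_trends(messages, window=2):
    """Aggregate term frequencies over consecutive time windows of a
    message stream (a toy stand-in for a macroscale trend view).
    messages: list of (timestep, text). Returns {window_index: Counter}."""
    trends = {}
    for t, text in messages:
        w = t // window
        trends.setdefault(w, Counter()).update(text.lower().split())
    return trends

# Hypothetical stream of timestamped messages.
stream = [(0, "road closed downtown"), (1, "marathon road reopened"),
          (2, "storm warning issued"), (3, "storm passing north")]
trends = term_trends(stream)
print(trends[1].most_common(1))
```

    Drilling down from these windowed aggregates to individual messages is what the paper's macroscale-to-microscale interaction scheme supports.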

  4. Exploring the Analytical Processes of Intelligence Analysts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Kuchar, Olga A.; Wolf, Katherine E.

    We present an observational case study in which we investigate and analyze the analytical processes of intelligence analysts. Participating analysts in the study carry out two scenarios where they organize and triage information, conduct intelligence analysis, report results, and collaborate with one another. Through a combination of artifact analyses, group interviews, and participant observations, we explore the space and boundaries in which intelligence analysts work and operate. We also assess the implications of our findings on the use and application of relevant information technologies.

  5. Pumping tests in nonuniform aquifers - The radially symmetric case

    USGS Publications Warehouse

    Butler, J.J.

    1988-01-01

    Traditionally, pumping-test-analysis methodology has been limited to applications involving aquifers whose properties are assumed uniform in space. This work attempts to assess the applicability of analytical methodology to a broader class of units with spatially varying properties. An examination of flow behavior in a simple configuration consisting of pumping from the center of a circular disk embedded in a matrix of differing properties is the basis for this investigation. A solution describing flow in this configuration is obtained through Laplace-transform techniques using analytical and numerical inversion schemes. Approaches for the calculation of flow properties in conditions that can be roughly represented by this simple configuration are proposed. Possible applications include a wide variety of geologic structures, as well as the case of a well skin resulting from drilling or development. Of more importance than the specifics of these techniques for analysis of water-level responses is the insight into flow behavior during a pumping test that is provided by the large-time form of the derived solution. The solution reveals that drawdown during a pumping test can be considered to consist of two components that are dependent and independent of near-well properties, respectively. Such an interpretation of pumping-test drawdown allows some general conclusions to be drawn concerning the relationship between parameters calculated using analytical approaches based on curve-matching and those calculated using approaches based on the slope of a semilog straight line plot. The infinite-series truncation that underlies the semilog analytical approaches is shown to remove further contributions of near-well material to total drawdown. In addition, the semilog distance-drawdown approach is shown to yield an expression that is equivalent to the Thiem equation. 
These results allow some general recommendations to be made concerning observation-well placement for pumping tests in nonuniform aquifers. The relative diffusivity of material on either side of a discontinuity is shown to be the major factor in controlling flow behavior during the period in which the front of the cone of depression is moving across the discontinuity. Though resulting from an analysis of flow in an idealized configuration, the insights of this work into flow behavior during a pumping test are applicable to a wide class of nonuniform units. © 1988.
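
    The Thiem equation referred to above relates the steady-state drawdown difference between two observation wells to the pumping rate and transmissivity. A minimal sketch with illustrative numbers (not taken from the paper):

```python
import math

def thiem_drawdown_difference(Q, T, r1, r2):
    """Steady-state drawdown difference between observation wells at radii
    r1 < r2 from a pumping well (Thiem equation):
        s1 - s2 = Q / (2*pi*T) * ln(r2 / r1)
    Q: pumping rate [m^3/d], T: transmissivity [m^2/d], radii in metres."""
    return Q / (2 * math.pi * T) * math.log(r2 / r1)

# Hypothetical values: Q = 500 m^3/d, T = 250 m^2/d, wells at 10 m and 100 m.
ds = thiem_drawdown_difference(500.0, 250.0, 10.0, 100.0)
print(round(ds, 3), "m")
```

    Because the expression depends only on the ratio r2/r1 and on T, a semilog distance-drawdown fit of this form is insensitive to near-well properties, consistent with the paper's decomposition of drawdown.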

  6. Light distribution in diffractive multifocal optics and its optimization.

    PubMed

    Portney, Valdemar

    2011-11-01

    To expand a geometrical model of diffraction efficiency and its interpretation to the multifocal optic and to introduce formulas for analysis of far and near light distribution and their application to multifocal intraocular lenses (IOLs) and to diffraction efficiency optimization. Medical device consulting firm, Newport Coast, California, USA. Experimental study. Application of a geometrical model to the kinoform (single focus diffractive optical element) was expanded to a multifocal optic to produce analytical definitions of light split between far and near images and light loss to other diffraction orders. The geometrical model gave a simple interpretation of light split in a diffractive multifocal IOL. An analytical definition of light split between far, near, and light loss was introduced as curve fitting formulas. Several examples of application to common multifocal diffractive IOLs were developed; for example, to light-split change with wavelength. The analytical definition of diffraction efficiency may assist in optimization of multifocal diffractive optics that minimize light loss. Formulas for analysis of light split between different foci of multifocal diffractive IOLs are useful in interpreting diffraction efficiency dependence on physical characteristics, such as blaze heights of the diffractive grooves and wavelength of light, as well as for optimizing multifocal diffractive optics. Disclosure is found in the footnotes. Copyright © 2011 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
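
    For a blazed (kinoform-type) profile, scalar theory gives the diffraction efficiency of order m as eta_m = sinc^2(alpha - m), with alpha the phase depth in waves; a half-wave profile splits light roughly 41%/41% between the far and near foci, with the remainder lost to other orders. The sketch below illustrates this standard result, not the paper's geometrical model:

```python
import math

def order_efficiency(alpha, m):
    """Scalar-theory diffraction efficiency of order m for a blazed
    (kinoform) profile with a phase depth of alpha waves:
        eta_m = sinc^2(alpha - m),  sinc(x) = sin(pi*x)/(pi*x)."""
    x = alpha - m
    if x == 0:
        return 1.0
    return (math.sin(math.pi * x) / (math.pi * x)) ** 2

# Half-wave design (alpha = 0.5) splits light between order 0 (far focus)
# and order 1 (near focus); the rest is light loss to other orders.
far, near = order_efficiency(0.5, 0), order_efficiency(0.5, 1)
print(round(far, 3), round(near, 3))
```

    Varying alpha (e.g., through blaze height or wavelength) shifts the far/near split, which is the dependence the paper's formulas are used to optimize.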

  7. RLV Turbine Performance Optimization

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.

    2001-01-01

    A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (R-LV) are presented. Analytical techniques for obtaining the results are also discussed.

  8. Analytical framework and tool kit for SEA follow-up

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran

    2009-04-15

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  9. Principles and applications of Raman spectroscopy in pharmaceutical drug discovery and development.

    PubMed

    Gala, Urvi; Chauhan, Harsh

    2015-02-01

    In recent years, Raman spectroscopy has become increasingly important as an analytical technique in various scientific areas of research and development. This is partly due to the technological advancements in Raman instrumentation and partly due to the detailed fingerprinting that can be derived from Raman spectra. Its versatile applications, rapid data collection, and easy analysis have made Raman spectroscopy an attractive analytical tool. The following review describes Raman spectroscopy and its application within the pharmaceutical industry. The authors explain the theory of Raman scattering and its variations in Raman spectroscopy. The authors also highlight how Raman spectra are interpreted, providing examples. Raman spectroscopy has a number of potential applications within drug discovery and development. It can be used to estimate the molecular activity of drugs and to establish a drug's physicochemical properties such as its partition coefficient. It can also be used in compatibility studies during the drug formulation process. Raman spectroscopy's immense potential should be further investigated in the future.

  10. Emerging surface characterization techniques for carbon steel corrosion: a critical brief review.

    PubMed

    Dwivedi, D; Lepkova, K; Becker, T

    2017-03-01

    Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time of flight-secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed.

  11. Emerging surface characterization techniques for carbon steel corrosion: a critical brief review

    NASA Astrophysics Data System (ADS)

    Dwivedi, D.; Lepkova, K.; Becker, T.

    2017-03-01

    Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time of flight-secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed.

  12. Emerging surface characterization techniques for carbon steel corrosion: a critical brief review

    PubMed Central

    Dwivedi, D.; Becker, T.

    2017-01-01

    Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time of flight-secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed. PMID:28413351

  13. On the Application of Euler Deconvolution to the Analytic Signal

    NASA Astrophysics Data System (ADS)

    Fedi, M.; Florio, G.; Pasteka, R.

    2005-05-01

    In recent years, papers on Euler deconvolution (ED) have used formulations that account for the unknown background field, allowing the structural index (N) to be treated as an unknown to be solved for together with the source coordinates. Among them, Hsu (2002) and Fedi and Florio (2002) independently pointed out that the use of an adequate m-order derivative of the field, instead of the field itself, allows solving for both N and source position. For the same reason, Keating and Pilkington (2004) proposed the ED of the analytic signal. A function analyzed by ED must be homogeneous but also harmonic, because it must be possible to compute its vertical derivative, as is well known from potential field theory. Huang et al. (1995) demonstrated that the analytic signal is a homogeneous function, but, for instance, it is rather obvious that the magnetic field modulus (corresponding to the analytic signal of a gravity field) is not a harmonic function (e.g., Grant & West, 1965). Thus, it appears that a straightforward application of ED to the analytic signal is not possible, because a vertical derivation of this function is not correct using standard potential field analysis tools. In this note we theoretically and empirically check what kinds of errors are caused in ED by such a wrong assumption about the harmonicity of the analytic signal. We discuss results on profile and map synthetic data, and use a simple method to compute the vertical derivative of non-harmonic functions measured on a horizontal plane. Our main conclusions are: 1. To approximate a correct evaluation of the vertical derivative of a non-harmonic function, it is useful to compute it with finite differences, by using upward continuation. 2. We found that the errors on the vertical derivative computed as if the analytic signal were harmonic reflect mainly on the structural index estimate; these errors can mislead an interpretation even though the depth estimates are almost correct. 3. Consistent estimates of depth and S.I. are instead obtained by using a finite-difference vertical derivative of the analytic signal. 4. Analysis of a case history confirms the strong error in the estimation of the structural index if the analytic signal is treated as a harmonic function.
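
    ED rests on Euler's homogeneity relation x f_x + y f_y + z f_z = n f; for a point-source potential decaying as 1/r the degree is n = -1 (structural index N = 1). The finite-difference check below is a generic illustration of that relation, not the authors' procedure:

```python
import math

def euler_degree(f, x, y, z, h=1e-5):
    """Numerically estimate the degree of homogeneity n of f at (x, y, z)
    from Euler's relation  x*f_x + y*f_y + z*f_z = n*f,
    using central finite differences for the partial derivatives."""
    fx = (f(x + h, y, z) - f(x - h, y, z)) / (2 * h)
    fy = (f(x, y + h, z) - f(x, y - h, z)) / (2 * h)
    fz = (f(x, y, z + h) - f(x, y, z - h)) / (2 * h)
    return (x * fx + y * fy + z * fz) / f(x, y, z)

# Point-source potential 1/r: homogeneous of degree -1 (structural index 1).
f = lambda x, y, z: 1.0 / math.sqrt(x * x + y * y + z * z)
n = euler_degree(f, 3.0, 4.0, 12.0)
print(round(n, 4))
```

    Homogeneity alone is what this relation uses; the paper's point is that the separate vertical-derivative step additionally requires harmonicity, which the analytic signal lacks.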

  14. Reuse of the Cloud Analytics and Collaboration Environment within Tactical Applications (TacApps): A Feasibility Analysis

    DTIC Science & Technology

    2016-03-01

    Representational state transfer; Java messaging service; Java application programming interface (API); Internet relay chat (IRC)/extensible messaging and... JBoss application server or an Apache Tomcat servlet container instance. The relational database management system can be either PostgreSQL or MySQL ... Java library called direct web remoting. This library has been part of the core CACE architecture for quite some time; however, there have not been

  15. Interactive Visual Analytics Approach for Exploration of Geochemical Model Simulations with Different Parameter Sets

    NASA Astrophysics Data System (ADS)

    Jatnieks, Janis; De Lucia, Marco; Sips, Mike; Dransch, Doris

    2015-04-01

    Many geoscience applications can benefit from testing many combinations of input parameters for geochemical simulation models. It is, however, a challenge to screen the input and output data from the model to identify the significant relationships between input parameters and output variables. For addressing this problem we propose a Visual Analytics approach that has been developed in an ongoing collaboration between computer science and geoscience researchers. Our Visual Analytics approach uses visualization methods of hierarchical horizontal axis, multi-factor stacked bar charts and interactive semi-automated filtering for input and output data together with automatic sensitivity analysis. This guides the users towards significant relationships. We implement our approach as an interactive data exploration tool. It is designed with flexibility in mind, so that a diverse set of tasks such as inverse modeling, sensitivity analysis and model parameter refinement can be supported. Here we demonstrate the capabilities of our approach by two examples for gas storage applications. For the first example our Visual Analytics approach enabled the analyst to observe how the element concentrations change around previously established baselines in response to thousands of different combinations of mineral phases. This supported combinatorial inverse modeling for interpreting observations about the chemical composition of the formation fluids at the Ketzin pilot site for CO2 storage. The results indicate that, within the experimental error range, the formation fluid cannot be considered at local thermodynamic equilibrium with the mineral assemblage of the reservoir rock. This is a valuable insight from the predictive geochemical modeling for the Ketzin site. For the second example our approach supports sensitivity analysis for a reaction involving the reductive dissolution of pyrite with formation of pyrrhotite in the presence of gaseous hydrogen.
We determine that this reaction is thermodynamically favorable under a broad range of conditions. This includes low temperatures and absence of microbial catalysators. Our approach has potential for use in other applications that involve exploration of relationships in geochemical simulation model data.
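    The screening step described above, scoring the relationship between each input parameter and an output variable across many simulation runs, can be sketched as follows. This is an illustrative sketch only: the toy "model", the parameter names (calcite, pyrite, temperature), and the simple correlation-based scoring are assumptions for demonstration, not the paper's geochemical simulator or its sensitivity method.

    ```python
    # Sketch: screen many parameter combinations for input-output relationships
    # by scoring each input's absolute correlation with an output variable.
    import random

    def pearson(xs, ys):
        """Pearson correlation coefficient of two equal-length sequences."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    random.seed(1)
    runs = []
    for _ in range(500):
        # Hypothetical input parameters, sampled uniformly.
        params = {"calcite": random.uniform(0.0, 1.0),
                  "pyrite": random.uniform(0.0, 1.0),
                  "temperature": random.uniform(20.0, 60.0)}
        # Toy stand-in for the simulator: output driven mostly by calcite,
        # weakly by temperature, plus noise; pyrite has no real effect.
        output = (3.0 * params["calcite"]
                  + 0.01 * params["temperature"]
                  + random.gauss(0.0, 0.1))
        runs.append((params, output))

    names = ["calcite", "pyrite", "temperature"]
    sens = {p: abs(pearson([r[0][p] for r in runs], [r[1] for r in runs]))
            for p in names}
    ranked = sorted(names, key=sens.get, reverse=True)
    print(ranked)
    ```

    A visual-analytics tool would present such scores interactively (e.g., as stacked bar charts) rather than as a printed ranking, but the underlying screening idea is the same: flag the inputs most strongly tied to each output for closer inspection.
    
    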

  16. Analytical and experimental investigation of flow fields of annular jets with and without swirling flow

    NASA Technical Reports Server (NTRS)

    Simonson, M. R.; Smith, E. G.; Uhl, W. R.

    1974-01-01

    Analytical and experimental studies were performed to define the flowfield of annular jets, with and without swirling flow. The analytical model treated configurations with variations of flow angularities, radius ratio, and swirl distributions. Swirl distributions characteristic of stator vanes and rotor blade rows, in which the total pressure and swirl distributions are related, were incorporated in the mathematical model. The experimental studies included tests of eleven nozzle models, both with and without swirling exhaust flow. Flowfield surveys were obtained and used for comparison with the analytical model. This comparison of experimental and analytical studies served as the basis for evaluation of several empirical constants required for application of the analysis to the general flow configuration. The analytical model developed during these studies is applicable to the evaluation of the flowfield and overall performance of the exhaust of statorless lift fan systems that contain various levels of exhaust swirl.

  17. Principles of operation and data reduction techniques for the LOFT drag disc turbine transducer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silverman, S.

    An analysis of the single- and two-phase flow data applicable to the loss-of-fluid test (LOFT) is presented for the LOFT drag turbine transducer. Analytical models which were employed to correlate the experimental data are presented.

  18. Method and Apparatus for Concentrating Vapors for Analysis

    DOEpatents

    Grate, Jay W.; Baldwin, David L.; Anheier, Jr., Norman C.

    2008-10-07

    An apparatus and method are disclosed for pre-concentrating gaseous vapors for analysis. The invention finds application in conjunction with, e.g., analytical instruments where low detection limits for gaseous vapors are desirable. Vapors sorbed and concentrated within the bed of the apparatus can be thermally desorbed achieving at least partial separation of vapor mixtures. The apparatus is suitable, e.g., for preconcentration and sample injection, and provides greater resolution of peaks for vapors within vapor mixtures, yielding detection levels that are 10-10,000 times better than for direct sampling and analysis systems. Features are particularly useful for continuous unattended monitoring applications.

  19. Bayesian variable selection for post-analytic interrogation of susceptibility loci.

    PubMed

    Chen, Siying; Nunez, Sara; Reilly, Muredach P; Foulkes, Andrea S

    2017-06-01

    Understanding the complex interplay among protein coding genes and regulatory elements requires rigorous interrogation with analytic tools designed for discerning the relative contributions of overlapping genomic regions. To this end, we offer a novel application of Bayesian variable selection (BVS) for classifying genomic class level associations using existing large meta-analysis summary level resources. This approach is applied using the expectation maximization variable selection (EMVS) algorithm to typed and imputed SNPs across 502 protein coding genes (PCGs) and 220 long intergenic non-coding RNAs (lncRNAs) that overlap 45 known loci for coronary artery disease (CAD), using publicly available Global Lipids Genetics Consortium (GLGC) (Teslovich et al., 2010; Willer et al., 2013) meta-analysis summary statistics for low-density lipoprotein cholesterol (LDL-C). The analysis reveals 33 PCGs and three lncRNAs across 11 loci with >50% posterior probabilities for inclusion in an additive model of association. The findings are consistent with previous reports, while providing some new insight into the architecture of LDL-cholesterol to be investigated further. As genomic taxonomies continue to evolve, additional classes, such as enhancer elements and splicing regions, can easily be layered into the proposed analysis framework. Moreover, application of this approach to alternative publicly available meta-analysis resources, or more generally as a post-analytic strategy to further interrogate regions that are identified through single point analysis, is straightforward. All coding examples are implemented in R version 3.2.1 and provided as supplemental material. © 2016, The International Biometric Society.

  20. Arsenic, Antimony, Chromium, and Thallium Speciation in Water and Sediment Samples with the LC-ICP-MS Technique

    PubMed Central

    Jabłońska-Czapla, Magdalena

    2015-01-01

    Chemical speciation is a very important subject in environmental protection, toxicology, and chemical analytics due to the fact that the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analysis is so important. The paper also provides numerous examples of hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962

  1. A New Unified Analysis of Estimate Errors by Model-Matching Phase-Estimation Methods for Sensorless Drive of Permanent-Magnet Synchronous Motors and New Trajectory-Oriented Vector Control, Part II

    NASA Astrophysics Data System (ADS)

    Shinnaka, Shinji

    This paper presents a new unified analysis of estimate errors by model-matching extended-back-EMF estimation methods for sensorless drive of permanent-magnet synchronous motors. Analytical solutions for the estimate errors, whose validity is confirmed by numerical experiments, are rich in universality and applicability. As an example of this universality and applicability, a new trajectory-oriented vector control method is proposed, which can directly realize a quasi-optimal strategy minimizing total losses with no additional computational load by simply orienting one of the vector-control coordinates to the associated quasi-optimal trajectory. The coordinate orientation rule, which is analytically derived, is surprisingly simple. Consequently, the trajectory-oriented vector control method can be applied to a number of conventional vector control systems using model-matching extended-back-EMF estimation methods.

  2. Programmable Self-Assembly of DNA-Dendrimer and DNA-Fullerene Nanostructures

    DTIC Science & Technology

    2004-10-01

    separated by high pressure liquid chromatography (HPLC). The resulting material was analytically pure (99%) and monodisperse. Hybridization...bacterial and viral recognition, and gene expression analysis. These major accomplishments have been disseminated by various applications including 16...designing DNA strands with specific structural properties. The direct analysis of genomic DNA and RNA in an array format without labeling or

  3. A New Colorimetric Assay of Tabletop Sweeteners Using a Modified Biuret Reagent: An Analytical Chemistry Experiment for the Undergraduate Curriculum

    ERIC Educational Resources Information Center

    Fenk, Christopher J.; Kaufman, Nathan; Gerbig, Donald G., Jr.

    2007-01-01

    A new, fast and effective colorimetric analysis of the artificial sweetener aspartame is presented for application in undergraduate laboratory courses. This new method incorporates the use of a modified biuret reagent for selective detection and analysis of aspartame in aqueous solutions. The modified reagent is less caustic than the traditional…

  4. Rapid and Automated Analytical Methods for Redox Species Based on Potentiometric Flow Injection Analysis Using Potential Buffers

    PubMed Central

    Ohura, Hiroki; Imato, Toshihiko

    2011-01-01

    Two analytical methods, which prove the utility of a potentiometric flow injection technique for determining various redox species, based on the use of some redox potential buffers, are reviewed. The first is a potentiometric flow injection method in which a redox couple such as Fe(III)-Fe(II), Fe(CN)6 3−-Fe(CN)6 4−, or bromide-bromine and a redox electrode or a combined platinum-bromide ion selective electrode are used. The analytical principle and advantages of the method are discussed, and several examples of its application are reported. The second is a highly sensitive potentiometric flow injection method, in which a large transient potential change due to bromine or chlorine as an intermediate, generated during the reaction of the oxidative species with an Fe(III)-Fe(II) potential buffer containing bromide or chloride, is utilized. The analytical principle and details of the proposed method are described, and several applications are presented. The determination of trace amounts of hydrazine, based on the detection of a transient change in potential caused by the reaction with a Ce(IV)-Ce(III) potential buffer, is also described. PMID:21584280
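    The potential measured at the indicator electrode in such a redox potential buffer is governed by the Nernst equation; for the Fe(III)-Fe(II) couple (n = 1) it takes the standard form:

    ```latex
    E \;=\; E^{\circ\prime} + \frac{RT}{nF}\,\ln\frac{[\mathrm{Fe^{3+}}]}{[\mathrm{Fe^{2+}}]}
      \;\approx\; E^{\circ\prime} + \frac{0.0592\ \mathrm{V}}{n}\,\log_{10}\frac{[\mathrm{Fe^{3+}}]}{[\mathrm{Fe^{2+}}]}
      \qquad (n = 1,\ T = 25\,^{\circ}\mathrm{C})
    ```

    Because the buffer fixes the ratio of oxidized to reduced forms, the baseline potential is stable, and a species injected into the flow perturbs that ratio, producing the transient potential change that the method detects.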

  5. Technology advancement for integrative stem cell analyses.

    PubMed

    Jeong, Yoon; Choi, Jonghoon; Lee, Kwan Hyi

    2014-12-01

    Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity inherent among individual constituents within a given population. The population-based analysis and characterization of stem cells, and the problems associated with such a blanket approach, only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to growth in the stem cell analytical field, underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing the concept of vertical and horizontal approaches, we propose that adequate methods are needed for integrating information, such that multiple descriptive parameters from a stem cell can be obtained from a single experiment.

  6. Technology Advancement for Integrative Stem Cell Analyses

    PubMed Central

    Jeong, Yoon

    2014-01-01

    Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity inherent among individual constituents within a given population. The population-based analysis and characterization of stem cells, and the problems associated with such a blanket approach, only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to growth in the stem cell analytical field, underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing the concept of vertical and horizontal approaches, we propose that adequate methods are needed for integrating information, such that multiple descriptive parameters from a stem cell can be obtained from a single experiment. PMID:24874188

  7. Recent advances in chemiluminescence detection coupled with capillary electrophoresis and microchip capillary electrophoresis.

    PubMed

    Liu, Yuxuan; Huang, Xiangyi; Ren, Jicun

    2016-01-01

    CE is an ideal analytical method for extremely volume-limited biological microenvironments. However, the small injection volume makes it a challenge to achieve highly sensitive detection. Chemiluminescence (CL) detection is characterized by low background and excellent sensitivity because no light source is required. The coupling of CL with CE and MCE has become a powerful analytical method. So far, this method has been widely applied to chemical analysis, bioassay, drug analysis, and environmental analysis. In this review, we first introduce some developments in CE-CL and MCE-CL systems, and then put the emphasis on applications in the last 10 years. Finally, we discuss the future prospects. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. 10 CFR 431.445 - Determination of small electric motor efficiency.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... determined either by testing in accordance with § 431.444 of this subpart, or by application of an... method. An AEDM applied to a basic model must be: (i) Derived from a mathematical model that represents... statistical analysis, computer simulation or modeling, or other analytic evaluation of performance data. (3...

  9. Capillary electrophoresis for the analysis of contaminants in emerging food safety issues and food traceability.

    PubMed

    Vallejo-Cordoba, Belinda; González-Córdova, Aarón F

    2010-07-01

    This review presents an overview of the applicability of CE in the analysis of chemical and biological contaminants involved in emerging food safety issues. Additionally, the usefulness of CE-based genetic analyzers as a unique tool in food traceability verification systems is presented. First, analytical approaches for the determination of melamine and specific food allergens in different foods are discussed. Second, natural toxin analysis by CE is updated from the last review, reported in 2008. Finally, the analysis of prion proteins associated with the "mad cow" crisis and the application of CE-based genetic analyzers for meat traceability are summarized.

  10. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. One type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires that were sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction- and liquid-phase microextraction-based procedures high, while liquid-liquid extraction-, solid-phase extraction- and stir bar sorptive extraction-based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
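    The core of a PROMETHEE II ranking, pairwise preference comparisons aggregated into net outranking flows, can be sketched in a few lines. Everything below is illustrative: the procedure names, criterion values, greenness-oriented weights, and the simple "usual" preference function (1 if strictly better, else 0) are assumptions for demonstration, not data or settings from the study.

    ```python
    # Minimal PROMETHEE II sketch: rank alternatives by net outranking flow.
    def promethee2(scores, weights, maximize):
        """scores: {alt: [criterion values]}; weights sum to 1;
        maximize[j] is True if higher is better on criterion j.
        Returns (ranking, net flows phi)."""
        alts = list(scores)
        n, m = len(alts), len(weights)

        def pref(a, b):
            # Weighted preference of a over b with the 'usual' function.
            total = 0.0
            for j in range(m):
                diff = scores[a][j] - scores[b][j]
                if not maximize[j]:
                    diff = -diff
                if diff > 0:
                    total += weights[j]
            return total

        phi = {}
        for a in alts:
            plus = sum(pref(a, b) for b in alts if b != a) / (n - 1)
            minus = sum(pref(b, a) for b in alts if b != a) / (n - 1)
            phi[a] = plus - minus  # net outranking flow
        return sorted(alts, key=phi.get, reverse=True), phi

    # Hypothetical criteria: [recovery %, cost, solvent volume (mL)];
    # the heavy weight on solvent volume encodes a greenness-oriented scenario.
    scores = {"SPME": [85, 2, 0.5], "LLE": [90, 1, 30.0], "SPE": [88, 3, 10.0]}
    ranking, phi = promethee2(scores, weights=[0.2, 0.2, 0.6],
                              maximize=[True, False, False])
    print(ranking)
    ```

    With the environmental criterion weighted heavily, the microextraction-based procedure comes out on top, echoing the tendency the study reports; shifting weight toward the metrological criteria can reorder the ranking, which is exactly the scenario sensitivity the expert questionnaires address.
    
    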

  11. An Integrated Hot-Stage Microscope-Direct Analysis in Real Time-Mass Spectrometry System for Studying the Thermal Behavior of Materials.

    PubMed

    Ashton, Gage P; Harding, Lindsay P; Parkes, Gareth M B

    2017-12-19

    This paper describes a new analytical instrument that combines a precisely temperature-controlled hot-stage with digital microscopy and Direct Analysis in Real Time-mass spectrometry (DART-MS) detection. The novelty of the instrument lies in its ability to monitor processes as a function of temperature through the simultaneous recording of images, quantitative color changes, and mass spectra. The capability of the instrument was demonstrated through successful application to four very varied systems including profiling an organic reaction, decomposition of silicone polymers, and the desorption of rhodamine B from an alumina surface. The multidimensional, real-time analytical data provided by this instrument allow for a much greater insight into thermal processes than could be achieved previously.

  12. Laser ablation in analytical chemistry - A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russo, Richard E.; Mao, Xianglei; Liu, Haichen

    Laser ablation is becoming a dominant technology for direct solid sampling in analytical chemistry. Laser ablation refers to the process in which an intense burst of energy delivered by a short laser pulse is used to sample (remove a portion of) a material. The advantages of laser ablation chemical analysis include direct characterization of solids, no chemical procedures for dissolution, reduced risk of contamination or sample loss, analysis of very small samples not separable for solution analysis, and determination of spatial distributions of elemental composition. This review describes recent research to understand and utilize laser ablation for direct solid sampling, with emphasis on sample introduction to an inductively coupled plasma (ICP). Current research related to contemporary experimental systems, calibration and optimization, and fractionation is discussed, with a summary of applications in several areas.

  13. Analysis of the connection of the timber-fiber concrete composite structure

    NASA Astrophysics Data System (ADS)

    Holý, Milan; Vráblík, Lukáš; Petřík, Vojtěch

    2017-09-01

    This paper deals with the implementation of the material parameters of the connection into complex models for the analysis of timber-fiber concrete composite structures. The aim of this article is to present a possible way of idealizing the continuous contact model that approximates the actual behavior of timber-fiber reinforced concrete structures. The presented model of the connection was derived from push-out shear tests. Nonlinear numerical analysis confirmed that very good agreement between the results of numerical simulations and the results of the experiments can be achieved by a suitable choice of the material parameters of the continuous contact. Finally, an application for the analytical calculation of timber-fiber concrete composite structures has been developed for practical use in engineering practice. The input material parameters for the analytical model were obtained from experimental data.

  14. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    PubMed

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as the fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary for documentation and useful for following current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  15. Development and Applications of Liquid Sample Desorption Electrospray Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zheng, Qiuling; Chen, Hao

    2016-06-01

    Desorption electrospray ionization mass spectrometry (DESI-MS) is a recent advance in the field of analytical chemistry. This review surveys the development of liquid sample DESI-MS (LS-DESI-MS), a variant form of DESI-MS that focuses on fast analysis of liquid samples, and its novel analytical applications in bioanalysis, proteomics, and reaction kinetics. Due to the capability of directly ionizing liquid samples, liquid sample DESI (LS-DESI) has been successfully used to couple MS with various analytical techniques, such as microfluidics, microextraction, electrochemistry, and chromatography. This review also covers these hyphenated techniques. In addition, several closely related ionization methods, including transmission mode DESI, thermally assisted DESI, and continuous flow-extractive DESI, are briefly discussed. The capabilities of LS-DESI extend and/or complement the utilities of traditional DESI and electrospray ionization and will find extensive and valuable analytical application in the future.

  16. AmO2 Analysis for Analytical Method Testing and Assessment: Analysis Support for AmO2 Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn, Kevin John; Bland, Galey Jean; Fulwyler, James Brent

    Americium oxide samples will be measured for various analytes to support AmO2 production. The key analytes that are currently requested by the Am production customer at LANL include total Am content, Am isotopics, Pu assay, Pu isotopics, and trace element content including 237Np content. Multiple analytical methods will be utilized depending on the sensitivity, accuracy and precision needs of the Am matrix. Traceability to the National Institute of Standards and Technology (NIST) will be achieved, where applicable, by running NIST-traceable quality control materials, given that there are no suitable AmO2 reference materials currently available for the requested analytes. The primary objective is to demonstrate the suitability of actinide analytical chemistry methods to support AmO2 production operations.

  17. Biomedical application of MALDI mass spectrometry for small-molecule analysis.

    PubMed

    van Kampen, Jeroen J A; Burgers, Peter C; de Groot, Ronald; Gruters, Rob A; Luider, Theo M

    2011-01-01

    Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) is an emerging analytical tool for the analysis of molecules with molar masses below 1,000 Da; that is, small molecules. This technique offers rapid analysis, high sensitivity, low sample consumption, a relatively high tolerance towards salts and buffers, and the possibility to store samples on the target plate. The successful application of the technique is, however, hampered by low molecular weight (LMW) matrix-derived interference signals and by poor reproducibility of signal intensities during quantitative analyses. In this review, we focus on the biomedical application of MALDI-MS for the analysis of small molecules and discuss its favorable properties and its challenges as well as strategies to improve the performance of the technique. Furthermore, practical aspects and applications are presented. © 2010 Wiley Periodicals, Inc.

  18. Recent applications for HPLC-MS analysis of anthocyanins in Food materials

    USDA-ARS?s Scientific Manuscript database

    Anthocyanins are an important group of polyphenols that have health promoting properties. Analytical techniques for profiling anthocyanins have been widely reported in the last decade for in vitro and in vivo studies. A number of important technological advances in high-performance liquid chromatog...

  19. Palm: Easing the Burden of Analytical Performance Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tallent, Nathan R.; Hoisie, Adolfy

    2014-06-01

    Analytical (predictive) application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult because they must be both accurate and concise. To ease the burden of performance modeling, we developed Palm, a modeling tool that combines top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. To express insight, Palm defines a source code modeling annotation language. By coordinating models and source code, Palm's models are 'first-class' and reproducible. Unlike prior work, Palm formally links models, functions, and measurements. As a result, Palm (a) uses functions to either abstract or express complexity; (b) generates hierarchical models (representing an application's static and dynamic structure); and (c) automatically incorporates measurements to focus attention, represent constant behavior, and validate models. We discuss generating models for three different applications.

  20. Analytical and Theranostic Applications of Gold Nanoparticles and Multifunctional Nanocomposites

    PubMed Central

    Khlebtsov, Nikolai; Bogatyrev, Vladimir; Dykman, Lev; Khlebtsov, Boris; Staroverov, Sergey; Shirokov, Alexander; Matora, Larisa; Khanadeev, Vitaly; Pylaev, Timofey; Tsyganova, Natalia; Terentyuk, Georgy

    2013-01-01

    Gold nanoparticles (GNPs) and GNP-based multifunctional nanocomposites are the subject of intensive studies and biomedical applications. This minireview summarizes our recent efforts in analytical and theranostic applications of engineered GNPs and nanocomposites by using plasmonic properties of GNPs and various optical techniques. Specifically, we consider analytical biosensing; visualization and bioimaging of bacterial, mammalian, and plant cells; photodynamic treatment of pathogenic bacteria; and photothermal therapy of xenografted tumors. In addition to recently published reports, we discuss new data on dot immunoassay diagnostics of mycobacteria, multiplexed immunoelectron microscopy analysis of Azospirillum brasilense, materno-embryonic transfer of GNPs in pregnant rats, and combined photodynamic and photothermal treatment of rat xenografted tumors with gold nanorods covered by a mesoporous silica shell doped with hematoporphyrin. PMID:23471188

  1. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    PubMed

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  2. Extending the Kerberos Protocol for Distributed Data as a Service

    DTIC Science & Technology

    2012-09-20

    exported as a UIMA [11] PEAR file for deployment to IBM Content Analytics (ICA). A UIMA PEAR file is a deployable text analytics "pipeline" (analogous...to a web application packaged in a WAR file). ICA is a text analysis and search application that supports UIMA. The key entities targeted by NLP rules...workbench. [Online]. Available: https://www.ibm.com/developerworks/community/alphaworks/lrw/ [11] Apache UIMA. [Online]. Available: http

  3. Influence of transverse-shear and large-deformation effects on the low-speed impact response of laminated composite plates

    NASA Technical Reports Server (NTRS)

    Ambur, Damodar R.; Starnes, James H., Jr.; Prasad, Chunchu B.

    1993-01-01

    An analytical procedure is presented for determining the transient response of simply supported, rectangular laminated composite plates subjected to impact loads from airgun-propelled or dropped-weight impactors. A first-order shear-deformation theory is included in the analysis to represent properly any local short-wave-length transient bending response. The impact force is modeled as a locally distributed load with a cosine-cosine distribution. A double Fourier series expansion and the Timoshenko small-increment method are used to determine the contact force, out-of-plane deflections, and in-plane strains and stresses at any plate location due to an impact force at any plate location. The results of experimental and analytical studies are compared for quasi-isotropic laminates. The results indicate that using the appropriate local force distribution for the locally loaded area and including transverse-shear-deformation effects in the laminated plate response analysis are important. The applicability of the present analytical procedure based on small deformation theory is investigated by comparing analytical and experimental results for combinations of quasi-isotropic laminate thicknesses and impact energy levels. The results of this study indicate that large-deformation effects influence the response of both 24- and 32-ply laminated plates, and that a geometrically nonlinear analysis is required for predicting the response accurately.
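    The double Fourier series expansion referred to above takes the standard form for a simply supported rectangular plate; writing the plate planform as a × b (the notation here is the usual plate-theory convention, not necessarily the authors'):

    ```latex
    w(x, y, t) \;=\; \sum_{m=1}^{\infty} \sum_{n=1}^{\infty} W_{mn}(t)\,
    \sin\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b}
    ```

    Each sine product satisfies the simply supported boundary conditions identically; expanding the locally distributed impact load in the same basis decouples the problem into modal equations whose amplitudes W_mn(t) can then be advanced in time by the small-increment method.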

  4. Advances in the analysis of biological samples using ionic liquids.

    PubMed

    Clark, Kevin D; Trujillo-Rodríguez, María J; Anderson, Jared L

    2018-02-12

    Ionic liquids are a class of solvents and materials that hold great promise in bioanalytical chemistry. Task-specific ionic liquids have recently been designed for the selective extraction, separation, and detection of proteins, peptides, nucleic acids, and other physiologically relevant analytes from complex biological samples. To facilitate rapid bioanalysis, ionic liquids have been integrated in miniaturized and automated procedures. Bioanalytical separations have also benefited from the modification of nonspecific magnetic materials with ionic liquids or the implementation of ionic liquids with inherent magnetic properties. Furthermore, the direct detection of the extracted molecules in the analytical instrument has been demonstrated with structurally tuned ionic liquids and magnetic ionic liquids, providing a significant advantage in the analysis of low-abundance analytes. This article gives an overview of these advances that involve the application of ionic liquids and derivatives in bioanalysis. Graphical abstract Ionic liquids, magnetic ionic liquids, and ionic liquid-based sorbents are increasing the speed, selectivity, and sensitivity in the analysis of biological samples.

  5. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations

    NASA Astrophysics Data System (ADS)

    Pinho, Ludmila A. G.; Sá-Barreto, Lívia C. L.; Infante, Carlos M. C.; Cunha-Filho, Marcílio S. S.

    2016-04-01

    The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at wavelengths of 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity for quantifying small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form.
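
    The two-wavelength mixture analysis described above reduces to a 2x2 linear system in the two concentrations. The sketch below solves that system by Cramer's rule; the molar absorptivity values in the example are hypothetical placeholders, not the paper's calibration data.

```python
def solve_two_component(A1, A2, e11, e12, e21, e22):
    """Solve the Lambert-Beer system for a binary mixture at two
    wavelengths (unit path length):
        A1 = e11*c1 + e12*c2
        A2 = e21*c1 + e22*c2
    Returns the two concentrations (c1, c2) via Cramer's rule."""
    det = e11 * e22 - e12 * e21
    if abs(det) < 1e-12:
        raise ValueError("absorptivity matrix is singular; wavelengths are not selective")
    c1 = (A1 * e22 - A2 * e12) / det
    c2 = (e11 * A2 - e21 * A1) / det
    return c1, c2
```

    In practice the four absorptivity coefficients come from single-component calibration curves at the two chosen wavelengths, and the determinant check guards against picking wavelengths where the two spectra are not sufficiently distinct.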

  6. Simultaneous determination of benznidazole and itraconazole using spectrophotometry applied to the analysis of mixture: A tool for quality control in the development of formulations.

    PubMed

    Pinho, Ludmila A G; Sá-Barreto, Lívia C L; Infante, Carlos M C; Cunha-Filho, Marcílio S S

    2016-04-15

    The aim of this work was the development of an analytical procedure using spectrophotometry for the simultaneous determination of benznidazole (BNZ) and itraconazole (ITZ) in a medicine used for the treatment of Chagas disease. To achieve this goal, the analysis of mixtures was performed by applying the Lambert-Beer law to the absorbances of BNZ and ITZ at wavelengths of 259 and 321 nm, respectively. Diverse tests were carried out for the development and validation of the method, which proved to be selective, robust, linear, and precise. The low limits of detection and quantification demonstrate its sensitivity for quantifying small amounts of analytes, enabling its application for various analytical purposes, such as dissolution tests and routine assays. In short, the quantification of BNZ and ITZ by analysis of mixtures proved to be an efficient and cost-effective alternative for the determination of these drugs in a pharmaceutical dosage form. Copyright © 2016. Published by Elsevier B.V.

  7. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    NASA Astrophysics Data System (ADS)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of the compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS), and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions under which calibration curves are applicable to quantification of the compositions of solid samples, and their limitations, are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and the Saha equation, has been applied in a number of studies, several requirements need to be satisfied for the calculated chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) and partial least squares (PLS) regression, which can extract composition-related information from all spectral data, are well-established methods and have been applied to various fields, including in-situ applications in air and in planetary exploration. Artificial neural networks (ANNs), which can model non-linear effects, have also been investigated as a quantitative method, and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that accuracy be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square error (NRMSE), when comparing the accuracy obtained from different setups and analytical methods.
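
    As an illustration of the CF-LIBS idea mentioned above, plasma temperature can be estimated from a Boltzmann plot: ln(I*lambda/(g_k*A_ki)) plotted against the upper-level energy E_k has slope -1/(k_B*T) under the local-thermodynamic-equilibrium assumption. The sketch below fits that slope by least squares on synthetic line data; all intensities and level constants are invented.

```python
import math

K_B_EV = 8.617333e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(energies_ev, y):
    """Estimate plasma temperature from a Boltzmann plot.

    energies_ev: upper-level energies E_k (eV)
    y:           ln(I * lambda / (g_k * A_ki)) for each line
    The least-squares slope of y versus E_k is -1/(k_B * T)."""
    n = len(energies_ev)
    mx = sum(energies_ev) / n
    my = sum(y) / n
    cov = sum((e - mx) * (v - my) for e, v in zip(energies_ev, y))
    var = sum((e - mx) ** 2 for e in energies_ev)
    slope = cov / var
    return -1.0 / (K_B_EV * slope)
```

    A real CF-LIBS workflow would additionally need the Saha equation to link ionization stages and a closure condition over all detected elements; this sketch covers only the temperature step.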

  8. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step for the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target-analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments, and low-cost operation through extremely low or no solvent consumption. Microextraction techniques, such as microextraction by packed sorbent (MEPS), have these advantages over traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS, namely its formats (on- and off-line), sorbents, and experimental protocols; the factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent applications of MEPS in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  9. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    PubMed

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
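
    The control-chart analysis described above can be sketched with an individuals (XmR) chart: control limits are set at the baseline mean plus or minus 2.66 times the average moving range, and later points falling outside those limits signal a change. This is a generic sketch with invented data, not the study's actual length-of-stay series.

```python
def xmr_limits(baseline):
    """Individuals (XmR) chart limits from a baseline period:
    centre line = mean of the baseline,
    limits = mean +/- 2.66 * average moving range (|x[i] - x[i-1]|).
    Returns (lower_limit, centre, upper_limit)."""
    mean = sum(baseline) / len(baseline)
    moving_ranges = [abs(b - a) for a, b in zip(baseline, baseline[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

def signals(series, lcl, ucl):
    """Indices of points outside the control limits (Shewhart rule 1)."""
    return [i for i, x in enumerate(series) if x < lcl or x > ucl]
```

    Because the chart preserves the time order of observations, a sustained process shift shows up as soon as points cross a limit, which is the mechanism behind the earlier detection reported above. Additional run rules (e.g., eight consecutive points on one side of the centre line) are often applied as well.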

  10. Factor Analysis and Counseling Research

    ERIC Educational Resources Information Center

    Weiss, David J.

    1970-01-01

    Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…

  11. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-05-10

    Smelling is one of the five senses, and it plays an important role in our everyday lives. Volatile compounds are, for example, characteristic of foods, where some of them can be perceived by humans because of their aroma. They have a great influence on the decision making of consumers when they choose whether or not to use a product. Where a product has an offensive, strong aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aroma of their food products be characterized by analytical means to provide a basis for further optimization. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product, leading to the development of new products that are more acceptable to consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although simple, low-cost technology is an attractive research target in this domain, most of the data are generated with very high-resolution analytical instruments. Data gathered from different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we have addressed not only the application of chemometrics for aroma analysis but also the use of different analytical instruments in this field, highlighting the research needed for future focus.

  12. Application of Partial Least Square (PLS) Analysis on Fluorescence Data of 8-Anilinonaphthalene-1-Sulfonic Acid, a Polarity Dye, for Monitoring Water Adulteration in Ethanol Fuel.

    PubMed

    Kumar, Keshav; Mishra, Ashok Kumar

    2015-07-01

    The fluorescence characteristics of 8-anilinonaphthalene-1-sulfonic acid (ANS) in ethanol-water mixtures, in combination with partial least squares (PLS) analysis, were used to propose a simple and sensitive analytical procedure for monitoring the adulteration of ethanol by water. The proposed analytical procedure was found to be capable of detecting even small levels of adulteration of ethanol by water. The robustness of the procedure is evident from statistical parameters such as the square of the correlation coefficient (R²), the root mean square error of calibration (RMSEC), and the root mean square error of prediction (RMSEP), which were found to be well within acceptable limits.
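
    The figures of merit cited above can be computed as follows. For brevity, the sketch evaluates R², RMSEC, and RMSEP for a simple least-squares calibration line rather than a full PLS model; the calibration and prediction data are invented.

```python
def ols_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def rmse(y_true, y_pred):
    """Root mean square error: reported as RMSEC when evaluated on the
    calibration set and RMSEP when evaluated on an independent
    prediction set."""
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

def r_squared(y_true, y_pred):
    """Coefficient of determination, 1 - SS_res / SS_tot."""
    my = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - my) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

    The same three metrics apply unchanged to a multivariate PLS model; only the way the predictions are generated differs.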

  13. Desorption atmospheric pressure photoionization.

    PubMed

    Haapala, Markus; Pól, Jaroslav; Saarela, Ville; Arvola, Ville; Kotiaho, Tapio; Ketola, Raimo A; Franssila, Sami; Kauppila, Tiina J; Kostiainen, Risto

    2007-10-15

    An ambient ionization technique for mass spectrometry, desorption atmospheric pressure photoionization (DAPPI), is presented, and its application to the rapid analysis of compounds of various polarities on surfaces is demonstrated. The DAPPI technique relies on a heated nebulizer microchip delivering a heated jet of vaporized solvent, e.g., toluene, and a photoionization lamp emitting 10-eV photons. The solvent jet is directed toward sample spots on a surface, causing the desorption of analytes from the surface. The photons emitted by the lamp ionize the analytes, which are then directed into the mass spectrometer. The limits of detection obtained with DAPPI were in the range of 56-670 fmol. Also, the direct analysis of pharmaceuticals from a tablet surface was successfully demonstrated. A comparison of the performance of DAPPI with that of the popular desorption electrospray ionization method was done with four standard compounds. DAPPI was shown to be equally or more sensitive especially in the case of less polar analytes.

  14. Multiple analyte adduct formation in liquid chromatography-tandem mass spectrometry - Advantages and limitations in the analysis of biologically-related samples.

    PubMed

    Dziadosz, Marek

    2018-05-01

    Multiple analyte adduct formation was examined and discussed in the context of reproducible signal detection in liquid chromatography-tandem mass spectrometry applied to the analysis of biologically-related samples. Appropriate infusion solutions were prepared in H2O/methanol (3/97, v/v) with 1 mM sodium acetate and 10 mM acetic acid. An API 4000 QTrap tandem mass spectrometer was used for experiments performed in the negative scan mode (-Q1 MS) and the negative enhanced product ion mode (-EPI). γ‑Hydroxybutyrate and its deuterated form were used as model compounds to highlight both the complexity of adduct formation in commonly used mobile phases and the effective signal compensation achieved by the application of isotope-labelled analytes as internal standards. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. The Application of Survival Analysis to the Study of Psychotherapy Termination

    ERIC Educational Resources Information Center

    Corning, Alexadra F.; Malofeeva, Elena V.

    2004-01-01

    The state of the psychotherapy termination literature to date might best be characterized as inconclusive. Despite decades of studies, almost no predictors of premature termination have emerged consistently. An examination of this literature reveals a number of recurrent methodological-analytical problems that likely have contributed substantially…

  16. A Stochastic Super-Exponential Growth Model for Population Dynamics

    NASA Astrophysics Data System (ADS)

    Avila, P.; Rekker, A.

    2010-11-01

    A super-exponential growth model with environmental noise has been studied analytically. A super-exponential growth rate is a property of dynamical systems exhibiting endogenous nonlinear positive feedback, i.e., of self-reinforcing systems. Environmental noise acts on the growth rate multiplicatively and is assumed to be Gaussian white noise in the Stratonovich interpretation. An analysis of the stochastic super-exponential growth model, with derivations of exact analytical formulae for the conditional probability density and the mean value of the population abundance, is presented. Interpretations and various applications of the results are discussed.
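
    A numerical companion to such an analysis might use the stochastic Heun scheme, which converges to the Stratonovich solution of a multiplicative-noise SDE. The model form below, dx = r·x^(1+theta) dt + sigma·x ∘ dW, is an assumed illustration of super-exponential growth with multiplicative environmental noise, not necessarily the exact model analyzed in the paper.

```python
import math
import random

def heun_stratonovich(x0, r, theta, sigma, t_end, dt, rng=random):
    """Stochastic Heun (predictor-corrector) scheme, consistent with the
    Stratonovich interpretation, for
        dx = r * x**(1 + theta) dt + sigma * x o dW.
    For sigma = 0 and theta > 0 the deterministic trajectory blows up in
    finite time at t* = 1 / (r * theta * x0**theta), the hallmark of
    super-exponential (self-reinforcing) growth."""
    drift = lambda x: r * x ** (1.0 + theta)
    diffusion = lambda x: sigma * x
    x, t = x0, 0.0
    while t < t_end - 1e-12:
        dw = rng.gauss(0.0, math.sqrt(dt))
        # Euler predictor, then trapezoidal corrector with the same dW
        xp = x + drift(x) * dt + diffusion(x) * dw
        x = x + 0.5 * (drift(x) + drift(xp)) * dt + 0.5 * (diffusion(x) + diffusion(xp)) * dw
        t += dt
    return x
```

    With sigma = 0 and theta = 0.5 the scheme reproduces the closed-form deterministic solution x(t) = x0 / (1 - 0.5 r sqrt(x0) t)^2, a useful sanity check before adding noise.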

  17. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science through Cloud-Enabled Climate Analytics-as-a-Service

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.

    2013-12-01

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications.
In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
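
    The MapReduce pattern used by MERRA/AS can be sketched generically: mappers emit partial (sum, count) pairs for each variable in a data chunk, and a reducer combines them into per-variable means. The variable names and values below are invented, and the sketch ignores the file formats and distributed runtime a real deployment would use.

```python
from collections import defaultdict

def map_chunk(chunk):
    """Mapper: for one data chunk (variable -> list of values),
    emit (variable, (partial_sum, count)) pairs."""
    return [(var, (sum(values), len(values))) for var, values in chunk.items()]

def reduce_means(mapped):
    """Reducer: combine partial sums and counts into per-variable means.
    Because sums and counts are associative, partial results from many
    mappers can be merged in any order."""
    acc = defaultdict(lambda: [0.0, 0])
    for var, (s, c) in mapped:
        acc[var][0] += s
        acc[var][1] += c
    return {var: s / c for var, (s, c) in acc.items()}
```

    Keeping the reduction associative is what lets the computation run "data-proximally", with each mapper working on the chunk stored on its own node.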

  18. MERRA Analytic Services: Meeting the Big Data Challenges of Climate Science Through Cloud-enabled Climate Analytics-as-a-service

    NASA Technical Reports Server (NTRS)

    Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.

    2014-01-01

    Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications.
In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.

  19. Rapid, Automated Determination of Elemental Compositions of Ions in Mass Spectra Obtained with an Open-Air Ion Source (2 of 2)

    EPA Science Inventory

    An inexpensive autosampler for a DART/TOFMS provides mass spectra from analytes absorbed on 76 cotton-swab wipe samples in 7.5 min. A field sample carrier simplifies sample collection and delivers swabs to the lab nearly ready for analysis. Applications of the high throughput pr...

  20. Graphene as a Novel Matrix for the Analysis of Small Molecules by MALDI-TOF MS

    PubMed Central

    Dong, Xiaoli; Cheng, Jinsheng; Li, Jinghong; Wang, Yinsheng

    2010-01-01

    Graphene was utilized for the first time as a matrix for the analysis of low-molecular-weight compounds using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Polar compounds including amino acids, polyamines, anticancer drugs and nucleosides could be successfully analyzed. Additionally, nonpolar compounds including steroids could be detected with high resolution and sensitivity. Compared with conventional matrices, graphene exhibited high desorption/ionization efficiency for nonpolar compounds. The graphene matrix functions as a substrate to trap analytes, and it transfers energy to the analytes upon laser irradiation, which allows the analytes to be readily desorbed/ionized and interference from intrinsic matrix ions to be eliminated. The use of graphene as a matrix avoided the fragmentation of analytes and provided good reproducibility and high salt tolerance, underscoring the potential application of graphene as a matrix for MALDI-MS analysis of practical samples in complex sample matrices. We also demonstrated that the use of graphene as an adsorbent for the solid-phase extraction of squalene could greatly improve the detection limit. This work not only opens a new field for applications of graphene, but also offers a new technique for high-speed analysis of low-molecular-weight compounds in areas such as metabolism research and natural products characterization. PMID:20565059

  1. Rapid Harmonic Analysis of Piezoelectric MEMS Resonators.

    PubMed

    Puder, Jonathan M; Pulskamp, Jeffrey S; Rudy, Ryan Q; Cassella, Cristian; Rinaldi, Matteo; Chen, Guofeng; Bhave, Sunil A; Polcawich, Ronald G

    2018-06-01

    This paper reports on a novel simulation method combining the speed of analytical evaluation with the accuracy of finite-element analysis (FEA). This method is known as the rapid analytical-FEA technique (RAFT). The ability of the RAFT to accurately predict frequency response orders of magnitude faster than conventional simulation methods while providing deeper insights into device design not possible with other types of analysis is detailed. Simulation results from the RAFT across wide bandwidths are compared to measured results of resonators fabricated with various materials, frequencies, and topologies with good agreement. These include resonators targeting beam extension, disk flexure, and Lamé beam modes. An example scaling analysis is presented and other applications enabled are discussed as well. The supplemental material includes example code for implementation in ANSYS, although any commonly employed FEA package may be used.

  2. Geometric model of pseudo-distance measurement in satellite location systems

    NASA Astrophysics Data System (ADS)

    Panchuk, K. L.; Lyashkov, A. A.; Lyubchinov, E. V.

    2018-04-01

    The existing mathematical model of pseudo-distance measurement in satellite location systems does not provide an exact solution of the problem, but rather an approximate one. This inaccuracy, together with bias in the measurement of the distance from satellite to receiver, results in errors of several meters. The relevance of refining the current mathematical model is therefore obvious. The solution of the system of quadratic equations used in the current mathematical model is based on linearization. The objective of the paper is refinement of the current mathematical model and derivation of an analytical solution of the system of equations on its basis. To attain this objective, a geometric analysis is performed and a geometric interpretation of the equations is given. As a result, an equivalent system of equations, which admits an analytical solution, is derived. An example of the analytical solution's implementation is presented. Application of the analytical solution algorithm to the problem of pseudo-distance measurement in satellite location systems makes it possible to improve the accuracy of such measurements.
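
    For comparison, the conventional linearized treatment that the paper sets out to refine can be sketched as a Gauss-Newton least-squares solution of the pseudo-distance (pseudorange) equations rho_i = |s_i - x| + b, where x is the receiver position and b the receiver clock bias expressed in metres. The satellite geometry and clock bias below are invented, and NumPy is assumed to be available.

```python
import numpy as np

def solve_pseudoranges(sats, rho, iters=10):
    """Iteratively linearized (Gauss-Newton) solution of
        rho_i = |s_i - x| + b
    for receiver position x (3-vector, metres) and clock bias b (metres).

    sats: (n, 3) array of satellite positions; rho: (n,) pseudoranges.
    Each Jacobian row is [unit vector from satellite toward receiver, 1]."""
    est = np.zeros(4)  # x, y, z, b; initial guess at the origin
    for _ in range(iters):
        d = np.linalg.norm(sats - est[:3], axis=1)
        r = rho - (d + est[3])  # residuals of the current estimate
        J = np.hstack([(est[:3] - sats) / d[:, None], np.ones((len(sats), 1))])
        est = est + np.linalg.lstsq(J, r, rcond=None)[0]
    return est[:3], est[3]
```

    Because the pseudorange equations are only mildly nonlinear at GNSS distances, this iteration converges in a few steps for well-spread satellites; the paper's contribution is an equivalent system that avoids the linearization step altogether.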

  3. Swarm intelligence metaheuristics for enhanced data analysis and optimization.

    PubMed

    Hanrahan, Grady

    2011-09-21

    The swarm intelligence (SI) computing paradigm has proven itself as a comprehensive means of solving complicated analytical chemistry problems by emulating biologically-inspired processes. As global optimum search metaheuristics, associated algorithms have been widely used in training neural networks, function optimization, prediction and classification, and in a variety of process-based analytical applications. The goal of this review is to provide readers with critical insight into the utility of swarm intelligence tools as methods for solving complex chemical problems. Consideration will be given to algorithm development, ease of implementation and model performance, detailing subsequent influences on a number of application areas in the analytical, bioanalytical and detection sciences.
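
    A minimal particle swarm optimization (PSO) sketch, one of the SI metaheuristics reviewed: each particle keeps weighted inertia while being pulled toward its personal best and the swarm's global best. All parameters and the toy objective below are illustrative.

```python
import random

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=42):
    """Global-best PSO minimizing f over R^dim.
    w: inertia weight; c1/c2: cognitive and social acceleration factors."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:  # update personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:  # update global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda x: sum(v * v for v in x)  # toy objective with minimum 0 at the origin
```

    In chemometric use the objective f would instead score, e.g., a neural-network weight vector or a wavelength-selection mask against calibration error; the update rule is unchanged.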

  4. Analytical Applications of NMR: Summer Symposium on Analytical Chemistry.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1982-01-01

    Highlights a symposium on analytical applications of nuclear magnetic resonance spectroscopy (NMR), discussing pulse Fourier transformation technique, two-dimensional NMR, solid state NMR, and multinuclear NMR. Includes description of ORACLE, an NMR data processing system at Syracuse University using real-time color graphics, and algorithms for…

  5. Computer program for analysis of high speed, single row, angular contact, spherical roller bearing, SASHBEAN. Volume 2: Mathematical formulation and analysis

    NASA Technical Reports Server (NTRS)

    Aggarwal, Arun K.

    1993-01-01

    Spherical roller bearings have typically been used in applications with speeds limited to about 5000 rpm and loads limited to operation at less than about 0.25 million DN. However, spherical roller bearings are now being designed for high-load and high-speed applications, including aerospace applications. A computer program, SASHBEAN, was developed to provide an analytical tool to design, analyze, and predict the performance of high speed, single row, angular contact (including zero contact angle), spherical roller bearings. The material presented is the mathematical formulation and the analytical methods used to develop the computer program SASHBEAN. For a given set of operating conditions, the program calculates the bearing's ring deflections (axial and radial), roller deflections, contact areas and stresses, depth and magnitude of maximum shear stresses, axial thrust, rolling-element and cage rotational speeds, lubrication parameters, fatigue lives, and rates of heat generation. Centrifugal forces and gyroscopic moments are fully considered. The program is also capable of performing steady-state and time-transient thermal analyses of the bearing system.

  6. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in thismore » domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data, and complements data mining technologies where known patterns can be mined for. Also with a human in the loop, they can bring in domain knowledge and subject matter expertise. Visual analytics has not widely been applied to this domain. In this paper, we will focus on one type of data: structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. 
In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we will show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We will also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.
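A facet-style exploration of structured data, of the kind T.Rex's facet view performs, can be illustrated with a minimal sketch (the records, field names, and counts below are hypothetical, not taken from the datasets in the paper):

```python
from collections import Counter

# Toy shipment records (hypothetical fields and values, for illustration only)
records = [
    {"origin": "Country A", "destination": "Country B", "material": "steel"},
    {"origin": "Country A", "destination": "Country B", "material": "pumps"},
    {"origin": "Country C", "destination": "Country B", "material": "steel"},
]

def facet_counts(records, field):
    """Count how often each value of one categorical field occurs."""
    return Counter(r[field] for r in records)

def co_occurrence(records, field_a, field_b):
    """Count value pairs across two fields, a facet-style linkage."""
    return Counter((r[field_a], r[field_b]) for r in records)

print(facet_counts(records, "origin"))
print(co_occurrence(records, "origin", "destination"))
```

In a real tool the co-occurrence counts would drive an interactive linked view; here they simply expose which categorical values tend to appear together.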

  7. Solid State Laser

    NASA Technical Reports Server (NTRS)

    1990-01-01

The Titan-CW Ti:sapphire (titanium-doped sapphire) tunable laser is an innovation in solid-state laser technology jointly developed by the Research and Solid State Laser Divisions of Schwartz Electro-optics, Inc. (SEO). SEO is producing the laser for the commercial market, an outgrowth of a program sponsored by Langley Research Center to develop Ti:sapphire technology for space use. SEO's Titan-CW series of Ti:sapphire tunable lasers has applicability in analytical equipment designed for qualitative analysis of carbohydrates and proteins, structural analysis of water, starch/sugar analyses, and measurements of salt in meat. Further applications are expected in semiconductor manufacture, in medicine for diagnosis and therapy, and in biochemistry.

  8. Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom

    ERIC Educational Resources Information Center

    Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy

    2016-01-01

    The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…

  9. Micromechanical analysis and design of an integrated thermal protection system for future space vehicles

    NASA Astrophysics Data System (ADS)

    Martinez, Oscar

Thermal protection systems (TPS) are the key features incorporated into a spacecraft's design to protect it from severe aerodynamic heating during high-speed travel through planetary atmospheres. The thermal protection system is the key technology that enables a spacecraft to be lightweight, fully reusable, and easily maintainable. Add-on TPS concepts have been used since the beginning of the space race: the Apollo space capsule used an ablative TPS, and the Space Shuttle Orbiter TPS consisted of ceramic tiles and blankets. Many problems arose from the add-on concept, such as incompatibility, high maintenance costs, lack of load-bearing capability, and limited robustness and operability. To make the spacecraft's TPS more reliable, robust, and efficient, we investigated the Integral Thermal Protection System (ITPS) concept, in which the load-bearing structure and the TPS are combined into a single component. The design of an ITPS is a challenging task because the requirements of a load-bearing structure and a TPS often conflict. Finite element (FE) analysis is often the method of choice for a structural analysis problem. However, as the structure becomes complex, the computational time and effort for an FE analysis increase. New structural analytical tools were developed, or available ones were modified, to perform a full structural analysis of the ITPS. With analytical tools, the designer can obtain quick and accurate results and a good idea of the response of the structure without having to resort to an FE analysis. A MATLAB® code was developed to analytically determine performance metrics of the ITPS such as stresses, buckling, deflection, and other failure modes. The analytical models provide fast and accurate results that were within 5% of the FEM results. The optimization procedure typically performs 100 function evaluations for every design variable.
Using the analytical models in the optimization procedure was a time saver: an optimum design was reached in less than an hour, whereas an FE optimization study would take many hours. Corrugated-core structures were designed for ITPS applications with loads and boundary conditions similar to those of a Space Shuttle-like vehicle. Temperature, buckling, deflection, and stress constraints were considered in the design and optimization process. An optimized design was achieved with consideration of all the constraints. The ITPS design obtained from the analytical solutions was lighter (4.38 lb/ft²) than the ITPS design obtained from a finite element analysis (4.85 lb/ft²). The ITPS boundary effects added local stresses and compressive loads to the top facesheet that could not be captured by the 2D plate solutions. The inability to fully capture the boundary effects led to a lighter ITPS compared to the FE solution. However, the ITPS can withstand substantially larger mechanical loads than the previous designs. Truss-core structures were found to be unsuitable, as they could not withstand the large thermal gradients frequently encountered in ITPS applications.
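The advantage of cheap analytical evaluations over FE runs during optimization can be sketched in a few lines. The panel model below is a made-up stand-in, not the dissertation's MATLAB code: closed-form expressions for mass, stress, and deflection make thousands of function evaluations affordable within a simple constrained search.

```python
# Hypothetical closed-form panel model (illustrative stand-in, not the
# dissertation's ITPS model): mass grows with facesheet/web thickness,
# while stress and deflection fall as the sections get thicker.
def mass_per_area(t_face, t_web):
    return 2.0 * t_face + 1.5 * t_web

def max_stress(t_face, t_web, load=10.0):
    return load / (t_face + 0.5 * t_web)

def max_deflection(t_face, t_web, load=10.0):
    return 0.01 * load / (t_face**3 + t_web**3)

def feasible(t_face, t_web, stress_limit=25.0, defl_limit=0.1):
    return (max_stress(t_face, t_web) <= stress_limit and
            max_deflection(t_face, t_web) <= defl_limit)

# Because each evaluation is a closed-form expression, an exhaustive
# 100 x 100 search (10,000 evaluations) is affordable; an FE-based loop
# could not sweep the design space this cheaply.
best = None
for i in range(1, 101):
    for j in range(1, 101):
        t_f, t_w = i * 0.01, j * 0.01
        if feasible(t_f, t_w):
            m = mass_per_area(t_f, t_w)
            if best is None or m < best[0]:
                best = (m, t_f, t_w)

print(best)  # lightest feasible design found by the grid search
```

A real ITPS optimization would add temperature and buckling constraints and more design variables, but the structure of the loop is the same.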

  10. The role of atomic fluorescence spectrometry in the automatic environmental monitoring of trace element analysis

    PubMed Central

    Stockwell, P. B.; Corns, W. T.

    1993-01-01

    Considerable attention has been drawn to the environmental levels of mercury, arsenic, selenium and antimony in the last decade. Legislative and environmental pressure has forced levels to be lowered and this has created an additional burden for analytical chemists. Not only does an analysis have to reach lower detection levels, but it also has to be seen to be correct. Atomic fluorescence detection, especially when coupled to vapour generation techniques, offers both sensitivity and specificity. Developments in the design of specified atomic fluorescence detectors for mercury, for the hydride-forming elements and also for cadmium, are described in this paper. Each of these systems is capable of analysing samples in the part per trillion (ppt) range reliably and economically. Several analytical applications are described. PMID:18924964

  11. gQTL: A Web Application for QTL Analysis Using the Collaborative Cross Mouse Genetic Reference Population.

    PubMed

    Konganti, Kranti; Ehrlich, Andre; Rusyn, Ivan; Threadgill, David W

    2018-06-07

Multi-parental recombinant inbred populations, such as the Collaborative Cross (CC) mouse genetic reference population, are increasingly being used for analysis of quantitative trait loci (QTL). However, specialized analytic software for these complex populations is typically built in R and works only on the command line, which limits the utility of these powerful resources for many users. To overcome these analytic limitations, we developed gQTL, a web-accessible application with a simple graphical user interface, based on the DOQTL platform in R, to perform QTL mapping using data from CC mice. Copyright © 2018, G3: Genes, Genomes, Genetics.

  12. Forensic applications of desorption electrospray ionisation mass spectrometry (DESI-MS).

    PubMed

    Morelato, Marie; Beavis, Alison; Kirkbride, Paul; Roux, Claude

    2013-03-10

    Desorption electrospray ionisation mass spectrometry (DESI-MS) is an emerging analytical technique that enables in situ mass spectrometric analysis of specimens under ambient conditions. It has been successfully applied to a large range of forensically relevant materials. This review assesses and highlights forensic applications of DESI-MS including the analysis and detection of illicit drugs, explosives, chemical warfare agents, inks and documents, fingermarks, gunshot residues and drugs of abuse in urine and plasma specimens. The minimal specimen preparation required for analysis and the sensitivity of detection achieved offer great advantages, especially in the field of forensic science. Crown Copyright © 2013. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Analytical Characterization of Erythritol Tetranitrate, an Improvised Explosive.

    PubMed

    Matyáš, Robert; Lyčka, Antonín; Jirásko, Robert; Jakový, Zdeněk; Maixner, Jaroslav; Mišková, Linda; Künzel, Martin

    2016-05-01

Erythritol tetranitrate (ETN), an ester of nitric acid and erythritol, is a solid crystalline explosive with high explosive performance. Although it has never been used in any industrial or military application, it has become one of the most frequently prepared and misused improvised explosives. In this study, several analytical techniques were explored to facilitate its analysis in forensic laboratories. FTIR and Raman spectrometry measurements expand existing data and provide a more detailed assignment of bands through the parallel study of erythritol [¹⁵N₄]tetranitrate. In the case of powder diffraction, recently published data were verified, and ¹H, ¹³C, and ¹⁵N NMR spectra are discussed in detail. Electrospray ionization tandem mass spectrometry was successfully used for the analysis of ETN. The described methods allow fast, versatile, and reliable detection or analysis of samples containing erythritol tetranitrate in forensic laboratories. © 2016 American Academy of Forensic Sciences.

  14. Molecularly Imprinted Nanomaterials for Sensor Applications

    PubMed Central

    Irshad, Muhammad; Iqbal, Naseer; Mujahid, Adnan; Afzal, Adeel; Hussain, Tajamal; Sharif, Ahsan; Ahmad, Ejaz; Athar, Muhammad Makshoof

    2013-01-01

Molecular imprinting is a well-established technology for mimicking antibody-antigen interactions in a synthetic platform. Molecularly imprinted polymers and nanomaterials usually possess outstanding recognition capabilities. Imprinted nanostructured materials are characterized by their small sizes, large reactive surface areas and, most importantly, rapid and specific analyte recognition due to the formation of template-driven recognition cavities within the matrix. The excellent recognition and selectivity offered by this class of materials towards a target analyte have found applications in many areas, such as separation science, analysis of organic pollutants in water, environmental analysis of trace gases, chemical or biological sensors, biochemical assays, fabrication of artificial receptors, and nanotechnology. We present here a concise overview of recent developments in nanostructured imprinted materials with respect to various sensor systems, e.g., electrochemical, optical, and mass-sensitive. Finally, in light of recent studies, we conclude the article with future perspectives and foreseen applications of imprinted nanomaterials in chemical sensors. PMID:28348356

  15. A Unifying Review of Bioassay-Guided Fractionation, Effect-Directed Analysis and Related Techniques

    PubMed Central

    Weller, Michael G.

    2012-01-01

The success of modern methods in analytical chemistry sometimes obscures the problem that the ever-increasing amount of analytical data does not necessarily yield more insight of practical relevance. As alternative approaches, toxicity- and bioactivity-based assays can deliver valuable information about the biological effects of complex materials on humans, other species, or even ecosystems. However, the observed effects often cannot be clearly assigned to specific chemical compounds; in these cases, establishing an unambiguous cause-effect relationship is not possible. Effect-directed analysis tries to interconnect instrumental analytical techniques with a biological/biochemical entity that identifies or isolates substances of biological relevance. Successful application has been demonstrated in many fields, either as proof-of-principle studies or even for complex samples. This review discusses the different approaches, their advantages and limitations, and finally shows some practical examples. The broad emergence of effect-directed analytical concepts might lead to a true paradigm shift in analytical chemistry, away from ever-growing lists of chemical compounds. Connecting biological effects with the identification and quantification of molecular entities leads to relevant answers to many real-life questions. PMID:23012539

  16. Analytic Closed-Form Solution of a Mixed Layer Model for Stratocumulus Clouds

    NASA Astrophysics Data System (ADS)

    Akyurek, Bengu Ozge

    Stratocumulus clouds play an important role in climate cooling and are hard to predict using global climate and weather forecast models. Thus, previous studies in the literature use observations and numerical simulation tools, such as large-eddy simulation (LES), to solve the governing equations for the evolution of stratocumulus clouds. In contrast to the previous works, this work provides an analytic closed-form solution to the cloud thickness evolution of stratocumulus clouds in a mixed-layer model framework. With a focus on application over coastal lands, the diurnal cycle of cloud thickness and whether or not clouds dissipate are of particular interest. An analytic solution enables the sensitivity analysis of implicitly interdependent variables and extrema analysis of cloud variables that are hard to achieve using numerical solutions. In this work, the sensitivity of inversion height, cloud-base height, and cloud thickness with respect to initial and boundary conditions, such as Bowen ratio, subsidence, surface temperature, and initial inversion height, are studied. A critical initial cloud thickness value that can be dissipated pre- and post-sunrise is provided. Furthermore, an extrema analysis is provided to obtain the minima and maxima of the inversion height and cloud thickness within 24 h. The proposed solution is validated against LES results under the same initial and boundary conditions. Then, the proposed analytic framework is extended to incorporate multiple vertical columns that are coupled by advection through wind flow. This enables a bridge between the micro-scale and the mesoscale relations. The effect of advection on cloud evolution is studied and a sensitivity analysis is provided.

  17. [Development of an Operational Model for the Application of Planning-Programming-Budgeting Systems in Local School Districts. Program Budgeting Note 3, Cost-Effectiveness Analysis: What Is It?

    ERIC Educational Resources Information Center

    State Univ. of New York, Buffalo. Western New York School Study Council.

    Cost effectiveness analysis is used in situations where benefits and costs are not readily converted into a money base. Five elements can be identified in such an analytic process: (1) The objective must be defined in terms of what it is and how it is attained; (2) alternatives to the objective must be clearly definable; (3) the costs must be…

  18. Investigation of translaminar fracture in fibre-reinforced composite laminates: applicability of linear elastic fracture mechanics and the cohesive-zone model

    NASA Astrophysics Data System (ADS)

    Hou, Fang

With the extensive application of fiber-reinforced composite laminates in industry, research on the fracture mechanisms of this type of material has drawn increasing attention. A variety of fracture theories and models have been developed. Among them, linear elastic fracture mechanics (LEFM) and the cohesive-zone model (CZM) are two widely accepted fracture models, which have already shown applicability in the fracture analysis of fiber-reinforced composite laminates. However, challenges remain that prevent further application of the two fracture models, such as the experimental measurement of fracture resistance. This dissertation primarily focused on the applicability of LEFM and the CZM for the analysis of translaminar fracture in fibre-reinforced composite laminates. The research for each fracture model consisted of two parts: the analytical characterization of crack-tip fields and the experimental measurement of fracture resistance parameters. In the study of LEFM, an experimental investigation based on full-field crack-tip displacement measurements was carried out to characterize subcritical and steady-state crack advance in translaminar fracture of fiber-reinforced composite laminates. Here, the laminates were approximated as anisotropic solids, and the investigation relied on LEFM theory with a modification accounting for material anisotropy. First, the full-field crack-tip displacement fields were measured by Digital Image Correlation (DIC). Two methods, based respectively on the stress intensity approach and the energy approach, were then developed to extract the crack-tip field parameters from the displacement fields. The studied crack-tip field parameters included the stress intensity factor, the energy release rate, and the effective crack length.
Moreover, crack-growth resistance curves (R-curves) were constructed with the measured crack-tip field parameters. In addition, an error analysis was carried out with an emphasis on the influence of out-of-plane rotation of the specimen. In the study of the CZM, two analytical inverse methods, namely the field projection method (FPM) and the separable nonlinear least-squares method, were developed for extracting cohesive fracture properties from crack-tip full-field displacements. First, analytical characterizations of the elastic fields around a crack-tip cohesive zone and of the cohesive variables within the cohesive zone were derived in terms of an eigenfunction expansion. Both inverse methods were then developed from this analytical characterization. With these methods, the cohesive-zone law (CZL), cohesive-zone size, and position can be computed inversely from cohesive-crack-tip displacement fields. Comprehensive numerical tests were carried out to investigate the applicability and robustness of the two inverse methods. The field projection method was found to be very sensitive to noise and thus of limited applicability in practice, whereas the separable nonlinear least-squares method was more noise-resistant and less ill-conditioned. Subsequently, the applicability of the separable nonlinear least-squares method was validated with the same translaminar fracture experiment used in the study of LEFM. The experimental measurements of R-curves and the CZL showed good agreement in both the fracture energy and the predicted load-carrying capability, demonstrating the validity of the present research for translaminar fracture of fiber-reinforced composite laminates.

  19. New-generation bar adsorptive microextraction (BAμE) devices for a better eco-user-friendly analytical approach-Application for the determination of antidepressant pharmaceuticals in biological fluids.

    PubMed

    Ide, A H; Nogueira, J M F

    2018-05-10

The present contribution aims to design new-generation bar adsorptive microextraction (BAμE) devices that promote an innovative and much more user-friendly analytical approach. The novel BAμE devices were prepared in-house with smaller dimensions, using flexible nylon-based supports (7.5 × 1.0 mm) coated with convenient sorbents (≈ 0.5 mg). This advance allows effective microextraction and back-extraction ('only a single liquid desorption step') stages, as well as enhanced interfacing with the instrumental systems dedicated to routine analysis. To evaluate these improvements, four antidepressant agents (bupropion, citalopram, amitriptyline and trazodone) were used as model compounds in aqueous media combined with liquid chromatography (LC) systems. By using an N-vinylpyrrolidone-based polymer phase, good selectivity and efficiency were obtained. Assays performed on 25 mL spiked aqueous samples yielded average recoveries between 67.8 ± 12.4% (bupropion) and 88.3 ± 12.1% (citalopram) under optimized experimental conditions. The analytical performance also showed convenient precision (RSD < 12%) and detection limits (50 ng L⁻¹), as well as linear dynamic ranges (160-2000 ng L⁻¹) with suitable determination coefficients (r² > 0.9820). The application of the proposed analytical approach to biological fluids showed negligible matrix effects when the standard addition methodology was used. From the data obtained, the new-generation BAμE devices presented herein provide an innovative and robust analytical cycle, are simple to prepare, cost-effective, user-friendly, and compatible with current LC autosampler systems. Furthermore, the novel devices were designed to be disposable and to require only negligible amounts of organic solvents (100 μL) during back-extraction, in compliance with green analytical chemistry principles.
In short, the new-generation BAμE devices proved to be an eco-user-friendly approach for trace analysis of priority compounds in biological fluids and a versatile alternative to other well-established sorption-based microextraction techniques. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Fe₂O₃-Au hybrid nanoparticles for sensing applications via SERS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murph, Simona Hunyadi; Searles, Emily

    2017-06-25

Nanoparticles, with their large surface areas and unique characteristics distinct from the bulk material, offer an interesting route to enhancing the inelastic scattering signal. Surface Enhanced Raman Spectroscopy (SERS) strives to increase the Raman scattering effect when chemical species of interest are in close proximity to metallic nanostructures. Gold nanoparticles of various shapes have been used for sensing applications via SERS, as they demonstrate the strongest plasmonic behavior in the visible-near IR region of the spectrum. When coupled with other nanoparticles, namely iron oxide nanoparticles, hybrid structures with increased functionality were produced. Multifunctional iron oxide-gold hybrid nanostructures have been created via solution chemistries and investigated for detection of a model analyte. By exploiting their magnetic properties, nanogaps or "hot spots" were rationally created and evaluated for SERS enhancement studies.

  1. A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines

    PubMed Central

    Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua

    2018-01-01

The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxins in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provide a detailed summary of the related analytical methods for mycotoxin determination. This review focuses on analytical techniques, including sampling, extraction, cleanup, and detection, for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for mycotoxin qualification or quantitation, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides good insight into the advanced research that has been done and closes with an indication of future demand for the emerging technologies. PMID:29393905

  2. Network analysis applications in hydrology

    NASA Astrophysics Data System (ADS)

    Price, Katie

    2017-04-01

Applied network theory has seen pronounced expansion in recent years, in fields such as epidemiology, computer science, and sociology. Concurrent development of analytical methods and frameworks has increased the possibilities and tools available to researchers seeking to apply network theory to a variety of problems. While water and nutrient fluxes through stream systems clearly demonstrate a directional network structure, the hydrological applications of network theory remain underexplored. This presentation covers a review of network applications in hydrology, followed by an overview of promising network analytical tools that potentially offer new insights into conceptual modeling of hydrologic systems, identifying behavioral transition zones in stream networks, and thresholds of dynamical system response. Network applications were tested along an urbanization gradient in two watersheds in Atlanta, Georgia, USA: Peachtree Creek and Proctor Creek. Peachtree Creek contains a nest of five long-term USGS streamflow and water quality gages, allowing network application of long-term flow statistics. The watershed spans a range of suburban and heavily urbanized conditions. Summary flow statistics and water quality metrics were analyzed using a suite of network analysis techniques to test the conceptual modeling and predictive potential of the methodologies. Storm events and low-flow dynamics during Summer 2016 were analyzed using multiple network approaches, with an emphasis on tomogravity methods. Results indicate that network theory approaches offer novel perspectives for understanding long-term and event-based hydrological data. Key future directions for network applications include 1) optimizing data collection, 2) identifying "hotspots" of contaminant and overland flow influx to stream systems, 3) defining process domains, and 4) analyzing dynamic connectivity of various system components, including groundwater-surface water interactions.
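The directional network structure of a stream system can be sketched with a toy example (hypothetical topology and flow values, not data from the Atlanta watersheds): edges point downstream, and a node's accumulated flow is its own contribution plus everything upstream of it.

```python
# Toy directional stream network (hypothetical topology and flows):
# each edge points downstream; each node has a local runoff contribution.
downstream = {
    "trib1": "confluence",
    "trib2": "confluence",
    "confluence": "outlet",
}
local_flow = {"trib1": 2.0, "trib2": 3.0, "confluence": 1.0, "outlet": 0.5}

def accumulated_flow(node):
    """Local contribution of a node plus everything upstream of it."""
    upstream = [n for n, d in downstream.items() if d == node]
    return local_flow[node] + sum(accumulated_flow(n) for n in upstream)

print(accumulated_flow("outlet"))  # 6.5: the whole network drains here
```

The same traversal generalizes to real gage networks, where "local flow" becomes a measured or modeled quantity per reach.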

  3. 78 FR 63522 - Syntax Analytics, LLC and Syntax ETF Trust; Notice of Application

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-10-24

    ... Analytics, LLC and Syntax ETF Trust; Notice of Application October 18, 2013. AGENCY: Securities and Exchange... Trust (``Trust''). Summary of Application: Applicants request an order that permits: (a) Actively... unit investment trusts outside of the same group of investment companies as the series to acquire...

  4. Current Applications of Chromatographic Methods in the Study of Human Body Fluids for Diagnosing Disorders.

    PubMed

    Jóźwik, Jagoda; Kałużna-Czaplińska, Joanna

    2016-01-01

    Currently, analysis of various human body fluids is one of the most essential and promising approaches to enable the discovery of biomarkers or pathophysiological mechanisms for disorders and diseases. Analysis of these fluids is challenging due to their complex composition and unique characteristics. Development of new analytical methods in this field has made it possible to analyze body fluids with higher selectivity, sensitivity, and precision. The composition and concentration of analytes in body fluids are most often determined by chromatography-based techniques. There is no doubt that proper use of knowledge that comes from a better understanding of the role of body fluids requires the cooperation of scientists of diverse specializations, including analytical chemists, biologists, and physicians. This article summarizes current knowledge about the application of different chromatographic methods in analyses of a wide range of compounds in human body fluids in order to diagnose certain diseases and disorders.

  5. A Review of Interface Electronic Systems for AT-cut Quartz Crystal Microbalance Applications in Liquids

    PubMed Central

    Arnau, Antonio

    2008-01-01

Since the first applications of AT-cut quartz crystals as sensors in solutions more than 20 years ago, the so-called quartz crystal microbalance (QCM) sensor has become a good alternative analytical method in a wide range of applications such as biosensors, analysis of biomolecular interactions, study of bacterial adhesion at specific interfaces, pathogen and microorganism detection, study of polymer film-biomolecule or cell-substrate interactions, immunosensors, and extensive use in fluid and polymer characterization and electrochemical applications, among others. The appropriate evaluation of this analytical method requires recognizing the different steps involved and being aware of their importance and limitations. The first step in a QCM system is the accurate and appropriate characterization of the sensor in relation to the specific application. The use of the piezoelectric sensor in contact with solutions strongly affects its behavior, and appropriate electronic interfaces must be used for adequate sensor characterization. Systems based on different principles and techniques have been implemented during the last 25 years. The interface selection for the specific application is important, and its limitations must be known in order to judge its suitability and to avoid error propagation in the interpretation of results. This article presents a comprehensive overview of the different techniques used for AT-cut quartz crystal microbalance in in-solution applications, which are based on the following principles: network or impedance analyzers, decay methods, oscillators, and lock-in techniques. The electronic interfaces based on oscillators and phase-locked techniques are treated in detail, with descriptions of different configurations, since these techniques are the most used in applications for detection of analytes in solutions and in those where a fast sensor response is necessary. PMID:27879713

  6. Laser Induced Breakdown Spectroscopy for Elemental Analysis in Environmental, Cultural Heritage and Space Applications: A Review of Methods and Results

    PubMed Central

    Gaudiuso, Rosalba; Dell’Aglio, Marcella; De Pascale, Olga; Senesi, Giorgio S.; De Giacomo, Alessandro

    2010-01-01

Analytical applications of Laser Induced Breakdown Spectroscopy (LIBS), namely optical emission spectroscopy of laser-induced plasmas, have been constantly growing thanks to its intrinsic conceptual simplicity and versatility. Qualitative and quantitative analysis can be performed by LIBS both by drawing calibration lines and by using calibration-free methods, and some of its features, such as fast multi-elemental response, micro-destructiveness, and instrumentation portability, have rendered it particularly suitable for analytical applications in the fields of environmental science, space exploration, and cultural heritage. This review reports and discusses LIBS achievements in these areas and results obtained for soils and aqueous samples, meteorites and terrestrial samples simulating extraterrestrial planets, and cultural heritage samples, including buildings and objects of various kinds. PMID:22163611

  7. Laser induced breakdown spectroscopy for elemental analysis in environmental, cultural heritage and space applications: a review of methods and results.

    PubMed

    Gaudiuso, Rosalba; Dell'Aglio, Marcella; De Pascale, Olga; Senesi, Giorgio S; De Giacomo, Alessandro

    2010-01-01

Analytical applications of Laser Induced Breakdown Spectroscopy (LIBS), namely optical emission spectroscopy of laser-induced plasmas, have been constantly growing thanks to its intrinsic conceptual simplicity and versatility. Qualitative and quantitative analysis can be performed by LIBS both by drawing calibration lines and by using calibration-free methods, and some of its features, such as fast multi-elemental response, micro-destructiveness, and instrumentation portability, have rendered it particularly suitable for analytical applications in the fields of environmental science, space exploration, and cultural heritage. This review reports and discusses LIBS achievements in these areas and results obtained for soils and aqueous samples, meteorites and terrestrial samples simulating extraterrestrial planets, and cultural heritage samples, including buildings and objects of various kinds.

  8. Healthcare predictive analytics: An overview with a focus on Saudi Arabia.

    PubMed

    Alharthi, Hana

    2018-03-08

    Despite a newfound wealth of data and information, the healthcare sector is lacking in actionable knowledge. This is largely because healthcare data, though plentiful, tends to be inherently complex and fragmented. Health data analytics, with an emphasis on predictive analytics, is emerging as a transformative tool that can enable more proactive and preventative treatment options. This review considers the ways in which predictive analytics has been applied in the for-profit business sector to generate well-timed and accurate predictions of key outcomes, with a focus on key features that may be applicable to healthcare-specific applications. Published medical research presenting assessments of predictive analytics technology in medical applications is reviewed, with particular emphasis on how hospitals have integrated predictive analytics into their day-to-day healthcare services to improve quality of care. This review also highlights the numerous challenges of implementing predictive analytics in healthcare settings and concludes with a discussion of current efforts to implement healthcare data analytics in Saudi Arabia, a developing country. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.
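
    The kind of outcome prediction surveyed here ultimately reduces to scoring patient features with a fitted model. A minimal illustrative sketch using a logistic function with invented coefficients (not a clinical model; all names and values are hypothetical):

```python
import math

# Hypothetical logistic model for 30-day readmission risk.
# Coefficients are invented for illustration only.
coeffs = {"age": 0.03, "prior_admissions": 0.5, "comorbidity_index": 0.4}
intercept = -4.0

def readmission_risk(patient):
    """Return a predicted probability of readmission for a feature dict."""
    z = intercept + sum(coeffs[k] * patient[k] for k in coeffs)
    return 1.0 / (1.0 + math.exp(-z))

low = readmission_risk({"age": 40, "prior_admissions": 0, "comorbidity_index": 1})
high = readmission_risk({"age": 75, "prior_admissions": 3, "comorbidity_index": 4})
```

    In practice the coefficients would be estimated from historical records, and the score would feed a workflow (e.g., flagging high-risk patients for follow-up) rather than stand alone.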

  9. An automated protocol for performance benchmarking a widefield fluorescence microscope.

    PubMed

    Halter, Michael; Bier, Elianna; DeRose, Paul C; Cooksey, Gregory A; Choquette, Steven J; Plant, Anne L; Elliott, John T

    2014-11-01

    Widefield fluorescence microscopy is a highly used tool for visually assessing biological samples and for quantifying cell responses. Despite its widespread use in high content analysis and other imaging applications, few published methods exist for evaluating and benchmarking the analytical performance of a microscope. Easy-to-use benchmarking methods would facilitate the use of fluorescence imaging as a quantitative analytical tool in research applications, and would aid the determination of instrumental method validation for commercial product development applications. We describe and evaluate an automated method to characterize a fluorescence imaging system's performance by benchmarking the detection threshold, saturation, and linear dynamic range to a reference material. The benchmarking procedure is demonstrated using two different materials as the reference material, uranyl-ion-doped glass and Schott 475 GG filter glass. Both are suitable candidate reference materials that are homogeneously fluorescent and highly photostable, and the Schott 475 GG filter glass is currently commercially available. In addition to benchmarking the analytical performance, we also demonstrate that the reference materials provide for accurate day-to-day intensity calibration. Published 2014 Wiley Periodicals Inc. This article is a US government work and, as such, is in the public domain in the United States of America.
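
    The three benchmarked quantities (detection threshold, saturation, linear dynamic range) can be estimated from intensity measurements of a stable reference at increasing exposures. A sketch with hypothetical camera values, not the authors' protocol:

```python
import numpy as np

# Hypothetical mean intensities of a photostable reference material
# at increasing exposure times (ms) on a 12-bit camera (max count 4095).
exposure_ms = np.array([1, 2, 5, 10, 20, 50, 100, 200])
intensity = np.array([12.0, 22.0, 55.0, 108.0, 215.0, 530.0, 1060.0, 2100.0])
noise_floor = 10.0        # assumed mean dark signal plus 3 sigma
saturation = 4095 * 0.9   # treat 90% of full scale as saturated

# Points inside the usable range: above detection threshold, below saturation.
usable = (intensity > noise_floor) & (intensity < saturation)

# Linear dynamic range in decades: log10 of max/min usable signal.
dynamic_range_decades = np.log10(intensity[usable].max() / intensity[usable].min())
```

    A fuller benchmark would also check linearity (e.g., residuals of a straight-line fit of intensity versus exposure) inside the usable range.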

  10. Error analysis of analytic solutions for self-excited near-symmetric rigid bodies - A numerical study

    NASA Technical Reports Server (NTRS)

    Kia, T.; Longuski, J. M.

    1984-01-01

    Analytic error bounds are presented for the solutions of approximate models for self-excited near-symmetric rigid bodies. The error bounds are developed for analytic solutions to Euler's equations of motion. The results are applied to obtain a simplified analytic solution for Eulerian rates and angles. The results of a sample application of the range and error bound expressions for the case of the Galileo spacecraft experiencing transverse torques demonstrate the use of the bounds in analyses of rigid body spin change maneuvers.

  11. Thin-Layer Chromatography: The "Eyes" of the Organic Chemist

    ERIC Educational Resources Information Center

    Dickson, Hamilton; Kittredge, Kevin W.; Sarquis, Arlyne

    2004-01-01

    Thin-layer chromatography (TLC) methods are successfully used in many areas of research and development such as clinical medicine, forensic chemistry, biochemistry, and pharmaceutical analysis as TLC is relatively inexpensive and has found widespread application as an easy to use, reliable, and quick analytic tool. The usefulness of TLC in organic…

  12. Developing a Value of Information (VoI) Enabled System from Collection to Analysis

    DTIC Science & Technology

    2016-11-01

    Keywords: Value of Information, Android, smartphone, information dissemination, visual analytics. List of Figures: Fig. 1, Spot report main screen; Fig. 2, Smartphone app… included the creation of 2 Android smartphone applications (apps) and the enhancement of an existing tool (Contour). Prior work with Android…

  13. The Application of Morpho-Syntactic Language Processing to Effective Phrase Matching.

    ERIC Educational Resources Information Center

    Sheridan, Paraic; Smeaton, Alan F.

    1992-01-01

    Describes a process of morpho-syntactic language analysis for information retrieval. Tree Structured Analytics (TSA) used for text representation is summarized; the matching process developed for such structures is outlined with an example appended; and experiments carried out to evaluate the effectiveness of TSA matching are discussed. (26…

  14. The Manipulation of Scholarly Rating and Measurement Systems: Constructing Excellence in an Era of Academic Stardom

    ERIC Educational Resources Information Center

    Oravec, Jo Ann

    2017-01-01

    Higher education institutions are joining many other social entities in shifting how participants are evaluated; work is undergoing increasing analysis through metrics, big data analytics, and related methodologies. As applications of academic metrics expand, new formulations of what is considered as "excellence" in teaching and research…

  15. Microplasmas for chemical analysis: analytical tools or research toys?

    NASA Astrophysics Data System (ADS)

    Karanassios, Vassili

    2004-07-01

    An overview of the activities of the research groups that have been involved in fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between "liquid" electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. Finally, an overall assessment of the state-of-the-art of analytical microplasma research is provided.

  16. Fiber optic evanescent wave biosensor

    NASA Astrophysics Data System (ADS)

    Duveneck, Gert L.; Ehrat, Markus; Widmer, H. M.

    1991-09-01

    The role of modern analytical chemistry is not restricted to quality control and environmental surveillance, but has been extended to process control using on-line analytical techniques. Besides industrial applications, highly specific, ultra-sensitive biochemical analysis becomes increasingly important as a diagnostic tool, both in central clinical laboratories and in the doctor's office. Fiber optic sensor technology can fulfill many of the requirements for both types of applications. As an example, the experimental arrangement of a fiber optic sensor for biochemical affinity assays is presented. The evanescent electromagnetic field, associated with a light ray guided in an optical fiber, is used for the excitation of luminescence labels attached to the biomolecules in solution to be analyzed. Due to the small penetration depth of the evanescent field into the medium, the generation of luminescence is restricted to the close proximity of the fiber, where, e.g., the luminescent analyte molecules combine with their affinity partners, which are immobilized on the fiber. Both cw- and pulsed light excitation can be used in evanescent wave sensor technology, enabling the on-line observation of an affinity assay on a macroscopic time scale (seconds and minutes), as well as on a microscopic, molecular time scale (nanoseconds or microseconds).

  17. Synthesized airfoil data method for prediction of dynamic stall and unsteady airloads

    NASA Technical Reports Server (NTRS)

    Gangwani, S. T.

    1983-01-01

    A detailed analysis of dynamic stall experiments has led to a set of relatively compact analytical expressions, called synthesized unsteady airfoil data, which accurately describe in the time-domain the unsteady aerodynamic characteristics of stalled airfoils. An analytical research program was conducted to expand and improve this synthesized unsteady airfoil data method using additional available sets of unsteady airfoil data. The primary objectives were to reduce these data to synthesized form for use in rotor airload prediction analyses and to generalize the results. Unsteady drag data were synthesized which provided the basis for successful expansion of the formulation to include computation of the unsteady pressure drag of airfoils and rotor blades. Also, an improved prediction model for airfoil flow reattachment was incorporated in the method. Application of this improved unsteady aerodynamics model has resulted in an improved correlation between analytic predictions and measured full scale helicopter blade loads and stress data.

  18. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    PubMed

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.

  19. Application of Interface Technology in Progressive Failure Analysis of Composite Panels

    NASA Technical Reports Server (NTRS)

    Sleight, D. W.; Lotts, C. G.

    2002-01-01

    A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.

  20. A fully analytic treatment of resonant inductive coupling in the far field

    NASA Astrophysics Data System (ADS)

    Sedwick, Raymond J.

    2012-02-01

    For the application of resonant inductive coupling for wireless power transfer, fabrication of flat spiral coils using ribbon wire allows for analytic expressions of the capacitance and inductance of the coils and therefore the resonant frequency. The expressions can also be used in an approximate way for the analysis of coils constructed from cylindrical wire. Ribbon wire constructed from both standard metals as well as high temperature superconducting material is commercially available, so using these derived expressions as a basis, a fully analytic treatment is presented that allows for design trades to be made for hybrid designs incorporating either technology. The model is then extended to analyze the performance of the technology as applied to inductively coupled communications, which has been demonstrated as having an advantage in circumstances where radiated signals would suffer unacceptable levels of attenuation.
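
    The resonant frequency that the analytic capacitance and inductance expressions feed into follows the standard LC relation. A quick numeric check with hypothetical coil values, not figures from the article:

```python
import math

def resonant_frequency(L_henry, C_farad):
    """Resonant frequency f0 = 1 / (2*pi*sqrt(L*C)) of an LC resonator."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_henry * C_farad))

# Hypothetical flat spiral coil: 50 microhenry inductance and 20 picofarad
# self-capacitance (illustrative values only).
f0 = resonant_frequency(50e-6, 20e-12)   # in hertz, roughly 5 MHz here
```

    The design trades mentioned in the abstract amount to moving L and C (via coil geometry and ribbon dimensions) while holding f0 at the desired operating point.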

  1. An Artificial Nose Based on Microcantilever Array Sensors

    NASA Astrophysics Data System (ADS)

    Lang, H. P.; Ramseyer, J. P.; Grange, W.; Braun, T.; Schmid, D.; Hunziker, P.; Jung, C.; Hegner, M.; Gerber, C.

    2007-03-01

    We used microfabricated cantilever array sensors for an artificial nose setup. Each cantilever is coated on its top surface with a polymer layer. Volatile gaseous analytes are detected by tracking the diffusion process of the molecules into the polymer layers, resulting in swelling of the polymer layers and thereby bending of the cantilevers. From the bending pattern of all cantilevers in the array, a characteristic 'fingerprint' of the analyte is obtained, which is evaluated using principal component analysis. In a flow of dry nitrogen gas, the bending of the cantilevers reverts to its initial state before exposure to the analyte, which allows reversible and reproducible operation of the sensor. We show examples of detection of solvents, perfume essences and beverage flavors. In a medical application, the setup provides an indication of the presence of disease in patients' breath samples.
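
    The fingerprint-evaluation step can be sketched as follows, assuming each row of data holds the bending responses of an eight-cantilever array for one exposure (hypothetical numbers) and reducing them to two principal components via SVD:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical bending responses (nm) of an 8-cantilever array:
# 6 exposures each of two analytes with distinct coating-response patterns.
pattern_a = np.array([50, 20, 80, 10, 60, 30, 70, 40], dtype=float)
pattern_b = np.array([10, 70, 20, 90, 15, 80, 25, 60], dtype=float)
responses = np.vstack([pattern_a + rng.normal(0, 2, 8) for _ in range(6)] +
                      [pattern_b + rng.normal(0, 2, 8) for _ in range(6)])

# Principal component analysis via SVD of the mean-centered data matrix.
centered = responses - responses.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T   # project onto first two principal components

# The two analytes separate along the first principal component.
pc1_a, pc1_b = scores[:6, 0].mean(), scores[6:, 0].mean()
```

    In an artificial-nose workflow, an unknown sample's bending pattern would be projected into the same component space and classified by proximity to known analyte clusters.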

  2. Three-Dimensional Field Solutions for Multi-Pole Cylindrical Halbach Arrays in an Axial Orientation

    NASA Technical Reports Server (NTRS)

    Thompson, William K.

    2006-01-01

    This article presents three-dimensional B field solutions for the cylindrical Halbach array in an axial orientation. This arrangement has applications in the design of axial motors and passive axial magnetic bearings and couplers. The analytical model described here assumes ideal magnets with fixed and uniform magnetization. The field component functions are expressed as sums of 2-D definite integrals that are easily computed by a number of mathematical analysis software packages. The analysis is verified with sample calculations and the results are compared to equivalent results from traditional finite-element analysis (FEA). The field solutions are then approximated for use in flux linkage and induced EMF calculations in nearby stator windings by expressing the field variance with angular displacement as a pure sinusoidal function whose amplitude depends on radial and axial position. The primary advantage of numerical implementation of the analytical approach presented in the article is that it lends itself more readily to parametric analysis and design tradeoffs than traditional FEA models.
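
    The sinusoidal approximation described above, and the peak induced EMF it yields for a stator winding, can be sketched as follows; the amplitude, pole count, and winding parameters are assumed values, not taken from the article:

```python
import math

# Approximate field at a stator winding from a p-pole-pair Halbach rotor:
# B(theta, t) = A * sin(p * (theta - omega * t)), with the amplitude A
# assumed known at the winding's radial/axial position (hypothetical here).
A = 0.4                         # tesla, assumed field amplitude at the winding
p = 4                           # pole pairs
omega = 2 * math.pi * 50 / p    # mechanical rad/s giving 50 Hz electrical
turns = 100
coil_area = 1e-3                # m^2, effective area per turn (assumed)

# With flux per turn ~ coil_area * B, the peak EMF of the winding is
# N * coil_area * A * (p * omega), i.e., amplitude times electrical frequency.
emf_peak = turns * coil_area * A * p * omega
```

    The article's 2-D definite integrals would supply A as a function of radial and axial position; here it is simply assumed.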

  3. Doctoral training in behavior analysis: Training generalized problem-solving skills

    PubMed Central

    Chase, Philip N.; Wylie, Ruth G.

    1985-01-01

    This essay provides guidelines for designing a doctoral program in behavior analysis. First, we propose a general accomplishment for all behavior analytic doctoral students: that they be able to solve problems concerning individual behavior within a range of environments. Second, in order to achieve this goal, we propose that students be trained in conceptual and experimental analysis of behavior, the application of behavioral principles and the administration of behavioral programs. This training should include class work, but it should emphasize the immersion of students in a variety of environments in which they are required to use behavior analytic strategies. Third, we provide an example of a hypothetical graduate program that involves the proposed training. Finally, an evaluation plan is suggested for determining whether a training program is in fact producing students who are generalized problem-solvers. At each step, we justify our point of view from a perspective that combines principles from behavior analysis and educational systems design. PMID:22478633

  4. Comprehensive Analysis of LC/MS Data Using Pseudocolor Plots

    NASA Astrophysics Data System (ADS)

    Crutchfield, Christopher A.; Olson, Matthew T.; Gourgari, Evgenia; Nesterova, Maria; Stratakis, Constantine A.; Yergey, Alfred L.

    2013-02-01

    We have developed new applications of the pseudocolor plot for the analysis of LC/MS data. These applications include spectral averaging, analysis of variance, differential comparison of spectra, and qualitative filtering by compound class. These applications have been motivated by the need to better understand LC/MS data generated from analysis of human biofluids. The examples presented use data generated to profile steroid hormones in urine extracts from a Cushing's disease patient relative to a healthy control, but are general to any discovery-based scanning mass spectrometry technique. In addition to new visualization techniques, we introduce a new metric of variance: the relative maximum difference from the mean. We also introduce the concept of substructure-dependent analysis of steroid hormones using precursor ion scans. These new analytical techniques provide an alternative approach to traditional untargeted metabolomics workflow. We present an approach to discovery using MS that essentially eliminates alignment or preprocessing of spectra. Moreover, we demonstrate the concept that untargeted metabolomics can be achieved using low mass resolution instrumentation.
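
    The metric introduced here, the relative maximum difference from the mean, could be computed per channel along these lines; this is one plausible reading of the definition, and the intensities are hypothetical:

```python
import numpy as np

def relative_max_diff_from_mean(x):
    """max |x_i - mean(x)| / mean(x): one plausible reading of the metric."""
    m = np.mean(x)
    return np.max(np.abs(x - m)) / m

# Hypothetical replicate intensities for one m/z channel across 5 runs.
channel = np.array([980.0, 1010.0, 1005.0, 995.0, 1060.0])
score = relative_max_diff_from_mean(channel)
```

    Unlike a standard deviation, this statistic is driven entirely by the single worst outlier, which makes it a conservative screen for channel stability.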

  5. Impact of Recent Hardware and Software Trends on High Performance Transaction Processing and Analytics

    NASA Astrophysics Data System (ADS)

    Mohan, C.

    In this paper, I survey briefly some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, slowly system architects and designers are beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.

  6. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data and Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agency and international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed, and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  7. Value of Earth Observations: Key principles and techniques of socioeconomic benefits analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Macauley, M.; Bernknopf, R.

    2013-12-01

    Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples on the uses of Earth observations across a range of sectors can be complemented by the quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and social and economic sciences fields to build capacity, develop case studies, and refine analytic techniques in quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research, and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring socioeconomic impacts of Earth observations and how the measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.

  8. Development and optimization of an analytical system for volatile organic compound analysis coming from the heating of interstellar/cometary ice analogues.

    PubMed

    Abou Mrad, Ninette; Duvernay, Fabrice; Theulé, Patrice; Chiavassa, Thierry; Danger, Grégoire

    2014-08-19

    This contribution presents an original analytical system for studying volatile organic compounds (VOC) coming from the heating and/or irradiation of interstellar/cometary ice analogues (the VAHIIA system) through laboratory experiments. The VAHIIA system brings solutions to three analytical constraints on chromatographic analysis: the low desorption kinetics of VOC (many hours) in the vacuum chamber during laboratory experiments, the low pressure under which they sublime (10(-9) mbar), and the presence of water in ice analogues. The VAHIIA system, which we developed, calibrated, and optimized, is composed of two units. The first is a preconcentration unit providing VOC recovery. This unit is based on cryogenic trapping, which allows VOC preconcentration and provides an adequate pressure allowing their subsequent transfer to an injection unit. The second is a gaseous injection unit allowing the direct injection into the GC-MS of the VOC previously transferred from the preconcentration unit. The feasibility of the online transfer through this interface is demonstrated. Nanomoles of VOC can be detected with the VAHIIA system, and the variability in replicate measurements is lower than 13%. The advantages of GC-MS in comparison to infrared spectroscopy are pointed out, the GC-MS allowing an unambiguous identification of compounds coming from complex mixtures. Beyond the application to astrophysical subjects, these analytical developments can be used for all systems requiring vacuum/cryogenic environments.
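
    The quoted replicate variability (below 13%) corresponds to a relative standard deviation across replicate measurements. A minimal check with hypothetical replicate peak areas:

```python
import statistics

# Hypothetical GC-MS peak areas from five replicate injections of one VOC.
areas = [1.02e6, 1.10e6, 0.95e6, 1.05e6, 1.08e6]

mean_area = statistics.mean(areas)
# Relative standard deviation (sample standard deviation over mean), in %.
rsd_percent = 100.0 * statistics.stdev(areas) / mean_area
```

    An RSD computed this way per analyte is the usual figure of merit for injection-to-injection repeatability in chromatographic method validation.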

  9. A New Unified Analysis of Estimate Errors by Model-Matching Phase-Estimation Methods for Sensorless Drive of Permanent-Magnet Synchronous Motors and New Trajectory-Oriented Vector Control, Part I

    NASA Astrophysics Data System (ADS)

    Shinnaka, Shinji; Sano, Kousuke

    This paper presents a new unified analysis of estimate errors by model-matching phase-estimation methods such as rotor-flux state-observers, back EMF state-observers, and back EMF disturbance-observers, for sensorless drive of permanent-magnet synchronous motors. Analytical solutions for the estimate errors, whose validity is confirmed by numerical experiments, are rich in universality and applicability. As an example of this universality and applicability, a new trajectory-oriented vector control method is proposed, which can directly realize a quasi-optimal strategy minimizing total losses with no additional computational load by simply orienting one of the vector-control coordinates to the associated quasi-optimal trajectory. The coordinate orientation rule, which is analytically derived, is surprisingly simple. Consequently, the trajectory-oriented vector control method can be applied to a number of conventional vector control systems using one of the model-matching phase-estimation methods.

  10. Analytical application of femtosecond laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Melikechi, Noureddine; Markushin, Yuri

    2015-05-01

    We report on significant advantages provided by femtosecond laser-induced breakdown spectroscopy (LIBS) for analytical applications in fields as diverse as protein characterization and material science. We compare the results of a femto- and nanosecond-laser-induced breakdown spectroscopy analysis of dual-elemental pellets in terms of the shot-to-shot variations of the neutral/ionic emission line intensities. This study is complemented by a numerical model based on two-dimensional random close packing of disks in an enclosed geometry. In addition, we show that LIBS can be used to obtain quantitative identification of the hydrogen composition of bio-macromolecules in a heavy water solution. Finally, we show that simultaneous multi-elemental particle assay analysis combined with LIBS can significantly improve macromolecule detectability up to near single molecule per particle efficiency. Research was supported by grants from the National Science Foundation Centers of Research Excellence in Science and Technology (0630388), National Aeronautics and Space Administration (NX09AU90A). Our gratitude to Dr. D. Connolly, Fox Chase Cancer Center.

  11. Analytical Applications of Transport Through Bulk Liquid Membranes.

    PubMed

    Diaconu, Ioana; Ruse, Elena; Aboul-Enein, Hassan Y; Bunaciu, Andrei A

    2016-07-03

    This review discusses the results of research in the use of bulk liquid membranes in separation processes and preconcentration for analytical purposes. It includes some theoretical aspects, definitions, types of liquid membranes, and transport mechanism, as well as advantages of using liquid membranes in laboratory studies. These concepts are necessary to understand fundamental principles of liquid membrane transport. Due to the multiple advantages of liquid membranes several studies present analytical applications of the transport through liquid membranes in separation or preconcentration processes of metallic cations and some organic compounds, such as phenol and phenolic derivatives, organic acids, amino acids, carbohydrates, and drugs. This review presents coupled techniques such as separation through the liquid membrane coupled with flow injection analysis.

  12. Experimental and Analytical Determinations of Spiral Bevel Gear-Tooth Bending Stress Compared

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.

    2000-01-01

    Spiral bevel gears are currently used in all main-rotor drive systems for rotorcraft produced in the United States. Applications such as these need spiral bevel gears to turn the corner from the horizontal gas turbine engine to the vertical rotor shaft. These gears must typically operate at extremely high rotational speeds and carry high power levels. With these difficult operating conditions, an improved analytical capability is paramount to increasing aircraft safety and reliability. Also, literature on the analysis and testing of spiral bevel gears has been very sparse in comparison to that for parallel axis gears. This is due to the complex geometry of this type of gear and to the specialized test equipment necessary to test these components. To develop an analytical model of spiral bevel gears, researchers use differential geometry methods to model the manufacturing kinematics. A three-dimensional spiral bevel gear modeling method was developed that uses finite elements for the structural analysis. This method was used to analyze the three-dimensional contact pattern between the test pinion and gear used in the Spiral Bevel Gear Test Facility at the NASA Glenn Research Center at Lewis Field. Results of this analysis are illustrated in the preceding figure. The development of the analytical method was a joint endeavor between NASA Glenn, the U.S. Army Research Laboratory, and the University of North Dakota.

  13. Geographic applications of ERTS-1 data to landscape change

    NASA Technical Reports Server (NTRS)

    Rehder, J. B.

    1973-01-01

    The analysis of landscape change requires large area coverage on a periodic basis in order to analyze aggregate changes over an extended period of time. To date, only the ERTS program can provide this capability. Three avenues of experimentation and analysis are being used in the investigation: (1) a multi-scale sampling procedure utilizing aircraft imagery for ground truth and control; (2) a densitometric and computer analytical experiment for the analysis of gray tone signatures, comparisons and ultimately for landscape change detection and monitoring; and (3) an ERTS image enhancement procedure for the detection and analysis of photomorphic regions.

  14. The Barcode of Life Data Portal: Bridging the Biodiversity Informatics Divide for DNA Barcoding

    PubMed Central

    Sarkar, Indra Neil; Trizna, Michael

    2011-01-01

    With the volume of molecular sequence data that is systematically being generated globally, there is a need for centralized resources for data exploration and analytics. DNA Barcode initiatives are on track to generate a compendium of molecular sequence–based signatures for identifying animals and plants. To date, data exploration and analytic tools for these data have been available only in boutique form, often representing a frustrating hurdle for researchers who may not have the resources to install or implement algorithms described by the analytic community. The Barcode of Life Data Portal (BDP) is a first step towards integrating the latest biodiversity informatics innovations with molecular sequence data from DNA barcoding. Through the establishment of community-driven standards, based on discussion with the Data Analysis Working Group (DAWG) of the Consortium for the Barcode of Life (CBOL), the BDP provides an infrastructure for incorporation of existing and next-generation DNA barcode analytic applications in an open forum. PMID:21818249

  15. Integrated communication and control systems. I - Analysis

    NASA Technical Reports Server (NTRS)

    Halevi, Yoram; Ray, Asok

    1988-01-01

    The paper presents the results of an ICCS analysis focusing on discrete-time control systems subject to time-varying delays. The present analytical technique is applicable to integrated dynamic systems such as those encountered in advanced aircraft, spacecraft, and the real-time control of robots and machine tools via a high-speed network within an autonomous manufacturing environment. The significance of data latency and mis-synchronization between individual system components in ICCS networks is discussed in view of the time-varying delays.

  16. Determination of volatile organic compounds in human breath for Helicobacter pylori detection by SPME-GC/MS.

    PubMed

    Ulanowska, Agnieszka; Kowalkowski, Tomasz; Hrynkiewicz, Katarzyna; Jackowski, Marek; Buszewski, Bogusław

    2011-03-01

    Helicobacter pylori living in the human stomach release volatile organic compounds (VOCs) that can be detected in expired air. The aim of the study was to apply breath analysis to the detection of these bacteria. This was accomplished by determining VOCs characteristic of patients with H. pylori and by analyzing the gases released by the bacteria in suspension. Solid-phase microextraction was applied as a selective technique for preconcentration and isolation of analytes. Gas chromatography coupled with mass spectrometry was used for the separation and identification of volatile analytes in breath samples and bacterial headspace. For data processing, discriminant and factor analyses were used. Endogenous substances such as isobutane, 2-butanone and ethyl acetate were detected in the breath of persons with H. pylori in the stomach and in the gaseous mixture released by the bacterial strain, but they were not identified in the breath of healthy volunteers. The canonical analysis of discriminant functions showed a clear separation between the three examined groups. Knowledge of the substances emitted by H. pylori, combined with an optimized breath analysis method, might become a very useful tool for noninvasive detection of this bacterium. Copyright © 2010 John Wiley & Sons, Ltd.

  17. Boron-doped diamond electrode: synthesis, characterization, functionalization and analytical applications.

    PubMed

    Luong, John H T; Male, Keith B; Glennon, Jeremy D

    2009-10-01

    In recent years, conductive diamond electrodes for electrochemical applications have been a major focus of research and development. The impetus behind such endeavors can be attributed to their wide potential window, low background current, chemical inertness, and mechanical durability. Compared with other carbon-based materials, conductive diamond can oxidize several analytes before the breakdown of water in aqueous electrolytes. This is important for detecting and/or identifying species in solution, since oxygen and hydrogen evolution do not interfere with the analysis. Thus, conductive diamond electrodes take electrochemical detection into new areas and extend its usefulness to analytes that are not accessible with conventional electrode materials. Different types of diamond electrodes, polycrystalline, microcrystalline, nanocrystalline and ultrananocrystalline, have been synthesized and characterized. Of particular interest is the synthesis of boron-doped diamond (BDD) films by chemical vapor deposition on various substrates. In the tetrahedral diamond lattice, each carbon atom is covalently bonded to its neighbors, forming an extremely robust crystalline structure. Some carbon atoms in the lattice are substituted with boron to provide electrical conductivity. Modification strategies for doped diamond electrodes with metallic nanoparticles and/or electropolymerized films are of importance to impart novel characteristics or to improve the performance of diamond electrodes. Biofunctionalization of diamond films is also feasible, fostering several useful bioanalytical applications. A plethora of opportunities for nanoscale analytical devices based on conducting diamond is anticipated in the very near future.

  18. The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.

    2015-12-01

    The Ophidia project is a research effort on big data analytics addressing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework for atomically processing and manipulating scientific datasets, offering a common way to run distributed tasks on large sets of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project builds on a strong background in high-performance database management and On-Line Analytical Processing (OLAP) systems for managing large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework in Python-based ecosystems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites, according to the distribution of the datasets, and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed-task orchestration, and fine grain at the level of a single data analytics cluster instance) will be presented and discussed.
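The anomaly analysis mentioned above reduces to a simple datacube-style operation: subtract a per-period climatology from each value. A minimal pure-Python sketch of this idea, with illustrative toy data (the function and variable names are not part of Ophidia):

```python
from collections import defaultdict

def monthly_anomalies(records):
    """records: list of (month, value). Returns (month, value - climatology[month])."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for month, value in records:
        sums[month] += value
        counts[month] += 1
    # Climatology: per-month mean over all records.
    climatology = {m: sums[m] / counts[m] for m in sums}
    return [(m, v - climatology[m]) for m, v in records]

data = [(1, 10.0), (1, 14.0), (2, 20.0), (2, 22.0)]
print(monthly_anomalies(data))  # → [(1, -2.0), (1, 2.0), (2, -1.0), (2, 1.0)]
```

In a datacube framework such as Ophidia the per-month sums and counts would be computed chunk by chunk and combined server-side; the combining step is the same arithmetic as above.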

  19. Proteomics Is Analytical Chemistry: Fitness-for-Purpose in the Application of Top-Down and Bottom-Up Analyses.

    PubMed

    Coorssen, Jens R; Yergey, Alfred L

    2015-12-03

    Molecular mechanisms underlying health and disease function at least in part based on the flexibility and fine-tuning afforded by protein isoforms and post-translational modifications. The ability to effectively and consistently resolve these protein species or proteoforms, as well as assess quantitative changes is therefore central to proteomic analyses. Here we discuss the pros and cons of currently available and developing analytical techniques from the perspective of the full spectrum of available tools and their current applications, emphasizing the concept of fitness-for-purpose in experimental design based on consideration of sample size and complexity; this necessarily also addresses analytical reproducibility and its variance. Data quality is considered the primary criterion, and we thus emphasize that the standards of Analytical Chemistry must apply throughout any proteomic analysis.

  20. Development of a dynamic headspace gas chromatography-mass spectrometry method for on-site analysis of sulfur mustard degradation products in sediments.

    PubMed

    Magnusson, R; Nordlander, T; Östin, A

    2016-01-15

    Sampling teams performing work at sea in areas where chemical munitions may have been dumped require rapid and reliable analytical methods for verifying sulfur mustard leakage from suspected objects. Here we present such an on-site analysis method based on dynamic headspace GC-MS for analysis of five cyclic sulfur mustard degradation products that have previously been detected in sediments from chemical weapon dumping sites: 1,4-oxathiane, 1,3-dithiolane, 1,4-dithiane, 1,4,5-oxadithiephane, and 1,2,5-trithiephane. An experimental design involving authentic Baltic Sea sediments spiked with the target analytes was used to develop an optimized protocol for sample preparation, headspace extraction and analysis that afforded recoveries of up to 60-90%. The optimized method needs no organic solvents, uses only two grams of sediment on a dry weight basis and involves a unique sample presentation whereby sediment is spread uniformly as a thin layer inside the walls of a glass headspace vial. The method showed good linearity for analyte concentrations of 5-200 ng/g dw, good repeatability, and acceptable carry-over. The method's limits of detection for spiked sediment samples ranged from 2.5 to 11 μg/kg dw, with matrix interference being the main limiting factor. The instrumental detection limits were one to two orders of magnitude lower. Full-scan GC-MS analysis enabled the use of automated mass spectral deconvolution for rapid identification of target analytes. Using this approach, analytes could be identified in spiked sediment samples at concentrations down to 13-65 μg/kg dw. On-site validation experiments conducted aboard the research vessel R/V Oceania demonstrated the method's practical applicability, enabling the successful identification of four cyclic sulfur mustard degradation products at concentrations of 15-308 μg/kg in sediments immediately after being collected near a wreck at the Bornholm Deep dumpsite in the Baltic Sea. Copyright © 2015 Elsevier B.V. All rights reserved.
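The recovery and detection-limit figures quoted above follow standard definitions from analytical method validation; a minimal sketch with illustrative numbers (not the paper's data):

```python
def percent_recovery(measured_ng_g, spiked_ng_g):
    """Spike recovery: amount found as a percentage of the amount added."""
    return 100.0 * measured_ng_g / spiked_ng_g

def lod_from_blank(blank_sd, slope, k=3.3):
    """Detection limit from the blank standard deviation and calibration slope
    (the conventional k*sd/slope estimate; k = 3.3 is a common choice)."""
    return k * blank_sd / slope

print(percent_recovery(45.0, 50.0))  # → 90.0 (%)
print(lod_from_blank(0.6, 0.2))      # ≈ 9.9, in the same units as the blank signal / slope
```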

  1. The Benefits and Complexities of Operating Geographic Information Systems (GIS) in a High Performance Computing (HPC) Environment

    NASA Astrophysics Data System (ADS)

    Shute, J.; Carriere, L.; Duffy, D.; Hoy, E.; Peters, J.; Shen, Y.; Kirschbaum, D.

    2017-12-01

    The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center is building and maintaining an Enterprise GIS capability for its stakeholders, to include NASA scientists, industry partners, and the public. This platform is powered by three GIS subsystems operating in a highly-available, virtualized environment: 1) the Spatial Analytics Platform is the primary NCCS GIS and provides users discoverability of the vast DigitalGlobe/NGA raster assets within the NCCS environment; 2) the Disaster Mapping Platform provides mapping and analytics services to NASA's Disaster Response Group; and 3) the internal (Advanced Data Analytics Platform/ADAPT) enterprise GIS provides users with the full suite of Esri and open source GIS software applications and services. All systems benefit from NCCS's cutting edge infrastructure, to include an InfiniBand network for high speed data transfers; a mixed/heterogeneous environment featuring seamless sharing of information between Linux and Windows subsystems; and in-depth system monitoring and warning systems. Due to its co-location with the NCCS Discover High Performance Computing (HPC) environment and the Advanced Data Analytics Platform (ADAPT), the GIS platform has direct access to several large NCCS datasets including DigitalGlobe/NGA, Landsat, MERRA, and MERRA2. Additionally, the NCCS ArcGIS Desktop Windows virtual machines utilize existing NetCDF and OPeNDAP assets for visualization, modelling, and analysis - thus eliminating the need for data duplication. With the advent of this platform, Earth scientists have full access to vast data repositories and the industry-leading tools required for successful management and analysis of these multi-petabyte, global datasets. The full system architecture and integration with scientific datasets will be presented. Additionally, key applications and scientific analyses will be explained, to include the NASA Global Landslide Catalog (GLC) Reporter crowdsourcing application, the NASA GLC Viewer discovery and analysis tool, the DigitalGlobe/NGA Data Discovery Tool, the NASA Disaster Response Group Mapping Platform (https://maps.disasters.nasa.gov), and support for NASA's Arctic - Boreal Vulnerability Experiment (ABoVE).

  2. Instrumental neutron activation analysis for studying size-fractionated aerosols

    NASA Astrophysics Data System (ADS)

    Salma, Imre; Zemplén-Papp, Éva

    1999-10-01

    Instrumental neutron activation analysis (INAA) was utilized to study aerosol samples collected into a coarse and a fine size fraction on Nuclepore polycarbonate membrane filters. As a result of the panoramic INAA, 49 elements were determined in about 200-400 μg of particulate matter by two irradiations and four γ-spectrometric measurements. The analytical calculations were performed by the absolute (k0) standardization method. The calibration procedures, application protocol and data evaluation process are described and discussed; together they now make it possible to analyse a considerable number of samples while assuring the quality of the results. As a means of demonstrating the system's analytical capabilities, the concentration ranges, median or mean atmospheric concentrations and detection limits are presented for an extensive series of aerosol samples collected within the framework of an urban air pollution study in Budapest. For most elements, the precision of the analysis was found to be better than the uncertainty introduced by the sampling techniques and sample variability.
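The k0 standardization referenced above computes elemental amounts relative to a co-irradiated comparator (conventionally Au) using tabulated k0 factors, so no element-specific standards are needed. A hedged sketch of the standard k0 relation; variable names and numbers are illustrative, and the flux-spectrum correction terms are collapsed into a single ratio:

```python
def k0_mass_fraction(asp_a, asp_au, k0, fq_ratio, eff_ratio):
    """Analyte amount relative to the Au comparator via the k0 relation:
         c_a = (asp_a / asp_au) * (1 / k0) * fq_ratio * eff_ratio
    asp_*     : specific count rates (already decay/saturation-corrected)
    k0        : tabulated k0 factor for the analyte gamma line
    fq_ratio  : (f + Q0_Au(alpha)) / (f + Q0_a(alpha)) flux-spectrum correction
    eff_ratio : detector efficiency ratio eps_Au / eps_a
    """
    return (asp_a / asp_au) / k0 * fq_ratio * eff_ratio

# Sanity check: equal specific count rates and unit corrections give c = 1/k0.
print(k0_mass_fraction(100.0, 100.0, 0.5, 1.0, 1.0))  # → 2.0
```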

  3. Analytical techniques of pilot scanning behavior and their application

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.

    1986-01-01

    The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
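The entropy rate measure mentioned above can be estimated from instrument-to-instrument transition counts as the Shannon entropy of a first-order scanning model; a minimal sketch with illustrative counts (the instrument names are placeholders, not from the report):

```python
import math

def entropy_rate(transition_counts):
    """Entropy rate (bits/transition) of a first-order scan model.

    transition_counts[i][j] = observed transitions from instrument i to j.
    H = sum_i p(i) * sum_j -p(j|i) * log2 p(j|i), where p(i) is estimated as
    the share of all transitions that leave instrument i.
    """
    total = sum(sum(row.values()) for row in transition_counts.values())
    h = 0.0
    for row in transition_counts.values():
        row_total = sum(row.values())
        p_i = row_total / total
        for count in row.values():
            p_ij = count / row_total
            h += p_i * (-p_ij * math.log2(p_ij))
    return h

counts = {"ADI": {"ALT": 2, "HSI": 2}, "ALT": {"ADI": 2, "HSI": 2}, "HSI": {"ADI": 4}}
print(round(entropy_rate(counts), 3))  # → 0.667
```

Higher values indicate less predictable scan paths; a fully deterministic scan pattern gives an entropy rate of zero.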

  4. Advances in Instrumental Analysis of Brominated Flame Retardants: Current Status and Future Perspectives

    PubMed Central

    2014-01-01

    This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482

  5. Chiral drug analysis using mass spectrometric detection relevant to research and practice in clinical and forensic toxicology.

    PubMed

    Schwaninger, Andrea E; Meyer, Markus R; Maurer, Hans H

    2012-12-21

    This paper reviews analytical approaches published in 2002-2012 for chiral drug analysis and their relevance in research and practice in the field of clinical and forensic toxicology. Separation systems such as gas chromatography, high performance liquid chromatography, capillary electromigration, and supercritical fluid chromatography, all coupled to mass spectrometry, are discussed. Typical applications are reviewed for relevant chiral analytes such as amphetamines and amphetamine-derived designer drugs, methadone, tramadol, psychotropic and other CNS acting drugs, anticoagulants, cardiovascular drugs, and some other drugs. Usefulness of chiral drug analysis in the interpretation of analytical results in clinical and forensic toxicology is discussed as well. Copyright © 2012 Elsevier B.V. All rights reserved.

  6. Analyzing the influence of institutions on health policy development in Uganda: a case study of the decision to abolish user fees.

    PubMed

    Moat, K A; Abelson, J

    2011-12-01

    During the 2001 election campaign, President Yoweri Museveni announced he was abolishing user fees for health services in Uganda. No analysis has been carried out to explain how he was able to initiate such an important policy decision without encountering any immediate barriers. The aim of this study was to explain this outcome through in-depth policy analysis driven by the application of key analytical frameworks. An explanatory case study informed by analytical frameworks from the institutionalism literature was undertaken. Multiple data sources were used, including academic literature, key government documents, grey literature, and a variety of print media. According to the analytical frameworks employed, several formal institutional constraints existed that would have reduced the prospects for the abolition of user fees. However, prevalent informal institutions such as "Big Man" presidentialism and clientelism, both 'competing' and 'complementary', can be used to explain the policy outcome. The analysis suggests that these factors trumped the impact of more formal institutional structures in the Ugandan context. Consideration should be given to the interactions between formal and informal institutions in the analysis of health policy processes in Uganda, as they provide a more nuanced understanding of how each set of factors influences policy outcomes.

  7. Recent developments in nickel electrode analysis

    NASA Technical Reports Server (NTRS)

    Whiteley, Richard V.; Daman, M. E.; Kaiser, E. Q.

    1991-01-01

    Three aspects of nickel electrode analysis for Nickel-Hydrogen and Nickel-Cadmium battery cell applications are addressed: (1) the determination of active material; (2) charged state nickel (as NiOOH + CoOOH); and (3) potassium ion content in the electrode. Four deloading procedures are compared for completeness of active material removal, and deloading conditions for efficient active material analyses are established. Two methods for charged state nickel analysis are compared: the current NASA procedure and a new procedure based on the oxidation of sodium oxalate by the charged material. Finally, a method for determining potassium content in an electrode sample by flame photometry is presented along with analytical results illustrating differences in potassium levels from vendor to vendor and the effects of stress testing on potassium content in the electrode. The relevance of these analytical procedures to electrode performance is reviewed.

  8. Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey

    State-of-the-art visual analytics models and frameworks mostly assume a static snapshot of the data, while in many cases it is a stream with constant updates and changes. Exploration of streaming data poses unique challenges as machine-level computations and abstractions need to be synchronized with the visual representation of the data and the temporally evolving human insights. In the visual analytics literature, we lack a thorough characterization of streaming data and analysis of the challenges associated with task abstraction, visualization design, and adaptation of the role of human-in-the-loop for exploration of data streams. We aim to fill this gap by conducting a survey of the state-of-the-art in visual analytics of streaming data for systematically describing the contributions and shortcomings of current techniques and analyzing the research gaps that need to be addressed in the future. Our contributions are: i) problem characterization for identifying challenges that are unique to streaming data analysis tasks, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space for dynamic data and the role of the human-in-the-loop, and iii) reflections on the design trade-offs for streaming visual analytics techniques and their practical applicability in real-world application scenarios.

  9. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications.

    PubMed

    Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M

    2017-05-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.
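Quantitation against a reference spectrum as described above rests on Beer-Lambert linearity: absorbance scales with concentration, so the best-fit scale of the reference onto the measured spectrum estimates the analyte concentration. A minimal least-squares sketch (the spectra and reference concentration are illustrative; the real method also handles baselines, impurities, and water features):

```python
def fit_concentration(measured, reference, ref_conc):
    """Least-squares scale of one reference spectrum onto a measured spectrum.

    Single-component case with no baseline term: the scale factor that
    minimizes the squared residual is dot(measured, reference)/dot(reference,
    reference); multiplying by the reference concentration gives the estimate.
    """
    scale = sum(m * r for m, r in zip(measured, reference)) / sum(r * r for r in reference)
    return scale * ref_conc

reference = [0.0, 0.2, 0.5, 0.2, 0.0]   # reference spectrum recorded at 10 ppm
measured  = [0.0, 0.1, 0.25, 0.1, 0.0]  # measured spectrum at exactly half intensity
print(fit_concentration(measured, reference, 10.0))  # → 5.0 (ppm)
```

Multi-component streams extend this to a linear least-squares fit over several reference spectra at once.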

  10. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications

    PubMed Central

    Meier, D.C.; Benkstein, K.D.; Hurst, W.S.; Chu, P.M.

    2016-01-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, −5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals. PMID:28090126

  11. Nucleic Acid i-Motif Structures in Analytical Chemistry.

    PubMed

    Alba, Joan Josep; Sadurní, Anna; Gargallo, Raimundo

    2016-09-02

    Under the appropriate experimental conditions of pH and temperature, cytosine-rich segments in DNA or RNA sequences may produce a characteristic folded structure known as an i-motif. Besides its potential role in vivo, which is still under investigation, this structure has attracted increasing interest in other fields due to its sharp, fast and reversible pH-driven conformational changes. This "on/off" switch at molecular level is being used in nanotechnology and analytical chemistry to develop nanomachines and sensors, respectively. This paper presents a review of the latest applications of this structure in the field of chemical analysis.

  12. Analytic Modeling of Pressurization and Cryogenic Propellant Conditions for Lunar Landing Vehicle

    NASA Technical Reports Server (NTRS)

    Corpening, Jeremy

    2010-01-01

    This slide presentation reviews the development, validation, and application of the model to the Lunar Landing Vehicle. The model, the Computational Propellant and Pressurization Program -- One Dimensional (CPPPO), is used here to model cryogenic propellant conditions of the Altair lunar lander. CPPPO was validated via comparison to an existing analytic model (i.e., ROCETS), a flight experiment, and ground experiments. The model was then used to perform a parametric analysis of pressurant conditions for the Lunar Landing Vehicle and to examine the results of unequal tank pressurization and draining for multiple tank designs.

  13. Visual analysis of large heterogeneous social networks by semantic and structural abstraction.

    PubMed

    Shen, Zeqian; Ma, Kwan-Liu; Eliassi-Rad, Tina

    2006-01-01

    Social network analysis is an active area of study beyond sociology. It uncovers the invisible relationships between actors in a network and provides understanding of social processes and behaviors. It has become an important technique in a variety of application areas such as the Web, organizational studies, and homeland security. This paper presents a visual analytics tool, OntoVis, for understanding large, heterogeneous social networks, in which nodes and links could represent different concepts and relations, respectively. These concepts and relations are related through an ontology (also known as a schema). OntoVis is named such because it uses information in the ontology associated with a social network to semantically prune a large, heterogeneous network. In addition to semantic abstraction, OntoVis also allows users to do structural abstraction and importance filtering to make large networks manageable and to facilitate analytic reasoning. All these unique capabilities of OntoVis are illustrated with several case studies.

  14. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here will focus on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (Figure: (A) MERRA/AS software stack; (B) example MERRA/AS interfaces.)
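The MapReduce approach to storage-based computation mentioned above amounts to mapping a combinable partial statistic over data chunks and then reducing the partials into one result; a minimal sequential sketch of the pattern (the chunking and the mean statistic are illustrative, not the MERRA/AS code):

```python
from functools import reduce

def map_chunk(chunk):
    """Map step: emit a combinable partial statistic (count, sum) per chunk."""
    return (len(chunk), sum(chunk))

def reduce_partials(a, b):
    """Reduce step: combine two partial statistics."""
    return (a[0] + b[0], a[1] + b[1])

chunks = [[1.0, 2.0], [3.0], [4.0, 5.0, 6.0]]  # chunks as stored across nodes
count, total = reduce(reduce_partials, map(map_chunk, chunks))
print(total / count)  # → 3.5
```

The point of the pattern is that `map_chunk` can run where each chunk is stored, and only the tiny (count, sum) partials travel to the reducer.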

  15. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  16. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  17. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  18. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  19. 21 CFR 314.50 - Content and format of an application.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the protocol and a description of the statistical analyses used to evaluate the study. If the study... application: (i) Three copies of the analytical procedures and related descriptive information contained in... the samples and to validate the applicant's analytical procedures. The related descriptive information...

  20. A genetic algorithm-based job scheduling model for big data analytics.

    PubMed

    Lu, Qinghua; Li, Shanshan; Zhang, Weishan; Zhang, Lei

    Big data analytics (BDA) applications are a new category of software applications that process large amounts of data using scalable parallel processing infrastructure to obtain hidden value. Hadoop is the most mature open-source big data analytics framework, which implements the MapReduce programming model to process big data with MapReduce jobs. Big data analytics jobs are often continuous and not mutually separated. Existing work mainly focuses on executing jobs in sequence, which is often inefficient and consumes substantial energy. In this paper, we propose a genetic algorithm-based job scheduling model for big data analytics applications to improve the efficiency of big data analytics. To implement the job scheduling model, we leverage an estimation module to predict the performance of clusters when executing analytics jobs. We have evaluated the proposed job scheduling model in terms of feasibility and accuracy.
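
    As a rough illustration of the genetic-algorithm scheduling idea (the paper's actual chromosome encoding, fitness function, and performance-estimation module are not reproduced here), the sketch below evolves job orderings to minimize makespan on a toy two-machine cluster; all job durations and GA parameters are invented.

```python
import random

# Toy job durations (hypothetical); a chromosome is a job execution order.
JOBS = [5, 3, 8, 2, 7, 4]
MACHINES = 2

def makespan(order):
    # Greedy list scheduling: assign each job to the least-loaded machine.
    loads = [0] * MACHINES
    for j in order:
        loads[loads.index(min(loads))] += JOBS[j]
    return max(loads)

def crossover(p1, p2):
    # Order crossover: keep a prefix of p1, fill the rest in p2's order.
    cut = random.randint(1, len(p1) - 1)
    head = p1[:cut]
    return head + [j for j in p2 if j not in head]

def mutate(order):
    # Swap two positions in the ordering.
    i, k = random.sample(range(len(order)), 2)
    order[i], order[k] = order[k], order[i]

def evolve(generations=50, pop_size=20, seed=0):
    random.seed(seed)
    pop = [random.sample(range(len(JOBS)), len(JOBS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        survivors = pop[: pop_size // 2]  # truncation selection (elitist)
        children = [crossover(random.choice(survivors), random.choice(survivors))
                    for _ in range(pop_size - len(survivors))]
        for c in children:
            if random.random() < 0.2:
                mutate(c)
        pop = survivors + children
    return min(pop, key=makespan)

best = evolve()
```

    In a real BDA setting, makespan would be replaced by the estimation module's predicted cluster performance for a candidate job sequence.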

  1. Practical, transparent prospective risk analysis for the clinical laboratory.

    PubMed

    Janssens, Pim Mw

    2014-11-01

    Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA method designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables make it easy to perform, practical and transparent. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
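
    A minimal sketch of the scoring logic described above, assuming R is computed as the product P × C and that a failure mode is flagged for detailed analysis when R is high or detection D is poor; the records and thresholds are illustrative, not taken from the paper.

```python
# Hypothetical failure-mode records: (process step, failure type, P, C, D).
# P = probability, C = consequence, D = chance of detection, all on 1-10.
failure_modes = [
    ("sample reception", "identification error",   3, 9, 6),
    ("centrifugation",   "delayed processing",     5, 4, 8),
    ("stat analysis",    "inappropriate analysis", 4, 8, 3),
]

def risk_score(p, c):
    # Overall risk as the product of probability and consequence,
    # a common FMEA-style convention (assumed here).
    return p * c

def needs_detailed_analysis(p, c, d, r_limit=25, d_limit=5):
    # Flag a failure mode when risk is high or detection is poor
    # (thresholds are invented for illustration).
    return risk_score(p, c) >= r_limit or d <= d_limit

flagged = [(step, ftype) for step, ftype, p, c, d in failure_modes
           if needs_detailed_analysis(p, c, d)]
```

    The matrix-table output of the paper corresponds to tabulating P, C, D, and R per process step and failure type in exactly this fashion.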

  2. An integrated platform for directly widely-targeted quantitative analysis of feces part I: Platform configuration and method validation.

    PubMed

    Song, Yuelin; Song, Qingqing; Li, Jun; Zheng, Jiao; Li, Chun; Zhang, Yuan; Zhang, Lingling; Jiang, Yong; Tu, Pengfei

    2016-07-08

    Direct analysis is of great importance to understand the real chemical profile of a given sample, notably biological materials, because chemical degradation as well as diverse errors and uncertainties might result from sophisticated protocols. In comparison with biofluids, direct analysis of solid biological samples using high performance liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) is still challenging. Herein, a new analytical platform was configured by online hyphenating pressurized liquid extraction (PLE), turbulent flow chromatography (TFC), and LC-MS/MS. A facile but robust PLE module was constructed based on the phenomenon that noticeable back-pressure can be generated during rapid fluid passage through a narrow tube. A TFC column, which is advantageous at extracting low-molecular-weight analytes from a rushing fluid, was connected at the outlet of the PLE module to capture constituents of interest. An electronic 6-port/2-position valve was introduced between the TFC column and LC-MS/MS to divide each measurement into extraction and elution phases, whereas LC-MS/MS took charge of analyte separation and monitoring. As a proof of concept, simultaneous determination of 24 endogenous substances including eighteen steroids, five eicosanoids, and one porphyrin in feces was carried out in this paper. Method validation assays demonstrated the analytical platform to be qualified for the direct, simultaneous measurement of diverse endogenous analytes in fecal matrices. Application of this integrated platform to homolog-focused profiling of feces is discussed in a companion paper. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Desorption electrospray ionization-mass spectrometry for the detection of analytes extracted by thin-film molecularly imprinted polymers.

    PubMed

    Van Biesen, Geert; Wiseman, Justin M; Li, Jessica; Bottaro, Christina S

    2010-09-01

    Desorption electrospray ionization-mass spectrometry (DESI-MS) is a powerful technique for the analysis of solid and liquid surfaces that has found numerous applications in the few years since its invention. For the first time, it is applied to the detection of analytes extracted by molecularly imprinted polymers (MIPs) in a thin-film format. MIPs formed with 2,4-dichlorophenoxyacetic acid (2,4-D) as the template were used for the extraction of this analyte from aqueous solutions spiked at concentrations of 0.0050-2.0 mg L(-1) (approximately 2 x 10(-8) to approximately 1 x 10(-5) M). The response was linear up to 0.50 mg L(-1), and then levelled off due to saturation of the active sites of the MIP. In MS/MS mode, the signal at 0.0050 mg L(-1) was still an order of magnitude higher than the signal of a blank. The MIP DESI-MS approach was also used for the analysis of tap water and river water spiked with 2,4-D and four analogues, which indicated that these analogues were also extracted to various extents. For practical applications of the MIP, a detection technique is required that can distinguish between these structurally similar compounds, and DESI-MS fulfills this purpose.

  4. Recent advances in magnesium assessment: From single selective sensors to multisensory approach.

    PubMed

    Lvova, Larisa; Gonçalves, Carla Guanais; Di Natale, Corrado; Legin, Andrey; Kirsanov, Dmitry; Paolesse, Roberto

    2018-03-01

    The development of efficient analytical procedures for the selective detection of magnesium is an important analytical task, since this element is one of the most abundant metals in cells and plays an essential role in a wide range of cellular processes. Magnesium imbalance has been related to several pathologies and diseases in plants and animals, as well as in humans, but suitable methods for magnesium detection, especially in living samples and biological environments, are scarce. Chemical sensors, due to their high reliability, simplicity of handling and instrumentation, and fast, real-time in situ and on-site analysis, are promising candidates for magnesium analysis and represent an attractive alternative to standard instrumental methods. Here the achievements in the development of chemical sensors for magnesium ion detection over the last decade are reviewed. The working principles and the main types of sensors applied are described. Focus is placed on the application of optical sensors and multisensory systems for magnesium assessment in different media. Further, a critical outlook on the employment of the multisensory approach, in comparison to single selective sensors, in biological samples is presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Analytical method for analysis of electromagnetic scattering from inhomogeneous spherical structures using duality principles

    NASA Astrophysics Data System (ADS)

    Kiani, M.; Abdolali, A.; Safari, M.

    2018-03-01

    In this article, an analytical approach is presented for the analysis of electromagnetic (EM) scattering from radially inhomogeneous spherical structures (RISSs) based on the duality principle. According to the spherical symmetry, similar angular dependencies in all the regions are considered using spherical harmonics. To extract the radial dependency, the system of differential equations of wave propagation along the inhomogeneity direction is equated with its dual planar counterpart. A general duality between the electromagnetic fields, parameters, and scattering parameters of the two structures is introduced. The validity of the proposed approach is verified through a comprehensive example. The presented approach substitutes a complicated problem in spherical coordinates with an easy, well-posed, and previously solved problem in planar geometry. This approach is valid for all continuously varying inhomogeneity profiles. One of the major advantages of the proposed method is the capability of studying two general and applicable types of RISSs. As an interesting application, a class of lens antennas based on the physical concept of gradient refractive index materials is introduced. The approach is used to analyze the EM scattering from the structure and validate the strong performance of the lens.

  6. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  7. Comparative Kinetic Analysis of Closed-Ended and Open-Ended Porous Sensors

    NASA Astrophysics Data System (ADS)

    Zhao, Yiliang; Gaur, Girija; Mernaugh, Raymond L.; Laibinis, Paul E.; Weiss, Sharon M.

    2016-09-01

    Efficient mass transport through porous networks is essential for achieving rapid response times in sensing applications utilizing porous materials. In this work, we show that open-ended porous membranes can overcome diffusion challenges experienced by closed-ended porous materials in a microfluidic environment. A theoretical model including both transport and reaction kinetics is employed to study the influence of flow velocity, bulk analyte concentration, analyte diffusivity, and adsorption rate on the performance of open-ended and closed-ended porous sensors integrated with flow cells. The analysis shows that open-ended pores enable analyte flow through the pores and greatly reduce the response time and analyte consumption for detecting large molecules with slow diffusivities compared with closed-ended pores for which analytes largely flow over the pores. Experimental confirmation of the results was carried out with open- and closed-ended porous silicon (PSi) microcavities fabricated in flow-through and flow-over sensor configurations, respectively. The adsorption behavior of small analytes onto the inner surfaces of closed-ended and open-ended PSi membrane microcavities was similar. However, for large analytes, PSi membranes in a flow-through scheme showed significant improvement in response times due to more efficient convective transport of analytes. The experimental results and theoretical analysis provide quantitative estimates of the benefits offered by open-ended porous membranes for different analyte systems.
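
    The transport-plus-reaction picture above can be caricatured with a Langmuir-type binding model in which the flow-over (closed-ended) case sees a reduced analyte concentration at the pore surface; this is a toy model with invented rate constants and transport factors, not the authors' full theoretical treatment.

```python
# Toy Langmuir binding kinetics:
#   d(theta)/dt = k_on * C_s * (1 - theta) - k_off * theta,
# where theta is fractional surface coverage and C_s is the analyte
# concentration at the pore surface. For the closed-ended (flow-over)
# case C_s is scaled down to mimic diffusion-limited delivery.
def surface_coverage(c_bulk, transport=1.0, k_on=1e4, k_off=1e-3,
                     dt=0.01, t_end=50.0):
    # Explicit Euler integration of the binding ODE.
    theta, t = 0.0, 0.0
    c_s = transport * c_bulk
    while t < t_end:
        theta += dt * (k_on * c_s * (1.0 - theta) - k_off * theta)
        t += dt
    return theta

open_ended   = surface_coverage(1e-6, transport=1.0)  # flow-through pores
closed_ended = surface_coverage(1e-6, transport=0.2)  # flow-over, diffusion-limited
```

    Even this crude model reproduces the qualitative result: for the same bulk concentration and binding constants, the diffusion-limited configuration accumulates less bound analyte in a fixed assay time.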

  8. Active Flash: Out-of-core Data Analytics on Flash Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boboila, Simona; Kim, Youngjae; Vazhkudai, Sudharshan S

    2012-01-01

    Next generation science will increasingly come to rely on the ability to perform efficient, on-the-fly analytics of data generated by high-performance computing (HPC) simulations, modeling complex physical phenomena. Scientific computing workflows are stymied by the traditional chaining of simulation and data analysis, creating multiple rounds of redundant reads and writes to the storage system, which grows in cost with the ever-increasing gap between compute and storage speeds in HPC clusters. Recent HPC acquisitions have introduced compute node-local flash storage as a means to alleviate this I/O bottleneck. We propose a novel approach, Active Flash, to expedite data analysis pipelines by migrating to the location of the data, the flash device itself. We argue that Active Flash has the potential to enable true out-of-core data analytics by freeing up both the compute core and the associated main memory. By performing analysis locally, dependence on limited bandwidth to a central storage system is reduced, while allowing this analysis to proceed in parallel with the main application. In addition, offloading work from the host to the more power-efficient controller reduces peak system power usage, which is already in the megawatt range and poses a major barrier to HPC system scalability. We propose an architecture for Active Flash, explore energy and performance trade-offs in moving computation from host to storage, demonstrate the ability of appropriate embedded controllers to perform data analysis and reduction tasks at speeds sufficient for this application, and present a simulation study of Active Flash scheduling policies. These results show the viability of the Active Flash model, and its capability to potentially have a transformative impact on scientific data analysis.

  9. Curriculum Analytics: Application of Social Network Analysis for Improving Strategic Curriculum Decision-Making in a Research-Intensive University

    ERIC Educational Resources Information Center

    Dawson, Shane; Hubball, Harry

    2014-01-01

    This paper provides insight into the use of curriculum analytics to enhance learning-centred curricula in diverse higher education contexts. Engagement in evidence-based practice to evaluate and monitor curricula is vital to the success and sustainability of efforts to reform undergraduate and graduate programs. Emerging technology-enabled inquiry…

  10. Application of Microchip Electrophoresis for Clinical Tests

    NASA Astrophysics Data System (ADS)

    Yatsushiro, Shouki; Kataoka, Masatoshi

    Microchip electrophoresis has recently attracted much attention in the field of nucleic acid analysis due to its high efficiency, ease of operation, low consumption of samples and reagents, and relatively low costs. In addition, its applications have expanded beyond DNA to the analysis of RNA, proteins, sugar chains, and cellular function. In this report, we present high-performance monitoring systems for human blood glucose levels and α-amylase activity in human plasma using microchip electrophoresis.

  11. Analytical methods for quantitation of prenylated flavonoids from hops.

    PubMed

    Nikolić, Dejan; van Breemen, Richard B

    2013-01-01

    The female flowers of hops (Humulus lupulus L.) are used as a flavoring agent in the brewing industry. There is growing interest in possible health benefits of hops, particularly as estrogenic and chemopreventive agents. Among the possible active constituents, most of the attention has focused on prenylated flavonoids, which can chemically be classified as prenylated chalcones and prenylated flavanones. Among chalcones, xanthohumol (XN) and desmethylxanthohumol (DMX) have been the most studied, while among flavanones, 8-prenylnaringenin (8-PN) and 6-prenylnaringenin (6-PN) have received the most attention. Because of the interest in medicinal properties of prenylated flavonoids, there is demand for accurate, reproducible and sensitive analytical methods to quantify these compounds in various matrices. Such methods are needed, for example, for quality control and standardization of hop extracts, measurement of the content of prenylated flavonoids in beer, and to determine pharmacokinetic properties of prenylated flavonoids in animals and humans. This review summarizes currently available analytical methods for quantitative analysis of the major prenylated flavonoids, with an emphasis on the LC-MS and LC-MS-MS methods and their recent applications to biomedical research on hops. This review covers all methods in which prenylated flavonoids have been measured, either as the primary analytes or as a part of a larger group of analytes. The review also discusses methodological issues relating to the quantitative analysis of these compounds regardless of the chosen analytical approach.

  12. Tank 241-B-108, cores 172 and 173 analytical results for the final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuzum, J.L., Fluor Daniel Hanford

    1997-03-04

    The Data Summary Table (Table 3) included in this report compiles analytical results in compliance with all applicable DQOs. Liquid subsamples that were prepared for analysis by an acid adjustment of the direct subsample are indicated by a `D` in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a fusion digest are indicated by an `F` in the A column in Table 3. Solid subsamples that were prepared for analysis by performing a water digest are indicated by a `W` or an `I` in the A column of Table 3. Due to poor precision and accuracy in the original analysis of both Lower Half Segment 2 of Core 173 and the core composite of Core 173, fusion and water digests were performed a second time. Precision and accuracy improved with the repreparation of the Core 173 composite. Analyses of the reprepared Lower Half Segment 2 of Core 173 did not show improvement and suggest sample heterogeneity. Results from both preparations are included in Table 3.

  13. Recent developments in computer vision-based analytical chemistry: A tutorial review.

    PubMed

    Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J

    2015-10-29

    Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to its several significant advantages, such as simplicity of use, and the fact that it is easily combinable with portable and widely distributed imaging devices, resulting in friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical chemistry (CVAC) procedures and systems from 2005 to 2015, a period of time when 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
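
    A minimal sketch of the colour-based quantification idea underlying such systems: fit the intensity of one colour channel against known standards, then invert the calibration for an unknown sample. The channel values and concentrations below are invented for illustration, not taken from the review.

```python
# Least-squares linear fit (slope, intercept) of y against x.
def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

# Hypothetical calibration: mean green-channel value of a colorimetric
# sensor spot decreases as analyte concentration (mM) increases.
concentrations  = [0.0, 1.0, 2.0, 4.0]
green_intensity = [200.0, 180.0, 160.0, 120.0]

slope, intercept = linear_fit(concentrations, green_intensity)

# Invert the calibration for an unknown sample reading of 150.
unknown = (150.0 - intercept) / slope
```

    Real CVAC procedures add the steps the review discusses, such as illuminant correction and colour-space conversion, before this final calibration stage.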

  14. RSE-40: An Alternate Scoring System for the Rosenberg Self-Esteem Scale (RSE).

    ERIC Educational Resources Information Center

    Wallace, Gaylen R.

    The Rosenberg Self-Esteem Inventory (RSE) is a 10-item scale purporting to measure self-esteem using self-acceptance and self-worth statements. This analysis covers concerns about the degree to which the RSE items represent a particular content universe, the RSE's applicability, factor analytic methods used, and the RSE's reliability and validity.…

  15. Examining Cohort Effects in Developmental Trajectories of Substance Use

    ERIC Educational Resources Information Center

    Burns, Alison Reimuller; Hussong, Andrea M.; Solis, Jessica M.; Curran, Patrick J.; McGinley, James S.; Bauer, Daniel J.; Chassin, Laurie; Zucker, Robert A.

    2017-01-01

    The current study demonstrates the application of an analytic approach for incorporating multiple time trends in order to examine the impact of cohort effects on individual trajectories of eight drugs of abuse. Parallel analysis of two independent, longitudinal studies of high-risk youth that span ages 10 to 40 across 23 birth cohorts between 1968…

  16. Target analyte quantification by isotope dilution LC-MS/MS directly referring to internal standard concentrations--validation for serum cortisol measurement.

    PubMed

    Maier, Barbara; Vogeser, Michael

    2013-04-01

    Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that determination of target analyte concentrations directly derived from the ratio of the target analyte peak area to the peak area of a corresponding stable isotope-labelled internal standard compound [direct isotope dilution analysis (DIDA)] may not be inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed in a comparative validation protocol for cortisol as an exemplary analyte by LC-MS/MS. Accuracy and reproducibility were compared between quantification either involving a six-point external calibration function, or a result calculation merely based on the peak area ratios of unlabelled and labelled analyte. Both quantification approaches resulted in similar accuracy and reproducibility. For specified analytes, reliable analyte quantification directly derived from the ratio of peak areas of labelled and unlabelled analyte, without the need for a time-consuming multi-point calibration series, is possible. This DIDA approach is of considerable practical importance for the application of LC-MS/MS in the clinical laboratory, where short turnaround times often have high priority.
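
    The DIDA calculation itself reduces to a one-line ratio, sketched below under the simplifying assumption of a unit response factor between the labelled and unlabelled forms; all peak areas and concentrations are hypothetical.

```python
# Direct isotope dilution: the analyte concentration follows from the
# analyte/internal-standard peak-area ratio and the known concentration
# of the labelled internal standard, with no external calibration curve.
def dida_concentration(area_analyte, area_is, conc_is, response_factor=1.0):
    # response_factor corrects for any difference in MS response between
    # labelled and unlabelled forms (assumed 1.0 in this sketch).
    return (area_analyte / area_is) * conc_is / response_factor

# Example: cortisol peak area 8.4e5 counts, labelled internal standard
# area 6.0e5 counts, internal standard spiked at 100 nmol/L.
c = dida_concentration(8.4e5, 6.0e5, 100.0)
```

    A six-point external calibration replaces the single known conc_is with a fitted curve measured in every series, which is what the DIDA approach avoids.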

  17. A guide for the application of analytics on healthcare processes: A dynamic view on patient pathways.

    PubMed

    Lismont, Jasmien; Janssens, Anne-Sophie; Odnoletkova, Irina; Vanden Broucke, Seppe; Caron, Filip; Vanthienen, Jan

    2016-10-01

    The aim of this study is to guide healthcare institutions in applying process analytics to healthcare processes. Process analytics techniques can offer new insights into patient pathways, workflow processes, adherence to medical guidelines and compliance with clinical pathways, but they also bring specific challenges, which are examined and addressed in this paper. The following methodology is proposed: log preparation, log inspection, abstraction and selection, clustering, process mining, and validation. It was applied to a case study in the type 2 diabetes mellitus domain. Several data pre-processing steps are applied and clarify the usefulness of process analytics in a healthcare setting. Healthcare utilization, such as diabetes education, is analyzed and compared with diabetes guidelines. Furthermore, we take a look at the organizational perspective and the central role of the GP. This research addresses four challenges: healthcare processes are often patient and hospital specific, which leads to unique traces and unstructured processes; data is not recorded in the right format, with the right level of abstraction and time granularity; an overflow of medical activities may cloud the analysis; and analysts need to deal with data not recorded for this purpose. These challenges complicate the application of process analytics. It is explained how our methodology takes them into account. Process analytics offers new insights into the medical services patients follow, how medical resources relate to each other and whether patients and healthcare processes comply with guidelines and regulations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Infrared photothermal imaging of trace explosives on relevant substrates

    NASA Astrophysics Data System (ADS)

    Kendziora, Christopher A.; Furstenberg, Robert; Papantonakis, Michael; Nguyen, Viet; Borchert, James; Byers, Jeff; McGill, R. Andrew

    2013-06-01

    We are developing a technique for the stand-off detection of trace explosives on relevant substrate surfaces using photo-thermal infrared (IR) imaging spectroscopy (PT-IRIS). This approach leverages one or more compact IR quantum cascade lasers, tuned to strong absorption bands in the analytes and directed to illuminate an area on a surface of interest. An IR focal plane array is used to image the surface and detect small increases in thermal emission upon laser illumination. The PT-IRIS signal is processed as a hyperspectral image cube comprised of spatial, spectral and temporal dimensions as vectors within a detection algorithm. The ability to detect trace analytes on relevant substrates is critical for stand-off applications, but is complicated by the optical and thermal analyte/substrate interactions. This manuscript describes recent PT-IRIS experimental results and analysis for traces of RDX, TNT, ammonium nitrate (AN) and sucrose on relevant substrates (steel, polyethylene, glass and painted steel panels). We demonstrate that these analytes can be detected on these substrates at relevant surface mass loadings (10 μg/cm2 to 100 μg/cm2) even at the single pixel level.

  19. IDIMS/GEOPAK: Users manual for a geophysical data display and analysis system

    NASA Technical Reports Server (NTRS)

    Libert, J. M.

    1982-01-01

    The application of an existing image analysis system to the display and analysis of geophysical data is described, and the potential for expanding the capabilities of such a system toward more advanced computer analytic and modeling functions is investigated. The major features of IDIMS (Interactive Display and Image Manipulation System) and its applicability to image-type analysis of geophysical data are described. A basic geophysical data processing system was developed to permit the image representation, coloring, interdisplay, and comparison of geophysical data sets using existing IDIMS functions, and to provide for the production of hard copies of processed images. An instruction manual and documentation for the GEOPAK subsystem were produced. A training course for personnel in the use of IDIMS/GEOPAK was conducted. The effectiveness of the current IDIMS/GEOPAK system for geophysical data analysis was evaluated.

  20. Hierarchical zwitterionic modification of a SERS substrate enables real-time drug monitoring in blood plasma

    NASA Astrophysics Data System (ADS)

    Sun, Fang; Hung, Hsiang-Chieh; Sinclair, Andrew; Zhang, Peng; Bai, Tao; Galvan, Daniel David; Jain, Priyesh; Li, Bowen; Jiang, Shaoyi; Yu, Qiuming

    2016-11-01

    Surface-enhanced Raman spectroscopy (SERS) is an ultrasensitive analytical technique with molecular specificity, making it an ideal candidate for therapeutic drug monitoring (TDM). However, in critical diagnostic media including blood, nonspecific protein adsorption coupled with weak surface affinities and small Raman activities of many analytes hinder the TDM application of SERS. Here we report a hierarchical surface modification strategy, first by coating a gold surface with a self-assembled monolayer (SAM) designed to attract or probe for analytes and then by grafting a non-fouling zwitterionic polymer brush layer to effectively repel protein fouling. We demonstrate how this modification can enable TDM applications by quantitatively and dynamically measuring the concentrations of several analytes--including an anticancer drug (doxorubicin), several TDM-requiring antidepressant and anti-seizure drugs, fructose and blood pH--in undiluted plasma. This hierarchical surface chemistry is widely applicable to many analytes and provides a generalized platform for SERS-based biosensing in complex real-world media.

  1. Strategy to improve the quantitative LC-MS analysis of molecular ions resistant to gas-phase collision induced dissociation: application to disulfide-rich cyclic peptides.

    PubMed

    Ciccimaro, Eugene; Ranasinghe, Asoka; D'Arienzo, Celia; Xu, Carrie; Onorato, Joelle; Drexler, Dieter M; Josephs, Jonathan L; Poss, Michael; Olah, Timothy

    2014-12-02

    Due to observed collision induced dissociation (CID) fragmentation inefficiency, developing sensitive liquid chromatography tandem mass spectrometry (LC-MS/MS) assays for CID resistant compounds is especially challenging. As an alternative to traditional LC-MS/MS, we present here a methodology that preserves the intact analyte ion for quantification by selectively filtering ions while reducing chemical noise. Utilizing a quadrupole-Orbitrap MS, the target ion is selectively isolated while interfering matrix components undergo MS/MS fragmentation by CID, allowing noise-free detection of the analyte's surviving molecular ion. In this manner, CID affords additional selectivity during high resolution accurate mass analysis by elimination of isobaric interferences, a fundamentally different concept than the traditional approach of monitoring a target analyte's unique fragment following CID. This survivor-selected ion monitoring (survivor-SIM) approach has allowed sensitive and specific detection of disulfide-rich cyclic peptides extracted from plasma.

  2. Sensor failure detection for jet engines

    NASA Technical Reports Server (NTRS)

    Merrill, Walter C.

    1988-01-01

    The use of analytical redundancy to improve gas turbine engine control system reliability through sensor failure detection, isolation, and accommodation is surveyed. Both the theoretical and application papers that form the technology base of turbine engine analytical redundancy research are discussed. Also, several important application efforts are reviewed. An assessment of the state-of-the-art in analytical redundancy technology is given.

  3. Big data analytics to improve cardiovascular care: promise and challenges.

    PubMed

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  4. Visual Analytics of integrated Data Systems for Space Weather Purposes

    NASA Astrophysics Data System (ADS)

    Rosa, Reinaldo; Veronese, Thalita; Giovani, Paulo

    Analysis of information from multiple data sources obtained through high resolution instrumental measurements has become a fundamental task in all scientific areas. The development of expert methods able to treat such multi-source data systems, with both large variability and measurement extension, is key to studying complex scientific phenomena, especially those related to systemic analysis in space and environmental sciences. In this talk, we present a time series generalization introducing the concept of the generalized numerical lattice, which represents a discrete sequence of temporal measures for a given variable. In this novel representation approach, each generalized numerical lattice carries post-analytical information about the data. We define a generalized numerical lattice as a set of three parameters representing the following data properties: dimensionality, size and post-analytical measure (e.g., the autocorrelation, Hurst exponent, etc.) [1]. From this representation generalization, any multi-source database can be reduced to a closed set of classified time series in spatiotemporal generalized dimensions. As a case study, we show a preliminary application to space science data, highlighting the possibility of a real-time analysis expert system. In this particular application, we have selected and analyzed, using detrended fluctuation analysis (DFA), several decimetric solar bursts associated with X-class flares. The association with geomagnetic activity is also reported. The DFA method is applied within the framework of a radio burst automatic monitoring system. Our results characterize the evolution of the variability pattern by computing the DFA scaling exponent over short sliding windows preceding the extreme event [2]. For the first time, the application of systematic fluctuation analysis for space weather purposes is presented.
    The prototype for visual analytics is implemented in CUDA (Compute Unified Device Architecture) using Nvidia K20 graphics processing units (GPUs) to reduce the integrated analysis runtime. [1] Veronese et al. doi: 10.6062/jcis.2009.01.02.0021, 2010. [2] Veronese et al. doi:http://dx.doi.org/10.1016/j.jastp.2010.09.030, 2011.
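    The DFA computation described in this record follows a standard recipe: integrate the mean-subtracted series, detrend it with linear fits over windows of each size, and regress the log of the fluctuation on the log of the window size. A minimal pure-Python sketch of that recipe (function and parameter names are illustrative, not taken from the paper):

```python
import math

def dfa_exponent(series, window_sizes):
    """Estimate the DFA scaling exponent of a 1-D series."""
    mean = sum(series) / len(series)
    # Cumulative sum (the "profile") of the mean-subtracted series.
    profile, total = [], 0.0
    for x in series:
        total += x - mean
        profile.append(total)

    log_n, log_f = [], []
    for n in window_sizes:
        sq_residuals, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            window = profile[start:start + n]
            # Least-squares linear fit y = a*t + b within the window.
            t = list(range(n))
            t_mean, y_mean = sum(t) / n, sum(window) / n
            num = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, window))
            den = sum((ti - t_mean) ** 2 for ti in t)
            a = num / den
            b = y_mean - a * t_mean
            sq_residuals += sum((yi - (a * ti + b)) ** 2
                                for ti, yi in zip(t, window))
            count += n
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(sq_residuals / count))  # log F(n)

    # Slope of log F(n) vs log n is the DFA exponent alpha.
    m = len(log_n)
    x_mean, y_mean = sum(log_n) / m, sum(log_f) / m
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(log_n, log_f))
    den = sum((x - x_mean) ** 2 for x in log_n)
    return num / den
```

For uncorrelated noise the exponent comes out near 0.5, and near 1.5 for its running sum (a random walk); a shift in this scaling behaviour over short windows is the kind of signature a monitoring system could watch for ahead of an extreme event.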

  5. The use of ultra-high pressure liquid chromatography with tandem mass spectrometric detection for the analysis of agrochemical residues and mycotoxins in food - challenges and applications

    USDA-ARS?s Scientific Manuscript database

    In the field of food contaminant analysis, the most significant development of recent years has been the integration of ultra-high pressure liquid chromatography (UHPLC), coupled to tandem quadrupole mass spectrometry (MS/MS), into analytical applications. In this review, we describe the emergence o...

  6. Analysis methods for Kevlar shield response to rotor fragments

    NASA Technical Reports Server (NTRS)

    Gerstle, J. H.

    1977-01-01

    Several empirical and analytical approaches to rotor burst shield sizing are compared and principal differences in metal and fabric dynamic behavior are discussed. The application of transient structural response computer programs to predict Kevlar containment limits is described. For preliminary shield sizing, present analytical methods are useful if insufficient test data for empirical modeling are available. To provide other information useful for engineering design, analytical methods require further developments in material characterization, failure criteria, loads definition, and post-impact fragment trajectory prediction.

  7. Review on microfluidic paper-based analytical devices towards commercialisation.

    PubMed

    Akyazi, Tugce; Basabe-Desmonts, Lourdes; Benito-Lopez, Fernando

    2018-02-25

    Paper-based analytical devices introduce an innovative platform technology for fluid handling and analysis, with a wide range of applications, promoting low cost, ease of fabrication/operation and equipment independence. This review gives a general overview of the fabrication techniques reported to date, revealing and discussing their weak points as well as the newest approaches intended to overcome current mass-production limitations and thereby enable commercialisation. Moreover, this review aims especially to highlight novel technologies appearing in the literature for the effective handling and control of fluids. The lack of flow control is the main problem of paper-based analytical devices; it creates obstacles for marketing and slows down the transition of paper devices from the laboratory into the consumers' hands. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Composite material bend-twist coupling for wind turbine blade applications

    NASA Astrophysics Data System (ADS)

    Walsh, Justin M.

    Current efforts in wind turbine blade design seek to employ bend-twist coupling of composite materials for passive power control by twisting blades to feather. Past efforts in this area have proved problematic, especially in the formulation of the bend-twist coupling coefficient alpha. Kevlar/epoxy, carbon/epoxy and glass/epoxy specimens were manufactured to study bend-twist coupling, against which numerical and analytical models could be verified. Finite element analysis was implemented to evaluate the effects of fiber orientation and material properties on coupling magnitude. An analytical/empirical model was then derived to describe the numerical results and serve as a replacement for the commonly used coupling coefficient alpha. Through the results from the numerical and analytical models, a foundation is provided for the aeroelastic design of wind turbine blades utilizing biased composite materials.

  9. Application of stiffened cylinder analysis to ATP interior noise studies

    NASA Technical Reports Server (NTRS)

    Wilby, E. G.; Wilby, J. F.

    1983-01-01

    An analytical model developed to predict the interior noise of propeller driven aircraft was applied to experimental configurations for a Fairchild Swearingen Metro II fuselage exposed to simulated propeller excitation. The floor structure of the test fuselage was of unusual construction, being mounted on air springs. As a consequence, the analytical model was extended to include a floor treatment transmission coefficient which could be used to describe vibration attenuation through the mounts. Good agreement was obtained between measured and predicted noise reductions when the floor treatment transmission loss was about 20 dB - a value which is consistent with the vibration attenuation provided by the mounts. The analytical model was also adapted to allow the prediction of noise reductions associated with boundary layer excitation as well as propeller and reverberant noise.

  10. Buckling of a stiff thin film on an elastic graded compliant substrate.

    PubMed

    Chen, Zhou; Chen, Weiqiu; Song, Jizhou

    2017-12-01

    The buckling of a stiff film on a compliant substrate has attracted much attention due to its wide applications such as thin-film metrology, surface patterning and stretchable electronics. An analytical model is established for the buckling of a stiff thin film on a semi-infinite elastic graded compliant substrate subjected to in-plane compression. The critical compressive strain and buckling wavelength for the sinusoidal mode are obtained analytically for the case with the substrate modulus decaying exponentially. The rigorous finite element analysis (FEA) is performed to validate the analytical model and investigate the postbuckling behaviour of the system. The critical buckling strain for the period-doubling mode is obtained numerically. The influences of various material parameters on the results are investigated. These results are helpful to provide physical insights on the buckling of elastic graded substrate-supported thin film.
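    For orientation, the classical sinusoidal wrinkling result for a stiff film on a *homogeneous* compliant substrate, which a graded-substrate model of this kind should recover in the limit of a uniform substrate modulus, can be written in terms of the plane-strain moduli of film (f) and substrate (s) and the film thickness h_f:

```latex
\varepsilon_c = \frac{1}{4}\left(\frac{3\bar{E}_s}{\bar{E}_f}\right)^{2/3},
\qquad
\lambda = 2\pi h_f\left(\frac{\bar{E}_f}{3\bar{E}_s}\right)^{1/3},
\qquad
\bar{E} = \frac{E}{1-\nu^2}.
```

The exponentially decaying substrate modulus treated in the paper modifies both the critical strain and the wavelength relative to these homogeneous-substrate expressions.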

  11. Buckling of a stiff thin film on an elastic graded compliant substrate

    NASA Astrophysics Data System (ADS)

    Chen, Zhou; Chen, Weiqiu; Song, Jizhou

    2017-12-01

    The buckling of a stiff film on a compliant substrate has attracted much attention due to its wide applications such as thin-film metrology, surface patterning and stretchable electronics. An analytical model is established for the buckling of a stiff thin film on a semi-infinite elastic graded compliant substrate subjected to in-plane compression. The critical compressive strain and buckling wavelength for the sinusoidal mode are obtained analytically for the case with the substrate modulus decaying exponentially. The rigorous finite element analysis (FEA) is performed to validate the analytical model and investigate the postbuckling behaviour of the system. The critical buckling strain for the period-doubling mode is obtained numerically. The influences of various material parameters on the results are investigated. These results are helpful to provide physical insights on the buckling of elastic graded substrate-supported thin film.

  12. On the accuracy of analytical models of impurity segregation during directional melt crystallization and their applicability for quantitative calculations

    NASA Astrophysics Data System (ADS)

    Voloshin, A. E.; Prostomolotov, A. I.; Verezub, N. A.

    2016-11-01

    The paper deals with the analysis of the accuracy of some one-dimensional (1D) analytical models of the axial distribution of impurities in the crystal grown from a melt. The models proposed by Burton-Prim-Slichter, Ostrogorsky-Muller and Garandet with co-authors are considered, these models are compared to the results of a two-dimensional (2D) numerical simulation. Stationary solutions as well as solutions for the initial transient regime obtained using these models are considered. The sources of errors are analyzed, a conclusion is made about the applicability of 1D analytical models for quantitative estimates of impurity incorporation into the crystal sample as well as for the solution of the inverse problems.
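    Of the 1D models named above, the Burton-Prim-Slichter expression for the effective segregation coefficient is compact enough to state directly. A small sketch under the usual BPS assumptions (stagnant solute boundary layer of thickness delta, growth rate V, impurity diffusivity D in the melt); the function name is illustrative:

```python
import math

def bps_keff(k0, growth_rate, delta, diffusivity):
    """Burton-Prim-Slichter effective segregation coefficient.

    k0:          equilibrium segregation coefficient (dimensionless)
    growth_rate: crystal growth rate V (m/s)
    delta:       solute boundary-layer thickness (m)
    diffusivity: impurity diffusivity in the melt D (m^2/s)
    """
    return k0 / (k0 + (1.0 - k0) * math.exp(-growth_rate * delta / diffusivity))
```

The two limits provide a quick sanity check: as V -> 0 the effective coefficient tends to the equilibrium value k0, and at large growth rates it tends to 1 (complete solute trapping), which is one of the behaviours the 2D numerical simulation can be compared against.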

  13. Aptamer-Modified Magnetic Beads in Biosensing

    PubMed Central

    Scheper, Thomas; Walter, Johanna-Gabriela

    2018-01-01

    Magnetic beads (MBs) are versatile tools for the purification, detection, and quantitative analysis of analytes from complex matrices. The superparamagnetic property of magnetic beads qualifies them for various analytical applications. To provide specificity, MBs can be decorated with ligands like aptamers, antibodies and peptides. In this context, aptamers are emerging as particularly promising ligands due to a number of advantages. Most importantly, the chemical synthesis of aptamers enables straightforward and controlled chemical modification with linker molecules and dyes. Moreover, aptamers facilitate novel sensing strategies based on their oligonucleotide nature that cannot be realized with conventional peptide-based ligands. Due to these benefits, the combination of aptamers and MBs has already been used in various analytical applications, which are summarized in this article. PMID:29601533

  14. Flow analysis techniques for phosphorus: an overview.

    PubMed

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review of the implementation and results of different flow analytical techniques for the determination of phosphorus is presented. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely: segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also provided. The most relevant manuscripts regarding the analysis of phosphorus by flow techniques are classified according to the instrumental detection technique used, to facilitate their study and provide an overall picture. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrices is tabulated, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms.

  15. Recent advances in applications of nanomaterials for sample preparation.

    PubMed

    Xu, Linnan; Qi, Xiaoyue; Li, Xianjiang; Bai, Yu; Liu, Huwei

    2016-01-01

    Sample preparation is a key step for qualitative and quantitative analysis of trace analytes in complicated matrices. Along with the rapid development of nanotechnology in materials science, numerous nanomaterials have been developed with particularly useful applications in analytical chemistry. Benefitting from their high specific surface areas, increased surface activities, and unprecedented physical/chemical properties, the potential of nanomaterials for rapid and efficient sample preparation has been exploited extensively. In this review, recent progress on novel nanomaterials applied in sample preparation is summarized and discussed. Both nanoparticles and nanoporous materials are evaluated for their unusual performance in sample preparation. Various compositions and functionalizations have extended the applications of nanomaterials in sample preparation, and distinct size and shape selectivity arises from the diversified pore structures of nanoporous materials. Such great variety makes nanomaterials versatile tools in sample preparation for almost all categories of analytes. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Thin layer chromatography residue applicator sampler

    DOEpatents

    Nunes, Peter J [Danville, CA; Kelly, Fredrick R [Modesto, CA; Haas, Jeffrey S [San Ramon, CA; Andresen, Brian D [Livermore, CA

    2007-07-24

    A thin layer chromatography residue applicator sampler. The residue applicator sampler provides for rapid analysis of samples containing high explosives, chemical warfare agents, and other analytes of interest under field conditions. This satisfies the need for a field-deployable, small, hand-held, all-in-one device for efficient sampling, sample dissolution, and sample application to an analytical technique. The residue applicator sampler includes a sampling sponge that is resistant to most chemicals and is fastened via a plastic handle in a hermetically sealed tube containing a known amount of solvent. Upon use, the wetted sponge is removed from the sealed tube and used as a swiping device across an environmental sample. The sponge is then replaced in the hermetically sealed tube, where the sample remains contained and dissolved in the solvent. A small pipette tip is removably contained in the hermetically sealed tube. The sponge is removed and placed into the pipette tip, where squeezing the dissolved sample out of the sponge into the pipette tip yields a droplet that is either captured in a vial for later instrumental analysis or applied directly to a thin layer chromatography plate for immediate analysis.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daley, P F

    The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, ASAP; A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstration of a simplified, site-specific analytical method, were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would allow plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas.
    Preparation of calibration curves, method detection limit estimates and trend plotting were performed with spreadsheets and statistics software. Moreover, the analytical method developed was very limited in compound coverage and unable to closely mirror the standard analytical methods promulgated by the EPA. To address these deficiencies, during this award the original equipment was operated at the OU 2-GTS to further evaluate the use of columns, commercial standard blends and other components to broaden the compound coverage of the chromatography system. A second-generation ASAP was designed and built to replace the original system at the OU 2-GTS, including provision for introduction of internal standard compounds and surrogates into each sample analyzed. An enhanced, LabVIEW based chromatogram analysis application was written that manages and archives chemical standards information and provides a basis for NIST traceability for all analyses. Within this same package, all compound calibration response curves are managed, and different report formats were incorporated that simplify trend analysis. Test results focus on operation of the original system at the OU 1 Integrated Chemical and Flow Monitoring System, at the OU 1 Fire Drill Area remediation site.

  18. Evaluation of a recent product to remove lipids and other matrix co-extractives in the analysis of pesticide residues and environmental contaminants in foods.

    PubMed

    Han, Lijun; Matarrita, Jessie; Sapozhnikova, Yelena; Lehotay, Steven J

    2016-06-03

    This study demonstrates the application of a novel lipid removal product to the residue analysis of 65 pesticides and 52 environmental contaminants in kale, pork, salmon, and avocado by fast, low-pressure gas chromatography-tandem mass spectrometry (LPGC-MS/MS). Sample preparation involves QuEChERS extraction followed by use of EMR-Lipid ("enhanced matrix removal of lipids") and an additional salting-out step for cleanup. The optimal amount of EMR-Lipid was determined to be 500 mg for 2.5 mL extracts for most of the analytes. The co-extractive removal efficiency of the EMR-Lipid cleanup step was 83-98% for fatty samples and 79% for kale, including 76% removal of chlorophyll. Matrix effects were typically less than ±20%, in part because analyte protectants were used in the LPGC-MS/MS analysis. The recoveries of polycyclic aromatic hydrocarbons and diverse pesticides were mostly 70-120%, whereas recoveries of nonpolar polybrominated diphenyl ethers and polychlorinated biphenyls were mostly lower than 70% through the cleanup procedure. With the use of internal standards, method validation results showed that 76-85 of the 117 analytes achieved satisfactory results (recoveries of 70-120% and RSD ≤ 20%) in pork, avocado, and kale, while 53 analytes had satisfactory results in salmon. Detection limits were 5-10 ng/g for all but a few analytes. EMR-Lipid is a new sample preparation tool that serves as another useful option for cleanup in multiresidue analysis, particularly of fatty foods. Published by Elsevier B.V.
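    The matrix-effect and recovery percentages quoted in validation studies like this one follow conventional definitions: signal change in matrix relative to pure solvent, and response of an analyte spiked before extraction relative to the same amount spiked after extraction. A minimal sketch of that arithmetic, with hypothetical function names:

```python
def matrix_effect_pct(matrix_matched_response, solvent_response):
    """Percent matrix effect: signal change in matrix vs. pure solvent.
    Negative values indicate suppression, positive values enhancement."""
    return (matrix_matched_response / solvent_response - 1.0) * 100.0

def recovery_pct(pre_spike_response, post_spike_response):
    """Percent extraction recovery: analyte spiked before extraction
    relative to the same amount spiked after extraction."""
    return pre_spike_response / post_spike_response * 100.0
```

Under these definitions, the "±20%" criterion in the abstract corresponds to matrix_effect_pct values between -20 and +20, and the "70-120%" window applies to recovery_pct.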

  19. On the theory of drainage area for regular and non-regular points.

    PubMed

    Bonetti, S; Bragg, A D; Porporato, A

    2018-03-01

    The drainage area is an important, non-local property of a landscape, which controls surface and subsurface hydrological fluxes. Its role in numerous ecohydrological and geomorphological applications has given rise to several numerical methods for its computation. However, its theoretical analysis has lagged behind. Only recently, an analytical definition for the specific catchment area was proposed (Gallant & Hutchinson. 2011 Water Resour. Res. 47 , W05535. (doi:10.1029/2009WR008540)), with the derivation of a differential equation whose validity is limited to regular points of the watershed. Here, we show that such a differential equation can be derived from a continuity equation (Chen et al. 2014 Geomorphology 219 , 68-86. (doi:10.1016/j.geomorph.2014.04.037)) and extend the theory to critical and singular points both by applying Gauss's theorem and by means of a dynamical systems approach to define basins of attraction of local surface minima. Simple analytical examples as well as applications to more complex topographic surfaces are examined. The theoretical description of topographic features and properties, such as the drainage area, channel lines and watershed divides, can be broadly adopted to develop and test the numerical algorithms currently used in digital terrain analysis for the computation of the drainage area, as well as for the theoretical analysis of landscape evolution and stability.
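    The numerical algorithms mentioned in the closing sentence are typified by the D8 flow-accumulation scheme: every grid cell passes its accumulated area to its steepest downslope neighbour. A compact pure-Python sketch (illustrative only; real digital terrain packages handle flats, pits and boundary conditions far more carefully):

```python
def d8_drainage_area(elev):
    """Drainage area (in cell counts) on a rectangular grid by the D8 rule.
    Cells are processed from highest to lowest elevation so that all
    upstream contributions are available when a cell is visited."""
    rows, cols = len(elev), len(elev[0])
    area = [[1.0] * cols for _ in range(rows)]  # each cell contributes itself
    order = sorted(((elev[r][c], r, c) for r in range(rows) for c in range(cols)),
                   reverse=True)
    for z, r, c in order:
        # Find the steepest strictly-downslope D8 neighbour.
        best, target = 0.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    drop = (z - elev[nr][nc]) / (dr * dr + dc * dc) ** 0.5
                    if drop > best:
                        best, target = drop, (nr, nc)
        if target is not None:
            area[target[0]][target[1]] += area[r][c]
    return area
```

Analytical results such as the differential equation for the specific catchment area give exact solutions on simple surfaces (inclined planes, cones) against which discrete schemes like this one can be tested.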

  20. On the theory of drainage area for regular and non-regular points

    NASA Astrophysics Data System (ADS)

    Bonetti, S.; Bragg, A. D.; Porporato, A.

    2018-03-01

    The drainage area is an important, non-local property of a landscape, which controls surface and subsurface hydrological fluxes. Its role in numerous ecohydrological and geomorphological applications has given rise to several numerical methods for its computation. However, its theoretical analysis has lagged behind. Only recently, an analytical definition for the specific catchment area was proposed (Gallant & Hutchinson. 2011 Water Resour. Res. 47, W05535. (doi:10.1029/2009WR008540)), with the derivation of a differential equation whose validity is limited to regular points of the watershed. Here, we show that such a differential equation can be derived from a continuity equation (Chen et al. 2014 Geomorphology 219, 68-86. (doi:10.1016/j.geomorph.2014.04.037)) and extend the theory to critical and singular points both by applying Gauss's theorem and by means of a dynamical systems approach to define basins of attraction of local surface minima. Simple analytical examples as well as applications to more complex topographic surfaces are examined. The theoretical description of topographic features and properties, such as the drainage area, channel lines and watershed divides, can be broadly adopted to develop and test the numerical algorithms currently used in digital terrain analysis for the computation of the drainage area, as well as for the theoretical analysis of landscape evolution and stability.

  1. Comprehensive two-dimensional liquid chromatography for polyphenol analysis in foodstuffs.

    PubMed

    Cacciola, Francesco; Farnetti, Sara; Dugo, Paola; Marriott, Philip John; Mondello, Luigi

    2017-01-01

    Polyphenols are a class of plant secondary metabolites that are recently drawing a special interest because of their broad spectrum of pharmacological effects. As they are characterized by an enormous structural variability, the identification of these molecules in food samples is a difficult task, and sometimes having only a limited number of commercially available reference materials is not of great help. One-dimensional liquid chromatography is the most widely applied analytical approach for their analysis. In particular, the hyphenation of liquid chromatography to mass spectrometry has come to play an influential role by allowing relatively fast tentative identification and accurate quantification of polyphenolic compounds at trace levels in vegetable media. However, when dealing with very complex real-world food samples, a single separation system often does not provide sufficient resolving power for attaining rewarding results. Comprehensive two-dimensional liquid chromatography is a technique of great analytical impact, since it offers much higher peak capacities than separations in a single dimension. In the present review, we describe applications in the field of comprehensive two-dimensional liquid chromatography for polyphenol analysis in real-world food samples. Comprehensive two-dimensional liquid chromatography applications to nonfood matrices fall outside the scope of the current report and will not be discussed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. A method for determining spiral-bevel gear tooth geometry for finite element analysis

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.; Litvin, Faydor L.

    1991-01-01

    An analytical method was developed to determine gear tooth surface coordinates of face-milled spiral bevel gears. The method uses the basic gear design parameters in conjunction with the kinematical aspects of spiral bevel gear manufacturing machinery. A computer program, SURFACE, was developed. The computer program calculates the surface coordinates and outputs 3-D model data that can be used for finite element analysis. Development of the modeling method and an example case are presented. This analysis method could also find application for gear inspection and near-net-shape gear forging die design.

  3. Mars Geochemical Instrument (MarGI): An instrument for the analysis of the Martian surface and the search for evidence of life

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Mancinelli, Rocco; Martin, Joe; Holland, Paul M.; Stimac, Robert M.; Kaye, William J.

    2005-01-01

    The Mars Geochemical Instrument, MarGI, was developed to provide a comprehensive analysis of the rocks and surface material on Mars. The instrument combines Differential Thermal Analysis (DTA) with miniature Gas Chromatography-Ion Mobility Spectrometry (GC-IMS) to identify minerals, the presence and state of water, and organic compounds. Miniature pyrolysis ovens are used both to conduct DTA analysis of soil or crushed rock samples and to pyrolyze the samples at temperatures up to 1000 degrees C for GC-IMS analysis of the released gases. This combination of analytical processes and techniques, which can characterize the mineralogy of the rocks and soil and identify and quantify volatiles released during pyrolysis, has applications across a wide range of target sites including comets, planets, asteroids, and moons such as Titan and Europa. The MarGI analytical approach evolved from the Cometary Ice and Dust Experiment (CIDEX) selected to fly on the Comet Rendezvous Asteroid Flyby Mission (CRAF).

  4. On the Temporal Stability of Analyte Recognition with an E-Nose Based on a Metal Oxide Sensor Array in Practical Applications.

    PubMed

    Kiselev, Ilia; Sysoev, Victor; Kaikov, Igor; Koronczi, Ilona; Adil Akai Tegin, Ruslan; Smanalieva, Jamila; Sommer, Martin; Ilicali, Coskan; Hauptmannl, Michael

    2018-02-11

    The paper deals with the functional instability of electronic nose (e-nose) units, which significantly limits their real-life applications. Here we demonstrate how to approach this issue with the example of an e-nose based on a metal oxide sensor array developed at the Karlsruhe Institute of Technology (Germany). We consider the instability of e-nose operation at different time scales, ranging from minutes to many years. To test the e-nose, we employ open-air and headspace sampling of analyte odors. The multivariate recognition algorithm used to process the multisensor array signals is based on the linear discriminant analysis method. Based on the results obtained, we argue that the stability of device operation is mostly affected by accidental changes in the ambient air composition. To overcome these instabilities, we introduce an add-training procedure, which is found to successfully manage both temporal changes in the ambient air and the drift of the multisensor array properties, even long-term. The method can be easily implemented in practical applications of e-noses and improves prospects for device marketing.
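    The linear discriminant analysis step is the part of such a recognition pipeline that is easy to make concrete. Below is a minimal two-class Fisher LDA for two-dimensional feature vectors (e.g. the responses of two sensors per sniff), written as an illustrative pure-Python sketch rather than the authors' actual implementation:

```python
def fisher_lda_2d(class_a, class_b):
    """Two-class Fisher LDA for 2-feature samples. Returns a classifier
    f(x) -> 'a' or 'b'. Direction w = Sw^-1 (mean_b - mean_a), with the
    decision threshold at the midpoint of the projected class means."""
    def mean(samples):
        n = len(samples)
        return [sum(s[i] for s in samples) / n for i in (0, 1)]

    def scatter(samples, m):
        s = [[0.0, 0.0], [0.0, 0.0]]
        for x in samples:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in (0, 1):
                for j in (0, 1):
                    s[i][j] += d[i] * d[j]
        return s

    ma, mb = mean(class_a), mean(class_b)
    sa, sb = scatter(class_a, ma), scatter(class_b, mb)
    sw = [[sa[i][j] + sb[i][j] for j in (0, 1)] for i in (0, 1)]
    # Invert the 2x2 within-class scatter matrix.
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    diff = [mb[0] - ma[0], mb[1] - ma[1]]
    w = [inv[0][0] * diff[0] + inv[0][1] * diff[1],
         inv[1][0] * diff[0] + inv[1][1] * diff[1]]
    thresh = 0.5 * (w[0] * (ma[0] + mb[0]) + w[1] * (ma[1] + mb[1]))
    return lambda x: 'b' if w[0] * x[0] + w[1] * x[1] > thresh else 'a'
```

The add-training idea described in the abstract would amount to re-running this fit with newly acquired, labelled samples appended to each class, so that the discriminant tracks ambient changes and sensor drift.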

  5. Solid-Phase Extraction (SPE): Principles and Applications in Food Samples.

    PubMed

    Ötles, Semih; Kartal, Canan

    2016-01-01

    Solid-Phase Extraction (SPE) is a sample preparation method practised in numerous application fields owing to its many advantages over traditional methods. SPE was invented as an alternative to liquid/liquid extraction and eliminated multiple disadvantages, such as the use of large amounts of solvent, extended operation times and procedure steps, potential sources of error, and high cost. Moreover, SPE can optionally be combined with other analytical methods and sample preparation techniques. Its versatility makes the SPE technique a useful tool for many purposes. Isolation, concentration, purification and clean-up are the main approaches in the practice of this method. Food structures represent a complicated matrix and can take different physical states, such as solid, viscous or liquid. Therefore, the sample preparation step plays a particularly important role in the determination of specific compounds in foods. SPE offers many opportunities not only for the analysis of a large diversity of food samples but also for optimization and advances. This review aims to provide a comprehensive overview of the basic principles of SPE and its applications for many analytes in food matrices.

  6. On the Temporal Stability of Analyte Recognition with an E-Nose Based on a Metal Oxide Sensor Array in Practical Applications

    PubMed Central

    Kaikov, Igor; Koronczi, Ilona; Adil Akai Tegin, Ruslan; Smanalieva, Jamila; Sommer, Martin; Ilicali, Coskan; Hauptmannl, Michael

    2018-01-01

    The paper deals with the functional instability of electronic nose (e-nose) units, which significantly limits their real-life applications. Here we demonstrate how to approach this issue with the example of an e-nose based on a metal oxide sensor array developed at the Karlsruhe Institute of Technology (Germany). We consider the instability of e-nose operation at different time scales ranging from minutes to many years. To test the e-nose we employ open-air and headspace sampling of analyte odors. The multivariate recognition algorithm used to process the multisensor array signals is based on the linear discriminant analysis method. Based on the results obtained, we argue that the stability of device operation is mostly affected by accidental changes in the ambient air composition. To overcome these instabilities, we introduce an add-training procedure which is found to successfully manage both the temporal changes of the ambient air and the drift of multisensor array properties, even in the long term. The method can be easily implemented in practical applications of e-noses and improves prospects for device marketing. PMID:29439468
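
    The recognition scheme described above, a linear discriminant analysis classifier that is periodically retrained on additional labeled batches when ambient conditions change, can be sketched with scikit-learn. This is a minimal illustration on synthetic data; the sensor count, noise levels, and drift model below are hypothetical and not taken from the paper.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic multisensor-array responses: 3 analyte classes, 8 metal oxide
# sensors, each class with a characteristic mean response pattern plus noise.
means = rng.uniform(0.5, 2.0, size=(3, 8))
X = np.vstack([m + 0.05 * rng.standard_normal((40, 8)) for m in means])
y = np.repeat([0, 1, 2], 40)

lda = LinearDiscriminantAnalysis().fit(X, y)

# "Add-training": after a change in ambient air shifts all baselines, retrain
# on the original data plus a freshly labeled batch taken under new conditions.
X_drift = X + 0.2
lda_updated = LinearDiscriminantAnalysis().fit(
    np.vstack([X, X_drift]), np.concatenate([y, y]))
print(lda_updated.score(X_drift, y))          # recognition rate after drift
```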

  7. A Study on Predictive Analytics Application to Ship Machinery Maintenance

    DTIC Science & Technology

    2013-09-01

    Looking at the nature of the time series forecasting method, it would be better applied to offline analysis. The application for real-time online...other system attributes in future. Two techniques of statistical analysis, mainly time series models and cumulative sum control charts, are discussed in...statistical tool employed for the two techniques of statistical analysis. Both time series forecasting as well as CUSUM control charts are shown to be
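
    Of the two techniques named in this snippet, the cumulative sum (CUSUM) control chart is straightforward to sketch: it flags a sustained shift in the mean of a monitored machinery parameter. The data, target value, and thresholds below are invented for illustration and are not from the study.

```python
import numpy as np

def cusum(x, target, k, h):
    """Tabular CUSUM; returns the index of the first alarm (or -1)."""
    s_hi = s_lo = 0.0
    alarm = -1
    for i, xi in enumerate(x):
        s_hi = max(0.0, s_hi + (xi - target) - k)   # accumulates upward shifts
        s_lo = max(0.0, s_lo + (target - xi) - k)   # accumulates downward shifts
        if alarm < 0 and (s_hi > h or s_lo > h):
            alarm = i
    return alarm

rng = np.random.default_rng(1)
healthy = rng.normal(10.0, 0.5, 50)    # in-control readings of some parameter
shifted = rng.normal(11.0, 0.5, 50)    # sustained 2-sigma upward mean shift
alarm = cusum(np.concatenate([healthy, shifted]), target=10.0, k=0.25, h=4.0)
print(alarm)    # first sample index at which the chart signals
```

    Choosing the allowance k as half the shift one wants to detect and the decision limit h a few standard deviations wide is the usual trade-off between detection delay and false alarms.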

  8. Seamless Digital Environment – Data Analytics Use Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oxstrand, Johanna

    Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and design of, an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the analysis requested, and present the result to the user with minimal or no effort by the user. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases to develop improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that displays the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in a collaboration between Idaho National Laboratory, Arizona Public Service – Palo Verde Nuclear Generating Station, and NextAxiom Inc.

  9. Microorganisms in inorganic chemical analysis.

    PubMed

    Godlewska-Zyłkiewicz, Beata

    2006-01-01

    There are innumerable strains of microbes (bacteria, yeast and fungi) that degrade or transform chemicals and compounds into simpler, safer or less toxic substances. These bioprocesses have been used for centuries in the treatment of municipal wastes, in wine, cheese and bread making, and in bioleaching and metal recovery processes. Recent literature shows that microorganisms can also be used as effective sorbents for solid phase extraction procedures. This review reveals that fundamental nonanalytical studies on the parameters and conditions of biosorption processes and on metal-biomass interactions often result in efficient analytical procedures and biotechnological applications. Some selected examples illustrate the latest developments in the biosorption of metals by microbial biomass, which have opened the door to the application of microorganisms to analyte preconcentration, matrix separation and speciation analysis.

  10. Novel Fluorescent Sensors for the Detection of Organic Molecules in Extraterrestrial Samples

    NASA Astrophysics Data System (ADS)

    Adkin, Roy C.; Bruce, James I.; Pearson, Victoria K.

    2015-04-01

    Organic compounds in extraterrestrial samples have mostly been elucidated by destructive analytical techniques; therefore, information regarding spatial relationships between minerals and organic species is lost. Minerals form under specific chemical and physical conditions, so organic compounds associated with these minerals are likely to have formed under the same conditions. It is therefore possible to infer in which cosmological provinces their chemical evolution took place. We will describe progress towards developing fluorescent sensors that may provide this spatial discrimination. Lanthanide elements such as europium and terbium produce well-defined, line-like, high-intensity and long-lived fluorescent emissions. Interactions with organic molecules may alter the luminescent emission characteristics. The lanthanide atom needs to be rendered chemically inert but must remain susceptible to these organic molecule interactions; an organic ligand must be employed to attain this. DOTA (1,4,7,10-tetraazacyclododecanetetraacetic acid) was chosen as a plausible organic ligand because its structure, a tetra-substituted cyclen ring, and its ability to chelate are well characterized. It is also commercially available. Fluorescent lanthanide-DOTA complexes are used in many biological and analytical imaging applications, so it is logical to investigate their applicability to fluorimetric analysis of extraterrestrial organics. Lanthanide-DOTA complexes are very stable because the lanthanide metal atom is enveloped within the DOTA structure. Experimental procedures were designed to investigate lanthanide/analyte interactions and their effect upon fluorescent emissions. A range of compounds was chosen to give a good representation of the organics identified in extraterrestrial samples and of how they might interact with the lanthanide metal ion. 
A europium-DOTA baseline fluorescent spectrum was obtained and compared against europium-DOTA/analyte mixtures over a range of concentrations resembling those present in extraterrestrial samples. Upon collation and analysis of the results, a much reduced set of analytes was chosen for experimentation with terbium-DOTA. Results showed no change in fluorescent intensity or emission spectrum for any of the analytes at the concentrations found in extraterrestrial samples (μM to nM). This could be due to no interaction at any concentration of analyte, or to an intrinsic limit of detection. Experiments were then carried out at equimolar concentration with fewer analytes. It was found that there was an increase in fluorescent intensity for some analytes and a decrease for others (e.g. adenine and ornithine, respectively). There was no discernible trend in behaviour according to analyte structure or how the analytes might interact as a result. Attention has now turned to the tris-substituted cyclen ring, DO3A, which could afford improved scope for interaction. DOTA is an unsuitable ligand to use for the sensor: experimentation has shown that neither lanthanide-DOTA complex exhibited a change in fluorescent spectrum, so it is the ligand that requires modification, not the choice of lanthanide. We will present results from the development and preliminary testing of the DO3A sensor.

  11. Determination of Curcuminoids in Curcuma longa Linn. by UPLC/Q-TOF-MS: An Application in Turmeric Cultivation.

    PubMed

    Ashraf, Kamran; Mujeeb, Mohd; Ahmad, Altaf; Ahmad, Niyaz; Amir, Mohd

    2015-09-01

    Curcuma longa Linn. (Fam. Zingiberaceae) is a valued medicinal plant that contains curcuminoids (curcumin, demethoxycurcumin and bisdemethoxycurcumin) as its major bioactive constituents. Previously reported analytical methods for the analysis of curcuminoids suffer from low resolution, low sensitivity and long analysis times. In this study, a rapid, sensitive, selective, high-throughput ultra-high-performance liquid chromatography-tandem mass spectrometry (UPLC/Q-TOF-MS) method was developed and validated for the quantification of curcuminoids, with the aim of reducing analysis time and enhancing efficiency. UPLC/Q-TOF-MS analysis showed large variation (1.408-5.027% w/w) of curcuminoids among different samples with respect to the occurrence of the metabolites and their concentration. The results showed that samples from Erode (south province) contain the highest quantity of curcuminoids and were concluded to be the superior variety. The results obtained here could be valuable for devising strategies for cultivating this medicinal plant. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Development of an analytical scheme for simazine and 2,4-D in soil and water runoff from ornamental plant nursery plots.

    PubMed

    Sutherland, Devon J; Stearman, G Kim; Wells, Martha J M

    2003-01-01

    The transport and fate of pesticides applied to ornamental plant nursery crops are not well documented. Methodology for analysis of soil and water runoff samples concomitantly containing the herbicides simazine (1-chloro-4,6-bis(ethylamino)-s-triazine) and 2,4-D ((2,4-dichlorophenoxy)acetic acid) was developed in this research to investigate the potential for runoff and leaching from ornamental nursery plots. Solid-phase extraction was used prior to analysis by gas chromatography and liquid chromatography. Chromatographic results were compared with determination by enzyme-linked immunoassay analysis. The significant analytical contributions of this research include (1) the development of a scheme using chromatographic mode sequencing for the fractionation of simazine and 2,4-D, (2) optimization of the homogeneous derivatization of 2,4-D using the methylating agent boron trifluoride in methanol as an alternative to in situ generation of diazomethane, and (3) the practical application of these techniques to field samples.

  13. Frontiers of two-dimensional correlation spectroscopy. Part 2. Perturbation methods, fields of applications, and types of analytical probes

    NASA Astrophysics Data System (ADS)

    Noda, Isao

    2014-07-01

    Noteworthy experimental practices, which are advancing the frontiers of the field of two-dimensional (2D) correlation spectroscopy, are reviewed with a focus on the various perturbation methods currently practiced to induce spectral changes, pertinent examples of applications in various fields, and the types of analytical probes employed. The types of perturbation methods found in the published literature are very diverse, encompassing both dynamic and static effects. Although a sizable portion of publications report the use of dynamic perturbations, a much greater number of studies employ static effects, especially that of temperature. The fields of application covered by the literature are also very broad, ranging from fundamental research to practical applications in a number of physical, chemical and biological systems, such as synthetic polymers, composites and biomolecules. Aside from IR spectroscopy, which is the most commonly used tool, many other analytical probes are used in 2D correlation analysis. The ever-expanding trend in the depth, breadth and versatility of 2D correlation spectroscopy techniques and their broad applications all point to the robust and healthy state of the field.

  14. Study on quantitative analysis of Ti, Al and V in clinical soft tissues after placing the dental implants by laser ablation inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Sajnóg, Adam; Hanć, Anetta; Makuch, Krzysztof; Koczorowski, Ryszard; Barałkiewicz, Danuta

    2016-11-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) was used for in-situ quantitative analysis of the oral mucosa of patients before and after implantation with titanium implants and a closing screw based on the Ti6Al4V alloy. Two calibration strategies were applied, both based on matrix-matched solid standards with analyte addition. A novel approach was the application of powdered egg white proteins as a matrix material, which have a composition similar to the examined tissue. In the other approach, the certified reference material Bovine Muscle ERM-BB184 was used. The isotope 34S was found to be the most appropriate internal standard since it is homogeneously distributed in the examined tissues and resulted in lower relative standard deviation values for the signals of the analytes of interest. Other isotopes (13C, 26Mg, 43Ca) were also evaluated as potential internal standards. The analytical performance parameters, together with microwave digestion of the solid standards followed by solution nebulization ICP-MS analysis, proved that both calibration methods are fit for their intended purpose. LA-ICP-MS analysis of the tissue surfaces after the implantation process revealed an elevated content of these elements in comparison to the control group. The analytes are distributed inhomogeneously and display local maximal contents of up to ca. 900 μg g-1 for Ti, ca. 760 μg g-1 for Al and ca. 160 μg g-1 for V.
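
    The internal-standard idea used in this work (dividing each analyte signal by the 34S signal so that shot-to-shot variations in ablation yield cancel out) can be illustrated with a simulated line scan. All count rates and noise levels below are hypothetical, chosen only to show why the normalized ratio is more stable than the raw signal.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated LA-ICP-MS line scan: the ablation yield varies from shot to shot
# and scales the analyte (e.g. Ti) and internal-standard (34S) signals alike.
yield_factor = rng.uniform(0.7, 1.3, 100)     # hypothetical yield variation
ti_raw = 50_000.0 * yield_factor * rng.normal(1.0, 0.02, 100)
s_raw = 20_000.0 * yield_factor * rng.normal(1.0, 0.02, 100)

ratio = ti_raw / s_raw                        # internal-standard normalization

rsd_raw = ti_raw.std() / ti_raw.mean() * 100
rsd_norm = ratio.std() / ratio.mean() * 100
print(f"RSD raw: {rsd_raw:.1f}%  normalized: {rsd_norm:.1f}%")
```

    The common yield factor cancels in the ratio, leaving only the uncorrelated detector noise, which is why a homogeneously distributed internal standard lowers the relative standard deviation.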

  15. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2012-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  16. Capability of the Gas Analysis and Testing Laboratory at the NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Broerman, Craig; Jimenez, Javier; Sweterlitsch, Jeff

    2011-01-01

    The Gas Analysis and Testing Laboratory is an integral part of the testing performed at the NASA Johnson Space Center. The Gas Analysis and Testing Laboratory is a high performance laboratory providing real time analytical instruments to support manned and unmanned testing. The lab utilizes precision gas chromatographs, gas analyzers and spectrophotometers to support the technology development programs within the NASA community. The Gas Analysis and Testing Laboratory works with a wide variety of customers and provides engineering support for user-specified applications in compressed gas, chemical analysis, general and research laboratory.

  17. [Application of analytical transmission electron microscopy techniques for detection, identification and visualization of the localization of nanoparticles of titanium and cerium oxides in mammalian cells].

    PubMed

    Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V

    2014-01-01

    This work presents the results of a study on the applicability of modern analytical transmission electron microscopy methods for the detection, identification and visualization of the localization of nanoparticles of titanium and cerium oxides in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis of images of the nanoparticles in the cells obtained in the bright-field mode of transmission electron microscopy, under dark-field scanning transmission electron microscopy, and under high-angle annular dark-field scanning transmission electron microscopy was performed. For identification of nanoparticles in the cells, the analytical techniques energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy were compared, both in the mode of obtaining energy spectra from individual particles and in element mapping. It was shown that electron tomography is applicable for confirming that nanoparticles are localized within the sample and not merely coated by contamination. The possibilities and fields of application of the different analytical transmission electron microscopy techniques for detection, visualization and identification of nanoparticles in biological samples are discussed.

  18. Analysis of High-Throughput ELISA Microarray Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Daly, Don S.; Zangar, Richard C.

    Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).
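
    One of the standard data-processing steps in quantitative ELISA analysis is fitting a standard curve that maps signal back to concentration, for which the four-parameter logistic model is the usual choice. The sketch below fits synthetic standards with SciPy; the parameter values are invented for illustration, and this is not the ProMAT code.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic dose-response curve (EC50 = c)."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Hypothetical standard series: concentration in ng/mL, arbitrary signal units.
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0, 100.0])
rng = np.random.default_rng(4)
signal = four_pl(conc, 0.05, 1.2, 5.0, 2.0) + rng.normal(0, 0.01, conc.size)

# Bounded fit keeps all parameters positive, avoiding invalid powers.
popt, _ = curve_fit(four_pl, conc, signal,
                    p0=[0.1, 1.0, 10.0, 2.0], bounds=(1e-6, 1e6))
print(f"estimated EC50: {popt[2]:.2f} ng/mL")   # close to the true 5.0
```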

  19. Many-core graph analytics using accelerated sparse linear algebra routines

    NASA Astrophysics Data System (ADS)

    Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric

    2016-05-01

    Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without requiring the customer to make any changes to their analytics code, thanks to the compatibility with existing graph APIs.
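
    The central idea of the GraphBLAS model, expressing a vertex-centric traversal as sparse linear algebra, can be shown in a few lines with SciPy: breadth-first search becomes repeated sparse matrix-vector products with masking of already-visited vertices. This is a sketch of the concept on a toy graph, not the authors' many-core implementation.

```python
import numpy as np
from scipy.sparse import csr_matrix

# Adjacency matrix of a small directed graph: A[i, j] = 1 for an edge i -> j.
edges = [(0, 1), (0, 2), (1, 3), (2, 3), (3, 4)]
rows, cols = zip(*edges)
A = csr_matrix((np.ones(len(edges)), (rows, cols)), shape=(5, 5))

# BFS from vertex 0 as repeated sparse matrix-vector products.
frontier = np.zeros(5)
frontier[0] = 1.0
level = np.full(5, -1)
level[0] = 0
for depth in range(1, 5):
    frontier = A.T @ frontier            # next frontier = out-neighbors of current
    frontier[level >= 0] = 0.0           # mask out already-visited vertices
    if not frontier.any():
        break
    level[frontier > 0] = depth

print(level.tolist())                    # BFS level of each vertex from vertex 0
```

    The same SpMV-plus-mask pattern underlies most GraphBLAS algorithms, which is why a fast sparse kernel translates directly into fast graph traversal.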

  20. Exhaled breath condensate – from an analytical point of view

    PubMed Central

    Dodig, Slavica; Čepelak, Ivana

    2013-01-01

    Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of the applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. The limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous, e.g. low concentrations of EBC constituents, the lack of sensitivity of single-analyte methods, multi-analyte methods that have not been fully explored, and the absence of established reference values. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, both in diagnosis and in the longitudinal follow-up of patients, resulting in a better outcome of disease. PMID:24266297

  1. Investigation of hydrophobic substrates for solution residue analysis utilizing an ambient desorption liquid sampling-atmospheric pressure glow discharge microplasma.

    PubMed

    Paing, Htoo W; Marcus, R Kenneth

    2018-03-12

    A practical method for the preparation of solution residue samples for analysis utilizing the ambient desorption liquid sampling-atmospheric pressure glow discharge optical emission spectroscopy (AD-LS-APGD-OES) microplasma is described. Initial efforts involving placement of solution aliquots in wells drilled into copper substrates proved unsuccessful. A design-of-experiment (DOE) approach was carried out to determine the influential factors during sample deposition, including solution volume, solute concentration, number of droplets deposited, and the solution matrix. These various aspects are manifested in the mass of analyte deposited as well as the size and shape of the product residue. Statistical analysis demonstrated that only those initial attributes were significant factors towards the emission response of the analyte. Various approaches were investigated to better control the location and uniformity of the deposited sample. Three alternative substrates, a glass slide, a poly(tetrafluoro)ethylene (PTFE) sheet, and a polydimethylsiloxane (PDMS)-coated glass slide, were evaluated for microplasma analytical performance. Co-deposition with simple organic dyes provided an accurate means of determining the location of the analyte with only minor influence on the emission responses. The PDMS-coated glass provided the best performance by virtue of providing a uniform spatial distribution of the residue material. This uniformity yielded limits of detection improved by approximately 22× for 20 μL and 4× for 2 μL over the other two substrates. While they operate by fundamentally different processes, this choice of substrate is not restricted to the LS-APGD, but may also be applicable to other AD methods such as DESI, DART, or LIBS. Further developments will be directed towards a field-deployable ambient desorption OES source for quantitative analysis of microvolume solution residues of nuclear forensics importance.

  2. Comparative spectral analysis of veterinary powder product by continuous wavelet and derivative transforms

    NASA Astrophysics Data System (ADS)

    Dinç, Erdal; Kanbur, Murat; Baleanu, Dumitru

    2007-10-01

    Comparative simultaneous determination of chlortetracycline and benzocaine in a commercial veterinary powder product was carried out by continuous wavelet transform (CWT) and classical derivative transform (classical derivative spectrophotometry). In this quantitative spectral analysis, the two proposed analytical methods do not require any chemical separation process. In the first step, several wavelet families were tested to find an optimal CWT for processing the overlapping signals of the analyzed compounds. We observed that the coiflets (COIF-CWT) method with dilation parameter a = 400 gives suitable results for this analytical application. For comparison, the classical derivative spectrophotometry (CDS) approach was also applied to the simultaneous quantitative resolution of the same analytical problem. Calibration functions were obtained by measuring the transform amplitudes corresponding to zero-crossing points for both the CWT and CDS methods. The utility of these two analytical approaches was verified by analyzing various synthetic mixtures of chlortetracycline and benzocaine, and they were applied to real samples of the veterinary powder formulation. The experimental results obtained from the COIF-CWT approach were statistically compared with those obtained by classical derivative spectrophotometry, and successful results were reported.
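
    The zero-crossing calibration used by both methods rests on a simple property of derivative spectra: at a wavelength where one component's derivative crosses zero, the derivative of a mixture depends only on the other component. A numerical sketch with two hypothetical Gaussian bands (positions, widths, and concentrations are invented for illustration, not data from the study):

```python
import numpy as np

wl = np.linspace(400.0, 500.0, 1001)          # wavelength grid, nm (0.1 nm step)

def band(c, mu, w):
    """Gaussian absorption band of height c centred at mu."""
    return c * np.exp(-((wl - mu) / w) ** 2)

# Hypothetical overlapping bands for two analytes A (440 nm) and B (455 nm).
spec_b = band(1.0, 455.0, 15.0)
d_b = np.gradient(spec_b, wl)
i0 = np.argmin(np.abs(wl - 440.0))            # zero-crossing of A's derivative

# At 440 nm the first derivative of any A+B mixture is blind to A, so the
# amplitude there calibrates B alone.
ratios = []
for cb in (0.2, 0.5, 1.0):
    mix = band(0.8, 440.0, 15.0) + band(cb, 455.0, 15.0)
    ratios.append(np.gradient(mix, wl)[i0] / d_b[i0])
print([round(r, 3) for r in ratios])          # recovers each B concentration
```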

  3. A new approach for downscaling of electromembrane extraction as a lab on-a-chip device followed by sensitive Red-Green-Blue detection.

    PubMed

    Baharfar, Mahroo; Yamini, Yadollah; Seidi, Shahram; Arain, Muhammad Balal

    2018-05-30

    A new design of electromembrane extraction (EME) as a lab-on-a-chip device was proposed for the extraction and determination of phenazopyridine as the model analyte. The extraction procedure was accomplished by coupling EME with a packed sorbent. The analyte was extracted under the applied electrical field across a membrane sheet impregnated with nitrophenyl octyl ether (NPOE) into an acceptor phase, followed by absorption of the analyte on a strong cation exchanger as the sorbent. The designed chip contained separate spiral channels for the donor and acceptor phases, featuring embedded platinum electrodes to enhance extraction efficiency. The selected donor and acceptor phases were 0 mM HCl and 100 mM HCl, respectively. The on-chip electromembrane extraction was carried out at a voltage of 70 V for 50 min. The analysis was carried out in two modes: a simple Red-Green-Blue (RGB) image analysis and a conventional HPLC-UV system. After absorption of the analyte on the solid phase, its color changed, and a digital picture of the sorbent was taken for the RGB analysis. The parameters affecting the performance of the chip device, comprising the EME and solid-phase microextraction steps, were identified and optimized. The accumulation of the analyte on the solid phase showed excellent sensitivity, and a limit of detection (LOD) lower than 1.0 μg L-1 was achieved by image analysis using a smartphone. The device also offered acceptable intra- and inter-assay RSDs (<10%). The calibration curves were linear within the ranges of 10-1000 μg L-1 and 30-1000 μg L-1 (r2 > 0.9969) for HPLC-UV and RGB analysis, respectively. To investigate the applicability of the method in complicated matrices, urine samples of patients being treated with phenazopyridine were analyzed.
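
    The RGB detection step amounts to extracting a mean channel intensity from a picture of the sorbent and regressing it against concentration. A minimal sketch on synthetic images follows; the linear response model, channel choice, and noise are assumptions for illustration, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def mean_red(img):
    """Average red-channel intensity of an 8-bit-style RGB image array."""
    return img[..., 0].mean()

# Hypothetical sorbent pictures: the red channel fades linearly with the
# amount of colored analyte accumulated (concentration in ug/L).
conc = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
responses = []
for c in conc:
    img = np.clip(200.0 - 0.1 * c + rng.normal(0.0, 1.0, (64, 64, 3)), 0, 255)
    responses.append(200.0 - mean_red(img))   # blank-corrected response

slope, intercept = np.polyfit(conc, responses, 1)
print(f"calibration slope: {slope:.4f}")      # ~0.1 by construction
```

    Averaging over all pixels suppresses the per-pixel camera noise, which is one reason a plain smartphone picture can reach low limits of detection.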

  4. Focused analyte spray emission apparatus and process for mass spectrometric analysis

    DOEpatents

    Roach, Patrick J [Kennewick, WA; Laskin, Julia [Richland, WA; Laskin, Alexander [Richland, WA

    2012-01-17

    An apparatus and process are disclosed that deliver an analyte deposited on a substrate to a mass spectrometer that provides for trace analysis of complex organic analytes. Analytes are probed using a small droplet of solvent that is formed at the junction between two capillaries. A supply capillary maintains the droplet of solvent on the substrate; a collection capillary collects analyte desorbed from the surface and emits analyte ions as a focused spray to the inlet of a mass spectrometer for analysis. The invention enables efficient separation of desorption and ionization events, providing enhanced control over transport and ionization of the analyte.

  5. Paper-based analytical devices for environmental analysis.

    PubMed

    Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S

    2016-03-21

    The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; the theme of ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.

  6. A comprehensive analytical model of rotorcraft aerodynamics and dynamics. Part 3: Program manual

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1980-01-01

    The computer program for a comprehensive analytical model of rotorcraft aerodynamics and dynamics is described. This analysis is designed to calculate rotor performance, loads, and noise; the helicopter vibration and gust response; the flight dynamics and handling qualities; and the system aeroelastic stability. The analysis is a combination of structural, inertial, and aerodynamic models that is applicable to a wide range of problems and a wide class of vehicles. The analysis is intended for use in the design, testing, and evaluation of rotors and rotorcraft and to be a basis for further development of rotary wing theories.

  7. A capillary electrophoresis chip for the analysis of print and film photographic developing agents in commercial processing solutions using indirect fluorescence detection.

    PubMed

    Sirichai, S; de Mello, A J

    2001-01-01

    The separation and detection of both print and film developing agents (CD-3 and CD-4) in photographic processing solutions using chip-based capillary electrophoresis is presented. For simultaneous detection of both analytes under identical experimental conditions a buffer pH of 11.9 is used to partially ionise the analytes. Detection is made possible by indirect fluorescence, where the ions of the analytes displace the anionic fluorescing buffer ion to create negative peaks. Under optimal conditions, both analytes can be analyzed within 30 s. The limits of detection for CD-3 and CD-4 are 0.17 mM and 0.39 mM, respectively. The applicability of the method for the analysis of seasoned photographic processing developer solutions is also examined.

  8. Barriers to Achieving Economies of Scale in Analysis of EHR Data. A Cautionary Tale.

    PubMed

    Sendak, Mark P; Balu, Suresh; Schulman, Kevin A

    2017-08-09

    Signed in 2009, the Health Information Technology for Economic and Clinical Health Act infused $28 billion of federal funds to accelerate adoption of electronic health records (EHRs). Yet, EHRs have produced mixed results and have even raised concern that the current technology ecosystem stifles innovation. We describe the development process and report initial outcomes of a chronic kidney disease analytics application that identifies high-risk patients for nephrology referral. The cost to validate and integrate the analytics application into clinical workflow was $217,138. Despite the success of the program, redundant development and validation efforts will require $38.8 million to scale the application across all multihospital systems in the nation. We address the shortcomings of current technology investments and distill insights from the technology industry. To yield a return on technology investments, we propose policy changes that address the underlying issues now being imposed on the system by an ineffective technology business model.

  9. Maxwell-Wagner Effect Applied to Microwave-Induced Self-Ignition: A Novel Approach for Carbon-Based Materials.

    PubMed

    Bizzi, Cezar A; Cruz, Sandra M; Schmidt, Lucas; Burrow, Robert A; Barin, Juliano S; Paniz, Jose N G; Flores, Erico M M

    2018-04-03

    A new method for analytical applications based on the Maxwell-Wagner effect is proposed. Considering the interaction of carbonaceous materials with an electromagnetic field in the microwave frequency range, very fast heating is observed due to interfacial polarization that results in localized microplasma formation. This effect was evaluated in this work using a monomode microwave system, and temperature was recorded using an infrared camera. For analytical applications, a closed reactor under oxygen pressure was evaluated. The combination of high temperature and an oxidant atmosphere resulted in a very effective self-ignition reaction of the sample, allowing its use as a sample preparation procedure for further elemental analysis. After optimization, a high sample mass (up to 600 mg of coal and graphite) was efficiently digested using only 4 mol L⁻¹ HNO₃ as the absorbing solution. Several elements (Ba, Ca, Fe, K, Li, Mg, Na, and Zn) were determined by inductively coupled plasma optical emission spectrometry (ICP-OES). Accuracy was evaluated by using a certified reference material (NIST 1632b). Blanks were negligible, and only a diluted solution was required for analyte absorption, preventing residue generation and bringing the proposed method into agreement with green chemistry recommendations. The feasibility of the proposed method for hard-to-digest materials, the minimization of reagent consumption, and the possibility of multielemental analysis with lower blanks and better limits of detection can be considered the main advantages of this method.

  10. Theory for polymer analysis using nanopore-based single-molecule mass spectrometry

    PubMed Central

    Reiner, Joseph E.; Kasianowicz, John J.; Nablo, Brian J.; Robertson, Joseph W. F.

    2010-01-01

    Nanometer-scale pores have demonstrated potential for the electrical detection, quantification, and characterization of molecules for biomedical applications and the chemical analysis of polymers. Despite extensive research in the nanopore sensing field, there is a paucity of theoretical models that incorporate the interactions between chemicals (i.e., solute, solvent, analyte, and nanopore). Here, we develop a model that simultaneously describes both the current blockade depth and residence times caused by individual poly(ethylene glycol) (PEG) molecules in a single α-hemolysin ion channel. Modeling polymer-cation binding leads to a description of two significant effects: a reduction in the mobile cation concentration inside the pore and an increase in the affinity between the polymer and the pore. The model was used to estimate the free energy of formation for K+-PEG inside the nanopore (≈-49.7 meV) and the free energy of PEG partitioning into the nanopore (≈0.76 meV per ethylene glycol monomer). The results suggest that rational, physical models for the analysis of analyte-nanopore interactions will help realize the full potential of nanopore-based sensing for chemical and biological applications. PMID:20566890
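The per-monomer partitioning penalty quoted above lends itself to a quick back-of-the-envelope check. In the sketch below, only the 0.76 meV figure comes from the abstract; the thermal energy and chain length are illustrative assumptions:

```python
import numpy as np

# Back-of-the-envelope use of the abstract's partitioning free energy: a
# Boltzmann weight for an n-mer PEG entering the pore. Only the 0.76 meV
# per-monomer figure comes from the abstract; kT and the chain length are
# assumptions for illustration.
kT_meV = 25.7            # thermal energy at room temperature, meV
dG_monomer = 0.76        # partitioning penalty per ethylene glycol monomer, meV

def partition_weight(n_monomers):
    return np.exp(-n_monomers * dG_monomer / kT_meV)

w28 = partition_weight(28)   # a ~1250 g/mol PEG chain; weight ≈ 0.44
```

The exponential dependence on chain length is what makes the blockade statistics mass-sensitive in this kind of single-molecule "mass spectrometry".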

  11. Combination of Cyclodextrin and Ionic Liquid in Analytical Chemistry: Current and Future Perspectives.

    PubMed

    Hui, Boon Yih; Raoov, Muggundha; Zain, Nur Nadhirah Mohamad; Mohamad, Sharifah; Osman, Hasnah

    2017-09-03

    The growing popularity of cyclodextrins (CDs) and ionic liquids (ILs) as promising materials in analytical chemistry has resulted in an exponential increase in their exploitation and production in the field. CDs belong to the family of cyclic oligosaccharides composed of α-(1,4)-linked glucopyranose subunits and possess a cage-like supramolecular structure. This structure enables chemical reactions to proceed between interacting ions, radicals, or molecules in the absence of covalent bonds. Conversely, ILs are ionic fluids comprising only cations and anions, often with negligible vapor pressure, making them green or designer solvents. The cooperative effect between CDs and ILs, owing to their fascinating properties, has contributed to important developments in analytical chemistry. This comprehensive review gives an overview of recent studies and analytical trends in the application of CDs combined with ILs, which have shown beneficial and remarkable effects in analytical chemistry, including their use in various sample preparation techniques such as solid phase extraction, magnetic solid phase extraction, cloud point extraction, and microextraction, and in separation techniques including gas chromatography, high-performance liquid chromatography, and capillary electrophoresis, as well as applications of electrochemical sensors as electrode modifiers, with references to recent applications. This review highlights the nature of the interactions and the synergic effects between CDs, ILs, and analytes. It is hoped that this review will stimulate further research in analytical chemistry.

  12. Preliminary Evaluation of an Aviation Safety Thesaurus' Utility for Enhancing Automated Processing of Incident Reports

    NASA Technical Reports Server (NTRS)

    Barrientos, Francesca; Castle, Joseph; McIntosh, Dawn; Srivastava, Ashok

    2007-01-01

    This document presents a preliminary evaluation of the utility of the FAA Safety Analytics Thesaurus (SAT) in enhancing automated document processing applications under development at NASA Ames Research Center (ARC). Current development efforts at ARC are described, including overviews of the statistical machine learning techniques that have been investigated. An analysis of opportunities for applying thesaurus knowledge to improve algorithm performance is then presented.

  13. Critical evaluation of sample pretreatment techniques.

    PubMed

    Hyötyläinen, Tuulia

    2009-06-01

    Sample preparation before chromatographic separation is the most time-consuming and error-prone part of the analytical procedure. Therefore, selecting and optimizing an appropriate sample preparation scheme is a key factor in the final success of the analysis, and the judicious choice of an appropriate procedure greatly influences the reliability and accuracy of a given analysis. The main objective of this review is to critically evaluate the applicability, disadvantages, and advantages of various sample preparation techniques. Particular emphasis is placed on extraction techniques suitable for both liquid and solid samples.

  14. Technology Assessment and Policy Analysis

    ERIC Educational Resources Information Center

    Majone, Giandomenico

    1977-01-01

    Argues that the application of policy analysis to technology assessment requires the abandonment of stereotyped approaches and a reformulation of analytical paradigms to include consideration of institutional constraints. Available from: Elsevier Scientific Publishing Company, Box 211, Amsterdam, the Netherlands, single copies available.…

  15. 10 CFR 431.173 - Requirements applicable to all manufacturers.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... COMMERCIAL AND INDUSTRIAL EQUIPMENT Provisions for Commercial Heating, Ventilating, Air-Conditioning and... is based on engineering or statistical analysis, computer simulation or modeling, or other analytic... method or methods used; (B) The mathematical model, the engineering or statistical analysis, computer...

  16. NASTRAN applications to aircraft propulsion systems

    NASA Technical Reports Server (NTRS)

    White, J. L.; Beste, D. L.

    1975-01-01

    The use of NASTRAN in propulsion system structural integration analysis is described. Computer support programs for modeling, substructuring, and plotting analysis results are discussed. Requirements on interface information and data exchange by participants in a NASTRAN substructure analysis are given. Static and normal modes vibration analysis results are given with comparison to test and other analytical results.

  17. Application of Non-Deterministic Methods to Assess Modeling Uncertainties for Reinforced Carbon-Carbon Debris Impacts

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Fasanella, Edwin L.; Melis, Matthew; Carney, Kelly; Gabrys, Jonathan

    2004-01-01

    The Space Shuttle Columbia Accident Investigation Board (CAIB) made several recommendations for improving the NASA Space Shuttle Program. An extensive experimental and analytical program has been developed to address two recommendations related to structural impact analysis. The objective of the present work is to demonstrate the application of probabilistic analysis to assess the effect of uncertainties on debris impacts on Space Shuttle Reinforced Carbon-Carbon (RCC) panels. The probabilistic analysis is used to identify the material modeling parameters controlling the uncertainty. A comparison of the finite element results with limited experimental data provided confidence that the simulations were adequately representing the global response of the material. Five input parameters were identified as significantly controlling the response.

  18. Fast liquid chromatography combined with mass spectrometry for the analysis of metabolites and proteins in human body fluids.

    PubMed

    Kortz, Linda; Helmschrodt, Christin; Ceglarek, Uta

    2011-03-01

    In the last decade various analytical strategies have been established to enhance separation speed and efficiency in high performance liquid chromatography applications. Chromatographic supports based on monolithic material, small porous particles, and porous layer beads have been developed and commercialized to improve throughput and separation efficiency. This paper provides an overview of current developments in fast chromatography combined with mass spectrometry for the analysis of metabolites and proteins in clinical applications. Advances and limitations of fast chromatography for the combination with mass spectrometry are discussed. Practical aspects of, recent developments in, and the present status of high-throughput analysis of human body fluids for therapeutic drug monitoring, toxicology, clinical metabolomics, and proteomics are presented.

  19. Parallel Aircraft Trajectory Optimization with Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Falck, Robert D.; Gray, Justin S.; Naylor, Bret

    2016-01-01

    Trajectory optimization is an integral component of aerospace vehicle design, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single- and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the nonlinear analysis evaluations and the derivative computations themselves. The constraint aggregation results revealed a significant numerical challenge due to the difficulty of achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.

  20. Numerical modelling and experimental analysis of acoustic emission

    NASA Astrophysics Data System (ADS)

    Gerasimov, S. I.; Sych, T. V.

    2018-05-01

    In the present paper, the authors report on the application of non-destructive acoustic waves technologies to determine the structural integrity of engineering components. In particular, a finite element (FE) system COSMOS/M is used to investigate propagation characteristics of ultrasonic waves in linear, plane and three-dimensional structures without and with geometric concentrators. In addition, the FE results obtained are compared to the analytical and experimental ones. The study illustrates the efficient use of the FE method to model guided wave propagation problems and demonstrates the FE method’s potential to solve problems when an analytical solution is not possible due to “complicated” geometry.

  1. On fatigue crack growth under random loading

    NASA Astrophysics Data System (ADS)

    Zhu, W. Q.; Lin, Y. K.; Lei, Y.

    1992-09-01

    A probabilistic analysis of the fatigue crack growth, fatigue life and reliability of a structural or mechanical component is presented on the basis of fracture mechanics and theory of random processes. The material resistance to fatigue crack growth and the time-history of the stress are assumed to be random. Analytical expressions are obtained for the special case in which the random stress is a stationary narrow-band Gaussian random process, and a randomized Paris-Erdogan law is applicable. As an example, the analytical method is applied to a plate with a central crack, and the results are compared with those obtained from digital Monte Carlo simulations.
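The randomized Paris-Erdogan growth law under narrow-band Gaussian loading can be sketched numerically. In the hedged example below, the material constants, geometry factor, and loading parameters are all illustrative assumptions, not values from the paper; the Rayleigh distribution of stress peaks follows from the stationary narrow-band Gaussian assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo sketch of Paris-Erdogan growth, da/dN = C * (dK)^m, under a
# stationary narrow-band Gaussian stress whose peaks are Rayleigh-distributed.
# Every numeric value here is an illustrative assumption, not from the paper.
C, m = 1e-11, 3.0          # Paris-Erdogan constants (MPa*sqrt(m) units)
Y = 1.0                    # geometry factor, central crack in a wide plate
a0, a_crit = 1e-3, 1e-2    # initial and critical half crack lengths, m
sigma_rms = 60.0           # RMS of the stress process, MPa

def fatigue_life(n_max=10_000_000, block=1000):
    """Cycle-by-cycle integration until the crack reaches a_crit."""
    a, n = a0, 0
    while a < a_crit and n < n_max:
        amp = rng.rayleigh(scale=sigma_rms, size=block)   # peak amplitudes
        dK = Y * 2.0 * amp * np.sqrt(np.pi * a)           # stress range = 2*amplitude
        a += np.sum(C * dK**m)
        n += block
    return a, n

a_end, n_life = fatigue_life()
```

A single run yields one sample of the random fatigue life; repeating with many seeds gives the life distribution that the analytical expressions and Monte Carlo simulations in the paper are compared against.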

  2. A simple formula for the effective complex conductivity of periodic fibrous composites with interfacial impedance and applications to biological tissues

    NASA Astrophysics Data System (ADS)

    Bisegna, Paolo; Caselli, Federica

    2008-06-01

    This paper presents a simple analytical expression for the effective complex conductivity of a periodic hexagonal arrangement of conductive circular cylinders embedded in a conductive matrix, with interfaces exhibiting a capacitive impedance. This composite material may be regarded as an idealized model of a biological tissue comprising tubular cells, such as skeletal muscle. The asymptotic homogenization method is adopted, and the corresponding local problem is solved by resorting to Weierstrass elliptic functions. The effectiveness of the present analytical result is proved by convergence analysis and comparison with finite-element solutions and existing models.
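For intuition about the frequency dependence such a composite exhibits, a crude two-dimensional Maxwell-Garnett estimate with a capacitive interface can be sketched. This is emphatically not the paper's Weierstrass-function solution, and every parameter value is an illustrative assumption:

```python
import numpy as np

# Rough 2D Maxwell-Garnett estimate for conductive cylinders with a capacitive
# interface, showing the qualitative dispersion such tissues exhibit. Not the
# paper's homogenization result; all parameters are illustrative assumptions.
sigma_m = 0.5       # matrix (extracellular) conductivity, S/m
sigma_i = 1.0       # cylinder (intracellular) conductivity, S/m
c_m = 1e-2          # interfacial (membrane) capacitance, F/m^2
a = 20e-6           # cylinder radius, m
f = 0.4             # area fraction of cylinders

def sigma_eff(omega):
    z = 1j * omega * c_m * a                  # interface in series with interior
    s_in = sigma_i * z / (sigma_i + z)
    # 2D Maxwell-Garnett mixing rule for circular cross sections
    return sigma_m * (((1 + f) * s_in + (1 - f) * sigma_m)
                      / ((1 - f) * s_in + (1 + f) * sigma_m))

low = sigma_eff(2 * np.pi * 1e2)    # membranes block current: |low| < sigma_m
high = sigma_eff(2 * np.pi * 1e8)   # membranes shorted: |high| > sigma_m
```

The low-to-high frequency transition (beta dispersion in tissue terms) is exactly the behavior the interfacial impedance introduces into the effective complex conductivity.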

  3. An Analytic Form for the Interresponse Time Analysis of Shull, Gaynor, and Grimes with Applications and Extensions

    ERIC Educational Resources Information Center

    Kessel, Robert; Lucke, Robert L.

    2008-01-01

    Shull, Gaynor and Grimes advanced a model for interresponse time distribution using probabilistic cycling between a higher-rate and a lower-rate response process. Both response processes are assumed to be random in time with a constant rate. The cycling between the two processes is assumed to have a constant transition probability that is…

  4. Identification of Fatty Acids, Phospholipids, and Their Oxidation Products Using Matrix-Assisted Laser Desorption Ionization Mass Spectrometry and Electrospray Ionization Mass Spectrometry

    ERIC Educational Resources Information Center

    Harmon, Christopher W.; Mang, Stephen A.; Greaves, John; Finlayson-Pitts, Barbara J.

    2010-01-01

    Electrospray ionization mass spectrometry (ESI-MS) and matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS) have found increasing application in the analysis of biological samples. Using these techniques to solve problems in analytical chemistry should be an essential component of the training of undergraduate chemists. We…

  5. Development, optimization, validation and application of faster gas chromatography - flame ionization detector method for the analysis of total petroleum hydrocarbons in contaminated soils.

    PubMed

    Zubair, Abdulrazaq; Pappoe, Michael; James, Lesley A; Hawboldt, Kelly

    2015-12-18

    This paper presents an important new approach to improving the timeliness of Total Petroleum Hydrocarbon (TPH) analysis in soil by Gas Chromatography - Flame Ionization Detector (GC-FID) using the CCME Canada-Wide Standard reference method. The Canada-Wide Standard (CWS) method is used for the analysis of petroleum hydrocarbon compounds across Canada. However, inter-laboratory application of this method for the analysis of TPH in soil has often shown considerable variability in the results. This could be due, in part, to the different gas chromatography (GC) conditions, other steps involved in the method, and the soil properties. In addition, there are differences in the interpretation of the GC results, which affects the determination of the effectiveness of remediation at hydrocarbon-contaminated sites. In this work, a multivariate experimental design approach was used to develop and validate the analytical method for a faster quantitative analysis of TPH in (contaminated) soil. A fractional factorial design (fFD) was used to screen six factors and identify those most significantly impacting the analysis: injection volume (μL), injection temperature (°C), oven program (°C/min), detector temperature (°C), carrier gas flow rate (mL/min), and solvent ratio (v/v hexane/dichloromethane). The most important factors (carrier gas flow rate and oven program) were then optimized using a central composite response surface design. Robustness testing and validation of the model compared favourably with the experimental results, with a percentage difference of 2.78% for the analysis time. This research successfully reduced the method's standard analytical time from 20 to 8 min with all the carbon fractions eluting. The method was successfully applied for fast TPH analysis of Bunker C oil-contaminated soil. A reduced analytical time offers many benefits, including improved laboratory reporting times and overall improved clean-up efficiency. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
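The screening step described above can be illustrated in miniature. The 2^(6-3) design, the synthetic response model, and the effect calculation below are illustrative stand-ins, not the paper's data; only the factor names follow the abstract:

```python
import itertools
import numpy as np

# Illustrative 2^(6-3) fractional factorial screen of the six GC-FID factors
# named in the abstract. The response model is synthetic (an assumption), so
# this demonstrates the screening arithmetic, not the paper's results.
factors = ["inj_vol", "inj_temp", "oven_ramp", "det_temp", "flow", "solvent"]

# Full 2^3 base design in the first three factors; generators D = AB, E = AC, F = BC
base = np.array(list(itertools.product([-1, 1], repeat=3)))
design = np.column_stack([
    base,
    base[:, 0] * base[:, 1],   # det_temp = A*B
    base[:, 0] * base[:, 2],   # flow     = A*C
    base[:, 1] * base[:, 2],   # solvent  = B*C
])

# Hypothetical analysis-time response (min), dominated by flow rate and oven ramp
def analysis_time(run):
    return 14.0 - 3.0 * run[4] - 2.5 * run[2] + 0.2 * run[0]

y = np.array([analysis_time(r) for r in design])

# Main-effect estimate: contrast divided by half the number of runs
effects = {name: float(design[:, i] @ y) / (len(y) / 2)
           for i, name in enumerate(factors)}
top_two = sorted(effects, key=lambda k: abs(effects[k]), reverse=True)[:2]
```

With this synthetic response, the two largest effects flag carrier gas flow rate and oven program, which would then be carried forward to the central composite response surface stage.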

  6. MALDI matrices for low molecular weight compounds: an endless story?

    PubMed

    Calvano, Cosima Damiana; Monopoli, Antonio; Cataldi, Tommaso R I; Palmisano, Francesco

    2018-04-23

    Since its introduction in the 1980s, matrix-assisted laser desorption/ionization mass spectrometry (MALDI MS) has gained a prominent role in the analysis of high molecular weight biomolecules such as proteins, peptides, oligonucleotides, and polysaccharides. Its application to low molecular weight compounds has long remained challenging due to the spectral interferences produced by conventional organic matrices in the low m/z window. To overcome this problem, specific sample preparations, such as analyte/matrix derivatization, addition of dopants, or sophisticated deposition techniques especially useful for imaging experiments, have been proposed. Alternative approaches based on second-generation (rationally designed) organic matrices, ionic liquids, and inorganic matrices, including metallic nanoparticles, have been the object of intense and continuous research efforts. Definite evidence is now provided that MALDI MS represents a powerful and invaluable analytical tool also for small molecules, including their quantification, thus opening new, exciting applications in metabolomics and imaging mass spectrometry. This review is intended to offer a concise critical overview of the most recent achievements concerning MALDI matrices capable of specifically addressing the challenging issue of small-molecule analysis. Graphical abstract: An ideal Book of matrices for MALDI MS of small molecules.

  7. Fusion Analytics: A Data Integration System for Public Health and Medical Disaster Response Decision Support

    PubMed Central

    Passman, Dina B.

    2013-01-01

    Objective: The objective of this demonstration is to show conference attendees how they can integrate, analyze, and visualize diverse data types from across a variety of systems by leveraging an off-the-shelf enterprise business intelligence (EBI) solution to support decision-making in disasters. Introduction: Fusion Analytics is the data integration system developed by the Fusion Cell at the U.S. Department of Health and Human Services (HHS), Office of the Assistant Secretary for Preparedness and Response (ASPR). Fusion Analytics meaningfully augments traditional public and population health surveillance reporting by providing web-based data analysis and visualization tools. Methods: Fusion Analytics serves as a one-stop shop for web-based data visualizations of multiple real-time data sources within ASPR. Its 24-7 web availability makes it an ideal analytic tool for situational awareness and response, allowing stakeholders to access the portal from any internet-enabled device without installing any software. The Fusion Analytics data integration system was built using off-the-shelf EBI software. Fusion Analytics leverages the full power of statistical analysis software and delivers reports to users in a secure web-based environment. Fusion Analytics provides an example of how public health staff can develop and deploy a robust public health informatics solution using an off-the-shelf product and with limited development funding. It also provides the unique example of a public health information system that combines patient data for traditional disease surveillance with manpower and resource data to provide overall decision support for federal public health and medical disaster response operations. Conclusions: We are currently in a unique position within public health. On the one hand, we have been gaining greater and greater access to electronic data of all kinds over the last few years. On the other, we are working in a time of reduced government spending to support leveraging these data for decision support with robust analytics and visualizations. Fusion Analytics provides an opportunity for attendees to see how various types of data are integrated into a single application for population health decision support. It can also provide them with ideas of how they can use their own staff to create analyses and reports that support their public health activities.

  8. Hadoop for High-Performance Climate Analytics: Use Cases and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Tamkin, Glenn

    2013-01-01

    Scientific data services are a critical aspect of the NASA Center for Climate Simulation (NCCS) mission. Hadoop, via MapReduce, provides an approach to high-performance analytics that is proving to be useful for data-intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. The NCCS is particularly interested in the potential of Hadoop to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we prototyped a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. The initial focus was on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. After preliminary results suggested that this approach improves efficiencies within data-intensive analytic workflows, we invested in building a cyberinfrastructure resource for developing a new generation of climate data analysis capabilities using Hadoop. This resource is focused on reducing the time spent in the preparation of reanalysis data used in data-model intercomparison, a long-sought goal of the climate community. This paper summarizes the related use cases and lessons learned.
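The canonical averaging operation can be sketched in miniature as a map/reduce pair that emits partial (sum, count) pairs per grid cell. The record layout and region filter below are illustrative assumptions, not the MERRA schema:

```python
from collections import defaultdict

# Miniature map/reduce sketch of the canonical averaging operation: temporal
# averaging of a variable over a spatial region, keyed by grid cell. The
# (time, lat, lon, value) record layout stands in for MERRA fields; a real
# Hadoop job would shard the records across HDFS blocks.

def map_phase(records, region):
    lat0, lat1, lon0, lon1 = region
    for _t, lat, lon, value in records:
        if lat0 <= lat <= lat1 and lon0 <= lon <= lon1:
            yield (lat, lon), (value, 1)        # emit partial (sum, count)

def reduce_phase(pairs):
    acc = defaultdict(lambda: [0.0, 0])
    for key, (s, c) in pairs:
        acc[key][0] += s
        acc[key][1] += c
    return {key: s / c for key, (s, c) in acc.items()}

records = [(0, 10, 20, 1.0), (1, 10, 20, 3.0), (0, 50, 20, 9.0)]
means = reduce_phase(map_phase(records, region=(0, 30, 0, 40)))
# means == {(10, 20): 2.0}
```

Emitting (sum, count) rather than raw values is what lets a combiner pre-aggregate on each node before the shuffle, which is the main source of the speedups the abstract refers to.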

  9. Analysis of effect of flameholder characteristics on lean, premixed, partially vaporized fuel-air mixtures quality and nitrogen oxides emissions

    NASA Technical Reports Server (NTRS)

    Cooper, L. P.

    1981-01-01

    An analysis was conducted of the effect of flameholding devices on the precombustion fuel-air characteristics and on oxides of nitrogen (NOx) emissions for combustion of premixed partially vaporized mixtures. The analysis includes the interrelationships of flameholder droplet collection efficiency, reatomization efficiency and blockage, and the initial droplet size distribution and accounts for the contribution of droplet combustion in partially vaporized mixtures to NOx emissions. Application of the analytical procedures is illustrated and parametric predictions of NOx emissions are presented.

  10. Trends in analytical methodologies for the determination of alkylphenols and bisphenol A in water samples.

    PubMed

    Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D

    2017-04-15

    In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their extensive use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs) which can affect the hormonal system of humans and wildlife, even at low concentrations. Because these pollutants enter the environment through water, which is the most affected compartment, analytical methods that allow their determination in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is presented (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and to approaches proposed to reduce time and reagent consumption according to Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, with wastewater and surface water being the most investigated. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Heavy hydrocarbon main injector technology program

    NASA Technical Reports Server (NTRS)

    Arbit, H. A.; Tuegel, L. M.; Dodd, F. E.

    1991-01-01

    The Heavy Hydrocarbon Main Injector Program was an analytical, design, and test program to demonstrate an injection concept applicable to an Isolated Combustion Compartment of a full-scale, high-pressure LOX/RP-1 engine. Several injector patterns were tested in a 3.4-in. combustor. Based on these results, features of the most promising injector design were incorporated into a 5.7-in. injector which was then hot-fire tested. In turn, a preliminary design of a 5-compartment 2D combustor was based on this pattern. Additional subscale injector testing and analysis were also performed with an emphasis on improving analytical techniques and acoustic cavity design methodology. Several of the existing 3.5-in. diameter injectors were hot-fire tested with and without acoustic cavities to assess spontaneous and dynamic stability characteristics.

  12. A surrogate analyte method to determine D-serine in mouse brain using liquid chromatography-tandem mass spectrometry.

    PubMed

    Kinoshita, Kohnosuke; Jingu, Shigeji; Yamaguchi, Jun-ichi

    2013-01-15

    A bioanalytical method for determining endogenous D-serine levels in the mouse brain using a surrogate analyte and liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed. [2,3,3-²H]D-serine and [¹⁵N]D-serine were used as the surrogate analyte and the internal standard, respectively. The surrogate analyte was spiked into brain homogenate to yield calibration standards and quality control (QC) samples. Both the endogenous and surrogate analytes were extracted using protein precipitation followed by solid phase extraction. Enantiomeric separation was achieved on a chiral crown ether column with an analysis time of only 6 min without any derivatization. The column eluent was introduced into an electrospray interface of a triple-quadrupole mass spectrometer. The calibration range was 1.00 to 300 nmol/g, and the method showed acceptable accuracy and precision at all QC concentration levels from a validation point of view. In addition, the brain D-serine levels of normal mice determined using this method were the same as those obtained by a standard addition method, which is time-consuming but often used for the accurate measurement of endogenous substances. Thus, this surrogate analyte method should be applicable to the measurement of D-serine levels as a potential biomarker for monitoring certain effects of drug candidates on the central nervous system. Copyright © 2012 Elsevier Inc. All rights reserved.
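The surrogate-analyte calibration logic can be sketched with a toy standard curve. All numbers below are illustrative, not from the study; the key assumption, as in the method itself, is that the isotope-labeled surrogate and the native analyte share the same response factor:

```python
import numpy as np

# Sketch of surrogate-analyte calibration: the isotope-labeled surrogate is
# spiked into matrix to build the curve, and the endogenous analyte is read
# off it, assuming identical response factors for labeled and native forms.
# All numbers are illustrative, not from the study.
spiked = np.array([1.0, 10.0, 50.0, 100.0, 300.0])   # surrogate, nmol/g
area_ratio = 0.02 * spiked + 0.001                   # analyte/IS peak-area ratio

slope, intercept = np.polyfit(spiked, area_ratio, 1)

measured_ratio = 0.41                                 # native analyte in a sample
concentration = (measured_ratio - intercept) / slope  # back-calculated nmol/g
```

Because the surrogate is mass-shifted, the matrix can be spiked without interference from the endogenous analyte, which is why no analyte-free blank matrix is needed.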

  13. CADDIS Volume 3. Examples and Applications: Analytical Examples

    EPA Pesticide Factsheets

    Examples illustrating the use of statistical analysis to support different types of evidence: stream temperature, temperature inferred from macroinvertebrates, macroinvertebrate responses, zinc concentrations, and observed trait characteristics.

  14. Direct Surface and Droplet Microsampling for Electrospray Ionization Mass Spectrometry Analysis with an Integrated Dual-Probe Microfluidic Chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Cong-Min; Zhu, Ying; Jin, Di-Qiong

    Ambient mass spectrometry (MS) has revolutionized the way MS analysis is performed and broadened its application in various fields. This paper describes the use of microfluidic techniques to simplify the setup and improve the functions of ambient MS by integrating the sampling probe, electrospray emitter probe, and online mixer on a single glass microchip. Two types of sampling probes, a parallel-channel probe and a U-shaped channel probe, were designed for dry-spot and liquid-phase droplet samples, respectively. We demonstrated that the microfabrication techniques not only enhanced the capability of ambient MS methods in the analysis of dry-spot samples on various surfaces, but also enabled new applications in the analysis of nanoliter-scale chemical reactions in an array of droplets. The versatility of the microchip-based ambient MS method was demonstrated in multiple applications, including evaluation of residual pesticide on fruit surfaces, sensitive analysis of low-ionizable analytes using postsampling derivatization, and high-throughput screening of Ugi-type multicomponent reactions.

  15. The Application of Bayesian Analysis to Issues in Developmental Research

    ERIC Educational Resources Information Center

    Walker, Lawrence J.; Gustafson, Paul; Frimer, Jeremy A.

    2007-01-01

    This article reviews the concepts and methods of Bayesian statistical analysis, which can offer innovative and powerful solutions to some challenging analytical problems that characterize developmental research. In this article, we demonstrate the utility of Bayesian analysis, explain its unique adeptness in some circumstances, address some…

  16. Application of a voltammetric electronic tongue and near infrared spectroscopy for a rapid umami taste assessment.

    PubMed

    Bagnasco, Lucia; Cosulich, M Elisabetta; Speranza, Giovanna; Medini, Luca; Oliveri, Paolo; Lanteri, Silvia

    2014-08-15

    The relationships between sensory attributes and analytical measurements, performed by electronic tongue (ET) and near-infrared spectroscopy (NIRS), were investigated in order to develop a rapid method for the assessment of umami taste. Commercially available umami products and some amino acids were submitted to sensory analysis. Results were analysed in comparison with the outcomes of the analytical measurements. Multivariate exploratory analysis was performed by principal component analysis (PCA). Calibration models for prediction of the umami taste on the basis of ET and NIR signals were obtained using partial least squares (PLS) regression. Different approaches for merging data from the two analytical instruments were considered. Both techniques were shown to provide information related to umami taste; in particular, ET signals showed the higher correlation with the umami attribute. Data fusion was found to be only slightly beneficial, not significantly enough to justify the coupled use of the two analytical techniques. Copyright © 2014 Elsevier Ltd. All rights reserved.
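    The calibration workflow described above can be sketched numerically. In this minimal numpy example, ordinary least squares stands in for PLS, the ET/NIR blocks are synthetic stand-ins, low-level data fusion is simple block concatenation, and all names and dimensions are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic example: 40 samples, 8 ET channels, 20 NIR wavelengths.
    umami = rng.uniform(0, 10, 40)  # sensory umami scores
    et = np.outer(umami, rng.normal(size=8)) + 0.1 * rng.normal(size=(40, 8))
    nir = np.outer(umami, rng.normal(size=20)) + 0.5 * rng.normal(size=(40, 20))

    def calibrate(X, y):
        """Least-squares calibration (stand-in for PLS); returns a predictor."""
        Xc = np.column_stack([np.ones(len(X)), X])
        beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)
        return lambda Xnew: np.column_stack([np.ones(len(Xnew)), Xnew]) @ beta

    # Low-level data fusion: concatenate the two instrument blocks.
    fused = np.hstack([et, nir])
    predict = calibrate(fused, umami)
    rmse = np.sqrt(np.mean((predict(fused) - umami) ** 2))
    ```

    Fitting the same model on `et` or `nir` alone and comparing RMSE values mirrors the paper's comparison of single-instrument versus fused calibrations.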

  17. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and of nonlinearity between the property and spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method was first used to obtain the net analyte signal of the calibration samples and unknown samples; then the Euclidean distance between the net analyte signal of each unknown sample and the net analyte signals of the calibration samples was calculated and used as a similarity index. According to this similarity index, a local calibration set was individually selected for each unknown sample. Finally, a local PLS regression model was built on each local calibration set. The proposed method was applied to a set of near-infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and of a conventional local regression algorithm based on spectral Euclidean distance.
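    A simplified sketch of the three steps described above: NAS projection, distance-based selection of a local calibration set, and a local model per unknown sample. The data are synthetic, and a univariate fit on the NAS norm stands in for the local PLS model:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    wl = 50                              # number of "wavelengths"
    s_analyte = rng.normal(size=wl)      # pure analyte spectrum
    s_interf = rng.normal(size=(1, wl))  # known interferent spectrum

    # Synthetic calibration set: spectrum = analyte + interferent + noise.
    c_a = rng.uniform(0, 1, 100)         # analyte concentrations
    c_i = rng.uniform(0, 1, 100)         # interferent concentrations
    X_cal = (np.outer(c_a, s_analyte) + c_i[:, None] * s_interf
             + 0.01 * rng.normal(size=(100, wl)))

    def nas_projector(interferents):
        """Projector onto the orthogonal complement of the interferent space."""
        Q, _ = np.linalg.qr(interferents.T)
        return np.eye(interferents.shape[1]) - Q @ Q.T

    def local_predict(x_new, X, y, P, k=20):
        """NAS-distance neighbour selection plus a local fit (PLS stand-in)."""
        nas, nas_new = X @ P, x_new @ P
        d = np.linalg.norm(nas - nas_new, axis=1)  # similarity index
        idx = np.argsort(d)[:k]                    # local calibration set
        coef = np.polyfit(np.linalg.norm(nas[idx], axis=1), y[idx], 1)
        return np.polyval(coef, np.linalg.norm(nas_new))

    P = nas_projector(s_interf)
    x_unknown = 0.6 * s_analyte + 0.3 * s_interf[0]
    pred = local_predict(x_unknown, X_cal, c_a, P)
    ```

    Because the projector annihilates the interferent direction, the NAS norm is (to within noise) proportional to the analyte concentration, so the local fit recovers the unknown concentration near 0.6.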

  18. A Requirements-Driven Optimization Method for Acoustic Liners Using Analytic Derivatives

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.; Lopes, Leonard V.

    2017-01-01

    More than ever, there is flexibility and freedom in acoustic liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. In a previous paper on this subject, a method deriving the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground was described. A simple code-wrapping approach was used to evaluate a community noise objective function for an external optimizer. Gradients were evaluated using a finite difference formula. The subject of this paper is an application of analytic derivatives that supply precise gradients to an optimization process. Analytic derivatives improve the efficiency and accuracy of gradient-based optimization methods and allow consideration of more design variables. In addition, the benefit of variable impedance liners is explored using a multi-objective optimization.
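    The gradient comparison at the heart of this approach can be illustrated on a standard test function; the Rosenbrock function and plain gradient descent are stand-ins for the paper's certification-noise objective and optimizer:

    ```python
    import numpy as np

    def objective(x):
        """Rosenbrock function, a standard optimization test problem."""
        return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

    def analytic_grad(x):
        """Exact gradient, derived once by hand (no step-size error)."""
        return np.array([
            -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2),
        ])

    def finite_diff_grad(f, x, h=1e-6):
        """Central differences, the approach the analytic method replaces."""
        g = np.zeros_like(x)
        for i in range(len(x)):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - f(x - e)) / (2 * h)
        return g

    # The two gradients agree, but the analytic one needs no extra function
    # evaluations and carries no truncation error.
    x0 = np.array([-1.2, 1.0])
    err = np.abs(analytic_grad(x0) - finite_diff_grad(objective, x0)).max()

    # Plain gradient descent driven by the exact gradient.
    x = x0.copy()
    for _ in range(20000):
        x -= 1e-3 * analytic_grad(x)
    ```

    The finite-difference version costs 2n objective evaluations per gradient and its accuracy depends on the step `h`, which is the efficiency and precision argument the paper makes for analytic derivatives.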

  19. Performance testing and analysis results of AMTEC cells for space applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borkowski, C.A.; Barkan, A.; Hendricks, T.J.

    1998-01-01

    Testing and analysis have shown that AMTEC (Alkali Metal Thermal to Electric Conversion) (Weber, 1974) cells can reach the performance (power) levels required by a variety of space applications. The performance of an AMTEC cell is highly dependent on the thermal environment to which it is subjected. A guard heater assembly has been designed, fabricated, and used to expose individual AMTEC cells to various thermal environments. The design and operation of the guard heater assembly are discussed. Performance test results of an AMTEC cell operated under guard-heated conditions to simulate an adiabatic cell wall thermal environment are presented. Experimental data and analytic model results are compared to illustrate validation of the model. © 1998 American Institute of Physics.

  20. High-Throughput Epitope Binning Assays on Label-Free Array-Based Biosensors Can Yield Exquisite Epitope Discrimination That Facilitates the Selection of Monoclonal Antibodies with Functional Activity

    PubMed Central

    Abdiche, Yasmina Noubia; Miles, Adam; Eckman, Josh; Foletti, Davide; Van Blarcom, Thomas J.; Yeung, Yik Andy; Pons, Jaume; Rajpal, Arvind

    2014-01-01

    Here, we demonstrate how array-based label-free biosensors can be applied to the multiplexed interaction analysis of large panels of analyte/ligand pairs, such as the epitope binning of monoclonal antibodies (mAbs). In this application, the larger the number of mAbs that are analyzed for cross-blocking in a pairwise and combinatorial manner against their specific antigen, the higher the probability of discriminating their epitopes. Since cross-blocking of two mAbs is necessary but not sufficient for them to bind an identical epitope, high-resolution epitope binning analysis determined by high-throughput experiments can enable the identification of mAbs with similar but unique epitopes. We demonstrate that a mAb's epitope and functional activity are correlated, thereby strengthening the relevance of epitope binning data to the discovery of therapeutic mAbs. We evaluated two state-of-the-art label-free biosensors that enable the parallel analysis of 96 unique analyte/ligand interactions and nearly ten thousand total interactions per unattended run. The IBIS-MX96 is a microarray-based surface plasmon resonance imager (SPRi) integrated with continuous flow microspotting technology whereas the Octet-HTX is equipped with disposable fiber optic sensors that use biolayer interferometry (BLI) detection. We compared their throughput, versatility, ease of sample preparation, and sample consumption in the context of epitope binning assays. We conclude that the main advantages of the SPRi technology are its exceptionally low sample consumption, facile sample preparation, and unparalleled unattended throughput. In contrast, the BLI technology is highly flexible because it allows for the simultaneous interaction analysis of 96 independent analyte/ligand pairs, ad hoc sensor replacement and on-line reloading of an analyte- or ligand-array. 
Thus, the complementary use of these two platforms can expedite applications that are relevant to the discovery of therapeutic mAbs, depending upon the sample availability, and the number and diversity of the interactions being studied. PMID:24651868
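    The binning logic described above, grouping by identical cross-blocking profiles against the whole panel rather than by single pairwise blocks, can be sketched in a few lines; the mAb names and blocking results here are hypothetical:

    ```python
    # Hypothetical pairwise cross-blocking results: blocks[a][b] is True if
    # mAb `a`, bound first to the antigen, prevents mAb `b` from binding.
    mabs = ["mAb1", "mAb2", "mAb3", "mAb4"]
    blocks = {
        "mAb1": {"mAb1": True,  "mAb2": True,  "mAb3": False, "mAb4": False},
        "mAb2": {"mAb1": True,  "mAb2": True,  "mAb3": False, "mAb4": False},
        "mAb3": {"mAb1": False, "mAb2": False, "mAb3": True,  "mAb4": True},
        "mAb4": {"mAb1": False, "mAb2": False, "mAb3": True,  "mAb4": True},
    }

    def bin_mabs(mabs, blocks):
        """Group mAbs whose full blocking profiles are identical.

        Cross-blocking of two mAbs is necessary but not sufficient for a
        shared epitope; an identical profile against the whole panel is the
        stronger criterion described in the abstract.
        """
        bins = {}
        for m in mabs:
            profile = tuple(blocks[m][other] for other in mabs)
            bins.setdefault(profile, []).append(m)
        return list(bins.values())

    print(bin_mabs(mabs, blocks))  # → [['mAb1', 'mAb2'], ['mAb3', 'mAb4']]
    ```

    On a real 96 x 96 panel the same idea applies, except that profiles are compared with a tolerance (e.g. clustering on profile distance) rather than by exact equality.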

  1. In Vitro Electrochemistry of Biological Systems

    PubMed Central

    Adams, Kelly L.; Puchades, Maja; Ewing, Andrew G.

    2009-01-01

    This article reviews recent work involving electrochemical methods for in vitro analysis of biomolecules, with an emphasis on detection and manipulation at and of single cells and cultures of cells. The techniques discussed include constant potential amperometry, chronoamperometry, cellular electroporation, scanning electrochemical microscopy, and microfluidic platforms integrated with electrochemical detection. The principles of these methods are briefly described, followed in most cases with a short description of an analytical or biological application and its significance. The use of electrochemical methods to examine specific mechanistic issues in exocytosis is highlighted, as a great deal of recent work has been devoted to this application. PMID:20151038

  2. Application of NASTRAN/COSMIC in the analysis of ship structures to underwater explosion shock

    NASA Technical Reports Server (NTRS)

    Fallon, D. J.; Costanzo, F. A.; Handleton, R. T.; Camp, G. C.; Smith, D. C.

    1987-01-01

    The application of NASTRAN/COSMIC in predicting the transient motion of ship structures subjected to underwater, non-contact explosions is discussed. Examples illustrate the finite element models, mathematical formulations of loading functions and, where available, comparisons between analytical and experimental results.

  3. Recent Development in Optical Chemical Sensors Coupling with Flow Injection Analysis

    PubMed Central

    Ojeda, Catalina Bosch; Rojas, Fuensanta Sánchez

    2006-01-01

    Optical techniques for chemical analysis are well established, and sensors based on these techniques are now attracting considerable attention because of their importance in applications such as environmental monitoring, biomedical sensing, and industrial process control. On the other hand, flow injection analysis (FIA) is well suited to the rapid analysis of microliter-volume samples and can be interfaced directly to the chemical process. FIA has become a widespread automated analytical method for several reasons, mainly the simplicity and low cost of the setups, their versatility, and their ease of assembly. In this paper, an overview of flow injection determinations using optical chemical sensors is provided, and instrumentation, sensor design, and applications are discussed. This work summarizes the most relevant manuscripts published from 1980 to date on analysis using optical chemical sensors in FIA.

  4. Chiral Drug Analysis in Forensic Chemistry: An Overview.

    PubMed

    Ribeiro, Cláudia; Santos, Cristiana; Gonçalves, Valter; Ramos, Ana; Afonso, Carlos; Tiritan, Maria Elizabeth

    2018-01-28

    Many substances of forensic interest are chiral and available either as racemates or as pure enantiomers. Application of chiral analysis to biological samples can be useful for the determination of legal or illicit drug consumption or for the interpretation of unexpected toxicological effects. Chiral substances can also be found in environmental samples, where their analysis has proved useful for determining community drug usage (sewage epidemiology), identifying illicit drug manufacturing locations and illegal discharge of sewage, and performing environmental risk assessment. Thus, the purpose of this paper is to provide an overview of the application of chiral analysis to biological and environmental samples and its relevance in the forensic field. The analytical methods most frequently used to quantify the enantiomers are liquid and gas chromatography, using both indirect methods, with enantiomerically pure derivatizing reagents, and direct methods based on chiral stationary phases.

  5. Analytic Methods for Benchmarking Hydrogen and Fuel Cell Technologies; NREL (National Renewable Energy Laboratory)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melaina, Marc; Saur, Genevieve; Ramsden, Todd

    2015-05-28

    This presentation summarizes NREL's hydrogen and fuel cell analysis work in three areas: resource potential, greenhouse gas emissions and cost of delivered energy, and influence of auxiliary revenue streams. NREL's hydrogen and fuel cell analysis projects focus on low-carbon and economic transportation and stationary fuel cell applications. Analysis tools developed by the lab provide insight into the degree to which bridging markets can strengthen the business case for fuel cell applications.

  6. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. 
Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.

  7. Increasing productivity for the analysis of trace contaminants in food by gas chromatography-mass spectrometry using automated liner exchange, backflushing and heart-cutting.

    PubMed

    David, Frank; Tienpont, Bart; Devos, Christophe; Lerch, Oliver; Sandra, Pat

    2013-10-25

    Laboratories focusing on residue analysis in food are continuously seeking to increase sample throughput by minimizing sample preparation. Generic sample extraction methods such as QuEChERS lack selectivity and consequently extracts are not free from non-volatile material that contaminates the analytical system. Co-extracted matrix constituents interfere with target analytes, even if highly sensitive and selective GC-MS/MS is used. A number of GC approaches are described that can be used to increase laboratory productivity. These techniques include automated inlet liner exchange and column backflushing for preservation of the performance of the analytical system and heart-cutting two-dimensional GC for increasing sensitivity and selectivity. The application of these tools is illustrated by the analysis of pesticides in vegetables and fruits, PCBs in milk powder and coplanar PCBs in fish. It is demonstrated that considerable increase in productivity can be achieved by decreasing instrument down-time, while analytical performance is equal or better compared to conventional trace contaminant analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Analytical solution for the effect of the permittivity of coating layer on eddy current generated in an aluminum sample by EMAT

    NASA Astrophysics Data System (ADS)

    Sun, Feiran; Sun, Zhenguo; Chen, Qiang

    2016-02-01

    In order to improve the ultrasonic wave amplitude excited by electromagnetic acoustic transducers (EMATs), many researchers have proposed models, but these models have ignored the displacement current, i.e., the effect of the permittivity of the air or the metal sample, because of its low value. However, durable dielectric materials, which have a much higher permittivity than air or metal, are increasingly replacing metals or being applied as coatings on them in many applications, so the effect of permittivity can no longer be ignored. Based on an analytical model, the effect of the permittivity of the coating layer on the eddy current generated in an aluminum sample by an EMAT has been studied. The analytical analysis indicates that the eddy current density excited by the spiral coil of the EMAT slowly increases at first and then decreases rapidly as the permittivity increases, and that it depends strongly on the thickness of the coating layer and the exciting frequency; this is verified by simulation results.

  9. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. 
However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.

  10. A versatile, refrigerant- and cryogen-free cryofocusing-thermodesorption unit for preconcentration of trace gases in air

    NASA Astrophysics Data System (ADS)

    Obersteiner, Florian; Bönisch, Harald; Keber, Timo; O'Doherty, Simon; Engel, Andreas

    2016-10-01

    We present a compact and versatile cryofocusing-thermodesorption unit, which we developed for quantitative analysis of halogenated trace gases in ambient air. Possible applications include aircraft-based in situ measurements, in situ monitoring and laboratory operation for the analysis of flask samples. Analytes are trapped on adsorptive material cooled by a Stirling cooler to low temperatures (e.g. -80 °C) and subsequently desorbed by rapid heating of the adsorptive material (e.g. +200 °C). The set-up involves neither the exchange of adsorption tubes nor any further condensation or refocusing steps. No moving parts are used that would require vacuum insulation. This allows for a simple and robust design. Reliable operation is ensured by the Stirling cooler, which neither contains a liquid refrigerant nor requires refilling a cryogen. At the same time, it allows for significantly lower adsorption temperatures compared to commonly used Peltier elements. We use gas chromatography - mass spectrometry (GC-MS) for separation and detection of the preconcentrated analytes after splitless injection. A substance boiling point range of approximately -80 to +150 °C and a substance mixing ratio range of less than 1 ppt (pmol mol-1) to more than 500 ppt in preconcentrated sample volumes of 0.1 to 10 L of ambient air is covered, depending on the application and its analytical demands. We present the instrumental design of the preconcentration unit and demonstrate capabilities and performance through the examination of analyte breakthrough during adsorption, repeatability of desorption and analyte residues in blank tests. Examples of application are taken from the analysis of flask samples collected at Mace Head Atmospheric Research Station in Ireland using our laboratory GC-MS instruments and by data obtained during a research flight with our in situ aircraft instrument GhOST-MS (Gas chromatograph for the Observation of Tracers - coupled with a Mass Spectrometer).

  11. Development and application of an information-analytic system on the problem of flow accelerated corrosion of pipeline elements in the secondary coolant circuit of VVER-440-based power units at the Novovoronezh nuclear power plant

    NASA Astrophysics Data System (ADS)

    Tomarov, G. V.; Povarov, V. P.; Shipkov, A. A.; Gromov, A. F.; Kiselev, A. N.; Shepelev, S. V.; Galanin, A. V.

    2015-02-01

    Specific features relating to development of the information-analytical system on the problem of flow-accelerated corrosion of pipeline elements in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh nuclear power plant are considered. The results from a statistical analysis of data on the quantity, location, and operating conditions of the elements and preinserted segments of pipelines used in the condensate-feedwater and wet steam paths are presented. The principles of preparing and using the information-analytical system for determining the lifetime to reaching inadmissible wall thinning in elements of pipelines used in the secondary coolant circuit of the VVER-440-based power units at the Novovoronezh NPP are considered.

  12. Multibeam antenna study, phase 1

    NASA Technical Reports Server (NTRS)

    Bellamy, J. L.

    1972-01-01

    A multibeam antenna concept was developed for providing spot beam coverage of the contiguous 48 states. The selection of a suitable antenna concept for the multibeam application and an experimental evaluation of the antenna concept selected are described. The final analysis indicates that the preferred concept is a dual-antenna, circular artificial dielectric lens. A description of the analytical methods is provided, as well as a discussion of the absolute requirements placed on the antenna concepts. Finally, a comparative analysis of reflector antenna off-axis beam performance is presented.

  13. [Adequate application of quantitative and qualitative statistic analytic methods in acupuncture clinical trials].

    PubMed

    Tan, Ming T; Liu, Jian-ping; Lao, Lixing

    2012-08-01

    Recently, proper use of the statistical methods in traditional Chinese medicine (TCM) randomized controlled trials (RCTs) has received increased attention. Statistical inference based on hypothesis testing is the foundation of clinical trials and evidence-based medicine. In this article, the authors described the methodological differences between literature published in Chinese and Western journals in the design and analysis of acupuncture RCTs and the application of basic statistical principles. In China, qualitative analysis method has been widely used in acupuncture and TCM clinical trials, while the between-group quantitative analysis methods on clinical symptom scores are commonly used in the West. The evidence for and against these analytical differences were discussed based on the data of RCTs assessing acupuncture for pain relief. The authors concluded that although both methods have their unique advantages, quantitative analysis should be used as the primary analysis while qualitative analysis can be a secondary criterion for analysis. The purpose of this paper is to inspire further discussion of such special issues in clinical research design and thus contribute to the increased scientific rigor of TCM research.

  14. Nuclear analytical techniques in medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesareo, R.

    1988-01-01

    This book acquaints one with the fundamental principles and the instrumentation relevant to analytical techniques based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF-analysis; the ability of the PIXE-microprobe to analyze in detail and to map trace elements in fragments of biomedical samples or inside the cells; the potentiality of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic scattering) and attenuation measurements which will undoubtedly see great development in the immediate future.

  15. Building pit dewatering: application of transient analytic elements.

    PubMed

    Zaadnoordijk, Willem J

    2006-01-01

    Analytic elements are well suited for the design of building pit dewatering. Wells and drains can be modeled accurately by analytic elements, both nearby to determine the pumping level and at some distance to verify the targeted drawdown at the building site and to estimate the consequences in the vicinity. The ability to shift locations of wells or drains easily makes the design process very flexible. The temporary pumping has transient effects, for which transient analytic elements may be used. This is illustrated using the free, open-source, object-oriented analytic element simulator Tim(SL) for the design of a building pit dewatering near a canal. Steady calculations are complemented with transient calculations. Finally, the bandwidths of the results are estimated using linear variance analysis.
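    As an illustration of a transient analytic element, the following sketch implements the classic Theis (1935) solution for drawdown around a single pumping well, using the well-function series. This is a generic textbook element, not the TimSL API, and the parameter values are hypothetical:

    ```python
    import math

    def well_function(u, terms=30):
        """Theis well function W(u) = -γ - ln u + Σ (-1)^(k+1) u^k / (k·k!)."""
        gamma = 0.5772156649015329  # Euler-Mascheroni constant
        s = -gamma - math.log(u)
        for k in range(1, terms):
            s += (-1) ** (k + 1) * u ** k / (k * math.factorial(k))
        return s

    def drawdown(Q, T, S, r, t):
        """Transient drawdown of a single fully penetrating well (Theis).

        Q pumping rate [m3/d], T transmissivity [m2/d], S storativity [-],
        r distance from the well [m], t time since pumping started [d].
        """
        u = r * r * S / (4 * T * t)
        return Q / (4 * math.pi * T) * well_function(u)

    # Drawdown 20 m from a dewatering well after 10 days of pumping.
    s20 = drawdown(Q=500.0, T=250.0, S=1e-3, r=20.0, t=10.0)
    s100 = drawdown(Q=500.0, T=250.0, S=1e-3, r=100.0, t=10.0)
    ```

    Because the solution is linear in Q, drawdowns from several wells or drains superpose by simple addition, which is what makes shifting element locations during design so cheap.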

  16. Net analyte signal standard addition method (NASSAM) as a novel spectrofluorimetric and spectrophotometric technique for simultaneous determination, application to assay of melatonin and pyridoxine

    NASA Astrophysics Data System (ADS)

    Asadpour-Zeynali, Karim; Bastami, Mohammad

    2010-02-01

    In this work a new modification of the standard addition method, called the "net analyte signal standard addition method (NASSAM)", is presented for simultaneous spectrofluorimetric and spectrophotometric analysis. The proposed method combines the advantages of the standard addition method with those of the net analyte signal concept. The method can be applied to the determination of an analyte in the presence of known interferents. In contrast to the H-point standard addition method, the accuracy of the predictions does not depend on the shape of the analyte and interferent spectra. The method was successfully applied to the simultaneous spectrofluorimetric and spectrophotometric determination of pyridoxine (PY) and melatonin (MT) in synthetic mixtures and in a pharmaceutical formulation.
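    For context, the classic standard addition calculation that NASSAM extends: fit the instrument response against the spiked additions and extrapolate to zero signal. The numbers below are hypothetical, and the NAS projection step that handles interferents is omitted:

    ```python
    import numpy as np

    # Hypothetical standard-addition data: the same sample is spiked with
    # known analyte amounts and the response is measured (arbitrary units).
    added = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # added concentration
    signal = np.array([2.1, 3.0, 4.1, 4.9, 6.0])  # instrument response

    slope, intercept = np.polyfit(added, signal, 1)
    # Extrapolating the line to zero signal gives the unknown concentration
    # as the magnitude of the x-intercept: c = intercept / slope.
    c_unknown = intercept / slope
    ```

    In NASSAM the raw signal is first replaced by the net analyte signal, so the same extrapolation remains valid even when a known interferent contributes to the measured spectra.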

  17. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition is a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. 
Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.

  18. KC-135 winglet program overview

    NASA Technical Reports Server (NTRS)

    Barber, M. R.; Selegan, D.

    1982-01-01

    A joint NASA/USAF program was conducted to accomplish the following objectives: (1) evaluate the benefits that could be achieved from the application of winglets to the KC-135 aircraft; and (2) determine the ability of wind tunnel tests and analytical methods to predict winglet characteristics. The program included wind-tunnel development of a test winglet configuration; analytical predictions of the changes to the aircraft resulting from the application of the test winglet; and, finally, flight tests of the developed configuration. Pressure distribution, loads, stability and control, buffet, fuel mileage, and flutter data were obtained to fulfill the objectives of the program.

  19. Analysis of dual coupler nested coupled cavities.

    PubMed

    Adib, George A; Sabry, Yasser M; Khalil, Diaa

    2017-12-01

    Coupled ring resonators now form the basic building blocks of several optical systems serving different applications. In many of these applications, a small full width at half maximum is required, along with a large free spectral range. In this work, a configuration of passive coupled cavities constituting dual coupler nested cavities is proposed. A theoretical study of the configuration is presented, allowing us to obtain analytical expressions for its different spectral characteristics. The transfer function of the configuration is also used to generate design curves, and these results are compared with the analytical expressions. Finally, the configuration is compared with other coupled cavity configurations.
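    The two spectral figures of merit named in this abstract are easy to relate numerically. The sketch below is a generic illustration (the wavelength, group index, path length and linewidth are assumed values, not taken from the paper): the free spectral range of a single ring follows FSR ≈ λ²/(n_g·L), and the FSR-to-FWHM ratio is the finesse that nested-cavity designs aim to enlarge.

```python
import math

def ring_fsr_nm(wavelength_nm, group_index, circumference_um):
    """Free spectral range of a ring resonator: FSR = lambda^2 / (n_g * L)."""
    length_nm = circumference_um * 1e3   # um -> nm
    return wavelength_nm ** 2 / (group_index * length_nm)

def finesse(fsr_nm, fwhm_nm):
    """Finesse = FSR / FWHM: large when the FSR is wide and the line is narrow."""
    return fsr_nm / fwhm_nm

# Illustrative values (assumed, not from the paper): 1550 nm operation,
# group index 4.2, 100 um round-trip length, 0.05 nm linewidth.
fsr = ring_fsr_nm(1550.0, 4.2, 100.0)
print(round(fsr, 2), round(finesse(fsr, 0.05), 1))  # → 5.72 114.4
```

    A longer ring shrinks the FSR, which is exactly why nesting cavities (rather than simply enlarging one ring) is attractive for keeping the FSR large while narrowing the linewidth.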

  20. Explorative visual analytics on interval-based genomic data and their metadata.

    PubMed

    Jalili, Vahid; Matteucci, Matteo; Masseroli, Marco; Ceri, Stefano

    2017-12-04

    With the wide spread of public repositories of NGS processed data, the availability of user-friendly and effective tools for data exploration, analysis and visualization is becoming very relevant. These tools enable interactive analytics, an exploratory approach for the seamless "sense-making" of data through on-the-fly integration of analysis and visualization phases, suggested not only for evaluating processing results, but also for designing and adapting NGS data analysis pipelines. This paper presents abstractions for supporting the early analysis of NGS processed data and their implementation in an associated tool, named GenoMetric Space Explorer (GeMSE). This tool serves the needs of the GenoMetric Query Language, an innovative cloud-based system for computing complex queries over heterogeneous processed data. It can also be used starting from any text files in standard BED, BroadPeak, NarrowPeak, GTF, or general tab-delimited format containing numerical features of genomic regions; metadata can be provided as text files in tab-delimited attribute-value format. GeMSE enables interactive analytics, consisting of on-the-fly cycling among steps of data exploration, analysis and visualization that help biologists and bioinformaticians make sense of heterogeneous genomic datasets. By means of its explorative interaction support, users can trace past activities and quickly recover their results, seamlessly going backward and forward through the analysis steps and comparative visualizations of heatmaps. GeMSE's effective application and practical usefulness are demonstrated through significant use cases of biological interest. GeMSE is available at http://www.bioinformatics.deib.polimi.it/GeMSE/, and its source code is available at https://github.com/Genometric/GeMSE under the GPLv3 open-source license.
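    Since GeMSE accepts plain BED-style tab-delimited region files, a minimal loader is easy to sketch. This is a generic illustration of the BED convention (0-based, half-open intervals), not GeMSE's actual parser:

```python
from typing import List, Tuple

def parse_bed_lines(lines: List[str]) -> List[Tuple[str, int, int]]:
    """Parse minimal BED records into (chrom, start, end) tuples.
    BED coordinates are 0-based and half-open; track/browser/comment
    lines and blanks are skipped."""
    regions = []
    for line in lines:
        line = line.rstrip("\n")
        if not line or line.startswith(("#", "track", "browser")):
            continue
        fields = line.split("\t")
        regions.append((fields[0], int(fields[1]), int(fields[2])))
    return regions

sample = ["track name=peaks", "chr1\t100\t250", "chr2\t0\t50"]
print(parse_bed_lines(sample))  # → [('chr1', 100, 250), ('chr2', 0, 50)]
```

    Real BED files carry up to nine optional extra columns (name, score, strand, ...); a tool like GeMSE would retain those as the "numerical features" mentioned above rather than discarding them.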

  1. Algal Biomass Analysis by Laser-Based Analytical Techniques—A Review

    PubMed Central

    Pořízka, Pavel; Prochazková, Petra; Prochazka, David; Sládková, Lucia; Novotný, Jan; Petrilak, Michal; Brada, Michal; Samek, Ota; Pilát, Zdeněk; Zemánek, Pavel; Adam, Vojtěch; Kizek, René; Novotný, Karel; Kaiser, Jozef

    2014-01-01

    Algal biomass, represented mainly by commercially grown algal strains, has recently found many potential applications in various fields of interest. Its utilization has proven advantageous in the fields of bioremediation, biofuel production and the food industry. This paper reviews recent developments in the analysis of algal biomass, with the main focus on the Laser-Induced Breakdown Spectroscopy, Raman spectroscopy, and partly Laser-Ablation Inductively Coupled Plasma techniques. The advantages of the selected laser-based analytical techniques are revealed and their fields of use are discussed in detail. PMID:25251409

  2. Separation of aromatic carboxylic acids using quaternary ammonium salts on reversed-phase HPLC. 2. Application for the analysis of Loy Yang coal oxidation products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawamura, K.; Okuwaki, A.; Verheyen, T.V.

    In order to develop separation processes and analytical methods for aromatic carboxylic acids in coal oxidation products, the separation behavior of aromatic carboxylic acids on reversed-phase HPLC with an eluent containing a quaternary ammonium salt was optimized using the solvent gradient method. The method was applied to the analysis of Loy Yang coal oxidation products. It was confirmed that the analytical data obtained using this method were consistent with those determined using gas chromatography.

  3. Cylindrical optical resonators: fundamental properties and bio-sensing characteristics

    NASA Astrophysics Data System (ADS)

    Khozeymeh, Foroogh; Razaghi, Mohammad

    2018-04-01

    In this paper, a detailed theoretical analysis of cylindrical resonators is presented. As illustrated, these kinds of resonators can be used as optical bio-sensing devices. The proposed structure is analyzed using an analytical method based on Lam's approximation. This method is systematic and simplifies the tedious process of whispering-gallery mode (WGM) wavelength analysis in optical cylindrical biosensors. It makes possible the analysis of higher radial orders of high-angular-momentum WGMs. Using closed-form analytical equations, resonance wavelengths of higher radial- and angular-order WGMs of TE and TM polarization waves are calculated. It is shown that high-angular-momentum WGMs are more appropriate for bio-sensing applications. Some of the calculations are performed using a numerical non-linear Newton method. A match of 99.84% between the analytical and numerical methods has been achieved. In order to verify the validity of the calculations, Meep simulations based on the finite difference time domain (FDTD) method are performed, in which case a match of 96.70% between the analytical and FDTD results has been obtained. The analytical predictions are also in good agreement with other experimental work (99.99% match). These results validate the proposed analytical modelling for the fast design of optical cylindrical biosensors. It is shown that by extending the proposed two-layer analysis scheme, a three-layer cylindrical resonator structure can be studied as well. Moreover, the method enables fast sensitivity optimization in cylindrical resonator-based biosensors. The sensitivity of the WGM resonances is analyzed as a function of the structural parameters of the cylindrical resonators. Based on the results, fourth radial-order WGMs, with a resonator radius of 50 μm, display the highest bulk refractive-index sensitivity, 41.50 nm/RIU.
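    The "numerical non-linear Newton method" mentioned here is a standard root-finding step. As a generic illustration of it (the paper's actual WGM characteristic equation is not reproduced; the sample equation below is purely illustrative):

```python
import math

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Basic Newton-Raphson iteration: x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    return x

# Illustrative transcendental equation (NOT the WGM characteristic equation
# from the paper): find the x that satisfies x = 2*cos(x).
f = lambda x: x - 2.0 * math.cos(x)
df = lambda x: 1.0 + 2.0 * math.sin(x)
root = newton(f, df, x0=1.0)
print(round(root, 6))
```

    In a resonance calculation, f would be the characteristic equation whose roots are the WGM wavelengths, and the closed-form Lam approximation would supply the initial guess x0 so the iteration converges to the intended radial order.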

  4. Parameter identification of hyperelastic material properties of the heel pad based on an analytical contact mechanics model of a spherical indentation.

    PubMed

    Suzuki, Ryo; Ito, Kohta; Lee, Taeyong; Ogihara, Naomichi

    2017-01-01

    Accurate identification of the material properties of the plantar soft tissue is important for computer-aided analysis of foot pathologies and design of therapeutic footwear interventions based on subject-specific models of the foot. However, parameter identification of the hyperelastic material properties of plantar soft tissues usually requires an inverse finite element analysis due to the lack of a practical contact model of the indentation test. In the present study, we derive an analytical contact model of a spherical indentation test in order to directly estimate the material properties of the plantar soft tissue. Force-displacement curves of the heel pads are obtained through an indentation experiment. The experimental data are fit to the analytical stress-strain solution of the spherical indentation in order to obtain the parameters. A spherical indentation approach successfully predicted the non-linear material properties of the heel pad without iterative finite element calculation. The force-displacement curve obtained in the present study was found to be situated lower than those identified in previous studies. The proposed framework for identifying the hyperelastic material parameters may facilitate the development of subject-specific FE modeling of the foot for possible clinical and ergonomic applications. Copyright © 2016 Elsevier Ltd. All rights reserved.
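    The fitting workflow described, matching measured force-displacement curves to an analytical spherical-indentation solution without FE iteration, can be sketched with the classical Hertz contact law as a stand-in for the paper's hyperelastic solution. All numbers below are synthetic, for illustration only:

```python
import numpy as np

def fit_hertz_modulus(depth_m, force_n, radius_m):
    """Least-squares estimate of the effective modulus E_eff from the
    Hertzian spherical-contact law F = (4/3) * E_eff * sqrt(R) * d**1.5
    (a classical stand-in, not the paper's hyperelastic solution)."""
    x = (4.0 / 3.0) * np.sqrt(radius_m) * depth_m ** 1.5
    # one-parameter linear least squares through the origin
    return float(np.dot(x, force_n) / np.dot(x, x))

# Synthetic force-displacement data generated with E_eff = 50 kPa, R = 5 mm
R = 5e-3
d = np.linspace(0.5e-3, 5e-3, 10)
F = (4.0 / 3.0) * 50e3 * np.sqrt(R) * d ** 1.5
print(round(fit_hertz_modulus(d, F, R)))  # → 50000
```

    The paper's contribution is precisely to replace the linear-elastic Hertz assumption with a hyperelastic constitutive law while keeping this direct (non-iterative) identification structure.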

  5. Development of a new semi-analytical model for cross-borehole flow experiments in fractured media

    USGS Publications Warehouse

    Roubinet, Delphine; Irving, James; Day-Lewis, Frederick D.

    2015-01-01

    Analysis of borehole flow logs is a valuable technique for identifying the presence of fractures in the subsurface and estimating properties such as fracture connectivity, transmissivity and storativity. However, such estimation requires the development of analytical and/or numerical modeling tools that are well adapted to the complexity of the problem. In this paper, we present a new semi-analytical formulation for cross-borehole flow in fractured media that links transient vertical-flow velocities measured in one or a series of observation wells during hydraulic forcing to the transmissivity and storativity of the fractures intersected by these wells. In comparison with existing models, our approach presents major improvements in terms of computational expense and potential adaptation to a variety of fracture and experimental configurations. After derivation of the formulation, we demonstrate its application in the context of sensitivity analysis for a relatively simple two-fracture synthetic problem, as well as for field-data analysis to investigate fracture connectivity and estimate fracture hydraulic properties. These applications provide important insights regarding (i) the strong sensitivity of fracture property estimates to the overall connectivity of the system; and (ii) the non-uniqueness of the corresponding inverse problem for realistic fracture configurations.

  6. Applications of nuclear analytical techniques to environmental studies

    NASA Astrophysics Data System (ADS)

    Freitas, M. C.; Pacheco, A. M. G.; Marques, A. P.; Barros, L. I. C.; Reis, M. A.

    2001-07-01

    A few examples of application of nuclear-analytical techniques to biological monitors—natives and transplants—are given herein. Parmelia sulcata Taylor transplants were set up in a heavily industrialized area of Portugal—the Setúbal peninsula, about 50 km south of Lisbon—where indigenous lichens are rare. The whole area was 10×15 km around an oil-fired power station, and a 2.5×2.5 km grid was used. In north-western Portugal, native thalli of the same epiphytes (Parmelia spp., mostly Parmelia sulcata Taylor) and bark from olive trees (Olea europaea) were sampled across an area of 50×50 km, using a 10×10 km grid. This area is densely populated and features a blend of rural, urban-industrial and coastal environments, together with the country's second-largest metro area (Porto). All biomonitors have been analyzed by INAA and PIXE. Results were put through nonparametric tests and factor analysis for trend significance and emission sources, respectively.

  7. Metabolomics for laboratory diagnostics.

    PubMed

    Bujak, Renata; Struck-Lewicka, Wiktoria; Markuszewski, Michał J; Kaliszan, Roman

    2015-09-10

    Metabolomics is an emerging approach in the systems biology field. Due to continuous development in advanced analytical techniques and in bioinformatics, metabolomics has been extensively applied as a novel, holistic diagnostic tool in clinical and biomedical studies. Measurement of the metabolome, as a chemical reflection of the current phenotype of a particular biological system, is nowadays frequently used to understand the pathophysiological processes involved in disease progression as well as to search for new diagnostic or prognostic biomarkers of various disorders. In this review, we discuss the research strategies and analytical platforms commonly applied in metabolomics studies. The applications of metabolomics in laboratory diagnostics in the last 5 years are also reviewed according to the type of biological sample used in metabolome analysis. We also discuss some limitations and further improvements that should be considered, bearing in mind the potential applications of metabolomics in research and practice. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    PubMed

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists face the challenge of gaining profound insight into biological function from big biological data, which in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are needed, along with efficient and scalable algorithms that can take advantage of them. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of biological data analysis applications and a survey of how they have been mapped onto various computing platforms. After that, we present a case study comparing the efficiency of different computing platforms in handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
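    The classical sequence alignment problem used in the case study is typically solved with dynamic programming. A minimal Needleman-Wunsch global alignment scorer (a textbook sketch, not the benchmarked HPC implementation) looks like this:

```python
def nw_score(a, b, match=1, mismatch=-1, gap=-1):
    """Needleman-Wunsch global alignment score via dynamic programming.
    dp[i][j] holds the best score for aligning a[:i] against b[:j]."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):        # aligning a prefix against an empty string
        dp[i][0] = i * gap
    for j in range(1, cols):
        dp[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[-1][-1]

print(nw_score("GATTACA", "GCATGCU"))  # → 0
```

    The O(mn) table is what makes the problem a good HPC benchmark: the anti-diagonals of dp are independent and can be filled in parallel on multicore CPUs, GPUs, or FPGAs.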

  9. Mechanics of the tapered interference fit in dental implants.

    PubMed

    Bozkaya, Dinçer; Müftü, Sinan

    2003-11-01

    In evaluating the long-term success of a dental implant, the reliability and stability of the implant-abutment interface play a major role. Tapered interference fits provide a reliable connection method between the abutment and the implant. In this work, the mechanics of tapered interference fits were analyzed using a closed-form formula and the finite element (FE) method. An analytical solution used to predict the contact pressure in a straight interference fit was modified to predict the contact pressure in the tapered implant-abutment interface. Elastic-plastic FE analysis was used to simulate the implant and abutment material behavior. The validity and applicability of the analytical solution were investigated by comparison with the FE model over a range of problem parameters. It was shown that the analytical solution can be used to determine the pull-out force and loosening torque with 5-10% error. A detailed analysis of the stress distribution due to the tapered interference fit in a commercially available abutment-implant system was carried out. This analysis shows that plastic deformation in the implant limits the increase in pull-out force that would otherwise be predicted at higher interference values.
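    The starting point the authors modify, the contact pressure of a straight interference fit, is the textbook Lamé thick-walled-cylinder result. The sketch below evaluates that straight-fit formula with illustrative titanium values; the geometry and interference are assumptions, and the paper's tapered correction is not included:

```python
def press_fit_pressure(delta_r, a, b, c, E_i, nu_i, E_o, nu_o):
    """Lame thick-walled-cylinder contact pressure for a STRAIGHT interference
    fit: radial interference delta_r at interface radius b, inner member with
    bore radius a, outer member with outside radius c. The paper adapts this
    type of solution to the tapered implant-abutment interface."""
    outer = ((c**2 + b**2) / (c**2 - b**2) + nu_o) / E_o
    inner = ((b**2 + a**2) / (b**2 - a**2) - nu_i) / E_i
    return delta_r / (b * (outer + inner))

# Illustrative titanium-on-titanium numbers (assumed, not from the paper):
# solid 2 mm-radius post in a 4 mm-outer-radius implant body, 5 um interference.
p = press_fit_pressure(delta_r=5e-6, a=0.0, b=2e-3, c=4e-3,
                       E_i=110e9, nu_i=0.34, E_o=110e9, nu_o=0.34)
print(round(p / 1e6), "MPa")  # → 103 MPa
```

    Micrometre-scale interference already produces contact pressures of the order of 100 MPa, which is why the elastic prediction of pull-out force breaks down once local plastic deformation sets in, as the abstract notes.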

  10. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Astrophysics Data System (ADS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-05-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth sciences. The lessons learned and experiences given are based on a survey of use cases, with insights into a few use cases provided in detail.

  11. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address the scientific community's needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth sciences. The lessons learned and experiences given are based on a survey of use cases, with insights into a few use cases provided in detail.

  12. A harmonized immunoassay with liquid chromatography-mass spectrometry analysis in egg allergen determination.

    PubMed

    Nimata, Masaomi; Okada, Hideki; Kurihara, Kei; Sugimoto, Tsukasa; Honjoh, Tsutomu; Kuroda, Kazuhiko; Yano, Takeo; Tachibana, Hirofumi; Shoji, Masahiro

    2018-01-01

    Food allergy is a serious health issue worldwide. Implementing allergen labeling regulations is extremely challenging for regulators, food manufacturers, and analytical kit manufacturers. Here we have developed an "amino acid sequence immunoassay" approach to ELISA. The new ELISA comprises a monoclonal antibody generated via an analyte-specific peptide antigen and a sodium lauryl sulfate/sulfite solution. This combination enables the antibody to access the epitope site in the unfolded analyte protein. The newly developed ELISA recovered 87.1%-106.4% of ovalbumin from ovalbumin-incurred model processed foods, demonstrating its applicability to practical egg allergen determination. Furthermore, comparison of LC-MS/MS with the new ELISA, which targets the amino acid sequence corresponding to the LC-MS/MS detection peptide, showed good agreement; consequently, the harmonization of the two methods was demonstrated. The complementary use of the new ELISA and LC-MS analysis can offer a wide range of practical benefits in terms of ease of use, cost, accuracy, and efficiency in food allergen analysis. In addition, the new assay is attractive with respect to its easy antigen preparation and predetermined specificity. Graphical abstract: An ELISA comprising a monoclonal antibody that targets the amino acid sequence corresponding to the LC-MS detection peptide, together with a protein-unfolding reagent, was developed. In ovalbumin determination, the developed ELISA showed good agreement with LC-MS analysis. Consequently, the harmonization of the immunoassay with LC-MS analysis through a common target amino acid sequence was demonstrated.

  13. The Importance of Method Selection in Determining Product Integrity for Nutrition Research

    PubMed Central

    Mudge, Elizabeth M; Brown, Paula N

    2016-01-01

    The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. PMID:26980823

  14. Supramolecular analytical chemistry.

    PubMed

    Anslyn, Eric V

    2007-02-02

    A large fraction of the field of supramolecular chemistry has focused in previous decades upon the study and use of synthetic receptors as a means of mimicking natural receptors. Recently, the demand for synthetic receptors is rapidly increasing within the analytical sciences. These classes of receptors are finding uses in simple indicator chemistry, cellular imaging, and enantiomeric excess analysis, while also being involved in various truly practical assays of bodily fluids. Moreover, one of the most promising areas for the use of synthetic receptors is in the arena of differential sensing. Although many synthetic receptors have been shown to yield exquisite selectivities, in general, this class of receptor suffers from cross-reactivities. Yet, cross-reactivity is an attribute that is crucial to the success of differential sensing schemes. Therefore, both selective and nonselective synthetic receptors are finding uses in analytical applications. Hence, a field of chemistry that herein is entitled "Supramolecular Analytical Chemistry" is emerging, and is predicted to undergo increasingly rapid growth in the near future.

  15. The Importance of Method Selection in Determining Product Integrity for Nutrition Research.

    PubMed

    Mudge, Elizabeth M; Betz, Joseph M; Brown, Paula N

    2016-03-01

    The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. © 2016 American Society for Nutrition.

  16. Fabrication strategies, sensing modes and analytical applications of ratiometric electrochemical biosensors.

    PubMed

    Jin, Hui; Gui, Rijun; Yu, Jianbo; Lv, Wei; Wang, Zonghua

    2017-05-15

    Previously developed electrochemical biosensors with a single electric-signal output can be affected by intrinsic and extrinsic factors. In contrast, ratiometric electrochemical biosensors (RECBSs) with dual electric-signal outputs have an intrinsic built-in correction for system or background electric signals, and therefore exhibit significant potential to improve the accuracy and sensitivity of electrochemical sensing applications. In this review, we systematically summarize the fabrication strategies, sensing modes and analytical applications of RECBSs. First, the different fabrication strategies of RECBSs are introduced, covering strategies in which the analyte induces a change in one or in both electrochemical signals. Second, the different sensing modes of RECBSs are illustrated, such as differential pulse voltammetry, square wave voltammetry, cyclic voltammetry, alternating current voltammetry, electrochemiluminescence, and so forth. Third, the analytical applications of RECBSs are discussed based on the types of target analytes. Finally, the forthcoming developments and future prospects in the research field of RECBSs are highlighted. Copyright © 2017 Elsevier B.V. All rights reserved.
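    The built-in correction that dual-signal readout provides can be demonstrated with a toy numerical example (the response functions below are hypothetical, not from the review): a multiplicative drift that shifts a single-channel signal cancels exactly in the ratio of two channels that share it.

```python
def single_readout(conc, drift):
    """Hypothetical single-channel response: scales with the analyte but
    also with an uncontrolled multiplicative drift (electrode fouling etc.)."""
    return drift * (2.0 + 5.0 * conc)

def ratiometric_readout(conc, drift):
    """Two channels sharing the same multiplicative drift: their ratio
    cancels it, which is the built-in correction the review describes."""
    analyte_signal = drift * (2.0 + 5.0 * conc)   # analyte-dependent channel
    reference_signal = drift * 3.0                # analyte-independent reference
    return analyte_signal / reference_signal

# A 20% drift shifts the single-channel readout but leaves the ratio unchanged:
print(round(single_readout(1.0, 1.0), 3), round(single_readout(1.0, 1.2), 3))  # → 7.0 8.4
print(round(ratiometric_readout(1.0, 1.0), 3),
      round(ratiometric_readout(1.0, 1.2), 3))  # → 2.333 2.333
```

    This only works for disturbances common to both channels, which is why the fabrication strategies reviewed focus on building a reference signal that experiences the same environment as the analyte signal.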

  17. Stability analysis and application of a mathematical cholera model.

    PubMed

    Liao, Shu; Wang, Jin

    2011-07-01

    In this paper, we conduct a dynamical analysis of the deterministic cholera model proposed in [9]. We study the stability of both the disease-free and endemic equilibria so as to explore the complex epidemic and endemic dynamics of the disease. We demonstrate a real-world application of this model by investigating the recent cholera outbreak in Zimbabwe. Meanwhile, we present numerical simulation results to verify the analytical predictions.
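    The distinction between stable disease-free and endemic dynamics can be illustrated numerically with a generic SIR model. This is a stand-in: the cholera model analyzed in the paper also includes a water-reservoir compartment and is not reproduced here. Infection grows from a small seed only when the basic reproduction number R0 = β/γ exceeds 1:

```python
def simulate_sir(beta, gamma, s0=0.99, i0=0.01, dt=0.01, steps=400):
    """Forward-Euler integration of the basic SIR model (a generic stand-in;
    the cholera model of the paper also tracks a water-reservoir compartment)."""
    s, i = s0, i0
    for _ in range(steps):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s += dt * ds
        i += dt * di
    return s, i

# R0 = beta/gamma = 0.8: below threshold, infection decays toward the
# disease-free equilibrium. R0 = 2.0: above threshold, an outbreak grows.
_, i_low = simulate_sir(beta=0.8, gamma=1.0)
_, i_high = simulate_sir(beta=2.0, gamma=1.0)
print(i_low < 0.01, i_high > 0.01)  # → True True
```

    Stability analyses like the paper's make this threshold rigorous: the disease-free equilibrium is locally stable exactly when R0 < 1, and an endemic equilibrium appears when R0 > 1.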

  18. Immunoaffinity chromatography: an introduction to applications and recent developments

    PubMed Central

    Moser, Annette C

    2010-01-01

    Immunoaffinity chromatography (IAC) combines the use of LC with the specific binding of antibodies or related agents. The resulting method can be used in assays for a particular target or for purification and concentration of analytes prior to further examination by another technique. This review discusses the history and principles of IAC and the various formats that can be used with this method. An overview is given of the general properties of antibodies and of antibody-production methods. The supports and immobilization methods used with antibodies in IAC and the selection of application and elution conditions for IAC are also discussed. Several applications of IAC are considered, including its use in purification, immunodepletion, direct sample analysis, chromatographic immunoassays and combined analysis methods. Recent developments include the use of IAC with CE or MS, ultrafast immunoextraction methods and the use of immunoaffinity columns in microanalytical systems. PMID:20640220

  19. Collective phase response curves for heterogeneous coupled oscillators

    NASA Astrophysics Data System (ADS)

    Hannay, Kevin M.; Booth, Victoria; Forger, Daniel B.

    2015-08-01

    Phase response curves (PRCs) have become an indispensable tool in understanding the entrainment and synchronization of biological oscillators. However, biological oscillators are often found in large, coupled, heterogeneous systems, and the variable of physiological importance is the collective rhythm resulting from an aggregation of the individual oscillations. To study this phenomenon, we consider phase resetting of the collective rhythm for large ensembles of globally coupled Sakaguchi-Kuramoto oscillators. Making use of Ott-Antonsen theory, we derive an asymptotically valid analytic formula for the collective PRC. A result of this analysis is a characteristic scaling for the change in the amplitude and entrainment points of the collective PRC compared with the individual oscillator PRC. We support the analytical findings with numerical evidence and demonstrate the applicability of the theory to large ensembles of coupled neuronal oscillators.
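    The collective rhythm of a heterogeneous ensemble can be reproduced with a minimal mean-field Kuramoto simulation. This is an illustrative sketch only: the Sakaguchi phase lag and the Ott-Antonsen reduction used in the paper are omitted, and all parameters are assumed.

```python
import cmath
import math
import random

def kuramoto_order(phases):
    """Complex order parameter z = R*exp(i*psi): R in [0, 1] measures how
    synchronized the ensemble is; psi is the phase of the collective rhythm."""
    z = sum(cmath.exp(1j * p) for p in phases) / len(phases)
    return abs(z), cmath.phase(z)

def simulate(K, n=200, dt=0.01, steps=1500, seed=1):
    """Euler integration of globally coupled Kuramoto oscillators,
    dtheta_i/dt = w_i + (K/n) * sum_j sin(theta_j - theta_i),
    written in the equivalent mean-field form K*R*sin(psi - theta_i)."""
    rng = random.Random(seed)
    w = [rng.gauss(0.0, 0.5) for _ in range(n)]          # heterogeneous frequencies
    th = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    for _ in range(steps):
        R, psi = kuramoto_order(th)
        th = [t + dt * (wi + K * R * math.sin(psi - t)) for t, wi in zip(th, w)]
    return kuramoto_order(th)[0]

# Strong coupling pulls the heterogeneous ensemble into a collective rhythm;
# weak coupling leaves it incoherent.
print(simulate(K=2.0) > simulate(K=0.1))  # → True
```

    A collective PRC experiment on top of this would perturb every phase at a chosen collective phase psi and measure the asymptotic shift in psi, which is the quantity the paper's analytic formula predicts.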

  20. Analytical investigations on the thermal properties of microscale inorganic light-emitting diodes on an orthotropic substrate

    NASA Astrophysics Data System (ADS)

    Li, Y.; Chen, J.; Xing, Y.; Song, J.

    2017-03-01

    Microscale inorganic light-emitting diodes (μ-ILEDs) create novel opportunities in biointegrated applications such as wound-healing acceleration and optogenetics. Analytical expressions, validated by finite element analysis, are obtained for the temperature increase of a rectangular μ-ILED device on an orthotropic substrate, which could offer an appealing advantage in controlling the heat-flow direction for thermal management. The influences of various parameters (e.g., thermal conductivities of the orthotropic substrate, loading parameters) on the temperature increase of the μ-ILED are investigated based on the obtained closed-form solutions. These results provide a novel route to control the temperature distribution in the μ-ILED system and provide easily interpretable guidelines to minimize adverse thermal effects.

  1. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  2. An UPLC-ESI-MS/MS Assay Using 6-Aminoquinolyl-N-Hydroxysuccinimidyl Carbamate Derivatization for Targeted Amino Acid Analysis: Application to Screening of Arabidopsis thaliana Mutants.

    PubMed

    Salazar, Carolina; Armenta, Jenny M; Shulaev, Vladimir

    2012-07-06

    In spite of the large arsenal of methodologies developed for amino acid assessment in complex matrices, their implementation in metabolomics studies involving wide-ranging mutant screening is hampered by their lack of throughput, sensitivity, reproducibility, and/or wide dynamic range. In response to the challenge of developing amino acid analysis methods that satisfy the criteria required for metabolomic studies, improved reverse-phase high-performance liquid chromatography-mass spectrometry (RP-HPLC-MS) methods have recently been reported for large-scale screening of metabolic phenotypes. However, these methods focus on the direct analysis of underivatized amino acids and therefore suffer from insufficient retention and resolution due to the hydrophilic nature of amino acids. It is well known that derivatization methods render amino acids more amenable to reverse-phase chromatographic analysis by introducing highly hydrophobic tags at their carboxylic acid or amino functional group. Therefore, an analytical platform that combines the 6-aminoquinolyl-N-hydroxysuccinimidyl carbamate (AQC) pre-column derivatization method with ultra-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (UPLC-ESI-MS/MS) is presented in this article. For numerous reasons, typical amino acid derivatization methods would be inadequate for large-scale metabolomic projects. However, AQC derivatization is a simple, rapid, and reproducible way of obtaining stable amino acid adducts amenable to UPLC-ESI-MS/MS, and the applicability of the method for high-throughput metabolomic analysis in Arabidopsis thaliana is demonstrated in this study. Overall, the major advantages offered by this amino acid analysis method are high throughput and enhanced sensitivity and selectivity, characteristics that showcase its utility for the rapid screening of preselected plant metabolites without compromising the quality of the metabolic data.
The presented method enabled thirty-eight metabolites (proteinogenic amino acids and related compounds) to be analyzed within 10 min, with detection limits down to 1.02 × 10⁻¹¹ M (i.e., attomole level on column), an improvement in sensitivity of 1 to 5 orders of magnitude over existing methods. Our UPLC-ESI-MS/MS method is one of the seven analytical platforms used by the Arabidopsis Metabolomics Consortium. The amino acid dataset obtained by analysis of Arabidopsis T-DNA mutant stocks with our platform is captured and open to the public on the web portal PlantMetabolomics.org. The analytical platform described here could find important applications in other studies where rapid, high-throughput, and sensitive assessment of low-abundance amino acids in complex biosamples is necessary.
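
The reported molar detection limit can be related to an on-column amount with a quick unit conversion. The 1 µL injection volume below is a hypothetical value chosen for illustration, not a figure from the study:

```python
# Convert a molar detection limit to an on-column molar amount.
# The injection volume (1 uL) is an illustrative assumption.
DETECTION_LIMIT_M = 1.02e-11      # mol/L, reported LOD
INJECTION_VOLUME_L = 1e-6         # 1 uL, hypothetical

moles_on_column = DETECTION_LIMIT_M * INJECTION_VOLUME_L
attomoles = moles_on_column / 1e-18   # 1 attomole = 1e-18 mol

print(f"{moles_on_column:.2e} mol on column ≈ {attomoles:.1f} attomole")
```

With these assumptions the LOD corresponds to roughly ten attomoles on column, consistent with the "attomole level" wording in the abstract.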

  3. An UPLC-ESI-MS/MS Assay Using 6-Aminoquinolyl-N-Hydroxysuccinimidyl Carbamate Derivatization for Targeted Amino Acid Analysis: Application to Screening of Arabidopsis thaliana Mutants

    PubMed Central

    Salazar, Carolina; Armenta, Jenny M.; Shulaev, Vladimir

    2012-01-01

    In spite of the large arsenal of methodologies developed for amino acid assessment in complex matrices, their implementation in metabolomics studies involving wide-ranging mutant screening is hampered by their lack of throughput, sensitivity, reproducibility, and/or wide dynamic range. In response to the challenge of developing amino acid analysis methods that satisfy the criteria required for metabolomic studies, improved reverse-phase high-performance liquid chromatography-mass spectrometry (RP-HPLC-MS) methods have recently been reported for large-scale screening of metabolic phenotypes. However, these methods focus on the direct analysis of underivatized amino acids and therefore suffer from insufficient retention and resolution due to the hydrophilic nature of amino acids. It is well known that derivatization methods render amino acids more amenable to reverse-phase chromatographic analysis by introducing highly hydrophobic tags at their carboxylic acid or amino functional group. Therefore, an analytical platform that combines the 6-aminoquinolyl-N-hydroxysuccinimidyl carbamate (AQC) pre-column derivatization method with ultra-performance liquid chromatography-electrospray ionization-tandem mass spectrometry (UPLC-ESI-MS/MS) is presented in this article. For numerous reasons, typical amino acid derivatization methods would be inadequate for large-scale metabolomic projects. However, AQC derivatization is a simple, rapid, and reproducible way of obtaining stable amino acid adducts amenable to UPLC-ESI-MS/MS, and the applicability of the method for high-throughput metabolomic analysis in Arabidopsis thaliana is demonstrated in this study. Overall, the major advantages offered by this amino acid analysis method are high throughput and enhanced sensitivity and selectivity, characteristics that showcase its utility for the rapid screening of preselected plant metabolites without compromising the quality of the metabolic data.
The presented method enabled thirty-eight metabolites (proteinogenic amino acids and related compounds) to be analyzed within 10 min, with detection limits down to 1.02 × 10⁻¹¹ M (i.e., attomole level on column), an improvement in sensitivity of 1 to 5 orders of magnitude over existing methods. Our UPLC-ESI-MS/MS method is one of the seven analytical platforms used by the Arabidopsis Metabolomics Consortium. The amino acid dataset obtained by analysis of Arabidopsis T-DNA mutant stocks with our platform is captured and open to the public on the web portal PlantMetabolomics.org. The analytical platform described here could find important applications in other studies where rapid, high-throughput, and sensitive assessment of low-abundance amino acids in complex biosamples is necessary. PMID:24957640

  4. Analysis of gene network robustness based on saturated fixed point attractors

    PubMed Central

    2014-01-01

    The analysis of gene network robustness to noise and mutation is important for fundamental and practical reasons. Robustness refers to the stability of the equilibrium expression state of a gene network under variations of the initial expression state and network topology. Numerical simulation of these variations is commonly used to assess robustness. Since there exist a great number of possible gene network topologies and initial states, even millions of simulations may still be too few to give reliable results. When the initial and equilibrium expression states are restricted to being saturated (i.e., their elements can only take the values 1 or −1, corresponding to maximum activation and maximum repression of genes), an analytical gene network robustness assessment is possible. We present this analytical treatment based on determination of the saturated fixed point attractors for sigmoidal function models. The analysis can determine (a) for a given network, which and how many saturated equilibrium states exist and which and how many saturated initial states converge to each of these saturated equilibrium states, and (b) for a given saturated equilibrium state or a given pair of saturated equilibrium and initial states, which and how many gene networks, referred to as viable, share this saturated equilibrium state or pair of states. We also show that the viable networks sharing a given saturated equilibrium state must follow certain patterns. These capabilities of the analytical treatment make it possible to properly define and accurately determine robustness to noise and mutation for gene networks. Previous network research conclusions drawn from performing millions of simulations follow directly from the results of our analytical treatment. Furthermore, the analytical results provide criteria for the identification of model validity and suggest modified models of gene network dynamics.
The yeast cell-cycle network is used as an illustration of the practical application of this analytical treatment. PMID:24650364
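
In the saturated (high-gain) limit of a sigmoidal model, the discrete-time dynamics can be written as s(t+1) = sign(W s(t)) for states s ∈ {−1, +1}ⁿ, so a saturated state is an equilibrium exactly when sign(W s) = s. A minimal sketch of this fixed-point test, using an invented two-gene mutual-repression matrix rather than any network from the paper:

```python
import itertools
import numpy as np

def saturated_fixed_points(W):
    """Enumerate states s in {-1, +1}^n satisfying sign(W @ s) == s.

    In the saturated limit of sigmoidal dynamics s(t+1) = sign(W s(t)),
    these are exactly the saturated equilibrium states."""
    n = W.shape[0]
    fixed = []
    for bits in itertools.product([-1, 1], repeat=n):
        s = np.array(bits)
        if np.array_equal(np.sign(W @ s), s):
            fixed.append(tuple(bits))
    return fixed

# Toy two-gene mutual-repression network (weights are illustrative):
# each gene activates itself and represses the other.
W = np.array([[ 1., -2.],
              [-2.,  1.]])
print(saturated_fixed_points(W))   # the two "toggle switch" states
```

For this toy matrix the only saturated equilibria are (−1, +1) and (+1, −1), the expected bistable toggle; brute-force enumeration is of course only feasible for small n, which is exactly the motivation for the paper's analytical treatment.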

  5. An analytic performance model of disk arrays and its application

    NASA Technical Reports Server (NTRS)

    Lee, Edward K.; Katz, Randy H.

    1991-01-01

    As disk arrays become widely used, tools for understanding and analyzing their performance become increasingly important. In particular, performance models can be invaluable in both configuring and designing disk arrays. Accurate analytic performance models are preferable to other types of models because they can be quickly evaluated, are applicable under a wide range of system and workload parameters, and can be manipulated by a range of mathematical techniques. Unfortunately, analytic performance models of disk arrays are difficult to formulate due to the presence of queuing and fork-join synchronization: a disk array request is broken up into independent disk requests, all of which must complete to satisfy the original request. We develop, validate, and apply an analytic performance model for disk arrays. We derive simple equations for approximating their utilization, response time, and throughput. We then validate the analytic model via simulation and investigate the accuracy of each approximation used in deriving it. Finally, we apply the model to derive an equation for the optimal unit of data striping in disk arrays.
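
The fork-join effect described above can be illustrated with a common textbook approximation (not the paper's model): if a striped request forks into k independent, exponentially distributed disk accesses with mean S, the request completes when the slowest finishes, and the expected maximum of k such times is S·H_k, where H_k is the k-th harmonic number.

```python
def harmonic(k):
    """k-th harmonic number H_k = 1 + 1/2 + ... + 1/k."""
    return sum(1.0 / i for i in range(1, k + 1))

def forkjoin_service_time(mean_disk_service, k):
    """Approximate service time of a request striped over k disks,
    assuming independent exponential per-disk service times: the
    expected maximum of k exponentials with mean S is S * H_k."""
    return mean_disk_service * harmonic(k)

S = 10.0  # ms, illustrative mean per-disk service time
for k in (1, 4, 8):
    print(f"k={k}: ~{forkjoin_service_time(S, k):.1f} ms")
```

The sketch shows why wider striping lengthens an individual request's service time even as it raises aggregate bandwidth, which is the tension behind the paper's optimal striping unit.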

  6. Comparison of three-way and four-way calibration for the real-time quantitative analysis of drug hydrolysis in complex dynamic samples by excitation-emission matrix fluorescence.

    PubMed

    Yin, Xiao-Li; Gu, Hui-Wen; Liu, Xiao-Lu; Zhang, Shan-Hui; Wu, Hai-Long

    2018-03-05

    Multiway calibration in combination with spectroscopic techniques is an attractive tool for online or real-time monitoring of target analyte(s) in complex samples. However, choosing a suitable multiway calibration method for the resolution of spectroscopic-kinetic data remains a challenging problem in practice. In this work, for the first time, three-way and four-way fluorescence-kinetic data arrays were generated during the real-time monitoring of the hydrolysis of irinotecan (CPT-11) in human plasma by excitation-emission matrix fluorescence. Alternating normalization-weighted error (ANWE) and alternating penalty trilinear decomposition (APTLD) were used as three-way calibration methods for the decomposition of the three-way kinetic data array, whereas alternating weighted residual constraint quadrilinear decomposition (AWRCQLD) and alternating penalty quadrilinear decomposition (APQLD) were applied as four-way calibration methods to the four-way kinetic data array. The quantitative results of the two kinds of calibration models were fully compared in terms of predicted real-time concentrations, spiked recoveries of initial concentration, and analytical figures of merit. The comparison demonstrated that both three-way and four-way calibration models can achieve real-time quantitative analysis of the hydrolysis of CPT-11 in human plasma under certain conditions, though each has critical advantages and shortcomings for dynamic analysis. The conclusions obtained in this paper provide helpful guidance for the appropriate selection of multiway calibration models to achieve real-time quantitative analysis of target analyte(s) in complex dynamic systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Flexible manipulator control experiments and analysis

    NASA Technical Reports Server (NTRS)

    Yurkovich, S.; Ozguner, U.; Tzes, A.; Kotnik, P. T.

    1987-01-01

    Modeling and control design for flexible manipulators, from both an experimental and an analytical viewpoint, are described. From the application perspective, an ongoing effort within the laboratory environment at the Ohio State University, where experimentation on a single-link flexible arm is underway, is described. Several unique features of this study are described here. First, the manipulator arm is slewed by a direct-drive dc motor and has a rigid counterbalance appendage. Current experimentation proceeds from two viewpoints: (1) rigid-body slewing and vibration control via actuation with the hub motor, and (2) vibration suppression through the use of structure-mounted proof-mass actuation at the tip. Such an application to manipulator control is of particular interest in the design of space-based telerobotic control systems, but has received little attention to date. From an analytical viewpoint, parameter estimation techniques within the closed loop for self-tuning adaptive control approaches are discussed. Also introduced is a control approach based on output feedback and frequency weighting to counteract the effects of spillover in reduced-order model design. A model of the flexible manipulator based on experimental measurements is evaluated for such estimation and control approaches.

  8. The intrinsic fluorescence of FAD and its application in analytical chemistry: a review

    NASA Astrophysics Data System (ADS)

    Galbán, Javier; Sanz-Vicente, Isabel; Navarro, Jesús; de Marcos, Susana

    2016-12-01

    This review (with 106 references) mainly deals with the analytical applications of flavin-adenine dinucleotide (FAD) fluorescence. The first section reviews the spectroscopic properties of this compound in light of its different acid-base, oxidation, and structural forms; the chemical and spectroscopic properties of flavin mononucleotide (FMN) and other flavins are also briefly discussed. The second section discusses how the properties of FAD fluorescence change in flavoenzymes (FvEs), again considering the different chemical and structural forms; the glucose oxidase (GOx) and choline oxidase (ChOx) cases are discussed. Since the most widely reported analytical application of FAD fluorescence is almost certainly as an auto-indicator in enzymatic methods catalysed by FvE oxidoreductases, it is important to know how the concentrations of the different forms of FAD, and consequently the fluorescence and the analytical signals, change over the course of the reaction. An approach to doing this is presented in section 3. The fourth part of the paper compiles the analytical applications reported to date based on these fluorescence properties. Finally, some suggestions for future research are given.

  9. The intrinsic fluorescence of FAD and its application in analytical chemistry: a review.

    PubMed

    Galbán, Javier; Sanz-Vicente, Isabel; Navarro, Jesús; de Marcos, Susana

    2016-12-19

    This review (with 106 references) mainly deals with the analytical applications of flavin-adenine dinucleotide (FAD) fluorescence. The first section reviews the spectroscopic properties of this compound in light of its different acid-base, oxidation, and structural forms; the chemical and spectroscopic properties of flavin mononucleotide (FMN) and other flavins are also briefly discussed. The second section discusses how the properties of FAD fluorescence change in flavoenzymes (FvEs), again considering the different chemical and structural forms; the glucose oxidase (GOx) and choline oxidase (ChOx) cases are discussed. Since the most widely reported analytical application of FAD fluorescence is almost certainly as an auto-indicator in enzymatic methods catalysed by FvE oxidoreductases, it is important to know how the concentrations of the different forms of FAD, and consequently the fluorescence and the analytical signals, change over the course of the reaction. An approach to doing this is presented in section 3. The fourth part of the paper compiles the analytical applications reported to date based on these fluorescence properties. Finally, some suggestions for future research are given.

  10. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates used in the new model. It also presents the application of uncertainty analysis to analytical models, including the uncertainty analysis for the conservation of mass and energy balance relations. A new methodology for assessing the uncertainty associated with linear regressions is presented.
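
For the linear-regression uncertainty mentioned above, the classical OLS standard error of a fitted value is a useful baseline. This is the textbook formula, not the report's new methodology, and the data points are invented for illustration:

```python
import numpy as np

def regression_prediction_interval(x, y, x0):
    """Ordinary least-squares fit y = a + b*x with the textbook
    standard error of the mean prediction at x0."""
    n = len(x)
    b, a = np.polyfit(x, y, 1)                # slope, intercept
    resid = y - (a + b * x)
    s = np.sqrt(np.sum(resid**2) / (n - 2))   # residual std error
    sxx = np.sum((x - x.mean())**2)
    se = s * np.sqrt(1.0 / n + (x0 - x.mean())**2 / sxx)
    return a + b * x0, se

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
yhat, se = regression_prediction_interval(x, y, 3.0)
print(f"prediction at x=3: {yhat:.2f} ± {se:.2f} (1 sigma)")
```

Note that the standard error grows as x0 moves away from the mean of the calibration points, which is why extrapolated predictions carry larger uncertainty than interpolated ones.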

  11. A Finite Element Procedure for Calculating Fluid-Structure Interaction Using MSC/NASTRAN

    NASA Technical Reports Server (NTRS)

    Chargin, Mladen; Gartmeier, Otto

    1990-01-01

    This report is intended to serve two purposes. The first is to present a survey of the theoretical background of the dynamic interaction between an inviscid, compressible fluid and an elastic structure. Section one presents a short survey of the application of the finite element method (FEM) to the area of fluid-structure interaction (FSI). Section two describes the mathematical foundation of the structure and fluid, with special emphasis on the fluid. The main steps in establishing the finite element (FE) equations for the fluid-structure coupling are discussed in section three. The second purpose is to demonstrate the application of MSC/NASTRAN to the solution of FSI problems. Some specific topics, such as the fluid-structure analogy, acoustic absorption, and acoustic contribution analysis, are described in section four. Section five deals with the organization of the acoustic procedure flowchart. Section six includes the most important information that a user needs for applying the acoustic procedure to practical FSI problems. Beginning with some rules concerning FE modeling of the coupled system, the NASTRAN USER DECKs for the different steps are described. The goal of section seven is to demonstrate the use of the acoustic procedure with some examples. This demonstration includes an analytic verification of selected FE results. The analytical description considers only some aspects of FSI and is not intended to be mathematically complete. Finally, section eight presents an application of the acoustic procedure to vehicle interior acoustic analysis with selected results.

  12. 1990 National Water Quality Laboratory Services Catalog

    USGS Publications Warehouse

    Pritt, Jeffrey; Jones, Berwyn E.

    1989-01-01

    PREFACE This catalog provides information about analytical services available from the National Water Quality Laboratory (NWQL) to support programs of the Water Resources Division of the U.S. Geological Survey. To assist personnel in the selection of analytical services, the catalog lists cost, sample volume, applicable concentration range, detection level, precision of analysis, and preservation techniques for samples to be submitted for analysis. Prices for services reflect operational costs, the complexity of each analytical procedure, and the costs to ensure analytical quality control. The catalog consists of five parts. Part 1 is a glossary of terminology; Part 2 lists the bottles, containers, solutions, and other materials that are available through the NWQL; Part 3 describes the field processing of samples to be submitted for analysis; Part 4 describes analytical services that are available; and Part 5 contains indices of analytical methodology and Chemical Abstract Services (CAS) numbers. Nomenclature used in the catalog is consistent with WATSTORE and STORET. The user is provided with laboratory codes and schedules that consist of groupings of parameters measured together in the NWQL. In cases where more than one analytical range is offered for a single element or compound, different laboratory codes are given. Book 5 of the series 'Techniques of Water Resources Investigations of the U.S. Geological Survey' should be consulted for more information about the analytical procedures included in the tabulations. This catalog supersedes U.S. Geological Survey Open-File Report 86-232 '1986-87-88 National Water Quality Laboratory Services Catalog', October 1985.

  13. Training the next generation analyst using red cell analytics

    NASA Astrophysics Data System (ADS)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and a methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools has created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-generation analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University has been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA lies not always in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the areas of security, risk and intelligence training.

  14. New trends in the analytical determination of emerging contaminants and their transformation products in environmental waters.

    PubMed

    Agüera, Ana; Martínez Bueno, María Jesús; Fernández-Alba, Amadeo R

    2013-06-01

    Since the so-called emerging contaminants were established as a new group of pollutants of environmental concern, a great effort has been devoted to understanding their distribution, fate and effects in the environment. After more than 20 years of work, a significant improvement in knowledge about these contaminants has been achieved, but there is still a large gap in information on the growing number of new potential contaminants that are appearing, and especially on their unpredictable transformation products. Although the environmental problem arising from emerging contaminants must be addressed from an interdisciplinary point of view, it is obvious that analytical chemistry plays an important role as the first step of the study, as it allows establishing the presence of chemicals in the environment, estimating their concentration levels, identifying sources and determining their degradation pathways. These tasks involve serious difficulties requiring different analytical solutions adjusted to purpose. Thus, the complexity of the matrices requires highly selective analytical methods; the large number and variety of compounds potentially present in the samples demand the application of wide-scope methods; the low concentrations at which these contaminants are present require high detection sensitivity; and the characterisation of unknowns places high demands on confirmation and structural information. New developments in analytical instrumentation have been applied to solve these difficulties. No less important has been the development of new specific software packages intended for data acquisition and, in particular, for post-run analysis. Thus, the use of sophisticated software tools has allowed successful screening analyses determining several hundred analytes, and has assisted in the structural elucidation of unknown compounds in a timely manner.

  15. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    DOT National Transportation Integrated Search

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  16. Application and further development of diffusion based 2D chemical imaging techniques in the rhizosphere

    NASA Astrophysics Data System (ADS)

    Hoefer, Christoph; Santner, Jakob; Borisov, Sergey; Kreuzeder, Andreas; Wenzel, Walter; Puschenreiter, Markus

    2015-04-01

    Two-dimensional chemical imaging of root processes refers to novel in situ methods to investigate and map solutes at high spatial resolution (sub-mm). The visualization of these solutes reveals new insights into soil biogeochemistry and root processes. We derive chemical images using data from DGT-LA-ICP-MS (diffusive gradients in thin films combined with laser ablation inductively coupled plasma mass spectrometry) and POS (planar optode sensors). Both technologies have shown promising results when applied in aqueous environments but need to be refined and improved for imaging at the soil-plant interface. Our focus is on co-localized mapping using combined DGT and POS technologies and on the development of new gel combinations. DGT samplers are thin (<0.4 mm) hydrogels containing a binding resin for the targeted analytes (e.g. trace metals, phosphate, sulphide or radionuclides). The measurement principle is passive and diffusion based: the analytes present diffuse into the gel and are bound by the resin, which thereby acts as a zero sink. After application, DGT samplers are retrieved, dried, and analysed using LA-ICP-MS. The data are then normalized by an internal standard (e.g. 13C) and calibrated using in-house standards, and chemical images of the target area are plotted using imaging software. POS are, similar to DGT, thin sensor foils containing a fluorophore coating chosen for the target analyte. The measurement principle is based on excitation of the fluorophore at a specific wavelength and emission of the fluorophore depending on the presence of the analyte. The emitted signal is captured using optical filters and a DSLR camera. While DGT analysis is destructive, POS measurements can be performed continuously during the application. Both semi-quantitative techniques allow in situ application to visualize chemical processes directly at the soil-plant interface.
Here, we present a summary of results from rhizotron experiments with different plants in metal-contaminated and agricultural soils.
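
In DGT work, solution concentrations are typically recovered from the resin-bound mass via the standard DGT equation, C = M·Δg/(D·A·t), where M is the accumulated mass, Δg the diffusive layer thickness, D the analyte's diffusion coefficient in the gel, A the exposure window area, and t the deployment time. A minimal sketch of that calculation with illustrative values (none taken from this study):

```python
def dgt_concentration(mass_ng, dg_cm, D_cm2_s, area_cm2, time_s):
    """Standard DGT equation: C = M * dg / (D * A * t).
    With mass in ng and lengths in cm, the result is in ng/cm^3,
    i.e. ng/mL, a time-averaged solution concentration."""
    return mass_ng * dg_cm / (D_cm2_s * area_cm2 * time_s)

# Illustrative values: 5 ng accumulated, 0.08 cm diffusive gel,
# D ~ 6e-6 cm^2/s (order of magnitude typical of trace metals),
# 3.14 cm^2 exposure window, 24 h deployment.
C = dgt_concentration(5.0, 0.08, 6e-6, 3.14, 24 * 3600)
print(f"time-averaged concentration ≈ {C:.4f} ng/mL")
```

The same mass-to-concentration step underlies the LA-ICP-MS imaging variant, except that M is resolved per pixel rather than for the whole sampler.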

  17. CPTAC Accelerates Precision Proteomics Biomedical Research | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The accurate quantitation of proteins or peptides using Mass Spectrometry (MS) is gaining prominence in the biomedical research community as an alternative method for analyte measurement. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) investigators have been at the forefront in the promotion of reproducible MS techniques, through the development and application of standardized proteomic methods for protein quantitation on biologically relevant samples.

  18. Analytical applications of microbial fuel cells. Part I: Biochemical oxygen demand.

    PubMed

    Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo

    2015-01-15

    Microbial fuel cells (MFCs) are bio-electrochemical devices in which usually the anode (but sometimes the cathode, or both) contains microorganisms able to generate and sustain an electrochemical gradient, typically used to generate electrical power. In the most studied set-up, the anode contains heterotrophic bacteria in anaerobic conditions, capable of oxidizing organic molecules and releasing protons and electrons, as well as other by-products. Released protons reach the cathode (through a membrane or not), whereas electrons travel across an external circuit, generating an easily measurable direct current. MFCs have been proposed fundamentally as electric-power-producing devices or, more recently, as hydrogen-producing devices. Here we review the still incipient development of analytical uses of MFCs and related devices or set-ups, in the light of a non-restrictive MFC definition, as promising tools to assess water quality and other measurable parameters. An introduction to biologically based analytical methods, including bioassays and biosensors, as well as MFC design and operating principles, is also included. In addition, the use of MFCs as biochemical oxygen demand sensors (perhaps the main analytical application of MFCs) is discussed. In a companion review (Part 2), other analytical applications are covered, including toxicity sensors, metabolic sensors, life detectors, and other proposed uses. Copyright © 2014 Elsevier B.V. All rights reserved.
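
The raw signal of an MFC-based BOD sensor is the current trace, and the quantity commonly correlated with BOD is the accumulated charge Q = ∫I dt. A minimal sketch of that integration step, with an invented current trace and no calibration against real BOD values:

```python
# Sketch: accumulated charge from an MFC current trace, the raw
# quantity commonly correlated with BOD in MFC-based sensors.
# The sampled current values below are illustrative only.

def accumulated_charge(times_s, currents_a):
    """Trapezoidal integration of current over time: Q = ∫ I dt (coulombs)."""
    q = 0.0
    for i in range(1, len(times_s)):
        dt = times_s[i] - times_s[i - 1]
        q += 0.5 * (currents_a[i] + currents_a[i - 1]) * dt
    return q

times = [0, 600, 1200, 1800, 2400]        # s, 10-min sampling
currents = [0.0, 2e-4, 3e-4, 3e-4, 1e-4]  # A, rise and decay as substrate is consumed
Q = accumulated_charge(times, currents)
print(f"accumulated charge: {Q:.2f} C")
```

In practice the charge (or a related coulombic yield) is then mapped to a BOD value through a calibration curve built with standard solutions, which the review discusses.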

  19. Markov Chain Ontology Analysis (MCOA)

    PubMed Central

    2012-01-01

    Background Biomedical ontologies have become an increasingly critical lens through which researchers analyze the genomic, clinical and bibliographic data that fuels scientific research. Of particular relevance are methods, such as enrichment analysis, that quantify the importance of ontology classes relative to a collection of domain data. Current analytical techniques, however, remain limited in their ability to handle many important types of structural complexity encountered in real biological systems, including class overlaps, continuously valued data, inter-instance relationships, non-hierarchical relationships between classes, semantic distance and sparse data. Results In this paper, we describe a methodology called Markov Chain Ontology Analysis (MCOA) and illustrate its use through a MCOA-based enrichment analysis application based on a generative model of gene activation. MCOA models the classes in an ontology, the instances from an associated dataset and all directional inter-class, class-to-instance and inter-instance relationships as a single finite ergodic Markov chain. The adjusted transition probability matrix for this Markov chain enables the calculation of eigenvector values that quantify the importance of each ontology class relative to other classes and the associated data set members. On both controlled Gene Ontology (GO) data sets created with Escherichia coli, Drosophila melanogaster and Homo sapiens annotations and real gene expression data extracted from the Gene Expression Omnibus (GEO), the MCOA enrichment analysis approach provides the best performance among comparable state-of-the-art methods.
Conclusion A methodology based on Markov chain models and network analytic metrics can help detect the relevant signal within large, highly interdependent and noisy data sets and, for applications such as enrichment analysis, has been shown to generate superior performance on both real and simulated data relative to existing state-of-the-art approaches. PMID:22300537

  20. Markov Chain Ontology Analysis (MCOA).

    PubMed

    Frost, H Robert; McCray, Alexa T

    2012-02-03

    Biomedical ontologies have become an increasingly critical lens through which researchers analyze the genomic, clinical and bibliographic data that fuels scientific research. Of particular relevance are methods, such as enrichment analysis, that quantify the importance of ontology classes relative to a collection of domain data. Current analytical techniques, however, remain limited in their ability to handle many important types of structural complexity encountered in real biological systems including class overlaps, continuously valued data, inter-instance relationships, non-hierarchical relationships between classes, semantic distance and sparse data. In this paper, we describe a methodology called Markov Chain Ontology Analysis (MCOA) and illustrate its use through an MCOA-based enrichment analysis application based on a generative model of gene activation. MCOA models the classes in an ontology, the instances from an associated dataset and all directional inter-class, class-to-instance and inter-instance relationships as a single finite ergodic Markov chain. The adjusted transition probability matrix for this Markov chain enables the calculation of eigenvector values that quantify the importance of each ontology class relative to other classes and the associated data set members. On both controlled Gene Ontology (GO) data sets created with Escherichia coli, Drosophila melanogaster and Homo sapiens annotations and real gene expression data extracted from the Gene Expression Omnibus (GEO), the MCOA enrichment analysis approach provides the best performance among comparable state-of-the-art methods. A methodology based on Markov chain models and network analytic metrics can help detect the relevant signal within large, highly interdependent and noisy data sets and, for applications such as enrichment analysis, has been shown to generate superior performance on both real and simulated data relative to existing state-of-the-art approaches.
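
A minimal sketch of the core idea, not the authors' implementation: model a toy ontology (is-a edges plus annotation edges from gene instances to classes) as an ergodic Markov chain with a damping/teleport term, and rank classes by the stationary distribution computed via power iteration. All node names and the damping value are illustrative assumptions.

```python
# Illustrative MCOA-style ranking: stationary distribution of a damped
# Markov chain over ontology classes and annotated gene instances.

def stationary(nodes, edges, damping=0.85, iters=200):
    """Power-iterate the damped transition matrix; returns node -> probability."""
    out = {n: [] for n in nodes}
    for src, dst in edges:
        out[src].append(dst)
    n = len(nodes)
    p = {node: 1.0 / n for node in nodes}
    for _ in range(iters):
        nxt = {node: (1.0 - damping) / n for node in nodes}
        for node, prob in p.items():
            targets = out[node] or nodes          # dangling nodes spread uniformly
            share = damping * prob / len(targets)
            for t in targets:
                nxt[t] += share
        p = nxt
    return p

# Toy ontology: a root class, two subclasses, and annotated gene instances.
nodes = ["root", "classA", "classB", "gene1", "gene2", "gene3"]
edges = [
    ("classA", "root"), ("classB", "root"),    # is-a edges point to parents
    ("gene1", "classA"), ("gene2", "classA"),  # annotation (instance-to-class) edges
    ("gene3", "classB"),
]
rank = stationary(nodes, edges)
# classA receives two instance edges, so it should outrank classB.
assert rank["classA"] > rank["classB"]
```

The damping term is what makes the chain ergodic on an otherwise hierarchical (absorbing) graph, mirroring the "adjusted transition probability matrix" described in the abstract.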

  1. Biological Nanopores: Confined Spaces for Electrochemical Single-Molecule Analysis.

    PubMed

    Cao, Chan; Long, Yi-Tao

    2018-02-20

    Nanopore sensing is developing into a powerful single-molecule approach to investigate the features of biomolecules that are not accessible by studying ensemble systems. When a target molecule is transported through a nanopore, the ions occupying the pore are excluded, resulting in an electrical signal from the intermittent ionic blockade event. By statistical analysis of the amplitudes, duration, frequencies, and shapes of the blockade events, many properties of the target molecule can be obtained in real time at the single-molecule level, including its size, conformation, structure, charge, geometry, and interactions with other molecules. With the development of the use of α-hemolysin to characterize individual polynucleotides, nanopore technology has attracted a wide range of research interest in the fields of biology, physics, chemistry, and nanoscience. As a powerful single-molecule analytical method, nanopore technology has been applied for the detection of various biomolecules, including oligonucleotides, peptides, oligosaccharides, organic molecules, and disease-related proteins. In this Account, we highlight recent developments of biological nanopores in DNA-based sensing and in studying the conformational structures of DNA and RNA. Furthermore, we introduce the application of biological nanopores to investigate the conformations of peptides affected by charge, length, and dipole moment and to study disease-related proteins' structures and aggregation transitions influenced by an inhibitor, a promoter, or an applied voltage. To improve the sensing ability of biological nanopores and further extend their application to a wider range of molecular sensing, we focus on exploring novel biological nanopores, such as aerolysin and Stable Protein 1. Aerolysin exhibits an especially high sensitivity for the detection of single oligonucleotides both in current separation and duration. 
Finally, to facilitate the use of nanopore measurements and statistical analysis, we develop an integrated current measurement system and an accurate data processing method for nanopore sensing. The unique geometric structure of a biological nanopore offers a distinct advantage as a nanosensor for single-molecule sensing. The construction of the pore entrance is responsible for capturing the target molecule, while the lumen region determines the translocation process of the single molecule. Since the capture of the target molecule is predominantly diffusion-limited, it is expected that the capture ability of the nanopore toward the target analyte could be effectively enhanced by site-directed mutations of key amino acids with desirable groups. Additionally, changing the side chains inside the wall of the biological nanopore could optimize the geometry of the pore and realize an optimal interaction between the single-molecule interface and the analyte. These improvements would allow for high spatial and current resolution of nanopore sensors, which would ensure the possibility of dynamic study of single biomolecules, including their metastable conformations, charge distributions, and interactions. In the future, data analysis with powerful algorithms will make it possible to automatically and statistically extract detailed information while an analyte translocates through the pore. We conclude that these improvements could have tremendous potential applications for nanopore sensing in the near future.
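
The statistical analysis described above starts from event detection in the ionic-current trace. A hypothetical sketch (not the authors' integrated measurement system): threshold the trace against the open-pore current, then report each blockade event's duration and mean blockade depth. All current values are invented for illustration.

```python
# Detect ionic-current blockade events by thresholding; report
# (duration in samples, mean blockade depth) for each event.

def detect_events(trace, open_pore, threshold):
    """Return (duration, mean_depth) pairs for contiguous runs below threshold."""
    events, start = [], None
    for i, current in enumerate(trace + [open_pore]):  # sentinel closes a trailing event
        if current < threshold and start is None:
            start = i
        elif current >= threshold and start is not None:
            segment = trace[start:i]
            depth = open_pore - sum(segment) / len(segment)
            events.append((i - start, depth))
            start = None
    return events

# Synthetic trace: 100 pA open-pore current with two blockades (to ~41 and ~70 pA).
trace = [100, 100, 40, 42, 41, 100, 100, 70, 71, 100]
events = detect_events(trace, open_pore=100, threshold=90)
assert len(events) == 2          # two blockade events found
assert events[0] == (3, 59.0)    # 3 samples long, 59 pA mean depth
```

Histograms of such duration/depth pairs are what yield the size, conformation and interaction information discussed in the Account.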

  2. Source-term development for a contaminant plume for use by multimedia risk assessment models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whelan, Gene; McDonald, John P.; Taira, Randal Y.

    1999-12-01

    Multimedia modelers from the U.S. Environmental Protection Agency (EPA) and the U.S. Department of Energy (DOE) are collaborating to conduct a comprehensive and quantitative benchmarking analysis of four intermedia models: DOE's Multimedia Environmental Pollutant Assessment System (MEPAS), EPA's MMSOILS, EPA's PRESTO, and DOE's RESidual RADioactivity (RESRAD). These models represent typical analytically, semi-analytically, and empirically based tools that are utilized in human risk and endangerment assessments for use at installations containing radioactive and/or hazardous contaminants. Although the benchmarking exercise traditionally emphasizes the application and comparison of these models, the establishment of a Conceptual Site Model (CSM) should be viewed with equal importance. This paper reviews an approach for developing a CSM of an existing, real-world, Sr-90 plume at DOE's Hanford installation in Richland, Washington, for use in a multimedia-based benchmarking exercise between MEPAS, MMSOILS, PRESTO, and RESRAD. In an unconventional move for analytically based modeling, the benchmarking exercise will begin with the plume as the source of contamination. The source and release mechanism are developed and described within the context of performing a preliminary risk assessment utilizing these analytical models. By beginning with the plume as the source term, this paper reviews a typical process and procedure an analyst would follow in developing a CSM for use in a preliminary assessment using this class of analytical tool.

  3. Pharmaceutical applications of dynamic mechanical thermal analysis.

    PubMed

    Jones, David S; Tian, Yiwei; Abu-Diak, Osama; Andrews, Gavin P

    2012-04-01

    The successful development of polymeric drug delivery and biomedical devices requires a comprehensive understanding of the viscoelastic properties of polymers, as these have been shown to directly affect clinical efficacy. Dynamic mechanical thermal analysis (DMTA) is an accessible and versatile analytical technique in which an oscillating stress or strain is applied to a sample as a function of oscillatory frequency and temperature. Through cyclic application of a non-destructive stress or strain, a comprehensive understanding of the viscoelastic properties of polymers may be obtained. In this review, we provide a concise overview of the theory of DMTA and the basic instrumental/operating principles. Moreover, the application of DMTA for the characterization of solid pharmaceutical and biomedical systems is discussed in detail. In particular, we describe the potential of DMTA to measure and understand relaxation transitions and miscibility in binary and higher-order systems and describe the more recent applications of the technique for this purpose. © 2011 Elsevier B.V. All rights reserved.
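
The oscillatory measurement described above reduces to a small calculation: with a phase lag delta between the applied strain and the measured stress, the storage modulus E', loss modulus E'' and damping factor tan(delta) follow directly. The numbers below are illustrative, not from the review.

```python
# DMTA quantities from an oscillatory test: E' (elastic, in-phase),
# E'' (viscous, out-of-phase) and tan(delta) = E''/E'.
import math

def dmta_moduli(stress_amplitude, strain_amplitude, delta_rad):
    e_star = stress_amplitude / strain_amplitude   # complex modulus magnitude
    e_storage = e_star * math.cos(delta_rad)       # storage modulus E'
    e_loss = e_star * math.sin(delta_rad)          # loss modulus E''
    return e_storage, e_loss, e_loss / e_storage   # tan(delta)

# 1 MPa stress amplitude, 1% strain amplitude, 10-degree phase lag (illustrative).
e1, e2, tan_d = dmta_moduli(1.0e6, 0.01, math.radians(10))
assert e1 > e2 > 0   # a predominantly elastic solid: E' dominates E''
```

Relaxation transitions such as the glass transition show up as peaks in tan(delta) as temperature is swept, which is how miscibility in binary systems is typically assessed.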

  4. Carbon nanotubes as adsorbent of solid-phase extraction and matrix for laser desorption/ionization mass spectrometry.

    PubMed

    Pan, Chensong; Xu, Songyun; Zou, Hanfa; Guo, Zhong; Zhang, Yu; Guo, Baochuan

    2005-02-01

    A method in which carbon nanotubes function both as the adsorbent for solid-phase extraction (SPE) and as the matrix for matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) has been developed to analyze small molecules in solution. In this method, 10 microL suspensions of carbon nanotubes in 50% (vol/vol) methanol were added to the sample solution to extract analytes onto the surface of the carbon nanotubes because of their strong hydrophobicity. The carbon nanotubes in solution are deposited onto the bottom of the tube by centrifugation. After removal of the supernatant fluid, the carbon nanotubes are resuspended in dispersant and pipetted directly onto the sample target of the MALDI-MS instrument to perform a mass spectrometric analysis. Analysis of a variety of small molecules demonstrated that the resolution of peaks and the efficiency of desorption/ionization on the carbon nanotubes are better than those on activated carbon. It was found that with the addition of glycerol and sucrose to the dispersant, the intensity, the signal-to-noise ratio (S/N), and the resolution of the analyte peaks increased greatly. Compared with the previously reported method of depositing sample solution onto a thin layer of carbon nanotubes, the detection limit for analytes can be enhanced about 10- to 100-fold owing to the solid-phase extraction of analytes in solution by the carbon nanotubes. An acceptable result for the simultaneous quantitative analysis of three analytes in solution has been achieved. The application to determining drugs spiked into urine has also been demonstrated.

  5. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
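
For context on the kind of closed-form estimate used in preliminary design, here is the classical axial buckling load per unit circumference of an isotropic monocoque cylinder, which neglects transverse-shear deformation just as the analytical approach criticized above does. This is a textbook formula offered as illustration, not the paper's fluted-core method; the material values are invented.

```python
# Classical axial buckling load per unit circumference of an isotropic
# thin-walled cylinder: N_cr = E * t^2 / (R * sqrt(3 * (1 - nu^2))).
# Transverse shear is neglected, so this over-predicts for shear-soft cores.
import math

def classical_axial_buckling(E, t, R, nu):
    """Return the classical buckling load per unit circumference (N/m)."""
    return E * t**2 / (R * math.sqrt(3.0 * (1.0 - nu**2)))

# Illustrative aluminum-like values: E in Pa, thickness and radius in meters.
n_cr = classical_axial_buckling(E=70e9, t=0.002, R=1.0, nu=0.3)
```

The abstract's point is precisely that a formula of this type is cheap enough for a preliminary design analysis cycle but, without transverse-shear terms, overstates the global buckling load of fluted-core cylinders.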

  6. Shelf-life dating of shelf-stable strawberry juice based on survival analysis of consumer acceptance information.

    PubMed

    Buvé, Carolien; Van Bedts, Tine; Haenen, Annelien; Kebede, Biniam; Braekers, Roel; Hendrickx, Marc; Van Loey, Ann; Grauwet, Tara

    2018-07-01

    Accurate shelf-life dating of food products is crucial for consumers and industries. Therefore, in this study we applied a science-based approach for shelf-life assessment, including accelerated shelf-life testing (ASLT), acceptability testing and the screening of analytical attributes for fast shelf-life predictions. Shelf-stable strawberry juice was selected as a case study. Ambient storage (20 °C) had no effect on the aroma-based acceptance of strawberry juice. The colour-based acceptability decreased during storage under ambient and accelerated (28-42 °C) conditions. The application of survival analysis showed that the colour-based shelf-life was reached in the early stages of storage (≤11 weeks) and that the shelf-life was shortened at higher temperatures. None of the selected attributes (a* and ΔE* value, anthocyanin and ascorbic acid content) is an ideal analytical marker for shelf-life predictions in the investigated temperature range (20-42 °C). Nevertheless, an overall analytical cut-off value over the whole temperature range can be selected. Colour changes of strawberry juice during storage are shelf-life limiting. Combining ASLT with acceptability testing made it possible to gain faster insight into the change in colour-based acceptability and to perform shelf-life predictions relying on scientific data. An analytical marker is a convenient tool for shelf-life predictions in the context of ASLT. © 2017 Society of Chemical Industry.
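
A hedged sketch of the ASLT extrapolation logic referred to above: a shelf-life measured at an accelerated temperature can be scaled to ambient with the Arrhenius model. The activation energy and shelf-life values below are invented for illustration and are not the study's fitted parameters.

```python
# Arrhenius extrapolation of an accelerated-condition shelf life to ambient.
import math

R_GAS = 8.314  # gas constant, J/(mol*K)

def arrhenius_extrapolate(shelf_life_accel, t_accel_c, t_ambient_c, ea_j_mol):
    """Scale a shelf life measured at t_accel_c (deg C) to t_ambient_c (deg C)."""
    t_accel = t_accel_c + 273.15
    t_ambient = t_ambient_c + 273.15
    accel_factor = math.exp(ea_j_mol / R_GAS * (1.0 / t_ambient - 1.0 / t_accel))
    return shelf_life_accel * accel_factor

# Hypothetical: 8 weeks at 42 C extrapolated to 20 C with Ea = 80 kJ/mol.
weeks_ambient = arrhenius_extrapolate(8.0, 42.0, 20.0, 80e3)
assert weeks_ambient > 8.0   # milder temperature -> longer shelf life
```

In practice the activation energy is fitted from the degradation kinetics of the chosen analytical marker across the accelerated temperatures before any such extrapolation is trusted.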

  7. 21 CFR 314.94 - Content and format of an abbreviated application.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... bioequivalence study contained in the abbreviated new drug application, a description of the analytical and... exclusivity under section 505(j)(5)(F) of the act. (9) Chemistry, manufacturing, and controls. (i) The... the act and one copy of the analytical procedures and descriptive information needed by FDA's...

  8. 21 CFR 314.94 - Content and format of an abbreviated application.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... bioequivalence study contained in the abbreviated new drug application, a description of the analytical and... exclusivity under section 505(j)(5)(F) of the act. (9) Chemistry, manufacturing, and controls. (i) The... the act and one copy of the analytical procedures and descriptive information needed by FDA's...

  9. Fault tolerance analysis and applications to microwave modules and MMIC's

    NASA Astrophysics Data System (ADS)

    Boggan, Garry H.

    A project whose objective was to provide an overview of built-in-test (BIT) considerations applicable to microwave systems, modules, and MMICs (monolithic microwave integrated circuits) is discussed. Available analytical techniques and software for assessing system failure characteristics were researched, and the resulting investigation provides a review of two techniques which have applicability to microwave systems design. A system-level approach to fault tolerance and redundancy management is presented in its relationship to the subsystem/element design. An overview of the microwave BIT focus from the Air Force Integrated Diagnostics program is presented. The technical reports prepared by the GIMADS team were reviewed for applicability to microwave modules and components. A review of MIMIC (millimeter and microwave integrated circuit) program activities relative to BIT/BITE is given.

  10. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
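
The AHP step mentioned above can be illustrated in a few lines: derive priority weights from a pairwise-comparison matrix by power-iterating toward its principal eigenvector. The criteria names and Saaty-scale judgments below are hypothetical, since the paper's actual inspection criteria are not given in this excerpt.

```python
# AHP priority weights: principal eigenvector of a pairwise-comparison matrix.

def ahp_weights(matrix, iters=100):
    """Power iteration; returns normalized priority weights."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

# Hypothetical judgments for three school-quality criteria, e.g. teaching
# quality is 3x as important as resources and 5x as important as facilities.
pairwise = [
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
]
weights = ahp_weights(pairwise)
assert weights[0] > weights[1] > weights[2]  # ranking follows the judgments
```

A full AHP application would also compute the consistency ratio of the matrix before accepting the weights.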

  11. Learning Analytics for Online Discussions: Embedded and Extracted Approaches

    ERIC Educational Resources Information Center

    Wise, Alyssa Friend; Zhao, Yuting; Hausknecht, Simone Nicole

    2014-01-01

    This paper describes an application of learning analytics that builds on an existing research program investigating how students contribute and attend to the messages of others in asynchronous online discussions. We first overview the E-Listening research program and then explain how this work was translated into analytics that students and…

  12. Usage Patterns of a Mobile Palliative Care Application.

    PubMed

    Zhang, Haipeng; Liu, David; Marks, Sean; Rickerson, Elizabeth M; Wright, Adam; Gordon, William J; Landman, Adam

    2018-06-01

    Fast Facts Mobile (FFM) was created to be a convenient way for clinicians to access the Fast Facts and Concepts database of palliative care articles on a smartphone or tablet device. We analyzed usage patterns of FFM through an integrated analytics platform on the mobile versions of the FFM application. The primary objective of this study was to evaluate the usage data from FFM as a way to better understand user behavior for FFM as a palliative care educational tool. This is an exploratory, retrospective analysis of de-identified analytics data collected through the iOS and Android versions of FFM captured from November 2015 to November 2016. FFM App download statistics from November 1, 2015, to November 1, 2016, were accessed from the Apple and Google development websites. Further FFM session data were obtained from the analytics platform built into FFM. FFM was downloaded 9409 times over the year with 201,383 articles accessed. The most searched-for terms in FFM include the following: nausea, methadone, and delirium. We compared frequent users of FFM to infrequent users of FFM and found that 13% of all users comprise 66% of all activity in the application. Demand for useful and scalable tools for both primary palliative care and specialty palliative care will likely continue to grow. Understanding the usage patterns for FFM has the potential to inform the development of future versions of Fast Facts. Further studies of mobile palliative care educational tools will be needed to further define the impact of these educational tools.

  13. Hierarchical zwitterionic modification of a SERS substrate enables real-time drug monitoring in blood plasma

    PubMed Central

    Sun, Fang; Hung, Hsiang-Chieh; Sinclair, Andrew; Zhang, Peng; Bai, Tao; Galvan, Daniel David; Jain, Priyesh; Li, Bowen; Jiang, Shaoyi; Yu, Qiuming

    2016-01-01

    Surface-enhanced Raman spectroscopy (SERS) is an ultrasensitive analytical technique with molecular specificity, making it an ideal candidate for therapeutic drug monitoring (TDM). However, in critical diagnostic media including blood, nonspecific protein adsorption coupled with weak surface affinities and small Raman activities of many analytes hinder the TDM application of SERS. Here we report a hierarchical surface modification strategy, first by coating a gold surface with a self-assembled monolayer (SAM) designed to attract or probe for analytes and then by grafting a non-fouling zwitterionic polymer brush layer to effectively repel protein fouling. We demonstrate how this modification can enable TDM applications by quantitatively and dynamically measuring the concentrations of several analytes—including an anticancer drug (doxorubicin), several TDM-requiring antidepressant and anti-seizure drugs, fructose and blood pH—in undiluted plasma. This hierarchical surface chemistry is widely applicable to many analytes and provides a generalized platform for SERS-based biosensing in complex real-world media. PMID:27834380

  14. National Water Quality Laboratory, 1995 services catalog

    USGS Publications Warehouse

    Timme, P.J.

    1995-01-01

    This Services Catalog contains information about field supplies and analytical services available from the National Water Quality Laboratory in Denver, Colo., and field supplies available from the Quality Water Service Unit in Ocala, Fla., to members of the U.S. Geological Survey. To assist personnel in the selection of analytical services, this catalog lists sample volume, required containers, applicable concentration range, detection level, precision of analysis, and preservation requirements for samples.

  15. Canada on the Move: an intensive media analysis from inception to reception.

    PubMed

    Faulkner, Guy; Finlay, Sara-Jane

    2006-01-01

    Research evaluating mediated physical activity campaigns uses an unsophisticated conceptualization of the media and would benefit from the application of a media studies approach. The purpose of this article is to report on the application of this type of analysis to the Canada on the Move media campaign. Through interviews and document analysis, the press release surrounding Canada on the Move was examined at four levels: inception, production, transmission and reception. Analytic strategies of thematic and textual analysis were conducted. The press release was well received by journalists and editors and was successfully transmitted as inferred from national and local television coverage, although there was no national print pickup. Canada on the Move was perceived by sampled audience members as a useful and interesting strategy to encourage walking. A holistic approach to media analysis reveals the complex and frequently messy process of this mediated communication process. Implications for future media disseminations of Canada on the Move are discussed.

  16. Investigation of air transportation technology at Ohio University, 1980. [general aviation aircraft and navigation aids

    NASA Technical Reports Server (NTRS)

    Mcfarland, R. H.

    1981-01-01

    Specific configurations of first and second order all digital phase locked loops were analyzed for both ideal and additive gaussian noise inputs. In addition, a design for a hardware digital phase locked loop capable of either first or second order operation was evaluated along with appropriate experimental data obtained from testing of the hardware loop. All parameters chosen for the analysis and the design of the digital phase locked loop were consistent with an application to an Omega navigation receiver although neither the analysis nor the design are limited to this application. For all cases tested, the experimental data showed close agreement with the analytical results, indicating that the Markov chain models for first and second order digital phase locked loops are valid.
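
A minimal sketch of a first-order digital phase locked loop of the kind analyzed above (an idealized software model, not the report's hardware design): each cycle the loop measures the phase error and steps its own phase by a fixed gain, driving the error toward zero on a noiseless input. The gain and step count are illustrative.

```python
# First-order DPLL on an ideal (noiseless) constant-phase input:
# phase error decays geometrically by a factor (1 - gain) per cycle.

def first_order_dpll(input_phase, gain=0.25, steps=60):
    """Track a constant input phase; return the final absolute phase error."""
    phase = 0.0
    for _ in range(steps):
        error = input_phase - phase   # ideal phase detector
        phase += gain * error         # loop filter + oscillator update (first order)
    return abs(input_phase - phase)

residual = first_order_dpll(1.0)
assert residual < 1e-6   # loop converges on an ideal input
```

With additive Gaussian noise on the detector output, the quantized error sequence is exactly what makes the Markov chain model mentioned in the abstract the natural analysis tool.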

  17. Application of the boundary element method to the micromechanical analysis of composite materials

    NASA Technical Reports Server (NTRS)

    Goldberg, R. K.; Hopkins, D. A.

    1995-01-01

    A new boundary element formulation for the micromechanical analysis of composite materials is presented in this study. A unique feature of the formulation is the use of circular shape functions to convert the two-dimensional integrations of the composite fibers to one-dimensional integrations. To demonstrate the applicability of the formulations, several example problems including elastic and thermal analysis of laminated composites and elastic analyses of woven composites are presented and the boundary element results compared to experimental observations and/or results obtained through alternate analytical procedures. While several issues remain to be addressed in order to make the methodology more robust, the formulations presented here show the potential in providing an alternative to traditional finite element methods, particularly for complex composite architectures.
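
As a toy illustration of reducing a two-dimensional integral to a one-dimensional boundary integral (the spirit of the circular shape functions described above, not the paper's formulation): by Green's theorem, the area of a circular fiber cross-section equals one half the contour integral of (x dy - y dx) around its boundary, which discretizes to a sum over boundary angles.

```python
# Area of a circular fiber section via a 1-D boundary integral:
# A = (1/2) * contour integral of (x dy - y dx), discretized over theta.
import math

def circle_area_boundary(radius, n=1000):
    total = 0.0
    for k in range(n):
        theta = 2.0 * math.pi * k / n
        x, y = radius * math.cos(theta), radius * math.sin(theta)
        dx = -radius * math.sin(theta) * (2.0 * math.pi / n)   # dx = -r sin(theta) dtheta
        dy = radius * math.cos(theta) * (2.0 * math.pi / n)    # dy =  r cos(theta) dtheta
        total += 0.5 * (x * dy - y * dx)
    return total

area = circle_area_boundary(1.0)
assert abs(area - math.pi) < 1e-6   # recovers pi * r^2 for r = 1
```

The same dimensional reduction is what makes boundary element methods attractive for micromechanics: only fiber boundaries, not fiber interiors, need to be discretized.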

  18. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. 
The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.
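
A self-contained example of the kind of inter-distribution relation the Distributome catalogs: the Gamma density with shape k = 1 reduces to the Exponential density. The pure-Python pdfs below (scale parameterization) are written for illustration; they are not Distributome code.

```python
# Inter-distribution relation: Gamma(shape=1, scale=theta) == Exponential(rate=1/theta).
import math

def gamma_pdf(x, k, theta):
    """Gamma density with shape k and scale theta."""
    return x**(k - 1) * math.exp(-x / theta) / (math.gamma(k) * theta**k)

def exponential_pdf(x, rate):
    """Exponential density with the given rate."""
    return rate * math.exp(-rate * x)

# Gamma(k=1, theta=2) should match Exponential(rate=1/2) pointwise.
for x in (0.1, 1.0, 3.5):
    assert abs(gamma_pdf(x, 1.0, 2.0) - exponential_pdf(x, 0.5)) < 1e-12
```

Verifying such special-case identities numerically is one of the "investigation of inter-distribution relations" activities the infrastructure is built to support.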

  19. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. 
The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols. PMID:27158191

  20. Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.

    PubMed

    Grdinić, Vladimir; Vuković, Jadranka

    2004-05-28

    A complete prevalidation, as a basic prevalidation strategy for quality control and standardization of analytical procedures, was introduced. Fast and simple, the prevalidation methodology, based on mathematical/statistical evaluation of a reduced number of experiments (N < or = 24), was elaborated, and guidelines as well as algorithms are given in detail. This strategy was developed for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which a linear calibration model, which occurs very often in practice, could be the most appropriate to fit the experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of an analytical procedure. The complete prevalidation process includes characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values and extraction of prevalidation parameters. Moreover, a system of diagnosis for each particular prevalidation step is suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from among a great number of analytical procedures. The favourable metrological characteristics of this analytical procedure, obtained as prevalidation figures of merit, confirm prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.
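
A hypothetical sketch of the linear-calibration step central to the prevalidation strategy above: fit y = a + b*x by ordinary least squares and estimate a detection limit from the residual standard deviation as 3.3 * s / slope (an ICH Q2-style rule, used here as an assumption since the paper's exact figures of merit are not reproduced). The standards and readings are synthetic.

```python
# Ordinary least-squares calibration line plus an ICH-style detection limit.

def linear_calibration(xs, ys):
    """Return (intercept, slope, detection_limit) for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    resid_sd = (sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
                / (n - 2)) ** 0.5                 # residual SD, n-2 dof
    return a, b, 3.3 * resid_sd / b               # LOD = 3.3 * s / slope

# Synthetic absorbance readings for five tannin standards (invented numbers).
conc = [2.0, 4.0, 6.0, 8.0, 10.0]
absorbance = [0.21, 0.39, 0.62, 0.80, 1.01]
intercept, slope, lod = linear_calibration(conc, absorbance)
assert 0.09 < slope < 0.11   # sensitivity close to 0.1 absorbance per unit
```

In a prevalidation run, quantities like the slope, intercept and residual dispersion from a small experiment set (N <= 24) are exactly the figures of merit inspected before committing to full validation.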

Top