Sample records for specific analytical techniques

  1. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in providing the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
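
The semi-automated model selection the abstract describes can be caricatured in a few lines: declared data characteristics are matched against rules to propose candidate techniques. This is a hypothetical sketch of the idea only; the rule set and profile keys are invented and are not the Analytics Ontology or the SCALATION API.

```python
# Toy rule-based model selection driven by a declared dataset profile.
# A simplified stand-in for ontology-based inference; all rules and
# profile keys below are invented for illustration.

def select_techniques(profile):
    """Return candidate modeling techniques for a dataset profile.

    profile: dict with keys 'response' ('continuous'|'binary'|'count')
    and 'predictors' (number of predictor variables).
    """
    rules = [
        (lambda p: p["response"] == "continuous" and p["predictors"] <= 20,
         "multiple linear regression"),
        (lambda p: p["response"] == "continuous" and p["predictors"] > 20,
         "ridge / lasso regression"),
        (lambda p: p["response"] == "binary", "logistic regression"),
        (lambda p: p["response"] == "count", "Poisson regression"),
    ]
    return [name for cond, name in rules if cond(profile)]

print(select_techniques({"response": "continuous", "predictors": 50}))
# -> ['ridge / lasso regression']
```

A real ontology-based system would also justify each recommendation, which is the "rationale" the paper emphasizes.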

  2. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in providing the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  3. Quantitative and qualitative sensing techniques for biogenic volatile organic compounds and their oxidation products.

    PubMed

    Kim, Saewung; Guenther, Alex; Apel, Eric

    2013-07-01

    The physiological production mechanisms of some of the organics in plants, commonly known as biogenic volatile organic compounds (BVOCs), have been known for more than a century. Some BVOCs are emitted to the atmosphere and play a significant role in tropospheric photochemistry, especially in ozone and secondary organic aerosol (SOA) production, as a result of the interplay between BVOCs and atmospheric oxidants such as the hydroxyl radical (OH), ozone (O3) and NOx (NO + NO2). These findings have been drawn from comprehensive analysis of numerous field and laboratory studies that have characterized the ambient distribution of BVOCs and their oxidation products, and the reaction kinetics between BVOCs and atmospheric oxidants. These investigations are limited by the capacity for identifying and quantifying these compounds. This review highlights the major analytical techniques that have been used to observe BVOCs and their oxidation products, such as gas chromatography, mass spectrometry with hard and soft ionization methods, and optical techniques from laser-induced fluorescence (LIF) to remote sensing. In addition, we discuss how new analytical techniques can advance our understanding of BVOC photochemical processes. The principles, advantages, and drawbacks of the analytical techniques are discussed along with specific examples of how the techniques were applied in field and laboratory measurements. Since a number of thorough review papers are available for each specific analytical technique, readers are referred to those publications for detailed descriptions. The aim of this review is therefore to help readers grasp the advantages and disadvantages of various sensing techniques for BVOCs and their oxidation products, and to provide guidance for choosing the optimal technique for a specific research task.

  4. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM) that has been used in China since ancient times as an anticonvulsant, analgesic, sedative, anti-asthma, and anti-immune drug. The aim of this review is to provide an overview of the extensive efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME) and the factors that influence its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, the discovery of new therapeutic uses, and the understanding of the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  5. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  6. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, even those sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  7. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and therefore can impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assign accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques such as atomic-spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all these techniques is a function of a number of parameters, such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. This work presents information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques: solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, make it possible to reduce the number of errors during sample preparation prior to chromatographic analysis and to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. Predictive modeling of complications.

    PubMed

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature; these studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and future directions.
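
To give a flavor of what patient-specific risk output looks like, here is a toy logistic risk model in Python. The predictors, coefficients, and intercept are invented for illustration and come from no published spine-surgery model; a real model would be fit to outcome data.

```python
import math

# Toy patient-specific complication-risk estimate from a logistic model.
# All coefficients are invented; a fitted model would estimate them from
# surgical outcome data (e.g., by maximum likelihood).
def complication_risk(age, bmi, intercept=-6.0, b_age=0.04, b_bmi=0.08):
    logit = intercept + b_age * age + b_bmi * bmi
    return 1.0 / (1.0 + math.exp(-logit))  # probability in (0, 1)

# Under these made-up coefficients, risk rises with age and BMI
print(round(complication_risk(age=45, bmi=24), 3))
print(round(complication_risk(age=75, bmi=32), 3))
```

The appeal described in the abstract is exactly this: the output is a single, individualized probability that can be discussed with the patient.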

  10. Big Data Analytics with Datalog Queries on Spark.

    PubMed

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2016-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
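
The recursion at issue can be illustrated with the canonical Datalog program for transitive closure, tc(X,Y) :- edge(X,Y) and tc(X,Y) :- tc(X,Z), edge(Z,Y). Below is a minimal semi-naive evaluation in plain Python; it sketches the semantics that BigDatalog compiles to Spark, not the BigDatalog implementation itself.

```python
# Semi-naive evaluation of transitive closure: at each iteration, only
# the newly derived facts (delta) are joined against edge/2, which is
# the key optimization for recursive Datalog queries.

def transitive_closure(edges):
    tc = set(edges)      # facts from the base rule
    delta = set(edges)   # newly derived facts
    while delta:
        derived = {(x, w) for (x, y) in delta for (z, w) in edges if y == z}
        delta = derived - tc   # keep only genuinely new facts
        tc |= delta
    return tc

edges = {(1, 2), (2, 3), (3, 4)}
print(sorted(transitive_closure(edges)))
# -> [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

On a distributed engine the same fixpoint loop becomes a sequence of joins and set differences over partitioned relations, which is why efficient recursion support is the central compilation problem the paper addresses.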

  11. Big Data Analytics with Datalog Queries on Spark

    PubMed Central

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2017-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics. PMID:28626296

  12. Direct Analysis of Samples of Various Origin and Composition Using Specific Types of Mass Spectrometry.

    PubMed

    Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek

    2017-07-04

    One of the major sources of error during chemical analysis with the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage, and in this review we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis without sample preparation, or that limit some pre-concentration steps. MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques are presented, along with trends in the development of direct analysis using them.

  13. Application of surface plasmon resonance for the detection of carbohydrates, glycoconjugates, and measurement of the carbohydrate-specific interactions: a comparison with conventional analytical techniques. A critical review.

    PubMed

    Safina, Gulnara

    2012-01-27

    Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes, which makes these compounds important targets to be detected, monitored and identified. The identification of the carbohydrate content in their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most of the conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and the use of various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during the last two decades and has received increasing attention for different applications, from the real-time monitoring of affinity bindings to biosensors. SPR does not require any labels and is capable of direct measurement of biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern instrumental analytical techniques with SPR in terms of their analytical capabilities to detect carbohydrates and their conjugates with proteins and lipids, and to study carbohydrate-specific bindings. A few selected examples of the SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Approximate analytical relationships for linear optimal aeroelastic flight control laws

    NASA Astrophysics Data System (ADS)

    Kassem, Ayman Hamdy

    1998-09-01

    This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.

  15. Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration

    NASA Technical Reports Server (NTRS)

    Merritt, D. A.; Brand, W. A.; Hayes, J. M.

    1994-01-01

    In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques and results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source, or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (sigma = 0.00006 at.% 13C or 0.06% for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 < or = sigma < or = 0.0002 at.% or 0.1 < or = sigma < or = 0.2%).
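
The precisions quoted above are stated on the isotope-ratio delta scale; the conversion from a measured ratio to a delta value is a one-line formula. A minimal sketch in Python (the VPDB 13C/12C reference ratio used here is a commonly tabulated value; the sample ratio is invented for illustration):

```python
# delta-13C in per-mil: deviation of a measured 13C/12C ratio from a
# standard. The default is the commonly cited VPDB reference ratio.
def delta_13c(r_sample, r_standard=0.0112372):
    return (r_sample / r_standard - 1.0) * 1000.0

# An invented sample ratio slightly below VPDB gives a negative delta
print(round(delta_13c(0.0111800), 2))
# -> -5.09
```

Calibration techniques like those compared in the paper differ in how the standard's ratio is carried through the measurement, not in this underlying arithmetic.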

  16. Constitutive Equations: Plastic and Viscoelastic Properties. (Latest citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The bibliography contains citations concerning analytical techniques using constitutive equations, applied to materials under stress. The properties explored with these techniques include viscoelasticity, thermoelasticity, and plasticity. While many of the references are general as to material type, most refer to specific metals or composites, or to specific shapes, such as flat plate or spherical vessels.

  17. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

    Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
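
The payoff of person-oriented analysis is easy to see in a toy example: an association can be negative within every individual yet positive when observations are pooled across people, so averaging across people reverses the sign. All values below are invented.

```python
# Toy demonstration of why techniques that average across people can
# mislead with ambulatory-assessment data: within-person and pooled
# (between-person) correlations can have opposite signs.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Within each hypothetical person, craving falls as coping rises...
person_a = {"coping": [1, 2, 3], "craving": [5, 4, 3]}
person_b = {"coping": [7, 8, 9], "craving": [11, 10, 9]}

# ...but person B has higher means on both variables, so pooling flips the sign
pooled_coping = person_a["coping"] + person_b["coping"]
pooled_craving = person_a["craving"] + person_b["craving"]

print(pearson_r(person_a["coping"], person_a["craving"]))   # -1.0 within
print(round(pearson_r(pooled_coping, pooled_craving), 2))   # 0.86 pooled
```

Multilevel models, p-technique, and related person-oriented methods exist precisely to separate these within-person and between-person sources of variation.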

  18. Raman spectroscopy

    USDA-ARS?s Scientific Manuscript database

    Raman spectroscopy has gained increased use and importance in recent years for accurate and precise detection of physical and chemical properties of food materials, due to the greater specificity and sensitivity of Raman techniques over other analytical techniques. This book chapter presents Raman s...

  19. Reduction of multi-dimensional laboratory data to a two-dimensional plot: a novel technique for the identification of laboratory error.

    PubMed

    Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A

    2007-01-01

    The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.

  20. Uncovering category specificity of genital sexual arousal in women: The critical role of analytic technique.

    PubMed

    Pulverman, Carey S; Hixon, J Gregory; Meston, Cindy M

    2015-10-01

    Based on analytic techniques that collapse data into a single average value, it has been reported that women lack category specificity and show genital sexual arousal to a large range of sexual stimuli, including those that both match and do not match their self-reported sexual interests. These findings may be a methodological artifact of the way in which data are analyzed. This study examined whether using an analytic technique that models data over time would yield different results. Across two studies, heterosexual (N = 19) and lesbian (N = 14) women viewed erotic films featuring heterosexual, lesbian, and gay male couples as their physiological sexual arousal was assessed with vaginal photoplethysmography. Data analysis with traditional methods comparing average genital arousal between films failed to detect specificity of genital arousal for either group. When data were analyzed with smoothing regression splines and a within-subjects approach, both heterosexual and lesbian women demonstrated different patterns of genital sexual arousal to the different types of erotic films, suggesting that sophisticated statistical techniques may be necessary to more fully understand women's genital sexual arousal response. Heterosexual women showed category-specific genital sexual arousal. Lesbian women showed higher arousal to the heterosexual film than to the other films; however, within subjects, lesbian women showed significantly different arousal responses, suggesting that lesbian women's genital arousal discriminates between different categories of stimuli at the individual level. Implications for the future use of vaginal photoplethysmography as a diagnostic tool of sexual preferences in clinical and forensic settings are discussed. © 2015 Society for Psychophysiological Research.
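
The paper's methodological point can be shown with a toy example: two arousal time courses can have identical averages yet clearly different temporal patterns, so a comparison of averages detects no difference while a time-resolved comparison does. The response values below are invented.

```python
# Two invented arousal time courses with equal means but opposite
# temporal trends: averaging collapses the difference away.

rising  = [1, 2, 3, 4, 5]   # arousal climbs across the film
falling = [5, 4, 3, 2, 1]   # arousal declines across the film

mean = lambda xs: sum(xs) / len(xs)
print(mean(rising) == mean(falling))   # True: the averages are identical

# A time-resolved comparison (per-sample differences) separates them
diffs = [a - b for a, b in zip(rising, falling)]
print(diffs)  # [-4, -2, 0, 2, 4]
```

Smoothing regression splines, as used in the study, generalize this idea: they model the full arousal trajectory rather than its mean.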

  1. The Utility of Person-Specific Analyses for Investigating Developmental Processes: An Analytic Primer on Studying the Individual

    ERIC Educational Resources Information Center

    Gayles, Jochebed G.; Molenaar, Peter C. M.

    2013-01-01

    The fields of psychology and human development are experiencing a resurgence of scientific inquiries about phenomena that unfold at the level of the individual. This article addresses the issues of analyzing intraindividual psychological/developmental phenomena using standard analytical techniques for interindividual variation. When phenomena are…

  2. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques: specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.

  3. Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi

    2016-10-01

    The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay), with a detection limit of about 250 pg/ml. However, quantifying analytes related to various stages of tumors, including early detection, requires detecting well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml, and the prostate specific antigen level in the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/ml of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumor, it is necessary to quantify analyte levels ranging from ng/ml to pg/ml. To accommodate these critical needs in current diagnostics, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes (

  4. Analytical Chemistry: A retrospective view on some current trends.

    PubMed

    Niessner, Reinhard

    2018-04-01

    In a retrospective view, some current trends in Analytical Chemistry are outlined and connected to work published more than a hundred years ago in the same field. For example, gravimetric microanalysis after specific precipitation, once the sole basis for chemical analysis, has been transformed into a mass-sensitive transducer in combination with compound-specific receptors. Molecular spectroscopy, still practising the classical absorption/emission techniques for detecting elements or molecules, is experiencing a shift to Raman spectroscopy, which allows analysis of a multitude of additional features. Chemical sensors are now used to perform a vast number of analytical measurements; paper-based devices in particular (dipsticks, microfluidic pads) are celebrating a revival, as they can potentially revolutionize medicine in the developing world. Industry 4.0 will lead to a further increase in sensor applications. Preceding separation and enrichment of analytes from complicated matrices remains the backbone of a successful analysis, despite increasing attempts to avoid clean-up. Continuous separation techniques will become a key element for 24/7 production of goods with certified quality. Attempts to obtain instantaneous and specific chemical information by optical or electrical transduction will need highly selective receptors in large quantities. Further understanding of ligand-receptor complex structures is the key to the successful generation of artificial bio-inspired receptors. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. 40 CFR Appendix B to Part 60 - Performance Specifications

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 6216-98 is the reference for design specifications, manufacturer's performance specifications, and test... representative of a group of monitors produced during a specified period or lot, for conformance with the design... technique and a single analytical program are used. One Run may include results for more than one test...

  6. Comparative study of inorganic elements determined in whole blood from Dmd(mdx)/J mice strain by EDXRF and NAA analytical techniques.

    PubMed

    Redígolo, M M; Sato, I M; Metairon, S; Zamboni, C B

    2016-04-01

    Several diseases can be diagnosed by observing the variation of specific element concentrations in body fluids. In this study, the concentrations of inorganic elements in blood samples of dystrophic (Dmd(mdx)/J) and C57BL/6J (control group) mouse strains were determined. The results obtained by Energy Dispersive X-ray Fluorescence (EDXRF) were compared with the Neutron Activation Analysis (NAA) technique. Both analytical techniques proved to be appropriate and complementary, offering a new contribution to veterinary medicine as well as detailed knowledge of this pathology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...

    EPA Pesticide Factsheets

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co

  8. Biosensors and their applications in detection of organophosphorus pesticides in the environment.

    PubMed

    Hassani, Shokoufeh; Momtaz, Saeideh; Vakhshiteh, Faezeh; Maghsoudi, Armin Salek; Ganjali, Mohammad Reza; Norouzi, Parviz; Abdollahi, Mohammad

    2017-01-01

    This review discusses past and recent advancements in biosensors, focusing on the detection of organophosphorus pesticides (OPs) owing to their extensive use during recent decades. Apart from their agricultural benefits, OPs also impose adverse toxicological effects on animal and human populations. Conventional approaches to pesticide detection, such as chromatographic techniques, are associated with several limitations. Biosensor technology is unique in its detection sensitivity, selectivity, remarkable performance capabilities, simplicity, on-site operation, and ease of fabrication and incorporation with nanomaterials. This study also provides specifications of most OP biosensors reported to date, organized by their transducer system. In addition, we highlight the application of advanced complementary materials and analysis techniques in OP detection systems. The availability of these new materials, together with new sensing techniques, has led to the introduction of easy-to-use analytical tools of high sensitivity and specificity in the design and construction of OP biosensors. In this review, we elaborate the achievements in sensing systems concerning innovative nanomaterials and analytical techniques, with emphasis on OPs.

  9. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  10. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.
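The inference step described above can be illustrated with a toy rule base: objectives map to analytical goals, and goals to candidate techniques. All class and technique names below are hypothetical stand-ins, not the authors' actual DM³ ontology schema.

```python
# Toy illustration of ontology-style inferencing: ERM objectives are chained
# through analytical goals to candidate techniques via simple lookup rules.
# Every name here is a hypothetical placeholder, not the DM3 ontology itself.

OBJECTIVE_TO_GOALS = {
    "estimate_exposure_shift": ["prediction", "multilevel_analysis"],
    "identify_risk_hotspots": ["clustering"],
}

GOAL_TO_TECHNIQUES = {
    "prediction": ["regression", "random_forest"],
    "multilevel_analysis": ["mixed_effects_model"],
    "clustering": ["k_means", "dbscan"],
}

def infer_techniques(objective):
    """Chain the two rule sets to suggest techniques for an ERM objective."""
    techniques = []
    for goal in OBJECTIVE_TO_GOALS.get(objective, []):
        techniques.extend(GOAL_TO_TECHNIQUES.get(goal, []))
    return techniques

print(infer_techniques("estimate_exposure_shift"))
# → ['regression', 'random_forest', 'mixed_effects_model']
```

A real ontology reasoner adds subsumption and consistency checking on top of this chaining, but the retrieval pattern — objective in, ranked techniques out — is the same.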

  11. Analytical Applications Of High-Resolution Molecular Fluorescence Spectroscopy In Low Temperature Solid Matrices

    NASA Astrophysics Data System (ADS)

    Hofstraat, Johannes W.; van Zeijl, W. J.; Smedes, F.; Ariese, Freek; Gooijer, Cees; Velthorst, Nel H.; Locher, R.; Renn, Alois; Wild, Urs P.

    1989-05-01

    High-resolution fluorescence spectroscopy may be used to obtain highly specific, vibrationally resolved spectral signatures of molecules. Two techniques are presented that both make use of low temperature, solid matrices. In Shpol'skii spectroscopy highly resolved spectra are obtained by employing n-alkanes as solvents that form neat crystalline matrices at low temperatures in which the guest molecules occupy well defined substitutional sites. Fluorescence line-narrowing spectroscopy is based on the application of selective (mostly laser-) excitation of the guest molecules. Principles and analytical applications of both techniques will be discussed. Specific attention will be paid to the determination of pyrene in bird meat by means of Shpol'skii spectroscopy and to the possibilities of applying two-dimensional fluorescence line-narrowing spectroscopy.

  12. Shape optimization of disc-type flywheels

    NASA Technical Reports Server (NTRS)

    Nizza, R. S.

    1976-01-01

    Techniques were developed to provide an analytical and graphical means of selecting an optimum flywheel system design based on system requirements, geometric constraints, and weight limitations. The techniques for creating an analytical solution are formulated from energy and structural principles. The resulting flywheel design relates stress and strain distribution, operating speed, geometry, and specific energy levels. The design techniques identify the lowest-stressed flywheel for any particular application and achieve the highest possible specific energy per unit flywheel weight. Stress and strain contour mapping and sectional profile plotting reflect the structural behavior manifested under rotating conditions. This approach to flywheel design is applicable to any metal flywheel and permits the selection of the flywheel design to be based solely on the system requirements that must be met, those that must be optimized, and those system parameters that may be permitted to vary.
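The "specific energy per unit flywheel weight" above is governed by a standard textbook relation, e = K·σ/ρ, with K a geometry-dependent shape factor. The sketch below uses common textbook shape factors and material values, not figures from the cited report.

```python
# Specific energy of a flywheel: e = K * sigma / rho, where K is a shape
# factor set by the geometry, sigma the allowable stress, and rho the
# material density. Shape factors and material values are textbook figures.

SHAPE_FACTOR = {
    "thin_rim": 0.5,
    "flat_disc": 0.606,
    "constant_stress_disc": 0.931,
}

def specific_energy_j_per_kg(shape, sigma_pa, rho_kg_m3):
    """Maximum energy stored per unit mass for a given flywheel geometry."""
    return SHAPE_FACTOR[shape] * sigma_pa / rho_kg_m3

# Steel flat disc: sigma ~ 400 MPa allowable stress, rho ~ 7800 kg/m^3
e = specific_energy_j_per_kg("flat_disc", 400e6, 7800)
print(round(e / 3600, 1), "Wh/kg")  # ~8.6 Wh/kg
```

The relation makes the paper's optimization target explicit: for a fixed material, raising specific energy means moving toward geometries with a higher shape factor, i.e. more uniformly stressed profiles.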

  13. The Nature and Effects of Transformational School Leadership: A Meta-Analytic Review of Unpublished Research

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Sun, Jingping

    2012-01-01

    Background: Using meta-analytic review techniques, this study synthesized the results of 79 unpublished studies about the nature of transformational school leadership (TSL) and its impact on the school organization, teachers, and students. This corpus of research associates TSL with 11 specific leadership practices. These practices, as a whole,…

  14. What We Do and Do Not Know about Teaching Medical Image Interpretation.

    PubMed

    Kok, Ellen M; van Geel, Koos; van Merriënboer, Jeroen J G; Robben, Simon G F

    2017-01-01

    Educators in medical image interpretation have difficulty finding scientific evidence as to how they should design their instruction. We review and comment on 81 papers that investigated instructional design in medical image interpretation. We distinguish between studies that evaluated complete offline courses and curricula, studies that evaluated e-learning modules, and studies that evaluated specific educational interventions. Twenty-three percent of all studies evaluated the implementation of complete courses or curricula, and 44% of the studies evaluated the implementation of e-learning modules. We argue that these studies have encouraging results but provide little information for educators: too many differences exist between conditions to unambiguously attribute the learning effects to specific instructional techniques. Moreover, concepts are not uniformly defined and methodological weaknesses further limit the usefulness of evidence provided by these studies. Thirty-two percent of the studies evaluated a specific interventional technique. We discuss three theoretical frameworks that informed these studies: diagnostic reasoning, cognitive schemas and study strategies. Research on diagnostic reasoning suggests teaching students to start with non-analytic reasoning and subsequently applying analytic reasoning, but little is known on how to train non-analytic reasoning. Research on cognitive schemas investigated activities that help the development of appropriate cognitive schemas. Finally, research on study strategies supports the effectiveness of practice testing, but more study strategies could be applicable to learning medical image interpretation. Our commentary highlights the value of evaluating specific instructional techniques, but further evidence is required to optimally inform educators in medical image interpretation.

  15. Analytic and subjective assessments of operator workload imposed by communications tasks in transport aircraft

    NASA Technical Reports Server (NTRS)

    Eckel, J. S.; Crabtree, M. S.

    1984-01-01

    Analytical and subjective techniques that are sensitive to the information transmission and processing requirements of individual communications-related tasks are used to assess workload imposed on the aircrew by A-10 communications requirements for civilian transport category aircraft. Communications-related tasks are defined to consist of the verbal exchanges between crews and controllers. Three workload estimating techniques are proposed. The first, an information theoretic analysis, is used to calculate bit values for perceptual, manual, and verbal demands in each communication task. The second, a paired-comparisons technique, obtains subjective estimates of the information processing and memory requirements for specific messages. By combining the results of the first two techniques, a hybrid analytical scale is created. The third, a subjective rank ordering of sequences of communications tasks, provides an overall scaling of communications workload. Recommendations for future research include an examination of communications-induced workload among the air crew and the development of simulation scenarios.
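The information-theoretic step described above rests on Shannon's measure: selecting one of N equally likely alternatives transmits log2(N) bits, and summing over the slots of a message gives a per-task bit score. The message structure below is a hypothetical illustration, not the A-10 task analysis itself.

```python
import math

# Information-theoretic workload sketch: each slot in a communication task
# that can take one of n equally likely values contributes log2(n) bits.
# The slot sizes below are assumed purely for illustration.

def bits(n_alternatives):
    """Information content of choosing one of n equally likely alternatives."""
    return math.log2(n_alternatives)

def task_bits(slots):
    """Total bits for a task whose slots have the given alternative counts."""
    return sum(bits(n) for n in slots)

# e.g. a frequency-change instruction: callsign (1 of 64 aircraft),
# new frequency (1 of 128 channels), acknowledgement (yes/no)
load = task_bits([64, 128, 2])
print(load)  # 6 + 7 + 1 = 14.0 bits
```

Unequal message probabilities would lower these figures (Shannon entropy rather than log2 of the count), which is one reason the paper pairs the analytic scale with subjective paired-comparison estimates.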

  16. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of Earth science data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools and techniques that would support specific ESDA goal types. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.

  17. A Review of Interface Electronic Systems for AT-cut Quartz Crystal Microbalance Applications in Liquids

    PubMed Central

    Arnau, Antonio

    2008-01-01

    Since the first applications of AT-cut quartz crystals as sensors in solution more than 20 years ago, the so-called quartz crystal microbalance (QCM) sensor has become a good alternative analytical method in a great number of applications, such as biosensors, analysis of biomolecular interactions, study of bacterial adhesion at specific interfaces, pathogen and microorganism detection, study of polymer film-biomolecule or cell-substrate interactions, and immunosensors, with extensive use in fluid and polymer characterization and electrochemical applications, among others. Proper evaluation of this analytical method requires recognizing the different steps involved and being conscious of their importance and limitations. The first step in a QCM system is the accurate and appropriate characterization of the sensor in relation to the specific application. Using the piezoelectric sensor in contact with solutions strongly affects its behavior, and appropriate electronic interfaces must be used for adequate sensor characterization. Systems based on different principles and techniques have been implemented over the last 25 years. The interface selection for the specific application is important, and its limitations must be known in order to judge its suitability and to avoid error propagation in the interpretation of results. This article presents a comprehensive overview of the different techniques used for AT-cut quartz crystal microbalance in in-solution applications, which are based on the following principles: network or impedance analyzers, decay methods, oscillators and lock-in techniques. The electronic interfaces based on oscillators and phase-locked techniques are treated in detail, with descriptions of different configurations, since these techniques are the most used in applications for detection of analytes in solution, and in those where a fast sensor response is necessary. PMID:27879713
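The gravimetric starting point for any QCM interface is the Sauerbrey relation for a thin rigid film, Δf = −2·f0²·Δm / (A·√(ρq·μq)). The quartz constants below are standard values; note that in liquids, viscous loading adds shifts the Sauerbrey relation does not capture, which is precisely why the interface electronics discussed above matter.

```python
import math

# Sauerbrey relation for an AT-cut quartz resonator in the thin rigid-film
# limit. Standard quartz constants; valid only for rigid mass loads in air
# or vacuum, not for viscous liquid loading.

RHO_Q = 2648.0    # quartz density, kg/m^3
MU_Q = 2.947e10   # quartz shear modulus, Pa

def sauerbrey_shift_hz(f0_hz, mass_per_area_kg_m2):
    """Frequency shift for a rigid mass load per unit sensor area."""
    return -2.0 * f0_hz**2 / math.sqrt(RHO_Q * MU_Q) * mass_per_area_kg_m2

# 5 MHz crystal loaded with 1 microgram/cm^2 (= 1e-5 kg/m^2)
df = sauerbrey_shift_hz(5e6, 1e-5)
print(round(df, 1))  # about -56.6 Hz
```

This reproduces the familiar sensitivity of roughly 56.6 Hz per µg/cm² for a 5 MHz crystal, and the f0² dependence shows why higher-frequency resonators are more mass-sensitive.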

  18. Determination of a Limited Scope Network's Lightning Detection Efficiency

    NASA Technical Reports Server (NTRS)

    Rompala, John T.; Blakeslee, R.

    2008-01-01

    This paper outlines a modeling technique to map lightning detection efficiency variations over a region surveyed by a sparse array of ground-based detectors. A reliable flash peak current distribution (PCD) for the region serves as the technique's base. This distribution is recast as an event probability distribution function. The technique then uses the PCD together with information on site signal detection thresholds, the type of solution algorithm used, and range attenuation to formulate the probability that a flash at a specified location will yield a solution. Applying this technique to the full region produces detection efficiency contour maps specific to the parameters employed. These contours facilitate a comparative analysis of each parameter's effect on the network's detection efficiency. In an alternate application, this modeling technique gives an estimate of the number, strength, and distribution of events going undetected. This approach leads to a variety of event density contour maps. This application is also illustrated. The technique's base PCD can be empirical or analytical. A process for formulating an empirical PCD specific to the region and network being studied is presented. A new method for producing an analytical representation of the empirical PCD is also introduced.
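The mapping idea can be sketched with a Monte Carlo estimate: draw peak currents from a PCD, attenuate the signal toward each site, and count the fraction of flashes detected by enough sites to solve for a location. Everything here — the lognormal PCD, the 1/r attenuation, the thresholds, the site layout, the two-site solution criterion — is an assumed illustration, not the paper's model.

```python
import math
import random

# Monte Carlo sketch of detection efficiency (DE): the fraction of simulated
# flashes whose attenuated signal clears the threshold at >= min_sites
# detectors. PCD shape, attenuation law, and thresholds are all assumptions.

def detection_efficiency(flash_xy, sites, threshold, n=20_000,
                         min_sites=2, seed=1):
    rng = random.Random(seed)
    detected = 0
    for _ in range(n):
        peak_ka = rng.lognormvariate(math.log(15.0), 0.7)  # assumed PCD
        hits = 0
        for sx, sy in sites:
            r_km = math.hypot(flash_xy[0] - sx, flash_xy[1] - sy)
            if peak_ka / max(r_km, 1.0) >= threshold:      # 1/r attenuation
                hits += 1
        if hits >= min_sites:
            detected += 1
    return detected / n

sites = [(0, 0), (100, 0), (0, 100)]   # detector positions, km
near = detection_efficiency((20, 20), sites, threshold=0.5)
far = detection_efficiency((300, 300), sites, threshold=0.5)
print(near > far)  # efficiency falls off away from the array
```

Evaluating the same estimator over a grid of flash positions yields exactly the kind of detection-efficiency contour map the paper describes, and swapping the assumed PCD for an empirical one changes only the sampling line.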

  19. Electromigrative separation techniques in forensic science: combining selectivity, sensitivity, and robustness.

    PubMed

    Posch, Tjorben Nils; Pütz, Michael; Martin, Nathalie; Huhn, Carolin

    2015-01-01

    In this review we introduce the advantages and limitations of electromigrative separation techniques in forensic toxicology. We thus present a summary of illustrative studies and our own experience in the field together with established methods from the German Federal Criminal Police Office rather than a complete survey. We focus on the analytical aspects of analytes' physicochemical characteristics (e.g. polarity, stereoisomers) and analytical challenges including matrix tolerance, separation from compounds present in large excess, sample volumes, and orthogonality. For these aspects we want to reveal the specific advantages over more traditional methods. Both detailed studies and profiling and screening studies are taken into account. Care was taken to nearly exclusively document well-validated methods outstanding for the analytical challenge discussed. Special attention was paid to aspects exclusive to electromigrative separation techniques, including the use of the mobility axis, the potential for on-site instrumentation, and the capillary format for immunoassays. The review concludes with an introductory guide to method development for different separation modes, presenting typical buffer systems as starting points for different analyte classes. The objective of this review is to provide an orientation for users in separation science considering using capillary electrophoresis in their laboratory in the future.

  20. A Meta-Analytic Review of the Role of Child Anxiety Sensitivity in Child Anxiety

    ERIC Educational Resources Information Center

    Noel, Valerie A.; Francis, Sarah E.

    2011-01-01

    Conflicting findings exist regarding (1) whether anxiety sensitivity (AS) is a construct distinct from anxiety in children and (2) the specific nature of the role of AS in child anxiety. This study uses meta-analytic techniques to (1) determine whether youth (ages 6-18 years) have been reported to experience AS, (2) examine whether AS…

  1. Constitutive Equations: Plastic and Viscoelastic Properties. (Latest Citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The bibliography contains citations concerning analytical techniques using constitutive equations, applied to materials under stress. The properties explored with these techniques include viscoelasticity, thermoelasticity, and plasticity. While many of the references are general as to material type, most refer to specific metals or composites, or to specific shapes, such as flat plate or spherical vessels. (Contains 50-250 citations and includes a subject term index and title list.)

  2. Multilevel Modeling and Policy Development: Guidelines and Applications to Medical Travel.

    PubMed

    Garcia-Garzon, Eduardo; Zhukovsky, Peter; Haller, Elisa; Plakolm, Sara; Fink, David; Petrova, Dafina; Mahalingam, Vaishali; Menezes, Igor G; Ruggeri, Kai

    2016-01-01

    Medical travel has expanded rapidly in recent years, resulting in new markets and increased access to medical care. Whereas several studies investigated the motives of individuals seeking healthcare abroad, the conventional analytical approach is limited by substantial caveats. Classical techniques as found in the literature cannot provide sufficient insight due to the nested nature of data generated. The application of adequate analytical techniques, specifically multilevel modeling, is scarce to non-existent in the context of medical travel. This study introduces the guidelines for application of multilevel techniques in public health research by presenting an application of multilevel modeling in analyzing the decision-making patterns of potential medical travelers. Benefits and potential limitations are discussed.
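The "nested nature of data" caveat above can be quantified: when respondents cluster within countries or hospitals, the intraclass correlation (ICC) measures how much of the outcome variance sits at the group level, and a high ICC is what invalidates classical single-level analysis. A minimal stdlib sketch on synthetic data, assuming equal group sizes and a one-way design:

```python
import statistics

# One-way ICC via variance decomposition: between-group variance divided by
# total variance. Equal group sizes assumed; data below is synthetic.

def icc(groups):
    """Intraclass correlation from a list of equally sized groups."""
    means = [statistics.mean(g) for g in groups]
    grand = statistics.mean(means)
    n = len(groups[0])
    ms_between = n * sum((m - grand) ** 2 for m in means) / (len(groups) - 1)
    ms_within = statistics.mean(statistics.variance(g) for g in groups)
    var_between = (ms_between - ms_within) / n
    return var_between / (var_between + ms_within)

# Strongly clustered outcome: group means differ, within-group spread small
clustered = [[10, 11, 9, 10], [20, 21, 19, 20], [30, 29, 31, 30]]
print(round(icc(clustered), 2))  # close to 1: a multilevel model is needed
```

When the ICC is near zero the groups are exchangeable and a classical analysis is defensible; as it grows, standard errors from single-level models become badly understated, which is the core of the multilevel argument.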

  3. Multilevel Modeling and Policy Development: Guidelines and Applications to Medical Travel

    PubMed Central

    Garcia-Garzon, Eduardo; Zhukovsky, Peter; Haller, Elisa; Plakolm, Sara; Fink, David; Petrova, Dafina; Mahalingam, Vaishali; Menezes, Igor G.; Ruggeri, Kai

    2016-01-01

    Medical travel has expanded rapidly in recent years, resulting in new markets and increased access to medical care. Whereas several studies investigated the motives of individuals seeking healthcare abroad, the conventional analytical approach is limited by substantial caveats. Classical techniques as found in the literature cannot provide sufficient insight due to the nested nature of data generated. The application of adequate analytical techniques, specifically multilevel modeling, is scarce to non-existent in the context of medical travel. This study introduces the guidelines for application of multilevel techniques in public health research by presenting an application of multilevel modeling in analyzing the decision-making patterns of potential medical travelers. Benefits and potential limitations are discussed. PMID:27252672

  4. [Composition of chicken and quail eggs].

    PubMed

    Closa, S J; Marchesich, C; Cabrera, M; Morales, J C

    1999-06-01

    Qualified food composition data on lipids are needed to evaluate intakes as a risk factor in the development of heart disease. The proximate composition, cholesterol and fatty acid content of chicken and quail eggs, as usually consumed or traded, were analysed. Proximate composition was determined using specific AOAC (1984) techniques; lipids were extracted by a modified Folch technique, and cholesterol and fatty acids were determined by gas chromatography. The results corroborate the stability of egg composition. The cholesterol content of quail eggs is similar to that of chicken eggs, but it is almost half the value registered in Handbook 8. Differences may be attributed to the analytical methodology used. This study provides data obtained with up-to-date analytical techniques, together with accessory information useful for food composition tables.

  5. Immobilization of Fab' fragments onto substrate surfaces: A survey of methods and applications.

    PubMed

    Crivianu-Gaita, Victor; Thompson, Michael

    2015-08-15

    Antibody immobilization onto surfaces has widespread applications in many different fields. It is desirable to bind antibodies such that their fragment-antigen-binding (Fab) units are oriented away from the surface in order to maximize analyte binding. The immobilization of only Fab' fragments yields benefits over the more traditional whole antibody immobilization technique. Bound Fab' fragments display higher surface densities, yielding a higher binding capacity for the analyte. The nucleophilic sulfide of the Fab' fragments allows for specific orientations to be achieved. For biosensors, this indicates a higher sensitivity and lower detection limit for a target analyte. The last thirty years have shown tremendous progress in the immobilization of Fab' fragments onto gold, Si-based, polysaccharide-based, plastic-based, magnetic, and inorganic surfaces. This review will show the current scope of Fab' immobilization techniques available and illustrate methods employed to minimize non-specific adsorption of undesirables. Furthermore, a variety of examples will be given to show the versatility of immobilized Fab' fragments in different applications and future directions of the field will be addressed, especially regarding biosensors. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Preanalytics in lung cancer.

    PubMed

    Warth, Arne; Muley, Thomas; Meister, Michael; Weichert, Wilko

    2015-01-01

    Preanalytic sampling techniques and the preparation of tissue specimens strongly influence analytical results in lung tissue diagnostics, at both the morphological and the molecular level. However, in contrast to analytics, where tremendous achievements in the last decade have led to a whole new portfolio of test methods, developments in preanalytics have been minimal. This is particularly unfortunate in lung cancer, where usually only small amounts of tissue are at hand and optimization of all processing steps is mandatory in order to increase the diagnostic yield. In the following, we provide a comprehensive overview of some aspects of preanalytics in lung cancer, from the method of sampling through tissue processing to its impact on analytical test results. We specifically discuss the role of preanalytics in novel technologies like next-generation sequencing and in state-of-the-art cytology preparations. In addition, we point out specific problems in preanalytics which hamper further developments in the field of lung tissue diagnostics.

  7. Protein assay structured on paper by using lithography

    NASA Astrophysics Data System (ADS)

    Wilhelm, E.; Nargang, T. M.; Al Bitar, W.; Waterkotte, B.; Rapp, B. E.

    2015-03-01

    There are two main challenges in producing a robust, paper-based analytical device. The first is to create a hydrophobic barrier which, unlike the commonly used wax barriers, does not break if the paper is bent. The second is the creation of the (bio-)specific sensing layer, for which proteins have to be immobilized without diminishing their activity. We solve both problems using light-based fabrication methods that enable fast, efficient manufacturing of paper-based analytical devices. The first technique relies on silanization, by which we create a flexible hydrophobic barrier made of dimethoxydimethylsilane. The second technique demonstrated within this paper uses photobleaching to immobilize proteins by means of maskless projection lithography. Both techniques have been tested on a classical lithography setup using printed toner masks and on a maskless lithography system. Using these setups, we demonstrated that the proposed manufacturing techniques can be carried out at low cost. The resolution of the paper-based analytical devices obtained with static masks was lower due to the lower mask resolution; better results were obtained using advanced lithography equipment. In doing so, we demonstrated that our technique enables fabrication of effective hydrophobic boundary layers with a thickness of only 342 μm. Furthermore, we showed that fluorescein-5-biotin can be immobilized on the non-structured paper and employed for the detection of streptavidin-alkaline phosphatase. By carrying out this assay on a paper-based analytical device structured using the silanization technique, we proved the biological compatibility of the suggested patterning technique.

  8. General and Domain-Specific Self-Concepts of Adults with Learning Disabilities: A Meta-Analysis

    ERIC Educational Resources Information Center

    Nelson, Jason M.

    2012-01-01

    The empirical literature investigating general and domain-specific self-concepts of adults with learning disabilities was examined using meta-analytic techniques. Eight inclusion criteria were developed to evaluate this literature and led to the inclusion of 22 studies. Results indicated that adults with learning disabilities reported lower…

  9. Characterizing and modeling protein-surface interactions in lab-on-chip devices

    NASA Astrophysics Data System (ADS)

    Katira, Parag

    Protein adsorption on surfaces determines the response of other biological species present in the surrounding solution. This phenomenon plays a major role in the design of biomedical and biotechnological devices. While specific protein adsorption is essential for device function, non-specific protein adsorption leads to the loss of device function. For example, non-specific protein adsorption on bioimplants triggers foreign body response, in biosensors it leads to reduced signal-to-noise ratios, and in hybrid bionanodevices it results in the loss of confinement and directionality of molecular shuttles. Novel surface coatings are being developed to reduce or completely prevent the non-specific adsorption of proteins to surfaces. A novel quantification technique for extremely low protein coverage on surfaces has been developed. This technique utilizes measurement of the landing rate of microtubule filaments on kinesin proteins adsorbed on a surface to determine the kinesin density. Ultra-low limits of detection, dynamic range, ease of detection and the availability of a ready-made kinesin-microtubule kit make this technique highly suitable for detecting protein adsorption below the detection limits of standard techniques. Secondly, a random sequential adsorption model is presented for protein adsorption to PEO-coated surfaces. The derived analytical expressions accurately predict the observed experimental results from various research groups, suggesting that PEO chains act as almost perfect steric barriers to protein adsorption. These expressions can be used to predict the performance of a variety of systems towards resisting protein adsorption and can help in the design of better non-fouling surface coatings. Finally, in biosensing systems, target analytes are captured and concentrated on specifically adsorbed proteins for detection. Non-specific adsorption of proteins results in the loss of signal and an increase in the background. 
The use of nanoscale transducers as receptors is beneficial from the point of view of signal enhancement, but has a strong mass transport limitation. To overcome this, the use of molecular shuttles has been proposed that can selectively capture analytes and actively transport them to the nanoreceptors. The effect of employing such a two-stage capture process on biosensor sensitivity is theoretically investigated and an optimum design for a kinesin-microtubule-driven hybrid biosensor is proposed.
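    The random sequential adsorption picture invoked above can be illustrated with a minimal one-dimensional "car parking" toy model (a standard textbook sketch, not the authors' model for PEO-coated surfaces): unit-length proteins land at random positions and stick only if they overlap no earlier adsorbate, so the surface jams well below full coverage.

```python
import random

def rsa_1d(line_length=200.0, protein_size=1.0, attempts=20_000, seed=1):
    """Random sequential adsorption of unit-length segments on a line.

    Each attempt drops a segment at a uniformly random position; it sticks
    only if it overlaps no previously adsorbed segment (hard exclusion).
    Returns the fraction of the line covered at the end of the run.
    """
    random.seed(seed)
    placed = []  # left edges of adsorbed segments
    for _ in range(attempts):
        x = random.uniform(0.0, line_length - protein_size)
        if all(abs(x - p) >= protein_size for p in placed):
            placed.append(x)
    return len(placed) * protein_size / line_length

coverage = rsa_1d()
# 1-D RSA jams near Renyi's constant (~0.7476); a finite run lands a bit below.
print(f"coverage = {coverage:.3f}")
```

    The point of the sketch is the qualitative behavior: even "perfect" steric exclusion leaves gaps too small for further adsorption, which is why coverage saturates below unity.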

  10. Deep Space Telecommunications Systems Engineering

    NASA Technical Reports Server (NTRS)

    Yuen, J. H. (Editor)

    1982-01-01

    Descriptive and analytical information useful for the optimal design, specification, and performance evaluation of deep space telecommunications systems is presented. Telemetry, tracking, and command systems, receiver design, spacecraft antennas, frequency selection, interference, and modulation techniques are addressed.

  11. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles, using ICP-MS but also coulometry, are on their way to gaining a foothold. Chemical sensors selective for nanoparticles are at an early stage, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed.
This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. 3D ToF-SIMS Analysis of Peptide Incorporation into MALDI Matrix Crystals with Sub-micrometer Resolution.

    PubMed

    Körsgen, Martin; Pelster, Andreas; Dreisewerd, Klaus; Arlinghaus, Heinrich F

    2016-02-01

    The analytical sensitivity in matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is largely affected by the specific analyte-matrix interaction, in particular by the possible incorporation of the analytes into crystalline MALDI matrices. Here we used time-of-flight secondary ion mass spectrometry (ToF-SIMS) to visualize the incorporation of three peptides with different hydrophobicities, bradykinin, Substance P, and vasopressin, into two classic MALDI matrices, 2,5-dihydroxybenzoic acid (DHB) and α-cyano-4-hydroxycinnamic acid (HCCA). For depth profiling, an Ar cluster ion beam was used to gradually sputter through the matrix crystals without causing significant degradation of matrix or biomolecules. A pulsed Bi3 ion cluster beam was used to image the lateral analyte distribution in the center of the sputter crater. Using this dual beam technique, the 3D distribution of the analytes and spatial segregation effects within the matrix crystals were imaged with sub-μm resolution. The technique could in the future enable matrix-enhanced (ME)-ToF-SIMS imaging of peptides in tissue slices at ultra-high resolution.

  13. 3D ToF-SIMS Analysis of Peptide Incorporation into MALDI Matrix Crystals with Sub-micrometer Resolution

    NASA Astrophysics Data System (ADS)

    Körsgen, Martin; Pelster, Andreas; Dreisewerd, Klaus; Arlinghaus, Heinrich F.

    2016-02-01

    The analytical sensitivity in matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is largely affected by the specific analyte-matrix interaction, in particular by the possible incorporation of the analytes into crystalline MALDI matrices. Here we used time-of-flight secondary ion mass spectrometry (ToF-SIMS) to visualize the incorporation of three peptides with different hydrophobicities, bradykinin, Substance P, and vasopressin, into two classic MALDI matrices, 2,5-dihydroxybenzoic acid (DHB) and α-cyano-4-hydroxycinnamic acid (HCCA). For depth profiling, an Ar cluster ion beam was used to gradually sputter through the matrix crystals without causing significant degradation of matrix or biomolecules. A pulsed Bi3 ion cluster beam was used to image the lateral analyte distribution in the center of the sputter crater. Using this dual beam technique, the 3D distribution of the analytes and spatial segregation effects within the matrix crystals were imaged with sub-μm resolution. The technique could in the future enable matrix-enhanced (ME)-ToF-SIMS imaging of peptides in tissue slices at ultra-high resolution.

  14. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Astrophysics Data System (ADS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-05-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to a variety of algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with a few use cases examined in detail.

  15. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to a variety of algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth Sciences. The lessons learned and experiences presented are based on a survey of use cases, with a few use cases examined in detail.

  16. ULTRASONIC STUDIES OF THE FUNDAMENTAL MECHANISMS OF RECRYSTALLIZATION AND SINTERING OF METALS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TURNER, JOSEPH A.

    2005-11-30

    The purpose of this project was to develop a fundamental understanding of the interaction of an ultrasonic wave with complex media, with specific emphases on recrystallization and sintering of metals. A combined analytical, numerical, and experimental research program was implemented. Theoretical models of elastic wave propagation through these complex materials were developed using stochastic wave field techniques. The numerical simulations focused on finite element wave propagation solutions through complex media. The experimental efforts were focused on corroboration of the models developed and on the development of new experimental techniques. The analytical and numerical research allows the experimental results to be interpreted quantitatively.

  17. New techniques for the analysis of manual control systems. [mathematical models of human operator behavior

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.

    1971-01-01

    Studies are summarized on the application of advanced analytical and computational methods to the development of mathematical models of human controllers in multiaxis manual control systems. Specific accomplishments include the following: (1) The development of analytical and computer methods for the measurement of random parameters in linear models of human operators. (2) Discrete models of human operator behavior in a multiple display situation were developed. (3) Sensitivity techniques were developed which make possible the identification of unknown sampling intervals in linear systems. (4) The adaptive behavior of human operators following particular classes of vehicle failures was studied and a model structure proposed.

  18. An Analytic Approximation to Very High Specific Impulse and Specific Power Interplanetary Space Mission Analysis

    NASA Technical Reports Server (NTRS)

    Williams, Craig Hamilton

    1995-01-01

    A simple, analytic approximation is derived to calculate trip time and performance for propulsion systems of very high specific impulse (50,000 to 200,000 seconds) and very high specific power (10 to 1000 kW/kg) for human interplanetary space missions. The approach assumed field-free space, constant thrust/constant specific power, and near straight line (radial) trajectories between the planets. Closed form, one dimensional equations of motion for two-burn rendezvous and four-burn round trip missions are derived as a function of specific impulse, specific power, and propellant mass ratio. The equations are coupled to an optimizing parameter that maximizes performance and minimizes trip time. Data generated for hypothetical one-way and round trip human missions to Jupiter were found to agree with integrated solutions to within 1% and 6%, respectively, verifying that for these systems, credible analysis does not require computationally intensive numerical techniques.
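    The scaling behind such an approximation can be illustrated with a back-of-the-envelope sketch (my own simplification with made-up round numbers, not the paper's closed-form equations): for field-free, straight-line flight, an ideal jet at specific power α and exhaust velocity ve = Isp·g0 gives an acceleration of roughly a = 2α/ve (propellant mass neglected), and an accelerate-then-decelerate profile over distance d takes t = 2·sqrt(d/a).

```python
import math

G0 = 9.80665  # m/s^2, standard gravity

def trip_time_days(distance_m, isp_s, alpha_w_per_kg):
    """Crude accelerate/decelerate trip-time estimate for a power-limited
    rocket: jet power P = alpha*m gives a = 2*alpha/ve (ideal jet,
    propellant mass neglected), and a bang-bang thrust profile over
    distance d needs t = 2*sqrt(d/a). Illustrative only."""
    ve = isp_s * G0
    a = 2.0 * alpha_w_per_kg / ve
    return 2.0 * math.sqrt(distance_m / a) / 86_400.0

# Rough Earth-Jupiter distance (~4.2 AU) with parameters from the
# abstract's ranges: Isp = 100,000 s, specific power = 100 kW/kg.
d = 4.2 * 1.496e11
t = trip_time_days(d, isp_s=100_000, alpha_w_per_kg=100_000)
print(f"trip time ~ {t:.0f} days")
```

    The estimate lands at tens of days, which is the regime such very-high-Isp, very-high-specific-power systems are meant to reach; the paper's equations additionally track propellant mass ratio and the burn structure.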

  19. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    PubMed

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20 %) to be reliably used down to approx. 100 ng L⁻¹, enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice among the three instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs. Individual methods within each class deviated less from one another, and the smallest differences were observed between different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties, such as high polarity or the capability of specific molecular interactions. Graphical abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
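    The link the abstract draws between extraction yield and MDL can be sketched as a simple scaling (an illustrative calculation with hypothetical numbers, not the paper's data): for a fixed instrumental detection limit in absolute mass, the achievable concentration MDL falls in proportion to the fraction of analyte actually delivered to the detector.

```python
def concentration_mdl(instrument_lod_ng, yield_fraction, sample_volume_l):
    """Concentration MDL (ng/L) for a fixed instrumental mass LOD:
    only yield_fraction of the analyte in the sample reaches the detector."""
    return instrument_lod_ng / (yield_fraction * sample_volume_l)

# Hypothetical numbers: 1 pg instrumental LOD, 10 mL aqueous sample.
static = concentration_mdl(0.001, 0.15, 0.010)   # static sampling, ~15 % yield
dynamic = concentration_mdl(0.001, 0.80, 0.010)  # dynamic enrichment, ~80 % yield
print(f"static ~ {static:.2f} ng/L, dynamic ~ {dynamic:.3f} ng/L")
```

    Real MDLs also depend on blank variability and detector noise, so this ratio is only the first-order part of the static-versus-enrichment gap reported above.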

  20. [Detection of rubella virus RNA in clinical material by real time polymerase chain reaction method].

    PubMed

    Domonova, É A; Shipulina, O Iu; Kuevda, D A; Larichev, V F; Safonova, A P; Burchik, M A; Butenko, A M; Shipulin, G A

    2012-01-01

    The aim was the development of a reagent kit for detection of rubella virus RNA in clinical material by real-time PCR (PCR-RT). During development, analytical specificity and sensitivity were determined using DNA and RNA of 33 different microorganisms, including 4 rubella strains. The analytical sensitivity of virological and molecular-biological methods was compared using rubella virus strains Wistar RA 27/3, M-33, "Orlov", and Judith. The diagnostic informativity of rubella virus RNA isolation from various clinical materials by the PCR-RT method was evaluated in comparison with the determination of virus-specific serum antibodies by enzyme immunoassay. A reagent kit for the detection of rubella virus RNA in clinical material by PCR-RT was developed. Analytical specificity was 100%; analytical sensitivity, 400 virus RNA copies per ml. The analytical sensitivity of the developed technique exceeds that of the Vero E6 cell culture infection method by 1 lg and 3 lg for rubella virus strains Wistar RA 27/3 and "Orlov", respectively, and is equivalent for the M-33 and Judith strains. Diagnostic specificity is 100%. Diagnostic sensitivity for samples obtained within 5 days of rash onset was: peripheral blood sera, 20.9%; saliva, 92.5%; nasopharyngeal swabs, 70.1%; saliva plus nasopharyngeal swabs, 97%. Positive and negative predictive values were shown depending on the type of clinical material tested. Application of the reagent kit will make it possible to improve the effectiveness of rubella diagnostics at the early stages of the infectious process and to perform timely, high-quality differential diagnosis of exanthematous diseases in support of anti-epidemic measures.

  1. Multimodal technique to eliminate humidity interference for specific detection of ethanol.

    PubMed

    Jalal, Ahmed Hasnain; Umasankar, Yogeswaran; Gonzalez, Pablo J; Alfonso, Alejandro; Bhansali, Shekhar

    2017-01-15

    A multimodal electrochemical technique incorporating both open circuit potential (OCP) and amperometric measurements has been conceptualized and implemented to improve the detection of a specific analyte in systems where more than one analyte is present. This approach has been demonstrated through the detection of ethanol while eliminating the contribution of water in a micro fuel cell sensor system. The sensor was interfaced with an LMP91000 potentiostat, controlled through an MSP430F5529LP microcontroller, to implement an auto-calibration algorithm tailored to improve the detection of alcohol. The sensor was designed and fabricated as a three-electrode system with Nafion as a proton exchange membrane (PEM). The electrochemical signal of the interfering phase (water) was eliminated by implementing the multimodal electrochemical detection technique. The results were validated by comparing the sensor and potentiostat performance with a commercial sensor and potentiostat, respectively. The results suggest that such a sensing system can detect ethanol at concentrations as low as 5 ppm. The structure and properties, such as low detection limit, selectivity and miniaturized size, enable potential application of this device in wearable transdermal alcohol measurements. Copyright © 2016 Elsevier B.V. All rights reserved.
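    The two-stage logic of such a multimodal readout can be sketched in a few lines (an illustrative sketch only: the OCP window and the sensitivity value below are hypothetical placeholders, not values from the paper or the LMP91000 datasheet): the open-circuit potential gates which species is presumed present, and only then is the amperometric current converted to a concentration.

```python
def classify_and_quantify(ocp_mv, current_ua, sensitivity_ua_per_ppm=0.02,
                          ethanol_ocp_window=(300.0, 500.0)):
    """Two-stage reading of a fuel-cell sensor (illustrative sketch):
    the OCP gates the species identification; the amperometric current
    is converted to concentration only inside the ethanol window.
    All thresholds and the sensitivity are hypothetical placeholders."""
    lo, hi = ethanol_ocp_window
    if not (lo <= ocp_mv <= hi):
        return None  # OCP outside ethanol window: treat as humidity/interferent
    return current_ua / sensitivity_ua_per_ppm

assert classify_and_quantify(120.0, 0.5) is None  # water-dominated response
ppm = classify_and_quantify(410.0, 0.10)
print(f"ethanol ~ {ppm:.0f} ppm")
```

    Gating on a second, independent observable is what lets the amperometric calibration remain simple: samples dominated by the interferent never reach the quantification step.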

  2. Commodity-Free Calibration

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Commodity-free calibration is a reaction rate calibration technique that does not require the addition of any commodities. This technique is a specific form of the reaction rate technique, where all of the necessary reactants, other than the sample being analyzed, are either inherent in the analyzing system or specifically added or provided to the system for a reason other than calibration. After introduction, the component of interest is exposed to other reactants or flow paths already present in the system. The instrument detector records one of the following to determine the rate of reaction: the increase in the response of the reaction product, a decrease in the signal of the analyte response, or a decrease in the signal from the inherent reactant. With these data, the initial concentration of the analyte is calculated. This type of system can analyze and calibrate simultaneously, reduce the risk of false positives and exposure to toxic vapors, and improve accuracy. Moreover, having an excess of the reactant already present in the system eliminates the need to add commodities, which further reduces cost, logistic problems, and potential contamination. Also, the calculations involved can be simplified by comparison to those of the reaction rate technique. We conducted tests with hypergols as an initial investigation into the feasibility of the technique.
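    The back-calculation of an initial concentration from a decaying signal can be sketched under a simple assumption the abstract does not spell out (first-order kinetics and a linear detector response; the numbers are synthetic): fit ln S against t and extrapolate the line back to t = 0.

```python
import math

def initial_concentration(times_s, signals, response_factor=1.0):
    """Back-extrapolate a first-order decay S(t) = S0*exp(-k*t) to t = 0.

    Fits ln(S) against t by ordinary least squares; the intercept gives
    ln(S0), and S0/response_factor is the initial analyte concentration.
    Assumes first-order kinetics and a linear detector response.
    Returns (C0, k)."""
    n = len(times_s)
    ys = [math.log(s) for s in signals]
    tbar = sum(times_s) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times_s, ys))
             / sum((t - tbar) ** 2 for t in times_s))
    intercept = ybar - slope * tbar
    return math.exp(intercept) / response_factor, -slope

# Synthetic decay: C0 = 10, k = 0.05 s^-1, first reading taken after mixing.
ts = [2.0, 4.0, 6.0, 8.0, 10.0]
ss = [10.0 * math.exp(-0.05 * t) for t in ts]
c0, k = initial_concentration(ts, ss)
print(f"C0 ~ {c0:.2f}, k ~ {k:.3f} s^-1")
```

    Because the rate constant k comes out of the same fit, the system can effectively calibrate (recover k) and quantify (recover C0) from a single run, which is the attraction claimed above.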

  3. On the design of decoupling controllers for advanced rotorcraft in the hover case

    NASA Technical Reports Server (NTRS)

    Fan, M. K. H.; Tits, A.; Barlow, J.; Tsing, N. K.; Tischler, M.; Takahashi, M.

    1991-01-01

    A methodology for design of helicopter control systems is proposed that can account for various types of concurrent specifications: stability, decoupling between longitudinal and lateral motions, handling qualities, and physical limitations of the swashplate motions. This is achieved by synergistic use of analytical techniques (Q-parameterization of all stabilizing controllers, transfer function interpolation) and advanced numerical optimization techniques. The methodology is used to design a controller for the UH-60 helicopter in hover. Good results are achieved for decoupling and handling quality specifications.

  4. Challenges in Modern Anti-Doping Analytical Science.

    PubMed

    Ayotte, Christiane; Miller, John; Thevis, Mario

    2017-01-01

    The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may result in the future in a change in the strategy applied by the World Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.

  5. Application of techniques to identify coal-mine and power-generation effects on surface-water quality, San Juan River basin, New Mexico and Colorado

    USGS Publications Warehouse

    Goetz, C.L.; Abeyta, Cynthia G.; Thomas, E.V.

    1987-01-01

    Numerous analytical techniques were applied to determine water-quality changes in the San Juan River basin upstream of Shiprock, New Mexico. Eight techniques were used to analyze hydrologic data such as precipitation, water quality, and streamflow: (1) Piper diagram, (2) time-series plot, (3) frequency distribution, (4) box-and-whisker plot, (5) seasonal Kendall test, (6) Wilcoxon rank-sum test, (7) SEASRS procedure, and (8) analysis of flow-adjusted specific-conductance data and smoothing. Post-1963 changes in dissolved-solids concentration, dissolved-potassium concentration, specific conductance, suspended-sediment concentration, or suspended-sediment load in the San Juan River downstream from the surface coal mines were examined to determine whether coal mining was having an effect on the quality of surface water. None of the analytical methods used to analyze the data showed any increase in dissolved-solids concentration, dissolved-potassium concentration, or specific conductance in the river downstream from the mines; some of the analytical methods showed a decrease in dissolved-solids concentration and specific conductance. The Chaco River, an ephemeral stream tributary to the San Juan River, undergoes changes in water quality due to effluent from a power generation facility. The discharge in the Chaco River contributes about 1.9% of the average annual discharge at the downstream station, San Juan River at Shiprock, NM. The changes in water quality detected at the Chaco River station were not detected at the downstream Shiprock station. It was not possible, with the available data, to identify any effects of the surface coal mines on water quality that were separable from those of urbanization, agriculture, and other cultural and natural changes. In order to determine the specific causes of changes in water quality, it would be necessary to collect additional data at strategically located stations. (Author's abstract)
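    One of the eight methods listed above, the Wilcoxon rank-sum test, can be sketched in a few lines (normal approximation, no tie correction; the conductance values below are synthetic, not the San Juan River record):

```python
import math

def rank_sum_z(sample_a, sample_b):
    """Wilcoxon rank-sum z-statistic (normal approximation, assumes no ties).

    Ranks the pooled data, sums the ranks of sample_a, and standardizes
    against the null-hypothesis mean and variance of that rank sum.
    """
    pooled = sorted((v, i) for i, v in enumerate(sample_a + sample_b))
    ranks = {idx: rank for rank, (_, idx) in enumerate(pooled, start=1)}
    n, m = len(sample_a), len(sample_b)
    w = sum(ranks[i] for i in range(n))          # rank sum of sample_a
    mean = n * (n + m + 1) / 2.0
    var = n * m * (n + m + 1) / 12.0
    return (w - mean) / math.sqrt(var)

# Synthetic specific-conductance values (uS/cm), upstream vs downstream.
upstream = [310, 295, 320, 305, 298, 312, 301, 308]
downstream = [318, 306, 331, 322, 311, 327, 315, 319]
z = rank_sum_z(upstream, downstream)
p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(f"z = {z:.2f}, p ~ {p_two_sided:.3f}")
```

    Rank-based tests like this one are favored for streamflow and conductance records precisely because they need no distributional assumption; the seasonal Kendall test extends the same idea by comparing only within-season pairs.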

  6. Tidal analysis of Met rocket wind data

    NASA Technical Reports Server (NTRS)

    Bedinger, J. F.; Constantinides, E.

    1976-01-01

    A method of analyzing Met Rocket wind data is described. Modern tidal theory and specialized analytical techniques were used to resolve specific tidal modes and prevailing components in observed wind data. A representation of the wind which is continuous in both space and time was formulated. Such a representation allows direct comparison with theory, allows the derivation of other quantities such as temperature and pressure which in turn may be compared with observed values, and allows the formation of a wind model which extends over a broader range of space and time. Significant diurnal tidal modes with wavelengths of 10 and 7 km were present in the data and were resolved by the analytical technique.
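    The resolution of a single tidal mode from a wind record can be sketched with a discrete harmonic projection (a generic harmonic-analysis sketch with synthetic data, not the authors' specialized technique): over an integer number of periods of uniformly spaced samples, projecting onto cosine and sine at the diurnal frequency recovers that mode's amplitude and phase exactly.

```python
import math

def fit_harmonic(samples, period_hours, dt_hours):
    """Least-squares fit of mean + A*cos(w*t) + B*sin(w*t) by projection.

    Exact when the record spans an integer number of periods at uniform
    spacing, since the basis functions are then mutually orthogonal.
    Returns (mean, amplitude, phase) with y ~ mean + amplitude*cos(w*t + phase).
    """
    n = len(samples)
    w = 2 * math.pi / period_hours
    mean = sum(samples) / n
    a = 2 / n * sum((y - mean) * math.cos(w * k * dt_hours)
                    for k, y in enumerate(samples))
    b = 2 / n * sum((y - mean) * math.sin(w * k * dt_hours)
                    for k, y in enumerate(samples))
    return mean, math.hypot(a, b), math.atan2(-b, a)

# Synthetic wind record: 5 m/s prevailing flow plus a 12 m/s diurnal tide,
# sampled hourly for two days (two full 24 h periods).
truth = lambda t: 5.0 + 12.0 * math.cos(2 * math.pi * t / 24.0 + 0.7)
record = [truth(t) for t in range(48)]
mean, amp, phase = fit_harmonic(record, period_hours=24.0, dt_hours=1.0)
print(f"mean={mean:.2f} m/s, amplitude={amp:.2f} m/s, phase={phase:.2f} rad")
```

    The prevailing component falls out as the mean and the diurnal mode as the fitted harmonic; resolving several modes (e.g. diurnal plus semidiurnal) just adds more orthogonal basis pairs to the same fit.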

  7. Image correlation and sampling study

    NASA Technical Reports Server (NTRS)

    Popp, D. J.; Mccormack, D. S.; Sedwick, J. L.

    1972-01-01

    The development of analytical approaches for solving image correlation and image sampling of multispectral data is discussed. Relevant multispectral image statistics which are applicable to image correlation and sampling are identified. The general image statistics include intensity mean, variance, amplitude histogram, power spectral density function, and autocorrelation function. The translation problem associated with digital image registration and the analytical means for comparing commonly used correlation techniques are considered. General expressions for determining the reconstruction error for specific image sampling strategies are developed.

  8. Viking dynamics experience with application to future payload design

    NASA Technical Reports Server (NTRS)

    Barrett, S.; Rader, W. P.; Payne, K. R.

    1978-01-01

    Analytical and test techniques are discussed. Areas in which hindsight indicated erroneous, redundant, or unnecessarily severe design and test specifications are identified. Recommendations are made for improvements in the dynamic design and criteria philosophy, aimed at reducing costs for payloads.

  9. Challenging loop-mediated isothermal amplification (LAMP) technique for molecular detection of Toxoplasma gondii.

    PubMed

    Fallahi, Shirzad; Mazar, Zahra Arab; Ghasemian, Mehrdad; Haghighi, Ali

    2015-05-01

    To compare the analytical sensitivity and specificity of a newly described DNA amplification technique, LAMP, and nested PCR assays targeting the RE and B1 genes for the detection of Toxoplasma gondii (T. gondii) DNA. The analytical sensitivity of LAMP and nested PCR was obtained against 10-fold serial dilutions of T. gondii DNA ranging from 1 ng to 0.01 fg. DNA samples of other parasites and human chromosomal DNA were used to determine the specificity of the molecular assays. After testing LAMP and nested PCR in duplicate, the detection limits of the RE-LAMP, B1-LAMP, RE-nested PCR and B1-nested PCR assays were 1 fg, 100 fg, 1 pg and 10 pg of T. gondii DNA, respectively. All the LAMP assays and nested PCRs were 100% specific. The RE-LAMP assay showed the highest sensitivity for the detection of T. gondii DNA. The obtained results demonstrate that the LAMP technique has greater sensitivity for detection of T. gondii. Furthermore, these findings indicate that primers based on the RE are more suitable than those based on the B1 gene. However, the B1-LAMP assay also has potential as a diagnostic tool for detection of T. gondii. Copyright © 2015 Hainan Medical College. Production and hosting by Elsevier B.V. All rights reserved.

  10. Electrochemical detection of a single cytomegalovirus at an ultramicroelectrode and its antibody anchoring

    PubMed Central

    Dick, Jeffrey E.; Hilterbrand, Adam T.; Boika, Aliaksei; Upton, Jason W.; Bard, Allen J.

    2015-01-01

    We report observations of stochastic collisions of murine cytomegalovirus (MCMV) on ultramicroelectrodes (UMEs), extending the observation of discrete collision events on UMEs to biologically relevant analytes. Adsorption of an antibody specific for a virion surface glycoprotein allowed differentiation of MCMV from MCMV bound by antibody from the collision frequency decrease and current magnitudes in the electrochemical collision experiments, which shows the efficacy of the method to size viral samples. To add selectivity to the technique, interactions between MCMV, a glycoprotein-specific primary antibody to MCMV, and polystyrene bead “anchors,” which were functionalized with a secondary antibody specific to the Fc region of the primary antibody, were used to affect virus mobility. Bead aggregation was observed, and the extent of aggregation was measured using the electrochemical collision technique. Scanning electron microscopy and optical microscopy further supported aggregate shape and extent of aggregation with and without MCMV. This work extends the field of collisions to biologically relevant antigens and provides a novel foundation upon which qualitative sensor technology might be built for selective detection of viruses and other biologically relevant analytes. PMID:25870261
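    The expected rate of such stochastic collision events can be sketched from standard ultramicroelectrode theory (a textbook estimate with hypothetical numbers, not the paper's measured values): the diffusion-limited arrival rate at an inlaid disk of radius a is f = 4·D·c·a, with the particle diffusion coefficient D taken from the Stokes-Einstein relation.

```python
import math

K_B = 1.380649e-23  # J/K, Boltzmann constant

def stokes_einstein_d(radius_m, temp_k=298.15, viscosity_pa_s=8.9e-4):
    """Diffusion coefficient of a sphere in water (Stokes-Einstein)."""
    return K_B * temp_k / (6 * math.pi * viscosity_pa_s * radius_m)

def collision_frequency(conc_per_m3, particle_radius_m, electrode_radius_m):
    """Diffusion-limited arrival rate at a disk ultramicroelectrode,
    f = 4*D*c*a (steady-state flux to an inlaid disk of radius a)."""
    d = stokes_einstein_d(particle_radius_m)
    return 4 * d * conc_per_m3 * electrode_radius_m

# Hypothetical: ~100 nm radius virions at 1 pM on a 5 um radius disk UME.
c = 1e-12 * 6.02214076e23 * 1e3  # 1 pM -> particles per m^3
f = collision_frequency(c, 100e-9, 5e-6)
print(f"expected collision frequency ~ {f:.2e} s^-1")
```

    Because f scales with both D and c, antibody binding that slows or aggregates the virions lowers the observed frequency, which is the readout the abstract exploits to distinguish bound from unbound MCMV.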

  11. A Unifying Review of Bioassay-Guided Fractionation, Effect-Directed Analysis and Related Techniques

    PubMed Central

    Weller, Michael G.

    2012-01-01

    The success of modern methods in analytical chemistry sometimes obscures the problem that the ever increasing amount of analytical data does not necessarily give more insight of practical relevance. As alternative approaches, toxicity- and bioactivity-based assays can deliver valuable information about biological effects of complex materials in humans, other species or even ecosystems. However, the observed effects often cannot be clearly assigned to specific chemical compounds. In these cases, the establishment of an unambiguous cause-effect relationship is not possible. Effect-directed analysis tries to interconnect instrumental analytical techniques with a biological/biochemical entity, which identifies or isolates substances of biological relevance. Successful application has been demonstrated in many fields, either as proof-of-principle studies or even for complex samples. This review discusses the different approaches, advantages and limitations and finally shows some practical examples. The broad emergence of effect-directed analytical concepts might lead to a true paradigm shift in analytical chemistry, away from ever growing lists of chemical compounds. The connection of biological effects with the identification and quantification of molecular entities leads to relevant answers to many real life questions. PMID:23012539

  12. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs of analyzing brain images) that need to work together with multiple analytical tools reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop or new approaches with databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather tradtional structural databases (e.g. relational databases). More recently, data analysis and underpinned analytics frameworks also have to consider energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. 
This talk will provide insights about big data analytics methods in the context of science within various communities, and offers different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically but also scientifically feasible; its lighthouse goal is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community.
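    The "simple (iterative) map-reduce methods" mentioned above can be illustrated without Hadoop or Twister; a minimal pure-Python word-count sketch of the map, shuffle and reduce phases (illustrative only, not the RDA group's code):

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Emit (key, 1) pairs for every word in one document.
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # Group intermediate pairs by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts for each key.
    return {key: sum(values) for key, values in groups.items()}

documents = ["big data analytics", "big data"]
pairs = chain.from_iterable(map_phase(d) for d in documents)
counts = reduce_phase(shuffle(pairs))
```

    Frameworks such as Hadoop distribute the same three phases across machines; the logic per key is unchanged.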

  13. [Amanitine determination as an example of peptide analysis in the biological samples with HPLC-MS technique].

    PubMed

    Janus, Tomasz; Jasionowicz, Ewa; Potocka-Banaś, Barbara; Borowiak, Krzysztof

    Routine toxicological analysis is mostly focused on the identification of inorganic and organic, chemically diverse compounds, but generally of low mass, usually not greater than 500-600 Da. Peptide compounds with molecular mass above 900 Da form a specific analytical group. Several dozen of them are highly toxic substances well known in toxicological practice, for example mushroom toxins and animal venoms. In this paper the authors use the example of alpha-amanitin to explain the analytical problems, and several original solutions, in identifying peptides in urine samples with a universal LC-MS/MS procedure. The analyzed material was urine samples collected from patients with suspected mushroom intoxication, routinely diagnosed for amanitin determination. Ultrafiltration with centrifuge filter tubes (3 kDa mass cutoff) was used. The filtrate was injected directly onto the chromatographic column and analyzed with a tandem mass spectrometric detector (MS/MS). The separation of peptides, as organic amphoteric compounds, from biological material with the SPE technique is well known but requires dedicated, specific columns. The presented paper proved that with the fast and simple ultrafiltration technique amanitin can be effectively isolated from urine, and the procedure offers satisfactory sensitivity of detection and eliminates the influence of the biological matrix on analytical results. Another problem which had to be solved was the non-characteristic fragmentation of peptides in the MS/MS procedure, which yields non-selective chromatograms. Higher collision energies can be used in the analytical procedure, which results in more characteristic mass spectra, although at lower sensitivity. The ultrafiltration technique as a sample-preparation procedure is effective for the isolation of amanitin from the biological matrix. Monitoring of the selected mass corresponding to the transition with loss of a water molecule offers satisfactory sensitivity of determination.

  14. Overview: MURI Center on spectroscopic and time domain detection of trace explosives in condensed and vapor phases

    NASA Astrophysics Data System (ADS)

    Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay

    2003-09-01

    The research center established by the Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigate and advance the use of complementary analytical techniques for sensing of explosives and/or explosive-related compounds as they occur in the environment. The techniques being investigated include Terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring Down Spectroscopy (CRDS) and Resonance Enhanced Multiphoton Ionization (REMPI). This suite of techniques encompasses a diversity of sensing approaches that can be applied to detection of explosives in condensed phases, such as adsorbed species in soil, or can be used for vapor phase detection above the source. Some techniques allow for remote detection while others have highly specific and sensitive analysis capabilities. This program is addressing a range of fundamental, technical issues associated with trace detection of explosive-related compounds using these techniques. For example, while both LIBS and THz can be used to carry out remote analysis of condensed phase analyte from a distance in excess of several meters, the sensitivities of these techniques to surface-adsorbed explosive-related compounds are not currently known. In current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances required for these techniques, including signature identification for explosive-related compounds/interferents and trace analyte extraction. Later program tasks will explore simultaneous application of two or more techniques to assess the benefits of sensor fusion.

  15. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  16. Positron Emission Tomography: Human Brain Function and Biochemistry.

    ERIC Educational Resources Information Center

    Phelps, Michael E.; Mazziotta, John C.

    1985-01-01

    Describes the method, present status, and application of positron emission tomography (PET), an analytical imaging technique for "in vivo" measurements of the anatomical distribution and rates of specific biochemical reactions. Measurements and images of dynamic biochemistry link basic and clinical neurosciences with clinical findings…

  17. Recent advances in immunosensor for narcotic drug detection

    PubMed Central

    Gandhi, Sonu; Suman, Pankaj; Kumar, Ashok; Sharma, Prince; Capalash, Neena; Suri, C. Raman

    2015-01-01

    Introduction: Immunosensors for illicit drugs have gained immense interest and have found several applications for drug-abuse monitoring. This technology has offered low-cost detection of narcotics, thereby providing a confirmatory platform to complement existing analytical methods. Methods: In this minireview, we define the basic concept of a transducer for immunosensor development that utilizes antibodies and low-molecular-mass hapten (opiate) molecules. Results: This article emphasizes recent advances in immunoanalytical techniques for monitoring of opiate drugs. Our results demonstrate that high-quality antibodies can be used for immunosensor development against a target analyte with greater sensitivity, specificity and precision than other available analytical methods. Conclusion: In this review we highlight the fundamentals of different transducer technologies and their applications for immunosensor development currently being pursued in our laboratory: rapid screening via an immunochromatographic kit, and label-free optical detection via enzyme-, fluorescence-, gold nanoparticle- and carbon nanotube-based immunosensing for sensitive and specific monitoring of opiates. PMID:26929925

  18. Guidelines for collection and field analysis of water-quality samples from streams in Texas

    USGS Publications Warehouse

    Wells, F.C.; Gibbons, W.J.; Dorsey, M.E.

    1990-01-01

    Analyses for unstable constituents or properties are by necessity performed in the field. This manual addresses analytical techniques and quality assurance for: (1) Water temperature; (2) specific conductance; (3) pH; (4) alkalinity; (5) dissolved oxygen; and (6) bacteria.

  19. Constructing and predicting solitary pattern solutions for nonlinear time-fractional dispersive partial differential equations

    NASA Astrophysics Data System (ADS)

    Arqub, Omar Abu; El-Ajou, Ahmad; Momani, Shaher

    2015-07-01

    Building fractional mathematical models for specific phenomena and developing numerical or analytical solutions for these models are crucial issues in mathematics, physics, and engineering. In this work, a new analytical technique for constructing and predicting solitary pattern solutions of time-fractional dispersive partial differential equations is proposed, based on the generalized Taylor series formula and the residual error function. The new approach provides solutions in the form of a rapidly convergent series with easily computable components using symbolic computation software. For evaluation and validation, the proposed technique was applied to three different models and compared with some well-known methods. The resulting simulations clearly demonstrate the potential and accuracy of the proposed technique, both in preserving the substructure of the constructed solutions and in predicting solitary pattern solutions of time-fractional dispersive partial differential equations.
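    The abstract does not reproduce the expansion it relies on; as hedged background (a standard form in the residual power series literature, not quoted from this paper), the generalized Taylor series about t = 0 for an unknown u(x,t) with Caputo fractional order α is typically written as

```latex
u(x,t) = \sum_{n=0}^{\infty} f_n(x)\,\frac{t^{n\alpha}}{\Gamma(n\alpha + 1)},
\qquad 0 < \alpha \le 1,
```

    For a k-term truncation u_k of this series, a residual error function of the form Res_k = D_t^α u_k − N[u_k] (with N the nonlinear spatial operator of the given equation) is formed, and its fractional derivatives are required to vanish at t = 0 to determine the coefficients f_n(x) one by one.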

  20. Intracavity optogalvanic spectroscopy. An analytical technique for 14C analysis with subattomole sensitivity.

    PubMed

    Murnick, Daniel E; Dogru, Ozgur; Ilkmen, Erhan

    2008-07-01

    We show a new ultrasensitive laser-based analytical technique, intracavity optogalvanic spectroscopy, allowing extremely high sensitivity for detection of ¹⁴C-labeled carbon dioxide. Capable of replacing large accelerator mass spectrometers, the technique quantifies attomoles of ¹⁴C in submicrogram samples. Based on the specificity of narrow laser resonances coupled with the sensitivity provided by standing waves in an optical cavity and detection via impedance variations, limits of detection near 10⁻¹⁵ ¹⁴C/¹²C ratios are obtained. Using a 15-W ¹⁴CO₂ laser, a linear calibration with samples from 10⁻¹⁵ to >1.5 × 10⁻¹² in ¹⁴C/¹²C ratios, as determined by accelerator mass spectrometry, is demonstrated. Possible applications include microdosing studies in drug development, individualized subtherapeutic tests of drug metabolism, carbon dating and real-time monitoring of atmospheric radiocarbon. The method can also be applied to detection of other trace entities.
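    The claimed sensitivity can be cross-checked with simple arithmetic; a sketch (assuming the sample is essentially pure carbon-12 and using Avogadro's number) of how many ¹⁴C atoms a given isotope ratio implies:

```python
AVOGADRO = 6.022e23   # atoms per mole
MOLAR_MASS_C = 12.0   # g/mol, approximating the sample as pure 12C

def c14_atoms(sample_mass_g, ratio_14_12):
    """Number of 14C atoms in a carbon sample with the given 14C/12C ratio."""
    carbon_atoms = sample_mass_g / MOLAR_MASS_C * AVOGADRO
    return carbon_atoms * ratio_14_12

def c14_attomoles(sample_mass_g, ratio_14_12):
    """Same quantity expressed in attomoles (1 amol = 1e-18 mol)."""
    return c14_atoms(sample_mass_g, ratio_14_12) / AVOGADRO * 1e18
```

    A 1-microgram sample at the modern-carbon ratio of about 1.2e-12 holds roughly 0.1 amol of ¹⁴C, and at the 10⁻¹⁵ detection limit only about 50 atoms, consistent with the "subattomole" sensitivity in the title.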

  1. A technique for computation of noise temperature due to a beam waveguide shroud

    NASA Technical Reports Server (NTRS)

    Veruttipong, W.; Franco, M. M.

    1993-01-01

    Direct analytical computation of the noise temperature of real beam waveguide (BWG) systems, including all mirrors and the surrounding shroud, is an extremely complex problem and virtually impossible to achieve. Yet the DSN antennas are required to be ultra-low-noise in order to be effective, and a reasonably accurate prediction is essential. This article presents a relatively simple technique to compute a real BWG system noise temperature by combining analytical techniques with data from experimental tests. Specific expressions and parameters for X-band (8.45-GHz) BWG noise computation are obtained for DSS 13 and DSS 24, now under construction. These expressions are also valid for various configurations of the BWG feed systems, including horn sizes and positions, and mirror sizes, curvatures, and positions. Parameters for S- and Ka-bands (2.3 and 32.0 GHz) have not been determined; however, they can be obtained following the same procedure as for X-band.

  2. Applications of surface analytical techniques in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Qian, Gujie; Li, Yubiao; Gerson, Andrea R.

    2015-03-01

    This review covers a wide range of surface analytical techniques: X-ray photoelectron spectroscopy (XPS), scanning photoelectron microscopy (SPEM), photoemission electron microscopy (PEEM), dynamic and static secondary ion mass spectroscopy (SIMS), electron backscatter diffraction (EBSD), and atomic force microscopy (AFM). Others that are relatively less widely used but are also important to the Earth Sciences are also included: Auger electron spectroscopy (AES), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). All these techniques probe only the very top sample surface layers (sub-nm to several tens of nm). In addition, we also present several other techniques, i.e. Raman microspectroscopy, reflection infrared (IR) microspectroscopy and quantitative evaluation of minerals by scanning electron microscopy (QEMSCAN), that penetrate deeper into the sample, up to several μm, as all of them are fundamental analytical tools for the Earth Sciences. Grazing incidence synchrotron techniques, sensitive to surface measurements, are also briefly introduced at the end of this review. (Scanning) transmission electron microscopy (TEM/STEM) is a special case that can be applied to characterisation of mineralogical and geological sample surfaces. Since TEM/STEM is such an important technique for Earth Scientists, we have also included it to draw attention to the capability of TEM/STEM applied as a surface-equivalent tool. While this review presents most of the important techniques for the Earth Sciences, it is not an all-inclusive bibliography of those analytical techniques. Instead, for each technique that is discussed, we first give a very brief introduction about its principle and background, followed by a short section on approaches to sample preparation that are important for researchers to appreciate prior to the actual sample analysis. 
We then use examples from publications (and also some of our own unpublished results) within the Earth Sciences to show how each technique is applied and used to obtain specific information and to resolve real problems, which forms the central theme of this review. Although this review focuses on applications of these techniques to study mineralogical and geological samples, we also anticipate that researchers from other research areas such as Materials and Environmental Sciences may benefit from this review.

  3. [The requirements of standard and conditions of interchangeability of medical articles].

    PubMed

    Men'shikov, V V; Lukicheva, T I

    2013-11-01

    The article considers the possibility of applying specific approaches to evaluating the interchangeability of medical devices for laboratory analysis. In developing standardized analytical technologies for laboratory medicine and formulating standard requirements addressed to manufacturers of medical devices, clinically validated requirements must be followed. These requirements include the sensitivity and specificity of techniques, the accuracy and precision of research results, and the stability of reagent quality under particular conditions of transportation and storage. The validity of requirements formulated in standards and addressed to manufacturers of medical devices can be verified using a reference system, which includes master forms and standard samples, reference techniques and reference laboratories. This approach is supported by data from the evaluation of testing systems for measurement of levels of thyrotropic hormone, thyroid hormones and glycated hemoglobin HbA1c. Versions of testing systems can be considered interchangeable only if their results correspond to, and are comparable with, the results of the reference technique. In the absence of a functioning reference system, the resources of the Joint Committee for Traceability in Laboratory Medicine make it possible for manufacturers of reagent kits to apply certified reference materials in developing the manufacture of kits for a large list of analytes.

  4. Electrostatic Interactions between OmpG Nanopore and Analyte Protein Surface Can Distinguish between Glycosylated Isoforms.

    PubMed

    Fahie, Monifa A; Chen, Min

    2015-08-13

    The flexible loops decorating the entrance of the OmpG nanopore move dynamically during ionic current recording. The gating caused by these flexible loops changes when a target protein is bound. The gating is characterized by parameters including frequency, duration, and open-pore current, and these features combine to reveal the identity of a specific analyte protein. Here, we show that an OmpG nanopore equipped with a biotin ligand can distinguish glycosylated and deglycosylated isoforms of avidin by their differences in surface charge. Our studies demonstrate that the direct interaction between the nanopore and the analyte surface, induced by electrostatic attraction between the two molecules, is essential for protein isoform detection. Our technique is remarkably sensitive to the analyte surface, which may provide a useful tool for glycoprotein profiling.

  5. Analysis of small droplets with a new detector for liquid chromatography based on laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Janzen, Christoph; Fleige, Rüdiger; Noll, Reinhard; Schwenke, Heinrich; Lahmann, Wilhelm; Knoth, Joachim; Beaven, Peter; Jantzen, Eckard; Oest, Andreas; Koke, Peter

    2005-08-01

    The miniaturization of analytical techniques is a general trend in speciation analytics. We have developed a new analytical technique combining high pressure liquid chromatography (HPLC) with laser-induced breakdown spectroscopy (LIBS). This enables a molecule-specific separation followed by an element-specific analysis of the smallest amounts of complex samples. The liquid flow coming from an HPLC pump is transformed into a continuous stream of small droplets (diameter 50-100 μm, volume 65-500 pl) using a piezoelectric pulsed nozzle. After the detection of single droplets with a droplet detector, a Q-switched Nd:YAG laser is triggered to emit a synchronized laser pulse that irradiates a single droplet. The droplets are evaporated and transformed to the plasma state. The spectrum emitted from the plasma is collected by a spherical mirror and directed through the entrance slit of a Paschen-Runge spectrometer equipped with channel photomultipliers. The spectrometer detects 31 elements simultaneously, covering a spectral range from 120 to 589 nm. Purging the measurement chamber with argon enables the detection of vacuum-UV lines. Since the sample is transferred to the plasma state without dilution, very low flow rates in the sub-μl/min range can be realised.
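    The quoted droplet volumes follow directly from the quoted diameters; a quick sketch of the sphere-volume conversion (assuming spherical droplets):

```python
import math

def droplet_volume_pl(diameter_um):
    """Volume of a spherical droplet in picolitres (1 um^3 = 1e-3 pl)."""
    volume_um3 = math.pi / 6.0 * diameter_um ** 3
    return volume_um3 * 1e-3
```

    A 50 μm droplet gives about 65 pl and a 100 μm droplet about 520 pl, consistent with the 65-500 pl range stated above.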

  6. Philosophical Inquiry: (An Investigation of Basic Philosophical Presuppositions) Teacher's Manual.

    ERIC Educational Resources Information Center

    Institute for Services to Education, Inc., Washington, DC.

    This guide provides teaching techniques for an undergraduate philosophy course. Students examine specific philosophic issues related to the black person's experience. They are required to apply critical and analytical procedures leading to philosophical investigations of topics of both philosophical and nonphilosophical origins. The teaching…

  7. Analytical techniques for identification and study of organic matter in returned lunar samples

    NASA Technical Reports Server (NTRS)

    Burlingame, A. L.

    1974-01-01

    The results of geochemical research are reviewed. Emphasis is placed on the contribution of mass spectrometric data to the solution of specific structural problems. Information on the mass spectrometric behavior of compounds of geochemical interest is reviewed and currently available techniques of particular importance to geochemistry, such as gas chromatograph-mass spectrometer coupling, modern sample introduction methods, and computer application in high resolution mass spectrometry, receive particular attention.

  8. New techniques for imaging and analyzing lung tissue.

    PubMed Central

    Roggli, V L; Ingram, P; Linton, R W; Gutknecht, W F; Mastin, P; Shelburne, J D

    1984-01-01

    The recent technological revolution in the field of imaging techniques has provided pathologists and toxicologists with an expanding repertoire of analytical techniques for studying the interaction between the lung and the various exogenous materials to which it is exposed. Analytical problems requiring elemental sensitivity or specificity beyond the range of that offered by conventional scanning electron microscopy and energy dispersive X-ray analysis are particularly appropriate for the application of these newer techniques. Electron energy loss spectrometry, Auger electron spectroscopy, secondary ion mass spectrometry, and laser microprobe mass analysis each offer unique advantages in this regard, but also possess their own limitations and disadvantages. Diffraction techniques provide crystalline structural information available through no other means. Bulk chemical techniques provide useful cross-checks on the data obtained by microanalytical approaches. It is the purpose of this review to summarize the methodology of these techniques, acknowledge situations in which they have been used in addressing problems in pulmonary toxicology, and comment on the relative advantages and disadvantages of each approach. It is necessary for an investigator to weigh each of these factors when deciding which technique is best suited for any given analytical problem; often it is useful to employ a combination of two or more of the techniques discussed. It is anticipated that there will be increasing utilization of these technologies for problems in pulmonary toxicology in the decades to come. PMID:6090115

  9. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace concentrations of steroid estrogens in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were instrumental and non-instrumental analytical techniques, namely gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), yeast estrogen screen (YES) assay, and human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its low detection limit, simplicity, rapidity and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Automated Solid Phase Extraction (SPE) LC/NMR Applied to the Structural Analysis of Extractable Compounds from a Pharmaceutical Packaging Material of Construction.

    PubMed

    Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C

    2013-01-01

    The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. 
Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, while mass spectrometry-based analytical techniques have limitations for this application, newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.

  11. A Review of Meta-Analysis Packages in R

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Hennessy, Emily A.; Tanner-Smith, Emily E.

    2017-01-01

    Meta-analysis is a statistical technique that allows an analyst to synthesize effect sizes from multiple primary studies. To estimate meta-analysis models, the open-source statistical environment R is quickly becoming a popular choice. The meta-analytic community has contributed to this growth by developing numerous packages specific to…
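    The synthesis step that such R packages automate can be made concrete; a minimal fixed-effect, inverse-variance sketch in Python (hypothetical effect sizes and variances, not any package's API):

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted (fixed-effect) pooled estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Three hypothetical primary-study effect sizes with their sampling variances:
pooled, se = fixed_effect_meta([0.2, 0.5, 0.3], [0.04, 0.09, 0.01])
```

    More precise studies (smaller variances) dominate the pooled estimate; random-effects models add a between-study variance component to each weight, which is where packages such as those reviewed here differ in estimation method.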

  12. Biosensors for hepatitis B virus detection.

    PubMed

    Yao, Chun-Yan; Fu, Wei-Ling

    2014-09-21

    A biosensor is an analytical device used for the detection of analytes, which combines a biological component with a physicochemical detector. Recently, an increasing number of biosensors have been used in clinical research, for example, the blood glucose biosensor. This review focuses on the current state of biosensor research with respect to efficient, specific and rapid detection of hepatitis B virus (HBV). The biosensors developed based on different techniques, including optical methods (e.g., surface plasmon resonance), acoustic wave technologies (e.g., quartz crystal microbalance), electrochemistry (amperometry, voltammetry and impedance) and novel nanotechnology, are also discussed.

  13. Discourse for slide presentation: An overview of chemical detection systems

    NASA Technical Reports Server (NTRS)

    Peters, Randy Alan; Galen, Theodore J.; Pierson, Duane L.

    1990-01-01

    A brief overview of some of the analytical techniques currently used in monitoring and analyzing permanent gases and selected volatile organic compounds in air is presented. Some of the analytical considerations in developing a specific method are discussed. Four broad groups of hardware are discussed: compound-class-specific personal monitors, gas chromatographic systems, infrared spectroscopic systems, and mass spectrometric residual gas analyzer systems. Three types of detectors are also discussed: catalytic sensor based systems, photoionization detectors, and wet or dry chemical reagent systems. Under gas chromatograph-based systems, five detector systems used in combination with a GC are covered: thermal conductivity detectors, photoionization detectors, Fourier transform infrared spectrophotometric systems, quadrupole mass spectrometric systems, and a relatively recent development, a surface acoustic wave vapor detector.

  14. Characterization and measurement of natural gas trace constituents. Volume 1. Arsenic. Final report, June 1989-October 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, S.S.; Attari, A.

    1995-01-01

    The discovery of arsenic compounds, as alkylarsines, in natural gas prompted this research program to develop reliable measurement techniques needed to assess the efficiency of removal processes for these environmentally sensitive substances. These techniques include sampling, speciation, quantitation and on-line instrumental methods for monitoring the total arsenic concentration. The current program has yielded many products, including calibration standards, arsenic-specific sorbents, sensitive analytical methods and instrumentation. Four laboratory analytical methods have been developed and successfully employed for arsenic determination in natural gas. These methods use GC-AED and GC-MS instruments to speciate alkylarsines, and peroxydisulfate extraction with FIAS, a special carbon sorbent with XRF, and an IGT-developed sorbent with GFAA for total arsenic measurement.

  15. Analytical techniques for characterization of cyclodextrin complexes in aqueous solution: a review.

    PubMed

    Mura, Paola

    2014-12-01

    Cyclodextrins are cyclic oligosaccharides endowed with a hydrophilic outer surface and a hydrophobic inner cavity, able to form inclusion complexes with a wide variety of guest molecules, positively affecting their physicochemical properties. In particular, in the pharmaceutical field, cyclodextrin complexation is mainly used to increase the aqueous solubility and dissolution rate of poorly soluble drugs, and to enhance their bioavailability and stability. Analytical characterization of host-guest interactions is of fundamental importance for fully exploiting the potential benefits of complexation, helping in selection of the most appropriate cyclodextrin. The assessment of the actual formation of a drug-cyclodextrin inclusion complex and its full characterization is not a simple task and often requires the use of different analytical methods, whose results have to be combined and examined together. The purpose of the present review is to give, as much as possible, a general overview of the main analytical tools which can be employed for the characterization of drug-cyclodextrin inclusion complexes in solution, with emphasis on their respective potential merits, disadvantages and limits. Further, the applicability of each examined technique is illustrated and discussed by specific examples from literature. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of a program to transform natural language specifications into formal notation, specifically to automate generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification and design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, which results in a high learning curve for specification languages and associated tools, while increased schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This has relevance to NASA in that it would simplify development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technology transfer potential, and the next steps.
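    As a hedged illustration (the example requirement is hypothetical, not taken from the presentation), a natural language temporal specification such as "every request shall eventually be acknowledged" corresponds to the LTL correctness property

    ```latex
    \square\,(\mathit{request} \rightarrow \lozenge\,\mathit{acknowledge})
    ```

    where \(\square\) ("globally") and \(\lozenge\) ("eventually") are the standard LTL temporal operators; automating this mapping from prose to formula is the goal of the program described above.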

  17. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  18. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to the problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample, for quantification, or for online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.

  19. Analytical and Theranostic Applications of Gold Nanoparticles and Multifunctional Nanocomposites

    PubMed Central

    Khlebtsov, Nikolai; Bogatyrev, Vladimir; Dykman, Lev; Khlebtsov, Boris; Staroverov, Sergey; Shirokov, Alexander; Matora, Larisa; Khanadeev, Vitaly; Pylaev, Timofey; Tsyganova, Natalia; Terentyuk, Georgy

    2013-01-01

    Gold nanoparticles (GNPs) and GNP-based multifunctional nanocomposites are the subject of intensive studies and biomedical applications. This minireview summarizes our recent efforts in analytical and theranostic applications of engineered GNPs and nanocomposites by using plasmonic properties of GNPs and various optical techniques. Specifically, we consider analytical biosensing; visualization and bioimaging of bacterial, mammalian, and plant cells; photodynamic treatment of pathogenic bacteria; and photothermal therapy of xenografted tumors. In addition to recently published reports, we discuss new data on dot immunoassay diagnostics of mycobacteria, multiplexed immunoelectron microscopy analysis of Azospirillum brasilense, materno-embryonic transfer of GNPs in pregnant rats, and combined photodynamic and photothermal treatment of rat xenografted tumors with gold nanorods covered by a mesoporous silica shell doped with hematoporphyrin. PMID:23471188

  20. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum and the organic matter from which petroleum is derived are composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. The main tools used to acquire these geochemical data are analytical techniques. Due to progress in the development of new analytical techniques, many hitherto intractable petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  2. Orbiter Avionics Radiation Handbook

    NASA Technical Reports Server (NTRS)

    Reddell, Brandon D.

    1999-01-01

    This handbook was assembled to document the radiation environment for the design of Orbiter avionics. It also maps the environment through vehicle shielding and mission usage into discrete requirements such as total dose. Some details of analytical techniques for calculating radiation effects are provided. It is anticipated that appropriate portions of this document will be added to formal program specifications.

  3. Bioimaging of metals in brain tissue by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and metallomics.

    PubMed

    Becker, J Sabine; Matusch, Andreas; Palm, Christoph; Salber, Dagmar; Morton, Kathryn A; Becker, J Susanne

    2010-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been developed and established as an emerging technique in the generation of quantitative images of metal distributions in thin tissue sections of brain samples (such as human, rat and mouse brain), with applications in research related to neurodegenerative disorders. A new analytical protocol is described which includes sample preparation by cryo-cutting of thin tissue sections and matrix-matched laboratory standards, mass spectrometric measurements, data acquisition, and quantitative analysis. Specific examples of the bioimaging of metal distributions in normal rodent brains are provided. Differences from normal were assessed in a Parkinson's disease model and a stroke brain model. Furthermore, changes during normal aging were studied. Powerful analytical techniques are also required for the determination and characterization of metal-containing proteins within a large pool of proteins, e.g., after denaturing or non-denaturing electrophoretic separation of proteins in one-dimensional and two-dimensional gels. LA-ICP-MS can be employed to detect metalloproteins in protein bands or spots separated after gel electrophoresis. MALDI-MS can then be used to identify specific metal-containing proteins in these bands or spots. The combination of these techniques is described in the second section.

  4. Fabrication of a Dipole-assisted Solid Phase Extraction Microchip for Trace Metal Analysis in Water Samples

    PubMed Central

    Chen, Ping-Hung; Chen, Shun-Niang; Tseng, Sheng-Hao; Deng, Ming-Jay; Lin, Yang-Wei; Sun, Yuh-Chang

    2016-01-01

    This paper describes a fabrication protocol for a dipole-assisted solid phase extraction (SPE) microchip available for trace metal analysis in water samples. A brief overview of the evolution of chip-based SPE techniques is provided. This is followed by an introduction to specific polymeric materials and their role in SPE. To develop an innovative dipole-assisted SPE technique, a chlorine (Cl)-containing SPE functionality was implanted into a poly(methyl methacrylate) (PMMA) microchip. Herein, diverse analytical techniques including contact angle analysis, Raman spectroscopic analysis, and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) analysis were employed to validate the utility of the implantation protocol of the C-Cl moieties on the PMMA. The analytical results of the X-ray absorption near-edge structure (XANES) analysis also demonstrated the feasibility of the Cl-containing PMMA used as an extraction medium by virtue of the dipole-ion interactions between the highly electronegative C-Cl moieties and the positively charged metal ions. PMID:27584954

  5. High-freezing-point fuel studies

    NASA Technical Reports Server (NTRS)

    Tolle, F. F.

    1980-01-01

    Considerable progress in developing the experimental and analytical techniques needed to design airplanes to accommodate fuels with less stringent low temperature specifications is reported. A computer technique for calculating fuel temperature profiles in full tanks was developed. The computer program is being extended to include the case of partially empty tanks. Ultimately, the completed package is to be incorporated into an aircraft fuel tank thermal analyser code to permit the designer to fly various thermal exposure patterns, study fuel temperatures versus time, and determine holdup.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carla J. Miller

    This report provides a summary of a literature review based on previous work performed at the Idaho National Laboratory studying the Three Mile Island 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work that supports the feasibility of the analytical techniques developed to provide quantitative results on the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique for performing nuclear fuel accountancy measurements.

  7. A loop-mediated isothermal amplification assay for rapid and sensitive detection of bovine papular stomatitis virus.

    PubMed

    Kurosaki, Yohei; Okada, Sayaka; Nakamae, Sayuri; Yasuda, Jiro

    2016-12-01

    Bovine papular stomatitis virus (BPSV) causes pustular cutaneous disease in cattle worldwide. This paper describes the development of a specific loop-mediated isothermal amplification (LAMP) assay to detect BPSV which did not cross-react with other parapoxviruses. To assess analytical sensitivity of this LAMP assay, DNA was extracted from serially diluted BPSV from which the infectious titer was determined by a novel assay based on calf kidney epithelial cells. The LAMP assay had equivalent analytical sensitivity to quantitative PCR, and could detect as few as 86 copies of viral DNA per reaction. These results suggest that the assay is a specific and sensitive technique to rapidly diagnose bovine papular stomatitis in domestic animals. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. An evaluation of thematic mapper simulator data for the geobotanical discrimination of rock types in Southwest Oregon

    NASA Technical Reports Server (NTRS)

    Weinstock, K. J.; Morrissey, L. A.

    1984-01-01

    Rock type identification may be assisted by the use of remote sensing of associated vegetation, particularly in areas of dense vegetative cover where surface materials are not imaged directly by the sensor. The geobotanical discrimination of ultramafic parent materials was investigated and analytical techniques for lithologic mapping and mineral exploration were developed. The utility of remotely sensed data for discriminating vegetation types associated with ultramafic parent materials in a study area in southwest Oregon was evaluated. A number of specific objectives were identified, including: (1) establishment of the association between vegetation and rock types; (2) examination of the spectral separability of vegetation types associated with rock types; (3) determination of the contribution of each TMS band for discriminating vegetation associated with rock types; and (4) comparison of analytical techniques for spectrally classifying vegetation.

  9. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  10. Hierarchical zwitterionic modification of a SERS substrate enables real-time drug monitoring in blood plasma

    NASA Astrophysics Data System (ADS)

    Sun, Fang; Hung, Hsiang-Chieh; Sinclair, Andrew; Zhang, Peng; Bai, Tao; Galvan, Daniel David; Jain, Priyesh; Li, Bowen; Jiang, Shaoyi; Yu, Qiuming

    2016-11-01

    Surface-enhanced Raman spectroscopy (SERS) is an ultrasensitive analytical technique with molecular specificity, making it an ideal candidate for therapeutic drug monitoring (TDM). However, in critical diagnostic media including blood, nonspecific protein adsorption coupled with weak surface affinities and small Raman activities of many analytes hinder the TDM application of SERS. Here we report a hierarchical surface modification strategy, first by coating a gold surface with a self-assembled monolayer (SAM) designed to attract or probe for analytes and then by grafting a non-fouling zwitterionic polymer brush layer to effectively repel protein fouling. We demonstrate how this modification can enable TDM applications by quantitatively and dynamically measuring the concentrations of several analytes--including an anticancer drug (doxorubicin), several TDM-requiring antidepressant and anti-seizure drugs, fructose and blood pH--in undiluted plasma. This hierarchical surface chemistry is widely applicable to many analytes and provides a generalized platform for SERS-based biosensing in complex real-world media.

  11. Methods for determination of radioactive substances in water and fluvial sediments

    USGS Publications Warehouse

    Thatcher, Leland Lincoln; Janzer, Victor J.; Edwards, Kenneth W.

    1977-01-01

    Analytical methods for the determination of some of the more important components of fission or neutron activation product radioactivity and of natural radioactivity found in water are reported. The report for each analytical method includes conditions for application of the method, a summary of the method, interferences, required apparatus and reagents, analytical procedures, calculations, reporting of results, and estimation of precision. The fission product isotopes considered are cesium-137, strontium-90, and ruthenium-106. The natural radioelements and isotopes considered are uranium, lead-210, radium-226, radium-228, tritium, and carbon-14. A gross radioactivity survey method and a uranium isotope ratio method are given. When two analytical methods are in routine use for an individual isotope, both methods are reported with identification of the specific areas of application of each. Techniques for the collection and preservation of water samples to be analyzed for radioactivity are discussed.

  12. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…
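    The two techniques differ chiefly in how study-level effect sizes are standardized and aggregated: Glass standardizes by the control-group standard deviation and averages effect sizes unweighted, while Hunter-Schmidt weights by sample size (and, in full form, also corrects for artifacts such as measurement error, which is omitted here). A minimal Python sketch of that difference, using hypothetical study data rather than the studies analyzed in the paper:

    ```python
    # Hedged sketch: contrast Glass-style and Hunter-Schmidt-style effect-size
    # aggregation. Study data below are hypothetical illustrations only.

    def glass_delta(mean_exp, mean_ctrl, sd_ctrl):
        # Glass's delta: standardize by the control-group SD only.
        return (mean_exp - mean_ctrl) / sd_ctrl

    def hunter_schmidt_mean(effects, ns):
        # Hunter-Schmidt: sample-size-weighted mean effect size.
        # (The full method also corrects for measurement error; omitted here.)
        return sum(d * n for d, n in zip(effects, ns)) / sum(ns)

    studies = [  # (mean_exp, mean_ctrl, sd_ctrl, n) -- hypothetical
        (52.0, 50.0, 4.0, 40),
        (55.0, 50.0, 5.0, 120),
        (49.0, 50.0, 2.0, 40),
    ]
    deltas = [glass_delta(me, mc, sd) for me, mc, sd, _ in studies]
    glass_mean = sum(deltas) / len(deltas)                  # unweighted mean
    hs_mean = hunter_schmidt_mean(deltas, [n for *_, n in studies])
    ```

    Because the Hunter-Schmidt mean weights the large study more heavily, the two aggregates can diverge noticeably on the same studies, which is consistent with the hypothesis explored above that the overall mean effect size differs between the two approaches.
    
    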

  13. Accommodating subject and instrument variations in spectroscopic determinations

    DOEpatents

    Haas, Michael J [Albuquerque, NM; Rowe, Robert K [Corrales, NM; Thomas, Edward V [Albuquerque, NM

    2006-08-29

    A method and apparatus for measuring a biological attribute, such as the concentration of an analyte, particularly a blood analyte in tissue such as glucose. The method utilizes spectrographic techniques in conjunction with an improved instrument-tailored or subject-tailored calibration model. In a calibration phase, calibration model data is modified to reduce or eliminate instrument-specific attributes, resulting in a calibration data set modeling intra-instrument or intra-subject variation. In a prediction phase, the prediction process is tailored for each target instrument separately using a minimal number of spectral measurements from each instrument or subject.
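    A hedged sketch of the calibration-phase idea described above, with hypothetical data and function names (not from the patent): subtracting each instrument's own mean spectrum from its calibration spectra removes instrument-specific offsets, leaving the intra-instrument variation the tailored model is built on.

    ```python
    # Hedged sketch: per-instrument mean-centering of calibration spectra to
    # reduce instrument-specific attributes. Data and names are illustrative.

    def tailor_calibration(spectra_by_instrument):
        # For each instrument, subtract its own mean spectrum so the pooled
        # calibration data model intra-instrument variation only.
        tailored = {}
        for inst, spectra in spectra_by_instrument.items():
            n = len(spectra)
            mean = [sum(s[i] for s in spectra) / n for i in range(len(spectra[0]))]
            tailored[inst] = [[v - m for v, m in zip(s, mean)] for s in spectra]
        return tailored

    # Two hypothetical instruments whose spectra differ by a constant offset:
    data = {
        "inst_A": [[1.0, 2.0], [3.0, 4.0]],
        "inst_B": [[11.0, 12.0], [13.0, 14.0]],
    }
    centered = tailor_calibration(data)
    # After centering, both instruments yield identical residual spectra:
    # the instrument-specific offset is gone.
    ```

    The same centering could be applied per subject rather than per instrument, matching the subject-tailored variant mentioned in the abstract.
    
    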

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coletti, Chiara, E-mail: chiara.coletti@studenti.u

    During the firing of bricks, mineralogical and textural transformations produce an artificial aggregate characterised by significant porosity. Particularly as regards pore-size distribution and the interconnection model, porosity is an important parameter for evaluating and predicting the durability of bricks. The pore system is in fact the main element which correlates building materials and their environment (especially in cases of aggressive weathering, e.g., salt crystallisation and freeze-thaw cycles) and determines their durability. Four industrial bricks with differing compositions and firing temperatures were analysed with “direct” and “indirect” techniques, traditional methods (mercury intrusion porosimetry, hydric tests, nitrogen adsorption) and new analytical approaches based on digital image reconstruction of 2D and 3D models (back-scattered electrons and computerised X-ray micro-tomography, respectively). The comparison of results from different analytical methods in the “overlapping ranges” of porosity and the careful reconstruction of a cumulative curve allowed overcoming their specific limitations and achieving better knowledge of the pore system of bricks. - Highlights: •Pore-size distribution and structure of the pore system in four commercial bricks •A multi-analytical approach combining “direct” and “indirect” techniques •Traditional methods vs. new approaches based on 2D/3D digital image reconstruction •The use of “overlapping ranges” to overcome the limitations of various techniques.

  15. Closing the brain-to-brain loop in laboratory testing.

    PubMed

    Plebani, Mario; Lippi, Giuseppe

    2011-07-01

    The delivery of laboratory services was described 40 years ago and defined with the foremost concept of the "brain-to-brain turnaround time loop". This concept comprises several processes, including the final step, which is the action undertaken on the patient based on laboratory information. Unfortunately, the need for systematic feedback to improve the value of laboratory services has been poorly understood and, even more riskily, poorly applied in daily laboratory practice. Currently, major problems arise from the unavailability of consensually accepted quality specifications for the extra-analytical phase of laboratory testing. This, in turn, does not allow clinical laboratories to calculate a budget for the "patient-related total error". The definition and use of the term "total error" refers only to the analytical phase, and should be better defined as "total analytical error" to avoid any confusion and misinterpretation. According to the hierarchical approach to classifying strategies for setting analytical quality specifications, the "assessment of the effect of analytical performance on specific clinical decision-making" is comprehensively at the top and therefore should be applied as much as possible to address analytical efforts towards effective goals. In addition, an increasing number of laboratories worldwide are adopting risk management strategies such as FMEA, FRACAS, LEAN and Six Sigma, since these techniques allow the identification of the most critical steps in the total testing process and reduce the patient-related risk of error. As a matter of fact, an increasing number of laboratory professionals recognize the importance of understanding and monitoring every step in the total testing process, including the appropriateness of the test request as well as the appropriate interpretation and utilization of test results.

  16. Western Blotting of the Endocannabinoid System.

    PubMed

    Wager-Miller, Jim; Mackie, Ken

    2016-01-01

    Measuring expression levels of G protein-coupled receptors (GPCRs) is an important step for understanding the distribution, function, and regulation of these receptors. A common approach for detecting proteins from complex biological systems is Western blotting. In this chapter, we describe a general approach to Western blotting protein components of the endocannabinoid system using sodium dodecyl sulfate-polyacrylamide gel electrophoresis and nitrocellulose membranes, with a focus on detecting type 1 cannabinoid (CB1) receptors. When this technique is carefully used, specifically with validation of the primary antibodies, it can provide quantitative information on protein expression levels. Additional information can also be inferred from Western blotting such as potential posttranslational modifications that can be further evaluated by specific analytical techniques.

  17. Analysis of low molecular weight metabolites in tea using mass spectrometry-based analytical methods.

    PubMed

    Fraser, Karl; Harrison, Scott J; Lane, Geoff A; Otter, Don E; Hemar, Yacine; Quek, Siew-Young; Rasmussen, Susanne

    2014-01-01

    Tea is the second most consumed beverage in the world after water, and there are numerous reported health benefits as a result of consuming tea, such as reducing the risk of cardiovascular disease and many types of cancer. Thus, there is much interest in the chemical composition of teas, for example: defining components responsible for contributing to reported health benefits; defining quality characteristics such as product flavor; and monitoring for pesticide residues to comply with food safety import/export requirements. Covered in this review are some of the latest developments in mass spectrometry-based analytical techniques for measuring and characterizing low molecular weight components of tea, in particular primary and secondary metabolites. The methodology, more specifically the chromatography and detection mechanisms used in both targeted and non-targeted studies, and their main advantages and disadvantages are discussed. Finally, we comment on the latest techniques that are likely to have significant benefit to analysts in the future, not merely in the area of tea research, but in the analytical chemistry of low molecular weight compounds in general.

  18. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics on a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  20. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
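    The MapReduce pattern the framework leverages can be sketched in a few lines: map each spatial tile to a partial statistic, then reduce the partials into a global result. This is a hedged, single-machine illustration with hypothetical tile data; the actual framework runs the equivalent computation over HBase-stored data on a distributed cluster.

    ```python
    # Hedged sketch of the MapReduce pattern for gridded geoscience data.
    from functools import reduce

    def map_tile(tile):
        # Emit (count, sum) for a tile so means can be merged associatively.
        values = [v for row in tile for v in row]
        return (len(values), sum(values))

    def reduce_partials(a, b):
        # Merge two partial (count, sum) pairs.
        return (a[0] + b[0], a[1] + b[1])

    # Two hypothetical 2x2 tiles of a gridded variable:
    tiles = [
        [[1.0, 2.0], [3.0, 4.0]],
        [[5.0, 6.0], [7.0, 8.0]],
    ]
    count, total = reduce(reduce_partials, map(map_tile, tiles))
    global_mean = total / count
    ```

    Because the reduce step is associative, the partials can be combined in any order across workers, which is what allows the framework to parallelize the analysis over distributed computers.
    
    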

  1. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  2. An integrated approach using orthogonal analytical techniques to characterize heparan sulfate structure.

    PubMed

    Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Gunay, Nur Sibel; Wang, Jing; Sun, Elaine Y; Pradines, Joël R; Farutin, Victor; Shriver, Zachary; Kaundinya, Ganesh V; Capila, Ishan

    2017-02-01

    Heparan sulfate (HS), a glycosaminoglycan present on the surface of cells, has been postulated to have important roles in driving both normal and pathological physiologies. The chemical structure and sulfation pattern (domain structure) of HS is believed to determine its biological function, to vary across tissue types, and to be modified in the context of disease. Characterization of HS requires isolation and purification of cell surface HS as a complex mixture. This process may introduce additional chemical modification of the native residues. In this study, we describe an approach towards thorough characterization of bovine kidney heparan sulfate (BKHS) that utilizes a variety of orthogonal analytical techniques (e.g. NMR, IP-RPHPLC, LC-MS). These techniques are applied to characterize this mixture at various levels including composition, fragment level, and overall chain properties. The combination of these techniques in many instances provides orthogonal views into the fine structure of HS, and in other instances provides overlapping / confirmatory information from different perspectives. Specifically, this approach enables quantitative determination of natural and modified saccharide residues in the HS chains, and identifies unusual structures. Analysis of partially digested HS chains allows for a better understanding of the domain structures within this mixture, and yields specific insights into the non-reducing end and reducing end structures of the chains. This approach outlines a useful framework that can be applied to elucidate HS structure and thereby provides means to advance understanding of its biological role and potential involvement in disease progression. In addition, the techniques described here can be applied to characterization of heparin from different sources.

  3. An Undergraduate Experiment for the Measurement of Perfluorinated Surfactants in Fish Liver by Liquid Chromatography-Tandem Mass Spectrometry

    ERIC Educational Resources Information Center

    Stock, Naomi L.; Martin, Jonathan W.; Ye, Yun; Mabury, Scott A.

    2007-01-01

    A laboratory experiment that provides students a hands-on introduction to the specific techniques of liquid chromatography-tandem mass spectrometry (LC-MS/MS) and electrospray ionization is presented. The students can thus practice the analytical principles of sample extraction, detection, quantification, and quality control using a fresh fish…

  4. Anxiety Disorders in Children and Adolescents with Autistic Spectrum Disorders: A Meta-Analysis

    ERIC Educational Resources Information Center

    van Steensel, Francisca J. A.; Bogels, Susan M.; Perrin, Sean

    2011-01-01

    There is considerable evidence that children and adolescents with autistic spectrum disorders (ASD) are at increased risk of anxiety and anxiety disorders. However, it is less clear which of the specific DSM-IV anxiety disorders occur most in this population. The present study used meta-analytic techniques to help clarify this issue. A systematic…

  5. System safety education focused on industrial engineering

    NASA Technical Reports Server (NTRS)

    Johnston, W. L.; Morris, R. S.

    1971-01-01

    An educational program designed to equip students with the specific skills needed to become safety specialists is described. The discussion concentrates on the application, selection, and utilization of various system safety analytical approaches. Emphasis is also placed on the management of a system safety program, its relationship with other disciplines, and new developments and applications of system safety techniques.

  6. Angles-only, ground-based, initial orbit determination

    NASA Astrophysics Data System (ADS)

    Taff, L. G.; Randall, P. M. S.; Stansfield, S. A.

    1984-05-01

    Over the past few years, passive, ground-based, angles-only initial orbit determination has had a thorough analytical, numerical, experimental, and creative re-examination. This report presents the numerical culmination of this effort and contains specific recommendations for which of several techniques one should use on the different subsets of high altitude artificial satellites and minor planets.

  7. Comparison of Commercial Electromagnetic Interference Test Techniques to NASA Electromagnetic Interference Test Techniques

    NASA Astrophysics Data System (ADS)

    Smith, V.

    2000-11-01

    This report documents the development of analytical techniques required for interpreting and comparing space systems electromagnetic interference test data with commercial electromagnetic interference test data using NASA Specification SSP 30237A "Space Systems Electromagnetic Emission and Susceptibility Requirements for Electromagnetic Compatibility." The PSpice computer simulation results and the laboratory measurements for the test setups under study compare well. The study results, however, indicate that the transfer function required to translate test results of one setup to another is highly dependent on cables and their actual layout in the test setup. Since cables are equipment specific and are not specified in the test standards, developing a transfer function that would cover all cable types (random, twisted, or coaxial), sizes (gauge number and length), and layouts (distance from the ground plane) is not practical.

  8. Comparison of Commercial Electromagnetic Interference Test Techniques to NASA Electromagnetic Interference Test Techniques

    NASA Technical Reports Server (NTRS)

    Smith, V.; Minor, J. L. (Technical Monitor)

    2000-01-01

    This report documents the development of analytical techniques required for interpreting and comparing space systems electromagnetic interference test data with commercial electromagnetic interference test data using NASA Specification SSP 30237A "Space Systems Electromagnetic Emission and Susceptibility Requirements for Electromagnetic Compatibility." The PSpice computer simulation results and the laboratory measurements for the test setups under study compare well. The study results, however, indicate that the transfer function required to translate test results of one setup to another is highly dependent on cables and their actual layout in the test setup. Since cables are equipment specific and are not specified in the test standards, developing a transfer function that would cover all cable types (random, twisted, or coaxial), sizes (gauge number and length), and layouts (distance from the ground plane) is not practical.

  9. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  10. A guide for the application of analytics on healthcare processes: A dynamic view on patient pathways.

    PubMed

    Lismont, Jasmien; Janssens, Anne-Sophie; Odnoletkova, Irina; Vanden Broucke, Seppe; Caron, Filip; Vanthienen, Jan

    2016-10-01

    The aim of this study is to guide healthcare organizations in applying process analytics on healthcare processes. Process analytics techniques can offer new insights in patient pathways, workflow processes, adherence to medical guidelines and compliance with clinical pathways, but also bring along specific challenges which will be examined and addressed in this paper. The following methodology is proposed: log preparation, log inspection, abstraction and selection, clustering, process mining, and validation. It was applied on a case study in the type 2 diabetes mellitus domain. Several data pre-processing steps are applied and clarify the usefulness of process analytics in a healthcare setting. Healthcare utilization, such as diabetes education, is analyzed and compared with diabetes guidelines. Furthermore, we take a look at the organizational perspective and the central role of the GP. This research addresses four challenges: healthcare processes are often patient and hospital specific, which leads to unique traces and unstructured processes; data is not recorded in the right format, with the right level of abstraction and time granularity; an overflow of medical activities may cloud the analysis; and analysts need to deal with data not recorded for this purpose. These challenges complicate the application of process analytics. It is explained how our methodology takes them into account. Process analytics offers new insights into the medical services patients follow, how medical resources relate to each other and whether patients and healthcare processes comply with guidelines and regulations. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Deriving and Analyzing Analytical Structures of a Class of Typical Interval Type-2 TS Fuzzy Controllers.

    PubMed

    Zhou, Haibo; Ying, Hao

    2017-09-01

    A conventional controller's explicit input-output mathematical relationship, also known as its analytical structure, is always available for analysis and design of a control system. In contrast, virtually all type-2 (T2) fuzzy controllers are treated as black-box controllers in the literature in that their analytical structures are unknown, which inhibits precise and comprehensive understanding and analysis. In this regard, a long-standing fundamental issue remains unresolved: how a T2 fuzzy set's footprint of uncertainty, a key element differentiating a T2 controller from a type-1 (T1) controller, affects a controller's analytical structure. In this paper, we describe an innovative technique for deriving analytical structures of a class of typical interval T2 (IT2) TS fuzzy controllers. This technique makes it possible to analyze the analytical structures of the controllers to reveal the role of footprints of uncertainty in shaping the structures. Specifically, we have mathematically proven that under certain conditions, the larger the footprints, the more the IT2 controllers resemble linear or piecewise linear controllers. When the footprints are at their maximum, the IT2 controllers actually become linear or piecewise linear controllers. That is to say the smaller the footprints, the more nonlinear the controllers. The most nonlinear IT2 controllers are attained at zero footprints, at which point they become T1 controllers. This finding implies that sometimes if strong nonlinearity is most important and desired, one should consider using a smaller footprint or even just a T1 fuzzy controller. This paper exemplifies the importance and value of the analytical structure approach for comprehensive analysis of T2 fuzzy controllers.

  12. Transient well flow in vertically heterogeneous aquifers

    NASA Astrophysics Data System (ADS)

    Hemker, C. J.

    1999-11-01

    A solution for the general problem of computing well flow in vertically heterogeneous aquifers is found by an integration of both analytical and numerical techniques. The radial component of flow is treated analytically; the drawdown is a continuous function of the distance to the well. The finite-difference technique is used for the vertical flow component only. The aquifer is discretized in the vertical dimension and the heterogeneous aquifer is considered to be a layered (stratified) formation with a finite number of homogeneous sublayers, where each sublayer may have different properties. The transient part of the differential equation is solved with Stehfest's algorithm, a numerical inversion technique of the Laplace transform. The well is of constant discharge and penetrates one or more of the sublayers. The effect of wellbore storage on early drawdown data is taken into account. In this way drawdowns are found for a finite number of sublayers as a continuous function of radial distance to the well and of time since the pumping started. The model is verified by comparing results with published analytical and numerical solutions for well flow in homogeneous and heterogeneous, confined and unconfined aquifers. Instantaneous and delayed drainage of water from above the water table are considered, combined with the effects of partially penetrating and finite-diameter wells. The model is applied to demonstrate that the transient effects of wellbore storage in unconfined aquifers are less pronounced than previous numerical experiments suggest. Other applications of the presented solution technique are given for partially penetrating wells in heterogeneous formations, including a demonstration of the effect of decreasing specific storage values with depth in an otherwise homogeneous aquifer. 
The presented solution can be a powerful tool for the analysis of drawdown from pumping tests, because hydraulic properties of layered heterogeneous aquifer systems with partially penetrating wells may be estimated without the need to construct transient numerical models. A computer program based on the hybrid analytical-numerical technique is available from the author.
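    The Stehfest step mentioned in the abstract can be illustrated with a short numerical sketch. The code below is a minimal, generic implementation of Stehfest's algorithm for inverting a Laplace transform (not the author's program); it is checked here against F(s) = 1/(s+1), whose inverse transform is e^(-t):

```python
import math

def stehfest_invert(F, t, N=12):
    """Numerically invert the Laplace transform F(s) at time t > 0
    using Stehfest's algorithm with N (even) terms."""
    assert N % 2 == 0, "N must be even"
    ln2 = math.log(2.0)
    total = 0.0
    for i in range(1, N + 1):
        # Stehfest weight V_i (classical factorial formula)
        V = 0.0
        for k in range((i + 1) // 2, min(i, N // 2) + 1):
            V += (k ** (N // 2) * math.factorial(2 * k)
                  / (math.factorial(N // 2 - k) * math.factorial(k)
                     * math.factorial(k - 1) * math.factorial(i - k)
                     * math.factorial(2 * k - i)))
        V *= (-1) ** (N // 2 + i)
        total += V * F(i * ln2 / t)
    return ln2 / t * total

# Sanity check: invert F(s) = 1/(s+1), whose inverse is exp(-t).
print(stehfest_invert(lambda s: 1.0 / (s + 1.0), 1.0))  # close to exp(-1)
```

    In the well-flow application, F(s) would be the Laplace-domain drawdown obtained from the finite-difference layer system rather than a closed-form expression; the inversion step is the same. Note that Stehfest's method assumes a smooth, non-oscillatory time behavior, which drawdown curves satisfy.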

  13. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite and current analytical electrochemistry including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  14. Phenalenone-type phytoalexins mediate resistance of banana plants (Musa spp.) to the burrowing nematode Radopholus similis.

    PubMed

    Hölscher, Dirk; Dhakshinamoorthy, Suganthagunthalam; Alexandrov, Theodore; Becker, Michael; Bretschneider, Tom; Buerkert, Andreas; Crecelius, Anna C; De Waele, Dirk; Elsen, Annemie; Heckel, David G; Heklau, Heike; Hertweck, Christian; Kai, Marco; Knop, Katrin; Krafft, Christoph; Maddula, Ravi K; Matthäus, Christian; Popp, Jürgen; Schneider, Bernd; Schubert, Ulrich S; Sikora, Richard A; Svatoš, Aleš; Swennen, Rony L

    2014-01-07

    The global yield of bananas, one of the most important food crops, is severely hampered by parasites such as nematodes, which cause yield losses of up to 75%. Plant-nematode interactions of two banana cultivars differing in susceptibility to Radopholus similis were investigated by combining the conventional and spatially resolved analytical techniques ¹H NMR spectroscopy, matrix-free UV-laser desorption/ionization mass spectrometric imaging, and Raman microspectroscopy. This innovative combination of analytical techniques was applied to isolate, identify, and locate the banana-specific type of phytoalexins, phenylphenalenones, in the R. similis-caused lesions of the plants. The striking antinematode activity of the phenylphenalenone anigorufone, its ingestion by the nematode, and its subsequent localization in lipid droplets within the nematode is reported. The importance of varying local concentrations of these specialized metabolites in infected plant tissues, their involvement in the plant's defense system, and derived strategies for improving banana resistance are highlighted.

  15. The examination of headache activity using time-series research designs.

    PubMed

    Houle, Timothy T; Remble, Thomas A; Houle, Thomas A

    2005-05-01

    The majority of research conducted on headache has utilized cross-sectional designs, which preclude the examination of dynamic factors and principally rely on group-level effects. The present article describes the application of an individual-oriented process model using time-series analytical techniques. Blending a time-series approach with an interactive process model allows consideration of the relationships of intra-individual dynamic processes while still permitting the researcher to examine inter-individual differences. The authors explore the nature of time-series data and present two necessary assumptions underlying the time-series approach. The concept of shock and its contribution to headache activity is also presented. The time-series approach is not without its problems, and two such problems are specifically reported: autocorrelation and the distribution of daily observations. The article concludes with the presentation of several analytical techniques suited to examine the time-series interactive process model.
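    The autocorrelation problem the authors flag can be made concrete with a short calculation. The sketch below (a generic illustration, not the authors' analysis) computes the lag-k sample autocorrelation of a daily series, the quantity that violates the independence assumption of conventional cross-sectional statistics:

```python
def autocorrelation(series, lag=1):
    """Lag-k sample autocorrelation: covariance between the series
    and its lagged copy, divided by the series variance."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[i] - mean) * (series[i + lag] - mean)
              for i in range(n - lag))
    return cov / var

# A steadily rising series is strongly positively autocorrelated:
trend = list(range(10))
print(autocorrelation(trend, lag=1))  # 0.7 for this linear trend
```

    A value near zero would indicate that successive daily observations are effectively independent; daily symptom ratings typically show substantial positive lag-1 autocorrelation, which is why time-series models are needed.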

  16. NIR and UV-vis spectroscopy, artificial nose and tongue: comparison of four fingerprinting techniques for the characterisation of Italian red wines.

    PubMed

    Casale, M; Oliveri, P; Armanino, C; Lanteri, S; Forina, M

    2010-06-04

    Four rapid and low-cost vanguard analytical systems (NIR and UV-vis spectroscopy, a headspace-mass based artificial nose and a voltammetric artificial tongue), together with chemometric pattern recognition techniques, were applied and compared in addressing a food authentication problem: the distinction between wine samples from the same Italian oenological region, according to the grape variety. Specifically, 59 certified samples belonging to the Barbera d'Alba and Dolcetto d'Alba appellations and collected from the same vintage (2007) were analysed. The instrumental responses, after proper data pre-processing, were used as fingerprints of the characteristics of the samples: the results from principal component analysis and linear discriminant analysis were discussed, comparing the capability of the four analytical strategies in addressing the problem studied. Copyright 2010 Elsevier B.V. All rights reserved.

  17. Halal authenticity issues in meat and meat products.

    PubMed

    Nakyinsige, Khadijah; Man, Yaakob Bin Che; Sazili, Awis Qurni

    2012-07-01

    In recent years, Muslims have become increasingly concerned about the meat they eat. Proper product description is very crucial for consumers to make informed choices and to ensure fair trade, particularly in the ever-growing halal food market. Globally, Muslim consumers are concerned about a number of issues concerning meat and meat products such as pork substitution, undeclared blood plasma, use of prohibited ingredients, pork intestine casings and non-halal methods of slaughter. Analytical techniques which are appropriate and specific have been developed to deal with particular issues. The most suitable technique for any particular sample is often determined by the nature of the sample itself. This paper sets out to identify what makes meat halal, highlight the halal authenticity issues that occur in meat and meat products and provide an overview of the possible analytical methods for halal authentication of meat and meat products. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Supercritical fluid chromatography: a promising alternative to current bioanalytical techniques.

    PubMed

    Dispas, Amandine; Jambo, Hugues; André, Sébastien; Tyteca, Eva; Hubert, Philippe

    2018-01-01

    In recent years, chemistry has joined the worldwide effort to address environmental problems, leading to the birth of green chemistry. In this context, green analytical tools such as modern supercritical fluid chromatography were developed in the field of separation techniques. This chromatographic technique underwent a resurgence a few years ago thanks to the high efficiency, speed and robustness of new-generation equipment. These advantages and its easy hyphenation to MS fulfill the requirements of bioanalysis regarding separation capacity and high throughput. In the present paper, the technical aspects relevant to bioanalysis are detailed, followed by a critical review of bioanalytical supercritical fluid chromatography methods published in the literature.

  19. Nonlinear multi-photon laser wave-mixing optical detection in microarrays and microchips for ultrasensitive detection and separation of biomarkers for cancer and neurodegenerative diseases

    NASA Astrophysics Data System (ADS)

    Iwabuchi, Manna; Hetu, Marcel; Maxwell, Eric; Pradel, Jean S.; Ramos, Sashary; Tong, William G.

    2015-09-01

    Multi-photon degenerate four-wave mixing is demonstrated as an ultrasensitive absorption-based optical method for detection, separation and identification of biomarker proteins in the development of early diagnostic methods for HIV-1, cancer and neurodegenerative diseases using compact, portable microarrays and capillary- or microchip-based chemical separation systems that offer high chemical specificity levels. The wave-mixing signal has a quadratic dependence on concentration, and hence, it allows more reliable monitoring of smaller changes in analyte properties. Our wave-mixing detection sensitivity is comparable to or better than those of current methods, including enzyme-linked immunoassays, for clinical diagnostics and screening. Detection sensitivity is excellent since the wave-mixing signal is a coherent laser-like beam that can be collected with virtually 100% collection efficiency with high S/N. Our analysis time is short (1-15 minutes) for molecular weight-based protein separation as compared to that of a conventional separation technique, e.g., sodium dodecyl sulfate-polyacrylamide gel electrophoresis. When ultrasensitive wave-mixing detection is paired with high-resolution capillary- or microchip-based separation systems, biomarkers can be separated and identified at the zepto- and yocto-mole levels for a wide range of analytes. Specific analytes can be captured in a microchannel through the use of antibody-antigen interactions that provide better chemical specificity as compared to size-based separation alone. The technique can also be combined with immunoprecipitation and a multichannel capillary array for high-throughput analysis of more complex protein samples. Wave mixing allows the use of chromophores and absorption-modifying tags, in addition to conventional fluorophores, for online detection of immune complexes related to cancer.

  20. Nanotechnology and chip level systems for pressure driven liquid chromatography and emerging analytical separation techniques: a review.

    PubMed

    Lavrik, N V; Taylor, L T; Sepaniak, M J

    2011-05-23

    Pressure driven liquid chromatography (LC) is a powerful and versatile separation technique particularly suitable for differentiating species present in extremely small quantities. This paper briefly reviews main historical trends and focuses on more recently developed technological approaches in miniaturization and on-chip integration of LC columns. The review emphasizes enabling technologies as well as main technological challenges specific to pressure driven separations and highlights emerging concepts that could ultimately overcome fundamental limitations of conventional LC columns. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Influence versus intent for predictive analytics in situation awareness

    NASA Astrophysics Data System (ADS)

    Cui, Biru; Yang, Shanchieh J.; Kadar, Ivan

    2013-05-01

    Predictive analytics in situation awareness requires an element to comprehend and anticipate potential adversary activities that might occur in the future. Most work in high-level fusion or predictive analytics utilizes machine learning, pattern mining, Bayesian inference, and decision tree techniques to predict future actions or states. The emergence of social computing in broader contexts has drawn interest in bringing the hypotheses and techniques from social theory to algorithmic and computational settings for predictive analytics. This paper aims to answer the question of how influence and attitude (sometimes interpreted as intent) of adversarial actors can be formulated and computed algorithmically, as a higher level fusion process to provide predictions of future actions. The challenges in this interdisciplinary endeavor include drawing on existing understanding of influence and attitude in both social science and computing fields, as well as the mathematical and computational formulation for the specific situational context to be analyzed. The study of 'influence' has resurfaced in recent years due to the emergence of social networks in the virtualized cyber world. Theoretical analysis and techniques developed in this area are discussed in this paper in the context of predictive analysis. Meanwhile, the notion of intent, or 'attitude' using social theory terminologies, is a relatively uncharted area in the computing field. Note that a key objective of predictive analytics is to identify impending/planned attacks so their 'impact' and 'threat' can be prevented. In this spirit, indirect and direct observables are drawn and derived to infer the influence network and attitude to predict future threats. This work proposes an integrated framework that jointly assesses adversarial actors' influence network and their attitudes as a function of past actions and action outcomes. 
A preliminary set of algorithms is developed and tested using the Global Terrorism Database (GTD). Our results reveal the benefits of performing joint predictive analytics with both attitude and influence. At the same time, we discover significant challenges in deriving influence and attitude from indirect observables for diverse adversarial behavior. These observations warrant further investigation of the optimal use of influence and attitude for predictive analytics, as well as the potential inclusion of other environmental or capability elements for the actors.

  2. An overview of the characterization of occupational exposure to nanoaerosols in workplaces

    NASA Astrophysics Data System (ADS)

    Castellano, Paola; Ferrante, Riccardo; Curini, Roberta; Canepari, Silvia

    2009-05-01

    Currently, there is a lack of standardized sampling and metric methods that can be applied to measure the level of exposure to nanosized aerosols. Therefore, any attempt to characterize exposure to nanoparticles (NP) in a workplace must involve a multifaceted approach characterized by different sampling and analytical techniques to measure all relevant characteristics of NP exposure. Furthermore, as NP aerosols are always complex mixtures of multiple origins, sampling and analytical methods need to be improved to selectively evaluate the apportionment from specific sources to the final nanomaterials. An open question worldwide is how to relate specific toxic effects of NP to one or more of several different parameters (such as particle size, mass, composition, surface area, number concentration, aggregation or agglomeration state, water solubility and surface chemistry). As the evaluation of occupational exposure to NP in workplaces needs dimensional and chemical characterization, the main problem is the choice of the sampling and dimensional separation techniques. Therefore, a convenient approach to allow a satisfactory risk assessment could be the simultaneous use of different sampling and measuring techniques for particles with known toxicity in selected workplaces. Despite the lack of specific NP exposure limit values, exposure metrics appropriate to nanoaerosols are discussed in the Technical Report ISO/TR 27628:2007 with the aim to enable occupational hygienists to characterize and monitor nanoaerosols in workplaces. Moreover, NIOSH has developed the document Approaches to Safe Nanotechnology (intended to be an information exchange with NIOSH) in order to address current and future research needs for understanding the potential risks that nanotechnology may pose to workers.

  3. Surface Plasmon Resonance-Based Fiber Optic Sensors Utilizing Molecular Imprinting

    PubMed Central

    Gupta, Banshi D.; Shrivastav, Anand M.; Usha, Sruthi P.

    2016-01-01

    Molecular imprinting is earning worldwide attention from researchers in the field of sensing and diagnostic applications, due to its inherent specific affinity for the template molecule. The fabrication of complementary template imprints allows this technique to achieve high selectivity for the analyte to be sensed. Sensors incorporating this technique along with surface plasmon or localized surface plasmon resonance (SPR/LSPR) provide highly sensitive real time detection with quick response times. Implementing these techniques on optical fiber provides the additional advantages of miniaturized probes with ease of handling, online monitoring and remote sensing. In this review, a summary of optical fiber sensors using the combined approaches of molecularly imprinted polymer (MIP) and the SPR/LSPR technique is discussed. An overview of the fundamentals of SPR/LSPR implementation on optical fiber is provided. The review also covers molecular imprinting technology (MIT), including its fundamentals, synthesis procedures and its applications for chemical and biological analyte detection with different sensing methods. In conclusion, we explore the advantages, challenges and the future perspectives of developing highly sensitive and selective methods for the detection of analytes utilizing MIT with the SPR/LSPR phenomenon on optical fiber platforms. PMID:27589746

  4. A Grammar-based Approach for Modeling User Interactions and Generating Suggestions During the Data Exploration Process.

    PubMed

    Dabek, Filip; Caban, Jesus J

    2017-01-01

    Despite the recent popularity of visual analytics focusing on big data, little is known about how to support users who use visualization techniques to explore multi-dimensional datasets and accomplish specific tasks. Our lack of models that can assist end-users during the data exploration process has made it challenging to learn from the user's interactive and analytical process. The ability to model how a user interacts with a specific visualization technique and what difficulties they face is paramount in supporting individuals in discovering new patterns within their complex datasets. This paper introduces the notion of visualization systems understanding and modeling user interactions with the intent of guiding a user through a task, thereby enhancing visual data exploration. The challenges faced and the necessary future steps are discussed, and to provide a working example, a grammar-based model is presented that can learn from user interactions, determine the common patterns among a number of subjects using a K-Reversible algorithm, build a set of rules, and apply those rules in the form of suggestions to new users with the goal of guiding them along their visual analytic process. A formal evaluation study with 300 subjects was performed showing that our grammar-based model is effective at capturing the interactive process followed by users and that further research in this area has the potential to positively impact how users interact with a visualization system.

  5. Ionic liquids: solvents and sorbents in sample preparation.

    PubMed

    Clark, Kevin D; Emaus, Miranda N; Varona, Marcelino; Bowers, Ashley N; Anderson, Jared L

    2018-01-01

    The applications of ionic liquids (ILs) and IL-derived sorbents are rapidly expanding. By careful selection of the cation and anion components, the physicochemical properties of ILs can be altered to meet the requirements of specific applications. Reports of IL solvents possessing high selectivity for specific analytes are numerous and continue to motivate the development of new IL-based sample preparation methods that are faster, more selective, and environmentally benign compared to conventional organic solvents. The advantages of ILs have also been exploited in solid/polymer formats in which ordinarily nonspecific sorbents are functionalized with IL moieties in order to impart selectivity for an analyte or analyte class. Furthermore, new ILs that incorporate a paramagnetic component into the IL structure, known as magnetic ionic liquids (MILs), have emerged as useful solvents for bioanalytical applications. In this rapidly changing field, this Review focuses on the applications of ILs and IL-based sorbents in sample preparation with a special emphasis on liquid phase extraction techniques using ILs and MILs, IL-based solid-phase extraction, ILs in mass spectrometry, and biological applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Pharmaceuticals in biota in the aquatic environment: analytical methods and environmental implications.

    PubMed

    Huerta, B; Rodríguez-Mozaz, S; Barceló, D

    2012-11-01

    The presence of pharmaceuticals in the aquatic environment is an ever-increasing issue of concern, as they are designed to target specific metabolic and molecular pathways in organisms and may have the potential for unintended effects on nontarget species. Information on the presence of pharmaceuticals in biota is still scarce, but the scientific literature on the subject has established the possibility of bioaccumulation in exposed aquatic organisms through other environmental compartments. However, few studies have correlated both bioaccumulation of pharmaceutical compounds and the consequent effects. Analytical methodology to detect pharmaceuticals at trace quantities in biota has advanced significantly in the last few years. Nonetheless, there are still unresolved analytical challenges associated with the complexity of biological matrices, which require exhaustive extraction and purification steps, and highly sensitive and selective detection techniques. This review presents the trends in the analysis of pharmaceuticals in aquatic organisms in the last decade, recent data about the occurrence of these compounds in natural biota, and the environmental implications that chronic exposure could have on aquatic wildlife.

  7. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water industries are closely examining options to maintain a certain level of disinfectant residual through the entire distribution system. Chlorine dioxide is a promising disinfectant that is usually used as a secondary disinfectant, while the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurements (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) to determine the superior technique. The commonly available commercial analytical techniques were evaluated over a wide range of chlorine dioxide concentrations. In reference to pre-defined criteria, the superior analytical technique was determined. To discern the effectiveness of the superior technique, various factors that might influence performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, the chronoamperometry technique showed a significant level of accuracy and precision. Furthermore, the various influencing factors studied did not diminish the technique's performance, which remained adequate in all matrices. This study is a step towards proper disinfection monitoring, and it assists engineers with chlorine dioxide disinfection system planning and management.

  9. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three analytical techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques vs. HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior on technical, economic and environmental grounds to the invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. The treatment of tendon injury with electromagnetic fields evidenced by advanced ultrasound image processing.

    PubMed

    Parker, Richard; Markov, Marko

    2015-09-01

    This article presents a novel modality for accelerating the repair of tendon and ligament lesions by means of a specifically designed electromagnetic field in an equine model. This novel therapeutic approach employs a delivery system that induces a specific electrical signal from an external magnetic field derived from Superconductive QUantum Interference Device (SQUID) measurements of injured vs. healthy tissue. Evaluation of this therapy technique is enabled by a proposed new technology described as Predictive Analytical Imagery (PAI™). This technique examines an ultrasound grayscale image and seeks to evaluate it by means of look-ahead predictive algorithms and digital signal processing. The net result is a significant reduction in background noise and the production of a high-resolution grayscale or digital image.

  11. [Enzymatic analysis of the quality of foodstuffs].

    PubMed

    Kolesnov, A Iu

    1997-01-01

    Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these contents cannot be determined by conventional methods, or if methods are available, they are determined only with limited accuracy. Today, methods of enzymatic analysis are being increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, safety of reagents.

  12. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and then modified to include them. The modification is based on (i) backward analytical propagation, including the perturbations, of the state vector obtained from the iterative patched conic technique at the sphere of influence, and (ii) quantification of deviations in the orbital elements at periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend on numerical integration; all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. The design analysis using the proposed technique provides a realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.

  13. Comparison of real-time PCR methods for the detection of Naegleria fowleri in surface water and sediment.

    PubMed

    Streby, Ashleigh; Mull, Bonnie J; Levy, Karen; Hill, Vincent R

    2015-05-01

    Naegleria fowleri is a thermophilic free-living ameba found in freshwater environments worldwide. It is the cause of a rare but potentially fatal disease in humans known as primary amebic meningoencephalitis. Established N. fowleri detection methods rely on conventional culture techniques and morphological examination followed by molecular testing. Multiple alternative real-time PCR assays have been published for rapid detection of Naegleria spp. and N. fowleri. Four such assays were evaluated for the detection of N. fowleri from surface water and sediment. The assays were compared for thermodynamic stability, analytical sensitivity and specificity, detection limits, humic acid inhibition effects, and performance with seeded environmental matrices. Twenty-one ameba isolates were included in the DNA panel used for analytical sensitivity and specificity analyses. N. fowleri genotypes I and III were used for method performance testing. Two of the real-time PCR assays were determined to yield similar performance data for specificity and sensitivity for detecting N. fowleri in environmental matrices.

  14. Comparison of real-time PCR methods for the detection of Naegleria fowleri in surface water and sediment

    PubMed Central

    Streby, Ashleigh; Mull, Bonnie J.; Levy, Karen

    2015-01-01

    Naegleria fowleri is a thermophilic free-living ameba found in freshwater environments worldwide. It is the cause of a rare but potentially fatal disease in humans known as primary amebic meningoencephalitis. Established N. fowleri detection methods rely on conventional culture techniques and morphological examination followed by molecular testing. Multiple alternative real-time PCR assays have been published for rapid detection of Naegleria spp. and N. fowleri. Four such assays were evaluated for the detection of N. fowleri from surface water and sediment. The assays were compared for thermodynamic stability, analytical sensitivity and specificity, detection limits, humic acid inhibition effects, and performance with seeded environmental matrices. Twenty-one ameba isolates were included in the DNA panel used for analytical sensitivity and specificity analyses. N. fowleri genotypes I and III were used for method performance testing. Two of the real-time PCR assays were determined to yield similar performance data for specificity and sensitivity for detecting N. fowleri in environmental matrices. PMID:25855343

  15. New Millenium Inflatable Structures Technology

    NASA Technical Reports Server (NTRS)

    Mollerick, Ralph

    1997-01-01

    Specific applications where inflatable technology can enable or enhance future space missions are tabulated. The applicability of the inflatable technology to large aperture infra-red astronomy missions is discussed. Space flight validation and risk reduction are emphasized along with the importance of analytical tools in deriving structurally sound concepts and performing optimizations using compatible codes. Deployment dynamics control, fabrication techniques, and system testing are addressed.

  16. Cost and Schedule Analytical Techniques Development

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This Final Report summarizes the activities performed by Science Applications International Corporation (SAIC) under contract NAS 8-40431 "Cost and Schedule Analytical Techniques Development Contract" (CSATD) during Option Year 3 (December 1, 1997 through November 30, 1998). This Final Report is in compliance with Paragraph 5 of Section F of the contract. This CSATD contract provides technical products and deliverables in the form of parametric models, databases, methodologies, studies, and analyses to the NASA Marshall Space Flight Center's (MSFC) Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) and other user organizations. Detailed Monthly Reports were submitted to MSFC in accordance with the contract's Statement of Work, Section IV "Reporting and Documentation". These reports spelled out each month's specific work performed, deliverables submitted, major meetings conducted, and other pertinent information. Therefore, this Final Report will summarize these activities at a higher level. During this contract Option Year, SAIC expended 25,745 hours in the performance of tasks called out in the Statement of Work. This represents approximately 14 full-time equivalents (FTEs). Included are the Huntsville-based team, plus SAIC specialists in San Diego, Ames Research Center, Tampa, and Colorado Springs performing specific tasks for which they are uniquely qualified.

  17. Theoretical limitations of quantification for noncompetitive sandwich immunoassays.

    PubMed

    Woolley, Christine F; Hayes, Mark A; Mahanti, Prasun; Douglass Gilman, S; Taylor, Tom

    2015-11-01

    Immunoassays exploit the highly selective interaction between antibodies and antigens to provide a vital method for biomolecule detection at low concentrations. Developers and practitioners of immunoassays have long known that non-specific binding often restricts immunoassay limits of quantification (LOQs). Aside from non-specific binding, most efforts by analytical chemists to reduce the LOQ for these techniques have focused on improving the signal amplification methods and minimizing the limitations of the detection system. However, with detection technology now capable of sensing single-fluorescence molecules, this approach is unlikely to lead to dramatic improvements in the future. Here, fundamental interactions based on the law of mass action are analytically connected to signal generation, replacing the four- and five-parameter fittings commercially used to approximate sigmoidal immunoassay curves and allowing quantitative consideration of non-specific binding and statistical limitations in order to understand the ultimate detection capabilities of immunoassays. The restrictions imposed on limits of quantification by instrumental noise, non-specific binding, and counting statistics are discussed based on equilibrium relations for a sandwich immunoassay. Understanding the maximal capabilities of immunoassays for each of these regimes can greatly assist in the development and evaluation of immunoassay platforms. While many studies suggest that single molecule detection is possible through immunoassay techniques, here, it is demonstrated that the fundamental limit of quantification (precision of 10 % or better) for an immunoassay is approximately 131 molecules and this limit is based on fundamental and unavoidable statistical limitations.
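    The counting-statistics regime described above can be illustrated with a simple Poisson model: if quantification ultimately rests on counting N label molecules, the relative uncertainty scales as 1/sqrt(N), so 10% precision requires on the order of 100 counted molecules. The sketch below is only this back-of-the-envelope bound; the paper's more careful 131-molecule limit folds in additional statistical and binding terms.

```python
import math

def poisson_cv(n):
    """Relative standard deviation (CV) of a Poisson count with mean n."""
    return 1.0 / math.sqrt(n)

def min_molecules_for_cv(target_cv):
    """Smallest mean count whose Poisson CV meets the target precision."""
    return math.ceil(1.0 / target_cv ** 2)

# 10% precision requires counting on the order of 100 molecules.
print(min_molecules_for_cv(0.10))  # -> 100
```

This crude estimate already shows why "single molecule detection" and "quantification at 10% precision" are very different claims.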

  18. Finding Waldo: Learning about Users from their Interactions.

    PubMed

    Brown, Eli T; Ottley, Alvitta; Zhao, Helen; Quan Lin; Souvenir, Richard; Endert, Alex; Chang, Remco

    2014-12-01

    Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user's interactions with a system reflect a large amount of the user's reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user's task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and apply well-known machine learning algorithms to three encodings of the users' interaction data. We achieve, depending on algorithm and encoding, between 62% and 83% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user's personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time: in one case 95% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
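    The pipeline above (encode interaction data as features, train a classifier, predict fast vs. slow users) can be sketched with a toy stand-in. The nearest-centroid classifier and the two interaction features below are illustrative assumptions, not the encodings or algorithms used in the study.

```python
def fit_centroids(X, y):
    """Mean feature vector per class label."""
    centroids = {}
    for label in set(y):
        rows = [x for x, lab in zip(X, y) if lab == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def predict(centroids, x):
    """Assign x to the class with the nearest centroid."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda lab: dist2(centroids[lab], x))

# hypothetical interaction encodings: [clicks per minute, mean pause (s)]
X = [[30, 1.0], [28, 1.5], [10, 6.0], [12, 5.0]]
y = ["fast", "fast", "slow", "slow"]
c = fit_centroids(X, y)
print(predict(c, [25, 2.0]))  # -> fast
```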

  19. Models for randomly distributed nanoscopic domains on spherical vesicles

    NASA Astrophysics Data System (ADS)

    Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John

    2018-06-01

    The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size—a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods capable of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic size domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescent techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from commonly studied phase-separated vesicles used in a wide range of biophysical studies.
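    The Monte Carlo treatment mentioned above can be sketched minimally: place non-overlapping circular domain centers uniformly on the unit sphere by rejection sampling, then compute the pairwise angular separations that feed a pair distribution function. The angular-radius parameter and sequential-placement scheme are simplifying assumptions, not the authors' exact algorithm.

```python
import math, random

def random_unit_vector(rng):
    """Uniform point on the unit sphere (uniform z, uniform azimuth)."""
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    r = math.sqrt(1.0 - z * z)
    return (r * math.cos(phi), r * math.sin(phi), z)

def angular_separation(u, v):
    """Great-circle angle between two unit vectors."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(u, v))))
    return math.acos(dot)

def place_domains(n, angular_radius, rng, max_tries=10000):
    """Sequentially place n non-overlapping circular domain centers."""
    centers = []
    for _ in range(max_tries):
        if len(centers) == n:
            break
        cand = random_unit_vector(rng)
        if all(angular_separation(cand, c) >= 2 * angular_radius
               for c in centers):
            centers.append(cand)
    return centers

rng = random.Random(0)
centers = place_domains(3, math.radians(10), rng)
seps = [angular_separation(a, b)
        for i, a in enumerate(centers) for b in centers[i + 1:]]
print(len(centers), all(s >= math.radians(20) for s in seps))  # -> 3 True
```

Histogramming `seps` over many such configurations gives the interdomain correlations that the analytical two- and three-domain results can be checked against.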

  20. Modern Approach to Medical Diagnostics - the Use of Separation Techniques in Microorganisms Detection.

    PubMed

    Chylewska, Agnieszka; Ogryzek, M; Makowski, Mariusz

    2017-10-23

    New analytical and molecular methods for microorganism identification are being developed around several performance criteria, i.e., selectivity, specificity, sensitivity, rapidity, and discrimination of viable cells. This review follows current trends in improved pathogen separation and detection methods and their subsequent use in medical diagnosis. It also covers the development of analytical and biological methods for the analysis of microorganisms, with special attention paid to bio-samples containing microbes (blood, urine, lymph, wastewater). First, the paper discusses the characterization of microbes (their structure, surface properties, and size) and then describes pivotal advances in bacteria, virus, and fungus separation procedures achieved by researchers over the last 30 years. On this basis, detection techniques can be classified into the three categories that were, in our view, examined and modified most intensively during this period: electrophoretic, nucleic-acid-based, and immunological methods. The review also covers the progress, limitations, and challenges of these approaches and emphasizes the advantages of new separative techniques for the selective fractionation of microorganisms. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  1. From Data to Knowledge – Promising Analytical Tools and Techniques for Capture and Reuse of Corporate Knowledge and to Aid in the State Evaluation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danielson, Gary R.; Augustenborg, Elsa C.; Beck, Andrew E.

    2010-10-29

    The IAEA is challenged with limited availability of human resources for inspection and data analysis while proliferation threats increase. PNNL has a variety of IT solutions and techniques (at varying levels of maturity and development) that take raw data closer to useful knowledge, thereby assisting with and standardizing the analytical processes. This paper highlights some PNNL tools and techniques which are applicable to the international safeguards community, including: • Intelligent in-situ triage of data prior to reliable transmission to an analysis center resulting in the transmission of smaller and more relevant data sets • Capture of expert knowledge in re-usable search strings tailored to specific mission outcomes • Image based searching fused with text based searching • Use of gaming to discover unexpected proliferation scenarios • Process modeling (e.g. Physical Model) as the basis for an information integration portal, which links to data storage locations along with analyst annotations, categorizations, geographic data, search strings and visualization outputs.

  2. Understanding changes over time in workers' compensation claim rates using time series analytical techniques.

    PubMed

    Moore, Ian C; Tompa, Emile

    2011-11-01

    The objective of this study is to better understand the inter-temporal variation in workers' compensation claim rates using time series analytical techniques not commonly used in the occupational health and safety literature. We focus specifically on the role of unemployment rates in explaining claim rate variations. The major components of workers' compensation claim rates are decomposed using data from a Canadian workers' compensation authority for the period 1991-2007. Several techniques are used to undertake the decomposition and assess key factors driving rates: (i) the multitaper spectral estimator, (ii) the harmonic F test, (iii) the Kalman smoother and (iv) ordinary least squares. The largest component of the periodic behaviour in workers' compensation claim rates is seasonal variation. Business cycle fluctuations in workers' compensation claim rates move inversely to unemployment rates. The analysis suggests that workers' compensation claim rates between 1991 and 2008 were driven by (in order of magnitude) a strong negative long term growth trend, periodic seasonal trends and business cycle fluctuations proxied by the Ontario unemployment rate.
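    As a stdlib-only stand-in for the decomposition idea above (the paper's actual tools are the multitaper spectral estimator, harmonic F test, Kalman smoother, and OLS), the sketch below splits a monthly claim-rate series into a linear trend, monthly seasonal means, and a residual. The input series is fabricated purely for illustration.

```python
import math

def decompose_monthly(series):
    """Split a monthly series into linear trend, seasonal means, residual."""
    n = len(series)
    t = list(range(n))
    tbar, ybar = sum(t) / n, sum(series) / n
    sxx = sum((ti - tbar) ** 2 for ti in t)
    b = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, series)) / sxx
    a = ybar - b * tbar                      # least-squares line y = a + b*t
    trend = [a + b * ti for ti in t]
    detrended = [yi - tr for yi, tr in zip(series, trend)]
    seasonal = [sum(detrended[m::12]) / len(detrended[m::12])
                for m in range(12)]          # mean deviation per calendar month
    resid = [d - seasonal[i % 12] for i, d in enumerate(detrended)]
    return trend, seasonal, resid

# fabricated claim rates: declining long-term trend plus a winter peak
y = [10 - 0.05 * t + 2 * math.cos(2 * math.pi * t / 12) for t in range(48)]
trend, seasonal, resid = decompose_monthly(y)
print(round(max(seasonal) - min(seasonal), 1))  # -> 4.0, the injected amplitude
```

The recovered seasonal swing and negative trend slope mirror the paper's finding that seasonality dominates a long-term downward drift in claim rates.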

  3. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached to begin protective action. In keeping with the nuclear regulations and industry standards, satisfying these two requirements will ensure that the safety limit is not exceeded during the design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both the design basis event and the beyond design basis event. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, it is hard to find studies on a systematically integrated methodology for response time evaluation. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation and the response time test is separately performed via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test, and the analysis technique has the demerit of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter, demonstrating that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test, demonstrating that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers from the sensor to the final actuation device on the instrument channel. When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters and the step function test technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the proposed methodology plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)

  4. Significant findings concerning the production of Italian Renaissance lustred majolica

    NASA Astrophysics Data System (ADS)

    Padeletti, G.; Fermo, P.

    2013-12-01

    In the present paper the main results obtained, over a period of more than ten years, from a series of studies concerning the characterization of Italian Renaissance lustred majolicas (from Gubbio and Deruta, Umbria, Italy) are presented. Lustre decoration is a well-known technique, consisting in the application of a thin metallic iridescent film, containing silver and copper nanoparticles, over a previously glazed ceramic object. The technique had its origin in Persia (IX century), was brought to Spain by the Moors, and then developed in central Italy during the Renaissance period. Numerous analytical techniques (among them ETAAS, XRD, UV-Vis, and SEM-EDX) have been employed for the characterization of lustred ceramic shards, allowing one to acquire information on both lustre chemical composition and nanostructure. In this way it was shown that some technological parameters, such as the firing conditions, are critical to the final result. The presence of a specific marker of the Italian lustre production, i.e., cosalite (Pb2Bi2S5), has also been highlighted. From the study of the ceramic body composition (by means of XRD and ICP-OES, and in particular of chemometric techniques) acquired on more than 50 ceramic shards, it was possible to discriminate between the Deruta and Gubbio productions, allowing objects of uncertain provenance to be assigned to a specific site. Finally, the most interesting results obtained by studying excellent Renaissance lustred masterpieces belonging to important museums are presented here. In particular, with the use of nondestructive techniques (PIXE, RBS, and portable XRD), the production of Mastro Giorgio Andreoli from Gubbio was investigated. By means of the same analytical approach, one of the first examples of lustre in Italy (the famous Baglioni's albarello) was examined, and the controversial question of its attribution to Italian production was scientifically addressed.

  5. Depth-resolved monitoring of analytes diffusion in ocular tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.

    2007-02-01

    Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed OCT technique for monitoring and quantification of analyte and drug diffusion in cornea and sclera of rabbit eyes in vitro. Different analytes and drugs such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution were studied and whose permeability coefficients were calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. Obtained results suggest that OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.
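    Permeability coefficients in such studies are commonly estimated from how far the diffusing agent has penetrated per unit time. The sketch below uses that simple depth-over-time definition with fabricated numbers; it is only a stand-in for the actual depth-resolved OCT signal analysis performed in this kind of work.

```python
def permeability_coefficient(depths_cm, times_s):
    """Average P = depth / time over the monitored depth region.

    A hedged stand-in for depth-resolved OCT permeability estimation:
    each (depth, time) pair marks when the agent front reached that depth.
    """
    rates = [z / t for z, t in zip(depths_cm, times_s)]
    return sum(rates) / len(rates)

# hypothetical front positions (cm) and arrival times (s)
depths = [0.005, 0.010, 0.015]
times = [300, 600, 900]
print(f"{permeability_coefficient(depths, times):.2e} cm/s")  # -> 1.67e-05 cm/s
```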

  6. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    PubMed

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that is currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
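    The partition/train/compare workflow described above can be sketched in stdlib Python. The livestock feature (average daily weight gain), the simple threshold classifier, and the toy data are all illustrative assumptions; in practice any candidate algorithm would be fitted on the training partition and scored the same way on the naive partition.

```python
import random

def partition(rows, labels, test_frac=0.25, seed=0):
    """Shuffle and split into training and naive (held-out) sets."""
    idx = list(range(len(rows)))
    random.Random(seed).shuffle(idx)
    cut = int(len(idx) * (1 - test_frac))
    tr, te = idx[:cut], idx[cut:]
    return ([rows[i] for i in tr], [labels[i] for i in tr],
            [rows[i] for i in te], [labels[i] for i in te])

def fit_threshold(train_rows, train_labels):
    """Toy classifier: cut feature 0 at the midpoint of the class means."""
    sick = [r[0] for r, lab in zip(train_rows, train_labels) if lab == "sick"]
    healthy = [r[0] for r, lab in zip(train_rows, train_labels) if lab == "healthy"]
    cut = (sum(sick) / len(sick) + sum(healthy) / len(healthy)) / 2
    return lambda row: "sick" if row[0] < cut else "healthy"

def accuracy(clf, rows, labels):
    """Fraction of correct predictions on a labeled set."""
    return sum(clf(r) == lab for r, lab in zip(rows, labels)) / len(labels)

# fabricated data: average daily weight gain (kg/day) vs. health status
rows = [[1.4], [1.5], [1.3], [0.6], [0.5], [1.6], [0.4], [1.2]]
labels = ["healthy", "healthy", "healthy", "sick", "sick",
          "healthy", "sick", "healthy"]
Xtr, ytr, Xte, yte = partition(rows, labels)
clf = fit_threshold(Xtr, ytr)
print(accuracy(clf, Xte, yte))  # -> 1.0 on this cleanly separable toy data
```

Holding out the naive partition before any fitting is what lets the final accuracy estimate stand in for performance on genuinely new animals.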

  7. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods improve the sensitivity of detection, while HPLC and mass spectrometry methods facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have begun replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  8. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods improve the sensitivity of detection, while HPLC and mass spectrometry methods facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have begun replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. PMID:28109440

  9. SnapShot: Visualization to Propel Ice Hockey Analytics.

    PubMed

    Pileggi, H; Stolper, C D; Boyle, J M; Stasko, J T

    2012-12-01

    Sports analysts live in a world of dynamic games flattened into tables of numbers, divorced from the rinks, pitches, and courts where they were generated. Currently, these professional analysts use R, Stata, SAS, and other statistical software packages for uncovering insights from game data. Quantitative sports consultants seek a competitive advantage both for their clients and for themselves as analytics becomes increasingly valued by teams, clubs, and squads. In order for the information visualization community to support the members of this blossoming industry, it must recognize where and how visualization can enhance the existing analytical workflow. In this paper, we identify three primary stages of today's sports analyst's routine where visualization can be beneficially integrated: 1) exploring a dataspace; 2) sharing hypotheses with internal colleagues; and 3) communicating findings to stakeholders. Working closely with professional ice hockey analysts, we designed and built SnapShot, a system to integrate visualization into the hockey intelligence gathering process. SnapShot employs a variety of information visualization techniques to display shot data, and, given the importance of a specific hockey statistic, shot length, we introduce a technique, the radial heat map. Through a user study, we received encouraging feedback from several professional analysts, both independent consultants and professional team personnel.
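    For concreteness, the aggregation behind a radial heat map can be sketched as binning shots by distance (ring) and angle (sector) around the goal. The coordinate convention, bin granularity, and sample shots below are assumptions for illustration, not SnapShot's actual implementation.

```python
import math

def radial_bins(shots, n_rings=3, n_sectors=4, max_len=60.0):
    """Aggregate (x, y) shot locations relative to the goal into
    (ring, sector) counts: the data behind a radial heat map."""
    counts = {}
    for x, y in shots:
        length = math.hypot(x, y)          # shot length: distance to goal
        angle = math.atan2(y, x)           # shot angle
        ring = min(int(length / (max_len / n_rings)), n_rings - 1)
        sector = int((angle + math.pi) / (2 * math.pi / n_sectors)) % n_sectors
        counts[(ring, sector)] = counts.get((ring, sector), 0) + 1
    return counts

shots = [(5, 2), (10, -3), (30, 15), (55, 0)]
print(radial_bins(shots))
```

    Each count would then be mapped to a color intensity in the corresponding ring/sector wedge of the plot.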

  10. Micro-optics for microfluidic analytical applications.

    PubMed

    Yang, Hui; Gijs, Martin A M

    2018-02-19

    This critical review summarizes the developments in the integration of micro-optical elements with microfluidic platforms for facilitating detection and automation of bio-analytical applications. Micro-optical elements, made by a variety of microfabrication techniques, advantageously contribute to the performance of an analytical system, especially when the latter has microfluidic features. Indeed the easy integration of optical control and detection modules with microfluidic technology helps to bridge the gap between the macroscopic world and chip-based analysis, paving the way for automated and high-throughput applications. In our review, we start the discussion with an introduction of microfluidic systems and micro-optical components, as well as aspects of their integration. We continue with a detailed description of different microfluidic and micro-optics technologies and their applications, with an emphasis on the realization of optical waveguides and microlenses. The review continues with specific sections highlighting the advantages of integrated micro-optical components in microfluidic systems for tackling a variety of analytical problems, like cytometry, nucleic acid and protein detection, cell biology, and chemical analysis applications.

  11. Hierarchical zwitterionic modification of a SERS substrate enables real-time drug monitoring in blood plasma

    PubMed Central

    Sun, Fang; Hung, Hsiang-Chieh; Sinclair, Andrew; Zhang, Peng; Bai, Tao; Galvan, Daniel David; Jain, Priyesh; Li, Bowen; Jiang, Shaoyi; Yu, Qiuming

    2016-01-01

    Surface-enhanced Raman spectroscopy (SERS) is an ultrasensitive analytical technique with molecular specificity, making it an ideal candidate for therapeutic drug monitoring (TDM). However, in critical diagnostic media including blood, nonspecific protein adsorption coupled with weak surface affinities and small Raman activities of many analytes hinder the TDM application of SERS. Here we report a hierarchical surface modification strategy, first by coating a gold surface with a self-assembled monolayer (SAM) designed to attract or probe for analytes and then by grafting a non-fouling zwitterionic polymer brush layer to effectively repel protein fouling. We demonstrate how this modification can enable TDM applications by quantitatively and dynamically measuring the concentrations of several analytes—including an anticancer drug (doxorubicin), several TDM-requiring antidepressant and anti-seizure drugs, fructose and blood pH—in undiluted plasma. This hierarchical surface chemistry is widely applicable to many analytes and provides a generalized platform for SERS-based biosensing in complex real-world media. PMID:27834380

  12. Ultrasensitive microchip based on smart microgel for real-time online detection of trace threat analytes.

    PubMed

    Lin, Shuo; Wang, Wei; Ju, Xiao-Jie; Xie, Rui; Liu, Zhuang; Yu, Hai-Rong; Zhang, Chuan; Chu, Liang-Yin

    2016-02-23

    Real-time online detection of trace threat analytes is critical for global sustainability, whereas the key challenge is how to efficiently convert and amplify analyte signals into simple readouts. Here we report an ultrasensitive microfluidic platform incorporating a smart microgel for real-time online detection of trace threat analytes. The microgel can swell in response to a specific stimulus in flowing solution, resulting in efficient conversion of the stimulus signal into a significantly amplified flow-rate change; thus highly sensitive, fast, and selective detection can be achieved. We demonstrate this by incorporating an ion-recognizable microgel for detecting trace Pb(2+), and by connecting our platform to pipelines of tap water and wastewater for real-time online Pb(2+) detection to achieve timely pollution warning and termination. This work provides a generalizable platform for incorporating myriad stimuli-responsive microgels to achieve ever-better performance in real-time online detection of various trace threat molecules, and may expand the scope of applications of detection techniques.
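    The abstract does not quantify the amplification mechanism, but one way to see why a swelling microgel amplifies the signal is laminar channel flow: by the Hagen-Poiseuille law, volumetric flow at a fixed pressure drop scales with the fourth power of the channel radius, so a modest constriction produces a much larger relative flow-rate change. The numbers below are purely illustrative.

```python
def relative_flow_change(radius_shrink_fraction):
    """Hagen-Poiseuille: Q is proportional to r**4 at fixed pressure drop,
    so a small radius change yields an amplified flow-rate change."""
    r_ratio = 1.0 - radius_shrink_fraction
    return 1.0 - r_ratio ** 4

# A 10 % radius reduction from microgel swelling cuts flow by ~34 %.
print(round(relative_flow_change(0.10), 2))  # → 0.34
```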

  13. Environmental Monitoring and the Gas Industry: Program Manager Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory D. Gillispie

    1997-12-01

    This document has been developed for the nontechnical gas industry manager who is responsible for the development of waste or potentially contaminated soil and groundwater data, or who must make decisions based on such data for the management or remediation of these materials. It explores the use of common analytical chemistry instrumentation and associated techniques for the identification of environmentally hazardous materials. Sufficient detail is given to familiarize the nontechnical reader with the principles behind the operation of each technique. The scope and realm of the techniques and their constituent variations are portrayed through a discussion of crucial details and, where appropriate, the depiction of real-life data. It is the author's intention to provide an easily understood handbook for gas industry management. Techniques which determine the presence, composition, and quantity of gas industry wastes are discussed. Greater focus is given to traditional techniques which have been the mainstay of modern analytical benchwork. However, with the continual advancement of instrumental principles and design, several techniques have been included which are likely to receive greater attention in future considerations for waste-related detection. Definitions and concepts inherent to a thorough understanding of the principles common to analytical chemistry are discussed. It is also crucial that gas industry managers understand the effects of the various actions which take place before, during, and after the actual sampling step. When a series of sample collection, storage, and transport activities occur, new or inexperienced project managers may overlook or misunderstand the importance of the sequence. Each step has an impact on the final results of the measurement process; errors in judgment or decision making can be costly.
Specific techniques and methodologies for the collection, storage, and transport of environmental media samples are not described or discussed in detail in this handbook. However, the underlying philosophy regarding the importance of proper collection, storage, and transport practices, as well as pertinent references, are presented.

  14. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation has been introduced to solve problems in the form of systems. Two kinds of problems can be overcome with this technique. First, a problem that has an analytical solution, but for which the cost of running an experiment is high in terms of money and lives. Second, a problem that exists but has no analytical solution. In the field of statistical inference the second kind is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutation tests to form a pseudo sampling distribution that leads to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. It also discusses resampling techniques as simulation techniques. Misunderstandings about these two techniques are examined, and successful usages of both are explained.
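    The bootstrap idea mentioned above, forming a pseudo sampling distribution by resampling with replacement, can be sketched briefly; the data and resample count are illustrative.

```python
import random
import statistics

def bootstrap_ci(sample, n_resamples=2000, alpha=0.05, seed=1):
    """Approximate a confidence interval for the mean by resampling the
    observed data with replacement: a pseudo sampling distribution."""
    rng = random.Random(seed)
    means = sorted(
        statistics.mean(rng.choices(sample, k=len(sample)))
        for _ in range(n_resamples)
    )
    lo = means[int(alpha / 2 * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

data = [4.1, 5.0, 3.8, 4.6, 5.2, 4.4, 4.9, 4.0]
print(bootstrap_ci(data))
```

    The same machinery, re-drawing labels without replacement instead, gives a permutation test.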

  15. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
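    The steady-state availability relation and its Monte Carlo counterpart mentioned above can be illustrated briefly; the exponential up/down-time model and the MTBF/MTTR values are assumptions for the sketch, not taken from the report.

```python
import random

def analytical_availability(mtbf, mttr):
    """Steady-state operational availability: uptime over total time."""
    return mtbf / (mtbf + mttr)

def simulated_availability(mtbf, mttr, cycles=20000, seed=7):
    """Monte Carlo check: draw exponential up and down durations and
    accumulate the fraction of time the system is operable."""
    rng = random.Random(seed)
    up = down = 0.0
    for _ in range(cycles):
        up += rng.expovariate(1.0 / mtbf)    # time between failures
        down += rng.expovariate(1.0 / mttr)  # time to repair
    return up / (up + down)

print(round(analytical_availability(100.0, 5.0), 4))  # → 0.9524
print(round(simulated_availability(100.0, 5.0), 3))
```

    The simulated estimate converges to the analytical value as the number of cycles grows, which is the kind of cross-check the trade-off studies rely on.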

  16. Analytical Micromechanics Modeling Technique Developed for Ceramic Matrix Composites Analysis

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    Ceramic matrix composites (CMCs) promise many advantages for next-generation aerospace propulsion systems. Specifically, carbon-reinforced silicon carbide (C/SiC) CMCs enable higher operational temperatures and provide potential component weight savings by virtue of their high specific strength. These attributes may provide systemwide benefits. Higher operating temperatures lessen or eliminate the need for cooling, thereby reducing both fuel consumption and the complex hardware and plumbing required for heat management. This, in turn, lowers system weight, size, and complexity, while improving efficiency, reliability, and service life, resulting in overall lower operating costs.

  17. Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment

    ERIC Educational Resources Information Center

    Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James

    2010-01-01

    The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…

  18. Development and evaluation of the RT-PCR kit for the rabies virus diagnosis.

    PubMed

    Dedkov, Vladimir G; Deviatkin, A A; Poleschuk, E M; Safonova, M V; Markelov, M L; Shipulin, G A

    To improve the diagnosis, surveillance, and control of the rabies virus, a kit for hybridization-triggered fluorescence detection of rabies virus RNA by the RT-PCR technique was developed and evaluated. The analytical sensitivity of the kit was 4*10 GE per ml. The high specificity of the kit was shown using a representative sampling of viral, bacterial, and human nucleic acids.

  19. Separation techniques for the clean-up of radioactive mixed waste for ICP-AES/ICP-MS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swafford, A.M.; Keller, J.M.

    1993-03-17

    Two separation techniques were investigated for the clean-up of typical radioactive mixed waste samples requiring elemental analysis by Inductively Coupled Plasma-Atomic Emission Spectroscopy (ICP-AES) or Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). These measurements frequently involve regulatory or compliance criteria which include the determination of elements on the EPA Target Analyte List (TAL). The samples usually consist of both an aqueous phase and a solid phase which is mostly an inorganic sludge. Frequently, samples taken from the waste tanks contain high levels of uranium and thorium which can cause spectral interferences in ICP-AES or ICP-MS analysis. The removal of these interferences is necessary to determine the presence of the EPA TAL elements in the sample. Two clean-up methods were studied on simulated aqueous waste samples containing the EPA TAL elements. The first method was a classical procedure based upon liquid-liquid extraction using tri-n-octylphosphine oxide (TOPO) dissolved in cyclohexane. The second method was based on more recently developed extraction chromatography techniques; specifically, the use of a commercially available Eichrom TRU·Spec™ column. Literature on these two methods indicates the efficient removal of uranium and thorium from properly prepared samples and provides considerable qualitative information on the extraction behavior of many other elements. However, there is a lack of quantitative data on the extraction behavior of elements on the EPA Target Analyte List. Experimental studies on these two methods consisted of determining whether any of the analytes were extracted by these methods and the recoveries obtained. Both methods produced similar results; the EPA target analytes were only slightly extracted or not extracted at all. The advantages and disadvantages of each method were evaluated and found to be comparable.

  20. The composition-explicit distillation curve technique: Relating chemical analysis and physical properties of complex fluids.

    PubMed

    Bruno, Thomas J; Ott, Lisa S; Lovestead, Tara M; Huber, Marcia L

    2010-04-16

    The analysis of complex fluids such as crude oils, fuels, vegetable oils and mixed waste streams poses significant challenges arising primarily from the multiplicity of components, the different properties of the components (polarity, polarizability, etc.) and matrix properties. We have recently introduced an analytical strategy that simplifies many of these analyses, and provides the added potential of linking compositional information with physical property information. This aspect can be used to facilitate equation of state development for the complex fluids. In addition to chemical characterization, the approach provides the ability to calculate thermodynamic properties for such complex heterogeneous streams. The technique is based on the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. The analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. By far, the most widely used analytical technique we have used with the ADC is gas chromatography. This has enabled us to study finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oils streams (used automotive and transformer oils). In this special issue of the Journal of Chromatography, specifically dedicated to extraction technologies, we describe the essential features of the advanced distillation curve metrology as an analytical strategy for complex fluids. Published by Elsevier B.V.

  1. Can cloud point-based enrichment, preservation, and detection methods help to bridge gaps in aquatic nanometrology?

    PubMed

    Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan

    2016-11-01

    Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances. Size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: May CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analyses and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes of the environmental conditions during sampling and sample preparation. This delivers a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these unstable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and on available applications, via the uncertainties visible and modeling approaches available, with potential future benefits from CPE protocols.

  2. Update on Chemical Analysis of Recovered Hydrazine Family Fuels for Recycling

    NASA Technical Reports Server (NTRS)

    Davis, C. L.

    1997-01-01

    The National Aeronautics and Space Administration, Kennedy Space Center, has developed a program to re-use and/or recycle hypergolic propellants recovered from propellant systems. As part of this effort, new techniques were developed to analyze recovered propellants. At the 1996 PDCS, the paper 'Chemical Analysis of Recovered Hydrazine Family Fuels For Recycling' presented analytical techniques used in accordance with KSC specifications which define what recovered propellants are acceptable for recycling. This paper is a follow up to the 1996 paper. Lower detection limits and response linearity were examined for two gas chromatograph methods.

  3. A history of development in rotordynamics: A manufacturer's perspective

    NASA Technical Reports Server (NTRS)

    Shemeld, David E.

    1987-01-01

    The subject of rotordynamics and instability problems in high performance turbomachinery has been a topic of considerable industry discussion and debate over the last 15 or so years. This paper reviews an original equipment manufacturer's history of development of concepts and equipment as applicable to multistage centrifugal compressors. The variety of industry user compression requirements and resultant problematical situations tends to confound many of the theories and analytical techniques set forth. The experiences and examples described herein support the conclusion that successfully addressing potential rotordynamics problems is best served by a fundamental knowledge of the specific equipment, in addition to having the appropriate analytical tools; the final proof is in the doing.

  4. Infrared radiation of thin plastic films.

    NASA Technical Reports Server (NTRS)

    Tien, C. L.; Chan, C. K.; Cunnington, G. R.

    1972-01-01

    A combined analytical and experimental study is presented for infrared radiation characteristics of thin plastic films with and without a metal substrate. On the basis of the thin-film analysis, a simple analytical technique is developed for determining band-averaged optical constants of thin plastic films from spectral normal transmittance data for two different film thicknesses. Specifically, the band-averaged optical constants of polyethylene terephthalate and polyimide were obtained from transmittance measurements of films with thicknesses in the range of 0.25 to 3 mil. The spectral normal reflectance and total normal emittance of the film side of singly aluminized films are calculated by use of optical constants; the results compare favorably with measured values.

  5. On-line soft sensing in upstream bioprocessing.

    PubMed

    Randek, Judit; Mandenius, Carl-Fredrik

    2018-02-01

    This review provides an overview and a critical discussion of novel possibilities for applying soft sensors to on-line monitoring and control of industrial bioprocesses. The focus is on bio-product formation in the upstream process, but integration with other parts of the process is also addressed. The term soft sensor is used for the combination of analytical hardware data (from sensors, analytical devices, instruments and actuators) with mathematical models that create new real-time information about the process. In particular, the review assesses these possibilities from an industrial perspective, including sensor performance, information value and production economy. The capabilities of existing analytical on-line techniques are scrutinized in view of their usefulness in soft sensor setups and in relation to typical needs in bioprocessing in general. The review concludes with specific recommendations for further development of soft sensors for the monitoring and control of upstream bioprocessing.
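    A toy soft-sensor sketch in the spirit described above: a noisy hardware signal is smoothed, then converted into an unmeasured quantity through a model. The specific oxygen uptake rate q_O2 and all numbers are hypothetical assumptions for illustration.

```python
def smooth(readings, alpha=0.3):
    """Exponential smoothing of a noisy on-line hardware signal."""
    est = readings[0]
    out = [est]
    for r in readings[1:]:
        est = alpha * r + (1 - alpha) * est
        out.append(est)
    return out

def biomass_from_our(our, q_o2=3.0):
    """Model half of the soft sensor: an assumed specific oxygen uptake
    rate q_O2 (mmol O2 per g biomass per h) converts the measured oxygen
    uptake rate OUR (mmol O2/L/h) into a biomass estimate (g/L)."""
    return our / q_o2

# Hypothetical noisy OUR readings from an off-gas analyzer
our_signal = smooth([11.0, 13.0, 12.5, 11.8, 12.2])
print(round(biomass_from_our(our_signal[-1]), 2))
```

    Real soft sensors replace the fixed yield model with mechanistic or data-driven models, but the hardware-signal-plus-model structure is the same.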

  6. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry specifically has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  7. One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.

    PubMed

    Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz

    2009-07-15

    The existing solid-phase microextraction (SPME) kinetic calibration technique, which uses the desorption of preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which has limited the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic environmental variability, such as temperature, turbulence, and analyte concentration, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of the passive samplings in the flow-through system illustrated that the effect of the environmental variables was successfully compensated by the kinetic calibration technique, and all extracted analytes could be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and will also be useful for other microextraction techniques.
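    The calibration relation underlying this family of methods can be written down compactly. Under the isotropy assumption of kinetic calibration theory, the fraction of preloaded standard desorbed, 1 - q/q0, equals the fraction of equilibrium extraction attained by the analyte, giving C = n / (K_fs * V_f * (1 - q/q0)). The sketch below applies this with invented constants; it is a simplified reading of the approach, not the paper's exact derivation.

```python
def analyte_concentration(n_extracted, q0, q_remaining, k_fs, v_f):
    """Kinetic calibration sketch: assuming desorption of the preloaded
    standard mirrors analyte uptake, the fraction of standard lost
    (1 - q/q0) equals the fraction of equilibrium extraction reached,
    so C = n / (K_fs * V_f * (1 - q/q0))."""
    fraction_desorbed = 1.0 - q_remaining / q0
    return n_extracted / (k_fs * v_f * fraction_desorbed)

# Illustrative numbers: 2.0 ng extracted, half the standard desorbed,
# fiber-sample distribution constant K_fs = 8000, fiber volume 1e-4 mL
print(analyte_concentration(2.0, 10.0, 5.0, k_fs=8000.0, v_f=1e-4))  # ng/mL
```

    The one-calibrant variant extends this by relating the desorption rate constant of the single standard to those of the other analytes.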

  8. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    PubMed

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies related to the distribution of metallic elements in biological samples are one of the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging the metallic elements in various kinds of biological samples. However, in such literature, there is a lack of articles dedicated to reviewing calibration strategies, and their problems, nomenclature, definitions, ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples including (1) nomenclature; (2) definitions, and (3) selected and sophisticated, examples of calibration strategies with analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials such as LA ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize the division of calibration methods that are different than those hitherto used. This article is the first work in literature that refers to and emphasizes many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.
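    A minimal example of one common calibration strategy the article covers, an external calibration curve fit by least squares and inverted to report a concentration; the standards and signals below are invented for illustration.

```python
def fit_calibration(concs, signals):
    """External calibration: least-squares fit of signal = a * conc + b."""
    n = len(concs)
    mx = sum(concs) / n
    my = sum(signals) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(concs, signals))
         / sum((x - mx) ** 2 for x in concs))
    b = my - a * mx
    return a, b

def quantify(signal, a, b):
    """Invert the calibration line to turn a measured signal into
    a reported concentration."""
    return (signal - b) / a

# Hypothetical standards (concentration, instrument signal)
a, b = fit_calibration([0, 1, 2, 4], [0.1, 1.1, 2.1, 4.1])
print(round(quantify(3.1, a, b), 2))  # → 3.0
```

    Strategies such as standard addition or internal standardization modify this scheme to compensate for matrix effects, which is where most of the nomenclature the authors systematize comes in.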

  9. Techniques and Tools of NASA's Space Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steve J.

    2005-01-01

    The Space Shuttle Columbia accident investigation was a fusion of many disciplines into a single effort. From the recovery and reconstruction of the debris, Figure 1, to the analysis, both destructive and nondestructive, of chemical and metallurgical samples, Figure 2, a multitude of analytical techniques and tools were employed. Destructive and nondestructive testing were utilized in tandem to determine whether a breach in the left wing of the Orbiter had occurred and, if so, the path of the resultant high-temperature plasma flow. Nondestructive analysis included topometric scanning, laser mapping, and real-time radiography. These techniques were useful in constructing a three-dimensional virtual representation of the reconstruction project, specifically the left wing leading edge reinforced carbon/carbon heat-protectant panels. Similarly, they were beneficial in determining where sampling should be performed on the debris. Analytic testing included such techniques as Energy Dispersive Electron Microprobe Analysis (EMPA), Electron Spectroscopy Chemical Analysis (ESCA), and X-ray dot mapping; these techniques related the characteristics of intermetallics deposited on the leading edge of the left wing adjacent to the location of a suspected plasma breach during reentry. The methods and results of the various analyses, along with their implications for the accident, are discussed, together with the findings and recommendations of the Columbia Accident Investigation Board. Likewise, NASA's Return To Flight efforts are highlighted.

  10. 2D-Visualization of metabolic activity with planar optical chemical sensors (optodes)

    NASA Astrophysics Data System (ADS)

    Meier, R. J.; Liebsch, G.

    2015-12-01

    Microbial life plays a critically important role in many hydrologic compartments, such as the benthic community in sediments and the biologically active microorganisms in the capillary fringe, in groundwater, or in soil. Oxygen, pH, and CO2 are key factors and indicators of microbial activity. They can be measured using optical chemical sensors, which record the changing fluorescence properties of specific indicator dyes. The signals can be measured in a non-contact mode, even through transparent walls, which is important for many lab experiments. These sensors can measure in closed (transparent) systems without sampling or intruding into the sample; they do not consume the analytes while measuring, are fully reversible, and are able to measure in non-stirred solutions. They can be applied as high-precision fiber-optic sensors (for profiling), robust sensor spots, or planar sensors for 2D visualization (imaging). Imaging enables the detection of thousands of measurement spots at the same time and generates 2D analyte maps over a region of interest. It allows for comparing different regions within one recorded image, visualizing spatial analyte gradients, or, more importantly, identifying hot spots of metabolic activity. We present ready-to-use portable imaging systems for the analytes oxygen, pH, and CO2. They consist of a detector unit, planar sensor foils, and software for easy data recording and evaluation. Sensor foils for various analytes and measurement ranges enable visualizing metabolic activity or analyte changes in the desired range. Dynamics of metabolic activity can be detected in one shot or over long time periods. We demonstrate the potential of this analytical technique by presenting experiments on benthic disturbance-recovery dynamics in sediments and on microbial degradation of organic material in the capillary fringe. We think this technique is a new tool to further understand how microbial and geochemical processes are linked in (not solely) hydrologic systems.

  11. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... analyte specific reagents. 809.30 Section 809.30 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT... Requirements for Manufacturers and Producers § 809.30 Restrictions on the sale, distribution and use of analyte specific reagents. (a) Analyte specific reagents (ASR's) (§ 864.4020 of this chapter) are restricted...

  12. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... analyte specific reagents. 809.30 Section 809.30 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT... Requirements for Manufacturers and Producers § 809.30 Restrictions on the sale, distribution and use of analyte specific reagents. (a) Analyte specific reagents (ASR's) (§ 864.4020 of this chapter) are restricted...

  13. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... analyte specific reagents. 809.30 Section 809.30 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT... Requirements for Manufacturers and Producers § 809.30 Restrictions on the sale, distribution and use of analyte specific reagents. (a) Analyte specific reagents (ASR's) (§ 864.4020 of this chapter) are restricted...

  14. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step for the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and the resulting data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may limit the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments, and low-cost operation through extremely low or no solvent consumption. Microextraction techniques, such as microextraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS, namely its formats (on- and off-line), sorbents, and experimental protocols; the factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  15. Stress analysis of the cracked-lap-shear specimen - An ASTM round-robin

    NASA Technical Reports Server (NTRS)

    Johnson, W. S.

    1987-01-01

    This ASTM Round Robin was conducted to evaluate the state of the art in stress analysis of adhesively bonded joint specimens. Specifically, the participants were asked to calculate the strain-energy-release rate for two different cracked-lap-shear (CLS) specimen geometries at four different debond lengths. The analytical techniques employed consisted of 2- and 3-dimensional finite element analysis, beam theory, plate theory, and a combination of beam theory and finite element analysis. The results were examined in terms of the total strain-energy-release rate and the mode I to mode II ratio as a function of debond length for each specimen geometry. These results clustered into two groups, corresponding to geometrically linear and geometrically nonlinear analyses; a geometrically nonlinear analysis is required to properly analyze the CLS specimens. The 3-D finite element analysis gave indications of edge closure plus some mode III loading. Each participant described his analytical technique and results. Nine laboratories participated.

  16. Stress analysis of the cracked lap shear specimens: An ASTM round robin

    NASA Technical Reports Server (NTRS)

    Johnson, W. S.

    1986-01-01

    This ASTM Round Robin was conducted to evaluate the state of the art in stress analysis of adhesively bonded joint specimens. Specifically, the participants were asked to calculate the strain-energy-release rate for two different cracked-lap-shear (CLS) specimen geometries at four different debond lengths. The analytical techniques employed consisted of 2- and 3-dimensional finite element analysis, beam theory, plate theory, and a combination of beam theory and finite element analysis. The results were examined in terms of the total strain-energy-release rate and the mode I to mode II ratio as a function of debond length for each specimen geometry. These results clustered into two groups, corresponding to geometrically linear and geometrically nonlinear analyses; a geometrically nonlinear analysis is required to properly analyze the CLS specimens. The 3-D finite element analysis gave indications of edge closure plus some mode III loading. Each participant described his analytical technique and results. Nine laboratories participated.

  17. Text Mining in Organizational Research

    PubMed Central

    Kobayashi, Vladimer B.; Berkers, Hannah A.; Kismihók, Gábor; Den Hartog, Deanne N.

    2017-01-01

    Despite the ubiquity of textual data, so far few researchers have applied text mining to answer organizational research questions. Text mining, which essentially entails a quantitative approach to the analysis of (usually) voluminous textual data, helps accelerate knowledge discovery by radically increasing the amount of data that can be analyzed. This article aims to acquaint organizational researchers with the fundamental logic underpinning text mining, the analytical stages involved, and contemporary techniques that may be used to achieve different types of objectives. The specific analytical techniques reviewed are (a) dimensionality reduction, (b) distance and similarity computing, (c) clustering, (d) topic modeling, and (e) classification. We describe how text mining may extend contemporary organizational research by allowing the testing of existing or new research questions with data that are likely to be rich, contextualized, and ecologically valid. After an exploration of how evidence for the validity of text mining output may be generated, we conclude the article by illustrating the text mining process in a job analysis setting using a dataset composed of job vacancies. PMID:29881248

  18. Text Mining in Organizational Research.

    PubMed

    Kobayashi, Vladimer B; Mol, Stefan T; Berkers, Hannah A; Kismihók, Gábor; Den Hartog, Deanne N

    2018-07-01

    Despite the ubiquity of textual data, so far few researchers have applied text mining to answer organizational research questions. Text mining, which essentially entails a quantitative approach to the analysis of (usually) voluminous textual data, helps accelerate knowledge discovery by radically increasing the amount of data that can be analyzed. This article aims to acquaint organizational researchers with the fundamental logic underpinning text mining, the analytical stages involved, and contemporary techniques that may be used to achieve different types of objectives. The specific analytical techniques reviewed are (a) dimensionality reduction, (b) distance and similarity computing, (c) clustering, (d) topic modeling, and (e) classification. We describe how text mining may extend contemporary organizational research by allowing the testing of existing or new research questions with data that are likely to be rich, contextualized, and ecologically valid. After an exploration of how evidence for the validity of text mining output may be generated, we conclude the article by illustrating the text mining process in a job analysis setting using a dataset composed of job vacancies.
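
    The "distance and similarity computing" stage reviewed above can be illustrated with a minimal sketch: cosine similarity between bag-of-words term-frequency vectors. The mini-corpus of job-vacancy snippets is hypothetical, chosen only to echo the article's job-analysis illustration.

```python
import math
from collections import Counter

def term_vector(text):
    """Bag-of-words term-frequency vector for one document."""
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    """Cosine of the angle between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical job-vacancy snippets (illustrative only).
docs = [
    "data analyst sql reporting dashboards",
    "senior data analyst sql python reporting",
    "nurse patient care clinical ward",
]
vecs = [term_vector(d) for d in docs]
sim_01 = cosine_similarity(vecs[0], vecs[1])  # two analyst vacancies
sim_02 = cosine_similarity(vecs[0], vecs[2])  # analyst vs. nurse
```

    On such vectors, related vacancies score high while unrelated ones score near zero, which is the signal clustering and classification stages build on.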

  19. Phenalenone-type phytoalexins mediate resistance of banana plants (Musa spp.) to the burrowing nematode Radopholus similis

    PubMed Central

    Hölscher, Dirk; Dhakshinamoorthy, Suganthagunthalam; Alexandrov, Theodore; Becker, Michael; Bretschneider, Tom; Buerkert, Andreas; Crecelius, Anna C.; De Waele, Dirk; Elsen, Annemie; Heckel, David G.; Heklau, Heike; Hertweck, Christian; Kai, Marco; Knop, Katrin; Krafft, Christoph; Maddula, Ravi K.; Matthäus, Christian; Popp, Jürgen; Schneider, Bernd; Schubert, Ulrich S.; Sikora, Richard A.; Svatoš, Aleš; Swennen, Rony L.

    2014-01-01

    The global yield of bananas—one of the most important food crops—is severely hampered by parasites, such as nematodes, which cause yield losses up to 75%. Plant–nematode interactions of two banana cultivars differing in susceptibility to Radopholus similis were investigated by combining the conventional and spatially resolved analytical techniques 1H NMR spectroscopy, matrix-free UV-laser desorption/ionization mass spectrometric imaging, and Raman microspectroscopy. This innovative combination of analytical techniques was applied to isolate, identify, and locate the banana-specific type of phytoalexins, phenylphenalenones, in the R. similis-caused lesions of the plants. The striking antinematode activity of the phenylphenalenone anigorufone, its ingestion by the nematode, and its subsequent localization in lipid droplets within the nematode is reported. The importance of varying local concentrations of these specialized metabolites in infected plant tissues, their involvement in the plant’s defense system, and derived strategies for improving banana resistance are highlighted. PMID:24324151

  20. On the performance of energy detection-based CR with SC diversity over IG channel

    NASA Astrophysics Data System (ADS)

    Verma, Pappu Kumar; Soni, Sanjay Kumar; Jain, Priyanka

    2017-12-01

    Cognitive radio (CR) is a viable 5G technology for addressing spectrum scarcity. Energy detection-based sensing is known to be the simplest method as far as hardware complexity is concerned. In this paper, the performance of the energy detection-based spectrum sensing technique in CR networks over an inverse Gaussian channel with selection combining diversity is analysed. More specifically, accurate analytical expressions for the average detection probability under different detection scenarios, both single channel (no diversity) and with diversity reception, are derived and evaluated. Further, the detection threshold parameter is optimised by minimising the probability of error over several diversity branches. The results clearly show a significant improvement in the probability of detection when the optimised threshold parameter is applied. The impact of the shadowing parameters on the performance of the energy detector is studied in terms of the complementary receiver operating characteristic curve. To verify the correctness of the analysis, the derived analytical expressions are corroborated via exact results and Monte Carlo simulations.
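
    The Monte Carlo side of such a study can be sketched as follows: an energy detector sums squared received samples and compares the statistic to a threshold, with selection combining keeping the strongest branch. This is a minimal stand-in, not the paper's model: it assumes Rayleigh-faded (exponentially distributed SNR) branches rather than the inverse Gaussian shadowing analysed in the paper, and all parameter values are illustrative.

```python
import math
import random

def energy_statistic(snr, n_samples, rng):
    """Sum of squared received samples (signal amplitude + unit-variance noise)."""
    amp = math.sqrt(snr)
    return sum((amp + rng.gauss(0.0, 1.0)) ** 2 for _ in range(n_samples))

def detection_probability(threshold, snr_db, n_branches,
                          n_samples=32, trials=2000, seed=1):
    """Monte Carlo P_d for an energy detector with selection combining:
    the branch with the largest instantaneous SNR feeds the detector."""
    rng = random.Random(seed)
    mean_snr = 10 ** (snr_db / 10)
    hits = 0
    for _ in range(trials):
        # Exponential branch SNRs (Rayleigh-fading assumption); SC keeps the max.
        snr = max(rng.expovariate(1.0 / mean_snr) for _ in range(n_branches))
        if energy_statistic(snr, n_samples, rng) > threshold:
            hits += 1
    return hits / trials

# More diversity branches should raise P_d at a fixed threshold.
pd1 = detection_probability(threshold=40.0, snr_db=0.0, n_branches=1)
pd4 = detection_probability(threshold=40.0, snr_db=0.0, n_branches=4)
```

    Sweeping the threshold in such a simulation traces the receiver operating characteristic against which closed-form expressions are typically corroborated.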

  1. Enabling fluorescent biosensors for the forensic identification of body fluids.

    PubMed

    Frascione, Nunzianda; Gooch, James; Daniel, Barbara

    2013-11-12

    The search for body fluids often forms a crucial element of many forensic investigations. Confirming fluid presence at a scene can not only support or refute the circumstantial claims of a victim, suspect or witness, but may additionally provide a valuable source of DNA for further identification purposes. However, current biological fluid testing techniques are impaired by a number of well-characterised limitations; they often give false positives, cannot be used simultaneously, are sample destructive and lack the ability to visually locate fluid depositions. These disadvantages can negatively affect the outcome of a case through missed or misinterpreted evidence. Biosensors are devices able to transduce a biological recognition event into a measurable signal, resulting in real-time analyte detection. The use of innovative optical sensing technology may enable the highly specific and non-destructive detection of biological fluid depositions through interaction with several fluid-endogenous biomarkers. Despite considerable impact in a variety of analytical disciplines, biosensor application within forensic analyses may be considered extremely limited. This article aims to explore a number of prospective biosensing mechanisms and to outline the challenges associated with their adaptation towards detection of fluid-specific analytes.

  2. A three-dimensional finite-element thermal/mechanical analytical technique for high-performance traveling wave tubes

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard

    1991-01-01

    Current research in high-efficiency, high-performance traveling wave tubes (TWT's) has led to the development of novel thermal/mechanical computer models for use with helical slow-wave structures. A three-dimensional finite element computer model and analytical technique were used to study the structural integrity and thermal operation of a high-efficiency, diamond-rod, K-band TWT designed for use in advanced space communications systems. This analysis focused on the slow-wave circuit in the radiofrequency section of the TWT, where an inherent localized heating problem existed and where failures were observed during an earlier cold compression, or 'coining,' fabrication step; the coining technique nonetheless shows great potential for future TWT development efforts. For this analysis, a three-dimensional finite element model was used along with MARC, a commercially available finite element code, to simulate the fabrication of a diamond-rod TWT. This analysis was conducted by using component and material specifications consistent with actual TWT fabrication and was verified against empirical data. The analysis is nonlinear owing to material plasticity introduced by the forming process and also to geometric nonlinearities presented by the component assembly configuration. The computer model was developed by using the high-efficiency, K-band TWT design but is general enough to permit similar analyses to be performed on a wide variety of TWT designs and styles. The results of the TWT operating condition and structural failure mode analysis, as well as a comparison of analytical results to test data, are presented.

  3. A three-dimensional finite-element thermal/mechanical analytical technique for high-performance traveling wave tubes

    NASA Technical Reports Server (NTRS)

    Shalkhauser, Kurt A.; Bartos, Karen F.; Fite, E. B.; Sharp, G. R.

    1992-01-01

    Current research in high-efficiency, high-performance traveling wave tubes (TWT's) has led to the development of novel thermal/mechanical computer models for use with helical slow-wave structures. A three-dimensional finite element computer model and analytical technique were used to study the structural integrity and thermal operation of a high-efficiency, diamond-rod, K-band TWT designed for use in advanced space communications systems. This analysis focused on the slow-wave circuit in the radiofrequency section of the TWT, where an inherent localized heating problem existed and where failures were observed during an earlier cold compression, or 'coining,' fabrication step; the coining technique nonetheless shows great potential for future TWT development efforts. For this analysis, a three-dimensional finite element model was used along with MARC, a commercially available finite element code, to simulate the fabrication of a diamond-rod TWT. This analysis was conducted by using component and material specifications consistent with actual TWT fabrication and was verified against empirical data. The analysis is nonlinear owing to material plasticity introduced by the forming process and also to geometric nonlinearities presented by the component assembly configuration. The computer model was developed by using the high-efficiency, K-band TWT design but is general enough to permit similar analyses to be performed on a wide variety of TWT designs and styles. The results of the TWT operating condition and structural failure mode analysis, as well as a comparison of analytical results to test data, are presented.

  4. Finding Waldo: Learning about Users from their Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Eli T.; Ottley, Alvitta; Zhao, Helen

    Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user’s interactions with a system reflect a large amount of the user’s reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user’s task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and we apply well-known machine learning algorithms to three encodings of the users’ interaction data. We achieve, depending on algorithm and encoding, between 62% and 96% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user’s personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time; in some cases, 82% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
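
    The classification step can be sketched in miniature. The paper applies well-known machine learning algorithms to encoded interaction data; the nearest-centroid classifier below is a simple stand-in (not necessarily one of the paper's algorithms), and the interaction-rate features and training values are hypothetical.

```python
import math

def centroid(rows):
    """Component-wise mean of a list of feature vectors."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def nearest_centroid_predict(train, labels, x):
    """Classify x by the closest class centroid (Euclidean distance)."""
    classes = sorted(set(labels))
    cents = {c: centroid([r for r, l in zip(train, labels) if l == c])
             for c in classes}
    return min(classes, key=lambda c: math.dist(x, cents[c]))

# Hypothetical encoding: [clicks/min, pans/min, hovers/min] per user.
train = [[30, 5, 40], [28, 6, 35], [10, 2, 90], [12, 3, 85]]
labels = ["fast", "fast", "slow", "slow"]
pred = nearest_centroid_predict(train, labels, [29, 5, 38])
```

    In practice one would cross-validate such a model over many users and richer encodings, as the experiment described above does.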

  5. Low-picomolar, label-free procalcitonin analytical detection with an electrolyte-gated organic field-effect transistor based electronic immunosensor.

    PubMed

    Seshadri, Preethi; Manoli, Kyriaki; Schneiderhan-Marra, Nicole; Anthes, Uwe; Wierzchowiec, Piotr; Bonrad, Klaus; Di Franco, Cinzia; Torsi, Luisa

    2018-05-01

    Herein a label-free immunosensor based on an electrolyte-gated organic field-effect transistor (EGOFET) was developed for the detection of procalcitonin (PCT), a sepsis marker. Antibodies specific to PCT were immobilized on the poly-3-hexylthiophene (P3HT) organic semiconductor surface through direct physical adsorption, followed by a post-treatment with bovine serum albumin (BSA), which served as the blocking agent to prevent non-specific adsorption. The antibodies together with BSA (forming the whole biorecognition layer) served to selectively capture the procalcitonin target analyte. The entire immunosensor fabrication process was fast, requiring 45 min overall before analyte sensing. The EGOFET immunosensor showed excellent electrical properties, comparable to those of the bare P3HT-based EGOFET, confirming reliable biosensing with the bio-functionalized EGOFET immunosensor. The detection limit of the immunosensor was as low as 2.2 pM, within a range of clinical relevance. The relative standard deviation of the individual calibration data points, measured on immunosensors fabricated on different chips (reproducibility error), was below 7%. The developed immunosensor showed high selectivity to the PCT analyte, as evident from control experiments. This report of PCT detection is the first of its kind among electronic sensors based on EGOFETs. The developed sensor is versatile and compatible with low-cost fabrication techniques. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Analytical procedure for the determination of Ethyl Lauroyl Arginate (LAE) to assess the kinetics and specific migration from a new antimicrobial active food packaging.

    PubMed

    Pezo, Davinson; Navascués, Beatriz; Salafranca, Jesús; Nerín, Cristina

    2012-10-01

    Ethyl Lauroyl Arginate (LAE) is a cationic tensoactive compound, soluble in water, with a wide activity spectrum against moulds and bacteria. LAE has been incorporated as an antimicrobial agent into packaging materials for food contact, and these materials are required to comply with the specific migration criteria. In this paper, an analytical procedure has been developed and optimized for the analysis of LAE in food simulants after the migration tests. It consists of the formation of an ion pair between LAE and the inorganic complex Co(SCN)(4)(2-) in aqueous solution, followed by liquid-liquid extraction into a suitable organic solvent and subsequent UV-Vis absorbance measurement. In order to evaluate possible interferences, the ion pair was also analyzed by high performance liquid chromatography with UV-Vis detection. Both procedures provided similar analytical characteristics, with linear ranges from 1.10 to 25.00 mg kg(-1), linearity higher than 0.9886, limits of detection and quantification of 0.33 and 1.10 mg kg(-1), respectively, accuracy better than 1% as relative error, and precision better than 3.6% expressed as RSD. Optimization of the analytical techniques, the thermal and chemical stability of LAE, and the migration kinetics of LAE from experimental active packaging are reported and discussed. Copyright © 2012 Elsevier B.V. All rights reserved.
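
    Detection and quantification limits like those reported above are commonly derived from the calibration line via the 3.3s/slope and 10s/slope convention (s being the residual standard deviation of the fit). A sketch of that computation follows; the calibration data are hypothetical, not the paper's.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

def lod_loq(x, y):
    """LOD = 3.3*s/slope and LOQ = 10*s/slope, with s the residual
    standard deviation of the calibration line."""
    slope, intercept = linear_fit(x, y)
    residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    s = (sum(r * r for r in residuals) / (len(x) - 2)) ** 0.5
    return 3.3 * s / slope, 10 * s / slope

# Hypothetical calibration: absorbance vs. LAE concentration (mg/kg).
conc = [1.1, 5.0, 10.0, 15.0, 20.0, 25.0]
absorbance = [0.052, 0.210, 0.415, 0.618, 0.822, 1.030]
lod, loq = lod_loq(conc, absorbance)
```

    By construction LOQ/LOD = 10/3.3, matching the ratio between the 1.10 and 0.33 mg kg(-1) figures quoted in the abstract.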

  7. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, with both successful and unsuccessful validations of analytical techniques highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both the carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations among reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  8. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, with both successful and unsuccessful validations of analytical techniques highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both the carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations among reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  9. Experimental and analytical determination of stability parameters for a balloon tethered in a wind

    NASA Technical Reports Server (NTRS)

    Redd, L. T.; Bennett, R. M.; Bland, S. R.

    1973-01-01

    Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.

  10. Plasmonic crystal based solid substrate for biomedical application of SERS

    NASA Astrophysics Data System (ADS)

    Morasso, Carlo F.; Mehn, Dora; Picciolini, Silvia; Vanna, Renzo; Bedoni, Marzia; Gramatica, Furio; Pellacani, Paola; Frangolho, Ana; Marchesini, Gerardo; Valsesia, Andrea

    2014-02-01

    Surface Enhanced Raman Spectroscopy (SERS) is a powerful analytical technique that combines the excellent chemical specificity of Raman spectroscopy with the good sensitivity provided by the enhancement of the signal observed when a molecule is located on (or very close to) the surface of suitable nanostructured metallic materials. The availability of cheap, reliable and easy-to-use SERS substrates would pave the road to the development of bioanalytical tests that can be used in clinical practice. SERS, in fact, is expected to provide not only higher sensitivity and specificity, but also the simultaneous and markedly improved detection of several targets at the same time, with higher speed compared to conventional analytical methods. Here, we present the SERS activity of 2-D plasmonic crystals made of polymeric pillars embedded in a gold matrix, obtained through the combination of soft-lithography and plasma deposition techniques on a transparent substrate. The use of a transparent support material allowed us to perform SERS detection from the support side, opening the possibility of using these substrates in combination with microfluidic devices. In order to demonstrate the potential for bioanalytical applications, we used our SERS-active gold surface to detect the oxidation product of apomorphine, a well-known drug molecule used in Parkinson's disease that has proven difficult to study by traditional HPLC-based approaches.

  11. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  12. Quantification of rapid environmental redox processes with quick-scanning x-ray absorption spectroscopy (Q-XAS)

    PubMed Central

    Ginder-Vogel, Matthew; Landrot, Gautier; Fischel, Jason S.; Sparks, Donald L.

    2009-01-01

    Quantification of the initial rates of environmental reactions at the mineral/water interface is a fundamental prerequisite to determining reaction mechanisms, modeling contaminant transport, and predicting environmental risk. Until recently, experimental techniques with adequate time resolution and elemental sensitivity to measure the initial rates of the wide variety of environmental reactions were quite limited. Techniques such as electron paramagnetic resonance and Fourier transform infrared spectroscopies suffer from limited elemental specificity and poor sensitivity to inorganic elements, respectively. Ex situ analysis of batch and stirred-flow systems provides high elemental sensitivity; however, its time resolution is inadequate to characterize rapid environmental reactions. Here we apply quick-scanning x-ray absorption spectroscopy (Q-XAS), at sub-second time scales, to measure the initial oxidation rate of As(III) to As(V) by hydrous manganese(IV) oxide. Using Q-XAS, As(III) and As(V) concentrations were determined every 0.98 s in batch reactions. The initial apparent As(III) depletion rate constants (t < 30 s) measured with Q-XAS are nearly twice as large as rate constants measured with traditional analytical techniques. Our results demonstrate the importance of developing analytical techniques capable of analyzing environmental reactions on the same time scale as they occur. Given the high sensitivity, elemental specificity, and time resolution of Q-XAS, it has many potential applications. These could include measuring not only redox reactions but also dissolution/precipitation reactions, such as the formation and/or reductive dissolution of Fe(III) (hydr)oxides, solid-phase transformations (i.e., formation of layered-double hydroxide minerals), or almost any other reaction occurring in aqueous media that can be measured using x-ray absorption spectroscopy. PMID:19805269
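
    Initial apparent depletion rate constants like those described above are typically extracted by fitting a pseudo-first-order model, ln C(t) = ln C0 - kt, to the short-time concentration trace. A sketch under that assumption follows; the As(III) trace is synthetic (sampled at the 0.98 s Q-XAS frame time), and the rate constant and concentrations are illustrative.

```python
import math

def first_order_rate_constant(times, concentrations):
    """Least-squares slope of ln(C) vs. t; returns the apparent k (s^-1)."""
    logs = [math.log(c) for c in concentrations]
    n = len(times)
    mt, ml = sum(times) / n, sum(logs) / n
    slope = sum((t - mt) * (l - ml) for t, l in zip(times, logs)) / \
            sum((t - mt) ** 2 for t in times)
    return -slope

# Synthetic As(III) trace: one frame every 0.98 s over the first ~30 s,
# decaying from 100 uM with an illustrative true k of 0.05 s^-1.
k_true = 0.05
times = [0.98 * i for i in range(31)]
conc = [100.0 * math.exp(-k_true * t) for t in times]
k_fit = first_order_rate_constant(times, conc)
```

    With real data the fit window matters: restricting it to the first ~30 s, as the study does, isolates the initial rate before secondary processes slow the reaction.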

  13. Mixing of two co-directional Rayleigh surface waves in a nonlinear elastic material.

    PubMed

    Morlock, Merlin B; Kim, Jin-Yeon; Jacobs, Laurence J; Qu, Jianmin

    2015-01-01

    The mixing of two co-directional, initially monochromatic Rayleigh surface waves in an isotropic, homogeneous, and nonlinear elastic solid is investigated using analytical, finite element method, and experimental approaches. The analytical investigations show that while the horizontal velocity component can form a shock wave, the vertical velocity component can form a pulse independent of the specific ratios of the fundamental frequencies and amplitudes that are mixed. This analytical model is then used to simulate the development of the fundamentals, second harmonics, and the sum and difference frequency components over the propagation distance. The analytical model is further extended to include diffraction effects in the parabolic approximation. Finally, the frequency and amplitude ratios of the fundamentals are identified which provide maximum amplitudes of the second harmonics as well as of the sum and difference frequency components, to help guide effective material characterization; this approach should make it possible to measure the acoustic nonlinearity of a solid not only with the second harmonics, but also with the sum and difference frequency components. Results of the analytical investigations are then confirmed using the finite element method and the experimental feasibility of the proposed technique is validated for an aluminum specimen.
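
    The sum- and difference-frequency generation described above can be illustrated numerically: a weak quadratic nonlinearity applied to two mixed sinusoids produces spectral components at f1 + f2 and f2 - f1. The frequencies and the nonlinearity coefficient below are illustrative, not values from the paper.

```python
import math

f1, f2 = 5.0, 8.0     # fundamental frequencies (Hz), illustrative
beta = 0.1            # quadratic nonlinearity coefficient (assumed)
N, fs = 1000, 1000.0  # one second of samples at 1 kHz

u = [math.sin(2 * math.pi * f1 * n / fs) + math.sin(2 * math.pi * f2 * n / fs)
     for n in range(N)]
y = [x + beta * x * x for x in u]  # weakly nonlinear distortion of the sum

def amplitude(sig, f):
    """Single-bin DFT magnitude: amplitude of the sinusoid at frequency f."""
    re = sum(s * math.cos(2 * math.pi * f * n / fs) for n, s in enumerate(sig))
    im = sum(s * math.sin(2 * math.pi * f * n / fs) for n, s in enumerate(sig))
    return 2.0 * math.hypot(re, im) / len(sig)

# the quadratic term 2*beta*sin(a)*sin(b) = beta*[cos(a-b) - cos(a+b)]
# puts amplitude beta at both the sum and the difference frequencies
a_sum = amplitude(y, f1 + f2)
a_diff = amplitude(y, f2 - f1)
```

Both mixing products come out with amplitude beta, which is the basic observable exploited for nonlinearity characterization.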

  14. Recent Advances in Bioprinting and Applications for Biosensing

    PubMed Central

    Dias, Andrew D.; Kingsley, David M.; Corr, David T.

    2014-01-01

    Future biosensing applications will require high performance, including real-time monitoring of physiological events, incorporation of biosensors into feedback-based devices, detection of toxins, and advanced diagnostics. Such functionality will necessitate biosensors with increased sensitivity, specificity, and throughput, as well as the ability to simultaneously detect multiple analytes. While these demands have yet to be fully realized, recent advances in biofabrication may allow sensors to achieve the high spatial sensitivity required, and bring us closer to achieving devices with these capabilities. To this end, we review recent advances in biofabrication techniques that may enable cutting-edge biosensors. In particular, we focus on bioprinting techniques (e.g., microcontact printing, inkjet printing, and laser direct-write) that may prove pivotal to biosensor fabrication and scaling. Recent biosensors have employed these fabrication techniques with success, and further development may enable higher performance, including multiplexing multiple analytes or cell types within a single biosensor. We also review recent advances in 3D bioprinting, and explore their potential to create biosensors with live cells encapsulated in 3D microenvironments. Such advances in biofabrication will expand biosensor utility and availability, with impact realized in many interdisciplinary fields, as well as in the clinic. PMID:25587413

  15. Fast alternative Monte Carlo formalism for a class of problems in biophotonics

    NASA Astrophysics Data System (ADS)

    Miller, Steven D.

    1997-12-01

    A practical and effective, alternative Monte Carlo formalism is presented that rapidly finds flux solutions to the radiative transport equation for a class of problems in biophotonics; namely, wide-beam irradiance of finite, optically anisotropic homogeneous or heterogeneous biomedias, which both strongly scatter and absorb light. Such biomedias include liver, tumors, blood, or highly blood perfused tissues. As Fermat rays comprising a wide coherent (laser) beam enter the tissue, they evolve into a bundle of random optical paths or trajectories due to scattering. Overall, this can be physically interpreted as a bundle of Markov trajectories traced out by a 'gas' of Brownian-like point photons being successively scattered and absorbed. By considering the cumulative flow of a statistical bundle of trajectories through interior data planes, the effective equivalent information of the (generally unknown) analytical flux solutions of the transfer equation rapidly emerges. Unlike the standard Monte Carlo techniques, which evaluate scalar fluence, this technique is faster, more efficient, and simpler to apply for this specific class of optical situations. Other analytical or numerical techniques can either become unwieldy or lack viability or are simply more difficult to apply. Illustrative flux calculations are presented for liver, blood, and tissue-tumor-tissue systems.
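
    A minimal sketch of the Markov-trajectory picture described above: a 1D random walk of point photons through a slab that both scatters and absorbs. This is a toy model with isotropic 1D scattering and invented coefficients, not the author's formalism; its one virtue is that with scattering switched off it reduces to Beer-Lambert attenuation, which makes it easy to sanity-check.

```python
import math
import random

def transmit_fraction(mu_a, mu_s, thickness, n_photons=20000, seed=1):
    """Fraction of photons crossing a 1D slab with absorption coefficient
    mu_a and scattering coefficient mu_s (each event flips direction
    with probability 1/2)."""
    rng = random.Random(seed)
    mu_t = mu_a + mu_s
    transmitted = 0
    for _ in range(n_photons):
        x, direction = 0.0, 1
        alive = True
        while alive:
            x += direction * (-math.log(rng.random()) / mu_t)  # free path
            if x >= thickness:
                transmitted += 1
                alive = False
            elif x <= 0.0:
                alive = False  # escaped backwards
            elif rng.random() < mu_a / mu_t:
                alive = False  # absorbed at the interaction site
            else:
                direction = 1 if rng.random() < 0.5 else -1  # scattered
    return transmitted / n_photons

# with no scattering this reduces to Beer-Lambert: exp(-mu_a * L) ~ 0.368
t = transmit_fraction(mu_a=1.0, mu_s=0.0, thickness=1.0)
```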

  16. Considerations for standardizing predictive molecular pathology for cancer prognosis.

    PubMed

    Fiorentino, Michelangelo; Scarpelli, Marina; Lopez-Beltran, Antonio; Cheng, Liang; Montironi, Rodolfo

    2017-01-01

    Molecular tests that were once ancillary to the core business of cyto-histopathology are becoming the most relevant workload in pathology departments after histopathology/cytopathology and before autopsies. This has resulted from innovations in molecular biology techniques, which have developed at an incredibly fast pace. Areas covered: Most of the currently widespread techniques in molecular pathology, such as FISH, direct sequencing, pyrosequencing, and allele-specific PCR, will be replaced by massively parallel sequencing, which will no longer be considered next-generation but simply current-generation sequencing. The pre-analytical steps of molecular techniques, such as DNA extraction or sample preparation, will be largely automated. Moreover, all the molecular pathology instruments will be part of an integrated workflow that traces the sample from extraction through the analytical steps until the results are reported; these steps will be guided by expert laboratory information systems. Quantitative in situ hybridization and immunohistochemistry will be largely digitized, just as histology will be viewed mostly in digital form rather than under the microscope. Expert commentary: This review summarizes the technical and regulatory issues concerning the standardization of molecular tests in pathology. An outlook on future technological changes is also provided.

  17. PHARMACEUTICAL AND BIOMEDICAL APPLICATIONS OF AFFINITY CHROMATOGRAPHY: RECENT TRENDS AND DEVELOPMENTS

    PubMed Central

    Hage, David S.; Anguizola, Jeanethe A.; Bi, Cong; Li, Rong; Matsuda, Ryan; Papastavros, Efthimia; Pfaunmiller, Erika; Vargas, John; Zheng, Xiwei

    2012-01-01

    Affinity chromatography is a separation technique that has become increasingly important in work with biological samples and pharmaceutical agents. This method is based on the use of a biologically-related agent as a stationary phase to selectively retain analytes or to study biological interactions. This review discusses the basic principles behind affinity chromatography and examines recent developments that have occurred in the use of this method for biomedical and pharmaceutical analysis. Techniques based on traditional affinity supports are discussed, but an emphasis is placed on methods in which affinity columns are used as part of HPLC systems or in combination with other analytical methods. General formats for affinity chromatography that are considered include step elution schemes, weak affinity chromatography, affinity extraction and affinity depletion. Specific separation techniques that are examined include lectin affinity chromatography, boronate affinity chromatography, immunoaffinity chromatography, and immobilized metal ion affinity chromatography. Approaches for the study of biological interactions by affinity chromatography are also presented, such as the measurement of equilibrium constants, rate constants, or competition and displacement effects. In addition, related developments in the use of immobilized enzyme reactors, molecularly imprinted polymers, dye ligands and aptamers are briefly considered. PMID:22305083

  18. Exaggerated heart rate oscillations during two meditation techniques.

    PubMed

    Peng, C K; Mietus, J E; Liu, Y; Khalsa, G; Douglas, P S; Benson, H; Goldberger, A L

    1999-07-31

    We report extremely prominent heart rate oscillations associated with slow breathing during specific traditional forms of Chinese Chi and Kundalini Yoga meditation techniques in healthy young adults. We applied both spectral analysis and a novel analytic technique based on the Hilbert transform to quantify these heart rate dynamics. The amplitude of these oscillations during meditation was significantly greater than in the pre-meditation control state and also in three non-meditation control groups: i) elite athletes during sleep, ii) healthy young adults during metronomic breathing, and iii) healthy young adults during spontaneous nocturnal breathing. This finding, along with the marked variability of the beat-to-beat heart rate dynamics during such profound meditative states, challenges the notion of meditation as only an autonomically quiescent state.

  19. Uranium determination in natural water by the fission-track technique

    USGS Publications Warehouse

    Reimer, G.M.

    1975-01-01

    The fission track technique, utilizing the neutron-induced fission of uranium-235, provides a versatile analytical method for the routine analysis of uranium in liquid samples of natural water. A detector is immersed in the sample and both are irradiated. The fission track density observed in the detector is directly proportional to the uranium concentration. The specific advantages of this technique are: (1) only a small quantity of sample, typically 0.1-1 ml, is needed; (2) no sample concentration is necessary; (3) it is capable of providing analyses with a lower reporting limit of 1 μg per liter; and (4) the actual time spent on an analysis can be only a few minutes. This paper discusses and describes the method. © 1975.
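
    Because track density is directly proportional to uranium concentration, an unknown can be read off a standard irradiated alongside it. The numbers below are hypothetical, purely to show the proportionality.

```python
def uranium_conc(track_density, std_density, std_conc):
    """Fission-track density is directly proportional to uranium
    concentration, so an unknown sample is quantified against a
    co-irradiated standard of known concentration."""
    return std_conc * track_density / std_density

# hypothetical calibration: a 10 ug/L standard yields 500 tracks/mm^2;
# an unknown showing 125 tracks/mm^2 then contains 2.5 ug/L
c = uranium_conc(track_density=125.0, std_density=500.0, std_conc=10.0)
```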

  20. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  1. TomoPhantom, a software package to generate 2D-4D analytical phantoms for CT image reconstruction algorithm benchmarks

    NASA Astrophysics Data System (ADS)

    Kazantsev, Daniil; Pickalov, Valery; Nagella, Srikanth; Pasca, Edoardo; Withers, Philip J.

    2018-01-01

    In the field of computerized tomographic imaging, many novel reconstruction techniques are routinely tested using simplistic numerical phantoms, e.g. the well-known Shepp-Logan phantom. These phantoms cannot sufficiently cover the broad spectrum of applications in CT imaging where, for instance, smooth or piecewise-smooth 3D objects are common. TomoPhantom provides quick access to an external library of modular analytical 2D/3D phantoms with temporal extensions. In TomoPhantom, quite complex phantoms can be built using additive combinations of geometrical objects, such as Gaussians, parabolas, cones, ellipses, rectangles, and volumetric extensions of them. Newly designed phantoms are better suited for benchmarking and testing of different image processing techniques. Specifically, tomographic reconstruction algorithms which employ 2D and 3D scanning geometries can be rigorously analyzed using the software. TomoPhantom also provides the capability of obtaining analytical tomographic projections, which further extends the applicability of the software towards more realistic testing that is free from the "inverse crime". All core modules of the package are written in the C-OpenMP language and wrappers for Python and MATLAB are provided to enable easy access. Due to the C-based multi-threaded implementation, volumetric phantoms of high spatial resolution can be obtained with computational efficiency.
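
    The additive construction described above can be mimicked in a few lines: a phantom is the pointwise sum of analytical objects sampled on a grid. The helper names and parameters below are illustrative only and are not TomoPhantom's actual API (which is a C library with Python/MATLAB wrappers).

```python
import math

def gaussian(cx, cy, sx, sy, a):
    """Analytical 2D Gaussian object with centre, widths, and amplitude."""
    return lambda x, y: a * math.exp(-((x - cx) ** 2 / (2 * sx * sx)
                                       + (y - cy) ** 2 / (2 * sy * sy)))

def ellipse(cx, cy, rx, ry, a):
    """Uniform ellipse: amplitude a inside, zero outside."""
    return lambda x, y: a if ((x - cx) / rx) ** 2 + ((y - cy) / ry) ** 2 <= 1.0 else 0.0

def build_phantom(objects, n=64):
    """Sample an additive combination of analytical objects on an
    n x n grid over [-1, 1]^2 (pixel-centre sampling)."""
    coords = [-1.0 + 2.0 * (i + 0.5) / n for i in range(n)]
    return [[sum(obj(x, y) for obj in objects) for x in coords] for y in coords]

phantom = build_phantom([ellipse(0.0, 0.0, 0.8, 0.6, 1.0),
                         gaussian(0.3, -0.2, 0.15, 0.15, 0.5)])
```

Because every object is analytical, the same description can in principle be integrated along rays to give exact projections, which is what avoids the "inverse crime" of reconstructing from data generated by the same discretization.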

  2. Heat capacity measurements of sub-nanoliter volumes of liquids using bimaterial microchannel cantilevers

    NASA Astrophysics Data System (ADS)

    Khan, M. F.; Miriyala, N.; Lee, J.; Hassanpourfard, M.; Kumar, A.; Thundat, T.

    2016-05-01

    Lab-on-a-Chip compatible techniques for thermal characterization of miniaturized volumes of liquid analytes are necessary in applications such as protein blotting, DNA melting, and drug development, where samples are either rare or volume-limited. We developed a closed-chamber calorimeter based on a bimaterial microchannel cantilever (BMC) for sub-nanoliter level thermal analysis. When the liquid-filled BMC is irradiated with infrared (IR) light at a specific wavelength, the IR absorption by the liquid analyte results in localized heat generation and the subsequent deflection of the BMC, due to a thermal expansion mismatch between the constituent materials. The time constant of the deflection, which is dependent upon the heat capacity of the liquid analyte, can be directly measured by recording the time-dependent bending of the BMC. We have used the BMC to quantitatively measure the heat capacity of five volatile organic compounds. With a deflection noise level of ~10 nm and a signal-to-noise ratio of 68:1, the BMC offers a sensitivity of 30.5 ms/(J g⁻¹ K⁻¹) and a resolution of 23 mJ/(g K) for ~150 pl of liquid for heat capacity measurements. This technique can be used for small-scale thermal characterization of different chemical and biological samples.
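
    Given the reported sensitivity of 30.5 ms per (J g⁻¹ K⁻¹), a measured deflection time constant maps linearly to heat capacity. The sketch below assumes that linear relation holds relative to a reference liquid; the function name, reference values, and time constants are hypothetical, not figures from the paper.

```python
def heat_capacity(tau_ms, tau_ref_ms, cp_ref, sensitivity=30.5):
    """Convert a BMC deflection time constant (ms) to specific heat
    capacity (J g^-1 K^-1), assuming the stated linear sensitivity:
    delta_tau = sensitivity * delta_cp relative to a reference liquid."""
    return cp_ref + (tau_ms - tau_ref_ms) / sensitivity

# hypothetical reference liquid: cp = 2.0 J/(g K) at tau = 100.0 ms;
# a sample with tau = 115.25 ms is then 15.25/30.5 = 0.5 J/(g K) higher
cp = heat_capacity(tau_ms=115.25, tau_ref_ms=100.0, cp_ref=2.0)
```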

  3. Data mining to support simulation modeling of patient flow in hospitals.

    PubMed

    Isken, Mark W; Rajagopalan, Balaji

    2002-04-01

    Spiraling health care costs in the United States are driving institutions to continually address the challenge of optimizing the use of scarce resources. One of the first steps towards optimizing resources is to utilize capacity effectively. For hospital capacity planning problems such as allocation of inpatient beds, computer simulation is often the method of choice. One of the more difficult aspects of using simulation models for such studies is the creation of a manageable set of patient types to include in the model. The objective of this paper is to demonstrate the potential of using data mining techniques, specifically clustering techniques such as K-means, to help guide the development of patient type definitions for purposes of building computer simulation or analytical models of patient flow in hospitals. Using data from a hospital in the Midwest this study brings forth several important issues that researchers need to address when applying clustering techniques in general and specifically to hospital data.
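
    The clustering step the paper advocates can be conveyed with a minimal K-means sketch on one-dimensional data. The length-of-stay values are invented; real patient records would be multi-dimensional and typically handled with a library implementation rather than this bare-bones version.

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal 1-D K-means: alternate assigning points to the nearest
    centre and recomputing each centre as its cluster mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: abs(p - centers[i]))].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# hypothetical lengths of stay (days): a short-stay and a long-stay group
los = [1, 2, 2, 3, 1, 2, 9, 10, 11, 9, 12, 10]
centers = kmeans(los, k=2)  # cluster centres define two patient types
```

Each resulting centre characterizes one patient type, which is exactly the kind of manageable typology a simulation model of patient flow needs as input.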

  4. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
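
    The simplest analytical application of the Monte Carlo idea, and a good illustration of the category, is integration by random sampling: the mean of a function at uniform random points, scaled by the interval width, estimates its integral. The example integral is arbitrary.

```python
import random

def mc_integrate(f, a, b, n=100000, seed=42):
    """Plain Monte Carlo estimate of the integral of f over [a, b]."""
    rng = random.Random(seed)
    return (b - a) * sum(f(rng.uniform(a, b)) for _ in range(n)) / n

# integral of x^2 over [0, 1]; the exact value is 1/3
est = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

The statistical error shrinks as 1/sqrt(n) regardless of dimension, which is why the same scheme underlies the sampling, ensemble, and annealing applications the article surveys.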

  5. Thermoelectrically cooled water trap

    DOEpatents

    Micheels, Ronald H [Concord, MA

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples, prior to analysis of these samples for chemical composition, by a variety of analytical techniques where water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas monitoring instrumentation.

  6. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
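
    One standard building block of secure multi-party computation, additive secret sharing, is enough to convey how an aggregate can be computed without any party revealing its input. This is a generic textbook sketch, not the protocol used in the pilots; the hospital counts and modulus are invented for illustration.

```python
import random

PRIME = 2 ** 61 - 1  # field modulus (illustrative choice)
rng = random.Random(7)

def share(secret, n=3):
    """Split a value into n additive shares that sum to it mod PRIME;
    any n-1 shares together reveal nothing about the value."""
    parts = [rng.randrange(PRIME) for _ in range(n - 1)]
    parts.append((secret - sum(parts)) % PRIME)
    return parts

# each hypothetical hospital secret-shares its patient count;
# every party sums the shares it holds, and only the total is revealed
counts = [120, 75, 230]
shares = [share(c) for c in counts]
per_party = [sum(col) % PRIME for col in zip(*shares)]
total = sum(per_party) % PRIME  # equals 425 without exposing any input
```

Real deployments add authentication and multiplication protocols on top of this additive core, but the privacy argument, that each share is uniformly random on its own, is the same.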

  7. Mechanical properties of additively manufactured octagonal honeycombs.

    PubMed

    Hedayati, R; Sadighi, M; Mohammadi-Aghdam, M; Zadpoor, A A

    2016-12-01

    Honeycomb structures have found numerous applications as structural and biomedical materials due to their favourable properties such as low weight, high stiffness, and porosity. Application of additive manufacturing and 3D printing techniques allows for manufacturing of honeycombs with arbitrary shape and wall thickness, opening the way for optimizing the mechanical and physical properties for specific applications. In this study, the mechanical properties of honeycomb structures with a new geometry, called octagonal honeycomb, were investigated using analytical, numerical, and experimental approaches. An additive manufacturing technique, namely fused deposition modelling, was used to fabricate the honeycomb from polylactic acid (PLA). The honeycomb structures were then mechanically tested under compression and the mechanical properties of the structures were determined. In addition, the Euler-Bernoulli and Timoshenko beam theories were used for deriving analytical relationships for elastic modulus, yield stress, Poisson's ratio, and buckling stress of this new design of honeycomb structures. Finite element models were also created to analyse the mechanical behaviour of the honeycombs computationally. The analytical solutions obtained using Timoshenko beam theory were close to computational results in terms of elastic modulus, Poisson's ratio and yield stress, especially for relative densities smaller than 25%. The analytical solutions based on Timoshenko beam theory and the computational results were in good agreement with experimental observations. Finally, the elastic properties of the proposed honeycomb structure were compared to those of other honeycomb structures such as square, triangular, hexagonal, mixed, diamond, and Kagome. The octagonal honeycomb showed yield stress and elastic modulus values very close to those of regular hexagonal honeycombs and lower than the other considered honeycombs. Copyright © 2016 Elsevier B.V. All rights reserved.
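
    The paper's octagonal-honeycomb relations are not reproduced here. As a reference point for the beam-theory approach it describes, the classical Euler-Bernoulli (Gibson-Ashby) expression for the in-plane Young's modulus of a hexagonal honeycomb, the geometry the octagonal design is compared against, can be coded directly; the numerical inputs are illustrative.

```python
import math

def hex_honeycomb_modulus(E_s, t, l, h=None, theta_deg=30.0):
    """In-plane Young's modulus of a hexagonal honeycomb from the
    classical Gibson-Ashby bending result:
    E* = E_s (t/l)^3 cos(theta) / ((h/l + sin(theta)) sin^2(theta)),
    where t is wall thickness, l the inclined wall length, h the
    vertical wall length, and theta the wall angle."""
    h = l if h is None else h
    th = math.radians(theta_deg)
    return (E_s * (t / l) ** 3 * math.cos(th)
            / ((h / l + math.sin(th)) * math.sin(th) ** 2))

# regular hexagon (h = l, theta = 30 deg): E*/E_s ~ 2.3 (t/l)^3
ratio = hex_honeycomb_modulus(E_s=1.0, t=0.1, l=1.0)
```

The cubic dependence on t/l is the signature of wall bending; Timoshenko corrections, as the abstract notes, matter mainly at higher relative densities where shear deformation of the walls becomes significant.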

  8. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    PubMed

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  9. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements have been made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers have independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
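
    The three performance measures named above are straightforward to compute from paired daily records. The discharge values below are invented solely to exercise the calculation.

```python
import statistics

def performance(baseline, estimated):
    """The three comparison measures used in the study: average estimated
    discharge over the ice-affected period, and the mean and standard
    deviation of the daily errors (estimated minus baseline)."""
    errors = [e - b for b, e in zip(baseline, estimated)]
    return (sum(estimated) / len(estimated),
            statistics.mean(errors),
            statistics.stdev(errors))

# hypothetical daily discharges (cubic feet per second)
baseline = [10.0, 12.0, 11.0, 9.0, 8.0]
estimated = [11.0, 11.5, 10.0, 9.5, 8.5]
avg_q, mean_err, sd_err = performance(baseline, estimated)
```

The mean error captures bias while the standard deviation captures consistency, which is exactly the distinction the Kruskal-Wallis comparison of hydrographers turns on.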

  10. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    PubMed

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride--PVC) is used. 8 phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system with electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight a satisfactory system conformity (resolution > 1.5), a common quantification limit at 0.25 ng injected, an acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹ as well as a precision and an accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after a dilution in ethanol whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices. Copyright © 2012 Elsevier B.V. All rights reserved.
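
    External calibration with an internal standard, as used above, reduces to converting an analyte/IS peak-area ratio to a concentration through a calibration line. The peak areas and calibration coefficients below are hypothetical, not the study's data.

```python
def quantify(area_analyte, area_istd, slope, intercept, dilution=1.0):
    """Internal-standard quantification: the analyte/IS peak-area ratio
    is mapped to concentration via a calibration line fitted on standard
    solutions, ratio = slope * conc + intercept."""
    ratio = area_analyte / area_istd
    return (ratio - intercept) / slope * dilution

# hypothetical GC/MS peak areas and calibration coefficients
conc = quantify(area_analyte=8.4e5, area_istd=4.2e5, slope=0.8, intercept=0.0)
```

Normalizing to the internal standard cancels injection-volume and detector drift, which is why the ratio, not the raw area, is calibrated.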

  11. Present and future applications of NMR to medicine and materials science

    NASA Astrophysics Data System (ADS)

    Morris, Peter

    1992-06-01

    The phenomenon of nuclear magnetic resonance (NMR) was first observed in the period immediately following the Second World War by two American physicists working independently: Bloch at Stanford and Purcell at Harvard. Their observations were reported in 1946 in the same volume of Physical Review and led to the joint award of the 1952 Nobel Prize for Physics. Once the details of the interaction had been worked out, and the chemical specificity had been appreciated, a period of instrumental refinement followed before NMR took its place as arguably the most powerful analytical technique available to the organic chemist. The historical development of NMR and the basis of its analytical power are described in the companion article by Dr. J. Feeney.

  12. Analytical methods in multivariate highway safety exposure data estimation

    DOT National Transportation Integrated Search

    1984-01-01

    Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting, and the expectation maximization...
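
    Of the techniques named, iterative proportional fitting is the most compact to sketch: a seed table is alternately rescaled along rows and columns until its margins match known targets. The table and margins below are invented.

```python
def ipf(table, row_targets, col_targets, iters=100):
    """Iterative proportional fitting: alternately rescale the rows and
    columns of a seed contingency table until its margins match the
    target margins (which must have equal totals)."""
    t = [row[:] for row in table]
    for _ in range(iters):
        for i, target in enumerate(row_targets):       # row step
            s = sum(t[i])
            t[i] = [v * target / s for v in t[i]]
        for j, target in enumerate(col_targets):       # column step
            s = sum(row[j] for row in t)
            for row in t:
                row[j] *= target / s
    return t

# extend a seed exposure table to known margins (illustrative numbers)
fitted = ipf([[1.0, 2.0], [3.0, 4.0]],
             row_targets=[30.0, 70.0], col_targets=[40.0, 60.0])
```

The fitted cells preserve the seed table's interaction structure (odds ratios) while agreeing with the new margins, which is what makes IPF useful for combining exposure data sources.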

  13. Medical Applications at CERN and the ENLIGHT Network

    PubMed Central

    Dosanjh, Manjit; Cirilli, Manuela; Myers, Steve; Navin, Sparsh

    2016-01-01

    State-of-the-art techniques derived from particle accelerators, detectors, and physics computing are routinely used in clinical practice and medical research centers: from imaging technologies to dedicated accelerators for cancer therapy and nuclear medicine, simulations, and data analytics. Principles of particle physics themselves are the foundation of a cutting edge radiotherapy technique for cancer treatment: hadron therapy. This article is an overview of the involvement of CERN, the European Organization for Nuclear Research, in medical applications, with specific focus on hadron therapy. It also presents the history, achievements, and future scientific goals of the European Network for Light Ion Hadron Therapy, whose co-ordination office is at CERN. PMID:26835422

  14. Medical Applications at CERN and the ENLIGHT Network.

    PubMed

    Dosanjh, Manjit; Cirilli, Manuela; Myers, Steve; Navin, Sparsh

    2016-01-01

    State-of-the-art techniques derived from particle accelerators, detectors, and physics computing are routinely used in clinical practice and medical research centers: from imaging technologies to dedicated accelerators for cancer therapy and nuclear medicine, simulations, and data analytics. Principles of particle physics themselves are the foundation of a cutting edge radiotherapy technique for cancer treatment: hadron therapy. This article is an overview of the involvement of CERN, the European Organization for Nuclear Research, in medical applications, with specific focus on hadron therapy. It also presents the history, achievements, and future scientific goals of the European Network for Light Ion Hadron Therapy, whose co-ordination office is at CERN.

  15. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
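
    A minimal example from the time-series category described above: one-step-ahead forecasting by simple exponential smoothing. The passenger counts and smoothing constant are invented for illustration; the record itself emphasizes the analytical method, which builds causal models rather than extrapolating the series.

```python
def exp_smooth_forecast(series, alpha=0.3):
    """One-step-ahead forecast by simple exponential smoothing: the level
    is an exponentially weighted average of past observations, with
    smoothing constant alpha in (0, 1]."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

# hypothetical monthly passenger counts (thousands)
forecast = exp_smooth_forecast([100, 104, 101, 108, 110, 112])
```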

  16. Specifying and calibrating instrumentations for wideband electronic power measurements. [in switching circuits

    NASA Technical Reports Server (NTRS)

    Lesco, D. J.; Weikle, D. H.

    1980-01-01

    Topics related to wideband electric power measurement, namely electronic wattmeter calibration and specification, are discussed. Tested calibration techniques are described in detail. Analytical methods used to determine the bandwidth requirements of instrumentation for switching circuit waveforms are presented and illustrated with examples from electric vehicle type applications. Analog multiplier wattmeters, digital wattmeters and calculating digital oscilloscopes are compared. The instrumentation characteristics which are critical to accurate wideband power measurement are described.

  17. Introduction to Food Analysis

    NASA Astrophysics Data System (ADS)

    Nielsen, S. Suzanne

    Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, analysis is done of problem samples and competitor products. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure usefulness of the method. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, your choice of method to determine the salt content of potato chips would be different if it is for nutrition labeling than for quality control. The success of any analytical method relies on the proper selection and preparation of the food sample, carefully performing the analysis, and doing the appropriate calculations and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. 
Such official methods are critical in the analysis of foods, to ensure that they meet the legal requirements established by governmental agencies. Government regulations and international standards most relevant to the analysis of foods are mentioned here but covered in more detail in Chap. 2, and nutrition labeling regulations in the USA are covered in Chap. 3. Internet addresses for many of the organizations and government agencies discussed are given at the end of this chapter.

  18. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
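    The incremental versus rolling-window scopes mentioned above can be illustrated with a toy streaming aggregate; the class names and data below are invented for illustration, and a real system would operate on typed events rather than bare numbers.

```python
# Toy sketch of two analytic scopes for streaming data: an incremental
# (running) mean that never forgets, versus a rolling-window mean over the
# most recent N items. Invented names and data, for illustration only.
from collections import deque

class IncrementalMean:
    """Running mean over the entire stream seen so far."""
    def __init__(self):
        self.n, self.total = 0, 0.0
    def update(self, x):
        self.n += 1
        self.total += x
        return self.total / self.n

class RollingMean:
    """Mean over only the most recent `window` items."""
    def __init__(self, window):
        self.buf = deque(maxlen=window)   # old items fall off automatically
    def update(self, x):
        self.buf.append(x)
        return sum(self.buf) / len(self.buf)

inc, roll = IncrementalMean(), RollingMean(window=3)
stream = [10, 20, 30, 100]
print([round(inc.update(x), 1) for x in stream])    # → [10.0, 15.0, 20.0, 40.0]
print([round(roll.update(x), 1) for x in stream])   # → [10.0, 15.0, 20.0, 50.0]
```

The rolling mean reacts to the spike much more strongly, which is why the scope of each analytic must be chosen (and made pluggable) explicitly in such architectures.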

  19. Accuracy of specific BIVA for the assessment of body composition in the United States population.

    PubMed

    Buffa, Roberto; Saragat, Bruno; Cabras, Stefano; Rinaldi, Andrea C; Marini, Elisabetta

    2013-01-01

    Bioelectrical impedance vector analysis (BIVA) is a technique for the assessment of hydration and nutritional status, used in clinical practice. Specific BIVA is an analytical variant, recently proposed for the Italian elderly population, that adjusts bioelectrical values for body geometry. The aim was to evaluate the accuracy of specific BIVA in the adult U.S. population, compared with the 'classic' BIVA procedure, using DXA as the reference technique, in order to obtain an interpretative model of body composition. A cross-sectional sample of 1590 adult individuals (836 men and 754 women, 21-49 years old) derived from NHANES 2003-2004 was considered. Classic and specific BIVA were applied. The sensitivity and specificity in recognizing individuals below the 5th and above the 95th percentiles of percent fat (FMDXA%) and extracellular/intracellular water (ECW/ICW) ratio were evaluated by receiver operating characteristic (ROC) curves. Classic and specific BIVA results were compared by probit multiple regression. Specific BIVA was significantly more accurate than classic BIVA in evaluating FMDXA% (ROC areas: 0.84-0.92 and 0.49-0.61, respectively; p = 0.002). The evaluation of ECW/ICW was accurate (ROC areas between 0.83 and 0.96) and similarly performed by the two procedures (p = 0.829). The accuracy of specific BIVA was similar in the two sexes (p = 0.144) and for FMDXA% and ECW/ICW (p = 0.869). Specific BIVA proved to be an accurate technique. The tolerance ellipses of specific BIVA can be used for evaluating FM% and ECW/ICW in the U.S. adult population.
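    The ROC-area comparison described above can be sketched in a few lines; the scores and labels below are synthetic stand-ins, not BIVA or DXA data, and the area is computed via the equivalent Mann-Whitney statistic.

```python
# Illustrative sketch of comparing two classifiers by ROC area, as in the
# BIVA-vs-DXA evaluation above. All data here are synthetic.

def roc_auc(scores, labels):
    """ROC area via the Mann-Whitney statistic: the probability that a
    randomly chosen positive case scores higher than a negative one."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Synthetic example: a well-separated score versus a nearly uninformative one
# (loosely analogous to specific vs. classic BIVA for FMDXA% above).
labels = [1, 1, 1, 0, 0, 0, 0, 0]
good = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
poor = [0.5, 0.3, 0.6, 0.7, 0.4, 0.2, 0.55, 0.8]

print(roc_auc(good, labels))   # → 1.0 (perfect separation)
print(roc_auc(poor, labels))   # → 0.4 (close to chance, 0.5)
```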

  20. Exhaled human breath measurement method for assessing exposure to halogenated volatile organic compounds.

    PubMed

    Pleil, J D; Lindstrom, A B

    1997-05-01

    The organic constituents of exhaled human breath are representative of blood-borne concentrations through gas exchange in the blood/breath interface in the lungs. The presence of specific compounds can be an indicator of recent exposure or represent a biological response of the subject. For volatile organic compounds (VOCs), sampling and analysis of breath is preferred to direct measurement from blood samples because breath collection is noninvasive, potentially infectious waste is avoided, and the measurement of gas-phase analytes is much simpler in a gas matrix rather than in a complex biological tissue such as blood. To exploit these advantages, we have developed the "single breath canister" (SBC) technique, a simple direct collection method for individual alveolar breath samples, and adapted conventional gas chromatography-mass spectrometry analytical methods for trace-concentration VOC analysis. The focus of this paper is to describe briefly the techniques for making VOC measurements in breath, to present some specific applications for which these methods are relevant, and to demonstrate how to estimate exposure to example VOCs on the basis of breath elimination. We present data from three different exposure scenarios: (a) vinyl chloride and cis-1,2-dichloroethene from showering with contaminated water from a private well, (b) chloroform and bromodichloromethane from high-intensity swimming in chlorinated pool water, and (c) trichloroethene from a controlled exposure chamber experiment. In all cases, for all subjects, the experiment is the same: preexposure breath measurement, exposure to halogenated VOC, and a postexposure time-dependent series of breath measurements. Data are presented only to demonstrate the use of the method and how to interpret the analytical results.
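    The post-exposure breath series described above is often interpreted through a first-order elimination model; below is a minimal sketch, assuming a single-compartment exponential decay C(t) = C0·exp(-k·t) and noise-free synthetic concentrations (not measured data from the paper).

```python
# Sketch: recover the elimination rate k and extrapolated post-exposure
# concentration C0 from a breath time series by a log-linear least-squares
# fit, assuming single-compartment exponential decay. Synthetic data only.
import math

def fit_exponential_decay(times, concs):
    """Least-squares line through (t, ln C); returns (C0, k)."""
    n = len(times)
    ys = [math.log(c) for c in concs]
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, ys))
             / sum((t - tbar) ** 2 for t in times))
    return math.exp(ybar - slope * tbar), -slope

# Synthetic breath series: C0 = 40 ppb, k = 0.05 per minute, no noise
times = [0.0, 10.0, 20.0, 40.0, 60.0]
concs = [40.0 * math.exp(-0.05 * t) for t in times]
c0, k = fit_exponential_decay(times, concs)
print(round(c0, 1), round(k, 3))   # → 40.0 0.05
```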

  1. Applying Sequential Analytic Methods to Self-Reported Information to Anticipate Care Needs.

    PubMed

    Bayliss, Elizabeth A; Powers, J David; Ellis, Jennifer L; Barrow, Jennifer C; Strobel, MaryJo; Beck, Arne

    2016-01-01

    Identifying care needs for newly enrolled or newly insured individuals is important under the Affordable Care Act. Systematically collected patient-reported information can potentially identify subgroups with specific care needs prior to service use. We conducted a retrospective cohort investigation of 6,047 individuals who completed a 10-question needs assessment upon initial enrollment in Kaiser Permanente Colorado (KPCO), a not-for-profit integrated delivery system, through the Colorado State Individual Exchange. We used responses from the Brief Health Questionnaire (BHQ), to develop a predictive model for cost for receiving care in the top 25 percent, then applied cluster analytic techniques to identify different high-cost subpopulations. Per-member, per-month cost was measured from 6 to 12 months following BHQ response. BHQ responses significantly predictive of high-cost care included self-reported health status, functional limitations, medication use, presence of 0-4 chronic conditions, self-reported emergency department (ED) use during the prior year, and lack of prior insurance. Age, gender, and deductible-based insurance product were also predictive. The largest possible range of predicted probabilities of being in the top 25 percent of cost was 3.5 percent to 96.4 percent. Within the top cost quartile, examples of potentially actionable clusters of patients included those with high morbidity, prior utilization, depression risk and financial constraints; previously uninsured individuals with high morbidity but few financial constraints; and relatively healthy, previously insured individuals with medication needs. Applying sequential predictive modeling and cluster analytic techniques to patient-reported information can identify subgroups of individuals within heterogeneous populations who may benefit from specific interventions to optimize initial care delivery.
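    The sequential approach above (predict the high-cost quartile, then cluster within it) can be sketched as follows; the features, coefficients, and member data are invented for illustration and are not the BHQ items or the fitted model from the study.

```python
# Minimal sketch of two-stage analysis: score members with a simple logistic
# model, keep those predicted high-cost, then cluster that subgroup to find
# distinct profiles. All data, features, and coefficients are invented.
import math, random

def predict_prob(x, coefs, intercept):
    """Logistic model: probability of landing in the top cost quartile."""
    z = intercept + sum(c * v for c, v in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on small feature vectors (no external libraries)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            groups[i].append(p)
        # recompute each center as the mean of its group (keep old if empty)
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Hypothetical features: (morbidity score, prior ED visits, medication count)
members = [(3, 2, 8), (4, 3, 9), (0, 0, 1), (1, 0, 2), (3, 3, 7), (0, 1, 6), (1, 0, 5)]
coefs, intercept = (0.9, 0.6, 0.2), -3.0
high_cost = [m for m in members if predict_prob(m, coefs, intercept) > 0.5]
centers, groups = kmeans([list(m) for m in high_cost], k=2)
print(len(high_cost), "members flagged;", len(centers), "candidate subgroups")
```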

  2. Therapeutic drug monitoring of flucytosine in serum using a SERS-active membrane system

    NASA Astrophysics Data System (ADS)

    Berger, Adam G.; White, Ian M.

    2017-02-01

    A need exists for near real-time therapeutic drug monitoring (TDM), in particular for antibiotics and antifungals in patient samples at the point of care. To truly fit the point-of-care need, techniques must be rapid and easy to use. Here we report a membrane system utilizing inkjet-fabricated surface-enhanced Raman spectroscopy (SERS) sensors that allows sensitive and specific analysis despite the elimination of sophisticated chromatography equipment, expensive analytical instruments, and other systems relegated to the central lab. We utilize inkjet-fabricated paper SERS sensors as substrates for flucytosine (5FC) detection; the use of paper-based SERS substrates leverages the natural wicking ability and filtering properties of microporous membranes. We investigate the use of microporous membranes in the vertical flow assay to allow separation of flucytosine from whole blood. The passive vertical flow assay serves as a valuable method for physical separation of target analytes from complex biological matrices. This work further establishes a platform for easy, sensitive, and specific TDM of 5FC from whole blood.

  3. Mass-transport limitations in spot-based microarrays.

    PubMed

    Zhao, Ming; Wang, Xuefeng; Nolte, David

    2010-09-20

    Mass transport of analyte to surface-immobilized affinity reagents is the fundamental bottleneck for sensitive detection in solid-support microarrays and biosensors. Analyte depletion in the volume adjacent to the sensor causes deviation from ideal association, significantly slows down reaction kinetics, and causes inhomogeneous binding across the sensor surface. In this paper we use high-resolution molecular interferometric imaging (MI2), a label-free optical interferometry technique for direct detection of molecular films, to study the inhomogeneous distribution of intra-spot binding across 100 micron-diameter protein spots. By measuring intra-spot binding inhomogeneity, reaction kinetics can be determined accurately when combined with a numerical three-dimensional finite element model. To ensure homogeneous binding across a spot, a critical flow rate is identified in terms of the association rate k(a) and the spot diameter. The binding inhomogeneity across a spot can be used to distinguish high-affinity low-concentration specific reactions from low-affinity high-concentration non-specific binding of background proteins.

  4. ASSESSMENT OF ANALYTICAL METHODS USED TO MEASURE CHANGES IN BODY COMPOSITION IN THE ELDERLY AND RECOMMENDATIONS FOR THEIR USE IN PHASE II CLINICAL TRIALS

    PubMed Central

    Lustgarten, M.S.; Fielding, R.A.

    2012-01-01

    It is estimated that in the next 20 years, the number of people older than 65 years of age will rise from 40 to 70 million and will account for 19% of the total population. Age-related decreases in muscle mass and function, known as sarcopenia, have been shown to be related to functional limitation, frailty, and an increased risk of morbidity and mortality. Therefore, with an increasing elderly population, interventions that can improve muscle mass and/or function are essential. However, analytical techniques used for measurement of muscle mass in young subjects may not be valid for use in the elderly. Therefore, the purpose of this review is to examine the specificity and accuracy of methods that are commonly used for measurement of muscle mass in aged subjects and to propose specific recommendations for the use of body composition measures in phase II clinical trials of function-promoting anabolic therapies. PMID:21528163

  5. New Trends in Impedimetric Biosensors for the Detection of Foodborne Pathogenic Bacteria

    PubMed Central

    Wang, Yixian; Ye, Zunzhong; Ying, Yibin

    2012-01-01

    The development of rapid, sensitive, and specific methods for foodborne pathogenic bacteria detection is of great importance to ensure food safety and security. In recent years, impedimetric biosensors, which integrate biological recognition technology and impedance measurement, have gained widespread application in the field of bacteria detection. This paper presents an overview of the progress and application of impedimetric biosensors for detection of foodborne pathogenic bacteria, particularly the new trends of the past few years, including new specific bio-recognition elements such as bacteriophages and lectins and the use of nanomaterials and microfluidics techniques. The applications of these new materials and techniques have provided unprecedented opportunities for the development of high-performance impedimetric bacteria biosensors. The significant developments of impedimetric biosensors for bacteria detection in the last five years are reviewed, classified according to whether or not a specific bio-recognition element is used. In addition, some microfluidic systems used in the construction of impedimetric biosensors to improve analytical performance are introduced in this review. PMID:22737018

  6. Some Interesting Applications of Probabilistic Techniques in Structural Dynamic Analysis of Rocket Engines

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.

    2014-01-01

    Numerical and analytical methods were developed to determine damage accumulation in specific engine components when speed variation is included. The Dither Life Ratio (DLR) is shown to be well over a factor of 2 for a specific example. The steady-state assumption is shown to be accurate for most turbopump cases, allowing rapid calculation of DLR. If hot-fire speed data are unknown, a Monte Carlo method was developed that uses speed statistics for similar engines. Application of these techniques allows the analyst to reduce both uncertainty and excess conservatism, and high values of DLR could allow a previously unacceptable part to pass high-cycle-fatigue (HCF) criteria without redesign. Given the benefit and ease of implementation, it is recommended that any finite-life turbomachine component analysis adopt these techniques. Probability values were calculated, compared, and evaluated for several industry-proposed methods for combining random and harmonic loads. Two new Excel macros were written to calculate the combined load for any specific probability level, and closed-form curve fits were generated for the widely used 3-sigma and 2-sigma probability levels. For the design of lightweight aerospace components, obtaining an accurate, reproducible, statistically meaningful answer is critical.
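    A minimal sketch of the Monte Carlo load-combination idea follows, assuming the combined response is a harmonic term with uniformly random phase plus a zero-mean Gaussian random term; the amplitudes and probability level are illustrative choices, not values from the paper.

```python
# Monte Carlo sketch: estimate the combined (random + harmonic) load
# magnitude at a chosen probability level. Assumed model: harmonic term
# with uniform random phase plus zero-mean Gaussian noise.
import math, random

def combined_load_at_level(harmonic_amp, random_sigma, level, n=200_000, seed=1):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        harmonic = harmonic_amp * math.sin(rng.uniform(0.0, 2.0 * math.pi))
        samples.append(abs(harmonic + rng.gauss(0.0, random_sigma)))
    samples.sort()
    return samples[int(level * n)]   # empirical quantile of |load|

# Sanity check: with no harmonic term, the 99.73% level of |N(0, sigma)|
# recovers the familiar 3-sigma value.
pure_random = combined_load_at_level(harmonic_amp=0.0, random_sigma=1.0, level=0.9973)
combined = combined_load_at_level(harmonic_amp=1.0, random_sigma=1.0, level=0.9973)
print(round(pure_random, 2), round(combined, 2))
```

Closed-form curve fits like those mentioned in the record would replace this sampling step for the standard 2-sigma and 3-sigma levels.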

  7. Insights from Smart Meters. Ramp-up, dependability, and short-term persistence of savings from Home Energy Reports

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd, Annika; Perry, Michael; Smith, Brian

    Smart meters, smart thermostats, and other new technologies provide previously unavailable high-frequency and location-specific energy usage data. Many utilities are now able to capture real-time, customer specific hourly interval usage data for a large proportion of their residential and small commercial customers. These vast, constantly growing streams of rich data (or, “big data”) have the potential to provide novel insights into key policy questions about how people make energy decisions. The richness and granularity of these data enable many types of creative and cutting-edge analytics. Technically sophisticated and rigorous statistical techniques can be used to pull useful insights out of this high-frequency, human-focused data. In this series, we call this “behavior analytics.” This kind of analytics has the potential to provide tremendous value to a wide range of energy programs. For example, disaggregated and heterogeneous information about actual energy use allows energy efficiency (EE) and/or demand response (DR) program implementers to target specific programs to specific households; enables evaluation, measurement and verification (EM&V) of energy efficiency programs to be performed on a much shorter time horizon than was previously possible; and may provide better insights into the energy and peak hour savings associated with EE and DR programs (e.g., behavior-based (BB) programs). The goal of this series is to enable evidence-based and data-driven decision making by policy makers and industry stakeholders, including program planners, program administrators, utilities, state regulatory agencies, and evaluators. We focus on research findings that are immediately relevant.

  8. Pulmonary nodule characterization, including computer analysis and quantitative features.

    PubMed

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.

  9. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  10. A survey of nested grid techniques and their potential for use within the MASS weather prediction model

    NASA Technical Reports Server (NTRS)

    Koch, Steven E.; Mcqueen, Jeffery T.

    1987-01-01

    A survey of various one- and two-way interactive nested grid techniques used in hydrostatic numerical weather prediction models is presented and the advantages and disadvantages of each method are discussed. The techniques for specifying the lateral boundary conditions for each nested grid scheme are described in detail. Averaging and interpolation techniques used when applying the coarse mesh grid (CMG) and fine mesh grid (FMG) interface conditions during two-way nesting are discussed separately. The survey shows that errors are commonly generated at the boundary between the CMG and FMG due to boundary formulation or specification discrepancies. Methods used to control this noise include application of smoothers, enhanced diffusion, or damping-type time integration schemes to model variables. The results from this survey provide the information needed to decide which one-way and two-way nested grid schemes merit future testing with the Mesoscale Atmospheric Simulation System (MASS) model. An analytically specified baroclinic wave will be used to conduct systematic tests of the chosen schemes since this will allow for objective determination of the interfacial noise in the kind of meteorological setting for which MASS is designed. Sample diagnostic plots from initial tests using the analytic wave are presented to illustrate how the model-generated noise is ascertained. These plots will be used to compare the accuracy of the various nesting schemes when incorporated into the MASS model.
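    The coarse-to-fine boundary specification discussed above reduces, in its simplest one-way form, to interpolating CMG values onto FMG boundary points. Below is a 1-D linear-interpolation sketch with invented values; real models interpolate in two or three dimensions and in time as well.

```python
# Sketch of one-way nesting: lateral boundary values for the fine mesh grid
# (FMG) obtained by linear interpolation of coarse mesh grid (CMG) values.
# 1-D and illustrative only.

def interp_to_fine(coarse_vals, coarse_dx, fine_xs):
    """Linearly interpolate a 1-D coarse-grid field to fine-grid locations."""
    out = []
    for x in fine_xs:
        i = min(int(x // coarse_dx), len(coarse_vals) - 2)  # bracketing cell
        t = x / coarse_dx - i                               # fractional offset
        out.append((1 - t) * coarse_vals[i] + t * coarse_vals[i + 1])
    return out

coarse = [300.0, 304.0, 302.0, 298.0]      # e.g. temperature at dx = 90 km
fine_boundary = [0.0, 30.0, 60.0, 90.0]    # FMG boundary points, dx = 30 km
print(interp_to_fine(coarse, 90.0, fine_boundary))
```

Discrepancies between such interpolated values and the fine grid's own solution at the interface are one source of the boundary noise the survey discusses.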

  11. An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Collis, Peter

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…

  12. Investigation of the feasibility of an analytical method of accounting for the effects of atmospheric drag on satellite motion

    NASA Technical Reports Server (NTRS)

    Bozeman, Robert E.

    1987-01-01

    An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.

  13. Electrical field-induced extraction and separation techniques: promising trends in analytical chemistry--a review.

    PubMed

    Yamini, Yadollah; Seidi, Shahram; Rezazadeh, Maryam

    2014-03-03

    Sample preparation is an important issue in analytical chemistry, and is often a bottleneck in chemical analysis. The major incentive for recent research has therefore been to attain faster, simpler, less expensive, and more environmentally friendly sample preparation methods. The use of auxiliary energies, such as heat, ultrasound, and microwave, is one of the strategies that have been employed in sample preparation to reach the above purposes. Application of an electrical driving force is the current state of the art, which presents new possibilities for simplifying and shortening the sample preparation process as well as enhancing its selectivity. The electrical driving force has scarcely been utilized in comparison with other auxiliary energies. In this review, the different roles of the electrical driving force (as a powerful auxiliary energy) in various extraction techniques, including liquid-, solid-, and membrane-based methods, have been taken into consideration. References relevant to developments in separation techniques and lab-on-a-chip (LOC) systems are also provided. All aspects of the electrical driving force in extraction and separation methods are too specific to be treated in this contribution; the main aim of this review is rather to provide brief knowledge about the different fields of analytical chemistry, with an emphasis on the latest efforts put into electrically assisted membrane-based sample preparation systems. The advantages and disadvantages of these approaches as well as the new achievements in these areas have been discussed, which might be helpful for further progress in the future. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Graph Theoretic Foundations of Multibody Dynamics Part I: Structural Properties

    PubMed Central

    Jain, Abhinandan

    2011-01-01

    This is the first part of two papers that use concepts from graph theory to obtain a deeper understanding of the mathematical foundations of multibody dynamics. The key contribution is the development of a unifying framework that shows that key analytical results and computational algorithms in multibody dynamics are a direct consequence of structural properties and require minimal assumptions about the specific nature of the underlying multibody system. This first part focuses on identifying the abstract graph theoretic structural properties of spatial operator techniques in multibody dynamics. The second part paper exploits these structural properties to develop a broad spectrum of analytical results and computational algorithms. Towards this, we begin with the notion of graph adjacency matrices and generalize it to define block-weighted adjacency (BWA) matrices and their 1-resolvents. Previously developed spatial operators are shown to be special cases of such BWA matrices and their 1-resolvents. These properties are shown to hold broadly for serial and tree topology multibody systems. Specializations of the BWA and 1-resolvent matrices are referred to as spatial kernel operators (SKO) and spatial propagation operators (SPO). These operators and their special properties provide the foundation for the analytical and algorithmic techniques developed in the companion paper. We also use the graph theory concepts to study the topology induced sparsity structure of these operators and the system mass matrix. Similarity transformations of these operators are also studied. While the detailed development is done for the case of rigid-link multibody systems, the extension of these techniques to a broader class of systems (e.g. deformable links) are illustrated. PMID:22102790
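    The adjacency-matrix/1-resolvent relationship described above can be sketched for a serial chain, where the directed adjacency matrix is nilpotent and the resolvent reduces to a terminating Neumann series; scalar weights below stand in for the spatial (block) operators of the paper.

```python
# Sketch: for a serial (chain) topology the directed adjacency matrix A is
# nilpotent, so the 1-resolvent (I - A)^{-1} equals the finite sum
# I + A + A^2 + ... , which underlies the propagation operators.

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def resolvent(A):
    """(I - A)^{-1} via the terminating Neumann series for nilpotent A."""
    n = len(A)
    I = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    total = [row[:] for row in I]
    power = [row[:] for row in I]
    for _ in range(n):                      # A^n = 0 for an n-body chain
        power = matmul(power, A)
        total = [[total[i][j] + power[i][j] for j in range(n)] for i in range(n)]
    return total

# 3-body chain: body k influences body k+1 with unit weight
A = [[0.0, 0.0, 0.0],
     [1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0]]
R = resolvent(A)
print(R)   # → [[1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [1.0, 1.0, 1.0]]
```

The lower-triangular result reflects the topology-induced sparsity the paper studies: each body's resolvent row touches only the bodies inboard of it.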

  15. Qualitative evaluation of maternal milk and commercial infant formulas via LIBS.

    PubMed

    Abdel-Salam, Z; Al Sharnoubi, J; Harith, M A

    2013-10-15

    This study focuses on the use of laser-induced breakdown spectroscopy (LIBS) for the evaluation of the nutrients in maternal milk and some commercially available infant formulas. The results of such evaluation are vital for adequate and healthy feeding of babies during the lactation period. Laser-induced breakdown spectroscopy offers special advantages in comparison to other conventional analytical techniques. Specifically, LIBS is a straightforward technique that can be used in situ to provide qualitative analytical information in a few minutes for the samples under investigation without preparation processes. The samples studied in the current work were maternal milk samples collected during the first 3 months of lactation (not colostrum milk) and samples from six different types of commercially available infant formulas. The samples' elemental composition has been compared with respect to the relative abundance of the elements of nutritional importance, namely Mg, Ca, Na, and Fe, using their spectral emission lines in the relevant LIBS spectra. In addition, CN and C2 molecular emission bands in the same spectra have been studied as indicators of protein content in the samples. The obtained analytical results demonstrate the higher elemental contents of the maternal milk compared with the commercial formula samples. Similar results were obtained for protein content. It has also been shown that calcium and proteins have similar relative concentration trends in the studied samples. This work demonstrates the feasibility of adopting LIBS as a fast, safe, and less costly technique for qualitatively evaluating the nutrient content of both maternal and commercial milk samples. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    PubMed

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostics aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of the practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. With this second part, a more applied flavor is taken, and its intended goal is summarizing the current state-of-the-art of analytical LIBS, providing a contemporary snapshot of LIBS applications, and highlighting new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve the sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done for consolidating the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy

  17. Investigation of practical applications of H infinity control theory to the design of control systems for large space structures

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis

    1988-01-01

    The applicability of H infinity control theory to the problems of large space structures (LSS) control was investigated. A complete evaluation of any technique as a candidate for large space structure control involves analytical evaluation, algorithmic evaluation, evaluation via simulation studies, and experimental evaluation. The results of analytical and algorithmic evaluations are documented. The analytical evaluation involves the determination of the appropriateness of the underlying assumptions inherent in the H infinity theory, the determination of the capability of the H infinity theory to achieve the design goals likely to be imposed on an LSS control design, and the identification of any LSS-specific simplifications or complications of the theory. The results of the analytical evaluation are presented in the form of a tutorial on the subject of H infinity control theory with the LSS control designer in mind. The algorithmic evaluation of H infinity for LSS control pertains to the identification of general, high-level algorithms for effecting the application of H infinity to LSS control problems, the identification of specific, numerically reliable algorithms necessary for a computer implementation of the general algorithms, the recommendation of a flexible software system for implementing the H infinity design steps, and ultimately the actual development of the necessary computer codes. Finally, the state of the art in H infinity applications is summarized with a brief outline of the most promising areas of current research.

  18. Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.

    PubMed

    Lo, Y C; Armbruster, David A

    2012-04-01

    Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers and testing samples from a significant number of healthy reference individuals. Reference intervals cited from the historical literature are often suboptimal because they may be based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for the genders combined, and gender-specific or combined-gender intervals were adopted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. The intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adopt these intervals once validated as suitable for their patient populations. Laboratories using different methodologies may be able to adapt the intervals for their facilities using the reference interval transference technique, based on a method comparison study.
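    The 95% central range described above can be sketched in a few lines as the nonparametric 2.5th and 97.5th percentiles of the reference results; the analyte values below are simulated stand-ins, not the study's blood-donor data:

```python
import numpy as np

# Simulated results for one analyte from "healthy reference individuals"
# (e.g. a sodium-like analyte in mmol/L) -- NOT the study's data.
rng = np.random.default_rng(0)
results = rng.normal(loc=140.0, scale=3.0, size=500)

def reference_interval(values, central=0.95):
    """Return the central `central` fraction of the results
    (CLSI C28-A style nonparametric reference interval)."""
    lo = (1.0 - central) / 2.0
    return (np.quantile(values, lo), np.quantile(values, 1.0 - lo))

low, high = reference_interval(results)
print(f"{low:.1f}-{high:.1f} mmol/L")
```

With 500 normally distributed samples the interval recovers roughly mean +/- 1.96 standard deviations, which is why the guideline insists on a large, well-defined reference population.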

  19. Emerging technologies for the non-invasive characterization of physical-mechanical properties of tablets.

    PubMed

    Dave, Vivek S; Shahin, Hend I; Youngren-Ortiz, Susanne R; Chougule, Mahavir B; Haware, Rahul V

    2017-10-30

    The density, porosity, breaking force, viscoelastic properties, and the presence or absence of any structural defects or irregularities are important physical-mechanical quality attributes of popular solid dosage forms like tablets. Irregularities in these attributes may influence drug product functionality. Thus, accurate and efficient characterization of these properties is critical for the successful development and manufacturing of robust tablets. These properties are mainly analyzed and monitored with traditional pharmacopeial and non-pharmacopeial methods. Such methods are associated with several challenges, such as a lack of spatial resolution, efficiency, or sample-sparing attributes. Recent advances in technology, design, instrumentation, and software have led to the emergence of newer techniques for non-invasive characterization of the physical-mechanical properties of tablets. These techniques include near infrared spectroscopy, Raman spectroscopy, X-ray microtomography, nuclear magnetic resonance (NMR) imaging, terahertz pulsed imaging, laser-induced breakdown spectroscopy, and various acoustic- and thermal-based techniques. Such state-of-the-art techniques are currently applied at various stages of development and manufacturing of tablets at industrial scale. Each technique has specific advantages or challenges with respect to operational efficiency and cost compared to traditional analytical methods. Currently, most of these techniques are used as secondary analytical tools to support the traditional methods in characterizing or monitoring tablet quality attributes. Therefore, further development of the instrumentation and software, and further studies of the applications, are necessary for their adoption in the routine analysis and monitoring of tablet physical-mechanical properties. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  1. Chemical Functionalization of Plasmonic Surface Biosensors: A Tutorial Review on Issues, Strategies, and Costs

    PubMed Central

    2017-01-01

    In an ideal plasmonic surface sensor, the bioactive area, where analytes are recognized by specific biomolecules, is surrounded by an area that is generally composed of a different material. The latter, often the surface of the supporting chip, is generally hard to functionalize selectively with respect to the active area. As a result, cross-talk between the active area and the surrounding one may occur. In designing a plasmonic sensor, various issues must be addressed: the specificity of analyte recognition, the orientation of the immobilized biomolecule that acts as the analyte receptor, and the selectivity of surface coverage. The objective of this tutorial review is to introduce the main rational tools required for a correct and complete approach to chemically functionalizing plasmonic surface biosensors. After a short introduction, the review discusses, in detail, the most common strategies for achieving effective surface functionalization. The most important issues, such as the orientation of active molecules and spatial and chemical selectivity, are considered. A list of well-defined protocols is suggested for the most common practical situations. Importantly, for the reported protocols, we also present direct comparisons in terms of cost, labor demand, and risk-benefit balance. In addition, a survey of the most used characterization techniques necessary to validate the chemical protocols is reported. PMID:28796479

  2. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Treesearch

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  3. Power Distribution Analysis For Electrical Usage In Province Area Using Olap (Online Analytical Processing)

    NASA Astrophysics Data System (ADS)

    Samsinar, Riza; Suseno, Jatmiko Endro; Widodo, Catur Edi

    2018-02-01

    The distribution network is the part of the power grid closest to the customers of electric service providers such as PT. PLN. The dispatching center of a power grid company is also the data center of the power grid, where a great amount of operating information is gathered. The valuable information contained in these data means a lot for power grid operating management. Data warehousing with online analytical processing (OLAP) has been used to manage and analyze this large volume of data. The online analytical information system resulting from data warehouse processing with OLAP produces chart and query reporting. The chart reporting consists of load distribution charts over repeated time periods, distribution charts by area, substation region charts, and electric load usage charts. The results of the OLAP process show the development of electric load distribution, provide an analysis of electric power consumption loads, and offer an alternative means of presenting information related to peak load.
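    A minimal sketch of the roll-up operation behind such chart reporting, aggregating a small fact table of load records along its time and area dimensions (field names and load figures invented for illustration):

```python
from collections import defaultdict

# Tiny invented fact table of electric-load records (NOT the paper's data).
records = [
    {"month": "2017-01", "area": "North", "load_mw": 120.0},
    {"month": "2017-01", "area": "South", "load_mw": 95.0},
    {"month": "2017-02", "area": "North", "load_mw": 130.0},
    {"month": "2017-02", "area": "South", "load_mw": 101.0},
]

def rollup(rows, dim):
    """Sum the load measure over one dimension (an OLAP 'roll-up')."""
    out = defaultdict(float)
    for r in rows:
        out[r[dim]] += r["load_mw"]
    return dict(out)

print(rollup(records, "area"))   # total load by area across all months
print(rollup(records, "month"))  # total load by month across all areas
```

A data warehouse performs the same aggregation over cube dimensions at scale; the point here is only the shape of the operation: one measure summed along a chosen dimension.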

  4. Analytical surveillance of emerging drugs of abuse and drug formulations

    PubMed Central

    Thomas, Brian F.; Pollard, Gerald T.; Grabenauer, Megan

    2012-01-01

    Uncontrolled recreational drugs are proliferating in number and variety. Effects of long-term use are unknown, and regulation is problematic, as efforts to control one chemical often lead to several other structural analogs. Advanced analytical instrumentation and methods are continuing to be developed to identify drugs, chemical constituents of products, and drug substances and metabolites in biological fluids. Several mass spectrometry based approaches appear promising, particularly those that involve high resolution chromatographic and mass spectrometric methods that allow unbiased data acquisition and sophisticated data interrogation. Several of these techniques are shown to facilitate both targeted and broad spectrum analysis, which is often of particular benefit when dealing with misleadingly labeled products or assessing a biological matrix for illicit drugs and metabolites. The development and application of novel analytical approaches such as these will help to assess the nature and degree of exposure and risk and, where necessary, inform forensics and facilitate implementation of specific regulation and control measures. PMID:23154240

  5. Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.

    ERIC Educational Resources Information Center

    Hercules, David M.; Hercules, Shirley H.

    1984-01-01

    Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)

  6. Clinical cancer diagnosis using optical fiber-delivered coherent anti-Stokes Raman scattering microscopy

    NASA Astrophysics Data System (ADS)

    Gao, Liang

    This thesis describes the development of a combined label-free imaging and analytical strategy for intraoperative characterization of cancer lesions using the coherent anti-Stokes Raman scattering (CARS) imaging technique. A cell morphology-based analytical platform is developed to characterize CARS images and, hence, provide diagnostic information using disease-related pathology features. This strategy is validated for three different applications, including margin detection for radical prostatectomy, differential diagnosis of lung cancer, and detection and differentiation of breast cancer subtypes for in situ analysis of margin status during lumpectomy. As the major contribution of this thesis, the developed analytical strategy shows high accuracy and specificity for all three diseases and thus introduces the CARS imaging technique into the field of human cancer diagnosis, where it holds substantial potential for clinical translation. In addition, I contributed to a project aimed at miniaturizing the CARS imaging device into a microendoscope setup through a fiber-delivery strategy. A four-wave-mixing (FWM) background signal, caused by the simultaneous delivery of the two CARS-generating excitation laser beams, is first identified. A polarization-based strategy is then introduced and tested for suppression of this FWM noise. The approach shows effective suppression of the FWM signal on both microscopic and prototype endoscopic setups, indicating the potential of developing a novel microendoscope of a size compatible with clinical use. These positive results show promise for the development of an all-fiber-based, label-free imaging and analytical platform for minimally invasive detection and diagnosis of cancers during surgery or surgical biopsy, thus improving surgical outcomes and reducing patients' suffering.

  7. Thermo-mechanical design aspects of mercury bombardment ion thrusters.

    NASA Technical Reports Server (NTRS)

    Schnelker, D. E.; Kami, S.

    1972-01-01

    The mechanical design criteria are presented as background considerations for solving problems associated with the thermomechanical design of mercury ion bombardment thrusters. Various analytical procedures are used to aid in the development of thruster subassemblies and components in the fields of heat transfer, vibration, and stress analysis. Examples of these techniques which provide computer solutions to predict and control stress levels encountered during launch and operation of thruster systems are discussed. Computer models of specific examples are presented.

  8. Development of optical immunosensors for detection of proteins in serum.

    PubMed

    Kyprianou, Dimitris; Chianella, Iva; Guerreiro, Antonio; Piletska, Elena V; Piletsky, Sergey A

    2013-01-15

    The detection of proteins in biological samples such as blood, serum or plasma by biosensors is very challenging due to the complex nature of the matrix, which contains high levels of many interfering compounds. Here we show the application of a novel polymeric immobilisation matrix that aids the detection of specific protein analytes in biological samples by surface plasmon resonance (SPR) immunosensors. This polymer matrix contains thioacetal functional groups included in the network; these groups do not require any further activation in order to react with proteins, making the matrix attractive for sensor fabrication. The protein prostate specific antigen (PSA) was selected as a model target analyte. A sandwich format with two primary antibodies recognising different parts (epitopes) of the analyte was used for the detection of PSA in serum. The efficiency of the reduction of non-specific binding achieved with the novel polymer was compared with that of other techniques, such as coating of the sensor surface with polyethylene glycol (PEG), use of charged hydrophilic aspartic acid, and surfactants such as Tween 20. The detection limit of the polymer-based immunosensor was 0.1 ng ml(-1) for free-form PSA (f-PSA) in buffer and 5 ng ml(-1) in 20% serum. This is an improvement over similar devices reported in the literature, indicating the potential of the immunosensor developed here for the analysis of real samples. Copyright © 2012 Elsevier B.V. All rights reserved.

  9. Manual-slide-engaged paper chip for parallel SERS-immunoassay measurement of clenbuterol from swine hair.

    PubMed

    Zheng, Tingting; Gao, Zhigang; Luo, Yong; Liu, Xianming; Zhao, Weijie; Lin, Bingcheng

    2016-02-01

    Clenbuterol (CL), as a feed additive, has been banned in many countries due to its potential threat to human health. For the detection of CL, a fast, low-cost technique with high accuracy and specificity would be ideal for administrative on-field inspections. Among the attempts to develop a reliable detection tool for CL, a technique that combines surface enhanced Raman spectroscopy (SERS) and immunoassay comes close to meeting these requirements. However, this method involves multiple steps of interaction between the CL analyte, antibody, and antigen, and under a conventional setup the operation of the SERS immunoassay was unwieldy. In this paper, to facilitate more manageable sample manipulation for SERS-immunoassay measurement, a 3D paper chip is proposed. A switch-on-chip multilayered (abbreviated as SoCM-) microfluidic paper-based analysis device (μPad) was fabricated to provide operators with manual switches over the interactions between different microfluids. In addition, antigen was anchored in a pattern on a detection slip made on the main body of the SoCM-μPad. With this architecture, the multistep interactions between the CL analyte in swine hair extract and the SERS probe-modified antibody and antigen were managed for on-chip SERS-immunoassay detection. This makes the approach very attractive for fast, cheap, accurate, on-site, and specific detection of CL from real samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Efficient computational nonlinear dynamic analysis using modal modification response technique

    NASA Astrophysics Data System (ADS)

    Marinone, Timothy; Avitabile, Peter; Foley, Jason; Wolfson, Janet

    2012-08-01

    Structural systems generally contain nonlinear characteristics. These nonlinear systems require significant computational resources for solution of the equations of motion. Much of the model, however, is linear, with the nonlinearity arising from discrete local elements connecting different components together. Using a component mode synthesis approach, a nonlinear model can be developed by interconnecting these linear components with highly nonlinear connection elements. The approach presented in this paper, the Modal Modification Response Technique (MMRT), is a very efficient technique created to address this specific class of nonlinear problem. By utilizing a Structural Dynamics Modification (SDM) approach in conjunction with mode superposition, a significantly smaller set of matrices is required for the direct integration of the equations of motion. The approach is compared to traditional analytical approaches to demonstrate the usefulness of the technique for a variety of test cases.
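    The linear ingredient of such an approach, projecting a model onto a mass-normalized modal basis so that the mass and stiffness matrices decouple, can be sketched for a toy 3-DOF spring-mass chain (the local nonlinear connection elements of MMRT are omitted here; the matrices are invented):

```python
import numpy as np
from scipy.linalg import eigh

# Toy 3-DOF spring-mass chain (arbitrary values, NOT from the paper).
M = np.diag([2.0, 1.0, 1.0])
K = np.array([[ 4.0, -2.0,  0.0],
              [-2.0,  4.0, -2.0],
              [ 0.0, -2.0,  2.0]])

# Generalized eigenproblem K phi = lam M phi; scipy returns eigenvectors
# normalized so that Phi^T M Phi = I (mass-normalized modes).
lam, Phi = eigh(K, M)

# In modal coordinates the system decouples: identity mass matrix and a
# diagonal stiffness matrix of eigenvalues, so direct integration can be
# carried out on a small retained subset of modes.
Mm = Phi.T @ M @ Phi
Km = Phi.T @ K @ Phi
print(np.allclose(Mm, np.eye(3)), np.allclose(Km, np.diag(lam)))  # -> True True
```

Mode superposition then integrates only the retained modal equations, which is what makes appending local nonlinear forces to a reduced modal model computationally attractive.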

  11. Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"

    NASA Astrophysics Data System (ADS)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    The limited availability of analytical instruments for the methodical detection of known and unknown effluents imposes a serious hindrance on qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive but also time consuming, require maintenance, and need replacement of damaged essential parts, all of which are serious concerns. Moreover, for field studies and instant detection, installation of these instruments is not convenient at every site. Therefore, a technique such as pre-concentration of metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key to this immobilization technique, which is simple, user friendly, highly effective, inexpensive, and time efficient, and, as demonstrated, easy to carry (a 10 g - 20 g vial) to the experimental field/site.

  12. Solid-Phase Extraction (SPE): Principles and Applications in Food Samples.

    PubMed

    Ötles, Semih; Kartal, Canan

    2016-01-01

    Solid-Phase Extraction (SPE) is a sample preparation method practiced in numerous application fields owing to its many advantages over traditional methods. SPE was invented as an alternative to liquid/liquid extraction and eliminated several of its disadvantages, such as the use of large amounts of solvent, extended operation times and procedure steps, potential sources of error, and high cost. Moreover, SPE can optionally be combined with other analytical methods and sample preparation techniques. Its versatility makes the SPE technique a useful tool for many purposes: isolation, concentration, purification, and clean-up are the main approaches in the practice of this method. Food matrices are complicated and can take different physical states, such as solid, viscous, or liquid. Therefore, the sample preparation step plays a particularly important role in the determination of specific compounds in foods. SPE offers many opportunities not only for the analysis of a large diversity of food samples but also for optimization and advances. This review aims to provide a comprehensive overview of the basic principles of SPE and its applications for many analytes in the food matrix.

  13. MRI of human hair.

    PubMed

    Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael

    2009-06-01

    Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing, numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.

  14. Molecular beacon probe-based multiplex real-time NASBA for detection of HIV-1 and HCV.

    PubMed

    Mohammadi-Yeganeh, S; Paryan, M; Mirab Samiee, S; Kia, V; Rezvan, H

    2012-06-01

    Developed in 1991, nucleic acid sequence-based amplification (NASBA) was introduced as a rapid molecular diagnostic technique that has been shown to give quicker results than PCR and can also be more sensitive. This paper describes the development of a molecular beacon-based multiplex NASBA assay for simultaneous detection of HIV-1 and HCV in plasma samples. A well-conserved region in the HIV-1 pol gene and the 5'-NCR of the HCV genome were used for primer and molecular beacon design. The performance features of the HCV/HIV-1 multiplex NASBA assay, including analytical sensitivity and specificity and clinical sensitivity and specificity, were evaluated. The analysis of scalar concentrations of the samples indicated that the limit of quantification of the assay was <1000 copies/ml for HIV-1 and <500 copies/ml for HCV with a 95% confidence interval. The multiplex NASBA assay showed 98% sensitivity and 100% specificity. The analytical specificity study with the BLAST software demonstrated that the primers do not attach to any sequences other than those of HIV-1 or HCV. The primers and molecular beacon probes detected all HCV genotypes and all major variants of HIV-1. This method may represent a relatively inexpensive isothermal method for detection of HIV-1/HCV co-infection in the monitoring of patients.
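    Clinical sensitivity and specificity, the figures of merit reported above, reduce to simple ratios over a 2x2 confusion table; the counts below are invented for illustration and are not the paper's data:

```python
# Sensitivity = TP/(TP+FN): fraction of true positives detected.
# Specificity = TN/(TN+FP): fraction of true negatives reported negative.
def sensitivity_specificity(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical panel: 49 of 50 known positives detected,
# all 40 known negatives reported negative.
sens, spec = sensitivity_specificity(tp=49, fn=1, tn=40, fp=0)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # -> sensitivity 98%, specificity 100%
```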

  15. Peptide Fragmentation Induced by Radicals at Atmospheric Pressure

    PubMed Central

    Vilkov, Andrey N.; Laiko, Victor V.; Doroshenko, Vladimir M.

    2009-01-01

    A novel ion dissociation technique, which is capable of providing an efficient fragmentation of peptides at essentially atmospheric pressure conditions, is developed. The fragmentation patterns observed often contain c-type fragments that are specific to ECD/ETD, along with the y-/b- fragments that are specific to CAD. In the presented experimental setup, ion fragmentation takes place within a flow reactor located in the atmospheric pressure region between the ion source and the mass spectrometer. According to a proposed mechanism, the fragmentation results from the interaction of ESI-generated analyte ions with the gas-phase radical species produced by a corona discharge source. PMID:19034885

  16. Identification of novel peptides for horse meat speciation in highly processed foodstuffs.

    PubMed

    Claydon, Amy J; Grundy, Helen H; Charlton, Adrian J; Romero, M Rosario

    2015-01-01

    There is a need for robust analytical methods to support enforcement of food labelling legislation. Proteomics is emerging as a complementary methodology to existing tools such as DNA and antibody-based techniques. Here we describe the development of a proteomics strategy for the determination of meat species in highly processed foods. A database of specific peptides for nine relevant animal species was used to enable semi-targeted species determination. This principle was tested for horse meat speciation, and a range of horse-specific peptides were identified as heat stable marker peptides for the detection of low levels of horse meat in mixtures with other species.

  17. Analytical technique characterizes all trace contaminants in water

    NASA Technical Reports Server (NTRS)

    Foster, J. N.; Lysyj, I.; Nelson, K. H.

    1967-01-01

    A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the sources of water pollution.

  18. Electro-Analytical Study of Material Interfaces Relevant for Chemical Mechanical Planarization and Lithium Ion Batteries

    NASA Astrophysics Data System (ADS)

    Turk, Michael C.

    This dissertation work involves two areas of experimental research, focusing specifically on the applications of electro-analytical techniques for interfacial material characterization. The first area of the work is centered on the evaluation and characterization of material components used for chemical mechanical planarization (CMP) in the fabrication of semiconductor devices. This part also represents the bulk of the projects undertaken for the present dissertation. The other area of research included here involves exploratory electrochemical studies of certain electrolyte and electrode materials for applications in the development of advanced lithium ion secondary batteries. The common element between the two areas of investigation is the technical approach that combines a broad variety of electro-analytical characterization techniques to examine application specific functions of the associated materials and devices. The CMP related projects concentrate on designing and evaluating materials for CMP slurries that would be useful in the processing of copper interconnects for the sub-22 nm technology node. Specifically, ruthenium and cobalt are nontraditional barrier materials currently considered for the new interconnects. The CMP schemes used to process the structures based on these metals involve complex surface chemistries of Ru, Co and Cu (used for wiring lines). The strict requirement of defect-control while maintaining material removal by precisely regulated tribo-corrosion complicates the designs of the CMP slurries needed to process these systems. Since Ru is electrochemically more noble than Cu, the surface regions of Cu assembled in contact with Ru tend to generate defects due to galvanic corrosion in the CMP environment. At the same time, Co is strongly reactive in the typical slurry environment and is prone to developing galvanic corrosion induced by Cu. 
The present work explores a selected class of alkaline slurry formulations aimed at reducing this galvanic corrosion in chemically controlled low-pressure CMP. The CMP-specific functions of the slurry components are characterized in the tribo-electro-analytical approach by using voltammetry, open circuit potential (OCP) measurements and electrochemical impedance spectroscopy (EIS) in the presence as well as in the absence of surface abrasion, both with and without the inclusion of colloidal silica (SiO2) abrasives. The results are used to understand the reaction mechanisms responsible for supporting material removal and corrosion suppression. The project carried out in the area of Li ion batteries (LIBs) uses electro-analytical techniques to probe electrolyte characteristics as well as electrode material performance. The investigation concentrates on optimizing a tactically chosen set of electrolyte compositions for low-to-moderate temperature applications of lithium titanium oxide (LTO), a relatively new anode material for such batteries. For this application, mixtures of non-aqueous carbonate based solvents are studied in combination with lithium perchlorate. The temperature dependent conductivities of the electrolytes are rigorously measured and analyzed using EIS. The experimental considerations and the working principle of this EIS based approach are carefully examined and standardized in the course of this study. These experiments also investigate the effects of temperature variations (below room temperature) on the solid electrolyte interphase (SEI) formation characteristics of LTO in the given electrolytes. This dissertation is organized as follows: Each experimental system and its relevance for practical applications are briefly introduced in each chapter. The experimental approach and the motivation for carrying out the investigation are also noted in that context. The experimental details specific to the particular study are described. 
This is followed by the results and their discussion, and subsequently, by the specific conclusions drawn from the given set of experiments. A general summary of the obtained results is presented at the end of the dissertation. Possible extensions of the present studies have also been briefly noted there.
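    The Arrhenius analysis commonly applied to temperature-dependent ionic conductivities of the kind measured here can be sketched as a linear fit of ln(σ) versus 1/T (synthetic, noiseless numbers, not the dissertation's measurements):

```python
import numpy as np

# Arrhenius model: sigma = A * exp(-Ea / (kB * T)), so
# ln(sigma) = ln(A) - (Ea/kB) * (1/T) is linear in 1/T.
kB = 8.617e-5                                 # Boltzmann constant, eV/K
T = np.array([248.0, 263.0, 278.0, 298.0])    # temperatures, K
Ea_true, A = 0.25, 5.0                        # invented activation energy (eV), prefactor
sigma = A * np.exp(-Ea_true / (kB * T))       # synthetic conductivities

# Linear fit of ln(sigma) against 1/T; slope = -Ea/kB.
slope, intercept = np.polyfit(1.0 / T, np.log(sigma), 1)
Ea_fit = -slope * kB
print(round(Ea_fit, 3))  # -> 0.25
```

With real EIS-derived conductivities the fit would include noise and possibly Vogel-Tammann-Fulcher curvature, but the slope extraction is the same.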

  19. Polymer architectures via mass spectrometry and hyphenated techniques: A review.

    PubMed

    Crotty, Sarah; Gerişlioğlu, Selim; Endres, Kevin J; Wesdemiotis, Chrys; Schubert, Ulrich S

    2016-08-17

    This review covers the application of mass spectrometry (MS) and its hyphenated techniques to synthetic polymers of varying architectural complexity. The synthetic polymers are discussed according to their architectural complexity, from linear homopolymers and copolymers to stars, dendrimers, cyclic copolymers, and other polymers. MS and tandem MS (MS/MS) have been extensively used for the analysis of synthetic polymers. However, increasing structural or architectural complexity can present analytical challenges that MS or MS/MS cannot overcome alone. Hyphenation of MS with different chromatographic techniques (2D × LC, SEC, HPLC, etc.), utilization of other ionization methods (APCI, DESI, etc.), and various mass analyzers (FT-ICR, quadrupole, time-of-flight, ion trap, etc.) are applied to overcome these challenges and achieve more detailed structural characterization of complex polymeric systems. In addition, computational methods (software: MassChrom2D, COCONUT, 2D maps, etc.) have reached polymer science to facilitate and accelerate data interpretation. Developments in technology and the comprehension of different polymer classes with diverse architectures have improved significantly, allowing smart polymer designs to be examined and advanced. We present specific examples covering diverse analytical aspects as well as forthcoming prospects in polymer science. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations

    PubMed Central

    Mohanasubha, R.; Chandrasekar, V. K.; Senthilvelan, M.; Lakshmanan, M.

    2015-01-01

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle–Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples. PMID:27547076

  1. Interconnections between various analytic approaches applicable to third-order nonlinear differential equations.

    PubMed

    Mohanasubha, R; Chandrasekar, V K; Senthilvelan, M; Lakshmanan, M

    2015-04-08

    We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle-Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples.
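    The derivation chain summarized in this abstract rests on a few defining relations that can be stated compactly. As a hedged sketch in standard notation (assumed here, not quoted from the paper), for a third-order ODE $\dddot{x} = \phi(t, x, \dot{x}, \ddot{x})$:

    ```latex
    % Defining relations behind two of the interconnections (a sketch, standard notation)
    % Total derivative operator along solutions:
    \[
      D = \frac{\partial}{\partial t} + \dot{x}\,\frac{\partial}{\partial x}
        + \ddot{x}\,\frac{\partial}{\partial \dot{x}} + \phi\,\frac{\partial}{\partial \ddot{x}}.
    \]
    % A Jacobi last multiplier M satisfies
    \[
      D[\log M] + \frac{\partial \phi}{\partial \ddot{x}} = 0,
    \]
    % while a Darboux polynomial F with cofactor g satisfies D[F] = g F.
    % Two immediate links: (i) if F_1, F_2 share the same cofactor, then
    % D[F_1/F_2] = 0, so F_1/F_2 is a first integral; (ii) a product
    \[
      M = \prod_i F_i^{\,n_i}, \qquad \sum_i n_i\, g_i = -\frac{\partial \phi}{\partial \ddot{x}},
    \]
    % is a Jacobi last multiplier, since D[\log M] = \sum_i n_i g_i.
    ```

    The paper's contribution is to close the loop among all seven quantities; the relations above illustrate only the multiplier-Darboux corner of that web.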

  2. Chiral Separations

    NASA Astrophysics Data System (ADS)

    Stalcup, A. M.

    2010-07-01

    The main goal of this review is to provide a brief overview of chiral separations to researchers who are versed in the area of analytical separations but unfamiliar with chiral separations. To researchers who are not familiar with this area, there is currently a bewildering array of commercially available chiral columns, chiral derivatizing reagents, and chiral selectors for approaches that span the range of analytical separation platforms (e.g., high-performance liquid chromatography, gas chromatography, supercritical-fluid chromatography, and capillary electrophoresis). This review begins with a brief discussion of chirality before examining the general strategies and commonalities among all of the chiral separation techniques. Rather than exhaustively listing all the chiral selectors and applications, this review highlights significant issues and differences between chiral and achiral separations, providing salient examples from specific classes of chiral selectors where appropriate.

  3. An analytical study of electric vehicle handling dynamics

    NASA Technical Reports Server (NTRS)

    Greene, J. E.; Segal, D. J.

    1979-01-01

    Hypothetical electric vehicle configurations were studied by applying available analytical methods. Elementary linearized models were used in addition to a highly sophisticated vehicle dynamics computer simulation technique. Physical properties of specific EV's were defined for various battery and powertrain packaging approaches applied to a range of weight distribution and inertial properties which characterize a generic class of EV's. Computer simulations of structured maneuvers were performed for predicting handling qualities in the normal driving range and during various extreme conditions related to accident avoidance. Results indicate that an EV with forward weight bias will possess handling qualities superior to a comparable EV that is rear-heavy or equally balanced. The importance of properly matching tires, suspension systems, and brake system front/rear torque proportioning to a given EV configuration during the design stage is demonstrated.

  4. Some results regarding stability of photovoltaic maximum-power-point tracking dc-dc converters

    NASA Astrophysics Data System (ADS)

    Schaefer, John F.

    An analytical investigation of a class of photovoltaic (PV) maximum-power-point tracking dc-dc converters has yielded basic results relative to the stability of such devices. Necessary and sufficient conditions for stable operation are derived, and design tools are given. Specific results have been obtained for arbitrary PV arrays driving converters powering resistive loads and batteries. The analytical techniques are also applicable to inverters. Portions of the theoretical results have been verified in operational devices: a 1500 watt unit has driven a 1-horsepower, 90-volt dc motor powering a water pump jack for over one year. Prior to modification shortly after initial installation, the unit exhibited instability at low levels of irradiance, as predicted by the theory. Two examples are provided.

  5. The role of atomic fluorescence spectrometry in the automatic environmental monitoring of trace element analysis

    PubMed Central

    Stockwell, P. B.; Corns, W. T.

    1993-01-01

    Considerable attention has been drawn to the environmental levels of mercury, arsenic, selenium and antimony in the last decade. Legislative and environmental pressure has forced levels to be lowered and this has created an additional burden for analytical chemists. Not only does an analysis have to reach lower detection levels, but it also has to be seen to be correct. Atomic fluorescence detection, especially when coupled to vapour generation techniques, offers both sensitivity and specificity. Developments in the design of specified atomic fluorescence detectors for mercury, for the hydride-forming elements and also for cadmium, are described in this paper. Each of these systems is capable of analysing samples in the part per trillion (ppt) range reliably and economically. Several analytical applications are described. PMID:18924964

  6. Analytical methodologies for aluminium speciation in environmental and biological samples--a review.

    PubMed

    Bi, S P; Yang, X D; Zhang, F P; Wang, X L; Zou, G W

    2001-08-01

    It is recognized that aluminium (Al) is a potential environmental hazard. Acidic deposition has been linked to increased Al concentrations in natural waters. Elevated levels of Al might have serious consequences for biological communities. Of particular interest is the speciation of Al in aquatic environments, because Al toxicity depends on its forms and concentrations. In this paper, advances in analytical methodologies for Al speciation in environmental and biological samples during the past five years are reviewed. Concerns about the specific problems of Al speciation and highlights of some important methods are elucidated in sections devoted to hybrid techniques (HPLC or FPLC coupled with ET-AAS, ICP-AES, or ICP-MS), flow-injection analysis (FIA), nuclear magnetic resonance (27Al NMR), electrochemical analysis, and computer simulation. More than 130 references are cited.

  7. Spectroscopic characterization and quantitative determination of atorvastatin calcium impurities by novel HPLC method

    NASA Astrophysics Data System (ADS)

    Gupta, Lokesh Kumar

    2012-11-01

    Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. These impurities were detected by a newly developed gradient, reversed-phase high-performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions, demonstrating the power of the newly developed HPLC method.

  8. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precise outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is discussed.

  9. Analytical pyrolysis mass spectrometry: new vistas opened by temperature-resolved in-source PYMS

    NASA Astrophysics Data System (ADS)

    Boon, Jaap J.

    1992-09-01

    Analytical pyrolysis mass spectrometry (PYMS) is introduced and its applications to the analysis of synthetic polymers, biopolymers, biomacromolecular systems and geomacromolecules are critically reviewed. Analytical pyrolysis inside the ionisation chamber of a mass spectrometer, i.e. in-source PYMS, gives a complete inventory of the pyrolysis products evolved from a solid sample. The temperature-resolved nature of the experiment gives good insight into the temperature dependence of the volatilisation and pyrolytic dissociation processes. Chemical ionisation techniques appear to be especially suitable for the analysis of oligomeric fragments released in the early stages of the pyrolysis of polymer systems. Large oligomeric fragments were observed for linear polymers such as cellulose (pentadecamer), polyhydroxyoctanoic acid (tridecamer) and polyhydroxybutyric acid (heneicosamer). New in-source PYMS data are presented on artists' paints, the plant polysaccharides cellulose and xyloglucan, several microbial polyhydroxyalkanoates, wood and enzyme-digested wood, biodegraded roots and a fossil cuticle of Miocene age. On-line and off-line pyrolysis chromatography mass spectrometric approaches are also discussed. New data presented on high-temperature gas chromatography-mass spectrometry of deuterio-reduced permethylated pyrolysates of cellulose lead to a better understanding of polysaccharide dissociation mechanisms. Pyrolysis as an on-line sample pretreatment method for organic macromolecules in combination with MS techniques is a very challenging field of mass spectrometry. Pyrolytic dissociation and desorption are not at all chaotic processes but proceed according to very specific mechanisms.

  10. Hollow silica microspheres for buoyancy-assisted separation of infectious pathogens from stool.

    PubMed

    Weigum, Shannon E; Xiang, Lichen; Osta, Erica; Li, Linying; López, Gabriel P

    2016-09-30

    Separation of cells and microorganisms from complex biological mixtures is a critical first step in many analytical applications ranging from clinical diagnostics to environmental monitoring for food and waterborne contaminants. Yet, existing techniques for cell separation are plagued by high reagent and/or instrumentation costs that limit their use in many remote or resource-poor settings, such as field clinics or developing countries. We developed an innovative approach to isolate infectious pathogens from biological fluids using buoyant hollow silica microspheres that function as "molecular buoys" for affinity-based target capture and separation by floatation. In this process, antibody functionalized glass microspheres are mixed with a complex biological sample, such as stool. When mixing is stopped, the target-bound, low-density microspheres float to the air/liquid surface, which simultaneously isolates and concentrates the target analytes from the sample matrix. The microspheres are highly tunable in terms of size, density, and surface functionality for targeting diverse analytes with separation times of ≤2 min in viscous solutions. We have applied the molecular buoy technique for isolation of a protozoan parasite that causes diarrheal illness, Cryptosporidium, directly from stool with separation efficiencies over 90% and low non-specific binding. This low-cost method for phenotypic cell/pathogen separation from complex mixtures is expected to have widespread use in clinical diagnostics as well as basic research. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Analytical impact time and angle guidance via time-varying sliding mode technique.

    PubMed

    Zhao, Yao; Sheng, Yongzhi; Liu, Xiangdong

    2016-05-01

    To provide a concrete, feasible solution for homing missiles with precise impact time and angle, this paper develops a novel guidance law based on the nonlinear engagement dynamics. The guidance law is first designed under the assumption of a stationary target, followed by a practical extension to a moving-target scenario. The time-varying sliding mode (TVSM) technique is applied to fulfill the terminal constraints, in which a specific TVSM surface is constructed with two unknown coefficients. One is tuned to meet the impact time requirement and the other is targeted with a global sliding mode, so that the impact angle constraint as well as zero miss distance can be satisfied. Because the proposed law possesses three guidance gains as design parameters, the intercept trajectory can be shaped according to the operational conditions and the missile's capability. To improve the tolerance of initial heading errors and broaden the application, a new frame of reference is also introduced. Furthermore, the analytical solutions of the flight trajectory, heading angle and acceleration command can be fully expressed for prediction and offline parameter selection by solving a first-order linear differential equation. Numerical simulation results for various scenarios validate the effectiveness of the proposed guidance law and demonstrate the accuracy of the analytic solutions. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  12. Heat capacity measurements of sub-nanoliter volumes of liquids using bimaterial microchannel cantilevers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khan, M. F.; Miriyala, N.; Hassanpourfard, M.

    Lab-on-a-Chip compatible techniques for thermal characterization of miniaturized volumes of liquid analytes are necessary in applications such as protein blotting, DNA melting, and drug development, where samples are either rare or volume-limited. We developed a closed-chamber calorimeter based on a bimaterial microchannel cantilever (BMC) for sub-nanoliter level thermal analysis. When the liquid-filled BMC is irradiated with infrared (IR) light at a specific wavelength, the IR absorption by the liquid analyte results in localized heat generation and the subsequent deflection of the BMC, due to a thermal expansion mismatch between the constituent materials. The time constant of the deflection, which is dependent upon the heat capacity of the liquid analyte, can be directly measured by recording the time-dependent bending of the BMC. We have used the BMC to quantitatively measure the heat capacity of five volatile organic compounds. With a deflection noise level of ∼10 nm and a signal-to-noise ratio of 68:1, the BMC offers a sensitivity of 30.5 ms/(J g⁻¹ K⁻¹) and a resolution of 23 mJ/(g K) for ∼150 pl of liquid for heat capacity measurements. This technique can be used for small-scale thermal characterization of different chemical and biological samples.
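    The measurement principle (extract a deflection time constant, then map it to heat capacity through a linear calibration) can be sketched numerically. This is an illustrative sketch, not the authors' code: the saturating-exponential deflection model, the baseline `tau0`, and the reuse of the abstract's 30.5 ms/(J g⁻¹ K⁻¹) sensitivity as a linear calibration slope are all assumptions.

    ```python
    import numpy as np

    def time_constant(t, z, z_inf):
        """Extract tau from a saturating deflection record
        z(t) = z_inf * (1 - exp(-t / tau)) via a log-linear least-squares fit."""
        y = np.log(1.0 - z / z_inf)      # linear in t with slope -1/tau
        slope = np.polyfit(t, y, 1)[0]
        return -1.0 / slope

    def heat_capacity(tau, tau0, sensitivity):
        """Invert an assumed linear calibration tau = tau0 + sensitivity * c_p."""
        return (tau - tau0) / sensitivity

    # synthetic deflection record with a known 80 ms time constant
    t = np.linspace(0.0, 0.4, 200)
    z = 1.0 - np.exp(-t / 0.08)
    tau = time_constant(t, z, z_inf=1.0)

    # hypothetical baseline tau0; sensitivity from the abstract, in s per (J/(g K))
    c_p = heat_capacity(tau, tau0=0.02, sensitivity=0.0305)
    ```

    With real deflection traces, the noise floor of ∼10 nm would make a weighted fit or an averaging scheme preferable to this bare log-linear fit.
    
    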

  13. Testing a path-analytic mediation model of how motivational enhancement physiotherapy improves physical functioning in pain patients.

    PubMed

    Cheing, Gladys; Vong, Sinfia; Chan, Fong; Ditchman, Nicole; Brooks, Jessica; Chan, Chetwyn

    2014-12-01

    Pain is a complex phenomenon not easily disentangled from psychological, social, and environmental characteristics and is an oft-cited barrier to return to work for people experiencing low back pain (LBP). The purpose of this study was to evaluate a path-analytic mediation model to examine how motivational enhancement physiotherapy, which incorporates tenets of motivational interviewing, improves physical functioning of patients with chronic LBP. Seventy-six patients with chronic LBP were recruited from the outpatient physiotherapy department of a government hospital in Hong Kong. The re-specified path-analytic model fit the data very well, χ²(3, N = 76) = 3.86, p = .57; comparative fit index = 1.00; root mean square error of approximation = 0.00. Specifically, results indicated that (a) using motivational interviewing techniques in physiotherapy was associated with increased working alliance with patients, (b) working alliance increased patients' outcome expectancy, and (c) greater outcome expectancy resulted in a reduction of subjective pain intensity and improvement in physical functioning. Change in pain intensity also directly influenced improvement in physical functioning. The effect of motivational enhancement therapy on physical functioning can be explained by social-cognitive factors such as motivation, outcome expectancy, and working alliance. The use of motivational interviewing techniques to increase patients' outcome expectancy and improve working alliance could further strengthen the impact of physiotherapy on rehabilitation outcomes of patients with chronic LBP.
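    The product-of-coefficients logic behind such a mediation path (predictor to mediator, mediator to outcome) can be sketched in a few lines. This is a toy on synthetic data, not the study's analysis: the path coefficients, noise levels, and variable labels are invented for illustration.

    ```python
    import numpy as np

    def ols_coefs(y, *cols):
        """OLS coefficients of y regressed on an intercept plus the given predictors."""
        X = np.column_stack([np.ones_like(cols[0])] + list(cols))
        return np.linalg.lstsq(X, y, rcond=None)[0]

    rng = np.random.default_rng(0)
    n = 76                                          # sample size matching the study
    x = rng.normal(size=n)                          # predictor, e.g. use of MI techniques
    m = 0.6 * x + rng.normal(scale=0.5, size=n)     # mediator, e.g. working alliance
    y = 0.7 * m + 0.1 * x + rng.normal(scale=0.5, size=n)  # outcome, e.g. expectancy

    a = ols_coefs(m, x)[1]       # path X -> M
    b = ols_coefs(y, x, m)[2]    # path M -> Y, controlling for X
    indirect = a * b             # product-of-coefficients indirect effect
    ```

    A full path-analytic fit (as in the paper) would estimate all paths simultaneously and report fit indices; the product of coefficients shown here is the simplest estimate of the mediated effect.
    
    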

  14. Common aspects influencing the translocation of SERS to Biomedicine.

    PubMed

    Gil, Pilar Rivera; Tsouts, Dionysia; Sanles-Sobrido, Marcos; Cabo, Andreu

    2018-01-04

    In this review, we introduce the reader to the analytical technique of surface-enhanced Raman scattering, motivated by the great potential we believe this technique has in biomedicine. We present the advantages and limitations of this technique relevant for bioanalysis in vitro and in vivo, and how it goes beyond the state of the art of traditional analytical, labelling and healthcare diagnosis technologies. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  15. Control system design for flexible structures using data models

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis; Frazier, W. Garth; Mitchell, Jerrel R.; Medina, Enrique A.; Bukley, Angelia P.

    1993-01-01

    The dynamics and control of flexible aerospace structures exercise many of the engineering disciplines. In recent years there has been considerable research in developing and tailoring control system design techniques for these structures. This problem involves designing a control system for a multi-input, multi-output (MIMO) system that satisfies various performance criteria, such as vibration suppression, disturbance and noise rejection, attitude control, and slewing control. Considerable progress has been made and demonstrated in control system design techniques for these structures. The key to designing control systems for these structures that meet stringent performance requirements is an accurate model. It has become apparent that theoretical and finite-element-generated models do not provide the needed accuracy; almost all successful demonstrations of control system design techniques have involved using test results for fine-tuning a model or for extracting a model using system ID techniques. This paper describes past and ongoing efforts at Ohio University and NASA MSFC to design controllers using 'data models.' The basic philosophy of this approach is to start with a stabilizing controller and frequency response data that describe the plant; then, iteratively vary the free parameters of the controller so that performance measures come closer to satisfying design specifications. The frequency response data can be either experimentally derived or analytically derived. One 'design-with-data' algorithm presented in this paper is called the Compensator Improvement Program (CIP). The current CIP designs controllers for MIMO systems so that classical gain, phase, and attenuation margins are achieved. The centerpiece of the CIP algorithm is the constraint improvement technique, which is used to calculate a parameter change vector that guarantees an improvement in all unsatisfied, feasible performance metrics from iteration to iteration.
The paper also presents a recently demonstrated CIP-type algorithm, called the Model and Data Oriented Computer-Aided Design System (MADCADS), developed for achieving H(sub infinity) type design specifications using data models. Control system design for the NASA/MSFC Single Structure Control Facility are demonstrated for both CIP and MADCADS. Advantages of design-with-data algorithms over techniques that require analytical plant models are also presented.
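    The design-with-data loop described above (start with a stabilizing controller, evaluate margins directly from frequency-response data, and adjust free parameters until the specs hold) can be sketched in miniature. This is a toy illustration, not CIP itself: the plant data, the single gain parameter, the fixed shrink factor, and the 45-degree phase-margin spec are all invented for the sketch.

    ```python
    import numpy as np

    def phase_margin(w, L):
        """Phase margin in degrees, read directly off sampled loop data L(jw)."""
        i = np.argmin(np.abs(np.abs(L) - 1.0))   # sample nearest the gain crossover
        return 180.0 + np.degrees(np.angle(L[i]))

    # stand-in for experimentally measured plant frequency-response data
    w = np.logspace(-2, 2, 2000)
    P = 1.0 / (1j * w * (1j * w + 1.0))

    K, spec = 4.0, 45.0                          # initial stabilizing gain, margin spec
    while phase_margin(w, K * P) < spec:         # iterate: adjust the free parameter
        K *= 0.9                                 # until the margin spec is satisfied
    pm = phase_margin(w, K * P)
    ```

    CIP replaces the crude gain shrink with a computed parameter-change vector that improves all unsatisfied feasible metrics at once; the point here is only that every evaluation uses the data model, never an analytical plant.
    
    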

  16. Light aircraft crash safety program

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Hayduk, R. J.

    1974-01-01

    NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.

  17. Effects of two-temperature parameter and thermal nonlocal parameter on transient responses of a half-space subjected to ramp-type heating

    NASA Astrophysics Data System (ADS)

    Xue, Zhang-Na; Yu, Ya-Jun; Tian, Xiao-Geng

    2017-07-01

    Based upon coupled thermoelasticity and the Green-Lindsay theory, new governing equations of two-temperature thermoelastic theory with a thermal nonlocal parameter are formulated. To model thermal loading of a half-space surface more realistically, a linear temperature ramping function is adopted. Laplace transform techniques are used to obtain general analytical solutions in the Laplace domain, and inverse Laplace transforms based on Fourier expansion techniques are numerically implemented to obtain solutions in the time domain. Specific attention is paid to the effects of the thermal nonlocal parameter, ramping time, and two-temperature parameter on the distributions of temperature, displacement and stress.
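    Numerical Laplace inversion via Fourier expansion, as used above, is commonly implemented with a Durbin-type series. The following is a minimal sketch of that standard scheme, not the authors' exact implementation; the contour abscissa `sigma`, half-period `T`, and truncation `N` are assumed tuning parameters.

    ```python
    import numpy as np

    def laplace_invert(F, t, sigma=0.5, T=10.0, N=5000):
        """Durbin-type Fourier-series inversion of a Laplace transform F(s).
        Valid for 0 < t < 2*T; sigma must lie to the right of every singularity of F."""
        k = np.arange(1, N + 1)
        s = sigma + 1j * k * np.pi / T                    # samples along a vertical line
        terms = np.real(F(s) * np.exp(1j * k * np.pi * t / T))
        return np.exp(sigma * t) / T * (F(sigma) / 2.0 + np.sum(terms))

    # sanity check on a known pair: the Laplace transform of exp(-t) is 1/(s + 1)
    f1 = laplace_invert(lambda s: 1.0 / (s + 1.0), 1.0)
    ```

    The plain series converges slowly (error roughly O(1/N)); production codes usually add an acceleration step such as the Honig-Hirdes epsilon algorithm.
    
    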

  18. Virtual Screening of Receptor Sites for Molecularly Imprinted Polymers.

    PubMed

    Bates, Ferdia; Cela-Pérez, María Concepción; Karim, Kal; Piletsky, Sergey; López-Vilariño, José Manuel

    2016-08-01

    Molecularly imprinted polymers (MIPs) are highly advantageous in the field of analytical chemistry. However, interference from secondary molecules can impede capture of a target by a MIP receptor. This greatly complicates the design process and often requires extensive laboratory screening, which is time consuming, costly, and creates substantial waste. Herein is presented a new technique for screening "virtually imprinted receptors" for rebinding of the molecular template as well as secondary structures, correlating the virtual predictions with experimentally acquired data in three case studies. This novel technique is particularly applicable to the evaluation and prediction of MIP receptor specificity and efficiency in complex aqueous systems. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Analyte-Responsive Hydrogels: Intelligent Materials for Biosensing and Drug Delivery.

    PubMed

    Culver, Heidi R; Clegg, John R; Peppas, Nicholas A

    2017-02-21

    Nature has mastered the art of molecular recognition. For example, using synergistic non-covalent interactions, proteins can distinguish between molecules and bind a partner with incredible affinity and specificity. Scientists have developed, and continue to develop, techniques to investigate and better understand molecular recognition. As a consequence, analyte-responsive hydrogels that mimic these recognitive processes have emerged as a class of intelligent materials. These materials are unique not only in the type of analyte to which they respond but also in how molecular recognition is achieved and how the hydrogel responds to the analyte. Traditional intelligent hydrogels can respond to environmental cues such as pH, temperature, and ionic strength. The functional monomers used to make these hydrogels can be varied to achieve responsive behavior. For analyte-responsive hydrogels, molecular recognition can also be achieved by incorporating biomolecules with inherent molecular recognition properties (e.g., nucleic acids, peptides, enzymes, etc.) into the polymer network. Furthermore, in addition to typical swelling/syneresis responses, these materials exhibit unique responsive behaviors, such as gel assembly or disassembly, upon interaction with the target analyte. With the diverse tools available for molecular recognition and the ability to generate unique responsive behaviors, analyte-responsive hydrogels have found great utility in a wide range of applications. In this Account, we discuss strategies for making four different classes of analyte-responsive hydrogels, specifically, non-imprinted, molecularly imprinted, biomolecule-containing, and enzymatically responsive hydrogels. Then we explore how these materials have been incorporated into sensors and drug delivery systems, highlighting examples that demonstrate the versatility of these materials. 
For example, in addition to the molecular recognition properties of analyte-responsive hydrogels, the physicochemical changes that are induced upon analyte binding can be exploited to generate a detectable signal for sensing applications. As research in this area has grown, a number of creative approaches for improving the selectivity and sensitivity (i.e., detection limit) of these sensors have emerged. For applications in drug delivery systems, therapeutic release can be triggered by competitive molecular interactions or physicochemical changes in the network. Additionally, including degradable units within the network can enable sustained and responsive therapeutic release. Several exciting examples exploiting the analyte-responsive behavior of hydrogels for the treatment of cancer, diabetes, and irritable bowel syndrome are discussed in detail. We expect that creative and combinatorial approaches used in the design of analyte-responsive hydrogels will continue to yield materials with great potential in the fields of sensing and drug delivery.

  20. Evaluating biological variation in non-transgenic crops: executive summary from the ILSI Health and Environmental Sciences Institute workshop, November 16-17, 2009, Paris, France.

    PubMed

    Doerrer, Nancy; Ladics, Gregory; McClain, Scott; Herouet-Guicheney, Corinne; Poulsen, Lars K; Privalle, Laura; Stagg, Nicola

    2010-12-01

    The International Life Sciences Institute Health and Environmental Sciences Institute Protein Allergenicity Technical Committee hosted an international workshop November 16-17, 2009, in Paris, France, with over 60 participants from academia, government, and industry to review and discuss the potential utility of "-omics" technologies for assessing the variability in plant gene, protein, and metabolite expression. The goal of the workshop was to illustrate how a plant's constituent makeup and phenotypic processes can be surveyed analytically. Presentations on the "-omics" techniques (i.e., genomics, proteomics, and metabolomics) highlighted the workshop, and summaries of these presentations are published separately in this supplemental issue. This paper summarizes key messages, as well as the consensus points reached, in a roundtable discussion on eight specific questions posed during the final session of the workshop. The workshop established some common, though not unique, challenges for all "-omics" techniques, including (a) standardization of separation/extraction and analytical techniques; (b) difficulty in associating environmental impacts (e.g., planting, soil texture, location, climate, stress) with potential alterations in plants at the genomic, proteomic, and metabolomic levels; (c) many independent analytical measurements but few replicates/subjects, leading to poorly defined accuracy and precision; and (d) bias, i.e., a lack of hypothesis-driven science. Information on natural plant variation is critical in establishing the utility of new technologies due to the variability in specific analytes that may result from genetic differences (crop genotype), different crop management practices (conventional high input, low input, organic), interaction between genotype and environment, and the use of different breeding methods. 
For example, variations of several classes of proteins were reported among different soybean, rice, or wheat varieties or varieties grown at different locations. Data on the variability of allergenic proteins are important in defining the risk of potential allergenicity. Once established as a standardized assay, survey approaches such as the "-omics" techniques can be considered in a hypothesis-driven analysis of plants, such as determining unintended effects in genetically modified (GM) crops. However, the analysis should include both the GM and control varieties that have the same breeding history and exposure to the same environmental conditions. Importantly, the biological relevance and safety significance of changes in "-omic" data are still unknown. Furthermore, the current compositional assessment for evaluating the substantial equivalence of GM crops is robust, comprehensive, and a good tool for food safety assessments. The overall consensus of the workshop participants was that many "-omics" techniques are extremely useful in the discovery and research phases of biotechnology, and are valuable for hypothesis generation. However, there are many methodological shortcomings identified with "-omics" approaches, a paucity of reference materials, and a lack of focused strategy for their use that currently make them not conducive for the safety assessment of GM crops. Copyright © 2010 Elsevier Inc. All rights reserved.

  1. Surface-Enhanced Raman Spectroscopy.

    ERIC Educational Resources Information Center

    Garrell, Robin L.

    1989-01-01

    Reviews the basis for the technique and its experimental requirements. Describes a few examples of the analytical problems to which surface-enhanced Raman spectroscopy (SERS) has been and can be applied. Provides a perspective on the current limitations and frontiers in developing SERS as an analytical technique. (MVL)

  2. Arsenic, Antimony, Chromium, and Thallium Speciation in Water and Sediment Samples with the LC-ICP-MS Technique

    PubMed Central

    Jabłońska-Czapla, Magdalena

    2015-01-01

    Chemical speciation is a very important subject in environmental protection, toxicology, and analytical chemistry, because the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires increasingly advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of these elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analytics is so important. The paper also provides numerous examples of hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962

  3. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

The most important objectives in bioanalytical chemistry involve applying tools to relevant medical and biological problems and refining those applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anticancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques. Keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques for the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The study revealed that implementing a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil is effective, as it obviates problems related to tedious separation techniques, protein binding, and drastic interferences from the complex matrices of real samples such as blood plasma and serum.

  4. The use of selective adsorbents in capillary electrophoresis-mass spectrometry for analyte preconcentration and microreactions: a powerful three-dimensional tool for multiple chemical and biological applications.

    PubMed

    Guzman, N A; Stubbs, R J

    2001-10-01

    Much attention has recently been directed to the development and application of online sample preconcentration and microreactions in capillary electrophoresis using selective adsorbents based on chemical or biological specificity. The basic principle involves two interacting chemical or biological systems with high selectivity and affinity for each other. These molecular interactions in nature usually involve noncovalent and reversible chemical processes. Properly bound to a solid support, an "affinity ligand" can selectively adsorb a "target analyte" found in a simple or complex mixture at a wide range of concentrations. As a result, the isolated analyte is enriched and highly purified. When this affinity technique, allowing noncovalent chemical interactions and biochemical reactions to occur, is coupled on-line to high-resolution capillary electrophoresis and mass spectrometry, a powerful tool of chemical and biological information is created. This paper describes the concept of biological recognition and affinity interaction on-line with high-resolution separation, the fabrication of an "analyte concentrator-microreactor", optimization conditions of adsorption and desorption, the coupling to mass spectrometry, and various applications of clinical and pharmaceutical interest.

  5. Adaptive steganography

    NASA Astrophysics Data System (ADS)

    Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.

    2002-04-01

    Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound for the steganographic capacity of LSB based steganographic techniques for a given false probability of detection. In this paper we look at adaptive steganographic techniques. Adaptive steganographic techniques take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
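The LSB embedding that the capacity bound above concerns can be sketched in a few lines. This is a minimal illustration on a toy pixel list, assuming nothing about the paper's actual embedding scheme; the function names are invented for the example:

```python
def embed_lsb(cover, bits):
    """Embed a bit string into the least-significant bits of pixel values."""
    stego = list(cover)
    for i, b in enumerate(bits):
        stego[i] = (stego[i] & ~1) | int(b)  # clear LSB, then set it to the message bit
    return stego

def extract_lsb(stego, n):
    """Recover the first n embedded bits."""
    return ''.join(str(p & 1) for p in stego[:n])

cover = [52, 55, 61, 66, 70, 61, 64, 73]  # toy 8-pixel grayscale "image"
stego = embed_lsb(cover, '1011')
assert extract_lsb(stego, 4) == '1011'
# each pixel changes by at most 1, which is what makes LSB embedding hard to see:
assert all(abs(a - b) <= 1 for a, b in zip(cover, stego))
```

An adaptive variant in the spirit of the paper would restrict embedding to high-variance (textured) regions, trading capacity for a lower probability of detection.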

  6. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to much of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself), providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that combine understanding of the science domain with data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. 
Thus, from an advancing-science point of view: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced, to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the science eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.

  7. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. he WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...
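The superposition of analytic solutions at the heart of the analytic element method can be illustrated with the classical complex-potential formulation: a uniform regional flow plus any number of wells. This is a minimal sketch of the idea, not WhAEM code, and the function names are invented for the example:

```python
import cmath

def omega_uniform(z, Q0):
    """Complex potential of uniform flow with specific discharge Q0 along +x."""
    return -Q0 * z

def omega_well(z, zw, Q):
    """Complex potential of a single well with pumping rate Q located at zw."""
    return (Q / (2 * cmath.pi)) * cmath.log(z - zw)

def omega_total(z, Q0, wells):
    """Superpose the uniform-flow element and (many) well elements."""
    return omega_uniform(z, Q0) + sum(omega_well(z, zw, Q) for zw, Q in wells)

# Because the governing equation is linear, elements simply add point by point:
z = 3 + 4j
wells = [(1 + 1j, 500.0), (-2 + 0j, 250.0)]
parts = omega_uniform(z, 2.0) + sum(omega_well(z, zw, Q) for zw, Q in wells)
assert abs(omega_total(z, 2.0, wells) - parts) < 1e-12
```

Capture-zone delineation then amounts to tracing streamlines (contours of the imaginary part of the total potential) backward from each well.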

  8. Holistic rubric vs. analytic rubric for measuring clinical performance levels in medical students.

    PubMed

    Yune, So Jung; Lee, Sang Yeoup; Im, Sun Ju; Kam, Bee Sung; Baek, Sun Yong

    2018-06-05

Task-specific checklists, holistic rubrics, and analytic rubrics are often used for performance assessments. We examined which factors evaluators consider important in holistic scoring of clinical performance assessment, and compared the usefulness of holistic and analytic rubrics, as well as of analytic rubrics used in addition to task-specific checklists based on traditional standards. We compared the usefulness of a holistic rubric versus an analytic rubric in effectively measuring the clinical skill performances of 126 third-year medical students who participated in a clinical performance assessment conducted by Pusan National University School of Medicine. We conducted a questionnaire survey of 37 evaluators who used all three evaluation methods (holistic rubric, analytic rubric, and task-specific checklist) for each student. The relationship between the scores on the three evaluation methods was analyzed using Pearson's correlation. Inter-rater agreement was analyzed by Kappa index. The effect of holistic and analytic rubric scores on the task-specific checklist score was analyzed using multiple regression analysis. Evaluators perceived accuracy and proficiency to be major factors in objective structured clinical examination evaluation, and history taking and physical examination to be major factors in clinical performance examination evaluation. Holistic rubric scores were highly related to the scores of the task-specific checklist and analytic rubric. Relatively low agreement was found in clinical performance examinations compared to objective structured clinical examinations. Meanwhile, the holistic and analytic rubric scores explained 59.1% of the task-specific checklist score in objective structured clinical examinations and 51.6% in clinical performance examinations. 
The results show the usefulness of holistic and analytic rubrics in clinical performance assessment, which can be used in conjunction with task-specific checklists for more efficient evaluation.

  9. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here to match analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  10. Propulsion Health Monitoring for Enhanced Safety

    NASA Technical Reports Server (NTRS)

    Butz, Mark G.; Rodriguez, Hector M.

    2003-01-01

    This report presents the results of the NASA contract Propulsion System Health Management for Enhanced Safety performed by General Electric Aircraft Engines (GE AE), General Electric Global Research (GE GR), and Pennsylvania State University Applied Research Laboratory (PSU ARL) under the NASA Aviation Safety Program. This activity supports the overall goal of enhanced civil aviation safety through a reduction in the occurrence of safety-significant propulsion system malfunctions. Specific objectives are to develop and demonstrate vibration diagnostics techniques for the on-line detection of turbine rotor disk cracks, and model-based fault tolerant control techniques for the prevention and mitigation of in-flight engine shutdown, surge/stall, and flameout events. The disk crack detection work was performed by GE GR which focused on a radial-mode vibration monitoring technique, and PSU ARL which focused on a torsional-mode vibration monitoring technique. GE AE performed the Model-Based Fault Tolerant Control work which focused on the development of analytical techniques for detecting, isolating, and accommodating gas-path faults.

  11. Resonance Ionization, Mass Spectrometry.

    ERIC Educational Resources Information Center

    Young, J. P.; And Others

    1989-01-01

    Discussed is an analytical technique that uses photons from lasers to resonantly excite an electron from some initial state of a gaseous atom through various excited states of the atom or molecule. Described are the apparatus, some analytical applications, and the precision and accuracy of the technique. Lists 26 references. (CW)

  12. Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods

    ERIC Educational Resources Information Center

    Zhang, Ying

    2011-01-01

    Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…
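The first (pooling) stage of MASEM can be illustrated with a sample-size-weighted average of study correlation matrices. This is a deliberately simplified sketch, assuming a common population correlation matrix; the multivariate methods the dissertation compares (e.g., GLS, TSSEM) additionally model the sampling covariances, and Fisher's z transform is often applied before averaging:

```python
def pool_correlations(matrices, ns):
    """Pool k x k correlation matrices across studies, weighting by sample size."""
    k = len(matrices[0])
    total_n = sum(ns)
    pooled = [[0.0] * k for _ in range(k)]
    for r, n in zip(matrices, ns):
        for i in range(k):
            for j in range(k):
                pooled[i][j] += n * r[i][j] / total_n
    return pooled

# Two studies reporting 2 x 2 correlation matrices:
r1 = [[1.0, 0.2], [0.2, 1.0]]   # study 1, n = 100
r2 = [[1.0, 0.4], [0.4, 1.0]]   # study 2, n = 300
pooled = pool_correlations([r1, r2], [100, 300])
assert abs(pooled[0][1] - 0.35) < 1e-12   # (100*0.2 + 300*0.4) / 400
```

The pooled matrix would then be passed to a structural equation model in the second stage.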

  13. Turbine blade tip durability analysis

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.

    1981-01-01

An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot-section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle, and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.

  14. Geomagnetic field models for satellite angular motion studies

    NASA Astrophysics Data System (ADS)

    Ovchinnikov, M. Yu.; Penkov, V. I.; Roldugin, D. S.; Pichuzhkina, A. V.

    2018-03-01

Four geomagnetic field models are discussed: IGRF and the inclined, direct, and simplified dipoles. Geomagnetic induction vector expressions are provided in different reference frames, and induction vector behavior is compared across the models. The applicability of the models to the analysis of satellite motion is studied from theoretical and engineering perspectives. Relevant satellite dynamics analysis cases using analytical and numerical techniques are provided; these cases demonstrate the benefit of a particular model for a specific dynamics study. Recommendations for model usage are summarized at the end.

  15. Harmonics suppression of vacuum chamber eddy current induced fields with application to the Superconducting Super Collider (SSC) Low Energy Booster (LEB) Magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlueter, R.D.; Halbach, K.

    1991-12-04

This memo presents the formulation of an expression for eddy currents induced in a thin-walled conductor by a time-dependent electromagnet field excitation. There follows an analytical development for the prediction of vacuum chamber eddy-current-induced field harmonics in iron-core electromagnets. A passive technique for harmonics suppression is presented, with specific application to the design of the Superconducting Super Collider (SSC) Low Energy Booster (LEB) magnets.

  16. Problem Definition Study on Techniques and Methodologies for Evaluating the Chemical and Toxicological Properties of Combustion Products of Gun Systems. Volume 1.

    DTIC Science & Technology

    1988-03-01

methods that can resolve the various compounds are required. This chapter specifically focuses on analytical and sampling methodology used to determine...Salmonella typhimurium TA1538. Cancer Res. 35:2461-2468. Huy, N. D., R. Belleau, and P. E. Roy. 1975. Toxicity of marijuana and tobacco smoking in the...

  17. Preparation Torque Limit for Composites Joined with Mechanical Fasteners

    NASA Technical Reports Server (NTRS)

    Thomas, Frank P.; Yi, Zhao

    2005-01-01

Current design guidelines for determining torque ranges for composites are based on tests and analyses of isotropic materials; properties of composites are not taken into account, and no design criteria based upon systematic analytical and test analyses are available. This paper studies the maximum torque load a composite component can carry prior to any failure. Specifically, torque-tension tests are conducted, and NDT techniques including acoustic emission, thermography, and photomicroscopy are utilized to characterize the damage modes.

  18. Disease management with ARIMA model in time series.

    PubMed

    Sato, Renato Cesar

    2013-01-01

    The evaluation of infectious and noninfectious disease management can be done through the use of a time series analysis. In this study, we expect to measure the results and prevent intervention effects on the disease. Clinical studies have benefited from the use of these techniques, particularly for the wide applicability of the ARIMA model. This study briefly presents the process of using the ARIMA model. This analytical tool offers a great contribution for researchers and healthcare managers in the evaluation of healthcare interventions in specific populations.
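A pared-down numerical sketch of the autoregressive idea behind ARIMA: fitting the coefficient of an ARIMA(1,0,0) model, i.e. AR(1), by least squares on a zero-mean series. This is illustrative plain Python, not a substitute for a full ARIMA implementation with differencing and moving-average terms, and the function names are invented for the example:

```python
import random

def fit_ar1(series):
    """Least-squares estimate of phi in x_t = phi * x_{t-1} + e_t."""
    x_prev, x_next = series[:-1], series[1:]
    return sum(a * b for a, b in zip(x_prev, x_next)) / sum(a * a for a in x_prev)

def forecast_ar1(last_value, phi, steps):
    """Iterate the fitted recursion to produce point forecasts."""
    out, x = [], last_value
    for _ in range(steps):
        x = phi * x
        out.append(x)
    return out

# Simulate x_t = 0.8 * x_{t-1} + noise, then recover phi from the data:
random.seed(1)
x = [1.0]
for _ in range(500):
    x.append(0.8 * x[-1] + random.gauss(0, 0.1))
phi = fit_ar1(x)
assert 0.7 < phi < 0.9  # the estimate is close to the true coefficient 0.8
```

In intervention studies like those the abstract describes, the model is fitted to the pre-intervention series and post-intervention observations are compared against its forecasts.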

  19. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  20. An analytical and experimental evaluation of a Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. A.; Cosby, R. M.

    1976-01-01

Line-focusing Fresnel lenses with application potential in the 200 to 370 C range were evaluated analytically and experimentally. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens; a Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile-spreading problem and should enable improved analytical-experimental correlation.

  1. Mass spectrometry techniques in the survey of steroid metabolites as potential disease biomarkers: a review.

    PubMed

    Gouveia, Maria João; Brindley, Paul J; Santos, Lúcio Lara; Correia da Costa, José Manuel; Gomes, Paula; Vale, Nuno

    2013-09-01

    Mass spectrometric approaches have been fundamental to the identification of metabolites associated with steroid hormones, yet this topic has not been reviewed in depth in recent years. To this end, and given the increasing relevance of liquid chromatography-mass spectrometry (LC-MS) studies on steroid hormones and their metabolites, the present review addresses this subject. This review provides a timely summary of the use of various mass spectrometry-based analytical techniques during the evaluation of steroidal biomarkers in a range of human disease settings. The sensitivity and specificity of these technologies are clearly providing valuable new insights into breast cancer and cardiovascular disease. We aim to contribute to an enhanced understanding of steroid metabolism and how it can be profiled by LC-MS techniques. Crown Copyright © 2013. Published by Elsevier Inc. All rights reserved.

  2. The Use of Neutron Analysis Techniques for Detecting The Concentration And Distribution of Chloride Ions in Archaeological Iron

    PubMed Central

    Watkinson, D; Rimmer, M; Kasztovszky, Z; Kis, Z; Maróti, B; Szentmiklósi, L

    2014-01-01

    Chloride (Cl) ions diffuse into iron objects during burial and drive corrosion after excavation. Located under corrosion layers, Cl is inaccessible to many analytical techniques. Neutron analysis offers non-destructive avenues for determining Cl content and distribution in objects. A pilot study used prompt gamma activation analysis (PGAA) and prompt gamma activation imaging (PGAI) to analyse the bulk concentration and longitudinal distribution of Cl in archaeological iron objects. This correlated with the object corrosion rate measured by oxygen consumption, and compared well with Cl measurement using a specific ion meter. High-Cl areas were linked with visible damage to the corrosion layers and attack of the iron core. Neutron techniques have significant advantages in the analysis of archaeological metals, including penetration depth and low detection limits. PMID:26028670

  3. High lift selected concepts

    NASA Technical Reports Server (NTRS)

    Henderson, M. L.

    1979-01-01

The benefits of applying analytic design and analysis techniques to high-lift sections, both to high-lift-system maximum lift and, alternatively, to high-lift-system complexity, were determined, and two high-lift sections were designed for flight conditions. The influence of the high-lift section on the sizing and economics of a specific energy efficient transport (EET) was clarified using a computerized sizing technique and an existing advanced airplane design database. The impact of the best design resulting from the design-application studies on EET sizing and economics was evaluated. Flap technology trade studies, climb and descent studies, and augmented stability studies are included, along with a description of the baseline high-lift system geometry, a calculation of lift and pitching moment when separation is present, and an inverse boundary layer technique for pressure distribution synthesis and optimization.

  4. Probing biomolecular interaction forces using an anharmonic acoustic technique for selective detection of bacterial spores.

    PubMed

    Ghosh, Sourav K; Ostanin, Victor P; Johnson, Christian L; Lowe, Christopher R; Seshia, Ashwin A

    2011-11-15

Receptor-based detection of pathogens often suffers from non-specific interactions, and as most detection techniques cannot distinguish between affinities of interactions, false positive responses remain a plaguing reality. Here, we report an anharmonic-acoustics-based method of detection that addresses the inherent weakness of current ligand-dependent assays. Spores of Bacillus subtilis (Bacillus anthracis simulant) were immobilized on a thickness-shear mode AT-cut quartz crystal functionalized with anti-spore antibody and the sensor was driven by a pure sinusoidal oscillation at increasing amplitude. Biomolecular interaction forces between the coupled spores and the accelerating surface caused a nonlinear modulation of the acoustic response of the crystal. In particular, the deviation in the third harmonic of the transduced electrical response versus oscillation amplitude of the sensor (signal) was found to be significant. Signals from the specifically-bound spores were clearly distinguishable in shape from those of the physisorbed streptavidin-coated polystyrene microbeads. The analytical model presented here enables estimation of the biomolecular interaction forces from the measured response. Thus, probing biomolecular interaction forces using the described technique can quantitatively detect pathogens and distinguish specific from non-specific interactions, with potential applicability to rapid point-of-care detection. This also serves as a potential tool for rapid force-spectroscopy, affinity-based biomolecular screening and mapping of molecular interaction networks. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community

    DTIC Science & Technology

    2016-01-01

Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods...thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the...more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement

  6. Microprobe monazite geochronology: new techniques for dating deformation and metamorphism

    NASA Astrophysics Data System (ADS)

    Williams, M.; Jercinovic, M.; Goncalves, P.; Mahan, K.

    2003-04-01

    High-resolution compositional mapping, age mapping, and precise dating of monazite on the electron microprobe are powerful additions to microstructural and petrologic analysis and important tools for tectonic studies. The in-situ nature and high spatial resolution of the technique offer an entirely new level of structurally and texturally specific geochronologic data that can be used to put absolute time constraints on P-T-D paths, constrain the rates of sedimentary, metamorphic, and deformational processes, and provide new links between metamorphism and deformation. New analytical techniques (including background modeling, sample preparation, and interference analysis) have significantly improved the precision and accuracy of the technique and new mapping and image analysis techniques have increased the efficiency and strengthened the correlation with fabrics and textures. Microprobe geochronology is particularly applicable to three persistent microstructural-microtextural problem areas: (1) constraining the chronology of metamorphic assemblages; (2) constraining the timing of deformational fabrics; and (3) interpreting other geochronological results. In addition, authigenic monazite can be used to date sedimentary basins, and detrital monazite can fingerprint sedimentary source areas, both critical for tectonic analysis. Although some monazite generations can be directly tied to metamorphism or deformation, at present, the most common constraints rely on monazite inclusion relations in porphyroblasts that, in turn, can be tied to the deformation and/or metamorphic history. Examples will be presented from deep-crustal rocks of northern Saskatchewan and from mid-crustal rocks from the southwestern USA. Microprobe monazite geochronology has been used in both regions to deconvolute overprinting deformation and metamorphic events and to clarify the interpretation of other geochronologic data. 
Microprobe mapping and dating are powerful companions to mass spectroscopic dating techniques. They allow geochronology to be incorporated into the microstructural analytical process, resulting in a new level of integration of time (t) into P-T-D histories.

  7. 40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  8. 40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  9. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  10. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches.

  11. Big Data Analytics for Prostate Radiotherapy

    PubMed Central

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose–volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the “RadoncSpace”) in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211
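The cross-validation step mentioned in the abstract can be sketched generically. This is a minimal k-fold index split, assumed for illustration only (real radiotherapy-outcome workflows would typically stratify by outcome and repeat the split):

```python
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k contiguous (train, test) folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = list(range(0, start)) + list(range(start + size, n))
        folds.append((train, test))
        start += size
    return folds

folds = k_fold_indices(10, 5)
assert len(folds) == 5
# every sample appears in exactly one test fold:
assert sorted(i for _, test in folds for i in test) == list(range(10))
# train and test never overlap within a fold:
assert all(not set(tr) & set(te) for tr, te in folds)
```

Each candidate outcome model is then fitted on the train indices and scored on the held-out test indices, and the k scores are averaged to compare models.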

  12. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and builds on a big data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
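    A minimal sketch of the kind of threshold-based degradation alerting described above, assuming a notional sensor stream; the rolling window length and z-score threshold are arbitrary choices, not values from the paper.

```python
# Minimal sketch of threshold-based sensor anomaly detection on
# simulated readings; window size and threshold are arbitrary choices.
import statistics

def detect_anomalies(readings, window=5, z_thresh=3.0):
    """Flag indices whose reading deviates more than z_thresh standard
    deviations from the mean of the preceding window."""
    flagged = []
    for i in range(window, len(readings)):
        hist = readings[i - window:i]
        mu = statistics.fmean(hist)
        sd = statistics.pstdev(hist)
        if sd > 0 and abs(readings[i] - mu) / sd > z_thresh:
            flagged.append(i)
    return flagged

# Notional sensor: stable signal with one injected degradation spike.
data = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 9.9, 25.0, 10.0, 10.1]
anomalies = detect_anomalies(data)
```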

  13. Ring-oven based preconcentration technique for microanalysis: simultaneous determination of Na, Fe, and Cu in fuel ethanol by laser induced breakdown spectroscopy.

    PubMed

    Cortez, Juliana; Pasquini, Celio

    2013-02-05

    The ring-oven technique, originally applied to classical qualitative analysis from the 1950s to the 1970s, is revisited as a simple yet highly efficient and green procedure for analyte preconcentration prior to determination by presently available microanalytical techniques. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is maintained in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center to a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, of about 2.0 cm diameter, is formed on the paper to contain most of the analytes originally present in the sample volume. Preconcentration coefficients of the analyte can reach 250-fold (on an m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure have been evaluated to concentrate Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour, employing the microanalytical technique of laser induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL⁻¹ and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)%, for Na, Fe, and Cu, respectively, were obtained in fuel ethanol. It is possible to anticipate the application of the technique, coupled to modern microanalytical and multianalyte techniques, to several analytical problems requiring analyte preconcentration and/or sample stabilization.
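    The routine figures of merit can be sketched as follows. The 3σ/slope detection-limit criterion is a common convention, and every number below (blank counts, calibration slope, masses) is invented for illustration rather than taken from the paper.

```python
# Detection-limit and enrichment sketch; all input values are hypothetical.
import statistics

def limit_of_detection(blank_signals, slope):
    """Common criterion: LOD = 3 * (std. dev. of blank signal) / slope."""
    return 3 * statistics.stdev(blank_signals) / slope

def preconcentration_factor(sample_mass_g, ring_mass_g):
    """Mass-based enrichment when the sample's analytes collect in the ring."""
    return sample_mass_g / ring_mass_g

blanks = [100.2, 99.8, 100.1, 99.9, 100.0]   # hypothetical blank counts
slope = 2.5                                  # hypothetical counts per (ug/mL)
lod = limit_of_detection(blanks, slope)

# e.g. ~0.5 g of sample concentrated into a ~2 mg ring region
factor = preconcentration_factor(sample_mass_g=0.5, ring_mass_g=0.002)
```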

  14. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  15. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  16. Analytical and numerical techniques for predicting the interfacial stresses of wavy carbon nanotube/polymer composites

    NASA Astrophysics Data System (ADS)

    Yazdchi, K.; Salehi, M.; Shokrieh, M. M.

    2009-03-01

    By introducing a new simplified 3D representative volume element for wavy carbon nanotubes, an analytical model is developed to study the stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson's ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in good agreement with corresponding results for straight nanotubes.

  17. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography–mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, since combining complementary techniques for chemical analysis was among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties, including mechanical, electrical and thermal behaviour, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques.
Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to understand the sample in greater detail than was possible before, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples—a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.

  18. Dissociable meta-analytic brain networks contribute to coordinated emotional processing.

    PubMed

    Riedel, Michael C; Yanes, Julio A; Ray, Kimberly L; Eickhoff, Simon B; Fox, Peter T; Sutherland, Matthew T; Laird, Angela R

    2018-06-01

    Meta-analytic techniques for mining the neuroimaging literature continue to exert an impact on our conceptualization of functional brain networks contributing to human emotion and cognition. Traditional theories regarding the neurobiological substrates contributing to affective processing are shifting from regional- towards more network-based heuristic frameworks. To elucidate differential brain network involvement linked to distinct aspects of emotion processing, we applied an emergent meta-analytic clustering approach to the extensive body of affective neuroimaging results archived in the BrainMap database. Specifically, we performed hierarchical clustering on the modeled activation maps from 1,747 experiments in the affective processing domain, resulting in five meta-analytic groupings of experiments demonstrating whole-brain recruitment. Behavioral inference analyses conducted for each of these groupings suggested dissociable networks supporting: (1) visual perception within primary and associative visual cortices, (2) auditory perception within primary auditory cortices, (3) attention to emotionally salient information within insular, anterior cingulate, and subcortical regions, (4) appraisal and prediction of emotional events within medial prefrontal and posterior cingulate cortices, and (5) induction of emotional responses within amygdala and fusiform gyri. These meta-analytic outcomes are consistent with a contemporary psychological model of affective processing in which emotionally salient information from perceived stimuli is integrated with previous experiences to engender a subjective affective response. This study highlights the utility of using emergent meta-analytic methods to inform and extend psychological theories and suggests that emotions are manifest as the eventual consequence of interactions between large-scale brain networks. © 2018 Wiley Periodicals, Inc.
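    The hierarchical clustering step can be sketched with a toy single-linkage agglomerative implementation; the four 3-element "activation maps" are fabricated and bear no relation to the BrainMap data or to the clustering variant the authors used.

```python
# Toy agglomerative (single-linkage) clustering of "modeled activation maps";
# the vectors and linkage choice are illustrative only.

def dist(a, b):
    """Euclidean distance between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def single_linkage(ca, cb, points):
    """Distance between clusters = min pairwise member distance."""
    return min(dist(points[i], points[j]) for i in ca for j in cb)

def agglomerate(points, n_clusters):
    """Repeatedly merge the two closest clusters until n_clusters remain."""
    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = single_linkage(clusters[i], clusters[j], points)
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return [sorted(c) for c in clusters]

# Two obvious groups of toy 3-voxel "maps".
maps = [(1.0, 0.9, 0.1), (0.9, 1.0, 0.0), (0.1, 0.0, 1.0), (0.0, 0.1, 0.9)]
groups = agglomerate(maps, 2)
```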

  19. Applications of flight control system methods to an advanced combat rotorcraft

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Fletcher, Jay W.; Morris, Patrick M.; Tucker, George T.

    1989-01-01

    Advanced flight control system design, analysis, and testing methodologies developed at the Ames Research Center are applied in an analytical and flight test evaluation of the Advanced Digital Optical Control System (ADOCS) demonstrator. The primary objectives are to describe the knowledge gained about the implications of digital flight control system design for rotorcraft, and to illustrate the analysis of the resulting handling-qualities in the context of the proposed new handling-qualities specification for rotorcraft. Topics covered in depth are digital flight control design and analysis methods, flight testing techniques, ADOCS handling-qualities evaluation results, and correlation of flight test results with analytical models and the proposed handling-qualities specification. The evaluation of the ADOCS demonstrator indicates desirable response characteristics based on equivalent damping and frequency, but undesirably large effective time delays (exceeding 240 ms in all axes). Piloted handling-qualities are found to be desirable or adequate for all low, medium, and high pilot gain tasks, but are inadequate for ultra-high gain tasks such as slope and running landings.

  20. Molecularly Imprinted Polymer as an Antibody Substitution in Pseudo-immunoassays for Chemical Contaminants in Food and Environmental Samples.

    PubMed

    Chen, Chaochao; Luo, Jiaxun; Li, Chenglong; Ma, Mingfang; Yu, Wenbo; Shen, Jianzhong; Wang, Zhanhui

    2018-03-21

    The chemical contaminants in food and the environment are quite harmful to food safety and human health. Rapid, accurate, and cheap detection can effectively control the potential risks derived from these chemical contaminants. Among all detection methods, the immunoassay based on the specific interaction of antibody-analyte is one of the most widely used techniques in the field. However, biological antibodies employed in the immunoassay usually cannot tolerate extreme conditions, resulting in an unstable state in both physical and chemical profiles. Molecularly imprinted polymers (MIPs) are a class of polymers with specific molecular recognition abilities, which are highly robust, showing excellent operational stability under a wide variety of conditions. Recently, MIPs have been used in biomimetic immunoassays for chemical contaminants as an antibody substitute in food and the environment. Here, we reviewed these applications of MIPs incorporated in different analytical platforms, such as enzyme-linked immunosorbent assay, fluorescent immunoassay, chemiluminescent immunoassay, electrochemical immunoassay, microfluidic paper-based immunoassay, and homogeneous immunoassay, and discussed current challenges and future trends in the use of MIPs in biomimetic immunoassays.

  1. Integrated performance and reliability specification for digital avionics systems

    NASA Technical Reports Server (NTRS)

    Brehm, Eric W.; Goettge, Robert T.

    1995-01-01

    This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.
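    The parameter exchange between reliability and performance models can be sketched with a deliberately tiny example: a two-state Markov availability model whose result scales a throughput estimate. The rates and throughput are hypothetical, and this is not the ADTS model set itself.

```python
# Sketch of coupled reliability/performance models: a reliability result
# (availability) feeds a performance estimate. All numbers are hypothetical.

def steady_state_availability(failure_rate, repair_rate):
    """Two-state Markov model: A = mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

def effective_throughput(nominal_throughput, availability):
    """Performance result adjusted by the reliability result."""
    return nominal_throughput * availability

A = steady_state_availability(failure_rate=1e-4, repair_rate=1e-2)
tput = effective_throughput(nominal_throughput=1000.0, availability=A)
```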

  2. Estimating hydraulic properties of the Floridan Aquifer System by analysis of earth-tide, ocean-tide, and barometric effects, Collier and Hendry Counties, Florida

    USGS Publications Warehouse

    Merritt, Michael L.

    2004-01-01

    Aquifers are subjected to mechanical stresses from natural, non-anthropogenic processes such as pressure loading or mechanical forcing of the aquifer by ocean tides, earth tides, and pressure fluctuations in the atmosphere. The resulting head fluctuations are evident even in deep confined aquifers. The present study was conducted for the purpose of reviewing the research that has been done on the use of these phenomena for estimating the values of aquifer properties, and determining which of the analytical techniques might be useful for estimating hydraulic properties in the dissolved-carbonate hydrologic environment of southern Florida. Fifteen techniques are discussed in this report, of which four were applied. An analytical solution for head oscillations in a well near enough to the ocean to be influenced by ocean tides was applied to data from monitor zones in a well near Naples, Florida. The solution assumes a completely non-leaky confining unit of infinite extent. Resulting values of transmissivity are in general agreement with the results of aquifer performance tests performed by the South Florida Water Management District. There seems to be an inconsistency between results of the amplitude ratio analysis and independent estimates of loading efficiency. A more general analytical solution that takes leakage through the confining layer into account yielded estimates that were lower than those obtained using the non-leaky method, and closer to the South Florida Water Management District estimates. A numerical model with a cross-sectional grid design was applied to explore additional aspects of the problem. A relation between specific storage and the head oscillation observed in a well provided estimates of specific storage that were considered reasonable. Porosity estimates based on the specific storage estimates were consistent with values obtained from measurements on core samples.
Methods are described for determining aquifer diffusivity by comparing the time-varying drawdown in an open well with periodic pressure-head oscillations in the aquifer, but the applicability of such methods might be limited in studies of the Floridan aquifer system.
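    One of the amplitude-ratio methods alluded to, the classic Ferris relation, can be sketched as follows; the well distance, tidal period, and observed ratio are invented values, not data from the Collier/Hendry study.

```python
# Sketch of the Ferris amplitude-ratio method: attenuation of a tidal
# signal with distance x from the boundary yields the hydraulic
# diffusivity D = T/S.  Input values are invented.
import math

def diffusivity_from_amplitude_ratio(x_m, period_s, amp_ratio):
    """Ferris model: amp_ratio = exp(-x * sqrt(pi / (period * D))),
    inverted here for D (units m^2/s)."""
    return math.pi * x_m ** 2 / (period_s * math.log(amp_ratio) ** 2)

x = 150.0               # well-to-shore distance, m (hypothetical)
period = 12.42 * 3600   # semidiurnal lunar tide period, s
ratio = 0.30            # observed well/ocean amplitude ratio (hypothetical)
D = diffusivity_from_amplitude_ratio(x, period, ratio)
```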

  3. The evolution of analytical chemistry methods in foodomics.

    PubMed

    Gallo, Monica; Ferranti, Pasquale

    2016-01-08

    The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on the modern instrumental platforms. Today, the development and optimization of integrated analytical approaches based on different techniques to study at molecular level the chemical composition of a food may allow the definition of a 'food fingerprint', valuable for assessing nutritional value, safety and quality, authenticity and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability and food production in response to world-wide environmental changes. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from the aim of an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we will explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies to produce progress in our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review will be to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. An Analytical Solution for Transient Thermal Response of an Insulated Structure

    NASA Technical Reports Server (NTRS)

    Blosser, Max L.

    2012-01-01

    An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
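    A related textbook case gives the flavor of such closed-form transient responses: the semi-infinite solid with a step change in surface temperature. This is not the paper's exact solution, and the diffusivity and depth below are arbitrary.

```python
# Standard transient-conduction illustration (semi-infinite solid with a
# step surface temperature), not the paper's solution; values are arbitrary.
import math

def theta(x_m, t_s, alpha):
    """Dimensionless temperature rise (T - Ti)/(Ts - Ti) at depth x, time t:
    theta = erfc(x / (2 * sqrt(alpha * t)))."""
    return math.erfc(x_m / (2.0 * math.sqrt(alpha * t_s)))

alpha = 1e-6    # thermal diffusivity, m^2/s (hypothetical insulator)
depth = 0.02    # depth of structure below heated surface, m
rise_600s = theta(depth, 600.0, alpha)
rise_60s = theta(depth, 60.0, alpha)
```

As expected, the temperature rise at a given depth grows with time, which is the qualitative behavior the non-dimensional solution in the paper captures exactly.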

  5. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics.
Researchers and practitioners in HCI who are interested in visual analytics will find this information useful, as well as the discussion of changes that need to be made to current HCI practices to make them more suitable to visual analytics. A history of analysis and analysis techniques and problems is provided, as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. The understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics to conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view work in user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term “users” in almost all situations, there are a wide variety of users that all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing.
There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and with the researchers and developers.

  6. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.
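    The idea of checking an analytical reliability estimate against pseudo-testing can be sketched with a standard example, triple modular redundancy (TMR), comparing Monte Carlo simulation with the closed-form R = 3R^2 - 2R^3. The component reliability is hypothetical, and this is a generic illustration rather than the paper's emulation method.

```python
# Pseudo-testing sketch: estimate TMR system reliability by simulation
# and compare with the analytic formula.  Component reliability is
# hypothetical.
import random

def tmr_analytic(r):
    """Majority-vote TMR: system works if at least 2 of 3 modules work."""
    return 3 * r ** 2 - 2 * r ** 3

def tmr_simulated(r, trials, seed=1):
    """Monte Carlo estimate of the same probability."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        working = sum(rng.random() < r for _ in range(3))
        ok += working >= 2          # voter masks a single module failure
    return ok / trials

r = 0.95
exact = tmr_analytic(r)
estimate = tmr_simulated(r, trials=100_000)
```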

  7. Applications of Flow Cytometry to Clinical Microbiology†

    PubMed Central

    Álvarez-Barrientos, Alberto; Arroyo, Javier; Cantón, Rafael; Nombela, César; Sánchez-Pérez, Miguel

    2000-01-01

    Classical microbiology techniques are relatively slow in comparison to other analytical techniques, in many cases due to the need to culture the microorganisms. Furthermore, classical approaches are difficult with unculturable microorganisms. More recently, the emergence of molecular biology techniques, particularly those on antibodies and nucleic acid probes combined with amplification techniques, has provided speediness and specificity to microbiological diagnosis. Flow cytometry (FCM) allows single- or multiple-microbe detection in clinical samples in an easy, reliable, and fast way. Microbes can be identified on the basis of their peculiar cytometric parameters or by means of certain fluorochromes that can be used either independently or bound to specific antibodies or oligonucleotides. FCM has permitted the development of quantitative procedures to assess antimicrobial susceptibility and drug cytotoxicity in a rapid, accurate, and highly reproducible way. Furthermore, this technique allows the monitoring of in vitro antimicrobial activity and of antimicrobial treatments ex vivo. The most outstanding contribution of FCM is the possibility of detecting the presence of heterogeneous populations with different responses to antimicrobial treatments. Despite these advantages, the application of FCM in clinical microbiology is not yet widespread, probably due to the lack of access to flow cytometers or the lack of knowledge about the potential of this technique. One of the goals of this review is to attempt to mitigate this latter circumstance. We are convinced that in the near future, the availability of commercial kits should increase the use of this technique in the clinical microbiology laboratory. PMID:10755996

  8. Optical trapping for analytical biotechnology.

    PubMed

    Ashok, Praveen C; Dholakia, Kishan

    2012-02-01

    We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single cell or even at subcellular levels which has allowed an insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore we may probe and analyse the biological world when combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.
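    A routine calibration that accompanies such piconewton force measurements is the equipartition estimate of trap stiffness, k = kB*T / <x^2>; the position variance below is invented for illustration.

```python
# Equipartition estimate of optical trap stiffness; the bead position
# variance is a made-up illustrative value.
kB = 1.380649e-23   # Boltzmann constant, J/K

def trap_stiffness(temp_K, var_m2):
    """Stiffness (N/m) from the positional variance of a trapped bead."""
    return kB * temp_K / var_m2

var = (10e-9) ** 2                   # 10 nm rms position fluctuation
k = trap_stiffness(295.0, var)       # ~room temperature
k_pN_per_um = k * 1e6                # convert N/m to pN/um
```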

  9. Newcomer adjustment during organizational socialization: a meta-analytic review of antecedents, outcomes, and methods.

    PubMed

    Bauer, Talya N; Bodner, Todd; Erdogan, Berrin; Truxillo, Donald M; Tucker, Jennifer S

    2007-05-01

    The authors tested a model of antecedents and outcomes of newcomer adjustment using 70 unique samples of newcomers with meta-analytic and path modeling techniques. Specifically, they proposed and tested a model in which adjustment (role clarity, self-efficacy, and social acceptance) mediated the effects of organizational socialization tactics and information seeking on socialization outcomes (job satisfaction, organizational commitment, job performance, intentions to remain, and turnover). The results generally supported this model. In addition, the authors examined the moderating effects of methodology on these relationships by coding for 3 methodological issues: data collection type (longitudinal vs. cross-sectional), sample characteristics (school-to-work vs. work-to-work transitions), and measurement of the antecedents (facet vs. composite measurement). Discussion focuses on the implications of the findings and suggestions for future research. 2007 APA, all rights reserved
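    The basic pooling step behind such meta-analytic techniques can be sketched as a sample-size-weighted mean correlation (Hunter-Schmidt style); the study correlations and sample sizes below are fabricated, not the 70 samples from the review.

```python
# Bare-bones meta-analytic pooling: sample-size-weighted mean correlation.
# Study statistics are fabricated for illustration.

def weighted_mean_r(ns, rs):
    """r_bar = sum(N_i * r_i) / sum(N_i)."""
    return sum(n * r for n, r in zip(ns, rs)) / sum(ns)

ns = [120, 300, 80, 500]        # hypothetical study sample sizes
rs = [0.25, 0.31, 0.18, 0.28]   # hypothetical observed correlations
r_bar = weighted_mean_r(ns, rs)
```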

  10. [The water content reference material of water saturated octanol].

    PubMed

    Wang, Haifeng; Ma, Kang; Zhang, Wei; Li, Zhanyuan

    2011-03-01

    The national standards for biofuels specify technical requirements and analytical methods. A certified reference material for water content, based on water-saturated octanol, was developed to meet the needs of instrument calibration and method validation and to assure the accuracy and consistency of water content measurements of biofuels. Three analytical methods based on different principles were employed to certify the water content of the reference material: Karl Fischer coulometric titration, Karl Fischer volumetric titration and quantitative nuclear magnetic resonance. Consistency between the coulometric and volumetric titrations was achieved through improvement of the methods, and the accuracy of the certified result was improved by the introduction of quantitative nuclear magnetic resonance. The certified value of the reference material is 4.76%, with an expanded uncertainty of 0.09%.
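    When several independent methods certify the same value, a common approach is an inverse-variance weighted mean with an expanded uncertainty U = k·u_c (coverage factor k = 2 for roughly 95 % coverage). A sketch with hypothetical method results, not the actual certification data:

```python
def weighted_mean(values, uncertainties, k=2.0):
    """Inverse-variance weighted mean of independent method results with
    expanded uncertainty U = k * u_c (coverage factor k = 2, ~95 % level)."""
    weights = [1.0 / u ** 2 for u in uncertainties]
    wsum = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / wsum
    u_c = (1.0 / wsum) ** 0.5  # standard uncertainty of the weighted mean
    return mean, k * u_c

# Hypothetical results (% water) from coulometric, volumetric and qNMR methods
values = [4.75, 4.78, 4.76]
u = [0.03, 0.05, 0.04]
mean, U = weighted_mean(values, u)
```

Certification practice also folds in between-method variance and other uncertainty components; this shows only the basic combination step.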

  11. Nuclear and atomic analytical techniques in environmental studies in South America.

    PubMed

    Paschoa, A S

    1990-01-01

    The use of nuclear analytical techniques for environmental studies in South America is selectively reviewed since the time of earlier works of Lattes with cosmic rays until the recent applications of the PIXE (particle-induced X-ray emission) technique to study air pollution problems in large cities, such as São Paulo and Rio de Janeiro. The studies on natural radioactivity and fallout from nuclear weapons in South America are briefly examined.

  12. Green aspects, developments and perspectives of liquid phase microextraction techniques.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2014-02-01

    Determination of analytes at trace levels in complex samples (e.g., biological material or contaminated water or soil) is often required for environmental assessment and monitoring, as well as for scientific research in the field of environmental pollution. Only a limited number of analytical techniques are sensitive enough for the direct determination of trace components in samples; because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention was given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also presented are new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the stage of the procedure prior to chromatographic determination. © 2013 Published by Elsevier B.V.

  13. Pre-analytic and analytic sources of variations in thiopurine methyltransferase activity measurement in patients prescribed thiopurine-based drugs: A systematic review.

    PubMed

    Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A

    2011-07-01

    Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.
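    The intra- and inter-assay coefficients of variation cited above are computed as the ratio of the standard deviation to the mean of replicate results. A minimal sketch with hypothetical TPMT activity replicates:

```python
from statistics import mean, stdev

def cv_percent(replicates):
    """Coefficient of variation (%): 100 * SD / mean of replicate results."""
    return 100.0 * stdev(replicates) / mean(replicates)

# Hypothetical intra-assay TPMT activity replicates (nmol/h/mL RBC)
intra_assay = [32.1, 33.0, 31.5, 32.8, 32.4]
cv = cv_percent(intra_assay)  # should fall below the 10 % acceptance limit
```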

  14. In-vivo analysis of ankle joint movement for patient-specific kinematic characterization.

    PubMed

    Ferraresi, Carlo; De Benedictis, Carlo; Franco, Walter; Maffiodo, Daniela; Leardini, Alberto

    2017-09-01

    In this article, a method for the experimental in-vivo characterization of ankle kinematics is proposed. The method is meant to improve the personalization of various ankle joint treatments, such as surgical decision-making or the design and application of an orthosis, and possibly to increase their effectiveness. This characterization would make the treatments more compatible with the specific patient's physiological joint conditions. The article describes the experimental procedure and the analytical method adopted, based on the instantaneous and mean helical axis theories. The results obtained in this experimental analysis reveal that more accurate techniques are necessary for a robust in-vivo assessment of the tibio-talar axis of rotation.
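    The finite helical (screw) axis underlying such methods can be extracted from a rigid-body displacement given as a rotation matrix plus a translation. A minimal sketch, using a synthetic 30° rotation about z rather than real tibio-talar data:

```python
import math

def helical_axis(R, d):
    """Finite helical (screw) axis of a rigid-body displacement.

    R: 3x3 rotation matrix (list of rows), d: translation vector.
    Returns (rotation angle, unit axis direction, translation along axis).
    """
    # rotation angle from the trace of R
    tr = R[0][0] + R[1][1] + R[2][2]
    theta = math.acos(max(-1.0, min(1.0, (tr - 1.0) / 2.0)))
    # axis direction from the skew-symmetric part of R (theta != 0 assumed)
    s = 2.0 * math.sin(theta)
    n = [(R[2][1] - R[1][2]) / s,
         (R[0][2] - R[2][0]) / s,
         (R[1][0] - R[0][1]) / s]
    # translation component along the axis
    t = sum(ni * di for ni, di in zip(n, d))
    return theta, n, t

# Synthetic pose: 30 degree rotation about z plus a 2-unit axial translation
c, s = math.cos(math.pi / 6), math.sin(math.pi / 6)
R = [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]
theta, axis, t = helical_axis(R, [0.0, 0.0, 2.0])
```

Locating a point on the axis requires an extra step (solving for the axis position from the translation perpendicular to it), omitted here.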

  15. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…
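    A basic example of SNR enhancement via pre-treatment of digitized data is boxcar (moving-average) smoothing, which reduces uncorrelated noise roughly by the square root of the window length. A sketch on a synthetic noisy baseline:

```python
import random
from statistics import mean, stdev

def moving_average(signal, window):
    """Boxcar smoothing: each point becomes the mean of a `window`-point
    neighborhood, a basic digital SNR-enhancement filter."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(mean(signal[lo:hi]))
    return out

random.seed(0)
# Synthetic flat spectral baseline with unit-variance Gaussian noise
noisy = [random.gauss(0.0, 1.0) for _ in range(1000)]
smoothed = moving_average(noisy, 9)  # noise shrinks roughly as 1/sqrt(9)
```

Savitzky-Golay filtering is the usual refinement when peak shapes must be preserved; the boxcar shown is the simplest case.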

  16. Is Quality/Effectiveness An Empirically Demonstrable School Attribute? Statistical Aids for Determining Appropriate Levels of Analysis.

    ERIC Educational Resources Information Center

    Griffith, James

    2002-01-01

    Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…
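    A standard multilevel-analysis check on whether an outcome behaves as a school-level phenomenon is the intraclass correlation ICC(1), derived from a one-way ANOVA decomposition. A minimal sketch with hypothetical scores for three schools:

```python
from statistics import mean

def icc1(groups):
    """ICC(1) from a one-way random-effects ANOVA: the share of total
    variance attributable to group (here: school) membership.

    groups: list of lists of individual-level scores, equal group size.
    """
    n = len(groups[0])  # individuals per group
    k = len(groups)     # number of groups
    grand = mean(v for g in groups for v in g)
    msb = n * sum((mean(g) - grand) ** 2 for g in groups) / (k - 1)
    msw = sum((v - mean(g)) ** 2 for g in groups for v in g) / (k * (n - 1))
    return (msb - msw) / (msb + (n - 1) * msw)

# Hypothetical outcome scores for pupils in three schools
schools = [[10, 11, 12, 11], [14, 15, 13, 14], [9, 8, 10, 9]]
```

A high ICC(1), as in this contrived example, indicates that the outcome varies mostly between schools rather than within them.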

  17. Background Signal as an in Situ Predictor of Dopamine Oxidation Potential: Improving Interpretation of Fast-Scan Cyclic Voltammetry Data.

    PubMed

    Meunier, Carl J; Roberts, James G; McCarty, Gregory S; Sombers, Leslie A

    2017-02-15

    Background-subtracted fast-scan cyclic voltammetry (FSCV) has emerged as a powerful analytical technique for monitoring subsecond molecular fluctuations in live brain tissue. Despite increasing utilization of FSCV, efforts to improve the accuracy of quantification have been limited due to the complexity of the technique and the dynamic recording environment. It is clear that variable electrode performance renders calibration necessary for accurate quantification; however, the nature of in vivo measurements can make conventional postcalibration difficult, or even impossible. Analyte-specific voltammograms and scaling factors that are critical for quantification can shift or fluctuate in vivo. This is largely due to impedance changes, and the effects of impedance on these measurements have not been characterized. We have previously reported that the background current can be used to predict electrode-specific scaling factors in situ. In this work, we employ model circuits to investigate the impact of impedance on FSCV measurements. Additionally, we take another step toward in situ electrode calibration by using the oxidation potential of quinones on the electrode surface to accurately predict the oxidation potential for dopamine at any point in an electrochemical experiment, as both are dependent on impedance. The model, validated both in adrenal slice and live brain tissue, enables information encoded in the shape of the background voltammogram to determine electrochemical parameters that are critical for accurate quantification. This improves data interpretation and provides a significant next step toward more automated methods for in vivo data analysis.

  18. Single Particle-Inductively Coupled Plasma Mass Spectroscopy Analysis of Metallic Nanoparticles in Environmental Samples with Large Dissolved Analyte Fractions.

    PubMed

    Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I

    2016-10-18

    There is an increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved signal. While sample dilution is often necessary to achieve the low analyte concentrations required for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate signal, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second to optimize the particle signal, and the third as an analytical quality control. Using simple suspensions of well characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.
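    The core data-processing step in SP-ICPMS is separating particle pulses from the dissolved background, typically with an iterative mean-plus-n-sigma threshold. The sketch below applies that common generic approach (not necessarily the authors' exact algorithm) to a synthetic signal trace:

```python
from statistics import mean, stdev

def split_particles(signal, sigma=3.0, iterations=5):
    """Iterative mean + n*sigma cut separating particle pulses from the
    dissolved background in a time-resolved SP-ICPMS signal."""
    background = list(signal)
    for _ in range(iterations):
        mu, sd = mean(background), stdev(background)
        background = [x for x in signal if x <= mu + sigma * sd]
    cut = mean(background) + sigma * stdev(background)
    particles = [x for x in signal if x > cut]
    return particles, cut

# Synthetic trace: dissolved background (~10 counts) plus three particle pulses
background_counts = [10.0, 10.5, 9.5, 11.0, 10.2] * 4
signal = background_counts + [250.0, 480.0, 320.0]
particles, cut = split_particles(signal)
```

Each iteration excludes the largest pulses from the background estimate, so the threshold converges onto the dissolved signal level.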

  19. Insights from native mass spectrometry approaches for top- and middle- level characterization of site-specific antibody-drug conjugates

    PubMed Central

    Botzanowski, Thomas; Erb, Stéphane; Hernandez-Alba, Oscar; Ehkirch, Anthony; Colas, Olivier; Wagner-Rousset, Elsa; Rabuka, David; Beck, Alain; Drake, Penelope M.; Cianférani, Sarah

    2017-01-01

    ABSTRACT Antibody-drug conjugates (ADCs) have emerged as a family of compounds with promise as efficient immunotherapies. First-generation ADCs were generated mostly via reactions on either lysine side-chain amines or cysteine thiol groups after reduction of the interchain disulfide bonds, resulting in heterogeneous populations with a variable number of drug loads per antibody. To control the position and the number of drug loads, new conjugation strategies aiming at the generation of more homogeneous site-specific conjugates have been developed. We report here the first multi-level characterization of a site-specific ADC by state-of-the-art mass spectrometry (MS) methods, including native MS and its hyphenation to ion mobility (IM-MS). We demonstrate the versatility of native MS methodologies for site-specific ADC analysis, with the unique ability to provide several critical quality attributes within one single run, along with a direct snapshot of ADC homogeneity/heterogeneity without extensive data interpretation. The capabilities of native IM-MS to directly access site-specific ADC conformational information are also highlighted. Finally, the potential of these techniques for assessing an ADC's heterogeneity/homogeneity is illustrated by comparing the analytical characterization of a site-specific DAR4 ADC to that of first-generation ADCs. Altogether, our results highlight the compatibility, versatility, and benefits of native MS approaches for the analytical characterization of all types of ADCs, including site-specific conjugates. Thus, we envision integrating native MS and IM-MS approaches, even in their latest state-of-the-art forms, into workflows that benchmark bioconjugation strategies. PMID:28406343
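    One of the critical quality attributes native MS provides for ADCs is the average drug-to-antibody ratio (DAR), computed as the intensity-weighted mean over the deconvoluted drug-load peaks. A minimal sketch with hypothetical peak intensities for a largely homogeneous DAR4 conjugate:

```python
def average_dar(peak_intensities):
    """Intensity-weighted average drug-to-antibody ratio (DAR) from
    deconvoluted native MS peak intensities.

    peak_intensities: dict mapping drug load k -> relative peak intensity.
    """
    total = sum(peak_intensities.values())
    return sum(k * i for k, i in peak_intensities.items()) / total

# Hypothetical intensities for a site-specific DAR4 conjugate:
# nearly all signal in the DAR4 peak, with minor DAR3 and DAR2 species
site_specific = {2: 2.0, 3: 6.0, 4: 92.0}
dar = average_dar(site_specific)
```

For a heterogeneous first-generation ADC, the same calculation would run over a broad distribution of drug loads (0 to 8) instead of a single dominant peak.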

  20. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Restrictions on the sale, distribution and use of... Requirements for Manufacturers and Producers § 809.30 Restrictions on the sale, distribution and use of analyte... include the statement for class I exempt ASR's: “Analyte Specific Reagent. Analytical and performance...

  1. Analytical techniques for characterization of cyclodextrin complexes in the solid state: A review.

    PubMed

    Mura, Paola

    2015-09-10

    Cyclodextrins are cyclic oligosaccharides able to form inclusion complexes with a variety of hydrophobic guest molecules, positively modifying their physicochemical properties. A thorough analytical characterization of cyclodextrin complexes is of fundamental importance to provide adequate support in selecting the most suitable cyclodextrin for each guest molecule, and also in view of possible future patenting and marketing of drug-cyclodextrin formulations. The demonstration of the actual formation of a drug-cyclodextrin inclusion complex in solution does not guarantee its existence in the solid state as well. Moreover, the technique used to prepare the solid complex can strongly influence the properties of the final product. Therefore, an appropriate characterization of the drug-cyclodextrin solid systems obtained also plays a key role in driving the choice of the most effective preparation method, able to maximize host-guest interactions. The analytical characterization of drug-cyclodextrin solid systems and the assessment of actual inclusion complex formation is not a simple task, and involves the combined use of several analytical techniques whose results have to be evaluated together. The objective of the present review is to give a general overview of the principal analytical techniques which can be employed for a suitable characterization of drug-cyclodextrin systems in the solid state, highlighting their respective advantages and limitations. The applications of each examined technique are described and discussed with pertinent examples from the literature. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Application of GC/MS Soft Ionization for Isomeric Biological Compound Analysis.

    PubMed

    Furuhashi, Takeshi; Okuda, Koji

    2017-09-03

    Isomers are compounds with the same molecular formula. Many different types of isomers are ubiquitous and play important roles in living organisms. Despite their early discovery, the actual analysis of isomers has been tricky and has confounded researchers. Using mass spectrometry (MS) to distinguish or identify isomers is an emergent topic and challenge for analytical chemists. We review some techniques for analyzing isomers with emphasis on MS, e.g., the roles of ion reaction, hydrogen-deuterium exchange, ion mobility mass spectrometry, ion spectroscopy, and energy change in producing isomer-specific fragments. In particular, soft ionization for gas chromatography-mass spectrometry (GC-MS) is a focus in this review. Awareness of the advantages and technical problems of these techniques would inspire innovation in future approaches.

  3. Current role of ICP-MS in clinical toxicology and forensic toxicology: a metallic profile.

    PubMed

    Goullé, Jean-Pierre; Saussereau, Elodie; Mahieu, Loïc; Guerbet, Michel

    2014-08-01

    Because metals and metalloids are omnipresent, human exposure is inevitable and may result in toxicity. Recent advances in metal/metalloid analysis have moved from flame atomic absorption spectrometry and electrothermal atomic absorption spectrometry to the multi-elemental inductively coupled plasma (ICP) techniques, namely ICP atomic emission spectrometry and ICP-MS. ICP-MS has now emerged as a major technique in inorganic analytical chemistry owing to its flexibility, high sensitivity and good reproducibility. This in-depth review explores the ICP-MS metallic profile in human toxicology. It is now routinely used, and of great importance, in clinical and forensic toxicology to explore biological matrices, specifically whole blood, plasma, urine, hair, nail, biopsy samples and tissues.

  4. Analytical cytology applied to detection of prognostically important cytogenetic aberrations: Current status and future directions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, J.W.; Pinkel, D.; Trask, B.

    1987-07-24

    This paper discusses the application of analytical cytology to the detection of clinically important chromosome abnormalities in human tumors. Flow cytometric measurements of DNA distributions have revealed that many human tumors have abnormal (usually elevated) DNA contents and that the occurrence of DNA abnormality may be diagnostically or prognostically important. However, DNA indices (ratio of tumor DNA content to normal DNA content) provide little information about the specific chromosome(s) involved in the DNA content abnormality. Fluorescence in situ hybridization with chromosome-specific probes is suggested as a technique to facilitate detection of specific chromosome aneuploidy in interphase and metaphase human tumor cells. Fluorescence hybridization to nuclei on slides allows enumeration of brightly fluorescent nuclear domains as an estimate of the number of copies of the chromosome type for which the hybridization probe is specific. Fluorescence hybridization can also be made to nuclei in suspension. The fluorescence intensity can then be measured flow cytometrically as an indication of the number of chromosomes in each nucleus carrying the DNA sequence homologous to the probe. In addition, quantitative image analysis may be used to explore the position of chromosomes in interphase nuclei and to look for changes in the order that may eventually permit detection of clinically important conditions. 55 refs., 8 figs., 1 tab.
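    The DNA index mentioned above is simply the ratio of the tumor G0/G1 peak position to that of a normal diploid reference in the flow-cytometric histogram. A minimal sketch with hypothetical peak channels:

```python
def dna_index(tumor_g0g1_channel, diploid_g0g1_channel):
    """DNA index from flow-cytometric histograms: ratio of the tumor G0/G1
    peak position to the normal diploid G0/G1 peak position.
    DI = 1.0 indicates a diploid DNA content; DI > 1.0, hyperdiploid."""
    return tumor_g0g1_channel / diploid_g0g1_channel

# Hypothetical peak channels: tumor peak at channel 230, diploid reference at 200
di = dna_index(230.0, 200.0)
```

As the abstract notes, a DI of 1.15 flags an abnormal DNA content but says nothing about which chromosomes are responsible; that is what the hybridization probes address.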

  5. Quantification of ferritin bound iron in human serum using species-specific isotope dilution mass spectrometry.

    PubMed

    Ren, Yao; Walczyk, Thomas

    2014-09-01

    Ferritin is a hollow sphere protein composed of 24 subunits that can store up to 4500 iron atoms in its inner cavity. It is mainly found in the liver and spleen, but also in serum at trace levels. Serum ferritin is considered the best single indicator of body iron stores short of liver or bone marrow biopsy. However, it is confounded by other disease conditions. Ferritin bound iron (FBI) and ferritin saturation have been suggested as more robust biomarkers. The current techniques for FBI determination are limited by low antibody specificity, low instrument sensitivity and possible analyte losses during sample preparation. The need for a highly sensitive and reliable method is widely recognized. Here we describe a novel technique to detect serum FBI using species-specific isotope dilution mass spectrometry (SS-IDMS). [(57)Fe]-ferritin was produced by biosynthesis and in vitro labeling with the (57)Fe spike in the form of [(57)Fe]-citrate after cell lysis and heat treatment. [(57)Fe]-ferritin for sample spiking was further purified by fast protein liquid chromatography. Serum ferritin and added [(57)Fe]-ferritin were separated from other iron species by ultrafiltration, followed by isotopic analysis of FBI using negative thermal ionization mass spectrometry. The repeatability of our assay is 8%, with an absolute detection limit of 18 ng FBI in the sample. Compared to other speciation techniques, SS-IDMS offers maximum control over sample losses and species conversion during analysis. The described technique may therefore serve as a reference technique for clinical applications of FBI as a new biomarker for assessing body iron status.
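    At the heart of isotope dilution is solving the isotope mass balance of the sample-spike blend for the analyte amount. A simplified two-isotope (56Fe/57Fe) sketch with illustrative abundance values, verified by round-tripping a known blend:

```python
def idms_amount(n_spike, a_sample, b_sample, a_spike, b_spike, r_blend):
    """Single isotope dilution: moles of analyte in the sample from the
    measured isotope ratio of the blend.

    a_*, b_*: isotope amount fractions of the reference isotope (56Fe)
    and the spike isotope (57Fe); r_blend: measured 56Fe/57Fe ratio.
    """
    return n_spike * (a_spike - r_blend * b_spike) / (r_blend * b_sample - a_sample)

# Approximate natural Fe (~91.75 % 56Fe, ~2.12 % 57Fe) and a 57Fe-enriched
# spike; the spike composition is illustrative, not the paper's material.
a_nat, b_nat = 0.9175, 0.0212
a_spk, b_spk = 0.02, 0.95
# Round-trip check: blend 1 nmol of sample with 1 nmol of spike, compute the
# blend ratio, then recover the sample amount from that ratio.
n_s, n_sp = 1.0e-9, 1.0e-9
r_blend = (n_s * a_nat + n_sp * a_spk) / (n_s * b_nat + n_sp * b_spk)
recovered = idms_amount(n_sp, a_nat, b_nat, a_spk, b_spk, r_blend)
```

Real IDMS also corrects for mass bias and blank contributions; the mass-balance algebra shown is the core of the method.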

  6. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    PubMed

    Manickum, Thavrin; John, Wilson

    2015-07-01

    The availability of national test centers to offer a routine service for analysis and quantitation of some selected steroid hormones [natural estrogens (17-β-estradiol, E2; estrone, E1; estriol, E3), synthetic estrogen (17-α-ethinylestradiol, EE2), androgen (testosterone), and progestogen (progesterone)] in a wastewater matrix was investigated; the corresponding internationally used chemical- and immuno-analytical test methods were reviewed. The enzyme-linked immunosorbent assay (ELISA) (an immuno-analytical technique) was also assessed for its suitability as a routine test method to quantitate the levels of these hormones at a sewage/wastewater treatment plant (WTP) (Darvill, Pietermaritzburg, South Africa) over a 2-year period. The method performance and other relevant characteristics of the immuno-analytical ELISA method were compared to conventional chemical-analytical methodology, such as gas/liquid chromatography-mass spectrometry (GC/LC-MS) and GC-LC/tandem mass spectrometry (MSMS), for quantitation of the steroid hormones in wastewater and environmental waters. The national immuno-analytical ELISA technique was found to be sensitive (LOQ 5 ng/L, LOD 0.2-5 ng/L), accurate (mean recovery 96%), precise (RSD 7-10%), and cost-effective for screening and quantitation of these steroid hormones in wastewater and environmental water matrices. A survey of the most current international literature indicates a fairly equal use of the LC-MS/MS, GC-MS/MS (chemical-analytical), and ELISA (immuno-analytical) test methods for screening and quantitation of the target steroid hormones in both water and wastewater matrices. Internationally, the observed sensitivity, based on LOQ (ng/L), for the steroid estrogens E1, E2 and EE2 is, in decreasing order: LC-MSMS (0.08-9.54) > GC-MS (1) > ELISA (5) (chemical-analytical > immuno-analytical).
At the national level, the routine, unoptimized chemical-analytical LC-MSMS method was found to lack the required sensitivity for meeting environmental requirements for steroid hormone quantitation. Further optimization of the sensitivity of the chemical-analytical LC-tandem mass spectrometry methods, especially for wastewater screening, in South Africa is required. Risk assessment studies showed that it was not practical to propose standards or allowable limits for the steroid estrogens E1, E2, EE2, and E3; the use of predicted-no-effect concentration values of the steroid estrogens appears to be appropriate for use in their risk assessment in relation to aquatic organisms. For raw water sources, drinking water, raw and treated wastewater, the use of bioassays, with trigger values, is a useful screening tool option to decide whether further examination of specific endocrine activity may be warranted, or whether concentrations of such activity are of low priority, with respect to health concerns in the human population. The achievement of improved quantitation limits for immuno-analytical methods, like ELISA, used for compound quantitation, and standardization of the method for measuring E2 equivalents (EEQs) used for biological activity (endocrine: e.g., estrogenic) are some areas for future EDC research.
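    The accuracy and precision figures quoted for the ELISA method (mean recovery, RSD) come from spike-recovery experiments on replicates. A minimal sketch with hypothetical replicate values for a 20 ng/L estradiol spike:

```python
from statistics import mean, stdev

def recovery_and_rsd(measured, spiked):
    """Spike-recovery check for an immunoassay: mean recovery (%) of a
    known added concentration, and the relative standard deviation (%)."""
    recoveries = [100.0 * m / spiked for m in measured]
    return mean(recoveries), 100.0 * stdev(measured) / mean(measured)

# Hypothetical ELISA replicates (ng/L) for a 20 ng/L estradiol spike
measured = [19.1, 20.3, 18.7, 19.8, 18.9]
rec, rsd = recovery_and_rsd(measured, 20.0)
```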

  7. A validation of event-related FMRI comparisons between users of cocaine, nicotine, or cannabis and control subjects.

    PubMed

    Murphy, Kevin; Dixon, Veronica; LaGrave, Kathleen; Kaufman, Jacqueline; Risinger, Robert; Bloom, Alan; Garavan, Hugh

    2006-07-01

    Noninvasive brain imaging techniques are a powerful tool for researching the effects of drug abuse on brain activation measures. However, because many drugs have direct vascular effects, the validity of techniques that depend on blood flow measures as a reflection of neuronal activity may be called into question. This may be of particular concern in event-related functional magnetic resonance imaging (fMRI), where current analytic techniques search for a specific shape in the hemodynamic response to neuronal activity. To investigate possible alterations in task-related activation as a result of drug abuse, fMRI scans were conducted on subjects in four groups as they performed a simple event-related finger-tapping task: users of cocaine, nicotine, or cannabis and control subjects. Activation measures, as determined by two different analytic methods, did not differ between the groups. A comparison between an intravenous saline and an intravenous cocaine condition in cocaine users found a similar null result. Further in-depth analyses of the shape of the hemodynamic responses in each group also showed no differences. This study demonstrates that drug groups may be compared with control subjects using event-related fMRI without the need for any post hoc procedures to correct for possible drug-induced cardiovascular alterations. Thus, fMRI activation differences reported between these drug groups can be more confidently interpreted as reflecting neuronal differences.
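    The "specific shape" that event-related fMRI analyses search for is typically a canonical double-gamma hemodynamic response convolved with the event train. A sketch using the commonly cited default shape parameters (peak near 5 s, late undershoot); the event timings are illustrative:

```python
import math

def gamma_pdf(t, shape, scale):
    """Gamma probability density, zero for t <= 0."""
    if t <= 0.0:
        return 0.0
    return (t ** (shape - 1) * math.exp(-t / scale)
            / (math.gamma(shape) * scale ** shape))

def canonical_hrf(t):
    """Double-gamma hemodynamic response: positive lobe peaking near 5 s,
    undershoot scaled by 1/6 (commonly used default parameters)."""
    return gamma_pdf(t, 6.0, 1.0) - gamma_pdf(t, 16.0, 1.0) / 6.0

def predicted_bold(events, duration, dt=0.1):
    """Convolve a train of event onsets (s) with the canonical HRF --
    the regressor shape that event-related fMRI analyses fit to the data."""
    n = int(duration / dt)
    out = [0.0] * n
    for onset in events:
        for i in range(n):
            out[i] += canonical_hrf(i * dt - onset)
    return out

# Two finger-tapping events at 2 s and 14 s in a 30 s window
bold = predicted_bold([2.0, 14.0], 30.0)
```

Drug-induced changes to this shape are exactly what the study checked for (and did not find) across the user groups.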

  8. The Preliminary Examination of Organics in the Returned Stardust Samples from Comet Wild 2

    NASA Technical Reports Server (NTRS)

    Sandford, S. A.; Aleon, J.; Alexander, C.; Butterworth, A.; Clemett, S. J.; Cody, G.; Cooper, G.; Dworkin, J. P.; Flynn, G. J.; Gilles, M. K.

    2006-01-01

    The primary objective of STARDUST is to collect coma samples from comet 81P/Wild 2. These samples were collected by impact onto aerogel tiles on Jan 2, 2004 when the spacecraft flew through the comet's coma at a relative velocity of about 6.1 km/sec. Measurements of dust impacts on the front of the spacecraft suggest that the aerogel particle collector was impacted by 2800 +/- 500 particles larger than 15 micron in diameter. Following recovery of the Sample Return Capsule (SRC) on Jan 15, 2006, the aerogel collector trays will be removed in a clean room at JSC. After documentation of the collection, selected aerogel tiles will be removed and aerogel and cometary samples will be extracted for study. A number of different extraction techniques will be used, each optimized for the analytical technique that is to be used. The STARDUST Mission will carry out a 6 month preliminary examination (PE) of a small portion of the returned samples. The examination of the samples will be made by a number of subteams that will concentrate on specific aspects of the samples. One of these is the Organics PE Team (see the author list above for team members). These team members will use a number of analytical techniques to produce a preliminary characterization of the abundance and nature of the organics (if any) in the returned samples.

  9. Gaussian closure technique applied to the hysteretic Bouc model with non-zero mean white noise excitation

    NASA Astrophysics Data System (ADS)

    Waubke, Holger; Kasess, Christian H.

    2016-11-01

    Devices that emit structure-borne sound are commonly decoupled by elastic components to shield the environment from acoustical noise and vibrations. The elastic elements often have a hysteretic behavior that is typically neglected. In order to take hysteretic behavior into account, Bouc developed a differential equation for such materials, especially joints made of rubber or equipped with dampers. In this work, the Bouc model is solved by means of the Gaussian closure technique based on the Kolmogorov equation. Kolmogorov developed a method to derive probability density functions for arbitrary explicit first-order vector differential equations under white noise excitation, using a partial differential equation of a multivariate conditional probability distribution. Up to now, no analytical solution of the Kolmogorov equation in conjunction with the Bouc model exists. Therefore, a wide range of approximate solutions, especially statistical linearization, have been developed. Using the Gaussian closure technique, an approximation to the Kolmogorov equation that assumes a multivariate Gaussian distribution, an analytic solution is derived in this paper for the Bouc model. For the stationary case the two methods yield equivalent results; however, in contrast to statistical linearization, the presented solution allows the transient behavior to be calculated explicitly. Further, the stationary case leads to an implicit set of equations that can be solved iteratively, with a small number of iterations and without instabilities for specific parameter sets.
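    The transient behavior the paper derives analytically can be cross-checked by direct Monte-Carlo simulation of the hysteretic system, e.g. a single-degree-of-freedom oscillator with a Bouc restoring force under nonzero-mean white noise, integrated with an Euler-Maruyama scheme. All parameter values below are illustrative, not taken from the paper:

```python
import math, random

def simulate_bouc(T=50.0, dt=1e-3, seed=1):
    """Euler-Maruyama simulation of an SDOF oscillator with a Bouc
    hysteretic restoring force under nonzero-mean white noise.
    (A Monte-Carlo cross-check on moment methods; parameters illustrative.)"""
    random.seed(seed)
    m, c, k, alpha = 1.0, 0.5, 1.0, 0.5   # mass, damping, stiffness, rigidity ratio
    A, beta, gamma, n = 1.0, 0.5, 0.5, 1  # Bouc hysteresis parameters
    mu, S0 = 1.0, 0.01                    # excitation mean and spectral intensity
    x = v = z = 0.0
    xs = []
    for _ in range(round(T / dt)):
        # white noise sample with mean mu and two-sided PSD S0
        w = mu + math.sqrt(2.0 * math.pi * S0 / dt) * random.gauss(0.0, 1.0)
        dz = A * v - beta * abs(v) * abs(z) ** (n - 1) * z - gamma * v * abs(z) ** n
        acc = (w - c * v - alpha * k * x - (1.0 - alpha) * k * z) / m
        x, v, z = x + v * dt, v + acc * dt, z + dz * dt
        xs.append(x)
    return xs

xs = simulate_bouc()
tail = xs[len(xs) // 2:]        # discard the transient half
mean_x = sum(tail) / len(tail)  # estimate of the stationary mean displacement
```

Ensemble averages over many such runs would give the transient mean and variance trajectories that a Gaussian closure solution predicts in closed form.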

  10. Quantification issues of trace metal contaminants on silicon wafers by means of TOF-SIMS, ICP-MS, and TXRF

    NASA Astrophysics Data System (ADS)

    Rostam-Khani, P.; Hopstaken, M. J. P.; Vullings, P.; Noij, G.; O'Halloran, O.; Claassen, W.

    2004-06-01

    Measurement of surface metal contamination on silicon wafers is essential for yield enhancement in IC manufacturing. Vapor phase decomposition coupled with either inductively coupled plasma mass spectrometry (VPD-ICP-MS) or total reflection X-ray fluorescence (VPD-TXRF), direct TXRF, and more recently time-of-flight secondary ion mass spectrometry (TOF-SIMS) are used to monitor surface metal contamination. These techniques complement each other in their respective strengths and weaknesses. For reliable and accurate quantification, so-called relative sensitivity factors (RSFs) are required for TOF-SIMS analysis. For quantification purposes in VPD, the collection efficiency (CE) is important to ensure complete collection of contamination. A standard procedure has been developed that combines the determination of these RSFs with that of the collection efficiency, using all the analytical techniques mentioned above. Sample wafers were intentionally contaminated and analyzed by TOF-SIMS directly after preparation. After VPD-ICP-MS, several scanned surfaces were analyzed again by TOF-SIMS. Comparing the intensities of the specific metals before and after the VPD-DC procedure on the scanned surface allows the determination of the so-called removal efficiency (RE). In general, very good agreement was obtained among the four analytical techniques after updating the RSFs for TOF-SIMS. Progress has been achieved concerning the CE evaluation, as well as in determining the RSFs more precisely for TOF-SIMS.
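    RSF-based TOF-SIMS quantification follows the simple proportionality C = RSF × I_analyte / I_reference. A minimal sketch with hypothetical intensities and an assumed RSF value:

```python
def concentration_from_rsf(i_analyte, i_reference, rsf):
    """TOF-SIMS quantification: surface concentration (atoms/cm^2) from the
    analyte and reference (matrix) secondary-ion intensities and an
    empirically determined relative sensitivity factor (RSF)."""
    return rsf * i_analyte / i_reference

# Hypothetical: an Fe signal of 1.5e3 counts against a 1e6-count Si matrix
# reference, with an RSF of 4e13 atoms/cm^2 from spiked standard wafers
c = concentration_from_rsf(1.5e3, 1.0e6, 4.0e13)
```

Updating the RSFs, as described above, amounts to re-fitting this proportionality against the VPD-ICP-MS reference values.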

  11. Accuracy of trace element determinations in alternate fuels

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1980-01-01

    NASA-Lewis Research Center's work on the accurate measurement of trace levels of metals in various fuels is presented. The differences between laboratories and between analytical techniques, especially for concentrations below 10 ppm, are discussed, detailing the atomic absorption spectrometry (AAS) and DC arc emission spectrometry (dc arc) techniques used by NASA-Lewis. Also presented is the design of an interlaboratory study which considers the following factors: laboratory, analytical technique, fuel type, concentration and ashing additive.

  12. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  13. Product identification techniques used as training aids for analytical chemists

    NASA Technical Reports Server (NTRS)

    Grillo, J. P.

    1968-01-01

    Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.

  14. Kinematic synthesis of adjustable robotic mechanisms

    NASA Astrophysics Data System (ADS)

    Chuenchom, Thatchai

    1993-01-01

    Conventional hard automation, such as a linkage-based or a cam-driven system, provides high speed capability and repeatability but not the flexibility required in many industrial applications. Conventional mechanisms, which are typically single-degree-of-freedom systems, are increasingly being replaced by multi-degree-of-freedom, multi-actuator systems driven by logic controllers. Although this new trend in sophistication provides greatly enhanced flexibility, there are many instances where the flexibility needs are exaggerated and the associated complexity is unnecessary. Traditional mechanism-based hard automation, on the other hand, can neither fulfill multi-task requirements nor be cost-effective, mainly due to a lack of methods and tools for designing in flexibility. This dissertation attempts to bridge this technological gap by developing Adjustable Robotic Mechanisms (ARM's) or 'programmable mechanisms' as a middle ground between high speed hard automation and expensive serial jointed-arm robots. This research introduces the concept of adjustable robotic mechanisms towards cost-effective manufacturing automation. A generalized analytical synthesis technique has been developed to support the computational design of ARM's that lays the theoretical foundation for synthesis of adjustable mechanisms. The synthesis method developed in this dissertation, called generalized adjustable dyad and triad synthesis, advances the well-known Burmester theory in kinematics to a new level. While this method provides planar solutions, a novel patented scheme is utilized for converting prescribed three-dimensional motion specifications into sets of planar projections. This provides an analytical and a computational tool for designing adjustable mechanisms that satisfy multiple sets of three-dimensional motion specifications. Several design issues were addressed, including adjustable parameter identification, branching defect, and mechanical errors. 
An efficient mathematical scheme for identification of the adjustable member was also developed. The analytical synthesis techniques developed in this dissertation were successfully implemented in a graphics-intensive, user-friendly computer program. A physical prototype of a general-purpose adjustable robotic mechanism has been constructed to serve as a proof-of-concept model.

  15. Employing socially driven techniques for framing, contextualization, and collaboration in complex analytical threads

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin

    2015-05-01

    The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. A historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data-overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage operators' natural attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is the gap between military applications and their functions, and the functions and capabilities afforded by cutting-edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens to a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net-generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. 
Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.

  16. A fast method for detecting Cryptosporidium parvum oocysts in real world samples

    NASA Astrophysics Data System (ADS)

    Stewart, Shona; McClelland, Lindy; Maier, John

    2005-04-01

    Contamination of drinking water with pathogenic microorganisms such as Cryptosporidium has become an increasing concern in recent years. Cryptosporidium oocysts are particularly problematic, as infections caused by this organism can be life threatening in immunocompromised patients. Current methods for monitoring and analyzing water are often laborious and must be conducted by experts. In addition, many of the techniques require very specific reagents. These factors add considerable cost and time to the analytical process. Raman spectroscopy provides specific molecular information on samples, and offers advantages of speed, sensitivity, and low cost over current methods of water monitoring. Raman spectroscopy is an optical method that has demonstrated the capability to identify and differentiate microorganisms at the species and strain levels. In addition, this technique has exhibited sensitivities down to the single-organism detection limit. We have employed Raman spectroscopy and Raman Chemical Imaging, in conjunction with chemometric techniques, to detect small numbers of oocysts in the presence of interferents derived from real-world water samples. Our investigations have also indicated that Raman Chemical Imaging may provide chemical and physiological information about an oocyst sample which complements information provided by the traditional methods. This work provides evidence that Raman imaging is a useful technique for consideration in the water quality industry.

  17. Bench-top validation testing of selected immunological and molecular Renibacterium salmoninarum diagnostic assays by comparison with quantitative bacteriological culture

    USGS Publications Warehouse

    Elliott, D.G.; Applegate, L.J.; Murray, A.L.; Purcell, M.K.; McKibben, C.L.

    2013-01-01

    No gold standard assay exhibiting error-free classification of results has been identified for detection of Renibacterium salmoninarum, the causative agent of salmonid bacterial kidney disease. Validation of diagnostic assays for R. salmoninarum has been hindered by its unique characteristics and biology, and difficulties in locating suitable populations of reference test animals. Infection status of fish in test populations is often unknown, and it is commonly assumed that the assay yielding the most positive results has the highest diagnostic accuracy, without consideration of misclassification of results. In this research, quantification of R. salmoninarum in samples by bacteriological culture provided a standardized measure of viable bacteria to evaluate analytical performance characteristics (sensitivity, specificity and repeatability) of non-culture assays in three matrices (phosphate-buffered saline, ovarian fluid and kidney tissue). Non-culture assays included polyclonal enzyme-linked immunosorbent assay (ELISA), direct smear fluorescent antibody technique (FAT), membrane-filtration FAT, nested polymerase chain reaction (nested PCR) and three real-time quantitative PCR assays. Injection challenge of specific pathogen-free Chinook salmon, Oncorhynchus tshawytscha (Walbaum), with R. salmoninarum was used to estimate diagnostic sensitivity and specificity. Results did not identify a single assay demonstrating the highest analytical and diagnostic performance characteristics, but revealed strengths and weaknesses of each test.
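    The diagnostic sensitivity and specificity estimated in the study above are standard ratios over a 2x2 table of assay results versus true infection status. A minimal sketch follows; the counts are hypothetical and do not come from the study.

```python
# Diagnostic performance measures from a 2x2 table: true/false positives
# among challenged (infected) fish, true/false negatives among controls.

def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: proportion of infected fish testing positive."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: proportion of uninfected fish testing negative."""
    return tn / (tn + fp)

# Hypothetical ELISA results on 100 challenged and 100 unchallenged fish:
sens = sensitivity(tp=88, fn=12)   # 0.88
spec = specificity(tn=97, fp=3)    # 0.97
print(sens, spec)
```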

  18. Mechanisms of Nanophase-Induced Desorption in LDI-MS. A Short Review

    PubMed Central

    Picca, Rosaria Anna; Calvano, Cosima Damiana; Cioffi, Nicola; Palmisano, Francesco

    2017-01-01

    Nanomaterials are frequently used in laser desorption ionization mass spectrometry (LDI-MS) as DI enhancers, providing excellent figures of merit for the analysis of low-molecular-weight organic molecules. In recent years, literature on this topic has benefited from several studies assessing the fundamental aspects of the ion desorption efficiency and the internal energy transfer, in the case of model analytes. Several different parameters have been investigated, including the intrinsic chemical and physical properties of the nanophase (chemical composition, thermal conductivity, photo-absorption efficiency, specific heat capacity, phase transition point, explosion threshold, etc.), along with morphological parameters such as the nanophase size, shape, and interparticle distance. Other aspects, such as the composition, roughness, and defects of the substrate supporting the LDI-active nanophases, the nanophase binding affinity towards the target analyte, and the role of water molecules, have been taken into account as well. Readers interested in nanoparticle-based LDI-MS sub-techniques (SALDI-, SELDI-, NALDI-MS) will find here a concise overview of the recent findings in the specialized field of fundamental and mechanistic studies, shedding light on the desorption ionization phenomena responsible for the outperforming MS data offered by these techniques. PMID:28368330

  19. Self-Powered Wireless Affinity-Based Biosensor Based on Integration of Paper-Based Microfluidics and Self-Assembled RFID Antennas.

    PubMed

    Yuan, Mingquan; Alocilja, Evangelyn C; Chakrabartty, Shantanu

    2016-08-01

    This paper presents a wireless, self-powered, affinity-based biosensor built on the integration of paper-based microfluidics with our previously reported method for self-assembling radio-frequency (RF) antennas. At the core of the proposed approach is a silver-enhancement technique that grows portions of an RF antenna in regions where target antigens hybridize with target-specific affinity probes. The hybridization regions are defined by a network of nitrocellulose-based microfluidic channels which implement a self-powered approach to sample the reagent and control its flow and mixing. The integration substrate for the biosensor has been constructed using polyethylene, and the patterning of the antenna on the substrate has been achieved using a low-cost ink-jet printing technique. The substrate has been integrated with passive radio-frequency identification (RFID) tags to demonstrate that the resulting sensor-tag can be used for continuous monitoring in a food supply chain, where direct measurement of analytes is typically considered to be impractical. We validate the proof-of-concept operation of the proposed sensor-tag using IgG as a model analyte and a 915 MHz ultra-high-frequency (UHF) RFID tagging technology.

  20. Translational research in pediatrics III: bronchoalveolar lavage.

    PubMed

    Radhakrishnan, Dhenuka; Yamashita, Cory; Gillio-Meina, Carolina; Fraser, Douglas D

    2014-07-01

    The role of flexible bronchoscopy and bronchoalveolar lavage (BAL) for the care of children with airway and pulmonary diseases is well established, with collected BAL fluid most often used clinically for microbiologic pathogen identification and cellular analyses. More recently, powerful analytic research methods have been used to investigate BAL samples to better understand the pathophysiological basis of pediatric respiratory disease. Investigations have focused on the cellular components contained in BAL fluid, such as macrophages, lymphocytes, neutrophils, eosinophils, and mast cells, as well as the noncellular components such as serum molecules, inflammatory proteins, and surfactant. Molecular techniques are frequently used to investigate BAL fluid for the presence of infectious pathologies and for cellular gene expression. Recent advances in proteomics allow identification of multiple protein expression patterns linked to specific respiratory diseases, whereas newer analytic techniques allow for investigations on surfactant quantification and function. These translational research studies on BAL fluid have aided our understanding of pulmonary inflammation and the injury/repair responses in children. We review the ethics and practices for the execution of BAL in children for translational research purposes, with an emphasis on the optimal handling and processing of BAL samples. Copyright © 2014 by the American Academy of Pediatrics.

  1. From Theory Use to Theory Building in Learning Analytics: A Commentary on "Learning Analytics to Support Teachers during Synchronous CSCL"

    ERIC Educational Resources Information Center

    Chen, Bodong

    2015-01-01

    In this commentary on Van Leeuwen (2015, this issue), I explore the relation between theory and practice in learning analytics. Specifically, I caution against adhering to one specific theoretical doctrine while ignoring others, suggest deeper applications of cognitive load theory to understanding teaching with analytics tools, and comment on…

  2. Identification of species origin of meat and meat products on the DNA basis: a review.

    PubMed

    Kumar, Arun; Kumar, Rajiv Ranjan; Sharma, Brahm Deo; Gokulakrishnan, Palanisamy; Mendiratta, Sanjod Kumar; Sharma, Deepak

    2015-01-01

    The adulteration/substitution of meat has always been a concern for various reasons, such as public health, religious factors, wholesomeness, and unhealthy competition in the meat market. Consumers should be protected from these malicious practices by quick, precise, and specific identification of meat animal species. Several analytical methodologies have been employed for meat speciation based on anatomical, histological, microscopic, organoleptic, chemical, electrophoretic, chromatographic, or immunological principles. However, by virtue of their inherent limitations, most of these techniques have been replaced by recent DNA-based molecular techniques. In recent decades, several methods based on the polymerase chain reaction have been proposed as useful means for identifying the species origin of meat and meat products, due to their high specificity and sensitivity, as well as rapid processing time and low cost. This review intends to provide an updated and extensive overview of the DNA-based methods for species identification in meat and meat products.

  3. Cellular Oxygen and Nutrient Sensing in Microgravity Using Time-Resolved Fluorescence Microscopy

    NASA Technical Reports Server (NTRS)

    Szmacinski, Henryk

    2003-01-01

    Oxygen and nutrient sensing is fundamental to the understanding of cell growth and metabolism. This requires identification of optical probes and suitable detection technology without complex calibration procedures. Under this project, Microcosm developed an experimental technique that allows for simultaneous imaging of intra- and inter-cellular events. The technique consists of frequency-domain Fluorescence Lifetime Imaging Microscopy (FLIM), a set of identified oxygen and pH probes, and methods for fabrication of microsensors. Specifications for electronic and optical components of the FLIM instrumentation are provided. Hardware and software were developed for data acquisition and analysis. Principles, procedures, and representative images are demonstrated. Suitable lifetime-sensitive oxygen, pH, and glucose probes for intra- and extra-cellular measurements of analyte concentrations have been identified and tested. Lifetime sensing and imaging have been performed using PBS buffer, culture media, and yeast cells as model systems. Spectral specifications, calibration curves, and probe availability are also provided in the report.

  4. Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakkila, E.A.

    1978-10-01

    Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.

  5. Independent Research and Independent Exploratory Development Annual Report Fiscal Year 1975

    DTIC Science & Technology

    1975-09-01

    [Abstract text is garbled in the source record. Legible fragments reference an optical covert communications study using laser transceivers, and a publication by Wagner, N. K., "Analysis of Microelectronic Materials Using Auger Spectroscopy and Additional Advanced Analytical Techniques" (NELC Technical Note 2904).]

  6. Quantitative measurement of solvation shells using frequency modulated atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Uchihashi, T.; Higgins, M.; Nakayama, Y.; Sader, J. E.; Jarvis, S. P.

    2005-03-01

    The nanoscale specificity of interaction measurements and additional imaging capability of the atomic force microscope make it an ideal technique for measuring solvation shells in a variety of liquids next to a range of materials. Unfortunately, the widespread use of atomic force microscopy for the measurement of solvation shells has been limited by uncertainties over the dimensions, composition and durability of the tip during the measurements, and problems associated with quantitative force calibration of the most sensitive dynamic measurement techniques. We address both these issues by the combined use of carbon nanotube high aspect ratio probes and quantifying the highly sensitive frequency modulation (FM) detection technique using a recently developed analytical method. Due to the excellent reproducibility of the measurement technique, additional information regarding solvation shell size as a function of proximity to the surface has been obtained for two very different liquids. Further, it has been possible to identify differences between chemical and geometrical effects in the chosen systems.

  7. A novel attack method about double-random-phase-encoding-based image hiding method

    NASA Astrophysics Data System (ADS)

    Xu, Hongsheng; Xiao, Zhijun; Zhu, Xianchen

    2018-03-01

    By using optical image processing techniques, a novel text encryption and hiding method based on the double-random phase-encoding technique is proposed in this paper. The first step is that the secret message is transformed into a 2-dimensional array. The higher bits of the elements in the array are used to hold the bit stream of the secret text, while the lower bits store specific values. Then, the transformed array is encoded by the double random phase encoding technique. Finally, the encoded array is embedded in a public host image to obtain the image embedded with hidden text. The performance of the proposed technique is tested via analytical modeling and a test data stream. Experimental results show that the secret text can be recovered either accurately or almost accurately, while maintaining the quality of the host image embedded with hidden data, by properly selecting the method of transforming the secret text into an array and the superimposition coefficient.

  8. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft's aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.

  9. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  10. Analytical Parameters of an Amperometric Glucose Biosensor for Fast Analysis in Food Samples.

    PubMed

    Artigues, Margalida; Abellà, Jordi; Colominas, Sergi

    2017-11-14

    Amperometric biosensors based on the use of glucose oxidase (GOx) are able to combine the robustness of electrochemical techniques with the specificity of biological recognition processes. However, very little information can be found in the literature about the fundamental analytical parameters of these sensors. In this work, the analytical behavior of an amperometric biosensor based on the immobilization of GOx using a hydrogel (chitosan) onto highly ordered titanium dioxide nanotube arrays (TiO₂NTAs) has been evaluated. The GOx-Chitosan/TiO₂NTAs biosensor showed a sensitivity of 5.46 μA·mM⁻¹ with a linear range from 0.3 to 1.5 mM; its fundamental analytical parameters were studied using a commercial soft drink. The obtained results proved sufficient repeatability (RSD = 1.9%), reproducibility (RSD = 2.5%), accuracy (95-105% recovery), and robustness (RSD = 3.3%). Furthermore, no significant interferences from fructose, ascorbic acid, and citric acid were obtained. In addition, the storage stability was examined; after 30 days, the GOx-Chitosan/TiO₂NTAs biosensor retained 85% of its initial current response. Finally, the glucose content of different food samples was measured using the biosensor and compared with the respective HPLC values. In the worst case, a deviation smaller than 10% was obtained among the 20 samples evaluated.
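    As an illustration of how the reported calibration figures would be used, the sketch below converts a measured current into a glucose concentration via the stated sensitivity (5.46 μA·mM⁻¹) and checks the stated linear range. The zero intercept is an assumption made here for illustration; the record does not give the calibration intercept.

```python
# Linear amperometric calibration: concentration = current / sensitivity,
# valid only inside the reported linear range. Intercept assumed to be zero.

SENSITIVITY_UA_PER_MM = 5.46   # uA per mM glucose (reported sensitivity)
LINEAR_RANGE_MM = (0.3, 1.5)   # reported linear range, mM

def glucose_mm(current_ua: float) -> float:
    """Glucose concentration (mM) from the biosensor current (uA)."""
    conc = current_ua / SENSITIVITY_UA_PER_MM
    lo, hi = LINEAR_RANGE_MM
    if not (lo <= conc <= hi):
        raise ValueError(f"{conc:.2f} mM is outside the linear range {lo}-{hi} mM")
    return conc

print(f"{glucose_mm(5.46):.2f} mM")  # 1.00 mM
```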

  11. A general, cryogenically-based analytical technique for the determination of trace quantities of volatile organic compounds in the atmosphere

    NASA Technical Reports Server (NTRS)

    Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.

    1985-01-01

    An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.

  12. Effective Teaching, Effective Living: A Review of Behavior Analysis for Effective Teaching by Julie S. Vargas

    PubMed Central

    Austin, Jennifer L; Soeda, Jennifer M

    2009-01-01

    Elevated academic standards and expectations, along with competing contingencies outside the classroom, have given the teaching profession new and demanding challenges with which to contend. Although previous textbooks have addressed behavior analytic techniques specifically directed for the classroom environment, few have done so with comprehensive overviews of both instructional and basic behavioral strategies from a teacher's perspective. This review describes Julie Vargas' book in terms of its contribution to the educational field, as well as its impressive style for reaching its target audience.

  13. Nonlinear crack analysis with finite elements

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.; Saleme, E.; Pifko, A.; Levine, H. S.

    1973-01-01

    The application of finite element techniques to the analytic representation of the nonlinear behavior of arbitrary two-dimensional bodies containing cracks is discussed. Specific methods are proposed with which it should be possible to obtain information concerning: the description of the maximum, minimum, and residual near-tip stress and strain fields; the effects of crack closure on the near-tip behavior of stress and strain fields during cyclic loading into the plastic range; the stress-strain and displacement field behavior associated with a nonstationary crack; and the effects of large rotation near the crack tip.

  14. Airborne chemistry: acoustic levitation in chemical analysis.

    PubMed

    Santesson, Sabina; Nilsson, Staffan

    2004-04-01

    This review, with 60 references, describes a unique path to miniaturisation, that is, the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitating small volumes of sample avoids solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 μL volume range, and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right-angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.

  15. Analytical applications of microbial fuel cells. Part II: Toxicity, microbial activity and quantification, single analyte detection and other uses.

    PubMed

    Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo

    2015-01-15

    Microbial fuel cells were rediscovered twenty years ago and are now a very active research area. The reasons behind this new activity are the relatively recent discovery of electrogenic or electroactive bacteria and the vision of two important practical applications: wastewater treatment coupled with clean energy production, and power supply systems for isolated low-power sensor devices. Although some analytical applications of MFCs were proposed earlier (such as biochemical oxygen demand sensing), only lately has a myriad of new uses of this technology been presented by research groups around the world, combining both biological-microbiological and electroanalytical expertise. This is the second part of a review of MFC applications in the area of analytical sciences. In Part I, a general introduction to biologically based analytical methods, including bioassays, biosensors, MFC design, and operating principles, as well as perhaps the main and earliest presented application, the use as a BOD sensor, was reviewed. In Part II, other proposed uses are presented and discussed. As with other microbially based analytical systems, MFCs are satisfactory systems to measure and integrate complex parameters that are difficult or impossible to measure otherwise, such as water toxicity (where the toxic effect on aquatic organisms needs to be integrated). We explore here the methods proposed to measure toxicity, microbial metabolism, and, being of special interest to space exploration, life sensors. Also, some methods with higher specificity, proposed to detect a single analyte, are presented. Different possibilities to increase selectivity and sensitivity by using molecular biology or other modern techniques are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. A Literature Survey and Experimental Evaluation of the State-of-the-Art in Uplift Modeling: A Stepping Stone Toward the Development of Prescriptive Analytics.

    PubMed

    Devriendt, Floris; Moldovan, Darie; Verbeke, Wouter

    2018-03-01

    Prescriptive analytics extends predictive analytics by estimating an outcome as a function of control variables, thereby making it possible to establish the level of control variables required to realize a desired outcome. Uplift modeling is at the heart of prescriptive analytics and aims at estimating the net difference in an outcome resulting from a specific action or treatment. In this article, a structured and detailed literature survey on uplift modeling is provided by identifying and contrasting various groups of approaches. In addition, evaluation metrics for assessing the performance of uplift models are reviewed. An experimental evaluation on four real-world data sets provides further insight into their use. Uplift random forests are found to be consistently among the best-performing techniques in terms of the Qini and Gini measures, although considerable variability in performance across the various data sets is observed. In addition, uplift models are frequently observed to be unstable, displaying strong variability in performance across different folds in the cross-validation setup, which potentially threatens their actual use for business applications. Moreover, the available evaluation metrics are found not to provide an intuitively understandable indication of a model's actual use and performance. Specifically, existing metrics do not facilitate a comparison of uplift models and predictive models, and they evaluate performance either at an arbitrary cutoff or over the full spectrum of potential cutoffs. In conclusion, we highlight the instability of uplift models and the need for an application-oriented approach to assessing uplift models as prime topics for further research.
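    The Qini measure used for evaluation in this survey is built from a cumulative incremental-gains curve: customers are ranked by predicted uplift, and at each cutoff the control responders are scaled to the treated group size and subtracted from the treated responders. A minimal sketch of that curve (the helper name and toy data are ours, not the paper's):

    ```python
    import numpy as np

    def qini_curve(uplift_scores, treated, outcome):
        """Cumulative incremental gains, targeting in descending order of
        predicted uplift (hypothetical helper; common Qini definition)."""
        order = np.argsort(-uplift_scores)
        t, y = treated[order], outcome[order]
        n_t = np.cumsum(t)              # treated units seen so far
        n_c = np.cumsum(1 - t)          # control units seen so far
        r_t = np.cumsum(y * t)          # treated responders so far
        r_c = np.cumsum(y * (1 - t))    # control responders so far
        # Gain: treated responders minus control responders rescaled
        # to the treated group size at each cutoff.
        with np.errstate(divide="ignore", invalid="ignore"):
            gain = r_t - np.where(n_c > 0, r_c * n_t / n_c, 0.0)
        return gain

    scores = np.array([0.9, 0.7, 0.4, 0.2])
    treated = np.array([1, 0, 1, 0])
    outcome = np.array([1, 0, 0, 1])
    print(qini_curve(scores, treated, outcome))  # [1. 1. 1. 0.]
    ```

    The area between this curve and the random-targeting diagonal gives the Qini coefficient reported in uplift benchmarks.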

  17. Detection of malignant lesions in vivo in the upper gastrointestinal tract using image-guided Raman endoscopy

    NASA Astrophysics Data System (ADS)

    Bergholt, Mads Sylvest; Zheng, Wei; Lin, Kan; Ho, Khek Yu; Yeoh, Khay Guan; Teh, Ming; So, Jimmy Bok Yan; Huang, Zhiwei

    2012-01-01

    Raman spectroscopy is a vibrational analytic technique sensitive to the changes in biomolecular composition and conformation occurring in tissue. With our most recent development of near-infrared (NIR) Raman endoscopy integrated with diagnostic algorithms, in vivo real-time Raman diagnostics has been realized under multimodal wide-field imaging (i.e., white-light reflectance (WLR), narrow-band imaging (NBI), and autofluorescence imaging (AFI)) modalities. A selection of 177 patients who previously underwent Raman endoscopy (n=2510 spectra) was used to render two robust models based on partial least squares - discriminant analysis (PLS-DA) for esophageal and gastric cancer diagnosis. The Raman endoscopy technique was validated prospectively on four new gastric and esophageal patients for in vivo tissue diagnosis. The technique could identify esophageal cancer in vivo with a sensitivity of 88.9% (8/9) and specificity of 100.0% (11/11), and gastric cancer with a sensitivity of 77.8% (14/18) and specificity of 100.0% (13/13). This study realizes for the first time image-guided Raman endoscopy for real-time in vivo diagnosis of malignancies in the esophagus and stomach at the biomolecular level.
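    The sensitivity and specificity figures quoted above follow directly from the confusion counts given in parentheses. A minimal illustration (the function name is ours):

    ```python
    def sensitivity_specificity(tp, fn, tn, fp):
        """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
        return tp / (tp + fn), tn / (tn + fp)

    # Esophageal figures from the abstract: 8 of 9 cancers detected,
    # 11 of 11 benign tissues correctly ruled out.
    sens, spec = sensitivity_specificity(tp=8, fn=1, tn=11, fp=0)
    print(f"{sens:.1%}, {spec:.1%}")  # prints "88.9%, 100.0%"
    ```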

  18. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.
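    The Monte Carlo idea described here, estimating sensitivity from the prediction scatter that a known noise level induces, can be sketched in the simplest univariate case (the paper treats the multiway PARAFAC case; all names and data below are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def monte_carlo_sensitivity(conc, signal, noise_sd, n_trials=2000):
        """Estimate sensitivity as the measurement noise divided by the
        standard deviation it induces in the predicted concentration
        (univariate stand-in for the multiway calculation)."""
        slope = signal @ conc / (conc @ conc)   # least-squares calibration slope
        y_true = slope * 1.0                    # noiseless signal, unit concentration
        preds = (y_true + rng.normal(0.0, noise_sd, n_trials)) / slope
        return noise_sd / preds.std()

    conc = np.array([1.0, 2.0, 3.0, 4.0])
    signal = 2.0 * conc                         # ideal response, slope 2
    sen = monte_carlo_sensitivity(conc, signal, noise_sd=0.05)
    print(sen)                                  # ≈ 2, the calibration slope
    ```

    In the univariate case the estimate simply recovers the calibration slope; the value of the Monte Carlo route is that it generalizes to multiway models where no closed-form sensitivity expression exists.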

  19. Comprehensive two-dimensional gas chromatography and food sensory properties: potential and challenges.

    PubMed

    Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo

    2015-01-01

    Modern omics disciplines dealing with food flavor focus the analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.) by means of a comprehensive, integrated treatment of sample constituents, such as physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit fully this potential. This review focuses on advances in comprehensive two-dimensional gas chromatography and analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.

  20. Implicit motor learning promotes neural efficiency during laparoscopy.

    PubMed

    Zhu, Frank F; Poolton, Jamie M; Wilson, Mark R; Hu, Yong; Maxwell, Jon P; Masters, Rich S W

    2011-09-01

    An understanding of differences in expert and novice neural behavior can inform surgical skills training. Outside the surgical domain, electroencephalographic (EEG) coherence analyses have shown that during motor performance, experts display less coactivation between the verbal-analytic and motor planning regions than their less skilled counterparts. Reduced involvement of verbal-analytic processes suggests greater neural efficiency. The authors tested the utility of an implicit motor learning intervention specifically devised to promote neural efficiency by reducing verbal-analytic involvement in laparoscopic performance. In this study, 18 novices practiced a movement pattern on a laparoscopic trainer with either conscious awareness of the movement pattern (explicit motor learning) or suppressed awareness of the movement pattern (implicit motor learning). In a retention test, movement accuracy was compared between the conditions, and coactivation (EEG coherence) was assessed between the motor planning (Fz) region and both the verbal-analytic (T3) and the visuospatial (T4) cortical regions (T3-Fz and T4-Fz, respectively). Movement accuracy in the conditions was not different in a retention test (P = 0.231). Findings showed that the EEG coherence scores for the T3-Fz regions were lower for the implicit learners than for the explicit learners (P = 0.027), but no differences were apparent for the T4-Fz regions (P = 0.882). Implicit motor learning reduced EEG coactivation between verbal-analytic and motor planning regions, suggesting that verbal-analytic processes were less involved in laparoscopic performance. The findings imply that training techniques that discourage nonessential coactivation during motor performance may provide surgeons with more neural resources with which to manage other aspects of surgery.
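    EEG coherence of the kind analyzed here (T3-Fz, T4-Fz coactivation) is a standard spectral quantity. A toy sketch using `scipy.signal.coherence` on synthetic signals standing in for two electrodes (the sampling rate, signal content, and 8 Hz rhythm are assumptions for illustration):

    ```python
    import numpy as np
    from scipy.signal import coherence

    rng = np.random.default_rng(1)
    fs = 256                                  # sampling rate in Hz (assumed)
    t = np.arange(0, 10, 1 / fs)

    # Two toy "electrode" traces sharing an 8 Hz component plus
    # independent noise, standing in for T3 and Fz recordings.
    shared = np.sin(2 * np.pi * 8 * t)
    t3 = shared + 0.5 * rng.standard_normal(t.size)
    fz = shared + 0.5 * rng.standard_normal(t.size)

    f, cxy = coherence(t3, fz, fs=fs, nperseg=512)
    idx = np.argmin(np.abs(f - 8.0))
    print(cxy[idx])   # close to 1: strong coupling at the shared rhythm
    ```

    Higher coherence between two regions indicates stronger coactivation, which is exactly the quantity the study compares between implicit and explicit learners.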

  1. Culture-Sensitive Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, L.

    2008-01-01

    Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…

  2. Modeling of phonon scattering in n-type nanowire transistors using one-shot analytic continuation technique

    NASA Astrophysics Data System (ADS)

    Bescond, Marc; Li, Changsheng; Mera, Hector; Cavassilas, Nicolas; Lannoo, Michel

    2013-10-01

    We present a one-shot current-conserving approach to model the influence of electron-phonon scattering in nano-transistors using the non-equilibrium Green's function formalism. The approach is based on the lowest order approximation (LOA) to the current and its simplest analytic continuation (LOA+AC). By means of a scaling argument, we show how both LOA and LOA+AC can be easily obtained from the first iteration of the usual self-consistent Born approximation (SCBA) algorithm. Both LOA and LOA+AC are then applied to model n-type silicon nanowire field-effect-transistors and are compared to SCBA current characteristics. In this system, the LOA fails to describe electron-phonon scattering, mainly because of the interactions with acoustic phonons at the band edges. In contrast, the LOA+AC still well approximates the SCBA current characteristics, thus demonstrating the power of analytic continuation techniques. The limits of validity of LOA+AC are also discussed, and more sophisticated and general analytic continuation techniques are suggested for more demanding cases.
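    The simplest analytic continuation of a first-order series resums it as a geometric series. A toy scalar sketch of the LOA vs. LOA+AC idea (a stand-in for the actual Green's-function calculation; the exactly geometric test series is our choice):

    ```python
    def loa(i0, i1):
        """Lowest order approximation: keep the first-order term only."""
        return i0 + i1

    def loa_ac(i0, i1):
        """Simplest analytic continuation of the first-order series,
        resumming it geometrically: I ≈ I0 / (1 - I1/I0)."""
        return i0 / (1.0 - i1 / i0)

    # Toy check against an exactly geometric series I = I0 / (1 - x),
    # where the first-order correction is I1 = I0 * x.
    i0, x = 2.0, 0.4
    i1 = i0 * x
    exact = i0 / (1 - x)
    print(loa(i0, i1), loa_ac(i0, i1), exact)  # 2.8  3.333...  3.333...
    ```

    When the correction is no longer small, the truncated series drifts away from the exact result while the continued form can remain accurate, mirroring the LOA failure and LOA+AC success reported for acoustic-phonon scattering at the band edges.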

  3. An Analytical Technique to Elucidate Field Impurities From Manufacturing Uncertainties of a Double Pancake Type HTS Insert for High Field LTS/HTS NMR Magnets

    PubMed Central

    Hahn, Seung-yong; Ahn, Min Cheol; Bobrov, Emanuel Saul; Bascuñán, Juan; Iwasa, Yukikazu

    2010-01-01

    This paper addresses adverse effects of dimensional uncertainties of an HTS insert assembled with double-pancake (DP) coils on spatial field homogeneity. Each DP coil was wound with Bi2223 tapes having dimensional tolerances more than one order of magnitude larger than those accepted for LTS wires used in conventional NMR magnets. The paper presents: 1) dimensional variations measured in two LTS/HTS NMR magnets, 350 MHz (LH350) and 700 MHz (LH700), both built and operated at the Francis Bitter Magnet Laboratory; and 2) an analytical technique and its application to elucidate the field impurities measured with the two LTS/HTS magnets. Field impurities computed with the analytical model agree quite well with those measured with the two LTS/HTS magnets, demonstrating that this analytical technique is applicable to the design of a DP-assembled HTS insert with improved field homogeneity for a high-field LTS/HTS NMR magnet. PMID:20407595

  4. Methods for geochemical analysis

    USGS Publications Warehouse

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  5. Need total sulfur content? Use chemiluminescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kubala, S.W.; Campbell, D.N.; DiSanzo, F.P.

    Regulations issued by the United States Environmental Protection Agency require petroleum refineries to reduce or control the amount of total sulfur present in their refined products. These legislative requirements have led many refineries to search for online instrumentation that can produce accurate and repeatable total sulfur measurements within allowed levels. Several analytical methods currently exist to measure total sulfur content. They include X-ray fluorescence (XRF), microcoulometry, lead acetate tape, and pyrofluorescence techniques. Sulfur-specific chemiluminescence detection (SSCD) has recently received much attention due to its linearity, selectivity, sensitivity, and equimolar response. However, its use has been largely confined to the area of gas chromatography. This article focuses on the special design considerations and analytical utility of an SSCD system developed to determine total sulfur content in gasoline. The system exhibits excellent linearity and selectivity, the ability to detect low minimum levels, and an equimolar response to various sulfur compounds. 2 figs., 2 tabs.

  6. Quasinormal modes of a strongly coupled nonconformal plasma and approach to criticality

    NASA Astrophysics Data System (ADS)

    Betzios, Panagiotis; Gürsoy, Umut; Järvinen, Matti; Policastro, Giuseppe

    2018-04-01

    We study fluctuations around equilibrium in a class of strongly interacting nonconformal plasmas using holographic techniques. In particular, we calculate the quasinormal mode spectrum of black hole backgrounds that approach Chamblin-Reall plasmas in the IR. In a specific limit, related to the exact linear-dilaton background in string theory, we observe that the plasma approaches criticality and we obtain the quasinormal spectrum analytically. We regulate the critical limit by gluing the IR geometry that corresponds to the nonconformal plasma to a part of AdS space-time in the UV. Near criticality, the spectrum can still be computed analytically and we find two sets of quasinormal modes, related to the IR and UV parts of the geometry. In the critical limit, the quasinormal modes accumulate to form a branch cut in the correlators of the energy-momentum tensor on the real axis of the complex frequency plane.

  7. Evaluation of capillary electrophoresis for in-flight ionic contaminant monitoring of SSF potable water

    NASA Technical Reports Server (NTRS)

    Mudgett, Paul D.; Schultz, John R.; Sauer, Richard L.

    1992-01-01

    Until 1989, ion chromatography (IC) was the baseline technology selected for the Specific Ion Analyzer, an in-flight inorganic water quality monitor being designed for Space Station Freedom. Recent developments in capillary electrophoresis (CE) may offer significant savings of consumables, power consumption, and weight/volume allocation, relative to IC technology. A thorough evaluation of CE's analytical capability, however, is necessary before one of the two techniques is chosen. Unfortunately, analytical methods currently available for inorganic CE are unproven for NASA's target list of anions and cations. Thus, CE electrolyte chemistry and methods to measure the target contaminants must be first identified and optimized. This paper reports the status of a study to evaluate CE's capability with regard to inorganic and carboxylate anions, alkali and alkaline earth cations, and transition metal cations. Preliminary results indicate that CE has an impressive selectivity and trace sensitivity, although considerable methods development remains to be performed.

  8. Closed Loop Requirements and Analysis Management

    NASA Technical Reports Server (NTRS)

    Lamoreaux, Michael; Verhoef, Brett

    2015-01-01

    Effective systems engineering involves the use of analysis in the derivation of requirements and verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during the change evaluation processes, and analyses are leveraged during the design verification process. Recommendations on concept validation case studies are also discussed.

  9. Systematic Assessment of the Hemolysis Index: Pros and Cons.

    PubMed

    Lippi, Giuseppe

    2015-01-01

    Preanalytical quality is as important as the analytical and postanalytical quality in laboratory diagnostics. After decades of visual inspection to establish whether or not a diagnostic sample may be suitable for testing, automated assessment of hemolysis index (HI) has now become available in a large number of laboratory analyzers. Although most national and international guidelines support systematic assessment of sample quality via HI, there is widespread perception that this indication has not been thoughtfully acknowledged. Potential explanations include concern of increased specimen rejection rate, poor harmonization of analytical techniques, lack of standardized units of measure, differences in instrument-specific cutoff, negative impact on throughput, organization and laboratory economics, and lack of a reliable quality control system. Many of these concerns have been addressed. Evidence now supports automated HI in improving quality and patient safety. These will be discussed. © 2015 Elsevier Inc. All rights reserved.

  10. "Reagentless" flow injection determination of ammonia and urea using membrane separation and solid phase basification

    NASA Technical Reports Server (NTRS)

    Akse, J. R.; Thompson, J. O.; Sauer, R. L.; Atwater, J. E.

    1998-01-01

    Flow injection analysis instrumentation and methodology for the determination of ammonia and ammonium ions in aqueous solution are described. Using in-line solid phase basification beds containing crystalline media, the speciation of ammoniacal nitrogen is shifted toward the un-ionized form, which diffuses in the gas phase across a hydrophobic microporous hollow fiber membrane into a pure-water analytical stream. The two streams flow in a countercurrent configuration on opposite sides of the membrane. The neutral pH of the analytical stream promotes the formation of ammonium cations, which are detected by specific conductance. The methodology provides a lower limit of detection of 10 µg/L and a dynamic concentration range spanning three orders of magnitude using a 315-µL sample injection volume. Using immobilized urease to enzymatically promote the hydrolysis of urea to ammonia and carbon dioxide, the technique has been extended to the determination of urea.

  11. Investigating noncovalent squarylium dye-protein interactions by capillary electrophoresis-frontal analysis.

    PubMed

    Yan, Weiying; Colyer, Christa L

    2006-11-24

    Noncovalent interactions between fluorescent probe molecules and protein analyte molecules, which typically occur with great speed and minimal sample handling, form the basis of many high sensitivity analytical techniques. Understanding the nature of these interactions and the composition of the resulting complexes represents an important area of study that can be facilitated by capillary electrophoresis (CE). Specifically, we will present how frontal analysis (FA) and Hummel-Dreyer (HD) methods can be implemented with CE to determine association constants and stoichiometries of noncovalent complexes of the red luminescent squarylium dye Red-1c with bovine serum albumin (BSA) and beta-lactoglobulin A. By adjusting solution conditions, such as pH or ionic strength, it is possible to selectively modify the binding process. As such, conditions for optimal selectivity for labeling reactions can be established by capillary electrophoresis-frontal analysis (CE-FA) investigations.
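    Association constants in CE-FA studies like this are typically extracted by fitting a binding isotherm to the measured free (or bound) fraction. A minimal 1:1-binding sketch on synthetic data (the double-reciprocal linearization, helper name, and Kd value are our illustrative choices, not the paper's):

    ```python
    import numpy as np

    def fit_kd(protein_conc, frac_bound):
        """Estimate a 1:1 dissociation constant from frontal-analysis data
        via the double-reciprocal linearization 1/f = 1 + Kd/[P].
        Hypothetical helper; assumes protein in large excess over dye."""
        x = 1.0 / protein_conc
        y = 1.0 / frac_bound
        slope, intercept = np.polyfit(x, y, 1)
        return slope  # slope of 1/f vs. 1/[P] is Kd

    # Synthetic bound fractions generated with Kd = 5 µM (illustration only)
    P = np.array([1.0, 2.0, 5.0, 10.0, 20.0])   # protein concentration, µM
    f = P / (5.0 + P)                           # 1:1 binding isotherm
    print(fit_kd(P, f))                         # recovers Kd ≈ 5
    ```

    The association constant reported by such experiments is then Ka = 1/Kd; changing pH or ionic strength, as the abstract describes, shifts the fitted constant.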

  12. Electrochemical concentration measurements for multianalyte mixtures in simulated electrorefiner salt

    NASA Astrophysics Data System (ADS)

    Rappleye, Devin Spencer

    The development of electroanalytical techniques in multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has been limited to single-analyte mixtures with a few exceptions. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even with an additional signal from another analyte, correlating signals to concentration and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.)

  13. Analytical methods for gelatin differentiation from bovine and porcine origins and food products.

    PubMed

    Nhari, Raja Mohd Hafidz Raja; Ismail, Amin; Che Man, Yaakob B

    2012-01-01

    The use of gelatin in food products has been widely debated for several years with respect to the source of the gelatin used, religious requirements, and health. As a result, various analytical methods have been introduced and developed to determine whether a gelatin is of porcine or bovine origin. These methods span a diverse range of equipment and techniques, including spectroscopy, chemical precipitation, chromatography, and immunochemical assays. Each technique can differentiate gelatins to a certain extent, with its own advantages and limitations. This review presents an overview of the analytical methods available for differentiating bovine and porcine gelatin, both as such and in food products, so that new methods can be developed. © 2011 Institute of Food Technologists®

  14. An analytical and experimental evaluation of the plano-cylindrical Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. L.; Cosby, R. M.

    1976-01-01

    Plastic Fresnel lenses for solar concentration are attractive because of potential for low-cost mass production. An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range is reported. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based primarily on a 56 cm-wide lens with f-number 1.0. A sun-tracking heliostat provided a non-moving solar source. Measured data indicated more spreading at the profile base than analytically predicted. The measured and computed transmittances were 85 and 87% respectively. Preliminary testing with a second lens (1.85 m) indicated that modified manufacturing techniques corrected the profile spreading problem.

  15. Carbon-carbon mirrors for exoatmospheric and space applications

    NASA Astrophysics Data System (ADS)

    Krumweide, Duane E.; Wonacott, Gary D.; Woida, Patrick M.; Woida, Rigel Q.; Shih, Wei

    2007-09-01

    The cost and lead time associated with beryllium have forced the MDA and other defense agencies to look for alternative materials with similar structural and thermal properties. The use of carbon-carbon material, specifically in optical components, has been demonstrated analytically in prior SBIR work at San Diego Composites. Carbon-carbon material was chosen for its low in-plane and through-thickness CTE (athermal design), high specific stiffness, near-zero coefficient of moisture expansion, availability of material (specifically c-c honeycomb for lightweight substrates), and compatibility with silicon monoxide (SiO) and silicon dioxide (SiO2) coatings. Subsequent development work has produced shaped carbon-carbon sandwich substrates which have been ground, polished, coated, and figured using traditional optical processing. Further development has also been done on machined monolithic carbon-carbon mirror substrates, which have also been processed using standard optical finishing techniques.

  16. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less known techniques that may also prove useful.

  17. Uses of Multivariate Analytical Techniques in Online and Blended Business Education: An Assessment of Current Practice and Recommendations for Future Research

    ERIC Educational Resources Information Center

    Arbaugh, J. B.; Hwang, Alvin

    2013-01-01

    Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…

  18. Analytical challenges for conducting rapid metabolism characterization for QIVIVE.

    PubMed

    Tolonen, Ari; Pelkonen, Olavi

    2015-06-05

    For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is a tool of choice for a majority of organic relatively lipophilic molecules, linked with a LC separation tool and simultaneous UV-detection. However, additional techniques such as gas chromatography, radiometric measurements and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: Detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions and strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Bioanalytical Applications of Fluorescence Line-Narrowing and Non-Line-Narrowing Spectroscopy Interfaced with Capillary Electrophoresis and High-Performance Liquid Chromatography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Kenneth Paul

    Capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) are widely used analytical separation techniques with many applications in chemical, biochemical, and biomedical sciences. Conventional analyte identification in these techniques is based on retention/migration times of standards; requiring a high degree of reproducibility, availability of reliable standards, and absence of coelution. From this, several new information-rich detection methods (also known as hyphenated techniques) are being explored that would be capable of providing unambiguous on-line identification of separating analytes in CE and HPLC. As further discussed, a number of such on-line detection methods have shown considerable success, including Raman, nuclear magnetic resonance (NMR), mass spectrometry (MS), and fluorescence line-narrowing spectroscopy (FLNS). In this thesis, the feasibility and potential of combining the highly sensitive and selective laser-based detection method of FLNS with analytical separation techniques are discussed and presented. A summary of previously demonstrated FLNS detection interfaced with chromatography and electrophoresis is given, and recent results from on-line FLNS detection in CE (CE-FLNS), and the new combination of HPLC-FLNS, are shown.

  20. Chemometric applications to assess quality and critical parameters of virgin and extra-virgin olive oil. A review.

    PubMed

    Gómez-Caravaca, Ana M; Maggio, Rubén M; Cerretani, Lorenzo

    2016-03-24

    Today virgin and extra-virgin olive oils (VOO and EVOO) are foods subject to a large number of analytical tests intended to ensure their quality and genuineness. Almost all official methods demand heavy use of reagents and manpower. Because of this, analytical development in this area is continuously evolving. This review therefore focuses on analytical methods for EVOO/VOO that use fast and smart approaches based on chemometric techniques in order to reduce analysis time, reagent consumption, expensive equipment, and manpower. Experimental approaches coupling chemometrics with fast analytical techniques such as UV-Vis spectroscopy, fluorescence, vibrational spectroscopies (NIR, MIR, and Raman), and NMR spectroscopy, as well as more complex techniques such as chromatography, calorimetry, and electrochemical methods, applied to EVOO/VOO production and analysis, are discussed throughout this work. The advantages and drawbacks of these combinations are also highlighted. Chemometrics is shown to be a powerful tool for the oil industry; indeed, it can be implemented along all the steps of EVOO/VOO production: raw material input control, monitoring during processing, and quality control of the final product. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. New tools for investigating student learning in upper-division electrostatics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.

    Student learning in upper-division physics courses is a growing area of research in the field of Physics Education. Developing effective new curricular materials and pedagogical techniques to improve student learning in upper-division courses requires knowledge of both what material students struggle with and what curricular approaches help to overcome these struggles. To facilitate the course transformation process for one specific content area --- upper-division electrostatics --- this thesis presents two new methodological tools: (1) an analytical framework designed to investigate students' struggles with the advanced physics content and mathematically sophisticated tools/techniques required at the junior and senior level, and (2) a new multiple-response conceptual assessment designed to measure student learning and assess the effectiveness of different curricular approaches. We first describe the development and theoretical grounding of a new analytical framework designed to characterize how students use mathematical tools and techniques during physics problem solving. We apply this framework to investigate student difficulties with three specific mathematical tools used in upper-division electrostatics: multivariable integration in the context of Coulomb's law, the Dirac delta function in the context of expressing volume charge densities, and separation of variables as a technique to solve Laplace's equation. We find a number of common themes in students' difficulties around these mathematical tools including: recognizing when a particular mathematical tool is appropriate for a given physics problem, mapping between the specific physical context and the formal mathematical structures, and reflecting spontaneously on the solution to a physics problem to gain physical insight or ensure consistency with expected results. 
We then describe the development of a novel, multiple-response version of an existing conceptual assessment in upper-division electrostatics courses. The goal of this new version is to provide an easily-graded electrostatics assessment that can potentially be implemented to investigate student learning on a large scale. We show that student performance on the new multiple-response version exhibits a significant degree of consistency with performance on the free-response version, and that it continues to provide significant insight into student reasoning and student difficulties. Moreover, we demonstrate that the new assessment is both valid and reliable using data from upper-division physics students at multiple institutions. Overall, the work described in this thesis represents a significant contribution to the methodological tools available to researchers and instructors interested in improving student learning at the upper-division level.

  2. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for the determination of persistent organic pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also discussed in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, and liquid-phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Raman Spectrometry.

    ERIC Educational Resources Information Center

    Gardiner, Derek J.

    1980-01-01

    Reviews mainly quantitative analytical applications in the field of Raman spectrometry. Includes references to other reviews, new and analytically untested techniques, and novel sampling and instrument designs. Cites 184 references. (CS)

  4. WIPP waste characterization program sampling and analysis guidance manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.

  5. The HVT technique and the 'uncertainty' relation for central potentials

    NASA Astrophysics Data System (ADS)

    Grypeos, M. E.; Koutroulos, C. G.; Oyewumi, K. J.; Petridou, Th

    2004-08-01

    The quantum mechanical hypervirial theorems (HVT) technique is used to treat the so-called 'uncertainty' relation for quite a general class of central potential wells, including the (reduced) Pöschl-Teller and the Gaussian one. It is shown that this technique is quite suitable for deriving an approximate analytic expression in the form of a truncated power series expansion for the dimensionless product P_nl ≡ ⟨r²⟩_nl⟨p²⟩_nl/ℏ², for every (deeply) bound state of a particle moving non-relativistically in the well, provided that a (dimensionless) parameter s is sufficiently small. Attention is also paid to a number of cases, among the limited existing ones, in which exact analytic or semi-analytic expressions for P_nl can be derived. Finally, numerical results are given and discussed.
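
    The dimensionless product P_nl = ⟨r²⟩_nl⟨p²⟩_nl/ℏ² studied in this abstract can be checked numerically for a case with a known exact value. The sketch below (illustrative only, not from the paper; units chosen so that ℏ = m = ω = 1) evaluates it for the ground state of the 3D isotropic harmonic oscillator, where ⟨r²⟩ = ⟨p²⟩ = 3/2 and hence P = 9/4, the minimum allowed by the three-dimensional uncertainty relation P ≥ 9/4.

```python
import numpy as np

# Radial grid; the ground-state wavefunction is psi(r) ~ exp(-r^2/2).
r = np.linspace(1e-6, 12.0, 200001)
dr = r[1] - r[0]
psi = np.exp(-r**2 / 2)
w = psi**2 * r**2                 # radial probability weight (up to 4*pi)

norm = w.sum() * dr
r2 = (r**2 * w).sum() * dr / norm # <r^2>, exact value 3/2

# <p^2> from the radial kinetic-energy integral with u(r) = r*psi(r):
# <p^2> = int |u'(r)|^2 dr / int |u(r)|^2 dr   (for an s-state)
u = r * psi
du = np.gradient(u, r)
p2 = (du**2).sum() * dr / norm    # exact value 3/2

P = r2 * p2
print(round(P, 3))  # -> 2.25
```

    The same numerical recipe applies to the Pöschl-Teller and Gaussian wells discussed above once their bound-state wavefunctions are supplied.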

  6. 7 CFR 90.2 - General terms defined.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...

  7. Analytical Applications of NMR: Summer Symposium on Analytical Chemistry.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1982-01-01

    Highlights a symposium on analytical applications of nuclear magnetic resonance spectroscopy (NMR), discussing pulse Fourier transformation technique, two-dimensional NMR, solid state NMR, and multinuclear NMR. Includes description of ORACLE, an NMR data processing system at Syracuse University using real-time color graphics, and algorithms for…

  8. Development of electrical test procedures for qualification of spacecraft against EID. Volume 2: Review and specification of test procedures

    NASA Technical Reports Server (NTRS)

    Wilkenfeld, J. M.; Harlacher, B. L.; Mathews, D.

    1982-01-01

    A combined experimental and analytical program to develop system electrical test procedures for the qualification of spacecraft against damage produced by space-electron-induced discharges (EID) occurring on spacecraft dielectric outer surfaces is described. A review and critical evaluation of possible approaches to qualify spacecraft against EID is presented. A variety of possible schemes to simulate EID electromagnetic effects produced in spacecraft was studied. These techniques form the principal element of a provisional, recommended set of test procedures for the EID qualification of spacecraft. Significant gaps in our knowledge about EID which impact the final specification of an electrical test to qualify spacecraft against EID are also identified.

  9. Synthesis and Characterization of Templated Ion Exchange Resins for the Selective Complexion of Actinide Ions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, George M.; Uy, O. Manuel

    The purpose of this research is to develop polymeric extractants for the selective complexation of uranyl ions (and subsequently other actinyl and actinide ions) from aqueous solutions. Selectivity for a specific actinide ion is obtained by providing the polymers with cavities lined with complexing ligands so arranged as to match the charge, coordination number, coordination geometry, and size of the actinide ion. These cavity-containing polymers are produced by using a specific actinide ion (or surrogate) as a template around which monomeric complexing ligands are polymerized. The polymers provide useful sequestering agents for removing actinide ions from wastes and will form the basis for a variety of analytical techniques for actinide determination.

  10. 21 CFR 809.30 - Restrictions on the sale, distribution and use of analyte specific reagents.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... other than providing diagnostic information to patients and practitioners, e.g., forensic, academic... include the statement for class I exempt ASR's: “Analyte Specific Reagent. Analytical and performance... and performance characteristics are not established”; and (4) Shall not make any statement regarding...

  11. Active Control of Inlet Noise on the JT15D Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Smith, Jerome P.; Hutcheson, Florence V.; Burdisso, Ricardo A.; Fuller, Chris R.

    1999-01-01

    This report presents the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the year from November 1997 to December 1998 on the Active Noise Control of Turbofan Engines research project funded by NASA Langley Research Center. The concept of implementing active noise control techniques with fuselage-mounted error sensors is investigated both analytically and experimentally. The analytical part of the project involves the continued development of an advanced modeling technique to provide prediction and design guidelines for application of active noise control techniques to large, realistic high bypass engines of the type on which active control methods are expected to be applied. Results from the advanced analytical model are presented that show the effectiveness of the control strategies, and the analytical results presented for fuselage error sensors show good agreement with the experimentally observed results and provide additional insight into the control phenomena. Additional analytical results are presented for active noise control used in conjunction with a wavenumber sensing technique. The experimental work is carried out on a running JT15D turbofan jet engine in a test stand at Virginia Tech. The control strategy used in these tests was the feedforward Filtered-X LMS algorithm. The control inputs were supplied by single and multiple circumferential arrays of acoustic sources equipped with neodymium iron cobalt magnets mounted upstream of the fan. The reference signal was obtained from an inlet mounted eddy current probe. The error signals were obtained from a number of pressure transducers flush-mounted in a simulated fuselage section mounted in the engine test cell. The active control methods are investigated when implemented with the control sources embedded within the acoustically absorptive material on a passively-lined inlet. 
The experimental results show that the combination of active control techniques with fuselage-mounted error sensors and passive control techniques is an effective means of reducing radiated noise from turbofan engines. Strategic selection of the location of the error transducers is shown to be effective for reducing the radiation towards particular directions in the farfield. An analytical model is used to predict the behavior of the control system and to guide the experimental design configurations, and the analytical results presented show good agreement with the experimentally observed results.

  12. Development Of Antibody-Based Fiber-Optic Sensors

    NASA Astrophysics Data System (ADS)

    Tromberg, Bruce J.; Sepaniak, Michael J.; Vo-Dinh, Tuan

    1988-06-01

    The speed and specificity characteristic of immunochemical complex formation has encouraged the development of numerous antibody-based analytical techniques. The scope and versatility of these established methods can be enhanced by combining the principles of conventional immunoassay with laser-based fiber-optic fluorimetry. This merger of spectroscopy and immunochemistry provides the framework for the construction of highly sensitive and selective fiber-optic devices (fluoroimmuno-sensors) capable of in-situ detection of drugs, toxins, and naturally occurring biochemicals. Fluoroimmuno-sensors (FIS) employ an immobilized reagent phase at the sampling terminus of a single quartz optical fiber. Laser excitation of antibody-bound analyte produces a fluorescence signal which is either directly proportional (as in the case of natural fluorophor and "antibody sandwich" assays) or inversely proportional (as in the case of competitive-binding assays) to analyte concentration. Factors which influence analysis time, precision, linearity, and detection limits include the nature (solid or liquid) and amount of the reagent phase, the method of analyte delivery (passive diffusion, convection, etc.), and whether equilibrium or non-equilibrium assays are performed. Data will be presented for optical fibers whose sensing termini utilize: (1) covalently-bound solid antibody reagent phases, and (2) membrane-entrapped liquid antibody reagents. Assays for large-molecular weight proteins (antigens) and small-molecular weight, carcinogenic, polynuclear aromatics (haptens) will be considered. In this manner, the influence of a system's chemical characteristics and measurement requirements on sensor design, and the consequence of various sensor designs on analytical performance will be illustrated.

  13. Search for life on Mars: Evaluation of techniques

    NASA Technical Reports Server (NTRS)

    Schwartz, D. E.; Mancinelli, R. L.; White, M. R.

    1995-01-01

    An important question for exobiology is, did life evolve on Mars? To answer this question, experiments must be conducted on the martian surface. Given current mission constraints on mass, power, and volume, these experiments can only be performed using proposed analytical techniques such as: electron microscopy, X-ray fluorescence, X-ray diffraction, alpha-proton backscatter, gamma-ray spectrometry, differential thermal analysis, differential scanning calorimetry, pyrolysis gas chromatography, mass spectrometry, and specific element detectors. Using prepared test samples consisting of 1% organic matter (bovine serum albumin) in palagonite and a mixture of palagonite, clays, iron oxides, and evaporites, it was determined that a combination of X-ray diffraction and differential thermal analysis coupled with gas chromatography provides the best insight into the chemistry, mineralogy, and geological history of the samples.

  14. Nondestructive testing of Scout rocket motors

    NASA Technical Reports Server (NTRS)

    Oaks, A. E.

    1972-01-01

    The nondestructive tests applied to Scout rocket motors were reviewed and appraised. Analytical techniques were developed to evaluate the capabilities of the radiographic and ultrasonic procedures used. The major problem areas found were the inadequacy of high-voltage radiography for detecting unbonds and narrow propellant cracks, the inability to relate the ultrasonic signals received from flat-bottomed holes in standards to those received from real defects, and, more generally, the specification of acceptance criteria and how they were to be met. To counter the deficiencies noted, analyses were conducted of the potential utility of radiometric, acoustic, holographic and thermographic techniques for motor and nozzle bond inspection, of a new approach to qualifying magnetic particle inspection, and of the application of acoustic emission analysis to the evaluation of proof and leak test data.

  15. Search for life on Mars: evaluation of techniques.

    PubMed

    Schwartz, D E; Mancinelli, R L; White, M R

    1995-03-01

    An important question for exobiology is, did life evolve on Mars? To answer this question, experiments must be conducted on the martian surface. Given current mission constraints on mass, power, and volume, these experiments can only be performed using proposed analytical techniques such as: electron microscopy, X-ray fluorescence, X-ray diffraction, alpha-proton backscatter, gamma-ray spectrometry, differential thermal analysis, differential scanning calorimetry, pyrolysis gas chromatography, mass spectrometry, and specific element detectors. Using prepared test samples consisting of 1% organic matter (bovine serum albumin) in palagonite and a mixture of palagonite, clays, iron oxides, and evaporites, it was determined that a combination of X-ray diffraction and differential thermal analysis coupled with gas chromatography provides the best insight into the chemistry, mineralogy, and geological history of the samples.

  16. Elemental and isotopic imaging of biological samples using NanoSIMS.

    PubMed

    Kilburn, Matt R; Clode, Peta L

    2014-01-01

    With its low detection limits and the ability to analyze most of the elements in the periodic table, secondary ion mass spectrometry (SIMS) represents one of the most versatile in situ analytical techniques available, and recent developments have resulted in significant advantages for the use of imaging mass spectrometry in biological and biomedical research. Increases in spatial resolution and sensitivity allow detailed interrogation of samples at relevant scales and chemical concentrations. Advances in dynamic SIMS, specifically with the advent of NanoSIMS, now allow the tracking of stable isotopes within biological systems at subcellular length scales, while static SIMS combines subcellular imaging with molecular identification. In this chapter, we present an introduction to the SIMS technique, with particular reference to NanoSIMS, and discuss its application in biological and biomedical research.

  17. SERS-based application in food analytics (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Cialla-May, Dana; Radu, Andreea; Jahn, Martin; Weber, Karina; Popp, Jürgen

    2017-02-01

    To establish detection schemes in life science applications, specific and sensitive methods allowing for fast detection times are required. Due to the interaction of molecules with the strong electromagnetic fields excited at metallic nanostructures, the molecular-fingerprint-specific Raman spectrum is enhanced by several orders of magnitude. This effect is described as surface-enhanced Raman spectroscopy (SERS), which has become a very powerful analytical tool in many fields of application. Within this presentation, we will introduce innovative bottom-up strategies to prepare SERS-active nanostructures coated with a lipophilic sensor layer. To do so, the food colorant Sudan III, an indirect carcinogen found in chili powder, palm oil, and spice mixtures, is detected quantitatively against the background of the competitor riboflavin as well as paprika powder extracts. The SERS-based detection of azorubine (E122) in commercially available beverages of differing complexity (e.g., sugar content, alcohol concentration) illustrates the strong potential of SERS as a qualitative as well as semiquantitative prescan method in food analytics. Here, good agreement is found between the concentration estimated by SERS and that obtained with the gold-standard technique HPLC, a highly laborious method. Finally, SERS is applied to detect vitamins B2 and B12 in cereals and to estimate the ratio of lycopene and β-carotene in tomatoes. Acknowledgement: Funding of the projects "QuantiSERS" and "Jenaer Biochip Initiative 2.0" within the framework "InnoProfile Transfer - Unternehmen Region" by the Federal Ministry of Education and Research, Germany (BMBF), is gratefully acknowledged.
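
    A semiquantitative prescan of the kind described here normally rests on a simple calibration curve. The following sketch uses made-up band intensities (not data from this study) to fit a linear SERS calibration against standards of known concentration and invert it for an unknown, which in the workflow above would then be confirmed by HPLC.

```python
import numpy as np

# Hypothetical calibration standards (arbitrary concentration units) and
# their SERS band intensities; real data would come from repeated spectra.
conc = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
intensity = np.array([0.1, 2.2, 4.0, 8.1, 16.0])

# Linear least-squares calibration: intensity = slope * conc + intercept
slope, intercept = np.polyfit(conc, intensity, 1)

# Invert the calibration for an unknown sample's measured intensity.
unknown_intensity = 6.0
estimate = (unknown_intensity - intercept) / slope
print(round(estimate, 2))
```

    For a prescan, only the estimate's order of magnitude needs to be trustworthy; suspect samples are escalated to the slower reference method.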

  18. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    PubMed

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data.
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson's disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson's disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer's, Huntington's, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications.
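
    One step of this protocol, rebalancing an imbalanced cohort before classification with cross-validation, can be sketched as follows. Everything here is illustrative: the data are synthetic (not PPMI), and a nearest-centroid classifier stands in for the adaptive boosting and support vector machine methods actually used.

```python
import numpy as np

rng = np.random.default_rng(42)

# Imbalanced two-class cohort: 90 "controls" vs 10 "cases", 5 features each.
X = np.vstack([rng.normal(0.0, 1.0, (90, 5)), rng.normal(1.5, 1.0, (10, 5))])
y = np.array([0] * 90 + [1] * 10)

def oversample(X, y, rng):
    """Randomly duplicate minority-class rows until the classes are balanced.
    (In practice this should be done inside each training fold to avoid
    leakage between train and test sets; it is applied globally here only
    to keep the sketch short.)"""
    minority = np.flatnonzero(y == 1)
    extra = rng.choice(minority, size=(y == 0).sum() - minority.size, replace=True)
    idx = np.concatenate([np.arange(y.size), extra])
    return X[idx], y[idx]

def nearest_centroid_accuracy(X, y, rng, k=5):
    """k-fold cross-validated accuracy of a nearest-centroid classifier."""
    idx = rng.permutation(y.size)
    folds = np.array_split(idx, k)
    correct = 0
    for f in folds:
        train = np.setdiff1d(idx, f)
        c0 = X[train][y[train] == 0].mean(axis=0)
        c1 = X[train][y[train] == 1].mean(axis=0)
        pred = np.linalg.norm(X[f] - c1, axis=1) < np.linalg.norm(X[f] - c0, axis=1)
        correct += (pred.astype(int) == y[f]).sum()
    return correct / y.size

Xb, yb = oversample(X, y, rng)
acc = nearest_centroid_accuracy(Xb, yb, rng)
print(round(acc, 3))
```
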

  19. Ratio of sequential chromatograms for quantitative analysis and peak deconvolution: Application to standard addition method and process monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.

    1990-08-01

    This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline-corrected. This ratio chromatogram provides a robust means for the identification and the quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and is used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated by using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
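
    The ratio-chromatogram idea can be sketched numerically with synthetic data (Gaussian peaks and hypothetical concentrations, not the paper's ion-chromatography data): over a region of pure analyte elution, the point-by-point ratio of two baseline-corrected sequential chromatograms is flat, and its value equals the ratio of the analyte concentrations in the two injections.

```python
import numpy as np

t = np.linspace(0.0, 10.0, 1000)
peak = np.exp(-((t - 5.0) ** 2) / 0.5)   # peak shape, identical in both runs

c1, c2 = 1.0, 1.3                        # analyte concentration, runs 1 and 2
chrom1 = c1 * peak                       # baseline-corrected chromatograms
chrom2 = c2 * peak

# Ratio only where the first run has signal well above baseline.
mask = chrom1 > 0.05 * chrom1.max()
ratio = chrom2[mask] / chrom1[mask]

# Weighted average over the pure-elution region; the local signal is used
# as a stand-in for each point's signal-to-noise weighting.
w = chrom1[mask]
ratio_value = np.average(ratio, weights=w)
print(round(ratio_value, 3))  # -> 1.3
```

    With noise added, the weighting matters: points near the peak apex carry the most reliable ratio information and dominate the average.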

  20. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    PubMed

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative alternative analytical methods are needed for routine analysis. Conventional Western blotting allows detection of specific proteins down to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). By additionally staining all proteins with a fluorescent dye immediately after their transfer to the blot membrane, it is possible to visualise simultaneously the antibody binding and the total protein profile. This allows for an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting is able to reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest possible applications far beyond this setting.
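
    The load correction described here reduces to dividing each lane's antibody-derived band intensity by that lane's total-protein stain intensity. A minimal sketch with hypothetical intensities (not values from the study) shows three lanes that appear to differ in raw antibody signal but are identical once normalised for unequal protein load:

```python
# Hypothetical fluorescence readings for three lanes of one blot.
antibody_signal = [1200.0, 980.0, 1500.0]    # target-protein band intensities
total_protein = [30000.0, 24500.0, 37500.0]  # total-protein stain per lane

# Normalised expression: antibody signal per unit of loaded protein.
normalized = [a / t for a, t in zip(antibody_signal, total_protein)]
print([round(n, 4) for n in normalized])  # -> [0.04, 0.04, 0.04]
```
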

  1. Magnetoresistive biosensors for quantitative proteomics

    NASA Astrophysics Data System (ADS)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, a developing method for the study of proteins and the identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, and magnetic sensors, have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and its flexibility in allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and they are compatible with semiconductor-based fabrication processes, enabling low-cost, small-size devices for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  2. Discourse-Centric Learning Analytics: Mapping the Terrain

    ERIC Educational Resources Information Center

    Knight, Simon; Littleton, Karen

    2015-01-01

    There is increasing interest in developing learning analytic techniques for the analysis and support of high-quality learning discourse. This paper maps the terrain of discourse-centric learning analytics (DCLA), outlining its distinctive contribution and offering a definition for the field moving forwards. It is our claim that DCLA…

  3. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  4. Analyzing Matrices of Meta-Analytic Correlations: Current Practices and Recommendations

    ERIC Educational Resources Information Center

    Sheng, Zitong; Kong, Wenmo; Cortina, Jose M.; Hou, Shuofei

    2016-01-01

    Researchers have become increasingly interested in conducting analyses on meta-analytic correlation matrices. Methodologists have provided guidance and recommended practices for the application of this technique. The purpose of this article is to review current practices regarding analyzing meta-analytic correlation matrices, to identify the gaps…

  5. Techniques for sensing methanol concentration in aqueous environments

    NASA Technical Reports Server (NTRS)

    Narayanan, Sekharipuram R. (Inventor); Chun, William (Inventor); Valdez, Thomas I. (Inventor)

    2001-01-01

    An analyte concentration sensor that is capable of fast and reliable sensing of analyte concentration in aqueous environments with high concentrations of the analyte. Preferably, the present invention is a methanol concentration sensor device coupled to a fuel metering control system for use in a liquid direct-feed fuel cell.

  6. Big data analytics : predicting traffic flow regimes from simulated connected vehicle messages using data analytics and machine learning.

    DOT National Transportation Integrated Search

    2016-12-25

    The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...

  7. INVESTIGATING ENVIRONMENTAL SINKS OF MACROLIDE ANTIBIOTICS WITH ANALYTICAL CHEMISTRY

    EPA Science Inventory

    Possible environmental sinks (wastewater effluents, biosolids, sediments) of macrolide antibiotics (i.e., azithromycin, roxithromycin and clarithromycin)are investigated using state-of-the-art analytical chemistry techniques.

  8. Reassessment of the NH4 NO3 thermal decomposition technique for calibration of the N2 O isotopic composition.

    PubMed

    Mohn, Joachim; Gutjahr, Wilhelm; Toyoda, Sakae; Harris, Eliza; Ibraim, Erkan; Geilmann, Heike; Schleppi, Patrick; Kuhn, Thomas; Lehmann, Moritz F; Decock, Charlotte; Werner, Roland A; Yoshida, Naohiro; Brand, Willi A

    2016-09-08

    In the last few years, the study of N2O site-specific nitrogen isotope composition has been established as a powerful technique to disentangle N2O emission pathways. This trend has been accelerated by significant analytical progress in the field of isotope-ratio mass spectrometry (IRMS) and, more recently, quantum cascade laser absorption spectroscopy (QCLAS). The ammonium nitrate (NH4NO3) decomposition technique provides a strategy to scale the 15N site-specific (SP ≡ δ15Nα − δ15Nβ) and bulk (δ15Nbulk = (δ15Nα + δ15Nβ)/2) isotopic composition of N2O against the international standard for the 15N/14N isotope ratio (AIR-N2). Within the current project, 15N fractionation effects of the thermal decomposition of NH4NO3 on the N2O site preference were studied using static and dynamic decomposition techniques. The validity of the NH4NO3 decomposition technique to link NH4+ and NO3− moiety-specific δ15N analysis by IRMS to the site-specific nitrogen isotopic composition of N2O was confirmed. However, the accuracy of this approach for the calibration of δ15Nα and δ15Nβ values was found to be limited by non-quantitative NH4NO3 decomposition in combination with substantially different isotope enrichment factors for the conversion of the NO3− or NH4+ nitrogen atom into the α or β position of the N2O molecule. The study reveals that the completeness and reproducibility of the NH4NO3 decomposition reaction currently confine the anchoring of N2O site-specific isotopic composition to the international isotope ratio scale AIR-N2. The authors suggest establishing a set of N2O isotope reference materials with appropriate site-specific isotopic composition, as community standards, to improve inter-laboratory compatibility.
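
The site-preference and bulk definitions quoted in the abstract are plain arithmetic on the two site-specific δ15N values. A minimal sketch, with illustrative (not measured) numbers:

```python
# Sketch of the site-preference arithmetic from the abstract:
# SP = d15N_alpha - d15N_beta, d15N_bulk = (d15N_alpha + d15N_beta) / 2.
# The input values below are invented for illustration.

def site_preference(d15n_alpha, d15n_beta):
    return d15n_alpha - d15n_beta

def d15n_bulk(d15n_alpha, d15n_beta):
    return (d15n_alpha + d15n_beta) / 2.0

alpha, beta = 15.7, -2.3          # per mil vs AIR-N2, illustrative only
sp = site_preference(alpha, beta)  # 18.0 per mil
bulk = d15n_bulk(alpha, beta)      # 6.7 per mil
```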

  9. Species authentication and geographical origin discrimination of herbal medicines by near infrared spectroscopy: A review.

    PubMed

    Wang, Pei; Yu, Zhiguo

    2015-10-01

    Near infrared (NIR) spectroscopy as a rapid and nondestructive analytical technique, integrated with chemometrics, is a powerful process analytical tool for the pharmaceutical industry and is becoming an attractive complementary technique for herbal medicine analysis. This review mainly focuses on the recent applications of NIR spectroscopy in species authentication of herbal medicines and their geographical origin discrimination.

  10. MeRy-B: a web knowledgebase for the storage, visualization, analysis and annotation of plant NMR metabolomic profiles

    PubMed Central

    2011-01-01

    Background: Improvements in the techniques for metabolomics analyses and growing interest in metabolomic approaches are resulting in the generation of increasing numbers of metabolomic profiles. Platforms are required for profile management, as a function of experimental design, and for metabolite identification, to facilitate the mining of the corresponding data. Various databases have been created, including organism-specific knowledgebases and analytical technique-specific spectral databases. However, there is currently no platform meeting the requirements for both profile management and metabolite identification for nuclear magnetic resonance (NMR) experiments. Description: MeRy-B, the first platform for plant 1H-NMR metabolomic profiles, is designed (i) to provide a knowledgebase of curated plant profiles and metabolites obtained by NMR, together with the corresponding experimental and analytical metadata, (ii) for queries and visualization of the data, (iii) to discriminate between profiles with spectrum visualization tools and statistical analysis, (iv) to facilitate compound identification. It contains lists of plant metabolites and unknown compounds, with information about experimental conditions, the factors studied and metabolite concentrations for several plant species, compiled from more than one thousand annotated NMR profiles for various organs or tissues. Conclusion: MeRy-B manages all the data generated by NMR-based plant metabolomics experiments, from description of the biological source to identification of the metabolites and determinations of their concentrations. It is the first database allowing the display and overlay of NMR metabolomic profiles selected through queries on data or metadata. MeRy-B is available from http://www.cbib.u-bordeaux2.fr/MERYB/index.php. PMID:21668943

  11. Uncovering the structure of (super)conformal field theories

    NASA Astrophysics Data System (ADS)

    Liendo, Pedro

    Conformal field theories (CFTs) are of central importance in modern theoretical physics, with applications that range from condensed matter physics to particle theory phenomenology. In this Ph.D. thesis we study CFTs from two largely orthogonal (but complementary) points of view. In the first approach we concentrate our efforts on two specific examples: the Veneziano limit of N = 2 and N = 1 superconformal QCD. The addition of supersymmetry makes these theories amenable to analytical analysis. In particular, we use the correspondence between single trace operators and states of a spin chain to study the integrability properties of each theory. Our results indicate that these theories are not completely integrable, but they do contain some subsectors in which integrability might hold. In the second approach, we consider the so-called "bootstrap program", which is the ambitious idea that the restrictions imposed by conformal symmetry (crossing symmetry in particular) are so powerful that, starting from a few basic assumptions, one should be able to fix the form of a theory. In this thesis we apply bootstrap techniques to CFTs in the presence of a boundary. We study two-point functions using analytical and numerical methods. One-loop results were re-obtained from crossing symmetry alone, and a variety of numerical bounds for conformal dimensions of operators were obtained. These bounds are quite general and valid for any CFT in the presence of a boundary, in contrast to our first approach where a specific set of theories was studied. A natural continuation of this work is to apply bootstrap techniques to supersymmetric theories. Some preliminary results along these lines are presented.

  12. Production of monoclonal antibody to acaricide dicofol and its derivatives.

    PubMed

    Hongsibsong, Surat; Prapamontol, Tippawan; Suphavilai, Chaisuree; Wipasa, Jiraprapa; Pattarawarapan, Mookda; Kasinrerk, Watchara

    2010-12-01

    In Thailand, detection of acaricide dicofol residues has been performed only sporadically due to the limitations of analytical techniques. Conventional analytical methods for detecting dicofol residues most often use chromatography-based techniques. Our ultimate aim is to develop an alternative method for rapidly analyzing dicofol residues in vegetable and fruit samples. Here we report the production of monoclonal antibodies specific to dicofol and its derivatives. Hapten-protein carriers were prepared by linking succinic anhydride to dichlorobenzhydrol (DCBH), which was then conjugated to bovine serum albumin (BSA) and ovalbumin (OVA). The DCBH-BSA conjugate was used as immunogen while the DCBH-OVA conjugate was used as capture antigen for the competitive inhibition assay. Female BALB/c mice were immunized with the DCBH-BSA conjugate subcutaneously, and antibody (Ab) level was determined 2 weeks after the last immunization. Spleen cells producing high-titer antibody were isolated and fused with myeloma cells of P3.X6.Ag8.653. After limiting dilutions, the antibody produced by one clone had high affinity and was found to be of the IgG1 isotype with a κ light chain. Specificity and inhibition concentrations of the monoclonal antibody (MAb) were determined by competitive indirect ELISA with dicofol, giving a 50% inhibition concentration (IC50) of 0.28 μg/mL. Working ranges of the developed immunoassay were from 0.07 to 25 μg/mL. Hence, the prepared MAb can be applied to immunoassay development for detecting dicofol residues in vegetables and fruits at levels far below the maximum residue limit, such that residues in 5 g of fruits and berries can be detected below 0.1 mg/kg.
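
The reported IC50 and working range come from a competitive-inhibition curve. A hedged sketch of estimating an IC50 from calibrator data by log-linear interpolation (the calibrator values below are invented, and the authors' actual curve-fitting procedure is not specified here):

```python
import math

# Hypothetical sketch: estimating IC50 from a competitive ELISA standard
# curve by log-linear interpolation. Calibrator data are invented.

def ic50_from_curve(concs, responses):
    """Interpolate the concentration giving 50% of the zero-dose signal.
    concs ascending; responses (%B/B0) decrease with concentration."""
    pairs = list(zip(concs, responses))
    for (c1, r1), (c2, r2) in zip(pairs, pairs[1:]):
        if r1 >= 50.0 >= r2:
            # interpolate on log(concentration) between the bracket points
            f = (r1 - 50.0) / (r1 - r2)
            return math.exp(math.log(c1) + f * (math.log(c2) - math.log(c1)))
    raise ValueError("50% response not bracketed by the calibrators")

concs = [0.07, 0.25, 1.0, 5.0, 25.0]   # ug/mL (invented)
resp  = [92.0, 71.0, 38.0, 15.0, 5.0]  # %B/B0 (invented)

ic50 = ic50_from_curve(concs, resp)
```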

  13. Microsphere integrated microfluidic disk: synergy of two techniques for rapid and ultrasensitive dengue detection.

    PubMed

    Hosseini, Samira; Aeinehvand, Mohammad M; Uddin, Shah M; Benzina, Abderazak; Rothan, Hussin A; Yusof, Rohana; Koole, Leo H; Madou, Marc J; Djordjevic, Ivan; Ibrahim, Fatimah

    2015-11-09

    The application of microfluidic devices in diagnostic systems is well-established in contemporary research. The large specific surface area of microspheres, on the other hand, has secured an important position for their use in bioanalytical assays. Herein, we report a combination of microspheres and a microfluidic disk in a unique hybrid platform for highly sensitive and selective detection of dengue virus. Surface-engineered polymethacrylate microspheres with carefully designed functional groups facilitate biorecognition in multiple ways. In order to maximize the utility of the microspheres' specific surface area in biomolecular interaction, the microfluidic disk was equipped with a micromixing system. The mixing mechanism (microballoon mixing) enhances the number of molecular encounters between spheres and target analyte by accessing the entire sample volume more effectively, which subsequently results in signal amplification. Significant reduction of incubation time along with considerably lower detection limits were the prime motivations for the integration of microspheres inside the microfluidic disk. Lengthy incubations of routine analytical assays were reduced from 2 hours to 5 minutes, while the developed system successfully detected a few units of dengue virus. The obtained results make this hybrid microsphere-microfluidic approach to dengue detection a promising avenue for early detection of this fatal illness.

  14. Evaluation of data analytic approaches to generating cross-domain mappings of controlled science vocabularies

    NASA Astrophysics Data System (ADS)

    Zednik, S.

    2015-12-01

    Recent data publication practices have made increasing amounts of diverse datasets available online for the general research community to explore and integrate. Even with the abundance of data online, relevant data discovery and successful integration is still highly dependent upon the data being published with well-formed and understandable metadata. Tagging a dataset with well-known or controlled community terms is a common mechanism to indicate the intended purpose, subject matter, or other relevant facts of a dataset, however controlled domain terminology can be difficult for cross-domain researchers to interpret and leverage. It is also a challenge for integration portals to successfully provide cross-domain search capabilities over data holdings described using many different controlled vocabularies. Mappings between controlled vocabularies can be challenging because communities frequently develop specialized terminologies and have highly specific and contextual usages of common words. Despite this specificity it is highly desirable to produce cross-domain mappings to support data integration. In this contribution we evaluate the applicability of several data analytic techniques for the purpose of generating mappings between hierarchies of controlled science terms. We hope our efforts initiate more discussion on the topic and encourage future mapping efforts.
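
One simple data-analytic technique for proposing candidate mappings between controlled vocabularies is lexical overlap. The sketch below uses token-set Jaccard similarity; it illustrates the general idea only and is not one of the specific techniques the contribution evaluates:

```python
# Illustrative sketch (not the paper's method): proposing mappings between
# two controlled vocabularies by token-set Jaccard similarity. Terms and
# the 0.3 threshold are invented for illustration.

def jaccard(a, b):
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb)

def best_matches(vocab_a, vocab_b, threshold=0.3):
    """For each term in vocab_a, propose the closest vocab_b term."""
    mapping = {}
    for term in vocab_a:
        score, cand = max((jaccard(term, c), c) for c in vocab_b)
        if score >= threshold:
            mapping[term] = cand
    return mapping

geo = ["sea surface temperature", "soil moisture content"]
bio = ["surface water temperature", "soil moisture", "species richness"]

m = best_matches(geo, bio)
```

A real mapping effort would add stemming, synonym lists, and hierarchy-aware scoring on top of this lexical baseline.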

  15. MALDI matrices for low molecular weight compounds: an endless story?

    PubMed

    Calvano, Cosima Damiana; Monopoli, Antonio; Cataldi, Tommaso R I; Palmisano, Francesco

    2018-04-23

    Since its introduction in the 1980s, matrix-assisted laser desorption/ionization mass spectrometry (MALDI MS) has gained a prominent role in the analysis of high molecular weight biomolecules such as proteins, peptides, oligonucleotides, and polysaccharides. Its application to low molecular weight compounds has long remained challenging due to the spectral interferences produced by conventional organic matrices in the low m/z window. To overcome this problem, specific sample preparations, such as analyte/matrix derivatization, addition of dopants, or sophisticated deposition techniques especially useful for imaging experiments, have been proposed. Alternative approaches based on second-generation (rationally designed) organic matrices, ionic liquids, and inorganic matrices, including metallic nanoparticles, have been the object of intense and continuous research efforts. Definite evidence is now provided that MALDI MS represents a powerful and invaluable analytical tool also for small molecules, including their quantification, thus opening new, exciting applications in metabolomics and imaging mass spectrometry. This review is intended to offer a concise critical overview of the most recent achievements concerning MALDI matrices capable of specifically addressing the challenging issue of small-molecule analysis. Graphical abstract: An ideal book of matrices for MALDI MS of small molecules.

  16. Analysis of three tests of the unconfined aquifer in southern Nassau County, Long Island, New York

    USGS Publications Warehouse

    Lindner, J.B.; Reilly, T.E.

    1982-01-01

    Drawdown and recovery data from three 2-day aquifer tests of the unconfined (water-table) aquifer in southern Nassau County, N.Y., conducted during the fall of 1979, were analyzed. Several simple analytical solutions, a type-curve-matching procedure, and a Galerkin finite-element radial-flow model were used to determine hydraulic conductivity, the ratio of horizontal to vertical hydraulic conductivity, and specific yield. Results of the curve-matching procedure covered a broad range of values that could be narrowed through consideration of data from other sources such as published reports, drillers' logs, or values determined by analytical solutions. Analysis by the radial-flow model was preferred because it allows for vertical variability in aquifer properties and solves the system for all observation points simultaneously, whereas the other techniques treat the aquifer as homogeneous and must treat each observation well separately. All methods produced fairly consistent results. The ranges of aquifer values at the three sites were: horizontal hydraulic conductivity, 140 to 380 feet per day; transmissivity, 11,200 to 17,100 feet squared per day; ratio of horizontal to vertical hydraulic conductivity, 2.4:1 to 7:1; and specific yield, 0.13 to 0.23. (USGS)
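
The reported ranges are mutually consistent via the standard relation T = K·b (transmissivity equals horizontal hydraulic conductivity times saturated thickness). A quick check, assuming only that relation (the thickness values are implied, not taken from the report):

```python
# Consistency check of the reported ranges using T = K * b.
# K = 140-380 ft/d and T = 11,200-17,100 ft^2/d imply saturated
# thicknesses of roughly 45-80 ft.

def implied_thickness(transmissivity_ft2_per_day, k_ft_per_day):
    """Saturated thickness b = T / K."""
    return transmissivity_ft2_per_day / k_ft_per_day

b_max = implied_thickness(11200.0, 140.0)  # 80.0 ft
b_min = implied_thickness(17100.0, 380.0)  # 45.0 ft
```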

  17. Research in health sciences library and information science: a quantitative analysis.

    PubMed Central

    Dimitroff, A

    1992-01-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas. PMID:1422504

  18. Analytical techniques for measuring hydrocarbon emissions from the manufacture of fiberglass-reinforced plastics. Report for June 1995--March 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, R.S.; Kong, E.J.; Bahner, M.A.

    The paper discusses several projects to measure hydrocarbon emissions associated with the manufacture of fiberglass-reinforced plastics. The main purpose of the projects was to evaluate pollution prevention techniques to reduce emissions by altering raw materials, application equipment, and operator technique. Analytical techniques were developed to reduce the cost of these emission measurements. Emissions from a small test mold in a temporary total enclosure (TTE) correlated with emissions from full-size production molds in a separate TTE. Gravimetric mass balance measurements inside the TTE generally agreed to within ±30% with total hydrocarbon (THC) measurements in the TTE exhaust duct.

  19. The Coordinate Orthogonality Check (corthog)

    NASA Astrophysics Data System (ADS)

    Avitabile, P.; Pechinsky, F.

    1998-05-01

    A new technique referred to as the coordinate orthogonality check (CORTHOG) helps to identify how each physical degree of freedom contributes to the overall orthogonality relationship between analytical and experimental modal vectors on a mass-weighted basis. Using the CORTHOG technique together with the pseudo-orthogonality check (POC) clarifies where potential discrepancies exist between the analytical and experimental modal vectors. CORTHOG improves the understanding of the correlation (or lack of correlation) that exists between modal vectors. The CORTHOG theory is presented along with the evaluation of several cases to show the use of the technique.
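
The pseudo-orthogonality check that CORTHOG complements computes mass-weighted products between experimental and analytical mode shapes; for perfectly correlated, mass-normalised modes the result is the identity matrix. A minimal sketch with an invented diagonal mass matrix and mode shapes:

```python
import numpy as np

# Minimal sketch of a pseudo-orthogonality check (POC). The mass matrix
# and mode shapes below are invented for illustration.

def poc(phi_exp, phi_ana, mass):
    """POC matrix Phi_e^T * M * Phi_a (modes stored as columns)."""
    return phi_exp.T @ mass @ phi_ana

mass = np.diag([2.0, 1.0, 3.0])

# two analytical modes (columns), chosen to be M-orthogonal
raw = np.array([[1.0,  1.0],
                [1.0, -2.0],
                [0.0,  0.0]])
norms = np.sqrt(np.einsum('ij,ij->j', raw, mass @ raw))  # mass-normalise
phi_a = raw / norms
phi_e = phi_a.copy()  # pretend the test recovered the modes exactly

P = poc(phi_e, phi_a, mass)  # identity for perfectly correlated modes
```

Off-diagonal terms of P flag cross-coupling between mode pairs; CORTHOG then breaks such terms down by physical degree of freedom.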

  20. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  1. Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.

    2004-01-01

    A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e. tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth, since each technique provides a unique analysis capability for certain classes of organic molecules.

  2. Molecular detection of pathogens in water--the pros and cons of molecular techniques.

    PubMed

    Girones, Rosina; Ferrús, Maria Antonia; Alonso, José Luis; Rodriguez-Manzano, Jesus; Calgua, Byron; Corrêa, Adriana de Abreu; Hundesa, Ayalkibet; Carratala, Anna; Bofill-Mas, Sílvia

    2010-08-01

    Pollution of water by sewage and run-off from farms produces a serious public health problem in many countries. Viruses, along with bacteria and protozoa in the intestine or in urine are shed and transported through the sewer system. Even in highly industrialized countries, pathogens, including viruses, are prevalent throughout the environment. Molecular methods are used to monitor viral, bacterial, and protozoan pathogens, and to track pathogen- and source-specific markers in the environment. Molecular techniques, specifically polymerase chain reaction-based methods, provide sensitive, rapid, and quantitative analytical tools with which to study such pathogens, including new or emerging strains. These techniques are used to evaluate the microbiological quality of food and water, and to assess the efficiency of virus removal in drinking and wastewater treatment plants. The range of methods available for the application of molecular techniques has increased, and the costs involved have fallen. These developments have allowed the potential standardization and automation of certain techniques. In some cases they facilitate the identification, genotyping, enumeration, viability assessment, and source-tracking of human and animal contamination. Additionally, recent improvements in detection technologies have allowed the simultaneous detection of multiple targets in a single assay. However, the molecular techniques available today and those under development require further refinement in order to be standardized and applicable to a diversity of matrices. Water disinfection treatments may have an effect on the viability of pathogens and the numbers obtained by molecular techniques may overestimate the quantification of infectious microorganisms. The pros and cons of molecular techniques for the detection and quantification of pathogens in water are discussed.
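
As one concrete example of the quantitative capability of PCR-based methods mentioned above, real-time PCR recovers target copy number from the cycle threshold via a log-linear standard curve. A sketch with invented calibration constants:

```python
# Illustrative sketch of quantitative PCR calibration (values invented):
# copy number is recovered from the cycle threshold (Ct) via a log-linear
# standard curve, Ct = slope * log10(copies) + intercept.

def copies_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

# A curve at 100% amplification efficiency has slope ~ -3.32 per decade;
# the intercept depends on the assay and is invented here.
slope, intercept = -3.32, 38.0

n = copies_from_ct(24.72, slope, intercept)  # (38 - 24.72)/3.32 = 4 decades
```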

  3. Flexible aircraft dynamic modeling for dynamic analysis and control synthesis

    NASA Technical Reports Server (NTRS)

    Schmidt, David K.

    1989-01-01

    The linearization and simplification of a nonlinear, literal model for flexible aircraft is highlighted. Areas of model fidelity that are critical if the model is to be used for control system synthesis are developed and several simplification techniques that can deliver the necessary model fidelity are discussed. These techniques include both numerical and analytical approaches. An analytical approach, based on first-order sensitivity theory is shown to lead not only to excellent numerical results, but also to closed-form analytical expressions for key system dynamic properties such as the pole/zero factors of the vehicle transfer-function matrix. The analytical results are expressed in terms of vehicle mass properties, vibrational characteristics, and rigid-body and aeroelastic stability derivatives, thus leading to the underlying causes for critical dynamic characteristics.
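
First-order sensitivity theory of the kind described rests on eigenvalue perturbation formulas. A minimal sketch for a symmetric system matrix (the matrices are invented and far simpler than the paper's flexible-aircraft model), checked against a finite difference:

```python
import numpy as np

# Sketch of first-order eigenvalue sensitivity for a symmetric matrix
# A(p): d(lambda_i)/dp = x_i^T (dA/dp) x_i with orthonormal eigenvectors
# x_i. Matrices below are invented for illustration.

def eigval_sensitivities(A, dA):
    lam, X = np.linalg.eigh(A)
    sens = np.array([X[:, i] @ dA @ X[:, i] for i in range(len(lam))])
    return lam, sens

A  = np.array([[2.0, 1.0],
               [1.0, 3.0]])
dA = np.array([[0.0, 1.0],
               [1.0, 0.0]])  # dA/dp for some parameter p

lam, sens = eigval_sensitivities(A, dA)

# finite-difference check of the closed-form sensitivities
eps = 1e-6
fd = (np.linalg.eigvalsh(A + eps * dA) - lam) / eps
```

The closed-form expression is what yields the kind of analytical pole/zero sensitivities the abstract describes, without re-solving the eigenproblem for each parameter.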

  4. Use of genetic data to infer population-specific ecological and phenotypic traits from mixed aggregations

    USGS Publications Warehouse

    Moran, Paul; Bromaghin, Jeffrey F.; Masuda, Michele

    2014-01-01

    Many applications in ecological genetics involve sampling individuals from a mixture of multiple biological populations and subsequently associating those individuals with the populations from which they arose. Analytical methods that assign individuals to their putative population of origin have utility in both basic and applied research, providing information about population-specific life history and habitat use, ecotoxins, pathogen and parasite loads, and many other non-genetic ecological, or phenotypic traits. Although the question is initially directed at the origin of individuals, in most cases the ultimate desire is to investigate the distribution of some trait among populations. Current practice is to assign individuals to a population of origin and study properties of the trait among individuals within population strata as if they constituted independent samples. It seemed that approach might bias population-specific trait inference. In this study we made trait inferences directly through modeling, bypassing individual assignment. We extended a Bayesian model for population mixture analysis to incorporate parameters for the phenotypic trait and compared its performance to that of individual assignment with a minimum probability threshold for assignment. The Bayesian mixture model outperformed individual assignment under some trait inference conditions. However, by discarding individuals whose origins are most uncertain, the individual assignment method provided a less complex analytical technique whose performance may be adequate for some common trait inference problems. Our results provide specific guidance for method selection under various genetic relationships among populations with different trait distributions.
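
Assignment of an individual to its population of origin, the starting point of both approaches discussed, is a Bayes-rule computation over population allele frequencies. A heavily simplified sketch (uniform prior, two loci, Hardy-Weinberg genotype probabilities, invented frequencies; not the paper's full mixture model):

```python
# Simplified sketch of Bayesian population assignment (not the paper's
# full model). Allele frequencies and the genotype are invented.

def genotype_likelihood(freqs, genotype):
    """Product over loci of Hardy-Weinberg genotype probabilities.
    genotype[i] = number of copies (0, 1, or 2) of the reference allele."""
    like = 1.0
    for p, g in zip(freqs, genotype):
        q = 1.0 - p
        like *= {0: q * q, 1: 2 * p * q, 2: p * p}[g]
    return like

def posterior(pop_freqs, genotype):
    """Posterior origin probabilities under a uniform prior."""
    likes = [genotype_likelihood(f, genotype) for f in pop_freqs]
    total = sum(likes)
    return [l / total for l in likes]

pops = [[0.9, 0.8], [0.2, 0.3]]   # ref-allele freqs at 2 loci, 2 pops
post = posterior(pops, genotype=[2, 2])
```

The individual-assignment practice the paper critiques would keep this individual only if max(post) exceeds a threshold; the mixture-model alternative instead carries the full posterior into the trait inference.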

  5. Use of Genetic Data to Infer Population-Specific Ecological and Phenotypic Traits from Mixed Aggregations

    PubMed Central

    Moran, Paul; Bromaghin, Jeffrey F.; Masuda, Michele

    2014-01-01

    Many applications in ecological genetics involve sampling individuals from a mixture of multiple biological populations and subsequently associating those individuals with the populations from which they arose. Analytical methods that assign individuals to their putative population of origin have utility in both basic and applied research, providing information about population-specific life history and habitat use, ecotoxins, pathogen and parasite loads, and many other non-genetic ecological, or phenotypic traits. Although the question is initially directed at the origin of individuals, in most cases the ultimate desire is to investigate the distribution of some trait among populations. Current practice is to assign individuals to a population of origin and study properties of the trait among individuals within population strata as if they constituted independent samples. It seemed that approach might bias population-specific trait inference. In this study we made trait inferences directly through modeling, bypassing individual assignment. We extended a Bayesian model for population mixture analysis to incorporate parameters for the phenotypic trait and compared its performance to that of individual assignment with a minimum probability threshold for assignment. The Bayesian mixture model outperformed individual assignment under some trait inference conditions. However, by discarding individuals whose origins are most uncertain, the individual assignment method provided a less complex analytical technique whose performance may be adequate for some common trait inference problems. Our results provide specific guidance for method selection under various genetic relationships among populations with different trait distributions. PMID:24905464

  6. 40 CFR 1066.101 - Overview.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROCEDURES Equipment, Measurement Instruments, Fuel, and Analytical Gas Specifications § 1066.101 Overview. (a) This subpart addresses equipment related to emission testing, as well as test fuels and... specifications for fuels, engine fluids, and analytical gases; these specifications apply for testing under this...

  7. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys, comprehensively and in two parts, developments in the automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operational advantages as well as their potential are further described and discussed. In this first part, an introduction to LPME and its static and dynamic operation modes as well as their automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given.
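
The preconcentration factor mentioned above is the ratio of analyte concentration in the acceptor (extractant) phase to that in the original sample, and at complete recovery it is bounded by the phase-volume ratio. A sketch with invented volumes and concentrations:

```python
# Sketch of the enrichment arithmetic behind LPME's appeal.
# All volumes and concentrations are invented for illustration.

def preconcentration_factor(c_acceptor, c_sample):
    """Measured enrichment: acceptor-phase over sample concentration."""
    return c_acceptor / c_sample

def max_factor(v_sample_ul, v_acceptor_ul):
    """Upper bound at 100% extraction recovery: the phase-volume ratio."""
    return v_sample_ul / v_acceptor_ul

pf = preconcentration_factor(c_acceptor=180.0, c_sample=1.0)  # 180-fold
cap = max_factor(v_sample_ul=10000.0, v_acceptor_ul=50.0)     # 200-fold
```

The gap between the two numbers (here 180 vs 200) reflects extraction recovery below 100%, which is where automation of the extraction kinetics pays off.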

  8. A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics

    NASA Technical Reports Server (NTRS)

    Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan

    2013-01-01

    In the era of petascale computing, more scientific applications are being deployed on leadership-scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not proven adequate for many mission-critical applications, particularly in the data post-processing stage. For example, some scientific applications generate datasets composed of a vast number of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Incorporating such dimensional knowledge into the data organization can benefit the efficiency of data post-processing, a capability often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high-performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before writing them to storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR enables efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with GEOS-5, a critical climate modeling application, experimental results on the Jaguar supercomputer demonstrate an improvement of up to 73 times in read performance compared to the original I/O method.
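
    The layout idea behind such aggregation can be illustrated in a few lines. The sketch below is not the actual STAR implementation, just a toy demonstration of why stacking per-timestep records into a (time, y, x) structure turns a time-dimension query into a single contiguous slice:

    ```python
    import numpy as np

    # Simulation output: one small 2-D field per time step (the "vast number of
    # small data elements" the abstract mentions).
    steps, ny, nx = 100, 4, 4
    per_step = [np.full((ny, nx), t, dtype=np.int32) for t in range(steps)]

    # Naive log-structured layout: records appended step by step.  Reading the
    # full history of one grid point touches every record.
    def point_history_naive(records, j, i):
        return np.array([r[j, i] for r in records])

    # STAR-like aggregation (illustrative, not the actual file format): stack
    # the steps into one (time, y, x) array so a time query is one slice.
    aggregated = np.stack(per_step)          # shape (steps, ny, nx)

    def point_history_aggregated(cube, j, i):
        return cube[:, j, i]

    assert np.array_equal(point_history_naive(per_step, 2, 3),
                          point_history_aggregated(aggregated, 2, 3))
    ```

    On disk, the slice maps to one contiguous read instead of `steps` scattered small reads, which is where the reported read-performance gains come from.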

  9. Large Ensemble Analytic Framework for Consequence-Driven Discovery of Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Lamontagne, Jonathan R.; Reed, Patrick M.; Link, Robert; Calvin, Katherine V.; Clarke, Leon E.; Edmonds, James A.

    2018-03-01

    An analytic scenario generation framework is developed based on the idea that the same climate outcome can result from very different socioeconomic and policy drivers. The framework builds on the Scenario Matrix Framework's abstraction of "challenges to mitigation" and "challenges to adaptation" to facilitate the flexible discovery of diverse and consequential scenarios. We combine visual and statistical techniques for interrogating a large factorial data set of 33,750 scenarios generated using the Global Change Assessment Model. We demonstrate how the analytic framework can aid in identifying which scenario assumptions are most tied to user-specified measures of policy-relevant outcomes of interest, in our example high or low mitigation costs. We show that the current approach for selecting reference scenarios can miss policy-relevant scenario narratives that often emerge as hybrids of optimistic and pessimistic scenario assumptions. We also show that the same scenario assumption can be associated with both high and low mitigation costs, depending on the climate outcome of interest and the mitigation policy context. In the illustrative example, we show that agricultural productivity, population growth, and economic growth are most predictive of the level of mitigation costs. Formulating policy-relevant scenarios of deeply and broadly uncertain futures benefits from large ensemble-based exploration of quantitative measures of consequences. To this end, we have contributed a large database of climate change futures that can support "bottom-up" scenario generation techniques that capture a broader array of consequences than those that emerge from limited sampling of a few reference scenarios.
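
    The "which assumptions are most tied to an outcome" step can be approximated with a simple mutual-information ranking over a factorial design. The sketch below uses a hypothetical three-factor ensemble and an invented cost rule, not the paper's 33,750 GCAM scenarios or its statistical machinery:

    ```python
    import itertools, math

    # Toy factorial ensemble (illustrative only).  Each factor takes levels 0-2.
    factors = ["ag_productivity", "population", "gdp_growth"]
    scenarios = list(itertools.product(range(3), repeat=3))

    # Hypothetical consequence measure: high mitigation cost driven mostly by
    # low agricultural productivity, weakly by population, not at all by GDP.
    def high_cost(s):
        ag, pop, gdp = s
        return (2 - ag) * 2 + pop >= 4

    labels = [high_cost(s) for s in scenarios]

    def mutual_information(values, labels):
        """I(factor; label) in bits, from empirical joint frequencies."""
        n = len(values)
        mi = 0.0
        for v in set(values):
            for l in set(labels):
                p_vl = sum(1 for x, y in zip(values, labels)
                           if x == v and y == l) / n
                if p_vl == 0:
                    continue
                p_v = values.count(v) / n
                p_l = labels.count(l) / n
                mi += p_vl * math.log2(p_vl / (p_v * p_l))
        return mi

    ranked = sorted(factors, key=lambda f: -mutual_information(
        [s[factors.index(f)] for s in scenarios], labels))
    print(ranked)  # ag_productivity ranks first, gdp_growth last
    ```

    Ranking factors by how much they reduce uncertainty about the outcome is one simple stand-in for the visual/statistical interrogation the paper describes.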

  10. Electrochemical Quartz Crystal Nanobalance (EQCN) Based Biosensor for Sensitive Detection of Antibiotic Residues in Milk.

    PubMed

    Bhand, Sunil; Mishra, Geetesh K

    2017-01-01

    An electrochemical quartz crystal nanobalance (EQCN), which provides real-time analysis of dynamic surface events, is a valuable tool for analyzing biomolecular interactions. EQCN biosensors are based on mass-sensitive measurements that can detect the small mass changes caused by chemical binding to small piezoelectric crystals. Among the various biosensors, the piezoelectric biosensor is considered one of the most sensitive analytical techniques, capable of detecting antigens at picogram levels. EQCN is an effective monitoring technique for the regulation of antibiotics below the maximum residue limit (MRL). The analysis of antibiotic residues requires high sensitivity, rapidity, reliability, and cost effectiveness. For analytical purposes the general approach is to take advantage of the piezoelectric effect by immobilizing a biosensing layer on top of the piezoelectric crystal. The sensing layer usually comprises a biological material such as an antibody, enzyme, or aptamer having high specificity and selectivity for the target molecule to be detected. The biosensing layer is usually functionalized using surface chemistry modifications. When these bio-functionalized quartz crystals are exposed to a particular substance of interest (e.g., a substrate, inhibitor, antigen or protein), a binding interaction occurs. This causes a frequency or mass change that can be used to determine the amount of material that interacted or bound. EQCN biosensors can easily be automated by using a flow injection analysis (FIA) setup coupled through automated pumps and injection valves. Such FIA-EQCN biosensors have great potential for the detection of analytes such as antibiotic residues in matrices including water, waste water, and milk.
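
    The frequency-to-mass conversion underlying these mass-sensitive measurements is the Sauerbrey relation. A hedged numerical sketch using standard quartz constants; the 5 MHz fundamental and 1 cm² area are illustrative assumptions, not values from the chapter:

    ```python
    import math

    def sauerbrey_mass_change(delta_f_hz, f0_hz, area_cm2):
        """Mass change (g) from the Sauerbrey relation
            delta_f = -2 * f0**2 * delta_m / (A * sqrt(rho_q * mu_q)),
        valid for thin, rigid films on an AT-cut quartz crystal."""
        rho_q = 2.648        # quartz density, g/cm^3
        mu_q = 2.947e11      # quartz shear modulus, g/(cm*s^2)
        return -delta_f_hz * area_cm2 * math.sqrt(rho_q * mu_q) / (2 * f0_hz ** 2)

    # A 1 Hz frequency drop on a 5 MHz, 1 cm^2 crystal corresponds to
    # roughly 17.7 ng of bound material:
    dm = sauerbrey_mass_change(delta_f_hz=-1.0, f0_hz=5e6, area_cm2=1.0)
    print(f"{dm * 1e9:.1f} ng")
    ```

    The nanogram-per-hertz scale of this constant is what gives the technique the picogram-level sensitivity the abstract cites once sub-millihertz frequency resolution is available.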

  11. Nanoflow Separation of Amino Acids for the Analysis of Cosmic Dust

    NASA Technical Reports Server (NTRS)

    Martin, M. P.; Glavin, D. P.; Dworkin, Jason P.

    2008-01-01

    The delivery of amino acids to the early Earth by interplanetary dust particles, comets, and carbonaceous meteorites could have been a significant source of the early Earth's prebiotic organic inventory. Amino acids are central to modern terrestrial biochemistry as major components of proteins and enzymes and were probably vital in the origin of life. A variety of amino acids have been detected in the CM carbonaceous meteorite Murchison, many of which are exceptionally rare in the terrestrial biosphere, including α-aminoisobutyric acid (AIB) and isovaline. AIB has also been detected in a small percentage of Antarctic micrometeorite grains believed to be related to the CM meteorites. We report on progress in optimizing a nanoflow liquid chromatography separation system with dual detection via laser-induced fluorescence and time-of-flight mass spectrometry (nLC-LIF/ToF-MS) for the analysis of o-phthaldialdehyde/N-acetyl-L-cysteine (OPA/NAC)-labeled amino acids in cosmic dust grains. The very low flow rates of nLC (<3 µL/min) compared with analytical LC (>0.1 mL/min), combined with column bead sizes of <2 µm, have the potential to produce efficient analyte ionization and chromatograms with very sharp peaks; both increase sensitivity. The combination of selectivity (only primary amines are derivatized), sensitivity (detection limits more than 4 orders of magnitude lower than traditional GC-MS techniques), and specificity (compound identities are determined by both retention time and exact mass) makes this a compelling technique. However, developing an analytical method that separates compounds as structurally similar as amino acid monomers while producing the sharp peaks required for maximum sensitivity is challenging.

  12. Nuclear Resonance Fluorescence to Measure Plutonium Mass in Spent Nuclear Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludewigt, Bernhard A; Quiter, Brian J.; Ambers, Scott D.

    2011-01-14

    The Next Generation Safeguards Initiative (NGSI) of the U.S. Department of Energy is supporting a multi-lab/university collaboration to quantify the plutonium (Pu) mass in spent nuclear fuel (SNF) assemblies and to detect the diversion of pins with non-destructive assay (NDA) methods. The following 14 NDA techniques are being studied: Delayed Neutrons, Differential Die-Away, Differential Die-Away Self-Interrogation, Lead Slowing Down Spectrometer, Neutron Multiplicity, Passive Neutron Albedo Reactivity, Total Neutron (Gross Neutron), X-Ray Fluorescence, ²⁵²Cf Interrogation with Prompt Neutron Detection, Delayed Gamma, Nuclear Resonance Fluorescence, Passive Prompt Gamma, Self-Interrogation Neutron Resonance Densitometry, and Neutron Resonance Transmission Analysis. Understanding and maturity of the techniques vary greatly, ranging from decades-old, well-understood methods to new approaches. Nuclear Resonance Fluorescence (NRF) is a technique that had not previously been studied for SNF assay or similar applications. Since NRF generates isotope-specific signals, the promise and appeal of the technique lie in its potential to directly measure the amount of a specific isotope in an SNF assay target. The objectives of this study were to design and model suitable NRF measurement methods, to quantify capabilities and corresponding instrumentation requirements, and to evaluate the prospects and potential of NRF for SNF assay. The main challenge of the technique is to achieve the sensitivity and precision, i.e., to accumulate sufficient counting statistics, required for quantifying the mass of Pu isotopes in SNF assemblies. Systematic errors, considered a lesser problem for a direct measurement and only briefly discussed in this report, need to be evaluated for specific instrument designs in the future.
    Also, since the technical capability of using NRF to measure Pu in SNF has not been established, this report does not directly address issues such as cost, size, development time, or concerns related to the use of Pu in measurement systems. This report discusses basic NRF measurement concepts, i.e., backscatter and transmission methods, and photon source and γ-ray detector options in Section 2. An analytical model for calculating NRF signal strengths is presented in Section 3, together with enhancements to the MCNPX code and descriptions of the modeling techniques drawn upon in the following sections. Making extensive use of the model and MCNPX simulations, the capabilities of the backscatter and transmission methods based on bremsstrahlung or quasi-monoenergetic photon sources were analyzed, as described in Sections 4 and 5. A recent transmission experiment is reported in Appendix A. While this experiment was not directly part of this project, its results provide an important reference point for our analytical estimates and MCNPX simulations. Used-fuel radioactivity calculations, the enhancements to the MCNPX code, and details of the MCNPX simulations are documented in the other appendices.
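
    The counting-statistics challenge the report highlights can be made concrete with Poisson error propagation for a net peak sitting on background. The rates below are hypothetical placeholders, not values from the report:

    ```python
    import math

    def relative_uncertainty(signal, background):
        """Fractional 1-sigma uncertainty of net peak counts when the
        background under the peak is estimated from an equal-width sideband
        (Poisson statistics: var(net) = signal + 2*background)."""
        return math.sqrt(signal + 2 * background) / signal

    def time_to_reach(target_rel, signal_rate, background_rate):
        """Measurement time (s) needed for a target fractional uncertainty,
        given mean signal and background count rates (counts/s)."""
        # rel(t) = sqrt(s*t + 2*b*t) / (s*t)  =>  t = (s + 2*b) / (s*target)^2
        return (signal_rate + 2 * background_rate) / (signal_rate * target_rel) ** 2

    # Hypothetical rates: 0.5 NRF counts/s on a 5 counts/s background already
    # requires many hours of counting to reach 1 % precision.
    t = time_to_reach(target_rel=0.01, signal_rate=0.5, background_rate=5.0)
    print(f"{t / 3600:.1f} h")
    ```

    The quadratic dependence on the precision target is why modest improvements in signal rate or background suppression translate into large reductions in assay time.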

  13. State of practice and emerging application of analytical techniques of nuclear forensic analysis: highlights from the 4th Collaborative Materials Exercise of the Nuclear Forensics International Technical Working Group (ITWG)

    DOE PAGES

    Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.

    2016-09-16

    The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Moreover, exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. Here we provide an objective review of the state of practice and emerging application of analytical techniques of nuclear forensic analysis based upon the outcome of this most recent exercise.

  14. State of practice and emerging application of analytical techniques of nuclear forensic analysis: highlights from the 4th Collaborative Materials Exercise of the Nuclear Forensics International Technical Working Group (ITWG)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.

    The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Moreover, exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. Here we provide an objective review of the state of practice and emerging application of analytical techniques of nuclear forensic analysis based upon the outcome of this most recent exercise.

  15. Conceptual data sampling for breast cancer histology image classification.

    PubMed

    Rezk, Eman; Awan, Zainab; Islam, Fahad; Jaoua, Ali; Al Maadeed, Somaya; Zhang, Nan; Das, Gautam; Rajpoot, Nasir

    2017-10-01

    Data analytics has become increasingly complicated as the amount of data has increased. One technique used to enable analytics on large datasets is data sampling, in which a portion of the data is selected so as to preserve the data's characteristics for use in analytics. In this paper, we introduce a novel data sampling technique rooted in formal concept analysis theory. This technique is used to create samples that reflect the data distribution across a set of binary patterns. The proposed sampling technique is applied to classifying regions of breast cancer histology images as malignant or benign. The performance of our method is compared to other classical sampling methods. The results indicate that our method is efficient and generates an illustrative sample of small size. It is also competitive with other sampling methods in terms of sample size and sample quality, as measured by classification accuracy and F1 score. Copyright © 2017 Elsevier Ltd. All rights reserved.
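
    The paper's formal-concept-analysis sampler is not reproduced here, but the underlying idea of preserving the data distribution across binary patterns can be sketched as a simple pattern-stratified sample:

    ```python
    import random
    from collections import defaultdict

    def pattern_stratified_sample(rows, fraction, seed=0):
        """Illustrative sampling in the spirit of the abstract (not the
        authors' FCA algorithm): group records by their binary attribute
        pattern, then draw from each group proportionally so the sample
        preserves the pattern distribution."""
        rng = random.Random(seed)
        groups = defaultdict(list)
        for row in rows:
            groups[tuple(row)].append(row)
        sample = []
        for pattern, members in groups.items():
            k = max(1, round(fraction * len(members)))  # keep every pattern
            sample.extend(rng.sample(members, k))
        return sample

    # Toy dataset: three binary patterns with skewed frequencies.
    data = [(1, 0, 1)] * 80 + [(0, 1, 1)] * 15 + [(1, 1, 0)] * 5
    sample = pattern_stratified_sample(data, fraction=0.1)
    print(len(sample))  # 8 + 2 + 1 = 11 records; all three patterns survive
    ```

    Guaranteeing at least one representative per pattern is what keeps a small sample "illustrative" of the full dataset's structure.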

  16. [Clinical Application of Analytical and Medical Instruments Mainly Using MS Techniques].

    PubMed

    Tanaka, Koichi

    2016-02-01

    Analytical instruments for clinical use are commonly required to confirm disease-related compounds and forms with the highest possible sensitivity, quantitative performance, and specificity, with minimal invasiveness, within a short time, easily, and at low cost. Technical innovation in mass spectrometry (MS) has led to techniques that meet these requirements. Beyond confirming known substances, MS has other, less widely appreciated uses as a tool for discovering unknown phenomena and compounds, for example in clarifying the mechanisms of human diseases. The human body has approximately 100 thousand types of protein, and there may be more than several million types of proteins and their metabolites. Most of them have yet to be discovered, and their discovery may give birth to new academic fields and lead to the clarification of diseases, the development of new medicines, and more. For example, using the MS system developed under "Contribution to drug discovery and diagnosis by next generation of advanced mass spectrometry system," one of the 30 projects of the "Funding Program for World-Leading Innovative R&D on Science and Technology" (FIRST program), together with other individual basic technologies, we succeeded in discovering new candidate disease biomarkers for Alzheimer's disease, cancer, and other conditions. Further contributions of MS to clinical medicine can be expected through the development and improvement of new techniques, efforts to verify discoveries, and communication with the medical front.

  17. Impact of polymers on the crystallization and phase transition kinetics of amorphous nifedipine during dissolution in aqueous media.

    PubMed

    Raina, Shweta A; Alonzo, David E; Zhang, Geoff G Z; Gao, Yi; Taylor, Lynne S

    2014-10-06

    The commercial and clinical success of amorphous solid dispersions (ASDs) in overcoming the low bioavailability of poorly soluble molecules has generated momentum among pharmaceutical scientists to advance the fundamental understanding of these complex systems. A major limitation of these formulations stems from the propensity of amorphous solids to crystallize upon exposure to aqueous media. This study was specifically focused on developing analytical techniques to evaluate the impact of polymers on crystallization behavior during dissolution, which is critical in designing effective amorphous formulations. The crystallization and polymorphic conversions of a model compound, nifedipine, were explored in the absence and presence of polyvinylpyrrolidone (PVP), hydroxypropylmethyl cellulose (HPMC), and HPMC-acetate succinate (HPMC-AS). A combination of analytical approaches, including Raman spectroscopy, polarized light microscopy, and chemometric techniques such as multivariate curve resolution (MCR), was used to evaluate the kinetics of crystallization and polymorphic transitions, as well as to identify the primary route of crystallization, i.e., whether crystallization took place in the dissolving solid matrix or from the supersaturated solution generated during dissolution. Pure amorphous nifedipine, when exposed to aqueous media, was found to crystallize rapidly from the amorphous matrix, even when polymers were present in the dissolution medium. Matrix crystallization was avoided when amorphous solid dispersions were prepared; however, crystallization from the solution phase was rapid. MCR was found to be an excellent data-processing technique for deconvoluting the complex phase transition behavior of nifedipine.
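
    At its core, MCR is an alternating least-squares factorization of the spectral data matrix into concentration profiles and pure-component spectra under non-negativity constraints. A minimal sketch on synthetic two-component data (invented Gaussian "spectra", not nifedipine measurements):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic dissolution-like dataset: two spectral components (think
    # amorphous and crystalline forms) interconverting over time.
    wavelengths = np.linspace(0, 1, 50)
    S_true = np.vstack([np.exp(-((wavelengths - 0.3) / 0.05) ** 2),
                        np.exp(-((wavelengths - 0.7) / 0.05) ** 2)])
    t = np.linspace(0, 1, 30)
    C_true = np.vstack([1 - t, t]).T          # one form converts into the other
    D = C_true @ S_true + 0.001 * rng.standard_normal((30, 50))

    # Minimal MCR-ALS: alternate least-squares solves for concentrations C and
    # spectra S, clipping negatives, from a crude random initial guess.
    C = np.abs(rng.standard_normal((30, 2)))
    for _ in range(200):
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0], 0, None)
        C = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0, None)

    residual = np.linalg.norm(D - C @ S) / np.linalg.norm(D)
    print(f"relative residual: {residual:.4f}")
    ```

    Production MCR-ALS adds closure and unimodality constraints and careful initialization, but the alternating structure is the same: it is what lets overlapping Raman signatures be separated into per-form kinetic profiles.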

  18. The flotation and adsorption of mixed collectors on oxide and silicate minerals.

    PubMed

    Xu, Longhua; Tian, Jia; Wu, Houqin; Lu, Zhongyuan; Sun, Wei; Hu, Yuehua

    2017-12-01

    The analysis of flotation and adsorption of mixed collectors on oxide and silicate minerals is of great importance for both industrial applications and theoretical research. Over the past years, significant progress has been achieved in understanding the adsorption of single collectors in micelles as well as at interfaces. By contrast, the self-assembly of mixed collectors at liquid/air and solid/liquid interfaces remains a developing area, as a result of the complexity of the mixed systems involved and the limited availability of suitable analytical techniques. In this work, we systematically review the processes involved in the adsorption of mixed collectors in micelles and at interfaces by examining four specific points, namely, theoretical background, factors that affect adsorption, analytical techniques, and self-assembly of mixed surfactants at the mineral/liquid interface. In the first part, the theoretical background of collector mixtures is introduced, together with several core solution theories, which are classified according to their application in the analysis of physicochemical properties of mixed collector systems. In the second part, we discuss the factors that can influence adsorption, including factors related to the structure of collectors and environmental conditions. We summarize their influence on the adsorption of mixed systems, with the objective of providing guidance on the progress achieved in this field to date. Advances in measurement techniques can greatly promote our understanding of adsorption processes. In the third part, therefore, modern techniques such as optical reflectometry, neutron scattering, neutron reflectometry, thermogravimetric analysis, fluorescence spectroscopy, ultrafiltration, atomic force microscopy, analytical ultracentrifugation, X-ray photoelectron spectroscopy, vibrational sum frequency generation spectroscopy, and molecular dynamics simulations are introduced along with their applications.
    Finally, focusing on oxide and silicate minerals, we review and summarize the flotation and adsorption of the three most widely used mixed surfactant systems (anionic-cationic, anionic-nonionic, and cationic-nonionic) at the liquid/mineral interface in order to fully understand the self-assembly process. In the end, the paper gives a brief outlook on possible future developments in mixed surfactant systems. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Geophysical technique for mineral exploration and discrimination based on electromagnetic methods and associated systems

    DOEpatents

    Zhdanov, Michael S. (Salt Lake City, UT)

    2008-01-29

    Mineral exploration needs a reliable method to distinguish between uneconomic mineral deposits and economic mineralization. A method and system includes a geophysical technique for subsurface material characterization, mineral exploration and mineral discrimination. The technique introduced in this invention detects induced polarization effects in electromagnetic data and uses remote geophysical observations to determine the parameters of an effective conductivity relaxation model using a composite analytical multi-phase model of the rock formations. The conductivity relaxation model and analytical model can be used to determine parameters related by analytical expressions to the physical characteristics of the microstructure of the rocks and minerals. These parameters are ultimately used for the discrimination of different components in underground formations, and in this way provide an ability to distinguish between uneconomic mineral deposits and zones of economic mineralization using geophysical remote sensing technology.
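
    The "effective conductivity relaxation model" for induced-polarization effects is commonly parameterized in Cole-Cole form. Below is a short sketch of the standard Cole-Cole complex-resistivity model with illustrative parameters; the patent's composite multi-phase rock model is more elaborate than this:

    ```python
    import numpy as np

    def cole_cole_resistivity(omega, rho0, eta, tau, c):
        """Standard Cole-Cole relaxation model of complex resistivity,
            rho(w) = rho0 * (1 - eta * (1 - 1 / (1 + (1j*w*tau)**c))),
        often used to describe induced-polarization (IP) effects: eta is the
        chargeability, tau the time constant, c the relaxation exponent."""
        return rho0 * (1 - eta * (1 - 1 / (1 + (1j * omega * tau) ** c)))

    omega = np.logspace(-2, 4, 7)                # angular frequency, rad/s
    rho = cole_cole_resistivity(omega, rho0=100.0, eta=0.3, tau=0.1, c=0.5)

    # At low frequency the polarization has time to build up (rho -> rho0);
    # at high frequency it cannot, so resistivity drops toward rho0*(1 - eta).
    print(rho[0].real, rho[-1].real)
    ```

    Fitting eta, tau, and c to field electromagnetic data is what allows discrimination between mineral textures, since disseminated economic mineralization and barren conductors produce different relaxation parameters.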

  20. Microphotographs of cyanobacteria documenting the effects of various cell-lysis techniques

    USGS Publications Warehouse

    Rosen, Barry H.; Loftin, Keith A.; Smith, Christopher E.; Lane, Rachael F.; Keydel, Susan P.

    2011-01-01

    Cyanotoxins are a group of organic compounds biosynthesized intracellularly by many species of cyanobacteria found in surface water. The United States Environmental Protection Agency has listed cyanotoxins on the Safe Drinking Water Act's Contaminant Candidate List 3 for consideration for future regulation to protect public health. Cyanotoxins also pose a risk to humans and other organisms in a variety of other exposure scenarios. Accurate and precise analytical measurements of cyanotoxins are critical to the evaluation of concentrations in surface water to address human health and ecosystem effects. A common approach to total cyanotoxin measurement involves cell membrane disruption to release the cyanotoxins to the dissolved phase, followed by filtration to remove cellular debris. Several methods have been used historically; however, no standard protocols exist to ensure this process is consistent between laboratories before the dissolved phase is measured by an analytical technique for cyanotoxin identification and quantitation. No systematic evaluation has been conducted comparing the multiple laboratory sample-processing techniques for physical disruption of cell membranes or cyanotoxin recovery. Surface water samples collected from lakes, reservoirs, and rivers containing mixed assemblages of organisms dominated by cyanobacteria, as well as laboratory cultures of specific cyanobacterial species, were used as part of this study evaluating multiple laboratory cell-lysis techniques in partnership with the U.S. Environmental Protection Agency. Evaluated extraction techniques included boiling, autoclaving, sonication, chemical treatment, and freeze-thaw. Both treated and untreated samples were evaluated for cell membrane integrity microscopically via light, epifluorescence, and epifluorescence in the presence of a DNA stain.
    The DNA stain, which does not permeate live cells with intact membrane structures, was used as an indicator of cyanotoxin release into the dissolved phase. Of the five techniques, sonication (at 70 percent) was most effective at complete cell destruction, while QuikLyse (trademarked) was least effective. Autoclaving, boiling, and sequential freeze-thaw were moderately effective in the physical destruction of colonies and filaments.

  1. Systematically reviewing and synthesizing evidence from conversation analytic and related discursive research to inform healthcare communication practice and policy: an illustrated guide

    PubMed Central

    2013-01-01

    Background Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence. Methods We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures:
    • reviewing existing systematic review methods and our own prior experience of applying these
    • clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing
    • holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing
    • attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues including illness progression and dying
    Results We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing and new techniques designed for working with conversation analytic evidence. Conclusions The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects.
    We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181

  2. Analytical performance of 17 general chemistry analytes across countries and across manufacturers in the INPUtS project of EQA organizers in Italy, the Netherlands, Portugal, United Kingdom and Spain.

    PubMed

    Weykamp, Cas; Secchiero, Sandra; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Thomas, Annette; Jassam, Nuthar; Barth, Julian H; Perich, Carmen; Ricós, Carmen; Faria, Ana Paula

    2017-02-01

    Optimum patient care in relation to laboratory medicine is achieved when results of laboratory tests are equivalent, irrespective of the analytical platform used or the country where the laboratory is located. Standardization and harmonization minimize differences, and the success of efforts to achieve this can be monitored with international category 1 external quality assessment (EQA) programs. An EQA project with commutable samples, targeted with reference measurement procedures (RMPs), was organized by EQA institutes in Italy, the Netherlands, Portugal, the UK, and Spain. Results for 17 general chemistry analytes were evaluated across countries and across manufacturers according to performance specifications derived from biological variation (BV). For K, uric acid, glucose, cholesterol, and high-density lipoprotein (HDL) cholesterol, the minimum performance specification was met in all countries and by all manufacturers. For Na, Cl, and Ca, the minimum performance specifications were met by none of the countries and manufacturers. For enzymes, the situation was complicated, as standardization of enzyme results toward RMPs was still not achieved in 20% of the laboratories and questionable in the remaining 80%. The overall performance of the measurement of 17 general chemistry analytes in European medical laboratories met the minimum performance specifications. In this general picture, there were no significant differences per country and no significant differences per manufacturer. There were major differences between the analytes: for six analytes the minimum quality specifications were not met, and manufacturers should improve their performance for these analytes. Standardization of enzyme results requires ongoing efforts.
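
    Performance specifications derived from biological variation conventionally set allowable imprecision and bias as fixed fractions of within-subject (CVI) and between-subject (CVG) variation. A sketch of that arithmetic with approximate BV values from published databases (illustrative inputs, not the study's exact figures):

    ```python
    import math

    def allowable_total_error(cv_i, cv_g, level="minimum"):
        """Allowable total error from biological variation, using the
        conventional model: TEa = bias_spec + 1.65 * imprecision_spec,
        with optimum/desirable/minimum fractions of CVI and CVG."""
        k_imp = {"optimum": 0.25, "desirable": 0.50, "minimum": 0.75}[level]
        k_bias = {"optimum": 0.125, "desirable": 0.25, "minimum": 0.375}[level]
        imprecision = k_imp * cv_i
        bias = k_bias * math.sqrt(cv_i ** 2 + cv_g ** 2)
        return 1.65 * imprecision + bias

    # Why Na tends to fail while glucose passes: sodium's tiny biological
    # variation (CVI ~0.6 %, CVG ~0.7 %, approximate) yields a very tight
    # specification, while glucose (CVI ~5.6 %, CVG ~7.5 %) allows far more
    # analytical error.
    print(f"Na  TEa(min): {allowable_total_error(0.6, 0.7):.2f} %")
    print(f"Glu TEa(min): {allowable_total_error(5.6, 7.5):.2f} %")
    ```

    A total allowable error near 1 % for sodium is at the edge of what routine platforms deliver, which is consistent with the abstract's finding that Na, Cl, and Ca specifications were met by none of the countries or manufacturers.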

  3. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  4. Mass spectrometric based approaches in urine metabolomics and biomarker discovery.

    PubMed

    Khamis, Mona M; Adamko, Darryl J; El-Aneed, Anas

    2017-03-01

    Urine metabolomics has recently emerged as a prominent field for the discovery of non-invasive biomarkers that can detect subtle metabolic discrepancies in response to a specific disease or therapeutic intervention. Urine, compared to other biofluids, is characterized by its ease of collection, richness in metabolites and its ability to reflect imbalances of all biochemical pathways within the body. Following urine collection for metabolomic analysis, samples must be immediately frozen to quench any biogenic and/or non-biogenic chemical reactions. According to the aim of the experiment; sample preparation can vary from simple procedures such as filtration to more specific extraction protocols such as liquid-liquid extraction. Due to the lack of comprehensive studies on urine metabolome stability, higher storage temperatures (i.e. 4°C) and repetitive freeze-thaw cycles should be avoided. To date, among all analytical techniques, mass spectrometry (MS) provides the best sensitivity, selectivity and identification capabilities to analyze the majority of the metabolite composition in the urine. Combined with the qualitative and quantitative capabilities of MS, and due to the continuous improvements in its related technologies (i.e. ultra high-performance liquid chromatography [UPLC] and hydrophilic interaction liquid chromatography [HILIC]), liquid chromatography (LC)-MS is unequivocally the most utilized and the most informative analytical tool employed in urine metabolomics. Furthermore, differential isotope tagging techniques has provided a solution to ion suppression from urine matrix thus allowing for quantitative analysis. In addition to LC-MS, other MS-based technologies have been utilized in urine metabolomics. These include direct injection (infusion)-MS, capillary electrophoresis-MS and gas chromatography-MS. 
In this article, the current progress of different MS-based techniques in exploring the urine metabolome, as well as recent findings on potentially diagnostic urinary biomarkers, is discussed. © 2015 Wiley Periodicals, Inc. Mass Spec Rev 36:115-134, 2017.

  5. Effects of PCB exposure on neuropsychological function in children.

    PubMed

    Schantz, Susan L; Widholm, John J; Rice, Deborah C

    2003-03-01

    In the last decade advances in the analytic methods for quantification of polychlorinated biphenyls (PCBs) have resulted in widespread availability of congener-specific analysis procedures, and large amounts of data on PCB congener profiles in soil, air, water, sediments, foodstuffs, and human tissues have become available. These data have revealed that the PCB residues in environmental media and human tissues may not closely resemble any of the commercial PCB mixtures, depending on source of exposure, bioaccumulation through the food chain, and weathering of PCBs in the environment. At the same time, toxicological research has led to a growing awareness that different classes of PCB congeners have different profiles of toxicity. These advances in analytic techniques and toxicological knowledge are beginning to influence the risk assessment process. As the data from ongoing PCB studies assessing the mediators of neurobehavioral outcomes in children are published, the weight of evidence for PCB effects on neurodevelopment is growing. Studies in Taiwan, Michigan (USA), New York (USA), Holland, Germany, and the Faroe Islands have all reported negative associations between prenatal PCB exposure and measures of cognitive functioning in infancy or childhood. The German study also reported a negative association between postnatal PCB exposure and cognitive function in early childhood--a result that had not been found in previous studies. Only one published study in North Carolina (USA) has failed to find an association between PCB exposure and cognitive outcomes. Despite the fact that several more recent studies have used congener-specific analytic techniques, there have been only limited attempts to assess the role of specific PCB congeners or classes of congeners in mediating neurodevelopmental outcomes. 
From a statistical standpoint, attempts to determine the role of individual congeners in mediating outcomes are hampered by the fact that concentrations of most individual congeners are highly correlated with each other and with total PCBs. From a toxicological standpoint, these efforts are hampered by the fact that many of the PCB congeners present in human tissues have never been studied in the laboratory, and their relative potency to produce nervous system effects is unknown. More complete information on the health effects of various congeners or congener classes would allow more informed scientific and risk assessment decisions.
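
    The collinearity problem described above can be made concrete with a small simulation. The sketch below is purely illustrative (the congener names, scales and noise levels are invented, not drawn from the studies): when each congener's concentration tracks a shared total-exposure factor, the pairwise correlation approaches 1, so a regression cannot cleanly separate their individual contributions.

```python
import numpy as np

# Illustrative sketch: congener levels simulated as a shared "total exposure"
# factor plus measurement noise. All names and parameters are hypothetical.
rng = np.random.default_rng(0)
n = 200
total_exposure = rng.lognormal(mean=0.0, sigma=0.5, size=n)

# Each congener tracks total exposure closely, differing mostly by scale + noise.
congener_a = 0.30 * total_exposure + rng.normal(0, 0.02, n)
congener_b = 0.25 * total_exposure + rng.normal(0, 0.02, n)

# Near-unit correlation is what hampers congener-specific effect estimation.
r = np.corrcoef(congener_a, congener_b)[0, 1]
print(f"Pearson r between congeners: {r:.3f}")
```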

  6. Analytical evaluation of three enzymatic assays for measuring total bile acids in plasma using a fully-automated clinical chemistry platform.

    PubMed

    Danese, Elisa; Salvagno, Gian Luca; Negrini, Davide; Brocco, Giorgio; Montagnana, Martina; Lippi, Giuseppe

    2017-01-01

    Although the clinical significance of measuring bile acid concentration in plasma or serum has long been recognized in patients with hepatobiliary disease and/or bile acid malabsorption, the reference separation techniques are expensive and mostly unsuitable for early diagnosis and for measuring large volumes of samples. Therefore, this study aimed to evaluate the analytical performance of three commercial enzymatic techniques for measuring total bile acids in plasma using a fully-automated clinical chemistry platform. Three commercial enzymatic assays (from Diazyme, Randox and Sentinel) were adapted for use on a Roche Cobas c501. We performed imprecision and linearity studies, and we compared results with those obtained using a reference liquid chromatography-mass spectrometry (LC-MS) technique on an identical set of lithium-heparin plasma samples. Total imprecision was optimal, always equal to or lower than 3%. All assays had optimal linearity between 3 and 138 μmol/L. The comparison studies showed good correlation with LC-MS data (Spearman's correlation coefficients always >0.92), but all plasma sample values were significantly underestimated by the commercial enzymatic assays (-44% for Diazyme, -16% for Randox and -12% for Sentinel). The agreement at the 10 and 40 μmol/L diagnostic thresholds of total bile acids in plasma ranged between 86% and 92%. This discrepancy was found to be mainly attributable to heterogeneous bile acid composition of the three assay calibrators. This study suggests that the analytical performance of the three commercial enzymatic assays is excellent, confirming that automation of this important test by means of enzymatic assessment may be feasible, practical, reliable and presumably inexpensive.
Nevertheless, the underestimation of values compared to the reference LC-MS also suggests that the local definition and validation of reference ranges according to the combination between the specific enzymatic assay and the different clinical chemistry platforms may be advisable.
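
    The method-comparison statistics reported above (rank correlation against LC-MS plus a systematic percentage underestimation) can be sketched in a few lines. The numbers below are invented for illustration and are not the study's data; a real evaluation would also include Bland-Altman or Passing-Bablok analysis.

```python
import statistics

# Hypothetical paired measurements: reference LC-MS vs. a candidate
# enzymatic assay, in µmol/L.
lc_ms     = [5.0, 12.0, 25.0, 40.0, 80.0, 120.0]
enzymatic = [4.4, 10.5, 22.0, 35.0, 70.0, 106.0]

def spearman_rho(x, y):
    """Spearman rank correlation (no ties assumed in this toy data)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

rho = spearman_rho(lc_ms, enzymatic)
# Mean percent bias of the candidate assay relative to the reference.
bias = statistics.mean((e - r) / r * 100 for e, r in zip(enzymatic, lc_ms))
print(f"Spearman rho = {rho:.2f}, mean bias = {bias:.1f}%")
```

    With these made-up values the rank correlation is perfect while the assay still underestimates by roughly 12% on average, mirroring how good correlation and systematic bias can coexist.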

  7. Further Investigations of Content Analytic Techniques for Extracting the Differentiating Information Contained in the Narrative Sections of Performance Evaluations for Navy Enlisted Personnel. Technical Report No. 75-1.

    ERIC Educational Resources Information Center

    Ramsey-Klee, Diane M.; Richman, Vivian

    The purpose of this research is to develop content analytic techniques capable of extracting the differentiating information in narrative performance evaluations for enlisted personnel in order to aid in the process of selecting personnel for advancement, duty assignment, training, or quality retention. Four tasks were performed. The first task…

  8. Cost and schedule analytical techniques development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. The accomplishments summarized cover the REDSTAR database, the NASCOM hard-copy database, the NASCOM automated database, the NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.

  9. Finalizing the Libby Action Plan Research Program

    EPA Pesticide Factsheets

    Libby, Montana is the location of a former vermiculite mine that operated from 1923 to 1990. The vermiculite ore from the mine co-existed with amphibole asbestos, referred to as Libby Amphibole Asbestos (LAA). Since the cessation of the asbestos mining and processing operations, there has been significant progress in reducing exposure to LAA in Libby, Montana. In 2009, the U.S. Environmental Protection Agency (EPA), jointly with the Department of Health and Human Services (DHHS), declared a public health emergency in Libby due to observed asbestos-related health effects in the region. As part of this effort, the EPA led a cross-agency research program that conducted analytical, toxicological, and epidemiological research on the health effects of asbestos at the Libby Asbestos Superfund Site (Libby Site) in Libby, Montana. The Libby Action Plan (LAP) was initiated in 2007 to support the site-specific risk assessment for the Libby Site. The goal of the LAP research program was to explore the health effects of LAA and determine toxicity information specific to LAA in order to accurately inform a human health risk assessment at the Libby Site. LAP research addressed data gaps related to the health effects of exposure to LAA, particularly specific mechanisms of fiber dosimetry and toxicity (e.g., inflammatory responses), investigated disease progression in exposed populations, and advanced asbestos analytical techniques. This work incl

  10. Analysis of environmental contamination resulting from catastrophic incidents: part 2. Building laboratory capability by selecting and developing analytical methodologies.

    PubMed

    Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba

    2014-11-01

    Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. 
The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity. However, the same techniques discussed could also have application to catastrophes resulting from other incidents, such as natural disasters or industrial accidents. Further, the high sample throughput enabled by the techniques discussed could be employed for conventional environmental studies and compliance monitoring, potentially decreasing costs and/or increasing the quantity of data available to decision-makers. Published by Elsevier Ltd.

  11. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent, a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could support micro-level assessment (e.g., helping improve individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level.
An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.

  12. Insights from Smart Meters: The Potential for Peak-Hour Savings from Behavior-Based Programs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Todd, Annika; Perry, Michael; Smith, Brian

    The rollout of smart meters in the last several years has opened up new forms of previously unavailable energy data. Many utilities can now capture, in real time, granular, household-level interval usage data at very high frequency for a large proportion of their residential and small commercial customer population. This can be linked to other time- and location-specific information, providing vast, constantly growing streams of rich data (sometimes referred to by the recently popular buzzword, “big data”). Within the energy industry there is increasing interest in tapping into the opportunities that these data can provide. What can we do with all of these data? The richness and granularity of these data enable many types of creative and cutting-edge analytics. Technically sophisticated and rigorous statistical techniques can be used to pull interesting insights out of this high-frequency, human-focused data. We at LBNL are calling this “behavior analytics”. This kind of analytics has the potential to provide tremendous value to a wide range of energy programs. For example, highly disaggregated and heterogeneous information about actual energy use would allow energy efficiency (EE) and/or demand response (DR) program implementers to target specific programs to specific households; would enable evaluation, measurement and verification (EM&V) of energy efficiency programs to be performed on a much shorter time horizon than was previously possible; and would provide better insights into the energy and peak-hour savings associated with specific types of EE and DR programs (e.g., behavior-based (BB) programs). In this series, “Insights from Smart Meters”, we will present concrete, illustrative examples of the type of value that insights from behavior analytics of these data can provide (as well as pointing out its limitations).
We will supply several types of key findings, including: • Novel results, which answer questions the industry previously was unable to answer; • Proof-of-concept analytics tools that can be adapted and used by others; and • Guidelines and protocols that summarize analytical best practices. This report focuses on one example of the kind of value that analysis of this data can provide: insights into whether behavior-based (BB) efficiency programs have the potential to provide peak-hour energy savings.
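
    As an illustration of the kind of peak-hour analysis the report describes, the sketch below compares mean usage of a hypothetical treatment group against a comparison group during an assumed 4-8 PM peak window. All readings, the group definitions, and the window itself are invented; a real evaluation would rely on randomized assignment and regression adjustment rather than a raw difference.

```python
# Hour-beginning peak window, 4 PM through 7 PM readings (assumed, not from
# the report).
PEAK_HOURS = range(16, 20)

# 24 hourly kWh readings per group (averaged over households), made up.
control   = [0.6]*6 + [0.9]*6 + [1.0]*4 + [1.6, 1.7, 1.8, 1.6] + [1.0]*4
treatment = [0.6]*6 + [0.9]*6 + [1.0]*4 + [1.5, 1.55, 1.6, 1.5] + [1.0]*4

ctrl_peak = sum(control[h] for h in PEAK_HOURS) / len(PEAK_HOURS)
trt_peak = sum(treatment[h] for h in PEAK_HOURS) / len(PEAK_HOURS)
savings_pct = (ctrl_peak - trt_peak) / ctrl_peak * 100
print(f"Estimated peak-hour savings: {savings_pct:.1f}%")
```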

  13. Enantioselective Analytical- and Preparative-Scale Separation of Hexabromocyclododecane Stereoisomers Using Packed Column Supercritical Fluid Chromatography.

    PubMed

    Riddell, Nicole; Mullin, Lauren Gayle; van Bavel, Bert; Ericson Jogsten, Ingrid; McAlees, Alan; Brazeau, Allison; Synnott, Scott; Lough, Alan; McCrindle, Robert; Chittim, Brock

    2016-11-10

    Hexabromocyclododecane (HBCDD) is an additive brominated flame retardant which has been listed in Annex A of the Stockholm Convention for elimination of production and use. It has been reported to persist in the environment and has the potential for enantiomer-specific degradation, accumulation, or both, making enantioselective analyses increasingly important. The six main stereoisomers of technical HBCDD (i.e., the (+) and (-) enantiomers of α-, β-, and γ-HBCDD) were separated and isolated for the first time using enantioselective packed column supercritical fluid chromatography (pSFC) separation methods on a preparative scale. Characterization was completed using published chiral liquid chromatography (LC) methods and elution profiles, as well as X-ray crystallography, and the isolated fractions were definitively identified. Additionally, the resolution of the enantiomers, along with two minor components of the technical product (δ- and ε-HBCDD), was investigated on an analytical scale using both LC and pSFC separation techniques, and changes in elution order were highlighted. Baseline separation of all HBCDD enantiomers was achieved by pSFC on an analytical scale using a cellulose-based column. The described method emphasizes the potential associated with pSFC as a green method of isolating and analyzing environmental contaminants of concern.

  14. MOMA Gas Chromatograph-Mass Spectrometer onboard the 2018 ExoMars Mission: results and performance

    NASA Astrophysics Data System (ADS)

    Buch, A.; Pinnick, V. T.; Szopa, C.; Grand, N.; Humeau, O.; van Amerom, F. H.; Danell, R.; Freissinet, C.; Brinckerhoff, W.; Gonnsen, Z.; Mahaffy, P. R.; Coll, P.; Raulin, F.; Goesmann, F.

    2015-10-01

    The Mars Organic Molecule Analyzer (MOMA) is a dual ion source linear ion trap mass spectrometer that was designed for the 2018 joint ESA-Roscosmos mission to Mars. The main scientific aim of the mission is to search for signs of extant or extinct life in the near subsurface of Mars by acquiring samples from as deep as 2 m below the surface. MOMA will be a key analytical tool in providing chemical (molecular and chiral) information from the solid samples, with particular focus on the characterization of organic content. The MOMA instrument itself is a joint venture between NASA and ESA to develop a mass spectrometer capable of analyzing samples from pyrolysis/chemical derivatization gas chromatography (GC) as well as ambient pressure laser desorption ionization (LDI). The combination of the two analytical techniques allows for the chemical characterization of a broad range of compounds, including volatile and non-volatile species. Generally, MOMA can provide information on elemental and molecular makeup, polarity, chirality and isotopic patterns of analyte species. Here we report on the current performance of the MOMA prototype instruments, specifically the demonstration of the gas chromatography-mass spectrometry (GC-MS) mode of operation.

  15. Peptide interfaces with graphene: an emerging intersection of analytical chemistry, theory, and materials.

    PubMed

    Russell, Shane R; Claridge, Shelley A

    2016-04-01

    Because noncovalent interface functionalization is frequently required in graphene-based devices, biomolecular self-assembly has begun to emerge as a route for controlling substrate electronic structure or binding specificity for soluble analytes. The remarkable diversity of structures that arise in biological self-assembly hints at the possibility of equally diverse and well-controlled surface chemistry at graphene interfaces. However, predicting and analyzing adsorbed monolayer structures at such interfaces raises substantial experimental and theoretical challenges. In contrast with the relatively well-developed monolayer chemistry and characterization methods applied at coinage metal surfaces, monolayers on graphene are both less robust and more structurally complex, levying more stringent requirements on characterization techniques. Theory presents opportunities to understand early binding events that lay the groundwork for full monolayer structure. However, predicting interactions between complex biomolecules, solvent, and substrate is necessitating a suite of new force fields and algorithms to assess likely binding configurations, solvent effects, and modulations to substrate electronic properties. This article briefly discusses emerging analytical and theoretical methods used to develop a rigorous chemical understanding of the self-assembly of peptide-graphene interfaces and prospects for future advances in the field.

  16. Fiber optic evanescent wave biosensor

    NASA Astrophysics Data System (ADS)

    Duveneck, Gert L.; Ehrat, Markus; Widmer, H. M.

    1991-09-01

    The role of modern analytical chemistry is not restricted to quality control and environmental surveillance, but has been extended to process control using on-line analytical techniques. Besides industrial applications, highly specific, ultra-sensitive biochemical analysis becomes increasingly important as a diagnostic tool, both in central clinical laboratories and in the doctor's office. Fiber optic sensor technology can fulfill many of the requirements for both types of applications. As an example, the experimental arrangement of a fiber optic sensor for biochemical affinity assays is presented. The evanescent electromagnetic field, associated with a light ray guided in an optical fiber, is used for the excitation of luminescence labels attached to the biomolecules in solution to be analyzed. Due to the small penetration depth of the evanescent field into the medium, the generation of luminescence is restricted to the close proximity of the fiber, where, e.g., the luminescent analyte molecules combine with their affinity partners, which are immobilized on the fiber. Both cw- and pulsed light excitation can be used in evanescent wave sensor technology, enabling the on-line observation of an affinity assay on a macroscopic time scale (seconds and minutes), as well as on a microscopic, molecular time scale (nanoseconds or microseconds).
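
    The "small penetration depth" that restricts luminescence excitation to the fiber surface can be put in numbers with the standard evanescent-field formula d_p = λ / (2π √(n₁² sin²θ − n₂²)). The refractive indices, incidence angle and wavelength below are assumed example values (a fused-silica core against an aqueous sample), not parameters of this particular sensor.

```python
import math

def penetration_depth(wavelength_nm, n_core, n_medium, theta_deg):
    """Depth at which the evanescent field amplitude decays to 1/e."""
    s = n_core**2 * math.sin(math.radians(theta_deg))**2 - n_medium**2
    if s <= 0:
        raise ValueError("angle below the critical angle; no total internal reflection")
    return wavelength_nm / (2 * math.pi * math.sqrt(s))

# Assumed values: silica core n=1.457, water n=1.33, 75° incidence, 633 nm.
dp = penetration_depth(633, n_core=1.457, n_medium=1.33, theta_deg=75)
print(f"penetration depth ≈ {dp:.0f} nm")
```

    With these values the depth comes out on the order of a few hundred nanometers, which is why only labels bound at or very near the fiber surface are excited.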

  17. Preliminary studies of using preheated carrier gas for on-line membrane extraction of semivolatile organic compounds.

    PubMed

    Liu, Xinyu; Pawliszyn, Janusz

    2007-04-01

    In this paper, we present results for the on-line determination of semivolatile organic compounds (SVOCs) in air using membrane extraction with a sorbent interface-ion mobility spectrometry (MESI-IMS) system with a preheated carrier (stripping) gas. The mechanism of the mass transfer of SVOCs across a membrane was initially studied. In comparison with the extraction of volatile analytes, the mass transfer resistance that originated from the slow desorption from the internal membrane surface during the SVOC extraction processes should be taken into account. A preheated carrier gas system was therefore built to facilitate desorption of analytes from the internal membrane surface. With the benefit of a temperature gradient existing between the internal and external membrane surfaces, an increase in the desorption rate of a specific analyte at the internal surface and the diffusion coefficient within the membrane could be achieved while avoiding a decrease of the distribution constant on the external membrane interface. This technique improved both the extraction rate and response times of the MESI-IMS system for the analysis of SVOCs. Finally, the MESI-IMS system was shown to be capable of on-site measurement by monitoring selected polynuclear aromatic hydrocarbons emitted from cigarette smoke.

  18. Analytical model of contamination during the drying of cylinders of jamonable muscle

    NASA Astrophysics Data System (ADS)

    Montoya Arroyave, Isabel

    2014-05-01

    For a cylinder of jamonable (ham) muscle of radius R and length much greater than R, considering that the internal resistance to the transfer of water is much greater than the external one, and that the internal resistance is a certain function of the distance to the axis, the pointwise moisture distribution in the cylinder is computed analytically in terms of Bessel functions. During the drying and salting process, the cylinder is susceptible to contamination by bacteria and protozoa from the environment. An analytical model of this contamination is presented using the diffusion equation with sources and sinks, which is solved by the method of the Laplace transform, the Bromwich integral, the residue theorem and special functions such as the Bessel and Heun functions. The critical time intervals of drying and salting are computed in order to obtain the minimum possible contamination. It is assumed that both the external moisture and the contaminants decrease exponentially with time. Contaminant profiles are plotted, and some possible techniques of contaminant detection are discussed. All computations are executed using computer algebra, specifically Maple. The results are of interest to the food industry, and some future research lines are suggested.
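
    The kind of closed-form solution the abstract refers to can be sketched for the simplest case of constant diffusivity with the surface held at the ambient level; the variable internal resistance in the paper would modify the eigenfunctions, so the series below is a standard textbook form, not the author's exact result.

```latex
% Radial diffusion of moisture u(r,t) in a long cylinder of radius R,
% diffusivity D, uniform initial moisture u_0, surface at ambient (zero):
\[
\frac{\partial u}{\partial t}
  = D\left(\frac{\partial^2 u}{\partial r^2}
  + \frac{1}{r}\frac{\partial u}{\partial r}\right),
\qquad u(R,t)=0,\qquad u(r,0)=u_0 .
\]
% Separation of variables gives the Bessel series
\[
u(r,t)=\sum_{n=1}^{\infty}\frac{2u_0}{\alpha_n J_1(\alpha_n)}
\,J_0\!\left(\frac{\alpha_n r}{R}\right)
\exp\!\left(-\frac{D\,\alpha_n^{2}\,t}{R^{2}}\right),
\]
% where $\alpha_n$ is the $n$-th positive zero of $J_0$.
```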

  19. The analyst's participation in the analytic process.

    PubMed

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  20. [Application of analytical transmission electron microscopy techniques for detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in mammalian cells].

    PubMed

    Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V

    2014-01-01

    This work presents the results of a study on the applicability of modern analytical transmission electron microscopy methods for detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis of images of the nanoparticles in the cells obtained in bright-field transmission electron microscopy, dark-field scanning transmission electron microscopy and high-angle annular dark-field scanning transmission electron microscopy was performed. For identification of the nanoparticles in the cells, the analytical techniques energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy were compared, both in the mode of acquiring energy spectra from individual particles and in element-mapping mode. It was shown that electron tomography is applicable for confirming that nanoparticles are localized within the sample rather than in surface contamination. The possibilities and fields of application of the different analytical transmission electron microscopy techniques for detection, visualization and identification of nanoparticles in biological samples are discussed.
