Sample records for "analytical techniques provide"

  1. Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment

    ERIC Educational Resources Information Center

    Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James

    2010-01-01

    The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…

  2. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  3. 40 CFR Table 4 to Subpart ZZZZ of... - Requirements for Performance Tests

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  4. 40 CFR Table 4 to Subpart ZZZZ of... - Requirements for Performance Tests

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  5. Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration

    NASA Technical Reports Server (NTRS)

    Merritt, D. A.; Brand, W. A.; Hayes, J. M.

    1994-01-01

    In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques, and the results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source; or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (σ = 0.00006 at.% 13C, or 0.06‰, for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 ≤ σ ≤ 0.0002 at.% or 0.1‰ ≤ σ ≤ 0.2‰).
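
For readers unfamiliar with the units in this abstract, the at.% and per-mil (δ) scales are related through the isotope ratio R = 13C/12C. A small conversion sketch (R_VPDB is the commonly cited standard ratio; treat the constant and function names as illustrative, not from the paper):

```python
# Sketch: converting between delta-13C notation (per mil vs. VPDB) and
# atom percent 13C, the two precision units quoted in the abstract.

R_VPDB = 0.0112372  # 13C/12C of the VPDB standard (Craig value; illustrative)

def delta_to_atom_percent(delta_permil: float) -> float:
    """Convert delta-13C (per mil vs. VPDB) to atom % 13C."""
    r = R_VPDB * (1.0 + delta_permil / 1000.0)  # sample 13C/12C ratio
    return 100.0 * r / (1.0 + r)                # fraction of C that is 13C

def atom_percent_to_delta(at_pct: float) -> float:
    """Inverse conversion: atom % 13C back to delta-13C (per mil)."""
    r = (at_pct / 100.0) / (1.0 - at_pct / 100.0)
    return (r / R_VPDB - 1.0) * 1000.0
```

At natural abundance the two scales track each other almost linearly, which is why the abstract can pair 0.00006 at.% with roughly 0.06‰.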

  6. An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Collis, Peter

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…

  7. Surface-Enhanced Raman Spectroscopy.

    ERIC Educational Resources Information Center

    Garrell, Robin L.

    1989-01-01

    Reviews the basis for the technique and its experimental requirements. Describes a few examples of the analytical problems to which surface-enhanced Raman spectroscopy (SERS) has been and can be applied. Provides a perspective on the current limitations and frontiers in developing SERS as an analytical technique. (MVL)

  8. Big Data Analytics with Datalog Queries on Spark.

    PubMed

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2016-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
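
BigDatalog's Spark internals are not shown in the abstract, but the kind of recursion at issue can be sketched in a few lines. A minimal semi-naive evaluation of the textbook transitive-closure query (plain Python as an illustration, not the BigDatalog API):

```python
# Sketch (not BigDatalog's actual implementation): semi-naive evaluation
# of the classic recursive Datalog program
#   tc(X, Y) <- edge(X, Y).
#   tc(X, Y) <- tc(X, Z), edge(Z, Y).
# Each round joins only the newly derived facts (the "delta") with edge,
# the same iteration pattern recursive Datalog systems scale out.

def transitive_closure(edges):
    tc = set(edges)      # base case: every edge is in tc
    delta = set(edges)   # facts that were new in the last round
    while delta:
        # join delta with edges: (x, z) in delta, (z, y) in edges -> (x, y)
        derived = {(x, y) for (x, z) in delta for (z2, y) in edges if z == z2}
        delta = derived - tc  # keep only genuinely new facts
        tc |= delta
    return tc
```

On the chain 1→2→3→4 this derives the three missing pairs (1,3), (2,4), and (1,4) and then terminates, because a round that produces no new facts empties the delta.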

  9. Big Data Analytics with Datalog Queries on Spark

    PubMed Central

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2017-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics. PMID:28626296

  10. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by alerting and predicting when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and gives insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and we use Big Data technologies, predictive algorithms, and operational analytics to process the data and predict sensor degradations. The study uses data products that would commonly be analyzed at a site and builds on a big data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
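
The paper's actual models are not reproduced here; as an illustration of the threshold-alerting idea, a rolling z-score detector over a sensor stream might look like the following (the window size and 3-sigma threshold are arbitrary illustrative choices, not values from the study):

```python
# Sketch: flag readings that deviate sharply from the recent history of
# a sensor stream, the kind of check an operational-analytics pipeline
# might run before alerting on degradation.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Return indices of readings more than `threshold` standard
    deviations from the mean of the preceding `window` readings."""
    history = deque(maxlen=window)
    anomalies = []
    for i, x in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(x - mu) / sigma > threshold:
                anomalies.append(i)
        history.append(x)
    return anomalies

# Simulated stream: small alternating noise with one injected fault.
readings = [10.0 if i % 2 == 0 else 10.2 for i in range(40)]
readings[25] = 50.0
flagged = detect_anomalies(readings)
```

Here only the injected spike at index 25 is flagged; once the spike enters the history window it inflates the local sigma, so the detector does not cascade false alarms afterward.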

  11. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma, and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, and excretion (ADME), and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses, and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  12. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly devoid of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  13. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body effects of the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and is then modified to include them. The modification is based on (i) backward analytical propagation, including the perturbations, of the state vector obtained from the iterative patched conic technique at the sphere of influence, and (ii) quantification of the deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using a linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend on numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides realistic insight into the mission aspects. The proposed design is also an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.
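
The full biased iterative patched conic procedure is beyond a short sketch, but the departure-phase quantities it refines come from the standard patched-conic energy relation. A minimal illustration (standard constants; the parking-orbit and v-infinity values below are examples, not from the paper):

```python
# Sketch: unperturbed patched-conic departure relations. Given the
# hyperbolic excess speed (v_inf) required at the sphere of influence,
# energy conservation (vis-viva with specific energy v_inf^2 / 2) gives
# the periapsis speed of the departure hyperbola.
import math

MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def departure_periapsis_speed(v_inf_kms, r_p_km):
    """Speed at periapsis of the departure hyperbola (km/s)."""
    return math.sqrt(v_inf_kms**2 + 2.0 * MU_EARTH / r_p_km)

def injection_delta_v(v_inf_kms, r_p_km):
    """Delta-v from a circular parking orbit of radius r_p to the
    departure hyperbola (km/s)."""
    v_circ = math.sqrt(MU_EARTH / r_p_km)
    return departure_periapsis_speed(v_inf_kms, r_p_km) - v_circ
```

For a Mars-class v-infinity of about 3 km/s from a 300 km altitude (6678 km radius) parking orbit, this gives an injection delta-v of roughly 3.6 km/s; the paper's technique then corrects such unperturbed estimates for Earth's non-spherical gravity and Sun/Moon third-body effects.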

  14. Light aircraft crash safety program

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Hayduk, R. J.

    1974-01-01

    NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1989 (October 1988 through September 1989). The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  17. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  18. Analyzing Matrices of Meta-Analytic Correlations: Current Practices and Recommendations

    ERIC Educational Resources Information Center

    Sheng, Zitong; Kong, Wenmo; Cortina, Jose M.; Hou, Shuofei

    2016-01-01

    Researchers have become increasingly interested in conducting analyses on meta-analytic correlation matrices. Methodologists have provided guidance and recommended practices for the application of this technique. The purpose of this article is to review current practices regarding analyzing meta-analytic correlation matrices, to identify the gaps…

  19. Pavement Performance : Approaches Using Predictive Analytics

    DOT National Transportation Integrated Search

    2018-03-23

    Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...

  20. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less-known techniques that may also prove useful.

  1. An analytical and experimental evaluation of a Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. A.; Cosby, R. M.

    1976-01-01

    An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 °C range was conducted. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile-spreading problem and should enable improved analytical-experimental correlation.

  2. Chemical Detection and Identification Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.

    2002-01-01

    Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capability with minimal requirements for volume, weight, and consumables. Advances may be achieved by increasing the amount of information acquired by a given technique and by miniaturizing proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of GC detectors that provide sample identification independent of GC retention time data. One novel approach employs Penning Ionization Electron Spectroscopy (PIES).

  3. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    PubMed Central

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey attempts to cover the state of the art from 2012 up to 2017. PMID:29850370

  4. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g., specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, reaching from simple (iterative) map-reduce methods (e.g., with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g., Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g., computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases designed to support new forms of unstructured or semi-structured data, as opposed to the rather traditional structured databases (e.g., relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach.
    This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  5. Quantitative and qualitative sensing techniques for biogenic volatile organic compounds and their oxidation products.

    PubMed

    Kim, Saewung; Guenther, Alex; Apel, Eric

    2013-07-01

    The physiological production mechanisms of some of the organics in plants, commonly known as biogenic volatile organic compounds (BVOCs), have been known for more than a century. Some BVOCs are emitted to the atmosphere and play a significant role in tropospheric photochemistry, especially in ozone and secondary organic aerosol (SOA) production, as a result of the interplay between BVOCs and atmospheric radicals such as the hydroxyl radical (OH), ozone (O3), and NOx (NO + NO2). These findings have been drawn from comprehensive analysis of numerous field and laboratory studies that have characterized the ambient distribution of BVOCs and their oxidation products, and the reaction kinetics between BVOCs and atmospheric oxidants. These investigations are limited by the capacity for identifying and quantifying these compounds. This review highlights the major analytical techniques that have been used to observe BVOCs and their oxidation products, such as gas chromatography, mass spectrometry with hard and soft ionization methods, and optical techniques from laser-induced fluorescence (LIF) to remote sensing. In addition, we discuss how new analytical techniques can advance our understanding of BVOC photochemical processes. The principles, advantages, and drawbacks of the analytical techniques are discussed, along with specific examples of how the techniques were applied in field and laboratory measurements. Since a number of thorough review papers for each specific analytical technique are available, readers are referred to those publications rather than given exhaustive descriptions of each technique here. The aim of this review is thus for readers to grasp the advantages and disadvantages of various sensing techniques for BVOCs and their oxidation products, and to provide guidance for choosing the optimal technique for a specific research task.

  6. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  7. Arsenic, Antimony, Chromium, and Thallium Speciation in Water and Sediment Samples with the LC-ICP-MS Technique

    PubMed Central

    Jabłońska-Czapla, Magdalena

    2015-01-01

    Chemical speciation is a very important subject in environmental protection, toxicology, and chemical analytics, due to the fact that the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires ever more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analytics is so important. The paper also provides numerous examples of hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962

  8. Cost and schedule analytical techniques development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.

  9. Extended Analytic Device Optimization Employing Asymptotic Expansion

    NASA Technical Reports Server (NTRS)

    Mackey, Jonathan; Sehirlioglu, Alp; Dynsys, Fred

    2013-01-01

    Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions, including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact, all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed-form solutions and generate better physical understanding of the conditions under which simplifying assumptions may be valid. In obtaining the analytic solutions, a set of dimensionless parameters, which characterize all thermoelectric couples, is formulated and provides the limiting cases for validating assumptions. The presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples, using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior for (i) thermal resistance of hot and cold shoes, (ii) variable material properties with temperature, and (iii) lateral heat transfer through leg sides.
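
For context, the classic constant-property result that this kind of asymptotic analysis generalizes can be written down directly: the maximum efficiency of an ideal couple depends only on the Carnot factor and the dimensionless figure of merit ZT. A sketch of that standard textbook relation (not the paper's extended solution):

```python
# Sketch: maximum efficiency of an ideal constant-property thermoelectric
# couple, eta = (dT/Th) * (sqrt(1+ZT) - 1) / (sqrt(1+ZT) + Tc/Th),
# with ZT evaluated at the mean temperature. Shoe resistance, variable
# properties, and side losses (the paper's extensions) are neglected.
import math

def max_efficiency(Th, Tc, ZT):
    """Maximum conversion efficiency for hot/cold junction temperatures
    Th, Tc (K) and dimensionless figure of merit ZT."""
    carnot = (Th - Tc) / Th
    s = math.sqrt(1.0 + ZT)
    return carnot * (s - 1.0) / (s + Tc / Th)
```

For Th = 500 K, Tc = 300 K, and ZT = 1 this gives about 8.2%, versus the 40% Carnot limit; ZT = 0 correctly yields zero output.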

  10. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

    Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
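
The person-oriented point above can be made concrete: averaging across people can mask, or even reverse, a within-person association. A minimal sketch with generated data (the variable names and values are illustrative, not from the paper):

```python
# Generated ambulatory-assessment-style data for two hypothetical
# participants. Within each person, "craving" and "use" move together,
# but the persons' overall levels differ, so pooling reverses the sign.

def pearson(x, y):
    """Plain Pearson correlation (stdlib only)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

craving_a, use_a = [1, 2, 3, 4], [10, 11, 12, 13]
craving_b, use_b = [7, 8, 9, 10], [0, 1, 2, 3]

within_a = pearson(craving_a, use_a)  # perfectly positive within person A
within_b = pearson(craving_b, use_b)  # perfectly positive within person B
pooled = pearson(craving_a + craving_b, use_a + use_b)  # negative when pooled
```

Person-oriented techniques such as multilevel models and p-technique factor analysis are designed to recover the within-person signal that the pooled estimate obscures.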

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1991 (October 1990 through September 1991). This is the eighth annual report for the ACL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  12. Recommended Inorganic Chemicals for Calibration.

    ERIC Educational Resources Information Center

    Moody, John R.; And Others

    1988-01-01

    All analytical techniques depend on the use of calibration chemicals to relate analyte concentration to instrumental parameters. Discusses the preparation of standard solutions and provides a critical evaluation of available materials. Lists elements by group and discusses the purity and uses of each. (MVL)

  13. INTEGRATING BIOANALYTICAL CAPABILITY IN AN ENVIRONMENTAL ANALYTICAL LABORATORY

    EPA Science Inventory

    The product is a book chapter which is an introductory and summary chapter for the reference work "Immunoassays and Other Bioanalytical Techniques" to be published by CRC Press, Taylor and Francis Books. The chapter provides analytical chemists information on new techni...

  14. Predictive modeling of complications.

    PubMed

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.
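A minimal sketch of the kind of predictive analytic algorithm the abstract refers to, assuming nothing about the authors' actual models: logistic regression fit by batch gradient descent on synthetic patient data. The features and outcome here are invented purely for illustration.

```python
import numpy as np

# Synthetic cohort: two standardized, hypothetical features (e.g. age and
# operative time) and a binary complication outcome generated from known weights.
rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 2))
true_w, true_b = np.array([1.5, -1.0]), 0.2
p = 1 / (1 + np.exp(-(X @ true_w + true_b)))
y = rng.binomial(1, p)                      # observed complications

# Logistic regression trained by batch gradient descent on the log-loss.
w, b, lr = np.zeros(2), 0.0, 0.5
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w + b)))
    w -= lr * (X.T @ (pred - y)) / n        # gradient of mean log-loss w.r.t. w
    b -= lr * (pred - y).mean()             # gradient w.r.t. intercept

accuracy = ((1 / (1 + np.exp(-(X @ w + b))) > 0.5) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In practice such a fitted model is what lets a surgeon quote a patient-specific risk estimate rather than a population average.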

  15. Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.

    2004-01-01

    A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e., tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth, since each technique provides a unique analysis capability for certain classes of organic molecules.

  16. Analytical advances in pharmaceutical impurity profiling.

    PubMed

    Holm, René; Elder, David P

    2016-05-25

    Impurities will be present in all drug substances and drug products, i.e. nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2 g/day, identification of impurities is set at levels of 0.1% and above (ICH Q3B(R2), 2006). For some impurities this is a simple undertaking, as generally available analytical techniques can address the prevailing analytical challenges; for others it may be much more difficult, requiring more sophisticated analytical approaches. The present review provides an insight into current developments in analytical techniques to investigate and quantify impurities in drug substances and drug products, with discussion of progress particularly within the field of chromatography to ensure separation and quantification of related impurities. Further, a section is devoted to the identification of classical impurities; in addition, inorganic (metal residues) and solid-state impurities are discussed. Risk control strategies for pharmaceutical impurities, aligned with several of the ICH guidelines, are also discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Assay techniques for detection of exposure to sulfur mustard, cholinesterase inhibitors, sarin, soman, GF, and cyanide. Technical bulletin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-05-01

    This technical bulletin provides analytical techniques to identify toxic chemical agents in urine or blood samples. It is intended to provide the clinician with laboratory tests to detect exposure to sulfur mustard, cholinesterase inhibitors, sarin, soman, GF, and cyanide.

  18. Analytical Chemistry of Surfaces: Part III. Ion Spectroscopy.

    ERIC Educational Resources Information Center

    Hercules, David M.; Hercules, Shirley H.

    1984-01-01

    The fundamentals of two surface techniques--secondary-ion mass spectrometry (SIMS) and ion-scattering spectrometry (ISS)--are discussed. Examples of how these techniques have been applied to surface problems are provided. (JN)

  19. New techniques for imaging and analyzing lung tissue.

    PubMed Central

    Roggli, V L; Ingram, P; Linton, R W; Gutknecht, W F; Mastin, P; Shelburne, J D

    1984-01-01

    The recent technological revolution in the field of imaging techniques has provided pathologists and toxicologists with an expanding repertoire of analytical techniques for studying the interaction between the lung and the various exogenous materials to which it is exposed. Analytical problems requiring elemental sensitivity or specificity beyond the range of that offered by conventional scanning electron microscopy and energy dispersive X-ray analysis are particularly appropriate for the application of these newer techniques. Electron energy loss spectrometry, Auger electron spectroscopy, secondary ion mass spectrometry, and laser microprobe mass analysis each offer unique advantages in this regard, but also possess their own limitations and disadvantages. Diffraction techniques provide crystalline structural information available through no other means. Bulk chemical techniques provide useful cross-checks on the data obtained by microanalytical approaches. It is the purpose of this review to summarize the methodology of these techniques, acknowledge situations in which they have been used in addressing problems in pulmonary toxicology, and comment on the relative advantages and disadvantages of each approach. It is necessary for an investigator to weigh each of these factors when deciding which technique is best suited for any given analytical problem; often it is useful to employ a combination of two or more of the techniques discussed. It is anticipated that there will be increasing utilization of these technologies for problems in pulmonary toxicology in the decades to come. PMID:6090115

  20. An analytical and experimental evaluation of the plano-cylindrical Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. L.; Cosby, R. M.

    1976-01-01

    Plastic Fresnel lenses for solar concentration are attractive because of their potential for low-cost mass production. An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range is reported. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based primarily on a 56 cm-wide lens with f-number 1.0. A sun-tracking heliostat provided a non-moving solar source. Measured data indicated more spreading at the profile base than analytically predicted. The measured and computed transmittances were 85 and 87%, respectively. Preliminary testing with a second lens (1.85 m) indicated that modified manufacturing techniques corrected the profile-spreading problem.

  1. The HVT technique and the 'uncertainty' relation for central potentials

    NASA Astrophysics Data System (ADS)

    Grypeos, M. E.; Koutroulos, C. G.; Oyewumi, K. J.; Petridou, Th

    2004-08-01

    The quantum mechanical hypervirial theorems (HVT) technique is used to treat the so-called 'uncertainty' relation for quite a general class of central potential wells, including the (reduced) Pöschl-Teller and the Gaussian one. It is shown that this technique is quite suitable for deriving an approximate analytic expression, in the form of a truncated power-series expansion, for the dimensionless product P_nl ≡ ⟨r²⟩_nl⟨p²⟩_nl/ℏ², for every (deeply) bound state of a particle moving non-relativistically in the well, provided that a (dimensionless) parameter s is sufficiently small. Attention is also paid to a number of cases, among the limited existing ones, in which exact analytic or semi-analytic expressions for P_nl can be derived. Finally, numerical results are given and discussed.
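The dimensionless product described in the abstract can be written out explicitly; the expansion coefficients below are shown schematically only (their actual forms depend on the particular well and are what the HVT technique derives):

```latex
P_{nl} \equiv \frac{\langle r^{2}\rangle_{nl}\,\langle p^{2}\rangle_{nl}}{\hbar^{2}},
\qquad
P_{nl} \approx P^{(0)}_{nl} + P^{(1)}_{nl}\,s + P^{(2)}_{nl}\,s^{2} + \cdots
\quad (s \ll 1).
```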

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenik, E.A.

    X-ray microanalysis in an analytical electron microscope is a proven technique for the measurement of solute segregation in alloys. Solute segregation under equilibrium or nonequilibrium conditions can strongly influence material performance. X-ray microanalysis in an analytical electron microscope provides an alternative technique to measure grain boundary segregation, as well as segregation to other defects not accessible to Auger analysis. The utility of the technique is demonstrated by measurements of equilibrium segregation to boundaries in an antimony-containing stainless steel, including the variation of segregation with boundary character, and by measurements of nonequilibrium segregation to boundaries and dislocations in an ion-irradiated stainless steel.

  4. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  5. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  6. Permeation absorption sampler with multiple detection

    DOEpatents

    Zaromb, Solomon

    1990-01-01

    A system for detecting analytes in air or aqueous systems includes a permeation absorption preconcentrator sampler for the analytes and analyte detectors. The preconcentrator has an inner fluid-permeable container into which a charge of analyte-sorbing liquid is intermittently injected, and a fluid-impermeable outer container. The sample is passed through the outer container and around the inner container for trapping and preconcentrating the analyte in the sorbing liquid. The analyte can be detected photometrically by injecting with the sorbing material a reagent which reacts with the analyte to produce a characteristic color or fluorescence which is detected by illuminating the contents of the inner container with a light source and measuring the absorbed or emitted light, or by producing a characteristic chemiluminescence which can be detected by a suitable light sensor. The analyte can also be detected amperometrically. Multiple inner containers may be provided into which a plurality of sorbing liquids are respectively introduced for simultaneously detecting different analytes. Baffles may be provided in the outer container. A calibration technique is disclosed.

  7. Analytical techniques of pilot scanning behavior and their application

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.

    1986-01-01

    The state of the art of oculometric data analysis techniques is documented, along with their applications in research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
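One of the listed outputs, the entropy rate measure, can be sketched from first principles: estimate a transition matrix from the fixation sequence and compute the first-order Markov entropy rate. The instrument labels and the sequence below are hypothetical, not data from the report.

```python
import numpy as np

# Hypothetical fixation sequence over 3 instruments (0=attitude, 1=altimeter,
# 2=airspeed), as an oculometer might report it; labels are illustrative only.
seq = [0, 1, 0, 2, 0, 1, 0, 2, 0, 1, 1, 0, 2, 0, 1, 0, 0, 2, 0, 1]

n_instr = 3
counts = np.zeros((n_instr, n_instr))
for a, b in zip(seq[:-1], seq[1:]):            # tally instrument transitions
    counts[a, b] += 1

P = counts / counts.sum(axis=1, keepdims=True)  # row-stochastic transition matrix
pi = counts.sum(axis=1) / counts.sum()          # empirical source-state frequencies

# First-order entropy rate (bits/transition): H = -sum_i pi_i sum_j P_ij log2 P_ij.
# Low H means stereotyped, predictable scanning; high H means erratic scanning.
with np.errstate(divide="ignore", invalid="ignore"):
    logP = np.where(P > 0, np.log2(P), 0.0)
H = -np.sum(pi[:, None] * P * logP)
print(f"entropy rate: {H:.2f} bits/transition")
```

The maximum possible value for three instruments is log2(3) ≈ 1.58 bits per transition, so the ratio H/log2(3) gives a normalized randomness score.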

  8. Methods for geochemical analysis

    USGS Publications Warehouse

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  9. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis techniques are then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft's aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
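The verification step described above, comparing analytic sensitivities against finite differences, can be illustrated on a toy response function. The "response" here is an arbitrary smooth surrogate, not the report's Navier-Stokes solution.

```python
import numpy as np

def response(x):
    """Stand-in aerodynamic response as a function of one design variable."""
    return x**2 * np.sin(3 * x)

def analytic_sensitivity(x):
    """Exact derivative of the surrogate response (product + chain rule)."""
    return 2 * x * np.sin(3 * x) + 3 * x**2 * np.cos(3 * x)

# Central finite difference for comparison, as the report does for its
# grid and aerodynamic sensitivities.
x0, h = 0.7, 1e-5
fd = (response(x0 + h) - response(x0 - h)) / (2 * h)
exact = analytic_sensitivity(x0)
print(f"analytic {exact:.8f}  finite-difference {fd:.8f}")
```

The analytic form needs one response evaluation's worth of work instead of two per design variable, which is where the computational savings in the report come from when the response is an expensive CFD solve.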

  10. Active Control of Inlet Noise on the JT15D Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Smith, Jerome P.; Hutcheson, Florence V.; Burdisso, Ricardo A.; Fuller, Chris R.

    1999-01-01

    This report presents the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the year from November 1997 to December 1998 on the Active Noise Control of Turbofan Engines research project funded by NASA Langley Research Center. The concept of implementing active noise control techniques with fuselage-mounted error sensors is investigated both analytically and experimentally. The analytical part of the project involves the continued development of an advanced modeling technique to provide prediction and design guidelines for application of active noise control techniques to large, realistic high bypass engines of the type on which active control methods are expected to be applied. Results from the advanced analytical model are presented that show the effectiveness of the control strategies, and the analytical results presented for fuselage error sensors show good agreement with the experimentally observed results and provide additional insight into the control phenomena. Additional analytical results are presented for active noise control used in conjunction with a wavenumber sensing technique. The experimental work is carried out on a running JT15D turbofan jet engine in a test stand at Virginia Tech. The control strategy used in these tests was the feedforward Filtered-X LMS algorithm. The control inputs were supplied by single and multiple circumferential arrays of acoustic sources equipped with neodymium iron cobalt magnets mounted upstream of the fan. The reference signal was obtained from an inlet mounted eddy current probe. The error signals were obtained from a number of pressure transducers flush-mounted in a simulated fuselage section mounted in the engine test cell. The active control methods are investigated when implemented with the control sources embedded within the acoustically absorptive material on a passively-lined inlet. 
The experimental results show that the combination of active control techniques with fuselage-mounted error sensors and passive control techniques is an effective means of reducing radiated noise from turbofan engines. Strategic selection of the location of the error transducers is shown to be effective for reducing the radiation towards particular directions in the farfield. An analytical model is used to predict the behavior of the control system and to guide the experimental design configurations, and the analytical results presented show good agreement with the experimentally observed results.
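The feedforward Filtered-X LMS algorithm used in these tests can be sketched in a single-channel toy form. The secondary-path taps, tone parameters, and step size below are invented for illustration; the actual controller was multi-channel with measured paths.

```python
import numpy as np

# Single-channel FxLMS sketch: cancel a tonal disturbance at one error sensor
# through an assumed (known) 3-tap secondary path S.
fs, f0, N = 2000, 100, 8000
n = np.arange(N)
x = np.cos(2 * np.pi * f0 * n / fs)              # reference (e.g. shaft tone)
d = 0.8 * np.cos(2 * np.pi * f0 * n / fs + 0.6)  # disturbance at error sensor

S = np.array([0.5, 0.3, 0.1])                    # secondary-path model (assumed)
L, mu = 8, 0.01                                  # filter length, step size
w = np.zeros(L)                                  # adaptive control filter
xbuf = np.zeros(L)                               # reference history
ybuf = np.zeros(len(S))                          # control-output history
fxbuf = np.zeros(L)                              # filtered-reference history
e = np.zeros(N)

for k in range(N):
    xbuf = np.roll(xbuf, 1)
    xbuf[0] = x[k]
    y = w @ xbuf                                 # anti-noise sample
    ybuf = np.roll(ybuf, 1)
    ybuf[0] = y
    e[k] = d[k] + S @ ybuf                       # residual at the error sensor
    xf = S @ xbuf[: len(S)]                      # reference filtered through S
    fxbuf = np.roll(fxbuf, 1)
    fxbuf[0] = xf
    w -= mu * e[k] * fxbuf                       # FxLMS weight update

before = np.mean(e[:200] ** 2)
after = np.mean(e[-200:] ** 2)
print(f"residual power reduced by {10 * np.log10(before / after):.1f} dB")
```

Filtering the reference through the secondary-path model before the LMS update is what keeps the adaptation stable when the control signal must physically propagate from actuator to error sensor.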

  11. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.; Price, D. Marvin

    1991-01-01

    Spacecraft designers have always been concerned about the effects of meteoroid impacts on mission safety. The engineering solution to this problem has generally been to erect a bumper or shield placed outboard from the spacecraft wall to disrupt/deflect the incoming projectiles. Spacecraft designers have a number of tools at their disposal to aid in the design process. These include hypervelocity impact testing, analytic impact predictors, and hydrodynamic codes. Analytic impact predictors generally provide the best quick-look estimate of design tradeoffs. The most complete way to determine the characteristics of an analytic impact predictor is through optimization of the protective structures design problem formulated with the predictor of interest. Space Station Freedom protective structures design insight is provided through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. Major results are presented.

  12. Analytic score distributions for a spatially continuous tridirectional Monte Carlo transport problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Booth, T.E.

    1996-01-01

    The interpretation of the statistical error estimates produced by Monte Carlo transport codes is still somewhat of an art. Empirically, there are variance reduction techniques whose error estimates are almost always reliable, and there are variance reduction techniques whose error estimates are often unreliable. Unreliable error estimates usually result from inadequate large-score sampling from the score distribution's tail. Statisticians believe that more accurate confidence interval statements are possible if the general nature of the score distribution can be characterized. Here, the analytic score distribution for the exponential transform applied to a simple, spatially continuous Monte Carlo transport problem is provided. Anisotropic scattering and implicit capture are included in the theory. In large part, the analytic score distributions that are derived provide the basis for the ten new statistical quality checks in MCNP.
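The exponential transform itself is easy to illustrate on an even simpler problem than the paper's: estimating deep penetration through a purely absorbing slab, where stretched path-length sampling plus a likelihood-ratio weight keeps the estimate unbiased while shrinking the score variance. This is a toy sketch, not the paper's tridirectional model.

```python
import numpy as np

# Estimate the probability exp(-tau) that a particle penetrates a purely
# absorbing slab of tau = 5 mean free paths, with and without the transform.
rng = np.random.default_rng(3)
tau, N, p = 5.0, 100_000, 0.8       # p stretches flights toward the far side

# Analog sampling: flight length ~ Exp(1); score 1 if it crosses the slab.
x = rng.exponential(1.0, N)
analog = (x > tau).astype(float)

# Exponential transform: sample from the stretched pdf (1-p) exp(-(1-p) x)
# and carry the likelihood-ratio weight exp(-p x)/(1-p) to stay unbiased.
xt = rng.exponential(1.0 / (1 - p), N)
weights = np.exp(-p * xt) / (1 - p)
transformed = np.where(xt > tau, weights, 0.0)

exact = np.exp(-tau)
print(f"exact {exact:.5f}  analog {analog.mean():.5f} (sd {analog.std():.5f})")
print(f"transformed {transformed.mean():.5f} (sd {transformed.std():.5f})")
```

The transformed scores are no longer 0/1 but a continuum of weights, which is exactly why characterizing the resulting score distribution (the subject of the paper) matters for trustworthy confidence intervals.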

  13. Geophysical technique for mineral exploration and discrimination based on electromagnetic methods and associated systems

    DOEpatents

    Zhdanov, Michael S. [Salt Lake City, UT]

    2008-01-29

    Mineral exploration needs a reliable method to distinguish between uneconomic mineral deposits and economic mineralization. A method and system includes a geophysical technique for subsurface material characterization, mineral exploration and mineral discrimination. The technique introduced in this invention detects induced polarization effects in electromagnetic data and uses remote geophysical observations to determine the parameters of an effective conductivity relaxation model using a composite analytical multi-phase model of the rock formations. The conductivity relaxation model and analytical model can be used to determine parameters related by analytical expressions to the physical characteristics of the microstructure of the rocks and minerals. These parameters are ultimately used for the discrimination of different components in underground formations, and in this way provide an ability to distinguish between uneconomic mineral deposits and zones of economic mineralization using geophysical remote sensing technology.
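A common single-phase parameterization of induced-polarization relaxation is the Cole-Cole (Pelton) model; the sketch below uses it as a generic stand-in only, since the patent's composite multi-phase model is more elaborate and its exact form is not given here.

```python
import numpy as np

def pelton_resistivity(omega, rho0, m, tau, c):
    """Complex resistivity rho(omega) of the Cole-Cole (Pelton) relaxation
    model: rho0 is DC resistivity, m chargeability, tau the time constant,
    and c the frequency exponent (all values below are illustrative)."""
    return rho0 * (1 - m * (1 - 1 / (1 + (1j * omega * tau) ** c)))

omega = np.logspace(-2, 4, 7)                  # angular frequencies, rad/s
rho = pelton_resistivity(omega, rho0=100.0, m=0.5, tau=0.1, c=0.5)
print(np.abs(rho))   # |rho| falls from ~rho0 toward rho0*(1-m) with frequency
```

Inverting field EM data for parameters like m, tau, and c is what allows the discrimination step described in the abstract, since economic sulfide mineralization and uneconomic conductors tend to occupy different regions of that parameter space.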

  14. Bioimaging of cells and tissues using accelerator-based sources.

    PubMed

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

    A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotrons) particle accelerators have been constructed worldwide to provide to the scientific community unprecedented analytical performances. Now, these facilities match at least one of the three analytical features required for the biological field: (1) a sufficient spatial resolution for single cell (<1 μm) or tissue (<1 mm) analyses, (2) a temporal resolution to follow molecular dynamics, and (3) a sensitivity in the micromolar to nanomolar range, thus allowing true investigations on biological dynamics. Third-generation synchrotrons now offer the opportunity of bioanalytical measurements at nanometer resolutions with incredible sensitivity. Linear accelerators are more specialized in their physical features but may exceed synchrotron performances. All these techniques have become irreplaceable tools for developing knowledge in biology. This review highlights the pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens.

  15. 1990 National Water Quality Laboratory Services Catalog

    USGS Publications Warehouse

    Pritt, Jeffrey; Jones, Berwyn E.

    1989-01-01

    PREFACE This catalog provides information about analytical services available from the National Water Quality Laboratory (NWQL) to support programs of the Water Resources Division of the U.S. Geological Survey. To assist personnel in the selection of analytical services, the catalog lists cost, sample volume, applicable concentration range, detection level, precision of analysis, and preservation techniques for samples to be submitted for analysis. Prices for services reflect operational costs, the complexity of each analytical procedure, and the costs to ensure analytical quality control. The catalog consists of five parts. Part 1 is a glossary of terminology; Part 2 lists the bottles, containers, solutions, and other materials that are available through the NWQL; Part 3 describes the field processing of samples to be submitted for analysis; Part 4 describes analytical services that are available; and Part 5 contains indices of analytical methodology and Chemical Abstract Services (CAS) numbers. Nomenclature used in the catalog is consistent with WATSTORE and STORET. The user is provided with laboratory codes and schedules that consist of groupings of parameters which are measured together in the NWQL. In cases where more than one analytical range is offered for a single element or compound, different laboratory codes are given. Book 5 of the series 'Techniques of Water Resources Investigations of the U.S. Geological Survey' should be consulted for more information about the analytical procedures included in the tabulations. This catalog supersedes U.S. Geological Survey Open-File Report 86-232 '1986-87-88 National Water Quality Laboratory Services Catalog', October 1985.

  16. Liquid chromatographic determination of benzocaine and N-acetylbenzocaine in the edible fillet tissue from rainbow trout

    USGS Publications Warehouse

    Meinertz, J.R.; Stehly, G.R.; Hubert, T.D.; Bernardy, J.A.

    1999-01-01

    A method was developed for determining benzocaine and N-acetylbenzocaine concentrations in fillet tissue of rainbow trout. The method involves extracting the analytes with acetonitrile, removing lipids or hydrophobic compounds from the extract with hexane, and providing additional clean-up with solid-phase extraction techniques. Analyte concentrations are determined using reversed-phase high-performance liquid chromatographic techniques with an isocratic mobile phase and UV detection. The accuracy (range, 92 to 121%), precision (R.S.D., <14%), and sensitivity (method quantitation limit, <24 ng/g) for each analyte indicate the usefulness of this method for studies characterizing the depletion of benzocaine residues from fish exposed to benzocaine. Copyright (C) 1999.

  17. Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.

    PubMed

    Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel

    2015-01-01

    Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are widely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states the recommendations for the pre-analytical, analytical (method validation procedures, quality controls, reagents), and post-analytical conditions. In addition, we state a strategy for internal quality control management. These recommendations will be regularly updated.

  18. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools and techniques that would support specific ESDA type goals. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.

  19. Equivalence and Differences between Structural Equation Modeling and State-Space Modeling Techniques

    ERIC Educational Resources Information Center

    Chow, Sy-Miin; Ho, Moon-ho R.; Hamaker, Ellen L.; Dolan, Conor V.

    2010-01-01

    State-space modeling techniques have been compared to structural equation modeling (SEM) techniques in various contexts but their unique strengths have often been overshadowed by their similarities to SEM. In this article, we provide a comprehensive discussion of these 2 approaches' similarities and differences through analytic comparisons and…

  20. Precise Heat Control: What Every Scientist Needs to Know About Pyrolytic Techniques to Solve Real Problems

    NASA Technical Reports Server (NTRS)

    Devivar, Rodrigo

    2014-01-01

    The performance of a material is greatly influenced by its thermal and chemical properties. Analytical pyrolysis, when coupled to a GC-MS system, is a powerful technique that can unlock the thermal and chemical properties of almost any substance and provide vital information. At NASA, we depend on precise thermal analysis instrumentation for understanding aerospace travel. Our analytical techniques allow us to test materials in the laboratory prior to an actual field test; whether the field test is miles up in the sky or miles underground, the properties of any involved material must be fully studied and understood in the laboratory.

  1. Risk prioritisation using the analytic hierarchy process

    NASA Astrophysics Data System (ADS)

    Sum, Rabihah Md.

    2015-12-01

    This study demonstrated how to use the Analytic Hierarchy Process (AHP) to prioritise the risks of an insurance company. AHP is a technique for structuring complex problems by arranging the elements of a problem in a hierarchy, assigning numerical values to subjective judgements on the relative importance of the elements, and synthesizing the judgements to determine which elements have the highest priority. The study is motivated by the wide application of AHP as a prioritisation technique for complex problems. It aims to show that AHP can minimise some limitations of the likelihood-and-impact risk assessment technique. The study shows that AHP provides a consistency check on subjective judgements, organises a large number of risks into a structured framework, assists risk managers in making explicit risk trade-offs, and provides an easy-to-understand and systematic risk assessment process.
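The priority-derivation and consistency-check steps described above can be sketched numerically. This is a minimal illustration with hypothetical pairwise judgements (not taken from the study): priorities are the principal eigenvector of the comparison matrix, and the consistency ratio screens the subjective judgements.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three risks (Saaty's 1-9 scale):
# A[i, j] = judged importance of risk i relative to risk j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priorities: principal right eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = 0.58          # Saaty's random index for n = 3
cr = ci / ri       # judgements conventionally acceptable when CR < 0.1
```

With these particular judgements the first risk receives the largest weight and the consistency ratio falls well below the conventional 0.1 threshold.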

  2. A Survey of Shape Parameterization Techniques

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper provides a survey of shape parameterization techniques for multidisciplinary optimization and highlights some emerging ideas. The survey focuses on the suitability of available techniques for complex configurations, with suitability criteria based on the efficiency, effectiveness, ease of implementation, and availability of analytical sensitivities for geometry and grids. The paper also contains a section on field grid regeneration, grid deformation, and sensitivity analysis techniques.

  3. Dimensions of Early Speech Sound Disorders: A Factor Analytic Study

    ERIC Educational Resources Information Center

    Lewis, Barbara A.; Freebairn, Lisa A.; Hansen, Amy J.; Stein, Catherine M.; Shriberg, Lawrence D.; Iyengar, Sudha K.; Taylor, H. Gerry

    2006-01-01

    The goal of this study was to classify children with speech sound disorders (SSD) empirically, using factor analytic techniques. Participants were 3-7-year olds enrolled in speech/language therapy (N=185). Factor analysis of an extensive battery of speech and language measures provided support for two distinct factors, representing the skill…

  4. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    ERIC Educational Resources Information Center

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  5. State of practice and emerging application of analytical techniques of nuclear forensic analysis: highlights from the 4th Collaborative Materials Exercise of the Nuclear Forensics International Technical Working Group (ITWG)

    DOE PAGES

    Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.

    2016-09-16

    The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Moreover, exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging application of analytical techniques of nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.

  6. State of practice and emerging application of analytical techniques of nuclear forensic analysis: highlights from the 4th Collaborative Materials Exercise of the Nuclear Forensics International Technical Working Group (ITWG)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.

    The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Moreover, exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging application of analytical techniques of nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.

  7. Analytic and subjective assessments of operator workload imposed by communications tasks in transport aircraft

    NASA Technical Reports Server (NTRS)

    Eckel, J. S.; Crabtree, M. S.

    1984-01-01

    Analytical and subjective techniques that are sensitive to the information transmission and processing requirements of individual communications-related tasks are used to assess workload imposed on the aircrew by A-10 communications requirements for civilian transport category aircraft. Communications-related tasks are defined to consist of the verbal exchanges between crews and controllers. Three workload estimating techniques are proposed. The first, an information theoretic analysis, is used to calculate bit values for perceptual, manual, and verbal demands in each communication task. The second, a paired-comparisons technique, obtains subjective estimates of the information processing and memory requirements for specific messages. By combining the results of the first two techniques, a hybrid analytical scale is created. The third, a subjective rank ordering of sequences of communications tasks, provides an overall scaling of communications workload. Recommendations for future research include an examination of communications-induced workload among the air crew and the development of simulation scenarios.
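The information-theoretic analysis mentioned above is not specified in detail in the abstract; a minimal sketch of the general idea, with hypothetical numbers, treats each perceptual, manual, or verbal demand as a choice among equiprobable alternatives, so that a choice among n alternatives contributes log2(n) bits to the task's load.

```python
import math

def task_bits(choice_set_sizes):
    """Total information load, in bits, for a sequence of choices,
    each modelled as selecting one alternative from an equiprobable set."""
    return sum(math.log2(n) for n in choice_set_sizes)

# Hypothetical example: tune one of 16 radio frequencies (perceptual +
# manual demand) and read back a call sign drawn from 8 expected forms
# (verbal demand).
load = task_bits([16, 8])  # 4 + 3 = 7 bits
```

Bit values computed this way for each communication task give the kind of analytic workload scale the first technique describes.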

  8. Bioanalytical Applications of Fluorescence Line-Narrowing and Non-Line-Narrowing Spectroscopy Interfaced with Capillary Electrophoresis and High-Performance Liquid Chromatography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Kenneth Paul

    Capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) are widely used analytical separation techniques with many applications in chemical, biochemical, and biomedical sciences. Conventional analyte identification in these techniques is based on retention/migration times of standards, requiring a high degree of reproducibility, availability of reliable standards, and absence of coelution. From this, several new information-rich detection methods (also known as hyphenated techniques) are being explored that would be capable of providing unambiguous on-line identification of separating analytes in CE and HPLC. As further discussed, a number of such on-line detection methods have shown considerable success, including Raman, nuclear magnetic resonance (NMR), mass spectrometry (MS), and fluorescence line-narrowing spectroscopy (FLNS). In this thesis, the feasibility and potential of combining the highly sensitive and selective laser-based detection method of FLNS with analytical separation techniques are discussed and presented. A summary of previously demonstrated FLNS detection interfaced with chromatography and electrophoresis is given, and recent results from on-line FLNS detection in CE (CE-FLNS), and the new combination of HPLC-FLNS, are shown.

  9. Problems at the Leading Edge of Space Weathering as Revealed by TEM Combined with Surface Science Techniques

    NASA Astrophysics Data System (ADS)

    Christoffersen, R.; Dukes, C. A.; Keller, L. P.; Rahman, Z.; Baragiola, R. A.

    2015-11-01

    Analytical field-emission TEM techniques cross-correlated with surface analyses by X-ray photoelectron spectroscopy (XPS) provide a unique two-pronged approach for characterizing how solar wind ion processing contributes to space weathering.

  10. Electromigrative separation techniques in forensic science: combining selectivity, sensitivity, and robustness.

    PubMed

    Posch, Tjorben Nils; Pütz, Michael; Martin, Nathalie; Huhn, Carolin

    2015-01-01

    In this review we introduce the advantages and limitations of electromigrative separation techniques in forensic toxicology. We thus present a summary of illustrative studies and our own experience in the field, together with established methods from the German Federal Criminal Police Office, rather than a complete survey. We focus on the analytical aspects of analytes' physicochemical characteristics (e.g. polarity, stereoisomers) and analytical challenges including matrix tolerance, separation from compounds present in large excess, sample volumes, and orthogonality. For these aspects we want to reveal the specific advantages over more traditional methods. Both detailed studies and profiling and screening studies are taken into account. Care was taken to document, almost exclusively, well-validated methods that stand out for the analytical challenge discussed. Special attention was paid to aspects exclusive to electromigrative separation techniques, including the use of the mobility axis, the potential for on-site instrumentation, and the capillary format for immunoassays. The review concludes with an introductory guide to method development for different separation modes, presenting typical buffer systems as starting points for different analyte classes. The objective of this review is to provide an orientation for users in separation science considering using capillary electrophoresis in their laboratory in the future.

  11. Analytical techniques for characterization of cyclodextrin complexes in the solid state: A review.

    PubMed

    Mura, Paola

    2015-09-10

    Cyclodextrins are cyclic oligosaccharides able to form inclusion complexes with a variety of hydrophobic guest molecules, positively modifying their physicochemical properties. A thorough analytical characterization of cyclodextrin complexes is of fundamental importance to provide an adequate support in selection of the most suitable cyclodextrin for each guest molecule, and also in view of possible future patenting and marketing of drug-cyclodextrin formulations. The demonstration of the actual formation of a drug-cyclodextrin inclusion complex in solution does not guarantee its existence also in the solid state. Moreover, the technique used to prepare the solid complex can strongly influence the properties of the final product. Therefore, an appropriate characterization of the drug-cyclodextrin solid systems obtained also has a key role in driving the choice of the most effective preparation method, able to maximize host-guest interactions. The analytical characterization of drug-cyclodextrin solid systems and the assessment of the actual inclusion complex formation is not a simple task and involves the combined use of several analytical techniques, whose results have to be evaluated together. The objective of the present review is to present a general prospect of the principal analytical techniques which can be employed for a suitable characterization of drug-cyclodextrin systems in the solid state, evidencing their respective potential advantages and limits. The applications of each examined technique are described and discussed by pertinent examples from literature. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Recent development of electrochemiluminescence sensors for food analysis.

    PubMed

    Hao, Nan; Wang, Kun

    2016-10-01

    Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, sometimes their applications are limited because of the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassay, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating some selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, residual drugs, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection for various food contaminants in complex matrixes. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.

  13. Interference by the activated sludge matrix on the analysis of soluble microbial products in wastewater.

    PubMed

    Potvin, Christopher M; Zhou, Hongde

    2011-11-01

    The objective of this study was to demonstrate the complex matrix effects caused by chemical constituents of activated sludge on the analysis of key soluble microbial products (SMP), including proteins, humics, carbohydrates, and polysaccharides, in activated sludge samples. Emphasis was placed on comparing the commonly used standard-curve technique with standard addition (SA), a technique that differs in that the analytical responses are measured for sample solutions spiked with known quantities of analytes. The results showed that using SA greatly improved compensation for SMP recovery, and thus measurement accuracy, by correcting for matrix effects. Analyte recovery was found to be highly dependent on sample dilution, and changed with extraction technique, storage conditions, and sample composition. Storage of sample extracts by freezing changed SMP concentrations dramatically, as did storage at 4°C for as little as 1 day. Copyright © 2011 Elsevier Ltd. All rights reserved.
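The standard-addition technique compared in the study can be sketched with hypothetical numbers: equal aliquots of the sample are spiked with increasing known amounts of analyte, the instrument response is regressed on the spike level, and the concentration originally present is recovered from the magnitude of the x-intercept, which compensates for matrix effects on the slope.

```python
import numpy as np

# Hypothetical standard-addition data (not from the paper).
spike = np.array([0.0, 5.0, 10.0, 15.0])       # added analyte, ug/L
response = np.array([0.20, 0.45, 0.70, 0.95])  # measured signal (a.u.)

# Linear fit of response vs. added amount.
slope, intercept = np.polyfit(spike, response, 1)

# The magnitude of the x-intercept is the concentration already
# present in the unspiked sample.
c_sample = intercept / slope
```

Because the calibration is performed in the sample matrix itself, matrix-induced changes in sensitivity affect the slope and intercept equally and cancel in the ratio.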

  14. Analytical Chemistry: A Literary Approach

    NASA Astrophysics Data System (ADS)

    Lucy, Charles A.

    2000-04-01

    The benefits of incorporating real-world examples of chemistry into lectures and lessons are reflected by the recent inclusion of the Teaching with Problems and Case Studies column in this Journal. However, these examples lie outside the experience of many students, and so much of the impact of "real-world" examples is lost. This paper provides an anthology of references to analytical chemistry techniques from history, popular fiction, and film. Such references are amusing to both instructor and student. Further, the fictional descriptions can serve as a focal point for discussions of a technique's true capabilities and limitations.

  15. Exploring the Efficacy of Behavioral Skills Training to Teach Basic Behavior Analytic Techniques to Oral Care Providers

    ERIC Educational Resources Information Center

    Graudins, Maija M.; Rehfeldt, Ruth Anne; DeMattei, Ronda; Baker, Jonathan C.; Scaglia, Fiorella

    2012-01-01

    Performing oral care procedures with children with autism who exhibit noncompliance can be challenging for oral care professionals. Previous research has elucidated a number of effective behavior analytic procedures for increasing compliance, but some procedures are likely to be too time consuming and expensive for community-based oral care…

  16. Dialogue as Data in Learning Analytics for Productive Educational Dialogue

    ERIC Educational Resources Information Center

    Knight, Simon; Littleton, Karen

    2015-01-01

    This paper provides a novel, conceptually driven stance on the state of the contemporary analytic challenges faced in the treatment of dialogue as a form of data across on- and offline sites of learning. In prior research, preliminary steps have been taken to detect occurrences of such dialogue using automated analysis techniques. Such advances…

  17. An Overview on the Importance of Combining Complementary Analytical Platforms in Metabolomic Research.

    PubMed

    Gonzalez-Dominguez, Alvaro; Duran-Guerrero, Enrique; Fernandez-Recamales, Angeles; Lechuga-Sancho, Alfonso Maria; Sayago, Ana; Schwarz, Monica; Segundo, Carmen; Gonzalez-Dominguez, Raul

    2017-01-01

    The analytical bias introduced by most of the techniques commonly used in metabolomics considerably hinders the simultaneous detection of all metabolites present in complex biological samples. To overcome this limitation, the combination of complementary approaches has emerged in recent years as the most suitable strategy for maximizing metabolite coverage. This review article presents a general overview of the most important analytical techniques usually employed in metabolomics: nuclear magnetic resonance, mass spectrometry, and hybrid approaches. Furthermore, we emphasize the potential of integrating various tools in the form of metabolomic multi-platforms to get a deeper metabolome characterization, for which a revision of the existing literature in this field is provided. This review is not intended to be exhaustive but, rather, to give a practical and concise guide, for readers not familiar with analytical chemistry, to the considerations that govern the proper selection of the technique to be used in a metabolomic experiment in biomedical research. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  18. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and were thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour, and (2) monitoring and analysing critical product and process parameters for process control, making it possible to maintain a desired process state and guarantee the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.

  19. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques, to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed, as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps, as well as identifying article relevance to biosurveillance (e.g., a relevance algorithm) and article feature extraction (who, what, where, why, how, and when).
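The relevance-identification step can be illustrated with a deliberately simplified, hypothetical sketch. The chapter's actual pipeline uses NLP and machine learning; this toy version only matches tokens against a hand-made concept lexicon and flags an article when matches span enough distinct concept types.

```python
import re
from collections import Counter

# Hypothetical biosurveillance lexicon: surface token -> concept type.
LEXICON = {
    "outbreak": "event", "cases": "event", "virus": "agent",
    "influenza": "agent", "hospital": "location", "quarantine": "response",
}

def relevance(text, min_concepts=2):
    """Flag text as biosurveillance-relevant when its lexicon matches
    cover at least `min_concepts` distinct concept types."""
    tokens = re.findall(r"[a-z]+", text.lower())
    hits = Counter(LEXICON[t] for t in tokens if t in LEXICON)
    return len(hits) >= min_concepts, dict(hits)

flag, hits = relevance("Officials report 40 new cases of influenza; "
                       "a local hospital has opened a quarantine ward.")
```

Requiring matches across multiple concept types (event, agent, location, response) is a crude stand-in for the semantic concept matching the chapter describes, reducing false positives from a single incidental keyword.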

  20. Multilevel Modeling and Policy Development: Guidelines and Applications to Medical Travel.

    PubMed

    Garcia-Garzon, Eduardo; Zhukovsky, Peter; Haller, Elisa; Plakolm, Sara; Fink, David; Petrova, Dafina; Mahalingam, Vaishali; Menezes, Igor G; Ruggeri, Kai

    2016-01-01

    Medical travel has expanded rapidly in recent years, resulting in new markets and increased access to medical care. Whereas several studies investigated the motives of individuals seeking healthcare abroad, the conventional analytical approach is limited by substantial caveats. Classical techniques as found in the literature cannot provide sufficient insight due to the nested nature of data generated. The application of adequate analytical techniques, specifically multilevel modeling, is scarce to non-existent in the context of medical travel. This study introduces the guidelines for application of multilevel techniques in public health research by presenting an application of multilevel modeling in analyzing the decision-making patterns of potential medical travelers. Benefits and potential limitations are discussed.
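The nested-data argument can be made concrete with a small simulated sketch (hypothetical data; `statsmodels` is one common implementation of multilevel models): travellers at level 1 are nested within countries at level 2, and a random-intercept model separates between-country variation from the individual-level effect that a single pooled regression would conflate.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulate travellers nested within countries (hypothetical numbers).
rng = np.random.default_rng(0)
n_countries, n_per = 20, 30
country = np.repeat(np.arange(n_countries), n_per)
u = rng.normal(0.0, 1.0, n_countries)          # country-level random intercepts
x = rng.normal(0.0, 1.0, n_countries * n_per)  # e.g. an individual-level predictor
y = 2.0 + 0.5 * x + u[country] + rng.normal(0.0, 1.0, n_countries * n_per)

df = pd.DataFrame({"y": y, "x": x, "country": country})

# Random-intercept multilevel model: y ~ x, grouped by country.
model = smf.mixedlm("y ~ x", df, groups=df["country"]).fit()
fixed_slope = model.params["x"]  # individual-level fixed effect of x
```

The fitted fixed effect recovers the simulated individual-level slope while the group variance absorbs the country-to-country differences.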

  1. The Promise of Intensive Longitudinal Data Capture for Behavioral Health Research

    PubMed Central

    2014-01-01

    Advances in technology and the associated cultural norms, especially the advent of the smartphone, offer an unprecedented opportunity to collect data on relevant health behaviors and experiences unobtrusively at a greater frequency and in greater volumes than ever before. This special issue will acquaint the readership of Nicotine and Tobacco Research with the potential for intensive longitudinal data and will illustrate some innovative analytic techniques for addressing research questions associated with this type of complex data. This introductory article will provide a brief history of the analytic techniques for intensive longitudinal data and will point to some resources that support and enable the use of these techniques. PMID:24711629

  2. Multilevel Modeling and Policy Development: Guidelines and Applications to Medical Travel

    PubMed Central

    Garcia-Garzon, Eduardo; Zhukovsky, Peter; Haller, Elisa; Plakolm, Sara; Fink, David; Petrova, Dafina; Mahalingam, Vaishali; Menezes, Igor G.; Ruggeri, Kai

    2016-01-01

    Medical travel has expanded rapidly in recent years, resulting in new markets and increased access to medical care. Whereas several studies investigated the motives of individuals seeking healthcare abroad, the conventional analytical approach is limited by substantial caveats. Classical techniques as found in the literature cannot provide sufficient insight due to the nested nature of data generated. The application of adequate analytical techniques, specifically multilevel modeling, is scarce to non-existent in the context of medical travel. This study introduces the guidelines for application of multilevel techniques in public health research by presenting an application of multilevel modeling in analyzing the decision-making patterns of potential medical travelers. Benefits and potential limitations are discussed. PMID:27252672

  3. Iterative categorization (IC): a systematic technique for analysing qualitative data

    PubMed Central

    2016-01-01

    Abstract The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  4. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures.

    PubMed

    Montgomery, L D; Montgomery, R W; Guisado, R

    1995-05-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  5. Rheoencephalographic and electroencephalographic measures of cognitive workload: analytical procedures

    NASA Technical Reports Server (NTRS)

    Montgomery, L. D.; Montgomery, R. W.; Guisado, R.

    1995-01-01

    This investigation demonstrates the feasibility of mental workload assessment by rheoencephalographic (REG) and multichannel electroencephalographic (EEG) monitoring. During the performance of this research, unique testing, analytical and display procedures were developed for REG and EEG monitoring that extend the current state of the art and provide valuable tools for the study of cerebral circulatory and neural activity during cognition. REG records are analyzed to provide indices of the right and left hemisphere hemodynamic changes that take place during each test sequence. The EEG data are modeled using regression techniques and mathematically transformed to provide energy-density distributions of the scalp electrostatic field. These procedures permit concurrent REG/EEG cognitive testing not possible with current techniques. The introduction of a system for recording and analysis of cognitive REG/EEG test sequences facilitates the study of learning and memory disorders, dementia and other encephalopathies.

  6. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer, with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these configurations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser-excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  7. Role of microextraction sampling procedures in forensic toxicology.

    PubMed

    Barroso, Mário; Moreno, Ivo; da Fonseca, Beatriz; Queiroz, João António; Gallardo, Eugenia

    2012-07-01

The last two decades have provided analysts with more sensitive technology, enabling scientists from all analytical fields to see what they were not able to see just a few years ago. This increased sensitivity has allowed drug detection at very low concentrations and testing in unconventional samples (e.g., hair, oral fluid and sweat), which, despite their low analyte concentrations, has also led to a reduction in sample size. Along with this reduction, and as a result of the use of excessive amounts of potentially toxic organic solvents (with the subsequent environmental pollution and costs associated with their proper disposal), there has been a growing tendency to use miniaturized sampling techniques. These sampling procedures reduce organic solvent consumption to a minimum while providing a rapid, simple and cost-effective approach. In addition, at least some degree of automation is possible with these techniques, which enhances sample throughput. These miniaturized sample preparation techniques may be roughly categorized into solid-phase and liquid-phase microextraction, depending on the nature of the analyte. This paper reviews recently published literature on the use of microextraction sampling procedures, with a special focus on the field of forensic toxicology.

  8. Development of airframe design technology for crashworthiness.

    NASA Technical Reports Server (NTRS)

    Kruszewski, E. T.; Thomson, R. G.

    1973-01-01

    This paper describes the NASA portion of a joint FAA-NASA General Aviation Crashworthiness Program leading to the development of improved crashworthiness design technology. The objectives of the program are to develop analytical technology for predicting crashworthiness of structures, provide design improvements, and perform full-scale crash tests. The analytical techniques which are being developed both in-house and under contract are described, and typical results from these analytical programs are shown. In addition, the full-scale testing facility and test program are discussed.

  9. Retail video analytics: an overview and survey

    NASA Astrophysics Data System (ADS)

    Connell, Jonathan; Fan, Quanfu; Gabbur, Prasad; Haas, Norman; Pankanti, Sharath; Trinh, Hoang

    2013-03-01

Today retail video analytics has gone beyond the traditional domain of security and loss prevention by providing retailers insightful business intelligence such as store traffic statistics and queue data. Such information allows for enhanced customer experience, optimized store performance, reduced operational costs, and ultimately higher profitability. This paper gives an overview of various camera-based applications in retail as well as the state-of-the-art computer vision techniques behind them. It also presents some of the promising technical directions for exploration in retail video analytics.

  10. Ratio of sequential chromatograms for quantitative analysis and peak deconvolution: Application to standard addition method and process monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.

    1990-08-01

This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and is used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
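
    The point-by-point ratio and variance-weighted averaging described in this abstract can be sketched numerically. The fragment below is illustrative only: the function and variable names are our own, a constant detector-noise variance is assumed, and the weighting uses standard first-order error propagation for a quotient rather than the authors' exact formulation.

    ```python
    import numpy as np

    def ratio_value(chrom_a, chrom_b, window, noise_var):
        """Variance-weighted ratio of two baseline-corrected chromatograms.

        chrom_a, chrom_b : intensity arrays from sequential injections
        window           : slice covering pure elution of the analyte
        noise_var        : assumed (constant) detector noise variance
        """
        a = np.asarray(chrom_a, dtype=float)[window]
        b = np.asarray(chrom_b, dtype=float)[window]
        r = b / a                                    # point-by-point ratio
        # Propagated variance of each ratio point; weight by inverse variance
        var_r = noise_var * (1.0 / a**2 + b**2 / a**4)
        w = 1.0 / var_r
        return np.sum(w * r) / np.sum(w)

    # Synthetic Gaussian peak; second injection spiked to 1.5x concentration
    t = np.linspace(0.0, 10.0, 500)
    peak = np.exp(-((t - 5.0) ** 2) / 0.5)
    rv = ratio_value(peak, 1.5 * peak, slice(200, 300), noise_var=1e-6)
    ```

    For this noiseless synthetic pair the weighted ratio recovers the 1.5-fold concentration change exactly, which is the quantity the standard addition method above is built on.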

  11. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

Laboratory quality control has been developed over several decades to ensure patient safety, expanding from a statistical quality-control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million count and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Our data show improvement in sigma-scale performance. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
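
    The two quantities this abstract mentions have standard textbook forms: the analytical-phase sigma metric, sigma = (TEa - |bias|) / CV, and the conversion from defects per million (DPM) to a sigma level via the inverse normal distribution with the conventional 1.5-sigma long-term shift. The sketch below encodes both; the numerical inputs are invented for illustration.

    ```python
    from statistics import NormalDist

    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma metric for the analytical phase:
        sigma = (allowable total error - |bias|) / CV, all in percent."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    def sigma_from_dpm(dpm):
        """Sigma level for extra-analytical steps, from defects per million,
        using the conventional 1.5-sigma long-term shift."""
        return NormalDist().inv_cdf(1 - dpm / 1_000_000) + 1.5

    print(round(sigma_metric(10.0, 2.0, 2.0), 2))   # -> 4.0
    print(round(sigma_from_dpm(6210), 1))           # -> 4.0
    ```

    A process with 6210 DPM thus sits at roughly four sigma, the same level as an assay with 10% allowable total error, 2% bias and 2% CV.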

  12. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field-flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles by ICP-MS, but also by coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. 
This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Radar cross sections of standard and complex shape targets

    NASA Technical Reports Server (NTRS)

    Sohel, M. S.

    1974-01-01

The theoretical, analytical, and experimental results are described for the radar cross sections (RCS) of different-shaped targets. Various techniques for predicting RCS are given, and the RCS of finite standard targets are presented. Techniques used to predict the RCS of complex targets are described, and the RCS of complex shapes are provided.

  14. Advances in functional brain imaging technology and developmental neuro-psychology: their applications in the Jungian analytic domain.

    PubMed

    Petchkovsky, Leon

    2017-06-01

Analytical psychology shares with many other psychotherapies the important task of repairing the consequences of developmental trauma. The majority of analytic patients come from compromised early developmental backgrounds: they may have experienced neglect, abuse, or failures of empathic resonance from their carers. Functional brain imaging techniques, including quantitative electroencephalography (QEEG) and functional magnetic resonance imaging (fMRI), allow us to track mental processes in ways beyond verbal reportage and introspection. This independent perspective is useful for developing new psychodynamic hypotheses, testing current ones, providing diagnostic markers, and monitoring treatment progress. Jung, with the Word Association Test, grasped these principles 100 years ago. Brain imaging techniques have contributed to powerful recent advances in our understanding of neurodevelopmental processes in the first three years of life. If adequate nurturance is compromised, a range of difficulties may emerge. This has important implications for how we understand and treat our psychotherapy clients. The paper provides an overview of functional brain imaging and advances in developmental neuropsychology, and looks at applications of some of these findings (including neurofeedback) in the Jungian psychotherapy domain. © 2017, The Society of Analytical Psychology.

  15. High speed operation of permanent magnet machines

    NASA Astrophysics Data System (ADS)

    El-Refaie, Ayman M.

    This work proposes methods to extend the high-speed operating capabilities of both the interior PM (IPM) and surface PM (SPM) machines. For interior PM machines, this research has developed and presented the first thorough analysis of how a new bi-state magnetic material can be usefully applied to the design of IPM machines. Key elements of this contribution include identifying how the unique properties of the bi-state magnetic material can be applied most effectively in the rotor design of an IPM machine by "unmagnetizing" the magnet cavity center posts rather than the outer bridges. The importance of elevated rotor speed in making the best use of the bi-state magnetic material while recognizing its limitations has been identified. For surface PM machines, this research has provided, for the first time, a clear explanation of how fractional-slot concentrated windings can be applied to SPM machines in order to achieve the necessary conditions for optimal flux weakening. A closed-form analytical procedure for analyzing SPM machines designed with concentrated windings has been developed. Guidelines for designing SPM machines using concentrated windings in order to achieve optimum flux weakening are provided. Analytical and numerical finite element analysis (FEA) results have provided promising evidence of the scalability of the concentrated winding technique with respect to the number of poles, machine aspect ratio, and output power rating. Useful comparisons between the predicted performance characteristics of SPM machines equipped with concentrated windings and both SPM and IPM machines designed with distributed windings are included. Analytical techniques have been used to evaluate the impact of the high pole number on various converter performance metrics. Both analytical techniques and FEA have been used for evaluating the eddy-current losses in the surface magnets due to the stator winding subharmonics. 
Techniques for reducing these losses have been investigated. A 6 kW, 36-slot/30-pole prototype SPM machine has been designed and built. Experimental measurements have been used to verify the analytical and FEA results. These test results have demonstrated that wide constant-power speed range can be achieved. Other important machine features such as the near-sinusoidal back-emf, high efficiency, and low cogging torque have also been demonstrated.

  16. Surface Plasmon Resonance-Based Fiber Optic Sensors Utilizing Molecular Imprinting

    PubMed Central

    Gupta, Banshi D.; Shrivastav, Anand M.; Usha, Sruthi P.

    2016-01-01

Molecular imprinting is earning worldwide attention from researchers in the field of sensing and diagnostic applications, due to its inherent specific affinity for the template molecule. The fabrication of complementary template imprints allows this technique to achieve high selectivity for the analyte to be sensed. Sensors incorporating this technique along with surface plasmon or localized surface plasmon resonance (SPR/LSPR) provide highly sensitive real-time detection with quick response times. Implementing these techniques on optical fiber provides the additional advantages of miniaturized probes with ease of handling, online monitoring and remote sensing. In this review a summary of optical fiber sensors using the combined approaches of molecularly imprinted polymers (MIPs) and the SPR/LSPR technique is discussed. An overview of the fundamentals of SPR/LSPR implementation on optical fiber is provided. The review also covers molecular imprinting technology (MIT), with its elementary study, synthesis procedures and its applications for chemical and biological analyte detection with different sensing methods. In conclusion, we explore the advantages, challenges and future perspectives of developing highly sensitive and selective methods for the detection of analytes utilizing MIT with the SPR/LSPR phenomenon on optical fiber platforms. PMID:27589746

  17. Modern analytical methods for the detection of food fraud and adulteration by food category.

    PubMed

    Hong, Eunyoung; Lee, Sang Yoo; Jeong, Jae Yun; Park, Jung Min; Kim, Byung Hee; Kwon, Kisung; Chun, Hyang Sook

    2017-09-01

This review provides current information on the analytical methods used to identify food adulteration in the six most adulterated food categories: animal origin and seafood, oils and fats, beverages, spices and sweet foods (e.g. honey), grain-based food, and others (organic food and dietary supplements). The analytical techniques (both conventional and emerging) used to identify adulteration in these six food categories involve sensory, physicochemical, DNA-based, chromatographic and spectroscopic methods, and have been combined with chemometrics, making these techniques more convenient and effective for the analysis of a broad variety of food products. Despite recent advances, the need remains for suitably sensitive and widely applicable methodologies that encompass all the various aspects of food adulteration. © 2017 Society of Chemical Industry.

  18. Love Canal Emergency Declaration Area habitability study. Volume 3. Soil assessment: indicator chemicals. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

Environmental studies were conducted to provide data that could be used by the Commissioner of Health for the State of New York in determining whether the Emergency Declaration Area (EDA) surrounding the Love Canal hazardous-waste site is habitable. The soil assessment compared concentrations of the Love Canal Indicator Chemicals found in the EDA to concentrations found in similar western New York communities. An analytical technique was developed to detect the indicator chemicals at very low levels, i.e. 1.0 ppb. The analytical technique utilized a gas chromatograph/mass spectrometer operating in the selected ion monitoring mode. The analytical results were statistically compared between the EDA and the comparison areas using a modified Wilcoxon rank sum test.
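
    The between-area comparison in this record can be illustrated with a standard (unmodified) Wilcoxon rank-sum test. The concentrations below are invented, and the normal approximation with mid-ranks for ties is a simplification of the modified form the study actually used.

    ```python
    from statistics import NormalDist

    def rank_sum_test(x, y):
        """Two-sided Wilcoxon rank-sum test for a location shift between two
        samples (normal approximation, mid-ranks for ties). Returns (z, p)."""
        combined = sorted([(v, 0) for v in x] + [(v, 1) for v in y])
        vals = [v for v, _ in combined]
        ranks = [0.0] * len(vals)
        i = 0
        while i < len(vals):                     # assign mid-ranks to ties
            j = i
            while j < len(vals) and vals[j] == vals[i]:
                j += 1
            for k in range(i, j):
                ranks[k] = (i + 1 + j) / 2       # average of ranks i+1..j
            i = j
        w = sum(r for r, (_, g) in zip(ranks, combined) if g == 0)
        n1, n2 = len(x), len(y)
        mean = n1 * (n1 + n2 + 1) / 2
        sd = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
        z = (w - mean) / sd
        return z, 2 * (1 - NormalDist().cdf(abs(z)))

    # Invented indicator-chemical concentrations (ppb): EDA soil versus a
    # comparison neighbourhood
    eda = [1.2, 3.4, 0.8, 5.1, 2.2, 4.0, 1.9, 3.3]
    comparison = [0.9, 1.1, 0.7, 1.4, 1.0, 1.2, 0.8, 1.3]
    z, p = rank_sum_test(eda, comparison)
    ```

    A small p-value indicates the EDA concentration distribution is shifted relative to the comparison area, which is the habitability question the study's statistical comparison addressed.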

  19. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise it than can be achieved by conducting separate analyses in series using different instruments. It is intended for final-year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy-dispersive x-ray analysis in electron microscopy as two examples, in the context that combined complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well established that the instrumentation is commercially available, examining physical properties, including mechanical, electrical and thermal behaviour, in addition to variations in composition, rather than methods solely intended to identify and quantify chemical species. The review will therefore address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy-based techniques, scanning probe-based techniques, and thermal analysis-based techniques. 
Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to understand the sample in greater detail than was previously possible, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses of the same samples, a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.

  20. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography-mass spectrometry (py-GCMS) if required. The main drawback of py-GCMS, aside from its destructive nature, is that this technique is relatively time-intensive in comparison to other techniques. Direct analysis in real time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as its rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested that the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize intact paint chips from the vehicles, to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool for the forensic paint examiner.
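
    Principal component analysis of the kind used in this study to compare the two techniques can be sketched in a few lines of NumPy. The spectra below are synthetic stand-ins, and the SVD-based projection is a minimal substitute for whatever PCA implementation the authors used.

    ```python
    import numpy as np

    def pca_scores(X, n_components=2):
        """Project mean-centered spectra onto their first principal
        components via singular value decomposition."""
        Xc = X - X.mean(axis=0)
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    # Hypothetical mass-spectral intensity matrix: rows = paint samples
    # (two vehicles, three replicates each), columns = m/z bins.
    rng = np.random.default_rng(0)
    vehicle_a = rng.normal([10, 2, 5, 1], 0.1, size=(3, 4))
    vehicle_b = rng.normal([2, 9, 1, 6], 0.1, size=(3, 4))
    scores = pca_scores(np.vstack([vehicle_a, vehicle_b]))
    # Replicates of the same vehicle cluster together along PC1,
    # while the two vehicles fall on opposite sides.
    ```

    In the study, analogous score plots for DART-TOFMS and py-GCMS data were used to judge whether the two techniques discriminated the vehicles equally well.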

  1. [Composition of chicken and quail eggs].

    PubMed

    Closa, S J; Marchesich, C; Cabrera, M; Morales, J C

    1999-06-01

Reliable food composition data on lipids are needed to evaluate intakes as a risk factor in the development of heart disease. The proximate composition, cholesterol and fatty acid content of chicken and quail eggs, as usually consumed or traded, were analysed. Proximate composition was determined using AOAC (1984) specific techniques; lipids were extracted by a modified Folch technique, and cholesterol and fatty acids were determined by gas chromatography. The results corroborate the stability of egg composition. The cholesterol content of quail eggs is similar to that of chicken eggs, but is almost half the value registered in Handbook 8. Differences may be attributed to the analytical methodology used to obtain them. This study provides data obtained with up-to-date analytical techniques, together with accessory information useful for food composition tables.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carla J. Miller

This report summarizes a literature review based on previous work performed at the Idaho National Laboratory studying the Three Mile Island 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work supporting the feasibility of the analytical techniques developed to provide quantitative results on the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique for performing nuclear fuel accountancy measurements.

  3. Safety and quality of food contact materials. Part 1: evaluation of analytical strategies to introduce migration testing into good manufacturing practice.

    PubMed

    Feigenbaum, A; Scholler, D; Bouquant, J; Brigot, G; Ferrier, D; Franzl, R; Lillemarktt, L; Riquet, A M; Petersen, J H; van Lierop, B; Yagoubi, N

    2002-02-01

The results of a research project (EU AIR Research Programme CT94-1025), aimed at introducing control of migration into good manufacturing practice and into enforcement work, are reported. Representative polymer classes were defined on the basis of chemical structure, technological function, migration behaviour and market share. These classes were characterized by analytical methods. Analytical techniques were investigated for identification of potential migrants. High-temperature gas chromatography was shown to be a powerful method and 1H-magnetic resonance provided a convenient fingerprint of plastic materials. Volatile compounds were characterized by headspace techniques, where it was shown to be essential to differentiate volatile compounds desorbed from those generated during the thermal desorption itself. For metal trace analysis, microwave mineralization followed by atomic absorption was employed. These different techniques were introduced into a systematic testing scheme that is envisaged as being suitable both for industrial control and for enforcement laboratories. Guidelines will be proposed in the second part of this paper.

  4. [Developments in preparation and experimental method of solid phase microextraction fibers].

    PubMed

    Yi, Xu; Fu, Yujie

    2004-09-01

Solid phase microextraction (SPME) is a simple and effective adsorption and desorption technique, which concentrates volatile or nonvolatile compounds from liquid samples or from the headspace of samples. SPME is compatible with analyte separation and detection by gas chromatography, high performance liquid chromatography, and other instrumental methods. It provides many advantages, such as a wide linear range, low solvent and sample consumption, short analysis times, low detection limits, and simple apparatus. The theory of SPME is introduced, covering both equilibrium theory and non-equilibrium theory. Novel developments in fiber preparation methods and related experimental techniques are discussed. In addition to commercial fiber preparation, newly developed fabrication techniques, such as sol-gel, electrodeposition, carbon-based adsorption and high-temperature epoxy immobilization, are presented. The effects of extraction mode, selection of fiber coating, optimization of operating conditions, method sensitivity and precision, and systematic automation are taken into consideration in the analytical process of SPME. A brief perspective on SPME is given at the end.
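
    The equilibrium theory mentioned in this record reduces, in the simplest two-phase case, to the standard expression for the amount of analyte extracted by the fiber coating at equilibrium. The sketch below encodes that textbook expression; the units and example numbers are our own.

    ```python
    def spme_equilibrium_amount(kfs, vf, vs, c0):
        """Amount of analyte extracted at equilibrium by an SPME fiber,
        per the standard two-phase equilibrium-theory expression:

            n = Kfs * Vf * Vs * C0 / (Kfs * Vf + Vs)

        kfs : fiber-coating/sample distribution constant
        vf  : fiber coating volume (L)
        vs  : sample volume (L)
        c0  : initial analyte concentration (mol/L)
        """
        return kfs * vf * vs * c0 / (kfs * vf + vs)

    # For a large sample volume (vs >> Kfs * Vf) the extracted amount
    # approaches Kfs * Vf * C0, i.e. it becomes independent of the sample
    # volume - the property that makes SPME useful for field sampling.
    ```

    The expression is also linear in C0, which is why a single calibration suffices over the technique's wide linear range.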

  5. An orientation soil survey at the Pebble Cu-Au-Mo porphyry deposit, Alaska

    USGS Publications Warehouse

    Smith, Steven M.; Eppinger, Robert G.; Fey, David L.; Kelley, Karen D.; Giles, S.A.

    2009-01-01

Soil samples were collected in 2007 and 2008 along three traverses across the giant Pebble Cu-Au-Mo porphyry deposit. Within each soil pit, four subsamples were collected following recommended protocols for each of ten commonly used and proprietary leach/digestion techniques. The significance of the geochemical patterns generated by these techniques was classified by visual inspection of plots showing individual element concentrations by each analytical method along the 2007 traverse. A simple matrix of element versus method, populated with values based on the significance classification, provides a means of ranking the utility of methods and elements at this deposit. The interpretation of a complex multi-element dataset derived from multiple analytical techniques is challenging. An example of vanadium results from a single leach technique is used to illustrate several possible interpretations of the data.
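
    The element-versus-method ranking matrix described here can be mimicked with plain Python. The leach-method names and significance scores below are invented for illustration; the study's actual scores came from visual inspection of the traverse plots.

    ```python
    # Hypothetical significance scores (0 = no anomaly, 1 = weak,
    # 2 = clear anomaly) by element and leach/digestion method.
    scores = {
        "Cu": {"aqua_regia": 2, "enzyme_leach": 1, "ionic_leach": 2},
        "Au": {"aqua_regia": 1, "enzyme_leach": 0, "ionic_leach": 1},
        "Mo": {"aqua_regia": 2, "enzyme_leach": 1, "ionic_leach": 0},
    }

    # Rank methods by total score across elements, as in the
    # element-versus-method matrix described above.
    methods = {m for row in scores.values() for m in row}
    totals = {m: sum(row[m] for row in scores.values()) for m in methods}
    ranking = sorted(totals, key=totals.get, reverse=True)
    print(ranking[0])  # -> aqua_regia
    ```

    Summing down the columns of such a matrix gives a quick, transparent ranking of which methods (and, transposed, which elements) are most useful at the deposit.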

  6. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Astrophysics Data System (ADS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-05-01

The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth sciences. The lessons learned and experiences presented are based on a survey of use cases, with detailed insights into a few of them.

  7. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the Earth sciences. The lessons learned and experiences presented are based on a survey of use cases, with detailed insights into a few of them.
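
    As a toy illustration of the support-vector-machine classification these records mention, the sketch below trains a minimal linear SVM by full-batch sub-gradient descent on the L2-regularized hinge loss. The two-band "pixel" features are synthetic, and the optimizer is deliberately simple compared with production SVM solvers.

    ```python
    import numpy as np

    def train_linear_svm(X, y, lam=0.01, epochs=200, lr=0.1):
        """Linear SVM via sub-gradient descent on the L2-regularized hinge
        loss; labels y must be +/-1."""
        w, b = np.zeros(X.shape[1]), 0.0
        for _ in range(epochs):
            mask = y * (X @ w + b) < 1           # margin violators
            if mask.any():
                w -= lr * (lam * w - (y[mask][:, None] * X[mask]).mean(axis=0))
                b -= lr * (-y[mask].mean())
            else:
                w -= lr * lam * w                # only the regularizer acts
        return w, b

    # Synthetic two-band features, e.g. clear-sky vs cloudy pixels
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal([0, 0], 0.3, (20, 2)),
                   rng.normal([2, 2], 0.3, (20, 2))])
    y = np.array([-1] * 20 + [1] * 20)
    w, b = train_linear_svm(X, y)
    accuracy = (np.sign(X @ w + b) == y).mean()
    ```

    In an Earth-science setting the feature columns would be spectral bands or derived indices rather than synthetic draws, and a kernelized solver would typically replace this linear sketch.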

  8. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace levels of steroid estrogen concentrations in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, the YES assay, and the E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its lower detection limit, simplicity, rapidity, and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework

    ERIC Educational Resources Information Center

    Ranjan, Jayanthi; Bhatnagar, Vishal

    2011-01-01

    Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…

  10. Microfluidic paper-based analytical devices for potential use in quantitative and direct detection of disease biomarkers in clinical analysis.

    PubMed

    Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei

    2017-08-15

    Clinicians, working in the health-care diagnostic systems of developing countries, currently face the challenges of rising costs, increased number of patient visits, and limited resources. A significant trend is using low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performances and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performances, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Application of surface plasmon resonance for the detection of carbohydrates, glycoconjugates, and measurement of the carbohydrate-specific interactions: a comparison with conventional analytical techniques. A critical review.

    PubMed

    Safina, Gulnara

    2012-01-27

    Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes. That makes these compounds important targets to be detected, monitored, and identified. The identification of the carbohydrate content in conjugates with proteins and lipids (glycoforms) is often a challenging task. Most conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and the use of various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during the last two decades and has received increasing attention for different applications, from the real-time monitoring of affinity bindings to biosensors. SPR does not require any labels and is capable of directly measuring biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern instrumental analytical techniques with SPR in terms of their capabilities to detect carbohydrates and their conjugates with proteins and lipids, and to study carbohydrate-specific bindings. A few selected examples of SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. An analysis technique for testing log grades

    Treesearch

    Carl A. Newport; William G. O'Regan

    1963-01-01

    An analytical technique that may be used in evaluating log-grading systems is described. It also provides a means of comparing two or more grading systems, or of comparing a proposed change with the system from which it was developed. The total volume and computed value of lumber from each sample log are the basic data used.

  13. MASS SPECTROMETRY IMAGING FOR DRUGS AND METABOLITES

    PubMed Central

    Greer, Tyler; Sturm, Robert; Li, Lingjun

    2011-01-01

    Mass spectrometric imaging (MSI) is a powerful analytical technique that provides two- and three-dimensional spatial maps of multiple compounds in a single experiment. The technique has been routinely applied to protein, peptide, and lipid molecules, with much less research reporting small-molecule distributions, especially of pharmaceutical drugs. This review’s main focus is to provide readers with an up-to-date description of the substrates and compounds that have been analyzed for drug and metabolite composition using MSI technology. Additionally, ionization techniques, sample preparation, and instrumentation developments are discussed. PMID:21515430

  14. Bioinformatics Symposium of the Analytical Division of the American Chemical Society Meeting. Final Technical Report from 03/15/2000 to 03/14/2001 [sample pages of agenda, abstracts, index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Robert T.

    Sparked by the Human Genome Project, biological and biomedical research has become an information science. Information tools are now being generated for proteins, cell modeling, and genomics. The opportunity for analytical chemistry in this new environment is profound. New analytical techniques that can provide the information on genes, SNPs, proteins, protein modifications, cells, and cell chemistry are required. In this symposium, we brought together both informatics experts and leading analytical chemists to discuss this interface. Over 200 people attended this highly successful symposium.

  15. 40 CFR 141.33 - Record maintenance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... laboratory reports may be kept, or data may be transferred to tabular summaries, provided that the following...) Laboratory and person responsible for performing analysis; (5) The analytical technique/method used; and (6...

  16. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.

    PubMed

    Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also shown high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, the HPLC technique coupled to different detectors is the most commonly described analytical procedure for this compound, although alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  17. Application of a voltammetric electronic tongue and near infrared spectroscopy for a rapid umami taste assessment.

    PubMed

    Bagnasco, Lucia; Cosulich, M Elisabetta; Speranza, Giovanna; Medini, Luca; Oliveri, Paolo; Lanteri, Silvia

    2014-08-15

    The relationships between sensory attributes and analytical measurements, performed by an electronic tongue (ET) and near-infrared spectroscopy (NIRS), were investigated in order to develop a rapid method for the assessment of umami taste. Commercially available umami products and some amino acids were submitted to sensory analysis. The results were analysed in comparison with the outcomes of the analytical measurements. Multivariate exploratory analysis was performed by principal component analysis (PCA). Calibration models for the prediction of umami taste on the basis of ET and NIR signals were obtained using partial least squares (PLS) regression. Different approaches for merging data from the two analytical instruments were considered. Both techniques proved to provide information related to umami taste; in particular, ET signals showed the higher correlation with the umami attribute. Data fusion was found to be only slightly beneficial, not so significantly as to justify the coupled use of the two analytical techniques. Copyright © 2014 Elsevier Ltd. All rights reserved.
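    As a hedged sketch of the calibration step described in this abstract, the following fits a one-component PLS model (a hand-rolled NIPALS-style step in NumPy, not the study's actual pipeline) to synthetic data standing in for ET signals and panel umami scores; all numbers are simulated for illustration.

```python
import numpy as np

# Simulated stand-ins for the study's data: X = electronic-tongue signals,
# y = panel umami scores (a hypothetical linear response, no noise).
rng = np.random.default_rng(42)
n, p = 30, 8
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p)

Xc, yc = X - X.mean(0), y - y.mean()   # mean-center both blocks
w = Xc.T @ yc                          # PLS weight: X-y covariance direction
w /= np.linalg.norm(w)
t = Xc @ w                             # scores on the single latent variable
q = (t @ yc) / (t @ t)                 # regress y on the scores
y_hat = t * q + y.mean()               # fitted umami scores

r = np.corrcoef(y, y_hat)[0, 1]        # calibration correlation
print(r > 0.3)
```

A real application would cross-validate the number of latent variables and, for data fusion, concatenate scaled ET and NIR blocks before fitting.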

  18. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology

    PubMed Central

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also shown high potential and interest in the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, the HPLC technique coupled to different detectors is the most commonly described analytical procedure for this compound, although alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound. PMID:28116217

  19. Elements of analytic style: Bion's clinical seminars.

    PubMed

    Ogden, Thomas H

    2007-10-01

    The author finds that the idea of analytic style better describes significant aspects of the way he practices psychoanalysis than does the notion of analytic technique. The latter is composed to a large extent of principles of practice developed by previous generations of analysts. By contrast, the concept of analytic style, though it presupposes the analyst's thorough knowledge of analytic theory and technique, emphasizes (1) the analyst's use of his unique personality as reflected in his individual ways of thinking, listening, and speaking, his own particular use of metaphor, humor, irony, and so on; (2) the analyst's drawing on his personal experience, for example, as an analyst, an analysand, a parent, a child, a spouse, a teacher, and a student; (3) the analyst's capacity to think in a way that draws on, but is independent of, the ideas of his colleagues, his teachers, his analyst, and his analytic ancestors; and (4) the responsibility of the analyst to invent psychoanalysis freshly for each patient. Close readings of three of Bion's 'Clinical seminars' are presented in order to articulate some of the elements of Bion's analytic style. Bion's style is not presented as a model for others to emulate or, worse yet, imitate; rather, it is described in an effort to help the reader consider from a different vantage point (provided by the concept of analytic style) the way in which he, the reader, practices psychoanalysis.

  20. Data management system performance modeling

    NASA Technical Reports Server (NTRS)

    Kiser, Larry M.

    1993-01-01

    This paper discusses analytical techniques that have been used to gain a better understanding of the Space Station Freedom's (SSF's) Data Management System (DMS). The DMS is a complex, distributed, real-time computer system that has been redesigned numerous times. The implications of these redesigns have not been fully analyzed. This paper discusses the advantages and disadvantages of static analytical techniques such as Rate Monotonic Analysis (RMA) and also provides a rationale for dynamic modeling. Factors such as system architecture, processor utilization, bus architecture, queuing, etc. are well suited for analysis with a dynamic model. The significance of performance measures for a real-time system is discussed.
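    As a concrete illustration of the kind of static check RMA provides, the sketch below applies the classic Liu-Layland utilization bound to a hypothetical periodic task set; the task parameters are invented for illustration and are not drawn from the DMS.

```python
# Hypothetical periodic task set: (worst-case execution time C_i, period T_i).
tasks = [(1.0, 4.0), (1.0, 5.0), (2.0, 10.0)]

n = len(tasks)
utilization = sum(c / t for c, t in tasks)   # total CPU utilization
bound = n * (2 ** (1 / n) - 1)               # Liu-Layland bound for n tasks

# If utilization <= bound, the set is schedulable under rate-monotonic
# priorities (the bound is sufficient, not necessary).
print(round(utilization, 2), round(bound, 2), utilization <= bound)
```

For this task set the check passes (0.65 against a bound of about 0.78); a dynamic model would still be needed for the queuing and bus effects the abstract mentions, which lie outside this static test.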

  1. A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines

    PubMed Central

    Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua

    2018-01-01

    The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxins in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provided a detailed summary of related analytical methods for mycotoxin determination. This review focuses on analytical techniques including sampling, extraction, cleanup, and detection for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation, and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for mycotoxin qualification or quantitation, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides good insight into the advanced research that has been done and closes with an indication of future demand for emerging technologies. PMID:29393905

  2. The Wide-Field Imaging Interferometry Testbed: Enabling Techniques for High Angular Resolution Astronomy

    NASA Technical Reports Server (NTRS)

    Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.; hide

    2007-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field-of-view imaging interferometry, using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have provided simple demonstrations of the methodology already, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.

  3. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    PubMed

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  4. Analytical Characterisation of Nanoscale Zero-Valent Iron: A ...

    EPA Pesticide Factsheets

    Zero-valent iron nanoparticles (nZVI) have been widely tested as they are showing significant promise for environmental remediation. However, many recent studies have demonstrated that their mobility and reactivity in subsurface environments are significantly affected by their tendency to aggregate. Both the mobility and reactivity of nZVI mainly depends on properties such as particle size, surface chemistry and bulk composition. In order to ensure efficient remediation, it is crucial to accurately assess and understand the implications of these properties before deploying these materials into contaminated environments. Many analytical techniques are now available to determine these parameters and this paper provides a critical review of their usefulness and limitations for nZVI characterisation. These analytical techniques include microscopy and light scattering techniques for the determination of particle size, size distribution and aggregation state, and X-ray techniques for the characterisation of surface chemistry and bulk composition. Example characterisation data derived from commercial nZVI materials is used to further illustrate method strengths and limitations. Finally, some important challenges with respect to the characterisation of nZVI in groundwater samples are discussed. In recent years, manufactured nanoparticles (MNPs) have attracted increasing interest for their potential applications in the treatment of contaminated soil and water. In compar

  5. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum and the organic matter from which it is derived are composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity, and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. The main tools used to acquire these geochemical data are analytical techniques. Due to progress in the development of new analytical techniques, many long-standing petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the analytical techniques that have helped in understanding the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  6. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  7. RESEARCH TOWARDS DEVELOPING METHODS FOR SELECTED PHARMACEUTICAL AND PERSONAL CARE PRODUCTS (PPCPS) ADAPTED FOR BIOSOLIDS

    EPA Science Inventory

    Development, standardization, and validation of analytical methods provides state-of-the-science techniques to evaluate the presence, or absence, of select PPCPs in biosolids. This research provides the approaches, methods, and tools to assess the exposures and redu...

  8. Quantitative DNA fiber mapping

    DOEpatents

    Gray, Joe W.; Weier, Heinz-Ulrich G.

    1998-01-01

    The present invention relates generally to the DNA mapping and sequencing technologies. In particular, the present invention provides enhanced methods and compositions for the physical mapping and positional cloning of genomic DNA. The present invention also provides a useful analytical technique to directly map cloned DNA sequences onto individual stretched DNA molecules.

  9. High resolution x-ray CMT: Reconstruction methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, J.K.

    This paper qualitatively discusses the primary characteristics of methods for reconstructing tomographic images from a set of projections. These reconstruction methods can be categorized as either "analytic" or "iterative" techniques. Analytic algorithms are derived from the formal inversion of equations describing the imaging process, while iterative algorithms incorporate a model of the imaging process and provide a mechanism to iteratively improve image estimates. Analytic reconstruction algorithms are typically computationally more efficient than iterative methods; however, analytic algorithms are available for a relatively limited set of imaging geometries and situations. Thus, the framework of iterative reconstruction methods is better suited for high-accuracy tomographic reconstruction codes.
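    The analytic/iterative distinction can be made concrete with a toy example. The sketch below is an assumed illustration, not from the paper: a Kaczmarz/ART-style iterative update, which repeatedly projects the current image estimate onto the hyperplane defined by each measured projection.

```python
# Toy "image" with two unknown pixel values and three projection
# measurements (row vector a_i, measured sum b_i). Values are invented.
rows = [((1.0, 0.0), 3.0),   # x0 alone sums to 3
        ((0.0, 1.0), 4.0),   # x1 alone sums to 4
        ((1.0, 1.0), 7.0)]   # x0 + x1 sums to 7

x = [0.0, 0.0]               # initial image estimate
for _ in range(50):          # sweep the projections repeatedly
    for a, b in rows:
        dot = a[0] * x[0] + a[1] * x[1]
        norm = a[0] ** 2 + a[1] ** 2
        step = (b - dot) / norm      # residual scaled by row norm
        x[0] += step * a[0]          # project onto the measurement hyperplane
        x[1] += step * a[1]

print([round(v, 2) for v in x])      # converges to [3.0, 4.0]
```

Real iterative codes (ART, SIRT, ML-EM) follow the same pattern at vastly larger scale, which is why they trade computation for the flexibility the abstract describes.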

  10. Trace metal speciation in natural waters: Computational vs. analytical

    USGS Publications Warehouse

    Nordstrom, D. Kirk

    1996-01-01

    Improvements in the field sampling, preservation, and determination of trace metals in natural waters have made many analyses more reliable and less affected by contamination. The speciation of trace metals, however, remains controversial. Chemical model speciation calculations do not necessarily agree with voltammetric, ion exchange, potentiometric, or other analytical speciation techniques. When metal-organic complexes are important, model calculations are not usually helpful and on-site analytical separations are essential. Many analytical speciation techniques have serious interferences and only work well for a limited subset of water types and compositions. A combined approach to the evaluation of speciation could greatly reduce these uncertainties. The approach proposed would be to (1) compare and contrast different analytical techniques with each other and with computed speciation, (2) compare computed trace metal speciation with reliable measurements of solubility, potentiometry, and mean activity coefficients, and (3) compare different model calculations with each other for the same set of water analyses, especially where supplementary data on speciation already exist. A comparison and critique of analytical with chemical model speciation for a range of water samples would delineate the useful range and limitations of these different approaches to speciation. Both model calculations and analytical determinations have useful and different constraints on the range of possible speciation such that they can provide much better insight into speciation when used together. Major discrepancies in the thermodynamic databases of speciation models can be evaluated with the aid of analytical speciation, and when the thermodynamic models are highly consistent and reliable, the sources of error in the analytical speciation can be evaluated. Major thermodynamic discrepancies also can be evaluated by simulating solubility and activity coefficient data and testing various chemical models for their range of applicability. Until a comparative approach such as this is taken, trace metal speciation will remain highly uncertain and controversial.
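    A minimal sketch of what a chemical-model speciation calculation does, assuming a single complexation equilibrium M + L <-> ML; the stability constant and total concentrations below are hypothetical round numbers, not values from this work.

```python
import math

K = 1.0e6         # stability constant, K = [ML] / ([M][L]) (hypothetical)
M_total = 1.0e-6  # total dissolved metal, mol/L (hypothetical)
L_total = 1.0e-5  # total ligand, mol/L (hypothetical)

# Mass balance [M] = M_total - c, [L] = L_total - c with c = [ML]
# turns the equilibrium expression into a quadratic in c:
#   K*c**2 - (K*(M_total + L_total) + 1)*c + K*M_total*L_total = 0
a = K
b = -(K * (M_total + L_total) + 1.0)
c0 = K * M_total * L_total
ml = (-b - math.sqrt(b * b - 4.0 * a * c0)) / (2.0 * a)  # physical (smaller) root

free_metal = M_total - ml
fraction_free = free_metal / M_total
print(round(fraction_free, 3))   # roughly 0.099: only ~10% of the metal is free
```

Production codes solve many such coupled equilibria simultaneously from a thermodynamic database; the controversy the abstract describes arises when this computed free fraction disagrees with what a voltammetric or ion-exchange measurement reports.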

  11. Wing download reduction using vortex trapping plates

    NASA Technical Reports Server (NTRS)

    Light, Jeffrey S.; Stremel, Paul M.; Bilanin, Alan J.

    1994-01-01

    A download reduction technique using spanwise plates on the upper and lower wing surfaces has been examined. Experimental and analytical techniques were used to determine the download reduction obtained using this technique. Simple two-dimensional wind tunnel testing confirmed the validity of the technique for reducing two-dimensional airfoil drag. Computations using a two-dimensional Navier-Stokes analysis provided insight into the mechanism causing the drag reduction. Finally, the download reduction technique was tested using a rotor and wing to determine the benefits for a semispan configuration representative of a tilt rotor aircraft.

  12. Matrix vapor deposition/recrystallization and dedicated spray preparation for high-resolution scanning microprobe matrix-assisted laser desorption/ionization imaging mass spectrometry (SMALDI-MS) of tissue and single cells.

    PubMed

    Bouschen, Werner; Schulz, Oliver; Eikel, Daniel; Spengler, Bernhard

    2010-02-01

    Matrix preparation techniques such as air spraying or vapor deposition were investigated with respect to lateral migration, integration of analyte into matrix crystals, and achievable lateral resolution for the purpose of high-resolution biological imaging. The accessible mass range was found to be beyond 5000 u with sufficient analytical sensitivity. Gas-assisted spraying methods (using oxygen-free gases) provide a good compromise between crystal integration of analyte and analyte migration within the sample; controlling preparational parameters with this method, however, is difficult. Separating the preparation procedure into two steps instead leads to improved control of migration and incorporation. The first step is dry vapor deposition of matrix onto the investigated sample. In the second step, incorporation of analyte into the matrix crystal is enhanced by controlled recrystallization of the matrix in a saturated water atmosphere. With this latter method an effective analytical resolution of 2 microm in the x and y directions was achieved for scanning microprobe matrix-assisted laser desorption/ionization imaging mass spectrometry (SMALDI-MS). Cultured A-498 cells of human renal carcinoma were successfully investigated by high-resolution MALDI imaging using the new preparation techniques. Copyright 2010 John Wiley & Sons, Ltd.

  13. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.

  14. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  15. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference betweenmore » usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. 
Researchers and practitioners in HCI who are interested in visual analytics will find this information useful, as well as the discussion of changes that need to be made to current HCI practices to make them more suitable for visual analytics. A history of analysis and analysis techniques and problems is provided, as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. An understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term "users" in almost all situations, there is a wide variety of users, all of whom need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing. 
There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and with the researchers and developers.

  16. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  17. Combination of Cyclodextrin and Ionic Liquid in Analytical Chemistry: Current and Future Perspectives.

    PubMed

    Hui, Boon Yih; Raoov, Muggundha; Zain, Nur Nadhirah Mohamad; Mohamad, Sharifah; Osman, Hasnah

    2017-09-03

    The growth in popularity of cyclodextrins (CDs) and ionic liquids (ILs) as promising materials in the field of analytical chemistry has resulted in an exponential increase in their exploitation and production. CDs belong to the family of cyclic oligosaccharides composed of α-(1,4)-linked glucopyranose subunits and possess a cage-like supramolecular structure. This structure enables chemical reactions to proceed between interacting ions, radicals, or molecules in the absence of covalent bonds. Conversely, ILs are ionic fluids comprising only cations and anions, often with immeasurable vapor pressure, making them green or "designer" solvents. The cooperative effect between CDs and ILs, owing to their fascinating properties, has contributed to advances in analytical chemistry. This comprehensive review gives an overview of some of the recent studies and provides an analytical trend for the application of CDs in combination with ILs that have beneficial and remarkable effects in analytical chemistry, including their use in various sample preparation techniques such as solid phase extraction, magnetic solid phase extraction, cloud point extraction, and microextraction; in separation techniques including gas chromatography, high-performance liquid chromatography, and capillary electrophoresis; and in electrochemical sensors as electrode modifiers, with references to recent applications. This review will highlight the nature of interactions and synergic effects between CDs, ILs, and analytes. It is hoped that this review will stimulate further research in analytical chemistry.

  18. TomoPhantom, a software package to generate 2D-4D analytical phantoms for CT image reconstruction algorithm benchmarks

    NASA Astrophysics Data System (ADS)

    Kazantsev, Daniil; Pickalov, Valery; Nagella, Srikanth; Pasca, Edoardo; Withers, Philip J.

    2018-01-01

    In the field of computerized tomographic imaging, many novel reconstruction techniques are routinely tested using simplistic numerical phantoms, e.g. the well-known Shepp-Logan phantom. These phantoms cannot sufficiently cover the broad spectrum of applications in CT imaging where, for instance, smooth or piecewise-smooth 3D objects are common. TomoPhantom provides quick access to an external library of modular analytical 2D/3D phantoms with temporal extensions. In TomoPhantom, quite complex phantoms can be built using additive combinations of geometrical objects, such as Gaussians, parabolas, cones, ellipses, rectangles, and volumetric extensions of them. Newly designed phantoms are better suited for benchmarking and testing of different image processing techniques. Specifically, tomographic reconstruction algorithms which employ 2D and 3D scanning geometries can be rigorously analyzed using the software. TomoPhantom also provides the capability of obtaining analytical tomographic projections, which further extends the applicability of the software towards more realistic testing, free from the "inverse crime". All core modules of the package are written in the C-OpenMP language, and wrappers for Python and MATLAB are provided to enable easy access. Due to the C-based multi-threaded implementation, volumetric phantoms of high spatial resolution can be obtained with computational efficiency.
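    The additive construction of analytical phantoms described above can be sketched in a few lines of NumPy. This is an illustrative stand-in, not TomoPhantom's API; the object types and parameters below are invented for the example:

    ```python
    import numpy as np

    def ellipse(n, cx, cy, a, b, value):
        """n x n image of a filled ellipse, from its analytical membership test."""
        y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
        return value * ((((x - cx) / a) ** 2 + ((y - cy) / b) ** 2) <= 1.0)

    def gaussian(n, cx, cy, sx, sy, value):
        """n x n image of an anisotropic 2D Gaussian bump."""
        y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
        return value * np.exp(-0.5 * (((x - cx) / sx) ** 2 + ((y - cy) / sy) ** 2))

    # Additive combination of geometrical objects, as in the description above
    n = 256
    phantom = (ellipse(n, 0.0, 0.0, 0.7, 0.9, 1.0)        # background ellipse
               - ellipse(n, 0.2, 0.1, 0.15, 0.25, 0.4)    # low-density inclusion
               + gaussian(n, -0.3, -0.2, 0.1, 0.1, 0.6))  # smooth feature
    ```

    Because every object has a closed analytical form, its Radon transform can also be written analytically, which is what makes "inverse crime"-free projection data possible.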

  19. Combined use of quantitative ED-EPMA, Raman microspectrometry, and ATR-FTIR imaging techniques for the analysis of individual particles.

    PubMed

    Jung, Hae-Jin; Eom, Hyo-Jin; Kang, Hyun-Woo; Moreau, Myriam; Sobanska, Sophie; Ro, Chul-Un

    2014-08-21

    In this work, quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA) (called low-Z particle EPMA), Raman microspectrometry (RMS), and attenuated total reflectance Fourier transform infrared spectroscopic (ATR-FTIR) imaging were applied in combination for the analysis of the same individual airborne particles for the first time. After examining individual particles of micrometer size by low-Z particle EPMA, consecutive examinations by RMS and ATR-FTIR imaging of the same individual particles were then performed. The relocation of the same particles on Al or Ag foils was successfully carried out among the three standalone instruments for several standard samples and an indoor airborne particle sample, resulting in the successful acquisition of quality spectral data from the three single-particle analytical techniques. The combined application of the three techniques to several different standard particles confirmed that those techniques provided consistent and complementary chemical composition information on the same individual particles. Further, it was clearly demonstrated that the three different types of spectral and imaging data from the same individual particles in an indoor aerosol sample provided richer information on physicochemical characteristics of the particle ensemble than that obtainable by the combined use of two single-particle analytical techniques.

  20. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  1. Individual human cell responses to low doses of chemicals studied by synchrotron infrared spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Holman, Hoi-Ying N.; Goth-Goldstein, Regine; Blakely, Elanor A.; Bjornstad, Kathy; Martin, Michael C.; McKinney, Wayne R.

    2000-05-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining, or labeling, SR-FTIR microscopy probes intact living cells, providing a composite view of all of the molecular responses and the ability to monitor the response over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes, combined with other analytical tools, may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low doses of chemicals. In this study we used high-spatial-resolution SR-FTIR vibrational spectromicroscopy as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of dioxin. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, biocompatibility of implant materials, cellular repair mechanisms, self-assembly of cellular apparatus, cell differentiation, and fetal development.

  2. Performance Analyses of Intercity Ground Passenger Transportation Systems

    DOT National Transportation Integrated Search

    1976-04-01

    This report documents the development of analytical techniques and their use for investigating the performance of intercity ground passenger transportation systems. The purpose of the study is twofold: (1) to provide a capability of evaluating new pa...

  3. Ultrasensitive detection of target analyte-induced aggregation of gold nanoparticles using laser-induced nanoparticle Rayleigh scattering.

    PubMed

    Lin, Jia-Hui; Tseng, Wei-Lung

    2015-01-01

    Detection of salt- and analyte-induced aggregation of gold nanoparticles (AuNPs) mostly relies on costly and bulky analytical instruments. To address this drawback, a portable, miniaturized, sensitive, and cost-effective detection technique is urgently required for rapid field detection and monitoring of target analytes via AuNP-based sensors. This study combined a miniaturized spectrometer with a 532-nm laser to develop a laser-induced Rayleigh scattering technique, allowing the sensitive and selective detection of Rayleigh scattering from aggregated AuNPs. Three AuNP-based sensing systems, involving salt-, thiol-, and metal ion-induced aggregation of the AuNPs, were used to examine the sensitivity of the laser-induced Rayleigh scattering technique. Salt-, thiol-, and metal ion-promoted NP aggregation were exemplified by the use of aptamer-adsorbed, fluorosurfactant-stabilized, and gallic acid-capped AuNPs for probing K(+), S-adenosylhomocysteine hydrolase-induced hydrolysis of S-adenosylhomocysteine, and Pb(2+), in sequence. Compared to the reported methods for monitoring aggregated AuNPs, the proposed system provided distinct advantages in sensitivity. The laser-induced Rayleigh scattering technique was made more convenient, cheap, and portable by replacing the diode laser and miniaturized spectrometer with a laser pointer and a smartphone. Using this smartphone-based detection platform, we can determine whether or not the Pb(2+) concentration exceeds the maximum allowable level of Pb(2+) in drinking water. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Stakeholder perspectives on decision-analytic modeling frameworks to assess genetic services policy.

    PubMed

    Guzauskas, Gregory F; Garrison, Louis P; Stock, Jacquie; Au, Sylvia; Doyle, Debra Lochner; Veenstra, David L

    2013-01-01

    Genetic services policymakers and insurers often make coverage decisions in the absence of complete evidence of clinical utility and under budget constraints. We evaluated genetic services stakeholder opinions on the potential usefulness of decision-analytic modeling to inform coverage decisions, and asked them to identify genetic tests for decision-analytic modeling studies. We presented an overview of decision-analytic modeling to members of the Western States Genetic Services Collaborative Reimbursement Work Group and state Medicaid representatives and conducted directed content analysis and an anonymous survey to gauge their attitudes toward decision-analytic modeling. Participants also identified and prioritized genetic services for prospective decision-analytic evaluation. Participants expressed dissatisfaction with current processes for evaluating insurance coverage of genetic services. Some participants expressed uncertainty about their comprehension of decision-analytic modeling techniques. All stakeholders reported openness to using decision-analytic modeling for genetic services assessments. Participants were most interested in application of decision-analytic concepts to multiple-disorder testing platforms, such as next-generation sequencing and chromosomal microarray. Decision-analytic modeling approaches may provide a useful decision tool to genetic services stakeholders and Medicaid decision-makers.

  5. Emerging surface characterization techniques for carbon steel corrosion: a critical brief review.

    PubMed

    Dwivedi, D; Lepkova, K; Becker, T

    2017-03-01

    Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time of flight-secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed.

  6. Emerging surface characterization techniques for carbon steel corrosion: a critical brief review

    NASA Astrophysics Data System (ADS)

    Dwivedi, D.; Lepkova, K.; Becker, T.

    2017-03-01

    Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time of flight-secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed.

  7. Emerging surface characterization techniques for carbon steel corrosion: a critical brief review

    PubMed Central

    Dwivedi, D.; Becker, T.

    2017-01-01

    Carbon steel is a preferred construction material in many industrial and domestic applications, including oil and gas pipelines, where corrosion mitigation using film-forming corrosion inhibitor formulations is a widely accepted method. This review identifies surface analytical techniques that are considered suitable for analysis of thin films at metallic substrates, but are yet to be applied to analysis of carbon steel surfaces in corrosive media or treated with corrosion inhibitors. The reviewed methods include time of flight-secondary ion mass spectrometry, X-ray absorption spectroscopy methods, particle-induced X-ray emission, Rutherford backscatter spectroscopy, Auger electron spectroscopy, electron probe microanalysis, near-edge X-ray absorption fine structure spectroscopy, X-ray photoemission electron microscopy, low-energy electron diffraction, small-angle neutron scattering and neutron reflectometry, and conversion electron Moessbauer spectrometry. Advantages and limitations of the analytical methods in thin-film surface investigations are discussed. Technical parameters of nominated analytical methods are provided to assist in the selection of suitable methods for analysis of metallic substrates deposited with surface films. The challenges associated with the applications of the emerging analytical methods in corrosion science are also addressed. PMID:28413351

  8. Hybrid dual-Fourier tomographic algorithm for fast three-dimensional optical image reconstruction in turbid media

    NASA Technical Reports Server (NTRS)

    Alfano, Robert R. (Inventor); Cai, Wei (Inventor)

    2007-01-01

    A reconstruction technique for reducing computation burden in the 3D image processes, wherein the reconstruction procedure comprises an inverse and a forward model. The inverse model uses a hybrid dual Fourier algorithm that combines a 2D Fourier inversion with a 1D matrix inversion to thereby provide high-speed inverse computations. The inverse algorithm uses a hybrid transfer to provide fast Fourier inversion for data of multiple sources and multiple detectors. The forward model is based on an analytical cumulant solution of a radiative transfer equation. The accurate analytical form of the solution to the radiative transfer equation provides an efficient formalism for fast computation of the forward model.
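    The structure of such a hybrid inversion, a 2D FFT that diagonalizes the transverse coupling followed by a small 1D matrix inversion per transverse frequency, can be illustrated with a toy problem. The operator below is invented for illustration; this is not the patented forward model or algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    nx, ny, nz = 16, 16, 8

    # Toy forward operator: translation-invariant in the transverse (x, y) plane,
    # so a 2D FFT makes it diagonal in (kx, ky), but still coupled across depth z.
    # Each transverse frequency therefore carries a small nz x nz matrix.
    A = np.eye(nz) + 0.1 * rng.standard_normal((nx, ny, nz, nz))  # well-conditioned blocks

    f_true = rng.standard_normal((nx, ny, nz))  # unknown 3D image

    # Forward model in the Fourier domain: g_hat(kx, ky, :) = A(kx, ky) @ f_hat(kx, ky, :)
    f_hat = np.fft.fft2(f_true, axes=(0, 1))
    g_hat = np.einsum('xyij,xyj->xyi', A, f_hat)  # simulated measurements

    # Hybrid inversion: batched 1D matrix inversions (one tiny nz x nz solve per
    # transverse frequency) followed by an inverse 2D FFT, which is far cheaper
    # than inverting the full (nx*ny*nz) x (nx*ny*nz) operator directly.
    f_rec_hat = np.einsum('xyij,xyj->xyi', np.linalg.inv(A), g_hat)
    f_rec = np.fft.ifft2(f_rec_hat, axes=(0, 1)).real
    ```

    The speedup comes from replacing one large dense inversion with many independent small ones, exactly the division of labor the patent's 2D-Fourier-plus-1D-matrix decomposition describes.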

  9. Interviewing a Silent (Radioactive) Witness through Nuclear Forensic Analysis.

    PubMed

    Mayer, Klaus; Wallenius, Maria; Varga, Zsolt

    2015-12-01

    Nuclear forensics is a relatively young scientific discipline which aims at providing information on nuclear material of unknown origin. The determination of characteristic parameters through tailored analytical techniques enables establishing linkages to the material's processing history and hence provides hints about its place and date of production and its intended use.

  10. Automated Solid Phase Extraction (SPE) LC/NMR Applied to the Structural Analysis of Extractable Compounds from a Pharmaceutical Packaging Material of Construction.

    PubMed

    Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C

    2013-01-01

    The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. 
Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers and packaging has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities and of potential leachable impurities produced by laboratory extraction of packaging components and materials. However, mass spectrometry-based analytical techniques have limitations for this application, and newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.

  11. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  12. Grabbing the Air Force by the Tail: Applying Strategic Cost Analytics to Understand and Manage Indirect Cost Behavior

    DTIC Science & Technology

    2015-09-17

    impact influenced by its internal and external supply chain activities. This starts with understanding how we currently apply advanced analytic techniques... minimalistic model provides for sufficient degrees of freedom to guard against overfitting. Second, to guard against the possibility of an over-trained model in...

  13. Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.

    PubMed

    Stolper, Charles D; Perer, Adam; Gotz, David

    2014-12-01

    As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
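    The core contract of a progressive analytic, emitting meaningful partial results the analyst can inspect mid-run, maps naturally onto a generator. A minimal Python sketch (illustrative only, not the Progressive Insights implementation; counting adjacent event pairs is an invented stand-in for its pattern mining):

    ```python
    from collections import Counter

    def progressive_pattern_counts(sequences, chunk_size=2):
        """Yield a refining partial result after each chunk of event sequences."""
        counts = Counter()
        for start in range(0, len(sequences), chunk_size):
            for seq in sequences[start:start + chunk_size]:
                counts.update(zip(seq, seq[1:]))  # adjacent event pairs as a crude "pattern"
            processed = min(start + chunk_size, len(sequences))
            yield processed, counts.most_common(3)  # partial result for the view to render

    sequences = [list("abcab"), list("abab"), list("bcbc"), list("abca")]
    for n_done, top_patterns in progressive_pattern_counts(sequences):
        pass  # a visualization layer would refresh here; an analyst could also
              # reorder the remaining chunks to prioritize a subspace of interest
    ```

    Each yield point is both an update for the visualization and a natural opportunity for analyst intervention, which is the pairing the paradigm requires.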

  14. Recognizing and Reducing Analytical Errors and Sources of Variation in Clinical Pathology Data in Safety Assessment Studies.

    PubMed

    Schultze, A E; Irizarry, A R

    2017-02-01

    Veterinary clinical pathologists are well positioned via education and training to assist in investigations of unexpected results or increased variation in clinical pathology data. Errors in testing and unexpected variability in clinical pathology data are sometimes referred to as "laboratory errors." These alterations may occur in the preanalytical, analytical, or postanalytical phases of studies. Most of the errors or variability in clinical pathology data occur in the preanalytical or postanalytical phases. True analytical errors occur within the laboratory and are usually the result of operator or instrument error. Analytical errors are often ≤10% of all errors in diagnostic testing, and the frequency of these types of errors has decreased in the last decade. Analytical errors and increased data variability may result from instrument malfunctions, inability to follow proper procedures, undetected failures in quality control, sample misidentification, and/or test interference. This article (1) illustrates several different types of analytical errors and situations within laboratories that may result in increased variability in data, (2) provides recommendations regarding prevention of testing errors and techniques to control variation, and (3) provides a list of references that describe and advise how to deal with increased data variability.

  15. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English and assessed relevant clinical endpoints, and we summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis, and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation, and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
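    Of the modeling approaches listed, Markov state-transition modeling is the most common and the simplest to sketch. A minimal cohort simulation follows; the three states, transition probabilities, and utility weights are hypothetical values for illustration, not figures from any of the reviewed MM studies:

    ```python
    import numpy as np

    # Hypothetical 3-state oncology model: progression-free, progressed, dead.
    P = np.array([[0.85, 0.10, 0.05],   # from progression-free
                  [0.00, 0.80, 0.20],   # from progressed
                  [0.00, 0.00, 1.00]])  # death is absorbing

    utilities = np.array([0.80, 0.55, 0.00])  # hypothetical QALY weight per state per cycle

    cohort = np.array([1.0, 0.0, 0.0])  # whole cohort starts progression-free
    qalys = 0.0
    for _ in range(40):                 # forty one-year cycles
        qalys += cohort @ utilities     # accumulate quality-adjusted time
        cohort = cohort @ P             # advance the cohort one cycle

    # (discounting and half-cycle correction omitted for brevity)
    ```

    Comparing treatment strategies then amounts to re-running the loop with each strategy's transition matrix and costs and contrasting the accumulated outcomes.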

  16. Intermodal Ground Access to Airports: A Planning Guide - A Good Start

    DOT National Transportation Integrated Search

    1997-01-01

    This Guide is designed to provide policy guidance, rules of thumb, data, and analytical techniques related to airport access. It has been prepared to help airport operators, local governments, metropolitan planning organizations, consultants and other...

  17. 40 CFR 79.11 - Information and assurances to be provided by the fuel manufacturer.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... description (or identification, in the case of a generally accepted method) of a suitable analytical technique... sold, offered for sale, or introduced into commerce for use in motor vehicles manufactured after model...

  18. Development and Calibration of Highway Safety Manual Equations for Florida Conditions

    DOT National Transportation Integrated Search

    2011-08-31

    The Highway Safety Manual (HSM) provides statistically-valid analytical tools and techniques for quantifying the potential effects on crashes as a result of decisions made in planning, design, operations, and maintenance. Implementation of the new te...

  19. Development and calibration of highway safety manual equations for Florida conditions.

    DOT National Transportation Integrated Search

    2011-08-31

    The Highway Safety Manual (HSM) provides statistically-valid analytical tools and techniques for quantifying the potential effects on crashes as a result of decisions made in planning, design, operations, and maintenance. Implementation of the ne...

  20. Forensic Chemistry--A Symposium Collection.

    ERIC Educational Resources Information Center

    Journal of Chemical Education, 1985

    1985-01-01

    Presents a collection of articles to provide chemistry teachers with resource materials to add forensic chemistry units to their chemistry courses. Topics range from development of forensic science laboratory courses and mock-crime scenes to forensic serology and analytical techniques. (JN)

  1. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite-current (dynamic) analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  2. Practical Considerations for Determination of Glass Transition Temperature of a Maximally Freeze Concentrated Solution.

    PubMed

    Pansare, Swapnil K; Patel, Sajal Manubhai

    2016-08-01

    Glass transition temperature is a unique thermal characteristic of amorphous systems and is associated with changes in physical properties such as heat capacity, viscosity, electrical resistance, and molecular mobility. The glass transition temperature of an amorphous solid is referred to as Tg, whereas for a maximally freeze concentrated solution the notation is Tg'. This article focuses on the factors affecting determination of Tg' for application to lyophilization process design and frozen storage stability. This review also provides a perspective on the use of various types of solutes in protein formulations and their effect on Tg'. Although various analytical techniques are used for determination of Tg' based on the changes in physical properties associated with the glass transition, differential scanning calorimetry (DSC) is the most commonly used. An overview of the DSC technique is provided, along with a brief discussion of alternative analytical techniques for Tg' determination. Additionally, challenges associated with Tg' determination by DSC for protein formulations are discussed. The purpose of this review is to provide a practical industry perspective on determination of Tg' for protein formulations as it relates to the design and development of lyophilization processes and/or frozen storage; a comprehensive review of glass transition temperature (Tg, Tg') in general is outside the scope of this work.

  3. The composition-explicit distillation curve technique: Relating chemical analysis and physical properties of complex fluids.

    PubMed

    Bruno, Thomas J; Ott, Lisa S; Lovestead, Tara M; Huber, Marcia L

    2010-04-16

    The analysis of complex fluids such as crude oils, fuels, vegetable oils, and mixed waste streams poses significant challenges, arising primarily from the multiplicity of components, the differing properties of the components (polarity, polarizability, etc.), and matrix properties. We have recently introduced an analytical strategy that simplifies many of these analyses and provides the added potential of linking compositional information with physical property information. This aspect can be used to facilitate equation-of-state development for complex fluids. In addition to chemical characterization, the approach provides the ability to calculate thermodynamic properties for such complex heterogeneous streams. The technique is based on the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. The analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric, and sulfur chemiluminescence detection), thin layer chromatography, FTIR, corrosivity analysis, neutron activation analysis, and cold neutron prompt gamma activation analysis. By far the most widely used of these with the ADC is gas chromatography. This has enabled us to study finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure), and waste oil streams (used automotive and transformer oils). In this special issue of the Journal of Chromatography, specifically dedicated to extraction technologies, we describe the essential features of the advanced distillation curve metrology as an analytical strategy for complex fluids. Published by Elsevier B.V.

  4. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool makes it possible to find the steps in an analytical procedure with the greatest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making when assessing and managing method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the procedural steps with the greatest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  5. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assign accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques such as atomic spectrometry based and X-ray based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant particle physical properties, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to much of the confusion between the two). But the skills needed for each (co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise), although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself), providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that combine an understanding of the science domain with data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science.
    Thus, from an advancing-science point of view: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined and information to be manipulated, such that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the scientific eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.

  7. Electrospray Modifications for Advancing Mass Spectrometric Analysis

    PubMed Central

    Meher, Anil Kumar; Chen, Yu-Chie

    2017-01-01

    Generation of analyte ions in the gas phase is a primary requirement for mass spectrometric analysis. One ionization technique that can be used to generate gas-phase ions is electrospray ionization (ESI). ESI is a soft ionization method that can be used to analyze analytes ranging from small organics to large biomolecules. Numerous ionization techniques derived from ESI have been reported in the past two decades. These ion sources aim at simplicity and ease of operation. Many of these ionization methods allow the flexibility to eliminate or minimize sample preparation steps prior to mass spectrometric analysis. Such ion sources have opened up new possibilities for taking on scientific challenges that might be limited by the conventional ESI technique. Thus, the number of ESI variants continues to increase. This review provides an overview of ionization techniques based on the use of electrospray reported in recent years, along with a brief discussion of the instrumentation, underlying processes, and selected applications. PMID:28573082

  8. Biosensors and their applications in detection of organophosphorus pesticides in the environment.

    PubMed

    Hassani, Shokoufeh; Momtaz, Saeideh; Vakhshiteh, Faezeh; Maghsoudi, Armin Salek; Ganjali, Mohammad Reza; Norouzi, Parviz; Abdollahi, Mohammad

    2017-01-01

    This review discusses the past and recent advancements of biosensors, focusing on the detection of organophosphorus pesticides (OPs) given their extensive use over the last decades. Apart from their agricultural benefits, OPs also impose adverse toxicological effects on animal and human populations. Conventional approaches such as chromatographic techniques used for pesticide detection are associated with several limitations. Biosensor technology is unique in its detection sensitivity, selectivity, remarkable performance capabilities, simplicity, on-site operation, and ease of fabrication and incorporation of nanomaterials. This study also provides specifications of most OP biosensors reported to date, organized by their transducer system. In addition, we highlight the application of advanced complementary materials and analysis techniques in OP detection systems. The availability of these new materials, together with new sensing techniques, has led to the introduction of easy-to-use analytical tools of high sensitivity and specificity in the design and construction of OP biosensors. In this review, we elaborate the achievements in sensing systems concerning innovative nanomaterials and analytical techniques, with emphasis on OPs.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaptation of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups--Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis--which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.

  10. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
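
    As a rough illustration of the kind of inference such an ontology automates, consider a toy rule table that maps dataset characteristics to candidate modeling techniques. The function, rules, and technique names below are invented for illustration and are far simpler than the actual Analytics Ontology:

```python
# Toy stand-in for ontology-driven model selection: map a few dataset
# characteristics to candidate techniques via hand-written rules.
# Hypothetical rules only -- not the Analytics Ontology's actual logic.
def suggest_techniques(response_type, n_rows, n_features):
    suggestions = []
    if response_type == "continuous":
        suggestions.append("multiple linear regression")
        if n_features > n_rows:
            # more predictors than observations: regularization advised
            suggestions.append("ridge/lasso regression")
    elif response_type == "binary":
        suggestions.append("logistic regression")
    if n_rows > 1_000_000:
        # very large data: prefer scalable estimation procedures
        suggestions.append("stochastic gradient descent variants")
    return suggestions

print(suggest_techniques("continuous", n_rows=500, n_features=2000))
```

    An ontology-based system generalizes this idea: the rules become formally described properties of techniques and datasets, so a reasoner can infer the candidate set instead of hard-coded conditionals.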

  11. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  12. Spherical indentation of a freestanding circular membrane revisited: Analytical solutions and experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Congrui; Davoodabadi, Ali; Li, Jianlin

    Because of the development of novel micro-fabrication techniques to produce ultra-thin materials, and increasing interest in thin biological membranes, the mechanical characterization of thin films has received a significant amount of attention in recent years. To provide a more accurate solution for the relationship among contact radius, load, and deflection, the fundamental and widely applicable problem of spherical indentation of a freestanding circular membrane has been revisited. The work presented here significantly extends previous contributions by providing an exact analytical solution to the governing equations of a Föppl–Hencky membrane indented by a frictionless spherical indenter. In this study, spherical indentation experiments were performed, and the exact analytical solution presented in this article is compared against experimental data from the existing literature as well as our own experimental results.

  13. Spherical indentation of a freestanding circular membrane revisited: Analytical solutions and experiments

    DOE PAGES

    Jin, Congrui; Davoodabadi, Ali; Li, Jianlin; ...

    2017-01-11

    Because of the development of novel micro-fabrication techniques to produce ultra-thin materials, and increasing interest in thin biological membranes, the mechanical characterization of thin films has received a significant amount of attention in recent years. To provide a more accurate solution for the relationship among contact radius, load, and deflection, the fundamental and widely applicable problem of spherical indentation of a freestanding circular membrane has been revisited. The work presented here significantly extends previous contributions by providing an exact analytical solution to the governing equations of a Föppl–Hencky membrane indented by a frictionless spherical indenter. In this study, spherical indentation experiments were performed, and the exact analytical solution presented in this article is compared against experimental data from the existing literature as well as our own experimental results.

  14. Constructing and predicting solitary pattern solutions for nonlinear time-fractional dispersive partial differential equations

    NASA Astrophysics Data System (ADS)

    Arqub, Omar Abu; El-Ajou, Ahmad; Momani, Shaher

    2015-07-01

    Building fractional mathematical models for specific phenomena and developing numerical or analytical solutions for these models are crucial issues in mathematics, physics, and engineering. In this work, a new analytical technique for constructing and predicting solitary pattern solutions of time-fractional dispersive partial differential equations is proposed, based on the generalized Taylor series formula and a residual error function. The new approach provides solutions in the form of a rapidly convergent series with easily computable components, using symbolic computation software. For evaluation and validation, the proposed technique was applied to three different models and compared with some well-known methods. The resulting simulations clearly demonstrate the superiority and potential of the proposed technique in terms of performance quality and accuracy, preservation of substructure in the constructed solutions, and the prediction of solitary pattern solutions for time-fractional dispersive partial differential equations.
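
    The generalized Taylor series underlying such residual power series approaches expands the solution in powers of $t^{\alpha}$. Schematically (a sketch of the standard fractional power series ansatz with a Caputo derivative of order $0 < \alpha \le 1$, not a reproduction of the paper's derivation):

```latex
u(x,t) \;=\; \sum_{n=0}^{\infty} f_n(x)\,\frac{t^{n\alpha}}{\Gamma(1+n\alpha)},
\qquad 0 < \alpha \le 1,
```

    where the coefficient functions $f_n(x)$ are determined successively by substituting the truncated series into the equation and requiring the residual error function and its fractional derivatives to vanish at $t = 0$; for $\alpha = 1$ the expansion reduces to the classical Taylor series.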

  15. Intracavity optogalvanic spectroscopy. An analytical technique for 14C analysis with subattomole sensitivity.

    PubMed

    Murnick, Daniel E; Dogru, Ozgur; Ilkmen, Erhan

    2008-07-01

    We show a new ultrasensitive laser-based analytical technique, intracavity optogalvanic spectroscopy, allowing extremely high sensitivity for detection of 14C-labeled carbon dioxide. Capable of replacing large accelerator mass spectrometers, the technique quantifies attomoles of 14C in submicrogram samples. Based on the specificity of narrow laser resonances, coupled with the sensitivity provided by standing waves in an optical cavity and detection via impedance variations, limits of detection near 14C/12C ratios of 10^-15 are obtained. Using a 15-W 14CO2 laser, a linear calibration with samples from 10^-15 to >1.5 x 10^-12 in 14C/12C ratios, as determined by accelerator mass spectrometry, is demonstrated. Possible applications include microdosing studies in drug development, individualized subtherapeutic tests of drug metabolism, carbon dating, and real-time monitoring of atmospheric radiocarbon. The method can also be applied to the detection of other trace entities.

  16. Nuclear analytical techniques in medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cesareo, R.

    1988-01-01

    This book acquaints one with the fundamental principles and the instrumentation relevant to analytical techniques based on atomic and nuclear physics, as well as present and future biomedical applications. Besides providing a theoretical description of the physical phenomena, a large part of the book is devoted to applications in the medical and biological field, particularly in hematology, forensic medicine, and environmental science. This volume reviews methods such as the possibility of carrying out rapid multi-element analysis of trace elements on biomedical samples, in vitro and in vivo, by XRF analysis; the ability of the PIXE microprobe to analyze in detail and to map trace elements in fragments of biomedical samples or inside cells; and the potential of in vivo nuclear activation analysis for diagnostic purposes. Finally, techniques are described such as radiation scattering (elastic and inelastic) and attenuation measurements, which will undoubtedly see great development in the immediate future.

  17. Application of analytical methods in authentication and adulteration of honey.

    PubMed

    Siddiqui, Amna Jabbar; Musharraf, Syed Ghulam; Choudhary, M Iqbal; Rahman, Atta-Ur-

    2017-02-15

    Honey is synthesized from flower nectar and has been famed for its tremendous therapeutic potential since ancient times. Many factors influence the basic properties of honey, including the nectar-providing plant species, bee species, geographic area, and harvesting conditions. The quality and composition of honey are also affected by many other factors, such as overfeeding of bees with sucrose, harvesting prior to maturity, and adulteration with sugar syrups. Due to the complex nature of honey, it is often challenging to authenticate its purity and quality using common methods such as physicochemical parameters, and more specialized procedures need to be developed. This article reviews the literature (between 2000 and 2016) on the use of analytical techniques, mainly NMR spectroscopy, for authentication of honey, its botanical and geographical origin, and adulteration by sugar syrups. NMR is a powerful technique and can be used as a fingerprinting technique to compare various samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Service Bundle Recommendation for Person-Centered Care Planning in Cities.

    PubMed

    Kotoulas, Spyros; Daly, Elizabeth; Tommasi, Pierpaolo; Kishimoto, Akihiro; Lopez, Vanessa; Stephenson, Martin; Botea, Adi; Sbodio, Marco; Marinescu, Radu; Rooney, Ronan

    2016-01-01

    Providing appropriate support for the most vulnerable individuals carries enormous societal significance and economic burden. Yet finding the right balance between costs, estimated effectiveness, and the experience of the care recipient is a daunting task that requires considering a vast amount of information. We present a system that helps care teams choose the optimal combination of providers for a set of services. We draw on techniques from Open Data processing, semantic processing, faceted exploration, visual analytics, transportation analytics, and multi-objective optimization. We present an implementation of the system using data from New York City and illustrate the feasibility of these technologies for guiding care workers in care planning.

  19. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water utilities are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is one of the promising disinfectants, usually used as a secondary disinfectant, while the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET), and amperometric titration) to determine the superior technique. The techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior technique was determined against pre-defined criteria. To discern the effectiveness of this technique, various factors that might influence its performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, chronoamperometry showed a significant level of accuracy and precision. Furthermore, the various influencing factors studied did not diminish the technique's performance; it was fairly adequate in all matrices. This study is a step towards proper disinfection monitoring, and it confidently assists engineers with chlorine dioxide disinfection system planning and management.
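
    The accuracy and precision criteria used in such comparisons can be sketched numerically: bias against a reference standard for accuracy, relative standard deviation of replicates for precision. The replicate readings below are invented for illustration only and do not come from the study:

```python
import statistics

# Hypothetical replicate readings (mg/L) of a 1.00 mg/L chlorine dioxide
# standard by two methods -- made-up numbers, purely to illustrate the
# bias (accuracy) and %RSD (precision) criteria used in such comparisons.
reference = 1.00
readings = {
    "chronoamperometry": [0.99, 1.01, 1.00, 0.98, 1.02],
    "DPD":               [0.93, 1.08, 0.97, 1.05, 0.91],
}

results = {}
for method, values in readings.items():
    mean = statistics.mean(values)
    results[method] = {
        "bias": mean - reference,                               # accuracy
        "rsd_percent": statistics.stdev(values) / mean * 100.0,  # precision
    }

for method, r in results.items():
    print(f"{method}: bias={r['bias']:+.3f} mg/L, RSD={r['rsd_percent']:.1f}%")
```

    A full comparison would repeat this at several concentration levels and in different water matrices before ranking the techniques against the pre-defined criteria.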

  20. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs, and environmental effects of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performance of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques and HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior on technical, economic, and environmental grounds to the invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Analysis of high-aspect-ratio jet-flap wings of arbitrary geometry

    NASA Technical Reports Server (NTRS)

    Lissaman, P. B. S.

    1973-01-01

    An analytical technique to compute the performance of an arbitrary jet-flapped wing is developed. The solution technique is based on the method of Maskell and Spence in which the well-known lifting-line approach is coupled with an auxiliary equation providing the extra function needed in jet-flap theory. The present method is generalized to handle straight, uncambered wings of arbitrary planform, twist, and blowing (including unsymmetrical cases). An analytical procedure is developed for continuous variations in the above geometric data with special functions to exactly treat discontinuities in any of the geometric and blowing data. A rational theory for the effect of finite wing thickness is introduced as well as simplified concepts of effective aspect ratio for rapid estimation of performance.

  2. Subsynchronous instability of a geared centrifugal compressor of overhung design

    NASA Technical Reports Server (NTRS)

    Hudson, J. H.; Wittman, L. J.

    1980-01-01

    The original design analysis and shop test data are presented for a three-stage (booster) air compressor with impellers mounted on the extensions of a twin-pinion gear, driven by an 8000 hp synchronous motor. Also included are field test data, subsequent rotor dynamics analysis, modifications, and final rotor behavior. A subsynchronous instability existed on a geared, overhung rotor. State-of-the-art rotor dynamics analysis techniques provided a reasonable analytical model of the rotor, and a bearing modification arrived at analytically eliminated the instability.

  3. Dual nozzle aerodynamic and cooling analysis study

    NASA Technical Reports Server (NTRS)

    Meagher, G. M.

    1981-01-01

    Analytical models to predict performance and operating characteristics of dual nozzle concepts were developed and improved. Aerodynamic models are available to define flow characteristics and bleed requirements for both the dual throat and dual expander concepts. Advanced analytical techniques were utilized to provide quantitative estimates of the bleed flow, boundary layer, and shock effects within dual nozzle engines. Thermal analyses were performed to define cooling requirements for baseline configurations, and special studies of unique dual nozzle cooling problems defined feasible means of achieving adequate cooling.

  4. Mum's the word. Are we becoming silent on masturbation?

    PubMed

    Barrett, Denia G

    2012-01-01

    This paper explores the trend in contemporary child analytic technique away from addressing material related to masturbation. The author invites reconsideration of the value of timely, tactful exploration of a child's impulses, fantasies, and related conflicts. The analyst's resistances to open discussion of these are addressed, along with the limiting effect this may have on the patient feeling fully understood. Clinical examples are provided of analytic work with children from prelatency through preadolescence, whose symptoms range from neurotic conflict to more severe and early disturbances.

  5. Analytical Chemistry Laboratory Progress Report for FY 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.

  6. Speciation of individual mineral particles of micrometer size by the combined use of attenuated total reflectance-Fourier transform-infrared imaging and quantitative energy-dispersive electron probe X-ray microanalysis techniques.

    PubMed

    Jung, Hae-Jin; Malek, Md Abdul; Ryu, JiYeon; Kim, BoWha; Song, Young-Chul; Kim, HyeKyeong; Ro, Chul-Un

    2010-07-15

    Our previous work demonstrated for the first time the potential of the combined use of two techniques, attenuated total reflectance FT-IR (ATR-FT-IR) imaging and a quantitative energy-dispersive electron probe X-ray microanalysis, low-Z particle EPMA, for the characterization of individual aerosol particles. In this work, the speciation of mineral particles was performed on a single particle level for 24 mineral samples, including kaolinite, montmorillonite, vermiculite, talc, quartz, feldspar, calcite, gypsum, and apatite, by the combined use of ATR-FT-IR imaging and low-Z particle EPMA techniques. These two single particle analytical techniques provide complementary information, the ATR-FT-IR imaging on mineral types and low-Z particle EPMA on the morphology and elemental concentrations, on the same individual particles. This work demonstrates that the combined use of the two single particle analytical techniques can powerfully characterize externally heterogeneous mineral particle samples in detail and has great potential for the characterization of airborne mineral dust particles.

  7. Current applications of high-resolution mass spectrometry for the analysis of new psychoactive substances: a critical review.

    PubMed

    Pasin, Daniel; Cawley, Adam; Bidny, Sergei; Fu, Shanlin

    2017-10-01

    The proliferation of new psychoactive substances (NPS) in recent years has resulted in the development of numerous analytical methods for the detection and identification of known and unknown NPS derivatives. High-resolution mass spectrometry (HRMS) has been identified as the method of choice for broad screening of NPS in a wide range of analytical contexts because of its ability to measure accurate masses using data-independent acquisition (DIA) techniques. Additionally, it has shown promise for non-targeted screening strategies that have been developed in order to detect and identify novel analogues without the need for certified reference materials (CRMs) or comprehensive mass spectral libraries. This paper reviews the applications of HRMS for the analysis of NPS in forensic drug chemistry and analytical toxicology. It provides an overview of the sample preparation procedures in addition to data acquisition, instrumental analysis, and data processing techniques. Furthermore, it gives an overview of the current state of non-targeted screening strategies with discussion on future directions and perspectives of this technique. Graphical Abstract: Missing the bullseye - a graphical representation of non-targeted screening. Image courtesy of Christian Alonzo.

  8. Material property analytical relations for the case of an AFM probe tapping a viscoelastic surface containing multiple characteristic times

    PubMed Central

    López-Guerra, Enrique A

    2017-01-01

    We explore the contact problem of a flat-end indenter penetrating intermittently a generalized viscoelastic surface, containing multiple characteristic times. This problem is especially relevant for nanoprobing of viscoelastic surfaces with the highly popular tapping-mode AFM imaging technique. By focusing on the material perspective and employing a rigorous rheological approach, we deliver analytical closed-form solutions that provide physical insight into the viscoelastic sources of repulsive forces, tip–sample dissipation and virial of the interaction. We also offer a systematic comparison to the well-established standard harmonic excitation, which is the case relevant for dynamic mechanical analysis (DMA) and for AFM techniques where tip–sample sinusoidal interaction is permanent. This comparison highlights the substantial complexity added by the intermittent-contact nature of the interaction, which precludes the derivation of straightforward equations as is the case for the well-known harmonic excitations. The derivations offered have been thoroughly validated through numerical simulations. Despite the complexities inherent to the intermittent-contact nature of the technique, the analytical findings highlight the potential feasibility of extracting meaningful viscoelastic properties with this imaging method. PMID:29114450

  9. Electrostatic Interactions between OmpG Nanopore and Analyte Protein Surface Can Distinguish between Glycosylated Isoforms.

    PubMed

    Fahie, Monifa A; Chen, Min

    2015-08-13

    The flexible loops decorating the entrance of OmpG nanopore move dynamically during ionic current recording. The gating caused by these flexible loops changes when a target protein is bound. The gating is characterized by parameters including frequency, duration, and open-pore current, and these features combine to reveal the identity of a specific analyte protein. Here, we show that OmpG nanopore equipped with a biotin ligand can distinguish glycosylated and deglycosylated isoforms of avidin by their differences in surface charge. Our studies demonstrate that the direct interaction between the nanopore and analyte surface, induced by the electrostatic attraction between the two molecules, is essential for protein isoform detection. Our technique is remarkably sensitive to the analyte surface, which may provide a useful tool for glycoprotein profiling.

  10. What We Do and Do Not Know about Teaching Medical Image Interpretation.

    PubMed

    Kok, Ellen M; van Geel, Koos; van Merriënboer, Jeroen J G; Robben, Simon G F

    2017-01-01

    Educators in medical image interpretation have difficulty finding scientific evidence as to how they should design their instruction. We review and comment on 81 papers that investigated instructional design in medical image interpretation. We distinguish between studies that evaluated complete offline courses and curricula, studies that evaluated e-learning modules, and studies that evaluated specific educational interventions. Twenty-three percent of all studies evaluated the implementation of complete courses or curricula, and 44% of the studies evaluated the implementation of e-learning modules. We argue that these studies have encouraging results but provide little information for educators: too many differences exist between conditions to unambiguously attribute the learning effects to specific instructional techniques. Moreover, concepts are not uniformly defined and methodological weaknesses further limit the usefulness of evidence provided by these studies. Thirty-two percent of the studies evaluated a specific interventional technique. We discuss three theoretical frameworks that informed these studies: diagnostic reasoning, cognitive schemas and study strategies. Research on diagnostic reasoning suggests teaching students to start with non-analytic reasoning and subsequently applying analytic reasoning, but little is known on how to train non-analytic reasoning. Research on cognitive schemas investigated activities that help the development of appropriate cognitive schemas. Finally, research on study strategies supports the effectiveness of practice testing, but more study strategies could be applicable to learning medical image interpretation. Our commentary highlights the value of evaluating specific instructional techniques, but further evidence is required to optimally inform educators in medical image interpretation.

  11. An analytical-numerical approach for parameter determination of a five-parameter single-diode model of photovoltaic cells and modules

    NASA Astrophysics Data System (ADS)

    Hejri, Mohammad; Mokhtari, Hossein; Azizian, Mohammad Reza; Söder, Lennart

    2016-04-01

    Parameter extraction of the five-parameter single-diode model of solar cells and modules from experimental data is a challenging problem. These parameters are evaluated from a set of nonlinear equations that cannot be solved analytically. On the other hand, a numerical solution of such equations needs a suitable initial guess to converge to a solution. This paper presents a new set of approximate analytical solutions for the parameters of a five-parameter single-diode model of photovoltaic (PV) cells and modules. The proposed solutions provide a good initial point that guarantees convergence of the numerical analysis. The proposed technique needs only a few data from the PV current-voltage characteristics, i.e., the open-circuit voltage Voc, the short-circuit current Isc, and the maximum-power-point current and voltage Im and Vm, making it a fast and low-cost parameter determination technique. The accuracy of the presented theoretical I-V curves is verified by experimental data.
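    As a concrete illustration of the kind of analytical initial guess described here, the sketch below computes textbook approximations of the five parameters (Iph, I0, a, Rs, Rsh) from the datasheet points Voc, Isc, Vm, and Im alone. The module values, the assumed ideality factor, and the specific estimation formulas are illustrative assumptions, not the authors' exact derivation.

```python
import math

def initial_guess(v_oc, i_sc, v_m, i_m, n_cells=36, temp_k=298.15, a=1.3):
    """Approximate analytical starting values (Iph, I0, a, Rs, Rsh) for the
    single-diode model from datasheet points only.  The ideality factor `a`
    is an assumed constant; every formula is a common textbook
    approximation, not the paper's exact solution."""
    k, q = 1.380649e-23, 1.602176634e-19   # Boltzmann constant, electron charge
    vt = n_cells * a * k * temp_k / q      # modified thermal voltage of the module
    i_ph = i_sc                            # photocurrent ~ short-circuit current
    i_0 = i_sc / math.expm1(v_oc / vt)     # saturation current from the Voc condition
    r_s = (v_oc - v_m) / i_m               # crude series-resistance estimate
    r_sh = v_m / (i_sc - i_m)              # crude shunt-resistance estimate
    return i_ph, i_0, a, r_s, r_sh

# Hypothetical 36-cell module: Voc = 21.1 V, Isc = 3.8 A, Vm = 17.1 V, Im = 3.5 A
params = initial_guess(21.1, 3.8, 17.1, 3.5)
```

    A Newton-type solver for the full implicit I-V equations would then start from these values; because the guess already roughly satisfies the open- and short-circuit conditions, convergence is typically fast.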

  12. Guidance to Achieve Accurate Aggregate Quantitation in Biopharmaceuticals by SV-AUC.

    PubMed

    Arthur, Kelly K; Kendrick, Brent S; Gabrielson, John P

    2015-01-01

    The levels and types of aggregates present in protein biopharmaceuticals must be assessed during all stages of product development, manufacturing, and storage of the finished product. Routine monitoring of aggregate levels in biopharmaceuticals is typically achieved by size exclusion chromatography (SEC) due to its high precision, speed, robustness, and simplicity to operate. However, SEC is error prone and requires careful method development to ensure accuracy of reported aggregate levels. Sedimentation velocity analytical ultracentrifugation (SV-AUC) is an orthogonal technique that can be used to measure protein aggregation without many of the potential inaccuracies of SEC. In this chapter, we discuss applications of SV-AUC during biopharmaceutical development and how characteristics of the technique make it better suited for some applications than others. We then discuss the elements of a comprehensive analytical control strategy for SV-AUC. Successful implementation of these analytical control elements ensures that SV-AUC provides continued value over the long time frames necessary to bring biopharmaceuticals to market. © 2015 Elsevier Inc. All rights reserved.

  13. Comments on "A Closed-Form Solution to Tensor Voting: Theory and Applications".

    PubMed

    Maggiori, Emmanuel; Lotito, Pablo; Manterola, Hugo Luis; del Fresno, Mariana

    2014-12-01

    We comment on a paper that describes a closed-form formulation to Tensor Voting, a technique to perceptually group clouds of points, usually applied to infer features in images. The authors proved an analytic solution to the technique, a highly relevant contribution considering that the original formulation required numerical integration, a time-consuming task. Their work constitutes the first closed-form expression for the Tensor Voting framework. In this work we first observe that the proposed formulation leads to unexpected results which do not satisfy the constraints for a Tensor Voting output, hence they cannot be interpreted. Given that the closed-form expression is said to be an analytic equivalent solution, unexpected outputs should not be encountered unless there are flaws in the proof. We analyzed the underlying math to find which were the causes of these unexpected results. In this commentary we show that their proposal does not in fact provide a proper analytic solution to Tensor Voting and we indicate the flaws in the proof.

  14. HANDBOOK: CONTINUOUS EMISSION MONITORING SYSTEMS FOR NON-CRITERIA POLLUTANTS

    EPA Science Inventory

    This Handbook provides a description of the methods used to continuously monitor non-criteria pollutants emitted from stationary sources. The Handbook contains a review of current regulatory programs, the state-of-the-art sampling system design, analytical techniques, and the use...

  15. Depth-resolved monitoring of analytes diffusion in ocular tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.

    2007-02-01

    Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantification of analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied, and their permeability coefficients were calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. The obtained results suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.
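    A minimal sketch of one common way a depth-resolved permeability coefficient is derived from OCT data: the agent's arrival time is estimated at each probed depth from the OCT signal, each depth/time ratio gives a local rate, and the mean over depths gives the tissue permeability. The depth and time values below are hypothetical, and this simple definition is an assumption rather than the study's exact algorithm.

```python
def permeability_coefficient(depths_cm, arrival_times_s):
    """Mean permeability (cm/s) from per-depth agent arrival times:
    average of depth / arrival-time over all probed depths."""
    rates = [z / t for z, t in zip(depths_cm, arrival_times_s)]
    return sum(rates) / len(rates)

# Hypothetical depth/arrival-time pairs for an agent diffusing into sclera.
depths = [0.010, 0.020, 0.030]    # cm
times = [600.0, 1150.0, 1800.0]   # s
p_bar = permeability_coefficient(depths, times)
```

    With these invented numbers, p_bar comes out on the order of 1e-5 cm/s, the magnitude typically reported for small analytes in ocular tissue.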

  16. Photochemical Degradation of the Anticancer Drug Bortezomib by V-UV/UV (185/254 nm) Investigated by (1)H NMR Fingerprinting: A Way to Follow Aromaticity Evolution.

    PubMed

    Martignac, Marion; Balayssac, Stéphane; Gilard, Véronique; Benoit-Marquié, Florence

    2015-06-18

    We have investigated the removal of bortezomib, an anticancer drug prescribed in multiple myeloma, using the photochemical advanced oxidation process of V-UV/UV (185/254 nm). We used two complementary analytical techniques to follow the removal rate of bortezomib. Nuclear magnetic resonance (NMR) is a nonselective method requiring no prior knowledge of the structures of the byproducts and permits us to provide a spectral signature (fingerprinting approach). This untargeted method provides clues to the molecular structure changes and information on the degradation of the parent drug during the irradiation process. This holistic NMR approach could provide information for monitoring aromaticity evolution. We use liquid chromatography, coupled with high-resolution mass spectrometry (LC-MS), to correlate results obtained by (1)H NMR and for accurate identification of the byproducts, in order to understand the mechanistic degradation pathways of bortezomib. The results show that primary byproducts come from photoassisted deboronation of bortezomib at 254 nm. A secondary byproduct of pyrazinecarboxamide was also identified. We obtained a reliable correlation between these two analytical techniques.

  17. An integrated approach using orthogonal analytical techniques to characterize heparan sulfate structure.

    PubMed

    Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Gunay, Nur Sibel; Wang, Jing; Sun, Elaine Y; Pradines, Joël R; Farutin, Victor; Shriver, Zachary; Kaundinya, Ganesh V; Capila, Ishan

    2017-02-01

    Heparan sulfate (HS), a glycosaminoglycan present on the surface of cells, has been postulated to have important roles in driving both normal and pathological physiologies. The chemical structure and sulfation pattern (domain structure) of HS is believed to determine its biological function, to vary across tissue types, and to be modified in the context of disease. Characterization of HS requires isolation and purification of cell surface HS as a complex mixture. This process may introduce additional chemical modification of the native residues. In this study, we describe an approach towards thorough characterization of bovine kidney heparan sulfate (BKHS) that utilizes a variety of orthogonal analytical techniques (e.g. NMR, IP-RPHPLC, LC-MS). These techniques are applied to characterize this mixture at various levels including composition, fragment level, and overall chain properties. The combination of these techniques in many instances provides orthogonal views into the fine structure of HS, and in other instances provides overlapping / confirmatory information from different perspectives. Specifically, this approach enables quantitative determination of natural and modified saccharide residues in the HS chains, and identifies unusual structures. Analysis of partially digested HS chains allows for a better understanding of the domain structures within this mixture, and yields specific insights into the non-reducing end and reducing end structures of the chains. This approach outlines a useful framework that can be applied to elucidate HS structure and thereby provides means to advance understanding of its biological role and potential involvement in disease progression. In addition, the techniques described here can be applied to characterization of heparin from different sources.

  18. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  19. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  20. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Feng; Liu, Yijin; Yu, Xiqian

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as some of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  1. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE PAGES

    Lin, Feng; Liu, Yijin; Yu, Xiqian; ...

    2017-08-30

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as some of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  2. Developing the role of big data and analytics in health professional education.

    PubMed

    Ellaway, Rachel H; Pusic, Martin V; Galbraith, Robert M; Cameron, Terri

    2014-03-01

    As we capture more and more data about learners, their learning, and the organization of their learning, our ability to identify emerging patterns and to extract meaning grows exponentially. The insights gained from the analyses of these large amounts of data are only helpful to the extent that they can be the basis for positive action such as knowledge discovery, improved capacity for prediction, and anomaly detection. Big Data involves the aggregation and melding of large and heterogeneous datasets, while education analytics involves looking for patterns in educational practice or performance in single or aggregate datasets. Although it seems likely that the use of education analytics and Big Data techniques will have a transformative impact on health professional education, there is much yet to be done before they can become part of mainstream health professional education practice. If health professional education is to be accountable for how its programs are run and developed, then health professional educators will need to be ready to deal with the complex and compelling dynamics of analytics and Big Data. This article provides an overview of these emerging techniques in the context of health professional education.

  3. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Casto, Gordon V.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.; et al.

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a toolbox format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  4. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.; Hetherington, Samuel E.; et al.

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a "toolbox" format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  5. Chemical speciation of individual airborne particles by the combined use of quantitative energy-dispersive electron probe X-ray microanalysis and attenuated total reflection Fourier transform-infrared imaging techniques.

    PubMed

    Song, Young-Chul; Ryu, JiYeon; Malek, Md Abdul; Jung, Hae-Jin; Ro, Chul-Un

    2010-10-01

    In our previous work, it was demonstrated that the combined use of attenuated total reflectance (ATR) FT-IR imaging and quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA), named low-Z particle EPMA, had the potential for characterization of individual aerosol particles. Additionally, the speciation of individual mineral particles was performed on a single-particle level by the combined use of the two techniques, demonstrating that simultaneous use of the two single-particle analytical techniques is powerful for the detailed characterization of externally heterogeneous mineral particle samples and has great potential for characterization of atmospheric mineral dust aerosols. These single-particle analytical techniques provide complementary information on the physicochemical characteristics of the same individual particles, with low-Z particle EPMA providing morphology and elemental concentrations, and ATR-FT-IR imaging providing molecular species, crystal structures, functional groups, and physical states. In this work, this analytical methodology was applied to characterize an atmospheric aerosol sample collected in Incheon, Korea. Overall, 118 individual particles were observed to be primarily NaNO3-containing, Ca- and/or Mg-containing, silicate, and carbonaceous particles, although the internal mixing states of the individual particles proved complicated. This work demonstrates that more detailed physicochemical properties of individual airborne particles can be obtained using this approach than when either the low-Z particle EPMA or ATR-FT-IR imaging technique is used alone.

  6. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation has been introduced to solve problems in the form of systems. By using this technique the following two kinds of problems can be overcome. First, a problem that has an analytical solution but for which running a physical experiment would be too costly in money or lives. Second, a problem that exists but has no analytical solution. In the field of statistical inference the second problem is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutation tests to form pseudo sampling distributions that lead to the solution of problems that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. It also discusses resampling techniques as simulation techniques. Misunderstandings about these two techniques are examined, and successful usages of both techniques are explained.
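    The resampling idea described above, forming a pseudo sampling distribution by drawing from the observed data with replacement, can be sketched in a few lines. This is an illustrative sketch, not code from the paper; the data values and interval level are hypothetical:

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=1):
    """Approximate a (1 - alpha) confidence interval for `stat` by
    resampling the observed data with replacement (the bootstrap)."""
    rng = random.Random(seed)
    boot_stats = sorted(
        stat([rng.choice(sample) for _ in range(len(sample))])
        for _ in range(n_boot)
    )
    lo = boot_stats[int((alpha / 2) * n_boot)]
    hi = boot_stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# hypothetical observations; the bootstrap needs no analytical formula
# for the sampling distribution of the mean
data = [4.1, 5.2, 6.0, 4.8, 5.5, 5.1, 4.9, 6.3, 5.0, 5.7]
low, high = bootstrap_ci(data)
print(low, high)
```

    The same machinery applies to statistics whose sampling distribution has no closed form, which is exactly the second kind of problem the abstract mentions.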

  7. Computational overlay metrology with adaptive data analytics

    NASA Astrophysics Data System (ADS)

    Schmitt-Weaver, Emil; Subramony, Venky; Ullah, Zakir; Matsunobu, Masazumi; Somasundaram, Ravin; Thomas, Joel; Zhang, Linmiao; Thul, Klaus; Bhattacharyya, Kaustuve; Goossens, Ronald; Lambregts, Cees; Tel, Wim; de Ruiter, Chris

    2017-03-01

    With photolithography as the fundamental patterning step in the modern nanofabrication process, every wafer within a semiconductor fab will pass through a lithographic apparatus multiple times. With more than 20,000 sensors producing more than 700 GB of data per day across multiple subsystems, the combination of a light source and lithographic apparatus provides a massive amount of information for data analytics. This paper outlines how adaptive analytics, i.e., data analysis tools and techniques that extend insight into data traditionally considered unmanageably large, can use data collected before the wafer is exposed to detect small process-dependent wafer-to-wafer changes in overlay.

  8. METHODOLOGY TO EVALUATE THE POTENTIAL FOR GROUND WATER CONTAMINATION FROM GEOTHERMAL FLUID RELEASES

    EPA Science Inventory

    This report provides analytical methods and graphical techniques to predict potential ground water contamination from geothermal energy development. Overflows and leaks from ponds, pipe leaks, well blowouts, leaks from well casing, and migration from injection zones can be handle...

  9. The use of surface-enhanced Raman scattering for detecting molecular evidence of life in rocks, sediments, and sedimentary deposits.

    PubMed

    Bowden, Stephen A; Wilson, Rab; Cooper, Jonathan M; Parnell, John

    2010-01-01

    Raman spectroscopy is a versatile analytical technique capable of characterizing the composition of both inorganic and organic materials. Consequently, it is frequently suggested as a payload on many planetary landers. Only approximately 1 in every 10^6 photons is Raman scattered; therefore, the detection of trace quantities of an analyte dispersed in a sample matrix can be much harder to achieve. To overcome this, surface-enhanced Raman scattering (SERS) and surface-enhanced resonance Raman scattering (SERRS) both provide greatly enhanced signals (enhancements between 10^5 and 10^9) through the analyte's interaction with the locally generated surface plasmons that occur at a "roughened" or nanostructured metallic surface (e.g., Cu, Au, and Ag). Both SERS and SERRS may therefore provide a viable technique for trace analysis of samples. In this paper, we describe the development of SERS assays for analyzing trace amounts of compounds present in the solvent extracts of sedimentary deposits. These assays were used to detect biological pigments present in an Arctic microoasis (a small locale of elevated biological productivity) and its detrital regolith, characterize the pigmentation of microbial mats around hydrothermal springs, and detect fossil organic matter in hydrothermal deposits. These field study examples demonstrate that SERS technology is sufficiently mature to be applied to many astrobiological analog studies on Earth. Many current and proposed imaging systems intended for remote deployment already possess the instrumental components needed for SERS. The addition of wet chemistry sample processing facilities to these instruments could yield field-deployable analytical instruments with a broadened analytical window for detecting organic compounds with a biological or geological origin.

  10. The VAST Challenge: History, Scope, and Outcomes: An introduction to the Special Issue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Grinstein, Georges; Whiting, Mark A.

    2014-10-01

    Visual analytics aims to facilitate human insight from complex data via a combination of visual representations, interaction techniques, and supporting algorithms. To create new tools and techniques that achieve this goal requires that researchers have an understanding of analytical questions to be addressed, data that illustrates the complexities and ambiguities found in realistic analytic settings, and methods for evaluating whether plausible insights are gained through use of the new methods. However, researchers do not, generally speaking, have access to analysts who can articulate their problems or to the operational data used for analysis. To fill this gap, the Visual Analytics Science and Technology (VAST) Challenge has been held annually since 2006. The VAST Challenge provides an opportunity for researchers to experiment with realistic but not real problems, using realistic synthetic data with known events embedded. Since its inception, the VAST Challenge has evolved along with the visual analytics research community to pose more complex challenges, ranging from text analysis to video analysis to large-scale network log analysis. The seven years of the VAST Challenge have seen advancements in research and development, education, evaluation, and in the challenge process itself. This special issue of Information Visualization highlights some of the noteworthy advancements in each of these areas. Some of these papers focus on important research questions related to the challenge itself, and other papers focus on innovative research that has been shaped by participation in the challenge. This paper describes the VAST Challenge process and benefits in detail. It also provides an introduction to and context for the remaining papers in the issue.

  11. Protein molecular data from ancient (>1 million years old) fossil material: pitfalls, possibilities and grand challenges.

    PubMed

    Schweitzer, Mary Higby; Schroeter, Elena R; Goshe, Michael B

    2014-07-15

    Advances in resolution and sensitivity of analytical techniques have provided novel applications, including the analyses of fossil material. However, the recovery of original proteinaceous components from very old fossil samples (defined as >1 million years (1 Ma) from previously named limits in the literature) is far from trivial. Here, we discuss the challenges to recovery of proteinaceous components from fossils, and the need for new sample preparation techniques, analytical methods, and bioinformatics to optimize and fully utilize the great potential of information locked in the fossil record. We present evidence for survival of original components across geological time, and discuss the potential benefits of recovery, analyses, and interpretation of fossil materials older than 1 Ma, both within and outside of the fields of evolutionary biology.

  12. "Cork taint" responsible compounds. Determination of haloanisoles and halophenols in cork matrix: A review.

    PubMed

    Tarasov, Andrii; Rauhut, Doris; Jung, Rainer

    2017-12-01

    Analytical methods for the quantification of haloanisoles and halophenols in cork matrix are summarized in the current review. Sample-preparation and sample-treatment techniques are compared and discussed from the perspective of their efficiency, time and extractant optimization, and ease of performance. The primary interest of these analyses is usually 2,4,6-trichloroanisole (TCA), the major wine contaminant among the haloanisoles. Two concepts of TCA determination are described in the review: releasable TCA and total TCA analyses. Chromatographic, bioanalytical and sensorial methods are compared according to their application in the cork industry and in scientific investigations. Finally, it is shown that modern analytical techniques can provide the required sensitivity, selectivity and repeatability for haloanisole and halophenol determination. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Development of an external ceramic insulation for the space shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Tanzilli, R. A. (Editor)

    1972-01-01

    The development and evaluation of a family of reusable external insulation systems for use on the space shuttle orbiter is discussed. The material development and evaluation activities are described. Additional information is provided on the development of an analytical micromechanical model of the reusable insulation and the development of techniques for reducing the heat transfer. Design data on reusable insulation systems and test techniques used for design data generation are included.

  14. Data Confidentiality Challenges in Big Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yin, Jian; Zhao, Dongfang

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, many useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme to provide provably secure data confidentiality and discuss various techniques to optimize the performance of such a system.

  15. Next-Generation Technologies for Multiomics Approaches Including Interactome Sequencing

    PubMed Central

    Ohashi, Hiroyuki; Miyamoto-Sato, Etsuko

    2015-01-01

    The development of high-speed analytical techniques such as next-generation sequencing and microarrays allows high-throughput analysis of biological information at a low cost. These techniques contribute to medical and bioscience advancements and provide new avenues for scientific research. Here, we outline a variety of new innovative techniques and discuss their use in omics research (e.g., genomics, transcriptomics, metabolomics, proteomics, and interactomics). We also discuss the possible applications of these methods, including an interactome sequencing technology that we developed, in future medical and life science research. PMID:25649523

  16. Synchrotron based mass spectrometry to investigate the molecular properties of mineral-organic associations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Suet Yi; Kleber, Markus; Takahashi, Lynelle K.

    2013-04-01

    Soil organic matter (OM) is important because its decay drives life processes in the biosphere. Analysis of organic compounds in geological systems is difficult because of their intimate association with mineral surfaces. To date there is no procedure capable of quantitatively separating organic from mineral phases without creating artifacts or mass loss. Therefore, analytical techniques that can (a) generate information about both organic and mineral phases simultaneously and (b) allow the examination of predetermined high-interest regions of the sample, as opposed to conventional bulk analytical techniques, are valuable. Laser Desorption Synchrotron Postionization (synchrotron-LDPI) mass spectrometry is introduced as a novel analytical tool to characterize the molecular properties of organic compounds in mineral-organic samples from terrestrial systems, and it is demonstrated that, when combined with Secondary Ion Mass Spectrometry (SIMS), it can provide complementary information on mineral composition. Mass spectrometry along a decomposition gradient in density fractions verifies the consistency of our results with bulk analytical techniques. We further demonstrate that by changing laser and photoionization energies, variations in the molecular stability of organic compounds associated with mineral surfaces can be determined. The combination of synchrotron-LDPI and SIMS shows that the energetic conditions involved in desorption and ionization of organic matter may be a greater determinant of mass spectral signatures than the inherent molecular structure of the organic compounds investigated. The latter has implications for molecular models of natural organic matter that are based on mass spectrometric information.

  17. MEASUREMENT OF PYRETHROID RESIDUES IN ENVIRONMENTAL AND FOOD SAMPLES BY ENHANCED SOLVENT EXTRACTION/SUPERCRITICAL FLUID EXTRACTION COUPLED WITH GAS CHROMATOGRAPHY-TANDEM MASS SPECTROMETRY

    EPA Science Inventory

    The abstract summarizes pyrethroid methods development research. It provides a summary of sample preparation and analytical techniques such as supercritical fluid extraction, enhanced solvent extraction, gas chromatography, and tandem mass spectrometry.

  18. Philosophical Inquiry: (An Investigation of Basic Philosophical Presuppositions) Teacher's Manual.

    ERIC Educational Resources Information Center

    Institute for Services to Education, Inc., Washington, DC.

    This guide provides teaching techniques for an undergraduate philosophy course. Students examine specific philosophic issues related to the black person's experience. They are required to apply critical and analytical procedures leading to philosophical investigations of topics of both philosophical and nonphilosophical origins. The teaching…

  19. Characteristics of Effective Leadership Networks

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Azah, Vera Ndifor

    2016-01-01

    Purpose: The purpose of this paper is to inquire about the characteristics of effective school leadership networks and the contribution of such networks to the development of individual leaders' professional capacities. Design/methodology/approach: The study used path-analytic techniques with survey data provided by 450 school and district leaders…

  20. GUIDELINES FOR THE APPLICATION OF SEM/EDX ANALYTICAL TECHNIQUES FOR FINE AND COARSE PM SAMPLES

    EPA Science Inventory

    Scanning Electron Microscopy (SEM) coupled with Energy-Dispersive X-ray analysis (EDX) is a powerful tool in the characterization and source apportionment of environmental particulate matter (PM), providing size, chemistry, and morphology of particles as small as a few tenths ...

  1. DOSY Analysis of Micromolar Analytes: Resolving Dilute Mixtures by SABRE Hyperpolarization.

    PubMed

    Reile, Indrek; Aspers, Ruud L E G; Tyburn, Jean-Max; Kempf, James G; Feiters, Martin C; Rutjes, Floris P J T; Tessari, Marco

    2017-07-24

    DOSY is an NMR spectroscopy technique that resolves resonances according to the analytes' diffusion coefficients. It has found use in correlating NMR signals and estimating the number of components in mixtures. Applications of DOSY to dilute mixtures are, however, held back by excessively long measurement times. We demonstrate herein how the enhanced NMR sensitivity provided by SABRE hyperpolarization allows DOSY analysis of low-micromolar mixtures, thus reducing the concentration requirements by at least 100-fold. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
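    As background, DOSY extracts each diffusion coefficient D by fitting the Stejskal-Tanner attenuation I = I0·exp(-D·b), where b collects the gradient parameters. A minimal log-linear fit is sketched below with synthetic noiseless data; this is illustrative only and is not the paper's SABRE procedure:

```python
import math

def fit_diffusion(b_values, intensities):
    """Recover D from I = I0 * exp(-D * b) by least-squares fitting
    ln(I) against b; the slope of the line is -D."""
    x = b_values
    y = [math.log(i) for i in intensities]
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
            sum((xi - xbar) ** 2 for xi in x)
    return -slope  # D, in the reciprocal units of b

# synthetic attenuation curve for an assumed D = 5e-10 m^2/s
D_true = 5e-10
b = [0.0, 2e8, 4e8, 6e8, 8e8, 1e9]          # hypothetical b-values (s/m^2)
I = [math.exp(-D_true * bi) for bi in b]     # normalized intensities
D_fit = fit_diffusion(b, I)
print(D_fit)
```

    In a real mixture each resonance yields its own D, and resonances are grouped by diffusion coefficient, which is how DOSY separates components.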

  2. Mixed Initiative Visual Analytics Using Task-Driven Recommendations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Cramer, Nicholas O.; Israel, David

    2015-12-07

    Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.

  3. Model of separation performance of bilinear gradients in scanning format counter-flow gradient electrofocusing techniques.

    PubMed

    Shameli, Seyed Mostafa; Glawdel, Tomasz; Ren, Carolyn L

    2015-03-01

    Counter-flow gradient electrofocusing allows the simultaneous concentration and separation of analytes by generating a gradient in the total velocity of each analyte, the sum of its electrophoretic velocity and the bulk counter-flow velocity. In the scanning format, the bulk counter-flow velocity varies with time so that a number of analytes with large differences in electrophoretic mobility can be sequentially focused and passed by a single detection point. Studies have shown that nonlinear (e.g., bilinear) velocity gradients along the separation channel can improve both peak capacity and separation resolution simultaneously, which cannot be achieved with a single linear gradient. Developing an effective separation system based on the scanning counter-flow nonlinear gradient electrofocusing technique usually requires extensive experimental and numerical effort, which can be reduced significantly with the help of analytical models for design optimization and for guiding experimental studies. Therefore, this study focuses on developing an analytical model to evaluate the separation performance of scanning counter-flow bilinear gradient electrofocusing methods. In particular, this model allows a bilinear gradient and a scanning rate to be optimized for the desired separation performance. The results based on this model indicate that any bilinear gradient provides a higher separation resolution (up to 100%) compared to the linear case. This model is validated by numerical studies. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
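    The focusing mechanism described above can be sketched numerically: an analyte accumulates where its electrophoretic velocity exactly cancels the local counter-flow velocity, i.e., where the total velocity crosses zero. The toy model below (hypothetical velocities and slopes, not the authors' analytical model) locates that zero crossing for a bilinear profile:

```python
def bulk_velocity(x, v0, slope1, slope2, x_break):
    """Bilinear counter-flow profile: slope1 before x_break, slope2 after."""
    if x <= x_break:
        return v0 + slope1 * x
    return v0 + slope1 * x_break + slope2 * (x - x_break)

def focus_position(v_ep, v0, slope1, slope2, x_break, length=10.0, n=100000):
    """Scan the channel for the point where total velocity
    (electrophoretic + counter-flow) changes sign: the focus point."""
    prev_x = 0.0
    prev_v = v_ep + bulk_velocity(0.0, v0, slope1, slope2, x_break)
    for i in range(1, n + 1):
        x = length * i / n
        v = v_ep + bulk_velocity(x, v0, slope1, slope2, x_break)
        if prev_v > 0 >= v:  # sign change: interpolate the crossing
            return prev_x + (x - prev_x) * prev_v / (prev_v - v)
        prev_x, prev_v = x, v
    return None  # no focus point inside the channel

# two analytes with different electrophoretic velocities focus at
# different positions along the same bilinear gradient
xa = focus_position(v_ep=0.8, v0=0.0, slope1=-0.5, slope2=-0.1, x_break=1.0)
xb = focus_position(v_ep=0.3, v0=0.0, slope1=-0.5, slope2=-0.1, x_break=1.0)
print(xa, xb)
```

    The shallower second slope spreads high-mobility analytes over a longer stretch of channel, which is one intuition for why a bilinear gradient can improve resolution over a single steep linear one.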

  4. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogenous Space-Time Data

    NASA Astrophysics Data System (ADS)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

    Spatiotemporal (ST) analytics applied to major ST data sources such as USGS, NOAA, the World Bank, and the World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes on a local and global level. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16,000+ attributes covering 200+ countries for over 50 years from over 30 major sources, and 2) a novel online ST exploration and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and describe how others may freely access the tool.

  5. The role of chromatographic and chiroptical spectroscopic techniques and methodologies in support of drug discovery for atropisomeric drug inhibitors of Bruton's tyrosine kinase.

    PubMed

    Dai, Jun; Wang, Chunlei; Traeger, Sarah C; Discenza, Lorell; Obermeier, Mary T; Tymiak, Adrienne A; Zhang, Yingru

    2017-03-03

    Atropisomers are stereoisomers resulting from hindered bond rotation. From the synthesis of pure atropisomers and the characterization of their interconversion thermodynamics to the investigation of biological stereoselectivity, the evaluation of drug candidates subject to atropisomerism creates special challenges and can be complicated in both early drug discovery and later drug development. In this paper, we demonstrate an array of analytical techniques and systematic approaches to study the atropisomerism of drug molecules to meet these challenges. Using a case study of Bruton's tyrosine kinase (BTK) inhibitor drug candidates at Bristol-Myers Squibb, we present the analytical strategies and methodologies used during drug discovery, including the detection of atropisomers, the determination of their relative composition, the identification of relative chirality, the isolation of individual atropisomers, the evaluation of interconversion kinetics, and the characterization of chiral stability in the solid state and in solution. In vivo and in vitro stereo-stability and stereo-selectivity were investigated, as well as the pharmacological significance of any changes in atropisomer ratios. Techniques applied in these studies include analytical and preparative enantioselective supercritical fluid chromatography (SFC), enantioselective high performance liquid chromatography (HPLC), circular dichroism (CD), and mass spectrometry (MS). Our experience illustrates how atropisomerism can be a very complicated issue in drug discovery and why a thorough understanding of this phenomenon is necessary to provide guidance for pharmaceutical development. Analytical techniques and methodologies facilitate key decisions during the discovery of atropisomeric drug candidates by characterizing time-dependent physicochemical properties that can have significant biological implications and relevance to pharmaceutical development plans. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques integrate several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid-phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, make it possible to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.

    PubMed

    Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz

    2009-07-15

    The existing solid-phase microextraction (SPME) kinetic calibration technique, which uses the desorption of preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which has limited the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic environmental variability in temperature, turbulence, and analyte concentration, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of the passive samplings in the flow-through system illustrated that the effect of the environmental variables was successfully compensated by the kinetic calibration technique, and all extracted analytes could be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and will also be useful for other microextraction techniques.
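    For context, SPME kinetic calibration rests on the symmetry of first-order uptake and desorption: the analyte extracted follows n(t)/n_e = 1 - exp(-a·t) while the preloaded standard remaining follows Q(t)/q0 = exp(-a·t), so n/n_e + Q/q0 = 1. The sketch below applies that relation to hypothetical numbers; it illustrates only the single-standard case, not the one-calibrant extension developed in the paper:

```python
# Kinetic-calibration symmetry (all numbers hypothetical):
#   n(t)/n_e = 1 - exp(-a t)   fraction of equilibrium analyte extracted
#   Q(t)/q0  =     exp(-a t)   fraction of preloaded standard remaining
# so the measured standard loss tells us how far the analyte is from
# equilibrium, without having to wait for equilibrium.

q0 = 100.0   # preloaded standard amount on the fiber (ng)
Q_t = 36.8   # standard remaining after sampling time t (ng)
n_t = 12.6   # analyte extracted over the same time (ng)

equilibration_fraction = 1.0 - Q_t / q0        # equals n(t)/n_e
n_equilibrium = n_t / equilibration_fraction   # extrapolated n_e

# with a separately calibrated fiber constant K_fs * V_f (hypothetical
# value below), the sample concentration follows from n_e = K_fs*V_f*C_s
Kfs_Vf = 2.5                                   # assumed fiber constant (mL)
C_s = n_equilibrium / Kfs_Vf                   # concentration (ng/mL)
print(n_equilibrium, C_s)
```

    The one-calibrant idea in the abstract generalizes this so that a single standard's desorption calibrates analytes with different properties, which is what removes the matched-standard requirement.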

  8. In vivo imaging of sulfotransferases

    DOEpatents

    Barrio, Jorge R; Kepe, Vladimir; Small, Gary W; Satyamurthy, Nagichettiar

    2013-02-12

    Radiolabeled tracers for sulfotransferases (SULTs), their synthesis, and their use are provided. Included are substituted phenols, naphthols, coumarins, and flavones radiolabeled with ^18F, ^123I, ^124I, ^125I, or ^11C. Also provided are in vivo techniques for using these and other tracers as analytical and diagnostic tools to study sulfotransferase distribution and activity, in health and disease, and to evaluate therapeutic interventions.

  9. Characterization of carrier erythrocytes for biosensing applications

    NASA Astrophysics Data System (ADS)

    Bustamante López, Sandra C.; Meissner, Kenith E.

    2017-09-01

    The abundance, mobility, and carrying capacity of erythrocytes make them attractive as a platform for blood analyte sensing as well as for drug delivery. Sensor-loaded erythrocytes, dubbed erythrosensors, could be reinfused into the bloodstream, excited noninvasively through the skin, and used to provide measurements of analyte levels in the bloodstream. Several techniques exist to load erythrocytes, thus creating carrier erythrocytes. However, their cellular characteristics remain largely unstudied. Changes in cellular characteristics lead to removal from the bloodstream. We hypothesize that erythrosensors need to maintain native erythrocytes' (NEs) characteristics to serve as a long-term sensing platform. Here, we investigate two loading techniques and the properties of the resulting erythrosensors. For loading, hypotonic dilution requires a hypotonic solution, while electroporation relies on electrical pulses to perforate the erythrocyte membrane. We analyze the resulting erythrosensor signal, size, morphology, and hemoglobin content. Although the resulting erythrosensors exhibited morphological changes, their size was comparable to that of NEs. The hypotonic dilution technique was found to load erythrosensors much more efficiently than electroporation, and the sensors were loaded throughout the volume of the erythrosensors. Finally, both techniques resulted in significant loss of hemoglobin. This study points to the need for continued development of loading techniques that better preserve NE characteristics.

  10. Flow analysis techniques for phosphorus: an overview.

    PubMed

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review is presented of the implementation and results of the different flow analytical techniques used for the determination of phosphorus. The sources, occurrence and importance of phosphorus, together with several aspects of the analysis and terminology used in the determination of this element, are briefly described. A classification and a brief description of the basis, advantages and disadvantages of the existing flow techniques are also given, namely: segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA). The most relevant manuscripts on the analysis of phosphorus by flow techniques are classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall picture. Finally, the analytical characteristics of numerous flow methods reported in the literature are tabulated, together with their applicability to samples with different matrixes, namely water (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizers, hydroponic solutions, soil extracts and cyanobacterial biofilms.

  11. Study of a two-dimension transient heat propagation in cylindrical coordinates by means of two finite difference methods

    NASA Astrophysics Data System (ADS)

    Dumencu, A.; Horbaniuc, B.; Dumitraşcu, G.

    2016-08-01

    The analytical treatment of unsteady conduction heat transfer under realistic conditions is very difficult (if not insurmountable) because closed-form solutions of the conduction equation exist only for idealized cases. Various techniques have been developed to overcome these difficulties, among them the alternating-direction method and the decomposition method, both particularly suited to two-dimensional heat propagation. The paper applies both techniques in order to verify whether their results agree. The studied case is a long hollow cylinder in which the time-dependent temperature field varies in both the radial and the axial directions. The implicit scheme is used in both methods and involves solving a set of equations for all nodes simultaneously at each time step, successively for each of the two directions. Gaussian elimination is used to solve the set, yielding the nodal temperatures. The two techniques show very good agreement, and since the decomposition method is easier to implement in terms of computer code and running time, it appears to be the more advisable choice.
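    The implicit scheme described above, in which a set of nodal equations is solved simultaneously at every time step, can be sketched in one dimension. This is a minimal illustration with assumed grid size, diffusivity, and boundary temperatures, not the authors' code; the `thomas` routine is the standard specialization of Gaussian elimination to tridiagonal systems:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal, d = rhs."""
    n = len(d)
    cp, dp = np.zeros(n), np.zeros(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Implicit (backward Euler) step for u_t = alpha * u_xx on [0, 1],
# with fixed boundary temperatures u(0) = 1, u(1) = 0.
N, alpha = 51, 1.0
dx, dt = 1.0 / (N - 1), 1e-3
r = alpha * dt / dx**2
u = np.zeros(N)
u[0] = 1.0                                     # initial condition
for _ in range(2000):                          # march toward steady state
    a = np.full(N, -r); b = np.full(N, 1 + 2 * r); c = np.full(N, -r)
    a[0] = c[0] = a[-1] = c[-1] = 0.0
    b[0] = b[-1] = 1.0                         # Dirichlet boundary rows
    d = u.copy(); d[0], d[-1] = 1.0, 0.0
    u = thomas(a, b, c, d)

x = np.linspace(0, 1, N)
print(np.max(np.abs(u - (1 - x))))             # residual vs. the linear steady state
```

    Because the steady-state profile of this one-dimensional problem is linear, the marched solution can be checked against `1 - x` directly; applying the same implicit step alternately in the radial and axial directions is the essence of the alternating-direction method.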

  12. Towards nanometric resolution in multilayer depth profiling: a comparative study of RBS, SIMS, XPS and GDOES.

    PubMed

    Escobar Galindo, Ramón; Gago, Raul; Duday, David; Palacio, Carlos

    2010-04-01

    An increasing amount of effort is currently being directed towards the development of new functionalized nanostructured materials (i.e., multilayers and nanocomposites). Using an appropriate combination of composition and microstructure, it is possible to optimize and tailor the final properties of the material to its final application. The analytical characterization of these new complex nanostructures requires high-resolution analytical techniques that are able to provide information about surface and depth composition at the nanometric level. In this work, we comparatively review the state of the art in four different depth-profiling characterization techniques: Rutherford backscattering spectroscopy (RBS), secondary ion mass spectrometry (SIMS), X-ray photoelectron spectroscopy (XPS) and glow discharge optical emission spectroscopy (GDOES). In addition, we predict future trends in these techniques regarding improvements in their depth resolutions. Subnanometric resolution can now be achieved in RBS using magnetic spectrometry systems. In SIMS, the use of rotating sample holders and oxygen flooding during analysis as well as the optimization of floating low-energy ion guns to lower the impact energy of the primary ions improves the depth resolution of the technique. Angle-resolved XPS provides a very powerful and nondestructive technique for obtaining depth profiling and chemical information within the range of a few monolayers. Finally, the application of mathematical tools (deconvolution algorithms and a depth-profiling model), pulsed sources and surface plasma cleaning procedures is expected to greatly improve GDOES depth resolution.

  13. Individual Human Cell Responses to Low Doses of Chemicals and Radiation Studied by Synchrotron Infrared Spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Martin, Michael C.; Holman, Hoi-Ying N.; Blakely, Eleanor A.; Goth-Goldstein, Regine; McKinney, Wayne R.

    2000-03-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR FTIR microscopy probes intact living cells, providing a composite view of all of the molecular responses and the ability to monitor the responses over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes, combined with other analytical tools, may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low doses of radiation and chemicals. In this study we used high-spatial-resolution SR FTIR vibrational spectromicroscopy at ALS Beamline 1.4.3 as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of oxidative stresses: bleomycin, hydrogen peroxide, and X-rays. We observe spectral changes that are unique to each exogenous stressor. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, bio-compatibility of implant materials, cellular repair mechanisms, self assembly of cellular apparatus, cell differentiation and fetal development.

  14. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    PubMed

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostic aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. This second part takes a more applied flavor: its goal is to summarize the current state of the art of analytical LIBS, provide a contemporary snapshot of LIBS applications, and highlight new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what still needs to be done to consolidate the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy

  15. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step for isolating and concentrating desired components from complex matrices, and it greatly influences their reliable and accurate analysis and the resulting data quality. It is the most labor-intensive and error-prone process in analytical methodology and may therefore limit the analytical performance of target-analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low or zero solvent consumption. Microextraction techniques, such as microextraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS formats (on- and off-line), sorbents, and experimental protocols; factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent applications of MEPS in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  16. Refraction-enhanced backlit imaging of axially symmetric inertial confinement fusion plasmas.

    PubMed

    Koch, Jeffrey A; Landen, Otto L; Suter, Laurence J; Masse, Laurent P; Clark, Daniel S; Ross, James S; Mackinnon, Andrew J; Meezan, Nathan B; Thomas, Cliff A; Ping, Yuan

    2013-05-20

    X-ray backlit radiographs of dense plasma shells can be significantly altered by refraction of x rays that would otherwise travel straight-ray paths, and this effect can be a powerful tool for diagnosing the spatial structure of the plasma being radiographed. We explore the conditions under which refraction effects may be observed, and we use analytical and numerical approaches to quantify these effects for one-dimensional radial opacity and density profiles characteristic of inertial-confinement fusion (ICF) implosions. We also show how analytical and numerical approaches allow approximate radial plasma opacity and density profiles to be inferred from point-projection refraction-enhanced radiography data. This imaging technique can provide unique data on electron density profiles in ICF plasmas that cannot be obtained using other techniques, and the uniform illumination provided by point-like x-ray backlighters eliminates a significant source of uncertainty in inferences of plasma opacity profiles from area-backlit pinhole imaging data when the backlight spatial profile cannot be independently characterized. The technique is particularly suited to in-flight radiography of imploding low-opacity shells surrounding hydrogen ice, because refraction is sensitive to the electron density of the hydrogen plasma even when it is invisible to absorption radiography. It may also provide an alternative approach to timing shockwaves created by the implosion drive, which are currently invisible to absorption radiography.

  17. Nanomanipulation-Coupled Matrix-Assisted Laser Desorption/ Ionization-Direct Organelle Mass Spectrometry: A Technique for the Detailed Analysis of Single Organelles

    NASA Astrophysics Data System (ADS)

    Phelps, Mandy S.; Sturtevant, Drew; Chapman, Kent D.; Verbeck, Guido F.

    2016-02-01

    We describe a novel technique combining precise organelle microextraction with deposition and matrix-assisted laser desorption/ionization (MALDI) for rapid, minimally invasive mass spectrometry (MS) analysis of single organelles from living cells. A dual-positioner nanomanipulator workstation was utilized both for extraction of organelle content and for precise co-deposition of analyte and matrix solution for MALDI-direct organelle mass spectrometry (DOMS) analysis. Here, the triacylglycerol (TAG) profiles of single lipid droplets from 3T3-L1 adipocytes were acquired and the results validated with nanoelectrospray ionization (NSI) MS. The results demonstrate the utility of the MALDI-DOMS technique: in comparison to NSI-DOMS, it enabled longer mass analysis times, higher ionization efficiency, MS imaging of the co-deposited spot, and subsequent MS/MS analysis of localized lipid content. This method provides selective organellar resolution, complementing current biochemical analyses and opening the way for subcellular studies in which limited sample and analyte volumes are of concern.

  18. Recent trends in analytical methods and separation techniques for drugs of abuse in hair.

    PubMed

    Baciu, T; Borrull, F; Aguilar, C; Calull, M

    2015-01-26

    Hair analysis of drugs of abuse has been a subject of growing interest from a clinical, social and forensic perspective for years because of the broad time detection window after intake in comparison to urine and blood analysis. Over the last few years, hair analysis has gained increasing attention and recognition for the retrospective investigation of drug abuse in a wide variety of contexts, shown by the large number of applications developed. This review aims to provide an overview of the state of the art and the latest trends used in the literature from 2005 to the present in the analysis of drugs of abuse in hair, with a special focus on separation analytical techniques and their hyphenation with mass spectrometry detection. The most recently introduced sample preparation techniques are also addressed in this paper. The main strengths and weaknesses of all of these approaches are critically discussed by means of relevant applications. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Fabrication of a Dipole-assisted Solid Phase Extraction Microchip for Trace Metal Analysis in Water Samples

    PubMed Central

    Chen, Ping-Hung; Chen, Shun-Niang; Tseng, Sheng-Hao; Deng, Ming-Jay; Lin, Yang-Wei; Sun, Yuh-Chang

    2016-01-01

    This paper describes a fabrication protocol for a dipole-assisted solid phase extraction (SPE) microchip available for trace metal analysis in water samples. A brief overview of the evolution of chip-based SPE techniques is provided. This is followed by an introduction to specific polymeric materials and their role in SPE. To develop an innovative dipole-assisted SPE technique, a chlorine (Cl)-containing SPE functionality was implanted into a poly(methyl methacrylate) (PMMA) microchip. Herein, diverse analytical techniques including contact angle analysis, Raman spectroscopic analysis, and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) analysis were employed to validate the utility of the implantation protocol of the C-Cl moieties on the PMMA. The analytical results of the X-ray absorption near-edge structure (XANES) analysis also demonstrated the feasibility of the Cl-containing PMMA used as an extraction medium by virtue of the dipole-ion interactions between the highly electronegative C-Cl moieties and the positively charged metal ions. PMID:27584954

  20. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.

    PubMed

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang

    2015-10-14

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both the physical and biological sciences for many decades; in TEM, 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and it is gradually maturing into a technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been developed separately in the physical and biological sciences, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions for both hard and soft materials research. It is hoped that this discussion will motivate novel solutions, based on current state-of-the-art techniques, for advanced applications in hybrid matter systems. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    PubMed Central

    Alaidi, Osama; Rames, Matthew J.

    2016-01-01

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both the physical and biological sciences for many decades; in TEM, 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and it is gradually maturing into a technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been developed separately in the physical and biological sciences, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions for both hard and soft materials research. It is hoped that this discussion will motivate novel solutions, based on current state-of-the-art techniques, for advanced applications in hybrid matter systems. PMID:26087941
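    The tilt-series concept at the heart of ET can be illustrated with a toy example: project a two-dimensional "object" onto one axis at many angles, then smear each projection back across the grid (unfiltered back-projection). Real ET uses filtered or iterative reconstruction on noisy, misaligned data; this sketch, with an invented point-mass object, only shows that the back-projected sum peaks at the object's position:

```python
import numpy as np

# Toy object: two point "densities" in the plane, with different weights.
points = np.array([[0.3, 0.0], [-0.2, -0.25]])   # (x, y) positions
weights = np.array([1.0, 0.7])

nbins, lim = 101, 1.0
edges = np.linspace(-lim, lim, nbins + 1)
centers = 0.5 * (edges[:-1] + edges[1:])
angles = np.linspace(0, np.pi, 60, endpoint=False)

# Forward model: 1-D parallel projections at each tilt angle (the "tilt series").
sinogram = np.zeros((len(angles), nbins))
for k, th in enumerate(angles):
    s = points[:, 0] * np.cos(th) + points[:, 1] * np.sin(th)
    sinogram[k], _ = np.histogram(s, bins=edges, weights=weights)

# Unfiltered back-projection: smear each projection back across the grid.
gx, gy = np.meshgrid(centers, centers)
recon = np.zeros_like(gx)
for k, th in enumerate(angles):
    s = gx * np.cos(th) + gy * np.sin(th)
    idx = np.clip(np.searchsorted(edges, s) - 1, 0, nbins - 1)
    recon += sinogram[k][idx]

# The reconstruction should peak near the stronger point at (0.3, 0.0).
iy, ix = np.unravel_index(np.argmax(recon), recon.shape)
print(centers[ix], centers[iy])
```

    The printed coordinates should fall within about one bin (~0.02) of the stronger point; the blur around each peak is the characteristic halo that filtered back-projection or iterative methods are designed to remove.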

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eppich, G.; Kips, R.; Lindvall, R.

    The CUP-2 uranium ore concentrate (UOC) standard reference material, a powder, was produced at the Blind River uranium refinery of Eldorado Resources Ltd. in Canada in 1986. This material was produced as part of a joint effort by the Canadian Certified Reference Materials Project and the Canadian Uranium Producers Metallurgical Committee to develop a certified reference material for uranium concentration and the concentration of several impurity constituents. This standard was developed to satisfy the requirements of the UOC mining and milling industry, and was characterized with this purpose in mind. To produce CUP-2, approximately 25 kg of UOC derived from the Blind River uranium refinery was blended, homogenized, and assessed for homogeneity by X-ray fluorescence (XRF) analysis. The homogenized material was then packaged into bottles, containing 50 g of material each, and distributed for analysis to laboratories in 1986. The CUP-2 UOC standard was characterized by an interlaboratory analysis program involving eight member laboratories, six commercial laboratories, and three additional volunteer laboratories. Each laboratory provided five replicate results on up to 17 analytes, including total uranium concentration and moisture content. The selection of analytical technique was left to each participating laboratory. Uranium was reported on an “as-received” basis; all other analytes (besides moisture content) were reported on a “dry-weight” basis. A bottle containing 25 g of the CUP-2 UOC standard was purchased by LLNL and characterized by the LLNL Nuclear Forensics Group. Non-destructive and destructive analytical techniques were applied to the UOC sample. Information obtained from short-term techniques such as photography, gamma spectrometry, and scanning electron microscopy was used to guide the application of longer-term techniques such as ICP-MS. Some techniques, such as XRF and ICP-MS, provided complementary types of data.
The results indicate that the CUP-2 standard has a natural isotopic ratio, does not appear to have been isotopically enriched or depleted in any way, and was not contaminated by a source of uranium with a non-natural isotopic composition. Furthermore, the lack of 233U and 236U above the instrumental detection limit indicates that this sample was not exposed to a neutron flux, which would have generated one or both of these isotopes in measurable concentrations.
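    The isotopic screening logic described in this record can be illustrated with a toy check of a measured 235U/238U atom ratio against the natural value of about 0.00725. The tolerance and thresholds below are illustrative only, not a forensic criterion:

```python
# Screen a measured 235U/238U atom ratio against the natural value (~0.00725).
# The 5% relative tolerance is an arbitrary choice for illustration.
NATURAL_235_238 = 0.00725

def classify(ratio, tol=0.05):
    """Classify a 235U/238U atom ratio as natural, enriched, or depleted."""
    if abs(ratio - NATURAL_235_238) / NATURAL_235_238 <= tol:
        return "natural"
    return "enriched" if ratio > NATURAL_235_238 else "depleted"

print(classify(0.00724))   # -> natural
print(classify(0.0530))    # -> enriched
print(classify(0.00290))   # -> depleted
```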

  3. Laser vaporization/ionization interface for coupling microscale separation techniques with mass spectrometry

    DOEpatents

    Yeung, Edward S.; Chang, Yu-chen

    1999-06-29

    The present invention provides a laser-induced vaporization and ionization interface for directly coupling microscale separation processes to a mass spectrometer. Vaporization and ionization of the separated analytes are facilitated by the addition of a light-absorbing component to the separation buffer or solvent.

  4. Advances in Adaptive Control Methods

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan

    2009-01-01

    This poster presentation describes recent advances in adaptive control technology developed by NASA. Optimal Control Modification is a novel adaptive law that can improve the performance and robustness of adaptive control systems. A new technique has also been developed that provides an analytical method for computing the time-delay stability margin of adaptive control systems.
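    The flavor of such adaptive laws can be sketched for a scalar plant with an unknown coefficient. This is a generic model-reference adaptive controller with a small damping ("sigma-modification") term standing in for the modification idea; it is not NASA's Optimal Control Modification law, and the plant values and gains are invented for illustration:

```python
# Scalar plant x' = a*x + u with unknown a; reference model xm' = am*xm + bm*r.
a_true, am, bm = 1.0, -2.0, 2.0
gamma, nu = 10.0, 0.01      # adaptation gain; small damping ("modification") gain
dt, T = 1e-3, 10.0
x, xm = 0.0, 0.0
th_x, th_r = 0.0, 0.0       # adaptive feedback/feedforward gains (ideal: -3.0, 2.0)
r = 1.0                     # step reference command
for _ in range(int(T / dt)):
    u = th_x * x + th_r * r       # adaptive control law
    e = x - xm                    # tracking error
    # Gradient adaptive laws plus a damping term that keeps the gains bounded
    # (sigma-modification, shown here as a simple stand-in for such modifications).
    th_x += dt * (-gamma * e * x - nu * th_x)
    th_r += dt * (-gamma * e * r - nu * th_r)
    x += dt * (a_true * x + u)    # forward-Euler plant update
    xm += dt * (am * xm + bm * r) # reference model update
print(abs(x - xm))                # tracking error is small after adaptation
```

    The open-loop plant is unstable (a = 1 > 0), yet the adaptive gains drive the tracking error toward zero without knowledge of `a_true`; the damping term is what such modification laws refine to improve robustness.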

  5. Laser vaporization/ionization interface for coupling microscale separation techniques with mass spectrometry

    DOEpatents

    Yeung, E.S.; Chang, Y.C.

    1999-06-29

    The present invention provides a laser-induced vaporization and ionization interface for directly coupling microscale separation processes to a mass spectrometer. Vaporization and ionization of the separated analytes are facilitated by the addition of a light-absorbing component to the separation buffer or solvent. 8 figs.

  6. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  7. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both the carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
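    Inter-laboratory variability of the kind reported in round robins is conventionally summarized by a repeatability (within-laboratory) and a reproducibility (within- plus between-laboratory) standard deviation. A sketch of that decomposition, in the style of ASTM E691, using invented acid-number replicates:

```python
import numpy as np

# Hypothetical round-robin data: acid number (mg KOH/g), 4 labs x 5 replicates.
# These values are invented for illustration, not from the study above.
data = np.array([
    [52.1, 51.8, 52.4, 52.0, 51.9],
    [53.0, 53.3, 52.8, 53.1, 52.9],
    [51.5, 51.9, 51.6, 51.4, 51.8],
    [52.6, 52.2, 52.5, 52.7, 52.4],
])
p, n = data.shape
lab_means = data.mean(axis=1)
grand = lab_means.mean()

# Repeatability s_r: pooled within-lab standard deviation (equal replicate counts).
s_r = np.sqrt(np.mean(data.var(axis=1, ddof=1)))
# Between-lab variance component, floored at zero.
s_L2 = max(lab_means.var(ddof=1) - s_r**2 / n, 0.0)
# Reproducibility s_R combines within- and between-lab components.
s_R = np.sqrt(s_r**2 + s_L2)
print(grand, s_r, s_R)
```

    An "acceptable inter-laboratory variability" verdict then amounts to comparing `s_R` (or the derived reproducibility limit) against the precision the method is expected to deliver.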

  8. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both the carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  9. Experimental and analytical determination of stability parameters for a balloon tethered in a wind

    NASA Technical Reports Server (NTRS)

    Redd, L. T.; Bennett, R. M.; Bland, S. R.

    1973-01-01

    Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.

  10. Uranium determination in natural water by the fission track technique

    USGS Publications Warehouse

    Reimer, G.M.

    1975-01-01

    The fission track technique, utilizing the neutron-induced fission of uranium-235, provides a versatile analytical method for the routine analysis of uranium in liquid samples of natural water. A detector is immersed in the sample and both are irradiated. The fission track density observed in the detector is directly proportional to the uranium concentration. The specific advantages of this technique are: (1) only a small quantity of sample, typically 0.1-1 ml, is needed; (2) no sample concentration is necessary; (3) it is capable of providing analyses with a lower reporting limit of 1 µg per liter; and (4) the actual time spent on an analysis can be only a few minutes. This paper describes the method. © 1975.
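    Because track density is directly proportional to uranium concentration, a single co-irradiated standard fixes the calibration. A sketch with invented counts and areas:

```python
# Fission-track counting: track density is directly proportional to the uranium
# concentration, so one co-irradiated standard calibrates the method.
# All numbers below are invented for illustration.
std_conc = 10.0                       # ug/L, standard solution
std_tracks, std_area = 450, 1.0e-2    # counted tracks, counted area (cm^2)
unk_tracks, unk_area = 180, 1.0e-2    # same quantities for the unknown sample

k = (std_tracks / std_area) / std_conc   # tracks/cm^2 per (ug/L)
unk_conc = (unk_tracks / unk_area) / k
print(unk_conc)                          # -> 4.0 (ug/L)
```

    In practice the counting statistics set the uncertainty: with Poisson counting, the relative standard deviation of each density is roughly 1/sqrt(tracks counted).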

  11. The edge of chaos: A nonlinear view of psychoanalytic technique.

    PubMed

    Galatzer-Levy, Robert M

    2016-04-01

    The field of nonlinear dynamics (or chaos theory) provides ways to expand concepts of psychoanalytic process that have implications for the technique of psychoanalysis. This paper describes how the concepts of "the edge of chaos," emergence, attractors, and coupled oscillators can help shape analytic technique, resulting in an approach to doing analysis that is at once freer and more firmly grounded in an enlarged understanding of how psychoanalysis works than some current recommendations about technique. Illustrations from a lengthy analysis of an analysand with obsessive-compulsive disorder show this approach in action. Copyright © 2016 Institute of Psychoanalysis.

  12. Numerical modeling of pollutant transport using a Lagrangian marker particle technique

    NASA Technical Reports Server (NTRS)

    Spaulding, M.

    1976-01-01

    A derivation and computer code were developed for the three-dimensional mass transport equation, using a particle-in-cell solution technique, to solve coastal-zone waste discharge problems in which particles are a major component of the waste. Improvements in the particle movement techniques are suggested and typical examples are illustrated. Preliminary model comparisons with analytic solutions for an instantaneous point release in a uniform flow show good results in resolving the waste motion. The findings to date indicate that this computational model will provide a useful technique for studying the motion of sediment, dredged spoils, and other particulate waste commonly deposited in coastal waters.
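    The marker-particle idea, and the comparison against the analytic solution for an instantaneous point release in a uniform flow, can be sketched as a one-dimensional random walk. Velocity, diffusivity, and particle count here are assumed values, not the report's:

```python
import numpy as np

# Lagrangian marker particles, 1-D sketch: an instantaneous point release
# advected by a uniform current u and spread by diffusivity D. The analytic
# solution is a Gaussian centered at u*t with variance 2*D*t, which the
# particle cloud should reproduce.
rng = np.random.default_rng(0)
npart, u, D = 20000, 0.5, 0.1      # particles, velocity (m/s), diffusivity (m^2/s)
dt, nsteps = 0.01, 1000            # -> t = 10 s
x = np.zeros(npart)                # all mass released at x = 0
for _ in range(nsteps):
    # Advection plus the random-walk step equivalent to Fickian diffusion.
    x += u * dt + np.sqrt(2 * D * dt) * rng.standard_normal(npart)

t = dt * nsteps
print(x.mean(), x.var())           # expect ~ u*t = 5.0 and ~ 2*D*t = 2.0
```

    Binning the particles onto a grid at each step (the "cell" half of particle-in-cell) recovers the concentration field that an Eulerian solver would produce, without the numerical diffusion of low-order advection schemes.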

  13. Specialized data analysis of SSME and advanced propulsion system vibration measurements

    NASA Technical Reports Server (NTRS)

    Coffin, Thomas; Swanson, Wayne L.; Jong, Yen-Yi

    1993-01-01

    The basic objectives of this contract were to perform detailed analysis and evaluation of dynamic data obtained during Space Shuttle Main Engine (SSME) test and flight operations, including analytical/statistical assessment of component dynamic performance, and to continue the development and implementation of analytical/statistical models to effectively define nominal component dynamic characteristics, detect anomalous behavior, and assess machinery operational conditions. This study was to provide timely assessment of engine component operational status, identify probable causes of malfunction, and define feasible engineering solutions. The work was performed under three broad tasks: (1) Analysis, Evaluation, and Documentation of SSME Dynamic Test Results; (2) Data Base and Analytical Model Development and Application; and (3) Development and Application of Vibration Signature Analysis Techniques.

  14. Drug screening in clinical or forensic toxicology: are there differences?

    PubMed

    Gerostamoulos, Dimitri; Beyer, Jochen

    2010-09-01

    Legal and medical practitioners need to remember that, with respect to drug analysis, there are two distinct disciplines in analytical toxicology concerned with human biological matrices, namely clinical and forensic toxicology. Both fields use similar analytical techniques designed to detect and quantify drugs, chemicals and poisons in fluids or tissues. In clinical toxicology, analytical results help to specify the appropriate treatment of a poisoned or intoxicated patient. In forensic toxicology, the results often play a vital role in determining the possible impairment or behavioural changes in an individual, or the contribution of drugs or poisons to death in a medico-legal investigation. This column provides an overview of the similarities and differences inherent in clinical and forensic toxicology.

  15. Bioassays as one of the Green Chemistry tools for assessing environmental quality: A review.

    PubMed

    Wieczerzak, M; Namieśnik, J; Kudłak, B

    2016-09-01

    For centuries, mankind has contributed to irreversible environmental changes, but due to the modern science of recent decades, scientists are able to assess the scale of this impact. The introduction of laws and standards to ensure environmental cleanliness requires comprehensive environmental monitoring, which should also meet the requirements of Green Chemistry. The broad spectrum of Green Chemistry principle applications should also include all of the techniques and methods of pollutant analysis and environmental monitoring. The classical methods of chemical analyses do not always match the twelve principles of Green Chemistry, and they are often expensive and employ toxic and environmentally unfriendly solvents in large quantities. These solvents can generate hazardous and toxic waste while consuming large volumes of resources. Therefore, there is a need to develop reliable techniques that would not only meet the requirements of Green Analytical Chemistry, but they could also complement and sometimes provide an alternative to conventional classical analytical methods. These alternatives may be found in bioassays. Commercially available certified bioassays often come in the form of ready-to-use toxkits, and they are easy to use and relatively inexpensive in comparison with certain conventional analytical methods. The aim of this study is to provide evidence that bioassays can be a complementary alternative to classical methods of analysis and can fulfil Green Analytical Chemistry criteria. The test organisms discussed in this work include single-celled organisms, such as cell lines, fungi (yeast), and bacteria, and multicellular organisms, such as invertebrate and vertebrate animals and plants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Hierarchical zwitterionic modification of a SERS substrate enables real-time drug monitoring in blood plasma

    NASA Astrophysics Data System (ADS)

    Sun, Fang; Hung, Hsiang-Chieh; Sinclair, Andrew; Zhang, Peng; Bai, Tao; Galvan, Daniel David; Jain, Priyesh; Li, Bowen; Jiang, Shaoyi; Yu, Qiuming

    2016-11-01

    Surface-enhanced Raman spectroscopy (SERS) is an ultrasensitive analytical technique with molecular specificity, making it an ideal candidate for therapeutic drug monitoring (TDM). However, in critical diagnostic media including blood, nonspecific protein adsorption coupled with weak surface affinities and small Raman activities of many analytes hinder the TDM application of SERS. Here we report a hierarchical surface modification strategy, first by coating a gold surface with a self-assembled monolayer (SAM) designed to attract or probe for analytes and then by grafting a non-fouling zwitterionic polymer brush layer to effectively repel protein fouling. We demonstrate how this modification can enable TDM applications by quantitatively and dynamically measuring the concentrations of several analytes--including an anticancer drug (doxorubicin), several TDM-requiring antidepressant and anti-seizure drugs, fructose and blood pH--in undiluted plasma. This hierarchical surface chemistry is widely applicable to many analytes and provides a generalized platform for SERS-based biosensing in complex real-world media.

  17. Complementary use of PIXE-alpha and XRF portable systems for the non-destructive and in situ characterization of gemstones in museums

    NASA Astrophysics Data System (ADS)

    Pappalardo, L.; Karydas, A. G.; Kotzamani, N.; Pappalardo, G.; Romano, F. P.; Zarkadas, Ch.

    2005-09-01

    Gemstones on gold Hellenistic jewelry (late 4th century BC to 1st century AD), exhibited at the Benaki Museum of Athens, were analyzed in situ by means of two non-destructive and portable analytical techniques. The composition of major and minor elements was determined using a new portable PIXE-alpha spectrometer. The analytical features of this spectrometer allow the determination of matrix elements from Na to Zn through the K-lines and the determination of higher atomic number elements via the L- or M-lines. The red stones analyzed were revealed to be red garnets, displaying a compositional range from Mg-rich garnet to Fe-rich garnet. The complementary use of a portable XRF spectrometer provided additional information on some trace elements (Cr and Y), which are considered to be important for the chemical separation between different garnet groups. A comparison of our results with recent literature data offers useful indications about the possible geographical provenance of the stones. The analytical techniques, their complementarity and the results obtained are presented and discussed.

  18. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not able to cope with such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
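    The feature-indexing and searching stages of such a retrieval pipeline can be illustrated with a toy sketch (our own assumptions, not code from the paper): L2-normalized feature vectors let a database be searched by inner product, which then equals cosine similarity.

```python
import numpy as np

def build_index(features):
    """L2-normalize feature vectors so that a dot product equals cosine similarity."""
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    return features / np.clip(norms, 1e-12, None)

def search(index, query, k=5):
    """Return indices of the k most similar database entries to the query."""
    q = query / max(np.linalg.norm(query), 1e-12)
    scores = index @ q
    top = np.argpartition(-scores, k - 1)[:k]       # unordered top-k
    return top[np.argsort(-scores[top])]            # sort the k survivors

rng = np.random.default_rng(0)
db = rng.normal(size=(1000, 64))   # stand-in for learned image features
index = build_index(db)
hits = search(index, db[42], k=3)  # querying with a database image: it ranks first
```

Production systems replace this brute-force scan with approximate indexing (hashing, inverted files, or graph-based search) to scale to millions of images, which is the subject of the indexing techniques the review surveys.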

  19. Quantitative Thermochronology

    NASA Astrophysics Data System (ADS)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques that allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions, and illustrate their modeling approach through a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. Numerous case studies help geologists to interpret age data and relate them to Earth processes; essential background material aids the understanding and use of thermochronological data; and the book offers a thorough treatise on numerical modeling of heat transport in the Earth's crust, supported by a website hosting relevant computer programs and colour slides of figures from the book for use in teaching.
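    One of the classic analytical solutions this kind of modeling rests on can be written in a few lines: the steady-state one-dimensional geotherm perturbed by vertical rock uplift. The parameter values and the function name below are illustrative assumptions, not taken from the book.

```python
import numpy as np

def steady_geotherm(z, v, kappa=25.0, T_surf=0.0, T_base=800.0, L=40.0):
    """Steady-state 1-D temperature with vertical rock uplift at speed v.

    Solves kappa*T'' + v*T' = 0 with T(0) = T_surf and T(L) = T_base.
    Illustrative units: z, L in km; kappa in km^2/Myr; v in km/Myr; T in deg C.
    """
    pe = v / kappa  # inverse advection length scale
    # (1 - exp(-pe*z)) / (1 - exp(-pe*L)), written with expm1 for stability
    return T_surf + (T_base - T_surf) * np.expm1(-pe * z) / np.expm1(-pe * L)

z = np.linspace(0.0, 40.0, 5)
slow = steady_geotherm(z, v=0.1)
fast = steady_geotherm(z, v=1.0)
# faster uplift advects heat upward, raising mid-crustal temperatures
```

Faster exhumation compresses the near-surface isotherms, which is precisely what shifts the closure depths and hence the predicted thermochronometric ages.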

  20. Endogenous glucocorticoid analysis by liquid chromatography-tandem mass spectrometry in routine clinical laboratories.

    PubMed

    Hawley, James M; Keevil, Brian G

    2016-09-01

    Liquid chromatography-tandem mass spectrometry (LC-MS/MS) is a powerful analytical technique that offers exceptional selectivity and sensitivity. Used optimally, LC-MS/MS provides accurate and precise results for a wide range of analytes at concentrations that are difficult to quantitate with other methodologies. Its implementation in routine clinical biochemistry laboratories has revolutionised our ability to analyse small molecules such as glucocorticoids. Whereas immunoassays can suffer from matrix effects and cross-reactivity due to interactions with structural analogues, the selectivity offered by LC-MS/MS has largely overcome these limitations. As many clinical guidelines are now beginning to acknowledge the importance of the methodology used to provide results, the advantages associated with LC-MS/MS are gaining wider recognition. Given the integral role of glucocorticoids in both the diagnosis and management of hypo- and hyperadrenal disorders, coupled with their widespread pharmacological use, their accurate measurement is fundamental to effective patient care. Here, we provide an up-to-date review of the LC-MS/MS techniques used to successfully measure endogenous glucocorticoids, with particular reference to serum, urine and salivary cortisol. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Continuous electrophoretic purification of individual analytes from multicomponent mixtures.

    PubMed

    McLaren, David G; Chen, David D Y

    2004-04-15

    Individual analytes can be isolated from multicomponent mixtures and collected in the outlet vial by carrying out electrophoretic purification through a capillary column. Desired analytes are allowed to migrate continuously through the column under the electric field while undesired analytes are confined to the inlet vial by application of a hydrodynamic counter pressure. Using pressure ramping and buffer replenishment techniques, 18% of the total amount present in a bulk sample can be purified when the resolution to the adjacent peak is approximately 3. With higher resolution, the yield could be further improved. Additionally, by periodically introducing fresh buffer into the sample, changes in pH and conductivity can be mitigated, allowing higher purity (>or=99.5%) to be preserved in the collected fractions. With an additional reversed cycle of flow counterbalanced capillary electrophoresis, any individual component in a sample mixture can be purified, provided it can be separated in an electrophoresis system.

  2. Current Status of Mycotoxin Analysis: A Critical Review.

    PubMed

    Shephard, Gordon S

    2016-07-01

    It is over 50 years since the discovery of aflatoxins focused the attention of food safety specialists on fungal toxins in the feed and food supply. Since then, analysis of this important group of natural contaminants has advanced in parallel with general developments in analytical science, and current MS methods are capable of simultaneously analyzing hundreds of compounds, including mycotoxins, pesticides, and drugs. This profusion of data may advance our understanding of human exposure, yet constitutes an interpretive challenge to toxicologists and food safety regulators. Despite these advances in analytical science, the basic problem of the extreme heterogeneity of mycotoxin contamination, although now well understood, cannot be circumvented. The real health challenges posed by mycotoxin exposure occur in the developing world, especially among small-scale and subsistence farmers. Addressing these problems requires innovative approaches in which analytical science must also play a role in providing suitable out-of-laboratory analytical techniques.

  3. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has minimised the necessary collaboration and, more relevantly, the cross data analytics over different domains. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient daily-based accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs for CERN's research and engineering community; (3) deliver real-time, batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various and mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.

  4. Computational Methodology for Absolute Calibration Curves for Microfluidic Optical Analyses

    PubMed Central

    Chang, Chia-Pin; Nagel, David J.; Zaghloul, Mona E.

    2010-01-01

    Optical fluorescence and absorption are two of the primary techniques used for analytical microfluidics. We provide a thorough yet tractable method for computing the performance of diverse optical micro-analytical systems. Sample sizes range from nano- to many micro-liters and concentrations from nano- to milli-molar. Equations are provided to trace quantitatively the flow of the fundamental entities, namely photons and electrons, and the conversion of energy from the source, through optical components, samples and spectral-selective components, to the detectors and beyond. The equations permit facile computations of calibration curves that relate the concentrations or numbers of molecules measured to the absolute signals from the system. This methodology provides the basis for both detailed understanding and improved design of microfluidic optical analytical systems. It saves prototype turn-around time, and is much simpler and faster to use than ray tracing programs. Over two thousand spreadsheet computations were performed during this study. We found that some design variations produce higher signal levels and, for constant noise levels, lower minimum detection limits. Improvements of more than a factor of 1,000 were realized. PMID:22163573
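    The flavor of this photon-budget bookkeeping can be sketched as a chain of multiplicative factors from source to detector. All parameter values below are illustrative assumptions of ours, not numbers from the paper.

```python
# Minimal sketch of an absolute fluorescence calibration curve: photons from
# the source are converted to detected photoelectrons by chaining efficiencies.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s

def fluorescence_signal(conc_molar, path_cm, power_w=1e-3, wavelength_m=488e-9,
                        eps=80000.0, quantum_yield=0.9, collection=0.05,
                        optics_T=0.8, detector_qe=0.6, t_int=1.0):
    """Detected photoelectrons for a dilute fluorescent sample.

    eps is the molar absorptivity (L mol^-1 cm^-1); the Beer-Lambert absorbed
    fraction is approximated as ln(10)*eps*c*l in the optically thin limit.
    """
    photons_in = power_w * t_int * wavelength_m / (H * C)   # excitation photons
    absorbed = 2.303 * eps * conc_molar * path_cm           # thin-sample limit
    emitted = photons_in * absorbed * quantum_yield         # fluorescence photons
    return emitted * collection * optics_T * detector_qe    # photoelectrons

# calibration curve: signal is linear in concentration in the dilute limit
curve = [(c, fluorescence_signal(c, path_cm=0.01)) for c in (1e-9, 1e-8, 1e-7)]
```

In the dilute limit the curve is strictly linear in concentration, which is what makes relating absolute signal to number of molecules, and hence a minimum detection limit for a given noise level, straightforward.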

  5. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.

  6. Full analytical solution of the bloch equation when using a hyperbolic-secant driving function.

    PubMed

    Zhang, Jinjin; Garwood, Michael; Park, Jang-Yeon

    2017-04-01

    The frequency-swept pulse known as the hyperbolic-secant (HS) pulse is popular in NMR for achieving adiabatic spin inversion. The HS pulse has also shown utility for achieving excitation and refocusing in gradient-echo and spin-echo sequences, including new ultrashort echo-time imaging (e.g., Sweep Imaging with Fourier Transform, SWIFT) and B1 mapping techniques. To facilitate the analysis of these techniques, the complete theoretical solution of the Bloch equation, as driven by the HS pulse, was derived for an arbitrary state of initial magnetization. The solution of the Bloch-Riccati equation for transverse and longitudinal magnetization for an arbitrary initial state was derived analytically in terms of HS pulse parameters. The analytical solution was compared with the solutions using both the Runge-Kutta method and the small-tip approximation. The analytical solution was demonstrated on different initial states at different frequency offsets with/without a combination of HS pulses. Evolution of the transverse magnetization was influenced significantly by the choice of HS pulse parameters. The deviation of the magnitude of the transverse magnetization, as obtained by comparing the small-tip approximation to the analytical solution, was <5% for flip angles <30°, but >10% for flip angles >40°. The derived analytical solution provides insights into the influence of HS pulse parameters on the magnetization evolution. Magn Reson Med 77:1630-1638, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
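    Readers wanting to reproduce the Runge-Kutta side of such a comparison can integrate the relaxation-free Bloch equations under an HS pulse numerically. The sketch below uses our own illustrative parameter values and one common sign convention for the frequency-modulated rotating frame; it does not reproduce the paper's analytical solution.

```python
import numpy as np
from scipy.integrate import solve_ivp

def hs_pulse_bloch(mu=5.0, beta=1060.0, w1max=9.4e3, tp=0.01, m0=(0.0, 0.0, 1.0)):
    """Integrate the Bloch equations (no relaxation) driven by an HS pulse.

    Amplitude w1(t) = w1max*sech(beta*tau) and frequency sweep
    dw(t) = -mu*beta*tanh(beta*tau), with tau = t - tp/2 so the sweep
    is centered in the pulse. Angular frequencies in rad/s, time in s.
    """
    def rhs(t, m):
        tau = t - tp / 2.0
        w1 = w1max / np.cosh(beta * tau)        # RF amplitude, along x
        dw = -mu * beta * np.tanh(beta * tau)   # instantaneous frequency offset
        mx, my, mz = m
        # dM/dt = M x Omega with effective field Omega = (w1, 0, dw)
        return [dw * my, -dw * mx + w1 * mz, -w1 * my]

    sol = solve_ivp(rhs, (0.0, tp), list(m0), rtol=1e-8, atol=1e-10)
    return sol.y[:, -1]

mx, my, mz = hs_pulse_bloch()
# with these parameters the pulse is strongly adiabatic, so equilibrium
# magnetization (mz = +1) is driven close to full inversion (mz near -1)
```

Sweeping the HS pulse parameters in such a loop is a quick way to see the sensitivity of the magnetization evolution that the paper quantifies analytically.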

  7. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  8. An illustrative analysis of technological alternatives for satellite communications

    NASA Technical Reports Server (NTRS)

    Metcalfe, M. R.; Cazalet, E. G.; North, D. W.

    1979-01-01

    The demand for satellite communications services in the domestic market is discussed. Two approaches to increasing system capacity are the expansion of service into frequencies presently allocated but not used for satellite communications, and the development of technologies that provide a greater level of service within the currently used frequency bands. The development of economic models and analytic techniques for evaluating capacity expansion alternatives such as these is presented. The satellite orbit spectrum problem is examined, and some suitable analytic approaches are outlined. An illustrative analysis of domestic communications satellite technology options for providing increased levels of service is also presented. The analysis illustrates the use of probabilities and decision trees in analyzing alternatives, and provides insight into the important aspects of the orbit spectrum problem that would warrant inclusion in a larger scale analysis.
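    The decision-tree style of analysis mentioned here reduces, at its simplest, to an expected-value comparison across alternatives. The options, payoffs, and probabilities below are invented purely for illustration.

```python
# Toy two-stage decision tree: choose a capacity-expansion option, then
# demand turns out high (p = 0.6) or low (p = 0.4). Payoffs are notional.
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs for one decision branch."""
    return sum(p * v for p, v in outcomes)

options = {
    # expand into new frequency bands: costly, pays off only if demand is high
    "new_bands":   [(0.6, 120.0), (0.4, -30.0)],
    # squeeze more service from current bands: cheaper, modest payoff either way
    "reuse_bands": [(0.6, 70.0), (0.4, 20.0)],
}
best = max(options, key=lambda k: expected_value(options[k]))
```

A larger-scale analysis would deepen the tree (more chance nodes, sequential decisions) rather than change this basic fold over probability-weighted payoffs.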

  9. Influence versus intent for predictive analytics in situation awareness

    NASA Astrophysics Data System (ADS)

    Cui, Biru; Yang, Shanchieh J.; Kadar, Ivan

    2013-05-01

    Predictive analytics in situation awareness requires an element to comprehend and anticipate potential adversary activities that might occur in the future. Most work in high level fusion or predictive analytics utilizes machine learning, pattern mining, Bayesian inference, and decision tree techniques to predict future actions or states. The emergence of social computing in broader contexts has drawn interest in bringing the hypotheses and techniques from social theory to algorithmic and computational settings for predictive analytics. This paper aims at answering the question of how influence and attitude (sometimes interpreted as intent) of adversarial actors can be formulated and computed algorithmically, as a higher level fusion process to provide predictions of future actions. The challenges in this interdisciplinary endeavor include drawing on existing understanding of influence and attitude in both the social science and computing fields, as well as the mathematical and computational formulation for the specific context of the situation to be analyzed. The study of 'influence' has resurfaced in recent years due to the emergence of social networks in the virtualized cyber world. Theoretical analysis and techniques developed in this area are discussed in this paper in the context of predictive analysis. Meanwhile, the notion of intent, or 'attitude' in social-theory terminology, is a relatively uncharted area in the computing field. Note that a key objective of predictive analytics is to identify impending/planned attacks so their 'impact' and 'threat' can be prevented. In this spirit, indirect and direct observables are drawn and derived to infer the influence network and attitude to predict future threats. This work proposes an integrated framework that jointly assesses adversarial actors' influence network and their attitudes as a function of past actions and action outcomes. A preliminary set of algorithms is developed and tested using the Global Terrorism Database (GTD). Our results reveal the benefits of performing joint predictive analytics with both attitude and influence. At the same time, we discover significant challenges in deriving influence and attitude from indirect observables for diverse adversarial behavior. These observations warrant further investigation of the optimal use of influence and attitude for predictive analytics, as well as the potential inclusion of other environmental or capability elements for the actors.

  10. Matisse: A Visual Analytics System for Exploring Emotion Trends in Social Media Text Streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Drouhard, Margaret MEG G; Beaver, Justin M

    Dynamically mining textual information streams to gain real-time situational awareness is especially challenging with social media systems, where throughput and velocity properties push the limits of a static analytical approach. In this paper, we describe an interactive visual analytics system, called Matisse, that aids with the discovery and investigation of trends in streaming text. Matisse addresses the challenges inherent to text stream mining through the following technical contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) interactive coordinated visualizations, and (4) a flexible drill-down interaction scheme that accesses multiple levels of detail. In addition to positive/negative sentiment prediction, Matisse provides fine-grained emotion classification based on Valence, Arousal, and Dominance dimensions and a novel machine learning process. Information from the sentiment/emotion analytics is fused with raw data and summary information to feed temporal, geospatial, term frequency, and scatterplot visualizations using a multi-scale, coordinated interaction model. After describing these techniques, we conclude with a practical case study focused on analyzing the Twitter sample stream during the week of the 2013 Boston Marathon bombings. The case study demonstrates the effectiveness of Matisse at providing guided situational awareness of significant trends in social media streams by orchestrating computational power and human cognition.

  11. Detection of genetically modified organisms in foods by DNA amplification techniques.

    PubMed

    García-Cañas, Virginia; Cifuentes, Alejandro; González, Ramón

    2004-01-01

    In this article, the different DNA amplification techniques that are being used for detecting genetically modified organisms (GMOs) in foods are examined. This study intends to provide an updated overview (including works published up to June 2002) of the principal applications of such techniques, together with their main advantages and drawbacks for GMO detection in foods. Some relevant facts on sampling, DNA isolation, and DNA amplification methods are discussed. Moreover, these analytical protocols are discussed from a quantitative point of view, including the newest investigations on multiplex detection of GMOs in foods and validation of methods.

  12. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
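    The integration application mentioned above can be shown in a minimal sketch of our own (not code from the article): a definite integral is estimated by uniform random sampling of the integrand.

```python
import random

def mc_integrate(f, a, b, n=100000, seed=0):
    """Estimate the integral of f on [a, b] by averaging f at uniform
    random sample points and scaling by the interval width."""
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# the integral of x^2 on [0, 1] is exactly 1/3
est = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

The standard error of the estimate falls only as 1/sqrt(n), which is why Monte Carlo work of this kind leaned on the supercomputer simulations the article describes.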

  13. Thermoelectrically cooled water trap

    DOEpatents

    Micheels, Ronald H [Concord, MA

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples, prior to analysis of these samples for chemical composition by a variety of analytical techniques where water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas-monitoring instrumentation.

  14. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
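    The core idea can be illustrated with additive secret sharing, one of the building blocks of secure multi-party computation. This is a minimal sketch with invented numbers; real deployments add secure channels, fixed-point encodings, and protections against malicious parties.

```python
import random

P = 2**61 - 1  # public prime modulus; all shares live in [0, P)

def share(value, n_parties, rng):
    """Split an integer into n additive shares that sum to value mod P."""
    shares = [rng.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def secure_sum(secrets, n_parties=3, seed=0):
    """Each input is split into shares; every party locally adds the shares
    it holds, and only the recombined total is ever revealed."""
    rng = random.Random(seed)
    held = [0] * n_parties
    for s in secrets:
        for i, sh in enumerate(share(s, n_parties, rng)):
            held[i] = (held[i] + sh) % P
    return sum(held) % P

# e.g. three hospitals computing a joint patient count without pooling raw data
total = secure_sum([120, 75, 309])
```

No single party's held shares reveal anything about an individual input, which is what lets analytics proceed without sharing the underlying data.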

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, S.S.; Zhu, S.; Cai, Y.

    Motion-dependent magnetic forces are the key elements in the study of magnetically levitated vehicle (maglev) system dynamics. In the past, most maglev-system designs were based on a quasisteady-motion theory of magnetic forces. This report presents an experimental and analytical study that will enhance our understanding of the role of unsteady-motion-dependent magnetic forces and demonstrate an experimental technique that can be used to measure those unsteady magnetic forces directly. The experimental technique provides a useful tool to measure motion-dependent magnetic forces for the prediction and control of maglev systems.

  16. Information support for decision making on dispatching control of water distribution in irrigation

    NASA Astrophysics Data System (ADS)

    Yurchenko, I. F.

    2018-05-01

    Research has been carried out on developing a technique to support decision making in the on-line control and operational management of water allocation for interfarm irrigation projects, based on analytical patterns of dispatcher control. This technique provides an increase in labour productivity as well as higher management quality, due to the improved level of automation and to optimization of decision making that takes into account diagnostics of issues, classification of solutions, and the information required by decision makers.

  17. Horses and cows might teach us about human knees

    NASA Astrophysics Data System (ADS)

    Holland, C.; Vollrath, F.; Gill, H. S.

    2014-04-01

    Our comparative study of the knees of horses and cows (paraphrased as highly evolved joggers and as domesticated couch-potatoes, respectively) demonstrates significant differences in the posterior sections of bovine and equine tibial cartilage, which are consistent with specialisation for gait. These insights were possible using a novel analytical measuring technique based on the shearing of small biopsy samples, called dynamic shear analysis. We assert that this technique could provide a powerful new tool to precisely quantify the pathology of osteoarthritis for the medical field.

  18. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.

  19. Turbomachinery noise

    NASA Astrophysics Data System (ADS)

    Groeneweg, John F.; Sofrin, Thomas G.; Rice, Edward J.; Gliebe, Phillip R.

    1991-08-01

    Summarized here are key advances in experimental techniques and theoretical applications which point the way to a broad understanding and control of turbomachinery noise. On the experimental side, the development of effective inflow control techniques makes it possible to conduct, in ground based facilities, definitive experiments in internally controlled blade row interactions. Results can now be valid indicators of flight behavior and can provide a firm base for comparison with analytical results. Inflow control coupled with detailed diagnostic tools such as blade pressure measurements can be used to uncover the more subtle mechanisms such as rotor strut interaction, which can set tone levels for some engine configurations. Initial mappings of rotor wake-vortex flow fields have provided a data base for a first generation semiempirical flow disturbance model. Laser velocimetry offers a nonintrusive method for validating and improving the model. Digital data systems and signal processing algorithms are bringing mode measurement closer to a working tool that can be frequently applied to a real machine such as a turbofan engine. On the analytical side, models of most of the links in the chain from turbomachine blade source to far field observation point have been formulated. Three dimensional lifting surface theory for blade rows, including source noncompactness and cascade effects, blade row transmission models incorporating mode and frequency scattering, and modal radiation calculations, including hybrid numerical-analytical approaches, are tools which await further application.

  20. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using a finite element procedure developed in-house. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions.
The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, sonic boom, and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.

  1. Analysis of pigments in polychromes by use of laser induced breakdown spectroscopy and Raman microscopy

    NASA Astrophysics Data System (ADS)

    Castillejo, M.; Martín, M.; Silva, D.; Stratoudaki, T.; Anglos, D.; Burgio, L.; Clark, R. J. H.

    2000-09-01

    Two laser-based analytical techniques, Laser Induced Breakdown Spectroscopy (LIBS) and Raman microscopy, have been used for the identification of pigments on a polychrome from the Rococo period. Detailed spectral data are presented from analyses performed on a fragment of a gilded altarpiece from the church of Escatrón, Zaragoza, Spain. LIBS measurements yielded elemental analytical data which suggest the presence of certain pigments and, in addition, provide information on the stratigraphy of the paint layers. Identification of most pigments and of the materials used in the preparation layer was performed by Raman microscopy.

  2. Transient Thermal Model and Analysis of the Lunar Surface and Regolith for Cryogenic Fluid Storage

    NASA Technical Reports Server (NTRS)

    Christie, Robert J.; Plachta, David W.; Yasan, Mohammad M.

    2008-01-01

    A transient thermal model of the lunar surface and regolith was developed along with analytical techniques which will be used to evaluate the storage of cryogenic fluids at equatorial and polar landing sites. The model can provide lunar surface and subsurface temperatures as a function of latitude and time throughout the lunar cycle and season. It also accounts for the presence or absence of the undisturbed fluff layer on the lunar surface. The model was validated with Apollo 15 and Clementine data and shows good agreement with other analytical models.

  3. Paper-based analytical devices for environmental analysis.

    PubMed

    Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S

    2016-03-21

    The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.

  4. Subjective education in analytic training: drawing on values from the art academy.

    PubMed

    Dougherty, Mary

    2008-11-01

    Kernberg and others have observed that psychoanalytic education has tended to promote the acquisition of theoretical knowledge and clinical technique within an atmosphere of indoctrination rather than of exploration. As a corrective, he proposed four models that correspond to values in psychoanalytic education: the art academy, the technical trade school, the religious seminary and the university. He commended models of the university and art academy to our collective attention because of their combined effectiveness in providing for the objective and subjective education of candidates: the university model for its capacity to provide a critical sense of a wide range of theories in an atmosphere tolerating debate and difference, and the art academy model for its capacity to facilitate the expression of individual creativity. In this paper, I will explore the art academy model for correspondences between artistic and analytic trainings that can enhance the development of the creative subjectivity of psychoanalytic candidates. I will draw additional correspondences between analytic and artistic learning that can enhance psychoanalytic education.

  5. Analytical methods in multivariate highway safety exposure data estimation

    DOT National Transportation Integrated Search

    1984-01-01

    Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting and the expectation maximizati...
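
    Of the techniques named, iterative proportional fitting is the most mechanical to illustrate: a seed contingency table is alternately rescaled until its row and column sums match target marginals. The numbers below are invented for illustration and are not the report's exposure data:

```python
# Minimal sketch of iterative proportional fitting (IPF): alternately scale
# the rows and columns of a seed table so its marginals converge to targets.
# Seed counts and marginal targets here are invented for demonstration.

def ipf(seed, row_targets, col_targets, iters=50):
    table = [row[:] for row in seed]
    for _ in range(iters):
        for i, target in enumerate(row_targets):      # scale each row
            s = sum(table[i])
            table[i] = [v * target / s for v in table[i]]
        for j, target in enumerate(col_targets):      # scale each column
            s = sum(row[j] for row in table)
            for row in table:
                row[j] *= target / s
    return table

# 2x2 seed (e.g., sampled accident counts) fitted to known exposure marginals
fitted = ipf([[1.0, 2.0], [3.0, 4.0]], row_targets=[60, 140], col_targets=[90, 110])
print([[round(v, 2) for v in row] for row in fitted])
```

    After a few dozen sweeps the fitted table reproduces the target marginals while preserving the interaction structure of the seed, which is what makes IPF useful for combining partial exposure data sources.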

  6. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    PubMed

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute of Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). 
Total fluorine results may be used to determine if the individual compounds quantified provide a suitable mass balance of total airborne organofluorochemicals based on known fluorine content. Improvements in precision and/or recovery as well as some additional testing would be needed to meet all NIOSH validation criteria. This study provided valuable information about the accuracy of this method for organofluorochemical exposure assessment.
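
    Two of the validation quantities discussed above, analytical recovery and limit of detection, can be illustrated with a short sketch. All numbers are invented; the mean-plus-3-SD blank convention shown is a common choice rather than necessarily the authors' exact procedure, and the unit conversion relies on 1 ug/L equalling 1 mg/m3:

```python
# Illustrative figures of merit for an OVS-tube air-sampling method:
# spike recovery and a blank-based limit of detection (mean + 3*SD),
# converted to an airborne concentration for a 60 L air sample.
# All values are invented for demonstration.
from statistics import mean, stdev

spiked_ug = 0.60                        # mass spiked onto an OVS tube (ug)
measured_ug = [0.55, 0.58, 0.62, 0.57]  # replicate results (ug)
recovery_pct = 100 * mean(measured_ug) / spiked_ug
print(f"mean recovery = {recovery_pct:.1f}%")

blanks_ug = [0.010, 0.012, 0.009, 0.011, 0.010]  # blank tube results (ug)
lod_ug = mean(blanks_ug) + 3 * stdev(blanks_ug)  # mass-based LOD (ug)

air_volume_l = 60
lod_mg_m3 = lod_ug / air_volume_l       # ug/L is numerically equal to mg/m3
print(f"LOD = {lod_ug:.4f} ug -> {lod_mg_m3:.6f} mg/m3 for a 60 L sample")
```

    The same division explains the figures in the abstract: a 0.06 ug spike over a 60 L sample corresponds to 0.001 mg/m3.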

  7. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.

  8. Establishing a Common Vocabulary of Key Concepts for the Effective Implementation of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Cihon, Traci M.; Cihon, Joseph H.; Bedient, Guy M.

    2016-01-01

    The technical language of behavior analysis is arguably necessary for behavior analysts to share ideas and research with precision. However, it can hinder effective implementation of behavior analytic techniques when it prevents clear communication between the supervising behavior analyst and behavior technicians. The present paper provides a case…

  9. Use of Latent Profile Analysis in Studies of Gifted Students

    ERIC Educational Resources Information Center

    Mammadov, Sakhavat; Ward, Thomas J.; Cross, Jennifer Riedl; Cross, Tracy L.

    2016-01-01

    To date, in gifted education and related fields various conventional factor analytic and clustering techniques have been used extensively for investigation of the underlying structure of data. Latent profile analysis is a relatively new method in the field. In this article, we provide an introduction to latent profile analysis for gifted education…

  10. Orbiter Avionics Radiation Handbook

    NASA Technical Reports Server (NTRS)

    Reddell, Brandon D.

    1999-01-01

    This handbook was assembled to document the radiation environment for the design of Orbiter avionics. It also maps the environment through vehicle shielding and mission usage into discrete requirements such as total dose. Some details of analytical techniques for calculating radiation effects are provided. It is anticipated that appropriate portions of this document will be added to formal program specifications.

  11. 75 FR 28616 - Agilent Technologies, Inc.; Analysis of the Agreement Containing Consent Order to Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... equipment used to test cell phones and communications equipment, machines that determine the contents of... employ various analytical techniques to test samples of many types, are used by academic researchers... require the sensitivity provided by ICP-MS, and because many customers perform tests pursuant to...

  12. Corpus Linguistics for Korean Language Learning and Teaching. NFLRC Technical Report No. 26

    ERIC Educational Resources Information Center

    Bley-Vroman, Robert, Ed.; Ko, Hyunsook, Ed.

    2006-01-01

    Dramatic advances in personal computer technology have given language teachers access to vast quantities of machine-readable text, which can be analyzed with a view toward improving the basis of language instruction. Corpus linguistics provides analytic techniques and practical tools for studying language in use. This volume includes both an…

  13. ION COMPOSITION ELUCIDATION (ICE): A HIGH RESOLUTION MASS SPECTROMETRIC TECHNIQUE FOR CHARACTERIZATION AND IDENTIFICATION OF ORGANIC COMPOUNDS

    EPA Science Inventory

    Identifying compounds found in the environment without knowledge of their origin is a very difficult analytical problem. Comparison of the low resolution mass spectrum of a compound with those in the NIST or Wiley mass spectral libraries can provide a tentative identification whe...

  14. SUBSURFACE CHARACTERIZATION AND MONITORING TECHNIQUES: A DESK REFERENCE GUIDE - VOLUME II: THE VADOSE ZONE, FIELD SCREENING AND ANALYTICAL METHODS - APPENDICES C AND D

    EPA Science Inventory

    Many EPA programs, including those under the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Response, Compensation, and Liability Act (CERCLA), require subsurface characterization and monitoring to detect ground-water contamination and provide data to deve...

  15. Preparation and Analysis of Potassium Tris(Oxalato)Ferrate(III)Trihydrate: A General Chemistry Experiment.

    ERIC Educational Resources Information Center

    Olmsted, John

    1984-01-01

    Describes a five-period experiment which: (1) integrates preparative and analytical techniques; (2) utilizes a photochemical reaction that excites student interest both from visual impact and as an introduction to photoinduced processes; (3) provides accurate results; and (4) costs less than $0.20 per student per laboratory session. (JN)

  16. Analytical estimation of ultrasound properties, thermal diffusivity, and perfusion using magnetic resonance-guided focused ultrasound temperature data

    PubMed Central

    Dillon, C R; Borasi, G; Payne, A

    2016-01-01

    For thermal modeling to play a significant role in treatment planning, monitoring, and control of magnetic resonance-guided focused ultrasound (MRgFUS) thermal therapies, accurate knowledge of ultrasound and thermal properties is essential. This study develops a new analytical solution for the temperature change observed in MRgFUS which can be used with experimental MR temperature data to provide estimates of the ultrasound initial heating rate, Gaussian beam variance, tissue thermal diffusivity, and Pennes perfusion parameter. Simulations demonstrate that this technique provides accurate and robust property estimates that are independent of the beam size, thermal diffusivity, and perfusion levels in the presence of realistic MR noise. The technique is also demonstrated in vivo using MRgFUS heating data in rabbit back muscle. Errors in property estimates are kept less than 5% by applying a third order Taylor series approximation of the perfusion term and ensuring the ratio of the fitting time (the duration of experimental data utilized for optimization) to the perfusion time constant remains less than one. PMID:26741344
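
    The property-estimation idea can be illustrated with a deliberately simplified model: for a Gaussian temperature profile with no perfusion, the spatial variance grows linearly in time as sigma^2(t) = sigma0^2 + 4*D*t, so a line fitted to MR-thermometry-derived widths recovers the thermal diffusivity D. This is a reduced sketch under that assumption, not the paper's full analytical solution (which also estimates the heating rate, Gaussian beam variance, and Pennes perfusion), and the data below are synthetic:

```python
# Simplified sketch: recover thermal diffusivity D from the linear growth of
# a Gaussian temperature profile's variance after heating ends (no perfusion).
# sigma^2(t) = sigma0^2 + 4*D*t; synthetic, noiseless data for illustration.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

D_true = 1.5e-7                      # m^2/s, typical soft-tissue diffusivity
sigma0_sq = 1.0e-6                   # initial beam variance (m^2)
times = [1.0, 2.0, 3.0, 4.0, 5.0]    # s after heating ends
variances = [sigma0_sq + 4 * D_true * t for t in times]

slope, intercept = linear_fit(times, variances)
print(f"estimated D = {slope / 4:.3e} m^2/s, sigma0^2 = {intercept:.3e} m^2")
```

    With real MR temperature maps the widths are noisy, which is why the study bounds the fitting time relative to the perfusion time constant to keep property errors below 5%.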

  17. Chemical messages in 170-year-old champagne bottles from the Baltic Sea: Revealing tastes from the past

    PubMed Central

    Jeandet, Philippe; Heinzmann, Silke S.; Roullier-Gall, Chloé; Cilindre, Clara; Aron, Alissa; Deville, Marie Alice; Moritz, Franco; Karbowiak, Thomas; Demarville, Dominique; Brun, Cyril; Moreau, Fabienne; Michalke, Bernhard; Liger-Belair, Gérard; Witting, Michael; Lucio, Marianna; Steyer, Damien; Gougeon, Régis D.; Schmitt-Kopplin, Philippe

    2015-01-01

    Archaeochemistry as the application of the most recent analytical techniques to ancient samples now provides an unprecedented understanding of human culture throughout history. In this paper, we report on a multiplatform analytical investigation of 170-y-old champagne bottles found in a shipwreck at the bottom of the Baltic Sea, which provides insight into winemaking practices used at the time. Organic spectroscopy-based nontargeted metabolomics and metallomics give access to the detailed composition of these wines, revealing, for instance, unexpected chemical characteristics in terms of small ion, sugar, and acid contents as well as markers of barrel aging and Maillard reaction products. The distinct aroma composition of these ancient champagne samples, first revealed during tasting sessions, was later confirmed using state-of-the-art aroma analysis techniques. After 170 y of deep sea aging in close-to-perfect conditions, these sleeping champagne bottles awoke to tell us a chapter of the story of winemaking and to reveal their extraordinary archaeometabolome and elemental diversity in the form of chemical signatures related to each individual step of champagne production. PMID:25897020

  18. Chemical messages in 170-year-old champagne bottles from the Baltic Sea: Revealing tastes from the past.

    PubMed

    Jeandet, Philippe; Heinzmann, Silke S; Roullier-Gall, Chloé; Cilindre, Clara; Aron, Alissa; Deville, Marie Alice; Moritz, Franco; Karbowiak, Thomas; Demarville, Dominique; Brun, Cyril; Moreau, Fabienne; Michalke, Bernhard; Liger-Belair, Gérard; Witting, Michael; Lucio, Marianna; Steyer, Damien; Gougeon, Régis D; Schmitt-Kopplin, Philippe

    2015-05-12

    Archaeochemistry as the application of the most recent analytical techniques to ancient samples now provides an unprecedented understanding of human culture throughout history. In this paper, we report on a multiplatform analytical investigation of 170-y-old champagne bottles found in a shipwreck at the bottom of the Baltic Sea, which provides insight into winemaking practices used at the time. Organic spectroscopy-based nontargeted metabolomics and metallomics give access to the detailed composition of these wines, revealing, for instance, unexpected chemical characteristics in terms of small ion, sugar, and acid contents as well as markers of barrel aging and Maillard reaction products. The distinct aroma composition of these ancient champagne samples, first revealed during tasting sessions, was later confirmed using state-of-the-art aroma analysis techniques. After 170 y of deep sea aging in close-to-perfect conditions, these sleeping champagne bottles awoke to tell us a chapter of the story of winemaking and to reveal their extraordinary archaeometabolome and elemental diversity in the form of chemical signatures related to each individual step of champagne production.

  19. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
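
    The "rolling-window" analytic scope mentioned above can be sketched with a small streaming summary: a fixed-size window whose statistic stays current as items arrive. The class and data are illustrative only, not part of the reference architecture:

```python
# Sketch of a rolling-window analytic over a stream: keep the last N values
# in a deque and maintain their running total so the mean updates in O(1).
# Names and data are illustrative.
from collections import deque

class RollingMean:
    def __init__(self, window):
        self.values = deque(maxlen=window)
        self.total = 0.0

    def update(self, x):
        if len(self.values) == self.values.maxlen:
            self.total -= self.values[0]   # evict before append drops it
        self.values.append(x)
        self.total += x
        return self.total / len(self.values)

rm = RollingMean(window=3)
stream = [10, 20, 30, 40]
print([round(rm.update(x), 2) for x in stream])  # → [10.0, 15.0, 20.0, 30.0]
```

    Incremental and global scopes differ only in what state is retained: an incremental analytic keeps a running aggregate forever, while a rolling-window analytic forgets items that leave the window, as above.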

  20. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking the mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions on solving linear algebraic equations in one variable. A total of 350 working schemes comprising 1385 responses were collected using a marking engine prototype, which has been developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
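
    The multiset-based comparison underlying such stepwise checking can be sketched as follows; the tokenizer and the symmetric-overlap score are simplified assumptions for illustration, not the authors' exact formulation:

```python
# Illustrative multiset (bag-of-tokens) similarity between a student's
# equation response and the expected equation: tokenize both, intersect the
# token multisets, and normalize the overlap to a 0..1 score. The tokenizer
# and scoring rule are simplified assumptions for demonstration.
import re
from collections import Counter

def tokenize(equation):
    """Split an equation string into variables, numbers, and operators."""
    return re.findall(r"[A-Za-z]+|\d+|[=+\-*/()]", equation)

def multiset_score(response, expected):
    r, e = Counter(tokenize(response)), Counter(tokenize(expected))
    overlap = sum((r & e).values())           # multiset intersection size
    return 2 * overlap / (sum(r.values()) + sum(e.values()))

print(multiset_score("2*x + 4 = 10", "2*x + 4 = 10"))        # identical: 1.0
print(round(multiset_score("2*x = 6", "2*x + 4 = 10"), 2))   # partial: 0.67
```

    A marking engine can apply such a score at each step of a working scheme, yielding the quantitative per-response feedback the paper describes.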

  1. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

    Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To challenge this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355

  2. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

    Matrix-assisted laser desorption/ionization (MALDI) has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To challenge this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample in a concentration close to optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements on an average of 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications.
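
    The TAD decision logic described in the abstract can be sketched in a few lines: spike a known amount of the target near the original LOD, then test whether the observed signal exceeds what the spike alone should produce by more than noise allows. The response model and numbers below are illustrative assumptions, not the authors' calibration:

```python
# Conceptual sketch of targeted analyte detection (TAD): an exogenous spike
# raises the target's signal above the noise floor; an excess over the
# spike-alone signal indicates endogenous analyte. Numbers are illustrative.

def endogenous_detected(observed_signal, spike_only_signal, noise_sd, k=3):
    """Flag endogenous analyte if the excess exceeds k noise standard deviations."""
    return (observed_signal - spike_only_signal) > k * noise_sd

# Spiked sample reads well above what the spike alone would produce:
print(endogenous_detected(observed_signal=1450.0,
                          spike_only_signal=1000.0, noise_sd=50.0))  # True
# Excess within noise: no evidence of endogenous analyte:
print(endogenous_detected(observed_signal=1080.0,
                          spike_only_signal=1000.0, noise_sd=50.0))  # False
```

    In practice the spike-alone signal would be measured from a matched blank, and the paper's quantitative method sets the spike near the original LOD to maximize the detectable excess.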

  3. Compact and cost effective instrument for detecting drug precursors in different environments based on fluorescence polarization

    NASA Astrophysics Data System (ADS)

    Antolín-Urbaneja, J. C.; Eguizabal, I.; Briz, N.; Dominguez, A.; Estensoro, P.; Secchi, A.; Varriale, A.; Di Giovanni, S.; D'Auria, S.

    2013-05-01

    Several techniques for detecting chemical drug precursors have been developed in the last decade. Most of them are able to identify molecules at very low concentration under lab conditions. Other commercial devices detect only a fixed number and type of target substances based on a single detection technique, providing no flexibility with respect to target compounds. The construction of compact and easy to use detection systems that screen for a large number of compounds and discriminate them with a low false alarm rate and high probability of detection remains an open challenge. Under the CUSTOM project, funded by the European Commission within the FP7, a stand-alone portable sensing device based on multiple techniques is being developed. One of these techniques is based on LED induced fluorescence polarization to detect Ephedrine and Benzyl Methyl Ketone (BMK) as a first approach. This technique is highly selective with respect to the target compounds due to the generation of properly engineered fluorescent proteins which are able to bind the target analytes, as in an "immune-type reaction". This paper deals with the advances in the design, construction and validation of the LED induced fluorescence sensor to detect BMK analytes. This sensor includes an analysis module based on a high performance LED and PMT detector, a fluidic system to dose suitable quantities of reagents and some printed circuit boards, all of them fixed in a small structure (167 mm × 193 mm × 228 mm) with the capability of working as a stand-alone application.
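
    The binding readout behind fluorescence polarization can be summarized with the standard anisotropy relation r = (Ipar - Iperp) / (Ipar + 2*Iperp): when the engineered protein binds the fluorescent tracer, rotational diffusion slows and r rises. The intensities below are illustrative, not sensor data:

```python
# Fluorescence anisotropy from parallel- and perpendicular-polarized
# emission intensities: r = (I_par - I_perp) / (I_par + 2*I_perp).
# Binding slows molecular rotation, so r increases. Values are illustrative.

def anisotropy(i_parallel, i_perpendicular):
    return (i_parallel - i_perpendicular) / (i_parallel + 2 * i_perpendicular)

free_tracer = anisotropy(i_parallel=1200.0, i_perpendicular=1000.0)
bound_tracer = anisotropy(i_parallel=1600.0, i_perpendicular=800.0)
print(f"free: r = {free_tracer:.3f}, bound: r = {bound_tracer:.3f}")
```

    A sensor of this kind infers analyte concentration from how far r shifts between the free and bound states of the tracer.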

  4. A Single-Molecule Barcoding System using Nanoslits for DNA Analysis

    NASA Astrophysics Data System (ADS)

    Jo, Kyubong; Schramm, Timothy M.; Schwartz, David C.

    Single DNA molecule approaches are playing an increasingly central role in the analytical genomic sciences because single molecule techniques intrinsically provide individualized measurements of selected molecules, free from the constraints of bulk techniques, which blindly average noise and mask the presence of minor analyte components. Accordingly, a principal challenge that must be addressed by all single molecule approaches aimed at genome analysis is how to immobilize and manipulate DNA molecules for measurements that foster construction of large, biologically relevant data sets. To meet this challenge, this chapter discusses an integrated approach using microfabricated and nanofabricated devices for the manipulation of elongated DNA molecules within nanoscale geometries. Ideally, large DNA coils stretch via nanoconfinement when channel dimensions are within tens of nanometers. Importantly, stretched, often immobilized, DNA molecules spanning hundreds of kilobase pairs are required by all analytical platforms working with large genomic substrates because imaging techniques acquire sequence information from molecules that normally exist in free solution as unrevealing random coils resembling floppy balls of yarn. However, fabricating devices with dimensions small enough to foster molecular stretching has been impractical because of the exotic fabrication technologies required, costly materials, and poor operational efficiencies. In this chapter, such problems are addressed by discussion of a new approach to DNA presentation and analysis that establishes scalable nanoconfinement conditions through reduction of ionic strength, stiffening DNA molecules and thus enabling their arraying for analysis using easily fabricated devices that can also be mass-produced.
This new approach to DNA nanoconfinement is complemented by the development of a novel labeling scheme for reliable marking of individual molecules with fluorochrome labels, creating molecular barcodes, which are efficiently read using fluorescence resonance energy transfer techniques for minimizing noise from unincorporated labels. As such, our integrative approach for the realization of genomic analysis through nanoconfinement, named nanocoding, was demonstrated through the barcoding and mapping of bacterial artificial chromosomal molecules, thereby providing the basis for a high-throughput platform competent for whole genome investigations.

  5. S.S. Annunziata Church (L'Aquila, Italy) unveiled by non- and micro-destructive testing techniques

    NASA Astrophysics Data System (ADS)

    Sfarra, Stefano; Cheilakou, Eleni; Theodorakeas, Panagiotis; Paoletti, Domenica; Koui, Maria

    2017-03-01

    The present research work explores the potential of an integrated inspection methodology, combining Non-destructive testing and micro-destructive analytical techniques, for both the structural assessment of the S.S. Annunziata Church located in Roio Colle (L'Aquila, Italy) and the characterization of its wall paintings' pigments. The study started by applying passive thermal imaging for the structural monitoring of the church before and after the application of a consolidation treatment, while active thermal imaging was further used for assessing this consolidation procedure. After the earthquake of 2009, which seriously damaged the city of L'Aquila and its surroundings, part of the internal plaster fell off, revealing the presence of an ancient mural painting that was subsequently investigated by means of a combined analytical approach involving portable VIS-NIR fiber optics diffuse reflectance spectroscopy (FORS) and laboratory methods, such as environmental scanning electron microscopy (ESEM) coupled with energy dispersive X-ray analysis (EDX), and attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR). The results obtained from the thermographic analysis provided information concerning the two different construction phases of the Church, enabled the assessment of the consolidation treatment, and contributed to the detection of localized problems mainly related to the rising damp phenomenon and to biological attack. In addition, the results obtained from the combined analytical approach allowed the identification of the wall painting pigments (red and yellow ochre, green earth, and smalt) and provided information on the binding media and the painting technique possibly applied by the artist. 
From the results of the present study, it is possible to conclude that the joint use of the aforementioned methods in an integrated methodology can produce the complete set of useful information required for planning the Church's restoration phase.

  6. How Can Visual Analytics Assist Investigative Analysis? Design Implications from an Evaluation.

    PubMed

    Youn-Ah Kang; Görg, Carsten; Stasko, John

    2011-05-01

    Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. However, determining how such systems foster insight and sensemaking is important for their continued growth and study. Furthermore, studies that identify how people use such systems and why they benefit (or not) can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw employed in a small investigative sensemaking exercise, and compared its use to three other more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw helped participants analyze the data and identify an embedded threat. We describe different analysis strategies used by study participants and how computational support (or the lack thereof) influenced the strategies. We then illustrate several characteristics of the sensemaking process identified in the study and provide design implications for investigative analysis tools based thereon. We conclude with recommendations on metrics and techniques for evaluating visual analytics systems for investigative analysis.

  7. Commodity-Free Calibration

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Commodity-free calibration is a reaction rate calibration technique that does not require the addition of any commodities. This technique is a specific form of the reaction rate technique, where all of the necessary reactants, other than the sample being analyzed, are either inherent in the analyzing system or specifically added or provided to the system for a reason other than calibration. After introduction, the component of interest is exposed to other reactants or flow paths already present in the system. The instrument detector records one of the following to determine the rate of reaction: the increase in the response of the reaction product, a decrease in the signal of the analyte response, or a decrease in the signal from the inherent reactant. From these data, the initial concentration of the analyte is calculated. This type of system can analyze and calibrate simultaneously, reduce the risk of false positives and exposure to toxic vapors, and improve accuracy. Moreover, having an excess of the reactant already present in the system eliminates the need to add commodities, which further reduces cost, logistic problems, and potential contamination. Also, the calculations involved can be simplified by comparison to those of the reaction rate technique. We conducted tests with hypergols as an initial investigation into the feasibility of the technique.
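
    The reaction-rate calculation alluded to above can be illustrated with a simple first-order model: if the analyte signal decays as S(t) = S0·exp(-k·t), a log-linear least-squares fit recovers both the rate constant and the initial signal, from which the initial concentration would follow via the detector's response factor. The sketch below uses synthetic data and hypothetical units; it is illustrative only, not the calibration procedure actually used in the study.

    ```python
    import math

    def fit_first_order(times, signals):
        """Least-squares fit of ln(S) = ln(S0) - k*t for a first-order decay.
        Returns (S0, k)."""
        n = len(times)
        ys = [math.log(s) for s in signals]
        xbar = sum(times) / n
        ybar = sum(ys) / n
        slope = (sum((x - xbar) * (y - ybar) for x, y in zip(times, ys))
                 / sum((x - xbar) ** 2 for x in times))
        intercept = ybar - slope * xbar
        return math.exp(intercept), -slope

    # Synthetic detector trace: S0 = 4.0 (arbitrary units), k = 0.3 per second
    times = [0.0, 1.0, 2.0, 3.0, 4.0]
    signals = [4.0 * math.exp(-0.3 * t) for t in times]
    s0, k = fit_first_order(times, signals)
    ```

    In practice, the fitted initial signal would be converted to a concentration through a known detector response factor, with the rate constant serving as the internally supplied calibration quantity.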

  8. An overview on current fluid-inclusion research and applications

    USGS Publications Warehouse

    Chi, G.; Chou, I.-Ming; Lu, H.-Z.

    2003-01-01

    This paper provides an overview of some of the more important developments in fluid-inclusion research and applications in recent years, including fluid-inclusion petrography, PVTX studies, and analytical techniques. In fluid-inclusion petrography, the introduction of the concept of 'fluid-inclusion assemblage' has been a major advance. In PVTX studies, the use of synthetic fluid inclusions and hydrothermal diamond-anvil cells has greatly contributed to the characterization of the phase behaviour of geologically relevant fluid systems. Various analytical methods are being developed and refined rapidly, with the Laser-Raman and LA-ICP-MS techniques being particularly useful for volatile and solute analyses, respectively. Ore deposit research has been and will continue to be the main field of application of fluid inclusions. However, fluid inclusions have been increasingly applied to other fields of earth science, especially in petroleum geology and the study of magmatic and earth interior processes.

  9. Advanced analytical techniques for the extraction and characterization of plant-derived essential oils by gas chromatography with mass spectrometry.

    PubMed

    Waseem, Rabia; Low, Kah Hin

    2015-02-01

    In recent years, essential oils have received growing interest because of the positive health effects of their novel characteristics, such as antibacterial, antifungal, and antioxidant activities. For the extraction of plant-derived essential oils, there is a need for advanced analytical techniques and innovative methodologies. An exhaustive study of hydrodistillation, supercritical fluid extraction, ultrasound- and microwave-assisted extraction, solid-phase microextraction, pressurized liquid extraction, pressurized hot water extraction, liquid-liquid extraction, liquid-phase microextraction, matrix solid-phase dispersion, and gas chromatography (one- and two-dimensional) hyphenated with mass spectrometry for the extraction from various plant species and the analysis of essential oils has been provided in this review. Essential oils are composed mainly of terpenes and terpenoids with low-molecular-weight aromatic and aliphatic constituents that are particularly important for public health. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability is being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.

  11. Electrospray ionization mass spectrometry: a technique to access the information beyond the molecular weight of the analyte.

    PubMed

    Banerjee, Shibdas; Mazumdar, Shyamalava

    2012-01-01

    The Electrospray Ionization (ESI) is a soft ionization technique extensively used for production of gas phase ions (without fragmentation) of thermally labile large supramolecules. In the present review we have described the development of Electrospray Ionization mass spectrometry (ESI-MS) during the last 25 years in the study of various properties of different types of biological molecules. There have been extensive studies on the mechanism of formation of charged gaseous species by the ESI. Several groups have investigated the origin and implications of the multiple charge states of proteins observed in the ESI-mass spectra of the proteins. The charged analytes produced by ESI can be fragmented by activating them in the gas-phase, and thus tandem mass spectrometry has been developed, which provides very important insights on the structural properties of the molecule. The review will highlight recent developments and emerging directions in this fascinating area of research.
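
    As a concrete example of the multiple charge states discussed above: ESI produces [M + nH]^n+ ions, so a protein of neutral mass M appears at m/z = (M + n·1.00728)/n, and two adjacent charge-state peaks suffice to deconvolve both the charge and the neutral mass. A minimal sketch; the myoglobin mass used below is illustrative, not a value from this review.

    ```python
    PROTON = 1.00728  # Da, mass of a proton

    def mz(neutral_mass, n):
        """m/z of the [M + nH]^n+ ion."""
        return (neutral_mass + n * PROTON) / n

    def deconvolve_pair(mz_low, mz_high):
        """Charge and neutral mass from two adjacent charge-state peaks of the
        same analyte; mz_low carries one more charge than mz_high.
        From n*mz_high = M + n*PROTON and (n+1)*mz_low = M + (n+1)*PROTON:
        n = (mz_low - PROTON) / (mz_high - mz_low)."""
        n = (mz_low - PROTON) / (mz_high - mz_low)  # charge of the mz_high peak
        mass = n * (mz_high - PROTON)
        return round(n), mass

    # Illustrative protein mass (approximately apomyoglobin, ~16951 Da)
    m10 = mz(16951.0, 10)
    m11 = mz(16951.0, 11)
    charge, mass = deconvolve_pair(m11, m10)
    ```

    This pairwise relation is the basis of the charge-state deconvolution routinely applied to ESI spectra of proteins.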

  12. Hyperspectral imaging for non-contact analysis of forensic traces.

    PubMed

    Edelman, G J; Gaston, E; van Leeuwen, T G; Cullen, P J; Aalders, M C G

    2012-11-30

    Hyperspectral imaging (HSI) integrates conventional imaging and spectroscopy, to obtain both spatial and spectral information from a specimen. This technique enables investigators to analyze the chemical composition of traces and simultaneously visualize their spatial distribution. HSI offers significant potential for the detection, visualization, identification and age estimation of forensic traces. The rapid, non-destructive and non-contact features of HSI mark its suitability as an analytical tool for forensic science. This paper provides an overview of the principles, instrumentation and analytical techniques involved in hyperspectral imaging. We describe recent advances in HSI technology motivating forensic science applications, e.g. the development of portable and fast image acquisition systems. Reported forensic science applications are reviewed. Challenges are addressed, such as the analysis of traces on backgrounds encountered in casework, concluded by a summary of possible future applications. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1990. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic analyses of solid, liquid, and gaseous samples and provides specialized analytical services. The Instrumental Analysis Group uses nuclear counting techniques in radiochemical analyses over a wide range of sample types, from low-level environmental samples to samples of high radioactivity. The Organic Analysis Group uses a number of complementary techniques to separate and to quantitatively and qualitatively analyze complex organic mixtures and compounds at the trace level, including synthetic fuels, toxic substances, fossil-fuel residues and emissions, pollutants, biologically active compounds, pesticides, and drugs. The Environmental Analysis Group performs analyses of inorganic environmental and hazardous waste and coal samples.

  14. Halal authenticity issues in meat and meat products.

    PubMed

    Nakyinsige, Khadijah; Man, Yaakob Bin Che; Sazili, Awis Qurni

    2012-07-01

    In recent years, Muslims have become increasingly concerned about the meat they eat. Proper product description is crucial for consumers to make informed choices and to ensure fair trade, particularly in the ever-growing halal food market. Globally, Muslim consumers are concerned about a number of issues concerning meat and meat products such as pork substitution, undeclared blood plasma, use of prohibited ingredients, pork intestine casings and non-halal methods of slaughter. Analytical techniques which are appropriate and specific have been developed to deal with particular issues. The most suitable technique for any particular sample is often determined by the nature of the sample itself. This paper sets out to identify what makes meat halal, highlight the halal authenticity issues that occur in meat and meat products and provide an overview of the possible analytical methods for halal authentication of meat and meat products. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Environmental Monitoring and the Gas Industry: Program Manager Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory D. Gillispie

    1997-12-01

    This document has been developed for the nontechnical gas industry manager who has the responsibility for the development of waste or potentially contaminated soil and groundwater data or must make decisions based on such data for the management or remediation of these materials. It explores the use of common analytical chemistry instrumentation and associated techniques for identification of environmentally hazardous materials. Sufficient detail is given to familiarize the nontechnical reader with the principles behind the operation of each technique. The scope and realm of the techniques and their constituent variations are portrayed through a discussion of crucial details and, where appropriate, the depiction of real-life data. It is the author's intention to provide an easily understood handbook for gas industry management. Techniques which determine the presence, composition, and quantification of gas industry wastes are discussed. Greater focus is given to traditional techniques which have been the mainstay of modern analytical benchwork. However, with the continual advancement of instrumental principles and design, several techniques have been included which are likely to receive greater attention in future considerations for waste-related detection. Definitions and concepts inherent to a thorough understanding of the principles common to analytical chemistry are discussed. It is also crucial that gas industry managers understand the effects of the various actions which take place before, during, and after the actual sampling step. When a series of sample collection, storage, and transport activities occur, new or inexperienced project managers may overlook or misunderstand the importance of the sequence. Each step has an impact on the final results of the measurement process; errors in judgment or decision making can be costly. 
Specific techniques and methodologies for the collection, storage, and transport of environmental media samples are not described or discussed in detail in this handbook. However, the underlying philosophy regarding the importance of proper collection, storage, and transport practices, as well as pertinent references, are presented.

  16. Performance of Orbital Neutron Instruments for Spatially Resolved Hydrogen Measurements of Airless Planetary Bodies

    PubMed Central

    Elphic, Richard C.; Feldman, William C.; Funsten, Herbert O.; Prettyman, Thomas H.

    2010-01-01

    Orbital neutron spectroscopy has become a standard technique for measuring planetary surface compositions from orbit. While this technique has led to important discoveries, such as the deposits of hydrogen at the Moon and Mars, a limitation is its poor spatial resolution. For omni-directional neutron sensors, spatial resolutions are 1–1.5 times the spacecraft's altitude above the planetary surface (or 40–600 km for typical orbital altitudes). Neutron sensors with enhanced spatial resolution have been proposed, and one with a collimated field of view is scheduled to fly on a mission to measure lunar polar hydrogen. No quantitative studies or analyses have been published that evaluate in detail the detection and sensitivity limits of spatially resolved neutron measurements. Here, we describe two complementary techniques for evaluating the hydrogen sensitivity of spatially resolved neutron sensors: an analytic, closed-form expression that has been validated with Lunar Prospector neutron data, and a three-dimensional modeling technique. The analytic technique, called the Spatially resolved Neutron Analytic Sensitivity Approximation (SNASA), provides a straightforward method to evaluate spatially resolved neutron data from existing instruments as well as to plan for future mission scenarios. We conclude that the existing detector—the Lunar Exploration Neutron Detector (LEND)—scheduled to launch on the Lunar Reconnaissance Orbiter will have hydrogen sensitivities that are over an order of magnitude poorer than previously estimated. We further conclude that a sensor with a geometric factor of ∼100 cm² sr (compared to the LEND geometric factor of ∼10.9 cm² sr) could make substantially improved measurements of the lunar polar hydrogen spatial distribution. Key Words: Planetary instrumentation—Planetary science—Moon—Spacecraft experiments—Hydrogen. Astrobiology 10, 183–200. PMID:20298147

  17. Investigation of the feasibility of an analytical method of accounting for the effects of atmospheric drag on satellite motion

    NASA Technical Reports Server (NTRS)

    Bozeman, Robert E.

    1987-01-01

    An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.
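
    The rotating spherical exponential density model mentioned above has the basic form ρ(h) = ρ0·exp(-(h - h0)/H); the superposed oblate-atmosphere and diurnal terms are omitted here. A minimal sketch of that base model and the resulting drag acceleration magnitude, with illustrative reference values rather than those used in the study:

    ```python
    import math

    def density_exponential(h_km, rho0=3.614e-13, h0_km=700.0, H_km=88.667):
        """Spherical exponential atmosphere: rho = rho0 * exp(-(h - h0)/H).
        Reference density rho0 (kg/m^3), base altitude h0 and scale height H (km)
        are illustrative values for the ~700 km regime."""
        return rho0 * math.exp(-(h_km - h0_km) / H_km)

    def drag_acceleration(rho, v_mps, cd=2.2, area_m2=1.0, mass_kg=100.0):
        """Magnitude of the drag acceleration: 0.5 * Cd * (A/m) * rho * v^2."""
        return 0.5 * cd * (area_m2 / mass_kg) * rho * v_mps ** 2
    ```

    In the analytic technique, a density model of this form is what permits closed-form evaluation of the drag terms in the Lagrange planetary equations, avoiding numerical integration entirely.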

  18. Finding Waldo: Learning about Users from their Interactions.

    PubMed

    Brown, Eli T; Ottley, Alvitta; Zhao, Helen; Quan Lin; Souvenir, Richard; Endert, Alex; Chang, Remco

    2014-12-01

    Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user's interactions with a system reflect a large amount of the user's reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user's task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and apply well-known machine learning algorithms to three encodings of the users' interaction data. We achieve, depending on algorithm and encoding, between 62% and 83% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user's personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time: in one case 95% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
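
    As a hedged illustration of the approach described above, a well-known machine learning algorithm (here a simple k-nearest-neighbors classifier, one of many that could be applied) can label a user as fast or slow from an encoding of interaction data. The features and numbers below are hypothetical, not the paper's actual encodings:

    ```python
    import math

    def knn_predict(train, labels, query, k=3):
        """Classify an interaction-feature vector by majority vote over the
        k nearest training vectors (Euclidean distance)."""
        dists = sorted((math.dist(x, query), y) for x, y in zip(train, labels))
        votes = [y for _, y in dists[:k]]
        return max(set(votes), key=votes.count)

    # Hypothetical per-user features: (clicks per minute, mean pause in seconds)
    train = [(30, 1.2), (28, 1.5), (26, 1.1), (8, 6.0), (10, 5.2), (12, 4.8)]
    labels = ["fast", "fast", "fast", "slow", "slow", "slow"]
    print(knn_predict(train, labels, (27, 1.3)))  # -> "fast"
    ```

    The study itself compared several algorithms and three interaction-data encodings; this sketch only shows the general shape of mapping interaction features to a performance label.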

  19. Models for randomly distributed nanoscopic domains on spherical vesicles

    NASA Astrophysics Data System (ADS)

    Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John

    2018-06-01

    The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size—a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods capable of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescence techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from commonly studied phase-separated vesicles used in a wide range of biophysical studies.
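
    For the Monte Carlo treatment mentioned above, a minimal sketch is to rejection-sample centers of non-overlapping circular domains on a unit sphere: two domains of angular radius θ overlap when the great-circle angle between their centers falls below 2θ. This illustrative implementation assumes hard-disk exclusion and monodisperse domains, and is not the authors' code:

    ```python
    import math
    import random

    def random_unit_vector(rng):
        """Uniform point on the unit sphere (normalized Gaussian triple)."""
        v = [rng.gauss(0.0, 1.0) for _ in range(3)]
        norm = math.sqrt(sum(c * c for c in v))
        return [c / norm for c in v]

    def great_circle_angle(p, q):
        """Angle between two unit vectors, clamped for float safety."""
        return math.acos(max(-1.0, min(1.0, sum(a * b for a, b in zip(p, q)))))

    def place_domains(n, angular_radius, rng=None, max_tries=10000):
        """Rejection-sample centers of n non-overlapping circular domains on a
        unit sphere: a candidate is accepted when its great-circle angle to
        every accepted center is at least twice the angular radius."""
        rng = rng or random.Random(0)
        centers = []
        tries = 0
        while len(centers) < n and tries < max_tries:
            tries += 1
            c = random_unit_vector(rng)
            if all(great_circle_angle(c, p) >= 2 * angular_radius for p in centers):
                centers.append(c)
        return centers

    centers = place_domains(5, 0.2)
    ```

    Averaging pair distances over many such configurations yields the interdomain pair distribution function that the analytical PY-like treatment approximates.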

  20. Advanced Elemental and Isotopic Characterization of Atmospheric Aerosols

    NASA Astrophysics Data System (ADS)

    Shafer, M. M.; Schauer, J. J.; Park, J.

    2001-12-01

    Recent sampling and analytical developments advanced by the project team enable the detailed elemental and isotopic fingerprinting of extremely small masses of atmospheric aerosols. Historically, this type of characterization was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. However, with the introduction of 3rd and 4th generation ICP-MS instrumentation and the application of state-of-the-art "clean-techniques", quantitative analysis of over 40 elements in sub-milligram samples can be realized. When coupled with an efficient and validated solubilization method, ICP-MS approaches provide distinct advantages in comparison with traditional methods; greatly enhanced detection limits, improved accuracy, and isotope resolution capability, to name a few. Importantly, the ICP-MS approach can readily be integrated with techniques which enable phase differentiation and chemical speciation information to be acquired. For example, selective chemical leaching can provide data on the association of metals with major phase-components, and oxidation state of certain metals. Critical information on metal-ligand stability can be obtained when electrochemical techniques, such as adsorptive cathodic stripping voltammetry (ACSV), are applied to these same extracts. Our research group is applying these techniques in a broad range of research projects to better understand the sources and distribution of trace metals in particulate matter in the atmosphere. Using examples from our research, including recent Pb and Sr isotope ratio work on Asian aerosols, we will illustrate the capabilities and applications of these new methods.

  1. DGT Passive Sampling for Quantitative in Situ Measurements of Compounds from Household and Personal Care Products in Waters.

    PubMed

    Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C

    2017-11-21

    Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems means reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative and time-weighted average (TWA) measurement of these chemicals in waters. The novel technique is developed for HPCPs, including preservatives, antioxidants and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of binding gels in acetonitrile gave good and consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5-9.5), ionic strength (0.001-0.1 M) and dissolved organic matter (0-20 mg L-1), making it suitable for applications across a wide range of environments. Deployment time and diffusion layer thickness dependence experiments confirmed that the chemical masses accumulated by DGT are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant. Results were compared with conventional grab-sampling and 24-h-composited samples from autosamplers. DGT provided TWA concentrations over up to 18 days of deployment, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated the advantages of the DGT technique: it gives in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
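
    The TWA concentration reported by a DGT sampler follows from the standard DGT equation C = M·Δg/(D·A·t), where M is the analyte mass accumulated on the binding gel, Δg the diffusion-layer thickness, D the analyte's diffusion coefficient in the gel, A the exposure area, and t the deployment time. A minimal sketch with illustrative numbers (not values from the study):

    ```python
    def dgt_twa_concentration(mass_ng, delta_g_cm, d_cm2_s, area_cm2, time_s):
        """Time-weighted average concentration from a DGT deployment:
        C = M * dg / (D * A * t), returned in ng/cm^3 (equivalently ng/mL)
        for the units given in the argument names."""
        return (mass_ng * delta_g_cm) / (d_cm2_s * area_cm2 * time_s)

    # Illustrative deployment: 50 ng accumulated, 0.094 cm diffusion layer,
    # D = 5e-6 cm^2/s, 3.14 cm^2 window, 14-day deployment
    c = dgt_twa_concentration(50.0, 0.094, 5e-6, 3.14, 14 * 24 * 3600)
    ```

    Because the accumulated mass integrates uptake over the whole deployment, the computed concentration is inherently time-weighted, which is what distinguishes DGT from grab sampling.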

  2. Chiral Separations

    NASA Astrophysics Data System (ADS)

    Stalcup, A. M.

    2010-07-01

    The main goal of this review is to provide a brief overview of chiral separations to researchers who are versed in the area of analytical separations but unfamiliar with chiral separations. To researchers who are not familiar with this area, there is currently a bewildering array of commercially available chiral columns, chiral derivatizing reagents, and chiral selectors for approaches that span the range of analytical separation platforms (e.g., high-performance liquid chromatography, gas chromatography, supercritical-fluid chromatography, and capillary electrophoresis). This review begins with a brief discussion of chirality before examining the general strategies and commonalities among all of the chiral separation techniques. Rather than exhaustively listing all the chiral selectors and applications, this review highlights significant issues and differences between chiral and achiral separations, providing salient examples from specific classes of chiral selectors where appropriate.

  3. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-05-10

    Smelling is one of the five senses and plays an important role in our everyday lives. Volatile compounds are, for example, characteristic of foods, where some of them can be perceived by humans because of their aroma. They have a great influence on the decision making of consumers when they choose whether or not to use a product. In the case where a product has an offensive and strong aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aroma of their food products be characterized by analytical means to provide a basis for further optimization processes. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product, leading to the development of new products that are more acceptable to consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although simple, low-cost technology is an attractive research target in this domain, most of the data are generated with very high-resolution analytical instruments. Such data, gathered with different analytical instruments, normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we have addressed not only the question of the application of chemometrics for aroma analysis but also the use of different analytical instruments in this field, highlighting the research needed for future focus.

  4. Leidenfrost Phenomenon-assisted Thermal Desorption (LPTD) and Its Application to Open Ion Sources at Atmospheric Pressure Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Saha, Subhrakanti; Chen, Lee Chuin; Mandal, Mridul Kanti; Hiraoka, Kenzo

    2013-03-01

    This work describes the development and application of a new thermal desorption technique that makes use of the Leidenfrost phenomenon in open ion sources at atmospheric pressure for direct mass spectrometric detection of ultratrace levels of illicit, therapeutic, and stimulant drugs, toxicants, and peptides (molecular weight above 1 kDa) in their unaltered state from complex real world samples without or with minor sample pretreatment. A low temperature dielectric barrier discharge ion source was used throughout the experiments and the analytical figures of merit of this technique were investigated. Further, this desorption technique coupled with other ionization sources such as electrospray ionization (ESI) and dc corona discharge atmospheric pressure chemical ionization (APCI) in open atmosphere was also investigated. The use of the high-resolution 'Exactive Orbitrap' mass spectrometer provided unambiguous identification of trace levels of the targeted compounds from complex mixtures and background noise; the limits of detection for various small organic molecules and peptides treated with this technique were at the level of parts per trillion and 10⁻⁹ M, respectively. The high sensitivity of the present technique is attributed to the spontaneous enrichment of analyte molecules during the slow evaporation of the solvent, as well as to the sequential desorption of molecules from complex mixtures based on their volatilities. This newly developed desorption technique is simple and fast, while molecular ions are observed as the major ions.

  5. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Treesearch

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by the incompatibility, among studies and across sites, of the analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure the chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  6. Informatics for Metabolomics.

    PubMed

    Kusonmano, Kanthida; Vongsangnak, Wanwipa; Chumnanpuen, Pramote

    2016-01-01

    Metabolome profiling of biological systems can provide a biological understanding of their metabolic functional states in response to environmental factors or other perturbations. Large volumes of metabolomics data have accumulated since the pre-metabolomics era, driven largely by high-throughput analytical techniques, especially mass spectrometry (MS)- and nuclear magnetic resonance (NMR)-based techniques. In parallel, a significant number of informatics techniques for data processing, statistical analysis, and data mining have been developed. Tools and databases built for the metabolomics community provide useful information such as chemical structures, mass spectral patterns for peak identification, metabolite profiles, biological functions, dynamic metabolite changes, and biochemical transformations of thousands of small molecules. In this chapter, we introduce metabolomics studies from the pre- to the post-metabolomics era and their impact on the field. Focusing on the post-metabolomics era, we present a conceptual framework of informatics techniques for metabolomics and give examples of techniques, tools, and databases for metabolomics data analysis, from preprocessing through functional interpretation. This framework can serve as a scaffold for translational biomedical research, which may in turn reveal new metabolite biomarkers, potential metabolic targets, or key metabolic pathways for future disease therapy.
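
The preprocessing step at the start of such a pipeline can be illustrated with a toy example. The peak table and the normalization choice below are hypothetical, not from the chapter; total-intensity normalization followed by a log transform is simply one common first step before statistical analysis:

```python
import math

# Toy peak table: rows = samples, columns = metabolite peak intensities
# (hypothetical values; real tables come from MS or NMR peak picking).
peak_table = [
    [100.0, 300.0, 600.0],
    [50.0, 150.0, 300.0],  # same profile measured at half total intensity
]

def normalize_and_log(rows):
    """Total-intensity normalization followed by a log10 transform."""
    out = []
    for row in rows:
        total = sum(row)
        out.append([math.log10(v / total) for v in row])
    return out

norm = normalize_and_log(peak_table)
# After normalization the two samples have identical profiles, so a
# downstream statistical comparison would treat them as equivalent.
print(norm[0] == norm[1])
```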

  7. Reduction of multi-dimensional laboratory data to a two-dimensional plot: a novel technique for the identification of laboratory error.

    PubMed

    Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A

    2007-01-01

    The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
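
The reduce-then-inspect idea in this record can be sketched numerically. Generative topographic mapping is not available in common numerical libraries, so the sketch below substitutes PCA (via SVD) as a simpler linear projection; the 14-analyte data and the 3-sigma flagging rule are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 14-analyte patient records (hypothetical data);
# the paper used generative topographic mapping, so PCA here is only a
# simpler linear stand-in for the same reduce-then-inspect idea.
X = rng.normal(size=(500, 14))
X[:3] += 8.0  # inject three grossly anomalous records

# Centre the data and project each record onto the first two principal
# components, turning every 14-dimensional record into an (x, y) point.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
coords = Xc @ Vt[:2].T

# Records that land far from the bulk of the 2-D map are candidate
# anomalies; a crude mean + 3*sd distance threshold suffices here.
dist = np.linalg.norm(coords, axis=1)
flagged = np.flatnonzero(dist > dist.mean() + 3 * dist.std())
print(sorted(flagged.tolist()))
```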

  8. Techniques for hot structures testing

    NASA Technical Reports Server (NTRS)

    Deangelis, V. Michael; Fields, Roger A.

    1990-01-01

    Hot structures testing has been conducted since the early 1960s, beginning with the Mach 6 X-15 airplane. Early hot structures test programs at NASA-Ames-Dryden focused on operational testing required to support the X-15 flight test program, and early hot structures research projects focused on developing laboratory test techniques to simulate flight thermal profiles. More recent efforts involved numerous large and small hot structures test programs that served to develop test methods and measurement techniques and to provide data that promoted the correlation of test data with results from analytical codes. In November 1988 a workshop was sponsored that focused on the correlation of hot structures test data with analysis. This report draws limited material from the workshop and provides more formal documentation of topics that focus on hot structures test techniques used at NASA-Ames-Dryden. Topics covered include the data acquisition and control of testing, the quartz lamp heater systems, current strain and temperature sensors, and hot structures test techniques used to simulate the flight thermal environment in the laboratory.

  9. Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.

    ERIC Educational Resources Information Center

    Hercules, David M.; Hercules, Shirley H.

    1984-01-01

    Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)

  10. Infrared Spectroscopy as a Chemical Fingerprinting Tool

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    2003-01-01

    Infrared (IR) spectroscopy is a powerful analytical tool in the chemical fingerprinting of materials. Any sample material that interacts with infrared light produces a spectrum and, although the technique is normally associated with organic materials, inorganic compounds may also be infrared active. The technique is rapid, reproducible, and usually non-invasive to the sample. Because it is non-invasive, additional characterization of the original material is possible using other analytical techniques, including thermal analysis and Raman spectroscopy. With the appropriate accessories, the technique can be used to examine samples in the liquid, solid, or gas phase. Both aqueous and non-aqueous free-flowing solutions can be analyzed, as can viscous liquids such as heavy oils and greases. Solid samples of varying sizes and shapes may also be examined, and with the addition of microscopic IR (microspectroscopy) capabilities, minute materials such as single fibers and threads may be analyzed. With the addition of appropriate software, microspectroscopy can be used for automated discrete-point or compositional surface-area mapping, with the latter providing a means to record changes in the chemical composition of a material surface over a defined area. Because it can characterize gaseous samples, IR spectroscopy can also be coupled with thermal processes such as thermogravimetric (TG) analyses to provide both thermal and chemical data in a single run. In this configuration, solids (or liquids) heated in a TG analyzer undergo decomposition, with the evolving gases directed into the IR spectrometer. Thus, information is provided on the thermal properties of a material and the order in which its chemical constituents are broken down during incremental heating. Specific examples of these varied applications will be cited, with data interpretation and method limitations further discussed.

  11. An Undergraduate Experiment for the Measurement of Perfluorinated Surfactants in Fish Liver by Liquid Chromatography-Tandem Mass Spectrometry

    ERIC Educational Resources Information Center

    Stock, Naomi L.; Martin, Jonathan W.; Ye, Yun; Mabury, Scott A.

    2007-01-01

    A laboratory experiment that provides students a hands-on introduction to the specific techniques of liquid chromatography-tandem mass spectrometry (LC-MS/MS) and electrospray ionization is presented. The students can thus practice the analytical principles of sample extraction, detection, quantification, and quality control using a fresh fish…

  12. Rich Analysis and Rational Models: Inferring Individual Behavior from Infant Looking Data

    ERIC Educational Resources Information Center

    Piantadosi, Steven T.; Kidd, Celeste; Aslin, Richard

    2014-01-01

    Studies of infant looking times over the past 50 years have provided profound insights about cognitive development, but their dependent measures and analytic techniques are quite limited. In the context of infants' attention to discrete sequential events, we show how a Bayesian data analysis approach can be combined with a rational cognitive…

  13. Using Word Clouds for Fast, Formative Assessment of Students' Short Written Responses

    ERIC Educational Resources Information Center

    Brooks, Bill J.; Gilbuena, Debra M.; Krause, Stephen J.; Koretsky, Milo D.

    2014-01-01

    Active learning in class helps students develop deeper understanding of chemical engineering principles. While the use of multiple-choice ConcepTests is clearly effective, we advocate for including student writing in learning activities as well. In this article, we demonstrate that word clouds can provide a quick analytical technique to assess…
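
The counting step behind such a word cloud is straightforward. The responses, prompt, and stop-word list below are invented for illustration; word-cloud software then sizes each word by its count:

```python
import re
from collections import Counter

# Hypothetical short written responses to a ConcepTest-style prompt
# (invented for illustration).
responses = [
    "Pressure drops because the velocity increases",
    "The velocity increases so the pressure must drop",
    "Bernoulli: higher velocity means lower pressure",
]

# Common words that would clutter the cloud.
STOPWORDS = {"the", "because", "so", "must", "means"}

counts = Counter(
    word
    for response in responses
    for word in re.findall(r"[a-z]+", response.lower())
    if word not in STOPWORDS
)

# Listing the dominant terms gives the instructor the same
# at-a-glance signal the rendered cloud would.
for word, n in counts.most_common(3):
    print(word, n)
```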

  14. Users guide for EASI graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sasser, D.W.

    1978-03-01

    EASI (Estimate of Adversary Sequence Interruption) is an analytical technique for measuring the effectiveness of physical protection systems. EASI Graphics is a computer graphics extension of EASI which provides a capability for performing sensitivity and trade-off analyses of the parameters of a physical protection system. This document reports on the implementation of EASI Graphics and illustrates its application with some examples.
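
The flavor of an EASI-style sensitivity analysis can be sketched in a few lines. This is a simplified interruption model written for illustration, not Sandia's EASI code, and every number in it is hypothetical:

```python
import math

# Simplified sketch of the EASI idea: the probability that an adversary
# is interrupted along a path is the sum, over detection points, of the
# chance that the FIRST detection occurs there times the chance that
# guards arrive before the remaining task delay runs out.
P_DETECT = [0.5, 0.7, 0.9]         # sensor detection probabilities
DELAY_AFTER = [120.0, 60.0, 20.0]  # seconds of task time left after each sensor
RT_MEAN, RT_SD = 90.0, 30.0        # guard response time (normal model)

def p_response_in_time(remaining):
    """P(response time < remaining delay) under the normal model."""
    z = (remaining - RT_MEAN) / RT_SD
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def p_interrupt(p_detect, delay_after):
    p_no_prior = 1.0  # probability no earlier sensor has detected
    total = 0.0
    for p_d, remaining in zip(p_detect, delay_after):
        total += p_no_prior * p_d * p_response_in_time(remaining)
        p_no_prior *= 1.0 - p_d
    return total

base = p_interrupt(P_DETECT, DELAY_AFTER)

# Trade-off question in the spirit of EASI Graphics: how much does
# P(interruption) improve if the first (earliest) sensor is upgraded?
upgraded = p_interrupt([0.9, 0.7, 0.9], DELAY_AFTER)
print(round(base, 3), round(upgraded, 3))
```

Upgrading the earliest sensor helps most because detections there leave the guards the longest remaining delay.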

  15. A modal parameter extraction procedure applicable to linear time-invariant dynamic systems

    NASA Technical Reports Server (NTRS)

    Kurdila, A. J.; Craig, R. R., Jr.

    1985-01-01

    Modal analysis has emerged as a valuable tool in many phases of the engineering design process. Complex vibration and acoustic problems in new designs can often be remedied through use of the method. Moreover, the technique has been used to enhance the conceptual understanding of structures by serving to verify analytical models. A new modal parameter estimation procedure is presented. The technique is applicable to linear, time-invariant systems and accommodates multiple input excitations. In order to provide a background for the derivation of the method, some modal parameter extraction procedures currently in use are described. Key features implemented in the new technique are elaborated upon.

  16. 3-MCPD in food other than soy sauce or hydrolysed vegetable protein (HVP).

    PubMed

    Baer, Ines; de la Calle, Beatriz; Taylor, Philip

    2010-01-01

    This review gives an overview of current knowledge about 3-monochloropropane-1,2-diol (3-MCPD) formation and detection. Although 3-MCPD is often mentioned with regard to soy sauce and acid-hydrolysed vegetable protein (HVP), and much research has been done in that area, the emphasis here is placed on other foods. This contaminant can be found in a great variety of foodstuffs and is difficult to avoid in our daily nutrition. Despite its low concentration in most foods, its carcinogenic properties are of general concern. Its formation is a multivariate problem influenced by factors such as heat, moisture and sugar/lipid content, depending on the type of food and respective processing employed. Understanding the formation of this contaminant in food is fundamental to not only preventing or reducing it, but also developing efficient analytical methods of detecting it. Considering the differences between 3-MCPD-containing foods, and the need to test for the contaminant at different levels of food processing, one would expect a variety of analytical approaches. In this review, an attempt is made to provide an up-to-date list of available analytical methods and to highlight the differences among these techniques. Finally, the emergence of 3-MCPD esters and analytical techniques for them are also discussed here, although they are not the main focus of this review.

  17. Application of Semi-analytical Satellite Theory orbit propagator to orbit determination for space object catalog maintenance

    NASA Astrophysics Data System (ADS)

    Setty, Srinivas J.; Cefola, Paul J.; Montenbruck, Oliver; Fiedler, Hauke

    2016-05-01

    Catalog maintenance for Space Situational Awareness (SSA) demands an accurate and computationally lean orbit propagation and orbit determination technique to cope with the ever increasing number of observed space objects. As an alternative to established numerical and analytical methods, we investigate the accuracy and computational load of the Draper Semi-analytical Satellite Theory (DSST). The standalone version of the DSST was enhanced with additional perturbation models to improve its recovery of short periodic motion. The accuracy of DSST is, for the first time, compared to a numerical propagator with fidelity force models for a comprehensive grid of low, medium, and high altitude orbits with varying eccentricity and different inclinations. Furthermore, the run-time of both propagators is compared as a function of propagation arc, output step size and gravity field order to assess its performance for a full range of relevant use cases. For use in orbit determination, a robust performance of DSST is demonstrated even in the case of sparse observations, which is most sensitive to mismodeled short periodic perturbations. Overall, DSST is shown to exhibit adequate accuracy at favorable computational speed for the full set of orbits that need to be considered in space surveillance. Along with the inherent benefits of a semi-analytical orbit representation, DSST provides an attractive alternative to the more common numerical orbit propagation techniques.

  18. Laser Ablation-Aerosol Mass Spectrometry-Chemical Ionization Mass Spectrometry for Ambient Surface Imaging

    DOE PAGES

    Berry, Jennifer L.; Day, Douglas A.; Elseberg, Tim; ...

    2018-02-20

    Mass spectrometry imaging is becoming an increasingly common analytical technique due to its ability to provide spatially resolved chemical information. In this paper, we report a novel imaging approach combining laser ablation with two mass spectrometric techniques, aerosol mass spectrometry and chemical ionization mass spectrometry, separately and in parallel. Both mass spectrometric methods provide the fast response, rapid data acquisition, low detection limits, and high-resolution peak separation desirable for imaging complex samples. Additionally, the two techniques provide complementary information, with aerosol mass spectrometry providing near-universal detection of all aerosol molecules and chemical ionization mass spectrometry with a heated inlet providing molecular-level detail of both gases and aerosols. The two techniques operate with atmospheric pressure interfaces and require no matrix addition for ionization, allowing samples to be investigated in their native state under ambient pressure conditions. We demonstrate the ability of laser ablation-aerosol mass spectrometry-chemical ionization mass spectrometry (LA-AMS-CIMS) to create 2D images of both standard compounds and complex mixtures. Finally, the results suggest that LA-AMS-CIMS, particularly when combined with advanced data analysis methods, could have broad applications in mass spectrometry imaging.

  20. Application of surface analytical methods in thin film analysis

    NASA Astrophysics Data System (ADS)

    Wen, Xingu

    Self-assembly and the sol-gel process are two promising methods for the preparation of novel materials and thin films. In this research, these two methods were utilized to prepare two types of thin films: self-assembled monolayers of peptides on gold and SiO2 sol-gel thin films modified with Ru(II) complexes. The properties of the resulting thin films were investigated by several analytical techniques in order to explore their potential applications in biomaterials, chemical sensors, nonlinear optics and catalysis. Among the analytical techniques employed in the study, surface analytical techniques, such as X-ray photoelectron spectroscopy (XPS) and grazing angle reflection absorption Fourier transform infrared spectroscopy (RA-FTIR), are particularly useful in providing information regarding the compositions and structures of the thin films. In the preparation of peptide thin films, monodisperse peptides were self-assembled on a gold substrate via the N-terminus-coupled lipoic acid. The film compositions were investigated by XPS and agreed well with the theoretical values. XPS results also revealed that the surface coverage of the self-assembled films was significantly larger than that of the physisorbed films and that the chemisorption between the peptides and gold surface was stable in solvent. Studies by angle dependent XPS (ADXPS) and grazing angle RA-FTIR indicated that the peptides were on average oriented at a small angle from the surface normal. By using a model of orientation distribution function, both the peptide tilt angle and film thickness can be well calculated. Ru(II) complex doped SiO2 sol-gel thin films were prepared by a low temperature sol-gel process. The ability of XPS coupled with Ar⁺ ion sputtering to provide both chemical and compositional depth profile information of these sol-gel films was evaluated. 
This technique, together with UV-VIS and electrochemical measurements, was used to investigate the stability of Ru complexes in the composite films. The stability of Ru complexes with respect to dopant leaching was dependent on the film microstructures. Three methods aiming to improve the dopant stability were also explored. In addition, the ion exchange properties of the composite films, upon exposure to various ions in aqueous solutions, were investigated by XPS, and the ion exchange mechanism was elucidated.

  1. Fall Velocities of Hydrometeors in the Atmosphere: Refinements to a Continuous Analytical Power Law.

    NASA Astrophysics Data System (ADS)

    Khvorostyanov, Vitaly I.; Curry, Judith A.

    2005-12-01

    This paper extends the previous research of the authors on the unified representation of fall velocities for both liquid and crystalline particles as a power law over the entire size range of hydrometeors observed in the atmosphere. The power-law coefficients are determined as continuous analytical functions of the Best or Reynolds number or of the particle size. Here, analytical expressions are formulated for the turbulent corrections to the Reynolds number and to the power-law coefficients that describe the continuous transition from the laminar to the turbulent flow around a falling particle. A simple analytical expression is found for the correction of fall velocities for temperature and pressure. These expressions and the resulting fall velocities are compared with observations and other calculations for a range of ice crystal habits and sizes. This approach provides a continuous analytical power-law description of the terminal velocities of liquid and crystalline hydrometeors with sufficiently high accuracy and can be directly used in bin-resolving models or incorporated into parameterizations for cloud- and large-scale models and remote sensing techniques.
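
The power-law form at the heart of such a scheme is easy to sketch. The coefficients below are a widely used rain parameterization (Atlas-Ulbrich form, v = 17.67 D_cm^0.67 m/s) with a simple (rho0/rho)^0.5 density correction, NOT the paper's continuous Best-number-based coefficients or its refined turbulent corrections:

```python
import math

# Illustrative power-law fall speed with an air-density correction.
# A, B are the Atlas-Ulbrich rain coefficients (v in m/s, D in cm);
# the (RHO0/rho)**0.5 altitude correction is likewise a common choice,
# standing in for the paper's more refined pressure/temperature correction.
A, B = 17.67, 0.67
RHO0 = 1.225  # reference air density near the surface, kg m^-3

def fall_speed(d_cm, rho_air=RHO0):
    """Terminal fall speed (m/s) for a drop of diameter d_cm."""
    return A * d_cm**B * math.sqrt(RHO0 / rho_air)

# A 2 mm drop near the surface vs. the same drop in thinner air aloft:
v_sfc = fall_speed(0.2)
v_alt = fall_speed(0.2, rho_air=0.7)
print(round(v_sfc, 2), round(v_alt, 2))
```

The density factor reproduces the qualitative behavior discussed above: the same hydrometeor falls faster in the thinner air at altitude.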

  2. In situ ionic liquid dispersive liquid-liquid microextraction and direct microvial insert thermal desorption for gas chromatographic determination of bisphenol compounds.

    PubMed

    Cacho, Juan Ignacio; Campillo, Natalia; Viñas, Pilar; Hernández-Córdoba, Manuel

    2016-01-01

    A new procedure based on direct insert microvial thermal desorption injection allows the direct analysis of ionic liquid extracts by gas chromatography and mass spectrometry (GC-MS). For this purpose, an in situ ionic liquid dispersive liquid-liquid microextraction (in situ IL DLLME) has been developed for the quantification of bisphenol A (BPA), bisphenol Z (BPZ) and bisphenol F (BPF). Different parameters affecting the extraction efficiency of the microextraction technique and the thermal desorption step were studied. The optimized procedure, determining the analytes as acetyl derivatives, provided detection limits of 26, 18 and 19 ng L⁻¹ for BPA, BPZ and BPF, respectively. The release of the three analytes from plastic containers was monitored using this newly developed analytical method. Analysis of the migration test solutions for 15 different plastic containers in daily use identified the presence of the analytes at concentrations ranging between 0.07 and 37 μg L⁻¹ in six of the samples studied, BPA being the most commonly found and at higher concentrations than the other analytes.
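
Detection limits like those quoted above are commonly derived from blank noise and calibration slope (LOD = 3·s_blank/slope). The calibration numbers in this sketch are hypothetical, not the paper's data:

```python
# IUPAC-style detection limit from blank noise and calibration slope;
# the numbers below are invented for illustration.
def limit_of_detection(sd_blank, slope):
    """LOD = 3 * standard deviation of the blank / calibration slope."""
    return 3.0 * sd_blank / slope

# e.g. blank noise of 0.6 area units, slope of 0.07 area units per ng/L:
lod = limit_of_detection(sd_blank=0.6, slope=0.07)
print(round(lod, 1))  # concentration units of the slope, here ng/L
```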

  3. [Amanitine determination as an example of peptide analysis in the biological samples with HPLC-MS technique].

    PubMed

    Janus, Tomasz; Jasionowicz, Ewa; Potocka-Banaś, Barbara; Borowiak, Krzysztof

    Routine toxicological analysis is mostly focused on the identification of inorganic and organic, chemically diverse compounds, generally of low mass, usually not greater than 500–600 Da. Peptide compounds with molecular mass above 900 Da form a specific analytical group. Several dozen of them are highly toxic substances well known in toxicological practice, for example mushroom toxins and animal venoms. In this paper the authors use the example of alpha-amanitin to explain the analytical problems, and different original solutions, in identifying peptides in urine samples with a universal LC-MS/MS procedure. The analyzed material consisted of urine samples collected from patients with suspected mushroom intoxication, routinely diagnosed by amanitin determination. Ultrafiltration with centrifuge filter tubes (mass cutoff 3 kDa) was used. The filtrate was directly injected onto the chromatographic column and analyzed with a mass detector (MS/MS). The separation of peptides, as organic amphoteric compounds, from biological material with the SPE technique is well known but requires dedicated, specific columns. The present paper shows that with the fast and simple ultrafiltration technique amanitin can be effectively isolated from urine, and that the procedure offers satisfactory sensitivity of detection and eliminates the influence of the biological matrix on analytical results. Another problem which had to be solved was the non-characteristic fragmentation of peptides in the MS/MS procedure, which yields non-selective chromatograms. Higher collision energies can be used in the analytical procedure, which produces more characteristic mass spectra, although at lower sensitivity. The ultrafiltration technique as a sample preparation procedure is effective for the isolation of amanitin from the biological matrix. 
    Monitoring a selected mass corresponding to the transition involving loss of a water molecule offers satisfactory sensitivity of determination.

  4. Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"

    NASA Astrophysics Data System (ADS)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    The limited availability of analytical instruments for the methodical detection of known and unknown effluents poses a serious hindrance to qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive and time consuming but also require maintenance and replacement of damaged essential parts, which is a serious concern. Moreover, these instruments are not convenient to install at every site for field studies and instant detection. Therefore, a technique based on the pre-concentration of metal ions, especially from lean streams, is elaborated and justified. Chelation/sequestration is the key to this immobilization technique, which is simple, user friendly, highly effective, least expensive, and time efficient; the 10 g - 20 g vials are easy to carry to the experimental field/site, as has been demonstrated.

  5. Approximate analytical relationships for linear optimal aeroelastic flight control laws

    NASA Astrophysics Data System (ADS)

    Kassem, Ayman Hamdy

    1998-09-01

    This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.
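
The kind of analytical weight-to-eigenvalue relationship described above can be seen exactly in the scalar case. The system below is a minimal invented example, not the dissertation's aeroelastic model: for x' = a x + b u with cost q x² + r u², LQR theory gives the closed-loop pole -sqrt(a² + q b²/r), showing directly how the weight ratio q/r augments the open-loop pole:

```python
import math

# Minimal invented example (not the dissertation's aeroelastic model):
# scalar plant xdot = a*x + b*u with cost integral(q*x**2 + r*u**2).
a, b = 1.0, 2.0  # unstable open-loop pole at +1

def closed_loop_pole(q, r):
    # Stabilizing root of the scalar Riccati equation
    #   2*a*p - (b**2/r)*p**2 + q = 0,
    # then the closed-loop dynamics a - (b**2/r)*p.
    p = r * (a + math.sqrt(a**2 + q * b**2 / r)) / b**2
    return a - (b**2 / r) * p

# Increasing the state weight q drives the pole further left,
# matching the analytical form -sqrt(a**2 + q*b**2/r):
for q in (0.0, 1.0, 4.0):
    print(q, round(closed_loop_pole(q, r=1.0), 3))
```

Note the q = 0 case: the pole lands at the stable mirror image of the open-loop pole, a classic LQR result that such closed-form expressions make transparent.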

  6. Combined sensing platform for advanced diagnostics in exhaled mouse breath

    NASA Astrophysics Data System (ADS)

    Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris

    2013-03-01

    Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for therapeutic progression monitoring, as quantitative compositional analysis of breath can be related to biomarker panels associated with a specific physiological condition invoked by, e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on, e.g., the metabolic state, and since volatile organic constituents (VOCs) in exhaled breath in particular may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb, or even lower, concentration levels. While individual analytical techniques such as mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow the simultaneous analysis of total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee ratio (TTR; via IR), selected VOCs (via IR), and O2 (via luminescence) in exhaled breath, establishing a single diagnostic platform in which both sensors simultaneously interact with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with a novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, and miniaturized fiber-optic luminescence sensors, to establish a multi-constituent breath analysis tool that is ideally compatible with mouse intensive care stations (MICU). 
    Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas chromatography coupled to mass spectrometric detection. Here, we aim at continuously analyzing the TTR via iHWGs and LS flow-through sensors requiring only minute (<1 mL) sample volumes. Furthermore, this study explores non-linearities observed in the calibration functions of 12CO2 and 13CO2, potentially resulting from effects related to optical collision diameters, e.g., in the presence of molecular oxygen. It is anticipated that the simultaneous continuous analysis of oxygen via LS will facilitate the correction of these effects after inclusion within appropriate multivariate calibration models, thus providing more reliable and robust calibration schemes for continuously monitoring relevant breath constituents.

  7. Review of recent advances in analytical techniques for the determination of neurotransmitters

    PubMed Central

    Perry, Maura; Li, Qiang; Kennedy, Robert T.

    2009-01-01

    Methods and advances for monitoring neurotransmitters in vivo or for tissue analysis of neurotransmitters over the last five years are reviewed. The review is organized primarily by neurotransmitter type. Transmitters and related compounds may be monitored either by in vivo sampling coupled to analytical methods or by implanted sensors. Sampling is primarily performed using microdialysis, but low-flow push-pull perfusion may offer advantages of spatial resolution while minimizing the tissue disruption associated with higher flow rates. Analytical techniques coupled to these sampling methods include liquid chromatography, capillary electrophoresis, enzyme assays, sensors, and mass spectrometry. Methods for the detection of amino acid, monoamine, neuropeptide, acetylcholine, nucleoside, and soluble gas neurotransmitters have been developed and improved upon. Advances in the speed and sensitivity of these methods have enabled improvements in temporal resolution and increased the number of compounds detectable. Similar advances have enabled improved detection in tissue samples, with a substantial emphasis on single cells and other small samples. Sensors provide excellent temporal and spatial resolution for in vivo monitoring. Advances in application to catecholamines, indoleamines, and amino acids have been prominent. Improvements in the stability, sensitivity, and selectivity of the sensors have been of paramount interest. PMID:19800472

  8. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    NASA Astrophysics Data System (ADS)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  9. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.

  10. Enantiospecific Detection of Chiral Nanosamples Using Photoinduced Force

    NASA Astrophysics Data System (ADS)

    Kamandi, Mohammad; Albooyeh, Mohammad; Guclu, Caner; Veysi, Mehdi; Zeng, Jinwei; Wickramasinghe, Kumar; Capolino, Filippo

    2017-12-01

We propose a high-resolution microscopy technique for enantiospecific detection of chiral samples down to sub-100-nm size based on force measurement. We delve into the differential photoinduced optical force ΔF exerted on an achiral probe in the vicinity of a chiral sample when left and right circularly polarized beams separately excite the sample-probe interactive system. We analytically prove that ΔF is entangled with the enantiomer type of the sample enabling enantiospecific detection of chiral inclusions. Moreover, we demonstrate that ΔF is linearly dependent on both the chiral response of the sample and the electric response of the tip and is inversely related to the quartic power of probe-sample distance. We provide physical insight into the transfer of optical activity from the chiral sample to the achiral tip based on a rigorous analytical approach. We support our theoretical achievements by several numerical examples highlighting the potential application of the derived analytic properties. Lastly, we demonstrate the sensitivity of our method to enantiospecify nanoscale chiral samples with chirality parameter on the order of 0.01 and discuss how the sensitivity of our proposed technique can be further improved.
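The quoted distance and chirality dependence can be sketched numerically. The prefactor, distances, and chirality value below are placeholders for illustration only, not quantities from the paper:

```python
# Illustrative scaling of the differential photoinduced force described above:
# Delta F ~ (sample chirality kappa) * (tip electric response alpha) / d**4,
# where d is the probe-sample distance.
def delta_force(kappa, alpha_tip, d, prefactor=1.0):
    """Differential force under the quoted scaling law (arbitrary units)."""
    return prefactor * kappa * alpha_tip / d**4

f_far = delta_force(kappa=0.01, alpha_tip=1.0, d=10e-9)
f_near = delta_force(kappa=0.01, alpha_tip=1.0, d=5e-9)
ratio = f_near / f_far  # halving d scales the force by 2**4 = 16
f_mirror = delta_force(kappa=-0.01, alpha_tip=1.0, d=10e-9)  # opposite enantiomer
# f_mirror == -f_far: the sign of the differential force encodes the enantiomer type.
```

The sign flip under kappa -> -kappa is what makes the measurement enantiospecific in this picture.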

  11. Sensor Data Qualification System (SDQS) Implementation Study

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Melcher, Kevin; Fulton, Christopher; Maul, William

    2009-01-01

The Sensor Data Qualification System (SDQS) is being developed to provide a sensor fault detection capability for NASA's next-generation launch vehicles. In addition to traditional data qualification techniques (such as limit checks, rate-of-change checks and hardware redundancy checks), SDQS can provide augmented capability through additional techniques that exploit analytical redundancy relationships to enable faster and more sensitive sensor fault detection. This paper documents the results of a study that was conducted to determine the best approach for implementing a SDQS network configuration that spans multiple subsystems, similar to those that may be implemented on future vehicles. The best approach is defined as one that most minimizes computational resource requirements without impacting the detection of sensor failures.
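A minimal sketch of how the traditional checks and an analytical-redundancy check might be combined is given below. The thresholds, the pass/fail labels, and the model relation supplying `model_estimate` are all hypothetical illustrations, not SDQS's actual design:

```python
def qualify_sample(value, prev_value, dt, limits, max_rate,
                   model_estimate, residual_tol):
    """Flag a sensor sample using limit, rate, and analytical-redundancy checks.

    `model_estimate` stands in for the value predicted from other sensors via
    a known physical relationship (the analytical redundancy); all thresholds
    are hypothetical tuning parameters.
    """
    lo, hi = limits
    if not (lo <= value <= hi):
        return "fail:limit"            # traditional limit check
    if abs(value - prev_value) / dt > max_rate:
        return "fail:rate"             # traditional rate-of-change check
    if abs(value - model_estimate) > residual_tol:
        return "fail:residual"         # analytical-redundancy residual check
    return "pass"

status = qualify_sample(value=101.0, prev_value=100.5, dt=0.1,
                        limits=(0.0, 200.0), max_rate=50.0,
                        model_estimate=100.8, residual_tol=2.0)
# -> "pass"
```

The residual check is what gives analytical redundancy its sensitivity: a drifting sensor can stay inside its limits and rate bounds yet diverge from the physics-based estimate.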

  12. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

Autonomous software holds the promise of new operation possibilities, easier design and development and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing a better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques and concrete experiments at NASA.

  13. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on-board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing a better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  14. Analytical technique characterizes all trace contaminants in water

    NASA Technical Reports Server (NTRS)

    Foster, J. N.; Lysyj, I.; Nelson, K. H.

    1967-01-01

    Properly programmed combination of advanced chemical and physical analytical techniques characterize critically all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the source of water pollution.

  15. Separation techniques for the clean-up of radioactive mixed waste for ICP-AES/ICP-MS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swafford, A.M.; Keller, J.M.

    1993-03-17

Two separation techniques were investigated for the clean-up of typical radioactive mixed waste samples requiring elemental analysis by Inductively Coupled Plasma-Atomic Emission Spectroscopy (ICP-AES) or Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). These measurements frequently involve regulatory or compliance criteria which include the determination of elements on the EPA Target Analyte List (TAL). These samples usually consist of both an aqueous phase and a solid phase which is mostly an inorganic sludge. Frequently, samples taken from the waste tanks contain high levels of uranium and thorium which can cause spectral interferences in ICP-AES or ICP-MS analysis. The removal of these interferences is necessary to determine the presence of the EPA TAL elements in the sample. Two clean-up methods were studied on simulated aqueous waste samples containing the EPA TAL elements. The first method studied was a classical procedure based upon liquid-liquid extraction using tri-n-octylphosphine oxide (TOPO) dissolved in cyclohexane. The second method investigated was based on more recently developed techniques using extraction chromatography; specifically the use of a commercially available Eichrom TRU·Spec™ column. Literature on these two methods indicates the efficient removal of uranium and thorium from properly prepared samples and provides considerable qualitative information on the extraction behavior of many other elements. However, there is a lack of quantitative data on the extraction behavior of elements on the EPA Target Analyte List. Experimental studies on these two methods consisted of determining whether any of the analytes were extracted by these methods and the recoveries obtained. Both methods produced similar results; the EPA target analytes were only slightly or not extracted. Advantages and disadvantages of each method were evaluated and found to be comparable.

  16. Analytical Chemistry Laboratory. Progress report for FY 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.

  17. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  18. Direct Analysis of Samples of Various Origin and Composition Using Specific Types of Mass Spectrometry.

    PubMed

    Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek

    2017-07-04

One of the major sources of error that occur during chemical analysis utilizing the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage; and in this review, we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about the measurement techniques using mass spectrometry which allow direct sample analysis without sample preparation, or with only limited pre-concentration steps. MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques are presented. The trends in development of direct analysis using the aforementioned techniques are also presented.

  19. Stratified Shear Flows In Pipe Geometries

    NASA Astrophysics Data System (ADS)

    Harabin, George; Camassa, Roberto; McLaughlin, Richard; UNC Joint Fluids Lab Team Team

    2015-11-01

    Exact and series solutions to the full Navier-Stokes equations coupled to the advection diffusion equation are investigated in tilted three-dimensional pipe geometries. Analytic techniques for studying the three-dimensional problem provide a means for tackling interesting questions such as the optimal domain for mass transport, and provide new avenues for experimental investigation of diffusion driven flows. Both static and time dependent solutions will be discussed. NSF RTG DMS-0943851, NSF RTG ARC-1025523, NSF DMS-1009750.

  20. An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach

    DTIC Science & Technology

    2012-08-01

fusion. Therefore, we provide a detailed discussion on uncertain data types, their origins and three uncertainty processing formalisms that are popular... suitable membership functions corresponding to the fuzzy sets. The DS belief theory, originally proposed by Dempster, can be thought of as... originated and various imperfections of the source. Uncertainty handling formalisms provide techniques for modeling and working with these uncertain data types

  1. Common aspects influencing the translocation of SERS to Biomedicine.

    PubMed

    Gil, Pilar Rivera; Tsouts, Dionysia; Sanles-Sobrido, Marcos; Cabo, Andreu

    2018-01-04

In this review, we introduce the reader to the analytical technique of surface-enhanced Raman scattering, motivated by the great potential we believe this technique has in biomedicine. We present the advantages and limitations of this technique relevant for bioanalysis in vitro and in vivo, and how this technique goes beyond the state of the art of traditional analytical, labelling and healthcare diagnosis technologies.

  2. Process analytical technologies (PAT) in freeze-drying of parenteral products.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael

    2009-01-01

Quality by Design (QbD) aims at assuring quality by proper design and control, utilizing appropriate Process Analytical Technologies (PAT) to monitor critical process parameters during processing to ensure that the product meets the desired quality attributes. This review provides a comprehensive list of process monitoring devices that can be used to monitor critical process parameters and will focus on a critical review of the viability of the PAT schemes proposed. R&D needs in PAT for freeze-drying have also been addressed with particular emphasis on batch techniques that can be used on all the dryers independent of the dryer scale.

  3. Analytical investigation of thermal barrier coatings for advanced power generation combustion turbines

    NASA Technical Reports Server (NTRS)

    Amos, D. J.

    1977-01-01

    An analytical evaluation was conducted to determine quantitatively the improvement potential in cycle efficiency and cost of electricity made possible by the introduction of thermal barrier coatings to power generation combustion turbine systems. The thermal barrier system, a metallic bond coat and yttria stabilized zirconia outer layer applied by plasma spray techniques, acts as a heat insulator to provide substantial metal temperature reductions below that of the exposed thermal barrier surface. The study results show the thermal barrier to be a potentially attractive means for improving performance and reducing cost of electricity for the simple, recuperated, and combined cycles evaluated.

  4. Monitoring by forward scatter radar techniques: an improved second-order analytical model

    NASA Astrophysics Data System (ADS)

    Falconi, Marta Tecla; Comite, Davide; Galli, Alessandro; Marzano, Frank S.; Pastina, Debora; Lombardo, Pierfrancesco

    2017-10-01

    In this work, a second-order phase approximation is introduced to provide an improved analytical model of the signal received in forward scatter radar systems. A typical configuration with a rectangular metallic object illuminated while crossing the baseline, in far- or near-field conditions, is considered. An improved second-order model is compared with a simplified one already proposed by the authors and based on a paraxial approximation. A phase error analysis is carried out to investigate benefits and limitations of the second-order modeling. The results are validated by developing full-wave numerical simulations implementing the relevant scattering problem on a commercial tool.
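The improvement of a second-order phase model over a paraxial one can be illustrated with the Taylor expansion of the exact free-space propagation phase; the geometry and wavelength below are arbitrary, and the formulas are the generic expansion rather than the authors' full forward scatter model:

```python
import math

def exact_phase(k, d, x):
    """Exact propagation phase for transverse offset x at range d."""
    return k * math.sqrt(d**2 + x**2)

def paraxial_phase(k, d, x):
    """First-order (paraxial) approximation: k*(d + x^2/(2d))."""
    return k * (d + x**2 / (2 * d))

def second_order_phase(k, d, x):
    """Second-order approximation with the next Taylor term -x^4/(8 d^3)."""
    return k * (d + x**2 / (2 * d) - x**4 / (8 * d**3))

k = 2 * math.pi / 0.03   # wavenumber for a 3 cm wavelength, purely illustrative
d, x = 100.0, 5.0        # range and transverse offset in meters, illustrative
err1 = abs(paraxial_phase(k, d, x) - exact_phase(k, d, x))
err2 = abs(second_order_phase(k, d, x) - exact_phase(k, d, x))
# Including the second-order term reduces the phase error: err2 < err1.
```

The same kind of phase-error comparison is what motivates the paper's analysis of when the paraxial model suffices and when the second-order correction matters.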

  5. Exact solutions to the fermion propagator Schwinger-Dyson equation in Minkowski space with on-shell renormalization for quenched QED

    DOE PAGES

    Jia, Shaoyang; Pennington, M. R.

    2017-08-01

With the introduction of a spectral representation, the Schwinger-Dyson equation (SDE) for the fermion propagator is formulated in Minkowski space in QED. After imposing the on-shell renormalization conditions, analytic solutions for the fermion propagator spectral functions are obtained in four dimensions with a renormalizable version of the Gauge Technique ansatz for the fermion-photon vertex in the quenched approximation in the Landau gauge. Despite the limitations of this model, having an explicit solution provides a guiding example of the fermion propagator with the correct analytic structure. The Padé approximation for the spectral functions is also investigated.

  6. Exact solutions to the fermion propagator Schwinger-Dyson equation in Minkowski space with on-shell renormalization for quenched QED

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, Shaoyang; Pennington, M. R.

With the introduction of a spectral representation, the Schwinger-Dyson equation (SDE) for the fermion propagator is formulated in Minkowski space in QED. After imposing the on-shell renormalization conditions, analytic solutions for the fermion propagator spectral functions are obtained in four dimensions with a renormalizable version of the Gauge Technique ansatz for the fermion-photon vertex in the quenched approximation in the Landau gauge. Despite the limitations of this model, having an explicit solution provides a guiding example of the fermion propagator with the correct analytic structure. The Padé approximation for the spectral functions is also investigated.

  7. Ku-band signal design study. [space shuttle orbiter data processing network

    NASA Technical Reports Server (NTRS)

    Rubin, I.

    1978-01-01

Analytical tools, methods and techniques for assessing the design and performance of the space shuttle orbiter data processing system (DPS) are provided. The computer data processing network is evaluated in the key areas of queueing behavior, synchronization, and network reliability. The structure of the data processing network is described as well as the system operation principles and the network configuration. The characteristics of the computer systems are indicated. System reliability measures are defined and studied. System and network invulnerability measures are computed. Communication path and network failure analysis techniques are included.

  8. Cache-based error recovery for shared memory multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Wu, Kun-Lung; Fuchs, W. Kent; Patel, Janak H.

    1989-01-01

A multiprocessor cache-based checkpointing and recovery scheme for recovering from transient processor errors in a shared-memory multiprocessor with private caches is presented. New implementation techniques that use checkpoint identifiers and recovery stacks to reduce performance degradation in processor utilization during normal execution are examined. This cache-based checkpointing technique prevents rollback propagation, provides for rapid recovery, and can be integrated into standard cache coherence protocols. An analytical model is used to estimate the relative performance of the scheme during normal execution. Extensions that take error latency into account are presented.

  9. Mass spectrometry techniques in the survey of steroid metabolites as potential disease biomarkers: a review.

    PubMed

    Gouveia, Maria João; Brindley, Paul J; Santos, Lúcio Lara; Correia da Costa, José Manuel; Gomes, Paula; Vale, Nuno

    2013-09-01

Mass spectrometric approaches have been fundamental to the identification of metabolites associated with steroid hormones, yet this topic has not been reviewed in depth in recent years. To this end, and given the increasing relevance of liquid chromatography-mass spectrometry (LC-MS) studies on steroid hormones and their metabolites, the present review addresses this subject. This review provides a timely summary of the use of various mass spectrometry-based analytical techniques during the evaluation of steroidal biomarkers in a range of human disease settings. The sensitivity and specificity of these technologies are clearly providing valuable new insights into breast cancer and cardiovascular disease. We aim to contribute to an enhanced understanding of steroid metabolism and how it can be profiled by LC-MS techniques.

  10. Analytical methods manual for the Mineral Resource Surveys Program, U.S. Geological Survey

    USGS Publications Warehouse

    Arbogast, Belinda F.

    1996-01-01

The analytical methods validated by the Mineral Resource Surveys Program, Geologic Division, are the subject of this manual. This edition replaces the methods portion of Open-File Report 90-668 published in 1990. Newer methods may be used which have been approved by the quality assurance (QA) project and are on file with the QA coordinator. This manual is intended primarily for use by laboratory scientists; this manual can also assist laboratory users to evaluate the data they receive. The analytical methods are written in a step by step approach so that they may be used as a training tool and provide detailed documentation of the procedures for quality assurance. A "Catalog of Services" is available for customer (submitter) use with brief listings of: the element(s)/species determined, method of determination, reference to cite, contact person, summary of the technique, and analyte concentration range. For a copy please contact the Branch office at (303) 236-1800 or fax (303) 236-3200.

  11. Characteristics, Properties and Analytical Methods of Amoxicillin: A Review with Green Approach.

    PubMed

    de Marco, Bianca Aparecida; Natori, Jéssica Sayuri Hisano; Fanelli, Stefany; Tótoli, Eliane Gandolpho; Salgado, Hérida Regina Nunes

    2017-05-04

Bacterial infections are the second leading cause of global mortality. Considering this fact, it is extremely important to study antimicrobial agents. Amoxicillin is an antimicrobial agent that belongs to the class of penicillins; it has bactericidal activity and is widely used in the Brazilian health system. In the literature, some analytical methods are found for the identification and quantification of this penicillin, which are essential for its quality control, ensuring maintenance of the product characteristics, therapeutic efficacy and patient safety. Thus, this study presents a brief literature review on amoxicillin and the analytical methods developed for the analysis of this drug in official and scientific papers. The major analytical methods found were high-performance liquid chromatography (HPLC), ultra-performance liquid chromatography (U-HPLC), capillary electrophoresis, iodometry and diffuse reflectance infrared Fourier transform. It is essential to note that most of the developed methods used toxic and hazardous solvents, making it necessary for industry and researchers to develop environmentally friendly techniques that provide enhanced benefits to the environment and staff.

  12. Rapid ultrasensitive single particle surface-enhanced Raman spectroscopy using metallic nanopores.

    PubMed

    Cecchini, Michael P; Wiener, Aeneas; Turek, Vladimir A; Chon, Hyangh; Lee, Sangyeop; Ivanov, Aleksandar P; McComb, David W; Choo, Jaebum; Albrecht, Tim; Maier, Stefan A; Edel, Joshua B

    2013-10-09

    Nanopore sensors embedded within thin dielectric membranes have been gaining significant interest due to their single molecule sensitivity and compatibility of detecting a large range of analytes, from DNA and proteins, to small molecules and particles. Building on this concept we utilize a metallic Au solid-state membrane to translocate and rapidly detect single Au nanoparticles (NPs) functionalized with 589 dye molecules using surface-enhanced resonance Raman spectroscopy (SERRS). We show that, due to the plasmonic coupling between the Au metallic nanopore surface and the NP, signal intensities are enhanced when probing analyte molecules bound to the NP surface. Although not single molecule, this nanopore sensing scheme benefits from the ability of SERRS to provide rich vibrational information on the analyte, improving on current nanopore-based electrical and optical detection techniques. We show that the full vibrational spectrum of the analyte can be detected with ultrahigh spectral sensitivity and a rapid temporal resolution of 880 μs.

  13. Multivariate Curve Resolution Applied to Infrared Reflection Measurements of Soil Contaminated with an Organophosphorus Analyte

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, Neal B.; Blake, Thomas A.; Gassman, Paul L.

    2006-07-01

Multivariate curve resolution (MCR) is a powerful technique for extracting chemical information from measured spectra on complex mixtures. The difficulty with applying MCR to soil reflectance measurements is that light scattering artifacts can contribute much more variance to the measurements than the analyte(s) of interest. Two methods were integrated into a MCR decomposition to account for light scattering effects. First, an extended mixture model using pure analyte spectra augmented with scattering 'spectra' was used for the measured spectra. Second, second-derivative preprocessed spectra, which have higher selectivity than the unprocessed spectra, were included in a second block as a part of the decomposition. The conventional alternating least squares (ALS) algorithm was modified to simultaneously decompose the measured and second derivative spectra in a two-block decomposition. Equality constraints were also included to incorporate information about sampling conditions. The result was an MCR decomposition that provided interpretable spectra from soil reflectance measurements.
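A bare-bones version of the alternating least squares at the core of MCR can be sketched as follows, using synthetic non-negative data and a simple clipping non-negativity constraint; the two-block, equality-constrained decomposition developed in the paper is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mixture data: 20 "measured spectra" over 50 channels, built as
# non-negative combinations of two underlying component spectra.
S_true = np.abs(rng.normal(size=(50, 2)))   # pure component spectra
C_true = np.abs(rng.normal(size=(20, 2)))   # concentration profiles
D = C_true @ S_true.T                       # data matrix, D = C S^T

# Alternating least squares: refine C and S in turn, enforcing non-negativity
# by clipping (a crude stand-in for proper constrained least squares).
S = np.abs(rng.normal(size=(50, 2)))        # random initial spectral estimate
for _ in range(200):
    C = np.clip(D @ S @ np.linalg.pinv(S.T @ S), 0.0, None)
    S = np.clip(D.T @ C @ np.linalg.pinv(C.T @ C), 0.0, None)

rel_residual = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
```

With exact low-rank, non-negative data the relative residual typically becomes small; real soil reflectance data would additionally need the scattering-spectrum augmentation and second-derivative block described in the abstract.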

  14. Hierarchical zwitterionic modification of a SERS substrate enables real-time drug monitoring in blood plasma

    PubMed Central

    Sun, Fang; Hung, Hsiang-Chieh; Sinclair, Andrew; Zhang, Peng; Bai, Tao; Galvan, Daniel David; Jain, Priyesh; Li, Bowen; Jiang, Shaoyi; Yu, Qiuming

    2016-01-01

    Surface-enhanced Raman spectroscopy (SERS) is an ultrasensitive analytical technique with molecular specificity, making it an ideal candidate for therapeutic drug monitoring (TDM). However, in critical diagnostic media including blood, nonspecific protein adsorption coupled with weak surface affinities and small Raman activities of many analytes hinder the TDM application of SERS. Here we report a hierarchical surface modification strategy, first by coating a gold surface with a self-assembled monolayer (SAM) designed to attract or probe for analytes and then by grafting a non-fouling zwitterionic polymer brush layer to effectively repel protein fouling. We demonstrate how this modification can enable TDM applications by quantitatively and dynamically measuring the concentrations of several analytes—including an anticancer drug (doxorubicin), several TDM-requiring antidepressant and anti-seizure drugs, fructose and blood pH—in undiluted plasma. This hierarchical surface chemistry is widely applicable to many analytes and provides a generalized platform for SERS-based biosensing in complex real-world media. PMID:27834380

  15. Ultrasensitive microchip based on smart microgel for real-time online detection of trace threat analytes.

    PubMed

    Lin, Shuo; Wang, Wei; Ju, Xiao-Jie; Xie, Rui; Liu, Zhuang; Yu, Hai-Rong; Zhang, Chuan; Chu, Liang-Yin

    2016-02-23

Real-time online detection of trace threat analytes is critical for global sustainability, whereas the key challenge is how to efficiently convert and amplify analyte signals into simple readouts. Here we report an ultrasensitive microfluidic platform incorporated with smart microgel for real-time online detection of trace threat analytes. The microgel can swell responding to specific stimulus in flowing solution, resulting in efficient conversion of the stimulus signal into significantly amplified signal of flow-rate change; thus highly sensitive, fast, and selective detection can be achieved. We demonstrate this by incorporating ion-recognizable microgel for detecting trace Pb(2+), and connecting our platform with pipelines of tap water and wastewater for real-time online Pb(2+) detection to achieve timely pollution warning and termination. This work provides a generalizable platform for incorporating myriad stimuli-responsive microgels to achieve ever-better performance for real-time online detection of various trace threat molecules, and may expand the scope of applications of detection techniques.

  16. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives in bio-analytical chemistry involve applying analytical tools to relevant medical and biological problems and refining those applications. Developing a reliable sample-preparation step is another primary objective, so that the analytes of interest can be extracted and isolated from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anticancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques. In view of the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques for the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The review reveals that molecularly imprinted polymers used as solid-phase materials for sample preparation and preconcentration of uracil and 5-fluorouracil have proven effective, because they obviate the tedious separation steps otherwise required to overcome protein binding and strong interferences from complex real-sample matrices such as blood plasma and serum.

  17. Evaluation of three new laser spectrometer techniques for in-situ carbon monoxide measurements

    NASA Astrophysics Data System (ADS)

    Zellweger, C.; Steinbacher, M.; Buchmann, B.

    2012-07-01

    Long-term time series of the atmospheric composition are essential for environmental research and thus require compatible, multi-decadal monitoring activities. However, the current data quality objectives of the World Meteorological Organization (WMO) for carbon monoxide (CO) in the atmosphere are very challenging to meet with the measurement techniques that were used until recently. During the past few years, new spectroscopic techniques with promising properties for trace gas analytics have come on the market. The current study compares three instruments that have become commercially available since 2011 with the best previously available technique (vacuum UV fluorescence) and provides a link to earlier comparison studies. The instruments were investigated for their performance regarding repeatability, reproducibility, drift, temperature dependence, water vapour interference and linearity. Finally, all instruments were examined during a short measurement campaign to assess their applicability for long-term field measurements. It could be shown that the new techniques provide considerably better performance than previous techniques, although some issues such as temperature influence and cross sensitivities need further attention.
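
    The figures of merit evaluated in this comparison (repeatability, drift) are ordinary statistics over repeated measurements of a reference gas. As a rough, generic illustration only (the function names are ours, not from the study), they can be estimated like this:

```python
def repeatability(readings):
    """Sample standard deviation of repeated measurements of one standard."""
    n = len(readings)
    m = sum(readings) / n
    return (sum((x - m) ** 2 for x in readings) / (n - 1)) ** 0.5

def drift(times, readings):
    """Least-squares slope of reading versus time (signal units per time unit)."""
    n = len(times)
    mt = sum(times) / n
    mr = sum(readings) / n
    num = sum((t - mt) * (r - mr) for t, r in zip(times, readings))
    den = sum((t - mt) ** 2 for t in times)
    return num / den
```

In practice each instrument would be characterized by such statistics under controlled conditions before field deployment.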

  18. Speciated arsenic in air: measurement methodology and risk assessment considerations.

    PubMed

    Lewis, Ari S; Reid, Kim R; Pollock, Margaret C; Campleman, Sharan L

    2012-01-01

    Accurate measurement of arsenic (As) in air is critical to providing a more robust understanding of arsenic exposures and associated human health risks. Although there is extensive information available on total arsenic in air, less is known on the relative contribution of each arsenic species. To address this data gap, the authors conducted an in-depth review of available information on speciated arsenic in air. The evaluation included the type of species measured and the relative abundance, as well as an analysis of the limitations of current analytical methods. Despite inherent differences in the procedures, most techniques effectively separated arsenic species in the air samples. Common analytical techniques such as inductively coupled plasma mass spectrometry (ICP-MS) and/or hydride generation (HG)- or quartz furnace (GF)-atomic absorption spectrometry (AAS) were used for arsenic measurement in the extracts, and provided some of the most sensitive detection limits. The current analysis demonstrated that, despite limited comparability among studies due to differences in seasonal factors, study duration, sample collection methods, and analytical methods, research conducted to date is adequate to show that arsenic in air is mainly in the inorganic form. Reported average concentrations of As(III) and As(V) ranged up to 7.4 and 10.4 ng/m3, respectively, with As(V) being more prevalent than As(III) in most studies. Concentrations of the organic methylated arsenic compounds are negligible (in the pg/m3 range). However, because of the variability in study methods and measurement methodology, the authors were unable to determine the variation in arsenic composition as a function of source or particulate matter (PM) fraction. In this work, the authors include the implications of arsenic speciation in air for potential exposure and risks.
The authors conclude that it is important to synchronize sample collection, preparation, and analytical techniques in order to generate data more useful for arsenic inhalation risk assessment, and a more robust documentation of quality assurance/quality control (QA/QC) protocols is necessary to ensure accuracy, precision, representativeness, and comparability.

  19. Laser Ablation in situ (U-Th-Sm)/He and U-Pb Double-Dating of Apatite and Zircon: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    McInnes, B.; Danišík, M.; Evans, N.; McDonald, B.; Becker, T.; Vermeesch, P.

    2015-12-01

    We present a new laser-based technique for rapid, quantitative and automated in situ microanalysis of U, Th, Sm, Pb and He for applications in geochronology, thermochronometry and geochemistry (Evans et al., 2015). This novel capability permits a detailed interrogation of the time-temperature history of rocks containing apatite, zircon and other accessory phases by providing both (U-Th-Sm)/He and U-Pb ages (+trace element analysis) on single crystals. In situ laser microanalysis offers several advantages over conventional bulk crystal methods in terms of safety, cost, productivity and spatial resolution. We developed and integrated a suite of analytical instruments including a 193 nm ArF excimer laser system (RESOlution M-50A-LR), a quadrupole ICP-MS (Agilent 7700s), an Alphachron helium mass spectrometry system and swappable flow-through and ultra-high vacuum analytical chambers. The analytical protocols include the following steps: mounting/polishing in PFA Teflon using methods similar to those adopted for fission track etching; laser He extraction and analysis using a 2 s ablation at 5 Hz and 2-3 J/cm2 fluence; He pit volume measurement using atomic force microscopy, and U-Th-Sm-Pb (plus optional trace element) analysis using traditional laser ablation methods. The major analytical challenges for apatite include the low U, Th and He contents relative to zircon and the elevated common Pb content. On the other hand, apatite typically has less extreme and less complex zoning of parent isotopes (primarily U and Th). A freeware application has been developed for determining (U-Th-Sm)/He ages from the raw analytical data and Iolite software was used for U-Pb age and trace element determination.
    In situ double-dating has successfully replicated conventional U-Pb and (U-Th)/He age variations in xenocrystic zircon from the diamondiferous Ellendale lamproite pipe, Western Australia, and increased zircon analytical throughput by a factor of 50 over conventional methods. Reference: Evans NJ, McInnes BIA, McDonald B, Becker T, Vermeesch P, Danisik M, Shelley M, Marillo-Sialer E and Patterson D. An in situ technique for (U-Th-Sm)/He and U-Pb double dating. J Analytical Atomic Spectrometry, 30, 1636-1645.
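
    The abstract does not reproduce the age equation itself. For orientation, the standard (U-Th-Sm)/He ingrowth equation can be inverted for age by Newton iteration, as in the sketch below; the decay constants are commonly tabulated values, and the variable names and abundances are ours, not the authors':

```python
import math

# Decay constants (1/yr) for 238U, 235U, 232Th, 147Sm (commonly tabulated values)
L238, L235, L232, L147 = 1.55125e-10, 9.8485e-10, 4.9475e-11, 6.54e-12

def he_production(t, u238, u235, th232, sm147):
    """4He (atoms) produced over t years from parent abundances (atoms)."""
    return (8 * u238 * (math.exp(L238 * t) - 1)
            + 7 * u235 * (math.exp(L235 * t) - 1)
            + 6 * th232 * (math.exp(L232 * t) - 1)
            + 1 * sm147 * (math.exp(L147 * t) - 1))

def he_age(he, u238, u235, th232, sm147, t0=1e8, tol=1e-3):
    """Solve the ingrowth equation for age t (years) by Newton iteration."""
    t = t0
    for _ in range(100):
        f = he_production(t, u238, u235, th232, sm147) - he
        df = (8 * u238 * L238 * math.exp(L238 * t)
              + 7 * u235 * L235 * math.exp(L235 * t)
              + 6 * th232 * L232 * math.exp(L232 * t)
              + 1 * sm147 * L147 * math.exp(L147 * t))
        t_new = t - f / df
        if abs(t_new - t) < tol:
            return t_new
        t = t_new
    return t
```

A real double-dating workflow would additionally apply alpha-ejection and common-Pb corrections, which are omitted here.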

  20. Adaptive steganography

    NASA Astrophysics Data System (ADS)

    Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.

    2002-04-01

    Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound for the steganographic capacity of LSB based steganographic techniques for a given false probability of detection. In this paper we look at adaptive steganographic techniques. Adaptive steganographic techniques take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
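
    As a toy illustration of the adaptive idea (not the authors' algorithm), the sketch below embeds message bits only at "busy" pixel positions; the selection rule masks off LSBs so that embedding never changes which positions are selected, letting the extractor re-derive them:

```python
def busy(a, b, threshold=8):
    """Selection rule: compare pixels with LSBs masked off, so embedding
    never changes which positions qualify."""
    return abs((a & ~1) - (b & ~1)) >= threshold

def embed_adaptive(pixels, bits, threshold=8):
    """Embed bits in the LSBs of 'busy' pixels only (a toy adaptivity
    rule standing in for content-aware selection)."""
    out = list(pixels)
    j = 0
    for i in range(1, len(out)):
        if j >= len(bits):
            break
        if busy(out[i], out[i - 1], threshold):
            out[i] = (out[i] & ~1) | bits[j]
            j += 1
    return out, j  # stego pixels, number of bits actually embedded

def extract_adaptive(stego, nbits, threshold=8):
    """Recover bits by re-running the same selection rule on the stego image."""
    bits = []
    for i in range(1, len(stego)):
        if len(bits) >= nbits:
            break
        if busy(stego[i], stego[i - 1], threshold):
            bits.append(stego[i] & 1)
    return bits
```

Restricting embedding to high-contrast regions is one simple way to reduce the statistical footprint that LSB steganalysis exploits.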

  1. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine-tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  2. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...

  3. Plant Ethylene Detection Using Laser-Based Photo-Acoustic Spectroscopy.

    PubMed

    Van de Poel, Bram; Van Der Straeten, Dominique

    2017-01-01

    Analytical detection of the plant hormone ethylene is an important prerequisite in physiological studies. Real-time and super sensitive detection of trace amounts of ethylene gas is possible using laser-based photo-acoustic spectroscopy. This Chapter will provide some background on the technique, compare it with conventional gas chromatography, and provide a detailed user-friendly hand-out on how to operate the machine and the software. In addition, this Chapter provides some tips and tricks for designing and performing physiological experiments suited for ethylene detection with laser-based photo-acoustic spectroscopy.

  4. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here with a view toward matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  5. Resonance Ionization, Mass Spectrometry.

    ERIC Educational Resources Information Center

    Young, J. P.; And Others

    1989-01-01

    Discussed is an analytical technique that uses photons from lasers to resonantly excite an electron from some initial state of a gaseous atom through various excited states of the atom or molecule. Described are the apparatus, some analytical applications, and the precision and accuracy of the technique. Lists 26 references. (CW)

  6. Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods

    ERIC Educational Resources Information Center

    Zhang, Ying

    2011-01-01

    Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…

  7. Classification of user interfaces for graph-based online analytical processing

    NASA Astrophysics Data System (ADS)

    Michaelis, James R.

    2016-05-01

    In the domain of business intelligence, user-oriented software for conducting multidimensional analysis via Online Analytical Processing (OLAP) is now commonplace. In this setting, datasets commonly have well-defined sets of dimensions and measures around which analysis tasks can be conducted. However, many forms of data used in intelligence operations - deriving from social networks, online communications, and text corpora - will consist of graphs with varying forms of potential dimensional structure. Hence, enabling OLAP over such data collections requires explicit definition and extraction of supporting dimensions and measures. Further, as Graph OLAP remains an emerging technique, limited research has been done on its user interface requirements. Namely, on effective pairing of interface designs to different types of graph-derived dimensions and measures. This paper presents a novel technique for pairing of user interface designs to Graph OLAP datasets, rooted in Analytic Hierarchy Process (AHP) driven comparisons. Attributes of the classification strategy are encoded through an AHP ontology, developed in our alternate work and extended to support pairwise comparison of interfaces. Specifically, according to their ability, as perceived by Subject Matter Experts, to support dimensions and measures corresponding to Graph OLAP dataset attributes. To frame this discussion, a survey is provided both on existing variations of Graph OLAP, as well as existing interface designs previously applied in multidimensional analysis settings. Following this, a review of our AHP ontology is provided, along with a listing of corresponding dataset and interface attributes applicable toward SME recommendation structuring. A walkthrough of AHP-based recommendation encoding via the ontology-based approach is then provided. The paper concludes with a short summary of proposed future directions seen as essential for this research area.
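
    In its textbook form, the AHP comparison step reduces to extracting the principal eigenvector of a pairwise comparison matrix as the priority weights. A minimal, generic sketch (our implementation, not the paper's ontology-driven one), using power iteration on plain lists:

```python
def ahp_priorities(M, iters=100):
    """Priority weights from an AHP pairwise comparison matrix M, computed
    as the (normalized) principal eigenvector via power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w
```

For example, a 3x3 matrix where alternative A is judged moderately preferred to B and strongly preferred to C yields weights ordered A > B > C; an SME-elicited matrix over candidate interfaces would be processed the same way.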

  8. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    PubMed

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications.
A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and they are critically reviewed.

  9. Turbine blade tip durability analysis

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.

    1981-01-01

    An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.

  10. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  11. Geochemistry of Rock Samples Collected from the Iron Hill Carbonatite Complex, Gunnison County, Colorado

    USGS Publications Warehouse

    Van Gosen, Bradley S.

    2008-01-01

    A study conducted in 2006 by the U.S. Geological Survey collected 57 surface rock samples from nine types of intrusive rock in the Iron Hill carbonatite complex. This intrusive complex, located in Gunnison County of southwestern Colorado, is known for its classic carbonatite-alkaline igneous geology and petrology. The Iron Hill complex is also noteworthy for its diverse mineral resources, including enrichments in titanium, rare earth elements, thorium, niobium (columbium), and vanadium. This study was performed to reexamine the chemistry and metallic content of the major rock units of the Iron Hill complex by using modern analytical techniques, while providing a broader suite of elements than the earlier published studies. The report contains the geochemical analyses of the samples in tabular and digital spreadsheet format, providing the analytical results for 55 major and trace elements.

  12. A strategy to determine operating parameters in tissue engineering hollow fiber bioreactors

    PubMed Central

    Shipley, RJ; Davidson, AJ; Chan, K; Chaudhuri, JB; Waters, SL; Ellis, MJ

    2011-01-01

    The development of tissue engineering hollow fiber bioreactors (HFB) requires the optimal design of the geometry and operation parameters of the system. This article provides a strategy for specifying operating conditions for the system based on mathematical models of oxygen delivery to the cell population. Analytical and numerical solutions of these models are developed based on Michaelis–Menten kinetics. Depending on the minimum oxygen concentration required to culture a functional cell population, together with the oxygen uptake kinetics, the strategy dictates the model needed to describe mass transport so that the operating conditions can be defined. If cmin ≫ Km, we capture oxygen uptake using zero-order kinetics and proceed analytically. This enables operating equations to be developed that allow the user to choose the medium flow rate, lumen length, and ECS depth to provide a prescribed value of cmin. When cmin is comparable to Km, we use numerical techniques to solve the full Michaelis–Menten kinetics and present operating data for the bioreactor. The strategy presented utilizes both analytical and numerical approaches and can be applied to any cell type with known oxygen transport properties and uptake kinetics. PMID:21370228
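
    As a deliberately crude caricature of this strategy (not the paper's model), a lumped 1-D plug-flow oxygen balance with Michaelis–Menten uptake can be integrated numerically to estimate the minimum (outlet) concentration; all names, and the single flux_scale parameter lumping the geometry and uptake terms, are our assumptions:

```python
def min_oxygen(c_in, vmax, km, L, flux_scale, n=1000):
    """Explicit-Euler integration of dc/dz = -flux_scale * vmax*c/(km + c)
    along a lumen of length L; returns the outlet (minimum) oxygen
    concentration for a monotonically consumed stream."""
    dz = L / n
    c = c_in
    for _ in range(n):
        c -= dz * flux_scale * vmax * c / (km + c)
        c = max(c, 0.0)  # concentration cannot go negative
    return c
```

Note the two regimes the abstract distinguishes: when c is far above km the rate term c/(km + c) is nearly 1 (zero-order, solvable analytically), while comparable c and km require the full kinetics, mirroring why the authors switch to numerical solution there.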

  13. Flexible pavement overlay design procedures. Volume 1: Evaluation and modification of the design methods

    NASA Astrophysics Data System (ADS)

    Majidzadeh, K.; Ilves, G. J.

    1981-08-01

    A ready reference to design procedures for asphaltic concrete overlay of flexible pavements based on elastic layer theory is provided. The design procedures and the analytical techniques presented were formulated to predict the structural fatigue response of asphaltic concrete overlays for various design conditions, including geometrical and material properties, loading conditions and environmental variables.

  14. Differential reliability : probabilistic engineering applied to wood members in bending-tension

    Treesearch

    Stanley K. Suddarth; Frank E. Woeste; William L. Galligan

    1978-01-01

    Reliability analysis is a mathematical technique for appraising the design and materials of engineered structures to provide a quantitative estimate of probability of failure. Two or more cases which are similar in all respects but one may be analyzed by this method; the contrast between the probabilities of failure for these cases allows strong analytical focus on the...
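
    The contrast-between-cases idea can be sketched with a simple Monte Carlo estimate of failure probability, assuming (purely for illustration; the report's actual method and distributions are not given here) normally distributed member strength and load:

```python
import random

def failure_probability(strength_mean, strength_sd, load_mean, load_sd,
                        trials=200_000, seed=1):
    """Monte Carlo estimate of P(load > strength) for independent
    normally distributed strength and load."""
    rng = random.Random(seed)
    fails = sum(
        rng.gauss(load_mean, load_sd) > rng.gauss(strength_mean, strength_sd)
        for _ in range(trials)
    )
    return fails / trials
```

Two designs identical except for strength variability can then be contrasted: doubling the strength standard deviation at the same mean sharply raises the estimated probability of failure, which is exactly the kind of analytical focus the differential approach provides.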

  15. Wildlife habitats of the north coast of California: new techniques for extensive forest inventory.

    Treesearch

    Janet L. Ohmann

    1992-01-01

    A study was undertaken to develop methods for extensive inventory and analysis of wildlife habitats. The objective was to provide information about amounts and conditions of wildlife habitats from extensive, sample based inventories so that wildlife can be better considered in forest planning and policy decisions at the regional scale. The new analytical approach...

  16. Using a Data Mining Approach to Develop a Student Engagement-Based Institutional Typology. IR Applications, Volume 18, February 8, 2009

    ERIC Educational Resources Information Center

    Luan, Jing; Zhao, Chun-Mei; Hayek, John C.

    2009-01-01

    Data mining provides both systematic and systemic ways to detect patterns of student engagement among students at hundreds of institutions. Using traditional statistical techniques alone, the task would be significantly difficult--if not impossible--considering the size and complexity in both data and analytical approaches necessary for this…
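
    A minimal stand-in for the kind of clustering such a data-mining typology relies on is plain k-means; the sketch below (our code, stdlib only, not the study's actual method) groups feature vectors such as per-institution engagement scores:

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means on lists of numeric feature vectors: assign each
    point to its nearest center, then recompute centers as cluster means."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: sum((a - b) ** 2
                    for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        centers = [
            [sum(col) / len(cl) for col in zip(*cl)] if cl else centers[i]
            for i, cl in enumerate(clusters)
        ]
    return centers, clusters
```

An institutional typology would run this over hundreds of engagement-score vectors and then interpret each resulting cluster as an institution type.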

  17. Some aspects of analytical chemistry as applied to water quality assurance techniques for reclaimed water: The potential use of X-ray fluorescence spectrometry for automated on-line fast real-time simultaneous multi-component analysis of inorganic pollutants in reclaimed water

    NASA Technical Reports Server (NTRS)

    Ling, A. C.; Macpherson, L. H.; Rey, M.

    1981-01-01

    The potential use of isotopically excited energy dispersive X-ray fluorescence (XRF) spectrometry for automated on-line fast real-time (5 to 15 minutes) simultaneous multicomponent (up to 20) trace (1 to 10 parts per billion) analysis of inorganic pollutants in reclaimed water was examined. Three anionic elements (chromium 6, arsenic and selenium) were studied. The inherent lack of sensitivity of XRF spectrometry for these elements mandates use of a preconcentration technique and various methods were examined, including: several direct and indirect evaporation methods; ion exchange membranes; selective and nonselective precipitation; and complexation processes. It is shown that XRF spectrometry itself is well suited for automated on-line quality assurance, and can provide a nondestructive (and thus sample storage and repeat analysis capabilities) and particularly convenient analytical method. Further, the use of an isotopically excited energy dispersive unit (50 mCi Cd-109 source) coupled with a suitable preconcentration process can provide sufficient sensitivity to achieve the current mandated minimum levels of detection without the need for high power X-ray generating tubes.

  18. Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi

    2016-10-01

    The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/ml. However, to quantify analytes that are related to various stages of tumors including early detection requires detecting well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml and the prostate specific antigen level of the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/mL of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumors, it is required to quantify various levels of analytes ranging from ng/ml to pg/ml. To accommodate these critical needs in the current diagnosis, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes…

  19. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World's Largest Open Source Data Sets

    NASA Astrophysics Data System (ADS)

    Stewart, R.; Piburn, J.; Sorokine, A.; Myers, A.; Moehl, J.; White, D.

    2015-07-01

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.

  20. Analytical and experimental studies of an optimum multisegment phased liner noise suppression concept

    NASA Technical Reports Server (NTRS)

    Sawdy, D. T.; Beckemeyer, R. J.; Patterson, J. D.

    1976-01-01

    Results are presented from detailed analytical studies made to define methods for obtaining improved multisegment lining performance by taking advantage of relative placement of each lining segment. Properly phased liner segments reflect and spatially redistribute the incident acoustic energy and thus provide additional attenuation. A mathematical model was developed for rectangular ducts with uniform mean flow. Segmented acoustic fields were represented by duct eigenfunction expansions, and mode-matching was used to ensure continuity of the total field. Parametric studies were performed to identify attenuation mechanisms and define preliminary liner configurations. An optimization procedure was used to determine optimum liner impedance values for a given total lining length, Mach number, and incident modal distribution. Optimal segmented liners are presented and it is shown that, provided the sound source is well-defined and flow environment is known, conventional infinite duct optimum attenuation rates can be improved. To confirm these results, an experimental program was conducted in a laboratory test facility. The measured data are presented in the form of analytical-experimental correlations. Excellent agreement between theory and experiment verifies and substantiates the analytical prediction techniques. The results indicate that phased liners may be of immediate benefit in the development of improved aircraft exhaust duct noise suppressors.

  1. Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community

    DTIC Science & Technology

    2016-01-01

    Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the...more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement

  2. A Modular GIS-Based Software Architecture for Model Parameter Estimation using the Method of Anchored Distributions (MAD)

    NASA Astrophysics Data System (ADS)

    Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.

    2012-12-01

    The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will encourage the development of additional model drivers by third-party scientists.
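
    The driver concept described above amounts to a plugin interface between the core inversion code and any external forward model. The sketch below is a minimal, hypothetical illustration of that pattern in Python; the class and method names are invented for illustration and are not MAD-GIS's actual .NET/MEF API, and the fake driver merely stands in for code that would write MODFLOW input files, run the simulator, and read back simulated heads.

```python
# Hypothetical sketch of the "driver" pattern: the core code talks only to
# an abstract interface, so forward models can be swapped at run time.

class ForwardModelDriver:
    """Abstract driver; a real one would wrap MODFLOW, HYDRUS, etc."""
    def run(self, parameter_field):
        raise NotImplementedError

class FakeModflowDriver(ForwardModelDriver):
    """Placeholder driver: returns a dummy 'simulated head' instead of
    invoking the real MODFLOW executable."""
    def run(self, parameter_field):
        return [sum(parameter_field)]

def evaluate(driver, parameter_field):
    # The calling code depends only on the interface, not the simulator.
    return driver.run(parameter_field)

print(evaluate(FakeModflowDriver(), [0.1, 0.2, 0.3]))
```

    A real driver would add file marshalling and error handling; the point is only that the inversion loop never references a specific simulator.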

  3. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  4. SRB Environment Evaluation and Analysis. Volume 2: RSRB Joint Filling Test/Analysis Improvements

    NASA Technical Reports Server (NTRS)

    Knox, E. C.; Woods, G. Hamilton

    1991-01-01

    Following the Challenger accident, a very comprehensive solid rocket booster (SRB) redesign program was initiated. One objective of the program was to develop expertise at NASA/MSFC in the techniques for analyzing the flow of hot gases in the SRB joints. Several test programs were undertaken to provide a database of joint performance with manufactured defects in the joints to allow hot gases to fill the joints. This database was also used to develop the analytical techniques. Some of the test programs were Joint Environment Simulator (JES), Nozzle Joint Environment Simulator (NJES), Transient Pressure Test Article (TPTA), and Seventy-Pound Charge (SPC). In 1988 the TPTA test hardware was moved from the Utah site to MSFC and several RSRM tests were scheduled, to be followed by tests for the ASRM program. REMTECH Inc. supported these activities with pretest estimates of the flow conditions in the test joints, and post-test analysis and evaluation of the measurements. During this support REMTECH identified deficiencies in the gas-measurement instrumentation that existed in the TPTA hardware, made recommendations for its replacement, and identified improvements to the analytical tools used in the test support. Only one test was completed under the TPTA RSRM test program, and those scheduled for the ASRM were rescheduled to a time after the expiration of this contract. The attention of this effort was directed toward improvements in the analytical techniques in preparation for when the ASRM program begins.

  5. Finding Waldo: Learning about Users from their Interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Eli T.; Ottley, Alvitta; Zhao, Helen

    Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user’s interactions with a system reflect a large amount of the user’s reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user’s task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and we apply well-known machine learning algorithms to three encodings of the users’ interaction data. We achieve, depending on algorithm and encoding, between 62% and 96% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that, using the same techniques, we can infer aspects of the user’s personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time; in some cases, 82% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
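
    As a toy illustration of the approach in this record, one can featurize each user's interaction log as a count vector and classify fast versus slow completers. The numbers below are invented, and a simple nearest-centroid classifier stands in for the well-known machine learning algorithms the authors actually applied.

```python
# Minimal sketch (hypothetical data): classify users as "fast" or "slow"
# from counts of interaction events. A nearest-centroid classifier is a
# stand-in for the algorithms used in the paper.

def centroid(rows):
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def train(samples):
    # samples: {label: list of feature vectors}
    return {label: centroid(rows) for label, rows in samples.items()}

def predict(model, x):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda label: dist2(model[label], x))

# Hypothetical encoding: [clicks, hovers, pans] per session
training = {
    "fast": [[120, 40, 15], [110, 35, 20], [130, 50, 10]],
    "slow": [[60, 90, 60], [55, 100, 70], [70, 80, 65]],
}
model = train(training)
print(predict(model, [115, 45, 12]))  # prints "fast"
```

    Real interaction encodings are far richer (sequences, timings, states), but the pipeline shape (encode, train, predict) is the same.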

  6. Forensic toxicology.

    PubMed

    Drummer, Olaf H

    2010-01-01

    Forensic toxicology has developed as a forensic science in recent years and is now widely used to assist in death investigations, in civil and criminal matters involving drug use, in drugs-of-abuse testing in correctional settings and custodial medicine, in road and workplace safety, in matters involving environmental pollution, as well as in sports doping. Drugs most commonly targeted include amphetamines, benzodiazepines, cannabis, cocaine and the opiates, but can be any other illicit substance or almost any over-the-counter or prescribed drug, as well as poisons available to the community. The discipline requires high-level skills in analytical techniques with a solid knowledge of pharmacology and pharmacokinetics. Modern techniques rely heavily on immunoassay screening analyses and mass spectrometry (MS) for confirmatory analyses, using either high-performance liquid chromatography or gas chromatography as the separation technique. Tandem MS has become increasingly popular compared with single-stage MS. It is essential that analytical systems are fully validated and fit for purpose, and that assay batches are monitored with quality controls. External proficiency programs monitor both the assay and the personnel performing the work. For a laboratory to perform optimally, it is vital that the circumstances and context of the case are known and that the laboratory understands the limitations of the analytical systems used, including drug stability. Drugs and poisons can change concentration postmortem due to poor or unequal quality of blood and other specimens, anaerobic metabolism and redistribution. The latter presents the largest obstacle in the interpretation of postmortem results.

  7. Geomagnetic field models for satellite angular motion studies

    NASA Astrophysics Data System (ADS)

    Ovchinnikov, M. Yu.; Penkov, V. I.; Roldugin, D. S.; Pichuzhkina, A. V.

    2018-03-01

    Four geomagnetic field models are discussed: IGRF, inclined, direct and simplified dipoles. Geomagnetic induction vector expressions are provided in different reference frames. Induction vector behavior is compared for different models. Models applicability for the analysis of satellite motion is studied from theoretical and engineering perspectives. Relevant satellite dynamics analysis cases using analytical and numerical techniques are provided. These cases demonstrate the benefit of a certain model for a specific dynamics study. Recommendations for models usage are summarized in the end.

  8. Ring-oven based preconcentration technique for microanalysis: simultaneous determination of Na, Fe, and Cu in fuel ethanol by laser induced breakdown spectroscopy.

    PubMed

    Cortez, Juliana; Pasquini, Celio

    2013-02-05

    The ring-oven technique, originally applied to classical qualitative analysis from the 1950s to the 1970s, is revisited for use in a simple yet highly efficient and green procedure for analyte preconcentration prior to determination by the microanalytical techniques presently available. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is maintained in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center toward a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, about 2.0 cm in diameter, is formed on the paper and contains most of the analytes originally present in the sample volume. Preconcentration factors can reach 250-fold (on a m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure were evaluated by concentrating Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour, employing the microanalytical technique of laser induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL(-1) and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)%, for Na, Fe, and Cu, respectively, were obtained in fuel ethanol. It is possible to anticipate the application of the technique, coupled to modern microanalytical and multianalyte techniques, to several analytical problems requiring analyte preconcentration and/or sample stabilization.

  9. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  10. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  11. Analytical and numerical techniques for predicting the interfacial stresses of wavy carbon nanotube/polymer composites

    NASA Astrophysics Data System (ADS)

    Yazdchi, K.; Salehi, M.; Shokrieh, M. M.

    2009-03-01

    By introducing a new simplified 3D representative volume element for wavy carbon nanotubes, an analytical model is developed to study stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in good agreement with corresponding results for straight nanotubes.

  12. Leidenfrost phenomenon-assisted thermal desorption (LPTD) and its application to open ion sources at atmospheric pressure mass spectrometry.

    PubMed

    Saha, Subhrakanti; Chen, Lee Chuin; Mandal, Mridul Kanti; Hiraoka, Kenzo

    2013-03-01

    This work describes the development and application of a new thermal desorption technique that makes use of the Leidenfrost phenomenon in open ion sources at atmospheric pressure for direct mass spectrometric detection of ultratrace levels of illicit, therapeutic, and stimulant drugs, toxicants, and peptides (molecular weight above 1 kDa) in their unaltered state from complex real-world samples with little or no sample pretreatment. A low-temperature dielectric barrier discharge ion source was used throughout the experiments, and the analytical figures of merit of this technique were investigated. Further, this desorption technique coupled with other ionization sources, such as electrospray ionization (ESI) and dc corona discharge atmospheric pressure chemical ionization (APCI), in open atmosphere was also investigated. The use of the high-resolution 'Exactive Orbitrap' mass spectrometer provided unambiguous identification of trace levels of the targeted compounds from complex mixtures and background noise; the limits of detection for various small organic molecules and peptides treated with this technique were at the level of parts per trillion and 10(-9) M, respectively. The high sensitivity of the present technique is attributed to the spontaneous enrichment of analyte molecules during the slow evaporation of the solvent, as well as to the sequential desorption of molecules from complex mixtures based on their volatilities. This newly developed desorption technique is simple and fast, while molecular ions are observed as the major ions.

  13. Recent advancements in nanoelectrodes and nanopipettes used in combined scanning electrochemical microscopy techniques.

    PubMed

    Kranz, Christine

    2014-01-21

    In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique from high-resolution electrochemical imaging via nanoscale probes to large scale mapping using arrays of microelectrodes. A major driving force in advancing the SECM methodology is based on developing more sophisticated probes beyond conventional micro-disc electrodes usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes particularly enabling combinations of SECM with other analytical measurement techniques to provide information beyond exclusively measuring electrochemical sample properties. Consequently, this critical review will focus on recent progress and new developments towards multifunctional imaging.

  14. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in 
select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analysis is provided covering areas of separation conditions, validation aspects and applicable conclusion. A limited discussion is provided on relevant aspects of the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues of mass detection, matrix effect, chiral aspects etc. are provided for consideration during method development.

  15. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    DOE PAGES

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J.; ...

    2015-06-18

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been developed separately in the physical and biological sciences, although they share some basic concepts of ET. Here, this review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. Electron tomography produces quantitative 3D reconstructions for the biological and physical sciences from sets of 2D projections acquired at different tilt angles in a transmission electron microscope. Finally, state-of-the-art techniques capable of producing 3D representations of samples such as Pt-Pd core-shell nanoparticles and IgG1 antibody molecules are reviewed.

  16. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, which are based on the models provided in this paper.

  17. Enhancing the chemical selectivity in discovery-based analysis with tandem ionization time-of-flight mass spectrometry detection for comprehensive two-dimensional gas chromatography.

    PubMed

    Freye, Chris E; Moore, Nicholas R; Synovec, Robert E

    2018-02-16

    The complementary information provided by tandem ionization time-of-flight mass spectrometry (TI-TOFMS) is investigated for comparative discovery-based analysis, when coupled with comprehensive two-dimensional gas chromatography (GC × GC). The TI conditions implemented were a hard ionization energy (70 eV) collected concurrently with a soft ionization energy (14 eV). Tile-based Fisher ratio (F-ratio) analysis is used to analyze diesel fuel spiked with twelve analytes at a nominal concentration of 50 ppm. F-ratio analysis is a supervised discovery-based technique that compares two different sample classes, in this case spiked and unspiked diesel, to reduce the complex GC × GC-TI-TOFMS data into a hit list of class-distinguishing analyte features. Hit lists of the 70 eV and 14 eV data sets, and the single hit list produced when the two data sets are fused together, are all investigated. For the 70 eV hit list, eleven of the twelve analytes were found in the top thirteen hits. For the 14 eV hit list, nine of the twelve analytes were found in the top nine hits, with the other three analytes either not found or well down the hit list. As expected, the F-ratios per m/z used to calculate each average F-ratio per hit generally emphasized smaller fragment ions for the 70 eV data set, while larger fragment ions were emphasized in the 14 eV data set, supporting the notion that complementary information was provided. The discovery rate improved when F-ratio analysis was performed on the fused data sets, resulting in eleven of the twelve analytes appearing at the top of the single hit list. Using PARAFAC, analytes that were "discovered" were deconvoluted in order to obtain their identification via match values (MV). The locations of the analytes and the "F-ratio spectra" obtained from F-ratio analysis were used to guide the deconvolution. Eight of the twelve analytes were successfully deconvoluted and identified using the in-house library for the 70 eV data set.
PARAFAC deconvolution of the two separate data sets provided increased confidence in identification of "discovered" analytes. Herein, we explore the limit of analyte discovery and limit of analyte identification, and demonstrate a general workflow for the investigation of key chemical features in complex samples. Copyright © 2018 Elsevier B.V. All rights reserved.
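
    A common form of the Fisher ratio used in such class-comparison screens is the squared difference of the class means divided by the sum of the within-class variances, computed feature by feature (here, per m/z channel): features with large F-ratios distinguish the two classes. The sketch below uses invented intensities, not data from the paper:

```python
# Minimal sketch of a Fisher-ratio (F-ratio) screen between two sample
# classes (e.g. spiked vs. unspiked fuel) for one m/z channel. Real
# tile-based F-ratio analysis operates over the full GC x GC data cube.

def f_ratio(class_a, class_b):
    def mean(xs):
        return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    between = (mean(class_a) - mean(class_b)) ** 2   # class-to-class variance
    within = var(class_a) + var(class_b)             # within-class variance
    return between / within

# Hypothetical signal intensities across replicate runs
spiked   = [105.0, 98.0, 102.0]
unspiked = [12.0, 15.0, 11.0]
print(round(f_ratio(spiked, unspiked), 1))  # large ratio: class-distinguishing
```

    Ranking every channel (or tile) by this ratio yields the "hit list" described above.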

  18. An Analytical Solution for Transient Thermal Response of an Insulated Structure

    NASA Technical Reports Server (NTRS)

    Blosser, Max L.

    2012-01-01

    An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
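
    The exact two-parameter solution is not reproduced in the abstract, but the flavor of the problem can be sketched with a much cruder first-order model: treat the structure as a single lumped heat capacity C fed through an insulation resistance R by a square surface-temperature pulse. This is an illustrative stand-in with hypothetical parameter values, not the paper's solution:

```python
# Illustrative one-lump RC approximation of an insulated structure under a
# square surface-temperature pulse. NOT the paper's exact solution; all
# parameter names and values here are hypothetical.
import math

def structure_temp_rise(q, R, C, t_pulse, t):
    """Structure temperature rise at time t (s).

    q: surface temperature step above initial (K)
    R: insulation thermal resistance (K/W)
    C: structure heat capacity (J/K)
    t_pulse: duration of the square pulse (s)
    """
    tau = R * C  # single time constant of the lumped model
    if t <= t_pulse:
        return q * (1.0 - math.exp(-t / tau))
    # after the pulse ends, the structure relaxes from its peak value
    peak = q * (1.0 - math.exp(-t_pulse / tau))
    return peak * math.exp(-(t - t_pulse) / tau)

# In this model the maximum structural temperature occurs at the end of
# the pulse and depends only on the ratio t_pulse / tau.
print(structure_temp_rise(100.0, 2.0, 50.0, 200.0, 200.0))
```

    The non-dimensional dependence (here, on t_pulse / tau) mirrors in spirit, though not in detail, the two-parameter character of the exact solution described above.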

  19. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  20. The phonetics of talk in interaction--introduction to the special issue.

    PubMed

    Ogden, Richard

    2012-03-01

    This overview paper provides an introduction to work on naturally-occurring speech data, combining techniques of conversation analysis with techniques and methods from phonetics. The paper describes the development of the field, highlighting current challenges and progress in interdisciplinary work. It considers the role of quantification and its relationship to a qualitative methodology. It presents the conversation analytic notion of sequence as a version of context, and argues that sequences of talk constrain relevant phonetic design, and so provide one account for variability in naturally occurring speech. The paper also describes the manipulation of speech and language on many levels simultaneously. All of these themes occur and are explored in more detail in the papers contained in this special issue.

  1. Optical trapping for analytical biotechnology.

    PubMed

    Ashok, Praveen C; Dholakia, Kishan

    2012-02-01

    We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single-cell or even subcellular level, which has allowed insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications, such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore, we may probe and analyse the biological world when combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Advances in analytical technologies for environmental protection and public safety.

    PubMed

    Sadik, O A; Wanekaya, A K; Andreescu, S

    2004-06-01

    Due to the increased threats of chemical and biological agents of injury by terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat chemical and biochemical toxins. In addition to the right mix of policies and training of medical personnel on how to recognize symptoms of biochemical warfare agents, the major success in combating terrorism still lies in the prevention, early detection and the efficient and timely response using reliable analytical technologies and powerful therapies for minimizing the effects in the event of an attack. The public and regulatory agencies expect reliable methodologies and devices for public security. Today's systems are too bulky or slow to meet the "detect-to-warn" needs for first responders such as soldiers and medical personnel. This paper presents the challenges in monitoring technologies for warfare agents and other toxins. It provides an overview of how advances in environmental analytical methodologies could be adapted to design reliable sensors for public safety and environmental surveillance. The paths to designing sensors that meet the needs of today's measurement challenges are analyzed using examples of novel sensors, autonomous cell-based toxicity monitoring, 'Lab-on-a-Chip' devices and conventional environmental analytical techniques. Finally, in order to ensure that the public and legal authorities are provided with quality data to make informed decisions, guidelines are provided for assessing data quality and quality assurance using the United States Environmental Protection Agency (US-EPA) methodologies.

  3. Rapid Detection of Transition Metals in Welding Fumes Using Paper-Based Analytical Devices

    PubMed Central

    Volckens, John

    2014-01-01

    Metals in particulate matter (PM) are considered a driving factor for many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed due to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content in welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed to several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable yet inexpensive alternative to traditional analytic methods for transition metals in welding fume PM. These sensors have the potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments. PMID:24515892

  4. Rapid detection of transition metals in welding fumes using paper-based analytical devices.

    PubMed

    Cate, David M; Nanthasurasak, Pavisara; Riwkulkajorn, Pornpak; L'Orange, Christian; Henry, Charles S; Volckens, John

    2014-05-01

    Metals in particulate matter (PM) are considered a driving factor for many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed due to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content in welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed to several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable yet inexpensive alternative to traditional analytic methods for transition metals in welding fume PM. These sensors have the potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments.
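    µPAD readout is typically colorimetric: the color intensity developed in a detection zone is mapped to a metal concentration through a calibration curve built from standards. A minimal sketch of that step, assuming ordinary least squares and invented calibration values (the abstract does not report the authors' actual fit):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

# Hypothetical Fe calibration: standard concentrations (ug/mL) vs. mean
# color intensity of the µPAD detection zone (arbitrary units).
conc = [0.0, 5.0, 10.0, 20.0, 40.0]
intensity = [2.0, 21.0, 41.0, 80.0, 161.0]
a, b = linear_fit(conc, intensity)

def to_concentration(reading):
    """Invert the calibration curve: concentration for a measured intensity."""
    return (reading - a) / b

unknown = to_concentration(60.0)  # an unknown sample's zone intensity
print(f"slope={b:.2f}, intercept={a:.2f}, unknown ~{unknown:.1f} ug/mL")
```

    In practice the intensity would come from image analysis of the paper device; the inversion step is the same regardless of how the signal is acquired.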

  5. Recent advances in immunosensor for narcotic drug detection

    PubMed Central

    Gandhi, Sonu; Suman, Pankaj; Kumar, Ashok; Sharma, Prince; Capalash, Neena; Suri, C. Raman

    2015-01-01

    Introduction: Immunosensors for illicit drugs have gained immense interest and have found several applications in drug abuse monitoring. This technology has offered low-cost detection of narcotics, thereby providing a confirmatory platform to complement existing analytical methods. Methods: In this minireview, we define the basic concept of transducers for immunosensor development that utilize antibodies and low molecular mass hapten (opiate) molecules. Results: This article emphasizes recent advances in immunoanalytical techniques for the monitoring of opiate drugs. Our results demonstrate that high-quality antibodies can be used for immunosensor development against a target analyte with greater sensitivity, specificity and precision than other available analytical methods. Conclusion: In this review we highlight the fundamentals of different transducer technologies and their applications for immunosensor development currently underway in our laboratory, using rapid screening via an immunochromatographic kit and label-free optical detection via enzyme-, fluorescence-, gold nanoparticle- and carbon nanotube-based immunosensing for sensitive and specific monitoring of opiates. PMID:26929925

  6. Immunoanalysis Methods for the Detection of Dioxins and Related Chemicals

    PubMed Central

    Tian, Wenjing; Xie, Heidi Qunhui; Fu, Hualing; Pei, Xinhui; Zhao, Bin

    2012-01-01

    With the development of biotechnology, approaches based on antibodies, such as enzyme-linked immunosorbent assay (ELISA), active aryl hydrocarbon immunoassay (Ah-I) and other multi-analyte immunoassays, have been utilized as alternatives to the conventional techniques based on gas chromatography and mass spectrometry for the analysis of dioxin and dioxin-like compounds in environmental and biological samples. These screening methods have been verified as rapid, simple and cost-effective. This paper provides an overview of the development and application of antibody-based approaches, such as ELISA, Ah-I, and multi-analyte immunoassays, covering sample extraction and cleanup, antigen design, antibody preparation and immunoanalysis. However, in order to meet the requirements for on-site fast detection and relative quantification of dioxins in the environment, further optimization is needed to make these immunoanalytical methods more sensitive and easier to use. PMID:23443395

  7. On-line soft sensing in upstream bioprocessing.

    PubMed

    Randek, Judit; Mandenius, Carl-Fredrik

    2018-02-01

    This review provides an overview and a critical discussion of novel possibilities of applying soft sensors for on-line monitoring and control of industrial bioprocesses. Focus is on bio-product formation in the upstream process but also the integration with other parts of the process is addressed. The term soft sensor is used for the combination of analytical hardware data (from sensors, analytical devices, instruments and actuators) with mathematical models that create new real-time information about the process. In particular, the review assesses these possibilities from an industrial perspective, including sensor performance, information value and production economy. The capabilities of existing analytical on-line techniques are scrutinized in view of their usefulness in soft sensor setups and in relation to typical needs in bioprocessing in general. The review concludes with specific recommendations for further development of soft sensors for the monitoring and control of upstream bioprocessing.

  8. Preanalytics in lung cancer.

    PubMed

    Warth, Arne; Muley, Thomas; Meister, Michael; Weichert, Wilko

    2015-01-01

    Preanalytic sampling techniques and preparation of tissue specimens strongly influence analytical results in lung tissue diagnostics, both on the morphological and on the molecular level. However, in contrast to analytics, where tremendous achievements in the last decade have led to a whole new portfolio of test methods, developments in preanalytics have been minimal. This is specifically unfortunate in lung cancer, where usually only small amounts of tissue are at hand and optimization of all processing steps is mandatory in order to increase the diagnostic yield. In the following, we provide a comprehensive overview of some aspects of preanalytics in lung cancer, from the method of sampling through tissue processing to its impact on analytical test results. We specifically discuss the role of preanalytics in novel technologies like next-generation sequencing and in state-of-the-art cytology preparations. In addition, we point out specific problems in preanalytics which hamper further developments in the field of lung tissue diagnostics.

  9. Nuclear and atomic analytical techniques in environmental studies in South America.

    PubMed

    Paschoa, A S

    1990-01-01

    The use of nuclear analytical techniques for environmental studies in South America is selectively reviewed, from the early work of Lattes with cosmic rays to the recent applications of the PIXE (particle-induced X-ray emission) technique to air pollution problems in large cities such as São Paulo and Rio de Janeiro. Studies on natural radioactivity and fallout from nuclear weapons in South America are also briefly examined.

  10. Green aspects, developments and perspectives of liquid phase microextraction techniques.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2014-02-01

    Determination of analytes at trace levels in complex samples (e.g. biological samples or contaminated water or soils) is often required for environmental assessment and monitoring, as well as for scientific research in the field of environmental pollution. A limited number of analytical techniques are sensitive enough for the direct determination of trace components in samples; because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention was given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also, new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the sample-preparation stage prior to chromatographic determination, are presented. © 2013 Published by Elsevier B.V.

  11. Pre-analytic and analytic sources of variations in thiopurine methyltransferase activity measurement in patients prescribed thiopurine-based drugs: A systematic review.

    PubMed

    Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A

    2011-07-01

    Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    This report summarizes the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 2000 (October 1999 through September 2000). This annual progress report, which is the seventeenth in this series for the ACL, describes effort on continuing projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The ACL operates within the ANL system as a full-cost-recovery service center, but it has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients--Argonne National Laboratory, the Department of Energy, and others--and will conduct world-class research and development in analytical chemistry and its applications. The ACL handles a wide range of analytical problems that reflects the diversity of research and development (R&D) work at ANL. Some routine or standard analyses are done, but the ACL operates more typically in a problem-solving mode in which development of methods is required or adaptation of techniques is needed to obtain useful analytical data. The ACL works with clients and commercial laboratories if a large number of routine analyses are required. Much of the support work done by the ACL is very similar to applied analytical chemistry research work.

  13. Nanosensors based on functionalized nanoparticles and surface enhanced raman scattering

    DOEpatents

    Talley, Chad E.; Huser, Thomas R.; Hollars, Christopher W.; Lane, Stephen M.; Satcher, Jr., Joe H.; Hart, Bradley R.; Laurence, Ted A.

    2007-11-27

    Surface-Enhanced Raman Spectroscopy (SERS) is a vibrational spectroscopic technique that utilizes metal surfaces to provide signal enhancements of several orders of magnitude. When molecules of interest are attached to designed metal nanoparticles, a SERS signal is attainable with single-molecule detection limits. This provides an ultrasensitive means of detecting the presence of molecules. By using selective chemistries, metal nanoparticles can be functionalized to provide a unique signal upon analyte binding. Moreover, by using measurement techniques such as ratiometric analysis of received SERS spectra, such metal nanoparticles can be used to monitor dynamic processes in addition to static binding events. Accordingly, such nanoparticles can be used as nanosensors for a wide range of chemicals in fluid, gaseous and solid form, as environmental sensors for pH, ion concentration, temperature, etc., and as biological sensors for proteins, DNA, RNA, etc.

  14. Electrospray Ionization Mass Spectrometry: A Technique to Access the Information beyond the Molecular Weight of the Analyte

    PubMed Central

    Banerjee, Shibdas; Mazumdar, Shyamalava

    2012-01-01

    Electrospray ionization (ESI) is a soft ionization technique extensively used for the production of gas-phase ions (without fragmentation) of thermally labile large supramolecules. In the present review we describe the development of electrospray ionization mass spectrometry (ESI-MS) over the last 25 years in the study of various properties of different types of biological molecules. There have been extensive studies on the mechanism of formation of charged gaseous species by ESI. Several groups have investigated the origin and implications of the multiple charge states of proteins observed in the ESI mass spectra of proteins. The charged analytes produced by ESI can be fragmented by activating them in the gas phase, and thus tandem mass spectrometry has been developed, which provides very important insights into the structural properties of the molecule. The review will highlight recent developments and emerging directions in this fascinating area of research. PMID:22611397

  15. The comparative evaluation of expanded national immunization policies in Korea using an analytic hierarchy process.

    PubMed

    Shin, Taeksoo; Kim, Chun-Bae; Ahn, Yang-Heui; Kim, Hyo-Youl; Cha, Byung Ho; Uh, Young; Lee, Joo-Heon; Hyun, Sook-Jung; Lee, Dong-Han; Go, Un-Yeong

    2009-01-29

    The purpose of this paper is to propose new evaluation criteria and an analytic hierarchy process (AHP) model to assess expanded national immunization programs (ENIPs) and to evaluate two alternative health care policies: under one, private clinics and hospitals would offer free vaccination services to children; under the other, public health centers would offer these free vaccination services. Our model to evaluate ENIPs was developed using brainstorming, Delphi techniques, and the AHP. We first used brainstorming and Delphi techniques, as well as literature reviews, to determine 25 criteria with which to evaluate the national immunization policy; we then proposed a hierarchical structure of the AHP model to assess ENIPs. By applying the proposed AHP model to the assessment of ENIPs for Korean immunization policies, we show that free vaccination services should be provided by private clinics and hospitals rather than by public health centers.
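    The core computation of an AHP model like the one above is deriving priority weights for criteria from a pairwise comparison matrix. A minimal sketch, using the geometric-mean approximation to the principal eigenvector and an invented 3x3 comparison matrix (the paper's 25-criterion hierarchy is not reproduced here):

```python
from math import prod

def ahp_weights(matrix):
    """Approximate AHP priority weights from a pairwise comparison matrix
    using the geometric-mean (logarithmic least squares) method."""
    n = len(matrix)
    gm = [prod(row) ** (1.0 / n) for row in matrix]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]                   # normalize to sum to 1

# Hypothetical comparison of three criteria on the Saaty 1-9 scale:
# matrix[i][j] = how much more important criterion i is than criterion j.
pairwise = [
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # first criterion dominates
```

    Alternatives are scored the same way under each criterion, and the overall ranking is the weight-combined score down the hierarchy.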

  16. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  17. Insights from a Modular Interdisciplinary Laboratory Course

    NASA Astrophysics Data System (ADS)

    Bouvier-Brown, N. C.; Carmona, V.

    2016-12-01

    A laboratory curriculum is naturally oriented towards "hands-on" and problem-based learning. An earth science course, like chemical ecology, has the additional advantage of interdisciplinary education. Our Chemical Ecology course at Loyola Marymount University was structured as a small modular workshop. Students first gained hands-on experience with each analytical technique in the laboratory. The class then discussed these first datasets, delved into lecture topics, and tweaked their experimental procedures. Lastly, students were given time to design and execute their own experiments and present their findings. Three to four class periods were allotted for each laboratory topic. Detailed information and student reflections will be presented from the Spectroscopy module. Students used the Folin-Ciocalteu reagent method to measure the phenolic content of vegetation and/or soils. Because phenols are produced by plants for defense, this spectroscopic laboratory activity provided insight on allelopathy using analytical chemistry. Students were extremely engaged and learned not only the lab theory and technique, but also its application to our local ecology.

  18. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    PubMed

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.

  19. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…

  20. Is Quality/Effectiveness An Empirically Demonstrable School Attribute? Statistical Aids for Determining Appropriate Levels of Analysis.

    ERIC Educational Resources Information Center

    Griffith, James

    2002-01-01

    Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…

  1. Multi-band transmission color filters for multi-color white LEDs based visible light communication

    NASA Astrophysics Data System (ADS)

    Wang, Qixia; Zhu, Zhendong; Gu, Huarong; Chen, Mengzhu; Tan, Qiaofeng

    2017-11-01

    Light-emitting diode (LED) based visible light communication (VLC) can provide license-free bands, high data rates, and high security levels; it is a promising technique that will be extensively applied in the future. Multi-band transmission color filters with sufficient peak transmittance and suitable bandwidth play a pivotal role in boosting the signal-to-noise ratio in VLC systems. In this paper, multi-band transmission color filters with bandwidths of tens of nanometers are designed by a simple analytical method. Experimental results for one-dimensional (1D) and two-dimensional (2D) tri-band color filters demonstrate the effectiveness of the multi-band transmission color filters and the corresponding analytical method.

  2. On-orbit cryogenic fluid transfer

    NASA Technical Reports Server (NTRS)

    Aydelott, J. C.; Gille, J. P.; Eberhardt, R. N.

    1984-01-01

    A number of future NASA and DOD missions have been identified that will require, or could benefit from, resupply of cryogenic liquids in orbit. The most promising approach for accomplishing cryogenic fluid transfer in the weightlessness environment of space is the thermodynamic filling technique. This approach involves initially reducing the receiver tank temperature by using several charge-hold-vent cycles, followed by filling the tank without venting. Martin Marietta Denver Aerospace, under contract to the NASA Lewis Research Center, is currently developing analytical models to describe the on-orbit cryogenic fluid transfer process. A detailed design of a shuttle-attached experimental facility, which will provide the data necessary to verify the analytical models, is also being performed.

  3. The Responsivity of a Miniaturized Passive Implantable Wireless Pressure Sensor.

    PubMed

    Jiang, Hao; Lan, Di; Goldman, Ken; Etemadi, Mozziyar; Shahnasser, Hamid; Roy, Shuvo

    2011-01-01

    A miniature batteryless implantable wireless pressure sensor that can be used deep inside the body is desired by the medical community. MEMS technology makes it possible to achieve high responsivity that directly determines the operating distance between a miniature implanted sensor and the external RF probe, while providing the read-out. In this paper, for the first time, an analytical expression of the system responsivity versus the sensor design is derived using an equivalent circuit model. Also, the integration of micro-coil inductors and pressure sensitive capacitors on a single silicon chip using MEMS fabrication techniques is demonstrated. Further, the derived analytical design theory is validated by the measured responsivity of these sensors.
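    The paper's actual responsivity expression is not given in the abstract. A common first-order model for such implants, assumed here purely for illustration, treats the sensor as an LC tank whose resonant frequency shifts as pressure changes the MEMS capacitance; all component values below are hypothetical:

```python
import math

def resonant_freq_hz(L_h, C_f):
    """Resonant frequency of an ideal LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L_h * C_f))

def responsivity_hz_per_pa(L_h, C_f, dC_dP):
    """Responsivity df/dP by the chain rule:
    df/dP = (df/dC) * (dC/dP) = -f / (2*C) * dC/dP."""
    f = resonant_freq_hz(L_h, C_f)
    return -f / (2.0 * C_f) * dC_dP

# Hypothetical MEMS sensor values (not taken from the paper):
L = 100e-9     # 100 nH integrated micro-coil
C = 10e-12     # 10 pF pressure-sensitive capacitor
dC_dP = 1e-15  # 1 fF of capacitance change per pascal

f0 = resonant_freq_hz(L, C)   # ~159 MHz carrier
R = responsivity_hz_per_pa(L, C, dC_dP)
print(f"f0 = {f0 / 1e6:.1f} MHz, responsivity = {R / 1e3:.2f} kHz/Pa")
```

    The external probe reads this frequency shift inductively, so a larger |df/dP| directly relaxes the read-out distance requirement discussed in the abstract.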

  4. Focused Ion Beam Recovery and Analysis of Interplanetary Dust Particles (IDPs) and Stardust Analogues

    NASA Technical Reports Server (NTRS)

    Graham, G. A.; Bradley, J. P.; Bernas, M.; Stroud, R. M.; Dai, Z. R.; Floss, C.; Stadermann, F. J.; Snead, C. J.; Westphal, A. J.

    2004-01-01

    Meteoritics research is a major beneficiary of recent developments in analytical instrumentation [1,2]. Integrated studies in which multiple analytical techniques are applied to the same specimen are providing new insight about the nature of IDPs [1]. Such studies are dependent on the ability to prepare specimens that can be analyzed in multiple instruments. Focused ion beam (FIB) microscopy has revolutionized specimen preparation in materials science [3]. Although FIB has successfully been used for a few IDP and meteorite studies [1,4-6], it has yet to be widely utilized in meteoritics. We are using FIB for integrated TEM/NanoSIMS/synchrotron infrared (IR) studies [1].

  5. Single Particle-Inductively Coupled Plasma Mass Spectroscopy Analysis of Metallic Nanoparticles in Environmental Samples with Large Dissolved Analyte Fractions.

    PubMed

    Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I

    2016-10-18

    There is increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved signal. While sample dilution is often necessary to achieve the low analyte concentrations required for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate one, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third serves as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass, and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and to improve the quality and consistency of data generated from this relatively new analytical tool.
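    The particle/dissolved separation that the dilution series is designed to enable can be sketched as follows: pulses above an iteratively estimated background are counted as particles, and each background-corrected pulse is converted to a spherical-equivalent diameter. The threshold rule, sensitivity factor and synthetic readings below are assumptions for illustration, not the authors' procedure:

```python
import math
from statistics import mean, stdev

def detect_particles(intensities, n_sigma=3.0):
    """Iteratively separate the dissolved background from particle pulses:
    readings above mean + n_sigma*stdev are removed until the background
    estimate converges; readings above the final threshold are particles."""
    data = list(intensities)
    while True:
        mu, sd = mean(data), stdev(data)
        keep = [x for x in data if x <= mu + n_sigma * sd]
        if len(keep) == len(data):
            return mu, [x for x in intensities if x > mu + n_sigma * sd]
        data = keep

def diameter_nm(counts, background, counts_per_gram, density_g_cm3):
    """Convert one background-corrected pulse to a spherical-equivalent
    diameter: d = (6*m / (pi*rho))**(1/3)."""
    mass_g = (counts - background) / counts_per_gram
    d_cm = (6.0 * mass_g / (math.pi * density_g_cm3)) ** (1.0 / 3.0)
    return d_cm * 1e7  # cm -> nm

# Synthetic dwell-time trace: dissolved Ag background around 100 counts
# with three particle pulses mixed in (all values are illustrative).
bg_readings = [97, 99, 100, 101, 103, 98, 102, 100, 99, 101] * 3
trace = (bg_readings[:10] + [5100] + bg_readings[10:20] + [9800]
         + bg_readings[20:] + [5200])

bg, pulses = detect_particles(trace)
K = 1e16        # hypothetical sensitivity, counts per gram of Ag
RHO_AG = 10.49  # density of silver, g/cm^3
sizes = [diameter_nm(p, bg, K, RHO_AG) for p in pulses]
print(f"{len(pulses)} particles, background ~{bg:.0f} counts")
```

    High dissolved levels raise both the background and its spread, which is exactly why the paper dilutes the dissolved fraction before attempting this separation.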

  6. Attenuated total reflectance FT-IR imaging and quantitative energy dispersive-electron probe X-ray microanalysis techniques for single particle analysis of atmospheric aerosol particles.

    PubMed

    Ryu, JiYeon; Ro, Chul-Un

    2009-08-15

    This work demonstrates the practical applicability of the combined use of attenuated total reflectance (ATR) FT-IR imaging and low-Z particle electron probe X-ray microanalysis (EPMA) techniques for the characterization of individual aerosol particles. These two single-particle analytical techniques provide complementary information on the physicochemical characteristics of the same individual particles: the low-Z particle EPMA provides information on morphology and elemental concentration, and ATR-FT-IR imaging on functional groups, molecular species, and crystal structure. It was confirmed that the ATR-FT-IR imaging technique can provide sufficient FT-IR absorption signals to perform molecular speciation of individual particles of micrometer size when applied to artificially generated aerosol particles such as ascorbic acid and NaNO3 aerosols. An indoor atmospheric aerosol sample was investigated to demonstrate the practical feasibility of the combined application of ATR-FT-IR imaging and low-Z particle EPMA techniques for the characterization of individual airborne particles.

  7. Surface-enhanced Raman spectroscopy for the detection of pathogenic DNA and protein in foods

    NASA Astrophysics Data System (ADS)

    Chowdhury, Mustafa H.; Atkinson, Brad; Good, Theresa; Cote, Gerard L.

    2003-07-01

    Traditional Raman spectroscopy, while extremely sensitive to structure and conformation, is an ineffective tool for the detection of bioanalytes at the sub-millimolar level. Surface Enhanced Raman Spectroscopy (SERS) is a more recently developed technique that has been used with considerable success to enhance the Raman cross-section of a molecule by factors of 10^6 to 10^14. This technique can be exploited in a nanoscale biosensor for the detection of pathogenic proteins and DNA in foods by using a biorecognition molecule to bring a target analyte into close proximity to the metal surface. This is expected to produce a SERS signal of the target analyte, making it possible to easily discriminate between the target analyte and possible confounders. For the sensor to be effective, the Raman spectrum of the target analyte must be distinct from that of the biorecognition molecule, as both would be in close proximity to the metal surface and thus subject to the SERS effect. In our preliminary studies we have successfully used citrate-reduced silver colloidal particles to obtain unique SERS spectra of α-helical and β-sheet bovine serum albumin (BSA), which served as models of an α-helical antibody (biorecognition element) and a β-sheet target protein (pathogenic prion). In addition, unique SERS spectra of double-stranded and single-stranded DNA were also obtained, where the single-stranded DNA served as the model for the biorecognition element and the double-stranded DNA served as the model for the DNA probe/target hybrid. This confirms the feasibility of the method, which opens opportunities for potentially widespread applications in the detection of food pathogens, biowarfare agents, and other bio-analytes.

  8. Mixing of two co-directional Rayleigh surface waves in a nonlinear elastic material.

    PubMed

    Morlock, Merlin B; Kim, Jin-Yeon; Jacobs, Laurence J; Qu, Jianmin

    2015-01-01

    The mixing of two co-directional, initially monochromatic Rayleigh surface waves in an isotropic, homogeneous, and nonlinear elastic solid is investigated using analytical, finite element method, and experimental approaches. The analytical investigations show that while the horizontal velocity component can form a shock wave, the vertical velocity component can form a pulse independent of the specific ratios of the fundamental frequencies and amplitudes that are mixed. This analytical model is then used to simulate the development of the fundamentals, second harmonics, and the sum and difference frequency components over the propagation distance. The analytical model is further extended to include diffraction effects in the parabolic approximation. Finally, the frequency and amplitude ratios of the fundamentals are identified which provide maximum amplitudes of the second harmonics as well as of the sum and difference frequency components, to help guide effective material characterization; this approach should make it possible to measure the acoustic nonlinearity of a solid not only with the second harmonics, but also with the sum and difference frequency components. Results of the analytical investigations are then confirmed using the finite element method and the experimental feasibility of the proposed technique is validated for an aluminum specimen.
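    The sum- and difference-frequency generation tracked in this abstract can be illustrated with a toy one-dimensional signal model: passing two mixed tones through a quadratic nonlinearity produces exactly the second-harmonic, sum-, and difference-frequency components described. The frequencies and nonlinearity parameter below are illustrative, not taken from the paper.

```python
import numpy as np

fs = 10_000.0                      # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)    # 1 s record -> exact 1 Hz frequency bins
f1, f2 = 300.0, 200.0              # fundamental frequencies (illustrative)
beta = 0.3                         # quadratic nonlinearity parameter (illustrative)

u = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
v = u + beta * u ** 2              # second-order distortion of the mixed wave

spec = np.abs(np.fft.rfft(v)) / len(v)

def amplitude(f):
    """Spectral amplitude at frequency f (signals sit on exact 1 Hz bins)."""
    return spec[int(round(f))]

# The quadratic term generates second harmonics (2*f1, 2*f2) plus the
# sum (f1 + f2) and difference (f1 - f2) frequency components.
for f in (f1, f2, 2 * f1, 2 * f2, f1 + f2, f1 - f2):
    print(f"{f:6.0f} Hz : {amplitude(f):.3f}")
```

    The mixing products appear only because of the `beta * u**2` term; with `beta = 0` the spectrum contains the two fundamentals alone.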

  9. Kinematic synthesis of adjustable robotic mechanisms

    NASA Astrophysics Data System (ADS)

    Chuenchom, Thatchai

    1993-01-01

    Conventional hard automation, such as a linkage-based or cam-driven system, provides high-speed capability and repeatability but not the flexibility required in many industrial applications. Conventional mechanisms, which are typically single-degree-of-freedom systems, are being increasingly replaced by multi-degree-of-freedom, multi-actuator systems driven by logic controllers. Although this new trend toward sophistication provides greatly enhanced flexibility, there are many instances where the flexibility needs are exaggerated and the associated complexity is unnecessary. Traditional mechanism-based hard automation, on the other hand, can neither fulfill multi-task requirements nor be cost-effective, mainly due to a lack of methods and tools to design in flexibility. This dissertation attempts to bridge this technological gap by developing Adjustable Robotic Mechanisms (ARMs), or 'programmable mechanisms', as a middle ground between high-speed hard automation and expensive serial jointed-arm robots. This research introduces the concept of adjustable robotic mechanisms toward cost-effective manufacturing automation. A generalized analytical synthesis technique has been developed to support the computational design of ARMs and lays the theoretical foundation for the synthesis of adjustable mechanisms. The synthesis method developed in this dissertation, called generalized adjustable dyad and triad synthesis, advances the well-known Burmester theory in kinematics to a new level. While this method provides planar solutions, a novel patented scheme is utilized for converting prescribed three-dimensional motion specifications into sets of planar projections. This provides an analytical and computational tool for designing adjustable mechanisms that satisfy multiple sets of three-dimensional motion specifications. Several design issues were addressed, including adjustable parameter identification, branching defects, and mechanical errors. 
An efficient mathematical scheme for identification of the adjustable member was also developed. The analytical synthesis techniques developed in this dissertation were successfully implemented in a graphics-intensive, user-friendly computer program. A physical prototype of a general-purpose adjustable robotic mechanism has been constructed to serve as a proof-of-concept model.

  10. Considerations for standardizing predictive molecular pathology for cancer prognosis.

    PubMed

    Fiorentino, Michelangelo; Scarpelli, Marina; Lopez-Beltran, Antonio; Cheng, Liang; Montironi, Rodolfo

    2017-01-01

    Molecular tests that were once ancillary to the core business of cyto-histopathology are becoming the most relevant workload in pathology departments after histopathology/cytopathology and before autopsies. This has resulted from innovations in molecular biology techniques, which have developed at an incredibly fast pace. Areas covered: Most of the currently widely used techniques in molecular pathology, such as FISH, direct sequencing, pyrosequencing, and allele-specific PCR, will be replaced by massively parallel sequencing, which will no longer be considered next-generation but rather current-generation sequencing. The pre-analytical steps of molecular techniques, such as DNA extraction and sample preparation, will be largely automated. Moreover, all molecular pathology instruments will be part of an integrated workflow that traces the sample from extraction through the analytical steps until the results are reported; these steps will be guided by expert laboratory information systems. In situ hybridization and immunohistochemistry for quantification will be largely digitized, just as histology will be mostly digitized rather than viewed under the microscope. Expert commentary: This review summarizes the technical and regulatory issues concerning the standardization of molecular tests in pathology. A vision of future technological changes is also provided.

  11. Quantitative Determination of Noa (Naturally Occurring Asbestos) in Rocks : Comparison Between Pcom and SEM Analysis

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Amodeo, Francesco; Giorgis, Ilaria; Vitaliti, Martina

    2017-04-01

    The quantification of NOA (Naturally Occurring Asbestos) in a rock or soil matrix is complex and subject to numerous errors. The purpose of this study is to compare two fundamental methodologies used for the analysis: the first uses Phase Contrast Optical Microscopy (PCOM), while the second uses Scanning Electron Microscopy (SEM). The two methods, although they provide the same result, namely the ratio of asbestos mass to total mass, have completely different characteristics, and both present pros and cons. The current legislation in Italy prescribes the use of SEM, XRD, FTIR, and PCOM (DM 6/9/94) for the quantification of asbestos in bulk materials and soils; the threshold beyond which the material is considered hazardous waste is an asbestos fiber concentration of 1000 mg/kg (DM 161/2012). The most used technology is SEM, which offers the best analytical sensitivity among these methods (120 mg/kg, DM 6/9/94). The fundamental differences among the analyses are mainly: amount of analyzed sample portion; representativeness of the sample; resolution; analytical precision; uncertainty of the methodology; and operator errors. Owing to the quantification limits of XRD and FTIR (1%, DM 6/9/94), our Asbestos Laboratory (DIATI POLITO) has applied the PCOM methodology for more than twenty years and, in recent years, the SEM methodology for quantification of asbestos content. The aim of our research is to compare the results obtained from PCOM analysis with those provided by SEM analysis on the basis of more than 100 natural samples, both from cores (tunnel boring or exploratory drilling) and from tunnelling excavation. The results obtained show, in most cases, a good correlation between the two techniques. Of particular relevance is the fact that both techniques are reliable for very low quantities of asbestos, even lower than the analytical sensitivity. 
This work highlights the comparison between the two techniques, emphasizing the strengths and weaknesses of the two procedures, and suggests that an integrated approach, together with the skills and experience of the operator, may be the best way forward to achieve a constructive improvement of analysis techniques.

  12. Ion transfer through solvent polymeric membranes driven by an exponential current flux.

    PubMed

    Molina, A; Torralba, E; González, J; Serna, C; Ortuño, J A

    2011-03-21

    General analytical equations which govern ion transfer through liquid membranes with one and two polarized interfaces driven by an exponential current flux are derived. Expressions for the transient and stationary E-t, dt/dE-E and dI/dE-E curves are obtained, and the evolution from transient to steady behaviour has been analyzed in depth. We have also shown mathematically that the voltammetric and stationary chronopotentiometric I(N)-E curves are identical (with E being the applied potential for voltammetric techniques and the measured potential for chronopotentiometric techniques), and hence, their derivatives provide identical information.

  13. A Compact, Solid-State UV (266 nm) Laser System Capable of Burst-Mode Operation for Laser Ablation Desorption Processing

    NASA Technical Reports Server (NTRS)

    Arevalo, Ricardo, Jr.; Coyle, Barry; Poulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Brinckerhoff, William

    2015-01-01

    Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially-resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g⁻¹); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).

  14. Fluorescence excitation-emission matrix spectroscopy for degradation monitoring of machinery lubricants

    NASA Astrophysics Data System (ADS)

    Sosnovski, Oleg; Suresh, Pooja; Dudelzak, Alexander E.; Green, Benjamin

    2018-02-01

    Lubrication oil is a vital component of heavy rotating machinery, defining the machine's health, operational safety and effectiveness. Recently, the focus has been on developing sensors that provide real-time/online monitoring of oil condition/lubricity. Industrial practices and standards for assessing oil condition involve various analytical methods, most of which are unsuitable for online applications. The paper presents the results of studying the degradation of antioxidant additives in machinery lubricants using Fluorescence Excitation-Emission Matrix (EEM) Spectroscopy and machine learning techniques. EEM spectroscopy is capable of rapid and even standoff sensing; it is potentially applicable to real-time online monitoring.

  15. Dynamic characterization, monitoring and control of rotating flexible beam-mass structures via piezo-embedded techniques

    NASA Technical Reports Server (NTRS)

    Lai, Steven H.-Y.

    1992-01-01

    A variational principle and a finite element discretization technique were used to derive the dynamic equations for a high-speed rotating flexible beam-mass system embedded with piezoelectric materials. The dynamic equations thus obtained allow the development of finite element models that accommodate both the original structural element and the piezoelectric element. The solutions of the finite element models provide the system dynamics needed to design a sensing system. The characterization of the gyroscopic effect and damping capacity of smart rotating devices is addressed. Several simulation examples are presented to validate the analytical solution.

  16. Microorganism Response to Stressed Terrestrial Environments: A Raman Spectroscopic Perspective of Extremophilic Life Strategies

    NASA Astrophysics Data System (ADS)

    Jorge-Villar, Susana E.; Edwards, Howell G. M.

    2013-03-01

    Raman spectroscopy is a valuable analytical technique for the identification of biomolecules and minerals in natural samples that involves little or minimal sample manipulation. In this paper, we evaluate the advantages and disadvantages of this technique as applied to the study of extremophiles. Furthermore, we provide a review of the results published to date on the bio- and geo-strategies adopted by different types of extremophile colonies of microorganisms. We also show the characteristic Raman signatures for the identification of pigments and minerals that appear in these complex samples.

  17. Protein-centric N-glycoproteomics analysis of membrane and plasma membrane proteins.

    PubMed

    Sun, Bingyun; Hood, Leroy

    2014-06-06

    The advent of proteomics technology has transformed our understanding of biological membranes. The challenges for studying membrane proteins have inspired the development of many analytical and bioanalytical tools, and the techniques of glycoproteomics have emerged as an effective means to enrich and characterize membrane and plasma-membrane proteomes. This Review summarizes the development of various glycoproteomics techniques to overcome the hurdles formed by the unique structures and behaviors of membrane proteins with a focus on N-glycoproteomics. Example contributions of N-glycoproteomics to the understanding of membrane biology are provided, and the areas that require future technical breakthroughs are discussed.

  18. A workflow learning model to improve geovisual analytics utility

    PubMed Central

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2011-01-01

    Introduction: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. 
To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009. PMID:21983545

  19. A workflow learning model to improve geovisual analytics utility.

    PubMed

    Roth, Robert E; MacEachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. 
To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009.

  20. Engagement vs Performance: Using Electronic Portfolios to Predict First Semester Engineering Student Persistence

    ERIC Educational Resources Information Center

    Aguiar, Everaldo; Ambrose, G. Alex; Chawla, Nitesh V.; Goodrich, Victoria; Brockman, Jay

    2014-01-01

    As providers of higher education begin to harness the power of big data analytics, one very fitting application for these new techniques is that of predicting student attrition. The ability to pinpoint students who might soon decide to drop out, or who may be following a suboptimal path to success, allows those in charge not only to understand the…

  1. Pesticide Multiresidue Analysis in Cereal Grains Using Modified QuEChERS Method Combined with Automated Direct Sample Introduction GC-TOFMS and UPLC-MS/MS Techniques

    USDA-ARS?s Scientific Manuscript database

    The QuEChERS (quick, easy, cheap, effective, rugged, and safe) sample preparation method was modified to accommodate various cereal grain matrices (corn, oat, rice and wheat) and provide good analytical results (recoveries in the range of 70-120% and RSDs <20%) for the majority of the target pestici...

  2. Comparison of methods for determination of total oil sands-derived naphthenic acids in water samples.

    PubMed

    Hughes, Sarah A; Huang, Rongfu; Mahaffey, Ashley; Chelme-Ayala, Pamela; Klamerth, Nikolaus; Meshref, Mohamed N A; Ibrahim, Mohamed D; Brown, Christine; Peru, Kerry M; Headley, John V; Gamal El-Din, Mohamed

    2017-11-01

    There are several established methods for the determination of naphthenic acids (NAs) in waters associated with oil sands mining operations. Due to their highly complex nature, the measured concentration and composition of NAs vary depending on the method used. This study compared different common sample preparation techniques, analytical instrument methods, and analytical standards to measure NAs in groundwater and process water samples collected from an active oil sands operation. In general, the high- and ultrahigh-resolution methods, namely ultra-performance liquid chromatography time-of-flight mass spectrometry (UPLC-TOF-MS) and Orbitrap mass spectrometry (Orbitrap-MS), were within an order of magnitude of the Fourier transform infrared spectroscopy (FTIR) methods. The gas chromatography mass spectrometry (GC-MS) methods consistently had the highest NA concentrations and greatest standard error. Total NA concentration was not statistically different between solid phase extraction and liquid-liquid extraction sample preparations. Calibration standards influenced quantitation results. This work provided a comprehensive understanding of the inherent differences in the various techniques available to measure NAs and hence the potential differences in measured amounts of NAs in samples. Results from this study will contribute to the analytical method standardization for NA analysis in oil sands related water samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila

    2015-03-10

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using an MCMC exploration. Both methods find the same maximum likelihood region; however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.
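    A minimal PSO sketch conveys the idea behind the calibration. The quadratic stand-in objective below replaces the actual SAM likelihood (stellar mass function, black hole-bulge relation), and all hyperparameters, bounds, and the optimum location are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def neg_log_like(p):
    # Stand-in objective with a known optimum at (0.3, -1.2). A real SAM
    # calibration would instead compare predicted observables against data.
    return np.sum((p - np.array([0.3, -1.2])) ** 2)

def pso(objective, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal Particle Swarm Optimization over a box-bounded space."""
    lo = np.array([b[0] for b in bounds], dtype=float)
    hi = np.array([b[1] for b in bounds], dtype=float)
    x = rng.uniform(lo, hi, size=(n_particles, len(bounds)))  # positions
    v = np.zeros_like(x)                                      # velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2,) + x.shape)
        # inertia + pull toward personal best + pull toward global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best, best_f = pso(neg_log_like, [(-5.0, 5.0), (-5.0, 5.0)])
print(best, best_f)
```

    Each objective evaluation here is cheap; in the SAM setting each one is a full model run, which is why needing an order of magnitude fewer evaluations than MCMC matters.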

  4. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    PubMed

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
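    At the simplest end of the predictive-modeling spectrum described above is a logistic-regression risk score. The sketch below fits one to entirely synthetic, hypothetical patient features; real applications would draw on claims or EHR data and require careful validation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical standardized patient features (e.g. age, number of
# medications, prior missed refills); outcome: medication noncompliance.
# All data here are synthetic, generated from assumed "true" weights.
n = 2000
X = rng.normal(size=(n, 3))
true_w = np.array([0.8, 1.5, 2.0])
p_true = 1.0 / (1.0 + np.exp(-(X @ true_w - 0.5)))
y = (rng.random(n) < p_true).astype(float)

# Logistic regression fit by batch gradient descent -- the "predictive
# modeling" end of the spectrum that runs up through machine learning
# and data mining.
w, b = np.zeros(3), 0.0
for _ in range(2000):
    pred = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (pred - y)) / n
    b -= 0.5 * np.mean(pred - y)

risk = 1.0 / (1.0 + np.exp(-(X @ w + b)))    # per-patient risk score
accuracy = np.mean((risk > 0.5) == (y > 0.5))
print(f"training accuracy: {accuracy:.2f}")
```

    The `risk` vector is the kind of per-patient score that would let a pharmacist prioritize interventions for those most likely to be noncompliant.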

  5. Sensitive screening of abused drugs in dried blood samples using ultra-high-performance liquid chromatography-ion booster-quadrupole time-of-flight mass spectrometry.

    PubMed

    Chepyala, Divyabharathi; Tsai, I-Lin; Liao, Hsiao-Wei; Chen, Guan-Yuan; Chao, Hsi-Chun; Kuo, Ching-Hua

    2017-03-31

    An increased rate of drug abuse is a major social problem worldwide. The dried blood spot (DBS) sampling technique offers many advantages over urine or whole blood sampling techniques. This study developed a simple and efficient ultra-high-performance liquid chromatography-ion booster-quadrupole time-of-flight mass spectrometry (UHPLC-IB-QTOF-MS) method for the analysis of abused drugs and their metabolites using DBS. Fifty-seven compounds covering the most commonly abused drugs, including amphetamines, opioids, cocaine, benzodiazepines, barbiturates, and many other new and emerging abused drugs, were selected as the target analytes of this study. An 80% acetonitrile solvent with a 5-min extraction by Geno grinder was used for sample extraction. A Poroshell column was used to provide efficient separation, and under optimal conditions, the analytical times were 15 and 5 min in positive and negative ionization modes, respectively. Ionization parameters of both the electrospray ionization source and the ion booster (IB) source containing an extra heated zone were optimized to achieve the best ionization efficiency for the investigated abused drugs. In spite of their structural diversity, most of the abused drugs showed an enhanced mass response with the high-temperature ionization from the extra heated zone of the IB source. Compared to electrospray ionization, the IB greatly improved the detection sensitivity for 86% of the analytes by 1.5-14-fold and allowed the developed method to detect trace amounts of compounds on the DBS cards. The validation results showed that the coefficients of variation of intra-day and inter-day precision in terms of the signal intensity were lower than 19.65%. The extraction recovery of all analytes was between 67.21 and 115.14%. The limits of detection of all analytes were between 0.2 and 35.7 ng mL⁻¹. 
The stability study indicated that 7% of compounds showed poor stability (below 50%) on the DBS cards after 6 months of storage at room temperature and -80°C. The reported method provides a new direction for abused drug screening using DBS. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Analytical Modeling of Herschel-Quincke Concept Applied to Inlet Turbofan Engines

    NASA Technical Reports Server (NTRS)

    Hallez, Raphael F.; Burdisso, Ricardo A.; Gerhold, Carl H. (Technical Monitor)

    2002-01-01

    This report summarizes the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the period from January 1999 to December 2000 on the project 'Investigation of an Adaptive Herschel-Quincke Tube Concept for the Reduction of Tonal and Broadband Noise from Turbofan Engines', funded by NASA Langley Research Center. The Herschel-Quincke (HQ) tube concept is a developing technique that consists of circumferential arrays of tubes around the duct. An analytical model is developed to provide prediction and design guidelines for application of the HQ concept to turbofan engine inlets. An infinite duct model is developed and used to provide insight into attenuation mechanisms and design strategies. Based on this early model, the NASA-developed TBIEM3D code is modified for the HQ system. This model allows for investigation of the HQ system combined with a passive liner.

  7. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
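    The relationship between operational availability and the Monte Carlo techniques mentioned above can be sketched in a few lines: for a repairable system with exponential failure and repair times, a simulation of alternating up and down periods should reproduce the steady-state result A = MTBF / (MTBF + MTTR). The MTBF/MTTR values below are illustrative, not from the report.

```python
import random

# Steady-state operational availability of a repairable system:
# analytically A = MTBF / (MTBF + MTTR); the Monte Carlo run below
# alternates exponential up and down periods and should agree closely.
MTBF, MTTR = 100.0, 5.0          # illustrative hours

def simulate(horizon=1_000_000.0, seed=42):
    rng = random.Random(seed)
    t = uptime = 0.0
    while t < horizon:
        ttf = rng.expovariate(1.0 / MTBF)    # time to failure (up period)
        ttr = rng.expovariate(1.0 / MTTR)    # time to repair (down period)
        uptime += min(ttf, horizon - t)      # clip the final up period
        t += ttf + ttr
    return uptime / horizon

analytic = MTBF / (MTBF + MTTR)
print(f"analytic A = {analytic:.4f}, simulated A = {simulate():.4f}")
```

    The same simulation skeleton extends naturally to the trade-off studies mentioned above, e.g. sweeping MTTR to see its effect on availability.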

  8. Development and optimization of an energy-regenerative suspension system under stochastic road excitation

    NASA Astrophysics Data System (ADS)

    Huang, Bo; Hsieh, Chen-Yu; Golnaraghi, Farid; Moallem, Mehrdad

    2015-11-01

    In this paper a vehicle suspension system with energy harvesting capability is developed, and an analytical methodology for the optimal design of the system is proposed. The optimization technique provides design guidelines for determining the stiffness and damping coefficients aimed at the optimal performance in terms of ride comfort and energy regeneration. The corresponding performance metrics are selected as the root-mean-square (RMS) of sprung-mass acceleration and the expectation of generated power. The actual road roughness is considered as the stochastic excitation defined by ISO 8608:1995 standard road profiles and used in deriving the optimization method. An electronic circuit is proposed to provide variable damping in real time based on the optimization rule. A test bed is utilized, and experiments under different driving conditions are conducted to verify the effectiveness of the proposed method. The test results suggest that the analytical approach is credible in determining the optimality of system performance.
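
    The two performance metrics named in this abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's method: the signals, damping coefficient, and sample rate are all assumed stand-ins rather than ISO 8608 road profiles.

```python
import numpy as np

# Hedged sketch of the two metrics: RMS sprung-mass acceleration
# (ride comfort) and expected regenerated power, here modeled as the
# mean of c * v_rel**2 for an assumed damper coefficient c.
rng = np.random.default_rng(0)
dt = 0.01                                  # sample interval [s]
t = np.arange(0.0, 10.0, dt)
accel = rng.normal(0.0, 0.5, t.size)       # sprung-mass acceleration [m/s^2]
v_rel = rng.normal(0.0, 0.1, t.size)       # suspension relative velocity [m/s]
c = 1500.0                                 # damping coefficient [N*s/m]

rms_accel = np.sqrt(np.mean(accel ** 2))   # ride-comfort metric
mean_power = np.mean(c * v_rel ** 2)       # expected regenerated power [W]
print(rms_accel, mean_power)
```

    In the paper's setting these two quantities are traded off against each other to select the stiffness and damping coefficients.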

  9. Bounded Linear Stability Analysis - A Time Delay Margin Estimation Approach for Adaptive Control

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Ishihara, Abraham K.; Krishnakumar, Kalmanje Srinivas; Bakhtiari-Nejad, Maryam

    2009-01-01

    This paper presents a method for estimating the time delay margin for model-reference adaptive control of systems with almost linear structured uncertainty. The bounded linear stability analysis method seeks to represent the conventional model-reference adaptive law by a locally bounded linear approximation within a small time window using the comparison lemma. The locally bounded linear approximation of the combined adaptive system is cast in the form of an input-time-delay differential equation over a small time window. The time delay margin of this system represents a local stability measure and is computed analytically by a matrix measure method, which provides a simple analytical technique for estimating an upper bound of the time delay margin. Based on simulation results for a scalar model-reference adaptive control system, both the bounded linear stability method and the matrix measure method are seen to provide a reasonably accurate and yet not too conservative time delay margin estimation.
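
    The matrix measure mentioned above has a standard closed form. The sketch below illustrates the idea with a classical delay-independent sufficient condition, not the paper's specific bound; the matrices are assumed illustrative values.

```python
import numpy as np

# The 2-norm matrix measure mu2(A) is the largest eigenvalue of the
# symmetric part (A + A^T)/2. A classical sufficient condition for
# dx/dt = A x(t) + Ad x(t - tau): if mu2(A) + ||Ad||_2 < 0, the system
# is stable for every delay tau >= 0 (a delay-independent bound).
def matrix_measure_2(A):
    """2-norm matrix measure: largest eigenvalue of the symmetric part."""
    return float(np.max(np.linalg.eigvalsh((A + A.T) / 2.0)))

A = np.array([[-3.0, 1.0], [0.0, -2.0]])    # assumed stable nominal dynamics
Ad = np.array([[0.5, 0.0], [0.0, 0.5]])     # assumed delayed-state coupling

mu = matrix_measure_2(A)
delay_independent_stable = mu + np.linalg.norm(Ad, 2) < 0
print(mu, delay_independent_stable)
```

    The paper uses the measure to bound the delay margin of the locally linearized adaptive system rather than to test delay-independent stability, but the computation of the measure itself is the same.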

  10. Exploring the potential of high resolution mass spectrometry for the investigation of lignin-derived phenol substitutes in phenolic resin syntheses.

    PubMed

    Dier, Tobias K F; Fleckenstein, Marco; Militz, Holger; Volmer, Dietrich A

    2017-05-01

    Chemical degradation is an efficient method to obtain bio-oils and other compounds from lignin. Lignin bio-oils are potential substitutes for the phenol component of phenol formaldehyde (PF) resins. Here, we developed an analytical method based on high resolution mass spectrometry that provided structural information for the synthesized lignin-derived resins and supported the prediction of their properties. Different model resins based on typical lignin degradation products were analyzed by electrospray ionization in negative ionization mode. Utilizing enhanced mass defect filter techniques provided detailed structural information of the lignin-based model resins and readily complemented the analytical data from differential scanning calorimetry and thermogravimetric analysis. Relative reactivity and chemical diversity of the phenol substitutes were significant determinants of the outcome of the PF resin synthesis and thus controlled the areas of application of the resulting polymers.

  11. Accuracy of trace element determinations in alternate fuels

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1980-01-01

    NASA-Lewis Research Center's work on accurate measurement of trace levels of metals in various fuels is presented. The differences between laboratories and between analytical techniques, especially for concentrations below 10 ppm, are discussed, detailing the Atomic Absorption Spectrometry (AAS) and DC Arc Emission Spectrometry (dc arc) techniques used by NASA-Lewis. Also presented is the design of an interlaboratory study that considers the following factors: laboratory, analytical technique, fuel type, concentration, and ashing additive.

  12. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  13. Product identification techniques used as training aids for analytical chemists

    NASA Technical Reports Server (NTRS)

    Grillo, J. P.

    1968-01-01

    Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.

  14. Ionic liquids in chromatographic and electrophoretic techniques: toward additional improvements in the separation of natural compounds

    PubMed Central

    Freire, Carmen S. R.; Coutinho, João A. P.; Silvestre, Armando J. D.; Freire, Mara G.

    2016-01-01

    Due to their unique properties, in recent years, ionic liquids (ILs) have been largely investigated in the field of analytical chemistry. Particularly during the last sixteen years, they have been successfully applied in the chromatographic and electrophoretic analysis of value-added compounds extracted from biomass. Considering the growing interest in the use of ILs in this field, this critical review provides a comprehensive overview on the improvements achieved using ILs as constituents of mobile or stationary phases in analytical techniques, namely in capillary electrophoresis and its different modes, in high performance liquid chromatography, and in gas chromatography, for the separation and analysis of natural compounds. The impact of the IL chemical structure and the influence of secondary parameters, such as the IL concentration, temperature, pH, voltage and analysis time (when applied), are also critically addressed regarding the achieved separation improvements. Major conclusions on the role of ILs in the separation mechanisms and the performance of these techniques in terms of efficiency, resolution and selectivity are provided. Based on a critical analysis of all published results, some target-oriented ILs are suggested. Finally, current drawbacks and future challenges in the field are highlighted. In particular, the design and use of more benign and effective ILs as well as the development of integrated (and thus more sustainable) extraction–separation processes using IL aqueous solutions are suggested within a green chemistry perspective. PMID:27667965

  15. Trace analysis of energetic materials via direct analyte-probed nanoextraction coupled to direct analysis in real time mass spectrometry.

    PubMed

    Clemons, Kristina; Dake, Jeffrey; Sisco, Edward; Verbeck, Guido F

    2013-09-10

    Direct analysis in real time mass spectrometry (DART-MS) has proven to be a useful forensic tool for the trace analysis of energetic materials. While other techniques for detecting trace amounts of explosives involve extraction, derivatization, solvent exchange, or sample clean-up, DART-MS requires none of these. Typical DART-MS analyses directly from a solid sample or from a swab have been quite successful; however, these methods may not always be an optimal sampling technique in a forensic setting. For example, if the sample were only located in an area which included a latent fingerprint of interest, direct DART-MS analysis or the use of a swab would almost certainly destroy the print. To avoid ruining such potentially invaluable evidence, another method has been developed which will leave the fingerprint virtually untouched. Direct analyte-probed nanoextraction coupled to nanospray ionization-mass spectrometry (DAPNe-NSI-MS) has demonstrated excellent sensitivity and repeatability in forensic analyses of trace amounts of illicit drugs from various types of surfaces. This technique employs a nanomanipulator in conjunction with bright-field microscopy to extract single particles from a surface of interest and has provided a limit of detection of 300 attograms for caffeine. Combining DAPNe with DART-MS provides another level of flexibility in forensic analysis, and has proven to be a sufficient detection method for trinitrotoluene (TNT), RDX, and 1-methylaminoanthraquinone (MAAQ). Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  16. New constraints on deformation processes in serpentinite from sub-micron Raman Spectroscopy and TEM

    NASA Astrophysics Data System (ADS)

    Smith, S. A. F.; Tarling, M.; Rooney, J. S.; Gordon, K. C.; Viti, C.

    2017-12-01

    Extensive work has been performed to characterize the mineralogical and mechanical properties of the various serpentine minerals (i.e. antigorite, lizardite, chrysotile, polyhedral and polygonal serpentine). However, correct identification of serpentine minerals is often difficult or impossible using conventional analytical techniques such as optical- and SEM-based microscopy, X-ray diffraction and infrared spectroscopy. Transmission Electron Microscopy (TEM) is the best analytical technique to identify the serpentine minerals, but TEM requires complex sample preparation and typically results in very small analysis areas. Sub-micron confocal Raman spectroscopy mapping of polished thin sections provides a quick and relatively inexpensive way of unambiguously distinguishing the main serpentine minerals within their in-situ microstructural context. The combination of high spatial resolution (with a diffraction-limited system, 366 nm), large-area coverage (up to hundreds of microns in each dimension) and ability to map directly on thin sections allows intricate fault rock textures to be imaged at a sample-scale, which can then form the target of more focused TEM work. The potential of sub-micron Raman Spectroscopy + TEM is illustrated by examining sub-micron-scale mineral intergrowths and deformation textures in scaly serpentinites (e.g. dissolution seams, mineral growth in pressure shadows), serpentinite crack-seal veins and polished fault slip surfaces from a serpentinite-bearing mélange in New Zealand. The microstructural information provided by these techniques has yielded new insights into coseismic dehydration and amorphization processes and the interplay between creep and localised rupture in serpentinite shear zones.

  17. Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakkila, E.A.

    1978-10-01

    Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.

  18. Independent Research and Independent Exploratory Development Annual Report Fiscal Year 1975

    DTIC Science & Technology

    1975-09-01

    [Abstract garbled in source. Recoverable fragments cite a Coding Study, an Optical Covert Communications Using Laser Transceivers study, and Wagner, N. K., "Analysis of Microelectronic Materials Using Auger Spectroscopy and Additional Advanced Analytical Techniques," NELC Technical Note 2904.]

  19. Accurate analytical modeling of junctionless DG-MOSFET by Green's function approach

    NASA Astrophysics Data System (ADS)

    Nandi, Ashutosh; Pandey, Nilesh

    2017-11-01

    An accurate analytical model of the junctionless double-gate MOSFET (JL-DG-MOSFET) in the subthreshold regime of operation is developed in this work using the Green's function approach. The approach considers 2-D mixed boundary conditions and multi-zone techniques to provide an exact analytical solution to the 2-D Poisson's equation. The Fourier coefficients are calculated correctly to derive the potential equations that are further used to model the channel current and subthreshold slope of the device. The threshold voltage roll-off is computed from parallel shifts of Ids-Vgs curves between the long-channel and short-channel devices. It is observed that the Green's function approach of solving the 2-D Poisson's equation in both the oxide and silicon regions can accurately predict channel potential, subthreshold current (Isub), threshold voltage (Vt) roll-off, and subthreshold slope (SS) of both long- and short-channel devices designed with different doping concentrations and both higher and lower tsi/tox ratios. All the analytical model results are verified through comparisons with TCAD Sentaurus simulation results. It is observed that the model matches quite well with TCAD device simulations.

  20. Electrospun polyvinyl alcohol ultra-thin layer chromatography of amino acids.

    PubMed

    Lu, Tian; Olesik, Susan V

    2013-01-01

    Electrospun polyvinyl alcohol (PVA) ultrathin-layer chromatography (UTLC) plates were fabricated using an in situ crosslinking electrospinning technique. The plates were evaluated using the separation of fluorescein isothiocyanate (FITC)-labeled amino acids and the separation of amino acids followed by visualization with ninhydrin. The in situ crosslinked electrospun PVA plates showed enhanced stability in water and were stable when used for the UTLC study. The selectivity for FITC-labeled amino acids on the PVA plate was compared with that on a commercial Si-Gel plate. Separation efficiency varied with analyte concentration, capillary applicator size, analyte volume, and mat thickness: a concentration of 7 mM or less, a 50 μm i.d. capillary applicator, a minimum volume of analyte solution, and a three-layered mat provided the best efficiency for FITC-labeled amino acids on the PVA UTLC plate. The efficiency on the PVA plate was greatly improved compared with that on a Si-Gel HPTLC plate. The hydrolysis products of aspartame in diet cola, aspartic acid and phenylalanine, were also successfully analyzed using the PVA UTLC plate. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  2. Improving the trust in results of numerical simulations and scientific data analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cappello, Franck; Constantinescu, Emil; Hovland, Paul

    This white paper investigates several key aspects of the trust that a user can give to the results of numerical simulations and scientific data analytics. In this document, the notion of trust is related to the integrity of numerical simulations and data analytics applications. This white paper complements the DOE ASCR report on Cybersecurity for Scientific Computing Integrity by (1) exploring the sources of trust loss; (2) reviewing the definitions of trust in several areas; (3) providing numerous cases of result alteration, some of them leading to catastrophic failures; (4) examining the current notion of trust in numerical simulation and scientific data analytics; (5) providing a gap analysis; and (6) suggesting two important research directions and their respective research topics. To simplify the presentation without loss of generality, we consider that trust in results can be lost (or the results' integrity impaired) because of any form of corruption happening during the execution of the numerical simulation or the data analytics application. In general, the sources of such corruption are threefold: errors, bugs, and attacks. Current applications are already using techniques to deal with different types of corruption. However, not all potential corruptions are covered by these techniques. We firmly believe that the current level of trust that a user has in the results is at least partially founded on ignorance of this issue or the hope that no undetected corruptions will occur during the execution. This white paper explores the notion of trust and suggests recommendations for developing a more scientifically grounded notion of trust in numerical simulation and scientific data analytics. We first formulate the problem and show that it goes beyond previous questions regarding the quality of results such as V&V, uncertainty quantification, and data assimilation. 
We then explore the complexity of this difficult problem, and we sketch complementary general approaches to address it. This paper does not focus on the trust that the execution will actually complete. The product of simulation or of data analytic executions is the final element of a potentially long chain of transformations, where each stage has the potential to introduce harmful corruptions. These corruptions may produce results that deviate from the user-expected accuracy without notifying the user of this deviation. There are many potential sources of corruption before and during the execution; consequently, in this white paper we do not focus on the protection of the end result after the execution.

  3. Surface enhanced Raman spectroscopy based nanoparticle assays for rapid, point-of-care diagnostics

    NASA Astrophysics Data System (ADS)

    Driscoll, Ashley J.

    Nucleotide and immunoassays are important tools for disease diagnostics. Many of the current laboratory-based analytical diagnostic techniques require multiple assay steps and long incubation times before results are acquired. In the development of bioassays designed for detecting the emergence and spread of diseases in point-of-care (POC) and remote settings, more rapid and portable analytical methods are necessary. Nanoparticles provide simple and reproducible synthetic methods for the preparation of substrates that can be applied in colloidal assays, providing gains in kinetics due to miniaturization and plasmonic substrates for surface enhanced spectroscopies. Specifically, surface enhanced Raman spectroscopy (SERS) is finding broad application as a signal transduction method in immunological and nucleotide assays due to the production of narrow spectral peaks from the scattering molecules and the potential for simultaneous multiple analyte detection. The application of SERS to a no-wash, magnetic capture assay for the detection of West Nile Virus Envelope and Rift Valley Fever Virus N antigens is described. The platform utilizes colloid based capture of the target antigen in solution, magnetic collection of the immunocomplexes and acquisition of SERS spectra by a handheld Raman spectrometer. The reagents for a core-shell nanoparticle, SERS based assay designed for the capture of target microRNA implicated in acute myocardial infarction are also characterized. Several new, small molecule Raman scatterers are introduced and used to analyze the enhancing properties of the synthesized gold coated-magnetic nanoparticles. Nucleotide and immunoassay platforms have shown improvements in speed and analyte capture through the miniaturization of the capture surface and particle-based capture systems can provide a route to further surface miniaturization. 
A reaction-diffusion model of the colloidal assay platform is presented to understand the interplay of system parameters such as particle diameter, initial analyte concentration, and dissociation constants. The projected sensitivities over a broad range of assay conditions are examined, and the governing regimes of the particle systems are reported. The results provide metrics for the design of more robust analytics that are of particular interest for POC diagnostics.
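
    A well-mixed simplification of such a binding model can be sketched in a few lines. This is an illustrative toy, not the dissertation's reaction-diffusion model: the rate constants, concentrations, and time step are all assumed values chosen only to show how a dissociation constant shapes the captured fraction.

```python
# Toy binding-kinetics model: analyte A binds particle capture sites B
# with rates k_on / k_off (dissociation constant K_d = k_off / k_on).
# Explicit Euler integration of dC/dt = k_on*A*B - k_off*C.
k_on = 1e5        # association rate [1/(M*s)] (assumed)
k_off = 1e-3      # dissociation rate [1/s] -> K_d = 10 nM (assumed)
A = 1e-9          # free analyte [M]
B = 1e-8          # free capture sites [M]
C = 0.0           # bound complex [M]
dt = 0.01         # time step [s]
for _ in range(200_000):          # 2000 s of simulated time
    rate = k_on * A * B - k_off * C
    A -= rate * dt
    B -= rate * dt
    C += rate * dt

captured_fraction = C / 1e-9      # fraction of initial analyte bound
print(captured_fraction)
```

    In the full reaction-diffusion treatment, transport to the particle surface adds a second timescale, which is where particle diameter enters the sensitivity analysis.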

  4. A general, cryogenically-based analytical technique for the determination of trace quantities of volatile organic compounds in the atmosphere

    NASA Technical Reports Server (NTRS)

    Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.

    1985-01-01

    An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.

  5. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instability.

    PubMed

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

    Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient’s pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (“SBM”) was tested in an In Silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology or “QCP”) which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient’s physiologic state. This platform allows for the use of predictive analytic techniques to identify early changes in a patient’s condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an In Silico experiment using QCP in which the output of computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.0083 Hz) with the tamponade introduced at a point 420 minutes into the simulation sequence. The functionality of the SBM predictive analytics methodology to identify clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. 
With the slow development of the tamponade, the SBM model is seen to diverge from the simulated biosignals in the early stages of physiologic deterioration, while the variables are still within normal ranges. Thus, the SBM system was found to identify pathophysiologic conditions in a timeframe that would not have been detected in a usual clinical monitoring scenario. Conclusion. In this study the functionality of a multivariate machine learning predictive methodology that incorporates commonly monitored clinical information was tested using a computer model of human physiology. SBM and predictive analytics were able to differentiate a state of decompensation while the monitored variables were still within normal clinical ranges. This finding suggests that SBM could provide for early identification of a clinical deterioration using predictive analytic techniques. Keywords: predictive analytics, hemodynamic, monitoring.
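
    The core idea of a similarity-based estimator can be sketched compactly. The snippet below is a minimal illustration in the spirit of SBM, not the commercial implementation: a current observation is reconstructed as a similarity-weighted blend of reference exemplars from normal operation, and a growing residual flags deterioration. All data and the Gaussian kernel bandwidth are assumed.

```python
import numpy as np

rng = np.random.default_rng(1)
reference = rng.normal(0.0, 1.0, (50, 3))   # exemplars of "normal" state

def sbm_estimate(x, exemplars, bandwidth=1.0):
    """Similarity-weighted reconstruction of observation x."""
    d = np.linalg.norm(exemplars - x, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)       # Gaussian similarity kernel
    w /= w.sum()
    return w @ exemplars                    # blend of nearby exemplars

normal_obs = rng.normal(0.0, 1.0, 3)
drifted_obs = normal_obs + np.array([4.0, 4.0, 4.0])   # simulated drift

res_normal = np.linalg.norm(normal_obs - sbm_estimate(normal_obs, reference))
res_drift = np.linalg.norm(drifted_obs - sbm_estimate(drifted_obs, reference))
print(res_normal, res_drift)
```

    Because the drifted observation has no nearby exemplars, its reconstruction residual is much larger, which is the signal a deterioration detector would threshold.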

  6. New approaches to wipe sampling methods for antineoplastic and other hazardous drugs in healthcare settings.

    PubMed

    Connor, Thomas H; Smith, Jerome P

    2016-09-01

    At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings and to discuss recent advances in this area. In addition it will provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastics. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug safe-handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine and less costly, and may provide a shorter response time than the classical analytical techniques now in use.

  7. The discrimination of geoforensic trace material from close proximity locations by organic profiling using HPLC and plant wax marker analysis by GC.

    PubMed

    McCulloch, G; Dawson, L A; Ross, J M; Morgan, R M

    2018-07-01

    There is a need to develop a wider empirical research base to expand the scope for utilising the organic fraction of soil in forensic geoscience, and to demonstrate the capability of the analytical techniques used in forensic geoscience to discriminate samples at close proximity locations. The determination of wax markers from soil samples by GC analysis has been used extensively in court and is known to be effective in discriminating samples from different land use types. A new HPLC method for the analysis of the organic fraction of forensic sediment samples has also been shown recently to add value in conjunction with existing inorganic techniques for the discrimination of samples derived from close proximity locations. This study compares the ability of these two organic techniques to discriminate samples derived from close proximity locations and finds the GC technique to provide good discrimination at this scale, providing quantification of known compounds, whilst the HPLC technique offered a shorter and simpler sample preparation method and provided very good discrimination between groups of samples of different provenance in most cases. The use of both data sets together gave further improved accuracy rates in some cases, suggesting that a combined organic approach can provide added benefits in certain case scenarios and crime reconstruction contexts. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  8. A Hertzian contact mechanics based formulation to improve ultrasound elastography assessment of uterine cervical tissue stiffness.

    PubMed

    Briggs, Brandi N; Stender, Michael E; Muljadi, Patrick M; Donnelly, Meghan A; Winn, Virginia D; Ferguson, Virginia L

    2015-06-25

    Clinical practice requires improved techniques to assess human cervical tissue properties, especially at the internal os, or orifice, of the uterine cervix. Ultrasound elastography (UE) holds promise for non-invasively monitoring cervical stiffness throughout pregnancy. However, this technique provides qualitative strain images that cannot be linked to a material property (e.g., Young's modulus) without knowledge of the contact pressure under a rounded transvaginal transducer probe and correction for the resulting non-uniform strain dissipation. One technique to standardize elastogram images incorporates a material of known properties and uses one-dimensional, uniaxial Hooke's law to calculate Young's modulus within the compressed material half-space. However, this method does not account for strain dissipation and the strains that evolve in three-dimensional space. We demonstrate that an analytical approach based on 3D Hertzian contact mechanics provides a reasonable first approximation to correct for UE strain dissipation underneath a round transvaginal transducer probe and thus improves UE-derived estimates of tissue modulus. We validate the proposed analytical solution and evaluate sources of error using a finite element model. As compared to 1D uniaxial Hooke's law, the Hertzian contact-based solution yields significantly improved Young's modulus predictions in three homogeneous gelatin tissue phantoms possessing different moduli. We also demonstrate the feasibility of using this technique to image human cervical tissue, where UE-derived moduli estimations for the uterine cervix anterior lip agreed well with published, experimentally obtained values. Overall, UE with an attached reference standard and a Hertzian contact-based correction holds promise for improving quantitative estimates of cervical tissue modulus. Copyright © 2015 Elsevier Ltd. All rights reserved.
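
    The Hertzian sphere-on-half-space relation underlying this correction has a simple closed form. The sketch below is illustrative only, with assumed probe and tissue values rather than the paper's: it shows the force-indentation law and its inversion to recover a Young's modulus, in the rigid-indenter limit.

```python
import numpy as np

# Hertz contact of a sphere on an elastic half-space:
#   F = (4/3) * E_eff * sqrt(R) * d**1.5
# with 1/E_eff = (1 - nu_probe**2)/E_probe + (1 - nu_tissue**2)/E_tissue.
# For a rigid probe (E_probe -> inf), E_eff = E_tissue / (1 - nu**2).
def hertz_force(E_tissue, nu_tissue, R, depth):
    """Contact force of a rigid sphere of radius R at indentation depth."""
    E_eff = E_tissue / (1.0 - nu_tissue ** 2)     # rigid-indenter limit
    return (4.0 / 3.0) * E_eff * np.sqrt(R) * depth ** 1.5

def modulus_from_force(F, nu_tissue, R, depth):
    """Invert the Hertz relation to recover the tissue Young's modulus."""
    E_eff = F / ((4.0 / 3.0) * np.sqrt(R) * depth ** 1.5)
    return E_eff * (1.0 - nu_tissue ** 2)

E_true = 25e3    # Pa, assumed soft-tissue modulus (illustrative)
F = hertz_force(E_true, 0.49, R=0.01, depth=0.002)
E_back = modulus_from_force(F, 0.49, R=0.01, depth=0.002)
print(E_back)    # round-trips to E_true
```

    In the paper's application, the known-modulus reference standard supplies the contact pressure under the rounded transvaginal probe, and the Hertzian field corrects for the non-uniform strain dissipation with depth.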

  9. Airborne chemistry: acoustic levitation in chemical analysis.

    PubMed

    Santesson, Sabina; Nilsson, Staffan

    2004-04-01

    This review, with 60 references, describes a unique path to miniaturisation: the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitation of small volumes of sample by means of a levitation technique can be used as a way to avoid solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 microL volume range and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.

  10. Electrical field-induced extraction and separation techniques: promising trends in analytical chemistry--a review.

    PubMed

    Yamini, Yadollah; Seidi, Shahram; Rezazadeh, Maryam

    2014-03-03

    Sample preparation is an important issue in analytical chemistry and is often the bottleneck in chemical analysis, so a major incentive for recent research has been to attain faster, simpler, less expensive, and more environmentally friendly sample preparation methods. The use of auxiliary energies, such as heat, ultrasound, and microwave, is one of the strategies that have been employed in sample preparation to reach these goals. Application of an electrical driving force is the current state of the art, which presents new possibilities for simplifying and shortening the sample preparation process as well as enhancing its selectivity. The electrical driving force has scarcely been utilized in comparison with other auxiliary energies. In this review, the different roles of the electrical driving force (as a powerful auxiliary energy) in various extraction techniques, including liquid-, solid-, and membrane-based methods, are considered. References relevant to developments in separation techniques and Lab-on-a-Chip (LOC) systems are also provided. All aspects of the electrical driving force in extraction and separation methods are too extensive to be treated in this contribution; rather, the main aim of this review is to provide brief knowledge of the different fields of analytical chemistry, with an emphasis on the latest efforts put into electrically assisted membrane-based sample preparation systems. The advantages and disadvantages of these approaches, as well as the new achievements in these areas, are discussed, which might be helpful for further progress in the future. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.

    PubMed

    Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
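    The transition-selection step lends itself to a compact illustration. The following is a toy sketch, not the authors' implementation: a simple genetic algorithm evolves a bitmask over six hypothetical transitions of one peptide, scoring each mask by how constant the comprehensive-to-targeted intensity ratio stays across samples. The data are synthetic, with simulated DIA interference deliberately placed on transitions 4 and 5:

```python
import random

random.seed(0)
n_samples, n_trans = 8, 6

# Synthetic per-sample intensities for six candidate transitions of one
# peptide in a targeted run; the comprehensive (DIA) run reproduces them
# with a small fixed bias on transitions 0-3 and heavy, sample-dependent
# interference on transitions 4-5.
targeted = [[(s + 1) * (t + 1) * 100.0 for t in range(n_trans)]
            for s in range(n_samples)]
dia = [[targeted[s][t] * (1.0 + (random.uniform(0.0, 2.0) if t >= 4
                                 else 0.02 * t))
        for t in range(n_trans)] for s in range(n_samples)]

def fitness(mask):
    """Score a transition subset by how constant the DIA/targeted ratio
    of summed intensities stays across samples (higher is better)."""
    if not any(mask):
        return -1e9
    ratios = []
    for s in range(n_samples):
        t_sum = sum(targeted[s][i] for i in range(n_trans) if mask[i])
        d_sum = sum(dia[s][i] for i in range(n_trans) if mask[i])
        ratios.append(d_sum / t_sum)
    mean = sum(ratios) / len(ratios)
    return -sum((r - mean) ** 2 for r in ratios)

def evolve(pop_size=30, gens=40, p_mut=0.1):
    """Plain genetic algorithm: keep the fitter half each generation,
    refill by one-point crossover plus bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_trans)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_trans)
            child = [g ^ (random.random() < p_mut) for g in a[:cut] + b[cut:]]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
# With this synthetic data the evolved mask should drop the interfered
# transitions (indices 4 and 5).
```

    The real assay configuration problem is richer than this sketch (many peptides, matched internal standards, linear-model agreement metrics), but the core loop of encoding transition subsets as genomes and selecting on a quantitative agreement score is the same.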

  12. Tiopronin Gold Nanoparticle Precursor Forms Aurophilic Ring Tetramer

    PubMed Central

    Simpson, Carrie A.; Farrow, Christopher L.; Tian, Peng; Billinge, Simon J.L.; Huffman, Brian J.; Harkness, Kellen M.; Cliffel, David E.

    2010-01-01

    In the two-step synthesis of thiolate monolayer-protected clusters (MPCs), the first step of the reaction is a mild reduction of gold(III) by thiols that generates gold(I) thiolate complexes as intermediates. Using tiopronin (Tio) as the thiol reductant, the characterization of the intermediate Au4Tio4 complex was accomplished with various analytical and structural techniques. Nuclear magnetic resonance (NMR), elemental analysis, thermogravimetric analysis (TGA), and matrix-assisted laser desorption/ionization-mass spectrometry (MALDI-MS) were all consistent with a cyclic gold(I)-thiol tetramer structure, and final structural analysis was obtained through the use of powder diffraction and pair distribution functions (PDF). Obtaining crystallographic data has proved challenging for almost all previously reported gold(I)-thiolate complexes. Herein, a novel characterization technique, combined with standard analytical assessment, proved invaluable for elucidating the structure of these complexes without crystallographic data; in conjunction with other analytical techniques, in particular mass spectrometry, it can elucidate a structure when crystallographic data are unavailable. In addition, luminescent properties provided evidence of aurophilicity within the molecule. The concept of aurophilicity has been introduced to describe a select group of gold-thiolate structures that possess unique characteristics, mainly red photoluminescence and a distinct Au-Au intramolecular distance indicating a weak metal-metal bond, as also evidenced by the structural model of the tetramer. Significant features of both the tetrameric and aurophilic properties of the intermediate gold(I) tiopronin complex are retained after borohydride reduction to form the MPC, including gold(I) tiopronin partial rings as capping motifs, or “staples”, and weak red photoluminescence that extends into the near-infrared region. PMID:21067183

  13. Graph Theoretic Foundations of Multibody Dynamics Part I: Structural Properties

    PubMed Central

    Jain, Abhinandan

    2011-01-01

    This is the first of two papers that use concepts from graph theory to obtain a deeper understanding of the mathematical foundations of multibody dynamics. The key contribution is the development of a unifying framework showing that key analytical results and computational algorithms in multibody dynamics are a direct consequence of structural properties and require minimal assumptions about the specific nature of the underlying multibody system. This first part focuses on identifying the abstract graph-theoretic structural properties of spatial operator techniques in multibody dynamics; the second paper exploits these structural properties to develop a broad spectrum of analytical results and computational algorithms. Towards this, we begin with the notion of graph adjacency matrices and generalize it to define block-weighted adjacency (BWA) matrices and their 1-resolvents. Previously developed spatial operators are shown to be special cases of such BWA matrices and their 1-resolvents, and these properties are shown to hold broadly for serial- and tree-topology multibody systems. Specializations of the BWA and 1-resolvent matrices are referred to as spatial kernel operators (SKO) and spatial propagation operators (SPO). These operators and their special properties provide the foundation for the analytical and algorithmic techniques developed in the companion paper. We also use graph theory concepts to study the topology-induced sparsity structure of these operators and the system mass matrix, and similarity transformations of these operators are studied as well. While the detailed development is done for the case of rigid-link multibody systems, the extension of these techniques to a broader class of systems (e.g., deformable links) is illustrated. PMID:22102790
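    The Neumann-series structure of the 1-resolvent can be illustrated concretely. A minimal sketch follows, using scalar weights in place of the paper's 6x6 spatial transforms, for a serial 3-link chain whose adjacency matrix is strictly lower triangular and hence nilpotent, making the series finite:

```python
# Serial 3-link chain: A[i][j] is the (scalar) weight propagating a
# quantity from body j to its child body i. Scalars stand in for the
# 6x6 spatial transforms of the paper; the values are illustrative.
N = 3
A = [[0.0] * N for _ in range(N)]
A[1][0] = 2.0  # body 0 -> body 1
A[2][1] = 3.0  # body 1 -> body 2

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(N)) for j in range(N)]
            for i in range(N)]

def one_resolvent(A):
    """(I - A)^{-1} via the Neumann series I + A + A^2 + ...; for a
    serial/tree topology A is nilpotent, so the series terminates."""
    I = [[float(i == j) for j in range(N)] for i in range(N)]
    R, P = [row[:] for row in I], I
    for _ in range(N - 1):
        P = matmul(P, A)
        R = [[R[i][j] + P[i][j] for j in range(N)] for i in range(N)]
    return R

R = one_resolvent(A)
# R[2][0] accumulates propagation along the path 0 -> 1 -> 2:
# 2.0 * 3.0 = 6.0, while the diagonal stays 1.
```

    This captures why 1-resolvents act as propagation operators: each entry sums the weighted paths from one body to another through the chain, which is the scalar shadow of the SPO construction in the paper.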

  14. Qualitative evaluation of maternal milk and commercial infant formulas via LIBS.

    PubMed

    Abdel-Salam, Z; Al Sharnoubi, J; Harith, M A

    2013-10-15

    This study focuses on the use of laser-induced breakdown spectroscopy (LIBS) for the evaluation of the nutrients in maternal milk and some commercially available infant formulas. The results of such evaluation are vital for adequate and healthy feeding of babies during the lactation period. Laser-induced breakdown spectroscopy offers special advantages in comparison with other conventional analytical techniques: LIBS is a straightforward technique that can be used in situ to provide qualitative analytical information within a few minutes, without sample preparation. The samples studied in the current work were maternal milk samples collected during the first 3 months of lactation (not colostrum milk) and samples from six different types of commercially available infant formulas. The samples' elemental composition has been compared with respect to the relative abundance of the elements of nutritional importance, namely Mg, Ca, Na, and Fe, using their spectral emission lines in the relevant LIBS spectra. In addition, CN and C2 molecular emission bands in the same spectra have been studied as indicators of protein content in the samples. The obtained analytical results demonstrate higher elemental contents in the maternal milk than in the commercial formula samples; similar results have been obtained for the protein content. It has also been shown that calcium and proteins follow similar relative concentration trends in the studied samples. This work demonstrates the feasibility of adopting LIBS as a fast, safe, and less costly technique for qualitatively evaluating the nutrient content of both maternal and commercial milk samples. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Jaffe, Jacob D.; Feeney, Caitlin M.; Patel, Jinal; Lu, Xiaodong; Mani, D. R.

    2016-11-01

    Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high performance mass spectrometry instrumentation. However, these assays are typically limited to 100s of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify 1000s of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, data analysis challenges arise from this strategy concerning agreement of results from the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges in order to configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm we used provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.

  16. Foodomics: MS-based strategies in modern food science and nutrition.

    PubMed

    Herrero, Miguel; Simó, Carolina; García-Cañas, Virginia; Ibáñez, Elena; Cifuentes, Alejandro

    2012-01-01

    Modern research in food science and nutrition is moving from classical methodologies to advanced analytical strategies in which MS-based techniques play a crucial role. In this context, Foodomics has recently been defined as a new discipline that studies the food and nutrition domains through the application of advanced omics technologies in which MS techniques are considered indispensable. Applications of Foodomics include the genomic, transcriptomic, proteomic, and/or metabolomic study of foods for compound profiling, authenticity, and/or biomarker detection related to food quality or safety; the development of new transgenic foods; food contaminant and whole-toxicity studies; and new investigations on food bioactivity, food effects on human health, etc. This review does not intend to provide an exhaustive survey of the many works published so far on food analysis using MS techniques. Rather, its aim is to provide an overview of the different MS-based strategies that have been (or can be) applied in the new field of Foodomics, discussing their advantages and drawbacks. In addition, some ideas about the foreseeable development and applications of MS techniques in this new discipline are provided. Copyright © 2011 Wiley Periodicals, Inc.

  17. SVPWM Technique with Varying DC-Link Voltage for Common Mode Voltage Reduction in a Matrix Converter and Analytical Estimation of its Output Voltage Distortion

    NASA Astrophysics Data System (ADS)

    Padhee, Varsha

    Common Mode Voltage (CMV) in any power converter has been a major contributor to premature motor failures, bearing deterioration, shaft voltage build-up, and electromagnetic interference. Intelligent control methods like Space Vector Pulse Width Modulation (SVPWM) techniques provide immense potential and flexibility to reduce CMV, thereby targeting all the aforementioned problems. Other solutions, like passive filters, shielded cables, and EMI filters, add to the volume and cost of the entire system; smart SVPWM techniques therefore come with the important advantage of being an economical solution. This thesis discusses a modified space vector technique applied to an Indirect Matrix Converter (IMC) which results in the reduction of common mode voltages and other advanced features. The conventional indirect SVPWM method of controlling matrix converters involves the use of two adjacent active vectors and one zero vector for both the rectifying and inverting stages of the converter. By suitable selection of space vectors, the rectifying stage of the matrix converter can generate different levels of virtual DC-link voltage. This capability can be exploited to operate the converter in different ranges of modulation indices for varying machine speeds, which results in lower common mode voltage and improves the harmonic spectrum of the output voltage without increasing the number of switching transitions relative to conventional modulation. In summary, the responsibility for forming output voltages of a particular magnitude and frequency is transferred solely to the rectifying stage of the IMC. Estimation of the degree of distortion in the three-phase output voltage is another facet discussed in this thesis.
A detailed understanding of the SVPWM technique and the switching sequence of the space vectors makes it possible to estimate the RMS value of the switched output voltage of any converter, which in turn aids the sizing and design of output passive filters. An analytical estimation method is presented to achieve this purpose for an IMC. Knowledge of the fundamental component of the output voltage can be used to calculate its Total Harmonic Distortion (THD). The effectiveness of the proposed SVPWM algorithms and the analytical estimation technique is substantiated by simulations in MATLAB/Simulink and experiments on a laboratory prototype of the IMC. Comparison plots contrast the performance of the proposed methods with the conventional SVPWM method. The behavior of output voltage distortion and CMV with variation in operating parameters like modulation index and output frequency is also analyzed.
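    The THD computation mentioned above can be sketched numerically: given one period of a sampled waveform, extract the fundamental by correlation with one sine/cosine cycle and compare its RMS with the total RMS. This is a generic textbook illustration, not the thesis's analytical estimation method; the square wave is included only because its THD has a known closed form to check against:

```python
import math

def thd(samples):
    """Total harmonic distortion of one period of a sampled waveform:
    THD = sqrt(Vrms^2 - V1rms^2) / V1rms, where the fundamental RMS
    V1rms is extracted by correlating with one sine/cosine cycle."""
    n = len(samples)
    rms_sq = sum(v * v for v in samples) / n
    a = 2.0 / n * sum(v * math.cos(2.0 * math.pi * k / n)
                      for k, v in enumerate(samples))
    b = 2.0 / n * sum(v * math.sin(2.0 * math.pi * k / n)
                      for k, v in enumerate(samples))
    v1_rms_sq = (a * a + b * b) / 2.0
    return math.sqrt(max(rms_sq - v1_rms_sq, 0.0) / v1_rms_sq)

# Sanity check against a known result: an ideal square wave has
# THD = sqrt(pi**2 / 8 - 1), roughly 0.483.
n = 4096
square = [1.0 if k < n // 2 else -1.0 for k in range(n)]
```

    Applied to a simulated or measured switched output voltage over an integer number of fundamental cycles, the same decomposition separates the fundamental from the switching harmonics that the output filter must absorb.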

  18. Extraction, chemical characterization and biological activity determination of broccoli health promoting compounds.

    PubMed

    Ares, Ana M; Nozal, María J; Bernal, José

    2013-10-25

    Broccoli (Brassica oleracea L. var. Italica) contains substantial amounts of health-promoting compounds such as vitamins, glucosinolates, phenolic compounds, and dietary essential minerals; thus, it benefits health beyond providing just basic nutrition, and consumption of broccoli has been increasing over the years. This review gives an overview of the extraction and separation techniques, as well as the biological activity, of some of the above-mentioned compounds as published between January 2008 and January 2013. The work is organized according to the different families of health-promoting compounds, discussing the extraction procedures and the analytical techniques employed for their characterization. Finally, information about the different biological activities of these compounds is also provided. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Fault tolerance analysis and applications to microwave modules and MMIC's

    NASA Astrophysics Data System (ADS)

    Boggan, Garry H.

    A project whose objective was to provide an overview of built-in-test (BIT) considerations applicable to microwave systems, modules, and MMICs (monolithic microwave integrated circuits) is discussed. Available analytical techniques and software for assessing system failure characteristics were researched, and the resulting investigation provides a review of two techniques which have applicability to microwave systems design. A system-level approach to fault tolerance and redundancy management is presented in its relationship to the subsystem/element design. An overview of the microwave BIT focus from the Air Force Integrated Diagnostics program is presented. The technical reports prepared by the GIMADS team were reviewed for applicability to microwave modules and components. A review of MIMIC (millimeter and microwave integrated circuit) program activities relative to BIT/BITE is given.

  20. Procedures for analysis of debris relative to Space Shuttle systems

    NASA Technical Reports Server (NTRS)

    Kim, Hae Soo; Cummings, Virginia J.

    1993-01-01

    Debris samples collected from various Space Shuttle systems have been submitted to the Microchemical Analysis Branch. This investigation was initiated to develop optimal techniques for the analysis of debris. Optical microscopy provides information about the morphology and size of crystallites, particle sizes, amorphous phases, glass phases, and poorly crystallized materials. Scanning electron microscopy with energy dispersive spectrometry is utilized for information on surface morphology and qualitative elemental content of debris. Analytical electron microscopy with wavelength dispersive spectrometry provides information on the quantitative elemental content of debris.
