Sample records for analytical techniques required

  1. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals that are very different from those in business and that require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  2. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals that are very different from those in business and that require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  3. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  4. Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"

    NASA Astrophysics Data System (ADS)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    The limited availability of analytical instruments for the methodical detection of known and unknown effluents poses a serious hindrance to qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive but also time consuming, require maintenance, and suffer from costly replacement of damaged essential parts, all of which are of serious concern. Moreover, for field studies and instant detection, installation of these instruments is not convenient at each and every place. Therefore, a pre-concentration technique for metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key immobilization technique; it is simple, user friendly, effective, inexpensive, and time efficient, and its portability (a 10 g - 20 g vial) to the experimental field/site has been demonstrated.

  5. Surface-Enhanced Raman Spectroscopy.

    ERIC Educational Resources Information Center

    Garrell, Robin L.

    1989-01-01

    Reviews the basis for the technique and its experimental requirements. Describes a few examples of the analytical problems to which surface-enhanced Raman spectroscopy (SERS) has been and can be applied. Provides a perspective on the current limitations and frontiers in developing SERS as an analytical technique. (MVL)

  6. Analytic and subjective assessments of operator workload imposed by communications tasks in transport aircraft

    NASA Technical Reports Server (NTRS)

    Eckel, J. S.; Crabtree, M. S.

    1984-01-01

    Analytical and subjective techniques that are sensitive to the information transmission and processing requirements of individual communications-related tasks are used to assess workload imposed on the aircrew by A-10 communications requirements for civilian transport category aircraft. Communications-related tasks are defined to consist of the verbal exchanges between crews and controllers. Three workload estimating techniques are proposed. The first, an information theoretic analysis, is used to calculate bit values for perceptual, manual, and verbal demands in each communication task. The second, a paired-comparisons technique, obtains subjective estimates of the information processing and memory requirements for specific messages. By combining the results of the first two techniques, a hybrid analytical scale is created. The third, a subjective rank ordering of sequences of communications tasks, provides an overall scaling of communications workload. Recommendations for future research include an examination of communications-induced workload among the air crew and the development of simulation scenarios.
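
    The information theoretic analysis described in this record assigns bit values to perceptual, manual, and verbal task demands. As a rough illustration (not the study's actual task data), the information conveyed by one selection among N equally likely alternatives is log2(N); a minimal sketch in Python with hypothetical task parameters:

```python
import math

def bits_per_selection(n_alternatives):
    """Information (in bits) conveyed by one choice among equally likely alternatives."""
    return math.log2(n_alternatives)

# Hypothetical communications task: tune 1 of 720 frequency settings (manual demand),
# then read back a 4-word phrase drawn from a 100-word vocabulary (verbal demand).
manual_bits = bits_per_selection(720)
verbal_bits = 4 * bits_per_selection(100)
print(f"manual: {manual_bits:.1f} bits, verbal: {verbal_bits:.1f} bits")
```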

  7. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  8. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954
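
    The two records above describe inferencing over an ontology to shortlist candidate modeling techniques. The sketch below imitates that idea with rdflib and a SPARQL query; the class and property names (ao:ModelingTechnique, ao:handlesResponseType) are hypothetical stand-ins, not terms from the paper's Analytics Ontology:

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical namespace and terms; the actual Analytics Ontology vocabulary is not reproduced here.
AO = Namespace("http://example.org/analytics#")
g = Graph()
g.add((AO.Regression, RDF.type, AO.ModelingTechnique))
g.add((AO.Regression, AO.handlesResponseType, Literal("continuous")))
g.add((AO.Classification, RDF.type, AO.ModelingTechnique))
g.add((AO.Classification, AO.handlesResponseType, Literal("categorical")))

# Semi-automated selection: find techniques whose declared capability matches the dataset.
rows = g.query("""
    PREFIX ao: <http://example.org/analytics#>
    SELECT ?tech WHERE {
        ?tech a ao:ModelingTechnique ;
              ao:handlesResponseType "continuous" .
    }""")
for row in rows:
    print(row.tech)  # -> http://example.org/analytics#Regression
```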

  9. 40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  10. 40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  11. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance with European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead in making fully privacy-preserving data analytics in the medical sector commonplace.
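
    A common building block behind such secure multi-party computation is additive secret sharing: each party splits its private value into random shares that individually reveal nothing, yet the shares jointly reconstruct an aggregate. A minimal sketch (illustrative only; the protocols used in real pilots involve considerably more machinery):

```python
import secrets

PRIME = 2**61 - 1  # modulus for additive sharing

def share(value, n_parties):
    """Split value into n additive shares that sum to value mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

# Three hospitals each share a private patient count (hypothetical values).
counts = [120, 75, 203]
all_shares = [share(c, 3) for c in counts]
# Each party locally sums the one share it received from every hospital ...
partial_sums = [sum(s[i] for s in all_shares) % PRIME for i in range(3)]
# ... and only the combined result reveals the aggregate, never any input.
total = sum(partial_sums) % PRIME
print(total)  # 398
```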

  12. Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi

    2016-10-01

    The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay), whose detection limit is about 250 pg/ml. However, quantifying analytes related to the various stages of tumors, including early detection, requires detecting levels well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml, and the prostate specific antigen level in the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/mL of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumor, it is necessary to quantify analyte levels ranging from ng/ml to pg/ml. To accommodate these critical needs in current diagnosis, a technique is needed that has a large dynamic range with an ability to detect extremely low levels of target analytes (

  13. Green aspects, developments and perspectives of liquid phase microextraction techniques.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2014-02-01

    Determination of analytes at trace levels in complex samples (e.g. biological samples or contaminated water or soils) is often required for environmental assessment and monitoring as well as for scientific research in the field of environmental pollution. Only a limited number of analytical techniques are sensitive enough for the direct determination of trace components in samples and, because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention was given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also presented are new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the stage of the procedure prior to chromatographic determination. © 2013 Published by Elsevier B.V.

  14. Investigation of the feasibility of an analytical method of accounting for the effects of atmospheric drag on satellite motion

    NASA Technical Reports Server (NTRS)

    Bozeman, Robert E.

    1987-01-01

    An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.
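
    At the core of the rotating spherical exponential density model mentioned above, before the oblate-atmosphere and diurnal corrections are superposed, is the relation rho = rho0 * exp(-(h - h0)/H). A minimal sketch with illustrative reference values (assumptions of this sketch, not figures from the report):

```python
import math

def density(h_km, rho0=3.614e-13, h0_km=700.0, scale_height_km=88.667):
    """Spherical exponential atmosphere: rho = rho0 * exp(-(h - h0)/H).
    Reference density rho0 (kg/m^3) at base altitude h0 with scale height H;
    the default values are illustrative, not those used in the report."""
    return rho0 * math.exp(-(h_km - h0_km) / scale_height_km)

print(density(400.0))  # approximate density (kg/m^3) at 400 km altitude
```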

  15. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  16. Chemical Detection and Identification Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.

    2002-01-01

    Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capability with minimal requirements for volume, weight, and consumables. Advances in technology may be achieved by increasing the amount of information acquired by a given technique with greater analytical capabilities and by miniaturizing proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of GC detectors that provide sample identification independent of GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).

  17. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments, and low-cost operation through extremely low or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS, namely its formats (on- and off-line), sorbents, and experimental protocols; the factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  18. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  19. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
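
    Of the analytic scopes listed in this record (incremental, rolling-window, global), the rolling-window case is the one that forces explicit state eviction in a streaming setting. A minimal sketch of a time-based rolling mean, illustrative rather than drawn from the reference architecture itself:

```python
from collections import deque
import time

class RollingWindowMean:
    """Time-based rolling-window mean with explicit eviction of expired items."""

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.items = deque()   # (timestamp, value) pairs in arrival order
        self.total = 0.0

    def add(self, value, ts=None):
        ts = time.time() if ts is None else ts
        self.items.append((ts, value))
        self.total += value
        self._evict(ts)

    def _evict(self, now):
        # Drop items that have aged out of the window.
        while self.items and now - self.items[0][0] > self.window:
            _, old = self.items.popleft()
            self.total -= old

    def mean(self):
        return self.total / len(self.items) if self.items else 0.0

# Hypothetical usage: per-minute average sentiment score of a tweet stream.
w = RollingWindowMean(60.0)
for t, score in [(0.0, 0.2), (30.0, 0.6), (90.0, -0.1)]:
    w.add(score, ts=t)
print(w.mean())  # only items within the last 60 s of t=90 remain -> 0.25
```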

  20. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and therefore can impact the climate. There are several works using different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques like atomic spectrometry based and X-ray based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.

    PubMed

    Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz

    2009-07-15

    The existing solid-phase microextraction (SPME) kinetic calibration technique, using the desorption of the preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard should be similar to those of the analyte, which limited the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in laboratory and rapid water sampling in the field. To mimic the variety of the environment, such as temperature, turbulence, and the concentration of the analytes, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbons (PAHs) solution was modified. The experimental results of the passive samplings in the flow-through system illustrated that the effect of the environmental variables was successfully compensated with the kinetic calibration technique, and all extracted analytes can be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and also will be useful for other microextraction techniques.
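
    Kinetic calibration rests on the symmetry between sorption of the analyte and desorption of the preloaded calibrant, commonly written Q/q0 + n/ne = 1. A minimal sketch of that bookkeeping with hypothetical numbers; the paper's one-calibrant extension, which maps a single standard's desorption onto all extracted analytes, is not reproduced here:

```python
def equilibrium_amount(n_extracted, q_remaining, q0):
    """Kinetic-calibration isotropy relation: Q/q0 + n/ne = 1, so ne = n / (1 - Q/q0).
    n_extracted: analyte mass on the fiber at time t
    q_remaining: preloaded calibrant mass left on the fiber at time t
    q0:          initially preloaded calibrant mass"""
    fraction_desorbed = 1.0 - q_remaining / q0
    return n_extracted / fraction_desorbed

def sample_concentration(ne, k_fs, v_f):
    """Equilibrium amount to water concentration: C = ne / (K_fs * V_f).
    With ne in ng and v_f (fiber coating volume) in liters, C is in ng/L."""
    return ne / (k_fs * v_f)

# Hypothetical numbers: 2.0 ng of analyte extracted while 40% of the calibrant desorbed.
ne = equilibrium_amount(2.0, q_remaining=0.6, q0=1.0)          # -> 5.0 ng
print(sample_concentration(ne, k_fs=5000.0, v_f=6.6e-7))       # ~1515 ng/L
```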

  2. Cost and schedule analytical techniques development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.

  3. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics on raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here for how well analyst opportunities match recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  4. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid phase microextraction finds increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions relating to different variants of solid phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Analytical techniques for the study of some parameters of multispectral scanner systems for remote sensing

    NASA Technical Reports Server (NTRS)

    Wiswell, E. R.; Cooper, G. R. (Principal Investigator)

    1978-01-01

    The author has identified the following significant results. The concept of average mutual information in the received spectral random process about the spectral scene was developed. Techniques amenable to implementation on a digital computer were also developed to make the required average mutual information calculations. These techniques required identification of models for the spectral response process of scenes. Stochastic modeling techniques were adapted for use. These techniques were demonstrated on empirical data from wheat and vegetation scenes.
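
    For discrete distributions, the average mutual information referred to above is I(X;Y) = sum over x,y of p(x,y) log2[p(x,y)/(p(x)p(y))]. A minimal sketch with a hypothetical joint distribution; the study's actual calculations used stochastic models of the spectral response process:

```python
import numpy as np

def mutual_information(joint):
    """Average mutual information I(X;Y) in bits from a joint probability table."""
    px = joint.sum(axis=1, keepdims=True)   # marginal of X (rows)
    py = joint.sum(axis=0, keepdims=True)   # marginal of Y (columns)
    nz = joint > 0                          # skip zero cells (0*log0 = 0)
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Hypothetical 2x2 joint distribution: scene class vs. received spectral symbol.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])
print(mutual_information(joint))  # ~0.278 bits
```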

  6. Frequency band justifications for passive sensors 10.0 to 385 GHz, chapter 2. [for monitoring earth resources and the environment

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Sensitivity requirements of the various measurements obtained by microwave sensors, and radiometry techniques are described. Analytical techniques applied to detailed sharing analyses are discussed. A bibliography of publications pertinent to the scientific justification of frequency requirements for passive microwave remote sensing is included.

  7. Big Data Analytics with Datalog Queries on Spark.

    PubMed

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2016-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.

  8. Big Data Analytics with Datalog Queries on Spark

    PubMed Central

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2017-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics. PMID:28626296
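
    The recursion these two records single out is exemplified by the classic transitive-closure Datalog program: tc(X,Y) :- edge(X,Y). tc(X,Y) :- tc(X,Z), edge(Z,Y). The sketch below shows semi-naive fixpoint evaluation of it in plain Python; BigDatalog's actual contribution is compiling and distributing such evaluation on Spark, which this single-machine sketch does not attempt:

```python
def transitive_closure(edges):
    """Semi-naive evaluation: each round joins only the newly derived facts
    (the delta) against the base relation, until no new facts appear."""
    tc = set(edges)
    delta = set(edges)
    while delta:
        new = {(x, w) for (x, y) in delta for (z, w) in edges if y == z} - tc
        tc |= new
        delta = new
    return tc

print(sorted(transitive_closure({(1, 2), (2, 3), (3, 4)})))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```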

  9. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  10. Direct Analysis of Samples of Various Origin and Composition Using Specific Types of Mass Spectrometry.

    PubMed

    Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek

    2017-07-04

    One of the major sources of error in chemical analysis using the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage; in this review, we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis without sample preparation, or with only limited pre-concentration steps. The MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques are presented, as are the trends in the development of direct analysis using them.

  11. Application of surface plasmon resonance for the detection of carbohydrates, glycoconjugates, and measurement of the carbohydrate-specific interactions: a comparison with conventional analytical techniques. A critical review.

    PubMed

    Safina, Gulnara

    2012-01-27

    Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes, which makes these compounds important targets to be detected, monitored and identified. The identification of the carbohydrate content of their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and the use of various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during the last two decades and has received increasing attention for different applications, from the real-time monitoring of affinity bindings to biosensors. SPR does not require any labels and is capable of directly measuring biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern instrumental analytical techniques with SPR in terms of their analytical capabilities to detect carbohydrates and their conjugates with proteins and lipids, and to study carbohydrate-specific bindings. A few selected examples of the SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Determination of Residual Chlorine and Turbidity in Drinking Water. Instructor's Manual.

    ERIC Educational Resources Information Center

    Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.

    This instructor's guide presents analytical methods for residual chlorine and turbidity. Topics include sample handling, permissible concentration levels, substitution of residual chlorine for bacteriological work, public notification, and the required analytical techniques to determine residual chlorine and turbidity. This publication is intended…

  13. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques were used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
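
    The three performance measures named in this abstract are straightforward to compute once the estimated and baseline daily records are aligned; a minimal sketch with hypothetical discharges:

```python
import numpy as np

def performance_measures(estimated, baseline):
    """The abstract's three comparison measures: average discharge over the
    ice-affected period, plus the mean and standard deviation of daily errors."""
    errors = estimated - baseline
    return estimated.mean(), errors.mean(), errors.std(ddof=1)

# Hypothetical daily discharges (cubic feet per second) for one ice-affected period.
baseline = np.array([110.0, 95.0, 88.0, 92.0, 101.0])
estimated = np.array([105.0, 99.0, 85.0, 96.0, 98.0])
print(performance_measures(estimated, baseline))
```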

  14. Arsenic, Antimony, Chromium, and Thallium Speciation in Water and Sediment Samples with the LC-ICP-MS Technique

    PubMed Central

    Jabłońska-Czapla, Magdalena

    2015-01-01

    Chemical speciation is a very important subject in the environmental protection, toxicology, and chemical analytics due to the fact that toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question why the speciation analytics is so important. The paper also provides numerous examples of the hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962

  15. Automated Solid Phase Extraction (SPE) LC/NMR Applied to the Structural Analysis of Extractable Compounds from a Pharmaceutical Packaging Material of Construction.

    PubMed

    Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C

    2013-01-01

    The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, while mass spectrometry-based analytical techniques have limitations for this application, newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.

  16. Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration

    NASA Technical Reports Server (NTRS)

    Merritt, D. A.; Brand, W. A.; Hayes, J. M.

    1994-01-01

    In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques, and the results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source; or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (sigma = 0.00006 at.% 13C, or 0.06%, for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 ≤ sigma ≤ 0.0002 at.%, or 0.1 ≤ sigma ≤ 0.2%).
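
    Results quoted in at.% 13C connect to the more familiar delta notation through the isotope ratio R of the standard. A minimal sketch, assuming an approximate VPDB ratio (the exact constant, and the paper's own calibration arithmetic, are not reproduced here):

```python
def atom_percent_13C(delta_13C, r_std=0.011180):
    """Convert delta-13C (per mil vs. VPDB) to atom percent 13C.
    r_std is the 13C/12C ratio of the standard; the value used here is an
    approximation assumed for this sketch (literature values differ slightly).
        R = r_std * (1 + delta/1000);   at.% = 100 * R / (1 + R)"""
    r = r_std * (1.0 + delta_13C / 1000.0)
    return 100.0 * r / (1.0 + r)

print(atom_percent_13C(-25.0))  # a typical n-alkane value, ~1.08 at.%
```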

  17. Determination of Residual Chlorine and Turbidity in Drinking Water. Student Manual.

    ERIC Educational Resources Information Center

    Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.

    This student's manual covers analytical methods for residual chlorine and turbidity. Topics include sample handling, permissible concentration levels, substitution of residual chlorine for bacteriological work, public notification, and the required analytical techniques to determine residual chlorine and turbidity. The publication is intended for…

  18. Fluorescence polarization immunoassays for rapid, accurate, and sensitive determination of mycotoxins

    USDA-ARS?s Scientific Manuscript database

    Analytical methods for the determination of mycotoxins in foods are commonly based on chromatographic techniques (GC, HPLC or LC-MS). Although these methods permit a sensitive and accurate determination of the analyte, they require skilled personnel and are time-consuming, expensive, and unsuitable ...

  19. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
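
    The coupled non-linear differential equations of the bivalent analyte model can be integrated numerically to produce model sensorgrams. The sketch below uses one common parameterization (singly and doubly bound complexes R1, R2, with entirely hypothetical rate constants); conventions vary between instruments and papers, so treat this as illustrative rather than as the paper's formulation:

```python
import numpy as np
from scipy.integrate import solve_ivp

def bivalent_analyte(t, r, ka1, kd1, ka2, kd2, conc, rmax):
    """R1: singly bound analyte response; R2: doubly bound; free: unoccupied ligand."""
    r1, r2 = r
    free = rmax - r1 - r2
    dr1 = ka1 * conc * free - kd1 * r1 - ka2 * r1 * free + kd2 * r2
    dr2 = ka2 * r1 * free - kd2 * r2
    return [dr1, dr2]

# Hypothetical constants: 100 nM analyte injected over a 150 RU ligand surface.
sol = solve_ivp(bivalent_analyte, (0.0, 300.0), [0.0, 0.0],
                args=(1e5, 1e-2, 1e-4, 1e-3, 100e-9, 150.0), dense_output=True)
t = np.linspace(0.0, 300.0, 301)
r1, r2 = sol.sol(t)
print((r1 + r2)[-1])  # total bound response at the end of the association phase
```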

  20. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    ERIC Educational Resources Information Center

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  1. Social Data Analytics Using Tensors and Sparse Techniques

    ERIC Educational Resources Information Center

    Zhang, Miao

    2014-01-01

    The development of internet and mobile technologies is driving an earthshaking social media revolution. They bring the internet world a huge amount of social media content, such as images, videos, comments, etc. This massive media content and the complicated social structures require analytic expertise to transform the flood of information into…

  2. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives frequently found in bio-analytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques for their measurement. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The prospective study revealed that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques, owing to protein binding and drastic interferences, from the complex matrices of real samples such as blood plasma and serum.

  3. 40 CFR Table 5 to Subpart Uuuuu of... - Performance Testing Requirements

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... to ASTM D6348-03, Sections A1 through A8 are mandatory; (2) For ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent (%) R must be determined for each target analyte (see Equation A5.5); (3) For the ASTM D6348-03 test data to be acceptable for a target analyte, %R must be 70% ≤ R ≤ 130%; and...

  4. 40 CFR Table 5 to Subpart Uuuuu of... - Performance Testing Requirements

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to ASTM D6348-03, Sections A1 through A8 are mandatory; (2) For ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent (%) R must be determined for each target analyte (see Equation A5.5); (3) For the ASTM D6348-03 test data to be acceptable for a target analyte, %R must be 70% ≤ R ≤ 130%; and (4...

  5. 40 CFR Table 5 to Subpart Uuuuu of... - Performance Testing Requirements

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... to ASTM D6348-03, Sections A1 through A8 are mandatory; (2) For ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent (%) R must be determined for each target analyte (see Equation A5.5); (3) For the ASTM D6348-03 test data to be acceptable for a target analyte, %R must be 70% ≤ R ≤ 130%; and...
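
    Across the three table records above, the Annex A5 acceptance window is 70% ≤ %R ≤ 130% for each target analyte. A minimal sketch of the check; the simplified recovery ratio below is an assumption of this sketch, not the exact Equation A5.5 (which includes the method's own corrections):

```python
def percent_recovery(measured_spiked, measured_unspiked, expected_spike):
    """Generic analyte-spiking recovery in the spirit of ASTM D6348-03 Annex A5;
    this simplified ratio is an assumption, not the standard's Equation A5.5."""
    return 100.0 * (measured_spiked - measured_unspiked) / expected_spike

# Hypothetical concentrations (same units throughout).
r = percent_recovery(measured_spiked=9.2, measured_unspiked=1.1, expected_spike=8.0)
print(r, 70.0 <= r <= 130.0)  # 101.25 True -- within the 70-130% acceptance window
```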

  6. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  7. Overview: MURI Center on spectroscopic and time domain detection of trace explosives in condensed and vapor phases

    NASA Astrophysics Data System (ADS)

    Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay

    2003-09-01

    The research center established by the Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigate and advance the use of complementary analytical techniques for sensing explosives and/or explosive-related compounds as they occur in the environment. The techniques being investigated include Terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring Down Spectroscopy (CRDS) and Resonance Enhanced Multiphoton Ionization (REMPI). This suite of techniques encompasses a diversity of sensing approaches that can be applied to the detection of explosives in condensed phases, such as adsorbed species in soil, or used for vapor phase detection above the source. Some techniques allow for remote detection, while others have highly specific and sensitive analysis capabilities. This program is addressing a range of fundamental technical issues associated with trace detection of explosive-related compounds using these techniques. For example, while both LIBS and THz can be used to carry out remote analysis of condensed phase analyte from a distance in excess of several meters, the sensitivities of these techniques to surface-adsorbed explosive-related compounds are not currently known. In current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances required for these techniques, including signature identification for explosive-related compounds/interferents and trace analyte extraction. Later program tasks will explore simultaneous application of two or more techniques to assess the benefits of sensor fusion.

  8. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes it possible to obtain reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.
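
    A representative multivariate workhorse for such long data vectors is principal component analysis, which separates a few signal-bearing directions from noise. A minimal sketch on synthetic spectra (illustrative only; the article's examples used their own quantitative and heuristic models):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical spectra: 50 samples x 1000 channels, two latent factors plus noise.
scores = rng.normal(size=(50, 2))
loadings = rng.normal(size=(2, 1000))
spectra = scores @ loadings + 0.05 * rng.normal(size=(50, 1000))

# Principal component analysis via SVD: keep the components that carry signal,
# discard directions dominated by noise or uncontrolled analytes.
centered = spectra - spectra.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / (s**2).sum()
print(explained[:4])  # the first two components dominate
```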

  9. Using Photocatalytic Oxidation and Analytic Techniques to Remediate Lab Wastewater Containing Methanol

    ERIC Educational Resources Information Center

    Xiong, Qing; Luo, Mingliang; Bao, Xiaoming; Deng, Yurong; Qin, Song; Pu, Xuemei

    2018-01-01

    This experiment is designed for second-year and above undergraduates in the experimental session of the analytical chemistry course. Grouped students are required to use a TiO[subscript 2] photocatalytic oxidation process to treat the methanol-containing wastewater that resulted from their previous HPLC experiments. Students learn to…

  10. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached in which to begin protective action. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond-design-basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has a drawback: it is difficult to demonstrate the completeness of the timing test. The analysis technique likewise has the demerit of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers from the sensor to the final actuation device on the instrument channel. When the total channel is not tested in a single test, separate tests on groups of components or single components spanning the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the methodology proposed in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
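
    The analysis half of the proposed technique amounts to a budget check: allocate a response time to each component on the critical signal path, sum them, and compare against the analytical limit. A minimal sketch with invented component names and allocations (actual APR1400/OPR1000 values are plant-specific):

    ```python
    # Hypothetical response time budget for one trip parameter.
    # Component names and allocations are illustrative, not plant data.
    REQUIREMENT_S = 1.20  # analytical response time limit from safety analysis

    critical_path = [
        ("pressure transmitter",   0.50),  # tested with a ramp input
        ("signal processing",      0.35),  # tested with a step input
        ("final actuation device", 0.20),  # tested with a step input
    ]

    total = sum(t for _, t in critical_path)
    print(f"analyzed total = {total:.2f} s, margin = {REQUIREMENT_S - total:+.2f} s")
    assert total <= REQUIREMENT_S, "trip parameter fails the response time requirement"
    ```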

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1989 (October 1988 through September 1989). The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  13. [The requirements of standard and conditions of interchangeability of medical articles].

    PubMed

    Men'shikov, V V; Lukicheva, T I

    2013-11-01

    The article considers the possibility of applying specific approaches to evaluating the interchangeability of medical articles for laboratory analysis. In developing standardized analytical technologies for laboratory medicine and formulating standard requirements addressed to manufacturers of medical articles, clinically validated requirements are to be followed. These requirements include the sensitivity and specificity of techniques, the accuracy and precision of research results, and the stability of reagent quality under the particular conditions of their transportation and storage. The validity of requirements formulated in standards and addressed to manufacturers of medical articles can be proved using a reference system that includes master forms and standard samples, reference techniques, and reference laboratories. This approach is supported by data from the evaluation of testing systems for measuring levels of thyrotropic hormone, thyroid hormones, and glycated hemoglobin HbA1c. Versions of testing systems can be considered interchangeable only if their results correspond to, and are comparable with, the results of the reference technique. In the absence of a functioning reference system, the resources of the Joint Committee for Traceability in Laboratory Medicine make it possible for manufacturers of reagent sets to apply certified reference materials when developing the manufacture of sets for a large list of analytes.

  14. Methods for geochemical analysis

    USGS Publications Warehouse

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia; Denver, Colorado; and Menlo Park, California. The Division has expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  15. Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.

    PubMed

    Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel

    2015-01-01

    Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are widely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states recommendations for the pre-analytical, analytical (method validation procedures, quality controls, reagents), and post-analytical conditions. In addition, we state a strategy for internal quality control management. These recommendations will be regularly updated.

  16. Dual nozzle aerodynamic and cooling analysis study

    NASA Technical Reports Server (NTRS)

    Meagher, G. M.

    1981-01-01

    Analytical models to predict performance and operating characteristics of dual nozzle concepts were developed and improved. Aerodynamic models are available to define flow characteristics and bleed requirements for both the dual throat and dual expander concepts. Advanced analytical techniques were utilized to provide quantitative estimates of the bleed flow, boundary layer, and shock effects within dual nozzle engines. Thermal analyses were performed to define cooling requirements for baseline configurations, and special studies of unique dual nozzle cooling problems defined feasible means of achieving adequate cooling.

  17. Efficiency of unconstrained minimization techniques in nonlinear analysis

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Knight, N. F., Jr.

    1978-01-01

    Unconstrained minimization algorithms have been critically evaluated for their effectiveness in solving structural problems involving geometric and material nonlinearities. The algorithms have been categorized as being zeroth, first, or second order depending upon the highest derivative of the function required by the algorithm. The sensitivity of these algorithms to the accuracy of derivatives clearly suggests using analytically derived gradients instead of finite difference approximations. The use of analytic gradients results in better control of the number of minimizations required for convergence to the exact solution.
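
    The reported sensitivity to derivative accuracy can be reproduced in miniature: a first-order descent converges cleanly with exact gradients but stalls at the discretization floor of a coarse forward-difference gradient. The test function below is a generic stand-in, not one of the paper's structural problems.

    ```python
    # Compare analytic vs. finite-difference gradients in steepest descent.
    import numpy as np

    def f(x):                      # simple nonlinear test function
        return 0.5 * x @ x + 0.25 * np.sum(x ** 4)

    def grad_analytic(x):
        return x + x ** 3

    def grad_fd(x, h=1e-3):        # forward differences, deliberately coarse
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x)
            e[i] = h
            g[i] = (f(x + e) - f(x)) / h
        return g

    for grad in (grad_analytic, grad_fd):
        x = np.full(4, 2.0)
        for it in range(200):
            x -= 0.1 * grad(x)
            # measure true optimality with the exact gradient in both runs
            if np.linalg.norm(grad_analytic(x)) < 1e-6:
                break
        print(f"{grad.__name__}: {it + 1} iterations, |x| = {np.linalg.norm(x):.2e}")
    ```

    The finite-difference run never reaches the tolerance: its fixed point sits roughly h/2 away from the true minimum, which is the behavior motivating the authors' preference for analytic gradients.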

  18. Shape optimization of disc-type flywheels

    NASA Technical Reports Server (NTRS)

    Nizza, R. S.

    1976-01-01

    Techniques were developed to provide an analytical and graphical means for selecting an optimum flywheel system design, based on system requirements, geometric constraints, and weight limitations. The techniques for creating an analytical solution are formulated from energy and structural principles. The resulting flywheel design relates stress and strain pattern distribution, operating speeds, geometry, and specific energy levels. The design techniques yield the lowest-stressed flywheel for any particular application and achieve the highest possible specific energy per unit flywheel weight. Stress and strain contour mapping and sectional profile plotting reflect the structural behavior manifested under rotating conditions. This approach to flywheel design is applicable to any metal flywheel and permits the selection of the flywheel design to be based solely on the criteria of the system requirements that must be met, those that must be optimized, and those system parameters that may be permitted to vary.

  19. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and analytical determination of the ingredients included in the positive lists of the European Regulation on Cosmetic Products (EC 1223/2009): colorants, preservatives, and UV filters. It summarizes the analytical properties of the most relevant methods along with their ability to meet current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in sample pretreatment and extraction and in the different instrumental approaches developed to meet this challenge. Cosmetics are complex samples, and most of them require sample pretreatment before analysis. In recent years, research on this aspect has tended toward green extraction and microextraction techniques. Analytical methods have generally been based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful in real-life cosmetic analysis are multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Required, Practical, or Unnecessary? An Examination and Demonstration of Propensity Score Matching Using Longitudinal Secondary Data

    ERIC Educational Resources Information Center

    Padgett, Ryan D.; Salisbury, Mark H.; An, Brian P.; Pascarella, Ernest T.

    2010-01-01

    The sophisticated analytical techniques available to institutional researchers give them an array of procedures to estimate a causal effect using observational data. But as many quantitative researchers have discovered, access to a wider selection of statistical tools does not necessarily ensure construction of a better analytical model. Moreover,…

  1. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under the rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues with digital data arise when exploring admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics and to identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer, and the discussions are not meant to be inclusive of all differences in laws between states and countries.

  2. Using Analytical Techniques to Interpret Financial Statements.

    ERIC Educational Resources Information Center

    Walters, Donald L.

    1986-01-01

    Summarizes techniques for interpreting the balance sheet and the statement of revenues, expenditures, and changes-in-fund-balance sections of the comprehensive annual financial report required of all school districts. Uses three tables to show intricacies involved and focuses on analyzing favorable and unfavorable budget variances. (MLH)

  3. Laser desorption ionization mass spectrometry: Recent progress in matrix-free and label-assisted techniques.

    PubMed

    Mandal, Arundhoti; Singha, Monisha; Addy, Partha Sarathi; Basak, Amit

    2017-10-13

    MALDI-based mass spectrometry has, over the last three decades, become an important analytical tool. It is a gentle ionization technique, usually applied to detect and characterize analytes of high molecular weight such as proteins and other macromolecules. Detecting low-molecular-weight analytes such as small organic molecules and metal ion complexes was initially difficult with this technique because of the cluster of matrix-derived peaks in the low-molecular-weight region. To detect such molecules and metal ion complexes, a four-pronged strategy has been developed: use of alternative matrix materials, employment of new surface materials that require no matrix, use of metabolites that directly absorb the laser light, and laser-absorbing label-assisted LDI-MS (popularly known as LALDI-MS). This review highlights the developments in all these strategies, with special emphasis on LALDI-MS. © 2017 Wiley Periodicals, Inc.

  4. Analytical challenges for conducting rapid metabolism characterization for QIVIVE.

    PubMed

    Tolonen, Ari; Pelkonen, Olavi

    2015-06-05

    For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is the tool of choice for the majority of relatively lipophilic organic molecules, linked with an LC separation tool and simultaneous UV detection. However, additional techniques, such as gas chromatography, radiometric measurements, and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, while attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  5. Bioanalytical Applications of Fluorescence Line-Narrowing and Non-Line-Narrowing Spectroscopy Interfaced with Capillary Electrophoresis and High-Performance Liquid Chromatography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Kenneth Paul

    Capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) are widely used analytical separation techniques with many applications in the chemical, biochemical, and biomedical sciences. Conventional analyte identification in these techniques is based on the retention/migration times of standards, requiring a high degree of reproducibility, the availability of reliable standards, and the absence of coelution. From this, several new information-rich detection methods (also known as hyphenated techniques) are being explored that would be capable of providing unambiguous on-line identification of separating analytes in CE and HPLC. As further discussed, a number of such on-line detection methods have shown considerable success, including Raman, nuclear magnetic resonance (NMR), mass spectrometry (MS), and fluorescence line-narrowing spectroscopy (FLNS). In this thesis, the feasibility and potential of combining the highly sensitive and selective laser-based detection method of FLNS with analytical separation techniques are discussed and presented. A summary of previously demonstrated FLNS detection interfaced with chromatography and electrophoresis is given, and recent results from on-line FLNS detection in CE (CE-FLNS), and the new combination of HPLC-FLNS, are shown.

  6. Pulsed plane wave analytic solutions for generic shapes and the validation of Maxwell's equations solvers

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; Vastano, John A.; Lomax, Harvard

    1992-01-01

    Generic shapes are subjected to pulsed plane waves of arbitrary shape. The resulting scattered electromagnetic fields are determined analytically. These fields are then computed efficiently at field locations for which numerically determined EM fields are required. Of particular interest are the pulsed waveform shapes typically utilized by radar systems. The results can be used to validate the accuracy of finite difference time domain Maxwell's equations solvers. A two-dimensional solver which is second- and fourth-order accurate in space and fourth-order accurate in time is examined. Dielectric media properties are modeled by a ramping technique which simplifies the associated gridding of body shapes. The attributes of the ramping technique are evaluated by comparison with the analytic solutions.
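
    In one dimension the validation idea is easy to demonstrate: march a pulsed plane wave with a Yee-style staggered leapfrog update and compare against the exact free-space solution, which is just the initial pulse translated at the wave speed. The grid, pulse shape, and normalized units (c = 1) are illustrative assumptions; the solver in the paper is two-dimensional and higher order.

    ```python
    # 1-D free-space FDTD validated against the analytic translated pulse.
    # Field signs follow dEz/dt = dHy/dx and dHy/dt = dEz/dx (normalized units).
    import numpy as np

    c = 1.0
    nx, nt = 400, 250
    dx = 1.0 / nx
    dt = 0.5 * dx / c                         # Courant number 0.5, stable

    x = np.arange(nx) * dx
    pulse = lambda s: np.exp(-0.5 * ((s - 0.25) / 0.02) ** 2)

    Ez = pulse(x)                             # E at t = 0
    Hy = -pulse(x + 0.5 * dx + 0.5 * c * dt)  # H staggered at t = -dt/2

    for _ in range(nt):
        Hy[:-1] += dt / dx * (Ez[1:] - Ez[:-1])  # dHy/dt = dEz/dx
        Ez[1:] += dt / dx * (Hy[1:] - Hy[:-1])   # dEz/dt = dHy/dx

    exact = pulse(x - c * nt * dt)               # analytic solution at t = nt*dt
    print(f"max |FDTD - exact| = {np.max(np.abs(Ez - exact)):.2e}")
    ```

    Refining the grid (and shrinking dt with it) should reduce the reported error at the scheme's second-order rate, which is exactly the kind of check the analytic solutions enable.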

  7. Microfluidic-Based sample chips for radioactive solutions

    DOE PAGES

    Tripp, J. L.; Law, J. D.; Smith, T. E.; ...

    2015-01-01

    Historical nuclear fuel cycle process sampling techniques required sample volumes ranging into the tens of milliliters. The radiation levels experienced by analytical personnel and equipment, in addition to the waste volumes generated from analysis of these samples, have been significant. These sample volumes also impacted accountability inventories of required analytes during process operations. To mitigate radiation dose and other issues associated with the historically larger sample volumes, a microcapillary sample chip was chosen for further investigation. The ability to obtain microliter volume samples coupled with a remote automated means of sample loading, tracking, and transporting to the analytical instrument would greatly improve analytical efficiency while reducing both personnel exposure and radioactive waste volumes. Sample chip testing was completed to determine the accuracy, repeatability, and issues associated with the use of microfluidic sample chips used to supply µL sample volumes of lanthanide analytes dissolved in nitric acid for introduction to an analytical instrument for elemental analysis.

  8. A Study of a Standard BIT Circuit.

    DTIC Science & Technology

    1977-02-01

    Excerpt from the report's table of contents: Recommended BIT Approaches for QED Modules and Application of the Analytic Measures, 36; 4.1 Built-In-Test for Memory Class Modules, 37; 4.1.1 Random Access...Implementation, 68; 4.1.5.5 Critical Parameters, 68; 4.1.5.6 QED Module Test Equipment Requirements, 68; 4.1.6 Application of Analytic Measures to the...Microprocessor BIT Techniques, 121; 4.2.9 Application of Analytic Measures to the Recommended BIT Approaches, 125; 4.2.10 Process Class BIT by Partial...

  9. Iterative categorization (IC): a systematic technique for analysing qualitative data

    PubMed Central

    2016-01-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155

  10. An analytical technique for predicting the characteristics of a flexible wing equipped with an active flutter-suppression system and comparison with wind-tunnel data

    NASA Technical Reports Server (NTRS)

    Abel, I.

    1979-01-01

    An analytical technique for predicting the performance of an active flutter-suppression system is presented. This technique is based on the use of an interpolating function to approximate the unsteady aerodynamics. The resulting equations are formulated in terms of linear, ordinary differential equations with constant coefficients. This technique is then applied to an aeroelastic model wing equipped with an active flutter-suppression system. Comparisons between wind-tunnel data and analysis are presented for the wing both with and without active flutter suppression. Results indicate that the wing flutter characteristics without flutter suppression can be predicted very well but that a more adequate model of wind-tunnel turbulence is required when the active flutter-suppression system is used.

  11. Ring-oven based preconcentration technique for microanalysis: simultaneous determination of Na, Fe, and Cu in fuel ethanol by laser induced breakdown spectroscopy.

    PubMed

    Cortez, Juliana; Pasquini, Celio

    2013-02-05

    The ring-oven technique, originally applied to classical qualitative analysis from the 1950s to the 1970s, is revisited for use in a simple though highly efficient and green procedure for analyte preconcentration prior to determination by the microanalytical techniques presently available. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is maintained in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center to a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, about 2.0 cm in diameter, is formed on the paper, containing most of the analytes originally present in the sample volume. Preconcentration coefficients of the analyte can reach 250-fold (on a m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure have been evaluated for concentrating Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour, employing the microanalytical technique of laser-induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL(-1) and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)% for Na, Fe, and Cu, respectively, were obtained in fuel ethanol. It is possible to anticipate the application of the technique, coupled to modern microanalytical and multianalyte techniques, to several analytical problems requiring analyte preconcentration and/or sample stabilization.
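
    Two of the numbers in this abstract can be connected with back-of-envelope arithmetic: the mass-based preconcentration factor implied by drying 600 μL of sample into a narrow ring, and a 3σ/slope detection limit from a calibration line. Everything below except the 600 μL volume is an invented illustration.

    ```python
    import numpy as np

    # Mass-based preconcentration: 600 uL of ethanol (~0.789 g/mL) concentrated
    # into the ring zone of the filter paper (ring-zone mass assumed, not measured).
    sample_mass_mg = 0.600 * 0.789 * 1000.0   # ~473 mg of sample
    ring_mass_mg = 1.9                        # assumed mass of the ~350 um ring
    print(f"preconcentration ~ {sample_mass_mg / ring_mass_mg:.0f}-fold (m/m)")

    # 3*sigma/slope detection limit from a hypothetical LIBS calibration line.
    conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])        # ug/mL
    signal = np.array([0.02, 0.13, 0.24, 0.55, 1.08])  # invented intensities
    slope = np.polyfit(conc, signal, 1)[0]
    sigma_blank = 0.025                                 # assumed blank std. dev.
    print(f"LOD ~ {3.0 * sigma_blank / slope:.1f} ug/mL")
    ```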

  12. Airborne chemistry: acoustic levitation in chemical analysis.

    PubMed

    Santesson, Sabina; Nilsson, Staffan

    2004-04-01

    This review, with 60 references, describes a unique path to miniaturisation, that is, the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitation of small volumes of sample by means of a levitation technique can be used as a way to avoid solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 microL volume range, and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right-angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.

  13. Closed-loop, pilot/vehicle analysis of the approach and landing task

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.; Anderson, M. R.

    1985-01-01

    Optimal-control-theoretic modeling and frequency-domain analysis constitute the methodology proposed to evaluate analytically the handling qualities of higher-order manually controlled dynamic systems. Fundamental to the methodology is evaluating the interplay between pilot workload and closed-loop pilot/vehicle performance and stability robustness. The model-based metric for pilot workload is the required pilot phase compensation. Pilot/vehicle performance and loop stability are then evaluated using frequency-domain techniques. When these techniques were applied to flight-test data for thirty-two highly augmented fighter configurations, strong correlation was obtained between the analytical and experimental results.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Graczyk, D.G.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1991 (October 1990 through September 1991). This is the eighth annual report for the ACL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.

  15. Older driver highway design handbook

    DOT National Transportation Integrated Search

    1998-01-01

    This project included literature reviews and research syntheses, using meta-analytic techniques where appropriate, in the areas of age-related (diminished) functional capabilities, and human factors and highway safety. A User-Requirements Analysi...

  16. 40 CFR 501.15 - Requirements for permitting.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... individual(s) who performed the analyses; (E) The analytical techniques or methods used; and (F) The results... monitoring device or method required to be maintained under this permit shall, upon conviction, be punished... permittee's use or disposal methods is promulgated under section 405(d) of the CWA before the expiration of...

  17. 40 CFR 501.15 - Requirements for permitting.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... individual(s) who performed the analyses; (E) The analytical techniques or methods used; and (F) The results... monitoring device or method required to be maintained under this permit shall, upon conviction, be punished... permittee's use or disposal methods is promulgated under section 405(d) of the CWA before the expiration of...

  18. 40 CFR 501.15 - Requirements for permitting.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... individual(s) who performed the analyses; (E) The analytical techniques or methods used; and (F) The results... monitoring device or method required to be maintained under this permit shall, upon conviction, be punished... permittee's use or disposal methods is promulgated under section 405(d) of the CWA before the expiration of...

  19. 40 CFR 501.15 - Requirements for permitting.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... individual(s) who performed the analyses; (E) The analytical techniques or methods used; and (F) The results... monitoring device or method required to be maintained under this permit shall, upon conviction, be punished... permittee's use or disposal methods is promulgated under section 405(d) of the CWA before the expiration of...

  20. 40 CFR 501.15 - Requirements for permitting.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... individual(s) who performed the analyses; (E) The analytical techniques or methods used; and (F) The results... monitoring device or method required to be maintained under this permit shall, upon conviction, be punished... permittee's use or disposal methods is promulgated under section 405(d) of the CWA before the expiration of...

  1. Closed Loop Requirements and Analysis Management

    NASA Technical Reports Server (NTRS)

    Lamoreaux, Michael; Verhoef, Brett

    2015-01-01

    Effective systems engineering involves the use of analysis in the derivation of requirements and the verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during change evaluation, and analyses are leveraged during design verification. Recommendations on concept validation case studies are also discussed.
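
    A minimal sketch of the closed-loop bookkeeping the abstract describes: each requirement carries links to the analyses that derived or verify it, and any change to the requirement marks that evidence stale until the analyses are re-run. The class names and fields are hypothetical; this is not the Teamcenter data model.

    ```python
    # Hypothetical closed-loop requirement/analysis trace.
    from dataclasses import dataclass, field

    @dataclass
    class Analysis:
        name: str
        stale: bool = False

    @dataclass
    class Requirement:
        rid: str
        text: str
        analyses: list = field(default_factory=list)

        def change(self, new_text):
            self.text = new_text
            for a in self.analyses:      # closed loop: change invalidates evidence
                a.stale = True

        def verified(self):
            return bool(self.analyses) and not any(a.stale for a in self.analyses)

    loads = Analysis("wing loads model")
    req = Requirement("R-101", "Limit load factor 2.5 g", [loads])
    print(req.verified())                # True
    req.change("Limit load factor 3.0 g")
    print(req.verified())                # False until the loads analysis is re-run
    ```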

  2. The general 2-D moments via integral transform method for acoustic radiation and scattering

    NASA Astrophysics Data System (ADS)

    Smith, Jerry R.; Mirotznik, Mark S.

    2004-05-01

    The moments via integral transform method (MITM) is a technique to analytically reduce the 2-D method of moments (MoM) impedance double integrals into single integrals. By using a special integral representation of the Green's function, the impedance integral can be analytically simplified to a single integral in terms of transformed shape and weight functions. The reduced expression requires fewer computations and reduces the fill times of the MoM impedance matrix. Furthermore, the resulting integral is analytic for nearly arbitrary shape and weight function sets. The MITM technique is developed for mixed boundary conditions, and predictions with basic shape and weight function sets are presented. Comparisons of accuracy and speed between MITM and brute force are presented. [Work sponsored by ONR and NSWCCD ILIR Board.]

  3. Two-dimensional fracture analysis of piezoelectric material based on the scaled boundary node method

    NASA Astrophysics Data System (ADS)

    Shen-Shen, Chen; Juan, Wang; Qing-Hua, Li

    2016-04-01

    A scaled boundary node method (SBNM) is developed for two-dimensional fracture analysis of piezoelectric material, which allows the stress and electric displacement intensity factors to be calculated directly and accurately. As a boundary-type meshless method, the SBNM employs the moving Kriging (MK) interpolation technique to approximate the unknown field in the circumferential direction, so only a set of scattered nodes is required to discretize the boundary. As the shape functions satisfy the Kronecker delta property, no special techniques are required to impose the essential boundary conditions. In the radial direction, the SBNM seeks analytical solutions by making use of analytical techniques available for solving ordinary differential equations. Numerical examples are investigated and satisfactory solutions are obtained, which validates the accuracy and simplicity of the proposed approach. Project supported by the National Natural Science Foundation of China (Grant Nos. 11462006 and 21466012), the Foundation of Jiangxi Provincial Educational Committee, China (Grant No. KJLD14041), and the Foundation of East China Jiaotong University, China (Grant No. 09130020).

  4. Analytical reverse time migration: An innovation in imaging of infrastructures using ultrasonic shear waves.

    PubMed

    Asadollahi, Aziz; Khazanovich, Lev

    2018-04-11

    The emergence of ultrasonic dry point contact (DPC) transducers that emit horizontal shear waves has enabled efficient collection of high-quality data in the context of a nondestructive evaluation of concrete structures. This offers an opportunity to improve the quality of evaluation by adapting advanced imaging techniques. Reverse time migration (RTM) is a simulation-based reconstruction technique that offers advantages over conventional methods, such as the synthetic aperture focusing technique. RTM is capable of imaging boundaries and interfaces with steep slopes and the bottom boundaries of inclusions and defects. However, this imaging technique requires a massive amount of memory and its computation cost is high. In this study, both bottlenecks of the RTM are resolved when shear transducers are used for data acquisition. An analytical approach was developed to obtain the source and receiver wavefields needed for imaging using reverse time migration. It is shown that the proposed analytical approach not only eliminates the high memory demand, but also drastically reduces the computation time from days to minutes. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Interactive Management and Updating of Spatial Data Bases

    NASA Technical Reports Server (NTRS)

    French, P.; Taylor, M.

    1982-01-01

    The decision making process, whether for power plant siting, load forecasting, or energy resource planning, invariably involves a blend of analytical methods and judgement. Management decisions can be improved by the implementation of techniques which permit an increased comprehension of results from analytical models. Even where analytical procedures are not required, decisions can be aided by improving the methods used to examine spatially and temporally variant data. How the use of computer aided planning (CAP) programs and the selection of a predominant data structure can improve the decision making process is discussed.

  6. Bioinformatics Symposium of the Analytical Division of the American Chemical Society Meeting. Final Technical Report from 03/15/2000 to 03/14/2001 [sample pages of agenda, abstracts, index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Robert T.

    Sparked by the Human Genome Project, biological and biomedical research has become an information science. Information tools are now being generated for proteins, cell modeling, and genomics. The opportunity for analytical chemistry in this new environment is profound. New analytical techniques that can provide the information on genes, SNPs, proteins, protein modifications, cells, and cell chemistry are required. In this symposium, we brought together both informatics experts and leading analytical chemists to discuss this interface. Over 200 people attended this highly successful symposium.

  7. Method of multi-dimensional moment analysis for the characterization of signal peaks

    DOEpatents

    Pfeifer, Kent B; Yelton, William G; Kerr, Dayle R; Bouchier, Francis A

    2012-10-23

    A method of multi-dimensional moment analysis for the characterization of signal peaks can be used to optimize the operation of an analytical system. With a two-dimensional Peclet analysis, the quality and signal fidelity of peaks in a two-dimensional experimental space can be analyzed and scored. This method is particularly useful in determining optimum operational parameters for an analytical system which requires the automated analysis of large numbers of analyte data peaks. For example, the method can be used to optimize analytical systems including an ion mobility spectrometer that uses a temperature stepped desorption technique for the detection of explosive mixtures.
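
    The core computation, moments of a digitized peak, is compact: the zeroth moment gives the area, the first the centroid, and the second central moment the variance. The Peclet-like score below (centroid squared over variance) is a common dispersion figure of merit used as a stand-in; the patent's exact two-dimensional scoring function is not reproduced here.

    ```python
    import numpy as np

    t = np.linspace(0.0, 10.0, 1001)           # e.g. ion drift time, ms
    y = np.exp(-0.5 * ((t - 4.0) / 0.3) ** 2)  # synthetic analyte peak

    dt = t[1] - t[0]
    m0 = y.sum() * dt                          # zeroth moment: peak area
    m1 = (t * y).sum() * dt / m0               # first moment: centroid
    var = ((t - m1) ** 2 * y).sum() * dt / m0  # second central moment: variance
    print(f"area={m0:.3f}  centroid={m1:.3f}  Peclet-like score={m1**2 / var:.0f}")
    ```

    Sweeping an operational parameter (say, desorption temperature step rate) and scoring each resulting peak this way is one plausible route to the automated optimization the patent describes.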

  8. ASD FieldSpec Calibration Setup and Techniques

    NASA Technical Reports Server (NTRS)

    Olive, Dan

    2001-01-01

    This paper describes the Analytical Spectral Devices (ASD) FieldSpec calibration setup and techniques. The topics include: 1) ASD FieldSpec FR Spectroradiometer; 2) Components of Calibration; 3) Equipment List; 4) Spectral Setup; 5) Spectral Calibration; 6) Radiometric and Linearity Setup; 7) Radiometric Setup; 8) Datasets Required; 9) Data Files; and 10) Field of View Measurement. This paper is in viewgraph form.

  9. Energy distributions and radiation transport in uranium plasmas

    NASA Technical Reports Server (NTRS)

    Miley, G. H.; Bathke, C.; Maceda, E.; Choi, C.

    1976-01-01

    An approximate analytic model, based on continuous electron slowing, has been used for survey calculations. Where more accuracy is required, a Monte Carlo technique is used which combines an analytic representation of Coulombic collisions with a random walk treatment of inelastic collisions. The calculated electron distributions have been incorporated into another code that evaluates both the excited atomic state densities within the plasma and the radiative flux emitted from the plasma.

  10. Practical guidelines for the characterization and quality control of pure drug nanoparticles and nano-cocrystals in the pharmaceutical industry.

    PubMed

    Peltonen, Leena

    2018-06-16

    The number of poorly soluble drug candidates is increasing, and this is also seen in the research interest in drug nanoparticles and (nano-)cocrystals; improved solubility is the most important application of these nanosystems. In order to confirm the functionality of these nanoparticles throughout their lifecycle (repeatability of the formulation processes, functional performance of the formed systems in a pre-determined way, and system stability), a thorough physicochemical understanding, with the aid of the necessary analytical techniques, is needed. Even very minor deviations in, for example, particle size or size distribution at the nanoscale can alter the product bioavailability, and the effect is even more dramatic for the smallest particle size fractions. Small particle size also sets special requirements for the analytical techniques. In this review, the most important physicochemical properties of drug nanocrystals and nano-cocrystals are presented, and suitable analytical techniques, with their pros and cons, are described with extra emphasis on the practical point of view. Copyright © 2018. Published by Elsevier B.V.

  11. Evaluation of available analytical techniques for monitoring the quality of space station potable water

    NASA Technical Reports Server (NTRS)

    Geer, Richard D.

    1989-01-01

    To assure the quality of potable water (PW) on the Space Station (SS), a number of chemical and physical tests must be conducted routinely. After reviewing the requirements for potable water, both direct and indirect analytical methods are evaluated that could make the required tests and improvements compatible with Space Station operation. A variety of suggestions are made to improve the analytical techniques for SS operation. The most important recommendations are: (1) the silver/silver chloride electrode (SB) method of removing the I2/I(-) biocide from the water, since it may interfere with analytical procedures for PW and also its end uses; (2) the orbital reactor (OR) method of carrying out chemistry and electrochemistry in microgravity by using a disk-shaped reactor on an orbital table to impart an artificial g-force to the contents, allowing solution mixing and separation of gases and liquids; and (3) a simple ultra-low-volume, highly sensitive electrochemical/conductivity detector for use with a capillary zone electrophoresis apparatus. It is also recommended, since several different conductivity and resistance measurements are made during the analysis of PW, that the bipolar pulse measuring circuit be used in all these applications for maximum compatibility and redundancy of equipment.

  12. A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program

    NASA Technical Reports Server (NTRS)

    Bartoszek, J. T.; Huckins, B.; Coyle, M.

    1979-01-01

    A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.

  13. Engineering Analysis of Stresses in Railroad Rails.

    DOT National Transportation Integrated Search

    1981-10-01

    One portion of the Federal Railroad Administration's (FRA) Track Performance Improvement Program is the development of engineering and analytic techniques required for the design and maintenance of railroad track of increased integrity and safety. Un...

  14. Preliminary Description of Stresses in Railroad Rail

    DOT National Transportation Integrated Search

    1976-11-01

    One portion of the Federal Railroad Administration's (FRA) Track Performance Improvement Program is the development of engineering and analytic techniques required for the design and maintenance of railroad track of increased integrity and safety. Un...

  15. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field-flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving into the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles by using ICP-MS, but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential for managing such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples, along with single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Analytical advances in pharmaceutical impurity profiling.

    PubMed

    Holm, René; Elder, David P

    2016-05-25

    Impurities will be present in all drug substances and drug products; i.e., nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2 g/day, identification of impurities is set at the 0.1% level and above (ICH Q3B(R2), 2006). For some impurities this is a simple undertaking, as generally available analytical techniques can address the prevailing analytical challenges; for others it may be much more challenging, requiring more sophisticated analytical approaches. The present review provides an insight into the current development of analytical techniques to investigate and quantify impurities in drug substances and drug products, with discussion of progress particularly within the field of chromatography to ensure separation and quantification of related impurities. Further, a section is devoted to the identification of classical impurities; in addition, inorganic (metal residues) and solid-state impurities are also discussed. Risk control strategies for pharmaceutical impurities, aligned with several of the ICH guidelines, are also discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
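
    The 0.1% identification threshold the review cites from ICH Q3B(R2) translates into a simple screening rule over an area-percent impurity profile. The peak table below is invented for illustration; real reporting also involves response factors and separate reporting/qualification thresholds not modeled here.

    ```python
    # Screen an invented chromatographic peak table against the 0.1%
    # identification threshold (ICH Q3B(R2), dose < 2 g/day).
    peaks = {"API": 98.71, "imp A": 0.62, "imp B": 0.09, "imp C": 0.18}  # raw areas
    total = sum(peaks.values())
    for name, area in peaks.items():
        pct = 100.0 * area / total
        flag = "identify" if name != "API" and pct >= 0.10 else ""
        print(f"{name:6s} {pct:7.3f}%  {flag}")
    ```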

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Heinrich, R.R.; Jensen, K.J.

    The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques. The purpose of this report is to summarize the technical and administrative activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1985 (October 1984 through September 1985). This is the second annual report for the ACL. 4 figs., 1 tab.

  18. An analytical technique for approximating unsteady aerodynamics in the time domain

    NASA Technical Reports Server (NTRS)

    Dunn, H. J.

    1980-01-01

    An analytical technique is presented for approximating unsteady aerodynamic forces in the time domain. The order of the elements of a matrix Padé approximation was postulated, and the resulting polynomial coefficients were determined through a combination of least squares estimates for the numerator coefficients and a constrained gradient search for the denominator coefficients, which ensures stable approximating functions. The number of differential equations required to represent the aerodynamic forces to a given accuracy tends to be smaller than that employed in certain existing techniques where the denominator coefficients are chosen a priori. Results are shown for an aeroelastic, cantilevered, semispan wing which indicate that a good fit to the aerodynamic forces for oscillatory motion can be achieved with a matrix Padé approximation having fourth-order numerator and second-order denominator polynomials.
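
    A scalar analogue of the described two-step fit, linear least squares for the numerator nested inside a constrained search for the denominator (lag) roots, can be sketched in a few lines. Restricting the poles to negative real values enforces the stability the abstract requires. The rational form used here is a Roger-type approximation chosen for illustration, and the "aerodynamic" data are synthetic, not the paper's wing data.

    ```python
    # Simplified scalar rational fit: LS numerator, constrained stable poles.
    import numpy as np
    from scipy.optimize import minimize

    k = np.linspace(0.01, 2.0, 60)            # reduced frequencies
    s = 1j * k
    Q = np.exp(-0.5j * k) / (1 + 0.8j * k)    # stand-in unsteady aerodynamic data

    def fit_numerator(poles):
        # Basis [1, s, s^2, s/(s - p1), s/(s - p2)]; real LS over Re/Im parts.
        B = np.column_stack([np.ones_like(s), s, s * s] +
                            [s / (s - p) for p in poles])
        A = np.vstack([B.real, B.imag])
        b = np.concatenate([Q.real, Q.imag])
        coef, *_ = np.linalg.lstsq(A, b, rcond=None)
        return coef, np.linalg.norm(A @ coef - b)

    def objective(log_lags):
        poles = -np.exp(log_lags)             # strictly negative poles => stable
        return fit_numerator(poles)[1]

    res = minimize(objective, x0=np.log([0.2, 1.0]), method="Nelder-Mead")
    poles = -np.exp(res.x)
    print("lag poles:", poles, " residual:", fit_numerator(poles)[1])
    ```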

  19. Iterative categorization (IC): a systematic technique for analysing qualitative data.

    PubMed

    Neale, Joanne

    2016-06-01

    The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.

  20. Spectroscopic analysis of solar and cosmic X-ray spectra. 1: The nature of cosmic X-ray spectra and proposed analytical techniques

    NASA Technical Reports Server (NTRS)

    Walker, A. B. C., Jr.

    1975-01-01

    Techniques for the study of the solar corona are reviewed as an introduction to a discussion of modifications required for the study of cosmic sources. Spectroscopic analysis of individual sources and the interstellar medium is considered. The latter was studied via analysis of its effect on the spectra of selected individual sources. The effects of various characteristics of the ISM, including the presence of grains, molecules, and ionization, are first discussed, and the development of ISM models is described. The expected spectral structure of individual cosmic sources is then reviewed with emphasis on supernovae remnants and binary X-ray sources. The observational and analytical requirements imposed by the characteristics of these sources are identified, and prospects for the analysis of abundances and the study of physical parameters within them are assessed. Prospects for the spectroscopic study of other classes of X-ray sources are also discussed.

  1. The electromagnetic modeling of thin apertures using the finite-difference time-domain technique

    NASA Technical Reports Server (NTRS)

    Demarest, Kenneth R.

    1987-01-01

    A technique which computes transient electromagnetic responses of narrow apertures in complex conducting scatterers was implemented as an extension of previously developed Finite-Difference Time-Domain (FDTD) computer codes. Although these apertures are narrow with respect to the wavelengths contained within the power spectrum of excitation, this technique does not require significantly more computer resources to attain the increased resolution at the apertures. In the report, an analytical technique which utilizes Babinet's principle to model the apertures is developed, and an FDTD computer code which utilizes this technique is described.
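
    For readers unfamiliar with the underlying method, the following is a minimal one-dimensional Yee update loop showing only the core of the FDTD technique itself; it does not implement the report's thin-aperture/Babinet model.

```python
# Minimal 1D FDTD (Yee) sketch in normalized units (dz = dt = c = 1, the
# 1D Courant limit). Generic illustration of the technique only.
import numpy as np

nz, nt = 200, 400
ez = np.zeros(nz)        # electric field at integer grid points
hy = np.zeros(nz - 1)    # magnetic field, staggered half a cell

for n in range(nt):
    hy += np.diff(ez)                                 # H update
    ez[1:-1] += np.diff(hy)                           # E update; ends stay 0 (PEC walls)
    ez[nz // 2] += np.exp(-((n - 30) ** 2) / 100.0)   # soft Gaussian pulse source

print("peak |Ez| after %d steps: %.3f" % (nt, np.abs(ez).max()))
```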

  2. Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.

    PubMed

    Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S

    2016-04-07

    Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs); cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices; and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even with moderate uncertainty (30%) in the variance function, weighted regression still outperforms unweighted regression. We recommend the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
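
    The recommended power model is straightforward to apply. The sketch below, with synthetic numbers for illustration, weights a linear calibration by 1/sigma, where sigma = a * signal**b (here b = 1); in practice a and b would be estimated from replicate measurements.

```python
# Sketch: weighted calibration under the power model of variance,
# sigma = a * signal**b. Data and parameters are synthetic.
import numpy as np

rng = np.random.default_rng(1)
conc = np.array([1, 2, 5, 10, 20, 50, 100], dtype=float)
true_signal = 0.8 * conc + 0.5
sigma = 0.05 * true_signal**1.0          # power model with b = 1: sd grows with signal
signal = true_signal + rng.normal(0.0, sigma)

slope_u, icpt_u = np.polyfit(conc, signal, 1)                 # unweighted (homoskedastic assumption)
slope_w, icpt_w = np.polyfit(conc, signal, 1, w=1.0 / sigma)  # weighted; np.polyfit expects w = 1/sigma

print("unweighted: slope=%.3f intercept=%.3f" % (slope_u, icpt_u))
print("weighted:   slope=%.3f intercept=%.3f" % (slope_w, icpt_w))
```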

  3. Analytical determination of selenium in medical samples, staple food and dietary supplements by means of total reflection X-ray fluorescence spectroscopy

    NASA Astrophysics Data System (ADS)

    Stosnach, Hagen

    2010-09-01

    Selenium is essential for many aspects of human health and, thus, the object of intensive medical research. This demands the use of analytical techniques capable of analysing selenium at low concentrations with high accuracy in widespread matrices and sometimes the smallest sample amounts. In connection with the increasing importance of selenium, there is a need for rapid and simple on-site (or near-to-site) selenium analysis in food basics like wheat at processing and production sites, as well as for the analysis of this element in dietary supplements. Common analytical techniques like electrothermal atomic absorption spectroscopy (ETAAS) and inductively coupled plasma mass spectrometry (ICP-MS) are capable of analysing selenium in medical samples with detection limits in the range from 0.02 to 0.7 μg/l. Since less complicated and less expensive analytical techniques are required in many cases, TXRF has been tested regarding its suitability for selenium analysis in different medical, food basic, and dietary supplement samples, applying very simple sample preparation techniques. The reported results indicate that the accurate analysis of selenium in all sample types is possible. The detection limits of TXRF are in the range from 7 to 12 μg/l for medical samples and 0.1 to 0.2 mg/kg for food basics and dietary supplements. Although this sensitivity is low compared with established techniques, it is sufficient for the physiological concentrations of selenium in the investigated samples.

  4. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of the solvent and sample required by classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can all be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This two-part review comprehensively surveys developments in the automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies and techniques, their operational advantages, and their potential are further described and discussed. In this first part, an introduction to LPME, its static and dynamic operation modes, and its automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are reviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics

    NASA Technical Reports Server (NTRS)

    Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan

    2013-01-01

    In the era of petascale computing, more scientific applications are being deployed on leadership-scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not been shown to be adequate for many mission-critical applications, particularly in the data post-processing stage. For example, some scientific applications generate datasets composed of a vast number of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Including such dimensional knowledge in the data organization can benefit the efficiency of data post-processing, but it is often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high-performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before they are written to storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR enables efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with a critical climate modeling application, GEOS-5, the experimental results on the Jaguar supercomputer demonstrate an improvement of up to 73 times in read performance compared with the original I/O method.
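
    The locality argument behind such reorganization can be seen in a few lines of NumPy (an illustration of the layout effect only, not the STAR implementation): a per-gridpoint time series is strided in the time-major layout a model typically writes, but contiguous once the data are reorganized with time as the innermost dimension.

```python
# Layout illustration (not STAR itself): make the queried dimension contiguous.
import numpy as np

nt, ny, nx = 365, 180, 360
data = np.zeros((nt, ny, nx), dtype=np.float32)   # time-major, as typically written

print("time-major strides (bytes):", data.strides)            # time step = largest stride
reorg = np.ascontiguousarray(np.transpose(data, (1, 2, 0)))   # time innermost
print("reorganized strides (bytes):", reorg.strides)

series = reorg[90, 180, :]   # one grid point's full time series: a single contiguous read
print("time series contiguous:", series.flags["C_CONTIGUOUS"])
```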

  6. Analytical methods for determination of mycotoxins: a review.

    PubMed

    Turner, Nicholas W; Subrahmanyam, Sreenath; Piletsky, Sergey A

    2009-01-26

    Mycotoxins are small (MW approximately 700), toxic chemical products formed as secondary metabolites by a few fungal species that readily colonise crops and contaminate them with toxins in the field or after harvest. Ochratoxins and aflatoxins are mycotoxins of major significance, and hence there has been significant research on a broad range of analytical and detection techniques that could be useful and practical. Due to the variety of structures of these toxins, it is impossible to use one standard technique for analysis and/or detection. Practical requirements for high-sensitivity analysis and the need for a specialist laboratory setting create challenges for routine analysis. Several existing analytical techniques, which offer flexible and broad-based methods of analysis and in some cases detection, are discussed in this manuscript. There are a number of methods used, of which many are lab-based, but to our knowledge no single technique stands out above the rest, although analytical liquid chromatography, commonly coupled with mass spectrometry, is likely to be popular. This review discusses (a) sample pre-treatment methods such as liquid-liquid extraction (LLE), supercritical fluid extraction (SFE), and solid-phase extraction (SPE); (b) separation methods such as thin-layer chromatography (TLC), high-performance liquid chromatography (HPLC), gas chromatography (GC), and capillary electrophoresis (CE); and (c) others such as ELISA. Finally, current trends, advantages and disadvantages, and future prospects of these methods are discussed.

  7. Structural analyses for the modification and verification of the Viking aeroshell

    NASA Technical Reports Server (NTRS)

    Stephens, W. B.; Anderson, M. S.

    1976-01-01

    The Viking aeroshell is an extremely lightweight flexible shell structure that has undergone thorough buckling analyses in the course of its development. The analytical tools and modeling technique required to reveal the structural behavior are presented. Significant results are given which illustrate the complex failure modes not usually observed in simple models and analyses. Both shell-of-revolution analysis for the pressure loads and thermal loads during entry and a general shell analysis for concentrated tank loads during launch were used. In many cases fixes or alterations to the structure were required, and the role of the analytical results in determining these modifications is indicated.

  8. [Enzymatic analysis of the quality of foodstuffs].

    PubMed

    Kolesnov, A Iu

    1997-01-01

    Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these components cannot be determined by conventional methods, or, if methods are available, they can be determined only with limited accuracy. Today, methods of enzymatic analysis are being used increasingly in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific, and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, and safety of reagents.

  9. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be applied directly to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it to control both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
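
    The design described, one generic interpreter driven by script commands rather than per-instrument code, can be sketched in a few lines (shown here in Python rather than LabVIEW; the command names and device classes are hypothetical stand-ins, not the authors' implementation):

```python
# Sketch of a script-driven instrument-control pattern: one interpreter,
# many devices. Device classes and commands are hypothetical.
class Pump:
    def set_rate(self, ml_min): print(f"pump rate -> {ml_min} mL/min")

class Valve:
    def position(self, port): print(f"valve -> port {port:g}")

class Spectrometer:
    def acquire(self, seconds): print(f"acquiring for {seconds:g} s")

DEVICES = {"pump": Pump(), "valve": Valve(), "spec": Spectrometer()}

def run_script(script):
    """Each line reads: <device> <method> <numeric args...>."""
    for line in script.strip().splitlines():
        device, method, *args = line.split()
        getattr(DEVICES[device], method)(*[float(a) for a in args])

run_script("""
valve position 2
pump set_rate 0.5
spec acquire 10
""")
```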

  10. Bioassays as one of the Green Chemistry tools for assessing environmental quality: A review.

    PubMed

    Wieczerzak, M; Namieśnik, J; Kudłak, B

    2016-09-01

    For centuries, mankind has contributed to irreversible environmental changes, but due to the modern science of recent decades, scientists are able to assess the scale of this impact. The introduction of laws and standards to ensure environmental cleanliness requires comprehensive environmental monitoring, which should also meet the requirements of Green Chemistry. The broad spectrum of Green Chemistry principle applications should also include all of the techniques and methods of pollutant analysis and environmental monitoring. The classical methods of chemical analyses do not always match the twelve principles of Green Chemistry, and they are often expensive and employ toxic and environmentally unfriendly solvents in large quantities. These solvents can generate hazardous and toxic waste while consuming large volumes of resources. Therefore, there is a need to develop reliable techniques that would not only meet the requirements of Green Analytical Chemistry, but they could also complement and sometimes provide an alternative to conventional classical analytical methods. These alternatives may be found in bioassays. Commercially available certified bioassays often come in the form of ready-to-use toxkits, and they are easy to use and relatively inexpensive in comparison with certain conventional analytical methods. The aim of this study is to provide evidence that bioassays can be a complementary alternative to classical methods of analysis and can fulfil Green Analytical Chemistry criteria. The test organisms discussed in this work include single-celled organisms, such as cell lines, fungi (yeast), and bacteria, and multicellular organisms, such as invertebrate and vertebrate animals and plants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Investigations of Some Liquid Matrixes for Analyte Quantification by MALDI

    NASA Astrophysics Data System (ADS)

    Moon, Jeong Hee; Park, Kyung Man; Ahn, Sung Hee; Lee, Seong Hoon; Kim, Myung Soo

    2015-06-01

    Sample inhomogeneity is one of the obstacles preventing the generation of reproducible mass spectra by MALDI and their use for the purpose of analyte quantification. As a potential solution to this problem, we investigated MALDI with some liquid matrixes prepared by nonstoichiometric mixing of acids and bases. Out of 27 combinations of acids and bases, liquid matrixes could be produced from seven. When the overall spectral features were considered, two liquid matrixes using α-cyano-4-hydroxycinnamic acid as the acid and 3-aminoquinoline and N,N-diethylaniline as bases were the best choices. In our previous study of MALDI with solid matrixes, we found that three requirements had to be met for the generation of reproducible spectra and for analyte quantification: (1) controlling the temperature by fixing the total ion count, (2) plotting the analyte-to-matrix ion ratio versus the analyte concentration as the calibration curve, and (3) keeping the matrix suppression below a critical value. We found that the same requirements had to be met in MALDI with liquid matrixes as well. In particular, although the liquid matrixes tested here were homogeneous, they failed to display spot-to-spot spectral reproducibility unless the first requirement above was met. We also found that analyte-derived ions could not be produced efficiently by MALDI with the above liquid matrixes unless the analyte was sufficiently basic. In this sense, MALDI processes with solid and liquid matrixes should be regarded as complementary techniques rather than as competing ones.

  12. Investigations of Some Liquid Matrixes for Analyte Quantification by MALDI.

    PubMed

    Moon, Jeong Hee; Park, Kyung Man; Ahn, Sung Hee; Lee, Seong Hoon; Kim, Myung Soo

    2015-10-01

    Sample inhomogeneity is one of the obstacles preventing the generation of reproducible mass spectra by MALDI and their use for the purpose of analyte quantification. As a potential solution to this problem, we investigated MALDI with some liquid matrixes prepared by nonstoichiometric mixing of acids and bases. Out of 27 combinations of acids and bases, liquid matrixes could be produced from seven. When the overall spectral features were considered, two liquid matrixes using α-cyano-4-hydroxycinnamic acid as the acid and 3-aminoquinoline and N,N-diethylaniline as bases were the best choices. In our previous study of MALDI with solid matrixes, we found that three requirements had to be met for the generation of reproducible spectra and for analyte quantification: (1) controlling the temperature by fixing the total ion count, (2) plotting the analyte-to-matrix ion ratio versus the analyte concentration as the calibration curve, and (3) keeping the matrix suppression below a critical value. We found that the same requirements had to be met in MALDI with liquid matrixes as well. In particular, although the liquid matrixes tested here were homogeneous, they failed to display spot-to-spot spectral reproducibility unless the first requirement above was met. We also found that analyte-derived ions could not be produced efficiently by MALDI with the above liquid matrixes unless the analyte was sufficiently basic. In this sense, MALDI processes with solid and liquid matrixes should be regarded as complementary techniques rather than as competing ones.
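
    Requirement (2), taking the analyte-to-matrix ion ratio rather than the raw analyte intensity as the calibration response, is easy to sketch; the intensities below are synthetic and for illustration only.

```python
# Sketch of a ratio-based MALDI calibration curve (requirement 2 above).
# All numbers are synthetic.
import numpy as np

conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])                 # analyte concentration (arb. units)
analyte_ion = np.array([120.0, 260.0, 490.0, 1150.0, 2400.0])
matrix_ion = np.array([9800.0, 9900.0, 9750.0, 10100.0, 9850.0])

ratio = analyte_ion / matrix_ion             # normalizes shot-to-shot intensity variation
slope, intercept = np.polyfit(conc, ratio, 1)
print(f"calibration: ratio = {slope:.4f} * conc + {intercept:.4f}")

unknown_ratio = 0.075                        # measured ratio for an unknown
print("estimated concentration:", (unknown_ratio - intercept) / slope)
```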

  13. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.; Price, D. Marvin

    1991-01-01

    Spacecraft designers have always been concerned about the effects of meteoroid impacts on mission safety. The engineering solution to this problem has generally been to erect a bumper or shield placed outboard from the spacecraft wall to disrupt/deflect the incoming projectiles. Spacecraft designers have a number of tools at their disposal to aid in the design process. These include hypervelocity impact testing, analytic impact predictors, and hydrodynamic codes. Analytic impact predictors generally provide the best quick-look estimate of design tradeoffs. The most complete way to determine the characteristics of an analytic impact predictor is through optimization of the protective structures design problem formulated with the predictor of interest. Space Station Freedom protective structures design insight is provided through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. Major results are presented.

  14. Challenges in Modern Anti-Doping Analytical Science.

    PubMed

    Ayotte, Christiane; Miller, John; Thevis, Mario

    2017-01-01

    The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may in the future result in a change in the strategy applied by the World Anti-Doping Agency to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.

  15. The space shuttle payload planning working groups. Volume 8: Earth and ocean physics

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The findings and recommendations of the Earth and Ocean Physics working group of the space shuttle payload planning activity are presented. The requirements for the space shuttle mission are defined as: (1) precision measurement for earth and ocean physics experiments, (2) development and demonstration of new and improved sensors and analytical techniques, (3) acquisition of surface truth data for evaluation of new measurement techniques, (4) conduct of critical experiments to validate geophysical phenomena and instrumental results, and (5) development and validation of analytical/experimental models for global ocean dynamics and solid earth dynamics/earthquake prediction. Tables of data are presented to show the flight schedule estimated costs, and the mission model.

  16. Fan noise prediction assessment

    NASA Technical Reports Server (NTRS)

    Bent, Paul H.

    1995-01-01

    This report is an evaluation of two techniques for predicting the fan noise radiation from engine nacelles. The first is a relatively computationally intensive finite-element technique. The code is named ARC, an abbreviation of Acoustic Radiation Code, and was developed by Eversman. This is actually a suite of software that first generates a grid around the nacelle, then solves for the potential flowfield, and finally solves the acoustic radiation problem. The second approach is an analytical technique requiring minimal computational effort. This is termed the cutoff ratio technique and was developed by Rice. Details of the duct geometry, such as the hub-to-tip ratio and the Mach number of the flow in the duct, and the modal content of the duct noise are required for proper prediction.

  17. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-05-10

    Smelling is one of the five senses, and it plays an important role in our everyday lives. Volatile compounds are, for example, characteristic of food, where some of them can be perceived by humans because of their aroma. They have a great influence on the decision making of consumers when they choose whether or not to use a product. Where a product has a strong, offensive aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economics of a product; thus, it has been of great importance to manufacturers that the aroma of their food product is characterized by analytical means to provide a basis for further optimization. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product, leading to the development of new products that are more acceptable to consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although simple, low-cost technology is an attractive research target in this domain, most of the data are generated with very high-resolution analytical instruments. Data gathered with different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we address not only the application of chemometrics for aroma analysis but also the use of different analytical instruments in this field, highlighting the research needed for future focus.

  18. Application of artificial intelligence to impulsive orbital transfers

    NASA Technical Reports Server (NTRS)

    Burns, Rowland E.

    1987-01-01

    A generalized technique for the numerical solution of any given class of problems is presented. The technique requires the analytic (or numerical) solution of every applicable equation for all variables that appear in the problem. Conditional blocks are employed to rapidly expand the set of known variables from a minimum of input. The method is illustrated via the use of the Hohmann transfer problem from orbital mechanics.
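
    The illustrative problem is the classical Hohmann transfer, whose two-impulse solution between coplanar circular orbits is standard; a minimal computation:

```python
# Standard two-impulse Hohmann transfer between coplanar circular orbits.
import math

def hohmann(r1, r2, mu=3.986004418e14):
    """Return (dv1, dv2, transfer_time) in SI units; mu defaults to Earth's GM."""
    a = 0.5 * (r1 + r2)                                        # transfer-ellipse semi-major axis
    dv1 = math.sqrt(mu / r1) * (math.sqrt(2 * r2 / (r1 + r2)) - 1)
    dv2 = math.sqrt(mu / r2) * (1 - math.sqrt(2 * r1 / (r1 + r2)))
    t = math.pi * math.sqrt(a**3 / mu)                         # half the ellipse period
    return dv1, dv2, t

dv1, dv2, t = hohmann(6678e3, 42164e3)                         # LEO (300 km altitude) to GEO
print(f"dv1 = {dv1:.0f} m/s, dv2 = {dv2:.0f} m/s, transfer = {t / 3600:.2f} h")
```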

  19. Analytical Chemistry Laboratory. Progress report for FY 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.

  20. Bioimaging of cells and tissues using accelerator-based sources.

    PubMed

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

    A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotrons) particle accelerators have been constructed worldwide to provide to the scientific community unprecedented analytical performances. Now, these facilities match at least one of the three analytical features required for the biological field: (1) a sufficient spatial resolution for single cell (<1 μm) or tissue (<1 mm) analyses, (2) a temporal resolution to follow molecular dynamics, and (3) a sensitivity in the micromolar to nanomolar range, thus allowing true investigations on biological dynamics. Third-generation synchrotrons now offer the opportunity of bioanalytical measurements at nanometer resolutions with incredible sensitivity. Linear accelerators are more specialized in their physical features but may exceed synchrotron performances. All these techniques have become irreplaceable tools for developing knowledge in biology. This review highlights the pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens.

  1. Rapid detection of terbufos in stomach contents using desorption electrospray ionization mass spectrometry.

    PubMed

    Wilson, Christina R; Mulligan, Christopher C; Strueh, Kurt D; Stevenson, Gregory W; Hooser, Stephen B

    2014-05-01

    Desorption electrospray ionization mass spectrometry (DESI-MS) is an emerging analytical technique that permits the rapid and direct analysis of biological or environmental samples under ambient conditions. Highlighting the versatility of this technique, DESI-MS has been used for the rapid detection of illicit drugs, chemical warfare agents, agricultural chemicals, and pharmaceuticals from a variety of sample matrices. In diagnostic veterinary toxicology, analyzing samples using traditional analytical instrumentation typically includes extensive sample extraction procedures, which can be time consuming and labor intensive. Therefore, efforts to expedite sample analyses are a constant goal for diagnostic toxicology laboratories. In the current report, DESI-MS was used to directly analyze stomach contents from a dog exposed to the organophosphate insecticide terbufos. The total DESI-MS analysis time required to confirm the presence of terbufos and diagnose organophosphate poisoning in this case was approximately 5 min. This highlights the potential of this analytical technique in the field of veterinary toxicology for the rapid diagnosis and detection of toxicants in biological samples. © 2014 The Author(s).

  2. Geospatial methods and data analysis for assessing distribution of grazing livestock

    USDA-ARS?s Scientific Manuscript database

    Free-ranging livestock research must begin with a well conceived problem statement and employ appropriate data acquisition tools and analytical techniques to accomplish the research objective. These requirements are especially critical in addressing animal distribution. Tools and statistics used t...

  3. Ultrasensitive detection of target analyte-induced aggregation of gold nanoparticles using laser-induced nanoparticle Rayleigh scattering.

    PubMed

    Lin, Jia-Hui; Tseng, Wei-Lung

    2015-01-01

    Detection of salt- and analyte-induced aggregation of gold nanoparticles (AuNPs) mostly relies on costly and bulky analytical instruments. To address this drawback, a portable, miniaturized, sensitive, and cost-effective detection technique is urgently required for rapid field detection and monitoring of target analytes with AuNP-based sensors. This study combined a miniaturized spectrometer with a 532-nm laser to develop a laser-induced Rayleigh scattering technique, allowing the sensitive and selective detection of Rayleigh scattering from the aggregated AuNPs. Three AuNP-based sensing systems, including salt-, thiol- and metal ion-induced aggregation of the AuNPs, were used to examine the sensitivity of the laser-induced Rayleigh scattering technique. Salt-, thiol-, and metal ion-promoted NP aggregation were exemplified by the use of aptamer-adsorbed, fluorosurfactant-stabilized, and gallic acid-capped AuNPs for probing K(+), S-adenosylhomocysteine hydrolase-induced hydrolysis of S-adenosylhomocysteine, and Pb(2+), in sequence. Compared with the reported methods for monitoring aggregated AuNPs, the proposed system provided distinct sensitivity advantages. The laser-induced Rayleigh scattering technique was made more convenient, cheap, and portable by replacing the diode laser and miniaturized spectrometer with a laser pointer and a smartphone. Using this smartphone-based detection platform, we can determine whether or not the Pb(2+) concentration exceeds the maximum allowable level of Pb(2+) in drinking water. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. The analytical representation of viscoelastic material properties using optimization techniques

    NASA Technical Reports Server (NTRS)

    Hill, S. A.

    1993-01-01

    This report presents a technique to model viscoelastic material properties with a function in the form of a Prony series. Generally, the method employed to determine the function constants requires assuming values for the exponential constants of the function and then resolving the remaining constants through linear least-squares techniques. The technique presented here allows all the constants to be determined analytically through optimization techniques. This technique is employed in a computer program named PRONY and makes use of a commercially available optimization tool developed by VMA Engineering, Inc. The PRONY program was utilized to compare the technique against previously determined models for solid rocket motor TP-H1148 propellant and V747-75 Viton fluoroelastomer. In both cases, the optimization technique generated functions that modeled the test data with at least an order-of-magnitude better correlation. This technique has demonstrated the capability to use small or large data sets and to use data sets that have uniformly or nonuniformly spaced data pairs. The reduction of experimental data to accurate mathematical models is a vital part of most scientific and engineering research. This technique of regression through optimization can be applied to other mathematical models that are difficult to fit to experimental data through traditional regression techniques.
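
    The core idea, letting the optimizer determine the exponential time constants along with the moduli instead of fixing the exponents and solving a linear problem, can be sketched as follows (a generic illustration with synthetic data, not the PRONY program):

```python
# Sketch: fit all Prony-series constants, including the time constants,
# by nonlinear least squares. Synthetic two-term relaxation data.
import numpy as np
from scipy.optimize import least_squares

def prony(t, p):
    """G(t) = g_inf + sum_i g_i * exp(-t / tau_i); p = [g_inf, g1, tau1, g2, tau2]."""
    g_inf, pairs = p[0], p[1:].reshape(-1, 2)
    return g_inf + sum(g * np.exp(-t / tau) for g, tau in pairs)

t = np.logspace(-2, 3, 60)                          # several decades of relaxation data
g_data = prony(t, np.array([1.0, 5.0, 0.1, 2.0, 50.0]))

fit = least_squares(lambda p: prony(t, p) - g_data,
                    x0=np.array([0.5, 1.0, 1.0, 1.0, 100.0]),
                    bounds=(1e-6, np.inf))          # keep moduli and time constants positive
print("recovered constants:", np.round(fit.x, 3))
```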

  5. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry in particular has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  6. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  7. Automated measurement of respiratory gas exchange by an inert gas dilution technique

    NASA Technical Reports Server (NTRS)

    Sawin, C. F.; Rummel, J. A.; Michel, E. L.

    1974-01-01

    A respiratory gas analyzer (RGA) has been developed wherein a mass spectrometer is the sole transducer required for measurement of respiratory gas exchange. The mass spectrometer maintains all signals in absolute phase relationships, precluding the need to synchronize flow and gas composition as required in other systems. The RGA system was evaluated by comparison with the Douglas bag technique. The RGA system established the feasibility of the inert gas dilution method for measuring breath-by-breath respiratory gas exchange. This breath-by-breath analytical capability permits detailed study of transient respiratory responses to exercise.

  8. Chemical and Biological Dynamics Using Droplet-Based Microfluidics.

    PubMed

    Dressler, Oliver J; Casadevall I Solvas, Xavier; deMello, Andrew J

    2017-06-12

    Recent years have witnessed an increased use of droplet-based microfluidic techniques in a wide variety of chemical and biological assays. Nevertheless, obtaining dynamic data from these platforms has remained challenging, as this often requires reading the same droplets (possibly thousands of them) multiple times over a wide range of intervals (from milliseconds to hours). In this review, we introduce the elemental techniques for the formation and manipulation of microfluidic droplets, together with the most recent developments in these areas. We then discuss a wide range of analytical methods that have been successfully adapted for analyte detection in droplets. Finally, we highlight a diversity of studies where droplet-based microfluidic strategies have enabled the characterization of dynamic systems that would otherwise have remained unexplorable.

  9. Analytical Techniques and the Air Force Logistics Readiness Officer

    DTIC Science & Technology

    2008-03-01

    valuable within business schools (Parker, Pettitjohn and Keillor, 1999) and among leaders of the transportation and logistics industry (Parker, Kent and Brown, 2001). Parker, Pettitjohn and Keillor (1999) found that at least 90% of undergraduate business schools required either one or two statistics

  10. CHEMICAL ANALYSIS OF WET SCRUBBERS UTILIZING ION CHROMATOGRAPHY

    EPA Science Inventory

    The report describes the key elements required to develop a sampling and analysis program for a wet scrubber using ion chromatography as the main analytical technique. The first part of the report describes a sampling program for two different types of wet scrubbers: the venturi/...

  11. Reliable screening of various foodstuffs with respect to their irradiation status: A comparative study of different analytical techniques

    NASA Astrophysics Data System (ADS)

    Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho

    2013-10-01

    Cost-effective and time-efficient analytical techniques are required to screen large food lots according to their irradiation status. Gamma-irradiated (0-10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), the direct epifluorescent filter technique/aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red peppers. Black pepper had intermediate results (700-5000 PCs), while paprika had low sensitivity (negative results) upon irradiation. The DEFT/APC technique also gave clear screening results through the changes in microbial profiles, where the best results were found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination in volatile profiles upon irradiation through principal component analysis. These methods show potential for application in the screening analysis of irradiated foods.
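
    The PSL decision rule quoted above (photon-count thresholds of 700 and 5000) amounts to a three-way screen; a trivial sketch with illustrative counts:

```python
# Three-way PSL screening rule using the thresholds quoted in the abstract.
def psl_screen(photon_counts):
    if photon_counts < 700:
        return "negative (not irradiated)"
    if photon_counts > 5000:
        return "positive (irradiated)"
    return "intermediate -- confirm with another method (e.g., thermoluminescence)"

for sample, counts in [("cinnamon", 12500), ("paprika", 420), ("black pepper", 2300)]:
    print(f"{sample}: {counts} PCs -> {psl_screen(counts)}")
```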

  12. Electrospray Modifications for Advancing Mass Spectrometric Analysis

    PubMed Central

    Meher, Anil Kumar; Chen, Yu-Chie

    2017-01-01

    Generation of analyte ions in the gas phase is a primary requirement for mass spectrometric analysis. One of the ionization techniques that can be used to generate gas-phase ions is electrospray ionization (ESI). ESI is a soft ionization method that can be used to analyze analytes ranging from small organics to large biomolecules. Numerous ionization techniques derived from ESI have been reported in the past two decades. These ion sources aim at simplicity and ease of operation. Many of these ionization methods allow the flexibility to eliminate or minimize sample preparation steps prior to mass spectrometric analysis. Such ion sources have opened up new possibilities for taking on scientific challenges that might be limited by the conventional ESI technique. Thus, the number of ESI variants continues to increase. This review provides an overview of ionization techniques based on the use of electrospray reported in recent years. A brief discussion of the instrumentation, underlying processes, and selected applications is also presented. PMID:28573082

  13. A technique for computation of noise temperature due to a beam waveguide shroud

    NASA Technical Reports Server (NTRS)

    Veruttipong, W.; Franco, M. M.

    1993-01-01

    Direct analytical computation of the noise temperature of real beam waveguide (BWG) systems, including all mirrors and the surrounding shroud, is an extremely complex problem and virtually impossible to achieve. Yet the DSN antennas are required to be ultra low-noise in order to be effective, and a reasonably accurate prediction is essential. This article presents a relatively simple technique to compute a real BWG system noise temperature by combining analytical techniques with data from experimental tests. Specific expressions and parameters for X-band (8.45-GHz) BWG noise computation are obtained for DSS 13 and DSS 24, now under construction. These expressions are also valid for various conditions of the BWG feed systems, including horn sizes and positions, and mirror sizes, curvatures, and positions. Parameters for S- and Ka-bands (2.3 and 32.0 GHz) have not been determined; however, those can be obtained following the same procedure as for X-band.

  14. Recent development of electrochemiluminescence sensors for food analysis.

    PubMed

    Hao, Nan; Wang, Kun

    2016-10-01

    Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, their applications are sometimes limited by the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassays, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating some selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, residual drugs, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection of various food contaminants in complex matrixes. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    This report summarizes the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 2000 (October 1999 through September 2000). This annual progress report, which is the seventeenth in this series for the ACL, describes effort on continuing projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The ACL operates within the ANL system as a full-cost-recovery service center, but it has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients--Argonne National Laboratory, the Department of Energy, and others--and will conduct world-class research and development in analytical chemistry and its applications. The ACL handles a wide range of analytical problems that reflects the diversity of research and development (R&D) work at ANL. Some routine or standard analyses are done, but the ACL operates more typically in a problem-solving mode in which development of methods is required or adaptation of techniques is needed to obtain useful analytical data. The ACL works with clients and commercial laboratories if a large number of routine analyses are required. Much of the support work done by the ACL is very similar to applied analytical chemistry research work.

  16. Comparison of Commercial Electromagnetic Interface Test Techniques to NASA Electromagnetic Interference Test Techniques

    NASA Astrophysics Data System (ADS)

    Smith, V.

    2000-11-01

    This report documents the development of analytical techniques required for interpreting and comparing space systems electromagnetic interference test data with commercial electromagnetic interference test data using NASA Specification SSP 30237A "Space Systems Electromagnetic Emission and Susceptibility Requirements for Electromagnetic Compatibility." The PSpice computer simulation results and the laboratory measurements for the test setups under study compare well. The study results, however, indicate that the transfer function required to translate test results of one setup to another is highly dependent on cables and their actual layout in the test setup. Since cables are equipment specific and are not specified in the test standards, developing a transfer function that would cover all cable types (random, twisted, or coaxial), sizes (gauge number and length), and layouts (distance from the ground plane) is not practical.

  17. Comparison of Commercial Electromagnetic Interface Test Techniques to NASA Electromagnetic Interference Test Techniques

    NASA Technical Reports Server (NTRS)

    Smith, V.; Minor, J. L. (Technical Monitor)

    2000-01-01

    This report documents the development of analytical techniques required for interpreting and comparing space systems electromagnetic interference test data with commercial electromagnetic interference test data using NASA Specification SSP 30237A "Space Systems Electromagnetic Emission and Susceptibility Requirements for Electromagnetic Compatibility." The PSpice computer simulation results and the laboratory measurements for the test setups under study compare well. The study results, however, indicate that the transfer function required to translate test results of one setup to another is highly dependent on cables and their actual layout in the test setup. Since cables are equipment specific and are not specified in the test standards, developing a transfer function that would cover all cable types (random, twisted, or coaxial), sizes (gauge number and length), and layouts (distance from the ground plane) is not practical.

  18. Recent Advances in Bioprinting and Applications for Biosensing

    PubMed Central

    Dias, Andrew D.; Kingsley, David M.; Corr, David T.

    2014-01-01

    Future biosensing applications will require high performance, including real-time monitoring of physiological events, incorporation of biosensors into feedback-based devices, detection of toxins, and advanced diagnostics. Such functionality will necessitate biosensors with increased sensitivity, specificity, and throughput, as well as the ability to simultaneously detect multiple analytes. While these demands have yet to be fully realized, recent advances in biofabrication may allow sensors to achieve the high spatial sensitivity required, and bring us closer to achieving devices with these capabilities. To this end, we review recent advances in biofabrication techniques that may enable cutting-edge biosensors. In particular, we focus on bioprinting techniques (e.g., microcontact printing, inkjet printing, and laser direct-write) that may prove pivotal to biosensor fabrication and scaling. Recent biosensors have employed these fabrication techniques with success, and further development may enable higher performance, including multiplexing multiple analytes or cell types within a single biosensor. We also review recent advances in 3D bioprinting, and explore their potential to create biosensors with live cells encapsulated in 3D microenvironments. Such advances in biofabrication will expand biosensor utility and availability, with impact realized in many interdisciplinary fields, as well as in the clinic. PMID:25587413

  19. Philosophical Inquiry: (An Investigation of Basic Philosophical Presuppositions) Teacher's Manual.

    ERIC Educational Resources Information Center

    Institute for Services to Education, Inc., Washington, DC.

    This guide provides teaching techniques for an undergraduate philosophy course. Students examine specific philosophic issues related to the black person's experience. They are required to apply critical and analytical procedures leading to philosophical investigations of topics of both philosophical and nonphilosophical origins. The teaching…

  20. New techniques for imaging and analyzing lung tissue.

    PubMed Central

    Roggli, V L; Ingram, P; Linton, R W; Gutknecht, W F; Mastin, P; Shelburne, J D

    1984-01-01

    The recent technological revolution in the field of imaging techniques has provided pathologists and toxicologists with an expanding repertoire of analytical techniques for studying the interaction between the lung and the various exogenous materials to which it is exposed. Analytical problems requiring elemental sensitivity or specificity beyond the range of that offered by conventional scanning electron microscopy and energy dispersive X-ray analysis are particularly appropriate for the application of these newer techniques. Electron energy loss spectrometry, Auger electron spectroscopy, secondary ion mass spectrometry, and laser microprobe mass analysis each offer unique advantages in this regard, but also possess their own limitations and disadvantages. Diffraction techniques provide crystalline structural information available through no other means. Bulk chemical techniques provide useful cross-checks on the data obtained by microanalytical approaches. It is the purpose of this review to summarize the methodology of these techniques, acknowledge situations in which they have been used in addressing problems in pulmonary toxicology, and comment on the relative advantages and disadvantages of each approach. It is necessary for an investigator to weigh each of these factors when deciding which technique is best suited for any given analytical problem; often it is useful to employ a combination of two or more of the techniques discussed. It is anticipated that there will be increasing utilization of these technologies for problems in pulmonary toxicology in the decades to come. PMID:6090115

  1. Nanostructured solid substrates for efficient laser desorption/ionization mass spectrometry (LDI-MS) of low molecular weight compounds.

    PubMed

    Silina, Yuliya E; Volmer, Dietrich A

    2013-12-07

    Analytical applications often require rapid measurement of compounds from complex sample mixtures. High-speed mass spectrometry approaches frequently utilize techniques based on direct ionization of the sample by laser irradiation, mostly by means of matrix-assisted laser desorption/ionization (MALDI). Compounds of low molecular weight are difficult to analyze by MALDI, however, because of severe interferences in the low m/z range from the organic matrix used for desorption/ionization. In recent years, surface-assisted laser desorption/ionization (SALDI) techniques have shown promise for small molecule analysis, due to the unique properties of nanostructured surfaces, in particular, the lack of a chemical background in the low m/z range and enhanced production of analyte ions by SALDI. This short review article presents a summary of the most promising recent developments in SALDI materials for MS analysis of low molecular weight analytes, with emphasis on nanostructured materials based on metals and semiconductors.

  2. Guidance to Achieve Accurate Aggregate Quantitation in Biopharmaceuticals by SV-AUC.

    PubMed

    Arthur, Kelly K; Kendrick, Brent S; Gabrielson, John P

    2015-01-01

    The levels and types of aggregates present in protein biopharmaceuticals must be assessed during all stages of product development, manufacturing, and storage of the finished product. Routine monitoring of aggregate levels in biopharmaceuticals is typically achieved by size exclusion chromatography (SEC) due to its high precision, speed, robustness, and simplicity to operate. However, SEC is error prone and requires careful method development to ensure accuracy of reported aggregate levels. Sedimentation velocity analytical ultracentrifugation (SV-AUC) is an orthogonal technique that can be used to measure protein aggregation without many of the potential inaccuracies of SEC. In this chapter, we discuss applications of SV-AUC during biopharmaceutical development and how characteristics of the technique make it better suited for some applications than others. We then discuss the elements of a comprehensive analytical control strategy for SV-AUC. Successful implementation of these analytical control elements ensures that SV-AUC provides continued value over the long time frames necessary to bring biopharmaceuticals to market. © 2015 Elsevier Inc. All rights reserved.
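
    For context, a standard result rather than material from this chapter: SV-AUC experiments are interpreted by fitting the measured radial concentration profiles c(r, t) to solutions of the Lamm equation for combined sedimentation and diffusion in a cell spinning at angular velocity \omega,

    \[
    \frac{\partial c}{\partial t} = \frac{1}{r}\,\frac{\partial}{\partial r}\!\left[\, r\, D\, \frac{\partial c}{\partial r} \;-\; s\,\omega^{2} r^{2} c \,\right],
    \]

    where s is the sedimentation coefficient and D the diffusion coefficient; aggregates show up as additional species at larger s in the fitted c(s) distribution, which is what makes the technique orthogonal to SEC.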

  3. Comments on "A Closed-Form Solution to Tensor Voting: Theory and Applications".

    PubMed

    Maggiori, Emmanuel; Lotito, Pablo; Manterola, Hugo Luis; del Fresno, Mariana

    2014-12-01

    We comment on a paper that describes a closed-form formulation of Tensor Voting, a technique for perceptually grouping clouds of points, usually applied to infer features in images. The authors proved an analytic solution to the technique, a highly relevant contribution considering that the original formulation required numerical integration, a time-consuming task. Their work constitutes the first closed-form expression for the Tensor Voting framework. In this work we first observe that the proposed formulation leads to unexpected results which do not satisfy the constraints for a Tensor Voting output, and hence cannot be interpreted. Given that the closed-form expression is said to be an analytically equivalent solution, unexpected outputs should not be encountered unless there are flaws in the proof. We analyzed the underlying math to find the causes of these unexpected results. In this commentary we show that their proposal does not in fact provide a proper analytic solution to Tensor Voting, and we indicate the flaws in the proof.
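
    The constraint at issue can be stated compactly. In the standard 2-D formulation of Tensor Voting (a textbook statement, not taken from the commented paper), a valid output at each site is a positive semi-definite second-order tensor

    \[
    T = \lambda_1\, e_1 e_1^{\mathsf{T}} + \lambda_2\, e_2 e_2^{\mathsf{T}}, \qquad \lambda_1 \ge \lambda_2 \ge 0,
    \]

    where \lambda_1 - \lambda_2 is read as stick (curve) saliency and \lambda_2 as ball saliency; an output with a negative eigenvalue admits no such saliency interpretation, which is the sense in which the unexpected results "cannot be interpreted."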

  4. Tutorial in medical decision modeling incorporating waiting lines and queues using discrete event simulation.

    PubMed

    Jahn, Beate; Theurl, Engelbert; Siebert, Uwe; Pfeiffer, Karl-Peter

    2010-01-01

    In most decision-analytic models in health care, it is assumed that there is treatment without delay and availability of all required resources. Therefore, waiting times caused by limited resources, and their impact on treatment effects and costs, often remain unconsidered. Queuing theory enables mathematical analysis and the derivation of several performance measures of queuing systems. Nevertheless, an analytical approach with closed formulas is not always possible. Therefore, simulation techniques such as discrete event simulation are used to evaluate systems that include queuing or waiting. Including queuing in decision-analytic models requires a basic knowledge of queuing theory and of the underlying interrelationships. This tutorial introduces queuing theory. Analysts and decision-makers gain an understanding of queue characteristics, modeling features, and the strengths of the approach. Conceptual issues are covered, but the emphasis is on practical issues like modeling the arrival of patients. The treatment of coronary artery disease with percutaneous coronary intervention including stent placement serves as an illustrative queuing example. Discrete event simulation is applied to explicitly model resource capacities and to incorporate waiting lines and queues in the decision-analytic modeling example.
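
    As a concrete illustration of the closed formulas queuing theory does provide when its assumptions hold (a textbook result, not from the tutorial itself), the M/M/1 queue with Poisson arrivals at rate \lambda and exponential service at rate \mu > \lambda has utilization, mean queue length, and mean waiting time

    \[
    \rho = \frac{\lambda}{\mu}, \qquad L_q = \frac{\rho^{2}}{1-\rho}, \qquad W_q = \frac{L_q}{\lambda} = \frac{\rho}{\mu - \lambda}.
    \]

    When such closed forms break down (priority rules, balking, time-varying arrivals), discrete event simulation takes over, which is precisely the situation the tutorial addresses.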

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vincenti, H.; Vay, J. -L.

    Due to discretization effects and truncation to finite domains, many electromagnetic simulations present non-physical modifications of Maxwell's equations in space that may generate spurious signals affecting the overall accuracy of the result. Such modifications for instance occur when Perfectly Matched Layers (PMLs) are used at simulation domain boundaries to simulate open media. Another example is the use of arbitrary-order Maxwell solvers with domain decomposition techniques that may, under some conditions, involve stencil truncations at subdomain boundaries, resulting in small spurious errors that eventually build up. In each case, a careful evaluation of the characteristics and magnitude of the errors resulting from these approximations, and their impact at any frequency and angle, requires detailed analytical and numerical studies. To this end, we present a general analytical approach that enables the evaluation of numerical discretization errors of fully three-dimensional, arbitrary-order finite-difference Maxwell solvers, with arbitrary modification of the local stencil in the simulation domain. The analytical model is validated against simulations of the domain decomposition technique and PMLs, when these are used with very high-order Maxwell solvers, as well as in the infinite-order limit of pseudo-spectral solvers. Results confirm that the new analytical approach enables exact predictions in each case. It also confirms that the domain decomposition technique can be used with very high-order Maxwell solvers and a reasonably low number of guard cells with negligible effects on the whole accuracy of the simulation.
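
    For a concrete sense of how discretization modifies Maxwell's equations (the classic second-order Yee result, quoted for orientation rather than the authors' generalized model), plane waves on a grid with steps \Delta x, \Delta y, \Delta z and time step \Delta t satisfy the numerical dispersion relation

    \[
    \left[\frac{1}{c\,\Delta t}\,\sin\frac{\omega\,\Delta t}{2}\right]^{2} = \sum_{i\in\{x,y,z\}} \left[\frac{1}{\Delta i}\,\sin\frac{k_i\,\Delta i}{2}\right]^{2},
    \]

    which recovers the physical relation \omega^{2} = c^{2}\lvert\mathbf{k}\rvert^{2} only as the steps tend to zero; the residual, direction-dependent error is the kind of spurious numerical effect the analytical approach above quantifies for arbitrary-order stencils.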

  6. A fast analytical undulator model for realistic high-energy FEL simulations

    NASA Astrophysics Data System (ADS)

    Tatchyn, R.; Cremer, T.

    1997-02-01

    A number of leading FEL simulation codes used for modeling gain in the ultralong undulators required for SASE saturation in the <100 Å range employ simplified analytical models both for field and error representations. Although it is recognized that both the practical and theoretical validity of such codes could be enhanced by incorporating realistic undulator field calculations, the computational cost of doing this can be prohibitive, especially for point-to-point integration of the equations of motion through each undulator period. In this paper we describe a simple analytical model suitable for modeling realistic permanent magnet (PM), hybrid/PM, and non-PM undulator structures, and discuss selected techniques for minimizing computation time.
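
    For orientation, these are the standard idealized relations such simplified models start from (not the paper's refined model): a planar undulator field and its dimensionless deflection parameter

    \[
    B_y(z) = B_0 \cos\!\left(\frac{2\pi z}{\lambda_u}\right), \qquad K = \frac{e B_0 \lambda_u}{2\pi m_e c} \approx 0.0934\; B_0[\mathrm{T}]\;\lambda_u[\mathrm{mm}],
    \]

    where \lambda_u is the undulator period; realistic PM and hybrid/PM models add field errors and end effects on top of this ideal harmonic dependence.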

  7. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
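
    A minimal, hypothetical sketch (in Python; the operator names and graph representation are illustrative assumptions, not the paper's API) of how "selection" and "aggregation" can work as composable atomic operators over attributed graph nodes:

        from collections import defaultdict

        # Toy attributed node set: node id -> attributes (illustrative only).
        nodes = {
            "a": {"group": 1, "weight": 2.0},
            "b": {"group": 1, "weight": 3.0},
            "c": {"group": 2, "weight": 5.0},
        }

        def select(node_set, predicate):
            """Selection operator: keep nodes whose attributes satisfy a predicate."""
            return {n: a for n, a in node_set.items() if predicate(a)}

        def aggregate(node_set, key, reducer):
            """Aggregation operator: collapse nodes sharing `key` into super-nodes."""
            buckets = defaultdict(list)
            for attrs in node_set.values():
                buckets[attrs[key]].append(attrs["weight"])
            return {k: reducer(v) for k, v in buckets.items()}

        heavy = select(nodes, lambda a: a["weight"] > 2.0)   # nodes b and c
        supernodes = aggregate(nodes, "group", sum)          # {1: 5.0, 2: 5.0}
        print(sorted(heavy), supernodes)

    Because each operator maps a node set to a node set (or super-node set), operators compose, which is the algebraic property that supports documenting and replaying an analysis.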

  8. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    PubMed

    Manickum, Thavrin; John, Wilson

    2015-07-01

    The availability of national test centers to offer a routine service for analysis and quantitation of some selected steroid hormones [natural estrogens (17-β-estradiol, E2; estrone, E1; estriol, E3), synthetic estrogen (17-α-ethinylestradiol, EE2), androgen (testosterone), and progestogen (progesterone)] in wastewater matrix was investigated; corresponding internationally used chemical- and immuno-analytical test methods were reviewed. The enzyme-linked immunosorbent assay (ELISA) (immuno-analytical technique) was also assessed for its suitability as a routine test method to quantitate the levels of these hormones at a sewage/wastewater treatment plant (WTP) (Darvill, Pietermaritzburg, South Africa) over a 2-year period. The method performance and other relevant characteristics of the immuno-analytical ELISA method were compared to conventional chemical-analytical methodology, like gas/liquid chromatography-mass spectrometry (GC/LC-MS) and GC/LC-tandem mass spectrometry (MS/MS), for quantitation of the steroid hormones in wastewater and environmental waters. The national immuno-analytical ELISA technique was found to be sensitive (LOQ 5 ng/L, LOD 0.2-5 ng/L), accurate (mean recovery 96%), precise (RSD 7-10%), and cost-effective for screening and quantitation of these steroid hormones in wastewater and environmental water matrices. A survey of the most current international literature indicates a fairly equal use of the LC-MS/MS, GC-MS/MS (chemical-analytical), and ELISA (immuno-analytical) test methods for screening and quantitation of the target steroid hormones in both water and wastewater matrices. Internationally, the observed sensitivity, based on LOQ (ng/L), for the steroid estrogens E1, E2, and EE2 is, in decreasing order: LC-MS/MS (0.08-9.54) > GC-MS (1) > ELISA (5) (chemical-analytical > immuno-analytical). At the national level, the routine, unoptimized chemical-analytical LC-MS/MS method was found to lack the required sensitivity for meeting environmental requirements for steroid hormone quantitation. Further optimization of the sensitivity of the chemical-analytical LC-tandem mass spectrometry methods, especially for wastewater screening, in South Africa is required. Risk assessment studies showed that it was not practical to propose standards or allowable limits for the steroid estrogens E1, E2, EE2, and E3; the use of predicted-no-effect concentration values of the steroid estrogens appears to be appropriate for use in their risk assessment in relation to aquatic organisms. For raw water sources, drinking water, and raw and treated wastewater, the use of bioassays with trigger values is a useful screening tool option to decide whether further examination of specific endocrine activity may be warranted, or whether concentrations of such activity are of low priority with respect to health concerns in the human population. The achievement of improved quantitation limits for immuno-analytical methods like ELISA used for compound quantitation, and standardization of the method for measuring E2 equivalents (EEQs) used for biological activity (endocrine, e.g., estrogenic), are some areas for future EDC research.

  9. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace levels of steroid estrogen concentrations in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay, and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its low detection limit, simplicity, rapidity, and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
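
    As a minimal sketch of how an ELISA readout is typically converted into a concentration (the four-parameter logistic model is the common calibration choice; the numbers below are invented illustration values, not data from the review):

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(x, a, b, c, d):
            """4PL calibration: response vs. concentration x; c is the inflection point."""
            return d + (a - d) / (1.0 + (x / c) ** b)

        conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0, 100.0])  # ng/L (illustrative)
        resp = np.array([1.9, 1.8, 1.6, 1.1, 0.8, 0.4, 0.3])      # absorbance (illustrative)

        popt, _ = curve_fit(four_pl, conc, resp, p0=[2.0, 1.0, 5.0, 0.2], maxfev=10000)

        def back_calculate(y, a, b, c, d):
            """Invert the fitted 4PL to estimate concentration from a measured response."""
            return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

        print(back_calculate(1.0, *popt))  # concentration estimate for response 1.0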

  10. Determining the Kinetic Parameters Characteristic of Microalgal Growth.

    ERIC Educational Resources Information Center

    Martinez Sancho, Maria Eugenie; And Others

    1991-01-01

    An activity in which students obtain a growth curve for algae, identify the exponential and linear growth phases, and calculate the parameters which characterize both phases is described. The procedure, a list of required materials, experimental conditions, analytical technique, and a discussion of the interpretations of individual results are…

  11. Effectiveness of Web-Based Psychological Interventions for Depression: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cowpertwait, Louise; Clarke, Dave

    2013-01-01

    Web-based psychological interventions aim to make psychological treatments more accessible and minimize clinician input, but their effectiveness requires further examination. The purposes of the present study are to evaluate the outcomes of web-based interventions for treating depressed adults using meta-analytic techniques, and to examine…

  12. An Application of Trimethylsilyl Derivatives with Temperature Programmed Gas Chromatography to the Senior Analytical Laboratory.

    ERIC Educational Resources Information Center

    Kelter, Paul B.; Carr, James D.

    1983-01-01

    Describes an experiment designed to teach temperature programmed gas chromatography (TPGC) techniques and the importance of derivatizing many classes of substrates to be separated. Includes equipment needed, procedures for making trimethylsilyl derivatives, applications, sample calculations, and typical results. Procedure required one, three-hour…

  13. Human Modelling for Military Application (Applications militaires de la modelisation humaine)

    DTIC Science & Technology

    2010-10-01

    techniques (rooted in the mathematics-centered analytic methods arising from World War I analyses by Lanchester). Recent requirements for research and… [Footnotes in the record: "Dry Shooting for Airplane Gunners," Popular Science Monthly, January 1919, p. 13-14; Lanchester, F.W., "Mathematics in Warfare," in The World of…]

  14. 10 CFR 503.34 - Inability to comply with applicable environmental requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...

  15. 10 CFR 503.34 - Inability to comply with applicable environmental requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...

  16. DOSY Analysis of Micromolar Analytes: Resolving Dilute Mixtures by SABRE Hyperpolarization.

    PubMed

    Reile, Indrek; Aspers, Ruud L E G; Tyburn, Jean-Max; Kempf, James G; Feiters, Martin C; Rutjes, Floris P J T; Tessari, Marco

    2017-07-24

    DOSY is an NMR spectroscopy technique that resolves resonances according to the analytes' diffusion coefficients. It has found use in correlating NMR signals and estimating the number of components in mixtures. Applications of DOSY in dilute mixtures are, however, held back by excessively long measurement times. We demonstrate herein, how the enhanced NMR sensitivity provided by SABRE hyperpolarization allows DOSY analysis of low-micromolar mixtures, thus reducing the concentration requirements by at least 100-fold. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
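
    The diffusion encoding underlying DOSY is the standard pulsed-field-gradient (Stejskal-Tanner) attenuation, quoted here for context rather than from the paper itself:

    \[
    \frac{I}{I_0} = \exp\!\left[-D\,\gamma^{2} g^{2} \delta^{2}\left(\Delta - \frac{\delta}{3}\right)\right],
    \]

    where D is the diffusion coefficient, \gamma the gyromagnetic ratio, g and \delta the gradient strength and duration, and \Delta the diffusion delay; fitting I/I_0 across a gradient ramp yields the per-resonance D values by which the mixture components are resolved.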

  17. "Cork taint" responsible compounds. Determination of haloanisoles and halophenols in cork matrix: A review.

    PubMed

    Tarasov, Andrii; Rauhut, Doris; Jung, Rainer

    2017-12-01

    Analytical methods for the quantification of haloanisoles and halophenols in cork matrix are summarized in the current review. Sample-preparation and sample-treatment techniques are compared and discussed from the perspective of their efficiency, time and extractant optimization, and ease of performance. The primary interest of these analyses is usually 2,4,6-trichloroanisole (TCA), which is the major wine contaminant among the haloanisoles. Two concepts of TCA determination are described in the review: releasable TCA and total TCA analyses. Chromatographic, bioanalytical, and sensorial methods are compared according to their application in the cork industry and in scientific investigations. Finally, it is shown that modern analytical techniques are able to provide the required sensitivity, selectivity, and repeatability for haloanisole and halophenol determination. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Advanced analytical electron microscopy for alkali-ion batteries

    DOE PAGES

    Qian, Danna; Ma, Cheng; Meng, Ying Shirley; ...

    2015-06-26

    Lithium-ion batteries are a leading candidate for electric vehicle and smart grid applications. However, further optimizations of the energy/power density, coulombic efficiency, and cycle life are still needed, and this requires a thorough understanding of the dynamic evolution of each component and their synergistic behaviors during battery operation. With the capability of resolving the structure and chemistry at atomic resolution, advanced analytical transmission electron microscopy (AEM) is an ideal technique for this task. The present review paper focuses on recent contributions of this important technique to the fundamental understanding of the electrochemical processes of battery materials. A detailed review of both static (ex situ) and real-time (in situ) studies will be given, and issues that still need to be addressed will be discussed.

  19. Analytical techniques for characterization of cyclodextrin complexes in aqueous solution: a review.

    PubMed

    Mura, Paola

    2014-12-01

    Cyclodextrins are cyclic oligosaccharides endowed with a hydrophilic outer surface and a hydrophobic inner cavity, able to form inclusion complexes with a wide variety of guest molecules, positively affecting their physicochemical properties. In particular, in the pharmaceutical field, cyclodextrin complexation is mainly used to increase the aqueous solubility and dissolution rate of poorly soluble drugs, and to enhance their bioavailability and stability. Analytical characterization of host-guest interactions is of fundamental importance for fully exploiting the potential benefits of complexation, helping in selection of the most appropriate cyclodextrin. The assessment of the actual formation of a drug-cyclodextrin inclusion complex and its full characterization is not a simple task and often requires the use of different analytical methods, whose results have to be combined and examined together. The purpose of the present review is to give, as much as possible, a general overview of the main analytical tools which can be employed for the characterization of drug-cyclodextrin inclusion complexes in solution, with emphasis on their respective potential merits, disadvantages and limits. Further, the applicability of each examined technique is illustrated and discussed by specific examples from literature. Copyright © 2014 Elsevier B.V. All rights reserved.
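
    One workhorse from that toolbox, given here in its standard textbook form rather than as a result of the review: for a 1:1 drug-cyclodextrin complex, the stability constant follows from a linear (A_L-type) phase-solubility diagram via the Higuchi-Connors relation

    \[
    K_{1:1} = \frac{\mathrm{slope}}{S_0\,(1 - \mathrm{slope})},
    \]

    where S_0 is the intrinsic solubility of the drug and the slope is that of total dissolved drug plotted against total cyclodextrin concentration.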

  20. Recombinant drugs-on-a-chip: The usage of capillary electrophoresis and trends in miniaturized systems - A review.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel

    2016-09-07

    We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems, foreseeing a possible unique recombinant drug-on-a-chip device. The analysis of recombinant protein drugs and/or pro-drugs requires sensitive and reproducible analytical techniques for quality control, to ensure the safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low cost could become a major trend in recombinant drug and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative for testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Comprehensive two-dimensional gas chromatography and food sensory properties: potential and challenges.

    PubMed

    Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo

    2015-01-01

    Modern omics disciplines dealing with food flavor focus the analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.) by means of a comprehensive, integrated treatment of sample constituents, such as physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit fully this potential. This review focuses on advances in comprehensive two-dimensional gas chromatography and analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.

  2. Influence of cross section variations on the structural behaviour of composite rotor blades

    NASA Astrophysics Data System (ADS)

    Rapp, Helmut; Woerndle, Rudolf

    1991-09-01

    A highly sophisticated structural analysis is required for helicopter rotor blades with nonhomogeneous cross sections made from nonisotropic material. Combinations of suitable analytical techniques with FEM-based techniques permit a cost effective and sufficiently accurate analysis of these complicated structures. It is determined that in general the 1D engineering theory of bending combined with 2D theories for determining the cross section properties is sufficient to describe the structural blade behavior.

  3. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  4. Application of the radioisotope excited X-ray fluorescence technique in charge optimization during thermite smelting of Fe-Ni, Fe-Cr, and Fe-Ti alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, I.G.; Joseph, D.; Lal, M.

    1995-10-01

    A wide range of ferroalloys are used to facilitate the addition of different alloying elements to molten steel. High-carbon ferroalloys are produced on a tonnage basis by carbothermic smelting in an electric furnace, and an aluminothermic route is generally adopted for small-scale production of low-carbon varieties. The physicochemical principles of carbothermy and aluminothermy have been well documented in the literature. However, limited technical data are reported on the production of individual low-carbon ferroalloys from their selected resources. The authors demonstrate here the application of an energy dispersive X-ray fluorescence (EDXRF) technique in meeting the analytical requirements of a thermite smelting campaign, carried out with the aim of preparing low-carbon, low-nitrogen Fe-Ni, Fe-Cr, and Fe-Ti alloys from indigenously available nickel-bearing spent catalyst, mineral chromite, and ilmenite/rutile, respectively. They chose the EDXRF technique to meet the analytical requirements because of its capability to analyze samples of ores, minerals, metals, and alloys in different forms, such as powder, sponge, as-smelted, or as-cast, to obtain rapid multielement analyses with ease. Rapid analyses of thermite feed and product by this technique have aided in the appropriate alterations of the charge constituents to obtain optimum charge consumption.

  5. Many-core graph analytics using accelerated sparse linear algebra routines

    NASA Astrophysics Data System (ADS)

    Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric

    2016-05-01

    Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without requiring the customer to make any changes to their analytics code, thanks to compatibility with existing graph APIs.
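
    A small, self-contained illustration (in Python with SciPy; a sketch of the idea, not the product described above) of the GraphBLAS premise that graph traversal is sparse linear algebra: one breadth-first-search step is a sparse matrix-vector product.

        import numpy as np
        from scipy.sparse import csr_matrix

        # Directed toy graph with edges 0->1, 0->2, 1->3, 2->3.
        # A[j, i] = 1 encodes edge i -> j, so A @ frontier advances one hop.
        A = csr_matrix((np.ones(4), ([1, 2, 3, 3], [0, 0, 1, 2])), shape=(4, 4))

        frontier = np.zeros(4)
        frontier[0] = 1.0                  # start BFS at vertex 0
        visited = frontier.copy()
        levels, depth = {0: 0}, 0
        while frontier.any():
            depth += 1
            frontier = A @ frontier        # one BFS level as an SpMV
            frontier[visited > 0] = 0.0    # mask out already-visited vertices
            for v in np.nonzero(frontier)[0]:
                levels[int(v)] = depth
            visited += frontier
        print(levels)                      # {0: 0, 1: 1, 2: 1, 3: 2}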

  6. Sensitive molecular diagnostics using surface-enhanced resonance Raman scattering (SERRS)

    NASA Astrophysics Data System (ADS)

    Faulds, Karen; Graham, Duncan; McKenzie, Fiona; MacRae, Douglas; Ricketts, Alastair; Dougan, Jennifer

    2009-02-01

    Surface enhanced resonance Raman scattering (SERRS) is an analytical technique with several advantages over competitive techniques in terms of improved sensitivity and multiplexing. We have made great progress in the development of SERRS as a quantitative analytical method, in particular for the detection of DNA. SERRS is an extremely sensitive and selective technique which, when applied to the detection of labelled DNA sequences, allows detection limits to be obtained that rival, and in most cases better, those of fluorescence. Here the conditions are explored which enable the successful detection of DNA using SERRS. The enhancing surface which is used is crucial; in this case suspensions of nanoparticles were used, as they allow quantitative behaviour to be achieved and allow systems analogous to current fluorescence-based systems to be made. The aggregation conditions required to obtain SERRS of DNA are crucial, and herein we describe the use of spermine as an aggregating agent. The nature of the label which is used, be it fluorescent, positively or negatively charged, also affects the SERRS response, and these conditions are again explored here. We have clearly demonstrated the ability to identify the components of a mixture of 5 analytes in solution by using two different excitation wavelengths, and also of a 6-plex using data analysis techniques. These conditions will allow the use of SERRS for the detection of target DNA in a meaningful diagnostic assay.

  7. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Feng; Liu, Yijin; Yu, Xiqian

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as one of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy, and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  8. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE PAGES

    Lin, Feng; Liu, Yijin; Yu, Xiqian; ...

    2017-08-30

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as one of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy, and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  9. Applying predictive analytics to develop an intelligent risk detection application for healthcare contexts.

    PubMed

    Moghimi, Fatemeh Hoda; Cheung, Michael; Wickramasinghe, Nilmini

    2013-01-01

    Healthcare is an information-rich industry where successful outcomes require the processing of multi-spectral data and sound decision making. The exponential growth of data and big data issues, coupled with a rapid increase of service demands in healthcare contexts today, requires a robust framework enabled by IT (information technology) solutions as well as real-time service handling in order to ensure superior decision making and successful healthcare outcomes. Such a context is appropriate for the application of real-time intelligent risk detection decision support systems using predictive analytic techniques such as data mining. To illustrate the power and potential of data science technologies in healthcare decision making scenarios, the use of an intelligent risk detection (IRD) model is proffered for the context of Congenital Heart Disease (CHD) in children, an area which requires complex high-risk decisions that need to be made expeditiously and accurately in order to ensure successful healthcare outcomes.

  10. Analytical, Numerical, and Experimental Investigation on a Non-Contact Method for the Measurements of Creep Properties of Ultra-High-Temperature Materials

    NASA Technical Reports Server (NTRS)

    Lee, Jonghyun; Hyers, Robert W.; Rogers, Jan R.; Rathz, Thomas J.; Choo, Hahn; Liaw, Peter

    2006-01-01

    Responsive access to space requires re-use of components, such as rocket nozzles, that operate at extremely high temperatures. For such applications, new ultra-high-temperature materials that can operate above 2,000 C are required. At temperatures above fifty percent of the melting temperature, characterization of creep properties is indispensable. Since conventional methods for the measurement of creep are limited to temperatures below 1,700 C, a new technique that can be applied at higher temperatures is strongly needed. This research develops a non-contact method for the measurement of creep at temperatures over 2,300 C. Using the electrostatic levitator at NASA MSFC, a spherical sample was rotated to cause creep deformation by centrifugal acceleration. The deforming sample was captured with a digital camera and analyzed to measure creep deformation. Numerical and analytical analyses have also been conducted to compare with the experimental results. Analytical, numerical, and experimental results showed good agreement with one another.

  11. Environmental and human monitoring of Americium-241 utilizing extraction chromatography and alpha-spectrometry.

    PubMed

    Goldstein, S J; Hensley, C A; Armenta, C E; Peters, R J

    1997-03-01

    Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for alpha-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.

  12. Droplet-Based Segregation and Extraction of Concentrated Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, C R; Buckley, P; Hamilton, J

    2007-02-23

    Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures, such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth, to validate chip performance.

  13. Exploring phlebotomy technique as a pre-analytical factor in proteomic analyses by mass spectrometry.

    PubMed

    Penn, Andrew M; Lu, Linghong; Chambers, Andrew G; Balshaw, Robert F; Morrison, Jaclyn L; Votova, Kristine; Wood, Eileen; Smith, Derek S; Lesperance, Maria; del Zoppo, Gregory J; Borchers, Christoph H

    2015-12-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is an emerging technology for blood biomarker verification and validation; however, the results may be influenced by pre-analytical factors. This exploratory study was designed to determine if differences in phlebotomy techniques would significantly affect the abundance of plasma proteins in an upcoming biomarker development study. Blood was drawn from 10 healthy participants using four techniques: (1) a 20-gauge IV with vacutainer, (2) a 21-gauge direct vacutainer, (3) an 18-gauge butterfly with vacutainer, and (4) an 18-gauge butterfly with syringe draw. The abundances of a panel of 122 proteins (117 proteins, plus 5 matrix metalloproteinase (MMP) proteins) were targeted by LC/MRM-MS. In addition, complete blood count (CBC) data were also compared across the four techniques. Phlebotomy technique significantly affected 2 of the 11 CBC parameters (red blood cell count, p = 0.010; hemoglobin concentration, p = 0.035) and only 12 of the targeted 117 proteins (p < 0.05). Of the five MMP proteins, only MMP7 was detectable and its concentration was not significantly affected by different techniques. Overall, most proteins in this exploratory study were not significantly influenced by phlebotomy technique; however, a larger study with additional patients will be required for confirmation.

  14. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which petroleum is derived, are composed of organic compounds with some trace elements. These compounds give an insight into the origin, thermal maturity, and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. The main tools for acquiring such geochemical data are analytical techniques. Due to progress in the development of new analytical techniques, many previously intractable petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  15. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  16. Orbiter Avionics Radiation Handbook

    NASA Technical Reports Server (NTRS)

    Reddell, Brandon D.

    1999-01-01

    This handbook was assembled to document the radiation environment for design of Orbiter avionics. It also maps the environment through vehicle shielding and mission usage into discrete requirements such as total dose. Some details of analytical techniques for calculating radiation effects are provided. It is anticipated that appropriate portions of this document will be added to formal program specifications.

  17. 75 FR 28616 - Agilent Technologies, Inc.; Analysis of the Agreement Containing Consent Order to Aid Public Comment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... equipment used to test cell phones and communications equipment, machines that determine the contents of... employ various analytical techniques to test samples of many types, are used by academic researchers... require the sensitivity provided by ICP-MS, and because many customers perform tests pursuant to...

  18. Social Networks and Smoking: Exploring the Effects of Peer Influence and Smoker Popularity through Simulations

    ERIC Educational Resources Information Center

    Schaefer, David R.; adams, jimi; Haas, Steven A.

    2013-01-01

    Adolescent smoking and friendship networks are related in many ways that can amplify smoking prevalence. Understanding and developing interventions within such a complex system requires new analytic approaches. We draw on recent advances in dynamic network modeling to develop a technique that explores the implications of various intervention…

  19. "Open-Box" Approach to Measuring Fluorescence Quenching Using an iPad Screen and Digital SLR Camera

    ERIC Educational Resources Information Center

    Koenig, Michael H.; Yi, Eun P.; Sandridge, Matthew J.; Mathew, Alexander S.; Demas, James N.

    2015-01-01

    Fluorescence quenching is an analytical technique and a common undergraduate laboratory exercise. Unfortunately, a typical quenching experiment requires the use of an expensive fluorometer that measures the relative fluorescence intensity of a single sample in a closed compartment unseen by the experimenter. To overcome these shortcomings, we…
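
    The analysis behind such a quenching exercise is the standard Stern-Volmer relation (quoted for context; the article's low-cost apparatus simply supplies the intensities):

    \[
    \frac{I_0}{I} = 1 + K_{SV}\,[Q],
    \]

    where I_0 and I are the fluorescence intensities without and with quencher, [Q] is the quencher concentration, and the Stern-Volmer constant K_{SV} is obtained as the slope of I_0/I plotted against [Q].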

  20. SUBSURFACE CHARACTERIZATION AND MONITORING TECHNIQUES: A DESK REFERENCE GUIDE - VOLUME II: THE VADOSE ZONE, FIELD SCREENING AND ANALYTICAL METHODS - APPENDICES C AND D

    EPA Science Inventory

    Many EPA programs, including those under the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Response, Compensation, and Liability Act (CERCLA), require subsurface characterization and monitoring to detect ground-water contamination and provide data to deve...

  1. Computational aspects of sensitivity calculations in linear transient structural analysis. Ph.D. Thesis - Virginia Polytechnic Inst. and State Univ.

    NASA Technical Reports Server (NTRS)

    Greene, William H.

    1990-01-01

    A study was performed focusing on the calculation of sensitivities of displacements, velocities, accelerations, and stresses in linear, structural, transient response problems. One significant goal of the study was to develop and evaluate sensitivity calculation techniques suitable for large-order finite element analyses. Accordingly, approximation vectors such as vibration mode shapes are used to reduce the dimensionality of the finite element model. Much of the research focused on the accuracy of both response quantities and sensitivities as a function of number of vectors used. Two types of sensitivity calculation techniques were developed and evaluated. The first type of technique is an overall finite difference method where the analysis is repeated for perturbed designs. The second type of technique is termed semi-analytical because it involves direct, analytical differentiation of the equations of motion with finite difference approximation of the coefficient matrices. To be computationally practical in large-order problems, the overall finite difference methods must use the approximation vectors from the original design in the analyses of the perturbed models. In several cases this fixed mode approach resulted in very poor approximations of the stress sensitivities. Almost all of the original modes were required for an accurate sensitivity and for small numbers of modes, the accuracy was extremely poor. To overcome this poor accuracy, two semi-analytical techniques were developed. The first technique accounts for the change in eigenvectors through approximate eigenvector derivatives. The second technique applies the mode acceleration method of transient analysis to the sensitivity calculations. Both result in accurate values of the stress sensitivities with a small number of modes and much lower computational costs than if the vibration modes were recalculated and then used in an overall finite difference method.
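
    A sketch of the second technique's generic form (the standard semi-analytical recipe, on which the eigenvector-derivative and mode-acceleration refinements described above are built): differentiating the undamped equations of motion M\ddot{u} + Ku = f with respect to a design variable p gives

    \[
    M\,\ddot{u}_{,p} + K\,u_{,p} = f_{,p} - M_{,p}\,\ddot{u} - K_{,p}\,u, \qquad K_{,p} \approx \frac{K(p + \Delta p) - K(p)}{\Delta p},
    \]

    so the sensitivity u_{,p} satisfies the same equations of motion under a pseudo-load, and only the coefficient-matrix derivatives are approximated by finite differences.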

  2. Teleoperator system man-machine interface requirements for satellite retrieval and satellite servicing. Volume 1: Requirements

    NASA Technical Reports Server (NTRS)

    Malone, T. B.

    1972-01-01

    Requirements were determined analytically for the man-machine interface for a teleoperator system performing on-orbit satellite retrieval and servicing. Requirements are basically of two types: mission/system requirements, and design requirements or design criteria. Two types of teleoperator systems were considered: a free-flying vehicle and a shuttle-attached manipulator. No attempt was made to evaluate the relative effectiveness or efficiency of the two system concepts. The methodology used entailed an application of the Essex Man-Systems analysis technique as well as a complete familiarization with relevant work being performed at government agencies and by private industry.

  3. CLOSED-LOOP STRIPPING ANALYSIS (CLSA) OF ...

    EPA Pesticide Factsheets

    Synthetic musk compounds have been found in surface water, fish tissues, and human breast milk. Current techniques for separating these compounds from fish tissues require tedious sample clean-up procedures. A simple method for the determination of these compounds in fish tissues has been developed. Closed-loop stripping of saponified fish tissues in a 1-L Wheaton purge-and-trap vessel is used to strip compounds with high vapor pressures, such as synthetic musks, from the matrix onto a solid sorbent (Abselut Nexus). This technique is useful for screening biological tissues that contain lipids for musk compounds. Analytes are desorbed from the sorbent trap sequentially with polar and nonpolar solvents, concentrated, and directly analyzed by high resolution gas chromatography coupled to a mass spectrometer operating in the selected ion monitoring mode. In this paper, we analyzed two homogenized samples of whole fish tissues with spiked synthetic musk compounds using closed-loop stripping analysis (CLSA) and pressurized liquid extraction (PLE). The analytes were not recovered quantitatively, but the extraction yield was sufficiently reproducible for at least semi-quantitative purposes (screening). The method was less expensive to implement and required significantly less sample preparation than the PLE technique. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water,

  4. Efficient computational nonlinear dynamic analysis using modal modification response technique

    NASA Astrophysics Data System (ADS)

    Marinone, Timothy; Avitabile, Peter; Foley, Jason; Wolfson, Janet

    2012-08-01

    Structural systems frequently contain nonlinear characteristics. These nonlinear systems require significant computational resources for solution of the equations of motion. Much of the model, however, is linear, with the nonlinearity arising from discrete local elements connecting different components together. Using a component mode synthesis approach, a nonlinear model can be developed by interconnecting these linear components with highly nonlinear connection elements. The approach presented in this paper, the Modal Modification Response Technique (MMRT), is a very efficient technique created to address this specific class of nonlinear problem. By utilizing a Structural Dynamics Modification (SDM) approach in conjunction with mode superposition, a significantly smaller set of matrices is required for use in the direct integration of the equations of motion. The approach will be compared to traditional analytical approaches to make evident the usefulness of the technique for a variety of test cases.
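
    Not the MMRT formulation itself, but the generic pseudo-force mode-superposition idea this class of method builds on: keep the linear part on the left, move the discrete nonlinear connection forces to the right-hand side, and integrate in a reduced basis of mass-normalised modes \Phi = [\phi_1, ..., \phi_m]:

    \[
    M\ddot{x} + Kx = f(t) - f_{nl}(x, \dot{x}), \qquad x \approx \Phi q,
    \]
    \[
    \ddot{q}_i + 2\zeta_i\omega_i\dot{q}_i + \omega_i^{2} q_i = \phi_i^{\mathsf{T}}\left[f(t) - f_{nl}(\Phi q, \Phi\dot{q})\right],
    \]

    so only a small set of modal equations, rather than the full finite element matrices, is carried through the direct integration.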

  5. MALDI mass spectrometry imaging, from its origins up to today: the state of the art.

    PubMed

    Francese, Simona; Dani, Francesca R; Traldi, Pietro; Mastrobuoni, Guido; Pieraccini, Giuseppe; Moneti, Gloriano

    2009-02-01

    Mass Spectrometry (MS) has a number of features, namely sensitivity, high dynamic range, high resolution, and versatility, which make it a very powerful analytical tool for a wide spectrum of applications spanning all the life science fields. Among all the MS techniques, MALDI imaging mass spectrometry (MALDI MSI) is currently one of the most exciting, both for its rapid technological improvements and for its great potential in high-impact bioscience fields. Here, the general principles of MALDI MSI are described, along with technical and instrumental details as well as application examples. Imaging MS instruments and imaging mass spectrometric techniques other than MALDI are presented, along with examples of their use. As well as reporting MSI successes in several bioscience fields, an attempt is made to take stock of what has been achieved so far with this technology and to discuss the analytical and technological advances required for MSI to be applied as a routine technique in clinical diagnostics, clinical monitoring, and drug discovery.

  6. Developments and advances concerning the hyperpolarisation technique SABRE.

    PubMed

    Mewis, Ryan E

    2015-10-01

    To overcome the inherent sensitivity issue in NMR and MRI, hyperpolarisation techniques are used. Signal Amplification By Reversible Exchange (SABRE) is a hyperpolarisation technique that utilises parahydrogen, a molecule that possesses a nuclear singlet state, as the source of polarisation. A metal complex is required to break the singlet order of parahydrogen and, by doing so, facilitates polarisation transfer to analyte molecules ligated to the same complex through the J-coupled network that exists. The increased signal intensities that the analyte molecules possess as a result of this process have led to investigations whereby their potential as MRI contrast agents has been probed and to understand the fundamental processes underpinning the polarisation transfer mechanism. As well as discussing literature relevant to both of these areas, the chemical structure of the complex, the physical constraints of the polarisation transfer process and the successes of implementing SABRE at low and high magnetic fields are discussed. Copyright © 2015 John Wiley & Sons, Ltd.

  7. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed, as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps, such as identifying article relevance to biosurveillance (e.g., a relevance algorithm) and article feature extraction (who, what, where, why, how, and when).
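
    As a rough sketch of the relevance step only (the chapter's pipeline also uses semantic concept matching, which is not shown), a TF-IDF bag-of-words classifier can flag biosurveillance-relevant articles. The toy documents, labels, and model choice below are assumptions for illustration.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      # Toy labelled articles: 1 = relevant to biosurveillance, 0 = not.
      docs = ["outbreak of avian influenza reported in poultry farms",
              "local team wins championship after dramatic final",
              "hospital reports cluster of unexplained respiratory illness",
              "new smartphone model released this week"]
      labels = [1, 0, 1, 0]

      relevance = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                                LogisticRegression())
      relevance.fit(docs, labels)
      print(relevance.predict_proba(["novel virus cases rising in region"])[:, 1])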

  8. Standard addition with internal standardisation as an alternative to using stable isotope labelled internal standards to correct for matrix effects-Comparison and validation using liquid chromatography-​tandem mass spectrometric assay of vitamin D.

    PubMed

    Hewavitharana, Amitha K; Abu Kassim, Nur Sofiah; Shaw, Paul Nicholas

    2018-06-08

    With mass spectrometric detection in liquid chromatography, co-eluting impurities affect the analyte response through ion suppression/enhancement. Internal standard calibration, using a co-eluting stable isotope labelled analogue of each analyte as the internal standard, is the most appropriate technique available to correct for these matrix effects. However, this technique is not without drawbacks: it is expensive because a separate internal standard is required for each analyte, and the labelled compounds are costly or require synthesis. Traditionally, the standard addition method has been used to overcome matrix effects in atomic spectroscopy, where it is well established. This paper proposes the same for mass spectrometric detection and demonstrates, for a vitamin D assay, that the results are comparable to those obtained with the internal standard method using labelled analogues. As the conventional standard addition procedure does not address procedural errors, we propose the inclusion of an additional internal standard (not co-eluting). Recoveries determined on human serum samples show that the proposed method of standard addition yields more accurate results than internal standardisation using stable isotope labelled analogues. The precision of the proposed method is also superior to that of the conventional standard addition method. Copyright © 2018 Elsevier B.V. All rights reserved.
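
    The arithmetic of the proposed calibration can be sketched as follows (illustrative numbers, not data from the paper): each spiked aliquot's analyte response is first normalised to the added internal standard to correct procedural errors, and the unknown concentration is then read from the x-intercept of the standard addition line.

      import numpy as np

      # Standard addition with internal standardisation (illustrative values).
      added = np.array([0.0, 5.0, 10.0, 20.0])          # ng/mL analyte added
      analyte_area = np.array([1200.0, 2150.0, 3080.0, 4990.0])
      is_area = np.array([10100.0, 9900.0, 10050.0, 10200.0])

      y = analyte_area / is_area                        # normalised response
      slope, intercept = np.polyfit(added, y, 1)
      print(f"estimated concentration: {intercept/slope:.1f} ng/mL")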

  9. Constitutive parameter measurements of lossy materials

    NASA Technical Reports Server (NTRS)

    Dominek, A.; Park, A.

    1989-01-01

    The electrical constitutive parameters of lossy materials are considered. A discussion of the NRL arch for lossy coatings is presented, involving analytical analyses of the reflected field using the geometrical theory of diffraction (GTD) and physical optics (PO). The actual values of these parameters can be obtained through a traditional transmission technique, which is examined from an error-analysis standpoint. Alternative sample geometries are suggested for this technique to reduce sample tolerance requirements for accurate parameter determination. The performance of one alternative geometry is given.
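
    For context, the quantity an NRL arch measurement estimates at normal incidence is the reflection coefficient of a metal-backed lossy layer, which follows from transmission-line theory. A short sketch with assumed constitutive values:

      import numpy as np

      # Normal-incidence reflection from a metal-backed lossy coating.
      f = 10e9                                  # frequency (Hz)
      d = 2e-3                                  # coating thickness (m)
      eps_r = 12.0 - 3.0j                       # complex relative permittivity
      mu_r = 1.5 - 0.8j                         # complex relative permeability
      k0 = 2*np.pi*f/299792458.0                # free-space wavenumber

      # Input impedance of the shorted layer, normalised to free space:
      z_in = np.sqrt(mu_r/eps_r) * np.tanh(1j*k0*d*np.sqrt(mu_r*eps_r))
      gamma = (z_in - 1.0) / (z_in + 1.0)
      print(f"reflectivity: {20*np.log10(abs(gamma)):.1f} dB")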

  10. Information support for decision making on dispatching control of water distribution in irrigation

    NASA Astrophysics Data System (ADS)

    Yurchenko, I. F.

    2018-05-01

    Research has been carried out on developing a technique to support decision making in the on-line operational management of water allocation for interfarm irrigation projects, based on analytical patterns of dispatcher control. The technique increases labour productivity and management quality through an improved level of automation and through decision-making optimization that takes into account diagnosis of the issues, classification of solutions, and the information required by decision makers.

  11. Application of LANDSAT data to monitor land reclamation progress in Belmont County, Ohio

    NASA Technical Reports Server (NTRS)

    Bloemer, H. H. L.; Brumfield, J. O.; Campbell, W. J.; Witt, R. G.; Bly, B. G.

    1981-01-01

    Strip and contour mining techniques are reviewed, as well as some studies conducted to determine the applicability of LANDSAT and associated digital image processing techniques to the surficial problems associated with mining operations. A nontraditional unsupervised classification approach to multispectral data is considered, which yields increased classification separability in land cover analysis of surface-mined areas. The approach also reduces the dimensionality of the data and requires only minimal analytical skill in digital data processing.
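
    The unsupervised classification step can be sketched generically with k-means clustering of multispectral pixels (the paper's nontraditional variant is not specified here, and the synthetic scene below is purely illustrative):

      import numpy as np
      from sklearn.cluster import KMeans

      # Cluster a synthetic 100 x 100 scene with 4 spectral bands.
      rng = np.random.default_rng(0)
      pixels = rng.normal(size=(10000, 4))       # rows = pixels, cols = bands

      km = KMeans(n_clusters=6, n_init=10, random_state=0).fit(pixels)
      classmap = km.labels_.reshape(100, 100)    # label image for map output
      print(np.bincount(km.labels_))             # pixels per spectral class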

  12. Status and Needs Research for On-line Monitoring of VOCs Emissions from Stationary Sources

    NASA Astrophysics Data System (ADS)

    Zhou, Gang; Wang, Qiang; Zhong, Qi; Zhao, Jinbao; Yang, Kai

    2018-01-01

    Based on the atmospheric volatile organic compound (VOC) pollution control requirements of the Twelfth Five-Year Plan and the current status of monitoring and management in China and abroad, the instrumental architecture and technical characteristics of continuous emission monitoring systems (CEMS) for VOC emissions from stationary sources are investigated. Technological development needs for on-line monitoring of VOC emissions from stationary sources in China are proposed, covering system sampling and pretreatment technology as well as analytical measurement techniques.

  13. Methods for determination of inorganic substances in water and fluvial sediments

    USGS Publications Warehouse

    Fishman, Marvin J.; Friedman, Linda C.

    1989-01-01

    Chapter Al of the laboratory manual contains methods used by the U.S. Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, the total recoverable and total of constituents in water-suspended sediment samples, and the recoverable and total concentrations of constituents in samples of bottom material. The introduction to the manual includes essential definitions and a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including the accuracy and precision of analyses, the use of standard-reference water samples, and the operation of an effective quality-assurance program. Methods for sample preparation and pretreatment are given also. A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods of these techniques are arranged alphabetically by constituent. For each method, the general topics covered are the application, the principle of the method, the interferences, the apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 126 methods are given for the determination of 70 inorganic constituents and physical properties of water, suspended sediment, and bottom material.

  14. Methods for determination of inorganic substances in water and fluvial sediments

    USGS Publications Warehouse

    Fishman, Marvin J.; Friedman, Linda C.

    1985-01-01

    Chapter Al of the laboratory manual contains methods used by the Geological Survey to analyze samples of water, suspended sediments, and bottom material for their content of inorganic constituents. Included are methods for determining the concentration of dissolved constituents in water, total recoverable and total of constituents in water-suspended sediment samples, and recoverable and total concentrations of constituents in samples of bottom material. Essential definitions are included in the introduction to the manual, along with a brief discussion of the use of significant figures in calculating and reporting analytical results. Quality control in the water-analysis laboratory is discussed, including accuracy and precision of analyses, the use of standard reference water samples, and the operation of an effective quality assurance program. Methods for sample preparation and pretreatment are given also.A brief discussion of the principles of the analytical techniques involved and their particular application to water and sediment analysis is presented. The analytical methods involving these techniques are arranged alphabetically according to constituent. For each method given, the general topics covered are application, principle of the method, interferences, apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data, when available. More than 125 methods are given for the determination of 70 different inorganic constituents and physical properties of water, suspended sediment, and bottom material.

  15. Model of separation performance of bilinear gradients in scanning format counter-flow gradient electrofocusing techniques.

    PubMed

    Shameli, Seyed Mostafa; Glawdel, Tomasz; Ren, Carolyn L

    2015-03-01

    Counter-flow gradient electrofocusing allows the simultaneous concentration and separation of analytes by generating a gradient in the total velocity of each analyte, which is the sum of its electrophoretic velocity and the bulk counter-flow velocity. In the scanning format, the bulk counter-flow velocity varies with time so that a number of analytes with large differences in electrophoretic mobility can be sequentially focused and passed by a single detection point. Studies have shown that nonlinear (such as bilinear) velocity gradients along the separation channel can improve both peak capacity and separation resolution simultaneously, which cannot be achieved using a single linear gradient. Developing an effective separation system based on the scanning counter-flow nonlinear gradient electrofocusing technique usually requires extensive experimental and numerical effort, which can be reduced significantly with the help of analytical models for design optimization and for guiding experimental studies. Therefore, this study focuses on developing an analytical model to evaluate the separation performance of scanning counter-flow bilinear gradient electrofocusing methods. In particular, this model allows a bilinear gradient and a scanning rate to be optimized for the desired separation performance. The results based on this model indicate that any bilinear gradient provides a higher separation resolution (up to 100%) compared to the linear case. The model is validated by numerical studies. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
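
    The focusing picture can be sketched numerically (all values are illustrative assumptions): the total analyte velocity u(x) is the sum of the electrophoretic term and the bulk counter-flow, the analyte focuses where u crosses zero, and the steady peak width follows the usual equilibrium-gradient estimate sigma = sqrt(D/|du/dx|) evaluated at the focus.

      import numpy as np

      # Bilinear total-velocity profile along the channel.
      x = np.linspace(0.0, 10e-3, 2001)               # channel axis (m)
      x_knee = 4e-3                                   # gradient breakpoint (m)
      g1, g2 = 0.1, 0.02                              # velocity slopes (1/s)
      u0 = -2e-4                                      # offset velocity (m/s)
      u = u0 + np.where(x < x_knee, g1*x, g1*x_knee + g2*(x - x_knee))

      D = 5e-10                                       # diffusivity (m^2/s)
      i = np.argmin(np.abs(u))                        # focal point, u ~ 0
      slope = g1 if x[i] < x_knee else g2
      print(f"focus at x = {x[i]*1e3:.2f} mm, "
            f"sigma = {np.sqrt(D/slope)*1e6:.1f} um")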

  16. On the Contribution of Raman Spectroscopy to Forensic Science

    NASA Astrophysics Data System (ADS)

    Buzzini, Patrick; Massonnet, Genevieve

    2010-08-01

    Raman spectroscopy has only recently sparked interest from forensic laboratories. The Raman technique has demonstrated important advantages such as its nondestructive nature, its fast analysis time, and especially the possibility of performing microscopical in situ analyses. In forensic applications, it is a versatile technique that covers a wide spectrum of substances such as trace evidence, illicit drugs, and inks. An overview of recent developments in Raman spectroscopy in forensic science will be given. The requirements for an analytical technique for the examination of physical evidence will also be described, and examples from casework will be presented.

  17. Detailed analysis of the effects of stencil spatial variations with arbitrary high-order finite-difference Maxwell solver

    DOE PAGES

    Vincenti, H.; Vay, J. -L.

    2015-11-22

    Due to discretization effects and truncation to finite domains, many electromagnetic simulations present non-physical modifications of Maxwell's equations in space that may generate spurious signals affecting the overall accuracy of the result. Such modifications occur, for instance, when Perfectly Matched Layers (PMLs) are used at simulation domain boundaries to simulate open media. Another example is the use of arbitrary-order Maxwell solvers with a domain decomposition technique, which may under some conditions involve stencil truncations at subdomain boundaries, resulting in small spurious errors that eventually build up. In each case, a careful evaluation of the characteristics and magnitude of the errors resulting from these approximations, and their impact at any frequency and angle, requires detailed analytical and numerical studies. To this end, we present a general analytical approach that enables the evaluation of numerical discretization errors of fully three-dimensional, arbitrary-order finite-difference Maxwell solvers with arbitrary modification of the local stencil in the simulation domain. The analytical model is validated against simulations of the domain decomposition technique and PMLs when these are used with very high-order Maxwell solvers, as well as in the infinite-order limit of pseudo-spectral solvers. Results confirm that the new analytical approach enables exact predictions in each case. They also confirm that the domain decomposition technique can be used with very high-order Maxwell solvers and a reasonably low number of guard cells with negligible effects on the overall accuracy of the simulation.
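
    The flavour of such a stencil analysis can be shown in one dimension (a simplification of the paper's fully three-dimensional treatment): centered first-derivative weights of arbitrary order are obtained by Taylor matching, and the deviation of the resulting modified wavenumber from the exact one measures the numerical dispersion error.

      import numpy as np
      from math import factorial

      def stencil_weights(p):
          """Half-width-p centered first-derivative weights by Taylor matching."""
          M = np.array([[2.0*j**(2*m - 1)/factorial(2*m - 1)
                         for j in range(1, p + 1)]
                        for m in range(1, p + 1)])
          rhs = np.zeros(p)
          rhs[0] = 1.0
          return np.linalg.solve(M, rhs)

      k = np.linspace(0.01, np.pi, 400)            # normalised wavenumber k*dx
      for p in (1, 2, 4, 8):
          a = stencil_weights(p)
          k_eff = sum(2.0*a[j]*np.sin((j + 1)*k) for j in range(p))
          print(f"order {2*p:2d}: max relative dispersion error "
                f"{np.max(np.abs(k_eff - k)/k):.2e}")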

  18. Characterization of carrier erythrocytes for biosensing applications

    NASA Astrophysics Data System (ADS)

    Bustamante López, Sandra C.; Meissner, Kenith E.

    2017-09-01

    Erythrocyte abundance, mobility, and carrying capacity make these cells attractive as a platform for blood analyte sensing as well as for drug delivery. Sensor-loaded erythrocytes, dubbed erythrosensors, could be reinfused into the bloodstream, excited noninvasively through the skin, and used to provide measurement of analyte levels in the bloodstream. Several techniques exist for loading erythrocytes, thus creating carrier erythrocytes; however, the cellular characteristics of the loaded cells remain largely unstudied. Changes in cellular characteristics lead to removal from the bloodstream, so we hypothesize that erythrosensors need to maintain native erythrocytes' (NEs) characteristics to serve as a long-term sensing platform. Here, we investigate two loading techniques and the properties of the resulting erythrosensors. For loading, hypotonic dilution requires a hypotonic solution, while electroporation relies on electrical pulses to perforate the erythrocyte membrane. We analyze the resulting erythrosensor signal, size, morphology, and hemoglobin content. Although the resulting erythrosensors exhibit morphological changes, their size is comparable with that of NEs. The hypotonic dilution technique was found to load erythrosensors much more efficiently than electroporation, and the sensors were loaded throughout the volume of the erythrosensors. Finally, both techniques resulted in significant loss of hemoglobin. This study points to the need for continued development of loading techniques that better preserve NE characteristics.

  19. Toward interactive search in remote sensing imagery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Porter, Reid B; Hush, Do; Harvey, Neal

    2010-01-01

    To move from data to information in almost all science and defense applications requires a human-in-the-loop to validate information products, resolve inconsistencies, and account for incomplete and potentially deceptive sources of information. This is a key motivation for visual analytics, which aims to develop techniques that complement and empower human users. By contrast, the vast majority of algorithms developed in machine learning aim to replace human users in data exploitation. In this paper we describe a recently introduced machine learning problem, called rare category detection, which may be a better match to visual analytic environments. We describe a new design criterion for this problem, and present comparisons to existing techniques with both synthetic and real-world datasets. We conclude by describing an application in broad-area search of remote sensing imagery.

  20. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  1. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  2. Hierarchical zwitterionic modification of a SERS substrate enables real-time drug monitoring in blood plasma

    NASA Astrophysics Data System (ADS)

    Sun, Fang; Hung, Hsiang-Chieh; Sinclair, Andrew; Zhang, Peng; Bai, Tao; Galvan, Daniel David; Jain, Priyesh; Li, Bowen; Jiang, Shaoyi; Yu, Qiuming

    2016-11-01

    Surface-enhanced Raman spectroscopy (SERS) is an ultrasensitive analytical technique with molecular specificity, making it an ideal candidate for therapeutic drug monitoring (TDM). However, in critical diagnostic media including blood, nonspecific protein adsorption coupled with weak surface affinities and small Raman activities of many analytes hinder the TDM application of SERS. Here we report a hierarchical surface modification strategy, first by coating a gold surface with a self-assembled monolayer (SAM) designed to attract or probe for analytes and then by grafting a non-fouling zwitterionic polymer brush layer to effectively repel protein fouling. We demonstrate how this modification can enable TDM applications by quantitatively and dynamically measuring the concentrations of several analytes--including an anticancer drug (doxorubicin), several TDM-requiring antidepressant and anti-seizure drugs, fructose and blood pH--in undiluted plasma. This hierarchical surface chemistry is widely applicable to many analytes and provides a generalized platform for SERS-based biosensing in complex real-world media.

  3. Functional Interfaces Constructed by Controlled/Living Radical Polymerization for Analytical Chemistry.

    PubMed

    Wang, Huai-Song; Song, Min; Hang, Tai-Jun

    2016-02-10

    The high-value applications of functional polymers in analytical science generally require well-defined interfaces, including precisely synthesized molecular architectures and compositions. Controlled/living radical polymerization (CRP) has been developed as a versatile and powerful tool for the preparation of polymers with narrow molecular weight distributions and predetermined molecular weights. Among CRP systems, atom transfer radical polymerization (ATRP) and reversible addition-fragmentation chain transfer (RAFT) are widely used to develop new materials for analytical science, such as surface-modified core-shell particles, monoliths, MIP micro- or nanospheres, fluorescent nanoparticles, and multifunctional materials. In this review, we summarize the emerging functional interfaces constructed by RAFT and ATRP for applications in analytical science. Various polymers with precisely controlled architectures, including homopolymers, block copolymers, molecularly imprinted copolymers, and grafted copolymers, have been synthesized by CRP methods for molecular separation, retention, or sensing. We expect that CRP methods will become the most popular technique for preparing functional polymers that can be broadly applied in analytical chemistry.

  4. Current Status of Mycotoxin Analysis: A Critical Review.

    PubMed

    Shephard, Gordon S

    2016-07-01

    It is over 50 years since the discovery of aflatoxins focused the attention of food safety specialists on fungal toxins in the feed and food supply. Since then, analysis of this important group of natural contaminants has advanced in parallel with general developments in analytical science, and current MS methods are capable of simultaneously analyzing hundreds of compounds, including mycotoxins, pesticides, and drugs. This profusion of data may advance our understanding of human exposure, yet constitutes an interpretive challenge to toxicologists and food safety regulators. Despite these advances in analytical science, the basic problem of the extreme heterogeneity of mycotoxin contamination, although now well understood, cannot be circumvented. The real health challenges posed by mycotoxin exposure occur in the developing world, especially among small-scale and subsistence farmers. Addressing these problems requires innovative approaches in which analytical science must also play a role in providing suitable out-of-laboratory analytical techniques.

  5. Application of Data Provenance in Healthcare Analytics Software: Information Visualisation of User Activities

    PubMed Central

    Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa

    2018-01-01

    Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency, and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user are presented. Provenance captured from analytics software can be large and complex, and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit the complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into health data analytics software, using the example of the Atmolytics visual reporting tool. PMID:29888084

  6. Analyzing the Heterogeneous Hierarchy of Cultural Heritage Materials: Analytical Imaging.

    PubMed

    Trentelman, Karen

    2017-06-12

    Objects of cultural heritage significance are created using a wide variety of materials, or mixtures of materials, and often exhibit heterogeneity on multiple length scales. The effective study of these complex constructions thus requires the use of a suite of complementary analytical technologies. Moreover, because of the importance and irreplaceability of most cultural heritage objects, researchers favor analytical techniques that can be employed noninvasively, i.e., without having to remove any material for analysis. As such, analytical imaging has emerged as an important approach for the study of cultural heritage. Imaging technologies commonly employed, from the macroscale through the micro- to nanoscale, are discussed with respect to how the information obtained helps us understand artists' materials and methods, the cultures in which the objects were created, how the objects may have changed over time, and importantly, how we may develop strategies for their preservation.

  7. Methods for determination of radioactive substances in water and fluvial sediments

    USGS Publications Warehouse

    Thatcher, Leland Lincoln; Janzer, Victor J.; Edwards, Kenneth W.

    1977-01-01

    Analytical methods for the determination of some of the more important components of fission or neutron activation product radioactivity and of natural radioactivity found in water are reported. The report for each analytical method includes conditions for application of the method, a summary of the method, interferences, required apparatus and reagents, analytical procedures, calculations, reporting of results, and estimation of precision. The fission product isotopes considered are cesium-137, strontium-90, and ruthenium-106. The natural radioelements and isotopes considered are uranium, lead-210, radium-226, radium-228, tritium, and carbon-14. A gross radioactivity survey method and a uranium isotope ratio method are given. When two analytical methods are in routine use for an individual isotope, both methods are reported with identification of the specific areas of application of each. Techniques for the collection and preservation of water samples to be analyzed for radioactivity are discussed.

  8. Meromorphic solutions of recurrence relations and DRA method for multicomponent master integrals

    NASA Astrophysics Data System (ADS)

    Lee, Roman N.; Mingulov, Kirill T.

    2018-04-01

    We formulate a method to find the meromorphic solutions of higher-order recurrence relations in the form of the sum over poles with coefficients defined recursively. Several explicit examples of the application of this technique are given. The main advantage of the described approach is that the analytical properties of the solutions are very clear (the position of poles is explicit, the behavior at infinity can be easily determined). These are exactly the properties that are required for the application of the multiloop calculation method based on dimensional recurrence relations and analyticity (the DRA method).
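
    As a minimal illustration in our own notation (not the paper's): for a first-order recurrence f(z+1) = a(z) f(z) + b(z), one seeks a solution of the form

      \[
        f(z) = \sum_{n=0}^{\infty} \frac{c_n}{(z - z_0 - n)^{p}},
      \]

    where substituting this ansatz into the recurrence determines each coefficient c_{n+1} from the preceding ones. The pole positions z_0 + n are then explicit by construction, and the behaviour at infinity follows from the decay of the c_n; these are exactly the analytical properties the DRA method requires.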

  9. [Validation of an in-house method for the determination of zinc in serum: Meeting the requirements of ISO 17025].

    PubMed

    Llorente Ballesteros, M T; Navarro Serrano, I; López Colón, J L

    2015-01-01

    The aim of this report is to propose a scheme for the validation of an analytical technique according to ISO 17025. The fundamental parameters tested were selectivity, calibration model, precision, accuracy, uncertainty of measurement, and analytical interference. A protocol has been developed and applied successfully to quantify zinc in serum by atomic absorption spectrometry. It is demonstrated that the method is selective, linear, accurate, and precise, making it suitable for use in routine diagnostics. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.

  10. Full Flight Envelope Direct Thrust Measurement on a Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Conners, Timothy R.; Sims, Robert L.

    1998-01-01

    Direct thrust measurement using strain gages offers advantages over analytically-based thrust calculation methods. For flight test applications, the direct measurement method typically uses a simpler sensor arrangement and minimal data processing compared to analytical techniques, which normally require costly engine modeling and multisensor arrangements throughout the engine. Conversely, direct thrust measurement has historically produced less than desirable accuracy because of difficulty in mounting and calibrating the strain gages and the inability to account for secondary forces that influence the thrust reading at the engine mounts. Consequently, the strain-gage technique has normally been used for simple engine arrangements and primarily in the subsonic speed range. This paper presents the results of a strain gage-based direct thrust-measurement technique developed by the NASA Dryden Flight Research Center and successfully applied to the full flight envelope of an F-15 aircraft powered by two F100-PW-229 turbofan engines. Measurements have been obtained at quasi-steady-state operating conditions at maximum non-augmented and maximum augmented power throughout the altitude range of the vehicle and to a maximum speed of Mach 2.0 and are compared against results from two analytically-based thrust calculation methods. The strain-gage installation and calibration processes are also described.
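
    The calibration step behind such a measurement can be sketched with a least-squares load map (all sensitivities and readings below are hypothetical): known ground loads are applied, gage outputs are recorded, and the fitted matrix is later applied to flight gage readings to recover thrust.

      import numpy as np

      # Synthetic ground calibration: 30 load cases of [axial, vertical] force.
      rng = np.random.default_rng(1)
      L = rng.uniform(0.0, 100e3, size=(30, 2))    # applied loads (N)
      sens = np.array([[2.0e-8, 0.3e-8],           # hypothetical sensitivities
                       [0.1e-8, 1.5e-8],           # of three strain gages
                       [1.0e-8, 1.0e-8]])
      G = L @ sens.T + rng.normal(0.0, 1e-5, size=(30, 3))   # gage outputs

      C, *_ = np.linalg.lstsq(G, L, rcond=None)    # map: gage outputs -> loads
      g_flight = np.array([1.2e-3, 0.4e-3, 0.9e-3])
      print("estimated axial thrust (N):", (g_flight @ C)[0])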

  11. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  12. Analysis of low molecular weight metabolites in tea using mass spectrometry-based analytical methods.

    PubMed

    Fraser, Karl; Harrison, Scott J; Lane, Geoff A; Otter, Don E; Hemar, Yacine; Quek, Siew-Young; Rasmussen, Susanne

    2014-01-01

    Tea is the second most consumed beverage in the world after water, and there are numerous reported health benefits of consuming tea, such as reducing the risk of cardiovascular disease and many types of cancer. Thus, there is much interest in the chemical composition of teas: for example, defining components responsible for contributing to reported health benefits; defining quality characteristics such as product flavor; and monitoring for pesticide residues to comply with food safety import/export requirements. Covered in this review are some of the latest developments in mass spectrometry-based analytical techniques for measuring and characterizing low molecular weight components of tea, in particular primary and secondary metabolites. The methodology, more specifically the chromatography and detection mechanisms used in both targeted and non-targeted studies, and their main advantages and disadvantages, are discussed. Finally, we comment on the latest techniques that are likely to bring significant benefit to analysts in the future, not merely in the area of tea research, but in the analytical chemistry of low molecular weight compounds in general.

  13. BiSet: Semantic Edge Bundling with Biclusters for Sensemaking.

    PubMed

    Sun, Maoyuan; Mi, Peng; North, Chris; Ramakrishnan, Naren

    2016-01-01

    Identifying coordinated relationships is an important task in data analytics. For example, an intelligence analyst might want to discover three suspicious people who all visited the same four cities. Existing techniques that display individual relationships, such as between lists of entities, require repetitious manual selection and significant mental aggregation in cluttered visualizations to find coordinated relationships. In this paper, we present BiSet, a visual analytics technique to support interactive exploration of coordinated relationships. In BiSet, we model coordinated relationships as biclusters and algorithmically mine them from a dataset. Then, we visualize the biclusters in context as bundled edges between sets of related entities. Thus, bundles enable analysts to infer task-oriented semantic insights about potentially coordinated activities. We treat bundles as first-class objects and add a new layer, "in-between", to contain these bundle objects. On this basis, bundles serve to organize entities represented in lists and visually reveal their membership. Users can interact with edge bundles to organize related entities, and vice versa, for sensemaking purposes. With a usage scenario, we demonstrate how BiSet supports the exploration of coordinated relationships in text analytics.

  14. Experimental design of an interlaboratory study for trace metal analysis of liquid fluids. [for aerospace vehicles

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1983-01-01

    The accurate determination of trace metals in fuels is an important requirement in much of the research into and development of alternative fuels for aerospace applications. The detrimental effects of certain metals on fuel performance and fuel systems at part-per-million, and in some cases part-per-billion, levels require improved accuracy in determining these low-concentration elements. Accurate analyses are also required to ensure interchangeability of analysis results between vendor, researcher, and end user for purposes of quality control. Previous interlaboratory studies have demonstrated the inability of different laboratories to agree on the results of metal analysis, particularly at low concentration levels, even though good precision is typically reported within a laboratory. An interlaboratory study was designed to gain statistical information about the sources of variation in the reported concentrations. Five participant laboratories were used on a fee basis and were not informed of the purpose of the analyses. The effects of laboratory, analytical technique, concentration level, and ashing additive were studied in four fuel types for 20 elements of interest. The prescribed sample preparation schemes (variations of dry ashing) were used by all of the laboratories. The analytical data were statistically evaluated using a computer program for the analysis of variance technique.
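
    The variance decomposition described can be sketched with a two-way analysis of variance (hypothetical data and column names; the original study used its own computer program):

      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols

      # Reported concentration by laboratory and analytical technique
      # for one element/fuel combination (made-up numbers).
      df = pd.DataFrame({
          "lab":  ["A", "A", "B", "B", "C", "C", "D", "D", "E", "E"],
          "tech": ["AAS", "ICP"] * 5,
          "conc": [4.8, 5.3, 4.1, 5.9, 5.0, 5.2, 3.9, 6.1, 4.6, 5.5],
      })

      model = ols("conc ~ C(lab) + C(tech)", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))    # variance attributable to each factor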

  15. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
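
    The MapReduce pattern at the heart of the framework can be shown in miniature (single-process Python with illustrative records; the actual framework shards data across HBase and compute nodes):

      from collections import defaultdict

      # Mean observation per grid cell via explicit map / shuffle / reduce.
      records = [("cell_17", 281.2), ("cell_17", 283.0),
                 ("cell_42", 290.5), ("cell_42", 291.1), ("cell_17", 282.4)]

      mapped = [(cell, (value, 1)) for cell, value in records]       # map
      groups = defaultdict(list)
      for key, pair in mapped:                                       # shuffle
          groups[key].append(pair)
      means = {k: sum(v for v, _ in pairs) / sum(n for _, n in pairs)
               for k, pairs in groups.items()}                       # reduce
      print(means)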

  16. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has minimised the necessary collaboration and, more relevantly, the cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient daily accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.

  17. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  18. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for the evaluation, control, and reporting of risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool provides the possibility to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  19. Solventless and solvent-minimized sample preparation techniques for determining currently used pesticides in water samples: a review.

    PubMed

    Tankiewicz, Maciej; Fenik, Jolanta; Biziuk, Marek

    2011-10-30

    The intensification of agriculture means that increasing amounts of toxic organic and inorganic compounds are entering the environment. The pesticides generally applied nowadays are regarded as some of the most dangerous contaminants of the environment. Their presence in the environment, especially in water, is hazardous because it makes human beings more susceptible to disease. For these reasons, it is essential to monitor pesticide residues in the environment with the aid of all accessible analytical methods. The analysis of samples for the presence of pesticides is problematic because of the laborious and time-consuming operations involved in preparing samples for analysis, which may themselves be a source of additional contamination and errors. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solventless and solvent-minimized techniques are coming into use. This paper discusses the sample preparation techniques most commonly used over the last 15 years for monitoring organophosphorus and organonitrogen pesticide residues in water samples. Furthermore, a significant trend in sample preparation, in accordance with the principles of 'Green Chemistry', is the simplification, miniaturization and automation of analytical techniques. In view of this, several novel techniques are being developed in order to shorten the analysis step, increase sample throughput and improve the quality and sensitivity of analytical methods. The paper describes extraction techniques requiring the use of solvents - liquid-liquid extraction (LLE) and its modifications, membrane extraction techniques, hollow fibre-protected two-phase solvent microextraction, liquid phase microextraction based on the solidification of a floating organic drop (LPME-SFO), solid-phase extraction (SPE) and single-drop microextraction (SDME) - as well as solvent-free techniques - solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The advantages and drawbacks of these techniques are discussed, and some solutions to their limitations are proposed. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. An Inquiry-Based Project Focused on the X-Ray Powder Diffraction Analysis of Common Household Solids

    ERIC Educational Resources Information Center

    Hulien, Molly L.; Lekse, Jonathan W.; Rosmus, Kimberly A.; Devlin, Kasey P.; Glenn, Jennifer R.; Wisneski, Stephen D.; Wildfong, Peter; Lake, Charles H.; MacNeil, Joseph H.; Aitken, Jennifer A.

    2015-01-01

    While X-ray powder diffraction (XRPD) is a fundamental analytical technique used by solid-state laboratories across a breadth of disciplines, it is still underrepresented in most undergraduate curricula. In this work, we incorporate XRPD analysis into an inquiry-based project that requires students to identify the crystalline component(s) of…

  1. Microorganisms as Analytical Indicators. Experimental Methods and Techniques,

    DTIC Science & Technology

    1980-01-01

    Representatives of the genera Bacillus, Micrococcus, Escherichia, Pseudomonas, Aspergillus, and Penicillium are most frequently encountered... necessary for synthesis of prodigiosin, and magnesium is required for synthesis of bacteriochlorophylls. A change in the color of Aspergillus spores... mesentericus niger and Bac. subtilis niger as a function of the concentration of phosphonium salts in the nutrient medium. The degree of

  2. Hertzian Dipole Radiation over Isotropic Magnetodielectric Substrates

    DTIC Science & Technology

    2015-03-01

    Analytical and numerical techniques in the Green's function treatment of microstrip antennas and scatterers. IEE Proceedings, March 1983, 130(2)... This report investigates dipole antennas printed on grounded... engineering of thin planar antennas. Since these materials often require complicated constitutive equations to describe their properties rigorously, the

  3. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  4. A Single-Molecule Barcoding System using Nanoslits for DNA Analysis

    NASA Astrophysics Data System (ADS)

    Jo, Kyubong; Schramm, Timothy M.; Schwartz, David C.

    Single DNA molecule approaches are playing an increasingly central role in the analytical genomic sciences because single molecule techniques intrinsically provide individualized measurements of selected molecules, free from the constraints of bulk techniques, which blindly average noise and mask the presence of minor analyte components. Accordingly, a principal challenge that must be addressed by all single molecule approaches aimed at genome analysis is how to immobilize and manipulate DNA molecules for measurements that foster construction of large, biologically relevant data sets. To meet this challenge, this chapter discusses an integrated approach for microfabricated and nanofabricated devices for the manipulation of elongated DNA molecules within nanoscale geometries. Ideally, large DNA coils stretch via nanoconfinement when channel dimensions are within tens of nanometers. Importantly, stretched, often immobilized, DNA molecules spanning hundreds of kilobase pairs are required by all analytical platforms working with large genomic substrates, because imaging techniques acquire sequence information from molecules that normally exist in free solution as unrevealing random coils resembling floppy balls of yarn. However, the sufficiently small dimensions that foster molecular stretching make such devices impractical, requiring exotic fabrication technologies and costly materials while delivering poor operational efficiencies. In this chapter, such problems are addressed by discussion of a new approach to DNA presentation and analysis that establishes scalable nanoconfinement conditions through reduction of ionic strength, which stiffens DNA molecules and thus enables their arraying for analysis using easily fabricated devices that can also be mass produced. This new approach to DNA nanoconfinement is complemented by the development of a novel labeling scheme for reliable marking of individual molecules with fluorochrome labels, creating molecular barcodes, which are efficiently read using fluorescence resonance energy transfer techniques to minimize noise from unincorporated labels. As such, our integrative approach for the realization of genomic analysis through nanoconfinement, named nanocoding, was demonstrated through the barcoding and mapping of bacterial artificial chromosome molecules, thereby providing the basis for a high-throughput platform competent for whole genome investigations.

  5. Semi-automated solid phase extraction method for the mass spectrometric quantification of 12 specific metabolites of organophosphorus pesticides, synthetic pyrethroids, and select herbicides in human urine.

    PubMed

    Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M

    2013-06-15

    Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods that required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise, and efficient method. The selectivity and sensitivity required for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) were achieved using a narrow-diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys. Published by Elsevier B.V.
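
    The isotope dilution quantitation step reduces to a ratio calculation, sketched here with invented numbers: the analyte peak area is ratioed to that of its stable isotope labeled internal standard, and the ratio is read against a calibration line built the same way.

      import numpy as np

      # Calibration standards (ng/mL) and analyte/labeled-IS area ratios.
      cal_conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0])
      cal_ratio = np.array([0.021, 0.103, 0.205, 1.02, 2.04])
      slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

      unknown_ratio = 0.44                     # measured in a urine extract
      conc = (unknown_ratio - intercept) / slope
      print(f"urinary concentration: {conc:.2f} ng/mL")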

  6. The Role of Teamwork in the Analysis of Big Data: A Study of Visual Analytics and Box Office Prediction.

    PubMed

    Buchanan, Verica; Lu, Yafeng; McNeese, Nathan; Steptoe, Michael; Maciejewski, Ross; Cooke, Nancy

    2017-03-01

    Historically, domains such as business intelligence would require a single analyst to engage with data, develop a model, answer operational questions, and predict future behaviors. However, as the problems and domains become more complex, organizations are employing teams of analysts to explore and model data to generate knowledge. Furthermore, given the rapid increase in data collection, organizations are struggling to develop practices for intelligence analysis in the era of big data. Currently, a variety of machine learning and data mining techniques are available to model data and to generate insights and predictions, and developments in the field of visual analytics have focused on how to effectively link data mining algorithms with interactive visuals to enable analysts to explore, understand, and interact with data and data models. Although studies have explored the role of single analysts in the visual analytics pipeline, little work has explored the role of teamwork and visual analytics in the analysis of big data. In this article, we present an experiment integrating statistical models, visual analytics techniques, and user experiments to study the role of teamwork in predictive analytics. We frame our experiment around the analysis of social media data for box office prediction problems and compare the prediction performance of teams, groups, and individuals. Our results indicate that a team's performance is mediated by the team's characteristics such as openness of individual members to others' positions and the type of planning that goes into the team's analysis. These findings have important implications for how organizations should create teams in order to make effective use of information from their analytic models.

  7. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, visual analytics systems aim to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are necessary components of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. Each of these examples also highlights the need for scalable analysis: in each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead of dividing the task among a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.

  8. One-step selective electrokinetic removal of inorganic anions from small volumes and its application as sample clean-up for mass spectrometric techniques.

    PubMed

    Tubaon, Ria Marni; Haddad, Paul R; Quirino, Joselito P

    2017-03-10

The presence of inorganic anions in a sample interferes with mass spectrometric (MS) analysis. Here, a simple method to remove these ions from a liquid sample in one step is described. The inorganic anions present in a 50 μL sample were extracted into a low-pH solution inside a 200 μm i.d. × 33 cm long capillary by the use of an electric field. The selective removal of unwanted anions and retention of target analytes were accomplished by control of the apparent electrophoretic velocities of anions and analytes at a boundary that separated the sample and extraction solution. No physical barrier (e.g., membrane) was required, and with the boundary situated at the tip of the capillary, efficient removal of inorganic anions (e.g., >80% removal) and good recovery of target analytes (e.g., >80% recovery) were achieved. The time required for removal of the inorganic anions was found to depend on their initial concentrations. The removal process was investigated using different concentrations of bromide and nitrate (as potassium salts) and negatively chargeable drugs as target analytes. This micro-sample clean-up technique used no organic solvents and few consumables, and was applied to the determination of 0.6 μg/L arsenic and 8.3 μg/L vanadium in 500 mg/L sodium chloride using inductively coupled plasma MS, and of 50 μM angiotensin I in 1000 mg/L sodium chloride using electrospray ionisation MS. Micro-sample clean-up was performed for 45 min at 3 kV in both demonstrations. The calculated recoveries were 110-130% for the metals at trace levels and 103.8% for the peptide. Copyright © 2017 Elsevier B.V. All rights reserved.
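
    The selectivity principle described in this abstract (fast inorganic anions migrate out of the sample while slower analyte ions are held back) can be illustrated with a simple net-velocity calculation. The sketch below is illustrative only: the field strength follows the 3 kV/33 cm figures above, but the electroosmotic mobility and the "drug anion" mobility are assumed values, not figures from the paper.

    ```python
    # Illustration of the velocity-based selectivity principle: an anion is
    # pulled out of the sample only if its net (apparent) velocity points out
    # of the capillary tip. Mobility values are assumptions/literature-range
    # figures, not data from the paper.

    E = 3_000 / 0.33            # field strength for 3 kV over a 33 cm capillary (V/m)
    mu_eof = 2.0e-8             # assumed (suppressed) electroosmotic mobility, m^2/(V*s)

    ions = {
        # electrophoretic mobilities in m^2/(V*s); anions are negative
        "bromide":    -8.1e-8,  # literature-range value for Br-
        "nitrate":    -7.4e-8,  # literature-range value for NO3-
        "drug anion": -1.5e-8,  # assumed slow, weakly charged analyte
    }

    for name, mu_ep in ions.items():
        v_net = (mu_ep + mu_eof) * E      # apparent velocity; negative = exits sample
        fate = "removed" if v_net < 0 else "retained"
        print(f"{name:>10s}: v_net = {v_net * 1e6:8.1f} um/s -> {fate}")
    ```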

  9. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

Reports developments involving the experimental aspects of finite-current analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  10. High Accuracy Evaluation of the Finite Fourier Transform Using Sampled Data

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1997-01-01

Many system identification and signal processing procedures can be done advantageously in the frequency domain. A required preliminary step for this approach is the transformation of sampled time domain data into the frequency domain. The analytical tool used for this transformation is the finite Fourier transform. Inaccuracy in the transformation can degrade system identification and signal processing results. This work presents a method for evaluating the finite Fourier transform using cubic interpolation of sampled time domain data for high accuracy, and the chirp z-transform for arbitrary frequency resolution. The accuracy of the technique is demonstrated in example cases where the transformation can be evaluated analytically. Arbitrary frequency resolution is shown to be important for capturing details of the data in the frequency domain. The technique is demonstrated using flight test data from a longitudinal maneuver of the F-18 High Alpha Research Vehicle.
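
    A rough numerical illustration of the arbitrary-frequency-resolution idea is possible with the chirp z-transform as implemented in SciPy (scipy.signal.czt, SciPy >= 1.8). This sketch uses a plain rectangle-rule scaling rather than the paper's high-accuracy cubic-interpolation scheme, and the test signal is an assumption chosen because its transform is known in closed form.

    ```python
    # Sketch: chirp z-transform to evaluate a Fourier transform on a dense
    # grid over a narrow band. The dt scaling is a rectangle-rule
    # approximation, not the paper's cubic-interpolation method.
    import numpy as np
    from scipy.signal import czt   # SciPy >= 1.8

    dt = 0.01                          # sample interval (s)
    t = np.arange(0, 10.0, dt)
    alpha = 0.5
    x = np.exp(-alpha * t)             # known transform: 1/(alpha + 2j*pi*f)

    f0, f1, m = 0.0, 2.0, 401          # 401 points on 0..2 Hz only
    df = (f1 - f0) / (m - 1)
    a = np.exp(2j * np.pi * f0 * dt)   # first evaluation point on the unit circle
    w = np.exp(-2j * np.pi * df * dt)  # ratio between successive points

    X = czt(x, m=m, w=w, a=a) * dt     # rectangle-rule scaling approximates the integral
    f = f0 + df * np.arange(m)
    X_exact = 1.0 / (alpha + 2j * np.pi * f)
    print(f"max abs error: {np.max(np.abs(X - X_exact)):.3e}")  # rule + truncation error
    ```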

  11. Calculation of cogging force in a novel slotted linear tubular brushless permanent magnet motor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Z.Q.; Hor, P.J.; Howe, D.

    1997-09-01

There is an increasing requirement for controlled linear motion over short and long strokes, in the factory automation and packaging industries, for example. Linear brushless PM motors could offer significant advantages over conventional actuation technologies, such as motor-driven cams and linkages and pneumatic rams--in terms of efficiency, operating bandwidth, speed and thrust control, stroke and positional accuracy--and indeed over other linear motor technologies, such as induction motors. Here, a combined finite element/analytical technique for the prediction of cogging force in a novel topology of slotted linear brushless permanent magnet motor has been developed and validated. The various force components which influence cogging are pre-calculated by finite element analysis of some basic magnetic structures, to facilitate the analytical synthesis of the resultant cogging force. The technique can be used to aid design for the minimization of cogging.

  12. Suitability of analytical methods to measure solubility for the purpose of nanoregulation.

    PubMed

    Tantra, Ratna; Bouwmeester, Hans; Bolea, Eduardo; Rey-Castro, Carlos; David, Calin A; Dogné, Jean-Michel; Jarman, John; Laborda, Francisco; Laloy, Julie; Robinson, Kenneth N; Undas, Anna K; van der Zande, Meike

    2016-01-01

Solubility is an important physicochemical parameter in nanoregulation. If a nanomaterial is completely soluble, then from a risk assessment point of view its disposal can be treated in much the same way as that of "ordinary" chemicals, which will simplify testing and characterisation regimes. This review assesses potential techniques for the measurement of nanomaterial solubility and evaluates their performance against a set of analytical criteria (based on satisfying the requirements governed by the cosmetics regulation, as well as the need to quantify the concentration of free (hydrated) ions). Our findings show that no universal method exists. A complementary approach is thus recommended, comprising an atomic spectrometry-based method in conjunction with an electrochemical (or colorimetric) method. This article shows that although some techniques are more commonly used than others, a huge research gap remains related to the need to ensure data reliability.

  13. Terrain modeling for microwave landing system

    NASA Technical Reports Server (NTRS)

    Poulose, M. M.

    1991-01-01

    A powerful analytical approach for evaluating the terrain effects on a microwave landing system (MLS) is presented. The approach combines a multiplate model with a powerful and exhaustive ray tracing technique and an accurate formulation for estimating the electromagnetic fields due to the antenna array in the presence of terrain. Both uniform theory of diffraction (UTD) and impedance UTD techniques have been employed to evaluate these fields. Innovative techniques are introduced at each stage to make the model versatile to handle most general terrain contours and also to reduce the computational requirement to a minimum. The model is applied to several terrain geometries, and the results are discussed.

  14. Supercritical fluid chromatography: a promising alternative to current bioanalytical techniques.

    PubMed

    Dispas, Amandine; Jambo, Hugues; André, Sébastien; Tyteca, Eva; Hubert, Philippe

    2018-01-01

In recent years, chemistry has joined the worldwide effort toward solving environmental problems, leading to the birth of green chemistry. In this context, green analytical tools such as modern supercritical fluid chromatography were developed in the field of separative techniques. This chromatographic technique has experienced a resurgence in the last few years thanks to the high efficiency, speed and robustness of new-generation equipment. These advantages and its easy hyphenation to MS fulfill the requirements of bioanalysis regarding separation capacity and high throughput. In the present paper, the technical aspects focused on bioanalysis specifications are detailed, followed by a critical review of bioanalytical supercritical fluid chromatography methods published in the literature.

  15. A review of microdialysis coupled to microchip electrophoresis for monitoring biological events

    PubMed Central

    Saylor, Rachel A.; Lunte, Susan M.

    2015-01-01

Microdialysis is a powerful sampling technique that enables monitoring of dynamic processes in vitro and in vivo. The combination of microdialysis with chromatographic or electrophoretic separations, along with selective detection methods, yields a "separation-based sensor" capable of monitoring multiple analytes in near real time. Analysis of microdialysis samples requires techniques that are fast (<1 min), have low volume requirements (nL-pL), and, ideally, can be employed on-line. Microchip electrophoresis fulfills these requirements and also permits the possibility of integrating sample preparation and manipulation with detection strategies directly on-chip. Microdialysis coupled to microchip electrophoresis has been employed for monitoring biological events in vivo and in vitro. This review discusses technical considerations for coupling microdialysis sampling and microchip electrophoresis, including various interface designs, and current applications in the field. PMID:25637011

  16. Multiplexed Paper Analytical Device for Quantification of Metals using Distance-Based Detection

    PubMed Central

    Cate, David M.; Noblitt, Scott D.; Volckens, John; Henry, Charles S.

    2015-01-01

Exposure to metal-containing aerosols has been linked with adverse health outcomes for almost every organ in the human body. Commercially available techniques for quantifying particulate metals are time-intensive, laborious, and expensive; the cost of sample analysis often exceeds $100. We report a simple technique, based upon a distance-based detection motif, for quantifying metal concentrations of Ni, Cu, and Fe in airborne particulate matter using microfluidic paper-based analytical devices. Paper substrates are used to create sensors that are self-contained, self-timing, and require only a drop of sample for operation. Unlike other colorimetric approaches in paper microfluidics that rely on optical instrumentation for analysis, with distance-based detection the analyte is quantified visually based on the distance of a colorimetric reaction, similar to reading temperature on a thermometer. To demonstrate the effectiveness of this approach, Ni, Cu, and Fe were measured individually in single-channel devices; detection limits as low as 0.1, 0.1, and 0.05 µg were obtained for Ni, Cu, and Fe, respectively. Multiplexed analysis of all three metals was achieved with detection limits of 1, 5, and 1 µg for Ni, Cu, and Fe. We also extended the dynamic range for multi-analyte detection by printing concentration gradients of colorimetric reagents using an off-the-shelf inkjet printer. Analyte selectivity was demonstrated for common interferences. To demonstrate the utility of the method, Ni, Cu, and Fe were measured in samples of certified welding fume; levels measured with the paper sensors matched known values determined gravimetrically. PMID:26009988

  17. Rotor vibration caused by external excitation and rub

    NASA Technical Reports Server (NTRS)

    Matsushita, O.; Takagi, M.; Kikuchi, K.; Kaga, M.

    1982-01-01

For turbomachinery with low natural frequencies, consideration has recently been required of rotor vibrations caused by external forces other than unbalance, such as foundation motion, seismic waves, rub, and so forth. Such forced vibration is investigated analytically and experimentally in the present paper. Vibrations in a rotor-bearing system under harmonic excitation are analyzed by the modal technique in the case of a linear system including the gyroscopic effect. For a nonlinear system, a new and powerful quasi-modal technique is developed and applied to the vibration caused by rub.

  18. Multiclass Bayes error estimation by a feature space sampling technique

    NASA Technical Reports Server (NTRS)

    Mobasseri, B. G.; Mcgillem, C. D.

    1979-01-01

A general Gaussian M-class N-feature classification problem is defined. An algorithm is developed that requires the class statistics as its only input and computes the minimum probability of error through use of a combined analytical and numerical integration over a sequence of simplifying transformations of the feature space. The results are compared with those obtained by conventional techniques applied to a 2-class 4-feature discrimination problem with previously reported results, and to 4-class 4-feature multispectral scanner Landsat data classified by training and testing of the available data.
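
    The quantity being computed can be sanity-checked without the paper's combined analytical-numerical integration: given only the class statistics, a brute-force Monte Carlo estimate of the Bayes (minimum) error draws samples from each Gaussian class and counts how often the maximum-posterior decision is wrong. The two-class, four-feature statistics below are assumptions for illustration, not the paper's problem.

    ```python
    # Monte Carlo estimate of the Bayes error for Gaussian classes, given
    # only the class statistics (means, covariances, priors). A brute-force
    # stand-in for the paper's analytical/numerical integration.
    import numpy as np
    from scipy.stats import multivariate_normal

    rng = np.random.default_rng(0)

    # Assumed 2-class, 4-feature statistics (illustrative only).
    means = [np.zeros(4), np.full(4, 1.0)]
    covs = [np.eye(4), 2.0 * np.eye(4)]
    priors = [0.5, 0.5]

    n = 200_000
    errors = 0
    for k, (m, c, p) in enumerate(zip(means, covs, priors)):
        x = rng.multivariate_normal(m, c, size=int(n * p))
        # Bayes rule: pick the class with the largest prior-weighted density.
        dens = np.column_stack([
            pr * multivariate_normal(mu, cv).pdf(x)
            for mu, cv, pr in zip(means, covs, priors)
        ])
        errors += np.sum(dens.argmax(axis=1) != k)

    print(f"estimated Bayes error: {errors / n:.4f}")
    ```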

  19. Influence versus intent for predictive analytics in situation awareness

    NASA Astrophysics Data System (ADS)

    Cui, Biru; Yang, Shanchieh J.; Kadar, Ivan

    2013-05-01

Predictive analytics in situation awareness requires an element to comprehend and anticipate potential adversary activities that might occur in the future. Most work in high level fusion or predictive analytics utilizes machine learning, pattern mining, Bayesian inference, and decision tree techniques to predict future actions or states. The emergence of social computing in broader contexts has drawn interest in bringing the hypotheses and techniques of social theory to algorithmic and computational settings for predictive analytics. This paper aims at answering the question of how the influence and attitude (sometimes interpreted as intent) of adversarial actors can be formulated and computed algorithmically, as a higher level fusion process, to provide predictions of future actions. The challenges in this interdisciplinary endeavor include drawing on existing understanding of influence and attitude in both the social science and computing fields, as well as the mathematical and computational formulation for the specific context of the situation to be analyzed. The study of 'influence' has resurfaced in recent years due to the emergence of social networks in the virtualized cyber world. Theoretical analyses and techniques developed in this area are discussed in this paper in the context of predictive analysis. Meanwhile, the notion of intent, or 'attitude' in social theory terminology, is a relatively uncharted area in the computing field. Note that a key objective of predictive analytics is to identify impending/planned attacks so that their 'impact' and 'threat' can be prevented. In this spirit, indirect and direct observables are drawn and derived to infer the influence network and attitude and to predict future threats. This work proposes an integrated framework that jointly assesses adversarial actors' influence network and their attitudes as a function of past actions and action outcomes. A preliminary set of algorithms is developed and tested using the Global Terrorism Database (GTD). Our results reveal the benefits of performing joint predictive analytics with both attitude and influence. At the same time, we discover significant challenges in deriving influence and attitude from indirect observables for diverse adversarial behavior. These observations warrant further investigation of the optimal use of influence and attitude for predictive analytics, as well as the potential inclusion of other environmental or capability elements for the actors.

  20. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

Most drinking water industries are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is one of the promising disinfectants that is usually used as a secondary disinfectant, whereas the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) in order to determine the superior technique. The commonly available commercial analytical techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior technique was determined with reference to pre-defined criteria. To discern the effectiveness of this superior technique, various factors that might influence its performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, the chronoamperometry technique showed a significant level of accuracy and precision. Furthermore, the various influencing factors studied did not diminish the technique's performance, which was adequate in all matrices. This study is a step towards proper disinfection monitoring, and it confidently assists engineers with chlorine dioxide disinfection system planning and management.
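
    A minimal sketch of the comparison logic, with invented replicate readings: accuracy is expressed as relative bias against a reference concentration, and precision as relative standard deviation. The numbers below are not the study's data.

    ```python
    # Accuracy/precision comparison on synthetic replicate readings.
    # Reference value and all measurements are invented for illustration.
    import numpy as np

    reference = 1.00  # assumed reference ClO2 concentration, mg/L
    readings = {      # assumed replicate measurements per technique
        "chronoamperometry":      [0.99, 1.01, 1.00, 0.98, 1.02],
        "DPD":                    [1.08, 1.12, 1.05, 1.10, 1.07],
        "LGB":                    [0.93, 0.97, 0.91, 0.95, 0.94],
        "amperometric titration": [1.03, 0.97, 1.06, 0.94, 1.05],
    }

    for name, vals in readings.items():
        v = np.asarray(vals)
        bias = 100 * (v.mean() - reference) / reference   # accuracy, % bias
        rsd = 100 * v.std(ddof=1) / v.mean()              # precision, % RSD
        print(f"{name:>22s}: bias {bias:+5.1f}%  RSD {rsd:4.1f}%")
    ```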

  1. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effects of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three analytical techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations were found between the two spectroscopic techniques and HPLC. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other, invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. A strategy to determine operating parameters in tissue engineering hollow fiber bioreactors

    PubMed Central

    Shipley, RJ; Davidson, AJ; Chan, K; Chaudhuri, JB; Waters, SL; Ellis, MJ

    2011-01-01

The development of tissue engineering hollow fiber bioreactors (HFB) requires the optimal design of the geometry and operating parameters of the system. This article provides a strategy for specifying operating conditions for the system based on mathematical models of oxygen delivery to the cell population. Analytical and numerical solutions of these models are developed based on Michaelis–Menten kinetics. Depending on the minimum oxygen concentration required to culture a functional cell population, together with the oxygen uptake kinetics, the strategy dictates the model needed to describe mass transport so that the operating conditions can be defined. If cmin ≫ Km, we capture oxygen uptake using zero-order kinetics and proceed analytically. This enables operating equations to be developed that allow the user to choose the medium flow rate, lumen length, and ECS depth to provide a prescribed value of cmin. When cmin is comparable to Km, we use numerical techniques to solve the full Michaelis–Menten kinetics and present operating data for the bioreactor. The strategy presented utilizes both analytical and numerical approaches and can be applied to any cell type with known oxygen transport properties and uptake kinetics. PMID:21370228
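
    A hedged numerical sketch of the kinetic distinction the strategy turns on: for a well-mixed compartment, Michaelis–Menten uptake is effectively zero order while c ≫ Km, and the two models diverge as the concentration approaches Km. All parameter values below are assumptions; the paper itself couples uptake to oxygen transport through the hollow fiber geometry.

    ```python
    # Zero-order vs full Michaelis-Menten uptake in a well-mixed compartment,
    # dc/dt = -V_max * c / (K_m + c). Illustrative parameters only.
    import numpy as np
    from scipy.integrate import solve_ivp

    V_max, K_m = 1.0e-2, 5.0e-3        # assumed uptake parameters
    c0 = 0.2                           # assumed initial oxygen concentration

    def uptake(t, c):
        return -V_max * c / (K_m + c)

    sol = solve_ivp(uptake, (0.0, 25.0), [c0], dense_output=True)

    t = np.linspace(0.0, 25.0, 6)
    c_mm = sol.sol(t)[0]
    c_zero = np.maximum(c0 - V_max * t, 0.0)   # zero-order limit, valid while c >> K_m
    for ti, cm, cz in zip(t, c_mm, c_zero):
        print(f"t = {ti:5.1f}  Michaelis-Menten: {cm:.4f}  zero-order: {cz:.4f}")
    ```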

3. Environmental and human monitoring of Americium-241 utilizing extraction chromatography and α-Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, S.J.; Hensley, C.A.; Armenta, C.E.

    1997-03-01

Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for α-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of 'real' environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of ≈2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously. 24 refs., 2 figs., 2 tabs.

  4. The potential of SNP-based PCR-RFLP capillary electrophoresis analysis to authenticate and detect admixtures of Mediterranean olive oils.

    PubMed

    Bazakos, Christos; Khanfir, Emna; Aoun, Mariem; Spano, Thodhoraq; Zein, Zeina El; Chalak, Lamis; Riachy, Milad El; Abou-Sleymane, Gretta; Ali, Sihem Ben; Grati Kammoun, Naziha; Kalaitzis, Panagiotis

    2016-07-01

Authentication and traceability of extra virgin olive oil is a challenging research task due to the complexity of fraudulent practices. In this context, monovarietal olive oils of Protected Designation of Origin (PDO) and Protected Geographical Indication (PGI) require new tests and cutting-edge analytical technologies to detect mislabeling and misleading origin. Toward this direction, DNA-based technologies could serve as an assay complementary to the analytical techniques. Single nucleotide polymorphisms are ideal molecular markers since they require short PCR analytical targets, which are a prerequisite for forensic applications in the olive oil sector. In the present study, a small number of polymorphic SNPs were used with an SNP-based PCR-RFLP capillary electrophoresis platform to discriminate six out of 13 monovarietal olive oils of Mediterranean origin from three different countries: Greece, Tunisia, and Lebanon. Moreover, the high sensitivity of capillary electrophoresis in combination with the DNA extraction protocol lowered the limit of detection to 10% in an admixture of Tsounati in a Koroneiki olive oil matrix. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of the Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without including the perturbations, and then modified to include them. The modification is based on (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence, including the perturbations, and (ii) quantification of deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.

  6. Commodity-Free Calibration

    NASA Technical Reports Server (NTRS)

    2008-01-01

Commodity-free calibration is a reaction rate calibration technique that does not require the addition of any commodities. This technique is a specific form of the reaction rate technique, where all of the necessary reactants, other than the sample being analyzed, are either inherent in the analyzing system or specifically added or provided to the system for a reason other than calibration. After introduction, the component of interest is exposed to other reactants or flow paths already present in the system. The instrument detector records one of the following to determine the rate of reaction: the increase in the response of the reaction product, a decrease in the signal of the analyte response, or a decrease in the signal from the inherent reactant. With these data, the initial concentration of the analyte is calculated. This type of system can analyze and calibrate simultaneously, reduce the risk of false positives and exposure to toxic vapors, and improve accuracy. Moreover, having an excess of the reactant already present in the system eliminates the need to add commodities, which further reduces cost, logistic problems, and potential contamination. Also, the calculations involved can be simplified by comparison to those of the reaction rate technique. We conducted tests with hypergols as an initial investigation into the feasibility of the technique.
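
    As a toy illustration of how an initial concentration can be recovered from a reaction rate alone, assume pseudo-first-order conditions (the inherent reactant in large excess): the analyte signal then decays exponentially, and a fit of log(signal) versus time returns both the rate constant and the initial signal, without any added calibration standard. The kinetic model and all numbers here are hypothetical, not from the NASA tests.

    ```python
    # Toy reaction-rate calibration: with the co-reactant in large excess,
    # analyte decay is pseudo-first-order, s(t) = s0 * exp(-k t). Fitting
    # log(s) vs t recovers s0 (hence the initial concentration) with no
    # external standard. All numbers are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    k_true, s0_true = 0.30, 5.0          # assumed rate constant (1/s), initial signal
    t = np.linspace(0.5, 10, 20)
    signal = s0_true * np.exp(-k_true * t) * (1 + 0.02 * rng.standard_normal(t.size))

    slope, intercept = np.polyfit(t, np.log(signal), 1)
    print(f"fitted k  = {-slope:.3f} 1/s (true {k_true})")
    print(f"fitted s0 = {np.exp(intercept):.3f}    (true {s0_true})")
    ```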

  7. Service Bundle Recommendation for Person-Centered Care Planning in Cities.

    PubMed

    Kotoulas, Spyros; Daly, Elizabeth; Tommasi, Pierpaolo; Kishimoto, Akihiro; Lopez, Vanessa; Stephenson, Martin; Botea, Adi; Sbodio, Marco; Marinescu, Radu; Rooney, Ronan

    2016-01-01

Providing appropriate support for the most vulnerable individuals carries enormous societal significance and economic burden. Yet finding the right balance between costs, estimated effectiveness and the experience of the care recipient is a daunting task that requires considering vast amounts of information. We present a system that helps care teams choose the optimal combination of providers for a set of services. We draw from techniques in Open Data processing, semantic processing, faceted exploration, visual analytics, transportation analytics and multi-objective optimization. We present an implementation of the system using data from New York City and illustrate the feasibility of these technologies to guide care workers in care planning.

  8. Motion Estimation and Compensation Strategies in Dynamic Computerized Tomography

    NASA Astrophysics Data System (ADS)

    Hahn, Bernadette N.

    2017-12-01

A main challenge in computerized tomography consists in imaging moving objects. Temporal changes during the measuring process lead to inconsistent data sets, and applying standard reconstruction techniques causes motion artefacts which can severely impede reliable diagnostics. Therefore, novel reconstruction techniques are required which compensate for the dynamic behavior. This article builds on recent results from a microlocal analysis of the dynamic setting, which enable us to formulate efficient analytic motion compensation algorithms for contour extraction. Since these methods require information about the dynamic behavior, we further introduce a motion estimation approach which determines parameters of affine and certain non-affine deformations directly from measured motion-corrupted Radon data. Our methods are illustrated with numerical examples for both types of motion.
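
    The data-inconsistency problem is easy to reproduce: if the object changes between projection angles, each sinogram column describes a different object, and standard filtered back-projection yields artefacts. The sketch below (using scikit-image; an illustration of the problem, not the article's compensation method) compares a static scan with one in which the phantom drifts during acquisition.

    ```python
    # Simulate an inconsistent sinogram: each projection angle sees a
    # slightly shifted phantom, so filtered back-projection shows motion
    # artefacts. Illustration only, not the article's algorithm.
    import numpy as np
    from scipy.ndimage import shift
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    phantom = rescale(shepp_logan_phantom(), 0.25)   # small image for speed
    theta = np.linspace(0.0, 180.0, 60, endpoint=False)

    # Static scan: every angle sees the same object.
    sino_static = radon(phantom, theta=theta)

    # Dynamic scan: the object drifts a little between projections.
    cols = []
    for i, ang in enumerate(theta):
        moved = shift(phantom, (0.08 * i, 0.0), order=1)  # slow vertical drift
        cols.append(radon(moved, theta=[ang])[:, 0])
    sino_dynamic = np.stack(cols, axis=1)

    recon_static = iradon(sino_static, theta=theta)
    recon_dynamic = iradon(sino_dynamic, theta=theta)
    err = np.abs(recon_dynamic - recon_static).mean()
    print(f"mean reconstruction difference due to motion: {err:.4f}")
    ```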

  9. Clustering Patterns of Engagement in Massive Open Online Courses (MOOCs): The Use of Learning Analytics to Reveal Student Categories

    ERIC Educational Resources Information Center

    Khalil, Mohammad; Ebner, Martin

    2017-01-01

Massive Open Online Courses (MOOCs) are remote courses that excel in their students' heterogeneity and quantity. Due to this massiveness, the large datasets generated by MOOC platforms require advanced tools and techniques to reveal hidden patterns for purposes of enhancing learning and educational behaviors. This publication…
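
    As a generic illustration of the clustering idea (the features, cluster count, and data below are invented, not the publication's), k-means applied to per-student engagement counts separates plausible categories such as active learners, auditors, and dropouts:

    ```python
    # Minimal sketch of clustering MOOC engagement records to reveal
    # student categories. Synthetic data and assumed features only.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(2)
    # Per-student features: [videos watched, quizzes taken, forum posts]
    active   = rng.poisson([40, 12, 8],   size=(60, 3))
    auditors = rng.poisson([35, 1, 0.5],  size=(80, 3))
    dropouts = rng.poisson([4, 0.5, 0.2], size=(120, 3))
    X = np.vstack([active, auditors, dropouts]).astype(float)

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    for k in range(3):
        centre = np.round(km.cluster_centers_[k], 1)
        size = int((km.labels_ == k).sum())
        print(f"cluster {k}: n={size:3d}, mean [videos, quizzes, posts] = {centre}")
    ```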

  10. Implementation of picoSpin Benchtop NMR Instruments into Organic Chemistry Teaching Laboratories through Spectral Analysis of Fischer Esterification Products

    ERIC Educational Resources Information Center

    Yearty, Kasey L.; Sharp, Joseph T.; Meehan, Emma K.; Wallace, Doyle R.; Jackson, Douglas M.; Morrison, Richard W.

    2017-01-01

¹H NMR analysis is an important analytical technique presented in introductory organic chemistry courses. NMR instrument access is limited for undergraduate organic chemistry students due to the size of the instrument, price of NMR solvents, and the maintenance level required for instrument upkeep. The University of Georgia Chemistry…

  11. High frequency flow-structural interaction in dense subsonic fluids

    NASA Technical Reports Server (NTRS)

    Liu, Baw-Lin; Ofarrell, J. M.

    1995-01-01

    Prediction of the detailed dynamic behavior in rocket propellant feed systems and engines and other such high-energy fluid systems requires precise analysis to assure structural performance. Designs sometimes require placement of bluff bodies in a flow passage. Additionally, there are flexibilities in ducts, liners, and piping systems. A design handbook and interactive data base have been developed for assessing flow/structural interactions to be used as a tool in design and development, to evaluate applicable geometries before problems develop, or to eliminate or minimize problems with existing hardware. This is a compilation of analytical/empirical data and techniques to evaluate detailed dynamic characteristics of both the fluid and structures. These techniques have direct applicability to rocket engine internal flow passages, hot gas drive systems, and vehicle propellant feed systems. Organization of the handbook is by basic geometries for estimating Strouhal numbers, added mass effects, mode shapes for various end constraints, critical onset flow conditions, and possible structural response amplitudes. Emphasis is on dense fluids and high structural loading potential for fatigue at low subsonic flow speeds where high-frequency excitations are possible. Avoidance and corrective measure illustrations are presented together with analytical curve fits for predictions compiled from a comprehensive data base.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurley, D.F.; Whitehouse, J.M.

A dedicated low-flow groundwater sample collection system was designed for implementation in a post-closure ACL monitoring program at the Yaworski Lagoon NPL site in Canterbury, Connecticut. The system includes dedicated bladder pumps with intake ports located in the screened interval of the monitoring wells. This sampling technique was implemented in the spring of 1993. The system was designed to simultaneously obtain samples directly from the screened interval of nested wells in three distinct water bearing zones. Sample collection is begun upon stabilization of field parameters. Other than line volume, no prior purging of the well is required. It was found that dedicated low-flow sampling from the screened interval provides a method of representative sample collection without the bias of suspended solids introduced by traditional techniques of pumping and bailing. Analytical data indicate that measured chemical constituents are representative of groundwater migrating through the screened interval. Upon implementation of the low-flow monitoring system, analytical results exhibited a decrease in concentrations of some organic compounds and metals. The system has also proven to be a cost-effective alternative to pumping and bailing, which generate large volumes of purge water requiring containment and disposal.

  13. Analysis of peptides using an integrated microchip HPLC-MS/MS system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirby, Brian J.; Chirica, Gabriela S.; Reichmuth, David S.

Hyphenated LC-MS techniques are quickly becoming the standard tool for proteomic analyses. For large homogeneous samples, bulk processing methods and capillary injection and separation techniques are suitable. However, for analysis of small or heterogeneous samples, techniques that can manipulate picoliter samples without dilution are required or samples will be lost or corrupted; further, static nanospray-type flow rates are required to maximize SNR. Microchip-level integration of sample injection with separation and mass spectrometry allows small-volume analytes to be processed on chip and immediately injected without dilution for analysis. An on-chip HPLC was fabricated using in situ polymerization of both fixed and mobile polymer monoliths. Integration of the chip with a nanospray MS emitter enables identification of peptides by the use of tandem MS. The chip is capable of analyzing very small sample volumes (< 200 pL) in short times (< 3 min).

  14. Individual human cell responses to low doses of chemicals studied by synchrotron infrared spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Holman, Hoi-Ying N.; Goth-Goldstein, Regine; Blakely, Elanor A.; Bjornstad, Kathy; Martin, Michael C.; McKinney, Wayne R.

    2000-05-01

Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR-FTIR microscopy probes intact living cells, providing a composite view of the entire molecular response and the ability to monitor the response over time in the same cell. Observed spectral changes include all types of lesions induced in the cell as well as cellular responses to external and internal stresses. These spectral changes, combined with other analytical tools, may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low doses of chemicals. In this study we used high-spatial-resolution SR-FTIR vibrational spectromicroscopy as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of dioxin. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, biocompatibility of implant materials, cellular repair mechanisms, self-assembly of cellular apparatus, cell differentiation and fetal development.

  15. Analytical toxicology.

    PubMed

    Flanagan, R J; Widdop, B; Ramsey, J D; Loveland, M

    1988-09-01

1. Major advances in analytical toxicology followed the introduction of spectroscopic and chromatographic techniques in the 1940s and early 1950s, and thin layer chromatography remains important together with some spectrophotometric and other tests. However, gas and high-performance liquid chromatography, together with a variety of immunoassay techniques, are now widely used. 2. The scope and complexity of forensic and clinical toxicology continues to increase, although the compounds for which emergency analyses are needed to guide therapy are few. Exclusion of the presence of hypnotic drugs can be important in suspected 'brain death' cases. 3. Screening for drugs of abuse has assumed greater importance not only for the management of the habituated patient, but also in 'pre-employment' and 'employment' screening. The detection of illicit drug administration in sport is also an area of increasing importance. 4. In industrial toxicology, the range of compounds for which blood or urine measurements (so-called 'biological monitoring') can indicate the degree of exposure is increasing. The monitoring of environmental contaminants (lead, chlorinated pesticides) in biological samples has also proved valuable. 5. In the near future a consensus as to the units of measurement to be used is urgently required, and more emphasis will be placed on interpretation, especially as regards possible behavioural effects of drugs or other poisons. Despite many advances in analytical techniques there remains a need for reliable, simple tests to detect poisons for use in smaller hospital and other laboratories.

  16. Characterization of Cyclodextrin/Volatile Inclusion Complexes: A Review.

    PubMed

    Kfoury, Miriana; Landy, David; Fourmentin, Sophie

    2018-05-17

Cyclodextrins (CDs) are a family of cyclic oligosaccharides that constitute one of the most widely used molecular hosts in supramolecular chemistry. Encapsulation in the hydrophobic cavity of CDs positively affects the physical and chemical characteristics of the guests upon the formation of inclusion complexes. This property is usefully employed to retain volatile guests and reduce their volatility. Within this scope, the crucial starting point for a suitable and careful characterization of an inclusion complex is to assess the value of the formation constant (Kf), also called the stability or binding constant. This task requires the application of an appropriate analytical method and technique. Thus, the aim of the present paper is to give a general overview of the main analytical tools used for the determination of Kf values for CD/volatile inclusion complexes. This review emphasizes the advantages, drawbacks and limits of each applied method. Special attention is also dedicated to the improvement of the current methods and to the development of new techniques. Further, the applicability of each technique is illustrated by a summary of data obtained from the literature.
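
    For a sense of how a Kf value is extracted in practice, the sketch below fits a generic 1:1 binding isotherm to synthetic headspace-type data: with CD in excess, the free-guest fraction is 1/(1 + Kf[CD]), so a signal proportional to the free volatile guest follows S = S0/(1 + Kf[CD]). The data points are invented and the model is the generic 1:1 case, not a result from the review.

    ```python
    # Fit a 1:1 host-guest binding isotherm S = S0 / (1 + Kf*[CD]) to
    # synthetic signal data to estimate the formation constant Kf.
    import numpy as np
    from scipy.optimize import curve_fit

    cd = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0]) * 1e-3      # CD conc. (mol/L)
    signal = np.array([100.0, 71.5, 55.3, 38.6, 23.9, 13.5])  # invented responses

    def isotherm(c, s0, kf):
        return s0 / (1.0 + kf * c)

    (s0, kf), _ = curve_fit(isotherm, cd, signal, p0=(100.0, 500.0))
    print(f"fitted Kf = {kf:.0f} L/mol, S0 = {s0:.1f}")
    ```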

  17. Engineering fluidic delays in paper-based devices using laser direct-writing.

    PubMed

    He, P J W; Katis, I N; Eason, R W; Sones, C L

    2015-10-21

    We report the use of a new laser-based direct-write technique that allows programmable and timed fluid delivery in channels within a paper substrate which enables implementation of multi-step analytical assays. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depth and/or the porosity of hydrophobic barriers which, when fabricated in the fluid path, produce controllable fluid delay. We have patterned these flow delaying barriers at pre-defined locations in the fluidic channels using either a continuous wave laser at 405 nm, or a pulsed laser operating at 266 nm. Using this delay patterning protocol we generated flow delays spanning from a few minutes to over half an hour. Since the channels and flow delay barriers can be written via a common laser-writing process, this is a distinct improvement over other methods that require specialist operating environments, or custom-designed equipment. This technique can therefore be used for rapid fabrication of paper-based microfluidic devices that can perform single or multistep analytical assays.

  18. Depth-resolved monitoring of analytes diffusion in ocular tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.

    2007-02-01

Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantification of analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied, and their permeability coefficients were calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. The results obtained suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.
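
    A common way such OCT permeability coefficients are computed is from the rate at which the agent front advances in depth: the permeability coefficient is taken as penetration depth divided by time, i.e., the slope of a depth-versus-time fit. The depth readings below are invented for illustration.

    ```python
    # Permeability from OCT-style front tracking: fit front depth vs time
    # and take the slope as the permeability coefficient. Readings invented.
    import numpy as np

    t = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0]) * 60.0    # time (s)
    z = np.array([0.0, 21.0, 44.0, 63.0, 86.0, 105.0]) * 1e-6  # front depth (m)

    slope, intercept = np.polyfit(t, z, 1)   # slope in m/s
    print(f"estimated permeability coefficient: {slope * 1e2:.2e} cm/s")
    ```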

  19. [ABOUT UNIFICATION OF LABORATORY CRITERIA OF DIFFERENTIATION OF BACTERIAL VAGINOSIS].

    PubMed

    Mavzutov, A R; Tsvetkova, A V; Muretdinova, L A

    2015-06-01

The article presents an analysis of the laboratory criteria and classifications used to interpret the results of microscopy-based laboratory testing for bacterial vaginosis (dysbacteriosis of the vagina). Their advantages and restrictions are demonstrated. Unified evaluation criteria for the results of microscopy of vaginal mucosal discharge are proposed, together with a corresponding classification. Accordingly, three degrees of bacterial vaginosis (dysbacteriosis of the vagina) are differentiated: first degree, compensated dysbacteriosis of the vagina; second degree, subcompensated dysbacteriosis of the vagina; and third degree, decompensated dysbacteriosis of the vagina. The corresponding laboratory physician's report is formulated. Proposals are presented concerning the development of common unified requirements for the stages (pre-analytical, analytical, post-analytical) of laboratory diagnosis of bacterial vaginosis (dysbacteriosis of the vagina), with the purpose of their unambiguous understanding by clinicians and hence their decision making concerning the necessity and tactics of patient management.

  20. A Review of Interface Electronic Systems for AT-cut Quartz Crystal Microbalance Applications in Liquids

    PubMed Central

    Arnau, Antonio

    2008-01-01

Since the first applications of AT-cut quartz crystals as sensors in solutions more than 20 years ago, the so-called quartz crystal microbalance (QCM) sensor has become a good alternative analytical method in a great number of applications, such as biosensors, analysis of biomolecular interactions, study of bacterial adhesion at specific interfaces, pathogen and microorganism detection, study of polymer film-biomolecule or cell-substrate interactions, and immunosensors, with extensive use in fluid and polymer characterization and electrochemical applications, among others. The appropriate evaluation of this analytical method requires recognizing the different steps involved and being aware of their importance and limitations. The first step involved in a QCM system is the accurate and appropriate characterization of the sensor in relation to the specific application. The use of the piezoelectric sensor in contact with solutions strongly affects its behavior, and appropriate electronic interfaces must be used for adequate sensor characterization. Systems based on different principles and techniques have been implemented during the last 25 years. The interface selection for the specific application is important, and its limitations must be known in order to judge its suitability and to avoid error propagation in the interpretation of results. This article presents a comprehensive overview of the different techniques used for AT-cut quartz crystal microbalances in in-solution applications, which are based on the following principles: network or impedance analyzers, decay methods, oscillators, and lock-in techniques. The electronic interfaces based on oscillators and phase-locked techniques are treated in detail, with a description of different configurations, since these techniques are the most used in applications for detection of analytes in solutions, and in those where a fast sensor response is necessary. PMID:27879713

  1. Moving your laboratories to the field – Advantages and limitations of the use of field portable instruments in environmental sample analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gałuszka, Agnieszka, E-mail: Agnieszka.Galuszka@ujk.edu.pl; Migaszewski, Zdzisław M.; Namieśnik, Jacek

The recent rapid progress in the technology of field portable instruments has increased their application in environmental sample analysis. These instruments offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet–visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis offers the possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on instrument performance, and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. - Highlights: • Field portable instruments are widely used in environmental sample analysis. • Field portable instruments are indispensable for analysis in emergency response. • Miniaturization of field portable instruments reduces resource consumption. • In situ analysis is in agreement with green analytical chemistry principles. • Performance requirements in field analysis stimulate technological progress.

  2. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  3. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  4. Orbital Transfer Vehicle Engine Technology High Velocity Ratio Diffusing Crossover

    NASA Technical Reports Server (NTRS)

    Lariviere, Brian W.

    1992-01-01

High-speed, high-efficiency, high-head-rise multistage pumps require continuous-passage diffusing crossovers to effectively convey the pumped fluid from the exit of one impeller to the inlet of the next. On Rocketdyne's Orbital Transfer Vehicle (OTV) engine, the MK49-F, a three-stage high-pressure liquid hydrogen turbopump, utilizes a 6.23-velocity-ratio diffusing crossover. This velocity ratio approaches the diffusion limits for stable and efficient flow over the operating conditions required by the OTV system. The design of the high-velocity-ratio diffusing crossover was based on advanced analytical techniques anchored by previous tests of stationary two-dimensional diffusers with steady flow. To substantiate the design and the analytical techniques, tests were required with the unsteady whirling characteristics produced by an impeller. A tester was designed and fabricated using a 2.85-times scale model of the first stage of the MK49-F turbopump, including the inducer, impeller, and diffusing crossover. Water and air tests were completed to evaluate the effects of large-scale turbulence, non-uniform velocity, and non-steady velocity on the pump and crossover head and efficiency. Suction performance tests from 80 percent to 124 percent of design flow were completed in water to assess these pump characteristics. Pump and diffuser performance from the water and air tests were compared with the actual MK49-F test data in liquid hydrogen.

  5. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    NASA Astrophysics Data System (ADS)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand, while also allowing on-site forensic investigation, is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad adoption by forensic practitioners can be considered, and these are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability, in the form of false positive/negative response rates, is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.

  6. Bio-analytical applications of microbial fuel cell-based biosensors for onsite water quality monitoring.

    PubMed

    ElMekawy, A; Hegab, H M; Pant, D; Saint, C P

    2018-01-01

Globally, the sustainable provision of high-quality safe water is a major challenge of the 21st century. Various chemical and biological monitoring techniques are presently utilized to guarantee the availability of high-quality water. However, these techniques still face some challenges, including high costs, complex design, and onsite and online limitations. The recent technology of microbial fuel cell (MFC)-based biosensors holds outstanding potential for the rapid and real-time monitoring of water source quality. MFCs have the advantages of simplicity in design and efficiency for onsite sensing. Even though some sensing applications of MFCs, e.g. biochemical oxygen demand sensors, have been studied previously, numerous research groups around the world have recently presented new practical applications of this technique, which combine multidisciplinary scientific knowledge from the materials science, microbiology and electrochemistry fields. This review presents the most up-to-date research on the utilization of MFCs as potential biosensors for monitoring water quality and considers the range of potentially toxic analytes that have so far been detected using this methodology. The advantages of MFCs over established technology are also considered, as well as the future work required to establish their routine use. © 2017 The Society for Applied Microbiology.

  7. In-orbit evaluation of the control system/structural mode interactions of the OSO-8 spacecraft

    NASA Technical Reports Server (NTRS)

    Slafer, L. I.

    1979-01-01

The Orbiting Solar Observatory-8 experienced severe structural mode/control loop interaction problems during spacecraft development. Extensive analytical studies, using the hybrid coordinate modeling approach, and comprehensive ground testing were carried out in order to achieve the system's precision pointing performance requirements. A recent series of flight tests was conducted with the spacecraft in which a wide-bandwidth, high-resolution telemetry system was utilized to evaluate the on-orbit flexible dynamics characteristics of the vehicle along with the control system performance. The paper describes the results of these tests, reviewing the basic design problem, the analytical approach taken, the ground test philosophy, and the on-orbit testing. Data from the tests were used to determine the primary mode frequency, damping, and servo coupling dynamics for the on-orbit condition. Additionally, the test results have verified analytically predicted differences between the on-orbit and ground test environments, and have led to a validation of both the analytical modeling and the servo design techniques used during the development of the control system.

  8. Hierarchical zwitterionic modification of a SERS substrate enables real-time drug monitoring in blood plasma

    PubMed Central

    Sun, Fang; Hung, Hsiang-Chieh; Sinclair, Andrew; Zhang, Peng; Bai, Tao; Galvan, Daniel David; Jain, Priyesh; Li, Bowen; Jiang, Shaoyi; Yu, Qiuming

    2016-01-01

    Surface-enhanced Raman spectroscopy (SERS) is an ultrasensitive analytical technique with molecular specificity, making it an ideal candidate for therapeutic drug monitoring (TDM). However, in critical diagnostic media including blood, nonspecific protein adsorption coupled with weak surface affinities and small Raman activities of many analytes hinder the TDM application of SERS. Here we report a hierarchical surface modification strategy, first by coating a gold surface with a self-assembled monolayer (SAM) designed to attract or probe for analytes and then by grafting a non-fouling zwitterionic polymer brush layer to effectively repel protein fouling. We demonstrate how this modification can enable TDM applications by quantitatively and dynamically measuring the concentrations of several analytes—including an anticancer drug (doxorubicin), several TDM-requiring antidepressant and anti-seizure drugs, fructose and blood pH—in undiluted plasma. This hierarchical surface chemistry is widely applicable to many analytes and provides a generalized platform for SERS-based biosensing in complex real-world media. PMID:27834380

  9. SFC-MS/MS as an orthogonal technique for improved screening of polar analytes in anti-doping control.

    PubMed

    Parr, Maria Kristina; Wuest, Bernhard; Naegele, Edgar; Joseph, Jan F; Wenzel, Maxi; Schmidt, Alexander H; Stanic, Mijo; de la Torre, Xavier; Botrè, Francesco

    2016-09-01

    HPLC is considered the method of choice for the separation of various classes of drugs. However, some analytes are still challenging, as HPLC shows limited resolution capabilities for highly polar analytes, which interact insufficiently with conventional reversed-phase (RP) columns. Especially in combination with mass spectrometric detection, limitations apply to alterations of stationary phases. Some highly polar sympathomimetic drugs and their metabolites showed almost no retention on different RP columns. Their retention remains poor even on phenylhexyl phases that show different selectivity due to π-π interactions. Supercritical fluid chromatography (SFC), as a separation technique orthogonal to HPLC, may help to overcome these issues. Selected polar drugs and metabolites were analyzed utilizing SFC separation. All compounds showed sharp peaks and good retention, even for the very polar analytes such as sulfoconjugates. Retention times and elution orders in SFC differ from both RP and HILIC separations as a result of this orthogonality. Short cycle times could be realized. As temperature and pressure strongly influence the polarity of supercritical fluids, precise regulation of temperature and backpressure is required for the stability of the retention times. As CO2 is the main constituent of the mobile phase in SFC, solvent consumption and solvent waste are considerably reduced. Graphical abstract: SFC-MS/MS vs. LC-MS/MS.

  10. Analytical electron microscopy in mineralogy; exsolved phases in pyroxenes

    USGS Publications Warehouse

    Nord, G.L.

    1982-01-01

    Analytical scanning transmission electron microscopy has been successfully used to characterize the structure and composition of lamellar exsolution products in pyroxenes. At operating voltages of 100 and 200 keV, microanalytical techniques of X-ray energy analysis, convergent-beam electron diffraction, and lattice imaging have been used to chemically and structurally characterize exsolution lamellae only a few unit cells wide. Quantitative X-ray energy analysis using ratios of peak intensities has been adopted for the U.S. Geological Survey AEM in order to study the compositions of exsolved phases and changes in compositional profiles as a function of time and temperature. The quantitative analysis procedure involves 1) removal of instrument-induced background, 2) reduction of contamination, and 3) measurement of correction factors obtained from a wide range of standard compositions. The peak-ratio technique requires that the specimen thickness at the point of analysis be thin enough to make absorption corrections unnecessary (i.e., to satisfy the "thin-foil criteria"). In pyroxenes, the calculated "maximum thicknesses" range from 130 to 1400 nm for the ratios Mg/Si, Fe/Si, and Ca/Si; these "maximum thicknesses" have been contoured in pyroxene composition space as a guide during analysis. Analytical spatial resolutions of 50-100 nm have been achieved in AEM at 200 keV from the composition-profile studies, and analytical reproducibility in AEM from homogeneous pyroxene standards is ± 1.5 mol% endmember. © 1982.
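
    A minimal sketch of the peak-ratio quantification described above, in the Cliff-Lorimer thin-film form C_X/C_Si = k_(X/Si) * (I_X/I_Si), assuming the thin-foil criterion holds so absorption corrections can be neglected; the intensities and k-factors below are invented placeholders, not USGS calibration values:

      # Thin-film peak-ratio quantification (Cliff-Lorimer form); all
      # numbers are illustrative, not instrument calibration data.
      def atomic_ratio(i_x, i_si, k_x_si):
          """Concentration ratio C_X/C_Si from background-subtracted peak intensities."""
          return k_x_si * (i_x / i_si)

      peaks = {"Mg": 5200.0, "Fe": 3100.0, "Ca": 2800.0, "Si": 9800.0}  # counts
      k_factors = {"Mg": 1.1, "Fe": 1.4, "Ca": 1.0}                     # k_X/Si, assumed

      for element, k in k_factors.items():
          print(f"{element}/Si = {atomic_ratio(peaks[element], peaks['Si'], k):.3f}")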

  11. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation has been introduced to solve problems in the form of systems. By using this technique the following two types of problem can be overcome. First, a problem that has an analytical solution, but for which running a physical experiment is costly in terms of money and lives. Second, a problem that exists but has no analytical solution. In the field of statistical inference the second problem is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutation tests to form a pseudo sampling distribution that will lead to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, being used to verify analytical solutions in inference. This paper also discusses resampling techniques as simulation techniques. The misunderstandings about these two techniques are examined. The successful uses of both techniques are also explained.
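
    To make the resampling idea concrete, the sketch below forms a bootstrap pseudo sampling distribution for a statistic with no convenient analytical sampling distribution (a trimmed mean) and reads off a percentile confidence interval; the data are synthetic and all parameter choices are illustrative:

      import numpy as np

      rng = np.random.default_rng(0)
      sample = rng.exponential(scale=2.0, size=50)   # stand-in observed data

      def trimmed_mean(x, frac=0.1):
          """Mean after discarding a fraction `frac` of values from each end."""
          x = np.sort(x)
          k = int(len(x) * frac)
          return x[k:len(x) - k].mean()

      # Resample with replacement to build the pseudo sampling distribution
      boot = np.array([trimmed_mean(rng.choice(sample, size=sample.size, replace=True))
                       for _ in range(5000)])
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"95% bootstrap CI for the trimmed mean: ({lo:.3f}, {hi:.3f})")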

  12. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME) and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  13. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
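
    As a hedged illustration of the probability-time treatment sketched above: in its steady-state form, operational availability is commonly written A_o = MTBM / (MTBM + MDT), the ratio of mean time between maintenance to total cycle time. The Monte Carlo sketch below estimates A_o for assumed maintenance distributions; the distributions and all parameter values are illustrative, not taken from the report:

      import numpy as np

      rng = np.random.default_rng(1)

      # Assumed (illustrative) distributions: exponential up-time between
      # maintenance events, lognormal maintenance down-time, in hours.
      mtbm, mdt_mu, mdt_sigma = 400.0, 1.5, 0.6
      up = rng.exponential(mtbm, size=100_000)
      down = rng.lognormal(mdt_mu, mdt_sigma, size=100_000)

      a_o = up.sum() / (up.sum() + down.sum())
      print(f"Estimated operational availability: {a_o:.4f}")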

  14. Need total sulfur content? Use chemiluminescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kubala, S.W.; Campbell, D.N.; DiSanzo, F.P.

    Regulations issued by the United States Environmental Protection Agency require petroleum refineries to reduce or control the amount of total sulfur present in their refined products. These legislative requirements have led many refineries to search for online instrumentation that can produce accurate and repeatable total sulfur measurements within allowed levels. Several analytical methods currently exist to measure total sulfur content. They include X-ray fluorescence (XRF), microcoulometry, lead acetate tape, and pyrofluorescence techniques. Sulfur-specific chemiluminescence detection (SSCD) has recently received much attention due to its linearity, selectivity, sensitivity, and equimolar response. However, its use has been largely confined to the area of gas chromatography. This article focuses on the special design considerations and analytical utility of an SSCD system developed to determine total sulfur content in gasoline. The system exhibits excellent linearity and selectivity, the ability to detect low minimum levels, and an equimolar response to various sulfur compounds. 2 figs., 2 tabs.

  15. Stabilization of glucose-oxidase in the graphene paste for screen-printed glucose biosensor

    NASA Astrophysics Data System (ADS)

    Pepłowski, Andrzej; Janczak, Daniel; Jakubowska, Małgorzata

    2015-09-01

    Various methods and materials for enzyme stabilization within a screen-printed graphene sensor were analyzed. The main goal was to develop a technology allowing immediate printing of the biosensors in a single printing process. The factors considered were the toxicity of the materials used, the ability of the material to be screen-printed (squeezed through the printing mesh), and the temperatures required in the fabrication process. The performance of the examined sensors was measured by amperometry, and appropriate analysis of the measurements was then conducted. The analysis results were compared with the medical requirements. The parameters calculated were the correlation coefficient between the concentration of the analyte and the measured electrical current (0.986) and the variation coefficient for the particular analyte concentrations used as calibration points. Variation of the measured values was significant only in ranges close to 0, decreasing for the concentrations of clinical importance. These outcomes justify further development of graphene-based biosensors fabricated through printing techniques.
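
    The two figures of merit reported above are straightforward to reproduce; the sketch below computes the calibration correlation coefficient and the per-point variation coefficient from replicate readings. The concentrations and currents are invented stand-ins, not the paper's measurements:

      import numpy as np

      conc = np.array([2.0, 4.0, 8.0, 12.0, 16.0])      # glucose, mM (assumed)
      currents = np.array([                              # replicate currents, uA (assumed)
          [0.21, 0.26, 0.18],
          [0.48, 0.51, 0.45],
          [1.02, 0.99, 1.05],
          [1.49, 1.53, 1.47],
          [2.01, 1.97, 2.05],
      ])

      means = currents.mean(axis=1)
      r = np.corrcoef(conc, means)[0, 1]                 # calibration correlation
      cv = 100 * currents.std(axis=1, ddof=1) / means    # variation coefficient, %
      print(f"r = {r:.3f}")
      print("CV per calibration point (%):", np.round(cv, 1))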

  16. Tannin quantification in red grapes and wine: comparison of polysaccharide- and protein-based tannin precipitation techniques and their ability to model wine astringency.

    PubMed

    Mercurio, Meagan D; Smith, Paul A

    2008-07-23

    Quantification of red grape tannin and red wine tannin using the methyl cellulose precipitable (MCP) tannin assay and the Adams-Harbertson (A-H) tannin assay was investigated. The study allowed for direct comparison between the repeatability of the assays and for the assessment of other practical considerations such as time efficiency, ease of practice, and throughput, and assessed the relationships between tannin quantification by the two analytical techniques. A strong correlation between the two analytical techniques was observed when quantifying grape tannin (r² = 0.96), and a good correlation was observed for wine tannins (r² = 0.80). However, significant differences in the reported tannin values for the analytical techniques were observed (approximately 3-fold). To explore potential reasons for the difference, investigations were undertaken to determine how several variables influenced the final tannin quantification for both assays. These variables included differences in the amount of tannin precipitated (monitored by HPLC), assay matrix variables, and the monomers used to report the final values. The relationship between tannin quantification and wine astringency was assessed for the MCP and A-H tannin assays, and both showed strong correlations with perceived wine astringency (r² = 0.83 and r² = 0.90, respectively). The work described here gives guidance to those wanting to understand how the values between the two assays relate; however, a conclusive explanation for the differences in values between the MCP and A-H tannin assays remains unclear, and further work in this area is required.

  17. Measuring bio-oil upgrade intermediates and corrosive species with polarity-matched analytical approaches

    DOE PAGES

    Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...

    2014-10-03

    Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.
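
    The arithmetic behind a TAN-style titration is unchanged by the aqueous carrier solvents; in the commonly cited ASTM D664 form, TAN (mg KOH/g) = (A - B) * M * 56.1 / W, with A and B the titrant volumes (mL) for sample and blank, M the KOH molarity, W the sample mass (g), and 56.1 the molar mass of KOH. A sketch with invented titration values:

      def total_acid_number(v_sample_ml, v_blank_ml, molarity_koh, sample_mass_g):
          """Total acid number in mg KOH per g of sample (ASTM D664 form)."""
          return (v_sample_ml - v_blank_ml) * molarity_koh * 56.1 / sample_mass_g

      # Hypothetical titration of an aqueous bio-oil extract
      tan = total_acid_number(v_sample_ml=4.8, v_blank_ml=0.2,
                              molarity_koh=0.1, sample_mass_g=0.5)
      print(f"TAN = {tan:.1f} mg KOH/g")   # -> 51.6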

  18. Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment

    ERIC Educational Resources Information Center

    Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James

    2010-01-01

    The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…

  19. Techniques of Water-Resources Investigations of the United States Geological Survey. Book 5, Laboratory Analysis. Chapter A5, Methods for Determination of Radioactive Substances in Water and Fluvial Sediments.

    ERIC Educational Resources Information Center

    Thatcher, L. L.; And Others

    Analytical methods for determining important components of fission and natural radioactivity found in water are reported. The discussion of each method includes conditions for application of the method, a summary of the method, interferences, required apparatus, procedures, calculations and estimation of precision. Isotopes considered are…

  20. 7 CFR 400.172 - Qualifying with less than two of the required ratios or ten of the analytical ratios meeting the...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... or ten of the analytical ratios meeting the specified requirements. 400.172 Section 400.172... required ratios or ten of the analytical ratios meeting the specified requirements. An insurer with less than two of the required ratios or ten of the analytical ratios meeting the specified requirements in...

  1. [Amanitine determination as an example of peptide analysis in the biological samples with HPLC-MS technique].

    PubMed

    Janus, Tomasz; Jasionowicz, Ewa; Potocka-Banaś, Barbara; Borowiak, Krzysztof

    Routine toxicological analysis is mostly focused on the identification of non-organic and organic, chemically different compounds, but generally with low mass, usually not greater than 500–600 Da. Peptide compounds with atomic mass higher than 900 Da are a specific analytical group. Several dozen of them are highly toxic substances well known in toxicological practice, for example mushroom toxins and animal venoms. In the paper the authors present the example of alpha-amanitin to explain the analytical problems and different original solutions in identifying peptides in urine samples with the use of a universal LC-MS/MS procedure. The analyzed material was urine samples collected from patients with potential mushroom intoxication, routinely diagnosed for amanitin determination. Ultrafiltration with centrifugal filter tubes (mass cutoff 3 kDa) was used. The filtrate was directly injected onto the chromatographic column and analyzed with a tandem mass spectrometric detector (MS/MS). The separation of peptides, as organic amphoteric compounds, from biological material with the use of the SPE technique is well known but requires dedicated, specific columns. The presented paper proved that, with the fast and simple ultrafiltration technique, amanitin can be effectively isolated from urine, and the procedure offers satisfactory sensitivity of detection and eliminates the influence of the biological matrix on analytical results. Another problem which had to be solved was the non-characteristic fragmentation of peptides in the MS/MS procedure, producing non-selective chromatograms. It is possible to use higher collision energies in the analytical procedure, which results in more characteristic mass spectra, although it offers lower sensitivity. The ultrafiltration technique as a sample preparation procedure is effective for the isolation of amanitin from the biological matrix. Monitoring of the selected mass corresponding to the transition with the loss of a water molecule offers satisfactory sensitivity of determination.

  2. Separation techniques for the clean-up of radioactive mixed waste for ICP-AES/ICP-MS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swafford, A.M.; Keller, J.M.

    1993-03-17

    Two separation techniques were investigated for the clean-up of typical radioactive mixed waste samples requiring elemental analysis by Inductively Coupled Plasma-Atomic Emission Spectroscopy (ICP-AES) or Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). These measurements frequently involve regulatory or compliance criteria which include the determination of elements on the EPA Target Analyte List (TAL). These samples usually consist of both an aqueous phase and a solid phase which is mostly an inorganic sludge. Frequently, samples taken from the waste tanks contain high levels of uranium and thorium which can cause spectral interferences in ICP-AES or ICP-MS analysis. The removal of these interferences is necessary to determine the presence of the EPA TAL elements in the sample. Two clean-up methods were studied on simulated aqueous waste samples containing the EPA TAL elements. The first method studied was a classical procedure based upon liquid-liquid extraction using tri-n-octylphosphine oxide (TOPO) dissolved in cyclohexane. The second method investigated was based on more recently developed techniques using extraction chromatography, specifically the use of a commercially available Eichrom TRU·Spec™ column. Literature on these two methods indicates the efficient removal of uranium and thorium from properly prepared samples and provides considerable qualitative information on the extraction behavior of many other elements. However, there is a lack of quantitative data on the extraction behavior of elements on the EPA Target Analyte List. Experimental studies on these two methods consisted of determining whether any of the analytes were extracted by these methods and the recoveries obtained. Both methods produced similar results; the EPA target analytes were only slightly or not extracted. Advantages and disadvantages of each method were evaluated and found to be comparable.

  3. 4D Hyperspherical Harmonic (HyperSPHARM) Representation of Multiple Disconnected Brain Subcortical Structures

    PubMed Central

    Hosseinbor, A. Pasha; Chung, Moo K.; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matt; Alexander, Andrew L.; Davidson, Richard J.

    2014-01-01

    We present a novel surface parameterization technique using hyperspherical harmonics (HSH) in representing compact, multiple, disconnected brain subcortical structures as a single analytic function. The proposed hyperspherical harmonic representation (HyperSPHARM) has many advantages over the widely used spherical harmonic (SPHARM) parameterization technique. SPHARM requires flattening 3D surfaces onto a sphere, which can be time-consuming for large surface meshes and cannot represent multiple disconnected objects with a single parameterization. In contrast, HyperSPHARM treats a 3D object, via a simple stereographic projection, as a surface of a 4D hypersphere with an extremely large radius, hence avoiding the computationally demanding flattening process. HyperSPHARM is shown to achieve a better reconstruction with only 5 basis functions, compared to SPHARM, which requires more than 441. PMID:24505716
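
    The projection step can be illustrated with the standard inverse stereographic map from R^3 onto a 3-sphere of radius r in R^4 (one common convention; the paper's exact convention and radius choice may differ). With an extremely large radius, the object lands on a small, nearly flat patch of the hypersphere, which is the intuition behind avoiding the flattening step:

      import numpy as np

      def inverse_stereographic_4d(points, radius):
          """Map 3D points onto a 3-sphere of the given radius embedded in R^4."""
          p2 = np.sum(points**2, axis=1, keepdims=True)   # squared norms, |p|^2
          denom = p2 + radius**2
          xyz = 2 * radius**2 * points / denom            # first three coordinates
          w = radius * (p2 - radius**2) / denom           # fourth coordinate
          return np.hstack([xyz, w])

      pts = np.random.default_rng(2).normal(size=(5, 3)) * 10.0   # toy mesh vertices
      on_sphere = inverse_stereographic_4d(pts, radius=1e4)
      print(np.linalg.norm(on_sphere, axis=1))                    # all approximately 1e4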

  4. Deposition of zinc sulfide thin films by chemical bath process

    NASA Astrophysics Data System (ADS)

    Oladeji, Isaiah O.; Chow, Lee

    1996-11-01

    Deposition of high-quality zinc sulfide (ZnS) thin film over a large area is required if it is to be effectively used in electroluminescent devices, solar cells, and other optoelectronic devices. Of all deposition techniques, chemical bath deposition (CBD) is the least costly technique that meets the above requirements. Recently it was found that the growth of ZnS film of thickness less than 100 nm in a single dip by CBD is facilitated by the use of ammonia and hydrazine as complexing agents. Here we report that the thickness of the deposited ZnS film can be increased if an ammonium salt is used as a buffer. We also present an analytical study to explain our results and to further understand the ZnS growth process in CBD.

  5. Analysis of satellite altimeter signal characteristics and investigation of sea-truth data requirements

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Results are presented of analysis of satellite signal characteristics as influenced by ocean surface roughness and an investigation of sea truth data requirements. The first subject treated is that of postflight waveform reconstruction for the Skylab S-193 radar altimeter. Sea state estimation accuracies are derived based on analytical and hybrid computer simulation techniques. An analysis of near-normal incidence, microwave backscattering from the ocean's surface is accomplished in order to obtain the minimum sea truth data necessary for good agreement between theoretical and experimental scattering results. Sea state bias is examined from the point of view of designing an experiment which will lead to a resolution of the problem. A discussion is given of some deficiencies which were found in the theory underlying the Stilwell technique for spectral measurements.

  6. Microsystems in medicine.

    PubMed

    Wallrabe, U; Ruther, P; Schaller, T; Schomburg, W K

    1998-03-01

    The complexity of modern surgical and analytical methods requires the miniaturisation of many medical devices. The LIGA technique and also mechanical microengineering are well known for the batch fabrication of microsystems. Actuators and sensors are developed based on these techniques. The hydraulic actuation principle is advantageous for medical applications since the energy may be supplied by pressurised balanced salt solution. Some examples are turbines, pumps and valves. In addition, optical sensors and components are useful for analysis and inspection as represented by microspectrometers and spherical lenses. Finally, plastic containers with microporous bottoms allow a 3-dimensional growth of cell culture systems.

  7. Guide star targeting success for the HEAO-B observatory

    NASA Technical Reports Server (NTRS)

    Farrenkopf, R. L.; Hoffman, D. P.

    1977-01-01

    The statistics associated with the successful selection and acquisition of guide stars as attitude benchmarks for use in reorientation maneuvers of the HEAO-B observatory are considered as a function of the maneuver angle, initial attitude uncertainties, and the pertinent celestial region. Success likelihoods in excess of 0.99 are predicted assuming anticipated gyro and star tracker error sources. The maneuver technique and guide star selection constraints are described in detail. The results presented are specialized numerically to the HEAO-B observatory. However, the analytical techniques developed are considered applicable to broader classes of spacecraft requiring celestial targeting.

  8. A Compact, Solid-State UV (266 nm) Laser System Capable of Burst-Mode Operation for Laser Ablation Desorption Processing

    NASA Technical Reports Server (NTRS)

    Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William

    2015-01-01

    Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially-resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g⁻¹); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).

  9. A new method for flight test determination of propulsive efficiency and drag coefficient

    NASA Technical Reports Server (NTRS)

    Bull, G.; Bridges, P. D.

    1983-01-01

    A flight test method is described from which propulsive efficiency as well as parasite and induced drag coefficients can be directly determined using relatively simple instrumentation and analysis techniques. The method uses information contained in the transient response in airspeed for a small power change in level flight in addition to the usual measurement of power required for level flight. Measurements of pitch angle and longitudinal and normal acceleration are eliminated. The theoretical basis for the method, the analytical techniques used, and the results of application of the method to flight test data are presented.
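
    The quantities extracted tie into the standard level-flight power balance; in textbook drag-polar form (a hedged sketch of the governing relations, not necessarily the paper's exact formulation):

      \eta\, P_{\mathrm{shaft}} \;=\; D\,V \;=\; \tfrac{1}{2}\,\rho V^{3} S\, C_{D_0} \;+\; \frac{2\,k\,W^{2}}{\rho V S},
      \qquad C_D \;=\; C_{D_0} + k\, C_L^{2}

    where eta is the propulsive efficiency, rho the air density, S the wing area, W the aircraft weight, C_D0 the parasite drag coefficient, and k the induced drag factor; fitting the measured power required and the transient airspeed response against this balance yields eta, C_D0, and k.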

  10. Protein-centric N-glycoproteomics analysis of membrane and plasma membrane proteins.

    PubMed

    Sun, Bingyun; Hood, Leroy

    2014-06-06

    The advent of proteomics technology has transformed our understanding of biological membranes. The challenges for studying membrane proteins have inspired the development of many analytical and bioanalytical tools, and the techniques of glycoproteomics have emerged as an effective means to enrich and characterize membrane and plasma-membrane proteomes. This Review summarizes the development of various glycoproteomics techniques to overcome the hurdles formed by the unique structures and behaviors of membrane proteins with a focus on N-glycoproteomics. Example contributions of N-glycoproteomics to the understanding of membrane biology are provided, and the areas that require future technical breakthroughs are discussed.

  11. Explicit solution techniques for impact with contact constraints

    NASA Technical Reports Server (NTRS)

    Mccarty, Robert E.

    1993-01-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  12. Explicit solution techniques for impact with contact constraints

    NASA Astrophysics Data System (ADS)

    McCarty, Robert E.

    1993-08-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  13. Integrating bio-inorganic and analytical chemistry into an undergraduate biochemistry laboratory.

    PubMed

    Erasmus, Daniel J; Brewer, Sharon E; Cinel, Bruno

    2015-01-01

    Undergraduate laboratories expose students to a wide variety of topics and techniques in a limited amount of time. This can be a challenge and lead to less exposure to concepts and activities in bio-inorganic chemistry and analytical chemistry that are closely-related to biochemistry. To address this, we incorporated a new iron determination by atomic absorption spectroscopy exercise as part of a five-week long laboratory-based project on the purification of myoglobin from beef. Students were required to prepare samples for chemical analysis, operate an atomic absorption spectrophotometer, critically evaluate their iron data, and integrate these data into a study of myoglobin. © 2015 The International Union of Biochemistry and Molecular Biology.

  14. On-orbit cryogenic fluid transfer

    NASA Technical Reports Server (NTRS)

    Aydelott, J. C.; Gille, J. P.; Eberhardt, R. N.

    1984-01-01

    A number of future NASA and DOD missions have been identified that will require, or could benefit from, resupply of cryogenic liquids in orbit. The most promising approach for accomplishing cryogenic fluid transfer in the weightlessness environment of space is to use the thermodynamic filling technique. This approach involves initially reducing the receiver tank temperature by using several charge-hold-vent cycles, followed by filling the tank without venting. Martin Marietta Denver Aerospace, under contract to the NASA Lewis Research Center, is currently developing analytical models to describe the on-orbit cryogenic fluid transfer process. A detailed design of a shuttle-attached experimental facility, which will provide the data necessary to verify the analytical models, is also being performed.

  15. A history of development in rotordynamics: A manufacturer's perspective

    NASA Technical Reports Server (NTRS)

    Shemeld, David E.

    1987-01-01

    The subject of rotordynamics and instability problems in high performance turbomachinery has been a topic of considerable industry discussion and debate over the last 15 or so years. This paper reviews an original equipment manufacturer's history of development of concepts and equipment as applicable to multistage centrifugal compressors. The variety of industry user compression requirements and resultant problematical situations tends to confound many of the theories and analytical techniques set forth. The experiences and examples described herein support the conclusion that successfully addressing potential rotordynamics problems is best served by a fundamental knowledge of the specific equipment, in addition to having the appropriate analytical tools, and that the final proof is in the doing.

  16. A shipboard comparison of analytic methods for ballast water compliance monitoring

    NASA Astrophysics Data System (ADS)

    Bradie, Johanna; Broeg, Katja; Gianoli, Claudio; He, Jianjun; Heitmüller, Susanne; Curto, Alberto Lo; Nakata, Akiko; Rolke, Manfred; Schillak, Lothar; Stehouwer, Peter; Vanden Byllaardt, Julie; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Promising approaches for indicative analysis of ballast water samples have been developed that require study in the field to examine their utility for determining compliance with the International Convention for the Control and Management of Ships' Ballast Water and Sediments. To address this gap, a voyage was undertaken on board the RV Meteor, sailing the North Atlantic Ocean from Mindelo (Cape Verde) to Hamburg (Germany) during June 4-15, 2015. Trials were conducted on local sea water taken up by the ship's ballast system at multiple locations along the trip, including open ocean, North Sea, and coastal water, to evaluate a number of analytic methods that measure the numeric concentration or biomass of viable organisms according to two size categories (≥ 50 μm in minimum dimension: 7 techniques, ≥ 10 μm and < 50 μm: 9 techniques). Water samples were analyzed in parallel to determine whether results were similar between methods and whether rapid, indicative methods offer comparable results to standard, time- and labor-intensive detailed methods (e.g. microscopy) and high-end scientific approaches (e.g. flow cytometry). Several promising indicative methods were identified that showed high correlation with microscopy, but allow much quicker processing and require less expert knowledge. This study is the first to concurrently use a large number of analytic tools to examine a variety of ballast water samples on board an operational ship in the field. Results are useful to identify the merits of each method and can serve as a basis for further improvement and development of tools and methodologies for ballast water compliance monitoring.

  17. Role of chromatography in the development of Standard Reference Materials for organic analysis.

    PubMed

    Wise, Stephen A; Phinney, Karen W; Sander, Lane C; Schantz, Michele M

    2012-10-26

    The certification of chemical constituents in natural-matrix Standard Reference Materials (SRMs) at the National Institute of Standards and Technology (NIST) can require the use of two or more independent analytical methods. The independence among the methods is generally achieved by taking advantage of differences in extraction, separation, and detection selectivity. This review describes the development of the independent analytical methods approach at NIST, and its implementation in the measurement of organic constituents such as contaminants in environmental materials, nutrients and marker compounds in food and dietary supplement matrices, and health diagnostic and nutritional assessment markers in human serum. The focus of this review is the important and critical role that separation science techniques play in achieving the necessary independence of the analytical steps in the measurement of trace-level organic constituents in natural matrix SRMs. Published by Elsevier B.V.

  18. Current Applications of Chromatographic Methods in the Study of Human Body Fluids for Diagnosing Disorders.

    PubMed

    Jóźwik, Jagoda; Kałużna-Czaplińska, Joanna

    2016-01-01

    Currently, analysis of various human body fluids is one of the most essential and promising approaches to enable the discovery of biomarkers or pathophysiological mechanisms for disorders and diseases. Analysis of these fluids is challenging due to their complex composition and unique characteristics. Development of new analytical methods in this field has made it possible to analyze body fluids with higher selectivity, sensitivity, and precision. The composition and concentration of analytes in body fluids are most often determined by chromatography-based techniques. There is no doubt that proper use of knowledge that comes from a better understanding of the role of body fluids requires the cooperation of scientists of diverse specializations, including analytical chemists, biologists, and physicians. This article summarizes current knowledge about the application of different chromatographic methods in analyses of a wide range of compounds in human body fluids in order to diagnose certain diseases and disorders.

  19. Immunoanalysis Methods for the Detection of Dioxins and Related Chemicals

    PubMed Central

    Tian, Wenjing; Xie, Heidi Qunhui; Fu, Hualing; Pei, Xinhui; Zhao, Bin

    2012-01-01

    With the development of biotechnology, approaches based on antibodies, such as enzyme-linked immunosorbent assay (ELISA), active aryl hydrocarbon immunoassay (Ah-I) and other multi-analyte immunoassays, have been utilized as alternatives to the conventional techniques based on gas chromatography and mass spectrometry for the analysis of dioxin and dioxin-like compounds in environmental and biological samples. These screening methods have been verified as rapid, simple and cost-effective. This paper provides an overview of the development and application of antibody-based approaches, such as ELISA, Ah-I, and multi-analyte immunoassays, covering sample extraction and cleanup, antigen design, antibody preparation and immunoanalysis. However, in order to meet the requirements for on-site fast detection and relative quantification of dioxins in the environment, further optimization is needed to make these immuno-analytical methods more sensitive and easier to use. PMID:23443395

  20. [Clinical Application of Analytical and Medical Instruments Mainly Using MS Techniques].

    PubMed

    Tanaka, Koichi

    2016-02-01

    Analytical instruments for clinical use are commonly required to confirm the compounds and forms related to diseases with the highest possible sensitivity, quantitative performance, and specificity and minimal invasiveness, within a short time, easily, and at a low cost. Advances in mass spectrometry (MS) technology have led to techniques that meet such requirements. Besides confirming known substances, MS has other purposes and advantages that are not fully known to the public, such as serving as a tool to discover unknown phenomena and compounds. An example is clarifying the mechanisms of human diseases. The human body has approximately 100 thousand types of protein, and there may be more than several million types of protein and their metabolites. Most of them have yet to be discovered, and their discovery may give birth to new academic fields and lead to the clarification of diseases, the development of new medicines, etc. For example, using the MS system developed under "Contribution to drug discovery and diagnosis by next generation of advanced mass spectrometry system," one of the 30 projects of the "Funding Program for World-Leading Innovative R&D on Science and Technology" (FIRST program), and other individual basic technologies, we succeeded in discovering new disease biomarker candidates for Alzheimer's disease, cancer, etc. Further contribution of MS to clinical medicine can be expected through the development and improvement of new techniques, efforts to verify discoveries, and communication with the medical front.

  1. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.
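
    A minimal sketch of the incremental idea: each incoming series joins the cluster with the highest affinity to its centroid (the centroid is then updated in place), or seeds a new cluster when no affinity clears a threshold. Pearson correlation stands in here for the paper's affinity score, which is defined differently, and all data are synthetic:

      import numpy as np

      def incremental_cluster(stream, threshold=0.9):
          """Assign each series to its most-affine centroid, updating incrementally."""
          centroids, counts, labels = [], [], []
          for s in stream:
              affinities = [np.corrcoef(s, c)[0, 1] for c in centroids]
              if affinities and max(affinities) >= threshold:
                  k = int(np.argmax(affinities))
                  counts[k] += 1
                  centroids[k] += (s - centroids[k]) / counts[k]   # running mean
              else:
                  centroids.append(s.astype(float).copy())         # new cluster
                  counts.append(1)
                  k = len(centroids) - 1
              labels.append(k)
          return labels, centroids

      rng = np.random.default_rng(3)
      base = np.sin(np.linspace(0, 6, 96))                          # daily load shape
      stream = [a * base + rng.normal(0, 0.1, 96) for a in (1.0, 1.1, -1.0, 0.9)]
      labels, _ = incremental_cluster(stream)
      print(labels)   # e.g. [0, 0, 1, 0]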

  2. CF6 jet engine diagnostics program. High pressure turbine roundness/clearance investigation

    NASA Technical Reports Server (NTRS)

    Howard, W. D.; Fasching, W. A.

    1982-01-01

    The effects of high pressure turbine clearance changes on engine and module performance were evaluated, in addition to the measurement of CF6-50C high pressure turbine Stage 1 tip clearance and stator out-of-roundness during steady-state and transient operation. The results indicated a good correlation of the analytical model of round engine clearance response with measured data. The stator out-of-roundness measurements verified that the analytical technique for predicting the distortion effects of mechanical loads is accurate, whereas the technique for calculating the effects of certain circumferential thermal gradients requires some modifications. A potential for improvement in roundness was established on the order of 0.38 mm (0.015 in.), equivalent to 0.86 percent turbine efficiency, which translates to a cruise SFC improvement of 0.36 percent. The HP turbine Stage 1 tip clearance performance derivative was established as 0.44 mm (17 mils) per percent of turbine efficiency at take-off power, somewhat smaller, and therefore more sensitive, than predicted from previous investigations.

  3. Stress analysis of the cracked-lap-shear specimen - An ASTM round-robin

    NASA Technical Reports Server (NTRS)

    Johnson, W. S.

    1987-01-01

    This ASTM Round Robin was conducted to evaluate the state of the art in stress analysis of adhesively bonded joint specimens. Specifically, the participants were asked to calculate the strain-energy-release rate for two different geometry cracked lap shear (CLS) specimens at four different debond lengths. The various analytical techniques consisted of 2- and 3-dimensional finite element analysis, beam theory, plate theory, and a combination of beam theory and finite element analysis. The results were examined in terms of the total strain-energy-release rate and the mode I to mode II ratio as a function of debond length for each specimen geometry. These results basically clustered into two groups: geometric linear or geometric nonlinear analysis. The geometric nonlinear analysis is required to properly analyze the CLS specimens. The 3-D finite element analysis gave indications of edge closure plus some mode III loading. Each participant described his analytical technique and results. Nine laboratories participated.
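
    For reference, the total strain-energy-release rate the participants computed can be obtained from specimen compliance by the standard relation G = (P^2 / 2b) dC/da; the sketch below applies it with a finite-difference derivative over four debond lengths. The load, width, and compliance values are invented placeholders, not round-robin data:

      import numpy as np

      P = 4450.0                                       # applied load, N (assumed)
      b = 25.4e-3                                      # specimen width, m (assumed)
      a = np.array([0.05, 0.08, 0.11, 0.14])           # debond lengths, m (assumed)
      C = np.array([1.10, 1.24, 1.39, 1.55]) * 1e-7    # compliance C(a), m/N (assumed)

      dCda = np.gradient(C, a)                         # finite-difference dC/da
      G = P**2 / (2 * b) * dCda                        # J/m^2 at each debond length
      print(np.round(G, 1))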

  4. Stress analysis of the cracked lap shear specimens: An ASTM round robin

    NASA Technical Reports Server (NTRS)

    Johnson, W. S.

    1986-01-01

    This ASTM Round Robin was conducted to evaluate the state of the art in stress analysis of adhesively bonded joint specimens. Specifically, the participants were asked to calculate the strain-energy-release rate for two different geometry cracked lap shear (CLS) specimens at four different debond lengths. The various analytical techniques consisted of 2- and 3-dimensional finite element analysis, beam theory, plate theory, and a combination of beam theory and finite element analysis. The results were examined in terms of the total strain-energy-release rate and the mode I to mode II ratio as a function of debond length for each specimen geometry. These results basically clustered into two groups: geometric linear or geometric nonlinear analysis. The geometric nonlinear analysis is required to properly analyze the CLS specimens. The 3-D finite element analysis gave indications of edge closure plus some mode III loading. Each participant described his analytical technique and results. Nine laboratories participated.

  5. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  6. An analytical method to simulate the H I 21-cm visibility signal for intensity mapping experiments

    NASA Astrophysics Data System (ADS)

    Sarkar, Anjan Kumar; Bharadwaj, Somnath; Marthi, Visweshwar Ram

    2018-01-01

    Simulations play a vital role in testing and validating H I 21-cm power spectrum estimation techniques. Conventional methods use techniques like N-body simulations to simulate the sky signal which is then passed through a model of the instrument. This makes it necessary to simulate the H I distribution in a large cosmological volume, and incorporate both the light-cone effect and the telescope's chromatic response. The computational requirements may be particularly large if one wishes to simulate many realizations of the signal. In this paper, we present an analytical method to simulate the H I visibility signal. This is particularly efficient if one wishes to simulate a large number of realizations of the signal. Our method is based on theoretical predictions of the visibility correlation which incorporate both the light-cone effect and the telescope's chromatic response. We have demonstrated this method by applying it to simulate the H I visibility signal for the upcoming Ooty Wide Field Array Phase I.
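
    One generic way to realize such simulations, assuming the analytically predicted visibility correlation matrix is already in hand, is to draw zero-mean Gaussian realizations through its Cholesky factor; generating many realizations then costs only one matrix multiply each. The toy correlation model below is illustrative, not the paper's:

      import numpy as np

      def simulate_visibilities(cov, n_real, seed=0):
          """Draw complex Gaussian visibility realizations with covariance `cov`."""
          rng = np.random.default_rng(seed)
          L = np.linalg.cholesky(cov)
          n = cov.shape[0]
          z = (rng.normal(size=(n, n_real)) + 1j * rng.normal(size=(n, n_real))) / np.sqrt(2)
          return L @ z                                  # each column is one realization

      chan = np.arange(32)                              # frequency channels
      cov = np.exp(-np.abs(chan[:, None] - chan[None, :]) / 4.0)   # toy decorrelation
      vis = simulate_visibilities(cov, n_real=1000)
      print(vis.shape)   # (32, 1000)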

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Afzal, Muhammad U., E-mail: muhammad.afzal@mq.edu.au; Esselle, Karu P.

    This paper presents a quasi-analytical technique to design continuous, all-dielectric phase correcting structures (PCSs) for circularly polarized Fabry-Perot resonator antennas (FPRAs). The PCS is realized by varying the thickness of a rotationally symmetric dielectric block placed above the antenna. A global analytical expression is derived for the PCS thickness profile, which is required to achieve a nearly uniform phase distribution at the output of the PCS, despite the non-uniform phase distribution at its input. An alternative piecewise technique based on spline interpolation is also explored to design a PCS. It is shown from both far- and near-field results that a PCS tremendously improves the radiation performance of the FPRA. These improvements include an increase in peak directivity from 22 to 120 (from 13.4 dBic to 20.8 dBic) and a decrease of 3 dB beamwidth from 41.5° to 15°. The phase-corrected antenna also has a good directivity bandwidth of 1.3 GHz, which is 11% of the center frequency.
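
    The thickness-to-phase relationship underlying such a profile can be sketched with a simple ray (path-length) model, delta_phi = 2*pi*t*(sqrt(eps_r) - 1)/lambda_0: each aperture point receives just enough extra dielectric to retard it to the most-delayed point's phase. The frequency, permittivity, and input phase below are assumptions for illustration; the paper derives a global analytical profile rather than this pointwise form:

      import numpy as np

      c = 299_792_458.0
      f = 11e9                        # operating frequency, Hz (assumed)
      lam0 = c / f
      eps_r = 2.5                     # dielectric relative permittivity (assumed)

      def correction_thickness(phase_in):
          """Extra thickness that equalizes the aperture phase (ray approximation)."""
          lag_needed = phase_in - phase_in.min()        # rad, >= 0 everywhere
          return lag_needed * lam0 / (2 * np.pi * (np.sqrt(eps_r) - 1))

      rho = np.linspace(0.0, 0.15, 7)                   # radial positions, m
      phase_in = -40.0 * rho**2                         # toy non-uniform phase, rad
      print(np.round(correction_thickness(phase_in) * 1e3, 2))   # thickness, mm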

  8. Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.

    PubMed

    El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher

    2018-01-01

    Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.

  9. Flat-plate solar array project. Volume 6: Engineering sciences and reliability

    NASA Technical Reports Server (NTRS)

    Ross, R. G., Jr.; Smokler, M. I.

    1986-01-01

    The Flat-Plate Solar Array (FSA) Project activities directed at developing the engineering technology base required to achieve modules that meet the functional, safety, and reliability requirements of large-scale terrestrial photovoltaic systems applications are reported. These activities included: (1) development of functional, safety, and reliability requirements for such applications; (2) development of the engineering analytical approaches, test techniques, and design solutions required to meet the requirements; (3) synthesis and procurement of candidate designs for test and evaluation; and (4) performance of extensive testing, evaluation, and failure analysis to define design shortfalls and, thus, areas requiring additional research and development. A summary of the approach and technical outcome of these activities is provided, along with a complete bibliography of the published documentation covering the detailed accomplishments and technologies developed.

  10. Peptide interfaces with graphene: an emerging intersection of analytical chemistry, theory, and materials.

    PubMed

    Russell, Shane R; Claridge, Shelley A

    2016-04-01

    Because noncovalent interface functionalization is frequently required in graphene-based devices, biomolecular self-assembly has begun to emerge as a route for controlling substrate electronic structure or binding specificity for soluble analytes. The remarkable diversity of structures that arise in biological self-assembly hints at the possibility of equally diverse and well-controlled surface chemistry at graphene interfaces. However, predicting and analyzing adsorbed monolayer structures at such interfaces raises substantial experimental and theoretical challenges. In contrast with the relatively well-developed monolayer chemistry and characterization methods applied at coinage metal surfaces, monolayers on graphene are both less robust and more structurally complex, levying more stringent requirements on characterization techniques. Theory presents opportunities to understand early binding events that lay the groundwork for full monolayer structure. However, predicting interactions between complex biomolecules, solvent, and substrate is necessitating a suite of new force fields and algorithms to assess likely binding configurations, solvent effects, and modulations to substrate electronic properties. This article briefly discusses emerging analytical and theoretical methods used to develop a rigorous chemical understanding of the self-assembly of peptide-graphene interfaces and prospects for future advances in the field.

  11. ACCELERATING MR PARAMETER MAPPING USING SPARSITY-PROMOTING REGULARIZATION IN PARAMETRIC DIMENSION

    PubMed Central

    Velikina, Julia V.; Alexander, Andrew L.; Samsonov, Alexey

    2013-01-01

    MR parameter mapping requires sampling along an additional (parametric) dimension, which often limits its clinical appeal due to a several-fold increase in scan times compared to conventional anatomic imaging. Data undersampling combined with parallel imaging is an attractive way to reduce scan time in such applications. However, inherent SNR penalties of parallel MRI due to noise amplification often limit its utility even at moderate acceleration factors, requiring regularization by prior knowledge. In this work, we propose a novel regularization strategy, which utilizes smoothness of signal evolution in the parametric dimension within a compressed sensing framework (p-CS) to provide accurate and precise estimation of parametric maps from undersampled data. The performance of the method was demonstrated with variable flip angle T1 mapping and compared favorably to two representative reconstruction approaches, image space-based total variation regularization and an analytical model-based reconstruction. The proposed p-CS regularization was found to provide efficient suppression of noise amplification and preservation of parameter mapping accuracy without explicit utilization of analytical signal models. The developed method may facilitate acceleration of quantitative MRI techniques that are not suitable for model-based reconstruction because of complex signal models or when signal deviations from the expected analytical model exist. PMID:23213053
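
    In generic compressed-sensing notation (a hedged restatement, not necessarily the authors' exact cost function), the p-CS reconstruction can be posed as

      \hat{x} \;=\; \arg\min_{x}\; \left\| E\,x - y \right\|_2^2 \;+\; \lambda \left\| D_p\, x \right\|_1

    where E is the undersampled, coil-weighted Fourier encoding, y the acquired k-space data, D_p a low-order finite-difference transform along the parametric (e.g. flip-angle) dimension, and lambda balances data fidelity against the smoothness of the signal evolution across that dimension.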

  12. What We Do and Do Not Know about Teaching Medical Image Interpretation.

    PubMed

    Kok, Ellen M; van Geel, Koos; van Merriënboer, Jeroen J G; Robben, Simon G F

    2017-01-01

    Educators in medical image interpretation have difficulty finding scientific evidence as to how they should design their instruction. We review and comment on 81 papers that investigated instructional design in medical image interpretation. We distinguish between studies that evaluated complete offline courses and curricula, studies that evaluated e-learning modules, and studies that evaluated specific educational interventions. Twenty-three percent of all studies evaluated the implementation of complete courses or curricula, and 44% of the studies evaluated the implementation of e-learning modules. We argue that these studies have encouraging results but provide little information for educators: too many differences exist between conditions to unambiguously attribute the learning effects to specific instructional techniques. Moreover, concepts are not uniformly defined and methodological weaknesses further limit the usefulness of evidence provided by these studies. Thirty-two percent of the studies evaluated a specific interventional technique. We discuss three theoretical frameworks that informed these studies: diagnostic reasoning, cognitive schemas and study strategies. Research on diagnostic reasoning suggests teaching students to start with non-analytic reasoning and subsequently applying analytic reasoning, but little is known on how to train non-analytic reasoning. Research on cognitive schemas investigated activities that help the development of appropriate cognitive schemas. Finally, research on study strategies supports the effectiveness of practice testing, but more study strategies could be applicable to learning medical image interpretation. Our commentary highlights the value of evaluating specific instructional techniques, but further evidence is required to optimally inform educators in medical image interpretation.

  13. Autonomous driving in NMR.

    PubMed

    Perez, Manuel

    2017-01-01

    The automatic analysis of NMR data has been a much-desired endeavour for the last six decades, as is the case with any other analytical technique. This need for automation has only grown as advances in hardware, pulse sequences and automation have opened new research areas to NMR and increased the throughput of data. Full automatic analysis is a worthy, albeit hard, challenge, but in a world of artificial intelligence, instant communication and big data, it seems that this particular fight is happening with only one technique at a time (let this be NMR, MS, IR, UV or any other), when the reality of most laboratories is that several types of analytical instrumentation are present. Data aggregation, verification and elucidation by using complementary techniques (e.g. MS and NMR) is a desirable outcome to pursue, although a time-consuming one if performed manually; hence, the use of automation to perform the heavy lifting for users is required to make the approach attractive for scientists. Many of the decisions and workflows that could be implemented under automation will depend on two-way communication with databases that understand analytical data, because it is desirable not only to query these databases but also to grow them in as automatic a manner as possible. How these databases are designed and set up, and how the data inside them are classified, will determine which workflows can be implemented. Copyright © 2016 John Wiley & Sons, Ltd.

  14. Forensic toxicology.

    PubMed

    Drummer, Olaf H

    2010-01-01

    Forensic toxicology has developed as a forensic science in recent years and is now widely used to assist in death investigations, in civil and criminal matters involving drug use, in drugs-of-abuse testing in correctional settings and custodial medicine, in road and workplace safety, in matters involving environmental pollution, and in sports doping. Drugs most commonly targeted include amphetamines, benzodiazepines, cannabis, cocaine and the opiates, but can be any other illicit substance or almost any over-the-counter or prescribed drug, as well as poisons available to the community. The discipline requires high-level skills in analytical techniques together with a solid knowledge of pharmacology and pharmacokinetics. Modern techniques rely heavily on immunoassay screening analyses and mass spectrometry (MS) for confirmatory analyses, using either high-performance liquid chromatography or gas chromatography as the separation technique. Tandem MS has become increasingly popular relative to single-stage MS. It is essential that analytical systems are fully validated and fit for purpose, and that assay batches are monitored with quality controls. External proficiency programs monitor both the assay and the personnel performing the work. For a laboratory to perform optimally, it is vital that the circumstances and context of the case are known and that the laboratory understands the limitations of the analytical systems used, including drug stability. Drugs and poisons can change concentration postmortem due to poor or unequal quality of blood and other specimens, anaerobic metabolism and redistribution. The latter presents the largest handicap in the interpretation of postmortem results.

  15. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
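
    For a sense of how round-robin variability is typically summarized, the sketch below computes an inter-laboratory mean, standard deviation, and percent relative standard deviation (%RSD). The acid-number values are invented for illustration and are not the study's data.

    ```python
    # Hypothetical round-robin summary: acid number results (mg KOH/g)
    # reported by several labs for the same bio-oil sample. Values are
    # invented for illustration only.
    import statistics

    lab_results = [92.1, 95.4, 90.8, 94.0, 93.2]

    mean = statistics.mean(lab_results)
    stdev = statistics.stdev(lab_results)   # sample standard deviation
    rsd_percent = 100.0 * stdev / mean      # inter-laboratory reproducibility

    print(f"mean = {mean:.1f}, s = {stdev:.1f}, %RSD = {rsd_percent:.1f}")
    ```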

  17. Experimental and analytical determination of stability parameters for a balloon tethered in a wind

    NASA Technical Reports Server (NTRS)

    Redd, L. T.; Bennett, R. M.; Bland, S. R.

    1973-01-01

    Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.

  18. Environmental Quality Standards Research on Wastewaters of Army Ammunition Plants

    DTIC Science & Technology

    1978-06-01

    characterization of nitrocellulose wastewaters. We are grateful to LTC Leroy H. Reuter and LTC Robert P. Carnahan of the US Army Medical Research and...analytical methodology was required to characterize the wastes. The techniques used for fingerprinting (showing that the same compound exists although its...examination of the NC wastewaters has somewhat clarified the problem of characterizing the NC and has caused us to change or modify previous

  19. Water sprays in space retrieval operations

    NASA Technical Reports Server (NTRS)

    Freesland, D. C.

    1977-01-01

    Experiments were conducted in a ground-based vacuum chamber to determine physical properties of water-ice in a space-like environment. Additional ices, alcohol and ammonia, were also studied. An analysis based on the conservation of angular momentum resulted in despin performance parameters, i.e., total water mass requirements and despin times. The despin and retrieval of a disabled spacecraft was considered to illustrate a potential application of the water spray technique.

  20. Space Shuttle propulsion parameter estimation using optimal estimation techniques

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This fourth monthly progress report again contains corrections and additions to the previously submitted reports. The additions include a simplified SRB model that is directly incorporated into the estimation algorithm and provides the required partial derivatives. The resulting partial derivatives are analytical rather than numerical as would be the case using the SOBER routines. The filter and smoother routine developments have continued. These routines are being checked out.

  1. A study of multiplex data bus techniques for the space shuttle

    NASA Technical Reports Server (NTRS)

    Kearney, R. J.; Kalange, M. A.

    1972-01-01

    A comprehensive technology base for the design of a multiplexed data bus subsystem is provided. Extensive analyses, both analytical and empirical, were performed. Subjects covered are classified under the following headings: requirements identification and analysis; transmission media studies; signal design and detection studies; synchronization, timing, and control studies; user-subsystem interface studies; operational reliability analyses; design of candidate data bus configurations; and evaluation of candidate data bus designs.

  2. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  3. A novel magnet focusing plate for matrix-assisted laser desorption/ionization analysis of magnetic bead-bound analytes.

    PubMed

    Gode, David; Volmer, Dietrich A

    2013-05-15

    Magnetic beads are often used for serum profiling of peptide and protein biomarkers. In these assays, the bead-bound analytes are eluted from the beads prior to mass spectrometric analysis. This study describes a novel matrix-assisted laser desorption/ionization (MALDI) technique for direct application and focusing of magnetic beads to MALDI plates by means of dedicated micro-magnets as sample spots. Custom-made MALDI plates with magnetic focusing spots were made using small nickel-coated neodymium micro-magnets integrated into a stainless steel plate in a 16 × 24 (384) pattern. For demonstrating the proof-of-concept, commercial C-18 magnetic beads were used for the extraction of a test compound (reserpine) from aqueous solution. Experiments were conducted to study focusing abilities, the required laser energies, the influence of a matrix compound, dispensing techniques, solvent choice and the amount of magnetic beads. Dispensing the magnetic beads onto the micro-magnet sample spots resulted in immediate and strong binding to the magnetic surface. Light microscope images illustrated the homogeneous distribution of beads across the surfaces of the magnets, when the entire sample volume containing the beads was pipetted onto the surface. Subsequent MALDI analysis of the bead-bound analyte demonstrated excellent and reproducible ionization yields. The surface-assisted laser desorption/ionization (SALDI) properties of the strongly light-absorbing γ-Fe2O3-based beads resulted in similar ionization efficiencies to those obtained from experiments with an additional MALDI matrix compound. This feasibility study successfully demonstrated the magnetic focusing abilities for magnetic bead-bound analytes on a novel MALDI plate containing small micro-magnets as sample spots. One of the key advantages of this integrated approach is that no elution steps from magnetic beads were required during analyses compared with conventional bead experiments. Copyright © 2013 John Wiley & Sons, Ltd.

  4. Method optimization and quality assurance in speciation analysis using high performance liquid chromatography with detection by inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Larsen, Erik H.

    1998-02-01

    Achievement of optimum selectivity, sensitivity and robustness in speciation analysis using high performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICP-MS) detection requires that each instrumental component is selected and optimized with a view to the ideal operating characteristics of the entire hyphenated system. An isocratic HPLC system, which employs an aqueous mobile phase with organic buffer constituents, is well suited for introduction into the ICP-MS because of the stability of the detector response and the high degree of analyte sensitivity attained. Anion and cation exchange HPLC systems, which meet these requirements, were used for the separation of selenium and arsenic species in crude extracts of biological samples. Furthermore, the signal-to-noise ratios obtained for these incompletely ionized elements in the argon ICP were further enhanced by a factor of four by continuously introducing carbon as methanol via the mobile phase into the ICP. Sources of error in the HPLC system (column overload), in the sample introduction system (memory effects from organic solvents) and in the ICP-MS (spectroscopic interferences), and their prevention, are also discussed. The optimized anion and cation exchange HPLC-ICP-MS systems were used for arsenic speciation in contaminated ground water and in an in-house shrimp reference sample. For the purpose of verification, HPLC coupled with tandem mass spectrometry with electrospray ionization was additionally used for arsenic speciation in the shrimp sample. With this analytical technique, the HPLC retention time in combination with mass analysis of the molecular ions and their collision-induced fragments provides almost conclusive evidence of the identity of the analyte species. The speciation methods are validated by establishing a mass balance of the analytes in each fraction of the extraction procedure, by recovery of spikes, and by employing and comparing independent techniques. The urgent need for reference materials certified for elemental species is stressed.

  5. Analytical solutions to non-Fickian subsurface dispersion in uniform groundwater flow

    USGS Publications Warehouse

    Zou, S.; Xia, J.; Koussis, Antonis D.

    1996-01-01

    Analytical solutions are obtained by the Fourier transform technique for the one-, two-, and three-dimensional transport of a conservative solute injected instantaneously in a uniform groundwater flow. These solutions account for dispersive non-linearity caused by the heterogeneity of the hydraulic properties of aquifer systems and can be used as building blocks to construct solutions by convolution (principle of superposition) for source conditions other than slug injection. The dispersivity is assumed to vary parabolically with time and is thus constant for the entire system at any given time. Two approaches for estimating time-dependent dispersion parameters are developed for two-dimensional plumes. They both require minimal field tracer test data and, therefore, represent useful tools for assessing real-world aquifer contamination sites. The first approach requires mapped plume-area measurements at two specific times after the tracer injection. The second approach requires concentration-versus-time data from two sampling wells through which the plume passes. Detailed examples and comparisons with other procedures show that the methods presented herein are sufficiently accurate and easier to use than other available methods.
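
    For context, the sketch below evaluates the classical constant-dispersion 2-D solution for an instantaneous slug in uniform flow; the paper's solutions generalize this baseline to time-dependent dispersivity. All parameter values are illustrative.

    ```python
    # Classical 2-D Gaussian solution for an instantaneous slug of mass M
    # injected at the origin of a uniform flow along x. This is the
    # constant-dispersion textbook baseline, not the paper's
    # time-dependent-dispersivity variant.
    import math

    def slug_concentration_2d(x, y, t, M=1.0, v=1.0, Dx=0.5, Dy=0.05,
                              b=1.0, n=0.3):
        """Concentration at (x, y, t); b = aquifer thickness, n = porosity."""
        if t <= 0:
            raise ValueError("t must be positive")
        coeff = M / (4.0 * math.pi * n * b * t * math.sqrt(Dx * Dy))
        arg = -((x - v * t) ** 2) / (4.0 * Dx * t) - y ** 2 / (4.0 * Dy * t)
        return coeff * math.exp(arg)

    print(slug_concentration_2d(10.0, 0.0, 10.0))  # near the plume centroid
    ```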

  6. Analytical interferences of mercuric chloride preservative in environmental water samples: Determination of organic compounds isolated by continuous liquid-liquid extraction or closed-loop stripping

    USGS Publications Warehouse

    Foreman, W.T.; Zaugg, S.D.; Falres, L.M.; Werner, M.G.; Leiker, T.J.; Rogerson, P.F.

    1992-01-01

    Analytical interferences were observed during the determination of organic compounds in groundwater samples preserved with mercuric chloride. The nature of the interference was different depending on the analytical isolation technique employed. (1) Water samples extracted with dichloromethane by continuous liquid-liquid extraction (CLLE) and analyzed by gas chromatography/mass spectrometry revealed a broad HgCl2 'peak' eluting over a 3-5-min span which interfered with the determination of coeluting organic analytes. Substitution of CLLE for separatory funnel extraction in EPA method 508 also resulted in analytical interferences from the use of HgCl2 preservative. (2) Mercuric chloride was purged, along with organic contaminants, during closed-loop stripping (CLS) of groundwater samples and absorbed onto the activated charcoal trap. Competitive sorption of the HgCl2 by the trap appeared to contribute to the observed poor recoveries for spiked organic contaminants. The HgCl2 was not displaced from the charcoal with the dichloromethane elution solvent and required strong nitric acid to achieve rapid, complete displacement. Similar competitive sorption mechanisms might also occur in other purge and trap methods when this preservative is used.

  7. Assessment of sample preservation techniques for pharmaceuticals, personal care products, and steroids in surface and drinking water.

    PubMed

    Vanderford, Brett J; Mawhinney, Douglas B; Trenholm, Rebecca A; Zeigler-Holady, Janie C; Snyder, Shane A

    2011-02-01

    Proper collection and preservation techniques are necessary to ensure sample integrity and maintain the stability of analytes until analysis. Data from improperly collected and preserved samples could lead to faulty conclusions and misinterpretation of the occurrence and fate of the compounds being studied. Because contaminants of emerging concern, such as pharmaceuticals and personal care products (PPCPs) and steroids, generally occur in surface and drinking water at ng/L levels, these compounds in particular require such protocols to accurately assess their concentrations. In this study, sample bottle types, residual oxidant quenching agents, preservation agents, and hold times were assessed for 21 PPCPs and steroids in surface water and finished drinking water. Amber glass bottles were found to have the least effect on target analyte concentrations, while high-density polyethylene bottles had the most impact. Ascorbic acid, sodium thiosulfate, and sodium sulfite were determined to be acceptable quenching agents, and preservation with sodium azide at 4 °C kept the largest number of target compounds stable. A combination of amber glass bottles, ascorbic acid, and sodium azide preserved analyte concentrations for 28 days in the tested matrices when held at 4 °C. Samples without a preservation agent were determined to be stable for all but two of the analytes when stored in amber glass bottles at 4 °C for 72 h. Results suggest that if improper protocols are utilized, reported concentrations of target PPCPs and steroids may be inaccurate.

  8. Novel conformal technique to reduce staircasing artifacts at material boundaries for FDTD modeling of the bioheat equation.

    PubMed

    Neufeld, E; Chavannes, N; Samaras, T; Kuster, N

    2007-08-07

    The modeling of thermal effects, often based on the Pennes Bioheat Equation, is becoming increasingly popular. The FDTD technique commonly used in this context suffers considerably from staircasing errors at boundaries. A new conformal technique is proposed that can easily be integrated into existing implementations without requiring a special update scheme. It scales fluxes at interfaces with factors derived from the local surface normal. The new scheme is validated using an analytical solution, and an error analysis is performed to understand its behavior. The new scheme behaves considerably better than the standard scheme. Furthermore, in contrast to the standard scheme, it is possible to obtain with it more accurate solutions by increasing the grid resolution.
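
    As a point of reference, here is a minimal 1-D explicit finite-difference integration of the Pennes bioheat equation using the standard (non-conformal) scheme that the paper improves upon; the tissue parameters are generic stand-ins, not values from the study.

    ```python
    # Explicit 1-D update of the Pennes bioheat equation:
    #   rho*c*dT/dt = k*d2T/dx2 + w_b*c_b*(T_a - T) + Q
    # Standard scheme, no conformal boundary correction; parameters are
    # generic tissue-like numbers for illustration.
    import numpy as np

    rho, c, k = 1000.0, 3600.0, 0.5    # density, heat capacity, conductivity
    w_b, c_b, T_a = 0.5, 3600.0, 37.0  # perfusion (kg/m^3/s), blood c_p, arterial T
    Q = 1000.0                         # heating term (W/m^3)

    dx, dt = 1e-3, 0.05                # satisfies the explicit stability limit
    T = np.full(100, 37.0)
    T[40:60] = 40.0                    # initial hot spot

    for _ in range(1000):              # integrate 50 s
        lap = (np.roll(T, -1) - 2 * T + np.roll(T, 1)) / dx**2
        dTdt = (k * lap + w_b * c_b * (T_a - T) + Q) / (rho * c)
        T[1:-1] += dt * dTdt[1:-1]     # interior only; ends held fixed

    print(f"peak temperature after 50 s: {T.max():.2f} °C")
    ```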

  9. Development of Novel Method for Rapid Extraction of Radionuclides from Solution Using Polymer Ligand Film

    NASA Astrophysics Data System (ADS)

    Rim, Jung H.

    Accurate and fast determination of the activity of radionuclides in a sample is critical for nuclear forensics and emergency response. Radioanalytical techniques are well established for radionuclide measurement; however, they are slow and labor intensive, requiring extensive radiochemical separations and purification prior to analysis. Given these limitations of current methods, there is great interest in a new technique to rapidly process samples. This dissertation describes a new analyte extraction medium called Polymer Ligand Film (PLF), developed to rapidly extract radionuclides. Polymer Ligand Film is a polymer medium with ligands incorporated in its matrix that selectively and rapidly extract analytes from a solution. The main focus of the new technique is to shorten and simplify the procedure necessary to chemically isolate radionuclides for determination by alpha spectrometry or beta counting. Five different ligands were tested for plutonium extraction: bis(2-ethylhexyl) methanediphosphonic acid (H2DEH[MDP]), di(2-ethylhexyl) phosphoric acid (HDEHP), trialkyl methylammonium chloride (Aliquat-336), 4,4'(5')-di-t-butylcyclohexano 18-crown-6 (DtBuCH18C6), and 2-ethylhexyl 2-ethylhexylphosphonic acid (HEH[EHP]). The ligands that were effective for plutonium extraction were further studied for uranium extraction. The plutonium recovery by PLFs showed a dependency on nitric acid concentration and ligand-to-total-mass ratio. H2DEH[MDP] PLFs performed best at 1:10 and 1:20 ratios: 50.44% and 47.61% of plutonium were extracted on the surface of the PLFs with 1 M nitric acid for the 1:10 and 1:20 PLF, respectively. HDEHP PLF provided the best combination of alpha spectroscopy resolution and plutonium recovery at a 1:5 ratio when used with 0.1 M nitric acid. The overall analyte recovery was lower than that of electrodeposited samples, which typically have recoveries above 80%. However, PLF is designed to be a rapid, field-deployable screening technique, and consistency is more important than recovery. PLFs were also tested using blind quality control samples, and the activities were accurately measured. It is important to point out that PLFs were consistently susceptible to analytes penetrating and depositing below the surface. The internal radiation within the body of the PLF is mostly contained and did not cause excessive self-attenuation or peak broadening in alpha spectroscopy. The analyte penetration issue was beneficial in the destructive analysis. H2DEH[MDP] PLF was tested with environmental samples to fully understand the capabilities and limitations of the PLF in relevant environments. The extraction system was very effective in extracting plutonium from environmental water collected from Mortandad Canyon at Los Alamos National Laboratory with minimal sample processing. Soil samples were tougher to process than the water samples: analytes were first leached from the soil matrices using nitric acid before processing with PLF, and this approach limited plutonium extraction by the PLF. The soil samples from Mortandad Canyon, which are about 1% iron by weight, were effectively processed with the PLF system. Even with certain limitations, the PLF extraction technique was able to considerably decrease the sample analysis time: an entire environmental sample was analyzed within one to two days. The decrease in time can be attributed to the fact that PLF replaces column chromatography and electrodeposition with a single step for preparing alpha spectrometry samples. The two-step process of column chromatography and electrodeposition takes a couple of days to a week to complete, depending on the sample. The decrease in time and the simplified procedure make this technique a unique solution for application to nuclear forensics and emergency response. A large number of samples can be quickly analyzed, and selected samples can be further analyzed with more sensitive techniques based on the initial data. The deployment of a PLF system as a screening method will greatly reduce the total analysis time required to gain meaningful isotopic data for nuclear forensics applications. (Abstract shortened by UMI.)

  10. Macro elemental analysis of food samples by nuclear analytical technique

    NASA Astrophysics Data System (ADS)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analytical technique compared with other detection methods. Thus, EDXRF spectrometry is applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with that of two other analytical techniques, neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The methods were compared to cross-check the analytical results and to overcome the limitations of each of the three methods. The results showed that Ca contents determined by EDXRF and AAS were not significantly different (p-value 0.9687), and the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore, the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
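
    The comparison statistics quoted above, a p-value for the method difference and a Pearson correlation between methods, can be reproduced in form by a few lines like the following; the concentration values are invented, not the study's data.

    ```python
    # Method-comparison statistics for two techniques measuring the same
    # samples: paired t-test p-value and Pearson correlation.
    # Ca values (mg/100 g) below are hypothetical.
    import numpy as np
    from scipy import stats

    ca_edxrf = np.array([812.0, 640.5, 1020.3, 755.8, 890.1])
    ca_aas   = np.array([805.2, 652.1, 1011.7, 760.3, 885.4])

    t_stat, p_value = stats.ttest_rel(ca_edxrf, ca_aas)  # H0: no method bias
    r, _ = stats.pearsonr(ca_edxrf, ca_aas)

    print(f"paired t-test p = {p_value:.4f}, Pearson r = {r:.4f}")
    ```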

  11. CALIBRATION OF SEMI-ANALYTIC MODELS OF GALAXY FORMATION USING PARTICLE SWARM OPTIMIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruiz, Andrés N.; Domínguez, Mariano J.; Yaryura, Yamila

    2015-03-10

    We present a fast and accurate method to select an optimal set of parameters in semi-analytic models of galaxy formation and evolution (SAMs). Our approach compares the results of a model against a set of observables applying a stochastic technique called Particle Swarm Optimization (PSO), a self-learning algorithm for localizing regions of maximum likelihood in multidimensional spaces that outperforms traditional sampling methods in terms of computational cost. We apply the PSO technique to the SAG semi-analytic model combined with merger trees extracted from a standard Lambda Cold Dark Matter N-body simulation. The calibration is performed using a combination of observed galaxy properties as constraints, including the local stellar mass function and the black hole to bulge mass relation. We test the ability of the PSO algorithm to find the best set of free parameters of the model by comparing the results with those obtained using a MCMC exploration. Both methods find the same maximum likelihood region; however, the PSO method requires one order of magnitude fewer evaluations. This new approach allows a fast estimation of the best-fitting parameter set in multidimensional spaces, providing a practical tool to test the consequences of including other astrophysical processes in SAMs.
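
    A minimal PSO loop of the kind described, shown with a stand-in objective rather than the SAG model likelihood; the swarm hyperparameters are conventional illustrative choices, not the paper's settings.

    ```python
    # Bare-bones particle swarm optimization. In the calibration above the
    # objective would be the (negative log) likelihood of the SAM
    # reproducing observed galaxy properties; here a sphere function
    # stands in so the script is self-contained.
    import numpy as np

    rng = np.random.default_rng(0)

    def objective(x):                  # placeholder objective, minimum at 0
        return np.sum(x**2, axis=1)

    n_particles, n_dim, n_iter = 30, 4, 200
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia, cognitive, social weights

    pos = rng.uniform(-5, 5, (n_particles, n_dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), objective(pos)
    gbest = pbest[pbest_val.argmin()].copy()

    for _ in range(n_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos += vel
        val = objective(pos)
        better = val < pbest_val
        pbest[better], pbest_val[better] = pos[better], val[better]
        gbest = pbest[pbest_val.argmin()].copy()

    print("best parameters found:", gbest)   # should approach the origin
    ```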

  12. Desorption electrospray ionization-mass spectrometry for the detection of analytes extracted by thin-film molecularly imprinted polymers.

    PubMed

    Van Biesen, Geert; Wiseman, Justin M; Li, Jessica; Bottaro, Christina S

    2010-09-01

    Desorption electrospray ionization-mass spectrometry (DESI-MS) is a powerful technique for the analysis of solid and liquid surfaces that has found numerous applications in the few years since its invention. For the first time, it is applied to the detection of analytes extracted by molecularly imprinted polymers (MIPs) in a thin-film format. MIPs formed with 2,4-dichlorophenoxyacetic acid (2,4-D) as the template were used for the extraction of this analyte from aqueous solutions spiked at concentrations of 0.0050-2.0 mg L(-1) (approximately 2 x 10(-8) to approximately 1 x 10(-5) M). The response was linear up to 0.50 mg L(-1), and then levelled off due to saturation of the active sites of the MIP. In MS/MS mode, the signal at 0.0050 mg L(-1) was still an order of magnitude higher than the signal of a blank. The MIP DESI-MS approach was also used for the analysis of tap water and river water spiked with 2,4-D and four analogues, which indicated that these analogues were also extracted to various extents. For practical applications of the MIP, a detection technique is required that can distinguish between these structurally similar compounds, and DESI-MS fulfills this purpose.

  13. "Soft"or "hard" ionisation? Investigation of metastable gas temperature effect on direct analysis in real-time analysis of Voriconazole.

    PubMed

    Lapthorn, Cris; Pullen, Frank

    2009-01-01

    The performance of the direct analysis in real-time (DART) technique was evaluated across a range of metastable gas temperatures for a pharmaceutical compound, Voriconazole, in order to investigate the effect of metastable gas temperature on molecular ion intensity and fragmentation. The DART source has been used to analyse a range of analytes from a range of matrices, including drugs in solid tablet form and preparations, active ingredients in ointment, naturally occurring plant alkaloids, flavours and fragrances, from thin layer chromatography (TLC) plates, melting point tubes and biological matrices including hair, urine and blood. The advantages of this technique include rapid analysis time (as little as 5 s), a reduction in sample preparation requirements, elimination of the mobile phase requirement and analysis of samples not typically amenable to atmospheric pressure ionisation (API) techniques. This technology has therefore been proposed as an everyday tool for identification of components in crude organic reaction mixtures.

  14. Advantages and Challenges of Dried Blood Spot Analysis by Mass Spectrometry Across the Total Testing Process.

    PubMed

    Zakaria, Rosita; Allen, Katrina J; Koplin, Jennifer J; Roche, Peter; Greaves, Ronda F

    2016-12-01

    Through the introduction of advanced analytical techniques and improved throughput, the scope of dried blood spot testing utilising mass spectrometric methods has broadly expanded. Clinicians and researchers have become very enthusiastic about the potential applications of dried blood spot based mass spectrometric methods. Analysts, on the other hand, face challenges of sensitivity, reproducibility and overall accuracy of dried blood spot quantification. In this review, we aim to bring together these two facets to discuss the advantages and current challenges of non-newborn-screening applications of dried blood spot quantification by mass spectrometry. To address these aims we performed a keyword search of the PubMed and MEDLINE online databases, in conjunction with individual manual searches, to gather information. Keywords for the initial search included "blood spot" and "mass spectrometry", while excluding "newborn" and "neonate". In addition, searches were restricted to English-language, human studies. No time period limit was applied. As a result of these selection criteria, 194 references were identified for review. For presentation, this information is divided into: (1) clinical applications; and (2) analytical considerations across the total testing process, namely pre-analytical, analytical and post-analytical considerations. DBS analysis using MS is now broadly applied, with drug monitoring for both therapeutic and toxicological analysis being the most extensively reported. Several parameters can affect the accuracy of DBS measurement, and further bridging experiments are required to develop adjustment rules for comparability between dried blood spot measures and the equivalent serum/plasma values. Likewise, the establishment of independent reference intervals for the dried blood spot sample matrix is required.

  15. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
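
    As one concrete instance of the Monte Carlo integration mentioned above, the sketch below estimates a definite integral by averaging the integrand at uniform random points; the integrand is an arbitrary example, not one from the article.

    ```python
    # Monte Carlo estimate of the integral of exp(-x^2) over [0, 1]:
    # draw x uniformly, average f(x); (b - a) = 1 supplies the scale.
    import math
    import random

    random.seed(4)
    n = 100_000
    total = sum(math.exp(-random.random() ** 2) for _ in range(n))
    estimate = total / n

    print(f"MC estimate = {estimate:.4f} (exact is about 0.7468)")
    ```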

  16. Thermoelectrically cooled water trap

    DOEpatents

    Micheels, Ronald H [Concord, MA

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples prior to analysis of these samples for chemical composition by a variety of analytical techniques in which water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas monitoring instrumentation.

  17. Efficient techniques for forced response involving linear modal components interconnected by discrete nonlinear connection elements

    NASA Astrophysics Data System (ADS)

    Avitabile, Peter; O'Callahan, John

    2009-01-01

    Generally, response analysis of systems containing discrete nonlinear connection elements, such as typical mounting connections, requires the physical finite element system matrices to be used in a direct integration algorithm to compute the nonlinear response. Due to the large size of these physical matrices, forced nonlinear response analysis requires significant computational resources. Usually, the individual components of the system are analyzed and tested as separate components, and their individual behavior may be essentially linear compared to that of the total assembled system. However, joining these linear subsystems with highly nonlinear connection elements causes the entire system to become nonlinear. It would be advantageous if these linear modal subsystems could be utilized in the forced nonlinear response analysis, since much effort has usually been expended in fine tuning and adjusting the analytical models to reflect the tested subsystem configuration. Several more efficient techniques have been developed to address this class of problem. Three of these techniques, the equivalent reduced model technique (ERMT), the modal modification response technique (MMRT), and the component element method (CEM), are presented in this paper and compared to traditional methods.

  18. On-orbit evaluation of the control system/structural mode interactions on OSO-8

    NASA Technical Reports Server (NTRS)

    Slafer, L. I.

    1980-01-01

    The Orbiting Solar Observatory-8 experienced severe structural mode/control loop interaction problems during spacecraft development. Extensive analytical studies, using the hybrid coordinate modeling approach, and comprehensive ground testing were carried out in order to achieve the system's precision pointing performance requirements. A recent series of flight tests was conducted in which a wide-bandwidth, high-resolution telemetry system was utilized to evaluate the on-orbit flexible dynamics characteristics of the vehicle along with the control system performance. This paper describes the results of these tests, reviewing the basic design problem, the analytical approach taken, the ground test philosophy, and the on-orbit testing. Data from the tests were used to determine the primary mode frequency, damping, and servo coupling dynamics for the on-orbit condition. Additionally, the test results have verified analytically predicted differences between the on-orbit and ground test environments. The test results have led to a validation of both the analytical modeling and servo design techniques used during the development of the control system, and have also verified the approach taken to vehicle and servo ground testing.

  19. A semi-analytical description of protein folding that incorporates detailed geometrical information

    PubMed Central

    Suzuki, Yoko; Noel, Jeffrey K.; Onuchic, José N.

    2011-01-01

    Much has been done to study the interplay between geometric and energetic effects on the protein folding energy landscape. Numerical techniques such as molecular dynamics simulations are able to maintain a precise geometrical representation of the protein. Analytical approaches, however, often focus on the energetic aspects of folding, including geometrical information only in an average way. Here, we investigate a semi-analytical expression of folding that explicitly includes geometrical effects. We consider a Hamiltonian corresponding to a Gaussian filament with structure-based interactions. The model captures local features of protein folding often averaged over by mean-field theories, for example, loop contact formation and excluded volume. We explore the thermodynamics and folding mechanisms of beta-hairpin and alpha-helical structures as functions of temperature and Q, the fraction of native contacts formed. Excluded volume is shown to be an important component of a protein Hamiltonian, since it both dominates the cooperativity of the folding transition and alters folding mechanisms. Understanding geometrical effects in analytical formulae will help illuminate the consequences of the approximations required for the study of larger proteins. PMID:21721664

  20. On-line focusing of flavin derivatives using Dynamic pH junction-sweeping capillary electrophoresis with laser-induced fluorescence detection.

    PubMed

    Britz-McKibbin, Philip; Otsuka, Koji; Terabe, Shigeru

    2002-08-01

    Simple yet effective methods to enhance concentration sensitivity are needed for capillary electrophoresis (CE) to become a practical method to analyze trace levels of analytes in real samples. In this report, the development of a novel on-line preconcentration technique combining dynamic pH junction and sweeping modes of focusing is applied to the sensitive and selective analysis of three flavin derivatives: riboflavin, flavin mononucleotide (FMN) and flavin adenine dinucleotide (FAD). Picomolar (pM) detectability of flavins by CE with laser-induced fluorescence (LIF) detection is demonstrated through effective focusing of large sample volumes (up to 22% of the capillary length) using a dual pH junction-sweeping focusing mode. This results in greater than a 1,200-fold improvement in sensitivity relative to conventional injection methods, giving a limit of detection (S/N = 3) of approximately 4.0 pM for FAD and FMN. Flavin focusing is examined in terms of analyte mobility dependence on buffer pH, borate complexation and SDS interaction. Dynamic pH junction-sweeping extends on-line focusing to both neutral (hydrophobic) and weakly acidic (hydrophilic) species and is considered useful in cases when either conventional sweeping or dynamic pH junction techniques used alone are less effective for certain classes of analytes. Enhanced focusing performance by this hyphenated method was demonstrated by greater than a 4-fold reduction in flavin bandwidth, as compared to either sweeping or dynamic pH junction alone, reflected by analyte detector bandwidths <0.20 cm. Novel on-line focusing strategies are required to improve sensitivity in CE, which may be applied toward more effective biochemical analysis methods for diverse types of analytes.

  1. The VAST Challenge: History, Scope, and Outcomes: An introduction to the Special Issue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Grinstein, Georges; Whiting, Mark A.

    2014-10-01

    Visual analytics aims to facilitate human insight from complex data via a combination of visual representations, interaction techniques, and supporting algorithms. Creating new tools and techniques that achieve this goal requires that researchers have an understanding of the analytical questions to be addressed, data that illustrate the complexities and ambiguities found in realistic analytic settings, and methods for evaluating whether plausible insights are gained through use of the new methods. However, researchers do not, generally speaking, have access to analysts who can articulate their problems or to operational data that are used for analysis. To fill this gap, the Visual Analytics Science and Technology (VAST) Challenge has been held annually since 2006. The VAST Challenge provides an opportunity for researchers to experiment with realistic but not real problems, using realistic synthetic data with known events embedded. Since its inception, the VAST Challenge has evolved along with the visual analytics research community to pose more complex challenges, ranging from text analysis to video analysis to large-scale network log analysis. The seven years of the VAST Challenge have seen advancements in research and development, education, evaluation, and in the challenge process itself. This special issue of Information Visualization highlights some of the noteworthy advancements in each of these areas. Some of these papers focus on important research questions related to the challenge itself, and other papers focus on innovative research that has been shaped by participation in the challenge. This paper describes the VAST Challenge process and benefits in detail. It also provides an introduction to and context for the remaining papers in the issue.

  2. Non-traditional isotopes in analytical ecogeochemistry assessed by MC-ICP-MS

    NASA Astrophysics Data System (ADS)

    Prohaska, Thomas; Irrgeher, Johanna; Horsky, Monika; Hanousek, Ondřej; Zitek, Andreas

    2014-05-01

    Analytical ecogeochemistry deals with the development and application of tools of analytical chemistry to study dynamic biological and ecological processes within ecosystems and across ecosystem boundaries in time. It can best be described as a linkage between modern analytical chemistry and a holistic understanding of ecosystems ('the total human ecosystem') within the frame of transdisciplinary research. One focus of analytical ecogeochemistry is the advanced analysis of elements and isotopes in abiotic and biotic matrices and the application of the results to basic questions in different research fields such as ecology, environmental science, climatology, anthropology, forensics, archaeometry and provenancing. With continuous instrumental developments, new isotopic systems have been recognized for their potential to study natural processes, and well-established systems can be analyzed with improved techniques, especially multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). For example, in the case of S, isotope ratio measurements at high mass resolution can be achieved at much lower S concentrations with ICP-MS than with IRMS, while still keeping suitable uncertainty. Almost 50 different isotope systems have been investigated by ICP-MS so far; besides Sr, Pb and U, the most prominent are Ca, Mg, Cd, Li, Hg, Si, Ge and B, which have considerably pushed the limits of plasma-based mass spectrometry, also by applying high mass resolution. The use of laser ablation in combination with MC-ICP-MS offers the possibility to achieve isotopic information at high spatial (µm-range) and temporal resolution (in the case of incrementally growing structures). The information gained with these analytical techniques can be linked between different hierarchical scales in ecosystems, offering means to better understand ecosystem processes. The presentation will highlight the use of different isotopic systems in ecosystem studies accomplished by ICP-MS. Selected examples of combining isotopic systems for the study of ecosystem processes on different spatial scales will underpin the great opportunities substantiated by the field of analytical ecogeochemistry. Moreover, recent developments in plasma mass spectrometry and the application of new isotopic systems require sound metrological approaches in order to prevent scientific conclusions being drawn from analytical artifacts.

  3. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

    Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System, and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first-order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
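
    A sketch of that building block, a first-order Gauss-Markov process with a prescribed time constant and stationary variance, follows; summing several such processes with spread time constants shapes the approximated spectrum. The constants below are illustrative, not values derived from any oscillator's Allan variance.

    ```python
    # First-order Gauss-Markov process: x[k+1] = phi*x[k] + w[k], with
    # phi = exp(-dt/tau) and the driving noise scaled so the stationary
    # variance equals sigma^2. Five such processes are summed, mirroring
    # the paper's approximation strategy; parameters are illustrative.
    import numpy as np

    def gauss_markov(n, dt, tau, sigma, rng):
        phi = np.exp(-dt / tau)               # one-step correlation
        q = sigma * np.sqrt(1.0 - phi**2)     # preserves stationary variance
        x = np.zeros(n)
        for k in range(1, n):
            x[k] = phi * x[k - 1] + q * rng.standard_normal()
        return x

    rng = np.random.default_rng(1)
    taus   = [1.0, 10.0, 100.0, 1e3, 1e4]     # spread time constants (s)
    sigmas = [1.0, 0.5, 0.3, 0.2, 0.1]
    clock_error = sum(gauss_markov(100_000, 1.0, t, s, rng)
                      for t, s in zip(taus, sigmas))

    print(clock_error[:5])
    ```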

  4. Thermodynamic analysis and subscale modeling of space-based orbit transfer vehicle cryogenic propellant resupply

    NASA Technical Reports Server (NTRS)

    Defelice, David M.; Aydelott, John C.

    1987-01-01

    The resupply of cryogenic propellants is an enabling technology for space-based orbit transfer vehicles. As part of the NASA Lewis ongoing efforts in microgravity fluid management, thermodynamic analysis and subscale modeling techniques were developed to support an on-orbit test bed for cryogenic fluid management technologies. Analytical results have shown that subscale experimental modeling of liquid resupply can be used to validate analytical models when the appropriate target temperature is selected to relate the model to its prototype system. Further analyses were used to develop a thermodynamic model of the tank chilldown process, which is required prior to the no-vent fill operation. These efforts were incorporated into two FORTRAN programs which were used to present preliminary analytical results.

  5. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
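
    A minimal illustration of the fitting techniques the standard names, linear, quadratic, and exponential models, applied to synthetic time-series data:

    ```python
    # Fit linear, quadratic, and exponential trend models to a synthetic
    # time series. The exponential fit y = a*exp(b*t) is done as a linear
    # fit in log space, which assumes y > 0 throughout.
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(20, dtype=float)
    y = 5.0 * np.exp(0.12 * t) + rng.normal(0.0, 1.0, t.size)

    lin = np.polyfit(t, y, 1)                 # [slope, intercept]
    quad = np.polyfit(t, y, 2)                # [a2, a1, a0]
    b, log_a = np.polyfit(t, np.log(y), 1)    # exponential via log-linear

    print("linear:", lin)
    print("quadratic:", quad)
    print("exponential: a =", np.exp(log_a), " b =", b)
    ```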

  6. Quantitative secondary electron imaging for work function extraction at atomic level and layer identification of graphene

    PubMed Central

    Zhou, Yangbo; Fox, Daniel S; Maguire, Pierce; O’Connell, Robert; Masters, Robert; Rodenburg, Cornelia; Wu, Hanchun; Dapor, Maurizio; Chen, Ying; Zhang, Hongzhou

    2016-01-01

    Two-dimensional (2D) materials usually have a layer-dependent work function, which requires fast and accurate detection for the evaluation of device performance. A detection technique with both high throughput and high spatial resolution has not yet been explored. Using a scanning electron microscope, we have developed and implemented a quantitative analytical technique which allows effective extraction of the work function of graphene. This technique uses the secondary electron contrast and has nanometre-resolved layer information. The measurement of few-layer graphene flakes shows the variation of work function between graphene layers with a precision of less than 10 meV. It is expected that this technique will prove extremely useful for researchers in a broad range of fields due to its revolutionary throughput and accuracy. PMID:26878907

  7. Analytical methods in multivariate highway safety exposure data estimation

    DOT National Transportation Integrated Search

    1984-01-01

    Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting, and expectation maximization; the second of these is sketched below.
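
    A sketch of iterative proportional fitting, which rescales a seed contingency table until its row and column sums match target margins; the table and margins are invented.

    ```python
    # Iterative proportional fitting (IPF): alternately scale rows and
    # columns of a seed table to match target margins. Totals of the two
    # margins must agree (here both sum to 100).
    import numpy as np

    seed = np.array([[1.0, 2.0],
                     [3.0, 4.0]])
    row_targets = np.array([40.0, 60.0])
    col_targets = np.array([30.0, 70.0])

    table = seed.copy()
    for _ in range(100):
        table *= (row_targets / table.sum(axis=1))[:, None]  # fix rows
        table *= (col_targets / table.sum(axis=0))[None, :]  # fix columns

    print(table)   # row sums ~ [40, 60], column sums ~ [30, 70]
    ```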

  8. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.

  9. Chemical fractionation-enhanced structural characterization of marine dissolved organic matter

    NASA Astrophysics Data System (ADS)

    Arakawa, N.; Aluwihare, L.

    2016-02-01

    Describing the molecular fingerprint of dissolved organic matter (DOM) requires sample processing methods and separation techniques that can adequately minimize its complexity. We have employed acid hydrolysis as a way to make the subcomponents of marine solid-phase-extracted (PPL) DOM more accessible to analytical techniques. Using a combination of NMR and chemical derivatization or reduction analyzed by comprehensive two-dimensional gas chromatography (GCxGC), we observed chemical features strikingly similar to terrestrial DOM. In particular, we observed reduced alicyclic hydrocarbons believed to be the backbone of the previously identified carboxylic-rich alicyclic material (CRAM). Additionally, we found carbohydrates, amino acids, and small lipids and acids.

  10. Analytical design of an advanced radial turbine. [automobile engines

    NASA Technical Reports Server (NTRS)

    Large, G. D.; Finger, D. G.; Linder, C. G.

    1981-01-01

    The aerodynamic and mechanical potential of a single stage ceramic radial inflow turbine was evaluated for a high temperature single stage automotive engine. The aerodynamic analysis utilizes a turbine system optimization technique to evaluate both radial and nonradial rotor blading. Selected turbine rotor configurations were evaluated mechanically with three dimensional finite element techniques. Results indicate that exceptionally high rotor tip speeds (2300 ft/sec) and performance potential are feasible with radial bladed rotors if the projected ceramic material properties are realized. Nonradial rotors reduced tip speed requirements (at constant turbine efficiency) but resulted in a lower cumulative probability of success due to higher blade and disk stresses.

  11. Temperature control of the Mariner class spacecraft - A seven mission summary.

    NASA Technical Reports Server (NTRS)

    Dumas, L. N.

    1973-01-01

    Mariner spacecraft have completed five missions of scientific investigation of the planets. Two additional missions are planned. A description of the thermal design of these seven spacecraft is given herein. The factors which have influenced the thermal design include the mission requirements and constraints, the flight environment, certain programmatic considerations and the experience gained as each mission is completed. These factors are reviewed and the impact of each on thermal design and developmental techniques is assessed. It is concluded that the flight success of these spacecraft indicates that adequate temperature control has been obtained, but that improvements in design data, hardware performance and analytical techniques are needed.

  12. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  13. Inexpensive but accurate driving circuits for quartz crystal microbalances

    NASA Astrophysics Data System (ADS)

    Bruschi, L.; Delfitto, G.; Mistura, G.

    1999-01-01

    The quartz crystal microbalance (QCM) is a common technique which finds a wide variety of applications in many different areas such as adsorption, catalysis, analytical chemistry and biochemistry, and more generally as a sensor in the investigation of viscoelastic films. In this article we describe several driving circuits for the quartz which we have built and tested in our laboratory. These can be assembled from standard, readily available components. Their performance, in some cases, is as good as that of the much more expensive frequency modulation technique employed in very precise QCM measurements, which requires high-quality commercial radiofrequency generators and amplifiers.

  14. Titration in the treatment of the more troubled patient.

    PubMed

    Winer, J A; Ornstein, E D

    2001-01-01

    This article defines and discusses a modification of technique recommended by the authors in the psychoanalytic treatment of more troubled patients--a modification they call titration. Titration is defined as a conscious decision by the analyst to increase or decrease assistance (or gratification) gradually, in order to facilitate the analytic process. The authors emphasize the complexity of decisions in treatment by focusing on the decision-making processes that titration requires. Guidelines and a case vignette are presented. The authors conclude by considering some of the politics involved in the introduction of technique modifications, the salience of the titration concept, and directions for further exploration.

  15. Non-Cooperative Group Decision Support Systems: Problems and Some Solutions.

    DTIC Science & Technology

    1986-09-01

    appears that in these situations the content of the problem and the structure of the problem is "fuzzy." It requires an active cooperation between the...some unstructured parts will remain. This partial 'unstructurability' is due to uncertainty, fuzziness, ignorance, and an inability to...according to the Analytic Hierarchy Process (AHP) technique (Gui, 1985). The AHP algorithm consists of the following steps: (i) perform a pairwise comparison
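
    The AHP step quoted in the snippet, deriving priority weights from a pairwise comparison matrix via its principal eigenvector, looks roughly as follows; the judgment matrix is invented for illustration.

    ```python
    # AHP priority weights from a reciprocal pairwise comparison matrix:
    # the normalized principal eigenvector gives the weights, and the
    # consistency index flags contradictory judgments.
    import numpy as np

    A = np.array([[1.0,   3.0,   5.0],
                  [1/3.0, 1.0,   2.0],
                  [1/5.0, 1/2.0, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    principal = eigvecs[:, eigvals.real.argmax()].real
    weights = principal / principal.sum()      # normalized priorities

    lambda_max = eigvals.real.max()
    ci = (lambda_max - len(A)) / (len(A) - 1)  # consistency index

    print("weights:", weights.round(3), " CI:", round(ci, 3))
    ```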

  16. Advanced Signal Processing Analysis of Laser-Induced Breakdown Spectroscopy Data for the Discrimination of Obsidian Sources

    DTIC Science & Technology

    2012-02-09

    different sources [12,13], but the analytical techniques needed for such analysis (XRD, INAA, & ICP-MS) are time consuming and require expensive...partial least-squares discriminant analysis (PLSDA) that used the SIMPLS solving method [33]. In the experiment design, a leave-one-sample-out (LOSO) para...
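
    PLS-DA is commonly realized as PLS regression onto one-hot class labels, with the predicted class taken as the largest output. The sketch below uses scikit-learn's PLSRegression rather than the SIMPLS solver cited in the report, and random stand-in "spectra" instead of LIBS data.

    ```python
    # PLS-DA via PLS regression on one-hot labels. X mimics a set of
    # spectra (rows) over wavelength channels (columns); both X and y
    # are random placeholders.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(3)
    X = rng.normal(size=(60, 200))        # 60 spectra, 200 channels
    y = rng.integers(0, 3, size=60)       # 3 hypothetical obsidian sources
    Y = np.eye(3)[y]                      # one-hot encoding

    pls = PLSRegression(n_components=5).fit(X, Y)
    pred = pls.predict(X).argmax(axis=1)  # class with largest response

    print("training accuracy:", (pred == y).mean())
    ```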

  17. Analytical Bibliography for Water Supply and Conservation Techniques.

    DTIC Science & Technology

    1982-01-01

    effective due to rapidly increasing costs of water and wastewater services from centralized systems. The report may be used as a primary reference for...conservation kit. Each kit contained a toilet water dam, a plastic shower-head restrictor, and a packet of vegetable dye tablets to detect leaks from toilet...water in the 50 sub-basins of the North Atlantic Region (NAR) of the United States. The water-flow requirements (water demands) were disaggregated by

  18. High Head Unshrouded Impeller Pump Stage Technology

    NASA Technical Reports Server (NTRS)

    Williams, Robert W.; Skelley, Stephen E.; Stewart, Eric T.; Droege, Alan R.; Prueger, George H.; Chen, Wei-Chung; Williams, Morgan; Turner, James E. (Technical Monitor)

    2000-01-01

    A team of engineers at NASA/MSFC and Boeing, Rocketdyne division, is developing unshrouded impeller technologies that will increase payload and decrease cost of future reusable launch vehicles. Using the latest analytical techniques and experimental data, a two-stage unshrouded fuel pump is being designed to meet the performance requirements of a three-stage shrouded pump. Benefits of the new pump include lower manufacturing costs, reduced weight, and increased payload to orbit.

  19. Planning effectiveness may grow on fault trees.

    PubMed

    Chow, C W; Haddad, K; Mannino, B

    1991-10-01

    The first step of a strategic planning process--identifying and analyzing threats and opportunities--requires subjective judgments. By using an analytical tool known as a fault tree, healthcare administrators can reduce the unreliability of subjective decision making by creating a logical structure for problem solving and decision making. A case study of 11 healthcare administrators showed that an analysis technique called prospective hindsight can add to a fault tree's ability to improve a strategic planning process.
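
    A fault tree reduces a subjective judgment to a logical structure of AND/OR gates over basic events. The sketch below computes a top-event probability for a hypothetical planning threat; the gates, events, and probabilities are invented for illustration and are not taken from the case study.

    ```python
    # Minimal fault-tree sketch assuming independent basic events.
    def and_gate(*p):   # the event occurs only if all inputs occur
        out = 1.0
        for x in p:
            out *= x
        return out

    def or_gate(*p):    # the event occurs if at least one input occurs
        out = 1.0
        for x in p:
            out *= (1.0 - x)
        return 1.0 - out

    p_staff_shortage = 0.10   # hypothetical basic-event probabilities
    p_budget_cut = 0.05
    p_demand_shift = 0.20

    # "Threat materializes" if demand shifts AND (staffing OR budget) fails.
    p_top = and_gate(p_demand_shift, or_gate(p_staff_shortage, p_budget_cut))
    print(f"top event probability: {p_top:.4f}")
    ```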

  20. Specifying and calibrating instrumentations for wideband electronic power measurements. [in switching circuits]

    NASA Technical Reports Server (NTRS)

    Lesco, D. J.; Weikle, D. H.

    1980-01-01

    Topics related to wideband electric power measurement, namely electronic wattmeter calibration and specification, are discussed. Tested calibration techniques are described in detail. Analytical methods used to determine the bandwidth requirements of instrumentation for switching circuit waveforms are presented and illustrated with examples from electric vehicle type applications. Analog multiplier wattmeters, digital wattmeters and calculating digital oscilloscopes are compared. The instrumentation characteristics which are critical to accurate wideband power measurement are described.
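
    One analytical way to assess such bandwidth requirements is to decompose sampled voltage and current into harmonics and check how much of the average power falls below a candidate instrument bandwidth. The sketch below does this for an idealized switching waveform; the sampling rate, waveforms, and 10 kHz bandwidth are assumptions, not values from the paper.

    ```python
    import numpy as np

    # Average power of a switching waveform from sampled v(t), i(t),
    # plus a spectral check of the power captured below a bandwidth.
    fs = 1_000_000                                  # 1 MHz sampling (assumed)
    t = np.arange(0, 0.01, 1/fs)                    # 10 ms record
    v = 100 * np.sign(np.sin(2*np.pi*1000*t))       # 1 kHz square-wave voltage
    i = 5 * np.sin(2*np.pi*1000*t)                  # sinusoidal current

    p_avg = np.mean(v * i)                          # true average power

    # Cross-spectrum: per-harmonic contribution to the average power.
    V = np.fft.rfft(v) / len(v)
    I = np.fft.rfft(i) / len(i)
    f = np.fft.rfftfreq(len(v), 1/fs)
    p_f = np.real(V * np.conj(I))
    p_f[1:] *= 2                                    # fold in negative frequencies
    print(p_avg, p_f[f <= 10_000].sum())            # power captured below 10 kHz
    ```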

  1. Introduction to Food Analysis

    NASA Astrophysics Data System (ADS)

    Nielsen, S. Suzanne

    Investigations in food science and technology, whether by the food industry, governmental agencies, or universities, often require determination of food composition and characteristics. Trends and demands of consumers, the food industry, and national and international regulations challenge food scientists as they work to monitor food composition and to ensure the quality and safety of the food supply. All food products require analysis as part of a quality management program throughout the development process (including raw ingredients), through production, and after a product is in the market. In addition, problem samples and competitor products are analyzed. The characteristics of foods (i.e., chemical composition, physical properties, sensory properties) are used to answer specific questions for regulatory purposes and typical quality control. The nature of the sample and the specific reason for the analysis commonly dictate the choice of analytical methods. Speed, precision, accuracy, and ruggedness often are key factors in this choice. Validation of the method for the specific food matrix being analyzed is necessary to ensure its usefulness. Making an appropriate choice of the analytical technique for a specific application requires a good knowledge of the various techniques (Fig. 1.1). For example, the choice of method to determine the salt content of potato chips would differ depending on whether the result is for nutrition labeling or for quality control. The success of any analytical method relies on proper selection and preparation of the food sample, careful performance of the analysis, and appropriate calculation and interpretation of the data. Methods of analysis developed and endorsed by several nonprofit scientific organizations allow for standardized comparisons of results between different laboratories and for evaluation of less standard procedures. Such official methods are critical in the analysis of foods to ensure that they meet the legal requirements established by governmental agencies. Government regulations and international standards most relevant to the analysis of foods are mentioned here but covered in more detail in Chap. 2, and nutrition labeling regulations in the USA are covered in Chap. 3. Internet addresses for many of the organizations and government agencies discussed are given at the end of this chapter.

  2. A new concept of pencil beam dose calculation for 40-200 keV photons using analytical dose kernels.

    PubMed

    Bartzsch, Stefan; Oelfke, Uwe

    2013-11-01

    The advent of widespread kV cone-beam computed tomography in image-guided radiation therapy and special therapeutic applications of keV photons, e.g., in microbeam radiation therapy (MRT), require accurate and fast dose calculations for photon beams with energies between 40 and 200 keV. Multiple photon scattering originating from Compton scattering and the strong dependence of the photoelectric cross section on the atomic number of the interacting tissue render these dose calculations far more challenging than the ones established for corresponding MeV beams. Analytical models of kV photon dose calculation developed so far therefore fail to provide the required accuracy, and one has had to rely on time-consuming Monte Carlo simulation techniques. In this paper, the authors introduce a novel analytical approach for kV photon dose calculations with an accuracy almost comparable to that of Monte Carlo simulations. First, analytical point dose and pencil beam kernels are derived for homogeneous media and compared to Monte Carlo simulations performed with the Geant4 toolkit. The dose contributions are systematically separated into contributions from the relevant orders of multiple photon scattering. Moreover, approximate scaling laws for the extension of the algorithm to inhomogeneous media are derived. The comparison of the analytically derived dose kernels in water showed excellent agreement with the Monte Carlo method. Calculated values deviate less than 5% from Monte Carlo derived dose values, for doses above 1% of the maximum dose. The analytical structure of the kernels allows adaptation to arbitrary materials and photon spectra in the given energy range of 40-200 keV. The presented analytical methods can be employed in a fast treatment planning system for MRT. In convolution-based algorithms, dose calculation times can be reduced to a few minutes.
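
    The convolution-based calculation mentioned in the closing sentence can be sketched as a fluence map convolved with a pencil-beam kernel. The Gaussian kernel below is a placeholder for the paper's analytically derived kernels, and the field size, grid, and kernel width are arbitrary assumptions.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    # Convolution-based dose sketch: fluence (x) pencil-beam kernel.
    grid = np.linspace(-20, 20, 401)             # mm, 0.1 mm spacing
    X, Y = np.meshgrid(grid, grid)

    fluence = ((np.abs(X) < 5) & (np.abs(Y) < 5)).astype(float)  # 10x10 mm field

    sigma = 0.5                                  # mm, kernel width (assumed)
    kernel = np.exp(-(X**2 + Y**2) / (2 * sigma**2))
    kernel /= kernel.sum()                       # deposit all energy

    dose = fftconvolve(fluence, kernel, mode="same")
    print(dose.max(), dose[200, 200])            # central-axis dose
    ```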

  3. Gas chromatography coupled to tunable pulsed glow discharge time-of-flight mass spectrometry for environmental analysis.

    PubMed

    Solà-Vázquez, Auristela; Lara-Gonzalo, Azucena; Costa-Fernández, José M; Pereiro, Rosario; Sanz-Medel, Alfredo

    2010-05-01

    A tuneable microsecond-pulsed direct current glow discharge (GD) time-of-flight mass spectrometer, MS(TOF), developed in our laboratory was coupled to a gas chromatograph (GC) to obtain sequential collection of the mass spectra, at the different temporal regimes occurring in the GD pulses, during elution of the analytes. The capabilities of this set-up were explored using a mixture of volatile organic compounds of environmental concern: BrClCH, Cl(3)CH, Cl(4)C, BrCl(2)CH, Br(2)ClCH, Br(3)CH. The experimental parameters of the GC-pulsed GD-MS(TOF) prototype were optimized in order to appropriately separate and analyze the six selected organic compounds, and two GC carrier gases, helium and nitrogen, were evaluated. Mass spectra for all analytes were obtained in the prepeak, plateau and afterpeak temporal regimes of the pulsed GD. Results showed that helium offered the best elemental sensitivity, while nitrogen provided higher signal intensities for fragment and molecular peaks. The analytical performance characteristics were also worked out for each analyte. Absolute detection limits obtained were on the order of nanograms. In a second step, headspace solid-phase microextraction (HS-SPME), as a sample preparation and preconcentration technique, was evaluated for the quantification of the compounds under study, in order to achieve the analytical sensitivity required by European Union (EU) environmental legislation for trihalomethanes. The analytical figures of merit obtained using the proposed methodology showed rather good detection limits (between 2 and 13 microg L(-1) depending on the analyte). In fact, the developed methodology met the EU legislation requirements (the maximum level permitted in tap water for "total trihalomethanes" is set at 100 microg L(-1)). Analyses of real drinking water and river water samples were successfully carried out. To our knowledge this is the first application of GC-pulsed GD-MS(TOF) for the analysis of real samples. Its ability to provide elemental, fragment, and molecular information on the organic compounds is demonstrated.

  4. Combined sensing platform for advanced diagnostics in exhaled mouse breath

    NASA Astrophysics Data System (ADS)

    Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris

    2013-03-01

    Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for therapeutic progression monitoring, as quantitative compositional analysis of breath can be related to biomarker panels associated with specific physiological conditions invoked by, e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on, e.g., the metabolic state, and since in particular volatile organic constituents (VOCs) in exhaled breath may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb - or even lower - concentration levels. While individual analytical techniques such as mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow simultaneous analysis of total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee (TTR) ratio (via IR), selected VOCs (via IR) and O2 (via luminescence) in exhaled breath, thereby establishing a single diagnostic platform in which both sensors simultaneously interact with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with a novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, and miniaturized fiberoptic luminescence sensors, for establishing a multi-constituent breath analysis tool that is ideally compatible with mouse intensive care stations (MICU). Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas chromatography coupled to mass spectrometric detection. Here, we aim at continuous analysis of the TTR via iHWGs and LS flow-through sensors requiring only minute (<1 mL) sample volumes. Furthermore, this study explores non-linearities observed in the calibration functions of 12CO2 and 13CO2, potentially resulting from effects related to optical collision diameters, e.g., in the presence of molecular oxygen. It is anticipated that the simultaneous continuous analysis of oxygen via LS will facilitate the correction of these effects after inclusion within appropriate multivariate calibration models, thus providing more reliable and robust calibration schemes for continuously monitoring relevant breath constituents.

  5. Stable oxygen and hydrogen isotopes of brines - comparing isotope ratio mass spectrometry and isotope ratio infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Ahrens, Christian; Koeniger, Paul; van Geldern, Robert; Stadler, Susanne

    2013-04-01

    Today's standard analytical methods for high-precision stable isotope analysis of fluids are gas-water equilibration and high-temperature pyrolysis coupled to isotope ratio mass spectrometers (IRMS). In recent years, relatively new laser-based analytical instruments have entered the market that promise high-precision isotope data on nearly any medium. This optical technique is referred to as isotope ratio infrared spectroscopy (IRIS). The objective of this study is to evaluate the capability of this new instrument type for highly saline solutions and to compare the analytical results with traditional IRMS analysis. It has been shown for the equilibration method that the presence of salts influences the measured isotope values depending on the salt concentration (see Lécuyer et al., 2009; Martineau et al., 2012). This so-called 'isotope salt effect' depends on the salt type and salt concentration. These factors change the activity in the fluid and therefore shift the isotope ratios measured by the equilibration method. Consequently, correction factors have to be applied to these analytical data. Direct conversion techniques like pyrolysis or the new laser instruments measure the water molecule from the sample directly and should therefore not suffer from the salt effect, i.e. no corrections of raw values are necessary. However, high salt concentrations might cause technical problems with the analytical hardware and may require labor-intensive sample preparation (e.g. vacuum distillation). This study evaluates the isotope salt effect for the IRMS equilibration technique (Thermo Gasbench II coupled to a Delta Plus XP) and a laser-based IRIS instrument with liquid injection (Picarro L2120-i). Synthetic salt solutions (NaCl, KCl, CaCl2, MgCl2, MgSO4, CaSO4) and natural brines collected from the Stassfurt Salt Anticline (Germany; Stadler et al., 2012) were analysed with both techniques. Salt concentrations ranged from seawater salinity up to full saturation. References: Lécuyer, C. et al. (2009). Chem. Geol., 264, 122-126. [doi:10.1016/j.chemgeo.2009.02.017] Martineau, F. et al. (2012). Chem. Geol., 291, 236-240. [doi:10.1016/j.chemgeo.2011.10.017] Stadler, S. et al. (2012). Chem. Geol., 294-295, 226-242. [doi:10.1016/j.chemgeo.2011.12.006]
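
    For context, both IRMS and IRIS instruments report stable isotope results in delta notation relative to a reference standard such as VSMOW; the short sketch below shows that conversion. The sample ratio is invented, while the VSMOW 18O/16O value is the commonly quoted one.

    ```python
    # Delta notation for stable isotope ratios, relative to VSMOW.
    R_VSMOW_18O = 2005.2e-6   # 18O/16O ratio of the VSMOW standard

    def delta_permil(r_sample, r_standard):
        return (r_sample / r_standard - 1.0) * 1000.0

    print(delta_permil(2003.0e-6, R_VSMOW_18O))  # ~ -1.1 permil (toy sample)
    ```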

  6. Improving entrepreneurial opportunity recognition through web content analytics

    NASA Astrophysics Data System (ADS)

    Bakar, Muhamad Shahbani Abu; Azmi, Azwiyati

    2017-10-01

    The ability to recognize and develop an opportunity into a venture defines an entrepreneur. Research in opportunity recognition has been robust and focuses on explaining the processes involved in opportunity recognition. Factors such as prior knowledge and cognitive and creative capabilities are shown to affect opportunity recognition in entrepreneurs. Prior knowledge in areas such as customer problems, ways to serve the market, and technology has been shown in various studies to facilitate entrepreneurs' identification and recognition of opportunities. Research findings also show that experienced entrepreneurs search and scan for information to discover opportunities. Searching and scanning for information has also been shown to help novice entrepreneurs who lack prior knowledge to narrow this gap and to better identify and recognize opportunities. There is less focus in research on finding empirically proven techniques and methods to develop and enhance opportunity recognition in student entrepreneurs. This is important as the country pushes for more graduate entrepreneurs who can drive the economy. This paper discusses the Opportunity Recognition Support System (ORSS), an information support system to help student entrepreneurs in particular to identify and recognize business opportunities. The ORSS aims to provide student entrepreneurs with the knowledge necessary to better identify and recognize opportunities. Applying design research, theories in opportunity recognition are used to identify the requirements for the support system, and the requirements in turn dictate its design. The paper proposes the use of web content mining and analytics as the two core components and techniques for the support system. Web content mining can mine the vast knowledge repositories available on the internet, and analytics can provide entrepreneurs with further insights into the information needed to recognize opportunities in a given market or industry.
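
    The web-content-mining component could, for instance, surface salient terms from crawled pages as cues for opportunity recognition. The sketch below uses TF-IDF keyword extraction on stand-in snippets; it is one plausible realization of such a step, not the ORSS design itself, and a real system would first crawl industry pages.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer

    # Stand-in web content; a real pipeline would fetch and clean pages.
    docs = [
        "customers complain delivery apps ignore rural areas",
        "rural logistics startups attract venture funding",
        "grocery delivery demand grows outside major cities",
    ]

    vec = TfidfVectorizer(stop_words="english")
    X = vec.fit_transform(docs)
    terms = vec.get_feature_names_out()

    # Top-weighted terms per document hint at possible market gaps.
    for row in X.toarray():
        top = row.argsort()[::-1][:3]
        print([terms[i] for i in top])
    ```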

  7. Classification of user interfaces for graph-based online analytical processing

    NASA Astrophysics Data System (ADS)

    Michaelis, James R.

    2016-05-01

    In the domain of business intelligence, user-oriented software for conducting multidimensional analysis via Online Analytical Processing (OLAP) is now commonplace. In this setting, datasets commonly have well-defined sets of dimensions and measures around which analysis tasks can be conducted. However, many forms of data used in intelligence operations - deriving from social networks, online communications, and text corpora - will consist of graphs with varying forms of potential dimensional structure. Hence, enabling OLAP over such data collections requires explicit definition and extraction of supporting dimensions and measures. Further, as Graph OLAP remains an emerging technique, limited research has been done on its user interface requirements, namely on effectively pairing interface designs with different types of graph-derived dimensions and measures. This paper presents a novel technique for pairing user interface designs with Graph OLAP datasets, rooted in Analytic Hierarchy Process (AHP) driven comparisons. Attributes of the classification strategy are encoded through an AHP ontology, developed in our alternate work and extended to support pairwise comparison of interfaces according to their ability, as perceived by Subject Matter Experts (SMEs), to support dimensions and measures corresponding to Graph OLAP dataset attributes. To frame this discussion, a survey is provided both of existing variations of Graph OLAP and of interface designs previously applied in multidimensional analysis settings. Following this, a review of our AHP ontology is provided, along with a listing of corresponding dataset and interface attributes applicable toward SME recommendation structuring. A walkthrough of AHP-based recommendation encoding via the ontology-based approach is then provided. The paper concludes with a short summary of proposed future directions seen as essential for this research area.

  8. QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study

    NASA Astrophysics Data System (ADS)

    Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa

    2016-10-01

    Conceptual design is a subset of concept art in which a new product idea is created rather than a visual representation that would be used directly in a final product. The purpose here is to understand the needs that conceptual design serves in engineering design and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services to enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD, translating customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out in order to reduce the CR portion of the HOQ. The analytic hierarchy process is employed to obtain the priority ratings of CRs used in constructing the HOQ. This paper discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the integrated QFD-ANP methodology helps establish the importance ratings of DRs.

  9. Multi-disciplinary optimization of aeroservoelastic systems

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1990-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.
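
    The loop described here and in the companion record below, analytic sensitivity derivatives feeding a gradient-based minimizer under performance constraints, can be miniaturized as follows. The cost, constraint, and gradient are toy stand-ins for the structural/control model, solved with SciPy's SLSQP; none of the numbers come from the paper.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def cost(x):            # "structural weight" surrogate (assumed form)
        return x[0]**2 + 2*x[1]**2 + 0.5*x[2]**2

    def margin(x):          # must stay non-negative, like a flutter margin
        return x[0] + x[1] + x[2] - 3.0

    # Analytic gradient plays the role of the sensitivity derivatives.
    res = minimize(cost, x0=np.ones(3), method="SLSQP",
                   jac=lambda x: np.array([2*x[0], 4*x[1], x[2]]),
                   constraints=[{"type": "ineq", "fun": margin}])
    print(res.x, cost(res.x))
    ```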

  10. Multidisciplinary optimization of aeroservoelastic systems using reduced-size models

    NASA Technical Reports Server (NTRS)

    Karpel, Mordechay

    1992-01-01

    Efficient analytical and computational tools for simultaneous optimal design of the structural and control components of aeroservoelastic systems are presented. The optimization objective is to achieve aircraft performance requirements and sufficient flutter and control stability margins with a minimal weight penalty and without violating the design constraints. Analytical sensitivity derivatives facilitate an efficient optimization process which allows a relatively large number of design variables. Standard finite element and unsteady aerodynamic routines are used to construct a modal data base. Minimum State aerodynamic approximations and dynamic residualization methods are used to construct a high accuracy, low order aeroservoelastic model. Sensitivity derivatives of flutter dynamic pressure, control stability margins and control effectiveness with respect to structural and control design variables are presented. The performance requirements are utilized by equality constraints which affect the sensitivity derivatives. A gradient-based optimization algorithm is used to minimize an overall cost function. A realistic numerical example of a composite wing with four controls is used to demonstrate the modeling technique, the optimization process, and their accuracy and efficiency.

  11. Inline roasting hyphenated with gas chromatography-mass spectrometry as an innovative approach for assessment of cocoa fermentation quality and aroma formation potential.

    PubMed

    Van Durme, Jim; Ingels, Isabel; De Winne, Ann

    2016-08-15

    Today, the cocoa industry is in great need of faster and more robust analytical techniques to objectively assess incoming cocoa quality. In this work, inline roasting hyphenated with a cooled injection system coupled to a gas chromatograph-mass spectrometer (ILR-CIS-GC-MS) has been explored for the first time to assess fermentation quality and/or overall aroma formation potential of cocoa. This innovative approach resulted in the in-situ formation of relevant cocoa aroma compounds. After comparison with data obtained by headspace solid-phase microextraction (HS-SPME-GC-MS) on conventionally roasted cocoa beans, ILR-CIS-GC-MS data on unroasted cocoa beans showed similar formation trends of important cocoa aroma markers as a function of fermentation quality. The latter approach requires only small aliquots of unroasted cocoa beans, can be automated, requires no sample preparation, needs relatively short analytical times (<1 h) and is highly reproducible. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Energy-dispersive X-ray fluorescence systems as analytical tool for assessment of contaminated soils.

    PubMed

    Vanhoof, Chris; Corthouts, Valère; Tirez, Kristof

    2004-04-01

    To determine the heavy metal content of soil samples at contaminated locations, a static and time-consuming procedure is used in most cases: soil samples are collected and analyzed in the laboratory at high quality and high analytical cost. Demand is growing from government and consultants for a more dynamic approach, and from customers who require analyses performed in the field with immediate feedback of the analytical results. Field analyses are especially advisable during the follow-up of remediation projects or when determining the sampling strategy. For this purpose four types of ED-XRF systems, ranging from portable up to high-performance laboratory systems, were evaluated. The evaluation criteria are based on the performance characteristics of the ED-XRF systems, such as limit of detection, accuracy and measurement uncertainty, on the one hand, and the influence of sample pretreatment on the obtained results on the other. The study showed that the field-portable system and the bench-top system, placed in a mobile van, can be applied as field techniques, yielding semi-quantitative analytical results. A limited homogenization of the analyzed sample significantly increases the representativeness of the soil sample. The ED-XRF systems can be differentiated by their limits of detection, which are a factor of 10 to 20 higher for the portable system. The accuracy of the results and the measurement uncertainty also improved using the bench-top system. Therefore, the selection criteria for the applicability of both field systems are based on the required detection level and the required accuracy of the results.

  13. Surface enhanced Raman spectroscopy (SERS) from a molecule adsorbed on a nanoscale silver particle cluster in a holographic plate

    NASA Astrophysics Data System (ADS)

    Jusinski, Leonard E.; Bahuguna, Ramen; Das, Amrita; Arya, Karamjeet

    2006-02-01

    Surface enhanced Raman spectroscopy has become a viable technique for the detection of single molecules. Its high sensitivity is due to the very large (up to 14 orders of magnitude) enhancement in the Raman cross section when the molecule is adsorbed on a metal nanoparticle cluster. We report here SERS (Surface Enhanced Raman Spectroscopy) experiments performed by adsorbing analyte molecules on nanoscale silver particle clusters within the gelatin layer of commercially available holographic plates which have been developed and fixed. The Ag particles range in size from 5 to 30 nanometers (nm). Sample preparation was performed by immersing the prepared holographic plate in an analyte solution for a few minutes. We report the production of SERS signals from Rhodamine 6G (R6G) molecules at nanomolar concentration. These measurements demonstrate a fast, low-cost, reproducible technique for producing SERS substrates in a matter of minutes, compared to the conventional procedure of preparing Ag clusters from colloidal solutions. SERS-active colloidal solutions require up to a full day to prepare. In addition, colloidal aggregate preparations are inconsistent in shape, contain additional interfering chemicals, and do not generate consistent SERS enhancement. Colloidal solutions require the addition of KCl or NaCl to increase the ionic strength to allow aggregation and cluster formation. We find no need to add KCl or NaCl to create SERS-active clusters in the holographic gelatin matrix. These holographic plates, prepared using simple, conventional procedures, can be stored in an inert environment and preserve SERS activity for several weeks after preparation.

  14. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.

  15. Towards nanometric resolution in multilayer depth profiling: a comparative study of RBS, SIMS, XPS and GDOES.

    PubMed

    Escobar Galindo, Ramón; Gago, Raul; Duday, David; Palacio, Carlos

    2010-04-01

    An increasing amount of effort is currently being directed towards the development of new functionalized nanostructured materials (i.e., multilayers and nanocomposites). Using an appropriate combination of composition and microstructure, it is possible to optimize and tailor the final properties of the material to its final application. The analytical characterization of these new complex nanostructures requires high-resolution analytical techniques that are able to provide information about surface and depth composition at the nanometric level. In this work, we comparatively review the state of the art in four different depth-profiling characterization techniques: Rutherford backscattering spectroscopy (RBS), secondary ion mass spectrometry (SIMS), X-ray photoelectron spectroscopy (XPS) and glow discharge optical emission spectroscopy (GDOES). In addition, we predict future trends in these techniques regarding improvements in their depth resolutions. Subnanometric resolution can now be achieved in RBS using magnetic spectrometry systems. In SIMS, the use of rotating sample holders and oxygen flooding during analysis as well as the optimization of floating low-energy ion guns to lower the impact energy of the primary ions improves the depth resolution of the technique. Angle-resolved XPS provides a very powerful and nondestructive technique for obtaining depth profiling and chemical information within the range of a few monolayers. Finally, the application of mathematical tools (deconvolution algorithms and a depth-profiling model), pulsed sources and surface plasma cleaning procedures is expected to greatly improve GDOES depth resolution.

  16. An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Collis, Peter

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…

  17. A Learning Framework for Control-Oriented Modeling of Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and have the ability to be adapted continuously to account for changing conditions as new data becomes available. Data-driven modeling techniques investigated so far, while promising in the context of buildings, have been unable to simultaneously satisfy all the requirements mentioned above. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology outperforms other data driven modeling techniques significantly. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework that can drive several use cases related to building energy management.
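
    A minimal version of the proposed approach, an RNN trained to predict building load from its recent history, is sketched below in PyTorch. The synthetic sine-plus-noise load, window length, and network size are assumptions; the paper's architecture and data are not reproduced here.

    ```python
    import math
    import torch
    import torch.nn as nn

    # Synthetic hourly "load" with a daily cycle; stand-in for real data.
    torch.manual_seed(0)
    t = torch.arange(0, 2000, dtype=torch.float32)
    load = torch.sin(2 * math.pi * t / 24) + 0.1 * torch.randn_like(t)

    # Supervised windows: 24 past hours -> next hour.
    win = 24
    X = torch.stack([load[i:i+win] for i in range(len(load) - win)]).unsqueeze(-1)
    y = load[win:].unsqueeze(-1)

    class LoadRNN(nn.Module):
        def __init__(self, hidden=32):
            super().__init__()
            self.lstm = nn.LSTM(1, hidden, batch_first=True)
            self.head = nn.Linear(hidden, 1)
        def forward(self, x):
            out, _ = self.lstm(x)
            return self.head(out[:, -1])      # use last time step's state

    model = LoadRNN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for epoch in range(5):                    # short demo training loop
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X), y)
        loss.backward()
        opt.step()
        print(epoch, loss.item())
    ```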

  18. Individual Human Cell Responses to Low Doses of Chemicals and Radiation Studied by Synchrotron Infrared Spectromicroscopy

    NASA Astrophysics Data System (ADS)

    Martin, Michael C.; Holman, Hoi-Ying N.; Blakely, Eleanor A.; Goth-Goldstein, Regine; McKinney, Wayne R.

    2000-03-01

    Vibrational spectroscopy, when combined with synchrotron radiation-based (SR) microscopy, is a powerful new analytical tool with high spatial resolution for detecting biochemical changes in individual living cells. In contrast to other microscopy methods that require fixing, drying, staining or labeling, SR FTIR microscopy probes intact living cells, providing a composite view of all of the molecular responses and the ability to monitor the responses over time in the same cell. Observed spectral changes include all types of lesions induced in that cell as well as cellular responses to external and internal stresses. These spectral changes, combined with other analytical tools, may provide a fundamental understanding of the key molecular mechanisms induced in response to stresses created by low doses of radiation and chemicals. In this study we used high spatial-resolution SR FTIR vibrational spectromicroscopy at ALS Beamline 1.4.3 as a sensitive analytical tool to detect chemical- and radiation-induced changes in individual human cells. Our preliminary spectral measurements indicate that this technique is sensitive enough to detect changes in nucleic acids and proteins of cells treated with environmentally relevant concentrations of oxidative stressors: bleomycin, hydrogen peroxide, and X-rays. We observe spectral changes that are unique to each exogenous stressor. This technique has the potential to distinguish changes from exogenous or endogenous oxidative processes. Future development of this technique will allow rapid monitoring of cellular processes such as drug metabolism, early detection of disease, bio-compatibility of implant materials, cellular repair mechanisms, self assembly of cellular apparatus, cell differentiation and fetal development.

  19. A simple 197Hg RNAA procedure for the determination of mercury in urine, blood, and tissue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blotcky, A.J.; Rack, E.P.; Meade, A.G.

    1995-12-31

    Mercury has been implicated as a causal agent in such central nervous system diseases as Alzheimer's and Parkinson's. Consequently, there has been increased interest in the determination of ultra-trace-level mercury in biological matrices, especially in tissue. While such nonnuclear techniques as cold vapor atomic absorption spectrometry and cold vapor atomic fluorescence spectrometry have been employed routinely for mercury determinations in urine and blood, there is a paucity of nonnuclear techniques for the determination of mercury in the low parts-per-billion range in biological tissue. As pointed out by Fardy and Warner, instrumental and radiochemical neutron activation analysis (INAA and RNAA) require no blank determinations, in contrast to nonnuclear analytical techniques employing digestion and/or chemical operations. Therefore, INAA and RNAA become the obvious choices for determination of ultra-trace levels of mercury in tissue. Most separation methods reported in the literature require different and separate methodologies for mercury determinations in urine, blood, or tissue. The purposes of this study are to develop a single methodology for the determination of low levels of mercury in all biological matrices by RNAA and to optimize the parameters necessary for an efficacious trace-level determination. Previously, few studies have taken into account the effects of Szilard-Chalmers reactions of the radioactivatable analyte within a biological matrix. It also appears that little attention has been given to the optimum postirradiation carrier concentration of the analyte species. This study discusses these various considerations.

  20. Graph Theoretic Foundations of Multibody Dynamics Part I: Structural Properties

    PubMed Central

    Jain, Abhinandan

    2011-01-01

    This is the first part of two papers that use concepts from graph theory to obtain a deeper understanding of the mathematical foundations of multibody dynamics. The key contribution is the development of a unifying framework that shows that key analytical results and computational algorithms in multibody dynamics are a direct consequence of structural properties and require minimal assumptions about the specific nature of the underlying multibody system. This first part focuses on identifying the abstract graph theoretic structural properties of spatial operator techniques in multibody dynamics. The second paper exploits these structural properties to develop a broad spectrum of analytical results and computational algorithms. Towards this, we begin with the notion of graph adjacency matrices and generalize it to define block-weighted adjacency (BWA) matrices and their 1-resolvents. Previously developed spatial operators are shown to be special cases of such BWA matrices and their 1-resolvents. These properties are shown to hold broadly for serial and tree topology multibody systems. Specializations of the BWA and 1-resolvent matrices are referred to as spatial kernel operators (SKO) and spatial propagation operators (SPO). These operators and their special properties provide the foundation for the analytical and algorithmic techniques developed in the companion paper. We also use the graph theory concepts to study the topology-induced sparsity structure of these operators and the system mass matrix. Similarity transformations of these operators are also studied. While the detailed development is done for the case of rigid-link multibody systems, the extension of these techniques to a broader class of systems (e.g. deformable links) is illustrated. PMID:22102790
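
    The core construction, an adjacency matrix whose 1-resolvent propagates quantities along the tree, can be illustrated with scalar weights on a short serial chain. The weight below is hypothetical, and scalars stand in for the 6x6 spatial blocks used for rigid multibody systems.

    ```python
    import numpy as np

    # Weighted adjacency matrix A for a 3-link serial chain and its
    # 1-resolvent (I - A)^{-1}.
    n = 3
    A = np.zeros((n, n))
    for k in range(n - 1):
        A[k + 1, k] = 0.9   # hypothetical parent-to-child transmission weight

    resolvent = np.linalg.inv(np.eye(n) - A)
    print(resolvent)
    # Strictly lower-triangular A is nilpotent, so the resolvent equals the
    # finite sum I + A + A^2, propagating quantities down the chain.
    ```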

  1. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback of py-GCMS, aside from its destructive nature, is that this technique is relatively time intensive in comparison to other techniques. Direct analysis in real time coupled to time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as its rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested that the two techniques provide analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was evaluated to characterize intact paint chips from the vehicles, to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool for the forensic paint examiner.
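
    The principal component analysis step used for sample discrimination can be sketched as follows. The spectra are random stand-ins, so only the mechanics of projecting spectra onto two principal components is shown, not the study's clustering result.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Stand-in spectra: 20 observations, 500 m/z bins, 4 "vehicles".
    rng = np.random.default_rng(1)
    spectra = rng.normal(size=(20, 500))
    labels = np.repeat(["car_A", "car_B", "car_C", "car_D"], 5)

    # Project to the first two principal components for visual grouping.
    scores = PCA(n_components=2).fit_transform(spectra)
    for lab, (pc1, pc2) in zip(labels, scores):
        print(lab, round(pc1, 2), round(pc2, 2))
    ```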

  2. Moving your laboratories to the field--Advantages and limitations of the use of field portable instruments in environmental sample analysis.

    PubMed

    Gałuszka, Agnieszka; Migaszewski, Zdzisław M; Namieśnik, Jacek

    2015-07-01

    The recent rapid progress in field-portable instrument technology has increased the applications of these instruments in environmental sample analysis. They offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field-portable techniques. Several stationary analytical instruments have portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet-visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis offers the possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on instrument performance, and a high possibility of sample contamination in the field. This paper reviews recent applications of field-portable instruments in environmental sample analysis and discusses their analytical capabilities. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Sampling Mars: Analytical requirements and work to do in advance

    NASA Technical Reports Server (NTRS)

    Koeberl, Christian

    1988-01-01

    Sending a mission to Mars to collect samples and return them to Earth for analysis is without doubt one of the most exciting and important tasks for planetary science in the near future. Many scientifically important questions are associated with knowledge of the composition and structure of Martian samples. Among the most exciting is clarification of the SNC problem: proving or disproving a possible Martian origin of these meteorites. Since SNC meteorites have been used to infer the chemistry of the planet Mars and its evolution (including the accretion history), it would be important to know whether the whole story is true. But before addressing possible scientific results, we have to deal with the analytical requirements and with possible pre-return work. It is unrealistic to expect that a Mars sample return mission will bring back anything close to the amount returned by the Apollo missions. It will be more like the amount returned by the Luna missions, or at least of that order of magnitude. This requires very careful sample selection and very precise analytical techniques; these techniques should use minimal sample sizes while maximizing the scientific output. The ability to work with extremely small samples should not obscure another problem: sampling error. As we know from terrestrial geochemical studies, sampling procedures must be quite elaborate to avoid sampling errors. The significance of analyzing a milligram- or submilligram-sized sample and relating it to the genesis of whole planetary crusts has to be viewed with care. This leaves a dilemma: on the one hand, to minimize the sample size as far as possible in order to return as many different samples as possible, and on the other hand, to take samples large enough to be representative. Whole-rock samples are very useful, but should not exceed the 20 to 50 g range, except in cases of extreme inhomogeneity, because for larger samples the information tends to become redundant. Soil samples should be in the 2 to 10 g range, permitting the splitting of the returned samples for studies in different laboratories with a variety of techniques.

  4. Evaluation of new laser spectrometer techniques for in-situ carbon monoxide measurements

    NASA Astrophysics Data System (ADS)

    Zellweger, C.; Steinbacher, M.; Buchmann, B.

    2012-10-01

    Long-term time series of the atmospheric composition are essential for environmental research and thus require compatible, multi-decadal monitoring activities. The current data quality objectives of the World Meteorological Organization (WMO) for carbon monoxide (CO) in the atmosphere are very challenging to meet with the measurement techniques that have been used until recently. During the past few years, new spectroscopic techniques came to market with promising properties for trace gas analytics. The current study compares three instruments that have recently become commercially available (since 2011) with the best currently available technique (Vacuum UV Fluorescence) and provides a link to previous comparison studies. The instruments were investigated for their performance regarding repeatability, reproducibility, drift, temperature dependence, water vapour interference and linearity. Finally, all instruments were examined during a short measurement campaign to assess their applicability for long-term field measurements. It could be shown that the new techniques perform considerably better compared to previous techniques, although some issues, such as temperature influence and cross sensitivities, need further attention.

  5. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Treesearch

    Marcelo Ard& #243; n; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  6. Reduction of multi-dimensional laboratory data to a two-dimensional plot: a novel technique for the identification of laboratory error.

    PubMed

    Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A

    2007-01-01

    The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
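
    Generative topographic mapping is not available in common Python libraries, so the sketch below substitutes t-SNE as the 14-dimensional-to-2-D projection to show the anomaly-screening idea. The patient records are synthetic, and one gross error is injected deliberately; the substitution of t-SNE for GTM is an assumption for illustration only.

    ```python
    import numpy as np
    from sklearn.manifold import TSNE
    from sklearn.preprocessing import StandardScaler

    # Synthetic multi-analyte records: 500 patients x 14 analytes,
    # with one record corrupted to mimic a laboratory error.
    rng = np.random.default_rng(2)
    records = rng.normal(size=(500, 14))
    records[0] += 8.0                      # simulated gross error

    emb = TSNE(n_components=2, random_state=0).fit_transform(
        StandardScaler().fit_transform(records))

    # Records far from the bulk of the 2-D map are candidates for review.
    center = emb.mean(axis=0)
    dist = np.linalg.norm(emb - center, axis=1)
    print("most anomalous record:", dist.argmax())  # likely index 0
    ```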

  7. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions

    NASA Astrophysics Data System (ADS)

    Donahue, William; Newhauser, Wayne D.; Ziegler, James F.

    2016-09-01

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u-1 to 450 MeV u-1 or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.
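
    A classic example of a simple, broadly applicable analytic range model in this spirit is the Bragg-Kleeman rule R = aE^p. The sketch below fits it to approximate proton CSDA ranges in water (values quoted from memory of PSTAR-like tables, so treat them as illustrative); this is not the authors' 6-parameter model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Approximate proton CSDA ranges in water (illustrative values).
    E = np.array([10.0, 50.0, 100.0, 150.0, 200.0])   # MeV
    R = np.array([0.123, 2.23, 7.72, 15.8, 26.0])     # g/cm^2 (approx.)

    bragg_kleeman = lambda E, a, p: a * E**p          # R = a * E^p
    (a, p), _ = curve_fit(bragg_kleeman, E, R, p0=(0.002, 1.8))
    print(a, p)                          # p is typically near 1.75-1.8
    print(bragg_kleeman(250.0, a, p))    # extrapolated range at 250 MeV
    ```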

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaptation of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups--Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis--which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.

  9. Analytical model for ion stopping power and range in the therapeutic energy interval for beams of hydrogen and heavier ions.

    PubMed

    Donahue, William; Newhauser, Wayne D; Ziegler, James F

    2016-09-07

    Many different approaches exist to calculate stopping power and range of protons and heavy charged particles. These methods may be broadly categorized as physically complete theories (widely applicable and complex) or semi-empirical approaches (narrowly applicable and simple). However, little attention has been paid in the literature to approaches that are both widely applicable and simple. We developed simple analytical models of stopping power and range for ions of hydrogen, carbon, iron, and uranium that spanned intervals of ion energy from 351 keV u(-1) to 450 MeV u(-1) or wider. The analytical models typically reproduced the best-available evaluated stopping powers within 1% and ranges within 0.1 mm. The computational speed of the analytical stopping power model was 28% faster than a full-theoretical approach. The calculation of range using the analytic range model was 945 times faster than a widely-used numerical integration technique. The results of this study revealed that the new, simple analytical models are accurate, fast, and broadly applicable. The new models require just 6 parameters to calculate stopping power and range for a given ion and absorber. The proposed model may be useful as an alternative to traditional approaches, especially in applications that demand fast computation speed, small memory footprint, and simplicity.

  10. Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.

    ERIC Educational Resources Information Center

    Hercules, David M.; Hercules, Shirley H.

    1984-01-01

    Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)

  11. Tunable, Flexible, and Efficient Optimization of Control Pulses for Practical Qubits

    NASA Astrophysics Data System (ADS)

    Machnes, Shai; Assémat, Elie; Tannor, David; Wilhelm, Frank K.

    2018-04-01

    Quantum computation places very stringent demands on gate fidelities, and experimental implementations require both the controls and the resultant dynamics to conform to hardware-specific constraints. Superconducting qubits present the additional requirement that pulses must have simple parameterizations, so they can be further calibrated in the experiment, to compensate for uncertainties in system parameters. Other quantum technologies, such as sensing, require extremely high fidelities. We present a novel, conceptually simple and easy-to-implement gradient-based optimal control technique named gradient optimization of analytic controls (GOAT), which satisfies all the above requirements, unlike previous approaches. To demonstrate GOAT's capabilities, with emphasis on flexibility and ease of subsequent calibration, we optimize fast coherence-limited pulses for two leading superconducting qubits architectures—flux-tunable transmons and fixed-frequency transmons with tunable couplers.
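
    A toy flavor of the underlying idea, gradient-based optimization over an analytic pulse ansatz, is sketched below for a single commuting control where the gradient has a closed form; the full GOAT method handles general non-commuting dynamics via coupled ODEs, which this sketch does not implement. All parameter values are assumptions.

    ```python
    import numpy as np

    # A Gaussian drive Omega(t) = A*exp(-(t-T/2)^2/(2*s^2)) rotates a qubit
    # by theta = integral(Omega)dt; tune A so theta = pi (an X gate).
    T, s = 10.0, 1.5
    t = np.linspace(0.0, T, 1001)
    dt = t[1] - t[0]
    envelope = np.exp(-(t - T/2)**2 / (2 * s**2))

    theta = lambda A: A * envelope.sum() * dt    # rotation angle
    dtheta_dA = envelope.sum() * dt              # analytic sensitivity

    A = 0.1                                      # initial amplitude guess
    for _ in range(50):                          # descend (theta - pi)^2
        A -= 0.05 * 2.0 * (theta(A) - np.pi) * dtheta_dA
    print(A, theta(A))                           # theta converges to pi
    ```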

  12. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of a sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards such as extensible service-oriented architecture, open standard protocols, event-driven programming model, enterprise service bus, and adaptive user interfaces to provide a strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment in Oak Ridge National Laboratory (ORNL).

  13. Analytical N beam position monitor method

    NASA Astrophysics Data System (ADS)

    Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.

    2017-11-01

    Measurement and correction of focusing errors is of great importance for the performance and machine protection of circular accelerators. Furthermore, the LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also set on the speed of optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β-function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.
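
    The abstract gives no formulas. For orientation only, the established three-BPM estimate that the N-BPM family generalizes is commonly written as follows (our notation, a form widely quoted in the optics-measurement literature: φ are measured phase advances from BPM 1 to BPMs 2 and 3, ψ the corresponding model values):

```latex
\beta_1^{\mathrm{meas}} \;=\; \beta_1^{\mathrm{model}}\,
\frac{\cot\varphi_{12} - \cot\varphi_{13}}{\cot\psi_{12} - \cot\psi_{13}}
```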

  14. Shape resonances of Be⁻ and Mg⁻ investigated with the method of analytic continuation

    NASA Astrophysics Data System (ADS)

    Čurík, Roman; Paidarová, I.; Horáček, J.

    2018-05-01

    The regularized method of analytic continuation is used to study the low-energy negative-ion states of beryllium (configuration 2s²εp ²P) and magnesium (configuration 3s²εp ²P) atoms. The method applies an additional perturbation potential and requires only routine bound-state multi-electron quantum calculations. Such computations are accessible by most of the free or commercial quantum chemistry software available for atoms and molecules. The perturbation potential is implemented as a spherical Gaussian function with a fixed width. Stability of the analytic continuation technique with respect to the width and with respect to the input range of electron affinities is studied in detail. The computed resonance parameters Er = 0.282 eV, Γ = 0.316 eV for the 2p state of Be⁻ and Er = 0.188 eV, Γ = 0.167 eV for the 3p state of Mg⁻ agree well with the best results obtained by much more elaborate and computationally demanding present-day methods.

  15. Literature search of publications concerning the prediction of dynamic inlet flow distortion and related topics

    NASA Technical Reports Server (NTRS)

    Schweikhard, W. G.; Chen, Y. S.

    1983-01-01

    Publications prior to March 1981 were surveyed to determine inlet flow dynamic distortion prediction methods and to catalog experimental and analytical information concerning inlet flow dynamics at the engine-inlet interface of conventional aircraft (excluding V/STOL). The sixty-five publications found are briefly summarized and tabulated according to topic and are cross-referenced according to content and nature of the investigation (e.g., predictive, experimental, analytical, and types of tests). Three appendices list the references, the authors, and the organizations and agencies conducting the studies. Selected materials from the reports (summaries, introductions, and conclusions) are also included. Few reports were found covering methods for predicting the probable maximum distortion. The three predictive methods found are those of Melick, Jacox, and Motycka. The latter two require extensive high-response pressure measurements at the compressor face, while the Melick technique can function with as few as one or two measurements.

  16. Analysis and Sizing for Transient Thermal Heating of Insulated Aerospace Vehicle Structures

    NASA Technical Reports Server (NTRS)

    Blosser, Max L.

    2012-01-01

    An analytical solution was derived for the transient response of an insulated structure subjected to a simplified heat pulse. The solution is solely a function of two nondimensional parameters. Simpler functions of these two parameters were developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective thermal properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Equations were also developed for the minimum mass required to maintain the inner, unheated surface below a specified temperature. In the course of the derivation, two figures of merit were identified. Required insulation masses calculated using the approximate equation were shown to typically agree with finite element results within 10%-20% over the relevant range of parameters studied.
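
    The abstract does not name the two nondimensional parameters. Purely as an assumption about their likely flavor, a standard nondimensionalization of transient conduction through an insulator into a structural heat sink yields a Fourier-type number for the insulator over the pulse duration t_p and a thermal-capacitance ratio (these definitions are ours, not the report's):

```latex
\tau = \frac{\alpha_{\mathrm{ins}}\, t_p}{L_{\mathrm{ins}}^2},
\qquad
C^{*} = \frac{(\rho c\, \delta)_{\mathrm{struct}}}{(\rho c\, L)_{\mathrm{ins}}}
```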

  17. Proposed techniques for launching instrumented balloons into tornadoes

    NASA Technical Reports Server (NTRS)

    Grant, F. C.

    1971-01-01

    A method is proposed to introduce instrumented balloons into tornadoes by means of the radial pressure gradient, which supplies a buoyancy force driving them toward the center. Analytical expressions, verified by computer calculations, are presented which show the possibility of introducing instrumented balloons into tornadoes at or below the cloud base. The times required to reach the center are small enough that a large fraction of tornadoes are suitable for the technique. An experimental procedure is outlined in which a research airplane places an instrumented, self-inflating balloon on the track ahead of the tornado. The uninflated balloon waits until the tornado closes to, typically, 750 meters; then it quickly inflates and spirals up and into the core, taking roughly 3 minutes. Since the drive toward the center is automatically produced by the radial pressure gradient, a proper launch radius is the only guidance requirement.

  18. Occam's Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel (Open Source)

    DTIC Science & Technology

    2016-02-15

    do not quote them here. A sequel details a yet more efficient analytic technique based on holomorphic functions of the internal-state Markov chain... required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state... minimal, unifilar predictor. The ε-machine's causal states σ ∈ Σ are defined by the equivalence relation that groups all histories x_{-∞:0} that...

  19. PROGRAM ASTEC (ADVANCED SOLAR TURBO ELECTRIC CONCEPT). PART IV. SOLAR COLLECTOR DEVELOPMENT SUPPORT TASKS. VOL. VI. DEVELOPMENT OF ANALYTICAL TECHNIQUES TO PREDICT THE STRUCTURAL BEHAVIOR OF PETAL-TYPE SOLAR COLLECTORS.

    DTIC Science & Technology

    The design of large petal-type paraboloidal solar collectors for the ASTEC Program requires a capability for determining the distortion and stress...analysis of a parabolic curved beam is given along with a numerical solution and digital program. The dynamic response of the ASTEC flight-test vehicle is discussed on the basis of modal analysis.

  20. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    1990-01-01

    A major design and analysis challenge for the JWST ISIM structure is thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30 K (−405 °F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings, and composite gusset/clip joints, all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75 mm square). Analytical experience and design work done on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited, and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Increasing this challenge is the difficulty of testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce the required analytical tools and develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on the developed techniques and properties.

  1. A Methodology for Determining Statistical Performance Compliance for Airborne Doppler Radar with Forward-Looking Turbulence Detection Capability

    NASA Technical Reports Server (NTRS)

    Bowles, Roland L.; Buck, Bill K.

    2009-01-01

    The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward-looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used due to the near impossibility of obtaining sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) Radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides one means, but not the only means, by which an applicant can demonstrate compliance with the FAA-directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risk to aircraft, passengers, and crew; they were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this methodology for calculating the probability of missed and false hazard indications, taking into account the effects of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.
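
    The report's algorithms are not detailed in the abstract, but the two certification quantities have the standard decision-theoretic form: with a scalar hazard metric x, a detection threshold η, and conditional densities of x under the hazard and no-hazard conditions,

```latex
P_{\mathrm{false}} = \int_{\eta}^{\infty} p(x \mid \mathrm{no\ hazard})\,\mathrm{d}x,
\qquad
P_{\mathrm{missed}} = \int_{-\infty}^{\eta} p(x \mid \mathrm{hazard})\,\mathrm{d}x
```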

  2. Advanced DPSM approach for modeling ultrasonic wave scattering in an arbitrary geometry

    NASA Astrophysics Data System (ADS)

    Yadav, Susheel K.; Banerjee, Sourav; Kundu, Tribikram

    2011-04-01

    Several techniques are used to diagnose structural damage. In the ultrasonic technique, structures are tested by analyzing ultrasonic signals scattered by damage. The interpretation of these signals requires a good understanding of the interaction between ultrasonic waves and structures. Therefore, researchers need analytical or numerical techniques to have a clear understanding of the interaction between ultrasonic waves and structural damage. However, modeling of wave scattering phenomena by conventional numerical techniques such as the finite element method requires a very fine mesh at high frequencies, necessitating heavy computational power. The distributed point source method (DPSM) is a newly developed, robust, mesh-free technique to simulate ultrasonic, electrostatic, and electromagnetic fields. In most of the previous studies, the DPSM technique has been applied to model two-dimensional surface geometries and simple three-dimensional scatterer geometries; it was difficult to perform the analysis for complex three-dimensional geometries. This technique has been extended here to model wave scattering in an arbitrary geometry. In this paper, a channel section idealized as a thin solid plate with several rivet holes is formulated. The simulation has been carried out with and without cracks near the rivet holes, and a comparison study has also been carried out to characterize the crack. A computer code has been developed in C for modeling the ultrasonic field in a solid plate with and without cracks near the rivet holes.
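
    The abstract describes the DPSM idea (mesh-free superposition of point sources) without equations. A minimal sketch under simplifying assumptions: a scalar time-harmonic field, the free-space Green's function e^{ikr}/(4πr), and source strengths solved by least squares to match a prescribed boundary value on target points. The real DPSM formulation for elastic waves is considerably richer; geometry and values here are invented for illustration.

```python
import numpy as np

# Minimal scalar DPSM sketch: synthesize point-source strengths so that the
# superposed field matches a prescribed value on boundary target points.
k = 2 * np.pi / 0.005          # wavenumber for a 5 mm wavelength (illustrative)

def greens(src, tgt):
    """Matrix of free-space Green's functions G = exp(ikr) / (4 pi r)."""
    r = np.linalg.norm(tgt[:, None, :] - src[None, :, :], axis=-1)
    return np.exp(1j * k * r) / (4 * np.pi * r)

# Point sources slightly behind a transducer face; targets on the face itself.
xs = np.linspace(-0.01, 0.01, 20)
sources = np.stack([xs, np.zeros_like(xs), -0.001 * np.ones_like(xs)], axis=1)
targets = np.stack([xs, np.zeros_like(xs), np.zeros_like(xs)], axis=1)

G = greens(sources, targets)
p_boundary = np.ones(len(targets), dtype=complex)   # uniform face excitation
strengths, *_ = np.linalg.lstsq(G, p_boundary, rcond=None)

# Field radiated at observation points in front of the transducer:
obs = np.array([[0.0, 0.0, z] for z in np.linspace(0.002, 0.03, 5)])
field = greens(sources, obs) @ strengths
print(np.abs(field))
```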

  3. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  4. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  5. Strategies for Distinguishing Abiotic Chemistry from Martian Biochemistry in Samples Returned from Mars

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Burton, A. S.; Callahan, M. P.; Elsila, J. E.; Stern, J. C.; Dworkin, J. P.

    2012-01-01

    A key goal in the search for evidence of extinct or extant life on Mars will be the identification of chemical biosignatures, including complex organic molecules common to all life on Earth. These include amino acids, the monomer building blocks of proteins and enzymes, and nucleobases, which serve as the structural basis of information storage in DNA and RNA. However, many of these organic compounds can also be formed abiotically, as demonstrated by their prevalence in carbonaceous meteorites [1]. Therefore, an important challenge in the search for evidence of life on Mars will be distinguishing abiotic chemistry of either meteoritic or martian origin from any chemical biosignatures of an extinct or extant martian biota. Although current robotic missions to Mars, including the 2011 Mars Science Laboratory (MSL) and the planned 2018 ExoMars rovers, will have the analytical capability needed to identify these key classes of organic molecules if present [2,3], return of a diverse suite of martian samples to Earth would allow for much more intensive laboratory studies using a broad array of extraction protocols and state-of-the-art analytical techniques for bulk and spatially resolved characterization, molecular detection, and isotopic and enantiomeric compositions that may be required for unambiguous confirmation of martian life. Here we describe current state-of-the-art laboratory analytical techniques that have been used to characterize the abundance and distribution of amino acids and nucleobases in meteorites, Apollo samples, and comet-exposed materials returned by the Stardust mission, with an emphasis on the molecular characteristics that can be used to distinguish abiotic chemistry from biochemistry as we know it. The study of organic compounds in carbonaceous meteorites is highly relevant to Mars sample return analysis, since exogenous organic matter should have accumulated in the martian regolith over the last several billion years, and the analytical techniques previously developed for the study of extraterrestrial materials can be applied to martian samples.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eppich, G.; Kips, R.; Lindvall, R.

    The CUP-2 uranium ore concentrate (UOC) standard reference material, a powder, was produced at the Blind River uranium refinery of Eldorado Resources Ltd. in Canada in 1986. This material was produced as part of a joint effort by the Canadian Certified Reference Materials Project and the Canadian Uranium Producers Metallurgical Committee to develop a certified reference material for uranium concentration and the concentration of several impurity constituents. This standard was developed to satisfy the requirements of the UOC mining and milling industry, and was characterized with this purpose in mind. To produce CUP-2, approximately 25 kg of UOC derived from the Blind River uranium refinery was blended, homogenized, and assessed for homogeneity by X-ray fluorescence (XRF) analysis. The homogenized material was then packaged into bottles, containing 50 g of material each, and distributed for analysis to laboratories in 1986. The CUP-2 UOC standard was characterized by an interlaboratory analysis program involving eight member laboratories, six commercial laboratories, and three additional volunteer laboratories. Each laboratory provided five replicate results on up to 17 analytes, including total uranium concentration and moisture content. The selection of analytical technique was left to each participating laboratory. Uranium was reported on an “as-received” basis; all other analytes (besides moisture content) were reported on a “dry-weight” basis. A bottle of 25 g of the CUP-2 UOC standard as described above was purchased by LLNL and characterized by the LLNL Nuclear Forensics Group. Non-destructive and destructive analytical techniques were applied to the UOC sample. Information obtained from short-term techniques such as photography, gamma spectrometry, and scanning electron microscopy was used to guide the performance of longer-term techniques such as ICP-MS. Some techniques, such as XRF and ICP-MS, provided complementary types of data. The results indicate that the CUP-2 standard has a natural uranium isotopic composition: it does not appear to have been isotopically enriched or depleted in any way, nor contaminated by a source of uranium with a non-natural isotopic composition. Furthermore, the lack of 233U and 236U above the instrumental detection limit indicates that this sample was not exposed to a neutron flux, which would have generated one or both of these isotopes in measurable concentrations.

  7. Advantages and Challenges of Dried Blood Spot Analysis by Mass Spectrometry Across the Total Testing Process

    PubMed Central

    Zakaria, Rosita; Allen, Katrina J.; Koplin, Jennifer J.; Roche, Peter

    2016-01-01

    Introduction Through the introduction of advanced analytical techniques and improved throughput, the scope of dried blood spot testing utilising mass spectrometric methods has broadly expanded. Clinicians and researchers have become very enthusiastic about the potential applications of dried blood spot based mass spectrometric methods. Analysts, on the other hand, face challenges of sensitivity, reproducibility, and overall accuracy of dried blood spot quantification. In this review, we aim to bring together these two facets to discuss the advantages and current challenges of non-newborn screening applications of dried blood spot quantification by mass spectrometry. Methods To address these aims, we performed a keyword search of the PubMed and MEDLINE online databases, in conjunction with individual manual searches, to gather information. Keywords for the initial search included “blood spot” and “mass spectrometry”, while excluding “newborn” and “neonate”. In addition, searches were restricted to English-language and human-specific studies. No time period limit was applied. Results As a result of these selection criteria, 194 references were identified for review. For presentation, this information is divided into: (1) clinical applications, and (2) analytical considerations across the total testing process, i.e. pre-analytical, analytical, and post-analytical considerations. Conclusions DBS analysis using MS applications is now broadly applied, with drug monitoring for both therapeutic and toxicological analysis being the most extensively reported. Several parameters can affect the accuracy of DBS measurement, and further bridging experiments are required to develop adjustment rules for comparability between dried blood spot measures and the equivalent serum/plasma values. Likewise, the establishment of independent reference intervals for the dried blood spot sample matrix is required. PMID:28149263

  8. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
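
    As a minimal illustration of the class of fits the Standard covers (linear, quadratic, and exponential trend models over time-series data; the data and parameter values here are synthetic, invented for the example):

```python
import numpy as np

# Minimal sketch of the trend fits named in the Standard: linear, quadratic,
# and exponential models over time-series data (synthetic data for illustration).
t = np.arange(24, dtype=float)                       # e.g. months
y = 5.0 * np.exp(0.08 * t) + np.random.default_rng(0).normal(0, 0.5, t.size)

lin = np.polyfit(t, y, 1)                            # y ~ a*t + b
quad = np.polyfit(t, y, 2)                           # y ~ a*t^2 + b*t + c
expo = np.polyfit(t, np.log(y), 1)                   # log y ~ p*t + q  =>  y ~ e^q e^(p t)

print("linear coefficients:", lin)
print("quadratic coefficients:", quad)
print("exponential: rate=%.3f, scale=%.3f" % (expo[0], np.exp(expo[1])))
```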

  9. Approximate analytical relationships for linear optimal aeroelastic flight control laws

    NASA Astrophysics Data System (ADS)

    Kassem, Ayman Hamdy

    1998-09-01

    This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.

  10. An analytical study of reduced-gravity liquid reorientation using a simplified marker and cell technique

    NASA Technical Reports Server (NTRS)

    Betts, W. S., Jr.

    1972-01-01

    A computer program called HOPI was developed to predict reorientation flow dynamics, wherein liquids move from one end of a closed, partially filled, rigid container to the other end under the influence of container acceleration. The program uses the simplified marker and cell numerical technique and, using explicit finite differencing, solves the Navier-Stokes equations for an incompressible viscous fluid. The effects of turbulence are also simulated in the program. HOPI can consider curved as well as straight-walled boundaries. Both free-surface and confined flows can be calculated. The program was used to simulate five liquid reorientation cases. Three of these cases simulated actual NASA LeRC drop tower test conditions, while two cases simulated full-scale Centaur tank conditions. It was concluded that while HOPI can be used to analytically determine the fluid motion in a typical settling problem, there is a need to optimize HOPI, both by reducing its computation time and by reducing the core storage required for a given problem size.
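
    For reference, the equations HOPI advances with explicit finite differences (incompressible, viscous flow, with the container acceleration entering as the body force g) are:

```latex
\frac{\partial \mathbf{u}}{\partial t}
+ (\mathbf{u}\cdot\nabla)\mathbf{u}
= -\frac{1}{\rho}\nabla p + \nu \nabla^{2}\mathbf{u} + \mathbf{g},
\qquad
\nabla\cdot\mathbf{u} = 0
```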

  11. A real time sorbent based air monitoring system for determining low level airborne exposure levels to Lewisite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lattin, F.G.; Paul, D.G.; Jakubowski, E.M.

    1994-12-31

    The Real Time Analytical Platform (RTAP) is designed to provide mobile, real-time monitoring support to ensure protection of worker safety in areas where military-unique compounds are used and stored, and at disposal sites. Quantitative analysis of low-level vapor concentrations in air is accomplished through sorbent-based collection with subsequent thermal desorption into a gas chromatograph (GC) equipped with a variety of detectors. The monitoring system is characterized by its sensitivity (ability to measure at low concentrations), selectivity (ability to filter out interferences), dynamic range and linearity, real-time mode (versus methods requiring extensive sample preparation procedures), and ability to interface with complementary GC detectors. This presentation describes an RTAP analytical method for analyzing lewisite, an arsenical compound, that consists of a GC screening technique with an Electron Capture Detector (ECD) and a confirmation technique using an Atomic Emission Detector (AED). Included in the presentation is a description of quality assurance objectives in the monitoring system, and an assessment of method accuracy, precision, and detection levels.

  12. Preparation of Ion Exchange Films for Solid-Phase Spectrophotometry and Solid-Phase Fluorometry

    NASA Technical Reports Server (NTRS)

    Hill, Carol M.; Street, Kenneth W.; Tanner, Stephen P.; Philipp, Warren H.

    2000-01-01

    Atomic spectroscopy has dominated the field of trace inorganic analysis because of its high sensitivity and selectivity. The advantages gained by the atomic spectroscopies come with the disadvantage of expensive and often complicated instrumentation. Solid-phase spectroscopy, in which the analyte is preconcentrated on a solid medium followed by conventional spectrophotometry or fluorometry, requires less expensive instrumentation and has considerable sensitivity and selectivity. The sensitivity gains come from preconcentration and the use of chromophore (or fluorophore) developers and the selectivity is achieved by use of ion exchange conditions that favor the analyte in combination with speciative chromophores. Little work has been done to optimize the ion exchange medium (IEM) associated with these techniques. In this report we present a method for making ion exchange polymer films, which considerably simplify the solid-phase spectroscopic techniques. The polymer consists of formaldehyde-crosslinked polyvinyl alcohol with polyacrylic acid entrapped therein. The films are a carboxylate weak cation exchanger in the calcium form. They are mechanically sturdy and optically transparent in the ultraviolet and visible portion of the spectrum, which makes them suitable for spectrophotometry and fluorometry.

  13. Analytical methods for determination of mycotoxins: An update (2009-2014).

    PubMed

    Turner, Nicholas W; Bramhmbhatt, Heli; Szabo-Vezse, Monika; Poma, Alessandro; Coker, Raymond; Piletsky, Sergey A

    2015-12-11

    Mycotoxins are a problematic and toxic group of small organic molecules that are produced as secondary metabolites by several fungal species that colonise crops. They lead to contamination at both the field and postharvest stages of food production, with a considerable range of foodstuffs affected, from coffee and cereals to dried fruit and spices. Given the wide-ranging structural diversity of mycotoxins, the severe toxic effects caused by these molecules, and their high chemical stability, the requirement for robust and effective detection methods is clear. This paper builds on our previous review and summarises the most recent advances in this field, in the years 2009-2014 inclusive. This review summarises traditional methods such as chromatographic and immunochemical techniques, as well as newer approaches such as biosensors and optical techniques, which are becoming more prevalent. A section on sampling and sample treatment has been prepared to highlight the importance of this step in the analytical methods. We close with a look at emerging technologies that will bring effective and rapid analysis out of the laboratory and into the field.

  14. Photochemical Degradation of the Anticancer Drug Bortezomib by V-UV/UV (185/254 nm) Investigated by ¹H NMR Fingerprinting: A Way to Follow Aromaticity Evolution.

    PubMed

    Martignac, Marion; Balayssac, Stéphane; Gilard, Véronique; Benoit-Marquié, Florence

    2015-06-18

    We have investigated the removal of bortezomib, an anticancer drug prescribed in multiple myeloma, using the photochemical advanced oxidation process of V-UV/UV (185/254 nm). We used two complementary analytical techniques to follow the removal rate of bortezomib. Nuclear magnetic resonance (NMR) is a nonselective method requiring no prior knowledge of the structures of the byproducts and permits us to provide a spectral signature (fingerprinting approach). This untargeted method provides clues to the molecular structure changes and information on the degradation of the parent drug during the irradiation process. This holistic NMR approach could provide information for monitoring aromaticity evolution. We used liquid chromatography coupled with high-resolution mass spectrometry (LC-MS) to correlate results obtained by ¹H NMR and for accurate identification of the byproducts, in order to understand the mechanistic degradation pathways of bortezomib. The results show that primary byproducts come from photoassisted deboronation of bortezomib at 254 nm. A secondary byproduct of pyrazinecarboxamide was also identified. We obtained a reliable correlation between these two analytical techniques.

  15. Analytical technique characterizes all trace contaminants in water

    NASA Technical Reports Server (NTRS)

    Foster, J. N.; Lysyj, I.; Nelson, K. H.

    1967-01-01

    A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to investigating the sources of water pollution.

  16. Portal scatter to primary dose ratio of 4 to 18 MV photon spectra incident on heterogeneous phantoms

    NASA Astrophysics Data System (ADS)

    Ozard, Siobhan R.

    Electronic portal imagers designed and used to verify the positioning of a cancer patient undergoing radiation treatment can also be employed to measure the in vivo dose received by the patient. This thesis investigates the ratio of the dose from patient-scattered particles to the dose from primary (unscattered) photons at the imaging plane, called the scatter to primary dose ratio (SPR). The composition of the SPR according to the origin of scatter is analyzed more thoroughly than in previous studies. A new analytical method for calculating the SPR is developed and experimentally verified for heterogeneous phantoms. A novel technique that applies the analytical SPR method for in vivo dosimetry with a portal imager is evaluated. Monte Carlo simulation was used to determine the imager dose from patient-generated electrons and photons that scatter one or more times within the object. The database of SPRs reported from this investigation is new, since the contribution from patient-generated electrons was neglected by previous Monte Carlo studies. The SPR from patient-generated electrons was found here to be as large as 0.03. The analytical SPR method relies on the established result that the scatter dose is uniform for an air gap between the patient and the imager that is greater than 50 cm. This method also applies the hypothesis that first-order Compton scatter only is sufficient for scatter estimation. A comparison of analytical and measured SPRs for neck, thorax, and pelvis phantoms showed that the maximum difference was within ±0.03, and the mean difference was less than ±0.01 for most cases. This accuracy was comparable to similar analytical approaches that are limited to homogeneous phantoms. The analytical SPR method could replace lookup tables of measured scatter doses that can require significant time to measure. In vivo doses were calculated by combining our analytical SPR method and the convolution/superposition algorithm. Our calculated in vivo doses agreed within ±3% with the doses measured in the phantom. The present in vivo method was faster compared to other techniques that use convolution/superposition. Our method is a feasible and satisfactory approach that contributes to on-line patient dose monitoring.
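
    In the notation implied by the abstract, the central quantity and the dose reconstruction it enables at the imaging plane are simply:

```latex
\mathrm{SPR} = \frac{D_{\mathrm{scatter}}}{D_{\mathrm{primary}}},
\qquad
D_{\mathrm{imager}} = D_{\mathrm{primary}}\left(1 + \mathrm{SPR}\right)
```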

  17. Flight Validation of Mars Mission Technologies

    NASA Technical Reports Server (NTRS)

    Eberspeaker, P. J.

    2000-01-01

    Effective exploration and characterization of Mars will require the deployment of numerous surface probes, tethered balloon stations, and free-flying balloon systems, as well as larger landers and orbiting satellite systems. Since launch opportunities exist approximately every two years, it is extremely critical that each and every mission maximize its potential for success. This will require significant testing of each system in an environment that simulates the actual operational environment as closely as possible. Analytical techniques and laboratory testing go a long way toward mitigating the inherent risks associated with space exploration; however, they fall short of accurately simulating the unpredictable operational environment in which these systems must function.

  18. Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.

    PubMed

    Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio

    2009-12-01

    Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.

  19. Bioimaging of metals in brain tissue by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and metallomics.

    PubMed

    Becker, J Sabine; Matusch, Andreas; Palm, Christoph; Salber, Dagmar; Morton, Kathryn A; Becker, J Susanne

    2010-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been developed and established as an emerging technique in the generation of quantitative images of metal distributions in thin tissue sections of brain samples (such as human, rat and mouse brain), with applications in research related to neurodegenerative disorders. A new analytical protocol is described which includes sample preparation by cryo-cutting of thin tissue sections and matrix-matched laboratory standards, mass spectrometric measurements, data acquisition, and quantitative analysis. Specific examples of the bioimaging of metal distributions in normal rodent brains are provided. Differences from normal were assessed in a Parkinson's disease and a stroke brain model. Furthermore, changes during normal aging were studied. Powerful analytical techniques are also required for the determination and characterization of metal-containing proteins within a large pool of proteins, e.g., after denaturing or non-denaturing electrophoretic separation of proteins in one-dimensional and two-dimensional gels. LA-ICP-MS can be employed to detect metalloproteins in protein bands or spots separated by gel electrophoresis. MALDI-MS can then be used to identify specific metal-containing proteins in these bands or spots. The combination of these techniques is described in the second section.

  20. Metabolomic Strategies Involving Mass Spectrometry Combined with Liquid and Gas Chromatography.

    PubMed

    Lopes, Aline Soriano; Cruz, Elisa Castañeda Santa; Sussulini, Alessandra; Klassen, Aline

    2017-01-01

    Amongst all omics sciences, there is no doubt that metabolomics is undergoing the most important growth in the last decade. The advances in analytical techniques and data analysis tools are the main factors that make possible the development and establishment of metabolomics as a significant research field in systems biology. As metabolomic analysis demands high sensitivity for detecting metabolites present in low concentrations in biological samples, high resolving power for identifying the metabolites, and a wide dynamic range to detect metabolites with variable concentrations in complex matrices, mass spectrometry is the most extensively used analytical technique for fulfilling these requirements. Mass spectrometry alone can be used in a metabolomic analysis; however, some issues, such as ion suppression, may hinder the quantification and identification of metabolites with lower concentrations or of some metabolite classes that do not ionise as well as others. The best choice is coupling separation techniques, such as gas or liquid chromatography, to mass spectrometry, in order to improve the sensitivity and resolving power of the analysis, besides obtaining extra information (retention time) that facilitates the identification of the metabolites, especially when considering untargeted metabolomic strategies. In this chapter, the main aspects of mass spectrometry (MS), liquid chromatography (LC) and gas chromatography (GC) are discussed, and recent clinical applications of LC-MS and GC-MS are also presented.

  1. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

    Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But many analytic techniques traditionally applied to AA data (techniques that average across people and time) do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance the inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions.
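
    As a minimal illustration of the first of the three techniques (multilevel modeling) on simulated AA-style data, using statsmodels; the variable names and effect sizes below are invented for the example, not drawn from the paper:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Minimal multilevel-model sketch for AA-style data: repeated "craving"
# reports nested within persons, predicted by momentary "stress".
rng = np.random.default_rng(1)
n_person, n_obs = 30, 40
person = np.repeat(np.arange(n_person), n_obs)
intercepts = rng.normal(0, 1, n_person)[person]      # person-level random intercepts
stress = rng.normal(0, 1, n_person * n_obs)
craving = 2.0 + intercepts + 0.5 * stress + rng.normal(0, 1, person.size)

df = pd.DataFrame({"person": person, "stress": stress, "craving": craving})
model = smf.mixedlm("craving ~ stress", df, groups=df["person"])
print(model.fit().summary())
```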

  2. Contemporary sample stacking in analytical electrophoresis.

    PubMed

    Malá, Zdena; Gebauer, Petr; Boček, Petr

    2011-01-01

    Sample stacking is of vital importance for analytical CE, since it can provide the required sensitivity of analyses. Many relevant new papers are published every year, and regular surveys are very helpful for experts and practitioners. The contribution presented here is a continuation of a series of regularly published reviews on the topic and covers the last two years. It surveys the related literature, organized according to the main principle used in each published procedure, into the following mainstream sections: Kohlrausch adjustment of concentrations, pH step, micellar systems, and combined techniques. Each part covers literature sorted according to the field of application, e.g. clinical, pharmaceutical, food, environmental, etc.

  3. Base catalytic transesterification of vegetable oil.

    PubMed

    Mainali, Kalidas

    2012-01-01

    Sustainable economic and industrial growth requires safe, sustainable resources of energy. Biofuel is becoming increasingly important as an alternative fuel for the diesel engine. The use of non-edible vegetable oils for biofuel production is significant because of the increasing demand for edible oils as food. With the recent debate of food versus fuel, some non-edible oils like soapnut and Jatropha (Jatropha curcas L.) are being investigated as possible sources of biofuel. Recent research has focused on the application of heterogeneous catalysis. This review considers catalytic transesterification and the possibility of heterogeneous base catalysts. The process of transesterification and the effects of parameters, mechanism, and kinetics are reviewed. Although chromatographic techniques (GC and HPLC) are the analytical methods most often used for biofuel characterization, other techniques and some improvements to analytical methods are also discussed.

  4. Common aspects influencing the translocation of SERS to Biomedicine.

    PubMed

    Gil, Pilar Rivera; Tsouts, Dionysia; Sanles-Sobrido, Marcos; Cabo, Andreu

    2018-01-04

    In this review, we introduce the reader to the analytical technique of surface-enhanced Raman scattering (SERS), motivated by the great potential we believe this technique has in biomedicine. We present the advantages and limitations of this technique relevant for bioanalysis in vitro and in vivo, and how it goes beyond the state of the art of traditional analytical, labelling, and healthcare diagnosis technologies.

  5. Metabolic Analysis

    NASA Astrophysics Data System (ADS)

    Tolstikov, Vladimir V.

    Analysis of the metabolome with coverage of all of the possibly detectable components in the sample, rather than analysis of each individual metabolite at a given time, can be accomplished by metabolic analysis. Targeted and/or nontargeted approaches are applied as needed for particular experiments. Monitoring hundreds or more metabolites at a given time requires high-throughput and high-end techniques that enable screening for relative changes in, rather than absolute concentrations of, compounds within a wide dynamic range. Most of the analytical techniques useful for these purposes use GC or HPLC/UPLC separation modules coupled to a fast and accurate mass spectrometer. GC separations require chemical modification (derivatization) before analysis and work efficiently for small molecules. HPLC separations are better suited for the analysis of labile and nonvolatile polar and nonpolar compounds in their native form. Direct infusion and NMR-based techniques are mostly used for fingerprinting and snap phenotyping, where applicable. Discovery and validation of metabolic biomarkers are exciting and promising opportunities offered by metabolic analysis applied to biological and biomedical experiments. We have demonstrated that the GC-TOF-MS, HPLC/UPLC-RP-MS, and HILIC-LC-MS techniques used for metabolic analysis offer sufficient metabolome mapping, providing researchers with reliable data for subsequent multivariate analysis and data mining.

  6. Analytical impact time and angle guidance via time-varying sliding mode technique.

    PubMed

    Zhao, Yao; Sheng, Yongzhi; Liu, Xiangdong

    2016-05-01

    To provide a concrete, feasible solution for homing missiles with precise impact time and angle, this paper develops a novel guidance law based on the nonlinear engagement dynamics. The guidance law is first designed with the prior assumption of a stationary target, followed by the practical extension to a moving-target scenario. The time-varying sliding mode (TVSM) technique is applied to fulfill the terminal constraints, in which a specific TVSM surface is constructed with two unknown coefficients. One is tuned to meet the impact time requirement and the other is targeted with a global sliding mode, so that the impact angle constraint as well as the zero miss distance can be satisfied. Because the proposed law possesses three guidance gains as design parameters, the intercept trajectory can be shaped according to the operational conditions and the missile's capability. To improve the tolerance of initial heading errors and broaden the application, a new frame of reference is also introduced. Furthermore, the analytical solutions of the flight trajectory, heading angle, and acceleration command can be expressed in closed form, for prediction and offline parameter selection, by solving a first-order linear differential equation. Numerical simulation results for various scenarios validate the effectiveness of the proposed guidance law and demonstrate the accuracy of the analytic solutions.
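
    The paper's specific surface is not reproduced in the abstract. Purely as an illustrative form of the kind of time-varying sliding surface referred to (e the constrained error, c(t) a tunable time-varying coefficient; this generic form is our assumption, not the authors' construction):

```latex
s(t) = \dot{e}(t) + c(t)\,e(t), \qquad c(t) = c_0 + c_1 t
```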

  7. 40 CFR 161.180 - Enforcement analytical method.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Title 40 (Protection of Environment), Data Requirements for Registration of Antimicrobial Pesticides, Product Chemistry Data Requirements, § 161.180 Enforcement analytical method: An analytical method suitable for enforcement purposes must be...

  8. Control system design for flexible structures using data models

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis; Frazier, W. Garth; Mitchell, Jerrel R.; Medina, Enrique A.; Bukley, Angelia P.

    1993-01-01

    The dynamics and control of flexible aerospace structures exercises many of the engineering disciplines. In recent years there has been considerable research in developing and tailoring control system design techniques for these structures. This problem involves designing a control system for a multi-input, multi-output (MIMO) system that satisfies various performance criteria, such as vibration suppression, disturbance and noise rejection, attitude control, and slewing control. Considerable progress has been made and demonstrated in control system design techniques for these structures. The key to designing control systems for these structures that meet stringent performance requirements is an accurate model. It has become apparent that theoretically derived and finite-element-generated models do not provide the needed accuracy; almost all successful demonstrations of control system design techniques have involved using test results for fine-tuning a model or for extracting a model using system identification techniques. This paper describes past and ongoing efforts at Ohio University and NASA MSFC to design controllers using 'data models.' The basic philosophy of this approach is to start with a stabilizing controller and frequency response data that describe the plant; then, iteratively vary the free parameters of the controller so that performance measures come closer to satisfying design specifications. The frequency response data can be either experimentally or analytically derived. One 'design-with-data' algorithm presented in this paper is called the Compensator Improvement Program (CIP). The current CIP designs controllers for MIMO systems so that classical gain, phase, and attenuation margins are achieved. The centerpiece of the CIP algorithm is the constraint improvement technique, which is used to calculate a parameter change vector that guarantees an improvement in all unsatisfied, feasible performance metrics from iteration to iteration. The paper also presents a recently demonstrated CIP-type algorithm, called the Model and Data Oriented Computer-Aided Design System (MADCADS), developed for achieving H(sub infinity) type design specifications using data models. Control system designs for the NASA/MSFC Single Structure Control Facility are demonstrated for both CIP and MADCADS. Advantages of design-with-data algorithms over techniques that require analytical plant models are also presented.
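
    The CIP constraint-improvement step itself is not specified in the abstract. As a toy stand-in for the design-with-data idea only (tune controller parameters directly against measured frequency-response data, improving an unsatisfied metric while honoring a robustness floor; the plant FRF and gains below are synthetic):

```python
import numpy as np

# Toy "design-with-data" sketch (not the CIP algorithm): tune PI gains against
# frequency-response DATA for the plant. The FRF here is synthesized; in
# practice it would come from measurements.
w = np.logspace(-1, 2, 400)
P = 1.0 / ((1j * w + 1.0) * (0.1j * w + 1.0))     # stand-in measured FRF

def margin_proxy(kp, ki):
    """Minimum distance of the open-loop locus to the -1 point."""
    L = (kp + ki / (1j * w)) * P
    return np.min(np.abs(1.0 + L))

# Improve performance (integral gain) subject to a margin constraint, in the
# spirit of iterating until unsatisfied metrics are met. Assumes the margin
# shrinks monotonically as ki grows, which holds for this simple plant.
MARGIN_FLOOR = 0.5
kp = 1.0
ki_lo, ki_hi = 0.0, 100.0
for _ in range(50):                                # bisection on feasible ki
    ki = 0.5 * (ki_lo + ki_hi)
    if margin_proxy(kp, ki) >= MARGIN_FLOOR:
        ki_lo = ki
    else:
        ki_hi = ki

print("tuned gains: kp=%.2f, ki=%.3f, margin=%.3f"
      % (kp, ki_lo, margin_proxy(kp, ki_lo)))
```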

  9. Light aircraft crash safety program

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Hayduk, R. J.

    1974-01-01

    NASA has embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.

  10. Remote software upload techniques in future vehicles and their performance analysis

    NASA Astrophysics Data System (ADS)

    Hossain, Irina

    Updating software in vehicle Electronic Control Units (ECUs) will become a mandatory requirement for a variety of reasons: to update or fix the functionality of an existing system, to add new functionality, to remove software bugs, and to keep pace with ITS infrastructure. Software modules of advanced vehicles can be updated using the Remote Software Upload (RSU) technique. RSU employs an infrastructure-based wireless communication technique in which the software supplier sends the software to the targeted vehicle via a roadside Base Station (BS). However, security is critically important in RSU to avoid disasters due to vehicle malfunction and to protect proprietary algorithms from hackers, competitors, or people with malicious intent. In this thesis, a mechanism for secure software upload in advanced vehicles is presented that employs mutual authentication of the software provider and the vehicle using a pre-shared authentication key before sending the software. The software packets are sent encrypted with a secret key along with a Message Digest (MD). To increase the security level, it is proposed that the vehicle receive more than one copy of the software, each accompanied by its MD; the vehicle installs the new software only when it has received more than one identical copy. To validate the proposition, analytical expressions for the average number of packet transmissions required for a successful software update are derived, and different cases are investigated depending on the vehicle's buffer size and the verification method. The analytical and simulation results show that sending two copies of the software to the vehicle is sufficient to thwart a security attack during upload. The unicast method above is suitable when software needs to be uploaded to a single vehicle. Since multicasting is the most efficient method of group communication, updating software in the ECUs of a large number of vehicles could benefit from it. However, as with unicast RSU, meeting the security requirements of multicast communication, i.e., authenticity, confidentiality, and integrity of the transmitted software and access control of the group members, is challenging. In this thesis, infrastructure-based mobile multicasting for RSU in vehicle ECUs is proposed, in which an ECU receives the software from a remote software distribution center using the roadside BSs as gateways. The Vehicular Software Distribution Network (VSDN) is divided into small regions, each administered by a Regional Group Manager (RGM). Two multicast Group Key Management (GKM) techniques are proposed based on the degree of trust in the BSs, named the Fully-trusted (FT) and Semi-trusted (ST) systems. Analytical models are developed to find the multicast session establishment latency and handover latency for these two protocols. The average latency to perform mutual authentication of the software vendor and a vehicle and to distribute the multicast session key during session initialization, and the handoff latency during a multicast session, are calculated. Analytical and simulation results show that the link establishment latency per vehicle of the proposed schemes is on the order of a few seconds, with the ST system requiring a few milliseconds more than the FT system. The handoff latency is also on the order of a few seconds, and in some cases the ST system requires less handoff time than the FT system. Thus, it is possible to build an efficient GKM protocol without placing excessive trust in the BSs.
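
    A minimal Python sketch of the two-copy acceptance rule described above, assuming a pre-shared key; the key, packet layout, and function names are illustrative assumptions, and payload encryption is omitted for brevity.

        import hashlib, hmac

        KEY = b"pre-shared-authentication-key"  # assumed pre-shared secret

        def make_packet(software: bytes) -> dict:
            # Vendor side: attach a keyed Message Digest (MD) to each copy.
            md = hmac.new(KEY, software, hashlib.sha256).hexdigest()
            return {"payload": software, "md": md}

        def vehicle_accepts(copies: list) -> bool:
            # Vehicle side: verify the MD of every received copy, then require
            # at least two identical verified copies before installing.
            verified = [c["payload"] for c in copies
                        if hmac.compare_digest(
                            hmac.new(KEY, c["payload"], hashlib.sha256).hexdigest(),
                            c["md"])]
            return any(verified.count(p) >= 2 for p in set(verified))

        image = b"ECU firmware v2.1"
        print(vehicle_accepts([make_packet(image), make_packet(image)]))  # True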

  11. Analytical Chemistry Laboratory Progress Report for FY 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.

  12. Extended sources near-field processing of experimental aperture synthesis data and application of the Gerchberg method for enhancing radiometric three-dimensional millimetre-wave images in security screening portals

    NASA Astrophysics Data System (ADS)

    Salmon, Neil A.

    2017-10-01

    Aperture synthesis for passive millimetre wave imaging provides a means to screen people for concealed threats in the extreme near-field configuration of a portal, a regime where the imager-to-subject distance is of the order of both the required depth of field and the field of view. Due to optical aberrations, focal plane array imagers cannot deliver the large depths of field and fields of view required in this regime. Active sensors, on the other hand, can deliver these but face challenges of illumination, speckle, and multi-path effects when imaging canyon regions of the body. Fortunately, an aperture synthesis passive millimetre wave imaging system can deliver large depths of field and fields of view while exhibiting no speckle effects, as the radiometric emission from the human body is spatially incoherent. Furthermore, since in portal security screening scenarios the aperture synthesis imaging technique delivers half-wavelength spatial resolution, it can effectively screen the whole of the human body. Some recent measurements are presented that demonstrate the three-dimensional imaging capability of extended sources using a 22 GHz aperture synthesis system. A comparison is made between imagery generated via the analytic Fourier transform and a gridding fast Fourier transform method; the analytic Fourier transform enables aliasing in the imagery to be identified more clearly. Some initial results are also presented of how the Gerchberg technique, an image enhancement algorithm used in radio astronomy, is adapted for three-dimensional imaging in security screening. This technique is shown to improve the quality of imagery without adding extra receivers to the imager. The requirements of a walk-through security screening system for use at entrances to airport departure lounges are discussed, concluding that these can be met by an aperture synthesis imager.
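
    As a sketch of how such an iteration proceeds, the following one-dimensional Python fragment alternates between re-imposing the measured spectral samples and truncating the image to its known support; it is a generic Gerchberg loop under assumed array shapes, not the paper's three-dimensional implementation.

        import numpy as np

        def gerchberg(measured_spec, mask, support, n_iter=50):
            # measured_spec: spectrum, valid where mask is True (measured cells)
            # support: boolean image-domain support of the extended source
            spec = np.where(mask, measured_spec, 0)
            for _ in range(n_iter):
                img = np.fft.ifft(spec)
                img = np.where(support, img, 0)    # image-domain constraint
                spec = np.fft.fft(img)
                spec[mask] = measured_spec[mask]   # re-impose measured data
            return np.fft.ifft(spec).real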

  13. Sensor Data Qualification System (SDQS) Implementation Study

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Melcher, Kevin; Fulton, Christopher; Maul, William

    2009-01-01

    The Sensor Data Qualification System (SDQS) is being developed to provide a sensor fault detection capability for NASA's next-generation launch vehicles. In addition to traditional data qualification techniques (such as limit checks, rate-of-change checks, and hardware redundancy checks), SDQS can provide augmented capability through additional techniques that exploit analytical redundancy relationships to enable faster and more sensitive sensor fault detection. This paper documents the results of a study conducted to determine the best approach for implementing an SDQS network configuration that spans multiple subsystems, similar to those that may be implemented on future vehicles. The best approach is defined as the one that minimizes computational resource requirements without degrading the detection of sensor failures.
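
    For illustration, a hedged Python sketch of the two simplest traditional checks named above; the thresholds and sample values are assumed, and the analytical-redundancy techniques are not shown.

        def qualify(samples, lo, hi, max_rate, dt):
            # Flag a sample if it violates the limit check or the
            # rate-of-change check against the previous sample.
            flags = []
            for i, x in enumerate(samples):
                bad = not (lo <= x <= hi)                        # limit check
                if i > 0 and abs(x - samples[i - 1]) / dt > max_rate:
                    bad = True                                   # rate check
                flags.append(bad)
            return flags

        print(qualify([0.1, 0.2, 5.0, 0.3], lo=0.0, hi=1.0, max_rate=2.0, dt=0.1))
        # [False, False, True, True]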

  14. Virtual Screening of Receptor Sites for Molecularly Imprinted Polymers.

    PubMed

    Bates, Ferdia; Cela-Pérez, María Concepción; Karim, Kal; Piletsky, Sergey; López-Vilariño, José Manuel

    2016-08-01

    Molecularly Imprinted Polymers (MIPs) are highly advantageous in the field of analytical chemistry. However, interference from secondary molecules can impede capture of a target by a MIP receptor. This greatly complicates the design process and often requires extensive laboratory screening, which is time consuming, costly, and creates substantial waste. Herein is presented a new technique for screening "virtually imprinted receptors" for rebinding of the molecular template as well as secondary structures, correlating the virtual predictions with experimentally acquired data in three case studies. This novel technique is particularly applicable to the evaluation and prediction of MIP receptor specificity and efficiency in complex aqueous systems. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Superhydrophobic Analyte Concentration Utilizing Colloid-Pillar Array SERS Substrates

    DOE PAGES

    Wallace, Ryan A.; Charlton, Jennifer J.; Kirchner, Teresa B.; ...

    2014-11-04

    Detecting trace components in medicinal and environmental samples requires sensing only a few molecules present in a large sample. Surface enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers; however, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The present work involves superhydrophobic surfaces that include silicon pillar arrays formed by lithographic and dewetting protocols. To generate the necessary plasmonic substrate for SERS detection, a simple and flow-stable Ag colloid was added to the functionalized pillar array system via soaking. The pillars are used both native and with hydrophobic modification, and they provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A 100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10⁻¹² M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach to demonstrate the ability to control droplet size and placement for scaled-up, real-world applications. Finally, a concentration process involving transport and sequestration based on surface-treatment-selective wicking is demonstrated.

  16. A data mining system for providing analytical information on brain tumors to public health decision makers.

    PubMed

    Santos, R S; Malheiros, S M F; Cavalheiro, S; de Oliveira, J M Parente

    2013-03-01

    Cancer is the leading cause of death in economically developed countries and the second leading cause of death in developing countries. Malignant brain neoplasms are among the most devastating and incurable forms of cancer, and their treatment may be excessively complex and costly. Public health decision makers require significant amounts of analytical information to manage public treatment programs for these patients. Data mining, a technology that is used to produce analytically useful information, has been employed successfully with medical data. However, the large-scale adoption of this technique has been limited thus far because it is difficult to use, especially for non-expert users. One way to facilitate data mining by non-expert users is to automate the process. Our aim is to present an automated data mining system that allows public health decision makers to access analytical information regarding brain tumors. The emphasis in this study is the use of ontology in an automated data mining process. The non-experts who tried the system obtained useful information about the treatment of brain tumors. These results suggest that future work should be conducted in this area. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  17. Superhydrophobic Analyte Concentration Utilizing Colloid-Pillar Array SERS Substrates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wallace, Ryan A.; Charlton, Jennifer J.; Kirchner, Teresa B.

    Detecting trace components in medicinal and environmental samples requires sensing only a few molecules present in a large sample. Surface enhanced Raman spectroscopy (SERS) is a technique that can be utilized to detect molecules at very low absolute numbers; however, detection at trace concentration levels in real samples requires properly designed delivery and detection systems. The present work involves superhydrophobic surfaces that include silicon pillar arrays formed by lithographic and dewetting protocols. To generate the necessary plasmonic substrate for SERS detection, a simple and flow-stable Ag colloid was added to the functionalized pillar array system via soaking. The pillars are used both native and with hydrophobic modification, and they provide a means to concentrate analyte via superhydrophobic droplet evaporation effects. A 100-fold concentration of analyte was estimated, with a limit of detection of 2.9 × 10⁻¹² M for mitoxantrone dihydrochloride. Additionally, analytes were delivered to the surface via a multiplex approach to demonstrate the ability to control droplet size and placement for scaled-up, real-world applications. Finally, a concentration process involving transport and sequestration based on surface-treatment-selective wicking is demonstrated.

  18. Adaptive steganography

    NASA Astrophysics Data System (ADS)

    Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.

    2002-04-01

    Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound for the steganographic capacity of LSB based steganographic techniques for a given false probability of detection. In this paper we look at adaptive steganographic techniques. Adaptive steganographic techniques take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
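
    A toy Python sketch of the LSB embedding that such steganalysis targets; the pixel values and message bits are illustrative, and the content-adaptive placement strategies studied in the paper are omitted.

        def embed_lsb(pixels, bits):
            # Overwrite the least significant bit of each carrier pixel.
            out = list(pixels)
            for i, b in enumerate(bits):
                out[i] = (out[i] & ~1) | b
            return out

        def extract_lsb(pixels, n):
            return [p & 1 for p in pixels[:n]]

        stego = embed_lsb([200, 13, 77, 42], [1, 0, 1, 1])
        print(extract_lsb(stego, 4))  # [1, 0, 1, 1]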

  19. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...

  20. Comparison of ICP-OES and MP-AES in determining soil nutrients by the Mehlich 3 method

    NASA Astrophysics Data System (ADS)

    Tonutare, Tonu; Penu, Priit; Krebstein, Kadri; Rodima, Ako; Kolli, Raimo; Shanskiy, Merrit

    2014-05-01

    Accurate, routine testing of nutrients in soil samples is critical to understanding potential soil fertility. Different factors must be taken into account when selecting the best analytical technique for soil laboratory analysis. Several techniques can provide an adequate detection range for the same analyte; in such cases the choice of technique will depend on factors such as sample throughput, required infrastructure, ease of use, chemicals consumed, need for a gas supply, and operating costs. The Mehlich 3 extraction method is widely used for the determination of plant-available nutrient element contents in agricultural soils. Depending on the laboratory, ICP and AAS techniques are used to determine Ca, K, and Mg in the soil extract, with flame photometry used for K in some laboratories; extracted P is determined by ICP or Vis spectrometry. The excellent sensitivity and wide working range for all extracted elements make ICP a nearly ideal method, provided the sample throughput is large enough to justify the initial capital outlay. Another advantage of ICP techniques is their multiplex character (simultaneous acquisition of all wavelengths). Depending on the element, the detection limits are in the range 0.1-1000 μg/L. For smaller laboratories with low sample throughput requirements, the use of AAS is more common. Flame AAS is a fast, relatively cheap, and easy technique for elemental analysis; its disadvantages are single-element operation and the use of a flammable gas such as C2H2 and, for some elements, N2O as the oxidant. Detection limits of elements for AAS lie between 1 and 1000 μg/L. MP-AES offers an alternative to both AAS and ICP-OES in its detection power and speed of analysis. MP-AES is a relatively new, simple, and inexpensive multielemental technique that uses a self-sustained atmospheric-pressure microwave plasma (MP) running on nitrogen from a nitrogen generator; it therefore needs no argon or flammable (C2H2) gases, avoids cylinder handling, and has low running costs. Detection limits of elements for MP-AES lie between those of AAS and ICP. The objective of this study was to compare the results of soil analysis using two multielemental analytical methods, ICP-OES and MP-AES. In the experiment, different soil types with various textures, organic matter contents, and pH were used; soil samples of Albeluvisols, Leptosols, Cambisols, Regosols, and Histosols were studied. The plant-available nutrients were estimated by Mehlich 3 extraction. The ICP-OES analyses were performed at the Estonian Agricultural Research Centre and the MP-AES analyses in the Department of Soil Science and Agrochemistry at the Estonian University of Life Sciences. The detection limits and limits of quantification of Ca, K, Mg, and P in the extracts are calculated and reported.

  1. Developing Formal Correctness Properties from Natural Language Requirements

    NASA Technical Reports Server (NTRS)

    Nikora, Allen P.

    2006-01-01

    This viewgraph presentation reviews the rationale of a program to transform natural language specifications into formal notation, specifically, to automate generation of Linear Temporal Logic (LTL) correctness properties from natural language temporal specifications. There are several reasons for this approach: (1) model-based techniques are becoming more widely accepted; (2) analytical verification techniques (e.g., model checking, theorem proving) are significantly more effective at detecting certain types of specification and design errors (e.g., race conditions, deadlock) than manual inspection; (3) many requirements are still written in natural language, which results in a high learning curve for specification languages and associated tools, while increased schedule and budget pressure on projects reduces training opportunities for engineers; and (4) formulation of correctness properties for system models can be a difficult problem. This has relevance to NASA in that it would simplify development of formal correctness properties, lead to more widespread use of model-based specification and design techniques, assist in earlier identification of defects, and reduce residual defect content for space mission software systems. The presentation also discusses potential applications, accomplishments and/or technology transfer potential, and next steps.
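
    As a hedged illustration of the intended mapping (the example is ours, not taken from the presentation), the natural-language requirement "whenever a command is received, an acknowledgment shall eventually be sent" corresponds to the LTL property

        \mathbf{G}\,(\mathit{cmd\_received} \rightarrow \mathbf{F}\,\mathit{ack\_sent})

    where G is the "globally" operator and F is "eventually".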

  2. Real time en face Fourier-domain optical coherence tomography with direct hardware frequency demodulation

    PubMed Central

    Biedermann, Benjamin R.; Wieser, Wolfgang; Eigenwillig, Christoph M.; Palte, Gesa; Adler, Desmond C.; Srinivasan, Vivek J.; Fujimoto, James G.; Huber, Robert

    2009-01-01

    We demonstrate en face swept source optical coherence tomography (ss-OCT) without requiring a Fourier transformation step. The electronic optical coherence tomography (OCT) interference signal from a k-space linear Fourier domain mode-locked laser is mixed with an adjustable local oscillator, yielding the analytic reflectance signal from one image depth for each frequency sweep of the laser. Furthermore, a method for arbitrarily shaping the spectral intensity profile of the laser is presented, without requiring the step of numerical apodization. In combination, these two techniques enable sampling of the in-phase and quadrature signal with a slow analog-to-digital converter and allow for real-time display of en face projections even for highest axial scan rates. Image data generated with this technique is compared to en face images extracted from a three-dimensional OCT data set. This technique can allow for real-time visualization of arbitrarily oriented en face planes for the purpose of alignment, registration, or operator-guided survey scans while simultaneously maintaining the full capability of high-speed volumetric ss-OCT functionality. PMID:18978919
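
    A numerical analogue of the hardware demodulation, sketched in Python under assumed parameters: mixing the fringe signal with a complex local oscillator at the depth-encoding frequency and low-pass filtering recovers the reflectance at that single depth.

        import numpy as np

        fs, f_lo = 1e6, 50e3                 # assumed sample rate and LO frequency
        t = np.arange(4096) / fs
        fringe = 0.8 * np.cos(2 * np.pi * f_lo * t + 0.3)  # reflector at one depth
        iq = fringe * np.exp(-2j * np.pi * f_lo * t)       # mix with complex LO
        amp = 2 * abs(np.mean(iq))           # crude low-pass filter: averaging
        print(amp)                           # ~0.8, the reflectance amplitude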

  3. Beam position monitor engineering

    NASA Astrophysics Data System (ADS)

    Smith, Stephen R.

    1997-01-01

    The design of beam position monitors often involves challenging system design choices. Position transducers must be robust, accurate, and generate adequate position signal without unduly disturbing the beam. Electronics must be reliable and affordable, usually while meeting tough requirements on precision, accuracy, and dynamic range. These requirements may be difficult to achieve simultaneously, leading the designer into interesting opportunities for optimization or compromise. Some useful techniques and tools are shown. Both finite element analysis and analytic techniques will be used to investigate quasi-static aspects of electromagnetic fields such as the impedance of and the coupling of beam to striplines or buttons. Finite-element tools will be used to understand dynamic aspects of the electromagnetic fields of beams, such as wake fields and transmission-line and cavity effects in vacuum-to-air feedthroughs. Mathematical modeling of electrical signals through a processing chain will be demonstrated, in particular to illuminate areas where neither a pure time-domain nor a pure frequency-domain analysis is obviously advantageous. Emphasis will be on calculational techniques, in particular on using both time domain and frequency domain approaches to the applicable parts of interesting problems.

  4. Resonance Ionization, Mass Spectrometry.

    ERIC Educational Resources Information Center

    Young, J. P.; And Others

    1989-01-01

    Discussed is an analytical technique that uses photons from lasers to resonantly excite an electron from some initial state of a gaseous atom through various excited states of the atom or molecule. Described are the apparatus, some analytical applications, and the precision and accuracy of the technique. Lists 26 references. (CW)

  5. Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods

    ERIC Educational Resources Information Center

    Zhang, Ying

    2011-01-01

    Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…

  6. Turbine blade tip durability analysis

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.

    1981-01-01

    An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle, and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed, and the proposed life-prediction theories are evaluated.

  7. Techniques for hot structures testing

    NASA Technical Reports Server (NTRS)

    Deangelis, V. Michael; Fields, Roger A.

    1990-01-01

    Hot structures testing has been going on since the early 1960s, beginning with the Mach 6 X-15 airplane. Early hot structures test programs at NASA-Ames-Dryden focused on operational testing required to support the X-15 flight test program, and early hot structures research projects focused on developing laboratory test techniques to simulate flight thermal profiles. More recent efforts involved numerous large and small hot structures test programs that served to develop test methods and measurement techniques providing data that promoted the correlation of test results with analytical codes. In November 1988 a workshop was sponsored that focused on the correlation of hot structures test data with analysis. Limited material is drawn from the workshop, and more formal documentation is provided of topics concerning hot structures test techniques used at NASA-Ames-Dryden. Topics covered include data acquisition and test control, the quartz lamp heater systems, current strain and temperature sensors, and hot structures test techniques used to simulate the flight thermal environment in the laboratory.

  8. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample so that near real-time analysis (a few minutes) is possible. This technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  9. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  10. Tactical Decision Making: A Proposed Evaluation Criteria Model for the Infantry Battalion’s Tactical Estimate during Offensive Operations

    DTIC Science & Technology

    1993-06-04

    In a paper entitled "Understanding and Developing Combat Power" by Colonel Huba Wass de Czege, a method identifying analytical techniques for... reiterates several important doctrinal and theoretical requirements for the development of an optimal evaluation criteria model. Although... "Méthode de Raisonnement Tactique" (The Tactical Reasoning Method) is a version of concurrent COA analysis under conditions of uncertainty.

  11. Problem Definition Study on Techniques and Methodologies for Evaluating the Chemical and Toxicological Properties of Combustion Products of Gun Systems. Volume 1.

    DTIC Science & Technology

    1988-03-01

    methods that can resolve the various compounds are required. This chapter specifically focuses on the analytical and sampling methodology used to determine...

  12. Rotative balance of the I.M.F. Lille and associated experimental techniques

    NASA Technical Reports Server (NTRS)

    Verbrugge, R.

    1981-01-01

    The study of aerodynamic effects at high incidence associated with motions of wide amplitude incorporating continuous rotations requires the consideration of coupled effects, which are generally nonlinear, in the formulation of equations of motion. A rotative balance designed to simulate such maneuvers in a wind tunnel was created to form a test medium for analytical studies. A general description of the assembly is provided by considering two main ranges of application. The capacities and performance of the assembly are discussed.

  13. A Posteriori Quantification of Rate-Controlling Effects from High-Intensity Turbulence-Flame Interactions Using 4D Measurements

    DTIC Science & Technology

    2016-11-22

    ...compact at all conditions tested, as indicated by the overlap of OH and CH2O distributions. We developed analytical techniques for pseudo-Lagrangian... condition in a constant-density flow requires that the flow divergence be zero, ∇·u = 0. Three smoothing schemes were examined, including a moving average (i.e...

  14. Quantitative Determination of Caffeine in Beverages Using a Combined SPME-GC/MS Method

    NASA Astrophysics Data System (ADS)

    Pawliszyn, Janusz; Yang, Min J.; Orton, Maureen L.

    1997-09-01

    Solid-phase microextraction (SPME) combined with gas chromatography/mass spectrometry (GC/MS) has been applied to the analysis of various caffeinated beverages. Unlike the current methods, this technique is solvent free and requires no pH adjustments. The simplicity of the SPME-GC/MS method lends itself to a good undergraduate laboratory practice. This publication describes the analytical conditions and presents the data for determination of caffeine in coffee, tea, and coke. Quantitation by isotopic dilution is also illustrated.
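
    For reference, quantitation by isotopic dilution rests on comparing the analyte peak area with that of a spiked, isotopically labeled internal standard; in its simplest form (assuming equal response factors, a simplification of ours rather than a statement from the paper),

        C_{analyte} = C_{standard} \cdot \frac{A_{analyte}}{A_{standard}}

    where A denotes the integrated ion abundance of each species in the mass spectrum.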

  15. Bioorganic chemistry and the emergence of the first cell

    NASA Technical Reports Server (NTRS)

    Fox, S. W.

    1977-01-01

    It is suggested that the best way to study the evolution of cells from primordial compounds is to attempt to assemble a protocell, i.e., a primordial cell. Simulation of processes that occurred in archaic times would require inductive reasoning and constructionist techniques rather than the analytic approach in which cell components are separated and studied in isolation. Advantages to primordial life which would result from protocell formation are surveyed, and the proteinoid microsphere, a model of the protocell, is discussed. A photoreactive proteinoid is considered.

  16. MODELING MICROBUBBLE DYNAMICS IN BIOMEDICAL APPLICATIONS*

    PubMed Central

    CHAHINE, Georges L.; HSIAO, Chao-Tsung

    2012-01-01

    Controlling microbubble dynamics to produce desirable biomedical outcomes when and where necessary, and to avoid deleterious effects, requires advanced knowledge, which can be achieved only through a combination of experimental and numerical/analytical techniques. The present communication presents a multi-physics approach to studying the dynamics, combining viscous-inviscid effects, liquid and structure dynamics, and multi-bubble interaction. While complex numerical tools are developed and used, the study aims at identifying the key parameters influencing the dynamics, which need to be included in simpler models. PMID:22833696

  17. Properties of finite difference models of non-linear conservative oscillators

    NASA Technical Reports Server (NTRS)

    Mickens, R. E.

    1988-01-01

    Finite-difference (FD) approaches to the numerical solution of the differential equations describing the motion of a nonlinear conservative oscillator are investigated analytically. A generalized formulation of the Duffing and modified Duffing equations is derived and analyzed using several FD techniques, and it is concluded that, although it is always possible to construct FD models of conservative oscillators which are themselves conservative, caution is required to avoid numerical solutions which do not accurately reflect the properties of the original equation.
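
    As one concrete instance of the class of models analyzed (our illustration, assuming the undamped Duffing form), the oscillator \ddot{x} + x + \epsilon x^{3} = 0 has the standard central-difference model

        \frac{x_{k+1} - 2x_k + x_{k-1}}{h^2} + x_k + \epsilon x_k^3 = 0,

    and the caution raised above amounts to asking whether the discrete solution preserves an analogue of the conserved energy E = \tfrac{1}{2}\dot{x}^2 + \tfrac{1}{2}x^2 + \tfrac{1}{4}\epsilon x^4 of the continuous equation.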

  18. WIPP waste characterization program sampling and analysis guidance manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    The Waste Isolation Pilot Plant (WIPP) Waste Characterization Program Sampling and Analysis Guidance Manual (Guidance Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Quality Assurance Program Plan (QAPP) for the WIPP Experimental-Waste Characterization Program (the Program). This Guidance Manual includes all of the sampling and testing methodologies accepted by the WIPP Project Office (DOE/WPO) for use in implementing the Program requirements specified in the QAPP. This includes methods for characterizing representative samples of transuranic (TRU) wastes at DOE generator sites with respect to the gas generation controlling variables defined in the WIPP bin-scale and alcove test plans, as well as waste container headspace gas sampling and analytical procedures to support waste characterization requirements under the WIPP test program and the Resource Conservation and Recovery Act (RCRA). The procedures in this Guidance Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site specific procedures. The use of these procedures is intended to provide the necessary sensitivity, specificity, precision, and comparability of analyses and test results. The solutions to achieving specific program objectives will depend upon facility constraints, compliance with DOE Orders and DOE facilities' operating contractor requirements, and the knowledge and experience of the TRU waste handlers and analysts. With some analytical methods, such as gas chromatography/mass spectrometry, the Guidance Manual procedures may be used directly. With other methods, such as nondestructive/destructive characterization, the Guidance Manual provides guidance rather than a step-by-step procedure.

  19. An analytical and experimental evaluation of a Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. A.; Cosby, R. M.

    1976-01-01

    Line-focusing Fresnel lenses with application potential in the 200 to 370 C range were evaluated analytically and experimentally. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens, with a Sun-tracking heliostat providing a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile-spreading problem and should enable improved analytical-experimental correlation.

  20. Implementing a Matrix-free Analytical Jacobian to Handle Nonlinearities in Models of 3D Lithospheric Deformation

    NASA Astrophysics Data System (ADS)

    Kaus, B.; Popov, A.

    2015-12-01

    The analytical expression for the Jacobian is a key component in achieving fast and robust convergence of the nonlinear Newton-Raphson iterative solver. Accomplishing this task in practice often requires a significant algebraic effort, so it is quite common to use a cheap alternative instead, for example by approximating the Jacobian with a finite difference estimation. Despite its simplicity, this is a relatively fragile and unreliable technique that is sensitive to the scaling of the residual and unknowns, as well as to the choice of perturbation parameter; unfortunately, no universal rule provides both a robust scaling and a robust perturbation. The approach we use here is to derive the analytical Jacobian for the coupled set of momentum, mass, and energy conservation equations together with the elasto-visco-plastic rheology and a marker-in-cell/staggered finite difference method. The software project LaMEM (Lithosphere and Mantle Evolution Model) is primarily developed for thermo-mechanically coupled modeling of 3D lithospheric deformation. The code is based on a staggered-grid finite difference discretization in space and uses customized scalable solvers from the PETSc library to run efficiently on massively parallel machines (such as IBM Blue Gene/Q). Currently LaMEM relies on the Jacobian-Free Newton-Krylov (JFNK) nonlinear solver, which approximates the Jacobian-vector product using a simple finite difference formula. This approach never requires an assembled Jacobian matrix and uses only the residual computation routine. We use an approximate Jacobian (Picard) matrix to precondition the Krylov solver with Galerkin geometric multigrid. Because of the inherent problems of finite difference Jacobian estimation, this approach does not always result in stable convergence. In this work we present and discuss a matrix-free technique in which the Jacobian-vector product is replaced by analytically derived expressions, and we compare results with those obtained with a finite difference approximation of the Jacobian. This project is funded by ERC Starting Grant 258830, and computer facilities were provided by the Jülich supercomputer center (Germany).
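
    The fragility described above can be seen in the finite-difference Jacobian-vector product at the heart of JFNK; the Python sketch below uses a toy residual of our own devising, not LaMEM's, to show the one-sided formula and its dependence on the perturbation parameter eps.

        import numpy as np

        def jv_fd(F, u, v, eps=1e-7):
            # J(u) @ v  ~=  (F(u + eps*v) - F(u)) / eps, first-order accurate;
            # the result degrades if eps is poorly matched to the scaling of u.
            return (F(u + eps * v) - F(u)) / eps

        F = lambda u: np.array([u[0]**2 - u[1], u[0] + u[1]**3])  # toy residual
        u, v = np.array([1.0, 2.0]), np.array([1.0, 0.0])
        print(jv_fd(F, u, v))   # ~[2.0, 1.0]; the exact J @ v is [2*u0, 1]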

  1. An integrated approach using orthogonal analytical techniques to characterize heparan sulfate structure.

    PubMed

    Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Gunay, Nur Sibel; Wang, Jing; Sun, Elaine Y; Pradines, Joël R; Farutin, Victor; Shriver, Zachary; Kaundinya, Ganesh V; Capila, Ishan

    2017-02-01

    Heparan sulfate (HS), a glycosaminoglycan present on the surface of cells, has been postulated to have important roles in driving both normal and pathological physiologies. The chemical structure and sulfation pattern (domain structure) of HS is believed to determine its biological function, to vary across tissue types, and to be modified in the context of disease. Characterization of HS requires isolation and purification of cell surface HS as a complex mixture. This process may introduce additional chemical modification of the native residues. In this study, we describe an approach towards thorough characterization of bovine kidney heparan sulfate (BKHS) that utilizes a variety of orthogonal analytical techniques (e.g. NMR, IP-RPHPLC, LC-MS). These techniques are applied to characterize this mixture at various levels including composition, fragment level, and overall chain properties. The combination of these techniques in many instances provides orthogonal views into the fine structure of HS, and in other instances provides overlapping / confirmatory information from different perspectives. Specifically, this approach enables quantitative determination of natural and modified saccharide residues in the HS chains, and identifies unusual structures. Analysis of partially digested HS chains allows for a better understanding of the domain structures within this mixture, and yields specific insights into the non-reducing end and reducing end structures of the chains. This approach outlines a useful framework that can be applied to elucidate HS structure and thereby provides means to advance understanding of its biological role and potential involvement in disease progression. In addition, the techniques described here can be applied to characterization of heparin from different sources.

  2. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  3. Thin Film Electrodes for Rare Event Detectors

    NASA Astrophysics Data System (ADS)

    Odgers, Kelly; Brown, Ethan; Lewis, Kim; Giordano, Mike; Freedberg, Jennifer

    2017-01-01

    In detectors for rare physics processes, such as neutrinoless double beta decay and dark matter, high sensitivity requires careful reduction of backgrounds due to radioimpurities in detector components. Ultra pure cylindrical resistors are being created through thin film depositions onto high purity substrates, such as quartz glass or sapphire. By using ultra clean materials and depositing very small quantities in the films, low radioactivity electrodes are produced. A new characterization process for cylindrical film resistors has been developed through analytic construction of an analogue to the Van Der Pauw technique commonly used for determining sheet resistance on a planar sample. This technique has been used to characterize high purity cylindrical resistors ranging from several ohms to several tera-ohms for applications in rare event detectors. The technique and results of cylindrical thin film resistor characterization will be presented.
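
    For context, the planar Van der Pauw relation that the cylindrical analogue generalizes determines the sheet resistance R_s from two four-terminal resistances R_a and R_b via exp(-pi*R_a/R_s) + exp(-pi*R_b/R_s) = 1. The Python sketch below solves it by bisection with illustrative values; the cylindrical construction itself is not reproduced here.

        from math import exp, pi

        def sheet_resistance(Ra, Rb, tol=1e-12):
            # f(Rs) is monotonically increasing in Rs, so bisection applies.
            f = lambda Rs: exp(-pi * Ra / Rs) + exp(-pi * Rb / Rs) - 1.0
            lo, hi = 1e-6, 1e12
            while hi - lo > tol * hi:
                mid = 0.5 * (lo + hi)
                if f(mid) > 0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)

        print(sheet_resistance(10.0, 10.0))  # pi*10/ln(2) ~ 45.32 ohms/square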

  4. Electrochemical biosensors for hormone analyses.

    PubMed

    Bahadır, Elif Burcu; Sezgintürk, Mustafa Kemal

    2015-06-15

    Electrochemical biosensors have a unique place in the determination of hormones due to their simplicity, sensitivity, portability, and ease of operation. Unlike chromatographic techniques, the electrochemical techniques used do not require sample pre-treatment. Electrochemical biosensors are based on amperometric, potentiometric, impedimetric, and conductometric principles, with the amperometric technique the most commonly used. Although electrochemical biosensors offer great selectivity and sensitivity for early clinical analysis, poor reproducibility and difficult regeneration steps remain the primary challenges to the commercialization of these biosensors. This review summarizes electrochemical (amperometric, potentiometric, impedimetric, and conductometric) biosensors for hormone detection for the first time in the literature. After a brief description of the hormones, the immobilization steps and analytical performance of these biosensors are summarized, and the linear ranges, LODs, reproducibilities, and regeneration of the developed biosensors are compared. Future outlooks in this area are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  6. Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community

    DTIC Science & Technology

    2016-01-01

    Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the... more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement

  7. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  8. S.S. Annunziata Church (L'Aquila, Italy) unveiled by non- and micro-destructive testing techniques

    NASA Astrophysics Data System (ADS)

    Sfarra, Stefano; Cheilakou, Eleni; Theodorakeas, Panagiotis; Paoletti, Domenica; Koui, Maria

    2017-03-01

    The present research work explores the potential of an integrated inspection methodology, combining non-destructive testing and micro-destructive analytical techniques, for both the structural assessment of the S.S. Annunziata Church located in Roio Colle (L'Aquila, Italy) and the characterization of its wall paintings' pigments. The study started by applying passive thermal imaging for the structural monitoring of the church before and after the application of a consolidation treatment, while active thermal imaging was further used for assessing this consolidation procedure. After the earthquake of 2009, which seriously damaged the city of L'Aquila and its surroundings, part of the internal plaster fell off, revealing an ancient mural painting that was subsequently investigated by means of a combined analytical approach involving portable VIS-NIR fiber optics diffuse reflectance spectroscopy (FORS) and laboratory methods, such as environmental scanning electron microscopy (ESEM) coupled with energy dispersive X-ray analysis (EDX) and attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR). The results of the thermographic analysis provided information concerning the two construction phases of the Church, enabled the assessment of the consolidation treatment, and contributed to the detection of localized problems mainly related to the rising damp phenomenon and to biological attack. In addition, the results of the combined analytical approach allowed the identification of the wall painting pigments (red and yellow ochre, green earth, and smalt) and provided information on the binding media and the painting technique possibly applied by the artist. From these results it is possible to conclude that the joint use of the above methods in an integrated methodology can produce the complete set of information required for planning the Church's restoration phase.

  9. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.

  10. An extended laser flash technique for thermal diffusivity measurement of high-temperature materials

    NASA Technical Reports Server (NTRS)

    Shen, F.; Khodadadi, J. M.

    1993-01-01

    Knowledge of thermal diffusivity data for high-temperature materials (solids and liquids) is very important in analyzing a number of processes, among them solidification, crystal growth, and welding. However, reliable thermal diffusivity versus temperature data, particularly for high-temperature liquids, are still far from complete. The main measurement difficulties are due to the presence of convection and the requirement for a container; fortunately, the availability of levitation techniques has made it possible to solve the containment problem. Based on the feasibility of levitation technology, a new laser flash technique applicable to both levitated liquid and solid samples is being developed. At this point, the analysis for solid samples is near completion, and highlights of the technique are presented here. The levitated solid sample, which is assumed to be a sphere, is subjected to a very short burst of high-power radiant energy; the temperature of the irradiated surface area is elevated and a transient heat transfer process takes place within the sample. This containerless process is a two-dimensional unsteady heat conduction problem. Due to the nonlinearity of the radiative-plus-convective boundary condition, an analytic solution cannot be obtained, and two options are available. First, the radiation boundary condition can be linearized, which then admits a closed-form analytic solution; comparison of the analytic curves for the temperature rise at different points with the experimentally measured values then provides the thermal diffusivity. Second, one may set up an inverse conduction problem whereby the experimentally obtained surface temperature history is used as the boundary condition, and the thermal diffusivity is then evaluated by minimizing the difference between the real heat flux boundary condition (radiation plus convection) and the measurements. The status of an experimental study directed at measuring the thermal diffusivity of high-temperature solid samples of pure nickel and Inconel 718 superalloys is presented, and preliminary measurements showing surface temperature histories are discussed.
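
    The linearization referred to in the first option typically replaces the radiative loss with an equivalent heat transfer coefficient (a standard treatment, given here as our gloss rather than the authors' exact formulation):

        \epsilon\sigma\,(T_s^4 - T_\infty^4) = h_r\,(T_s - T_\infty),
        \qquad h_r = \epsilon\sigma\,(T_s + T_\infty)(T_s^2 + T_\infty^2),

    with h_r evaluated at a reference temperature, so that the boundary condition becomes linear and a closed-form series solution exists.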

  11. On using the Hilbert transform for blind identification of complex modes: A practical approach

    NASA Astrophysics Data System (ADS)

    Antunes, Jose; Debut, Vincent; Piteau, Philippe; Delaune, Xavier; Borsoi, Laurent

    2018-01-01

    The modal identification of dynamical systems under operational conditions, when subjected to wide-band unmeasured excitations, is today a viable alternative to more traditional modal identification approaches based on processing sets of measured FRFs or impulse responses. Among current techniques for performing operational modal identification, the so-called blind identification methods are the subject of considerable investigation; in particular, the SOBI (Second-Order Blind Identification) method was found to be quite efficient. SOBI was originally developed for systems with normal modes. To address systems with complex modes, various extension approaches have been proposed, in particular: (a) using a first-order state-space formulation for the system dynamics; (b) building complex analytic signals from the measured responses using the Hilbert transform. In this paper we further explore the latter option, which is conceptually interesting while preserving the model order and size. The focus is on the applicability of the SOBI technique for extracting the modal responses from analytic signals built from a set of vibratory responses. The novelty of this work is to propose a straightforward computational procedure for obtaining the complex cross-correlation response matrix to be used in the modal identification procedure. After clarifying subtle aspects of the general theoretical framework, we demonstrate that the correlation matrix of the analytic responses can be computed through a Hilbert transform of the real correlation matrix, so that the actual time-domain responses are no longer required for modal identification purposes. Numerical validation of the proposed technique is presented based on time-domain simulations of a conceptual physical multi-modal system, designed to display modes ranging from normal to highly complex while keeping modal damping low and nearly independent of the modal complexity, which can prove very interesting in test bench applications. Numerical results for complex modal identification are presented, and the quality of the identified modal matrix and modal responses, extracted using the complex SOBI technique with the proposed formulation, is assessed.
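
    A brief Python sketch of the signal-construction step (with illustrative data; the identification stage itself is omitted): analytic responses are built with the Hilbert transform and their complex correlation matrix is formed, the quantity that, per the paper, can equivalently be obtained by Hilbert-transforming the real correlation functions.

        import numpy as np
        from scipy.signal import hilbert

        rng = np.random.default_rng(0)
        x = rng.standard_normal((4, 20000))     # four measured response channels

        xa = hilbert(x, axis=1)                 # analytic signals x + i*H[x]
        R = (xa @ xa.conj().T) / x.shape[1]     # complex correlation matrix
        # R is the complex cross-correlation matrix fed to the SOBI stage; the
        # paper shows it can also be built from Hilbert transforms of the real
        # correlation functions, without storing the time series themselves.
        print(np.round(R, 2))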

  12. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  13. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  14. Analytical and numerical techniques for predicting the interfacial stresses of wavy carbon nanotube/polymer composites

    NASA Astrophysics Data System (ADS)

    Yazdchi, K.; Salehi, M.; Shokrieh, M. M.

    2009-03-01

    By introducing a new simplified 3D representative volume element for wavy carbon nanotubes, an analytical model is developed to study the stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in a good agreement with corresponding results for straight nanotubes.

  15. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surface, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectroscopy and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including physical, mechanical, electrical and thermal, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to better understand the sample in greater detail than that was possible before, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples—a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.

  16. A Phase Correction Technique Based on Spatial Movements of Antennas in Real-Time (S.M.A.R.T.) for Designing Self-Adapting Conformal Array Antennas

    NASA Astrophysics Data System (ADS)

    Roy, Sayan

This research presents a real-time adaptive phase correction technique for flexible phased array antennas on conformal surfaces of variable shape. Previously reported pattern correction methods for flexible phased array antennas require prior knowledge of the possible non-planar shapes the array may take on in conformal applications. For the first time, this requirement of prior shape-curvature knowledge is removed: instantaneous information on the relative locations of the array elements is used to develop a geometrical model based on a set of Bezier curves. Specifically, by using an array of inclinometer sensors and an adaptive phase-correction algorithm, it is shown that the proposed geometrical model can successfully predict different conformal orientations of a 1-by-4 antenna array in real time without knowledge of the shape-changing characteristics of the surface to which the array is attached. Moreover, the phase correction technique is validated by determining the field patterns and broadside gain of the 1-by-4 antenna array on four different conformal surfaces with multiple points of curvature. Throughout this work, measurements are shown to agree with the analytical solutions and full-wave simulations.
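
    The Bezier-based correction algorithm itself is not reproduced in this abstract; the co-phasing principle it relies on, however, is compact: once each element's position is known, the excitation phase that cancels its free-space path difference along the intended beam direction restores the broadside pattern. A minimal sketch, with the frequency and element positions chosen purely for illustration:

        import numpy as np

        c = 3.0e8                        # speed of light (m/s)
        f = 2.45e9                       # operating frequency (Hz), assumed
        k = 2 * np.pi * f / c            # free-space wavenumber

        # hypothetical element positions (x, z) on a bent surface, in metres
        pos = np.array([[0.00, 0.000],
                        [0.06, 0.008],
                        [0.12, 0.013],
                        [0.18, 0.008]])
        u_hat = np.array([0.0, 1.0])     # broadside (+z) beam direction

        # co-phasing: cancel each element's path difference along u_hat
        phase_corr = -k * (pos @ u_hat)  # radians, one value per element
        print(np.degrees(phase_corr))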

  17. Meeting future information needs for Great Lakes fisheries management

    USGS Publications Warehouse

    Christie, W.J.; Collins, John J.; Eck, Gary W.; Goddard, Chris I.; Hoenig, John M.; Holey, Mark; Jacobson, Lawrence D.; MacCallum, Wayne; Nepszy, Stephen J.; O'Gorman, Robert; Selgeby, James

    1987-01-01

    Description of information needs for management of Great Lakes fisheries is complicated by recent changes in biology and management of the Great Lakes, development of new analytical methodologies, and a transition in management from a traditional unispecies approach to a multispecies/community approach. A number of general problems with the collection and management of data and information for fisheries management need to be addressed (i.e. spatial resolution, reliability, computerization and accessibility of data, design of sampling programs, standardization and coordination among agencies, and the need for periodic review of procedures). Problems with existing data collection programs include size selectivity and temporal trends in the efficiency of fishing gear, inadequate creel survey programs, bias in age estimation, lack of detailed sea lamprey (Petromyzon marinus) wounding data, and data requirements for analytical techniques that are underutilized by managers of Great Lakes fisheries. The transition to multispecies and community approaches to fisheries management will require policy decisions by the management agencies, adequate funding, and a commitment to develop programs for collection of appropriate data on a long-term basis.

  18. Data informatics for the Detection, Characterization, and Attribution of Climate Extremes

    NASA Astrophysics Data System (ADS)

    Collins, W.; Wehner, M. F.; O'Brien, T. A.; Paciorek, C. J.; Krishnan, H.; Johnson, J. N.; Prabhat, M.

    2015-12-01

The potential for increasing frequency and intensity of extreme phenomena including downpours, heat waves, and tropical cyclones constitutes one of the primary risks of climate change for society and the environment. The challenge of characterizing these risks is that extremes represent the "tails" of distributions of atmospheric phenomena and are, by definition, highly localized and typically relatively transient. Therefore very large volumes of observational data and projections of future climate are required to quantify their properties in a robust manner. Massive data analytics are required in order to detect individual extremes, accumulate statistics on their properties, quantify how these statistics are changing with time, and attribute the effects of anthropogenic global warming on these statistics. We describe examples of the suite of techniques the climate community is developing to address these analytical challenges. The techniques include massively parallel methods for detecting and tracking atmospheric rivers and cyclones; data-intensive extensions to generalized extreme value theory to summarize the properties of extremes; and multi-model ensembles of hindcasts to quantify the attributable risk of anthropogenic influence on individual extremes. We conclude by highlighting examples of these methods developed by our CASCADE (Calibrated and Systematic Characterization, Attribution, and Detection of Extremes) project.
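
    As a flavor of the extreme value step (the plain block-maxima fit, not the data-intensive extensions the abstract refers to), the following sketch fits a generalized extreme value (GEV) distribution to synthetic annual maxima with SciPy and reads off a 100-year return level:

        import numpy as np
        from scipy.stats import genextreme

        # Synthetic "annual maxima" stand in for station records.
        rng = np.random.default_rng(0)
        annual_max = genextreme.rvs(c=-0.1, loc=30.0, scale=5.0,
                                    size=60, random_state=rng)

        shape, loc, scale = genextreme.fit(annual_max)        # MLE fit
        rl100 = genextreme.isf(1 / 100, shape, loc, scale)    # 100-yr level
        print(f"shape {shape:.2f}, loc {loc:.1f}, scale {scale:.1f}")
        print(f"100-year return level: {rl100:.1f}")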

  19. Nuclear Forensics and Attribution: A National Laboratory Perspective

    NASA Astrophysics Data System (ADS)

    Hall, Howard L.

    2008-04-01

    Current capabilities in technical nuclear forensics - the extraction of information from nuclear and/or radiological materials to support the attribution of a nuclear incident to material sources, transit routes, and ultimately perpetrator identity - derive largely from three sources: nuclear weapons testing and surveillance programs of the Cold War, advances in analytical chemistry and materials characterization techniques, and abilities to perform ``conventional'' forensics (e.g., fingerprints) on radiologically contaminated items. Leveraging that scientific infrastructure has provided a baseline capability to the nation, but we are only beginning to explore the scientific challenges that stand between today's capabilities and tomorrow's requirements. These scientific challenges include radically rethinking radioanalytical chemistry approaches, developing rapidly deployable sampling and analysis systems for field applications, and improving analytical instrumentation. Coupled with the ability to measure a signature faster or more exquisitely, we must also develop the ability to interpret those signatures for meaning. This requires understanding of the physics and chemistry of nuclear materials processes well beyond our current level - especially since we are unlikely to ever have direct access to all potential sources of nuclear threat materials.

  20. Computation of turbulent boundary layers employing the defect wall-function method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Brown, Douglas L.

    1994-01-01

To decrease the overall computational time requirements of a spatially-marching parabolized Navier-Stokes finite-difference computer code applied to turbulent fluid flow, a wall-function methodology, originally proposed by R. Barnwell, was implemented. This numerical effort increases computational speed while calculating reasonably accurate wall shear stress spatial distributions and boundary-layer profiles. Since the wall shear stress is determined analytically from the wall-function model, the computational grid near the wall is not required to spatially resolve the laminar viscous sublayer. Consequently, a substantially increased computational integration step size is achieved, resulting in a considerable decrease in net computational time. This wall-function technique is demonstrated for adiabatic flat plate test cases from Mach 2 to Mach 8. These test cases are verified analytically employing: (1) Eckert reference method solutions, (2) experimental turbulent boundary-layer data of Mabey, and (3) finite-difference computational code solutions with fully resolved laminar viscous sublayers. Additionally, results have been obtained for two pressure-gradient cases: (1) an adiabatic expansion corner and (2) an adiabatic compression corner.
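
    Barnwell's defect formulation is not reproduced here, but the basic wall-function idea can be sketched with the classic log law: given the computed velocity at the first grid point, placed in the log layer, the friction velocity (and hence the wall shear stress) is obtained by solving the log law iteratively instead of resolving the sublayer on the grid. All constants and flow values below are assumed for illustration:

        import numpy as np

        kappa, B = 0.41, 5.0        # log-law constants
        rho, nu = 1.2, 1.5e-5       # air density (kg/m^3), viscosity (m^2/s), assumed
        u_p, y_p = 20.0, 2.0e-3     # first-cell velocity (m/s) and height (m), assumed

        # solve u_p/u_tau = (1/kappa) ln(y_p u_tau / nu) + B for u_tau
        u_tau = 1.0                 # initial guess
        for _ in range(50):         # Newton iteration
            f = u_p / u_tau - np.log(y_p * u_tau / nu) / kappa - B
            df = -u_p / u_tau**2 - 1.0 / (kappa * u_tau)
            u_tau -= f / df

        tau_w = rho * u_tau**2      # wall shear stress, no sublayer grid needed
        print(f"u_tau = {u_tau:.3f} m/s, tau_w = {tau_w:.3f} Pa")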

  1. Engine Hydraulic Stability. [injector model for analyzing combustion instability

    NASA Technical Reports Server (NTRS)

    Kesselring, R. C.; Sprouse, K. M.

    1977-01-01

An analytical injector model was developed specifically to analyze combustion instability coupling between the injector hydraulics and the combustion process. This digital computer dynamic injector model will, for any imposed chamber or inlet pressure profile with a frequency ranging from 100 to 3000 Hz (minimum), accurately predict/calculate the instantaneous injector flowrates. The injector system is described in terms of which flow segments enter and leave each pressure node. For each flow segment, a resistance, line lengths, and areas are required as inputs (the line lengths and areas are used in determining inertance). For each pressure node, volume and acoustic velocity are required as inputs (volume and acoustic velocity determine capacitance). The geometric criteria for determining the inertances of flow segments and the capacitances of pressure nodes were established. Also, a technique was developed for analytically determining time-averaged steady-state pressure drops and flowrates for every flow segment in an injector when such data are not known. These pressure drops and flowrates are then used in determining the linearized flow resistance for each line segment of flow.
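
    The inertance and capacitance relations the abstract alludes to are the standard lumped-parameter ones for fluid lines: a segment of length l and area A carrying fluid of density rho has inertance rho*l/A, and a node of volume V with acoustic velocity a has capacitance V/(rho*a^2). A small sketch with hypothetical values:

        rho = 800.0              # propellant density (kg/m^3), assumed
        l, A = 0.10, 2.0e-5      # segment length (m) and flow area (m^2), assumed
        V, a = 5.0e-5, 1200.0    # node volume (m^3) and acoustic velocity (m/s), assumed

        inertance = rho * l / A          # I = rho*l/A, Pa/(m^3/s^2)
        capacitance = V / (rho * a**2)   # C = V/(rho*a^2), m^3/Pa

        print(f"segment inertance  I = {inertance:.3e}")
        print(f"node capacitance   C = {capacitance:.3e}")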

  2. Micelle to solvent stacking of organic cations in micellar electrokinetic chromatography with sodium dodecyl sulfate.

    PubMed

    Quirino, Joselito P; Aranas, Agnes T

    2011-10-14

    The on-line sample concentration technique, micelle to solvent stacking (MSS), was studied for small organic cations (quaternary ammonium herbicides, β-blocker drugs, and tricyclic antidepressant drugs) in reversed migration micellar electrokinetic chromatography. Electrokinetic chromatography was carried out in fused silica capillaries with a background solution of sodium dodecyl sulfate (SDS) in a low pH phosphate buffer. MSS was performed using anionic SDS micelles in the sample solution for analyte transport and methanol or acetonitrile as organic solvent in the background solution for analyte effective electrophoretic mobility reversal. The solvent also allowed for the separation of the analyte test mixtures. A model for focusing and separation was developed and the mobility reversal that involved micelle collapse was experimentally verified. The effect of analyte retention factor was observed by changing the % organic solvent in the background solution or the concentration of SDS in the sample matrix. With an injection length of 31.9 cm (77% of effective capillary length) for the 7 test drugs, the LODs (S/N=3) of 5-14 ng/mL were 101-346-fold better when compared to typical injection. The linearity (R(2), range=0.025-0.8 μg/mL), intraday and interday repeatability (%RSD, n=10) were ≥0.988, <6.0% and <8.5%, respectively. In addition, analysis of spiked urine samples after 10-fold dilution with the sample matrix yielded LODs=0.02-0.10 μg/mL. These LODs are comparable to published electrophoretic methods that required off-line sample concentration. However, the practicality of the technique for more complex samples will rely on dedicated sample preparation schemes. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. An Analytical Solution for Transient Thermal Response of an Insulated Structure

    NASA Technical Reports Server (NTRS)

    Blosser, Max L.

    2012-01-01

    An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
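
    The paper's exact solution and its two non-dimensional parameters are not given in the abstract. As a flavor of this class of analysis, the textbook one-term solution for transient conduction in a convectively heated slab likewise reduces to two non-dimensional parameters, the Biot and Fourier numbers; a sketch with hypothetical values:

        import numpy as np
        from scipy.optimize import brentq

        Bi, Fo = 1.0, 0.5   # hypothetical Biot and Fourier numbers (Fo > 0.2)

        # first eigenvalue of lam*tan(lam) = Bi on (0, pi/2)
        lam1 = brentq(lambda lam: lam * np.tan(lam) - Bi, 1e-6, np.pi / 2 - 1e-6)
        C1 = 4 * np.sin(lam1) / (2 * lam1 + np.sin(2 * lam1))

        # dimensionless midplane temperature (T0 - Tinf)/(Ti - Tinf)
        theta0 = C1 * np.exp(-lam1**2 * Fo)
        print(f"lambda1 = {lam1:.4f}, theta0 = {theta0:.3f}")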

  4. [Raman spectroscopy applied to analytical quality control of injectable drugs: analytical evaluation and comparative economic versus HPLC and UV / visible-FTIR].

    PubMed

    Bourget, P; Amin, A; Vidal, F; Merlette, C; Troude, P; Corriol, O

    2013-09-01

In France, centralized IV admixture of chemotherapy (CT) treatments at the hospital is now required by law. We have previously shown that the preparation of Therapeutic Objects (TOs) can benefit from Analytical Quality Assurance (AQA), closely linked to batch release, covering three key parameters: identity, purity, and initial concentration of the compound of interest. In recent and diversified works, we showed the technical superiority of non-intrusive Raman spectroscopy (RS) over other analytical options, in particular HPLC and a vibrational method using UV/visible-FTIR coupling. A combined qualitative and economic assessment usefully enriches this work. The study compares, under operational conditions, the performance of the three analytical methods used for the AQC of TOs. We used: a) a set of evaluation criteria, b) the depreciation tables of the machinery, c) the cost of disposables, d) the weight of equipment and technical installations, and e) the basic accounting unit (unit of work) and its composite costs (in euros), which vary according to the technical options and the weight of both human resources and disposables; different combinations are described. The unit of work can take 12 different values between 1 and 5.5 euros, and we provide various recommendations. A qualitative evaluation grid consistently places the RS technology as superior or equal to the two other techniques currently available. Our results demonstrate: a) the major interest of non-intrusive AQC performed by RS, especially when it is not possible to analyze a TO with existing methods, e.g. elastomeric portable pumps, and b) the high potential of this technique to be a strong contributor to the security of the medication circuit and to the fight against the iatrogenic effects of drugs, especially in the hospital. It also contributes to the protection of all actors in healthcare and of their working environment.

  5. Optical trapping for analytical biotechnology.

    PubMed

    Ashok, Praveen C; Dholakia, Kishan

    2012-02-01

We describe the exciting advances in using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single-cell or even subcellular level, allowing insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications, such as understanding the dynamics of biological macromolecules, cell-cell interactions, and the micro-rheology of both cells and fluids. Furthermore, we may probe and analyse the biological world by combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.
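
    A rough sense of the pico-Newton scale mentioned above: near the trap centre an optical trap behaves like a Hookean spring, so the force is simply stiffness times displacement. The values below are hypothetical but typical of published tweezers work:

        kappa = 0.05e-12 / 1e-9   # trap stiffness: 0.05 pN/nm in N/m, assumed
        x = 200e-9                # bead displacement: 200 nm, assumed

        force = kappa * x         # Hookean restoring force near trap centre
        print(f"trap force ~ {force * 1e12:.1f} pN")   # ~10 pN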

  6. An update on pharmaceutical film coating for drug delivery.

    PubMed

    Felton, Linda A; Porter, Stuart C

    2013-04-01

    Pharmaceutical coating processes have generally been transformed from what was essentially an art form in the mid-twentieth century to a much more technology-driven process. This review article provides a basic overview of current film coating processes, including a discussion on polymer selection, coating formulation additives and processing equipment. Substrate considerations for pharmaceutical coating processes are also presented. While polymeric coating operations are commonplace in the pharmaceutical industry, film coating processes are still not fully understood, which presents serious challenges with current regulatory requirements. Novel analytical technologies and various modeling techniques that are being used to better understand film coating processes are discussed. This review article also examines the challenges of implementing process analytical technologies in coating operations, active pharmaceutical ingredients in polymer film coatings, the use of high-solids coating systems and continuous coating and other novel coating application methods.

  7. A microfluidic paper-based analytical device for the assay of albumin-corrected fructosamine values from whole blood samples.

    PubMed

    Boonyasit, Yuwadee; Laiwattanapaisal, Wanida

    2015-01-01

    A method for acquiring albumin-corrected fructosamine values from whole blood using a microfluidic paper-based analytical system that offers substantial improvement over previous methods is proposed. The time required to quantify both serum albumin and fructosamine is shortened to 10 min with detection limits of 0.50 g dl(-1) and 0.58 mM, respectively (S/N = 3). The proposed system also exhibited good within-run and run-to-run reproducibility. The results of the interference study revealed that the acceptable recoveries ranged from 95.1 to 106.2%. The system was compared with currently used large-scale methods (n = 15), and the results demonstrated good agreement among the techniques. The microfluidic paper-based system has the potential to continuously monitor glycemic levels in low resource settings.
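
    As a reminder of how S/N = 3 detection limits like those quoted above are commonly estimated (three times the blank noise divided by the calibration slope), here is a minimal sketch with hypothetical calibration numbers, not the paper's data:

        import numpy as np

        blank = np.array([0.012, 0.015, 0.011, 0.014, 0.013])  # blank signals, assumed
        slope = 0.065                                          # signal per mM, assumed

        lod = 3 * blank.std(ddof=1) / slope   # S/N = 3 criterion
        print(f"LOD ~ {lod:.2f} mM")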

  8. Decision analysis to complete diagnostic research by closing the gap between test characteristics and cost-effectiveness.

    PubMed

    Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik

    2009-12-01

    The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).
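
    At its simplest, the decision-analytic step the authors advocate weights each strategy's outcomes by their probabilities and compares expected costs and effects, for example through an incremental cost-effectiveness ratio (ICER). A toy sketch with hypothetical numbers:

        # expected cost (euros) and effect (QALYs) per strategy, hypothetical
        strategies = {
            "current test": (1200.0, 0.80),
            "new test":     (1500.0, 0.85),
        }

        (c0, e0) = strategies["current test"]
        (c1, e1) = strategies["new test"]
        icer = (c1 - c0) / (e1 - e0)   # incremental cost-effectiveness ratio
        print(f"ICER = {icer:.0f} euros per QALY gained")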

  9. Nanoscale optical interferometry with incoherent light

    PubMed Central

    Li, Dongfang; Feng, Jing; Pacifici, Domenico

    2016-01-01

    Optical interferometry has empowered an impressive variety of biosensing and medical imaging techniques. A widely held assumption is that devices based on optical interferometry require coherent light to generate a precise optical signature in response to an analyte. Here we disprove that assumption. By directly embedding light emitters into subwavelength cavities of plasmonic interferometers, we demonstrate coherent generation of surface plasmons even when light with extremely low degrees of spatial and temporal coherence is employed. This surprising finding enables novel sensor designs with cheaper and smaller light sources, and consequently increases accessibility to a variety of analytes, such as biomarkers in physiological fluids, or even airborne nanoparticles. Furthermore, these nanosensors can now be arranged along open detection surfaces, and in dense arrays, accelerating the rate of parallel target screening used in drug discovery, among other high volume and high sensitivity applications. PMID:26880171

  11. Raman spectroscopy in astrobiology.

    PubMed

    Jorge Villar, Susana E; Edwards, Howell G M

    2006-01-01

Raman spectroscopy is proposed as a valuable analytical technique for planetary exploration because it is sensitive to organic and inorganic compounds and able to unambiguously identify key spectral markers in a mixture of biological and geological components; furthermore, sample manipulation is not required, and any size of sample can be studied without chemical or mechanical pretreatment. NASA and ESA are considering the adoption of miniaturised Raman spectrometers for inclusion in suites of analytical instrumentation to be placed on robotic landers on Mars in the near future to search for extinct or extant life signals. In this paper we review the advantages and limitations of Raman spectroscopy for the analysis of complex specimens with relevance to the detection of bio- and geomarkers in extremophilic organisms, which are considered to be terrestrial analogues of possible extraterrestrial life that could have developed on planetary surfaces.

  12. Nuclear and atomic analytical techniques in environmental studies in South America.

    PubMed

    Paschoa, A S

    1990-01-01

The use of nuclear analytical techniques for environmental studies in South America is selectively reviewed, from the early cosmic-ray work of Lattes to the recent applications of the PIXE (particle-induced X-ray emission) technique to air pollution problems in large cities such as São Paulo and Rio de Janeiro. The studies on natural radioactivity and fallout from nuclear weapons in South America are briefly examined.

  13. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
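
    The abstract does not specify the model form, so the following is only a generic flavor of the predictive step: learn a mapping from attributes measured on single-step affinity-purified material to the values the same lots show after the full purification train, then apply it to a new lot. Synthetic data, with scikit-learn used purely for illustration:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(1)
        affinity = rng.normal(size=(30, 3))    # 30 lots x 3 attributes, synthetic
        drug_substance = affinity @ [0.9, 0.1, 0.3] + 0.05 * rng.normal(size=30)

        model = LinearRegression().fit(affinity, drug_substance)
        new_lot = rng.normal(size=(1, 3))      # attributes of a fresh lot
        print("predicted drug-substance attribute:", model.predict(new_lot))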

  14. Pre-analytic and analytic sources of variations in thiopurine methyltransferase activity measurement in patients prescribed thiopurine-based drugs: A systematic review.

    PubMed

    Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A

    2011-07-01

    Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.

  15. Performance of laboratories analysing welding fume on filter samples: results from the WASP proficiency testing scheme.

    PubMed

    Stacey, Peter; Butler, Owen

    2008-06-01

This paper emphasizes the need for occupational hygiene professionals to require evidence of the quality of welding fume data from analytical laboratories. The measurement of metals in welding fume using atomic spectrometric techniques is a complex analysis, often requiring specialist digestion procedures. The results from a trial programme testing the proficiency of laboratories in the Workplace Analysis Scheme for Proficiency (WASP) at measuring potentially harmful metals in several different types of welding fume showed that most laboratories underestimated the mass of analyte on the filters. The average recovery was 70-80% of the target value, and >20% of reported recoveries for some of the more difficult welding fume matrices were <50%. This level of under-reporting has significant implications for any health or hygiene studies of the exposure of welders to toxic metals for the types of fumes included in this study. Good laboratory performance in measuring spiked WASP filter samples containing soluble metal salts did not guarantee good performance in measuring the more complex welding fume trial filter samples. Consistent rather than erratic error predominated, suggesting that the main analytical factor contributing to the differences between the target values and results was the effectiveness of the sample preparation procedures used by participating laboratories. It is concluded that, with practice and regular participation in WASP, performance can improve over time.
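
    Proficiency scoring of this kind reduces to comparing reported masses against the spiked target; a toy recovery calculation (hypothetical numbers, not WASP data) makes the quoted 70-80% figure concrete:

        import numpy as np

        target = 25.0                                  # spiked mass on filter (ug), assumed
        reported = np.array([19.5, 17.2, 21.0, 11.8])  # labs' reported masses (ug), assumed

        recovery = 100 * reported / target             # percent of target
        print(recovery, f"-> mean {recovery.mean():.0f}%")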

  16. A Comparative Study of Single-pulse and Double-pulse Laser-Induced Breakdown Spectroscopy with Uranium-containing Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skrodzki, P. J.; Becker, J. R.; Diwakar, P. K.

Laser-induced breakdown spectroscopy (LIBS) holds potential advantages in special nuclear material (SNM) sensing and nuclear forensics, which require rapid analysis, minimal sample preparation and stand-off distance capability. SNM, such as U, however, result in crowded emission spectra with LIBS, and characteristic emission lines are challenging to discern. It is well-known that double-pulse LIBS (DPLIBS) improves the signal intensity for analytes over conventional single-pulse LIBS (SPLIBS). This study investigates U signal in a glass matrix using DPLIBS and compares to signal features obtained using SPLIBS. DPLIBS involves sequential firing of a 1.06 µm Nd:YAG pre-pulse and a 10.6 µm TEA CO2 heating pulse in near collinear geometry. Optimization of experimental parameters, including inter-pulse delay and energy, follows identification of characteristic lines and signals for the bulk analyte Ca and the minor constituent analyte U for both DPLIBS and SPLIBS. Spatial and temporal coupling of the two pulses in the proposed DPLIBS technique yields improvements in analytical merits with negligible further damage to the sample compared to SPLIBS. Subsequently, the study discusses optimum plasma emission conditions of U lines and relative figures of merit in both SPLIBS and DPLIBS. Investigation into plasma characteristics also addresses plausible mechanisms related to the observed U analyte signal variation between SPLIBS and DPLIBS.

  17. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

18. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogeneous Space-Time Data

    NASA Astrophysics Data System (ADS)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

Spatiotemporal (ST) analytics applied to major ST data sources from providers such as USGS, NOAA, the World Bank, and the World Health Organization has tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes at local and global levels. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses the challenges of properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16000+ attributes covering 200+ countries for over 50 years from over 30 major sources, and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and describe how others may freely access the tool.
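
    A toy version of the harmonised long form that this kind of country-by-year integration implies (not WSTAMP's actual schema): one row per country, year, and attribute, which can then be pivoted for a trend query. Values are illustrative:

        import pandas as pd

        rows = [  # (country, year, attribute, value), toy numbers
            ("NOR", 2000, "life_expectancy", 78.6),
            ("NOR", 2010, "life_expectancy", 81.0),
            ("BRA", 2000, "life_expectancy", 69.9),
            ("BRA", 2010, "life_expectancy", 73.1),
        ]
        df = pd.DataFrame(rows, columns=["country", "year", "attribute", "value"])

        trend = (df.pivot_table(index="country", columns="year", values="value")
                   .assign(change=lambda t: t[2010] - t[2000]))
        print(trend)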

  19. Rotorcraft Diagnostics

    NASA Technical Reports Server (NTRS)

    Haste, Deepak; Azam, Mohammad; Ghoshal, Sudipto; Monte, James

    2012-01-01

Health management (HM) in any engineering system requires adequate understanding of the system's functioning; a sufficient amount of monitored data; the capability to extract, analyze, and collate information; and the capability to combine understanding and information for HM-related estimation and decision-making. Rotorcraft systems are, in general, highly complex. Obtaining adequate understanding of the functioning of such systems is quite difficult because of the proprietary (restricted access) nature of their designs and dynamic models. Development of an EIM (exact inverse map) solution for rotorcraft requires a process that can overcome these difficulties and maximally utilize monitored information for HM facilitation by employing advanced analytic techniques. The goal was to develop a versatile HM solution for rotorcraft to facilitate Condition Based Maintenance Plus (CBM+) capabilities. The effort was geared towards developing analytic and reasoning techniques, and proving the ability to embed the required capabilities on a rotorcraft platform, paving the way for implementing the solution on an aircraft-level system for consolidation and reporting. The solution for rotorcraft can be used offboard or embedded directly onto a rotorcraft system. The envisioned solution utilizes available monitored and archived data for real-time fault detection and identification, failure precursor identification, offline fault detection and diagnostics, health condition forecasting, optimal guided troubleshooting, and maintenance decision support. A variant of the onboard version is a self-contained hardware and software (HW+SW) package that can be embedded on rotorcraft systems. The HM solution comprises components that gather/ingest data and information, perform information/feature extraction, analyze information in conjunction with the dependency/diagnostic model of the target system, facilitate optimal guided troubleshooting, and offer decision support for optimal maintenance.

  20. Metrology for hydrogen energy applications: a project to address normative requirements

    NASA Astrophysics Data System (ADS)

    Haloua, Frédérique; Bacquart, Thomas; Arrhenius, Karine; Delobelle, Benoît; Ent, Hugo

    2018-03-01

Hydrogen represents a clean and storable energy solution that could meet worldwide energy demands and reduce greenhouse gas emissions. The joint research project (JRP) ‘Metrology for sustainable hydrogen energy applications’ addresses standardisation needs through pre- and co-normative metrology research in the fast-emerging hydrogen fuel sector, meeting the requirements of the European Directive 2014/94/EU by supplementing the revision of two ISO standards that are currently too generic to enable a sustainable implementation of hydrogen. The hydrogen purity dispensed at refueling points should comply with the technical specifications of ISO 14687-2 for fuel cell electric vehicles. The rapid progress of fuel cell technology now requires revising this standard towards less constraining limits for the 13 gaseous impurities. In parallel, optimized, validated analytical methods are proposed to reduce the number of analyses. The study also aims to develop and validate traceable methods to accurately assess the hydrogen mass absorbed and stored in metal hydride tanks; this is a research axis for the revision of the ISO 16111 standard to develop this safe hydrogen storage technique. The probability of hydrogen impurities being present that affect fuel cells will be assessed, analytical techniques for traceable measurements of hydrogen impurities will be evaluated, and new data on maximum concentrations of impurities based on degradation studies will be proposed. Novel validated methods will be determined for measuring the hydrogen mass absorbed in hydride tanks of the AB, AB2 and AB5 types referenced in ISO 16111, as the methods currently available do not provide accurate results. The outputs will have a direct impact on the standardisation work for the ISO 16111 and ISO 14687-2 revisions in the relevant working groups of ISO/TC 197 ‘Hydrogen technologies’.
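
    The quantity at stake in the ISO 16111 work, the hydrogen mass absorbed by a hydride bed, is classically estimated volumetrically (a Sieverts-type measurement): the moles leaving a calibrated reference volume during a pressure drop at known temperature are attributed to absorption. A simplified sketch with hypothetical values; the JRP's validated methods add real-gas corrections and uncertainty budgets well beyond this:

        R = 8.314            # gas constant, J/(mol K)
        M_H2 = 2.016e-3      # molar mass of H2, kg/mol
        V_ref = 0.5e-3       # calibrated reference volume (m^3), assumed
        T = 296.0            # temperature (K), assumed
        p1, p2 = 20e5, 14e5  # pressure before/after absorption (Pa), assumed
        Z = 1.01             # compressibility factor, assumed

        dn = (p1 - p2) * V_ref / (Z * R * T)   # moles of H2 absorbed
        print(f"absorbed hydrogen: {dn * M_H2 * 1e3:.2f} g")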
