Sample records for analytical techniques capable

  1. Chemical Detection and Identification Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.

    2002-01-01

    Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capabilities with minimal requirements of volume, weight, and consumables. Advances in technology may be achieved by increasing the amount of information acquired by a given technique with greater analytical capabilities and miniaturization of proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using Gas Chromatography. These efforts to expand the analytical capabilities of GC technology are focused on the development of detectors for the GC which provide sample identification independent of the GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).

  2. The role of light microscopy in aerospace analytical laboratories

    NASA Technical Reports Server (NTRS)

    Crutcher, E. R.

    1977-01-01

    Light microscopy has greatly reduced analytical flow time and added new dimensions to laboratory capability. Aerospace analytical laboratories are often confronted with problems involving contamination, wear, or material inhomogeneity. The detection of potential problems and the solution of those that develop necessitate the most sensitive and selective applications of sophisticated analytical techniques and instrumentation. This inevitably involves light microscopy. The microscope can characterize and often identify the cause of a problem in 5-15 minutes, with confirmatory tests generally taking less than one hour. Light microscopy has made, and will continue to make, a significant contribution to the analytical capabilities of aerospace laboratories.

  3. Techniques for sensing methanol concentration in aqueous environments

    NASA Technical Reports Server (NTRS)

    Narayanan, Sekharipuram R. (Inventor); Chun, William (Inventor); Valdez, Thomas I. (Inventor)

    2001-01-01

    An analyte concentration sensor capable of fast and reliable sensing of analyte concentration in aqueous environments with high analyte concentrations. Preferably, the present invention is a methanol concentration sensor device coupled to a fuel metering control system for use in a liquid direct-feed fuel cell.

  4. Development and Applications of Liquid Sample Desorption Electrospray Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zheng, Qiuling; Chen, Hao

    2016-06-01

    Desorption electrospray ionization mass spectrometry (DESI-MS) is a recent advance in the field of analytical chemistry. This review surveys the development of liquid sample DESI-MS (LS-DESI-MS), a variant form of DESI-MS that focuses on fast analysis of liquid samples, and its novel analytical applications in bioanalysis, proteomics, and reaction kinetics. Due to the capability of directly ionizing liquid samples, liquid sample DESI (LS-DESI) has been successfully used to couple MS with various analytical techniques, such as microfluidics, microextraction, electrochemistry, and chromatography. This review also covers these hyphenated techniques. In addition, several closely related ionization methods, including transmission mode DESI, thermally assisted DESI, and continuous flow-extractive DESI, are briefly discussed. The capabilities of LS-DESI extend and/or complement the utilities of traditional DESI and electrospray ionization and will find extensive and valuable analytical application in the future.

  5. INTEGRATING BIOANALYTICAL CAPABILITY IN AN ENVIRONMENTAL ANALYTICAL LABORATORY

    EPA Science Inventory

    The product is a book chapter which is an introductory and summary chapter for the reference work "Immunoassays and Other Bioanalytical Techniques" to be published by CRC Press, Taylor and Francis Books. The chapter provides analytical chemists information on new techni...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - L2000 PCB/CHLORIDE ANALYZER - DEXSIL CORPORATION

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - ENVIROGARD PCB TEST KIT - STRATEGIC DIAGNOSTICS INC

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...

  8. An investigation of the feasibility of improving oculometer data analysis through application of advanced statistical techniques

    NASA Technical Reports Server (NTRS)

    Rana, D. S.

    1980-01-01

    The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.

  9. A general, cryogenically-based analytical technique for the determination of trace quantities of volatile organic compounds in the atmosphere

    NASA Technical Reports Server (NTRS)

    Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.

    1985-01-01

    An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.

  10. Further Investigations of Content Analytic Techniques for Extracting the Differentiating Information Contained in the Narrative Sections of Performance Evaluations for Navy Enlisted Personnel. Technical Report No. 75-1.

    ERIC Educational Resources Information Center

    Ramsey-Klee, Diane M.; Richman, Vivian

    The purpose of this research is to develop content analytic techniques capable of extracting the differentiating information in narrative performance evaluations for enlisted personnel in order to aid in the process of selecting personnel for advancement, duty assignment, training, or quality retention. Four tasks were performed. The first task…

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: IMMUNOASSAY KIT, ENVIROLOGIX, INC., PCB IN SOIL TUBE ASSAY

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soi...

  12. Application of surface plasmon resonance for the detection of carbohydrates, glycoconjugates, and measurement of the carbohydrate-specific interactions: a comparison with conventional analytical techniques. A critical review.

    PubMed

    Safina, Gulnara

    2012-01-27

    Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes. That makes these compounds important targets to be detected, monitored, and identified. The identification of the carbohydrate content in their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most of the conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and the use of various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during the last two decades and has received increasing attention for different applications, from the real-time monitoring of affinity bindings to biosensors. SPR does not require any labels and is capable of direct measurement of biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern analytical instrumental techniques with SPR in terms of their analytical capabilities to detect carbohydrates and their conjugates with proteins and lipids and to study carbohydrate-specific bindings. A few selected examples of the SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  13. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  14. Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.

    2004-01-01

    A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e., tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth, since each technique provides a unique analysis capability for certain classes of organic molecules.

  15. Analytical determination of selenium in medical samples, staple food and dietary supplements by means of total reflection X-ray fluorescence spectroscopy

    NASA Astrophysics Data System (ADS)

    Stosnach, Hagen

    2010-09-01

    Selenium is essential for many aspects of human health and, thus, the object of intensive medical research. This demands the use of analytical techniques capable of analysing selenium at low concentrations with high accuracy in widespread matrices and sometimes smallest sample amounts. In connection with the increasing importance of selenium, there is a need for rapid and simple on-site (or near-to-site) selenium analysis in food basics like wheat at processing and production sites, as well as for the analysis of this element in dietary supplements. Common analytical techniques like electrothermal atomic absorption spectroscopy (ETAAS) and inductively-coupled plasma mass spectrometry (ICP-MS) are capable of analysing selenium in medical samples with detection limits in the range from 0.02 to 0.7 μg/l. Since in many cases less complicated and less expensive analytical techniques are required, TXRF has been tested regarding its suitability for selenium analysis in different medical, food basics and dietary supplement samples, applying very simple sample preparation techniques. The reported results indicate that the accurate analysis of selenium in all sample types is possible. The detection limits of TXRF are in the range from 7 to 12 μg/l for medical samples and 0.1 to 0.2 mg/kg for food basics and dietary supplements. Although this sensitivity is low compared to established techniques, it is sufficient for the physiological concentrations of selenium in the investigated samples.
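
    Detection limits like those quoted above are conventionally estimated from the scatter of blank measurements and the calibration sensitivity. A minimal sketch of the common 3-sigma convention follows; the blank readings and slope are invented for illustration, not the TXRF figures reported in this record.

```python
import statistics

# 3-sigma detection limit: LOD = 3 * s_blank / m, where s_blank is the
# standard deviation of repeated blank measurements and m is the
# calibration slope (signal per unit concentration).
blank_signals = [4.8, 5.1, 5.0, 4.9, 5.2, 5.0]  # hypothetical blank readings (counts)
slope = 1.5                                      # hypothetical sensitivity, counts per (ug/L)

s_blank = statistics.stdev(blank_signals)        # sample standard deviation of the blank
lod = 3 * s_blank / slope                        # smallest reliably detectable concentration

print(f"LOD approx {lod:.2f} ug/L")
```

    Under this convention, a more sensitive calibration (larger slope) or a quieter blank directly lowers the detection limit, which is why the abstract compares techniques on both counts.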

  16. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Casto, Gordon V.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.; et al.

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a toolbox format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  17. A Toolbox of Metrology-Based Techniques for Optical System Alignment

    NASA Technical Reports Server (NTRS)

    Coulter, Phillip; Ohl, Raymond G.; Blake, Peter N.; Bos, Brent J.; Eichhorn, William L.; Gum, Jeffrey S.; Hadjimichael, Theodore J.; Hagopian, John G.; Hayden, Joseph E.; Hetherington, Samuel E.; et al.

    2016-01-01

    The NASA Goddard Space Flight Center (GSFC) and its partners have broad experience in the alignment of flight optical instruments and spacecraft structures. Over decades, GSFC developed alignment capabilities and techniques for a variety of optical and aerospace applications. In this paper, we provide an overview of a subset of the capabilities and techniques used on several recent projects in a "toolbox" format. We discuss a range of applications, from small-scale optical alignment of sensors to mirror and bench examples that make use of various large-volume metrology techniques. We also discuss instruments and analytical tools.

  18. Introducing Chemometrics to the Analytical Curriculum: Combining Theory and Lab Experience

    ERIC Educational Resources Information Center

    Gilbert, Michael K.; Luttrell, Robert D.; Stout, David; Vogt, Frank

    2008-01-01

    Beer's law is an idealized relationship that holds only under certain conditions. A method for dealing with more complex conditions needs to be integrated into the analytical chemistry curriculum. For that reason, the capabilities and limitations of two common chemometric algorithms, classical least squares (CLS) and principal component regression (PCR),…
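
    Classical least squares, one of the two algorithms this record names, extends Beer's law to mixtures by treating a measured spectrum as a linear combination of pure-component spectra and solving for the concentrations. The sketch below uses an invented two-component system with synthetic, noise-free spectra; it illustrates the algorithm, not the authors' lab exercise.

```python
import numpy as np

# Classical least squares (CLS): for a mixture obeying Beer's law,
# a = K @ c, where a is the measured absorbance spectrum, the columns
# of K are the pure-component absorptivity spectra, and c holds the
# component concentrations. CLS recovers c by linear least squares.
rng = np.random.default_rng(0)
n_wavelengths = 50

# Hypothetical pure-component spectra for two analytes (columns of K).
K = np.abs(rng.normal(size=(n_wavelengths, 2)))

c_true = np.array([0.3, 0.7])   # true concentrations (arbitrary units)
a = K @ c_true                  # simulated noise-free mixture spectrum

# CLS estimate: solve min_c ||K c - a||^2.
c_est, *_ = np.linalg.lstsq(K, a, rcond=None)

print(c_est)
```

    With noise-free, strictly additive spectra the recovery is exact; the pedagogical point of the record is that real spectra violate these assumptions, which is where PCR and other chemometric methods come in.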

  19. Assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry

    USGS Publications Warehouse

    Taylor, Howard E.; Garbarino, John R.

    1988-01-01

    A thorough assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry was conducted for selected analytes of importance in water quality applications and hydrologic research. A multielement calibration curve technique was designed to produce accurate and precise results in analysis times of approximately one minute. The suite of elements included Al, As, B, Ba, Be, Cd, Co, Cr, Cu, Hg, Li, Mn, Mo, Ni, Pb, Se, Sr, V, and Zn. The effects of sample matrix composition on the accuracy of the determinations showed that matrix elements (such as Na, Ca, Mg, and K) that may be present in natural water samples at concentration levels greater than 50 mg/L resulted in as much as a 10% suppression in ion current for analyte elements. Operational detection limits are presented.
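
    The record above describes a multielement calibration-curve technique and a matrix-induced ion-current suppression of up to 10%. A single-element sketch with invented numbers shows how such a calibration is applied, and how uncorrected suppression propagates directly into the reported concentration.

```python
import numpy as np

# Hypothetical single-element ICP-MS calibration: detector counts versus
# standard concentration (ug/L), fit with a first-degree polynomial.
conc_std = np.array([0.0, 10.0, 50.0, 100.0])       # standards, ug/L
counts   = np.array([120.0, 5120.0, 25120.0, 50120.0])  # measured counts

slope, intercept = np.polyfit(conc_std, counts, 1)

# Quantify an unknown sample from its measured counts.
unknown_counts = 30120.0
conc_unknown = (unknown_counts - intercept) / slope

# A concentrated matrix (e.g., >50 mg/L of Na, Ca, Mg, or K, as in the
# record) suppressing the analyte ion current by 10% biases the apparent
# concentration low by roughly the same fraction.
suppressed_counts = 0.9 * unknown_counts
conc_biased = (suppressed_counts - intercept) / slope

print(conc_unknown, conc_biased)
```

    This is why matrix matching or internal standards are routine in ICP-MS water analysis: the calibration itself cannot distinguish a lower concentration from a suppressed signal.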

  20. Solid Lubrication Fundamentals and Applications. Chapter 2

    NASA Technical Reports Server (NTRS)

    Miyoshi, Kazuhisa

    1998-01-01

    This chapter describes powerful analytical techniques capable of sampling tribological surfaces and solid-film lubricants. Some of these techniques may also be used to determine the locus of failure in a bonded structure or coated substrate; such information is important when seeking improved adhesion between a solid-film lubricant and a substrate and when seeking improved performance and long life expectancy of solid lubricants. Many examples are given here and throughout the book on the nature and character of solid surfaces and their significance in lubrication, friction, and wear. The analytical techniques used include the latest spectroscopic methods.

  1. Modern Instrumental Methods in Forensic Toxicology*

    PubMed Central

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  2. Improvement of analytical capabilities of neutron activation analysis laboratory at the Colombian Geological Survey

    NASA Astrophysics Data System (ADS)

    Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.

    2016-07-01

    The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first round of 2015 of the Wepal proficiency test are presented. Only elements with radioactive isotopes with medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal based on Z-score distributions showed that most results achieved |Z-scores| ≤ 3.
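
    The Z-score assessment mentioned above standardizes each laboratory result against the assigned value and a proficiency standard deviation, z = (x_lab − x_assigned) / σ_p, with |z| ≤ 3 commonly treated as acceptable. A short sketch with hypothetical values (not the actual Wepal round data):

```python
def z_score(lab_value, assigned_value, sigma_p):
    """Standardized deviation of a lab result from the assigned value."""
    return (lab_value - assigned_value) / sigma_p

# Hypothetical soil results: (element, lab result, assigned value, sigma_p),
# concentrations in mg/kg. The Zn entry is deliberately out of control.
results = [
    ("As", 10.8, 10.0, 0.5),
    ("Fe", 39000.0, 40000.0, 1500.0),
    ("Zn", 130.0, 100.0, 8.0),
]

for element, lab, assigned, sigma in results:
    z = z_score(lab, assigned, sigma)
    verdict = "satisfactory" if abs(z) <= 3 else "action signal"
    print(f"{element}: z = {z:+.2f} ({verdict})")
```

    In many proficiency schemes |z| ≤ 2 is fully satisfactory, 2 < |z| < 3 is questionable, and |z| ≥ 3 triggers corrective action; the record reports its results against the |z| ≤ 3 boundary.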

  3. Older driver highway design handbook

    DOT National Transportation Integrated Search

    1998-01-01

    This project included literature reviews and research syntheses, using meta-analytic techniques where appropriate, in the areas of age-related (diminished) functional capabilities, and human factors and highway safety. A User-Requirements Analysi...

  4. Recombinant drugs-on-a-chip: The usage of capillary electrophoresis and trends in miniaturized systems - A review.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel

    2016-09-07

    We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Analytical Chemistry: A Literary Approach

    NASA Astrophysics Data System (ADS)

    Lucy, Charles A.

    2000-04-01

    The benefits of incorporating real-world examples of chemistry into lectures and lessons are reflected by the recent inclusion of the Teaching with Problems and Case Studies column in this Journal. However, these examples lie outside the experience of many students, and so much of the impact of "real-world" examples is lost. This paper provides an anthology of references to analytical chemistry techniques from history, popular fiction, and film. Such references are amusing to both instructor and student. Further, the fictional descriptions can serve as a focal point for discussions of a technique's true capabilities and limitations.

  6. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible are documented.

  7. Visual Analytics and Storytelling through Video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Perrine, Kenneth A.; Mackey, Patrick S.

    2005-10-31

    This paper supplements a video clip submitted to the Video Track of IEEE Symposium on Information Visualization 2005. The original video submission applies a two-way storytelling approach to demonstrate the visual analytics capabilities of a new visualization technique. The paper presents our video production philosophy, describes the plot of the video, explains the rationale behind the plot, and finally, shares our production experiences with our readers.

  8. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography-mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  9. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography-mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  10. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.

  11. Performance Analyses of Intercity Ground Passenger Transportation Systems

    DOT National Transportation Integrated Search

    1976-04-01

    This report documents the development of analytical techniques and their use for investigating the performance of intercity ground passenger transportation systems. The purpose of the study is twofold: (1) to provide a capability of evaluating new pa...

  12. Recognizing and Managing Complexity: Teaching Advanced Programming Concepts and Techniques Using the Zebra Puzzle

    ERIC Educational Resources Information Center

    Crabtree, John; Zhang, Xihui

    2015-01-01

    Teaching advanced programming can be a challenge, especially when the students are pursuing different majors with diverse analytical and problem-solving capabilities. The purpose of this paper is to explore the efficacy of using a particular problem as a vehicle for imparting a broad set of programming concepts and problem-solving techniques. We…

  13. Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi

    2016-10-01

    The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/mL. However, to quantify analytes that are related to various stages of tumors including early detection requires detecting well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/mL and the prostate specific antigen level of the early stage of prostate cancer is about 1 ng/mL. Further, it has been reported that there are significantly less than 1 pg/mL of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumors, it is required to quantify various levels of analytes ranging from ng/mL to pg/mL. To accommodate these critical needs in the current diagnosis, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes (

  14. Instrumentation development for drug detection on the breath

    DOT National Transportation Integrated Search

    1972-09-01

    Based on a survey of candidate analytical methods, mass spectrometry was identified as a promising technique for drug detection on the breath. To demonstrate its capabilities, an existing laboratory mass spectrometer was modified by the addition of a...

  15. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Feng; Liu, Yijin; Yu, Xiqian

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as one of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end.
We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.« less

  16. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE PAGES

    Lin, Feng; Liu, Yijin; Yu, Xiqian; ...

    2017-08-30

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as one of the most effective methods that allow for nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with the discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end. 
We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  17. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
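    The data-movement idea in the abstract above can be illustrated with a small sketch: a back-end diagnostic reduces a long model time series to a handful of per-bin summaries, so only those summaries (not the raw output) travel to the browser. This is a hedged illustration only; the function name and binning scheme are hypothetical and are not taken from the framework described in the record.

    ```python
    # Illustrative server-side reduction: a RESTful diagnostics endpoint might
    # return per-bin (min, mean, max) triples instead of the full time series.
    def multiscale_summary(series, bins):
        """Reduce a long series to roughly `bins` (min, mean, max) triples."""
        n = len(series)
        size = max(1, n // bins)  # points per bin
        out = []
        for start in range(0, n, size):
            chunk = series[start:start + size]
            out.append((min(chunk), sum(chunk) / len(chunk), max(chunk)))
        return out

    if __name__ == "__main__":
        data = [float(i % 100) for i in range(10_000)]
        summary = multiscale_summary(data, bins=50)
        print(len(summary))  # 50 summaries instead of 10,000 raw points
    ```

    A client can then request finer bins only for the interval it zooms into, which is the multi-scale navigation pattern the abstract describes.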

  18. Creating Synthetic Coronal Observational Data From MHD Models: The Forward Technique

    NASA Technical Reports Server (NTRS)

    Rachmeler, Laurel A.; Gibson, Sarah E.; Dove, James; Kucera, Therese Ann

    2010-01-01

    We present a generalized forward code for creating simulated coronal observables off the limb from numerical and analytical MHD models. This generalized forward model is capable of creating emission maps in various wavelengths for instruments such as SXT, EIT, EIS, and coronagraphs, as well as spectropolarimetric images and line profiles. The inputs to our code can be analytic models (of which four come with the code) or 2.5D and 3D numerical datacubes. We present some examples of the observable data created with our code as well as its functional capabilities. This code is currently available for beta-testing (contact authors), with the ultimate goal of release as a SolarSoft package.

  19. Bioanalytical Applications of Fluorescence Line-Narrowing and Non-Line-Narrowing Spectroscopy Interfaced with Capillary Electrophoresis and High-Performance Liquid Chromatography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Kenneth Paul

    Capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) are widely used analytical separation techniques with many applications in chemical, biochemical, and biomedical sciences. Conventional analyte identification in these techniques is based on retention/migration times of standards, requiring a high degree of reproducibility, availability of reliable standards, and absence of coelution. From this, several new information-rich detection methods (also known as hyphenated techniques) are being explored that would be capable of providing unambiguous on-line identification of separating analytes in CE and HPLC. As further discussed, a number of such on-line detection methods have shown considerable success, including Raman, nuclear magnetic resonance (NMR), mass spectrometry (MS), and fluorescence line-narrowing spectroscopy (FLNS). In this thesis, the feasibility and potential of combining the highly sensitive and selective laser-based detection method of FLNS with analytical separation techniques are discussed and presented. A summary of previously demonstrated FLNS detection interfaced with chromatography and electrophoresis is given, and recent results from on-line FLNS detection in CE (CE-FLNS), and the new combination of HPLC-FLNS, are shown.

  20. EVALUATION OF COMPUTER-CONTROLLED SCANNING ELECTRON MICROSCOPY APPLIED TO AN AMBIENT URBAN AEROSOL SAMPLE

    EPA Science Inventory

    Concerns about the environmental and public health effects of particulate matter (PM) have stimulated interest in analytical techniques capable of measuring the size and chemical composition of individual aerosol particles. Computer-controlled scanning electron microscopy (CCSE...

  1. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. 
Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.

  2. Analytical reverse time migration: An innovation in imaging of infrastructures using ultrasonic shear waves.

    PubMed

    Asadollahi, Aziz; Khazanovich, Lev

    2018-04-11

    The emergence of ultrasonic dry point contact (DPC) transducers that emit horizontal shear waves has enabled efficient collection of high-quality data in the context of a nondestructive evaluation of concrete structures. This offers an opportunity to improve the quality of evaluation by adapting advanced imaging techniques. Reverse time migration (RTM) is a simulation-based reconstruction technique that offers advantages over conventional methods, such as the synthetic aperture focusing technique. RTM is capable of imaging boundaries and interfaces with steep slopes and the bottom boundaries of inclusions and defects. However, this imaging technique requires a massive amount of memory and its computation cost is high. In this study, both bottlenecks of the RTM are resolved when shear transducers are used for data acquisition. An analytical approach was developed to obtain the source and receiver wavefields needed for imaging using reverse time migration. It is shown that the proposed analytical approach not only eliminates the high memory demand, but also drastically reduces the computation time from days to minutes. Copyright © 2018 Elsevier B.V. All rights reserved.
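    The core of the RTM method referenced above is the zero-lag cross-correlation imaging condition, I(x) = Σ_t S(x, t) R(x, t), applied to source and receiver wavefields. The sketch below shows only that final correlation step with placeholder arrays; the paper's actual contribution, obtaining the shear-wave wavefields analytically instead of simulating and storing them, is not reproduced here.

    ```python
    import numpy as np

    def rtm_image(source_wf, receiver_wf):
        """Zero-lag cross-correlation imaging condition.

        Both wavefields are shaped (nt, nx, nz): time steps by spatial grid.
        The image is the sum over time of the pointwise product of the two
        wavefields at each grid location.
        """
        assert source_wf.shape == receiver_wf.shape
        return np.einsum("txz,txz->xz", source_wf, receiver_wf)
    ```

    The memory bottleneck the abstract mentions arises because a numerical RTM must hold (or checkpoint) the full (nt, nx, nz) source wavefield; evaluating it analytically at each needed (x, t) removes that storage entirely.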

  3. Ultra-sensitive fluorescent imaging-biosensing using biological photonic crystals

    NASA Astrophysics Data System (ADS)

    Squire, Kenny; Kong, Xianming; Wu, Bo; Rorrer, Gregory; Wang, Alan X.

    2018-02-01

    Optical biosensing is a growing area of research known for its low limits of detection. Among optical sensing techniques, fluorescence detection is among the most established and prevalent. Fluorescence imaging is an optical biosensing modality that exploits the sensitivity of fluorescence in an easy-to-use process. Fluorescence imaging allows a user to place a sample on a sensor and use an imager, such as a camera, to collect the results. The image can then be processed to determine the presence of the analyte. Fluorescence imaging is appealing because it can be performed with as little as a light source, a camera, and a data processor, making it ideal for untrained personnel without any expensive equipment. Fluorescence imaging sensors generally employ an immunoassay procedure to selectively trap analytes such as antigens or antibodies. When the analyte is present, the sensor fluoresces, transducing the chemical reaction into an optical signal that can be imaged. Enhancement of this fluorescence leads to an enhancement in the detection capabilities of the sensor. Diatoms are unicellular algae with a biosilica shell called a frustule. The frustule is porous with periodic nanopores, making diatoms biological photonic crystals. Additionally, the porous nature of the frustule provides a large surface area capable of supporting multiple analyte binding sites. In this paper, we fabricate a diatom-based ultra-sensitive fluorescence imaging biosensor capable of detecting the antibody mouse immunoglobulin down to a concentration of 1 nM. The measured signal shows an enhancement of 6× when compared to sensors fabricated without diatoms.

  4. Synthesis of Human Factors Research on Older Drivers and Highway Safety. Volume I: Older Driver Research Synthesis

    DOT National Transportation Integrated Search

    1997-10-01

    The overall goals in this project were to perform literature reviews and syntheses, using meta-analytic techniques, where appropriate, for a broad and comprehensive body of research findings on older driver needs and (diminished) capabilities, and a ...

  5. SYNTHESIS OF HUMAN FACTORS RESEARCH ON OLDER DRIVERS AND HIGHWAY SAFETY, Volume I: Older Driver Research Synthesis

    DOT National Transportation Integrated Search

    1999-11-23

    The overall goals in this project were to perform literature reviews and syntheses, using meta-analytic techniques, where appropriate, for a broad and comprehensive body of research findings on older driver needs and (diminished) capabilities, and a ...

  6. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources making it easier to adapt the framework for many different data repositories.

  7. Challenges in Modern Anti-Doping Analytical Science.

    PubMed

    Ayotte, Christiane; Miller, John; Thevis, Mario

    2017-01-01

    The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may result in the future in a change in the strategy applied by the World Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.

  8. Sensor Data Qualification System (SDQS) Implementation Study

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Melcher, Kevin; Fulton, Christopher; Maul, William

    2009-01-01

    The Sensor Data Qualification System (SDQS) is being developed to provide a sensor fault detection capability for NASA's next-generation launch vehicles. In addition to traditional data qualification techniques (such as limit checks, rate-of-change checks and hardware redundancy checks), SDQS can provide augmented capability through additional techniques that exploit analytical redundancy relationships to enable faster and more sensitive sensor fault detection. This paper documents the results of a study that was conducted to determine the best approach for implementing a SDQS network configuration that spans multiple subsystems, similar to those that may be implemented on future vehicles. The best approach is defined as one that most minimizes computational resource requirements without impacting the detection of sensor failures.
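    As a rough illustration of the checks named in the abstract, the sketch below combines a limit check, a rate-of-change check, and an analytical-redundancy residual (comparing a reading against a value predicted from other sensors). All thresholds, names, and the prediction itself are invented for the example; this is not SDQS code.

    ```python
    def qualify(sample, prev_sample, dt, lo, hi, max_rate, predicted, tol):
        """Return the list of qualification checks this sample fails.

        lo/hi       : static limit bounds
        max_rate    : maximum allowed |d(sample)/dt|
        predicted   : value expected from an analytical redundancy relation
        tol         : allowed residual against that prediction
        """
        flags = []
        if not (lo <= sample <= hi):
            flags.append("limit")
        if abs(sample - prev_sample) / dt > max_rate:
            flags.append("rate")
        # Analytical redundancy: a physics-based estimate from other sensors
        # can flag a fault before the signal ever leaves its static limits.
        if abs(sample - predicted) > tol:
            flags.append("analytical")
        return flags
    ```

    The point of the analytical check is visible in the signature: it can trip on a drifting sensor whose value is still inside [lo, hi], which is why the abstract calls it faster and more sensitive than limit checks alone.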

  9. Restructuring the rotor analysis program C-60

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The continuing evolution of the rotary wing industry demands increasing analytical capabilities. To keep up with this demand, software must be structured to accommodate change. The approach discussed for meeting this demand is to restructure an existing analysis. The motivational factors, basic principles, application techniques, and practical lessons from experience with this restructuring effort are reviewed.

  10. A preliminary study of air-pollution measurement by active remote-sensing techniques

    NASA Technical Reports Server (NTRS)

    Wright, M. L.; Proctor, E. K.; Gasiorek, L. S.; Liston, E. M.

    1975-01-01

    Air pollutants are identified, and the needs for their measurement from satellites and aircraft are discussed. An assessment is made of the properties of these pollutants and of the normal atmosphere, including interactions with light of various wavelengths and the resulting effects on transmission and scattering of optical signals. The possible methods for active remote measurement are described; the relative performance capabilities of double-ended and single-ended systems are compared qualitatively; and the capabilities of the several single-ended or backscattering techniques are compared quantitatively. The differential-absorption lidar (DIAL) technique is shown to be superior to the other backscattering techniques. The lidar system parameters and their relationships to the environmental factors and the properties of pollutants are examined in detail. A computer program that models both the atmosphere (including pollutants) and the lidar system is described. The performance capabilities of present and future lidar components are assessed, and projections are made of prospective measurement capabilities for future lidar systems. Following a discussion of some important operational factors that affect both the design and measurement capabilities of airborne and satellite-based lidar systems, the extensive analytical results obtained through more than 1000 individual cases analyzed with the aid of the computer program are summarized and discussed. The conclusions are presented. Recommendations are also made for additional studies to investigate cases that could not be explored adequately during this study.
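    The DIAL technique singled out above retrieves a mean concentration from the ratio of backscattered power at an absorbed ("on-line") wavelength and a nearby non-absorbed ("off-line") wavelength. A minimal form of the standard two-wavelength DIAL relation, N = ln(P_off / P_on) / (2 Δσ L), is sketched below; the numbers in the test are arbitrary, and real systems evaluate this per range cell from lidar return ratios.

    ```python
    import math

    def dial_number_density(p_on, p_off, delta_sigma, path_length):
        """Mean absorber number density over a path from on/off-line returns.

        p_on, p_off  : received powers at the absorbed and reference wavelengths
        delta_sigma  : differential absorption cross section (on minus off line)
        path_length  : length of the range cell (factor 2 is for the round trip)
        """
        return math.log(p_off / p_on) / (2.0 * delta_sigma * path_length)
    ```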

  11. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  12. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    PubMed

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were thereby divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20 %) to be reliably used down to approx. 100 ng L(-1), enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice among the three instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs. Individual methods within each class showed smaller deviations, and the smallest influences were observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties such as high polarity or the capability of specific molecular interactions. Graphical Abstract PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
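    For context on the MDL figures quoted above, one widely used definition (the EPA 40 CFR 136 Appendix B style, not necessarily the one applied in this particular paper) computes MDL as the one-tailed 99% Student's t value times the standard deviation of n replicate low-level spiked measurements. The replicate values in the test are invented.

    ```python
    import statistics

    # One-tailed 99% Student's t for df = n - 1, keyed by replicate count n
    T_99 = {7: 3.143, 8: 2.998, 9: 2.896}

    def method_detection_limit(replicates):
        """MDL = t(n-1, 0.99) * s over n replicate spiked measurements."""
        return T_99[len(replicates)] * statistics.stdev(replicates)
    ```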

  13. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand of analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider the nanoparticles as a new sort of analytes, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as element specific detector. Emerging techniques based on the detection of single nanoparticles by using ICP-MS, but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. 
This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Intracavity optogalvanic spectroscopy. An analytical technique for 14C analysis with subattomole sensitivity.

    PubMed

    Murnick, Daniel E; Dogru, Ozgur; Ilkmen, Erhan

    2008-07-01

    We show a new ultrasensitive laser-based analytical technique, intracavity optogalvanic spectroscopy, allowing extremely high sensitivity for detection of (14)C-labeled carbon dioxide. Capable of replacing large accelerator mass spectrometers, the technique quantifies attomoles of (14)C in submicrogram samples. Based on the specificity of narrow laser resonances coupled with the sensitivity provided by standing waves in an optical cavity and detection via impedance variations, limits of detection near 10(-15) (14)C/(12)C ratios are obtained. Using a 15-W (14)CO2 laser, a linear calibration with samples from 10(-15) to >1.5 x 10(-12) in (14)C/(12)C ratios, as determined by accelerator mass spectrometry, is demonstrated. Possible applications include microdosing studies in drug development, individualized subtherapeutic tests of drug metabolism, carbon dating and real time monitoring of atmospheric radiocarbon. The method can also be applied to detection of other trace entities.
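    The linear calibration mentioned above (instrument response versus AMS-determined 14C/12C ratio) amounts to an ordinary least-squares line fit, which can then be inverted to estimate an unknown sample's ratio. The sketch below uses a mean-centered fit and fabricated placeholder data; it is not the authors' calibration procedure.

    ```python
    import numpy as np

    def fit_calibration(ratios, responses):
        """Least-squares line: response = slope * ratio + intercept.

        Mean-centering keeps the fit numerically stable even though the
        ratios span many orders of magnitude below 1.
        """
        x = np.asarray(ratios, dtype=float)
        y = np.asarray(responses, dtype=float)
        xm, ym = x.mean(), y.mean()
        slope = ((x - xm) * (y - ym)).sum() / ((x - xm) ** 2).sum()
        intercept = ym - slope * xm
        return slope, intercept

    def ratio_from_response(response, slope, intercept):
        """Invert the calibration to estimate an unknown sample's ratio."""
        return (response - intercept) / slope
    ```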

  15. Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) to Defence Transformation

    DTIC Science & Technology

    2005-04-01

    RTO-MP-SAS-055, UNCLASSIFIED/UNLIMITED. Analytical Support Capabilities of Turkish General Staff Scientific... the end failed to achieve anything commensurate with the effort. The analytical support capabilities of Turkish Scientific Decision Support Center to... percent of the İpekkan, Z.; Özkil, A. (2005) Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) to

  16. Ionic liquids in solid-phase microextraction: a review.

    PubMed

    Ho, Tien D; Canestraro, Anthony J; Anderson, Jared L

    2011-06-10

    Solid-phase microextraction (SPME) has undergone a surge in popularity within the field of analytical chemistry in the past two decades since its introduction. Owing to its nature of extraction, SPME has become widely known as a quick and cost-effective sample preparation technique. Although SPME has demonstrated extraordinary versatility in sampling capabilities, the technique continues to experience a tremendous growth in innovation. Presently, increasing efforts have been directed towards the engineering of novel sorbent material in order to expand the applicability of SPME for a wider range of analytes and matrices. This review highlights the application of ionic liquids (ILs) and polymeric ionic liquids (PILs) as innovative sorbent materials for SPME. Characterized by their unique physico-chemical properties, these compounds can be structurally-designed to selectively extract target analytes based on unique molecular interactions. To examine the advantages of IL and PIL-based sorbent coatings in SPME, the field is reviewed by gathering available experimental data and exploring the sensitivity, linear calibration range, as well as detection limits for a variety of target analytes in the methods that have been developed. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Model-Based Data Integration and Process Standardization Techniques for Fault Management: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig

    2018-01-01

    This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.

  18. Biosensors and their applications in detection of organophosphorus pesticides in the environment.

    PubMed

    Hassani, Shokoufeh; Momtaz, Saeideh; Vakhshiteh, Faezeh; Maghsoudi, Armin Salek; Ganjali, Mohammad Reza; Norouzi, Parviz; Abdollahi, Mohammad

    2017-01-01

    This review discusses the past and recent advancements of biosensors focusing on detection of organophosphorus pesticides (OPs) due to their exceptional use during the last decades. Apart from agricultural benefits, OPs also impose adverse toxicological effects on animal and human populations. Conventional approaches such as chromatographic techniques used for pesticide detection are associated with several limitations. Biosensor technology is unique due to its detection sensitivity, selectivity, remarkable performance capabilities, simplicity and on-site operation, fabrication, and incorporation with nanomaterials. This study also provides specifications of most OP biosensors reported to date, classified by their transducer system. In addition, we highlight the application of advanced complementary materials and analysis techniques in OP detection systems. The availability of these new materials, combined with new sensing techniques, has led to the introduction of easy-to-use analytical tools of high sensitivity and specificity in the design and construction of OP biosensors. In this review, we elaborate the achievements in sensing systems concerning innovative nanomaterials and analytical techniques with emphasis on OPs.

  19. Probing Pharmaceutical Mixtures during Milling: The Potency of Low-Frequency Raman Spectroscopy in Identifying Disorder.

    PubMed

    Walker, Greg; Römann, Philipp; Poller, Bettina; Löbmann, Korbinian; Grohganz, Holger; Rooney, Jeremy S; Huff, Gregory S; Smith, Geoffrey P S; Rades, Thomas; Gordon, Keith C; Strachan, Clare J; Fraser-Miller, Sara J

    2017-12-04

    This study uses a multimodal analytical approach to evaluate the rates of (co)amorphization of milled drug and excipient and the effectiveness of different analytical methods in detecting these changes. Indomethacin and tryptophan were the model substances, and the analytical methods included low-frequency Raman spectroscopy (a 785 nm excitation system capable of measuring both the low-frequency (10 to 250 cm-1) and midfrequency (450 to 1800 cm-1) regimes, and an 830 nm system (5 to 250 cm-1)), conventional (200-3000 cm-1) Raman spectroscopy, Fourier transform infrared spectroscopy (FTIR), and X-ray powder diffraction (XRPD). The kinetics of amorphization were found to be faster for the mixture, and indeed, for indomethacin, only partial amorphization occurred (after 360 min of milling). Each technique was capable of identifying the transformations, but some, such as low-frequency Raman spectroscopy and XRPD, provided less ambiguous signatures than the midvibrational frequency techniques (conventional Raman and FTIR). The low-frequency Raman spectra showed intense phonon mode bands for the crystalline and cocrystalline samples that could be used as a sensitive probe of order. Multivariate analysis has been used to further interpret the spectral changes. Overall, this study demonstrates the potential of low-frequency Raman spectroscopy, which has several practical advantages over XRPD, for probing (dis-)order during pharmaceutical processing, showcasing its potential for future development and implementation as an in-line process monitoring method.

  20. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    NASA Astrophysics Data System (ADS)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data Analytics is a major topic nowadays. As data generation becomes more demanding and scalable, data acquisition and storage become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend towards "big data-as-a-service" is now discussed everywhere. On one hand, cloud-based big data analytics directly addresses ongoing issues of scale, speed, and cost; on the other, researchers are still working to solve security and other real-time problems of migrating big data to cloud-based platforms. This article focuses on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration, and the possibility of performing big data analytics on a cloud platform, is in demand for a new era of growth. This article also surveys the available technologies and techniques for migrating big data to the cloud.

  1. Propeller flow visualization techniques

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Paulovich, F. J.; Greissing, J. P.; Walker, E. D.

    1982-01-01

    Propeller flow visualization techniques were tested. The actual operating blade shape, which determines actual propeller performance and noise, was established. The ability to photographically determine advanced propeller blade tip deflections and local flow field conditions, and to gain insight into aeroelastic instability, is demonstrated. The analytical prediction methods being developed can be compared with experimental data. These comparisons contribute to the verification of these improved methods and give improved capability for designing future advanced propellers with enhanced performance and noise characteristics.

  2. A close-range photogrammetric technique for mapping neotectonic features in trenches

    USGS Publications Warehouse

    Fairer, G.M.; Whitney, J.W.; Coe, J.A.

    1989-01-01

    Close-range photogrammetric techniques and newly available computerized plotting equipment were used to map exploratory trench walls that expose Quaternary faults in the vicinity of Yucca Mountain, Nevada. Small-scale structural, lithologic, and stratigraphic features can be rapidly mapped by the photogrammetric method. This method is more accurate and significantly more rapid than conventional trench-mapping methods, and the analytical plotter is capable of producing cartographic definition of high resolution when detailed trench maps are necessary. -from Authors

  3. Reusable biocompatible interface for immobilization of materials on a solid support

    DOEpatents

    Salamon, Zdzislaw; Schmidt, Richard A.; Tollin, Gordon; Macleod, H. Angus

    1996-01-01

    A method for the formation of a biocompatible film composed of a self-assembled bilayer membrane deposited on a planar surface. This bilayer membrane is capable of immobilizing materials to be analyzed in an environment very similar to their native state. Materials so immobilized may be subjected to any of a number of analytical techniques.

  4. Users guide for EASI graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sasser, D.W.

    1978-03-01

    EASI (Estimate of Adversary Sequence Interruption) is an analytical technique for measuring the effectiveness of physical protection systems. EASI Graphics is a computer graphics extension of EASI which provides a capability for performing sensitivity and trade-off analyses of the parameters of a physical protection system. This document reports on the implementation of EASI Graphics and illustrates its application with some examples.

  5. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.

    We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army's acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
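    At its core, choosing programs to maximize capability under a budget cap is a knapsack-style selection problem. The sketch below is a deliberately tiny stand-in for CPAT's multiphase mixed-integer linear program: the program names, costs, and capability scores are invented, and an exhaustive subset search replaces the MILP solver.

```python
from itertools import combinations

# Hypothetical portfolio: (program, cost in $B, capability score).
# These numbers are illustrative only, not CPAT data.
programs = [("VehicleA", 4, 7), ("VehicleB", 3, 5),
            ("UpgradeC", 2, 4), ("SensorD", 1, 2), ("NetworkE", 5, 8)]
budget = 8

# Exhaustive search over all subsets -- fine for a toy portfolio; the real
# multi-year, multi-phase problem requires a MILP formulation and solver.
best_value, best_set = 0, ()
for r in range(len(programs) + 1):
    for combo in combinations(programs, r):
        cost = sum(c for _, c, _ in combo)
        value = sum(v for _, _, v in combo)
        if cost <= budget and value > best_value:
            best_value, best_set = value, combo
```

The same objective-under-constraints structure carries over to the full model, with binary start/stop decisions per program per year instead of a single in-or-out choice.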

  6. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE PAGES

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.; ...

    2016-02-01

    We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army's acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.

  7. Advanced analytical electron microscopy for alkali-ion batteries

    DOE PAGES

    Qian, Danna; Ma, Cheng; Meng, Ying Shirley; ...

    2015-06-26

    Lithium-ion batteries are a leading candidate for electric vehicle and smart grid applications. However, further optimizations of the energy/power density, coulombic efficiency and cycle life are still needed, and this requires a thorough understanding of the dynamic evolution of each component and their synergistic behaviors during battery operation. With the capability of resolving the structure and chemistry at an atomic resolution, advanced analytical transmission electron microscopy (AEM) is an ideal technique for this task. The present review paper focuses on recent contributions of this important technique to the fundamental understanding of the electrochemical processes of battery materials. A detailed review of both static (ex situ) and real-time (in situ) studies will be given, and issues that still need to be addressed will be discussed.

  8. Some new features of Direct Analysis in Real Time mass spectrometry utilizing the desorption at an angle option.

    PubMed

    Chernetsova, Elena S; Revelsky, Alexander I; Morlock, Gertrud E

    2011-08-30

    The present study is a first step towards the unexplored capabilities of Direct Analysis in Real Time (DART) mass spectrometry (MS) arising from the possibility of desorption at an angle: scanning analysis of surfaces, including the coupling of thin-layer chromatography (TLC) with DART-MS, and more sensitive analysis through preliminary concentration of analytes dissolved in large volumes of liquid on glass surfaces. In order to select the most favorable conditions for DART-MS analysis, proper positioning of samples is important. Therefore, a simple and cheap technique for visualizing the impact region of the DART gas stream on a substrate was developed. A filter paper or TLC plate, previously loaded with the analyte, was immersed in a derivatization solution. On this substrate, owing to the impact of the hot DART gas, the analyte reacted to form a colored product. An improved detection capability of DART-MS for the analysis of liquids was demonstrated by applying large volumes of model solutions of coumaphos into small glass vessels and drying these solutions prior to DART-MS analysis under ambient conditions. This allowed quantities of analyte larger by up to more than two orders of magnitude to be introduced, compared with the conventional DART-MS analysis of liquids. Through this improved detectability, the capabilities of DART-MS in trace analysis could be strengthened. Copyright © 2011 John Wiley & Sons, Ltd.

  9. The analytical representation of viscoelastic material properties using optimization techniques

    NASA Technical Reports Server (NTRS)

    Hill, S. A.

    1993-01-01

    This report presents a technique to model viscoelastic material properties with a function of the form of the Prony series. Generally, the method employed to determine the function constants requires assuming values for the exponential constants of the function and then resolving the remaining constants through linear least-squares techniques. The technique presented here allows all the constants to be determined analytically through optimization techniques. This technique is employed in a computer program named PRONY and makes use of a commercially available optimization tool developed by VMA Engineering, Inc. The PRONY program was used to compare the technique against previously determined models for solid rocket motor TP-H1148 propellant and V747-75 Viton fluoroelastomer. In both cases, the optimization technique generated functions that modeled the test data with at least an order of magnitude better correlation. This technique has demonstrated the capability to use small or large data sets and data sets with uniformly or nonuniformly spaced data pairs. The reduction of experimental data to accurate mathematical models is a vital part of most scientific and engineering research. This technique of regression through optimization can be applied to other mathematical models that are difficult to fit to experimental data through traditional regression techniques.
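    The two-stage idea can be sketched as follows: with the exponential time constants fixed, the remaining Prony constants enter linearly and follow from least squares; an outer optimization then searches over the time constants themselves. The synthetic data and the coarse log-spaced grid search below are illustrative stand-ins for the gradient-based optimizer the PRONY program uses.

```python
import numpy as np

# Synthetic relaxation data from a known two-term Prony series:
# G(t) = G_inf + g1*exp(-t/tau1) + g2*exp(-t/tau2)
t = np.linspace(0.01, 100.0, 400)
G = 1.0 + 2.0 * np.exp(-t / 0.5) + 1.5 * np.exp(-t / 20.0)

def fit_moduli(t, G, taus):
    """With the relaxation times fixed, G_inf and the moduli g_i enter
    linearly, so they follow from ordinary least squares."""
    A = np.column_stack([np.ones_like(t)] + [np.exp(-t / tau) for tau in taus])
    coeffs, *_ = np.linalg.lstsq(A, G, rcond=None)
    return coeffs, np.linalg.norm(A @ coeffs - G)

# Outer optimization over the exponential constants: here a simple
# log-spaced grid search stands in for a gradient-based optimizer.
best = (np.inf, None, None)
for tau1 in np.logspace(-1, 2, 31):
    for tau2 in np.logspace(-1, 2, 31):
        if tau2 <= tau1:
            continue
        coeffs, resid = fit_moduli(t, G, [tau1, tau2])
        if resid < best[0]:
            best = (resid, (tau1, tau2), coeffs)

resid, taus, coeffs = best  # recovers (G_inf, g1, g2) and (tau1, tau2)
```

Because the inner problem is linear, each candidate pair of time constants is cheap to evaluate, which is what makes optimizing over all the constants tractable.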

  10. Binary Oscillatory Crossflow Electrophoresis

    NASA Technical Reports Server (NTRS)

    Molloy, Richard F.; Gallagher, Christopher T.; Leighton, David T., Jr.

    1996-01-01

    We present preliminary results of our implementation of a novel electrophoresis separation technique: Binary Oscillatory Crossflow Electrophoresis (BOCE). The technique utilizes the interaction of two driving forces, an oscillatory electric field and an oscillatory shear flow, to create an active binary filter for the separation of charged species. Analytical and numerical studies have indicated that this technique is capable of separating proteins with electrophoretic mobilities differing by less than 10%. With an experimental device containing a separation chamber 20 cm long, 5 cm wide, and 1 mm thick, an order of magnitude increase in throughput over commercially available electrophoresis devices is theoretically possible.

  11. Overview: MURI Center on spectroscopic and time domain detection of trace explosives in condensed and vapor phases

    NASA Astrophysics Data System (ADS)

    Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay

    2003-09-01

    The research center established by the Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigate and advance the use of complementary analytical techniques for sensing of explosives and/or explosive-related compounds as they occur in the environment. The techniques being investigated include Terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring Down Spectroscopy (CRDS) and Resonance Enhanced Multiphoton Ionization (REMPI). This suite of techniques encompasses a diversity of sensing approaches that can be applied to detection of explosives in condensed phases, such as adsorbed species in soil, or can be used for vapor phase detection above the source. Some techniques allow for remote detection while others have highly specific and sensitive analysis capabilities. This program is addressing a range of fundamental, technical issues associated with trace detection of explosive-related compounds using these techniques. For example, while both LIBS and THz can be used to carry out remote analysis of condensed-phase analyte from a distance in excess of several meters, the sensitivities of these techniques to surface-adsorbed explosive-related compounds are not currently known. In current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances required for these techniques, including signature identification for explosive-related compounds/interferents and trace analyte extraction. Later program tasks will explore simultaneous application of two or more techniques to assess the benefits of sensor fusion.

  12. UltraSensitive Mycotoxin Detection by STING Sensors

    PubMed Central

    Actis, Paolo; Jejelowo, Olufisayo; Pourmand, Nader

    2010-01-01

    Signal Transduction by Ion Nano Gating (STING) technology is a label-free biosensor capable of identifying DNA and proteins. Based on a functionalized quartz nanopipette, the STING sensor includes specific recognition elements for analyte discrimination based on size, shape and charge density. A key feature of this technology is that it doesn't require any nanofabrication facility; each nanopipette can be easily, reproducibly, and inexpensively fabricated and tailored at the bench, thus reducing the cost and the turnaround time. Here, we show that STING sensors are capable of the ultrasensitive detection of HT-2 toxin with a detection limit of 100 fg/ml and compare the STING capabilities with respect to conventional sandwich assay techniques. PMID:20829024

  13. Historical review of missile aerodynamic developments

    NASA Technical Reports Server (NTRS)

    Spearman, M. Leroy

    1989-01-01

    A comprehensive development history to about 1970 is presented for missile technologies and their associated capabilities and difficulties. Attention is given to the growth of an experimental data base for missile design, as well as to the critical early efforts to develop analytical methods applicable to missiles. Most of the important missile development efforts made during the period from the end of the Second World War to the early 1960s were based primarily on experiences gained through wind tunnel and flight testing; analytical techniques began to demonstrate their usefulness in the design process only in the late 1960s.

  14. Observability during planetary approach navigation

    NASA Technical Reports Server (NTRS)

    Bishop, Robert H.; Burkhart, P. Daniel; Thurman, Sam W.

    1993-01-01

    The objective of the research is to develop an analytic technique to predict the relative navigation capability of different Earth-based radio navigation measurements. In particular, the problem is to determine the relative ability of geocentric range and Doppler measurements to detect the effects of the target planet gravitational attraction on the spacecraft during the planetary approach and near-encounter mission phases. A complete solution to the two-dimensional problem has been developed. Relatively simple analytic formulas are obtained for range and Doppler measurements which describe the observability content of the measurement data along the approach trajectories. An observability measure is defined which is based on the observability matrix for nonlinear systems. The results show good agreement between the analytic observability analysis and the computational batch processing method.
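    A linear toy problem shows the flavor of such an observability analysis. For a double-integrator state x = (position, velocity), the rank of the observability matrix [C; CA] reveals that a position (range-like) measurement observes the full state, while a velocity (Doppler-like) measurement alone leaves position unobservable. This is only a schematic linear analogue of the paper's nonlinear, gravity-perturbed setting, not its actual formulation.

```python
import numpy as np

# Double-integrator dynamics: x = (position, velocity), x' = A x.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])

def observability_rank(C):
    """Rank of the linear observability matrix [C; CA] for this 2-state system."""
    O = np.vstack([C, C @ A])
    return np.linalg.matrix_rank(O)

range_like = np.array([[1.0, 0.0]])    # measures position
doppler_like = np.array([[0.0, 1.0]])  # measures velocity only

rank_range = observability_rank(range_like)      # full rank: state observable
rank_doppler = observability_rank(doppler_like)  # rank deficient: position hidden
```

In the planetary-approach problem the same question is asked of the linearized, gravity-perturbed dynamics, where the target body's attraction couples the measurements to additional state components.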

  15. Advanced Elemental and Isotopic Characterization of Atmospheric Aerosols

    NASA Astrophysics Data System (ADS)

    Shafer, M. M.; Schauer, J. J.; Park, J.

    2001-12-01

    Recent sampling and analytical developments advanced by the project team enable the detailed elemental and isotopic fingerprinting of extremely small masses of atmospheric aerosols. Historically, this type of characterization was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. However, with the introduction of 3rd and 4th generation ICP-MS instrumentation and the application of state-of-the-art "clean techniques", quantitative analysis of over 40 elements in sub-milligram samples can be realized. When coupled with an efficient and validated solubilization method, ICP-MS approaches provide distinct advantages in comparison with traditional methods; greatly enhanced detection limits, improved accuracy, and isotope resolution capability, to name a few. Importantly, the ICP-MS approach can readily be integrated with techniques which enable phase differentiation and chemical speciation information to be acquired. For example, selective chemical leaching can provide data on the association of metals with major phase-components, and oxidation state of certain metals. Critical information on metal-ligand stability can be obtained when electrochemical techniques, such as adsorptive cathodic stripping voltammetry (ACSV), are applied to these same extracts. Our research group is applying these techniques in a broad range of research projects to better understand the sources and distribution of trace metals in particulate matter in the atmosphere. Using examples from our research, including recent Pb and Sr isotope ratio work on Asian aerosols, we will illustrate the capabilities and applications of these new methods.

  16. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. The numerous factors that can potentially degrade system reliability, and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments, are described. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  17. Evaluation of capillary electrophoresis for in-flight ionic contaminant monitoring of SSF potable water

    NASA Technical Reports Server (NTRS)

    Mudgett, Paul D.; Schultz, John R.; Sauer, Richard L.

    1992-01-01

    Until 1989, ion chromatography (IC) was the baseline technology selected for the Specific Ion Analyzer, an in-flight inorganic water quality monitor being designed for Space Station Freedom. Recent developments in capillary electrophoresis (CE) may offer significant savings of consumables, power consumption, and weight/volume allocation relative to IC technology. A thorough evaluation of CE's analytical capability, however, is necessary before one of the two techniques is chosen. Unfortunately, analytical methods currently available for inorganic CE are unproven for NASA's target list of anions and cations. Thus, CE electrolyte chemistry and methods to measure the target contaminants must first be identified and optimized. This paper reports the status of a study to evaluate CE's capability with regard to inorganic and carboxylate anions, alkali and alkaline earth cations, and transition metal cations. Preliminary results indicate that CE has impressive selectivity and trace sensitivity, although considerable methods development remains to be performed.

  18. Hybrid perturbation methods based on statistical time series models

    NASA Astrophysics Data System (ADS)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies, derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered; moreover, mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the dynamics missing from the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
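    A minimal additive Holt-Winters smoother, of the kind used here as the prediction component, can be sketched as follows. The series is a synthetic drift-plus-periodic signal standing in for the residual error of a low-order analytical propagator, and the smoothing constants are arbitrary choices for illustration, not the paper's fitted values.

```python
import math

def holt_winters_additive(y, m, alpha=0.3, beta=0.05, gamma=0.2, horizon=12):
    """Additive Holt-Winters smoothing of series y with period m; returns
    `horizon` forecast steps beyond the end of y."""
    # Simple initialization: level = mean of the first season, zero trend,
    # seasonal indices = deviations of the first season from that mean.
    level = sum(y[:m]) / m
    trend = 0.0
    season = [v - level for v in y[:m]]
    for t in range(m, len(y)):
        prev_level = level
        s = season[t % m]
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * s
    return [level + h * trend + season[(len(y) + h - 1) % m]
            for h in range(1, horizon + 1)]

# Synthetic residual dynamics: a slow drift plus one periodic term, a
# stand-in for the error signal of a truncated analytical theory.
m = 12
y = [0.01 * t + 0.5 * math.sin(2 * math.pi * t / m) for t in range(10 * m)]
forecast = holt_winters_additive(y, m, horizon=m)
```

In the hybrid propagator, such a forecast of the residual is added back onto the analytical theory's output to recover part of the dynamics the theory omits.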

  19. Joint Utility of Event-Dependent and Environmental Crime Analysis Techniques for Violent Crime Forecasting

    ERIC Educational Resources Information Center

    Caplan, Joel M.; Kennedy, Leslie W.; Piza, Eric L.

    2013-01-01

    Violent crime incidents occurring in Irvington, New Jersey, in 2007 and 2008 are used to assess the joint analytical capabilities of point pattern analysis, hotspot mapping, near-repeat analysis, and risk terrain modeling. One approach to crime analysis suggests that the best way to predict future crime occurrence is to use past behavior, such as…

  20. Reusable biocompatible interface for immobilization of materials on a solid support

    DOEpatents

    Salamon, Z.; Schmidt, R.A.; Tollin, G.; Macleod, H.A.

    1996-05-28

    A method is presented for the formation of a biocompatible film composed of a self-assembled bilayer membrane deposited on a planar surface. This bilayer membrane is capable of immobilizing materials to be analyzed in an environment very similar to their native state. Materials so immobilized may be subjected to any of a number of analytical techniques. 3 figs.

  1. Nuclear Forensics and Attribution: A National Laboratory Perspective

    NASA Astrophysics Data System (ADS)

    Hall, Howard L.

    2008-04-01

    Current capabilities in technical nuclear forensics - the extraction of information from nuclear and/or radiological materials to support the attribution of a nuclear incident to material sources, transit routes, and ultimately perpetrator identity - derive largely from three sources: nuclear weapons testing and surveillance programs of the Cold War, advances in analytical chemistry and materials characterization techniques, and abilities to perform "conventional" forensics (e.g., fingerprints) on radiologically contaminated items. Leveraging that scientific infrastructure has provided a baseline capability to the nation, but we are only beginning to explore the scientific challenges that stand between today's capabilities and tomorrow's requirements. These scientific challenges include radically rethinking radioanalytical chemistry approaches, developing rapidly deployable sampling and analysis systems for field applications, and improving analytical instrumentation. Coupled with the ability to measure a signature faster or more exquisitely, we must also develop the ability to interpret those signatures for meaning. This requires understanding of the physics and chemistry of nuclear materials processes well beyond our current level - especially since we are unlikely to ever have direct access to all potential sources of nuclear threat materials.

  2. International Technical Working Group Round Robin Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dudder, Gordon B.; Hanlen, Richard C.; Herbillion, Georges M.

    The goal of nuclear forensics is to develop a preferred approach to support illicit trafficking investigations. This approach must be widely understood and accepted as credible. The principal objectives of the Round Robin Tests are to prioritize forensic techniques and methods, evaluate attribution capabilities, and examine the utility of databases. The HEU (Highly Enriched Uranium) Round Robin and the previous Plutonium Round Robin have made tremendous contributions toward fulfilling these goals through a collaborative learning experience resulting from the outstanding efforts of the nine participating international laboratories. A prioritized list of techniques and methods has been developed based on this exercise. Current work is focused on the extent to which the techniques and methods can be generalized. The HEU Round Robin demonstrated a rather high level of capability to determine the important characteristics of the materials and processes using analytical methods. When this capability is combined with the appropriate knowledge/database, it results in a significant capability to attribute the source of the materials to a specific process or facility. A number of shortfalls were also identified in the current capabilities, including procedures for non-nuclear forensics and the lack of a comprehensive network of data/knowledge bases. The results of the Round Robin will be used to develop guidelines or a "recommended protocol" to be made available to the interested authorities and countries to use in real cases.

  3. Evaluation of Ion Mobility-Mass Spectrometry for Comparative Analysis of Monoclonal Antibodies

    NASA Astrophysics Data System (ADS)

    Ferguson, Carly N.; Gucinski-Ruth, Ashley C.

    2016-05-01

    Analytical techniques capable of detecting changes in structure are necessary to monitor the quality of monoclonal antibody drug products. Ion mobility mass spectrometry offers an advanced mode of characterization of protein higher order structure. In this work, we evaluated the reproducibility of ion mobility mass spectrometry measurements and mobiligrams, as well as the suitability of this approach to differentiate between and/or characterize different monoclonal antibody drug products. Four mobiligram-derived metrics were identified as reproducible across a multi-day window of analysis. These metrics were further applied to comparative studies of monoclonal antibody drug products representing different IgG subclasses, manufacturers, and lots. These comparisons revealed some differences based on the four metrics derived from ion mobility mass spectrometry mobiligrams. The use of collision-induced unfolding revealed further differences. Use of summed charge state datasets and the analysis of metrics beyond drift time allowed for a more comprehensive comparative study between different monoclonal antibody drug products. Ion mobility mass spectrometry enabled detection of differences between monoclonal antibodies with the same target protein but different production techniques, as well as products with different targets. These differences were not always detectable by traditional collision cross section studies. Ion mobility mass spectrometry, with the added separation capability of collision-induced unfolding, was highly reproducible and remains a promising technique for advanced analytical characterization of protein therapeutics.

  4. An aircraft measurement technique for formaldehyde and soluble carbonyl compounds

    NASA Astrophysics Data System (ADS)

    Lee, Yin-Nan; Zhou, Xianliang; Leaitch, W. Richard; Banic, Catharine M.

    1996-12-01

    An aircraft technique was developed for measuring ambient concentrations of formaldehyde and a number of soluble carbonyl compounds, including glycolaldehyde, glyoxal, methylglyoxal, glyoxylic acid, and pyruvic acid. Sampling was achieved by liquid scrubbing using a glass coil scrubber in conjunction with an autosampler which collected 5-min integrated liquid samples in septum-sealed vials. Analysis was performed on the ground after flight using high-performance liquid chromatography following derivatization of the carbonyl analytes with 2,4-dinitrophenylhydrazine; the limit of detection was 0.01 to 0.02 parts per billion by volume (ppbv) in the gas phase. Although lacking a real-time capability, this technique offers the advantage of simultaneously measuring six carbonyl compounds, savings in space and power on the aircraft, and a dependable ground-based analysis. This technique was deployed on the Canadian National Research Council DHC-6 Twin Otter during the 1993 summer intensive of the North Atlantic Regional Experiment. The data obtained on August 28, 1993, during a pollutant transport episode are presented as an example of the performance and capability of this technique.

  5. Current Status of Mycotoxin Analysis: A Critical Review.

    PubMed

    Shephard, Gordon S

    2016-07-01

    It is over 50 years since the discovery of aflatoxins focused the attention of food safety specialists on fungal toxins in the feed and food supply. Since then, analysis of this important group of natural contaminants has advanced in parallel with general developments in analytical science, and current MS methods are capable of simultaneously analyzing hundreds of compounds, including mycotoxins, pesticides, and drugs. This profusion of data may advance our understanding of human exposure, yet constitutes an interpretive challenge to toxicologists and food safety regulators. Despite these advances in analytical science, the basic problem of the extreme heterogeneity of mycotoxin contamination, although now well understood, cannot be circumvented. The real health challenges posed by mycotoxin exposure occur in the developing world, especially among small-scale and subsistence farmers. Addressing these problems requires innovative approaches in which analytical science must also play a role in providing suitable out-of-laboratory analytical techniques.

  6. Nanomanipulation-Coupled Matrix-Assisted Laser Desorption/ Ionization-Direct Organelle Mass Spectrometry: A Technique for the Detailed Analysis of Single Organelles

    NASA Astrophysics Data System (ADS)

    Phelps, Mandy S.; Sturtevant, Drew; Chapman, Kent D.; Verbeck, Guido F.

    2016-02-01

    We describe a novel technique combining precise organelle microextraction with deposition and matrix-assisted laser desorption/ionization (MALDI) for rapid, minimally invasive mass spectrometry (MS) analysis of single organelles from living cells. A dual-positioner nanomanipulator workstation was utilized for both extraction of organelle content and precise co-deposition of analyte and matrix solution for MALDI-direct organelle mass spectrometry (DOMS) analysis. Here, the triacylglycerol (TAG) profiles of single lipid droplets from 3T3-L1 adipocytes were acquired and the results validated with nanoelectrospray ionization (NSI) MS. The results demonstrate the utility of the MALDI-DOMS technique, which enabled longer mass analysis time, higher ionization efficiency, MS imaging of the co-deposited spot, and subsequent MS/MS capabilities of localized lipid content in comparison to NSI-DOMS. This method provides selective organellar resolution, complementing current biochemical analyses and enabling subsequent subcellular studies where limited samples and analyte volumes are of concern.

  7. Theoretical limitations of quantification for noncompetitive sandwich immunoassays.

    PubMed

    Woolley, Christine F; Hayes, Mark A; Mahanti, Prasun; Douglass Gilman, S; Taylor, Tom

    2015-11-01

    Immunoassays exploit the highly selective interaction between antibodies and antigens to provide a vital method for biomolecule detection at low concentrations. Developers and practitioners of immunoassays have long known that non-specific binding often restricts immunoassay limits of quantification (LOQs). Aside from non-specific binding, most efforts by analytical chemists to reduce the LOQ for these techniques have focused on improving signal amplification methods and minimizing the limitations of the detection system. However, with detection technology now capable of sensing single fluorescent molecules, this approach is unlikely to lead to dramatic improvements in the future. Here, fundamental interactions based on the law of mass action are analytically connected to signal generation, replacing the four- and five-parameter fittings commercially used to approximate sigmoidal immunoassay curves and allowing quantitative consideration of non-specific binding and statistical limitations in order to understand the ultimate detection capabilities of immunoassays. The restrictions imposed on limits of quantification by instrumental noise, non-specific binding, and counting statistics are discussed based on equilibrium relations for a sandwich immunoassay. Understanding the maximal capabilities of immunoassays in each of these regimes can greatly assist in the development and evaluation of immunoassay platforms. While many studies suggest that single-molecule detection is possible through immunoassay techniques, it is demonstrated here that the fundamental limit of quantification (precision of 10% or better) for an immunoassay is approximately 131 molecules, a limit imposed by fundamental and unavoidable statistical constraints.
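    The counting-statistics bound invoked above can be illustrated with a short calculation. This is a simplified sketch that considers Poisson (shot-noise) statistics alone; the 131-molecule figure quoted in the abstract additionally folds in non-specific binding and instrumental noise, which is why it exceeds the pure counting limit.

```python
import math

def poisson_relative_precision(n):
    """Relative standard deviation of a Poisson count of n molecules."""
    return 1.0 / math.sqrt(n)

# Smallest count whose shot-noise-limited precision is 10% or better.
n = 1
while poisson_relative_precision(n) > 0.10:
    n += 1
print(n)  # 100 -- pure counting statistics; additional noise terms push this higher
```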

  8. Relativistic algorithm for time transfer in Mars missions under IAU Resolutions: an analytic approach

    NASA Astrophysics Data System (ADS)

    Pan, Jun-Yang; Xie, Yi

    2015-02-01

    With tremendous advances in modern techniques, Einstein's general relativity has become an inevitable part of deep space missions. We investigate the relativistic algorithm for time transfer between the proper time τ of the onboard clock and the Geocentric Coordinate Time, which extends previous works by including the effects of propagation of electromagnetic signals. In order to evaluate the implicit algebraic equations and integrals in the model, we take an analytic approach to work out their approximate values. This analytic model is suitable for an onboard computer, whose capability to perform calculations is limited. Taking an orbiter like Yinghuo-1 as an example, we find that the contributions of the Sun, the ground station and the spacecraft dominate the relativistic corrections in the model.

  9. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues arise when exploring digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics and to identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process. 
The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and countries.

  10. Non-volatile analysis in fruits by laser resonant ionization spectrometry: application to resveratrol (3,5,4'-trihydroxystilbene) in grapes

    NASA Astrophysics Data System (ADS)

    Montero, C.; Orea, J. M.; Soledad Muñoz, M.; Lobo, R. F. M.; González Ureña, A.

    A laser desorption (LD) technique coupled with resonance-enhanced multiphoton ionisation (REMPI) and time-of-flight mass spectrometry (TOFMS) for trace analysis of non-volatile compounds is presented. Essential features are: (a) an enhanced desorption yield due to the mixing of metal powder with the analyte during sample preparation; (b) high resolution, high sensitivity and a low detection limit due to laser resonant ionisation and mass spectrometric detection. Application to the resveratrol content of grapes demonstrated the capability of the analytical method, with a sensitivity of 0.2 pg per single laser shot and a detection limit of 5 ppb.

  11. Rotorcraft Diagnostics

    NASA Technical Reports Server (NTRS)

    Haste, Deepak; Azam, Mohammad; Ghoshal, Sudipto; Monte, James

    2012-01-01

    Health management (HM) in any engineering system requires adequate understanding of the system's functioning; a sufficient amount of monitored data; the capability to extract, analyze, and collate information; and the capability to combine understanding and information for HM-related estimation and decision-making. Rotorcraft systems are, in general, highly complex. Obtaining adequate understanding of the functioning of such systems is quite difficult because of the proprietary (restricted access) nature of their designs and dynamic models. Development of an EIM (exact inverse map) solution for rotorcraft requires a process that can overcome the abovementioned difficulties and maximally utilize monitored information for HM facilitation by employing advanced analytic techniques. The goal was to develop a versatile HM solution for rotorcraft to facilitate Condition Based Maintenance Plus (CBM+) capabilities. The effort was geared towards developing analytic and reasoning techniques, and proving the ability to embed the required capabilities on a rotorcraft platform, paving the way for implementing the solution on an aircraft-level system for consolidation and reporting. The solution for rotorcraft can be used offboard or embedded directly onto a rotorcraft system. The envisioned solution utilizes available monitored and archived data for real-time fault detection and identification, failure precursor identification, offline fault detection and diagnostics, health condition forecasting, optimal guided troubleshooting, and maintenance decision support. A variant of the onboard version is a self-contained hardware and software (HW+SW) package that can be embedded on rotorcraft systems. 
The HM solution comprises components that gather/ingest data and information, perform information/feature extraction, analyze information in conjunction with the dependency/diagnostic model of the target system, facilitate optimal guided troubleshooting, and offer decision support for optimal maintenance.

  12. Applications of surface analytical techniques in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Qian, Gujie; Li, Yubiao; Gerson, Andrea R.

    2015-03-01

    This review covers a wide range of surface analytical techniques: X-ray photoelectron spectroscopy (XPS), scanning photoelectron microscopy (SPEM), photoemission electron microscopy (PEEM), dynamic and static secondary ion mass spectroscopy (SIMS), electron backscatter diffraction (EBSD), and atomic force microscopy (AFM). Others that are relatively less widely used but are also important to the Earth Sciences are also included: Auger electron spectroscopy (AES), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). All these techniques probe only the very top sample surface layers (sub-nm to several tens of nm). In addition, we also present several other techniques, i.e. Raman microspectroscopy, reflection infrared (IR) microspectroscopy and quantitative evaluation of minerals by scanning electron microscopy (QEMSCAN), that penetrate deeper into the sample, up to several μm, as all of them are fundamental analytical tools for the Earth Sciences. Grazing incidence synchrotron techniques, sensitive to surface measurements, are also briefly introduced at the end of this review. (Scanning) transmission electron microscopy (TEM/STEM) is a special case that can be applied to the characterisation of mineralogical and geological sample surfaces. Since TEM/STEM is such an important technique for Earth Scientists, we have also included it to draw attention to the capability of TEM/STEM applied as a surface-equivalent tool. While this review presents most of the important techniques for the Earth Sciences, it is not an all-inclusive bibliography of those analytical techniques. Instead, for each technique that is discussed, we first give a very brief introduction to its principle and background, followed by a short section on approaches to sample preparation that are important for researchers to appreciate prior to the actual sample analysis. 
We then use examples from publications (and also some of our own unpublished results) within the Earth Sciences to show how each technique is applied and used to obtain specific information and to resolve real problems, which forms the central theme of this review. Although this review focuses on applications of these techniques to the study of mineralogical and geological samples, we also anticipate that researchers from other research areas such as the Material and Environmental Sciences may benefit from this review.

  13. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of handling such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
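    The retrieval pipeline described above (feature representation, indexing, searching) can be sketched minimally as a brute-force cosine-similarity search over feature vectors. This is an illustrative sketch only; the function name and toy data are not from the paper, and production systems would use approximate indexing rather than exhaustive search.

```python
import numpy as np

def cosine_search(index, query, k=3):
    """Return indices of the k most similar feature vectors by cosine similarity."""
    index_n = index / np.linalg.norm(index, axis=1, keepdims=True)
    query_n = query / np.linalg.norm(query)
    sims = index_n @ query_n                 # cosine similarity to every item
    return np.argsort(-sims)[:k]             # best matches first

# Toy "image features": four images described by 3-D vectors.
features = np.array([[1.0, 0.0, 0.0],
                     [0.9, 0.1, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
print(cosine_search(features, np.array([1.0, 0.05, 0.0]), k=2))  # most similar first
```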

  14. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has minimised the necessary collaboration and, more relevantly, cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient daily accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.

  15. Modern Material Analysis Instruments Add a New Dimension to Materials Characterization and Failure Analysis

    NASA Technical Reports Server (NTRS)

    Panda, Binayak

    2009-01-01

    Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectroscopy (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron - with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photo-electron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger electron spectroscopy (SAM) combines surface sensitivity, spatial lateral resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near and below surface regions down to the chemical state of an atom.

  16. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability is being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.

  17. NIR and UV-vis spectroscopy, artificial nose and tongue: comparison of four fingerprinting techniques for the characterisation of Italian red wines.

    PubMed

    Casale, M; Oliveri, P; Armanino, C; Lanteri, S; Forina, M

    2010-06-04

    Four rapid and low-cost vanguard analytical systems (NIR and UV-vis spectroscopy, a headspace-mass based artificial nose and a voltammetric artificial tongue), together with chemometric pattern recognition techniques, were applied and compared in addressing a food authentication problem: the distinction between wine samples from the same Italian oenological region according to grape variety. Specifically, 59 certified samples belonging to the Barbera d'Alba and Dolcetto d'Alba appellations and collected from the same vintage (2007) were analysed. The instrumental responses, after proper data pre-processing, were used as fingerprints of the characteristics of the samples: the results from principal component analysis and linear discriminant analysis are discussed, comparing the capability of the four analytical strategies in addressing the problem studied. Copyright 2010 Elsevier B.V. All rights reserved.

  18. Temperature measurement in a compressible flow field using laser-induced iodine fluorescence

    NASA Technical Reports Server (NTRS)

    Fletcher, D. G.; Mcdaniel, J. C.

    1987-01-01

    The thermometric capability of a two-line fluorescence technique using iodine seed molecules in air is investigated analytically and verified experimentally in a known steady compressible flow field. Temperatures ranging from 165 to 295 K were measured in the flowfield using two iodine transitions accessed with a 30-GHz dye-laser scan near 543 nm. The effect of pressure broadening on temperature measurement is evaluated.
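    In two-line fluorescence thermometry of the kind described above, temperature is typically inferred from the ratio of fluorescence signals on two transitions whose lower states differ in energy, via a Boltzmann factor. The following is a minimal sketch of that inversion; the energy gap and calibration constant are hypothetical placeholders, not the iodine values used in the study.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def line_ratio(temp, delta_e, calib=1.0):
    """Signal ratio of two transitions separated by delta_e in lower-state energy."""
    return calib * math.exp(-delta_e / (K_B * temp))

def temperature_from_ratio(ratio, delta_e, calib=1.0):
    """Invert the Boltzmann relation to recover temperature from a measured ratio."""
    return -delta_e / (K_B * math.log(ratio / calib))

delta_e = 2.0e-21  # J, hypothetical lower-state energy gap
r = line_ratio(200.0, delta_e)                       # simulate a measurement at 200 K
print(round(temperature_from_ratio(r, delta_e), 6))  # recovers 200.0
```

    In practice the calibration constant absorbs line strengths and detection efficiencies and is fixed by a measurement at a known reference temperature.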

  19. The NASTRAN theoretical manual

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Designed to accommodate additions and modifications, this commentary on NASTRAN describes the problem-solving capabilities of the program in a narrative fashion and presents developments of the analytical and numerical procedures that underlie the program. Seventeen major sections and numerous subsections cover: the organizational aspects of the program, utility matrix routines, static structural analysis, heat transfer, dynamic structural analysis, computer graphics, special structural modeling techniques, error analysis, interaction between structures and fluids, and aeroelastic analysis.

  20. 2018 Ground Robotics Capabilities Conference and Exhibiton

    DTIC Science & Technology

    2018-04-11

    Transportable Robot System (MTRS) Inc 1 Non -standard Equipment (approved) Explosive Ordnance Disposal Common Robotic System-Heavy (CRS-H) Inc 1 AROC: 3-Star...and engineering • AI risk mitigation methodologies and techniques are at best immature – E.g., V&V; Probabilistic software analytics; code level...controller to minimize potential UxS mishaps and unauthorized Command and Control (C2). • PSP-10 – Ensure that software systems which exhibit non

  1. Development of analytically capable time-of-flight mass spectrometer with continuous ion introduction

    NASA Astrophysics Data System (ADS)

    Hárs, György; Dobos, Gábor

    2010-03-01

    The present article describes the results and findings explored in the course of the development of an analytically capable prototype continuous time-of-flight (CTOF) mass spectrometer. Currently marketed pulsed TOF (PTOF) instruments use ion introduction with a pulse width of 10 ns or so, followed by a waiting period of roughly 100 μs. Accordingly, the sample is under excitation for only about 10⁻⁴ of the total measuring time. This very low duty cycle severely limits the sensitivity of the PTOF method. A possible approach to deal with this problem is to use the linear sinusoidal dual modulation technique (CTOF) described in this article. This way the sensitivity of the method is increased, owing to the 50% duty cycle of the excitation. All other types of TOF spectrometer use a secondary electron multiplier (SEM) for detection, which unfortunately discriminates in amplification in favor of the lighter ions. This discrimination effect is especially undesirable in a mass spectrometric method that targets the high mass range. In the CTOF method, the SEM is replaced with a Faraday cup detector, thus eliminating the mass discrimination effect. Omitting the SEM is made possible by the high ion intensity and the very slow ion detection, with a detection bandwidth of some hundred hertz. The electrometer electronics of the Faraday cup detector operates with a gain of 10¹⁰ V/A. The primary ion beam is highly monoenergetic due to the construction of the ion gun, which made it possible to omit any electrostatic mirror configuration for bunching the ions. The measurement is controlled by a personal computer and an intelligent signal generator (Tabor WW 2571), which uses direct digital synthesis to generate arbitrary waveforms. The data are collected by a LabJack interface board, and the fast Fourier transformation is performed in software. A noble gas mixture has been used to test the analytical capabilities of the prototype setup. 
The measurements presented confirm the results of the mathematical calculations, as well as the technique's future potential for use in the chemical analysis of gaseous mixtures.
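    The duty-cycle argument above reduces to simple arithmetic; a sketch using the figures quoted in the abstract (10 ns pulse, ~100 μs waiting period, 50% CTOF modulation):

```python
ptof_pulse = 10e-9     # 10 ns ion introduction pulse
ptof_period = 100e-6   # ~100 us waiting period between pulses
ptof_duty = ptof_pulse / ptof_period   # fraction of time the sample is excited

ctof_duty = 0.5        # sinusoidal dual modulation: 50% duty cycle

print(ptof_duty)              # ~1e-4
print(ctof_duty / ptof_duty)  # ~5000x more excitation time for CTOF
```

    The factor of roughly 5000 in excitation time is what underlies the claimed sensitivity gain, although the realized gain also depends on detector noise and modulation efficiency.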

  2. Comprehensive study of solid pharmaceutical tablets in visible, near infrared (NIR), and longwave infrared (LWIR) spectral regions using a rapid simultaneous ultraviolet/visible/NIR (UVN) + LWIR laser-induced breakdown spectroscopy linear arrays detection system and a fast acousto-optic tunable filter NIR spectrometer.

    PubMed

    Yang, Clayton S C; Jin, Feng; Swaminathan, Siva R; Patel, Sita; Ramer, Evan D; Trivedi, Sudhir B; Brown, Ei E; Hommerich, Uwe; Samuels, Alan C

    2017-10-30

    This is the first report of a simultaneous ultraviolet/visible/NIR and longwave infrared laser-induced breakdown spectroscopy (UVN + LWIR LIBS) measurement. In our attempt to study the feasibility of coupling the newly developed rapid LWIR LIBS linear array detection system to existing rapid analytical techniques for a wide range of chemical analysis applications, two different solid pharmaceutical tablets, Tylenol Arthritis Pain and Bufferin, were studied using both a recently designed simultaneous UVN + LWIR LIBS detection system and a fast AOTF NIR (1200 to 2200 nm) spectrometer. Every simultaneous UVN + LWIR LIBS emission spectrum in this work was initiated by a single laser pulse-induced micro-plasma in the ambient air atmosphere. Distinct atomic and molecular LIBS emission signatures of the target compounds, measured simultaneously in the UVN (200 to 1100 nm) and LWIR (5.6 to 10 µm) spectral regions, are readily detected and identified without the need to employ complex data processing. In depth-profiling studies of these two pharmaceutical tablets, performed without any sample preparation, one can easily monitor the transition of the dominant LWIR emission signatures from the coating ingredients to the pharmaceutical ingredients underneath the coating. The observed LWIR LIBS emission signatures provide molecular information complementary to the UVN LIBS signatures, thus adding robustness to identification procedures. LIBS techniques are more surface specific, while NIR spectroscopy can probe the bulk material owing to its greater penetration depth. Both UVN + LWIR LIBS and NIR absorption spectroscopy have shown the capability of acquiring useful target analyte spectral signatures on comparably short time scales. The addition of a rapid LWIR spectroscopic probe to these widely used optical analytical methods, such as NIR spectroscopy and UVN LIBS, may greatly enhance the capability and accuracy of the combined system for comprehensive analysis.

  3. Automatically measuring brain ventricular volume within PACS using artificial intelligence.

    PubMed

    Yepes-Calderon, Fernando; Nelson, Marvin D; McComb, J Gordon

    2018-01-01

    The picture archiving and communications system (PACS) is currently the standard platform for managing medical images but lacks analytical capabilities. Staying within PACS, the authors have developed an automatic method to retrieve the medical data and access it at the voxel level, decrypted and uncompressed, which enables analysis without perturbing the system's daily operation. Additionally, the strategy is secure and vendor independent. Cerebral ventricular volume is important for the diagnosis and treatment of many neurological disorders. A significant change in ventricular volume is readily recognized, but subtle changes, especially over longer periods of time, may be difficult to discern. Clinical imaging protocols and parameters are often varied, making it difficult to use a general solution with standard segmentation techniques. Presented is a segmentation strategy based on an algorithm that uses four features extracted from the medical images to create a statistical estimator capable of determining ventricular volume. When compared with manual segmentations, the correlation was 94%, and the approach holds promise for even better accuracy as more of the available data are incorporated. The volume of any segmentable structure can be accurately determined utilizing the machine learning strategy presented, which runs fully automatically within the PACS.

  4. Synchrotron based mass spectrometry to investigate the molecular properties of mineral-organic associations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Suet Yi; Kleber, Markus; Takahashi, Lynelle K.

    2013-04-01

    Soil organic matter (OM) is important because its decay drives life processes in the biosphere. Analysis of organic compounds in geological systems is difficult because of their intimate association with mineral surfaces. To date there is no procedure capable of quantitatively separating organic from mineral phases without creating artifacts or mass loss. Therefore, analytical techniques that can (a) generate information about both organic and mineral phases simultaneously and (b) allow the examination of predetermined high-interest regions of the sample, as opposed to conventional bulk analytical techniques, are valuable. Laser Desorption Synchrotron Postionization (synchrotron-LDPI) mass spectrometry is introduced as a novel analytical tool to characterize the molecular properties of organic compounds in mineral-organic samples from terrestrial systems, and it is demonstrated that, when combined with Secondary Ion Mass Spectrometry (SIMS), it can provide complementary information on mineral composition. Mass spectrometry along a decomposition gradient in density fractions verifies the consistency of our results with bulk analytical techniques. We further demonstrate that by changing laser and photoionization energies, variations in the molecular stability of organic compounds associated with mineral surfaces can be determined. The combination of synchrotron-LDPI and SIMS shows that the energetic conditions involved in desorption and ionization of organic matter may be a greater determinant of mass spectral signatures than the inherent molecular structure of the organic compounds investigated. The latter has implications for molecular models of natural organic matter that are based on mass spectrometric information.

  5. Convergence in full motion video processing, exploitation, and dissemination and activity based intelligence

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Lewis, Gina

    2012-06-01

    Over the last decade, intelligence capabilities within the Department of Defense/Intelligence Community (DoD/IC) have evolved from ad hoc, single source, just-in-time, analog processing; to multi source, digitally integrated, real-time analytics; to multi-INT, predictive Processing, Exploitation and Dissemination (PED). Full Motion Video (FMV) technology and motion imagery tradecraft advancements have greatly contributed to Intelligence, Surveillance and Reconnaissance (ISR) capabilities during this timeframe. Imagery analysts have exploited events, missions and high value targets, generating and disseminating critical intelligence reports within seconds of occurrence across operationally significant PED cells. Now, we go beyond FMV, enabling All-Source Analysts to effectively deliver ISR information in a multi-INT sensor rich environment. In this paper, we explore the operational benefits and technical challenges of an Activity Based Intelligence (ABI) approach to FMV PED. Existing and emerging ABI features within FMV PED frameworks are discussed, to include refined motion imagery tools, additional intelligence sources, activity relevant content management techniques and automated analytics.

  6. Calculation of ground vibration spectra from heavy military vehicles

    NASA Astrophysics Data System (ADS)

    Krylov, V. V.; Pickup, S.; McNuff, J.

    2010-07-01

    The demand for reliable autonomous systems capable of detecting and identifying heavy military vehicles has become an important issue for UN peacekeeping forces in the current delicate political climate. A promising method of detection and identification is one using the information extracted from the ground vibration spectra generated by heavy military vehicles, often termed their seismic signatures. This paper presents the results of a theoretical investigation of ground vibration spectra generated by heavy military vehicles, such as tanks and armoured personnel carriers. A simple quarter-car model is considered to identify the resulting dynamic forces applied from a vehicle to the ground. The obtained analytical expressions for vehicle dynamic forces are then used for calculations of the generated ground vibrations, predominantly Rayleigh surface waves, using the Green's function method. A comparison of the obtained theoretical results with published experimental data shows that analytical techniques based on the simplified quarter-car vehicle model are capable of producing ground vibration spectra of heavy military vehicles that reproduce the basic properties of experimental spectra.
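    The quarter-car model mentioned above is a two-degree-of-freedom system (sprung hull mass on the suspension stiffness, unsprung wheel mass on the tyre/track stiffness), and its undamped natural frequencies follow from a small eigenvalue problem. A sketch with hypothetical, order-of-magnitude parameters (not the paper's values):

```python
import numpy as np

# Hypothetical quarter-vehicle parameters for a heavy tracked vehicle.
m_s, m_u = 10000.0, 1000.0   # sprung / unsprung masses, kg
k_s, k_t = 2.0e6, 8.0e6      # suspension / tyre-track stiffnesses, N/m

M = np.diag([m_s, m_u])
K = np.array([[k_s, -k_s],
              [-k_s, k_s + k_t]])

# Undamped natural frequencies: eigenvalues of M^-1 K are omega^2.
omega_sq = np.linalg.eigvals(np.linalg.solve(M, K))
freqs_hz = np.sort(np.sqrt(omega_sq.real)) / (2.0 * np.pi)
print(freqs_hz)  # ~2 Hz body-bounce mode and ~16 Hz wheel-hop mode for these parameters
```

    These two modal frequencies shape the dynamic force the vehicle applies to the ground and hence where peaks appear in the radiated Rayleigh-wave spectrum.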

  7. A Compact, Solid-State UV (266 nm) Laser System Capable of Burst-Mode Operation for Laser Ablation Desorption Processing

    NASA Technical Reports Server (NTRS)

    Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William

    2015-01-01

    Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g⁻¹); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).

  8. Fluorescence excitation-emission matrix spectroscopy for degradation monitoring of machinery lubricants

    NASA Astrophysics Data System (ADS)

    Sosnovski, Oleg; Suresh, Pooja; Dudelzak, Alexander E.; Green, Benjamin

    2018-02-01

    Lubrication oil is a vital component of heavy rotating machinery, defining the machine's health, operational safety and effectiveness. Recently, the focus has been on developing sensors that provide real-time/online monitoring of oil condition/lubricity. Industrial practices and standards for assessing oil condition involve various analytical methods, most of which are unsuitable for online applications. The paper presents the results of studying the degradation of antioxidant additives in machinery lubricants using Fluorescence Excitation-Emission Matrix (EEM) Spectroscopy and machine learning techniques. EEM Spectroscopy is capable of rapid and even standoff sensing; it is potentially applicable to real-time online monitoring.

  9. Automated measurement of respiratory gas exchange by an inert gas dilution technique

    NASA Technical Reports Server (NTRS)

    Sawin, C. F.; Rummel, J. A.; Michel, E. L.

    1974-01-01

    A respiratory gas analyzer (RGA) has been developed wherein a mass spectrometer is the sole transducer required for measurement of respiratory gas exchange. The mass spectrometer maintains all signals in absolute phase relationships, precluding the need to synchronize flow and gas composition as required in other systems. The RGA system was evaluated by comparison with the Douglas bag technique. The RGA system established the feasibility of the inert gas dilution method for measuring breath-by-breath respiratory gas exchange. This breath-by-breath analytical capability permits detailed study of transient respiratory responses to exercise.

  10. Modeling of ion acceleration through drift and diffusion at interplanetary shocks

    NASA Technical Reports Server (NTRS)

    Decker, R. B.; Vlahos, L.

    1986-01-01

    A test particle simulation designed to model ion acceleration through drift and diffusion at interplanetary shocks is described. The technique consists of integrating along exact particle orbits in a system where the angle between the shock normal and mean upstream magnetic field, the level of magnetic fluctuations, and the energy of injected particles can assume a range of values. The technique makes it possible to study time-dependent shock acceleration under conditions not amenable to analytical techniques. To illustrate the capability of the numerical model, proton acceleration was considered under conditions appropriate for interplanetary shocks at 1 AU, including large-amplitude transverse magnetic fluctuations derived from power spectra of both ambient and shock-associated MHD waves.
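
    The orbit-integration core of such a test-particle code can be sketched with a standard Boris pusher for a proton gyrating in a uniform magnetic field; a shock model would add field discontinuities and fluctuations on top of this kernel. Field strength, particle speed and step count below are illustrative choices, not values from the paper.

```python
import numpy as np

# Boris pusher: the exact-orbit integration kernel a shock test-particle
# code builds on. Parameters are illustrative (roughly 1 AU conditions).
q, m = 1.602e-19, 1.673e-27           # proton charge (C) and mass (kg)
B = np.array([0.0, 0.0, 5.0e-9])      # 5 nT magnetic field
E = np.zeros(3)                       # no electric field in this sketch

v = np.array([4.0e5, 0.0, 0.0])       # 400 km/s, perpendicular to B
x = np.zeros(3)
wc = q * np.linalg.norm(B) / m        # gyrofrequency, rad/s
dt = 0.01 * 2.0 * np.pi / wc          # 100 steps per gyration

for _ in range(1000):                 # ten gyroperiods
    h = q * dt / (2.0 * m)
    t_vec = h * B                     # half-step magnetic rotation vector
    s_vec = 2.0 * t_vec / (1.0 + t_vec @ t_vec)
    v_minus = v + h * E               # first half electric kick (zero here)
    v_prime = v_minus + np.cross(v_minus, t_vec)
    v = v_minus + np.cross(v_prime, s_vec) + h * E
    x = x + v * dt

r_gyro = m * 4.0e5 / (q * np.linalg.norm(B))   # analytic gyroradius, m
speed = float(np.linalg.norm(v))
```

    The Boris rotation conserves particle speed exactly in a pure magnetic field, which is why this family of integrators is trusted for long time-dependent acceleration runs.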

  11. Early Oscillation Detection for Hybrid DC/DC Converter Fault Diagnosis

    NASA Technical Reports Server (NTRS)

    Wang, Bright L.

    2011-01-01

    This paper describes a novel fault detection technique for hybrid DC/DC converter oscillation diagnosis. The technique is based on principles of feedback control loop oscillation and RF signal modulation, and is realized using signal spectral analysis. Real-circuit simulation and analytical study reveal critical factors of the oscillation and indicate significant correlations between the spectral analysis method and the gain/phase margin method. A stability diagnosis index (SDI) is developed as a quantitative measure to accurately assign a degree of stability to the DC/DC converter. This technique is capable of detecting oscillation at an early stage without interfering with the DC/DC converter's normal operation and without the limitations of probing the converter.
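
    The spectral-screening idea can be sketched as follows: look for a spectral line in the feedback-loop bandwidth of the converter output and ratio it against the local noise floor. The waveform, frequencies and threshold below are synthetic stand-ins, not the paper's actual SDI definition.

```python
import numpy as np

# Synthetic converter telemetry: switching ripple plus a weak loop
# oscillation buried in measurement noise (all values illustrative)
fs = 1.0e6                          # sample rate, Hz
t = np.arange(0.0, 0.01, 1.0 / fs)
rng = np.random.default_rng(0)
signal = (0.02 * np.sin(2 * np.pi * 200e3 * t)    # switching ripple
          + 0.005 * np.sin(2 * np.pi * 12e3 * t)  # incipient loop oscillation
          + 0.001 * rng.standard_normal(t.size))  # noise

spec = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)

# Search the loop bandwidth (1-50 kHz), well below the switching frequency
band = (freqs > 1e3) & (freqs < 5e4)
peak = spec[band].max()
peak_hz = float(freqs[band][np.argmax(spec[band])])
floor = np.median(spec[band])
sdi_like = peak / floor             # crude stability-index-style ratio
oscillating = bool(sdi_like > 10.0)
```

    Because the analysis is passive (it only reads the output waveform), it illustrates how oscillation can be flagged without probing or disturbing the converter.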

  12. Trace level detection of analytes using artificial olfactometry

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor); Severin, Erik J. (Inventor); Wong, Bernard (Inventor)

    2002-01-01

    The present invention provides a device for detecting the presence of an analyte, such as, for example, a lightweight device, including: a sample chamber having a fluid inlet port for the influx of the analyte; a fluid concentrator in flow communication with the sample chamber, wherein the fluid concentrator has an absorbent material capable of absorbing the analyte and capable of desorbing a concentrated analyte; and an array of sensors in fluid communication with the concentrated analyte to be released from the fluid concentrator.

  13. Separation of negatively charged carbohydrates by capillary electrophoresis.

    PubMed

    Linhardt, R J; Pervin, A

    1996-01-12

    Capillary electrophoresis (CE) has recently emerged as a highly promising technique, consuming an extremely small amount of sample and capable of rapid, high-resolution separation, characterization, and quantitation of analytes. CE has been used for the separation of biopolymers, including acidic carbohydrates. Since CE is basically an analytical method for ions, acidic carbohydrates that give anions in weakly acidic, neutral, or alkaline media are often the direct objects of this method. The scope of this review is limited to the use of CE for the analysis of carbohydrates containing carboxylate, sulfate, and phosphate groups, as well as neutral carbohydrates that have been derivatized to incorporate strongly acidic functionality, such as sulfonate groups.

  14. Evaluating bis(2-ethylhexyl) methanediphosphonic acid (H2DEH[MDP]) based polymer ligand film (PLF) for plutonium and uranium extraction

    DOE PAGES

    Rim, Jung H.; Armenta, Claudine E.; Gonzales, Edward R.; ...

    2015-09-12

    This paper describes a new analyte extraction medium called polymer ligand film (PLF) that was developed to rapidly extract radionuclides. A PLF is a polymer medium with ligands incorporated in its matrix that selectively and quickly extract analytes. The main focus of the new technique is to shorten and simplify the procedure for chemically isolating radionuclides for determination by alpha spectroscopy. The PLF system was effective for plutonium and uranium extraction, and was capable of co-extracting or selectively extracting plutonium over uranium depending on the PLF composition. As a result, the PLF and electrodeposited samples had similar alpha spectral resolution.

  15. Functional-analytical capabilities of GIS technology in the study of water use risks

    NASA Astrophysics Data System (ADS)

    Nevidimova, O. G.; Yankovich, E. P.; Yankovich, K. S.

    2015-02-01

    Regional security aspects of economic activities are of great importance for legal regulation in environmental management. This has become a critical issue due to climate change, especially in regions where severe climate conditions have a great impact on almost all types of natural resource use. A detailed analysis of the climate and hydrological situation in Tomsk Oblast with respect to water use risks was carried out. Based on techniques developed by the authors, an informational and analytical database was created using the ArcGIS software platform, which combines statistical (quantitative) and spatial characteristics of natural hazards and socio-economic factors. This system was employed to perform areal zoning according to the degree of water use risk involved.

  16. Recent Work in Hybrid Radiation Transport Methods with Applications to Commercial Nuclear Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulesza, Joel A.

    This talk will begin with an overview of hybrid radiation transport methods followed by a discussion of the author’s work to advance current capabilities. The talk will then describe applications for these methods in commercial nuclear power reactor analyses and techniques for experimental validation. When discussing these analytical and experimental activities, the importance of technical standards such as those created and maintained by ASTM International will be demonstrated.

  17. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.

  18. Automated Solid Phase Extraction (SPE) LC/NMR Applied to the Structural Analysis of Extractable Compounds from a Pharmaceutical Packaging Material of Construction.

    PubMed

    Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C

    2013-01-01

    The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. 
Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, while mass spectrometry-based analytical techniques have limitations for this application, newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.

  19. Remote Geochemical and Mineralogical Analyses under Venus Atmospheric Conditions by Raman - Laser Induced Breakdown Spectroscopy (LIBS)

    NASA Astrophysics Data System (ADS)

    Clegg, S. M.; Wiens, R. C.; Newell, R. T.; DeCroix, D. S.; Sharma, S. K.; Misra, A. K.; Dyar, M. D.; Anderson, R. B.; Angel, S. M.; Martinez, R.; McInroy, R.

    2016-12-01

    The extreme Venus surface temperature (~740 K) and atmospheric pressure (~93 atm) create a challenging environment for surface geochemical and mineralogical investigations. Such investigations must be completed within hours of landing, before the lander is overcome by the harsh atmosphere. A combined remote Raman - LIBS spectrometer (RLS) is capable of accomplishing the geochemical science goals without the risks associated with collecting samples and bringing them into the lander. Wiens et al. [1], Sharma et al. [2] and Clegg et al. [3] demonstrated that both analytical techniques can be integrated into a single instrument similar to the SuperCam instrument selected for the Mars 2020 rover. The focus of this paper is to explore the capability to probe geologic samples by Raman and LIBS and to demonstrate quantitative analysis under Venus surface conditions. Raman and LIBS are highly complementary analytical techniques capable of determining both the mineralogical and geochemical composition of Venus surface samples. These techniques have the potential to profoundly increase our knowledge of the Venus surface composition, which is currently limited to geochemical data from the Venera and VEGA landers [4]. Based on the observed compositional differences, and recognizing the imprecise nature of the existing data, samples were chosen to constitute a Venus-analog suite for this study. LIBS data reduction involved generating a partial least squares (PLS) model with a subset of the rock powder standards to quantitatively determine the major elemental abundances of the remaining samples. The Raman experiments have been conducted under supercritical CO2 on single-mineral and mixed-mineral samples containing talc, olivine, pyroxenes, feldspars, anhydrite, barite, and siderite. These experiments involve a new RLS prototype similar to the SuperCam instrument, as well as a new 2 m long pressure chamber capable of simulating the Venus surface temperature and pressure.
    Results of these combined Raman-LIBS investigations will be presented and discussed. [1] Wiens R.C., et al. (2005) Spect. Acta A 61, 2324; [2] Sharma, S.K., et al. (2007) Spect. Acta A 68, 1036; [3] Clegg, S.M., et al. (2014) Appl. Spec. 68, 925; [4] Barsukov, V.L. (1992) in Venus Geology, Geochemistry, and Geophysics, Univ. Arizona Press, p. 165.
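
    The PLS calibration step described in this abstract can be sketched with a minimal NIPALS PLS1 fit on synthetic "spectra"; the sample counts, channel positions and line strengths below are invented for illustration and are not the study's actual standards.

```python
import numpy as np

# Minimal NIPALS PLS1 calibration: predict an elemental abundance from
# many spectral channels, the same idea behind LIBS data reduction.
rng = np.random.default_rng(1)
n, p = 40, 200
y = rng.uniform(0.0, 10.0, n)           # abundances of the standards
X = 0.05 * rng.standard_normal((n, p))  # spectral noise
X[:, 50] += 1.0 * y                     # emission line 1 scales with abundance
X[:, 120] += 0.4 * y                    # emission line 2

def pls1_fit(X, y, n_comp=2):
    """NIPALS PLS1: returns regression coefficients and centering terms."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores
        tt = t @ t
        p_load = Xc.T @ t / tt          # X loadings
        q = (yc @ t) / tt               # y loading
        Xc = Xc - np.outer(t, p_load)   # deflate X and y
        yc = yc - q * t
        W.append(w); P.append(p_load); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    coef = W @ np.linalg.solve(P.T @ W, np.array(Q))
    return coef, x_mean, y_mean

coef, xm, ym = pls1_fit(X, y)
y_pred = (X - xm) @ coef + ym
rmse = float(np.sqrt(np.mean((y_pred - y) ** 2)))
```

    PLS is preferred over ordinary least squares here because the number of spectral channels far exceeds the number of calibration standards; the latent components regularize the regression.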

  20. Development of methodologies and procedures for identifying STS users and uses

    NASA Technical Reports Server (NTRS)

    Archer, J. L.; Beauchamp, N. A.; Macmichael, D. C.

    1974-01-01

    A study was conducted to identify new uses and users of the new Space Transportation System (STS) within the domestic government sector. The study develops a series of analytical techniques and well-defined functions, structured as an integrated planning process, to assure efficient and meaningful use of the STS. The purpose of the study is to provide NASA with the following functions: (1) to realize efficient and economic use of the STS and other NASA capabilities, (2) to identify new users and uses of the STS, (3) to contribute to organized planning activities for both current and future programs, and (4) to aid in analyzing uses of NASA's overall capabilities.

  1. On-line soft sensing in upstream bioprocessing.

    PubMed

    Randek, Judit; Mandenius, Carl-Fredrik

    2018-02-01

    This review provides an overview and a critical discussion of novel possibilities of applying soft sensors for on-line monitoring and control of industrial bioprocesses. Focus is on bio-product formation in the upstream process but also the integration with other parts of the process is addressed. The term soft sensor is used for the combination of analytical hardware data (from sensors, analytical devices, instruments and actuators) with mathematical models that create new real-time information about the process. In particular, the review assesses these possibilities from an industrial perspective, including sensor performance, information value and production economy. The capabilities of existing analytical on-line techniques are scrutinized in view of their usefulness in soft sensor setups and in relation to typical needs in bioprocessing in general. The review concludes with specific recommendations for further development of soft sensors for the monitoring and control of upstream bioprocessing.

  2. Challenges and perspectives in quantitative NMR.

    PubMed

    Giraudeau, Patrick

    2017-01-01

    This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.

  3. Comparative Characterization of Crofelemer Samples Using Data Mining and Machine Learning Approaches With Analytical Stability Data Sets.

    PubMed

    Nariya, Maulik K; Kim, Jae Hyun; Xiong, Jian; Kleindl, Peter A; Hewarathna, Asha; Fisher, Adam C; Joshi, Sangeeta B; Schöneich, Christian; Forrest, M Laird; Middaugh, C Russell; Volkin, David B; Deeds, Eric J

    2017-11-01

    There is growing interest in generating physicochemical and biological analytical data sets to compare complex mixture drugs, for example, products from different manufacturers. In this work, we compare various crofelemer samples prepared from a single lot by filtration with varying molecular weight cutoffs combined with incubation for different times at different temperatures. The 2 preceding articles describe experimental data sets generated from analytical characterization of fractionated and degraded crofelemer samples. In this work, we use data mining techniques such as principal component analysis and mutual information scores to help visualize the data and determine discriminatory regions within these large data sets. The mutual information score identifies chemical signatures that differentiate crofelemer samples. These signatures, in many cases, would likely be missed by traditional data analysis tools. We also found that supervised learning classifiers robustly discriminate samples with around 99% classification accuracy, indicating that mathematical models of these physicochemical data sets are capable of identifying even subtle differences in crofelemer samples. Data mining and machine learning techniques can thus identify fingerprint-type attributes of complex mixture drugs that may be used for comparative characterization of products. Copyright © 2017 American Pharmacists Association®. All rights reserved.
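
    A minimal sketch of the data-mining workflow this abstract describes: project two sets of physicochemical profiles onto principal components, then classify samples by nearest centroid. The two synthetic "lots", feature counts and shift sizes below are illustrative stand-ins for the crofelemer data sets, not the study's actual measurements.

```python
import numpy as np

# Two synthetic sample groups, 30 samples x 60 features each, differing
# by a modest shift in a handful of channels (values are illustrative)
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 60))        # group A profiles
B = rng.standard_normal((30, 60))        # group B profiles
B[:, [5, 17, 33, 48, 59]] += 2.5         # compositional shift in a few channels

X = np.vstack([A, B])
labels = np.array([0] * 30 + [1] * 30)

# PCA via SVD of the mean-centered data
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                   # first two principal-component scores

# Leave-one-out nearest-centroid classification in PC space
correct = 0
idx = np.arange(X.shape[0])
for i in idx:
    mask = idx != i
    c0 = scores[mask & (labels == 0)].mean(axis=0)
    c1 = scores[mask & (labels == 1)].mean(axis=0)
    pred = int(np.linalg.norm(scores[i] - c1) < np.linalg.norm(scores[i] - c0))
    correct += int(pred == labels[i])
accuracy = correct / X.shape[0]
```

    The channels with the largest loadings on the discriminating component play the role of the "chemical signatures" the mutual-information analysis surfaces in the paper.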

  4. Application of headspace solid-phase microextraction (HS-SPME) and comprehensive two-dimensional gas chromatography (GC x GC) for the chemical profiling of volatile oils in complex herbal mixtures.

    PubMed

    Di, Xin; Shellie, Robert A; Marriott, Philip J; Huie, Carmen W

    2004-04-01

    The coupling of headspace solid-phase microextraction (HS-SPME) with comprehensive two-dimensional gas chromatography (GC x GC) was shown to be a powerful technique for the rapid sampling and analysis of volatile oils in complex herbal materials. When compared to one-dimensional (1-D) GC, the improved analytical capabilities of GC x GC in terms of increased detection sensitivity and separation power were demonstrated by using HS-SPME/GC x GC for the chemical profiling (fingerprinting) of essential/volatile oils contained in herbal materials of increasing analytical complexity. More than 20 marker compounds belonging to Panax quinquefolius (American ginseng) can be observed within the 2-D contour plots of ginseng itself, a mixture of ginseng and another important herb (P. quinquefolius/Radix angelicae sinensis), as well as a mixture of ginseng and three other herbs (P. quinquefolius/R. angelicae sinensis/R. astragali/R. rehmanniae preparata). Such analytical capabilities should be important for the authentication and quality control of herbal products, which are receiving increasing attention as alternative medicines worldwide. In particular, the presence of Panax in the herb formulation could be readily identified through its specific peak pattern in the 2-D GC x GC plot.

  5. Analysis of environmental contamination resulting from catastrophic incidents: part 2. Building laboratory capability by selecting and developing analytical methodologies.

    PubMed

    Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba

    2014-11-01

    Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. 
The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity. However, the same techniques discussed could also have application to catastrophes resulting from other incidents, such as natural disasters or industrial accidents. Further, the high sample throughput enabled by the techniques discussed could be employed for conventional environmental studies and compliance monitoring, potentially decreasing costs and/or increasing the quantity of data available to decision-makers. Published by Elsevier Ltd.

  6. Models for randomly distributed nanoscopic domains on spherical vesicles

    NASA Astrophysics Data System (ADS)

    Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John

    2018-06-01

    The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size—a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods capable of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic size domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescent techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from commonly studied phase-separated vesicles used in a wide range of biophysical studies.
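
    The Monte Carlo side of the many-domain case can be sketched by rejection-sampling centers of non-overlapping circular domains on a unit sphere and collecting their pairwise angular separations, the raw ingredient of the interdomain pair correlations. The domain count and angular radius below are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def random_unit_vector():
    """Uniform random point on the unit sphere."""
    v = rng.standard_normal(3)
    return v / np.linalg.norm(v)

def place_domains(n_domains, theta_d, max_tries=100000):
    """Rejection-sample centers of non-overlapping circular domains of
    angular radius theta_d: centers must be at least 2*theta_d apart."""
    centers = []
    for _ in range(max_tries):
        if len(centers) == n_domains:
            break
        c = random_unit_vector()
        if all(np.arccos(np.clip(c @ o, -1.0, 1.0)) >= 2.0 * theta_d
               for o in centers):
            centers.append(c)
    return np.array(centers)

theta_d = 0.15                       # domain angular radius, rad
centers = place_domains(12, theta_d)

# Pairwise angular separations feed the interdomain pair distribution
seps = [np.arccos(np.clip(a @ b, -1.0, 1.0))
        for i, a in enumerate(centers) for b in centers[i + 1:]]
```

    Histogramming `seps` over many such realizations gives the simulated pair distribution function against which spherical Ornstein-Zernike/PY-style closures can be checked.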

  7. Uranium determination in natural water by the fission-track technique

    USGS Publications Warehouse

    Reimer, G.M.

    1975-01-01

    The fission track technique, utilizing the neutron-induced fission of uranium-235, provides a versatile analytical method for the routine analysis of uranium in liquid samples of natural water. A detector is immersed in the sample and both are irradiated. The fission track density observed in the detector is directly proportional to the uranium concentration. The specific advantages of this technique are: (1) only a small quantity of sample, typically 0.1-1 ml, is needed; (2) no sample concentration is necessary; (3) it is capable of providing analyses with a lower reporting limit of 1 µg per liter; and (4) the actual time spent on an analysis can be only a few minutes. This paper discusses and describes the method. © 1975.

  8. Development of catchment research, with particular attention to Plynlimon and its forerunner, the East African catchments

    NASA Astrophysics Data System (ADS)

    Blackie, J. R.; Robinson, M.

    2007-01-01

    Dr J.S.G. McCulloch was deeply involved in the establishment of research catchments in East Africa and subsequently in the UK to investigate the hydrological consequences of changes in land use. Comparison of these studies provides an insight into how influential his inputs and direction have been in the progressive development of the philosophy, the instrumentation and the analytical techniques now employed in catchment research. There were great contrasts in the environments: tropical highland (high radiation, intense rainfall) vs. temperate maritime (low radiation and frontal storms), contrasting soils and vegetation types, as well as the differing social and economic pressures in developing and developed nations. Nevertheless, the underlying scientific philosophy was common to both, although techniques had to be modified according to local conditions. As specialised instrumentation and analytical techniques were developed for the UK catchments many were also integrated into the East African studies. Many lessons were learned in the course of these studies and from the experiences of other studies around the world. Overall, a rigorous scientific approach was developed with widespread applicability. Beyond the basics of catchment selection and the quantification of the main components of the catchment water balance, this involved initiating parallel process studies to provide information on specific aspects of catchment behaviour. This information could then form the basis for models capable of extrapolation from the observed time series to other periods/hydrological events and, ultimately, the capability of predicting the consequences of changes in catchment land management to other areas in a range of climates.

  9. Foodomics imaging by mass spectrometry and magnetic resonance.

    PubMed

    Canela, Núria; Rodríguez, Miguel Ángel; Baiges, Isabel; Nadal, Pedro; Arola, Lluís

    2016-07-01

    This work explores the use of advanced imaging MS (IMS) and magnetic resonance imaging (MRI) techniques in food science and nutrition to evaluate food sensory characteristics, nutritional value and health benefits. Determining the chemical content and applying imaging tools to food metabolomics offer detailed information about food quality, safety, processing, storage and authenticity assessment. IMS and MRI are powerful analytical systems with an excellent capability for mapping the distribution of many molecules, and recent advances in these platforms are reviewed and discussed, showing the great potential of these techniques for small molecule-based food metabolomics research. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Destructive analysis capabilities for plutonium and uranium characterization at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R

Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing complete characterization (elemental assay, isotopic composition, and metallic and non-metallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives, including the highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs, such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation (SME) Program, and several other inter-laboratory round-robin exercises, to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also on comparative data between LANL and peer groups for Pu assay and isotopic analysis.

  11. Looking ahead in systems engineering

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Donald S.

    1966-01-01

    Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.

  12. PROGRAM ASTEC (ADVANCED SOLAR TURBO ELECTRIC CONCEPT). PART IV. SOLAR COLLECTOR DEVELOPMENT SUPPORT TASKS. VOL. VI. DEVELOPMENT OF ANALYTICAL TECHNIQUES TO PREDICT THE STRUCTURAL BEHAVIOR OF PETAL-TYPE SOLAR COLLECTORS.

    DTIC Science & Technology

    The design of large petal-type paraboloidal solar collectors for the ASTEC Program requires a capability for determining the distortion and stress...analysis of a parabolic curved beam is given along with a numerical solution and digital program. The dynamic response of the ASTEC flight-test vehicle is discussed on the basis of modal analysis.

  13. User's guide and description of the streamline divergence computer program. [turbulent convective heat transfer

    NASA Technical Reports Server (NTRS)

    Sulyma, P. R.; Mcanally, J. V.

    1975-01-01

    The streamline divergence program was developed to demonstrate the capability to trace inviscid surface streamlines and to calculate outflow-corrected laminar and turbulent convective heating rates on surfaces subjected to exhaust plume impingement. The analytical techniques used in formulating this program are discussed. A brief description of the streamline divergence program is given along with a user's guide. The program input and output for a sample case are also presented.

  14. A fundamental approach to adhesion: Synthesis, surface analysis, thermodynamics and mechanics

    NASA Technical Reports Server (NTRS)

    Chen, W.; Wightman, J. P.

    1979-01-01

Adherend surfaces and fractography were studied using electron spectroscopy for chemical analysis and scanning electron microscopy/energy dispersive analysis of X-rays. In addition, Auger electron spectroscopy with depth profiling capability was used. It is shown that contamination of adhesion systems plays an important role not only in determining initial bond strengths but also in the durability of adhesive bonds. It is concluded that the analytical techniques used are capable of characterizing and monitoring such contamination.

  15. Single-indicator-based Multidimensional Sensing: Detection and Identification of Heavy Metal Ions and Understanding the Foundations from Experiment to Simulation

    PubMed Central

    Leng, Yumin; Qian, Sihua; Wang, Yuhui; Lu, Cheng; Ji, Xiaoxu; Lu, Zhiwen; Lin, Hengwei

    2016-01-01

Multidimensional sensing offers advantages in accuracy, diversity and capability for the simultaneous detection and discrimination of multiple analytes; however, previous reports usually require a complicated synthesis/fabrication process and/or a variety of techniques (or instruments) to acquire signals. Therefore, to take full advantage of this concept, simple designs are highly desirable. Herein, a novel concept is conceived to construct multidimensional sensing platforms based on a single indicator that is capable of showing diverse color/fluorescence responses upon the addition of different analytes. By extracting hidden information from these responses, such as red, green and blue (RGB) alterations, a triple-channel-based multidimensional sensing platform can consequently be fabricated, and the RGB alterations are further amenable to standard statistical methods. As a proof-of-concept study, a triple-channel sensing platform is fabricated solely using dithizone, with the assistance of cetyltrimethylammonium bromide (CTAB) for hyperchromicity and sensitization, which demonstrates superior capabilities in the detection and identification of ten common heavy metal ions at the standard concentrations of China's wastewater-discharge limits. Moreover, this sensing platform also shows promise for semi-quantitative and even quantitative analysis of individual heavy metal ions with high sensitivity. Finally, density functional theory calculations are performed to reveal the foundations of this analysis. PMID:27146105
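    The RGB-alteration idea above can be sketched numerically: compute the change in mean red, green and blue channel values of the indicator relative to a blank, giving a three-component response vector per analyte, then match that vector against a reference library. The sketch below is a minimal illustration with synthetic arrays; the ion names, reference vectors and nearest-neighbour matching step are illustrative assumptions, not the statistical treatment used in the paper.

```python
import numpy as np

def rgb_response(image, blank):
    # Mean per-channel alteration (delta-R, delta-G, delta-B) of an
    # H x W x 3 image relative to the blank indicator image.
    return image.reshape(-1, 3).mean(axis=0) - blank.reshape(-1, 3).mean(axis=0)

def identify(response, library):
    # Nearest-neighbour match of the response vector against a library
    # of reference delta-RGB vectors (one per analyte).
    return min(library, key=lambda ion: np.linalg.norm(response - library[ion]))

blank = np.full((4, 4, 3), 120.0)          # synthetic blank indicator image
library = {                                 # hypothetical reference vectors
    "Hg2+": np.array([35.0, -10.0, 5.0]),
    "Pb2+": np.array([-20.0, 25.0, 0.0]),
    "Cd2+": np.array([5.0, 5.0, -30.0]),
}
sample = blank + np.array([33.0, -8.0, 6.0])  # synthetic measurement
print(identify(rgb_response(sample, blank), library))
```

    In practice the response library would be built from calibration measurements, and a multivariate method such as linear discriminant analysis would replace the simple distance match.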

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parrado, G., E-mail: gparrado@sgc.gov.co; Cañón, Y.; Peña, M., E-mail: mlpena@sgc.gov.co

The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first round of the 2015 Wepal proficiency test are presented. Only elements with radioactive isotopes with medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal, based on Z-score distributions, showed that most results had |Z-scores| ≤ 3.
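    The Z-score criterion used in such proficiency tests is simple to state: z = (x_lab - x_assigned) / sigma_p, where sigma_p is the standard deviation for proficiency assessment. A minimal sketch follows; the elements, assigned values and laboratory results are hypothetical, not Wepal data.

```python
# Proficiency-test Z-score: z = (x_lab - x_assigned) / sigma_p.
def z_score(lab_value, assigned_value, sigma_p):
    return (lab_value - assigned_value) / sigma_p

def performance(z):
    # Conventional interpretation: |z| <= 2 satisfactory,
    # 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory.
    if abs(z) <= 2:
        return "satisfactory"
    return "questionable" if abs(z) < 3 else "unsatisfactory"

# Hypothetical soil results: (lab value, assigned value, sigma_p) in mg/kg.
results = {"As": (12.1, 11.5, 0.9), "Fe": (30500, 31000, 1500), "Zn": (88.0, 95.0, 2.0)}
for element, (lab, assigned, sigma) in results.items():
    z = z_score(lab, assigned, sigma)
    print(f"{element}: z = {z:+.2f} -> {performance(z)}")
```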

  17. SFC-MS/MS as an orthogonal technique for improved screening of polar analytes in anti-doping control.

    PubMed

    Parr, Maria Kristina; Wuest, Bernhard; Naegele, Edgar; Joseph, Jan F; Wenzel, Maxi; Schmidt, Alexander H; Stanic, Mijo; de la Torre, Xavier; Botrè, Francesco

    2016-09-01

HPLC is considered the method of choice for the separation of various classes of drugs. However, some analytes are still challenging, as HPLC shows limited resolution capabilities for highly polar analytes, which interact insufficiently with conventional reversed-phase (RP) columns. Especially in combination with mass spectrometric detection, limitations apply to alterations of stationary phases. Some highly polar sympathomimetic drugs and their metabolites showed almost no retention on different RP columns. Their retention remains poor even on phenylhexyl phases that offer different selectivity due to π-π interactions. Supercritical fluid chromatography (SFC), as a separation technique orthogonal to HPLC, may help to overcome these issues. Selected polar drugs and metabolites were analyzed utilizing SFC separation. All compounds showed sharp peaks and good retention, even the very polar analytes such as sulfoconjugates. Retention times and elution orders in SFC differ from both RP and HILIC separations as a result of this orthogonality. Short cycle times could be realized. As temperature and pressure strongly influence the polarity of supercritical fluids, precise regulation of temperature and backpressure is required for retention-time stability. As CO2 is the main constituent of the mobile phase in SFC, solvent consumption and solvent waste are considerably reduced. Graphical Abstract: SFC-MS/MS vs. LC-MS/MS.

  18. Analytical and Radiochemistry for Nuclear Forensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, Robert Ernest; Dry, Donald E.; Kinman, William Scott

    Information about nonproliferation nuclear forensics, activities in forensics at Los Alamos National Laboratory, radio analytical work at LANL, radiochemical characterization capabilities, bulk chemical and materials analysis capabilities, and future interests in forensics interactions.

  19. Analytical electron microscopy in the study of biological systems.

    PubMed

    Johnson, D E

    1986-01-01

The AEM is a powerful tool in biological research, capable of providing information simply not available by other means. The use of a field emission STEM for this application can lead to a significant improvement in spatial resolution, in most cases limited now by the quality of the specimen preparation but perhaps ultimately limited by the effects of radiation damage. Increased elemental sensitivity is at least possible in selected cases with electron energy-loss spectrometry, but fundamental aspects of ELS will probably confine its role to that of a limited complement to EDS. The considerable margin for improvement in sensitivity of the basic analytical technique means that the search for technological improvement will continue. Fortunately, however, current technology can also continue to answer important biological questions.

  20. Aerodynamic parameter studies and sensitivity analysis for rotor blades in axial flight

    NASA Technical Reports Server (NTRS)

    Chiu, Y. Danny; Peters, David A.

    1991-01-01

The analytical capability is offered for aerodynamic parametric studies and sensitivity analyses of rotary wings in axial flight by using a 3-D undistorted wake model in curved lifting line theory. The governing equations are solved by both the Multhopp interpolation technique and the vortex lattice method. The singularity from the bound vortices is eliminated through Hadamard's finite part concept. Good numerical agreement between both analytical methods and finite difference methods is found. Parametric studies were made to assess the effects of several shape variables on aerodynamic loads. It is found, e.g., that a rotor blade with out-of-plane and inplane curvature can theoretically increase lift in the inboard and outboard regions, respectively, without introducing additional induced drag.

  1. Two-dimensional convolute integers for analytical instrumentation

    NASA Technical Reports Server (NTRS)

    Edwards, T. R.

    1982-01-01

As new analytical instruments and techniques emerge with increased dimensionality, a corresponding need is seen for data processing logic which can appropriately address the data. Two-dimensional measurements reveal enhanced unknown-mixture analysis capability as a result of their greater spectral information content over two one-dimensional methods taken separately. It is noted that two-dimensional convolute integers are merely an extension of the work by Savitzky and Golay (1964). It is shown that these low-pass, high-pass and band-pass digital filters are truly two-dimensional and that they can be applied in a manner identical with their one-dimensional counterpart, that is, as a weighted nearest-neighbor moving average with zero phase shift, using convolute-integer (universal number) weighting coefficients.
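    The zero-phase, weighted nearest-neighbor moving average described above can be sketched with a separable 2-D kernel built as the outer product of 1-D Savitzky-Golay coefficients. This separable construction is an illustrative stand-in, not the exact two-dimensional convolute-integer tables of the paper.

```python
import numpy as np
from scipy.signal import savgol_coeffs
from scipy.ndimage import convolve

# 1-D Savitzky-Golay smoothing weights (window 5, quadratic fit),
# combined into a 2-D low-pass kernel by an outer product.
w = savgol_coeffs(5, 2)
kernel = np.outer(w, w)  # 5x5 zero-phase smoothing kernel; weights sum to 1

# Synthetic two-dimensional "spectrum" with additive noise.
signal = np.fromfunction(lambda i, j: np.sin(i / 5.0) + np.cos(j / 7.0), (64, 64))
noisy = signal + 0.2 * np.random.default_rng(0).standard_normal((64, 64))

# Weighted nearest-neighbor moving average with zero phase shift.
smoothed = convolve(noisy, kernel, mode="nearest")
```

    Because the kernel is symmetric, the filter introduces no phase shift, and the quadratic fit preserves smooth peak shapes while attenuating noise.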

  2. A State-of-the-Art Contamination Effects Research and Test Facility

    NASA Technical Reports Server (NTRS)

    Olson, Keith R.; Folgner, Kelsey A.; Barrie, James D.; Villahermosa, Randy M.

    2008-01-01

In the ongoing effort to better understand various spacecraft contamination phenomena, a new state-of-the-art contamination effects research and test facility was designed and recently brought on-line at The Aerospace Corporation's Space Materials Laboratory. This high-vacuum test chamber employs multiple in-situ analytical techniques, making it possible to study both the qualitative and quantitative aspects of contaminant film formation in the presence or absence of VUV radiation. Adsorption and desorption kinetics, "photo-fixing efficiency", transmission loss of uniform contaminant films, light scatter from non-uniform films, and film morphology have been studied in this facility. This paper describes this new capability in detail and presents data collected from several of the analytical instruments.

  3. Development and Application of a Fast Chromatography Technique for Analysis of Biogenic Volatile Organic Compounds in Plant Emissions

    NASA Astrophysics Data System (ADS)

    Jones, C. E.; Kato, S.; Nakashima, Y.; Yamazakii, S.; Kajii, Y. J.

    2011-12-01

Biogenic volatile organic compounds (BVOCs) emitted from vegetation constitute the largest fraction (>90%) of total global non-methane VOC supplied to the atmosphere, yet the chemical complexity of these emissions means that achieving comprehensive measurements of BVOCs, and in particular the less volatile terpenes, is not straightforward. As such, there is still significant uncertainty associated with the contribution of BVOCs to the tropospheric oxidation budget, and to atmospheric secondary organic aerosol (SOA) formation. The rate of BVOC emission from vegetation is regulated by environmental conditions such as light intensity and temperature, and thus can be highly variable, necessitating high time-resolution BVOC measurements. In addition, the numerous monoterpene and sesquiterpene isomers, which are indistinguishable by some analytical techniques, have greatly varying lifetimes with respect to atmospheric oxidants, and as such quantification of each individual isomer is fundamental to achieving a comprehensive characterisation of the impact of BVOCs upon the atmospheric oxidation capacity. However, established measurement techniques for these trace gases typically offer a trade-off between sample frequency and the level of speciation; detailed information regarding chemical composition may be obtained, but with reduced time resolution, or vice versa. We have developed a Fast-GC-FID technique for quantification of a range of monoterpene, sesquiterpene and oxygenated C10 BVOC isomers, which retains the separation capability of conventional gas chromatography, yet offers considerably improved sample frequency. Development of this system is ongoing, but currently a 20 m x 0.18 mm i.d. resistively heated metal column is employed to achieve chromatographic separation of thirteen C10-C15 BVOCs within a total cycle time of ~15 minutes. We present the instrument specifications and analytical capability, together with the first application of this Fast-GC technique for BVOC analysis, monitoring BVOC emissions from white spruce (Picea glauca) during plant chamber studies.

  4. Analytical capabilities and services of Lawrence Livermore Laboratory's General Chemistry Division. [Methods available at Lawrence Livermore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutmacher, R.; Crawford, R.

    This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.

  5. Actionable data analytics in oncology: are we there yet?

    PubMed

    Barkley, Ronald; Greenapple, Rhonda; Whang, John

    2014-03-01

To operate under a new value-based paradigm, oncology providers must develop the capability to aggregate, analyze, measure, and report their value proposition--that is, their outcomes and associated costs. How are oncology providers currently positioned to perform these functions in a manner that is actionable? What is the current state of analytic capabilities in oncology? Are oncology providers prepared? This line of inquiry was the basis for the 2013 Cancer Center Business Summit annual industry research survey. This article reports on the key findings and implications of the 2013 research survey with regard to data analytic capabilities in the oncology sector. The essential finding from the study is that only a small number of oncology providers (7%) currently possess the analytic tools and capabilities necessary to satisfy internal and external demands for aggregating and reporting clinical outcome and economic data. However, there is an expectation that a majority of oncology providers (60%) will have developed such capabilities within the next 2 years.

  6. Automated multi-radionuclide separation and analysis with combined detection capability

    NASA Astrophysics Data System (ADS)

    Plionis, Alexander Asterios

The radiological dispersal device (RDD) is a weapon of great concern to those agencies responsible for protecting the public from the modern age of terrorism. In order to effectively respond to an RDD event, these agencies need to possess the capability to rapidly identify the radiological agents involved in the incident and assess the uptake of each individual victim. Since medical treatment for internal radiation poisoning is radionuclide-specific, it is critical to identify and quantify the radiological uptake of each individual victim. This dissertation describes the development of automated analytical components that could be used to determine and quantify multiple radionuclides in human urine bioassays. This is accomplished through the use of extraction chromatography that is plumbed in-line with one of a variety of detection instruments. Flow scintillation analysis is used for 90Sr and 210Po determination, flow gamma analysis is used to assess 60Co and 137Cs, and inductively coupled plasma mass spectrometry is used to determine actinides. Detection limits for these analytes were determined for the appropriate technique and related to their implications for health physics.
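    Detection limits of the kind mentioned above are commonly estimated from blank count statistics. The sketch below uses Currie's widely cited decision/detection formulas at roughly 95% confidence; the blank counts, counting efficiency and live time are hypothetical values for illustration, not figures from this dissertation.

```python
import math

# Currie's formulas for a counting measurement with a paired blank:
# critical level Lc (decision threshold) and detection limit Ld, in counts.
def currie_limits(blank_counts):
    lc = 2.33 * math.sqrt(blank_counts)
    ld = 2.71 + 4.65 * math.sqrt(blank_counts)
    return lc, ld

# Convert a detection limit in counts to a minimum detectable activity (Bq)
# for a given counting efficiency and live time; all values hypothetical.
def minimum_detectable_activity(blank_counts, efficiency, live_time_s):
    _, ld = currie_limits(blank_counts)
    return ld / (efficiency * live_time_s)

lc, ld = currie_limits(100.0)
print(f"Lc = {lc:.1f} counts, Ld = {ld:.1f} counts")
print(f"MDA = {minimum_detectable_activity(100.0, 0.25, 3600):.5f} Bq")
```

    Real bioassay work would also fold in chemical recovery, decay corrections and branching ratios, which are omitted here.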

  7. Molecular imaging of cannabis leaf tissue with MeV-SIMS method

    NASA Astrophysics Data System (ADS)

    Jenčič, Boštjan; Jeromel, Luka; Ogrinc Potočnik, Nina; Vogel-Mikuš, Katarina; Kovačec, Eva; Regvar, Marjana; Siketić, Zdravko; Vavpetič, Primož; Rupnik, Zdravko; Bučar, Klemen; Kelemen, Mitja; Kovač, Janez; Pelicon, Primož

    2016-03-01

To broaden our analytical capabilities with molecular imaging in addition to the existing elemental imaging with micro-PIXE, a linear Time-Of-Flight mass spectrometer for MeV Secondary Ion Mass Spectrometry (MeV-SIMS) was constructed and added to the existing nuclear microprobe at the Jožef Stefan Institute. We measured absolute molecular yields and damage cross-sections of reference materials, without significant alteration of the fragile biological samples during the measurements in mapping mode. We explored the analytical capability of the MeV-SIMS technique for chemical mapping of the plant tissue of medicinal cannabis leaves. A series of hand-cut plant tissue slices were prepared by a standard shock-freezing and freeze-drying protocol and deposited on a Si wafer. We present the measured MeV-SIMS spectra, which show a series of peaks in the mass region of cannabinoids, together with their corresponding maps. The molecular distributions at masses of 345.5 u and 359.4 u may be attributed to the protonated THCA and THCA-C4 acids, and show enhancement in areas with opened trichome morphology.

  8. Portable laser-induced breakdown spectroscopy/diffuse reflectance hybrid spectrometer for analysis of inorganic pigments

    NASA Astrophysics Data System (ADS)

    Siozos, Panagiotis; Philippidis, Aggelos; Anglos, Demetrios

    2017-11-01

    A novel, portable spectrometer, combining two analytical techniques, laser-induced breakdown spectroscopy (LIBS) and diffuse reflectance spectroscopy, was developed with the aim to provide an enhanced instrumental and methodological approach with regard to the analysis of pigments in objects of cultural heritage. Technical details about the hybrid spectrometer and its operation are presented and examples are given relevant to the analysis of paint materials. Both LIBS and diffuse reflectance spectra in the visible and part of the near infrared, corresponding to several neat mineral pigment samples, were recorded and the complementary information was used to effectively distinguish different types of pigments even if they had similar colour or elemental composition. The spectrometer was also employed in the analysis of different paints on the surface of an ancient pottery sherd demonstrating the capabilities of the proposed hybrid diagnostic approach. Despite its instrumental simplicity and compact size, the spectrometer is capable of supporting analytical campaigns relevant to archaeological, historical or art historical investigations, particularly when quick data acquisition is required in the context of surveys of large numbers of objects and samples.

  9. An Analytical Approach for Performance Enhancement of FSO Communication System Using Array of Receivers in Adverse Weather Conditions

    NASA Astrophysics Data System (ADS)

    Nagpal, Shaina; Gupta, Amit

    2017-08-01

Free Space Optics (FSO) links exploit tremendous network capacity and are capable of offering wireless communications similar to communications through optical fibres. However, an FSO link is extremely weather dependent, and the major effect on FSO links is due to adverse weather conditions like fog and snow. In this paper, an FSO link is designed using an array of receivers. The behaviour of the link under the very high attenuation caused by fog and snow is analysed using the aperture averaging technique. The effect of aperture averaging is further investigated by comparing systems that use the technique with systems that do not. The performance of the proposed FSO link model has been evaluated in terms of Q factor, bit error rate (BER) and eye diagram.
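    The Q factor and BER figures of merit used above are linked, for an on-off-keyed link with Gaussian noise, by the standard relation BER = ½ erfc(Q/√2). The sketch below evaluates it; the Q values are purely illustrative, not results from this paper.

```python
import math

def ber_from_q(q):
    # Standard OOK relation: BER = 0.5 * erfc(Q / sqrt(2)).
    return 0.5 * math.erfc(q / math.sqrt(2))

# Illustrative Q factors, e.g. clear air vs. heavy attenuation (hypothetical).
for q in (6.0, 3.0):
    print(f"Q = {q}: BER = {ber_from_q(q):.2e}")
```

    The familiar rule of thumb that Q ≈ 6 corresponds to a BER near 1e-9 follows directly from this relation.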

  10. Influence versus intent for predictive analytics in situation awareness

    NASA Astrophysics Data System (ADS)

    Cui, Biru; Yang, Shanchieh J.; Kadar, Ivan

    2013-05-01

    Predictive analytics in situation awareness requires an element to comprehend and anticipate potential adversary activities that might occur in the future. Most work in high level fusion or predictive analytics utilizes machine learning, pattern mining, Bayesian inference, and decision tree techniques to predict future actions or states. The emergence of social computing in broader contexts has drawn interests in bringing the hypotheses and techniques from social theory to algorithmic and computational settings for predictive analytics. This paper aims at answering the question on how influence and attitude (some interpreted such as intent) of adversarial actors can be formulated and computed algorithmically, as a higher level fusion process to provide predictions of future actions. The challenges in this interdisciplinary endeavor include drawing existing understanding of influence and attitude in both social science and computing fields, as well as the mathematical and computational formulation for the specific context of situation to be analyzed. The study of `influence' has resurfaced in recent years due to the emergence of social networks in the virtualized cyber world. Theoretical analysis and techniques developed in this area are discussed in this paper in the context of predictive analysis. Meanwhile, the notion of intent, or `attitude' using social theory terminologies, is a relatively uncharted area in the computing field. Note that a key objective of predictive analytics is to identify impending/planned attacks so their `impact' and `threat' can be prevented. In this spirit, indirect and direct observables are drawn and derived to infer the influence network and attitude to predict future threats. This work proposes an integrated framework that jointly assesses adversarial actors' influence network and their attitudes as a function of past actions and action outcomes. 
A preliminary set of algorithms is developed and tested using the Global Terrorism Database (GTD). Our results reveal the benefits of performing joint predictive analytics with both attitude and influence. At the same time, we discover significant challenges in deriving influence and attitude from indirect observables for diverse adversarial behavior. These observations warrant further investigation of the optimal use of influence and attitude for predictive analytics, as well as the potential inclusion of other environmental or capability elements for the actors.

  11. Value of Earth Observations: Key principles and techniques of socioeconomic benefits analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Macauley, M.; Bernknopf, R.

    2013-12-01

    Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples on the uses of Earth observations across a range of sectors can be complemented by the quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and social and economic sciences fields to build capacity, develop case studies, and refine analytic techniques in quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research, and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring socioeconomic impacts of Earth observations and how the measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.

  12. The convergence study of the homotopy analysis method for solving nonlinear Volterra-Fredholm integrodifferential equations.

    PubMed

    Ghanbari, Behzad

    2014-01-01

We aim to study the convergence of the homotopy analysis method (HAM for short) for solving special nonlinear Volterra-Fredholm integrodifferential equations. The sufficient condition for the convergence of the method is briefly addressed. Some illustrative examples are also presented to demonstrate the validity and applicability of the technique. Comparison of the results obtained by HAM with the exact solution shows that the method is reliable and capable of providing an analytic treatment for solving such equations.
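    For context, HAM constructs a continuous deformation from an initial guess u_0(t) to the solution through a zeroth-order deformation equation. The form below is the standard one from the HAM literature (with auxiliary linear operator L, nonlinear operator N, convergence-control parameter h-bar, and embedding parameter q), not an equation reproduced from this paper:

```latex
(1-q)\,\mathcal{L}\big[\phi(t;q) - u_0(t)\big] \;=\; q\,\hbar\,\mathcal{N}\big[\phi(t;q)\big],
\qquad q \in [0,1],
```

    so that phi(t;0) = u_0(t) and phi(t;1) solves N[u] = 0; the convergence conditions studied in such papers constrain the admissible choices of the parameter h-bar.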

  13. Ion Beam Analysis of Diffusion in Diamondlike Carbon Films

    NASA Astrophysics Data System (ADS)

    Chaffee, Kevin Paul

The Van de Graaff accelerator facility at Case Western Reserve University was developed into an analytical research center capable of performing Rutherford Backscattering Spectrometry, Elastic Recoil Detection Analysis for hydrogen profiling, Proton Enhanced Scattering, and 4He resonant scattering for 16O profiling. These techniques were applied to the study of Au, Na+, Cs+, and H2O diffusion in a-C:H films. The results are consistent with the fully constrained network model of the microstructure as described by Angus and Jansen.

  14. Current and future technology in radial and axial gas turbines

    NASA Technical Reports Server (NTRS)

    Rohlik, H. E.

    1983-01-01

    Design approaches and flow analysis techniques currently employed by aircraft engine manufacturers are assessed. Studies were performed to define the characteristics of aircraft and engines for civil missions of the 1990's and beyond. These studies, coupled with experience in recent years, identified the critical technologies needed to meet long range goals in fuel economy and other operating costs. Study results, recent and current research and development programs, and an estimate of future design and analytic capabilities are discussed.

  15. Review of design and operational characteristics of the 0.3-meter transonic cryogenic tunnel

    NASA Technical Reports Server (NTRS)

    Ray, E. J.; Ladson, C. L.; Adcock, J. B.; Lawing, P. L.; Hall, R. M.

    1979-01-01

    The fundamentals of cryogenic testing are validated both analytically and experimentally employing the 0.3-m transonic cryogenic tunnel. The tunnel with its unique Reynolds number capability has been used for a wide variety of aerodynamic tests. Techniques regarding real-gas effects have been developed and cryogenic tunnel conditions are set and maintained accurately. It is shown that cryogenic cooling, by injecting nitrogen directly into the tunnel circuit, imposes no problems with temperature distribution or dynamic response characteristics.

  16. A Monte Carlo-finite element model for strain energy controlled microstructural evolution - 'Rafting' in superalloys

    NASA Technical Reports Server (NTRS)

    Gayda, J.; Srolovitz, D. J.

    1989-01-01

    This paper presents a specialized microstructural lattice model, MCFET (Monte Carlo finite element technique), which simulates microstructural evolution in materials in which strain energy has an important role in determining morphology. The model is capable of accounting for externally applied stress, surface tension, misfit, elastic inhomogeneity, elastic anisotropy, and arbitrary temperatures. The MCFET analysis was found to compare well with the results of analytical calculations of the equilibrium morphologies of isolated particles in an infinite matrix.

  17. Scattering of Lamb waves in a composite plate

    NASA Technical Reports Server (NTRS)

    Bratton, Robert; Datta, Subhendu; Shah, Arvind

    1991-01-01

A combined analytical and finite element technique is developed to gain a better understanding of the scattering of elastic waves by defects. This hybrid method is capable of predicting scattered displacements from arbitrarily shaped defects as well as inclusions of different materials. The continuity of traction and displacements at the boundaries of the two regions provided the necessary equations to find the nodal displacements and expansion coefficients. Results clearly illustrate the influence of increasing crack depth on the scattered signal.

  18. Underground Mining Method Selection Using WPM and PROMETHEE

    NASA Astrophysics Data System (ADS)

    Balusa, Bhanu Chander; Singam, Jayanthu

    2018-04-01

    The aim of this paper is to present a solution to the problem of selecting a suitable underground mining method for the mining industry. This is achieved using two multi-attribute decision-making techniques: the weighted product method (WPM) and the preference ranking organization method for enrichment evaluation (PROMETHEE). The analytic hierarchy process is used to calculate the weights of the attributes (i.e. the parameters considered in this paper). Mining method selection depends on physical, mechanical, economical, and technical parameters. WPM and PROMETHEE can account for the relationships between the parameters and the mining methods. The proposed techniques offer higher accuracy and faster computation than other decision-making techniques. They are applied to determine an effective mining method for a bauxite mine, and their results are compared with the methods used in earlier research works. The results show that the conventional cut-and-fill method is the most suitable mining method.
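The weighted product method itself is compact enough to sketch. The ratings, method labels, and weights below are illustrative placeholders, not values from the study:

```python
def wpm_scores(matrix, weights):
    """Weighted product method: each alternative's score is the product
    of its max-normalised attribute ratings raised to the corresponding
    attribute weight (all attributes treated as benefit criteria)."""
    n_attr = len(weights)
    col_max = [max(row[k] for row in matrix) for k in range(n_attr)]
    scores = []
    for row in matrix:
        s = 1.0
        for k, w in enumerate(weights):
            s *= (row[k] / col_max[k]) ** w
        scores.append(s)
    return scores

# Illustrative ratings for three hypothetical mining methods against
# three attributes, with AHP-style weights summing to one.
ratings = [[9, 7, 8],   # e.g. cut and fill
           [6, 8, 5],   # e.g. sublevel stoping
           [7, 6, 9]]   # e.g. block caving
weights = [0.5, 0.3, 0.2]
best = max(range(len(ratings)), key=lambda i: wpm_scores(ratings, weights)[i])
```

In a real application the AHP step would supply the weights and the ratings would come from expert scoring of each candidate method.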

  19. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    USGS Publications Warehouse

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.
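The overlay operation mentioned above can be shown in miniature, assuming two Boolean raster layers of equal size (hypothetical data, not USGS layers):

```python
def overlay_and(raster_a, raster_b):
    """Cell-by-cell Boolean overlay of two equally sized raster layers:
    a cell is kept only where both input layers are True (e.g. 'within
    the floodplain' AND 'classified as urban')."""
    return [[a and b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(raster_a, raster_b)]
```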

  20. Theoretical considerations on the optogalvanic detection of laser induced fluorescence in atmospheric pressure atomizers

    NASA Astrophysics Data System (ADS)

    Omenetto, N.; Smith, B. W.; Winefordner, J. D.

    1989-01-01

    Several theoretical considerations are given on the potential and practical capabilities of a detector of fluorescence radiation whose operating principle is based on a multi-step excitation-ionization scheme involving the fluorescence photons as the first excitation step. This detection technique, first proposed by Matveev et al. [Zh. Anal. Khim. 34, 846 (1979)], combines two independent atomizers: one analytical cell for the excitation of the sample fluorescence and one cell, filled with pure analyte atomic vapor, acting as the ionization detector. One laser beam excites the analyte fluorescence in the analytical cell and one (or two) laser beams are used to ionize the excited atoms in the detector. Several different sources of signal and noise are evaluated, together with a discussion of possible analytical atom reservoirs (flames, furnaces) and laser sources which could be used with this approach. For properly devised conditions, i.e. optical saturation of the fluorescence and unity ionization efficiency, detection limits well below pg/ml in solution and well below femtograms as absolute amounts in furnaces can be predicted. However, scattering problems, which are absent in a conventional laser-enhanced ionization set-up, may be important in this approach.

  1. Recent Advances in Bioprinting and Applications for Biosensing

    PubMed Central

    Dias, Andrew D.; Kingsley, David M.; Corr, David T.

    2014-01-01

    Future biosensing applications will require high performance, including real-time monitoring of physiological events, incorporation of biosensors into feedback-based devices, detection of toxins, and advanced diagnostics. Such functionality will necessitate biosensors with increased sensitivity, specificity, and throughput, as well as the ability to simultaneously detect multiple analytes. While these demands have yet to be fully realized, recent advances in biofabrication may allow sensors to achieve the high spatial sensitivity required, and bring us closer to achieving devices with these capabilities. To this end, we review recent advances in biofabrication techniques that may enable cutting-edge biosensors. In particular, we focus on bioprinting techniques (e.g., microcontact printing, inkjet printing, and laser direct-write) that may prove pivotal to biosensor fabrication and scaling. Recent biosensors have employed these fabrication techniques with success, and further development may enable higher performance, including multiplexing multiple analytes or cell types within a single biosensor. We also review recent advances in 3D bioprinting, and explore their potential to create biosensors with live cells encapsulated in 3D microenvironments. Such advances in biofabrication will expand biosensor utility and availability, with impact realized in many interdisciplinary fields, as well as in the clinic. PMID:25587413

  2. Laboratory Instruments Available to Support Space Station Researchers at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Panda, Binayak; Gorti, Sridhar

    2013-01-01

    A number of research instruments are available at NASA's Marshall Space Flight Center (MSFC) to support ISS researchers and their investigations. These modern analytical tools yield valuable, and sometimes new, information from sample characterization. Instruments include modern scanning electron microscopes equipped with field emission guns providing analytical capabilities that include angstrom-level image resolution of dry, wet, and biological samples. These microscopes are also equipped with silicon drift X-ray detectors (SDD) for fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations in crystalline alloys. Sample chambers admit large samples, provide variable pressures for wet samples, and include quantitative analysis software to determine phase relations. Advances in solid-state electronics have also facilitated improvements in surface chemical analysis, successfully employed to analyze metallic materials and alloys, ceramics, slags, and organic polymers. Another analytical capability at MSFC is a magnetic sector Secondary Ion Mass Spectrometry (SIMS) instrument that quantitatively determines and maps light elements such as hydrogen, lithium, and boron along with their isotopes, and identifies and quantifies very low-level impurities even at parts per billion (ppb) levels. Still other methods available at MSFC include X-ray photo-electron spectroscopy (XPS), which can determine oxidation states of elements as well as identify polymers and measure film thicknesses on coated materials, and scanning Auger electron spectroscopy (SAM), which combines surface sensitivity, lateral spatial resolution (approximately 20 nm), and depth profiling capabilities to describe elemental compositions in near-surface regions and even the chemical state of analyzed atoms. A conventional Transmission Electron Microscope (TEM) for observing internal microstructures at very high magnifications and an Electron Probe Micro-analyzer (EPMA) for very precise microanalysis are available as needed by the researcher. Space Station researchers are invited to work with MSFC in analyzing their samples using these techniques.

  3. Can cloud point-based enrichment, preservation, and detection methods help to bridge gaps in aquatic nanometrology?

    PubMed

    Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan

    2016-11-01

    Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances; size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: May CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analyses and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes in the environmental conditions during sampling and sample preparation. This presents a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these unstable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and on available applications, via the uncertainties visible and modeling approaches available, with potential future benefits from CPE protocols.

  4. Generalized dynamic engine simulation techniques for the digital computer

    NASA Technical Reports Server (NTRS)

    Sellers, J.; Teren, F.

    1974-01-01

    Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design-point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar all-digital programs on future engine simulation philosophy is also discussed.

  6. Generalized dynamic engine simulation techniques for the digital computers

    NASA Technical Reports Server (NTRS)

    Sellers, J.; Teren, F.

    1975-01-01

    Recently advanced simulation techniques have been developed for the digital computer and used as the basis for development of a generalized dynamic engine simulation computer program, called DYNGEN. This computer program can analyze the steady state and dynamic performance of many kinds of aircraft gas turbine engines. Without changes to the basic program, DYNGEN can analyze one- or two-spool turbofan engines. The user must supply appropriate component performance maps and design point information. Examples are presented to illustrate the capabilities of DYNGEN in the steady state and dynamic modes of operation. The analytical techniques used in DYNGEN are briefly discussed, and its accuracy is compared with a comparable simulation using the hybrid computer. The impact of DYNGEN and similar digital programs on future engine simulation philosophy is also discussed.

  7. Facile hyphenation of gas chromatography and a microcantilever array sensor for enhanced selectivity.

    PubMed

    Chapman, Peter J; Vogt, Frank; Dutta, Pampa; Datskos, Panos G; Devault, Gerald L; Sepaniak, Michael J

    2007-01-01

    The very simple coupling of a standard, packed-column gas chromatograph with a microcantilever array (MCA) is demonstrated for enhanced selectivity and potential analyte identification in the analysis of volatile organic compounds (VOCs). The cantilevers in MCAs are differentially coated on one side with responsive phases (RPs) and produce bending responses of the cantilevers due to analyte-induced surface stresses. Generally, individual components are difficult to elucidate when introduced to MCA systems as mixtures, although pattern recognition techniques are helpful in identifying single components, binary mixtures, or composite responses of distinct mixtures (e.g., fragrances). In the present work, simple test VOC mixtures composed of acetone, ethanol, and trichloroethylene (TCE) in pentane and methanol and acetonitrile in pentane are first separated using a standard gas chromatograph and then introduced into a MCA flow cell. Significant amounts of response diversity to the analytes in the mixtures are demonstrated across the RP-coated cantilevers of the array. Principal component analysis is used to demonstrate that only three components of a four-component VOC mixture could be identified without mixture separation. Calibration studies are performed, demonstrating a good linear response over 2 orders of magnitude for each component in the primary study mixture. Studies of operational parameters including column temperature, column flow rate, and array cell temperature are conducted. Reproducibility studies of VOC peak areas and peak heights are also carried out showing RSDs of less than 4 and 3%, respectively, for intra-assay studies. Of practical significance is the facile manner by which the hyphenation of a mature separation technique and the burgeoning sensing approach is accomplished, and the potential to use pattern recognition techniques with MCAs as a new type of detector for chromatography with analyte-identifying capabilities.
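The calibration studies reported above amount to fitting response versus concentration; a minimal ordinary least-squares sketch (generic, not the authors' processing code):

```python
def fit_line(x, y):
    """Ordinary least-squares fit y ≈ m*x + b, e.g. microcantilever
    bending response versus analyte concentration in a calibration."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    m = sxy / sxx
    b = my - m * mx
    return m, b
```

A linear fit of this kind, checked over the reported two orders of magnitude of concentration, is the usual basis for quantitation with such detectors.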

  8. Synthesis of active controls for flutter suppression on a flight research wing

    NASA Technical Reports Server (NTRS)

    Abel, I.; Perry, B., III; Murrow, H. N.

    1977-01-01

    This paper describes some activities associated with the preliminary design of an active control system for flutter suppression capable of demonstrating a 20% increase in flutter velocity. Results from two control system synthesis techniques are given. One technique uses classical control theory, and the other uses an 'aerodynamic energy method' where control surface rates or displacements are minimized. Analytical methods used to synthesize the control systems and evaluate their performance are described. Some aspects of a program for flight testing the active control system are also given. This program, called DAST (Drones for Aerodynamics and Structural Testing), employs modified drone-type vehicles for flight assessments and validation testing.

  9. Graphene Nanoplatelet-Polymer Chemiresistive Sensor Arrays for the Detection and Discrimination of Chemical Warfare Agent Simulants.

    PubMed

    Wiederoder, Michael S; Nallon, Eric C; Weiss, Matt; McGraw, Shannon K; Schnee, Vincent P; Bright, Collin J; Polcha, Michael P; Paffenroth, Randy; Uzarski, Joshua R

    2017-11-22

    A cross-reactive array of semiselective chemiresistive sensors made of polymer-graphene nanoplatelet (GNP) composite coated electrodes was examined for detection and discrimination of chemical warfare agents (CWA). The arrays employ a set of chemically diverse polymers to generate a unique response signature for multiple CWA simulants and background interferents. The developed sensors' signal remains consistent after repeated exposures to multiple analytes for up to 5 days with a similar signal magnitude across different replicate sensors with the same polymer-GNP coating. An array of 12 sensors each coated with a different polymer-GNP mixture was exposed 100 times to a cycle of single analyte vapors consisting of 5 chemically similar CWA simulants and 8 common background interferents. The collected data was vector normalized to reduce concentration dependency, z-scored to account for baseline drift and signal-to-noise ratio, and Kalman filtered to reduce noise. The processed data was dimensionally reduced with principal component analysis and analyzed with four different machine learning algorithms to evaluate discrimination capabilities. For 5 similarly structured CWA simulants alone 100% classification accuracy was achieved. For all analytes tested 99% classification accuracy was achieved demonstrating the CWA discrimination capabilities of the developed system. The novel sensor fabrication methods and data processing techniques are attractive for development of sensor platforms for discrimination of CWA and other classes of chemical vapors.
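Two of the preprocessing steps named above, vector normalisation and z-scoring, are standard operations and can be sketched directly; this is a generic rendering, not the authors' pipeline:

```python
import math

def vector_normalise(v):
    """Scale a sensor-response vector to unit Euclidean norm,
    reducing concentration dependence as described above."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def z_score(v):
    """Centre and scale a response vector to zero mean and unit
    (population) variance, compensating for baseline drift and
    signal-to-noise differences between exposures."""
    n = len(v)
    mean = sum(v) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in v) / n)
    return [(x - mean) / sd for x in v]
```

The filtered, normalised vectors would then be dimensionally reduced (e.g. by PCA) before classification.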

  10. TomoPhantom, a software package to generate 2D-4D analytical phantoms for CT image reconstruction algorithm benchmarks

    NASA Astrophysics Data System (ADS)

    Kazantsev, Daniil; Pickalov, Valery; Nagella, Srikanth; Pasca, Edoardo; Withers, Philip J.

    2018-01-01

    In the field of computerized tomographic imaging, many novel reconstruction techniques are routinely tested using simplistic numerical phantoms, e.g. the well-known Shepp-Logan phantom. These phantoms cannot sufficiently cover the broad spectrum of applications in CT imaging where, for instance, smooth or piecewise-smooth 3D objects are common. TomoPhantom provides quick access to an external library of modular analytical 2D/3D phantoms with temporal extensions. In TomoPhantom, quite complex phantoms can be built using additive combinations of geometrical objects, such as Gaussians, parabolas, cones, ellipses, rectangles, and volumetric extensions of them. The newly designed phantoms are better suited for benchmarking and testing of different image processing techniques. Specifically, tomographic reconstruction algorithms which employ 2D and 3D scanning geometries can be rigorously analyzed using the software. TomoPhantom also provides the capability of obtaining analytical tomographic projections, which further extends the applicability of the software towards more realistic testing, free from the "inverse crime". All core modules of the package are written in the C-OpenMP language, and wrappers for Python and MATLAB are provided to enable easy access. Due to the C-based multi-threaded implementation, volumetric phantoms of high spatial resolution can be obtained with computational efficiency.
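The additive construction of phantoms from geometrical objects can be illustrated with 2D Gaussians alone; this sketch mimics the idea on a small grid and is not TomoPhantom's actual API:

```python
import math

def gaussian2d(x, y, cx, cy, sx, sy, amp):
    """A single 2D Gaussian object centred at (cx, cy)."""
    return amp * math.exp(-(((x - cx) / sx) ** 2
                            + ((y - cy) / sy) ** 2) / 2.0)

def build_phantom(n, objects):
    """Additive phantom on an n-by-n grid over [-1, 1]^2; `objects`
    is a list of Gaussian parameter tuples (cx, cy, sx, sy, amp)."""
    img = [[0.0] * n for _ in range(n)]
    for i in range(n):
        y = -1.0 + 2.0 * i / (n - 1)
        for j in range(n):
            x = -1.0 + 2.0 * j / (n - 1)
            for (cx, cy, sx, sy, amp) in objects:
                img[i][j] += gaussian2d(x, y, cx, cy, sx, sy, amp)
    return img
```

Other object types (ellipses, cones, rectangles) would be further summands evaluated the same way, which is what makes additive phantom libraries modular.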

  11. Use of a novel cation-exchange restricted-access material for automated sample clean-up prior to the determination of basic drugs in plasma by liquid chromatography.

    PubMed

    Chiap, P; Rbeida, O; Christiaens, B; Hubert, Ph; Lubda, D; Boos, K S; Crommen, J

    2002-10-25

    A new kind of silica-based restricted-access material (RAM) has been tested in pre-columns for the on-line solid-phase extraction (SPE) of basic drugs from directly injected plasma samples before their quantitative analysis by reversed-phase liquid chromatography (LC), using the column switching technique. The outer surface of the porous RAM particles contains hydrophilic diol groups, while sulphonic acid groups are bound to the internal surface, which gives the sorbent the properties of a strong cation exchanger towards low-molecular-mass compounds. Macromolecules such as proteins have no access to the internal surface of the pre-column due to their exclusion from the pores and are flushed directly out. The retention capability of this novel packing material has been tested for some hydrophilic basic drugs, such as atropine, fenoterol, ipratropium, procaine, sotalol, and terbutaline, used as model compounds. The influence of the composition of the washing liquid on the retention of the analytes in the pre-column has been investigated. The elution profiles of the different compounds and the plasma matrix, as well as the time needed for the transfer of the analytes from the pre-column to the analytical column, were determined in order to deduce the most suitable conditions for the clean-up step and develop on-line methods for the LC determination of these compounds in plasma. The cation-exchange sorbent was also compared to another RAM, namely an RP-18 ADS (alkyl diol silica) sorbent, with respect to retention capability towards basic analytes.

  12. The role of atomic fluorescence spectrometry in the automatic environmental monitoring of trace element analysis

    PubMed Central

    Stockwell, P. B.; Corns, W. T.

    1993-01-01

    Considerable attention has been drawn to the environmental levels of mercury, arsenic, selenium and antimony in the last decade. Legislative and environmental pressure has forced levels to be lowered and this has created an additional burden for analytical chemists. Not only does an analysis have to reach lower detection levels, but it also has to be seen to be correct. Atomic fluorescence detection, especially when coupled to vapour generation techniques, offers both sensitivity and specificity. Developments in the design of specified atomic fluorescence detectors for mercury, for the hydride-forming elements and also for cadmium, are described in this paper. Each of these systems is capable of analysing samples in the part per trillion (ppt) range reliably and economically. Several analytical applications are described. PMID:18924964

  13. Application of capability indices and control charts in the analytical method control strategy.

    PubMed

    Oliva, Alexis; Llabres Martinez, Matías

    2017-08-01

    In this study, we assessed the usefulness of control charts in combination with the process capability indices, Cpm and Cpk, in the control strategy of an analytical method. The traditional X-chart and moving range chart were used to monitor the analytical method over a 2-year period. The results confirmed that the analytical method is in control and stable. Different criteria were used to establish the specification limits (i.e. analyst requirements) for fixed method performance (i.e. method requirements). If the specification limits and control limits are equal in breadth, the method can be considered "capable" (Cpm = 1), but it does not satisfy the minimum method capability requirements proposed by Pearn and Shu (2003). Similar results were obtained using the Cpk index. The method capability was also assessed as a function of method performance for fixed analyst requirements. The results indicate that the method does not meet the requirements of the analytical target approach. Real data from a size-exclusion chromatography (SEC) method with light-scattering detection were used as a model, whereas previously published data were used to illustrate the applicability of the proposed approach. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
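The two capability indices follow from textbook definitions, Cpk = min(USL - μ, μ - LSL)/(3σ) and Cpm = (USL - LSL)/(6·sqrt(σ² + (μ - T)²)); a minimal sketch assuming those standard formulas:

```python
import math

def cpk(mean, sd, lsl, usl):
    """Process capability index Cpk = min(USL - mu, mu - LSL) / (3*sigma)."""
    return min(usl - mean, mean - lsl) / (3.0 * sd)

def cpm(mean, sd, lsl, usl, target):
    """Taguchi capability index
    Cpm = (USL - LSL) / (6 * sqrt(sigma^2 + (mu - target)^2))."""
    return (usl - lsl) / (6.0 * math.sqrt(sd ** 2 + (mean - target) ** 2))
```

For a method centred on target with the specification width equal to six standard deviations, both indices equal 1, the "capable" borderline discussed above; off-target operation lowers Cpm even when Cpk is unchanged in one direction.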

  14. Novel selective TOCSY method enables NMR spectral elucidation of metabolomic mixtures

    NASA Astrophysics Data System (ADS)

    MacKinnon, Neil; While, Peter T.; Korvink, Jan G.

    2016-11-01

    Complex mixture analysis is routinely encountered in NMR-based investigations. With the aim of component identification, spectral complexity may be addressed chromatographically or spectroscopically, the latter being favored to reduce sample handling requirements. An attractive experiment is selective total correlation spectroscopy (sel-TOCSY), which is capable of providing tremendous spectral simplification and thereby enhancing assignment capability. Unfortunately, isolating a well-resolved resonance becomes increasingly difficult as the complexity of the mixture increases, and the assumption of single spin system excitation is no longer robust. We present TOCSY optimized mixture elucidation (TOOMIXED), a technique capable of performing spectral assignment particularly in the case where the assumption of single spin system excitation is relaxed. Key to the technique is the collection of a series of 1D sel-TOCSY experiments as a function of the isotropic mixing time (τm), resulting in a series of resonance intensities indicative of the underlying molecular structure. By comparing these τm-dependent intensity patterns with a library of pre-determined component spectra, one is able to regain assignment capability. After consideration of the technique's robustness, we tested TOOMIXED first on a model mixture. As a benchmark, we were able to assign a molecule with high confidence when selectively exciting an isolated resonance. Assignment confidence was not compromised when performing TOOMIXED on a resonance known to contain multiple overlapping signals, and in the worst case the method suggested a follow-up sel-TOCSY experiment to confirm an ambiguous assignment. TOOMIXED was then demonstrated on two realistic samples (whisky and urine), for which under our conditions an approximate limit of detection of 0.6 mM was determined. Taking into account literature reports for the sel-TOCSY limit of detection, the technique should reach on the order of 10 μM sensitivity. We anticipate this technique will be highly attractive to various analytical fields facing mixture analysis, including metabolomics, foodstuff analysis, pharmaceutical analysis, and forensics.
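The library-matching step, comparing an observed τm-dependent intensity pattern against pre-determined reference patterns, can be sketched as a simple similarity ranking. The cosine-similarity scoring rule and the library entries here are illustrative stand-ins, not the TOOMIXED algorithm itself:

```python
import math

def best_match(observed, library):
    """Rank reference mixing-time intensity patterns by cosine
    similarity to the observed pattern; returns (name, similarity)."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)
    return max(((name, cosine(observed, pattern))
                for name, pattern in library.items()),
               key=lambda item: item[1])
```

Cosine similarity is insensitive to overall intensity scale, which matters because absolute signal depends on concentration while the τm-dependent shape reflects structure.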

  15. A method for rapid sampling and characterization of smokeless powder using sorbent-coated wire mesh and direct analysis in real time - mass spectrometry (DART-MS).

    PubMed

    Li, Frederick; Tice, Joseph; Musselman, Brian D; Hall, Adam B

    2016-09-01

    Improvised explosive devices (IEDs) are often used by terrorists and criminals to create public panic and destruction, necessitating rapid investigative information. However, backlogs in many forensic laboratories, resulting in part from time-consuming GC-MS and LC-MS techniques, delay the delivery of analytical information. Direct analysis in real time - mass spectrometry (DART-MS) is a promising analytical technique that can address this challenge in the forensic science community by permitting rapid trace analysis of energetic materials. We have therefore designed a qualitative analytical approach that utilizes novel sorbent-coated wire mesh and dynamic headspace concentration to permit the generation of information-rich chemical attribute signatures (CAS) for trace energetic materials in smokeless powder with DART-MS. Sorbent-coated wire mesh improves the overall efficiency of capturing trace energetic materials in comparison to swabbing or vacuuming. Hodgdon Lil' Gun smokeless powder was used to optimize the dynamic headspace parameters. This method was compared to traditional GC-MS methods and validated using the NIST RM 8107 smokeless powder reference standard. Additives and energetic materials, notably nitroglycerin, were rapidly and efficiently captured by the Carbopack X wire mesh, followed by detection and identification using DART-MS. This approach has demonstrated the capability of generating comparable results with significantly reduced analysis time in comparison to GC-MS. All targeted components that can be detected by GC-MS were detected by DART-MS in less than a minute. Furthermore, DART-MS offers the advantage of detecting targeted analytes that are not amenable to GC-MS. The speed and efficiency associated with both the sample collection technique and DART-MS demonstrate an attractive and viable potential alternative to conventional techniques. Copyright © 2016 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  16. High speed operation of permanent magnet machines

    NASA Astrophysics Data System (ADS)

    El-Refaie, Ayman M.

    This work proposes methods to extend the high-speed operating capabilities of both the interior PM (IPM) and surface PM (SPM) machines. For interior PM machines, this research has developed and presented the first thorough analysis of how a new bi-state magnetic material can be usefully applied to the design of IPM machines. Key elements of this contribution include identifying how the unique properties of the bi-state magnetic material can be applied most effectively in the rotor design of an IPM machine by "unmagnetizing" the magnet cavity center posts rather than the outer bridges. The importance of elevated rotor speed in making the best use of the bi-state magnetic material while recognizing its limitations has been identified. For surface PM machines, this research has provided, for the first time, a clear explanation of how fractional-slot concentrated windings can be applied to SPM machines in order to achieve the necessary conditions for optimal flux weakening. A closed-form analytical procedure for analyzing SPM machines designed with concentrated windings has been developed. Guidelines for designing SPM machines using concentrated windings in order to achieve optimum flux weakening are provided. Analytical and numerical finite element analysis (FEA) results have provided promising evidence of the scalability of the concentrated winding technique with respect to the number of poles, machine aspect ratio, and output power rating. Useful comparisons between the predicted performance characteristics of SPM machines equipped with concentrated windings and both SPM and IPM machines designed with distributed windings are included. Analytical techniques have been used to evaluate the impact of the high pole number on various converter performance metrics. Both analytical techniques and FEA have been used for evaluating the eddy-current losses in the surface magnets due to the stator winding subharmonics. 
Techniques for reducing these losses have been investigated. A 6 kW, 36-slot/30-pole prototype SPM machine has been designed and built. Experimental measurements have been used to verify the analytical and FEA results. These test results have demonstrated that a wide constant-power speed range can be achieved. Other important machine features such as the near-sinusoidal back-emf, high efficiency, and low cogging torque have also been demonstrated.

  17. Chromatographic Techniques for Rare Earth Elements Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved through the development of two kinds of instrumental techniques. The efficiency of spectroscopic methods has been extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on their preconcentration and separation, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases can be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography, and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  18. 3D surface pressure measurement with single light-field camera and pressure-sensitive paint

    NASA Astrophysics Data System (ADS)

    Shi, Shengxian; Xu, Shengming; Zhao, Zhou; Niu, Xiaofu; Quinn, Mark Kenneth

    2018-05-01

    A novel technique that simultaneously measures three-dimensional model geometry and surface pressure distribution with a single camera is demonstrated in this study. The technique takes advantage of light-field photography, which can capture three-dimensional information with a single light-field camera, and combines it with the intensity-based pressure-sensitive paint method. The proposed single-camera light-field three-dimensional pressure measurement technique (LF-3DPSP) utilises a hardware setup similar to the traditional two-dimensional pressure measurement technique, with the exception that the wind-on, wind-off, and model geometry images are captured via an in-house-constructed light-field camera. The proposed LF-3DPSP technique was validated with a Mach 5 flared cone model test. Results show that the technique is capable of measuring three-dimensional geometry with high accuracy for relatively large curvature models, and the pressure results compare well with Schlieren tests, analytical calculations, and numerical simulations.

  19. Remote Raman - laser induced breakdown spectroscopy (LIBS) geochemical investigation under Venus atmospheric conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clegg, Samuel M; Barefield, James E; Humphries, Seth D

    2010-12-13

    The extreme Venus surface temperatures (~740 K) and atmospheric pressures (~93 atm) create a challenging environment for surface missions. Scientific investigations capable of Venus geochemical observations must be completed within hours of landing, before the lander is overcome by the harsh atmosphere. A combined remote Raman-LIBS (laser-induced breakdown spectroscopy) instrument is capable of accomplishing the geochemical science goals without the risks associated with collecting samples and bringing them into the lander. Wiens et al. and Sharma et al. demonstrated that both analytical techniques can be integrated into a single instrument capable of planetary missions. The focus of this paper is to explore the capability to probe geologic samples with Raman-LIBS and demonstrate quantitative analysis under Venus surface conditions. Raman and LIBS are highly complementary analytical techniques capable of detecting both the mineralogical and geochemical composition of Venus surface materials. These techniques have the potential to profoundly increase our knowledge of the Venus surface composition, which is currently limited to geochemical data from the Soviet Venera and VEGA landers that collectively suggest a surface composition that is primarily tholeiitic basalt with some potentially more evolved compositions and, in some locations, K-rich trachyandesite. These landers were not equipped to probe the surface mineralogy as can be accomplished with Raman spectroscopy. Based on the observed compositional differences, and recognizing the imprecise nature of the existing data, 15 samples were chosen to constitute a Venus-analog suite for this study, including five basalts, two each of andesites, dacites, and sulfates, and single samples of a foidite, trachyandesite, rhyolite, and basaltic trachyandesite, all studied under Venus conditions.
LIBS data reduction involved generating a partial least squares (PLS) model with a subset of the rock powder standards to quantitatively determine the major elemental abundances of the remaining samples. PLS analysis suggests that the major element compositions can be determined with root mean square errors of ca. 5% (absolute) for SiO2, Al2O3, Fe2O3 (total), MgO, and CaO, and ca. 2% or less for TiO2, Cr2O3, MnO, K2O, and Na2O. Finally, the Raman experiments were conducted under supercritical CO2 on single-mineral and mixed-mineral samples containing talc, olivine, pyroxenes, feldspars, anhydrite, barite, and siderite. The Raman data show that the individual minerals can easily be identified, whether alone or in mixtures.
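
The PLS calibration step can be sketched compactly. The following is a minimal single-response PLS (NIPALS) fit on synthetic "spectra"; the data, channel count, and component count are illustrative assumptions, not the Venus-analog calibration set used in the paper.

```python
import numpy as np

# Minimal PLS1 (NIPALS) regression on synthetic "LIBS spectra".
def pls1_fit(X, y, n_components):
    Xm, ym = X.mean(axis=0), y.mean()
    Xk, yk = X - Xm, y - ym
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)                 # weight vector
        t = Xk @ w                             # latent scores
        tt = t @ t
        p = Xk.T @ t / tt                      # X loadings
        c = yk @ t / tt                        # y loading
        Xk, yk = Xk - np.outer(t, p), yk - c * t   # deflate
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)        # regression vector
    return B, Xm, ym

rng = np.random.default_rng(0)
conc = rng.uniform(0.0, 50.0, 40)              # "oxide abundance" per sample
base = rng.uniform(0.5, 1.5, 20)               # fixed emission-line pattern
X = np.outer(conc, base) + rng.normal(0.0, 0.5, (40, 20))  # noisy spectra

B, Xm, ym = pls1_fit(X, conc, n_components=2)
pred = (X - Xm) @ B + ym
rmse = np.sqrt(np.mean((pred - conc) ** 2))    # small vs. the 0-50 range
```

With a linear signal model the latent components soak up the correlated channel information, which is why PLS handles the highly collinear channels of emission spectra better than ordinary least squares on raw intensities.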

  20. Rapid Detection of Transition Metals in Welding Fumes Using Paper-Based Analytical Devices

    PubMed Central

    Volckens, John

    2014-01-01

    Metals in particulate matter (PM) are considered a driving factor in many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed, owing to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content of welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed to several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable yet inexpensive alternative to traditional analytic methods for transition metals in welding fume PM. These sensors have the potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments. PMID:24515892

  1. Rapid detection of transition metals in welding fumes using paper-based analytical devices.

    PubMed

    Cate, David M; Nanthasurasak, Pavisara; Riwkulkajorn, Pornpak; L'Orange, Christian; Henry, Charles S; Volckens, John

    2014-05-01

    Metals in particulate matter (PM) are considered a driving factor in many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed, owing to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content of welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed to several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable yet inexpensive alternative to traditional analytic methods for transition metals in welding fume PM. These sensors have the potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments.

  2. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) at Oak Ridge National Laboratory (ORNL).

  3. A Learning Framework for Control-Oriented Modeling of Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control-oriented, have high predictive capability, impose minimal data pre-processing requirements, and can be adapted continuously to account for changing conditions as new data become available. Data-driven modeling techniques investigated so far, while promising in the context of buildings, have been unable to satisfy all of these requirements simultaneously. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control-oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology significantly outperforms other data-driven modeling techniques. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework that can drive several use cases related to building energy management.
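
The recurrent-modeling idea can be sketched without a deep-learning framework. The block below is not the paper's RNN; it is a minimal reservoir-style recurrent model (fixed random recurrent weights, least-squares readout) applied to a fabricated "load" signal, just to show how a hidden state carries temporal context for one-step-ahead energy prediction.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
# Toy "building load": a periodic pattern plus noise (an assumption,
# not real building data).
load = np.sin(np.linspace(0.0, 20.0 * np.pi, T)) + 0.05 * rng.normal(size=T)

# Fixed random recurrent weights (echo-state style), scaled for stability.
n_res = 100
W_in = rng.uniform(-0.5, 0.5, n_res)
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius 0.9

# Run the recurrence: the hidden state h summarizes the input history.
H = np.zeros((T, n_res))
h = np.zeros(n_res)
for t in range(T - 1):
    h = np.tanh(W_in * load[t] + W @ h)
    H[t + 1] = h            # state after seeing load[0..t]

# Train only a linear readout (ridge regression) for one-step-ahead prediction.
washout = 50
A, y = H[washout:], load[washout:]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(n_res), A.T @ y)
rmse = np.sqrt(np.mean((A @ w_out - y) ** 2))     # near the noise floor
```

A trained RNN (LSTM/GRU) would also learn the recurrent weights by backpropagation through time; the sketch keeps them fixed so the whole pipeline reduces to one linear solve.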

  4. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research

    DOE PAGES

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J.; ...

    2015-06-18

    Three-dimensional (3D) structural analysis is essential to understanding the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both the physical and biological sciences for many decades; in it, 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and it is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been developed separately in the physical and biological sciences, although they share some basic concepts of ET. Here, this review discusses the common basis for 3D characterization and specifies difficulties and solutions regarding both hard and soft materials research, in the hope of motivating novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems. Electron tomography produces quantitative 3D reconstructions for the biological and physical sciences from sets of 2D projections acquired at different tilt angles in a transmission electron microscope. Finally, state-of-the-art techniques capable of producing 3D representations of objects such as Pt-Pd core-shell nanoparticles and IgG1 antibody molecules are reviewed.

  5. UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.

    PubMed

    Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun

    2013-12-01

    Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely used methods based on probabilistic modeling have drawbacks in terms of consistency across multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
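
The NMF machinery at UTOPIAN's core can be illustrated with the standard unsupervised multiplicative-update algorithm; the paper's semi-supervised, interactive extensions are not reproduced here, and the tiny term-document matrix below is fabricated for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic term-document matrix V (8 terms x 6 docs) with two topics:
# docs 0-2 draw on terms 0-3, docs 3-5 on terms 4-7.
W_true = np.zeros((8, 2)); W_true[:4, 0] = 1.0; W_true[4:, 1] = 1.0
H_true = np.zeros((2, 6)); H_true[0, :3] = 1.0; H_true[1, 3:] = 1.0
V = W_true @ H_true

# Multiplicative updates for V ~ W H with W, H >= 0 (Frobenius objective).
k = 2
W = rng.random((8, k))
H = rng.random((k, 6))
for _ in range(300):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-9)

err = np.linalg.norm(V - W @ H)   # reconstruction error of the 2-topic model
```

The nonnegativity of `W` (topic-term weights) and `H` (document-topic weights) is what makes the factors directly readable as topics, and it is the handle UTOPIAN's semi-supervised formulation uses to fold in user-supplied keyword and document constraints.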

  6. A method of atmospheric density measurements during space shuttle entry using ultraviolet-laser Rayleigh scattering

    NASA Technical Reports Server (NTRS)

    Mckenzie, Robert L.

    1988-01-01

    An analytical study and its experimental verification are described which show the performance capabilities and the hardware requirements of a method for measuring atmospheric density along the Space Shuttle flightpath during entry. Using onboard instrumentation, the technique relies on Rayleigh scattering of light from a pulsed ArF excimer laser operating at a wavelength of 193 nm. The method is shown to be capable of providing density measurements with an uncertainty of less than 1 percent and with a spatial resolution along the flightpath of 1 km, over an altitude range from 50 to 90 km. Experimental verification of the signal linearity and the expected signal-to-noise ratios is demonstrated in a simulation facility at conditions that duplicate the signal levels of the flight environment.
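
The measurement principle is that the Rayleigh-scattered photon count scales linearly with gas number density, and the shot-noise-limited relative uncertainty of the density estimate is 1/sqrt(N) for N detected photons, so on the order of 10^4 counts per range bin already give the ~1 percent precision quoted above. The constants below are illustrative assumptions, not the instrument's actual calibration.

```python
import math

# Simplified single-scattering model (illustrative constants):
# detected counts S = K * n, with K lumping pulse energy, Rayleigh cross
# section, collection efficiency, and range-bin length.
K = 2.5e-16  # counts per (molecules/m^3); an assumed calibration constant

def density_from_counts(S):
    """Invert the linear signal model for number density."""
    return S / K

def shot_noise_rel_uncertainty(S):
    """Poisson (shot-noise) relative uncertainty of the density estimate."""
    return 1.0 / math.sqrt(S)

S = 1.0e4                           # photons in one range bin (assumed)
n = density_from_counts(S)          # number density in this toy model
u = shot_noise_rel_uncertainty(S)   # 0.01, i.e. 1% at 1e4 counts
```

In practice the calibration constant would be established against a known-density condition, and background light and Mie scattering would add terms to the signal model; the sketch keeps only the Rayleigh term.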

  7. Multichamber Multipotentiostat System for Cellular Microphysiometry.

    PubMed

    Lima, Eduardo A; Snider, Rachel M; Reiserer, Ronald S; McKenzie, Jennifer R; Kimmel, Danielle W; Eklund, Sven E; Wikswo, John P; Cliffel, David E

    2014-12-01

    Multianalyte microphysiometry is a powerful technique for studying cellular metabolic flux in real time. Monitoring several analytes concurrently in a number of individual chambers, however, requires specific instrumentation that is not available commercially in a single, compact, benchtop form at an affordable cost. We developed a multipotentiostat system capable of performing simultaneous amperometric and potentiometric measurements in up to eight individual chambers. The modular design and custom LabVIEW™ control software provide flexibility and allow for expansion and modification to suit different experimental conditions. Superior accuracy is achieved when operating the instrument in a standalone configuration; however, measurements performed in conjunction with a previously developed multianalyte microphysiometer have shown low levels of crosstalk as well. Calibrations and experiments with primary and immortalized cell cultures demonstrate the performance of the instrument and its capabilities.

  8. Development of an Atmospheric Pressure Ionization Mass Spectrometer

    NASA Technical Reports Server (NTRS)

    1998-01-01

    A commercial atmospheric pressure ionization mass spectrometer (APIMS) was purchased from EXTREL Mass Spectrometry, Inc. (Pittsburgh, PA). Our research objectives were to adapt this instrument and develop techniques for real-time determination of the concentrations of trace species in the atmosphere. The prototype instrument is capable of making high-frequency measurements with no sample preconcentration. Isotopically labeled standards are used as internal standards to obtain high precision and to compensate for changes in instrument sensitivity and analyte losses in the sampling manifold, as described by Bandy and coworkers. The prototype instrument is capable of being deployed on NASA C-130, Electra, P-3, and DC-8 aircraft. After purchasing and taking delivery by June 1994, we assembled the mass spectrometer, data acquisition, and manifold flow control instrumentation in electronic racks and performed tests.

  9. Techniques to measure sorption and migration between small molecules and packaging. A critical review.

    PubMed

    Kadam, Ashish A; Karbowiak, Thomas; Voilley, Andrée; Debeaufort, Frédéric

    2015-05-01

    Diffusion and sorption in food and packaging, or between the two, are the key mass transfer parameters for assessing a food product's shelf-life with reference to consumer safety. This has become of paramount importance owing to the legislation set by the regulated markets. The technical capabilities that can be exploited for analyzing product-package interactions have been growing rapidly. This review emphasizes the different techniques, categorized according to the state of the diffusant (gas or liquid) in contact with the packaging material. Depending on the diffusant and on the analytical question under review, the different ways to study sorption and/or migration are presented and compared. Some examples are suggested to help reach the best possible choice, consisting of a single technique or a combination of different approaches. © 2014 Society of Chemical Industry.

  10. Ultra-spatial synchrotron radiation for imaging molecular chemical structure: Applications in plant and animal studies

    DOE PAGES

    Yu, Peiqiang

    2007-01-01

    Synchrotron-based Fourier transform infrared microspectroscopy (S-FTIR) has been developed as a rapid, direct, non-destructive bioanalytical technique. This technique takes advantage of the synchrotron light's brightness and small effective source size, and is capable of exploring the molecular chemical features and make-up within the microstructures of a biological tissue, without destruction of inherent structures, at ultra-spatial resolutions within cellular dimensions. To date there has been very little application of this advanced synchrotron technique to the study of the inherent structure of plant and animal tissues at a cellular or subcellular level. In this article, a novel approach is introduced to show the potential of the newly developed, advanced synchrotron-based analytical technology, which can be used to reveal molecular structural-chemical features of various plant and animal tissues.

  11. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems available today can implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions on solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype that has been developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
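
The multiset idea can be sketched as follows: tokenize each submitted equation and compare the resulting token multisets, so that algebraically reordered but structurally identical steps score alike. This is an illustrative Jaccard-style similarity over multisets, not the SCCS algorithm's exact scoring scheme.

```python
import re
from collections import Counter

def tokenize(eq):
    """Split an equation string into a multiset of numbers, names, operators."""
    return Counter(re.findall(r"\d+|[a-zA-Z]+|\S", eq.replace(" ", "")))

def multiset_similarity(a, b):
    """Jaccard similarity over token multisets (1.0 = same tokens, same counts)."""
    ca, cb = tokenize(a), tokenize(b)
    inter = sum((ca & cb).values())   # multiset intersection size
    union = sum((ca | cb).values())   # multiset union size
    return inter / union if union else 1.0

same = multiset_similarity("2x + 3 = 7", "3 + 2x = 7")  # 1.0: reordered step
diff = multiset_similarity("2x + 3 = 7", "2x = 4")      # < 1.0: different step
```

Because `Counter` supports multiset intersection (`&`) and union (`|`) directly, the structural comparison reduces to a few lines; a full marker would additionally check mathematical equivalence between consecutive steps.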

  12. The use of surface-enhanced Raman scattering for detecting molecular evidence of life in rocks, sediments, and sedimentary deposits.

    PubMed

    Bowden, Stephen A; Wilson, Rab; Cooper, Jonathan M; Parnell, John

    2010-01-01

    Raman spectroscopy is a versatile analytical technique capable of characterizing the composition of both inorganic and organic materials. Consequently, it is frequently suggested as a payload on many planetary landers. Only approximately 1 in every 10^6 photons is Raman scattered; the detection of trace quantities of an analyte dispersed in a sample matrix can therefore be much harder to achieve. To overcome this, surface-enhanced Raman scattering (SERS) and surface-enhanced resonance Raman scattering (SERRS) both provide greatly enhanced signals (enhancements between 10^5 and 10^9) through the analyte's interaction with locally generated surface plasmons, which occur at a "roughened" or nanostructured metallic surface (e.g., Cu, Au, and Ag). Both SERS and SERRS may therefore provide a viable technique for trace analysis of samples. In this paper, we describe the development of SERS assays for analyzing trace amounts of compounds present in the solvent extracts of sedimentary deposits. These assays were used to detect biological pigments present in an Arctic microoasis (a small locale of elevated biological productivity) and its detrital regolith, characterize the pigmentation of microbial mats around hydrothermal springs, and detect fossil organic matter in hydrothermal deposits. These field studies demonstrate that SERS technology is sufficiently mature to be applied to many astrobiological analog studies on Earth. Many current and proposed imaging systems intended for remote deployment already possess the instrumental components needed for SERS. The addition of wet-chemistry sample processing facilities to these instruments could yield field-deployable analytical instruments with a broadened analytical window for detecting organic compounds of biological or geological origin.

  13. Real-time sensor data validation

    NASA Technical Reports Server (NTRS)

    Bickmore, Timothy W.

    1994-01-01

    This report describes the status of an on-going effort to develop software capable of detecting sensor failures on rocket engines in real time. This software could be used in a rocket engine controller to prevent the erroneous shutdown of an engine due to sensor failures that would otherwise be interpreted as engine failures by the control software. The approach taken combines analytical redundancy with Bayesian belief networks to provide a solution that has well-defined real-time characteristics and well-defined error rates. Analytical redundancy is a technique in which a sensor's value is predicted using values from other sensors and known or empirically derived mathematical relations. A set of sensors and a set of relations among them form a network of cross-checks which can be used to periodically validate all of the sensors in the network. Bayesian belief networks provide a method of determining whether each of the sensors in the network is valid, given the results of the cross-checks. This approach has been successfully demonstrated on the Technology Test Bed Engine at the NASA Marshall Space Flight Center. Current efforts are focused on extending the system to provide a validation capability for 100 sensors on the Space Shuttle Main Engine.
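
The cross-check network can be sketched with a toy example. Assume, hypothetically, two sensors A and B that measure the same quantity and a third sensor C whose reading should analytically equal twice theirs; the sensor implicated by every failing residual is flagged. The report's approach fuses such check results through a Bayesian belief network; this sketch replaces that inference with simple residual voting.

```python
from collections import Counter

# Analytic relations among three hypothetical sensors:
#   A ~ B (redundant pair), C ~ 2A, C ~ 2B (derived relation).
def validate(readings, tol=1.0):
    """Return the sensors that fail every cross-check they participate in."""
    A, B, C = readings["A"], readings["B"], readings["C"]
    residuals = {
        ("A", "B"): abs(A - B),
        ("A", "C"): abs(C - 2.0 * A),
        ("B", "C"): abs(C - 2.0 * B),
    }
    failing = [pair for pair, r in residuals.items() if r > tol]
    votes = Counter(s for pair in failing for s in pair)
    # Each sensor participates in exactly two checks; two strikes -> suspect.
    return sorted(s for s, v in votes.items() if v == 2)

ok = validate({"A": 10.0, "B": 10.1, "C": 20.05})   # [] : all consistent
bad = validate({"A": 10.0, "B": 10.1, "C": 30.0})   # ["C"]: C fails both checks
```

A belief-network version would attach prior failure probabilities to each sensor and conditional probabilities to each check outcome, yielding a posterior validity per sensor instead of a hard vote.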

  14. Electrochemical sensor for multiplex screening of genetically modified DNA: identification of biotech crops by logic-based biomolecular analysis.

    PubMed

    Liao, Wei-Ching; Chuang, Min-Chieh; Ho, Ja-An Annie

    2013-12-15

    Genetically modified (GM) technology, one of the modern biomolecular engineering technologies, has been deemed a profitable strategy in the fight against global starvation. Yet rapid and reliable analytical methods to evaluate the quality and potential risk of the resulting GM products are lacking. We herein present a biomolecular analytical system constructed with distinct biochemical activities to expedite the computational detection of genetically modified organisms (GMOs). The computational mechanism provides an alternative to the complex procedures commonly involved in the screening of GMOs. Given that the bioanalytical system is capable of processing promoter, coding and species genes, affirmative interpretations succeed in identifying a specified GM event in both electrochemical and optical fashions. The biomolecular computational assay exhibits detection capability for genetically modified DNA below the sub-nanomolar level and is found to be interference-free in the abundant coexistence of non-GM DNA. This bioanalytical system, furthermore, can operate in an array fashion for multiplex screening of variable GM events. Such a biomolecular computational assay and biosensor holds great promise for rapid, cost-effective, and high-fidelity screening of GMOs. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Application of the radioisotope excited X-ray fluorescence technique in charge optimization during thermite smelting of Fe-Ni, Fe-Cr, and Fe-Ti alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, I.G.; Joseph, D.; Lal, M.

    1995-10-01

    A wide range of ferroalloys is used to facilitate the addition of different alloying elements to molten steel. High-carbon ferroalloys are produced on a tonnage basis by carbothermic smelting in an electric furnace, and an aluminothermic route is generally adopted for small-scale production of low-carbon varieties. The physicochemical principles of carbothermy and aluminothermy are well documented in the literature. However, limited technical data are reported on the production of individual low-carbon ferroalloys from their selected resources. The authors demonstrate here the application of an energy-dispersive X-ray fluorescence (EDXRF) technique in meeting the analytical requirements of a thermite smelting campaign, carried out with the aim of preparing low-carbon, low-nitrogen Fe-Ni, Fe-Cr, and Fe-Ti alloys from indigenously available nickel-bearing spent catalyst, mineral chromite, and ilmenite/rutile, respectively. They chose the EDXRF technique to meet the analytical requirements because of its capability to analyze samples of ores, minerals, metals, and alloys in different forms, such as powder, sponge, as-smelted, or as-cast, and to obtain rapid multielement analyses with ease. Rapid analyses of the thermite feed and product by this technique aided in the appropriate alteration of the charge constituents to obtain optimum charge consumption.

  16. FORMAC integration program: A special applications package used in developing techniques of orbital decay and long term ephemeris prediction for satellites in earth orbit

    NASA Technical Reports Server (NTRS)

    Rowe, C. K.

    1971-01-01

    The symbolic manipulation capabilities of the FORMAC (Formula Manipulation Compiler) language are employed to expand and analytically evaluate integrals. The program integration is effected by expanding the integral(s) into a series of subintegrals and then substituting a pre-derived and pre-coded solution for that particular subintegral. Derivation of the integral solutions necessary for precoding is included, as is a discussion of the FORMAC system limitations encountered in the programming effort.

  17. Expanding the Capabilities of the JPL Electronic Nose for an International Space Station Technology Demonstration

    NASA Technical Reports Server (NTRS)

    Ryan, Margaret A.; Shevade, A. V.; Taylor, C. J.; Homer, M. L.; Jewell, A. D.; Kisor, A.; Manatt, K. S .; Yen, S. P. S.; Blanco, M.; Goddard, W. A., III

    2006-01-01

    An array-based sensing system based on polymer/carbon composite conductometric sensors is under development at JPL for use as an environmental monitor on the International Space Station. Sulfur dioxide has been added to the analyte set for this phase of development. Using molecular modeling techniques, the interaction energy between SO2 and polymer functional groups has been calculated, and polymers have been selected as potential SO2 sensors. Experiments have validated the model, and two selected polymers have been shown to be promising materials for SO2 detection.

  18. Internally stabilized selenocysteine derivatives: syntheses, 77Se NMR and biomimetic studies.

    PubMed

    Phadnis, Prasad P; Mugesh, G

    2005-07-07

    Selenocystine ([Sec]2) and aryl-substituted selenocysteine (Sec) derivatives are synthesized starting from the commercially available amino acid L-serine. These compounds are characterized by a number of analytical techniques, such as NMR (1H, 13C and 77Se) and TOF mass spectrometry. This study reveals that the introduction of amino/imino substituents capable of interacting with selenium may stabilize the Sec derivatives. This study further suggests that the oxidation-elimination reactions of Sec derivatives could be used for the generation of biologically active selenols having internally stabilizing substituents.

  19. Application of nuclear analytical techniques using long-life sealed-tube neutron generators.

    PubMed

    Bach, P; Cluzeau, S; Lambermont, C

    1994-01-01

    The new range of sealed-tube neutron generators developed by SODERN appears to be appropriate for the industrial environment. The main characteristics are the high emission stability during the very long lifetime of the tube, flexible pulsed mode capability, safety in operation with no radiation in "off" state, and the easy transportation of equipment. Some applications of the neutron generators, called GENIE, are considered: high-sensitivity measurement of transuranic elements in nuclear waste drums, bulk material analysis for process control, and determination of the airborne pollutants for environmental monitoring.

  20. Artificial intelligence in medicine.

    PubMed Central

    Ramesh, A. N.; Kambhampati, C.; Monson, J. R. T.; Drew, P. J.

    2004-01-01

    INTRODUCTION: Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment and predicting outcomes in many clinical scenarios. METHODS: Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of different artificial intelligence techniques is presented in this paper, along with a review of important clinical applications. RESULTS: The proficiency of artificial intelligence techniques has been explored in almost every field of medicine. The artificial neural network was the most commonly used analytical tool, whilst other artificial intelligence techniques such as fuzzy expert systems, evolutionary computation and hybrid intelligent systems have all been used in different clinical settings. DISCUSSION: Artificial intelligence techniques have the potential to be applied in almost every field of medicine. There is a need for further, appropriately designed clinical trials before these emergent techniques find application in real clinical settings. PMID:15333167

  1. Artificial intelligence in medicine.

    PubMed

    Ramesh, A N; Kambhampati, C; Monson, J R T; Drew, P J

    2004-09-01

    Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment and outcome prediction in many clinical scenarios. Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of different artificial intelligence techniques is presented in this paper along with a review of important clinical applications. The proficiency of artificial intelligence techniques has been explored in almost every field of medicine. The artificial neural network was the most commonly used analytical tool, whilst other artificial intelligence techniques such as fuzzy expert systems, evolutionary computation and hybrid intelligent systems have all been used in different clinical settings. Artificial intelligence techniques have the potential to be applied in almost every field of medicine. There is a need for appropriately designed clinical trials before these emergent techniques find application in the real clinical setting.
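    The review identifies the artificial neural network as the most common analytical tool in this literature. As a purely illustrative sketch (not from the paper), the following trains the simplest such model, a single logistic neuron, by gradient descent on synthetic, linearly separable data; all data and hyperparameters are invented for the example.

```python
import numpy as np

# Synthetic two-feature "clinical" data with a binary outcome (invented).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

w, b = np.zeros(2), 0.0
for _ in range(500):                         # batch gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
    w -= 0.5 * (X.T @ (p - y) / len(y))      # gradient of log-loss w.r.t. weights
    b -= 0.5 * np.mean(p - y)                # gradient w.r.t. bias

acc = np.mean(((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y)
```

Real clinical networks add hidden layers and regularization, but the fit-predict loop has this same shape.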

  2. Strategic assay deployment as a method for countering analytical bottlenecks in high throughput process development: case studies in ion exchange chromatography.

    PubMed

    Konstantinidis, Spyridon; Heldin, Eva; Chhatre, Sunil; Velayudhan, Ajoy; Titchener-Hooker, Nigel

    2012-01-01

    High throughput approaches to facilitate the development of chromatographic separations have now been adopted widely in the biopharmaceutical industry, but issues of how to reduce the associated analytical burden remain. For example, acquiring experimental data by high level factorial designs in 96 well plates can place a considerable strain upon assay capabilities, generating a bottleneck that limits significantly the speed of process characterization. This article proposes an approach designed to counter this challenge: Strategic Assay Deployment (SAD). In SAD, a set of available analytical methods is investigated to determine which techniques are the most appropriate to use and how best to deploy them to reduce the consumption of analytical resources while still enabling accurate and complete process characterization. The approach is demonstrated by investigating how salt concentration and pH affect the binding of green fluorescent protein from Escherichia coli homogenate to an anion exchange resin presented in a 96-well filter plate format. Compared with the deployment of routinely used analytical methods alone, the application of SAD reduced the total assay time and total assay material consumption by at least 40% and 5%, respectively. SAD has significant utility in accelerating bioprocess development activities. Copyright © 2012 American Institute of Chemical Engineers (AIChE).

  3. Spartan service module finite element modeling technique and analysis

    NASA Technical Reports Server (NTRS)

    Lindenmoyer, A. J.

    1985-01-01

    Sounding rockets have served as a relatively inexpensive and easy method of carrying experiments into the upper atmosphere. Limited observation time and pointing capabilities suggested the development of a new sounding rocket type carrier compatible with NASA's Space Transportation System. This concept evolved into the Spartan program, now credited with a successful Spartan 101 mission launched in June 1985. The next series of Spartans will use a service module primary structure. This newly designed reusable and universal component in the Spartan carrier system required thorough analysis and evaluation for flight certification. Using advanced finite element modeling techniques, the structure was analyzed and determined acceptable by meeting strict design goals and will be tested for verification of the analytical results.

  4. Use of pressure manifestations following the water plasma expansion for phytomass disintegration.

    PubMed

    Maroušek, Josef; Kwan, Jason Tai Hong

    2013-01-01

    A prototype capable of generating underwater high-voltage discharges (3.5 kV) coupled with water plasma expansion was constructed. The level of phytomass disintegration caused by transmission of the pressure shockwaves (50-60 MPa) following this expansion was analyzed using gas adsorption techniques. The dynamics of the external surface area and the micropore volume across multiple pretreatment stages of maize silage and sunflower seeds were approximated with robust analytical techniques. The multiplied reaction surface was manifested in up to a 15% increase in cumulative methane production, which in turn accelerated the overall anaerobic fermentation process. Disintegration of the sunflower seeds allowed up to 45% higher oil yields at the same operating pressure.

  5. Role and Evaluation of Interlaboratory Comparison Results in Laboratory Accreditation

    NASA Astrophysics Data System (ADS)

    Bode, P.

    2008-08-01

    Participation in interlaboratory comparisons provides laboratories an opportunity for independent assessment of their analytical performance, both in an absolute sense and in comparison with results obtained by other techniques. However, such comparisons are hindered by differences in the way laboratories participate, e.g. at best measurement capability or under routine conditions. Neutron activation analysis laboratories, which determine total mass fractions, often see themselves classified as `outliers' since the majority of other participants employ techniques with incomplete digestion methods. These considerations are discussed in relation to the way results from interlaboratory comparisons are evaluated by accreditation bodies following the requirements of Clause 5.9.1 of ISO/IEC 17025:2005. The discussion and conclusions derive largely from experiences in the author's own laboratory.
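    For context, proficiency-testing schemes evaluated under ISO/IEC 17025 commonly summarize a participant's performance with a z-score. The sketch below shows the conventional statistic and rating bands (the usual ISO 13528 convention), not the specific evaluation procedure of any particular accreditation body.

```python
# z = (result - assigned value) / standard deviation for proficiency assessment.
def z_score(result: float, assigned_value: float, sigma_p: float) -> float:
    return (result - assigned_value) / sigma_p

# Conventional bands: |z| <= 2 satisfactory, 2 < |z| < 3 questionable,
# |z| >= 3 unsatisfactory.
def rating(z: float) -> str:
    a = abs(z)
    if a <= 2:
        return "satisfactory"
    return "questionable" if a < 3 else "unsatisfactory"
```

A laboratory reporting 105 against an assigned value of 100 with sigma_p = 2.5 scores z = 2.0, on the edge of the satisfactory band.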

  6. Investigation of new radar-data-reduction techniques used to determine drag characteristics of a free-flight vehicle

    NASA Technical Reports Server (NTRS)

    Woodbury, G. E.; Wallace, J. W.

    1974-01-01

    An investigation was conducted of new techniques used to determine the complete transonic drag characteristics of a series of free-flight drop-test models using principally radar tracking data. The full capabilities of the radar tracking and meteorological measurement systems were utilized. In addition, preflight trajectory design, exact kinematic equations, and visual-analytical filtering procedures were employed. The results of this study were compared with the results obtained from analysis of the onboard accelerometer and pressure-sensor data of the only drop-test model that was instrumented. The accelerometer-pressure drag curve was approximated by the radar-data drag curve. However, a small-amplitude oscillation on the latter curve precluded a precise definition of its drag rise.

  7. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions, as well as a Monte Carlo analysis capability, is included to enable statistical performance evaluations.
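    As a hedged illustration of the filtering at the core of such a toolkit (not Analytical Mechanics Associates' actual code or API), here is a minimal scalar Kalman filter estimating a constant position from noisy measurements; all parameters and data are invented.

```python
import numpy as np

def kalman_1d(zs, r, q=1e-4):
    """Scalar Kalman filter: random-walk state, measurement variance r."""
    x, p = 0.0, 1.0              # state estimate and its variance
    for z in zs:
        p += q                   # predict: inflate variance by process noise
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update with measurement residual
        p *= (1.0 - k)           # shrink variance after the update
    return x

rng = np.random.default_rng(1)
truth = 5.0
zs = truth + rng.normal(0.0, 0.5, size=200)   # noisy sensor readings
est = kalman_1d(zs, r=0.25)
```

Sensor fusion in this framework amounts to feeding measurements from several sensors, each with its own `r`, through the same update step.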

  8. Analytical Investigation of the Limits for the In-Plane Thermal Conductivity Measurement Using a Suspended Membrane Setup

    NASA Astrophysics Data System (ADS)

    Linseis, V.; Völklein, F.; Reith, H.; Woias, P.; Nielsch, K.

    2018-06-01

    An analytical study has been performed on the measurement capabilities of a 100-nm-thin suspended membrane setup for in-plane thermal conductivity measurements of thin film samples using the 3ω measurement technique, utilizing a COMSOL Multiphysics simulation. The maximum measurement range under observance of given boundary conditions has been studied. Three exemplary sample materials, with thicknesses from the nanometer to the micrometer range and thermal conductivities from 0.4 W/mK up to 100 W/mK, have been investigated as showcase studies. The results of the simulations have been compared to a previously published evaluation model in order to determine the deviation between both and thereby the measurement limit. As thermal transport properties are temperature dependent, all calculations refer to constant room-temperature conditions.

  9. Evaluating Pillar Industry’s Transformation Capability: A Case Study of Two Chinese Steel-Based Cities

    PubMed Central

    Li, Zhidong; Marinova, Dora; Guo, Xiumei; Gao, Yuan

    2015-01-01

    Many steel-based cities in China were established between the 1950s and 1960s. After more than half a century of development and boom, these cities are starting to decline and industrial transformation is urgently needed. This paper focuses on evaluating the transformation capability of resource-based cities by building an evaluation model. Using Text Mining and the Document Explorer technique as a way of extracting text features, the 200 most frequently used words are derived from 100 publications related to steel- and other resource-based cities. The Expert Evaluation Method (EEM) and Analytic Hierarchy Process (AHP) techniques are then applied to select 53 indicators, determine their weights and establish an index system for evaluating the transformation capability of the pillar industry of China’s steel-based cities. Using real data and expert reviews, the improved Fuzzy Relation Matrix (FRM) method is applied to two case studies in China, namely Panzhihua and Daye, and the evaluation model is developed using Fuzzy Comprehensive Evaluation (FCE). The cities’ abilities to carry out industrial transformation are evaluated, with concerns expressed for the case of Daye. The findings have policy implications for the potential and required industrial transformation in the two selected cities and other resource-based towns. PMID:26422266
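    The AHP weighting step the paper applies can be sketched generically: priority weights are taken from the principal eigenvector of a reciprocal pairwise-comparison matrix, with a consistency ratio guarding judgment quality. The 3x3 matrix below is hypothetical, not the paper's 53-indicator data.

```python
import numpy as np

# Hypothetical pairwise judgments for three indicators (Saaty 1-9 scale):
# A[i, j] is how much more important indicator i is than indicator j.
A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   3.0],
              [1/5.0, 1/3.0, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)              # principal eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                          # normalized priority weights

# Consistency ratio: CR = CI / RI, CI = (lambda_max - n) / (n - 1),
# random index RI = 0.58 for n = 3; CR < 0.1 is conventionally acceptable.
lam = vals[k].real
CR = ((lam - 3.0) / 2.0) / 0.58
```

For this matrix the weights come out ordered as entered (first indicator heaviest) and CR is well under the 0.1 acceptance threshold.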

  10. Evaluating Pillar Industry's Transformation Capability: A Case Study of Two Chinese Steel-Based Cities.

    PubMed

    Li, Zhidong; Marinova, Dora; Guo, Xiumei; Gao, Yuan

    2015-01-01

    Many steel-based cities in China were established between the 1950s and 1960s. After more than half a century of development and boom, these cities are starting to decline and industrial transformation is urgently needed. This paper focuses on evaluating the transformation capability of resource-based cities by building an evaluation model. Using Text Mining and the Document Explorer technique as a way of extracting text features, the 200 most frequently used words are derived from 100 publications related to steel- and other resource-based cities. The Expert Evaluation Method (EEM) and Analytic Hierarchy Process (AHP) techniques are then applied to select 53 indicators, determine their weights and establish an index system for evaluating the transformation capability of the pillar industry of China's steel-based cities. Using real data and expert reviews, the improved Fuzzy Relation Matrix (FRM) method is applied to two case studies in China, namely Panzhihua and Daye, and the evaluation model is developed using Fuzzy Comprehensive Evaluation (FCE). The cities' abilities to carry out industrial transformation are evaluated, with concerns expressed for the case of Daye. The findings have policy implications for the potential and required industrial transformation in the two selected cities and other resource-based towns.

  11. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    USGS Publications Warehouse

    ,

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.

  12. Laser Ablation in situ (U-Th-Sm)/He and U-Pb Double-Dating of Apatite and Zircon: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    McInnes, B.; Danišík, M.; Evans, N.; McDonald, B.; Becker, T.; Vermeesch, P.

    2015-12-01

    We present a new laser-based technique for rapid, quantitative and automated in situ microanalysis of U, Th, Sm, Pb and He for applications in geochronology, thermochronometry and geochemistry (Evans et al., 2015). This novel capability permits a detailed interrogation of the time-temperature history of rocks containing apatite, zircon and other accessory phases by providing both (U-Th-Sm)/He and U-Pb ages (+trace element analysis) on single crystals. In situ laser microanalysis offers several advantages over conventional bulk crystal methods in terms of safety, cost, productivity and spatial resolution. We developed and integrated a suite of analytical instruments including a 193 nm ArF excimer laser system (RESOlution M-50A-LR), a quadrupole ICP-MS (Agilent 7700s), an Alphachron helium mass spectrometry system and swappable flow-through and ultra-high vacuum analytical chambers. The analytical protocols include the following steps: mounting/polishing in PFA Teflon using methods similar to those adopted for fission track etching; laser He extraction and analysis using a 2 s ablation at 5 Hz and 2-3 J/cm2 fluence; He pit volume measurement using atomic force microscopy; and U-Th-Sm-Pb (plus optional trace element) analysis using traditional laser ablation methods. The major analytical challenges for apatite include the low U, Th and He contents relative to zircon and the elevated common Pb content. On the other hand, apatite typically has less extreme and less complex zoning of parent isotopes (primarily U and Th). A freeware application has been developed for determining (U-Th-Sm)/He ages from the raw analytical data and Iolite software was used for U-Pb age and trace element determination.
In situ double-dating has successfully replicated conventional U-Pb and (U-Th)/He age variations in xenocrystic zircon from the diamondiferous Ellendale lamproite pipe, Western Australia and increased zircon analytical throughput by a factor of 50 over conventional methods. Reference: Evans NJ, McInnes BIA, McDonald B, Becker T, Vermeesch P, Danisik M, Shelley M, Marillo-Sialer E and Patterson D. An in situ technique for (U-Th-Sm)/He and U-Pb double dating. J Analytical Atomic Spectrometry, 30, 1636-1645.
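    For readers unfamiliar with the dating side, a minimal sketch of the (U-Th)/He age equation is shown below, solved by bisection (Sm and alpha-ejection corrections omitted for brevity; amounts in atoms, decay constants in 1/yr). This is the generic textbook form, not the freeware application mentioned above.

```python
from math import exp

# Decay constants (1/yr) for 238U, 235U, 232Th.
L238, L235, L232 = 1.55125e-10, 9.8485e-10, 4.9475e-11

def he_produced(t, u238, th232, u238_u235_ratio=137.88):
    """4He atoms produced in t years from u238 and th232 atoms (235U inferred)."""
    u235 = u238 / u238_u235_ratio
    return (8 * u238 * (exp(L238 * t) - 1)
            + 7 * u235 * (exp(L235 * t) - 1)
            + 6 * th232 * (exp(L232 * t) - 1))

def he_age(he, u238, th232, lo=0.0, hi=4.6e9):
    """Invert the age equation by bisection; he_produced is monotonic in t."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if he_produced(mid, u238, th232) < he:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A round trip (compute He for a known age, then invert) recovers the input age, which is the basic self-consistency check for such a solver.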

  13. The Superior Lambert Algorithm

    NASA Astrophysics Data System (ADS)

    der, G.

    2011-09-01

    Lambert algorithms are used extensively for initial orbit determination, mission planning, space debris correlation, and missile targeting, just to name a few applications. Due to the significance of the Lambert problem in Astrodynamics, Gauss, Battin, Godal, Lancaster, Gooding, Sun and many others (References 1 to 15) have provided numerous formulations leading to various analytic solutions and iterative methods. Most Lambert algorithms and their computer programs can only work within one revolution, break down or converge slowly when the transfer angle is near zero or 180 degrees, and their multi-revolution limitations are either ignored or barely addressed. Despite claims of robustness, many Lambert algorithms fail without notice, and the users seldom have a clue why. The DerAstrodynamics lambert2 algorithm, which is based on the analytic solution formulated by Sun, works for any number of revolutions and converges rapidly at any transfer angle. It provides significant capability enhancements over every other Lambert algorithm in use today. These include improved speed, accuracy, robustness, and multirevolution capabilities as well as implementation simplicity. Additionally, the lambert2 algorithm provides a powerful tool for solving the angles-only problem without artificial singularities (pointed out by Gooding in Reference 16), which involves 3 lines of sight captured by optical sensors, or systems such as the Air Force Space Surveillance System (AFSSS). The analytic solution is derived from the extended Godal’s time equation by Sun, while the iterative method of solution is that of Laguerre, modified for robustness. The Keplerian solution of a Lambert algorithm can be extended to include the non-Keplerian terms of the Vinti algorithm via a simple targeting technique (References 17 to 19). 
Accurate analytic non-Keplerian trajectories can be predicted for satellites and ballistic missiles, while performing at least 100 times faster in speed than most numerical integration methods.
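    The abstract credits a Laguerre iteration, modified for robustness, as the solver. A generic (unmodified) Laguerre iteration for a polynomial root looks like the sketch below; the sample polynomial is arbitrary, and this is not the lambert2 implementation.

```python
import cmath

def laguerre(p, dp, ddp, x, n, tol=1e-12, itmax=50):
    """Laguerre's method for a degree-n polynomial p with derivatives dp, ddp."""
    for _ in range(itmax):
        px = p(x)
        if abs(px) < tol:
            return x
        G = dp(x) / px
        H = G * G - ddp(x) / px
        root = cmath.sqrt((n - 1) * (n * H - G * G))
        # Choose the denominator of larger magnitude for numerical stability.
        d = G + root if abs(G + root) > abs(G - root) else G - root
        x -= n / d
    return x

# Sample problem: the real root of p(x) = x^3 - 1 starting from x = 2.
r = laguerre(lambda x: x**3 - 1, lambda x: 3 * x**2, lambda x: 6 * x, 2.0, 3)
```

Laguerre's method is popular in orbit solvers because it converges cubically near simple roots and is unusually tolerant of poor starting guesses.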

  14. ELECTRONICS UPGRADE TO THE SAVANNAH RIVER NATIONAL LABORATORY COULOMETER FOR PLUTONIUM AND NEPTUNIUM ASSAY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cordaro, J.; Holland, M.; Reeves, G.

    The Savannah River Site (SRS) has the analytical measurement capability to perform high-precision plutonium concentration measurements by controlled-potential coulometry. State-of-the-art controlled-potential coulometers were designed and fabricated by the Savannah River National Laboratory (SRNL) and installed in the Analytical Laboratories process control laboratory. The Analytical Laboratories uses coulometry for routine accountability measurements and for verification of standard preparations used to calibrate other plutonium measurement systems routinely applied to process control, nuclear safety, and other accountability applications. The SRNL coulometer has a demonstrated measurement reliability of approximately 0.05% for 10 mg samples. The system has also been applied to the characterization of neptunium standard solutions with a comparable reliability. The SRNL coulometer features: a patented current integration system; continuous electrical calibration versus Faraday's constant and Ohm's law; the control-potential adjustment technique for enhanced application of the Nernst equation; a wide operating room temperature range; and a fully automated instrument control and data acquisition capability. Systems have been supplied to the International Atomic Energy Agency (IAEA), Russia, the Japanese Atomic Energy Agency (JAEA) and the New Brunswick Laboratory (NBL). The most recent vintage of electronics was based on early 1990s integrated circuits, many of which are no longer available. At the request of the IAEA and the Department of State, SRNL has completed an electronics upgrade of its controlled-potential coulometer design. Three systems have been built with the new design: one for the IAEA, installed at SAL in May 2011; one for Los Alamos National Laboratory (LANL); and one for the SRS Analytical Laboratory. The LANL and SRS systems are undergoing startup testing with installation scheduled for this summer.

  15. A Delayed Neutron Counting System for the Analysis of Special Nuclear Materials

    NASA Astrophysics Data System (ADS)

    Sellers, Madison Theresa

    Nuclear forensic analysis is a modern science that uses numerous analytical techniques to identify and attribute nuclear materials in the event of a nuclear explosion, radiological terrorist attack or the interception of illicit nuclear material smuggling. The Canadian Department of National Defence has participated in recent international exercises that have highlighted the Nation's requirement to develop nuclear forensics expertise, protocol and capabilities, specifically pertaining to the analysis of special nuclear materials (SNM). A delayed neutron counting (DNC) system has been designed and established at the Royal Military College of Canada (RMC) to enhance the Government's SNM analysis capabilities. This analytical technique complements those already at RMC by providing a rapid and non-destructive method for the analysis of the fissile isotopes of both uranium (U) and plutonium (Pu). The SLOWPOKE-2 reactor at RMC produces a predominantly thermal neutron flux. These neutrons induce fission in the SNM isotopes 233U, 235U and 239Pu, releasing prompt fast neutrons, energy and radioactive fission fragments. Some of these fission fragments undergo beta-decay and subsequently emit neutrons, which can be recorded by an array of sensitive 3He detectors. The significant time period between the fission process and the release of these neutrons results in their identification as 'delayed neutrons'. The recorded neutron spectrum varies with time and the count rate curve is unique to each fissile isotope. In-house software, developed by this project, can analyze this delayed neutron curve and provide the fissile mass in the sample. Extensive characterization of the DNC system has been performed with natural U samples with 235U content ranging from 2-7 µg.
The system efficiency and dead time behaviour determined by the natural uranium sample analyses were validated by depleted uranium samples with similar quantities of 235U, resulting in a typical relative error of 3.6%. The system has accurately determined 235U content over three orders of magnitude, with 235U amounts as low as 10 ng. The results have also been proven to be independent of small variations in total analyte volume and geometry, indicating that it is an ideal technique for the analysis of samples containing SNM in a variety of different matrices. The Analytical Sciences Group at RMC plans to continue DNC system development to include 233U and 239Pu analysis and mixtures of SNM isotopes. Keywords: delayed neutron counting, special nuclear materials, nuclear forensics.
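    The final mass determination in a comparator-calibrated counting system of this kind can be sketched as follows: a calibration constant k (net delayed-neutron counts per microgram of 235U under a fixed irradiate-decay-count cycle) converts an unknown's net count to a fissile mass, with a Poisson counting uncertainty. The constant and counts below are hypothetical, and this is not RMC's in-house software.

```python
from math import sqrt

def dnc_mass(gross_counts, background, k):
    """Fissile mass (µg) and 1-sigma counting uncertainty from net DN counts.

    k is the calibration constant in net counts per µg under the fixed
    irradiation/decay/count protocol (hypothetical value in the test).
    """
    net = gross_counts - background
    mass = net / k
    # Poisson counting statistics: var(net) = gross + background.
    sigma = sqrt(gross_counts + background) / k
    return mass, sigma
```

With k = 2000 counts/µg, 10 400 gross counts over a 400-count background correspond to 5 µg of 235U with roughly 1% counting uncertainty.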

  16. A Label-Free Porous Silicon Immunosensor for Broad Detection of Opiates in a Blind Clinical Study and Result Comparison to Commercial Analytical Chemistry Techniques

    PubMed Central

    Bonanno, Lisa M.; Kwong, Tai C.; DeLouise, Lisa A.

    2010-01-01

    In this work we evaluate for the first time the performance of a label-free porous silicon (PSi) immunosensor assay in a blind clinical study designed to screen authentic patient urine specimens for a broad range of opiates. The PSi opiate immunosensor achieved 96% concordance with liquid chromatography-mass spectrometry/tandem mass spectrometry (LC-MS/MS) results on samples that underwent standard opiate testing (n=50). In addition, successful detection of a commonly abused opiate, oxycodone, resulted in 100% qualitative agreement between the PSi opiate sensor and LC-MS/MS. In contrast, a commercial broad opiate immunoassay technique (CEDIA®) achieved 65% qualitative concordance with LC-MS/MS. Evaluation of important performance attributes including precision, accuracy, and recovery was completed on blank urine specimens spiked with test analytes. Variability of morphine detection as a model opiate target was < 9% both within-run and between-day at and above the cutoff limit of 300 ng ml−1. This study validates the analytical screening capability of label-free PSi opiate immunosensors in authentic patient samples and is the first semi-quantitative demonstration of the technology’s successful clinical use. These results motivate future development of PSi technology to reduce complexity and cost of diagnostic testing particularly in a point-of-care setting. PMID:21062030
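    The concordance figures quoted above (96% vs. 65% against LC-MS/MS) reduce to a simple fraction of qualitative agreements at the assay cutoff; a minimal sketch:

```python
def concordance(screen, confirm):
    """Fraction of specimens where the screen and the confirmatory method agree
    qualitatively (both positive or both negative at the cutoff)."""
    agree = sum(s == c for s, c in zip(screen, confirm))
    return agree / len(screen)
```

For example, four specimens with one disagreement give 75% concordance.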

  17. Analytical impact time and angle guidance via time-varying sliding mode technique.

    PubMed

    Zhao, Yao; Sheng, Yongzhi; Liu, Xiangdong

    2016-05-01

    To concretely provide a feasible solution for homing missiles with precise impact time and angle, this paper develops a novel guidance law based on the nonlinear engagement dynamics. The guidance law is firstly designed with the prior assumption of a stationary target, followed by the practical extension to a moving target scenario. The time-varying sliding mode (TVSM) technique is applied to fulfill the terminal constraints, in which a specific TVSM surface is constructed with two unknown coefficients. One is tuned to meet the impact time requirement and the other is targeted with a global sliding mode, so that the impact angle constraint as well as the zero miss distance can be satisfied. Because the proposed law possesses three guidance gains as design parameters, the intercept trajectory can be shaped according to the operational conditions and the missile's capability. To improve the tolerance of initial heading errors and broaden the application, a new frame of reference is also introduced. Furthermore, the analytical solutions of the flight trajectory, heading angle and acceleration command can be fully expressed for prediction and offline parameter selection by solving a first-order linear differential equation. Numerical simulation results for various scenarios validate the effectiveness of the proposed guidance law and demonstrate the accuracy of the analytic solutions. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.
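    The closed-form step the authors exploit, solving a first-order linear differential equation, can be illustrated generically. The sketch below assumes constant coefficients (y' + a*y = b, solved by the integrating-factor formula) and is not the guidance law's actual equation; it simply checks the analytic solution against Euler integration.

```python
from math import exp

def analytic(y0, a, b, t):
    """Closed-form solution of y' + a*y = b: y(t) = (y0 - b/a) e^{-a t} + b/a."""
    return (y0 - b / a) * exp(-a * t) + b / a

def euler(y0, a, b, t, n=100_000):
    """Forward-Euler integration of the same ODE for comparison."""
    dt, y = t / n, y0
    for _ in range(n):
        y += dt * (b - a * y)
    return y
```

With a fine enough step, the numerical trajectory matches the closed form, which is exactly what makes offline parameter selection from the analytic expression trustworthy.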

  18. Sequential injection analysis with chemiluminescence detection for rapid monitoring of commercial Calendula officinalis extractions.

    PubMed

    Hughes, Rachel R; Scown, David; Lenehan, Claire E

    2015-01-01

    Plant extracts containing high levels of antioxidants are desirable due to their reported health benefits. Most techniques capable of determining the antioxidant activity of plant extracts are unsuitable for rapid at-line analysis as they require extensive sample preparation and/or long analysis times. Therefore, analytical techniques capable of real-time or pseudo real-time at-line monitoring of plant extractions, and determination of extraction endpoints, would be useful to manufacturers of antioxidant-rich plant extracts. The objective was to develop a reliable method for rapid at-line extraction monitoring of antioxidants in plant extracts. Calendula officinalis extracts were prepared from dried flowers and analysed for antioxidant activity using sequential injection analysis (SIA) with chemiluminescence (CL) detection. The intensity of CL emission from the reaction of acidic potassium permanganate with antioxidants within the extract was used as the analytical signal. The SIA-CL method was applied to monitor the extraction of C. officinalis over the course of a batch extraction to determine the extraction endpoint. Results were compared with those from ultra high performance liquid chromatography (UHPLC). Pseudo real-time, at-line monitoring showed the level of antioxidants in a batch extract of C. officinalis plateaued after 100 min of extraction. These results correlated well with those of an offline UHPLC study. SIA-CL was found to be a suitable method for pseudo real-time monitoring of plant extractions and determination of extraction endpoints with respect to antioxidant concentrations. The method was applied at-line in the manufacturing industry. Copyright © 2015 John Wiley & Sons, Ltd.
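    The endpoint logic implied by "plateaued after 100 min" can be sketched as a sliding-window plateau test on the monitored signal. The window, threshold, and data below are illustrative assumptions, not the published method.

```python
def extraction_endpoint(times, signals, window=3, rel_tol=0.02):
    """Return the first time at which the signal's relative rise over the last
    `window` samples drops below rel_tol, i.e. the curve has plateaued."""
    for i in range(window, len(signals)):
        prev, cur = signals[i - window], signals[i]
        if prev > 0 and (cur - prev) / prev < rel_tol:
            return times[i]
    return None  # no plateau detected within the monitored run

# Hypothetical CL intensities sampled every 20 min of a batch extraction.
times = [0, 20, 40, 60, 80, 100, 120, 140, 160]
signals = [0, 30, 55, 70, 80, 84, 85, 85.3, 85.4]
```

On these invented data the rise over three samples first falls below 2% at t = 160 min, so the batch would be declared complete there.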

  19. Development of a Multiplexed Liquid Chromatography Multiple-Reaction-Monitoring Mass Spectrometry (LC-MRM/MS) Method for Evaluation of Salivary Proteins as Oral Cancer Biomarkers*

    PubMed Central

    Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting

    2017-01-01

    Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. PMID:28235782

  20. Knowledge-based geographic information systems (KBGIS): New analytic and data management tools

    USGS Publications Warehouse

    Albert, T.M.

    1988-01-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can greatly enhance the capabilities of a GIS, particularly in handling the very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the U.S. Geological Survey which incorporates AI techniques such as learning, expert systems, and new data representations, among others. The system, which will be developed further and applied, is a prototype of the next generation of GISs, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. © 1988 International Association for Mathematical Geology.
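    The phrase "spatially indexed" can be made concrete with a minimal sketch: a uniform grid index that answers rectangular window queries without scanning every feature. The cell size and data are hypothetical; KBGIS's actual data structures are not described in the abstract.

```python
from collections import defaultdict

class GridIndex:
    """Bucket point features into grid cells so window queries touch few cells."""

    def __init__(self, cell=10.0):
        self.cell = cell
        self.cells = defaultdict(list)

    def insert(self, fid, x, y):
        key = (int(x // self.cell), int(y // self.cell))
        self.cells[key].append((fid, x, y))

    def query(self, xmin, ymin, xmax, ymax):
        hits = []
        for cx in range(int(xmin // self.cell), int(xmax // self.cell) + 1):
            for cy in range(int(ymin // self.cell), int(ymax // self.cell) + 1):
                for fid, x, y in self.cells[(cx, cy)]:
                    if xmin <= x <= xmax and ymin <= y <= ymax:
                        hits.append(fid)
        return sorted(hits)

idx = GridIndex()
idx.insert(1, 5.0, 5.0)
idx.insert(2, 25.0, 8.0)
idx.insert(3, 12.0, 40.0)
```

Production systems use quadtrees or R-trees for the same purpose, but the principle, prune by location before testing geometry, is identical.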

  1. Theoretical and software considerations for nonlinear dynamic analysis

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

    In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that has been used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends multilevel substructure modeling to dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.
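    The static condensation at the core of substructuring can be sketched in a few lines: internal degrees of freedom are eliminated so a substructure interacts with the rest of the model only through its boundary. The three-spring system and the partitioning below are illustrative, not from the report:

    ```python
    import numpy as np

    def condense(K, f, internal):
        """Statically condense internal DOFs out of K x = f (Guyan reduction):
        K_red = K_bb - K_bi K_ii^-1 K_ib, and the matching reduced load."""
        n = K.shape[0]
        boundary = [d for d in range(n) if d not in set(internal)]
        Kbb = K[np.ix_(boundary, boundary)]
        Kbi = K[np.ix_(boundary, internal)]
        Kib = K[np.ix_(internal, boundary)]
        Kii_inv = np.linalg.inv(K[np.ix_(internal, internal)])
        K_red = Kbb - Kbi @ Kii_inv @ Kib
        f_red = f[boundary] - Kbi @ Kii_inv @ f[internal]
        return K_red, f_red, boundary

    # Three unit springs in series: DOFs 0-1-2, with DOF 1 treated as internal.
    K = np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  2.0]])
    f = np.array([1.0, 0.0, 0.0])
    K_red, f_red, boundary = condense(K, f, internal=[1])

    # The condensed system reproduces the boundary displacements of the full solve.
    x_full = np.linalg.solve(K, f)
    x_red = np.linalg.solve(K_red, f_red)
    ```

    Applied recursively, this is what lets a multilevel scheme assemble a huge model from small condensed blocks without ever forming the full system.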

  2. Four-dimensional modeling of recent vertical movements in the area of the southern California uplift

    USGS Publications Warehouse

    Vanicek, Petr; Elliot, Michael R.; Castle, Robert O.

    1979-01-01

    This paper describes an analytical technique that utilizes scattered geodetic relevelings and tide-gauge records to portray Recent vertical crustal movements that may have been characterized by spasmodic changes in velocity. The technique is based on the fitting of a time-varying algebraic surface of prescribed degree to the geodetic data treated as tilt elements and to tide-gauge readings treated as point movements. Desired variations in time can be selected as any combination of powers of vertical movement velocity and episodic events. The state of the modeled vertical displacement can be shown for any number of dates for visual display. Statistical confidence limits of the modeled displacements, derived from the density of measurements in both space and time, line length, and accuracy of input data, are also provided. The capabilities of the technique are demonstrated on selected data from the region of the southern California uplift. 
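    The surface-fitting step can be illustrated with a minimal least-squares sketch, assuming a velocity surface linear in the spatial coordinates and movements linear in time; the paper's prescribed-degree polynomials and episodic terms are omitted, and all numbers are synthetic:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "true" velocity surface v(x, y) = 2 + 0.5 x - 0.3 y (mm/yr),
    # observed as point movements w = v(x, y) * t at scattered sites and epochs.
    true = np.array([2.0, 0.5, -0.3])
    x, y = rng.uniform(-1, 1, 40), rng.uniform(-1, 1, 40)
    t = rng.uniform(1, 10, 40)                     # years since reference epoch
    w = (true[0] + true[1] * x + true[2] * y) * t  # noise-free movements

    # Design matrix for the time-varying algebraic surface w = (a0 + a1 x + a2 y) t;
    # a least-squares fit recovers the velocity-surface coefficients.
    A = np.column_stack([t, x * t, y * t])
    coef, *_ = np.linalg.lstsq(A, w, rcond=None)
    ```

    Tilt elements would enter the same normal equations as observations of the spatial derivatives of w, and the residual covariance is what yields the confidence limits mentioned in the abstract.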

  3. Analysis of peptides using an integrated microchip HPLC-MS/MS system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirby, Brian J.; Chirica, Gabriela S.; Reichmuth, David S.

    Hyphenated LC-MS techniques are quickly becoming the standard tool for proteomic analyses. For large homogeneous samples, bulk processing methods and capillary injection and separation techniques are suitable. However, for analysis of small or heterogeneous samples, techniques that can manipulate picoliter samples without dilution are required or samples will be lost or corrupted; further, static nanospray-type flow rates are required to maximize SNR. Microchip-level integration of sample injection with separation and mass spectrometry allows small-volume analytes to be processed on chip and immediately injected without dilution for analysis. An on-chip HPLC was fabricated using in situ polymerization of both fixed and mobile polymer monoliths. Integration of the chip with a nanospray MS emitter enables identification of peptides by the use of tandem MS. The chip is capable of analyzing very small sample volumes (< 200 pl) in short times (< 3 min).

  4. Measurement of absolute regional lung air volumes from near-field x-ray speckles.

    PubMed

    Leong, Andrew F T; Paganin, David M; Hooper, Stuart B; Siew, Melissa L; Kitchen, Marcus J

    2013-11-18

    Propagation-based phase contrast x-ray (PBX) imaging yields high contrast images of the lung where airways that overlap in projection coherently scatter the x-rays, giving rise to a speckled intensity due to interference effects. Our previous works have shown that total and regional changes in lung air volumes can be accurately measured from two-dimensional (2D) absorption or phase contrast images when the subject is immersed in a water-filled container. In this paper we demonstrate how the phase contrast speckle patterns can be used to directly measure absolute regional lung air volumes from 2D PBX images without the need for a water-filled container. We justify this technique analytically and via simulation using the transport-of-intensity equation and calibrate the technique using our existing methods for measuring lung air volume. Finally, we show the full capabilities of this technique for measuring regional differences in lung aeration.

  5. Soluble Protein Analysis using a Compact Bench-top Flow Cytometer

    NASA Technical Reports Server (NTRS)

    Pappas, Dimitri; Kao, Shib-Hsin; Cyr, Johnathan

    2004-01-01

    Future space exploration missions will require analytical technology capable of providing both autonomous medical care to the crew and investigative capabilities to researchers. While several promising candidate technologies exist for further development, flow cytometry is an attractive technology as it offers both crew health assays (blood cell count, leukocyte differential, etc.) and a wide array of biochemistry and immunology assays. In research settings, the application of this technique to soluble protein analysis is also possible. Proteomic beads using fluorescent dyes for optical encoding were used to monitor six cytokines simultaneously in the medium of cell cultures in stationary and rotating cell culture systems. The results of this work demonstrate that a compact flow cytometer, such as a system proposed for space flight, can detect a variety of soluble proteins for crew health and biotechnology experiments during long-term missions.

  6. Statistical and temporal irradiance fluctuations modeling for a ground-to-geostationary satellite optical link.

    PubMed

    Camboulives, A-R; Velluet, M-T; Poulenard, S; Saint-Antonin, L; Michau, V

    2018-02-01

    The performance of an optical communication link between the ground and a geostationary satellite can be impaired by scintillation, beam wandering, and beam spreading caused by propagation through atmospheric turbulence. These effects on link performance can be mitigated by tracking and by error-correction codes coupled with interleaving. Precise numerical tools capable of describing the irradiance fluctuations statistically and of creating irradiance time series are needed to characterize the benefits of these techniques and to optimize them. Wave-optics propagation methods have proven capable of modeling the effects of atmospheric turbulence on a beam, but they are computationally intensive. We present an analytical-numerical model that provides good results for the probability density functions of the irradiance fluctuations as well as time series, with considerable savings in time and computational resources.
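    The statistical side of such a model can be sketched by generating a temporally correlated log-normal irradiance series with a prescribed scintillation index, a common assumption for weak-to-moderate turbulence. The AR(1) correlation structure and all parameter values below are illustrative assumptions, not the authors' model:

    ```python
    import numpy as np

    def irradiance_series(n, scint_index, rho, seed=0):
        """Sketch of a log-normal irradiance time series with unit mean,
        scintillation index sigma_I^2 = scint_index, and AR(1) correlation rho."""
        rng = np.random.default_rng(seed)
        var_chi = np.log(1.0 + scint_index)           # log-amplitude variance
        eps = rng.normal(0.0, np.sqrt(var_chi * (1.0 - rho**2)), n)
        chi = np.empty(n)
        chi[0] = rng.normal(0.0, np.sqrt(var_chi))    # stationary start
        for k in range(1, n):                         # AR(1) keeps var(chi) fixed
            chi[k] = rho * chi[k - 1] + eps[k]
        return np.exp(chi - var_chi / 2.0)            # shift gives E[I] = 1

    I = irradiance_series(200_000, scint_index=0.2, rho=0.95)
    ```

    A fade-statistics or interleaver study would then threshold this series; generating it costs a trivial fraction of a full wave-optics run.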

  7. Scanning lidar fluorosensor for remote diagnostic of surfaces

    NASA Astrophysics Data System (ADS)

    Caneve, Luisa; Colao, Francesco; Fantoni, Roberta; Fiorani, Luca

    2013-08-01

    Scanning hyperspectral systems based on laser induced fluorescence (LIF) have been developed and realized at ENEA, allowing information of analytical and qualitative interest on different materials to be obtained from the study of their fluorescence emission. For surface analysis, this technique is fast, remote, noninvasive, and specific. A new compact setup capable of fast acquisition of 2D monochromatic images on up to 90 different spectral channels in the visible/UV range will be presented. It has been recently built with the aim of improving performance in terms of spatial resolution, time-resolved capabilities, and data acquisition speed. Major achievements have been reached through a critical review of the optical design. Results recently obtained with in-situ measurements of interest for applications in the field of cultural heritage will be shown.

  8. Nuclear Resonance Fluorescence to Measure Plutonium Mass in Spent Nuclear Fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ludewigt, Bernhard A; Quiter, Brian J.; Ambers, Scott D.

    2011-01-14

    The Next Generation Safeguards Initiative (NGSI) of the U.S. Department of Energy is supporting a multi-lab/university collaboration to quantify the plutonium (Pu) mass in spent nuclear fuel (SNF) assemblies and to detect the diversion of pins with non-destructive assay (NDA) methods. The following 14 NDA techniques are being studied: Delayed Neutrons, Differential Die-Away, Differential Die-Away Self-Interrogation, Lead Slowing Down Spectrometer, Neutron Multiplicity, Passive Neutron Albedo Reactivity, Total Neutron (Gross Neutron), X-Ray Fluorescence, {sup 252}Cf Interrogation with Prompt Neutron Detection, Delayed Gamma, Nuclear Resonance Fluorescence, Passive Prompt Gamma, Self-Interrogation Neutron Resonance Densitometry, and Neutron Resonance Transmission Analysis. Understanding and maturity of the techniques vary greatly, ranging from decades-old, well-understood methods to new approaches. Nuclear Resonance Fluorescence (NRF) is a technique that had not previously been studied for SNF assay or similar applications. Since NRF generates isotope-specific signals, the promise and appeal of the technique lie in its potential to directly measure the amount of a specific isotope in an SNF assay target. The objectives of this study were to design and model suitable NRF measurement methods, to quantify capabilities and corresponding instrumentation requirements, and to evaluate the prospects and potential of NRF for SNF assay. The main challenge of the technique is to achieve the sensitivity and precision, i.e., to accumulate the counting statistics, required for quantifying the mass of Pu isotopes in SNF assemblies. Systematic errors, considered a lesser problem for a direct measurement and only briefly discussed in this report, need to be evaluated for specific instrument designs in the future.
Also, since the technical capability of using NRF to measure Pu in SNF has not been established, this report does not directly address issues such as cost, size, and development time, or concerns related to the use of Pu in measurement systems. This report discusses basic NRF measurement concepts, i.e., backscatter and transmission methods, and photon source and {gamma}-ray detector options in Section 2. An analytical model for calculating NRF signal strengths is presented in Section 3, together with enhancements to the MCNPX code and descriptions of modeling techniques that were drawn upon in the following sections. Making extensive use of the model and MCNPX simulations, the capabilities of the backscatter and transmission methods based on bremsstrahlung or quasi-monoenergetic photon sources were analyzed, as described in Sections 4 and 5. A recent transmission experiment is reported on in Appendix A. While this experiment was not directly part of this project, its results provide an important reference point for our analytical estimates and MCNPX simulations. Used fuel radioactivity calculations, the enhancements to the MCNPX code, and details of the MCNPX simulations are documented in the other appendices.

  9. Thermal Characterization of Defects in Aircraft Structures Via Spatially Controlled Heat Application

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    1997-01-01

    Recent advances in thermal imaging technology have spawned a number of new thermal NDE techniques that provide quantitative information about flaws in aircraft structures. Thermography has a number of advantages as an inspection technique. It is a totally noncontacting, nondestructive imaging technology capable of inspecting a large area in a matter of a few seconds. The development of fast, inexpensive image processors has added to the attractiveness of thermography as an NDE technique. These image processors have increased the signal-to-noise ratio of thermography and facilitated significant advances in post-processing. The resulting digital images enable archival records for comparison with later inspections, thus providing a means of monitoring the evolution of damage in a particular structure. The National Aeronautics and Space Administration's Langley Research Center has developed a thermal NDE technique designed to image a number of potential flaws in aircraft structures. The technique involves injecting a small, spatially controlled heat flux into the outer surface of an aircraft. Images of fatigue cracking, bond integrity, and material loss due to corrosion are generated from measurements of the induced surface temperature variations. This paper presents a discussion of the development of the thermal imaging system as well as the techniques used to analyze the resulting thermal images. Spatial tailoring of the heat, coupled with the analysis techniques, represents a significant improvement in the detectability of flaws over conventional thermal imaging. Results of laboratory experiments on fabricated crack, disbond, and material loss samples are presented to demonstrate the capabilities of the technique. An integral part of the development of this technology is the use of analytic and computational modeling. The experimental results are compared with these models to demonstrate the utility of such an approach.
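    The role of such modeling can be illustrated with a conservative 1-D finite-difference sketch of flash heating: a region thinned by corrosion retains the deposited heat in less metal and so appears hotter at the surface. The material properties, dimensions, and pulse energy below are illustrative, not from the paper:

    ```python
    import numpy as np

    def surface_temp(thickness, q=1e-3, alpha=9.7e-5, t_end=0.5, n=50):
        """Explicit 1-D conduction sketch: energy q (scaled units) is flash-
        deposited on the front face of a plate with adiabatic boundaries and
        diffused for t_end seconds. alpha is a nominal aluminum-like
        thermal diffusivity in m^2/s."""
        dx = thickness / n
        dt = 0.4 * dx * dx / alpha               # stable explicit step (r = 0.4)
        T = np.zeros(n)
        T[0] = q / dx                            # flash heating of the front node
        for _ in range(int(t_end / dt)):
            F = alpha * dt / dx**2 * np.diff(T)  # conservative inter-node fluxes
            T[:-1] += F                          # zero-flux (insulated) ends
            T[1:] -= F
        return T[0]

    full = surface_temp(0.004)      # nominal 4 mm skin
    thinned = surface_temp(0.003)   # 25 % wall loss from hidden corrosion
    ```

    At late times each plate equilibrates near q/L, so the thinned region reads hotter in the thermal image; comparing measured cooling curves against such a model is what turns contrast into a quantitative thickness estimate.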

  10. SOIL AND SEDIMENT SAMPLING METHODS

    EPA Pesticide Factsheets

    The EPA Office of Solid Waste and Emergency Response's (OSWER) Office of Superfund Remediation and Technology Innovation (OSRTI) needs innovative methods and techniques to solve new and difficult sampling and analytical problems found at the numerous Superfund sites throughout the United States. Inadequate site characterization and a lack of knowledge of surface and subsurface contaminant distributions hinder EPA's ability to make the best decisions on remediation options and to conduct the most effective cleanup efforts. To assist OSWER, to improve its capability to more accurately, precisely, and efficiently characterize Superfund, RCRA, LUST, oil-spill, and brownfield sites, and to improve its risk-based decision-making capabilities, NERL is conducting research on improving soil and sediment sampling techniques and on the sampling and handling of volatile organic compound (VOC)-contaminated soils, among the many research programs and tasks being performed at ESD-LV. Under this task, improved sampling approaches and devices will be developed for characterizing the concentration of VOCs in soils. Current approaches and devices can lose up to 99% of the VOCs present in the sample due to inherent weaknesses in the device and improper or inadequate collection techniques. This error generally causes decision makers to markedly underestimate soil VOC concentrations and, therefore, to greatly underestimate the ecological

  11. Army medical laboratory telemedicine: role of mass spectrometry in telediagnosis for chemical and biological defense.

    PubMed

    Smith, J R; Shih, M L; Price, E O; Platoff, G E; Schlager, J J

    2001-12-01

    An army medical field laboratory presently has the capability of performing standard protocols developed at the US Army Medical Research Institute of Chemical Defense for verification of nerve agent or sulfur mustard exposure. The protocols analyze hydrolysis products of chemical warfare agents using gas chromatography/mass spectrometry. Additionally, chemical warfare agents can produce alkylated or phosphorylated proteins following human exposure that have long biological half-lives and can be used as diagnostic biomarkers of chemical agent exposure. An analytical technique known as matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF/MS) currently is being examined for its potential to analyze these biomarkers. The technique is capable of detecting large biomolecules and modifications made to them. Its fast analysis time makes MALDI-TOF/MS technology suitable for screening casualties from chemical or biological attacks. Basic operation requires minimal training and the instrument has the potential to become field-portable. The limitation of the technique is that the generated data may require considerable expertise from knowledgeable personnel for consultation to ensure correct interpretation. The interaction between research scientists and field personnel in the acquisition of data and its interpretation via advanced digital telecommunication technologies can enhance rapid diagnosis and subsequently improve patient care in remote areas. Copyright 2001 John Wiley & Sons, Ltd.

  12. Stable oxygen and hydrogen isotopes of brines - comparing isotope ratio mass spectrometry and isotope ratio infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Ahrens, Christian; Koeniger, Paul; van Geldern, Robert; Stadler, Susanne

    2013-04-01

    Today's standard analytical methods for high-precision stable isotope analysis of fluids are gas-water equilibration and high-temperature pyrolysis coupled to isotope ratio mass spectrometers (IRMS). In recent years, relatively new laser-based analytical instruments have entered the market that are said to provide high-precision isotope data on nearly every medium. This optical technique is referred to as isotope ratio infrared spectroscopy (IRIS). The objective of this study is to evaluate the capability of this new instrument type for highly saline solutions and to compare the analytical results with traditional IRMS analysis. It has been shown for the equilibration method that the presence of salts influences the measured isotope values depending on the salt concentration (see Lécuyer et al., 2009; Martineau et al., 2012). This so-called 'isotope salt effect' depends on the salt type and salt concentration. These factors change the activity in the fluid and therefore shift the isotope ratios measured by the equilibration method. Consequently, correction factors have to be applied to these analytical data. Direct conversion techniques like pyrolysis or the new laser instruments measure the water molecule from the sample directly and should therefore not suffer from the salt effect, i.e. no corrections of raw values are necessary. However, high salt concentrations might cause technical problems with the analytical hardware and may require labor-intensive sample preparation (e.g. vacuum distillation). This study evaluates the isotope salt effect for the IRMS equilibration technique (Thermo Gasbench II coupled to a Delta Plus XP) and for laser-based IRIS instruments with liquid injection (Picarro L2120-i). Synthetic salt solutions (NaCl, KCl, CaCl2, MgCl2, MgSO4, CaSO4) and natural brines collected from the Stassfurt Salt Anticline (Germany; Stadler et al., 2012) were analysed with both techniques.
Salt concentrations ranged from seawater salinity up to full saturation. References: Lécuyer, C. et al. (2009). Chem. Geol., 264, 122-126. [doi:10.1016/j.chemgeo.2009.02.017] Martineau, F. et al. (2012). Chem. Geol., 291, 236-240. [doi:10.1016/j.chemgeo.2011.10.017] Stadler, S. et al. (2012). Chem. Geol., 294-295, 226-242. [doi:10.1016/j.chemgeo.2011.12.006]

  13. Hybrid dynamic radioactive particle tracking (RPT) calibration technique for multiphase flow systems

    NASA Astrophysics Data System (ADS)

    Khane, Vaibhav; Al-Dahhan, Muthanna H.

    2017-04-01

    The radioactive particle tracking (RPT) technique has been utilized to measure three-dimensional hydrodynamic parameters for multiphase flow systems. An analytical solution to the inverse problem of the RPT technique, i.e. finding the instantaneous tracer positions from the instantaneous counts received in the detectors, is not possible; therefore, a calibration to obtain a counts-distance map is needed. The conventional RPT calibration method has major shortcomings that limit its applicability in practical applications. In this work, a novel dynamic RPT calibration technique was designed and developed to overcome these shortcomings. The dynamic RPT calibration technique was implemented around a test reactor 1 foot in diameter and 1 foot in height, using Cobalt-60 as the isotope tracer particle. Two sets of experiments were carried out to test the capability of the novel dynamic RPT calibration. In the first set of experiments, a manual calibration apparatus was used to hold the tracer particle at known static locations. In the second set, the tracer particle was moved vertically downward along a straight-line path in a controlled manner. The reconstructed tracer particle positions were compared with the actual known positions and the reconstruction errors were estimated. The results revealed that the dynamic RPT calibration technique is capable of identifying tracer particle positions with a reconstruction error between 1 and 5.9 mm for the conditions studied, which could be improved depending on various factors outlined here.
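    The inverse problem that the calibration supports can be sketched with a toy counts-distance map: given counts in several detectors, search a calibrated grid of candidate positions for the one whose predicted counts fit best. The detector geometry and the simple inverse-square count model below are illustrative assumptions, not the authors' setup:

    ```python
    import numpy as np

    # Four detector positions around a small test vessel (metres, illustrative).
    detectors = np.array([[0.2, 0.0, 0.1], [-0.2, 0.0, 0.2],
                          [0.0, 0.2, 0.15], [0.0, -0.2, 0.05]])

    def expected_counts(p, strength=1e4):
        """Toy counts-distance map: counts fall off with inverse-square distance."""
        d2 = ((detectors - p) ** 2).sum(axis=1)
        return strength / d2

    def reconstruct(counts, grid):
        """Pick the grid point whose predicted counts best match the measured
        counts in a least-squares sense (the inverse problem has no closed form)."""
        errs = [((expected_counts(g) - counts) ** 2).sum() for g in grid]
        return grid[int(np.argmin(errs))]

    # Calibration-style grid of candidate positions inside the vessel.
    axis = np.linspace(-0.1, 0.1, 21)
    grid = np.array([[x, y, z] for x in axis for y in axis
                     for z in np.linspace(0.0, 0.2, 21)])

    true_pos = np.array([0.03, -0.05, 0.12])
    measured = expected_counts(true_pos)      # noise-free for the sketch
    est = reconstruct(measured, grid)
    ```

    With a 1 cm grid the reconstruction lands on the correct cell; real RPT adds Poisson counting noise and attenuation by the flow, which is exactly why the counts-distance map must be measured rather than assumed.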

  14. Coupling Front-End Separations, Ion Mobility Spectrometry, and Mass Spectrometry For Enhanced Multidimensional Biological and Environmental Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Xueyun; Wojcik, Roza; Zhang, Xing

    Ion mobility spectrometry (IMS) is a widely used analytical technique for rapid molecular separations in the gas phase. IMS alone is useful, but its coupling with mass spectrometry (MS) and front-end separations has been extremely beneficial for increasing measurement sensitivity, peak capacity of complex mixtures, and the scope of molecular information in biological and environmental sample analyses. Multiple studies in disease screening and environmental evaluations have even shown these IMS-based multidimensional separations extract information not possible with each technique individually. This review highlights 3-dimensional separations using IMS-MS in conjunction with a range of front-end techniques, such as gas chromatography (GC), supercritical fluid chromatography (SFC), liquid chromatography (LC), solid phase extractions (SPE), capillary electrophoresis (CE), field asymmetric ion mobility spectrometry (FAIMS), and microfluidic devices. The origination, current state, various applications, and future capabilities for these multidimensional approaches are described to provide insight into the utility and potential of each technique.

  15. Quantitative measurement of solvation shells using frequency modulated atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Uchihashi, T.; Higgins, M.; Nakayama, Y.; Sader, J. E.; Jarvis, S. P.

    2005-03-01

    The nanoscale specificity of interaction measurements and additional imaging capability of the atomic force microscope make it an ideal technique for measuring solvation shells in a variety of liquids next to a range of materials. Unfortunately, the widespread use of atomic force microscopy for the measurement of solvation shells has been limited by uncertainties over the dimensions, composition and durability of the tip during the measurements, and problems associated with quantitative force calibration of the most sensitive dynamic measurement techniques. We address both these issues by the combined use of carbon nanotube high aspect ratio probes and quantifying the highly sensitive frequency modulation (FM) detection technique using a recently developed analytical method. Due to the excellent reproducibility of the measurement technique, additional information regarding solvation shell size as a function of proximity to the surface has been obtained for two very different liquids. Further, it has been possible to identify differences between chemical and geometrical effects in the chosen systems.

  16. Preliminary geological investigation of AIS data at Mary Kathleen, Queensland, Australia

    NASA Technical Reports Server (NTRS)

    Huntington, J. F.; Green, A. A.; Craig, M. D.; Cocks, T. D.

    1986-01-01

    The Airborne Imaging Spectrometer (AIS) was flown over granitic, volcanic, and calc-silicate terrain around the Mary Kathleen Uranium Mine in Queensland, in a test of its mineralogical mapping capabilities. An analysis strategy and restoration and enhancement techniques were developed to process the 128-band AIS data. A preliminary analysis of one of three AIS flight lines shows that the data contain considerable spectral variation but are also contaminated by second-order leakage of radiation from the near-infrared region. This makes the recognition of expected spectral absorption shapes very difficult. The effect appears worst in terrains containing considerable vegetation. Techniques that try to predict this supplementary radiation, coupled with the log residual analytical technique, show that expected mineral absorption spectra can be derived. The techniques suggest that, with additional refinement of the correction procedures, the Australian AIS data may be revised. Application of the log residual analysis method has proved very successful on the Cuprite, Nevada, data set, and for highlighting the alunite, kaolinite, and SiOH mineralogy.

  17. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace levels of steroid estrogens in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, the YES assay, and the E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its low detection limit, simplicity, rapidity, and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Real-Time Monitoring of Cellular Bioenergetics with a Multi-Analyte Screen-Printed Electrode

    PubMed Central

    McKenzie, Jennifer R.; Cognata, Andrew C.; Davis, Anna N.; Wikswo, John P.; Cliffel, David E.

    2016-01-01

    Real-time monitoring of changes to cellular bioenergetics can provide new insights into mechanisms of action for disease and toxicity. This work describes the development of a multi-analyte screen-printed electrode for the detection of analytes central to cellular bioenergetics: glucose, lactate, oxygen, and pH. Platinum screen-printed electrodes were designed in-house and printed by Pine Research Instrumentation. Electrochemical plating techniques were used to form quasi-reference and pH electrodes. A Dimatix materials inkjet printer was used to deposit enzyme and polymer films to form sensors for glucose, lactate, and oxygen. These sensors were evaluated in bulk solution and microfluidic environments, and found to behave reproducibly and possess a lifetime of up to six weeks. Linear ranges and limits of detection for enzyme-based sensors were found to have an inverse relationship with enzyme loading, and iridium oxide pH sensors were found to have super-Nernstian responses. Preliminary measurements where the sensor was enclosed within a microfluidic channel with RAW 264.7 macrophages were performed to demonstrate the sensors’ capabilities for performing real-time microphysiometry measurements. PMID:26125545

  19. Dyes assay for measuring physicochemical parameters.

    PubMed

    Moczko, Ewa; Meglinski, Igor V; Bessant, Conrad; Piletsky, Sergey A

    2009-03-15

    A combination of selective fluorescent dyes has been developed for simultaneous quantitative measurement of several physicochemical parameters. The operating principle of the assay is similar to electronic nose and tongue systems, which combine nonspecific or semispecific elements for the determination of diverse analytes with chemometric techniques for multivariate data analysis. The analytical capability of the proposed mixture arises from changes in fluorescence signal in response to changes in the environment such as pH, temperature, ionic strength, and the presence of oxygen. The signal is detected by a three-dimensional spectrofluorimeter, and the acquired data are processed using an artificial neural network (ANN) for multivariate calibration. The fluorescence spectrum of a solution of the selected dyes allows discrete reading of the emission maxima of all dyes composing the mixture. The variations in peak intensities caused by environmental changes provide distinctive fluorescence patterns which can be handled in the same way as the signals collected from nose/tongue electrochemical or piezoelectric devices. This optical system opens possibilities for rapid, inexpensive, real-time detection of a multitude of physicochemical parameters and analytes in complex samples.
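    The multivariate calibration step can be sketched with ordinary least squares standing in for the ANN; the dye sensitivity matrix, the parameter scaling, and the noise level below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Illustrative sensitivity matrix: rows = dye emission peaks, columns = the
    # physicochemical parameters (pH, temperature, ionic strength), all scaled.
    S = np.array([[ 1.0, -0.2,  0.1],
                  [ 0.3,  0.8, -0.4],
                  [-0.5,  0.1,  0.9],
                  [ 0.2,  0.6,  0.7]])

    # Training set: known parameter combinations and the fluorescence patterns
    # they produce, with a little measurement noise.
    params = rng.uniform(-1, 1, size=(200, 3))
    patterns = params @ S.T + rng.normal(0, 0.01, size=(200, 4))

    # Multivariate calibration: least squares maps patterns back to parameters
    # (the paper trains an ANN here, which also captures nonlinear responses).
    W, *_ = np.linalg.lstsq(patterns, params, rcond=None)

    # Predict the parameters behind a new, unseen fluorescence pattern.
    truth = np.array([0.5, -0.3, 0.8])
    pred = (truth @ S.T) @ W
    ```

    Because each dye responds to several parameters at once, no single peak is a sensor by itself; it is the joint inversion of the whole pattern that recovers pH, temperature, and ionic strength simultaneously.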

  20. Strategies for Distinguishing Abiotic Chemistry from Martian Biochemistry in Samples Returned from Mars

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Burton, A. S.; Callahan, M. P.; Elsila, J. E.; Stern, J. C.; Dworkin, J. P.

    2012-01-01

    A key goal in the search for evidence of extinct or extant life on Mars will be the identification of chemical biosignatures, including complex organic molecules common to all life on Earth. These include amino acids, the monomer building blocks of proteins and enzymes, and nucleobases, which serve as the structural basis of information storage in DNA and RNA. However, many of these organic compounds can also be formed abiotically, as demonstrated by their prevalence in carbonaceous meteorites [1]. Therefore, an important challenge in the search for evidence of life on Mars will be distinguishing abiotic chemistry of either meteoritic or martian origin from any chemical biosignatures of an extinct or extant martian biota. Although current robotic missions to Mars, including the 2011 Mars Science Laboratory (MSL) and the planned 2018 ExoMars rovers, will have the analytical capability needed to identify these key classes of organic molecules if present [2,3], return of a diverse suite of martian samples to Earth would allow for much more intensive laboratory studies using a broad array of extraction protocols and state-of-the-art analytical techniques for bulk and spatially resolved characterization, molecular detection, and isotopic and enantiomeric compositions that may be required for unambiguous confirmation of martian life. Here we will describe current state-of-the-art laboratory analytical techniques that have been used to characterize the abundance and distribution of amino acids and nucleobases in meteorites, Apollo samples, and comet-exposed materials returned by the Stardust mission, with an emphasis on molecular characteristics that can be used to distinguish abiotic chemistry from biochemistry as we know it.
The study of organic compounds in carbonaceous meteorites is highly relevant to Mars sample return analysis, since exogenous organic matter should have accumulated in the martian regolith over the last several billion years and the analytical techniques previously developed for the study of extraterrestrial materials can be applied to martian samples.

  1. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
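
    The Monte Carlo simulation technique mentioned above can be sketched briefly (a hypothetical illustration, not the report's methodology; the function name and parameters are assumptions): alternate exponentially distributed up-times (mean MTBF) and repair times (mean MTTR) over a mission, and average the fraction of mission time the system was up. The estimate should approach the familiar steady-state availability MTBF / (MTBF + MTTR).

    ```python
    import random

    def simulate_availability(mtbf, mttr, mission, trials=2000, seed=1):
        """Monte Carlo estimate of operational availability: alternate
        exponentially distributed up-times (mean mtbf) and repair times
        (mean mttr) over a mission, averaging the fraction of time up."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(trials):
            t = up = 0.0
            while t < mission:
                dt = rng.expovariate(1.0 / mtbf)      # time to next failure
                up += min(dt, mission - t)
                t += dt
                if t < mission:
                    t += rng.expovariate(1.0 / mttr)  # repair (down) time
            total += up / mission
        return total / trials

    # Long missions converge toward the steady-state value MTBF / (MTBF + MTTR).
    a_hat = simulate_availability(100.0, 10.0, 10_000.0)
    ```

    The same skeleton extends to launch-on-time studies: replace the mission-long average with an indicator of whether the system is up inside a launch window.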

  2. Visual analysis of large heterogeneous social networks by semantic and structural abstraction.

    PubMed

    Shen, Zeqian; Ma, Kwan-Liu; Eliassi-Rad, Tina

    2006-01-01

    Social network analysis is an active area of study beyond sociology. It uncovers the invisible relationships between actors in a network and provides understanding of social processes and behaviors. It has become an important technique in a variety of application areas such as the Web, organizational studies, and homeland security. This paper presents a visual analytics tool, OntoVis, for understanding large, heterogeneous social networks, in which nodes and links could represent different concepts and relations, respectively. These concepts and relations are related through an ontology (also known as a schema). OntoVis is named such because it uses information in the ontology associated with a social network to semantically prune a large, heterogeneous network. In addition to semantic abstraction, OntoVis also allows users to do structural abstraction and importance filtering to make large networks manageable and to facilitate analytic reasoning. All these unique capabilities of OntoVis are illustrated with several case studies.

  3. Closed Loop Requirements and Analysis Management

    NASA Technical Reports Server (NTRS)

    Lamoreaux, Michael; Verhoef, Brett

    2015-01-01

    Effective systems engineering involves the use of analysis in the derivation of requirements and verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during the change evaluation processes, and analyses are leveraged during the design verification process. Recommendations on concept validation case studies are also discussed.

  4. Riemann-Hilbert technique scattering analysis of metamaterial-based asymmetric 2D open resonators

    NASA Astrophysics Data System (ADS)

    Kamiński, Piotr M.; Ziolkowski, Richard W.; Arslanagić, Samel

    2017-12-01

    The scattering properties of metamaterial-based asymmetric two-dimensional open resonators excited by an electric line source are investigated analytically. The resonators are, in general, composed of two infinite and concentric cylindrical layers covered with an infinitely thin, perfect conducting shell that has an infinite axial aperture. The line source is oriented parallel to the cylinder axis. An exact analytical solution of this problem is derived. It is based on the dual-series approach and its transformation to the equivalent Riemann-Hilbert problem. Asymmetric metamaterial-based configurations are found to lead simultaneously to large enhancements of the radiated power and to highly steerable Huygens-like directivity patterns; properties not attainable with the corresponding structurally symmetric resonators. The presented open resonator designs are thus interesting candidates for many scientific and engineering applications where enhanced directional near- and far-field responses, tailored with beam shaping and steering capabilities, are highly desired.

  5. Development and optimization of an energy-regenerative suspension system under stochastic road excitation

    NASA Astrophysics Data System (ADS)

    Huang, Bo; Hsieh, Chen-Yu; Golnaraghi, Farid; Moallem, Mehrdad

    2015-11-01

    In this paper a vehicle suspension system with energy harvesting capability is developed, and an analytical methodology for the optimal design of the system is proposed. The optimization technique provides design guidelines for determining the stiffness and damping coefficients aimed at optimal performance in terms of ride comfort and energy regeneration. The corresponding performance metrics are selected as the root-mean-square (RMS) of the sprung mass acceleration and the expectation of generated power. Actual road roughness is considered as the stochastic excitation, defined by the ISO 8608:1995 standard road profiles, and used in deriving the optimization method. An electronic circuit is proposed to provide variable damping in real time based on the optimization rule. A test-bed is utilized, and experiments under different driving conditions are conducted to verify the effectiveness of the proposed method. The test results suggest that the analytical approach is credible in determining the optimality of system performance.
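
    The ride-comfort metric named above, the RMS of sprung-mass acceleration, is straightforward to compute from sampled data; a minimal sketch (an illustration, not the paper's code):

    ```python
    import math

    def rms(samples):
        """Root-mean-square of a sequence of acceleration samples (m/s^2)."""
        return math.sqrt(sum(a * a for a in samples) / len(samples))

    # A full period of a unit-amplitude sinusoid has RMS 1/sqrt(2).
    sine = [math.sin(2 * math.pi * k / 1000) for k in range(1000)]
    ```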

  6. High throughput-screening of animal urine samples: It is fast but is it also reliable?

    PubMed

    Kaufmann, Anton

    2016-05-01

    Advanced analytical technologies like ultra-high-performance liquid chromatography coupled to high resolution mass spectrometry can be used for veterinary drug screening of animal urine. The technique is sufficiently robust and reliable to detect veterinary drugs in urine samples of animals where the maximum residue limit of these compounds in organs like muscle, kidney, or liver has been exceeded. The limitations and possibilities of the technique are discussed. The most critical point is the variability of the drug concentration ratio between the tissue and urine. Ways to manage false positives and false negatives are discussed. The capability to confirm findings and the possibility of semi-targeted analysis are also addressed. Copyright © 2016 John Wiley & Sons, Ltd.

  7. Performance Analysis of Garbage Collection and Dynamic Reordering in a Lisp System. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Llames, Rene Lim

    1991-01-01

    Generation-based garbage collection and dynamic reordering of objects are two techniques for improving the efficiency of memory management in Lisp and similar dynamic language systems. An analysis of the effect of generation configuration is presented, focusing on the number of generations and their capacities. Analytic timing and survival models are used to represent garbage collection runtime and to derive structural results on its behavior. The survival model provides bounds on the age of objects surviving a garbage collection at a particular level. Empirical results show that execution time is most sensitive to the capacity of the youngest generation. A technique called scanning for transport statistics, for evaluating the effectiveness of reordering independent of main memory size, is presented.

  8. Nondestructive testing of Scout rocket motors

    NASA Technical Reports Server (NTRS)

    Oaks, A. E.

    1972-01-01

    The nondestructive tests applied to Scout rocket motors were reviewed and appraised. Analytical techniques were developed to evaluate the capabilities of the radiographic and ultrasonic procedures used. Major problem areas found were the inadequacy of high-voltage radiography for detecting unbonds and narrow propellant cracks, the inability to relate the ultrasonic signals received from flat-bottomed holes in standards to those received from real defects, and, more generally, the specification of acceptance criteria and how these were to be met. To counter the deficiencies noted, analyses were conducted of the potential utility of radiometric, acoustic, holographic, and thermographic techniques for motor and nozzle bond inspection, of a new approach to qualifying magnetic particle inspection, and of the application of acoustic emission analysis to the evaluation of proof and leak test data.

  9. A Bloom Filter-Powered Technique Supporting Scalable Semantic Discovery in Data Service Networks

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Shi, R.; Bao, Q.; Lee, T. J.; Ramachandran, R.

    2016-12-01

    More and more Earth data analytics software products are published onto the Internet as a service, in the format of either heavyweight WSDL services or lightweight RESTful APIs. Such reusable data analytics services form a data service network, which allows Earth scientists to compose (mash up) services into value-added ones. Therefore, it is important to have a technique capable of helping Earth scientists quickly identify appropriate candidate datasets and services in the global data service network. Most existing service discovery techniques, however, rely mainly on syntax- or semantics-based matchmaking between service requests and available services. Since the scale of the data service network is increasing rapidly, the run-time computational cost will soon become a bottleneck. To address this issue, this project applies a network routing mechanism to facilitate data service discovery in a service network, featuring scalability and performance. Earth data services are automatically annotated in Web Ontology Language for Services (OWL-S) based on their metadata, semantic information, and usage history. A Deterministic Annealing (DA) technique is applied to dynamically organize annotated data services into a hierarchical network, where virtual routers are created to represent semantic local networks featuring leading terms. Afterwards, Bloom filters are generated over the virtual routers. A data service search request is transformed into a network routing problem in order to quickly locate candidate services through the network hierarchy. A neural network-powered technique is applied to assure network address encoding and routing performance. A series of empirical studies has been conducted to evaluate the applicability and effectiveness of the proposed approach.
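
    The core property a Bloom filter contributes to such routing is set membership with no false negatives and a tunable false-positive rate, so a query descends only into subtrees whose filter matches. A minimal sketch (an illustrative toy, not the project's implementation; the service terms are hypothetical):

    ```python
    import hashlib

    class BloomFilter:
        """Minimal Bloom filter: membership tests never miss an added item,
        at the cost of a small, tunable false-positive probability."""

        def __init__(self, size_bits=1024, num_hashes=4):
            self.size = size_bits
            self.num_hashes = num_hashes
            self.bits = 0  # an integer used as a bit array

        def _positions(self, item):
            # Derive num_hashes independent bit positions from SHA-256.
            for i in range(self.num_hashes):
                digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
                yield int.from_bytes(digest[:8], "big") % self.size

        def add(self, item):
            for p in self._positions(item):
                self.bits |= 1 << p

        def might_contain(self, item):
            return all(self.bits & (1 << p) for p in self._positions(item))

    # A virtual router could summarize the service terms of its subtree
    # in one small filter shared up the hierarchy.
    bf = BloomFilter()
    for term in ["precipitation", "aerosol", "sea-surface-temperature"]:
        bf.add(term)
    ```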

  10. Compact and cost effective instrument for detecting drug precursors in different environments based on fluorescence polarization

    NASA Astrophysics Data System (ADS)

    Antolín-Urbaneja, J. C.; Eguizabal, I.; Briz, N.; Dominguez, A.; Estensoro, P.; Secchi, A.; Varriale, A.; Di Giovanni, S.; D'Auria, S.

    2013-05-01

    Several techniques for detecting chemical drug precursors have been developed in the last decade. Most of them are able to identify molecules at very low concentrations under lab conditions. Other commercial devices are able to detect only a fixed number and type of target substances based on a single detection technique, providing no flexibility with respect to target compounds. The construction of compact and easy-to-use detection systems that screen for a large number of compounds and discriminate among them with a low false alarm rate and a high probability of detection is still an open concern. Under the CUSTOM project, funded by the European Commission within the FP7, a stand-alone portable sensing device based on multiple techniques is being developed. One of these techniques is based on LED-induced fluorescence polarization to detect Ephedrine and Benzyl Methyl Ketone (BMK) as a first approach. This technique is highly selective with respect to the target compounds due to the generation of properly engineered fluorescent proteins which are able to bind the target analytes, as happens in an "immune-type reaction". This paper deals with the advances in the design, construction, and validation of the LED-induced fluorescence sensor to detect BMK analytes. This sensor includes an analysis module based on a high-performance LED and PMT detector, a fluidic system to dose suitable quantities of reagents, and several printed circuit boards, all of them fixed in a small structure (167 mm × 193 mm × 228 mm) with the capability of working as a stand-alone application.

  11. CMOS Time-Resolved, Contact, and Multispectral Fluorescence Imaging for DNA Molecular Diagnostics

    PubMed Central

    Guo, Nan; Cheung, Ka Wai; Wong, Hiu Tung; Ho, Derek

    2014-01-01

    Instrumental limitations such as bulkiness and high cost prevent the fluorescence technique from becoming ubiquitous for point-of-care deoxyribonucleic acid (DNA) detection and other in-field molecular diagnostics applications. The complementary metal-oxide-semiconductor (CMOS) technology, benefiting from process scaling, provides several advanced capabilities such as high integration density, high-resolution signal processing, and low power consumption, enabling sensitive, integrated, and low-cost fluorescence analytical platforms. In this paper, CMOS time-resolved, contact, and multispectral imaging are reviewed. Recently reported CMOS fluorescence analysis microsystem prototypes are surveyed to highlight the present state of the art. PMID:25365460

  12. What Can You Do with a Returned Sample of Martian Dust?

    NASA Technical Reports Server (NTRS)

    Zolensky, Michael E.; Nakamura-Messenger, K.

    2007-01-01

    A major issue that we managed to successfully address for the Stardust Mission was the magnitude and manner of preliminary examination (PET) of the returned samples, which totaled much less than 1 mg. Not since Apollo and Luna days had anyone faced this issue, and the lessons of Apollo PET were not extremely useful because of the very different sample masses in this case, and the incredible advances in analytical capabilities since the 1960s. This paper reviews some of the techniques for examination of small very rare samples that would be returned from Mars missions.

  13. In situ nuclear magnetic resonance microimaging of live biofilms in a microchannel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Renslow, R. S.; Marshall, M. J.; Tucker, A. E.

    Nuclear magnetic resonance (NMR) microimaging and spectroscopy were used to interrogate fluids of biological importance (e.g., water, buffer, medium solution) and live biofilms in a microchannel compatible with analyses at ambient pressure and under vacuum. Studies using buffer, growth medium, and actively growing Shewanella oneidensis biofilms demonstrated in situ NMR microimaging measurement capabilities including velocity mapping, diffusion coefficient mapping, relaxometry, localized spectroscopy, and 2D and 3D imaging within a microchannel suitable for different analytical platforms. This technique is promising for diverse applications of correlative imaging using a portable microfluidic platform.

  14. A simulation technique for predicting thickness of thermal sprayed coatings

    NASA Technical Reports Server (NTRS)

    Goedjen, John G.; Miller, Robert A.; Brindley, William J.; Leissler, George W.

    1995-01-01

    The complexity of many of the components being coated today using the thermal spray process makes the trial and error approach traditionally followed in depositing a uniform coating inadequate, thereby necessitating a more analytical approach to developing robotic trajectories. A two dimensional finite difference simulation model has been developed to predict the thickness of coatings deposited using the thermal spray process. The model couples robotic and component trajectories and thermal spraying parameters to predict coating thickness. Simulations and experimental verification were performed on a rotating disk to evaluate the predictive capabilities of the approach.

  15. Carbon dioxide sensor. [partial pressure measurement using monochromators]

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Analytical techniques for measuring CO2 were evaluated and rated for use with the advanced extravehicular mobility unit. An infrared absorption concept using a dual-wavelength monochromator was selected for investigation. A breadboard carbon dioxide sensor (CDS) was assembled and tested. The CDS performance showed the capability of measuring CO2 partial pressure over the range of 0 to 4.0 kPa (0 to 30 mmHg). The volume and weight of a flight-configured CDS should be acceptable. It is recommended that development continue to complete the design of a flight prototype.

  16. Trace metal mapping by laser-induced breakdown spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaiser, Jozef; Novotny, Dr. Karel; Hrdlicka, A

    2012-01-01

    Laser-Induced Breakdown Spectroscopy (LIBS) is a sensitive optical technique capable of fast multi-elemental analysis of solid, gaseous, and liquid samples. The potential applications of lasers for spectrochemical analysis were developed shortly after the laser's invention; however, the massive development of LIBS is connected with the availability of powerful pulsed laser sources. Since the late 1980s, LIBS has been prominent in analytical atomic spectroscopy, and its applications continue to be developed. Here we review the utilization of LIBS for trace-element mapping in different matrices. The main emphasis is on trace metal mapping in biological samples.

  17. A Comparative Study of Single-pulse and Double-pulse Laser-Induced Breakdown Spectroscopy with Uranium-containing Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skrodzki, P. J.; Becker, J. R.; Diwakar, P. K.

    Laser-induced breakdown spectroscopy (LIBS) holds potential advantages in special nuclear material (SNM) sensing and nuclear forensics, which require rapid analysis, minimal sample preparation, and stand-off distance capability. SNM such as U, however, produce crowded emission spectra with LIBS, and characteristic emission lines are challenging to discern. It is well known that double-pulse LIBS (DPLIBS) improves the signal intensity for analytes over conventional single-pulse LIBS (SPLIBS). This study investigates the U signal in a glass matrix using DPLIBS and compares it to signal features obtained using SPLIBS. DPLIBS involves sequential firing of a 1.06 µm Nd:YAG pre-pulse and a 10.6 µm TEA CO2 heating pulse in near-collinear geometry. Optimization of experimental parameters, including inter-pulse delay and energy, follows identification of characteristic lines and signals for the bulk analyte Ca and the minor constituent analyte U for both DPLIBS and SPLIBS. Spatial and temporal coupling of the two pulses in the proposed DPLIBS technique yields improvements in analytical merits with negligible further damage to the sample compared to SPLIBS. Subsequently, the study discusses optimum plasma emission conditions for U lines and relative figures of merit in both SPLIBS and DPLIBS. Investigation into plasma characteristics also addresses plausible mechanisms related to the observed U analyte signal variation between SPLIBS and DPLIBS.

  18. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

    NASIC's Investment in Analytical Capabilities ... Study Limitations ... This project is envisioned as a foundation for future work by NASIC analysts. They will use the tools identified in this study to ... Though this study took all three categories into account, most (90%) of the focus for the SRA team's effort was on identifying and analyzing ...

  19. Test and Analysis Capabilities of the Space Environment Effects Team at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Finckenor, M. M.; Edwards, D. L.; Vaughn, J. A.; Schneider, T. A.; Hovater, M. A.; Hoppe, D. T.

    2002-01-01

    Marshall Space Flight Center has developed world-class space environmental effects testing facilities to simulate the space environment. The combined environmental effects test system exposes temperature-controlled samples to simultaneous protons, high- and low-energy electrons, vacuum ultraviolet (VUV) radiation, and near-ultraviolet (NUV) radiation. Separate chambers for studying the effects of NUV and VUV at elevated temperatures are also available. The Atomic Oxygen Beam Facility exposes samples to atomic oxygen of 5 eV energy to simulate low-Earth orbit (LEO). The LEO space plasma simulators are used to study current collection to biased spacecraft surfaces, arcing from insulators and electrical conductivity of materials. Plasma propulsion techniques are analyzed using the Marshall magnetic mirror system. The micro light gas gun simulates micrometeoroid and space debris impacts. Candidate materials and hardware for spacecraft can be evaluated for durability in the space environment with a variety of analytical techniques. Mass, solar absorptance, infrared emittance, transmission, reflectance, bidirectional reflectance distribution function, and surface morphology characterization can be performed. The data from the space environmental effects testing facilities, combined with analytical results from flight experiments, enable the Environmental Effects Group to determine optimum materials for use on spacecraft.

  20. Some aspects of analytical chemistry as applied to water quality assurance techniques for reclaimed water: The potential use of X-ray fluorescence spectrometry for automated on-line fast real-time simultaneous multi-component analysis of inorganic pollutants in reclaimed water

    NASA Technical Reports Server (NTRS)

    Ling, A. C.; Macpherson, L. H.; Rey, M.

    1981-01-01

    The potential use of isotopically excited energy-dispersive X-ray fluorescence (XRF) spectrometry for automated on-line, fast, real-time (5 to 15 minutes), simultaneous multicomponent (up to 20) trace (1 to 10 parts per billion) analysis of inorganic pollutants in reclaimed water was examined. Three anionic elements (chromium(VI), arsenic, and selenium) were studied. The inherent lack of sensitivity of XRF spectrometry for these elements mandates the use of a preconcentration technique, and various methods were examined, including several direct and indirect evaporation methods, ion exchange membranes, selective and nonselective precipitation, and complexation processes. It is shown that XRF spectrometry itself is well suited for automated on-line quality assurance and can provide a nondestructive (and thus sample storage and repeat-analysis capabilities) and particularly convenient analytical method. Further, the use of an isotopically excited energy-dispersive unit (50 mCi Cd-109 source) coupled with a suitable preconcentration process can provide sufficient sensitivity to achieve the currently mandated minimum levels of detection without the need for high-power X-ray generating tubes.

  1. A graphical approach to radio frequency quadrupole design

    NASA Astrophysics Data System (ADS)

    Turemen, G.; Unel, G.; Yasatekin, B.

    2015-07-01

    The design of a radio frequency quadrupole, an important section of all ion accelerators, and the calculation of its beam dynamics properties can be achieved using existing computational tools. These programs, originally designed in the 1980s, show effects of aging in their user interfaces and in their output. The authors believe there is room for improvement both in design techniques, using a graphical approach, and in the amount of analytical calculation performed before resorting to CPU-burning finite element analysis techniques. Additionally, an emphasis on the graphical method of controlling the evolution of the relevant parameters using the drag-to-change paradigm is bound to benefit the designer. A computer code, named DEMIRCI, has been written in C++ to demonstrate these ideas. This tool has been used in the design of the Turkish Atomic Energy Authority (TAEK)'s 1.5 MeV proton beamline at the Saraykoy Nuclear Research and Training Center (SANAEM). DEMIRCI starts with a simple analytical model, calculates the RFQ behavior, and produces 3D design files that can be fed to a milling machine. The paper discusses the experience gained during the design process of the SANAEM Project Prometheus (SPP) RFQ and underlines some of DEMIRCI's capabilities.

  2. High Technology Service Value Maximization through an MCDM-Based Innovative e-Business Model

    NASA Astrophysics Data System (ADS)

    Huang, Chi-Yo; Tzeng, Gwo-Hshiung; Ho, Wen-Rong; Chuang, Hsiu-Tyan; Lue, Yeou-Feng

    The emergence of the Internet has changed high technology marketing channels thoroughly in the past decade, and e-commerce has already become one of the most efficient channels, through which high technology firms may skip the intermediaries and reach end customers directly. However, defining appropriate e-business models for commercializing new high technology products or services through the Internet is not easy. To overcome the above-mentioned problems, a novel analytic framework, based on the concept of expanding high technology customers' competence sets by leveraging high technology service firms' capabilities and resources as well as novel multiple criteria decision making (MCDM) techniques, is proposed in order to define an appropriate e-business model. An empirical case study of a silicon intellectual property (SIP) commercialization e-business model based on MCDM techniques is provided to verify the effectiveness of this analytic framework. The analysis successfully assisted a Taiwanese IC design service firm in defining an e-business model for maximizing its customers' SIP transactions. In the future, the novel MCDM framework can be applied successfully to new business model definitions in the high technology industry.

  3. Mass spectrometric based approaches in urine metabolomics and biomarker discovery.

    PubMed

    Khamis, Mona M; Adamko, Darryl J; El-Aneed, Anas

    2017-03-01

    Urine metabolomics has recently emerged as a prominent field for the discovery of non-invasive biomarkers that can detect subtle metabolic discrepancies in response to a specific disease or therapeutic intervention. Urine, compared to other biofluids, is characterized by its ease of collection, its richness in metabolites, and its ability to reflect imbalances of all biochemical pathways within the body. Following urine collection for metabolomic analysis, samples must be immediately frozen to quench any biogenic and/or non-biogenic chemical reactions. Depending on the aim of the experiment, sample preparation can vary from simple procedures such as filtration to more specific extraction protocols such as liquid-liquid extraction. Due to the lack of comprehensive studies on urine metabolome stability, higher storage temperatures (e.g., 4 °C) and repetitive freeze-thaw cycles should be avoided. To date, among all analytical techniques, mass spectrometry (MS) provides the best sensitivity, selectivity, and identification capabilities for analyzing the majority of the metabolite composition of urine. Combined with the qualitative and quantitative capabilities of MS, and owing to continuous improvements in its related technologies (i.e. ultra-high-performance liquid chromatography [UPLC] and hydrophilic interaction liquid chromatography [HILIC]), liquid chromatography (LC)-MS is unequivocally the most utilized and most informative analytical tool employed in urine metabolomics. Furthermore, differential isotope tagging techniques have provided a solution to ion suppression from the urine matrix, thus allowing for quantitative analysis. In addition to LC-MS, other MS-based technologies have been utilized in urine metabolomics. These include direct injection (infusion)-MS, capillary electrophoresis-MS, and gas chromatography-MS.
In this article, the current progress of different MS-based techniques in exploring the urine metabolome, as well as recent findings in providing potentially diagnostic urinary biomarkers, is discussed. © 2015 Wiley Periodicals, Inc. Mass Spec Rev 36:115-134, 2017.

  4. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.

  5. Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem

    NASA Astrophysics Data System (ADS)

    Doyle, R. J.; Crichton, D.

    2017-12-01

    NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and ultimately gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle, to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrives at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep; in some cases, to close observational decision loops onboard, to enable attending to unexpected or transient phenomena. Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential. NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide a basis for a result: relevant data source and algorithms, uncertainty tracking, etc., to assure scientific integrity and to enable confident decision making. 
Advances in data science offer opportunities to gain new insights from space missions and their vast data collections. We are working to innovate new architectures, exploit emerging technologies, develop new data-driven methodologies, and transfer them across disciplines, while working across the dual dimensions of the data lifecycle and the data ecosystem.

  6. Computational split-field finite-difference time-domain evaluation of simplified tilt-angle models for parallel-aligned liquid-crystal devices

    NASA Astrophysics Data System (ADS)

    Márquez, Andrés; Francés, Jorge; Martínez, Francisco J.; Gallego, Sergi; Álvarez, Mariela L.; Calzado, Eva M.; Pascual, Inmaculada; Beléndez, Augusto

    2018-03-01

    Simplified analytical models with predictive capability enable simpler and faster optimization of the performance in applications of complex photonic devices. We recently demonstrated the most simplified analytical model still showing predictive capability for parallel-aligned liquid crystal on silicon (PA-LCoS) devices, which provides the voltage-dependent retardance for a very wide range of incidence angles and any wavelength in the visible. We further show that the proposed model is not only phenomenological but also physically meaningful, since two of its parameters provide the correct values for important internal properties of these devices related to the birefringence, cell gap, and director profile. Therefore, the proposed model can be used as a means to inspect internal physical properties of the cell. As an innovation, we also show the applicability of the split-field finite-difference time-domain (SF-FDTD) technique for phase-shift and retardance evaluation of PA-LCoS devices under oblique incidence. As a simplified model for PA-LCoS devices, we also consider the exact description of homogeneous birefringent slabs. However, we show that, despite its higher degree of simplification, the proposed model is more robust, providing unambiguous and physically meaningful solutions when fitting its parameters.

  7. Fabric phase sorptive extraction: Two practical sample pretreatment techniques for brominated flame retardants in water.

    PubMed

    Huang, Guiqi; Dong, Sheying; Zhang, Mengfei; Zhang, Haihan; Huang, Tinglin

    2016-09-15

Sample pretreatment is a critical step in residue monitoring of hazardous pollutants. In this paper, using cellulose fabric as the host matrix, three extraction sorbents, poly(tetrahydrofuran) (PTHF), poly(ethylene glycol) (PEG) and poly(dimethyldiphenylsiloxane) (PDMDPS), were prepared on the surface of the cellulose fabric. Two practical extraction techniques, stir bar fabric phase sorptive extraction (stir bar-FPSE) and magnetic stir fabric phase sorptive extraction (magnetic stir-FPSE), were designed; both allow stirring of the fabric phase sorbent throughout the extraction process. Three brominated flame retardants (BFRs) [tetrabromobisphenol A (TBBPA), tetrabromobisphenol A bisallylether (TBBPA-BAE), tetrabromobisphenol A bis(2,3-dibromopropyl)ether (TBBPA-BDBPE)] in water were selected as model analytes for the practical evaluation of the two proposed techniques using high-performance liquid chromatography (HPLC). Various experimental conditions affecting the extraction process, such as the type of fabric phase, extraction time, amount of salt and elution conditions, were also investigated. Owing to their large sorbent loading capacity and unique stirring performance, both techniques showed high extraction capability and fast extraction equilibrium. Under the optimized conditions, high recoveries (90-99%) and low limits of detection (LODs) (0.01-0.05 μg L⁻¹) were achieved. In addition, good reproducibility was obtained, with intraday and interday relative standard deviations (RSDs) of less than 5.1% and 6.8%, respectively. The results indicated that the two pretreatment techniques are promising and practical for monitoring hazardous pollutants in water. Owing to their low solvent consumption and reusability, the proposed techniques also meet green analytical criteria. Copyright © 2016 Elsevier Ltd. All rights reserved.
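The figures of merit quoted in this abstract (recovery, LOD, RSD) follow standard definitions; a minimal sketch with illustrative numbers rather than the paper's raw data, assuming the common 3σ-of-blank criterion for the detection limit:

```python
import statistics

def recovery_percent(measured, spiked):
    """Percent recovery of a spiked analyte concentration."""
    return 100.0 * measured / spiked

def lod_3sigma(blank_signals, slope):
    """LOD as 3 x (s.d. of blank replicates) / calibration slope."""
    return 3.0 * statistics.stdev(blank_signals) / slope

def rsd_percent(replicates):
    """Relative standard deviation of replicate measurements."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Illustrative numbers only (not the paper's data):
print(recovery_percent(0.95, 1.0))          # spiked 1.0, found 0.95 -> 95 %
print(lod_3sigma([1.0, 1.1, 0.9], 2.0))     # blank signals and slope
print(rsd_percent([9.8, 10.1, 10.0, 9.9]))  # intraday replicates
```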

  8. Application of Interface Technology in Progressive Failure Analysis of Composite Panels

    NASA Technical Reports Server (NTRS)

    Sleight, D. W.; Lotts, C. G.

    2002-01-01

    A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.

  9. Micromechanics Analysis Code Post-Processing (MACPOST) User Guide. 1.0

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Comiskey, Michele D.; Bednarcyk, Brett A.

    1999-01-01

As advanced composite materials have gained wider usage, the need for analytical models and computer codes to predict the thermomechanical deformation response of these materials has increased significantly. Recently, a micromechanics technique called the generalized method of cells (GMC) has been developed, which has the capability to fulfill this goal. To provide a framework for GMC, the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) has been developed. As MAC/GMC has been updated, significant improvements have been made to the post-processing capabilities of the code. Through the MACPOST program, which operates directly within the MSC/PATRAN graphical pre- and post-processing package, a direct link between the analysis capabilities of MAC/GMC and the post-processing capabilities of MSC/PATRAN has been established. MACPOST has simplified the production, printing, and exportation of results for unit cells analyzed by MAC/GMC. MACPOST allows different micro-level quantities to be plotted quickly and easily in contour plots. In addition, meaningful data for X-Y plots can be examined. MACPOST thus serves as an important analysis and visualization tool for the macro- and micro-level data generated by MAC/GMC. This report serves as the user's manual for the MACPOST program.

  10. Inorganic trace analysis by mass spectrometry

    NASA Astrophysics Data System (ADS)

    Becker, Johanna Sabine; Dietze, Hans-Joachim

    1998-10-01

Mass spectrometric methods for the trace analysis of inorganic materials, with their ability to provide very sensitive multielemental analysis, have been established for the determination of trace and ultratrace elements in high-purity materials (metals, semiconductors and insulators), in different technical samples (e.g. alloys, pure chemicals, ceramics, thin films, ion-implanted semiconductors), in environmental samples (waters, soils, biological and medical materials) and in geological samples. Whereas such techniques as spark source mass spectrometry (SSMS), laser ionization mass spectrometry (LIMS), laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), glow discharge mass spectrometry (GDMS), secondary ion mass spectrometry (SIMS) and inductively coupled plasma mass spectrometry (ICP-MS) have multielemental capability, other methods such as thermal ionization mass spectrometry (TIMS), accelerator mass spectrometry (AMS) and resonance ionization mass spectrometry (RIMS) have been used for sensitive mono- or oligoelemental ultratrace analysis (and precise determination of isotopic ratios) in solid samples. The limits of detection for chemical elements using these mass spectrometric techniques are in the low ng g⁻¹ concentration range. The quantification of the analytical results of mass spectrometric methods is sometimes difficult owing to a lack of matrix-matched multielement standard reference materials (SRMs) for many solid samples. Therefore, owing to the simple quantification procedure for aqueous solutions, inductively coupled plasma mass spectrometry (ICP-MS) is being increasingly used for the characterization of solid samples after sample dissolution. ICP-MS is often combined with special sample introduction equipment (e.g. flow injection, hydride generation, high-performance liquid chromatography (HPLC) or electrothermal vaporization), or an off-line matrix separation and enrichment of trace impurities (especially for the characterization of high-purity materials and environmental samples) is used, in order to improve the detection limits for trace elements. Furthermore, the determination of chemical elements in the trace and ultratrace concentration range is often difficult and can be disturbed by mass interferences between analyte ions and molecular ions at the same nominal mass. By applying double-focusing sector field mass spectrometry at the required mass resolution, so that molecular ions are mass-spectrometrically separated from the analyte ions, it is often possible to overcome these interference problems. Commercial instrumental equipment, the capabilities (detection limits, accuracy, precision) and the analytical application fields of mass spectrometric methods for the determination of trace and ultratrace elements and for surface analysis are discussed.
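The molecular-ion interference problem mentioned above can be made concrete with the textbook ICP-MS case of the ⁷⁵As⁺ analyte overlapped by the ⁴⁰Ar³⁵Cl⁺ molecular ion; a minimal sketch (isotope masses are standard tabulated values, rounded here for illustration) computes the sector-field resolution R = m/Δm needed to separate the pair:

```python
# Required mass resolution R = m / delta_m to separate an analyte ion
# from an isobaric molecular-ion interference (classic ICP-MS example).
# Isotope masses in u, rounded from standard tables.
M_AS75 = 74.9216                  # 75As+ analyte
M_AR40 = 39.9624                  # 40Ar
M_CL35 = 34.9689                  # 35Cl
m_interference = M_AR40 + M_CL35  # 40Ar35Cl+ molecular ion
delta_m = abs(m_interference - M_AS75)
required_resolution = M_AS75 / delta_m
print(f"R = {required_resolution:.0f}")  # on the order of 7000-8000
```

A quadrupole instrument (unit resolution) cannot separate this pair, which is why double-focusing sector-field instruments are used for such determinations.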

  11. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four-month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.

  12. The NASA Reanalysis Ensemble Service - Advanced Capabilities for Integrated Reanalysis Access and Intercomparison

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2017-12-01

NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) a full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations. The operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables. This near real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) a WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following: - A new API that supports full temporal, spatial, and grid-based resolution services with sample queries - A Docker-ready RES application to deploy across platforms - Extended capabilities that enable single- and multiple-reanalysis area averages, vertical averages, re-gridding, standard deviations, and ensemble averages - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly) - Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55 and NOAA/ESRL 20CR… - A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management - Supporting analytic services for NASA GMAO Forward Processing datasets - Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization) - The ability to compute and visualize multiple reanalyses for ease of intercomparison - Automated tools to retrieve and prepare data collections for analytic processing
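Several of the arithmetic operations listed (area average, anomaly) amount to simple weighted reductions over gridded monthly-mean fields. A minimal NumPy sketch with synthetic data and hypothetical array shapes, not the actual CDSlib API, illustrates a cosine-latitude area average and a monthly-climatology anomaly:

```python
import numpy as np

# Hypothetical reanalysis field: (time, lat, lon) monthly means, 2 years.
rng = np.random.default_rng(0)
field = rng.normal(288.0, 5.0, size=(24, 18, 36))
lats = np.linspace(-85.0, 85.0, 18)

# Cosine-of-latitude weights approximate grid-cell area on a regular grid.
w = np.cos(np.deg2rad(lats))
area_avg = np.average(field.mean(axis=2), axis=1, weights=w)  # -> (time,)

# Anomaly: departure from the mean annual cycle (monthly climatology).
climatology = area_avg.reshape(-1, 12).mean(axis=0)           # -> (12,)
anomaly = area_avg - np.tile(climatology, area_avg.size // 12)
print(anomaly.shape)  # one anomaly value per month of the series
```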

  13. Critical review of dog detection and the influences of physiology, training, and analytical methodologies.

    PubMed

    Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M

    2018-08-01

Detection dogs serve a plethora of roles within modern society and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs have involved ambiguity. This is partially because the assessment of the effectiveness of detection dogs remains entrenched within a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs also depend on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies provide both an alternative to and an assistor for the detection dog industry; however, the interrelationship between these two detection paradigms requires clarification. These factors, when their relative contributions are considered, illustrate a need to address research gaps, formalise the detection dog industry and research process, and take into consideration analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of the factors involved in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Method for characterization of low molecular weight organic acids in atmospheric aerosols using ion chromatography mass spectrometry.

    PubMed

    Brent, Lacey C; Reiner, Jessica L; Dickerson, Russell R; Sander, Lane C

    2014-08-05

The composition of PM2.5 monitored in the atmosphere is usually apportioned through the analysis of organic carbon, black (also called elemental) carbon, and inorganic salts. Characterizing the chemical composition of aerosols represents a significant challenge to analysts, and studies are frequently limited to the determination of aerosol bulk properties. To better understand the potential health effects and combined interactions of components in aerosols, a variety of measurement techniques for individual analytes in PM2.5 need to be implemented. The method developed here for the measurement of organic acids achieves class separation of aliphatic monoacids, aliphatic diacids, aromatic acids, and polyacids. The selected ion monitoring capability of a triple quadrupole mass analyzer was frequently able to overcome instances of incomplete separation. Standard Reference Material (SRM) 1649b Urban Dust was characterized; 34 organic acids were qualitatively identified, and 6 organic acids were quantified.

  15. System and Method for Providing a Climate Data Analytic Services Application Programming Interface Distribution Package

    NASA Technical Reports Server (NTRS)

    Tamkin, Glenn S. (Inventor); Duffy, Daniel Q. (Inventor); Schnase, John L. (Inventor)

    2016-01-01

    A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.

  16. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  17. Raman spectroscopic analysis of geological and biogeological specimens of relevance to the ExoMars mission.

    PubMed

    Edwards, Howell G M; Hutchinson, Ian B; Ingley, Richard; Parnell, John; Vítek, Petr; Jehlička, Jan

    2013-06-01

    A novel miniaturized Raman spectrometer is scheduled to fly as part of the analytical instrumentation package on an ESA remote robotic lander in the ESA/Roscosmos ExoMars mission to search for evidence for extant or extinct life on Mars in 2018. The Raman spectrometer will be part of the first-pass analytical stage of the sampling procedure, following detailed surface examination by the PanCam scanning camera unit on the ExoMars rover vehicle. The requirements of the analytical protocol are stringent and critical; this study represents a laboratory blind interrogation of specimens that form a list of materials that are of relevance to martian exploration and at this stage simulates a test of current laboratory instrumentation to highlight the Raman technique strengths and possible weaknesses that may be encountered in practice on the martian surface and from which future studies could be formulated. In this preliminary exercise, some 10 samples that are considered terrestrial representatives of the mineralogy and possible biogeologically modified structures that may be identified on Mars have been examined with Raman spectroscopy, and conclusions have been drawn about the viability of the unambiguous spectral identification of biomolecular life signatures. It is concluded that the Raman spectroscopic technique does indeed demonstrate the capability to identify biomolecular signatures and the mineralogy in real-world terrestrial samples with a very high degree of success without any preconception being made about their origin and classification.

  18. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

Petroleum, and the organic matter from which petroleum is derived, are composed of organic compounds together with some trace elements. These compounds give insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements of petroleum exploration. The main tools for acquiring such geochemical data are analytical techniques. Owing to progress in the development of new analytical techniques, many hitherto unresolved petroleum exploration problems have been addressed. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. The various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the analytical techniques that have helped in understanding the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods, as applicable to petroleum exploration, and particularly to the Niger Delta petroleum province, are also discussed. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  19. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  20. Development of a Multiplexed Liquid Chromatography Multiple-Reaction-Monitoring Mass Spectrometry (LC-MRM/MS) Method for Evaluation of Salivary Proteins as Oral Cancer Biomarkers.

    PubMed

    Chen, Yi-Ting; Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting

    2017-05-01

    Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  1. Emission Computed Tomography: A New Technique for the Quantitative Physiologic Study of Brain and Heart in Vivo

    DOE R&D Accomplishments Database

    Phelps, M. E.; Hoffman, E. J.; Huang, S. C.; Schelbert, H. R.; Kuhl, D. E.

    1978-01-01

Emission computed tomography can provide a quantitative in vivo measurement of regional tissue radionuclide tracer concentrations. This facility, when combined with physiologic models and radioactively labeled physiologic tracers that behave in a predictable manner, allows measurement of a wide variety of physiologic variables. This integrated technique has been referred to as Physiologic Tomography (PT). PT requires labeled compounds which trace physiologic processes in a known and predictable manner, and physiologic models which are appropriately formulated and validated to derive physiologic variables from ECT data. To achieve this goal effectively, PT requires an ECT system that is capable of performing truly quantitative or analytical measurements of tissue tracer concentrations and that has been well characterized in terms of spatial resolution, sensitivity and signal-to-noise ratio in the tomographic image. This paper illustrates the capabilities of emission computed tomography and provides examples of physiologic tomography for the regional measurement of the cerebral and myocardial metabolic rate for glucose, regional measurement of cerebral blood volume, gated cardiac blood pools, and capillary perfusion in brain and heart. Studies on patients with stroke and myocardial ischemia are also presented.

  2. Efficient Power-Transfer Capability Analysis of the TET System Using the Equivalent Small Parameter Method.

    PubMed

    Yanzhen Wu; Hu, A P; Budgett, D; Malpas, S C; Dissanayake, T

    2011-06-01

Transcutaneous energy transfer (TET) enables the transfer of power across the skin without a direct electrical connection. It is a mechanism for powering implantable devices for the lifetime of a patient. For maximum power transfer, it is essential that TET systems be resonant on both the primary and secondary sides, which requires considerable design effort. Consequently, a strong need exists for an efficient method to aid the design process. This paper presents an analytical technique appropriate for analyzing complex TET systems. The system's steady-state solution, in closed form and with sufficient accuracy, is obtained by employing the proposed equivalent small parameter method. It is shown that power-transfer capability can be correctly predicted without tedious iterative simulations or practical measurements. Furthermore, for TET systems utilizing a current-fed push-pull soft-switching resonant converter, it is found that maximum energy transfer does not occur when the primary and secondary resonant tanks are "tuned" to the nominal resonant frequency. An optimal tuning point exists, corresponding to the system's maximum power-transfer capability when optimal tuning capacitors are applied.
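The tuning sensitivity this abstract points to can be sketched with the standard LC-tank resonance formula; the component values below are illustrative placeholders, not values from the paper, and the equivalent small parameter analysis itself is not reproduced:

```python
import math

def resonant_frequency(L, C):
    """Nominal resonant frequency of an LC tank: f0 = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Illustrative values for a TET-like primary tank (hypothetical):
L_primary = 10e-6      # 10 uH coil
C_nominal = 100e-9     # 100 nF tuning capacitor
f0 = resonant_frequency(L_primary, C_nominal)
print(f"f0 = {f0 / 1e3:.1f} kHz")

# A 10 % larger tuning capacitor lowers f0 by roughly 5 % (1/sqrt scaling),
# which is why capacitor selection shifts the operating point of the tank.
f_detuned = resonant_frequency(L_primary, 1.10 * C_nominal)
print(f"shift = {100.0 * (1.0 - f_detuned / f0):.1f} %")
```

The paper's point is that, for the converter topology studied, the power-transfer optimum does not coincide with this nominal f0, which the closed-form ESPM solution can predict without iterative simulation.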

  3. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  4. Shuttle TPS thermal performance and analysis methodology

    NASA Technical Reports Server (NTRS)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  5. Remote sensing for oceanography: Past, present, future

    NASA Technical Reports Server (NTRS)

    Mcgoldrick, L. F.

    1984-01-01

Oceanic dynamics has traditionally been investigated by sampling from instruments in situ, yielding quantitative measurements that are intermittent in both space and time; the ocean is undersampled. The need to obtain proper sampling of the averaged quantities treated in analytical and numerical models is at present the most significant limitation on advances in physical oceanography. Within the past decade, many electromagnetic techniques for the study of the Earth and planets have been applied to the study of the ocean. Now satellites promise nearly total coverage of the world's oceans using only a few days to a few weeks of observations. A review of early and present techniques applied to satellite oceanography is presented, together with a description of some future systems to be launched into orbit during the remainder of this century. Both scientific and technological capabilities are discussed.

  6. Electrochemical impedimetric sensor based on molecularly imprinted polymers/sol-gel chemistry for methidathion organophosphorous insecticide recognition.

    PubMed

    Bakas, Idriss; Hayat, Akhtar; Piletsky, Sergey; Piletska, Elena; Chehimi, Mohamed M; Noguer, Thierry; Rouillon, Régis

    2014-12-01

We report here a novel method to detect the organophosphorous insecticide methidathion. The sensing platform was constructed by combining molecularly imprinted polymers (MIPs) with a sol-gel technique on inexpensive, portable and disposable screen-printed carbon electrodes. An electrochemical impedimetric detection technique was employed to perform label-free detection of the target analyte on the designed MIP/sol-gel integrated platform. The selection of the target-specific monomer by electrochemical impedimetric methods was consistent with the results obtained by a computational modelling method. The prepared electrochemical MIP/sol-gel based sensor exhibited a high recognition capability toward methidathion, as well as a broad linear range and a low detection limit under the optimized conditions. Satisfactory results were also obtained for the determination of methidathion in wastewater samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. A review of microdialysis coupled to microchip electrophoresis for monitoring biological events

    PubMed Central

    Saylor, Rachel A.; Lunte, Susan M.

    2015-01-01

    Microdialysis is a powerful sampling technique that enables monitoring of dynamic processes in vitro and in vivo. The combination of microdialysis with chromatographic or electrophoretic separations and selective detection methods yields a “separation-based sensor” capable of monitoring multiple analytes in near real time. Analysis of microdialysis samples requires techniques that are fast (<1 min), have low volume requirements (nL–pL), and, ideally, can be employed on-line. Microchip electrophoresis fulfills these requirements and also permits the possibility of integrating sample preparation and manipulation with detection strategies directly on-chip. Microdialysis coupled to microchip electrophoresis has been employed for monitoring biological events in vivo and in vitro. This review discusses technical considerations for coupling microdialysis sampling and microchip electrophoresis, including various interface designs, and current applications in the field. PMID:25637011

  8. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  9. Bringing Business Intelligence to Health Information Technology Curriculum

    ERIC Educational Resources Information Center

    Zheng, Guangzhi; Zhang, Chi; Li, Lei

    2015-01-01

    Business intelligence (BI) and healthcare analytics are emerging technologies that provide the analytical capability to help the healthcare industry improve service quality, reduce costs, and manage risks. However, such a component on analytical healthcare data processing is largely missing from current healthcare information technology (HIT) or health…

  10. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  11. KEY COMPARISON: Final report on international key comparison CCQM-K53: Oxygen in nitrogen

    NASA Astrophysics Data System (ADS)

    Lee, Jeongsoon; Bok Lee, Jin; Moon, Dong Min; Seog Kim, Jin; van der Veen, Adriaan M. H.; Besley, Laurie; Heine, Hans-Joachim; Martin, Belén; Konopelko, L. A.; Kato, Kenji; Shimosaka, Takuya; Perez Castorena, Alejandro; Macé, Tatiana; Milton, Martin J. T.; Kelley, Mike; Guenther, Franklin; Botha, Angelique

    2010-01-01

    Gravimetry is used as the primary method for the preparation of primary standard gas mixtures in most national metrology institutes; it requires the combined abilities of purity assessment, weighing technique and analytical skill. At the CCQM GAWG meeting in October 2005, it was agreed that KRISS should coordinate a key comparison, CCQM-K53, on the gravimetric preparation of gas mixtures at a level of 100 µmol/mol of oxygen in nitrogen. KRISS compared the gravimetric value of each cylinder against an analytical instrument. Preparation of an oxygen gas standard mixture requires particular care to be accurate, because oxygen is a major component of the atmosphere. Key issues for this comparison relate to (1) the gravimetric technique, which needs at least two dilution steps, (2) oxygen impurity in nitrogen, and (3) argon impurity in nitrogen. The key comparison reference value (KCRV) is obtained from the linear regression line (through the origin) of a selected set of participants. All members of the KCRV subset except one agree with each other. The standard deviation of the x-residuals of this group (which consists of NMIJ, VSL, NIST, NPL, BAM, KRISS and CENAM) is 0.056 µmol/mol, consistent with the uncertainties assigned to their standard mixtures. The standard deviation of the residuals of all participating laboratories is 0.182 µmol/mol. With respect to impurity analysis, the overall argon amounts of the cylinders are in the region of about 3 µmol/mol; however, four cylinders showed an argon amount fraction over 10 µmol/mol, two of which are inconsistent with the KCRV subset. Explicit separation between the oxygen and argon peaks in the GC chromatogram is essential to maintain analytical capability. Additionally, oxygen impurity analysis in nitrogen is indispensable to ensure preparative capability. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).
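    The KCRV procedure described above — a least-squares line forced through the origin over a selected subset of participants, with consistency judged from the spread of the x-residuals — can be sketched as follows. The numbers are invented for illustration and are not the actual CCQM-K53 data.

```python
import numpy as np

# Hypothetical gravimetric (x) and analytical (y) values in µmol/mol for
# seven participants -- invented numbers, not the actual comparison data.
grav = np.array([99.95, 100.02, 100.10, 99.88, 100.05, 99.97, 100.01])
anal = np.array([99.93, 100.04, 100.08, 99.90, 100.03, 99.99, 100.00])

# Least-squares slope of a line forced through the origin: b = Σxy / Σx².
b = np.sum(grav * anal) / np.sum(grav * grav)

# x-residuals: how far each gravimetric value sits from the regression
# line, measured along the x-axis (y/b - x).
x_resid = anal / b - grav
spread = np.std(x_resid, ddof=1)
print(f"slope = {b:.6f}; std of x-residuals = {spread:.3f} µmol/mol")
```

    A laboratory whose x-residual greatly exceeds this spread (relative to its stated uncertainty) would be flagged as inconsistent with the KCRV subset.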

  12. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates.

    PubMed

    Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA as originally developed does not consider time series data with replicates, which hinders the full exploitation of the available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique in an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond those of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage at http://meta.usc.edu/softs/lsa.
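    The core LSA idea — scoring the strongest locally aligned, possibly time-shifted stretch of two normalized series — can be sketched with a small routine. This is an illustrative simplification of the technique, not the published eLSA package.

```python
import numpy as np

def local_similarity(x, y, max_delay=3):
    """Local similarity score between two series: the largest-magnitude
    sum of z-scored products x[t] * y[t + d] over any contiguous window
    and any delay |d| <= max_delay, normalized by series length.
    A sketch of the LSA idea only, not the published eLSA software."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    best = 0.0
    for d in range(-max_delay, max_delay + 1):
        # Align the two series at delay d.
        prod = x[: n - d] * y[d:] if d >= 0 else x[-d:] * y[: n + d]
        # Max-magnitude contiguous sum (Kadane's algorithm), checking
        # both positive and negative (anti-correlated) associations.
        for sign in (1.0, -1.0):
            run = 0.0
            for p in sign * prod:
                run = max(0.0, run + p)
                best = max(best, run)
    return best / n

rng = np.random.default_rng(0)
a = rng.normal(size=50)
b_series = np.roll(a, 2) + 0.1 * rng.normal(size=50)  # a leads b_series by 2
print(local_similarity(a, b_series))  # high score at the matching delay
```

    Ordinary (zero-lag, full-window) correlation between `a` and `b_series` is weak; the delay scan and local window are what recover the association.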

  13. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates

    PubMed Central

    2011-01-01

    Background The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA as originally developed does not consider time series data with replicates, which hinders the full exploitation of the available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. Results We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique in an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. Conclusions The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond those of ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage at http://meta.usc.edu/softs/lsa. PMID:22784572

  14. Geographic Information System (GIS) capabilities in traffic accident information management: a qualitative approach.

    PubMed

    Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali

    2017-06-01

    Traffic accidents are one of the most important national and international issues, and their consequences matter at the political, economic, and social levels of a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified from literature retrieved from the Internet according to the inclusion criteria. Review of the literature was performed until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency personnel, statisticians, and IT experts in trauma, emergency and police centers. Sampling was purposive. Data were collected using a questionnaire based on the first-step data; validity and reliability were determined by content validity and a Cronbach's alpha of 75%. Data were analyzed using the decision Delphi technique. GIS capabilities were identified in ten categories and 64 sub-categories. Import and processing of spatial and descriptive data, and analysis of these data, were the most important capabilities of GIS in traffic accident information management. Storing and retrieving descriptive and spatial data, providing statistical analysis in table, chart and zoning formats, managing ill-structured problems, determining the cost-effectiveness of decisions and prioritizing their implementation were the most important GIS capabilities that can be efficient in the management of traffic accident information.

  15. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. 
However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of future events, and their effects on method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, under the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the preferable analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  16. A Development Strategy for Creating a Suite of Reference Materials for the in-situ Microanalysis of Non-conventional Raw Materials

    NASA Astrophysics Data System (ADS)

    Renno, A. D.; Merchel, S.; Michalak, P. P.; Munnik, F.; Wiedenbeck, M.

    2010-12-01

    Recent economic trends regarding the supply of rare metals readily justify scientific research into non-conventional raw materials, where a particular need is a better understanding of the relationship between mineralogy, microstructure and the distribution of key metals within ore deposits (geometallurgy). Achieving these goals will require extensive use of in-situ microanalytical techniques capable of spatially resolving the material heterogeneities that can be key to better resource utilization. The availability of certified reference materials (CRMs) is an essential prerequisite for (1) validating new analytical methods, (2) demonstrating data quality to contracting authorities, (3) supporting method development and instrument calibration, and (4) establishing traceability between new analytical approaches and existing data sets. This need has led to the granting of funding by the European Union and the German Free State of Saxony for a program to develop such reference materials. This effort will apply the following strategies during the selection of the phases: (1) use exclusively synthetic minerals, thereby providing large volumes of homogeneous starting material; (2) focus on matrices capable of incorporating many ‘important’ elements while avoiding exotic compositions that would not be optimal matrix matches; (3) emphasise phases that remain stable during the various microanalytical procedures. This initiative will assess the homogeneity of the reference materials at sampling sizes ranging between 50 and 1 µm; it is also intended to document crystal-structural homogeneity, as this may potentially impact specific analytical methods. As far as possible, both definitive methods and methods involving matrix corrections will be used for determining the compositions of the individual materials. 
A critical challenge will be the validation of analyte concentration determinations at sub-µg sampling masses. We plan to cooperate with those interested in the development of such reference materials and invite them to take part in round-robin exercises.

  17. Liquid air cycle engines

    NASA Technical Reports Server (NTRS)

    Rosevear, Jerry

    1992-01-01

    Given here is a definition of Liquid Air Cycle Engines (LACE) and existing relevant technologies. Heat exchanger design and fabrication techniques, the handling of liquid hydrogen to achieve the greatest heat sink capabilities, and air decontamination to prevent heat exchanger fouling are discussed. It was concluded that technology needs to be extended in the areas of design and fabrication of heat exchangers to improve reliability along with weight and volume reductions. Catalysts need to be improved so that conversion can be achieved with lower quantities and lower volumes. Packaging studies need to be investigated both analytically and experimentally. Recycling with slush hydrogen needs further evaluation with experimental testing.

  18. Investigation of Advanced Radar Techniques for Atmospheric Hazard Detection with Airborne Weather Radar

    NASA Technical Reports Server (NTRS)

    Pazmany, Andrew L.

    2014-01-01

    In 2013 ProSensing Inc. conducted a study to investigate the hazard detection potential of aircraft weather radars with new measurement capabilities, such as multi-frequency, polarimetric and radiometric modes. Various radar designs and features were evaluated for sensitivity, measurement range, and for detecting and quantifying atmospheric hazards in a wide range of weather conditions. The projected size, weight, power consumption and cost of the various designs were also considered. Various cloud and precipitation conditions were modeled and used to conduct an analytic evaluation of the design options. This report provides an overview of the study and summarizes the conclusions and recommendations.

  19. Development of design information for molecular-sieve type regenerative CO2-removal systems

    NASA Technical Reports Server (NTRS)

    Wright, R. M.; Ruder, J. M.; Dunn, V. B.; Hwang, K. C.

    1973-01-01

    Experimental and analytic studies were conducted with molecular sieve sorbents to provide basic design information, and to develop a system design technique for regenerable CO2-removal systems for manned spacecraft. Single sorbate equilibrium data were obtained over a wide range of conditions for CO2, water, nitrogen, and oxygen on several molecular sieve and silica gel sorbents. The coadsorption of CO2 with water preloads, and with oxygen and nitrogen was experimentally evaluated. Mass-transfer, and some limited heat-transfer performance evaluations were accomplished under representative operating conditions, including the coadsorption of CO2 and water. CO2-removal system performance prediction capability was derived.

  20. Peptide Fragmentation Induced by Radicals at Atmospheric Pressure

    PubMed Central

    Vilkov, Andrey N.; Laiko, Victor V.; Doroshenko, Vladimir M.

    2009-01-01

    A novel ion dissociation technique, which is capable of providing an efficient fragmentation of peptides at essentially atmospheric pressure conditions, is developed. The fragmentation patterns observed often contain c-type fragments that are specific to ECD/ETD, along with the y-/b- fragments that are specific to CAD. In the presented experimental setup, ion fragmentation takes place within a flow reactor located in the atmospheric pressure region between the ion source and the mass spectrometer. According to a proposed mechanism, the fragmentation results from the interaction of ESI-generated analyte ions with the gas-phase radical species produced by a corona discharge source. PMID:19034885

  1. Analysis of Well-Clear Boundary Models for the Integration of UAS in the NAS

    NASA Technical Reports Server (NTRS)

    Upchurch, Jason M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Chamberlain, James P.; Consiglio, Maria C.

    2014-01-01

    The FAA-sponsored Sense and Avoid Workshop for Unmanned Aircraft Systems (UAS) defines the concept of sense and avoid for remote pilots as "the capability of a UAS to remain well clear from and avoid collisions with other airborne traffic." Hence, a rigorous definition of well clear is fundamental to any separation assurance concept for the integration of UAS into civil airspace. This paper presents a family of well-clear boundary models based on the TCAS II Resolution Advisory logic. Analytical techniques are used to study the properties and relationships satisfied by the models. Some of these properties are numerically quantified using statistical methods.
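    As an illustration of the kind of predicate such boundary models formalize, a minimal distance/time-to-closure (tau) check might look like the following. The thresholds and functional form here are invented for the sketch; they are not the paper's TCAS II-based models.

```python
import math

def well_clear(rx, ry, vx, vy, dmod=2000.0, tau_thresh=35.0):
    """Sketch of a distance/tau well-clear predicate in the spirit of
    TCAS-style logic (illustrative only, not the paper's models).
    rx, ry: intruder position relative to ownship (m);
    vx, vy: relative velocity (m/s).  Returns True if well clear."""
    r = math.hypot(rx, ry)            # horizontal range (m)
    rdot = (rx * vx + ry * vy) / r    # range rate (m/s); < 0 means closing
    if r <= dmod:
        return False                  # inside the distance threshold
    if rdot < 0 and -r / rdot <= tau_thresh:
        return False                  # predicted closure too soon (tau)
    return True

print(well_clear(5000.0, 0.0, -200.0, 0.0))  # → False (tau = 25 s ≤ 35 s)
print(well_clear(5000.0, 0.0, 50.0, 0.0))    # → True (diverging)
```

    A formal analysis, as in the paper, would then ask which such predicates are inclusive of one another and how their violation volumes compare.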

  2. Earth materials research: Report of a Workshop on Physics and Chemistry of Earth Materials

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The report concludes that an enhanced effort of earth materials research is necessary to advance the understanding of the processes that shape the planet. In support of such an effort, there are new classes of experiments, new levels of analytical sensitivity and precision, and new levels of theory that are now applicable in understanding the physical and chemical properties of geological materials. The application of these capabilities involves the need to upgrade and make greater use of existing facilities as well as the development of new techniques. A concomitant need is for a sample program involving their collection, synthesis, distribution, and analysis.

  3. A System Analysis for Determining Alternative Technological Issues for the Future

    NASA Technical Reports Server (NTRS)

    Magistrale, V. J.; Small, J.

    1967-01-01

    A systems engineering methodology is provided, by which future technological ventures may be examined utilizing particular national, corporate, or individual value judgments. Three matrix analyses are presented. The first matrix is concerned with the effect of technology on population increase, war, poverty, health, resources, and prejudice. The second matrix explores an analytical technique for determining the relative importance of different areas of technology. The third matrix explores how an individual or corporate entity may determine how its capability may be used for future technological opportunities. No conclusions are presented since primary effort has been placed on the methodology of determining future technological issues.

  4. Simultaneous determination of iron, cadmium, zinc, copper, nickel, lead, and uranium in seawater by stable isotope dilution spark source mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mykytiuk, A.P.; Russell, D.S.; Sturgeon, R.E.

    Trace concentrations (ng/mL) of Fe, Cd, Zn, Cu, Ni, Pb, U, and Co have been determined in seawater by stable isotope dilution spark source mass spectrometry. The seawater samples were preconcentrated on the ion exchanger Chelex-100 and the concentrate was evaporated onto a graphite or silver electrode. The results are compared with those obtained by graphite furnace atomic absorption spectrometry and inductively coupled plasma emission spectrometry. The technique avoids the use of calibration standards and is capable of producing results in cases where the analyte is only partially recovered.
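    The tolerance of partial recovery follows from the isotope dilution principle: once spike and sample have equilibrated, any losses remove both isotopes in the same proportion, leaving the measured ratio unchanged. A textbook single-dilution calculation, with a hypothetical two-isotope element and invented abundances (not the paper's data), is sketched below.

```python
def isotope_dilution(n_spike, r_meas, a1_nat, a2_nat, a1_spk, a2_spk):
    """Textbook single isotope-dilution equation (a sketch, not the
    paper's exact procedure).  n_spike: moles of spike added;
    r_meas: measured blend ratio R = isotope1 / isotope2;
    a*_nat, a*_spk: isotope abundances of the natural element and of
    the enriched spike.  Returns moles of natural-composition analyte."""
    return n_spike * (a1_spk - r_meas * a2_spk) / (r_meas * a2_nat - a1_nat)

# Hypothetical element: natural abundances 90/10, spike enriched to 5/95.
# A blend of 10 units of analyte and 2 units of spike shows the ratio
# (10*0.9 + 2*0.05) / (10*0.1 + 2*0.95); the equation recovers the 10.
r_blend = (10 * 0.9 + 2 * 0.05) / (10 * 0.1 + 2 * 0.95)
print(round(isotope_dilution(2.0, r_blend, 0.9, 0.1, 0.05, 0.95), 6))  # → 10.0
```

    Because only the ratio `r_meas` enters, evaporating the concentrate onto an electrode with less than 100% recovery does not bias the result.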

  5. Imaging of oxygen and hypoxia in cell and tissue samples.

    PubMed

    Papkovsky, Dmitri B; Dmitriev, Ruslan I

    2018-05-14

    Molecular oxygen (O2) is a key player in cell mitochondrial function, redox balance and oxidative stress, normal tissue function and many common disease states. Various chemical, physical and biological methods have been proposed for the measurement, real-time monitoring and imaging of O2 concentration, states of decreased O2 (hypoxia) and related parameters in cells and tissue. Here, we review established and emerging optical microscopy techniques for visualizing O2 levels in cells and tissue samples, mostly under in vitro and ex vivo but also under in vivo settings. Particular examples include fluorescent hypoxia stains, fluorescent protein reporter systems, phosphorescent probes and nanosensors of different types. These techniques allow high-resolution mapping of O2 gradients in live or post-mortem tissue, in 2D or 3D, qualitatively or quantitatively. They enable control and monitoring of oxygenation conditions and their correlation with other biomarkers of cell and tissue function. A comparison of these techniques and the corresponding imaging setups, their analytical capabilities and typical applications is given.

  6. Demonstration of the feasibility of an integrated x ray laboratory for planetary exploration

    NASA Technical Reports Server (NTRS)

    Franco, E. D.; Kerner, J. A.; Koppel, L. N.; Boyle, M. J.

    1993-01-01

    The identification of minerals and elemental compositions is an important component of the geological and exobiological exploration of the solar system. X ray diffraction and fluorescence are common techniques for obtaining these data. The feasibility of combining these analytical techniques in an integrated x ray laboratory compatible with the volume, mass, and power constraints imposed by many planetary missions was demonstrated. Breadboard-level hardware was developed to cover the range of diffraction lines produced by minerals, clays, and amorphous materials, and to detect the x ray fluorescence emissions of elements from carbon through uranium. These breadboard modules were fabricated and used to demonstrate the ability to detect elements and minerals. Additional effort is required to establish the detection limits of the breadboard modules and to integrate the diffraction and fluorescence techniques into a single unit. It was concluded that this integrated x ray laboratory capability will be a valuable tool in the geological and exobiological exploration of the solar system.

  7. Intercomparison of HONO Measurements Made Using Wet-Chemical (NITROMAC) and Spectroscopic (IBBCEAS & LP/FAGE) Techniques

    NASA Astrophysics Data System (ADS)

    Dusanter, S.; Lew, M.; Bottorff, B.; Bechara, J.; Mielke, L. H.; Berke, A.; Raff, J. D.; Stevens, P. S.; Afif, C.

    2013-12-01

    A good understanding of the oxidative capacity of the atmosphere is important to tackle fundamental issues related to climate change and air quality. The hydroxyl radical (OH) is the dominant oxidant in the daytime troposphere and an accurate description of its sources in atmospheric models is of utmost importance. Recent field studies indicate higher-than-expected concentrations of HONO during the daytime, suggesting that the photolysis of HONO may be an important underestimated source of OH. Understanding the tropospheric HONO budget requires confidence in analytical instrumentation capable of selectively measuring HONO. In this presentation, we discuss an intercomparison study of HONO measurements performed during summer 2013 at the edge of a hardwood forest in Southern Indiana. This exercise involved a wet chemical technique (NITROMAC), an Incoherent Broad-Band Cavity Enhanced Absorption Spectroscopy instrument (IBBCEAS), and a Laser-Photofragmentation/Fluorescence Assay by Gas Expansion instrument (LP/FAGE). The agreement observed between the three techniques will be discussed for both ambient measurements and cross calibration experiments.

  8. Coupling Front-End Separations, Ion Mobility Spectrometry, and Mass Spectrometry For Enhanced Multidimensional Biological and Environmental Analyses

    PubMed Central

    Zheng, Xueyun; Wojcik, Roza; Zhang, Xing; Ibrahim, Yehia M.; Burnum-Johnson, Kristin E.; Orton, Daniel J.; Monroe, Matthew E.; Moore, Ronald J.; Smith, Richard D.; Baker, Erin S.

    2017-01-01

    Ion mobility spectrometry (IMS) is a widely used analytical technique for rapid molecular separations in the gas phase. Though IMS alone is useful, its coupling with mass spectrometry (MS) and front-end separations is extremely beneficial for increasing measurement sensitivity, peak capacity of complex mixtures, and the scope of molecular information available from biological and environmental sample analyses. In fact, multiple disease screening and environmental evaluations have illustrated that the IMS-based multidimensional separations extract information that cannot be acquired with each technique individually. This review highlights three-dimensional separations using IMS-MS in conjunction with a range of front-end techniques, such as gas chromatography, supercritical fluid chromatography, liquid chromatography, solid-phase extractions, capillary electrophoresis, field asymmetric ion mobility spectrometry, and microfluidic devices. The origination, current state, various applications, and future capabilities of these multidimensional approaches are described in detail to provide insight into their uses and benefits. PMID:28301728

  9. Nanoscale infrared spectroscopy as a non-destructive probe of extraterrestrial samples.

    PubMed

    Dominguez, Gerardo; Mcleod, A S; Gainsforth, Zack; Kelly, P; Bechtel, Hans A; Keilmann, Fritz; Westphal, Andrew; Thiemens, Mark; Basov, D N

    2014-12-09

    Advances in the spatial resolution of modern analytical techniques have tremendously augmented the scientific insight gained from the analysis of natural samples. Yet, while techniques for the elemental and structural characterization of samples have achieved sub-nanometre spatial resolution, infrared spectral mapping of geochemical samples at vibrational 'fingerprint' wavelengths has remained restricted to spatial scales >10 μm. Nevertheless, infrared spectroscopy remains an invaluable contactless probe of chemical structure, details of which offer clues to the formation history of minerals. Here we report on the successful implementation of infrared near-field imaging, spectroscopy and analysis techniques capable of sub-micron scale mineral identification within natural samples, including a chondrule from the Murchison meteorite and a cometary dust grain (Iris) from NASA's Stardust mission. Complementary to scanning electron microscopy, energy-dispersive X-ray spectroscopy and transmission electron microscopy probes, this work evidences a similarity between chondritic and cometary materials, and inaugurates a new era of infrared nano-spectroscopy applied to small and invaluable extraterrestrial samples.

  10. An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1994-01-01

    The primary accomplishments of the project are as follows: (1) Using the transonic small perturbation equation as a flowfield model, the project demonstrated that the quasi-analytical method could be used to obtain aerodynamic sensitivity coefficients for airfoils at subsonic, transonic, and supersonic conditions for design variables such as Mach number, airfoil thickness, maximum camber, angle of attack, and location of maximum camber. It was established that the quasi-analytical approach was an accurate method for obtaining aerodynamic sensitivity derivatives for airfoils at transonic conditions and usually more efficient than the finite difference approach. (2) The use of symbolic manipulation software to determine the appropriate expressions and computer coding associated with the quasi-analytical method for sensitivity derivatives was investigated. Using the three-dimensional fully conservative full potential flowfield model, it was determined that symbolic manipulation along with a chain-rule approach was extremely useful in developing a combined flowfield and quasi-analytical sensitivity derivative code capable of considering a large number of realistic design variables. (3) Using the three-dimensional fully conservative full potential flowfield model, the quasi-analytical method was applied to swept wings (i.e., three-dimensional) at transonic flow conditions. (4) The incremental iterative technique was applied to the three-dimensional transonic nonlinear small perturbation flowfield formulation, an equivalent plate deflection model, and the associated aerodynamic and structural discipline sensitivity equations; coupled aeroelastic results for an aspect-ratio-three wing in transonic flow have been obtained.
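
    The quasi-analytical idea can be sketched in miniature: differentiate the converged residual equation R(q, x) = 0 with respect to a design variable x and solve one linear equation for dq/dx, rather than re-solving the nonlinear flowfield at perturbed values of x. The scalar residual below is an illustrative stand-in, not the transonic small perturbation equation:

```python
def residual(q, x):
    # Illustrative scalar residual R(q, x) = 0 standing in for the
    # flowfield equations; x plays the role of a design variable.
    return q**2 + x * q - 2.0

def solve_state(x, q0=1.0, tol=1e-12):
    # Newton iteration for the flow state q satisfying R(q, x) = 0.
    q = q0
    for _ in range(50):
        r = residual(q, x)
        if abs(r) < tol:
            break
        q -= r / (2.0 * q + x)          # dR/dq = 2q + x
    return q

def quasi_analytical_sensitivity(x):
    # Differentiating R(q(x), x) = 0 gives (dR/dq) dq/dx + dR/dx = 0,
    # so dq/dx = -(dR/dx) / (dR/dq): one linear solve, no extra flow solves.
    q = solve_state(x)
    return -q / (2.0 * q + x)           # here dR/dx = q

def finite_difference_sensitivity(x, h=1e-6):
    # Brute-force alternative: re-solve the nonlinear state at x +/- h.
    return (solve_state(x + h) - solve_state(x - h)) / (2.0 * h)
```

    For this residual the quasi-analytical and central-difference sensitivities agree to several digits, while the former needs no extra nonlinear solves, which mirrors the efficiency advantage reported in the abstract.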

  11. Development Of Antibody-Based Fiber-Optic Sensors

    NASA Astrophysics Data System (ADS)

    Tromberg, Bruce J.; Sepaniak, Michael J.; Vo-Dinh, Tuan

    1988-06-01

    The speed and specificity characteristic of immunochemical complex formation have encouraged the development of numerous antibody-based analytical techniques. The scope and versatility of these established methods can be enhanced by combining the principles of conventional immunoassay with laser-based fiber-optic fluorimetry. This merger of spectroscopy and immunochemistry provides the framework for the construction of highly sensitive and selective fiber-optic devices (fluoroimmuno-sensors) capable of in-situ detection of drugs, toxins, and naturally occurring biochemicals. Fluoroimmuno-sensors (FIS) employ an immobilized reagent phase at the sampling terminus of a single quartz optical fiber. Laser excitation of antibody-bound analyte produces a fluorescence signal which is either directly proportional (as in the case of natural fluorophore and "antibody sandwich" assays) or inversely proportional (as in the case of competitive-binding assays) to analyte concentration. Factors which influence analysis time, precision, linearity, and detection limits include the nature (solid or liquid) and amount of the reagent phase, the method of analyte delivery (passive diffusion, convection, etc.), and whether equilibrium or non-equilibrium assays are performed. Data will be presented for optical fibers whose sensing termini utilize: (1) covalently-bound solid antibody reagent phases, and (2) membrane-entrapped liquid antibody reagents. Assays for large-molecular weight proteins (antigens) and small-molecular weight, carcinogenic, polynuclear aromatics (haptens) will be considered. In this manner, the influence of a system's chemical characteristics and measurement requirements on sensor design, and the consequence of various sensor designs on analytical performance, will be illustrated.

  12. MOMA Gas Chromatograph-Mass Spectrometer onboard the 2018 ExoMars Mission: results and performance

    NASA Astrophysics Data System (ADS)

    Buch, A.; Pinnick, V. T.; Szopa, C.; Grand, N.; Humeau, O.; van Amerom, F. H.; Danell, R.; Freissinet, C.; Brinckerhoff, W.; Gonnsen, Z.; Mahaffy, P. R.; Coll, P.; Raulin, F.; Goesmann, F.

    2015-10-01

    The Mars Organic Molecule Analyzer (MOMA) is a dual ion source linear ion trap mass spectrometer that was designed for the 2018 joint ESA-Roscosmos mission to Mars. The main scientific aim of the mission is to search for signs of extant or extinct life in the near subsurface of Mars by acquiring samples from as deep as 2 m below the surface. MOMA will be a key analytical tool in providing chemical (molecular and chiral) information from the solid samples, with particular focus on the characterization of organic content. The MOMA instrument is itself a joint venture between NASA and ESA to develop a mass spectrometer capable of analyzing samples from pyrolysis/chemical derivatization gas chromatography (GC) as well as ambient pressure laser desorption ionization (LDI). The combination of the two analytical techniques allows for the chemical characterization of a broad range of compounds, including volatile and non-volatile species. Generally, MOMA can provide information on elemental and molecular makeup, polarity, chirality and isotopic patterns of analyte species. Here we report on the current performance of the MOMA prototype instruments, specifically the demonstration of the gas chromatography-mass spectrometry (GC-MS) mode of operation.

  13. TopicLens: Efficient Multi-Level Visual Topic Exploration of Large-Scale Document Collections.

    PubMed

    Kim, Minjeong; Kang, Kyeongpil; Park, Deokgun; Choo, Jaegul; Elmqvist, Niklas

    2017-01-01

    Topic modeling, which reveals underlying topics of a document corpus, has been actively adopted in visual analytics for large-scale document collections. However, due to its significant processing time and non-interactive nature, topic modeling has so far not been tightly integrated into a visual analytics workflow. Instead, most such systems are limited to utilizing a fixed, initial set of topics. Motivated by this gap in the literature, we propose a novel interaction technique called TopicLens that allows a user to dynamically explore data through a lens interface where topic modeling and the corresponding 2D embedding are efficiently computed on the fly. To support this interaction in real time while maintaining view consistency, we propose a novel efficient topic modeling method and a semi-supervised 2D embedding algorithm. Our work is based on improving state-of-the-art methods such as nonnegative matrix factorization and t-distributed stochastic neighbor embedding. Furthermore, we have built a web-based visual analytics system integrated with TopicLens. We use this system to measure the performance and the visualization quality of our proposed methods. We provide several scenarios showcasing the capability of TopicLens using real-world datasets.
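
    TopicLens builds its on-the-fly topics on nonnegative matrix factorization. A minimal sketch of the underlying factorization, using the classic Lee-Seung multiplicative updates in plain NumPy (not the authors' accelerated method), looks like this:

```python
import numpy as np

def nmf(V, k, iters=500, eps=1e-9, seed=0):
    # Lee-Seung multiplicative updates for V ~= W @ H with W, H >= 0.
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update topic-document weights
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update term-topic weights
    return W, H

# Toy term-document matrix (rows = terms, columns = documents)
# with two clearly separated topics.
V = np.array([[2.0, 2.0, 0.0, 0.0],
              [3.0, 3.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 2.0],
              [0.0, 0.0, 2.0, 4.0]])
W, H = nmf(V, k=2)
```

    Each column of W can be read as a topic's term profile and each column of H as a document's topic mixture; making exactly this step fast enough for interactive lens movement is what TopicLens targets.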

  14. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing an increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called “ultrafast” (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool witnessing an expanded scope of applications. The present review summarizes the principles and the main developments that have contributed to the success of this approach, and focuses on applications recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  15. Preliminary studies of using preheated carrier gas for on-line membrane extraction of semivolatile organic compounds.

    PubMed

    Liu, Xinyu; Pawliszyn, Janusz

    2007-04-01

    In this paper, we present results for the on-line determination of semivolatile organic compounds (SVOCs) in air using membrane extraction with a sorbent interface-ion mobility spectrometry (MESI-IMS) system with a preheated carrier (stripping) gas. The mechanism of the mass transfer of SVOCs across a membrane was initially studied. In comparison with the extraction of volatile analytes, the mass transfer resistance that originated from the slow desorption from the internal membrane surface during the SVOC extraction processes should be taken into account. A preheated carrier gas system was therefore built to facilitate desorption of analytes from the internal membrane surface. With the benefit of a temperature gradient existing between the internal and external membrane surfaces, an increase in the desorption rate of a specific analyte at the internal surface and the diffusion coefficient within the membrane could be achieved while avoiding a decrease of the distribution constant on the external membrane interface. This technique improved both the extraction rate and response times of the MESI-IMS system for the analysis of SVOCs. Finally, the MESI-IMS system was shown to be capable of on-site measurement by monitoring selected polynuclear aromatic hydrocarbons emitted from cigarette smoke.
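
    The benefit of preheating the carrier gas can be rationalized with a standard solution-diffusion permeation model in which the membrane diffusion coefficient follows an Arrhenius law. The model and all constants below are illustrative assumptions, not values from this study:

```python
import math

GAS_CONSTANT = 8.314  # J mol^-1 K^-1

def diffusion_coeff(T, D0=1e-9, Ea=30e3):
    # Arrhenius-type temperature dependence of diffusion in the membrane;
    # D0 (m^2/s) and Ea (J/mol) are illustrative, not measured values.
    return D0 * math.exp(-Ea / (GAS_CONSTANT * T))

def permeation_flux(T_internal, C_ext=1.0, C_int=0.0, K=50.0, L=1e-4):
    # Steady-state solution-diffusion flux J = D(T) * K * (C_ext - C_int) / L,
    # with K the membrane/gas distribution constant and L the thickness (m).
    return diffusion_coeff(T_internal) * K * (C_ext - C_int) / L
```

    Raising the internal surface temperature increases D(T) and hence the flux, consistent with the faster desorption of SVOCs from the internal membrane surface described above, while the cooler external surface preserves the distribution constant.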

  16. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly devoid of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. These goals are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  17. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  18. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of dynamic analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  19. A Newton-Krylov method with an approximate analytical Jacobian for implicit solution of Navier-Stokes equations on staggered overset-curvilinear grids with immersed boundaries.

    PubMed

    Asgharzadeh, Hafez; Borazjani, Iman

    2017-02-15

    The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for nonlinear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs) to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form preconditioner for matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation depending on the grid (size) and the flow problem. 
In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with higher time steps and in approximately 30% fewer iterations even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one over the other depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide the building of preconditioners for other techniques to improve their performance in the future.
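
    The core of the method, a Newton outer loop whose linear updates are obtained by a Krylov (GMRES) solver driven by hand-derived analytical Jacobian-vector products rather than automatic differentiation or finite-difference matvecs, can be sketched on a small nonlinear system (an illustrative stand-in, not the Navier-Stokes discretization):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def F(u):
    # Toy nonlinear system standing in for the discretized momentum
    # equations; its root is u0 = u1 = sqrt(2).
    return np.array([u[0]**2 + u[1]**2 - 4.0, u[0] - u[1]])

def jac_vec(u, v):
    # Analytical Jacobian-vector product J(u) @ v, derived by hand
    # (no automatic differentiation, no finite-difference matvec).
    return np.array([2.0 * u[0] * v[0] + 2.0 * u[1] * v[1], v[0] - v[1]])

def newton_krylov(u0, tol=1e-10, max_newton=20):
    # Outer Newton loop; each update solves J du = -F(u) with GMRES,
    # which needs only matvecs, never the assembled Jacobian matrix.
    u = np.asarray(u0, dtype=float)
    for _ in range(max_newton):
        r = F(u)
        if np.linalg.norm(r) < tol:
            break
        J = LinearOperator((2, 2), matvec=lambda v: jac_vec(u, v), dtype=float)
        du, info = gmres(J, -r, atol=1e-12)
        assert info == 0          # inner Krylov solve converged
        u = u + du
    return u

u = newton_krylov(np.array([1.0, 2.0]))   # converges to (sqrt(2), sqrt(2))
```

    The same structure carries over to the paper's setting: the analytical Jacobian is cheap to apply and can also be used, in full or diagonal form, as a preconditioner for the matrix-free inner solve.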

  20. A Newton–Krylov method with an approximate analytical Jacobian for implicit solution of Navier–Stokes equations on staggered overset-curvilinear grids with immersed boundaries

    PubMed Central

    Asgharzadeh, Hafez; Borazjani, Iman

    2016-01-01

    The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for nonlinear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs) to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form preconditioner for matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation depending on the grid (size) and the flow problem. 
In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42–74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with higher time steps and in approximately 30% fewer iterations even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one over the other depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80–90% on the problems tested. The NKM with the analytical Jacobian can guide the building of preconditioners for other techniques to improve their performance in the future. PMID:28042172

  1. A Newton-Krylov method with an approximate analytical Jacobian for implicit solution of Navier-Stokes equations on staggered overset-curvilinear grids with immersed boundaries

    NASA Astrophysics Data System (ADS)

    Asgharzadeh, Hafez; Borazjani, Iman

    2017-02-01

    The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for non-linear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs) to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation depending on the grid (size) and the flow problem. 
In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with higher time steps and in approximately 30% fewer iterations even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one over the other depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide the building of preconditioners for other techniques to improve their performance in the future.

  2. Hybrid methods for cybersecurity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel.
The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and years to hours and days for the application of new modeling and analysis capabilities to emerging threats. The development and deployment framework has been generalized into the Hybrid Framework and incorporated into several LDRD, WFO, and DOE/CSL projects and proposals. And most importantly, the Hybrid project has provided Sandia security analysts with new, scalable, extensible analytic capabilities that have resulted in alerts not detectable using their previous workflow tool sets.

  3. Optics-Integrated Microfluidic Platforms for Biomolecular Analyses

    PubMed Central

    Bates, Kathleen E.; Lu, Hang

    2016-01-01

    Compared with conventional optical methods, optics implemented on microfluidic chips provide smaller, and often much cheaper, ways to interrogate biological systems from the level of single molecules up to small model organisms. The optical probing of single molecules has been used to investigate the mechanical properties of individual biological molecules; however, multiplexing of these measurements through microfluidics and nanofluidics confers many analytical advantages. Optics-integrated microfluidic systems can significantly simplify sample processing and allow a more user-friendly experience; alignments of on-chip optical components are predetermined during fabrication and many purely optical techniques are passively controlled. Furthermore, sample loss from complicated preparation and fluid transfer steps can be virtually eliminated, a particularly important attribute for biological molecules at very low concentrations. Excellent fluid handling and high surface area/volume ratios also contribute to faster detection times for low abundance molecules in small sample volumes. Although integration of optical systems with classical microfluidic analysis techniques has been limited, microfluidics offers a ready platform for interrogation of biophysical properties. By exploiting the ease with which fluids and particles can be precisely and dynamically controlled in microfluidic devices, optical sensors capable of unique imaging modes, single molecule manipulation, and detection of minute changes in concentration of an analyte are possible. PMID:27119629

  4. Plant-based Food and Feed Protein Structure Changes Induced by Gene-transformation heating and bio-ethanol processing: A Synchrotron-based Molecular Structure and Nutrition Research Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P Yu

    Unlike traditional 'wet' analytical methods, which during processing often destroy or alter the intrinsic protein structures, advanced synchrotron radiation-based Fourier transform infrared microspectroscopy has been developed as a rapid, nondestructive bioanalytical technique. This cutting-edge synchrotron-based technology, taking advantage of the brightness of synchrotron light (millions of times brighter than the sun), is capable of exploring the molecular chemistry or structure of a biological tissue at ultra-spatial resolution without destroying its inherent structures. In this article, a novel approach is introduced to show the potential of this advanced synchrotron-based analytical technology, which can be used to study plant-based food or feed protein molecular structure in relation to nutrient utilization and availability. Recent progress is reported on using synchrotron radiation-based Fourier transform infrared microspectroscopy and diffuse reflectance infrared Fourier transform spectroscopy to detect the effects of gene transformation (Application 1), autoclaving (Application 2), and bio-ethanol processing (Application 3) on plant-based food and feed protein structure changes on a molecular basis. The synchrotron-based technology provides a new approach for plant-based protein structure research at ultra-spatial resolution at the cellular and molecular levels.

  5. Guided Text Search Using Adaptive Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Symons, Christopher T; Senter, James K

    This research demonstrates the promise of augmenting interactive visualizations with semi-supervised machine learning techniques to improve the discovery of significant associations and insights in the search and analysis of textual information. More specifically, we have developed a system called Gryffin that hosts a unique collection of techniques that facilitate individualized investigative search pertaining to an ever-changing set of analytical questions over an indexed collection of open-source documents related to critical national infrastructure. The Gryffin client hosts dynamic displays of the search results via focus+context record listings, temporal timelines, term-frequency views, and multiple coordinated views. Furthermore, as the analyst interacts with the display, the interactions are recorded and used to label the search records. These labeled records are then used to drive semi-supervised machine learning algorithms that re-rank the unlabeled search records such that potentially relevant records are moved to the top of the record listing. Gryffin is described in the context of the daily tasks encountered at the US Department of Homeland Security's Fusion Center, with whom we are collaborating in its development. The resulting system is capable of addressing the analysts' information overload that can be directly attributed to the deluge of information that must be addressed in the search and investigative analysis of textual information.
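
    The interaction-driven re-ranking described above can be approximated in miniature: labels gathered from the analyst's interactions train a model that re-scores the unlabeled records. The sketch below is a simplified supervised stand-in using scikit-learn on hypothetical records; the actual system applies semi-supervised algorithms to operational data:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical search records; the first four carry analyst labels
# (1 = relevant, 0 = not relevant), gathered from display interactions.
labeled = [
    ("phishing email with malicious attachment detected", 1),
    ("routine server maintenance completed overnight", 0),
    ("spear phishing hyperlink triggered download chain", 1),
    ("office printer toner replacement scheduled", 0),
]
unlabeled = [
    "new phishing hyperlink reported in the field",
    "cafeteria lunch menu updated for next week",
]

texts = [t for t, _ in labeled] + unlabeled
vec = TfidfVectorizer().fit(texts)            # vocabulary over all records
X_lab = vec.transform([t for t, _ in labeled])
y_lab = np.array([y for _, y in labeled])

model = LogisticRegression().fit(X_lab, y_lab)
scores = model.predict_proba(vec.transform(unlabeled))[:, 1]
ranking = np.argsort(-scores)                 # likely-relevant records first
```

    Re-running this fit-and-rank step after every batch of analyst interactions is what keeps potentially relevant records at the top of the listing as the analytical questions change.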

  6. Analysis of Listeria using exogenous volatile organic compound metabolites and their detection by static headspace-multi-capillary column-gas chromatography-ion mobility spectrometry (SHS-MCC-GC-IMS).

    PubMed

    Taylor, Carl; Lough, Fraser; Stanforth, Stephen P; Schwalbe, Edward C; Fowlis, Ian A; Dean, John R

    2017-07-01

    Listeria monocytogenes is a Gram-positive bacterium and an opportunistic food-borne pathogen which poses significant risk to the immune-compromised and pregnant due to the increased likelihood of acquiring infection and potential transmission of infection to the unborn child. Conventional methods of analysis suffer either from long turn-around times or from an inability to discriminate reliably between Listeria spp. This paper investigates an alternative method of detecting Listeria spp. using two novel enzyme substrates that liberate exogenous volatile organic compounds in the presence of α-mannosidase and D-alanyl aminopeptidase. The discriminating capabilities of this approach for identifying L. monocytogenes from other species of Listeria are investigated. The liberated volatile organic compounds (VOCs) are detected using an automated analytical technique based on static headspace-multi-capillary column-gas chromatography-ion mobility spectrometry (SHS-MCC-GC-IMS). The results obtained by SHS-MCC-GC-IMS are compared with those obtained by the more conventional analytical technique of headspace-solid phase microextraction-gas chromatography-mass spectrometry (HS-SPME-GC-MS). The results showed that it was possible to differentiate between L. monocytogenes and L. ivanovii, based on their VOC response from α-mannosidase activity.

  7. Quasi-monodimensional polyaniline nanostructures for enhanced molecularly imprinted polymer-based sensing.

    PubMed

    Berti, Francesca; Todros, Silvia; Lakshmi, Dhana; Whitcombe, Michael J; Chianella, Iva; Ferroni, Matteo; Piletsky, Sergey A; Turner, Anthony P F; Marrazza, Giovanna

    2010-10-15

    Recent advances in nanotechnology have allowed significant progress in utilising cutting-edge techniques associated with nanomaterials and nano-fabrication to expand the scope and capability of biosensors to a new level of novelty and functionality. The aim of this work was the development and characterisation of conductive polyaniline (PANI) nanostructures for applications in electrochemical biosensing. We explore a simple, inexpensive and fast route to grow PANI nanotubes, arranged in an ordered structure directly on an electrode surface, by electrochemical polymerisation using alumina nanoporous membranes as a 'nano-mould'. The deposited nanostructures were characterised electrochemically and morphologically prior to grafting with a molecularly imprinted polymer (MIP) receptor in order to create a model sensor for catechol detection. In this way, the PANI nanostructures formed a conductive nanowire system that allowed direct electrical connection between the electrode and the synthetic receptor (MIP). To our knowledge, this is the first example of integration between molecularly imprinted polymers and PANI nanostructured electrodes. The advantages of using nanostructures in this particular biosensing application were evaluated by comparing the analytical performance of the sensor with that of an analogous, previously developed non-nanostructured MIP sensor for catechol detection. A significantly lower limit of detection for catechol was obtained (29 nM, an order of magnitude lower), demonstrating that the nanostructures are capable of improving the analytical performance of the sensor. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Use of artificial intelligence in analytical systems for the clinical laboratory

    PubMed Central

    Truchaud, Alain; Ozawa, Kyoichi; Pardue, Harry; Schnipelsky, Paul

    1995-01-01

    The incorporation of information-processing technology into analytical systems in the form of standard computing software has recently been advanced by the introduction of artificial intelligence (AI), both as expert systems and as neural networks. This paper considers the role of software in system operation, control and automation, and attempts to define intelligence. AI is characterized by its ability to deal with incomplete and imprecise information and to accumulate knowledge. Expert systems, building on standard computing techniques, depend heavily on the domain experts and knowledge engineers who have programmed them to represent the real world. Neural networks are intended to emulate the pattern-recognition and parallel-processing capabilities of the human brain and are taught rather than programmed. The future may lie in a combination of the recognition ability of the neural network and the rationalization capability of the expert system. In the second part of the paper, examples are given of applications of AI in stand-alone systems for knowledge engineering and medical diagnosis and in embedded systems for failure detection, image analysis, user interfacing, natural language processing, robotics and machine learning, as related to clinical laboratories. It is concluded that AI constitutes a collective form of intellectual property, and that there is a need for better documentation, evaluation and regulation of the systems already being used in clinical laboratories. PMID:18924784

  9. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore affect the climate. Several works use different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. This work summarizes the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among techniques is a function of parameters such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Impact of automation on mass spectrometry.

    PubMed

    Zhang, Yan Victoria; Rockwood, Alan

    2015-10-23

    Mass spectrometry coupled to liquid chromatography (LC-MS and LC-MS/MS) is an analytical technique that has rapidly grown in popularity in clinical practice. In contrast to traditional technology, mass spectrometry is superior in many respects, including resolution, specificity and multiplexing capability, and it can measure analytes in various matrices. Despite these advantages, LC-MS/MS remains costly and labor intensive and has limited throughput. This specialized technology requires highly trained personnel and has therefore largely been limited to large institutions, academic organizations and reference laboratories. Advances in automation will be paramount to break through this bottleneck and increase its appeal for routine use. This article reviews these challenges, shares perspectives on essential features for LC-MS/MS total automation and proposes a step-wise, incremental approach to achieving total automation by reducing human intervention, increasing throughput and eventually integrating the LC-MS/MS system into automated clinical laboratory operations. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Instrumental neutron activation analysis for studying size-fractionated aerosols

    NASA Astrophysics Data System (ADS)

    Salma, Imre; Zemplén-Papp, Éva

    1999-10-01

    Instrumental neutron activation analysis (INAA) was utilized for studying aerosol samples collected into a coarse and a fine size fraction on Nuclepore polycarbonate membrane filters. As a result of the panoramic INAA, 49 elements were determined in about 200-400 μg of particulate matter by two irradiations and four γ-spectrometric measurements. The analytical calculations were performed by the absolute (k0) standardization method. The calibration procedures, application protocol and data evaluation process are described and discussed. They now make it possible to analyse a considerable number of samples while assuring the quality of the results. To demonstrate the system's analytical capabilities, the concentration ranges, median or mean atmospheric concentrations and detection limits are presented for an extensive series of aerosol samples collected within the framework of an urban air pollution study in Budapest. For most elements, the precision of the analysis was found to be better than the uncertainty introduced by the sampling techniques and sample variability.
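    The absolute (k0) standardization mentioned above expresses an element's mass fraction through its specific count rate relative to a co-irradiated gold comparator. A schematic version of the textbook k0 equation is sketched below; all numeric inputs are illustrative placeholders, not values from the study, and the α-dependence of the Q0 resonance-integral terms is assumed to be already folded into the supplied values:

    ```python
    # Schematic k0-NAA calculation: mass fraction of analyte 'a' from its
    # specific count rate relative to a co-irradiated Au comparator.
    #   w_a = (Asp_a/Asp_Au) * (1/k0) * ((f + Q0_Au)/(f + Q0_a)) * (eps_Au/eps_a)
    # where f is the thermal-to-epithermal flux ratio, Q0 the resonance
    # integral to cross-section ratio, and eps the detection efficiency.
    # All numbers below are illustrative placeholders.
    def k0_mass_fraction(asp_a, asp_au, k0, f, q0_a, q0_au, eps_a, eps_au):
        """Mass fraction of the analyte on the comparator basis."""
        return (asp_a / asp_au) / k0 * (f + q0_au) / (f + q0_a) * (eps_au / eps_a)

    w = k0_mass_fraction(asp_a=120.0, asp_au=5.0e4, k0=0.05,
                         f=30.0, q0_a=12.0, q0_au=15.7,
                         eps_a=2.0e-3, eps_au=2.4e-3)
    ```

    The practical appeal of the method, as the record notes, is that once the calibration factors are in hand, no element-specific standards need to be co-irradiated with each sample batch.
    
    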

  12. Small Gas Turbine Combustor Primary Zone Study

    NASA Technical Reports Server (NTRS)

    Sullivan, R. E.; Young, E. R.; Miles, G. A.; Williams, J. R.

    1983-01-01

    A development process is described which consists of design, fabrication, and preliminary test evaluations of three approaches to internal aerodynamic primary zone flow patterns: (1) conventional double vortex swirl stabilization; (2) reverse flow swirl stabilization; and (3) a large single vortex flow system. Each concept incorporates special design features aimed at extending the performance capability of the small engine combustor. Since the inherent geometry of these combustors results in a small combustion zone height and a high surface-area-to-volume ratio, the design features focus on internal aerodynamics, fuel placement, and advanced cooling. The combustors are evaluated on a full-scale annular combustor rig. A correlation of the primary zone performance with the overall performance is accomplished using three intrusion-type gas sampling probes located at the exit of the primary zone section. Empirical and numerical methods are used for designing and predicting the performance of the three combustor concepts and their subsequent modifications. The calibration of analytical procedures with actual test results permits an updating of the analytical design techniques applicable to small reverse-flow annular combustors.

  13. [Sustainability analysis of an evaluation policy: the case of primary health care in Brazil].

    PubMed

    Felisberto, Eronildo; Freese, Eduardo; Bezerra, Luciana Caroline Albuquerque; Alves, Cinthia Kalyne de Almeida; Samico, Isabella

    2010-06-01

    This study analyzes the sustainability of Brazil's National Policy for the Evaluation of Primary Health Care, based on the identification and categorization of representative critical events in the institutionalization process. This was an evaluative study of two analytical units: Federal Management of Primary Health Care and State Health Secretariats, using multiple case studies with data collected through interviews and institutional documents, using the critical incidents technique. Events that were temporally classified as specific to implementation, sustainability, and mixed were categorized analytically as pertaining to memory, adaptation, values, and rules. Federal management and one of the State Health Secretariats showed medium-level sustainability, while the other State Secretariat showed strong sustainability. The results indicate that the events were concurrent and suggest a weighting process, since the adaptation of activities, adequacy, and stabilization of resources displayed a strong influence on the others. Innovations and the development of technical capability are considered the most important results for sustainability.

  14. Enantioseparation by Capillary Electrophoresis Using Ionic Liquids as Chiral Selectors.

    PubMed

    Greño, Maider; Marina, María Luisa; Castro-Puyana, María

    2018-11-02

    Capillary electrophoresis (CE) is one of the most widely employed analytical techniques for achieving enantiomeric separations. Although many chiral selectors are commercially available for performing enantioseparations by CE, one of the most relevant topics in this field is the search for new selectors capable of providing high enantiomeric resolution. Chiral ionic liquids (CILs) have interesting characteristics that confer on them a high potential in chiral separations, although only some are commercially available. The aim of this article is to review the work published on the use of CILs as chiral selectors in the development of enantioselective methodologies by CE, covering the period from 2006 (when the first research work on this topic was published) to 2017. The use of CILs as sole chiral selectors, as chiral selectors in dual systems or as chiral ligands is considered. This review also provides detailed analytical information on the experimental conditions used to carry out enantioseparations in different fields as well as on the separation mechanisms involved.

  15. Automated headspace solid-phase dynamic extraction to analyse the volatile fraction of food matrices.

    PubMed

    Bicchi, Carlo; Cordero, Chiara; Liberto, Erica; Rubiolo, Patrizia; Sgorbini, Barbara

    2004-01-23

    High-concentration-capacity headspace techniques (headspace solid-phase microextraction (HS-SPME) and headspace sorptive extraction (HSSE)) are a bridge between static and dynamic headspace, since they give high concentration factors, as does dynamic headspace (D-HS), and are as easy to apply and as reproducible as static headspace (S-HS). In 2000, Chromtech (Idstein, Germany) introduced an inside-needle technique for vapour and liquid sampling, solid-phase dynamic extraction (SPDE), also known as "the magic needle". In SPDE, analytes are concentrated on a 50 μm film of polydimethylsiloxane (PDMS) and activated carbon (10%) coated onto the inside wall of the stainless steel needle (5 cm) of a 2.5 ml gas-tight syringe. When SPDE is used for headspace sampling (HS-SPDE), a fixed volume of the headspace of the sample under investigation is drawn up an appropriate number of times with the gas-tight syringe, and an analyte amount suitable for reliable GC or GC-MS analysis accumulates in the polymer coating the needle wall. This article describes the preliminary results of a study on the optimisation of the sampling parameters affecting HS-SPDE recovery, through the analysis of a standard mixture of highly volatile compounds (beta-pinene, isoamyl acetate and linalool), and of HS-SPDE-GC-MS analyses of aromatic plants and food matrices. This study shows that HS-SPDE is a successful technique for headspace sampling with high concentration capability, good repeatability and intermediate precision, even when compared to HS-SPME.

  16. Sample analysis at Mars

    NASA Astrophysics Data System (ADS)

    Coll, P.; Cabane, M.; Mahaffy, P. R.; Brinckerhoff, W. B.; Sam Team

    The next landed missions to Mars, such as the planned Mars Science Laboratory and ExoMars, will require sample analysis capabilities refined well beyond what has been flown to date. A key science objective driving this requirement is the determination of the carbon inventory of Mars, and particularly the detection of organic compounds. The Sample Analysis at Mars (SAM) suite consists of a group of tightly integrated experiments that would analyze samples delivered directly from a coring drill or by a facility sample processing and delivery (SPAD) mechanism. SAM consists of an advanced GC/MS system and a laser desorption mass spectrometer (LDMS). The combined capabilities of these techniques can address Mars science objectives with much improved sensitivity, resolution, and analytical breadth over what has previously been possible in situ. The GC/MS system analyzes the bulk composition (both molecular and isotopic) of solid-phase and atmospheric samples. Solid samples are introduced with a highly flexible chemical derivatization/pyrolysis subsystem (Pyr/GC/MS) that is significantly more capable than the mass spectrometers on Viking. The LDMS analyzes local elemental and molecular composition in solid samples vaporized and ionized with a pulsed laser. We will describe how each of these techniques has particular strengths that can achieve key measurement objectives at Mars. In addition, the close codevelopment of the GC/MS and LDMS along with a sample manipulation system enables the sharing of resources, the correlation of results, and the utilization of certain approaches that would not be possible with separate instruments. For instance, the same samples could be analyzed with more than one technique, increasing efficiency and providing cross-checks for quantification. There is also the possibility of combining methods, such as by permitting TOF-MS analyses of evolved gas (Pyr/EI-TOF-MS) or GC/MS analyses of laser-evaporated gas (LD-GC/MS).

  17. Can neutral analytes be concentrated by transient isotachophoresis in micellar electrokinetic chromatography and how much?

    PubMed

    Matczuk, Magdalena; Foteeva, Lidia S; Jarosz, Maciej; Galanski, Markus; Keppler, Bernhard K; Hirokawa, Takeshi; Timerbaev, Andrei R

    2014-06-06

    Transient isotachophoresis (tITP) is a versatile sample preconcentration technique that uses ITP to focus electrically charged analytes at the initial stage of CE analysis. However, according to the ruling principle of tITP, uncharged analytes are beyond its capacity when separated and detected by micellar electrokinetic chromatography (MEKC). On the other hand, when it is the charged micelles that undergo tITP focusing, one can anticipate a concentration effect, resulting from the formation of a transient micellar stack at the moving sample/background electrolyte (BGE) boundary, which increasingly accumulates the analytes. This work expands the enrichment potential of tITP for MEKC by demonstrating the quantitative analysis of uncharged metal-based drugs from highly saline samples and by introducing to the BGE solution anionic surfactants and buffer (terminating) co-ions of different mobility and concentration to optimize performance. Metallodrugs of assorted lipophilicity were chosen to explore whether their varying affinity toward micelles plays a role. In addition to altering the sample and BGE composition, the detection capability was optimized by fine-tuning operational variables such as sample volume, separation voltage and pressure. The results of the optimization trials shed light on the mechanism of micellar tITP and enable effective determination of the selected drugs in human urine, with practical limits of detection using a conventional UV detector. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. Analytical and functional similarity of Amgen biosimilar ABP 215 to bevacizumab.

    PubMed

    Seo, Neungseon; Polozova, Alla; Zhang, Mingxuan; Yates, Zachary; Cao, Shawn; Li, Huimin; Kuhns, Scott; Maher, Gwendolyn; McBride, Helen J; Liu, Jennifer

    ABP 215 is a biosimilar product to bevacizumab. Bevacizumab acts by binding to vascular endothelial growth factor A, inhibiting endothelial cell proliferation and new blood vessel formation, thereby leading to tumor vasculature normalization. The ABP 215 analytical similarity assessment was designed to assess the structural and functional similarity of ABP 215 and bevacizumab sourced from both the United States (US) and the European Union (EU). US- and EU-sourced bevacizumab were also compared to each other to assess the similarity of the two products. The physicochemical properties and structural similarity of ABP 215 and bevacizumab were characterized using sensitive state-of-the-art analytical techniques capable of detecting small differences in product attributes. ABP 215 has the same amino acid sequence and exhibits similar post-translational modification profiles compared to bevacizumab. The functional similarity assessment employed orthogonal assays designed to interrogate all expected biological activities, including those known to affect the mechanisms of action of ABP 215 and bevacizumab. More than 20 batches of bevacizumab (US) and bevacizumab (EU), and 13 batches of ABP 215 representing unique drug substance lots, were assessed for similarity. The large dataset allows meaningful comparisons and garners confidence in the overall conclusion of the analytical similarity assessment of ABP 215 to both US- and EU-sourced bevacizumab. The structural and purity attributes and biological properties of ABP 215 are demonstrated to be highly similar to those of bevacizumab.

  19. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to choice on modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics is used as a testbed for evaluating the use of semantic technology.

  20. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to choice on modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  1. High Performance Visualization using Query-Driven Visualizationand Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Campbell, Scott; Dart, Eli

    2006-06-15

    Query-driven visualization and analytics is a unique approach to high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. These new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.
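    The idea can be illustrated with a toy sketch: evaluate the analyst's compound query first, and hand only the qualifying records to the visualization stage. The record layout, field names, and thresholds below are invented for illustration:

    ```python
    # Toy illustration of query-driven analysis: a compound range query is
    # evaluated over network-flow records first, and only the qualifying
    # subset (the "needles") would be passed on to visualization.
    # Field names and thresholds are invented for this sketch.
    flows = [
        {"src": "10.0.0.1", "dst_port": 22,   "bytes": 120},
        {"src": "10.0.0.2", "dst_port": 4444, "bytes": 9_500_000},
        {"src": "10.0.0.3", "dst_port": 80,   "bytes": 3_000},
        {"src": "10.0.0.4", "dst_port": 4444, "bytes": 8_200_000},
    ]

    def query(records, predicate):
        """Return only the records satisfying the analyst's predicate."""
        return [r for r in records if predicate(r)]

    # "Needles in the haystack": high-volume traffic on an unusual port.
    suspects = query(flows, lambda r: r["dst_port"] == 4444 and r["bytes"] > 1_000_000)
    ```

    In a real system the predicate would be pushed down to an index (e.g., a bitmap index over the scientific data store) rather than scanned in Python, which is what makes the approach viable at petascale.
    
    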

  2. Telematics Options and Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Cabell

    This presentation describes the data tracking and analytical capabilities of telematics devices. Federal fleet managers can use the systems to keep their drivers safe, maintain a fuel efficient fleet, ease their reporting burden, and save money. The presentation includes an example of how much these capabilities can save fleets.

  3. Raman Spectroscopic Analysis of Geological and Biogeological Specimens of Relevance to the ExoMars Mission

    PubMed Central

    Edwards, Howell G.M.; Ingley, Richard; Parnell, John; Vítek, Petr; Jehlička, Jan

    2013-01-01

    Abstract A novel miniaturized Raman spectrometer is scheduled to fly as part of the analytical instrumentation package on an ESA remote robotic lander in the ESA/Roscosmos ExoMars mission to search for evidence for extant or extinct life on Mars in 2018. The Raman spectrometer will be part of the first-pass analytical stage of the sampling procedure, following detailed surface examination by the PanCam scanning camera unit on the ExoMars rover vehicle. The requirements of the analytical protocol are stringent and critical; this study represents a laboratory blind interrogation of specimens that form a list of materials that are of relevance to martian exploration and at this stage simulates a test of current laboratory instrumentation to highlight the Raman technique strengths and possible weaknesses that may be encountered in practice on the martian surface and from which future studies could be formulated. In this preliminary exercise, some 10 samples that are considered terrestrial representatives of the mineralogy and possible biogeologically modified structures that may be identified on Mars have been examined with Raman spectroscopy, and conclusions have been drawn about the viability of the unambiguous spectral identification of biomolecular life signatures. It is concluded that the Raman spectroscopic technique does indeed demonstrate the capability to identify biomolecular signatures and the mineralogy in real-world terrestrial samples with a very high degree of success without any preconception being made about their origin and classification. Key Words: Biosignatures—Mars Exploration Rovers—Raman spectroscopy—Search for life (biosignatures)—Planetary instrumentation. Astrobiology 13, 543–549. PMID:23758166

  4. Electrochemical Quartz Crystal Nanobalance (EQCN) Based Biosensor for Sensitive Detection of Antibiotic Residues in Milk.

    PubMed

    Bhand, Sunil; Mishra, Geetesh K

    2017-01-01

    An electrochemical quartz crystal nanobalance (EQCN), which provides real-time analysis of dynamic surface events, is a valuable tool for analyzing biomolecular interactions. EQCN biosensors are based on mass-sensitive measurements that can detect the small mass changes caused by chemical binding to small piezoelectric crystals. Among the various biosensors, the piezoelectric biosensor is considered one of the most sensitive analytical techniques, capable of detecting antigens at picogram levels. EQCN is an effective technique for monitoring antibiotics for regulation below the maximum residue limit (MRL). The analysis of antibiotic residues requires high sensitivity, rapidity, reliability and cost effectiveness. For analytical purposes, the general approach is to take advantage of the piezoelectric effect by immobilizing a biosensing layer on top of the piezoelectric crystal. The sensing layer usually comprises a biological material, such as an antibody, an enzyme or an aptamer, having high specificity and selectivity for the target molecule to be detected. The biosensing layer is usually functionalized using surface chemistry modifications. When these bio-functionalized quartz crystals are exposed to a particular substance of interest (e.g., a substrate, inhibitor, antigen or protein), a binding interaction occurs. This causes a frequency or mass change that can be used to determine the amount of material that has interacted or bound. EQCN biosensors can easily be automated by using a flow injection analysis (FIA) setup coupled through automated pumps and injection valves. Such FIA-EQCN biosensors have great potential for the detection of different analytes, such as antibiotic residues, in various matrices such as water, waste water and milk.
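    The mass sensitivity underlying quartz-crystal sensing of this kind is conventionally described by the Sauerbrey relation. The sketch below is a textbook illustration using standard handbook constants for AT-cut quartz, not the authors' calibration:

    ```python
    # Sauerbrey relation for an AT-cut quartz crystal: frequency shift
    # produced by a rigid mass load. Standard handbook constants for
    # quartz are assumed; this is a textbook sketch, not the authors'
    # experimental calibration.
    from math import sqrt

    RHO_Q = 2648.0    # density of quartz, kg/m^3
    MU_Q = 2.947e10   # shear modulus of AT-cut quartz, Pa

    def sauerbrey_shift(f0_hz, delta_mass_kg, area_m2):
        """Frequency shift (Hz, negative for added mass) on the crystal."""
        return -2.0 * f0_hz ** 2 * delta_mass_kg / (area_m2 * sqrt(RHO_Q * MU_Q))

    # Sensitivity of a 5 MHz crystal: shift for 1 ug spread over 1 cm^2.
    sens = -sauerbrey_shift(5e6, 1e-9, 1e-4)
    ```

    For a typical 5 MHz crystal this gives roughly 56.6 Hz per μg/cm², which sets the scale on which the picogram-level detection mentioned above must operate.
    
    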

  5. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water utilities are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is a promising disinfectant usually applied as a secondary disinfectant, and the selection of the proper analytical technique for monitoring it to ensure disinfection and regulatory compliance has been debated within the industry. This research objectively compared the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) to determine the superior technique. The techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior technique was determined against pre-defined criteria. To discern the effectiveness of this technique, factors that might influence performance, such as sample temperature, high ionic strength and other interferences, were examined. Among the four techniques, chronoamperometry showed a significant level of accuracy and precision, and the influencing factors studied did not diminish its performance, which was adequate in all matrices. This study is a step towards proper disinfection monitoring, and it assists engineers with chlorine dioxide disinfection system planning and management.

  6. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effects of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared with a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performance of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques and HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior on technical, economic and environmental grounds, as compared with the other, invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. On-chip collection of particles and cells by AC electroosmotic pumping and dielectrophoresis using asymmetric microelectrodes.

    PubMed

    Melvin, Elizabeth M; Moore, Brandon R; Gilchrist, Kristin H; Grego, Sonia; Velev, Orlin D

    2011-09-01

    The recent development of microfluidic "lab on a chip" devices requiring sample sizes <100 μL has given rise to the need to concentrate dilute samples and trap analytes, especially for surface-based detection techniques. We demonstrate a particle collection device capable of concentrating micron-sized particles in a predetermined area by combining AC electroosmosis (ACEO) and dielectrophoresis (DEP). The planar asymmetric electrode pattern uses ACEO pumping to induce equal, quadrilateral flow directed towards a stagnant region in the center of the device. A number of system parameters affecting particle collection efficiency were investigated including electrode and gap width, chamber height, applied potential and frequency, and number of repeating electrode pairs and electrode geometry. The robustness of the on-chip collection design was evaluated against varying electrolyte concentrations, particle types, and particle sizes. These devices are amenable to integration with a variety of detection techniques such as optical evanescent waveguide sensing.

  8. Uncertainty management by relaxation of conflicting constraints in production process scheduling

    NASA Technical Reports Server (NTRS)

    Dorn, Juergen; Slany, Wolfgang; Stary, Christian

    1992-01-01

    Mathematical-analytical methods as used in Operations Research approaches are often insufficient for scheduling problems. This is due to three reasons: the combinatorial complexity of the search space, conflicting objectives for production optimization, and the uncertainty in the production process. Knowledge-based techniques, especially approximate reasoning and constraint relaxation, are promising ways to overcome these problems. A case study from an industrial CIM environment, namely high-grade steel production, is presented to demonstrate how knowledge-based scheduling with the desired capabilities could work. By using fuzzy set theory, the applied knowledge representation technique covers the uncertainty inherent in the problem domain. Based on this knowledge representation, a classification of jobs according to their importance is defined which is then used for the straightforward generation of a schedule. A control strategy which comprises organizational, spatial, temporal, and chemical constraints is introduced. The strategy supports the dynamic relaxation of conflicting constraints in order to improve tentative schedules.
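
    As a toy illustration of the fuzzy-set idea above, jobs can be scored for importance by combining fuzzy memberships with a fuzzy AND (minimum) and then ranked for schedule generation. The membership functions, thresholds, and job data below are invented for illustration and are not taken from the steel-production case study.

```python
# Hypothetical sketch: fuzzy classification of jobs by importance.

def mu_urgent(days_to_due):
    """Triangular membership: fully urgent at 0 days, not urgent by 5."""
    return max(0.0, min(1.0, (5 - days_to_due) / 5))

def mu_valuable(order_value, lo=10_000, hi=100_000):
    """Linear ramp membership for order value (assumed thresholds)."""
    return max(0.0, min(1.0, (order_value - lo) / (hi - lo)))

def importance(days_to_due, order_value):
    # Minimum acts as the fuzzy AND of the two criteria; jobs are
    # ranked by this score when generating a tentative schedule.
    return min(mu_urgent(days_to_due), mu_valuable(order_value))

jobs = {"J1": (1, 90_000), "J2": (4, 50_000), "J3": (0, 20_000)}
ranked = sorted(jobs, key=lambda j: importance(*jobs[j]), reverse=True)
print(ranked)  # -> ['J1', 'J2', 'J3']
```

In the knowledge-based scheduler, such scores would feed the straightforward schedule generation, with constraint relaxation handling the conflicts that remain.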

  9. A Global Optimization Methodology for Rocket Propulsion Applications

    NASA Technical Reports Server (NTRS)

    2001-01-01

    While the response surface (RS) method is an effective method in engineering optimization, its accuracy is often affected by the use of a limited number of data points for model construction. In this chapter, the issues related to the accuracy of RS approximations and possible ways of improving the RS model using appropriate treatments, including the iteratively re-weighted least squares (IRLS) technique and radial-basis neural networks, are investigated. A main interest is to identify ways of adding capabilities to the RS method so that it can selectively improve accuracy in regions of importance. An example is to target the high-efficiency region of a fluid machinery design space so that the predictive power of the RS can be maximized where it matters most. Analytical models based on polynomials, with controlled levels of noise, are used to assess the performance of these techniques.
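
    A minimal sketch of the IRLS idea mentioned above, assuming a simple straight-line response surface and an L1-style reweighting (the chapter's actual treatments and test polynomials are not reproduced here): points far from the current fit are progressively down-weighted, so a gross outlier loses influence on the surrogate model.

```python
# Hedged sketch of iteratively re-weighted least squares (IRLS) for a
# robust one-dimensional response surface; data and weights are illustrative.

def wls_line(x, y, w):
    """Closed-form weighted least squares fit of y = a + b*x."""
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, x)) / sw
    my = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, x)))
    return my - b * mx, b

def irls_line(x, y, iters=20, eps=1e-6):
    """IRLS with L1-type reweighting: w_i = 1/|residual_i| each pass."""
    w = [1.0] * len(x)
    for _ in range(iters):
        a, b = wls_line(x, y, w)
        w = [1.0 / max(abs(yi - (a + b * xi)), eps) for xi, yi in zip(x, y)]
    return a, b

# Exact line y = 2x + 1 with one gross outlier at x = 3:
x = [0, 1, 2, 3, 4, 5]
y = [1.0, 3.0, 5.0, 30.0, 9.0, 11.0]
a, b = irls_line(x, y)
print(round(a, 3), round(b, 3))  # close to the true intercept 1 and slope 2
```

An ordinary least squares fit of the same data would be dragged toward the outlier; the reweighting recovers the underlying trend, which is the kind of selective accuracy improvement the chapter investigates.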

  10. A study of elevated temperature testing techniques for the fatigue behavior of PMCs: Application to T650-35/AMB21

    NASA Technical Reports Server (NTRS)

    Gyekenyesi, Andrew L.; Gastelli, Michael G.; Ellis, John R.; Burke, Christopher S.

    1995-01-01

    An experimental study was conducted to investigate the mechanical behavior of a T650-35/AMB21 eight-harness satin weave polymer composite system. Emphasis was placed on the development and refinement of techniques used in elevated temperature uniaxial PMC testing. Issues such as specimen design, gripping, strain measurement, and temperature control and measurement were addressed. Quasi-static tensile and fatigue properties (R(sub sigma) = 0.1) were examined at room and elevated temperatures. Stiffness degradation and strain accumulation during fatigue cycling were recorded to monitor damage progression and provide insight for future analytical modeling efforts. Accomplishments included an untabbed dog-bone specimen design which consistently failed in the gage section, accurate temperature control and assessment, and continuous in-situ strain measurement capability during fatigue loading at elevated temperatures. Finally, strain accumulation and stiffness degradation during fatigue cycling appeared to be good indicators of damage progression.

  11. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of the interplanetary orbiter missions is proposed. The perturbations such as non-spherical gravity of Earth and the third body perturbations due to Sun and Moon are included in the analytical design process. In the design process, first the design is obtained using the iterative patched conic technique without including the perturbations and then modified to include the perturbations. The modification is based on, (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence by including the perturbations, and (ii) quantification of deviations in the orbital elements at periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. The design analysis using the proposed technique provides a realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.

  12. Geographic Information System (GIS) capabilities in traffic accident information management: a qualitative approach

    PubMed Central

    Ahmadi, Maryam; Valinejadi, Ali; Goodarzi, Afshin; Safari, Ameneh; Hemmat, Morteza; Majdabadi, Hesamedin Askari; Mohammadi, Ali

    2017-01-01

    Background Traffic accidents are among the more important national and international issues, and their consequences matter at the political, economic, and social levels of a country. Management of traffic accident information requires information systems with analytical capabilities and access to spatial and descriptive data. Objective The aim of this study was to determine the capabilities of a Geographic Information System (GIS) in the management of traffic accident information. Methods This qualitative cross-sectional study was performed in 2016. In the first step, GIS capabilities were identified from literature retrieved from the Internet according to the inclusion criteria. The literature review continued until data saturation was reached; a form was used to extract the capabilities. In the second step, the study population comprised hospital managers, police, emergency services staff, statisticians, and IT experts in trauma, emergency, and police centers. Sampling was purposive. Data were collected using a questionnaire based on the first-step data; validity and reliability were established by content validity and a Cronbach's alpha of 75%. Data were analyzed using the decision Delphi technique. Results GIS capabilities were identified in ten categories and 64 sub-categories. Importing and processing spatial and descriptive data, and analyzing these data, were the most important capabilities of GIS in traffic accident information management. Conclusion Storing and retrieving descriptive and spatial data; providing statistical analyses in table, chart, and zoning formats; managing ill-structured issues; determining the cost-effectiveness of decisions; and prioritizing their implementation were the most important GIS capabilities that can be efficient in the management of traffic accident information. PMID:28848627

  13. Preliminary system design of a Three Arm Capture Mechanism (TACM) flight demonstration article

    NASA Technical Reports Server (NTRS)

    Schaefer, Otto; Stasi, Bill

    1993-01-01

    The overall objective of the Three Arm Capture Mechanism (TACM) is to serve as a demonstration of capability for capture of objects in space. These objects could be satellites, expended boosters, pieces of debris, etc.; anything of significant size. With this capability we can significantly diminish the danger of major collisions of debris with valuable space assets and with each other, which would otherwise produce many smaller, high velocity pieces of debris which also become concerns. The captured objects would be jettisoned into the atmosphere, relocated in 'parking' orbits, or recovered for disposition or refurbishment. The dollar value of satellites launched into space continues to grow along with the cost of insurance; having a capture capability takes a positive step towards diminishing this added cost. The effort covered is a planning step towards a flight demonstration of the satellite capture capability. Based on the requirement to capture a communication class satellite, its associated booster, or both, a preliminary system definition of a retrieval kit is defined. The objective of the flight demonstration is to demonstrate the techniques proposed to perform the mission and to obtain data on technical issues requiring an in situ space environment. The former especially includes issues such as automated image recognition techniques and control strategies that enable an unmanned vehicle to rendezvous and capture a satellite, contact dynamics between the two bodies, and the flight segment level of automation required to support the mission. A development plan for the operational retrieval capability includes analysis work, computer and ground test simulations, and finally a flight demonstration. A concept to perform a selected mission capturing a precessing communications satellite is described. 
Further development efforts using analytical tools and laboratory facilities are required prior to reaching the point at which a full commitment to the flight demonstration design can be made.

  14. Depth-resolved monitoring of analytes diffusion in ocular tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.

    2007-02-01

    Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantifying analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied, and their permeability coefficients were calculated. Drug diffusion was monitored as a function of time and as a function of depth. The results obtained suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.
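
    One common way such OCT-derived permeability coefficients are estimated is as the tissue thickness divided by the time the analyte front takes to diffuse across it. The sketch below uses that simple ratio with illustrative numbers; it is not the paper's data or its exact analysis procedure.

```python
# Simplified, illustrative permeability estimate from OCT diffusion monitoring:
# P = (tissue thickness) / (time for the analyte to traverse that depth).

def permeability_cm_per_s(thickness_um, diffusion_time_min):
    """Permeability coefficient in cm/s from micrometers and minutes."""
    thickness_cm = thickness_um * 1e-4      # 1 um = 1e-4 cm
    time_s = diffusion_time_min * 60.0
    return thickness_cm / time_s

# Hypothetical example: analyte crosses 500 um of tissue in 10 minutes.
print(f"{permeability_cm_per_s(500, 10):.2e} cm/s")  # -> 8.33e-05 cm/s
```

Depth-resolved OCT measurements refine this by tracking the diffusion front at each depth rather than only across the full thickness.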

  15. Utilization of the Differential Die-Away Self-Interrogation Technique for Characterization and Verification of Spent Nuclear Fuel

    NASA Astrophysics Data System (ADS)

    Trahan, Alexis Chanel

    New nondestructive assay techniques are sought to better characterize spent nuclear fuel. One of the NDA instruments selected for possible deployment is differential die-away self-interrogation (DDSI). The proposed DDSI approach for spent fuel assembly assay utilizes primarily the spontaneous fission and (alpha, n) neutrons in the assemblies as an internal interrogating radiation source. The neutrons released in spontaneous fission or (alpha,n) reactions are thermalized in the surrounding water and induce fission in fissile isotopes, thereby creating a measurable signal from isotopes of interest that would be otherwise difficult to measure. The DDSI instrument employs neutron coincidence counting with 3He tubes and list-mode-based data acquisition to allow for production of Rossi-alpha distributions (RADs) in post-processing. The list-mode approach to data collection and subsequent construction of RADs has expanded the analytical possibilities, as will be demonstrated throughout this thesis. One of the primary advantages is that the measured signal in the form of a RAD can be analyzed in its entirety including determination of die-away times in different time domains. This capability led to the development of the early die-away method, a novel leakage multiplication determination method which is tested throughout the thesis on different sources in simulation space and fresh fuel experiments. The early die-away method is a robust, accurate, improved method of determining multiplication without the need for knowledge of the (alpha,n) source term. The DDSI technique and instrument are presented along with the many novel capabilities enabled by and discovered through RAD analysis. Among the new capabilities presented are the early die-away method, total plutonium content determination, and highly sensitive missing pin detection. 
Simulations of hundreds of different spent and fresh fuel assemblies were used to develop the analysis algorithms, and the techniques were tested on a variety of spontaneous-fission-driven fresh fuel assemblies at Los Alamos National Laboratory and on the BeRP ball at the Nevada National Security Site. The development of these new, improved analysis and characterization methods makes the DDSI instrument a viable technique for implementation in a facility to meet material control and safeguards needs.
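
    A minimal sketch of how a Rossi-alpha distribution can be built from list-mode time stamps, as the DDSI acquisition enables: for each detected neutron, the time differences to subsequent detections within a window are histogrammed, so correlated fission-chain events pile up at short time differences and the tail decay gives the die-away time. The stream, window, and bin width below are toy values, not the DDSI analysis code.

```python
# Illustrative Rossi-alpha distribution (RAD) from list-mode time stamps.

def rossi_alpha(timestamps_us, window_us=512.0, bin_us=16.0):
    """Histogram of pairwise time differences within a coincidence window."""
    nbins = int(window_us / bin_us)
    hist = [0] * nbins
    ts = sorted(timestamps_us)
    for i, t0 in enumerate(ts):
        for t1 in ts[i + 1:]:
            dt = t1 - t0
            if dt >= window_us:
                break  # time stamps are sorted; later pairs only grow
            hist[int(dt / bin_us)] += 1
    return hist

# Toy list-mode stream (microseconds): two tight bursts standing in for
# correlated fission chains, plus uncorrelated lone counts.
stream = [0.0, 5.0, 9.0, 1000.0, 1004.0, 1012.0, 5000.0, 9000.0]
rad = rossi_alpha(stream)
print(rad[0])  # -> 6 (all intra-burst pairs land in the shortest-time bin)
```

Fitting the decay of such a histogram in different time domains is what the early die-away method described in the record exploits.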

  16. Experimental and analytical studies of high heat flux components for fusion experimental reactor

    NASA Astrophysics Data System (ADS)

    Araki, Masanori

    1993-03-01

    In this report, experimental and analytical results concerning the development of plasma-facing components for ITER are described. With respect to developing high-heat-removal structures for the divertor plates, an externally-finned swirl tube was developed based on the results of critical heat flux (CHF) experiments on various tube structures. As a result, a burnout heat flux (which also indicates the incident CHF) of 41 +/- 1 MW/sq m was achieved in the externally-finned swirl tube. The applicability of existing CHF correlations, which are based on uniform heating conditions, was evaluated by comparing them with CHF experimental data from the smooth and externally-finned tubes under one-sided heating. The experimentally determined CHF data for the straight tube showed good agreement, whereas for the externally-finned tube no existing correlation is available to predict the CHF. With respect to the evaluation of the bonds between carbon-based material and the heat-sink metal, brazing test results were compared with analytical results from a three-dimensional model with temperature-dependent thermal and mechanical properties. The analysis showed that residual stresses from brazing can be estimated from the three directional stress components rather than from the equivalent stress value. In the analytical study of separatrix sweeping for effectively reducing surface heat fluxes on the divertor plate, the thermal response of the divertor plate was analyzed and tested under ITER-relevant heat flux conditions. As a result, it was demonstrated that the sweeping technique is very effective in improving the power-handling capability of the divertor plate and that the divertor mock-up withstood a large number of additional cyclic heat loads.

  17. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    PubMed

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e., noise) into patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and a recently described moving sum of outliers (movSO) of patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and average of normals (AoN) approaches. The power to detect an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it generally had the worst capability for detecting an increased CVa. On the other hand, the movSD and movSO approaches were able to detect an increased CVa at significantly lower ANPed, particularly for measurands with a relatively small ratio of biological variation to CVa. CONCLUSION: The movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risks of an increase in analytical imprecision are attenuated for these measurands, because the increased imprecision adds only marginally to the total variation and is less likely to impact clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
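
    The movSD monitor can be sketched as a rolling standard deviation over consecutive patient results; a sustained rise above an empirically set control limit then flags increased analytical imprecision. The window size and the synthetic data below are assumptions for illustration, not the paper's settings.

```python
# Illustrative moving standard deviation (movSD) over patient results.

from collections import deque
from math import sqrt

def moving_sd(results, window=10):
    """Sample SD over the most recent `window` patient results."""
    buf, out = deque(maxlen=window), []
    for r in results:
        buf.append(r)
        if len(buf) == window:
            m = sum(buf) / window
            out.append(sqrt(sum((v - m) ** 2 for v in buf) / (window - 1)))
    return out

# Synthetic measurand: a stable period followed by a period in which the
# analytical scatter roughly triples (simulating an increased CVa).
stable = [100 + d for d in (0.5, -1.2, 0.8, -0.3, 1.1, -0.7, 0.2, 0.9, -1.0, 0.4)]
noisy  = [100 + d for d in (3.1, -2.8, 2.5, -3.4, 2.9, -2.2, 3.6, -2.6, 2.1, -3.0)]
sds = moving_sd(stable + noisy, window=10)
print(sds[0] < sds[-1])  # -> True: the monitor rises as imprecision increases
```

In practice the trigger limit would be tuned per measurand against its biological variation, which is exactly where the paper finds the approach strongest or weakest.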

  18. Purdue Rare Isotope Measurement Laboratory

    NASA Astrophysics Data System (ADS)

    Caffee, M.; Elmore, D.; Granger, D.; Muzikar, P.

    2002-12-01

    The Purdue Rare Isotope Measurement Laboratory (PRIME Lab) is a dedicated research and service facility for accelerator mass spectrometry. AMS is an ultra-sensitive analytical technique used to measure low levels of long-lived cosmic-ray-produced and anthropogenic radionuclides, and rare trace elements. We measure 10Be (T1/2 = 1.5 My), 26Al (0.702 My), 36Cl (0.301 My), and 129I (16 My) in geologic samples. Applications include dating the cosmic-ray-exposure time of rocks on Earth's surface, determining rock and sediment burial ages, measuring the erosion rates of rocks and soils, and tracing and dating ground water. We perform sample preparation and separation chemistries for these radionuclides for our internal research activities and for external researchers not possessing this capability. Our chemical preparation laboratories also serve as training sites for members of the geoscience community developing these techniques at their institutions. Research at Purdue involves collaborators among members of the Purdue Departments of Physics, Earth and Atmospheric Sciences, Chemistry, Agronomy, and Anthropology. We also collaborate with and serve numerous scientists from other institutions. We are currently modernizing the facility with the goals of higher precision for routinely measured radionuclides, increased sample throughput, and the development of new measurement capabilities for the geoscience community.
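
    As a simplified illustration of the burial-dating application named above, a 26Al/10Be burial age can be sketched from the half-lives quoted in this record. The assumed initial surface-production ratio of 6.75, and the neglect of post-burial production and erosion, are simplifying assumptions; real analyses model those terms.

```python
from math import log

# Simplified 26Al/10Be burial-age sketch using the half-lives quoted in
# this record (10Be: 1.5 My, 26Al: 0.702 My). Once a sample is buried and
# shielded from cosmic rays, the 26Al/10Be ratio decays at the difference
# of the two decay constants.

LAM_10BE = log(2) / 1.5e6    # decay constants in 1/yr
LAM_26AL = log(2) / 0.702e6
R0 = 6.75                    # assumed surface-production 26Al/10Be ratio

def burial_age(ratio):
    """Years of burial implied by a measured 26Al/10Be ratio."""
    return log(R0 / ratio) / (LAM_26AL - LAM_10BE)

# A measured ratio at half the surface value implies one "effective"
# half-life of differential decay:
print(round(burial_age(R0 / 2) / 1e6, 2), "My")  # -> 1.32 My
```

AMS makes this possible because both radionuclide concentrations sit far below the reach of conventional decay counting.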

  19. Status of Fuel Development and Manufacturing for Space Nuclear Reactors at BWX Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carmack, W.J.; Husser, D.L.; Mohr, T.C.

    2004-02-04

    New advanced nuclear space propulsion systems will soon seek a high temperature, stable fuel form. BWX Technologies Inc (BWXT) has a long history of fuel manufacturing. UO2, UCO, and UCx have been fabricated at BWXT for various US and international programs. Recent efforts at BWXT have focused on establishing the manufacturing techniques and analysis capabilities needed to provide a high quality, high power, compact nuclear reactor for use in space nuclear powered missions. To support the production of a space nuclear reactor, uranium nitride has recently been manufactured by BWXT. In addition, analytical chemistry and analysis techniques have been developed to provide verification and qualification of the uranium nitride production process. The fabrication of a space nuclear reactor will require the ability to place an unclad fuel form into a clad structure for assembly into a reactor core configuration. To this end, BWX Technologies has reestablished its capability for machining, GTA welding, and EB welding of refractory metals. Specifically, BWX Technologies has demonstrated GTA welding of niobium flat plate and EB welding of niobium and Nb-1Zr tubing. In performing these demonstration activities, BWX Technologies has established the necessary infrastructure to manufacture UO2, UCx, or UNx fuel, components, and complete reactor assemblies in support of space nuclear programs.

  20. Machine Learning Technologies Translate Vigilant Surveillance Satellite Big Data into Predictive Alerts for Environmental Stressors

    NASA Astrophysics Data System (ADS)

    Johnson, S. P.; Rohrer, M. E.

    2017-12-01

    The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, advances in nanosatellites and analytical platforms are capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. 
This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention, intervention, and prediction of emerging environmental threats.

  1. Earth resources mission performance studies. Volume 2: Simulation results

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Simulations were made at three month intervals to investigate the EOS mission performance over the four seasons of the year. The basic objectives of the study were: (1) to evaluate the ability of an EOS type system to meet a representative set of specific collection requirements, and (2) to understand the capabilities and limitations of the EOS that influence the system's ability to satisfy certain collection objectives. Although the results were obtained from a consideration of a two sensor EOS system, the analysis can be applied to any remote sensing system having similar optical and operational characteristics. While the category related results are applicable only to the specified requirement configuration, the results relating to general capability and limitations of the sensors can be applied in extrapolating to other U.S. based EOS collection requirements. The TRW general purpose mission simulator and analytic techniques discussed in this report can be applied to a wide range of collection and planning problems of earth orbiting imaging systems.

  2. MIiSR: Molecular Interactions in Super-Resolution Imaging Enables the Analysis of Protein Interactions, Dynamics and Formation of Multi-protein Structures.

    PubMed

    Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan

    2015-12-01

    Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.
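
    Molecular-clustering analyses of the kind MIiSR performs are often built on density-based clustering of localization coordinates. The sketch below is a generic minimal DBSCAN over 2-D points with invented parameters and data; it is not the MIiSR implementation.

```python
# Illustrative density-based clustering of super-resolution localizations.

def dbscan(points, eps=1.0, min_pts=3):
    """Minimal DBSCAN; returns one cluster label per point (-1 = noise)."""
    def neighbors(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if (xi - xj) ** 2 + (yi - yj) ** 2 <= eps ** 2]

    labels = [None] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nb = neighbors(i)
        if len(nb) < min_pts:          # not dense enough: mark as noise
            labels[i] = -1
            continue
        labels[i] = cid                # start a new cluster and grow it
        seeds = list(nb)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:        # noise reachable from a core point
                labels[j] = cid
            if labels[j] is not None:
                continue
            labels[j] = cid
            nb_j = neighbors(j)
            if len(nb_j) >= min_pts:   # j is a core point: expand further
                seeds.extend(nb_j)
        cid += 1
    return labels

# Two tight clusters of localizations plus one stray point:
pts = [(0, 0), (0.3, 0.1), (0.1, 0.4), (5, 5), (5.2, 4.9), (4.8, 5.1), (10, 10)]
print(dbscan(pts))  # -> [0, 0, 0, 1, 1, 1, -1]
```

Cluster labels like these are the raw material for the downstream statistics (cluster sizes, co-clustering of species) that tools of this kind report.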

  3. A complexity science-based framework for global joint operations analysis to support force projection: LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.

    2015-01-01

    The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.

  4. Leveraging the big-data revolution: CMS is expanding capabilities to spur health system transformation.

    PubMed

    Brennan, Niall; Oelschlaeger, Allison; Cox, Christine; Tavenner, Marilyn

    2014-07-01

    As the largest single payer for health care in the United States, the Centers for Medicare and Medicaid Services (CMS) generates enormous amounts of data. Historically, CMS has faced technological challenges in storing, analyzing, and disseminating this information because of its volume and privacy concerns. However, rapid progress in the fields of data architecture, storage, and analysis--the big-data revolution--over the past several years has given CMS the capabilities to use data in new and innovative ways. We describe the different types of CMS data being used both internally and externally, and we highlight a selection of innovative ways in which big-data techniques are being used to generate actionable information from CMS data more effectively. These include the use of real-time analytics for program monitoring and detecting fraud and abuse and the increased provision of data to providers, researchers, beneficiaries, and other stakeholders. Project HOPE—The People-to-People Health Foundation, Inc.

  5. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process-performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap-analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. 
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
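
    A toy version of the kind of Monte Carlo model described can be sketched by drawing satisfaction-driver samples and summarizing the simulated index scores. The driver weights, distributions, and seed below are invented for illustration; the dissertation's calibration to ACSI benchmarks is not reproduced here.

```python
import random

# Hedged Monte Carlo sketch: simulate a customer satisfaction index from
# assumed driver distributions and report baseline summary statistics.

def simulate_acsi(n=10_000, seed=42):
    rng = random.Random(seed)          # fixed seed for a reproducible baseline
    scores = []
    for _ in range(n):
        quality = rng.gauss(80, 5)     # perceived quality, 0-100 scale
        value   = rng.gauss(75, 7)     # perceived value
        expect  = rng.gauss(78, 4)     # customer expectations
        score = 0.5 * quality + 0.3 * value + 0.2 * expect  # assumed weights
        scores.append(min(max(score, 0.0), 100.0))
    scores.sort()
    # Median as the baseline prediction, 5th percentile as a downside bound.
    return scores[n // 2], scores[int(0.05 * n)]

median, p05 = simulate_acsi()
print(round(median, 1), round(p05, 1))
```

Sensitivity analysis then follows naturally: rerun the simulation while perturbing one driver's distribution or weight and observe how the predicted index shifts.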

  6. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  7. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. PMID:28109440

  8. On-Chip Microfluidic Components for In Situ Analysis, Separation, and Detection of Amino Acids

    NASA Technical Reports Server (NTRS)

    Zheng, Yun; Getty, Stephanie; Dworkin, Jason; Balvin, Manuel; Kotecki, Carl

    2013-01-01

    The Astrobiology Analytical Laboratory at GSFC has identified amino acids in meteorites and returned cometary samples by using liquid chromatography-electrospray ionization time-of-flight mass spectrometry (LC-MS). These organic species are key markers for life, having the property of chirality that can be used to distinguish biological from non-biological amino acids. One of the critical components in the benchtop instrument is the liquid chromatography (LC) analytical column. The commercial LC analytical column is an over-250-mm-long, 4.6-mm-diameter stainless steel tube filled with functionalized microbeads as the stationary phase to separate molecular species based on their chemistry. Miniaturization of this technique for spaceflight is compelling for future payloads for landed missions targeting astrobiology objectives. A commercial liquid chromatography analytical column consists of an inert cylindrical tube filled with a stationary phase, i.e., microbeads, that has been functionalized with a targeted chemistry. When analyte is sent through the column by a pressurized carrier fluid (typically a methanol/water mixture), compounds are separated in time due to differences in chemical interactions with the stationary phase. Species that interact more strongly with the column chemistry take longer to traverse the column. In this way, the column separates molecular species based on their chemistry. A lab-on-chip liquid analysis tool was developed. The microfluidic analytical column is capable of chromatographically separating biologically relevant classes of molecules based on their chemistry. For this analytical column, fabrication, low leak rate, and stationary phase incorporation were demonstrated for a serpentine microchannel that mimics the dimensions of a commercial LC column within a 5 x 10 x 1 mm chip. The microchannel in the chip has a 75-micrometer-diameter oval-shaped cross section.
The serpentine microchannel comes in four different lengths: 40, 60, 80, and 100 mm. Functionalized microbeads were packed inside the microchannel to separate molecular species based on their chemistry.
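
The retention behavior described above can be illustrated with the textbook relation t_R = t0 * (1 + k'), where k' is the retention factor. The analyte names, k' values, and dead time below are hypothetical, chosen only to show how differing stationary-phase interactions translate into separation in time.

```python
# Toy retention model: t_R = t0 * (1 + k'), where k' (retention factor) measures
# how strongly an analyte partitions into the functionalized stationary phase.
# Stronger interaction -> larger k' -> later elution.
DEAD_TIME_S = 60.0  # t0: transit time of an unretained compound (assumed)

analytes = {"glycine": 0.8, "alanine": 1.5, "valine": 3.2}  # illustrative k' values

def retention_time(k_prime, t0=DEAD_TIME_S):
    return t0 * (1.0 + k_prime)

elution_order = sorted(analytes, key=lambda a: retention_time(analytes[a]))
for name in elution_order:
    print(f"{name}: t_R = {retention_time(analytes[name]):.0f} s")
```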

  9. The use of biochemical methods in extraterrestrial life detection

    NASA Astrophysics Data System (ADS)

    McDonald, Gene

    2006-08-01

    Instrument development for in situ extraterrestrial life detection focuses primarily on the ability to distinguish between biological and non-biological material, mostly through chemical analysis for potential biosignatures (e.g., biogenic minerals, enantiomeric excesses). In contrast, biochemical analysis techniques commonly applied to Earth life focus primarily on the exploration of cellular and molecular processes, not on the classification of a given system as biological or non-biological. This focus has developed because of the relatively large functional gap between life and non-life on Earth today. Life on Earth is very diverse from an environmental and physiological point of view, but is highly conserved from a molecular point of view. Biochemical analysis techniques take advantage of this similarity of all terrestrial life at the molecular level, particularly through the use of biologically-derived reagents (e.g., DNA polymerases, antibodies), to enable analytical methods with enormous sensitivity and selectivity. These capabilities encourage consideration of such reagents and methods for use in extraterrestrial life detection instruments. The utility of this approach depends in large part on the (unknown at this time) degree of molecular compositional differences between extraterrestrial and terrestrial life. The greater these differences, the less useful laboratory biochemical techniques will be without significant modification. Biochemistry and molecular biology methods may need to be "de-focused" in order to produce instruments capable of unambiguously detecting a sufficiently wide range of extraterrestrial biochemical systems. Modern biotechnology tools may make that possible in some cases.

  10. Infrared Spectroscopy as a Chemical Fingerprinting Tool

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    2003-01-01

    Infrared (IR) spectroscopy is a powerful analytical tool in the chemical fingerprinting of materials. Any sample material that will interact with infrared light produces a spectrum and, although normally associated with organic materials, inorganic compounds may also be infrared active. The technique is rapid, reproducible, and usually non-invasive to the sample. That it is non-invasive allows for additional characterization of the original material using other analytical techniques, including thermal analysis and Raman spectroscopy. With the appropriate accessories, the technique can be used to examine samples in liquid, solid, or gas phase. Both aqueous and non-aqueous free-flowing solutions can be analyzed, as can viscous liquids such as heavy oils and greases. Solid samples of varying sizes and shapes may also be examined, and with the addition of microscopic IR (microspectroscopy) capabilities, minute materials such as single fibers and threads may be analyzed. With the addition of appropriate software, microspectroscopy can be used for automated discrete point or compositional surface area mapping, with the latter providing a means to record changes in the chemical composition of a material surface over a defined area. Due to the ability to characterize gaseous samples, IR spectroscopy can also be coupled with thermal processes such as thermogravimetric (TG) analyses to provide both thermal and chemical data in a single run. In this configuration, solids (or liquids) heated in a TG analyzer undergo decomposition, with the evolving gases directed into the IR spectrometer. Thus, information is provided on the thermal properties of a material and the order in which its chemical constituents are broken down during incremental heating. Specific examples of these varied applications are cited, with data interpretation and method limitations further discussed.

  11. Quantification of rapid environmental redox processes with quick-scanning x-ray absorption spectroscopy (Q-XAS)

    PubMed Central

    Ginder-Vogel, Matthew; Landrot, Gautier; Fischel, Jason S.; Sparks, Donald L.

    2009-01-01

    Quantification of the initial rates of environmental reactions at the mineral/water interface is a fundamental prerequisite to determining reaction mechanisms, modeling contaminant transport, and predicting environmental risk. Until recently, experimental techniques with adequate time resolution and elemental sensitivity to measure initial rates of the wide variety of environmental reactions were quite limited. Techniques such as electron paramagnetic resonance and Fourier transform infrared spectroscopies suffer from limited elemental specificity and poor sensitivity to inorganic elements, respectively. Ex situ analysis of batch and stirred-flow systems provides high elemental sensitivity; however, their time resolution is inadequate to characterize rapid environmental reactions. Here we apply quick-scanning x-ray absorption spectroscopy (Q-XAS), at sub-second time scales, to measure the initial oxidation rate of As(III) to As(V) by hydrous manganese(IV) oxide. Using Q-XAS, As(III) and As(V) concentrations were determined every 0.98 s in batch reactions. The initial apparent As(III) depletion rate constants (t < 30 s) measured with Q-XAS are nearly twice as large as rate constants measured with traditional analytical techniques. Our results demonstrate the importance of developing analytical techniques capable of analyzing environmental reactions on the same time scale as they occur. Given the high sensitivity, elemental specificity, and time resolution of Q-XAS, it has many potential applications. These could include measuring not only redox reactions but also dissolution/precipitation reactions, such as the formation and/or reductive dissolution of Fe(III) (hydr)oxides, solid-phase transformations (i.e., formation of layered-double hydroxide minerals), or almost any other reaction occurring in aqueous media that can be measured using x-ray absorption spectroscopy. PMID:19805269
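
Extracting an initial rate constant from concentrations sampled every 0.98 s reduces, for pseudo-first-order kinetics, to fitting the slope of ln C versus t over the early time window. The sketch below uses synthetic data with an assumed rate constant, not the paper's measurements, to show the fit.

```python
import math

# Synthetic As(III) depletion sampled every 0.98 s, as in the Q-XAS experiment.
# Assumed pseudo-first-order kinetics: C(t) = C0 * exp(-k t) (illustrative k).
K_TRUE = 0.05          # s^-1, hypothetical
C0 = 100.0             # arbitrary concentration units
DT = 0.98              # s, Q-XAS sampling interval

times = [i * DT for i in range(31)]          # first ~30 s of reaction
conc = [C0 * math.exp(-K_TRUE * t) for t in times]

# Least-squares slope of ln(C) vs t gives -k (pure-Python normal equations).
ys = [math.log(c) for c in conc]
n = len(times)
tbar = sum(times) / n
ybar = sum(ys) / n
slope = sum((t - tbar) * (y - ybar) for t, y in zip(times, ys)) / \
        sum((t - tbar) ** 2 for t in times)
k_fit = -slope
print(f"fitted k = {k_fit:.4f} s^-1")
```

With real Q-XAS traces the same regression would be restricted to t < 30 s, which is exactly where coarser sampling misses the fast initial depletion the abstract highlights.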

  12. Synchrotron IR microspectroscopy for protein structure analysis: Potential and questions

    DOE PAGES

    Yu, Peiqiang

    2006-01-01

    Synchrotron radiation-based Fourier transform infrared microspectroscopy (S-FTIR) has been developed as a rapid, direct, non-destructive, bioanalytical technique. This technique takes advantage of synchrotron light brightness and small effective source size and is capable of exploring the molecular chemical make-up within microstructures of a biological tissue, without destruction of inherent structures, at ultra-spatial resolutions within cellular dimensions. To date there has been very little application of this advanced technique to the study of pure protein inherent structure at a cellular level in biological tissues. In this review, a novel approach is introduced to show the potential of the newly developed, advanced synchrotron-based analytical technology, which can be used to localize relatively "pure" protein in plant tissues and to reveal protein inherent structure and protein molecular chemical make-up within intact tissue at cellular and subcellular levels. Several complex protein IR spectral data analytical techniques (Gaussian and Lorentzian multi-component peak modeling, univariate and multivariate analysis, principal component analysis (PCA), and hierarchical cluster analysis (CLA)) are employed to reveal features of protein inherent structure and to distinguish protein inherent structure differences between varieties/species and treatments in plant tissues. By using a multi-peak modeling procedure, relative estimates (but not exact determinations) of protein secondary structure can be made for comparison purposes. The issues for and against multi-peak modeling/fitting procedures for relative estimation of protein structure are discussed. By using the PCA and CLA analyses, plant molecular structures can be qualitatively separated into groups, statistically, even though the spectral assignments are not known.
The synchrotron-based technology provides a new approach for protein structure research in biological tissues at ultra-spatial resolutions.
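
The "relative estimates" step of multi-peak modeling can be illustrated with a small calculation: once Gaussian components have been fitted to an amide I band, each component's area (amplitude times sigma times sqrt(2*pi)) divided by the total area gives a relative secondary-structure fraction. The band assignments, amplitudes, and widths below are assumed values for illustration, not fitted spectra.

```python
import math

# Hypothetical fitted Gaussian components of an amide I band (centers in cm^-1).
# Area of a Gaussian = amplitude * sigma * sqrt(2*pi); the area ratios give the
# RELATIVE (not exact) secondary-structure estimates the review describes.
peaks = {
    "alpha-helix (~1655 cm^-1)": (0.80, 12.0),   # (amplitude, sigma) - assumed
    "beta-sheet (~1630 cm^-1)":  (0.55, 10.0),
    "random coil (~1645 cm^-1)": (0.30, 14.0),
}

areas = {name: amp * sig * math.sqrt(2 * math.pi) for name, (amp, sig) in peaks.items()}
total = sum(areas.values())
fractions = {name: a / total for name, a in areas.items()}
for name, frac in fractions.items():
    print(f"{name}: {100 * frac:.1f}%")
```

Because only ratios are reported, the result is suitable for comparing varieties or treatments, which is precisely the "comparison purposes" caveat in the abstract.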

  13. Visual Analytics in Public Safety: Example Capabilities for Example Government Agencies

    DTIC Science & Technology

    2011-10-01

    is not limited to: the Police Records Information Management Environment for British Columbia (PRIME-BC), the Police Reporting and Occurrence System...and filtering for rapid identification of relevant documents - Graphical environment for visual evidence marshaling - Interactive linking and...analytical reasoning facilitated by interactive visual interfaces and integration with computational analytics. Indeed, a wide variety of technologies

  14. A note on improved F-expansion method combined with Riccati equation applied to nonlinear evolution equations.

    PubMed

    Islam, Md Shafiqul; Khan, Kamruzzaman; Akbar, M Ali; Mastroberardino, Antonio

    2014-10-01

    The purpose of this article is to present an analytical method, namely the improved F-expansion method combined with the Riccati equation, for finding exact solutions of nonlinear evolution equations. The present method is capable of calculating all branches of solutions simultaneously, even if multiple solutions are very close and thus difficult to distinguish with numerical techniques. To verify the computational efficiency, we consider the modified Benjamin-Bona-Mahony equation and the modified Korteweg-de Vries equation. Our results reveal that the method is a very effective and straightforward way of formulating the exact travelling wave solutions of nonlinear wave equations arising in mathematical physics and engineering.
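
An exact travelling-wave solution of the kind such methods produce can be checked directly. The block below verifies, by finite differences rather than by the F-expansion method itself, that the standard kink u(x,t) = k tanh(kx + 2k^3 t) satisfies the defocusing modified KdV equation u_t - 6u^2 u_x + u_xxx = 0; this is a well-known solution used here for illustration, not one taken from the article.

```python
import math

# Known kink solution of the defocusing mKdV equation
#   u_t - 6 u^2 u_x + u_xxx = 0,   namely u(x,t) = k * tanh(k*x + 2*k**3*t).
# The PDE residual is evaluated numerically with central finite differences.
K = 0.7

def u(x, t):
    return K * math.tanh(K * x + 2 * K**3 * t)

def residual(x, t, h=1e-2):
    ut = (u(x, t + h) - u(x, t - h)) / (2 * h)
    ux = (u(x + h, t) - u(x - h, t)) / (2 * h)
    uxxx = (u(x + 2*h, t) - 2*u(x + h, t) + 2*u(x - h, t) - u(x - 2*h, t)) / (2 * h**3)
    return ut - 6 * u(x, t)**2 * ux + uxxx

max_res = max(abs(residual(x, 0.3)) for x in [-1.0, -0.5, 0.0, 0.5, 1.0])
print(f"max |residual| = {max_res:.2e}")
```

The residual is limited only by the finite-difference truncation error, confirming the solution satisfies the equation identically.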

  15. Performance and capacity analysis of Poisson photon-counting based Iter-PIC OCDMA systems.

    PubMed

    Li, Lingbin; Zhou, Xiaolin; Zhang, Rong; Zhang, Dingchen; Hanzo, Lajos

    2013-11-04

    In this paper, an iterative parallel interference cancellation (Iter-PIC) technique is developed for optical code-division multiple-access (OCDMA) systems relying on shot-noise-limited Poisson photon-counting reception. The novel semi-analytical tool of extrinsic information transfer (EXIT) charts is used for analysing both the bit error rate (BER) performance and the channel capacity of these systems, and the results are verified by Monte Carlo simulations. The proposed Iter-PIC OCDMA system is capable of achieving two orders of magnitude of BER improvement and a 0.1-nat capacity improvement over conventional chip-level OCDMA systems at a coding rate of 1/10.
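
The Poisson photon-counting reception underlying this analysis can be sketched for the simplest case: on-off signalling with threshold detection, where the BER follows from the Poisson distribution of photon counts. This is a single-user illustration of the channel model, not the paper's Iter-PIC receiver; the count means and thresholds are assumed values.

```python
import math

# BER of threshold detection with Poisson photon counting (on-off sketch).
# lam0: mean count when "0" is sent (background); lam1: mean count for "1".
def poisson_cdf(k, lam):
    """P(N <= k) for N ~ Poisson(lam), computed from the pmf directly."""
    return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

def ber(lam0, lam1, threshold):
    p_err_given_1 = poisson_cdf(threshold, lam1)          # "1" read as "0"
    p_err_given_0 = 1.0 - poisson_cdf(threshold, lam0)    # "0" read as "1"
    return 0.5 * (p_err_given_0 + p_err_given_1)

weak, strong = ber(1.0, 10.0, 4), ber(1.0, 30.0, 8)
print(f"BER(lam1=10) = {weak:.3e}, BER(lam1=30) = {strong:.3e}")
```

In the multi-user OCDMA setting the "1" count also includes interference from other users' codes, which is what the iterative cancellation stages progressively remove.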

  16. RAPD/SCAR Approaches for Identification of Adulterant Breeds' Milk in Dairy Products.

    PubMed

    Cunha, Joana T; Domingues, Lucília

    2017-01-01

    Food safety and quality are nowadays a major consumer concern. In the dairy industry, the fraudulent addition of cheaper/lower-quality milks from nonlegitimate species/breeds compromises the quality and value of the final product. Despite the already existing approaches for identifying the species origin of milk, there is little information regarding differentiation at an intra-species level. In this protocol we describe a low-cost, sensitive, fast, and reliable analytical technique, Random Amplified Polymorphic DNA/Sequence Characterized Amplified Region (RAPD/SCAR), capable of efficiently detecting adulterant breeds in milk mixtures used for the fraudulent manufacturing of dairy products and suitable for detecting milk adulteration in processed dairy foods.

  17. A note on improved F-expansion method combined with Riccati equation applied to nonlinear evolution equations

    PubMed Central

    Islam, Md. Shafiqul; Khan, Kamruzzaman; Akbar, M. Ali; Mastroberardino, Antonio

    2014-01-01

    The purpose of this article is to present an analytical method, namely the improved F-expansion method combined with the Riccati equation, for finding exact solutions of nonlinear evolution equations. The present method is capable of calculating all branches of solutions simultaneously, even if multiple solutions are very close and thus difficult to distinguish with numerical techniques. To verify the computational efficiency, we consider the modified Benjamin–Bona–Mahony equation and the modified Korteweg-de Vries equation. Our results reveal that the method is a very effective and straightforward way of formulating the exact travelling wave solutions of nonlinear wave equations arising in mathematical physics and engineering. PMID:26064530

  18. Perspective: Materials informatics and big data: Realization of the "fourth paradigm" of science in materials science

    NASA Astrophysics Data System (ADS)

    Agrawal, Ankit; Choudhary, Alok

    2016-05-01

    Our ability to collect "big data" has greatly surpassed our capability to analyze it, underscoring the emergence of the fourth paradigm of science, which is data-driven discovery. The need for data informatics is also emphasized by the Materials Genome Initiative (MGI), further boosting the emerging field of materials informatics. In this article, we look at how data-driven techniques are playing a big role in deciphering processing-structure-property-performance relationships in materials, with illustrative examples of both forward models (property prediction) and inverse models (materials discovery). Such analytics can significantly reduce time-to-insight and accelerate cost-effective materials discovery, which is the goal of MGI.

  19. Synergia: an accelerator modeling tool with 3-D space charge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amundson, James F.; Spentzouris, P.; /Fermilab

    2004-07-01

    High precision modeling of space-charge effects, together with accurate treatment of single-particle dynamics, is essential for designing future accelerators as well as optimizing the performance of existing machines. We describe Synergia, a high-fidelity parallel beam dynamics simulation package with fully three-dimensional space-charge capabilities and a higher-order optics implementation. We describe the computational techniques, the advanced human interface, and the parallel performance obtained using large numbers of macroparticles. We also perform code benchmarks comparing to semi-analytic results and other codes. Finally, we present initial results on particle tune spread, beam halo creation, and emittance growth in the Fermilab booster accelerator.

  20. No Vent Tank Fill and Transfer Line Chilldown Analysis by Generalized Fluid System Simulation Program (GFSSP)

    NASA Technical Reports Server (NTRS)

    Majumdar, Alok

    2013-01-01

    The purpose of the paper is to present the analytical capability developed to model no-vent chill and fill of a cryogenic tank in support of the CPST (Cryogenic Propellant Storage and Transfer) program. The Generalized Fluid System Simulation Program (GFSSP) was adapted to simulate the charge-hold-vent method of tank chilldown. GFSSP models were developed to simulate chilldown of the LH2 tank in the K-site Test Facility, and numerical predictions were compared with test data. The paper also describes the modeling technique for simulating the chilldown of a cryogenic transfer line; GFSSP models of a long transfer line chilldown were developed and their predictions compared with test data.

  1. Nanopore with Transverse Nanoelectrodes for Electrical Characterization and Sequencing of DNA

    PubMed Central

    Gierhart, Brian C.; Howitt, David G.; Chen, Shiahn J.; Zhu, Zhineng; Kotecki, David E.; Smith, Rosemary L.; Collins, Scott D.

    2009-01-01

    A DNA sequencing device which integrates transverse conducting electrodes for the measurement of electrode currents during DNA translocation through a nanopore has been nanofabricated and characterized. A focused electron beam (FEB) milling technique, capable of creating features on the order of 1 nm in diameter, was used to create the nanopore. The device was characterized electrically using gold nanoparticles as an artificial analyte with both DC and AC measurement methods. Single nanoparticle/electrode interaction events were recorded. A low-noise, high-speed transimpedance current amplifier for the detection of nano to picoampere currents at microsecond time scales was designed, fabricated and tested for future integration with the nanopore device. PMID:19584949

  2. Nanopore with Transverse Nanoelectrodes for Electrical Characterization and Sequencing of DNA.

    PubMed

    Gierhart, Brian C; Howitt, David G; Chen, Shiahn J; Zhu, Zhineng; Kotecki, David E; Smith, Rosemary L; Collins, Scott D

    2008-06-16

    A DNA sequencing device which integrates transverse conducting electrodes for the measurement of electrode currents during DNA translocation through a nanopore has been nanofabricated and characterized. A focused electron beam (FEB) milling technique, capable of creating features on the order of 1 nm in diameter, was used to create the nanopore. The device was characterized electrically using gold nanoparticles as an artificial analyte with both DC and AC measurement methods. Single nanoparticle/electrode interaction events were recorded. A low-noise, high-speed transimpedance current amplifier for the detection of nano to picoampere currents at microsecond time scales was designed, fabricated and tested for future integration with the nanopore device.

  3. Visualizing the Big (and Large) Data from an HPC Resource

    NASA Astrophysics Data System (ADS)

    Sisneros, R.

    2015-10-01

    Supercomputers are built to endure painfully large simulations and contend with resulting outputs. These are characteristics that scientists are all too willing to test the limits of in their quest for science at scale. The data generated during a scientist's workflow through an HPC center (large data) is the primary target for analysis and visualization. However, the hardware itself is also capable of generating volumes of diagnostic data (big data); this presents compelling opportunities to deploy analogous analytic techniques. In this paper we will provide a survey of some of the many ways in which visualization and analysis may be crammed into the scientific workflow as well as utilized on machine-specific data.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andronov, V.A.; Zhidov, I.G.; Meskov, E.E.

    This report describes an extensive program of investigations conducted at Arzamas-16 in Russia over the past several decades. The focus of the work is on material interface instability and the mixing of two materials. Part 1 of the report discusses analytical and computational studies of hydrodynamic instabilities and turbulent mixing. The EGAK codes are described and results are illustrated for several types of unstable flow. Semiempirical turbulence transport equations are derived for the mixing of two materials, and their capabilities are illustrated for several examples. Part 2 discusses the experimental studies that have been performed to investigate instabilities and turbulent mixing. Shock-tube and jelly techniques are described in considerable detail. Results are presented for many circumstances and configurations.

  5. Sensitivity analysis of a wing aeroelastic response

    NASA Technical Reports Server (NTRS)

    Kapania, Rakesh K.; Eldred, Lloyd B.; Barthelemy, Jean-Francois M.

    1991-01-01

    A variation of Sobieski's Global Sensitivity Equations (GSE) approach is implemented to obtain the sensitivity of the static aeroelastic response of a three-dimensional wing model. The formulation is quite general and accepts any aerodynamics and structural analysis capability. An interface code is written to convert one analysis's output to the other's input, and vice versa. Local sensitivity derivatives are calculated by either analytic methods or finite difference techniques. A program to combine the local sensitivities, such as the sensitivity of the stiffness matrix or the aerodynamic kernel matrix, into global sensitivity derivatives is developed. The aerodynamic analysis package FAST, using a lifting surface theory, and a structural package, ELAPS, implementing Giles' equivalent plate model, are used.
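
The finite-difference option for the local sensitivity derivatives can be sketched as below. The response function here is an illustrative stand-in, not the FAST/ELAPS aeroelastic response; only the central-difference formula itself is the point.

```python
# Local sensitivity derivatives by central finite differences, one of the two
# options the abstract mentions (the other being analytic differentiation).
def lift_coefficient(thickness):
    # Hypothetical smooth response of a wing output to one design variable.
    return 0.8 + 0.3 * thickness - 0.05 * thickness**2

def central_difference(f, x, h=1e-5):
    """df/dx ~ (f(x+h) - f(x-h)) / (2h), second-order accurate."""
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 2.0
fd = central_difference(lift_coefficient, x0)
analytic = 0.3 - 0.1 * x0   # exact derivative of the stand-in model
print(f"finite-difference: {fd:.6f}, analytic: {analytic:.6f}")
```

In the GSE framework, such local derivatives (of, e.g., stiffness or aerodynamic kernel matrix entries) are assembled into the global sensitivity system rather than used directly.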

  6. New York State energy-analytic information system: first-stage implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allentuck, J.; Carroll, O.; Fiore, L.

    1979-09-01

    So that energy policy by state government may be formulated within the constraints imposed by policy determined at the national level - yet reflect the diverse interests of its citizens - large quantities of data and sophisticated analytic capabilities are required. This report presents the design of an energy-information/analytic system for New York State, the data for a base year, 1976, and projections of these data. At the county level, 1976 energy supply-demand data and electric generating plant data are provided as well. Data-base management is based on System 2000. Three computerized models provide the system's basic analytic capacity. The Brookhaven Energy System Network Simulator provides an integrating framework, while a price-response model and a weather-sensitive energy demand model furnish a short-term energy response estimation capability. The operation of these computerized models is described. 62 references, 25 figures, 39 tables.

  7. Big Data Analytics and Machine Intelligence Capability Development at NASA Langley Research Center: Strategy, Roadmap, and Progress

    NASA Technical Reports Server (NTRS)

    Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward

    2016-01-01

    In 2014, a team of researchers, engineers, and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the goal of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid-, and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.

  8. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation has been introduced to solve problems in the form of systems. With this technique the following two problems can be overcome. First, a problem that has an analytical solution, but where the cost of running an experiment to solve it is high in terms of money and lives. Second, a problem that exists but has no analytical solution. In the field of statistical inference the second problem is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutation tests to form pseudo sampling distributions that lead to the solution of problems that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. The paper also discusses resampling techniques as simulation techniques. The misunderstandings about these two techniques are examined. The successful uses of both techniques are also explained.
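
The bootstrap idea the abstract mentions can be sketched in a few lines: resample the observed data with replacement many times, compute the statistic on each resample, and read a confidence interval off the resulting pseudo sampling distribution. The data values below are made up for illustration.

```python
import random

# Bootstrap sketch: build a pseudo sampling distribution of the mean by
# resampling the data with replacement, then take percentile endpoints.
random.seed(7)
data = [4.1, 5.3, 3.8, 6.0, 5.5, 4.7, 5.1, 4.4, 6.2, 5.0]  # illustrative sample

boot_means = []
for _ in range(10_000):
    resample = random.choices(data, k=len(data))
    boot_means.append(sum(resample) / len(resample))

boot_means.sort()
lo, hi = boot_means[249], boot_means[9749]   # 2.5th and 97.5th percentiles
sample_mean = sum(data) / len(data)
print(f"mean = {sample_mean:.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```

No analytical form for the sampling distribution is needed, which is exactly why resampling serves as a simulation technique for the second class of problems.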

  9. Nondestructive quantification of analyte diffusion in cornea and sclera using optical coherence tomography.

    PubMed

    Ghosn, Mohamad G; Tuchin, Valery V; Larin, Kirill V

    2007-06-01

    Noninvasive functional imaging, monitoring, and quantification of analyte transport in epithelial ocular tissues are extremely important for the therapy and diagnostics of many eye diseases. In this study the authors investigated the capability of optical coherence tomography (OCT) for noninvasive monitoring and quantification of the diffusion of different analytes in the sclera and cornea of rabbit eyes. A portable time-domain OCT system with a wavelength of 1310 +/- 15 nm, output power of 3.5 mW, and resolution of 25 μm was used in this study. Diffusion of different analytes was monitored and quantified in rabbit cornea and sclera of whole eyeballs. Diffusion of water, metronidazole (0.5%), dexamethasone (0.2%), ciprofloxacin (0.3%), mannitol (20%), and glucose solution (20%) was examined, and their permeability coefficients were calculated by using OCT signal slope and depth-resolved amplitude methods. Permeability coefficients were calculated as a function of time and tissue depth. For instance, mannitol was found to have a permeability coefficient of (8.99 +/- 1.43) x 10^-6 cm/s in cornea and (6.18 +/- 1.08) x 10^-6 cm/s in sclera. The permeability coefficient of drugs with small concentrations (where water was the major solvent) was found to be in the range of that of water in the same tissue type, whereas permeability coefficients of more highly concentrated solutions varied significantly. Results suggest that the OCT technique might be a powerful tool for noninvasive diffusion studies of different analytes in ocular tissues. However, additional methods of OCT signal acquisition and processing are required to study the diffusion of agents at small concentrations.
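
The OCT signal slope method reduces, at its core, to a simple ratio: the permeability coefficient is estimated as the thickness of the monitored tissue region divided by the time the agent takes to diffuse through it. The region thickness and diffusion time below are illustrative assumptions, not values from the study.

```python
# OCT signal slope sketch: permeability P = (monitored region thickness) /
# (time for the agent to diffuse across it). Numbers are illustrative only.
REGION_THICKNESS_CM = 105e-4      # 105 micrometers, expressed in cm (assumed)
DIFFUSION_TIME_S = 17 * 60        # 17 minutes to cross the region (assumed)

permeability = REGION_THICKNESS_CM / DIFFUSION_TIME_S
print(f"P = {permeability:.2e} cm/s")
```

Values on the order of 10^-6 to 10^-5 cm/s, as reported in the abstract, correspond to agents taking minutes to tens of minutes to cross a region roughly 100 μm thick.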

  10. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma, and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, and excretion (ADME), and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses, and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  11. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation.

    PubMed

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need to have analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream and upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks that are available in five countries: Canada, China, Russia, Spain, and the United States, using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack network completeness, analytical capabilities, or both. To address this limitation, we outline a general framework to build digital river networks that are as complete as possible and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support the creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.
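
One of the analytical capabilities listed above, connecting segments downstream and upstream and attributing each with watershed data, can be sketched as a graph traversal. The network topology and local catchment areas below are hypothetical, chosen only to show the upstream-accumulation pattern.

```python
# Sketch of an "analytical capability" for a digital river network:
# accumulate upstream catchment area for every segment by following the
# downstream connectivity in reverse. Topology and areas are hypothetical.
downstream_of = {               # segment -> the segment it flows into
    "headwater_a": "reach_1",
    "headwater_b": "reach_1",
    "reach_1": "reach_2",
    "headwater_c": "reach_2",
    "reach_2": None,            # outlet
}
local_area_km2 = {"headwater_a": 1.2, "headwater_b": 0.8,
                  "headwater_c": 2.0, "reach_1": 0.5, "reach_2": 0.7}

def upstream_area(segment):
    """Local area plus the areas of all segments draining into this one."""
    total = local_area_km2[segment]
    for seg, dst in downstream_of.items():
        if dst == segment:
            total += upstream_area(seg)
    return total

print(f"outlet catchment = {upstream_area('reach_2'):.1f} km2")
```

In a real virtual watershed the same traversal would attribute each segment with drainage area, land-use composition, and other watershed statistics derived from the integrated digital elevation model.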

  12. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation

    NASA Astrophysics Data System (ADS)

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y.

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at the scales necessary to identify human-environment interactions, and connecting channels downstream, upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national- to regional-scale digital river networks available in five countries (Canada, China, Russia, Spain, and the United States) using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack network completeness, analytical capabilities, or both. To address this limitation, we outline a general framework for building digital river networks that are as complete as possible and for integrating them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support the creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.

  13. Use of the Threshold of Toxicological Concern (TTC) approach for deriving target values for drinking water contaminants.

    PubMed

    Mons, M N; Heringa, M B; van Genderen, J; Puijker, L M; Brand, W; van Leeuwen, C J; Stoks, P; van der Hoek, J P; van der Kooij, D

    2013-03-15

    Ongoing pollution and improving analytical techniques reveal more and more anthropogenic substances in drinking water sources, and occasionally in treated water as well. In fact, the complete absence of any trace pollutant in treated drinking water is an illusion, as current analytical techniques are capable of detecting very low concentrations. Most of the substances detected lack the toxicity data needed to derive safe levels and have not yet been regulated. Although the concentrations in treated water usually do not have adverse health effects, their presence is still undesirable because of customer perception. This raises the questions of how sensitive analytical methods need to become for water quality screening, at what levels water suppliers need to take action, and how effective treatment methods need to be to remove contaminants sufficiently. Therefore, in the Netherlands a clear and consistent approach called 'Drinking Water Quality for the 21st century (Q21)' has been developed within the joint research program of the drinking water companies. Target values for anthropogenic drinking water contaminants were derived by using the recently introduced Threshold of Toxicological Concern (TTC) approach. The target values for individual genotoxic and steroid endocrine chemicals were set at 0.01 μg/L. For all other organic chemicals the target values were set at 0.1 μg/L. The target values for the total sum of genotoxic chemicals, the total sum of steroid hormones and the total sum of all other organic compounds were set at 0.01, 0.01 and 1.0 μg/L, respectively. The Dutch Q21 approach is further supplemented by the standstill principle and effect-directed testing. The approach is helpful in defining the goals and limits of future treatment process designs and of analytical methods to further improve and ensure the quality of drinking water, without going to unnecessary lengths. Copyright © 2013 Elsevier Ltd. All rights reserved.
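
    The quoted Q21 target values lend themselves to a simple screening rule. The sketch below is one illustrative reading of those numbers, not an official implementation; the substance classification scheme and the sample concentrations are invented.

```python
# Hedged sketch: screening measured concentrations against the Q21 target
# values quoted above (all in ug/L). Substance classes and sample data are
# invented for illustration.
INDIVIDUAL_TARGETS = {"genotoxic": 0.01, "steroid": 0.01, "other": 0.1}
SUM_TARGETS = {"genotoxic": 0.01, "steroid": 0.01, "other": 1.0}

def q21_screen(measurements):
    """measurements: iterable of (name, substance_class, conc_ug_per_L)."""
    exceedances = []
    sums = {cls: 0.0 for cls in SUM_TARGETS}
    for name, cls, conc in measurements:
        sums[cls] += conc
        if conc > INDIVIDUAL_TARGETS[cls]:
            exceedances.append((name, conc, INDIVIDUAL_TARGETS[cls]))
    for cls, total in sums.items():
        if total > SUM_TARGETS[cls]:
            exceedances.append(("sum(" + cls + ")", total, SUM_TARGETS[cls]))
    return exceedances

sample = [("substance A", "other", 0.05),
          ("substance B", "other", 0.2),        # exceeds the 0.1 ug/L target
          ("substance C", "genotoxic", 0.005)]
print(q21_screen(sample))   # flags substance B only
```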

  14. Miniaturized Temperature-Controlled Planar Chromatography (Micro-TLC) as a Versatile Technique for Fast Screening of Micropollutants and Biomarkers Derived from Surface Water Ecosystems and During Technological Processes of Wastewater Treatment.

    PubMed

    Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2017-07-01

    There is increasing interest in the development of simple analytical systems enabling the fast screening of target components in complex samples. A number of newly invented protocols are based on quasi-separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, the quantification of target components can be performed mainly due to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography can work as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea off Middle Pomerania in northern Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates, under visible light, fluorescence, and fluorescence-quenching conditions, and using the visualization reagent phosphomolybdic acid, enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful and inexpensive, and can be considered a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular-mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly, green-chemistry analytical tool. The described analytical protocol can be complementary to protocols involving classical column chromatography (HPLC) or various planar microfluidic devices.

  15. Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment

    ERIC Educational Resources Information Center

    Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James

    2010-01-01

    The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…

  16. Buckling Testing and Analysis of Space Shuttle Solid Rocket Motor Cylinders

    NASA Technical Reports Server (NTRS)

    Weidner, Thomas J.; Larsen, David V.; McCool, Alex (Technical Monitor)

    2002-01-01

    A series of full-scale buckling tests was performed on the space shuttle Reusable Solid Rocket Motor (RSRM) cylinders. The tests were performed to determine the buckling capability of the cylinders and to provide data for analytical comparison. A nonlinear ANSYS Finite Element Analysis (FEA) model was used to represent and evaluate the testing. Analytical results demonstrated excellent correlation with test results, predicting the failure load within 5%. The analytical value was on the conservative side, predicting a lower failure load than was applied in the test. The resulting study and analysis identified the parameters that are important for FEA to accurately predict buckling failure. The resulting method was subsequently used to establish the pre-launch buckling capability of the space shuttle system.

  17. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that, alone or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided by whether they could support micro-level assessments (e.g., improving individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.

  18. Robotic voltammetry with carbon nanotube-based sensors: a superb blend for convenient high-quality antimicrobial trace analysis.

    PubMed

    Theanponkrang, Somjai; Suginta, Wipa; Weingart, Helge; Winterhalter, Mathias; Schulte, Albert

    2015-01-01

    A new automated pharmacoanalytical technique for the convenient quantification of redox-active antibiotics has been established by combining the electrocatalytic activity of a carbon nanotube (CNT) sensor modification for analyte detection with the merits of a robotic electrochemical device capable of sequential, nonmanual sample measurements in 24-well microtiter plates. Norfloxacin (NFX) and ciprofloxacin (CFX), two standard fluoroquinolone antibiotics, were used in automated calibration measurements by differential pulse voltammetry (DPV); linear ranges of 1-10 μM and 2-100 μM were achieved for NFX and CFX, respectively. The lowest detectable levels were estimated to be 0.3±0.1 μM (n=7) for NFX and 1.6±0.1 μM (n=7) for CFX. In standard solutions or tablet samples of known content, both analytes could be quantified with the robotic DPV microtiter plate assay, with recoveries within ±4% of 100%. Recoveries were equally good when NFX was evaluated in human serum samples spiked with NFX. The simple instrumentation, convenient execution, and high effectiveness in analyte quantitation recommend the merger of automated microtiter plate voltammetry and CNT-supported electrochemical drug detection as a novel methodology for antibiotic testing in pharmaceutical and clinical research and quality control laboratories.
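
    The calibration-and-recovery workflow described (linear DPV response, quantification of a sample of known content, recovery within a few percent of 100%) can be sketched as follows. The peak currents and concentrations below are made-up numbers chosen to mimic the reported NFX linear range, not the paper's data.

```python
# Illustrative sketch (not the authors' code): a least-squares calibration
# line fitted to DPV peak currents, then used to quantify an unknown and
# compute percent recovery. All numbers are invented.
import numpy as np

conc = np.array([1, 2, 4, 6, 8, 10], dtype=float)        # uM, NFX standards
peak_i = np.array([0.21, 0.40, 0.83, 1.19, 1.62, 2.01])  # uA, assumed

slope, intercept = np.polyfit(conc, peak_i, 1)

def quantify(i_peak):
    """Invert the calibration line: peak current (uA) -> concentration (uM)."""
    return (i_peak - intercept) / slope

# A "tablet" sample with nominal 5.0 uM content:
measured = quantify(1.01)
recovery = 100.0 * measured / 5.0
print(round(measured, 2), round(recovery, 1))  # recovery lands near 100%
```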

  19. Coffee-ring effects in laser desorption/ionization mass spectrometry.

    PubMed

    Hu, Jie-Bi; Chen, Yu-Chie; Urban, Pawel L

    2013-03-05

    This report focuses on the heterogeneous distribution of small molecules (e.g. metabolites) within dry deposits of suspensions and solutions of inorganic and organic compounds with implications for chemical analysis of small molecules by laser desorption/ionization (LDI) mass spectrometry (MS). Taking advantage of the imaging capabilities of a modern mass spectrometer, we have investigated the occurrence of "coffee rings" in matrix-assisted laser desorption/ionization (MALDI) and surface-assisted laser desorption/ionization (SALDI) sample spots. It is seen that the "coffee-ring effect" in MALDI/SALDI samples can be both beneficial and disadvantageous. For example, formation of the coffee rings gives rise to heterogeneous distribution of analytes and matrices, thus compromising analytical performance and reproducibility of the mass spectrometric analysis. On the other hand, the coffee-ring effect can also be advantageous because it enables partial separation of analytes from some of the interfering molecules present in the sample. We report a "hidden coffee-ring effect" where under certain conditions the sample/matrix deposit appears relatively homogeneous when inspected by optical microscopy. Even in such cases, hidden coffee rings can still be found by implementing the MALDI-MS imaging technique. We have also found that to some extent, the coffee-ring effect can be suppressed during SALDI sample preparation. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Failure of Standard Training Sets in the Analysis of Fast-Scan Cyclic Voltammetry Data.

    PubMed

    Johnson, Justin A; Rodeberg, Nathan T; Wightman, R Mark

    2016-03-16

    The use of principal component regression, a multivariate calibration method, in the analysis of in vivo fast-scan cyclic voltammetry data allows for separation of overlapping signal contributions, permitting evaluation of the temporal dynamics of multiple neurotransmitters simultaneously. To accomplish this, the technique relies on information about current-concentration relationships across the scan-potential window gained from analysis of training sets. The ability of the constructed models to resolve analytes depends critically on the quality of these data. Recently, the use of standard training sets obtained under conditions other than those of the experimental data collection (e.g., with different electrodes, animals, or equipment) has been reported. This study evaluates the analyte resolution capabilities of models constructed using this approach from both a theoretical and experimental viewpoint. A detailed discussion of the theory of principal component regression is provided to inform this discussion. The findings demonstrate that the use of standard training sets leads to misassignment of the current-concentration relationships across the scan-potential window. This directly results in poor analyte resolution and, consequently, inaccurate quantitation, which may lead to erroneous conclusions being drawn from experimental data. Thus, it is strongly advocated that training sets be obtained under the experimental conditions to allow for accurate data analysis.
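
    The principal component regression pipeline described, building a model from training-set voltammograms at known concentrations and using it to quantify unknowns, can be sketched as below. The synthetic Gaussian "voltammograms", the noise level, and the single-analyte setup are assumptions for illustration only.

```python
# Minimal sketch of principal component regression: project training-set
# voltammograms onto a few principal components, then regress the known
# concentrations on the scores. Synthetic data; not the authors' code.
import numpy as np

rng = np.random.default_rng(0)
n_pot = 100                       # points across the scan-potential window
base = np.exp(-((np.arange(n_pot) - 40) / 8.0) ** 2)   # peak-shaped response

# Training set: voltammograms at known concentrations plus noise.
conc_train = np.array([0.1, 0.25, 0.5, 0.75, 1.0])     # uM
X = np.outer(conc_train, base) + 0.01 * rng.standard_normal((5, n_pot))

# PCA via SVD on mean-centred currents.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 1                                            # retained components
scores = Xc @ Vt[:k].T

# Regress concentration on the scores (with intercept).
A = np.column_stack([scores, np.ones(len(conc_train))])
coef, *_ = np.linalg.lstsq(A, conc_train, rcond=None)

def predict(voltammogram):
    sc = (voltammogram - X.mean(axis=0)) @ Vt[:k].T
    return float(np.concatenate([sc, [1.0]]) @ coef)

unknown = 0.6 * base + 0.01 * rng.standard_normal(n_pot)
print(predict(unknown))           # close to the true 0.6 uM
```

    The abstract's warning corresponds to fitting `Vt` and `coef` on data from one electrode or rig and applying `predict` to another: the current-concentration relationship across the potential window then no longer matches, and the regression misassigns analyte contributions.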

  1. Carbon based sample supports and matrices for laser desorption/ ionization mass spectrometry.

    PubMed

    Rainer, Matthias; Najam-ul-Haq, Muhammad; Huck, Christian W; Vallant, Rainer M; Heigl, Nico; Hahn, Hans; Bakry, Rania; Bonn, Günther K

    2007-01-01

    Laser desorption/ionization mass spectrometry (LDI-MS) is a widespread and powerful technique for mass analysis that allows the soft ionization of molecules such as peptides, proteins and carbohydrates. In many applications, an energy-absorbing matrix has to be added to the analytes in order to protect them from fragmentation by the direct laser beam. LDI-MS in conjunction with a matrix is commonly referred to as matrix-assisted LDI (MALDI). One of the striking disadvantages of this method is the desorption of matrix molecules, which causes interferences originating from matrix background ions in the lower mass range (< 1000 Da). This has led to the development of a variety of carbon-based LDI sample supports, which are capable of absorbing laser light and simultaneously transferring energy to the analytes for desorption. Furthermore, carbon-containing sample supports are used as carrier materials for the specific binding and preconcentration of molecules out of complex samples; their subsequent analysis with MALDI mass spectrometry allows studies in metabolomics and proteomics. Finally, a thin layer of carbon significantly improves sensitivity in terms of the detection limit: analytes in the low-femtomole and attomole range can be detected. In the present article, these aspects are reviewed from patents in which nano-based carbon materials are comprehensively utilized.

  2. Fundamentals of Enzyme-Based Sensors

    NASA Astrophysics Data System (ADS)

    Moreno-Bondi, María C.; Benito-Peña, Elena

    One of the major breakthroughs in the development of analytical measurement techniques was the introduction, in the mid-twentieth century, of bioprobes for the analysis of chemical and biochemical compounds in real samples. The first devices, developed in the 1950s and 1960s by Clark et al., were based on electrochemical measurements and allowed the determination of oxygen and glucose in tissues and blood samples. Later on, in the 1970s, optical transduction was coupled to enzymatically catalyzed reactions, and since those early days the field of application of optical biosensors has broadened considerably. According to the definition proposed by the International Union of Pure and Applied Chemistry (IUPAC): "A biosensor is a self-contained integrated device which is capable of providing specific quantitative or semi-quantitative analytical information using a biological recognition element (biochemical receptor) which is in direct spatial contact with a transducer element. A biosensor should be clearly distinguished from a bioanalytical system, which requires additional processing steps, such as reagent addition. Furthermore, a biosensor should be distinguished from a bioprobe which is either disposable after one measurement, i.e. single use, or unable to continuously monitor the analyte concentration". The general scheme of a biosensor configuration is shown in Figure 1. Biosensors that include transducers based on integrated circuit microchips are known as biochips.

  3. Trace analysis of high-purity graphite by LA-ICP-MS.

    PubMed

    Pickhardt, C; Becker, J S

    2001-07-01

    Laser-ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been established as a very efficient and sensitive technique for the direct analysis of solids. In this work the capability of LA-ICP-MS was investigated for the determination of trace elements in high-purity graphite. Synthetic laboratory standards with a graphite matrix were prepared for the purpose of quantifying the analytical results. Doped trace elements, at a concentration of 0.5 microg g(-1) in a laboratory standard, were determined with an accuracy of ±1% to ±7% and a relative standard deviation (RSD) of 2-13%. Solution-based calibration was also used for quantitative analysis of high-purity graphite. It was found that such calibration led to analytical results for trace-element determination in graphite with accuracy similar to that obtained by use of synthetic laboratory standards. Results from quantitative determination of trace impurities in a real reactor-graphite sample, using both quantification approaches, were in good agreement. Detection limits for all elements of interest were in the low ng g(-1) concentration range. A tenfold improvement in detection limits was achieved for analyses of high-purity graphite by LA-ICP-MS under wet plasma conditions, because of the lower background signal and increased element sensitivity.

  4. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies.

    PubMed

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-10-24

    In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have reached detection limits within the pM and nM ranges. Most of these developments proved their suitability for detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies is still behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is strongly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative influence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Evaluating sustainable energy harvesting systems for human implantable sensors

    NASA Astrophysics Data System (ADS)

    AL-Oqla, Faris M.; Omar, Amjad A.; Fares, Osama

    2018-03-01

    Selecting the most appropriate energy-harvesting technique for human implantable sensors is still challenging for the industry, where careful decisions have to be made. Moreover, the available polymeric-based composite materials offer plentiful renewable applications that can support sustainable development, being useful for energy-harvesting systems such as photovoltaic, piezoelectric and thermoelectric devices as well as other energy storage systems. This work presents an expert-based model for evaluating and examining the various available renewable energy-harvesting techniques in urban surroundings subject to various technical and economic, often conflicting, criteria. A broad set of evaluation criteria was adopted in the proposed model after examining their suitability, and the expediency and reliability of the model were ensured through worldwide experts' feedback. The model establishes an analytic hierarchy structure with 12 simultaneous, conflicting factors to give designers a systematic road map for assessing such techniques for human implantable medical sensors. The energy-harvesting techniques considered were limited to wireless, thermoelectric, infrared radiator, piezoelectric, magnetic induction and electrostatic energy harvesters. The results demonstrate that the best decision was in favour of wireless harvesting technology for the medical sensors, as it was preferred by most of the evaluation criteria considered in the model.
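
    The analytic hierarchy process underlying such a model derives criterion weights from pairwise comparisons. A minimal sketch follows; the 3x3 comparison matrix is an invented toy example standing in for the paper's 12-factor hierarchy.

```python
# Sketch of the analytic-hierarchy weighting step: criterion weights from
# the principal eigenvector of a Saaty-style reciprocal comparison matrix,
# plus a consistency check. Toy 3x3 matrix, not the authors' data.
import numpy as np

# A[i, j] = how much more important criterion i is than criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()     # normalise to sum to 1

# Consistency ratio (random index RI = 0.58 for n = 3, from Saaty's tables).
n = A.shape[0]
ci = (eigvals.real.max() - n) / (n - 1)
cr = ci / 0.58
print(weights, cr)   # CR below 0.1 is conventionally acceptable
```

    With 12 criteria the matrix is 12x12 and the random index changes, but the weighting and consistency logic is the same.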

  6. Assessment of Equipment for the Determination of Nutrients in Marine Waters: A Case Study of the Microplate Technique

    NASA Astrophysics Data System (ADS)

    Aminot, A.

    1996-09-01

    An essential prerequisite for quality assurance of the colorimetric determination of nutrients in seawater is the use of suitable photometric equipment. Based on a knowledge of the optical characteristics of a particular system and the absorption coefficient of the analyte, a statistical approach can be used to predict the limit of detection and the limit of quantitation for a given determinand. The microplate technique, widely used for bioassays, is applicable to colorimetric analysis in general, and its use for the determination of nutrients in seawater has been suggested. This paper reports a theoretical assessment of its capabilities in this context and a practical check on its performance, taking the determination of nitrite in seawater as typical. The conclusion is that short optical path length and insufficient repeatability of the absorbance measurement render it unsuitable for the determination of the low concentrations generally encountered in marine work, with the possible exception of nitrate. The perceived advantage of high-speed analysis is a secondary consideration in the overall process of determining nutrients, and the microplate technique's small scale of operation is a definite disadvantage as this increases the risk of exposure to contamination problems, in comparison with conventional techniques.
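
    The statistical prediction mentioned, deriving detection and quantitation limits from the instrument's absorbance repeatability and the analyte's absorption coefficient, can be sketched with the common 3s/10s convention. All numbers below (molar absorptivity, path length, repeatability) are invented assumptions; they illustrate the abstract's point that a short optical path inflates the limit of detection.

```python
# Hedged sketch: predicted LOD and LOQ from Beer-Lambert sensitivity and
# photometric repeatability, using the conventional 3*s and 10*s criteria.
# Parameter values are illustrative assumptions, not the paper's data.
epsilon = 40000.0     # L mol^-1 cm^-1, assumed molar absorptivity
path_cm = 0.5         # microplate well optical path (short!)
s_blank = 0.002       # absorbance-unit repeatability of the photometer

slope = epsilon * path_cm      # absorbance per mol/L (Beer-Lambert)
lod = 3 * s_blank / slope      # mol/L
loq = 10 * s_blank / slope

print(f"LOD = {lod * 1e9:.0f} nM, LOQ = {loq * 1e9:.0f} nM")
```

    Halving the path length doubles both limits, which is why the 10 cm cells of conventional segmented-flow analysers outperform microplate wells at marine nutrient levels.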

  7. Covariance mapping techniques

    NASA Astrophysics Data System (ADS)

    Frasinski, Leszek J.

    2016-08-01

    Recent technological advances in the generation of intense femtosecond pulses have made covariance mapping an attractive analytical technique. The laser pulses available are so intense that often thousands of ionisation and Coulomb explosion events will occur within each pulse. To understand the physics of these processes the photoelectrons and photoions need to be correlated, and covariance mapping is well suited for operating at the high counting rates of these laser sources. Partial covariance is particularly useful in experiments with x-ray free electron lasers, because it is capable of suppressing pulse fluctuation effects. A variety of covariance mapping methods is described: simple, partial (single- and multi-parameter), sliced, contingent and multi-dimensional. The relationship to coincidence techniques is discussed. Covariance mapping has been used in many areas of science and technology: inner-shell excitation and Auger decay, multiphoton and multielectron ionisation, time-of-flight and angle-resolved spectrometry, infrared spectroscopy, nuclear magnetic resonance imaging, stimulated Raman scattering, directional gamma ray sensing, welding diagnostics and brain connectivity studies (connectomics). This review gives practical advice for implementing the technique and interpreting the results, including its limitations and instrumental constraints. It also summarises recent theoretical studies, highlights unsolved problems and outlines a personal view on the most promising research directions.
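
    The basic simple covariance map, C(x, y) = <XY> - <X><Y> accumulated over many laser shots, can be sketched as below; the simulated spectra and the "pulse energy" fluctuation shared by two bins are invented for illustration.

```python
# Simple covariance map sketch over simulated "laser shots": two spectral
# bins that fluctuate together appear as an off-diagonal island. Toy data.
import numpy as np

rng = np.random.default_rng(1)
n_shots, n_bins = 5000, 32
spectra = rng.poisson(1.0, size=(n_shots, n_bins)).astype(float)

# Correlate bins 5 and 20: both follow the same shot-to-shot fluctuation.
pulse = rng.exponential(1.0, n_shots)
spectra[:, 5] += pulse
spectra[:, 20] += pulse

mean = spectra.mean(axis=0)
cov_map = spectra.T @ spectra / n_shots - np.outer(mean, mean)

# The largest off-diagonal entry reveals the correlated pair of bins.
off_diag = np.abs(cov_map - np.diag(np.diag(cov_map)))
i, j = np.unravel_index(off_diag.argmax(), cov_map.shape)
print(sorted((i, j)))
```

    Partial covariance, mentioned in the review for free-electron-laser work, extends this by additionally regressing out a measured fluctuating parameter (such as pulse energy) before forming the map.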

  8. U-Pb ages of zircon rims: A new analytical method using the air-abrasion technique

    USGS Publications Warehouse

    Aleinikoff, J.N.; Winegarden, D.L.; Walter, M.

    1990-01-01

    We present a new technique for directly dating, by conventional techniques, the rims of zircons. Several circumstances, such as a xenocrystic or inherited component in igneous zircon and metamorphic overgrowths on igneous cores, can result in grains with physically distinct age components. Pneumatic abrasion has previously been shown by Krogh to remove overgrowths and damaged areas of zircon, leaving the more resistant and isotopically less disturbed parts available for analysis. A new abrader design, which is capable of very gently grinding only the tips and interfacial edges of even needle-like grains, permits easy collection of the abraded material for dating. Five examples demonstrate the utility of the "dust-collecting" technique, including two studies that compare conventional, ion-microprobe and abrader data. Common Pb may be strongly concentrated in the outermost zones of many zircons, and this Pb is not easily removed by leaching (even in weak HF). Thus, the benefit of removing only the outermost zones (and avoiding mixing of age components) is somewhat compromised by the much higher common-Pb contents, which result in less precise age determinations. A very brief abrasion to remove the high common-Pb zones prior to collection of material for dating is therefore adopted. © 1990.

  9. Sensing Free Sulfur Dioxide in Wine

    PubMed Central

    Monro, Tanya M.; Moore, Rachel L.; Nguyen, Mai-Chi; Ebendorff-Heidepriem, Heike; Skouroumounis, George K.; Elsey, Gordon M.; Taylor, Dennis K.

    2012-01-01

    Sulfur dioxide (SO2) is important in the winemaking process as it aids in preventing microbial growth and the oxidation of wine. These processes and others consume the SO2 over time, resulting in wines with little SO2 protection. Furthermore, SO2 and sulfiting agents are known to be allergens to many individuals and for that reason their levels need to be monitored and regulated in final wine products. Many of the current techniques for monitoring SO2 in wine require the SO2 to be separated from the wine prior to analysis. This investigation demonstrates a technique capable of measuring free sulfite concentrations in low volume liquid samples in white wine. This approach adapts a known colorimetric reaction to a suspended core optical fiber sensing platform, and exploits the interaction between guided light located within the fiber voids and a mixture of the wine sample and a colorimetric analyte. We have shown that this technique enables measurements to be made without dilution of the wine samples, thus paving the way towards real time in situ wine monitoring. PMID:23112627

  10. Trends in hard X-ray fluorescence mapping: environmental applications in the age of fast detectors.

    PubMed

    Lombi, E; de Jonge, M D; Donner, E; Ryan, C G; Paterson, D

    2011-06-01

    Environmental samples are extremely diverse but share a tendency for heterogeneity and complexity. This heterogeneity poses methodological challenges when investigating biogeochemical processes. In recent years, the development of analytical tools capable of probing element distribution and speciation at the microscale has allowed this challenge to be addressed. Of these available tools, laterally resolved synchrotron techniques such as X-ray fluorescence mapping are key methods for the in situ investigation of micronutrients and inorganic contaminants in environmental samples. This article demonstrates how recent advances in X-ray fluorescence detector technology are bringing new possibilities to environmental research. Fast detectors are helping to circumvent major issues such as X-ray beam damage of hydrated samples, as dwell times during scanning are reduced. They are also helping to reduce temporal beamtime requirements, making particularly time-consuming techniques such as micro X-ray fluorescence (μXRF) tomography increasingly feasible. This article focuses on μXRF mapping of nutrients and metalloids in environmental samples, and suggests that the current divide between mapping and speciation techniques will be increasingly blurred by the development of combined approaches.

  11. Investigating biomolecular recognition at the cell surface using atomic force microscopy.

    PubMed

    Wang, Congzhou; Yadavalli, Vamsi K

    2014-05-01

    Probing the interaction forces that drive biomolecular recognition on cell surfaces is essential for understanding diverse biological processes. Force spectroscopy has been a widely used dynamic analytical technique, allowing measurement of such interactions at the molecular and cellular level. The capabilities of working under near physiological environments, combined with excellent force and lateral resolution make atomic force microscopy (AFM)-based force spectroscopy a powerful approach to measure biomolecular interaction forces not only on non-biological substrates, but also on soft, dynamic cell surfaces. Over the last few years, AFM-based force spectroscopy has provided biophysical insight into how biomolecules on cell surfaces interact with each other and induce relevant biological processes. In this review, we focus on describing the technique of force spectroscopy using the AFM, specifically in the context of probing cell surfaces. We summarize recent progress in understanding the recognition and interactions between macromolecules that may be found at cell surfaces from a force spectroscopy perspective. We further discuss the challenges and future prospects of the application of this versatile technique. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Real time en face Fourier-domain optical coherence tomography with direct hardware frequency demodulation

    PubMed Central

    Biedermann, Benjamin R.; Wieser, Wolfgang; Eigenwillig, Christoph M.; Palte, Gesa; Adler, Desmond C.; Srinivasan, Vivek J.; Fujimoto, James G.; Huber, Robert

    2009-01-01

    We demonstrate en face swept source optical coherence tomography (ss-OCT) without requiring a Fourier transformation step. The electronic optical coherence tomography (OCT) interference signal from a k-space linear Fourier domain mode-locked laser is mixed with an adjustable local oscillator, yielding the analytic reflectance signal from one image depth for each frequency sweep of the laser. Furthermore, a method for arbitrarily shaping the spectral intensity profile of the laser is presented, without requiring the step of numerical apodization. In combination, these two techniques enable sampling of the in-phase and quadrature signal with a slow analog-to-digital converter and allow for real-time display of en face projections even at the highest axial scan rates. Image data generated with this technique are compared to en face images extracted from a three-dimensional OCT data set. This technique can allow for real-time visualization of arbitrarily oriented en face planes for the purpose of alignment, registration, or operator-guided survey scans while simultaneously maintaining the full capability of high-speed volumetric ss-OCT functionality. PMID:18978919
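    The core of the hardware demodulation scheme described above can be illustrated numerically: mixing the fringe signal with a quadrature local oscillator tuned to one depth's fringe frequency, then low-pass filtering, yields that depth's reflectance without a Fourier transform. The sketch below is illustrative only; all signal parameters (sweep sampling rate, fringe frequency, amplitude, phase) are invented, not taken from the paper.

```python
import numpy as np

fs = 1_000_000            # samples per sweep-time unit (assumed)
n = 4096                  # samples in one sweep (assumed)
t = np.arange(n) / fs
f_depth = 50_000.0        # fringe frequency selecting one image depth
a, phi = 0.7, 1.2         # simulated reflectance amplitude and phase

# interference fringe from a single reflector at the chosen depth
fringe = a * np.cos(2 * np.pi * f_depth * t + phi)

# mix with a quadrature local oscillator at the depth of interest
i_sig = fringe * np.cos(2 * np.pi * f_depth * t)
q_sig = fringe * -np.sin(2 * np.pi * f_depth * t)

# "slow ADC" stage: averaging over the sweep low-pass filters away the
# 2*f_depth mixing product, leaving the baseband I/Q components
i_dc, q_dc = i_sig.mean(), q_sig.mean()
amplitude = 2 * np.hypot(i_dc, q_dc)   # recovers a
phase = np.arctan2(q_dc, i_dc)         # recovers phi
```

Sweeping the local oscillator frequency would select different en face depths, which is the adjustability the abstract refers to.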

  13. On the power spectral density of quadrature modulated signals. [satellite communication

    NASA Technical Reports Server (NTRS)

    Yan, T. Y.

    1981-01-01

    The conventional (no-offset) quadriphase modulation technique suffers from the fact that hardlimiting will restore the frequency sidelobes removed by proper filtering. Thus, offset keyed quadriphase modulation techniques are often proposed for satellite communication with bandpass hardlimiting. A unified theory is developed which is capable of describing the power spectral density before and after the hardlimiting process. Using the in-phase and quadrature-phase channels with arbitrary pulse shaping, analytical results are established for generalized quadriphase modulation. In particular, MSK, QPSK, and the recently introduced overlapped raised cosine keying all fall into this general category. It is shown that for a linear communication channel, the power spectral density of the modulated signal remains unchanged regardless of the offset delay. Furthermore, if the in-phase and quadrature-phase channels have identical pulse shapes without offset, the spectrum after bandpass hardlimiting will be identical to that of conventional QPSK modulation. Numerical examples are given for various modulation techniques. A case of different pulse shapes in the in-phase and quadrature-phase channels is also considered.
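    The linear-channel result, that the offset delay leaves the power spectral density unchanged, is easy to check with a Monte Carlo sketch. The parameters below (rectangular pulse shaping, samples per symbol, segment length) are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)
sps = 8        # samples per symbol (rectangular pulse shaping, assumed)
nsym = 20000   # symbols per rail

def quadriphase(offset):
    """Baseband quadriphase signal; offset=True delays the Q rail by half a symbol."""
    i = rng.choice([-1.0, 1.0], nsym).repeat(sps)
    q = rng.choice([-1.0, 1.0], nsym).repeat(sps)
    if offset:
        q = np.roll(q, sps // 2)   # OQPSK-style half-symbol stagger
    return i + 1j * q

def psd(x, nfft=256):
    """Crude Welch estimate: average periodograms of non-overlapping segments."""
    segs = x[: len(x) // nfft * nfft].reshape(-1, nfft)
    return (np.abs(np.fft.fft(segs, axis=1)) ** 2).mean(axis=0) / nfft

p_qpsk = psd(quadriphase(offset=False))
p_oqpsk = psd(quadriphase(offset=True))
```

With enough averaging the two estimated spectra coincide to within estimation noise, consistent with the unified theory's claim for the linear channel; the difference only appears after bandpass hardlimiting.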

  14. Analytical and physical modeling program for the NASA Lewis Research Center's Altitude Wind Tunnel (AWT)

    NASA Technical Reports Server (NTRS)

    Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.

    1985-01-01

    An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.

  15. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development, and various technical tasks, as well as Cal Tech.

  16. Manipulability, force, and compliance analysis for planar continuum manipulators

    NASA Technical Reports Server (NTRS)

    Gravagne, Ian A.; Walker, Ian D.

    2002-01-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.
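    For conventional rigid-link arms, the manipulability ellipsoid that this paper generalizes to continuum robots is built from the singular values of the manipulator Jacobian. A minimal sketch of that standard construction (Yoshikawa's measure; the Jacobian below is a made-up planar example, not the paper's continuum formulation):

```python
import numpy as np

def manipulability(J):
    """Velocity manipulability ellipsoid of a manipulator Jacobian J.

    The singular values of J are the ellipsoid's semi-axes; their product
    is Yoshikawa's scalar manipulability measure. The force ellipsoid's
    semi-axes are the reciprocals of these values."""
    s = np.linalg.svd(J, compute_uv=False)
    return s, float(np.prod(s))

# illustrative planar Jacobian: 3 joint rates map to 2-D tip velocity
J = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0]])
axes, w = manipulability(J)   # axes = [2, 1], w = 2
```

The continuum-robot versions in the paper follow the same ellipsoid idea but must account for the actuation and compliance structure of a continuous backbone.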

  17. Manipulability, force, and compliance analysis for planar continuum manipulators.

    PubMed

    Gravagne, Ian A; Walker, Ian D

    2002-06-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.

  18. Optical coherence tomography in estimating molecular diffusion of drugs and analytes in ocular tissues

    NASA Astrophysics Data System (ADS)

    Ghosn, Mohamad G.; Tuchin, Valery V.; Larin, Kirill V.

    2009-02-01

    Among ocular drug delivery methods, topical application with subsequent drug diffusion through the cornea and sclera remains the favored route, as it imposes the least pain and discomfort on the patient. However, this delivery route suffers from the low permeability of epithelial tissues and from drug washout, reducing the effectiveness of the drug and its ability to reach its target in effective concentrations. To better understand the behavioral characteristics of diffusion in ocular tissue, a method for noninvasive imaging of drug diffusion is needed. Due to its high resolution and depth-resolved imaging capabilities, optical coherence tomography (OCT) has been utilized to quantify the molecular transport of different drugs and analytes in vitro in the sclera and the cornea. Diffusion of Metronidazole (0.5%), Dexamethasone (0.2%), Ciprofloxacin (0.3%), Mannitol (20%), and glucose solution (20%) in rabbit sclera and cornea was examined. Permeability coefficients were calculated by using OCT signal slope and depth-resolved amplitude methods as a function of time and tissue depth. For instance, mannitol was found to have a permeability coefficient of (8.99 +/- 1.43) × 10⁻⁶ cm/s in cornea (n=4) and (6.18 +/- 1.08) × 10⁻⁶ cm/s in sclera (n=5). We also demonstrate the capability of the OCT technique for depth-resolved monitoring and quantification of glucose diffusion in different layers of the sclera. We found that the glucose diffusion rate is not uniform throughout the tissue and increases from approximately (2.39 +/- 0.73) × 10⁻⁶ cm/s at the epithelial side to (8.63 +/- 0.27) × 10⁻⁶ cm/s close to the endothelial side of the sclera. In addition, a discrepancy in the permeability rates of glucose solutions with different concentrations was observed. Such diffusion studies could enhance our knowledge and potentially pave the way for advances in therapeutic and diagnostic techniques for the treatment of ocular diseases.
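    In its simplest depth-resolved form, a permeability coefficient of this kind reduces to the thickness of the tissue region divided by the time the agent takes to traverse it. The sketch below shows only that final ratio; the study's actual method fits the OCT signal slope over time, and the numbers here are illustrative, chosen to land in the reported 10⁻⁶ cm/s range:

```python
def permeability_coefficient(thickness_cm, diffusion_time_s):
    """Simplest depth-resolved estimate: P = z / t, in cm/s.

    z is the thickness of the tissue region the agent crosses and
    t the measured diffusion time through it (illustrative model)."""
    return thickness_cm / diffusion_time_s

# e.g., an agent crossing a 100-um (0.01 cm) layer in ~19 minutes
p = permeability_coefficient(0.01, 19 * 60)   # ~8.8e-6 cm/s
```

Repeating the estimate layer by layer is what reveals the non-uniform diffusion rate across the sclera that the abstract reports.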

  19. Electron-Beam Diagnostic Methods for Hypersonic Flow Diagnostics

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The purpose of this work was the evaluation of the use of electron-beam fluorescence for flow measurements during hypersonic flight. Both analytical and numerical models were developed in this investigation to evaluate quantitatively flow field imaging concepts based upon the electron beam fluorescence technique for use in flight research and wind tunnel applications. Specific models were developed for: (1) fluorescence excitation/emission for nitrogen, (2) rotational fluorescence spectrum for nitrogen, (3) single and multiple scattering of electrons in a variable density medium, (4) spatial and spectral distribution of fluorescence, (5) measurement of rotational temperature and density, (6) optical filter design for fluorescence imaging, and (7) temperature accuracy and signal acquisition time requirements. Application of these models to a typical hypersonic wind tunnel flow is presented. In particular, the capability of simulating the fluorescence resulting from electron impact ionization in a variable density nitrogen or air flow provides the capability to evaluate the design of imaging instruments for flow field mapping. The result of this analysis is a recommendation that quantitative measurement of hypersonic flow fields using electron-beam fluorescence is tractable with electron beam energies of 100 keV. With lower electron energies, electron scattering increases with significant beam divergence, which makes quantitative imaging difficult. The potential application of the analytical and numerical models developed in this work is in the design of a flow field imaging instrument for use in hypersonic wind tunnels or onboard a flight research vehicle.

  20. Improving entrepreneurial opportunity recognition through web content analytics

    NASA Astrophysics Data System (ADS)

    Bakar, Muhamad Shahbani Abu; Azmi, Azwiyati

    2017-10-01

    The ability to recognize and develop an opportunity into a venture defines an entrepreneur. Research in opportunity recognition has been robust and focuses mostly on explaining the processes involved in opportunity recognition. Factors such as prior knowledge and cognitive and creative capabilities are shown to affect opportunity recognition in entrepreneurs. Prior knowledge in areas such as customer problems, ways to serve the market, and technology has been shown in various studies to be a factor that facilitates entrepreneurs in identifying and recognizing opportunities. Research findings also show that experienced entrepreneurs search and scan for information to discover opportunities. Searching and scanning for information has also been shown to help novice entrepreneurs who lack prior knowledge to narrow this gap and better identify and recognize opportunities. There is less focus in research on finding empirically proven techniques and methods to develop and enhance opportunity recognition in student entrepreneurs. This is important as the country pushes for more graduate entrepreneurs who can drive the economy. This paper discusses the Opportunity Recognition Support System (ORSS), an information support system to help student entrepreneurs in particular to identify and recognize business opportunities. The ORSS aims to provide the necessary knowledge for student entrepreneurs to better identify and recognize opportunities. Applying design research, theories in opportunity recognition are used to identify the requirements for the support system, and the requirements in turn dictate its design. The paper proposes the use of web content mining and analytics as the two core components and techniques for the support system. Web content mining can mine the vast knowledge repositories available on the internet, and analytics can provide entrepreneurs with further insights into the information needed to recognize opportunities in a given market or industry.

  1. Coordinated Analysis 101: A Joint Training Session Sponsored by LPI and ARES/JSC

    NASA Technical Reports Server (NTRS)

    Draper, D. S.; Treiman, A. H.

    2017-01-01

    The Lunar and Planetary Institute (LPI) and the Astromaterials Research and Exploration Science (ARES) Division, part of the Exploration Integration and Science Directorate at NASA Johnson Space Center (JSC), co-sponsored a training session in November 2016 for four early-career scientists in the techniques of coordinated analysis. Coordinated analysis refers to the approach of systematically performing high-resolution and -precision analytical studies on astromaterials, particularly the very small particles typical of recent and near-future sample return missions such as Stardust, Hayabusa, Hayabusa2, and OSIRIS-REx. A series of successive analytical steps is chosen to be performed on the same particle, as opposed to separate subsections of a sample, in such a way that the initial steps do not compromise the results from later steps in the sequence. The data from the entire series can then be integrated for these individual specimens, revealing important insights obtainable no other way. ARES/JSC scientists have played a leading role in the development and application of this approach for many years. Because the coming years will bring new sample collections from these and other planned NASA and international exploration missions, it is timely to begin disseminating specialized techniques for the study of small and precious astromaterial samples. As part of the Cooperative Agreement between NASA and the LPI, this training workshop was intended as the first in a series of similar training exercises that the two organizations will jointly sponsor in the coming years. These workshops will span the range of analytical capabilities and sample types available at ARES/JSC in the Astromaterials Research and Astromaterials Acquisition and Curation Offices. Here we summarize the activities and participants in this initial training.

  2. Damage states in laminated composite three-point bend specimens: An experimental-analytical correlation study

    NASA Technical Reports Server (NTRS)

    Starbuck, J. Michael; Guerdal, Zafer; Pindera, Marek-Jerzy; Poe, Clarence C.

    1990-01-01

    Damage states in laminated composites were studied by considering the model problem of a laminated beam subjected to three-point bending. A combination of experimental and theoretical research techniques was used to correlate the experimental results with the analytical stress distributions. The analytical solution procedure was based on the stress formulation approach of the mathematical theory of elasticity. The solution procedure is capable of calculating the ply-level stresses and beam displacements for any laminated beam of finite length using the generalized plane deformation or plane stress state assumption. Prior to conducting the experimental phase, the results from preliminary analyses were examined. Significant effects in the ply-level stress distributions were seen depending on the fiber orientation, aspect ratio, and whether a grouped or interspersed stacking sequence was used. The experimental investigation was conducted to determine the different damage modes in laminated three-point bend specimens. The test matrix consisted of three-point bend specimens of 0 deg unidirectional, cross-ply, and quasi-isotropic stacking sequences. The damage initiation loads and ultimate failure loads were studied, and their relation to the damage susceptibility and damage tolerance of the beam configuration was discussed. Damage modes were identified by visual inspection of the damaged specimens using an optical microscope. The four fundamental damage mechanisms identified were delaminations, matrix cracking, fiber breakage, and crushing. The correlation study between the experimental and analytical results was performed for the midspan deflection, indentation, damage modes, and damage susceptibility.

  3. Investigation of practical applications of H infinity control theory to the design of control systems for large space structures

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis

    1988-01-01

    The applicability of H infinity control theory to the problems of large space structure (LSS) control was investigated. A complete evaluation of any technique as a candidate for large space structure control involves analytical evaluation, algorithmic evaluation, evaluation via simulation studies, and experimental evaluation. The results of the analytical and algorithmic evaluations are documented. The analytical evaluation involves determining the appropriateness of the underlying assumptions inherent in H infinity theory, determining the capability of H infinity theory to achieve the design goals likely to be imposed on an LSS control design, and identifying any LSS-specific simplifications or complications of the theory. The results of the analytical evaluation are presented in the form of a tutorial on H infinity control theory with the LSS control designer in mind. The algorithmic evaluation of H infinity for LSS control pertains to the identification of general, high-level algorithms for effecting the application of H infinity to LSS control problems, the identification of specific, numerically reliable algorithms necessary for a computer implementation of the general algorithms, the recommendation of a flexible software system for implementing the H infinity design steps, and ultimately the actual development of the necessary computer codes. Finally, the state of the art in H infinity applications is summarized with a brief outline of the most promising areas of current research.

  4. Moving your laboratories to the field--Advantages and limitations of the use of field portable instruments in environmental sample analysis.

    PubMed

    Gałuszka, Agnieszka; Migaszewski, Zdzisław M; Namieśnik, Jacek

    2015-07-01

    The recent rapid progress in the technology of field portable instruments has increased their application in environmental sample analysis. These instruments offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid, and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet-visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis allows on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on instrument performance, and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Analytical and functional similarity of Amgen biosimilar ABP 215 to bevacizumab

    PubMed Central

    Seo, Neungseon; Polozova, Alla; Zhang, Mingxuan; Yates, Zachary; Cao, Shawn; Li, Huimin; Kuhns, Scott; Maher, Gwendolyn; McBride, Helen J.; Liu, Jennifer

    2018-01-01

    ABP 215 is a biosimilar product to bevacizumab. Bevacizumab acts by binding to vascular endothelial growth factor A, inhibiting endothelial cell proliferation and new blood vessel formation, thereby leading to tumor vasculature normalization. The ABP 215 analytical similarity assessment was designed to assess the structural and functional similarity of ABP 215 and bevacizumab sourced from both the United States (US) and the European Union (EU). Similarity assessment was also made between the US- and EU-sourced bevacizumab to assess the similarity between the two products. The physicochemical properties and structural similarity of ABP 215 and bevacizumab were characterized using sensitive state-of-the-art analytical techniques capable of detecting small differences in product attributes. ABP 215 has the same amino acid sequence and exhibits similar post-translational modification profiles compared to bevacizumab. The functional similarity assessment employed orthogonal assays designed to interrogate all expected biological activities, including those known to affect the mechanisms of action for ABP 215 and bevacizumab. More than 20 batches of bevacizumab (US) and bevacizumab (EU), and 13 batches of ABP 215 representing unique drug substance lots were assessed for similarity. The large dataset allows meaningful comparisons and garners confidence in the overall conclusion for the analytical similarity assessment of ABP 215 to both US- and EU-sourced bevacizumab. The structural and purity attributes, and biological properties of ABP 215 are demonstrated to be highly similar to those of bevacizumab. PMID:29553864

  6. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. This work presents novel methodological and instrumental solutions for different variants of solid-phase extraction techniques: solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE), and magnetic solid-phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention is paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, make it possible to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Ultra-wideband sensors for improved magnetic resonance imaging, cardiovascular monitoring and tumour diagnostics.

    PubMed

    Thiel, Florian; Kosch, Olaf; Seifert, Frank

    2010-01-01

    The specific advantages of ultra-wideband electromagnetic remote sensing (UWB radar) make it a particularly attractive technique for biomedical applications. We partially review our activities in utilizing this novel approach for the benefit of high- and ultra-high-field magnetic resonance imaging (MRI) and other applications, e.g., intensive care medicine and biomedical research. We have shown that our approach is beneficial for applications such as motion tracking for high-resolution brain imaging, owing to the non-contact acquisition of involuntary head motions with high spatial resolution; navigation for cardiac MRI, owing to our interpretation of the detected physiological mechanical contraction of the heart muscle; and MR safety, since we have investigated the influence of high static magnetic fields on myocardial mechanics. From our findings we conclude that UWB radar can serve as a navigator technique for high- and ultra-high-field magnetic resonance imaging while preserving the high-resolution capability of this imaging modality. Furthermore, it can potentially be used to support standard ECG analysis with complementary information where ECG analysis alone fails. Further analytical investigations have proven the feasibility of this method for the detection of intracranial displacements and the rendition of a tumour's contrast-agent-based perfusion dynamics. Besides these analytical approaches, we have carried out FDTD simulations of a complex arrangement mimicking the illumination of a human torso model incorporating the geometry of the applied antennas.

  8. Thermal-Structural Optimization of Integrated Cryogenic Propellant Tank Concepts for a Reusable Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Waters, W. Allen; Singer, Thomas N.; Haftka, Raphael T.

    2004-01-01

    A next-generation reusable launch vehicle (RLV) will require thermally efficient and lightweight cryogenic propellant tank structures. Since these tanks will be weight-critical, analytical tools must be developed to aid in sizing the thickness of insulation layers and the structural geometry for optimal performance. Finite element method (FEM) models of the tank and insulation layers were created to analyze the thermal performance of the cryogenic insulation layer and thermal protection system (TPS) of the tanks. The thermal conditions of ground-hold and re-entry/soak-through for a typical RLV mission were used in the thermal sizing study. A general-purpose nonlinear FEM analysis code, capable of using temperature- and pressure-dependent material properties, was used as the thermal analysis code. Mechanical loads from ground handling and proof-pressure testing were used to size the structural geometry of an aluminum cryogenic tank wall. Nonlinear deterministic optimization and reliability optimization techniques were the analytical tools used to size the geometry of the isogrid stiffeners and the thickness of the skin. The results from the sizing study indicate that a commercial FEM code can be used for thermal analyses to size the insulation thicknesses where the temperature and pressure vary. The results from the structural sizing study show that combined deterministic and reliability optimization techniques can yield alternative, lighter designs than deterministic optimization methods alone.

  9. A Methodology for Determining Statistical Performance Compliance for Airborne Doppler Radar with Forward-Looking Turbulence Detection Capability

    NASA Technical Reports Server (NTRS)

    Bowles, Roland L.; Buck, Bill K.

    2009-01-01

    The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward-looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used due to the near impossibility of obtaining sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) Radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides one means, but not the only means, by which an applicant can demonstrate compliance with the FAA-directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risk to aircraft, passengers, and crew, and were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this methodology for calculating the probability of missed and false hazard indications, taking into account the effect of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.
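    Analytically estimating missed- and false-hazard probabilities ultimately comes down to tail integrals of the hazard estimate's error distribution around the alert threshold. A minimal sketch under an assumed Gaussian measurement model (the report's actual statistical model is more detailed; the hazard values and threshold below are invented):

```python
import math

def exceedance_prob(mean, threshold, sigma):
    """P(estimate >= threshold) for a Gaussian estimate ~ N(mean, sigma^2)."""
    return 0.5 * math.erfc((threshold - mean) / (sigma * math.sqrt(2.0)))

# false-hazard indication: true hazard well below the limit,
# but the noisy estimate still crosses the alert threshold
p_false = exceedance_prob(mean=0.0, threshold=1.645, sigma=1.0)

# missed-hazard indication: true hazard above the limit,
# but the noisy estimate stays below the alert threshold
p_missed = 1.0 - exceedance_prob(mean=2.0, threshold=1.645, sigma=1.0)
```

Sweeping the threshold trades the two error probabilities against each other, which is the balance a certification methodology has to demonstrate under operational conditions.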

  10. Evidence for Extended Aqueous Alteration in CR Carbonaceous Chondrites

    NASA Technical Reports Server (NTRS)

    Trigo-Rodriquez, J. M.; Moyano-Cambero, C. E.; Mestres, N.; Fraxedas, J.; Zolensky, M.; Nakamura, T.; Martins, Z.

    2013-01-01

    We are currently studying the chemical interrelationships between the main rock-forming components of carbonaceous chondrites (hereafter CC), e.g. silicate chondrules, refractory inclusions and metal grains, and the surrounding meteorite matrices. It is thought that the fine-grained materials that form CC matrices represent samples of relatively unprocessed protoplanetary disk materials [1-3]. In fact, modern non-destructive analytical techniques have shown that CC matrices host a large diversity of stellar grains from many distinguishable stellar sources [4]. Aqueous alteration has played a role in homogenizing the isotopic content that allows the identification of presolar grains [5]. On the other hand, detailed analytical techniques have found that the aqueously altered CR, CM and CI chondrite groups contain matrices in which the organic matter has experienced significant processing concomitant with the formation of clays and other minerals. In this sense, clays have been found to be directly associated with complex organics [6, 7]. CR chondrites are particularly relevant in this context as this chondrite group contains abundant metal grains in the interstitial matrix and inside glassy silicate chondrules. This is important because CR chondrites are known for exhibiting a large complexity of organic compounds [8-10], and only metallic Fe is considered essential in Fischer-Tropsch catalysis of organics [11-13]. Therefore, CR chondrites can be considered primitive materials capable of providing clues to the role played by aqueous alteration in the chemical evolution of their parent asteroids.

  11. High-Fidelity Generalization Method of Cells for Inelastic Periodic Multiphase Materials

    NASA Technical Reports Server (NTRS)

    Aboudi, Jacob; Pindera, Marek-Jerzy; Arnold, Steven M.

    2002-01-01

    An extension of a recently-developed linear thermoelastic theory for multiphase periodic materials is presented which admits inelastic behavior of the constituent phases. The extended theory is capable of accurately estimating both the effective inelastic response of a periodic multiphase composite and the local stress and strain fields in the individual phases. The model is presently limited to materials characterized by constituent phases that are continuous in one direction, but arbitrarily distributed within the repeating unit cell which characterizes the material's periodic microstructure. The model's analytical framework is based on the homogenization technique for periodic media, but the method of solution for the local displacement and stress fields borrows concepts previously employed by the authors in constructing the higher-order theory for functionally graded materials, in contrast with the standard finite-element solution method typically used in conjunction with the homogenization technique. The present approach produces a closed-form macroscopic constitutive equation for a periodic multiphase material valid for both uniaxial and multiaxial loading. The model's predictive accuracy in generating both the effective inelastic stress-strain response and the local stress and inelastic strain fields is demonstrated by comparison with the results of an analytical inelastic solution for the axisymmetric and axial shear response of a unidirectional composite based on the concentric cylinder model, and with finite-element results for transverse loading.

  12. One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.

    PubMed

    Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz

    2009-07-15

    The existing solid-phase microextraction (SPME) kinetic calibration technique, which uses the desorption of preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which limits the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic environmental variability, such as temperature, turbulence, and analyte concentration, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of the passive samplings in the flow-through system illustrated that the effect of the environmental variables was successfully compensated for with the kinetic calibration technique, and all extracted analytes could be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and will also be useful for other microextraction techniques.
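    For readers unfamiliar with kinetic calibration, the textbook form of the underlying relation is n/ne + Q/q0 = 1, where n is the amount of analyte extracted, ne the equilibrium amount (Kfs·Vf·Cs), Q the preloaded standard remaining, and q0 the standard initially loaded. The sketch below solves this for the sample concentration; it is a generic illustration with hypothetical numbers, not the paper's exact one-calibrant derivation.

```python
# Hedged sketch of SPME kinetic calibration (textbook form): the desorbed
# fraction of the preloaded standard, 1 - Q/q0, equals the extracted
# fraction n/ne, so the equilibrium amount ne and hence the sample
# concentration Cs can be inferred from a single desorption measurement.
def kinetic_calibration_conc(n, Q, q0, Kfs, Vf):
    """Estimate analyte concentration from one desorption calibrant.
    n   : amount of analyte extracted onto the fiber
    Q   : amount of preloaded standard remaining on the fiber
    q0  : amount of standard initially loaded
    Kfs : fiber/sample distribution constant of the analyte
    Vf  : fiber coating volume
    """
    desorbed_fraction = 1.0 - Q / q0   # equals n/ne under the model
    ne = n / desorbed_fraction         # inferred equilibrium amount
    return ne / (Kfs * Vf)             # Cs = ne / (Kfs * Vf)

# Hypothetical values: 40% of the standard desorbed during sampling.
Cs = kinetic_calibration_conc(n=2.0e-9, Q=0.6e-9, q0=1.0e-9,
                              Kfs=5000.0, Vf=1.0e-6)
```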

  13. SmartR: an open-source platform for interactive visual analytics for translational research data

    PubMed Central

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-01-01

    Summary: In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytics capabilities will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytics workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. Availability and Implementation: The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28334291

  14. SmartR: an open-source platform for interactive visual analytics for translational research data.

    PubMed

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytics capabilities will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytics workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  15. Energy absorption capabilities of composite sandwich panels under blast loads

    NASA Astrophysics Data System (ADS)

    Sankar Ray, Tirtha

    As blast threats on military and civilian structures continue to be a significant concern, there remains a need for improved design strategies to increase blast resistance capabilities. The approach to blast resistance proposed here is focused on dissipating the high levels of pressure induced during a blast by maximizing the potential for energy absorption of composite sandwich panels, which are a competitive structural member type due to the inherent energy absorption capabilities of fiber reinforced polymer (FRP) composites. Furthermore, the middle core in the sandwich panels can be designed as a sacrificial layer, allowing a significant amount of deformation or progressive failure to maximize the potential for energy absorption. The research here is aimed at the optimization of composite sandwich panels for blast mitigation via energy absorption mechanisms. The energy absorption mechanisms considered include absorbed strain energy due to inelastic deformation as well as energy dissipation through progressive failure of the core of the sandwich panels. The methods employed in the research consist of a combination of experimentally validated finite element analysis (FEA) and the derivation and use of a simplified analytical model. The key components of the scope of work include: establishment of quantified energy absorption criteria, validation of the selected FE modeling techniques, development of the simplified analytical model, investigation of influential core architectures and geometric parameters, and investigation of influential material properties. For the parameters identified as most influential, recommended values are suggested in conceptual terms conducive to designing composite sandwich panels for various blast threats.
    Based on a review of the energy response characteristics of the panel under blast loading, a non-dimensional parameter AET/ET (absorbed energy, AET, normalized by total energy, ET) was suggested to compare the energy absorption capabilities of structures under blast loading. In addition, AEweb/ET (where AEweb is the energy absorbed by the middle core) was also employed to evaluate the energy absorption contribution from the web. Taking advantage of FEA and the simplified analytical model, the influences of material properties as well as core architectures and geometries on energy absorption capabilities (quantified by AET/ET and AEweb/ET) were investigated through parametric studies. Results from the material property investigation indicated that the density and strength of the front face sheet were most influential on the energy absorption capability of the composite sandwich panels under blast loading. The study comparing energy absorbed via inelastic deformation with energy absorbed via progressive failure indicated that for practical applications (where the position of the bomb is usually unknown and the panel is designed uniformly throughout), energy absorption via inelastic deformation is the more efficient approach. Regarding the geometric optimization, it was found that a core architecture consisting of vertically oriented webs was ideal. The optimum values for these parameters can be generally described as those which cause the most inelasticity, but not failure, of the face sheets and webs.
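    The non-dimensional measures above reduce to simple ratios. The sketch below computes AET/ET and AEweb/ET from assumed energies; all numbers are hypothetical, chosen only to show the normalization, not results from the dissertation.

```python
# Minimal illustration of the energy-absorption metrics: absorbed energy
# normalized by total input energy, for the whole panel (AET/ET) and for
# the web core alone (AEweb/ET).
def absorption_ratios(AE_total, AE_web, E_total):
    """Return (AET/ET, AEweb/ET) for a single blast event."""
    return AE_total / E_total, AE_web / E_total

# Hypothetical energies in kJ: panel absorbs 42 of 60 kJ, webs absorb 18.
panel_ratio, web_ratio = absorption_ratios(AE_total=42.0, AE_web=18.0,
                                           E_total=60.0)
```

    A larger `panel_ratio` indicates a panel that dissipates more of the blast input; comparing `web_ratio` across core architectures isolates the core's contribution.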

  16. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step for the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and may therefore influence the analytical performance of target analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low or no solvent consumption. Microextraction techniques, such as microextraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS formats (on- and off-line), sorbents, experimental protocols, factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent applications of MEPS in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Gas chromatography coupled to tunable pulsed glow discharge time-of-flight mass spectrometry for environmental analysis.

    PubMed

    Solà-Vázquez, Auristela; Lara-Gonzalo, Azucena; Costa-Fernández, José M; Pereiro, Rosario; Sanz-Medel, Alfredo

    2010-05-01

    A tunable microsecond pulsed direct current glow discharge (GD) time-of-flight mass spectrometer (MS(TOF)) developed in our laboratory was coupled to a gas chromatograph (GC) to obtain sequential collection of the mass spectra, at the different temporal regimes occurring in the GD pulses, during elution of the analytes. The capabilities of this set-up were explored using a mixture of volatile organic compounds of environmental concern: CH2BrCl, CHCl3, CCl4, CHBrCl2, CHBr2Cl and CHBr3. The experimental parameters of the GC-pulsed GD-MS(TOF) prototype were optimized in order to appropriately separate and analyze the six selected organic compounds, and two GC carrier gases, helium and nitrogen, were evaluated. Mass spectra for all analytes were obtained in the prepeak, plateau and afterpeak temporal regimes of the pulsed GD. Results showed that helium offered the best elemental sensitivity, while nitrogen provided higher signal intensities for fragment and molecular peaks. The analytical performance characteristics were also worked out for each analyte. Absolute detection limits obtained were on the order of nanograms. In a second step, headspace solid phase microextraction (HS-SPME), as a sample preparation and preconcentration technique, was evaluated for the quantification of the compounds under study, in order to achieve the analytical sensitivity required by European Union (EU) environmental legislation for trihalomethanes. The analytical figures of merit obtained using the proposed methodology showed rather good detection limits (between 2 and 13 µg/L depending on the analyte). In fact, the developed methodology met the EU legislation requirements (the maximum level permitted in tap water for "total trihalomethanes" is set at 100 µg/L). Analyses of real drinking water and river water samples were successfully carried out. To our knowledge this is the first application of GC-pulsed GD-MS(TOF) to the analysis of real samples. Its ability to provide elemental, fragment and molecular information on organic compounds is demonstrated.
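    The regulatory check described above is a sum over the regulated trihalomethanes compared against the 100 µg/L EU limit. The sketch below illustrates that check; the concentrations are hypothetical, not measurements from the paper.

```python
# Hedged example of the EU "total trihalomethanes" compliance check:
# the limit applies to the sum of the regulated THM concentrations.
EU_TOTAL_THM_LIMIT_UG_L = 100.0

def total_thm_ok(conc_ug_per_l):
    """conc_ug_per_l: dict mapping THM name -> concentration in µg/L.
    Returns (total, compliant) against the 100 µg/L tap-water limit."""
    total = sum(conc_ug_per_l.values())
    return total, total <= EU_TOTAL_THM_LIMIT_UG_L

# Hypothetical tap-water measurements in µg/L.
total, compliant = total_thm_ok({"CHCl3": 40.0, "CHBrCl2": 12.0,
                                 "CHBr2Cl": 8.0, "CHBr3": 5.0})
```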

  18. EPA Region 6 Laboratory Method Specific Analytical Capabilities with Sample Concentration Range

    EPA Pesticide Factsheets

    The EPA Region 6 Environmental Services Branch (ESB) Laboratory is capable of analyzing a wide range of samples, with concentrations ranging from low parts-per-trillion (ppt) to low percent (%) levels, depending on the sample matrix.

  19. SPECIATION OF ARSENIC IN EXPOSURE ASSESSMENT MATRICES

    EPA Science Inventory

    The speciation of arsenic in water, food and urine is an analytical capability that is an essential part of arsenic risk assessment. The cancer risk associated with arsenic has been the driving force in generating the analytical research in each of these matrices. This presentat...

  20. Marine resources. [coastal processes, ice, oceanography, and living marine resources

    NASA Technical Reports Server (NTRS)

    Tilton, E. L., III

    1974-01-01

    Techniques have been developed for defining coastal circulation patterns using sediment as a natural tracer, allowing the formulation of new circulation concepts in some geographical areas and, in general, a better capability for defining the seasonal characteristics of coastal circulation. An analytical technique for measurement of absolute water depth based upon the ratios of two MSS channels has been developed. Suspended sediment has found wide use as a tracer, but a few investigators have reported limited success in measuring the type and amount of sediment quantitatively from ERTS-1 digital data. Significant progress has been made in developing techniques for using ERTS-1 data to locate, identify, and monitor sea and lake ice. Ice features greater than 70 meters in width can be detected, and both arctic and antarctic icebergs have been identified. In the application area of living marine resources, the use of ERTS-1 image-density patterns as a potential indicator of fish school location has been demonstrated for one coastal commercial resource, menhaden. ERTS-1 data have been used to locate ocean current boundaries using ERTS-1 image-density enhancement, and some techniques are under development for measurement of suspended particle concentration and chlorophyll concentration. The interrelationship of water color and surface characteristics (sea state) are also being studied to improve spectral and spatial interpretive techniques.
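    The two-channel ratio technique for water depth mentioned above can be illustrated with the later, widely used log-ratio form (Stumpf et al.): depth varies nearly linearly with the ratio of the log radiances of two bands, with a gain and offset tuned against known depths. The functional form here is that later published model, not necessarily the ERTS-1-era formulation, and all numbers are hypothetical.

```python
# Hedged sketch of band-ratio bathymetry: estimated depth from the ratio of
# log-transformed reflectances in two bands. m1 (gain), m0 (offset) and n
# (a scaling constant keeping the logs positive) must be calibrated against
# in-situ soundings; the values below are placeholders.
import math

def ratio_depth(r_band1, r_band2, m1=20.0, m0=10.0, n=1000.0):
    """Estimate water depth from two-band reflectances via the log-ratio model."""
    return m1 * math.log(n * r_band1) / math.log(n * r_band2) - m0

depth_a = ratio_depth(0.08, 0.10)   # hypothetical pixel
depth_b = ratio_depth(0.05, 0.10)   # lower band-1 signal -> smaller ratio
```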

  1. Experimental and analytical determination of stability parameters for a balloon tethered in a wind

    NASA Technical Reports Server (NTRS)

    Redd, L. T.; Bennett, R. M.; Bland, S. R.

    1973-01-01

    Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.
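    The stability parameters above enter as coefficients of linearized equations of motion, and the standard way to assess such a system is through the eigenvalues of its state matrix: the tethered system is stable when every eigenvalue has a negative real part. The sketch below shows that check on a hypothetical 2-state matrix, not the balloon model from the report.

```python
# Generic linear-stability check: a system x' = A x is asymptotically stable
# iff all eigenvalues of A have negative real parts. The matrix entries here
# stand in for stability derivatives and are purely illustrative.
import numpy as np

def is_stable(A):
    """Return True if every eigenvalue of A lies in the open left half-plane."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

A_example = np.array([[-0.5,  1.0],
                      [-2.0, -0.3]])  # hypothetical 2-state system
stable = is_stable(A_example)
```

    In practice the experimentally determined coefficients would populate `A`, and marginal eigenvalues flag the parameter combinations where the tethered balloon loses stability.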

  2. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  3. Detection Techniques for Biomolecules using Semi-Conductor Nanocrystals and Magnetic Beads as Labels

    NASA Astrophysics Data System (ADS)

    Chatterjee, Esha

    Continued interest in the development of miniaturized and portable analytical platforms necessitates the exploration of sensitive methods for the detection of trace analytes. Nanomaterials, on account of their unique physical and chemical properties, are not only able to overcome many limitations of traditional detection reagents but also enable the exploration of many new signal transduction technologies. This dissertation presents a series of investigations of alternative detection techniques for biomolecules, involving the use of semi-conductor nanocrystals and magnetic beads as labels. Initial research focused on the development of quantum dot-encapsulating liposomes as a novel fluorescent label for immunoassays. This hybrid nanomaterial was anticipated to overcome the drawbacks presented by traditional fluorophores as well as provide significant signal amplification. Quantum dot-encapsulating liposomes were synthesized by the method of thin film hydration and characterized. The utility of these composite nanostructures for bioanalysis was demonstrated. However, the long-term instability of the liposomes hampered quantitative development. A second approach for assay development exploited the ability of gold nanoparticles to quench the optical signals obtained from quantum dots. The goal of this study was to demonstrate the feasibility of using aptamer-linked nanostructures in FRET-based quenching for the detection of proteins. Thrombin was used as the model analyte in this study. Experimental parameters for the assay were optimized. The assay simply required the mixing of the sample with the reagents and could be completed in less than an hour. The limit of detection for thrombin by this method was 5 nM. This homogeneous assay can be easily adapted for the detection of a wide variety of biochemicals. The novel technique of ferromagnetic resonance generated in magnetic bead labels was explored for signal transduction.
    This inductive detection technique lends itself to miniaturization, is amenable to mass production and is inexpensive to fabricate. The device consisted of a microwave circuit in which a slotline and a coplanar waveguide were integrated with a biochemically activated sensor area. The magnetic beads were immobilized at the sensor area by bio-specific reactions. Experiments conducted on this prototype show promising results for using ferromagnetic resonance-based detection of magnetic labels for the fabrication of portable and inexpensive sensor devices. The next stage of work addresses the issue of patterning of sensing surfaces with biomolecules. The ability to selectively immobilize biomolecules on surfaces has far-reaching applications, including sensor development. A simple and widely applicable method for the photopatterning of chitosan films with biotin was presented. Chitosan is a biocompatible and biodegradable polymer. The proposed method was capable of forming spatially defined biotin features on the order of tens of microns, together with a significant reduction of non-specific protein binding and an increase in hydrophilicity of the sensor surface. The entire patterning process, inclusive of the blocking step, could be completed in under an hour. This straightforward method for the selective patterning of the biocompatible polymer chitosan is expected to be widely useful in the field of bioanalysis.

  4. Recent advances in merging photonic crystals and plasmonics for bioanalytical applications.

    PubMed

    Liu, Bing; Monshat, Hosein; Gu, Zhongze; Lu, Meng; Zhao, Xiangwei

    2018-05-29

    Photonic crystals (PhCs) and plasmonic nanostructures offer the unprecedented capability to control the interaction of light and biomolecules at the nanoscale. Based on PhC and plasmonic phenomena, a variety of analytical techniques have been demonstrated and successfully implemented in many fields, such as biological sciences, clinical diagnosis, drug discovery, and environmental monitoring. During the past decades, PhC and plasmonic technologies have progressed in parallel, each with its pros and cons. Merging photonic crystals with plasmonics can significantly improve biosensor performance and enlarge the linear detection range for analytical targets. Here, we review the state-of-the-art biosensors that combine PhC and plasmonic nanomaterials for quantitative analysis. The optical mechanisms of PhCs, plasmonic crystals, and metal nanoparticles (NPs) are presented, along with their integration and potential applications. By explaining the optical coupling of photonic crystals and plasmonics, the review shows how PhC-plasmonic hybrid biosensors can achieve advantages including high sensitivity, low cost, and short assay time. The review also discusses the challenges and future opportunities in this fascinating field.

  5. Carbon Nanotube Fiber Ionization Mass Spectrometry: A Fundamental Study of a Multi-Walled Carbon Nanotube Functionalized Corona Discharge Pin for Polycyclic Aromatic Hydrocarbons Analysis.

    PubMed

    Nahan, Keaton S; Alvarez, Noe; Shanov, Vesselin; Vonderheide, Anne

    2017-11-01

    Mass spectrometry continues to tackle many complicated tasks, and ongoing research seeks to simplify its instrumentation as well as sampling. The desorption electrospray ionization (DESI) source was the first ambient ionization source to function without extensive gas requirements and chromatography. Electrospray techniques generally have low efficiency for ionization of nonpolar analytes, and some researchers have resorted to methods such as direct analysis in real time (DART) or desorption atmospheric pressure chemical ionization (DAPCI) for their analysis. In this work, a carbon nanotube fiber ionization (nanoCFI) source was developed and was found to be capable of solid phase microextraction (SPME) of nonpolar analytes as well as ionization and sampling similar to that of direct probe atmospheric pressure chemical ionization (DP-APCI). Conductivity and adsorption were maintained by utilizing a corona pin functionalized with a multi-walled carbon nanotube (MWCNT) thread. Quantitative work with the nanoCFI source with a designed corona discharge pin insert demonstrated linearity up to 0.97 (R2) for three target PAHs, using phenanthrene as internal standard.
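    The internal-standard quantitation mentioned above works by regressing the analyte/internal-standard peak-area ratio against concentration and reporting the linearity (R²) of the fit. The sketch below shows that generic calibration step; the data points are hypothetical, not the paper's PAH measurements.

```python
# Hedged sketch of internal-standard calibration: least-squares line through
# (concentration, analyte/IS area ratio) pairs, returning slope, intercept
# and the coefficient of determination R^2.
import numpy as np

def calibration_r2(conc, area_ratio):
    """Fit area_ratio ~ slope*conc + intercept; return (slope, intercept, R^2)."""
    conc = np.asarray(conc, dtype=float)
    ratio = np.asarray(area_ratio, dtype=float)
    slope, intercept = np.polyfit(conc, ratio, 1)
    fitted = slope * conc + intercept
    ss_res = float(np.sum((ratio - fitted) ** 2))
    ss_tot = float(np.sum((ratio - ratio.mean()) ** 2))
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical standards (e.g. µg/mL) and measured area ratios.
slope, intercept, r2 = calibration_r2([1, 2, 5, 10, 20],
                                      [0.11, 0.19, 0.52, 0.98, 2.05])
```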

  6. COBRA ATD multispectral camera response model

    NASA Astrophysics Data System (ADS)

    Holmes, V. Todd; Kenton, Arthur C.; Hilton, Russell J.; Witherspoon, Ned H.; Holloway, John H., Jr.

    2000-08-01

    A new multispectral camera response model has been developed in support of the US Marine Corps (USMC) Coastal Battlefield Reconnaissance and Analysis (COBRA) Advanced Technology Demonstration (ATD) Program. This analytical model accurately estimates the response of five Xybion intensified IMC 201 multispectral cameras used for COBRA ATD airborne minefield detection. The camera model design is based on a series of camera response curves which were generated through optical laboratory tests performed by the Naval Surface Warfare Center, Dahlgren Division, Coastal Systems Station (CSS). Data fitting techniques were applied to these measured response curves to obtain nonlinear expressions which estimate digitized camera output as a function of irradiance, intensifier gain, and exposure. This COBRA camera response model proved to be very accurate, stable over a wide range of parameters, analytically invertible, and relatively simple. This practical camera model was subsequently incorporated into the COBRA sensor performance evaluation and research analysis modeling toolbox in order to enhance COBRA modeling and simulation capabilities. Details of the camera model design and comparisons of modeled response to measured experimental data are presented.
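    The data-fitting step described above can be illustrated with a generic saturating response: digitized output modeled as dmax·(1 − exp(−k·E)) for irradiance E at fixed gain and exposure. The functional form, the fitting method, and the data below are all assumptions for illustration, not the actual Xybion response expressions.

```python
# Hedged sketch of fitting a nonlinear camera response curve: grid-search the
# rate constant k, solving the amplitude dmax in closed form (linear least
# squares) for each candidate, and keep the pair with the smallest error.
import numpy as np

def fit_saturating_response(irradiance, counts):
    """Fit counts ~= dmax * (1 - exp(-k * irradiance)); return (dmax, k, sse)."""
    E = np.asarray(irradiance, dtype=float)
    D = np.asarray(counts, dtype=float)
    best = (0.0, 0.0, np.inf)
    for k in np.linspace(0.01, 5.0, 500):
        basis = 1.0 - np.exp(-k * E)
        dmax = float(np.dot(basis, D) / np.dot(basis, basis))  # LS amplitude
        sse = float(np.sum((D - dmax * basis) ** 2))
        if sse < best[2]:
            best = (dmax, k, sse)
    return best

# Synthetic "measured" curve: dmax = 250 counts, k = 0.8 per unit irradiance.
E = np.array([0.1, 0.5, 1.0, 2.0, 4.0])
measured = 250.0 * (1.0 - np.exp(-0.8 * E))
dmax_fit, k_fit, sse = fit_saturating_response(E, measured)
```

    Because the amplitude enters linearly, only the rate constant needs a search; this separation is a common trick when fitting response curves of this shape.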

  7. Visual analytics of inherently noisy crowdsourced data on ultra high resolution displays

    NASA Astrophysics Data System (ADS)

    Huynh, Andrew; Ponto, Kevin; Lin, Albert Yu-Min; Kuester, Falko

    The increasing prevalence of distributed human microtasking, or crowdsourcing, has followed the exponential increase in data collection capabilities. The large scale and distributed nature of these microtasks produce overwhelming amounts of information that is inherently noisy due to the nature of human input. Furthermore, these inputs create a constantly changing dataset, with additional information added on a daily basis. Methods to quickly visualize, filter, and understand this information over temporal and geospatial constraints are key to the success of crowdsourcing. This paper presents novel methods to visually analyze geospatial data collected through crowdsourcing on top of remote sensing satellite imagery. An ultra high resolution tiled display system is used to explore the relationship between human and satellite remote sensing data at scale. A case study is provided that evaluates the presented technique in the context of an archaeological field expedition. A team in the field communicated in real time with, and was guided by, researchers in the remote visual analytics laboratory, swiftly sifting through incoming crowdsourced data to identify locations that were flagged as viable archaeological sites.
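    One common way to filter inherently noisy crowd input of this kind (not necessarily the authors' method) is to bin geotagged marks into grid cells and keep only cells where several independent contributors agree. The sketch below shows that consensus filter with hypothetical coordinates.

```python
# Hedged sketch of consensus filtering for noisy crowdsourced geotags: bin
# (lat, lon) marks into cells of size `cell_size` degrees and keep cells
# receiving at least `min_votes` independent marks.
from collections import Counter

def consensus_cells(points, cell_size=0.01, min_votes=3):
    """points: iterable of (lat, lon) crowd marks; returns agreeing cells."""
    counts = Counter((round(lat / cell_size), round(lon / cell_size))
                     for lat, lon in points)
    return {cell for cell, n in counts.items() if n >= min_votes}

# Three contributors mark nearly the same spot; one mark is a lone outlier.
marks = [(43.701, 104.102), (43.7005, 104.1019), (43.7008, 104.1022),
         (43.9, 104.5)]
sites = consensus_cells(marks)
```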

  8. Carbon Nanotube Fiber Ionization Mass Spectrometry: A Fundamental Study of a Multi-Walled Carbon Nanotube Functionalized Corona Discharge Pin for Polycyclic Aromatic Hydrocarbons Analysis

    NASA Astrophysics Data System (ADS)

    Nahan, Keaton S.; Alvarez, Noe; Shanov, Vesselin; Vonderheide, Anne

    2017-09-01

    Mass spectrometry continues to tackle many complicated tasks, and ongoing research seeks to simplify its instrumentation as well as sampling. The desorption electrospray ionization (DESI) source was the first ambient ionization source to function without extensive gas requirements and chromatography. Electrospray techniques generally have low efficiency for ionization of nonpolar analytes and some researchers have resorted to methods such as direct analysis in real time (DART) or desorption atmospheric pressure chemical ionization (DAPCI) for their analysis. In this work, a carbon nanotube fiber ionization (nanoCFI) source was developed and was found to be capable of solid phase microextraction (SPME) of nonpolar analytes as well as ionization and sampling similar to that of direct probe atmospheric pressure chemical ionization (DP-APCI). Conductivity and adsorption were maintained by utilizing a corona pin functionalized with a multi-walled carbon nanotube (MWCNT) thread. Quantitative work with the nanoCFI source with a designed corona discharge pin insert demonstrated linearity up to 0.97 (R2) of three target PAHs with phenanthrene internal standard. [Figure not available: see fulltext.

  9. Can matrix solid phase dispersion (MSPD) be more simplified? Application of solventless MSPD sample preparation method for GC-MS and GC-FID analysis of plant essential oil components.

    PubMed

    Wianowska, Dorota; Dawidowicz, Andrzej L

    2016-05-01

    This paper proposes and demonstrates the analytical capabilities of a new variant of matrix solid phase dispersion (MSPD) with a solventless blending step in the chromatographic analysis of plant volatiles. The obtained results show that the use of a solvent is redundant, as the sorption ability of the octadecyl brush is sufficient for quantitative retention of volatiles from 9 plants differing in their essential oil composition. The extraction efficiency of the proposed simplified MSPD method is equivalent to that of the commonly applied variant of MSPD with an organic dispersing liquid, and to pressurized liquid extraction, a much more complex, technically advanced and highly efficient technique of plant extraction. The equivalency of these methods is confirmed by analysis of variance. The proposed solventless MSPD method is precise, accurate, and reproducible. The recovery of essential oil components estimated by the MSPD method exceeds 98%, which is satisfactory for analytical purposes. Copyright © 2016 Elsevier B.V. All rights reserved.
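    The equivalence argument above rests on a one-way analysis of variance: if the F-statistic across the three extraction methods stays below the critical value, the differences in recovery are indistinguishable from within-method scatter. The sketch below computes that F-statistic from scratch; the recovery numbers are hypothetical, not the paper's data.

```python
# Hedged illustration of one-way ANOVA for method equivalence: F is the ratio
# of between-group to within-group mean squares. A small F (well below the
# critical value, e.g. F(2, 6, 0.05) ~= 5.14) supports equivalence.
def one_way_anova_F(*groups):
    k = len(groups)                                  # number of methods
    n = sum(len(g) for g in groups)                  # total observations
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical percent recoveries from three replicate extractions per method.
F = one_way_anova_F([98.2, 99.1, 98.7],   # solventless MSPD
                    [98.5, 98.9, 99.0],   # classical MSPD
                    [98.4, 99.2, 98.6])   # pressurized liquid extraction
```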

  10. A Time Domain Impedance Probe For Ultra-Fast Measurements of Electron Parameters in the Ionosphere: Results From A NASA USIP Experiment

    NASA Astrophysics Data System (ADS)

    Clark, D. C.; Spencer, E. A.; Gollapalli, R.; Kerrigan, B.

    2016-12-01

    A plasma impedance probe is used to obtain plasma parameters in the ionosphere by measuring the magnitude, shape and location of resonances in the frequency spectrum when a probe structure is driven with RF excitation. We have designed and developed a new Time Domain Impedance Probe (TDIP) capable of making measurements of absolute electron density and electron-neutral collision frequency at temporal and spatial resolutions not previously attained. A single measurement can be made in a time as short as 100 microseconds, which yields much higher spatial resolution than a frequency sweep method. The method essentially consists of applying a small-amplitude, time-limited voltage signal to a probe and measuring the resulting current response. The frequency bandwidth of the voltage signal is selected so that the electron plasma resonances are observable. A prototype of the new instrument was flown at 08:45 EST on March 1, 2016 on a NASA Undergraduate Student Instrument Program (USIP) sounding rocket launched out of Wallops Flight Facility (flight time was around 20 minutes). Here we analyze the data from the sounding rocket experiment, using an adaptive system identification technique to compare the measured data with analytical formulas obtained from a theoretical consideration of the time domain response. The analytical formula is calibrated to a plasma fluid finite difference time domain (PFFDTD) numerical computation before using it to analyze the rocket data from 85 km to 170 km on both upleg and downleg. Our results show that the technique works as advertised, but several issues, including payload charging and signal rectification, remain to be resolved.
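    The identification step described above matches a measured current response against an analytical time-domain form. As a generic stand-in (not the paper's PFFDTD-calibrated formula), the sketch below fits a damped oscillation i(t) ≈ exp(−ν·t/2)·sin(ωp·t) to a synthetic 100 µs record, recovering the plasma angular frequency ωp and collision frequency ν by grid search.

```python
# Hedged system-identification sketch: for each candidate (wp, nu) pair,
# compute the model response, solve the amplitude by linear least squares,
# and keep the pair minimizing the sum of squared residuals.
import numpy as np

def fit_response(t, i_meas, wp_grid, nu_grid):
    """Return (wp, nu, sse) for the best-matching model response."""
    best = (0.0, 0.0, np.inf)
    for wp in wp_grid:
        for nu in nu_grid:
            model = np.exp(-0.5 * nu * t) * np.sin(wp * t)
            scale = float(np.dot(model, i_meas) / np.dot(model, model))
            sse = float(np.sum((i_meas - scale * model) ** 2))
            if sse < best[2]:
                best = (wp, nu, sse)
    return best

t = np.linspace(0.0, 1e-4, 2000)                   # 100 µs record
synthetic = np.exp(-0.5 * 2e4 * t) * np.sin(5e5 * t)  # "measured" current
wp_fit, nu_fit, sse = fit_response(t, synthetic,
                                   wp_grid=np.linspace(1e5, 1e6, 19),
                                   nu_grid=np.linspace(0.0, 1e5, 11))
```

    In the real instrument the model would come from the calibrated analytical formula, and an adaptive identifier would replace the brute-force grid; the structure of the fit is the same.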

  11. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    NASA Astrophysics Data System (ADS)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics are playing an increasingly vital role in generating new, critical insights into the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integrating impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal component analysis, and dynamic time warping; support uncertainty visualization and the exploration of data provenance; and incorporate machine learning to render diverse types of geospatial data and facilitate interactive analysis. Key components of the system architecture include NASA's WebWorldWind, the Globus toolkit, PostgreSQL, and other custom-built software modules.
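One capability listed above, dynamic time warping, can be sketched in a few lines. The implementation below is the textbook O(nm) dynamic program, not the platform's actual code, and the two series are invented stand-ins for time-shifted EWN signals.

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two sequences."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# Two demand-like curves, one lagged by a step: DTW tolerates the shift
# where a point-by-point (Euclidean) comparison would not.
s1 = [0, 1, 3, 4, 3, 1, 0]
s2 = [0, 0, 1, 3, 4, 3, 1]
print(dtw_distance(s1, s2))  # 1.0
```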

  12. INTEGRATED ENVIRONMENTAL ASSESSMENT OF THE MID-ATLANTIC REGION WITH ANALYTICAL NETWORK PROCESS

    EPA Science Inventory

    A decision analysis method for integrating environmental indicators was developed. This was a combination of Principal Component Analysis (PCA) and the Analytic Network Process (ANP). Being able to take into account interdependency among variables, the method was capable of ran...

  13. Anticipating Surprise: Analysis for Strategic Warning

    DTIC Science & Technology

    2002-12-01

    Intentions versus Capabilities ... Introduction to the Analytical Method ... Analysis ... Specifics of the Analytical Method ... intelligence. Why is it that “no one”—a slight but not great exaggeration—believes in the indications method, despite its demonstrably good record in these

  14. Comparison of closed loop model with flight test results

    NASA Technical Reports Server (NTRS)

    George, F. L.

    1981-01-01

    An analytic technique capable of predicting the landing characteristics of proposed aircraft configurations in the early stages of design was developed. In this analysis, a linear pilot-aircraft closed-loop model was evaluated using experimental data generated with the NT-33 variable-stability in-flight simulator. The pilot dynamics are modeled as inner and outer servo loop closures around aircraft pitch attitude and altitude rate-of-change, respectively. The landing flare maneuver is of particular interest, as recent experience with military and other highly augmented vehicles shows this task to be relatively demanding and potentially a critical design point. A unique feature of the pilot model is the incorporation of an internal model of the pilot's desired flight path for the flare maneuver.
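The inner/outer servo-loop structure described above can be illustrated with a minimal discrete-time simulation: an outer loop on altitude rate commands pitch attitude, and an inner first-order loop tracks that command. The gains, lag time constant, and airspeed below are hypothetical values chosen only for a stable demonstration, not parameters from the NT-33 study.

```python
# Nested-loop pilot-model sketch (all parameters hypothetical).
dt, tau, V = 0.01, 0.5, 70.0     # time step [s], attitude lag [s], airspeed [m/s]
k_h = 0.02                        # outer-loop gain [rad per (m/s) of error]
hdot_cmd = -3.0                   # commanded sink rate during flare [m/s]

theta = 0.0                       # pitch attitude [rad]
for _ in range(int(10.0 / dt)):   # simulate 10 s
    hdot = V * theta              # small-angle climb/sink rate
    theta_cmd = k_h * (hdot_cmd - hdot)        # outer (altitude-rate) loop
    theta += (theta_cmd - theta) * dt / tau    # inner (attitude) loop

# With proportional-only loops the sink rate settles at
# hdot_cmd * k_h*V / (1 + k_h*V), i.e. with a steady-state error.
print(round(V * theta, 3))  # -1.75
```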

  15. Improved Estimation of Electron Temperature from Rocket-borne Impedance Probes

    NASA Astrophysics Data System (ADS)

    Rowland, D. E.; Wolfinger, K.; Stamm, J. D.

    2017-12-01

    The impedance probe technique is a well known method for determining high accuracy measurements of electron number density in the Earth's ionosphere. We present analysis of impedance probe data from several sounding rockets at low, mid-, and auroral latitudes, including high cadence estimates of the electron temperature, derived from analytical fits to the antenna impedance curves. These estimates compare favorably with independent estimates from Langmuir Probes, but at much higher temporal and spatial resolution, providing a capability to resolve small-scale temperature fluctuations. We also present some considerations for the design of impedance probes, including assessment of the effects of resonance damping due to rocket motion, effects of wake and spin modulation, and aspect angle to the magnetic field.
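The basic link between a measured plasma resonance and electron density can be sketched from the cold-plasma relation n_e = ω_p² ε₀ m_e / e², neglecting the magnetic-field and sheath corrections that a real impedance-probe fit must include.

```python
import math

EPS0 = 8.8541878128e-12   # vacuum permittivity [F/m]
M_E  = 9.1093837015e-31   # electron mass [kg]
Q_E  = 1.602176634e-19    # elementary charge [C]

def density_from_plasma_freq(f_p_hz):
    """Electron number density [m^-3] from plasma frequency [Hz]."""
    w = 2.0 * math.pi * f_p_hz
    return w * w * EPS0 * M_E / (Q_E * Q_E)

# A 2.84 MHz plasma resonance corresponds to roughly 1e11 m^-3 (1e5 cm^-3),
# a typical ionospheric E-region value.
print(f"{density_from_plasma_freq(2.84e6):.3e}")
```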

  16. Space Shuttle Orbiter - Leading edge structural design/analysis and material allowables

    NASA Technical Reports Server (NTRS)

    Johnson, D. W.; Curry, D. M.; Kelly, R. E.

    1986-01-01

    Reinforced Carbon-Carbon (RCC), a structural composite whose development was targeted for the high temperature reentry environments of reusable space vehicles, has successfully demonstrated that capability on the Space Shuttle Orbiter. Unique mechanical properties, particularly at elevated temperatures up to 3000 F, make this material ideally suited for the 'hot' regions of multimission space vehicles. Design allowable characterization testing, full-scale development and qualification testing, and structural analysis techniques will be presented herein that briefly chart the history of the RCC material from infancy to eventual multimission certification for the Orbiter. Included are discussions pertaining to the development of the design allowable data base, manipulation of the test data into usable forms, and the analytical verification process.

  17. Elemental analysis by IBA and NAA — A critical comparison

    NASA Astrophysics Data System (ADS)

    Watterson, J. I. W.

    1988-12-01

    In this review, neutron activation analysis (NAA) and ion beam analysis (IBA) are compared in the context of the entire field of analytical science using the discipline of scientometrics, as developed by Braun and Lyon. This perspective on the relative achievements of the two methods is modified by considering and comparing their particular attributes and characteristics, particularly in relation to their differing degrees of maturity. This assessment shows that NAA, as the more mature method, is the most widely applied nuclear technique, while the special capabilities of IBA give it the unique ability to provide information about surface composition and elemental distribution; IBA, however, is still relatively immature, and it is not yet possible to define its ultimate role with any confidence.

  18. Critical factors determining the quantification capability of matrix-assisted laser desorption/ionization– time-of-flight mass spectrometry

    PubMed Central

    Wang, Chia-Chen; Lai, Yin-Hung; Ou, Yu-Meng; Chang, Huan-Tsung; Wang, Yi-Sheng

    2016-01-01

    Quantitative analysis with mass spectrometry (MS) is important but challenging. Matrix-assisted laser desorption/ionization (MALDI) coupled with time-of-flight (TOF) MS offers superior sensitivity, resolution and speed, but such techniques have numerous disadvantages that hinder quantitative analyses. This review summarizes essential obstacles to analyte quantification with MALDI-TOF MS, including the complex ionization mechanism of MALDI, sensitive characteristics of the applied electric fields and the mass-dependent detection efficiency of ion detectors. General quantitative ionization and desorption interpretations of ion production are described. Important instrument parameters and available methods of MALDI-TOF MS used for quantitative analysis are also reviewed. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644968

  19. Nonlinear analysis of structures. [within framework of finite element method

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.

    1974-01-01

    The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is on nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity, is included, and applications are presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) general three-dimensional analysis, and (5) analysis of laminated composites. The methods are applied to a number of sample structures. Correlation with available analytic or experimental data ranges from good to excellent.

  20. Explicit solution techniques for impact with contact constraints

    NASA Technical Reports Server (NTRS)

    Mccarty, Robert E.

    1993-01-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  1. Explicit solution techniques for impact with contact constraints

    NASA Astrophysics Data System (ADS)

    McCarty, Robert E.

    1993-08-01

    Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.

  2. On Textual Analysis and Machine Learning for Cyberstalking Detection.

    PubMed

    Frommholz, Ingo; Al-Khateeb, Haider M; Potthast, Martin; Ghasem, Zinnar; Shukla, Mitul; Short, Emma

    2016-01-01

    Cyber security has become a major concern for users and businesses alike. Cyberstalking and harassment have been identified as a growing anti-social problem. Besides detecting cyberstalking and harassment, there is the need to gather digital evidence, often by the victim. To this end, we provide an overview and discussion of relevant technological means, in particular from text analytics and machine learning, that are capable of addressing the above challenges. We present a framework for the detection of text-based cyberstalking and discuss the role and challenges of core techniques such as author identification, text classification and personalisation. We then discuss PAN, a network and evaluation initiative that focuses on digital text forensics, in particular author identification.
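As a minimal sketch of the text-classification component such a framework might contain (not the authors' system), here is a multinomial naive Bayes classifier over a tiny invented corpus; the messages and labels are entirely hypothetical.

```python
import math
from collections import Counter

# Toy training data (hypothetical messages; 1 = abusive, 0 = benign).
docs = [
    ("i know where you live watch yourself", 1),
    ("stop ignoring me you will regret it", 1),
    ("you cannot hide from me anywhere", 1),
    ("see you at the meeting tomorrow", 0),
    ("thanks for the report great work", 0),
    ("lunch at noon works for me", 0),
]

counts = {0: Counter(), 1: Counter()}
n_docs = {0: 0, 1: 0}
for text, label in docs:
    n_docs[label] += 1
    counts[label].update(text.split())
vocab = set(counts[0]) | set(counts[1])

def predict(text):
    """Multinomial naive Bayes with Laplace (add-one) smoothing."""
    scores = {}
    for c in (0, 1):
        score = math.log(n_docs[c] / len(docs))          # class prior
        total = sum(counts[c].values())
        for w in text.split():
            score += math.log((counts[c][w] + 1) / (total + len(vocab)))
        scores[c] = score
    return max(scores, key=scores.get)

print(predict("you will regret ignoring me"))  # 1
print(predict("great work on the report"))     # 0
```

A production detector would use far richer features (style, metadata, sequence models), but the scoring skeleton is the same.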

  3. Imaging of arsenic traces in human hair by nano-SIMS 50

    NASA Astrophysics Data System (ADS)

    Audinot, J.-N.; Schneider, S.; Yegles, M.; Hallegot, P.; Wennig, R.; Migeon, H.-N.

    2004-06-01

    The nano-SIMS 50 allows ion imaging to be performed on microtomed hair cross-sections in order to determine the concentration and localize the distribution of arsenic traces in hair. Our study shows a linear relationship between the SIMS signal (As normalized with respect to CN) and the concentration determined by other analytical techniques. The advantages of SIMS imaging are clearly demonstrated by its capability to record quantitative distributions of As across the cross-section. In fact, nano-SIMS 50 images may allow differentiation between As located in the medulla, the cortex and the cuticle of the hair, and thus distinguish intoxication by ingestion from surface contamination of the sample.
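The linear signal-to-concentration relationship reported above is the basis of a simple calibration. The sketch below fits an ordinary least-squares line to hypothetical As/CN calibration points and inverts it for an unknown sample; none of the numbers come from the study.

```python
# Hypothetical calibration pairs: known As concentration vs. As/CN signal.
conc   = [0.5, 1.0, 2.0, 4.0, 8.0]          # known concentration (ng/mg)
signal = [0.11, 0.21, 0.39, 0.82, 1.58]     # normalized SIMS signal

n = len(conc)
mx = sum(conc) / n
my = sum(signal) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, signal))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

def concentration(sig):
    """Invert the calibration line: signal -> concentration."""
    return (sig - intercept) / slope

print(round(concentration(0.60), 2))  # unknown sample, approx. 3 ng/mg
```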

  4. Advances in mass spectrometry-based cancer research and analysis: from cancer proteomics to clinical diagnostics.

    PubMed

    Timms, John F; Hale, Oliver J; Cramer, Rainer

    2016-06-01

    The last 20 years have seen significant improvements in the analytical capabilities of biological mass spectrometry (MS). Studies using advanced MS have resulted in new insights into cell biology and the etiology of diseases as well as its use in clinical applications. This review discusses recent developments in MS-based technologies and their cancer-related applications with a focus on proteomics. It also discusses the issues around translating the research findings to the clinic and provides an outline of where the field is moving. Expert commentary: Proteomics has been problematic to adapt for the clinical setting. However, MS-based techniques continue to demonstrate potential in novel clinical uses beyond classical cancer proteomics.

  5. Environmental exposure effects on composite materials for commercial aircraft

    NASA Technical Reports Server (NTRS)

    Gibbons, M. N.

    1982-01-01

    The database of composite materials' properties as they are affected by the environments encountered in operating conditions, both in flight and at ground terminals, is expanded. Absorbed moisture degrades the mechanical properties of graphite/epoxy laminates at elevated temperatures. Since airplane components are frequently exposed to atmospheric moisture, rain, and accumulated water, quantitative data are required to evaluate the amount of fluid absorbed under various environmental conditions and the subsequent effects on material properties. In addition, accelerated laboratory test techniques are developed that are capable of reliably predicting long-term behavior. An accelerated environmental exposure testing procedure is developed, and experimental results are correlated and compared with analytical results to establish the level of confidence for predicting composite material properties.
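One common analytical model behind such accelerated-exposure predictions is 1-D Fickian diffusion. The sketch below uses the Shen-Springer closed-form approximation for moisture uptake in a laminate; the diffusivity and thickness are hypothetical, not values from the report.

```python
import math

def moisture_fraction(D, t, h):
    """Shen-Springer approximation for 1-D Fickian uptake: fraction of
    the saturation moisture content reached after time t.
    D: diffusivity [mm^2/s], t: time [s], h: laminate thickness [mm]."""
    return 1.0 - math.exp(-7.3 * (D * t / h ** 2) ** 0.75)

D = 5.0e-7            # hypothetical diffusivity at test temperature [mm^2/s]
h = 2.0               # laminate thickness [mm]
week = 7 * 24 * 3600.0

for weeks in (1, 4, 16):
    f = moisture_fraction(D, weeks * week, h)
    print(f"{weeks:2d} weeks: {100 * f:5.1f}% of saturation")
```

Because uptake scales with D*t/h^2, raising temperature (which raises D) or testing thinner coupons shortens the time to a given moisture level, which is the basis of accelerated conditioning.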

  6. Moving your laboratories to the field – Advantages and limitations of the use of field portable instruments in environmental sample analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gałuszka, Agnieszka, E-mail: Agnieszka.Galuszka@ujk.edu.pl; Migaszewski, Zdzisław M.; Namieśnik, Jacek

    The recent rapid progress in the technology of field portable instruments has increased their applications in environmental sample analysis. These instruments offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet–visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis allows on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on instrument performance, and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. - Highlights: • Field portable instruments are widely used in environmental sample analysis. • Field portable instruments are indispensable for analysis in emergency response. • Miniaturization of field portable instruments reduces resource consumption. • In situ analysis is in agreement with green analytical chemistry principles. • Performance requirements in field analysis stimulate technological progress.

  7. Combined sensing platform for advanced diagnostics in exhaled mouse breath

    NASA Astrophysics Data System (ADS)

    Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris

    2013-03-01

    Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for monitoring therapeutic progression, as quantitative compositional analysis of breath can be related to biomarker panels associated with specific physiological conditions invoked by, e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on, e.g., the metabolic state, and since volatile organic constituents (VOCs) in exhaled breath in particular may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb - or even lower - concentration levels. While individual analytical techniques such as mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow simultaneous analysis of total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee (TTR) ratio (via IR), selected VOCs (via IR) and O2 (via luminescence) in exhaled breath, thus establishing a single diagnostic platform in which both sensors simultaneously interact with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with a novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, and miniaturized fiberoptic luminescence sensors, to establish a multi-constituent breath analysis tool compatible with mouse intensive care stations (MICU). 
Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas chromatography coupled to mass spectrometric detection. Here, we aim at continuously analyzing the TTR via iHWGs and LS flow-through sensors requiring only minute (< 1 mL) sample volumes. Furthermore, this study explores non-linearities observed in the calibration functions of 12CO2 and 13CO2, potentially resulting from effects related to optical collision diameters, e.g., in the presence of molecular oxygen. It is anticipated that simultaneous continuous analysis of oxygen via LS will facilitate correction of these effects once included in appropriate multivariate calibration models, thus providing more reliable and robust calibration schemes for continuously monitoring relevant breath constituents.

  8. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.
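A minimal example of the kind of analytical reliability model at issue is the classic triple-modular-redundancy (TMR) formula, R_TMR = 3R^2 - 2R^3 for channel reliability R. The failure rate and mission time below are hypothetical; real certification models are far more elaborate, which is exactly the sensitivity problem the record describes.

```python
import math

def r_simplex(lam, t):
    """Reliability of a single channel with constant failure rate lam."""
    return math.exp(-lam * t)

def r_tmr(lam, t):
    """Triple modular redundancy with perfect voting: the system
    survives while at least 2 of 3 identical channels work."""
    r = r_simplex(lam, t)
    return 3 * r ** 2 - 2 * r ** 3

lam = 1e-4          # hypothetical failures per hour per channel
t = 10.0            # mission time [hours]
print(f"simplex: {r_simplex(lam, t):.6f}  TMR: {r_tmr(lam, t):.6f}")
```

Note how strongly the output depends on the input failure rate: small errors in lam (the parameter emulation-based pseudo-testing is meant to bound) shift the predicted unreliability by orders of magnitude.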

  9. SVPWM Technique with Varying DC-Link Voltage for Common Mode Voltage Reduction in a Matrix Converter and Analytical Estimation of its Output Voltage Distortion

    NASA Astrophysics Data System (ADS)

    Padhee, Varsha

    Common Mode Voltage (CMV) in any power converter has been a major contributor to premature motor failures, bearing deterioration, shaft voltage build-up and electromagnetic interference. Intelligent control methods like Space Vector Pulse Width Modulation (SVPWM) techniques provide immense potential and flexibility to reduce CMV, thereby targeting all the aforementioned problems. Other solutions like passive filters, shielded cables and EMI filters add to the volume and cost metrics of the entire system. Smart SVPWM techniques therefore come with the important advantage of being an economical solution. This thesis discusses a modified space vector technique applied to an Indirect Matrix Converter (IMC) which results in the reduction of common mode voltages and other advanced features. The conventional indirect space vector pulse-width modulation (SVPWM) method of controlling matrix converters involves the usage of two adjacent active vectors and one zero vector for both rectifying and inverting stages of the converter. By suitable selection of space vectors, the rectifying stage of the matrix converter can generate different levels of virtual DC-link voltage. This capability can be exploited for operation of the converter in different ranges of modulation indices for varying machine speeds. This results in lower common mode voltage and improves the harmonic spectrum of the output voltage, without increasing the number of switching transitions compared to conventional modulation. To summarize, the responsibility of formulating output voltages with a particular magnitude and frequency has been transferred solely to the rectifying stage of the IMC. Estimation of the degree of distortion in the three-phase output voltage is another facet discussed in this thesis. 
An understanding of the SVPWM technique and the switching sequence of the space vectors in detail gives the potential to estimate the RMS value of the switched output voltage of any converter. This conceivably aids the sizing and design of output passive filters. An analytical estimation method has been presented to achieve this purpose for an IMC. Knowledge of the fundamental component in the output voltage can be utilized to calculate its Total Harmonic Distortion (THD). The effectiveness of the proposed SVPWM algorithms and the analytical estimation technique is substantiated by simulations in MATLAB / Simulink and experiments on a laboratory prototype of the IMC. Comparison plots have been provided to contrast the performance of the proposed methods with the conventional SVPWM method. The behavior of output voltage distortion and CMV with variation in operating parameters like modulation index and output frequency has also been analyzed.
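The harmonic route to THD mentioned above can be illustrated on an ideal square wave, whose odd Fourier harmonics fall off as 1/k. This is a generic textbook calculation, not the converter-specific estimation developed in the thesis.

```python
import math

def square_wave_thd(n_harmonics=10000):
    """THD of an ideal square wave from its Fourier series: the
    fundamental has relative amplitude 1 and odd harmonic k has 1/k."""
    fundamental = 1.0
    harm_power = sum((1.0 / k) ** 2 for k in range(3, 2 * n_harmonics, 2))
    return math.sqrt(harm_power) / fundamental

# The closed-form value is sqrt(pi^2/8 - 1), about 48.3%.
print(f"{100 * square_wave_thd():.2f}%")
```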

  10. An optical sensing approach for the noninvasive transdermal monitoring of cortisol

    NASA Astrophysics Data System (ADS)

    Hwang, Yongsoon; Gupta, Niraj K.; Ojha, Yagya R.; Cameron, Brent D.

    2016-03-01

    Cortisol, a biomarker of stress, has recently been shown to have potential in evaluating the physiological state of individuals diagnosed with stress-related conditions including chronic fatigue syndrome. Noninvasive techniques to extract biomarkers from the body are a topic of considerable interest. One such technique is reverse iontophoresis (RI), which is capable of extracting biomolecules through the skin. Unfortunately, however, the extracted levels are often considerably lower in concentration than those found in blood, thereby requiring a very sensitive analytical method with a low limit of detection. A promising sensing approach, well suited to such samples, is Surface Plasmon Resonance (SPR) spectroscopy. When coupled with aptamer-modified surfaces, such sensors can achieve both selectivity and the required sensitivity. In this study, an RI-based SPR biosensor for the measurement of cortisol was fabricated and characterized. The optical mount and diffusion cell were both fabricated using 3D printing techniques. The SPR sensor was configured in a prism coupler-based arrangement with a laser generation module and a CCD line sensor. Cortisol-specific DNA aptamers were immobilized onto a gold surface to achieve the necessary selectivity. For demonstration purposes, cortisol was extracted by the RI system using a skin phantom flow system capable of generating time-dependent concentration profiles. The captured sample was then transported by a micro-fluidic platform from the RI collection site to the SPR sensor for real-time monitoring. Analysis and system control were accomplished within a developed LabVIEW® program.

  11. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
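A minimal example of the Monte Carlo techniques discussed: hit-or-miss estimation of pi from random points in the unit square. This is a standard illustration, not an example drawn from the article itself.

```python
import math
import random

def mc_pi(n, seed=0):
    """Hit-or-miss Monte Carlo estimate of pi: the fraction of random
    points in the unit square that fall inside the quarter circle."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

for n in (100, 10_000, 1_000_000):
    est = mc_pi(n)
    print(f"n={n:>9}: pi ~ {est:.4f}  (error {abs(est - math.pi):.4f})")
```

The error shrinks like 1/sqrt(n), which is why Monte Carlo estimates improve slowly but apply to almost any integral or ensemble average.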

  12. Thermoelectrically cooled water trap

    DOEpatents

    Micheels, Ronald H [Concord, MA

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples, prior to analysis of these samples for chemical composition by a variety of analytical techniques in which water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas monitoring instrumentation.
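The benefit of cooling can be estimated from the saturation vapor pressure of water. The sketch below uses the Magnus approximation and assumes an ideal trap that condenses everything above the trap-temperature saturation level; the temperatures are illustrative, not from the patent.

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Magnus approximation for saturation water vapor pressure [hPa]."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

t_ambient, t_trap = 25.0, 2.0     # sample at room temperature; cold trap
e_in = saturation_vapor_pressure(t_ambient)
e_out = saturation_vapor_pressure(t_trap)

# Fraction of water removed from a saturated sample stream by an ideal
# trap: whatever exceeds the trap-temperature saturation level condenses.
removed = 1.0 - e_out / e_in
print(f"{100 * removed:.0f}% of water vapor removed")
```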

  13. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
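The core idea of secure multi-party computation for sums can be sketched with additive secret sharing. The three-party scenario and counts below are hypothetical, and the sketch omits the communication layer and malicious-party protections a real deployment needs.

```python
import random

P = 2 ** 61 - 1  # a large prime modulus for the share arithmetic

def share(value, n_parties, rng):
    """Split a value into n additive shares that sum to it mod P.
    Any n-1 shares are uniformly random and reveal nothing."""
    shares = [rng.randrange(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Three hospitals each hold a private patient count and want the total
# without revealing individual counts (values are hypothetical).
rng = random.Random(42)
private_counts = [1200, 875, 2310]
all_shares = [share(v, 3, rng) for v in private_counts]

# Party i locally sums the i-th share of every input ...
partial_sums = [sum(s[i] for s in all_shares) % P for i in range(3)]
# ... and only the partial sums are combined to reveal the total.
total = sum(partial_sums) % P
print(total)  # 4385, with no party having seen another's count
```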

  14. Analytical Chemistry Developmental Work Using a 243Am Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Khalil J.; Stanley, Floyd E.; Porterfield, Donivan R.

    2015-02-24

    This project seeks to reestablish our analytical capability to characterize Am bulk material and to develop a reference material suitable for characterizing the purity and assay of 241Am oxide for industrial use. The tasks associated with this phase of the project included conducting initial separations experiments, developing thermal ionization mass spectrometry capability using the 243Am isotope as an isotope dilution spike, optimizing the spike for the determination of 241Pu-241Am radiochemistry, and developing and testing a methodology that can detect trace to ultra-trace levels of Pu (both assay and isotopics) in bulk Am samples.
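The isotope-dilution principle behind the 243Am spike reduces to a one-line calculation under strong simplifying assumptions (an isotopically pure spike and sample, and no blank or mass-bias corrections); all numbers below are hypothetical.

```python
# Simplified isotope-dilution sketch: a known quantity of a 243Am spike
# is mixed with the sample, and the 243/241 ratio of the blend is
# measured by mass spectrometry. Minor isotopes, blanks, and mass-bias
# corrections are ignored here for clarity.
n_243_spike = 2.0e-9          # mol of 243Am added as the spike (hypothetical)
r_measured = 0.40             # measured n(243Am)/n(241Am) in the blend

# With a pure 243Am spike and a pure 241Am sample, the measured ratio
# directly gives the sample amount:
n_241_sample = n_243_spike / r_measured
print(f"{n_241_sample:.2e} mol 241Am in the sample")  # 5.00e-09
```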

  15. Grade 8 students' capability of analytical thinking and attitude toward science through teaching and learning about soil and its pollution based on a science, technology and society (STS) approach

    NASA Astrophysics Data System (ADS)

    Boonprasert, Lapisarin; Tupsai, Jiraporn; Yuenyong, Chokchai

    2018-01-01

    This study reports Grade 8 students' analytical thinking and attitude toward science during teaching and learning about soil and its pollution through a science, technology and society (STS) approach. The participants were 36 Grade 8 students in Naklang, Nongbualumphu, Thailand. The teaching and learning about soil and its pollution through the STS approach was carried out for 6 weeks. The soil-and-its-pollution unit was developed based on the framework of Yuenyong (2006), which consists of five stages: (1) identification of social issues, (2) identification of potential solutions, (3) need for knowledge, (4) decision-making, and (5) socialization. Students' analytical thinking and attitude toward science were assessed during their learning through participant observation, an analytical thinking test, students' tasks, and journal writing. The findings revealed that students developed their capability for analytical thinking. They demonstrated characteristics of analytical thinking such as classifying, comparing and contrasting, reasoning, interpreting, collecting data and decision making. Students' journal writing reflected that the STS class on soil and its pollution motivated them. The paper discusses implications for science teaching and learning through STS in Thailand.

  16. Discovering charge density functionals and structure-property relationships with PROPhet: A general framework for coupling machine learning and first-principles methods

    DOE PAGES

    Kolb, Brian; Lentz, Levi C.; Kolpak, Alexie M.

    2017-04-26

    Modern ab initio methods have rapidly increased our understanding of solid state materials properties, chemical reactions, and the quantum interactions between atoms. However, poor scaling often renders direct ab initio calculations intractable for large or complex systems. There are two obvious avenues through which to remedy this problem: (i) develop new, less expensive methods to calculate system properties, or (ii) make existing methods faster. This paper describes an open source framework designed to pursue both of these avenues. PROPhet (short for PROPerty Prophet) utilizes machine learning techniques to find complex, non-linear mappings between sets of material or system properties. The result is a single code capable of learning analytical potentials, non-linear density functionals, and other structure-property or property-property relationships. These capabilities enable highly accurate mesoscopic simulations, facilitate computation of expensive properties, and enable the development of predictive models for systematic materials design and optimization. Here, this work explores the coupling of machine learning to ab initio methods through means both familiar (e.g., the creation of various potentials and energy functionals) and less familiar (e.g., the creation of density functionals for arbitrary properties), serving both to demonstrate PROPhet’s ability to create exciting post-processing analysis tools and to open the door to improving ab initio methods themselves with these powerful machine learning techniques.

  17. High-accuracy 3D Fourier forward modeling of gravity field based on the Gauss-FFT technique

    NASA Astrophysics Data System (ADS)

    Zhao, Guangdong; Chen, Bo; Chen, Longwei; Liu, Jianxin; Ren, Zhengyong

    2018-03-01

    The 3D Fourier forward modeling of 3D density sources is capable of providing 3D gravity anomalies consistent with the meshed density distribution within the whole source region. This paper first derives a set of analytical expressions, using 3D Fourier transforms, for calculating the gravity anomalies of a 3D density source approximated by right rectangular prisms. To reduce the errors due to aliasing, imposed periodicity, and edge effects in Fourier-domain modeling, we apply the 3D Gauss-FFT technique to 3D gravity anomaly forward modeling. The capability and adaptability of this scheme are tested on simple synthetic models. The results show that the accuracy of the Fourier forward methods using the Gauss-FFT with 4 Gaussian nodes (or more) is comparable to that of spatial-domain modeling. In addition, the "ghost" source effects in the 3D Fourier forward gravity field, due to the imposed periodicity of the standard FFT algorithm, are markedly suppressed by the application of the 3D Gauss-FFT algorithm. More importantly, the execution times of the 4-node Gauss-FFT modeling are reduced by two orders of magnitude compared with the spatial forward method, demonstrating that the improved Fourier method is an efficient and accurate forward modeling tool for the gravity field.
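    The Gauss-FFT itself replaces the standard FFT's uniform wavenumber sampling with Gaussian-quadrature nodes; a quick way to see the "ghost source" periodicity artifact it targets is a 1D comparison of circular FFT convolution against exact linear convolution, with zero-padding as a simple remedy. The source/kernel choices below are illustrative, not the paper's prism formulas.

```python
import numpy as np

n = 128
src = np.zeros(n)
src[5] = 1.0                                # point source near the profile edge
x = np.arange(n, dtype=float)
kern = 5.0 / (x**2 + 5.0**2)                # decaying kernel (illustrative)

# Exact linear convolution in the spatial domain (the "spatial forward method").
ref = np.convolve(src, kern)[:n]

# Circular FFT convolution: imposed periodicity wraps the kernel tail around,
# contaminating the result near the edge with a "ghost" contribution.
circ = np.fft.irfft(np.fft.rfft(src) * np.fft.rfft(kern), n)

# Zero-padding to 2n restores linear convolution and removes the ghost.
m = 2 * n
padded = np.fft.irfft(np.fft.rfft(src, m) * np.fft.rfft(kern, m), m)[:n]

print(np.max(np.abs(circ - ref)))    # noticeable wraparound error
print(np.max(np.abs(padded - ref)))  # near machine precision
```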

  18. Discovering charge density functionals and structure-property relationships with PROPhet: A general framework for coupling machine learning and first-principles methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolb, Brian; Lentz, Levi C.; Kolpak, Alexie M.

    Modern ab initio methods have rapidly increased our understanding of solid state materials properties, chemical reactions, and the quantum interactions between atoms. However, poor scaling often renders direct ab initio calculations intractable for large or complex systems. There are two obvious avenues through which to remedy this problem: (i) develop new, less expensive methods to calculate system properties, or (ii) make existing methods faster. This paper describes an open source framework designed to pursue both of these avenues. PROPhet (short for PROPerty Prophet) utilizes machine learning techniques to find complex, non-linear mappings between sets of material or system properties. The result is a single code capable of learning analytical potentials, non-linear density functionals, and other structure-property or property-property relationships. These capabilities enable highly accurate mesoscopic simulations, facilitate computation of expensive properties, and enable the development of predictive models for systematic materials design and optimization. Here, this work explores the coupling of machine learning to ab initio methods through means both familiar (e.g., the creation of various potentials and energy functionals) and less familiar (e.g., the creation of density functionals for arbitrary properties), serving both to demonstrate PROPhet's ability to create exciting post-processing analysis tools and to open the door to improving ab initio methods themselves with these powerful machine learning techniques.

  19. Inverse measurement of wall pressure field in flexible-wall wind tunnels using global wall deformation data

    NASA Astrophysics Data System (ADS)

    Brown, Kenneth; Brown, Julian; Patil, Mayuresh; Devenport, William

    2018-02-01

    The Kevlar-wall anechoic wind tunnel offers great value to the aeroacoustics research community, affording the capability to make simultaneous aeroacoustic and aerodynamic measurements. While the aeroacoustic potential of the Kevlar-wall test section is already being leveraged, the aerodynamic capability of these test sections is still to be fully realized. The flexibility of the Kevlar walls suggests the possibility that the internal test section flow may be characterized by precisely measuring small deflections of the flexible walls. Treating the Kevlar fabric walls as tensioned membranes with known pre-tension and material properties, an inverse stress problem arises where the pressure distribution over the wall is sought as a function of the measured wall deflection. Experimental wall deformations produced by the wind loading of an airfoil model are measured using digital image correlation and subsequently projected onto polynomial basis functions which have been formulated to mitigate the impact of measurement noise based on a finite-element study. Inserting analytic derivatives of the basis functions into the equilibrium relations for a membrane, full-field pressure distributions across the Kevlar walls are computed. These inversely calculated pressures, after being validated against an independent measurement technique, can then be integrated along the length of the test section to give the sectional lift of the airfoil. Notably, these first-time results are achieved with a non-contact technique and in an anechoic environment.
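    The equilibrium relation for a pre-tensioned membrane with small deflections reduces to p = -T ∇²w, so a measured deflection field can be differentiated to recover the wall pressure. The sketch below demonstrates this on a clean synthetic deflection (the paper first projects noisy DIC data onto smooth basis functions; the tension T and grid here are assumed illustrative values).

```python
import numpy as np

T = 1500.0                      # membrane pre-tension per unit length (assumed)
n = 201
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
X, Y = np.meshgrid(x, x, indexing="ij")
w = 1e-3 * np.sin(np.pi * X) * np.sin(np.pi * Y)  # synthetic deflection, pinned edges

# Five-point finite-difference Laplacian on interior points
lap = (w[2:, 1:-1] + w[:-2, 1:-1] + w[1:-1, 2:] + w[1:-1, :-2]
       - 4.0 * w[1:-1, 1:-1]) / h**2
p = -T * lap                    # recovered wall pressure field

# Analytic check: laplacian(w) = -2*pi^2*w, hence p = 2*pi^2*T*w
p_exact = 2.0 * np.pi**2 * T * w[1:-1, 1:-1]
print(np.max(np.abs(p - p_exact)))   # small O(h^2) discretization error
```

Integrating `p` over the wall area would give the sectional load, as the abstract describes for the airfoil lift.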

  20. Robotics in scansorial environments

    NASA Astrophysics Data System (ADS)

    Autumn, Kellar; Buehler, Martin; Cutkosky, Mark; Fearing, Ronald; Full, Robert J.; Goldman, Daniel; Groff, Richard; Provancher, William; Rizzi, Alfred A.; Saranli, Uluc; Saunders, Aaron; Koditschek, Daniel E.

    2005-05-01

    We review a large multidisciplinary effort to develop a family of autonomous robots capable of rapid, agile maneuvers in and around natural and artificial vertical terrains such as walls, cliffs, caves, trees and rubble. Our robot designs are inspired by (but not direct copies of) biological climbers such as cockroaches, geckos, and squirrels. We are incorporating advanced materials (e.g., synthetic gecko hairs) into these designs and fabricating them using state-of-the-art rapid prototyping techniques (e.g., shape deposition manufacturing) that permit multiple iterations of design and testing with an effective integration path for the novel materials and components. We are developing novel motion control techniques to support dexterous climbing behaviors that are inspired by neuroethological studies of animals and descended from earlier frameworks that have proven analytically tractable and empirically sound. Our near term behavioral targets call for vertical climbing on soft (e.g., bark) or rough surfaces and for ascents on smooth, hard steep inclines (e.g., 60 degree slopes on metal or glass sheets) at one body length per second.

  1. POPA: A Personality and Object Profiling Assistant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreicer, J.S.

    POPA: A Personality and Object Profiling Assistant is a system that utilizes an extension and variation of a process developed for decision analysis as a tool to quantify intuitive feelings and subjective judgments. The technique is based on a manipulation of the Analytical Hierarchy Process. The POPA system models an individual in terms of his character type, life orientation, and incentive (motivational) factors. Then an object (i.e., individual, project, situation, or policy) is modeled with respect to its three most important factors. The individual and object models are combined to indicate the influence each of the three object factors has on the individual. We have investigated this problem: 1) to develop a technique that models personality types in a quantitative and organized manner, 2) to develop a tool capable of evaluating the probable success of obtaining funding for proposed programs at Los Alamos National Laboratory, 3) to determine the feasibility of quantifying feelings and intuition, and 4) to better understand subjective knowledge acquisition (especially intuition). 49 refs., 10 figs., 5 tabs.
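    The Analytical Hierarchy Process that POPA manipulates derives priority weights from a pairwise-comparison matrix via its principal eigenvector (Saaty's method). A minimal sketch with made-up 3x3 judgments:

```python
import numpy as np

# Pairwise comparisons: A[i, j] = how strongly factor i dominates factor j
# (reciprocal matrix; the judgments below are invented for illustration).
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                      # normalized priority weights

# Consistency ratio: CI = (lambda_max - n)/(n - 1), CR = CI / RI(n)
lam = vals.real[k]
n = A.shape[0]
CI = (lam - n) / (n - 1)
RI = 0.58                         # Saaty's random index for n = 3
CR = CI / RI
print(w, CR)                      # CR < 0.1 indicates acceptable consistency
```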

  2. Thermal analysis of the vertical bridgman semiconductor crystal growth technique. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Jasinski, T. J.

    1982-01-01

    The quality of semiconductor crystals grown by the vertical Bridgman technique is strongly influenced by the axial and radial variations of temperature within the charge. The relationship between the thermal parameters of the vertical Bridgman system and the thermal behavior of the charge are examined. Thermal models are developed which are capable of producing results expressible in analytical form and which can be used without recourse to extensive computer work for the preliminary thermal design of vertical Bridgman crystal growth systems. These models include the effects of thermal coupling between the furnace and the charge, charge translation rate, charge diameter, thickness and thermal conductivity of the confining crucible, thermal conductivity change and liberation of latent heat at the growth interface, and infinite charge length. The hot and cold zone regions, considered to be at spatially uniform temperatures, are separated by a gradient control region which provides added thermal design flexibility for controlling the temperature variations near the growth interface.

  3. Computer-composite mapping for geologists

    USGS Publications Warehouse

    van Driel, J.N.

    1980-01-01

    A computer program for overlaying maps has been tested and evaluated as a means for producing geologic derivative maps. Four maps of the Sugar House Quadrangle, Utah, were combined, using the Multi-Scale Data Analysis and Mapping Program, in a single composite map that shows the relative stability of the land surface during earthquakes. Computer-composite mapping can provide geologists with a powerful analytical tool and a flexible graphic display technique. Digitized map units can be shown singly, grouped with different units from the same map, or combined with units from other source maps to produce composite maps. The mapping program permits the user to assign various values to the map units and to specify symbology for the final map. Because of its flexible storage, easy manipulation, and capabilities of graphic output, the composite-mapping technique can readily be applied to mapping projects in sedimentary and crystalline terranes, as well as to maps showing mineral resource potential. © 1980 Springer-Verlag New York Inc.
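    The core overlay operation, assigning a numeric value to each digitized map unit and combining the coded rasters into a composite score, can be sketched in a few lines. The factor maps, unit codes, and scores below are invented for illustration, not the Sugar House Quadrangle data.

```python
import numpy as np

# Two digitized factor maps, each cell holding a map-unit code.
slope = np.array([[1, 1, 2], [2, 3, 3], [1, 2, 3]])
geology = np.array([[2, 2, 1], [1, 1, 2], [2, 2, 1]])

# Value assigned to each map unit (higher = less stable, say).
slope_score = {1: 0, 2: 1, 3: 3}
geol_score = {1: 2, 2: 0}

# Reclassify each raster through its lookup table, then sum the layers.
composite = (np.vectorize(slope_score.get)(slope)
             + np.vectorize(geol_score.get)(geology))
print(composite)
```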

  4. A manual for microcomputer image analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rich, P.M.; Ranken, D.M.; George, J.S.

    1989-12-01

    This manual is intended to serve three basic purposes: as a primer in microcomputer image analysis theory and techniques, as a guide to the use of IMAGE{copyright}, a public domain microcomputer program for image analysis, and as a stimulus to encourage programmers to develop microcomputer software suited for scientific use. Topics discussed include the principles of image processing and analysis, use of standard video for input and display, spatial measurement techniques, and the future of microcomputer image analysis. A complete reference guide that lists the commands for IMAGE is provided. IMAGE includes capabilities for digitization, input and output of images, hardware display lookup table control, editing, edge detection, histogram calculation, measurement along lines and curves, measurement of areas, examination of intensity values, output of analytical results, conversion between raster and vector formats, and region movement and rescaling. The control structure of IMAGE emphasizes efficiency, precision of measurement, and scientific utility. 18 refs., 18 figs., 2 tabs.

  5. Emergent 1d Ising Behavior in AN Elementary Cellular Automaton Model

    NASA Astrophysics Data System (ADS)

    Kassebaum, Paul G.; Iannacchione, Germano S.

    The fundamental nature of an evolving one-dimensional (1D) Ising model is investigated with an elementary cellular automaton (CA) simulation. The emergent CA simulation employs an ensemble of cells in one spatial dimension, each cell capable of two microstates interacting with simple nearest-neighbor rules and incorporating an external field. The behavior of the CA model provides insight into the dynamics of coupled two-state systems not expressible by exact analytical solutions. For instance, state progression graphs show the causal dynamics of a system through time in relation to the system's entropy. Unique graphical analysis techniques are introduced through difference patterns, diffusion patterns, and state progression graphs of the 1D ensemble visualizing the evolution. All analyses are consistent with the known behavior of the 1D Ising system. The CA simulation and new pattern recognition techniques are scalable (in dimension, complexity, and size) and have many potential applications such as complex design of materials, control of agent systems, and evolutionary mechanism design.
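    A minimal version of such a two-state CA can be written in a few lines: each cell aligns with the sign of its local field (nearest neighbors plus an external field h). The specific update rule below is our own simple choice in the spirit of zero-temperature Ising dynamics, not the paper's exact rule; stacking the rows of `history` gives the kind of state progression graph the abstract describes.

```python
import numpy as np

def step(s, h=0.0):
    # Each spin (+/-1) aligns with the sign of its local field: the two
    # nearest neighbors plus the external field h. Periodic boundaries;
    # an exact tie keeps the current state.
    local = np.roll(s, 1) + np.roll(s, -1) + h
    return np.where(local > 0, 1, np.where(local < 0, -1, s))

rng = np.random.default_rng(1)
s = rng.choice([-1, 1], size=64)       # random initial microstates
history = [s]
for _ in range(32):
    s = step(s, h=0.5)
    history.append(s)

# A small positive field drives the ensemble toward the all-up state.
print(history[-1].mean())
```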

  6. Analysis of nanoliter samples of electrolytes using a flow-through microfluorometer.

    PubMed

    Zhelyaskov, V R; Liu, S; Broderick, M P

    2000-04-01

    Several techniques have been developed to study the transport properties of nanoliter samples of renal tubule segments, such as continuous flow colorimetry and continuous fluorometry. We have extended the capability of the NANOFLO, a flow-through microfluorometer designed for measurement of carbon dioxide, urea, ammonia, glucose, lactate, etc., to analyze sodium, calcium and chloride ions, using three commercially available fluorescent indicators for intracellular and extracellular measurements. The selection of fluorescent indicator for each electrolyte was dependent on the optimal match of the dissociation constant and the analyte concentration range of interest. Using Fluo-3 dye we achieved a detection limit for Ca2+ of 0.1 pmol and a selectivity over Mg2+ of between 7:1 and 10:1. Using sodium green dye we achieved a detection limit for Na+ of 12 pmol and a selectivity over K+ of 40:1. The detection limit for Cl- using lucigenin dye was 10 pmol. This technique can be readily adapted for the measurement of other physiologically important analytes in ultralow-volume samples.

  7. On-chip collection of particles and cells by AC electroosmotic pumping and dielectrophoresis using asymmetric microelectrodes

    PubMed Central

    Melvin, Elizabeth M.; Moore, Brandon R.; Gilchrist, Kristin H.; Grego, Sonia; Velev, Orlin D.

    2011-01-01

    The recent development of microfluidic “lab on a chip” devices requiring sample sizes <100 μL has given rise to the need to concentrate dilute samples and trap analytes, especially for surface-based detection techniques. We demonstrate a particle collection device capable of concentrating micron-sized particles in a predetermined area by combining AC electroosmosis (ACEO) and dielectrophoresis (DEP). The planar asymmetric electrode pattern uses ACEO pumping to induce equal, quadrilateral flow directed towards a stagnant region in the center of the device. A number of system parameters affecting particle collection efficiency were investigated including electrode and gap width, chamber height, applied potential and frequency, and number of repeating electrode pairs and electrode geometry. The robustness of the on-chip collection design was evaluated against varying electrolyte concentrations, particle types, and particle sizes. These devices are amenable to integration with a variety of detection techniques such as optical evanescent waveguide sensing. PMID:22662040

  8. Correlation study of theoretical and experimental results for spin tests of a 1/10 scale radio control model

    NASA Technical Reports Server (NTRS)

    Bihrle, W., Jr.

    1976-01-01

    A correlation study was conducted to determine the ability of current analytical spin prediction techniques to predict the flight motions of a current fighter airplane configuration during the spin entry, the developed spin, and the spin recovery motions. The airplane math model used aerodynamics measured on an exact replica of the flight test model using conventional static and forced-oscillation wind-tunnel test techniques and a recently developed rotation-balance test apparatus capable of measuring aerodynamics under steady spinning conditions. An attempt was made to predict the flight motions measured during stall/spin flight testing of an unpowered, radio-controlled model designed to be a 1/10 scale, dynamically-scaled model of a current fighter configuration. Comparison of the predicted and measured flight motions shows that while the post-stall and spin entry motions were not well predicted, the developed spinning motion (a steady flat spin) and the initial phases of the spin recovery motion are reasonably well predicted.

  9. Analysis of Environmental Contamination resulting from Catastrophic Incidents: Part two: Building Laboratory Capability by Selecting and Developing Analytical Methodologies

    EPA Science Inventory

    Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...

  10. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers have independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
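    The paper's performance measures and the Kruskal-Wallis comparison across hydrographers are straightforward to compute. The sketch below uses made-up daily discharge records (not the Iowa data) and SciPy's `kruskal` test:

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(2)
baseline = 10.0 + rng.normal(0.0, 0.5, 42)            # 6-week baseline record
est = {"hydrographer_A": baseline + rng.normal(0.2, 0.8, 42),
       "hydrographer_B": baseline + rng.normal(-0.1, 1.0, 42),
       "hydrographer_C": baseline + rng.normal(0.5, 0.6, 42)}

# The three performance measures: average discharge for the period, and the
# mean and standard deviation of the daily errors against the baseline.
for name, series in est.items():
    err = series - baseline
    print(name, series.mean(), err.mean(), err.std(ddof=1))

# Kruskal-Wallis one-way analysis of variance on ranks: do the three
# subjective records differ significantly?
H, p = kruskal(*(est[k] - baseline for k in est))
print(H, p)
```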

  11. GPR Imaging for Deeply Buried Objects: A Comparative Study Based on FDTD Models and Field Experiments

    NASA Technical Reports Server (NTRS)

    Tilley, Roger; Dowla, Farid; Nekoogar, Faranak; Sadjadpour, Hamid

    2012-01-01

    Conventional use of Ground Penetrating Radar (GPR) is hampered by variations in background environmental conditions, such as water content in soil, resulting in poor repeatability of results over long periods of time when the radar pulse characteristics are kept the same. Target object types might include voids, tunnels, unexploded ordnance, etc. The long-term objective of this work is to develop methods that would extend the use of GPR under various environmental and soil conditions provided an optimal set of radar parameters (such as frequency, bandwidth, and sensor configuration) are adaptively employed based on the ground conditions. Towards that objective, developing Finite Difference Time Domain (FDTD) GPR models, verified by experimental results, would allow us to develop analytical and experimental techniques to control radar parameters to obtain consistent GPR images with changing ground conditions. Reported here is an attempt at developing 2D and 3D FDTD models of buried targets verified by two different radar systems capable of operating over different soil conditions. Experimental radar data employed were from a custom-designed high-frequency (200 MHz) multi-static sensor platform capable of producing 3-D images, and a longer wavelength (25 MHz) COTS radar (Pulse EKKO 100) capable of producing 2-D images. Our results indicate different types of radar can produce consistent images.
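    The core of any FDTD model is the leapfrog (Yee) update of staggered electric and magnetic fields. A minimal normalized 1D free-space sketch is below; real GPR models add soil permittivity/conductivity grids, antenna sources, and absorbing boundaries, none of which are shown here.

```python
import numpy as np

nz, nt = 400, 300
ez = np.zeros(nz)          # electric field on integer grid points
hy = np.zeros(nz - 1)      # magnetic field, staggered half a cell
c = 0.5                    # Courant number (<= 1 for stability in 1D)

for t in range(nt):
    hy += c * np.diff(ez)                       # update H from the curl of E
    ez[1:-1] += c * np.diff(hy)                 # update E from the curl of H
    ez[50] += np.exp(-((t - 30) / 10.0) ** 2)   # soft Gaussian pulse source

# The pulse propagates ~c cells per step; the fixed ez[0], ez[-1] act as
# perfectly conducting walls that reflect it.
print(np.abs(ez).max())
```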

  12. The multi-resolution capability of Tchebichef moments and its applications to the analysis of fluorescence excitation-emission spectra

    NASA Astrophysics Data System (ADS)

    Li, Bao Qiong; Wang, Xue; Li Xu, Min; Zhai, Hong Lin; Chen, Jing; Liu, Jin Jin

    2018-01-01

    Fluorescence spectroscopy with an excitation-emission matrix (EEM) is a fast and inexpensive technique and has been applied to the detection of a very wide range of analytes. However, serious scattering and overlapping signals hinder the applications of EEM spectra. In this contribution, the multi-resolution capability of Tchebichef moments was investigated in depth and applied to the analysis of two EEM data sets (data set 1 consisted of valine-tyrosine-valine, tryptophan-glycine and phenylalanine, and data set 2 included vitamin B1, vitamin B2 and vitamin B6) for the first time. By means of the Tchebichef moments with different orders, different levels of information in the EEM spectra can be represented. It is owing to this multi-resolution capability that the overlapping problem was solved, and the chemical and scattering information was separated. The obtained results demonstrated that the Tchebichef moment method is very effective, which provides a promising tool for the analysis of EEM spectra. It is expected that the applications of the Tchebichef moment method could be developed and extended to complex systems such as biological fluids, food, and the environment to deal with practical problems (overlapped peaks, unknown interferences, baseline drifts, and so on) with other spectra.
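    The multi-resolution idea can be sketched numerically: Tchebichef moments are projections onto the discrete Chebyshev (Gram) polynomials, which can be obtained via QR factorization of a Vandermonde matrix; keeping only low-order 2D moments yields a smoothed representation of a spectrum-like surface. The grid size, order cutoff, and test surface below are illustrative assumptions.

```python
import numpy as np

N, K = 64, 8                      # grid size, highest polynomial order kept
x = np.linspace(-1, 1, N)
V = np.vander(x, K + 1, increasing=True)
Q, _ = np.linalg.qr(V)            # columns: orthonormal discrete polynomials

X, Y = np.meshgrid(x, x, indexing="ij")
F = np.exp(-4 * ((X - 0.2) ** 2 + (Y + 0.1) ** 2))  # smooth "EEM-like" peak

M = Q.T @ F @ Q                   # (K+1) x (K+1) matrix of low-order moments
F_rec = Q @ M @ Q.T               # reconstruction from those moments alone

print(np.max(np.abs(F_rec - F)))  # small: low orders capture the smooth peak
```

Sharp scattering ridges, by contrast, need high orders, which is what lets moment truncation separate them from smooth chemical signals.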

  13. Optical Microresonators for Sensing and Transduction: A Materials Perspective.

    PubMed

    Heylman, Kevin D; Knapper, Kassandra A; Horak, Erik H; Rea, Morgan T; Vanga, Sudheer K; Goldsmith, Randall H

    2017-08-01

    Optical microresonators confine light to a particular microscale trajectory, are exquisitely sensitive to their microenvironment, and offer convenient readout of their optical properties. Taken together, this is an immensely attractive combination that makes optical microresonators highly effective as sensors and transducers. Meanwhile, advances in material science, fabrication techniques, and photonic sensing strategies endow optical microresonators with new functionalities, unique transduction mechanisms, and in some cases, unparalleled sensitivities. In this progress report, the operating principles of these sensors are reviewed, and different methods of signal transduction are evaluated. Examples are shown of how the choice of materials must be suited to the analyte, and how innovations in fabrication and sensing are coupled together in a mutually reinforcing cycle. A tremendously broad range of capabilities of microresonator sensors is described, from electric and magnetic field sensing to mechanical sensing, from single-molecule detection to imaging and spectroscopy, from operation in high vacuum to operation in live cells. Emerging sensing capabilities are highlighted and put into context in the field. Future directions are imagined, where the diverse capabilities laid out are combined and advances in scalability and integration are implemented, leading to the creation of a sensor unparalleled in sensitivity and information content. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. High-Throughput Incubation and Quantification of Agglutination Assays in a Microfluidic System.

    PubMed

    Castro, David; Conchouso, David; Kodzius, Rimantas; Arevalo, Arpys; Foulds, Ian G

    2018-06-04

    In this paper, we present a two-phase microfluidic system capable of incubating and quantifying microbead-based agglutination assays. The microfluidic system is based on a simple fabrication solution, which requires only laboratory tubing filled with carrier oil, driven by negative pressure using a syringe pump. We provide a user-friendly interface, in which a pipette is used to insert single droplets of a 1.25-µL volume into a system that is continuously running and therefore works entirely on demand without the need for stopping, resetting or washing the system. These assays are incubated by highly efficient passive mixing with a sample-to-answer time of 2.5 min, a 5- to 10-fold improvement over traditional agglutination assays. We study system parameters such as channel length, incubation time and flow speed to select optimal assay conditions, using the streptavidin-biotin interaction as a model analyte quantified using optical image processing. We then investigate the effect of varying both analyte and microbead concentrations, with a minimum detection limit of 100 ng/mL. The system can be both low- and high-throughput, depending on the rate at which assays are inserted. In our experiments, we were able to easily produce throughputs of 360 assays per hour by simple manual pipetting, which could be increased even further by automation and parallelization. Agglutination assays are a versatile tool, capable of detecting an ever-growing catalog of infectious diseases, proteins and metabolites. A system such as this one is a step towards being able to produce high-throughput microfluidic diagnostic solutions with widespread adoption. The development of analytical techniques in the microfluidic format, such as the one presented in this work, is an important step in being able to continuously monitor the performance and microfluidic outputs of organ-on-chip devices.

  15. Experiences with semiautomatic aerotriangulation on digital photogrammetric stations

    NASA Astrophysics Data System (ADS)

    Kersten, Thomas P.; Stallmann, Dirk

    1995-12-01

    With the development of higher-resolution scanners, faster image-handling capabilities, and higher-resolution screens, digital photogrammetric workstations promise to rival conventional analytical plotters in functionality, i.e. in the degree of automation in data capture and processing, and in accuracy. The availability of high quality digital image data and inexpensive high capacity fast mass storage offers the capability to perform accurate semi-automatic or automatic triangulation of digital aerial photo blocks on digital photogrammetric workstations instead of analytical plotters. In this paper, we present our investigations and results on two photogrammetric triangulation blocks, the OEEPE (European Organisation for Experimental Photogrammetric Research) test block (scale 1:4'000) and a Swiss test block (scale 1:12'000) using digitized images. Twenty-eight images of the OEEPE test block were scanned on the Zeiss/Intergraph PS1 and the digital images were delivered with a resolution of 15 micrometer and 30 micrometer, while 20 images of the Swiss test block were scanned on the Desktop Publishing Scanner Agfa Horizon with a resolution of 42 micrometer and on the PS1 with 15 micrometer. Measurements in the digital images were performed on the commercial Digital Photogrammetric Station Leica/Helava DPW770 and with basic hard- and software components of the Digital Photogrammetric Station DIPS II, an experimental system of the Institute of Geodesy and Photogrammetry, ETH Zurich. As a reference, the analog images of both photogrammetric test blocks were measured at analytical plotters. On DIPS II measurements of fiducial marks, signalized and natural tie points were performed by least squares template and image matching, while on DPW770 all points were measured by the cross correlation technique. The observations were adjusted in a self-calibrating bundle adjustment. The comparisons between these results, together with experiences with the functionality of the commercial and experimental systems, are presented.
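    The cross-correlation point measurement mentioned above locates a small template (e.g., a fiducial mark) in a search image by maximizing the normalized cross-correlation. A brute-force sketch on synthetic data is below; production systems add pyramid search and least-squares refinement for sub-pixel accuracy.

```python
import numpy as np

def ncc(a, b):
    # Normalized cross-correlation of two equal-size patches, in [-1, 1].
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

rng = np.random.default_rng(3)
image = rng.normal(0.0, 1.0, (40, 40))
image[12:17, 20:25] += 5.0                    # bright 5x5 mark
template = image[12:17, 20:25].copy()

best, best_rc = -2.0, None
for r in range(image.shape[0] - 4):           # slide the template everywhere
    for c in range(image.shape[1] - 4):
        s = ncc(template, image[r:r + 5, c:c + 5])
        if s > best:
            best, best_rc = s, (r, c)
print(best_rc, best)                          # finds (12, 20) with score ~1.0
```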

  16. Development of Novel Method for Rapid Extract of Radionuclides from Solution Using Polymer Ligand Film

    NASA Astrophysics Data System (ADS)

    Rim, Jung H.

    Accurate and fast determination of the activity of radionuclides in a sample is critical for nuclear forensics and emergency response. Radioanalytical techniques are well established for radionuclide measurement; however, they are slow and labor intensive, requiring extensive radiochemical separations and purification prior to analysis. With these limitations of current methods, there is great interest in a new technique to rapidly process samples. This dissertation describes a new analyte extraction medium called Polymer Ligand Film (PLF), developed to rapidly extract radionuclides. Polymer Ligand Film is a polymer medium with ligands incorporated in its matrix that selectively and rapidly extract analytes from a solution. The main focus of the new technique is to shorten and simplify the procedure necessary to chemically isolate radionuclides for determination by alpha spectrometry or beta counting. Five different ligands were tested for plutonium extraction: bis(2-ethylhexyl) methanediphosphonic acid (H2DEH[MDP]), di(2-ethyl hexyl) phosphoric acid (HDEHP), trialkyl methylammonium chloride (Aliquat-336), 4,4'(5')-di-t-butylcyclohexano 18-crown-6 (DtBuCH18C6), and 2-ethylhexyl 2-ethylhexylphosphonic acid (HEH[EHP]). The ligands that were effective for plutonium extraction were further studied for uranium extraction. The plutonium recovery by PLFs showed a dependency on nitric acid concentration and ligand-to-total-mass ratio. H2DEH[MDP] PLFs performed best with 1:10 and 1:20 ratio PLFs: 50.44% and 47.61% of plutonium were extracted on the surface of PLFs with 1M nitric acid for the 1:10 and 1:20 PLF, respectively. HDEHP PLF provided the best combination of alpha spectroscopy resolution and plutonium recovery with the 1:5 PLF when used with 0.1M nitric acid. The overall analyte recovery was lower than that of electrodeposited samples, which typically have recoveries above 80%. 
However, PLF is designed to be a rapid field deployable screening technique and consistency is more important than recovery. PLFs were also tested using blind quality control samples and the activities were accurately measured. It is important to point out that PLFs were consistently susceptible to analytes penetrating and depositing below the surface. The internal radiation within the body of PLF is mostly contained and did not cause excessive self-attenuation and peak broadening in alpha spectroscopy. The analyte penetration issue was beneficial in the destructive analysis. H2DEH[MDP] PLF was tested with environmental samples to fully understand the capabilities and limitations of the PLF in relevant environments. The extraction system was very effective in extracting plutonium from environmental water collected from Mortandad Canyon at Los Alamos National Laboratory with minimal sample processing. Soil samples were tougher to process than the water samples. Analytes were first leached from the soil matrixes using nitric acid before processing with PLF. This approach had a limitation in extracting plutonium using PLF. The soil samples from Mortandad Canyon, which are about 1% iron by weight, were effectively processed with the PLF system. Even with certain limitations of the PLF extraction system, this technique was able to considerably decrease the sample analysis time. The entire environmental sample was analyzed within one to two days. The decrease in time can be attributed to the fact that PLF is replacing column chromatography and electrodeposition with a single step for preparing alpha spectrometry samples. The two-step process of column chromatography and electrodeposition takes a couple days to a week to complete depending on the sample. The decrease in time and the simplified procedure make this technique a unique solution for application to nuclear forensics and emergency response. 
A large number of samples can be analyzed quickly, and selected samples can then be further analyzed with more sensitive techniques based on the initial data. Deploying a PLF system as a screening method will greatly reduce the total analysis time required to obtain meaningful isotopic data for the nuclear forensics application. (Abstract shortened by UMI.)
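
The activity determination the abstract describes depends on correcting measured counts for detector efficiency and chemical recovery. A minimal sketch of that arithmetic, with entirely hypothetical numbers (the dissertation does not report count times or efficiencies):

```python
# Illustrative activity calculation for an alpha-spectrometry sample.
# All inputs are hypothetical; only the ~50% recovery figure echoes the
# H2DEH[MDP] PLF results quoted in the abstract.

def activity_bq(net_counts: float, count_time_s: float,
                detector_efficiency: float, chemical_recovery: float) -> float:
    """Activity estimate: A = C / (t * efficiency * recovery)."""
    return net_counts / (count_time_s * detector_efficiency * chemical_recovery)

# 1200 net counts over a 6-hour count, 25% detector efficiency,
# ~50% chemical recovery.
a = activity_bq(1200, 6 * 3600, 0.25, 0.50)
print(round(a, 4))  # -> 0.4444 (Bq)
```

The same formula shows why consistency matters more than raw recovery for a screening method: a reproducible recovery factor can be divided out, while a variable one cannot.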

  17. Analytical methods in multivariate highway safety exposure data estimation

    DOT National Transportation Integrated Search

    1984-01-01

    Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting, and the expectation maximizati...
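
Of the techniques named, iterative proportional fitting is easy to sketch: a seed contingency table is alternately rescaled until its row and column sums match known target marginals. The table and marginals below are made up for illustration and are not from the report:

```python
import numpy as np

def ipf(seed, row_targets, col_targets, iters=100):
    """Iterative proportional fitting: rescale `seed` so its margins
    approach the target row/column sums."""
    t = seed.astype(float).copy()
    for _ in range(iters):
        t *= (row_targets / t.sum(axis=1))[:, None]   # match row sums
        t *= (col_targets / t.sum(axis=0))[None, :]   # match column sums
    return t

seed = np.array([[40.0, 30.0], [35.0, 45.0]])
fitted = ipf(seed,
             row_targets=np.array([80.0, 120.0]),
             col_targets=np.array([90.0, 110.0]))
print(fitted.sum(axis=1), fitted.sum(axis=0))  # both approx. the targets
```

In exposure estimation, the seed might come from a small survey and the marginals from more reliable aggregate sources; IPF preserves the seed's interaction structure while matching the margins.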

  18. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques for forecasting air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis, and analytical. The differences among these methods arise, in part, from the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
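
The analytical category typically means an econometric model relating traffic to causal variables. As a hedged sketch of that idea (not Taneja's actual model), a log-linear regression of traffic on income and real-fare indices, fitted by least squares on synthetic data:

```python
import numpy as np

# Synthetic annual indices, for illustration only.
gdp  = np.array([1.00, 1.04, 1.09, 1.15, 1.21, 1.28])   # income index
fare = np.array([1.00, 0.98, 0.95, 0.93, 0.90, 0.88])   # real-fare index
pax  = np.array([10.0, 10.9, 12.1, 13.6, 15.3, 17.4])   # passengers (millions)

# ln(pax) = b0 + b1*ln(gdp) + b2*ln(fare), solved by least squares;
# b1 and b2 act as income and fare elasticities.
X = np.column_stack([np.ones_like(gdp), np.log(gdp), np.log(fare)])
beta, *_ = np.linalg.lstsq(X, np.log(pax), rcond=None)

# Forecast for an assumed future year: income index 1.35, fare index 0.85.
forecast = np.exp(beta @ [1.0, np.log(1.35), np.log(0.85)])
print(forecast)  # exceeds the last observed value of 17.4
```

This is what distinguishes the analytical method from pure time-series extrapolation: the forecast is driven by assumptions about the explanatory variables rather than by time alone.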

  19. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
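
The incremental versus rolling-window scopes the paper distinguishes can be sketched as pluggable operators sharing one update interface; class names here are ours, not the paper's:

```python
from collections import deque

class IncrementalMean:
    """Global scope: folds every arriving value into a running mean."""
    def __init__(self):
        self.n, self.mean = 0, 0.0
    def update(self, x):
        self.n += 1
        self.mean += (x - self.mean) / self.n
        return self.mean

class RollingMean:
    """Rolling-window scope: only the last `size` values contribute."""
    def __init__(self, size):
        self.buf = deque(maxlen=size)
    def update(self, x):
        self.buf.append(x)
        return sum(self.buf) / len(self.buf)

stream = [3, 5, 7, 9, 11]
inc, roll = IncrementalMean(), RollingMean(size=3)
for x in stream:
    g, w = inc.update(x), roll.update(x)
print(g, w)  # 7.0 (global mean), 9.0 (mean of the last three values)
```

Because both operators expose the same `update` method, a dashboard can swap analytics per visualization without changing the streaming plumbing, which is the pluggability requirement the architecture addresses.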

  20. Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skalski, John

    2003-11-01

    The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. The mission continues because the technologies for conducting fish tagging studies are continuously evolving. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study; however, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. 
Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure that tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies, so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life-cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive-management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management in optimizing operational and fish-passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults, and provide statistical guidance in implementing a river-wide adult PIT-tag detection capability. 
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending analysis capabilities to new technologies such as radio-tags and acoustic-tags.
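
The survival estimation the project supports rests on release-recapture logic: detections at a downstream site understate survival unless corrected for detection probability. A minimal single-reach sketch in that spirit (counts are hypothetical, and real studies use full Cormack-Jolly-Seber likelihoods rather than this two-site shortcut):

```python
# Hypothetical PIT-tag detection counts for one release group.
R = 1000          # tagged smolts released
at_A = 420        # detected at downstream site A
at_B = 300        # detected at site B, further downstream
at_A_and_B = 210  # detected at both A and B

# Fish seen at B must have passed A alive, so the fraction of them that
# were also seen at A estimates A's detection probability.
p_A = at_A_and_B / at_B                 # 0.7

# Survival from release to A: detections at A, corrected for missed fish.
S_to_A = (at_A / R) / p_A               # 0.42 / 0.7 = 0.6
print(p_A, S_to_A)
```

Separating detection probability from survival in this way is what lets comparable survival estimates be produced across life-cycle stages and detection technologies, per objective (1).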
