Progressive Visual Analytics: User-Driven Visual Exploration of In-Progress Analytics.
Stolper, Charles D; Perer, Adam; Gotz, David
2014-12-01
As datasets grow and analytic algorithms become more complex, the typical workflow of analysts launching an analytic, waiting for it to complete, inspecting the results, and then re-launching the computation with adjusted parameters is not realistic for many real-world tasks. This paper presents an alternative workflow, progressive visual analytics, which enables an analyst to inspect partial results of an algorithm as they become available and to interact with the algorithm to prioritize subspaces of interest. Progressive visual analytics depends on adapting analytical algorithms to produce meaningful partial results and enable analyst intervention without sacrificing computational speed. The paradigm also depends on adapting information visualization techniques to incorporate the constantly refining results without overwhelming analysts, and to provide interactions that support an analyst directing the analytic. The contributions of this paper include: a description of the progressive visual analytics paradigm; design goals for both the algorithms and visualizations in progressive visual analytics systems; an example progressive visual analytics system (Progressive Insights) for analyzing common patterns in a collection of event sequences; and an evaluation of Progressive Insights and the progressive visual analytics paradigm by clinical researchers analyzing electronic medical records.
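The core loop the abstract describes — an analytic that emits meaningful partial results the analyst can inspect and steer — can be sketched as follows. This is a minimal illustration only, not the authors' Progressive Insights implementation; the chunked event-counting task and all names here are hypothetical.

```python
from typing import Iterator

def progressive_count(chunks) -> Iterator[dict]:
    """Incrementally count event types, yielding a partial result per chunk.

    Each yielded dict is a steadily refining snapshot, so a visualization
    can update as soon as a chunk finishes rather than waiting for the end.
    """
    counts: dict = {}
    for chunk in chunks:
        for event in chunk:
            counts[event] = counts.get(event, 0) + 1
        yield dict(counts)  # partial result after this chunk

# The analyst inspects each partial result and could steer the computation,
# e.g. by reordering the remaining chunks to prioritize a subspace of interest.
chunks = [["a", "b", "a"], ["b", "c"], ["a"]]
partials = list(progressive_count(chunks))
print(partials[-1])  # final counts once all chunks are processed
```

Because the analytic is a generator, the consuming loop decides when (and whether) to keep pulling results — which is where analyst intervention hooks in.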
The role of analytical chemistry in Niger Delta petroleum exploration: a review.
Akinlua, Akinsehinwa
2012-06-12
Petroleum, and the organic matter from which petroleum is derived, are composed of organic compounds together with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring these geochemical data. Owing to progress in the development of new analytical techniques, many hitherto intractable petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the analytical techniques that have helped in understanding the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed. Analytical chemistry is an invaluable tool in finding Niger Delta oils.
Progress toward a cosmic dust collection facility on space station
NASA Technical Reports Server (NTRS)
Mackinnon, Ian D. R. (Editor); Carey, William C. (Editor)
1987-01-01
Scientific and programmatic progress toward the development of a cosmic dust collection facility (CDCF) for the proposed space station is documented. Topics addressed include: trajectory sensor concepts; trajectory accuracy and orbital evolution; CDCF pointing direction; development of capture devices; analytical techniques; programmatic progress; flight opportunities; and facility development.
Dumont, Elodie; De Bleye, Charlotte; Sacré, Pierre-Yves; Netchacovitch, Lauranne; Hubert, Philippe; Ziemons, Eric
2016-05-01
Over recent decades, growing environmental concern has driven the expansion of green analytical chemistry tools. Vibrational spectroscopy, which belongs to this class of analytical tools, is particularly interesting given its numerous advantages, such as fast data acquisition and the absence of sample preparation. In this context, near-infrared, Raman and especially surface-enhanced Raman spectroscopy (SERS) have gained interest in many fields, including bioanalysis. The former two techniques only allow the analysis of concentrated compounds in simple matrices, whereas the emergence of SERS has extended vibrational spectroscopy to very sensitive and selective analyses. Complex SERS substrates have also been developed that enable biomarker measurements, paving the way for SERS immunoassays. In this paper, the strengths and weaknesses of these techniques are highlighted, with a focus on recent progress.
Mandal, Arundhoti; Singha, Monisha; Addy, Partha Sarathi; Basak, Amit
2017-10-13
MALDI-based mass spectrometry has, over the last three decades, become an important analytical tool. It is a gentle ionization technique, usually applied to detect and characterize analytes with high molecular weights, such as proteins and other macromolecules. The earlier difficulty of detecting analytes with low molecular weights, such as small organic molecules and metal ion complexes, arose from the cluster of peaks in the low molecular weight region generated by the matrix. To detect such molecules and metal ion complexes, a four-pronged strategy has been developed: the use of alternative matrix materials; the employment of new surface materials that require no matrix; the use of metabolites that directly absorb the laser light; and laser-absorbing label-assisted LDI-MS (popularly known as LALDI-MS). This review highlights the developments in all these strategies, with special emphasis on LALDI-MS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Feng; Liu, Yijin; Yu, Xiqian
2017-08-30
Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large-scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium-ion batteries and beyond) and a better understanding of materials electrochemistry in the various battery technologies. In the past decade, the advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as among the most effective methods, allowing nearly nondestructive probing of materials characteristics such as electronic and geometric structures, with various depth sensitivities, through spectroscopy, scattering, and imaging capabilities. This article begins with a discussion of various rechargeable batteries and the associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end.
We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a single technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.
Hydrogen-fueled scramjets: Potential for detailed combustor analysis
NASA Technical Reports Server (NTRS)
Beach, H. L., Jr.
1976-01-01
Combustion research related to hypersonic scramjet (supersonic combustion ramjet) propulsion is discussed from the analytical point of view. Because the fuel is gaseous hydrogen, mixing is single phase and the chemical kinetics are well known; therefore, the potential for analysis is good relative to hydrocarbon-fueled engines. Recent progress in applying two- and three-dimensional analytical techniques to mixing and reacting flows indicates cause for optimism, and identifies several areas for continuing effort.
Contribution of Electrochemistry to the Biomedical and Pharmaceutical Analytical Sciences.
Kauffmann, Jean-Michel; Patris, Stephanie; Vandeput, Marie; Sarakbi, Ahmad; Sakira, Abdul Karim
2016-01-01
All analytical techniques have experienced major progress over the last ten years, and electroanalysis is part of this trend. The unique characteristics of the phenomena occurring at the electrode-solution interface, along with the variety of electrochemical methods currently available, allow for a broad spectrum of applications. Potentiometric, conductometric, voltammetric and amperometric methods are briefly reviewed, with a critical view of the performance of the developed instrumentation and special emphasis on pharmaceutical and biomedical applications.
NASA Technical Reports Server (NTRS)
Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.;
2007-01-01
The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide field-of-view imaging interferometry using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have already provided simple demonstrations of the methodology, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed; operation in an improved environment; acquisition of large, high-quality datasets; development of image reconstruction algorithms; and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.
Analytical advances in pharmaceutical impurity profiling.
Holm, René; Elder, David P
2016-05-25
Impurities will be present in all drug substances and drug products, i.e. nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2 g/day, the threshold for identification of impurities is set at levels of 0.1% and above (ICH Q3B(R2), 2006). For some impurities this is a simple undertaking, as generally available analytical techniques can address the prevailing analytical challenges; for others it may be much more challenging, requiring more sophisticated analytical approaches. The present review provides an insight into the current development of analytical techniques to investigate and quantify impurities in drug substances and drug products, with discussion of progress particularly within the field of chromatography to ensure separation and quantification of related impurities. A section is devoted to the identification of classical impurities; in addition, inorganic (metal residue) and solid-state impurities are also discussed. Risk control strategies for pharmaceutical impurities, aligned with several of the ICH guidelines, are also discussed.
Monitoring Student Progress Using Virtual Appliances: A Case Study
ERIC Educational Resources Information Center
Romero-Zaldivar, Vicente-Arturo; Pardo, Abelardo; Burgos, Daniel; Delgado Kloos, Carlos
2012-01-01
The interactions that students have with each other, with the instructors, and with educational resources are valuable indicators of the effectiveness of a learning experience. The increasing use of information and communication technology allows these interactions to be recorded so that analytic or mining techniques are used to gain a deeper…
Progress in our understanding of cometary dust tails
NASA Technical Reports Server (NTRS)
Sekanina, Z.
1976-01-01
Various analytical techniques are employed to analyze observations on the character, composition, and size distribution of solid particles in cometary dust tails. Emphasized is the mechanical theory that includes solar gravitational attraction and solar radiation pressure to explain dust particle motions in cometary tails, as well as interactions between dust and plasma.
The flotation and adsorption of mixed collectors on oxide and silicate minerals.
Xu, Longhua; Tian, Jia; Wu, Houqin; Lu, Zhongyuan; Sun, Wei; Hu, Yuehua
2017-12-01
The analysis of the flotation and adsorption of mixed collectors on oxide and silicate minerals is of great importance for both industrial applications and theoretical research. Over the past years, significant progress has been achieved in understanding the adsorption of single collectors in micelles as well as at interfaces. By contrast, the self-assembly of mixed collectors at liquid/air and solid/liquid interfaces remains a developing area as a result of the complexity of the mixed systems involved and the limited availability of suitable analytical techniques. In this work, we systematically review the processes involved in the adsorption of mixed collectors in micelles and at interfaces by examining four specific points, namely the theoretical background, the factors that affect adsorption, analytical techniques, and the self-assembly of mixed surfactants at the mineral/liquid interface. In the first part, the theoretical background of collector mixtures is introduced, together with several core solution theories, which are classified according to their application in the analysis of the physicochemical properties of mixed collector systems. In the second part, we discuss the factors that can influence adsorption, including factors related to the structure of the collectors and environmental conditions, and summarize their influence on the adsorption of mixed systems, with the objective of providing guidance on the progress achieved in this field to date. Advances in measurement techniques can greatly promote our understanding of adsorption processes. In the third part, therefore, modern techniques such as optical reflectometry, neutron scattering, neutron reflectometry, thermogravimetric analysis, fluorescence spectroscopy, ultrafiltration, atomic force microscopy, analytical ultracentrifugation, X-ray photoelectron spectroscopy, vibrational sum frequency generation spectroscopy and molecular dynamics simulations are introduced with respect to their applications.
Finally, focusing on oxide and silicate minerals, we review and summarize the flotation and adsorption of the three most widely used mixed surfactant systems (anionic-cationic, anionic-nonionic, and cationic-nonionic) at the liquid/mineral interface in order to fully understand the self-assembly process. The paper closes with a brief outlook on possible developments in mixed surfactant systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haugen, G.R.; Bystroff, R.I.; Downey, R.M.
1975-09-01
In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and a count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)4 (thd = 2,2,6,6-tetramethylheptane-3,5-dione); study of the vaporization of U(C8H8)2; determination of ethylenic unsaturation in polyimide resins; and semimicrodetermination of hydroxyl and amino groups with pyromellitic dianhydride (PMDA).
Opportunity and Challenges for Migrating Big Data Analytics in Cloud
NASA Astrophysics Data System (ADS)
Amitkumar Manekar, S.; Pradeepini, G., Dr.
2017-08-01
Big data analytics is a major topic nowadays. As demand grows and data generation capabilities become more scalable, data acquisition and storage become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend towards "big data-as-a-service" is now discussed everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other, researchers are still working to solve security and other real-time problems of big data migration onto cloud-based platforms. This article is specially focused on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration, and the possibility of performing big data analytics on a cloud platform, is in demand for a new era of growth. The article also gives information about available technologies and techniques for the migration of big data to the cloud.
High-freezing-point fuel studies
NASA Technical Reports Server (NTRS)
Tolle, F. F.
1980-01-01
Considerable progress in developing the experimental and analytical techniques needed to design airplanes to accommodate fuels with less stringent low temperature specifications is reported. A computer technique for calculating fuel temperature profiles in full tanks was developed. The computer program is being extended to include the case of partially empty tanks. Ultimately, the completed package is to be incorporated into an aircraft fuel tank thermal analyser code to permit the designer to fly various thermal exposure patterns, study fuel temperatures versus time, and determine holdup.
Application of LANDSAT data to monitor land reclamation progress in Belmont County, Ohio
NASA Technical Reports Server (NTRS)
Bloemer, H. H. L.; Brumfield, J. O.; Campbell, W. J.; Witt, R. G.; Bly, B. G.
1981-01-01
Strip and contour mining techniques are reviewed as well as some studies conducted to determine the applicability of LANDSAT and associated digital image processing techniques to the surficial problems associated with mining operations. A nontraditional unsupervised classification approach to multispectral data is considered which renders increased classification separability in land cover analysis of surface mined areas. The approach also reduces the dimensionality of the data and requires only minimal analytical skills in digital data processing.
Recent Progress in Optical Biosensors Based on Smartphone Platforms
Geng, Zhaoxin; Zhang, Xiong; Fan, Zhiyuan; Lv, Xiaoqing; Su, Yue; Chen, Hongda
2017-01-01
With the rapid improvement of smartphone hardware and software, especially complementary metal oxide semiconductor (CMOS) cameras, many optical biosensors based on smartphone platforms have been presented, which have pushed the development of point-of-care testing (POCT). Imaging-based and spectrometry-based detection techniques have been widely explored via different approaches. Combined with the smartphone, imaging-based and spectrometry-based methods are currently used to investigate a wide range of molecular properties in chemical and biological science for biosensing and diagnostics. Imaging techniques based on smartphone-based microscopes are utilized to capture microscale analytes, while spectrometry-based techniques are used to probe reactions or changes of molecules. Here, we critically review the most recent progress in imaging-based and spectrometry-based smartphone-integrated platforms that have been developed for chemical experiments and biological diagnosis. We focus on the analytical performance and the complexity of implementation of the platforms. PMID:29068375
Recent development of electrochemiluminescence sensors for food analysis.
Hao, Nan; Wang, Kun
2016-10-01
Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, sometimes their applications are limited by the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassays, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating some selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, drug residues, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection of various food contaminants in complex matrices. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.
PERT/CPM and Supplementary Analytical Techniques. An Analysis of Aerospace Usage
1978-09-01
…the rapid pace of technological progress in the last 75 years has spawned the development of a number of very interesting managerial tools, and one of… …in support of the overall effort. At one time, use of PERT was mandatory on all major DoD acquisition contracts. Since that time, the use of…
Analytics and Action in Afghanistan
2010-09-01
…rests on rational technology, and ultimately on scientific knowledge. No country could be modern without being economically advanced or… …backwardness to enlightened modernity. Underdeveloped countries had failed to progress to what Max Weber called rational legalism because of the grip… Douglas Pike, Viet Cong: The Organization and Techniques of the National Liberation Front of South Vietnam (Boston: Massachusetts Institute of Technology
Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane
2008-02-01
MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers has increased tremendously over the past decades, and other technological advances have increased the analytical power of biological MS even more. First, the advent of the genome projects allowed automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for the MS analysis of biomolecules. Recent progress in bioinformatics is the third factor that has accelerated the biochemical analysis of macromolecules. The first part of this review introduces the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and is discussed in the last section. This integrated approach aims at identifying all the peptides present in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis is given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery are also discussed briefly.
Petchkovsky, Leon
2017-06-01
Analytical psychology shares with many other psychotherapies the important task of repairing the consequences of developmental trauma. The majority of analytic patients come from compromised early developmental backgrounds: they may have experienced neglect, abuse, or failures of empathic resonance from their carers. Functional brain imaging techniques, including quantitative electroencephalography (QEEG) and functional magnetic resonance imaging (fMRI), allow us to track mental processes in ways that go beyond verbal report and introspection. This independent perspective is useful for developing new psychodynamic hypotheses, testing current ones, providing diagnostic markers, and monitoring treatment progress. Jung, with the Word Association Test, grasped these principles 100 years ago. Brain imaging techniques have contributed to powerful recent advances in our understanding of neurodevelopmental processes in the first three years of life. If adequate nurturance is compromised, a range of difficulties may emerge. This has important implications for how we understand and treat our psychotherapy clients. The paper provides an overview of functional brain imaging and advances in developmental neuropsychology, and looks at applications of some of these findings (including neurofeedback) in the Jungian psychotherapy domain.
Analytical model for real time, noninvasive estimation of blood glucose level.
Adhyapak, Anoop; Sidley, Matthew; Venkataraman, Jayanti
2014-01-01
The paper presents an analytical model to estimate blood glucose level from measurements made noninvasively and in real time by an antenna strapped to a patient's wrist. The RIT ETA Lab research group has shown promising evidence that an antenna's resonant frequency can track changes in glucose concentration in real time. Based on an in-vitro study of blood samples from diabetic patients, the paper presents a modified Cole-Cole model that incorporates a factor to represent the change in glucose level. A calibration technique based on the input impedance is discussed, and the results show good agreement with glucose meter readings. An alternative calibration methodology has been developed that is based on the shift in the antenna's resonant frequency, using an equivalent circuit model containing a shunt capacitor to represent the shift in resonant frequency with changing glucose levels. Work in progress includes optimizing the technique with a larger sample of patients.
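For context, the standard single-dispersion Cole-Cole relaxation model that the abstract's modified model builds on has the textbook form below; the glucose-dependent factor the authors add is not specified in the abstract and is not reproduced here.

```latex
\varepsilon^{*}(\omega) \;=\; \varepsilon_{\infty} \;+\; \frac{\varepsilon_{s}-\varepsilon_{\infty}}{1+\left(j\omega\tau\right)^{1-\alpha}}
```

where $\varepsilon_{s}$ and $\varepsilon_{\infty}$ are the static and high-frequency permittivities, $\tau$ is the relaxation time, and $\alpha \in [0,1)$ is the distribution parameter that broadens the dispersion relative to the Debye model (recovered at $\alpha = 0$).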
Analytical capillary isotachophoresis after 50 years of development: Recent progress 2014-2016.
Malá, Zdena; Gebauer, Petr; Boček, Petr
2017-01-01
This review brings a survey of papers on analytical ITP published since 2014 until the first quarter of 2016. The 50th anniversary of ITP as a modern analytical method offers the opportunity to present a brief view on its beginnings and to discuss the present state of the art from the viewpoint of the history of its development. Reviewed papers from the field of theory and principles confirm the continuing importance of computer simulations in the discovery of new and unexpected phenomena. The strongly developing field of instrumentation and techniques shows novel channel methodologies including use of porous media and new on-chip assays, where ITP is often included in a preseparative or even preparative function. A number of new analytical applications are reported, with ITP appearing almost exclusively in combination with other principles and methods.
Analysis of Tile-Reinforced Composite Armor. Part 1; Advanced Modeling and Strength Analyses
NASA Technical Reports Server (NTRS)
Davila, C. G.; Chen, Tzi-Kang; Baker, D. J.
1998-01-01
The results of an analytical and experimental study of the structural response and strength of tile-reinforced components of the Composite Armored Vehicle are presented. The analyses are based on specialized finite element techniques that properly account for the effects of the interaction between the armor tiles, the surrounding elastomers, and the glass-epoxy sublaminates. To validate the analytical predictions, tests were conducted with panels subjected to three-point bending loads. The sequence of progressive failure events for the laminates is described. This paper describes the results of Part 1 of a study of the response and strength of tile-reinforced composite armor.
Sensitive molecular diagnostics using surface-enhanced resonance Raman scattering (SERRS)
NASA Astrophysics Data System (ADS)
Faulds, Karen; Graham, Duncan; McKenzie, Fiona; MacRae, Douglas; Ricketts, Alastair; Dougan, Jennifer
2009-02-01
Surface enhanced resonance Raman scattering (SERRS) is an analytical technique with several advantages over competitive techniques in terms of improved sensitivity and multiplexing. We have made great progress in the development of SERRS as a quantitative analytical method, in particular for the detection of DNA. SERRS is an extremely sensitive and selective technique which, when applied to the detection of labelled DNA sequences, yields detection limits that rival, and in most cases surpass, those of fluorescence. Here the conditions are explored which enable the successful detection of DNA using SERRS. The enhancing surface which is used is crucial; in this case suspensions of nanoparticles were used, as they allow quantitative behaviour to be achieved and permit systems analogous to current fluorescence-based systems to be made. The aggregation conditions required to obtain SERRS of DNA are crucial, and herein we describe the use of spermine as an aggregating agent. The nature of the label used, whether fluorescent, positively charged, or negatively charged, also affects the SERRS response, and these conditions are likewise explored here. We have clearly demonstrated the ability to identify the components of a mixture of 5 analytes in solution by using two different excitation wavelengths, and also of a 6-plex using data analysis techniques. These conditions will allow the use of SERRS for the detection of target DNA in a meaningful diagnostic assay.
Immobilization of Fab' fragments onto substrate surfaces: A survey of methods and applications.
Crivianu-Gaita, Victor; Thompson, Michael
2015-08-15
Antibody immobilization onto surfaces has widespread applications in many different fields. It is desirable to bind antibodies such that their fragment-antigen-binding (Fab) units are oriented away from the surface in order to maximize analyte binding. The immobilization of only Fab' fragments yields benefits over the more traditional whole antibody immobilization technique. Bound Fab' fragments display higher surface densities, yielding a higher binding capacity for the analyte. The nucleophilic sulfide of the Fab' fragments allows for specific orientations to be achieved. For biosensors, this indicates a higher sensitivity and lower detection limit for a target analyte. The last thirty years have shown tremendous progress in the immobilization of Fab' fragments onto gold, Si-based, polysaccharide-based, plastic-based, magnetic, and inorganic surfaces. This review will show the current scope of Fab' immobilization techniques available and illustrate methods employed to minimize non-specific adsorption of undesirables. Furthermore, a variety of examples will be given to show the versatility of immobilized Fab' fragments in different applications and future directions of the field will be addressed, especially regarding biosensors. Copyright © 2015 Elsevier B.V. All rights reserved.
Comparison between different techniques applied to quartz CPO determination in granitoid mylonites
NASA Astrophysics Data System (ADS)
Fazio, Eugenio; Punturo, Rosalda; Cirrincione, Rosolino; Kern, Hartmut; Wenk, Hans-Rudolph; Pezzino, Antonino; Goswami, Shalini; Mamtani, Manish
2016-04-01
Since the second half of the last century, several techniques have been adopted to resolve the crystallographic preferred orientation (CPO) of major minerals constituting crustal and mantle rocks. To this aim, many efforts have been made to increase the accuracy of such analytical devices as well as to progressively reduce the time needed to perform microstructural analysis. It is worth noting that many of these microstructural studies deal with quartz CPO because of the wide occurrence of this mineral phase in crustal rocks as well as its quite simple chemical composition. In the present work, four different techniques were applied to define CPOs of dynamically recrystallized quartz domains from naturally deformed rocks collected from a ductile crustal-scale shear zone in order to compare their advantages and limitations. The selected Alpine shear zone, which cuts granitoid lithotypes, is located in the Aspromonte Massif (Calabrian Peloritani Orogen, southern Italy). The adopted methods range from the "classical" universal stage (US) to the image analysis technique (CIP), electron backscattered diffraction (EBSD), and time-of-flight neutron diffraction (TOF). When compared, bulk texture pole figures obtained by means of these different techniques show a good correlation. Advances in analytical techniques used for microstructural investigations are outlined by discussing the quartz CPO results presented in this study.
Space Shuttle propulsion parameter estimation using optimal estimation techniques
NASA Technical Reports Server (NTRS)
1983-01-01
This fourth monthly progress report again contains corrections and additions to the previously submitted reports. The additions include a simplified SRB model that is directly incorporated into the estimation algorithm and provides the required partial derivatives. The resulting partial derivatives are analytical rather than numerical as would be the case using the SOBER routines. The filter and smoother routine developments have continued. These routines are being checked out.
Kranz, Christine
2014-01-21
In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique from high-resolution electrochemical imaging via nanoscale probes to large scale mapping using arrays of microelectrodes. A major driving force in advancing the SECM methodology is based on developing more sophisticated probes beyond conventional micro-disc electrodes usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes particularly enabling combinations of SECM with other analytical measurement techniques to provide information beyond exclusively measuring electrochemical sample properties. Consequently, this critical review will focus on recent progress and new developments towards multifunctional imaging.
NASA Technical Reports Server (NTRS)
Gyekenyesi, Andrew L.; Gastelli, Michael G.; Ellis, John R.; Burke, Christopher S.
1995-01-01
An experimental study was conducted to investigate the mechanical behavior of a T650-35/AMB21 eight-harness satin weave polymer composite system. Emphasis was placed on the development and refinement of techniques used in elevated temperature uniaxial PMC testing. Issues such as specimen design, gripping, strain measurement, and temperature control and measurement were addressed. Quasi-static tensile and fatigue properties (R(sub sigma) = 0.1) were examined at room and elevated temperatures. Stiffness degradation and strain accumulation during fatigue cycling were recorded to monitor damage progression and provide insight for future analytical modeling efforts. Accomplishments included an untabbed dog-bone specimen design which consistently failed in the gage section, accurate temperature control and assessment, and continuous in-situ strain measurement capability during fatigue loading at elevated temperatures. Finally, strain accumulation and stiffness degradation during fatigue cycling appeared to be good indicators of damage progression.
Chylewska, Agnieszka; Ogryzek, M; Makowski, Mariusz
2017-10-23
New analytical and molecular methods for microorganisms are being developed to improve various aspects of identification, i.e., selectivity, specificity, sensitivity, rapidity, and discrimination of viable cells. This review follows current trends in improved pathogen separation and detection methods and their subsequent use in medical diagnosis. This contribution also focuses on the development of analytical and biological methods in the analysis of microorganisms, with special attention paid to bio-samples containing microbes (blood, urine, lymph, wastewater). First, the paper discusses the characterization of microbes, their structure, surface, properties, and size, and then it describes pivotal advances in bacteria, virus, and fungus separation procedures achieved by researchers over the last 30 years. Accordingly, detection techniques can be classified into three categories, which were, in our opinion, examined and modified most intensively during this period: electrophoretic, nucleic-acid-based, and immunological methods. The review also covers the progress, limitations, and challenges of these approaches and emphasizes the advantages of new separative techniques in selective fractionating of microorganisms. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Xu, Xiaoli; Zhang, Song; Chen, Hui; Kong, Jilie
2009-11-15
Micro-total analysis systems (microTAS) integrate different analytical operations like sample preparation, separation and detection into a single microfabricated device. With the outstanding advantages of low cost, satisfactory analytical efficiency and flexibility in design, highly integrated and miniaturized devices based on the concept of microTAS have gained widespread applications, especially in biochemical assays. Electrochemistry is shown to be quite compatible with microanalytical systems for biochemical assays, because of its attractive merits such as simplicity, rapidity, high sensitivity, reduced power consumption, and sample/reagent economy. This review presents recent developments in the integration of electrochemistry in microdevices for biochemical assays. Ingenious microelectrode design and fabrication methods, and the versatility of electrochemical techniques, are covered. Practical applications of such integrated microsystems in biochemical assays focus on in situ analysis, point-of-care testing and portable devices. Electrochemical techniques are apparently well suited to microsystems, since easy microfabrication of electrochemical elements and a high degree of integration with multiple analytical functions can be achieved at low cost. Such integrated microsystems will play an increasingly important role in the analysis of small-volume biochemical samples. Work is in progress toward new microdevice designs and applications.
Analytical Chemistry Laboratory. Progress report for FY 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1996. This annual report is the thirteenth for the ACL. It describes effort on continuing and new projects and contributions of the ACL staff to various programs at ANL. The ACL operates in the ANL system as a full-cost-recovery service center, but has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients -- Argonne National Laboratory, the Department of Energy, and others -- and will conduct world-class research and development in analytical chemistry and its applications. Because of the diversity of research and development work at ANL, the ACL handles a wide range of analytical chemistry problems. Some routine or standard analyses are done, but the ACL usually works with commercial laboratories if our clients require high-volume, production-type analyses. It is common for ANL programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. Thus, much of the support work done by the ACL is very similar to our applied analytical chemistry research.
Progress in defect quantification in multi-layered structures using ultrasonic inspection
NASA Astrophysics Data System (ADS)
Dierken, Josiah; Aldrin, John C.; Holec, Robert; LaCivita, Michael; Shearer, Joshua; Lindgren, Eric
2013-01-01
This study investigates the ability to resolve flaws in aluminum panel stackups representative of aircraft structural components. Using immersion ultrasound techniques, the specimens were examined for known fatigue cracks and electric discharge machined (EDM) notches at various fastener sites. Initial assessments suggested a possible trend between measured ultrasound parameters of flaw intensity and size, and known physical defect length. To improve analytical reliability and efficiency, development of automated data analysis (ADA) algorithms has been initiated.
Analytical applications of MIPs in diagnostic assays: future perspectives.
Bedwell, Thomas S; Whitcombe, Michael J
2016-03-01
Many efforts have been made to produce artificial materials with biomimetic properties for applications in binding assays. Among these efforts, the technique of molecular imprinting has received much attention because of the high selectivity obtainable for molecules of interest, robustness of the produced polymers, simple and short synthesis, and excellent cost efficiency. In this review, progress in the field of molecularly imprinted sorbent assays is discussed, with a focus on work conducted from 2005 to date.
Nitrate biosensors and biological methods for nitrate determination.
Sohail, Manzar; Adeloju, Samuel B
2016-06-01
The inorganic nitrate (NO3‾) anion is present under a variety of both natural and artificial environmental conditions. Nitrate is ubiquitous within the environment, food, industrial and physiological systems and is mostly present as a hydrated anion of a corresponding dissolved salt. Due to the significant environmental and toxicological effects of nitrate, its determination and monitoring in environmental and industrial waters are often necessary. A wide range of analytical techniques are available for nitrate determination in various sample matrices. This review discusses biosensors available for nitrate determination using the enzyme nitrate reductase (NaR). We conclude that nitrate determination using biosensors is an excellent non-toxic alternative to all other available analytical methods. Over the last fifteen years, biosensing technology for nitrate analysis has progressed considerably; however, there is a need to expedite the development of nitrate biosensors as a suitable alternative to non-enzymatic techniques through the use of different polymers, nanostructures, mediators and strategies to overcome oxygen interference. Copyright © 2016 Elsevier B.V. All rights reserved.
Progressive Learning of Topic Modeling Parameters: A Visual Analytics Framework.
El-Assady, Mennatallah; Sevastjanova, Rita; Sperrle, Fabian; Keim, Daniel; Collins, Christopher
2018-01-01
Topic modeling algorithms are widely used to analyze the thematic composition of text corpora but remain difficult to interpret and adjust. Addressing these limitations, we present a modular visual analytics framework, tackling the understandability and adaptability of topic models through a user-driven reinforcement learning process which does not require a deep understanding of the underlying topic modeling algorithms. Given a document corpus, our approach initializes two algorithm configurations based on a parameter space analysis that enhances document separability. We abstract the model complexity in an interactive visual workspace for exploring the automatic matching results of two models, investigating topic summaries, analyzing parameter distributions, and reviewing documents. The main contribution of our work is an iterative decision-making technique in which users provide a document-based relevance feedback that allows the framework to converge to a user-endorsed topic distribution. We also report feedback from a two-stage study which shows that our technique results in topic model quality improvements on two independent measures.
Learning Visualization Strategies: A qualitative investigation
NASA Astrophysics Data System (ADS)
Halpern, Daniel; Oh, Kyong Eun; Tremaine, Marilyn; Chiang, James; Bemis, Karen; Silver, Deborah
2015-12-01
The following study investigates the range of strategies individuals develop to infer and interpret cross-sections of three-dimensional objects. We focus on the identification of mental representations and problem-solving processes of 11 individuals, with the goal of building training applications that integrate the strategies developed by the participants in our study. Our results suggest that although spatial transformation and perspective-taking techniques are useful for visualizing cross-section problems, these visual processes are augmented by analytical thinking. Further, our study shows that participants employ general analytic strategies for extended periods, which evolve through practice into a set of progressively more expert strategies. Theoretical implications are discussed and five main findings are recommended for integration into the design of education software that facilitates visual learning and comprehension.
Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...
2014-10-03
Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gałuszka, Agnieszka, E-mail: Agnieszka.Galuszka@ujk.edu.pl; Migaszewski, Zdzisław M.; Namieśnik, Jacek
The recent rapid progress in technology of field portable instruments has increased their applications in environmental sample analysis. These instruments offer a possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have their portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet–visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis gives a possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than those obtained in laboratory conditions, a strong influence of environmental factors on the instrument performance and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. Highlights: • Field portable instruments are widely used in environmental sample analysis. • Field portable instruments are indispensable for analysis in emergency response. • Miniaturization of field portable instruments reduces resource consumption. • In situ analysis is in agreement with green analytical chemistry principles. • Performance requirements in field analysis stimulate technological progress.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wein, G.; Rosier, B.
1998-12-31
This report provides an overview of the research programs and program components carried out by the Savannah River Ecology Laboratory. Research focused on the following: advanced analytical and spectroscopic techniques for developing novel waste isolation and stabilization technologies as well as cost-effective remediation strategies; ecologically sound management of damaged and remediation of ecological systems; ecotoxicology, remediation, and risk assessment; radioecology, including dose assessments for plants and animals exposed to environmental radiation; and other research support programs.
Structural Glycomic Analyses at High Sensitivity: A Decade of Progress
NASA Astrophysics Data System (ADS)
Alley, William R.; Novotny, Milos V.
2013-06-01
The field of glycomics has recently advanced in response to the urgent need for structural characterization and quantification of complex carbohydrates in biologically and medically important applications. The recent success of analytical glycobiology at high sensitivity reflects numerous advances in biomolecular mass spectrometry and its instrumentation, capillary and microchip separation techniques, and microchemical manipulations of carbohydrate reactivity. The multimethodological approach appears to be necessary to gain an in-depth understanding of very complex glycomes in different biological systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wein, G.; Rosier, B.
1997-12-31
This report provides an overview of the research programs and program components carried out by the Savannah River Ecology Laboratory. Research focused on the following: advanced analytical and spectroscopic techniques for developing novel waste isolation and stabilization technologies as well as cost-effective remediation strategies; ecologically sound management of damaged and remediation of ecological systems; ecotoxicology, remediation, and risk assessment; radioecology, including dose assessments for plants and animals exposed to environmental radiation; and other research support programs.
LA-ICP-MS Study of Trace Elements in the Chanuskij Metal
NASA Technical Reports Server (NTRS)
Petaev, Michail I.
2005-01-01
This progress report covers work done during the second year of the 3-year proposal. During this year we resolved many issues relevant to the analytical technique developed by us for measuring trace elements in meteoritic metals. This technique was used to measure concentrations of Fe, Ni, Co, Cr, Cu, Ga, Ge, As, Mo, Ru, Rh, Pd, Sb, W, Re, Os, Ir, Pt, and Au in eight large (120-160 microns) metal grains from both "igneous" and "metamorphic" lithologies of the Chanuskij silicate inclusions. The first application of our technique to metal grains from thin sections showed some limitations. The small thickness of metal grains in the thin section limited the signal to 3-4 time-slices instead of the 10-11 obtained in polished sections of iron meteorites studied before.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
This report summarizes the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 2000 (October 1999 through September 2000). This annual progress report, which is the seventeenth in this series for the ACL, describes effort on continuing projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The ACL operates within the ANL system as a full-cost-recovery service center, but it has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients--Argonne National Laboratory, the Department of Energy, and others--and will conduct world-class research and development in analytical chemistry and its applications. The ACL handles a wide range of analytical problems that reflects the diversity of research and development (R&D) work at ANL. Some routine or standard analyses are done, but the ACL operates more typically in a problem-solving mode in which development of methods is required or adaptation of techniques is needed to obtain useful analytical data. The ACL works with clients and commercial laboratories if a large number of routine analyses are required. Much of the support work done by the ACL is very similar to applied analytical chemistry research work.
Emergent 1D Ising Behavior in an Elementary Cellular Automaton Model
NASA Astrophysics Data System (ADS)
Kassebaum, Paul G.; Iannacchione, Germano S.
The fundamental nature of an evolving one-dimensional (1D) Ising model is investigated with an elementary cellular automaton (CA) simulation. The emergent CA simulation employs an ensemble of cells in one spatial dimension, each cell capable of two microstates, interacting with simple nearest-neighbor rules and incorporating an external field. The behavior of the CA model provides insight into the dynamics of coupled two-state systems not expressible by exact analytical solutions. For instance, state progression graphs show the causal dynamics of a system through time in relation to the system's entropy. Unique graphical analysis techniques are introduced through difference patterns, diffusion patterns, and state progression graphs of the 1D ensemble visualizing the evolution. All analyses are consistent with the known behavior of the 1D Ising system. The CA simulation and new pattern recognition techniques are scalable (in dimension, complexity, and size) and have many potential applications such as complex design of materials, control of agent systems, and evolutionary mechanism design.
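A minimal sketch of an elementary two-state CA of the kind described, with nearest-neighbor coupling and an external-field bias on a periodic ring. The specific update rule, the form of the field term, and the function name are illustrative assumptions, not the paper's actual rule.

```python
def step(state, field=0.0):
    """One synchronous update of a 1D two-state CA (cells 0/1) on a ring.

    Each cell aligns with the local "field": the sum of its two neighbors
    mapped to Ising spins (-1/+1) plus an external bias. Ties keep the
    current state. Illustrative rule only.
    """
    n = len(state)
    out = []
    for i in range(n):
        # map 0/1 cells to -1/+1 spins and sum the two nearest neighbors
        h = (2 * state[i - 1] - 1) + (2 * state[(i + 1) % n] - 1) + field
        if h > 0:
            out.append(1)
        elif h < 0:
            out.append(0)
        else:
            out.append(state[i])  # tie: keep current microstate
    return out
```

With zero bias, a uniform configuration is a fixed point, while an alternating configuration inverts every step and recurs with period two, exactly the kind of causal dynamics a state progression graph would visualize.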
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hess, W.P.; Bushaw, B.A.; McCarthy, M.I.
1996-10-01
The Department of Energy is undertaking the enormous task of remediating defense wastes and environmental insults which have occurred over 50 years of nuclear weapons production. It is abundantly clear that significant technology advances are needed to characterize, process, and store highly radioactive waste and to remediate contaminated zones. In addition to the processing and waste form issues, analytical technologies needed for the characterization of solids, and for monitoring storage tanks and contaminated sites, do not exist or are currently expensive labor-intensive tasks. This report describes progress in developing sensitive, rapid, and widely applicable laser-based mass spectrometry techniques for analysis of mixed chemical wastes and contaminated soils.
NASA Astrophysics Data System (ADS)
The liquefaction of pre-gelatinized starch was studied with various analytical techniques to determine the effects of starch molecular weight, granule structure, granule size, and mechanical depolymerization. Also, improvements were made in the chromatographic system used to characterize starch hydrolysates. Progress is reported on protein removal. The effects of pH, temperature, and ionic strength were examined for the removal of protein from a syrup stream by adsorption on a phenolic resin. Buffered systems, which maintain more stable pH values, were also examined. Mathematical modeling of the results is in progress. The pilot plant facility is complete and in operation. Starch streams containing 1% protein are being produced by the protein extraction process.
Recent Progress in Biosensors for Environmental Monitoring: A Review.
Justino, Celine I L; Duarte, Armando C; Rocha-Santos, Teresa A P
2017-12-15
Environmental monitoring has been one of the priorities at the European and global scale due to the close relationship between environmental pollution and human health/socioeconomic development. In this field, biosensors have been widely employed as cost-effective, fast, in situ, and real-time analytical techniques. The need for portable, rapid, and smart biosensing devices explains the recent development of biosensors with new transduction materials, obtained from nanotechnology, and for multiplexed pollutant detection, involving multidisciplinary experts. This review article provides an update on recent progress in biosensors for the monitoring of air, water, and soil pollutants under real conditions, such as pesticides, potentially toxic elements, and small organic molecules including toxins and endocrine disrupting chemicals.
The US nuclear reaction data network. Summary of the first meeting, March 13 & 14 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-03-01
The first meeting of the US Nuclear Reaction Data Network (USNRDN) was held at the Colorado School of Mines, March 13-14, 1996, chaired by F. Edward Cecil. The Agenda of the meeting is attached. The Network, its mission, products and services; related nuclear data and data networks, members, and organization are described in Attachment 1. The following progress reports from the members of the USNRDN were distributed prior to the meeting and are given as Attachment 2. (1) Measurements and Development of Analytic Techniques for Basic Nuclear Physics and Nuclear Applications; (2) Nuclear Reaction Data Activities at the National Nuclear Data Center; (3) Studies of nuclear reactions at very low energies; (4) Nuclear Reaction Data Activities, Nuclear Data Group; (5) Progress in Neutron Physics at Los Alamos - Experiments; (6) Nuclear Reaction Data Activities in Group T2; (7) Progress Report for the US Nuclear Reaction Data Network Meeting; (8) Nuclear Astrophysics Research Group (ORNL); (9) Progress Report from Ohio University; (10) Exciton Model Phenomenology; and (11) Progress Report for Coordination Meeting USNRDN.
Ojanperä, Ilkka; Kolmonen, Marjo; Pelander, Anna
2012-05-01
Clinical and forensic toxicology and doping control deal with hundreds or thousands of drugs that may cause poisoning, are abused, are illicit, or are prohibited in sports. Rapid and reliable screening for all these compounds of differing chemical and pharmaceutical nature, preferably in a single analytical method, is a substantial challenge for analytical toxicologists. Combined chromatography-mass spectrometry techniques with standardised reference libraries have most commonly been used for this purpose. In the last ten years, the focus has shifted from gas chromatography-mass spectrometry to liquid chromatography-mass spectrometry, because of progress in instrument technology and partly because of the polarity and low volatility of many new relevant substances. High-resolution mass spectrometry (HRMS), which enables accurate mass measurement at high resolving power, has recently evolved to a stage at which it is rapidly displacing unit-resolution, quadrupole-dominated instrumentation. The main HRMS techniques today are time-of-flight mass spectrometry and Orbitrap Fourier-transform mass spectrometry. Both techniques enable a range of drug-screening strategies that essentially rely on measuring a compound's or a fragment's mass with sufficiently high accuracy that its elemental composition can be determined directly. The accurate mass and isotopic pattern act as a filter for confirming the identity of a compound or even identifying an unknown. High mass resolution is essential for improving confidence in accurate mass results in the analysis of complex biological samples. This review discusses recent applications of HRMS in analytical toxicology.
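The accurate-mass filtering strategy described above can be sketched in a few lines: a mass measured at high resolving power constrains the elemental composition to the few formulas whose monoisotopic mass falls within the instrument's ppm tolerance. The CHNO-only search space, the atom-count bounds, and the 5 ppm tolerance below are illustrative assumptions, not details from the review.

```python
# Monoisotopic masses of the most abundant isotopes (standard IUPAC values).
MASS = {'C': 12.0, 'H': 1.0078250319, 'N': 14.0030740052, 'O': 15.9949146221}

def candidate_formulas(target_mass, ppm_tol=5.0,
                       max_atoms={'C': 30, 'H': 60, 'N': 10, 'O': 10}):
    """Brute-force search for CHNO formulas within a ppm window of target_mass."""
    tol = target_mass * ppm_tol * 1e-6
    hits = []
    for c in range(max_atoms['C'] + 1):
        mc = c * MASS['C']
        if mc - tol > target_mass:          # heavier than target: stop early
            break
        for n in range(max_atoms['N'] + 1):
            for o in range(max_atoms['O'] + 1):
                base = mc + n * MASS['N'] + o * MASS['O']
                if base - tol > target_mass:
                    break
                for h in range(max_atoms['H'] + 1):
                    m = base + h * MASS['H']
                    if abs(m - target_mass) <= tol:
                        hits.append((f"C{c}H{h}N{n}O{o}", m))
    return hits

# Caffeine (C8H10N4O2) has a monoisotopic mass of about 194.08038 Da; at
# 5 ppm only a handful of CHNO formulas survive the filter.
matches = candidate_formulas(194.08038)
print([formula for formula, mass in matches])
```

In practice the isotopic pattern mentioned in the abstract provides a second filter that further prunes these candidates, which is what makes direct formula assignment feasible for screening.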
Studies of industrial emissions by accelerator-based techniques: A review of applications at CEDAD
NASA Astrophysics Data System (ADS)
Calcagnile, L.; Quarta, G.
2012-04-01
Different research activities are in progress at the Centre for Dating and Diagnostics (CEDAD), University of Salento, in the field of environmental monitoring, exploiting the potentialities of the different experimental beam lines implemented on the 3 MV Tandetron accelerator and dedicated to AMS (Accelerator Mass Spectrometry) radiocarbon dating and IBA (Ion Beam Analysis). An overview of these activities is presented, showing how accelerator-based analytical techniques can be a powerful tool for monitoring anthropogenic carbon dioxide emissions from industrial sources and for assessing the biogenic content in SRF (Solid Recovered Fuel) burned in WTE (Waste to Energy) plants.
Advances in Mid-Infrared Spectroscopy for Chemical Analysis
NASA Astrophysics Data System (ADS)
Haas, Julian; Mizaikoff, Boris
2016-06-01
Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.
CONSTRUCTION PROGRESS PHOTO OF REMOTE ANALYTICAL FACILITY (CPP627). INL PHOTO ...
CONSTRUCTION PROGRESS PHOTO OF REMOTE ANALYTICAL FACILITY (CPP-627). INL PHOTO NUMBER NRTS-54-12124. Unknown Photographer, 9/21/1954 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID
Food Safety Evaluation Based on Near Infrared Spectroscopy and Imaging: A Review.
Fu, Xiaping; Ying, Yibin
2016-08-17
In recent years, due to the increasing consciousness of food safety and human health, much progress has been made in developing rapid and nondestructive techniques for the evaluation of food hazards, food authentication, and traceability. Near infrared (NIR) spectroscopy and imaging techniques have gained wide acceptance in many fields because of their advantages over other analytical techniques. Following a brief introduction of NIR spectroscopy and imaging basics, this review mainly focuses on recent NIR spectroscopy and imaging applications for food safety evaluation, including (1) chemical hazards detection; (2) microbiological hazards detection; (3) physical hazards detection; (4) new technology-induced food safety concerns; and (5) food traceability. The review shows NIR spectroscopy and imaging to be effective tools that will play indispensable roles for food safety evaluation. In addition, on-line/real-time applications of these techniques promise to be a huge growth field in the near future.
CONSTRUCTION PROGRESS PHOTO OF REMOTE ANALYTICAL FACILITY (CPP627). INL PHOTO ...
CONSTRUCTION PROGRESS PHOTO OF REMOTE ANALYTICAL FACILITY (CPP-627). INL PHOTO NUMBER NRTS-54-12573. R.G. Larsen, Photographer, 10/20/1954 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID
CONSTRUCTION PROGRESS PHOTO OF REMOTE ANALYTICAL FACILITY (CPP627) SHOWING INITIAL ...
CONSTRUCTION PROGRESS PHOTO OF REMOTE ANALYTICAL FACILITY (CPP-627) SHOWING INITIAL EXCAVATION. INL PHOTO NUMBER NRTS-54-10703. Unknown Photographer, 5/21/1954 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID
NASA Astrophysics Data System (ADS)
Iqbal, M.; Islam, A.; Hossain, A.; Mustaque, S.
2016-12-01
Multi-Criteria Decision Making (MCDM) is an advanced analytical method for deriving an appropriate result or decision in an environment of multiple, often conflicting, criteria, and it has become a standard analytical process in current research. Geospatial approaches (e.g., remote sensing and GIS) are likewise advanced technical means of collecting, processing, and analyzing spatial data. GIS and remote sensing combined with MCDM techniques can therefore provide an effective platform for complex decision-making processes, and this combination has been used effectively for site selection in urban solid waste management policy. The most popular MCDM technique is the Weighted Linear Combination (WLC) method, while the Analytical Hierarchy Process (AHP) is another popular and consistent technique used worldwide for dependable decision making. Consequently, the main objective of this study is to develop an AHP model as an MCDM technique, coupled with a Geographic Information System (GIS), to select a suitable landfill site for urban solid waste management. To protect the urban environment in a sustainable way, municipal waste requires an appropriate landfill site chosen with the environmental, geological, social, and technical aspects of the region in mind. An MCDM model was generated from five criterion classes related to environmental, geological, social, and technical factors using the AHP method, and the resulting weights were input into GIS to produce the final suitability map. The final analysis shows that 12.2% of the total study area, corresponding to 22.89 km2, is suitable for landfill siting.
In this study, the Keraniganj sub-district of Dhaka district in Bangladesh, a densely populated area that currently operates an unmanaged waste system and lacks suitable landfill dumping sites, is considered as the study area.
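The AHP step described above can be sketched as follows: criterion weights are derived from a pairwise comparison matrix and accepted only if Saaty's consistency ratio stays below 0.1, after which the weights feed a GIS weighted overlay. The 4x4 matrix below (environmental, geological, social, technical) and its judgments are hypothetical, not values from the study.

```python
import math

def ahp_weights(A):
    """Weights via the row geometric-mean method, plus Saaty's consistency ratio."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]   # geometric mean of each row
    total = sum(gm)
    w = [g / total for g in gm]                       # normalized weights
    # lambda_max approximated by averaging (A w)_i / w_i over rows.
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)                          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # Saaty's random index
    return w, ci / ri

# Hypothetical pairwise judgments on Saaty's 1-9 scale.
A = [[1,   3,   5,   7],
     [1/3, 1,   3,   5],
     [1/5, 1/3, 1,   3],
     [1/7, 1/5, 1/3, 1]]
weights, cr = ahp_weights(A)
print([round(x, 3) for x in weights], round(cr, 3))   # CR < 0.1 => judgments acceptable
```

In a GIS workflow these weights would then multiply the reclassified criterion rasters cell by cell (the WLC step), and thresholding the resulting suitability surface yields the candidate landfill area.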
CONSTRUCTION PROGRESS PHOTO OF REMOTE ANALYTICAL FACILITY (CPP627) SHOWING PLACEMENT ...
CONSTRUCTION PROGRESS PHOTO OF REMOTE ANALYTICAL FACILITY (CPP-627) SHOWING PLACEMENT OF PIERS. INL PHOTO NUMBER NRTS-54-11716. Unknown Photographer, 8/20/1954 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID
NASA Astrophysics Data System (ADS)
Armigliato, A.
2008-07-01
In present and future CMOS technology, owing to the ever-shrinking geometries of electronic devices, the availability of techniques capable of quantitative analysis of the relevant structural, chemical, and mechanical parameters at the nanoscale is of paramount importance. The influence of these features on the electrical performance of nanodevices is a key issue for the nanoelectronics industry. In recent years, significant progress has been made in this field by a number of techniques, such as X-ray diffraction (in particular with the advent of synchrotron sources), ion-microbeam-based Rutherford backscattering and channeling spectrometry, and micro-Raman spectrometry. In addition, secondary ion mass spectrometry (SIMS) has achieved an important role in determining the dopant depth profile in ultra-shallow junctions (USJs) in silicon. However, the technique with the ultimate spatial resolution (at the nanometer scale) is scanning transmission electron microscopy (STEM). This presentation reports on the STEM nanoanalysis of two very important physical quantities that must be controlled in nanodevice fabrication processes: the dopant profile in USJs and the lattice strain generated in the electrically active Si regions of isolation structures by the different technological steps. The former is investigated by the so-called Z-contrast high-angle annular dark-field (HAADF-STEM) method, whereas the mechanical strain can be mapped two-dimensionally by the convergent-beam electron diffraction (CBED-STEM) method. Spatial resolutions of below one nanometer and of a few nanometers can be achieved in the two cases, respectively. To keep pace with scientific and technological progress, an increasingly wide array of analytical techniques is necessary, and their complementary roles in solving present and future characterization problems must be exploited.
Presently, however, European laboratories with high-level expertise in materials characterization still operate largely independently; this adversely affects the competitiveness of European science and industry at the international level. For this reason the European Commission started an Integrated Infrastructure Initiative (I3) in the Sixth Framework Programme (now continuing in FP7) and funded a project called ANNA (2006-2010), an acronym for European Integrated Activity of Excellence and Networking for Nano and Micro-Electronics Analysis. The consortium includes 12 partners from 7 European countries and is coordinated by the Fondazione B. Kessler (FBK) in Trento, Italy; CNR-IMM is one of the 12 partners. The aim of ANNA is to establish strong, long-term collaboration among the partners, so as to form an integrated multi-site analytical facility able to offer the European community a wide variety of top-level analytical expertise and services in the field of micro- and nano-electronics, including X-ray diffraction and scattering, SIMS, electron microscopy, medium-energy ion scattering, and optical and electrical techniques. The project focuses on three main activities: Networking (standardization of samples and methodologies, establishment of accredited reference laboratories); Transnational Access to laboratories located on the partners' premises to perform specific analytical experiments (an example is given by the two STEM methodologies discussed above); and Joint Research, targeted at improving and extending the methodologies through continuous instrumental and technical development. The European joint analytical laboratory is planned to continue its activity beyond the end of the project in 2010.
Toward improved understanding and control in analytical atomic spectrometry
NASA Astrophysics Data System (ADS)
Hieftje, Gary M.
1989-01-01
As with most papers which attempt to predict the future, this treatment will begin with a coverage of past events. It will be shown that progress in the field of analytical atomic spectrometry has occurred through a series of steps which involve the addition of new techniques and the occasional displacement of established ones. Because it is difficult or impossible to presage true breakthroughs, this manuscript will focus on how such existing methods can be modified or improved to greatest advantage. The thesis will be that rational improvement can be accomplished most effectively by understanding fundamentally the nature of an instrumental system, a measurement process, and a spectrometric technique. In turn, this enhanced understanding can lead to closer control, from which can spring improved performance. Areas where understanding is now lacking and where control is most greatly needed will be identified and a possible scheme for implementing control procedures will be outlined. As we draw toward the new millennium, these novel procedures seem particularly appealing; new high-speed computers, the availability of expert systems, and our enhanced understanding of atomic spectrometric events combine to make future prospects extremely bright.
Combustion of bulk titanium in oxygen
NASA Technical Reports Server (NTRS)
Clark, A. F.; Moulder, J. C.; Runyan, C. C.
1975-01-01
The combustion of bulk titanium in one atmosphere of oxygen is studied using laser ignition and several analytical techniques: high-speed color cinematography, time- and space-resolved spectra in the visible region, metallography (including SEM) of specimens quenched in argon gas, X-ray and chemical product analyses, and a new optical technique, the Hilbert transform method. The cinematographic application of this technique for visualizing phase objects in the combustion zone is described. The results indicate an initial vapor-phase reaction immediately adjacent to the molten surface; as oxygen uptake progresses, however, the evaporation approaches congruency and a much reduced rate. This, along with the accumulation of the various soluble oxides, soon drives the reaction zone below the surface, where gas formation causes boiling and ejection of particles. The buildup of rutile cuts off the oxygen supply and the reaction ceases.
Investigation of energy management strategies for photovoltaic systems - An analysis technique
NASA Technical Reports Server (NTRS)
Cull, R. C.; Eltimsahy, A. H.
1982-01-01
Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.
Spectroscopic methods for the photodiagnosis of nonmelanoma skin cancer.
Drakaki, Eleni; Vergou, Theognosia; Dessinioti, Clio; Stratigos, Alexander J; Salavastru, Carmen; Antoniou, Christina
2013-06-01
The importance of dermatological noninvasive imaging techniques has increased over the last decades, aiming at diagnosing nonmelanoma skin cancer (NMSC). Technological progress has led to the development of various analytical tools, enabling the in vivo/in vitro examination of lesional human skin with the aim to increase diagnostic accuracy and decrease morbidity and mortality. The structure of the skin layers, their chemical composition, and the distribution of their compounds permits the noninvasive photodiagnosis of skin diseases, such as skin cancers, especially for early stages of malignant tumors. An important role in the dermatological diagnosis and disease monitoring has been shown for promising spectroscopic and imaging techniques, such as fluorescence, diffuse reflectance, Raman and near-infrared spectroscopy, optical coherence tomography, and confocal laser-scanning microscopy. We review the use of these spectroscopic techniques as noninvasive tools for the photodiagnosis of NMSC.
CONSTRUCTION PROGRESS PHOTO REMOTE ANALYTICAL FACILITY (CPP627) SHOWING EMPLACEMENT OF ...
CONSTRUCTION PROGRESS PHOTO REMOTE ANALYTICAL FACILITY (CPP-627) SHOWING EMPLACEMENT OF ROOF SLABS. INL PHOTO NUMBER NRTS-54-13463. R.G. Larsen, Photographer, 12/20/1954 - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID
Recent progress of chiral stationary phases for separation of enantiomers in gas chromatography.
Xie, Sheng-Ming; Yuan, Li-Ming
2017-01-01
Chromatography techniques based on chiral stationary phases are widely used for the separation of enantiomers. In particular, gas chromatography has developed rapidly in recent years due to its merits, such as fast analysis speed, lower consumption of stationary phases and analytes, and higher column efficiency, making it a better choice for chiral separation in diverse industries. This article summarizes recent progress in novel chiral stationary phases based on cyclofructan derivatives and chiral porous materials, including chiral metal-organic frameworks, chiral porous organic frameworks, chiral inorganic mesoporous materials, and chiral porous organic cages, in gas chromatography, covering original research papers published since 2010. The chiral recognition properties and mechanisms of enantiomer separation are also introduced. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Application of Interface Technology in Progressive Failure Analysis of Composite Panels
NASA Technical Reports Server (NTRS)
Sleight, D. W.; Lotts, C. G.
2002-01-01
A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.
Big data in health care: using analytics to identify and manage high-risk and high-cost patients.
Bates, David W; Saria, Suchi; Ohno-Machado, Lucila; Shah, Anand; Escobar, Gabriel
2014-07-01
The US health care system is rapidly adopting electronic health records, which will dramatically increase the quantity of clinical data that are available electronically. Simultaneously, rapid progress has been made in clinical analytics--techniques for analyzing large quantities of data and gleaning new insights from that analysis--which is part of what is known as big data. As a result, there are unprecedented opportunities to use big data to reduce the costs of health care in the United States. We present six use cases--that is, key examples--where some of the clearest opportunities exist to reduce costs through the use of big data: high-cost patients, readmissions, triage, decompensation (when a patient's condition worsens), adverse events, and treatment optimization for diseases affecting multiple organ systems. We discuss the types of insights that are likely to emerge from clinical analytics, the types of data needed to obtain such insights, and the infrastructure--analytics, algorithms, registries, assessment scores, monitoring devices, and so forth--that organizations will need to perform the necessary analyses and to implement changes that will improve care while reducing costs. Our findings have policy implications for regulatory oversight, ways to address privacy concerns, and the support of research on analytics. Project HOPE—The People-to-People Health Foundation, Inc.
Analysis of signal to noise enhancement using a highly selective modulation tracking filter
NASA Technical Reports Server (NTRS)
Haden, C. R.; Alworth, C. W.
1972-01-01
Experiments are reported which utilize photodielectric effects in semiconductor loaded superconducting resonant circuits for suppressing noise in RF communication systems. The superconducting tunable cavity acts as a narrow band tracking filter for detecting conventional RF signals. Analytical techniques were developed which lead to prediction of signal-to-noise improvements. Progress is reported in optimization of the experimental variables. These include improved Q, new semiconductors, improved optics, and simplification of the electronics. Information bearing signals were passed through the system, and noise was introduced into the computer model.
Thermotropic Liquid Crystal-Assisted Chemical and Biological Sensors
Honaker, Lawrence W.; Usol’tseva, Nadezhda; Mann, Elizabeth K.
2017-01-01
In this review article, we analyze recent progress in the application of liquid crystal-assisted advanced functional materials for sensing biological and chemical analytes. Multiple research groups demonstrate substantial interest in liquid crystal (LC) sensing platforms, generating an increasing number of scientific articles. We review trends in implementing LC sensing techniques and identify common problems related to the stability and reliability of the sensing materials as well as to experimental set-ups. Finally, we suggest possible means of bridging scientific findings to viable and attractive LC sensor platforms. PMID:29295530
Salivary biomarker development using genomic, proteomic and metabolomic approaches
2012-01-01
The use of saliva as a diagnostic sample provides a non-invasive, cost-efficient method of sample collection for disease screening without the need for highly trained professionals. Saliva collection is far more practical and safe compared with invasive methods of sample collection, because of the infection risk from contaminated needles during, for example, blood sampling. Furthermore, the use of saliva could increase the availability of accurate diagnostics for remote and impoverished regions. However, the development of salivary diagnostics has required technical innovation to allow stabilization and detection of analytes in the complex molecular mixture that is saliva. The recent development of cost-effective room temperature analyte stabilization methods, nucleic acid pre-amplification techniques and direct saliva transcriptomic analysis have allowed accurate detection and quantification of transcripts found in saliva. Novel protein stabilization methods have also facilitated improved proteomic analyses. Although candidate biomarkers have been discovered using epigenetic, transcriptomic, proteomic and metabolomic approaches, transcriptomic analyses have so far achieved the most progress in terms of sensitivity and specificity, and progress towards clinical implementation. Here, we review recent developments in salivary diagnostics that have been accomplished using genomic, transcriptomic, proteomic and metabolomic approaches. PMID:23114182
The phonetics of talk in interaction--introduction to the special issue.
Ogden, Richard
2012-03-01
This overview paper provides an introduction to work on naturally-occurring speech data, combining techniques of conversation analysis with techniques and methods from phonetics. The paper describes the development of the field, highlighting current challenges and progress in interdisciplinary work. It considers the role of quantification and its relationship to a qualitative methodology. It presents the conversation analytic notion of sequence as a version of context, and argues that sequences of talk constrain relevant phonetic design, and so provide one account for variability in naturally occurring speech. The paper also describes the manipulation of speech and language on many levels simultaneously. All of these themes occur and are explored in more detail in the papers contained in this special issue.
Yamini, Yadollah; Seidi, Shahram; Rezazadeh, Maryam
2014-03-03
Sample preparation is an important issue in analytical chemistry and is often the bottleneck in chemical analysis, so the major incentive for recent research has been to attain faster, simpler, less expensive, and more environmentally friendly sample preparation methods. The use of auxiliary energies, such as heat, ultrasound, and microwaves, is one strategy that has been employed in sample preparation to reach these goals. Application of an electrical driving force is the current state of the art, presenting new possibilities for simplifying and shortening the sample preparation process as well as enhancing its selectivity; nevertheless, the electrical driving force has scarcely been utilized in comparison with other auxiliary energies. In this review, the different roles of the electrical driving force (as a powerful auxiliary energy) in various extraction techniques, including liquid-, solid-, and membrane-based methods, are considered, and references are provided on developments in separation techniques and Lab-on-a-Chip (LOC) systems. All aspects of the electrical driving force in extraction and separation methods are too broad to be treated fully in this contribution; the main aim of this review is to provide a brief overview of the different fields of analytical chemistry, with an emphasis on the latest efforts put into electrically assisted membrane-based sample preparation systems. The advantages and disadvantages of these approaches, as well as the new achievements in these areas, are discussed, which might be helpful for further progress in the future. Copyright © 2013 Elsevier B.V. All rights reserved.
A review of electrochemiluminescence (ECL) in and for microfluidic analytical devices.
Kirschbaum, Stefanie E K; Baeumner, Antje J
2015-05-01
The concept and realization of microfluidic total analysis systems (microTAS) have revolutionized the analytical process by integrating the whole breadth of analytical techniques into miniaturized systems. Paramount for efficient and competitive microTAS are integrated detection strategies that lead to low limits of detection while reducing the sample volume. The concept of electrochemiluminescence (ECL) has been intriguing ever since its introduction based on Ru(bpy)3(2+) by Tokel and Bard [1] (J Am Chem Soc 94:2862-2863, 1972), especially because of its immense sensitivity, nonexistent auto-luminescent background signal, and simplicity of experimental design. Integrating ECL detection into microTAS is therefore a logical route to simple yet highly sensitive sensors. However, published microanalytical devices employing ECL detection generally focus on traditional ECL chemistry and have yet to take advantage of advances made in standard bench-top ECL strategies. This review therefore focuses on the most recent advances in microfluidic ECL approaches, but also evaluates the potential impact of bench-top ECL research progress that would further improve the performance and lower the limits of detection of microanalytical ECL systems, ensuring their desirability as a detection principle for microTAS applications.
Lefort, R
1977-01-01
This paper deals with the analytic treatment of a child between 13 and 21 months of age who had been hospitalized since birth. A first phase shows the beginning of a relationship, then the object relationship to the primary objects: on the one hand the food object, and on the other the therapist within the transference. These two objects, impossible in the dimension of reality from the start, are progressively marked by "néantisation" (annihilation), which can take on a symbolic tone. A scene in front of a picture of a child on a nurse's knees progressively introduces Nadia to the third register: the imaginary. Her behaviour is exemplary during the 17 scenes at the mirror, in which she progressively assumes the image of her unified body under the gaze of the other, demonstrating that she can do so only by symbolizing the primary objects, in particular "drinking nothing". The symbolization acquired at the oral level allows her to verbalize, in the same mode, her relationship to the anal object, i.e., in a non-destructive symbolic mode. This treatment raises the question of the use of audiovisual techniques with psychotic and autistic subjects, and it rejects the pedagogic use of images for such subjects, given the primacy of the symbolic function over the imaginary function demonstrated in this paper. This is a prerequisite for any research in the field of audiovisual techniques with psychotics.
Analytical techniques for steroid estrogens in water samples - A review.
Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza
2016-12-01
In recent years, environmental concern over ultra-trace concentrations of steroid estrogens in water samples has increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The techniques highlighted in this review were gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental techniques (GC-MS and LC-MS) and non-instrumental techniques (ELISA, RIA, YES assay, and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its lower detection limit, simplicity, rapidity, and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Single-molecule spectroscopy for plastic electronics: materials analysis from the bottom-up.
Lupton, John M
2010-04-18
π-conjugated polymers find a range of applications in electronic devices. These materials are generally highly disordered in terms of chain length and chain conformation, besides being influenced by a variety of chemical and physical defects. Although this characteristic can be of benefit in certain device applications, disorder severely complicates materials analysis. Accurate analytical techniques are, however, crucial to optimising synthetic procedures and assessing overall material purity. Fortunately, single-molecule spectroscopic techniques have emerged as an unlikely but uniquely powerful approach to unraveling intrinsic material properties from the bottom up. Building on the success of such techniques in the life sciences, single-molecule spectroscopy is finding increasing applicability in materials science, effectively enabling the dissection of the bulk down to the level of the individual molecular constituent. This article reviews recent progress in single-molecule spectroscopy of conjugated polymers as used in organic electronics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hien, P.D.
1994-12-31
Over the ten years since the commissioning of the Dalat nuclear research reactor, a number of nuclear techniques have been developed and applied in Vietnam. Manufacturing of radioisotopes and nuclear instruments, development of isotope tracer and nuclear analytical techniques for environmental studies, and exploitation of filtered neutron beams have been major activities of reactor utilization. Efforts made during ten years of reactor operation have also resulted in establishing and sustaining the applications of nuclear techniques in medicine, industry, agriculture, etc. The successes achieved and lessons learned over the past ten years are discussed, illustrating the approaches taken to developing nuclear science in the conditions of a country having a very low national income and experiencing a transition from a centrally planned to a market-oriented economic system.
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward
2016-01-01
In 2014, a team of researchers, engineers, and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the aim of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid-, and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in the Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.
Analytical Chemistry Laboratory Progress Report for FY 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Boparai, A.S.; Bowers, D.L.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1994 (October 1993 through September 1994). This annual report is the eleventh for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has a research program in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require significant development of methods and adaptation of techniques to obtain useful analytical data. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis -- which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic determinations in solid, liquid, and gaseous samples and provides specialized analytical services. Major instruments in this group include an ion chromatograph (IC), an inductively coupled plasma/atomic emission spectrometer (ICP/AES), spectrophotometers, mass spectrometers (including gas-analysis and thermal-ionization mass spectrometers), emission spectrographs, autotitrators, sulfur and carbon determinators, and a kinetic phosphorescence uranium analyzer.
Metabolomics for laboratory diagnostics.
Bujak, Renata; Struck-Lewicka, Wiktoria; Markuszewski, Michał J; Kaliszan, Roman
2015-09-10
Metabolomics is an emerging approach in the systems biology field. Due to continuous developments in advanced analytical techniques and in bioinformatics, metabolomics has been extensively applied as a novel, holistic diagnostic tool in clinical and biomedical studies. Measurement of the metabolome, as a chemical reflection of the current phenotype of a particular biological system, is nowadays frequently employed to understand the pathophysiological processes involved in disease progression, as well as to search for new diagnostic or prognostic biomarkers of various disorders. In this review, we discuss the research strategies and analytical platforms commonly applied in metabolomics studies. The applications of metabolomics in laboratory diagnostics over the last 5 years are also reviewed according to the type of biological sample used in metabolome analysis. We also discuss some limitations and further improvements that should be considered in view of the potential applications of metabolomics in research and practice. Copyright © 2014 Elsevier B.V. All rights reserved.
Predicting adverse hemodynamic events in critically ill patients.
Yoon, Joo H; Pinsky, Michael R
2018-06-01
The art of predicting future hemodynamic instability in the critically ill has rapidly become a science with the advent of advanced analytical processes based on computer-driven machine learning techniques. How these methods have progressed beyond severity scoring systems to interface with decision support is summarized. Data mining of large multidimensional clinical time-series databases using a variety of machine learning tools has led to our ability to identify alert artifacts and filter them from bedside alarms, display real-time risk stratification at the bedside to aid in clinical decision-making, and predict the subsequent development of cardiorespiratory insufficiency hours before these events occur. This fast-evolving field is primarily limited by the linkage of high-quality granular data to physiologic rationale across heterogeneous clinical care domains. Using advanced analytic tools to glean knowledge from clinical data streams is rapidly becoming a reality whose potential clinical impact is great.
A Review on Microfluidic Paper-Based Analytical Devices for Glucose Detection
Liu, Shuopeng; Su, Wenqiong; Ding, Xianting
2016-01-01
Glucose, as an essential substance directly involved in metabolic processes, is closely related to the occurrence of various diseases such as glucose metabolism disorders and islet cell carcinoma. Therefore, it is crucial to develop sensitive, accurate, rapid, and cost-effective methods for frequent and convenient detection of glucose. Microfluidic paper-based analytical devices (μPADs), which not only satisfy the above requirements but also offer the advantages of portability and minimal sample consumption, have exhibited great potential in the field of glucose detection. This article reviews and summarizes the most recent improvements in glucose detection in two classes of μPADs: colorimetric and electrochemical. The progressive techniques for fabricating channels on μPADs are also emphasized in this article. With the growth of diabetes and other glucose-indicated diseases in underdeveloped and developing countries, low-cost and reliable commercial μPADs for glucose detection will be in unprecedented demand. PMID:27941634
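Purely as an illustration of the colorimetric readout described above (the review itself reports no such calculation, and all numbers below are hypothetical), a μPAD color signal is typically converted to glucose concentration through a calibration curve fitted to standards:

```python
import numpy as np

# Hypothetical colorimetric calibration for a glucose μPAD: mean color
# intensity of the detection zone versus glucose standard concentration (mM).
conc = np.array([0.0, 2.5, 5.0, 10.0, 20.0])       # glucose standards, mM
signal = np.array([0.02, 0.11, 0.20, 0.38, 0.74])  # normalized intensity

# Fit a straight calibration line: signal = slope * conc + intercept.
slope, intercept = np.polyfit(conc, signal, 1)

def glucose_from_signal(s):
    """Invert the calibration line to estimate concentration from a reading."""
    return (s - intercept) / slope

print(f"estimated glucose for signal 0.30: {glucose_from_signal(0.30):.2f} mM")
```

In practice, the response of enzymatic colorimetric assays is linear only over a limited range, so real devices restrict the calibration to that range or use a nonlinear fit.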
Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Gunay, Nur Sibel; Wang, Jing; Sun, Elaine Y; Pradines, Joël R; Farutin, Victor; Shriver, Zachary; Kaundinya, Ganesh V; Capila, Ishan
2017-02-01
Heparan sulfate (HS), a glycosaminoglycan present on the surface of cells, has been postulated to have important roles in driving both normal and pathological physiologies. The chemical structure and sulfation pattern (domain structure) of HS are believed to determine its biological function, to vary across tissue types, and to be modified in the context of disease. Characterization of HS requires isolation and purification of cell surface HS as a complex mixture. This process may introduce additional chemical modification of the native residues. In this study, we describe an approach towards thorough characterization of bovine kidney heparan sulfate (BKHS) that utilizes a variety of orthogonal analytical techniques (e.g. NMR, IP-RP-HPLC, LC-MS). These techniques are applied to characterize this mixture at various levels, including composition, fragment level, and overall chain properties. The combination of these techniques in many instances provides orthogonal views into the fine structure of HS, and in other instances provides overlapping/confirmatory information from different perspectives. Specifically, this approach enables quantitative determination of natural and modified saccharide residues in the HS chains, and identifies unusual structures. Analysis of partially digested HS chains allows for a better understanding of the domain structures within this mixture, and yields specific insights into the non-reducing-end and reducing-end structures of the chains. This approach outlines a useful framework that can be applied to elucidate HS structure and thereby provides a means to advance understanding of its biological role and potential involvement in disease progression. In addition, the techniques described here can be applied to the characterization of heparin from different sources.
New isotope technologies in environmental physics
NASA Astrophysics Data System (ADS)
Povinec, P. P.; Betti, M.; Jull, A. J. T.; Vojtyla, P.
2008-02-01
As the levels of radionuclides observed at present in the environment are very low, highly sensitive analytical systems are required for carrying out environmental investigations. We review recent progress in low-level counting techniques in both the radiometrics and mass spectrometry sectors, with emphasis on underground laboratories, Monte Carlo (GEANT) simulation of the background of HPGe detectors operating in various configurations, secondary ionisation mass spectrometry, and accelerator mass spectrometry. Applications of radiometric and mass spectrometric techniques in radioecology and climate change studies are presented and discussed as well. The review should help orient readers to recent developments in the field of low-level counting and spectrometry, advise on construction principles of underground laboratories, and offer criteria for choosing low- or high-energy mass spectrometers for environmental investigations.
Thermodynamics and Mechanics of Membrane Curvature Generation and Sensing by Proteins and Lipids
Baumgart, Tobias; Capraro, Benjamin R.; Zhu, Chen; Das, Sovan L.
2014-01-01
Research investigating lipid membrane curvature generation and sensing is a rapidly developing frontier in membrane physical chemistry and biophysics. The fast recent progress is based on the discovery of a plethora of proteins involved in coupling membrane shape to cellular membrane function, the design of new quantitative experimental techniques to study aspects of membrane curvature, and the development of analytical theories and simulation techniques that allow a mechanistic interpretation of quantitative measurements. The present review first provides an overview of important classes of membrane proteins for which function is coupled to membrane curvature. We then survey several mechanisms that are assumed to underlie membrane curvature sensing and generation. Finally, we discuss relatively simple thermodynamic/mechanical models that allow quantitative interpretation of experimental observations. PMID:21219150
Current trends in sample preparation for cosmetic analysis.
Zhong, Zhixiong; Li, Gongke
2017-01-01
The widespread applications of cosmetics in modern life make their analysis particularly important from a safety point of view. There is a wide variety of restricted ingredients and prohibited substances that primarily influence the safety of cosmetics. Sample preparation for cosmetic analysis is a crucial step as the complex matrices may seriously interfere with the determination of target analytes. In this review, some new developments (2010-2016) in sample preparation techniques for cosmetic analysis, including liquid-phase microextraction, solid-phase microextraction, matrix solid-phase dispersion, pressurized liquid extraction, cloud point extraction, ultrasound-assisted extraction, and microwave digestion, are presented. Furthermore, the research and progress in sample preparation techniques and their applications in the separation and purification of allowed ingredients and prohibited substances are reviewed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
An Analysis of Earth Science Data Analytics Use Cases
NASA Technical Reports Server (NTRS)
Shie, Chung-Lin; Kempler, Steve
2014-01-01
The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed, as well as requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, the ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.
Bhalla, Nikhil; Jolly, Pawan; Formisano, Nello
2016-01-01
Biosensors are nowadays ubiquitous in biomedical diagnosis as well as a wide range of other areas such as point-of-care monitoring of treatment and disease progression, environmental monitoring, food control, drug discovery, forensics and biomedical research. A wide range of techniques can be used for the development of biosensors. Their coupling with high-affinity biomolecules allows the sensitive and selective detection of a range of analytes. We give a general introduction to biosensors and biosensing technologies, including a brief historical overview, introducing key developments in the field and illustrating the breadth of biomolecular sensing strategies and the expansion of nanotechnological approaches that are now available. PMID:27365030
Applying the design-build-test paradigm in microbiome engineering.
Pham, Hoang Long; Ho, Chun Loong; Wong, Adison; Lee, Yung Seng; Chang, Matthew Wook
2017-12-01
The recently discovered roles of human microbiome in health and diseases have inspired research efforts across many disciplines to engineer microbiome for health benefits. In this review, we highlight recent progress in human microbiome research and how modifications to the microbiome could result in implications to human health. Furthermore, we discuss the application of a 'design-build-test' framework to expedite microbiome engineering efforts by reviewing current literature on three key aspects: design principles to engineer the human microbiome, methods to engineer microbiome with desired functions, and analytical techniques to examine complex microbiome samples. Copyright © 2017 Elsevier Ltd. All rights reserved.
Biologically inspired technologies using artificial muscles
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph
2005-01-01
One of the newest fields of biomimetics is that of electroactive polymers (EAP), also known as artificial muscles. To take advantage of these materials, efforts are being made worldwide to establish a strong infrastructure addressing the need for comprehensive analytical modeling of their response mechanisms and to develop effective processing and characterization techniques. The field is still in an emerging state and robust materials are not yet readily available; however, in recent years significant progress has been made and commercial products have already started to appear. This paper covers the current state of the art and the challenges in making artificial muscles, as well as their potential biomimetic applications.
PAT-tools for process control in pharmaceutical film coating applications.
Knop, Klaus; Kleinebudde, Peter
2013-12-05
Recent development of analytical techniques to monitor the coating process of pharmaceutical solid dosage forms such as pellets and tablets are described. The progress from off- or at-line measurements to on- or in-line applications is shown for the spectroscopic methods near infrared (NIR) and Raman spectroscopy as well as for terahertz pulsed imaging (TPI) and image analysis. The common goal of all these methods is to control or at least to monitor the coating process and/or to estimate the coating end point through timely measurements. Copyright © 2013 Elsevier B.V. All rights reserved.
Analytical techniques: A compilation
NASA Technical Reports Server (NTRS)
1975-01-01
A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.
Su, Wen-Hao; He, Hong-Ju; Sun, Da-Wen
2017-03-24
Staple foods, including cereals, legumes, and root/tuber crops, dominate the daily diet of humans by providing valuable proteins, starch, oils, minerals, and vitamins. Quality evaluation of staple foods is primarily carried out on sensory (e.g. external defect, color), adulteration (e.g. species, origin), chemical (e.g. starch, proteins), mycotoxin (e.g. Fusarium toxin, aflatoxin), parasitic infection (e.g. weevil, beetle), and internal physiological (e.g. hollow heart, black heart) aspects. Conventional methods for the quality assessment of staple foods are always laborious, destructive, and time-consuming. Requirements for online monitoring of staple foods have been proposed to encourage the development of rapid, reagentless, and noninvasive techniques. Spectroscopic techniques, such as visible-infrared spectroscopy, Raman spectroscopy, nuclear magnetic resonance spectroscopy, and spectral imaging, have been introduced as promising analytical tools and applied for the quality evaluation of staple foods. This review summarizes the recent applications and progress of such spectroscopic techniques in determining various qualities of staple foods. Besides, challenges and future trends of these spectroscopic techniques are also presented.
NASA Astrophysics Data System (ADS)
Rostam-Khani, P.; Hopstaken, M. J. P.; Vullings, P.; Noij, G.; O'Halloran, O.; Claassen, W.
2004-06-01
Measurement of surface metal contamination on silicon wafers is essential for yield enhancement in IC manufacturing. Vapor phase decomposition coupled with either inductively coupled plasma mass spectrometry (VPD-ICP-MS) or total reflection X-ray fluorescence (VPD-TXRF), direct TXRF, and more recently time-of-flight secondary ion mass spectrometry (TOF-SIMS) are used to monitor surface metal contamination. These techniques complement each other in their respective strengths and weaknesses. For reliable and accurate quantification, so-called relative sensitivity factors (RSFs) are required for TOF-SIMS analysis. For quantification purposes in VPD, the collection efficiency (CE) is important to ensure complete collection of contamination. A standard procedure has been developed that combines the determination of these RSFs with that of the collection efficiency using all the analytical techniques mentioned above. To this end, sample wafers were intentionally contaminated and analyzed by TOF-SIMS directly after preparation. After VPD-ICP-MS, several scanned surfaces were analyzed again by TOF-SIMS. Comparing the intensities of the specific metals before and after the VPD-DC procedure on the scanned surface allows the determination of the so-called removing efficiency (RE). In general, very good agreement was obtained among the four analytical techniques after updating the RSFs for TOF-SIMS. Progress has been achieved concerning the CE evaluation, as well as in determining the RSFs more precisely for TOF-SIMS.
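The before-and-after intensity comparison that defines the removing efficiency can be sketched in a few lines; the element list and count values below are hypothetical illustrations, not measurements from this work:

```python
# Removing efficiency (RE) from TOF-SIMS intensities measured on the same
# scanned surface before and after the VPD-DC collection procedure:
#   RE = 1 - I_after / I_before
# All intensity values below are hypothetical counts, for illustration only.

intensities_before = {"Fe": 1.2e5, "Cu": 8.0e4, "Ni": 5.5e4}
intensities_after = {"Fe": 2.4e3, "Cu": 4.0e3, "Ni": 1.1e3}

def removing_efficiency(before, after):
    """Return the per-element fraction of contamination removed."""
    return {el: 1.0 - after[el] / before[el] for el in before}

re_values = removing_efficiency(intensities_before, intensities_after)
for element, re in sorted(re_values.items()):
    print(f"{element}: RE = {re:.1%}")
```

A high RE indicates that the VPD droplet collected nearly all of the surface contamination, which is what makes the subsequent ICP-MS quantification trustworthy.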
The evolution of analytical chemistry methods in foodomics.
Gallo, Monica; Ferranti, Pasquale
2016-01-08
The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on modern instrumental platforms. Today, the development and optimization of integrated analytical approaches based on different techniques to study the chemical composition of a food at the molecular level may allow the definition of a 'food fingerprint', valuable for assessing the nutritional value, safety, quality, authenticity, and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors, and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability, and food production in response to world-wide environmental changes. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we will explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies to produce progress in our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review will be to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.
Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C
2014-01-01
Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient's pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (SBM) was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology, or QCP), which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient's physiologic state. This platform allows for the use of predictive analytic techniques to identify early changes in a patient's condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP in which the output of computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz), with the tamponade introduced at a point 420 minutes into the simulation sequence. The functionality of the SBM predictive analytics methodology in identifying clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model and the simulated biosignals are seen to diverge in the early stages of physiologic deterioration, while the variables are still within normal ranges. Thus, the SBM system was found to identify pathophysiologic conditions in a timeframe that would not have been detected in a usual clinical monitoring scenario. Conclusion. In this study, the functionality of a multivariate machine learning predictive methodology that incorporates commonly monitored clinical information was tested using a computer model of human physiology. SBM and predictive analytics were able to differentiate a state of decompensation while the monitored variables were still within normal clinical ranges. This finding suggests that SBM could provide early identification of clinical deterioration using predictive analytic techniques. Keywords: predictive analytics, hemodynamics, monitoring.
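A generic kernel-based similarity model of the kind SBM exemplifies can be caricatured as follows: the expected state is a similarity-weighted blend of stored normal-state vectors, and the size of the residual flags departure from learned behavior. This is a minimal sketch under an assumed Gaussian kernel, not the actual SBM implementation, and the hemodynamic numbers are invented:

```python
import numpy as np

def sbm_estimate(memory, x, bandwidth=0.2):
    """Estimate the expected state for observation x as a similarity-weighted
    combination of the rows of `memory` (each row = one learned normal state)."""
    dists = np.linalg.norm(memory - x, axis=1)
    w = np.exp(-(dists / bandwidth) ** 2)   # Gaussian similarity kernel
    w /= w.sum()
    return w @ memory                        # weighted blend of memory states

# Hypothetical 3-variable hemodynamic states (e.g. HR, MAP, SpO2), normalized.
rng = np.random.default_rng(0)
memory = rng.normal(0.0, 0.1, size=(50, 3))  # learned normal variation

normal_obs = np.array([0.05, -0.02, 0.01])   # within normal variation
drifting_obs = np.array([0.6, -0.5, 0.4])    # early drift away from normal

for obs in (normal_obs, drifting_obs):
    residual = np.linalg.norm(obs - sbm_estimate(memory, obs))
    print(f"residual = {residual:.3f}")
```

The residual grows for the drifting observation well before any single variable crosses a fixed alarm threshold, which is the intuition behind detecting deterioration "while the variables are still within normal ranges."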
Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M
2018-08-01
Detection dogs serve a plethora of roles within modern society, and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs have involved considerable ambiguity. This is partially because the assessment of detection dog effectiveness remains entrenched within a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs rest on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography-related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies provide both an alternative and a complement to the detection dog industry; however, the interrelationship between these two detection paradigms requires clarification. These factors, when considered in terms of their relative contributions, illustrate a need to address research gaps, to formalise the detection dog industry and research process, and to take into consideration analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of the factors involved in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.
Suvarapu, Lakshmi Narayana; Baek, Sung-Ok
2015-01-01
This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539
ERIC Educational Resources Information Center
Hough, Susan L.; Hall, Bruce W.
The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…
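The core arithmetic difference between the two approaches can be illustrated in a few lines: Glass-style meta-analysis averages effect sizes unweighted, while Hunter-Schmidt-style meta-analysis weights each study by its sample size (the full Hunter-Schmidt procedure further corrects for artifacts such as measurement unreliability, omitted here). The study values below are hypothetical:

```python
# Hypothetical studies: (effect size d, sample size N).
studies = [(0.80, 20), (0.30, 200), (0.50, 50)]

# Glass-style: simple unweighted mean of effect sizes.
unweighted = sum(d for d, _ in studies) / len(studies)

# Hunter-Schmidt-style: sample-size-weighted mean of effect sizes.
weighted = sum(d * n for d, n in studies) / sum(n for _, n in studies)

print(f"Glass-style unweighted mean d  : {unweighted:.3f}")
print(f"Hunter-Schmidt weighted mean d : {weighted:.3f}")
```

Whether the weighted mean comes out larger or smaller depends on whether the bigger studies report bigger effects, which is why the two procedures can yield different overall conclusions from the same literature.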
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
Progress in protein crystallography.
Dauter, Zbigniew; Wlodawer, Alexander
2016-01-01
Macromolecular crystallography has evolved enormously from the pioneering days, when structures were solved by "wizards" performing all complicated procedures almost by hand. Today, crystal structures of large systems can often be solved very effectively by various powerful automatic programs in days or hours, or even minutes. Such progress is to a large extent coupled to advances in many other fields, such as genetic engineering, computer technology, the availability of synchrotron beam lines, and many other techniques, creating the highly interdisciplinary science of macromolecular crystallography. Due to this unprecedented success, crystallography is often treated as one of the analytical methods and practiced by researchers who are interested in the structures of macromolecules but not highly competent in the procedures involved in structure determination. One should therefore take into account that contemporary, highly automatic systems can produce results almost without human intervention, but the resulting structures must be carefully checked and validated before their release into the public domain.
NASA R and T aerospace plane vehicles: Progress and plans
NASA Technical Reports Server (NTRS)
Dixon, S. C.
1985-01-01
Progress made in key technologies such as materials, structures, aerothermodynamics, hypersonic aerodynamics, and hypersonic airbreathing propulsion is reported. Advances were also made in more generic areas, such as active controls, flight computer hardware and software, and interdisciplinary analytical design methodology. These technology advances, coupled with the development of and experience with the Space Shuttle, make feasible aerospace plane-type vehicles that meet the more demanding requirements of various DOD missions and/or an all-weather Shuttle II with reduced launch costs. Technology needs and high-payoff technologies were studied, along with the technology advancements in propulsion, control-configured vehicles, aerodynamics, aerothermodynamics, aerothermal loads, and materials and structures. The highest-payoff technologies of materials and structures, including thermal-structural analysis and high-temperature test techniques, are emphasized. The high-priority technology of propulsion is briefly discussed, along with plans that reflect what remains to be done rather than firm program commitments.
Acquisition of Real-Time Operation Analytics for an Automated Serial Sectioning System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madison, Jonathan D.; Underwood, O. D.; Poulter, Gregory A.
2017-03-22
Mechanical serial sectioning is a highly repetitive technique employed in metallography for the rendering of 3D reconstructions of microstructure. While alternate techniques such as ultrasonic detection, micro-computed tomography, and focused ion beam milling have progressed much in recent years, few alternatives provide equivalent opportunities for comparatively high resolutions over significantly sized cross-sectional areas and volumes. To that end, the introduction of automated serial sectioning systems has greatly heightened repeatability and increased data collection rates while diminishing opportunity for mishandling and other user-introduced errors. Unfortunately, even among current, state-of-the-art automated serial sectioning systems, challenges in data collection have not been fully eradicated. Therefore, this paper highlights two specific advances to assist in this area: a non-contact laser triangulation method for assessment of material removal rates and a newly developed graphical user interface providing real-time monitoring of experimental progress. Furthermore, both are shown to be helpful in the rapid identification of anomalies and interruptions, while also providing comparable and less error-prone measures of removal rate over the course of these long-term, challenging, and innately destructive characterization experiments.
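The removal-rate idea above can be illustrated with a toy calculation: if a non-contact triangulation sensor reports the specimen surface height after each section, per-section removal is the successive height difference, and sections whose removal deviates strongly from the target flag anomalies. This is a hypothetical sketch, not the authors' implementation:

```python
def removal_rates(heights_um):
    """Per-section material removal (um) from successive surface-height readings."""
    return [h0 - h1 for h0, h1 in zip(heights_um, heights_um[1:])]

def flag_anomalies(rates, target, tol=0.3):
    # indices of sections whose removal deviates more than `tol` (fractional) from target
    return [i for i, r in enumerate(rates) if abs(r - target) > tol * target]
```

For readings [100.0, 98.5, 97.1, 94.0] um with a 1.5 um target, the third section (3.1 um removed) would be flagged for operator attention.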
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, though sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Green analytical chemistry--theory and practice.
Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek
2010-08-01
This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.
Contributed review: quantum cascade laser based photoacoustic detection of explosives.
Li, J S; Yu, B; Fischer, H; Chen, W; Yalin, A P
2015-03-01
Detecting trace explosives and explosive-related compounds has recently become a topic of utmost importance for increasing public security around the world. A wide variety of detection methods and an even wider range of physical chemistry issues are involved in this very challenging area. Optical sensing methods, in particular mid-infrared spectrometry techniques, have great potential to become more desirable tools for the detection of explosives. Their small size, simplicity, high output power, and long-term reliability make external cavity quantum cascade lasers (EC-QCLs) promising spectroscopic sources for developing analytical instrumentation. This work reviews the current technical progress in EC-QCL-based photoacoustic spectroscopy for explosives detection. Work on both close-contact and standoff configurations using this technique over approximately the last decade is presented.
Biofuel metabolic engineering with biosensors.
Morgan, Stacy-Anne; Nadler, Dana C; Yokoo, Rayka; Savage, David F
2016-12-01
Metabolic engineering offers the potential to renewably produce important classes of chemicals, particularly biofuels, at an industrial scale. DNA synthesis and editing techniques can generate large pathway libraries, yet identifying the best variants is slow and cumbersome. Traditionally, analytical methods like chromatography and mass spectrometry have been used to evaluate pathway variants, but such techniques cannot be performed with high throughput. Biosensors - genetically encoded components that actuate a cellular output in response to a change in metabolite concentration - are therefore a promising tool for rapid and high-throughput evaluation of candidate pathway variants. Biosensors can also be applied to dynamically tune pathways in response to metabolic changes, improving balance and productivity. Here, we describe the major classes of biosensors and briefly highlight recent progress in applying them to biofuel-related metabolic pathway engineering. Copyright © 2016 Elsevier Ltd. All rights reserved.
CD147/EMMPRIN overexpression and prognosis in cancer: A systematic review and meta-analysis
Xin, Xiaoyan; Zeng, Xianqin; Gu, Huajian; Li, Min; Tan, Huaming; Jin, Zhishan; Hua, Teng; Shi, Rui; Wang, Hongbo
2016-01-01
CD147/EMMPRIN (extracellular matrix metalloproteinase inducer) plays an important role in tumor progression, and a number of studies have suggested that it is an indicator of tumor prognosis. This meta-analysis systematically reevaluated the predictive potential of CD147/EMMPRIN in various cancers. We searched the PubMed and Embase databases to screen the literature. Fixed-effect and random-effect meta-analytic techniques were used to correlate CD147 expression with outcome measures. A total of 53 studies that included 68 datasets were eligible for inclusion in the final analysis. We found a significant association between CD147/EMMPRIN overexpression and adverse tumor outcomes, such as overall survival, disease-specific survival, progression-free survival, metastasis-free survival, or recurrence-free survival, irrespective of the model analysis. In addition, CD147/EMMPRIN overexpression predicted a high risk of resistance to chemotherapy drugs. CD147/EMMPRIN is a central player in tumor progression and predicts a poor prognosis, including in patients who have received chemo-radiotherapy. Our results provide evidence that CD147/EMMPRIN could be a potential therapeutic target for cancers. PMID:27608940
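As an illustration of the fixed-effect model mentioned above, inverse-variance weighting pools per-study log hazard ratios (the random-effect model adds a between-study variance term to each weight). A minimal sketch with hypothetical numbers, not the study's data:

```python
import math

def fixed_effect_pool(log_effects, std_errors):
    """Inverse-variance fixed-effect pooling of per-study log hazard ratios."""
    weights = [1.0 / se**2 for se in std_errors]
    pooled = sum(w * x for w, x in zip(weights, log_effects)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    return pooled, se_pooled

# two hypothetical studies: HR = 2.0 (SE of log HR 0.2) and HR = 1.5 (SE 0.4)
pooled, se = fixed_effect_pool([math.log(2.0), math.log(1.5)], [0.2, 0.4])
pooled_hr = math.exp(pooled)
```

Here the pooled log HR is about 0.636 (HR about 1.89), dominated by the more precise study, which is exactly the behaviour inverse-variance weighting is designed to produce.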
Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.
ERIC Educational Resources Information Center
Heineman, William R.; Kissinger, Peter T.
1980-01-01
Reports developments involving the experimental aspects of finite and current analytical electrochemistry including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)
Genome wide approaches to identify protein-DNA interactions.
Ma, Tao; Ye, Zhenqing; Wang, Liguo
2018-05-29
Transcription factors are DNA-binding proteins that play key roles in many fundamental biological processes. Unraveling their interactions with DNA is essential to identify their target genes and understand the regulatory network. Genome-wide identification of their binding sites became feasible thanks to recent progress in experimental and computational approaches. ChIP-chip, ChIP-seq, and ChIP-exo are three widely used techniques to demarcate genome-wide transcription factor binding sites. This review aims to provide an overview of these three techniques, including their experimental procedures, computational approaches, and popular analytic tools. ChIP-chip, ChIP-seq, and ChIP-exo have been the major techniques for studying genome-wide in vivo protein-DNA interaction. Due to the rapid development of next-generation sequencing technology, array-based ChIP-chip is deprecated and ChIP-seq has become the most widely used technique to identify transcription factor binding sites genome-wide. The newly developed ChIP-exo further improves the spatial resolution to the single-nucleotide level. Numerous tools have been developed to analyze ChIP-chip, ChIP-seq, and ChIP-exo data. However, different programs may employ different mechanisms or underlying algorithms, and each will inherently include its own set of statistical assumptions and biases. Choosing the most appropriate analytic program for a given experiment therefore needs careful consideration. Moreover, most programs only have a command-line interface, so their installation and usage require basic computational expertise in Unix/Linux. Copyright © Bentham Science Publishers.
Analytical Chemistry Division annual progress report for period ending November 30, 1977
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, W.S.
1978-03-01
Activities for the year are summarized in sections on analytical methodology, mass and mass emission spectrometry, analytical services, bio-organic analysis, nuclear and radiochemical analysis, and quality assurance and safety. Presentations of research results in publications and reports are tabulated. (JRD)
Analytical Chemistry Division annual progress report for period ending December 31, 1961
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1962-02-01
Research and development progress is reported on analytical instrumentation, dissolver-solution analyses, special research problems, reactor projects analyses, x-ray and spectrochemical analyses, mass spectrometry, optical and electron microscopy, radiochemical analyses, nuclear analyses, inorganic preparations, organic preparations, ionic analyses, infrared spectral studies, anodization of sector coils for the Analog II Cyclotron, quality control, process analyses, and the Thermal Breeder Reactor Projects Analytical Chemistry Laboratory. (M.C.G.)
Analytical Chemistry Division annual progress report for period ending December 31, 1985
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shultz, W.D.
1986-05-01
Progress reports are presented for the four major sections of the division: analytical spectroscopy, radioactive materials laboratories, inorganic chemistry, and organic chemistry. A brief discussion of the division's role in the Laboratory's Environmental Restoration and Facilities Upgrade is given. Information about quality assurance and safety programs is presented, along with a tabulation of analyses rendered. Publications, oral presentations, professional activities, educational programs, and seminars are cited.
Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael
2018-05-01
Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to their in-house techniques, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant particle physical properties, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.
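The notion of an analytical guide map can be sketched as a simple lookup from the information required to candidate techniques. The entries below are illustrative examples drawn from the technique families the abstract names, not the paper's actual map:

```python
# toy guide map: analytical information required -> candidate techniques
GUIDE_MAP = {
    "elemental composition": ["ICP-MS", "ICP-OES", "XRF", "PIXE"],
    "carbonaceous fraction": ["thermal-optical analysis"],
    "organic speciation": ["GC-MS", "LC-MS"],
    "surface composition": ["XPS", "SEM-EDX"],
}

def candidate_techniques(information_required):
    """Return candidate techniques for the requested analytical information."""
    return GUIDE_MAP.get(information_required, [])
```

In practice the choice among candidates would then be narrowed by the parameters the abstract lists: sampling and measuring time, facility access, and equipment costs.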
Progress of new label-free techniques for biosensors: a review.
Sang, Shengbo; Wang, Yajun; Feng, Qiliang; Wei, Ye; Ji, Jianlong; Zhang, Wendong
2016-01-01
The detection techniques used in biosensors can be broadly classified into label-based and label-free. Label-based detection relies on the specific properties of labels for detecting a particular target. In contrast, label-free detection is suitable for target molecules that are not labeled or for screening analytes that are not easy to tag. More types of label-free biosensors have also emerged with developments in biotechnology. The most recently developed label-free biosensor techniques are discussed, including field-effect transistor-based biosensors (carbon nanotube, graphene, and silicon nanowire field-effect transistor biosensors), magnetoelastic biosensors, optical biosensors, surface stress-based biosensors, and other types of biosensors based on nanotechnology. The sensing principles, configurations, sensing performance, applications, advantages, and restrictions of the different label-free biosensors are considered and discussed in this review. Most concepts included in this survey could certainly be applied to the development of such biosensors in the future.
Cell-free DNA and next-generation sequencing in the service of personalized medicine for lung cancer
Bennett, Catherine W.; Berchem, Guy; Kim, Yeoun Jin; El-Khoury, Victoria
2016-01-01
Personalized medicine has emerged as the future of cancer care to ensure that patients receive individualized treatment specific to their needs. In order to provide such care, molecular techniques that enable oncologists to diagnose, treat, and monitor tumors are necessary. In the field of lung cancer, cell free DNA (cfDNA) shows great potential as a less invasive liquid biopsy technique, and next-generation sequencing (NGS) is a promising tool for analysis of tumor mutations. In this review, we outline the evolution of cfDNA and NGS and discuss the progress of using them in a clinical setting for patients with lung cancer. We also present an analysis of the role of cfDNA as a liquid biopsy technique and NGS as an analytical tool in studying EGFR and MET, two frequently mutated genes in lung cancer. Ultimately, we hope that using cfDNA and NGS for cancer diagnosis and treatment will become standard for patients with lung cancer and across the field of oncology. PMID:27589834
Intelligent manipulation technique for multi-branch robotic systems
NASA Technical Reports Server (NTRS)
Chen, Alexander Y. K.; Chen, Eugene Y. S.
1990-01-01
New analytical development in kinematics planning is reported. The INtelligent KInematics Planner (INKIP) consists of the kinematics spline theory and the adaptive logic annealing process. Also, a novel framework of robot learning mechanism is introduced. The FUzzy LOgic Self Organized Neural Networks (FULOSONN) integrates fuzzy logic in commands, control, searching, and reasoning, the embedded expert system for nominal robotics knowledge implementation, and the self organized neural networks for the dynamic knowledge evolutionary process. Progress on the mechanical construction of SRA Advanced Robotic System (SRAARS) and the real time robot vision system is also reported. A decision was made to incorporate the Local Area Network (LAN) technology in the overall communication system.
Developments in the Identification of Glycan Biomarkers for the Detection of Cancer
Ruhaak, L. Renee; Miyamoto, Suzanne; Lebrilla, Carlito B.
2013-01-01
Changes in glycosylation readily occur in cancer and other disease states. Thanks to recent advances in the development of analytical techniques and instrumentation, especially in mass spectrometry, it is now possible to identify blood-derived glycan-based biomarkers using glycomics strategies. This review is an overview of the developments made in the search for glycan-based cancer biomarkers and the technologies currently in use. It is anticipated that the progressing instrumental and bioinformatics developments will allow the identification of relevant glycan biomarkers for the diagnosis, early detection, and monitoring of cancer treatment with sufficient sensitivity and specificity for clinical use. PMID:23365456
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954
Tracking Student Progression through the Core Curriculum. CCRC Analytics
ERIC Educational Resources Information Center
Hodara, Michelle; Rodriguez, Olga
2013-01-01
This report demonstrates useful methods for examining student progression through the core curriculum. The authors carry out analyses at two colleges in two different states, illustrating students' overall progression through the core curriculum and the relationship of this "core" progression to their college outcomes. By means of this analysis,…
Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z
2015-12-01
Most drinking water utilities are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is one of the promising disinfectants, usually used as a secondary disinfectant, whereas the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurements (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET), and amperometric titration) to determine the superior technique. The techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior technique was determined against pre-defined criteria. To discern the effectiveness of the superior technique, various factors that might influence performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, chronoamperometry showed a significant level of accuracy and precision. Furthermore, the various influencing factors studied did not diminish the technique's performance, which was adequate in all matrices. This study is a step towards proper disinfection monitoring, and it assists engineers with chlorine dioxide disinfection system planning and management.
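Accuracy and precision of the kind compared here are conventionally reported as percent recovery against a known standard and relative standard deviation (RSD) of replicate measurements. A minimal sketch with invented replicate readings, not the study's data:

```python
from statistics import mean, stdev

def accuracy_precision(replicates, true_value):
    """Return (% recovery, % RSD) for replicate measurements of a known standard."""
    m = mean(replicates)
    recovery = 100.0 * m / true_value      # accuracy expressed as percent recovery
    rsd = 100.0 * stdev(replicates) / m    # precision as relative standard deviation
    return recovery, rsd
```

For replicates [0.98, 1.02, 1.00] mg/L of a 1.00 mg/L standard, this gives 100% recovery with 2% RSD; computing these figures per technique and per matrix is one way to rank methods against pre-defined criteria.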
Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette
2014-08-15
The purpose of the study was to perform a comparative analysis of the technical performance, respective costs, and environmental effects of two invasive analytical methods (HPLC and UV/visible-FTIR) and a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations of equipment depreciation with the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations were found between the two spectroscopic techniques and HPLC. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior to the other, invasive analytical methods with respect to technical, economic, and environmental objectives. Copyright © 2014 Elsevier B.V. All rights reserved.
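A common non-parametric rank correlation test for this kind of inter-method comparison is Spearman's rho. A self-contained sketch for tie-free data, using hypothetical paired concentrations from two methods (the study's actual test statistic is not specified in the abstract):

```python
def spearman_rho(x, y):
    """Spearman rank correlation for tie-free paired data."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_sq = sum((a - b) ** 2 for a, b in zip(rx, ry))
    # classic formula: rho = 1 - 6 * sum(d^2) / (n (n^2 - 1)), valid without ties
    return 1 - 6 * d_sq / (n * (n * n - 1))
```

Two methods that rank every sample identically give rho = 1.0; a strong rho between a spectroscopic method and the HPLC reference supports inter-method agreement.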
Bruno, C; Patin, F; Bocca, C; Nadal-Desbarats, L; Bonnier, F; Reynier, P; Emond, P; Vourc'h, P; Joseph-Delafont, K; Corcia, P; Andres, C R; Blasco, H
2018-01-30
Metabolomics is an emerging science based on diverse high throughput methods that are rapidly evolving to improve metabolic coverage of biological fluids and tissues. Technical progress has led researchers to combine several analytical methods without reporting the impact on metabolic coverage of such a strategy. The objective of our study was to develop and validate several analytical techniques (mass spectrometry coupled to gas or liquid chromatography and nuclear magnetic resonance) for the metabolomic analysis of small muscle samples and evaluate the impact of combining methods for more exhaustive metabolite covering. We evaluated the muscle metabolome from the same pool of mouse muscle samples after 2 metabolite extraction protocols. Four analytical methods were used: targeted flow injection analysis coupled with mass spectrometry (FIA-MS/MS), gas chromatography coupled with mass spectrometry (GC-MS), liquid chromatography coupled with high-resolution mass spectrometry (LC-HRMS), and nuclear magnetic resonance (NMR) analysis. We evaluated the global variability of each compound i.e., analytical (from quality controls) and extraction variability (from muscle extracts). We determined the best extraction method and we reported the common and distinct metabolites identified based on the number and identity of the compounds detected with low analytical variability (variation coefficient<30%) for each method. Finally, we assessed the coverage of muscle metabolic pathways obtained. Methanol/chloroform/water and water/methanol were the best extraction solvent for muscle metabolome analysis by NMR and MS, respectively. We identified 38 metabolites by nuclear magnetic resonance, 37 by FIA-MS/MS, 18 by GC-MS, and 80 by LC-HRMS. The combination led us to identify a total of 132 metabolites with low variability partitioned into 58 metabolic pathways, such as amino acid, nitrogen, purine, and pyrimidine metabolism, and the citric acid cycle. 
This combination also showed that the contribution of GC-MS was low when used in combination with the other mass spectrometry methods and nuclear magnetic resonance to explore muscle samples. This study reports the validation of several analytical methods, based on nuclear magnetic resonance and several mass spectrometry methods, to explore the muscle metabolome from a small amount of tissue, comparable to that obtained during a clinical trial. The combination of several techniques may be relevant for the exploration of muscle metabolism, with acceptable analytical variability and overlap between methods. However, the difficult and time-consuming data pre-processing, processing, and statistical analysis steps do not justify systematically combining analytical methods. Copyright © 2017 Elsevier B.V. All rights reserved.
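The filtering-and-union step described above — keep, for each method, only the metabolites whose quality-control replicates show a coefficient of variation below 30%, then pool the reliable metabolites across methods — can be sketched as follows. All metabolite names and intensity values here are hypothetical, not taken from the study.

```python
# Sketch of CV-based filtering and combined-coverage pooling (hypothetical data).

def cv(values):
    """Coefficient of variation in percent (sample standard deviation / mean)."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return 100.0 * sd / mean

# Hypothetical QC replicate intensities per metabolite, per analytical method.
methods = {
    "NMR":     {"alanine": [10.0, 10.5, 9.8], "lactate": [5.0, 9.0, 2.0]},
    "LC-HRMS": {"alanine": [200, 210, 205],   "citrate": [50, 52, 49]},
}

# Keep only metabolites with CV < 30% within each method.
reliable = {
    name: {m for m, reps in metabolites.items() if cv(reps) < 30.0}
    for name, metabolites in methods.items()
}

# Combined coverage: metabolites reliably detected by at least one method.
combined = set().union(*reliable.values())
print(sorted(combined))
```

With these toy numbers, "lactate" is dropped (its QC replicates vary too much) while the union of the remaining metabolites gives the combined coverage.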
NASA Astrophysics Data System (ADS)
Parvathi, S. P.; Ramanan, R. V.
2018-06-01
An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and the third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without including the perturbations and then modified to include them. The modification is based on: (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence, including the perturbations, and (ii) quantification of deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides a realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P Yu
Unlike traditional 'wet' analytical methods, which during sample processing often destroy or alter the intrinsic protein structures, advanced synchrotron radiation-based Fourier transform infrared microspectroscopy has been developed as a rapid, nondestructive bioanalytical technique. This cutting-edge synchrotron-based technology, taking advantage of synchrotron light brightness (millions of times brighter than the sun), is capable of exploring the molecular chemistry or structure of a biological tissue at ultra-high spatial resolution without destroying its inherent structures. In this article, a novel approach is introduced to show the potential of this advanced synchrotron-based analytical technology, which can be used to study plant-based food or feed protein molecular structure in relation to nutrient utilization and availability. Recent progress is reported on using synchrotron radiation-based Fourier transform infrared microspectroscopy and diffuse reflectance infrared Fourier transform spectroscopy to detect the effects of gene transformation (Application 1), autoclaving (Application 2), and bio-ethanol processing (Application 3) on plant-based food and feed protein structure changes on a molecular basis. The synchrotron-based technology provides a new approach for plant-based protein structure research at ultra-high spatial resolution at cellular and molecular levels.
Recent developments in the analysis of toxic elements.
Lisk, D J
1974-06-14
One may conclude that it is impractical to confine oneself to any one analytical method since ever more sensitive instrumentation continues to be produced. However, in certain methods such as anodic stripping voltammetry and flameless atomic absorption it may be background contamination from reagent impurities and surroundings rather than instrument sensitivity which controls the limits of element detection. The problem of contamination from dust or glassware is greatly magnified when the sample size becomes ever smaller. Air entering laboratories near highways may contain trace quantities of lead, cadmium, barium, antimony, and other elements from engine exhaust. Even plastic materials contacting the sample may be suspect as a source of contamination since specific metals may be used as catalysts in the synthesis of the plastic and traces may be retained in it. Certain elements may even be deliberately added to plastics during manufacture for identification purposes. Nondestructive methods such as neutron activation and x-ray techniques thus offer great advantages not only in time but in the elimination of impurities introduced during sample ashing. Future improvements in attainable limits of detection may arise largely from progress in the ultrapurification of reagents and "clean-room" techniques. Finally, the competence of the analyst is also vitally important in the skillful operation of modern complex analytical instrumentation and in the experienced evaluation of data.
Investigation of Using Analytics in Promoting Mobile Learning Support
ERIC Educational Resources Information Center
Visali, Videhi; Swami, Niraj
2013-01-01
Learning analytics can promote pedagogically informed use of learner data, which can steer the progress of technology mediated learning across several learning contexts. This paper presents the application of analytics to a mobile learning solution and demonstrates how a pedagogical sense was inferred from the data. Further, this inference was…
Some New/Old Approaches to QCD
DOE R&D Accomplishments Database
Gross, D. J.
1992-11-01
In this lecture I shall discuss some recent attempts to revive some old ideas to address the problem of solving QCD. I believe that it is timely to return to this problem, which has been woefully neglected for the last decade. QCD is a permanent part of the theoretical landscape and eventually we will have to develop analytic tools for dealing with the theory in the infra-red. Lattice techniques are useful but they have not yet lived up to their promise. Even if one manages to derive the hadronic spectrum numerically, to an accuracy of 10% or even 1%, we will not be truly satisfied unless we have some analytic understanding of the results. Also, lattice Monte-Carlo methods can only be used to answer a small set of questions. Many issues of great conceptual and practical interest, in particular the calculation of scattering amplitudes, are thus far beyond lattice control. Any progress in controlling QCD in an explicit, analytic fashion would be of great conceptual value. It would also be of great practical aid to experimentalists, who must use rather ad-hoc and primitive models of QCD scattering amplitudes to estimate the backgrounds to interesting new physics. I will discuss an attempt to derive a string representation of QCD and a revival of the large N approach to QCD. Both of these ideas have a long history; many theorist-years have been devoted to their pursuit, so far with little success. I believe that it is time to try again. In part this is because of the progress in the last few years in string theory. Our increased understanding of string theory should make the attempt to discover a stringy representation of QCD easier, and the methods explored in matrix models might be employed to study the large N limit of QCD.
Gore, Sally A; Nordberg, Judith M; Palmer, Lisa A; Piorun, Mary E
2009-07-01
This study analyzed trends in research activity as represented in the published research in the leading peer-reviewed professional journal for health sciences librarianship. Research articles were identified from the Bulletin of the Medical Library Association and Journal of the Medical Library Association (1991-2007). Using content analysis and bibliometric techniques, data were collected for each article on the (1) subject, (2) research method, (3) analytical technique used, (4) number of authors, (5) number of citations, (6) first author affiliation, and (7) funding source. The results were compared to a previous study, covering the period 1966 to 1990, to identify changes over time. Of the 930 articles examined, 474 (51%) were identified as research articles. Survey (n = 174, 37.1%) was the most common methodology employed, quantitative descriptive statistics (n = 298, 63.5%) the most used analytical technique, and applied topics (n = 332, 70%) the most common type of subject studied. The majority of first authors were associated with an academic health sciences library (n = 264, 55.7%). Only 27.4% (n = 130) of studies identified a funding source. This study's findings demonstrate that progress is being made in health sciences librarianship research. There is, however, room for improvement in terms of research methodologies used, proportion of applied versus theoretical research, and elimination of barriers to conducting research for practicing librarians.
Nanoflow Separation of Amino Acids for the Analysis of Cosmic Dust
NASA Technical Reports Server (NTRS)
Martin, M. P.; Glavin, D. P.; Dworkin, Jason P.
2008-01-01
The delivery of amino acids to the early Earth by interplanetary dust particles, comets, and carbonaceous meteorites could have been a significant source of the early Earth's prebiotic organic inventory. Amino acids are central to modern terrestrial biochemistry as major components of proteins and enzymes and were probably vital in the origin of life. A variety of amino acids have been detected in the CM carbonaceous meteorite Murchison, many of which are exceptionally rare in the terrestrial biosphere, including α-aminoisobutyric acid (AIB) and isovaline. AIB has also been detected in a small percentage of Antarctic micrometeorite grains believed to be related to the CM meteorites. We report on progress in optimizing a nanoflow liquid chromatography separation system with dual detection via laser-induced fluorescence and time-of-flight mass spectrometry (nLC-LIF/ToF-MS) for the analysis of o-phthaldialdehyde/N-acetyl-L-cysteine (OPA/NAC) labeled amino acids in cosmic dust grains. The very low flow rates (<3 µL/min) of nLC compared with analytical LC (>0.1 mL/min), combined with <2 micron column bead sizes, have the potential to produce efficient analyte ionizations and chromatograms with very sharp peaks; both increase sensitivity. The combination of selectivity (only primary amines are derivatized), sensitivity (detection limits >4 orders of magnitude lower than traditional GC-MS techniques), and specificity (compound identities are determined by both retention time and exact mass) makes this a compelling technique. However, the development of an analytical method to achieve separation of compounds as structurally similar as amino acid monomers and produce the sharp peaks required for maximum sensitivity is challenging.
Finalizing the Libby Action Plan Research Program
Libby, Montana is the location of a former vermiculite mine that operated from 1923 to 1990. The vermiculite ore from the mine co-existed with amphibole asbestos, referred to as Libby Amphibole Asbestos (LAA). With the cessation of the asbestos mining and processing operations, there has been significant progress in reducing exposure to LAA in Libby, Montana. In 2009, the U.S. Environmental Protection Agency (EPA), jointly with the Department of Health and Human Services (DHHS), declared a public health emergency in Libby due to observed asbestos-related health effects in the region. As part of this effort, the EPA led a cross-agency research program that conducted analytical, toxicological, and epidemiological research on the health effects of asbestos at the Libby Asbestos Superfund Site (Libby Site) in Libby, Montana. The Libby Action Plan (LAP) was initiated in 2007 to support the site-specific risk assessment for the Libby Site. The goal of the LAP research program was to explore the health effects of LAA and determine toxicity information specific to LAA in order to accurately inform a human health risk assessment at the Libby Site. LAP research addressed data gaps related to the health effects of exposure to LAA, particularly specific mechanisms of fiber dosimetry and toxicity (e.g., inflammatory responses), and also investigated disease progression in exposed populations and advanced asbestos analytical techniques. This work incl
Depth-resolved monitoring of analytes diffusion in ocular tissues
NASA Astrophysics Data System (ADS)
Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.
2007-02-01
Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantification of analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied and their permeability coefficients calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. The obtained results suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.
Artificial intelligence in healthcare: past, present and future.
Jiang, Fei; Jiang, Yong; Zhi, Hui; Dong, Yi; Li, Hao; Ma, Sufeng; Wang, Yilong; Dong, Qiang; Shen, Haipeng; Wang, Yongjun
2017-12-01
Artificial intelligence (AI) aims to mimic human cognitive functions. It is bringing a paradigm shift to healthcare, powered by the increasing availability of healthcare data and rapid progress of analytics techniques. We survey the current status of AI applications in healthcare and discuss its future. AI can be applied to various types of healthcare data (structured and unstructured). Popular AI techniques include machine learning methods for structured data, such as the classical support vector machine and neural network, and the modern deep learning, as well as natural language processing for unstructured data. Major disease areas that use AI tools include cancer, neurology and cardiology. We then review in more detail the AI applications in stroke, in the three major areas of early detection and diagnosis, treatment, and outcome prediction and prognosis evaluation. We conclude with a discussion of pioneer AI systems, such as IBM Watson, and hurdles to real-life deployment of AI.
NASA Astrophysics Data System (ADS)
Fu, Qiang; Schaaf, Peter
2018-07-01
This special issue of the high impact international peer reviewed journal Applied Surface Science represents the proceedings of the 2nd International Conference on Applied Surface Science ICASS held 12-16 June 2017 in Dalian China. The conference provided a forum for researchers in all areas of applied surface science to present their work. The main topics of the conference are in line with the most popular areas of research reported in Applied Surface Science. Thus, this issue includes current research on the role and use of surfaces in chemical and physical processes, related to catalysis, electrochemistry, surface engineering and functionalization, biointerfaces, semiconductors, 2D-layered materials, surface nanotechnology, energy, new/functional materials and nanotechnology. Also the various techniques and characterization methods will be discussed. Hence, scientific research on the atomic and molecular level of material properties investigated with specific surface analytical techniques and/or computational methods is essential for any further progress in these fields.
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
Simulation and statistics: Like rhythm and song
NASA Astrophysics Data System (ADS)
Othman, Abdul Rahman
2013-04-01
Simulation was introduced to solve problems posed as systems. It can overcome two kinds of problems. First, a problem may have an analytical solution, but the cost of running a physical experiment to verify it is high in terms of money or lives. Second, a problem may exist that has no analytical solution. In the field of statistical inference the second kind is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques, such as the bootstrap and permutation tests, to form a pseudo sampling distribution that leads to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. This paper also discusses resampling techniques as simulation techniques. Misunderstandings about these two techniques are examined, and successful usages of both are explained.
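As an illustration of the resampling idea mentioned above, a minimal bootstrap sketch follows: resample the data with replacement many times to form a pseudo sampling distribution of a statistic (here the mean) and read off a percentile confidence interval. The sample values are hypothetical.

```python
# Bootstrap sketch: pseudo sampling distribution of the mean (hypothetical data).
import random

random.seed(42)  # make the resampling reproducible
sample = [2.1, 3.4, 2.8, 5.0, 3.9, 4.2, 2.5, 3.1]

def bootstrap_means(data, n_resamples=10_000):
    """Means of resamples drawn with replacement from `data`."""
    n = len(data)
    return [
        sum(random.choice(data) for _ in range(n)) / n
        for _ in range(n_resamples)
    ]

means = sorted(bootstrap_means(sample))
# Percentile 95% confidence interval for the mean.
lo, hi = means[int(0.025 * len(means))], means[int(0.975 * len(means))]
print(f"95% CI for the mean: ({lo:.2f}, {hi:.2f})")
```

The same machinery, with the statistic swapped out and the resampling scheme changed to label shuffling, yields a permutation test.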
Advances in analytical instrumentation have not only increased the number and types of chemicals measured, but reduced the quantitation limits, allowing these chemicals to be detected at progressively lower concentrations in various environmental matrices. Such analytical advanc...
[Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].
Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou
2014-08-01
In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous, and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis), and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis, and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS may become a super star in the elemental analysis field, just like LIBS (laser-induced breakdown spectroscopy).
Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.
Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun
2017-07-08
Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma, and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME), and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.
A Progressive Approach to Teaching Analytics in the Marketing Curriculum
ERIC Educational Resources Information Center
Liu, Yiyuan; Levin, Michael A.
2018-01-01
With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…
Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment
ERIC Educational Resources Information Center
Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James
2010-01-01
The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…
NASA Astrophysics Data System (ADS)
McDonald, Robert Christopher
The purpose of this study was to explore the process of developing a learning progression (LP) on constructing explanations about sea level rise. I used a learning progressions theoretical framework informed by the situated cognition learning theory. During this exploration, I explicitly described my decision-making process as I developed and revised a hypothetical learning progression. Correspondingly, my research question was: What is a process by which a hypothetical learning progression on sea level rise is developed into an empirical learning progression using learners' explanations? To answer this question, I used a qualitative descriptive single case study with multiple embedded cases (Yin, 2014) that employed analytic induction (Denzin, 1970) to analyze data collected on middle school learners (grades 6-8). Data sources included written artifacts, classroom observations, and semi-structured interviews. Additionally, I kept a researcher journal to track my thinking about the learning progression throughout the research study. Using analytic induction to analyze collected data, I developed eight analytic concepts: participant explanation structures varied widely, global warming and ice melt cause sea level rise, participants held alternative conceptions about sea level rise, participants learned about thermal expansion as a fundamental aspect of sea level rise, participants learned to incorporate authentic scientific data, participants' mental models of the ocean varied widely, sea ice melt contributes to sea level rise, and participants held vague and alternative conceptions about how pollution impacts the ocean. I started with a hypothetical learning progression, gathered empirical data via various sources (especially semi-structured interviews), revised the hypothetical learning progression in response to those data, and ended with an empirical learning progression comprising six levels of learner thinking. 
As a result of developing an empirically based LP, I was able to compare two learning progressions on the same topic. By comparing my learning progression with the LP in Breslyn, McGinnis, McDonald, and Hestness (2016), I was able to confirm portions of the two learning progressions and explore different possible pathways for learners to achieve progress towards upper anchors of the LPs through targeted instruction. Implications for future LP research, curriculum, instruction, assessment, and policy related to learning progressions are presented.
2013-01-01
Background: Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence. Methods: We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures:
• reviewing existing systematic review methods and our own prior experience of applying these
• clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing
• holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing
• attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues including illness progression and dying
Results: We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing, and new techniques designed for working with conversation analytic evidence. Conclusions: The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects.
We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181
Scanning probe microscopy of biomedical interfaces
NASA Astrophysics Data System (ADS)
Vansteenkiste, S. O.; Davies, M. C.; Roberts, C. J.; Tendler, S. J. B.; Williams, P. M.
1998-02-01
The development of scanning probe microscopes over the past decade has provided a number of exciting new surface analytical techniques, enabling significant progress in the characterisation of biomedical interfaces. In this review, several examples are presented to illustrate that SPM is a powerful and promising tool for surface investigations of biomolecules, cell membranes, polymers, and even living cells. The ability of the SPM instrument to monitor adhesion phenomena and provide quantitative information about intermolecular interactions is also described. Moreover, the huge potential of scanning probe microscopes to study dynamic processes at interfaces under nearly physiological conditions is highlighted. Novel applications in the fields of biochemistry, microbiology, biomaterial engineering, drug delivery, and even medicine are discussed.
Bhalla, Nikhil; Jolly, Pawan; Formisano, Nello; Estrela, Pedro
2016-06-30
Biosensors are nowadays ubiquitous in biomedical diagnosis as well as a wide range of other areas such as point-of-care monitoring of treatment and disease progression, environmental monitoring, food control, drug discovery, forensics and biomedical research. A wide range of techniques can be used for the development of biosensors. Their coupling with high-affinity biomolecules allows the sensitive and selective detection of a range of analytes. We give a general introduction to biosensors and biosensing technologies, including a brief historical overview, introducing key developments in the field and illustrating the breadth of biomolecular sensing strategies and the expansion of nanotechnological approaches that are now available. © 2016 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.
Cost and Schedule Analytical Techniques Development
NASA Technical Reports Server (NTRS)
1996-01-01
This Final Report summarizes the activities performed by Science Applications International Corporation (SAIC) for the base contract year from December 1, 1994 through November 30, 1995. The Final Report is in compliance with Paragraph 5 of Section F of the contract. This CSATD contract provides technical services and products to the NASA Marshall Space Flight Center's (MSFC) Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02). Detailed Monthly Progress Reports were submitted to MSFC in accordance with the contract's Statement of Work Section IV "Reporting and Documentation". These reports spelled out each month's specific work accomplishments, deliverables submitted, major meetings held, and other pertinent information. This Final Report will summarize these activities at a higher level.
Equity in Irish health care financing: measurement issues.
Smith, Samantha
2010-04-01
This paper employs widely used analytic techniques for measuring equity in health care financing to update Irish results from previous analysis based on data from the late 1980s. Kakwani indices are calculated using household survey data from 1987/88 to 2004/05. Results indicate a marginally progressive financing system overall. However, interpretation of the results for the private sources of health financing is complicated. This problem is not unique to Ireland but it is argued that it may be relatively more important in the context of a complex health financing system, illustrated in this paper by the Irish system. Alternative options for improving the analysis of equity in health care financing are discussed.
Eco-analytical Methodology in Environmental Problems Monitoring
NASA Astrophysics Data System (ADS)
Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.
2017-01-01
Among the problems common to all mankind, whose solutions influence the prospects of civilization, the problem of monitoring the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimal balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists develops quite spontaneously, as a complement to productive activities. Therefore, the challenge posed by environmental problems for science is the formation of eco-analytical reasoning and the monitoring of global problems common to all humanity. It is thus expected to find the optimal trajectory of industrial development that prevents irreversible problems in the biosphere which could halt the progress of civilization.
Charmaz, Kathy
2015-12-01
This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.
Learning Progressions and Teaching Sequences: A Review and Analysis
ERIC Educational Resources Information Center
Duschl, Richard; Maeng, Seungho; Sezen, Asli
2011-01-01
Our paper is an analytical review of the design, development and reporting of learning progressions and teaching sequences. Research questions are: (1) what criteria are being used to propose a "hypothetical learning progression/trajectory" and (2) what measurements/evidence are being used to empirically define and refine a "hypothetical learning…
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2013-12-20
Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques integrate several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.
One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.
Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz
2009-07-15
The existing solid-phase microextraction (SPME) kinetic calibration technique, which uses the desorption of preloaded standards to calibrate the extraction of analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which has limited the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic environmental variability, such as in temperature, turbulence, and analyte concentration, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of passive sampling in the flow-through system illustrated that the effect of the environmental variables was successfully compensated for by the kinetic calibration technique, and all extracted analytes could be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and will also be useful for other microextraction techniques.
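The kinetic calibration idea rests on the symmetry between absorption and desorption under first-order kinetics: the fraction of calibrant already desorbed mirrors the fraction of the equilibrium amount already extracted, so n/n_e + Q/q_0 = 1. A minimal numeric sketch of that relation follows; the variable names and numbers are illustrative assumptions, not values from the paper.

```python
import math

def equilibrium_amount(n_extracted, q0_preloaded, q_remaining):
    """Estimate the equilibrium extraction amount n_e from the desorption of a
    preloaded calibrant, using the first-order symmetry relation
    n / n_e = 1 - Q / q0 (absorption and desorption share the rate constant)."""
    fraction_desorbed = 1.0 - q_remaining / q0_preloaded
    return n_extracted / fraction_desorbed

# Simulate one sampling event: rate constant a (1/min), exposure time t (min).
a, t = 0.05, 10.0
q0 = 100.0                                # ng of calibrant preloaded on the fiber
q = q0 * math.exp(-a * t)                 # calibrant remaining after sampling
n_e_true = 250.0                          # ng extracted at equilibrium (ground truth)
n = n_e_true * (1.0 - math.exp(-a * t))   # amount actually extracted pre-equilibrium

print(equilibrium_amount(n, q0, q))       # recovers the equilibrium amount
```

Because the same relation holds for every analyte once its desorption rate is known, a single calibrant can in principle serve all extracted analytes, which is the core of the one-calibrant approach.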
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices and greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of the target analytes quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of MEPS technique, including the role of sample preparation in bioanalysis, the MEPS description namely MEPS formats (on- and off-line), sorbents, experimental and protocols, factors that affect the MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize MEPS recent applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
Experimental and analytical determination of stability parameters for a balloon tethered in a wind
NASA Technical Reports Server (NTRS)
Redd, L. T.; Bennett, R. M.; Bland, S. R.
1973-01-01
Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.
Georgakopoulos, Costas; Saugy, Martial; Giraud, Sylvain; Robinson, Neil; Alsayrafi, Mohammed
2012-07-01
The Summer Olympic Games constitute the biggest concentration of human sports and activities in a particular place and time since 776 BCE, when the written history of the Olympic Games in Olympia began. Summer and Winter Olympic anti-doping laboratories, accredited by the International Olympic Committee in the past and by the World Anti-Doping Agency in the present, attract worldwide interest as they apply all new analytical advancements in the fight against doping in sports, in the hope that this major human event will not be tarnished by association with this negative phenomenon. This article summarizes the new analytical advances, technologies and knowledge used by the Olympic laboratories, the vast majority of which are eventually incorporated into routine anti-doping analysis.
Gore, Sally A.; Nordberg, Judith M.; Palmer, Lisa A.
2009-01-01
Objective: This study analyzed trends in research activity as represented in the published research in the leading peer-reviewed professional journal for health sciences librarianship. Methodology: Research articles were identified from the Bulletin of the Medical Library Association and Journal of the Medical Library Association (1991–2007). Using content analysis and bibliometric techniques, data were collected for each article on the (1) subject, (2) research method, (3) analytical technique used, (4) number of authors, (5) number of citations, (6) first author affiliation, and (7) funding source. The results were compared to a previous study, covering the period 1966 to 1990, to identify changes over time. Results: Of the 930 articles examined, 474 (51%) were identified as research articles. Survey (n = 174, 37.1%) was the most common methodology employed, quantitative descriptive statistics (n = 298, 63.5%) the most used analytical technique, and applied topics (n = 332, 70%) the most common type of subject studied. The majority of first authors were associated with an academic health sciences library (n = 264, 55.7%). Only 27.4% (n = 130) of studies identified a funding source. Conclusion: This study's findings demonstrate that progress is being made in health sciences librarianship research. There is, however, room for improvement in terms of research methodologies used, proportion of applied versus theoretical research, and elimination of barriers to conducting research for practicing librarians. PMID:19626146
NASA Astrophysics Data System (ADS)
Blackie, J. R.; Robinson, M.
2007-01-01
Dr J.S.G. McCulloch was deeply involved in the establishment of research catchments in East Africa and subsequently in the UK to investigate the hydrological consequences of changes in land use. Comparison of these studies provides an insight into how influential his inputs and direction have been in the progressive development of the philosophy, the instrumentation and the analytical techniques now employed in catchment research. There were great contrasts in the environments: tropical highland (high radiation, intense rainfall) vs. temperate maritime (low radiation and frontal storms), contrasting soils and vegetation types, as well as the differing social and economic pressures in developing and developed nations. Nevertheless, the underlying scientific philosophy was common to both, although techniques had to be modified according to local conditions. As specialised instrumentation and analytical techniques were developed for the UK catchments many were also integrated into the East African studies. Many lessons were learned in the course of these studies and from the experiences of other studies around the world. Overall, a rigorous scientific approach was developed with widespread applicability. Beyond the basics of catchment selection and the quantification of the main components of the catchment water balance, this involved initiating parallel process studies to provide information on specific aspects of catchment behaviour. This information could then form the basis for models capable of extrapolation from the observed time series to other periods/hydrological events and, ultimately, the capability of predicting the consequences of changes in catchment land management to other areas in a range of climates.
Analytical Chemistry: A Literary Approach.
ERIC Educational Resources Information Center
Lucy, Charles A.
2000-01-01
Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)
Investigating biomolecular recognition at the cell surface using atomic force microscopy.
Wang, Congzhou; Yadavalli, Vamsi K
2014-05-01
Probing the interaction forces that drive biomolecular recognition on cell surfaces is essential for understanding diverse biological processes. Force spectroscopy has been a widely used dynamic analytical technique, allowing measurement of such interactions at the molecular and cellular level. The capabilities of working under near physiological environments, combined with excellent force and lateral resolution make atomic force microscopy (AFM)-based force spectroscopy a powerful approach to measure biomolecular interaction forces not only on non-biological substrates, but also on soft, dynamic cell surfaces. Over the last few years, AFM-based force spectroscopy has provided biophysical insight into how biomolecules on cell surfaces interact with each other and induce relevant biological processes. In this review, we focus on describing the technique of force spectroscopy using the AFM, specifically in the context of probing cell surfaces. We summarize recent progress in understanding the recognition and interactions between macromolecules that may be found at cell surfaces from a force spectroscopy perspective. We further discuss the challenges and future prospects of the application of this versatile technique. Copyright © 2014 Elsevier Ltd. All rights reserved.
Improved determination of vector lithospheric magnetic anomalies from MAGSAT data
NASA Technical Reports Server (NTRS)
Ravat, Dhananjay
1993-01-01
Scientific contributions made in developing new methods to isolate and map vector magnetic anomalies from measurements made by Magsat are described. In addition to the objective of the proposal, the isolation and mapping of equatorial vector lithospheric Magsat anomalies, the isolation of polar ionospheric fields during the period was also studied. Significant progress was also made in the isolation of polar delta(Z) component and scalar anomalies as well as the integration and synthesis of various techniques for removing equatorial and polar ionospheric effects. The significant contributions of this research are: (1) development of empirical/analytical techniques for modeling ionospheric fields in Magsat data and removing them from uncorrected anomalies to obtain better estimates of lithospheric anomalies (this task was accomplished for equatorial delta(X), delta(Z), and delta(B) component and polar delta(Z) and delta(B) component measurements); (2) integration of important processing techniques developed during the last decade with the newly developed ionospheric field modeling into an optimum processing scheme; and (3) implementation of the above processing scheme to map the most robust magnetic anomalies of the lithosphere (components as well as scalar).
The expanding universe of mass analyzer configurations for biological analysis.
Calvete, Juan J
2014-01-01
Mass spectrometry (MS) is an analytical technique that measures the mass-to-charge ratio of electrically charged gas-phase particles. All mass spectrometers combine ion formation, mass analysis, and ion detection. Although mass analyzers can be regarded as sophisticated devices that manipulate ions in space and time, the rich diversity of possible ways to combine ion separation, focusing, and detection in dynamic mass spectrometers accounts for the large number of instrument designs. A historical perspective on the progress in mass spectrometry that, from 1965 until today, has contributed to positioning this technique as an indispensable tool for biological research has recently been given by a privileged witness of this golden age of MS (Gelpí J. Mass Spectrom 43:419-435, 2008; Gelpí J. Mass Spectrom 44:1137-1161, 2008). The aim of this chapter is to highlight the view that the operational principles of mass spectrometry can be understood through simple mathematical language, and that an understanding of the basic concepts of mass spectrometry is necessary to get the most out of this versatile technique.
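The statement that a mass spectrometer measures mass-to-charge ratio can be made concrete with a short sketch: for a positive ion formed by protonation, the observed m/z of a molecule of neutral monoisotopic mass M carrying z protons is (M + z·m_p)/z, with m_p ≈ 1.00728 Da. This is textbook electrospray arithmetic, not an example taken from the chapter itself.

```python
PROTON_MASS = 1.00728  # Da, mass of the proton charge carrier in positive-mode ESI

def mz(neutral_mass, charge):
    """Observed m/z for an [M + zH]^z+ ion of a given neutral monoisotopic mass."""
    return (neutral_mass + charge * PROTON_MASS) / charge

# A hypothetical peptide of monoisotopic mass 1000 Da, observed at several charge states:
for z in (1, 2, 3):
    print(z, round(mz(1000.0, z), 5))
```

The same molecule therefore appears at several m/z values, one per charge state, which is why charge-state deconvolution is a routine step in interpreting biological spectra.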
Fornari, Chiara; Balbo, Gianfranco; Halawani, Sami M; Ba-Rukab, Omar; Ahmad, Ab Rahman; Calogero, Raffaele A; Cordero, Francesca; Beccuti, Marco
2015-01-01
Nowadays multidisciplinary approaches combining mathematical models with experimental assays are becoming relevant for the study of biological systems. Indeed, in cancer research multidisciplinary approaches are successfully used to understand the crucial aspects implicated in tumor growth. In particular, Cancer Stem Cell (CSC) biology represents an area particularly suited to multidisciplinary approaches, and modeling has significantly contributed to pinpointing the crucial aspects implicated in this theory. More generally, to acquire new insights on a biological system it is necessary to have an accurate description of the phenomenon, so that making accurate predictions of its future behavior becomes more likely. In this context, the identification of the parameters influencing model dynamics can increase model accuracy and provide hints for designing wet experiments. Different techniques, ranging from statistical methods to analytical studies, have been developed. Their application depends on case-specific aspects, such as the availability and quality of experimental data and the dimension of the parameter space. The study of a new model of CSC-based tumor progression motivated the design of a new work-flow that helps characterize possible system dynamics and identify the parameters influencing such behaviors. In detail, we extended our recent model of CSC dynamics, creating a new system capable of describing tumor growth during the different stages of cancer progression. Indeed, tumor cells appear to progress through lineage stages like those of normal tissues, with their division auto-regulated by internal feedback mechanisms. These new features introduced some non-linearities into the model, making it more difficult to study by analytical techniques alone. Our new work-flow, based on statistical methods, was used to identify the parameters which influence tumor growth.
The effectiveness of the presented work-flow was first verified on two well-known models and then applied to investigate our extended CSC model. We propose a new work-flow to study complex systems in a practical and informative way, allowing easy identification, interpretation, and visualization of the key model parameters. Our methodology is useful for investigating possible model behaviors and establishing the factors driving model dynamics. Analyzing our new CSC model guided by the proposed work-flow, we found that the deregulation of CSC asymmetric proliferation contributes to cancer initiation, in accordance with several lines of experimental evidence. Specifically, model results indicated that the probability of CSC symmetric proliferation is responsible for a switching-like behavior which discriminates between tumorigenesis and unsustainable tumor growth.
Gałuszka, Agnieszka; Migaszewski, Zdzisław M; Namieśnik, Jacek
2015-07-01
The recent rapid progress in the technology of field portable instruments has increased their applications in environmental sample analysis. These instruments offer the possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet-visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis allows on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. Other disadvantages include: higher detection limits and lower sensitivity than those obtained under laboratory conditions, a strong influence of environmental factors on instrument performance and a high risk of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. Copyright © 2015 Elsevier Inc. All rights reserved.
Scaling Student Success with Predictive Analytics: Reflections after Four Years in the Data Trenches
ERIC Educational Resources Information Center
Wagner, Ellen; Longanecker, David
2016-01-01
The metrics used in the US to track students do not include adults and part-time students. This has led to the development of a massive data initiative--the Predictive Analytics Reporting (PAR) framework--that uses predictive analytics to trace the progress of all types of students in the system. This development has allowed actionable,…
Multimedia Analysis plus Visual Analytics = Multimedia Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chinchor, Nancy; Thomas, James J.; Wong, Pak C.
2010-10-01
Multimedia analysis has focused on images, video, and to some extent audio and has made progress in single channels excluding text. Visual analytics has focused on the user interaction with data during the analytic process plus the fundamental mathematics and has continued to treat text as did its precursor, information visualization. The general problem we address in this tutorial is the combining of multimedia analysis and visual analytics to deal with multimedia information gathered from different sources, with different goals or objectives, and containing all media types and combinations in common usage.
NASA Astrophysics Data System (ADS)
Zhao, Yang; Zhang, Lei; Zhao, Shu-Xia; Li, Yu-Fang; Gong, Yao; Dong, Lei; Ma, Wei-Guang; Yin, Wang-Bao; Yao, Shun-Chun; Lu, Ji-Dong; Xiao, Lian-Tuan; Jia, Suo-Tang
2016-12-01
Laser-induced breakdown spectroscopy (LIBS) is an emerging analytical spectroscopy technique. This review presents the main recent developments in China regarding the implementation of LIBS for coal analysis. The paper focuses on the progress of the past few years in the fundamentals, data pretreatment, calibration models, and experimental issues of LIBS and its application to coal analysis. Many important domestic studies focusing on coal quality analysis have been conducted. For example, a proposed novel hybrid quantification model provides more reproducible quantitative analytical results; the model obtained average absolute errors (AREs) of 0.42%, 0.05%, 0.07%, and 0.17% for carbon, hydrogen, volatiles, and ash, respectively, and 0.07 MJ/kg for heat value. Atomic/ionic emission lines and molecular bands, such as CN and C2, have been employed to generate more accurate results, achieving an ARE of 0.26% and a limit of detection (LOD) of 0.16% for the prediction of unburned carbon in fly ashes. Both laboratory and on-line LIBS apparatuses have been developed for field application in coal-fired power plants. We consider that both the accuracy and the repeatability of the elemental and proximate analysis of coal have increased significantly, and further efforts will be devoted to realizing large-scale commercialization of coal quality analyzers in China.
Hubert, Ph; Nguyen-Huu, J-J; Boulanger, B; Chapuzet, E; Chiap, P; Cohen, N; Compagnon, P-A; Dewé, W; Feinberg, M; Lallier, M; Laurentie, M; Mercier, N; Muzard, G; Nivet, C; Valat, L
2004-11-15
This paper is the first part of a summary report of a new commission of the Société Française des Sciences et Techniques Pharmaceutiques (SFSTP). The main objective of this commission was the harmonization of approaches for the validation of quantitative analytical procedures. Indeed, the principle of validating these procedures is today widespread in all domains of activity where measurements are made. Nevertheless, the simple question of whether an analytical procedure is acceptable for a given application remains incompletely resolved in several cases, despite the various regulations relating to good practices (GLP, GMP, ...) and other documents of a normative character (ISO, ICH, FDA, ...). There are many official documents describing the validation criteria to be tested, but they do not propose any experimental protocol and most often limit themselves to general concepts. For those reasons, two previous SFSTP commissions elaborated validation guides to concretely help the industrial scientists in charge of drug development apply those regulatory recommendations. While these two first guides contributed widely to the use and progress of analytical validation, they nevertheless present weaknesses regarding the conclusions of the statistical tests performed and the decisions to be made with respect to the acceptance limits defined by the use of an analytical procedure. The present paper proposes to revisit the very bases of analytical validation in order to develop a harmonized approach, notably by distinguishing diagnosis rules from decision rules. The latter rule is based on the use of the accuracy profile and the notion of total error, and simplifies the validation of an analytical procedure while controlling the risk associated with its use.
Thanks to this novel validation approach, it is possible to unambiguously demonstrate the fitness for purpose of a new method as stated in all regulatory documents.
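The accuracy-profile decision rule can be sketched numerically: at each concentration level, the estimated relative bias and precision are combined into an interval that must stay inside the acceptance limits ±λ for the procedure to be declared fit for purpose at that level. The sketch below uses a simple fixed coverage factor in place of the exact SFSTP β-expectation tolerance interval statistics; the function name and replicate data are illustrative.

```python
from statistics import mean, stdev

def accuracy_profile_level(measured, true_value, acceptance_pct=15.0, coverage=2.0):
    """Simplified accuracy-profile check at one concentration level:
    the interval (relative bias +/- coverage * relative SD) must lie
    entirely within +/-acceptance_pct for the level to be accepted."""
    rel_errors = [100.0 * (m - true_value) / true_value for m in measured]
    bias = mean(rel_errors)              # systematic (trueness) component
    spread = stdev(rel_errors)           # intermediate-precision component
    low, high = bias - coverage * spread, bias + coverage * spread
    accepted = (low >= -acceptance_pct) and (high <= acceptance_pct)
    return bias, (low, high), accepted

# Six illustrative replicate recoveries at a nominal level of 100:
replicates = [98.2, 101.5, 99.8, 100.9, 97.6, 100.3]
bias, interval, acceptable = accuracy_profile_level(replicates, 100.0)
print(round(bias, 2), acceptable)
```

Repeating this check across several concentration levels and plotting the intervals against the acceptance limits produces the accuracy profile, which combines bias and precision into the single total-error view the commission advocates.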
Assessment of Critical-Analytic Thinking
ERIC Educational Resources Information Center
Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.
2014-01-01
National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1987-01-01
Emulation techniques can be a solution to a difficulty that arises in analyzing the reliability of guidance and control computer systems for future commercial aircraft: the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding that system evaluation by use testing is infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precise outputs are quite sensitive to errors of approximation in their input data. The use of emulation techniques to pseudo-test systems, in order to evaluate bounds on the parameter values needed by the analytical techniques, is then discussed. Finally, several examples of the application of emulation techniques are described.
Marine resources. [coastal processes, ice, oceanography, and living marine resources
NASA Technical Reports Server (NTRS)
Tilton, E. L., III
1974-01-01
Techniques have been developed for defining coastal circulation patterns using sediment as a natural tracer, allowing the formulation of new circulation concepts in some geographical areas and, in general, a better capability for defining the seasonal characteristics of coastal circulation. An analytical technique for measurement of absolute water depth based upon the ratios of two MSS channels has been developed. Suspended sediment has found wide use as a tracer, but a few investigators have reported limited success in measuring the type and amount of sediment quantitatively from ERTS-1 digital data. Significant progress has been made in developing techniques for using ERTS-1 data to locate, identify, and monitor sea and lake ice. Ice features greater than 70 meters in width can be detected, and both arctic and antarctic icebergs have been identified. In the application area of living marine resources, the use of ERTS-1 image-density patterns as a potential indicator of fish school location has been demonstrated for one coastal commercial resource, menhaden. ERTS-1 data have been used to locate ocean current boundaries using ERTS-1 image-density enhancement, and some techniques are under development for measurement of suspended particle concentration and chlorophyll concentration. The interrelationship of water color and surface characteristics (sea state) are also being studied to improve spectral and spatial interpretive techniques.
Analytical Applications of Monte Carlo Techniques.
ERIC Educational Resources Information Center
Guell, Oscar A.; Holcombe, James A.
1990-01-01
Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
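As a concrete illustration of the simplest of the procedures discussed, the sketch below (illustrative only; the function name is ours, not the article's) estimates a definite integral by the mean-value Monte Carlo method:

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=42):
    """Estimate the integral of f over [a, b] by averaging f at
    uniformly random sample points (mean-value Monte Carlo)."""
    rng = random.Random(seed)
    total = sum(f(rng.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

# Example: integrate x**2 over [0, 1]; the exact value is 1/3.
estimate = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

The statistical error shrinks as 1/sqrt(n), which is why such simulations historically motivated supercomputer use for high-precision analytical applications.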
Thermoelectrically cooled water trap
Micheels, Ronald H [Concord, MA
2006-02-21
A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples prior to analysis of these samples for chemical composition by a variety of analytical techniques in which water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial-pressure range. The thermoelectric trap design is compact and amenable to use in portable gas monitoring instrumentation.
Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.
Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs
2018-01-01
While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
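As a toy illustration of the core primitive behind many secure multi-party computation protocols (not the specific protocols used in the pilots), the sketch below shows additive secret sharing over a prime field, where parties can sum shared values without any single party seeing the underlying data:

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is mod P

def share(secret, n_parties=3, rng=random.Random(0)):
    """Split a secret into n additive shares that sum to it mod P.
    Any n-1 shares together reveal nothing about the secret."""
    shares = [rng.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

# Two hospitals each secret-share a patient count; the parties add
# their shares locally, and only the *sum* is ever reconstructed.
a_shares = share(120)
b_shares = share(85)
sum_shares = [(x + y) % P for x, y in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 205
```

Real deployments add further machinery (multiplication triples, authentication, dropout handling), but the privacy property of the analytics rests on this same idea.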
Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W
2016-01-01
A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data.
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson's disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson's disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer's, Huntington's, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications.
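The cohort rebalancing mentioned in step (i) can be illustrated by the simplest possible scheme, random oversampling of the minority class. This is a hypothetical sketch, not the PPMI protocol's actual procedure (which may use more sophisticated rebalancing):

```python
import random
from collections import Counter

def oversample_minority(records, label_key="diagnosis", seed=0):
    """Randomly resample minority-class records (with replacement)
    until every class matches the size of the largest class."""
    rng = random.Random(seed)
    by_class = {}
    for r in records:
        by_class.setdefault(r[label_key], []).append(r)
    target = max(len(v) for v in by_class.values())
    balanced = []
    for group in by_class.values():
        balanced.extend(group)
        balanced.extend(rng.choice(group) for _ in range(target - len(group)))
    return balanced

# A toy imbalanced cohort: 40 PD subjects vs. 10 controls.
cohort = [{"diagnosis": "PD"}] * 40 + [{"diagnosis": "control"}] * 10
counts = Counter(r["diagnosis"] for r in oversample_minority(cohort))
```

After rebalancing, a classifier no longer gains accuracy simply by predicting the majority class, which is one reason the study observed better group discrimination on rebalanced cohorts.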
Accuracy of selected techniques for estimating ice-affected streamflow
Walker, John F.
1991-01-01
This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories, subjective and analytical, depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, a Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical ones. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
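The Kruskal-Wallis test used to compare the hydrographers ranks all observations jointly and then compares mean ranks across groups. A minimal sketch (hypothetical error values, no tie correction) is:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic: rank all observations together,
    then compare rank sums across groups (no tie correction here)."""
    pooled = sorted((x, gi) for gi, g in enumerate(groups) for x in g)
    n = len(pooled)
    rank_sums = [0.0] * len(groups)
    for rank, (_, gi) in enumerate(pooled, start=1):
        rank_sums[gi] += rank
    return 12.0 / (n * (n + 1)) * sum(
        rs * rs / len(g) for rs, g in zip(rank_sums, groups)
    ) - 3.0 * (n + 1)

# Three hydrographers' daily estimation errors (hypothetical numbers):
h = kruskal_wallis_h([1.2, 0.8, 1.5], [2.1, 2.4, 1.9], [0.5, 0.9, 0.7])
# Compare h against a chi-squared critical value with k - 1 = 2 df.
```

A large H indicates that at least one hydrographer's error distribution differs systematically, which is how inconsistency among the subjective records would show up.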
Recent advances in merging photonic crystals and plasmonics for bioanalytical applications.
Liu, Bing; Monshat, Hosein; Gu, Zhongze; Lu, Meng; Zhao, Xiangwei
2018-05-29
Photonic crystals (PhCs) and plasmonic nanostructures offer the unprecedented capability to control the interaction of light and biomolecules at the nanoscale. Based on PhC and plasmonic phenomena, a variety of analytical techniques have been demonstrated and successfully implemented in many fields, such as the biological sciences, clinical diagnosis, drug discovery, and environmental monitoring. During the past decades, PhC and plasmonic technologies have progressed in parallel, each with its own pros and cons. Merging photonic crystals with plasmonics can significantly improve biosensor performance and enlarge the linear detection range for analytical targets. Here, we review the state-of-the-art biosensors that combine PhC and plasmonic nanomaterials for quantitative analysis. The optical mechanisms of PhCs, plasmonic crystals, and metal nanoparticles (NPs) are presented, along with their integration and potential applications. By explaining the optical coupling of photonic crystals and plasmonics, the review shows how PhC-plasmonic hybrid biosensors can achieve advantages including high sensitivity, low cost, and short assay times. The review also discusses the challenges and future opportunities in this fascinating field.
NASA Astrophysics Data System (ADS)
Demiray, Hilmi; El-Zahar, Essam R.
2018-04-01
We consider the nonlinear propagation of electron-acoustic waves in a plasma composed of a cold electron fluid, hot electrons obeying a trapped/vortex-like distribution, and stationary ions. The basic nonlinear equations of the plasma described above are re-examined in cylindrical (spherical) coordinates by employing the reductive perturbation technique. The modified cylindrical (spherical) KdV equation with fractional-power nonlinearity is obtained as the evolution equation. Due to the nature of the nonlinearity, this evolution equation cannot be reduced to the conventional KdV equation. A new family of closed-form analytical approximate solutions to the evolution equation, and a comparison with the numerical solution, are presented, and the results are depicted in some 2D and 3D figures. The results reveal that both solutions are in good agreement and that the method can be used to obtain new progressive wave solutions for such evolution equations. Moreover, the resulting closed-form analytical solution allows us to carry out a parametric study to investigate the effect of the physical parameters on the solution behavior of the modified cylindrical (spherical) KdV equation.
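Although the abstract does not give the equation explicitly, evolution equations obtained by reductive perturbation for trapped/vortex-like (Schamel-type) electron distributions in nonplanar geometry commonly take the following schematic form, shown here only as an illustration (the coefficients a and b depend on the plasma parameters; m = 1 for cylindrical and m = 2 for spherical geometry):

```latex
\frac{\partial \phi}{\partial \tau}
  + \frac{m}{2\tau}\,\phi
  + a\,\phi^{1/2}\,\frac{\partial \phi}{\partial \xi}
  + b\,\frac{\partial^{3} \phi}{\partial \xi^{3}} = 0
```

The fractional power \(\phi^{1/2}\) is what prevents reduction to the conventional KdV equation, while the \(\frac{m}{2\tau}\,\phi\) term encodes the cylindrical or spherical geometry.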
Liger-Belair, Gérard; Polidori, Guillaume; Zéninari, Virginie
2012-06-30
In champagne and sparkling wine tasting, the concentration of dissolved CO2 is indeed an analytical parameter of high importance, since it directly impacts the four following sensory properties: (i) the frequency of bubble formation in the glass, (ii) the growth rate of rising bubbles, (iii) the mouth feel, and (iv) the nose of champagne, i.e., its so-called bouquet. In this state-of-the-art review, the evolving nature of the dissolved and gaseous CO2 found in champagne wines is evidenced, from the bottle to the glass, through various analytical techniques. Results obtained concerning the various steps where the CO2 molecule plays a role (from its ingestion in the liquid phase during the fermentation process to its progressive release in the headspace above the tasting glass) are gathered and synthesized to propose a self-consistent and global overview of how gaseous and dissolved CO2 impact champagne and sparkling wine science. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ovchinnikov, M. Yu.; Ivanov, D. S.; Ivlev, N. A.; Karpenko, S. O.; Roldugin, D. S.; Tkachev, S. S.
2014-01-01
Design, analytical investigation, laboratory and in-flight testing of the attitude determination and control system (ADCS) of a microsatellite are considered. The system consists of three pairs of reaction wheels, three magnetorquers, a set of Sun sensors, a three-axis magnetometer and a control unit. The ADCS is designed for a small 10-50 kg LEO satellite. System development is accomplished in several steps: a preliminary study of satellite dynamics using asymptotic and numerical techniques, hardware and software design, and laboratory testing of each actuator and sensor and of the whole ADCS. Laboratory verification is carried out on a specially designed test-bench. In-flight ADCS exploitation results onboard the Russian microsatellite "Chibis-M" are presented. The satellite was developed, designed and manufactured by the Institute of Space Research of RAS. "Chibis-M" was launched by the "Progress-13M" cargo vehicle on January 25, 2012 after undocking from the International Space Station (ISS). This paper assesses both the satellite and the ADCS mock-up dynamics. Analytical, numerical and laboratory study results are in good correspondence with in-flight data.
In situ analytical techniques for battery interface analysis.
Tripathi, Alok M; Su, Wei-Nien; Hwang, Bing Joe
2018-02-05
Lithium-ion batteries, simply known as lithium batteries, are distinct among high-energy-density charge-storage devices. The power delivery of batteries depends upon the electrochemical performance and stability of the electrodes, the electrolytes and their interface. Interfacial phenomena at the electrode/electrolyte include lithium dendrite formation, electrolyte degradation and gas evolution, and the formation of a semi-solid protective layer at the electrode-electrolyte interface, also known as the solid-electrolyte interface (SEI). The SEI protects electrodes from further exfoliation or corrosion and suppresses lithium dendrite formation, both crucial for enhancing cell performance. This review covers the compositional, structural and morphological aspects of the SEI, both artificially and naturally formed, and of metallic dendrites, using in situ/in operando cells and various in situ analytical tools. Critical challenges and the historical legacy in the development of in situ/in operando electrochemical cells, along with reports on state-of-the-art progress, are particularly highlighted. The present compilation pinpoints emerging research opportunities in advancing this field and concludes with future directions and strategies for in situ/in operando analysis.
Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.
Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà
2017-10-01
Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.
Combined sensing platform for advanced diagnostics in exhaled mouse breath
NASA Astrophysics Data System (ADS)
Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris
2013-03-01
Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for therapeutic progression monitoring, as quantitative compositional analysis of breath can be related to biomarker panels associated with specific physiological conditions invoked by, e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on, e.g., the metabolic state, and since volatile organic constituents (VOCs) in exhaled breath in particular may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb, or even lower, concentration levels. While individual analytical techniques such as mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow simultaneous analysis of total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee ratio (TTR, via IR), selected VOCs (via IR) and O2 (via luminescence) in exhaled breath, thereby establishing a single diagnostic platform in which both sensors interact simultaneously with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with a novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, together with miniaturized fiberoptic luminescence sensors, to establish a multi-constituent breath analysis tool that is ideally compatible with mouse intensive care stations (MICU).
Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas chromatography coupled to mass spectrometric detection. Here, we aim to continuously analyze the TTR via iHWGs and LS flow-through sensors requiring only minute (< 1 mL) sample volumes. Furthermore, this study explores non-linearities observed in the calibration functions of 12CO2 and 13CO2, potentially resulting from effects related to optical collision diameters, e.g., in the presence of molecular oxygen. It is anticipated that simultaneous continuous analysis of oxygen via LS will facilitate correction of these effects once included in appropriate multivariate calibration models, thus providing more reliable and robust calibration schemes for continuously monitoring relevant breath constituents.
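A multivariate calibration model of the kind alluded to can be sketched, in its simplest linear form, as an ordinary least-squares fit that includes the oxygen channel as an extra predictor. The data and variable names below are entirely hypothetical, and the study's real models would likely be nonlinear and span more channels:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting (tiny systems only)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_calibration(X, y):
    """Least-squares fit of y = c0 + c1*x1 + c2*x2 + ... via the
    normal equations (X'X)c = X'y, with an intercept column added."""
    rows = [[1.0] + list(x) for x in X]
    p = len(rows[0])
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    Aty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    return solve(AtA, Aty)

# Hypothetical data: (12CO2 absorbance, O2 reading) -> CO2 concentration.
X = [(0.10, 0.21), (0.20, 0.20), (0.30, 0.19), (0.40, 0.21)]
y = [1.0, 2.0, 3.0, 4.0]
c0, c1, c2 = fit_calibration(X, y)  # prediction: c0 + c1*abs + c2*O2
```

Including the simultaneously measured oxygen as a predictor is what would let such a model compensate for the collision-related non-linearities described above.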
Analytical methods in multivariate highway safety exposure data estimation
DOT National Transportation Integrated Search
1984-01-01
Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting and the expectation maximization…
Usefulness of Analytical Research: Rethinking Analytical R&D&T Strategies.
Valcárcel, Miguel
2017-11-07
This Perspective is intended to help foster true innovation in Research & Development & Transfer (R&D&T) in Analytical Chemistry in the form of advances that are primarily useful for analytical purposes rather than solely for publishing. Devising effective means to strengthen the crucial contribution of Analytical Chemistry to progress in Chemistry, Science & Technology, and Society requires carefully examining the present status of our discipline and also identifying internal and external driving forces with a potential adverse impact on its development. The diagnostic process should be followed by administration of an effective therapy and supported by adoption of a theragnostic strategy if Analytical Chemistry is to enjoy a better future.
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
RAWP Progress Report August 19, 2011 - Combined
Combines the cover letter and Progress Report #2, as required in the Residential Soil Remedial Action Work Plan (RAWP), Phase 1, with the Analytical Laboratory Report for the Walter Coke Inc. site in Birmingham, AL, August 16, 2011, prepared by TestAmerica Laboratories.
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
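The requirement that analytics of different scope (incremental, rolling-window, global) be pluggable behind a common interface can be sketched as follows. The names and structure are illustrative only, not the reference architecture's actual API:

```python
from collections import deque

class RollingMean:
    """Rolling-window analytic: mean over the last `size` items."""
    def __init__(self, size):
        self.window = deque(maxlen=size)
    def update(self, value):
        self.window.append(value)
        return sum(self.window) / len(self.window)

class RunningCount:
    """Global (whole-stream) analytic: total items seen so far."""
    def __init__(self):
        self.n = 0
    def update(self, value):
        self.n += 1
        return self.n

class Pipeline:
    """Pluggable registry: each incoming item fans out to every
    registered analytic; results are keyed by name so that several
    coordinated visualizations can subscribe to the same snapshot."""
    def __init__(self):
        self.analytics = {}
    def register(self, name, analytic):
        self.analytics[name] = analytic
    def ingest(self, value):
        return {name: a.update(value) for name, a in self.analytics.items()}

pipe = Pipeline()
pipe.register("mean_last_3", RollingMean(3))
pipe.register("count", RunningCount())
for v in [2, 4, 6, 8]:
    snapshot = pipe.ingest(v)
# snapshot == {"mean_last_3": 6.0, "count": 4}
```

In a web deployment, each `ingest` snapshot would be pushed to the browser-side visualizations, so that statistical and visual components stay synchronized on the same incoming item.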
Single-Molecule Electronics: Chemical and Analytical Perspectives.
Nichols, Richard J; Higgins, Simon J
2015-01-01
It is now possible to measure the electrical properties of single molecules using a variety of techniques, including scanning probe microscopies and mechanically controlled break junctions. Such measurements can be made across a wide range of environments, including ambient conditions, organic liquids, ionic liquids, aqueous solutions, electrolytes, and ultrahigh vacuum. This has given new insights into charge transport across single-molecule electrical junctions, and these experimental methods have been complemented with increasingly sophisticated theory. This article reviews progress in single-molecule electronics from a chemical perspective and discusses topics such as molecule-surface coupling in electrical junctions, chemical control, supramolecular interactions in junctions, and gating of charge transport. The article concludes with an outlook regarding chemical analysis based on single-molecule conductance.
On the Development of a Deterministic Three-Dimensional Radiation Transport Code
NASA Technical Reports Server (NTRS)
Rockell, Candice; Tweed, John
2011-01-01
Since astronauts on future deep space missions will be exposed to dangerous radiation, there is a need to accurately model the transport of radiation through shielding materials and to estimate the received radiation dose. In response to this need, a three-dimensional deterministic code for space radiation transport is now under development. The new code GRNTRN is based on a Green's function solution of the Boltzmann transport equation that is constructed in the form of a Neumann series. Analytical approximations will be obtained for the first three terms of the Neumann series and the remainder will be estimated by a non-perturbative technique. This work discusses progress made to date and exhibits some computations based on the first two Neumann series terms.
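Abstractly, writing the transport equation in integral form as \(\psi = \psi_0 + K\psi\), with \(\psi_0\) the uncollided source term and \(K\) the integral collision operator, the Neumann series solution referred to above is

```latex
\psi = \sum_{n=0}^{\infty} K^{n}\psi_0
     = \psi_0 + K\psi_0 + K^{2}\psi_0 + \cdots
```

The series converges when the operator norm of \(K\) is less than one; the approach described retains the first three terms analytically and bounds the remaining tail non-perturbatively.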
Cost and Schedule Analytical Techniques Development: Option 1
NASA Technical Reports Server (NTRS)
1996-01-01
This Final Report summarizes the activities performed by Science Applications International Corporation (SAIC) for the base contract year from December 1, 1995 through November 30, 1996. The Final Report is in compliance with Paragraph 5 of Section F of the contract. This CSATD contract provides technical services and products to the NASA Marshall Space Flight Center's (MSFC) Engineering Cost Office (PPO3) and the Program Plans and Requirements Officer (PPO2). Detailed Monthly Progress Reports were submitted to MSFC in accordance with the contract's Statement of Work Section IV "Reporting and Documentation". These reports spelled out each month's specific work accomplishments, deliverables submitted, major meetings held, and other pertinent information. This Final Report will summarize these activities at a higher level.
Chiral topological phases from artificial neural networks
NASA Astrophysics Data System (ADS)
Kaubruegger, Raphael; Pastori, Lorenzo; Budich, Jan Carl
2018-05-01
Motivated by recent progress in applying techniques from the field of artificial neural networks (ANNs) to quantum many-body physics, we investigate to what extent the flexibility of ANNs can be used to efficiently study systems that host chiral topological phases such as fractional quantum Hall (FQH) phases. With benchmark examples, we demonstrate that training ANNs of restricted Boltzmann machine type in the framework of variational Monte Carlo can numerically solve FQH problems to good approximation. Furthermore, we show by explicit construction how n-body correlations can be kept at an exact level with ANN wave functions exhibiting polynomial scaling with power n in system size. Using this construction, we analytically represent the paradigmatic Laughlin wave function as an ANN state.
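The restricted-Boltzmann-machine wave-function ansatz mentioned above, with the hidden units traced out analytically, can be sketched as follows. This is a toy with real-valued parameters for simplicity, not the paper's implementation:

```python
import math

def rbm_amplitude(spins, a, b, W):
    """Unnormalized RBM wave-function amplitude for a spin
    configuration s_i = +/-1:
        psi(s) = exp(sum_i a_i s_i) * prod_j 2*cosh(theta_j),
        theta_j = b_j + sum_i W[j][i] * s_i
    (the hidden units have been summed out analytically)."""
    visible = math.exp(sum(ai * si for ai, si in zip(a, spins)))
    hidden = 1.0
    for bj, Wj in zip(b, W):
        theta = bj + sum(wij * si for wij, si in zip(Wj, spins))
        hidden *= 2.0 * math.cosh(theta)
    return visible * hidden

# Two visible spins, one hidden unit, all couplings zero:
# the amplitude is exp(0) * 2*cosh(0) = 2 for every configuration.
amp = rbm_amplitude([1, -1], a=[0.0, 0.0], b=[0.0], W=[[0.0, 0.0]])
```

In variational Monte Carlo the parameters a, b and W are optimized to minimize the energy, with amplitudes like this one sampled over spin configurations.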
New Trends in Impedimetric Biosensors for the Detection of Foodborne Pathogenic Bacteria
Wang, Yixian; Ye, Zunzhong; Ying, Yibin
2012-01-01
The development of rapid, sensitive, specific methods for foodborne pathogenic bacteria detection is of great importance for ensuring food safety and security. In recent years, impedimetric biosensors, which integrate biological recognition technology and impedance measurement, have gained widespread application in the field of bacteria detection. This paper presents an overview of the progress and application of impedimetric biosensors for the detection of foodborne pathogenic bacteria, focusing on new trends of the past few years, including new specific bio-recognition elements such as bacteriophages and lectins, and the use of nanomaterials and microfluidics techniques. These new materials and techniques have provided unprecedented opportunities for the development of high-performance impedance bacteria biosensors. The significant developments of impedimetric biosensors for bacteria detection in the last five years are reviewed, classified according to whether they employ a specific bio-recognition element. In addition, some microfluidic systems used in the construction of impedimetric biosensors to improve analytical performance are introduced in this review. PMID:22737018
McMullin, David; Mizaikoff, Boris; Krska, Rudolf
2015-01-01
Infrared spectroscopy is a rapid, nondestructive analytical technique that can be applied to the authentication and characterization of food samples in high throughput. In particular, near infrared spectroscopy is commonly utilized in the food quality control industry to monitor the physical attributes of numerous cereal grains for protein, carbohydrate, and lipid content. IR-based methods require little sample preparation, labor, or technical competence if multivariate data mining techniques are implemented; however, they do require extensive calibration. Economically important crops are infected by fungi that can severely reduce crop yields and quality and, in addition, produce mycotoxins. Owing to the health risks associated with mycotoxins in the food chain, regulatory limits have been set by both national and international institutions for specific mycotoxins and mycotoxin classes. This article discusses the progress and potential of IR-based methods as an alternative to existing chemical methods for the determination of fungal contamination in crops, as well as emerging spectroscopic methods.
Point-of-care rare cell cancer diagnostics.
Issadore, David
2015-01-01
The sparse cells that are shed from tumors into peripheral circulation are an increasingly promising resource for noninvasive monitoring of cancer progression and early diagnosis of disease, and serve as a tool for improving our understanding of cancer metastasis. However, the extremely sparse concentration of circulating tumor cells (CTCs) in blood (~1-100 CTCs in 7.5 mL of blood), as well as their heterogeneous biomarker expression, has limited their detection using conventional laboratory techniques. To overcome these challenges, we have developed a microfluidic chip-based micro-Hall detector (μHD), which can directly measure single, immunomagnetically tagged cells in whole blood. The μHD can detect individual cells even in the presence of vast numbers of blood cells and unbound reactants, and does not require any washing or purification steps. Furthermore, this cost-effective, single-cell analytical technique is well suited for miniaturization into a mobile platform for low-cost point-of-care use. In this chapter, we describe the methodology used to design, fabricate, and apply these chips to cancer diagnostics.
An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy
ERIC Educational Resources Information Center
Collis, Peter
2012-01-01
Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…
NASA Technical Reports Server (NTRS)
Bozeman, Robert E.
1987-01-01
An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.
Progressive damage, fracture predictions and post mortem correlations for fiber composites
NASA Technical Reports Server (NTRS)
1985-01-01
Lewis Research Center is involved in the development of computational mechanics methods for predicting the structural behavior and response of composite structures. In conjunction with the analytical methods development, experimental programs including post failure examination are conducted to study various factors affecting composite fracture such as laminate thickness effects, ply configuration, and notch sensitivity. Results indicate that the analytical capabilities incorporated in the CODSTRAN computer code are effective in predicting the progressive damage and fracture of composite structures. In addition, the results being generated are establishing a data base which will aid in the characterization of composite fracture.
ERIC Educational Resources Information Center
Johnson, Marcus Edward
2017-01-01
Using an analytic informed by Nietzschean genealogy and systems theory, this paper explains how two conceptual structures (the emancipatory binary and the progressive triad), along with standard citation practices in academic journal writing, function to sustain and regenerate a progressive perspective within social studies education scholarship.…
Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert
2009-01-01
Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by the incompatibility of the analytical methods used to measure leaf chemistry across studies and sites. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...
Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A
2007-01-01
The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
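As a rough stand-in for the paper's generative topographic mapping, the idea of projecting multi-analyte records to two dimensions and flagging points far from the bulk can be sketched with ordinary PCA on synthetic data (the records below are generated, not patient data, and PCA is a simpler substitute for GTM):

```python
import numpy as np

# Project 14-analyte records to 2-D and flag the most outlying record.
# Synthetic data; PCA used here in place of generative topographic mapping.
rng = np.random.default_rng(0)
records = rng.normal(size=(500, 14))   # 500 records, 14 analytes each
records[0] += 8.0                      # plant one grossly anomalous record

centered = records - records.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
proj = centered @ vt[:2].T             # coordinates in the top-2 PC plane

dist = np.linalg.norm(proj - proj.mean(axis=0), axis=1)
flagged = np.argsort(dist)[-1]         # index of the most outlying record
```

In a live setting the same projection could be recomputed as results stream in, so anomalous multi-analyte patterns surface before release to the physician.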
Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.
ERIC Educational Resources Information Center
Hercules, David M.; Hercules, Shirley H.
1984-01-01
Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)
Jeffery, Nick D; Bate, Simon T; Safayi, Sina; Howard, Matthew A; Moon, Lawrence; Jeffery, Unity
2018-03-01
In animal experiments, neuroscientists typically assess the effectiveness of interventions by comparing the average response of groups of treated and untreated animals. While providing useful insights, focusing only on group effects risks overemphasis of small, statistically significant but physiologically unimportant, differences. Such differences can be created by analytical variability or physiological within-individual variation, especially if the number of animals in each group is small enough that one or two outlier values can have considerable impact on the summary measures for the group. Physicians face a similar dilemma when comparing two results from the same patient. To determine whether the change between two values reflects disease progression or known analytical and physiological variation, the magnitude of the difference between two results is compared to the reference change value. These values are generated by quantifying analytical and within-individual variation, and differences between two results from the same patient are considered clinically meaningful only if they exceed the combined effect of these two sources of 'noise'. In this article, we describe how the reference change interval can be applied within neuroscience. This form of analysis provides a measure of outcome at an individual level that complements traditional group-level comparisons, and therefore, introduction of this technique into neuroscience can enrich interpretation of experimental data. It can also safeguard against some of the possible misinterpretations that may occur during analysis of the small experimental groups that are common in neuroscience and, by illuminating analytical error, may aid in design of more efficient experimental methods. © 2018 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
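The reference change value described above combines analytical and within-individual variation into a threshold for a meaningful difference between two results. A minimal numeric sketch, using hypothetical coefficients of variation and the common z = 1.96 for two-sided 95% significance:

```python
import math

def reference_change_value(cv_analytical, cv_within, z=1.96):
    """Reference change value (RCV), in percent.

    Two results from the same individual differ meaningfully only if the
    change exceeds RCV = sqrt(2) * z * sqrt(CVa^2 + CVi^2), where CVa is
    the analytical and CVi the within-individual coefficient of variation.
    """
    return math.sqrt(2.0) * z * math.sqrt(cv_analytical**2 + cv_within**2)

# Hypothetical values: 3% analytical and 6% within-individual variation.
rcv = reference_change_value(3.0, 6.0)   # ≈ 18.6% change needed
```

Applied to an animal experiment, a pre/post difference within an individual smaller than the RCV would be attributed to 'noise' rather than to the intervention.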
A case history: from traumatic repetition towards psychic representability.
Bichi, Estela L
2008-06-01
This paper is devoted principally to a case history concerning an analytic process extending over a period of almost ten years. The patient is B, who consulted the author after a traumatic episode. Although that was her reason for commencing treatment, a history of previous traumatogenic situations, including a rape during her adolescence, subsequently came to light. The author describes three stages of the treatment, reflected in three different settings in accordance with the work done by both patient and analyst in enabling B to own and work through her infantile and adult traumatic experiences. The process of transformation of traumatic traces lacking psychic representation, which was undertaken by both members of the analytic couple from the beginning of the treatment, was eventually approached in a particular way on the basis of their respective creative capacities, which facilitated the patient's psychic progress towards representability and the possibility of working through the experiences of the past. Much of the challenge of this case involved the analyst's capacity to maintain and at the same time consolidate her analytic posture within her internal setting, while doing her best to overcome any possible misfit (Balint, 1968) between her own technique and the specific complexities of the individual patient. The account illustrates the alternation of phases, at the beginning of the analysis, of remembering and interpretation on the one hand and of the representational void and construction on the other. 
In the case history proper and in her detailed summing up, the author refers to the place of the analyst during the analytic process, the involvement of her psychic functioning, and the importance of her capacity to work on and make use of her countertransference and self-analytic introspection, with a view to neutralizing any influence that aspects of her 'real person' might have had on the analytic field and on the complex processes taking place within it.
Control system design for flexible structures using data models
NASA Technical Reports Server (NTRS)
Irwin, R. Dennis; Frazier, W. Garth; Mitchell, Jerrel R.; Medina, Enrique A.; Bukley, Angelia P.
1993-01-01
The dynamics and control of flexible aerospace structures exercises many of the engineering disciplines. In recent years there has been considerable research in the development and tailoring of control system design techniques for these structures. This problem involves designing a control system for a multi-input, multi-output (MIMO) system that satisfies various performance criteria, such as vibration suppression, disturbance and noise rejection, attitude control, and slewing control. Considerable progress has been made and demonstrated in control system design techniques for these structures. The key to designing control systems for these structures that meet stringent performance requirements is an accurate model. It has become apparent that theoretical and finite-element-generated models do not provide the needed accuracy; almost all successful demonstrations of control system design techniques have involved using test results for fine-tuning a model or for extracting a model using system ID techniques. This paper describes past and ongoing efforts at Ohio University and NASA MSFC to design controllers using 'data models.' The basic philosophy of this approach is to start with a stabilizing controller and frequency response data that describes the plant; then, iteratively vary the free parameters of the controller so that performance measures become closer to satisfying design specifications. The frequency response data can be either experimentally or analytically derived. One 'design-with-data' algorithm presented in this paper is called the Compensator Improvement Program (CIP). The current CIP designs controllers for MIMO systems so that classical gain, phase, and attenuation margins are achieved. The centerpiece of the CIP algorithm is the constraint improvement technique, which is used to calculate a parameter change vector that guarantees an improvement in all unsatisfied, feasible performance metrics from iteration to iteration.
The paper also presents a recently demonstrated CIP-type algorithm, called the Model and Data Oriented Computer-Aided Design System (MADCADS), developed for achieving H(sub infinity) type design specifications using data models. Control system design for the NASA/MSFC Single Structure Control Facility are demonstrated for both CIP and MADCADS. Advantages of design-with-data algorithms over techniques that require analytical plant models are also presented.
Hemispheric Activation Differences in Novice and Expert Clinicians during Clinical Decision Making
ERIC Educational Resources Information Center
Hruska, Pam; Hecker, Kent G.; Coderre, Sylvain; McLaughlin, Kevin; Cortese, Filomeno; Doig, Christopher; Beran, Tanya; Wright, Bruce; Krigolson, Olav
2016-01-01
Clinical decision making requires knowledge, experience and analytical/non-analytical types of decision processes. As clinicians progress from novice to expert, research indicates decision-making becomes less reliant on foundational biomedical knowledge and more on previous experience. In this study, we investigated how knowledge and experience…
Removal of DNT from Wastewaters at Radford Army Ammunition Plant
1991-03-31
Peter Hartmann is RAAP's Technical Analytical laboratory supervisor and is responsible for tracking analytical progress and ensuring timely completion.
Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason
2016-01-01
With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713
Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"
NASA Astrophysics Data System (ADS)
Pal, Sangita; Singha, Mousumi; Meena, Sher Singh
2018-04-01
The limited availability of analytical instruments for the methodical detection of known and unknown effluents poses a serious hindrance to qualification and quantification. Instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive and time consuming but also require maintenance and replacement of damaged essential parts, which are serious concerns. Moreover, these instruments are not convenient to install at every location for field studies and instant detection. A pre-concentration technique for metal ions, especially for lean streams, is therefore elaborated and justified. Chelation/sequestration is the key immobilization technique: it is simple, user friendly, highly effective, inexpensive, and time efficient, and a small vial (10 g - 20 g) is easy to carry to the experimental field or site, as has been demonstrated.
Approximate analytical relationships for linear optimal aeroelastic flight control laws
NASA Astrophysics Data System (ADS)
Kassem, Ayman Hamdy
1998-09-01
This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.
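The flavor of such closed-form relationships between cost-function weights and closed-loop eigenvalues can be illustrated on a scalar system (a textbook case for illustration, not the dissertation's aeroelastic model): for dx/dt = ax + bu with cost J = ∫(qx² + ru²)dt, the scalar Riccati equation 2aP − (b²/r)P² + q = 0 yields the gain k = bP/r and the closed-loop eigenvalue a − bk = −√(a² + b²q/r), making explicit how the weight ratio q/r drives the pole.

```python
import math

def lqr_scalar(a, b, q, r):
    """LQR for dx/dt = a*x + b*u with cost J = ∫ (q*x^2 + r*u^2) dt.

    Solves the scalar algebraic Riccati equation
    2*a*P - (b**2/r)*P**2 + q = 0 for the stabilizing root P > 0.
    """
    p = (a + math.sqrt(a**2 + (b**2) * q / r)) * r / b**2
    k = b * p / r          # optimal state-feedback gain, u = -k*x
    pole = a - b * k       # closed-loop eigenvalue, = -sqrt(a^2 + b^2*q/r)
    return k, pole

# Unstable plant a = 1; raising the state weight q pulls the pole left.
k, pole = lqr_scalar(a=1.0, b=1.0, q=3.0, r=1.0)
```

With these numbers the analytical prediction is pole = −√(1 + 3) = −2, recovered exactly by the Riccati solution.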
Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration
NASA Technical Reports Server (NTRS)
Merritt, D. A.; Brand, W. A.; Hayes, J. M.
1994-01-01
In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques and results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source, or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (sigma = 0.00006 at.% 13C or 0.06% for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 < or = sigma < or = 0.0002 at.% or 0.1 < or = sigma < or = 0.2%).
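The isotope-ratio bookkeeping underlying such calibrations can be sketched with standard delta notation and the atom-percent conversion (the ratios below are hypothetical illustrations, not the paper's data):

```python
def delta13C(r_sample, r_standard):
    """Delta notation: per-mil deviation of the sample's 13C/12C ratio
    from the working standard's ratio."""
    return (r_sample / r_standard - 1.0) * 1000.0

def atom_percent_13C(r):
    """Convert a 13C/12C ratio to atom percent 13C."""
    return 100.0 * r / (1.0 + r)

# Hypothetical ratios: a sample slightly depleted relative to its standard.
d = delta13C(0.01110, 0.01118)       # ≈ -7.2 per mil
ap = atom_percent_13C(0.011)         # ≈ 1.09 atom percent 13C
```

Precisions quoted in at.% (as in the abstract) and in per mil are related through these two expressions.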
Numerical simulations of strongly correlated electron and spin systems
NASA Astrophysics Data System (ADS)
Changlani, Hitesh Jaiprakash
Developing analytical and numerical tools for strongly correlated systems is a central challenge for the condensed matter physics community. In the absence of exact solutions and controlled analytical approximations, numerical techniques have often contributed to our understanding of these systems. Exact Diagonalization (ED) requires the storage of at least two vectors the size of the Hilbert space under consideration (which grows exponentially with system size) which makes it affordable only for small systems. The Density Matrix Renormalization Group (DMRG) uses an intelligent Hilbert space truncation procedure to significantly reduce this cost, but in its present formulation is limited to quasi-1D systems. Quantum Monte Carlo (QMC) maps the Schrodinger equation to the diffusion equation (in imaginary time) and only samples the eigenvector over time, thereby avoiding the memory limitation. However, the stochasticity involved in the method gives rise to the "sign problem" characteristic of fermion and frustrated spin systems. The first part of this thesis is an effort to make progress in the development of a numerical technique which overcomes the above mentioned problems. We consider novel variational wavefunctions, christened "Correlator Product States" (CPS), that have a general functional form which hopes to capture essential correlations in the ground states of spin and fermion systems in any dimension. We also consider a recent proposal to modify projector (Green's Function) Quantum Monte Carlo to ameliorate the sign problem for realistic and model Hamiltonians (such as the Hubbard model). This exploration led to our own set of improvements, primarily a semistochastic formulation of projector Quantum Monte Carlo. Despite their limitations, existing numerical techniques can yield physical insights into a wide variety of problems. 
The second part of this thesis considers one such numerical technique - DMRG - and adapts it to study the Heisenberg antiferromagnet on a generic tree graph. Our attention turns to a systematic numerical and semi-analytical study of the effect of local even/odd sublattice imbalance on the low energy spectrum of antiferromagnets on regular Cayley trees. Finally, motivated by previous experiments and theories of randomly diluted antiferromagnets (where an even/odd sublattice imbalance naturally occurs), we present our study of the Heisenberg antiferromagnet on the Cayley tree at the percolation threshold. Our work shows how to detect "emergent" low energy degrees of freedom and compute the effective interactions between them by using data from DMRG calculations.
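The exponential Hilbert-space growth that motivates the methods above can be seen in a minimal exact-diagonalization sketch: a spin-1/2 Heisenberg chain built from spin operators, whose two-site ground state is the singlet with energy −3J/4 (a generic small example, not the thesis's Cayley-tree calculations):

```python
import numpy as np

# Spin-1/2 operators (Pauli matrices / 2).
sx = np.array([[0, 1], [1, 0]]) / 2.0
sy = np.array([[0, -1j], [1j, 0]]) / 2.0
sz = np.array([[1, 0], [0, -1]]) / 2.0

def heisenberg_chain(n, J=1.0):
    """H = J * sum_i S_i . S_{i+1} on an open chain of n spin-1/2 sites.

    The matrix dimension is 2**n -- the exponential growth that makes
    exact diagonalization affordable only for small systems.
    """
    dim = 2**n
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n - 1):
        for s in (sx, sy, sz):
            op = np.eye(1)
            for site in range(n):
                op = np.kron(op, s if site in (i, i + 1) else np.eye(2))
            H += J * op
    return H

e0 = np.linalg.eigvalsh(heisenberg_chain(2))[0]   # singlet energy, -3J/4
```

DMRG can be viewed as a systematic truncation of exactly this kind of exponentially large basis.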
Analytical technique characterizes all trace contaminants in water
NASA Technical Reports Server (NTRS)
Foster, J. N.; Lysyj, I.; Nelson, K. H.
1967-01-01
A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the source of water pollution.
Applications of mid-infrared spectroscopy in the clinical laboratory setting.
De Bruyne, Sander; Speeckaert, Marijn M; Delanghe, Joris R
2018-01-01
Fourier transform mid-infrared (MIR-FTIR) spectroscopy is a nondestructive, label-free, highly sensitive and specific technique that provides complete information on the chemical composition of biological samples. The technique can both offer fundamental structural information and serve as a quantitative analysis tool. Therefore, it has many potential applications in different fields of clinical laboratory science. Although considerable technological progress has been made to promote biomedical applications of this powerful analytical technique, most clinical laboratory analyses are based on spectroscopic measurements in the visible or ultraviolet (UV) spectrum, and the potential role of FTIR spectroscopy still remains unexplored. In this review, we present some general principles of FTIR spectroscopy as a useful method to study molecules in specimens by MIR radiation together with a short overview of methods to interpret spectral data. We aim at illustrating the wide range of potential applications of the proposed technique in the clinical laboratory setting with a focus on its advantages and limitations and discussing the future directions. The reviewed applications of MIR spectroscopy include (1) quantification of clinical parameters in body fluids, (2) diagnosis and monitoring of cancer and other diseases by analysis of body fluids, cells, and tissues, (3) classification of clinically relevant microorganisms, and (4) analysis of kidney stones, nails, and faecal fat.
Foster, Katherine T; Beltz, Adriene M
2018-08-01
Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
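The core contrast between averaging across people and person-oriented analysis can be sketched with generated data, in the spirit of the example analyses above (hypothetical daily craving ratings and a simple per-person regression, rather than the specific techniques reviewed):

```python
import numpy as np

# Hypothetical ambulatory-assessment data: daily craving ratings (0-10)
# for two individuals over one week.
days = np.arange(7.0)
person_a = np.array([2, 3, 3, 4, 5, 5, 6.0])   # escalating trajectory
person_b = np.array([6, 5, 5, 4, 3, 3, 2.0])   # de-escalating trajectory

# Person-oriented: one slope per individual.
slope_a = np.polyfit(days, person_a, 1)[0]
slope_b = np.polyfit(days, person_b, 1)[0]

# Aggregate: pool both individuals into one regression.
pooled = np.polyfit(np.tile(days, 2),
                    np.concatenate([person_a, person_b]), 1)[0]
```

Here the pooled slope is essentially zero, masking two opposite within-person trajectories that the individual fits recover, which is the motivation for the person-oriented techniques the paper reviews.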
Williams, Claire; Lewsey, James D.; Mackay, Daniel F.; Briggs, Andrew H.
2016-01-01
Modeling of clinical-effectiveness in a cost-effectiveness analysis typically involves some form of partitioned survival or Markov decision-analytic modeling. The health states progression-free, progression and death and the transitions between them are frequently of interest. With partitioned survival, progression is not modeled directly as a state; instead, time in that state is derived from the difference in area between the overall survival and the progression-free survival curves. With Markov decision-analytic modeling, a priori assumptions are often made with regard to the transitions rather than using the individual patient data directly to model them. This article compares a multi-state modeling survival regression approach to these two common methods. As a case study, we use a trial comparing rituximab in combination with fludarabine and cyclophosphamide v. fludarabine and cyclophosphamide alone for the first-line treatment of chronic lymphocytic leukemia. We calculated mean Life Years and QALYs that involved extrapolation of survival outcomes in the trial. We adapted an existing multi-state modeling approach to incorporate parametric distributions for transition hazards, to allow extrapolation. The comparison showed that, due to the different assumptions used in the different approaches, a discrepancy in results was evident. The partitioned survival and Markov decision-analytic modeling deemed the treatment cost-effective with ICERs of just over £16,000 and £13,000, respectively. However, the results with the multi-state modeling were less conclusive, with an ICER of just over £29,000. This work has illustrated that it is imperative to check whether assumptions are realistic, as different model choices can influence clinical and cost-effectiveness results. PMID:27698003
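The partitioned-survival calculation described above, in which time in the progression state is the area between the overall-survival (OS) and progression-free-survival (PFS) curves, can be sketched with hypothetical survival probabilities and trapezoidal integration:

```python
import numpy as np

def trapezoid(y, x):
    """Area under a piecewise-linear curve y(x)."""
    return float(np.sum(np.diff(x) * (y[1:] + y[:-1]) / 2.0))

# Hypothetical survival probabilities at yearly visits (months on the axis).
t = np.array([0.0, 12.0, 24.0, 36.0])          # months
os_curve = np.array([1.0, 0.9, 0.7, 0.5])      # P(alive)
pfs_curve = np.array([1.0, 0.7, 0.4, 0.2])     # P(alive and progression-free)

life_years = trapezoid(os_curve, t) / 12.0     # mean time alive
pf_years = trapezoid(pfs_curve, t) / 12.0      # mean time progression-free
progression_years = life_years - pf_years      # mean time in progression
```

A multi-state model would instead estimate the progression state directly from transition hazards, which is why the two approaches can disagree, as the case study shows.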
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.
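The sensitivity of analytical reliability estimates to their input data can be illustrated with the standard triple-modular-redundancy (TMR) formula (a generic stand-in for illustration, not the fault-tolerant systems analyzed in the paper): the system fails when at least 2 of 3 independent channels fail.

```python
def tmr_failure(p):
    """System failure probability for 2-of-3 majority voting,
    given per-channel failure probability p:
    P(fail) = 3*p^2*(1-p) + p^3 = 3*p^2 - 2*p^3."""
    return 3 * p**2 - 2 * p**3

nominal = tmr_failure(1e-4)      # estimate with the assumed channel data
perturbed = tmr_failure(2e-4)    # a 2x error in the input channel data...
ratio = perturbed / nominal      # ...roughly quadruples the system estimate
```

Because the estimate scales roughly with the square of the input probability, a modest error of approximation in the input data is amplified in the output, which is exactly the credibility problem the abstract describes.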
Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek
2017-07-04
One of the major sources of error during chemical analysis with the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage, and in this review we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis without sample preparation, or with only limited pre-concentration steps. The MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques are presented. Trends in the development of direct analysis using the aforementioned techniques are also presented.
NASA Astrophysics Data System (ADS)
Aziz, Akram Mekhael; Sauck, William August; Shendi, El-Arabi Hendi; Rashed, Mohamed Ahmed; Abd El-Maksoud, Mohamed
2013-07-01
Progress over the past three decades in geophysical data processing and interpretation techniques has been particularly marked in the field of aero-geophysics. The present study demonstrates the application of some of these techniques, including the Analytic Signal, Located Euler Deconvolution, Standard Euler Deconvolution, and 2D inverse modelling, to help enhance and interpret archeo-magnetic measurements. A high-resolution total magnetic field survey was conducted at the ancient city of Pelusium (the name derived from the ancient Pelusiac branch of the Nile; the site is now called Tell el-Farama), located in the northwestern corner of the Sinai Peninsula. The historical city served as a harbour throughout Egyptian history. Different ruins at the site have been dated back to late Pharaonic, Graeco-Roman, Byzantine, Coptic, and Islamic periods. An area of 10,000 m2, to the west of the famous huge red brick citadel of Pelusium, was surveyed using the magnetic method. The location was recommended by Egyptian archaeologists, who suspected the presence of buried foundations of a temple to the gods Zeus and Kasios. The interpretation of the results revealed interesting shallow-buried features, which may represent the temple's outer walls. These walls are elongated in the same azimuth as the northern wall of the citadel, which supports the hypothesis of a controlling feature such as a former seacoast or the shore of a distributary channel.
A Systematic Review of Health Economics Simulation Models of Chronic Obstructive Pulmonary Disease.
Zafari, Zafar; Bryan, Stirling; Sin, Don D; Conte, Tania; Khakban, Rahman; Sadatsafavi, Mohsen
2017-01-01
Many decision-analytic models with varying structures have been developed to inform resource allocation in chronic obstructive pulmonary disease (COPD). To review COPD models for their adherence to best practice modeling recommendations and their assumptions regarding important aspects of the natural history of COPD. A systematic search of English articles reporting on the development or application of a decision-analytic model in COPD was performed in MEDLINE, Embase, and citations within reviewed articles. Studies were summarized and evaluated on the basis of their adherence to the Consolidated Health Economic Evaluation Reporting Standards. They were also evaluated for the underlying assumptions about disease progression, heterogeneity, comorbidity, and treatment effects. Forty-nine models of COPD were included. Decision trees and Markov models were the most popular techniques (n = 43). Quality of reporting and adherence to the guidelines were generally high, especially in more recent publications. Disease progression was modeled through clinical staging in most studies. Although most studies (n = 43) had incorporated some aspects of COPD heterogeneity, only 8 reported results across subgroups. Only 2 evaluations explicitly considered the impact of comorbidities. Treatment effect was mostly modeled (n = 20) as both a reduction in exacerbation rate and an improvement in lung function. Many COPD models have been developed, generally with similar structural elements. COPD is highly heterogeneous, and comorbid conditions play an important role in its burden. These important aspects, however, have not been adequately addressed in most of the published models. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Common aspects influencing the translocation of SERS to Biomedicine.
Gil, Pilar Rivera; Tsouts, Dionysia; Sanles-Sobrido, Marcos; Cabo, Andreu
2018-01-04
In this review, we introduce the reader to the analytical technique of surface-enhanced Raman scattering, motivated by the great potential we believe this technique has in biomedicine. We present the advantages and limitations of the technique relevant for bioanalysis in vitro and in vivo, and how it goes beyond the state of the art of traditional analytical, labelling and healthcare diagnosis technologies. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
An implementation and performance measurement of the progressive retry technique
NASA Technical Reports Server (NTRS)
Suri, Gaurav; Huang, Yennun; Wang, Yi-Min; Fuchs, W. Kent; Kintala, Chandra
1995-01-01
This paper describes a recovery technique called progressive retry for bypassing software faults in message-passing applications. The technique is implemented as reusable modules to provide application-level software fault tolerance. The paper describes the implementation of the technique and presents results from the application of progressive retry to two telecommunications systems. The results presented show that the technique is helpful in reducing the total recovery time for message-passing applications.
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Surface-Enhanced Raman Spectroscopy.
ERIC Educational Resources Information Center
Garrell, Robin L.
1989-01-01
Reviews the basis for the technique and its experimental requirements. Describes a few examples of the analytical problems to which surface-enhanced Raman spectroscopy (SERS) has been and can be applied. Provides a perspective on the current limitations and frontiers in developing SERS as an analytical technique. (MVL)
Jabłońska-Czapla, Magdalena
2015-01-01
Chemical speciation is a very important subject in environmental protection, toxicology, and chemical analytics, because the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires ever more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of these elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analytics is so important. The paper also provides numerous examples of hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962
A comparison of measured and theoretical predictions for STS ascent and entry sonic booms
NASA Technical Reports Server (NTRS)
Garcia, F., Jr.; Jones, J. H.; Henderson, H. R.
1983-01-01
Sonic boom measurements have been obtained during the flights of STS-1 through STS-5. During STS-1, 2, and 4, entry sonic boom measurements were obtained, and ascent measurements were made on STS-5. The objectives of this measurement program were (1) to define the sonic boom characteristics of the Space Transportation System (STS), (2) to provide a realistic assessment of the validity of existing theoretical prediction techniques, and (3) to establish a level of confidence for predicting future STS configuration sonic boom environments. Detailed evaluation and reporting of the results of this program are in progress. This paper addresses only the significant results, mainly those data obtained during the entry of STS-1 at Edwards Air Force Base (EAFB) and the ascent of STS-5 from Kennedy Space Center (KSC). The theoretical prediction technique employed in this analysis is the so-called Thomas program, a semi-empirical method that requires definition of the near-field signatures, detailed trajectory characteristics, and the prevailing meteorological characteristics as input. This analytical procedure then extrapolates the near-field signatures from the flight altitude to an altitude consistent with each measurement location.
Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali
2016-03-01
The most important objectives frequently found in bio-analytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques for their measurement. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The prospective study revealed that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques, owing to protein binding and drastic interferences from complex matrices in real samples such as blood plasma and serum.
Analytical Chemistry Division annual progress report for period ending December 31, 1988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Analytical Chemistry Division of Oak Ridge National Laboratory (ORNL) is a large and diversified organization. As such, it serves a multitude of functions for a clientele that exists both in and outside of ORNL. These functions fall into the following general categories: (1) Analytical Research, Development, and Implementation. The division maintains a program to conceptualize, investigate, develop, assess, improve, and implement advanced technology for chemical and physicochemical measurements. Emphasis is on problems and needs identified with ORNL and Department of Energy (DOE) programs; however, attention is also given to advancing the analytical sciences themselves. (2) Programmatic Research, Development, and Utilization. The division carries out a wide variety of chemical work that typically involves analytical research and/or development plus the utilization of analytical capabilities to expedite programmatic interests. (3) Technical Support. The division performs chemical and physicochemical analyses of virtually all types. The Analytical Chemistry Division is organized into four major sections, each of which may carry out any of the three types of work mentioned above. Chapters 1 through 4 of this report highlight progress within the four sections during the period January 1 to December 31, 1988. A brief discussion of the division's role in an especially important environmental program is given in Chapter 5. Information about quality assurance, safety, and training programs is presented in Chapter 6, along with a tabulation of analyses rendered. Publications, oral presentations, professional activities, educational programs, and seminars are cited in Chapters 7 and 8.
NASA Astrophysics Data System (ADS)
Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.
2002-04-01
Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound for the steganographic capacity of LSB-based steganographic techniques for a given probability of false detection. In this paper we look at adaptive steganographic techniques, which take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
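The non-adaptive LSB embedding whose capacity the paper bounds can be sketched in a few lines. The following is an illustrative toy (names and data are invented here, not taken from the paper's code), with the adaptive idea noted in a comment:

```python
def embed_lsb(pixels, bits):
    """Replace the least-significant bit of successive pixels with message bits."""
    out = list(pixels)
    for i, b in enumerate(bits):
        out[i] = (out[i] & ~1) | b   # clear the LSB, then set it to the message bit
    return out

def extract_lsb(pixels, n):
    """Read back the first n embedded bits."""
    return [p & 1 for p in pixels[:n]]

cover = [52, 55, 61, 59, 79, 61, 76, 61]   # toy 8-pixel cover "image"
msg = [1, 0, 1, 1]
stego = embed_lsb(cover, msg)
recovered = extract_lsb(stego, len(msg))
# An adaptive variant would instead choose *which* pixels to modify based on
# local image content (e.g., embedding only in high-variance regions) or on
# the statistics a known steganalysis detector measures.
```

Each pixel changes by at most one gray level, which is why LSB embedding is hard to see but still detectable statistically.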
WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL
The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. he WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...
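The superposition at the heart of an analytic element model can be illustrated with a minimal hand-rolled sketch: the discharge potential of uniform regional flow plus several pumping wells is simply the sum of each element's closed-form solution. Function names and parameters below are my own, assuming steady confined flow, and are not WhAEM's API:

```python
import math

def well_potential(x, y, xw, yw, Q):
    """Thiem-type well element: potential of a well pumping Q at (xw, yw)."""
    r = math.hypot(x - xw, y - yw)
    return (Q / (2 * math.pi)) * math.log(r)

def uniform_flow_potential(x, y, qx):
    """Uniform regional flow of strength qx in the +x direction."""
    return -qx * x

def total_potential(x, y, wells, qx):
    """Superposition: sum the analytic solution of every element."""
    phi = uniform_flow_potential(x, y, qx)
    for xw, yw, Q in wells:
        phi += well_potential(x, y, xw, yw, Q)
    return phi
```

The gradient of the summed potential gives the discharge vector; tracing particles up-gradient through that field is how time-of-travel capture zones of the kind WhAEM delineates are obtained.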
NASA Astrophysics Data System (ADS)
Safouhi, Hassan; Hoggan, Philip
2003-01-01
This review on molecular integrals for large electronic systems (MILES) places the problem of analytical integration over exponential-type orbitals (ETOs) in a historical context. After reference to the pioneering work, particularly by Barnett, Shavitt and Yoshimine, it focuses on recent progress towards rapid and accurate analytic solutions of MILES over ETOs. Software such as the hydrogenlike wavefunction package Alchemy by Yoshimine and collaborators is described. The review focuses on convergence acceleration of these highly oscillatory integrals and in particular it highlights suitable nonlinear transformations. Work by Levin and Sidi is described and applied to MILES. A step by step description of progress in the use of nonlinear transformation methods to obtain efficient codes is provided. The recent approach developed by Safouhi is also presented. The current state of the art in this field is summarized to show that ab initio analytical work over ETOs is now a viable option.
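As a toy illustration of what a nonlinear sequence transformation buys (the Levin and Sidi transformations discussed in the review are considerably more powerful), the classic Shanks transformation accelerates the slowly convergent alternating series for ln 2:

```python
import math

def partial_sums(terms):
    s, out = 0.0, []
    for t in terms:
        s += t
        out.append(s)
    return out

def shanks(seq):
    """One pass of the Shanks transformation, a nonlinear sequence
    transformation that cancels the dominant transient in the error."""
    return [
        (seq[i + 1] * seq[i - 1] - seq[i] ** 2)
        / (seq[i + 1] + seq[i - 1] - 2 * seq[i])
        for i in range(1, len(seq) - 1)
    ]

# Slowly converging alternating series: 1 - 1/2 + 1/3 - ... -> ln 2
sums = partial_sums((-1) ** (k + 1) / k for k in range(1, 12))
accelerated = shanks(shanks(sums))   # iterating the transformation helps further
```

With only eleven terms, two passes of the transformation get far closer to ln 2 than the raw partial sums, which is the same economy the review seeks for oscillatory molecular integrals.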
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Beaver, Justin M; Bogen, Paul L., II
In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams, with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to its high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.
The scent of disease: volatile organic compounds of the human body related to disease and disorder.
Shirasu, Mika; Touhara, Kazushige
2011-09-01
Hundreds of volatile organic compounds (VOCs) are emitted from the human body, and the components of VOCs usually reflect the metabolic condition of an individual. Therefore, contracting an infectious or metabolic disease often results in a change in body odour. Recent progress in analytical techniques allows rapid analyses of VOCs derived from breath, blood, skin and urine. Disease-specific VOCs can be used as diagnostic olfactory biomarkers of infectious diseases, metabolic diseases, genetic disorders and other kinds of diseases. Elucidation of the pathophysiological mechanisms underlying production of disease-specific VOCs may provide novel insights into therapeutic approaches for various diseases. This review summarizes the current knowledge on chemical and clinical aspects of body-derived VOCs, and provides a brief outlook at the future of olfactory diagnosis.
Shielding from space radiations
NASA Technical Reports Server (NTRS)
Chang, C. Ken; Badavi, Forooz F.; Tripathi, Ram K.
1993-01-01
This Progress Report, covering the period of December 1, 1992 to June 1, 1993, presents the development of an analytical solution to the heavy ion transport equation in terms of a Green's function formalism. The mathematical development is recast into a highly efficient computer code for space applications. The efficiency of this algorithm is accomplished by a nonperturbative technique for extending the Green's function over the solution domain. The code may also be applied to accelerator boundary conditions to allow code validation in laboratory experiments. Results from the isotopic version of the code, with 59 isotopes present, for a single-layer target material, in the case of an iron beam projectile at 600 MeV/nucleon in water, are presented. A listing of the single-layer isotopic version of the code is included.
Failure analysis of woven and braided fabric reinforced composites
NASA Technical Reports Server (NTRS)
Naik, Rajiv A.
1994-01-01
A general purpose micromechanics analysis that discretely models the yarn architecture within the textile repeating unit cell was developed to predict overall, three dimensional, thermal and mechanical properties, damage initiation and progression, and strength. This analytical technique was implemented in a user-friendly, personal computer-based, menu-driven code called Textile Composite Analysis for Design (TEXCAD). TEXCAD was used to analyze plain weave and 2x2, 2-D triaxial braided composites. The calculated tension, compression, and shear strengths correlated well with available test data for both woven and braided composites. Parametric studies were performed on both woven and braided architectures to investigate the effects of parameters such as yarn size, yarn spacing, yarn crimp, braid angle, and overall fiber volume fraction on the strength properties of the textile composite.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rockward, Tommy
2012-07-16
For the past 6 years, open discussions and/or meetings have been held, and are still on-going, with OEMs, hydrogen suppliers, other test facilities from the North America Team, and international collaborators regarding experimental results, fuel clean-up cost, modeling, and analytical techniques to help determine levels of constituents for the development of an international standard for hydrogen fuel quality (ISO TC197 WG-12). Significant progress has been made. The process for the fuel standard is entering its final stages as a result of the technical accomplishments. The objectives are to: (1) determine the allowable levels of hydrogen fuel contaminants in support of the development of science-based international standards for hydrogen fuel quality (ISO TC197 WG-12); and (2) validate the ASTM test method for determining low levels of non-hydrogen constituents.
New methods for new questions: obstacles and opportunities.
Foster, E Michael; Kalil, Ariel
2008-03-01
Two forces motivate this special section, "New Methods for New Questions in Developmental Psychology." First are recent developments in social science methodology and the increasing availability of those methods in common software packages. Second, at the same time psychologists' understanding of developmental phenomena has continued to grow. At their best, these developments in theory and methods work in tandem, fueling each other. Newer methods make it possible for scientists to better test their ideas; better ideas lead methodologists to techniques that better reflect, capture, and quantify the underlying processes. The articles in this special section represent a sampling of these new methods and new questions. The authors describe common themes in these articles and identify barriers to future progress, such as the lack of data sharing by and analytical training for developmentalists.
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics over raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and these are examined with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data
NASA Astrophysics Data System (ADS)
Jern, Mikael
Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on the development of collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels, aimed at measuring and collaborating, through statistical indicators, on economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use collaborative GeoAnalytics tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. For example, dynamic web-enabled animation enables statisticians to explore temporal, spatial and multivariate demographic data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues, and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of domain experts and are precious in a creative analytic reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing and communicating gained insights and discoveries, based on a novel storytelling mechanism that captures, re-uses and shares task-related explorative events.
Llano, Daniel A; Devanarayan, Viswanath; Simon, Adam J
2013-01-01
Previous studies that have examined the potential for plasma markers to serve as biomarkers for Alzheimer disease (AD) have studied single analytes and focused on the amyloid-β and τ isoforms and have failed to yield conclusive results. In this study, we performed a multivariate analysis of 146 plasma analytes (the Human DiscoveryMAP v 1.0 from Rules-Based Medicine) in 527 subjects with AD, mild cognitive impairment (MCI), or cognitively normal elderly subjects from the Alzheimer's Disease Neuroimaging Initiative database. We identified 4 different proteomic signatures, each using 5 to 14 analytes, that differentiate AD from control patients with sensitivity and specificity ranging from 74% to 85%. Five analytes were common to all 4 signatures: apolipoprotein A-II, apolipoprotein E, serum glutamic oxaloacetic transaminase, α-1-microglobulin, and brain natriuretic peptide. None of the signatures adequately predicted progression from MCI to AD over a 12- and 24-month period. A new panel of analytes, optimized to predict MCI to AD conversion, was able to provide 55% to 60% predictive accuracy. These data suggest that a simple panel of plasma analytes may provide an adjunctive tool to differentiate AD from controls, may provide mechanistic insights to the etiology of AD, but cannot adequately predict MCI to AD conversion.
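The way such a panel "signature" is typically scored and evaluated can be sketched generically: combine a handful of analyte levels into a linear score, threshold it, and report sensitivity (fraction of cases flagged) and specificity (fraction of controls cleared). The analyte names, weights, and data below are invented for illustration and are not the paper's signatures:

```python
def panel_score(sample, weights):
    """Linear combination of measured analyte levels."""
    return sum(sample[name] * w for name, w in weights.items())

def sens_spec(cases, controls, weights, threshold):
    """Sensitivity and specificity of a score-threshold classifier."""
    tp = sum(panel_score(s, weights) >= threshold for s in cases)
    tn = sum(panel_score(s, weights) < threshold for s in controls)
    return tp / len(cases), tn / len(controls)

weights = {"apoE": 1.0, "BNP": 0.5}                       # hypothetical 2-analyte panel
cases = [{"apoE": 3.0, "BNP": 2.0}, {"apoE": 2.5, "BNP": 1.0}]
controls = [{"apoE": 1.0, "BNP": 0.5}, {"apoE": 1.2, "BNP": 0.8}]
sens, spec = sens_spec(cases, controls, weights, threshold=2.5)
```

Sweeping the threshold traces out the trade-off between the two quantities, which is how sensitivity/specificity ranges such as the paper's 74% to 85% arise.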
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendron, R.; Engebrecht, C.
The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.
Analytical Chemistry Division. Annual progress report for period ending December 31, 1980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lyon, W.S.
1981-05-01
This report is divided into: analytical methodology; mass and emission spectrometry; technical support; bio/organic analysis; nuclear and radiochemical analysis; quality assurance, safety, and tabulation of analyses; supplementary activities; and presentation of research results. Separate abstracts were prepared for the technical support, bio/organic analysis, and nuclear and radiochemical analysis. (DLC)
Analytical Chemistry Division annual progress report for period ending December 31, 1989
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-04-01
The Analytical Chemistry Division of Oak Ridge National Laboratory (ORNL) is a large and diversified organization. As such, it serves a multitude of functions for a clientele that exists both in and outside of ORNL. These functions fall into the following general categories: Analytical Research, Development and Implementation; Programmatic Research, Development, and Utilization; and Technical Support. The Analytical Chemistry Division is organized into four major sections, each of which may carry out any of the three types of work mentioned above. Chapters 1 through 4 of this report highlight progress within the four sections during the period January 1 to December 31, 1989. A brief discussion of the division's role in an especially important environmental program is given in Chapter 5. Information about quality assurance, safety, and training programs is presented in Chapter 6, along with a tabulation of analyses rendered. Publications, oral presentations, professional activities, educational programs, and seminars are cited in Chapters 7 and 8. Approximately 69 articles, 41 proceedings, and 31 reports were published, and 151 oral presentations were given during this reporting period. Some 308,981 determinations were performed.
Recent Progresses in Nanobiosensing for Food Safety Analysis
Yang, Tao; Huang, Huifen; Zhu, Fang; Lin, Qinlu; Zhang, Lin; Liu, Junwen
2016-01-01
With increasing adulteration, food safety analysis has become an important research field. Nanomaterials-based biosensing holds great potential in designing highly sensitive and selective detection strategies necessary for food safety analysis. This review summarizes various function types of nanomaterials, the methods of functionalization of nanomaterials, and recent (2014–present) progress in the design and development of nanobiosensing for the detection of food contaminants including pathogens, toxins, pesticides, antibiotics, metal contaminants, and other analytes, which are sub-classified according to various recognition methods of each analyte. The existing shortcomings and future perspectives of the rapidly growing field of nanobiosensing addressing food safety issues are also discussed briefly. PMID:27447636
Guidelines and Parameter Selection for the Simulation of Progressive Delamination
NASA Technical Reports Server (NTRS)
Song, Kyongchan; Davila, Carlos G.; Rose, Cheryl A.
2008-01-01
Turon's methodology for determining optimal analysis parameters for the simulation of progressive delamination is reviewed. Recommended procedures for determining analysis parameters for efficient delamination growth predictions using the Abaqus/Standard cohesive element and relatively coarse meshes are provided for single- and mixed-mode loading. The Abaqus cohesive element, COH3D8, and a user-defined cohesive element are used to develop finite element models of the double cantilever beam specimen, the end-notched flexure specimen, and the mixed-mode bending specimen to simulate progressive delamination growth in Mode I, Mode II, and mixed-mode fracture, respectively. The predicted responses are compared with their analytical solutions. The results show that for single-mode fracture, the predicted responses obtained with the Abaqus cohesive element correlate well with the analytical solutions. For mixed-mode fracture, it was found that the response predicted using COH3D8 elements depends on the damage evolution criterion that is used. The energy-based criterion overpredicts the peak loads and the load-deflection response. The results predicted using a tabulated form of the BK criterion correlate well with the analytical solution and with the results predicted with the user-written element.
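The BK (Benzeggagh-Kenane) criterion referenced above gives the mixed-mode critical energy release rate in closed form. A minimal sketch follows; the material constants shown are typical published values for a carbon/PEEK system, used here only as an example, not taken from this paper:

```python
def bk_toughness(GI, GII, GIc, GIIc, eta):
    """Benzeggagh-Kenane mixed-mode toughness:
    G_c = G_Ic + (G_IIc - G_Ic) * (G_II / (G_I + G_II))**eta
    """
    GT = GI + GII                      # total energy release rate
    if GT == 0.0:
        return GIc                     # unloaded: pure Mode I limit
    return GIc + (GIIc - GIc) * (GII / GT) ** eta

# Example constants (illustrative, kJ/m^2): G_Ic, G_IIc, and BK exponent eta
GIc, GIIc, eta = 0.969, 1.719, 2.284
```

Pure Mode I loading (GII = 0) recovers G_Ic, pure Mode II (GI = 0) recovers G_IIc, and intermediate mode mixities interpolate monotonically between the two, which is the behavior a tabulated implementation of the criterion must reproduce.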
Resonance Ionization, Mass Spectrometry.
ERIC Educational Resources Information Center
Young, J. P.; And Others
1989-01-01
Discussed is an analytical technique that uses photons from lasers to resonantly excite an electron from some initial state of a gaseous atom through various excited states of the atom or molecule. Described are the apparatus, some analytical applications, and the precision and accuracy of the technique. Lists 26 references. (CW)
Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods
ERIC Educational Resources Information Center
Zhang, Ying
2011-01-01
Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…
Turbine blade tip durability analysis
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.
1981-01-01
An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.
Analytical Challenges in Biotechnology.
ERIC Educational Resources Information Center
Glajch, Joseph L.
1986-01-01
Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)
Dinov, Ivo D.; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W.; Price, Nathan D.; Van Horn, John D.; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M.; Dauer, William; Toga, Arthur W.
2016-01-01
Background A unique archive of Big Data on Parkinson’s Disease is collected, managed and disseminated by the Parkinson’s Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson’s disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data–large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources–all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Methods and Findings Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. 
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson’s disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Conclusions Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson’s disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer’s, Huntington’s, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications. PMID:27494614
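The pipeline summarized above (rebalance imbalanced cohorts, train a classifier, validate by statistical n-fold cross-validation) can be illustrated with a minimal sketch. The PPMI data and the actual methods (adaptive boosting, SVM) are replaced here by synthetic two-class data and a deliberately simple nearest-centroid classifier, so this shows only the workflow, not the study's models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic imbalanced cohort: 20 "cases" vs 200 "controls", 3 features
X_case = rng.normal(1.0, 1.0, size=(20, 3))
X_ctrl = rng.normal(-1.0, 1.0, size=(200, 3))
X = np.vstack([X_case, X_ctrl])
y = np.array([1] * 20 + [0] * 200)

def rebalance(X, y, rng):
    """Oversample the minority class to match the majority class size."""
    idx1, idx0 = np.where(y == 1)[0], np.where(y == 0)[0]
    minority, majority = (idx1, idx0) if len(idx1) < len(idx0) else (idx0, idx1)
    extra = rng.choice(minority, size=len(majority) - len(minority), replace=True)
    keep = np.concatenate([majority, minority, extra])
    return X[keep], y[keep]

def nearest_centroid_cv(X, y, n_folds, rng):
    """n-fold cross-validated accuracy, rebalancing each training split."""
    order = rng.permutation(len(y))
    folds = np.array_split(order, n_folds)
    accs = []
    for f in folds:
        mask = np.ones(len(y), bool)
        mask[f] = False
        Xtr, ytr = rebalance(X[mask], y[mask], rng)
        c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
        pred = (np.linalg.norm(X[f] - c1, axis=1)
                < np.linalg.norm(X[f] - c0, axis=1)).astype(int)
        accs.append((pred == y[f]).mean())
    return float(np.mean(accs))

acc = nearest_centroid_cv(X, y, n_folds=5, rng=rng)
print(f"5-fold CV accuracy: {acc:.2f}")
```

Rebalancing inside each training fold (rather than before splitting) avoids leaking duplicated samples into the held-out set.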
An analytical and experimental evaluation of a Fresnel lens solar concentrator
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Allums, S. A.; Cosby, R. M.
1976-01-01
Line-focusing Fresnel lenses with application potential in the 200 to 370 C range were evaluated analytically and experimentally. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile spreading problem and should enable improved analytical-experimental correlation.
Deriving Earth Science Data Analytics Requirements
NASA Technical Reports Server (NTRS)
Kempler, Steven J.
2015-01-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools and techniques that would support specific ESDA type goals. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.
Lead Slowing-Down Spectrometry Time Spectral Analysis for Spent Fuel Assay: FY11 Status Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulisek, Jonathan A.; Anderson, Kevin K.; Bowyer, Sonya M.
2011-09-30
Developing a method for the accurate, direct, and independent assay of the fissile isotopes in bulk materials (such as used fuel) from next-generation domestic nuclear fuel cycles is a goal of the Office of Nuclear Energy, Fuel Cycle R&D, Material Protection and Control Technology (MPACT) Campaign. To meet this goal, MPACT supports a multi-institutional collaboration, of which PNNL is a part, to study the feasibility of Lead Slowing Down Spectroscopy (LSDS). This technique is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic masses in used fuel with an uncertainty considerably lower than the approximately 10% typical of today's confirmatory assay methods. This document is a progress report for FY2011 PNNL analysis and algorithm development. Progress made by PNNL in FY2011 continues to indicate the promise of LSDS analysis and algorithms applied to used fuel. PNNL developed an empirical model based on calibration of the LSDS to responses generated from well-characterized used fuel. The empirical model accounts for self-shielding effects using empirical basis vectors calculated from the singular value decomposition (SVD) of a matrix containing the true self-shielding functions of the used fuel assembly models. The potential for the direct and independent assay of the sum of the masses of 239Pu and 241Pu to within approximately 3% over a wide used fuel parameter space was demonstrated. Also, in FY2011, PNNL continued to develop an analytical model. Such efforts included adding six more non-fissile absorbers to the analytical shielding function and accounting for the non-uniformity of the neutron flux across the LSDS assay chamber. A hybrid analytical-empirical approach was developed to determine the mass of total Pu (sum of the masses of 239Pu, 240Pu, and 241Pu), which is an important quantity in safeguards.
Results using this hybrid method were of approximately the same accuracy as those of the pure empirical approach. In addition, total Pu was determined with much better accuracy by the hybrid approach than by the pure analytical approach. In FY2012, PNNL will continue efforts to optimize its empirical model and minimize its reliance on calibration data. In addition, PNNL will continue to develop an analytical model, considering effects such as neutron scattering in the fuel and cladding, as well as neutrons streaming through gaps between fuel pins in the fuel assembly.
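The SVD-based empirical modeling idea in the abstract above can be sketched generically: stack known self-shielding functions as matrix columns, extract a low-rank basis from the SVD, and represent a new response in that basis. Everything below (the functional forms, the 50-bin grid, the 99.9% energy cutoff) is invented for illustration and bears no relation to the actual LSDS data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "self-shielding functions": 50 bins x 12 fuel-assembly models
E = np.linspace(0.1, 10.0, 50)
library = np.column_stack([np.exp(-a * E) + 0.05 * b * E
                           for a, b in rng.uniform(0.2, 1.5, size=(12, 2))])

# Empirical basis vectors from the SVD of the library matrix
U, s, Vt = np.linalg.svd(library, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
rank = int(np.searchsorted(energy, 0.999) + 1)  # keep 99.9% of the energy
basis = U[:, :rank]

# Least-squares representation of a new (out-of-library) shielding function
f_new = np.exp(-0.7 * E) + 0.04 * E
coeffs, *_ = np.linalg.lstsq(basis, f_new, rcond=None)
resid = np.linalg.norm(basis @ coeffs - f_new) / np.linalg.norm(f_new)
print(rank, f"relative residual = {resid:.1e}")
```

The energy cutoff trades basis size against reconstruction fidelity; a calibrated model would choose it against held-out responses rather than a fixed threshold.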
Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples
NASA Astrophysics Data System (ADS)
Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi
2016-10-01
The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/ml. However, quantifying analytes that are related to various stages of tumors, including early detection, requires detecting well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml and the prostate specific antigen level of the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/ml of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumors, it is required to quantify various levels of analytes ranging from ng/ml to pg/ml. To accommodate these critical needs in current diagnostics, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes (
Analytical Ultrasonics in Materials Research and Testing
NASA Technical Reports Server (NTRS)
Vary, A.
1986-01-01
Research results in analytical ultrasonics for characterizing structural materials from metals and ceramics to composites are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.
Measuring research progress in photovoltaics
NASA Technical Reports Server (NTRS)
Jackson, B.; Mcguire, P.
1986-01-01
The role and some results of the project analysis and integration function in the Flat-plate Solar Array (FSA) Project are presented. Activities included supporting the decision-making process, preparation of plans for project direction, setting goals for project activities, measuring progress within the project, and the development and maintenance of analytical models.
Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community
2016-01-01
Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the... more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement
40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests
Code of Federal Regulations, 2012 CFR
2012-07-01
... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...
40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests
Code of Federal Regulations, 2011 CFR
2011-07-01
... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...
Analytical aids in land management planning
David R. Betters
1978-01-01
Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
Simple and Efficient Numerical Evaluation of Near-Hypersingular Integrals
NASA Technical Reports Server (NTRS)
Fink, Patrick W.; Wilton, Donald R.; Khayat, Michael A.
2007-01-01
Recently, significant progress has been made in the handling of singular and nearly-singular potential integrals that commonly arise in the Boundary Element Method (BEM). To facilitate object-oriented programming and handling of higher order basis functions, cancellation techniques are favored over techniques involving singularity subtraction. However, gradients of the Newton-type potentials, which produce hypersingular kernels, are also frequently required in BEM formulations. As is the case with the potentials, treatment of the near-hypersingular integrals has proven more challenging than treating the limiting case in which the observation point approaches the surface. Historically, numerical evaluation of these near-hypersingularities has often involved a two-step procedure: a singularity subtraction to reduce the order of the singularity, followed by a boundary contour integral evaluation of the extracted part. Since this evaluation necessarily links basis function, Green's function, and the integration domain (element shape), the approach fits poorly with object-oriented programming concepts. Thus, there is a need for cancellation-type techniques for efficient numerical evaluation of the gradient of the potential. Progress in the development of efficient cancellation-type procedures for the gradient potentials was recently presented. To the extent possible, a change of variables is chosen such that the Jacobian of the transformation cancels the singularity. However, since the gradient kernel involves singularities of different orders, we also require that the transformation leaves remaining terms that are analytic. The terms "normal" and "tangential" are used herein with reference to the source element. Also, since computational formulations often involve the numerical evaluation of both potentials and their gradients, it is highly desirable that a single integration procedure efficiently handles both.
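The cancellation idea described above, choosing a change of variables whose Jacobian cancels the singularity, can be seen in a one-dimensional model problem (the actual BEM kernels are surface integrals and far more involved; this is only the mechanism). For the integrable endpoint singularity of x^(-1/2) on (0, 1], the substitution x = u^2 contributes a Jacobian 2u that cancels the singular factor exactly:

```python
import numpy as np

# Model problem: integral of x^(-1/2) over (0, 1] = 2 (endpoint singularity)
nodes, weights = np.polynomial.legendre.leggauss(8)
x = 0.5 * (nodes + 1.0)   # map Gauss-Legendre nodes from [-1, 1] to [0, 1]
w = 0.5 * weights

# Direct quadrature struggles: the integrand blows up near x = 0
naive = np.sum(w / np.sqrt(x))

# Change of variables x = u^2, dx = 2u du: the Jacobian 2u cancels the
# 1/sqrt(x) singularity, leaving the constant integrand 2 (analytic)
u, wu = x, w
transformed = np.sum(wu * 2.0 * u / np.sqrt(u**2))

print(f"naive = {naive:.6f}, transformed = {transformed:.6f}, exact = 2")
```

Because the transformed integrand is analytic (indeed constant), the same low-order rule that misses the singular integral recovers the exact value.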
NASA Astrophysics Data System (ADS)
Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.
2016-09-01
Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
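A minimal sketch of the predictive step described above, alerting before a degrading sensor metric crosses a mission-critical threshold, is given below. The notional metric, the threshold value, and the linear-trend extrapolation are all assumptions for illustration; an operational system would use the Big Data pipeline and richer models the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(2)

# Notional sensor health metric: slow linear degradation plus noise,
# sampled once per hour (entirely simulated)
hours = np.arange(500)
metric = 100.0 - 0.05 * hours + rng.normal(0.0, 1.0, size=hours.size)

THRESHOLD = 60.0  # hypothetical mission-critical floor

# Fit a linear trend to a recent window and extrapolate the crossing time
window = 200
t, y = hours[-window:], metric[-window:]
slope, intercept = np.polyfit(t, y, 1)
eta = (THRESHOLD - intercept) / slope  # hour at which the trend hits the floor

print(f"trend = {slope:.3f}/h, predicted threshold crossing near hour {eta:.0f}")
```

The true crossing for the simulated drift is hour 800; the fitted estimate lands close to it, which is the kind of lead time that lets operators schedule maintenance before impact.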
Cortez, Juliana; Pasquini, Celio
2013-02-05
The ring-oven technique, originally applied for classical qualitative analysis from the 1950s to the 1970s, is revisited to be used in a simple though highly efficient and green procedure for analyte preconcentration prior to its determination by the microanalytical techniques presently available. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is maintained in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center to a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, of about 2.0 cm diameter, is formed on the paper to contain most of the analytes originally present in the sample volume. Preconcentration coefficients of the analyte can reach 250-fold (on a m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure have been evaluated to concentrate Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour, employing the microanalytical technique of laser-induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL⁻¹ and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)%, for Na, Fe, and Cu, respectively, were obtained in fuel ethanol. It is possible to anticipate the application of the technique, coupled to modern microanalytical and multianalyte techniques, to several analytical problems requiring analyte preconcentration and/or sample stabilization.
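The quoted ~250-fold m/m preconcentration can be checked with a back-of-envelope calculation: the analyte mass from the whole sample ends up in the small mass of paper under the ring. The ring geometry (2.0 cm diameter, ~350 μm width) and sample volume (600 μL) come from the abstract; the ethanol density and the filter-paper areal density are assumed typical values, so this is an order-of-magnitude sketch only:

```python
import math

# Sample mass: 600 uL of fuel ethanol (density ~0.789 g/mL, assumed)
sample_volume_mL = 0.600
sample_mass_g = sample_volume_mL * 0.789

# Ring mass: circumference x width x areal density of the paper
ring_diameter_cm = 2.0
ring_width_cm = 350e-4                 # 350 um expressed in cm
ring_area_cm2 = math.pi * ring_diameter_cm * ring_width_cm
paper_density_g_cm2 = 0.0085           # typical filter paper ~85 g/m^2 (assumed)
ring_mass_g = ring_area_cm2 * paper_density_g_cm2

factor = sample_mass_g / ring_mass_g
print(f"estimated m/m preconcentration factor: {factor:.0f}")
```

With these assumed densities the estimate lands near the 250-fold figure reported, which makes the quoted coefficient geometrically plausible.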
Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F
2005-12-08
This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.
ERIC Educational Resources Information Center
Toh, Chee-Seng
2007-01-01
A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.
NASA Technical Reports Server (NTRS)
Corker, Kevin; Lebacqz, J. Victor (Technical Monitor)
1997-01-01
The NASA and the FAA have entered into a joint venture to explore, define, design and implement a new airspace management operating concept. The fundamental premise of that concept is that technologies and procedures need to be developed for flight deck and ground operations to improve the efficiency, the predictability, the flexibility and the safety of airspace management and operations. To that end NASA Ames has undertaken an initial development and exploration of "key concepts" in the free flight airspace management technology development. Human Factors issues in automation aiding design, coupled aiding systems between air and ground, communication protocols in distributed decision making, and analytic techniques for definition of concepts of airspace density and operator cognitive load have been undertaken. This paper reports the progress of these efforts, which are not intended to definitively solve the many evolving issues of design for future ATM systems, but to provide preliminary results to chart the parameters of performance and the topology of the analytic effort required. The preliminary research in provision of cockpit display of traffic information, dynamic density definition, distributed decision making, situation awareness models and human performance models is discussed as they focus on the theme of "design requirements".
21st century toolkit for optimizing population health through precision nutrition.
O'Sullivan, Aifric; Henrick, Bethany; Dixon, Bonnie; Barile, Daniela; Zivkovic, Angela; Smilowitz, Jennifer; Lemay, Danielle; Martin, William; German, J Bruce; Schaefer, Sara Elizabeth
2017-07-05
Scientific, technological, and economic progress over the last 100 years all but eradicated problems of widespread food shortage and nutrient deficiency in developed nations. But now society is faced with a new set of nutrition problems related to energy imbalance and metabolic disease, which require new kinds of solutions. Recent developments in the area of new analytical tools enable us to systematically study large quantities of detailed and multidimensional metabolic and health data, providing the opportunity to address current nutrition problems through an approach called Precision Nutrition. This approach integrates different kinds of "big data" to expand our understanding of the complexity and diversity of human metabolism in response to diet. With these tools, we can more fully elucidate each individual's unique phenotype, or the current state of health, as determined by the interactions among biology, environment, and behavior. The tools of precision nutrition include genomics, metabolomics, microbiomics, phenotyping, high-throughput analytical chemistry techniques, longitudinal tracking with body sensors, informatics, data science, and sophisticated educational and behavioral interventions. These tools are enabling the development of more personalized and predictive dietary guidance and interventions that have the potential to transform how the public makes food choices and greatly improve population health.
Simón-Manso, Yamil; Lowenthal, Mark S; Kilpatrick, Lisa E; Sampson, Maureen L; Telu, Kelly H; Rudnick, Paul A; Mallard, W Gary; Bearden, Daniel W; Schock, Tracey B; Tchekhovskoi, Dmitrii V; Blonder, Niksa; Yan, Xinjian; Liang, Yuxue; Zheng, Yufang; Wallace, William E; Neta, Pedatsur; Phinney, Karen W; Remaley, Alan T; Stein, Stephen E
2013-12-17
Recent progress in metabolomics and the development of increasingly sensitive analytical techniques have renewed interest in global profiling, i.e., semiquantitative monitoring of all chemical constituents of biological fluids. In this work, we have performed global profiling of NIST SRM 1950, "Metabolites in Human Plasma", using GC-MS, LC-MS, and NMR. Metabolome coverage, difficulties, and reproducibility of the experiments on each platform are discussed. A total of 353 metabolites have been identified in this material. GC-MS provides 65 unique identifications, and most of the identifications from NMR overlap with the LC-MS identifications, except for some small sugars that are not directly found by LC-MS. Also, repeatability and intermediate precision analyses show that the SRM 1950 profiling is reproducible enough to consider this material as a good choice to distinguish between analytical and biological variability. Clinical laboratory data show that most results are within the reference ranges for each assay. In-house computational tools have been developed or modified for MS data processing and interactive web display. All data and programs are freely available online at http://peptide.nist.gov/ and http://srmd.nist.gov/.
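The repeatability and intermediate precision analyses mentioned above can be sketched as coefficients of variation computed from replicate measurements. The peak-area numbers below are invented, not SRM 1950 data; the point is only the distinction between within-run and between-run variation:

```python
import numpy as np

# Hypothetical peak areas for one metabolite: 3 days x 5 replicate injections
areas = np.array([
    [10.2, 10.4, 10.1, 10.3, 10.2],   # day 1
    [10.6, 10.8, 10.7, 10.5, 10.7],   # day 2
    [10.3, 10.2, 10.4, 10.3, 10.1],   # day 3
])

# Repeatability: pooled within-day variation (same run, same conditions)
within_var = areas.var(axis=1, ddof=1).mean()
repeat_cv = 100.0 * np.sqrt(within_var) / areas.mean()

# Intermediate precision: total variation across all days and replicates
inter_cv = 100.0 * areas.std(ddof=1) / areas.mean()

print(f"repeatability CV = {repeat_cv:.2f}%, "
      f"intermediate precision CV = {inter_cv:.2f}%")
```

Intermediate precision is always at least as large as repeatability, since it folds in day-to-day effects on top of injection-to-injection noise.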
NASA Technical Reports Server (NTRS)
Carle, G. C.
1985-01-01
Gas chromatography (GC) technology was developed for flight experiments in solar system exploration. The GC is a powerful analytical technique with simple devices separating individual components from complex mixtures to make very sensitive quantitative and qualitative measurements. It monitors samples containing mixtures of fixed gases and volatile organic molecules. The GC was used on the Viking mission in support of life detection experiments and on the Pioneer Venus Large Probe to determine the composition of the venusian atmosphere. A flight GC is under development to study the progress and extent of STS astronaut denitrogenation prior to extravehicular activity. Advanced flight GC concepts and systems for future solar system exploration are also studied. Studies include miniature ionization detectors and associated control systems capable of detecting from ppb up to 100% concentration levels. Further miniaturization is investigated using photolithography and controlled chemical etching in silicon wafers. Novel concepts such as ion mobility drift spectroscopy and multiplex gas chromatography are also developed for future flight experiments. These powerful analytical concepts and associated hardware are ideal for the monitoring of cabin atmospheres containing potentially dangerous volatile compounds.
[Carbonyl compounds emission and uptake by plant: Research progress].
Li, Jian; Cai, Jing; Yan, Liu-Shui; Li, Ling-Na; Tao, Min
2013-02-01
This paper reviewed the research on carbonyl compound emission and uptake by plants, and discussed the compensation point of the bidirectional exchange of carbonyl compounds between plants and the atmosphere. Uptake by the leaf stomata and cuticle is the principal way that plants purify air of aldehydes. After entering plant leaves, most carbonyl compounds can be metabolized into organic acids, carbohydrates, amino acids, carbon dioxide, etc., by the endoenzymes in leaves. The exchange direction of the carbonyl compounds between plants and the atmosphere can be preliminarily predicted from the compensation point and the concentrations of ambient carbonyl compounds. This paper summarized the analytical methods, such as DNPH/HPLC/UV and PFPH/GC/MS, used for the determination of carbonyl compounds emitted from plants or present in plant leaves. The main research interests in the future were pointed out, e.g., to improve and optimize the analytical methods for the determination of carbonyl compounds emitted from plants and the research on systems (e.g., the plant-soil system), to enlarge the detected species of carbonyl compounds emitted from plants, to screen the plant species which can effectively metabolize the pollutants, and to popularize the phytoremediation techniques for atmospheric
NASA Astrophysics Data System (ADS)
Yazdchi, K.; Salehi, M.; Shokrieh, M. M.
2009-03-01
By introducing a new simplified 3D representative volume element for wavy carbon nanotubes, an analytical model is developed to study the stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in a good agreement with corresponding results for straight nanotubes.
2016-01-01
The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer p. If p takes its maximum value, then we have a complete analytic material. Otherwise, it is an incomplete analytic material of rank p. For two-dimensional materials, further progress can be made in the identification of analytic materials by using the well-known fact that a 90° rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations. PMID:27956882
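The 90° rotation fact quoted above is easy to verify numerically on a concrete field: if (v1, v2) is divergence-free, the rotated field (-v2, v1) has vanishing 2D curl and is therefore a gradient on a simply connected domain. The polynomial field below is chosen so that central differences are exact in the grid interior; the example field and grid are illustrative choices, not taken from the paper:

```python
import numpy as np

# Grid over a simply connected domain
xs = np.linspace(-1.0, 1.0, 101)
X, Y = np.meshgrid(xs, xs, indexing='ij')
h = xs[1] - xs[0]

# A divergence-free field: v = (x^2 - y^2, -2xy), since div v = 2x - 2x = 0
v1, v2 = X**2 - Y**2, -2.0 * X * Y

# 90-degree rotation sends (v1, v2) to (-v2, v1); the result should be
# curl-free, i.e. the gradient of a potential (here phi = x^2*y - y^3/3)
w1, w2 = -v2, v1

# Central differences are exact for quadratics, so check the interior only
div_v = (np.gradient(v1, h, axis=0) + np.gradient(v2, h, axis=1))[1:-1, 1:-1]
curl_w = (np.gradient(w2, h, axis=0) - np.gradient(w1, h, axis=1))[1:-1, 1:-1]

print(np.abs(div_v).max(), np.abs(curl_w).max())  # both at rounding level
```

Symbolically the identity is immediate: curl(-v2, v1) = d(v1)/dx + d(v2)/dy = div(v1, v2) = 0.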
Hyphenated analytical techniques for materials characterisation
NASA Astrophysics Data System (ADS)
Armstrong, Gordon; Kailas, Lekshmi
2017-09-01
This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to characterise it better than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is no broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, as combined techniques for chemical analysis were among the earliest hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties including mechanical, electrical and thermal behaviour, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy-based techniques, scanning probe-based techniques, and thermal analysis-based techniques.
Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to understand the sample in greater detail than was previously possible, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples, a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagase, F.; Ishikawa, J.; Kurata, M.
2013-07-01
Estimation of the accident progression and the status inside the reactor pressure vessels (RPV) and primary containment vessels (PCV) is required for appropriate conduct of decommissioning at the Fukushima-Daiichi NPP. This requires additional experimental data and revised models so that computer-code estimates can be made with increased accuracy. The Japan Atomic Energy Agency (JAEA) has selected phenomena to be reviewed and developed, considering previously obtained information, conditions specific to the Fukushima-Daiichi NPP accident, and recent progress in experimental and analytical technologies. As a result, research and development items have been identified in terms of thermal-hydraulic behavior in the RPV and PCV, progression of fuel bundle degradation, failure of the lower head of the RPV, and analysis of the accident. This paper introduces the selected phenomena to be reviewed and developed, research plans, and recent results from JAEA's corresponding research programs. (authors)
Review of levoglucosan in glacier snow and ice studies: Recent progress and future perspectives.
You, Chao; Xu, Chao
2018-03-01
Levoglucosan (LEV) in glacier snow and ice layers provides a fingerprint of fire activity, ranging from modern air pollution to ancient fire emissions. In this study, we review recent progress in our understanding and application of LEV in glaciers, including analytical methods, transport and post-depositional processes, and historical records. We first summarize progress in analytical methods for the determination of LEV in glacier snow and ice. Then, we discuss the processes influencing the records of LEV in snow and ice layers. Finally, we make some recommendations for future work, such as assessing the stability of LEV and obtaining continuous records, to increase the reliability of reconstructed ancient fire activity. This review provides an update for researchers working with LEV and will facilitate its further use as a biomarker in paleo-fire studies based on ice core records. Copyright © 2017 Elsevier B.V. All rights reserved.
Progress Towards an Open Data Ecosystem for Australian Geochemistry and Geochronology Data
NASA Astrophysics Data System (ADS)
McInnes, B.; Rawling, T.; Brown, W.; Liffers, M.; Wyborn, L. A.; Brown, A.; Cox, S. J. D.
2016-12-01
Technological improvements in laboratory automation and microanalytical methods are producing an unprecedented volume of high-value geochemical data for use by geoscientists in understanding geological and planetary processes. In contrast, the research infrastructure necessary to systematically manage, deliver and archive analytical data has not progressed much beyond the minimum effort necessary to produce a peer-reviewed publication. Anecdotal evidence indicates that the majority of publicly funded data is underreported, and what is published is relatively undiscoverable even to experienced researchers, let alone the general public. Government-funded "open data" initiatives have a role to play in the development of data management and delivery ecosystems and practices allowing access to publicly funded data. This paper reports on progress in Australia towards creation of an open data ecosystem involving multiple academic and government research institutions cooperating to create an open data architecture linking researchers, physical samples, sample metadata, laboratory metadata, analytical data and consumers.
An Analytical Solution for Transient Thermal Response of an Insulated Structure
NASA Technical Reports Server (NTRS)
Blosser, Max L.
2012-01-01
An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
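The abstract above does not reproduce the exact solution, but the problem setup can be illustrated with a crude lumped-capacitance analogue: a structure with thermal time constant tau behind insulation, driven by a square surface-temperature pulse. This sketch is not the paper's model; all names and numbers are illustrative.

```python
import math

def structure_temp(t, pulse_T, pulse_dur, tau):
    """First-order (lumped-capacitance) structural temperature rise in
    response to a square surface-temperature pulse of height pulse_T
    and duration pulse_dur; tau is the combined insulation/structure
    thermal time constant. Temperatures are rises above the initial value."""
    if t <= pulse_dur:
        # heating phase: exponential approach toward the pulse temperature
        return pulse_T * (1.0 - math.exp(-t / tau))
    # cooling phase: exponential decay from the end-of-pulse temperature
    T_end = pulse_T * (1.0 - math.exp(-pulse_dur / tau))
    return T_end * math.exp(-(t - pulse_dur) / tau)

# in this simplified model the maximum structural temperature rise
# occurs at the end of the heat pulse
t_max = structure_temp(100.0, 1000.0, 100.0, 50.0)
```
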
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally, an illustrative example demonstrates, from actual use, the promise of the proposed application of emulation.
Barker, John R; Martinez, Antonio
2018-04-04
Efficient analytical image charge models are derived for the full spatial variation of the electrostatic self-energy of electrons in semiconductor nanostructures that arises from dielectric mismatch, using semi-classical analysis. The methodology provides a fast, compact and physically transparent computation for advanced device modeling. The underlying semi-classical model for the self-energy has been established and validated during recent years and depends on a slight modification of the macroscopic static dielectric constants for individual homogeneous dielectric regions. The model has been validated for point charges as close as one interatomic spacing to a sharp interface. A brief introduction to image charge methodology is followed by a discussion and demonstration of the traditional failure of the methodology to derive the electrostatic potential at arbitrary distances from a source charge. However, the self-energy involves the local limit of the difference between the electrostatic Green functions for the full dielectric heterostructure and the homogeneous equivalent. It is shown that high convergence may be achieved for the image charge method for this local limit. A simple re-normalisation technique is introduced to reduce the number of image terms to a minimum. A number of progressively complex 3D models are evaluated analytically and compared with high-precision numerical computations. Accuracies of 1% are demonstrated. Introducing a simple technique for modeling the transition of the self-energy between disparate dielectric structures, we generate an analytical model that describes the self-energy as a function of position within the source, drain and gated channel of a silicon wrap-round gate field-effect transistor with a cross-section of a few nanometers. At such scales the self-energies become large (typically up to ~100 meV) close to the interfaces as well as along the channel.
The screening of a gated structure is shown to reduce the self-energy relative to un-gated nanowires.
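As background to the image-charge methodology discussed above, the textbook single-interface result (the self-energy of a point charge at distance z from a planar boundary between two dielectrics) can be evaluated directly. This is only the elementary building block that multi-image models generalize, not the paper's full heterostructure model:

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m

def image_self_energy_eV(z_nm, eps1, eps2):
    """Self-energy (eV) of a point charge sitting in dielectric eps1 at
    distance z_nm (nanometers) from a planar interface with dielectric
    eps2: W = (e^2/16*pi*eps0*eps1*z) * (eps1-eps2)/(eps1+eps2).
    Positive (repulsive) when eps1 > eps2."""
    z = z_nm * 1e-9
    k = (eps1 - eps2) / (eps1 + eps2)
    w_joule = E_CHARGE ** 2 * k / (16 * math.pi * EPS0 * eps1 * z)
    return w_joule / E_CHARGE

# electron in silicon (eps ~ 11.7) 1 nm from an oxide-like region (eps ~ 3.9):
# a few tens of meV, consistent with the magnitudes quoted in the abstract
w_si = image_self_energy_eV(1.0, 11.7, 3.9)
```
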
Optical trapping for analytical biotechnology.
Ashok, Praveen C; Dholakia, Kishan
2012-02-01
We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single-cell or even subcellular level, allowing insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications, such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore, we may probe and analyse the biological world by combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.
Access to Education in Bangladesh: Country Analytic Review of Primary and Secondary School
ERIC Educational Resources Information Center
Ahmed, Manzoor; Ahmed, Kazi Saleh; Khan, Nurul Islam; Ahmed, Romij
2007-01-01
This country analytical review examines the key issues in access to and participation in primary and secondary education in Bangladesh, with a special focus on areas and dimensions of exclusion. Against a background of overall progress, particularly in closing the gender gap in primary and secondary enrollment, the research applies a conceptual…
Assessment of Learning in Digital Interactive Social Networks: A Learning Analytics Approach
ERIC Educational Resources Information Center
Wilson, Mark; Gochyyev, Perman; Scalise, Kathleen
2016-01-01
This paper summarizes initial field-test results from data analytics used in the work of the Assessment and Teaching of 21st Century Skills (ATC21S) project, on the "ICT Literacy--Learning in digital networks" learning progression. This project, sponsored by Cisco, Intel and Microsoft, aims to help educators around the world enable…
Solid State Division progress report for period ending September 30, 1993
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, P.H.; Hinton, L.W.
1994-08-01
This report covers research progress in the Solid State Division from April 1, 1992, to September 30, 1993. During this period, the division conducted a broad, interdisciplinary materials research program with emphasis on theoretical solid state physics, neutron scattering, synthesis and characterization of materials, ion beam and laser processing, and the structure of solids and surfaces. This research effort was enhanced by new capabilities in atomic-scale materials characterization, new emphasis on the synthesis and processing of materials, and increased partnering with industry and universities. The theoretical effort included a broad range of analytical studies, as well as a new emphasis on numerical simulation stimulated by advances in high-performance computing and by strong interest in related division experimental programs. Superconductivity research continued to advance on a broad front, from fundamental mechanisms of high-temperature superconductivity to the development of new materials and processing techniques. The Neutron Scattering Program was characterized by a strong scientific user program and growing diversity represented by new initiatives in complex fluids and residual stress. The national emphasis on materials synthesis and processing was mirrored in division research programs in thin-film processing, surface modification, and crystal growth. Research on advanced processing techniques such as laser ablation, ion implantation, and plasma processing was complemented by strong programs in the characterization of materials and surfaces, including ultrahigh-resolution scanning transmission electron microscopy, atomic-resolution chemical analysis, synchrotron x-ray research, and scanning tunneling microscopy.
NASA Astrophysics Data System (ADS)
Becker, J. Sabine
2005-04-01
For several years now, inductively coupled plasma mass spectrometry (ICP-MS) has been increasingly used for precise and accurate determination of isotope ratios of long-lived radionuclides at the trace and ultratrace level due to its excellent sensitivity, good precision and accuracy. At present, ICP-MS and also laser ablation ICP-MS are applied as powerful analytical techniques in different fields, such as the characterization of nuclear materials, recycled materials and by-products (e.g., spent nuclear fuel or depleted uranium ammunition), radioactive waste control, environmental monitoring, bioassay measurements, health control, geochemistry and geochronology. Especially double-focusing sector field ICP mass spectrometers with a single ion detector or a multiple ion collector device have been used for the precise determination of long-lived radionuclide isotope ratios at very low concentration levels. Progress has been achieved by combining ultrasensitive mass spectrometric techniques with effective separation and enrichment procedures in order to improve detection limits, and by introducing the collision cell in ICP-MS to reduce disturbing interfering ions (e.g., 129Xe+ in the determination of 129I). This review describes the state of the art and the progress of ICP-MS and laser ablation ICP-MS for isotope ratio measurements of long-lived radionuclides in different sample types, especially in the main application fields of characterization of nuclear and radioactive waste material, environmental research and health controls.
Novel approaches against epidermal growth factor receptor tyrosine kinase inhibitor resistance
Heydt, Carina; Michels, Sebastian; Thress, Kenneth S.; Bergner, Sven; Wolf, Jürgen; Buettner, Reinhard
2018-01-01
Background: The identification and characterization of molecular biomarkers has helped to revolutionize non-small-cell lung cancer (NSCLC) management, as it transitions from target-focused to patient-based treatment, centered on the evolving genomic profile of the individual. Determination of epidermal growth factor receptor (EGFR) mutation status represents a critical step in the diagnostic process. The recent emergence of acquired resistance to “third-generation” EGFR tyrosine kinase inhibitors (TKIs) via multiple mechanisms serves to illustrate the important influence of tumor heterogeneity on prognostic outcomes in patients with NSCLC. Design: This literature review examines the emergence of TKI resistance and the course of disease progression and, consequently, the clinical decision-making process in NSCLC. Results: Molecular markers of acquired resistance, of which T790M and HER2 or MET amplifications are the most common, help to guide ongoing treatment past the point of progression. Although tissue biopsy techniques remain the gold standard, the emergence of liquid biopsies and advances in analytical techniques may eventually allow “real-time” monitoring of tumor evolution and, in this way, help to optimize targeted treatment approaches. Conclusions: The influence of inter- and intra-tumor heterogeneity on resistance mechanisms should be considered when treating patients using resistance-specific therapies. New tools are necessary to analyze changes in heterogeneity and clonal composition during drug treatment. The refinement and standardization of diagnostic procedures and increased accessibility to technology will ultimately help in personalizing the management of NSCLC. PMID:29632655
Nuclear and atomic analytical techniques in environmental studies in South America.
Paschoa, A S
1990-01-01
The use of nuclear analytical techniques for environmental studies in South America is selectively reviewed, from the early cosmic-ray work of Lattes to the recent application of the PIXE (particle-induced X-ray emission) technique to air pollution problems in large cities such as São Paulo and Rio de Janeiro. Studies on natural radioactivity and fallout from nuclear weapons in South America are briefly examined.
Green aspects, developments and perspectives of liquid phase microextraction techniques.
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2014-02-01
Determination of analytes at trace levels in complex samples (e.g. biological samples or contaminated water or soils) is often required for environmental assessment and monitoring as well as for scientific research in the field of environmental pollution. Few analytical techniques are sensitive enough for the direct determination of trace components in samples and, because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention is given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents (inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles) in liquid phase microextraction techniques is evaluated in depth. Also presented are new methodological solutions and the related instruments and devices for efficient liquid phase microextraction of analytes, which have found application in sample preparation prior to chromatographic determination. © 2013 Published by Elsevier B.V.
Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A
2011-07-01
Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.
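The intra- and inter-assay coefficients of variation (CVs) quoted above are computed in the usual way, CV = SD/mean. A minimal sketch with illustrative replicate values (the numbers are ours, not from the review):

```python
import statistics

def cv_percent(replicates):
    """Coefficient of variation (%) of replicate assay measurements,
    using the sample standard deviation."""
    return statistics.stdev(replicates) / statistics.mean(replicates) * 100.0

# e.g. three replicate TPMT activity measurements (illustrative numbers)
cv = cv_percent([9.0, 10.0, 11.0])   # -> 10.0
```
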
Big Data Analytics with Datalog Queries on Spark.
Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo
2016-01-01
There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
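The recursion that the abstract highlights is the core difficulty of Datalog-on-Spark: a recursive rule such as transitive closure is evaluated as a fixpoint loop, typically with the semi-naive optimization (join only the newly derived facts in each round). A pure-Python sketch of that loop, not the BigDatalog or Spark API:

```python
# Datalog program being evaluated:
#   tc(X,Y) :- edge(X,Y).
#   tc(X,Y) :- tc(X,Z), edge(Z,Y).

def transitive_closure(edges):
    """Semi-naive fixpoint evaluation of transitive closure over a set
    of (src, dst) edge tuples."""
    edge_index = {}
    for src, dst in edges:
        edge_index.setdefault(src, set()).add(dst)
    total = set(edges)   # all tc facts derived so far
    delta = set(edges)   # facts that were new in the previous round
    while delta:
        # semi-naive step: join only the *new* facts with edge/2
        new_facts = {(x, y)
                     for (x, z) in delta
                     for y in edge_index.get(z, ())} - total
        total |= new_facts
        delta = new_facts
    return total

closure = transitive_closure({(1, 2), (2, 3), (3, 4)})
```

Distributed systems like BigDatalog run essentially this loop with the delta and total relations partitioned across the cluster.
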
Bujkiewicz, Sylwia; Thompson, John R; Riley, Richard D; Abrams, Keith R
2016-03-30
A number of meta-analytical methods have been proposed that aim to evaluate surrogate endpoints. Bivariate meta-analytical methods can be used to predict the treatment effect on the final outcome from the treatment effect estimate measured on the surrogate endpoint, while taking into account the uncertainty around the effect estimate for the surrogate endpoint. In this paper, extensions to multivariate models are developed that aim to include multiple surrogate endpoints, with the potential benefit of reducing the uncertainty when making predictions. In this Bayesian multivariate meta-analytic framework, the between-study variability is modelled as a product of normal univariate distributions. This formulation is particularly convenient for including multiple surrogate endpoints and flexible for modelling outcomes which can be surrogate endpoints to the final outcome and potentially to one another. Two models are proposed: first, using an unstructured between-study covariance matrix, assuming the treatment effects on all outcomes are correlated; and second, using a structured between-study covariance matrix, assuming treatment effects on some of the outcomes are conditionally independent. While the two models are developed for summary data at the study level, the individual-level association is taken into account by using Prentice's criteria (obtained from individual patient data) to inform the within-study correlations in the models. The modelling techniques are investigated using an example in relapsing remitting multiple sclerosis, where disability worsening is the final outcome, while relapse rate and MRI lesions are potential surrogates for disability progression. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
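The bivariate prediction step described above (predicting the effect on the final outcome from an observed effect on the surrogate) amounts to conditioning a bivariate normal distribution. A simplified two-outcome sketch ignoring within-study uncertainty; all numbers are illustrative, not from the paper:

```python
import math

def predict_final_effect(y_surr, mu, cov):
    """Conditional mean and SD of the treatment effect on the final
    outcome given an observed effect y_surr on the surrogate, under a
    bivariate normal with mean vector mu = (mu_surr, mu_final) and
    2x2 between-study covariance matrix cov."""
    mu_s, mu_f = mu
    var_s, cov_sf, var_f = cov[0][0], cov[0][1], cov[1][1]
    mean = mu_f + (cov_sf / var_s) * (y_surr - mu_s)
    var = var_f - cov_sf ** 2 / var_s   # conditioning shrinks the variance
    return mean, math.sqrt(var)

# observed surrogate effect of -0.5, with an illustrative covariance matrix
mean, sd = predict_final_effect(-0.5, (0.0, 0.0),
                                [[0.04, 0.03], [0.03, 0.09]])
```

Adding a second surrogate extends the conditioning to a trivariate normal, which is where the reduced prediction uncertainty the authors describe comes from.
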
ERIC Educational Resources Information Center
Vogt, Frank
2011-01-01
Most measurement techniques have limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…
ERIC Educational Resources Information Center
Griffith, James
2002-01-01
Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…
Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I
2016-10-18
There is increasing interest in using single-particle inductively coupled plasma mass spectrometry (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved signal. While sample dilution is often necessary to achieve the low analyte concentrations required for SP-ICPMS analysis, and to reduce matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and to improve the quality and consistency of data generated from this relatively new analytical tool.
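Because every dilution step must be undone when reporting results, SP-ICPMS workflows back-calculate the original particle number concentration from the counted particle events, the sampled volume, the transport efficiency and the overall dilution factor. A minimal sketch of that standard bookkeeping (variable names and the example numbers are illustrative, not from the paper):

```python
def particle_number_conc(event_counts, acq_time_s, flow_ml_min,
                         transport_eff, dilution_factor):
    """Back-calculate the particle number concentration (particles/mL)
    of the original sample from SP-ICPMS particle events, using
    N = counts / (eta * V_sampled) scaled by the dilution factor."""
    sampled_volume_ml = flow_ml_min * (acq_time_s / 60.0)
    conc_measured = event_counts / (transport_eff * sampled_volume_ml)
    return conc_measured * dilution_factor

# e.g. 500 particle events in 60 s at 0.3 mL/min uptake,
# 5% transport efficiency, after a 100-fold dilution
n = particle_number_conc(500, 60.0, 0.3, 0.05, 100.0)
```
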
Methodological issues in microdialysis sampling for pharmacokinetic studies.
de Lange, E C; de Boer, A G; Breimer, D D
2000-12-15
Microdialysis is an in vivo technique that permits monitoring of local concentrations of drugs and metabolites at specific sites in the body. Microdialysis has several characteristics which make it an attractive tool for pharmacokinetic research. About a decade ago the microdialysis technique entered the field of pharmacokinetic research, first in the brain and later also in peripheral tissues and blood. Within this period much has been learned about the proper use of the technique. Today, it has outgrown its teething problems, and its potential and limitations have become well defined. As microdialysis is a delicate technique for which experimental factors appear to be critical to the validity of the experimental outcomes, several factors should be considered. These include the probe; the perfusion solution; the post-surgery interval in relation to surgical trauma, tissue integrity and repeated experiments; the analysis of microdialysate samples; and the quantification of microdialysate data. Provided that experimental conditions are optimized to give valid and quantitative results, microdialysis can provide numerous data points from a relatively small number of individual animals to determine detailed pharmacokinetic information. One of the added values of this technique compared with other in vivo pharmacokinetic techniques is that microdialysis reflects free concentrations in tissues and plasma. This offers the opportunity to assess drug transport equilibration across membranes such as the blood-brain barrier, which has already provided new insights. With the progress of analytical methodology, especially with respect to low-volume/low-concentration measurements and simultaneous measurement of multiple compounds, the applications and importance of the microdialysis technique in pharmacokinetic research will continue to increase.
Kim, Saewung; Guenther, Alex; Apel, Eric
2013-07-01
The physiological production mechanisms of some of the organics in plants, commonly known as biogenic volatile organic compounds (BVOCs), have been known for more than a century. Some BVOCs are emitted to the atmosphere and play a significant role in tropospheric photochemistry, especially in ozone and secondary organic aerosol (SOA) production, as a result of the interplay between BVOCs and atmospheric oxidants such as the hydroxyl radical (OH), ozone (O3), and NOx (NO + NO2). These findings have been drawn from comprehensive analysis of numerous field and laboratory studies that have characterized the ambient distribution of BVOCs and their oxidation products, and the reaction kinetics between BVOCs and atmospheric oxidants. These investigations are limited by the capacity to identify and quantify these compounds. This review highlights the major analytical techniques that have been used to observe BVOCs and their oxidation products, such as gas chromatography, mass spectrometry with hard and soft ionization methods, and optical techniques from laser-induced fluorescence (LIF) to remote sensing. In addition, we discuss how new analytical techniques can advance our understanding of BVOC photochemical processes. The principles, advantages, and drawbacks of the analytical techniques are discussed along with specific examples of how the techniques were applied in field and laboratory measurements. Because a number of thorough review papers on each specific analytical technique are available, readers are referred to those publications rather than given exhaustive descriptions of each technique here. The aim of this review is therefore for readers to grasp the advantages and disadvantages of the various sensing techniques for BVOCs and their oxidation products, and to provide guidance for choosing the optimal technique for a specific research task.
ERIC Educational Resources Information Center
Camara, Boubacar
This publication complements the "Education for All" program and is intended to provide a comprehensive and operational indicator for monitoring education. As a synthetic tool, the Educational Progress Indicator (EPI) facilitates the analytical assessment and projection work of educational planners, managers, actors, and policymakers. The EPI…
Analytical techniques for characterization of cyclodextrin complexes in the solid state: A review.
Mura, Paola
2015-09-10
Cyclodextrins are cyclic oligosaccharides able to form inclusion complexes with a variety of hydrophobic guest molecules, positively modifying their physicochemical properties. A thorough analytical characterization of cyclodextrin complexes is of fundamental importance to provide adequate support in the selection of the most suitable cyclodextrin for each guest molecule, and also in view of possible future patenting and marketing of drug-cyclodextrin formulations. The demonstration of the actual formation of a drug-cyclodextrin inclusion complex in solution does not guarantee its existence in the solid state as well. Moreover, the technique used to prepare the solid complex can strongly influence the properties of the final product. Therefore, an appropriate characterization of the resulting drug-cyclodextrin solid systems also plays a key role in guiding the choice of the most effective preparation method, i.e., the one able to maximize host-guest interactions. The analytical characterization of drug-cyclodextrin solid systems and the assessment of actual inclusion complex formation is not a simple task and involves the combined use of several analytical techniques, whose results have to be evaluated together. The objective of the present review is to present a general prospect of the principal analytical techniques which can be employed for a suitable characterization of drug-cyclodextrin systems in the solid state, highlighting their respective advantages and limitations. The applications of each examined technique are described and discussed with pertinent examples from the literature. Copyright © 2015 Elsevier B.V. All rights reserved.
Historical milestones in measurement of HDL-cholesterol: impact on clinical and laboratory practice.
Langlois, Michel R; Blaton, Victor H
2006-07-23
High-density lipoprotein cholesterol (HDL-C) comprises a family of particles with differing physicochemical characteristics. Continuing progress in improving HDL-C analysis has originated from two separate fields: one clinical, reflecting increased attention to HDL-C in estimating risk for coronary heart disease (CHD), and the other analytical, reflecting increased emphasis on finding more reliable and cost-effective HDL-C assays. Epidemiologic and prospective studies established the inverse association of HDL-C with CHD risk, a relationship that is consistent with protective mechanisms demonstrated in basic research and animal studies. Atheroprotective and less atheroprotective HDL subpopulations have been described. Guidelines on primary and secondary CHD prevention, which increased the workload in clinical laboratories, have led to a revolution in HDL-C assay technology. Many analytical techniques including ultracentrifugation, electrophoresis, chromatography, and polyanion precipitation methods have been developed to separate and quantify HDL-C and HDL subclasses. More recently developed homogeneous assays enable direct measurement of HDL-C on an automated analyzer, without the need for manual pretreatment to separate non-HDL. Although homogeneous assays show improved accuracy and precision in normal serum, discrepant results exist in samples with atypical lipoprotein characteristics. Hypertriglyceridemia and monoclonal paraproteins are important interfering factors. A novel approach is nuclear magnetic resonance spectroscopy, which allows rapid and reliable analysis of lipoprotein subclasses and may improve the identification of individuals at increased CHD risk. Apolipoprotein A-I, the major protein of HDL, has been proposed as an alternative cardioprotective marker avoiding the analytical limitations of HDL-C.
ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress
NASA Technical Reports Server (NTRS)
Kempler, Steven
2015-01-01
The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross-usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. Objectives: provide a forum for academic discussions that gives ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; and perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, document the specific data analytics expertise needed to perform Earth science data analytics, and seek graduate data analytics/data science student internship opportunities.
Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R
2016-01-21
The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles, using ICP-MS but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed.
This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.
Query2Question: Translating Visualization Interaction into Natural Language.
Nafari, Maryam; Weaver, Chris
2015-06-01
Richly interactive visualization tools are increasingly popular for data exploration and analysis in a wide variety of domains. Existing systems and techniques for recording provenance of interaction focus either on comprehensive automated recording of low-level interaction events or on idiosyncratic manual transcription of high-level analysis activities. In this paper, we present the architecture and translation design of a query-to-question (Q2Q) system that automatically records user interactions and presents them semantically using natural language (written English). Q2Q takes advantage of domain knowledge and uses natural language generation (NLG) techniques to translate and transcribe a progression of interactive visualization states into a visual log of styled text that complements and effectively extends the functionality of visualization tools. We present Q2Q as a means to support a cross-examination process in which questions rather than interactions are the focus of analytic reasoning and action. We describe the architecture and implementation of the Q2Q system, discuss key design factors and variations that affect question generation, and present several visualizations that incorporate Q2Q for analysis in a variety of knowledge domains.
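The core idea of the abstract, translating logged visualization interactions into written questions, can be sketched with a minimal template-based generator. The event fields and templates below are invented for illustration; Q2Q's actual grammar and NLG pipeline are considerably richer.

```python
# Hypothetical interaction records; a real Q2Q-style system would
# consume logged visualization states, not hand-written dicts.
TEMPLATES = {
    "filter": "Which {item} have {attribute} {op} {value}?",
    "sort": "How do the {item} rank by {attribute}?",
    "select": "What is notable about {item} '{value}'?",
}

def interaction_to_question(event):
    """Translate one logged interaction event into a written question."""
    template = TEMPLATES[event["type"]]
    # str.format ignores unused keyword arguments, so each template
    # consumes only the fields it needs.
    return template.format(**event)

log = [
    {"type": "filter", "item": "counties", "attribute": "population",
     "op": "above", "value": "50000"},
    {"type": "sort", "item": "counties", "attribute": "median income"},
]
for event in log:
    print(interaction_to_question(event))
```

A production system would add discourse-level smoothing (pronouns, aggregation of repeated actions), which is where the NLG techniques the paper describes come in.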
Ultra-small dye-doped silica nanoparticles via modified sol-gel technique
NASA Astrophysics Data System (ADS)
Riccò, R.; Nizzero, S.; Penna, E.; Meneghello, A.; Cretaio, E.; Enrichi, F.
2018-05-01
In modern biosensing and imaging, fluorescence-based methods constitute the most diffused approach to achieve optimal detection of analytes, both in solution and at the single-particle level. Despite the huge progress made in recent decades in the development of plasmonic biosensors and label-free sensing techniques, fluorescent molecules remain the most commonly used contrast agents to date for commercial imaging and detection methods. However, they exhibit low stability, can be difficult to functionalise, and often result in a low signal-to-noise ratio. Thus, embedding fluorescent probes into robust and bio-compatible materials, such as silica nanoparticles, can substantially enhance the detection limit and dramatically increase the sensitivity. In this work, ultra-small fluorescent silica nanoparticles (NPs) for optical biosensing applications were doped with a fluorescent dye, using simple water-based sol-gel approaches based on the classical Stöber procedure. By systematically modulating reaction parameters, controllable size tuning of particle diameters as low as 10 nm was achieved. Particle morphology and optical response were evaluated, showing possible single-molecule behaviour, without employing microemulsion methods to achieve similar results.
Statistical mechanics of the vertex-cover problem
NASA Astrophysics Data System (ADS)
Hartmann, Alexander K.; Weigt, Martin
2003-10-01
We review recent progress in the study of the vertex-cover problem (VC). The VC belongs to the class of NP-complete graph theoretical problems, which plays a central role in theoretical computer science. On ensembles of random graphs, VC exhibits a coverable-uncoverable phase transition. Very close to this transition, depending on the solution algorithm, easy-hard transitions in the typical running time of the algorithms occur. We explain a statistical mechanics approach, which works by mapping the VC to a hard-core lattice gas, and then applying techniques such as the replica trick or the cavity approach. Using these methods, the phase diagram of the VC could be obtained exactly for connectivities c < e, where the VC is replica symmetric. Recently, this result could be confirmed using traditional mathematical techniques. For c > e, the solution of the VC exhibits full replica symmetry breaking. The statistical mechanics approach can also be used to study analytically the typical running time of simple complete and incomplete algorithms for the VC. Finally, we describe recent results for the VC when studied on other ensembles of finite- and infinite-dimensional graphs.
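For readers unfamiliar with the vertex-cover problem the abstract studies, a short sketch may help: the classic greedy 2-approximation (add both endpoints of any uncovered edge), run on a sparse random graph of the kind analyzed in the review, with mean connectivity c. This illustrates only the problem definition, not the replica or cavity methods the paper develops.

```python
import random

def greedy_vertex_cover(edges):
    """Classic 2-approximation: repeatedly take an uncovered edge
    and add both of its endpoints to the cover."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

def is_cover(edges, cover):
    """Check that every edge has at least one endpoint in the cover."""
    return all(u in cover or v in cover for u, v in edges)

# Erdos-Renyi random graph G(n, p) in the sparse regime, with
# mean connectivity c = n * p as in the statistical-mechanics analysis.
random.seed(0)
n, c = 200, 2.0
p = c / n
edges = [(i, j) for i in range(n) for j in range(i + 1, n)
         if random.random() < p]
cover = greedy_vertex_cover(edges)
print(len(edges), len(cover), is_cover(edges, cover))
```

The review's point is that near the coverable-uncoverable transition, simple algorithms like this one (and exact branch-and-bound solvers) experience easy-hard transitions in typical running time.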
NASA Technical Reports Server (NTRS)
Freund, Friedemann
1991-01-01
Substantial progress has been made towards a better understanding of the dissolution of common gas/fluid phase components, notably H2O and CO2, in minerals. It has been shown that the dissolution mechanisms are significantly more complex than currently believed. By judiciously combining various solid state analytical techniques, convincing evidence was obtained that traces of dissolved gas/fluid phase components undergo, at least in part, a redox conversion by which they split into reduced H2 and reduced C on the one hand and oxidized oxygen, O(-), on the other. Analyses for H2 and C, as well as for any organic molecules which may form during the process of co-segregation, are still impeded by the omnipresent danger of extraneous contamination. However, the presence of O(-), an unusual oxidized form of oxygen, has been proven beyond a reasonable doubt. The presence of O(-) testifies to the fact that a redox reaction must have taken place in the solid state involving the dissolved traces of gas/fluid phase components. Detailed information on the techniques used and the results obtained is given.
Buttigieg, Pier Luigi; Ramette, Alban
2014-12-01
The application of multivariate statistical analyses has become a consistent feature in microbial ecology. However, many microbial ecologists are still in the process of developing a deep understanding of these methods and appreciating their limitations. As a consequence, staying abreast of progress and debate in this arena poses an additional challenge to many microbial ecologists. To address these issues, we present the GUide to STatistical Analysis in Microbial Ecology (GUSTA ME): a dynamic, web-based resource providing accessible descriptions of numerous multivariate techniques relevant to microbial ecologists. A combination of interactive elements allows users to discover and navigate between methods relevant to their needs and examine how they have been used by others in the field. We have designed GUSTA ME to become a community-led and -curated service, which we hope will provide a common reference and forum to discuss and disseminate analytical techniques relevant to the microbial ecology community. © 2014 The Authors. FEMS Microbiology Ecology published by John Wiley & Sons Ltd on behalf of Federation of European Microbiological Societies.
Chemical analysis of Panax quinquefolius (North American ginseng): A review.
Wang, Yaping; Choi, Hyung-Kyoon; Brinckmann, Josef A; Jiang, Xue; Huang, Linfang
2015-12-24
Panax quinquefolius (PQ) is one of the best-selling natural health products due to its proposed beneficial anti-aging, anti-cancer, anti-stress, anti-fatigue, and anxiolytic effects. In recent years, the quality of PQ has received considerable attention. Sensitive and accurate methods for qualitative and quantitative analyses of chemical constituents are necessary for the comprehensive quality control to ensure the safety and efficacy of PQ. This article reviews recent progress in the chemical analysis of PQ and its preparations. Numerous analytical techniques, including spectroscopy, thin-layer chromatography (TLC), gas chromatography (GC), high-performance liquid chromatography (HPLC), liquid chromatography/mass spectrometry (LC/MS), high-speed centrifugal partition chromatography (HSCPC), high-performance counter-current chromatography (HPCCC), nuclear magnetic resonance spectroscopy (NMR), and immunoassay, are described. Among these techniques, HPLC coupled with mass spectrometry (MS) is the most promising method for quality control. The challenges encountered in the chemical analysis of PQ are also briefly discussed, and the remaining questions regarding the quality control of PQ that require further investigation are highlighted. Copyright © 2015 Elsevier B.V. All rights reserved.
Molecular markers: progress and prospects for understanding reproductive ecology in elasmobranchs.
Portnoy, D S; Heist, E J
2012-04-01
Application of modern molecular tools is expanding the understanding of elasmobranch reproductive ecology. High-resolution molecular markers provide information at scales ranging from the identification of reproductively isolated populations in sympatry (i.e. cryptic species) to the relationships among parents, offspring and siblings. This avenue of study has not only augmented the current understanding of the reproductive biology of elasmobranchs but has also provided novel insights that could not be obtained through experimental or observational techniques. Sharing of genetic polymorphisms across ocean basins indicates that for some species there may be gene flow on global scales. The presence, however, of morphologically similar but genetically distinct entities in sympatry suggests that reproductive isolation can occur with minimal morphological differentiation. This review discusses the recent findings in elasmobranch reproductive biology like philopatry, hybridization and polyandry while highlighting important molecular and analytical techniques. Furthermore, the review examines gaps in current knowledge and discusses how new technologies may be applied to further the understanding of elasmobranch reproductive ecology. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.
Accuracy of trace element determinations in alternate fuels
NASA Technical Reports Server (NTRS)
Greenbauer-Seng, L. A.
1980-01-01
NASA-Lewis Research Center's work on the accurate measurement of trace levels of metals in various fuels is presented. The differences between laboratories and between analytical techniques, especially for concentrations below 10 ppm, are discussed, detailing the Atomic Absorption Spectrometry (AAS) and DC Arc Emission Spectrometry (dc arc) techniques used by NASA-Lewis. Also presented is the design of an interlaboratory study which considers the following factors: laboratory, analytical technique, fuel type, concentration, and ashing additive.
MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION
The objective of this presentation is to discuss pathogens of concern in biosolids and the analytical techniques used to evaluate microorganisms in biosolids, and to discuss the standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...
Product identification techniques used as training aids for analytical chemists
NASA Technical Reports Server (NTRS)
Grillo, J. P.
1968-01-01
Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.
Ion beams provided by small accelerators for material synthesis and characterization
NASA Astrophysics Data System (ADS)
Mackova, Anna; Havranek, Vladimir
2017-06-01
Compact, multipurpose electrostatic tandem accelerators are extensively used to produce ion beams, with energies in the range from 400 keV to 24 MeV, of almost all elements of the periodic system for trace element analysis by means of nuclear analytical methods. The ion beams produced by small accelerators have broad applications, mainly in material characterization (Rutherford Back-Scattering spectrometry, Particle-Induced X-ray Emission analysis, Nuclear Reaction Analysis and ion microprobe analysis with 1 μm lateral resolution, among others) and in high-energy implantation. Materials research is a traditionally progressive field of technology. Due to continuous miniaturization, the underlying structures are far beyond the analytical limits of most conventional methods. Ion Beam Analysis (IBA) techniques provide this possibility, as they use probes of similar or much smaller dimensions (particles, radiation). Ion beams can be used for the synthesis of new progressive functional nanomaterials for optics, electronics and other applications. Ion beams are extensively used in studies of fundamental energetic ion interactions with matter as well as in novel nanostructure synthesis using ion beam irradiation in various amorphous and crystalline materials, in order to obtain structures with extraordinary functional properties. IBA methods serve for the investigation of materials from materials research, industry, micro- and nano-technology, electronics, optics and laser technology, and chemical, biological and environmental investigation in general. Main research directions in laboratories employing small accelerators also include the preparation and characterization of micro- and nano-structured materials, which are of interest for basic and oriented research in materials science, and various studies of biological, geological, environmental and cultural heritage artefacts are supported as well.
Plenis, Alina; Oledzka, Ilona; Kowalski, Piotr; Baczek, Tomasz
2016-01-01
During the last few years there has been growing interest in research focused on the metabolism of steroid hormones, even though the study of metabolic hormone pathways remains a difficult and demanding task because of low steroid concentrations and the complexity of the analysed matrices. Thus, there has been increasing interest in the development of new, more selective and sensitive methods for monitoring these compounds in biological samples. Numerous bibliographic databases of the world research literature were systematically searched using a selected review question and inclusion/exclusion criteria. Next, the reports of the highest quality were selected using standard tools (181) and described in order to evaluate the advantages and limitations of different approaches to the measurement of steroids and their metabolites. An overview of the analytical challenges, the development of methods used in the assessment of the metabolic pathways of steroid hormones, and the priorities for future research, with special consideration of liquid chromatography (LC) and capillary electrophoresis (CE) techniques, is presented. Moreover, many LC and CE applications in pharmacological and psychological studies as well as endocrinology and sports medicine, taking into account recent progress in the area of metabolic profiling of steroids, are critically discussed. The latest reports show that LC systems coupled with mass spectrometry hold the predominant position in the research of steroid profiles. Moreover, CE techniques are likely to gain a prominent position in the diagnosis of hormone levels in the near future.
Factor-Analytic and Individualized Approaches to Constructing Brief Measures of ADHD Behaviors
ERIC Educational Resources Information Center
Volpe, Robert J.; Gadow, Kenneth D.; Blom-Hoffman, Jessica; Feinberg, Adam B.
2009-01-01
Two studies were performed to examine a factor-analytic and an individualized approach to creating short progress-monitoring measures from the longer "ADHD-Symptom Checklist-4" (ADHD-SC4). In Study 1, teacher ratings on items of the ADHD:Inattentive (IA) and ADHD:Hyperactive-Impulsive (HI) scales of the ADHD-SC4 were factor analyzed in a normative…
Building America House Simulation Protocols
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendron, Robert; Engebrecht, Cheryn
2010-09-01
The House Simulation Protocol document was developed to track and manage progress toward Building America's multi-year, average whole-building energy reduction research goals for new construction and existing homes, using a consistent analytical reference point. This report summarizes the guidelines for developing and reporting these analytical results in a consistent and meaningful manner for all home energy uses using standard operating conditions.
Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hakkila, E.A.
1978-10-01
Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.
Independent Research and Independent Exploratory Development Annual Report Fiscal Year 1975
1975-09-01
[Contents excerpt, partially garbled in extraction; recoverable items include a coding study, optical covert communications using laser transceivers, and Wagner, N. K., "Analysis of Microelectronic Materials Using Auger Spectroscopy and Additional Advanced Analytical Techniques," NELC Technical Note 2904.]
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison of its results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels: optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft's aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
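The report validates its semi-analytical sensitivities by comparison against finite differences. That verification step can be illustrated with a toy scalar function; the function below is an invented stand-in, since the report's actual quantities are aerodynamic responses computed by a parabolized Navier-Stokes solver.

```python
import math

def f(x):
    # Stand-in objective function (hypothetical, for illustration only).
    return x**3 * math.sin(x)

def df_analytic(x):
    # Hand-derived (analytical) sensitivity of f.
    return 3 * x**2 * math.sin(x) + x**3 * math.cos(x)

def df_finite_difference(x, h=1e-6):
    # Central finite difference used as the reference check;
    # truncation error is O(h^2).
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 1.3
err = abs(df_analytic(x0) - df_finite_difference(x0))
print(err)  # agreement establishes confidence in the analytic derivative
```

In practice the analytic route is preferred once verified, because each finite-difference sensitivity costs extra full solver evaluations per design variable, which is the computational saving the report highlights.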
Islas, Gabriela; Hernandez, Prisciliano
2017-01-01
To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid-phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. The technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. Applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit the minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027
Greenberg, Alexandra J; Serrano, Katrina J; Thai, Chan L; Blake, Kelly D; Moser, Richard P; Hesse, Bradford W; Ahern, David K
2017-03-01
Use of the internet for seeking and managing health information in the U.S., Europe, and emerging and developing nations is growing. Recent global trends indicate more interactive uses of the internet, including online communication with providers. In the U.S., the Healthy People 2020 (HP2020) initiative was created by the Department of Health and Human Services to provide 10-year goals for improving the health of American citizens. Two goals of HP2020 were to increase the proportion of individuals who use the internet to keep track of their personal health information (PHI) online and to increase the proportion of individuals who use the internet to communicate with their healthcare provider. In the present study, we use data from the seven administrations of the Health Information National Trends Survey (HINTS) to assess progress towards these goals. These data were analyzed using descriptive, bivariate, and logistic regression analytic techniques. Results of this study suggested that the HP2020 target of having 15.7% of individuals manage their PHI online by 2020 has already been exceeded (28.1%); similarly, the goal for the proportion of individuals communicating with their provider using the internet (15.0%) was exceeded by 2014 (29.7%). While progress towards these goals was positive in all sociodemographic groups for both goals, differences in the rate of progress were seen by gender, race/ethnicity, income, and education, but not by age group. The rapidly increasing proportion of individuals globally who use the internet to manage their health information provides unique opportunities for patient-centered health information technology interventions.
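The logistic-regression technique named in the abstract can be sketched on synthetic data. The covariate, outcome, and rates below are invented stand-ins (not HINTS data), and the fitter is a deliberately minimal batch gradient descent rather than a survey-weighted analysis.

```python
import math
import random

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Minimal logistic regression (intercept + one binary covariate)
    fit by batch gradient descent on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y) / n
            g1 += (p - y) * x / n
        b0 -= lr * g0
        b1 -= lr * g1
    return b0, b1

# Synthetic stand-in for a binary outcome (e.g. "manages PHI online")
# against a binary covariate (e.g. an education indicator): the outcome
# rate is 35% when x = 1 and 15% when x = 0.
random.seed(1)
xs = [random.randint(0, 1) for _ in range(500)]
ys = [1 if random.random() < (0.35 if x else 0.15) else 0 for x in xs]
b0, b1 = fit_logistic(xs, ys)
print(b1, math.exp(b1))  # fitted log-odds and odds ratio for the covariate
```

Real survey analyses like the one described would additionally apply sampling weights and replicate-weight variance estimation, which this sketch omits.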
NASA Astrophysics Data System (ADS)
Schaich, David
2016-03-01
Lattice field theory provides a non-perturbative regularization of strongly interacting systems, which has proven crucial to the study of quantum chromodynamics among many other theories. Supersymmetry plays prominent roles in the study of physics beyond the standard model, both as an ingredient in model building and as a tool to improve our understanding of quantum field theory. Attempts to apply lattice techniques to supersymmetric field theories have a long history, but until recently these efforts have generally encountered insurmountable difficulties related to the interplay of supersymmetry with the lattice discretization of spacetime. In recent years these difficulties have been overcome for a class of theories that includes the particularly interesting case of maximally supersymmetric Yang-Mills (N = 4 SYM) in four dimensions, which is a cornerstone of AdS/CFT duality. In combination with computational advances this progress enables practical numerical investigations of N = 4 SYM on the lattice, which can address questions that are difficult or impossible to handle through perturbation theory, AdS/CFT duality, or the conformal bootstrap program. I will briefly review some of the new ideas underlying this recent progress, and present some results from ongoing large-scale numerical calculations, including comparisons with analytic predictions.
NASA Astrophysics Data System (ADS)
Studdert-Kennedy, M.; Obrien, N.
1983-05-01
This report is one of a regular series on the status and progress of studies on the nature of speech, instrumentation for its investigation, and practical applications. Manuscripts cover the following topics: The influence of subcategorical mismatches on lexical access; The Serbo-Croatian orthography constrains the reader to a phonologically analytic strategy; Grammatical priming effects between pronouns and inflected verb forms; Misreadings by beginning readers of Serbo-Croatian; Bi-alphabetism and word recognition; Orthographic and phonemic coding for word identification: Evidence for Hebrew; Stress and vowel duration effects on syllable recognition; Phonetic and auditory trading relations between acoustic cues in speech perception: Further results; Linguistic coding by deaf children in relation to beginning reading success; Determinants of spelling ability in deaf and hearing adults: Access to linguistic structures; A dynamical basis for action systems; On the space-time structure of human interlimb coordination; Some acoustic and physiological observations on diphthongs; Relationship between pitch control and vowel articulation; Laryngeal vibrations: A comparison between high-speed filming and glottographic techniques; Compensatory articulation in hearing impaired speakers: A cinefluorographic study; and a review (Pierre Delattre: Studies in Comparative Phonetics).
Single Domain Antibodies as New Biomarker Detectors
Fischer, Katja; Leow, Chiuan Yee; Chuah, Candy; McCarthy, James
2017-01-01
Biomarkers are defined as indicators of biological processes, pathogenic processes, or pharmacological responses to a therapeutic intervention. Biomarkers have been widely used for early detection, prediction of response after treatment, and for monitoring the progression of diseases. Antibodies represent promising tools for recognition of biomarkers, and are widely deployed as analytical tools in clinical settings. For immunodiagnostics, antibodies are now exploited as binders for antigens of interest across a range of platforms. More recently, the discovery of antibody surface display and combinatorial chemistry techniques has allowed the exploration of new binders from a range of animals, for instance variable domains of new antigen receptors (VNAR) from shark and variable heavy chain domains (VHH) or nanobodies from camelids. These single domain antibodies (sdAbs) have some advantages over conventional murine immunoglobulin owing to the lack of a light chain, making them the smallest natural biomarker binders thus far identified. In this review, we will discuss several biomarkers used as a means to evaluate disease progress. The potential functionality of modern single domain antigen binders derived from phylogenetically early animals as new biomarker detectors for the development of current diagnostic and research platforms will be described. PMID:29039819
From molecular biology to nanotechnology and nanomedicine.
Bogunia-Kubik, Katarzyna; Sugisaka, Masanori
2002-01-01
Great progress in the development of molecular biology techniques has been seen since the discovery of the structure of deoxyribonucleic acid (DNA) and the implementation of a polymerase chain reaction (PCR) method. This started a new era of research on the structure of nucleic acids molecules, the development of new analytical tools, and DNA-based analyses. The latter included not only diagnostic procedures but also, for example, DNA-based computational approaches. On the other hand, people have started to be more interested in mimicking real life, and modeling the structures and organisms that already exist in nature for the further evaluation and insight into their behavior and evolution. These factors, among others, have led to the description of artificial organelles or cells, and the construction of nanoscale devices. These nanomachines and nanoobjects might soon find a practical implementation, especially in the field of medical research and diagnostics. The paper presents some examples, illustrating the progress in multidisciplinary research in the nanoscale area. It is focused especially on immunogenetics-related aspects and the wide usage of DNA molecules in various fields of science. In addition, some proposals for nanoparticles and nanoscale tools and their applications in medicine are reviewed and discussed.
Advanced Telemetry System Development.
Progress in advanced telemetry system development is described. Discussions are included of studies leading to the specification for design...characteristics of adaptive and analytical telemetry systems in which the information efficiently utilizes the data channel capacity. Also discussed are...Progress indicates that further sophistication of existing designs in telemetry will be less advantageous than the development of new systems of
NASA Technical Reports Server (NTRS)
Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.
1985-01-01
An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.
Airborne chemistry: acoustic levitation in chemical analysis.
Santesson, Sabina; Nilsson, Staffan
2004-04-01
This review with 60 references describes a unique path to miniaturisation, that is, the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitation of small volumes of sample by means of a levitation technique can be used as a way to avoid solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 microL volume range and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.
Yu, Peiqiang; Xin, Hangshu; Ban, Yajing; Zhang, Xuewei
2014-05-07
Recent advances in biofuel and bio-oil processing technology require huge supplies of energy feedstocks for processing. Very recently, new carinata seeds have been developed as energy feedstocks for biofuel and bio-oil production. The processing results in a large amount of coproduct, carinata meal. To date, there is no systematic study on the interactive association between biopolymers and biofunctions in carinata seed as an energy feedstock for biofuel and bioethanol processing and its processing coproduct (carinata meal). Molecular spectroscopy with synchrotron and globar sources is a rapid and noninvasive analytical technique that can investigate molecular structure conformation in relation to biopolymer functions and bioavailability. To date, however, these techniques have seldom been used in biofuel and bioethanol processing research. This paper aims to provide research progress and updates on applying molecular spectroscopy to the energy feedstock (carinata seed) and coproduct (carinata meal) from biofuel and bioethanol processing, and to show how these molecular techniques can be used to study the interactive association between biopolymers and biofunctions in the energy feedstocks and their coproducts before and after biodegradation.
HEAVY AND THERMAL OIL RECOVERY PRODUCTION MECHANISMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anthony R. Kovscek
2003-04-01
This technical progress report describes work performed from January 1 through March 31, 2003 for the project ''Heavy and Thermal Oil Recovery Production Mechanisms,'' DE-FC26-00BC15311. In this project, a broad spectrum of research is undertaken related to thermal and heavy-oil recovery. The research tools and techniques span from pore-level imaging of multiphase fluid flow to definition of reservoir-scale features through streamline-based history matching techniques. During this period, previous analysis of experimental data regarding multidimensional imbibition to obtain shape factors appropriate for dual-porosity simulation was verified by comparison among analytic, dual-porosity simulation, and fine-grid simulation. We continued to study the mechanisms by which oil is produced from fractured porous media at high pressure and high temperature. Temperature has a beneficial effect on recovery and reduces residual oil saturation. A new experiment was conducted on diatomite core. Significantly, we show that elevated temperature induces fines release in sandstone cores and this behavior may be linked to wettability. Our work in the area of primary production of heavy oil continues with field cores and crude oil. On the topic of reservoir definition, work continued on developing techniques that integrate production history into reservoir models using streamline-based properties.
Olivieri, Alejandro C
2005-08-01
Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.
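The Monte Carlo approach to figures of merit described above can be illustrated in a deliberately simplified univariate setting (the paper itself treats multiway PARAFAC data): propagate synthetic noise through a calibration inversion and take sensitivity as the ratio of signal noise to the resulting concentration noise. All numbers here are illustrative, not from the study.

```python
import random
import statistics

def monte_carlo_sensitivity(slope, conc, sigma_noise, n_trials=5000, seed=1):
    """Estimate sensitivity by Monte Carlo noise propagation:
    sigma(signal noise) / sigma(back-calculated concentration)."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_trials):
        signal = slope * conc + rng.gauss(0.0, sigma_noise)
        estimates.append(signal / slope)  # invert the calibration
    return sigma_noise / statistics.stdev(estimates)

# For a univariate calibration the estimate should recover the slope (2.5);
# in multiway models the same recipe yields a nontrivial figure of merit.
sen = monte_carlo_sensitivity(slope=2.5, conc=1.0, sigma_noise=0.05)
print(round(sen, 2))
```

In the univariate case the result simply reproduces the calibration slope, which is a useful sanity check; the value of the Monte Carlo route is that the identical recipe still works when no closed-form sensitivity expression exists.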
Culture-Sensitive Functional Analytic Psychotherapy
ERIC Educational Resources Information Center
Vandenberghe, L.
2008-01-01
Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…
NASA Astrophysics Data System (ADS)
Bescond, Marc; Li, Changsheng; Mera, Hector; Cavassilas, Nicolas; Lannoo, Michel
2013-10-01
We present a one-shot current-conserving approach to model the influence of electron-phonon scattering in nano-transistors using the non-equilibrium Green's function formalism. The approach is based on the lowest order approximation (LOA) to the current and its simplest analytic continuation (LOA+AC). By means of a scaling argument, we show how both LOA and LOA+AC can be easily obtained from the first iteration of the usual self-consistent Born approximation (SCBA) algorithm. Both LOA and LOA+AC are then applied to model n-type silicon nanowire field-effect-transistors and are compared to SCBA current characteristics. In this system, the LOA fails to describe electron-phonon scattering, mainly because of the interactions with acoustic phonons at the band edges. In contrast, the LOA+AC still well approximates the SCBA current characteristics, thus demonstrating the power of analytic continuation techniques. The limits of validity of LOA+AC are also discussed, and more sophisticated and general analytic continuation techniques are suggested for more demanding cases.
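The "simplest analytic continuation" of a two-term series can be illustrated with a [0/1] Padé approximant, which resums a0 + a1 as a0 / (1 - a1/a0). This toy geometric-series example is our own sketch of the resummation idea, not the authors' transport calculation.

```python
def pade_01(a0, a1):
    """[0/1] Pade approximant of the truncated series a0 + a1,
    the simplest analytic continuation: a0 / (1 - a1 / a0)."""
    return a0 / (1.0 - a1 / a0)

# Geometric series 1/(1 - x) at x = 0.5: the partial sum 1 + 0.5 = 1.5
# underestimates the exact value 2.0, which the resummation recovers.
resummed = pade_01(1.0, 0.5)
print(resummed)  # 2.0
```

The appeal in the transport context is the same: the resummed form can remain accurate where the bare truncated series (here, the LOA) breaks down.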
Hahn, Seung-yong; Ahn, Min Cheol; Bobrov, Emanuel Saul; Bascuñán, Juan; Iwasa, Yukikazu
2010-01-01
This paper addresses adverse effects of dimensional uncertainties of an HTS insert assembled with double-pancake (DP) coils on spatial field homogeneity. Each DP coil was wound with Bi2223 tapes having dimensional tolerances more than an order of magnitude larger than those accepted for LTS wires used in conventional NMR magnets. The paper presents: 1) dimensional variations measured in two LTS/HTS NMR magnets, 350 MHz (LH350) and 700 MHz (LH700), both built and operated at the Francis Bitter Magnet Laboratory; and 2) an analytical technique and its application to elucidate the field impurities measured with the two LTS/HTS magnets. Field impurities computed with the analytical model and those measured with the two LTS/HTS magnets agree quite well, demonstrating that this analytical technique is applicable to the design of a DP-assembled HTS insert with improved field homogeneity for a high-field LTS/HTS NMR magnet. PMID:20407595
Methods for geochemical analysis
Baedecker, Philip A.
1987-01-01
The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.
NASA Astrophysics Data System (ADS)
Rappleye, Devin Spencer
The development of electroanalytical techniques in multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has been limited to single-analyte mixtures with a few exceptions. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on a molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even with an additional signal from another analyte, correlating signals to concentration, and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.)
Analytical methods for gelatin differentiation from bovine and porcine origins and food products.
Nhari, Raja Mohd Hafidz Raja; Ismail, Amin; Che Man, Yaakob B
2012-01-01
Usage of gelatin in food products has been widely debated for several years, with concerns centering on the source of the gelatin, religious dietary requirements, and health. As a result, various analytical methods have been introduced and developed to differentiate whether gelatin is made from porcine or bovine sources. The analytical methods comprise a diverse range of equipment and techniques including spectroscopy, chemical precipitation, chromatography, and immunochemical assays. Each technique can differentiate gelatins to a certain extent, with its own advantages and limitations. This review is focused on an overview of the analytical methods available for differentiation of bovine and porcine gelatin and gelatin in food products so that new method development can be established. © 2011 Institute of Food Technologists®
An analytical and experimental evaluation of the plano-cylindrical Fresnel lens solar concentrator
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Allums, S. L.; Cosby, R. M.
1976-01-01
Plastic Fresnel lenses for solar concentration are attractive because of potential for low-cost mass production. An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range is reported. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based primarily on a 56 cm-wide lens with f-number 1.0. A sun-tracking heliostat provided a non-moving solar source. Measured data indicated more spreading at the profile base than analytically predicted. The measured and computed transmittances were 85 and 87% respectively. Preliminary testing with a second lens (1.85 m) indicated that modified manufacturing techniques corrected the profile spreading problem.
Adequacy of surface analytical tools for studying the tribology of ceramics
NASA Technical Reports Server (NTRS)
Sliney, H. E.
1986-01-01
Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less known techniques that may also prove useful.
ERIC Educational Resources Information Center
Arbaugh, J. B.; Hwang, Alvin
2013-01-01
Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…
Analytical challenges for conducting rapid metabolism characterization for QIVIVE.
Tolonen, Ari; Pelkonen, Olavi
2015-06-05
For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is a tool of choice for a majority of organic relatively lipophilic molecules, linked with a LC separation tool and simultaneous UV-detection. However, additional techniques such as gas chromatography, radiometric measurements and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: Detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions and strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Kenneth Paul
Capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) are widely used analytical separation techniques with many applications in chemical, biochemical, and biomedical sciences. Conventional analyte identification in these techniques is based on retention/migration times of standards; requiring a high degree of reproducibility, availability of reliable standards, and absence of coelution. From this, several new information-rich detection methods (also known as hyphenated techniques) are being explored that would be capable of providing unambiguous on-line identification of separating analytes in CE and HPLC. As further discussed, a number of such on-line detection methods have shown considerable success, including Raman, nuclear magnetic resonance (NMR), mass spectrometry (MS), and fluorescence line-narrowing spectroscopy (FLNS). In this thesis, the feasibility and potential of combining the highly sensitive and selective laser-based detection method of FLNS with analytical separation techniques are discussed and presented. A summary of previously demonstrated FLNS detection interfaced with chromatography and electrophoresis is given, and recent results from on-line FLNS detection in CE (CE-FLNS), and the new combination of HPLC-FLNS, are shown.
Gómez-Caravaca, Ana M; Maggio, Rubén M; Cerretani, Lorenzo
2016-03-24
Today virgin and extra-virgin olive oil (VOO and EVOO) are foods subject to a large number of analytical tests intended to ensure their quality and genuineness. Almost all official methods demand high use of reagents and manpower. Because of that, analytical development in this area is continuously evolving. Therefore, this review focuses on analytical methods for EVOO/VOO which use fast and smart approaches based on chemometric techniques in order to reduce time of analysis, reagent consumption, high-cost equipment and manpower. Experimental approaches of chemometrics coupled with fast analytical techniques such as UV-Vis spectroscopy, fluorescence, vibrational spectroscopies (NIR, MIR and Raman), NMR spectroscopy, and other more complex techniques like chromatography, calorimetry and electrochemical techniques applied to EVOO/VOO production and analysis have been discussed throughout this work. The advantages and drawbacks of this association have also been highlighted. Chemometrics has been evidenced as a powerful tool for the oil industry. In fact, it has been shown how chemometrics can be implemented along the different steps of EVOO/VOO production: raw material input control, monitoring during processing, and quality control of the final product. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Srirengan, Kanthikannan
The overall objective of this research was to develop the finite element code required to efficiently predict the strength of plain weave composite structures. Toward that end, three-dimensional conventional progressive damage analysis was implemented to predict the strength of plain weave composites subjected to periodic boundary conditions. Also, a modal technique for three-dimensional global/local stress analysis was developed to predict failure initiation in plain weave composite structures. The progressive damage analysis was used to study the effect of quadrature order, mesh refinement and degradation models on the predicted damage and strength of plain weave composites subjected to uniaxial tension in the warp tow direction. A 1/32nd part of the representative volume element of a symmetrically stacked configuration was analyzed. The tow geometry was assumed to be sinusoidal. A graphite/epoxy system was used. Maximum stress criteria and combined stress criteria were used to predict failure in the tows, and a maximum principal stress criterion was used to predict failure in the matrix. Degradation models based on logical reasoning, micromechanics idealization and experimental comparisons were used to calculate the effective material properties in the presence of damage. A modified Newton-Raphson method was used to determine the incremental solution for each applied strain level. Using a refined mesh and the discount method based on experimental comparisons, the progressive damage and the strength of plain weave composites of waviness ratios 1/3 and 1/6 subjected to uniaxial tension in the warp direction have been characterized. Plain weave composites exhibit a brittle response in uniaxial tension. The strength decreases significantly with the increase in waviness ratio. Damage initiation and collapse were caused dominantly by intra-tow cracking and inter-tow debonding, respectively.
The predicted strength of plain weave composites of racetrack geometry and waviness ratio 1/25.7 was compared with analytical predictions and experimental findings and was found to match well. To evaluate the performance of the modal technique, failure initiation in a short woven composite cantilevered plate subjected to an end moment and a transverse end load was predicted. The global/local predictions were found to agree reasonably well with the conventional finite element predictions.
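The modified Newton-Raphson scheme mentioned above can be sketched in one dimension: the initial stiffness is reused for every corrective iteration instead of re-forming the tangent each step, trading convergence rate for cheaper iterations. The softening-spring numbers below are illustrative, not from the study.

```python
def modified_newton_raphson(residual, k0, u0=0.0, tol=1e-10, max_iter=200):
    """Modified Newton-Raphson: the initial stiffness k0 is reused for
    every correction instead of re-forming the tangent each iteration."""
    u = u0
    for _ in range(max_iter):
        r = residual(u)
        if abs(r) < tol:
            return u
        u -= r / k0  # constant-stiffness correction
    raise RuntimeError("modified Newton-Raphson did not converge")

# Softening spring (illustrative): internal force k*u - a*u**3, load P.
k, a, P = 100.0, 5.0, 40.0
u = modified_newton_raphson(lambda x: k * x - a * x**3 - P, k0=k)
print(round(u, 4))
```

In a progressive damage analysis the same idea applies per load or strain increment: each increment is equilibrated iteratively while the degraded material properties soften the true tangent.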
Posch, Tjorben Nils; Pütz, Michael; Martin, Nathalie; Huhn, Carolin
2015-01-01
In this review we introduce the advantages and limitations of electromigrative separation techniques in forensic toxicology. We thus present a summary of illustrative studies and our own experience in the field together with established methods from the German Federal Criminal Police Office rather than a complete survey. We focus on the analytical aspects of analytes' physicochemical characteristics (e.g. polarity, stereoisomers) and analytical challenges including matrix tolerance, separation from compounds present in large excess, sample volumes, and orthogonality. For these aspects we want to reveal the specific advantages over more traditional methods. Both detailed studies and profiling and screening studies are taken into account. Care was taken to nearly exclusively document well-validated methods outstanding for the analytical challenge discussed. Special attention was paid to aspects exclusive to electromigrative separation techniques, including the use of the mobility axis, the potential for on-site instrumentation, and the capillary format for immunoassays. The review concludes with an introductory guide to method development for different separation modes, presenting typical buffer systems as starting points for different analyte classes. The objective of this review is to provide an orientation for users in separation science considering using capillary electrophoresis in their laboratory in the future.
Protein assay structured on paper by using lithography
NASA Astrophysics Data System (ADS)
Wilhelm, E.; Nargang, T. M.; Al Bitar, W.; Waterkotte, B.; Rapp, B. E.
2015-03-01
There are two main challenges in producing a robust, paper-based analytical device. The first one is to create a hydrophobic barrier which, unlike the commonly used wax barriers, does not break if the paper is bent. The second one is the creation of the (bio-)specific sensing layer. For this, proteins have to be immobilized without diminishing their activity. We solve both problems using light-based fabrication methods that enable fast, efficient manufacturing of paper-based analytical devices. The first technique relies on silanization, by which we create a flexible hydrophobic barrier made of dimethoxydimethylsilane. The second technique demonstrated within this paper uses photobleaching to immobilize proteins by means of maskless projection lithography. Both techniques have been tested on a classical lithography setup using printed toner masks and on a system for maskless lithography. Using these setups we could demonstrate that the proposed manufacturing techniques can be carried out at low cost. The resolution of the paper-based analytical devices obtained with static masks was lower due to the lower mask resolution. Better results were obtained using advanced lithography equipment. By doing so we demonstrated that our technique enables fabrication of effective hydrophobic boundary layers with a thickness of only 342 μm. Furthermore, we showed that fluorescein-5-biotin can be immobilized on the non-structured paper and be employed for the detection of streptavidin-alkaline phosphatase. By carrying out this assay on a paper-based analytical device that had been structured using the silanization technique, we proved the biological compatibility of the suggested patterning technique.
Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C
2016-01-28
Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of organic persistent pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also discussed in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Gardiner, Derek J.
1980-01-01
Reviews mainly quantitative analytical applications in the field of Raman spectrometry. Includes references to other reviews, new and analytically untested techniques, and novel sampling and instrument designs. Cites 184 references. (CS)
Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.
Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C
2016-09-01
Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were thereby divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20 %) to be reliably used down to approx. 100 ng L(-1), enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice among the different instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs. Individual methods inside each class showed smaller deviations, and the smallest influences were observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties such as high polarity or the capability of specific molecular interactions. Graphical Abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
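The method detection limits (MDLs) reported above are conventionally estimated from replicate low-level spike measurements as MDL = t(n-1, 0.99) × s, the classic EPA single-laboratory procedure. A minimal sketch, with illustrative replicate values (not data from the study):

```python
import statistics

# One-sided Student's t at 99 % confidence, indexed by degrees of freedom.
T99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def mdl(replicates):
    """EPA-style method detection limit from replicate low-level spikes:
    MDL = t(n-1, 0.99) * sample standard deviation."""
    return T99[len(replicates) - 1] * statistics.stdev(replicates)

# Seven replicate low-level spike results in ng/L (illustrative values).
spikes = [98.0, 102.0, 95.0, 101.0, 99.0, 97.0, 103.0]
print(round(mdl(spikes), 1))
```

Because the MDL scales with the standard deviation of replicates near the detection limit, the high-yield enrichment techniques, which deliver larger and more reproducible signals, naturally translate into the lower MDLs described above.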
Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C
2013-01-01
The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. 
Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been accomplished using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities and of potential leachable impurities produced by laboratory extraction of packaging components and materials. However, mass spectrometry-based analytical techniques have limitations for this application; newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.
The HVT technique and the 'uncertainty' relation for central potentials
NASA Astrophysics Data System (ADS)
Grypeos, M. E.; Koutroulos, C. G.; Oyewumi, K. J.; Petridou, Th
2004-08-01
The quantum mechanical hypervirial theorems (HVT) technique is used to treat the so-called 'uncertainty' relation for quite a general class of central potential wells, including the (reduced) Pöschl-Teller and the Gaussian one. It is shown that this technique is quite suitable for deriving an approximate analytic expression, in the form of a truncated power series expansion, for the dimensionless product $P_{nl} \equiv \langle r^2 \rangle_{nl} \langle p^2 \rangle_{nl} / \hbar^2$, for every (deeply) bound state of a particle moving non-relativistically in the well, provided that a (dimensionless) parameter s is sufficiently small. Attention is also paid to a number of cases, among the limited existing ones, in which exact analytic or semi-analytic expressions for $P_{nl}$ can be derived. Finally, numerical results are given and discussed.
7 CFR 90.2 - General terms defined.
Code of Federal Regulations, 2011 CFR
2011-01-01
... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...
Analytical Applications of NMR: Summer Symposium on Analytical Chemistry.
ERIC Educational Resources Information Center
Borman, Stuart A.
1982-01-01
Highlights a symposium on analytical applications of nuclear magnetic resonance spectroscopy (NMR), discussing pulse Fourier transformation technique, two-dimensional NMR, solid state NMR, and multinuclear NMR. Includes description of ORACLE, an NMR data processing system at Syracuse University using real-time color graphics, and algorithms for…
Microgenetic Learning Analytics Methods: Workshop Report
ERIC Educational Resources Information Center
Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin
2016-01-01
Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…
Predictive modeling of complications.
Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P
2016-09-01
Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.
Active Control of Inlet Noise on the JT15D Turbofan Engine
NASA Technical Reports Server (NTRS)
Smith, Jerome P.; Hutcheson, Florence V.; Burdisso, Ricardo A.; Fuller, Chris R.
1999-01-01
This report presents the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the year from November 1997 to December 1998 on the Active Noise Control of Turbofan Engines research project funded by NASA Langley Research Center. The concept of implementing active noise control techniques with fuselage-mounted error sensors is investigated both analytically and experimentally. The analytical part of the project involves the continued development of an advanced modeling technique to provide prediction and design guidelines for application of active noise control techniques to large, realistic high bypass engines of the type on which active control methods are expected to be applied. Results from the advanced analytical model are presented that show the effectiveness of the control strategies, and the analytical results presented for fuselage error sensors show good agreement with the experimentally observed results and provide additional insight into the control phenomena. Additional analytical results are presented for active noise control used in conjunction with a wavenumber sensing technique. The experimental work is carried out on a running JT15D turbofan jet engine in a test stand at Virginia Tech. The control strategy used in these tests was the feedforward Filtered-X LMS algorithm. The control inputs were supplied by single and multiple circumferential arrays of acoustic sources equipped with neodymium iron cobalt magnets mounted upstream of the fan. The reference signal was obtained from an inlet mounted eddy current probe. The error signals were obtained from a number of pressure transducers flush-mounted in a simulated fuselage section mounted in the engine test cell. The active control methods are investigated when implemented with the control sources embedded within the acoustically absorptive material on a passively-lined inlet. 
The experimental results show that the combination of active control techniques with fuselage-mounted error sensors and passive control techniques is an effective means of reducing radiated noise from turbofan engines. Strategic selection of the location of the error transducers is shown to be effective for reducing the radiation towards particular directions in the farfield. An analytical model is used to predict the behavior of the control system and to guide the experimental design configurations, and the analytical results presented show good agreement with the experimentally observed results.
Mass spectrometric based approaches in urine metabolomics and biomarker discovery.
Khamis, Mona M; Adamko, Darryl J; El-Aneed, Anas
2017-03-01
Urine metabolomics has recently emerged as a prominent field for the discovery of non-invasive biomarkers that can detect subtle metabolic discrepancies in response to a specific disease or therapeutic intervention. Urine, compared to other biofluids, is characterized by its ease of collection, richness in metabolites and its ability to reflect imbalances of all biochemical pathways within the body. Following urine collection for metabolomic analysis, samples must be immediately frozen to quench any biogenic and/or non-biogenic chemical reactions. Depending on the aim of the experiment, sample preparation can vary from simple procedures such as filtration to more specific extraction protocols such as liquid-liquid extraction. Due to the lack of comprehensive studies on urine metabolome stability, higher storage temperatures (i.e. 4°C) and repetitive freeze-thaw cycles should be avoided. To date, among all analytical techniques, mass spectrometry (MS) provides the best sensitivity, selectivity and identification capabilities for analyzing the majority of the metabolite composition of urine. Combined with the qualitative and quantitative capabilities of MS, and due to the continuous improvements in its related technologies (i.e. ultra high-performance liquid chromatography [UPLC] and hydrophilic interaction liquid chromatography [HILIC]), liquid chromatography (LC)-MS is unequivocally the most utilized and most informative analytical tool employed in urine metabolomics. Furthermore, differential isotope tagging techniques have provided a solution to ion suppression from the urine matrix, thus allowing for quantitative analysis. In addition to LC-MS, other MS-based technologies have been utilized in urine metabolomics. These include direct injection (infusion)-MS, capillary electrophoresis-MS and gas chromatography-MS.
In this article, the current progress of different MS-based techniques in exploring the urine metabolome, as well as recent findings on potentially diagnostic urinary biomarkers, is discussed. © 2015 Wiley Periodicals, Inc. Mass Spec Rev 36:115-134, 2017.
David C. Calkin; Mark A. Finney; Alan A. Ager; Matthew P. Thompson; Krista M. Gebert
2011-01-01
In this paper we review progress towards the implementation of a risk management framework for US federal wildland fire policy and operations. We first describe new developments in wildfire simulation technology that catalyzed the development of risk-based decision support systems for strategic wildfire management. These systems include new analytical methods to measure...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.
1990-08-01
This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and the quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and is used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated by using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
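The ratio-value calculation described in this record can be sketched in a few lines. This is a hedged illustration, not the authors' code: the exact variance weighting used in the paper may differ, and the signal-squared weighting below is one standard error-propagation choice.

```python
import numpy as np

def ratio_value(chrom_a, chrom_b, window, noise_var=1.0):
    """Variance-weighted ratio value from two baseline-corrected,
    sequential chromatograms over a pure-elution window (a slice)."""
    a = np.asarray(chrom_a, dtype=float)[window]
    b = np.asarray(chrom_b, dtype=float)[window]
    r = b / a                       # point-by-point ratio chromatogram
    # Points with larger signal have better signal-to-noise, so weight
    # each ratio point by signal**2 / noise variance (error propagation).
    w = a ** 2 / noise_var
    return float(np.sum(w * r) / np.sum(w))
```

For an analyte whose concentration rose by 50 % between injections, the ratio value over its pure-elution window is 1.5, i.e. a +50 % change, matching the paper's interpretation of the ratio value as a concentration ratio.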
Discourse-Centric Learning Analytics: Mapping the Terrain
ERIC Educational Resources Information Center
Knight, Simon; Littleton, Karen
2015-01-01
There is an increasing interest in developing learning analytic techniques for the analysis, and support of, high-quality learning discourse. This paper maps the terrain of discourse-centric learning analytics (DCLA), outlining the distinctive contribution of DCLA and outlining a definition for the field moving forwards. It is our claim that DCLA…
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...
Analyzing Matrices of Meta-Analytic Correlations: Current Practices and Recommendations
ERIC Educational Resources Information Center
Sheng, Zitong; Kong, Wenmo; Cortina, Jose M.; Hou, Shuofei
2016-01-01
Researchers have become increasingly interested in conducting analyses on meta-analytic correlation matrices. Methodologists have provided guidance and recommended practices for the application of this technique. The purpose of this article is to review current practices regarding analyzing meta-analytic correlation matrices, to identify the gaps…
Techniques for sensing methanol concentration in aqueous environments
NASA Technical Reports Server (NTRS)
Narayanan, Sekharipuram R. (Inventor); Chun, William (Inventor); Valdez, Thomas I. (Inventor)
2001-01-01
An analyte concentration sensor that is capable of fast and reliable sensing of analyte concentration in aqueous environments with high concentrations of the analyte. Preferably, the present invention is a methanol concentration sensor device coupled to a fuel metering control system for use in a liquid direct-feed fuel cell.
DOT National Transportation Integrated Search
2016-12-25
The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...
INVESTIGATING ENVIRONMENTAL SINKS OF MACROLIDE ANTIBIOTICS WITH ANALYTICAL CHEMISTRY
Possible environmental sinks (wastewater effluents, biosolids, sediments) of macrolide antibiotics (i.e., azithromycin, roxithromycin and clarithromycin) are investigated using state-of-the-art analytical chemistry techniques.
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases designed to support new forms of unstructured or semi-structured data, as opposed to the more traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach.
This talk will provide insights into big data analytics methods in the context of science across various communities and offer different views of how correlation-based and causality-based approaches provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically but also scientifically feasible. The lighthouse goal of the group is a classification of effective combinations of various technologies and scientific applications, in order to provide the scientific community with clear recommendations on which approaches meet both criteria.
Wang, Pei; Yu, Zhiguo
2015-10-01
Near infrared (NIR) spectroscopy as a rapid and nondestructive analytical technique, integrated with chemometrics, is a powerful process analytical tool for the pharmaceutical industry and is becoming an attractive complementary technique for herbal medicine analysis. This review mainly focuses on the recent applications of NIR spectroscopy in species authentication of herbal medicines and their geographical origin discrimination.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, R.S.; Kong, E.J.; Bahner, M.A.
The paper discusses several projects to measure hydrocarbon emissions associated with the manufacture of fiberglass-reinforced plastics. The main purpose of the projects was to evaluate pollution prevention techniques to reduce emissions by altering raw materials, application equipment, and operator technique. Analytical techniques were developed to reduce the cost of these emission measurements. Emissions from a small test mold in a temporary total enclosure (TTE) correlated with emissions from full-size production molds in a separate TTE. Gravimetric mass balance measurements inside the TTE generally agreed to within +/-30% with total hydrocarbon (THC) measurements in the TTE exhaust duct.
Miglior, Filippo; Fleming, Allison; Malchiodi, Francesca; Brito, Luiz F; Martin, Pauline; Baes, Christine F
2017-12-01
Over the past 100 yr, the range of traits considered for genetic selection in dairy cattle populations has progressed to meet the demands of both industry and society. At the turn of the 20th century, dairy farmers were interested in increasing milk production; however, a systematic strategy for selection was not available. Organized milk performance recording took shape, followed quickly by conformation scoring. Methodological advances in both genetic theory and statistics around the middle of the century, together with technological innovations in computing, paved the way for powerful multitrait analyses. As more sophisticated analytical techniques for traits were developed and incorporated into selection programs, production began to increase rapidly, and the wheels of genetic progress began to turn. By the end of the century, the focus of selection had moved away from being purely production oriented toward a more balanced breeding goal. This shift occurred partly due to increasing health and fertility issues and partly due to societal pressure and welfare concerns. Traits encompassing longevity, fertility, calving, health, and workability have now been integrated into selection indices. Current research focuses on fitness, health, welfare, milk quality, and environmental sustainability, underlying the concentrated emphasis on a more comprehensive breeding goal. In the future, on-farm sensors, data loggers, precision measurement techniques, and other technological aids will provide even more data for use in selection, and the difficulty will lie not in measuring phenotypes but rather in choosing which traits to select for. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Quality of Life and Outcomes in African Americans with CKD
Fischer, Michael J.; Wang, Xuelei; Brooks, Deborah; Bruce, Marino; Charleston, Jeanne; Cleveland, William H.; Dowie, Donna; Faulkner, Marquetta; Gassman, Jennifer; Hiremath, Leena; Kendrick, Cindy; Kusek, John W.; Norris, Keith C.; Thornley-Brown, Denyse; Greene, Tom; Lash, James P.
2014-01-01
Low health-related quality of life (HRQOL) has been associated with increased risk for hospitalization and death in ESRD. However, the relationship of HRQOL with outcomes in predialysis CKD is not well understood. We evaluated the association between HRQOL and renal and cardiovascular (CV) outcomes in 1091 African Americans with hypertensive CKD enrolled in the African American Study of Kidney Disease and Hypertension (AASK) trial and cohort studies. Outcomes included CKD progression (doubling of serum creatinine/ESRD), CV events/CV death, and a composite of CKD progression or death from any cause (CKD progression/death). We assessed HRQOL, including mental health composite (MHC) and physical health composite (PHC), using the Short Form-36 survey. Cox regression analyses were used to assess the relationship between outcomes and five-point decrements in MHC and PHC scores using measurements at baseline, at the most recent annual visit (time-varying), or averaged from baseline to the most recent visit (cumulative). During approximately 10 years of follow-up, lower mean PHC score was associated with increased risk of CV events/CV death and CKD progression/death across all analytic approaches, but only time-varying and cumulative decrements were associated with CKD progression. Similarly, lower mean MHC score was associated with increased risk of CV events/CV death regardless of analytic approach, while only time-varying and cumulative decrements in mean MHC score was associated with CKD progression and CKD progression or death. In conclusion, lower HRQOL is associated with a range of adverse outcomes in African Americans with hypertensive CKD. PMID:24700865
Quality of life and outcomes in African Americans with CKD.
Porter, Anna; Fischer, Michael J; Wang, Xuelei; Brooks, Deborah; Bruce, Marino; Charleston, Jeanne; Cleveland, William H; Dowie, Donna; Faulkner, Marquetta; Gassman, Jennifer; Hiremath, Leena; Kendrick, Cindy; Kusek, John W; Norris, Keith C; Thornley-Brown, Denyse; Greene, Tom; Lash, James P
2014-08-01
Low health-related quality of life (HRQOL) has been associated with increased risk for hospitalization and death in ESRD. However, the relationship of HRQOL with outcomes in predialysis CKD is not well understood. We evaluated the association between HRQOL and renal and cardiovascular (CV) outcomes in 1091 African Americans with hypertensive CKD enrolled in the African American Study of Kidney Disease and Hypertension (AASK) trial and cohort studies. Outcomes included CKD progression (doubling of serum creatinine/ESRD), CV events/CV death, and a composite of CKD progression or death from any cause (CKD progression/death). We assessed HRQOL, including mental health composite (MHC) and physical health composite (PHC), using the Short Form-36 survey. Cox regression analyses were used to assess the relationship between outcomes and five-point decrements in MHC and PHC scores using measurements at baseline, at the most recent annual visit (time-varying), or averaged from baseline to the most recent visit (cumulative). During approximately 10 years of follow-up, lower mean PHC score was associated with increased risk of CV events/CV death and CKD progression/death across all analytic approaches, but only time-varying and cumulative decrements were associated with CKD progression. Similarly, lower mean MHC score was associated with increased risk of CV events/CV death regardless of analytic approach, while only time-varying and cumulative decrements in mean MHC score was associated with CKD progression and CKD progression or death. In conclusion, lower HRQOL is associated with a range of adverse outcomes in African Americans with hypertensive CKD. Copyright © 2014 by the American Society of Nephrology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean
A new field of research, visual analytics, has recently been introduced. This has been defined as “the science of analytical reasoning facilitated by visual interfaces." Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies to help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.
The Coordinate Orthogonality Check (corthog)
NASA Astrophysics Data System (ADS)
Avitabile, P.; Pechinsky, F.
1998-05-01
A new technique referred to as the coordinate orthogonality check (CORTHOG) helps to identify how each physical degree of freedom contributes to the overall orthogonality relationship between analytical and experimental modal vectors on a mass-weighted basis. Using the CORTHOG technique together with the pseudo-orthogonality check (POC) clarifies where potential discrepancies exist between the analytical and experimental modal vectors. CORTHOG improves the understanding of the correlation (or lack of correlation) that exists between modal vectors. The CORTHOG theory is presented along with the evaluation of several cases to show the use of the technique.
New test techniques and analytical procedures for understanding the behavior of advanced propellers
NASA Technical Reports Server (NTRS)
Stefko, G. L.; Bober, L. J.; Neumann, H. E.
1983-01-01
Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.
Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials
NASA Technical Reports Server (NTRS)
Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.
2004-01-01
A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e. tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography-mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth, since each technique provides a unique analysis capability for certain classes of organic molecules.
Translocation of "rod-coil" polymers: probing the structure of single molecules within nanopores.
de Haan, Hendrick W; Slater, Gary W
2013-01-25
Using simulation and analytical techniques, we demonstrate that it is possible to extract structural information about biological molecules by monitoring the dynamics as they translocate through nanopores. From Langevin dynamics simulations of polymers exhibiting discrete changes in flexibility (rod-coil polymers), distinct plateaus are observed in the progression towards complete translocation. Characterizing these dynamics via an incremental mean first passage approach, the large steps are shown to correspond to local barriers preventing the passage of the coils while the rods translocate relatively easily. Analytical replication of the results provides insight into the corrugated nature of the free energy landscape as well as the dependence of the effective barrier heights on the length of the coil sections. Narrowing the width of the pore or decreasing the charge on either the rod or the coil segments are both shown to enhance the resolution of structural details. The special case of a single rod confined within a nanopore is also studied. Here, sufficiently long flexible sections attached to either end are demonstrated to act as entropic anchors which can effectively trap the rod within the pore for an extended period of time. Both sets of results suggest new experimental approaches for the control and study of biological molecules within nanopores.
Green, Nelson W.; Perdue, E. Michael; Aiken, George R.; Butler, Kenna D.; Chen, Hongmei; Dittmar, Thorsten; Niggemann, Jutta; Stubbins, Aron
2014-01-01
Dissolved organic matter (DOM) was isolated from large volumes of deep (674 m) and surface (21 m) ocean water via reverse osmosis/electrodialysis (RO/ED) and two solid-phase extraction (SPE) methods (XAD-8/4 and PPL) at the Natural Energy Laboratory of Hawaii Authority (NELHA). By applying the three methods to common water samples, the efficiencies of XAD, PPL and RO/ED DOM isolation were compared. XAD recovered 42% of dissolved organic carbon (DOC) from deep water (25% with XAD-8; 17% with XAD-4) and 30% from surface water (16% with XAD-8; 14% with XAD-4). PPL recovered 61 ± 3% of DOC from deep water and 61% from surface water. RO/ED recovered 82 ± 3% of DOC from deep water, 14 ± 3% of which was recovered in a sodium hydroxide rinse, and 75 ± 5% of DOC from surface water, with 12 ± 2% in the sodium hydroxide rinse. The highest recoveries of all were achieved by the sequential isolation of DOC, first with PPL and then via RO/ED. This combined technique recovered 98% of DOC from a deep water sample and 101% of DOC from a surface water sample. In total, 1.9, 10.3 and 1.6 g-C of DOC were collected via XAD, PPL and RO/ED, respectively. Rates of DOC recovery using the XAD, PPL and RO/ED methods were 10, 33 and 10 mg-C h⁻¹, respectively. Based upon C/N ratios, XAD isolates were heavily C-enriched compared with water column DOM, whereas RO/ED and PPL ➔ RO/ED isolate C/N values were most representative of the original DOM. All techniques are suitable for the isolation of large amounts of DOM with purities suitable for most advanced analytical techniques. Coupling PPL and RO/ED techniques may provide substantial progress in the search for a method to quantitatively isolate oceanic DOC, bringing the entirety of the DOM pool within the marine chemist's analytical window.
Selecting a software development methodology [for digital flight control systems]
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. For techniques which cannot be quantitatively assessed, qualitative judgments are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.
Flexible aircraft dynamic modeling for dynamic analysis and control synthesis
NASA Technical Reports Server (NTRS)
Schmidt, David K.
1989-01-01
The linearization and simplification of a nonlinear, literal model for flexible aircraft are highlighted. Areas of model fidelity that are critical if the model is to be used for control system synthesis are developed, and several simplification techniques that can deliver the necessary model fidelity are discussed. These techniques include both numerical and analytical approaches. An analytical approach based on first-order sensitivity theory is shown to lead not only to excellent numerical results, but also to closed-form analytical expressions for key system dynamic properties such as the pole/zero factors of the vehicle transfer-function matrix. The analytical results are expressed in terms of vehicle mass properties, vibrational characteristics, and rigid-body and aeroelastic stability derivatives, thus revealing the underlying causes of critical dynamic characteristics.
Decision analytic models for Alzheimer's disease: state of the art and future directions.
Cohen, Joshua T; Neumann, Peter J
2008-05-01
Decision analytic policy models for Alzheimer's disease (AD) enable researchers and policy makers to investigate questions about the costs and benefits of a wide range of existing and potential screening, testing, and treatment strategies. Such models permit analysts to compare existing alternatives, explore hypothetical scenarios, and test the strength of underlying assumptions in an explicit, quantitative, and systematic way. Decision analytic models can best be viewed as complementing clinical trials both by filling knowledge gaps not readily addressed by empirical research and by extrapolating beyond the surrogate markers recorded in a trial. We identified and critiqued 13 distinct AD decision analytic policy models published since 1997. Although existing models provide useful insights, they also have a variety of limitations. (1) They generally characterize disease progression in terms of cognitive function and do not account for other distinguishing features, such as behavioral symptoms, functional performance, and the emotional well-being of AD patients and caregivers. (2) Many describe disease progression in terms of a limited number of discrete states, thus constraining the level of detail that can be used to characterize both changes in patient status and the relationships between disease progression and other factors, such as residential status, that influence outcomes of interest. (3) They have focused almost exclusively on evaluating drug treatments, thus neglecting other disease management strategies and combinations of pharmacologic and nonpharmacologic interventions. Future AD models should facilitate more realistic and compelling evaluations of various interventions to address the disease. 
An improved model will allow decision makers to better characterize the disease, to better assess the costs and benefits of a wide range of potential interventions, and to better evaluate the incremental costs and benefits of specific interventions used in conjunction with other disease management strategies.
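The discrete-state cohort models criticized above can be sketched in a few lines. The transition probabilities below are purely hypothetical, chosen only to illustrate the Markov structure (a handful of lumped states, annual cycles) whose coarseness limits how much clinical detail such models can represent:

```python
import numpy as np

# Hypothetical annual transition probabilities between discrete disease
# states (mild, moderate, severe, dead) -- illustrative values only,
# not drawn from any trial or published model.
P = np.array([
    [0.70, 0.20, 0.05, 0.05],   # mild
    [0.00, 0.60, 0.30, 0.10],   # moderate
    [0.00, 0.00, 0.80, 0.20],   # severe
    [0.00, 0.00, 0.00, 1.00],   # dead (absorbing state)
])

cohort = np.array([1.0, 0.0, 0.0, 0.0])  # entire cohort starts in the mild state
for _ in range(10):
    cohort = cohort @ P                   # redistribute the cohort each annual cycle
```

Costs and utilities would normally be attached to each state per cycle; the critique in the abstract is precisely that everything not captured by these few states (behavioral symptoms, caregiver well-being, residential status) is invisible to the model.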
Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján
2016-02-04
Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for the state of the art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys comprehensively and in two parts the developments of automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operation advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME and their static and dynamic operation modes as well as their automation methodologies is given. The LPME techniques are classified according to the different approaches of protection of the extraction solvent using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent such as membranes and porous media are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.
A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics
NASA Technical Reports Server (NTRS)
Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan
2013-01-01
In the era of petascale computing, more scientific applications are being deployed on leadership-scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not proven adequate for many mission-critical applications, particularly in the data post-processing stage. For example, some scientific applications generate datasets composed of a vast number of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Incorporating such dimensional knowledge into the data organization can benefit the efficiency of data post-processing, but it is often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high-performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before writing them to storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR enables efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with GEOS-5, a critical climate modeling application, experimental results on the Jaguar supercomputer demonstrate an improvement of up to 73 times in read performance compared to the original I/O method.
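The payoff of aggregating along the time dimension can be illustrated with a toy layout transformation (a sketch of the idea only, not the STAR implementation): in the time-major layout that simulations naturally write, one grid point's history is scattered across every timestep, whereas a space-major reorganization makes a temporal query a single contiguous read.

```python
import numpy as np

# Simulated output: 100 timesteps of a 64x64 field, written step by step.
steps = [np.random.rand(64, 64).astype(np.float32) for _ in range(100)]

# Time-major layout (as written): reading one grid point's history
# touches 100 scattered arrays.
time_major = np.stack(steps)                  # shape (100, 64, 64)

# Space-major ("temporally aggregated") layout: each grid point's full
# time series is contiguous in memory/storage, so a temporal query is
# one sequential read.
space_major = np.ascontiguousarray(time_major.transpose(1, 2, 0))  # (64, 64, 100)
series = space_major[10, 20]                  # 100-sample history of point (10, 20)
```

On disk the same principle applies: chunking or reordering so that the query dimension is contiguous turns many small random reads into one sequential read, which is the access-pattern improvement the abstract describes.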
ERIC Educational Resources Information Center
Feifer, Irwin; And Others
Based on an analytically evaluative case study of a New York City furniture department store's experiences with a Manpower Administration contract, this report deals with the development and progress of the program as analyzed by one investigator through interviews with almost all of the participants in the program. As a result of the study,…
Chen, Zhencai; De Beuckelaer, Alain; Wang, Xu; Liu, Jia
2017-11-24
Recent studies revealed spontaneous neural activity to be associated with fluid intelligence (gF), which is commonly assessed by Raven's Advanced Progressive Matrices and embeds two types of reasoning: visuospatial and verbal-analytic reasoning. Using resting-state fMRI data and global brain connectivity (GBC) analysis, which averages the functional connectivity of a voxel in relation to all other voxels in the brain, distinct neural correlates of these two reasoning types were found. For visuospatial reasoning, negative correlations were observed in both the primary visual cortex (PVC) and the precuneus, and positive correlations were observed in the temporal lobe. For verbal-analytic reasoning, negative correlations were observed in the right inferior frontal gyrus (rIFG), dorsal anterior cingulate cortex and temporoparietal junction, and positive correlations were observed in the angular gyrus. Furthermore, an interaction between GBC value and type of reasoning was found in the PVC, rIFG and the temporal lobe. These findings suggest that visuospatial reasoning benefits more from elaborate perception of stimulus features, whereas verbal-analytic reasoning benefits more from feature integration and hypothesis testing. In sum, the present study offers the first empirical evidence of separate neural substrates in the resting brain for the different types of reasoning that make up gF.
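A minimal sketch of a GBC computation on synthetic data, assuming the definition given above (each voxel's mean Pearson correlation with all other voxels). Many studies Fisher-transform the correlations before averaging; that step is omitted here for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 500))   # toy data: 200 time points x 500 "voxels"

# Pearson correlation of every voxel's time series with every other voxel's.
r = np.corrcoef(ts.T)                  # 500 x 500 functional connectivity matrix
np.fill_diagonal(r, np.nan)            # exclude each voxel's self-correlation
gbc = np.nanmean(r, axis=1)            # GBC: mean connectivity of each voxel to all others
```

For uncorrelated noise like this, GBC values hover near zero; in real resting-state data, voxels with systematically high (or low) GBC are the ones the study correlates with reasoning scores.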
Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.
2016-09-16
The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging applications of analytical techniques for nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.
Conceptual data sampling for breast cancer histology image classification.
Rezk, Eman; Awan, Zainab; Islam, Fahad; Jaoua, Ali; Al Maadeed, Somaya; Zhang, Nan; Das, Gautam; Rajpoot, Nasir
2017-10-01
Data analytics has become increasingly complicated as the amount of data has grown. One technique used to enable analytics on large datasets is data sampling, in which a portion of the data is selected so as to preserve the data's characteristics for use in analytics. In this paper, we introduce a novel data sampling technique rooted in formal concept analysis theory. This technique creates samples reliant on the data distribution across a set of binary patterns. The proposed sampling technique is applied to classifying regions of breast cancer histology images as malignant or benign. The performance of our method is compared to other classical sampling methods. The results indicate that our method is efficient and generates an illustrative sample of small size. It is also competitive with other sampling methods in terms of sample size and sample quality, as measured by classification accuracy and F1 score. Copyright © 2017 Elsevier Ltd. All rights reserved.
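The core sampling idea, preserving the data distribution across binary patterns, can be sketched as a per-pattern proportional sample. This is an illustrative simplification; the paper's actual method is built on formal concept analysis, not the simple grouping shown here:

```python
import numpy as np

def pattern_sample(X_bin, frac, rng):
    """Keep roughly `frac` of the rows under every distinct binary pattern,
    so the sample preserves the data distribution across patterns."""
    groups = {}
    for i, row in enumerate(X_bin):
        groups.setdefault(tuple(row), []).append(i)
    chosen = []
    for rows in groups.values():
        n = max(1, round(frac * len(rows)))   # never drop a pattern entirely
        chosen.extend(rng.choice(rows, size=n, replace=False).tolist())
    return sorted(chosen)

# Toy binarized feature matrix: three rows share pattern (1, 0), one is (0, 1).
X = np.array([[1, 0], [1, 0], [1, 0], [0, 1]])
sample = pattern_sample(X, 0.5, np.random.default_rng(0))
```

Because every pattern keeps at least one representative, rare patterns survive the sampling, which is what lets the reduced set remain "illustrative" of the full data.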
NASA Technical Reports Server (NTRS)
Eckel, J. S.; Crabtree, M. S.
1984-01-01
Analytical and subjective techniques that are sensitive to the information transmission and processing requirements of individual communications-related tasks are used to assess workload imposed on the aircrew by A-10 communications requirements for civilian transport category aircraft. Communications-related tasks are defined to consist of the verbal exchanges between crews and controllers. Three workload estimating techniques are proposed. The first, an information theoretic analysis, is used to calculate bit values for perceptual, manual, and verbal demands in each communication task. The second, a paired-comparisons technique, obtains subjective estimates of the information processing and memory requirements for specific messages. By combining the results of the first two techniques, a hybrid analytical scale is created. The third, a subjective rank ordering of sequences of communications tasks, provides an overall scaling of communications workload. Recommendations for future research include an examination of communications-induced workload among the air crew and the development of simulation scenarios.
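The information-theoretic analysis assigns each communications task a bit value. Assuming the standard log2-of-alternatives measure, a single hypothetical exchange might be scored as follows (the alternative counts are invented purely for illustration; the study's actual message inventory is not reproduced here):

```python
import math

# Hypothetical exchange: the crew must perceive one of 8 possible clearances,
# make one of 4 possible manual radio/panel selections, and read back one of
# 8 possible phrases. Each demand contributes log2(N) bits.
perceptual = math.log2(8)   # 3 bits
manual     = math.log2(4)   # 2 bits
verbal     = math.log2(8)   # 3 bits
task_load  = perceptual + manual + verbal   # 8 bits for this exchange
```

Summing such bit values over a sequence of exchanges gives the kind of objective workload estimate that the paper then combines with paired-comparison subjective ratings into a hybrid scale.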
Zhdanov, Michael S. [Salt Lake City, UT]
2008-01-29
Mineral exploration needs a reliable method to distinguish between uneconomic mineral deposits and economic mineralization. A method and system includes a geophysical technique for subsurface material characterization, mineral exploration and mineral discrimination. The technique introduced in this invention detects induced polarization effects in electromagnetic data and uses remote geophysical observations to determine the parameters of an effective conductivity relaxation model using a composite analytical multi-phase model of the rock formations. The conductivity relaxation model and analytical model can be used to determine parameters related by analytical expressions to the physical characteristics of the microstructure of the rocks and minerals. These parameters are ultimately used for the discrimination of different components in underground formations, and in this way provide an ability to distinguish between uneconomic mineral deposits and zones of economic mineralization using geophysical remote sensing technology.
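The abstract does not spell out the form of the effective conductivity relaxation model, but induced-polarization effects are conventionally described by a Cole-Cole-type complex conductivity; the sketch below assumes that standard form:

```python
def cole_cole_sigma(omega, sigma_inf, eta, tau, c):
    """Effective complex conductivity with an induced-polarization
    (relaxation) term, assuming the standard Cole-Cole form:
    sigma_inf -- high-frequency conductivity
    eta       -- intrinsic chargeability (0..1)
    tau       -- relaxation time constant (s)
    c         -- relaxation exponent (0..1)
    """
    return sigma_inf * (1.0 - eta / (1.0 + (1j * omega * tau) ** c))
```

The dispersion between the low-frequency limit, sigma_inf * (1 - eta), and the high-frequency limit, sigma_inf, is the IP signature; fitting (eta, tau, c) to field electromagnetic data is what allows discrimination between rock/mineral microstructures.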
Cooperative research in coal liquefaction. Technical progress report, May 1, 1993--April 30, 1994
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huffman, G.P.
Accomplishments for the past year are presented for the following tasks: coliquefaction of coal with waste materials; catalysts for coal liquefaction to clean transportation fuels; fundamental research in coal liquefaction; and in situ analytical techniques for coal liquefaction and coal liquefaction catalysts. Some of the highlights are: very promising results have been obtained from the liquefaction of plastics, rubber tires, paper and other wastes, and the coliquefaction of wastes with coal; a number of water-soluble coal liquefaction catalysts (iron, cobalt, nickel and molybdenum) have been comparatively tested; Mössbauer spectroscopy, XAFS spectroscopy, TEM and XPS have been used to characterize a variety of catalysts and other samples from numerous consortium and DOE liquefaction projects; and in situ ESR measurements of the free radical density have been conducted at temperatures from 100 to 600 °C and H₂ pressures up to 600 psi.
Early adventures in drug metabolism. 1. Role of the Bratton-Marshall reagent
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glazko, A.J.
1987-01-01
The Bratton-Marshall reagent is one of the real landmarks in the development of drug metabolism and pharmacokinetics, coming at a time when highly sensitive and specific analytical procedures were desperately needed for the measurement of drug concentrations in the body. Examples of its applications are taken from early work in the mid-1940s and 1950s in the Parke-Davis Research Laboratories, extending from primary aromatic amines (e.g., sulfonamides), to p-nitrophenyl compounds that must first be reduced to amines (e.g., chloramphenicol), and to phenyl derivatives that must be nitrated on a microgram scale and then reduced to aryl amines (e.g., phenytoin). The development and use of separation techniques such as liquid/liquid counter-current partition and paper chromatography is described. Emphasis is placed upon the continued, progressive improvement of the basic assay procedures over long periods of time.
Measuring Uranium Decay Rates for Advancement of Nuclear Forensics and Geochronology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parsons-Davis, Tashi
Radioisotopic dating techniques are highly valuable tools for understanding the history of physical and chemical processes in materials related to planetary sciences and nuclear forensics, and they rely on accurate knowledge of decay constants and their uncertainties. The decay constants of U-238 and U-235 are particularly important to Earth science, and often the measured values with the lowest reported uncertainties are applied, although they have not been independently verified with similar precision. New direct measurements of the decay constants of U-238, Th-234, U-235, and U-234 were completed using a range of analytical approaches. An overarching goal of the project was to ensure the quality of results, including metrological traceability, to facilitate implementation across diverse disciplines. This report presents preliminary results of these experiments, as a few final measurements and calculations are still in progress.
NASA Technical Reports Server (NTRS)
1974-01-01
Observations and research progress of the Smithsonian Astrophysical Observatory are reported. Satellite tracking networks (ground stations) are discussed and the equipment (Baker-Nunn cameras) used to observe the satellites is described. The improvement of the accuracy of the ground stations' laser ranging system is discussed, as are research efforts in satellite geodesy (tides, gravity anomalies, plate tectonics). The use of data processing for geophysical data is examined, and a data base for the Earth and Ocean Physics Applications Program is proposed. Analytical models of the earth's motion (computerized simulation) are described, and the computation (numerical integration and algorithms) of satellite orbits affected by the earth's albedo, using computer techniques, is also considered. Research efforts in the study of the atmosphere are examined (the effect of drag on satellite motion), and models of the atmosphere based on satellite data are described.
Uncertainty Quantification in Aeroelasticity
NASA Astrophysics Data System (ADS)
Beran, Philip; Stanford, Bret; Schrock, Christopher
2017-01-01
Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.
Jain, Ajay N.; Chin, Koei; Børresen-Dale, Anne-Lise; Erikstein, Bjorn K.; Lonning, Per Eystein; Kaaresen, Rolf; Gray, Joe W.
2001-01-01
We present a general method for rigorously identifying correlations between variations in large-scale molecular profiles and outcomes and apply it to chromosomal comparative genomic hybridization data from a set of 52 breast tumors. We identify two loci where copy number abnormalities are correlated with poor survival outcome (gain at 8q24 and loss at 9q13). We also identify a relationship between abnormalities at two loci and the mutational status of p53. Gain at 8q24 and loss at 5q15-5q21 are linked with mutant p53. The 9q and 5q losses suggest the possibility of gene products involved in breast cancer progression. The analytical techniques are general and also are applicable to the analysis of array-based expression data. PMID:11438741
On-Chip High-Finesse Fabry-Perot Microcavities for Optical Sensing and Quantum Information.
Bitarafan, Mohammad H; DeCorby, Ray G
2017-07-31
For applications in sensing and cavity-based quantum computing and metrology, open-access Fabry-Perot cavities-with an air or vacuum gap between a pair of high reflectance mirrors-offer important advantages compared to other types of microcavities. For example, they are inherently tunable using MEMS-based actuation strategies, and they enable atomic emitters or target analytes to be located at high field regions of the optical mode. Integration of curved-mirror Fabry-Perot cavities on chips containing electronic, optoelectronic, and optomechanical elements is a topic of emerging importance. Micro-fabrication techniques can be used to create mirrors with small radius-of-curvature, which is a prerequisite for cavities to support stable, small-volume modes. We review recent progress towards chip-based implementation of such cavities, and highlight their potential to address applications in sensing and cavity quantum electrodynamics.
A Mass Spectrometer in Every Fume Hood
NASA Astrophysics Data System (ADS)
McBride, Ethan M.; Verbeck, Guido F.
2018-06-01
Since their inception, mass spectrometers have played a pivotal role in the direction and application of synthetic chemical research. The ability to develop new instrumentation to solve current analytical challenges in this area has always been at the heart of mass spectrometry, although progress has been slow at times. Herein, we briefly review the history of how mass spectrometry has been used to approach challenges in organic chemistry, how new developments in portable instrumentation and ambient ionization have been used to open novel areas of research, and how current techniques have the ability to expand our knowledge of synthetic mechanisms and kinetics. Lastly, we discuss the relative paucity of work done in recent years to embrace the concept of improving benchtop synthetic chemistry with mass spectrometry, the disconnect between applications and fundamentals within these studies, and what hurdles still need to be overcome.
Rapid bacterial diagnostics via surface enhanced Raman microscopy.
Premasiri, W R; Sauer-Budge, A F; Lee, J C; Klapperich, C M; Ziegler, L D
2012-06-01
There is a continuing need to develop new techniques for the rapid and specific identification of bacterial pathogens in human body fluids especially given the increasing prevalence of drug resistant strains. Efforts to develop a surface enhanced Raman spectroscopy (SERS) based approach, which encompasses sample preparation, SERS substrates, portable Raman microscopy instrumentation and novel identification software, are described. The progress made in each of these areas in our laboratory is summarized and illustrated by a spiked infectious sample for urinary tract infection (UTI) diagnostics. SERS bacterial spectra exhibit both enhanced sensitivity and specificity allowing the development of an easy to use, portable, optical platform for pathogen detection and identification. SERS of bacterial cells is shown to offer not only reproducible molecular spectroscopic signatures for analytical applications in clinical diagnostics, but also is a new tool for studying biochemical activity in real time at the outer layers of these organisms.
Jenkins, Nigel; Meleady, Paula; Tyther, Raymond; Murphy, Lisa
2009-05-06
The production of monoclonal antibodies and other recombinant proteins is one of the highest growth areas in the pharmaceutical industry. Mammalian cells are used to manufacture the majority of biotherapeutics, largely due to their ability to perform complex post-translational modifications. Although significant progress has been made recently in improving product yields and protein quality, many challenges still lie ahead to achieve consistently high yields while avoiding potentially damaging protein modifications. The present review first considers the strategies used to analyse and improve recombinant protein expression of industrial cell lines, with an emphasis on proteomic technologies. Next, cellular and environmental influences on protein production and quality are examined, and strategies for improvements in product yield and quality are reviewed. The analytical techniques required to detect these protein changes are also described, together with prospects for assay improvements.
Economic sustainability assessment in semi-steppe rangelands.
Mofidi Chelan, Morteza; Alijanpour, Ahmad; Barani, Hossein; Motamedi, Javad; Azadi, Hossein; Van Passel, Steven
2018-05-08
This study was conducted to determine the indices and components of economic sustainability assessment in the pastoral units of the Sahand summer rangelands. The method was based on a descriptive-analytical survey of experts and researchers using questionnaires. Analysis of variance showed that the mean values of the economic components differ significantly from each other, with the efficiency component having the highest mean value (0.57). Analysis of the rangeland pastoral units with the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) indicated that, from an economic sustainability standpoint, the Garehgol (Ci = 0.519) and Badir Khan (Ci = 0.129) pastoral units ranked first and last, respectively. This study provides a clear understanding of existing resources and opportunities for policy makers, which is crucial for approaching economically sustainable development. Accordingly, this study can help better define sustainable development goals and monitor progress toward achieving them. Copyright © 2018 Elsevier B.V. All rights reserved.
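TOPSIS ranks alternatives by their closeness coefficient Ci, the relative distance to hypothetical ideal and anti-ideal solutions. A minimal sketch follows; the score matrix below is invented for illustration and is not the study's data:

```python
import numpy as np

def topsis(scores, weights, benefit):
    """Closeness coefficient Ci for each alternative (row of `scores`).
    benefit[j] is True when a higher score on criterion j is better."""
    norm = scores / np.linalg.norm(scores, axis=0)   # vector-normalize each criterion
    v = norm * weights                               # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # ideal solution
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))   # anti-ideal solution
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)              # Ci in [0, 1]; higher is better

# Hypothetical scores for three pastoral units on two benefit criteria
# (e.g. efficiency and income) -- illustrative numbers only.
scores = np.array([[0.57, 0.40],
                   [0.30, 0.35],
                   [0.10, 0.15]])
ci = topsis(scores, np.array([0.5, 0.5]), np.array([True, True]))
```

An alternative that dominates on every criterion gets Ci = 1 and one dominated on every criterion gets Ci = 0, which is why Ci values such as 0.519 vs 0.129 directly order the pastoral units.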
Tungsten devices in analytical atomic spectrometry
NASA Astrophysics Data System (ADS)
Hou, Xiandeng; Jones, Bradley T.
2002-04-01
Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.
Nano-Aptasensing in Mycotoxin Analysis: Recent Updates and Progress.
Rhouati, Amina; Bulbul, Gonca; Latif, Usman; Hayat, Akhtar; Li, Zhan-Hong; Marty, Jean Louis
2017-10-28
Recent years have witnessed an overwhelming integration of nanomaterials into the fabrication of biosensors. Nanomaterials have been incorporated with the objective of achieving better analytical figures of merit in terms of limit of detection, linear range, assay stability, low production cost, etc. Nanomaterials can act as immobilization supports, signal amplifiers, mediators and artificial enzyme labels in the construction of aptasensors. We aim in this work to review the recent progress in mycotoxin analysis. This review emphasizes the function of the different nanomaterials in aptasensor architectures. We subsequently relate their features to the analytical performance of the given aptasensor towards mycotoxin monitoring. In the same context, a critical analysis of each nano-aptasensing design and its level of success will be discussed. Finally, current challenges in nano-aptasensing design for mycotoxin analysis will be highlighted.
Chemistry Division. Quarterly progress report for period ending June 30, 1949
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1949-09-14
Progress reports are presented for the following tasks: (1) nuclear and chemical properties of heavy elements (solution chemistry, phase rule studies); (2) nuclear and chemical properties of elements in the fission product region; (3) general nuclear chemistry; (4) radio-organic chemistry; (5) chemistry of separations processes; (6) physical chemistry and chemical physics; (7) radiation chemistry; (8) physical measurements and instrumentation; and (9) analytical chemistry. The program of the chemistry division is divided into two efforts of approximately equal weight with respect to number of personnel: chemical research, and analytical service for the Laboratory. The various research problems fall into the following classifications: (1) chemical separation processes for isolation and recovery of fissionable material, production of radioisotopes, and military applications; (2) reactor development; and (3) fundamental research.
Postbuckling and Growth of Delaminations in Composite Plates Subjected to Axial Compression
NASA Technical Reports Server (NTRS)
Reeder, James R.; Chunchu, Prasad B.; Song, Kyongchan; Ambur, Damodar R.
2002-01-01
The postbuckling response and growth of circular delaminations in flat and curved plates are investigated as part of a study to identify the criticality of delamination locations through the laminate thickness. The experimental results from tests on delaminated plates are compared with finite element analysis results generated using shell models. The analytical prediction of delamination growth is obtained by assessing the strain energy release rate results from the finite element model and comparing them to a mixed-mode fracture toughness failure criterion. The analytical results for onset of delamination growth compare well with experimental results generated using a 3-dimensional displacement visualization system. The record of delamination progression measured in this study has resulted in a fully 3-dimensional test case with which progressive failure models can be validated.
ERIC Educational Resources Information Center
Ramsey-Klee, Diane M.; Richman, Vivian
The purpose of this research is to develop content analytic techniques capable of extracting the differentiating information in narrative performance evaluations for enlisted personnel in order to aid in the process of selecting personnel for advancement, duty assignment, training, or quality retention. Four tasks were performed. The first task…
Cost and schedule analytical techniques development
NASA Technical Reports Server (NTRS)
1994-01-01
This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.
The analyst's participation in the analytic process.
Levine, H B
1994-08-01
The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.
Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V
2014-01-01
This work presents the results of a study on the applicability of modern analytical transmission electron microscopy methods for the detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis of images of the nanoparticles in the cells was performed for the bright-field mode of transmission electron microscopy, dark-field scanning transmission electron microscopy, and high-angle annular dark-field scanning transmission electron microscopy. For identification of nanoparticles in the cells, two analytical techniques, energy-dispersive X-ray spectroscopy and electron energy-loss spectroscopy, were compared in both spectrum-acquisition and element-mapping modes. It was shown that electron tomography can confirm that nanoparticles are localized within the sample rather than deposited as surface contamination. The possibilities and fields of application of different analytical transmission electron microscopy techniques for the detection, visualization and identification of nanoparticles in biological samples are discussed.
Gonzalez-Dominguez, Alvaro; Duran-Guerrero, Enrique; Fernandez-Recamales, Angeles; Lechuga-Sancho, Alfonso Maria; Sayago, Ana; Schwarz, Monica; Segundo, Carmen; Gonzalez-Dominguez, Raul
2017-01-01
The analytical bias introduced by most of the commonly used techniques in metabolomics considerably hinders the simultaneous detection of all metabolites present in complex biological samples. To address this limitation, the combination of complementary approaches has emerged in recent years as the most suitable strategy for maximizing metabolite coverage. This review article presents a general overview of the most important analytical techniques usually employed in metabolomics: nuclear magnetic resonance, mass spectrometry and hybrid approaches. Furthermore, we emphasize the potential of integrating various tools in the form of metabolomic multi-platforms to achieve deeper metabolome characterization, and we provide a review of the existing literature in this field. This review is not intended to be exhaustive but, rather, to give a practical and concise guide for readers not familiar with analytical chemistry on the considerations involved in selecting the proper technique for a metabolomic experiment in biomedical research. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Hydraulic Fracturing Fluid Analysis for Regulatory Parameters - A Progress Report
This presentation is a progress report on the analysis of Hydraulic Fracturing Fluids for regulatory compounds outlined in the various US EPA methodologies. Fracturing fluids vary significantly in consistency and viscosity prior to fracturing. Due to the nature of the fluids the analytical challenges will have to be addressed. This presentation also outlines the sampling issues associated with the collection of dissolved gas samples.
Extended Analytic Device Optimization Employing Asymptotic Expansion
NASA Technical Reports Server (NTRS)
Mackey, Jonathan; Sehirlioglu, Alp; Dynsys, Fred
2013-01-01
Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions, including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact, all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed-form solutions and generate better physical understanding of the conditions for when simplifying assumptions may be valid. In obtaining the analytic solutions, a set of dimensionless parameters, which characterize all thermoelectric couples, is formulated and provides the limiting cases for validating assumptions. The presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples, using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior for (i) thermal resistance of hot and cold shoes, (ii) variable material properties with temperature, and (iii) lateral heat transfer through leg sides.
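As one concrete instance of the closed-form results such analytic treatments build on, the textbook maximum-efficiency expression for a constant-property thermoelectric couple can be evaluated directly. A minimal sketch follows; the shoe temperatures and ZT value are illustrative assumptions, not figures from the presentation:

```python
import math

def carnot_efficiency(t_hot, t_cold):
    """Ideal efficiency between hot- and cold-shoe temperatures (K)."""
    return (t_hot - t_cold) / t_hot

def max_te_efficiency(t_hot, t_cold, zt_mean):
    """Classic maximum efficiency of a thermoelectric couple with constant
    properties, where zt_mean is the dimensionless figure of merit evaluated
    at the mean shoe temperature (the simplified case the asymptotic
    analysis relaxes)."""
    m = math.sqrt(1.0 + zt_mean)  # optimal ratio of load to internal resistance
    return carnot_efficiency(t_hot, t_cold) * (m - 1.0) / (m + t_cold / t_hot)

# Illustrative couple: 800 K hot shoe, 400 K cold shoe, ZT = 1.
eta = max_te_efficiency(800.0, 400.0, 1.0)
```

With these numbers the couple recovers roughly a fifth of the Carnot limit, which is the kind of baseline the dimensionless-parameter analysis refines.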
NASA Astrophysics Data System (ADS)
Scholin, C.; Preston, C.; Harris, A.; Birch, J.; Marin, R.; Jensen, S.; Roman, B.; Everlove, C.; Makarewicz, A.; Riot, V.; Hadley, D.; Benett, W.; Dzenitis, J.
2008-12-01
An internet search using the phrase "ecogenomic sensor" will return numerous references that speak broadly to the idea of detecting molecular markers indicative of specific organisms, genes or other biomarkers within an environmental context. However, a strict and unified definition of "ecogenomic sensor" is lacking and the phrase may be used for laboratory-based tools and techniques as well as semi- or fully autonomous systems that can be deployed outside of the laboratory. We are exploring development of an ecogenomic sensor from the perspective of a field-portable device applied towards oceanographic research and water quality monitoring. The device is known as the Environmental Sample Processor, or ESP. The ESP employs wet chemistry molecular analytical techniques to autonomously assess the presence and abundance of specific organisms, their genes and/or metabolites in near real-time. Current detection chemistries rely on low-density DNA probe and protein arrays. This presentation will emphasize results from 2007-8 field trials when the ESP was moored in Monterey Bay, CA, as well as current engineering activities for improving the analytical capacity of the instrument. Changes in microbial community structure at the rRNA level were observed remotely in accordance with changing chemical and physical oceanographic conditions. Current developments include incorporation of a reusable solid phase extraction column for purifying nucleic acids and a 4-channel real-time PCR module. Users can configure this system to support a variety of PCR master mixes, primer/probe combinations and control templates. An update on progress towards fielding a PCR-enabled ESP will be given along with an outline of plans for its use in coastal and oligotrophic oceanic regimes.
NASA Technical Reports Server (NTRS)
Ahamed, Aakash; Bolten, John; Doyle, Colin; Fayne, Jessica
2016-01-01
Floods are the costliest natural disaster, causing approximately 6.8 million deaths in the twentieth century alone. Worldwide economic flood damage estimates in 2012 exceeded $19 billion USD. Extended duration floods also pose longer term threats to food security, water, sanitation, hygiene, and community livelihoods, particularly in developing countries. Projections by the Intergovernmental Panel on Climate Change (IPCC) suggest that precipitation extremes, rainfall intensity, storm intensity, and variability are increasing due to climate change. Increasing hydrologic uncertainty will likely lead to unprecedented extreme flood events. As such, there is a vital need to enhance and further develop traditional techniques used to rapidly assess flooding and extend analytical methods to estimate impacted population and infrastructure. Measuring flood extent in situ is generally impractical, time consuming, and can be inaccurate. Remotely sensed imagery acquired from space-borne and airborne sensors provides a viable platform for consistent and rapid wall-to-wall monitoring of large flood events through time. Terabytes of freely available satellite imagery are made available online each day by NASA, ESA, and other international space research institutions. Advances in cloud computing and data storage technologies allow researchers to leverage these satellite data and apply analytical methods at scale. Repeat-survey earth observations help provide insight about how natural phenomena change through time, including the progression and recession of floodwaters. In recent years, cloud-penetrating radar remote sensing techniques (e.g., Synthetic Aperture Radar) and high temporal resolution imagery platforms (e.g., MODIS and its 1-day return period), along with high performance computing infrastructure, have enabled significant advances in software systems that provide flood warning, assessments, and hazard reduction potential.
By incorporating social and economic data, researchers can develop systems that automatically quantify the socioeconomic impacts resulting from flood disaster events.
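A minimal sketch of the kind of raster analysis described above, assuming a toy backscatter grid and a hypothetical open-water threshold (real workflows calibrate the threshold per scene and overlay actual population rasters):

```python
import numpy as np

# Toy "backscatter" grid standing in for a SAR scene; open water typically
# returns low backscatter, so a simple cutoff separates flooded from dry
# pixels.  The threshold value below is an illustrative assumption.
backscatter_db = np.array([
    [-22.0, -21.5, -8.0],
    [-20.9,  -9.5, -7.2],
    [ -6.8,  -7.0, -6.5],
])
THRESHOLD_DB = -15.0  # assumed open-water cutoff

flooded = backscatter_db < THRESHOLD_DB
flood_fraction = flooded.mean()

# Socioeconomic impact estimate: overlay a (toy) population grid
# on the flood mask and sum the exposed cells.
population = np.array([
    [120,  80, 300],
    [ 95,  40, 250],
    [400, 380, 500],
])
exposed_population = int(population[flooded].sum())
```

The same mask-and-overlay pattern scales to full satellite scenes once the arrays come from real imagery and gridded population products.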
Metrology for hydrogen energy applications: a project to address normative requirements
NASA Astrophysics Data System (ADS)
Haloua, Frédérique; Bacquart, Thomas; Arrhenius, Karine; Delobelle, Benoît; Ent, Hugo
2018-03-01
Hydrogen represents a clean and storable energy solution that could meet worldwide energy demands and reduce greenhouse gas emissions. The joint research project (JRP) ‘Metrology for sustainable hydrogen energy applications’ addresses standardisation needs through pre- and co-normative metrology research in the fast-emerging sector of hydrogen fuel that meets the requirements of the European Directive 2014/94/EU by supplementing the revision of two ISO standards that are currently too generic to enable a sustainable implementation of hydrogen. The hydrogen purity dispensed at refueling points should comply with the technical specifications of ISO 14687-2 for fuel cell electric vehicles. The rapid progress of fuel cell technology now requires revising this standard towards less constraining limits for the 13 gaseous impurities. In parallel, optimized validated analytical methods are proposed to reduce the number of analyses. The study also aims at developing and validating traceable methods to assess accurately the hydrogen mass absorbed and stored in metal hydride tanks; this is a research axis for the revision of the ISO 16111 standard to develop this safe storage technique for hydrogen. The probability of hydrogen impurity presence affecting fuel cells and analytical techniques for traceable measurements of hydrogen impurities will be assessed, and new data on maximum concentrations of impurities based on degradation studies will be proposed. Novel validated methods for measuring the hydrogen mass absorbed in hydride tanks of the AB, AB2 and AB5 types referenced in ISO 16111 will be determined, as the methods currently available do not provide accurate results. The outputs will have a direct impact on the standardisation work for the ISO 16111 and ISO 14687-2 revisions in the relevant working groups of ISO/TC 197 ‘Hydrogen technologies’.
Heat as a tracer to determine streambed water exchanges
Constantz, J.
2010-01-01
This work reviews the use of heat as a tracer of shallow groundwater movement and describes current temperature-based approaches for estimating streambed water exchanges. Four common hydrologic conditions in stream channels are graphically depicted with the expected underlying streambed thermal responses, and techniques are discussed for installing and monitoring temperature and stage equipment for a range of hydrological environments. These techniques are divided into direct-measurement techniques in streams and streambeds, groundwater techniques relying on traditional observation wells, and remote sensing and other large-scale advanced temperature-acquisition techniques. A review of relevant literature suggests researchers often graphically visualize temperature data to enhance conceptual models of heat and water flow in the near-stream environment and to determine site-specific approaches of data analysis. Common visualizations of stream and streambed temperature patterns include thermographs, temperature envelopes, and one-, two-, and three-dimensional temperature contour plots. Heat and water transport governing equations are presented for the case of transport in streambeds, followed by methods of streambed data analysis, including simple heat-pulse arrival time and heat-loss procedures, analytical and time series solutions, and heat and water transport simulation models. A series of applications of these methods are presented for a variety of stream settings ranging from arid to continental climates. Progressive success in quantifying both streambed fluxes and the spatial extent of streambeds indicates that heat-tracing tools help define the streambed as a spatially distinct field (analogous to soil science), rather than simply the lower boundary in stream research or an amorphous zone beneath the stream channel.
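For the temperature-envelope methods mentioned above, a conduction-only baseline is a useful reference: under pure conduction the diurnal temperature amplitude decays exponentially with depth, and departures from that curve indicate advective streambed exchange. A sketch under that assumption follows; the thermal diffusivity is an assumed, typical value for saturated sediment, not a figure from the review:

```python
import math

# Conduction-only baseline: A(z) = A0 * exp(-z / d), with damping depth
# d = sqrt(2 * D / omega).  D below is an illustrative assumption.
D = 5.0e-7                       # thermal diffusivity, m^2/s
omega = 2.0 * math.pi / 86400.0  # angular frequency of diurnal forcing, rad/s

damping_depth = math.sqrt(2.0 * D / omega)  # depth scale of amplitude decay, m

def amplitude_at_depth(a0, z):
    """Predicted diurnal amplitude (degC) at depth z (m), conduction only."""
    return a0 * math.exp(-z / damping_depth)

# A 5 degC surface swing, observed 10 cm into the streambed:
a_10cm = amplitude_at_depth(5.0, 0.10)
```

A measured amplitude well above or below this prediction suggests downwelling or upwelling flux, which is the qualitative signal the envelope plots exploit.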
Trace metal speciation in natural waters: Computational vs. analytical
Nordstrom, D. Kirk
1996-01-01
Improvements in the field sampling, preservation, and determination of trace metals in natural waters have made many analyses more reliable and less affected by contamination. The speciation of trace metals, however, remains controversial. Chemical model speciation calculations do not necessarily agree with voltammetric, ion exchange, potentiometric, or other analytical speciation techniques. When metal-organic complexes are important, model calculations are not usually helpful and on-site analytical separations are essential. Many analytical speciation techniques have serious interferences and only work well for a limited subset of water types and compositions. A combined approach to the evaluation of speciation could greatly reduce these uncertainties. The approach proposed would be to (1) compare and contrast different analytical techniques with each other and with computed speciation, (2) compare computed trace metal speciation with reliable measurements of solubility, potentiometry, and mean activity coefficients, and (3) compare different model calculations with each other for the same set of water analyses, especially where supplementary data on speciation already exist. A comparison and critique of analytical with chemical model speciation for a range of water samples would delineate the useful range and limitations of these different approaches to speciation. Both model calculations and analytical determinations have useful and different constraints on the range of possible speciation such that they can provide much better insight into speciation when used together. Major discrepancies in the thermodynamic databases of speciation models can be evaluated with the aid of analytical speciation, and when the thermodynamic models are highly consistent and reliable, the sources of error in the analytical speciation can be evaluated. 
Major thermodynamic discrepancies also can be evaluated by simulating solubility and activity coefficient data and testing various chemical models for their range of applicability. Until a comparative approach such as this is taken, trace metal speciation will remain highly uncertain and controversial.
Pavement Performance : Approaches Using Predictive Analytics
DOT National Transportation Integrated Search
2018-03-23
Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...
Analytical techniques of pilot scanning behavior and their application
NASA Technical Reports Server (NTRS)
Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.
1986-01-01
The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
Fujiyoshi, Tomoharu; Ikami, Takahito; Sato, Takashi; Kikukawa, Koji; Kobayashi, Masato; Ito, Hiroshi; Yamamoto, Atsushi
2016-02-19
The consequences of matrix effects in GC are a major issue of concern in pesticide residue analysis. The aim of this study was to evaluate the applicability of an analyte protectant generator in pesticide residue analysis using a GC-MS system. The technique is based on continuous introduction of ethylene glycol into the carrier gas. Ethylene glycol as an analyte protectant effectively compensated for the matrix effects in agricultural product extracts. All peak intensities were increased by this technique without affecting the GC-MS performance. Calibration curves for ethylene glycol in the GC-MS system with various degrees of pollution were compared and similar response enhancements were observed. This result suggests a convenient multi-residue GC-MS method using an analyte protectant generator instead of the conventional compensation method for matrix-induced response enhancement, which adds a mixture of analyte protectants to both neat and sample solutions. Copyright © 2016 Elsevier B.V. All rights reserved.
Chemical Detection and Identification Techniques for Exobiology Flight Experiments
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.
2002-01-01
Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capabilities with minimal requirements of volume, weight and consumables. Advances may be achieved by increasing the amount of information acquired by a given technique and by miniaturization of proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using Gas Chromatography. These efforts to expand the analytical capabilities of GC technology are focused on the development of detectors for the GC that provide sample identification independent of the GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).
MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications
Medina, Isabel; Cappiello, Achille; Careri, Maria
2018-01-01
Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in food safety, environmental, and forensic investigation areas where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey attempts to cover the state of the art from 2012 up to 2017. PMID:29850370
Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.
Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S
2016-04-07
Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were examined in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs); cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices; and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regressions. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
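A minimal sketch of the weighted-regression recommendation, assuming an illustrative calibration data set and power-model variance parameters (in practice the parameters a and b would be estimated from replicate measurements):

```python
import numpy as np

# Hypothetical calibration data: concentration vs. signal, with noise
# that grows with signal (heteroskedastic), as the abstract describes.
conc   = np.array([1.0, 2.0,  5.0, 10.0, 20.0, 50.0])
signal = np.array([2.1, 4.0, 10.3, 19.5, 41.0, 99.0])

# Power model of variance: sigma ~ a * signal**b.  Values are illustrative.
a, b = 0.05, 1.0
sigma = a * signal**b

# Weighted least squares: weight each point by 1 / sigma**2, so noisy
# high-signal points influence the fit less than precise low-signal ones.
w = 1.0 / sigma**2
X = np.vstack([conc, np.ones_like(conc)]).T
W = np.diag(w)
slope, intercept = np.linalg.solve(X.T @ W @ X, X.T @ W @ signal)
```

Compared with an unweighted fit, the weighted calibration keeps the intercept, and hence the detection limit, anchored by the most precise low-concentration points.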
NASA Astrophysics Data System (ADS)
Stosnach, Hagen
2010-09-01
Selenium is essential for many aspects of human health and, thus, the object of intensive medical research. This demands the use of analytical techniques capable of analysing selenium at low concentrations with high accuracy in widespread matrices and sometimes smallest sample amounts. In connection with the increasing importance of selenium, there is a need for rapid and simple on-site (or near-to-site) selenium analysis in food basics like wheat at processing and production sites, as well as for the analysis of this element in dietary supplements. Common analytical techniques like electrothermal atomic absorption spectroscopy (ETAAS) and inductively-coupled plasma mass spectrometry (ICP-MS) are capable of analysing selenium in medical samples with detection limits in the range from 0.02 to 0.7 μg/l. Since in many cases less complicated and expensive analytical techniques are required, TXRF has been tested regarding its suitability for selenium analysis in different medical, food basics and dietary supplement samples applying most simple sample preparation techniques. The reported results indicate that the accurate analysis of selenium in all sample types is possible. The detection limits of TXRF are in the range from 7 to 12 μg/l for medical samples and 0.1 to 0.2 mg/kg for food basics and dietary supplements. Although this sensitivity is low compared to established techniques, it is sufficient for the physiological concentrations of selenium in the investigated samples.
Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.
Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn
2017-01-01
The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial, in which locally advanced breast cancer patients were randomised to either taxane- or anthracycline-based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane- versus anthracycline-based chemotherapies on progression-free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
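On a common effect scale, the decomposition described above reduces to a simple ratio, the proportion of the total effect carried through the surrogate. A toy sketch follows; the effect values are illustrative stand-ins, not the trial's actual estimates:

```python
def proportion_mediated(indirect_effect, direct_effect):
    """Share of the total treatment effect transmitted through the surrogate.
    Both effects are assumed to be on the same scale (e.g., a log
    hazard-ratio scale) so that they add up to the total effect."""
    total = indirect_effect + direct_effect
    return indirect_effect / total

# An indirect effect much smaller than the direct effect yields a small
# proportion mediated, mirroring the 4.2% result the abstract reports.
p = proportion_mediated(0.010, 0.228)
```

A proportion this small is the quantitative basis for rejecting the surrogate: nearly all of the treatment benefit bypasses pathological complete response.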
Rohlenová, J; Gryndler, M; Forczek, S T; Fuksová, K; Handova, V; Matucha, M
2009-05-15
Chloride, which comes into the forest ecosystem largely from the sea as aerosol (and has in the past been assumed to be inert), causes chlorination of soil organic matter. Studies of the chlorination showed that the content of organically bound chlorine in temperate forest soils is higher than that of chloride, and various chlorinated compounds are produced. Our study of chlorination of organic matter in the fermentation horizon of forest soil using radioisotope 36Cl and tracer techniques shows that microbial chlorination clearly prevails over abiotic chlorination, with chlorination of soil organic matter being enzymatically mediated and proportional to chloride content and time. Long-term (>100 days) chlorination leads to more stable chlorinated substances contained in the organic layer of forest soil (over time, chlorine is bound progressively more firmly in humic acids) and volatile organochlorines are formed. Penetration of chloride into microorganisms can be documented by the freezing/thawing technique. Chloride absorption in microorganisms in soil and in litter residues in the fermentation horizon complicates the analysis of 36Cl-chlorinated soil. The results show that the analytical procedure used should be tested for every soil type under study.
Jackson, Brian A; Faith, Kay Sullivan
2013-02-01
Although significant progress has been made in measuring public health emergency preparedness, system-level performance measures are lacking. This report examines a potential approach to such measures for Strategic National Stockpile (SNS) operations. We adapted an engineering analytic technique used to assess the reliability of technological systems, failure mode and effects analysis, to assess preparedness. That technique, which includes systematic mapping of the response system and identification of possible breakdowns that affect performance, provides a path to use data from existing SNS assessment tools to estimate the likely future performance of the system overall. Systems models of SNS operations were constructed and failure mode analyses were performed for each component. Linking data from existing assessments, including the technical assistance review and functional drills, to reliability assessment was demonstrated using publicly available information. The use of failure mode and effects estimates to assess overall response system reliability was demonstrated with a simple simulation example. Reliability analysis appears to be an attractive way to integrate information from the substantial investment in detailed assessments of stockpile delivery and dispensing to provide a view of likely future response performance.
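The "simple simulation example" is not reproduced in this abstract, but the underlying idea can be sketched: assign each mapped component a failure probability and estimate overall system reliability by Monte Carlo. Component names and probabilities below are invented for illustration:

```python
import random

# Monte Carlo reliability sketch: the response succeeds only if every
# component in the mapped system works. Failure probabilities are
# invented placeholders, not values from SNS assessments.

def simulate_reliability(failure_probs, trials=100_000, seed=1):
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        if all(rng.random() >= p for p in failure_probs):
            successes += 1
    return successes / trials

# hypothetical request -> delivery -> dispensing chain
probs = [0.05, 0.10, 0.02]
est = simulate_reliability(probs)
exact = (1 - 0.05) * (1 - 0.10) * (1 - 0.02)  # independent series system
print(est, exact)
```

For independent components in series the simulated estimate should track the analytic product of the per-component success probabilities.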
Understanding Organics in Meteorites and the Pre-Biotic Environment
NASA Technical Reports Server (NTRS)
Zare, Richard N.
2003-01-01
(1) Refinement of the analytic capabilities of our experiment via characterization of molecule-specific response and the effects upon analysis of the type of sample under investigation; (2) Measurement of polycyclic aromatic hydrocarbons (PAHs) with high sensitivity and spatial resolution within extraterrestrial samples; (3) Investigation of the interstellar reactions of PAHs via the analysis of species formed in systems modeling dust grains and ices; (4) Investigations into the potential role of PAHs in prebiotic and early biotic chemistry via photoreactions of PAHs under simulated prebiotic Earth conditions. To meet these objectives, we use microprobe laser-desorption, laser-ionization mass spectrometry (μL²MS), which is a sensitive, selective, and spatially resolved technique for detection of aromatic compounds. Appendix A presents a description of the μL²MS technique. The initial grant proposal was for a three-year funding period, while the award was given for a one-year interim period. Because of this change in time period, emphasis was shifted away from the first research goal, which was more development-oriented, in order to focus more on the other analysis-oriented goals. The progress made on each of the four research areas is given below.
Thermo-acousto-photonics for noncontact temperature measurement in silicon wafer processing
NASA Astrophysics Data System (ADS)
Suh, Chii-Der S.; Rabroker, G. Andrew; Chona, Ravinder; Burger, Christian P.
1999-10-01
A non-contact thermometry technique has been developed to characterize the thermal state of silicon wafers during rapid thermal processing. Information on thermal variations is obtained from the dispersion relations of the propagating waveguide mode excited in wafers using a non-contact, broadband optical system referred to as Thermal Acousto-Photonics for Non-Destructive Evaluation. Variations of thermo-mechanical properties in silicon wafers are correlated to temperature changes by performing simultaneous time-frequency analyses on Lamb waveforms acquired with a fiber-tip interferometer sensor. Experimental Lamb wave data collected for cases ranging from room temperature to 400 °C are presented. The results show that the temporal progressions of all spectral elements found in the fundamental antisymmetric mode are strong functions of temperature. This particular attribute is exploited to achieve a thermal resolution superior to the ±5 °C attainable through current pyrometric techniques. Excellent agreement was obtained by analyzing the temperature-dependent group velocity of a specific frequency component over the temperature range considered and comparing the results to an analytical model developed for silicon wafers undergoing annealing. The presented results demonstrate the feasibility of applying laser-induced stress waves as a temperature diagnostic during rapid thermal processing.
Applications of surface analytical techniques in Earth Sciences
NASA Astrophysics Data System (ADS)
Qian, Gujie; Li, Yubiao; Gerson, Andrea R.
2015-03-01
This review covers a wide range of surface analytical techniques: X-ray photoelectron spectroscopy (XPS), scanning photoelectron microscopy (SPEM), photoemission electron microscopy (PEEM), dynamic and static secondary ion mass spectroscopy (SIMS), electron backscatter diffraction (EBSD), atomic force microscopy (AFM). Others that are relatively less widely used but are also important to the Earth Sciences are also included: Auger electron spectroscopy (AES), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). All these techniques probe only the very top sample surface layers (sub-nm to several tens of nm). In addition, we also present several other techniques i.e. Raman microspectroscopy, reflection infrared (IR) microspectroscopy and quantitative evaluation of minerals by scanning electron microscopy (QEMSCAN) that penetrate deeper into the sample, up to several μm, as all of them are fundamental analytical tools for the Earth Sciences. Grazing incidence synchrotron techniques, sensitive to surface measurements, are also briefly introduced at the end of this review. (Scanning) transmission electron microscopy (TEM/STEM) is a special case that can be applied to characterisation of mineralogical and geological sample surfaces. Since TEM/STEM is such an important technique for Earth Scientists, we have also included it to draw attention to the capability of TEM/STEM applied as a surface-equivalent tool. While this review presents most of the important techniques for the Earth Sciences, it is not an all-inclusive bibliography of those analytical techniques. Instead, for each technique that is discussed, we first give a very brief introduction about its principle and background, followed by a short section on approaches to sample preparation that are important for researchers to appreciate prior to the actual sample analysis. 
We then use examples from publications (and also some of our known unpublished results) within the Earth Sciences to show how each technique is applied and used to obtain specific information and to resolve real problems, which forms the central theme of this review. Although this review focuses on applications of these techniques to study mineralogical and geological samples, we also anticipate that researchers from other research areas such as Material and Environmental Sciences may benefit from this review.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kenik, E.A.
X-ray microanalysis in an analytical electron microscope is a proven technique for the measurement of solute segregation in alloys. Solute segregation under equilibrium or nonequilibrium conditions can strongly influence material performance. X-ray microanalysis in an analytical electron microscope provides an alternative technique to measure grain boundary segregation, as well as segregation to other defects not accessible to Auger analysis. The utility of the technique is demonstrated by measurements of equilibrium segregation to boundaries in an antimony-containing stainless steel, including the variation of segregation with boundary character, and by measurements of nonequilibrium segregation to boundaries and dislocations in an ion-irradiated stainless steel.
Solid Lubrication Fundamentals and Applications. Chapter 2
NASA Technical Reports Server (NTRS)
Miyoshi, Kazuhisa
1998-01-01
This chapter describes powerful analytical techniques capable of sampling tribological surfaces and solid-film lubricants. Some of these techniques may also be used to determine the locus of failure in a bonded structure or coated substrate; such information is important when seeking improved adhesion between a solid-film lubricant and a substrate and when seeking improved performance and long life expectancy of solid lubricants. Many examples are given here and throughout the book on the nature and character of solid surfaces and their significance in lubrication, friction, and wear. The analytical techniques used include the latest spectroscopic methods.
Mass spectrometry of long-lived radionuclides
NASA Astrophysics Data System (ADS)
Becker, Johanna Sabine
2003-10-01
The capability of determining element concentrations at the trace and ultratrace level and isotope ratios is a main feature of inorganic mass spectrometry. The precise and accurate determination of isotope ratios of long-lived natural and artificial radionuclides is required, e.g. for their environmental monitoring and health control, for studying radionuclide migration, for age dating, for determining isotope ratios of radiogenic elements in the nuclear industry, for quality assurance and determination of the burn-up of fuel material in a nuclear power plant, for reprocessing plants, nuclear material accounting and radioactive waste control. Inorganic mass spectrometry, especially inductively coupled plasma mass spectrometry (ICP-MS) as the most important inorganic mass spectrometric technique today, possesses excellent sensitivity, precision and good accuracy for isotope ratio measurements and practically no restriction with respect to the ionization potential of the element investigated; therefore, thermal ionization mass spectrometry (TIMS), which has been used as the dominant analytical technique for precise isotope ratio measurements of long-lived radionuclides for many decades, is being replaced increasingly by ICP-MS. In the last few years instrumental progress in improving figures of merit for the determination of isotope ratio measurements of long-lived radionuclides in ICP-MS has been achieved by the application of a multiple ion collector device (MC-ICP-MS) and the introduction of the collision cell interface in order to dissociate disturbing argon-based molecular ions, to reduce the kinetic energy of ions and neutralize the disturbing noble gas ions (e.g. of 129Xe+ for the determination of 129I). The review describes the state of the art and the progress of different inorganic mass spectrometric techniques such as ICP-MS, laser ablation ICP-MS vs. TIMS, glow discharge mass spectrometry, secondary ion mass spectrometry, resonance ionization mass spectrometry and accelerator mass spectrometry for the determination of long-lived radionuclides in quite different materials.
Mercury-induced fragmentation of n-decane and n-undecane in positive mode ion mobility spectrometry.
Gunzer, F
2015-09-21
Ion mobility spectrometry is a well-known technique for trace gas analysis. With soft ionization techniques, fragmentation of analytes is normally not observed, with the consequence that analyte spectra of single substances are quite simple, generally showing only one peak. If the concentration is high enough, an extra cluster peak involving two analyte molecules can often be observed. When investigating n-alkanes, different results regarding the number of peaks in the spectra have been obtained in the past using this spectrometric technique. Here we present results obtained when analyzing n-alkanes (n-hexane to n-undecane) with a pulsed electron source, which show no fragmentation or clustering at all. However, when investigating a mixture of mercury and an n-alkane, a situation quite typical in the oil and gas industry, strong fragmentation and cluster formation involving these fragments were observed exclusively for n-decane and n-undecane.
[Recent Development of Atomic Spectrometry in China].
Xiao, Yuan-fang; Wang, Xiao-hua; Hang, Wei
2015-09-01
As an important part of modern analytical techniques, atomic spectrometry occupies a central position in the analytical field. The development of atomic spectrometry also reflects the continuous reform and innovation of analytical techniques. In the past fifteen years, atomic spectrometry has developed rapidly and been applied widely in many fields in China. This review documents its development and remarkable achievements. It covers several branches of atomic spectrometry, including atomic emission spectrometry (AES), atomic absorption spectrometry (AAS), atomic fluorescence spectrometry (AFS), X-ray fluorescence spectrometry (XRF), and atomic mass spectrometry (AMS). Emphasis is put on innovations in detection methods and their applications in related fields, including environmental samples, biological samples, food and beverage, and geological materials. There is also a brief introduction to the hyphenated techniques utilized in atomic spectrometry. Finally, the prospects of atomic spectrometry in China are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prentice, H. J.; Proud, W. G.
2006-07-28
A technique has been developed to determine experimentally the three-dimensional displacement field on the rear surface of a dynamically deforming plate. The technique combines speckle analysis with stereoscopy, using a modified angular-lens method: this incorporates split-frame photography and a simple method by which the effective lens separation can be adjusted and calibrated in situ. Whilst several analytical models exist to predict deformation in extended or semi-infinite targets, the non-trivial nature of the wave interactions complicates the generation and development of analytical models for targets of finite depth. By interrogating specimens experimentally to acquire three-dimensional strain data points, both analytical and numerical model predictions can be verified more rigorously. The technique is applied to the quasi-static deformation of a rubber sheet and dynamically to mild steel sheets of various thicknesses.
Potvin, Christopher M; Zhou, Hongde
2011-11-01
The objective of this study was to demonstrate the complex matrix effects caused by chemical materials on the analysis of key soluble microbial products (SMP), including proteins, humics, carbohydrates, and polysaccharides, in activated sludge samples. Emphasis was placed on comparison of the commonly used standard curve technique with standard addition (SA), a technique that differs in that the analytical responses are measured for sample solutions spiked with known quantities of analytes. The results showed that using SA greatly improved the compensation for SMP recovery, and thus measurement accuracy, by correcting for matrix effects. Analyte recovery was found to be highly dependent on sample dilution, and changed with extraction technique, storage conditions and sample composition. Storage of sample extracts by freezing changed SMP concentrations dramatically, as did storage at 4°C for as little as 1 d. Copyright © 2011 Elsevier Ltd. All rights reserved.
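For readers unfamiliar with standard addition, the calculation it implies can be sketched in a few lines: fit the instrument response against the spiked amounts and read the original concentration from the x-intercept. The numbers below are synthetic, not from this study:

```python
# Standard addition: spike the sample with known analyte amounts, fit the
# instrument response linearly, and recover the original concentration
# from the x-intercept (intercept / slope). Synthetic numbers throughout.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

spikes = [0.0, 10.0, 20.0, 30.0]       # added analyte (e.g. mg/L)
responses = [0.25, 0.75, 1.25, 1.75]   # measured signal (synthetic)
slope, intercept = linear_fit(spikes, responses)
c0 = intercept / slope                 # original sample concentration
print(round(c0, 2))  # 5.0
```

Because the calibration is performed in the sample matrix itself, the matrix affects the spikes and the native analyte equally, which is exactly how the technique corrects for matrix effects.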
Benhammouda, Brahim; Vazquez-Leal, Hector
2016-01-01
This work presents an analytical solution of some nonlinear delay differential equations (DDEs) with variable delays. Such DDEs are difficult to treat numerically and cannot be solved by existing general-purpose codes. A new method of steps combined with the differential transform method (DTM) is proposed as a powerful tool to solve these DDEs. This method reduces the DDEs to ordinary differential equations that are then solved by the DTM. Furthermore, we show that the solutions can be improved by the Laplace-Padé resummation method. Two examples are presented to show the efficiency of the proposed technique. The main advantage of this technique is that it possesses a simple procedure based on a few straightforward steps and can be combined with any analytical method other than the DTM, like the homotopy perturbation method.
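The method-of-steps idea (advance interval by interval, feeding the known history into the delayed term) can be sketched numerically. A plain Euler integrator stands in for the DTM here, and the constant-delay test equation is illustrative, not one of the paper's variable-delay examples:

```python
# Method of steps for y'(t) = -y(t - 1) with history y(t) = 1 for t <= 0:
# on each unit interval the delayed value is already known, so the DDE is
# an ODE there. Plain Euler is used instead of the differential transform
# method, which the paper combines with the method of steps.

def solve_dde(t_end, h=1e-3):
    delay_steps = int(round(1.0 / h))   # delay tau = 1 in grid steps
    ys = [1.0]                          # y(0); history is y = 1 for t <= 0
    for i in range(int(round(t_end / h))):
        j = i - delay_steps
        y_delayed = 1.0 if j < 0 else ys[j]
        ys.append(ys[-1] - h * y_delayed)
    return ys[-1]

# Exact solution: y(t) = 1 - t on [0, 1], so y(1) = 0;
# y(t) = t**2/2 - 2*t + 1.5 on [1, 2], so y(2) = -0.5.
print(solve_dde(1.0), solve_dde(2.0))
```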
Analytic double product integrals for all-frequency relighting.
Wang, Rui; Pan, Minghao; Chen, Weifeng; Ren, Zhong; Zhou, Kun; Hua, Wei; Bao, Hujun
2013-07-01
This paper presents a new technique for real-time relighting of static scenes with all-frequency shadows from complex lighting and highly specular reflections from spatially varying BRDFs. The key idea is to depict the boundaries of visible regions using piecewise linear functions, and convert the shading computation into double product integrals—the integral of the product of lighting and BRDF on visible regions. By representing lighting and BRDF with spherical Gaussians and approximating their product using Legendre polynomials locally in visible regions, we show that such double product integrals can be evaluated in an analytic form. Given the precomputed visibility, our technique computes the visibility boundaries on the fly at each shading point, and performs the analytic integral to evaluate the shading color. The result is a real-time all-frequency relighting technique for static scenes with dynamic, spatially varying BRDFs, which can generate more accurate shadows than the state-of-the-art real-time PRT methods.
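The analytic step at the heart of this approach can be illustrated in miniature: once both factors of the product are written in an orthogonal polynomial basis, their product integral collapses to a weighted dot product of coefficients. A hedged sketch using the Legendre basis on [-1, 1] (the orthogonality identity only, not the paper's spherical-Gaussian and visibility pipeline):

```python
# On [-1, 1] the Legendre polynomials satisfy ∫ P_m P_n dx = 2/(2n+1) if
# m == n and 0 otherwise, so the integral of a product of two expansions
# is analytic: a weighted dot product of their coefficient vectors.

def product_integral(a, b):
    """∫_{-1}^{1} f·g dx for f = Σ a_k P_k and g = Σ b_k P_k."""
    return sum(ak * bk * 2.0 / (2 * k + 1)
               for k, (ak, bk) in enumerate(zip(a, b)))

# f = 1 + P_1(x) = 1 + x and g = P_1(x) = x:
# ∫(1 + x)·x dx over [-1, 1] = 0 + 2/3
val = product_integral([1.0, 1.0], [0.0, 1.0])
print(val)
```

No numerical quadrature is needed; this is what makes the double product integral cheap enough to evaluate per shading point at run time.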
Study of Chemical Changes in Uranium Oxyfluoride Particles Progress Report March - October 2009
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kips, R; Kristo, M; Hutcheon, I
2009-11-22
Nuclear forensics relies on the analysis of certain sample characteristics to determine the origin and history of a nuclear material. In the specific case of uranium enrichment facilities, it is the release of trace amounts of uranium hexafluoride (UF₆) gas - used for the enrichment of uranium - that leaves a process-characteristic fingerprint. When UF₆ gas interacts with atmospheric moisture, uranium oxyfluoride particles or particle agglomerates are formed with sizes ranging from several microns down to a few tens of nanometers. These particles are routinely collected by safeguards organizations, such as the International Atomic Energy Agency (IAEA), allowing them to verify whether a facility is compliant with its declarations. Spectrometric analysis of uranium particles from UF₆ hydrolysis has revealed the presence of both particles that contain fluorine, and particles that do not. It is therefore assumed that uranium oxyfluoride is unstable, and decomposes to form uranium oxide. Understanding the rate of fluorine loss in uranium oxyfluoride particles, and the parameters that control it, may therefore contribute to placing boundaries on the particle's exposure time in the environment. Expressly for the purpose of this study, we prepared a set of uranium oxyfluoride particles at the Institute for Reference Materials and Measurements (EU-JRC-IRMM) from a static release of UF₆ in a humid atmosphere. The majority of the samples was stored in controlled temperature, humidity and lighting conditions. Single particles were characterized by a suite of micro-analytical techniques, including NanoSIMS, micro-Raman spectrometry (MRS), scanning (SEM) and transmission (TEM) electron microscopy, energy-dispersive X-ray spectrometry (EDX) and focused ion beam (FIB). The small particle size was found to be the main analytical challenge.
The relative amount of fluorine, as well as the particle chemical composition and morphology, were determined at different stages in the ageing process, and immediately after preparation. This report summarizes our most recent findings for each of the analytical techniques listed above, and provides an outlook on what remains to be resolved. Additional spectroscopic and mass spectrometric measurements were carried out at Pacific Northwest National Laboratory, but are not included in this summary.
Analytics for Cyber Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plantenga, Todd; Kolda, Tamara Gibson
2011-06-01
This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.
Characterizing odors from cattle feedlots with different odor techniques
USDA-ARS?s Scientific Manuscript database
Odors from cattle feedlots negatively affect local communities. The purpose of this study was to characterize odors and odorants using different odor sampling techniques. Odors were characterized with field olfactometers (Nasal Ranger®), sensory techniques (GC-O) and analytical techniques (sorbent t...
Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.
Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen
2015-10-01
Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
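The coupled non-linear differential equations of the bivalent analyte model can be sketched as follows. This is a generic textbook form of the model with invented rate constants and analyte concentration, not the authors' identification procedure, and a forward-Euler loop stands in for a proper ODE solver:

```python
# Bivalent analyte model: A + B <-> AB (rates ka1, kd1), then
# AB + B <-> AB2 (rates ka2, kd2), with free surface ligand
# B = Bmax - AB - 2*AB2 (each AB2 complex occupies two sites).
# All parameter values below are illustrative placeholders.

def simulate_bivalent(A, Bmax, ka1, kd1, ka2, kd2, t_end, h=1e-3):
    ab, ab2 = 0.0, 0.0
    for _ in range(int(round(t_end / h))):
        b = Bmax - ab - 2.0 * ab2          # free ligand sites
        d_ab = ka1 * A * b - kd1 * ab - ka2 * ab * b + kd2 * ab2
        d_ab2 = ka2 * ab * b - kd2 * ab2
        ab += h * d_ab
        ab2 += h * d_ab2
    return ab, ab2

ab, ab2 = simulate_bivalent(A=1e-7, Bmax=1.0, ka1=1e5, kd1=1e-2,
                            ka2=1e-3, kd2=1e-4, t_end=100.0)
print(ab, ab2)
```

The simulated sensorgram would be proportional to ab + ab2 (or a weighted sum); the biphasic character that complicates model identification comes from the two coupled complexes relaxing on different timescales.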
Analytical Electrochemistry: Theory and Instrumentation of Dynamic Techniques.
ERIC Educational Resources Information Center
Johnson, Dennis C.
1980-01-01
Emphasizes trends in the development of six topics concerning analytical electrochemistry, including books and reviews (34 references cited), mass transfer (59), charge transfer (25), surface effects (33), homogeneous reactions (21), and instrumentation (31). (CS)
NASA Technical Reports Server (NTRS)
Weinberg, M. C.; Oronato, P. I.; Uhlmann, D. R.
1984-01-01
Analytical expression used to calculate time it takes for stationary bubbles of oxygen and carbon dioxide to dissolve from glass melt. Technique based on analytical expression for bubble radius as function of time, with consequences of surface tension included.
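As a hedged illustration only: the parabolic shrinkage law below is a standard diffusion-controlled approximation with surface tension neglected, not the paper's expression, and the parameter values are invented.

```python
# Parabolic dissolution law for a stationary, diffusion-controlled bubble
# (surface tension neglected): r(t)**2 = r0**2 - k*t, so the bubble
# disappears at t = r0**2 / k. r0 and k are illustrative values.

def dissolution_time(r0, k):
    """Time at which the bubble radius reaches zero."""
    return r0 ** 2 / k

def radius(t, r0, k):
    """Bubble radius at time t under the parabolic law (clamped at zero)."""
    return max(0.0, r0 ** 2 - k * t) ** 0.5

r0, k = 1e-4, 1e-11   # a 100-micron bubble with an assumed rate constant
t_d = dissolution_time(r0, k)
print(t_d)            # dissolution time in seconds for these assumed values
```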
ERIC Educational Resources Information Center
Fulghum, J. E.; And Others
1989-01-01
This review is divided into the following analytical methods: ion spectroscopy, electron spectroscopy, scanning tunneling microscopy, atomic force microscopy, optical spectroscopy, desorption techniques, and X-ray techniques. (MVL)
Mitri, F G
2006-07-01
In this paper, analytical equations are derived for the time-averaged radiation force induced by progressive and standing acoustic waves incident on elastic spherical shells covered with a layer of viscoelastic and sound-absorbing material. The fluid surrounding the shells is considered compressible and nonviscous. The incident field is assumed to be moderate so that the scattered field from the shells is taken to linear approximation. The analytical results are illustrated by means of a numerical example in which the radiation force function curves are displayed, with particular emphasis on the coating thickness and the content of the hollow region of the shells. The fluid-loading on the radiation force function curves is analysed as well. This study attempts to generalize the various treatments of radiation force due to both progressive and standing waves on spherically-shaped structures immersed in ideal fluids. The results show that various ways can be effectively used for damping resonance peaks, such as by changing the fluid in the interior hollow region of the shells or by changing the coating thickness.
Emura, Takeshi; Nakatochi, Masahiro; Matsui, Shigeyuki; Michimae, Hirofumi; Rondeau, Virginie
2017-01-01
Developing a personalized risk prediction model of death is fundamental for improving patient care and touches on the realm of personalized medicine. The increasing availability of genomic information and large-scale meta-analytic data sets for clinicians has motivated the extension of traditional survival prediction based on the Cox proportional hazards model. The aim of our paper is to develop a personalized risk prediction formula for death according to genetic factors and dynamic tumour progression status based on meta-analytic data. To this end, we extend the existing joint frailty-copula model to a model allowing for high-dimensional genetic factors. In addition, we propose a dynamic prediction formula to predict death given tumour progression events possibly occurring after treatment or surgery. For clinical use, we implement the computation software of the prediction formula in the joint.Cox R package. We also develop a tool to validate the performance of the prediction formula by assessing the prediction error. We illustrate the method with the meta-analysis of individual patient data on ovarian cancer patients.
Advances in developing rapid, reliable and portable detection systems for alcohol.
Thungon, Phurpa Dema; Kakoti, Ankana; Ngashangva, Lightson; Goswami, Pranab
2017-11-15
Development of portable, reliable, sensitive, simple, and inexpensive detection system for alcohol has been an instinctive demand not only in traditional brewing, pharmaceutical, food and clinical industries but also in rapidly growing alcohol based fuel industries. Highly sensitive, selective, and reliable alcohol detections are currently amenable typically through the sophisticated instrument based analyses confined mostly to the state-of-art analytical laboratory facilities. With the growing demand of rapid and reliable alcohol detection systems, an all-round attempt has been made over the past decade encompassing various disciplines from basic and engineering sciences. Of late, the research for developing small-scale portable alcohol detection system has been accelerated with the advent of emerging miniaturization techniques, advanced materials and sensing platforms such as lab-on-chip, lab-on-CD, lab-on-paper etc. With these new inter-disciplinary approaches along with the support from the parallel knowledge growth on rapid detection systems being pursued for various targets, the progress on translating the proof-of-concepts to commercially viable and environment friendly portable alcohol detection systems is gaining pace. Here, we summarize the progress made over the years on the alcohol detection systems, with a focus on recent advancement towards developing portable, simple and efficient alcohol sensors. Copyright © 2017 Elsevier B.V. All rights reserved.
Coping strategies to manage stress related to vision loss and fluctuations in retinitis pigmentosa
Bittner, Ava K.; Edwards, Lori; George, Maureen
2010-01-01
Background Vision loss in retinitis pigmentosa (RP) is a slowly progressive and inexorable threat to patients’ independence. It is not surprising that RP patients, many of whom are young when diagnosed, are at high risk for stress related to their vision loss. To address these issues, eye care providers need to be aware of what coping strategies RP patients use to successfully manage their vision loss. Methods We held focus groups with eight legally blind RP patients to help us better understand how they cope with the stress that is generated from their progressive vision loss and fluctuations in vision. Focus group sessions were audiotaped and resulting notes were coded using conventional qualitative analytic techniques. Results Two themes were identified: 1) “kicking and screaming” captured the ways in which RP patients fight to maintain their independence in the face of worsening vision; and 2) “there are so many worse things” describes how RP patients keep their vision loss in perspective. These RP patients demonstrated high levels of resiliency. In particular, they often used humor as a coping mechanism. Conclusions Understanding the ways in which RP patients manage their gradual, impending vision loss may lead to improved quality of care for this patient population. PMID:20591747
2014-01-01
Background Inflammatory mediators can serve as biomarkers for the monitoring of the disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the level of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified using the LI-COR (R) detection system (fluorescence based) rather than chemiluminescence and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity up to 17 pg/ml and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily to the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints in sample sizes and/or budget. PMID:25022797
Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laskin, Julia; Lanekoff, Ingela
2015-11-13
Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species.
Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.
Meinertz, J.R.; Stehly, G.R.; Hubert, T.D.; Bernardy, J.A.
1999-01-01
A method was developed for determining benzocaine and N-acetylbenzocaine concentrations in fillet tissue of rainbow trout. The method involves extracting the analytes with acetonitrile, removing lipids or hydrophobic compounds from the extract with hexane, and providing additional clean-up with solid-phase extraction techniques. Analyte concentrations are determined using reversed-phase high-performance liquid chromatographic techniques with an isocratic mobile phase and UV detection. The accuracy (range, 92 to 121%), precision (R.S.D., <14%), and sensitivity (method quantitation limit, <24 ng/g) for each analyte indicate the usefulness of this method for studies characterizing the depletion of benzocaine residues from fish exposed to benzocaine. Copyright (C) 1999.
Recommendations for accreditation of laboratories in molecular biology of hematologic malignancies.
Flandrin-Gresta, Pascale; Cornillet, Pascale; Hayette, Sandrine; Gachard, Nathalie; Tondeur, Sylvie; Mauté, Carole; Cayuela, Jean-Michel
2015-01-01
Over recent years, the development of molecular biology techniques has improved the diagnosis and follow-up of hematological diseases. Consequently, these techniques are widely used in the biological screening of these diseases; therefore, hemato-oncology molecular diagnostics laboratories must be actively involved in the accreditation process according to the ISO 15189 standard. The French group of molecular biologists (GBMHM) provides requirements for the implementation of quality assurance in medical molecular laboratories. This guideline states the recommendations for pre-analytical, analytical (method validation procedures, quality controls, reagents), and post-analytical conditions. In addition, we state herein a strategy for internal quality control management. These recommendations will be regularly updated.
Synthesis of Feedback Controller for Chaotic Systems by Means of Evolutionary Techniques
NASA Astrophysics Data System (ADS)
Senkerik, Roman; Oplatkova, Zuzana; Zelinka, Ivan; Davendra, Donald; Jasek, Roman
2011-06-01
This research deals with the synthesis of control laws for three selected discrete chaotic systems by means of analytic programming. The novelty of the approach is that a tool for symbolic regression, analytic programming, is used for this kind of difficult problem. The paper describes analytic programming, the chaotic systems, and the cost function used. For experimentation, the Self-Organizing Migrating Algorithm (SOMA) with analytic programming was used.
Elements of analytic style: Bion's clinical seminars.
Ogden, Thomas H
2007-10-01
The author finds that the idea of analytic style better describes significant aspects of the way he practices psychoanalysis than does the notion of analytic technique. The latter consists largely of principles of practice developed by previous generations of analysts. By contrast, the concept of analytic style, though it presupposes the analyst's thorough knowledge of analytic theory and technique, emphasizes (1) the analyst's use of his unique personality as reflected in his individual ways of thinking, listening, and speaking, his own particular use of metaphor, humor, irony, and so on; (2) the analyst's drawing on his personal experience, for example, as an analyst, an analysand, a parent, a child, a spouse, a teacher, and a student; (3) the analyst's capacity to think in a way that draws on, but is independent of, the ideas of his colleagues, his teachers, his analyst, and his analytic ancestors; and (4) the responsibility of the analyst to invent psychoanalysis freshly for each patient. Close readings of three of Bion's 'Clinical seminars' are presented in order to articulate some of the elements of Bion's analytic style. Bion's style is not presented as a model for others to emulate or, worse yet, imitate; rather, it is described in an effort to help the reader consider from a different vantage point (provided by the concept of analytic style) the way in which he, the reader, practices psychoanalysis.
Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A
2017-05-10
Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states and no established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.
Nano-Aptasensing in Mycotoxin Analysis: Recent Updates and Progress
Rhouati, Amina; Bulbul, Gonca; Hayat, Akhtar; Marty, Jean Louis
2017-01-01
Recent years have witnessed an overwhelming integration of nanomaterials in the fabrication of biosensors. Nanomaterials have been incorporated with the objective of achieving better analytical figures of merit in terms of limit of detection, linear range, assay stability, production cost, etc. Nanomaterials can act as immobilization supports, signal amplifiers, mediators, and artificial enzyme labels in the construction of aptasensors. In this work, we aim to review recent progress in mycotoxin analysis. This review emphasizes the function of the different nanomaterials in aptasensor architectures. We subsequently relate their features to the analytical performance of the given aptasensor towards mycotoxin monitoring. In the same context, a critical analysis of each nano-aptasensing design and its level of success will be discussed. Finally, current challenges in nano-aptasensing design for mycotoxin analysis will be highlighted. PMID:29143760
Ariyama, Kaoru; Kadokura, Masashi; Suzuki, Tadanao
2008-01-01
Techniques to determine the geographic origin of foods have been developed for various agricultural and fishery products, and they have used various principles. Some of these techniques are already in use for checking the authenticity of the labeling. Many are based on multielement analysis and chemometrics. We have developed such a technique to determine the geographic origin of onions (Allium cepa L.). This technique, which determines whether an onion is from outside Japan, is designed for onions labeled as having a geographic origin of Hokkaido, Hyogo, or Saga, the main onion production areas in Japan. However, estimations of discrimination errors for this technique have not been fully conducted; they have been limited to those for discrimination models and do not include analytical errors. Interlaboratory studies were conducted to estimate the analytical errors of the technique. Four collaborators each determined 11 elements (Na, Mg, P, Mn, Zn, Rb, Sr, Mo, Cd, Cs, and Ba) in 4 test materials of fresh and dried onions. Discrimination errors in this technique were estimated by summing (1) individual differences within lots, (2) variations between lots from the same production area, and (3) analytical errors. The discrimination errors for onions from Hokkaido, Hyogo, and Saga were estimated to be 2.3, 9.5, and 8.0%, respectively. Those for onions from abroad in determinations targeting Hokkaido, Hyogo, and Saga were estimated to be 28.2, 21.6, and 21.9%, respectively.
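The error budget described above sums three contributions: individual differences within lots, variation between lots from the same production area, and analytical error. A minimal sketch of how such independent error sources combine, assuming independent, normally distributed components summed as variances (the function name and the sample values are our illustration, not the paper's data):

```python
import math

def combined_error(within_lot_sd, between_lot_sd, analytical_sd):
    """Combine independent error sources by summing their variances
    and taking the square root. Illustrative only: assumes the three
    components are independent and normally distributed."""
    return math.sqrt(within_lot_sd**2 + between_lot_sd**2 + analytical_sd**2)

# Hypothetical relative standard deviations (%) for one element
total = combined_error(1.5, 1.2, 0.9)
```

The combined figure is dominated by the largest component, which is why the abstract stresses that analytical errors must be estimated (here via interlaboratory studies) before discrimination errors can be quoted honestly.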
Technical progress in silicon sheet growth under DOE/JPL FSA program, 1975-1986
NASA Technical Reports Server (NTRS)
Kalejs, J. P.
1986-01-01
The technical progress made in the Silicon Sheet Growth Program during its 11 years was reviewed. At present, in 1986, only two of the original 9 techniques have survived to the start-up, pilot-plant stage in industry. These two techniques are the edge-defined, film-fed growth (EFG) technique that produces closed shape polygons, and the WEB dendritic technique that produces single ribbons. Both the status and future concerns of the EFG and WEB techniques were discussed.
A Bayesian Machine Learning Model for Estimating Building Occupancy from Open Source Data
Stewart, Robert N.; Urban, Marie L.; Duchscherer, Samantha E.; ...
2016-01-01
Understanding building occupancy is critical to a wide array of applications including natural hazards loss analysis, green building technologies, and population distribution modeling. Due to the expense of directly monitoring buildings, scientists also rely on a wide and disparate array of ancillary and open source information, including subject matter expertise, survey data, and remote sensing information. These data are fused using data harmonization methods, which refer to a loose collection of formal and informal techniques for fusing data together to create viable content for building occupancy estimation. In this paper, we add to the current state of the art by introducing the Population Data Tables (PDT), a Bayesian model and informatics system for systematically arranging data and harmonization techniques into a consistent, transparent, knowledge learning framework that retains in the final estimation the uncertainty emerging from data, expert judgment, and model parameterization. PDT probabilistically estimates ambient occupancy in units of people/1,000 ft² for over 50 building types at the national and sub-national level with the goal of providing global coverage. The challenge of global coverage led to the development of an interdisciplinary geospatial informatics system tool that provides the framework for capturing, storing, and managing open source data, handling subject matter expertise, and carrying out Bayesian analytics, as well as visualizing and exporting occupancy estimation results. We present the PDT project, situate the work within the larger community, and report on the progress of this multi-year project.
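As a rough illustration of the kind of Bayesian fusion PDT performs, the sketch below updates a prior occupancy estimate with a single observation using a conjugate normal model. This is a drastic simplification under assumptions of our own (Gaussian prior and likelihood with known variances); the actual PDT system fuses many heterogeneous sources and building types:

```python
def posterior_normal(prior_mean, prior_var, obs, obs_var):
    """Conjugate normal update: fuse a prior occupancy estimate
    (e.g., expert judgment) with one observation (e.g., a survey),
    both in people/1,000 ft^2. Hypothetical simplification, not the
    actual PDT model. Returns the posterior mean and variance."""
    w = prior_var / (prior_var + obs_var)          # weight on the observation
    mean = prior_mean + w * (obs - prior_mean)     # precision-weighted mean
    var = prior_var * obs_var / (prior_var + obs_var)
    return mean, var

# Expert prior: 2.0 people/1,000 ft^2 (var 0.5); survey: 3.0 (var 0.25)
mean, var = posterior_normal(2.0, 0.5, 3.0, 0.25)
```

The posterior variance is always smaller than either input variance, which mirrors the abstract's point that the framework retains, and progressively reduces, uncertainty from data and expert judgment.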
Mohammadi, Saeed; Busa, Lori Shayne Alamo; Maeki, Masatoshi; Mohamadi, Reza M; Ishida, Akihiko; Tani, Hirofumi; Tokeshi, Manabu
2016-11-01
A novel washing technique for microfluidic paper-based analytical devices (μPADs) that is based on the spontaneous capillary action of paper and eliminates unbound antigen and antibody in a sandwich immunoassay is reported. Liquids can flow through a porous medium (such as paper) in the absence of external pressure as a result of capillary action. Uniform results were achieved when washing a paper substrate in a PDMS holder integrated with a cartridge absorber acting as a porous medium. Our study demonstrated that applying this washing technique would allow μPADs to become the least expensive microfluidic device platform with high reproducibility and sensitivity. In a model μPAD assay that utilized this novel washing technique, C-reactive protein (CRP) was detected with a limit of detection (LOD) of 5 μg mL⁻¹.
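For context, a common convention for estimating a limit of detection in analytical chemistry is the 3-sigma rule, LOD = 3 × (standard deviation of the blank signal) / (calibration slope). The abstract does not state how its 5 μg mL⁻¹ LOD was derived; the sketch below simply illustrates the convention with hypothetical numbers:

```python
def limit_of_detection(blank_sd, slope):
    """3-sigma LOD estimate: the analyte concentration whose signal
    exceeds the blank by three standard deviations. Generic convention,
    not the paper's stated procedure; inputs are hypothetical."""
    return 3.0 * blank_sd / slope

# Hypothetical blank noise (a.u.) and calibration slope (a.u. per ug/mL)
lod = limit_of_detection(0.01, 0.006)  # concentration in ug/mL
```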
Asadollahi, Aziz; Khazanovich, Lev
2018-04-11
The emergence of ultrasonic dry point contact (DPC) transducers that emit horizontal shear waves has enabled efficient collection of high-quality data in the context of a nondestructive evaluation of concrete structures. This offers an opportunity to improve the quality of evaluation by adapting advanced imaging techniques. Reverse time migration (RTM) is a simulation-based reconstruction technique that offers advantages over conventional methods, such as the synthetic aperture focusing technique. RTM is capable of imaging boundaries and interfaces with steep slopes and the bottom boundaries of inclusions and defects. However, this imaging technique requires a massive amount of memory and its computation cost is high. In this study, both bottlenecks of the RTM are resolved when shear transducers are used for data acquisition. An analytical approach was developed to obtain the source and receiver wavefields needed for imaging using reverse time migration. It is shown that the proposed analytical approach not only eliminates the high memory demand, but also drastically reduces the computation time from days to minutes. Copyright © 2018 Elsevier B.V. All rights reserved.
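The core of reverse time migration is an imaging condition that cross-correlates the forward-propagated source wavefield with the back-propagated receiver wavefield at every image point. Below is a textbook zero-lag cross-correlation sketch; note that the paper's contribution is to obtain these wavefields analytically rather than storing simulated ones, which this sketch does not attempt to reproduce:

```python
import numpy as np

def rtm_image(source_wf, receiver_wf):
    """Zero-lag cross-correlation imaging condition for RTM:
    image[x] = sum over t of S[t, x] * R[t, x].

    source_wf, receiver_wf: arrays of shape (nt, nx) holding the
    forward-propagated source wavefield and the back-propagated
    receiver wavefield at each time step and image point."""
    return np.sum(source_wf * receiver_wf, axis=0)

# Toy wavefields: 100 time samples over 50 image points
nt, nx = 100, 50
S = np.random.rand(nt, nx)
R = np.random.rand(nt, nx)
img = rtm_image(S, R)  # one image value per point, shape (nx,)
```

The memory bottleneck the abstract mentions comes from having to hold (or recompute) the full source wavefield over all time steps for this correlation, which is exactly what an analytical wavefield expression avoids.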
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
Effect of different analyte diffusion/adsorption protocols on SERS signals
NASA Astrophysics Data System (ADS)
Li, Ruoping; Petschek, Rolfe G.; Han, Junhe; Huang, Mingju
2018-07-01
The effect of different analyte diffusion/adsorption protocols, an aspect often overlooked in the surface-enhanced Raman scattering (SERS) technique, was studied. Three protocols were compared: the highly concentrated dilution (HCD) protocol, the half-half dilution (HHD) protocol, and the layered adsorption (LA) protocol. The SERS substrates were monolayer films of 80 nm Ag nanoparticles (NPs) modified with polyvinylpyrrolidone. The diffusion/adsorption mechanisms were modelled using the diffusion equation, and the electromagnetic field distribution of two adjacent Ag NPs was simulated by the finite-difference time-domain (FDTD) method. All experimental data and theoretical analysis suggest that different diffusion/adsorption behaviour of analytes causes different SERS signal enhancements. The HHD protocol produced the most uniform and reproducible samples, and the corresponding signal intensity of the analyte was the strongest. This study will help to understand and promote the use of the SERS technique in quantitative analysis.
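The diffusion modelling mentioned above can be illustrated with a generic explicit finite-difference scheme for the 1D diffusion equation dc/dt = D d²c/dx². The grid, boundary conditions, and parameters here are illustrative assumptions of ours, not values from the study:

```python
import numpy as np

def diffuse_1d(c, D, dx, dt, steps):
    """Explicit finite-difference integration of the 1D diffusion
    equation with zero-flux (mirror) boundaries. Generic sketch of
    the kind of diffusion modelling described; not the paper's code."""
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme is unstable for r > 0.5"
    c = np.asarray(c, dtype=float).copy()
    for _ in range(steps):
        # update interior points from the discrete Laplacian
        c[1:-1] = c[1:-1] + r * (c[2:] - 2 * c[1:-1] + c[:-2])
        # zero-flux boundaries: mirror the nearest interior values
        c[0], c[-1] = c[1], c[-2]
    return c

# A point source of analyte spreading across an 11-node grid
profile = diffuse_1d(np.eye(11)[5], D=1.0, dx=1.0, dt=0.25, steps=10)
```

Running more steps flattens the concentration profile, which is the qualitative behaviour that distinguishes the HCD, HHD, and LA protocols' delivery of analyte to the nanoparticle film.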
Freitag, Ruth; Hilbrig, Frank
2007-07-01
CEC is defined as an analytical method, where the analytes are separated on a chromatographic column in the presence of an applied voltage. The separation of charged analytes in CEC is complex, since chromatographic interaction, electroosmosis and electrophoresis contribute to the experimentally observed behavior. The putative contribution of effects such as surface electrodiffusion has been suggested. A sound theoretical treatment incorporating all effects is currently not available. The question of whether the different effects contribute in an independent or an interdependent manner is still under discussion. In this contribution, the state-of-the-art in the theoretical description of the individual contributions as well as models for the retention behavior and in particular possible dimensionless 'retention factors' is discussed, together with the experimental database for the separation of charged analytes, in particular proteins and peptides, by CEC and related techniques.
Development and Applications of Liquid Sample Desorption Electrospray Ionization Mass Spectrometry
NASA Astrophysics Data System (ADS)
Zheng, Qiuling; Chen, Hao
2016-06-01
Desorption electrospray ionization mass spectrometry (DESI-MS) is a recent advance in the field of analytical chemistry. This review surveys the development of liquid sample DESI-MS (LS-DESI-MS), a variant form of DESI-MS that focuses on fast analysis of liquid samples, and its novel analytical applications in bioanalysis, proteomics, and reaction kinetics. Due to the capability of directly ionizing liquid samples, liquid sample DESI (LS-DESI) has been successfully used to couple MS with various analytical techniques, such as microfluidics, microextraction, electrochemistry, and chromatography. This review also covers these hyphenated techniques. In addition, several closely related ionization methods, including transmission mode DESI, thermally assisted DESI, and continuous flow-extractive DESI, are briefly discussed. The capabilities of LS-DESI extend and/or complement the utilities of traditional DESI and electrospray ionization and will find extensive and valuable analytical application in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Heinrich, R.R.; Graczyk, D.G.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for fiscal year 1988 (October 1987 through September 1988). The Analytical Chemistry Laboratory is a full-cost recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Heinrich, R.R.; Graczyk, D.G.
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1989 (October 1988 through September 1989). The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. In addition, the ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems, from routine standard analyses to unique problems that require significant development of methods and techniques.
Eleventh international symposium on radiopharmaceutical chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This document contains abstracts of papers which were presented at the Eleventh International Symposium on Radiopharmaceutical Chemistry. Sessions included: radiopharmaceuticals for the dopaminergic system, strategies for the production and use of labelled reactive small molecules, radiopharmaceuticals for measuring metabolism, radiopharmaceuticals for the serotonin and sigma receptor systems, labelled probes for molecular biology applications, radiopharmaceuticals for receptor systems, radiopharmaceuticals utilizing coordination chemistry, radiolabelled antibodies, radiolabelling methods for small molecules, and analytical techniques in radiopharmaceutical chemistry.
Marine geodetic control for geoidal profile mapping across the Puerto Rican Trench
NASA Technical Reports Server (NTRS)
Fubara, D. M.; Mourad, A. G.
1975-01-01
A marine geodetic control was established for the northern end of the geoidal profile mapping experiment across the Puerto Rican Trench by determining the three-dimensional geodetic coordinates of the four ocean-bottom mounted acoustic transponders. The data reduction techniques employed and analytical processes involved are described. Before applying the analytical techniques to the field data, they were tested with simulated data and proven to be effective in theory as well as in practice.
Analytical cytology applied to detection of induced cytogenetic abnormalities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, J.W.; Lucas, J.; Straume, T.
1987-08-06
Radiation-induced biological damage results in the formation of a broad spectrum of cytogenetic changes such as translocations, dicentrics, ring chromosomes, and acentric fragments. A battery of analytical cytologic techniques is now emerging that promises to significantly improve the precision and ease with which these radiation-induced cytogenetic changes can be quantified. This report summarizes techniques to facilitate analysis of the frequency of occurrence of structural and numerical aberrations in control and irradiated human cells. 14 refs., 2 figs.
Analytical methods for determination of mycotoxins: a review.
Turner, Nicholas W; Subrahmanyam, Sreenath; Piletsky, Sergey A
2009-01-26
Mycotoxins are small (MW approximately 700), toxic chemical products formed as secondary metabolites by a few fungal species that readily colonise crops and contaminate them with toxins in the field or after harvest. Ochratoxins and aflatoxins are mycotoxins of major significance, and hence there has been significant research on the broad range of analytical and detection techniques that could be useful and practical. Due to the variety of structures of these toxins, it is impossible to use one standard technique for analysis and/or detection. Practical requirements for high-sensitivity analysis and the need for a specialist laboratory setting create challenges for routine analysis. Several existing analytical techniques, which offer flexible and broad-based methods of analysis and in some cases detection, are discussed in this manuscript. A number of methods are used, many of them lab-based, but to our knowledge no single technique stands out above the rest, although analytical liquid chromatography, commonly linked with mass spectrometry, is likely to be the most popular. This review discusses (a) sample pre-treatment methods such as liquid-liquid extraction (LLE), supercritical fluid extraction (SFE), and solid phase extraction (SPE); (b) separation methods such as thin-layer chromatography (TLC), high performance liquid chromatography (HPLC), gas chromatography (GC), and capillary electrophoresis (CE); and (c) others such as ELISA. Current trends, advantages and disadvantages, and future prospects of these methods are also discussed.
[Problems of food authenticity].
Czerwiecki, Ludwik
2004-01-01
In this review, several data concerning food authenticity are presented, and typical examples of food adulteration are described. The best known are adulteration of vegetable and fruit products, wine, honeys, olive oil, etc. Modern analytical techniques for the detection of food adulteration are discussed. Among physicochemical methods, isotopic techniques (SCIRA, IRMS, SNIF-NMR) are cited. The main spectral methods are ICP-AES, PyMS, FTIR, and NIR. Chromatographic techniques (GC, HPLC, HPAEC, HPTLC) with several kinds of detectors are described, and the ELISA and PCR techniques are mentioned as well. The role of chemometrics as a way of processing analytical data is highlighted. Finally, the need for more rigorous food control is pointed out, to support all activities in the fight against fraud in the food industry.
The forensic validity of visual analytics
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.
2008-01-01
The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues arise with digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g., drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics as they apply to visual analytics, and to identify how visual analytics techniques fit into the digital forensics analysis process, how visual analytics techniques can improve the legal admissibility of digital data, and what research is needed to further improve this process.
The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer and the discussions are not meant to be inclusive of all differences in laws between states and countries.
Karayannis, Miltiades I; Efstathiou, Constantinos E
2012-12-15
In this review, the history of chemistry, and specifically the history and the significant steps in the evolution of analytical chemistry, is presented. In chronological time spans covering the ancient world, the middle ages, the 19th century, and the three evolutionary periods from the verge of the 19th century to contemporary times, information is given on the progress of chemistry and analytical chemistry. During this period, analytical chemistry moved gradually from its purely empirical nature to more rational scientific activities, transforming itself into an autonomous branch of chemistry and a separate discipline. It is also shown that analytical chemistry moved gradually from exclusively serving chemical science towards serving the environment, health, law, almost all areas of science and technology, and society as a whole. Some recommendations are also directed to analytical chemistry educators concerning the indispensable nature of knowledge of classical analytical chemistry and the associated laboratory exercises, and to analysts in general concerning why it is important to use chemical knowledge to make measurements on problems of everyday life. Copyright © 2012 Elsevier B.V. All rights reserved.
Safina, Gulnara
2012-01-27
Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes, which makes these compounds important targets to detect, monitor and identify. Identifying the carbohydrate content of their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and various labeling agents. Surface plasmon resonance (SPR) has been intensively developed over the last two decades and has received increasing attention for applications ranging from real-time monitoring of affinity binding to biosensors. SPR does not require any labels and is capable of directly measuring biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern instrumental analytical techniques with SPR in terms of their capability to detect carbohydrates and their conjugates with proteins and lipids, and to study carbohydrate-specific binding. A few selected examples of SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay
2003-09-01
The research center established by the Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigating and advancing complementary analytical techniques for sensing explosives and explosive-related compounds as they occur in the environment. The techniques under investigation include terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring-Down Spectroscopy (CRDS), and Resonance-Enhanced Multiphoton Ionization (REMPI). This suite encompasses a diversity of sensing approaches that can be applied to detecting explosives in condensed phases, such as species adsorbed in soil, or to vapor-phase detection above the source. Some techniques allow remote detection, while others offer highly specific and sensitive analysis capabilities. The program addresses a range of fundamental technical issues associated with trace detection of explosive-related compounds using these techniques. For example, while both LIBS and THz can carry out remote analysis of condensed-phase analytes at distances in excess of several meters, the sensitivities of these techniques to surface-adsorbed explosive-related compounds are not currently known. In their current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances these techniques require, including signature identification for explosive-related compounds and interferents and trace analyte extraction. Later program tasks will explore the simultaneous application of two or more techniques to assess the benefits of sensor fusion.
Redígolo, M M; Sato, I M; Metairon, S; Zamboni, C B
2016-04-01
Several diseases can be diagnosed by observing variations in the concentration of specific elements in body fluids. In this study, the concentrations of inorganic elements were determined in blood samples from dystrophic (Dmd(mdx)/J) and C57BL/6J (control group) mouse strains. The results obtained by Energy Dispersive X-ray Fluorescence (EDXRF) were compared with the Neutron Activation Analysis (NAA) technique. Both analytical techniques proved to be appropriate and complementary, offering a new contribution to veterinary medicine as well as detailed knowledge of this pathology. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Devivar, Rodrigo
2014-01-01
The performance of a material is greatly influenced by its thermal and chemical properties. Analytical pyrolysis, when coupled to a GC-MS system, is a powerful technique that can unlock the thermal and chemical properties of almost any substance and provide vital information. At NASA, we depend on precise thermal analysis instrumentation for understanding aerospace travel. Our analytical techniques allow us to test materials in the laboratory prior to an actual field test; whether the field test takes place miles up in the sky or miles underground, the properties of any material involved must be fully studied and understood in the laboratory.
A Spectrophotometric Study of the Permanganate-Oxalate Reaction: An Analytical Laboratory Experiment
ERIC Educational Resources Information Center
Kalbus, Gene E.; Lieu, Van T.; Kalbus, Lee H.
2004-01-01
The spectrophotometric method assists in the study of the potassium permanganate-oxalate reaction. Basic analytical techniques and rules are implemented in the experiment, which can also be extended to examine other compounds oxidized by permanganate.
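Spectrophotometric experiments such as this one rest on the Beer-Lambert law, A = εlc, which relates measured absorbance to analyte concentration. A minimal sketch of that conversion is shown below; the molar absorptivity and the absorbance readings are illustrative example values, not data from the paper.

```python
# Beer-Lambert law: A = epsilon * l * c, so c = A / (epsilon * l).
# Following the decay of permanganate absorbance as oxalate consumes MnO4-.
# EPSILON and the readings below are hypothetical illustrative values.

EPSILON = 2400.0   # molar absorptivity of MnO4- near 525 nm (L mol^-1 cm^-1), approximate
PATH_LENGTH = 1.0  # cuvette path length in cm

def concentration(absorbance, epsilon=EPSILON, path=PATH_LENGTH):
    """Convert an absorbance reading to molar concentration via Beer-Lambert."""
    return absorbance / (epsilon * path)

# Absorbance readings taken at intervals as the reaction consumes permanganate
readings = [0.96, 0.72, 0.48, 0.24]
concs = [concentration(a) for a in readings]
print(concs[0])  # 0.0004, i.e. 4e-4 M at the first reading
```

Plotting such concentrations against time (or ln c, or 1/c) is the standard way to extract the reaction order and rate constant from this kind of kinetics data.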