Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment
ERIC Educational Resources Information Center
Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James
2010-01-01
The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…
Modern Instrumental Methods in Forensic Toxicology
Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.
2009-01-01
This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968
ERIC Educational Resources Information Center
Toh, Chee-Seng
2007-01-01
A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
Impact of the macroeconomic factors on university budgeting in the US and Russia
NASA Astrophysics Data System (ADS)
Bogomolova, Arina; Balk, Igor; Ivachenko, Natalya; Temkin, Anatoly
2017-10-01
This paper discusses the impact of macroeconomic factors on university budgeting. Modern developments in data science and machine learning have made it possible to utilise automated techniques to address problems of humankind ranging from genetic engineering and particle physics to sociology and economics. This paper is a first step toward a robust toolkit that will help universities withstand macroeconomic challenges using modern predictive analytics techniques.
Development of Impurity Profiling Methods Using Modern Analytical Techniques.
Ramachandra, Bondigalla
2017-01-02
This review gives a brief introduction to process- and product-related impurities and emphasizes the development of novel analytical methods for their determination. It describes the application of modern analytical techniques, particularly ultra-performance liquid chromatography (UPLC), liquid chromatography-mass spectrometry (LC-MS), high-resolution mass spectrometry (HRMS), gas chromatography-mass spectrometry (GC-MS) and high-performance thin layer chromatography (HPTLC). In addition, the application of nuclear magnetic resonance (NMR) spectroscopy for the characterization of impurities and degradation products is discussed. The significance of the quality, efficacy and safety of drug substances/products is discussed, including the sources and kinds of impurities, adverse effects caused by their presence, quality control of impurities, the necessity of developing impurity profiling methods, identification of impurities and regulatory aspects. Other important aspects discussed are forced degradation studies and the development of stability-indicating assay methods.
Challenges in Modern Anti-Doping Analytical Science.
Ayotte, Christiane; Miller, John; Thevis, Mario
2017-01-01
The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require a substantial investment from the laboratories and the recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may in the future result in a change in the strategy applied by the World Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.
Looking ahead in systems engineering
NASA Technical Reports Server (NTRS)
Feigenbaum, Donald S.
1966-01-01
Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.
A New Approach to Business Writing.
ERIC Educational Resources Information Center
Egan, Michael
1998-01-01
Explains how business writing can be taught using examples from modern literature and the analytical tools of literary criticism. Uses Michener, Hemingway, Faulkner, and Steinbeck to illustrate techniques. (SK)
Mirski, Tomasz; Bartoszcze, Michał; Bielawska-Drózd, Agata; Cieślik, Piotr; Michalski, Aleksander J; Niemcewicz, Marcin; Kocik, Janusz; Chomiczewski, Krzysztof
2014-01-01
Modern threats of bioterrorism force the need to develop methods for rapid and accurate identification of dangerous biological agents. Currently, there are many types of methods used in this field of study that are based on immunological or genetic techniques, or constitute a combination of both (immuno-genetic). There are also methods that have been developed on the basis of the physical and chemical properties of the analytes. Each group of these analytical assays can be further divided into conventional methods (e.g. simple antigen-antibody reactions, classical PCR, real-time PCR) and modern technologies (e.g. microarray technology, aptamers, phosphors, etc.). Nanodiagnostics constitute another group of methods that utilize objects at the nanoscale (below 100 nm). There are also integrated and automated diagnostic systems, which combine different methods and allow simultaneous sampling, extraction of genetic material, and detection and identification of the analyte using both genetic and immunological techniques.
Considerations for monitoring raptor population trends based on counts of migrants
Titus, K.; Fuller, M.R.; Ruos, J.L.; Meyburg, B-U.; Chancellor, R.D.
1989-01-01
Various problems were identified with standardized hawk count data as annually collected at six sites. Some of the hawk lookouts increased their hours of observation from 1979-1985, thereby confounding the total counts. Inconsistent data recording and missing data hamper the coding of these counts and their use with modern analytical techniques. Coefficients of variation among years in counts averaged about 40%. The advantages and disadvantages of various analytical techniques are discussed, including regression, non-parametric rank correlation trend analysis, and moving averages.
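The two statistics this abstract leans on are easy to make concrete. Below is a minimal Python sketch of the among-year coefficient of variation and a non-parametric rank-correlation trend test (Spearman's rho of count against year); the annual counts are invented for illustration, not taken from the study.

```python
# Among-year CV and a Spearman rank-correlation trend test for migrant counts.
# Counts below are hypothetical stand-ins for annual totals at one lookout.
import numpy as np

years = np.arange(1979, 1986)
counts = np.array([412, 388, 530, 295, 467, 610, 342], dtype=float)  # invented

cv = counts.std(ddof=1) / counts.mean() * 100.0  # percent CV among years

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks (no ties)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

rho = spearman_rho(years, counts)
print(f"CV among years: {cv:.0f}%   Spearman rho (trend): {rho:+.2f}")
```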
Analytics and Action in Afghanistan
2010-09-01
rests on rational technology, and ultimately on scientific knowledge. No country could be modern without being economically advanced or... backwardness to enlightened modernity. Underdeveloped countries had failed to progress to what Max Weber called rational legalism because of the grip... Douglas Pike, Viet Cong: The Organization and Techniques of the National Liberation Front of South Vietnam (Boston: Massachusetts Institute of Technology...
Manual Solid-Phase Peptide Synthesis of Metallocene-Peptide Bioconjugates
ERIC Educational Resources Information Center
Kirin, Srecko I.; Noor, Fozia; Metzler-Nolte, Nils; Mier, Walter
2007-01-01
A simple and relatively inexpensive procedure for preparing a biologically active peptide using solid phase peptide synthesis (SPPS) is described. Fourth-year undergraduate students have gained firsthand experience from the solid-phase synthesis techniques and they have become familiar with modern analytical techniques based on the particular…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.
We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
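CPAT itself is a multiphase mixed-integer linear program far richer than anything that fits here, but the core 0/1 "fund or don't fund" trade-off it optimizes can be sketched as a toy single-period portfolio selection. All program names, costs, and capability scores below are invented; a real formulation would use a MILP solver rather than brute-force enumeration.

```python
# Toy single-period analogue of capability-portfolio selection under a budget:
# choose the subset of programs maximizing total capability score.
from itertools import combinations

programs = {  # name: (cost in $B, capability score) -- all values hypothetical
    "Stryker upgrade": (10.0, 7.5),
    "Tank modernization": (14.0, 9.0),
    "New fighting vehicle": (20.0, 10.0),
    "Active protection": (6.0, 5.0),
}
budget = 30.0

best_value, best_set = 0.0, ()
names = list(programs)
for r in range(len(names) + 1):
    for subset in combinations(names, r):
        cost = sum(programs[p][0] for p in subset)
        value = sum(programs[p][1] for p in subset)
        if cost <= budget and value > best_value:
            best_value, best_set = value, subset

print(f"Best portfolio under ${budget:.0f}B: {best_set} (capability {best_value})")
```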
Rodent Research-1 Validation of Rodent Hardware
NASA Technical Reports Server (NTRS)
Globus, Ruth; Beegle, Janet
2013-01-01
To achieve novel science objectives, validation of a rodent habitat on ISS will enable in-flight analyses during long-duration spaceflight, the use of genetically altered animals, and the application of modern analytical techniques (e.g., genomics, proteomics, and metabolomics).
Identification of Microorganisms by Modern Analytical Techniques.
Buszewski, Bogusław; Rogowska, Agnieszka; Pomastowski, Paweł; Złoch, Michał; Railean-Plugaru, Viorica
2017-11-01
Rapid detection and identification of microorganisms is a challenging and important aspect in a wide range of fields, from medical to industrial, affecting human lives. Unfortunately, classical methods of microorganism identification are based on time-consuming and labor-intensive approaches. Screening techniques require the rapid and cheap grouping of bacterial isolates; however, modern bioanalytics demand comprehensive bacterial studies at a molecular level. Modern approaches for the rapid identification of bacteria use molecular techniques, such as 16S ribosomal RNA gene sequencing based on polymerase chain reaction or electromigration, especially capillary zone electrophoresis and capillary isoelectric focusing. However, there are still several challenges with the analysis of microbial complexes using electromigration technology, such as uncontrolled aggregation and/or adhesion to the capillary surface. Thus, an approach using capillary electrophoresis of microbial aggregates with UV and matrix-assisted laser desorption ionization time-of-flight MS detection is presented.
[Problems of food authenticity].
Czerwiecki, Ludwik
2004-01-01
In this review, data concerning food authenticity are presented. Typical examples of food adulteration are described; the best known involve vegetable and fruit products, wine, honey, olive oil, etc. The modern analytical techniques for detection of food adulteration are discussed. Among physicochemical methods, isotopic techniques (SCIRA, IRMS, SNIF-NMR) are cited. The main spectral methods are ICP-AES, Py-MS, FTIR and NIR. The chromatographic techniques (GC, HPLC, HPAEC, HPTLC) with several kinds of detectors are described, and the ELISA and PCR techniques are mentioned, too. The role of chemometrics as a means of processing analytical data is highlighted. The necessity of more rigorous food control to support the fight against fraud in the food industry is pointed out.
[Recent Development of Atomic Spectrometry in China].
Xiao, Yuan-fang; Wang, Xiao-hua; Hang, Wei
2015-09-01
As an important part of modern analytical techniques, atomic spectrometry occupies a decisive status in the whole analytical field. The development of atomic spectrometry also reflects the continuous reform and innovation of analytical techniques. In the past fifteen years, atomic spectrometry has experienced rapid development and been applied widely in many fields in China. This review has witnessed its development and remarkable achievements. It contains several directions of atomic spectrometry, including atomic emission spectrometry (AES), atomic absorption spectrometry (AAS), atomic fluorescence spectrometry (AFS), X-ray fluorescence spectrometry (XRF), and atomic mass spectrometry (AMS). Emphasis is put on the innovation of the detection methods and their applications in related fields, including environmental samples, biological samples, food and beverage, and geological materials, etc. There is also a brief introduction to the hyphenated techniques utilized in atomic spectrometry. Finally, the prospects of atomic spectrometry in China have been forecasted.
Modern Analytical Chemistry in the Contemporary World
ERIC Educational Resources Information Center
Šíma, Jan
2016-01-01
Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of the sorcery where analytical chemists working as modern wizards handle magical black boxes able to provide fascinating results. However, this approach is evidently improper and misleading. Therefore, the position of modern analytical chemistry among…
Modern analytical methods for the detection of food fraud and adulteration by food category.
Hong, Eunyoung; Lee, Sang Yoo; Jeong, Jae Yun; Park, Jung Min; Kim, Byung Hee; Kwon, Kisung; Chun, Hyang Sook
2017-09-01
This review provides current information on the analytical methods used to identify food adulteration in the six most adulterated food categories: animal origin and seafood, oils and fats, beverages, spices and sweet foods (e.g. honey), grain-based food, and others (organic food and dietary supplements). The analytical techniques (both conventional and emerging) used to identify adulteration in these six food categories involve sensory, physicochemical, DNA-based, chromatographic and spectroscopic methods, and have been combined with chemometrics, making these techniques more convenient and effective for the analysis of a broad variety of food products. Despite recent advances, the need remains for suitably sensitive and widely applicable methodologies that encompass all the various aspects of food adulteration. © 2017 Society of Chemical Industry.
Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V
2014-01-01
This work presents the results of a study on the applicability of modern analytical transmission electron microscopy methods for the detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis of images of the nanoparticles in the cells obtained in the bright-field mode of transmission electron microscopy, by dark-field scanning transmission electron microscopy, and by high-angle annular dark-field scanning transmission electron microscopy was performed. For identification of nanoparticles in the cells, the analytical techniques of energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy were compared, both for acquiring energy spectra from individual particles and for element mapping. It was shown that electron tomography is applicable to confirm that nanoparticles are localized within the sample and not coated by contamination. The possibilities and fields of application of the different analytical transmission electron microscopy techniques for detection, visualization and identification of nanoparticles in biological samples are discussed.
Tidal analysis of Met rocket wind data
NASA Technical Reports Server (NTRS)
Bedinger, J. F.; Constantinides, E.
1976-01-01
A method of analyzing Met Rocket wind data is described. Modern tidal theory and specialized analytical techniques were used to resolve specific tidal modes and prevailing components in observed wind data. A representation of the wind which is continuous in both space and time was formulated. Such a representation allows direct comparison with theory, allows the derivation of other quantities such as temperature and pressure which in turn may be compared with observed values, and allows the formation of a wind model which extends over a broader range of space and time. Significant diurnal tidal modes with wavelengths of 10 and 7 km were present in the data and were resolved by the analytical technique.
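The tidal-mode resolution described above is, at its core, harmonic analysis: a least-squares fit of a prevailing wind plus sinusoidal tidal components. A minimal sketch under assumed values (synthetic hourly winds standing in for Met Rocket data at one altitude, one diurnal mode only):

```python
# Fit u(t) = u0 + A*cos(omega t) + B*sin(omega t) by linear least squares and
# report the prevailing wind plus diurnal amplitude/phase. Data are synthetic.
import numpy as np

t = np.arange(0.0, 48.0)                 # hours
omega = 2.0 * np.pi / 24.0               # diurnal frequency
rng = np.random.default_rng(0)
u = 12.0 + 8.0 * np.cos(omega * t - 1.0) + rng.normal(0.0, 2.0, t.size)

X = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
u0, A, B = np.linalg.lstsq(X, u, rcond=None)[0]

amp = np.hypot(A, B)                     # diurnal amplitude
phase = np.arctan2(B, A)                 # phase (rad); u peaks at t = phase/omega
print(f"prevailing {u0:.1f} m/s, diurnal amplitude {amp:.1f} m/s, phase {phase:.2f} rad")
```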
Analytical techniques for identification and study of organic matter in returned lunar samples
NASA Technical Reports Server (NTRS)
Burlingame, A. L.
1974-01-01
The results of geochemical research are reviewed. Emphasis is placed on the contribution of mass spectrometric data to the solution of specific structural problems. Information on the mass spectrometric behavior of compounds of geochemical interest is reviewed and currently available techniques of particular importance to geochemistry, such as gas chromatograph-mass spectrometer coupling, modern sample introduction methods, and computer application in high resolution mass spectrometry, receive particular attention.
Analytical description of the modern steam automobile
NASA Technical Reports Server (NTRS)
Peoples, J. A.
1974-01-01
The sensitivity of the performance of the modern steam automobile to operating conditions is discussed. The word modern has been used in the title to indicate that emphasis is upon miles per gallon rather than theoretical thermal efficiency. This has been accomplished by combining classical power analysis with the ideal pressure-volume diagram. Several parameters are derived which characterize the performance capability of the modern steam car. The report illustrates that performance is dictated by the characteristics of the working medium and the supply temperature. Performance is nearly independent of pressures above 800 psia. Analysis techniques were developed specifically for reciprocating steam engines suitable for automotive application. Specific performance charts have been constructed on the basis of water as the working medium. The conclusions and data interpretation are therefore limited to this scope.
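The "classical power analysis combined with the ideal pressure-volume diagram" can be illustrated numerically: indicated work per cycle is the area traced on the P-V plane. The sketch below assumes an idealized cycle (constant-pressure admission, hyperbolic PV = const expansion, constant back pressure) with invented cylinder volumes; it is the standard textbook construction, not the paper's specific model.

```python
# Indicated work per cycle as the area of a simplified P-V loop: constant-
# pressure admission to cutoff, hyperbolic (PV = const) expansion to release,
# constant back pressure on the exhaust stroke. Volumes are hypothetical.
import numpy as np

p_supply = 800.0 * 6894.76            # 800 psia in Pa (the abstract's plateau point)
v_cutoff, v_release = 1.0e-4, 5.0e-4  # m^3, assumed cylinder volumes
p_back = 1.0e5                        # Pa, assumed exhaust back pressure

v = np.linspace(v_cutoff, v_release, 200)
p = p_supply * v_cutoff / v           # hyperbolic expansion curve

w_admission = p_supply * v_cutoff                                 # admission stroke
w_expansion = float(np.sum(0.5 * (p[:-1] + p[1:]) * np.diff(v)))  # trapezoid area
w_exhaust = -p_back * v_release                                   # exhaust stroke
print(f"indicated work per cycle ~ {w_admission + w_expansion + w_exhaust:.0f} J")
```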
Advances in analytical chemistry
NASA Technical Reports Server (NTRS)
Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.
1991-01-01
Implementation of computer programs based on multivariate statistical algorithms makes it possible to obtain reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described, each of which requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.
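The multivariate idea, extracting reliable information from long, noisy data vectors, can be sketched with the singular value decomposition that underlies principal-component methods. The spectra below are synthetic stand-ins; the point is only that one component captures the chemically meaningful variance and the remainder can be treated as noise.

```python
# Low-rank extraction from noisy "spectra" via SVD (the core of PCA).
import numpy as np

rng = np.random.default_rng(1)
wavelengths = np.linspace(400, 700, 300)
pure = np.exp(-((wavelengths - 550.0) / 30.0) ** 2)        # one "analyte" band
conc = rng.uniform(0.2, 1.0, size=20)                      # 20 mixture samples
spectra = np.outer(conc, pure) + rng.normal(0.0, 0.05, (20, 300))

U, s, Vt = np.linalg.svd(spectra, full_matrices=False)
rank1 = s[0] * np.outer(U[:, 0], Vt[0])                    # keep first component

residual = np.linalg.norm(spectra - rank1) / np.linalg.norm(spectra)
print(f"variance captured by 1 component: {1 - residual**2:.3f}")
```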
Safina, Gulnara
2012-01-27
Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes, which makes these compounds important targets to be detected, monitored and identified. The identification of the carbohydrate content of their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during the last two decades and has received increasing attention for different applications, from real-time monitoring of affinity binding to biosensors. SPR does not require any labels and is capable of direct measurement of biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern instrumental analytical techniques with SPR in terms of their capabilities to detect carbohydrates and their conjugates with proteins and lipids and to study carbohydrate-specific binding. A few selected examples of the SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
Cortez, Juliana; Pasquini, Celio
2013-02-05
The ring-oven technique, originally applied to classical qualitative analysis from the 1950s to the 1970s, is revisited for use in a simple yet highly efficient and green procedure for analyte preconcentration prior to determination by currently available microanalytical techniques. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is held in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center toward a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, about 2.0 cm in diameter, is formed on the paper and contains most of the analytes originally present in the sample volume. Preconcentration coefficients can reach 250-fold (on a m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure were evaluated for concentrating Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour using the microanalytical technique of laser-induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL(-1) and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)% were obtained for Na, Fe, and Cu, respectively, in fuel ethanol. Application of the technique, coupled to modern microanalytical and multianalyte techniques, can be anticipated for several analytical problems requiring analyte preconcentration and/or sample stabilization.
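The quoted ~250-fold m/m preconcentration can be checked with back-of-envelope arithmetic: the analyte from 600 μL of fuel ethanol ends up in a narrow ring of paper. The sketch below uses the ring diameter and contour width from the abstract, but the filter-paper areal density is an assumed typical value, so the agreement is illustrative rather than a reproduction of the authors' calculation.

```python
# Rough m/m preconcentration estimate: mass of sample vs. mass of the paper ring.
import math

sample_vol_mL = 0.600
ethanol_density = 0.789            # g/mL
sample_mass_mg = sample_vol_mL * ethanol_density * 1000.0   # ~473 mg

ring_diameter_cm = 2.0             # from the abstract
ring_width_cm = 0.035              # ~350 um contour, from the abstract
paper_density_mg_cm2 = 8.7         # assumption: ~87 g/m^2 filter paper

ring_area_cm2 = math.pi * ring_diameter_cm * ring_width_cm  # circumference x width
ring_mass_mg = ring_area_cm2 * paper_density_mg_cm2

print(f"m/m preconcentration ~ {sample_mass_mg / ring_mass_mg:.0f}-fold")
```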
Computational overlay metrology with adaptive data analytics
NASA Astrophysics Data System (ADS)
Schmitt-Weaver, Emil; Subramony, Venky; Ullah, Zakir; Matsunobu, Masazumi; Somasundaram, Ravin; Thomas, Joel; Zhang, Linmiao; Thul, Klaus; Bhattacharyya, Kaustuve; Goossens, Ronald; Lambregts, Cees; Tel, Wim; de Ruiter, Chris
2017-03-01
With photolithography as the fundamental patterning step in the modern nanofabrication process, every wafer within a semiconductor fab will pass through a lithographic apparatus multiple times. With more than 20,000 sensors producing more than 700 GB of data per day across multiple subsystems, the combination of a light source and lithographic apparatus provides a massive amount of information for data analytics. This paper outlines how adaptive analytics, that is, data analysis tools and techniques that extend insight into data traditionally considered unmanageably large, can use data collected before the wafer is exposed to detect small process-dependent wafer-to-wafer changes in overlay.
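One simple building block consistent with this adaptive-analytics theme is a CUSUM change detector, which accumulates small sustained deviations that single-point control limits miss. The sketch below is generic, not ASML's method: the sensor channel is synthetic, a small shift is injected after "wafer" 120, and the alarm should fire shortly after it.

```python
# Generic one-sided CUSUM monitor on a synthetic pre-exposure sensor channel.
import numpy as np

rng = np.random.default_rng(7)
signal = rng.normal(0.0, 1.0, 200)      # one sensor value per wafer (synthetic)
signal[120:] += 0.8                     # small process shift after wafer 120

mu, sigma = signal[:100].mean(), signal[:100].std(ddof=1)  # baseline wafers
k, h = 0.25, 5.0                        # slack and alarm threshold (tuning knobs)

s = 0.0
for i, x in enumerate(signal):
    s = max(0.0, s + (x - mu) / sigma - k)   # accumulate standardized excursions
    if s > h:
        print(f"shift detected at wafer {i} (CUSUM = {s:.1f})")
        break
```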
Quantitative proteomics in the field of microbiology.
Otto, Andreas; Becher, Dörte; Schmidt, Frank
2014-03-01
Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Hicks, Michael B; Regalado, Erik L; Tan, Feng; Gong, Xiaoyi; Welch, Christopher J
2016-01-05
Supercritical fluid chromatography (SFC) has long been a preferred method for enantiopurity analysis in support of pharmaceutical discovery and development, but implementation of the technique in regulated GMP laboratories has been somewhat slow, owing to limitations in instrument sensitivity, reproducibility, accuracy and robustness. In recent years, commercialization of next generation analytical SFC instrumentation has addressed previous shortcomings, making the technique better suited for GMP analysis. In this study we investigate the use of modern SFC for enantiopurity analysis of several pharmaceutical intermediates and compare the results with the conventional HPLC approaches historically used for analysis in a GMP setting. The findings clearly illustrate that modern SFC now exhibits improved precision, reproducibility, accuracy and robustness; also providing superior resolution and peak capacity compared to HPLC. Based on these findings, the use of modern chiral SFC is recommended for GMP studies of stereochemistry in pharmaceutical development and manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
Chemical and biological threat-agent detection using electrophoresis-based lab-on-a-chip devices.
Borowsky, Joseph; Collins, Greg E
2007-10-01
The ability to separate complex mixtures of analytes has made capillary electrophoresis (CE) a powerful analytical tool since its modern configuration was first introduced over 25 years ago. The technique found new utility with its application to the microfluidics-based lab-on-a-chip platform (i.e., microchip), which resulted in ever smaller footprints, sample volumes, and analysis times. These features, coupled with the technique's potential for portability, have prompted recent interest in the development of novel analyzers for chemical and biological threat agents. This article comments on three main areas of microchip CE as applied to the separation and detection of threat agents: detection techniques and their corresponding limits of detection, sampling protocol and preparation time, and system portability. These three areas typify the broad utility of lab-on-a-chip for meeting critical, present-day security needs, in addition to illustrating areas wherein advances are necessary.
ERIC Educational Resources Information Center
Gerontas, Apostolos
2014-01-01
Chromatographic instrumentation has been highly influential in shaping modern chemical practice, and yet it has been largely overlooked by the history of science. Gas chromatography in the 1960s was considered the analytical technique closest to becoming dominant, and as the first automated chromatography it set the standards that all the subsequent…
A Unifying Review of Bioassay-Guided Fractionation, Effect-Directed Analysis and Related Techniques
Weller, Michael G.
2012-01-01
The success of modern methods in analytical chemistry sometimes obscures the problem that the ever increasing amount of analytical data does not necessarily give more insight of practical relevance. As alternative approaches, toxicity- and bioactivity-based assays can deliver valuable information about biological effects of complex materials in humans, other species or even ecosystems. However, the observed effects often cannot be clearly assigned to specific chemical compounds. In these cases, the establishment of an unambiguous cause-effect relationship is not possible. Effect-directed analysis tries to interconnect instrumental analytical techniques with a biological/biochemical entity, which identifies or isolates substances of biological relevance. Successful application has been demonstrated in many fields, either as proof-of-principle studies or even for complex samples. This review discusses the different approaches, advantages and limitations and finally shows some practical examples. The broad emergence of effect-directed analytical concepts might lead to a true paradigm shift in analytical chemistry, away from ever growing lists of chemical compounds. The connection of biological effects with the identification and quantification of molecular entities leads to relevant answers to many real life questions. PMID:23012539
Tarasov, Andrii; Rauhut, Doris; Jung, Rainer
2017-12-01
Analytical methods for the quantification of haloanisoles and halophenols in cork matrices are summarized in the current review. Sample-preparation and sample-treatment techniques are compared and discussed from the perspective of their efficiency, optimization of time and extractant use, and ease of performance. The primary interest of these analyses is usually 2,4,6-trichloroanisole (TCA), the major wine contaminant among haloanisoles. Two concepts of TCA determination are described in the review: releasable TCA and total TCA analyses. Chromatographic, bioanalytical and sensorial methods are compared according to their application in the cork industry and in scientific investigations. Finally, it is shown that modern analytical techniques are able to provide the required sensitivity, selectivity and repeatability for haloanisole and halophenol determination. Copyright © 2017 Elsevier B.V. All rights reserved.
Insecticide ADME for support of early-phase discovery: combining classical and modern techniques.
David, Michael D
2017-04-01
The two factors that determine an insecticide's potency are its binding to a target site (intrinsic activity) and the ability of its active form to reach the target site (bioavailability). Bioavailability is dictated by the compound's stability and transport kinetics, which are determined by both physical and biochemical characteristics. At BASF Global Insecticide Research, we characterize bioavailability in early research with an ADME (Absorption, Distribution, Metabolism and Excretion) approach, combining classical and modern techniques. For biochemical assessment of metabolism, we purify native insect enzymes using classical techniques, and recombinantly express individual insect enzymes that are known to be relevant in insecticide metabolism and resistance. For analytical characterization of an experimental insecticide and its metabolites, we conduct classical radiotracer translocation studies when a radiolabel is available. In discovery, where typically no radiolabel has been synthesized, we utilize modern high-resolution mass spectrometry to probe complex systems for the test compounds and their metabolites. By using these combined approaches, we can rapidly compare the ADME properties of sets of new experimental insecticides and aid in the design of structures with an improved potential to advance in the research pipeline. © 2016 Society of Chemical Industry.
A pilot modeling technique for handling-qualities research
NASA Technical Reports Server (NTRS)
Hess, R. A.
1980-01-01
A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.
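The "classical" pilot models the survey refers to descend from McRuer's crossover law: near crossover, the pilot plus controlled element behave like ω_c e^(−τs)/s. A minimal sketch evaluating that open-loop frequency response for assumed parameter values (the ω_c and τ below are illustrative, not from the paper):

```python
# Evaluate the McRuer crossover-model open loop L(s) = omega_c * exp(-tau*s) / s
# over frequency and report the phase margin at the |L| = 1 crossover.
import numpy as np

omega_c = 4.0      # crossover frequency, rad/s (assumed)
tau = 0.25         # effective pilot time delay, s (assumed)

w = np.logspace(-1, 1, 400)
s = 1j * w
L = omega_c * np.exp(-tau * s) / s

i = np.argmin(np.abs(np.abs(L) - 1.0))    # grid point nearest |L| = 1
phase_margin = 180.0 + np.degrees(np.angle(L[i]))
print(f"crossover at {w[i]:.2f} rad/s, phase margin {phase_margin:.0f} deg")
```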
NASA Technical Reports Server (NTRS)
Panda, Binayak
2009-01-01
Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectroscopy (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron - with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photo-electron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger electron spectroscopy (SAM) combines surface sensitivity, spatial lateral resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near and below surface regions down to the chemical state of an atom.
Supercritical fluid chromatography: a promising alternative to current bioanalytical techniques.
Dispas, Amandine; Jambo, Hugues; André, Sébastien; Tyteca, Eva; Hubert, Philippe
2018-01-01
In recent years, chemistry has been involved in the worldwide effort to address environmental problems, leading to the birth of green chemistry. In this context, green analytical tools such as modern supercritical fluid chromatography were developed in the field of separative techniques. This chromatographic technique underwent a resurgence a few years ago thanks to the high efficiency, speed and robustness of new-generation equipment. These advantages and its easy hyphenation to MS fulfill the requirements of bioanalysis regarding separation capacity and high throughput. In the present paper, the technical aspects relevant to bioanalysis are detailed, followed by a critical review of bioanalytical supercritical fluid chromatography methods published in the literature.
Modern analytical chemistry in the contemporary world
NASA Astrophysics Data System (ADS)
Šíma, Jan
2016-12-01
Students not familiar with chemistry tend to misinterpret analytical chemistry as some kind of sorcery in which analytical chemists, working as modern wizards, handle magical black boxes able to provide fascinating results. However, this approach is evidently improper and misleading. Therefore, the position of modern analytical chemistry among the sciences and in the contemporary world is discussed. Its interdisciplinary character and the necessity of collaboration between analytical chemists and other experts in order to effectively solve the actual problems of human society and the environment are emphasized. The importance of analytical method validation in order to obtain accurate and precise results is highlighted; invalid results are not only useless, they can often even be fatal (e.g., in clinical laboratories). The curriculum of analytical chemistry at schools and universities is discussed: it should be much broader than traditional equilibrium chemistry coupled with a simple description of individual analytical methods. The schooling of analytical chemistry should closely connect theory and practice.
Recent development of electrochemiluminescence sensors for food analysis.
Hao, Nan; Wang, Kun
2016-10-01
Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, their applications are sometimes limited by cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassays, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, drug residues, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection of various food contaminants in complex matrixes. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.
Analytical capillary isotachophoresis after 50 years of development: Recent progress 2014-2016.
Malá, Zdena; Gebauer, Petr; Boček, Petr
2017-01-01
This review surveys papers on analytical ITP published from 2014 through the first quarter of 2016. The 50th anniversary of ITP as a modern analytical method offers the opportunity to present a brief view of its beginnings and to discuss the present state of the art from the viewpoint of the history of its development. Reviewed papers from the field of theory and principles confirm the continuing importance of computer simulations in the discovery of new and unexpected phenomena. The strongly developing field of instrumentation and techniques shows novel channel methodologies, including the use of porous media and new on-chip assays, where ITP is often included in a preseparative or even preparative function. A number of new analytical applications are reported, with ITP appearing almost exclusively in combination with other principles and methods. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Pan, Jun-Yang; Xie, Yi
2015-02-01
With tremendous advances in modern techniques, Einstein's general relativity has become an inevitable part of deep space missions. We investigate the relativistic algorithm for time transfer between the proper time τ of the onboard clock and the Geocentric Coordinate Time, which extends some previous works by including the effects of propagation of electromagnetic signals. In order to evaluate the implicit algebraic equations and integrals in the model, we take an analytic approach to work out their approximate values. This analytic model might be used in an onboard computer because of its limited capability to perform calculations. Taking an orbiter like Yinghuo-1 as an example, we find that the contributions of the Sun, the ground station and the spacecraft dominate the outcomes of the relativistic corrections to the model.
Franquelo, M L; Duran, A; Castaing, J; Arquillo, D; Perez-Rodriguez, J L
2012-01-30
This paper presents the novel application of recently developed analytical techniques to the study of paint layers on sculptures that have been restored/repainted several times across the centuries. Analyses were performed using portable XRF, μ-XRD and μ-Raman instruments. Other techniques, such as optical microscopy, SEM-EDX and μ-FTIR, were also used. Pigments and other materials including vermilion, minium, red lac, ivory black, lead white, barium white, zinc white (zincite), titanium white (rutile and anatase), lithopone, gold and brass were detected. Pigments from both ancient and modern times were found, reflecting the different restorations/repaintings carried out. μ-Raman was very useful for characterizing some pigments that were difficult to determine by μ-XRD. In some cases, pigment identification was only possible by combining results from the different analytical techniques used in this work. This is the first article devoted to the study of sculpture cross-section samples using laboratory-made μ-XRD systems. Copyright © 2011 Elsevier B.V. All rights reserved.
Wallrabe, U; Ruther, P; Schaller, T; Schomburg, W K
1998-03-01
The complexity of modern surgical and analytical methods requires the miniaturisation of many medical devices. The LIGA technique and also mechanical microengineering are well known for the batch fabrication of microsystems. Actuators and sensors are developed based on these techniques. The hydraulic actuation principle is advantageous for medical applications since the energy may be supplied by pressurised balanced salt solution. Some examples are turbines, pumps and valves. In addition, optical sensors and components are useful for analysis and inspection as represented by microspectrometers and spherical lenses. Finally, plastic containers with microporous bottoms allow a 3-dimensional growth of cell culture systems.
Znaleziona, Joanna; Ginterová, Pavlína; Petr, Jan; Ondra, Peter; Válka, Ivo; Ševčík, Juraj; Chrastina, Jan; Maier, Vítězslav
2015-05-18
Synthetic cannabinoids have gained popularity due to their easy accessibility and psychoactive effects. Furthermore, they cannot be detected in urine by routine drug monitoring. The wide range of active ingredients in analyzed matrices hinders the development of a standard analytical method for their determination. Moreover, their possible side effects are not well known which increases the danger. This review is focused on the sample preparation and the determination of synthetic cannabinoids in different matrices (serum, urine, herbal blends, oral fluid, hair) published since 2004. The review includes separation and identification techniques, such as thin layer chromatography, gas and liquid chromatography and capillary electrophoresis, mostly coupled with mass spectrometry. The review also includes results by spectral methods like infrared spectroscopy, nuclear magnetic resonance or direct-injection mass spectrometry. Copyright © 2015 Elsevier B.V. All rights reserved.
Analysis of Slabs-on-Grade for a Variety of Loading and Support Conditions.
1984-12-01
applications, namely the problem of a slab-on-grade, as encountered in the analysis and design of rigid pavements. This is one of the few... proper design and construction methods are adhered to. There are several additional reasons, entirely due to recent developments, that warrant the... conservative designs led to almost imperceptible pavement deformations, thus warranting the term "rigid pavements". Modern-day analytical techniques
Foodomics: MS-based strategies in modern food science and nutrition.
Herrero, Miguel; Simó, Carolina; García-Cañas, Virginia; Ibáñez, Elena; Cifuentes, Alejandro
2012-01-01
Modern research in food science and nutrition is moving from classical methodologies to advanced analytical strategies in which MS-based techniques play a crucial role. In this context, Foodomics has recently been defined as a new discipline that studies the food and nutrition domains through the application of advanced omics technologies in which MS techniques are considered indispensable. Applications of Foodomics include the genomic, transcriptomic, proteomic, and/or metabolomic study of foods for compound profiling, authenticity, and/or biomarker detection related to food quality or safety; the development of new transgenic foods; food contaminant and whole-toxicity studies; new investigations of food bioactivity and food effects on human health, etc. This review does not intend to provide an exhaustive survey of the many works published so far on food analysis using MS techniques. Rather, the aim is to provide an overview of the different MS-based strategies that have been (or can be) applied in the new field of Foodomics, discussing their advantages and drawbacks. In addition, some ideas about the foreseen development and applications of MS techniques in this new discipline are provided. Copyright © 2011 Wiley Periodicals, Inc.
Weng, Naidong; Needham, Shane; Lee, Mike
2015-01-01
The 17th Annual Symposium on Clinical and Pharmaceutical Solutions through Analysis (CPSA), 29 September-2 October 2014, was held at the Sheraton Bucks County Hotel, Langhorne, PA, USA. CPSA USA 2014 brought together the various analytical fields defining the challenges of the modern analytical laboratory. Ongoing discussions focused on the future application of bioanalysis and other disciplines to support investigational new drug (IND) and new drug application (NDA) submissions, on the clinical diagnostics and pathology laboratory personnel who support patient sample analysis, and on the clinical researchers who provide insights into new biomarkers within the context of the modern laboratory and personalized medicine.
Chylewska, Agnieszka; Ogryzek, M; Makowski, Mariusz
2017-10-23
New analytical and molecular methods for microorganisms are being developed to improve various aspects of identification, i.e., selectivity, specificity, sensitivity, rapidity and discrimination of viable cells. The presented review follows the current trends in improved pathogen separation and detection methods and their subsequent use in medical diagnosis. This contribution also focuses on the development of analytical and biological methods for the analysis of microorganisms, with special attention paid to bio-samples containing microbes (blood, urine, lymph, wastewater). First, the paper discusses the characterization of microbes, their structure, surface, properties and size; it then describes pivotal points in bacteria, virus and fungus separation procedures achieved by researchers in the last 30 years. Accordingly, detection techniques can be classified into three categories that were, in our opinion, examined and modified most intensively during this period: electrophoretic, nucleic-acid-based, and immunological methods. The review also covers the progress, limitations and challenges of these approaches and emphasizes the advantages of new separative techniques for the selective fractionation of microorganisms. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.
NASA Astrophysics Data System (ADS)
Rossi, D.
2011-09-01
The main focus of this article is to describe a teaching activity. This experience follows research aimed at testing innovative systems for the formal and digital analysis of architectural buildings. In particular, the field of investigation is analytical drawing. An analytical drawing allows the development of interpretative models of reality; these models are built using photomodeling techniques and are designed to re-write modern and contemporary architecture. The buildings surveyed belong to a cultural period, called the Modern Movement, historically placed between the two world wars. The Modern Movement aimed to renew existing architectural principles and to redefine their function. In Italy these principles arrived during the Fascist period. The heritage of public social buildings (case del Balilla, G.I.L., recreation centers...) built during the Fascist period in central Italy is remarkable for its quantity and, in many cases, for its architectural quality. These buildings are composed of pure shapes: large cubes (gyms) alternate with long rectangular blocks containing offices, creating compositions made of big volumes and high towers. These features are perfectly suited to the needs of a surveying process based on photomodeling, where the role of photography is central and where there is a need to identify certain, easily distinguishable points in all pictures, lying on the edges of the volumes or on texture discontinuities. The goal is documentation to preserve and develop buildings and urban complexes of modern architecture, intended to encourage artistic preservation.
Jiménez-Díaz, I; Vela-Soria, F; Rodríguez-Gómez, R; Zafra-Gómez, A; Ballesteros, O; Navalón, A
2015-09-10
In the present work, a review of the analytical methods developed in the last 15 years for the determination of endocrine disrupting chemicals (EDCs) in human samples related to children, including placenta, cord blood, amniotic fluid, maternal blood, maternal urine and breast milk, is presented. Children are highly vulnerable to toxic chemicals in the environment. Among the environmental contaminants to which children are at risk of exposure are EDCs, substances able to alter the normal hormone function of wildlife and humans. The work focuses mainly on the sample preparation and instrumental techniques used for the detection and quantification of the analytes. The sample preparation techniques include not only liquid-liquid extraction (LLE) and solid-phase extraction (SPE), but also modern microextraction techniques such as extraction with molecularly imprinted polymers (MIPs), stir-bar sorptive extraction (SBSE), hollow-fiber liquid-phase microextraction (HF-LPME), dispersive liquid-liquid microextraction (DLLME), matrix solid-phase dispersion (MSPD) and ultrasound-assisted extraction (UAE), which are becoming alternatives in the analysis of human samples. Most studies focus on minimizing the number of steps and using the lowest solvent amounts in sample treatment. The usual instrumental techniques employed include liquid chromatography (LC) and gas chromatography (GC), mainly coupled to tandem mass spectrometry. Multiresidue methods are being developed for the determination of several families of EDCs with one extraction step and limited sample preparation. Copyright © 2015 Elsevier B.V. All rights reserved.
Chemical signatures of fossilized resins and recent plant exudates.
Lambert, Joseph B; Santiago-Blay, Jorge A; Anderson, Ken B
2008-01-01
Amber is one of the few gemstones based on an organic structure. Found over most of the world, it is the fossil form of sticky plant exudates called resins. Investigation of amber by modern analytical techniques provides structural information and insight into the identity of the ancient plants that produced the source resin. Mass spectrometric analysis of materials separated by gas chromatography has identified specific compounds that are the basis of a reliable classification of the different types of amber. NMR spectroscopy of bulk, solid amber provides a complementary classification. NMR spectroscopy also can be used to characterize modern resins as well as other types of plant exudates such as gums, gum resins, and kinos, which strongly resemble resins in appearance but have very different molecular constitutions.
The Analog Revolution and Its On-Going Role in Modern Analytical Measurements.
Enke, Christie G
2015-12-15
The electronic revolution in analytical instrumentation began when we first exceeded the two-digit resolution of panel meters and chart recorders and then took the first steps into automated control. It started with the first uses of operational amplifiers (op amps) in the analog domain 20 years before the digital computer entered the analytical lab. Their application greatly increased both accuracy and precision in chemical measurement and they provided an elegant means for the electronic control of experimental quantities. Later, laboratory and personal computers provided an unlimited readout resolution and enabled programmable control of instrument parameters as well as storage and computation of acquired data. However, digital computers did not replace the op amp's critical role of converting the analog sensor's output to a robust and accurate voltage. Rather it added a new role: converting that voltage into a number. These analog operations are generally the limiting portions of our computerized instrumentation systems. Operational amplifier performance in gain, input current and resistance, offset voltage, and rise time have improved by a remarkable 3-4 orders of magnitude since their first implementations. Each 10-fold improvement has opened the doors for the development of new techniques in all areas of chemical analysis. Along with some interesting history, the multiple roles op amps play in modern instrumentation are described along with a number of examples of new areas of analysis that have been enabled by their improvements.
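The payoff of those 3-4 orders of magnitude can be put in numbers. The sketch below uses the textbook closed-loop relation G = A/(1 + Aβ) for a non-inverting amplifier to compare an early op amp with a modern one; the open-loop gains, ideal gain, and offset voltage are assumed round values, not measurements from the article.

```python
# Closed-loop gain error from finite open-loop gain A, plus the input offset
# voltage referred to the output. All component values are assumed/textbook.
G_ideal = 100.0          # non-inverting gain set by the feedback divider (beta = 1/100)
V_os = 1e-3              # 1 mV input offset voltage (assumed)

print(f"1 mV input offset appears at the output as {G_ideal * V_os * 1e3:.0f} mV")
for label, A in (("early", 1e4), ("modern", 1e8)):
    G_actual = A / (1.0 + A / G_ideal)        # G = A / (1 + A*beta)
    err_pct = 100.0 * (G_ideal - G_actual) / G_ideal
    print(f"{label} op amp (A_ol = {A:.0e}): gain error {err_pct:.4f} %")
```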
ENVIRONMENTAL ANALYTICAL CHEMISTRY OF ...
Within the scope of a number of emerging contaminant issues in environmental analysis, one area that has received a great deal of public interest has been the assessment of the role of pharmaceuticals and personal care products (PPCPs) as stressors and agents of change in ecosystems, as well as their role in unplanned human exposure. The relationship between personal actions and the occurrence of PPCPs in the environment is clear-cut and comprehensible to the public. In this overview, we attempt to examine the separations aspect of the analytical approach to the vast array of potential analytes among this class of compounds. We also highlight the relationship between PPCPs and endocrine disrupting compounds (EDCs), and between both of these classes and more traditional environmental analytes such as the persistent organic pollutants (POPs). Although the spectrum of chemical behavior extends from hydrophobic to hydrophilic, the current focus has shifted to moderately and highly polar analytes. Thus, emphasis on HPLC and LC/MS has grown, and MS/MS has become a detection technique of choice with either electrospray ionization or atmospheric pressure chemical ionization. This contrasts markedly with the benchmark approach of capillary GC, GC/MS and electron ionization in traditional environmental analysis. The expansion of the analyte list has fostered new vigor in the development of environmental analytical chemistry and modernized the range of tools applied to it.
NASA Technical Reports Server (NTRS)
Morelli, Eugene A.; DeLoach, Richard
2003-01-01
A wind tunnel experiment for characterizing the aerodynamic and propulsion forces and moments acting on a research model airplane is described. The model airplane called the Free-flying Airplane for Sub-scale Experimental Research (FASER), is a modified off-the-shelf radio-controlled model airplane, with 7 ft wingspan, a tractor propeller driven by an electric motor, and aerobatic capability. FASER was tested in the NASA Langley 12-foot Low-Speed Wind Tunnel, using a combination of traditional sweeps and modern experiment design. Power level was included as an independent variable in the wind tunnel test, to allow characterization of power effects on aerodynamic forces and moments. A modeling technique that employs multivariate orthogonal functions was used to develop accurate analytic models for the aerodynamic and propulsion force and moment coefficient dependencies from the wind tunnel data. Efficient methods for generating orthogonal modeling functions, expanding the orthogonal modeling functions in terms of ordinary polynomial functions, and analytical orthogonal blocking were developed and discussed. The resulting models comprise a set of smooth, differentiable functions for the non-dimensional aerodynamic force and moment coefficients in terms of ordinary polynomials in the independent variables, suitable for nonlinear aircraft simulation.
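To make the orthogonal-function step concrete, here is a minimal numpy sketch of the general idea (our own illustration, not the authors' code or data): candidate polynomial regressors are orthogonalized by QR factorization, screened independently for significance, and the retained model is expanded back into ordinary polynomial coefficients through the R factor.

```python
import numpy as np

# Synthetic stand-in for wind tunnel data (illustrative assumption).
rng = np.random.default_rng(0)
alpha = rng.uniform(-5, 15, 200)          # angle of attack, deg
de = rng.uniform(-10, 10, 200)            # elevator deflection, deg
noise = 0.02
CL = 0.1 + 0.08*alpha + 0.01*de - 0.001*alpha**2 \
     + noise*rng.standard_normal(200)     # lift coefficient

# Candidate ordinary polynomial regressors.
names = ["1", "alpha", "de", "alpha^2", "alpha*de", "de^2"]
X = np.column_stack([np.ones_like(alpha), alpha, de,
                     alpha**2, alpha*de, de**2])

# Orthogonalize candidates: X = Q R. The columns of Q are orthonormal,
# so each function's contribution to the fit can be judged independently.
Q, R = np.linalg.qr(X)
g = Q.T @ CL                              # projection onto each function
keep = np.abs(g) > 3*noise                # crude significance screen (assumed)

# Expand the retained orthogonal-function model back into ordinary
# polynomial coefficients via the R factor.
beta = np.linalg.solve(R, np.where(keep, g, 0.0))
print({n: round(b, 4) for n, b in zip(names, beta)})
```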
Analytic theory of orbit contraction and ballistic entry into planetary atmospheres
NASA Technical Reports Server (NTRS)
Longuski, J. M.; Vinh, N. X.
1980-01-01
A space object traveling through an atmosphere is governed by two forces: aerodynamic and gravitational. On this premise, equations of motion are derived to provide a set of universal entry equations applicable to all regimes of atmospheric flight, from orbital motion under the dissipative force of drag, through the dynamic phase of reentry, and finally to the point of contact with the planetary surface. Rigorous mathematical techniques such as averaging, Poincaré's method of small parameters, and Lagrange's expansion are applied to obtain a highly accurate, purely analytic theory for orbit contraction and ballistic entry into planetary atmospheres. The theory has a wide range of applications to modern problems, including orbit decay of artificial satellites, atmospheric capture of planetary probes, atmospheric grazing, and ballistic reentry of manned and unmanned space vehicles.
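For readers unfamiliar with analytic entry theory, the sketch below contrasts the classical Allen-Eggers closed-form ballistic-entry solution (exponential atmosphere, constant flight-path angle, gravity neglected) with direct numerical integration of the same model. It is a far simpler illustration than the universal equations of the paper, and every value is an illustrative assumption.

```python
import numpy as np

# Exponential atmosphere and vehicle parameters (illustrative assumptions).
rho0, H = 1.225, 7200.0            # sea-level density (kg/m^3), scale height (m)
beta = 300.0                       # ballistic coefficient m/(C_D*A), kg/m^2
gamma = np.radians(30.0)           # flight-path angle below horizontal
vE = 7500.0                        # entry velocity, m/s

h = np.linspace(120e3, 10e3, 20000)            # altitude grid, descending
rho = rho0 * np.exp(-h / H)

# Closed-form Allen-Eggers velocity profile.
v_analytic = vE * np.exp(-rho * H / (2.0 * beta * np.sin(gamma)))

# Euler integration of the same model: dv/dh = rho*v / (2*beta*sin(gamma)).
v = np.empty_like(h)
v[0] = vE
for i in range(1, len(h)):
    dh = h[i] - h[i-1]                         # negative while descending
    v[i] = v[i-1] + rho[i-1] * v[i-1] / (2.0 * beta * np.sin(gamma)) * dh

print(f"max relative discrepancy: {np.max(np.abs(v - v_analytic)/v_analytic):.2e}")
```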
NASA Astrophysics Data System (ADS)
Gupta, Lokesh Kumar
2012-11-01
Seven process-related impurities in the atorvastatin calcium drug substance were identified by LC-MS. The structures of the impurities were confirmed by modern spectroscopic techniques such as 1H NMR and IR, and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. The impurities were detected by a newly developed gradient, reverse-phase high-performance liquid chromatography (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions, demonstrating the power of the newly developed HPLC method.
Peter, Samuel C; Whelan, James P; Pfund, Rory A; Meyers, Andrew W
2018-06-14
Although readability has been traditionally operationalized and even become synonymous with the concept of word and sentence length, modern text analysis theory and technology have shifted toward multidimensional comprehension-based analytic techniques. In an effort to make use of these advancements and demonstrate their general utility, 6 commonly used measures of gambling disorder were submitted to readability analyses using 2 of these advanced approaches, Coh-Metrix and Question Understanding Aid (QUAID), and one traditional approach, the Flesch-Kincaid Grade Level. As hypothesized, significant variation was found across measures, with some questionnaires emerging as more appropriate than others for use in samples that may include individuals with low literacy. Recommendations are made for the use of these modern approaches to readability to inform decisions on measure selection and development. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
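For context, the traditional approach mentioned above reduces to a simple formula. Below is a minimal sketch of the Flesch-Kincaid Grade Level; the syllable counter is a rough heuristic of our own (an assumption), unlike the dictionary-based counting in polished tools, and the sample item is merely questionnaire-style, not drawn from any specific measure.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count vowel groups, with a crude silent-e correction."""
    n = len(re.findall(r"[aeiouy]+", word.lower()))
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def fk_grade(text: str) -> float:
    """Flesch-Kincaid Grade Level:
    0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59

print(round(fk_grade("During the past year, have you bet more than you "
                     "could really afford to lose?"), 1))
```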
ERIC Educational Resources Information Center
Peters, Michael A.; Besley, Tina A. C.
2014-01-01
This article offers a broad philosophical and historical background to the dyad of social exclusion/inclusion by examining the analytics and politics of exclusion first by reference to Michel Foucault who studies the modern history of exclusion and makes it central to his approach in understanding the development of modern institutions of emerging…
A historical perspective on radioisotopic tracers in metabolism and biochemistry.
Lappin, Graham
2015-01-01
Radioisotopes are used routinely in the modern laboratory to trace and quantify a myriad of biochemical processes. The technique has a captivating history peppered with groundbreaking science and with more than its share of Nobel Prizes. The discovery of radioactivity at the end of the 19th century paved the way to understanding atomic structure and quickly led to the use of radioisotopes to trace the fate of molecules as they flowed through complex organic life. The 1940s saw the first radiotracer studies using homemade instrumentation and analytical techniques such as paper chromatography. This article follows the history of radioisotopic tracers from meager beginnings, through to the most recent applications. The author hopes that those researchers involved in radioisotopic tracer studies today will pause to remember the origins of the technique and those who pioneered this fascinating science.
Drummer, Olaf H
2010-01-01
Forensic toxicology has developed as a forensic science in recent years and is now widely used to assist in death investigations, in civil and criminal matters involving drug use, in drugs-of-abuse testing in correctional settings and custodial medicine, in road and workplace safety, in matters involving environmental pollution, as well as in sports doping. Drugs most commonly targeted include amphetamines, benzodiazepines, cannabis, cocaine and the opiates, but can be any other illicit substance or almost any over-the-counter or prescribed drug, as well as poisons available to the community. The discipline requires high-level skills in analytical techniques with a solid knowledge of pharmacology and pharmacokinetics. Modern techniques rely heavily on immunoassay screening analyses and mass spectrometry (MS) for confirmatory analyses, using either high-performance liquid chromatography or gas chromatography as the separation technique. Tandem MS has become increasingly popular compared to single-stage MS. It is essential that analytical systems are fully validated and fit for purpose, and that assay batches are monitored with quality controls. External proficiency programs monitor both the assay and the personnel performing the work. For a laboratory to perform optimally, it is vital that the circumstances and context of the case are known and that the laboratory understands the limitations of the analytical systems used, including drug stability. Drugs and poisons can change concentration postmortem due to poor or unequal quality of blood and other specimens, anaerobic metabolism and redistribution. The latter presents the largest handicap in the interpretation of postmortem results.
You can run, you can hide: The epidemiology and statistical mechanics of zombies
NASA Astrophysics Data System (ADS)
Alemi, Alexander A.; Bierbaum, Matthew; Myers, Christopher R.; Sethna, James P.
2015-11-01
We use a popular fictional disease, zombies, in order to introduce techniques used in modern epidemiology modeling, and ideas and techniques used in the numerical study of critical phenomena. We consider variants of zombie models, from fully connected continuous time dynamics to a full scale exact stochastic dynamic simulation of a zombie outbreak on the continental United States. Along the way, we offer a closed form analytical expression for the fully connected differential equation, and demonstrate that the single person per site two dimensional square lattice version of zombies lies in the percolation universality class. We end with a quantitative study of the full scale US outbreak, including the average susceptibility of different geographical regions.
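As a flavor of the modeling style described above, here is a minimal Euler integration of SIR-like zombie (SZR) dynamics in the fully connected, continuous-time limit; the bite/kill rates and initial conditions are illustrative assumptions, not the paper's fitted values.

```python
# Minimal fully connected, continuous-time SZR dynamics:
# bites (S + Z -> Z + Z) at rate beta, kills (S + Z -> S + R) at rate kappa.
beta, kappa = 1.1e-3, 0.8e-3       # illustrative assumptions
S, Z, R = 1000.0, 1.0, 0.0
dt, T = 0.01, 60.0

for _ in range(int(T / dt)):
    bites = beta * S * Z
    kills = kappa * S * Z
    S, Z, R = S - bites*dt, Z + (bites - kills)*dt, R + kills*dt

# With beta > kappa the outbreak wins; with beta < kappa humanity survives.
print(f"S = {S:.1f}, Z = {Z:.1f}, R = {R:.1f}")
```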
NASA Astrophysics Data System (ADS)
Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin
2015-05-01
The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.
Melt inclusions come of age: Volatiles, volcanoes, and Sorby's legacy
Lowenstern, J. B.
2003-01-01
Despite nearly forty years of modern research on silicate melt inclusions (MI), only within the past 10-15 years have volcanologists and petrologists come to regularly accept their utility for characterizing magmatic systems. Their relatively slow acceptance was likely due to a number of factors including: 1) Lack of reliable analytical techniques, 2) Concern that MI represent anomalous boundary-layer melts or are altered by leakage or post-entrapment crystallization, 3) Data sets indicative of heterogeneous melts and, 4) Homogenization temperatures greater than those calculated by other techniques. With improvements in analytical methods and careful studies of MI systematics, workers are increasingly convinced of the utility of these features to unravel the complexities of volcanic systems: melt inclusions have "come of age." Recent studies provide compelling evidence for the compositions of dissolved and exsolved volatiles in magma reservoirs. Evidence for immiscibility of gases, hydrosaline brines and pegmatitic fluids demonstrate that magmatic phase relations are often more complicated than can be inferred by inspection of crystalline phases alone. © 2003 Elsevier B.V. All rights reserved.
Van Gosen, Bradley S.
2008-01-01
A study conducted in 2006 by the U.S. Geological Survey collected 57 surface rock samples from nine types of intrusive rock in the Iron Hill carbonatite complex. This intrusive complex, located in Gunnison County of southwestern Colorado, is known for its classic carbonatite-alkaline igneous geology and petrology. The Iron Hill complex is also noteworthy for its diverse mineral resources, including enrichments in titanium, rare earth elements, thorium, niobium (columbium), and vanadium. This study was performed to reexamine the chemistry and metallic content of the major rock units of the Iron Hill complex by using modern analytical techniques, while providing a broader suite of elements than the earlier published studies. The report contains the geochemical analyses of the samples in tabular and digital spreadsheet format, providing the analytical results for 55 major and trace elements.
Four Bad Habits of Modern Psychologists
Grice, James; Cota, Lisa; Taylor, Zachery; Garner, Samantha; Medellin, Eliwid; Vest, Adam
2017-01-01
Four data sets from studies included in the Reproducibility Project were re-analyzed to demonstrate a number of flawed research practices (i.e., “bad habits”) of modern psychology. Three of the four studies were successfully replicated, but re-analysis showed that in one study most of the participants responded in a manner inconsistent with the researchers’ theoretical model. In the second study, the replicated effect was shown to be an experimental confound, and in the third study the replicated statistical effect was shown to be entirely trivial. The fourth study was an unsuccessful replication, yet re-analysis of the data showed that questioning the common assumptions of modern psychological measurement can lead to novel techniques of data analysis and potentially interesting findings missed by traditional methods of analysis. Considered together, these new analyses show that while it is true replication is a key feature of science, causal inference, modeling, and measurement are equally important and perhaps more fundamental to obtaining truly scientific knowledge of the natural world. It would therefore be prudent for psychologists to confront the limitations and flaws in their current analytical methods and research practices. PMID:28805739
Modern Adaptive Analytics Approach to Lowering Seismic Network Detection Thresholds
NASA Astrophysics Data System (ADS)
Johnson, C. E.
2017-12-01
Modern seismic networks present a number of challenges, but perhaps the most notable are those related to 1) extreme variation in station density, 2) temporal variation in station availability, and 3) the need to achieve detectability for much smaller events of strategic importance. The first of these has been reasonably addressed in the development of modern seismic associators, such as GLASS 3.0 by the USGS/NEIC, though some work still remains to be done in this area. However, the latter two challenges demand special attention. Station availability is impacted by weather, equipment failure, and the adding or removing of stations, and while thresholds have been pushed to increasingly smaller magnitudes, new algorithms are needed to achieve even lower thresholds. Station availability can be addressed by a modern, adaptive architecture that maintains specified performance envelopes using adaptive analytics coupled with complexity theory. Finally, detection thresholds can be lowered using a novel approach that tightly couples waveform analytics with the event detection and association processes, based on a principled repicking algorithm that uses particle realignment for enhanced phase discrimination.
Analytical methods for dating modern writing instrument inks on paper.
Ezcurra, Magdalena; Góngora, Juan M G; Maguregui, Itxaso; Alonso, Rosa
2010-04-15
This work reviews the different analytical methods that have been proposed in the field of forensic dating of inks from different modern writing instruments. The reported works have been classified according to the writing instrument studied and the ink component analyzed in relation to aging. The study, done chronologically, shows the advances experienced in the ink dating field in the last decades. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
A review of modern instrumental techniques for measurements of ice cream characteristics.
Bahram-Parvar, Maryam
2015-12-01
There is increasing demand from the food industry and research institutes for means of measurement allowing the characterization of foods. Ice cream, as a complex food system, consists of a frozen matrix containing air bubbles, fat globules, ice crystals, and an unfrozen serum phase. Deficiencies in conventional methods for testing this product encourage the use of alternative techniques such as rheometry, spectroscopy, X-ray, electro-analytical techniques, ultrasound, and laser. Despite the development of novel instrumental applications in food science, some of them have seen little use in ice cream testing, though the few applications reported have shown promising results. Developing the novel methods should increase our understanding of the characteristics of ice cream and may allow online testing of the product. This review article discusses the potential of destructive and non-destructive methodologies in determining the quality and characteristics of ice cream and similar products. Copyright © 2015. Published by Elsevier Ltd.
Sandra, Koen; Vandenheede, Isabel; Sandra, Pat
2014-03-28
Protein biopharmaceuticals such as monoclonal antibodies and therapeutic proteins are currently in widespread use for the treatment of various life-threatening diseases, including cancer, autoimmune disorders, diabetes and anemia. The complexity of protein therapeutics far exceeds that of small-molecule drugs; hence, unraveling this complexity represents an analytical challenge. The current review provides the reader with state-of-the-art chromatographic and mass spectrometric tools available to dissect primary and higher order structures, post-translational modifications, purity and impurity profiles and pharmacokinetic properties of protein therapeutics. Copyright © 2013 Elsevier B.V. All rights reserved.
McGovern, Patrick E.; Mirzoian, Armen; Hall, Gretchen R.
2009-01-01
Chemical analyses of ancient organics absorbed into pottery jars from the beginning of advanced ancient Egyptian culture, ca. 3150 B.C., and continuing for millennia have revealed that a range of natural products—specifically, herbs and tree resins—were dispensed by grape wine. These findings provide chemical evidence for ancient Egyptian organic medicinal remedies, previously only ambiguously documented in medical papyri dating back to ca. 1850 B.C. They illustrate how humans around the world, probably for millions of years, have exploited their natural environments for effective plant remedies, whose active compounds have recently begun to be isolated by modern analytical techniques. PMID:19365069
Elements in the history of the Periodic Table.
Rouvray, Dennis H
2004-06-01
Discovery of the Periodic Table was rendered possible only after four decisive prerequisites had been achieved. These were (i) the abandonment of the metaphysical and occult notions of elements that typified the alchemical era; (ii) the adoption of a modern and workable definition of an element; (iii) the development of analytical chemical techniques for the isolation of the elements and determination of their properties; and (iv) the devising of a means of associating each element with a characteristic natural number. The Periodic Table made its appearance on cue almost as soon as these preconditions had been fulfilled.
Novel choline esterase based sensor for monitoring of organophosphorus pollutants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilkins, E.S.; Ghindilis, A.L.; Atanasov, P.
1996-12-31
Organophosphorus compounds are major environmental pollutants due to their intensive use as pesticides. Modern techniques based on inhibition of choline esterase enzyme activity are discussed. Potentiometric electrodes based on detection of choline esterase inhibition by analytes have been developed. The detection of choline esterase activity is based on a novel principle of molecular transduction. Immobilized peroxidase, acting as the molecular transducer, catalyzes the electroreduction of hydrogen peroxide by direct (mediatorless) electron transfer. The sensing element consists of a carbon-based electrode containing an assembly of co-immobilized enzymes: choline esterase, choline oxidase and peroxidase.
NASA Astrophysics Data System (ADS)
Titus, Elby; Ventura, João; Pedro Araújo, João; Campos Gil, João
2017-12-01
Nanomaterials provide a remarkably novel outlook on the design and fabrication of materials. The design, modelling and fabrication of nanomaterials demand sophisticated experimental and analytical techniques. The major impact of nanomaterials will be in the fields of electronics, energy and medicine. Nanoelectronics holds the promise of improving quality of life through electronic devices with superior performance, weight reduction and lower power consumption. New energy production systems based on hydrogen, solar and nuclear sources have also benefited immensely from nanomaterials. In modern medicine, nanomaterials research will have great impact on public health care through better diagnostic methods and the design of novel drugs.
Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.
Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio
2009-12-01
Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.
Current trends in sample preparation for cosmetic analysis.
Zhong, Zhixiong; Li, Gongke
2017-01-01
The widespread applications of cosmetics in modern life make their analysis particularly important from a safety point of view. There is a wide variety of restricted ingredients and prohibited substances that primarily influence the safety of cosmetics. Sample preparation for cosmetic analysis is a crucial step as the complex matrices may seriously interfere with the determination of target analytes. In this review, some new developments (2010-2016) in sample preparation techniques for cosmetic analysis, including liquid-phase microextraction, solid-phase microextraction, matrix solid-phase dispersion, pressurized liquid extraction, cloud point extraction, ultrasound-assisted extraction, and microwave digestion, are presented. Furthermore, the research and progress in sample preparation techniques and their applications in the separation and purification of allowed ingredients and prohibited substances are reviewed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Cardiac data mining (CDM); organization and predictive analytics on biomedical (cardiac) data
NASA Astrophysics Data System (ADS)
Bilal, M. Musa; Hussain, Masood; Basharat, Iqra; Fatima, Mamuna
2013-10-01
Data mining and data analytics have been of immense importance to many different fields as we witness the evolution of data sciences over recent years. Biostatistics and medical informatics have proved to be the foundation of many modern biological theories and analysis techniques. These are the fields that apply data mining practices along with statistical models to discover hidden trends from data comprising biological experiments or procedures on different entities. The objective of this research study is to develop a system for the efficient extraction, transformation and loading of such data from cardiologic procedure reports given by the Armed Forces Institute of Cardiology. It also aims to devise a model for the predictive analysis and classification of these data into classes of importance to cardiologists worldwide. This includes predicting patient impressions and other important features.
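To make the predictive-analytics step concrete, the sketch below trains a simple classifier on synthetic stand-ins for features extracted from procedure reports; the feature names, outcome label, and all values are illustrative assumptions, not AFIC data or the authors' model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for report-derived features (illustrative assumptions).
rng = np.random.default_rng(1)
n = 500
age = rng.normal(55, 10, n)
ef = rng.normal(55, 8, n)                        # ejection fraction, %
risk = 0.05*(age - 55) - 0.1*(ef - 55)
label = (risk + rng.normal(0, 1, n) > 0).astype(int)   # e.g., adverse impression

X = np.column_stack([age, ef])
X_tr, X_te, y_tr, y_te = train_test_split(X, label, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out accuracy: {accuracy_score(y_te, clf.predict(X_te)):.2f}")
```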
Conformal Bootstrap in Mellin Space
NASA Astrophysics Data System (ADS)
Gopakumar, Rajesh; Kaviraj, Apratim; Sen, Kallol; Sinha, Aninda
2017-02-01
We propose a new approach towards analytically solving for the dynamical content of conformal field theories (CFTs) using the bootstrap philosophy. This combines the original bootstrap idea of Polyakov with the modern technology of the Mellin representation of CFT amplitudes. We employ exchange Witten diagrams with built-in crossing symmetry as our basic building blocks rather than the conventional conformal blocks in a particular channel. Demanding consistency with the operator product expansion (OPE) implies an infinite set of constraints on operator dimensions and OPE coefficients. We illustrate the power of this method in the ɛ expansion of the Wilson-Fisher fixed point by reproducing anomalous dimensions and, strikingly, obtaining OPE coefficients to higher orders in ɛ than currently available using other analytic techniques (including Feynman diagram calculations). Our results enable us to get a somewhat better agreement between certain observables in the 3D Ising model and the precise numerical values that have been recently obtained.
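For orientation, the one-variable Mellin transform pair underlying the representation named above is the standard definition below; the paper itself works with a two-variable analogue for CFT four-point amplitudes, in which the Mellin amplitude multiplies a Γ-function measure whose poles encode the exchanged operators.

```latex
\mathcal{M}[f](s) = \int_0^{\infty} x^{\,s-1} f(x)\, \mathrm{d}x,
\qquad
f(x) = \frac{1}{2\pi i} \int_{c-i\infty}^{c+i\infty} x^{-s}\, \mathcal{M}[f](s)\, \mathrm{d}s
```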
Bioassays as one of the Green Chemistry tools for assessing environmental quality: A review.
Wieczerzak, M; Namieśnik, J; Kudłak, B
2016-09-01
For centuries, mankind has contributed to irreversible environmental changes, but due to the modern science of recent decades, scientists are able to assess the scale of this impact. The introduction of laws and standards to ensure environmental cleanliness requires comprehensive environmental monitoring, which should also meet the requirements of Green Chemistry. The broad spectrum of Green Chemistry principle applications should also include all of the techniques and methods of pollutant analysis and environmental monitoring. The classical methods of chemical analyses do not always match the twelve principles of Green Chemistry, and they are often expensive and employ toxic and environmentally unfriendly solvents in large quantities. These solvents can generate hazardous and toxic waste while consuming large volumes of resources. Therefore, there is a need to develop reliable techniques that would not only meet the requirements of Green Analytical Chemistry, but they could also complement and sometimes provide an alternative to conventional classical analytical methods. These alternatives may be found in bioassays. Commercially available certified bioassays often come in the form of ready-to-use toxkits, and they are easy to use and relatively inexpensive in comparison with certain conventional analytical methods. The aim of this study is to provide evidence that bioassays can be a complementary alternative to classical methods of analysis and can fulfil Green Analytical Chemistry criteria. The test organisms discussed in this work include single-celled organisms, such as cell lines, fungi (yeast), and bacteria, and multicellular organisms, such as invertebrate and vertebrate animals and plants. Copyright © 2016 Elsevier Ltd. All rights reserved.
ENANTIOMER-SPECIFIC FATE AND EFFECTS OF MODERN CHIRAL PESTICIDES
This slide presentation presents enantiomer-specific fate and effects of modern chiral pesticides. The research areas presented were analytical separation of enantiomers; environmental occurrence of enantiomers; transformation rates and enantioselectivity; bioaccumulation; and e...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-21
... Food Modernization Safety Act for Private Laboratory Managers AGENCY: Food and Drug Administration, HHS... Food Modernization Safety Act for Private Laboratory Managers.'' The topic to be discussed is the...
Non-traditional isotopes in analytical ecogeochemistry assessed by MC-ICP-MS
NASA Astrophysics Data System (ADS)
Prohaska, Thomas; Irrgeher, Johanna; Horsky, Monika; Hanousek, Ondřej; Zitek, Andreas
2014-05-01
Analytical ecogeochemistry deals with the development and application of tools of analytical chemistry to study dynamic biological and ecological processes within ecosystems and across ecosystem boundaries in time. It can be best described as a linkage between modern analytical chemistry and a holistic understanding of ecosystems ('The total human ecosystem') within the frame of transdisciplinary research. One focus of analytical ecogeochemistry is the advanced analysis of elements and isotopes in abiotic and biotic matrices and the application of the results to basic questions in different research fields like ecology, environmental science, climatology, anthropology, forensics, archaeometry and provenancing. With continuous instrumental developments, new isotopic systems have been recognized for their potential to study natural processes, and well established systems could be analyzed with improved techniques, especially using multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). For example, in the case of S, isotope ratio measurements at high mass resolution could be achieved at much lower S concentrations with ICP-MS as compared to IRMS, while keeping suitable uncertainty. Almost 50 different isotope systems have been investigated by ICP-MS so far, with - besides Sr, Pb and U - Ca, Mg, Cd, Li, Hg, Si, Ge and B being the most prominent, considerably pushing the limits of plasma-based mass spectrometry, also by applying high mass resolution. The use of laser ablation in combination with MC-ICP-MS offers the possibility to achieve isotopic information on a high spatial (µm-range) and temporal scale (in the case of incrementally growing structures). The information gained with these analytical techniques can be linked between different hierarchical scales in ecosystems, offering means to better understand ecosystem processes. The presentation will highlight the use of different isotopic systems in ecosystem studies accomplished by ICP-MS. Selected examples of combining isotopic systems for the study of ecosystem processes on different spatial scales will underpin the great opportunities substantiated by the field of analytical ecogeochemistry. Moreover, recent developments in plasma mass spectrometry and the application of new isotopic systems require sound metrological approaches in order to prevent scientific conclusions being drawn from analytical artifacts.
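As a reminder of the convention common to many of these isotope systems, ratios are usually reported in delta notation relative to a standard (a textbook definition, stated here for orientation only):

```latex
\delta = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right) \times 1000 \quad [\text{per mil}]
```

where R is the ratio of the heavy to the light isotope, for example 34S/32S or 26Mg/24Mg.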
Gómez-Robles, Aida; Bermúdez de Castro, José María; Arsuaga, Juan-Luis; Carbonell, Eudald; Polly, P. David
2013-01-01
A central problem in paleoanthropology is the identity of the last common ancestor of Neanderthals and modern humans ([N-MH]LCA). Recently developed analytical techniques now allow this problem to be addressed using a probabilistic morphological framework. This study provides a quantitative reconstruction of the expected dental morphology of the [N-MH]LCA and an assessment of whether known fossil species are compatible with this ancestral position. We show that no known fossil species is a suitable candidate for being the [N-MH]LCA and that all late Early and Middle Pleistocene taxa from Europe have Neanderthal dental affinities, pointing to the existence of a European clade that originated around 1 Ma. These results are incongruent with younger molecular divergence estimates and suggest that at least one of the following must be true: (i) European fossils and the [N-MH]LCA selectively retained primitive dental traits; (ii) molecular estimates of the divergence between Neanderthals and modern humans are underestimated; or (iii) phenotypic divergence and speciation between both species were decoupled such that phenotypic differentiation, at least in dental morphology, predated speciation. PMID:24145426
Big Data Analytics for Prostate Radiotherapy.
Coates, James; Souhami, Luis; El Naqa, Issam
2016-01-01
Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches to a cohort of hypofractionated prostate cancer patients, taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and for checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches.
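To illustrate the cross-validated outcome-modeling workflow described above, here is a minimal sketch on synthetic stand-ins for dose-volume and clinical variables; the feature names, the toxicity label, and all values are illustrative assumptions, not patient data or the authors' framework.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, cross_val_score

# Synthetic stand-ins for dose-volume and clinical variables (assumed).
rng = np.random.default_rng(2)
n = 400
mean_dose = rng.normal(70, 4, n)           # Gy
v60 = rng.uniform(0, 40, n)                # % volume receiving >= 60 Gy
age = rng.normal(68, 7, n)
logit = 0.15*(mean_dose - 70) + 0.05*(v60 - 20)
toxicity = (logit + rng.normal(0, 1, n) > 0).astype(int)

X = np.column_stack([mean_dose, v60, age])
model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, toxicity, scoring="roc_auc",
                         cv=KFold(n_splits=5, shuffle=True, random_state=0))
print(f"5-fold AUC: {scores.mean():.2f} +/- {scores.std():.2f}")
```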
NASA Astrophysics Data System (ADS)
Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.
2017-12-01
Spatiotemporal (ST) analytics applied to data from major providers such as USGS, NOAA, the World Bank and the World Health Organization has tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes on a local and global level. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16000+ attributes covering 200+ countries for over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and explain how others may freely access the tool.
3D-printed and CNC milled flow-cells for chemiluminescence detection.
Spilstead, Kara B; Learey, Jessica J; Doeven, Egan H; Barbante, Gregory J; Mohr, Stephan; Barnett, Neil W; Terry, Jessica M; Hall, Robynne M; Francis, Paul S
2014-08-01
Herein we explore modern fabrication techniques for the development of chemiluminescence detection flow-cells with features not attainable using the traditional coiled tubing approach. This includes the first 3D-printed chemiluminescence flow-cells, and a milled flow-cell designed to split the analyte stream into two separate detection zones within the same polymer chip. The flow-cells are compared to conventional detection systems using flow injection analysis (FIA) and high performance liquid chromatography (HPLC), with the fast chemiluminescence reactions of an acidic potassium permanganate reagent with morphine and a series of adrenergic phenolic amines. Copyright © 2014 Elsevier B.V. All rights reserved.
Next-Generation Machine Learning for Biological Networks.
Camacho, Diogo M; Collins, Katherine M; Powers, Rani K; Costello, James C; Collins, James J
2018-06-14
Machine learning, a collection of data-analytical techniques aimed at building predictive models from multi-dimensional datasets, is becoming integral to modern biological research. By enabling one to generate models that learn from large datasets and make predictions on likely outcomes, machine learning can be used to study complex cellular systems such as biological networks. Here, we provide a primer on machine learning for life scientists, including an introduction to deep learning. We discuss opportunities and challenges at the intersection of machine learning and network biology, which could impact disease biology, drug discovery, microbiome research, and synthetic biology. Copyright © 2018 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Klingenberg, Guenter; Heimerl, Joseph M.
A repository of fundamental experimental and analytical data concerning the complex phenomena associated with gun-muzzle blast and flash effects is presented, proceeding from gun muzzle signatures to modern gun-propulsion concepts, interior and transitional ballistics, and characterizations of blast-wave research and muzzle flash. Data are presented in support of a novel hypothesis which explains the ignition of secondary flash and elucidates the means for its suppression. Both chemical and mechanical (often competing) methods of flash suppression are treated. The historical work of Kesslau and Ladenburg is noted, together with French, British, Japanese and American research efforts and current techniques of experimental characterization for gun muzzle phenomena.
Comparative study of some commercial samples of naga bhasma.
Wadekar, Mrudula; Gogte, Viswas; Khandagale, Prasad; Prabhune, Asmita
2004-04-01
Naga bhasma is one of those reputed ayurvedic bhasmas claimed to possess extraordinary medical properties. However, identification of a genuine sample of naga bhasma is a challenging problem. Because naga bhasma is at present manufactured by different ayurvedic pharmacies following different methods, these products are not standardised from either a chemical or a structural point of view. Therefore, a comparative study of these samples using modern analytical techniques is important and necessary to understand their current status. In this communication, such a study of naga bhasma from the chemical and structural points of view is reported, using XRD, IR and UV spectroscopy and thermogravimetry.
Accurate mass measurements and their appropriate use for reliable analyte identification.
Godfrey, A Ruth; Brenton, A Gareth
2012-09-01
Accurate mass instrumentation is becoming increasingly available to non-expert users, and its data can be misused, particularly for analyte identification. Current best practice in assigning potential elemental formulae for reliable analyte identification is described, together with modern informatic approaches to analyte elucidation, including chemometric characterisation, data processing and searching using facilities such as the Chemical Abstracts Service (CAS) Registry and ChemSpider.
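The core of formula assignment is a tolerance search over candidate compositions. Below is a minimal sketch using standard monoisotopic masses; the measured mass and the 5 ppm window are illustrative assumptions, and real workflows add further constraints (rings-plus-double-bonds rules, isotope patterns) before consulting registries such as CAS or ChemSpider.

```python
import itertools

# Standard monoisotopic masses (Da).
MASS = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052, "O": 15.9949146221}

def fmt(c, h, n, o):
    return "".join(f"{el}{k}" for el, k in
                   (("C", c), ("H", h), ("N", n), ("O", o)) if k)

measured = 180.06339   # neutral monoisotopic mass (illustrative assumption)
tol_ppm = 5.0

hits = []
for c, n, o in itertools.product(range(1, 15), range(0, 5), range(0, 8)):
    for h in range(0, 2*c + n + 3):
        m = c*MASS["C"] + h*MASS["H"] + n*MASS["N"] + o*MASS["O"]
        if abs(m - measured) / measured * 1e6 <= tol_ppm:
            hits.append((fmt(c, h, n, o), round(m, 5)))

# Glucose, C6H12O6 (180.06339 Da), should appear among the candidates.
print(hits)
```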
NASA Technical Reports Server (NTRS)
Panda, Binayak; Gorti, Sridhar
2013-01-01
A number of research instruments are available at NASA's Marshall Space Flight Center (MSFC) to support ISS researchers and their investigations. These modern analytical tools yield valuable and sometimes new information resulting from sample characterization. Instruments include modern scanning electron microscopes equipped with field emission guns providing analytical capabilities that include angstrom-level image resolution of dry, wet and biological samples. These microscopes are also equipped with silicon drift X-ray detectors (SDD) for fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations in crystalline alloys. Sample chambers admit large samples, provide variable pressures for wet samples, and include quantitative analysis software to determine phase relations. Advances in solid-state electronics have also facilitated improvements for surface chemical analysis that are successfully employed to analyze metallic materials and alloys, ceramics, slags, and organic polymers. Another analytical capability at MSFC is a magnetic sector Secondary Ion Mass Spectrometry (SIMS) instrument that quantitatively determines and maps light elements such as hydrogen, lithium, and boron along with their isotopes, and identifies and quantifies very low level impurities even at parts per billion (ppb) levels. Still other methods available at MSFC include X-ray photoelectron spectroscopy (XPS), which can determine oxidation states of elements as well as identify polymers and measure film thicknesses on coated materials, and scanning Auger electron spectroscopy (SAM), which combines surface sensitivity, spatial lateral resolution (approximately 20 nm), and depth profiling capabilities to describe elemental compositions in near-surface regions and even the chemical state of analyzed atoms. A conventional Transmission Electron Microscope (TEM) for observing internal microstructures at very high magnifications and an Electron Probe Micro-analyzer (EPMA) for very precise microanalysis are available as needed by the researcher. Space Station researchers are invited to work with MSFC in analyzing their samples using these techniques.
NASA Astrophysics Data System (ADS)
Dodd, J. P.; Sharp, Z. D.; Fawcett, P.
2008-12-01
Oxygen isotope values of biogenic silica from diatom frustules are a commonly used proxy in freshwater and marine environments, and provide a valuable archive of paleoclimatic information such as temperature and water cycle processes. Advances in analytical techniques have made oxygen isotope measurements of diatom silica more robust; however, to date, there are multiple published fractionation factors for biogenic silica, with no general consensus on which is 'correct.' Previous studies (e.g. Moschen et al., 2005) demonstrated that there is no difference in SiO2-H2O fractionation between different size fractions of diatoms and, therefore, no species-dependent effects. The SiO2-H2O fractionation factors observed in the laboratory-grown diatoms analyzed by Brandriss et al. (1998) and in modern lacustrine diatoms (Moschen et al., 2005) are in close agreement (temperature coefficient of about -0.2‰/°C) and are defined by the equations 1000lnα(SiO2-H2O) = 15.56(10³/T) - 20.92 and 1000lnα(SiO2-H2O) = 20.5(10³/T) - 36.2, respectively. However, these studies are not in agreement with other published SiO2-H2O fractionation factors for biogenic silica in marine and freshwater environments. In order to effectively utilize diatom δ18O values as a climate proxy, it is necessary to understand how oxygen isotopes are fractionated during silica frustule formation and to identify potential errors in δ18O values obtained through different analytic/purification processes. Here we present oxygen isotope data from modern diatom species collected from a wide variety of natural riverine and lacustrine environments in northern New Mexico, USA. Temperatures at collection sites ranged from 5.5°C to 37.8°C. Preliminary isotope data indicate a SiO2-H2O fractionation factor identical to that of Brandriss et al. (1998). Additional experiments were undertaken to examine the effect of differing chemical purification techniques (i.e. HNO3, H2O2, and NaOH) on modern diatoms, to see if processing techniques might affect the δ18O values of modern samples. Visual inspection of diatom frustules with a scanning electron microscope before and after treatment with HNO3 indicates no physical alteration of the frustule structure. To discount the possibility of oxygen exchange between diatom SiO2 and HNO3, samples were treated with an 18O-enriched nitric acid (1000‰), and the resulting δ18O values were essentially unchanged. Organic content following treatment with HNO3 was measured with an elemental analyzer, and diatoms were considered to be pure SiO2 once weight percent carbon dropped below 0.01%. When diatoms were treated with H2O2 alone, significant organic material (>5 weight percent carbon) remained. Oxygen isotope values were obtained using a laser-extraction, stepwise fluorination technique that provides an additional visual confirmation of diatom purity: when pure F2 was introduced to the laser chamber during prefluorination, any sample with greater than 0.5 weight percent carbon reacted violently to produce CF4 and O2 gas, and resulted in anomalous diatom δ18O values.
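A quick numeric check of the two calibrations quoted above (our own arithmetic, using the equations exactly as given in the text) shows both the offset between them and the roughly -0.2‰/°C temperature sensitivity:

```python
# The two calibrations as quoted in the text (T in kelvin, output in per mil).
def brandriss_1998(TK):
    return 15.56 * (1000.0 / TK) - 20.92

def moschen_2005(TK):
    return 20.5 * (1000.0 / TK) - 36.2

for TC in (5.5, 25.0, 37.8):     # spans the collection-site range above
    TK = TC + 273.15
    print(f"{TC:5.1f} C   Brandriss: {brandriss_1998(TK):5.2f}   "
          f"Moschen: {moschen_2005(TK):5.2f}")

# Differentiating: d(1000 ln alpha)/dT = -15.56e3/T^2 (about -0.18 per mil/C)
# for Brandriss and -20.5e3/T^2 (about -0.23) for Moschen, consistent with
# the ~ -0.2 per mil/C coefficient quoted above.
```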
Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M
2018-08-01
Detection dogs serve a plethora of roles within modern society and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs have involved ambiguity. This is partially because the assessment of the effectiveness of detection dogs remains entrenched within a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs rest on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography-related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies provide both an alternative and an assistor for the detection dog industry; however, the interrelationship between these two detection paradigms requires clarification. These factors, when considering their relative contributions, illustrate a need to address research gaps, formalise the detection dog industry and research process, and take into consideration analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of the factors involved in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Stewart, R.; Piburn, J.; Sorokine, A.; Myers, A.; Moehl, J.; White, D.
2015-07-01
The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.
Multiplex biosensing with highly sensitive magnetic nanoparticle quantification method
NASA Astrophysics Data System (ADS)
Nikitin, M. P.; Orlov, A. V.; Znoyko, S. L.; Bragina, V. A.; Gorshkov, B. G.; Ksenevich, T. I.; Cherkasov, V. R.; Nikitin, P. I.
2018-08-01
Unique properties of magnetic nanoparticles (MNP) have provided many breakthrough solutions for life science. The immense potential of MNP as labels in advanced immunoassays stems from the fact that they, unlike optical labels, can be easily detected inside 3D opaque porous biosensing structures or in colored media, can be manipulated by an external magnetic field, and exhibit high stability and negligible background signal in biological samples. In this research, magnetic nanolabels and an original technique for their quantification by non-linear magnetization have permitted the development of novel methods of multiplex biosensing. Several types of highly sensitive multi-channel readers that offer an extremely wide linear dynamic range have been developed to count MNP in different recognition zones for quantitative concentration measurements of various analytes. Four approaches to multiplex biosensing based on MNP have been demonstrated in one-run tests: several 3D porous structures; flat and micropillar microfluidic sensor chips; multi-line lateral flow strips; and a modular architecture of the strips, which is the first 3D multiplexing method that goes beyond the traditional planar techniques. Detection of cardio- and cancer markers, small molecules and oligonucleotides was used in the experiments. The analytical characteristics of the developed multiplex methods are on the level of modern time-consuming laboratory techniques. The developed multiplex biosensing platforms are promising for medical and veterinary diagnostics, food inspection, environmental and security monitoring, etc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawton, Craig R.
2015-01-01
The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents a significant opportunity for Sandia to support the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.
Matching in an undisturbed natural human environment.
McDowell, J J; Caron, Marcia L
2010-05-01
Data from the Oregon Youth Study, consisting of the verbal behavior of 210 adolescent boys determined to be at risk for delinquency (targets) and 210 of their friends (peers), were analyzed for their conformance to the complete family of matching theory equations in light of recent findings from the basic science, and using recently developed analytic techniques. Equations of the classic and modern theories of matching were fitted as ensembles to rates and time allocations of the boys' rule-break and normative talk obtained from conversations between pairs of boys. The verbal behavior of each boy in a conversation was presumed to be reinforced by positive social responses from the other boy. Consistent with recent findings from the basic science, the boys' verbal behavior was accurately described by the modern but not the classic theory of matching. These findings also add support to the assertion that basic principles and processes that are known to govern behavior in laboratory experiments also govern human social behavior in undisturbed natural environments.
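For reference, the equation family at issue takes these standard forms from the matching literature (quoted here for orientation, not reproduced from the study itself), with B the response rate, r the reinforcement rate, k the asymptotic rate, r_e the background reinforcement rate, and a a sensitivity exponent:

\[ B = \frac{kr}{r + r_e} \qquad \text{(classic theory: Herrnstein's hyperbola)} \]
\[ B = \frac{kr^{a}}{r^{a} + r_{e}^{a}} \qquad \text{(modern theory: exponentiated hyperbola)} \]

For two concurrent alternatives the modern account reduces to the generalized matching law, \( \log(B_1/B_2) = a\,\log(r_1/r_2) + \log b \), where b is a bias parameter.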
The evolution of analytical chemistry methods in foodomics.
Gallo, Monica; Ferranti, Pasquale
2016-01-08
The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on modern instrumental platforms. Today, the development and optimization of integrated analytical approaches, based on different techniques for studying the chemical composition of a food at the molecular level, makes it possible to define a 'food fingerprint', valuable for assessing the nutritional value, safety, quality, authenticity and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors and bioinformatics. Integrated approaches can help to elucidate critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability, and food production in response to worldwide environmental change. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies and advance our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review is to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.
Response Surface Methods for Spatially-Resolved Optical Measurement Techniques
NASA Technical Reports Server (NTRS)
Danehy, P. M.; Dorrington, A. A.; Cutler, A. D.; DeLoach, R.
2003-01-01
Response surface methods (or methodology), RSM, have been applied to improve data quality for two vastly different spatially-resolved optical measurement techniques. In the first application, modern design of experiments (MDOE) methods, including RSM, are employed to map the temperature field in a direct-connect supersonic combustion test facility at NASA Langley Research Center. The laser-based measurement technique known as coherent anti-Stokes Raman spectroscopy (CARS) is used to measure temperature at various locations in the combustor. RSM is then used to develop temperature maps of the flow. Even though the temperature fluctuations at a single point in the flowfield have a standard deviation on the order of 300 K, RSM provides analytic fits to the data having 95% confidence interval half width uncertainties in the fit as low as +/-30 K. Methods of optimizing future CARS experiments are explored. The second application of RSM is to quantify the shape of a 5-meter diameter, ultra-light, inflatable space antenna at NASA Langley Research Center.
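As an illustration of the core idea — fitting scattered point measurements to a low-order response surface — the following sketch uses hypothetical data and a hypothetical second-order polynomial; it is not the authors' code or their model order:

    import numpy as np

    # Hypothetical stand-in for scattered CARS-style temperature measurements
    # T(x, y); locations, values, and the 300 K noise level are illustrative.
    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, 60)
    y = rng.uniform(0.0, 1.0, 60)
    T = 1800.0 + 400.0 * x - 300.0 * y**2 + rng.normal(0.0, 300.0, 60)

    # Second-order response surface: T ~ b0 + b1*x + b2*y + b3*x^2 + b4*x*y + b5*y^2
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coef, *_ = np.linalg.lstsq(A, T, rcond=None)

    # The fitted surface averages over point-wise scatter, which is how a
    # +/-300 K single-point fluctuation can yield fit uncertainties of only
    # a few tens of kelvin.
    T_fit = A @ coef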
Nanoscale infrared spectroscopy as a non-destructive probe of extraterrestrial samples.
Dominguez, Gerardo; Mcleod, A S; Gainsforth, Zack; Kelly, P; Bechtel, Hans A; Keilmann, Fritz; Westphal, Andrew; Thiemens, Mark; Basov, D N
2014-12-09
Advances in the spatial resolution of modern analytical techniques have tremendously augmented the scientific insight gained from the analysis of natural samples. Yet, while techniques for the elemental and structural characterization of samples have achieved sub-nanometre spatial resolution, infrared spectral mapping of geochemical samples at vibrational 'fingerprint' wavelengths has remained restricted to spatial scales >10 μm. Nevertheless, infrared spectroscopy remains an invaluable contactless probe of chemical structure, details of which offer clues to the formation history of minerals. Here we report on the successful implementation of infrared near-field imaging, spectroscopy and analysis techniques capable of sub-micron scale mineral identification within natural samples, including a chondrule from the Murchison meteorite and a cometary dust grain (Iris) from NASA's Stardust mission. Complementary to scanning electron microscopy, energy-dispersive X-ray spectroscopy and transmission electron microscopy probes, this work evidences a similarity between chondritic and cometary materials, and inaugurates a new era of infrared nano-spectroscopy applied to small and invaluable extraterrestrial samples.
NASA Astrophysics Data System (ADS)
Javier Romualdez, Luis
Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work is centered on the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT, coupled with its observational efficiency, image quality, and accessibility, rival state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application. SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight, where the post-flight results close the overall design loop by suggesting practical improvements to pre-design methodologies. Overall, the analytical and practical results presented in this work, though centered on the SuperBIT project, provide generically useful and implementationally viable methodologies for high precision balloon-borne instrumentation, all of which are validated, justified, and improved both theoretically and practically. As such, the continuing development of SuperBIT, built from the work presented in this thesis, strives to further the potential for scientific balloon-borne astronomy in the near future.
A Modern Approach to College Analytical Chemistry.
ERIC Educational Resources Information Center
Neman, R. L.
1983-01-01
Describes a course which emphasizes all facets of analytical chemistry, including sampling, preparation, interference removal, selection of methodology, measurement of a property, and calculation/interpretation of results. Includes special course features (such as cooperative agreement with an environmental protection center) and course…
Teaching social responsibility in analytical chemistry.
Valcárcel, M; Christian, G D; Lucena, R
2013-07-02
Analytical chemistry is key to the functioning of a modern society. From early days, ethics in measurements have been a concern and that remains today, especially as we have come to rely more on the application of analytical science in many aspects of our lives. The main aim of this Feature is to suggest ways of introducing the topic of social responsibility and its relation to analytical chemistry in undergraduate or graduate chemistry courses.
Fiber optic evanescent wave biosensor
NASA Astrophysics Data System (ADS)
Duveneck, Gert L.; Ehrat, Markus; Widmer, H. M.
1991-09-01
The role of modern analytical chemistry is not restricted to quality control and environmental surveillance, but has been extended to process control using on-line analytical techniques. Besides industrial applications, highly specific, ultra-sensitive biochemical analysis becomes increasingly important as a diagnostic tool, both in central clinical laboratories and in the doctor's office. Fiber optic sensor technology can fulfill many of the requirements for both types of applications. As an example, the experimental arrangement of a fiber optic sensor for biochemical affinity assays is presented. The evanescent electromagnetic field, associated with a light ray guided in an optical fiber, is used for the excitation of luminescence labels attached to the biomolecules in solution to be analyzed. Due to the small penetration depth of the evanescent field into the medium, the generation of luminescence is restricted to the close proximity of the fiber, where, e.g., the luminescent analyte molecules combine with their affinity partners, which are immobilized on the fiber. Both cw- and pulsed light excitation can be used in evanescent wave sensor technology, enabling the on-line observation of an affinity assay on a macroscopic time scale (seconds and minutes), as well as on a microscopic, molecular time scale (nanoseconds or microseconds).
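For orientation, the penetration depth of the evanescent field follows the textbook relation (quoted here for context, not specific to this paper)

\[ d_p = \frac{\lambda}{4\pi\sqrt{n_1^2 \sin^2\theta - n_2^2}} \]

where \( \lambda \) is the wavelength, \( n_1 \) the refractive index of the fiber core, \( n_2 \) that of the surrounding medium, and \( \theta \) the internal angle of incidence. Since \( d_p \) is typically on the order of 100 nm, only labels bound at or near the fiber surface are excited, which is what confines the luminescence signal to the affinity layer.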
Yu, Daxiong; Ma, Ruijie; Fang, Jianqiao
2015-05-01
There are many eminent acupuncture masters of modern times in Zhejiang province, and they have developed acupuncture schools with numerous distinctive characteristics that exert an important influence at home and abroad. Through collection of the literature on the acupuncture schools of Zhejiang and interviews with the parties involved, it has been found that the acupuncture manipulation techniques of these modern masters are specifically featured. The techniques are developed on the basis of Neijing (Internal Classic), Jinzhenfu (Ode to Gold Needle) and Zhenjiu Dacheng (Great Compendium of Acupuncture and Moxibustion). Whether following the old maxims or self-taught, every master lays emphasis on the research and interpretation of the classical theories and integrates the traditional with the modern. In this paper, the acupuncture manipulation techniques of the modern Zhejiang acupuncture masters are described in four aspects: the needling techniques of the Internal Classic, the feijingzouqi needling technique, the penetrating needling technique, and innovations in acupuncture manipulation.
Increasing the reliability of ecological models using modern software engineering techniques
Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff
2009-01-01
Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...
NASA Astrophysics Data System (ADS)
Gerontas, Apostolos
2014-08-01
Chromatographic instrumentation has been highly influential in shaping modern chemical practice, and yet it has been largely overlooked by the history of science. In the 1960s, gas chromatography was considered the analytical technique closest to becoming dominant, and as the first automated chromatography it set the standards that all subsequent chromatographic instrumentation had to fulfill. Networks of specialists, groups of actors, corporate strategies, and the analytical practice itself were all affected, in many ways, by the entrance of gas chromatography into the chemical laboratory and the instrumentation market. This paper gives a view of the early history of gas chromatography instrumentation, relates it to the broader research-technology phenomenon, and discusses issues of education and group reproduction among the technologists of the era. The chaotic elements of knowledge transfer during the instrumentation revolution in chemistry are highlighted and connected to the observable radical innovation of the period.
Costa, Paulo R; Caldas, Linda V E
2002-01-01
This work presents the development and evaluation of modern techniques for calculating radiation protection barriers in clinical radiographic facilities. Our methodology uses realistic primary and scattered spectra. The primary spectra were computer simulated using a waveform generalization and a semiempirical model (the Tucker-Barnes-Chakraborty model). The scattered spectra were obtained from published data. An analytical function was used to produce attenuation curves for polychromatic radiation at specified kVp, waveform, and filtration; its results are given in ambient dose equivalent units. The attenuation curves were obtained by applying Archer's model to the computer simulation data. The parameters for the best fit to the model, using primary and secondary radiation data from different radiographic procedures, were determined, resulting in an optimized shielding-calculation model for any radiographic room. The shielding costs were about 50% lower than those calculated using the traditional method based on Report No. 49 of the National Council on Radiation Protection and Measurements.
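For context, Archer's model — the standard broad-beam transmission form in diagnostic shielding work — writes the transmission B through a barrier of thickness x in terms of fitted parameters \( \alpha, \beta, \gamma \), and inverts analytically for the required thickness:

\[ B(x) = \left[\left(1 + \frac{\beta}{\alpha}\right) e^{\alpha\gamma x} - \frac{\beta}{\alpha}\right]^{-1/\gamma}, \qquad x = \frac{1}{\alpha\gamma}\,\ln\!\left[\frac{B^{-\gamma} + \beta/\alpha}{1 + \beta/\alpha}\right]. \]

The best-fit parameters referred to in the abstract are per-procedure, per-material triplets \( (\alpha, \beta, \gamma) \) of this form.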
[Artistic creativity in the light of Jungian analytical psychology].
Trixler, Mátyás; Gáti, Agnes; Tényi, Tamás
2010-01-01
C.G. Jung's analytical psychology points to important issues in the psychological understanding of creativity. The theories of the Collective Unconscious and the Archetypes contributed to important discoveries in the interpretation of artistic creativity. Jung was concerned to show the relevance of Analytical Psychology to the understanding of European Modernism. Our paper also offers a short Jungian interpretation of Csontvary's art.
Perspectives on bioanalytical mass spectrometry and automation in drug discovery.
Janiszewski, John S; Liston, Theodore E; Cole, Mark J
2008-11-01
The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.
NASA Astrophysics Data System (ADS)
Livings, R. A.; Dayal, V.; Barnard, D. J.; Hsu, D. K.
2012-05-01
Ceramic tiles are the main constituent of a multi-material, multi-layered composite being considered for the modernization of tank armors. The high stiffness, low attenuation, and precise dimensions of these uniform tiles make them remarkable resonators when driven to vibrate. Defects in the tile, introduced during manufacture or after usage, are expected to change the resonance frequencies and resonance images of the tile. The comparison of the resonance frequencies and resonance images of a pristine tile/lay-up with those of a defective tile/lay-up thus provides a quantitative damage metric. By examining the vibrational behavior of these tiles and the composite lay-up with Finite Element Modeling and analytical plate vibration equations, the development of a new Nondestructive Evaluation technique is possible. This study examines the development of the Air-Coupled Ultrasonic Resonance Imaging technique as applied to a hexagonal ceramic tile and a multi-material, multi-layered composite.
Artificial intelligence in healthcare: past, present and future.
Jiang, Fei; Jiang, Yong; Zhi, Hui; Dong, Yi; Li, Hao; Ma, Sufeng; Wang, Yilong; Dong, Qiang; Shen, Haipeng; Wang, Yongjun
2017-12-01
Artificial intelligence (AI) aims to mimic human cognitive functions. It is bringing a paradigm shift to healthcare, powered by the increasing availability of healthcare data and rapid progress of analytics techniques. We survey the current status of AI applications in healthcare and discuss its future. AI can be applied to various types of healthcare data (structured and unstructured). Popular AI techniques include machine learning methods for structured data, such as the classical support vector machine and neural network, and the modern deep learning, as well as natural language processing for unstructured data. Major disease areas that use AI tools include cancer, neurology and cardiology. We then review in more detail the AI applications in stroke, in the three major areas of early detection and diagnosis, treatment, as well as outcome prediction and prognosis evaluation. We conclude with a discussion of pioneer AI systems, such as IBM Watson, and hurdles for real-life deployment of AI.
Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia
2017-01-01
This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full-field, multi-material analytical technique, presented here as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique's application to inter-sample comparison was demonstrated through analysis of the principal component (PC) weights. It was found that SSM could provide highly detailed qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
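A minimal sketch of the statistical modelling step, under the assumption (consistent with the abstract) that each registered sample is flattened to one feature vector before PCA; the dimensions and data here are hypothetical:

    import numpy as np

    # Hypothetical input: n registered samples, each flattened to a p-vector of
    # coordinates (plus, in the paper's setting, a binary enamel/dentin channel).
    rng = np.random.default_rng(1)
    n, p = 30, 3000
    X = rng.normal(size=(n, p))            # stand-in for registered, aligned shapes

    mean_shape = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)   # PCA via SVD

    scores = U * S                         # per-sample PC weights (inter-sample comparison)
    explained = S**2 / np.sum(S**2)        # variance captured by each mode

    # Any sample is approximated from the first k modes:
    k = 5
    approx = mean_shape + scores[0, :k] @ Vt[:k]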
Artificial intelligence in healthcare: past, present and future
Jiang, Fei; Jiang, Yong; Zhi, Hui; Dong, Yi; Li, Hao; Ma, Sufeng; Wang, Yilong; Dong, Qiang; Shen, Haipeng; Wang, Yongjun
2017-01-01
Artificial intelligence (AI) aims to mimic human cognitive functions. It is bringing a paradigm shift to healthcare, powered by the increasing availability of healthcare data and rapid progress of analytics techniques. We survey the current status of AI applications in healthcare and discuss its future. AI can be applied to various types of healthcare data (structured and unstructured). Popular AI techniques include machine learning methods for structured data, such as the classical support vector machine and neural network, and the modern deep learning, as well as natural language processing for unstructured data. Major disease areas that use AI tools include cancer, neurology and cardiology. We then review in more detail the AI applications in stroke, in the three major areas of early detection and diagnosis, treatment, as well as outcome prediction and prognosis evaluation. We conclude with a discussion of pioneer AI systems, such as IBM Watson, and hurdles for real-life deployment of AI. PMID:29507784
ERIC Educational Resources Information Center
Shaban, Zakariyya Shaban
2015-01-01
This study aimed to investigate the extent to which communication skills textbooks include modernism and contemporary values, whether the experiences they present are sequenced, and the orientation behind this concentration. A list of 10 modernism and contemporary values was compiled. Content analysis was used as a tool in collecting data,…
Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo
2015-01-01
Modern omics disciplines dealing with food flavor focus the analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.) by means of a comprehensive, integrated treatment of sample constituents, such as physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit fully this potential. This review focuses on advances in comprehensive two-dimensional gas chromatography and analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.
Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C
2014-01-01
Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient's pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (SBM) was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology, or QCP), which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient's physiologic state. This platform allows predictive analytic techniques to identify early changes in a patient's condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP, in which computer simulations of a slowly evolving cardiac tamponade produced a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz), with the tamponade introduced 420 minutes into the simulation sequence. The ability of the SBM predictive analytics methodology to identify clinical deterioration was compared against the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. With the slow development of the tamponade, the SBM model and the simulated biosignals are seen to disagree in the early stages of physiologic deterioration, while the variables are still within normal ranges. Thus, the SBM system identified pathophysiologic conditions in a timeframe in which they would not have been detected in a usual clinical monitoring scenario. Conclusion. In this study, the functionality of a multivariate machine learning predictive methodology that incorporates commonly monitored clinical information was tested using a computer model of human physiology. SBM and predictive analytics were able to differentiate a state of decompensation while the monitored variables were still within normal clinical ranges. This finding suggests that SBM could provide early identification of clinical deterioration using predictive analytic techniques. Keywords: predictive analytics, hemodynamics, monitoring.
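The core of similarity-based estimation can be sketched generically as follows; this is a plain kernel estimator written for illustration, not the proprietary SBM implementation, and the bandwidth h is an assumed tuning parameter:

    import numpy as np

    def sbm_estimate(D, x, h=1.0):
        # D: (m, d) memory matrix of reference 'normal' multivariate states
        # x: (d,) current observation; h: kernel bandwidth (assumed, needs tuning)
        dists = np.linalg.norm(D - x, axis=1)
        w = np.exp(-(dists / h) ** 2)         # similarity of x to each stored state
        w /= w.sum()
        x_hat = w @ D                         # similarity-weighted 'expected normal' state
        residual = np.linalg.norm(x - x_hat)  # grows as the patient departs normality
        return x_hat, residual

The idea matches the abstract: deterioration is flagged when the residual trend leaves its learned band, even while every individual variable is still inside its clinical limits.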
Many-core graph analytics using accelerated sparse linear algebra routines
NASA Astrophysics Data System (ADS)
Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric
2016-05-01
Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly-scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis, as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without the requirement for the customer to make any changes to their analytics code, thanks to the compatibility with existing graph APIs.
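To make the linear-algebra formulation concrete, the textbook reduction of breadth-first search to repeated sparse matrix-vector products (one SpMV per BFS level) can be sketched with SciPy; the graph here is hypothetical, and this is not the authors' GraphBLAS backend:

    import numpy as np
    from scipy.sparse import csr_matrix

    # Hypothetical 5-vertex directed graph as a sparse adjacency matrix A
    rows = np.array([0, 0, 1, 2, 3])
    cols = np.array([1, 2, 3, 3, 4])
    A = csr_matrix((np.ones(5), (rows, cols)), shape=(5, 5))

    level = np.full(5, -1)
    frontier = np.zeros(5)
    frontier[0] = 1.0
    level[0] = 0
    visited = frontier > 0

    d = 0
    while frontier.any():
        d += 1
        reached = np.asarray(A.T @ frontier).ravel() > 0   # one BFS level = one SpMV
        nxt = reached & ~visited
        level[nxt] = d
        visited |= nxt
        frontier = nxt.astype(float)

    print(level)   # BFS depth of each vertex from source 0: [0 1 1 2 3]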
Cornut, P-L; Soldermann, Y; Robin, C; Barranco, R; Kerhoas, A; Burillon, C
2013-12-01
To report the financial impact of using modern lens and vitreoretinal surgical techniques. Bottom-up sterilization and consumables costs for the new surgical techniques (microincisional coaxial phacoemulsification and transconjunctival sutureless vitrectomy) and the corresponding former techniques (phacoemulsification with 3.2-mm incision and 20G vitrectomy) were determined. These costs were compared to each other and to the target costs of the relevant Diagnosis Related Groups for public hospitals (Groupes Homogènes de Séjours [GHS]), extracted from the analytic accounting data of the French National Cost Study (Étude Nationale des Coûts [ENC]) for 2009 (target = sum of sterilization costs posted under medical logistics, plus consumables, implantable medical devices, and special pharmaceuticals posted as direct expenses). For outpatient lens surgery with or without vitrectomy (GHS code: 02C05J), the ENC's target cost for 2009 was 339€ out of a total of 1432€. The cost detailed in this study was 4% higher than the target cost when the procedure was performed using the former technique (3.2mm sutured incision) and 12% lower when the procedure was performed using the new technique (1.8mm sutureless), after removal of now-unnecessary consumables and optimization of the technique. For level I retinal detachment surgeries (GHS code: 02C021), the ENC's 2009 target cost was 641€ out of a total of 3091€. The cost specified in this study was 1% lower than the target cost when the procedure was done using the former technique (20-G vitrectomy) and 16% lower when the procedure was performed using the new technique (transconjunctival vitrectomy), after removal of now-unnecessary consumables and optimization of the technique. Contrary to generally accepted ideas, implementing modern techniques in ocular surgery can produce direct savings on costs and sterilization when the operator takes advantage of the simplifications these techniques allow and eliminates consumables that are no longer necessary. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
The use of analytical sedimentation velocity to extract thermodynamic linkage.
Cole, James L; Correia, John J; Stafford, Walter F
2011-11-01
For 25 years, the Gibbs Conference on Biothermodynamics has focused on the use of thermodynamics to extract information about the mechanism and regulation of biological processes. This includes the determination of equilibrium constants for macromolecular interactions by high precision physical measurements. These approaches further reveal thermodynamic linkages to ligand binding events. Analytical ultracentrifugation has been a fundamental technique in the determination of macromolecular reaction stoichiometry and energetics for 85 years. This approach is highly amenable to the extraction of thermodynamic couplings to small molecule binding in the overall reaction pathway. In the 1980s this approach was extended to the use of sedimentation velocity techniques, primarily by the analysis of tubulin-drug interactions by Na and Timasheff. This transport method necessarily incorporates the complexity of both hydrodynamic and thermodynamic nonideality. The advent of modern computational methods in the last 20 years has subsequently made the analysis of sedimentation velocity data for interacting systems more robust and rigorous. Here we review three examples where sedimentation velocity has been useful at extracting thermodynamic information about reaction stoichiometry and energetics. Approaches to extract linkage to small molecule binding and the influence of hydrodynamic nonideality are emphasized. These methods are shown to also apply to the collection of fluorescence data with the new Aviv FDS. Copyright © 2011 Elsevier B.V. All rights reserved.
The use of analytical sedimentation velocity to extract thermodynamic linkage
Cole, James L.; Correia, John J.; Stafford, Walter F.
2011-01-01
For 25 years, the Gibbs Conference on Biothermodynamics has focused on the use of thermodynamics to extract information about the mechanism and regulation of biological processes. This includes the determination of equilibrium constants for macromolecular interactions by high precision physical measurements. These approaches further reveal thermodynamic linkages to ligand binding events. Analytical ultracentrifugation has been a fundamental technique in the determination of macromolecular reaction stoichiometry and energetics for 85 years. This approach is highly amenable to the extraction of thermodynamic couplings to small molecule binding in the overall reaction pathway. In the 1980’s this approach was extended to the use of sedimentation velocity techniques, primarily by the analysis of tubulin-drug interactions by Na and Timasheff. This transport method necessarily incorporates the complexity of both hydrodynamic and thermodynamic nonideality. The advent of modern computational methods in the last 20 years has subsequently made the analysis of sedimentation velocity data for interacting systems more robust and rigorous. Here we review three examples where sedimentation velocity has been useful at extracting thermodynamic information about reaction stoichiometry and energetics. Approaches to extract linkage to small molecule binding and the influence of hydrodynamic nonideality are emphasized. These methods are shown to also apply to the collection of fluorescence data with the new Aviv FDS. PMID:21703752
Recent developments in the analysis of toxic elements.
Lisk, D J
1974-06-14
One may conclude that it is impractical to confine oneself to any one analytical method since ever more sensitive instrumentation continues to be produced. However, in certain methods such as anodic stripping voltammetry and flameless atomic absorption it may be background contamination from reagent impurities and surroundings rather than instrument sensitivity which controls the limits of element detection. The problem of contamination from dust or glassware is greatly magnified when the sample size becomes ever smaller. Air entering laboratories near highways may contain trace quantities of lead, cadmium, barium, antimony, and other elements from engine exhaust. Even plastic materials contacting the sample may be suspect as a source of contamination since specific metals may be used as catalysts in the synthesis of the plastic and traces may be retained in it. Certain elements may even be deliberately added to plastics during manufacture for identification purposes. Nondestructive methods such as neutron activation and x-ray techniques thus offer great advantages not only in time but in the elimination of impurities introduced during sample ashing. Future improvements in attainable limits of detection may arise largely from progress in the ultrapurification of reagents and "clean-room" techniques. Finally, the competence of the analyst is also vitally important in the skillful operation of modern complex analytical instrumentation and in the experienced evaluation of data.
Fast ray-tracing of human eye optics on Graphics Processing Units.
Wei, Qi; Patkar, Saket; Pai, Dinesh K
2014-05-01
We present a new technique for simulating retinal image formation by tracing a large number of rays from objects in three dimensions as they pass through the optic apparatus of the eye to the retina. Simulating human optics is useful for understanding basic questions of vision science and for studying vision defects and their corrections. Because of the complexity of computing such simulations accurately, most previous efforts used simplified analytical models of the normal eye. This makes them less effective in modeling vision disorders associated with abnormal shapes of the ocular structures, which are hard to represent precisely with analytical surfaces. We have developed a computer simulator that can simulate ocular structures of arbitrary shapes, for instance represented by polygon meshes. Topographic and geometric measurements of the cornea, lens, and retina from keratometer or medical imaging data can be integrated for individualized examination. We utilize parallel processing on modern Graphics Processing Units (GPUs) to efficiently compute retinal images by tracing millions of rays. A stable retinal image can be generated within minutes. We simulated depth-of-field, accommodation, chromatic aberrations, as well as astigmatism and its correction. We also show application of the technique in patient-specific vision correction by incorporating geometric models of the orbit reconstructed from clinical medical images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
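The per-surface operation at the heart of such a ray tracer is refraction; a standard vector form of Snell's law is sketched below (a generic illustration, not the paper's GPU kernel; the corneal index in the example is an assumed value):

    import numpy as np

    def refract(i, n, eta):
        # Vector form of Snell's law: i = unit incident direction, n = unit
        # surface normal (pointing against i), eta = n1/n2.
        # Returns None on total internal reflection.
        cos_i = -np.dot(n, i)
        k = 1.0 - eta**2 * (1.0 - cos_i**2)
        if k < 0.0:
            return None
        return eta * i + (eta * cos_i - np.sqrt(k)) * n

    # Example: a ray entering the cornea; 1.376 is an assumed corneal index.
    d = refract(np.array([0.0, -0.6, -0.8]), np.array([0.0, 0.0, 1.0]), 1.0 / 1.376)

Applied per surface and per ray, this step parallelizes naturally, which is what makes tracing millions of rays on a GPU effective.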
Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo
2015-01-15
Microbial fuel cells were rediscovered twenty years ago and are now a very active research area. The reasons behind this new activity are the relatively recent discovery of electrogenic or electroactive bacteria and the vision of two important practical applications: wastewater treatment coupled with clean energy production, and power supply systems for isolated low-power sensor devices. Although some analytical applications of MFCs were proposed earlier (such as biochemical oxygen demand sensing), only lately has a myriad of new uses of this technology been presented by research groups around the world, combining biological-microbiological and electroanalytical expertise. This is the second part of a review of MFC applications in the area of analytical sciences. Part I gave a general introduction to biologically based analytical methods, including bioassays and biosensors, MFC designs and operating principles, as well as perhaps the main and earliest application, the use as a BOD sensor. In Part II, other proposed uses are presented and discussed. As with other microbially based analytical systems, MFCs are satisfactory systems for measuring and integrating complex parameters that are difficult or impossible to measure otherwise, such as water toxicity (where the toxic effect on aquatic organisms needs to be integrated). We explore here the methods proposed to measure toxicity and microbial metabolism and, of special interest to space exploration, life sensors. Also, some methods with higher specificity, proposed to detect a single analyte, are presented. Different possibilities for increasing selectivity and sensitivity by using molecular biology or other modern techniques are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
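A common toxicity readout in MFC-based sensing is the inhibition of the cell's electrical output after exposure; a minimal illustration with made-up numbers (not data from the review) is:

    # Hypothetical baseline and post-exposure currents from an MFC sensor (µA);
    # percent current inhibition is one common integrated toxicity readout.
    i_baseline = 412.0
    i_exposed = 287.0
    toxicity_index = 100.0 * (i_baseline - i_exposed) / i_baseline
    print(f"{toxicity_index:.1f}% inhibition")   # 30.3% inhibition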
Streaming Swarm of Nano Space Probes for Modern Analytical Methods Applied to Planetary Science
NASA Astrophysics Data System (ADS)
Vizi, P. G.; Horvath, A. F.; Berczi, Sz.
2017-11-01
Streaming swarms give the possibility of collecting data over large areas at one time. The whole streaming fleet can behave like one big organization and can be realized as a planetary mission solution with stream-type analytical methods.
A MASSive Laboratory Tour. An Interactive Mass Spectrometry Outreach Activity for Children
NASA Astrophysics Data System (ADS)
Jungmann, Julia H.; Mascini, Nadine E.; Kiss, Andras; Smith, Donald F.; Klinkert, Ivo; Eijkel, Gert B.; Duursma, Marc C.; Cillero Pastor, Berta; Chughtai, Kamila; Chughtai, Sanaullah; Heeren, Ron M. A.
2013-07-01
It is imperative to fascinate young children with the analytical sciences at an early stage in their education. Public exposure to mass spectrometry is presently increasing rapidly through the common media. Outreach activities can take advantage of this exposure and employ mass spectrometry as an exquisite example of an analytical science with which children can be fascinated. The presented teaching modules introduce children to mass spectrometry and give them the opportunity to experience a modern research laboratory. The modules are highly adaptable and can be applied to children from 6 to 14 y of age. In an interactive tour, the students explore three major scientific concepts related to mass spectrometry: the building blocks of matter, charged-particle manipulation by electrostatic fields, and analyte identification by mass analysis. The students also carry out a mass spectrometry experiment and learn to interpret the resulting mass spectra. The multistage, inquiry-based tour contains flexible methods which teach the students current-day research techniques and possible applications to real research topics. Besides the scientific concepts, laboratory safety and hygiene are stressed, and the students are enthused about the analytical sciences by participating in hands-on work. The presented modules have repeatedly been employed successfully during laboratory open days. They are also found to be extremely suitable for (early) high school science classes during field trips focused on laboratory visits.
THE IMPORTANCE OF PROPER INTENSITY CALIBRATION FOR RAMAN ANALYSIS OF LOW-LEVEL ANALYTES IN WATER
Modern dispersive Raman spectroscopy offers unique advantages for the analysis of low-concentration analytes in aqueous solution. However, we have found that proper intensity calibration is critical for obtaining these benefits. This is true not only for producing spectra with ...
Morokhovets, Halyna Yu; Lysanets, Yuliia V
The main objective of higher medical education is the continuous professional improvement of physicians to meet the needs dictated by the modern world, both at undergraduate and postgraduate levels. In this respect, the system of higher medical education has undergone certain changes - from determining the range of professional competences to the adoption of new standards of education in medicine. The article aims to analyze the parameters of a doctor's professionalism in the context of the competence-based approach and to develop practical recommendations for the improvement of instruction techniques. The authors reviewed psycho-pedagogical materials and summarized the experience of teachers at higher medical institutions in developing instruction techniques for the modern educational process. The study is based on the results of testing via the technique developed by T.I. Ilyina. Analytical and biblio-semantic methods were used in the paper. It was found that the training process at a medical educational institution should be focused on learning outcomes. The authors defined the quality parameters of doctors' training and suggested a model for developing the professional competence of medical students. This model explains the cause-and-effect relationships between the forms of instruction, teaching techniques, and specific components of professional competence in future doctors. The paper provides practical recommendations on developing the core competencies which a qualified doctor should master. An analysis of existing interactive media in Ukraine and abroad has been performed. It was found that teaching the core disciplines with the use of the latest technologies and interactive means keeps abreast of the times, while teaching social studies and humanities to medical students still involves certain difficulties.
Nanoflow Separation of Amino Acids for the Analysis of Cosmic Dust
NASA Technical Reports Server (NTRS)
Martin, M. P.; Glavin, D. P.; Dworkin, Jason P.
2008-01-01
The delivery of amino acids to the early Earth by interplanetary dust particles, comets, and carbonaceous meteorites could have been a significant source of the early Earth's prebiotic organic inventory. Amino acids are central to modern terrestrial biochemistry as major components of proteins and enzymes and were probably vital in the origin of life. A variety of amino acids have been detected in the CM carbonaceous meteorite Murchison, many of which are exceptionally rare in the terrestrial biosphere, including a-aminoisobutyric acid (AIB) and isovaline. AIB has also been detected in a small percentage of Antarctic micrometeorite grains believed to be related to the CM meteorites. We report on progress in optimizing a nanoflow liquid chromatography separation system with dual detection via laser-induced fluorescence and time-of-flight mass spectrometry (nLC-LIF/ToF-MS) for the analysis of o-phthaldialdehyde/N-acetyl-L-cysteine (OPA/NAC) labeled amino acids in cosmic dust grains. The very low flow rates (<3 micro-L/min) of nLC compared with analytical LC (>0.1 ml/min), combined with <2 micron column bead sizes, have the potential to produce efficient analyte ionization and chromatograms with very sharp peaks; both increase sensitivity. The combination of selectivity (only primary amines are derivatized), sensitivity (detection limits >4 orders of magnitude lower than those of traditional GC-MS techniques), and specificity (compound identities are determined by both retention time and exact mass) makes this a compelling technique. However, the development of an analytical method that achieves separation of compounds as structurally similar as amino acid monomers and produces the sharp peaks required for maximum sensitivity is challenging.
Multi-technique characterisation of commercial alizarin-based lakes
NASA Astrophysics Data System (ADS)
Pronti, Lucilla; Mazzitelli, Jean-Baptiste; Bracciale, Maria Paola; Massini Rosati, Lorenzo; Vieillescazes, Cathy; Santarelli, Maria Laura; Felici, Anna Candida
2018-07-01
The characterization of ancient and modern alizarin-based lakes is a widely studied topic in the literature. Analytical data on contemporary alizarin-based lakes, however, are still scarce, though of primary importance, since these lakes may indeed be present in contemporary and fake paintings as well as in retouchings. In this work we systematically investigate the chemical composition and the optical features of fifteen alizarin-based lakes by a multi-analytical approach combining spectroscopic methods (Energy Dispersive X-ray Fluorescence Spectroscopy, EDXRF; Attenuated Total Reflectance Fourier-Transform Infrared Spectroscopy, ATR-FTIR; X-ray Powder Diffraction, XRD; UV-induced fluorescence and reflectance spectroscopies) and chromatography (High-Performance Liquid Chromatography coupled with a Photodiode Array Detector, HPLC-PDA). Most of the samples contain compounds typical of natural madder roots, as occurring in ancient and modern lakes, but in two samples (23600-Kremer-Pigmente and alizarin crimson-Zecchi) no anthraquinonic structures were identified, leading to the hypothesis that synthetic dyes are present. The detection of lucidin primeveroside and ruberythric acid in some lakes suggests the use of Rubia tinctorum. One sample (23610-Kremer-Pigmente) presents alizarin as the sole compound, thereby revealing itself to be a synthetic dye. Moreover, gibbsite, alunite and kaolinite were found to have been used as substrates and/or mordants. Visible absorption spectra of the anthraquinonic lakes show two main absorption bands at about 494-511 nm and 537-564 nm, along with a shoulder at about 473-479 nm in the presence of high amounts of purpurin. Finally, the results obtained by UV-induced fluorescence spectroscopy show that, although the madder lake is commonly assumed to present an orange-pink fluorescence, the inorganic compounds added to the recipe can induce quenching or inhibition of the fluorescence, as occurs in some commercial alizarin-based lakes.
Next Generation Space Surveillance System-of-Systems
NASA Astrophysics Data System (ADS)
McShane, B.
2014-09-01
International economic and military dependence on space assets is pervasive and ever-growing in an environment that is now congested, contested, and competitive. There are a number of natural and man-made risks that need to be monitored and characterized to protect and preserve the space environment and the assets within it. Unfortunately, today's space surveillance network (SSN) has gaps in coverage, is not resilient, and has a growing number of objects that get lost. Risks can be efficiently and effectively mitigated, gaps closed, resiliency improved, and performance increased within a next generation space surveillance network implemented as a system-of-systems with modern information architectures and analytic techniques. This also includes consideration for the newest SSN sensors (e.g. Space Fence) which are born Net-Centric out-of-the-box and able to seamlessly interface with the JSpOC Mission System, global information grid, and future unanticipated users. Significant opportunity exists to integrate legacy, traditional, and non-traditional sensors into a larger space system-of-systems (including command and control centers) for multiple clients through low cost sustainment, modification, and modernization efforts. Clients include operations centers (e.g. JSpOC, USSTRATCOM, CANSPOC), Intelligence centers (e.g. NASIC), space surveillance sensor sites (e.g. AMOS, GEODSS), international governments (e.g. Germany, UK), space agencies (e.g. NASA), and academic institutions. Each has differing priorities, networks, data needs, timeliness, security, accuracy requirements and formats. Enabling processes and technologies include: Standardized and type accredited methods for secure connections to multiple networks, machine-to-machine interfaces for near real-time data sharing and tip-and-queue activities, common data models for analytical processing across multiple radar and optical sensor types, an efficient way to automatically translate between differing client and sensor formats, data warehouse of time based space events, secure collaboration tools for international coalition space operations, shared concept-of-operations, tactics, techniques, and procedures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keck, B D; Ognibene, T; Vogel, J S
2010-02-05
Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of ¹⁴C labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS are constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the ¹⁴C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the ¹⁴C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent, between 1 and 3 percent, while precision, expressed as coefficient of variation, was between 1 and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents); for a typical small molecule labeled at 10% incorporation with ¹⁴C, this corresponds to 30 fg equivalents. AMS provides a sensitive, accurate and precise method of measuring drug compounds in biological matrices.
Mechanical Study of a Modern Yo-Yo
ERIC Educational Resources Information Center
de Izarra, Charles
2011-01-01
This paper presents the study of a modern yo-yo having a centrifugal clutch that allows free rolling. First, the mechanical parts of the yo-yo are measured, allowing us to determine analytically its velocity as a function of its height of fall. Then, we are more particularly interested in the centrifugal device constituted by springs and small…
ERIC Educational Resources Information Center
Parkes, Jenny; Heslop, Jo; Januario, Francisco; Oando, Samwel; Sabaa, Susan
2016-01-01
This paper interrogates the influence of a tradition-modernity dichotomy on perspectives and practices on sexual violence and sexual relationships involving girls in three districts of Kenya, Ghana and Mozambique. Through deploying an analytical framework of positioning within multiple discursive sites, we argue that although the dichotomy…
Degradation of glass artifacts: application of modern surface analytical techniques.
Melcher, Michael; Wiesinger, Rita; Schreiner, Manfred
2010-06-15
A detailed understanding of the stability of glasses toward liquid or atmospheric attack is of considerable importance for preserving numerous objects of our cultural heritage. Glasses produced in the ancient periods (Egyptian, Greek, or Roman glasses), as well as modern glass, can be classified as soda-lime-silica glasses. In contrast, potash was used as a flux in medieval Northern Europe for the production of window panes for churches and cathedrals. The particular chemical composition of these potash-lime-silica glasses (low in silica and rich in alkali and alkaline earth components), in combination with increased levels of acidifying gases (such as SO(2), CO(2), NO(x), or O(3)) and airborne particulate matter in today's urban or industrial atmospheres, has resulted in severe degradation of important cultural relics, particularly over the last century. Rapid developments in the fields of microelectronics and computer sciences, however, have contributed to the development of a variety of nondestructive, surface analytical techniques for the scientific investigation and material characterization of these unique and valuable objects. These methods include scanning electron microscopy in combination with energy- or wavelength-dispersive spectrometry (SEM/EDX or SEM/WDX), secondary ion mass spectrometry (SIMS), and atomic force microscopy (AFM). In this Account, we address glass analysis and weathering mechanisms, exploring the possibilities (and limitations) of modern analytical techniques. Corrosion by liquid substances is well investigated in the glass literature. In a tremendous number of case studies, the basic reaction between aqueous solutions and the glass surfaces was identified as an ion-exchange reaction between hydrogen-bearing species of the attacking liquid and the alkali and alkaline earth ions in the glass, causing a depletion of the latter in the outermost surface layers. Although mechanistic analogies to liquid corrosion are obvious, atmospheric attack on glass ("weathering") is much more complex due to the multiphase system (atmosphere, water film, glass surface, and bulk glass) and added complexities (such as relative humidity and atmospheric pollutant concentration). Weathered medieval stained glass objects, as well as artifacts under controlled museum conditions, typically have less transparent or translucent surfaces, often with a thick weathering crust on top, consisting of sulfates of the glass constituents K, Ca, Na, or Mg. In this Account, we try to answer questions about glass analysis and weathering in three main categories. (i) Which chemical reactions are involved in the weathering of glass surfaces? (ii) Which internal factors (such as the glass composition or surface properties) play a dominant role for the weathering process? Can certain environmental or climatic factors be identified as more harmful for glasses than others? Is it possible to set up a quantitative relationship or at least an approximation between the degree of weathering and the factors described above? (iii) What are the consequences for the restoration and conservation strategies of endangered glass objects? How can a severe threat to precious glass objects be avoided, or at least minimized, to preserve these artifacts of our cultural heritage for future generations?
NASA Astrophysics Data System (ADS)
Chlebda, Damian K.; Majda, Alicja; Łojewski, Tomasz; Łojewska, Joanna
2016-11-01
Differentiation of the written text can be performed with a non-invasive and non-contact tool that connects conventional imaging methods with spectroscopy. Hyperspectral imaging (HSI) is a relatively new and rapid analytical technique that can be applied in forensic science disciplines. It allows an image of the sample to be acquired, with full spectral information within every pixel. For this paper, HSI and three statistical methods (hierarchical cluster analysis, principal component analysis, and spectral angle mapper) were used to distinguish between traces of modern black gel pen inks. Non-invasiveness and high efficiency are among the unquestionable advantages of ink differentiation using HSI. It is also less time-consuming than traditional methods such as chromatography. In this study, a set of 45 modern gel pen ink marks deposited on a paper sheet were registered. The spectral characteristics embodied in every pixel were extracted from an image and analysed using statistical methods, externally and directly on the hypercube. As a result, different black gel inks deposited on paper can be distinguished and classified into several groups, in a non-invasive manner.
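The spectral angle mapper step can be made concrete with a short sketch. The code below (not the authors' implementation; the toy image, band count, and reference spectra are invented) classifies every pixel spectrum of a hypercube against candidate ink spectra by minimum spectral angle:

```python
import numpy as np

def spectral_angle(pixel, reference):
    # Angle (radians) between two spectra; smaller means more similar shape.
    cos_theta = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def classify_cube(cube, references):
    # Assign each pixel of a (rows x cols x bands) hypercube to the
    # reference spectrum with the smallest spectral angle.
    h, w, bands = cube.shape
    flat = cube.reshape(-1, bands)
    angles = np.stack([
        np.array([spectral_angle(px, ref) for px in flat])
        for ref in references])
    return angles.argmin(axis=0).reshape(h, w)

# Toy example: a 4x4 'image' with 50 spectral bands and two candidate inks.
rng = np.random.default_rng(0)
references = [rng.random(50), rng.random(50)]
cube = rng.random((4, 4, 50))
labels = classify_cube(cube, references)
```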
NASA Astrophysics Data System (ADS)
Jafari, S.; Hojjati, M. H.
2011-12-01
Rotating disks mostly operate at high angular velocity, which produces large centrifugal forces and consequently induces large stresses and deformations. Minimizing the weight of such disks yields benefits such as lower dead weight and lower cost. This paper aims at finding an optimal disk thickness profile for minimum-weight design using simulated annealing (SA) and particle swarm optimization (PSO) as two modern optimization techniques. In the semi-analytical approach used here, the radial domain of the disk is divided into virtual sub-domains (rings), and the weight of each ring is minimized. The inequality constraint used in the optimization ensures that the maximum von Mises stress is always less than the yield strength of the disk material, so that the rotating disk does not fail. The results show that the minimum weights obtained by the two methods are almost identical; the PSO method gives a profile with slightly less weight (6.9% less than SA), while both PSO and SA are easy to implement and provide more flexibility than classical methods.
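To illustrate the optimization step, here is a minimal particle swarm sketch over ring thicknesses with a penalty enforcing the von Mises yield constraint. The weight and stress functions are deliberately crude placeholders, not the paper's semi-analytical disk model:

```python
import numpy as np

rng = np.random.default_rng(1)
n_rings, n_particles, iters = 10, 30, 200
lo, hi = 1.0, 20.0                      # admissible ring thicknesses (mm)

def weight(t):
    return np.sum(t)                    # placeholder: weight grows with thickness

def max_von_mises(t):
    return 400.0 / np.min(t)            # placeholder: thin rings -> high stress

def penalized(t, yield_strength=350.0):
    # Inequality constraint: peak von Mises stress must stay below yield.
    violation = max(0.0, max_von_mises(t) - yield_strength)
    return weight(t) + 1e3 * violation

x = rng.uniform(lo, hi, (n_particles, n_rings))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([penalized(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, n_rings))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = np.clip(x + v, lo, hi)
    f = np.array([penalized(p) for p in x])
    better = f < pbest_f
    pbest[better], pbest_f[better] = x[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()

print(gbest.round(2), penalized(gbest))
```

A simulated annealing variant would replace the swarm update with a single candidate perturbed at a decreasing temperature; both methods need only the penalized objective.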
Recent advances in inkjet dispensing technologies: applications in drug discovery.
Zhu, Xiangcheng; Zheng, Qiang; Yang, Hu; Cai, Jin; Huang, Lei; Duan, Yanwen; Xu, Zhinan; Cen, Peilin
2012-09-01
Inkjet dispensing technology is a promising fabrication methodology widely applied in drug discovery. Its automated, programmable character and high-throughput efficiency make this approach potentially very useful for miniaturizing design patterns for assays and drug screening. Various custom-made inkjet dispensing systems, as well as specialized bio-inks and substrates, have been developed and applied to fulfill the increasing demands of basic drug discovery studies. The incorporation of other modern technologies has further exploited the potential of inkjet dispensing technology in drug discovery and development. This paper reviews and discusses the recent developments and practical applications of inkjet dispensing technology in several areas of drug discovery and development, including fundamental assays of cells and proteins, microarrays, biosensors, tissue engineering, and basic biological and pharmaceutical studies. Progress in a number of areas of research, including biomaterials, inkjet mechanical systems and modern analytical techniques, as well as the exploration and accumulation of profound biological knowledge, has enabled different inkjet dispensing technologies to be developed and adapted for high-throughput pattern fabrication and miniaturization. This in turn presents a great opportunity to propel inkjet dispensing technology into drug discovery.
Current role of modern radiotherapy techniques in the management of breast cancer
Ozyigit, Gokhan; Gultekin, Melis
2014-01-01
Breast cancer is the most common type of malignancy in females. Advances in systemic therapies and radiotherapy (RT) have provided long survival in breast cancer patients. RT has a major role in the management of breast cancer. During the past 15 years, several developments have taken place in the fields of imaging and irradiation techniques, intensity-modulated RT, hypofractionation and partial-breast irradiation. Currently, improvements in RT technology allow a decrease in treatment-related complications such as fibrosis and long-term cardiac toxicity, while improving loco-regional control rates and cosmetic results. Thus, it is crucial that modern radiotherapy techniques be carried out with maximum care and efficiency. Several randomized trials have provided evidence for the feasibility of modern radiotherapy techniques in the management of breast cancer; however, their role will continue to be defined by the mature results of these trials. This review provides up-to-date, evidence-based data on the role of modern radiotherapy techniques in the management of breast cancer. PMID:25114857
NASA Astrophysics Data System (ADS)
Robotham, A. S. G.; Howlett, Cullan
2018-06-01
In this short note we publish the analytic quantile function for the Navarro, Frenk & White (NFW) profile. All known published and coded methods for sampling from the 3D NFW PDF use either accept-reject, or numeric interpolation (sometimes via a lookup table) for projecting random Uniform samples through the quantile distribution function to produce samples of the radius. This is a common requirement in N-body initial condition (IC), halo occupation distribution (HOD), and semi-analytic modelling (SAM) work for correctly assigning particles or galaxies to positions given an assumed concentration for the NFW profile. Using this analytic description allows for much faster and cleaner code to solve a common numeric problem in modern astronomy. We release R and Python versions of simple code that achieves this sampling, which we note is trivial to reproduce in any modern programming language.
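The released code is not reproduced here, but the closed form follows from a standard derivation. With x = r/r_s and mu(x) = ln(1+x) - x/(1+x), the enclosed-mass fraction inside concentration c is p = mu(x)/mu(c); solving mu(x) = p mu(c) gives x in terms of the principal branch of the Lambert W function. A minimal sketch of that derivation (notation may differ cosmetically from the published form):

```python
import numpy as np
from scipy.special import lambertw

def mu(x):
    # Dimensionless NFW enclosed mass: ln(1+x) - x/(1+x)
    return np.log1p(x) - x / (1.0 + x)

def nfw_quantile(p, c):
    # Radius (in units of the scale radius r_s) enclosing mass fraction p
    # of an NFW halo of concentration c, via the Lambert W function.
    q = p * mu(c)
    u = -np.real(lambertw(-np.exp(-q - 1.0), k=0))  # u = 1/(1+x)
    return 1.0 / u - 1.0

# Project uniform samples through the quantile function to draw radii.
rng = np.random.default_rng(42)
radii = nfw_quantile(rng.uniform(size=100_000), c=10.0)
```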
Adhesion, friction, wear, and lubrication research by modern surface science techniques.
NASA Technical Reports Server (NTRS)
Keller, D. V., Jr.
1972-01-01
The field of surface science has undergone intense revitalization with the introduction of low-energy electron diffraction, Auger electron spectroscopy, ellipsometry, and other surface analytical techniques that have been refined within the last decade. These developments have permitted submonolayer and monolayer structure analysis as well as chemical identification and quantitative analysis. The application of a number of these techniques to the solution of problems in the fields of friction, lubrication, and wear is examined in detail for the particular case of iron, and more generally to illustrate how the accumulation of pure data will contribute toward the establishment of the physicochemical concepts required to understand the mechanisms operating in friction systems. In the case of iron, LEED, Auger, and microcontact studies have established that hydrogen and light saturated organic vapors do not establish interfaces that prevent iron from welding, whereas oxygen and some oxygen and sulfur compounds reduce both welding and the coefficient of friction. Interpretation of these data suggests a mechanism of sulfur interaction in lubricating systems.
Metabolomics and Integrative Omics for the Development of Thai Traditional Medicine
Khoomrung, Sakda; Wanichthanarak, Kwanjeera; Nookaew, Intawat; Thamsermsang, Onusa; Seubnooch, Patcharamon; Laohapand, Tawee; Akarasereenont, Pravit
2017-01-01
In recent years, interest in studies of traditional medicine in Asian and African countries has gradually increased due to its potential to complement modern medicine. In this review, we provide an overview of the current development of Thai traditional medicine (TTM) and of ongoing TTM research activities related to metabolomics. This review also focuses on three important elements of systems biology analysis of TTM: analytical techniques, statistical approaches, and bioinformatics tools for handling and analyzing untargeted metabolomics data. The main objective of this data analysis is to gain a comprehensive understanding of the system-wide effects that TTM has on individuals. Furthermore, potential applications of metabolomics and systems medicine in TTM are also discussed. PMID:28769804
Elucidation of Diels-Alder Reaction Network of 2,5-Dimethylfuran and Ethylene on HY Zeolite Catalyst
DOE Office of Scientific and Technical Information (OSTI.GOV)
Do, Phuong T. M.; McAtee, Jesse R.; Watson, Donald A.
2012-12-12
The reaction of 2,5-dimethylfuran and ethylene to produce p-xylene represents a potentially important route for the conversion of biomass to high-value organic chemicals. Current preparation methods suffer from low selectivity and produce a number of byproducts. Using modern separation and analytical techniques, the structures of many of the byproducts produced in this reaction when HY zeolite is employed as a catalyst have been identified. From these data, a detailed reaction network is proposed, demonstrating that hydrolysis and electrophilic alkylation reactions compete with the desired Diels–Alder/dehydration sequence. This information will allow the rational identification of more selective catalysts and more selective reaction conditions.
Explicit solution techniques for impact with contact constraints
NASA Technical Reports Server (NTRS)
Mccarty, Robert E.
1993-01-01
Modern military aircraft transparency systems, windshields and canopies, are complex systems which must meet a large and rapidly growing number of requirements. Many of these transparency system requirements are conflicting, presenting difficult balances which must be achieved. One example of a challenging requirements balance or trade is shaping for stealth versus aircrew vision. The large number of requirements involved may be grouped in a variety of areas including man-machine interface; structural integration with the airframe; combat hazards; environmental exposures; and supportability. Some individual requirements by themselves pose very difficult, severely nonlinear analysis problems. One such complex problem is that associated with the dynamic structural response resulting from high energy bird impact. An improved analytical capability for soft-body impact simulation was developed.
Coffee-ring effects in laser desorption/ionization mass spectrometry.
Hu, Jie-Bi; Chen, Yu-Chie; Urban, Pawel L
2013-03-05
This report focuses on the heterogeneous distribution of small molecules (e.g. metabolites) within dry deposits of suspensions and solutions of inorganic and organic compounds with implications for chemical analysis of small molecules by laser desorption/ionization (LDI) mass spectrometry (MS). Taking advantage of the imaging capabilities of a modern mass spectrometer, we have investigated the occurrence of "coffee rings" in matrix-assisted laser desorption/ionization (MALDI) and surface-assisted laser desorption/ionization (SALDI) sample spots. It is seen that the "coffee-ring effect" in MALDI/SALDI samples can be both beneficial and disadvantageous. For example, formation of the coffee rings gives rise to heterogeneous distribution of analytes and matrices, thus compromising analytical performance and reproducibility of the mass spectrometric analysis. On the other hand, the coffee-ring effect can also be advantageous because it enables partial separation of analytes from some of the interfering molecules present in the sample. We report a "hidden coffee-ring effect" where under certain conditions the sample/matrix deposit appears relatively homogeneous when inspected by optical microscopy. Even in such cases, hidden coffee rings can still be found by implementing the MALDI-MS imaging technique. We have also found that to some extent, the coffee-ring effect can be suppressed during SALDI sample preparation. Copyright © 2013 Elsevier B.V. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-06
... Analytic Methods and Sampling Procedures for the United States National Residue Program for Meat, Poultry... implementing several multi-residue methods for analyzing samples of meat, poultry, and egg products for animal.... These modern, high-efficiency methods will conserve resources and provide useful and reliable results...
Random Forest as a Predictive Analytics Alternative to Regression in Institutional Research
ERIC Educational Resources Information Center
He, Lingjun; Levine, Richard A.; Fan, Juanjuan; Beemer, Joshua; Stronach, Jeanne
2018-01-01
In institutional research, modern data mining approaches are seldom considered to address predictive analytics problems. The goal of this paper is to highlight the advantages of tree-based machine learning algorithms over classic (logistic) regression methods for data-informed decision making in higher education problems, and stress the success of…
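As a hedged illustration of the comparison the paper advocates, the sketch below cross-validates a random forest against logistic regression on synthetic data standing in for institutional records (the dataset and hyperparameters are invented):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for, e.g., a student-retention prediction task.
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=8, random_state=0)

models = [("logistic regression", LogisticRegression(max_iter=1000)),
          ("random forest", RandomForestClassifier(n_estimators=300,
                                                   random_state=0))]
for name, model in models:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```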
Data Acquisition Programming (LabVIEW): An Aid to Teaching Instrumental Analytical Chemistry.
ERIC Educational Resources Information Center
Gostowski, Rudy
A course was developed at Austin Peay State University (Tennessee) which offered an opportunity for hands-on experience with the essential components of modern analytical instruments. The course aimed to provide college students with the skills necessary to construct a simple model instrument, including the design and fabrication of electronic…
Benazzi, Stefano
2012-01-01
The discovery of new human fossil remains is one of the most obvious ways to improve our understanding of the dynamics of human evolution. The reanalysis of existing fossils using newer methods is also crucial, and may lead to a reconsideration of the biological and taxonomical status of some specimens and improve our understanding of highly debated periods in human prehistory. This is particularly true for those remains that have previously been studied using traditional approaches, with only morphological descriptions and standard calliper measurements available. My own interest in the Uluzzian and its associated human remains grew from my interest in applying recently developed analytical techniques to quantify morphological variation. Discovered more than 40 years ago, the two deciduous molars from Grotta del Cavallo (Apulia, Italy) are the only human remains associated with the Uluzzian culture (one of the three main European "transitional" cultures). These teeth were previously attributed to Neanderthals. This attribution contributed to a consensus view that the Uluzzian, with its associated ornament and tool complexes, was produced by Neanderthals. A reassessment of these deciduous teeth by means of digital morphometric analysis revealed that they belong to anatomically modern humans (AMHs). This finding contradicts previous assumptions and suggests that modern humans, and not Neanderthals, created the Uluzzian culture. Of equal importance, new chronometric analyses date these dental remains to 43,000-45,000 cal BP. Thus, the teeth from Grotta del Cavallo represent the oldest European AMH remains currently known.
The University of Arizona program in solid propellants
NASA Technical Reports Server (NTRS)
Ramohalli, Kumar
1989-01-01
The University of Arizona program is aimed at introducing scientific rigor to the predictability and quality assurance of composite solid propellants. Two separate approaches are followed: using modern analytical techniques to experimentally study carefully controlled propellant batches to discern trends in mixing, casting, and cure; and examining a vast bank of data containing fairly detailed information on ingredients, processing, and rocket firing results. The experimental and analytical work is described briefly. The principal findings were that: (1) pre- (dry) blending of the coarse and fine ammonium perchlorate can significantly improve the uniformity of mixing; (2) the Fourier-transformed IR spectra of the uncured and cured polymer carry valuable data on the state of the fuel; (3) there are considerable non-uniformities in the propellant slurry composition near solid surfaces (blades, walls) compared with the bulk slurry; and (4) in situ measurement of slurry viscosity continuously during mixing can give a good indication of the state of the slurry. Several important observations from the study of the data bank are discussed.
Aravind, S G; Arimboor, Ranjith; Rangan, Meena; Madhavan, Soumya N; Arumughan, C
2008-11-04
Application of modern scientific knowledge coupled with sensitive analytical techniques is important for the quality evaluation and standardization of polyherbal formulations. Semecarpus anacardium, an important medicinal plant with broad medicinal properties, is frequently used in a large number of traditional herbal preparations. Tetrahydroamentoflavone (THA), a major bioactive biflavonoid, was selected as a chemical marker of S. anacardium, and RP semi-preparative HPLC conditions were optimized for its isolation. An HPTLC method was developed for fingerprinting of S. anacardium flavonoids and quantification of tetrahydroamentoflavone. The method was validated in terms of linearity, LOD, LOQ, precision and accuracy, and compared with an RP-HPLC-DAD method. The methods were demonstrated for chemical fingerprinting of S. anacardium plant parts and some commercial polyherbal formulations, and the amount of tetrahydroamentoflavone was quantified. HPTLC analysis showed that S. anacardium seed contains approximately 10 g kg⁻¹ of tetrahydroamentoflavone. The methods were able to identify and quantify tetrahydroamentoflavone in complex mixtures of phytochemicals and could be extended to the marker-based standardization of polyherbal formulations containing S. anacardium.
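For readers unfamiliar with the validation figures mentioned (linearity, LOD, LOQ), the sketch below computes ICH-style detection and quantification limits from a linear calibration; the concentrations and peak areas are hypothetical, not the paper's data:

```python
import numpy as np

# Hypothetical calibration: THA amounts per band versus HPTLC peak areas.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0])        # micrograms per band
area = np.array([1020., 2110., 4030., 8150., 16280.])

slope, intercept = np.polyfit(conc, area, 1)       # linearity check
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                      # residual standard deviation

lod = 3.3 * sigma / slope                          # ICH detection limit
loq = 10.0 * sigma / slope                         # ICH quantification limit
print(f"LOD = {lod:.3f}, LOQ = {loq:.3f} (same units as conc)")
```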
Automated multi-radionuclide separation and analysis with combined detection capability
NASA Astrophysics Data System (ADS)
Plionis, Alexander Asterios
The radiological dispersal device (RDD) is a weapon of great concern to those agencies responsible for protecting the public in the modern age of terrorism. To respond effectively to an RDD event, these agencies need the capability to rapidly identify the radiological agents involved in the incident and to assess the uptake of each individual victim; because medical treatment for internal radiation poisoning is radionuclide-specific, each victim's radiological uptake must be both identified and quantified. This dissertation describes the development of automated analytical components that could be used to determine and quantify multiple radionuclides in human urine bioassays. This is accomplished through the use of extraction chromatography plumbed in-line with one of a variety of detection instruments. Flow scintillation analysis is used for 90Sr and 210Po determination, flow gamma analysis is used to assess 60Co and 137Cs, and inductively coupled plasma mass spectrometry is used to determine actinides. Detection limits for these analytes were determined for the appropriate technique and related to their implications for health physics.
Hopkins, F B; Gravett, M R; Self, A J; Wang, M; Chua, Hoe-Chee; Hoe-Chee, C; Lee, H S Nancy; Sim, N Lee Hoi; Jones, J T A; Timperley, C M; Riches, J R
2014-08-01
Detailed chemical analysis of solutions used to decontaminate chemical warfare agents can be used to support verification and forensic attribution. Decontamination solutions are amongst the most difficult matrices for chemical analysis because of their corrosive and potentially emulsion-based nature. Consequently, there are relatively few publications that report their detailed chemical analysis. This paper describes the application of modern analytical techniques to the analysis of decontamination solutions following decontamination of the chemical warfare agent O-ethyl S-2-diisopropylaminoethyl methylphosphonothiolate (VX). We confirm the formation of N,N-diisopropylformamide and N,N-diisopropylamine following decontamination of VX with a hypochlorite-based solution, whereas neither was detected in extracts of hydroxide-based decontamination solutions by nuclear magnetic resonance (NMR) spectroscopy or gas chromatography-mass spectrometry. We report the electron ionisation and chemical ionisation mass spectrometric details, retention indices, and NMR spectra of N,N-diisopropylformamide and N,N-diisopropylamine, as well as analytical methods suitable for their analysis and identification in solvent extracts and decontamination residues.
Lewen, Nancy
2011-06-25
The analysis of various elements, including metals and metalloids, in the pharmaceutical industry has seen increasing importance in the last 10-15 years, as modern analytical instrumentation has afforded analysts the opportunity to provide element-specific, accurate and meaningful information related to pharmaceutical products. Armed with toxicological data, compendial and regulatory agencies have revisited traditional approaches to the testing of pharmaceuticals for metals and metalloids, and analysts have begun to employ the techniques of atomic spectroscopy, such as flame and graphite furnace atomic absorption spectroscopy (FAAS or Flame AA, and GFAAS), inductively coupled plasma-atomic emission spectroscopy (ICP-AES) and inductively coupled plasma-mass spectrometry (ICP-MS), to meet their analytical needs. Newer techniques, such as laser-induced breakdown spectroscopy (LIBS) and laser ablation ICP-MS (LA-ICP-MS), are also beginning to see wider application in the analysis of elements in the pharmaceutical industry. This article provides a perspective on the various applications of atomic spectroscopy to the analysis of metals and metalloids in drug products, active pharmaceutical ingredients (APIs), raw materials and intermediates. The application of atomic spectroscopy to the analysis of metals and metalloids in clinical, nutraceutical, metabolism and pharmacokinetic samples is not addressed in this work. Copyright © 2010 Elsevier B.V. All rights reserved.
Evidence for Extended Aqueous Alteration in CR Carbonaceous Chondrites
NASA Technical Reports Server (NTRS)
Trigo-Rodriquez, J. M.; Moyano-Cambero, C. E.; Mestres, N.; Fraxedas, J.; Zolensky, M.; Nakamura, T.; Martins, Z.
2013-01-01
We are currently studying the chemical interrelationships between the main rock-forming components of carbonaceous chondrites (hereafter CC), e.g., silicate chondrules, refractory inclusions, and metal grains, and the surrounding meteorite matrices. The fine-grained materials that form CC matrices are thought to represent samples of relatively unprocessed protoplanetary disk materials [1-3]. In fact, modern non-destructive analytical techniques have shown that CC matrices host a large diversity of stellar grains from many distinguishable stellar sources [4]. Aqueous alteration has played a role in homogenizing the isotopic content that allows the identification of presolar grains [5]. On the other hand, detailed analytical techniques have found that the aqueously altered CR, CM and CI chondrite groups contain matrices in which the organic matter has experienced significant processing concomitant with the formation of clays and other minerals. In this sense, clays have been found to be directly associated with complex organics [6, 7]. CR chondrites are particularly relevant in this context because this group contains abundant metal grains in the interstitial matrix and inside glassy silicate chondrules. This is important because CR chondrites are known to exhibit a large complexity of organic compounds [8-10], and only metallic Fe is considered essential in Fischer-Tropsch catalysis of organics [11-13]. Therefore, CR chondrites can be considered primitive materials capable of providing clues on the role played by aqueous alteration in the chemical evolution of their parent asteroids.
Kamala C T; Balaram V; Dharmendra V; Satyanarayanan M; Subramanyam K S V; Krishnaiah A
2014-11-01
The recently introduced microwave plasma-atomic emission spectroscopy (MP-AES) represents yet another very important addition to the existing array of modern instrumental analytical techniques. In this study, an attempt is made to summarize the performance characteristics of MP-AES and its potential as an analytical tool for environmental studies, with practical examples from the Patancheru and Uppal industrial sectors of Hyderabad city. A range of soil, sediment and water reference materials, particulate matter, and real-life samples were chosen to evaluate the performance of this new analytical technique. Analytical wavelengths were selected considering the interference effects of other concomitant elements present in the different sample solutions. The detection limits for several elements were found to be in the range of 0.05 to 5 ng/g. The trace metals analyzed in both sectors followed the topography, with more pollution at the low-lying sites. Metal contents were higher in ground waters than in surface waters. For the past decade, pollutants have been transferred from the Patancheru industrial area to the Musi River. After polluting Nakkavagu, turning huge tracts of agricultural land barren, and making people residing along the rivulet impotent and sick, industrialists of Patancheru are shifting the effluents downstream of the Musi River through an 18-km pipeline from Patancheru. Since the effluent undergoes primary treatment at the Common Effluent Treatment Plant (CETP) at Patancheru, travels through the pipeline, and mixes with sewage, the organic effluents are diluted; however, inorganic pollutants such as heavy and toxic metals tend to accumulate in the environmental compartments near and downstream of the Musi River. The MP-AES data on toxic metals such as Zn, Cu, and Cr in ground and surface waters can only be attributed to pollution from Patancheru, since no other sources discharge into the Musi River.
Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr
2016-03-20
Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, a lot of papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, in order to fully understand the behavior of these chemicals in the environment, there are still numerous methodological challenges to be overcome. The aim of this paper therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Xiong, Zhenjie; Xie, Anguo; Sun, Da-Wen; Zeng, Xin-An; Liu, Dan
2015-01-01
Currently, food safety and quality are issues of great public concern. In order to satisfy the demands of consumers and attain superior food quality, non-destructive and fast methods are required for quality evaluation. As one of these methods, the hyperspectral imaging (HSI) technique has emerged as a smart and promising analytical tool for quality evaluation and has attracted much interest in non-destructive analysis of different food products. With the main advantage of combining both spectroscopy and imaging, HSI shows convincing potential for objective detection and evaluation of chicken meat quality. Moreover, developing a quality evaluation system based on HSI technology would bring economic benefits to the chicken meat industry. Therefore, in recent years, many studies have been conducted on using HSI technology for the safety and quality detection and evaluation of chicken meat. The aim of this review is thus to give a detailed overview of HSI and to focus on recently developed methods employing HSI technology for microbiological spoilage detection and quality classification of chicken meat. Moreover, the usefulness of the HSI technique for detecting fecal contamination and bone fragments in chicken carcasses is presented. Finally, some viewpoints on future research and applicability in the modern poultry industry are proposed.
Analytical techniques for steroid estrogens in water samples - A review.
Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza
2016-12-01
In recent years, environmental concerns over ultra-trace levels of steroid estrogens in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its lower detection limit, simplicity, rapidity and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
The flotation and adsorption of mixed collectors on oxide and silicate minerals.
Xu, Longhua; Tian, Jia; Wu, Houqin; Lu, Zhongyuan; Sun, Wei; Hu, Yuehua
2017-12-01
The analysis of flotation and adsorption of mixed collectors on oxide and silicate minerals is of great importance for both industrial applications and theoretical research. Over the past few years, significant progress has been achieved in understanding the adsorption of single collectors in micelles as well as at interfaces. By contrast, the self-assembly of mixed collectors at liquid/air and solid/liquid interfaces remains a developing area as a result of the complexity of the mixed systems involved and the limited availability of suitable analytical techniques. In this work, we systematically review the processes involved in the adsorption of mixed collectors onto micelles and at interfaces by examining four specific points, namely, theoretical background, factors that affect adsorption, analytical techniques, and self-assembly of mixed surfactants at the mineral/liquid interface. In the first part, the theoretical background of collector mixtures is introduced, together with several core solution theories, which are classified according to their application in the analysis of physicochemical properties of mixed collector systems. In the second part, we discuss the factors that can influence adsorption, including factors related to the structure of collectors and environmental conditions. We summarize their influence on the adsorption of mixed systems, with the objective of providing guidance on the progress achieved in this field to date. Advances in measurement techniques can greatly promote our understanding of adsorption processes. In the third part, therefore, modern techniques such as optical reflectometry, neutron scattering, neutron reflectometry, thermogravimetric analysis, fluorescence spectroscopy, ultrafiltration, atomic force microscopy, analytical ultracentrifugation, X-ray photoelectron spectroscopy, vibrational sum frequency generation spectroscopy and molecular dynamics simulations are introduced with reference to their applications. Finally, focusing on oxide and silicate minerals, we review and summarize the flotation and adsorption of the three most widely used mixed surfactant systems (anionic-cationic, anionic-nonionic, and cationic-nonionic) at the liquid/mineral interface in order to fully understand the self-assembly process. In the end, the paper gives a brief outlook on possible future developments in mixed surfactant systems. Copyright © 2017 Elsevier B.V. All rights reserved.
Response Surface Methods For Spatially-Resolved Optical Measurement Techniques
NASA Technical Reports Server (NTRS)
Danehy, P. M.; Dorrington, A. A.; Cutler, A. D.; DeLoach, R.
2003-01-01
Response surface methods (or methodology), RSM, have been applied to improve data quality for two vastly different spatially-resolved optical measurement techniques. In the first application, modern design of experiments (MDOE) methods, including RSM, are employed to map the temperature field in a direct-connect supersonic combustion test facility at NASA Langley Research Center. The laser-based measurement technique known as coherent anti-Stokes Raman spectroscopy (CARS) is used to measure temperature at various locations in the combustor. RSM is then used to develop temperature maps of the flow. Even though the temperature fluctuations at a single point in the flowfield have a standard deviation on the order of 300 K, RSM provides analytic fits to the data having 95% confidence interval half-width uncertainties in the fit as low as ±30 K. Methods of optimizing future CARS experiments are explored. The second application of RSM is to quantify the shape of a 5-meter diameter, ultra-lightweight, inflatable space antenna at NASA Langley Research Center. Photogrammetry is used to simultaneously measure the shape of the antenna at approximately 500 discrete spatial locations. RSM allows an analytic model to be developed that describes the shape of the majority of the antenna with an uncertainty of 0.4 mm, with 95% confidence. This model would allow a quantitative comparison between the actual shape of the antenna and the original design shape. Accurately determining this shape also allows confident interpolation between the measured points. Such a model could, for example, be used for ray tracing of radio-frequency waves up to 95 GHz to predict the performance of the antenna.
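To illustrate how RSM turns scattered, noisy point measurements into a low-uncertainty analytic fit, here is a minimal sketch that fits a full quadratic response surface by least squares to synthetic temperature data with roughly 300 K single-point scatter (the surface coefficients and data are assumptions, not the CARS dataset):

```python
import numpy as np

# Synthetic measurements: temperature T at scattered locations (x, y).
rng = np.random.default_rng(3)
x, y = rng.uniform(-1, 1, (2, 500))
T = (1800 + 250*x - 120*y + 90*x*y - 60*x**2 + 40*y**2
     + rng.normal(0, 300, x.size))    # ~300 K point-to-point fluctuation

# Full quadratic response surface:
# T ~ b0 + b1*x + b2*y + b3*x*y + b4*x^2 + b5*y^2
A = np.column_stack([np.ones_like(x), x, y, x*y, x**2, y**2])
coef, *_ = np.linalg.lstsq(A, T, rcond=None)
T_fit = A @ coef                      # smooth map; averaging over many points
                                      # shrinks the fit uncertainty well below 300 K
```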
Recommendations for fluorescence instrument qualification: the new ASTM Standard Guide.
DeRose, Paul C; Resch-Genger, Ute
2010-03-01
Aimed at improving quality assurance and quantitation for modern fluorescence techniques, ASTM International (ASTM) is about to release a Standard Guide for Fluorescence, reviewed here. The guide's main focus is on steady state fluorometry, for which available standards and instrument characterization procedures are discussed along with their purpose, suitability, and general instructions for use. These include the most relevant instrument properties needing qualification, such as linearity and spectral responsivity of the detection system, spectral irradiance reaching the sample, wavelength accuracy, sensitivity or limit of detection for an analyte, and day-to-day performance verification. With proper consideration of method-inherent requirements and limitations, many of these procedures and standards can be adapted to other fluorescence techniques. In addition, procedures for the determination of other relevant fluorometric quantities including fluorescence quantum yields and fluorescence lifetimes are briefly introduced. The guide is a clear and concise reference geared for users of fluorescence instrumentation at all levels of experience and is intended to aid in the ongoing standardization of fluorescence measurements.
Fluorescence analysis of ubiquinone and its application in quality control of medical supplies
NASA Astrophysics Data System (ADS)
Timofeeva, Elvira O.; Gorbunova, Elena V.; Chertov, Aleksandr N.
2017-02-01
The presence of antioxidant problems such as redox potential imbalance in the human body is an important question for modern clinical diagnostics. Implementing fluorescence analysis in the optical diagnostics of ubiquinone, an antioxidant widely distributed in the human body, is one step toward developing a device for clinical diagnostics of redox potential. Fluorescence was recorded with a spectrometer using a narrow-band UV irradiation source (maxima at 287 and 330 nm) as the excitation radiation. Ubiquinone concentrations from 0.25 to 2.5 mmol/l in the explored samples were used for the investigation. The recorded data were processed using correlation analysis and a differential analytical technique. The fourth derivative of the fluorescence spectrum provided the basis for a multicomponent analysis of the solutions. As a clinical diagnostic technique, fluorescence analysis with a processing method that includes differential spectrophotometry is a step toward redox potential calculation and toward quality control in pharmacy for better health care.
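As a sketch of the differential technique described, the code below computes a smoothed fourth-derivative spectrum with a Savitzky-Golay filter, one common implementation choice; the synthetic spectrum, window length, and polynomial order are illustrative assumptions:

```python
import numpy as np
from scipy.signal import savgol_filter

# Hypothetical fluorescence spectrum sampled at 1 nm steps: two
# overlapping bands that merge into one broad peak in the raw data.
wavelength = np.arange(350.0, 600.0, 1.0)
spectrum = (np.exp(-((wavelength - 440.0) / 18.0) ** 2)
            + 0.6 * np.exp(-((wavelength - 470.0) / 15.0) ** 2))

# Smoothed fourth derivative sharpens and separates the two components.
d4 = savgol_filter(spectrum, window_length=21, polyorder=6, deriv=4)
```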
Zhuo, Rongjie; Liu, Hao; Liu, Ningning; Wang, Yi
2016-11-11
Identification of active compounds from natural products is a critical and challenging task in drug discovery pipelines. Besides commonly used bio-guided screening approaches, affinity selection strategies coupled with liquid chromatography or mass spectrometry, known as ligand fishing, have been gaining increasing interest from researchers. In this review, we summarize this emerging strategy and categorize the methods as off-line or on-line according to their features. The separation principles of ligand fishing are introduced for the distinct analytical techniques involved, including biochromatography, capillary electrophoresis, ultrafiltration, equilibrium dialysis, microdialysis, and magnetic beads. The applications of ligand fishing approaches in the discovery of lead compounds are reviewed. Most ligand fishing methods offer specificity and high efficiency and require little sample pretreatment, which makes them especially suitable for screening active compounds from complex mixtures of natural products. We also summarize the applications of ligand fishing in the modernization of Traditional Chinese Medicine (TCM) and propose some perspectives on this remarkable technique.
A Geographically Explicit Genetic Model of Worldwide Human-Settlement History
Liu, Hua; Prugnolle, Franck; Manica, Andrea; Balloux, François
2006-01-01
Currently available genetic and archaeological evidence is generally interpreted as supportive of a recent single origin of modern humans in East Africa. However, this is where the near consensus on human settlement history ends, and considerable uncertainty clouds any more detailed aspect of human colonization history. Here, we present a dynamic genetic model of human settlement history coupled with explicit geographical distances from East Africa, the likely origin of modern humans. We search for the best-supported parameter space by fitting our analytical prediction to genetic data that are based on 52 human populations analyzed at 783 autosomal microsatellite markers. This framework allows us to jointly estimate the key parameters of the expansion of modern humans. Our best estimates suggest an initial expansion of modern humans ∼56,000 years ago from a small founding population of ∼1,000 effective individuals. Our model further points to high growth rates in newly colonized habitats. The general fit of the model with the data is excellent. This suggests that coupling analytical genetic models with explicit demography and geography provides a powerful tool for making inferences on human-settlement history. PMID:16826514
Innovative Teaching Practice: Traditional and Alternative Methods (Challenges and Implications)
ERIC Educational Resources Information Center
Nurutdinova, Aida R.; Perchatkina, Veronika G.; Zinatullina, Liliya M.; Zubkova, Guzel I.; Galeeva, Farida T.
2016-01-01
The relevance of the present issue is caused by the strong need for alternative methods of learning a foreign language and for language training and retraining of modern professionals. The aim of the article is to identify the basic techniques and skills involved in using various modern techniques in the context of modern educational tasks. The…
Dubé, Laurette; Labban, Alice; Moubarac, Jean-Claude; Heslop, Gabriela; Ma, Yu; Paquet, Catherine
2014-12-01
Building greater reciprocity between traditional and modern food systems and better convergence of human and economic development outcomes may enable the production and consumption of accessible, affordable, and appealing nutritious food for all. Information being key to such transformations, this roadmap paper offers a strategy that capitalizes on Big Data and advanced analytics, setting the foundation for an integrative intersectoral knowledge platform to better inform and monitor behavioral change and ecosystem transformation. Building upon the four P's of marketing (product, price, promotion, placement), we examine digital commercial marketing data through the lenses of the four A's of food security (availability, accessibility, affordability, appeal) using advanced consumer choice analytics for archetypal traditional (fresh fruits and vegetables) and modern (soft drinks) product categories. We demonstrate that business practices typically associated with the latter also have an important, if not more important, impact on purchases of the former category. Implications and limitations of the approach are discussed. © 2014 New York Academy of Sciences.
The rise of environmental analytical chemistry as an interdisciplinary activity.
Brown, Richard
2009-07-01
Modern scientific endeavour is increasingly delivered within an interdisciplinary framework. Analytical environmental chemistry is a long-standing example of an interdisciplinary approach to scientific research where value is added by the close cooperation of different disciplines. This editorial piece discusses the rise of environmental analytical chemistry as an interdisciplinary activity and outlines the scope of the Analytical Chemistry and the Environmental Chemistry domains of TheScientificWorldJOURNAL (TSWJ), and the appropriateness of TSWJ's domain format in covering interdisciplinary research. All contributions of new data, methods, case studies, and instrumentation, or new interpretations and developments of existing data, case studies, methods, and instrumentation, relating to analytical and/or environmental chemistry, to the Analytical and Environmental Chemistry domains, are welcome and will be considered equally.
Certified reference materials and reference methods for nuclear safeguards and security.
Jakopič, R; Sturm, M; Kraiem, M; Richter, S; Aregbe, Y
2013-11-01
Confidence in the comparability and reliability of measurement results in nuclear material and environmental sample analysis is established via certified reference materials (CRMs), reference measurements, and inter-laboratory comparisons (ILCs). Increased needs for quality control tools in proliferation resistance and environmental sample analysis, the development of measurement capabilities over the years, and progress in modern analytical techniques are the main reasons for the development of new reference materials and reference methods for nuclear safeguards and security. The Institute for Reference Materials and Measurements (IRMM) prepares and certifies large quantities of the so-called "large-sized dried" (LSD) spikes for accurate measurement of the uranium and plutonium content in dissolved nuclear fuel solutions by isotope dilution mass spectrometry (IDMS), and also develops particle reference materials applied for the detection of nuclear signatures in environmental samples. IRMM is currently replacing some of its exhausted stocks of CRMs with new ones whose specifications are up-to-date and tailored to the demands of modern analytical techniques. Some of the existing materials will be re-measured to improve the uncertainties associated with their certified values, and to enable laboratories to reduce their combined measurement uncertainty. Safeguards involve quantitative verification by independent measurements that no nuclear material is diverted from its intended peaceful use. Safeguards authorities pay particular attention to plutonium and the uranium isotope (235)U, indicating the so-called 'enrichment', in nuclear material and in environmental samples. In addition to the verification of the major ratios, n((235)U)/n((238)U) and n((240)Pu)/n((239)Pu), the minor ratios of the less abundant uranium and plutonium isotopes contain valuable information about the origin and the 'history' of material used for commercial or possibly clandestine purposes, and have therefore reached a high level of attention for safeguards authorities. Furthermore, IRMM initiated and coordinated the development of a Modified Total Evaporation (MTE) technique for accurate abundance ratio measurements of the "minor" isotope-amount ratios of uranium and plutonium in nuclear material and, in combination with a multi-dynamic measurement technique and filament carburization, in environmental samples. Currently IRMM is engaged in a study on the development of plutonium reference materials for "age dating", i.e. determination of the time elapsed since the last separation of plutonium from its daughter nuclides. The decay of a radioactive parent isotope and the build-up of a corresponding amount of daughter nuclide serve as a chronometer to calculate the age of a nuclear material. There are no such certified reference materials available yet. Copyright © 2013 Elsevier Ltd. All rights reserved.
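To make the chronometer explicit: if a freshly separated parent nuclide with decay constant lambda grows a daughter amount D relative to the remaining parent amount P, then, neglecting decay of the daughter, D/P = exp(lambda*t) - 1, so t = ln(1 + D/P)/lambda. A minimal sketch with an illustrative ratio:

```python
import numpy as np

def separation_age(daughter_to_parent, half_life_years):
    # Time since the last chemical separation, from the measured
    # daughter/parent amount ratio (decay of the daughter neglected).
    lam = np.log(2.0) / half_life_years
    return np.log1p(daughter_to_parent) / lam

# Example chronometer: 241Am grown in from 241Pu decay
# (241Pu half-life ~14.3 a).
print(separation_age(0.5, 14.3))   # ~8.4 years since separation
```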
Elemental analyses of modern dust in southern Nevada and California
Reheis, M.C.; Budahn, J.R.; Lamothe, P.J.
1999-01-01
Selected samples of modern dust collected in marble traps at sites in southern Nevada and California (Reheis and Kihl, 1995; Reheis, 1997) have been analyzed for elemental composition using instrumental neutron activation analysis (INAA), inductively coupled plasma atomic emission spectroscopy (ICP-AES), and inductively coupled plasma mass spectroscopy (ICP-MS). For information on these analytical techniques and their levels of precision and accuracy, refer to Baedecker and McKown (1987) for INAA, to Briggs (1996) for ICP-AES, and to Briggs and Meier (1999) for ICP-MS. This report presents the elemental compositions obtained using these techniques on dust samples collected from 1991 through 1997. The dust-trap sites were established at varying times; some have been maintained since 1984, others since 1991. For details on site location, dust-trap construction, and collection techniques, see Reheis and Kihl (1995) and Reheis (1997). Briefly, the trap consists of a coated angel-food cake pan painted black on the outside and mounted on a post about 2 m above the ground. Glass marbles rest on a circular piece of galvanized hardware cloth (now replaced by stainless-steel mesh), which is fitted into the pan so that it rests 3-4 cm below the rim. The 2-m height eliminates most saltating sand-sized particles. The marbles simulate the effect of a gravelly fan surface and prevent dust that has filtered or washed into the bottom of the pan from being blown back out. The dust traps are fitted with two metal straps looped in an inverted basket shape; the top surfaces of the straps are coated with a sticky material that effectively discourages birds from roosting.
He, Xi-Ran; Li, Chun-Guang; Zhu, Xiao-Shu; Li, Yuan-Qing; Jarouche, Mariam; Bensoussan, Alan; Li, Ping-Ping
2017-01-01
There is a recognized challenge in analyzing traditional Chinese medicine formulas because of their complex chemical compositions. The application of modern analytical techniques such as high-performance liquid chromatography coupled with tandem mass spectrometry has significantly improved the characterization of compounds from traditional Chinese medicine formulas. This study conducts a bibliometric analysis to recognize the overall trend of high-performance liquid chromatography coupled with tandem mass spectrometry approaches in the analysis of traditional Chinese medicine formulas, its significance, and possible underlying interactions between individual herbs in these formulas. Electronic databases were searched systematically, and the identified studies were collected and analyzed using Microsoft Access 2010, GraphPad 5.0 software and the Ucinet software package. 338 publications between 1997 and 2015 were identified and analyzed in terms of annual growth and accumulated publications, top journals, forms of traditional Chinese medicine preparations, highly studied formulas and single herbs, and social network analysis of single herbs. There is a significant increasing trend in the use of high-performance liquid chromatography coupled with tandem mass spectrometry related techniques in the analysis of commonly used forms of traditional Chinese medicine formulas over the last 3 years. Stringent quality control is of great significance for the modernization and globalization of traditional Chinese medicine, and this bibliometric analysis provides the first comprehensive summary of this field. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Psycho-informatics: Big Data shaping modern psychometrics.
Markowetz, Alexander; Błaszkiewicz, Konrad; Montag, Christian; Switala, Christina; Schlaepfer, Thomas E
2014-04-01
For the first time in history, it is possible to study human behavior on a great scale and in fine detail simultaneously. Online services and ubiquitous computational devices, such as smartphones and modern cars, record our everyday activity. The resulting Big Data offers unprecedented opportunities for tracking and analyzing behavior. This paper hypothesizes the applicability and impact of Big Data technologies in the context of psychometrics, both for research and clinical applications. It first outlines the state of the art, including the severe shortcomings with respect to quality and quantity of the resulting data. It then presents a technological vision comprised of (i) numerous data sources such as mobile devices and sensors, (ii) a central data store, and (iii) an analytical platform employing techniques from data mining and machine learning. To further illustrate the dramatic benefits of the proposed methodologies, the paper then outlines two current projects, logging and analyzing smartphone usage. One such study attempts thereby to quantify the severity of major depression dynamically; the other investigates (mobile) Internet Addiction. Finally, the paper addresses some of the ethical issues inherent to Big Data technologies. In summary, the proposed approach is about to induce the single biggest methodological shift since the beginning of psychology or psychiatry. The resulting range of applications will dramatically shape the daily routines of researchers and medical practitioners alike. Indeed, transferring techniques from computer science to psychiatry and psychology is about to establish Psycho-Informatics, an entire research direction of its own. Copyright © 2013 Elsevier Ltd. All rights reserved.
Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gougar, Hans
2015-02-01
The Department of Energy (DOE) has made significant progress developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the 'Center') will be a resource for industry, DOE programs, and academia in their validation efforts.
On the pursuit of a nuclear development capability: The case of the Cuban nuclear program
NASA Astrophysics Data System (ADS)
Benjamin-Alvarado, Jonathan Calvert
1998-09-01
While there have been many excellent descriptive accounts of modernization schemes in developing states, energy development studies based on prevalent modernization theory have been rare. Moreover, heretofore there have been very few analyses of efforts by developing states to develop a nuclear energy capability, and rarely have these analyses employed social science research methodologies. The purpose of this study was to develop a general analytical framework, based on such a methodology, to analyze nuclear energy development, and to apply this framework to the specific case of Cuba's decision to develop nuclear energy. The analytical framework focuses on a qualitative tracing of the process of Cuban policy objectives and their implementation in developing a nuclear energy capability, and analyzes the policy against three models of modernization offered to explain the trajectory of policy development. These approaches are the politically motivated modernization model, the economic and technological modernization model, and the economic and energy security model. Each model provides distinct and functionally differentiated expectations for the path of development toward this objective, and each predicts specific policy responses to external stimuli. In the study, Cuba's nuclear policy responses to stimuli from domestic constraints and intensities, institutional development, and external influences are analyzed. The analysis revealed that in pursuing the nuclear energy capability, Cuba primarily responded by filtering most of the stimuli through the twin objectives of economic rationality and technological advancement. Based upon the Cuban policy responses to the domestic and international stimuli, the study concluded that the economic and technological modernization model of nuclear energy development offered a more complete explanation of the trajectory of policy development than either the politically motivated or the economic and energy security models. The findings of this case pose some interesting questions for the general study of energy programs in developing states. By applying the analytical framework employed in this study to a number of other cases, the understanding of energy development schemes may be expanded through future research.
Expanding the Security Dimension of Surety
DOE Office of Scientific and Technical Information (OSTI.GOV)
SENGLAUB, MICHAEL E.
1999-10-01
A small effort was conducted at Sandia National Laboratories to explore the use of a number of modern analytic technologies in the assessment of terrorist actions and the prediction of trends. This work focuses on Bayesian networks as a means of capturing correlations between groups, tactics, and targets. The data used to test the methodology were obtained with a special parsing algorithm, written in Java, that creates database records from information articles captured electronically. As a vulnerability assessment technique the approach proved very useful. The technology also proved to be a valuable development medium because of the ability to integrate blocks of information into a deployed network rather than waiting to fully deploy only after all relevant information has been assembled.
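The kind of inference such a network supports can be sketched with a toy three-node chain (group -> tactic -> target); the categories and probabilities below are illustrative assumptions, not values from the Sandia database:

    # Toy chain: group -> tactic -> target. All categories and probabilities
    # are invented for illustration.
    p_group = {"A": 0.6, "B": 0.4}
    p_tactic_given_group = {
        "A": {"bombing": 0.7, "kidnapping": 0.3},
        "B": {"bombing": 0.2, "kidnapping": 0.8},
    }
    p_target_given_tactic = {
        "bombing": {"infrastructure": 0.8, "personnel": 0.2},
        "kidnapping": {"infrastructure": 0.1, "personnel": 0.9},
    }

    def p_target(target):
        """Marginal P(target), enumerating the chain."""
        return sum(pg * pt * p_target_given_tactic[t].get(target, 0.0)
                   for g, pg in p_group.items()
                   for t, pt in p_tactic_given_group[g].items())

    def p_group_given_target(target):
        """Posterior P(group | target) via Bayes' rule: given an observed
        attack on a target class, rank the groups most likely responsible."""
        post = {g: pg * sum(pt * p_target_given_tactic[t].get(target, 0.0)
                            for t, pt in p_tactic_given_group[g].items())
                for g, pg in p_group.items()}
        z = sum(post.values())
        return {g: v / z for g, v in post.items()}

    print(p_target("personnel"))              # 0.55
    print(p_group_given_target("personnel"))  # {'A': ~0.447, 'B': ~0.553}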
Shortcuts to adiabaticity. Suppression of pair production in driven Dirac dynamics
Deffner, Sebastian
2015-12-21
Achieving effectively adiabatic dynamics in finite time is a ubiquitous goal in virtually all areas of modern physics. So-called shortcuts to adiabaticity refer to a set of methods and techniques that allow us to produce in a short time the same final state that would result from an adiabatic, infinitely slow process. In this paper we generalize one of these methods—the fast-forward technique—to driven Dirac dynamics. As our main result, we find that shortcuts to adiabaticity for the (1+1)-dimensional Dirac equation are facilitated by a combination of both scalar and pseudoscalar potentials. Our findings are illustrated for two analytically solvable examples, namely charged particles driven in spatially homogeneous and linear vector fields.
Large deviation analysis of a simple information engine
NASA Astrophysics Data System (ADS)
Maitland, Michael; Grosskinsky, Stefan; Harris, Rosemary J.
2015-11-01
Information thermodynamics provides a framework for studying the effect of feedback loops on entropy production. It has enabled the understanding of novel thermodynamic systems such as the information engine, which can be seen as a modern version of "Maxwell's Dæmon," whereby a feedback controller processes information gained by measurements in order to extract work. Here, we analyze a simple model of such an engine that uses feedback control based on measurements to obtain negative entropy production. We focus on the distribution and fluctuations of the information obtained by the feedback controller. Significantly, our model allows an analytic treatment for a two-state system with exact calculation of the large deviation rate function. These results suggest an approximate technique for larger systems, which is corroborated by simulation data.
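For a generic two-state Markov model (not the paper's specific feedback engine), the large deviation rate function of a jump-counting observable can be computed exactly by tilting the transition matrix and taking a Legendre-Fenchel transform; the sketch below, with hypothetical transition probabilities, shows this standard machinery:

    import numpy as np

    # Transition matrix of a generic two-state Markov chain (hypothetical).
    P = np.array([[0.7, 0.3],
                  [0.4, 0.6]])

    def scgf(s):
        """Scaled cumulant generating function of the 0->1 jump count:
        log of the Perron eigenvalue of the tilted matrix in which the
        counted transition is weighted by exp(s)."""
        tilted = P.copy()
        tilted[0, 1] *= np.exp(s)
        return np.log(np.max(np.real(np.linalg.eigvals(tilted))))

    def rate(j, s_grid=np.linspace(-5.0, 5.0, 2001)):
        """Legendre-Fenchel transform I(j) = sup_s [s*j - scgf(s)]."""
        return max(s * j - scgf(s) for s in s_grid)

    # The rate function vanishes at the typical jump frequency (~0.171 here)
    # and rises for atypical fluctuations.
    for j in (0.05, 0.17, 0.30):
        print(f"I({j:.2f}) = {rate(j):.4f}")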
First centenary of Röntgen's discovery of X-rays
NASA Astrophysics Data System (ADS)
Valkovic, V.
1996-04-01
Usually it takes a decade or even several decades from a discovery to its practical applications. This was not the case with X-rays; they were widely applied in medical and industrial radiography within a year of their discovery in 1895 by W.C. Röntgen. Today, X-ray analysis covers a wide range of techniques and fields of application: from deduction of atomic arrangements by observation of diffraction phenomena to measurements of trace element concentration levels, distributions and maps by measuring fluorescence, X-ray attenuation or scattering. Although the contribution of analytical applications of X-rays to present knowledge is difficult to surpass, modern applications cover a wide range of activities, from three-dimensional microfabrication using synchrotron radiation to collecting information from deep space by X-ray astronomy.
Flexible use and technique extension of logistics management
NASA Astrophysics Data System (ADS)
Xiong, Furong
2011-10-01
As is well known, modern logistics originated in the United States, developed in Japan, matured in Europe, and expanded in China; this is the recognized historical development track of modern logistics. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deepwater port, China's modern logistics industry will develop rapidly and catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of China's modern logistics management techniques, which has practical and guiding significance.
Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel
2017-06-15
Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
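The item response theory machinery mentioned here can be sketched compactly; the code below implements the standard two-parameter logistic (2PL) model and the maximum-information item selection rule used in computerised adaptive testing, with a hypothetical item bank rather than the calibrated melodic-discrimination items:

    import numpy as np

    def p_correct(theta, a, b):
        """2PL item response function: P(correct | ability theta) for an
        item with discrimination a and difficulty b."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def item_information(theta, a, b):
        """Fisher information the item carries about theta."""
        p = p_correct(theta, a, b)
        return a ** 2 * p * (1.0 - p)

    # Hypothetical calibrated bank of (discrimination, difficulty) pairs.
    bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.5)]

    # Adaptive testing rule: administer the item that is most informative
    # at the current ability estimate.
    theta_hat = 0.3
    a, b = max(bank, key=lambda ab: item_information(theta_hat, *ab))
    print(f"next item: a={a}, b={b}, P(correct)={p_correct(theta_hat, a, b):.2f}")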
The role of analytical chemistry in Niger Delta petroleum exploration: a review.
Akinlua, Akinsehinwa
2012-06-12
Petroleum, and the organic matter from which it is derived, is composed of organic compounds together with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring these geochemical data. Owing to progress in the development of new analytical techniques, many hitherto unresolved petroleum exploration problems have been addressed. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.
Analytical techniques: A compilation
NASA Technical Reports Server (NTRS)
1975-01-01
A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.
Binder, Michaela; Roberts, Charlotte; Spencer, Neal; Antoine, Daniel; Cartwright, Caroline
2014-01-01
Cancer, one of the world’s leading causes of death today, remains almost absent in the archaeological record relative to other pathological conditions, giving rise to the conclusion that the disease is mainly a product of modern living and increased longevity. This paper presents a male, young-adult individual from the archaeological site of Amara West in northern Sudan (c. 1200 BC) displaying multiple, mainly osteolytic, lesions on the vertebrae, ribs, sternum, clavicles, scapulae, pelvis, and humeral and femoral heads. Following radiographic, microscopic and scanning electron microscopic (SEM) imaging of the lesions, and a consideration of differential diagnoses, a diagnosis of metastatic carcinoma secondary to an unknown soft tissue cancer is suggested. This represents the earliest complete example to date of a human who suffered metastatic cancer. The study further draws its strength from modern analytical techniques applied to differential diagnoses and from the fact that it is firmly rooted within a well-documented archaeological and historical context, thus providing new insights into the history and antiquity of the disease as well as its underlying causes and progression. PMID:24637948
Applications of Mass Spectrometry Imaging for Safety Evaluation.
Bonnel, David; Stauber, Jonathan
2017-01-01
Mass spectrometry imaging (MSI) was first derived from techniques used in physics, which were then incorporated into chemistry, followed by application in biology. Developed over 50 years ago, and employing different principles to detect and map compounds on a sample surface, MSI supports modern biology by detecting biological compounds within tissue sections. Trend analysis of MALDI (matrix-assisted laser desorption/ionization) imaging in this field shows an important increase in the number of publications since 2005, especially with the development of the MALDI imaging technique and its applications in biomarker discovery and drug distribution. With recent improvements in statistical tools, absolute and relative quantification protocols, as well as quality and reproducibility evaluations, MALDI imaging has become one of the most reliable MSI techniques to support the drug discovery and development phases. MSI makes it possible to address important questions in drug development such as "What is the localization of the drug and its metabolites in the tissues?", "What is the pharmacological effect of the drug in this particular region of interest?", or "Are the drug and its metabolites related to an atypical finding?" However, prior to addressing these questions using MSI techniques, expertise needs to be developed in histological procedures (tissue preparation with frozen or fixed tissues), analytical chemistry, matrix application, instrumentation, informatics, and mathematics for data analysis and interpretation.
Suvarapu, Lakshmi Narayana; Baek, Sung-Ok
2015-01-01
This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539
Surface analysis characterisation of gum binders used in modern watercolour paints
NASA Astrophysics Data System (ADS)
Sano, Naoko; Cumpson, Peter J.
2016-02-01
This study has demonstrated that not only SEM-EDX but also XPS can be an efficient tool for characterising watercolour paint surfaces. We find that surface effects are mediated by water. Once the powdered components in the watercolour come into contact with water, they dramatically transform their chemical structures at the surface and show the presence of pigment components randomly dispersed within the gum layer. Hence the topmost surface of the paint is confirmed as being composed of the gum binder components. This result is difficult to confirm using just one analytical technique (either XPS or SEM-EDX alone). In addition, peak fitting of C1s XPS spectra suggests that the gum binder in the commercial watercolour paints is probably gum arabic (by comparison with the reference materials). This identification is not conclusive, but the combination of XPS and SEM-EDX reveals the surface structure and the material distribution of the gum binder and the other ingredients of the watercolour paints. XPS combined with SEM-EDX may therefore prove a useful method in the study of surface structure not only for watercolour objects but also for other art objects, which may in future help in art conservation.
High Resolution Eddy-Current Wire Testing Based on a Gmr Sensor-Array
NASA Astrophysics Data System (ADS)
Kreutzbruck, Marc; Allweins, Kai; Strackbein, Chris; Bernau, Hendrick
2009-03-01
Increasing demands on materials quality and cost effectiveness have led to advanced standards in manufacturing technology. Especially when high quality standards must be met at high throughput, quantitative NDE techniques are vital to provide reliable and fast quality control systems. In this work we illuminate a modern electromagnetic NDE approach using a small GMR sensor array for testing superconducting wires. Four GMR sensors are positioned around the wire. Each GMR sensor provides a field sensitivity of 200 pT/√Hz and a spatial resolution of about 100 μm. This enables us to detect subsurface defects of 100 μm in size at a depth of 200 μm with a signal-to-noise ratio of better than 400. Surface defects could be detected with an SNR of up to 10,000. Besides this remarkable SNR, the small extent of GMR sensors results in a spatial resolution which offers new visualisation techniques for defect localisation, defect characterisation and tomography-like mapping techniques. We also report on inverse algorithms based on either a Finite Element Method or an analytical approach. These allow for accurate defect localisation on the μm scale and an estimation of the defect size.
ERIC Educational Resources Information Center
Hough, Susan L.; Hall, Bruce W.
The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…
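Although the record is truncated, the core contrast between the two approaches can be sketched in simplified form: a Glass-style unweighted mean of observed effect sizes versus a Hunter-Schmidt-style mean that weights by sample size and corrects for measurement unreliability. The study data below are hypothetical:

    import math

    # Hypothetical per-study results: observed correlation r, sample size n,
    # and reliabilities of the predictor (rxx) and criterion (ryy).
    studies = [
        {"r": 0.30, "n": 50,  "rxx": 0.80, "ryy": 0.70},
        {"r": 0.10, "n": 200, "rxx": 0.90, "ryy": 0.85},
        {"r": 0.45, "n": 30,  "rxx": 0.75, "ryy": 0.80},
    ]

    # Glass-style summary: simple unweighted mean of observed effect sizes.
    glass_mean = sum(s["r"] for s in studies) / len(studies)

    # Hunter-Schmidt-style summary: weight by sample size and correct each r
    # for attenuation due to measurement unreliability.
    hs_mean = (sum(s["n"] * s["r"] / math.sqrt(s["rxx"] * s["ryy"])
                   for s in studies)
               / sum(s["n"] for s in studies))

    print(f"Glass mean r = {glass_mean:.3f}")
    print(f"Hunter-Schmidt mean r = {hs_mean:.3f}")
    # Weighting and corrections make the two summaries differ; which one
    # turns out larger in practice is what hypothesis (1) above examines.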
Chemical and pharmacological comparison of modern and traditional dosage forms of Joshanda.
Parveen, Sajida; Irfan Bukhari, Nadeem; Shehzadi, Naureen; Qamar, Shaista; Ali, Ejaz; Naheed, Surriya; Latif, Abida; Yuchi, Alamgeer; Hussain, Khalid
2017-12-11
Recently, the traditional remedy Joshanda has largely been replaced by modern ready-to-use dosage forms, which have not been compared with the original remedy. Therefore, the present study aimed to compare a number of modern dosage forms with the traditional remedy. Seven brands, 3 batches each, were compared with a lab-made formulation with reference to analytical profiles (proximate analyses, spectroscopic and chromatographic metabolomes) and pharmacological profiles (anti-inflammatory and antibacterial activities). Chemical and pharmacological differences were found between the lab-made Joshanda and the modern dosage forms. Such variations were also found within the brands and batches of the modern formulations (p < 0.05). The lab-made Joshanda showed significantly higher pharmacological activities than the modern brands. The results of the present study indicate that modern dosage forms are unstandardised and less effective than the traditional remedy. Characteristic profiles obtained from the lab-made Joshanda may be used as a reference to produce comparable dosage forms.
The Modern Design of Experiments for Configuration Aerodynamics: A Case Study
NASA Technical Reports Server (NTRS)
DeLoach, Richard
2006-01-01
The effects of slowly varying and persisting covariates on the accuracy and precision of experimental results are reviewed, as is the rationale for run-order randomization as a quality assurance tactic employed in the Modern Design of Experiments (MDOE) to defend against such effects. Considerable analytical complexity is introduced by restrictions on randomization in configuration aerodynamics tests because they involve hard-to-change configuration variables that cannot be randomized conveniently. Tradeoffs are examined between quality and productivity associated with varying degrees of rigor in accounting for such randomization restrictions. Certain characteristics of a configuration aerodynamics test are considered that may justify a relaxed accounting for randomization restrictions to achieve a significant reduction in analytical complexity with a comparably negligible adverse impact on the validity of the experimental results.
NASA Astrophysics Data System (ADS)
Goodall, Clive
1993-08-01
A decisive and lethal response to a naive radical skepticism concerning the prospects for the existence of Extraterrestrial Intelligence is derivable from core areas of Modern Analytic Philosophy. The naive skeptical view is fundamentally flawed in the way it oversimplifies certain complex issues, failing as it does to recognize a special class of conceptual problems for what they really are and mistakenly treating them instead as empirical issues. Specifically, this skepticism is based upon an untenable, oversimplifying model of the 'mind-brain' relation. Moreover, independent logical considerations concerning the mind-brain relation provide evidential grounds for why we should in fact expect a priori that an Alien Intelligence will face constraints upon, and immense difficulties in, making its existence known by non-electromagnetic means.
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
NASA Astrophysics Data System (ADS)
Uemura, Ryu; Nakamoto, Masashi; Asami, Ryuji; Mishima, Satoru; Gibo, Masakazu; Masaka, Kosuke; Jin-Ping, Chen; Wu, Chung-Che; Chang, Yu-Wei; Shen, Chuan-Chou
2016-01-01
Speleothem inclusion-water isotope compositions are a promising new climatic proxy, but their applicability has been limited by the low water content and by analytical challenges. We have developed a precise and accurate isotopic technique based on cavity ring-down spectroscopy (CRDS). This method features a newly developed crushing apparatus, a refined sample extraction line, and careful evaluation of the water/carbonate adsorption effect. After crushing chipped speleothem in the crushing device, the released inclusion water is purified and mixed with a limited amount of nitrogen gas in the extraction line for CRDS measurement. We have measured 50-260 nL of inclusion water from 77 to 286 mg of stalagmite deposits sampled from Gyokusen Cave, Okinawa Island, Japan. The small sample size requirement demonstrates that our analytical technique can offer high-resolution inclusion-water-based paleoclimate reconstructions. The 1σ reproducibility for different stalagmites ranges from ±0.05 to 0.61‰ for δ18O and ±0.0 to 2.9‰ for δD. The δD vs. δ18O plot for inclusion water from modern stalagmites is consistent with the local meteoric water line. The 1000 ln α values based on calcite and fluid inclusion measurements from decades-old stalagmites are in agreement with data from a present-day farmed calcite experiment. The combination of coeval carbonate and fluid inclusion data suggests that past temperatures at 9-10 thousand years ago (ka) and at 26 ka were 3.4 ± 0.7 °C and 8.2 ± 2.4 °C colder than at present, respectively.
State of the art in treatment of facial paralysis with temporalis tendon transfer.
Sidle, Douglas M; Simon, Patrick
2013-08-01
Temporalis tendon transfer is a technique for dynamic facial reanimation. Since its inception nearly 80 years ago, it has undergone a wealth of innovation to produce the modern operation. The purpose of this review is to update the literature on the current techniques and perioperative management of patients undergoing temporalis tendon transfer. The modern technique focuses on minimally invasive approaches and aesthetic refinements to enhance the final result of the operation. The newest techniques, as well as preoperative assessment and postoperative rehabilitation, are discussed. When temporalis tendon transfer is indicated for facial reanimation, the modern operation offers a refined technique that produces an aesthetically acceptable outcome. Preoperative smile assessment and postoperative smile rehabilitation are necessary and important adjuncts to a successful operation.
Modern reaction-based indicator systems†
2010-01-01
Traditional analyte-specific synthetic receptors or sensors have been developed on the basis of supramolecular interactions (e.g., hydrogen bonding, electrostatics, weak coordinative bonds). Unfortunately, this approach is often subject to limitations. As a result, increasing attention within the chemical sensor community is turning to the use of analyte-specific molecular indicators, wherein substrate-triggered reactions are used to signal the presence of a given analyte. This tutorial review highlights recent reaction-based indicator systems that have been used to detect selected anions, cations, reactive oxygen species, and neutral substrates. PMID:19587959
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature contains almost no discussion of Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of Earth science data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of the data analytics tools/techniques requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Green analytical chemistry--theory and practice.
Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek
2010-08-01
This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.
Rowe, Aaron A; Bonham, Andrew J; White, Ryan J; Zimmer, Michael P; Yadgar, Ramsin J; Hobza, Tony M; Honea, Jim W; Ben-Yaacov, Ilan; Plaxco, Kevin W
2011-01-01
Although potentiostats are the foundation of modern electrochemical research, they have seen relatively little application in resource poor settings, such as undergraduate laboratory courses and the developing world. One reason for the low penetration of potentiostats is their cost, as even the least expensive commercially available laboratory potentiostats sell for more than one thousand dollars. An inexpensive electrochemical workstation could thus prove useful in educational labs, and increase access to electrochemistry-based analytical techniques for food, drug and environmental monitoring. With these motivations in mind, we describe here the CheapStat, an inexpensive (<$80), open-source (software and hardware), hand-held potentiostat that can be constructed by anyone who is proficient at assembling circuits. This device supports a number of potential waveforms necessary to perform cyclic, square wave, linear sweep and anodic stripping voltammetry. As we demonstrate, it is suitable for a wide range of applications ranging from food- and drug-quality testing to environmental monitoring, rapid DNA detection, and educational exercises. The device's schematics, parts lists, circuit board layout files, sample experiments, and detailed assembly instructions are available in the supporting information and are released under an open hardware license.
Liao, Wei-Ching; Chuang, Min-Chieh; Ho, Ja-An Annie
2013-12-15
Genetic modification (GM), one of the modern biomolecular engineering technologies, has been deemed a profitable strategy in the fight against global starvation. Yet rapid and reliable analytical methods for evaluating the quality and potential risk of the resulting GM products are lacking. We herein present a biomolecular analytical system constructed with distinct biochemical activities to expedite the computational detection of genetically modified organisms (GMOs). The computational mechanism provides an alternative to the complex procedures commonly involved in the screening of GMOs. Given that the bioanalytical system is capable of processing promoter, coding and species genes, affirmative interpretations succeed in identifying a specified GM event in both electrochemical and optical fashions. The biomolecular computational assay detects genetically modified DNA below the sub-nanomolar level and is found to be interference-free in the abundant coexistence of non-GM DNA. Furthermore, this bioanalytical system can be operated in an array fashion for multiplex screening against variable GM events. Such a biomolecular computational assay and biosensor holds great promise for rapid, cost-effective, and high-fidelity screening of GMOs. Copyright © 2013 Elsevier B.V. All rights reserved.
CheapStat: An Open-Source, “Do-It-Yourself” Potentiostat for Analytical and Educational Applications
Rowe, Aaron A.; Bonham, Andrew J.; White, Ryan J.; Zimmer, Michael P.; Yadgar, Ramsin J.; Hobza, Tony M.; Honea, Jim W.; Ben-Yaacov, Ilan; Plaxco, Kevin W.
2011-01-01
Although potentiostats are the foundation of modern electrochemical research, they have seen relatively little application in resource poor settings, such as undergraduate laboratory courses and the developing world. One reason for the low penetration of potentiostats is their cost, as even the least expensive commercially available laboratory potentiostats sell for more than one thousand dollars. An inexpensive electrochemical workstation could thus prove useful in educational labs, and increase access to electrochemistry-based analytical techniques for food, drug and environmental monitoring. With these motivations in mind, we describe here the CheapStat, an inexpensive (<$80), open-source (software and hardware), hand-held potentiostat that can be constructed by anyone who is proficient at assembling circuits. This device supports a number of potential waveforms necessary to perform cyclic, square wave, linear sweep and anodic stripping voltammetry. As we demonstrate, it is suitable for a wide range of applications ranging from food- and drug-quality testing to environmental monitoring, rapid DNA detection, and educational exercises. The device's schematics, parts lists, circuit board layout files, sample experiments, and detailed assembly instructions are available in the supporting information and are released under an open hardware license. PMID:21931613
NASA Astrophysics Data System (ADS)
Behling, Hermann; da Costa, Marcondes Lima
2004-12-01
A coastal environment has been interpreted from 110 cm thick mudstone deposits found at the base of a 10 m immature laterite profile, which forms the modern coastal cliff on Mosqueiro Island in northeastern Pará state, northern Brazil. The late Tertiary sediment deposits of the Barreiras Formation are studied by multi-element geochemistry and pollen analyses. The mineralogical and geochemical results show that the gray, organic-rich deposits are composed of kaolinite, quartz, and illite/muscovite, as well as pyrite and anatase. They are rich in SiO2, Al2O3, and some FeO. The composition is homogeneous, indicating that the detritus source area is formed of lateritic soils derived from rocks of acid composition. Their chemical composition, including trace elements, is somewhat comparable to continental shale, and the values are below the composition of the upper continental Earth crust. The pollen analytical data document that the mudstone deposits were formed by an ancient mangrove ecosystem. Mineralogical, geochemical, and pollen analytical data obtained from the late Tertiary mangrove deposits are compared with modern mangrove deposits from the Bragança Peninsula on the northeastern coast of Pará state. Although the pollen composition of the deposits is very similar to the modern one, the geochemical and mineralogical composition is different. Smectite was found only in the modern deposit; illite/mica occurs in the ancient deposit, along with Mg, K, and Na. The pollen signature and detrital minerals (kaolinite, quartz and anatase) found in both mangrove deposits show that during the Miocene, humid tropical climate conditions prevailed, similar to modern conditions.
Analysis of Variance in the Modern Design of Experiments
NASA Technical Reports Server (NTRS)
Deloach, Richard
2010-01-01
This paper is a tutorial introduction to the analysis of variance (ANOVA), intended as a reference for aerospace researchers who are being introduced to the analytical methods of the Modern Design of Experiments (MDOE), or who may have other opportunities to apply this method. One-way and two-way fixed-effects ANOVA, as well as random effects ANOVA, are illustrated in practical terms that will be familiar to most practicing aerospace researchers.
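A minimal worked example of the one-way fixed-effects case (with made-up measurements, not data from the tutorial) looks like this:

    import numpy as np
    from scipy import stats

    # Hypothetical measurements of a response under three factor levels
    # (e.g., three tunnel configurations); values invented for illustration.
    g1 = np.array([0.021, 0.023, 0.022, 0.024])
    g2 = np.array([0.026, 0.025, 0.027, 0.026])
    g3 = np.array([0.022, 0.021, 0.023, 0.022])

    # One-way fixed-effects ANOVA: does at least one level mean differ?
    f_stat, p_val = stats.f_oneway(g1, g2, g3)
    print(f"F = {f_stat:.2f}, p = {p_val:.4f}")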
Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.
ERIC Educational Resources Information Center
Heineman, William R.; Kissinger, Peter T.
1980-01-01
Reports developments involving the experimental aspects of finite-current analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)
van der Ploeg, Tjeerd; Austin, Peter C; Steyerberg, Ewout W
2014-12-22
Modern modelling techniques may potentially provide more accurate predictions of binary outcomes than classical techniques. We aimed to study the predictive performance of different modelling techniques in relation to the effective sample size ("data hungriness"). We performed simulation studies based on three clinical cohorts: 1282 patients with head and neck cancer (with 46.9% 5 year survival), 1731 patients with traumatic brain injury (22.3% 6 month mortality) and 3181 patients with minor head injury (7.6% with CT scan abnormalities). We compared three relatively modern modelling techniques, support vector machines (SVM), neural nets (NN) and random forests (RF), and two classical techniques, logistic regression (LR) and classification and regression trees (CART). We created three large artificial databases with 20 fold, 10 fold and 6 fold replication of subjects, in which we generated dichotomous outcomes according to different underlying models. We applied each modelling technique to increasingly larger development parts (100 repetitions). The area under the ROC curve (AUC) indicated the performance of each model in the development part and in an independent validation part. Data hungriness was defined by plateauing of the AUC and small optimism (difference between the mean apparent AUC and the mean validated AUC <0.01). We found that a stable AUC was reached by LR at approximately 20 to 50 events per variable, followed by CART, SVM, NN and RF models. Optimism decreased with increasing sample sizes and the same ranking of techniques. The RF, SVM and NN models showed instability and high optimism even with >200 events per variable. Modern modelling techniques such as SVM, NN and RF may need over 10 times as many events per variable as classical modelling techniques such as LR to achieve a stable AUC and small optimism. This implies that such modern techniques should only be used in medical prediction problems if very large data sets are available.
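The apparent-versus-validated AUC comparison that defines optimism can be reproduced in miniature; the sketch below uses a synthetic data set and only two of the five techniques (LR and RF), so the numbers are illustrative rather than the study's results:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-in for a clinical cohort with a binary outcome.
    X, y = make_classification(n_samples=20000, n_features=10,
                               weights=[0.8], random_state=0)
    X_dev, y_dev, X_val, y_val = X[:10000], y[:10000], X[10000:], y[10000:]

    for n in (200, 1000, 5000):  # increasingly large development parts
        for name, model in [("LR", LogisticRegression(max_iter=1000)),
                            ("RF", RandomForestClassifier(n_estimators=200,
                                                          random_state=0))]:
            model.fit(X_dev[:n], y_dev[:n])
            apparent = roc_auc_score(y_dev[:n],
                                     model.predict_proba(X_dev[:n])[:, 1])
            validated = roc_auc_score(y_val,
                                      model.predict_proba(X_val)[:, 1])
            print(f"n={n:5d} {name}: apparent={apparent:.3f} "
                  f"validated={validated:.3f} "
                  f"optimism={apparent - validated:.3f}")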
Hathaway, John C.
1971-01-01
The purpose of the data file presented below is twofold: the first purpose is to make available in printed form the basic data relating to the samples collected as part of the joint U.S. Geological Survey - Woods Hole Oceanographic Institution program of study of the Atlantic continental margin of the United States; the second purpose is to maintain these data in a form that is easily retrievable by modern computer methods. With the data in such form, repeated manual transcription for statistical or similar mathematical treatment becomes unnecessary. Manual plotting of information or derivatives from the information may also be eliminated. Not only is handling of data by the computer considerably faster than manual techniques, but a fruitful source of errors, transcription mistakes, is eliminated.
Integrating succession and community assembly perspectives
Chang, Cynthia; HilleRisLambers, Janneke
2016-01-01
Succession and community assembly research overlap in many respects, such as through their focus on how ecological processes like dispersal, environmental filters, and biotic interactions influence community structure. Indeed, many recent advances have been made by successional studies that draw on modern analytical techniques introduced by contemporary community assembly studies. However, community assembly studies generally lack a temporal perspective, both on how the forces structuring communities might change over time and on how historical contingency (e.g. priority effects and legacy effects) and complex transitions (e.g. threshold effects) might alter community trajectories. We believe a full understanding of the complex interacting processes that shape community dynamics across large temporal scales can best be achieved by combining concepts, tools, and study systems into an integrated conceptual framework that draws upon both succession and community assembly theory. PMID:27785355
HPTLC in Herbal Drug Quantification
NASA Astrophysics Data System (ADS)
Shinde, Devanand B.; Chavan, Machindra J.; Wakte, Pravin S.
For the past few decades, compounds from natural sources have been gaining importance because of the vast chemical diversity they offer. This has led to a phenomenal increase in the demand for herbal medicines in the last two decades, and a need has been felt for ensuring the quality, safety, and efficacy of herbal drugs. Phytochemical evaluation is one of the tools for quality assessment, which includes preliminary phytochemical screening, chemoprofiling, and marker compound analysis using modern analytical techniques. High-performance thin-layer chromatography (HPTLC) has emerged as an important tool for the qualitative, semiquantitative, and quantitative phytochemical analysis of herbal drugs and formulations, including the development of TLC fingerprinting profiles and the estimation of biomarkers. This review attempts to focus on the theoretical considerations of HPTLC and some examples of herbal drugs and formulations analyzed by HPTLC.
Social media for intelligence: practical examples of analysis for understanding
NASA Astrophysics Data System (ADS)
Juhlin, Jonas A.; Richardson, John
2016-05-01
Social media has become a dominating feature of modern life. Platforms like Facebook, Twitter, and Google have users all over the world, and people from all walks of life use social media. For the intelligence services, social media is an element that cannot be ignored: it holds an immense amount of information, and the potential to extract useful intelligence from it cannot be overlooked. Social media has been around long enough that most intelligence services recognize it needs some form of attention. However, the intelligence collector and analyst must uncover several aspects in order to fully exploit social media for intelligence purposes. This paper will present Project Avatar, an experiment in obtaining effective intelligence from social media sources, and several emerging analytic techniques to expand the intelligence gathered from these sources.
Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael
2018-05-01
Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry based and X-ray based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all these techniques is a function of a number of parameters, such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for a given analytical requirement. Copyright © 2018 Elsevier Ltd. All rights reserved.
[On the way to national reference system of laboratory medicine].
Muravskaia, N P; Men'shikov, V V
2014-10-01
The application of standard samples and reference measurement techniques is needed to support the reliability of the analyses performed in clinical diagnostic laboratories. They serve as benchmarks for metrological monitoring, calibration of devices, and quality control of results. The article presents an analysis of the shortcomings that interfere with the formation of a national reference system in Russia harmonized with the capabilities provided by international organizations, among them the joint Committee on metrological monitoring in laboratory medicine under the auspices of the International Bureau of Weights and Measures, the International Federation of Clinical Chemistry and Laboratory Medicine, etc. The results of the recent development of national normative documents, standard samples and techniques, prepared with the assistance of the authors of this article, are considered. These are the first steps toward the organization of a national reference system that would comprise the whole range of modern analytical technologies of laboratory medicine. National and international measures are proposed to expedite the organization of a national reference system for laboratory medicine in the interest of increasing the effectiveness of medical care for the citizens of Russia.
Ultra-small dye-doped silica nanoparticles via modified sol-gel technique
NASA Astrophysics Data System (ADS)
Riccò, R.; Nizzero, S.; Penna, E.; Meneghello, A.; Cretaio, E.; Enrichi, F.
2018-05-01
In modern biosensing and imaging, fluorescence-based methods constitute the most widespread approach to achieving optimal detection of analytes, both in solution and at the single-particle level. Despite the huge progress made in recent decades in the development of plasmonic biosensors and label-free sensing techniques, fluorescent molecules remain the most commonly used contrast agents for commercial imaging and detection methods. However, they exhibit low stability, can be difficult to functionalise, and often result in a low signal-to-noise ratio. Thus, embedding fluorescent probes into robust and bio-compatible materials, such as silica nanoparticles, can substantially enhance the detection limit and dramatically increase the sensitivity. In this work, ultra-small fluorescent silica nanoparticles (NPs) for optical biosensing applications were doped with a fluorescent dye, using simple water-based sol-gel approaches based on the classical Stöber procedure. By systematically modulating reaction parameters, controllable size tuning of particle diameters as low as 10 nm was achieved. Particle morphology and optical response were evaluated, showing possible single-molecule behaviour, without employing microemulsion methods to achieve similar results.
Active Fail-Safe Micro-Array Flow Control for Advanced Embedded Propulsion Systems
NASA Technical Reports Server (NTRS)
Anderson, Bernhard H.; Mace, James L.; Mani, Mori
2009-01-01
The primary objective of this research effort was to develop and analytically demonstrate enhanced first-generation active "fail-safe" hybrid flow-control techniques to simultaneously manage the boundary layer on the vehicle fore-body and to control the secondary flow generated within modern serpentine or embedded inlet S-duct configurations. The enhanced first-generation technique focused on both micro-vanes and micro-ramps highly integrated with micro-jets to provide nonlinear augmentation of the "strength" or effectiveness of highly integrated flow control systems. The study focused on the micro-jet mass flow ratio (Wjet/Waip) range from 0.10 to 0.30 percent and jet total pressure ratios (Pjet/Po) from 1.0 to 3.0. The engine bleed airflow range under study represents about a 10-fold decrease in micro-jet airflow relative to previous requirements. Therefore, by pre-conditioning, or injecting a very small amount of high-pressure jet flow into the vortex generated by the micro-vane and/or micro-ramp, active flow control is achieved and substantial augmentation of the controlling flow is realized.
Molecular markers: progress and prospects for understanding reproductive ecology in elasmobranchs.
Portnoy, D S; Heist, E J
2012-04-01
Application of modern molecular tools is expanding the understanding of elasmobranch reproductive ecology. High-resolution molecular markers provide information at scales ranging from the identification of reproductively isolated populations in sympatry (i.e. cryptic species) to the relationships among parents, offspring and siblings. This avenue of study has not only augmented the current understanding of the reproductive biology of elasmobranchs but has also provided novel insights that could not be obtained through experimental or observational techniques. Sharing of genetic polymorphisms across ocean basins indicates that for some species there may be gene flow on global scales. The presence, however, of morphologically similar but genetically distinct entities in sympatry suggests that reproductive isolation can occur with minimal morphological differentiation. This review discusses recent findings in elasmobranch reproductive biology, such as philopatry, hybridization and polyandry, while highlighting important molecular and analytical techniques. Furthermore, the review examines gaps in current knowledge and discusses how new technologies may be applied to further the understanding of elasmobranch reproductive ecology. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to choice on modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics is used as a testbed for evaluating the use of semantic technology.
Automated Predictive Big Data Analytics Using Ontology Based Semantics
Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.
2017-01-01
Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to choice on modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics is used as a testbed for evaluating the use of semantic technology. PMID:29657954
An Introduction to Modern Missing Data Analyses
ERIC Educational Resources Information Center
Baraldi, Amanda N.; Enders, Craig K.
2010-01-01
A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous relative to traditional techniques (e.g., deletion and mean imputation) because they require less stringent assumptions and mitigate the pitfalls of traditional…
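A minimal sketch of the multiple-imputation workflow (m stochastic imputations, analyse each completed data set, then pool the estimates) using scikit-learn's IterativeImputer; the data are synthetic, and the pooling is reduced to averaging the point estimates, omitting Rubin's variance rules:

    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    rng = np.random.default_rng(0)
    X = rng.normal(loc=5.0, size=(200, 3))
    X[rng.random(X.shape) < 0.2] = np.nan  # ~20% of values missing

    # Multiple imputation: m stochastic completions, analyse each, then pool.
    estimates = []
    for m in range(5):
        imputer = IterativeImputer(sample_posterior=True, random_state=m)
        X_complete = imputer.fit_transform(X)
        estimates.append(X_complete[:, 0].mean())  # per-data-set estimate

    print("pooled point estimate:", np.mean(estimates))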
Application of Modern Design of Experiments to CARS Thermometry in a Model Scramjet Engine
NASA Technical Reports Server (NTRS)
Danehy, P. M.; DeLoach, R.; Cutler, A. D.
2002-01-01
We have applied formal experiment design and analysis to optimize the measurement of temperature in a supersonic combustor at NASA Langley Research Center. We used the coherent anti-Stokes Raman spectroscopy (CARS) technique to map the temperature distribution in the flowfield downstream of an 1160 K, Mach 2 freestream into which supersonic hydrogen fuel is injected at an angle of 30 degrees. CARS thermometry is inherently a single-point measurement technique; it was used to map the flow by translating the measurement volume through the flowfield. The method known as "Modern Design of Experiments" (MDOE) was used to estimate the data volume required, design the test matrix, perform the experiment and analyze the resulting data. MDOE allowed us to match the volume of data acquired to the precision requirements of the customer. Furthermore, one aspect of MDOE, known as response surface methodology, allowed us to develop precise maps of the flowfield temperature, allowing interpolation between measurement points. An analytic function in two spatial variables was fit to the data from a single measurement plane. Fitting with a Cosine Series Bivariate Function allowed the mean temperature to be mapped with 95% confidence interval half-widths of +/- 30 K, comfortably meeting the +/- 50 K requirement specified prior to performing the experiments. We estimate that applying MDOE to the present experiment saved a factor of 5 in data volume acquired, compared to experiments executed in the traditional manner. Furthermore, the precision requirements could have been met with less than half the data acquired.
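The response-surface step can be sketched as an ordinary least-squares fit of a low-order cosine series in two spatial variables; the grid, noise level, and series order below are hypothetical stand-ins for the actual CARS data:

    import numpy as np

    # Hypothetical temperature samples T(y, z) over a unit measurement plane.
    rng = np.random.default_rng(1)
    y = rng.uniform(0.0, 1.0, 120)
    z = rng.uniform(0.0, 1.0, 120)
    T = (1100.0 + 200.0 * np.cos(np.pi * y) * np.cos(2.0 * np.pi * z)
         + rng.normal(0.0, 15.0, 120))

    # Least-squares fit of a low-order cosine series:
    # T ~ sum_{m,n} c_mn cos(m pi y) cos(n pi z).
    order = 2
    pairs = [(m, n) for m in range(order + 1) for n in range(order + 1)]
    A = np.column_stack([np.cos(m * np.pi * y) * np.cos(n * np.pi * z)
                         for m, n in pairs])
    coef, *_ = np.linalg.lstsq(A, T, rcond=None)

    def T_fit(yq, zq):
        """Evaluate the fitted surface anywhere, i.e. interpolate."""
        return sum(c * np.cos(m * np.pi * yq) * np.cos(n * np.pi * zq)
                   for c, (m, n) in zip(coef, pairs))

    print(round(float(T_fit(0.25, 0.5)), 1))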
Popplow, Marcus
2015-12-01
Recent critical approaches to what has conventionally been described as "scientific" and "technical" knowledge in early modern Europe have provided a wealth of new insights. So far, the various analytical concepts suggested by these studies have not yet been comprehensively discussed. The present essay argues that such comprehensive approaches might prove of special value for long-term and cross-cultural reflections on technology-related knowledge. As heuristic tools, the notions of "formalization" and "interaction" are proposed as part of alternative narratives to those highlighting the emergence of "science" as the most relevant development for technology-related knowledge in early modern Europe.
A polymeric micro total analysis system for single-cell analysis
NASA Astrophysics Data System (ADS)
Lai, Hsuan-Hong
The advancement of microengineering has enabled the manipulation and analysis of single cells, which is critical to understanding the molecular mechanisms underlying basic physiological functions from the point of view of modern biologists. Unfortunately, analysis of single cells remains challenging from a technical perspective, mainly because of the miniature nature of the cell and the high-throughput requirements of the analysis. Lab-on-a-chip (LOC) has emerged as a research field that shows great promise in this respect. We have demonstrated a micro total analysis system (mu-TAS) combining chip-based electrophoretic separation, fluorescence detection, and a pulsed Nd:YAG laser cell lysis system in a poly(dimethylsiloxane) (PDMS) microfluidic analytical platform for the implementation of single-cell analysis. To accomplish this task, a polymeric microfluidic device was fabricated and UV graft polymerization surface modification techniques were used. To optimize the conditions for the surface treatment, the modified PDMS surfaces were characterized by ATR-IR spectra and sessile water drop contact angle measurements, and in-channel surfaces were characterized by their electroosmotic flow mobility. Accurate single-cell analysis relies on rapid cell lysis, and therefore an optical means of fast cell lysis was implemented and optimized in a microscopic station. The influences of pulse energy and the location of the laser beam with respect to the cell in the microchannel were explored. Observations from the cell disruption experiments suggested that cell lysis proceeds mainly via a thermo-mechanical rather than a plasma-mediated mechanism. Finally, after chip-based electrophoresis and a laser-induced fluorescence (LIF) detection system were incorporated with the laser lysis system in a microfluidic analytical station, a feasibility demonstration of single-cell analysis was implemented. The analytical platform exhibited the capability of fluidic transportation, optical lysis of single cells, and separation and analysis of the lysates by electrophoresis and LIF detection. In comparison with the control experiment, the migration times of the fluorescent signals for the cytosolic fluorophores were in good agreement with those for the standard fluorophores, which confirmed the feasibility of the analytical processes.
Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z
2015-12-01
Most drinking water industries are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is one of the promising disinfectants usually used as a secondary disinfectant, whereas the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) to determine the superior technique. The commonly available commercial analytical techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior analytical technique was determined with reference to pre-defined criteria. To discern the effectiveness of this superior technique, various factors that might influence its performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, chronoamperometry showed a significantly higher level of accuracy and precision. Furthermore, the various influencing factors studied did not diminish the technique's performance, which was fairly adequate in all matrices. This study is a step towards proper disinfection monitoring, and it confidently assists engineers with chlorine dioxide disinfection system planning and management.
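The accuracy/precision comparison reduces to computing bias against a known standard and relative standard deviation for each technique's replicates; the readings below are invented for illustration:

    import numpy as np

    # Hypothetical replicate readings (mg/L) of a 1.00 mg/L ClO2 standard;
    # all values invented for illustration.
    readings = {
        "chronoamperometry":      [1.01, 0.99, 1.00, 1.02, 0.98],
        "DPD":                    [1.08, 1.11, 1.05, 1.09, 1.12],
        "LGB WET":                [0.95, 0.92, 0.97, 0.94, 0.96],
        "amperometric titration": [1.03, 0.97, 1.06, 0.95, 1.04],
    }
    true_value = 1.00

    for name, values in readings.items():
        v = np.asarray(values)
        bias = v.mean() - true_value            # accuracy
        rsd = 100.0 * v.std(ddof=1) / v.mean()  # precision, %RSD
        print(f"{name:24s} bias = {bias:+.3f} mg/L, RSD = {rsd:.1f}%")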
Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette
2014-08-15
The purpose of the study was to perform a comparative analysis of the technical performance, respective costs, and environmental effects of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques vs. HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior to the invasive analytical methods with respect to technical, economic, and environmental objectives. Copyright © 2014 Elsevier B.V. All rights reserved.
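The inter-method agreement described above rests on non-parametric rank correlation. A minimal sketch of such a check using Spearman's test; the concentration values below are invented placeholders, not data from the study.

```python
# Rank-correlate a candidate method against a reference method across
# paired measurements of the same samples.
from scipy.stats import spearmanr

hplc  = [0.52, 1.01, 2.48, 5.10, 9.95]   # reference method (mg/mL, illustrative)
raman = [0.50, 1.05, 2.51, 5.02, 10.10]  # candidate method (mg/mL, illustrative)

rho, p_value = spearmanr(hplc, raman)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
```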
Yoga and mental health: A dialogue between ancient wisdom and modern psychology
Vorkapic, Camila Ferreira
2016-01-01
Background: Many yoga texts make reference to the importance of mental health and the use of specific techniques in the treatment of mental disorders. Many concepts utilized in modern psychology may not originate in contemporary ideas; instead, they seem to share a common root with ancient wisdom. Aims: The goal of this perspective article is to correlate modern techniques used in psychology and psychiatry with yogic practices in the treatment of mental disorders. Materials and Methods: The current article presents a dialogue between the yogic approach to the treatment of mental disorders and concepts used in modern psychology, such as meta-cognition, disidentification, deconditioning, and interoceptive exposure. Conclusions: Contemplative research has found that modern interventions in psychology might not come from modern concepts after all, but share great similarity with ancient yogic knowledge, giving us the opportunity to integrate the psychological wisdom of both East and West. PMID:26865774
NASA Astrophysics Data System (ADS)
Parvathi, S. P.; Ramanan, R. V.
2018-06-01
An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of the Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and then modified to include them. The modification is based on (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence, including the perturbations, and (ii) quantification of deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using a linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides a realistic insight into the mission aspects. The proposed design is also an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.
Global Combat Support System-Marine Corps Proof-of-Concept for Dashboard Analytics
2014-12-01
The core is modern, commercial-off-the-shelf enterprise resource planning (ERP) software (Oracle 11i e-Business Suite). GCSS-MC's design is focused...factor in the decision to implement this new software. GCSS-MC is the technology centerpiece of the Logistics Modernization (LogMod) Program...GCSS-MC is based on the implementation of Oracle e-Business Suite 11i as the core software package. This is the same infrastructure that Oracle
Advances in Modern Botnet Understanding and the Accurate Enumeration of Infected Hosts
ERIC Educational Resources Information Center
Nunnery, Christopher Edward
2011-01-01
Botnets remain a potent threat due to evolving modern architectures, inadequate remediation methods, and inaccurate measurement techniques. In response, this research exposes the architectures and operations of two advanced botnets, techniques to enumerate infected hosts, and pursues the scientific refinement of infected-host enumeration data by…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Emma M.; Hendrix, Val; Chertkov, Michael
This white paper introduces the application of advanced data analytics to the modernized grid. In particular, we consider the field of machine learning and where it is both useful, and not useful, for the particular field of the distribution grid and buildings interface. While analytics in general is a growing field of interest, and often seen as the golden goose in the burgeoning distribution grid industry, its application is often limited by communications infrastructure or the lack of a focused technical application. Overall, the linkage of analytics to purposeful application in the grid space has been limited. In this paper we consider the field of machine learning as a subset of analytical techniques, and discuss its ability and limitations to enable the future distribution grid and the building-to-grid interface. To that end, we also consider the potential for mixing distributed and centralized analytics and the pros and cons of these approaches. Machine learning is a subfield of computer science that studies and constructs algorithms that can learn from data, make predictions, and improve forecasts. Incorporation of machine learning in grid monitoring and analysis tools may have the potential to solve data and operational challenges that result from increasing penetration of distributed and behind-the-meter energy resources. There is an exponentially expanding volume of measured data being generated on the distribution grid, which, with appropriate application of analytics, may be transformed into intelligible, actionable information that can be provided to the right actors (such as grid and building operators) at the appropriate time to enhance grid or building resilience, efficiency, and operations against various metrics or goals, such as total carbon reduction or other economic benefit to customers. While some basic analysis of these data streams can provide a wealth of information, computational and human boundaries on performing the analysis are becoming significant as data volumes and multi-objective concerns grow. Efficient applications of analysis and the machine learning field are being considered in the loop.
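As a hedged illustration of the kind of supervised learning the paper surveys (not an implementation from the paper): forecasting feeder load from weather and calendar features with a random forest, on synthetic data standing in for real grid measurements.

```python
# Toy load-forecasting baseline: learn feeder load from temperature and
# hour-of-day, then score on held-out data. All data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
temp = rng.uniform(-5, 35, n)             # ambient temperature (C)
hour = rng.integers(0, 24, n)             # hour of day
load = 50 + 1.2 * temp + 10 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 2, n)

X = np.column_stack([temp, hour])
X_tr, X_te, y_tr, y_te = train_test_split(X, load, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out data: {model.score(X_te, y_te):.3f}")
```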
Depth-resolved monitoring of analytes diffusion in ocular tissues
NASA Astrophysics Data System (ADS)
Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.
2007-02-01
Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantification of analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied, and their permeability coefficients were calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. The obtained results suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.
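In OCT diffusion studies of this kind, the permeability coefficient is often approximated as the thickness of a monitored tissue region divided by the time the analyte takes to diffuse across it; whether this paper uses exactly that estimator is an assumption here. A minimal sketch with invented values:

```python
# Depth-resolved permeability estimate: P (cm/s) ~ region thickness divided
# by the time the analyte front takes to cross it. Values are placeholders.

def permeability_coefficient(region_depth_cm, diffusion_time_s):
    """Permeability coefficient in cm/s under the thickness/time approximation."""
    return region_depth_cm / diffusion_time_s

# e.g. a 100-um-thick region crossed in ~400 s:
print(f"P = {permeability_coefficient(100e-4, 400):.2e} cm/s")
```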
Kriz, J; Baues, C; Engenhart-Cabillic, R; Haverkamp, U; Herfarth, K; Lukas, P; Schmidberger, H; Marnitz-Schulze, S; Fuchs, M; Engert, A; Eich, H T
2017-02-01
Field design in radiotherapy (RT) of Hodgkin's lymphoma (HL) has changed substantially, from extended-field RT (EF-RT) to involved-field RT (IF-RT) and now to involved-node RT (IN-RT) and involved-site RT (IS-RT), as have treatment techniques. The purpose of this article is to demonstrate the establishment of a quality assurance program (QAP) covering modern RT techniques and field designs within the German Hodgkin Study Group (GHSG). In the era of modern conformal RT, this QAP had to be fundamentally adapted, and a new evaluation process has been intensively discussed by the radiotherapeutic expert panel of the GHSG. The expert panel developed guidelines and criteria to analyse "modern" field designs and treatment techniques. This work is based on a dataset of 11 patients treated within the sixth study generation (HD16-17). To develop a QAP for "modern" RT, the expert panel defined criteria for analysing current RT procedures. The consensus on a modified QAP in ongoing and future trials is presented. With this schedule, the QAP of the GHSG could serve as a model for other study groups.
Professional Competence of a Teacher in Higher Educational Institution
ERIC Educational Resources Information Center
Abykanova, Bakytgul; Tashkeyeva, Gulmira; Idrissov, Salamat; Bilyalova, Zhupar; Sadirbekova, Dinara
2016-01-01
Modern reality brings certain corrections to our understanding of the forms and methods of teaching various courses in higher educational institutions. A special role among the educational techniques and means in the college educational environment is played by modern technologies, such as the techniques, means and ways which are aimed at…
An Investigative Graduate Laboratory Course for Teaching Modern DNA Techniques
ERIC Educational Resources Information Center
de Lencastre, Alexandre; Torello, A. Thomas; Keller, Lani C.
2017-01-01
This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the "Drosophila" ortholog of a human disease gene of their choosing using…
ERIC Educational Resources Information Center
Fitzgerald, Mary
2017-01-01
This article reflects on the ways in which socially engaged arts practices can contribute to reconceptualizing the contemporary modern dance technique class as a powerful site of social change. Specifically, the author considers how incorporating socially engaged practices into pedagogical models has the potential to foster responsible citizenship…
Simulation and statistics: Like rhythm and song
NASA Astrophysics Data System (ADS)
Othman, Abdul Rahman
2013-04-01
Simulation has been introduced to solve problems that arise in the form of systems. By using this technique, two kinds of problems can be overcome: first, a problem that has an analytical solution but for which the cost of running an experiment is high in terms of money and lives; second, a problem that exists but has no analytical solution. In the field of statistical inference the second problem is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutations to form a pseudo sampling distribution that will lead to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was and still is being used to verify analytical solutions in inference. This paper also discusses resampling techniques as simulation techniques. Misunderstandings about these two techniques are examined. The successful usages of both techniques are also explained.
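A minimal sketch of the resampling idea described above: bootstrap the sampling distribution of a statistic (here a trimmed mean of skewed data) when no analytical solution is at hand. The data and statistic are invented for illustration.

```python
# Bootstrap a 95% confidence interval for a trimmed mean by resampling
# the observed data with replacement many times.
import numpy as np

rng = np.random.default_rng(42)
sample = rng.lognormal(mean=0.0, sigma=1.0, size=50)   # skewed observed data

def trimmed_mean(x, prop=0.2):
    lo, hi = np.quantile(x, [prop, 1 - prop])
    return x[(x >= lo) & (x <= hi)].mean()

boot = np.array([trimmed_mean(rng.choice(sample, size=sample.size, replace=True))
                 for _ in range(5000)])
ci = np.percentile(boot, [2.5, 97.5])
print(f"95% bootstrap CI for the trimmed mean: [{ci[0]:.3f}, {ci[1]:.3f}]")
```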
Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.
Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun
2017-07-08
Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma, and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME) and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review covers analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.
Linking temporal medical records using non-protected health information data.
Bonomi, Luca; Jiang, Xiaoqian
2017-01-01
Modern medical research relies on multi-institutional collaborations which enhance knowledge discovery and data reuse. While these collaborations allow researchers to perform analytics otherwise impossible on individual datasets, they often pose significant challenges in the data integration process. Due to the lack of a unique identifier, data integration solutions often have to rely on patients' protected health information (PHI). In many situations, such information cannot leave the institutions or must be strictly protected. Furthermore, the presence of noisy values for these attributes may result in poor overall utility. While much research has been done to address these challenges, most current solutions are designed for a static setting without considering the temporal information of the data (e.g., EHR). In this work, we propose a novel approach that uses non-PHI for linking patient longitudinal data. Specifically, our technique captures diagnosis dependencies using patterns which are shown to provide important indications for linking patient records. Our solution can be used as a standalone technique to perform temporal record linkage using non-protected health information data, or it can be combined with Privacy Preserving Record Linkage (PPRL) solutions when protected health information is available; in this case, our approach can resolve ambiguities in results. Experimental evaluations on real datasets demonstrate the effectiveness of our technique.
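A toy sketch of the pattern idea: score candidate record pairs by the overlap of ordered diagnosis patterns rather than PHI. The bigram representation, Jaccard scoring, and codes below are invented for illustration and are not the paper's actual algorithm.

```python
# Score two longitudinal records by the overlap of their ordered
# diagnosis bigrams (consecutive code pairs capture temporal dependency).

def diagnosis_bigrams(codes):
    """Ordered pairs of consecutive diagnosis codes."""
    return set(zip(codes, codes[1:]))

def linkage_score(codes_a, codes_b):
    """Jaccard similarity of the two records' diagnosis-bigram sets."""
    a, b = diagnosis_bigrams(codes_a), diagnosis_bigrams(codes_b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

rec1 = ["E11", "I10", "N18", "I50"]   # visit sequence at institution A (hypothetical)
rec2 = ["E11", "I10", "N18"]          # visit sequence at institution B (hypothetical)
print(f"linkage score: {linkage_score(rec1, rec2):.2f}")
```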
Porosity characterization for heterogeneous shales using integrated multiscale microscopy
NASA Astrophysics Data System (ADS)
Rassouli, F.; Andrew, M.; Zoback, M. D.
2016-12-01
Pore size distribution analysis plays a critical role in characterizing the gas storage capacity and fluid transport of shales. Study of the diverse distribution of pore sizes and structures in such low-permeability rocks has been held back by the lack of tools to visualize the microstructural properties of shale rocks. In this paper we use multiple techniques to investigate the full pore size range at different sample scales. Modern imaging techniques are combined with routine analytical investigations (X-ray diffraction, thin section analysis, and mercury porosimetry) to describe the pore size distribution of shale samples from the Haynesville formation in East Texas and to generate a more holistic understanding of the porosity structure in shales, ranging from the standard core plug down to nm scales. Standard 1" diameter core plug samples were first imaged using a Versa 3D x-ray microscope at lower resolutions. We then picked several regions of interest (ROIs) with various micro-features (such as micro-cracks and high organic matter) in the rock samples to run higher resolution CT scans using non-destructive interior tomography. After this step, we cut the samples and drilled 5 mm diameter cores out of the selected ROIs, then rescanned the samples to measure the porosity distribution of the 5 mm cores. We repeated this step for samples 1 mm in diameter cut out of the 5 mm cores using a laser cutting machine. After comparing the pore structure and distribution of the samples measured from micro-CT analysis, we moved to nano-scale imaging to capture the ultra-fine pores within the shale samples. At this stage, the diameter of the 1 mm samples was milled down to 70 microns using the laser beam. We scanned these samples in an Ultra nano-CT x-ray microscope and calculated the porosity of the samples by image segmentation methods. Finally, we used images collected from focused ion beam scanning electron microscopy (FIB-SEM) to compare the results of the porosity measurements from all the different imaging techniques. These multi-scale characterization techniques are then compared with traditional analytical techniques such as mercury porosimetry.
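A simplified sketch of the image-segmentation porosity estimate mentioned above: threshold a grayscale CT volume so pore voxels separate from matrix, then take the pore-voxel fraction. A synthetic volume stands in for real nano-CT data, and the fixed threshold is an illustrative stand-in for a proper method such as Otsu's.

```python
# Estimate porosity as the fraction of voxels darker than a threshold.
import numpy as np

rng = np.random.default_rng(1)
volume = rng.normal(loc=180, scale=20, size=(64, 64, 64))       # "matrix" voxels
pores = rng.random(volume.shape) < 0.08                         # ~8% true pores
volume[pores] = rng.normal(loc=60, scale=15, size=pores.sum())  # darker pore voxels

threshold = 120                    # in practice chosen via histogram analysis
pore_mask = volume < threshold
porosity = pore_mask.mean()        # pore-voxel fraction
print(f"estimated porosity: {porosity:.3f}")
```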
A/C Interface: The Electronic Toolbox. Part I.
ERIC Educational Resources Information Center
Dessy, Raymond E., Ed.
1985-01-01
Discusses new solid-state transducers, arrays of nonspecific detectors, hardware and firmware computational elements, and other devices that are transforming modern analytical chemistry. Examples in which microelectronic sensors are used to solve 14 problems are included. (JN)
[The discourse of psychosis in contemporary philosophy].
Stompe, Thomas; Ritter, Kristina
2009-01-01
The preoccupation of philosophy with madness can be traced back to Greek antiquity. For many philosophers, like Descartes, psychotic phenomena were symbols of the fragility of human mental powers, while others, like Plato or Nietzsche, saw madness as a way to escape the constraints of rationality. After 1960, three directions of contemporary philosophy dealt with the topics of madness, schizophrenia, and psychosis: Following Nietzsche and Bataille, Foucault as well as Deleuze and Guattari considered schizophrenia the societally oppressed reverse side of modern rationality, a notion which had a strong influence on the anti-psychiatric movement. Philosophical phenomenology primarily focused on ontological problems of psychotic existence. Finally, Philosophy of Mind, the modern Anglo-American version of analytical philosophy, analyzed the logical coherence of psychotic inferences and experiences. The insights of analytical philosophy in particular may be important for a more sophisticated interpretation of psychopathological research as well as of the new findings of neuroscience.
Uncovering the structure of (super)conformal field theories
NASA Astrophysics Data System (ADS)
Liendo, Pedro
Conformal field theories (CFTs) are of central importance in modern theoretical physics, with applications that range from condensed matter physics to particle theory phenomenology. In this Ph.D. thesis we study CFTs from two somewhat orthogonal (but complementary) points of view. In the first approach we concentrate our efforts on two specific examples: the Veneziano limit of N = 2 and N = 1 superconformal QCD. The addition of supersymmetry makes these theories amenable to analytical analysis. In particular, we use the correspondence between single-trace operators and states of a spin chain to study the integrability properties of each theory. Our results indicate that these theories are not completely integrable, but they do contain some subsectors in which integrability might hold. In the second approach, we consider the so-called "bootstrap program", which is the ambitious idea that the restrictions imposed by conformal symmetry (crossing symmetry in particular) are so powerful that, starting from a few basic assumptions, one should be able to fix the form of a theory. In this thesis we apply bootstrap techniques to CFTs in the presence of a boundary. We study two-point functions using analytical and numerical methods. One-loop results were re-obtained from crossing symmetry alone, and a variety of numerical bounds for the conformal dimensions of operators were derived. These bounds are quite general and valid for any CFT in the presence of a boundary, in contrast to our first approach, where a specific set of theories was studied. A natural continuation of this work is to apply bootstrap techniques to supersymmetric theories. Some preliminary results along these lines are presented.
Engineering Bioluminescent Proteins: Expanding their Analytical Potential
Rowe, Laura; Dikici, Emre; Daunert, Sylvia
2009-01-01
Synopsis Bioluminescence has been observed in nature since the dawn of time, but now, scientists are harnessing it for analytical applications. Laura Rowe, Emre Dikici, and Sylvia Daunert of the University of Kentucky describe the origins of bioluminescent proteins and explore their uses in the modern chemistry laboratory. The cover features spectra of bioluminescent light superimposed on an image of jellyfish, which are a common source of bioluminescent proteins. Images courtesy of Emre Dikici and Shutterstock. PMID:19725502
Rogstad, Sarah; Pang, Eric; Sommers, Cynthia; Hu, Meng; Jiang, Xiaohui; Keire, David A; Boyne, Michael T
2015-11-01
Glatiramer acetate (GA) is a mixture of synthetic copolymers consisting of four amino acids (glutamic acid, lysine, alanine, and tyrosine) with a labeled molecular weight range of 5000 to 9000 Da. GA is marketed as Copaxone™ by Teva for the treatment of multiple sclerosis. Here, the agency has evaluated the structure and composition of GA and a commercially available comparator, Copolymer-1. Modern analytical technologies which can characterize these complex mixtures are desirable for analysis of their comparability and structural "sameness." In the studies herein, a molecular fingerprinting approach is taken using accurate-mass mass spectrometry (MS) analysis, nuclear magnetic resonance (NMR) (1D-(1)H-NMR, 1D-(13)C-NMR, and 2D NMR), and asymmetric field flow fractionation (AFFF) coupled with multi-angle light scattering (MALS) for an in-depth characterization of three lots of the marketplace drug and a formulated sample of the comparator. Statistical analyses were applied to the MS and AFFF-MALS data to assess these methods' ability to detect analytical differences in the mixtures. The combination of multiple orthogonal measurements by liquid chromatography coupled with MS (LC-MS), AFFF-MALS, and NMR on the same sample set was found to be fit for the intended purpose of distinguishing analytical differences between these complex mixtures of peptide chains.
Use of Foodomics for Control of Food Processing and Assessing of Food Safety.
Josić, D; Peršurić, Ž; Rešetar, D; Martinović, T; Saftić, L; Kraljević Pavelić, S
The food chain, food safety, and food-processing sectors face new challenges due to the globalization of the food chain and changes in modern consumer preferences. In addition, gradually increasing microbial resistance, changes in climate, and human errors in food handling remain pending barriers to efficient global food safety management. Consequently, the development, validation, and implementation of rapid, sensitive, and accurate methods for the assessment of food safety, often termed foodomics methods, are required. Even so, the growing role of these high-throughput foodomic methods, based on genomic, transcriptomic, proteomic, and metabolomic techniques, has yet to be completely acknowledged by the regulatory agencies and bodies. The sensitivity and accuracy of these methods are superior to previously used standard analytical procedures, and the new methods are suitable for addressing a number of novel requirements posed by the food production sector and the global food market. © 2017 Elsevier Inc. All rights reserved.
Characterizing visible and invisible cell wall mutant phenotypes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carpita, Nicholas C.; McCann, Maureen C.
2015-04-06
About 10% of a plant's genome is devoted to generating the protein machinery to synthesize, remodel, and deconstruct the cell wall. High-throughput genome sequencing technologies have enabled a reasonably complete inventory of wall-related genes that can be assembled into families of common evolutionary origin. Assigning function to each gene family member has been aided immensely by identification of mutants with visible phenotypes or by chemical and spectroscopic analysis of mutants with 'invisible' phenotypes of modified cell wall composition and architecture that do not otherwise affect plant growth or development. This review connects the inference of gene function on the basis of deviation from the wild type in genetic functional analyses to insights provided by modern analytical techniques that have brought us ever closer to elucidating the sequence structures of the major polysaccharide components of the plant cell wall.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munasinghe, M.; Meier, P.
1988-01-01
Given the importance of energy in modern economies, the first part of the volume is devoted to examining some of the key conceptual and analytical tools available for energy-policy analysis and planning. Policy tools and institutional frameworks that will facilitate better energy management are also discussed. Energy-policy analysis is explained, while effective energy management techniques are discussed to achieve desirable national objectives, using a selected set of policies and policy instruments. In the second part of the volume, the actual application of the principles set out earlier is explained through a case study of Sri Lanka. The monograph integrates the many aspects of the short-term programs already begun with the options for the medium to long term, and ends with the outline of a long-term strategy for Sri Lanka.
Ernst, Madeleine; Silva, Denise Brentan; Silva, Ricardo Roberto; Vêncio, Ricardo Z N; Lopes, Norberto Peporine
2014-06-01
Covering: up to 2013. Plant metabolomics is a relatively recent research field that has gained increasing interest in the past few years. Up to the present day, numerous review articles and guide books on the subject have been published. This review article focuses on the current applications and limitations of modern mass spectrometry techniques, especially in combination with electrospray ionisation (ESI), the ionisation method most commonly applied in metabolomics studies. As a possible alternative to ESI, perspectives on matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS) in metabolomics studies are introduced, a method which is still not widespread in the field. In metabolomics studies the results must always be interpreted in the context of the applied sampling procedures as well as the data analysis. Different sampling strategies are introduced, and the importance of data analysis is illustrated by the example of metabolic network modelling.
Baharum, Zainal; Akim, Abdah Md; Hin, Taufiq Yap Yun; Hamid, Roslida Abdul; Kasran, Rosmin
2016-01-01
Plants have been a good source of therapeutic agents for thousands of years; an impressive number of modern drugs used for treating human diseases are derived from natural sources. The Theobroma cacao tree, or cocoa, has recently garnered increasing attention and become the subject of research due to its antioxidant properties, which are related to potential anti-cancer effects. In the past few years, identifying and developing active compounds or extracts from the cocoa bean that might exert anti-cancer effects have become an important area of health- and biomedicine-related research. This review provides an updated overview of T. cacao in terms of its potential anti-cancer compounds and their extraction, in vitro bioassay, purification, and identification. This article also discusses the advantages and disadvantages of the techniques described and considers future perspectives on analytical methods from the viewpoint of anti-cancer compound discovery. PMID:27019680
Synthesis and characterization of cellulose acetate from rice husk: eco-friendly condition.
Das, Archana M; Ali, Abdul A; Hazarika, Manash P
2014-11-04
Cellulose acetate was synthesized from rice husk by a simple, efficient, cost-effective and solvent-free method. Cellulose was isolated from rice husk (RH) using a standard pretreatment method with dilute alkaline and acid solutions and bleaching with 2% H2O2. Cellulose acetate (CA) was synthesized successfully with a yield of 66% in the presence of acetic anhydride and iodine as a catalyst under eco-friendly, solvent-free conditions. The reaction parameters were standardized at 80 °C for 300 min, and the optimum results were taken for further study. The extent of acetylation was evaluated from the % yield and the degree of substitution (DS), which was determined by (1)H NMR and titrimetrically. The synthesized products were characterized with the help of modern analytical techniques such as FT-IR, (1)H NMR, and XRD, and the thermal behavior was evaluated by TGA and DSC thermograms. Copyright © 2014 Elsevier Ltd. All rights reserved.
Application of modern autoradiography to nuclear forensic analysis.
Parsons-Davis, Tashi; Knight, Kim; Fitzgerald, Marc; Stone, Gary; Caldeira, Lee; Ramon, Christina; Kristo, Michael
2018-05-01
Modern autoradiography techniques based on phosphorimaging technology using image plates (IPs) and digital scanning can identify heterogeneities in activity distributions and reveal material properties, serving to inform subsequent analyses. Here, we have adopted these advantages for applications in nuclear forensics, the technical analysis of radioactive or nuclear materials found outside of legal control to provide data related to provenance, production history, and trafficking route for the materials. IP autoradiography is a relatively simple, non-destructive method for sample characterization that records an image reflecting the relative intensity of alpha and beta emissions from a two-dimensional surface. Such data are complementary to information gathered from radiochemical characterization via bulk counting techniques, and can guide the application of other spatially resolved techniques such as scanning electron microscopy (SEM) and secondary ion mass spectrometry (SIMS). IP autoradiography can image large two-dimensional areas (up to 20 × 40 cm), with relatively low detection limits for actinides and other radioactive nuclides, and sensitivity to a wide dynamic range (10^5) of activity density in a single image. Distributions of radioactivity in nuclear materials can be generated with a spatial resolution of approximately 50 μm using IP autoradiography and digital scanning. While the finest grain silver halide films still provide the best possible resolution (down to ~10 μm), IP autoradiography has distinct practical advantages such as shorter exposure times, no chemical post-processing, reusability, rapid plate scanning, and automated image digitization. Sample preparation requirements are minimal, and the analytical method does not consume or alter the sample. These advantages make IP autoradiography ideal for routine screening of nuclear materials, and for the identification of areas of interest for subsequent micro-characterization methods. In this paper we present a summary of our setup, as modified for nuclear forensic sample analysis and related research, and provide examples of data from select samples from the nuclear fuel cycle and historical nuclear test debris. Copyright © 2018 Elsevier B.V. All rights reserved.
Quantifying risks with exact analytical solutions of derivative pricing distribution
NASA Astrophysics Data System (ADS)
Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin
2017-04-01
Derivative (i.e. option) pricing is essential for modern financial instruments. Despite previous efforts, exact analytical forms of derivative pricing distributions remain challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain exact analytical solutions for the statistical distributions of bond and bond-option prices under the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option price, characterized by the distribution tail, and their association with value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
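For context, the Vasicek short-rate model dr = a(b - r)dt + σ dW already admits a closed-form zero-coupon bond price, one of the exact results the paper's path-integral framework builds on. A hedged sketch of that textbook formula; the parameter values are arbitrary illustrations, not the paper's.

```python
# Closed-form Vasicek zero-coupon bond price P = A * exp(-B * r0),
# with B(tau) = (1 - exp(-a*tau))/a and the standard expression for ln A.
import math

def vasicek_bond_price(r0, a, b, sigma, tau):
    """Price of a zero-coupon bond maturing in tau years, unit face value."""
    B = (1.0 - math.exp(-a * tau)) / a
    lnA = (B - tau) * (a**2 * b - sigma**2 / 2.0) / a**2 - sigma**2 * B**2 / (4.0 * a)
    return math.exp(lnA - B * r0)

print(f"P(0, 5y) = {vasicek_bond_price(r0=0.03, a=0.5, b=0.04, sigma=0.01, tau=5.0):.4f}")
```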
ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
ERIC Educational Resources Information Center
Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.
2014-01-01
Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…
NASA Astrophysics Data System (ADS)
Petrova, N.; Zagidullin, A.; Nefedyev, Y.; Kosulin, V.; Andreev, A.
2017-11-01
Observing the physical librations of celestial bodies, including the Moon, is one of the astronomical methods of remotely assessing the internal structure of a celestial body without conducting expensive space experiments. The paper contains a review of recent advances in studying the Moon's structure using various methods of obtaining and applying lunar physical libration (LPhL) data. In this article, LPhL simulation methods for assessing the viscoelastic and dissipative properties of the lunar body and the parameters of the lunar core, whose existence has recently been confirmed by reprocessing of the seismic data from the "Apollo" space missions, are described. Much attention is paid to the physical interpretation of the free libration phenomenon and the methods for its determination. The practical application of the most accurate analytical LPhL tables (Rambaux and Williams, 2011) is discussed. The tables were built on the basis of complex analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421. An efficiency analysis of two approaches to LPhL theory is conducted: the numerical and the analytical. It is shown that in lunar investigation the two approaches complement each other in various respects: the numerical approach provides the high accuracy of the theory required for the proper processing of modern observations, while the analytical approach allows one to comprehend the essence of the phenomena in the lunar rotation and to predict and interpret new effects in observations of the lunar body and lunar core parameters.
Fogarty, Laurel; Wakano, Joe Yuichiro; Feldman, Marcus W; Aoki, Kenichi
2017-03-01
The forces driving cultural accumulation in human populations, both modern and ancient, are hotly debated. Did genetic, demographic, or cognitive features of behaviorally modern humans (as opposed to, say, early modern humans or Neanderthals) allow culture to accumulate to its current, unprecedented levels of complexity? Theoretical explanations for patterns of accumulation often invoke demographic factors such as population size or density, whereas statistical analyses of variation in cultural complexity often point to the importance of environmental factors such as food stability, in determining cultural complexity. Here we use both an analytical model and an agent-based simulation model to show that a full understanding of the emergence of behavioral modernity, and the cultural evolution that has followed, depends on understanding and untangling the complex relationships among culture, genetically determined cognitive ability, and demographic history. For example, we show that a small but growing population could have a different number of cultural traits from a shrinking population with the same absolute number of individuals in some circumstances.
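An illustrative toy version of the kind of agent-based model described above: each generation, a population of N agents innovates new traits and loses traits through imperfect transmission, so the equilibrium trait count tracks demography. All parameters and the innovation/loss structure are invented for illustration, not the authors' model.

```python
# Toy cultural-accumulation dynamics: trait count rises with population-level
# innovation and decays through transmission failure.
import numpy as np

rng = np.random.default_rng(0)

def simulate(n_agents, innovate_p=0.01, loss_p=0.05, generations=500):
    traits = 10.0                                    # initial trait count
    for _ in range(generations):
        gained = rng.binomial(n_agents, innovate_p)  # population-level innovations
        lost = traits * loss_p                       # traits lost to imperfect transmission
        traits = max(traits + gained - lost, 0.0)
    return traits

for n in (50, 200, 1000):
    print(f"N={n:5d}: ~{simulate(n):.0f} traits at equilibrium")
```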
Big (Bio)Chemical Data Mining Using Chemometric Methods: A Need for Chemists.
Tauler, Roma; Parastar, Hadi
2018-03-23
This review aims to demonstrate the ability of multivariate chemometric methods to analyze Big (Bio)Chemical Data (BBCD) and to show some of the more important challenges of modern analytical research. The capabilities and versatility of chemometric methods are discussed in light of the BBCD challenges encountered in chromatographic, spectroscopic, and hyperspectral imaging measurements, with an emphasis on their application to the omics sciences. In addition, insights and perspectives on how to address the analysis of BBCD are provided, along with a discussion of the procedures necessary to obtain more reliable qualitative and quantitative results. The importance of Big Data and its relevance to (bio)chemistry are first discussed. Then, analytical tools which can produce BBCD are presented, as well as some basics needed to understand the prospects and limitations of chemometric techniques when applied to BBCD. Finally, the significance of combining chemometric approaches with BBCD analysis in different chemical disciplines is highlighted with some examples. We have tried to cover some of the applications of big data analysis in the (bio)chemistry field; however, this coverage is not exhaustive. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
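A minimal sketch of one chemometric workflow of the kind mentioned above: reduce a large spectral data matrix with PCA before inspection or modeling. The spectra here are synthetic stand-ins for real hyperspectral data; PCA is one of many multivariate methods the review covers.

```python
# Build a synthetic two-component mixture data matrix, then inspect how
# much variance a handful of principal components captures.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 800, 300)
pure1 = np.exp(-((wavelengths - 520) / 30) ** 2)   # component spectrum 1
pure2 = np.exp(-((wavelengths - 660) / 40) ** 2)   # component spectrum 2
conc = rng.random((200, 2))                        # 200 mixture samples
X = conc @ np.vstack([pure1, pure2]) + rng.normal(0, 0.01, (200, 300))

pca = PCA(n_components=5).fit(X)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))
```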
Analytical electron microscopy of biogenic and inorganic carbonates
NASA Technical Reports Server (NTRS)
Blake, David F.
1989-01-01
In the terrestrial sedimentary environment, the mineralogically predominant carbonates are calcite-type minerals (rhombohedral carbonates) and aragonite-type minerals (orthorhombic carbonates). The most common minerals precipitating either inorganically or biogenically are high magnesium calcite and aragonite. High magnesium calcite (with magnesium carbonate substituting for more than 7 mole percent of the calcium carbonate) is stable only at temperatures greater than 700 °C or thereabouts, and aragonite is stable only at pressures exceeding several kilobars of confining pressure. Therefore, these carbonates are expected to undergo chemical stabilization in the diagenetic environment to ultimately form stable calcite and dolomite. Because of the strong organic control of carbonate deposition in organisms during biomineralization, the microchemistry and microstructure of invertebrate skeletal material is much different from that present in inorganic carbonate cements. The style of preservation of microstructural features in skeletal material is therefore often quite distinctive when compared to that of inorganic carbonate, even though wholesale recrystallization of the sediment has taken place. Microstructural and microchemical comparisons are made between high magnesium calcite echinoderm skeletal material and modern inorganic high magnesium calcite cements, using analytical electron microscopy and related techniques. Similar comparisons are made between analogous materials which have undergone stabilization in the diagenetic environment. Similar analysis schemes may prove useful in distinguishing between biogenic and inorganic carbonates in returned Martian carbonate samples.
Accounting for differences in the bioactivity and bioavailability of vitamers
Gregory, Jesse F.
2012-01-01
Essentially all vitamins exist with multiple nutritionally active chemical species often called vitamers. Our quantitative understanding of the bioactivity and bioavailability of the various members of each vitamin family has increased markedly, but many issues remain to be resolved concerning the reporting and use of analytical data. Modern methods of vitamin analysis rely heavily on chromatographic techniques that generally allow the measurement of the individual chemical forms of vitamins. Typical applications of food analysis include the evaluation of shelf life and storage stability, monitoring of nutrient retention during food processing, developing food composition databases and data needed for food labeling, assessing dietary adequacy and evaluating epidemiological relationships between diet and disease. Although the usage of analytical data varies depending on the situation, important issues regarding how best to present and interpret the data in light of the presence of multiple vitamers are common to all aspects of food analysis. In this review, we will evaluate the existence of vitamers that exhibit differences in bioactivity or bioavailability, consider when there is a need to address differences in bioactivity or bioavailability of vitamers, and then consider alternative approaches and possible ways to improve the reporting of data. Major examples are taken from literature and experience with vitamin B6 and folate. PMID:22489223
Nonaqueous capillary electrophoresis with indirect electrochemical detection.
Matysik, Frank-Michael; Marggraf, Daniela; Gläser, Petra; Broekaert, José A C
2002-11-01
Nonaqueous capillary electrophoresis (NACE), which makes use of organic solvents in place of conventional aqueous electrophoresis buffers, is gaining increasing importance among modern separation techniques. Recently, it has been shown that amperometric detection in conjunction with acetonitrile-based NACE offers an extended accessible potential range and enhanced long-term stability of the amperometric responses generated at solid electrodes. The present contribution takes advantage of the latter aspect to develop reliable systems for NACE with indirect electrochemical detection (IED). In this context, several compounds, such as (ferrocenylmethyl)trimethylammonium perchlorate, tris(1,10-phenanthroline)cobalt(III) perchlorate and bis(1,4,7-triazacyclononane)nickel(II) perchlorate, were studied regarding their suitability to act as electroactive buffer additives for IED in NACE. The performance characteristics of the respective buffer systems were evaluated. Tetraalkylammonium perchlorates served as model compounds for the optimization of the NACE-IED system. The target analytes choline and acetylcholine could easily be separated and determined by means of NACE-IED. In the case of a buffer system containing 10^-4 M tris(1,10-phenanthroline)cobalt(III) perchlorate, the limits of detection were 2.5 × 10^-7 M and 4.6 × 10^-7 M for choline and acetylcholine, respectively. With the elaborated analytical procedure, choline could be determined in pharmaceutical preparations.
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2013-12-20
Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.
One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.
Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz
2009-07-15
The existing solid-phase microextraction (SPME) kinetic calibration technique, which uses the desorption of preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which limits the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic the variability of the environment, such as in temperature, turbulence, and the concentration of the analytes, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of the passive samplings in the flow-through system illustrated that the effect of the environmental variables was successfully compensated for with the kinetic calibration technique, and all extracted analytes could be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and will also be useful for other microextraction techniques.
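The kinetic-calibration arithmetic rests on the assumed symmetry of absorption and desorption, often written n/n_e + Q/q0 = 1 in the SPME literature, where q0 is the preloaded standard, Q what remains after sampling, n the analyte amount extracted, and n_e its equilibrium amount. A hedged sketch of that bookkeeping; the fiber constant and all numbers are invented, and the exact notation may differ from this paper's.

```python
# Back out an analyte concentration from one calibrant's desorption:
# the desorbed fraction of the standard calibrates how far the analyte
# extraction has progressed toward equilibrium.

def analyte_concentration(n_extracted_ng, q_remaining_ng, q0_ng, kfs_vf_ml):
    desorbed_fraction = 1.0 - q_remaining_ng / q0_ng   # fraction of standard lost
    n_equilibrium = n_extracted_ng / desorbed_fraction # inferred equilibrium amount
    return n_equilibrium / kfs_vf_ml                   # C = n_e / (K_fs * V_f), ng/mL

print(f"C = {analyte_concentration(1.8, 6.5, 10.0, 2.4):.2f} ng/mL")
```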
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target-analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS formats (on- and off-line), sorbents, experiments and protocols, factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
Post-analytical Issues in Hemostasis and Thrombosis Testing.
Favaloro, Emmanuel J; Lippi, Giuseppe
2017-01-01
Analytical concerns within hemostasis and thrombosis testing are continuously decreasing. This is essentially attributable to modern instrumentation, improvements in test performance and reliability, as well as the application of appropriate internal quality control and external quality assurance measures. Pre-analytical issues are also being dealt with in some newer instrumentation, which is able to detect hemolysis, icterus and lipemia, and, in some cases, other issues related to sample collection such as tube under-filling. Post-analytical issues are generally related to appropriate reporting and interpretation of test results, and these are the focus of the current overview, which provides a brief description of these events, as well as guidance for their prevention or minimization. In particular, we propose several strategies for improved post-analytical reporting of hemostasis assays and advise that this may provide the final opportunity to prevent serious clinical errors in diagnosis.
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, with both successful and unsuccessful validations of analytical techniques highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography-mass spectrometry (GC-MS) method. Both the carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations among reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
Experimental and analytical determination of stability parameters for a balloon tethered in a wind
NASA Technical Reports Server (NTRS)
Redd, L. T.; Bennett, R. M.; Bland, S. R.
1973-01-01
Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.
Perspectives on making big data analytics work for oncology.
El Naqa, Issam
2016-12-01
Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5Vs hallmarks of big data. These data comprise a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization, and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic documents) sources that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables) ≫ n (samples) inference problem of statistical learning is challenged in the big data realm, and this is particularly true for oncology applications, where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, the Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we present these effects as they pertain to oncology and engage "small thinking" methodologies to counter them, ranging from incorporating prior knowledge and information-theoretic techniques to modern ensemble machine learning approaches, or combinations of these. We particularly discuss the pros and cons of different approaches to improve mining of big data in oncology. Copyright © 2016 Elsevier Inc. All rights reserved.
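The p ≫ n inference problem mentioned above can be made concrete with a small sketch: with far more features than samples, ordinary least squares is ill-posed, while a penalized estimator remains usable. The following Python example uses scikit-learn's LassoCV on synthetic data; all numbers are illustrative and unrelated to any oncology dataset.

```python
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)

# Synthetic p >> n setting: 60 "patients", 2000 omics features,
# of which only 5 truly carry signal.
n, p = 60, 2000
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [2.0, -1.5, 1.0, 0.8, -0.6]
y = X @ beta + rng.standard_normal(n)

# Ordinary least squares has no unique solution here; an L1-penalized
# fit (lasso) with cross-validated regularization remains estimable.
model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_)
print(f"features retained: {selected.size} of {p}; first few: {selected[:10]}")
```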
Analytical Chemistry: A Literary Approach.
ERIC Educational Resources Information Center
Lucy, Charles A.
2000-01-01
Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)
NASA Astrophysics Data System (ADS)
Taneja, Jayant Kumar
Electricity is an indispensable commodity to modern society, yet it is delivered via a grid architecture that remains largely unchanged over the past century. A host of factors are conspiring to topple this dated yet venerated design: developments in renewable electricity generation technology, policies to reduce greenhouse gas emissions, and advances in information technology for managing energy systems. Modern electric grids are emerging as complex distributed systems in which a portfolio of power generation resources, often incorporating fluctuating renewable resources such as wind and solar, must be managed dynamically to meet uncontrolled, time-varying demand. Uncertainty in both supply and demand makes control of modern electric grids fundamentally more challenging, and growing portfolios of renewables exacerbate the challenge. We study three electricity grids: the state of California, the province of Ontario, and the country of Germany. To understand the effects of increasing renewables, we develop a methodology to scale renewables penetration. Analyzing these grids yields key insights about rigid limits to renewables penetration and their implications in meeting long-term emissions targets. We argue that to achieve deep penetration of renewables, the operational model of the grid must be inverted, changing the paradigm from load-following supplies to supply-following loads. To alleviate the challenge of supply-demand matching on deeply renewable grids, we first examine well-known techniques, including altering management of existing supply resources, employing utility-scale energy storage, targeting energy efficiency improvements, and exercising basic demand-side management. Then, we create several instantiations of supply-following loads -- including refrigerators, heating and cooling systems, and laptop computers -- by employing a combination of sensor networks, advanced control techniques, and enhanced energy storage. We examine the capacity of each load for supply-following and study the behaviors of populations of these loads, assessing their potential at various levels of deployment throughout the California electricity grid. Using combinations of supply-following strategies, we can reduce peak natural gas generation by 19% on a model of the California grid with 60% renewables. We then assess remaining variability on this deeply renewable grid incorporating supply-following loads, characterizing additional capabilities needed to ensure supply-demand matching in future sustainable electricity grids.
Climate Analytics as a Service
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Q.; McInerney, Mark A.; Webster, W. Phillip; Lee, Tsengdar J.
2014-01-01
Climate science is a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). CAaaS combines high-performance computing and data-proximal analytics with scalable data management, cloud computing virtualization, the notion of adaptive analytics, and a domain-harmonized API to improve the accessibility and usability of large collections of climate data. MERRA Analytic Services (MERRA/AS) provides an example of CAaaS. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of key climate variables. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, CAaaS is providing the agility required to meet our customers' increasing and changing data management and data analysis needs.
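The MapReduce style of analytics described above can be illustrated with a minimal sketch: a mapper emits a partial (sum, count) pair per data block, and an associative reducer combines them into a global mean. This is a generic toy in Python, not MERRA/AS code, and the temperature values are synthetic.

```python
from functools import reduce
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for data blocks served by a reanalysis archive: each block
# is one "file" of gridded temperatures in kelvin (values synthetic).
blocks = [rng.normal(loc=288.0, scale=5.0, size=(10, 20)) for _ in range(12)]

def mapper(block):
    # Emit a partial (sum, count) pair for one block.
    return block.sum(), block.size

def reducer(acc, partial):
    # Combine partial results; associative, so order does not matter.
    return acc[0] + partial[0], acc[1] + partial[1]

total, count = reduce(reducer, map(mapper, blocks), (0.0, 0))
print(f"global mean temperature: {total / count:.2f} K")
```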
Hess, Nancy J.; Pasa-Tolic, Ljiljana; Bailey, Vanessa L.; ...
2017-04-12
Understanding the role played by microorganisms within soil systems is challenged by the unique intersection of physics, chemistry, mineralogy, and biology in fostering habitat for soil microbial communities. Addressing these challenges will require observations across multiple spatial and temporal scales to capture the dynamics and emergent behavior of complex and interdependent processes. The heterogeneity and complexity of the rhizosphere require advanced techniques that press the simultaneous frontiers of spatial resolution, analyte sensitivity and specificity, reproducibility, large dynamic range, and high throughput. Fortunately, many exciting technical advancements are now available to inform and guide the development of new hypotheses. The aim of this Special Issue is to provide a holistic view of the rhizosphere from the perspective of modern molecular biology methodologies that enable a highly focused, detailed view of the processes in the rhizosphere, including the numerous, strong, and complex interactions between plant roots, soil constituents, and microorganisms. We discuss current rhizosphere research challenges and knowledge gaps, as well as perspectives and approaches using newly available state-of-the-art toolboxes. These new approaches and methodologies allow the study of rhizosphere processes and properties, and of the rhizosphere as a central component of ecosystems and biogeochemical cycles.
LaHaye, Nicole L.; Kurian, Jose; Diwakar, Prasoon K.; ...
2015-08-19
An accurate and routinely available method for stoichiometric analysis of thin films is a desideratum of modern materials science, where a material's properties depend sensitively on elemental composition. We thoroughly investigated femtosecond laser ablation-inductively coupled plasma-mass spectrometry (fs-LA-ICP-MS) as an analytical technique for determination of the stoichiometry of thin films down to the nanometer scale. The use of femtosecond laser ablation allows for precise removal of material with high spatial and depth resolution that can be coupled to an ICP-MS to obtain elemental and isotopic information. We used molecular beam epitaxy-grown thin films of LaPdxSb2 and T′-La2CuO4 to demonstrate the capacity of fs-LA-ICP-MS for stoichiometric analysis and the spatial and depth resolution of the technique. Here we demonstrate that the stoichiometric information of thin films with a thickness of ~10 nm or lower can be determined. Furthermore, our results indicate that fs-LA-ICP-MS provides precise information on the thin film-substrate interface and is able to detect the interdiffusion of cations.
Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.
Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn
2017-01-01
The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial, in which locally advanced breast cancer patients were randomised to either taxane- or anthracycline-based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. The analysis shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker for evaluating the effect of taxane- versus anthracycline-based chemotherapies on progression-free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply, and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
Arroyo, Adrian; Matsuzawa, Tetsuro; de la Torre, Ignacio
2015-01-01
Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification made it possible to create maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested against GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing active and passive pounders in lithic assemblages to be discriminated. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record. PMID:25793642
3-D Printing as a Tool to Investigate the Effects of Changes in Rock Microstructures on Permeability
NASA Astrophysics Data System (ADS)
Head, D. A.; Vanorio, T.
2016-12-01
Rocks are naturally heterogeneous; two rock samples with identical bulk properties can vary widely in microstructure. Understanding the evolutionary trends of rock properties requires the ability to connect time-lapse measurements of properties at different scales: the macro-scale used in laboratory and field analyses, which captures bulk changes, and the micro-scale used in imaging and digital techniques, which captures changes to the pore space. However, measuring those properties at different scales is very challenging, and sometimes impossible. The advent of modern 3D printing has provided an unprecedented opportunity to link those scales by combining the strengths of digital and experimental rock physics. To determine the feasibility of this technique, we characterized the resolution capabilities of two different 3D printers. To calibrate our digital models against our printed models, we created a sample with an analytically solvable permeability. This allowed us to directly compare analytic calculation, numerical simulation, and laboratory measurement of permeability on the exact same sample. Next, we took a CT-scanned model of a natural carbonate pore space, then iteratively digitally manipulated it, 3D printed it, and measured the flow properties in the laboratory. This approach allowed us to access multiple scales digitally and experimentally, to test hypotheses about how changes in rock microstructure due to compaction and dissolution affect bulk transport properties, and to connect laboratory measurements of porosity and permeability to quantities that are traditionally impossible to measure in the laboratory, such as changes in surface area and tortuosity. As 3D printing technology continues to advance, we expect this technique to contribute to our ability to characterize the properties of remote and/or delicate samples, as well as to test the impact of microstructural alteration on bulk physical properties in the lab in a highly consistent, repeatable manner.
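One classical example of a sample with an analytically solvable permeability, of the kind mentioned above, is a bundle of straight parallel capillaries, for which Hagen-Poiseuille flow gives k = φr²/8. The sketch below evaluates this formula; the radius and porosity are assumptions chosen for illustration, not the study's actual printed design.

```python
def tube_bundle_permeability(radius_m, porosity):
    """Analytic Darcy permeability of a bundle of straight parallel
    capillaries: from Hagen-Poiseuille flow, k = porosity * r**2 / 8."""
    return porosity * radius_m**2 / 8.0

# Illustrative printable design: 0.5 mm tubes occupying 20% of the
# sample cross-section (assumed values, not from the study).
r = 0.5e-3      # tube radius in meters
phi = 0.20      # porosity (tube area fraction)
k = tube_bundle_permeability(r, phi)
print(f"k = {k:.3e} m^2 = {k / 9.869233e-13:.0f} darcy")
```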
Pérez-Parada, Andrés; Gómez-Ramos, María del Mar; Martínez Bueno, María Jesús; Uclés, Samanta; Uclés, Ana; Fernández-Alba, Amadeo R
2012-02-01
Instrumental capabilities and software tools of modern hybrid mass spectrometry (MS) instruments such as high-resolution mass spectrometry (HRMS), quadrupole time-of-flight (QTOF), and quadrupole linear ion trap (QLIT) were experimentally investigated for the study of emerging contaminants in Henares River water samples. Automated screening and confirmatory capabilities of QTOF working in full-scan MS and tandem MS (MS/MS) were explored for real samples. The influence of sensitivity and resolving power on mass accuracy was studied for the correct assignment of the amoxicillin transformation product 5(R) amoxicillin-diketopiperazine-2',5' as an example of a nontarget compound. In addition, quantitative and qualitative strategies based on direct injection analysis and off-line solid-phase extraction sample treatment were compared using two different QLIT instruments for a selected group of emerging contaminants operating in selected reaction monitoring (SRM) and information-dependent acquisition (IDA) modes. Software-aided screening usually needs a further confirmatory step. The resolving power and MS/MS features of QTOF were able to confirm or reject most findings in river water, although sensitivity-related limitations were common. The superior sensitivity of modern QLIT-MS/MS offered the possibility of direct injection analysis for proper quantitative study of a variety of contaminants, while simultaneously reducing the matrix effect and increasing the reliability of the results. Confirmation of ethylamphetamine, which lacks a second SRM transition, was accomplished by using the IDA feature. Hybrid MS instruments combining high resolution and high sensitivity help enlarge the scope of targeted analytes in river waters. However, in the tested instruments there remains a margin for improvement, principally in the required sensitivity and in data treatment software tools devoted to reliable confirmation and improved automated data processing.
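Mass accuracy of the kind discussed above is conventionally reported as a relative error in parts per million. A minimal Python sketch, with hypothetical m/z values rather than ones from the study:

```python
def ppm_error(measured_mz, theoretical_mz):
    """Mass accuracy expressed in parts per million (ppm)."""
    return 1e6 * (measured_mz - theoretical_mz) / theoretical_mz

# Illustrative values only: a hypothetical [M+H]+ ion observed at
# m/z 366.1085 against a theoretical m/z of 366.1088.
print(f"{ppm_error(366.1085, 366.1088):+.2f} ppm")
```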
Curiosity: organic molecules on Mars? (Italian Title: Curiosity: molecole organiche su Marte?)
NASA Astrophysics Data System (ADS)
Guaita, C.
2015-05-01
First analytical results from the SAM instrument onboard Curiosity are consistent with the presence on Mars of organic molecules possibly linked to bacterial metabolism. These data also call for a modern re-examination of the debated results obtained by the Viking landers.
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1987-01-01
Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.
Andrei Andreevich Bolibrukh's works on the analytic theory of differential equations
NASA Astrophysics Data System (ADS)
Anosov, Dmitry V.; Leksin, Vladimir P.
2011-02-01
This paper contains an account of A.A. Bolibrukh's results obtained in the new directions of research that arose in the analytic theory of differential equations as a consequence of his sensational counterexample to the Riemann-Hilbert problem. A survey of results of his students in developing topics first considered by Bolibrukh is also presented. The main focus is on the role of the reducibility/irreducibility of systems of linear differential equations and their monodromy representations. A brief synopsis of results on the multidimensional Riemann-Hilbert problem and on isomonodromic deformations of Fuchsian systems is presented, and the main methods in the modern analytic theory of differential equations are sketched. Bibliography: 69 titles.
Application of Classical and Lie Transform Methods to Zonal Perturbation in the Artificial Satellite
NASA Astrophysics Data System (ADS)
San-Juan, J. F.; San-Martin, M.; Perez, I.; Lopez-Ochoa, L. M.
2013-08-01
A scalable second-order analytical orbit propagator program is under development. This analytical orbit propagator combines modern perturbation methods, based on the canonical frame of the Lie transform, with classical perturbation methods, chosen as a function of orbit type or of the requirements of a space mission, such as catalog maintenance operations, long-period evolution, and so on. As a first step in the validation of part of our orbit propagator, in this work we consider only the perturbation produced by zonal harmonic coefficients in the Earth's gravity potential, so that it is possible to analyze the behaviour of the perturbation methods involved in the corresponding analytical theories.
ERIC Educational Resources Information Center
Papadopoulos, Ioannis
2010-01-01
The issue of the area of irregular shapes is absent from the modern mathematical textbooks in elementary education in Greece. However, there exists a collection of books written for educational purposes by famous Greek scholars dating from the eighteenth century, which propose certain techniques concerning the estimation of the area of such…
Frontiers in Relativistic Celestial Mechanics, Vol. 2, Applications and Experiments
NASA Astrophysics Data System (ADS)
Kopeikin, Sergei
2014-08-01
Relativistic celestial mechanics - investigating the motion of celestial bodies under the influence of general relativity - is a major tool of modern experimental gravitational physics. With a wide range of prominent authors from the field, this two-volume series consists of reviews on a multitude of advanced topics in the area of relativistic celestial mechanics - starting from more classical topics such as the regime of asymptotically-flat spacetime, light propagation and celestial ephemerides, but also including its role in cosmology and alternative theories of gravity as well as modern experiments in this area. This second volume of the two-volume series covers applications of the theory as well as experimental verifications. From tools to determine light travel times in curved space-time to laser ranging between Earth and Moon and between satellites, and impacts on the definition of time scales and clock comparison techniques, a variety of effects is discussed. On the occasion of his 80th birthday, these two volumes honor V. A. Brumberg - one of the pioneers of modern relativistic celestial mechanics. Contributions include: J. Simon, A. Fienga: Victor Brumberg and the French school of analytical celestial mechanics; T. Fukushima: Elliptic functions and elliptic integrals for celestial mechanics and dynamical astronomy; P. Teyssandier: New tools for determining the light travel time in static, spherically symmetric spacetimes beyond the order G2; J. Müller, L. Biskupek, F. Hofmann and E. Mai: Lunar laser ranging and relativity; N. Wex: Testing relativistic celestial mechanics with radio pulsars; I. Ciufolini et al.: Dragging of inertial frames, fundamental physics, and satellite laser ranging; G. Petit, P. Wolf, P. Delva: Atomic time, clocks, and clock comparisons in relativistic spacetime: a review
Analytical Applications of Monte Carlo Techniques.
ERIC Educational Resources Information Center
Guell, Oscar A.; Holcombe, James A.
1990-01-01
Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
Thermoelectrically cooled water trap
Micheels, Ronald H [Concord, MA
2006-02-21
A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples, prior to analysis of these samples for chemical composition, by a variety of analytical techniques where water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in a portable gas monitoring instrumentation.
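A back-of-the-envelope way to see why cooling removes most of the water, consistent with the trap described above: for saturated inlet air, the vapor that can leave the trap is bounded by the saturation pressure at the cold surface. The Python sketch below uses a standard Magnus-type approximation for saturation vapor pressure; the temperatures are assumptions for illustration, not values from the patent.

```python
import math

def saturation_vapor_pressure_hpa(t_celsius):
    # Magnus-type approximation over liquid water (WMO coefficients).
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

# Rough upper bound on water removal for saturated inlet air: residual
# vapor cannot exceed the saturation pressure at the trap temperature.
t_inlet, t_trap = 25.0, 2.0   # assumed inlet and cold-surface temps, deg C
removed = 1.0 - (saturation_vapor_pressure_hpa(t_trap)
                 / saturation_vapor_pressure_hpa(t_inlet))
print(f"approximate fraction of water removed: {removed:.0%}")
```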
Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.
Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs
2018-01-01
While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
Accuracy of selected techniques for estimating ice-affected streamflow
Walker, John F.
1991-01-01
This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques were used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
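The performance measures and the Kruskal-Wallis test used in the comparison above are straightforward to reproduce. The following Python sketch, with entirely synthetic discharge records standing in for the Iowa data, computes the mean and standard deviation of daily errors per hydrographer and the Kruskal-Wallis statistic via SciPy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic daily discharge (m^3/s): a "baseline" record and estimates
# compiled independently by three hydrographers (values illustrative).
baseline = rng.gamma(shape=4.0, scale=2.0, size=90)
estimates = {f"hydrographer_{i + 1}": baseline + rng.normal(0, 0.5 + 0.2 * i, size=90)
             for i in range(3)}

errors = {h: est - baseline for h, est in estimates.items()}
for h, e in errors.items():
    # Two of the paper's performance measures: mean and standard
    # deviation of the daily errors.
    print(f"{h}: mean error = {e.mean():+.2f}, std = {e.std(ddof=1):.2f}")

# Kruskal-Wallis one-way analysis of variance on the daily errors,
# testing for differences among the hydrographers.
H, p = stats.kruskal(*errors.values())
print(f"Kruskal-Wallis H = {H:.2f}, p = {p:.3f}")
```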
Nicolodelli, Gustavo; Senesi, Giorgio Saverio; de Oliveira Perazzoli, Ivan Luiz; Marangoni, Bruno Spolon; De Melo Benites, Vinícius; Milori, Débora Marcondes Bastos Pereira
2016-09-15
Organic fertilizers are obtained from waste of plant or animal origin. One of the advantages of organic fertilizers is that, through composting, they recycle organic waste of urban and agricultural origin whose disposal would otherwise cause environmental impacts. Fast and accurate analysis of both major and minor/trace elements contained in the new generation of organic mineral and inorganic fertilizers has promoted the application of modern analytical techniques. In particular, laser induced breakdown spectroscopy (LIBS) is proving to be a very promising, quick, and practical technique to detect and measure contaminants and nutrients in fertilizers. Although this technique has some limitations, such as low sensitivity compared to other spectroscopic techniques, the use of double pulse (DP) LIBS is an alternative to conventional single pulse (SP) LIBS. Macronutrients (Ca, Mg, K, P), micronutrients (Cu, Fe, Na, Mn, Zn), and a contaminant (Cr) in fertilizers were evaluated using LIBS in both SP and DP configurations. A comparative study of both configurations was performed using key parameters optimized for improving LIBS performance. The limit of detection (LOD) values obtained by DP LIBS improved by up to a factor of seven compared to SP LIBS. In general, the marked improvement obtained when using the DP system in simultaneous quantitative LIBS determination for fertilizer analysis can be ascribed to the larger ablated mass of the sample. The results presented in this study show the promising potential of the DP LIBS technique for qualitative analysis of fertilizers, without requiring sample preparation with chemical reagents. Copyright © 2016 Elsevier B.V. All rights reserved.
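Limits of detection such as those compared above are commonly computed from a linear calibration as LOD = 3σ_blank/slope. A minimal Python sketch with invented intensities and concentrations:

```python
import numpy as np

# Hypothetical calibration: emission-line intensity (a.u.) versus
# element concentration (mg/kg); all numbers are illustrative only.
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
signal = np.array([12.0, 260.0, 515.0, 1010.0, 2020.0])

slope, intercept = np.polyfit(conc, signal, 1)

# Standard deviation of repeated blank measurements.
blank = np.array([11.0, 13.5, 12.2, 10.8, 12.9, 11.6])
sigma_blank = blank.std(ddof=1)

# Common 3-sigma definition of the limit of detection.
lod = 3.0 * sigma_blank / slope
print(f"slope = {slope:.3f} a.u. per mg/kg, LOD = {lod:.2f} mg/kg")
```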
New software solutions for analytical spectroscopists
NASA Astrophysics Data System (ADS)
Davies, Antony N.
1999-05-01
Analytical spectroscopists must be computer literate to carry out the tasks assigned to them effectively. This has often been resisted within organizations: insufficient funds to equip staff properly, a lack of desire to deliver the essential training, and a basic resistance amongst staff to learning the new techniques required for computer-assisted analysis. In the past these problems were compounded by seriously flawed software being sold for spectroscopic applications. Owing to the limited market for such complex products, the analytical spectroscopist was often faced with buying incomplete and unstable tools if the price was to remain reasonable. Long product lead times meant spectrometer manufacturers often ended up offering systems running under outdated and sometimes obscure operating systems. Not only did this mean special staff training for each instrument, with knowledge gained on one system not transferable to a neighbouring system, but these spectrometers were often capable of running only in stand-alone mode, cut off from the rest of the laboratory environment. Fortunately, a number of developments in recent years have substantially changed this depressing picture. A true multi-tasking operating system with a simple graphical user interface, Microsoft Windows NT4, has been widely introduced into the spectroscopic computing environment, providing a desktop operating system that has proved more stable and robust and that demands better programming techniques of software vendors. The opening up of the Internet has provided an easy way to access new tools for data handling and has forced a substantial re-think about results delivery (for example, Chemical MIME types and IUPAC spectroscopic data exchange standards). Improved computing power and cheaper hardware now allow large spectroscopic data sets to be handled without too many problems, including the ability to carry out chemometric operations in minutes rather than hours. Fast networks now enable data analysis of even multi-dimensional spectroscopic data sets remote from the measuring instrument. A strong tendency towards a more unified and substantially friendlier graphical user interface allows even inexperienced users to become rapidly acquainted with complex mathematical analyses. Some examples of new spectroscopic software products are given to demonstrate these points and to highlight the ease of integration into a modern analytical spectroscopy workplace.
NASA Astrophysics Data System (ADS)
Boatman, Elizabeth Marie
The nanoscale structure of compact bone contains several features that are direct indicators of bulk tissue mechanical properties. Fossil bone tissues represent unique opportunities to understand the compact bone structure/property relationships from a deep time perspective, offering a possible array of new insights into bone diseases, biomimicry of composite materials, and basic knowledge of bioapatite composition and nanoscale bone structure. To date, most work with fossil bone has employed microscale techniques and has counter-indicated the survival of bioapatite and other nanoscale structural features. The obvious disconnect between the use of microscale techniques and the discernment of nanoscale structure has prompted this work. The goal of this study was to characterize the nanoscale constituents of fossil compact bone by applying a suite of diffraction, microscopy, and spectrometry techniques, representing the highest levels of spatial and energy resolution available today, and capable of complementary structural and compositional characterization from the micro- to the nanoscale. Fossil dinosaur and crocodile long bone specimens, as well as modern ratite and crocodile femurs, were acquired from the UC Museum of Paleontology. Preserved physiological features of significance were documented with scanning electron microscopy back-scattered imaging. Electron microprobe wavelength-dispersive X-ray spectroscopy (WDS) revealed fossil bone compositions enriched in fluorine with a complementary loss of oxygen. X-ray diffraction analyses demonstrated that all specimens were composed of apatite. Transmission electron microscopy (TEM) imaging revealed preserved nanocrystallinity in the fossil bones and electron diffraction studies further identified these nanocrystallites as apatite. Tomographic analyses of nanoscale elements imaged by TEM and small angle X-ray scattering were performed, with the results of each analysis further indicating that nanoscale structure is highly conserved in these four fossil specimens. Finally, the results of this study indicate that bioapatite can be preserved in even the most ancient vertebrate specimens, further supporting the idea that fossilization is a preservational process. This work also underlines the importance of using appropriately selected characterization and analytical techniques for the study of fossil bone, especially from the perspective of spatial resolution and the scale of the bone structural features in question.
A continuous latitudinal energy balance model to explore non-uniform climate engineering strategies
NASA Astrophysics Data System (ADS)
Bonetti, F.; McInnes, C. R.
2016-12-01
Current concentrations of atmospheric CO2 exceed measured historical levels in modern times, largely attributed to anthropogenic forcing since the industrial revolution. The required decline in emission rates has never been achieved, leading to recent interest in climate engineering as a future risk-mitigation strategy. Climate engineering aims to offset human-driven climate change. It involves techniques developed both to reduce the concentration of CO2 in the atmosphere (Carbon Dioxide Removal (CDR) methods) and to counteract the radiative forcing that it generates (Solar Radiation Management (SRM) methods). In order to investigate the effects of SRM technologies for climate engineering, an analytical model describing the main dynamics of the Earth's climate has been developed. The model is a time-dependent Energy Balance Model (EBM) with latitudinal resolution and allows for the evaluation of non-uniform climate engineering strategies. A significant disadvantage of climate engineering techniques involving the management of solar radiation is regional disparity in cooling. This model offers an analytical approach to designing multi-objective strategies that counteract climate change on a regional basis: for example, to cool the Arctic while restricting undesired impacts at mid-latitudes, or to control the equator-to-pole temperature gradient. Using the Green's function approach, the resulting partial differential equation allows for the computation of the surface temperature as a function of time and latitude when a 1% per year increase in the CO2 concentration is considered. After validation of the model through comparisons with high-fidelity numerical models, it will be used to explore strategies for the injection of aerosol precursors into the stratosphere. In particular, the model includes a detailed description of the optical properties of the particles, the wash-out dynamics, and an estimate of the radiative cooling they can generate.
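A latitudinally resolved EBM of the general type described above can be sketched compactly. The following Python example integrates a standard North-style diffusive EBM, C ∂T/∂t = Q S(x)(1-α) - (A + BT) + D ∂/∂x[(1-x²) ∂T/∂x] with x = sin(latitude), to equilibrium; the parameter values are textbook-style assumptions and the model is generic, not the authors' formulation.

```python
import numpy as np

# One-dimensional diffusive energy balance model on x = sin(latitude).
N = 60
x_edges = np.linspace(-1.0, 1.0, N + 1)
x = 0.5 * (x_edges[:-1] + x_edges[1:])          # cell centers
dx = x_edges[1] - x_edges[0]

Q = 340.0                                        # insolation scale, W m^-2
S = 1.0 - 0.482 * 0.5 * (3.0 * x**2 - 1.0)       # S(x) = 1 - 0.482 P2(x)
alpha = np.where(np.abs(x) > 0.95, 0.62, 0.30)   # crude ice/ocean albedo
A, B = 203.3, 2.09                               # OLR = A + B*T, T in deg C
D = 0.649                                        # diffusivity, W m^-2 K^-1
C = 1.0e8                                        # heat capacity, J m^-2 K^-1

T = np.zeros(N)                                  # initial temperature, deg C
dt = 5.0e4                                       # time step (s), within stability limit
for _ in range(20000):                           # integrate to equilibrium
    grad = np.diff(T) / dx                       # dT/dx at interior edges
    # Diffusive flux D*(1 - x^2)*dT/dx; (1 - x^2) -> 0 enforces zero
    # flux at the poles, appended explicitly at the boundary edges.
    flux = np.r_[0.0, D * (1.0 - x_edges[1:-1]**2) * grad, 0.0]
    T += dt / C * (Q * S * (1.0 - alpha) - (A + B * T) + np.diff(flux) / dx)

print(f"equator {T[N // 2]:.1f} C, poles {T[0]:.1f} / {T[-1]:.1f} C")
```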
Detecting most influencing courses on students grades using block PCA
NASA Astrophysics Data System (ADS)
Othman, Osama H.; Gebril, Rami Salah
2014-12-01
One of the modern solutions for dealing with the problem of a large number of variables in statistical analyses is Block Principal Component Analysis (Block PCA). This modified technique can be used to reduce the vertical dimension (variables) of the data matrix Xn×p by selecting a smaller number of variables (say m) containing most of the statistical information. These selected variables can then be employed in further investigations and analyses. Block PCA is an adapted multistage version of the original PCA. It involves the application of Cluster Analysis (CA) and variable selection via sub-principal component scores (PCs). The application of Block PCA in this paper is a modified version of the original work of Liu et al (2002). The main objective was to apply PCA to each group of variables (established using cluster analysis) instead of to the whole large set of variables, which has proved unreliable. In this work, Block PCA is used to reduce the size of a large data matrix ((n = 41) × (p = 251)) consisting of the Grade Point Averages (GPAs) of students in 251 courses (variables) in the Faculty of Science at Benghazi University. In other words, we construct a smaller analytical data matrix of student GPAs with fewer variables that contains most of the variation (statistical information) in the original database. By applying Block PCA, 12 courses were found to 'absorb' most of the variation or influence in the original data matrix, and hence are worth keeping for future exploratory and analytical studies. In addition, the course Independent Study (Math.) was found to be the most influential course on students' GPA among the 12 selected courses.
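The two-stage logic of Block PCA, clustering variables into blocks and then selecting representatives via each block's principal components, can be sketched as follows. This Python example uses hierarchical clustering on a correlation distance and keeps the highest-loading variable per block; the data are synthetic and the exact selection rule is an assumption, not necessarily that of Liu et al (2002).

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)

# Synthetic stand-in for a students-by-courses GPA matrix (the real
# study used n = 41 students and p = 251 courses).
n, p, n_blocks = 41, 120, 12
X = StandardScaler().fit_transform(rng.normal(size=(n, p)))

# Step 1: group correlated variables with hierarchical clustering,
# using 1 - |r| as the distance between courses.
dist = 1.0 - np.abs(np.corrcoef(X.T))
Z = linkage(dist[np.triu_indices(p, k=1)], method="average")
blocks = fcluster(Z, t=n_blocks, criterion="maxclust")

# Step 2: PCA within each block; keep the course with the largest
# absolute loading on the block's first principal component.
selected = []
for b in np.unique(blocks):
    idx = np.flatnonzero(blocks == b)
    pc1 = PCA(n_components=1).fit(X[:, idx])
    selected.append(idx[np.argmax(np.abs(pc1.components_[0]))])

print(f"{len(selected)} representative courses: {sorted(selected)}")
```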
Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján
2016-02-11
A critical overview of the automation of modern liquid phase microextraction (LPME) approaches based on the liquid impregnation of porous sorbents and membranes is presented. It is the continuation of part 1, in which non-dispersive LPME techniques based on the use of the extraction phase (EP) in the form of a drop, plug, film, or microflow were surveyed. Compared to the approaches described in part 1, porous materials provide an improved support for the EP. At the same time, they enlarge its contact surface and reduce the risk of loss through incident flow or components of the surrounding matrix. Solvent-impregnated membranes or hollow fibres are further ideally suited for analyte extraction with simultaneous or subsequent back-extraction. Their use can therefore improve procedural robustness and reproducibility, and it "opens the door" to new operation modes and fields of application. However, additional work and time are required for membrane replacement and renewed impregnation. Automation of porous support-based and membrane-based approaches plays an important role in achieving better reliability, rapidity, and reproducibility compared to manual assays. Automated renewal of the extraction solvent and coupling of sample pretreatment with the detection instrumentation can be named as examples. The different LPME methodologies using impregnated membranes and porous supports for the extraction phase, the different strategies for their automation, and their analytical applications are comprehensively described and discussed in this part. Finally, an outlook on future demands and perspectives of the LPME techniques from both parts, as a promising area in the field of sample pretreatment, is given. Copyright © 2015 Elsevier B.V. All rights reserved.
Twilight of dawn or of evening? A century of research methods in the Journal of Applied Psychology.
Cortina, Jose M; Aguinis, Herman; DeShon, Richard P
2017-03-01
We offer a critical review and synthesis of research methods in the first century of the Journal of Applied Psychology. We divide the chronology into 6 periods. The first emphasizes the first few issues of the journal, which, in many ways, set us on a methodological course that we sail to this day, and then takes us through the mid-1920s. The second is the period through World War II, in which we see the roots of modern methodological concepts and techniques, including a transition from a discovery orientation to a hypothetico-deductive model orientation. The third takes us through roughly 1970, a period in which many of our modern-day practices were formed, such as reliance on null hypothesis significance testing. The fourth, from 1970 through 1989, sees an emphasis on the development of measures of critical constructs. The fifth takes us into the present, which is marked by greater plurality regarding data-analytic approaches. Finally, we offer a glimpse of possible and, from our perspective, desirable futures regarding research methods. Specifically, we highlight the need to conduct replications; study the exceptional and not just the average; improve the quality of the review process, particularly regarding methodological issues; emphasize design and measurement issues; and build and test more specific theories. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Fingerprinting sea-level variations in response to continental ice loss: a benchmark exercise
NASA Astrophysics Data System (ADS)
Barletta, Valentina R.; Spada, Giorgio; Riva, Riccardo E. M.; James, Thomas S.; Simon, Karen M.; van der Wal, Wouter; Martinec, Zdenek; Klemann, Volker; Olsson, Per-Anders; Hagedoorn, Jan; Stocchi, Paolo; Vermeersen, Bert
2013-04-01
Understanding the response of the Earth to the waxing and waning of ice sheets is crucial in various contexts, ranging from the interpretation of modern satellite geodetic measurements to projections of future sea level trends in response to climate change. All the processes accompanying Glacial Isostatic Adjustment (GIA) can be described by solving the so-called Sea Level Equation (SLE), an integral equation that accounts for the interactions between the ice sheets, the solid Earth, and the oceans. Modern approaches to the SLE are based on various techniques that range from purely analytical formulations to fully numerical methods. Here we present the results of a benchmark exercise of independently developed codes designed to solve the SLE. The study involves predictions of current sea level changes due to present-day ice mass loss. In spite of the differences in the methods employed, the comparison shows that a significant number of GIA modellers can reproduce their sea-level computations within 2% for well-defined, large-scale present-day ice mass changes. Smaller and more detailed loads need further, dedicated benchmarking and high-resolution computation. This study shows how the details of the implementation and the input specifications are an important, and often underappreciated, aspect. Hence this represents a step toward assessing the reliability of sea level projections obtained with benchmarked SLE codes.
NASA Astrophysics Data System (ADS)
Nevin, A.; Cesaratto, A.; D'Andrea, C.; Valentini, Gianluca; Comelli, D.
2013-05-01
We present a non-invasive study of historical and modern Zn- and Cd-based pigments with time-resolved fluorescence spectroscopy, fluorescence multispectral imaging, and fluorescence lifetime imaging (FLIM). Zinc oxide and zinc sulphide are semiconductors which have been used as white pigments in paintings, and the luminescence of these pigments from trapped states is strongly dependent on the presence of impurities and crystal defects. Cadmium sulphoselenide pigments vary in hue from yellow to deep red depending on their composition, and are another class of semiconductor pigments which emit both in the visible and the near infrared. The fluorescence lifetime of historical and modern pigments has been measured using both an Optical Multichannel Analyser (OMA) coupled with a Nd:YAG ns laser, and a streak camera coupled with a ps laser for spectrally resolved fluorescence lifetime measurements. For Zn-based pigments we have also employed FLIM for the measurement of luminescence. A case study of FLIM applied to the analysis of the painting on paper by Vincent Van Gogh, "Les Bretonnes et le pardon de Pont-Aven" (1888), is presented. Through the integration of complementary, portable, and non-invasive spectroscopic techniques, new insights into the optical properties of Zn- and Cd-based pigments have been gained which will inform future analysis of late 19th and early 20th C. paintings.
Analytical methods in multivariate highway safety exposure data estimation
DOT National Transportation Integrated Search
1984-01-01
Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting, and the expectation maximizati...
Teaching Electronics and Laboratory Automation Using Microcontroller Boards
ERIC Educational Resources Information Center
Mabbott, Gary A.
2014-01-01
Modern microcontroller boards offer the analytical chemist a powerful and inexpensive means of interfacing computers and laboratory equipment. The availability of a host of educational materials, compatible sensors, and electromechanical devices make learning to implement microcontrollers fun and empowering. This article describes the advantages…
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques for forecasting air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis, and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
Cantrill, Richard C
2008-01-01
Methods of analysis for products of modern biotechnology are required for national and international trade in seeds, grain and food in order to meet the labeling or import/export requirements of different nations and trading blocks. Although many methods were developed by the originators of transgenic events, governments, universities, and testing laboratories, trade is less complicated if there exists a set of international consensus-derived analytical standards. In any analytical situation, multiple methods may exist for testing for the same analyte. These methods may be supported by regional preferences and regulatory requirements. However, tests need to be sensitive enough to determine low levels of these traits in commodity grain for regulatory purposes and also to indicate purity of seeds containing these traits. The International Organization for Standardization (ISO) and its European counterpart have worked to produce a suite of standards through open, balanced and consensus-driven processes. Presently, these standards are approaching the time for their first review. In fact, ISO 21572, the "protein standard" has already been circulated for systematic review. In order to expedite the review and revision of the nucleic acid standards an ISO Technical Specification (ISO/TS 21098) was drafted to set the criteria for the inclusion of precision data from collaborative studies into the annexes of these standards.
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
Dynamic Visualization of Co-expression in Systems Genetics Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
New, Joshua Ryan; Huang, Jian; Chesler, Elissa J
2008-01-01
Biologists hope to address grand scientific challenges by exploring the abundance of data made available through modern microarray technology and other high-throughput techniques. The impact of this data, however, is limited unless researchers can effectively assimilate such complex information and integrate it into their daily research; interactive visualization tools are called for to support the effort. Specifically, typical studies of gene co-expression require novel visualization tools that enable the dynamic formulation and fine-tuning of hypotheses to aid the process of evaluating sensitivity of key parameters. These tools should allow biologists to develop an intuitive understanding of the structure of biological networks and discover genes which reside in critical positions in networks and pathways. By using a graph as a universal data representation of correlation in gene expression data, our novel visualization tool employs several techniques that when used in an integrated manner provide innovative analytical capabilities. Our tool for interacting with gene co-expression data integrates techniques such as: graph layout, qualitative subgraph extraction through a novel 2D user interface, quantitative subgraph extraction using graph-theoretic algorithms or by querying an optimized b-tree, dynamic level-of-detail graph abstraction, and template-based fuzzy classification using neural networks. We demonstrate our system using a real-world workflow from a large-scale, systems genetics study of mammalian gene co-expression.
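The graph representation of co-expression described above is easy to prototype: treat genes as nodes, connect pairs whose correlation passes a threshold, then extract subgraphs of interest. A minimal Python sketch with networkx and synthetic expression data (the threshold and gene names are placeholders):

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(4)

# Synthetic expression matrix: 50 genes x 30 samples (illustrative).
genes = [f"gene_{i}" for i in range(50)]
expr = rng.normal(size=(50, 30))
expr[1] = expr[0] + 0.1 * rng.normal(size=30)   # plant one correlated pair

# Graph as a universal representation of co-expression: nodes are
# genes, edges connect pairs whose |Pearson r| exceeds a threshold.
r = np.corrcoef(expr)
G = nx.Graph()
G.add_nodes_from(genes)
ii, jj = np.where(np.triu(np.abs(r) > 0.8, k=1))
G.add_weighted_edges_from((genes[i], genes[j], r[i, j]) for i, j in zip(ii, jj))

# Quantitative subgraph extraction: the neighborhood of one gene, a
# stand-in for finding genes in critical network positions.
sub = G.subgraph(["gene_0", *G.neighbors("gene_0")])
print(f"{G.number_of_edges()} edges; gene_0 neighborhood: {sub.number_of_nodes()} genes")
```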
Nguyen, Quynh C.; Osypuk, Theresa L.; Schmidt, Nicole M.; Glymour, M. Maria; Tchetgen Tchetgen, Eric J.
2015-01-01
Despite the recent flourishing of mediation analysis techniques, many modern approaches are difficult to implement or applicable to only a restricted range of regression models. This report provides practical guidance for implementing a new technique utilizing inverse odds ratio weighting (IORW) to estimate natural direct and indirect effects for mediation analyses. IORW takes advantage of the odds ratio's invariance property and condenses information on the odds ratio for the relationship between the exposure (treatment) and multiple mediators, conditional on covariates, by regressing exposure on mediators and covariates. The inverse of the covariate-adjusted exposure-mediator odds ratio association is used to weight the primary analytical regression of the outcome on treatment. The treatment coefficient in such a weighted regression estimates the natural direct effect of treatment on the outcome, and indirect effects are identified by subtracting direct effects from total effects. Weighting renders treatment and mediators independent, thereby deactivating indirect pathways of the mediators. This new mediation technique accommodates multiple discrete or continuous mediators. IORW is easily implemented and is appropriate for any standard regression model, including quantile regression and survival analysis. An empirical example is given using data from the Moving to Opportunity (1994–2002) experiment, testing whether neighborhood context mediated the effects of a housing voucher program on obesity. Relevant Stata code (StataCorp LP, College Station, Texas) is provided. PMID:25693776
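A minimal sketch of the IORW recipe described above, on simulated data rather than the Moving to Opportunity study, and in Python with statsmodels rather than the Stata code the paper provides: fit a logistic model of exposure on mediators and covariates, weight exposed subjects by their inverse predicted odds, and subtract the weighted (direct) estimate from the unweighted (total) one.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)

# Simulated data standing in for the experiment: binary treatment A,
# continuous mediator M, covariate C, continuous outcome Y.
n = 2000
C = rng.normal(size=n)
A = rng.binomial(1, 0.5, size=n)
M = 0.8 * A + 0.3 * C + rng.normal(size=n)
Y = 0.5 * A + 0.7 * M + 0.2 * C + rng.normal(size=n)

# Step 1: logistic regression of exposure on mediator(s) and covariates.
design = sm.add_constant(np.column_stack([M, C]))
ps = sm.Logit(A, design).fit(disp=0).predict(design)

# Step 2: weight exposed subjects by the inverse of their predicted
# exposure odds (deactivating exposure->mediator pathways); weight 1
# for the unexposed.
w = np.where(A == 1, (1.0 - ps) / ps, 1.0)

# Step 3: the weighted outcome regression estimates the natural direct
# effect; an unweighted regression estimates the total effect, and the
# indirect effect is their difference.
Xout = sm.add_constant(np.column_stack([A, C]))
direct = sm.WLS(Y, Xout, weights=w).fit().params[1]
total = sm.OLS(Y, Xout).fit().params[1]
print(f"total {total:.2f}, direct {direct:.2f}, indirect {total - direct:.2f}")
```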
Feasibility of Tactical Air Delivery Resupply Using Gliders
2016-12-01
...Sparrow," using modern design and manufacturing techniques including AutoCAD, 3D printing, laser cutting and CorelDraw, and conducting field testing... the desired point(s) of impact due to the atmospheric three-dimensional (3D) wind and density field encountered by the descending load under canopy
Modern Education in China. Bulletin, 1919, No. 44
ERIC Educational Resources Information Center
Edmunds, Charles K.
1919-01-01
The Chinese conception of life's values is so different from that of western peoples that they have failed to develop modern technique and scientific knowledge. Now that they have come to see the value of these, rapid and fundamental changes are taking place. When modern scientific knowledge is added to the skill which the Chinese already have in…
NASA Astrophysics Data System (ADS)
Kashansky, Vladislav V.; Kaftannikov, Igor L.
2018-02-01
Modern numerical modeling experiments and data analytics problems in various fields of science and technology reveal a wide variety of serious requirements for distributed computing systems. Many scientific computing projects sometimes exceed the limits of the available resource pool, requiring extra scalability and sustainability. In this paper we share our own experience and findings in combining the power of SLURM, BOINC, and GlusterFS as a software system for scientific computing. In particular, we suggest a complete architecture and highlight important aspects of systems integration.
Sub-federal ecological modernization: A case study of Colorado's new energy economy
NASA Astrophysics Data System (ADS)
Giannakouros, Stratis
European nations have often employed policies of explicit government intervention as a preferred means of addressing environmental and economic challenges. These policies have ranged from grey industrial policies focused solely on industrial growth, competitiveness and innovation to policies of stronger ecological modernization, which seek to align industrial interests with environmental protection. In recent years these policies have been mobilized to address the threat of climate change and promote environmental innovation. While some US Administrations have similarly recognized the need to address these challenges, the particular historical and political institutional dynamics of the US have meant that explicit government intervention has been eschewed in favor of more indirect strategies when dealing with economic and environmental challenges. This is evident in the rise of sub-federal policies at the level of US states. Supported by federal laboratories and public research, US states have adopted policies that look very much like sub-federal versions of industrial or ecological modernization policy. This thesis uses the Colorado case to highlight the importance of sub-federal institutions in addressing environmental and economic challenges in the US and explore its similarities to, and differences from, European approaches. To achieve this goal it first develops an analytical scheme within which to place policy initiatives on a continuum from grey industrial policy to strong ecological modernization policy by identifying key institutions that are influential in each policy type. This analytical scheme is then applied to the transitional renewable energy policy period from 2004-2012 in the state of Colorado. This period starts with the adoption of a renewable energy portfolio in 2004 and includes the `new energy economy' period from 2007-2010 as well as the years since. Looking at three key turning points this paper interprets the `new energy economy' strategy using the analytical scheme developed and identifies the political and social institutions that frame this transition. Drawing upon these findings, the paper analyses the implications of the Colorado case for understanding sub-federal initiatives in the US and concludes with a summary of the broader comparative institutional lessons.
The modern rotor aerodynamic limits survey: A report and data survey
NASA Technical Reports Server (NTRS)
Cross, J.; Brilla, J.; Kufeld, R.; Balough, D.
1993-01-01
The first phase of the Modern Technology Rotor Program, the Modern Rotor Aerodynamic Limits Survey, was a flight test conducted by the United States Army Aviation Engineering Flight Activity for NASA Ames Research Center. The test was performed using a United States Army UH-60A Black Hawk aircraft and the United States Air Force HH-60A Night Hawk instrumented main-rotor blade. The primary purpose of this test was to gather high-speed, steady-state, and maneuvering data suitable for correlation with analytical prediction tools. All aspects of the database, flight-test instrumentation, and test procedures are presented and analyzed. Because of the high volume of data, only select data points are presented. However, access to the entire data set is available upon request.
A profile of the demographics and training characteristics of professional modern dancers.
Weiss, David S; Shah, Selina; Burchette, Raoul J
2008-01-01
Modern dancers are a unique group of artists, performing a diverse repertoire in dance companies of various sizes. In this study, 184 professional modern dancers in the United States (males N=49, females N=135), including members of large and small companies as well as freelance dancers, were surveyed regarding their demographics and training characteristics. The mean age of the dancers was 30.1 +/- 7.3 years, and they had danced professionally for 8.9 +/- 7.2 years. The average Body Mass Index (BMI) was 23.6 +/- 2.4 for males and 20.5 +/- 1.7 for females. Females had started taking dance class earlier (age 6.5 +/- 4.2 years) as compared to males (age 15.6 +/- 6.2 years). Females were more likely to have begun their training in ballet, while males more often began with modern classes (55% and 51% respectively, p < 0.0001). The professional modern dancers surveyed spent 8.3 +/- 6.0 hours in class and 17.2 +/- 12.6 hours in rehearsal each week. Eighty percent took modern technique class and 67% reported that they took ballet technique class. The dancers who specified what modern technique they studied (N=84) reported between two and four different techniques. The dancers also participated in a multitude of additional exercise regimens for a total of 8.2 +/- 6.6 hours per week, with the most common types being Pilates, yoga, and upper body weightlifting. The dancers wore many different types of footwear, depending on the style of dance being performed. For modern dance alone, dancers wore 12 different types of footwear. Reflecting the diversity of the dancers and companies surveyed, females reported performing for 23.3 +/- 14.0 weeks (range: 2-52 weeks) per year; males reported performing 20.4 +/- 13.9 weeks (range: 1-40) per year. Only 18% of the dancers did not have any health insurance, with 54% having some type of insurance provided by their employer. However, 23% of the dancers purchased their own insurance, and 22% had insurance provided by their families. Only 16% of dancers reported that they had Workers' Compensation coverage, despite the fact that they were all professionals, including many employed by major modern dance companies across the United States. It is concluded that understanding the training profile of the professional modern dancer should assist healthcare providers in supplying appropriate medical care for these performers.
Social Information Processing Analysis (SIPA): Coding Ongoing Human Communication.
ERIC Educational Resources Information Center
Fisher, B. Aubrey; And Others
1979-01-01
The purpose of this paper is to present a new analytical system to be used in communication research. Unlike many existing systems devised ad hoc, this research tool, a system for interaction analysis, is embedded in a conceptual rationale based on modern systems theory. (Author)
Autonomous Energy Grids | Grid Modernization | NREL
Autonomous energy grids control themselves using advanced machine learning and simulation to create resilient, reliable, and affordable optimized energy systems. Current frameworks to monitor, control, and optimize large-scale energy systems draw on optimization theory, control theory, big data analytics, and complex system theory and modeling.
An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy
ERIC Educational Resources Information Center
Collis, Peter
2012-01-01
Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…
NASA Technical Reports Server (NTRS)
Bozeman, Robert E.
1987-01-01
An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.
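To make the drag model concrete, here is a minimal Python sketch of a spherical exponential density profile and the drag acceleration it produces; all parameter values (reference density, scale height, Cd, area, mass) are illustrative assumptions, not figures from the report.

```python
import numpy as np

def exponential_density(h_km, rho0=2.0e-11, h0_km=300.0, scale_height_km=50.0):
    """Spherical exponential atmosphere: rho = rho0 * exp(-(h - h0)/H), in kg/m^3.
    Parameter values are illustrative only."""
    return rho0 * np.exp(-(h_km - h0_km) / scale_height_km)

def drag_acceleration(rho, v_rel, cd=2.2, area=1.0, mass=500.0):
    """Drag acceleration magnitude a = 0.5 * rho * v^2 * Cd * A / m, in m/s^2."""
    return 0.5 * rho * v_rel**2 * cd * area / mass

rho = exponential_density(350.0)
print(drag_acceleration(rho, 7700.0))   # typical LEO orbital speed is ~7.7 km/s
```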
[Aerobic methylobacteria as promising objects of modern biotechnology].
Doronina, N V; Toronskava, L; Fedorov, D N; Trotsenko, Yu A
2015-01-01
The experimental data of the past decade concerning the metabolic peculiarities of aerobic methylobacteria and the prospects for their use in different fields of modern biotechnology, including genetic engineering techniques, have been summarized.
Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert
2009-01-01
Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...
Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A
2007-01-01
The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
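The screening pattern described, projecting each 14-analyte record to two dimensions and flagging points far from the bulk, can be sketched as below. Generative topographic mapping has no standard implementation in common Python libraries, so PCA is substituted here purely to illustrate the workflow; the records are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
records = rng.normal(size=(2000, 14))   # one row per patient, 14 analytes
records[0] += 8.0                       # inject a single anomalous record

coords = PCA(n_components=2).fit_transform(records)  # 14-D -> 2-D projection

# Records far from the center of the 2-D cloud are candidates for review.
dist = np.linalg.norm(coords - coords.mean(axis=0), axis=1)
print(np.argsort(dist)[-5:])            # indices of the 5 most extreme records
```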
Applications of Business Analytics in Healthcare.
Ward, Michael J; Marsolo, Keith A; Froehle, Craig M
2014-09-01
The American healthcare system is at a crossroads, and analytics, as an organizational skill, figures to play a pivotal role in its future. As more healthcare systems capture information electronically and as they begin to collect more novel forms of data, such as human DNA, how will we leverage these resources and use them to improve human health at a manageable cost? In this article, we argue that analytics will play a fundamental role in the transformation of the American healthcare system. However, there are numerous challenges to the application and use of analytics, namely the lack of data standards, barriers to the collection of high-quality data, and a shortage of qualified personnel to conduct such analyses. There are also multiple managerial issues, such as how to get end users of electronic data to employ it consistently for improving healthcare delivery, and how to manage the public reporting and sharing of data. In this article, we explore applications of analytics in healthcare, barriers and facilitators to its widespread adoption, and how analytics can help us achieve the goals of the modern healthcare system: high-quality, responsive, affordable, and efficient care.
Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.
ERIC Educational Resources Information Center
Hercules, David M.; Hercules, Shirley H.
1984-01-01
Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)
La Nasa, Jacopo; Zanaboni, Marco; Uldanck, Daniele; Degano, Ilaria; Modugno, Francesca; Kutzke, Hartmut; Tveit, Eva Storevik; Topalova-Casadiego, Biljana; Colombini, Maria Perla
2015-10-08
Modern oil paints, introduced at the beginning of the 20th century, differ from those classically used in antiquity in their chemical and compositional features. The main ingredients were still traditional drying oils, often mixed with less expensive oils and combined with several classes of additives. Consequently, detailed lipid profiling, together with the study of lipid degradation processes, is essential for understanding and conserving the paint materials used in modern and contemporary art. A multi-analytical approach based on mass spectrometry was used for the study of original paint materials from Munch's atelier, owned by the Munch Museum in Oslo. The results obtained in the analysis of paint tubes were compared with those obtained by characterizing a paint sample collected from one of the artist's sketches for the decoration of the Festival Hall of the University of Oslo (1909-1916). Py-GC/MS was used as a screening method to evaluate the presence of lipid, proteinaceous or polysaccharidic materials. GC/MS after hydrolysis and derivatization allowed us to determine the fatty acid profile of the paint tubes, and to evaluate the molecular changes associated with curing and ageing. Because the fatty acid profile alone is not conclusive for characterizing complex mixtures of lipid materials, the triglyceride profiles were also characterized using an analytical procedure based on HPLC-ESI-Q-ToF. This paper describes the first application of HPLC-ESI-Q-ToF for the acquisition of the triglyceride profile in a modern paint sample, showing the potential of liquid chromatography in the field of lipid characterization in modern paint materials. Moreover, our results highlighted that this approach can help address dating, authenticity and conservation issues related to modern and contemporary artworks. Copyright © 2015. Published by Elsevier B.V.
Rodushkin, I; Bergman, T; Douglas, G; Engström, E; Sörlin, D; Baxter, D C
2007-02-05
Different analytical approaches for origin differentiation between vendace and whitefish caviars from brackish and fresh waters were tested using inductively coupled plasma double-focusing sector field mass spectrometry (ICP-SFMS) and multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). These approaches involve identifying differences in elemental concentrations or sample-specific variations in isotopic composition (Sr and Os). Concentrations of 72 elements were determined by ICP-SFMS following microwave-assisted digestion in vendace and whitefish caviar samples from Sweden (from both brackish and fresh water), Finland and the USA, as well as in unprocessed vendace roe and salt used in caviar production. This data set allows identification of elements whose contents in caviar can be affected by salt addition as well as by contamination during production and packaging. Long-term method reproducibility was assessed for all analytes based on replicate caviar preparations/analyses, and variations in element concentrations in caviar from different harvests were evaluated. The greatest utility for differentiation was demonstrated for elements with varying concentrations between brackish and fresh waters (e.g. As, Br, Sr). Elemental ratios, specifically Sr/Ca, Sr/Mg and Sr/Ba, are especially useful for authentication of vendace caviar processed from brackish-water roe, owing to the significant differences between caviar from different sources, limited between-harvest variations and relatively high concentrations in samples, allowing precise determination by modern analytical instrumentation. Variations in the 87Sr/86Sr ratio for vendace caviar from different harvests (on the order of 0.05-0.1%) are at least 10-fold smaller than the differences between caviar processed from brackish-water and freshwater roe. Hence, Sr isotope ratio measurements (either by ICP-SFMS or by MC-ICP-MS) have great potential for origin differentiation. In contrast, it was impossible to differentiate between Swedish caviar processed from brackish-water roe and Finnish freshwater caviar based solely on 187Os/188Os ratios.
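A minimal sketch of the ratio-based screening idea follows: compute Sr/Ca, Sr/Mg and Sr/Ba from measured concentrations and compare them with source groups. All numbers, including the decision threshold, are hypothetical and not taken from the paper.

```python
# Hypothetical elemental concentrations (mg/kg) for one caviar sample.
conc = {"Sr": 4.0, "Ca": 900.0, "Mg": 450.0, "Ba": 0.05}

ratios = {f"Sr/{el}": conc["Sr"] / conc[el] for el in ("Ca", "Mg", "Ba")}
print(ratios)

# Illustrative screening rule: brackish-water caviar tends toward higher
# Sr/Ca than freshwater caviar; this threshold is made up for the example.
print("brackish-like" if ratios["Sr/Ca"] > 3e-3 else "freshwater-like")
```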
A History of Computer Numerical Control.
ERIC Educational Resources Information Center
Haggen, Gilbert L.
Computer numerical control (CNC) has evolved from the first significant counting method--the abacus. Babbage had perhaps the greatest impact on the development of modern day computers with his analytical engine. Hollerith's functioning machine with punched cards was used in tabulating the 1890 U.S. Census. In order for computers to become a…
Reframing Responsibility in an Era of Responsibilisation: Education, Feminist Ethics
ERIC Educational Resources Information Center
McLeod, Julie
2017-01-01
Late modern social theories and critiques of neoliberalism have emphasised the regulatory and negative aspects of responsibility, readily associating it with self-responsibility or analytically converting it to the notion of responsibilisation. This article argues for stepping back from these critiques in order to reframe responsibility as a…
Visualizing the Solute Vaporization Interference in Flame Atomic Absorption Spectroscopy
ERIC Educational Resources Information Center
Dockery, Christopher R.; Blew, Michael J.; Goode, Scott R.
2008-01-01
Every day, tens of thousands of chemists use analytical atomic spectroscopy in their work, often without knowledge of possible interferences. We present a unique approach to study these interferences by using modern response surface methods to visualize an interference in which aluminum depresses the calcium atomic absorption signal. Calcium…
Effect-Size Measures and Meta-Analytic Thinking in Counseling Psychology Research
ERIC Educational Resources Information Center
Henson, Robin K.
2006-01-01
Effect sizes are critical to result interpretation and synthesis across studies. Although statistical significance testing has historically dominated the determination of result importance, modern views emphasize the role of effect sizes and confidence intervals. This article accessibly discusses how to calculate and interpret the effect sizes…
A Study of the U.S. Coast Guard Aviator Training Requirements.
ERIC Educational Resources Information Center
Hall, Eugene R.; And Others
An analytical study was conducted to define the functional characteristics of modern, synthetic flight training equipment for the purpose of producing potentially better qualified aviators through a combination of aircraft and simulator training. Relevant training which aviators receive in preparation for specific aircraft duties and training requirements…
Model Effectiveness as a Function of Personnel (ME = f(PER))
1986-10-01
Human Factor in Military Modernization, The RAND Corporation, R-2460-NA, 1979. AD-A072955 D-7. SUPPRESSION: Mueller, M. P., K. H. Pietsch, Human Factors in... H. Pietsch, Human Factors in Field Experimentation, Design and Analysis of an Analytical Suppression Model, 1978. A061417. Office of Naval Research
Post-Secularism, Religious Knowledge and Religious Education
ERIC Educational Resources Information Center
Carr, David
2012-01-01
Post-secularism seems to follow in the wake of other (what are here called) "postal" perspectives--post-structuralism, postmodernism, post-empiricism, post-positivism, post-analytical philosophy, post-foundationalism and so on--in questioning or repudiating what it takes to be the epistemic assumptions of "modernism." To be sure, post-secularism…
Isolation by ion-exchange methods. In Sarker S.D. (ed) Natural Products Isolation, 3rd edition
USDA-ARS?s Scientific Manuscript database
The primary goal of many natural products chemists is to extract, isolate, and characterize specific analytes from complex plant, animal, microbial, and food matrices. To achieve this goal, they rely considerably on highly sophisticated and highly hyphenated modern instrumentation. Yet, the vast maj...
Big Data Goes Personal: Privacy and Social Challenges
ERIC Educational Resources Information Center
Bonomi, Luca
2015-01-01
The Big Data phenomenon is posing new challenges in our modern society. In addition to requiring information systems to effectively manage high-dimensional and complex data, the privacy and social implications associated with the data collection, data analytics, and service requirements create new important research problems. First, the high…
Temperature Dependence of Viscosities of Common Carrier Gases
ERIC Educational Resources Information Center
Sommers, Trent S.; Nahir, Tal M.
2005-01-01
Theoretical and experimental evidence for the dependence of viscosities of the real gases on temperature is described, suggesting that this dependence is greater than that predicted by the kinetic theory of gases. The experimental results were obtained using common modern instrumentation and could be reproduced by students in analytical or…
About, for, in or through Entrepreneurship in Engineering Education
ERIC Educational Resources Information Center
Mäkimurto-Koivumaa, Soili; Belt, Pekka
2016-01-01
Engineering competences form a potential basis for entrepreneurship. There are pressures to find new approaches to entrepreneurship education (EE) in engineering education, as the traditional analytical logic of engineering does not match the modern view of entrepreneurship. Since the previous models do not give tangible enough tools on how to…
Annual banned-substance review: Analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans
2018-01-01
Several high-profile revelations concerning anti-doping rule violations over the past 12 months have outlined the importance of tackling prevailing challenges and reducing the limitations of the current anti-doping system. At this time, the necessity to enhance, expand, and improve analytical test methods in response to the substances outlined in the World Anti-Doping Agency's (WADA) Prohibited List represents an increasingly crucial task for modern sports drug-testing programs. The ability to improve analytical testing methods often relies on the expedient application of novel information regarding superior target analytes for sports drug-testing assays, drug elimination profiles, alternative test matrices, together with recent advances in instrumental developments. This annual banned-substance review evaluates literature published between October 2016 and September 2017 offering an in-depth evaluation of developments in these arenas and their potential application to substances reported in WADA's 2017 Prohibited List. Copyright © 2017 John Wiley & Sons, Ltd.
Flow chemistry vs. flow analysis.
Trojanowicz, Marek
2016-01-01
The flow mode of conducting chemical syntheses facilitates chemical processes through the use of on-line analytical monitoring of occurring reactions, the application of solid-supported reagents to minimize downstream processing, and computerized control systems to perform multi-step sequences. These are exactly the same attributes as those of flow analysis, which has held a solid place in modern analytical chemistry for the last several decades. The following review, based on 131 references to original papers as well as pre-selected reviews, presents basic aspects, selected instrumental achievements, and developmental directions of the rapidly growing field of continuous flow chemical synthesis. Interestingly, many of these might potentially be employed in the development of new methods in flow analysis too. In this paper, examples of the application of flow analytical measurements for on-line monitoring of flow syntheses are indicated, and perspectives for a wider application of real-time analytical measurements are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
BARTTest: Community-Standard Atmospheric Radiative-Transfer and Retrieval Tests
NASA Astrophysics Data System (ADS)
Harrington, Joseph; Himes, Michael D.; Cubillos, Patricio E.; Blecic, Jasmina; Challener, Ryan C.
2018-01-01
Atmospheric radiative transfer (RT) codes are used both to predict planetary and brown-dwarf spectra and in retrieval algorithms to infer atmospheric chemistry, clouds, and thermal structure from observations. Observational plans, theoretical models, and scientific results depend on the correctness of these calculations. Yet, the calculations are complex and the codes implementing them are often written without modern software-verification techniques. The community needs a suite of test calculations with analytically, numerically, or at least community-verified results. We therefore present the Bayesian Atmospheric Radiative Transfer Test Suite, or BARTTest. BARTTest has four categories of tests: analytically verified RT tests of simple atmospheres (single line in single layer, line blends, saturation, isothermal, multiple line-list combination, etc.), community-verified RT tests of complex atmospheres, synthetic retrieval tests on simulated data with known answers, and community-verified real-data retrieval tests. BARTTest is open-source software intended for community use and further development. It is available at https://github.com/ExOSPORTS/BARTTest. We propose this test suite as a standard for verifying atmospheric RT and retrieval codes, analogous to the Held-Suarez test for general circulation models. This work was supported by NASA Planetary Atmospheres grant NX12AI69G, NASA Astrophysics Data Analysis Program grant NNX13AF38G, and NASA Exoplanets Research Program grant NNX17AB62G.
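In the spirit of the analytically verified tests described above, the simplest such check can be written in a few lines: transmission through a single purely absorbing layer must match the closed form I = I0 exp(-tau). This sketch is illustrative and is not code from BARTTest.

```python
import numpy as np

def transmitted(i0, tau):
    """Intensity after one purely absorbing, non-emitting layer of optical depth tau."""
    return i0 * np.exp(-tau)

# A numerical RT code run on this setup should reproduce the analytic value
# to within its stated tolerance.
assert np.isclose(transmitted(1.0, 0.7), np.exp(-0.7))
```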
NASA Astrophysics Data System (ADS)
Trainoff, Steven
2009-03-01
Many modern pharmaceuticals and naturally occurring biomolecules consist of complexes of proteins and polyethylene glycol or carbohydrates. In the case of vaccine development, these complexes are often used to induce or amplify immune responses. For protein therapeutics they are used to modify solubility and function, or to control the rate of degradation and elimination of a drug from the body. Characterizing the stoichiometry of these complexes is an important industrial problem that presents a formidable challenge to analytical instrument designers. Traditional analytical methods, such as fluorescent tagging, chemical assays, and mass spectrometry, perturb the system so dramatically that the complexes are often destroyed or uncontrollably modified by the measurement. A solution to this problem consists of fractionating the samples and then measuring the fractions using sequential non-invasive detectors that are sensitive to different components of the complex. We present results using UV absorption, which is primarily sensitive to the protein fraction; light scattering, which measures the total weight-average molar mass; and refractive index detection, which measures the net concentration. We also present a solution to the inter-detector band-broadening problem that has heretofore made this approach impractical. We present instrumentation and an analysis method that overcome these obstacles and make this technique a reliable and robust way of non-invasively characterizing these industrially important compounds.
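The multi-detector idea reduces, in the simplest two-component case, to a small linear system: if the modifier (e.g., PEG) is assumed UV-transparent, the UV signal fixes the protein concentration and the RI signal then constrains the modifier. The response factors below are generic literature-style values and the signals are hypothetical; this is a schematic sketch, not the instrument's calibration.

```python
import numpy as np

eps_protein_uv = 1.0     # UV response factor of the protein (arbitrary units)
dndc_protein = 0.185     # refractive-index increment typical of proteins (mL/g)
dndc_peg = 0.135         # refractive-index increment typical of PEG (mL/g)

uv_signal, ri_signal = 0.50, 0.145   # hypothetical detector readings

# [UV]   [eps_p     0      ] [c_protein]
# [RI] = [dn/dc_p   dn/dc_m] [c_peg    ]
A = np.array([[eps_protein_uv, 0.0],
              [dndc_protein, dndc_peg]])
c_protein, c_peg = np.linalg.solve(A, [uv_signal, ri_signal])
print(c_protein, c_peg)  # component concentrations; their ratio gives stoichiometry
```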
Determining Gender by Raman Spectroscopy of a Bloodstain.
Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; Lednev, Igor K
2017-02-07
The development of novel methods for forensic science is a constantly growing area of modern analytical chemistry. Raman spectroscopy is one of a few analytical techniques capable of nondestructive and nearly instantaneous analysis of a wide variety of forensic evidence, including body fluid stains, at the scene of a crime. In this proof-of-concept study, Raman microspectroscopy was utilized for gender identification based on dry bloodstains. Raman spectra were acquired in mapping mode from multiple spots on a bloodstain to account for intrinsic sample heterogeneity. The obtained Raman spectroscopic data showed highly similar spectroscopic features for female and male blood samples. Nevertheless, support vector machine (SVM) and artificial neural network (ANN) statistical methods applied to the spectroscopic data allowed for differentiating between male and female bloodstains with high confidence. More specifically, the statistical approach based on a genetic algorithm (GA) coupled with an ANN classification showed approximately 98% gender differentiation accuracy for individual bloodstains. These results demonstrate the great potential of the developed method for forensic applications, although more work is needed for method validation. When this method is fully developed, a portable Raman instrument could be used for the in-field identification of traces of body fluids and to obtain phenotypic information about the donor, including gender and race, as well as for the analysis of a variety of other types of forensic evidence.
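A classification pipeline of the kind described (spectra in, class labels out) can be sketched with a linear SVM on synthetic data; real Raman spectra, preprocessing, and the GA-based feature selection of the study are all omitted here.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 500))      # 200 synthetic "spectra", 500 wavenumber bins
y = rng.integers(0, 2, size=200)     # 0 = female, 1 = male (labels hypothetical)
X[y == 1, 100:110] += 0.3            # weak class-dependent spectral band

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
print(cross_val_score(clf, X, y, cv=5).mean())   # cross-validated accuracy
```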
Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason
2016-01-01
With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713
Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"
NASA Astrophysics Data System (ADS)
Pal, Sangita; Singha, Mousumi; Meena, Sher Singh
2018-04-01
The limited availability of analytical instruments for the methodical detection of known and unknown effluents poses a serious hindrance to qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography and electro-analytical instruments are not only expensive but also time consuming, require maintenance and need replacement of damaged essential parts, all of which are serious concerns. Moreover, for field studies and instant detection, installation of these instruments is not convenient at every site. Therefore, a technique based on pre-concentration of metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key to the immobilization technique, which is simple, user friendly, highly effective, inexpensive and time efficient, and easy to carry (a 10-20 g vial) to the experimental field/site, as has been demonstrated.
Approximate analytical relationships for linear optimal aeroelastic flight control laws
NASA Astrophysics Data System (ADS)
Kassem, Ayman Hamdy
1998-09-01
This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.
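For readers who want to reproduce the baseline numerically, a minimal linear-quadratic state-feedback design looks like the sketch below; the system matrices and weights are toy values, not those of the aeroelastic model in the dissertation.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy 2-state model (loosely short-period-like); values are illustrative.
A = np.array([[-0.5, 1.0], [-4.0, -1.2]])
B = np.array([[0.0], [2.0]])
Q = np.diag([10.0, 1.0])     # state weights in the quadratic cost
R = np.array([[1.0]])        # control weight

P = solve_continuous_are(A, B, Q, R)     # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)          # optimal gain, u = -K x
print(np.linalg.eigvals(A - B @ K))      # closed-loop eigenvalues
```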
Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration
NASA Technical Reports Server (NTRS)
Merritt, D. A.; Brand, W. A.; Hayes, J. M.
1994-01-01
In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques and results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source, or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (σ = 0.00006 at.% 13C or 0.06% for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 ≤ σ ≤ 0.0002 at.% or 0.1 ≤ σ ≤ 0.2%).
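The calibration arithmetic behind such measurements is compact; a sketch using the standard delta notation (with a commonly cited VPDB ratio) follows. The specific numbers in the paper are not reproduced.

```python
R_VPDB = 0.0111802   # commonly cited 13C/12C ratio of the VPDB standard

def delta13C_permil(r_sample, r_standard=R_VPDB):
    """delta-13C in per mil: (R_sample / R_standard - 1) * 1000."""
    return (r_sample / r_standard - 1.0) * 1000.0

def atom_percent_13C(r):
    """Atom percent 13C from the 13C/12C ratio: 100 * R / (1 + R)."""
    return 100.0 * r / (1.0 + r)

r = 0.0109            # hypothetical measured ratio
print(delta13C_permil(r), atom_percent_13C(r))
```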
Triple-helix molecular switch-based aptasensors and DNA sensors.
Bagheri, Elnaz; Abnous, Khalil; Alibolandi, Mona; Ramezani, Mohammad; Taghdisi, Seyed Mohammad
2018-07-15
Utilization of traditional analytical techniques is limited because they are generally time-consuming and require high consumption of reagents, complicated sample preparation and expensive equipment. Therefore, it is of great interest to achieve sensitive, rapid and simple detection methods. Nucleic acid assays, especially aptamers, are very important in modern life sciences for target detection and biological analysis. Aptamers and DNA-based probes have been widely used for the design of various sensors owing to their unique features. In recent years, triple-helix molecular switch (THMS)-based aptasensors and DNA sensors have been broadly utilized for the detection and analysis of different targets. The THMS relies on the formation of a DNA triplex via Watson-Crick and Hoogsteen base pairings under optimal conditions. This review focuses on recent progress in the development and applications of electrochemical, colorimetric, fluorescence and SERS aptasensors and DNA sensors based on THMS. The advantages and drawbacks of these methods are also discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
de Carvalho, Marcelo Pires Nogueira; Queiroz-Hazarbassanov, Nicolle Gilda Teixeira; de Oliveira Massoco, Cristina; Sant'Anna, Sávio Stefanini; Lourenço, Mariana Mathias; Levin, Gabriel; Sogayar, Mari Cleide; Grego, Kathleen Fernandes; Catão-Dias, José Luiz
2017-09-01
Reptiles are the only ectothermic amniotes, providing the key link between the ectothermic anamniotes (fish and amphibians) and the endothermic birds and mammals; this makes them an important group for studies aiming to provide significant insight into the evolutionary history of vertebrate immunity. Classification systems for reptile leukocytes have been based on appearance rather than function and remain inconsistent. With the advent of modern techniques and the establishment of analytical protocols for snake blood by flow cytometry, we present a qualitative and quantitative assessment of the innate activities of snakes' peripheral blood leukocytes, thereby linking flow cytometric features with fluorescence and light microscopy images. Moreover, since corticosterone is an important immunomodulator in reptiles, hormone levels of all blood samples were measured. We provide novel information which should contribute to a better understanding of the development of the immune system of reptiles and vertebrates. Copyright © 2017 Elsevier Ltd. All rights reserved.
Leveraging Social Computing for Personalized Crisis Communication using Social Media.
Leykin, Dmitry; Aharonson-Daniel, Limor; Lahad, Mooli
2016-03-24
The extensive use of social media in modern life redefines social interaction and communication. Communication plays an important role in mitigating, or exacerbating, the psychological and behavioral responses to critical incidents and disasters. As recent disasters demonstrated, people tend to converge to social media during and following emergencies. Authorities can then use this media and other computational methods to gain insights from the public, mainly to enhance situational awareness, but also to improve their communication with the public and public adherence to instructions. The current review presents a conceptual framework for studying psychological aspects of crisis and risk communication using the social media through social computing. Advanced analytical tools can be integrated in the processes and objectives of crisis communication. The availability of the computational techniques can improve communication with the public by a process of Hyper-Targeted Crisis Communication. The review suggests that using advanced computational tools for target-audience profiling and linguistic matching in social media, can facilitate more sensitive and personalized emergency communication.
Bochevarov, Arteum D; Sherrill, C David
2004-08-22
We present a general computer algorithm to contract an arbitrary number of second-quantized expressions and simplify the obtained analytical result. The functions that perform these operations are a part of the program Nostromo which facilitates the handling and analysis of the complicated mathematical formulas which are often encountered in modern quantum-chemical models. In contrast to existing codes of this kind, Nostromo is based solely on the Goldstone-diagrammatic representation of algebraic expressions in Fock space and has capabilities to work with operators as well as scalars. Each Goldstone diagram is internally represented by a line of text which is easy to interpret and transform. The calculation of matrix elements does not exploit Wick's theorem in a direct way, but uses diagrammatic techniques to produce only nonzero terms. The identification of equivalent expressions and their subsequent factorization in the final result is performed easily by analyzing the topological structure of the diagrammatic expressions. (c) 2004 American Institute of Physics
Quality of herbal medicines: challenges and solutions.
Zhang, Junhua; Wider, Barbara; Shang, Hongcai; Li, Xuemei; Ernst, Edzard
2012-01-01
The popularity of herbal medicines has risen worldwide. This increase in usage renders safety issues important. Many adverse events of herbal medicines can be attributed to the poor quality of the raw materials or the finished products. Different types of herbal medicines are associated with different problems. Quality issues of herbal medicines can be classified into two categories: external and internal. In this review, external issues including contamination (e.g. toxic metals, pesticides residues and microbes), adulteration and misidentification are detailed. Complexity and non-uniformity of the ingredients in herbal medicines are the internal issues affecting the quality of herbal medicines. Solutions to the raised problems are discussed. The rigorous implementation of Good Agricultural and Collection Practices (GACP) and Good Manufacturing Practices (GMP) would undoubtedly reduce the risk of external issues. Through the use of modern analytical methods and pharmaceutical techniques, previously unsolved internal issues have become solvable. Standard herbal products can be manufactured from the standard herbal extracts. Copyright © 2011 Elsevier Ltd. All rights reserved.
Characterizing visible and invisible cell wall mutant phenotypes.
Carpita, Nicholas C; McCann, Maureen C
2015-07-01
About 10% of a plant's genome is devoted to generating the protein machinery to synthesize, remodel, and deconstruct the cell wall. High-throughput genome sequencing technologies have enabled a reasonably complete inventory of wall-related genes that can be assembled into families of common evolutionary origin. Assigning function to each gene family member has been aided immensely by identification of mutants with visible phenotypes or by chemical and spectroscopic analysis of mutants with 'invisible' phenotypes of modified cell wall composition and architecture that do not otherwise affect plant growth or development. This review connects the inference of gene function on the basis of deviation from the wild type in genetic functional analyses to insights provided by modern analytical techniques that have brought us ever closer to elucidating the sequence structures of the major polysaccharide components of the plant cell wall. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Biosensing with Paper-Based Miniaturized Printed Electrodes-A Modern Trend.
Silveira, Célia M; Monteiro, Tiago; Almeida, Maria Gabriela
2016-09-28
From the benchmark work on microfluidics by the Whitesides group in 2007, paper technology has experienced significant growth, particularly in applications in biomedical research and clinical diagnostics. Besides the structural properties supporting microfluidics, other advantageous features of paper materials, including their versatility, disposability and low cost, showcase their great potential for the development of advanced and eco-friendly analytical tools. Consequently, paper was quickly employed in the field of electrochemical sensors, being an ideal material for producing custom, tailored and miniaturized devices. Stencil-, inkjet-, or screen-printing are the preferred techniques for electrode manufacturing. Not surprisingly, the number of publications on paper-based screen-printed sensors increased rapidly at the turn of the past decade. Among the sensing strategies, various biosensors, coupling electrochemical detectors with biomolecules, have been proposed. This work provides a critical review and a discussion of the future progress of paper technology in the context of miniaturized printed electrochemical biosensors.
Legal protection of public health through control over genetically modified food.
Gutorova, Nataliya; Batyhina, Olena; Trotska, Maryna
2018-01-01
Introduction: Science is constantly developing, which leads to both positive and negative changes in public health and the environment. One result of scientific progress is the introduction of food based on genetically modified organisms, whose effects on human health remain, to date, scantily studied and ambiguous. The aim: to determine how human health can be influenced by food production based on genetically modified organisms. Materials and methods: international acts, data from international organizations and conclusions of scientists have been examined and used in the study. The article also summarizes information from scientific journals and monographs from medical and legal points of view using scientific methods. This article is based on dialectical, comparative, analytic, synthetic and comprehensive research methods. Conclusions: Genetically modified organisms are specific human-made organisms produced using modern biotechnology techniques. They have both positive and negative effects on human health and the environment. The main disadvantage is that they remain insufficiently studied across various spheres of public life.
The role of data fusion in predictive maintenance using digital twin
NASA Astrophysics Data System (ADS)
Liu, Zheng; Meyendorf, Norbert; Mrad, Nezih
2018-04-01
The modern aerospace industry is migrating from reactive to proactive and predictive maintenance to increase platform operational availability and efficiency, extend its useful life cycle and reduce its life cycle cost. Multiphysics modeling together with data-driven analytics generates a new paradigm called the "Digital Twin." The digital twin is a living model of the physical asset or system, which continually adapts to operational changes based on collected online data and information, and can forecast the future of the corresponding physical counterpart. This paper reviews the overall framework for developing a digital twin coupled with industrial Internet of Things technology to advance aerospace platform autonomy. Data fusion techniques play a particularly significant role in the digital twin framework. The flow of information from raw data to high-level decision making is propelled by sensor-to-sensor, sensor-to-model, and model-to-model fusion. This paper further discusses and identifies the role of data fusion in the digital twin framework for aircraft predictive maintenance.
Satagopan, Jaya M; Sen, Ananda; Zhou, Qin; Lan, Qing; Rothman, Nathaniel; Langseth, Hilde; Engel, Lawrence S
2016-06-01
Matched case-control studies are popular designs used in epidemiology for assessing the effects of exposures on binary traits. Modern studies increasingly enjoy the ability to examine a large number of exposures in a comprehensive manner. However, several risk factors often tend to be related in a nontrivial way, undermining efforts to identify the risk factors using standard analytic methods due to inflated type-I errors and possible masking of effects. Epidemiologists often use data reduction techniques by grouping the prognostic factors using a thematic approach, with themes deriving from biological considerations. We propose shrinkage-type estimators based on Bayesian penalization methods to estimate the effects of the risk factors using these themes. The properties of the estimators are examined using extensive simulations. The methodology is illustrated using data from a matched case-control study of polychlorinated biphenyls in relation to the etiology of non-Hodgkin's lymphoma. © 2015, The International Biometric Society.
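As a rough frequentist analogue of the Bayesian penalization described (a Gaussian prior on coefficients corresponds to an L2 penalty), the sketch below fits a ridge-penalized logistic regression; the matched-set conditioning of the case-control design is omitted for brevity and the data are synthetic.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 20))        # 20 exposures per subject
y = rng.integers(0, 2, size=300)      # case/control status (synthetic)

# Smaller C means a stronger penalty, i.e., more shrinkage toward zero.
fit = LogisticRegression(penalty="l2", C=0.5).fit(X, y)
print(fit.coef_.round(2))             # shrunken log-odds-ratio estimates
```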
Analytical technique characterizes all trace contaminants in water
NASA Technical Reports Server (NTRS)
Foster, J. N.; Lysyj, I.; Nelson, K. H.
1967-01-01
A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to investigating the sources of water pollution.
Foster, Katherine T; Beltz, Adriene M
2018-08-01
Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
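Of the three person-oriented techniques, multilevel modeling is the most widely available; a minimal sketch with statsmodels follows. Variable names (person, stress, craving) are hypothetical and the data are simulated.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_people, n_obs = 30, 20   # 30 individuals, 20 repeated assessments each
df = pd.DataFrame({
    "person": np.repeat(np.arange(n_people), n_obs),
    "stress": rng.normal(size=n_people * n_obs),
})
df["craving"] = 0.4 * df["stress"] + rng.normal(size=len(df))

# Random intercept per person; fixed effect of momentary stress on craving.
model = smf.mixedlm("craving ~ stress", df, groups=df["person"]).fit()
print(model.params)
```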
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. This difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.
Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek
2017-07-04
One of the major sources of error in chemical analysis using conventional, established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass sample preparation; in this review, we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis without sample preparation, or with only limited pre-concentration steps. The MALDI-MS, PTR-MS, SIFT-MS and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques are presented, as are the trends in the development of direct analysis using them.
Mirasole, Cristiana; Di Carro, Marina; Tanwar, Shivani; Magi, Emanuele
2016-09-01
Among the wide range of emerging pollutants, perfluorinated compounds and various pharmaceuticals, such as nonsteroidal anti-inflammatory drugs, are of growing concern. These contaminants can be found in freshwater ecosystems because of their incomplete removal during wastewater treatment; their water solubility and poor degradability result in continuous discharge and pseudo-persistent contamination. Usually, expected levels of these analytes are particularly low; therefore, sensitive and selective analytical techniques are required for their determination. Moreover, sampling and preconcentration are fundamental steps to reach the low detection limits required. The polar organic chemical integrative sampler (POCIS) represents a modern sampling approach that allows the in-situ preconcentration of ultra-trace pollutants. In this work, a fast liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) method was developed for the determination of diclofenac, ketoprofen, mefenamic acid, naproxen, ibuprofen, perfluorooctanoic acid, perfluorooctanesulfonate and caffeine in water intended for human consumption. The chromatographic separation of the analytes was achieved in less than 6 min. Quantitative analysis was performed in multiple reaction monitoring mode using ketoprofen-d3 as internal standard. Two different sites in Northern Italy were studied by deploying POCIS for four weeks in both the inlet and outlet of two drinking water treatment plants. The evaluation of time-weighted average concentrations of contaminants was accomplished after calibration of the POCIS; to this aim, the sampling rate values for each compound were obtained by means of a simple calibration system developed in our laboratory. Ketoprofen, perfluorooctane sulfonate, perfluorooctanoate and caffeine were measured at both sites at the ng l(-1) level. Copyright © 2016 John Wiley & Sons, Ltd.
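The time-weighted average concentration recovered from a POCIS deployment follows the standard passive-sampler relation C_TWA = M / (Rs * t); a one-function sketch with made-up numbers:

```python
def twa_concentration(mass_sorbed_ng, sampling_rate_l_per_day, days):
    """Standard passive-sampler relation: C_TWA = M / (Rs * t), in ng/L."""
    return mass_sorbed_ng / (sampling_rate_l_per_day * days)

# Hypothetical example: 50 ng accumulated over a 28-day deployment with
# Rs = 0.2 L/day gives roughly 8.9 ng/L.
print(twa_concentration(50.0, 0.2, 28))
```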
Scadding, Cameron J; Watling, R John; Thomas, Allen G
2005-08-15
The majority of crimes result in the generation of some form of physical evidence, which is available for collection by crime scene investigators or police. However, this debris is often limited in amount as modern criminals become more aware of its potential value to forensic scientists. The requirement to obtain robust evidence from increasingly small samples has required refinement and modification of old analytical techniques and the development of new ones. This paper describes a new method for the analysis of oxy-acetylene debris left behind at a crime scene, and the establishment of its co-provenance with single particles of equivalent debris found on the clothing of persons of interest (POI). The ability to rapidly determine and match the elemental distribution patterns of debris collected from crime scenes to those recovered from persons of interest is essential in ensuring successful prosecution. Traditionally, relatively large amounts of sample (up to several milligrams) have been required to obtain a reliable elemental fingerprint of this type of material [R.J. Watling, B.F. Lynch, D. Herring, J. Anal. At. Spectrom. 12 (1997) 195]. However, this quantity of material is unlikely to be recovered from a POI. This paper describes the development and application of laser ablation inductively coupled plasma time-of-flight mass spectrometry (LA-ICP-TOF-MS) as an analytical protocol that can be applied more appropriately to the analysis of micro-debris than conventional quadrupole-based mass spectrometry. The resulting data, for debris as small as 70 μm in diameter, were unambiguously matched between a single spherule recovered from a POI and a spherule recovered from the scene of crime, in an analytical procedure taking less than 5 min.
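Matching elemental distribution patterns between two particles is, at its core, a similarity comparison between normalized fingerprints; here is an illustrative cosine-similarity sketch with hypothetical intensities, not the study's matching protocol.

```python
import numpy as np

# Hypothetical normalized elemental fingerprints over a common element menu.
scene_spherule = np.array([0.31, 0.22, 0.18, 0.12, 0.09, 0.08])
poi_spherule   = np.array([0.30, 0.23, 0.17, 0.13, 0.09, 0.08])

cos = scene_spherule @ poi_spherule / (
    np.linalg.norm(scene_spherule) * np.linalg.norm(poi_spherule))
print(f"cosine similarity: {cos:.4f}")   # values near 1 suggest co-provenance
```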
Common aspects influencing the translocation of SERS to Biomedicine.
Gil, Pilar Rivera; Tsouts, Dionysia; Sanles-Sobrido, Marcos; Cabo, Andreu
2018-01-04
In this review, we introduce the reader to the analytical technique of surface-enhanced Raman scattering (SERS), motivated by the great potential we believe this technique has in biomedicine. We present the advantages and limitations of the technique relevant to bioanalysis in vitro and in vivo, and show how it goes beyond the state of the art of traditional analytical, labelling and healthcare diagnosis technologies. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis
NASA Astrophysics Data System (ADS)
Chernoded, Andrey; Dudko, Lev; Myagkov, Igor; Volkov, Petr
2017-10-01
Most modern analyses in high energy physics use signal-versus-background classification techniques from machine learning, and neural networks in particular. Deep learning neural networks are the most promising modern technique for separating signal and background, and nowadays they can be widely and successfully implemented as part of a physics analysis. In this article we compare deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.
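For readers unfamiliar with the approach, the following minimal sketch trains a small feed-forward network as a signal-versus-background classifier on synthetic stand-in features; scikit-learn's MLPClassifier stands in for the deep and Bayesian networks compared in the article, and all data are simulated.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for per-event kinematic features (e.g., jet pT, angles).
X, y = make_classification(n_samples=20000, n_features=10, n_informative=6,
                           random_state=0)  # y=1: "signal", y=0: "background"
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small feed-forward network used as a signal-vs-background classifier.
clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0)
clf.fit(X_train, y_train)

scores = clf.predict_proba(X_test)[:, 1]      # per-event signal probability
print("ROC AUC:", roc_auc_score(y_test, scores))
```

A Bayesian network would additionally place distributions over the weights, yielding calibrated uncertainties on the per-event scores; that is the comparison the article performs.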
Hydrodynamic Simulations of Protoplanetary Disks with GIZMO
NASA Astrophysics Data System (ADS)
Rice, Malena; Laughlin, Greg
2018-01-01
Over the past several decades, the field of computational fluid dynamics has rapidly advanced as the range of available numerical algorithms and computationally feasible physical problems has expanded. The development of modern numerical solvers has provided a compelling opportunity to reconsider previously obtained results in search of yet-undiscovered effects that may be revealed through longer integration times and more precise numerical approaches. In this study, we compare the results of past hydrodynamic disk simulations with those obtained from modern analytical resources. We focus our study on the GIZMO code (Hopkins 2015), which uses meshless methods to solve the homogeneous Euler equations of hydrodynamics while eliminating problems arising from advection between grid cells. By comparing modern simulations with prior results, we hope to provide an improved understanding of the impact of fluid mechanics upon the evolution of protoplanetary disks.
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Surface-Enhanced Raman Spectroscopy.
ERIC Educational Resources Information Center
Garrell, Robin L.
1989-01-01
Reviews the basis for the technique and its experimental requirements. Describes a few examples of the analytical problems to which surface-enhanced Raman spectroscopy (SERS) has been and can be applied. Provides a perspective on the current limitations and frontiers in developing SERS as an analytical technique. (MVL)
Jabłońska-Czapla, Magdalena
2015-01-01
Chemical speciation is a very important subject in environmental protection, toxicology, and chemical analytics because the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex-matrix samples, requires increasingly advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and the underestimated thallium attract the closest attention of toxicologists and analysts. The properties of these elements depend on the oxidation state in which they occur. The aim of this paper is to answer the question of why speciation analytics is so important. The paper also provides numerous examples of hyphenated-technique usage (e.g., the application of LC-ICP-MS in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962
Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali
2016-03-01
The most important objectives frequently encountered in bioanalytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques for the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The survey reveals that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems related to tedious separation techniques, protein binding, and drastic interferences from the complex matrices of real samples such as blood plasma and serum.
NASA Astrophysics Data System (ADS)
Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.
2002-04-01
Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound on the steganographic capacity of LSB-based steganographic techniques for a given probability of false detection. In this paper we look at adaptive steganographic techniques, which take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
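A toy version of content-adaptive embedding may help: the sketch below hides payload bits only in pixels whose local variance is high, leaving statistically fragile flat regions untouched. The 3x3 standard-deviation criterion and threshold are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

def adaptive_lsb_embed(cover: np.ndarray, bits: list, threshold: float = 8.0) -> np.ndarray:
    """Embed bits into the LSBs of pixels lying in 'busy' regions only.

    Adaptivity: a pixel is used only if the local 3x3 standard deviation
    exceeds `threshold`, so flat areas (where LSB changes are easiest to
    detect statistically) are left untouched.
    """
    stego = cover.copy()
    h, w = cover.shape
    k = 0
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            if k == len(bits):
                return stego
            if cover[i - 1:i + 2, j - 1:j + 2].std() > threshold:
                stego[i, j] = (stego[i, j] & 0xFE) | bits[k]  # replace LSB
                k += 1
    if k < len(bits):
        raise ValueError("cover image too smooth for payload")
    return stego

# Hypothetical 8-bit grayscale cover image and an 8-bit payload.
cover = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(np.uint8)
stego = adaptive_lsb_embed(cover, [1, 0, 1, 1, 0, 0, 1, 0])
```

Extraction would re-run the same selection rule on the stego image and read the LSBs in order; a practical scheme must also ensure the selection rule is unchanged by the embedding itself.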
WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL
The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...
NASA Technical Reports Server (NTRS)
Whipple, R. D.
1980-01-01
The potential effectiveness of rockets as an auxiliary means for an aircraft to effect recovery from spins was investigated. The advances in rocket technology produced by the space effort suggested that currently available systems might obviate many of the problems encountered in earlier rocket systems. A modern fighter configuration known to exhibit a flat spin mode was selected. An analytical study was made of the thrust requirements for a rocket spin recovery system for the subject configuration. These results were then applied to a preliminary systems study of rocket components appropriate to the problem. Subsequent spin tunnel tests were run to evaluate the analytical results.
Continuing evolution of in-vitro diagnostic instrumentation
NASA Astrophysics Data System (ADS)
Cohn, Gerald E.
2000-04-01
The synthesis of analytical instrumentation and analytical biochemistry technologies in modern in vitro diagnostic instrumentation continues to generate new systems with improved performance and expanded capability. Detection modalities have expanded to include multichip modes of fluorescence, scattering, luminescence and reflectance so as to accommodate increasingly sophisticated immunochemical and nucleic acid based reagent systems. The time line graph of system development now extends from the earliest automated clinical spectrophotometers through molecule recognition assays and biosensors to the new breakthroughs of biochip and DNA diagnostics. This brief review traces some of the major innovations in the evolution of system technologies and previews the conference program.
Locality-Aware CTA Clustering For Modern GPUs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Ang; Song, Shuaiwen; Liu, Weifeng
2017-04-08
In this paper, we proposed a novel clustering technique for tapping into the performance potential of a largely ignored type of locality: inter-CTA locality. We first demonstrated the capability of the existing GPU hardware to exploit such locality, both spatially and temporally, on L1 or L1/Tex unified cache. To verify the potential of this locality, we quantified its existence in a broad spectrum of applications and discussed its sources of origin. Based on these insights, we proposed the concept of CTA-Clustering and its associated software techniques. Finally, we evaluated these techniques on all modern generations of NVIDIA GPU architectures. The experimental results showed that our proposed clustering techniques could significantly improve on-chip cache performance.
Modern adjuncts and technologies in microsurgery: an historical and evidence-based review.
Pratt, George F; Rozen, Warren M; Chubb, Daniel; Whitaker, Iain S; Grinsell, Damien; Ashton, Mark W; Acosta, Rafael
2010-11-01
While modern reconstructive surgery was revolutionized by the introduction of microsurgical techniques, microsurgery itself has seen the introduction of a range of technological aids and modern techniques aiming to improve dissection times, anastomotic times, and overall outcomes. These include improved preoperative planning, anastomotic aids, and earlier detection of complications with higher salvage rates. Despite the potential for substantial impact, many of these techniques have been evaluated in a limited fashion, and the evidence for each has not been universally explored. The purpose of this review was to establish and quantify the evidence for each technique. A search of relevant medical databases was performed to identify literature providing evidence for each technology. Levels of evidence were thus accumulated and applied to each technique. There is a relative paucity of evidence for many of the more recent technologies described in the field of microsurgery, with no randomized controlled trials, and most studies in the field comprising case series only. Current evidence-based suggestions include the use of computed tomographic angiography (CTA) for the preoperative planning of perforator flaps, the intraoperative use of a mechanical anastomotic coupling aid (particularly the Unilink® coupler), and postoperative flap monitoring with strict protocols using clinical bedside monitoring and/or the implantable Doppler probe. Despite the breadth of technologies introduced into the field of microsurgery, there is substantial variation in the degree of evidence presented for each, suggesting the role for much future research, particularly from emerging technologies such as robotics and modern simulators. Copyright © 2010 Wiley-Liss, Inc.
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics applied to raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and these are examined with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Resonance Ionization, Mass Spectrometry.
ERIC Educational Resources Information Center
Young, J. P.; And Others
1989-01-01
Discussed is an analytical technique that uses photons from lasers to resonantly excite an electron from some initial state of a gaseous atom through various excited states of the atom or molecule. Described are the apparatus, some analytical applications, and the precision and accuracy of the technique. Lists 26 references. (CW)
Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods
ERIC Educational Resources Information Center
Zhang, Ying
2011-01-01
Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…
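As a concrete illustration of the first MASEM stage, the sketch below pools study-level correlation matrices with a sample-size-weighted average. This is the simple univariate pooling; the multivariate methods compared in the dissertation model the full sampling covariance instead. All matrices and sample sizes here are hypothetical.

```python
import numpy as np

# Hypothetical study-level correlation matrices among three variables
# (e.g., predictor, mediator, outcome) and their sample sizes.
R_studies = [
    np.array([[1.0, 0.30, 0.20], [0.30, 1.0, 0.40], [0.20, 0.40, 1.0]]),
    np.array([[1.0, 0.25, 0.15], [0.25, 1.0, 0.35], [0.15, 0.35, 1.0]]),
    np.array([[1.0, 0.35, 0.25], [0.35, 1.0, 0.45], [0.25, 0.45, 1.0]]),
]
n_studies = np.array([120, 250, 80])

# Stage 1: pool correlations element-wise, weighting each study by its N
# (univariate fixed-effects pooling; multivariate GLS pooling would weight
# by the full sampling covariance of the correlations).
weights = n_studies / n_studies.sum()
R_pooled = sum(w * R for w, R in zip(weights, R_studies))
print(np.round(R_pooled, 3))

# Stage 2 would fit a structural equation model to R_pooled, treating the
# total (or harmonic mean) N as the sample size.
```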
A Survey of Architectural Techniques For Improving Cache Power Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
Modern processors are using increasingly larger sized on-chip caches. Also, with each CMOS technology generation, there has been a significant increase in their leakage energy consumption. For this reason, cache power management has become a crucial research issue in modern processor design. To address this challenge and also meet the goals of sustainable computing, researchers have proposed several techniques for improving energy efficiency of cache architectures. This paper surveys recent architectural techniques for improving cache power efficiency and also presents a classification of these techniques based on their characteristics. For providing an application perspective, this paper also reviews several real-world processor chips that employ cache energy saving techniques. The aim of this survey is to enable engineers and researchers to get insights into the techniques for improving cache power efficiency and motivate them to invent novel solutions for enabling low-power operation of caches.
Zakhia, Frédéric; de Lajudie, Philippe
2006-03-01
Taxonomy is the science that studies the relationships between organisms. It comprises classification, nomenclature, and identification. Modern bacterial taxonomy is polyphasic: it is based on several molecular techniques, each retrieving information at a different cellular level (proteins, fatty acids, DNA...). The results obtained are combined and analysed to reach a "consensus taxonomy" for a microorganism. Until 1970, only a small number of classification techniques were available to microbiologists (mainly phenotypic characterization, for example the ability of a Rhizobium to nodulate a given legume species). With the development of characterization techniques based on the polymerase chain reaction, bacterial taxonomy has undergone great changes. In particular, the classification of the legume-nodulating bacteria has been repeatedly modified over the last 20 years. We present here a review of the molecular techniques currently used in bacterial characterization, with examples of the application of these techniques to the study of legume-nodulating bacteria.
Turbine blade tip durability analysis
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.
1981-01-01
An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot-section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle, and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.
Ultra-small dye-doped silica nanoparticles via modified sol-gel technique.
Riccò, R; Nizzero, S; Penna, E; Meneghello, A; Cretaio, E; Enrichi, F
2018-01-01
In modern biosensing and imaging, fluorescence-based methods constitute the most diffused approach to achieve optimal detection of analytes, both in solution and at the single-particle level. Despite the huge progress made in recent decades in the development of plasmonic biosensors and label-free sensing techniques, fluorescent molecules remain the most commonly used contrast agents to date for commercial imaging and detection methods. However, they exhibit low stability, can be difficult to functionalise, and often result in a low signal-to-noise ratio. Thus, embedding fluorescent probes into robust and bio-compatible materials, such as silica nanoparticles, can substantially enhance the detection limit and dramatically increase the sensitivity. In this work, ultra-small fluorescent silica nanoparticles (NPs) for optical biosensing applications were doped with a fluorescent dye, using simple water-based sol-gel approaches based on the classical Stöber procedure. By systematically modulating reaction parameters, controllable size tuning of particle diameters as low as 10 nm was achieved. Particle morphology and optical response were evaluated, showing possible single-molecule behaviour, without employing microemulsion methods to achieve similar results. Graphical abstract: We report a simple, cheap, reliable protocol for the synthesis and systematic tuning of ultra-small (< 10 nm) dye-doped luminescent silica nanoparticles.
NASA Astrophysics Data System (ADS)
Hönicke, Philipp; Krämer, Markus; Lühl, Lars; Andrianov, Konstantin; Beckhoff, Burkhard; Dietsch, Rainer; Holz, Thomas; Kanngießer, Birgit; Weißbach, Danny; Wilhein, Thomas
2018-07-01
With the advent of modern X-ray fluorescence (XRF) methods and improved analytical reliability requirements, the demand for suitable reference samples has increased. Especially in nanotechnology, with its very low areal mass depositions, quantification becomes considerably more difficult. However, the availability of suitable reference samples is drastically lower than the demand. Physical vapor deposition techniques have been enhanced significantly in the last decade, driven by the need for extremely precise film parameters in multilayer production. We have applied those techniques to the development of layer-like reference samples with mass depositions in the ng range and well below for Ca, Cu, Mo, Pd, Pb, La, Fe and Ni; numerous other elements would also be possible. Several types of reference samples were fabricated: multi-elemental layer samples and extremely low (sub-monolayer) samples for various applications in XRF and total-reflection XRF analysis. These samples were characterized and compared at three different synchrotron radiation beamlines at the BESSY II electron storage ring employing the reference-free XRF approach based on physically calibrated instrumentation. In addition, the homogeneity of the multi-elemental coatings was checked at the P04 beamline at DESY. The measurements demonstrate the high precision achieved in the manufacturing process as well as the versatility of application fields for the presented reference samples.
Prospects for Practical Laser Ablation U/Pb and (U-Th)/He Double-Dating (LADD) of Detrital Apatite
NASA Astrophysics Data System (ADS)
Horne, A.; Hodges, K. V.; Van Soest, M. C.
2017-12-01
A laser ablation micro-analytical technique for (U-Th)/He dating has been shown to be an effective approach to the thermochronologic study of detrital zircons (Tripathy-Lang et al., J. Geophys. Res., 2013), while Evans et al. (J. Anal. At. Spectrom., 2015) and Horne et al. (Geochim. Cosmochim. Acta, 2016) demonstrated how the technique could be modified to enable laser ablation U/Pb and (U-Th)/He double-dating (LADD) of detrital zircon and titanite. These successes raise the question of whether LADD is viable for another commonly encountered detrital mineral: apatite. Exploratory LADD studies in Arizona State University's Group 18 Laboratories - using Durango fluorapatite, apatite from the Fish Canyon tuff, and detrital apatite from modern fluvial sediments in the eastern Sierra Nevada of California - illustrate that the method is indeed viable for detrital apatite. However, the method may not be appropriate for all detrital samples. For example, many apatite grains encountered in detrital samples from young orogenic settings have low concentrations of U and Th and small crystal sizes. This can lead to imprecise laser ablation (U-Th)/He dates, especially for very young grains, potentially obscuring or inhibiting relevant interpretations of the data set.
Purdue Rare Isotope Measurement Laboratory
NASA Astrophysics Data System (ADS)
Caffee, M.; Elmore, D.; Granger, D.; Muzikar, P.
2002-12-01
The Purdue Rare Isotope Measurement Laboratory (PRIME Lab) is a dedicated research and service facility for accelerator mass spectrometry (AMS). AMS is an ultra-sensitive analytical technique used to measure low levels of long-lived cosmic-ray-produced and anthropogenic radionuclides, and rare trace elements. We measure 10Be (T1/2 = 1.5 My), 26Al (0.702 My), 36Cl (0.301 My), and 129I (16 My) in geologic samples. Applications include dating the cosmic-ray-exposure time of rocks on Earth's surface, determining rock and sediment burial ages, measuring the erosion rates of rocks and soils, and tracing and dating ground water. We perform sample preparation and separation chemistries for these radionuclides for our internal research activities and for those external researchers not possessing this capability. Our chemical preparation laboratories also serve as training sites for members of the geoscience community developing these techniques at their institutions. Research at Purdue involves collaborators among members of the Purdue Departments of Physics, Earth and Atmospheric Sciences, Chemistry, Agronomy, and Anthropology. We also collaborate with and serve numerous scientists from other institutions. We are currently modernizing the facility with the goals of higher precision for routinely measured radionuclides, increased sample throughput, and the development of new measurement capabilities for the geoscience community.
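The exposure-dating application rests on a standard buildup equation: for a simply exposed, non-eroding surface, N = (P/λ)(1 − e^(−λt)). A minimal sketch of solving this for the exposure age follows, using the 10Be half-life quoted above; the concentration and production rate are invented sample values.

```python
import numpy as np

def exposure_age(N: float, P: float, half_life_yr: float) -> float:
    """Cosmic-ray exposure age (yr) for a simply exposed, non-eroding surface.

    N : measured nuclide concentration (atoms/g)
    P : local production rate (atoms/g/yr)
    Solves N = (P / lam) * (1 - exp(-lam * t)) for t.
    """
    lam = np.log(2) / half_life_yr          # decay constant (1/yr)
    x = 1.0 - N * lam / P
    if x <= 0:
        raise ValueError("concentration at or above saturation")
    return -np.log(x) / lam

# Hypothetical sample: 10Be at 3.0e5 atoms/g, production rate 5 atoms/g/yr.
print(f"{exposure_age(3.0e5, 5.0, 1.5e6):,.0f} yr")
```

Real applications additionally correct for erosion, shielding, and site-specific production-rate scaling.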
Mass spectrometry-driven drug discovery for development of herbal medicine.
Zhang, Aihua; Sun, Hui; Wang, Xijun
2018-05-01
Herbal medicine (HM) has made a major contribution to the drug discovery process with regard to identifying product compounds. Currently, more attention is being focused on drug discovery from the natural compounds of HM. Despite the rapid advancement of modern analytical techniques, drug discovery is still a difficult and lengthy process. Fortunately, mass spectrometry (MS) can provide useful structural information for drug discovery and has been recognized as a sensitive, rapid, and high-throughput technology for advancing drug discovery from HM in the post-genomic era. It is essential to develop an efficient, high-quality, high-throughput screening method integrated with an MS platform for early screening of candidate drug molecules from natural products. We have developed a new chinmedomics strategy reliant on MS that is capable of capturing candidate molecules and facilitating the identification of novel chemical structures in the early phase; chinmedomics-guided natural product discovery based on MS may provide an effective tool that addresses challenges in the early screening of effective constituents of herbs against disease. This critical review covers the use of MS with related techniques and methodologies for natural product discovery, biomarker identification, and determination of mechanisms of action. It also highlights high-throughput chinmedomics screening methods suitable for lead compound discovery, illustrated by recent successes. © 2016 Wiley Periodicals, Inc.
Ethnobotany and Medicinal Plant Biotechnology: From Tradition to Modern Aspects of Drug Development.
Kayser, Oliver
2018-05-24
Secondary natural products from plants are important leads for the development of new drug candidates for rational clinical therapy; they exhibit a variety of biological activities in experimental pharmacology and serve as structural templates in medicinal chemistry. The exploration of plants and the discovery of natural compounds based on ethnopharmacology, in combination with highly sophisticated analytics, remain important drug-discovery approaches for characterizing and validating potential leads. Because of structural complexity, low abundance in biological material, and the high cost of chemical synthesis, alternative production routes such as plant cell cultures, heterologous biosynthesis, and synthetic biotechnology are applied. The basis for any biotechnological process is deep knowledge of the genetic regulation of pathways and of protein expression, viewed through today's "omics" technologies. The large number of genetic techniques has allowed the implementation of combinatorial biosynthesis and wide genome sequencing. Consequently, genetics has enabled the functional expression of biosynthetic cascades from plants and the reconstitution of low-performing pathways in more productive heterologous microorganisms. De novo biosynthesis in heterologous hosts thus requires a fundamental understanding of pathway reconstruction and of a multitude of genes in a foreign organism. Here, current concepts and strategies are discussed for pathway reconstruction, genome sequencing techniques, and cloning tools, to bridge the gap between ethnopharmaceutical drug discovery and industrial biotechnology. Georg Thieme Verlag KG Stuttgart · New York.
Strategies for Fermentation Medium Optimization: An In-Depth Review
Singh, Vineeta; Haque, Shafiul; Niwas, Ram; Srivastava, Akansha; Pasupuleti, Mukesh; Tripathi, C. K. M.
2017-01-01
Optimization of the production medium is required to maximize metabolite yield. This can be achieved using a wide range of techniques, from the classical "one-factor-at-a-time" approach to modern statistical and mathematical techniques such as artificial neural networks (ANN) and genetic algorithms (GA). Every technique comes with its own advantages and disadvantages, and despite drawbacks some techniques are applied to obtain the best results; combinations of optimization techniques can also provide desirable results. In this article an attempt has been made to review the media optimization techniques currently applied during fermentation for metabolite production. A comparative analysis of the merits and demerits of various conventional as well as modern optimization techniques has been carried out, and a logical basis for the design of fermentation media is given. Overall, this review provides the rationale for selecting a suitable optimization technique for media design in the fermentation-based production of metabolites. PMID:28111566
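To make the contrast with one-factor-at-a-time optimization concrete, here is a minimal genetic-algorithm sketch that searches a three-component medium composition against a mock yield function; the yield model, bounds, and GA settings are illustrative assumptions, not anything from the review.

```python
import numpy as np

rng = np.random.default_rng(1)

def yield_model(x):
    """Mock metabolite yield as a function of three medium components
    (e.g., carbon, nitrogen, phosphate sources in g/L); in practice this
    would be an actual fermentation assay."""
    opt = np.array([10.0, 4.0, 1.5])
    return float(np.exp(-np.sum(((x - opt) / opt) ** 2)))

def genetic_algorithm(pop_size=30, generations=40, bounds=(0.0, 20.0)):
    pop = rng.uniform(*bounds, size=(pop_size, 3))
    for _ in range(generations):
        fitness = np.array([yield_model(ind) for ind in pop])
        parents = pop[np.argsort(fitness)][-pop_size // 2:]        # selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(3) < 0.5, a, b)            # crossover
            child += rng.normal(0, 0.5, size=3)                    # mutation
            children.append(np.clip(child, *bounds))
        pop = np.vstack([parents, children])
    best = max(pop, key=yield_model)
    return best, yield_model(best)

best_medium, best_yield = genetic_algorithm()
print("best composition (g/L):", np.round(best_medium, 2),
      "yield:", round(best_yield, 3))
```

Each "generation" here corresponds to a batch of candidate media; the expensive step in reality is evaluating the fitness (running the fermentations), which is why statistical designs are often combined with such search heuristics.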
NASA Astrophysics Data System (ADS)
Mitchell, Justin Chadwick
2011-12-01
Using light to probe the structure of matter is as natural as opening our eyes. Modern physics and chemistry have turned this art into a rich science, measuring the delicate interactions possible at the molecular level. Perhaps the most commonly used tool in computational spectroscopy is matrix diagonalization. While this is invaluable for calculating everything from molecular structure and energy levels to dipole moments and dynamics, the process of numerical diagonalization is an opaque one. This work applies symmetry and semi-classical techniques to elucidate numerical spectral analysis for high-symmetry molecules. Semi-classical techniques, such as potential energy surfaces, have long been used to help understand molecular vibronic and rovibronic spectra and dynamics. This investigation focuses on newer semi-classical techniques that apply rotational energy surfaces (RES) to rotational energy level clustering effects in high-symmetry molecules. Such clusters exist in rigid rotor molecules as well as deformable spherical tops. This study begins by using the simplicity of rigid symmetric top molecules to clarify the classical-quantum correspondence of RES semi-classical analysis and then extends it to a more precise and complete theory of modern high-resolution spectra. RES analysis is extended to molecules having more complex and higher-rank tensorial rotational and rovibrational Hamiltonians than were possible to understand before. Such molecules are shown to produce an extraordinary range of rotational level clusters, corresponding to a panoply of symmetries ranging from C4v to C2 and C1 (no symmetry), with a corresponding range of new angular-momentum localization and J-tunneling effects. Using RES topography analysis and the commutation duality relations between symmetry group operators in the lab frame and those in the body frame, it is shown how to better describe and catalog complex splittings found in rotational level clusters. Symmetry character analysis is generalized to give analytic eigensolutions. An appendix provides vibrational analogies. For the first time, interactions between molecular vibrations (polyads) are described semi-classically by multiple RES. This is done for the ν3/2ν4 dyad of CF4. The nine-surface RES topology of the U(9) dyad agrees with both computational and experimental work. A connection between this and a simpler U(2) example is detailed in an appendix.
Analytical Challenges in Biotechnology.
ERIC Educational Resources Information Center
Glajch, Joseph L.
1986-01-01
Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: a rigorous approach to the analysis of operations functions, use of the resources of a large computer network, and provision for efficient research and access to information.
Chapter 16 - Predictive Analytics for Comprehensive Energy Systems State Estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yingchen; Yang, Rui; Hodge, Brian S
Energy sustainability is a subject of concern to many nations in the modern world. It is critical for electric power systems to diversify energy supply to include systems with different physical characteristics, such as wind energy, solar energy, electrochemical energy storage, thermal storage, bio-energy systems, geothermal, and ocean energy. Each system has its own range of control variables and targets. To be able to operate such a complex energy system, big-data analytics become critical to achieve the goal of predicting energy supplies and consumption patterns, assessing system operation conditions, and estimating system states - all providing situational awareness to power system operators. This chapter presents data analytics and machine learning-based approaches to enable predictive situational awareness of the power systems.
EvoGraph: On-The-Fly Efficient Mining of Evolving Graphs on GPU
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sengupta, Dipanjan; Song, Shuaiwen
With the prevalence of the World Wide Web and social networks, there has been a growing interest in high performance analytics for constantly-evolving dynamic graphs. Modern GPUs provide a massive amount of parallelism for efficient graph processing, but challenges remain due to their lack of support for the near real-time streaming nature of dynamic graphs. Specifically, due to the current high volume and velocity of graph data combined with the complexity of user queries, traditional processing methods that first store the updates and then repeatedly run static graph analytics on a sequence of versions or snapshots are deemed undesirable and computationally infeasible on GPU. We present EvoGraph, a highly efficient and scalable GPU-based dynamic graph analytics framework.
Epilepsy analytic system with cloud computing.
Shen, Chia-Ping; Zhou, Weizhi; Lin, Feng-Seng; Sung, Hsiao-Ya; Lam, Yan-Yu; Chen, Wei; Lin, Jeng-Wei; Pan, Ming-Kai; Chiu, Ming-Jang; Lai, Feipei
2013-01-01
Biomedical data analytic systems have played an important role in clinical diagnosis for several decades. Analyzing these big data to provide decision support for physicians is an emerging research area. This paper presents a parallelized web-based tool with a cloud computing service architecture to analyze epilepsy. Several modern analytic functions are cascaded in the system: wavelet transform, genetic algorithm (GA), and support vector machine (SVM). To demonstrate the effectiveness of the system, it has been verified with two kinds of electroencephalography (EEG) data: short-term EEG and long-term EEG. The results reveal that our approach achieves a total classification accuracy higher than 90%. In addition, the entire training is accelerated about 4.66-fold, and prediction times also meet real-time requirements.
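A compressed sketch of the wavelet-plus-SVM stage of such a pipeline is shown below, using PyWavelets and scikit-learn on synthetic stand-in EEG epochs; the GA hyperparameter-tuning step described in the paper is omitted, and the "seizure" signals are simulated.

```python
import numpy as np
import pywt                                    # PyWavelets
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def wavelet_energy_features(signal, wavelet="db4", level=4):
    """Relative energy of each wavelet subband -- a common EEG feature set."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

# Synthetic stand-ins for 1-second EEG epochs (256 Hz): "seizure" epochs get
# an added low-frequency oscillation. Real recordings would replace this.
t = np.arange(256) / 256.0
normal  = [rng.normal(size=256) for _ in range(100)]
seizure = [rng.normal(size=256) + 2.0 * np.sin(2 * np.pi * 3 * t)
           for _ in range(100)]

X = np.array([wavelet_energy_features(s) for s in normal + seizure])
y = np.array([0] * 100 + [1] * 100)

# The GA in the paper would tune the SVM hyperparameters; fixed values here.
print("CV accuracy:", cross_val_score(SVC(C=1.0, kernel="rbf"), X, y, cv=5).mean())
```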
An analytical and experimental evaluation of a Fresnel lens solar concentrator
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Allums, S. A.; Cosby, R. M.
1976-01-01
An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range was performed. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile-spreading problem and should enable improved analytical-experimental correlation.
Deriving Earth Science Data Analytics Requirements
NASA Technical Reports Server (NTRS)
Kempler, Steven J.
2015-01-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools and techniques that would support specific ESDA type goals. Representative existing data analytics tools and techniques relevant to ESDA will also be addressed.
Assembly of a Vacuum Chamber: A Hands-On Approach to Introduce Mass Spectrometry
ERIC Educational Resources Information Center
Bussie`re, Guillaume; Stoodley, Robin; Yajima, Kano; Bagai, Abhimanyu; Popowich, Aleksandra K.; Matthews, Nicholas E.
2014-01-01
Although vacuum technology is essential to many aspects of modern physical and analytical chemistry, vacuum experiments are rarely the focus of undergraduate laboratories. We describe an experiment that introduces students to vacuum science and mass spectrometry. The students first assemble a vacuum system, including a mass spectrometer. While…
Modern sanitary practices result in large volumes of human waste, as well as domestic and industrial sewage, being collected and treated at common collection points, wastewater treatment plants (WWTP). In recognition of the growing use of sewage sludges as a fertilizers and as so...
ERIC Educational Resources Information Center
Thompson, Robert Q.
1988-01-01
Describes a laboratory exercise in which acid dissociation constants and molecular weights are extracted from sample data and the sample is identified. Emphasizes accurate volumetric work while bringing to practice the concepts of acid-base equilibria, activity coefficients, and thermodynamic constants. (CW)
ERIC Educational Resources Information Center
Reiners, Torsten; Dreher, Heinz
2009-01-01
In modern learning environments, the lecturer or educational designer is often confronted with multi-national student cohorts, requiring special consideration regarding language, cultural norms and taboos, religion, and ethics. Through a somewhat provocative example we demonstrate that taking such factors into account can be essential to avoid…
Material Development Study for a Hazardous Chemical Protective Clothing Outfit
1980-08-01
[Fragment of a chemical compatibility table from the report, listing agents such as vinyl chloride monomer (VCM), Zectran (ZEC), zinc cyanide (ZCN), and zirconium tetrachloride (ZCT) with solubility/reactivity notes, followed by report citations including Environmental Rating of Plastics, Reprints of Design News, R. L. Peters, 1967.]
Understanding Textual Authorship in the Digital Environment: Lessons from Historical Perspectives
ERIC Educational Resources Information Center
Velagic, Zoran; Hasenay, Damir
2013-01-01
Introduction: The paper explains how the modern understanding of authorship developed and sets out the problems to be considered when discussing digital authorship. Method: The contextual analysis of contents of the key themes is employed; in the articulation of the conclusions, analytic and synthetic approaches are used. Results: At each turning…
Interactive Molecular Graphics for Augmented Reality Using HoloLens.
Müller, Christoph; Krone, Michael; Huber, Markus; Biener, Verena; Herr, Dominik; Koch, Steffen; Reina, Guido; Weiskopf, Daniel; Ertl, Thomas
2018-06-13
Immersive technologies like stereo rendering, virtual reality, or augmented reality (AR) are often used in the field of molecular visualisation. Modern, comparatively lightweight and affordable AR headsets like Microsoft's HoloLens open up new possibilities for immersive analytics in molecular visualisation. A crucial factor for a comprehensive analysis of molecular data in AR is the rendering speed. HoloLens, however, has limited hardware capabilities due to requirements like battery life, fanless cooling and weight. Consequently, insights from best practice for powerful desktop hardware may not be transferable. Therefore, we evaluate the capabilities of the HoloLens hardware for modern, GPU-enabled, high-quality rendering methods for the space-filling model commonly used in molecular visualisation. We also assess the scalability for large molecular data sets. Based on the results, we discuss ideas and possibilities for immersive molecular analytics. Besides more obvious benefits like the stereoscopic rendering offered by the device, this specifically includes natural user interfaces that use physical navigation instead of the traditional virtual one. Furthermore, we consider different scenarios for such an immersive system, ranging from educational use to collaborative scenarios.
Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples
NASA Astrophysics Data System (ADS)
Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi
2016-10-01
The current gold standard for detecting or quantifying target analytes in blood samples is the ELISA (enzyme-linked immunosorbent assay), whose detection limit is about 250 pg/ml. However, quantifying analytes related to the various stages of tumors, including early detection, requires detecting levels well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels in early oral cancer patients are <100 pg/ml, and the prostate specific antigen level in the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/mL of analytes in the early stage of tumors. Therefore, depending on the tumor type and stage, it is necessary to quantify analyte levels ranging from ng/ml to pg/ml. To accommodate these critical needs in current diagnosis, a technique is needed that has a large dynamic range with the ability to detect extremely low levels of target analytes…
Steam thermolysis of tire shreds: modernization in afterburning of accompanying gas with waste steam
NASA Astrophysics Data System (ADS)
Kalitko, V. A.
2010-03-01
On the basis of experience with the commercial operation of tire-shred steam thermolysis at EnresTec Inc. (Taiwan), which produces high-grade commercial carbon, liquid pyrolysis fuel, and accompanying fuel gas, we propose a number of engineering solutions, with calculated analytical substantiation, for modernizing and intensifying the process by afterburning the accompanying gas with waste steam that is condensed in the scrubber of the water-based cleaning of the afterburning products. The condensate is completely freed of organic pyrolysis impurities, and the need to separate it from the liquid fuel, as in the existing process, is eliminated.
Evaluation of analytical errors in a clinical chemistry laboratory: a 3 year experience.
Sakyi, As; Laing, Ef; Ephraim, Rk; Asibey, Of; Sadique, Ok
2015-01-01
Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there has been increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle, including the pre-, intra-, and post-analytical phases, and we discuss strategies pertinent to our setting to minimize their occurrence. We describe the occurrence of pre-analytical, analytical, and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January 2010 to December 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). A total of 589,510 tests were performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/589,510). Pre-analytical, analytical, and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests reduced significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) across the years. Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures, including quality assessment programs for staff involved in pre-analytical processes, should be intensified.
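The reported comparison of error rates across years is essentially a test on a contingency table. The sketch below runs a chi-square test with scipy on invented per-year counts (the abstract reports only 3-year totals), just to illustrate the calculation behind a P-value like the reported 0.90.

```python
from scipy.stats import chi2_contingency

# Hypothetical per-year (errors, error-free) counts; the abstract gives only
# 3-year totals, so these numbers are illustrative, not the study's data.
table = [
    [9500, 210000 - 9500],   # year 1
    [9200, 200000 - 9200],   # year 2
    [8820, 179510 - 8820],   # year 3
]
chi2, p, dof, _ = chi2_contingency(table)
# A large p-value indicates no evidence that the error rate changed across
# years, which is the pattern the reported P = 0.90 describes.
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
```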
Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community
2016-01-01
Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the... more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement
40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests
Code of Federal Regulations, 2012 CFR
2012-07-01
... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...
40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests
Code of Federal Regulations, 2011 CFR
2011-07-01
... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...
Analytical aids in land management planning
David R. Betters
1978-01-01
Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
CIEL*a*b* color space predictive models for colorimetry devices--analysis of perfume quality.
Korifi, Rabia; Le Dréau, Yveline; Antinelli, Jean-François; Valls, Robert; Dupuy, Nathalie
2013-01-30
Color perception plays a major role in the consumer evaluation of perfume quality. Consumers first need to be entirely satisfied with the sensory properties of products before other quality dimensions become relevant. Evaluating the color of complex mixtures presents a challenge even for modern analytical techniques. A variety of instruments are available for color measurement; they can be classified as tristimulus colorimeters and spectrophotometers. Obsolescence of the electronics of old tristimulus colorimeters arises from the difficulty of finding repair parts and leads to their replacement by more modern instruments. High quality levels in color measurement, i.e., accuracy and reliability in color control, are the major advantages of the new generation of color instrumentation, the integrating-sphere spectrophotometer. Two models of spectrophotometer were tested in transmittance mode, employing the d/0° geometry. The CIEL*a*b* color space parameters were measured with each instrument for 380 samples of raw materials and bases used in perfume compositions. The results were compared graphically between the colorimeter and the spectrophotometers. All color space parameters obtained with the colorimeter were used as dependent variables to generate regression equations with the values obtained from the spectrophotometers. The data were statistically analyzed to create a predictive model between the reference and target instruments through two methods. The first method uses linear regression analysis and the second consists of partial least squares regression (PLS) on each component. Copyright © 2012 Elsevier B.V. All rights reserved.
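The two regression methods map readings from one instrument onto the other. A minimal scikit-learn sketch follows, fitting both ordinary linear regression and PLS between simulated L*a*b* readings of the same samples on the two instruments; all numbers are synthetic stand-ins for the 380 real samples.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical CIE L*a*b* readings of the same samples on two instruments:
# X from the spectrophotometer (target), Y from the legacy colorimeter
# (reference), related by a small gain/offset plus measurement noise.
X = rng.uniform([20, -40, -40], [95, 40, 40], size=(380, 3))
Y = X @ np.diag([0.98, 1.05, 1.02]) + np.array([1.5, -0.8, 0.4]) \
    + rng.normal(0, 0.3, size=(380, 3))

# Method 1: ordinary linear regression per colour coordinate.
ols = LinearRegression().fit(X, Y)

# Method 2: PLS regression on the full L*a*b* triplet.
pls = PLSRegression(n_components=3).fit(X, Y)

sample = X[:1]
print("OLS-predicted colorimeter reading:", np.round(ols.predict(sample), 2))
print("PLS-predicted colorimeter reading:", np.round(pls.predict(sample), 2))
```

Such models let historical colorimeter specifications remain usable after the instrument itself is retired.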
NASA Astrophysics Data System (ADS)
Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.
2016-09-01
Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings through the ability to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms, and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site and builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology for implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
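One simple form of such threshold prediction is trend extrapolation. The sketch below fits a linear trend to a simulated degradation metric and estimates the remaining samples before an alert threshold is crossed; the metric, threshold, and horizon are invented for illustration and are not the paper's models.

```python
import numpy as np

def predict_threshold_crossing(values, threshold, horizon=100):
    """Fit a linear trend to a degradation metric and estimate how many
    future samples remain before it crosses an alert threshold."""
    t = np.arange(len(values))
    slope, intercept = np.polyfit(t, values, 1)
    if slope <= 0:
        return None                       # no degrading trend detected
    t_cross = (threshold - intercept) / slope
    steps_left = t_cross - (len(values) - 1)
    return steps_left if steps_left <= horizon else None

# Hypothetical sensor-health metric drifting upward with noise.
rng = np.random.default_rng(2)
metric = 0.02 * np.arange(200) + rng.normal(0, 0.5, 200)
print("samples until threshold:", predict_threshold_crossing(metric, threshold=5.0))
```

A production system would wrap this kind of per-metric predictor in the streaming ingest and visualization layers the paper describes.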
Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F
2005-12-08
This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Illidge, Tim, E-mail: Tim.Illidge@ics.manchester.ac.uk; Specht, Lena; Yahalom, Joachim
2014-05-01
Radiation therapy (RT) is the most effective single modality for local control of non-Hodgkin lymphoma (NHL) and is an important component of therapy for many patients. Many of the historic concepts of dose and volume have recently been challenged by the advent of modern imaging and RT planning tools. The International Lymphoma Radiation Oncology Group (ILROG) has developed these guidelines after multinational meetings and analysis of available evidence. The guidelines represent an agreed consensus view of the ILROG steering committee on the use of RT in NHL in the modern era. The roles of reduced volume and reduced doses are addressed, integrating modern imaging with 3-dimensional planning and advanced techniques of RT delivery. In the modern era, in which combined-modality treatment with systemic therapy is appropriate, the previously applied extended-field and involved-field RT techniques that targeted nodal regions have now been replaced by limiting the RT to smaller volumes based solely on detectable nodal involvement at presentation. A new concept, involved-site RT, defines the clinical target volume. For indolent NHL, often treated with RT alone, larger fields should be considered. Newer treatment techniques, including intensity modulated RT, breath holding, image guided RT, and 4-dimensional imaging, should be implemented, and their use is expected to decrease significantly the risk for normal tissue damage while still achieving the primary goal of local tumor control.
NASA Astrophysics Data System (ADS)
Yazdchi, K.; Salehi, M.; Shokrieh, M. M.
2009-03-01
By introducing a new simplified 3D representative volume element for wavy carbon nanotubes, an analytical model is developed to study stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in good agreement with corresponding results for straight nanotubes.
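For orientation, the classical straight-fiber shear-lag (Cox) model below shows the kind of axial-stress profile such pull-out-style analyses produce; it does not capture the waviness effects that are the paper's contribution, and all material parameters are illustrative.

```python
import numpy as np

def cox_shear_lag_axial_stress(x, L, r_f, R, E_f, G_m, applied_strain):
    """Axial fiber stress along a straight fiber of length L (classical Cox
    shear-lag model; illustrative only -- the paper extends pull-out-style
    analysis to *wavy* nanotubes, which this sketch does not capture)."""
    beta = np.sqrt(2.0 * G_m / (E_f * r_f**2 * np.log(R / r_f)))
    return E_f * applied_strain * (
        1.0 - np.cosh(beta * (x - L / 2)) / np.cosh(beta * L / 2))

# Hypothetical values in consistent units (lengths in nm, moduli in GPa).
x = np.linspace(0.0, 1000.0, 5)
sigma = cox_shear_lag_axial_stress(x, L=1000.0, r_f=0.7, R=10.0,
                                   E_f=1000.0, G_m=1.0, applied_strain=0.01)
print(np.round(sigma, 2))   # stress peaks mid-fiber and vanishes at the ends
```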
The use of biochemical methods in extraterrestrial life detection
NASA Astrophysics Data System (ADS)
McDonald, Gene
2006-08-01
Instrument development for in situ extraterrestrial life detection focuses primarily on the ability to distinguish between biological and non-biological material, mostly through chemical analysis for potential biosignatures (e.g., biogenic minerals, enantiomeric excesses). In contrast, biochemical analysis techniques commonly applied to Earth life focus primarily on the exploration of cellular and molecular processes, not on the classification of a given system as biological or non-biological. This focus has developed because of the relatively large functional gap between life and non-life on Earth today. Life on Earth is very diverse from an environmental and physiological point of view, but is highly conserved from a molecular point of view. Biochemical analysis techniques take advantage of this similarity of all terrestrial life at the molecular level, particularly through the use of biologically-derived reagents (e.g., DNA polymerases, antibodies), to enable analytical methods with enormous sensitivity and selectivity. These capabilities encourage consideration of such reagents and methods for use in extraterrestrial life detection instruments. The utility of this approach depends in large part on the (unknown at this time) degree of molecular compositional differences between extraterrestrial and terrestrial life. The greater these differences, the less useful laboratory biochemical techniques will be without significant modification. Biochemistry and molecular biology methods may need to be "de-focused" in order to produce instruments capable of unambiguously detecting a sufficiently wide range of extraterrestrial biochemical systems. Modern biotechnology tools may make that possible in some cases.
Modern quantitative schlieren techniques
NASA Astrophysics Data System (ADS)
Hargather, Michael; Settles, Gary
2010-11-01
Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. The modern equipment that makes this possible, including digital cameras, LED light sources, and computer software, is also discussed.
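To make the data-reduction chain concrete, here is a minimal sketch of the gradient-integration and Gladstone-Dale steps described above, assuming a calibrated one-dimensional gradient profile and air as the test gas; the array contents, pixel spacing, and reference values are illustrative, not taken from the paper.

```python
import numpy as np

# Minimal sketch: recover a density field from a measured refractive-index
# gradient along x, then apply the Gladstone-Dale relation n - 1 = K * rho.

K_AIR = 2.26e-4          # Gladstone-Dale constant for air, m^3/kg (visible light)
N_REF = 1.000277         # refractive index at the undisturbed reference edge

dx = 1e-4                                   # pixel spacing, m (assumed)
dndx = np.random.normal(0, 1e-3, 500)       # stand-in for calibrated schlieren data

# Integrate the gradient outward from the known reference value.
n = N_REF + np.cumsum(dndx) * dx

# Gladstone-Dale gives density; temperature then follows from an equation
# of state, e.g. T = p / (rho * R) for an ideal gas at known pressure.
rho = (n - 1.0) / K_AIR
p, R = 101325.0, 287.05
T = p / (rho * R)
print(f"density range: {rho.min():.4f} .. {rho.max():.4f} kg/m^3")
```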
Hyphenated analytical techniques for materials characterisation
NASA Astrophysics Data System (ADS)
Armstrong, Gordon; Kailas, Lekshmi
2017-09-01
This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combined techniques for chemical analysis were among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties, including mechanical, electrical and thermal behaviour, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to understand the sample in greater detail than was possible before, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples—a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.
Application of contrast media in post-mortem imaging (CT and MRI).
Grabherr, Silke; Grimm, Jochen; Baumann, Pia; Mangin, Patrice
2015-09-01
The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered for the performance of post-mortem angiography. In addition, interpreting the images is challenging due to technique-related and post-mortem artefacts that have to be known and that are specific for each applied technique. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited due to practical issues, and research is now starting in the field of PMMR angiography. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider when using different media.
An Analytical Solution for Transient Thermal Response of an Insulated Structure
NASA Technical Reports Server (NTRS)
Blosser, Max L.
2012-01-01
An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.
Veronesi, Umberto; Martinón-Torres, Marcos
2018-06-18
Glass distillation equipment from an early modern alchemical laboratory was analyzed for its technology of manufacture and potential origin. Chemical data show that the assemblage can be divided into sodium-rich, colorless distillation vessels made with glass from Venice or its European imitation, and potassium-rich dark-brown non-specialized forms produced within the technological tradition of forest glass typical for central and north-western Europe. These results complete our understanding of the supply of technical apparatus at one of the best-preserved alchemical laboratories and highlight an early awareness of the need for high-quality instruments to guarantee the successful outcome of specialized chemical operations. This study demonstrates the potential of archaeological science to inform historical research around the practice of early chemistry and the development of modern science. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Injuries in students of three different dance techniques.
Echegoyen, Soledad; Acuña, Eugenia; Rodríguez, Cristina
2010-06-01
As with any athlete, the dancer has a high risk for injury. Most studies carried out relate to classical and modern dance; however, there is a lack of reports on injuries involving other dance techniques. This study is an attempt to determine the differences in the incidence, the exposure-related rates, and the kinds of injuries in three different dance techniques. A prospective study of dance injuries was carried out between 2004 and 2007 on students of modern, Mexican folkloric, and Spanish dance at the Escuela Nacional de Danza. A total of 1,168 injuries were registered in 444 students; the injury rate was 4 injuries/student for modern dance and 2 injuries/student for Mexican folkloric and Spanish dance. The rate per 1,000 hr of training was 4 injuries for modern, 1.8 for Mexican folkloric, and 1.5 for Spanish dance. The lower extremity was the most frequently injured region (70.47%), and overuse injuries comprised 29% of the total. The most frequent injuries were strain, sprain, back pain, and patellofemoral pain. This study used consistent medical diagnoses of the injuries and is the first attempt in Mexico to compare the incidence of injuries in different dance techniques. To decrease the frequency of student injury, it is important to incorporate prevention programs into dance program curricula. More studies are necessary to define causes and mechanisms of injury, as well as an analysis of training methodology, to decrease the incidence of the muscle imbalances resulting in injury.
Optical trapping for analytical biotechnology.
Ashok, Praveen C; Dholakia, Kishan
2012-02-01
We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single cell or even at subcellular levels which has allowed an insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore we may probe and analyse the biological world when combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Figueroa, M. C.; Gregory, D. D.; Lyons, T. W.; Williford, K. H.
2017-12-01
Life processes affect trace element abundances in pyrite such that sedimentary and hydrothermal pyrite have significantly different trace element signatures. Thus, we propose that these biogeochemical data could be used to identify pyrite that formed biogenically, either early in our planet's history or on other planets, particularly Mars. The potential for this approach is elevated because pyrite is common in diverse sedimentary settings, and its trace element content can be preserved despite secondary overprints up to greenschist facies, thus minimizing the concerns about remobilization that can plague traditional whole rock studies. We are also including in-situ sulfur isotope analysis to further refine our understanding of the complex signatures of ancient pyrite. Sulfur isotope data can point straightforwardly to the involvement of life, because pyrite in sediments is inextricably linked to bacterial sulfate reduction and its diagnostic isotopic expressions. In addition to analyzing pyrite of known biological origin formed in the modern and ancient oceans under a range of conditions, we are building a data set for pyrite formed by hydrothermal and metamorphic processes to minimize the risk of false positives in life detection. We have used Random Forests (RF), a machine learning statistical technique with proven efficiency for classifying large geological datasets, to classify pyrite into biotic and abiotic end members. Coupling the trace element and sulfur isotope data from our analyses with a large existing dataset from diverse settings has yielded 4500 analyses with 18 different variables. Our initial results reveal the promise of the RF approach, correctly identifying biogenic pyrite 97 percent of the time. We will continue to couple new in-situ S-isotope and trace element analyses of biogenic pyrite grains from modern and ancient environments, using cutting-edge microanalytical techniques, with new data from high temperature settings. Our ultimate goal is a refined search tool with straightforward application in the search for early life on Earth and distant life recorded in meteorites, returned samples, and in situ measurements.
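As an illustration of the classification step, below is a sketch of a Random Forest workflow of the kind described, using scikit-learn; the feature names, synthetic log-normal data, and class structure are hypothetical stand-ins for the authors' 4500-analysis, 18-variable dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# A few hypothetical trace-element and S-isotope features per analysis.
features = ["Co", "Ni", "As", "Se", "Mo", "d34S"]
X_biotic = rng.lognormal(mean=1.0, sigma=0.8, size=(n, len(features)))
X_abiotic = rng.lognormal(mean=2.0, sigma=0.8, size=(n, len(features)))
X = np.vstack([X_biotic, X_abiotic])
y = np.array([1] * n + [0] * n)   # 1 = biogenic, 0 = abiotic

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
print("out-of-bag estimate:", clf.oob_score_)
```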
Cost-effectiveness of modern radiotherapy techniques in locally advanced pancreatic cancer.
Murphy, James D; Chang, Daniel T; Abelson, Jon; Daly, Megan E; Yeung, Heidi N; Nelson, Lorene M; Koong, Albert C
2012-02-15
Radiotherapy may improve the outcome of patients with pancreatic cancer but at an increased cost. In this study, the authors evaluated the cost-effectiveness of modern radiotherapy techniques in the treatment of locally advanced pancreatic cancer. A Markov decision-analytic model was constructed to compare the cost-effectiveness of 4 treatment regimens: gemcitabine alone; gemcitabine plus conventional radiotherapy; gemcitabine plus intensity-modulated radiotherapy (IMRT); and gemcitabine with stereotactic body radiotherapy (SBRT). Patients transitioned between the following 5 health states: stable disease, local progression, distant failure, local and distant failure, and death. Health utility tolls were assessed for radiotherapy and chemotherapy treatments and for radiation toxicity. SBRT increased life expectancy by 0.20 quality-adjusted life years (QALY) at an increased cost of $13,700 compared with gemcitabine alone (incremental cost-effectiveness ratio [ICER] = $69,500 per QALY). SBRT was more effective and less costly than conventional radiotherapy and IMRT. An analysis that excluded SBRT demonstrated that conventional radiotherapy had an ICER of $126,800 per QALY compared with gemcitabine alone, and IMRT had an ICER of $1,584,100 per QALY compared with conventional radiotherapy. A probabilistic sensitivity analysis demonstrated that the probability of cost-effectiveness at a willingness to pay of $50,000 per QALY was 78% for gemcitabine alone, 21% for SBRT, 1.4% for conventional radiotherapy, and 0.01% for IMRT. At a willingness to pay of $200,000 per QALY, the probability of cost-effectiveness was 73% for SBRT, 20% for conventional radiotherapy, 7% for gemcitabine alone, and 0.7% for IMRT. The current results indicated that IMRT in locally advanced pancreatic cancer exceeds what society considers cost-effective. In contrast, combining gemcitabine with SBRT increased clinical effectiveness beyond that of gemcitabine alone at a cost potentially acceptable by today's standards. Copyright © 2011 American Cancer Society.
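The reported ratios can be sanity-checked against the ICER definition; the snippet below uses the abstract's rounded figures, so it lands near, not exactly on, the reported $69,500 per QALY.

```python
# ICER = (cost_new - cost_ref) / (QALY_new - QALY_ref). Small rounding in
# the 0.20-QALY gain explains the gap to the reported $69,500/QALY.
delta_cost = 13_700      # SBRT vs gemcitabine alone, USD (from abstract)
delta_qaly = 0.20        # QALYs gained (from abstract, rounded)

icer = delta_cost / delta_qaly
print(f"ICER ~ ${icer:,.0f} per QALY")   # ~ $68,500
```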
8-Channel acquisition system for Time-Correlated Single-Photon Counting.
Antonioli, S; Miari, L; Cuccato, A; Crotti, M; Rech, I; Ghioni, M
2013-06-01
Nowadays, an increasing number of applications require high-performance analytical instruments capable of detecting the temporal trend of weak and fast light signals with picosecond time resolution. The Time-Correlated Single-Photon Counting (TCSPC) technique is currently one of the preferred solutions when such critical optical signals have to be analyzed, and it is fully exploited in biomedical and chemical research fields, as well as in security and space applications. Recent progress in the field of single-photon detector arrays is pushing research towards the development of high-performance multichannel TCSPC systems, opening the way to modern time-resolved multi-dimensional optical analysis. In this paper we describe a new 8-channel high-performance TCSPC acquisition system, designed to be compact and versatile, to be used in modern TCSPC measurement setups. We designed a novel integrated circuit including a multichannel Time-to-Amplitude Converter with variable full-scale range, a D/A converter, and a parallel adder stage. The latter is used to adapt each converter output to the input dynamic range of a commercial 8-channel Analog-to-Digital Converter, while the integrated DAC implements the dithering technique with as small an area occupation as possible. The use of this monolithic circuit made possible the design of a scalable system of very small dimensions (95 × 40 mm) and low power consumption (6 W). Data acquired from the TCSPC measurement are digitally processed and stored inside an FPGA (Field-Programmable Gate Array), while a USB transceiver allows real-time transmission of up to eight TCSPC histograms to a remote PC. Finally, the experimental results demonstrate that the acquisition system performs TCSPC measurements with high conversion rate (up to 5 MHz/channel), extremely low differential nonlinearity (<0.04 peak-to-peak of the time bin width), high time resolution (down to 20 ps Full-Width Half-Maximum), and very low crosstalk between channels.
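For readers unfamiliar with TCSPC, a minimal sketch of the histogram-building principle follows; the lifetime, jitter, and bin width are illustrative assumptions (the 20 ps bin merely echoes the system's quoted resolution), and the code simulates photon arrivals rather than reading real converter output.

```python
import numpy as np

rng = np.random.default_rng(1)

TAU = 2.5e-9        # hypothetical fluorescence lifetime, s
IRF_SIGMA = 20e-12  # instrument response jitter, s
BIN = 20e-12        # histogram bin width, s
RANGE = 25e-9       # assumed full-scale range of the converter, s

# One start-stop time difference per excitation cycle (single-photon regime);
# repeating over many cycles builds a histogram of the optical waveform.
arrivals = rng.exponential(TAU, 1_000_000) + rng.normal(0, IRF_SIGMA, 1_000_000)
edges = np.arange(0, RANGE, BIN)
hist, _ = np.histogram(arrivals, bins=edges)

# The decay constant can be recovered from the histogram's log-slope.
centers = 0.5 * (edges[:-1] + edges[1:])
mask = hist > 100
slope = np.polyfit(centers[mask], np.log(hist[mask]), 1)[0]
print(f"fitted lifetime ~ {-1/slope*1e9:.2f} ns")
```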
Nuclear and atomic analytical techniques in environmental studies in South America.
Paschoa, A S
1990-01-01
The use of nuclear analytical techniques for environmental studies in South America is selectively reviewed, from the early work of Lattes with cosmic rays to recent applications of the PIXE (particle-induced X-ray emission) technique to study air pollution problems in large cities such as São Paulo and Rio de Janeiro. Studies on natural radioactivity and fallout from nuclear weapons in South America are also briefly examined.
Green aspects, developments and perspectives of liquid phase microextraction techniques.
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2014-02-01
Determination of analytes at trace levels in complex samples (e.g. biological or contaminated water or soils) is often required for environmental assessment and monitoring as well as for scientific research in the field of environmental pollution. A limited number of analytical techniques are sensitive enough for the direct determination of trace components in samples and, because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention was given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also, new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the stage of procedure prior to chromatographic determination, are presented. © 2013 Published by Elsevier B.V.
Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A
2011-07-01
Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.
Big Data Analytics with Datalog Queries on Spark.
Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo
2016-01-01
There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
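For context, the kind of recursive query BigDatalog compiles can be written by hand against Spark's plain RDD API; the semi-naive transitive-closure loop below is a sketch of that pattern over an assumed toy edge relation, not the BigDatalog implementation itself.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("tc-sketch").getOrCreate()
sc = spark.sparkContext

edges = sc.parallelize([(1, 2), (2, 3), (3, 4)])      # toy edge relation
closure = edges                                        # tc(X,Y) <- edge(X,Y)
delta = edges                                          # newly derived facts

while True:
    # tc(X,Y) <- tc(X,Z), edge(Z,Y): join new facts against the edges.
    step = (delta.map(lambda p: (p[1], p[0]))          # key tc facts by Z
                 .join(edges)                          # (Z, (X, Y))
                 .map(lambda kv: (kv[1][0], kv[1][1])))
    delta = step.subtract(closure).distinct().cache()
    if delta.isEmpty():                                # fixpoint reached
        break
    closure = closure.union(delta).distinct().cache()

print(sorted(closure.collect()))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```

BigDatalog's contribution is precisely to generate and optimize such fixpoint plans automatically from a declarative Datalog program, rather than requiring this hand-written driver loop.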
ERIC Educational Resources Information Center
Vogt, Frank
2011-01-01
Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…
ERIC Educational Resources Information Center
Griffith, James
2002-01-01
Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…
Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I
2016-10-18
There is an increasing interest in using single particle-inductively coupled plasma mass spectrometry (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved one. While sample dilution is often necessary to achieve the low analyte concentrations necessary for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here to also reduce the dissolved signal relative to the particulate, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.
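A hedged sketch of the usual single-particle data reduction that such a dilution series feeds into is shown below: threshold the dissolved background, convert pulse heights to mass, then to diameter for an assumed spherical silver particle. The sensitivity value, its folding-in of transport efficiency and dwell time, and all signal values are illustrative placeholders, not the authors' calibration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated dwell-time signal trace: Poisson dissolved background plus a
# handful of particle pulses.
signal = rng.poisson(5, 10_000).astype(float)
events = rng.choice(10_000, 50, replace=False)
signal[events] += rng.normal(400, 60, 50)

bg = np.median(signal)
threshold = bg + 5 * np.sqrt(bg)                  # simple 5-sigma criterion
pulses = signal[signal > threshold] - bg

SENS = 3.3e8     # counts per ng of particle mass; hypothetical calibration
                 # folding in dwell time and transport efficiency
RHO = 1.049e-11  # density of silver, ng per nm^3

mass_ng = pulses / SENS                           # mass per particle event
diam_nm = (6 * mass_ng / (np.pi * RHO)) ** (1 / 3)  # spherical assumption
print(f"{len(pulses)} particles, median diameter ~ {np.median(diam_nm):.0f} nm")
```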
Loop shaping design for tracking performance in machine axes.
Schinstock, Dale E; Wei, Zhouhong; Yang, Tao
2006-01-01
A modern interpretation of classical loop shaping control design methods is presented in the context of tracking control for linear motor stages. Target applications include noncontacting machines such as laser cutters and markers, water jet cutters, and adhesive applicators. The methods are directly applicable to the common PID controller and are pertinent to many electromechanical servo actuators other than linear motors. In addition to explicit design techniques a PID tuning algorithm stressing the importance of tracking is described. While the theory behind these techniques is not new, the analysis of their application to modern systems is unique in the research literature. The techniques and results should be important to control practitioners optimizing PID controller designs for tracking and in comparing results from classical designs to modern techniques. The methods stress high-gain controller design and interpret what this means for PID. Nothing in the methods presented precludes the addition of feedforward control methods for added improvements in tracking. Laboratory results from a linear motor stage demonstrate that with large open-loop gain very good tracking performance can be achieved. The resultant tracking errors compare very favorably to results from similar motions on similar systems that utilize much more complicated controllers.
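A minimal discrete PID loop in the spirit of that discussion is sketched below: high proportional gain for tracking, integral action for zero steady-state error, derivative action for phase lead. The gains and the toy first-order axis model are illustrative assumptions, not values from the paper.

```python
def pid_step(err, state, kp, ki, kd, dt):
    """One PID update; state carries the integral and the previous error."""
    integ, prev = state
    integ += err * dt
    deriv = (err - prev) / dt
    return kp * err + ki * integ + kd * deriv, (integ, err)

dt, x, state = 1e-3, 0.0, (0.0, 0.0)
kp, ki, kd = 80.0, 400.0, 0.5        # hypothetical high-gain tuning
for k in range(2000):
    r = 1.0                          # step position command
    u, state = pid_step(r - x, state, kp, ki, kd, dt)
    x += dt * (-5.0 * x + u)         # toy first-order axis model
print(f"tracking error after 2 s: {1.0 - x:.2e}")
```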
Kim, Saewung; Guenther, Alex; Apel, Eric
2013-07-01
The physiological production mechanisms of some of the organics in plants, commonly known as biogenic volatile organic compounds (BVOCs), have been known for more than a century. Some BVOCs are emitted to the atmosphere and play a significant role in tropospheric photochemistry, especially in ozone and secondary organic aerosol (SOA) production, as a result of interplay between BVOCs and atmospheric radicals such as the hydroxyl radical (OH), ozone (O3) and NOX (NO + NO2). These findings have been drawn from comprehensive analysis of numerous field and laboratory studies that have characterized the ambient distribution of BVOCs and their oxidation products, and reaction kinetics between BVOCs and atmospheric oxidants. These investigations are limited by the capacity for identifying and quantifying these compounds. This review highlights the major analytical techniques that have been used to observe BVOCs and their oxidation products, such as gas chromatography, mass spectrometry with hard and soft ionization methods, and optical techniques from laser-induced fluorescence (LIF) to remote sensing. In addition, we discuss how new analytical techniques can advance our understanding of BVOC photochemical processes. The principles, advantages, and drawbacks of the analytical techniques are discussed along with specific examples of how the techniques were applied in field and laboratory measurements. Since a number of thorough review papers on each specific analytical technique are available, readers are referred to those publications rather than provided here with exhaustive descriptions of each technique. Therefore, the aim of this review is for readers to grasp the advantages and disadvantages of various sensing techniques for BVOCs and their oxidation products and to provide guidance for choosing the optimal technique for a specific research task.
Analytical techniques for characterization of cyclodextrin complexes in the solid state: A review.
Mura, Paola
2015-09-10
Cyclodextrins are cyclic oligosaccharides able to form inclusion complexes with a variety of hydrophobic guest molecules, positively modifying their physicochemical properties. A thorough analytical characterization of cyclodextrin complexes is of fundamental importance to provide adequate support in selection of the most suitable cyclodextrin for each guest molecule, and also in view of possible future patenting and marketing of drug-cyclodextrin formulations. The demonstration of the actual formation of a drug-cyclodextrin inclusion complex in solution does not guarantee its existence also in the solid state. Moreover, the technique used to prepare the solid complex can strongly influence the properties of the final product. Therefore, an appropriate characterization of the drug-cyclodextrin solid systems obtained also has a key role in driving the choice of the most effective preparation method, able to maximize host-guest interactions. The analytical characterization of drug-cyclodextrin solid systems and the assessment of actual inclusion complex formation is not a simple task and involves the combined use of several analytical techniques, whose results have to be evaluated together. The objective of the present review is to present a general prospect of the principal analytical techniques which can be employed for a suitable characterization of drug-cyclodextrin systems in the solid state, evidencing their respective potential advantages and limits. The applications of each examined technique are described and discussed by pertinent examples from the literature. Copyright © 2015 Elsevier B.V. All rights reserved.
Crossroads: Modern Interactive Intersections and Accessible Pedestrian Signals
ERIC Educational Resources Information Center
Barlow, Janet M.; Franck, Lukas
2005-01-01
This article discusses the interactive nature of modern actuated intersections and the effect of that interface on pedestrians who are visually impaired. Information is provided about accessible pedestrian signals (APS), the role of blindness professionals in APS installation decisions, and techniques for crossing streets with APS.
Sample preparation for the analysis of isoflavones from soybeans and soy foods.
Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A
2009-01-02
This manuscript provides a review of the current state and the most recent advances, as well as current trends and future prospects, in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. Individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures, are discussed. The most commonly used methods for extraction of isoflavones with both conventional and "modern" techniques are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high performance liquid chromatography are also covered.
NASA Astrophysics Data System (ADS)
Wang, Juan; Wang, Jian; Li, Lijuan; Zhou, Kun
2014-08-01
To address information fusion, process integration, and collaborative design and manufacturing for ultra-precision optical elements within life-cycle management, this paper presents a digital management platform based on product data and business processes that draws on modern manufacturing, information, and management techniques. The architecture and system integration of the digital management platform are discussed in this paper. The platform can realize information sharing and interaction across the information flow, control flow, and value stream over the whole life cycle, from users' needs onward, and it can also enhance process control, collaborative research, and the service capability of ultra-precision optical elements.
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2014-01-01
Monitoring the misuse of drugs and the abuse of substances and methods potentially or evidently improving athletic performance by analytical chemistry strategies is one of the main pillars of modern anti-doping efforts. Owing to the continuously growing knowledge in medicine, pharmacology, and (bio)chemistry, new chemical entities are frequently established and developed, several of which present a temptation for sportsmen and women due to assumed/attributed beneficial effects of such substances and preparations on, for example, endurance, strength, and regeneration. By means of new technologies, expanded existing test protocols, and new insights into the metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA), analytical assays have been further improved in agreement with the content of the 2013 Prohibited List. In this annual banned-substance review, literature concerning human sports drug testing that was published between October 2012 and September 2013 is summarized and reviewed, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2013 John Wiley & Sons, Ltd.
Chlordane is a polychlorinated mixture that was used as a long-lived pesticide and now is considered a potential endocrine-disrupting compound. The Environmental Sciences Division is involved in modernizing methods for a number of analytes that are potential target substances for...
Planning and Evaluation of New Academic Library Services by Means of Web-Based Conjoint Analysis
ERIC Educational Resources Information Center
Decker, Reinhold; Hermelbracht, Antonia
2006-01-01
New product development is an omnipresent challenge to modern libraries in the information age. Therefore, we present the design and selected results of a comprehensive research project aiming at the systematic and user-oriented planning of academic library services by means of conjoint analysis. The applicability of the analytical framework used…
Exploring Conics: Why Does B Squared - 4AC Matter?
ERIC Educational Resources Information Center
Herman, Marlena
2012-01-01
The Ancient Greeks studied conic sections from a geometric point of view--by cutting a cone with a plane. Later, Apollonius (ca. 262-190 BCE) obtained the conic sections from one right double cone. The modern approach to the study of conics can be considered "analytic geometry," in which conic sections are defined in terms of distance…
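For reference, the discriminant test the title alludes to can be stated compactly for the general second-degree equation; the classification below covers the non-degenerate cases and is standard analytic geometry rather than a quotation from the article.

```latex
% General conic: Ax^2 + Bxy + Cy^2 + Dx + Ey + F = 0
\[
B^2 - 4AC
\begin{cases}
< 0 & \text{ellipse (a circle if } B = 0 \text{ and } A = C\text{)}\\
= 0 & \text{parabola}\\
> 0 & \text{hyperbola}
\end{cases}
\]
% The quantity B^2 - 4AC is invariant under rotation of axes, which is
% why it classifies the curve regardless of the xy cross term.
```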
ERIC Educational Resources Information Center
Xiao, Hui
2009-01-01
This project stands at the juncture of modern Chinese literature, post-socialist studies, cultural history of divorce, and critical studies about global middle-class cultures. Employing analytical tools mainly from literary studies, cultural studies and feminist theories, I examine stories, novels, films and TV dramas about divorce produced…
Use of NMR and NMR Prediction Software to Identify Components in Red Bull Energy Drinks
ERIC Educational Resources Information Center
Simpson, Andre J.; Shirzadi, Azadeh; Burrow, Timothy E.; Dicks, Andrew P.; Lefebvre, Brent; Corrin, Tricia
2009-01-01
A laboratory experiment designed as part of an upper-level undergraduate analytical chemistry course is described. Students investigate two popular soft drinks (Red Bull Energy Drink and sugar-free Red Bull Energy Drink) by NMR spectroscopy. With assistance of modern NMR prediction software they identify and quantify major components in each…
Human Capital and the Internal Rate of Return.
ERIC Educational Resources Information Center
Rosen, Sherwin
The theory of human capital has made a significant impact on the practice of modern labor economics. At a broad and general level, the concept of human capital has obvious appeal for its simplicity, analytical power, and relationship to economic theory. The fundamental problem in labor economics is the determination of wage rates and earnings;…
Respectability and Relevance: Reflections on Richard Peters and Analytic Philosophy of Education
ERIC Educational Resources Information Center
Snook, Ivan
2013-01-01
I argue that, after Dewey, Peters was the first modern philosopher of education to write material (in English) that was both philosophically respectable and relevant to the day-to-day concerns of teachers. Since then, some philosophers of education have remained (more or less) relevant but not really respectable while others have "taken off into…
Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R
2016-01-21
The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles, using ICP-MS but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Schutz, Bob E.
1993-01-01
Satellite Laser Ranging (SLR) has a rich history of development which began in the 1960s with 10-meter-level first-generation systems. These systems evolved with order-of-magnitude improvements to the systems that now produce several-millimeter single-shot range precisions. What began, in part, as an interesting application of the new laser technology has become an essential component of modern, precision space geodesy, which in turn enables contributions to a variety of science areas. Modern space geodesy is the beneficiary of technological developments which have enabled precision geodetic measurements. Aside from SLR and its closely related technique, Lunar Laser Ranging (LLR), Very Long Baseline Interferometry (VLBI) has made prominent science contributions also. In recent years, the Global Positioning System (GPS) has demonstrated a rapidly growing popularity as the result of demonstrated low cost with high precision instrumentation. Other modern techniques such as DORIS have demonstrated the ability to make significant science contributions; furthermore, PRARE can be expected to contribute in its own right. An appropriate question is 'why should several techniques be financially supported?' While there are several answers, I offer the opinion that, in consideration of the broad science areas that are the beneficiaries of space geodesy, no single technique can meet all the requirements and/or expectations of the science areas in which space geodesy contributes or has the potential for contributing. The more well-known science areas include plate tectonics, earthquake processes, Earth rotation/orientation, gravity (static and temporal), ocean circulation, and land and ice topography, to name a few applications. It is unfortunate that the modern space geodesy techniques are often viewed as competitive, but this view is usually encouraged by funding competition, especially in an era of growing needs but diminishing budgets. The techniques are, for the most part, complementary, and the ability to reduce the data to geodetic parameters from several techniques promotes confidence in the geophysical interpretations. In the following sections, the current SLR applications are reviewed in the context of the other techniques. The strengths and limitations of SLR are reviewed and speculation about the future prospects is offered.
Applications of everyday IT and communications devices in modern analytical chemistry: A review.
Grudpan, Kate; Kolev, Spas D; Lapanantnopakhun, Somchai; McKelvie, Ian D; Wongwilai, Wasin
2015-05-01
This paper reviews the development and recent use of everyday communications and IT equipment (mobile phones, digital cameras, scanners, webcams, etc.) as detection devices for colorimetric chemistries. Such devices can readily be applied for visible detection using reaction formats such as microfluidic paper-based analytical devices (µPADs), indicator papers, and well-plate reaction vessels. Their use is highly advantageous with respect to cost, simplicity and portability, and offers many opportunities in the areas of point-of-care diagnosis and at-site monitoring of environmental, agricultural, food and beverage parameters. Copyright © 2015 Elsevier B.V. All rights reserved.
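As a sketch of how such devices are used quantitatively, the snippet below averages one color channel over a photographed µPAD detection zone and applies a linear calibration; the file name, image region, channel choice, and calibration points are all hypothetical.

```python
import numpy as np
from PIL import Image

# Hedged sketch of camera-based colorimetry: map the mean channel intensity
# of the detection zone to concentration via a Beer-Lambert-like fit.
img = np.asarray(Image.open("upad_spot.jpg").convert("RGB"), dtype=float)
spot = img[200:260, 300:360, 1]              # green channel of the spot (assumed)
signal = -np.log10(spot.mean() / 255.0)      # absorbance-like quantity

# Two-point calibration from hypothetical standards (concentration in mg/L).
conc_std = np.array([0.0, 10.0])
sig_std = np.array([0.02, 0.35])
slope, intercept = np.polyfit(sig_std, conc_std, 1)
print(f"estimated concentration: {slope * signal + intercept:.1f} mg/L")
```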
Functionalized xenon as a biosensor
Spence, Megan M.; Rubin, Seth M.; Dimitrov, Ivan E.; Ruiz, E. Janette; Wemmer, David E.; Pines, Alexander; Yao, Shao Qin; Tian, Feng; Schultz, Peter G.
2001-01-01
The detection of biological molecules and their interactions is a significant component of modern biomedical research. In current biosensor technologies, simultaneous detection is limited to a small number of analytes by the spectral overlap of their signals. We have developed an NMR-based xenon biosensor that capitalizes on the enhanced signal-to-noise, spectral simplicity, and chemical-shift sensitivity of laser-polarized xenon to detect specific biomolecules at the level of tens of nanomoles. We present results using xenon “functionalized” by a biotin-modified supramolecular cage to detect biotin–avidin binding. This biosensor methodology can be extended to a multiplexing assay for multiple analytes. PMID:11535830
Accuracy of trace element determinations in alternate fuels
NASA Technical Reports Server (NTRS)
Greenbauer-Seng, L. A.
1980-01-01
NASA-Lewis Research Center's work on accurate measurement of trace levels of metals in various fuels is presented. The differences between laboratories and between analytical techniques, especially for concentrations below 10 ppm, are discussed, detailing the Atomic Absorption Spectrometry (AAS) and DC Arc Emission Spectrometry (dc arc) techniques used by NASA-Lewis. Also presented is the design of an interlaboratory study which considers the following factors: laboratory, analytical technique, fuel type, concentration, and ashing additive.
Evaluation of Analytical Errors in a Clinical Chemistry Laboratory: A 3 Year Experience
Sakyi, AS; Laing, EF; Ephraim, RK; Asibey, OF; Sadique, OK
2015-01-01
Background: Proficient laboratory service is the cornerstone of modern healthcare systems and has an impact on over 70% of medical decisions on admission, discharge, and medications. In recent years, there is an increasing awareness of the importance of errors in laboratory practice and their possible negative impact on patient outcomes. Aim: We retrospectively analyzed data spanning a period of 3 years on analytical errors observed in our laboratory. The data covered errors over the whole testing cycle including pre-, intra-, and post-analytical phases and discussed strategies pertinent to our settings to minimize their occurrence. Materials and Methods: We described the occurrence of pre-analytical, analytical and post-analytical errors observed at the Komfo Anokye Teaching Hospital clinical biochemistry laboratory during a 3-year period from January, 2010 to December, 2012. Data were analyzed with GraphPad Prism 5 (GraphPad Software Inc., CA, USA). Results: A total of 589,510 tests was performed on 188,503 outpatients and hospitalized patients. The overall error rate for the 3 years was 4.7% (27,520/589,510). Pre-analytical, analytical and post-analytical errors contributed 3.7% (2210/58,950), 0.1% (108/58,950), and 0.9% (512/58,950), respectively. The number of tests decreased significantly over the 3-year period, but this did not correspond with a reduction in the overall error rate (P = 0.90) along the years. Conclusion: Analytical errors are embedded within our total process setup, especially the pre-analytical and post-analytical phases. Strategic measures including quality assessment programs for staff involved in pre-analytical processes should be intensified. PMID:25745569
MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION
The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...
Product identification techniques used as training aids for analytical chemists
NASA Technical Reports Server (NTRS)
Grillo, J. P.
1968-01-01
Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.
A Simple Laser Microphone for Classroom Demonstration
ERIC Educational Resources Information Center
Moses, James M.; Trout, K. P.
2006-01-01
Communication through the modulation of electromagnetic radiation has become a foundational technique in modern technology. In this paper we discuss a modern day method of eavesdropping based upon the modulation of laser light reflected from a window pane. A simple and affordable classroom demonstration of a "laser microphone" is…
Boublík, Milan; Riesová, Martina; Dubský, Pavel; Gaš, Bohuslav
2018-06-01
Conductivity detection is a universal detection technique often encountered in electrophoretic separation systems, especially in modern chip-electrophoresis based devices. On the other hand, it is sparsely combined with another contemporary trend of enhancing limits of detection by means of various preconcentration strategies. This can be attributed to the fact that a preconcentration experimental setup usually brings about disturbances in the conductivity baseline. Sweeping with a neutral sweeping agent seems a good candidate for overcoming this problem. A neutral sweeping agent does not hinder conductivity detection, while a charged analyte may preconcentrate on its boundary due to a decrease in its effective mobility. This study investigates such sweeping systems theoretically, by means of computer simulations, and experimentally. A formula is provided for the reliable estimation of the preconcentration factor. Additionally, it is demonstrated that the conductivity signal can significantly benefit from slowing down the analyte, and thus the overall signal enhancement can easily outweigh the amplification caused solely by the sweeping process. The overall enhancement factor can be deduced a priori from the linearized theory of electrophoresis implemented in the PeakMaster freeware. Sweeping by a neutral cyclodextrin is demonstrated by amplifying the conductivity signal of flurbiprofen in a real drug sample. Finally, the possible formation of unexpected system peaks in systems with a neutral sweeping agent is revealed by computer simulation and confirmed experimentally. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Van Bockstaele, Femke; Janssens, Ann; Piette, Anne; Callewaert, Filip; Pede, Valerie; Offner, Fritz; Verhasselt, Bruno; Philippé, Jan
2006-07-15
ZAP-70 has been proposed as a surrogate marker for immunoglobulin heavy-chain variable region (IgV(H)) mutation status, which is known as a prognostic marker in B-cell chronic lymphocytic leukemia (CLL). The flow cytometric analysis of ZAP-70 suffers from difficulties in standardization and interpretation. We applied the Kolmogorov-Smirnov (KS) statistical test to make analysis more straightforward. We examined ZAP-70 expression by flow cytometry in 53 patients with CLL. Analysis was performed as initially described by Crespo et al. (New England J Med 2003; 348:1764-1775) and alternatively by application of the KS statistical test comparing T cells with B cells. Receiver-operating-characteristics (ROC)-curve analyses were performed to determine the optimal cut-off values for ZAP-70 measured by the two approaches. ZAP-70 protein expression was compared with ZAP-70 mRNA expression measured by a quantitative PCR (qPCR) and with the IgV(H) mutation status. Both flow cytometric analyses correlated well with the molecular technique and proved to be of equal value in predicting the IgV(H) mutation status. Applying the KS test is reproducible, simple, straightforward, and overcomes a number of difficulties encountered in the Crespo-method. The KS statistical test is an essential part of the software delivered with modern routine analytical flow cytometers and is well suited for analysis of ZAP-70 expression in CLL. (c) 2006 International Society for Analytical Cytology.
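The statistical core of the method is a two-sample KS comparison, which in Python reduces to a few lines; the log-normal intensities below are simulated stand-ins for T-cell and B-cell fluorescence lists exported from a cytometer, not patient data.

```python
import numpy as np
from scipy.stats import ks_2samp

# Compare the ZAP-70 fluorescence distribution of a patient's B cells
# against the internal T-cell reference population.
rng = np.random.default_rng(3)
t_cells = rng.lognormal(mean=3.0, sigma=0.5, size=5000)   # ZAP-70 bright
b_cells = rng.lognormal(mean=2.4, sigma=0.5, size=5000)   # patient B cells

stat, p = ks_2samp(t_cells, b_cells)
# The D statistic (maximum CDF separation) serves as the ZAP-70 metric; a
# cut-off chosen by ROC analysis would then dichotomize patients.
print(f"KS D = {stat:.3f}, p = {p:.2e}")
```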
Estimating true evolutionary distances under the DCJ model.
Lin, Yu; Moret, Bernard M E
2008-07-01
Modern techniques can yield the ordering and strandedness of genes on each chromosome of a genome; such data already exists for hundreds of organisms. The evolutionary mechanisms through which the set of the genes of an organism is altered and reordered are of great interest to systematists, evolutionary biologists, comparative genomicists and biomedical researchers. Perhaps the most basic concept in this area is that of evolutionary distance between two genomes: under a given model of genomic evolution, how many events most likely took place to account for the difference between the two genomes? We present a method to estimate the true evolutionary distance between two genomes under the 'double-cut-and-join' (DCJ) model of genome rearrangement, a model under which a single multichromosomal operation accounts for all genomic rearrangement events: inversion, transposition, translocation, block interchange and chromosomal fusion and fission. Our method relies on a simple structural characterization of a genome pair and is both analytically and computationally tractable. We provide analytical results to describe the asymptotic behavior of genomes under the DCJ model, as well as experimental results on a wide variety of genome structures to exemplify the very high accuracy (and low variance) of our estimator. Our results provide a tool for accurate phylogenetic reconstruction from multichromosomal gene rearrangement data as well as a theoretical basis for refinements of the DCJ model to account for biological constraints. All of our software is available in source form under GPL at http://lcbb.epfl.ch.
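For background, the parsimony (minimal) DCJ distance that this true-distance estimator generalizes is computed from the adjacency graph of the genome pair; the formula below is standard in the DCJ literature rather than quoted from this abstract.

```latex
% For two genomes sharing $N$ genes, whose adjacency graph decomposes
% into $C$ cycles and $I$ odd paths, the minimal DCJ distance is
\[
d_{\mathrm{DCJ}} = N - \left( C + \frac{I}{2} \right).
\]
% The paper's contribution is estimating the *true* number of events,
% which exceeds this parsimony distance once successive rearrangements
% begin to overwrite one another's traces.
```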
Kylander, M E; Weiss, D J; Jeffries, T E; Kober, B; Dolgopolova, A; Garcia-Sanchez, R; Coles, B J
2007-01-16
An analytical protocol for rapid and reliable laser ablation-quadrupole (LA-Q)- and multi-collector (MC-) inductively coupled plasma-mass spectrometry (ICP-MS) analysis of Pb isotope ratios ((207)Pb/(206)Pb and (208)Pb/(206)Pb) in peats and lichens is developed. This technique is applicable to source tracing atmospheric Pb deposition in biomonitoring studies and sample screening. Reference materials and environmental samples were dry ashed and pressed into pellets for introduction by laser ablation. No binder was used to reduce contamination. LA-MC-ICP-MS internal and external precisions were <1.1% and <0.3%, respectively, on both (207)Pb/(206)Pb and (208)Pb/(206)Pb ratios. LA-Q-ICP-MS internal precisions on (207)Pb/(206)Pb and (208)Pb/(206)Pb ratios were lower with values for the different sample sets <14.3% while external precisions were <2.9%. The level of external precision acquired in this study is high enough to distinguish between most modern Pb sources. LA-MC-ICP-MS measurements differed from thermal ionisation mass spectrometry (TIMS) values by 1% or less while the accuracy obtained using LA-Q-ICP-MS compared to solution MC-ICP-MS was 3.1% or better using a run bracketing (RB) mass bias correction method. Sample heterogeneity and detector switching when measuring (208)Pb by Q-ICP-MS are identified as sources of reduced analytical performance.
Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hakkila, E.A.
1978-10-01
Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.
Independent Research and Independent Exploratory Development Annual Report Fiscal Year 1975
1975-09-01
[Fragments of the report's table of contents and publications list; legible entries include "Optical Covert Communications Using Laser Transceivers" and Wagner, N. K., "Analysis of Microelectronic Materials Using Auger Spectroscopy and Additional Advanced Analytical Techniques," NELC Technical Note 2904.]
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis techniques are then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
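The accuracy check described here, comparing analytic sensitivities against finite differences, looks schematically like the following (a generic stand-in objective, not the report's aerodynamic solver):

```python
# Generic illustration of verifying an analytic design sensitivity against
# a central finite-difference approximation, the accuracy check performed
# for the semi-analytical procedure above.

import math

def response(x):               # stand-in objective, e.g. a drag-like metric
    return x**2 * math.exp(-x)

def analytic_sensitivity(x):   # hand-derived d(response)/dx
    return (2 * x - x**2) * math.exp(-x)

def fd_sensitivity(f, x, h=1e-6):  # central finite difference
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 1.3
print(analytic_sensitivity(x0), fd_sensitivity(response, x0))
# Close agreement builds confidence in the analytic derivative, while the
# FD check costs two extra function (flow) solutions per design variable --
# the cost the semi-analytical technique avoids.
```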
Islas, Gabriela; Hernandez, Prisciliano
2017-01-01
To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027
Modern Observational Techniques for Comets
NASA Technical Reports Server (NTRS)
Brandt, J. C. (Editor); Greenberg, J. M. (Editor); Donn, B. (Editor); Rahe, J. (Editor)
1981-01-01
Techniques are discussed in the following areas: astrometry, photometry, infrared observations, radio observations, spectroscopy, imaging of the coma and tail, and image processing of observations. The determination of the chemical composition and physical structure of comets is highlighted.
Eppinger, Robert G.; Giles, Stuart A.; Lee, Gregory K.; Smith, Steven M.
2015-01-01
The geochemical sample media collected by the BGS and BRGM under the PRISM-I contract included rock, sediment, regolith, and soil samples. Details on sample collection procedures are in unpublished reports available from PRISM. These samples were analyzed under PRISM-I contract by ALS Chemex Laboratories using various combinations of modern methods including fire-assay inductively coupled plasma-atomic emission spectrometry (ICP-AES) and ICP-mass spectrometry (ICP-MS) for Au; multi-acid digestion, atomic absorption spectroscopy (AAS) for Ag and As; 47-element, four-acid digestion, ICP-MS; 27-element, four-acid digestion, ICP-AES; special four-acid ICP-MS techniques for Pt and B; fire assay followed by ICP-AES for platinum-group elements; whole-rock analyses by wavelength dispersive X-ray fluorescence (XRF); special techniques for loss-on-ignition, inorganic C, and total S; and special ore-grade AAS techniques for Ag, Au, Cu, Ni, Pb, and Zn. Around 30,000 samples were analyzed by at least one technique. However, it is stressed here that: (1) there was no common sample medium collected at all sites, likely due to the vast geological and geomorphologic differences across the country, (2) the sample site distribution is very irregular, likely due in part to access constraints and sand dune cover, and (3) there was no common across-the-board trace element analytical package used for all samples. These three aspects fundamentally affect the ability to produce country-wide geochemical maps of Mauritania. Gold (Au), silver (Ag), and arsenic (As) were the three elements that were most commonly analyzed.
From experimental imaging techniques to virtual embryology.
Weninger, Wolfgang J; Tassy, Olivier; Darras, Sébastien; Geyer, Stefan H; Thieffry, Denis
2004-01-01
Modern embryology increasingly relies on descriptive and functional three-dimensional (3D) and four-dimensional (4D) analysis of physically, optically, or virtually sectioned specimens. To cope with the technical requirements, new methods for highly detailed in vivo imaging, as well as for the generation of high-resolution digital volume data sets for the accurate visualisation of transgene activity and gene product presence in the context of embryo morphology, were recently developed or are under development. These methods profoundly change the scientific applicability, appearance and style of modern embryo representations. In this paper, we present an overview of the emerging techniques to create, visualise and administrate embryo representations (databases, digital data sets, 3-4D embryo reconstructions, models, etc.), and discuss the implications of these new methods for the work of modern embryologists, including research, teaching, the selection of specific model organisms, and potential collaborators.
NASA Technical Reports Server (NTRS)
Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.
1985-01-01
An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.
Airborne chemistry: acoustic levitation in chemical analysis.
Santesson, Sabina; Nilsson, Staffan
2004-04-01
This review with 60 references describes a unique path to miniaturisation, that is, the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitation of small volumes of sample by means of a levitation technique can be used as a way to avoid solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 µL volume range and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.
Odour Detection Methods: Olfactometry and Chemical Sensors
Brattoli, Magda; de Gennaro, Gianluigi; de Pinto, Valentina; Loiotile, Annamaria Demarinis; Lovascio, Sara; Penza, Michele
2011-01-01
The complexity of the odours issue arises from the sensory nature of smell. From the evolutionary point of view olfaction is one of the oldest senses, allowing for seeking food, recognizing danger or communication: human olfaction is a protective sense as it allows the detection of potential illnesses or infections by taking into account the odour pleasantness/unpleasantness. Odours are mixtures of light and small molecules that, coming in contact with various human sensory systems, also at very low concentrations in the inhaled air, are able to stimulate an anatomical response: the experienced perception is the odour. Odour assessment is a key point in some industrial production processes (i.e., food, beverages, etc.) and it is acquiring steady importance in unusual technological fields (i.e., indoor air quality); this issue mainly concerns the environmental impact of various industrial activities (i.e., tanneries, refineries, slaughterhouses, distilleries, civil and industrial wastewater treatment plants, landfills and composting plants) as sources of olfactory nuisances, the top air pollution complaint. Although the human olfactory system is still regarded as the most important and effective “analytical instrument” for odour evaluation, the demand for more objective analytical methods, along with the discovery of materials with chemo-electronic properties, has boosted the development of sensor-based machine olfaction potentially imitating the biological system. This review examines the state of the art of both human and instrumental sensing currently used for the detection of odours. The olfactometric techniques employing a panel of trained experts are discussed and the strong and weak points of odour assessment through human detection are highlighted. The main features and the working principles of modern electronic noses (E-Noses) are then described, focusing on their better performances for environmental analysis. Odour emission monitoring carried out through both the techniques is finally reviewed in order to show the complementary responses of human and instrumental sensing. PMID:22163901
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosarge, Christina L., E-mail: cbosarge@umail.iu.edu; Ewing, Marvene M.; DesRosiers, Colleen M.
To demonstrate the dosimetric advantages and disadvantages of standard anteroposterior-posteroanterior (S-AP/PA_AAA), inverse-planned AP/PA (IP-AP/PA) and volumetric-modulated arc (VMAT) radiotherapies in the treatment of children undergoing whole-lung irradiation. Each technique was evaluated by means of target coverage and normal tissue sparing, including data regarding low doses. A historical approach with and without tissue heterogeneity corrections is also demonstrated. Computed tomography (CT) scans of 10 children scanned from the neck to the reproductive organs were used. For each scan, 6 plans were created: (1) S-AP/PA_AAA using the anisotropic analytical algorithm (AAA), (2) IP-AP/PA, (3) VMAT, (4) S-AP/PA_NONE without heterogeneity corrections, (5) S-AP/PA_PB using the Pencil-Beam algorithm and enforcing monitor units from technique 4, and (6) S-AP/PA_AAA[FM] using AAA and forcing fixed monitor units. The first 3 plans compare modern methods and were evaluated based on target coverage and normal tissue sparing. Body maximum and lower body doses (50% and 30%) were also analyzed. Plans 4 to 6 provide a historic view of the progression of heterogeneity algorithms and elucidate what was actually delivered in the past. Averages of each comparison parameter were calculated for all techniques. The S-AP/PA_AAA technique resulted in superior target coverage but had the highest maximum dose to every normal tissue structure. The IP-AP/PA technique provided the lowest doses to the esophagus, stomach, and lower body. VMAT excelled at body maximum dose and maximum doses to the heart, spine, and spleen, but resulted in the highest dose in the 30% body range. It was, however, superior to the S-AP/PA_AAA approach in the 50% range. Each approach thus has associated strengths and weaknesses. Techniques may be selected on a case-by-case basis and by physician preference for target coverage vs normal tissue sparing.
Olivieri, Alejandro C
2005-08-01
Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte signal theory. The results have strong implications in the planning of multiway analytical experiments.
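The noise-propagation idea behind such Monte Carlo estimates can be sketched in the simplest first-order setting, classical least-squares calibration, where a closed-form check exists; PARAFAC replaces the regression step, but the logic is the same. Everything below is an illustration with synthetic data, not Olivieri's code:

```python
# Sketch of Monte Carlo sensitivity estimation in a first-order
# (classical least-squares) setting: SEN_k ~ sigma_noise / sigma(c_hat_k).
# The same noise-propagation logic carries over to multiway models.

import numpy as np

rng = np.random.default_rng(0)
S = rng.uniform(0, 1, size=(50, 3))   # pure spectra: 50 channels, 3 analytes
c_true = np.array([1.0, 0.5, 2.0])    # true concentrations
r_clean = S @ c_true                  # noiseless mixture spectrum

sigma = 0.01                          # instrumental noise level
c_hats = []
for _ in range(2000):
    r = r_clean + rng.normal(0, sigma, size=r_clean.shape)
    c_hats.append(np.linalg.lstsq(S, r, rcond=None)[0])
c_hats = np.array(c_hats)

sen_mc = sigma / c_hats.std(axis=0)   # Monte Carlo sensitivity per analyte
sen_cls = 1 / np.linalg.norm(np.linalg.pinv(S), axis=1)  # closed-form check
print(sen_mc)
print(sen_cls)  # the two estimates should agree closely
```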
Culture-Sensitive Functional Analytic Psychotherapy
ERIC Educational Resources Information Center
Vandenberghe, L.
2008-01-01
Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…
NASA Astrophysics Data System (ADS)
Bescond, Marc; Li, Changsheng; Mera, Hector; Cavassilas, Nicolas; Lannoo, Michel
2013-10-01
We present a one-shot current-conserving approach to model the influence of electron-phonon scattering in nano-transistors using the non-equilibrium Green's function formalism. The approach is based on the lowest order approximation (LOA) to the current and its simplest analytic continuation (LOA+AC). By means of a scaling argument, we show how both LOA and LOA+AC can be easily obtained from the first iteration of the usual self-consistent Born approximation (SCBA) algorithm. Both LOA and LOA+AC are then applied to model n-type silicon nanowire field-effect-transistors and are compared to SCBA current characteristics. In this system, the LOA fails to describe electron-phonon scattering, mainly because of the interactions with acoustic phonons at the band edges. In contrast, the LOA+AC still well approximates the SCBA current characteristics, thus demonstrating the power of analytic continuation techniques. The limits of validity of LOA+AC are also discussed, and more sophisticated and general analytic continuation techniques are suggested for more demanding cases.
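Read as a series-resummation statement, the LOA keeps the first-order term, I ≈ I0 + I1, and its simplest analytic continuation replaces the truncated sum by a [0/1] Padé form, I ≈ I0/(1 − I1/I0). A toy sketch under that assumed reading, where the exact answer is a geometric series:

```python
# Hedged sketch: resumming a two-term perturbative expansion. The LOA keeps
# I0 + I1; a [0/1] Pade approximant, I0 / (1 - I1/I0), is one "simplest
# analytic continuation" of that series (an assumed reading of LOA+AC).
# For a geometric series I = I0 * sum (-g)^n, the Pade form is exact.

def loa(i0, i1):
    return i0 + i1

def loa_ac(i0, i1):
    return i0 / (1.0 - i1 / i0)

i0, g = 1.0, -0.4      # zeroth order and coupling; exact I = i0 / (1 + g)
i1 = -g * i0           # first-order correction
exact = i0 / (1.0 + g)
print(loa(i0, i1), loa_ac(i0, i1), exact)  # 1.4  1.667  1.667
# As the coupling grows, the truncated (LOA) series degrades while the
# Pade-continued form continues to track the exact sum.
```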
Hahn, Seung-yong; Ahn, Min Cheol; Bobrov, Emanuel Saul; Bascuñán, Juan; Iwasa, Yukikazu
2010-01-01
This paper addresses adverse effects of dimensional uncertainties of an HTS insert assembled with double-pancake coils on spatial field homogeneity. Each DP coil was wound with Bi2223 tapes having dimensional tolerances larger than one order of magnitude of those accepted for LTS wires used in conventional NMR magnets. The paper presents: 1) dimensional variations measured in two LTS/HTS NMR magnets, 350 MHz (LH350) and 700 MHz (LH700), both built and operated at the Francis Bitter Magnet Laboratory; and 2) an analytical technique and its application to elucidate the field impurities measured with the two LTS/HTS magnets. Field impurities computed with the analytical model and those measured with the two LTS/HTS magnets agree quite well, demonstrating that this analytical technique is applicable to design a DP-assembled HTS insert with an improved field homogeneity for a high-field LTS/HTS NMR magnet. PMID:20407595
Methods for geochemical analysis
Baedecker, Philip A.
1987-01-01
The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.
Developments in flow visualization methods for flight research
NASA Technical Reports Server (NTRS)
Holmes, Bruce J.; Obara, Clifford J.; Manuel, Gregory S.; Lee, Cynthia C.
1990-01-01
With the introduction of modern airplanes utilizing laminar flow, flow visualization has become an important diagnostic tool in determining aerodynamic characteristics such as surface flow direction and boundary-layer state. A refinement of the sublimating chemical technique has been developed to define both the boundary-layer transition location and the transition mode. In response to the need for flow visualization at subsonic and transonic speeds and altitudes above 20,000 feet, the liquid crystal technique has been developed. A third flow visualization technique that has been used is infrared imaging, which offers non-intrusive testing over a wide range of test conditions. A review of these flow visualization methods and recent flight results is presented for a variety of modern aircraft and flight conditions.
NASA Astrophysics Data System (ADS)
Rappleye, Devin Spencer
The development of electroanalytical techniques in multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has been limited to single-analyte mixtures, with a few exceptions. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on a molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even in the presence of an additional signal from another analyte, correlating signals to concentration, and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.)
Analytical methods for gelatin differentiation from bovine and porcine origins and food products.
Nhari, Raja Mohd Hafidz Raja; Ismail, Amin; Che Man, Yaakob B
2012-01-01
Usage of gelatin in food products has been widely debated for several years, with the debate centering on the source of the gelatin used, religion, and health. As a result, various analytical methods have been introduced and developed to differentiate whether gelatin is made from porcine or bovine sources. The analytical methods comprise a diverse range of equipment and techniques including spectroscopy, chemical precipitation, chromatography, and immunochemical methods. Each technique can differentiate gelatins to a certain extent, with advantages and limitations. This review provides an overview of the analytical methods available for differentiation of bovine and porcine gelatin and gelatin in food products, as a basis for new method development. © 2011 Institute of Food Technologists®
An analytical and experimental evaluation of the plano-cylindrical Fresnel lens solar concentrator
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Allums, S. L.; Cosby, R. M.
1976-01-01
Plastic Fresnel lenses for solar concentration are attractive because of their potential for low-cost mass production. An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 °C range is reported. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based primarily on a 56-cm-wide lens with f-number 1.0. A sun-tracking heliostat provided a non-moving solar source. Measured data indicated more spreading at the profile base than analytically predicted. The measured and computed transmittances were 85% and 87%, respectively. Preliminary testing with a second lens (1.85 m) indicated that modified manufacturing techniques corrected the profile spreading problem.
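For context, Fresnel reflection alone already caps the transmittance of an uncoated acrylic lens near the computed value; a back-of-the-envelope sketch (the PMMA refractive index n ≈ 1.49 is an assumed value, and groove shading and absorption, which pull the result toward the measured 85%, are ignored):

```python
# Back-of-the-envelope check: normal-incidence Fresnel reflection losses at
# the two surfaces of an acrylic (PMMA, n ~ 1.49 assumed) lens. Real lenses
# lose a few percent more to groove shading and absorption.

n_air, n_pmma = 1.0, 1.49
r = ((n_pmma - n_air) / (n_pmma + n_air)) ** 2   # single-surface reflectance
t_two_surfaces = (1 - r) ** 2
print(f"single-surface reflectance:  {r:.3f}")            # ~0.039
print(f"two-surface transmittance:   {t_two_surfaces:.3f}")  # ~0.924
```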
ERIC Educational Resources Information Center
Fischer, Peter; Krueger, Joachim I.; Greitemeyer, Tobias; Vogrincic, Claudia; Kastenmuller, Andreas; Frey, Dieter; Heene, Moritz; Wicher, Magdalena; Kainbacher, Martina
2011-01-01
Research on bystander intervention has produced a great number of studies showing that the presence of other people in a critical situation reduces the likelihood that an individual will help. As the last systematic review of bystander research was published in 1981 and was not a quantitative meta-analysis in the modern sense, the present…
ERIC Educational Resources Information Center
Lewis, William E.; Ferretti, Ralph P.
2011-01-01
Literary scholars use specific critical lenses called "topoi" (Fahnestock & Secor, 1991) to read literature and write their interpretations of these texts. Literary topoi are used in the discourse of modern college literature classrooms (Wilder, 2002) and are associated with higher grades in students' literature classes (Wilder, 2002, 2005).…
ERIC Educational Resources Information Center
Wriedt, Mario; Sculley, Julian P.; Aulakh, Darpandeep; Zhou, Hong-Cai
2016-01-01
A simple and straightforward synthesis of an ultrastable porous metal-organic framework (MOF) based on copper(II) and a mixed N donor ligand system is described as a laboratory experiment for chemistry undergraduate students. These experiments and the resulting analysis are designed to teach students basic research tools and procedures while…
An Analytic Model for DoD Divestments
2015-04-30
Partial contents recoverable from this record: Modernization (Research, Development, Test, & Evaluation [RDT&E], Procurement, Military Construction, Science & Technology, and Weapons Acquisition); and the observation that variables that impact actuarial calculations in the insurance industry are also worth considering here (for example, as the insurer of national…).
Adequacy of surface analytical tools for studying the tribology of ceramics
NASA Technical Reports Server (NTRS)
Sliney, H. E.
1986-01-01
Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are lesser-known techniques that may also prove useful.
ERIC Educational Resources Information Center
Arbaugh, J. B.; Hwang, Alvin
2013-01-01
Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…
Analytical challenges for conducting rapid metabolism characterization for QIVIVE.
Tolonen, Ari; Pelkonen, Olavi
2015-06-05
For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is a tool of choice for a majority of organic relatively lipophilic molecules, linked with a LC separation tool and simultaneous UV-detection. However, additional techniques such as gas chromatography, radiometric measurements and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: Detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions and strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Kenneth Paul
Capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) are widely used analytical separation techniques with many applications in chemical, biochemical, and biomedical sciences. Conventional analyte identification in these techniques is based on retention/migration times of standards, requiring a high degree of reproducibility, availability of reliable standards, and absence of coelution. From this, several new information-rich detection methods (also known as hyphenated techniques) are being explored that would be capable of providing unambiguous on-line identification of separating analytes in CE and HPLC. As further discussed, a number of such on-line detection methods have shown considerable success, including Raman, nuclear magnetic resonance (NMR), mass spectrometry (MS), and fluorescence line-narrowing spectroscopy (FLNS). In this thesis, the feasibility and potential of combining the highly sensitive and selective laser-based detection method of FLNS with analytical separation techniques are discussed and presented. A summary of previously demonstrated FLNS detection interfaced with chromatography and electrophoresis is given, and recent results from on-line FLNS detection in CE (CE-FLNS), and the new combination of HPLC-FLNS, are shown.
Gómez-Caravaca, Ana M; Maggio, Rubén M; Cerretani, Lorenzo
2016-03-24
Today virgin and extra-virgin olive oil (VOO and EVOO) are foods subject to a large number of analytical tests designed to ensure their quality and genuineness. Almost all official methods demand heavy use of reagents and manpower. Because of that, analytical development in this area is continuously evolving. Therefore, this review focuses on analytical methods for EVOO/VOO that use fast and smart approaches based on chemometric techniques in order to reduce time of analysis, reagent consumption, high-cost equipment and manpower. Experimental approaches of chemometrics coupled with fast analytical techniques such as UV-Vis spectroscopy, fluorescence, vibrational spectroscopies (NIR, MIR and Raman), NMR spectroscopy, and other more complex techniques like chromatography, calorimetry and electrochemical techniques applied to EVOO/VOO production and analysis are discussed throughout this work. The advantages and drawbacks of this association are also highlighted. Chemometrics has been evidenced as a powerful tool for the oil industry. In fact, it has been shown how chemometrics can be implemented all along the different steps of EVOO/VOO production: raw material input control, monitoring during processing, and quality control of the final product. Copyright © 2016 Elsevier B.V. All rights reserved.
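A typical instance of the fast-technique-plus-chemometrics pairing surveyed here is a PLS calibration from spectra to a quality parameter. The sketch below uses synthetic data, with scikit-learn's PLSRegression standing in for whatever software a laboratory would actually use:

```python
# Minimal sketch of the chemometric workflow the review surveys: calibrate
# a PLS model from spectra to a quality parameter (synthetic "NIR" spectra
# and a made-up latent property here), then score it on held-out samples.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 120, 200
X = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)  # smooth spectra
y = X[:, 60] - 0.5 * X[:, 150] + rng.normal(0, 0.1, n_samples)  # latent property

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
print("R^2 on held-out samples:", round(pls.score(X_te, y_te), 3))
```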
Instrumentation and fusion for congenital spine deformities.
Hedequist, Daniel J
2009-08-01
A retrospective clinical review. To review the use of modern instrumentation of the spine for congenital spinal deformities. Spinal instrumentation has evolved since the advent of the Harrington rod. There is a paucity of literature discussing the use of modern spinal instrumentation in congenital spine deformity cases. This review focuses on modern instrumentation techniques for congenital scoliosis and kyphosis. A systematic review of the literature was performed to discuss spinal implant use for congenital deformities. Spinal instrumentation may be safely and effectively used in cases of congenital spinal deformity. Spinal surgeons taking care of children with congenital spine deformities need to be trained in all aspects of modern spinal instrumentation.
How Farmers Learn about Environmental Issues: Reflections on a Sociobiographical Approach
ERIC Educational Resources Information Center
Vandenabeele, Joke; Wildemeersch, Danny
2012-01-01
At the time of this research, protests of farmers against new environmental policy measures received much media attention. News reports suggested that farmers' organizations rejected the idea that modern farming techniques cause damage to the environment and even tried to undermine attempts to reconcile the goals of modern agriculture with…
Older Learning Engagement in the Modern City
ERIC Educational Resources Information Center
Lido, Catherine; Osborne, Michael; Livingston, Mark; Thakuriah, Piyushimita; Sila-Nowicka, Katarzyna
2016-01-01
This research employs novel techniques to examine older learners' journeys, educationally and physically, in order to gain a "three-dimensional" picture of lifelong learning in the modern urban context of Glasgow. The data offers preliminary analyses of an ongoing 1,500 household survey by the Urban Big Data Centre (UBDC). A sample of…
Commodification of Ghana's Volta River: An Example of Ellul's Autonomy of Technique
ERIC Educational Resources Information Center
Agbemabiese, Lawrence; Byrne, John
2005-01-01
Jacques Ellul argued that modernity's nearly exclusive reliance on science and technology to design society would threaten human freedom. Of particular concern for Ellul was the prospect of the technical milieu overwhelming culture. The commodification of the Volta River in order to modernize Ghana illustrates the Ellulian dilemma of the autonomy…
Modern Methodology and Techniques Aimed at Developing the Environmentally Responsible Personality
ERIC Educational Resources Information Center
Ponomarenko, Yelena V.; Zholdasbekova, Bibisara A.; Balabekov, Aidarhan T.; Kenzhebekova, Rabiga I.; Yessaliyev, Aidarbek A.; Larchenkova, Liudmila A.
2016-01-01
The article discusses the positive impact of an environmentally responsible individual as the social unit able to live in harmony with the natural world, himself/herself and other people. The purpose of the article is to provide theoretical substantiation of modern teaching methods. The authors considered the experience of philosophy, psychology,…
Pape, G; Raiss, P; Kleinschmidt, K; Schuld, C; Mohr, G; Loew, M; Rickert, M
2010-12-01
Loosening of the glenoid component is one of the major causes of failure in total shoulder arthroplasty. Possible risk factors for loosening of cemented components include eccentric loading, poor bone quality, inadequate cementing technique and insufficient cement penetration. The application of a modern cementing technique has become an established procedure in total hip arthroplasty. The goal of modern cementing techniques in general is to improve cement penetration into the cancellous bone. Modern cementing techniques include the cement vacuum-mixing technique, retrograde filling of the cement under pressurisation and the use of a pulsatile lavage system. The main purpose of this study was to analyse cement penetration into the glenoid bone when using modern cementing techniques and to investigate the relationship between bone mineral density (BMD) and cement penetration. Furthermore, we measured the temperature at the glenoid surface before and after jet-lavage in different patients during total shoulder arthroplasty. It is known that the surrounding temperature of the bone has an effect on the polymerisation of the cement. Data from this experiment provide the temperature setting for the in-vitro study. The glenoid surface temperature was measured in 10 patients with a hand-held non-contact temperature measurement device. The bone mineral density was measured by DEXA. Eight paired cadaver scapulae were allocated (n = 16). Each pair comprised two scapulae from one donor (matched-pair design). Two different glenoid components were used, one with pegs and the other with a keel. The glenoids for the in-vitro study were prepared with the bone compaction technique by the same surgeon in all cases. Pulsatile lavage was used to clean the glenoid of blood and bone fragments. Low-viscosity bone cement was applied retrogradely into the glenoid using a syringe. A constant pressure was applied with a modified force-sensor impactor. Micro-computed tomography scans were used to analyse cement penetration into the cancellous bone. The mean temperature during the in-vivo arthroplasty of the glenoid was 29.4 °C (27.2-31 °C) before and 26.2 °C (25-27.5 °C) after jet-lavage. The overall peak BMD was 0.59 (range 0.33-0.99) g/cm². Mean cement penetration was 107.9 (range 67.6-142.3) mm² in the peg group and 128.3 (range 102.6-170.8) mm² in the keel group. The thickness of the cement layer varied from 0 to 2.1 mm in the pegged group and from 0 to 2.4 mm in the keeled group. A strong negative correlation between BMD and mean cement penetration was found for the peg group (r² = -0.834; p < 0.01) and for the keel group (r² = -0.727; p < 0.041). Micro-CT showed an inhomogeneous dispersion of the cement into the cancellous bone. Data from the in-vivo temperature measurement indicate that the temperature at the glenohumeral surface during the operation differs from the body core temperature and should be considered in further in-vitro studies with human specimens. Bone mineral density is negatively correlated with cement penetration in the glenoid. The application of a modern cementing technique in the glenoid provides sufficient cement penetration, although there is an inhomogeneous dispersion of the cement. The findings of this study should be considered in further discussions about cementing technique and cement penetration into the cancellous bone of the glenoid. © Georg Thieme Verlag KG Stuttgart · New York.
Cache Energy Optimization Techniques For Modern Processors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
2013-01-01
Modern multicore processors are employing large last-level caches, for example Intel's E7-8800 processor uses 24MB L3 cache. Further, with each CMOS technology generation, leakage energy has been dramatically increasing and hence, leakage energy is expected to become a major source of energy dissipation, especially in last-level caches (LLCs). The conventional schemes of cache energy saving either aim at saving dynamic energy or are based on properties specific to first-level caches, and thus these schemes have limited utility for last-level caches. Further, several other techniques require offline profiling or per-application tuning and hence are not suitable for product systems. In this book, we present novel cache leakage energy saving schemes for single-core and multicore systems; desktop, QoS, real-time and server systems. Also, we present cache energy saving techniques for caches designed with both conventional SRAM devices and emerging non-volatile devices such as STT-RAM (spin-torque transfer RAM). We present software-controlled, hardware-assisted techniques which use dynamic cache reconfiguration to configure the cache to the most energy-efficient configuration while keeping the performance loss bounded. To profile and test a large number of potential configurations, we utilize low-overhead micro-architecture components, which can be easily integrated into modern processor chips. We adopt a system-wide approach to save energy to ensure that cache reconfiguration does not increase the energy consumption of other components of the processor. We have compared our techniques with state-of-the-art techniques and have found that our techniques outperform them in terms of energy efficiency and other relevant metrics. The techniques presented in this book have important applications in improving the energy efficiency of higher-end embedded, desktop, QoS, real-time, server processors and multitasking systems. This book is intended to be a valuable guide for both newcomers and veterans in the field of cache power management. It will help graduate students, CAD tool developers and designers in understanding the need for energy efficiency in modern computing systems. Further, it will be useful for researchers in gaining insights into algorithms and techniques for micro-architectural and system-level energy optimization using dynamic cache reconfiguration. We sincerely believe that the "food for thought" presented in this book will inspire the readers to develop even better ideas for designing "green" processors of tomorrow.
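The selection step at the heart of dynamic cache reconfiguration can be illustrated in a few lines: enumerate candidate configurations, model their energy, and pick the cheapest one whose performance penalty stays within a bound. All numbers and the energy model below are invented for illustration, not taken from the book:

```python
# Toy illustration of the selection step in dynamic cache reconfiguration:
# among candidate configurations, choose the lowest-energy one whose
# performance loss (here proxied by miss-rate increase) stays bounded.
# Energies, miss rates, and the model are invented for illustration only.

CONFIGS = [  # (active ways, relative leakage power, miss rate)
    (8, 1.00, 0.020),
    (4, 0.50, 0.028),
    (2, 0.25, 0.060),
]

def energy(leak, miss_rate, accesses=1e6, e_dyn=0.1, e_miss=2.0,
           p_leak=100000.0, t=1.0):
    # dynamic access energy + miss-handling energy + leakage over interval t;
    # leakage dominates in LLCs, so shutting ways can win despite extra misses
    return accesses * (e_dyn + miss_rate * e_miss) + leak * p_leak * t

def pick_config(configs, max_miss_increase=0.015):
    base_miss = configs[0][2]  # full cache as the performance baseline
    ok = [c for c in configs if c[2] - base_miss <= max_miss_increase]
    return min(ok, key=lambda c: energy(c[1], c[2]))

print(pick_config(CONFIGS))  # -> the 4-way config under these numbers
```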