ERIC Educational Resources Information Center
Vogt, Frank
2011-01-01
Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…
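A generic, hypothetical sketch of what such pre-treatment of digitized data can look like in practice (not taken from the cited work) is Savitzky-Golay smoothing of a noisy analytical peak; the peak shape, noise level, window length, and polynomial order below are arbitrary assumptions.

```python
# Hypothetical sketch: SNR enhancement of a digitized signal by Savitzky-Golay smoothing.
# Peak shape, noise level, window length and polynomial order are arbitrary assumptions.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 500)
signal = np.exp(-((t - 0.5) / 0.05) ** 2)        # idealized analytical peak
noisy = signal + rng.normal(0, 0.05, t.size)     # peak with additive white noise

smoothed = savgol_filter(noisy, window_length=21, polyorder=3)

def snr_db(estimate, truth):
    noise = estimate - truth
    return 10 * np.log10(np.mean(truth ** 2) / np.mean(noise ** 2))

print(f"SNR before: {snr_db(noisy, signal):.1f} dB, after: {snr_db(smoothed, signal):.1f} dB")
```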
Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework
ERIC Educational Resources Information Center
Ranjan, Jayanthi; Bhatnagar, Vishal
2011-01-01
Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all three with each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…
ERIC Educational Resources Information Center
Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew
2015-01-01
Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…
Marine geodetic control for geoidal profile mapping across the Puerto Rican Trench
NASA Technical Reports Server (NTRS)
Fubara, D. M.; Mourad, A. G.
1975-01-01
A marine geodetic control was established for the northern end of the geoidal profile mapping experiment across the Puerto Rican Trench by determining the three-dimensional geodetic coordinates of the four ocean-bottom mounted acoustic transponders. The data reduction techniques employed and analytical processes involved are described. Before applying the analytical techniques to the field data, they were tested with simulated data and proven to be effective in theory as well as in practice.
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.
Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek
2017-07-04
One of the major sources of error that occur during chemical analysis utilizing the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage, and in this review we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about the measurement techniques using mass spectrometry, which allow direct sample analysis without sample preparation or with only limited pre-concentration steps. MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity due to their interesting properties. The advantages and disadvantages of these techniques are presented. The trends in development of direct analysis using the aforementioned techniques are also presented.
Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.
Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen
2015-10-01
Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
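For orientation, a minimal numerical sketch of one common parameterization of the bivalent analyte model (A + B <-> AB, AB + B <-> AB2) is given below; the rate constants, analyte concentration, surface capacity, and the omission of statistical factors are illustrative assumptions, not values or conventions taken from the cited report.

```python
# Hypothetical sketch of the coupled ODEs of a bivalent analyte SPR model
# (statistical factors omitted; all parameter values are illustrative assumptions).
import numpy as np
from scipy.integrate import solve_ivp

ka1, kd1 = 1e5, 1e-2     # first binding step (M^-1 s^-1, s^-1)
ka2, kd2 = 1e-3, 1e-3    # second binding step (RU^-1 s^-1, s^-1)
C, Bmax = 100e-9, 100.0  # injected analyte concentration (M) and surface capacity (RU)

def rates(t, y):
    ab, ab2 = y
    b = Bmax - ab - 2 * ab2                       # free ligand; bivalent complex occupies two sites
    dab = ka1 * C * b - kd1 * ab - ka2 * ab * b + kd2 * ab2
    dab2 = ka2 * ab * b - kd2 * ab2
    return [dab, dab2]

sol = solve_ivp(rates, (0.0, 300.0), [0.0, 0.0])  # association phase only
response = sol.y[0] + sol.y[1]                    # sensorgram response ~ total bound analyte
print(f"response after {sol.t[-1]:.0f} s: {response[-1]:.1f} RU")
```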
Analytical methods for gelatin differentiation from bovine and porcine origins and food products.
Nhari, Raja Mohd Hafidz Raja; Ismail, Amin; Che Man, Yaakob B
2012-01-01
Usage of gelatin in food products has been widely debated for several years over the source of the gelatin used, as well as religious and health concerns. As a result, various analytical methods have been introduced and developed to differentiate gelatin according to whether it is made from porcine or bovine sources. The analytical methods comprise a diverse range of equipment and techniques including spectroscopy, chemical precipitation, chromatography, and immunochemical methods. Each technique can differentiate gelatins to a certain extent, with its own advantages and limitations. This review focuses on an overview of the analytical methods available for differentiation of bovine and porcine gelatin, and of gelatin in food products, so that new methods can be developed. © 2011 Institute of Food Technologists®
Accuracy of selected techniques for estimating ice-affected streamflow
Walker, John F.
1991-01-01
This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements have been made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers have independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
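The three performance measures named above can be made concrete with a short sketch; the daily discharge values below are invented placeholders, not the Iowa records.

```python
# Hypothetical sketch of the three performance measures for an ice-affected period:
# average discharge, and the mean and standard deviation of the daily errors.
import numpy as np

baseline = np.array([12.0, 11.5, 10.8, 10.2, 9.9, 9.7])   # baseline daily discharge (placeholder)
estimate = np.array([11.6, 11.9, 10.1, 10.6, 9.4, 10.0])  # estimated daily discharge (placeholder)

avg_discharge = estimate.mean()
daily_errors = estimate - baseline
print(f"average discharge: {avg_discharge:.2f}")
print(f"mean daily error: {daily_errors.mean():.2f}, std of daily errors: {daily_errors.std(ddof=1):.2f}")
```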
A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...
Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co
NASA Technical Reports Server (NTRS)
Migneault, G. E.
1979-01-01
Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.
Khan, Wahid; Kumar, Neeraj
2011-06-01
Paromomycin (PM) is an aminoglycoside antibiotic, first isolated in the 1950s, and approved in 2006 for treatment of visceral leishmaniasis. Although it was isolated six decades ago, sufficient information essential for development of a pharmaceutical formulation is not available for PM. The purpose of this paper was to determine the thermal stability of PM and to develop a new analytical method for its formulation development. PM was characterized by thermoanalytical (DSC, TGA, and HSM) and spectroscopic (FTIR) techniques, and these techniques were used to establish the thermal stability of PM after heating at 100, 110, 120, and 130 °C for 24 h. Biological activity of these heated samples was also determined by microbiological assay. Subsequently, a simple, rapid and sensitive RP-HPLC method for quantitative determination of PM was developed using pre-column derivatization with 9-fluorenylmethyl chloroformate. The developed method was applied to estimate PM quantitatively in two parenteral dosage forms. PM was successfully characterized by the various stated techniques. These techniques indicated stability of PM on heating up to 120 °C for 24 h, but when heated at 130 °C, PM is liable to degradation. This degradation is also observed in the microbiological assay, where PM lost ∼30% of its biological activity when heated at 130 °C for 24 h. The new analytical method was developed for PM in the concentration range of 25-200 ng/ml with intra-day and inter-day variability of < 2% RSD. Characterization techniques were established and the stability of PM was determined successfully. The developed analytical method was found to be sensitive, accurate, and precise for quantification of PM. Copyright © 2010 John Wiley & Sons, Ltd.
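As a generic illustration of the kind of quantitation described (not the authors' validation data), a linear calibration over the stated 25-200 ng/ml range and an intra-day RSD could be computed as follows; the peak areas and replicate values are invented placeholders.

```python
# Hypothetical sketch: linear calibration over 25-200 ng/ml and an intra-day RSD check.
# Peak areas and replicate determinations are invented placeholders, not published data.
import numpy as np

conc = np.array([25, 50, 100, 150, 200], dtype=float)     # calibration levels, ng/ml
area = np.array([1.02, 2.05, 4.11, 6.08, 8.20])           # detector response (arbitrary units)

slope, intercept = np.polyfit(conc, area, 1)
back_calc = (area - intercept) / slope                     # back-calculated concentrations

replicates = np.array([99.1, 100.4, 98.7, 101.0, 99.6])   # repeated determinations near 100 ng/ml
rsd = 100 * replicates.std(ddof=1) / replicates.mean()
print(f"slope = {slope:.4f}, intercept = {intercept:.3f}")
print(f"back-calculated levels: {np.round(back_calc, 1)}, intra-day RSD = {rsd:.2f}%")
```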
Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
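The accuracy check described, comparing a semi-analytical sensitivity against a finite-difference estimate, can be sketched generically; the objective function and step size below are stand-ins, not the aerodynamic solver used in the report.

```python
# Generic sketch: verifying an analytical sensitivity against a central finite difference.
# The objective f(x) and the step size are placeholders, not the report's flow solver.
import numpy as np

def f(x):
    return np.sin(x) + 0.1 * x ** 2      # stand-in for an aerodynamic objective

def df_analytical(x):
    return np.cos(x) + 0.2 * x           # hand-derived sensitivity of f

x0, h = 1.3, 1e-6
df_fd = (f(x0 + h) - f(x0 - h)) / (2 * h)
print(f"analytical: {df_analytical(x0):.8f}, finite difference: {df_fd:.8f}")
```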
Qualitative evaluation of water displacement in simulated analytical breaststroke movements.
Martens, Jonas; Daly, Daniel
2012-05-01
One purpose of evaluating a swimmer is to establish the individualized optimal technique. A swimmer's particular body structure and the resulting movement pattern will cause the surrounding water to react in differing ways. Consequently, an assessment method based on flow visualization was developed, complementary to movement analysis and body structure quantification. A fluorescent dye was used to make the water displaced by the body visible on video. To examine the hypothesis on the propulsive mechanisms applied in breaststroke swimming, we analyzed the movements of the surrounding water during 4 analytical breaststroke movements using the flow visualization technique.
A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines
Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua
2018-01-01
The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxin in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provided a detailed summary of related analytical methods for mycotoxin determination. This review focuses on analytical techniques including sampling, extraction, cleanup, and detection for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation, and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for mycotoxin qualification or quantitation, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides a good insight regarding the advanced research that has been done and closes with an indication of future demand for the emerging technologies. PMID:29393905
Torque Transient of Magnetically Driven Flow for Viscosity Measurement
NASA Technical Reports Server (NTRS)
Ban, Heng; Li, Chao; Su, Ching-Hua; Lin, Bochuan; Scripa, Rosalia N.; Lehoczky, Sandor L.
2004-01-01
Viscosity is a good indicator of structural changes for complex liquids, such as semiconductor melts with chain or ring structures. This paper discusses the theoretical and experimental results of the transient torque technique for non-intrusive viscosity measurement. Such a technique is essential for the high temperature viscosity measurement of high pressure and toxic semiconductor melts. In this paper, our previous work on the oscillating cup technique was expanded to the transient process of a magnetically driven melt flow in a damped oscillation system. Based on the analytical solution for the fluid flow and cup oscillation, a semi-empirical model was established to extract the fluid viscosity. The analytical and experimental results indicated that such a technique has the advantages of short measurement time and straightforward data analysis procedures.
Posch, Tjorben Nils; Pütz, Michael; Martin, Nathalie; Huhn, Carolin
2015-01-01
In this review we introduce the advantages and limitations of electromigrative separation techniques in forensic toxicology. We thus present a summary of illustrative studies and our own experience in the field together with established methods from the German Federal Criminal Police Office rather than a complete survey. We focus on the analytical aspects of analytes' physicochemical characteristics (e.g. polarity, stereoisomers) and analytical challenges including matrix tolerance, separation from compounds present in large excess, sample volumes, and orthogonality. For these aspects we want to reveal the specific advantages over more traditional methods. Both detailed studies and profiling and screening studies are taken into account. Care was taken to nearly exclusively document well-validated methods outstanding for the analytical challenge discussed. Special attention was paid to aspects exclusive to electromigrative separation techniques, including the use of the mobility axis, the potential for on-site instrumentation, and the capillary format for immunoassays. The review concludes with an introductory guide to method development for different separation modes, presenting typical buffer systems as starting points for different analyte classes. The objective of this review is to provide an orientation for users in separation science considering using capillary electrophoresis in their laboratory in the future.
Code of Federal Regulations, 2013 CFR
2013-01-01
... processed for export to the United States; (E) Complete separation of establishments certified under... potential contaminants, in accordance with sampling and analytical techniques approved by the Administrator...
Code of Federal Regulations, 2014 CFR
2014-01-01
... processed for export to the United States; (E) Complete separation of establishments certified under... potential contaminants, in accordance with sampling and analytical techniques approved by the Administrator...
Code of Federal Regulations, 2012 CFR
2012-01-01
... processed for export to the United States; (E) Complete separation of establishments certified under... potential contaminants, in accordance with sampling and analytical techniques approved by the Administrator...
Airborne chemistry: acoustic levitation in chemical analysis.
Santesson, Sabina; Nilsson, Staffan
2004-04-01
This review with 60 references describes a unique path to miniaturisation, that is, the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitation of small volumes of sample by means of a levitation technique can be used as a way to avoid solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 microL volume range and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.
Suba, Dávid; Urbányi, Zoltán; Salgó, András
2016-10-01
Capillary electrophoresis techniques are widely used in analytical biotechnology. Different electrophoretic techniques are well-suited tools to monitor size and charge heterogeneities of protein drugs. Method descriptions and development studies of capillary zone electrophoresis (CZE) have been described in the literature. Most of them are performed based on the classical one-factor-at-a-time (OFAT) approach. In this study a very simple method development approach is described for capillary zone electrophoresis: a "two-phase-four-step" approach is introduced which allows a rapid, iterative method development process and can serve as a good platform for CZE methods. In every step the current analytical target profile and an appropriate control strategy were established to monitor the current stage of development. A very good platform was established to investigate intact and digested protein samples. A commercially available monoclonal antibody was chosen as the model protein for the method development study. The CZE method was qualified after the development process and the results are presented. The analytical system stability was represented by the calculated RSD% value of area percentage and migration time of the selected peaks (<0.8% and <5%) during the intermediate precision investigation. Copyright © 2016 Elsevier B.V. All rights reserved.
ElMekawy, A; Hegab, H M; Pant, D; Saint, C P
2018-01-01
Globally, sustainable provision of high-quality safe water is a major challenge of the 21st century. Various chemical and biological monitoring analytics are presently utilized to guarantee the availability of high-quality water. However, these techniques still face some challenges including high costs, complex design and onsite and online limitations. The recent technology of using microbial fuel cell (MFC)-based biosensors holds outstanding potential for the rapid and real-time monitoring of water source quality. MFCs have the advantages of simplicity in design and efficiency for onsite sensing. Even though some sensing applications of MFCs were previously studied, e.g. biochemical oxygen demand sensor, recently numerous research groups around the world have presented new practical applications of this technique, which combine multidisciplinary scientific knowledge in materials science, microbiology and electrochemistry fields. This review presents the most updated research on the utilization of MFCs as potential biosensors for monitoring water quality and considers the range of potentially toxic analytes that have so far been detected using this methodology. The advantages of MFCs over established technology are also considered as well as future work required to establish their routine use. © 2017 The Society for Applied Microbiology.
The Effect of Multispectral Image Fusion Enhancement on Human Efficiency
2017-03-20
human visual system by applying a technique commonly used in visual perception research: ideal observer analysis. Using this approach, we establish...applications, analytic techniques, and procedural methods used across studies. This paper uses ideal observer analysis to establish a framework that allows...augmented similarly to incorporate research involving more complex stimulus content. Additionally, the ideal observer can be adapted for a number of
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.
2009-04-14
This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.
DOT National Transportation Integrated Search
1973-02-01
The volume presents the models used to analyze basic features of the system, establish feasibility of techniques, and evaluate system performance. The models use analytical expressions and computer simulations to represent the relationship between sy...
CF6 jet engine diagnostics program. High pressure turbine roundness/clearance investigation
NASA Technical Reports Server (NTRS)
Howard, W. D.; Fasching, W. A.
1982-01-01
The effects of high pressure turbine clearance changes on engine and module performance were evaluated, in addition to the measurement of CF6-50C high pressure turbine Stage 1 tip clearance and stator out-of-roundness during steady-state and transient operation. The results indicated a good correlation of the analytical model of round engine clearance response with measured data. The stator out-of-roundness measurements verified that the analytical technique for predicting the distortion effects of mechanical loads is accurate, whereas the technique for calculating the effects of certain circumferential thermal gradients requires some modifications. A potential for improvement in roundness was established in the order of 0.38 mm (0.015 in.), equivalent to 0.86 percent turbine efficiency, which translates to a cruise SFC improvement of 0.36 percent. The HP turbine Stage 1 tip clearance performance derivative was established as 0.44 mm (17 mils) per percent of turbine efficiency at take-off power, somewhat smaller, and therefore more sensitive, than predicted from previous investigations.
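The quoted figures can be cross-checked with simple arithmetic: 0.38 mm divided by the 0.44 mm-per-percent derivative gives roughly the stated 0.86 percent efficiency gain, and an efficiency-to-SFC factor of about 0.42 (inferred from the abstract's own 0.86 and 0.36 percent values, not stated explicitly) reproduces the cruise SFC improvement.

```python
# Arithmetic check of the quoted clearance / efficiency / SFC numbers.
# The efficiency-to-SFC factor is inferred from the abstract's own figures.
clearance_improvement_mm = 0.38            # potential roundness improvement
derivative_mm_per_pct_eff = 0.44           # tip clearance per percent turbine efficiency

efficiency_gain_pct = clearance_improvement_mm / derivative_mm_per_pct_eff
sfc_per_efficiency = 0.36 / 0.86           # inferred cruise-SFC % per turbine-efficiency %
print(f"efficiency gain ~{efficiency_gain_pct:.2f}% -> SFC improvement ~{efficiency_gain_pct * sfc_per_efficiency:.2f}%")
```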
NASA Technical Reports Server (NTRS)
Rader, W. P.; Barrett, S.; Payne, K. R.
1975-01-01
Data measurement and interpretation techniques were defined for application to the first few space shuttle flights, so that the dynamic environment could be sufficiently well established to be used to reduce the cost of future payloads through more efficient design and environmental test techniques. It was concluded that: (1) initial payloads must be given comprehensive instrumentation coverage to obtain detailed definition of acoustics, vibration, and interface loads, (2) analytical models of selected initial payloads must be developed and verified by modal surveys and flight measurements, (3) acoustic tests should be performed on initial payloads to establish realistic test criteria for components and experiments in order to minimize unrealistic failures and retest requirements, (4) permanent data banks should be set up to establish statistical confidence in the data to be used, (5) a more unified design/test specification philosophy is needed, (6) additional work is needed to establish a practical testing technique for simulation of vehicle transients.
Yan, Guanyong; Wang, Xiangzhao; Li, Sikun; Yang, Jishuo; Xu, Dongbo; Erdmann, Andreas
2014-03-10
We propose an in situ aberration measurement technique based on an analytical linear model of through-focus aerial images. The aberrations are retrieved from aerial images of six isolated space patterns, which have the same width but different orientations. The imaging formulas of the space patterns are investigated and simplified, and then an analytical linear relationship between the aerial image intensity distributions and the Zernike coefficients is established. The linear relationship is composed of linear fitting matrices and rotation matrices, which can be calculated numerically in advance and utilized to retrieve Zernike coefficients. Numerical simulations using the lithography simulators PROLITH and Dr.LiTHO demonstrate that the proposed method can measure wavefront aberrations up to Z(37). Experiments on a real lithography tool confirm that our method can monitor lens aberration offset with an accuracy of 0.7 nm.
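Given the stated linear relationship between through-focus aerial-image intensities and Zernike coefficients, the retrieval reduces to an ordinary least-squares problem; the sketch below assumes a precomputed fitting matrix and uses random placeholder values, since the actual matrices and image data are not given in the abstract.

```python
# Hypothetical sketch: retrieving Zernike coefficients from aerial-image intensities
# with a precomputed linear model I = A @ z + I0. A, I0 and the data are placeholders.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_zernike = 200, 33                    # intensity samples vs. coefficients (placeholder sizes)
A = rng.normal(size=(n_samples, n_zernike))       # precomputed linear fitting matrix (placeholder)
I0 = rng.normal(size=n_samples)                   # aberration-free intensity term (placeholder)
z_true = 1e-3 * rng.normal(size=n_zernike)        # "unknown" Zernike coefficients

I_measured = A @ z_true + I0                      # simulated through-focus intensities
z_est, *_ = np.linalg.lstsq(A, I_measured - I0, rcond=None)
print(f"max retrieval error: {np.max(np.abs(z_est - z_true)):.2e}")
```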
Generalized simulation technique for turbojet engine system analysis
NASA Technical Reports Server (NTRS)
Seldner, K.; Mihaloew, J. R.; Blaha, R. J.
1972-01-01
A nonlinear analog simulation of a turbojet engine was developed. The purpose of the study was to establish simulation techniques applicable to propulsion system dynamics and controls research. A schematic model was derived from a physical description of a J85-13 turbojet engine. Basic conservation equations were applied to each component along with their individual performance characteristics to derive a mathematical representation. The simulation was mechanized on an analog computer. The simulation was verified in both steady-state and dynamic modes by comparing analytical results with experimental data obtained from tests performed at the Lewis Research Center with a J85-13 engine. In addition, comparison was also made with performance data obtained from the engine manufacturer. The comparisons established the validity of the simulation technique.
Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R
2016-01-21
The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider the nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles by using ICP-MS, but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.
Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.
Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C
2016-09-01
Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic carbon (VOC) analytes. The methods were thereby divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited sufficient extraction yields (approx. 10-20 %) to be reliably used down to approx. 100 ng L(-1), enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice of one of the different instrumental modes of operation (the aforementioned classes) was thereby the most influential parameter in terms of extraction yields and MDLs. Individual methods inside each class showed smaller deviations, and the least influences were observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties such as high polarity or the capability of specific molecular interactions. Graphical abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
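Method detection limits of the kind compared here are often estimated from replicate low-level spikes as MDL = t(n-1, 0.99) x s; the snippet below is a generic sketch with invented replicate concentrations, not the study's data.

```python
# Generic sketch of an MDL estimate from replicate low-level spikes,
# MDL = t(n-1, 0.99) * s. Replicate concentrations are invented placeholders.
import numpy as np
from scipy import stats

replicates = np.array([102.0, 96.0, 110.0, 99.0, 105.0, 94.0, 101.0])  # ng/L (placeholder)
s = replicates.std(ddof=1)
t_crit = stats.t.ppf(0.99, df=replicates.size - 1)
print(f"MDL ~ {t_crit * s:.1f} ng/L from {replicates.size} replicates (s = {s:.1f} ng/L)")
```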
Broeckhoven, Ken; Desmet, Gert
2007-11-16
Using a combination of both analytical and numerical techniques, approximate analytical expressions have been established for the transient and long time limit band broadening, originating from the presence of a thin disturbed sidewall layer in liquid chromatography columns, including packed, monolithic as well as microfabricated columns. The established expressions can be used to compare the importance of a thin disturbed sidewall layer with that of other radial heterogeneity effects (such as transcolumn packing density variations due to the relief of packing stresses). The expressions are independent of the actual velocity profile inside the layer as long as the disturbed sidewall layer occupies less than 2.5% of the column width.
NASA Astrophysics Data System (ADS)
Stosnach, Hagen
2010-09-01
Selenium is essential for many aspects of human health and, thus, the object of intensive medical research. This demands the use of analytical techniques capable of analysing selenium at low concentrations with high accuracy in widespread matrices and sometimes in the smallest sample amounts. In connection with the increasing importance of selenium, there is a need for rapid and simple on-site (or near-to-site) selenium analysis in food basics like wheat at processing and production sites, as well as for the analysis of this element in dietary supplements. Common analytical techniques like electrothermal atomic absorption spectroscopy (ETAAS) and inductively-coupled plasma mass spectrometry (ICP-MS) are capable of analysing selenium in medical samples with detection limits in the range from 0.02 to 0.7 μg/l. Since in many cases less complicated and expensive analytical techniques are required, TXRF has been tested regarding its suitability for selenium analysis in different medical, food basics and dietary supplement samples, applying very simple sample preparation techniques. The reported results indicate that the accurate analysis of selenium in all sample types is possible. The detection limits of TXRF are in the range from 7 to 12 μg/l for medical samples and 0.1 to 0.2 mg/kg for food basics and dietary supplements. Although this sensitivity is low compared to established techniques, it is sufficient for the physiological concentrations of selenium in the investigated samples.
DOT National Transportation Integrated Search
2011-04-01
The objectives of this study were to 1) identify the effects of site location, storm hydrology, and water quality parameters on the concentration of dissolved copper (Cu2+diss) in Oregon highway runoff; 2) establish an analytical technique suitable f...
NASA Technical Reports Server (NTRS)
Nakamura, N.; Nyquist, L. E.; Reese, Y.; Shih, C-Y; Fujitani, T.; Okano, O.
2011-01-01
We have established a precise analytical technique for stable chlorine isotope measurements of tiny planetary materials by TIMS (Thermal Ionization Mass Spectrometry) [1], for which the results are basically consistent with the IRMS technique (gas source mass spectrometry) [2,3,4]. We present here results for Martian shergottites and nakhlites (whole rocks, HNO3-leachates and residues), and discuss the chlorine isotope evolution of planetary Mars.
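Stable chlorine isotope results of this kind are conventionally reported in delta notation, in per mil (‰), relative to Standard Mean Ocean Chloride (SMOC); the defining relation is included here only as background for the reader and is not quoted from the abstract.

```latex
\delta^{37}\mathrm{Cl} \;=\;
\left[\frac{\left({}^{37}\mathrm{Cl}/{}^{35}\mathrm{Cl}\right)_{\mathrm{sample}}}
           {\left({}^{37}\mathrm{Cl}/{}^{35}\mathrm{Cl}\right)_{\mathrm{SMOC}}} - 1\right] \times 1000
```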
A Unifying Review of Bioassay-Guided Fractionation, Effect-Directed Analysis and Related Techniques
Weller, Michael G.
2012-01-01
The success of modern methods in analytical chemistry sometimes obscures the problem that the ever increasing amount of analytical data does not necessarily give more insight of practical relevance. As alternative approaches, toxicity- and bioactivity-based assays can deliver valuable information about biological effects of complex materials in humans, other species or even ecosystems. However, the observed effects often cannot be clearly assigned to specific chemical compounds. In these cases, the establishment of an unambiguous cause-effect relationship is not possible. Effect-directed analysis tries to interconnect instrumental analytical techniques with a biological/biochemical entity, which identifies or isolates substances of biological relevance. Successful application has been demonstrated in many fields, either as proof-of-principle studies or even for complex samples. This review discusses the different approaches, advantages and limitations and finally shows some practical examples. The broad emergence of effect-directed analytical concepts might lead to a true paradigm shift in analytical chemistry, away from ever growing lists of chemical compounds. The connection of biological effects with the identification and quantification of molecular entities leads to relevant answers to many real life questions. PMID:23012539
Hyphenated analytical techniques for materials characterisation
NASA Astrophysics Data System (ADS)
Armstrong, Gordon; Kailas, Lekshmi
2017-09-01
This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis was among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine properties including physical, mechanical, electrical and thermal behaviour, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to better understand the sample in greater detail than was possible before, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples, a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.
A Corona Discharge Initiated Electrochemical Electrospray Ionization Technique
Lloyd, John R.; Hess, Sonja
2009-01-01
We report here the development of a corona discharge (CD) initiated electrochemical (EC) electrospray ionization (ESI) technique using a standard electrospray ion source. This is a new ionization technique distinct from ESI, electrochemistry inherent to ESI, APCI, and techniques using hydroxyl radicals produced under atmospheric pressure conditions. By maximizing the observable CD at the tip of a stainless steel ESI capillary, efficient electrochemical oxidation of electrochemically active compounds is observed. For electrochemical oxidation to be observed, the ionization potential of the analyte must be lower than Fe. Ferrocene labeled compounds were chosen as the electrochemically active moiety. The electrochemical cell in the ESI source was robust and generated ions with selectivity according to the ionization potential of the analytes and up to zeptomolar sensitivity. Our results indicate that CD initiated electrochemical ionization has the potential to become a powerful technique to increase the dynamic range, sensitivity and selectivity of ESI experiments. Synopsis Using a standard ESI source a corona discharge initiated electrochemical ionization technique was established resulting from the electrochemistry occurring at the CD electrode surface. PMID:19747843
Covaci, Adrian; Voorspoels, Stefan; Abdallah, Mohamed Abou-Elwafa; Geens, Tinne; Harrad, Stuart; Law, Robin J
2009-01-16
The present article reviews the available literature on the analytical and environmental aspects of tetrabromobisphenol-A (TBBP-A), a currently intensively used brominated flame retardant (BFR). Analytical methods, including sample preparation, chromatographic separation, detection techniques, and quality control are discussed. An important recent development in the analysis of TBBP-A is the growing tendency for liquid chromatographic techniques. At the detection stage, mass-spectrometry is a well-established and reliable technology in the identification and quantification of TBBP-A. Although interlaboratory exercises for BFRs have grown in popularity in the last 10 years, only a few participating laboratories report concentrations for TBBP-A. Environmental levels of TBBP-A in abiotic and biotic matrices are low, probably due to the major use of TBBP-A as reactive FR. As a consequence, the expected human exposure is low. This is in agreement with the EU risk assessment that concluded that there is no risk for humans concerning TBBP-A exposure. Much less analytical and environmental information exists for the various groups of TBBP-A derivatives which are largely used as additive flame retardants.
XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.
2009-01-01
Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment for single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs is presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.
Fifty years of solid-phase extraction in water analysis--historical development and overview.
Liska, I
2000-07-14
The use of an appropriate sample handling technique is a must in an analysis of organic micropollutants in water. The efforts to use a solid phase for the recovery of analytes from a water matrix prior to their detection have a long history. Since the first experimental trials using activated carbon filters that were performed 50 years ago, solid-phase extraction (SPE) has become an established sample preparation technique. The initial experimental applications of SPE resulted in widespread use of this technique in current water analysis and also to adoption of SPE into standardized analytical methods. During the decades of its evolution, chromatographers became aware of the advantages of SPE and, despite many innovations that appeared in the last decade, new SPE developments are still expected in the future. A brief overview of 50 years of the history of the use of SPE in organic trace analysis of water is given in presented paper.
NASA Technical Reports Server (NTRS)
Mineck, Raymond E.
1992-01-01
A two dimensional airfoil model was tested in the adaptive wall test section of the NASA Langley 0.3 meter Transonic Cryogenic Tunnel (TCT) and in the ventilated test section of the National Aeronautical Establishment Two Dimensional High Reynolds Number Facility (HRNF). The primary goal of the tests was to compare different techniques (adaptive test section walls and classical, analytical corrections) to account for wall interference. Tests were conducted over a Mach number range from 0.3 to 0.8 at chord Reynolds numbers of 10 x 10(exp 6), 15 x 10(exp 6), and 20 x 10(exp 6). The angle of attack was varied from about 12 degrees up to stall. Movement of the top and bottom test section walls was used to account for the wall interference in the HRNF tests. The test results from the two facilities are in good agreement.
Mohanasubha, R.; Chandrasekar, V. K.; Senthilvelan, M.; Lakshmanan, M.
2015-01-01
We unearth the interconnection between various analytical methods which are widely used in the current literature to identify integrable nonlinear dynamical systems described by third-order nonlinear ODEs. We establish an important interconnection between the extended Prelle–Singer procedure and λ-symmetries approach applicable to third-order ODEs to bring out the various linkages associated with these different techniques. By establishing this interconnection we demonstrate that given any one of the quantities as a starting point in the family consisting of Jacobi last multipliers, Darboux polynomials, Lie point symmetries, adjoint-symmetries, λ-symmetries, integrating factors and null forms one can derive the rest of the quantities in this family in a straightforward and unambiguous manner. We also illustrate our findings with three specific examples. PMID:27547076
NASA Astrophysics Data System (ADS)
Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay
2003-09-01
The research center established by Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigate and advance the use of complementary analytical techniques for sensing of explosives and/or explosive-related compounds as they occur in the environment. The techniques being investigated include Terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring Down Spectroscopy (CRDS) and Resonance Enhanced Multiphoton Ionization (REMPI). This suite of techniques encompasses a diversity of sensing approaches that can be applied to detection of explosives in condensed phases, such as adsorbed species in soil, or can be used for vapor phase detection above the source. Some techniques allow for remote detection while others have highly specific and sensitive analysis capabilities. This program is addressing a range of fundamental, technical issues associated with trace detection of explosive-related compounds using these techniques. For example, while both LIBS and THz can be used to carry out remote analysis of condensed phase analyte from a distance in excess of several meters, the sensitivities of these techniques to surface adsorbed explosive-related compounds are not currently known. In current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances required for these techniques including signature identification for explosive-related compounds/interferents and trace analyte extraction. Later program tasks will explore simultaneous application of two or more techniques to assess the benefits of sensor fusion.
ERIC Educational Resources Information Center
Cihon, Traci M.; Cihon, Joseph H.; Bedient, Guy M.
2016-01-01
The technical language of behavior analysis is arguably necessary to share ideas and research with precision among each other. However, it can hinder effective implementation of behavior analytic techniques when it prevents clear communication between the supervising behavior analyst and behavior technicians. The present paper provides a case…
Interviewing a Silent (Radioactive) Witness through Nuclear Forensic Analysis.
Mayer, Klaus; Wallenius, Maria; Varga, Zsolt
2015-12-01
Nuclear forensics is a relatively young discipline in science which aims at providing information on nuclear material of unknown origin. The determination of characteristic parameters through tailored analytical techniques enables establishing linkages to the material's processing history and hence provides hints on its place and date of production and on the intended use.
A Study of Online Exams Procrastination Using Data Analytics Techniques
ERIC Educational Resources Information Center
Levy, Yair; Ramim, Michelle M.
2012-01-01
Procrastination appears to be an inevitable part of daily life, especially for activities that are bounded by deadlines. It has implications for performance and is known to be linked to poor personal time management. Although research related to procrastination as a general behavior has been well established, studies assessing procrastination in…
USDA-ARS?s Scientific Manuscript database
Immunoassay for low molecular weight food contaminants, such as pesticides, veterinary drugs, and mycotoxins is now a well-established technique which meets the demands for a rapid, reliable, and cost-effective analytical method. However, due to limited understanding of the fundamental aspects of i...
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new database approaches designed to support new forms of unstructured or semi-structured data, as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how correlation- and causality-based approaches provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand what approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community about what approaches are technically and scientifically feasible.
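To make the 'simple (iterative) map-reduce' idea mentioned above concrete, here is a minimal, framework-free sketch of one k-means iteration expressed as a map step and a reduce step; the points and centroids are arbitrary, and no Hadoop, Twister, or Mahout API is implied.

```python
# Minimal, framework-free sketch of one k-means iteration phrased as map/reduce steps.
# Points and initial centroids are arbitrary; no Hadoop/Twister/Mahout API is implied.
from collections import defaultdict

points = [(1.0, 2.0), (1.5, 1.8), (8.0, 8.0), (9.0, 11.0), (1.0, 0.6)]
centroids = [(1.0, 1.0), (9.0, 9.0)]

def nearest(p):
    return min(range(len(centroids)),
               key=lambda k: sum((a - b) ** 2 for a, b in zip(p, centroids[k])))

groups = defaultdict(list)
for p in points:                 # "map": emit (nearest centroid index, point)
    groups[nearest(p)].append(p)

centroids = [tuple(sum(axis) / len(members) for axis in zip(*members))
             for _, members in sorted(groups.items())]   # "reduce": average points per key
print(centroids)
```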
Propeller flow visualization techniques
NASA Technical Reports Server (NTRS)
Stefko, G. L.; Paulovich, F. J.; Greissing, J. P.; Walker, E. D.
1982-01-01
Propeller flow visualization techniques were tested. The actual operating blade shape, which determines the actual propeller performance and noise, was established. The ability to photographically determine advanced propeller blade tip deflections and local flow field conditions, and to gain insight into aeroelastic instability, is demonstrated. The analytical prediction methods being developed can be compared with these experimental data. Such comparisons contribute to the verification of the improved methods and give improved capability for designing future advanced propellers with enhanced performance and noise characteristics.
López-Guerra, Enrique A
2017-01-01
We explore the contact problem of a flat-end indenter penetrating intermittently a generalized viscoelastic surface, containing multiple characteristic times. This problem is especially relevant for nanoprobing of viscoelastic surfaces with the highly popular tapping-mode AFM imaging technique. By focusing on the material perspective and employing a rigorous rheological approach, we deliver analytical closed-form solutions that provide physical insight into the viscoelastic sources of repulsive forces, tip–sample dissipation and virial of the interaction. We also offer a systematic comparison to the well-established standard harmonic excitation, which is the case relevant for dynamic mechanical analysis (DMA) and for AFM techniques where tip–sample sinusoidal interaction is permanent. This comparison highlights the substantial complexity added by the intermittent-contact nature of the interaction, which precludes the derivation of straightforward equations as is the case for the well-known harmonic excitations. The derivations offered have been thoroughly validated through numerical simulations. Despite the complexities inherent to the intermittent-contact nature of the technique, the analytical findings highlight the potential feasibility of extracting meaningful viscoelastic properties with this imaging method. PMID:29114450
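For reference, a surface "containing multiple characteristic times" is commonly represented by a generalized Maxwell (Wiechert) relaxation modulus; the expression below is the standard textbook form and is not necessarily the exact parameterization used by the author:

$$G(t) = G_{\infty} + \sum_{k=1}^{N} G_k \, e^{-t/\tau_k},$$

where $G_{\infty}$ is the equilibrium (long-time) modulus, $G_k$ the modulus of the $k$-th relaxation arm, and $\tau_k$ its characteristic relaxation time.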
Coconut matting bezoar identified by a combined analytical approach.
Levison, D A; Crocker, P R; Boxall, T A; Randall, K J
1986-01-01
A rare type of bezoar composed of coconut matting was found in the stomach of a Caucasian man. The exact identity of the fibres was established by scanning electron microscopy, x-ray energy spectroscopy, and microscopic infrared spectroscopy. This report illustrates the importance of these techniques for identifying the nature of foreign material. PMID:3950038
ERIC Educational Resources Information Center
Wyman, Steven K.; And Others
This exploratory study establishes analytical tools (based on both technical criteria and user feedback) by which federal Web site administrators may assess the quality of their websites. The study combined qualitative and quantitative data collection techniques to achieve the following objectives: (1) identify and define key issues regarding…
Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel
2016-09-07
We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.
PFOA and PFOS: Treatment and Analytics | Science Inventory ...
PFOA and PFOS are not regulated by the USEPA. However, in 2016, USEPA established a Lifetime Drinking Water Health Advisory limit of 70 ng/L for the combined concentration of PFOA and PFOS. This presentation will cover the available technologies that can treat for PFOA and PFOS and discuss the costs of those treatments. It will also cover the implementation of EPA's Method 537 that can be used to analyze for PFOA and PFOS. The purpose is to present the available treatments a community could use for PFOA or PFOS, and the analytical technique used to analyze them.
NASA Technical Reports Server (NTRS)
Weinstock, K. J.; Morrissey, L. A.
1984-01-01
Rock type identification may be assisted by the use of remote sensing of associated vegetation, particularly in areas of dense vegetative cover where surface materials are not imaged directly by the sensor. The geobotanical discrimination of ultramafic parent materials was investigated and analytical techniques for lithologic mapping and mineral exploration were developed. The utility of remotely sensed data to discriminate vegetation types associated with ultramafic parent materials in a study area in southwest Oregon was evaluated. A number of specific objectives were identified, which include: (1) establishment of the association between vegetation and rock types; (2) examination of the spectral separability of vegetation types associated with rock types; (3) determination of the contribution of each TMS band for discriminating vegetation associated with rock types; and (4) comparison of analytical techniques for spectrally classifying vegetation.
NASA Astrophysics Data System (ADS)
Anselmi, C.; Presciutti, F.; Doherty, B.; Brunetti, B. G.; Sgamellotti, A.; Miliani, C.
2011-07-01
This contribution focuses on an analytical evaluation of the use of cyclododecane (CDD) (C12H24) as a temporary protective coating for non-porous stone materials of cultural heritage interest. A facile solvent spray application technique for the production of an adherent continuous film has been assessed. The criterion for monitoring the sublimation of the cyclododecane film on marble has been established through the use of non-invasive analytical techniques so as to avoid any interaction with the process under study, and the results serve to integrate and enhance knowledge of the use of cyclododecane in this discipline. Research is directed towards testing the applicability of portable infrared reflectance spectroscopy and nuclear magnetic resonance profilometry systems to follow the in-situ behavior of temporary consolidants. In particular, the coupling of two spectroscopic techniques, IR and NMR, has been possible, enabling description of both the formation of the film and its kinetics of sublimation.
Wille, G; Lerouge, C; Schmidt, U
2018-01-16
In cassiterite, tin is associated with metals (titanium, niobium, tantalum, indium, tungsten, iron, manganese, mercury). Knowledge of mineral chemistry and trace-element distribution is essential for: the understanding of ore formation, the exploration phase, the feasibility of ore treatment, and disposal/treatment of tailings after the exploitation phase. However, the limited availability of suitable analytical methods makes these characterisations difficult. We present a multi-technique approach to chemical and structural data that includes scanning electron microscopy (SEM)-based imaging and microanalysis techniques such as: secondary and backscattered electrons, cathodoluminescence (CL), electron probe microanalyser (EPMA), electron backscattered diffraction (EBSD) and confocal Raman imaging integrated in a SEM (RISE). The results presented show the complementarity of the analytical techniques used. SEM, CL, EBSD and EPMA provide information from the interaction of an electron beam with minerals, leading to atomistic information about their composition, whereas RISE, Raman spectroscopy and imaging complete the studies with information about molecular vibrations, which are sensitive to structural modifications of the minerals. The correlation of Raman bands with the presence/absence of Nb, Ta, Fe (heterovalent substitution) and Ti (homovalent substitution) is established at a submicrometric scale. Combination of the different techniques makes it possible to establish a direct link between chemical and crystallographic data for cassiterite. © 2018 The Authors Journal of Microscopy © 2018 Royal Microscopical Society.
Adhesion, friction, wear, and lubrication research by modern surface science techniques.
NASA Technical Reports Server (NTRS)
Keller, D. V., Jr.
1972-01-01
The field of surface science has undergone intense revitalization with the introduction of low-energy electron diffraction, Auger electron spectroscopy, ellipsometry, and other surface analytical techniques which have been refined within the last decade. These developments have permitted submono- and monolayer structure analysis as well as chemical identification and quantitative analysis. The application of a number of these techniques to the solution of problems in the fields of friction, lubrication, and wear is examined in detail for the particular case of iron, and in general to illustrate how the accumulation of pure data will contribute toward the establishment of the physicochemical concepts which are required to understand the mechanisms that are operational in friction systems. In the case of iron, LEED, Auger and microcontact studies have established that hydrogen and light saturated organic vapors do not establish interfaces which prevent iron from welding, whereas oxygen and some oxygen and sulfur compounds do reduce welding as well as the coefficient of friction. Interpretation of these data suggests a mechanism of sulfur interaction in lubricating systems.
Ribeiro, José A; Silva, F; Pereira, Carlos M
2013-02-05
In this work, the ion transfer mechanism of the anticancer drug daunorubicin (DNR) at a liquid/liquid interface has been studied for the first time. This study was carried out using electrochemical techniques, namely cyclic voltammetry (CV) and differential pulse voltammetry (DPV). The lipophilicity of DNR was investigated at the water/1,6-dichlorohexane (DCH) interface, and the results obtained were presented in the form of an ionic partition diagram. The partition coefficients of both neutral and ionic forms of the drug were determined. The analytical parameter for the detection of DNR was also investigated in this work. An electrochemical DNR sensor is proposed by means of simple ion transfer at the water/DCH interface, using DPV as the quantification technique. Experimental conditions for the analytical determination of DNR were established, and a detection limit of 0.80 μM was obtained.
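The abstract reports a 0.80 μM detection limit by DPV but does not describe how it was estimated; a common convention is LOD = 3·s(blank)/slope from a linear calibration. The sketch below illustrates only that convention, with made-up numbers that are not from the paper.

```python
# Illustrative detection-limit estimate from a linear DPV calibration using the
# common 3*sigma/slope convention. All numbers are hypothetical, not from the study.
import numpy as np

conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])      # DNR concentration, micromolar
peak = np.array([0.8, 1.7, 4.1, 8.3, 16.5])       # DPV peak current, microamps
blank_sd = 0.05                                    # std. dev. of blank responses, microamps

slope, intercept = np.polyfit(conc, peak, 1)       # linear calibration
lod = 3 * blank_sd / slope
print(f"slope = {slope:.3f} uA/uM, LOD ~ {lod:.2f} uM")
```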
Characterization and measurement of polymer wear
NASA Technical Reports Server (NTRS)
Buckley, D. H.; Aron, P. R.
1984-01-01
Analytical tools which characterize the polymer wear process are discussed. The techniques discussed include visual observation of polymer wear with SEM; quantification with surface profilometry and ellipsometry; study of the chemistry with AES, XPS and SIMS; establishment of interfacial polymer orientation, and hence bonding, with QUARTIR; assessment of polymer state with Raman spectroscopy; and measurement of the stresses that develop in polymer films using an X-ray double-crystal camera technique.
Analytical procedures for water-soluble vitamins in foods and dietary supplements: a review.
Blake, Christopher J
2007-09-01
Water-soluble vitamins include the B-group vitamins and vitamin C. In order to correctly monitor water-soluble vitamin content in fortified foods for compliance monitoring as well as to establish accurate data banks, an accurate and precise analytical method is a prerequisite. For many years microbiological assays have been used for analysis of B vitamins. However they are no longer considered to be the gold standard in vitamins analysis as many studies have shown up their deficiencies. This review describes the current status of analytical methods, including microbiological assays and spectrophotometric, biosensor and chromatographic techniques. In particular it describes the current status of the official methods and highlights some new developments in chromatographic procedures and detection methods. An overview is made of multivitamin extractions and analyses for foods and supplements.
Doerrer, Nancy; Ladics, Gregory; McClain, Scott; Herouet-Guicheney, Corinne; Poulsen, Lars K; Privalle, Laura; Stagg, Nicola
2010-12-01
The International Life Sciences Institute Health and Environmental Sciences Institute Protein Allergenicity Technical Committee hosted an international workshop November 16-17, 2009, in Paris, France, with over 60 participants from academia, government, and industry to review and discuss the potential utility of "-omics" technologies for assessing the variability in plant gene, protein, and metabolite expression. The goal of the workshop was to illustrate how a plant's constituent makeup and phenotypic processes can be surveyed analytically. Presentations on the "-omics" techniques (i.e., genomics, proteomics, and metabolomics) highlighted the workshop, and summaries of these presentations are published separately in this supplemental issue. This paper summarizes key messages, as well as the consensus points reached, in a roundtable discussion on eight specific questions posed during the final session of the workshop. The workshop established some common, though not unique, challenges for all "-omics" techniques, which include (a) standardization of separation/extraction and analytical techniques; (b) difficulty in associating environmental impacts (e.g., planting, soil texture, location, climate, stress) with potential alterations in plants at genomic, proteomic, and metabolomic levels; (c) many independent analytical measurements, but few replicates/subjects--poorly defined accuracy and precision; and (d) bias--a lack of hypothesis-driven science. Information on natural plant variation is critical in establishing the utility of new technologies due to the variability in specific analytes that may result from genetic differences (crop genotype), different crop management practices (conventional high input, low input, organic), interaction between genotype and environment, and the use of different breeding methods. For example, variations of several classes of proteins were reported among different soybean, rice, or wheat varieties or varieties grown at different locations. Data on the variability of allergenic proteins are important in defining the risk of potential allergenicity. Once established as a standardized assay, survey approaches such as the "-omics" techniques can be considered in a hypothesis-driven analysis of plants, such as determining unintended effects in genetically modified (GM) crops. However, the analysis should include both the GM and control varieties that have the same breeding history and exposure to the same environmental conditions. Importantly, the biological relevance and safety significance of changes in "-omic" data are still unknown. Furthermore, the current compositional assessment for evaluating the substantial equivalence of GM crops is robust, comprehensive, and a good tool for food safety assessments. The overall consensus of the workshop participants was that many "-omics" techniques are extremely useful in the discovery and research phases of biotechnology, and are valuable for hypothesis generation. However, there are many methodological shortcomings identified with "-omics" approaches, a paucity of reference materials, and a lack of focused strategy for their use that currently make them not conducive to the safety assessment of GM crops. Copyright © 2010 Elsevier Inc. All rights reserved.
Design and fabrication of planar structures with graded electromagnetic properties
NASA Astrophysics Data System (ADS)
Good, Brandon Lowell
Successfully integrating electromagnetic properties in planar structures offers numerous benefits to the microwave and optical communities. This work aims at formulating new analytic and optimized design methods, creating new fabrication techniques for achieving those methods, and matching appropriate implementation of methods to fabrication techniques. The analytic method consists of modifying an approach that realizes perfect antireflective properties from graded profiles. This method is shown for all-dielectric and magneto-dielectric grading profiles. The optimized design methods are applied to transformer (discrete) or taper (continuous) designs. From these methods, a subtractive and an additive manufacturing technique were established and are described. The additive method, dry powder dot deposition, enables three dimensional varying electromagnetic properties in a structural composite. Combining the methods and fabrication is shown in two applied methodologies. The first uses dry powder dot deposition to design one dimensionally graded electromagnetic profiles in a planar fiberglass composite. The second method simultaneously applies antireflective properties and adjusts directivity through a slab through the use of subwavelength structures to achieve a flat antireflective lens. The end result of this work is a complete set of methods, formulations, and fabrication techniques to achieve integrated electromagnetic properties in planar structures.
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
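The study above applied control charts interpreted with established rules; one standard QI choice for individual measurements is the individuals (XmR) chart, whose limits are derived from the mean moving range. The sketch below shows only that generic calculation, on hypothetical data, not the study's dataset.

```python
# Minimal individuals (XmR) control-chart limits, a standard QI analytic tool.
# The values below are hypothetical; the original study used ED length-of-stay data.
import numpy as np

los = np.array([142, 150, 138, 145, 120, 118, 122, 115, 119, 121])  # e.g. weekly mean LOS (min)

moving_range = np.abs(np.diff(los))
mr_bar = moving_range.mean()
center = los.mean()
ucl = center + 2.66 * mr_bar        # 2.66 is the standard XmR constant (3 / d2 for n = 2)
lcl = center - 2.66 * mr_bar

signals = np.where((los > ucl) | (los < lcl))[0]
print(f"CL={center:.1f}, UCL={ucl:.1f}, LCL={lcl:.1f}, out-of-control points: {signals}")
```

Because control-chart rules are applied sequentially as points accrue, a sustained shift like the one simulated above can be flagged well before a fixed-sample comparison would conclude, which is the benefit the authors describe.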
Chylewska, Agnieszka; Ogryzek, M; Makowski, Mariusz
2017-10-23
New analytical and molecular methods for microorganisms are being developed around various criteria of identification, i.e. selectivity, specificity, sensitivity, rapidity and discrimination of the viable cell. This review was established following the current trends in improved pathogen separation and detection methods and their subsequent use in medical diagnosis. This contribution also focuses on the development of analytical and biological methods in the analysis of microorganisms, with special attention paid to bio-samples containing microbes (blood, urine, lymph, wastewater). First, the paper discusses the characterization of microbes, their structure, surface properties and size; it then describes pivotal points in the bacteria, virus and fungi separation procedures achieved by researchers in the last 30 years. Accordingly, detection techniques can be classified into three categories which were, in our opinion, examined and modified most intensively during this period: electrophoretic, nucleic-acid-based, and immunological methods. The review also covers the progress, limitations and challenges of these approaches and emphasizes the advantages of new separative techniques in the selective fractionation of microorganisms. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Lianidou, Evi; Ahmad-Nejad, Parviz; Ferreira-Gonzalez, Andrea; Izuhara, Kenji; Cremonesi, Laura; Schroeder, Maria-Eugenia; Richter, Karin; Ferrari, Maurizio; Neumaier, Michael
2014-09-25
Molecular techniques are becoming commonplace in the diagnostic laboratory. Their applications influence all major phases of laboratory medicine including predisposition/genetic risk, primary diagnosis, therapy stratification and prognosis. Readily available laboratory hardware and wetware (i.e. consumables and reagents) foster rapid dissemination to countries that are just establishing molecular testing programs. Appropriate skill levels extending beyond the technical procedure are required for analytical and diagnostic proficiency that is mandatory in molecular genetic testing. An international committee (C-CMBC) of the International Federation for Clinical Chemistry (IFCC) was established to disseminate skills in molecular genetic testing in member countries embarking on the respective techniques. We report the ten-year experience with different teaching and workshop formats for beginners in molecular diagnostics. Copyright © 2014 Elsevier B.V. All rights reserved.
Automated measurement of respiratory gas exchange by an inert gas dilution technique
NASA Technical Reports Server (NTRS)
Sawin, C. F.; Rummel, J. A.; Michel, E. L.
1974-01-01
A respiratory gas analyzer (RGA) has been developed wherein a mass spectrometer is the sole transducer required for measurement of respiratory gas exchange. The mass spectrometer maintains all signals in absolute phase relationships, precluding the need to synchronize flow and gas composition as required in other systems. The RGA system was evaluated by comparison with the Douglas bag technique. The RGA system established the feasibility of the inert gas dilution method for measuring breath-by-breath respiratory gas exchange. This breath-by-breath analytical capability permits detailed study of transient respiratory responses to exercise.
A complete and partial integrability technique of the Lorenz system
NASA Astrophysics Data System (ADS)
Bougoffa, Lazhar; Al-Awfi, Saud; Bougouffa, Smail
2018-06-01
In this paper we deal with the well-known nonlinear Lorenz system that describes the phenomenon of deterministic chaos. We consider an interesting problem with time-varying phenomena in quantum optics and establish, from the equations of motion, the passage to the Lorenz system. Furthermore, we show that the reduction to a third-order nonlinear equation can be performed. The resulting differential equation can be solved analytically in some special cases and transformed into Abel, Duffing, Painlevé and generalized Emden-Fowler equations. Thus, a technique permitting complete and partial integrability of the Lorenz system is presented.
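For reference, the standard Lorenz system referred to above is

$$\dot{x} = \sigma\,(y - x), \qquad \dot{y} = x\,(\rho - z) - y, \qquad \dot{z} = x\,y - \beta z,$$

with positive parameters $\sigma$, $\rho$ and $\beta$; the specific quantum-optics parameterization used by the authors is not reproduced here.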
Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi
2017-01-01
Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
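The paper's precise noise definition is not reproduced in the abstract; the sketch below only illustrates the general idea of an analytical threshold derived from background noise, using an assumed mean-plus-k-standard-deviations rule applied to non-target read counts from control samples.

```python
# Sketch of an analytical threshold derived from background noise, in the spirit of
# the approach described above. The noise model (mean + k*SD of non-allelic reads)
# is an illustrative assumption, not the paper's exact definition.
import numpy as np

def analytical_threshold(noise_reads, k=3.0):
    """Return a read-count threshold from observed background (noise) read counts."""
    noise_reads = np.asarray(noise_reads, dtype=float)
    return noise_reads.mean() + k * noise_reads.std(ddof=1)

# Hypothetical per-locus non-allelic read counts from known-source control samples.
background = [4, 7, 2, 5, 9, 3, 6, 4, 8, 5]
print(f"analytical threshold ~ {analytical_threshold(background):.1f} reads")
```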
Finding Waldo: Learning about Users from their Interactions.
Brown, Eli T; Ottley, Alvitta; Zhao, Helen; Quan Lin; Souvenir, Richard; Endert, Alex; Chang, Remco
2014-12-01
Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user's interactions with a system reflect a large amount of the user's reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user's task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and apply well-known machine learning algorithms to three encodings of the users' interaction data. We achieve, depending on algorithm and encoding, between 62% and 83% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user's personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time: in one case 95% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
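The specific interaction encodings and algorithms used in the study are not detailed in the abstract; the sketch below only shows the general pipeline it describes: turn each user's interaction log into a fixed-length feature vector, then train a standard classifier to predict fast versus slow task completion. Feature choices and the classifier are illustrative assumptions.

```python
# Sketch of the interaction-mining pipeline: featurize event logs, then predict
# fast (1) vs. slow (0) completion with a standard classifier (illustrative only).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def featurize(events):
    """Turn one user's event log [(timestamp, event_type), ...] into a feature vector."""
    times = np.array([t for t, _ in events], dtype=float)
    clicks = sum(1 for _, e in events if e == "click")
    hovers = sum(1 for _, e in events if e == "hover")
    gaps = np.diff(times) if len(times) > 1 else np.array([0.0])
    return [len(events), clicks, hovers, gaps.mean(), gaps.max()]

# Hypothetical data: 8 users' logs and whether each finished fast (1) or slow (0).
rng = np.random.default_rng(0)
logs = [[(float(t), rng.choice(["click", "hover"])) for t in sorted(rng.uniform(0, 60, 20))]
        for _ in range(8)]
labels = [1, 0, 1, 1, 0, 0, 1, 0]

X = np.array([featurize(log) for log in logs])
scores = cross_val_score(SVC(kernel="linear"), X, labels, cv=4)
print("cross-validated accuracy:", scores.mean())
```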
[An evaluation of costs in nephrology by means of analytical accounting system].
Hernández-Jaras, J; García Pérez, H; Pons, R; Calvo, C
2005-01-01
Analytical accounting is an accounting technique directed at evaluating, by means of pre-established distribution criteria, the internal economy of the hospital, in order to know the effectiveness and efficiency of clinical units. The aim of this study was to analyze the activity and costs of the Nephrology Department of the General Hospital of Castellón. Hospitalization and ambulatory care activity during 2003 was analysed. Hospitalization discharges were grouped into DRGs and the costs per DRG were determined. Total costs of hospitalization and ambulatory care were 560,434.9 and 146,317.8 Euros, respectively, and the costs of one stay, one first outpatient visit and one maintenance visit were 200, 63, and 31.6 Euros, respectively. Eighty per cent of the discharges were grouped into 9 DRGs, and DRG number 316 (Renal Failure) represented 30% of the total productivity. Costs of DRG 316 were 3,178.2 Euros, of which 16% represented laboratory costs and the costs of diagnostic or therapeutic procedures. With the introduction of analytical accounting and the DRG system, Nephrology Departments can acquire fuller information on the results and costs of treatment. These techniques permit improvement of financial and economic performance.
On effective temperature in network models of collective behavior
DOE Office of Scientific and Technical Information (OSTI.GOV)
Porfiri, Maurizio, E-mail: mporfiri@nyu.edu; Ariel, Gil, E-mail: arielg@math.biu.ac.il
Collective behavior of self-propelled units is studied analytically within the Vectorial Network Model (VNM), a mean-field approximation of the well-known Vicsek model. We propose a dynamical systems framework to study the stochastic dynamics of the VNM in the presence of general additive noise. We establish that a single parameter, which is a linear function of the circular mean of the noise, controls the macroscopic phase of the system—ordered or disordered. By establishing a fluctuation–dissipation relation, we posit that this parameter can be regarded as an effective temperature of collective behavior. The exact critical temperature is obtained analytically for systems with small connectivity, equivalent to low-density ensembles of self-propelled units. Numerical simulations are conducted to demonstrate the applicability of this new notion of effective temperature to the Vicsek model. The identification of an effective temperature of collective behavior is an important step toward understanding order–disorder phase transitions, informing consistent coarse-graining techniques and explaining the physics underlying the emergence of collective phenomena.
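The sketch below is a minimal simulation of the Vicsek model referenced above, computing the polar order parameter that distinguishes the ordered and disordered phases; it does not reproduce the VNM mean-field analysis or the paper's effective-temperature definition, and the noise parameterization is an illustrative assumption.

```python
# Minimal 2-D Vicsek model: each particle aligns with the mean heading of its
# neighbours within radius r, plus uniform angular noise. The polar order
# parameter approaches 1 in the ordered phase and 0 in the disordered phase.
import numpy as np

def vicsek_order(n=300, box=10.0, r=1.0, v0=0.05, eta=0.3, steps=500, seed=1):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(0, box, (n, 2))
    theta = rng.uniform(-np.pi, np.pi, n)
    for _ in range(steps):
        # Pairwise displacements with periodic (minimum-image) boundaries.
        d = pos[:, None, :] - pos[None, :, :]
        d -= box * np.round(d / box)
        neighbours = (d ** 2).sum(-1) < r ** 2
        # Circular mean heading of neighbours, plus additive angular noise.
        mean_sin = neighbours @ np.sin(theta)
        mean_cos = neighbours @ np.cos(theta)
        theta = np.arctan2(mean_sin, mean_cos) + eta * rng.uniform(-np.pi, np.pi, n)
        pos = (pos + v0 * np.column_stack((np.cos(theta), np.sin(theta)))) % box
    return np.abs(np.exp(1j * theta).mean())       # polar order parameter

print("low noise :", vicsek_order(eta=0.1))
print("high noise:", vicsek_order(eta=1.0))
```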
Effects of joints in truss structures
NASA Technical Reports Server (NTRS)
Ikegami, R.
1988-01-01
The response of truss-type structures for future space applications, such as Large Deployable Reflector (LDR), will be directly affected by joint performance. Some of the objectives of research at BAC were to characterize structural joints, establish analytical approaches that incorporate joint characteristics, and experimentally establish the validity of the analytical approaches. The test approach to characterize joints for both erectable and deployable-type structures was based upon a Force State Mapping Technique. The approach pictorially shows how the nonlinear joint results can be used for equivalent linear analysis. Testing of the Space Station joints developed at LaRC (a hinged joint at 2 Hz and a clevis joint at 2 Hz) successfully revealed the nonlinear characteristics of the joints. The Space Station joints were effectively linear when loaded to plus or minus 500 pounds with a corresponding displacement of about plus or minus 0.0015 inch. It was indicated that good linear joints exist which are compatible with erected structures, but that difficulty may be encountered if nonlinear-type joints are incorporated in the structure.
NASA Astrophysics Data System (ADS)
Setty, Srinivas J.; Cefola, Paul J.; Montenbruck, Oliver; Fiedler, Hauke
2016-05-01
Catalog maintenance for Space Situational Awareness (SSA) demands an accurate and computationally lean orbit propagation and orbit determination technique to cope with the ever increasing number of observed space objects. As an alternative to established numerical and analytical methods, we investigate the accuracy and computational load of the Draper Semi-analytical Satellite Theory (DSST). The standalone version of the DSST was enhanced with additional perturbation models to improve its recovery of short periodic motion. The accuracy of DSST is, for the first time, compared to a numerical propagator with fidelity force models for a comprehensive grid of low, medium, and high altitude orbits with varying eccentricity and different inclinations. Furthermore, the run-time of both propagators is compared as a function of propagation arc, output step size and gravity field order to assess its performance for a full range of relevant use cases. For use in orbit determination, a robust performance of DSST is demonstrated even in the case of sparse observations, which is most sensitive to mismodeled short periodic perturbations. Overall, DSST is shown to exhibit adequate accuracy at favorable computational speed for the full set of orbits that need to be considered in space surveillance. Along with the inherent benefits of a semi-analytical orbit representation, DSST provides an attractive alternative to the more common numerical orbit propagation techniques.
Harel, Elad; Schröder, Leif; Xu, Shoujun
2008-01-01
Nuclear magnetic resonance (NMR) is a well-established analytical technique in chemistry. The ability to precisely control the nuclear spin interactions that give rise to the NMR phenomenon has led to revolutionary advances in fields as diverse as protein structure determination and medical diagnosis. Here, we discuss methods for increasing the sensitivity of magnetic resonance experiments, moving away from the paradigm of traditional NMR by separating the encoding and detection steps of the experiment. This added flexibility allows for diverse applications ranging from lab-on-a-chip flow imaging and biological sensors to optical detection of magnetic resonance imaging at low magnetic fields. We aim to compare and discuss various approaches for a host of problems in material science, biology, and physics that differ from the high-field methods routinely used in analytical chemistry and medical imaging.
METHOD 544. DETERMINATION OF MICROCYSTINS AND ...
Method 544 is an accurate and precise analytical method to determine six microcystins (including MC-LR) and nodularin in drinking water using solid phase extraction and liquid chromatography tandem mass spectrometry (SPE-LC/MS/MS). The advantage of this SPE-LC/MS/MS method is its sensitivity and ability to speciate the microcystins. This method development task establishes sample preservation techniques, sample concentration and analytical procedures, aqueous and extract holding time criteria and quality control procedures. Draft Method 544 has undergone a multi-laboratory verification to ensure other laboratories can implement the method and achieve the quality control measures specified in the method. It is anticipated that Method 544 may be used in UCMR 4 to collect nationwide occurrence data for selected microcystins in drinking water. The purpose of this research project is to develop an accurate and precise analytical method to concentrate and determine selected MCs and nodularin in drinking water.
2012-01-01
Background: Establishing the distribution of materials in paintings and that of their degradation products by imaging techniques is fundamental to understanding the painting technique and can improve our knowledge of the conservation status of the painting. The combined use of chromatographic-mass spectrometric techniques, such as GC/MS or Py/GC/MS, and the chemical mapping of functional groups by imaging SR FTIR in transmission mode on thin sections and SR XRD line scans will be presented as a suitable approach to obtain a detailed characterisation of the materials in a paint sample, assuring their localisation in the sample build-up. This analytical approach has been used to study samples from Catalan paintings by Josep Maria Sert y Badía (20th century), a muralist who achieved international recognition and whose canvases adorned international buildings. Results: The pigments used by the painter, as well as the organic materials used as binders and varnishes, could be identified by means of conventional techniques. Mapping the distribution of these materials by means of Synchrotron Radiation based techniques made it possible to establish the mixtures used by the painter depending on the purpose. Conclusions: The results show the suitability of the combined use of SR μFTIR and SR μXRD mapping and conventional techniques to unequivocally identify all the materials present in the sample and their localization in the sample build-up. This kind of approach becomes indispensable to solve the challenge of micro-heterogeneous samples. The complementary interpretation of the data obtained with all the different techniques allowed the characterization of both organic and inorganic materials in the samples layer by layer, as well as establishing the painting techniques used by Sert in the works-of-art under study. PMID:22616949
[Standard addition determination of impurities in Na2CrO4 by ICP-AES].
Wang, Li-ping; Feng, Hai-tao; Dong, Ya-ping; Peng, Jiao-yu; Li, Wu; Shi, Hai-qin; Wang, Yong
2015-02-01
Inductively coupled plasma atomic emission spectrometry (ICP-AES) was used to determine the trace impurities Ca, Mg, Al, Fe and Si in industrial sodium chromate. Wavelengths of 167.079, 393.366, 259.940, 279.533 and 251.611 nm were selected as analytical lines for the determination of Al, Ca, Fe, Mg and Si, respectively. Analytical errors can be eliminated by adjusting the sample solution with high-purity hydrochloric acid. The standard addition method was used to eliminate matrix effects. The linear correlation, detection limit, precision and recovery for the trace impurities of concern were examined, and the effect of the standard addition method on the accuracy of the determination at the selected analytical lines was studied in detail. The results show that the linear correlations of the standard curves were very good (R2 = 0.9988 to 0.9996) under the conditions used. Detection limits of these trace impurities were in the range of 0.0134 to 0.0280 mg·L(-1). Sample recoveries were within 97.30% to 107.50%, and relative standard deviations were lower than 5.86% for eleven repeated determinations. The detection limits and accuracies established by the experiment meet the analytical requirements, and the procedure was used successfully to determine trace impurities in sodium chromate produced by the ion-membrane electrolysis technique. Because sodium chromate can be converted into sodium dichromate and chromic acid by adding acids, the established method can be further used to monitor trace impurities in these compounds or other hexavalent chromium compounds.
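For orientation, the standard addition calculation used to cancel matrix effects regresses the measured signal against the concentration added and extrapolates to zero signal; the unknown concentration in the measured solution is the magnitude of the x-intercept. The sketch below uses made-up numbers, not data from the study.

```python
# Standard addition: regress emission intensity against concentration added, then
# extrapolate; the unknown concentration is intercept / slope (x-intercept magnitude).
# Numbers are illustrative only.
import numpy as np

added = np.array([0.0, 0.5, 1.0, 2.0, 4.0])             # analyte added, mg/L
signal = np.array([120.0, 168.0, 215.0, 312.0, 505.0])  # emission intensity (a.u.)

slope, intercept = np.polyfit(added, signal, 1)
c_sample = intercept / slope                             # mg/L in the measured solution
print(f"analyte in measured solution ~ {c_sample:.2f} mg/L")
```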
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications namely, gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulation such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
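For reference, the Kreisselmeier-Steinhauser function mentioned above aggregates multiple objective (or constraint) functions $f_i(\mathbf{x})$ into a single smooth envelope; a commonly used, numerically stable form is

$$KS(\mathbf{x}) = f_{\max}(\mathbf{x}) + \frac{1}{\rho}\,\ln\!\left[\sum_{i=1}^{m} e^{\rho\,\left(f_i(\mathbf{x}) - f_{\max}(\mathbf{x})\right)}\right],$$

where $\rho > 0$ is the draw-down factor and $KS \to \max_i f_i$ as $\rho \to \infty$. The exact formulation and weighting used in the cited optimization procedure are not reproduced here.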
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.; Matijevic, J. R.
1987-01-01
Novel system engineering techniques have been developed and applied to establishing structured design and performance objectives for the Telerobotics Testbed that reduce technical risk while still allowing the testbed to demonstrate an advancement in state-of-the-art robotic technologies. To establish the appropriate tradeoff structure and balance of technology performance against technical risk, an analytical data base was developed which drew on: (1) automation/robot-technology availability projections, (2) typical or potential application mission task sets, (3) performance simulations, (4) project schedule constraints, and (5) project funding constraints. Design tradeoffs and configuration/performance iterations were conducted by comparing feasible technology/task set configurations against schedule/budget constraints as well as original program target technology objectives. The final system configuration, task set, and technology set reflected a balanced advancement in state-of-the-art robotic technologies, while meeting programmatic objectives and schedule/cost constraints.
Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...
2014-10-03
Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.
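For orientation, acid-number results of the kind discussed above are conventionally expressed in mg KOH per gram of sample; assuming the standard ASTM D664-style relation (the exact AMTAN workup is not reproduced here),

$$\mathrm{TAN}\ (\mathrm{mg\ KOH/g}) = \frac{(V_{ep} - V_{blank}) \times N \times 56.1}{m},$$

where $V_{ep}$ and $V_{blank}$ are the sample and blank titrant volumes in mL, $N$ is the titrant concentration in mol/L, 56.1 is the molar mass of KOH in g/mol, and $m$ is the sample mass in grams.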
Shen, Weijian; Xu, Jinzhong; Yang, Wenquan; Shen, Chongyu; Zhao, Zengyun; Ding, Tao; Wu, Bin
2007-09-01
An analytical method of solid-phase extraction-gas chromatography-mass spectrometry with two different ionization techniques was established for the simultaneous determination of 12 acetanilide herbicide residues in tea leaves. Herbicides were extracted from tea-leaf samples with ethyl acetate. The extract was cleaned up on an activated-carbon SPE column connected to a Florisil SPE column. Analytical screening was performed by gas chromatography (GC)-mass spectrometry (MS) in the selected ion monitoring (SIM) mode with either electron impact ionization (EI) or negative chemical ionization (NCI). The method is reliable and stable: the recoveries of all herbicides were in the range of 50% to 110% at three spiked levels (10 microg/kg, 20 microg/kg and 40 microg/kg), and the relative standard deviations (RSDs) were no more than 10.9%. The two ionization techniques are complementary, as more ion fragmentation information can be obtained from the EI mode while more molecular ion information is obtained from the NCI mode. In a comparison of the two techniques, the selectivity of NCI-SIM was much better than that of the EI-SIM method. The sensitivities of both techniques were high: the limit of quantitation (LOQ) for each herbicide was no more than 2.0 microg/kg, and the limit of detection (LOD) with the NCI-SIM technique was much lower than that of EI-SIM when analyzing herbicides with several halogen atoms in the molecule.
Vapor phase diamond growth technology
NASA Technical Reports Server (NTRS)
Angus, J. C.
1981-01-01
Ion beam deposition chambers used for carbon film generation were designed and constructed. Features of the developed equipment include: (1) carbon ion energies down to approx. 50 eV; (2) in situ surface monitoring with HEED; (3) provision for flooding the surface with ultraviolet radiation; (4) infrared laser heating of the substrate; (5) residual gas monitoring; (6) provision for several source gases, including diborane for doping studies; and (7) growth from either hydrocarbon source gases or from carbon/argon arc sources. Various analytical techniques for characterization of the ion-deposited carbon films, used to establish the nature of the chemical bonding and crystallographic structure of the films, are discussed. These include: H2SO4/HNO3 etch; resistance measurements; hardness tests; Fourier transform infrared spectroscopy; scanning Auger microscopy; electron spectroscopy for chemical analysis; electron diffraction and energy-dispersive X-ray analysis; electron energy loss spectroscopy; density measurements; secondary ion mass spectroscopy; high-energy electron diffraction; and electron spin resonance. Results of the tests are summarized.
NASA Astrophysics Data System (ADS)
Yen, S. P. S.; Lewis, C. R.
Research is reported to identify polycarbonate (PC) film characteristics and fabrication procedures which extend the reliable performance range of PC capacitors to 125 C without derating, and establish quality control techniques and transfer technology to US PC film manufacturers. The approach chosen to solve these problems was to develop techniques for fabricating biaxially oriented (BX) 2 microns or thinner PC film with a low dissipation factor up to 140 C; isotropic dimensional stability; high crystallinity; and high voltage breakdown strength. The PC film structure and morphology was then correlated to thermal and electrical capacitor behavior. Analytical techniques were developed to monitor film quality during capacitor fabrication, and as a result, excellent performance was demonstrated during initial capacitor testing.
An intelligent content discovery technique for health portal content management.
De Silva, Daswin; Burstein, Frada
2014-04-23
Continuous content management of health information portals is a feature vital for their sustainability and widespread acceptance. Knowledge and experience of a domain expert is essential for content management in the health domain. The rate of generation of online health resources is exponential, and thereby manual examination for relevance to a specific topic and audience is a formidable challenge for domain experts. Intelligent content discovery for effective content management is a less researched topic. An existing expert-endorsed content repository can provide the necessary leverage to automatically identify relevant resources and evaluate qualitative metrics. This paper reports on design research towards an intelligent technique for automated content discovery and ranking for health information portals. The proposed technique aims to improve the efficiency of the current, mostly manual process of portal content management by utilising an existing expert-endorsed content repository as a supporting base and a benchmark to evaluate the suitability of new content. A model for content management was established based on a field study of potential users. The proposed technique is integral to this content management model and executes in several phases (i.e., query construction, content search, text analytics and fuzzy multi-criteria ranking). The construction of multi-dimensional search queries with input from Wordnet, the use of multi-word and single-word terms as representative semantics for text analytics, and the use of fuzzy multi-criteria ranking for subjective evaluation of quality metrics are original contributions reported in this paper. The feasibility of the proposed technique was examined with experiments conducted on an actual health information portal, the BCKOnline portal. Both intermediary and final results generated by the technique are presented in the paper, and these help to establish the benefits of the technique and its contribution towards effective content management. The prevalence of large numbers of online health resources is a key obstacle for domain experts involved in content management of health information portals and websites. The proposed technique has proven successful at search and identification of resources and the measurement of their relevance. It can be used to support the domain expert in content management and thereby ensure the health portal is up-to-date and current.
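The sketch below illustrates only the general shape of the final fuzzy multi-criteria ranking phase mentioned above: each raw quality metric is mapped to a [0, 1] membership degree and the degrees are combined with weights. The criteria, membership functions, weights and URLs are illustrative assumptions, not those actually used for the BCKOnline portal.

```python
# Sketch of a fuzzy multi-criteria ranking step (illustrative assumptions throughout).
def ramp(x, lo, hi):
    """Linear fuzzy membership rising from 0 at lo to 1 at hi (clamped to [0, 1])."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def rank_resources(resources, weights):
    scored = []
    for res in resources:
        relevance = ramp(res["similarity"], 0.2, 0.8)      # text-analytics similarity score
        currency = 1.0 - ramp(res["age_days"], 180, 730)    # fresher resources score higher
        authority = ramp(res["citations"], 0, 30)            # crude proxy for endorsement
        score = (weights["relevance"] * relevance
                 + weights["currency"] * currency
                 + weights["authority"] * authority)
        scored.append((round(score, 3), res["url"]))
    return sorted(scored, reverse=True)

candidates = [
    {"url": "http://example.org/new-guideline", "similarity": 0.82, "age_days": 90, "citations": 12},
    {"url": "http://example.org/old-review", "similarity": 0.55, "age_days": 900, "citations": 40},
]
print(rank_resources(candidates, {"relevance": 0.5, "currency": 0.3, "authority": 0.2}))
```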
Analytical techniques for steroid estrogens in water samples - A review.
Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza
2016-12-01
In recent years, environmental concerns over ultra-trace levels of steroid estrogen concentrations in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were instrumental and non-instrumental analytical techniques, namely gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), yeast estrogen screen (YES) assay, and human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA analytical technique, thanks to its lower detection limit, simplicity, rapidity and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
Determining Kinetic Parameters for Isothermal Crystallization of Glasses
NASA Technical Reports Server (NTRS)
Ray, C. S.; Zhang, T.; Reis, S. T.; Brow, R. K.
2006-01-01
Non-isothermal crystallization techniques are frequently used to determine the kinetic parameters for crystallization in glasses. These techniques are experimentally simple and quick compared to the isothermal techniques. However, the analytical models used for non-isothermal data analysis, originally developed for describing isothermal transformation kinetics, are fundamentally flawed. The present paper describes a technique for determining the kinetic parameters for isothermal crystallization in glasses, which eliminates most of the common problems that generally make the studies of isothermal crystallization laborious and time consuming. In this technique, the volume fraction of glass that is crystallized as a function of time during an isothermal hold was determined using differential thermal analysis (DTA). The crystallization parameters for the lithium-disilicate (Li2O.2SiO2) model glass were first determined and compared to the same parameters determined by other techniques to establish the accuracy and usefulness of the present technique. This technique was then used to describe the crystallization kinetics of a complex Ca-Sr-Zn-silicate glass developed for sealing solid oxide fuel cells.
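Isothermal crystallized-fraction data of the kind described above are commonly analyzed with the Johnson-Mehl-Avrami-Kolmogorov (JMAK, or Avrami) equation x(t) = 1 - exp(-(kt)^n); whether this exact form was used in the cited study is an assumption here, and the data below are synthetic.

```python
# JMAK (Avrami) analysis commonly applied to isothermal crystallization data:
# x(t) = 1 - exp(-(k t)^n). Linearizing, ln(-ln(1 - x)) = n*ln(t) + n*ln(k),
# so a straight-line fit yields n (slope) and k. The data below are synthetic.
import numpy as np

t = np.array([5.0, 10.0, 20.0, 40.0, 80.0])        # isothermal hold time, minutes
x = np.array([0.05, 0.18, 0.52, 0.90, 0.995])       # crystallized volume fraction from DTA

y = np.log(-np.log(1.0 - x))
n, c = np.polyfit(np.log(t), y, 1)                  # slope = n, intercept = n*ln(k)
k = np.exp(c / n)
print(f"Avrami exponent n ~ {n:.2f}, rate constant k ~ {k:.3f} 1/min")
```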
Insights from two industrial hygiene pilot e-cigarette passive vaping studies.
Maloney, John C; Thompson, Michael K; Oldham, Michael J; Stiff, Charles L; Lilly, Patrick D; Patskan, George J; Shafer, Kenneth H; Sarkar, Mohamadi A
2016-01-01
While several reports have been published using research methods of estimating exposure risk to e-cigarette vapors in nonusers, only two have directly measured indoor air concentrations from vaping using validated industrial hygiene sampling methodology. Our first study was designed to measure indoor air concentrations of nicotine, menthol, propylene glycol, glycerol, and total particulates during the use of multiple e-cigarettes in a well-characterized room over a period of time. Our second study was a repeat of the first study, and it also evaluated levels of formaldehyde. Measurements were collected using active sampling, near real-time and direct measurement techniques. Air sampling incorporated industrial hygiene sampling methodology using analytical methods established by the National Institute of Occupational Safety and Health and the Occupational Safety and Health Administration. Active samples were collected over a 12-hr period, for 4 days. Background measurements were taken in the same room the day before and the day after vaping. Panelists (n = 185 Study 1; n = 145 Study 2) used menthol and non-menthol MarkTen prototype e-cigarettes. Vaping sessions (six, 1-hr) included 3 prototypes, with total number of puffs ranging from 36-216 per session. Results of the active samples were below the limit of quantitation of the analytical methods. Near real-time data were below the lowest concentration on the established calibration curves. Data from this study indicate that the majority of chemical constituents sampled were below quantifiable levels. Formaldehyde was detected at consistent levels during all sampling periods. These two studies found that indoor vaping of MarkTen prototype e-cigarette does not produce chemical constituents at quantifiable levels or background levels using standard industrial hygiene collection techniques and analytical methods.
Cespi, Marco; Perinelli, Diego R; Casettari, Luca; Bonacucina, Giulia; Caporicci, Giuseppe; Rendina, Filippo; Palmieri, Giovanni F
2014-12-30
The use of process analytical technologies (PAT) to ensure final product quality is by now a well established practice in the pharmaceutical industry. To date, most of the efforts in this field have focused on the development of analytical methods using spectroscopic techniques (i.e., NIR, Raman, etc.). This work evaluated the possibility of using the parameters derived from the processing of in-line raw compaction data (the forces and displacement of the punches) as a PAT tool for controlling the tableting process. To reach this goal, two commercially available formulations were used, changing the quantitative composition and compressing them on a fully instrumented rotary pressing machine. The Heckel yield pressure and the compaction energies, together with tablet hardness and compaction pressure, were selected and evaluated as discriminating parameters in all the prepared formulations. The apparent yield pressure, as shown in the obtained results, has the necessary sensitivity to be effectively included in a PAT strategy to monitor the tableting process. Additional investigations were performed to understand the criticalities and the mechanisms underlying this performance parameter and the associated implications. Specifically, it was found that the efficiency of the apparent yield pressure depends on the nominal drug content, the drug densification mechanism and the error in pycnometric density. In this study, the potential of using some parameters derived from the raw compaction data has been demonstrated to be an attractive alternative and complementary method to the well established spectroscopic techniques for monitoring and controlling the tableting process. The compaction data monitoring method is also easy to set up and very cost effective. Copyright © 2014 Elsevier B.V. All rights reserved.
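To make the role of the Heckel yield pressure concrete, the hedged sketch below fits the Heckel equation, ln(1/(1 - D)) = K*P + A, to hypothetical in-die compaction data (relative density D versus compaction pressure P) and reports the apparent yield pressure Py = 1/K; it illustrates the standard relationship, not the authors' processing code.

```python
# Minimal sketch: apparent yield pressure from the Heckel equation ln(1/(1-D)) = K*P + A.
import numpy as np

pressure = np.array([25., 50., 75., 100., 150., 200.])        # MPa, hypothetical compaction pressures
rel_density = np.array([0.62, 0.74, 0.81, 0.86, 0.91, 0.94])  # in-die relative density D

heckel = np.log(1.0 / (1.0 - rel_density))
# Fit the (approximately) linear part of the profile; all points are used here for simplicity.
K, A = np.polyfit(pressure, heckel, 1)
yield_pressure = 1.0 / K
print(f"K = {K:.4f} 1/MPa, apparent yield pressure Py = {yield_pressure:.1f} MPa")
```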
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2011 CFR
2011-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2014 CFR
2014-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2012 CFR
2012-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
21 CFR 530.22 - Safe levels and analytical methods for food-producing animals.
Code of Federal Regulations, 2013 CFR
2013-04-01
... analytical method; or (3) Establish a safe level based on other appropriate scientific, technical, or... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Safe levels and analytical methods for food... § 530.22 Safe levels and analytical methods for food-producing animals. (a) FDA may establish a safe...
Evaluating sustainable energy harvesting systems for human implantable sensors
NASA Astrophysics Data System (ADS)
AL-Oqla, Faris M.; Omar, Amjad A.; Fares, Osama
2018-03-01
Selecting the most appropriate energy-harvesting technique for human implantable sensors remains challenging for industry, where careful decisions have to be made. Moreover, the available polymer-based composite materials offer plentiful renewable applications that can support sustainable development, being useful for energy-harvesting systems such as photovoltaic, piezoelectric and thermoelectric devices as well as other energy storage systems. This work presents an expert-based model capable of evaluating and comparing the various available renewable energy-harvesting techniques in urban surroundings subject to various technical and economic, often conflicting, criteria. A wide set of evaluation criteria was adopted in the proposed model after examining their suitability and ensuring the expediency and reliability of the model through feedback from experts worldwide. The model establishes an analytic hierarchy structure with 12 simultaneous, conflicting factors to provide a systematic road map for designers to better assess such techniques for human implantable medical sensors. The energy-harvesting techniques considered were limited to Wireless, Thermoelectric, Infrared Radiator, Piezoelectric, Magnetic Induction and Electrostatic Energy Harvesters. The results demonstrate that the best decision was in favour of wireless harvesting technology for the medical sensors, as it is preferred under most of the evaluation criteria considered in the model.
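An analytic hierarchy structure of this kind is typically evaluated by deriving priority weights from pairwise comparison matrices (the analytic hierarchy process). The sketch below uses an invented 3x3 comparison matrix rather than the paper's 12-factor hierarchy, and shows the usual principal-eigenvector computation and consistency check; it is illustrative only.

```python
# Minimal AHP sketch: priority weights and consistency ratio from a pairwise comparison matrix.
import numpy as np

# Hypothetical pairwise comparisons of three criteria (Saaty 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # tabulated random index
print("weights:", np.round(weights, 3), "consistency ratio:", round(ci / ri, 3))
```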
Enhance your team-based qualitative research.
Fernald, Douglas H; Duclos, Christine W
2005-01-01
Qualitative research projects often involve the collaborative efforts of a research team. Challenges inherent in teamwork include changes in membership and differences in analytical style, philosophy, training, experience, and skill. This article discusses teamwork issues and the tools and techniques used to improve team-based qualitative research. We drew on our experiences in working on numerous projects of varying size, duration, and purpose. Through trials of different tools and techniques, expert consultation, and review of the literature, we learned to improve how we build teams, manage information, and disseminate results. Attention given to team members and team processes is as important as choosing appropriate analytical tools and techniques. Attentive team leadership, commitment to early and regular team meetings, and discussion of roles, responsibilities, and expectations all help build more effective teams and establish clear norms. As data are collected and analyzed, it is important to anticipate potential problems arising from differing skills and styles, and from how information and files are managed. Discuss analytical preferences and biases and set clear guidelines and practices for how data will be analyzed and handled. As emerging ideas and findings disperse across team members, common tools (such as summary forms and data grids), coding conventions, intermediate goals or products, and regular documentation help capture essential ideas and insights. In a team setting, little should be left to chance. This article identifies ways to improve team-based qualitative research with a more considered and systematic approach. Qualitative researchers will benefit from further examination and discussion of effective, field-tested, team-based strategies.
Finding Waldo: Learning about Users from their Interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Eli T.; Ottley, Alvitta; Zhao, Helen
Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user's interactions with a system reflect a large amount of the user's reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user's task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task and we apply well-known machine learning algorithms to three encodings of the users' interaction data. We achieve, depending on algorithm and encoding, between 62% and 96% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user's personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time; in some cases, 82% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
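The classification step described above can be illustrated in a few lines. The sketch below is not the authors' pipeline: it assumes a hypothetical feature matrix in which each row encodes one user's interaction data (e.g., counts of clicks, hovers and dwell times) and a binary fast/slow label, and applies an off-the-shelf classifier with cross-validation.

```python
# Minimal sketch: predicting task performance from encoded interaction data.
# The feature encoding and labels here are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((40, 6))          # 40 users x 6 interaction features (clicks, hovers, dwell times, ...)
y = rng.integers(0, 2, size=40)  # 1 = fast completer, 0 = slow completer

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # accuracy per fold
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```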
Combined sensing platform for advanced diagnostics in exhaled mouse breath
NASA Astrophysics Data System (ADS)
Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris
2013-03-01
Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for monitoring therapeutic progression, as quantitative compositional analysis of breath can be related to biomarker panels associated with a specific physiological condition invoked by, e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on, e.g., the metabolic state, and since volatile organic constituents (VOCs) in exhaled breath in particular may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb - or even lower - concentration levels. While individual analytical techniques such as mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow the simultaneous analysis of total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee (TTR) ratio (via IR), selected VOCs (via IR) and O2 (via luminescence) in exhaled breath, thereby establishing a single diagnostic platform in which both sensors simultaneously interact with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with a novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, and miniaturized fiberoptic luminescence sensors, to establish a multi-constituent breath analysis tool that is ideally compatible with mouse intensive care stations (MICU). Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas chromatography coupled to mass spectrometric detection. Here, we aim at potentially continuous analysis of the TTR via iHWGs and LS flow-through sensors requiring only minute (< 1 mL) sample volumes. Furthermore, this study explores non-linearities observed in the calibration functions of 12CO2 and 13CO2, potentially resulting from effects related to optical collision diameters, e.g., in the presence of molecular oxygen. It is anticipated that the simultaneous continuous analysis of oxygen via LS will facilitate the correction of these effects after inclusion within appropriate multivariate calibration models, thus providing more reliable and robust calibration schemes for continuously monitoring relevant breath constituents.
Hughes, Sarah A; Huang, Rongfu; Mahaffey, Ashley; Chelme-Ayala, Pamela; Klamerth, Nikolaus; Meshref, Mohamed N A; Ibrahim, Mohamed D; Brown, Christine; Peru, Kerry M; Headley, John V; Gamal El-Din, Mohamed
2017-11-01
There are several established methods for the determination of naphthenic acids (NAs) in waters associated with oil sands mining operations. Due to their highly complex nature, the measured concentration and composition of NAs vary depending on the method used. This study compared different common sample preparation techniques, analytical instrument methods, and analytical standards to measure NAs in groundwater and process water samples collected from an active oil sands operation. In general, the high- and ultrahigh-resolution methods, namely ultrahigh performance liquid chromatography time-of-flight mass spectrometry (UPLC-TOF-MS) and Orbitrap mass spectrometry (Orbitrap-MS), were within an order of magnitude of the Fourier transform infrared spectroscopy (FTIR) methods. The gas chromatography mass spectrometry (GC-MS) methods consistently yielded the highest NA concentrations and the greatest standard error. Total NA concentration was not statistically different between sample preparation by solid phase extraction and by liquid-liquid extraction. Calibration standards influenced quantitation results. This work provided a comprehensive understanding of the inherent differences in the various techniques available to measure NAs and hence the potential differences in measured amounts of NAs in samples. Results from this study will contribute to analytical method standardization for NA analysis in oil sands-related water samples. Copyright © 2017 Elsevier Ltd. All rights reserved.
Monitoring of Cr, Cu, Pb, V and Zn in polluted soils by laser induced breakdown spectroscopy (LIBS).
Dell'Aglio, Marcella; Gaudiuso, Rosalba; Senesi, Giorgio S; De Giacomo, Alessandro; Zaccone, Claudio; Miano, Teodoro M; De Pascale, Olga
2011-05-01
Laser Induced Breakdown Spectroscopy (LIBS) is a fast and multi-elemental analytical technique particularly suitable for the qualitative and quantitative analysis of heavy metals in solid samples, including environmental ones. Although LIBS is often recognised in the literature as a well-established analytical technique, reported results on the quantitative analysis of elements in chemically complex matrices such as soils are quite contrasting. In this work, soil samples of various origins were analyzed by LIBS and the data compared to those obtained by Inductively Coupled Plasma-Optical Emission Spectroscopy (ICP-OES). The emission intensities of one selected line for each of the five analytes (i.e., Cr, Cu, Pb, V, and Zn) were normalized to the background signal and plotted as a function of the concentration values previously determined by ICP-OES. The data showed good linearity for all calibration lines drawn, and the correlation between ICP-OES and LIBS was confirmed by the satisfactory agreement obtained between the corresponding values. Consequently, the LIBS method can be used at least for metal monitoring in soils. In this respect, a simple method for estimating the degree of soil pollution by heavy metals, based on the determination of an anthropogenic index, was proposed and determined for Cr and Zn.
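As a rough illustration of the calibration strategy described above (not the authors' code), the sketch below normalizes a hypothetical LIBS line intensity to the background signal and regresses it against reference concentrations obtained by ICP-OES; all numerical values are invented.

```python
# Minimal sketch: background-normalized LIBS calibration against ICP-OES reference values.
import numpy as np

icp_oes_conc = np.array([12.0, 35.0, 60.0, 110.0, 180.0])          # mg/kg, hypothetical reference values
line_intensity = np.array([1500., 3900., 6400., 11800., 19000.])   # peak intensity of the analyte line
background = np.array([1000., 1050., 990., 1020., 1010.])          # background signal near the line

norm_intensity = line_intensity / background                        # normalization to the background
slope, intercept = np.polyfit(icp_oes_conc, norm_intensity, 1)
r = np.corrcoef(icp_oes_conc, norm_intensity)[0, 1]
print(f"calibration: I/B = {slope:.4f}*C + {intercept:.3f}, R^2 = {r**2:.3f}")

# An unknown soil sample is then quantified from its normalized intensity:
unknown = (8.5 - intercept) / slope
print(f"estimated concentration: {unknown:.1f} mg/kg")
```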
Elliptic-cylindrical analytical flux-rope model for ICMEs
NASA Astrophysics Data System (ADS)
Nieves-Chinchilla, T.; Linton, M.; Hidalgo, M. A. U.; Vourlidas, A.
2016-12-01
We present an analytical flux-rope model for realistic magnetic structures embedded in Interplanetary Coronal Mass Ejections. The framework of this model was established by Nieves-Chinchilla et al. (2016) with the circular-cylindrical analytical flux-rope model, under the concept developed by Hidalgo et al. (2002). The elliptic-cylindrical geometry establishes the first level of complexity in a series of models. The model attempts to describe the magnetic flux-rope topology with a distorted cross-section as a possible consequence of the interaction with the solar wind. In this model, the flux rope is completely described in non-Euclidean geometry. The Maxwell equations are solved using tensor calculus consistently with the chosen geometry, invariance along the axial component, and the single assumption of no radial current density. The model is generalized in terms of the radial dependence of the poloidal and axial current density components. The misalignment between current density and magnetic field is studied in detail for individual cases of different pairs of indices for the axial and poloidal current density components. This theoretical analysis provides a map of the force distribution inside the flux rope. The reconstruction technique has been adapted to the model and compared with a set of in situ ICME events with different in situ signatures. The successful results are limited to some cases with clear in situ signatures of distortion. However, the model adds a piece to the puzzle of the physical-analytical representation of these magnetic structures. Other effects such as axial curvature, expansion and/or interaction could be incorporated in the future to fully understand the magnetic structure. Finally, the mathematical formulation of this model opens the door to the next model: a toroidal flux-rope analytical model.
The role of analytical chemistry in Niger Delta petroleum exploration: a review.
Akinlua, Akinsehinwa
2012-06-12
Petroleum and the organic matter from which petroleum is derived are composed of organic compounds together with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. The main tools for acquiring these geochemical data are analytical techniques. Owing to progress in the development of new analytical techniques, many hitherto unresolved petroleum exploration problems have been addressed. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. The various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed. The analytical techniques that have helped to understand the petroleum system of the basin are also described. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly the Niger Delta petroleum province, are discussed in this paper. Analytical chemistry is an invaluable tool in finding the Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.
Analytical techniques: A compilation
NASA Technical Reports Server (NTRS)
1975-01-01
A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.
Estimation on nonlinear damping in second order distributed parameter systems
NASA Technical Reports Server (NTRS)
Banks, H. T.; Reich, Simeon; Rosen, I. G.
1989-01-01
An approximation and convergence theory for the identification of nonlinear damping in abstract wave equations is developed. It is assumed that the unknown dissipation mechanism to be identified can be described by a maximal monotone operator acting on the generalized velocity. The stiffness is assumed to be linear and symmetric. Functional analytic techniques are used to establish that solutions to a sequence of finite dimensional (Galerkin) approximating identification problems in some sense approximate a solution to the original infinite dimensional inverse problem.
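A schematic form of the identification problem described above (my paraphrase, not an equation quoted from the paper) is a second-order abstract system with a linear symmetric stiffness operator and a maximal monotone damping operator acting on the velocity:

```latex
% Abstract second-order system: A is the linear symmetric stiffness operator,
% D is the (possibly set-valued) maximal monotone damping operator to be identified.
\ddot{w}(t) + D\big(\dot{w}(t)\big) + A\,w(t) \ni f(t), \qquad
w(0) = w_0, \quad \dot{w}(0) = w_1 .
```

The inclusion allows the damping operator to be set-valued; Galerkin approximations of such a system are what underlie the finite-dimensional identification problems discussed in the abstract.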
Suvarapu, Lakshmi Narayana; Baek, Sung-Ok
2015-01-01
This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2010-04-01
The annual update of the list of prohibited substances and doping methods as issued by the World Anti-Doping Agency (WADA) allows the implementation of most recent considerations of performance manipulation and emerging therapeutics into human sports doping control programmes. The annual banned-substance review for human doping controls critically summarizes recent innovations in analytical approaches that support the efforts of convicting cheating athletes by improved or newly established methods that focus on known as well as newly outlawed substances and doping methods. In the current review, literature published between October 2008 and September 2009 reporting on new and/or enhanced procedures and techniques for doping analysis, as well as aspects relevant to the doping control arena, was considered to complement the 2009 annual banned-substance review.
NASA Astrophysics Data System (ADS)
Gupta, Lokesh Kumar
2012-11-01
Seven process-related impurities were identified by LC-MS in the atorvastatin calcium drug substance. The structures of the impurities were confirmed by spectroscopic techniques such as 1H NMR and IR and by physicochemical studies conducted using synthesized authentic reference compounds. The synthesized reference samples of the impurity compounds were used for quantitative HPLC determination. The impurities were detected by a newly developed gradient reverse-phase high performance liquid chromatographic (HPLC) method. The system suitability of the HPLC analysis established the validity of the separation. The analytical method was validated according to International Conference on Harmonisation (ICH) guidelines with respect to specificity, precision, accuracy, linearity, robustness and stability of analytical solutions to demonstrate the power of the newly developed HPLC method.
ERIC Educational Resources Information Center
Hough, Susan L.; Hall, Bruce W.
The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…
Analytical N beam position monitor method
NASA Astrophysics Data System (ADS)
Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.
2017-11-01
Measurement and correction of focusing errors is of great importance for performance and machine protection of circular accelerators. Furthermore LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also set on the speed of the optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β -function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.
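For orientation (a commonly quoted form, not necessarily the exact notation of this paper), the underlying three-BPM estimate expresses the β-function at the first of three BPMs through the measured phase advances φ between BPM pairs and the corresponding model quantities:

```latex
% Phase-advance (3-BPM) beta estimate at BPM 1, using measured and model phase advances
\beta_{1} \;\approx\; \beta_{1}^{\mathrm{model}}\,
\frac{\cot\varphi_{12} - \cot\varphi_{13}}
     {\cot\varphi_{12}^{\mathrm{model}} - \cot\varphi_{13}^{\mathrm{model}}}
```

The N-BPM extension combines many such BPM combinations, and the analytical variant described above propagates random and systematic error contributions for each combination analytically rather than numerically.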
The effect of analytic and experiential modes of thought on moral judgment.
Kvaran, Trevor; Nichols, Shaun; Sanfey, Alan
2013-01-01
According to dual-process theories, moral judgments are the result of two competing processes: a fast, automatic, affect-driven process and a slow, deliberative, reason-based process. Accordingly, these models make clear and testable predictions about the influence of each system. Although a small number of studies have attempted to examine each process independently in the context of moral judgment, no study has yet tried to experimentally manipulate both processes within a single study. In this chapter, a well-established "mode-of-thought" priming technique was used to place participants in either an experiential/emotional or analytic mode while completing a task in which participants provide judgments about a series of moral dilemmas. We predicted that individuals primed analytically would make more utilitarian responses than control participants, while emotional priming would lead to less utilitarian responses. Support was found for both of these predictions. Implications of these findings for dual-process theories of moral judgment will be discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
1974-01-01
Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.
NASA Astrophysics Data System (ADS)
Chen, Jianbo; Guo, Baolin; Yan, Rui; Sun, Suqin; Zhou, Qun
2017-07-01
With the utilization of the hand-held equipment, Fourier transform infrared (FT-IR) spectroscopy is a promising analytical technique to minimize the time cost for the chemical identification of herbal materials. This research examines the feasibility of the hand-held FT-IR spectrometer for the on-site testing of herbal materials, using Lonicerae Japonicae Flos (LJF) and Lonicerae Flos (LF) as examples. Correlation-based linear discriminant models for LJF and LF are established based on the benchtop and hand-held FT-IR instruments. The benchtop FT-IR models can exactly recognize all articles of LJF and LF. Although a few LF articles are misjudged at the sub-class level, the hand-held FT-IR models are able to exactly discriminate LJF and LF. As a direct and label-free analytical technique, FT-IR spectroscopy has great potential in the rapid and automatic chemical identification of herbal materials either in laboratories or in fields. This is helpful to prevent the spread and use of adulterated herbal materials in time.
Graphene liquid cells for multi-technique analysis of biological cells in water environment
NASA Astrophysics Data System (ADS)
Matruglio, A.; Zucchiatti, P.; Birarda, G.; Marmiroli, B.; D'Amico, F.; Kocabas, C.; Kiskinova, M.; Vaccari, L.
2018-05-01
In-cell exploration of biomolecular constituents is the new frontier of cellular biology: it promises full access to the structure-activity correlation of biomolecules, overcoming the limitations imposed by dissecting the cellular milieu. However, the presence of water, which is a very strong IR absorber and is incompatible with the vacuum working conditions of all analytical methods using soft x-rays and electrons, poses severe constraints on performing important imaging and spectroscopic analyses under physiological conditions. Recent advances in separating the sample compartment in a liquid cell are based on electron- and photon-transparent but molecule-impermeable graphene membranes. This strategy has opened a unique opportunity to explore technological materials under realistic operating conditions using various types of electron microscopy. However, the widespread application of graphene liquid cells is still impeded by the lack of well-established approaches for their mass production. We report first preliminary results on the fabrication of reproducible graphene liquid cells appropriate for the analysis of biological specimens in their natural hydrated environment with several crucial analytical techniques, namely FTIR microscopy, Raman spectroscopy, AFM, SEM and TEM.
Takegawa, Yasuhiro; Araki, Kayo; Fujitani, Naoki; Furukawa, Jun-ichi; Sugiyama, Hiroaki; Sakai, Hideaki; Shinohara, Yasuro
2011-12-15
Glycosaminoglycans (GAGs) play important roles in cell adhesion and growth, maintenance of extracellular matrix (ECM) integrity, and signal transduction. To fully understand the biological functions of GAGs, there is a growing need for sensitive, rapid, and quantitative analysis of GAGs. The present work describes a novel analytical technique that enables high throughput cellular/tissue glycosaminoglycomics for all three families of uronic acid-containing GAGs, hyaluronan (HA), chondroitin sulfate (CS)/dermatan sulfate (DS), and heparan sulfate (HS). A one-pot purification and labeling procedure for GAG Δ-disaccharides was established by chemo-selective ligation of disaccharides onto high density hydrazide beads (glycoblotting) and subsequent labeling by fluorescence. The 17 most common disaccharides (eight comprising HS, eight CS/DS, and one comprising HA) could be separated with a single chromatography for the first time by employing a zwitter-ionic type of hydrophilic-interaction chromatography column. These novel analytical techniques were able to precisely characterize the glycosaminoglycome in various cell types including embryonal carcinoma cells and ocular epithelial tissues (cornea, conjunctiva, and limbus).
Toward improved understanding and control in analytical atomic spectrometry
NASA Astrophysics Data System (ADS)
Hieftje, Gary M.
1989-01-01
As with most papers which attempt to predict the future, this treatment will begin with a coverage of past events. It will be shown that progress in the field of analytical atomic spectrometry has occurred through a series of steps which involve the addition of new techniques and the occasional displacement of established ones. Because it is difficult or impossible to presage true breakthroughs, this manuscript will focus on how such existing methods can be modified or improved to greatest advantage. The thesis will be that rational improvement can be accomplished most effectively by understanding fundamentally the nature of an instrumental system, a measurement process, and a spectrometric technique. In turn, this enhanced understanding can lead to closer control, from which can spring improved performance. Areas where understanding is now lacking and where control is most greatly needed will be identified and a possible scheme for implementing control procedures will be outlined. As we draw toward the new millennium, these novel procedures seem particularly appealing; new high-speed computers, the availability of expert systems, and our enhanced understanding of atomic spectrometric events combine to make future prospects extremely bright.
Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael
2009-06-01
Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing, numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.
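The T2* value quoted above follows from the standard mono-exponential decay of signal with echo time, S(TE) = S0 exp(-TE/T2*); a hedged sketch of such a fit on invented data is shown below, purely to illustrate the relationship.

```python
# Minimal sketch: estimating T2* from a series of echo times via a log-linear fit.
import numpy as np

te = np.array([0.5, 1.0, 2.0, 3.0, 4.0])           # echo times in ms (hypothetical)
signal = np.array([0.83, 0.68, 0.46, 0.32, 0.21])  # normalized signal intensities (hypothetical)

slope, intercept = np.polyfit(te, np.log(signal), 1)   # ln S = ln S0 - TE/T2*
t2_star = -1.0 / slope
print(f"estimated T2* = {t2_star:.1f} ms, S0 = {np.exp(intercept):.2f}")
```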
Non-traditional isotopes in analytical ecogeochemistry assessed by MC-ICP-MS
NASA Astrophysics Data System (ADS)
Prohaska, Thomas; Irrgeher, Johanna; Horsky, Monika; Hanousek, Ondřej; Zitek, Andreas
2014-05-01
Analytical ecogeochemistry deals with the development and application of tools of analytical chemistry to study dynamic biological and ecological processes within ecosystems and across ecosystem boundaries in time. It can be best described as a linkage between modern analytical chemistry and a holistic understanding of ecosystems ('The total human ecosystem') within the frame of transdisciplinary research. One focus of analytical ecogeochemistry is the advanced analysis of elements and isotopes in abiotic and biotic matrices and the application of the results to basic questions in different research fields like ecology, environmental science, climatology, anthropology, forensics, archaeometry and provenancing. With continuous instrumental developments, new isotopic systems have been recognized for their potential to study natural processes and well established systems could be analyzed with improved techniques, especially using multi collector inductively coupled plasma mass spectrometry (MC-ICP-MS). For example, in case of S, isotope ratio measurements at high mass resolution could be achieved at much lower S concentrations with ICP-MS as compared to IRMS, still keeping suitable uncertainty. Almost 50 different isotope systems have been investigated by ICP-MS, so far, with - besides Sr, Pb and U - Ca, Mg, Cd, Li, Hg, Si, Ge and B being the most prominent and considerably pushing the limits of plasma based mass spectrometry also by applying high mass resolution. The use of laser ablation in combination with MC-ICP-MS offers the possibility to achieve isotopic information on high spatial (µm-range) and temporal scale (in case of incrementally growing structures). The information gained with these analytical techniques can be linked between different hierarchical scales in ecosystems, offering means to better understand ecosystem processes. The presentation will highlight the use of different isotopic systems in ecosystem studies accomplished by ICP-MS. Selected examples on combining isotopic systems for the study of ecosystem processes on different spatial scales will underpin the great opportunities substantiated by the field of analytical ecogeochemistry. Moreover, recent developments in plasma mass spectrometry and the application of new isotopic systems require sound metrological approaches in order to prevent scientific conclusions drawn from analytical artifacts.
Deriving Earth Science Data Analytics Tools/Techniques Requirements
NASA Astrophysics Data System (ADS)
Kempler, S. J.
2015-12-01
Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and which require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Green analytical chemistry--theory and practice.
Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek
2010-08-01
This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.
An Intelligent Content Discovery Technique for Health Portal Content Management
2014-01-01
Background: Continuous content management of health information portals is a feature vital for their sustainability and widespread acceptance. The knowledge and experience of a domain expert is essential for content management in the health domain. The rate of generation of online health resources is exponential, and thereby manual examination for relevance to a specific topic and audience is a formidable challenge for domain experts. Intelligent content discovery for effective content management is a less researched topic. An existing expert-endorsed content repository can provide the necessary leverage to automatically identify relevant resources and evaluate qualitative metrics. Objective: This paper reports on design research towards an intelligent technique for automated content discovery and ranking for health information portals. The proposed technique aims to improve the efficiency of the current, mostly manual, process of portal content management by utilising an existing expert-endorsed content repository as a supporting base and a benchmark to evaluate the suitability of new content. Methods: A model for content management was established based on a field study of potential users. The proposed technique is integral to this content management model and executes in several phases (i.e., query construction, content search, text analytics and fuzzy multi-criteria ranking). The construction of multi-dimensional search queries with input from WordNet, the use of multi-word and single-word terms as representative semantics for text analytics, and the use of fuzzy multi-criteria ranking for subjective evaluation of quality metrics are original contributions reported in this paper. Results: The feasibility of the proposed technique was examined with experiments conducted on an actual health information portal, the BCKOnline portal. Both intermediary and final results generated by the technique are presented in the paper, and these help to establish the benefits of the technique and its contribution towards effective content management. Conclusions: The prevalence of large numbers of online health resources is a key obstacle for domain experts involved in content management of health information portals and websites. The proposed technique has proven successful at the search and identification of resources and the measurement of their relevance. It can be used to support the domain expert in content management and thereby ensure the health portal is up-to-date and current. PMID:25654440
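The ranking phase can be illustrated with a toy example. The sketch below is not the portal's implementation: it assumes hypothetical relevance, currency and readability scores for candidate resources, maps them onto simple fuzzy membership grades and ranks the resources by a weighted aggregate.

```python
# Minimal sketch: fuzzy multi-criteria ranking of candidate health resources (all values hypothetical).
def grade(x, low=0.3, high=0.8):
    """Piecewise-linear fuzzy membership for 'good' on a 0-1 criterion score."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

candidates = {
    "resource_a": {"relevance": 0.92, "currency": 0.75, "readability": 0.55},
    "resource_b": {"relevance": 0.64, "currency": 0.95, "readability": 0.80},
    "resource_c": {"relevance": 0.40, "currency": 0.50, "readability": 0.90},
}
weights = {"relevance": 0.5, "currency": 0.3, "readability": 0.2}

scores = {
    name: sum(weights[c] * grade(v) for c, v in crits.items())
    for name, crits in candidates.items()
}
for name, s in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {s:.2f}")
```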
Strotmann, Uwe; Reuschenbach, Peter; Schwarz, Helmut; Pagga, Udo
2004-01-01
Well-established biodegradation tests use biogenously evolved carbon dioxide (CO2) as an analytical parameter to determine the ultimate biodegradability of substances. A newly developed analytical technique based on the continuous online measurement of conductivity showed its suitability over other techniques. It could be demonstrated that the method met all criteria of established biodegradation tests, gave continuous biodegradation curves, and was more reliable than other tests. In parallel experiments, only small variations in the biodegradation pattern occurred. When comparing the new online CO2 method with existing CO2 evolution tests, growth rates and lag periods were similar and only the final degree of biodegradation of aniline was slightly lower. A further test development was the unification and parallel measurement of all three important summary parameters for biodegradation—i.e., CO2 evolution, determination of the biochemical oxygen demand (BOD), and removal of dissolved organic carbon (DOC)—in a multicomponent biodegradation test system (MCBTS). The practicability of this test method was demonstrated with aniline. This test system had advantages for poorly water-soluble and highly volatile compounds and allowed the determination of the carbon fraction integrated into biomass (heterotrophic yield). The integrated online measurements of CO2 and BOD systems produced continuous degradation curves, which better met the stringent criteria of ready biodegradability (60% biodegradation in a 10-day window). Furthermore the data could be used to calculate maximal growth rates for the modeling of biodegradation processes. PMID:15294794
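The degree of ultimate biodegradation in CO2 evolution tests is conventionally expressed as the cumulative CO2 produced relative to the theoretical CO2 (ThCO2) calculated from the carbon content of the test substance; the sketch below illustrates this arithmetic with invented numbers, not data from the study.

```python
# Minimal sketch: percent biodegradation from cumulative CO2 relative to theoretical CO2 (ThCO2).
test_substance_mg = 100.0          # aniline added to the test vessel (hypothetical amount)
carbon_fraction = 0.774            # mass fraction of carbon in aniline (C6H7N)
th_co2_mg = test_substance_mg * carbon_fraction * (44.01 / 12.011)  # mg CO2 if fully mineralized

cumulative_co2_mg = [5.0, 40.0, 120.0, 190.0, 230.0]   # measured cumulative CO2 over time (hypothetical)
for day, co2 in enumerate(cumulative_co2_mg, start=1):
    print(f"day {day}: {100.0 * co2 / th_co2_mg:.1f} % biodegradation")
```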
Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.
ERIC Educational Resources Information Center
Heineman, William R.; Kissinger, Peter T.
1980-01-01
Reports developments involving the experimental aspects of finite and current analytical electrochemistry including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)
Becker, J Sabine; Matusch, Andreas; Palm, Christoph; Salber, Dagmar; Morton, Kathryn A; Becker, J Susanne
2010-02-01
Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been developed and established as an emerging technique in the generation of quantitative images of metal distributions in thin tissue sections of brain samples (such as human, rat and mouse brain), with applications in research related to neurodegenerative disorders. A new analytical protocol is described which includes sample preparation by cryo-cutting of thin tissue sections and matrix-matched laboratory standards, mass spectrometric measurements, data acquisition, and quantitative analysis. Specific examples of the bioimaging of metal distributions in normal rodent brains are provided. Differences to the normal were assessed in a Parkinson's disease and a stroke brain model. Furthermore, changes during normal aging were studied. Powerful analytical techniques are also required for the determination and characterization of metal-containing proteins within a large pool of proteins, e.g., after denaturing or non-denaturing electrophoretic separation of proteins in one-dimensional and two-dimensional gels. LA-ICP-MS can be employed to detect metalloproteins in protein bands or spots separated after gel electrophoresis. MALDI-MS can then be used to identify specific metal-containing proteins in these bands or spots. The combination of these techniques is described in the second section.
Metabolomic Strategies Involving Mass Spectrometry Combined with Liquid and Gas Chromatography.
Lopes, Aline Soriano; Cruz, Elisa Castañeda Santa; Sussulini, Alessandra; Klassen, Aline
2017-01-01
Amongst all omics sciences, there is no doubt that metabolomics has undergone the most important growth in the last decade. Advances in analytical techniques and data analysis tools are the main factors that have made possible the development and establishment of metabolomics as a significant research field in systems biology. As metabolomic analysis demands high sensitivity for detecting metabolites present at low concentrations in biological samples, high resolving power for identifying the metabolites, and a wide dynamic range to detect metabolites with variable concentrations in complex matrices, mass spectrometry is the most extensively used analytical technique for fulfilling these requirements. Mass spectrometry alone can be used in a metabolomic analysis; however, some issues such as ion suppression may hinder the quantification and identification of metabolites present at lower concentrations or of some metabolite classes that do not ionise as well as others. The best choice is coupling separation techniques, such as gas or liquid chromatography, to mass spectrometry, in order to improve the sensitivity and resolving power of the analysis, besides obtaining extra information (retention time) that facilitates the identification of the metabolites, especially when considering untargeted metabolomic strategies. In this chapter, the main aspects of mass spectrometry (MS), liquid chromatography (LC) and gas chromatography (GC) are discussed, and recent clinical applications of LC-MS and GC-MS are also presented.
Bounded extremum seeking with discontinuous dithers
Scheinker, Alexander; Scheinker, David
2016-03-21
The analysis of discontinuous extremum seeking (ES) controllers, e.g. those applicable to digital systems, has historically been more complicated than that of continuous controllers. We establish a simple and general extension of a recently developed bounded form of ES to a general class of oscillatory functions, including functions discontinuous with respect to time, such as triangle or square waves with dead time. We establish our main results by combining a novel idea for oscillatory control with an extension of functional analytic techniques originally utilized by Kurzweil, Jarnik, Sussmann, and Liu in the late 80s and early 90s and recently studied by Durr et al. Lastly, we demonstrate the value of the result with an application to inverter switching control.
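A minimal numerical sketch of the bounded extremum-seeking idea is given below, using a smooth cosine dither for a single parameter minimizing a static cost; it only illustrates the general scheme (the paper's contribution is establishing the same result for discontinuous dithers such as triangle or square waves with dead time), and the gains are arbitrary.

```python
# Minimal sketch: bounded extremum seeking on a static cost J(x) = (x - 2)^2.
# The update velocity stays bounded by sqrt(alpha*omega) regardless of the size of J.
import math

def J(x):
    return (x - 2.0) ** 2

x, dt = 0.0, 1e-3
alpha, omega, k = 0.5, 100.0, 4.0
for i in range(int(20.0 / dt)):           # simulate 20 time units with simple Euler integration
    t = i * dt
    xdot = math.sqrt(alpha * omega) * math.cos(omega * t + k * J(x))
    x += xdot * dt
print(f"final parameter estimate: {x:.3f} (optimum at 2.0)")
```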
OASIS: Organics Analyzer for Sampling Icy Surfaces
NASA Technical Reports Server (NTRS)
Getty, S. A.; Dworkin, J. P.; Glavin, D. P.; Martin, M.; Zheng, Y.; Balvin, M.; Southard, A. E.; Ferrance, J.; Malespin, C.
2012-01-01
Liquid chromatography mass spectrometry (LC-MS) is a well established laboratory technique for detecting and analyzing organic molecules. This approach has been especially fruitful in the analysis of nucleobases and amino acids, and in establishing chiral ratios [1-3]. We are developing OASIS, Organics Analyzer for Sampling Icy Surfaces, for future in situ landed missions to astrochemically important icy bodies, such as asteroids, comets, and icy moons. The OASIS design employs a microfabricated, on-chip analytical column to chromatographically separate liquid analytes using known LC stationary phase chemistries. The elution products are then interfaced through electrospray ionization (ESI) and analyzed by a time-of-flight mass spectrometer (TOF-MS). A particular advantage of this design is its suitability for microgravity environments, such as on a primitive small body.
Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Chang Jae; Han, Seung; Yun, Jae Hee
2015-07-01
Safety limits are required to maintain the integrity of the physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached within which protective action must begin. In keeping with the nuclear regulations and industry standards, satisfying these two requirements will ensure that the safety limit is not exceeded during the design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for the safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both the design basis event and the beyond design basis event. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, it is hard to find studies on a systematically integrated methodology for response time evaluation. In the case of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test. The analysis technique has the demerit of producing extreme times that are not actually possible. Thus, the establishment of a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented in this paper. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter, and demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test, and demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers the channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or single components that together cover the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters and the step function testing technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the methodology proposed in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
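As a schematic of the analysis branch of such a methodology (hypothetical numbers, not plant data), the sketch below allocates response times to the components on a trip-parameter signal path, sums them, and checks the total against the response time requirement assumed in the safety analysis.

```python
# Minimal sketch: channel response time analysis against the safety-analysis requirement.
requirement_s = 1.20   # response time assumed in the safety analysis (hypothetical)

# Allocated/measured response times for each component on the critical signal path (hypothetical).
signal_path = {
    "pressure transmitter (ramp test)": 0.50,
    "signal processing equipment (step test)": 0.35,
    "final actuation device (step test)": 0.20,
}

total = sum(signal_path.values())
margin = requirement_s - total
status = "meets" if total <= requirement_s else "exceeds"
print(f"total channel response time {total:.2f} s {status} the {requirement_s:.2f} s requirement "
      f"(margin {margin:.2f} s)")
```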
Measuring solids concentration in stormwater runoff: comparison of analytical methods.
Clark, Shirley E; Siu, Christina Y S
2008-01-15
Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains about how to compare these values with historical water-quality data where the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two methods of determination of the suspended solids concentration, including the effect of aliquot selection/collection method and of particle size distribution (PSD). The results showed that SSC was best able to represent the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also the particle size information on the solids in stormwater runoff.
Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael
2018-05-01
Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also act as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. There are several works using different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies to assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.
Hewavitharana, Amitha K; Abu Kassim, Nur Sofiah; Shaw, Paul Nicholas
2018-06-08
With mass spectrometric detection in liquid chromatography, co-eluting impurities affect the analyte response due to ion suppression/enhancement. The internal standard calibration method, using a co-eluting stable isotope labelled analogue of each analyte as the internal standard, is the most appropriate technique available to correct for these matrix effects. However, this technique is not without drawbacks: it is costly because a separate internal standard is required for each analyte, and the labelled compounds are expensive or must be synthesised. Traditionally, the standard addition method has been used to overcome matrix effects in atomic spectroscopy and is well established there. This paper proposes the same approach for mass spectrometric detection, and demonstrates, for a vitamin D assay, that the results are comparable to those obtained with the internal standard method using labelled analogues. As the conventional standard addition procedure does not address procedural errors, we propose the inclusion of an additional internal standard (not co-eluting). Recoveries determined on human serum samples show that the proposed method of standard addition yields more accurate results than internal standardisation using stable isotope labelled analogues. The precision of the proposed method of standard addition is superior to that of the conventional standard addition method. Copyright © 2018 Elsevier B.V. All rights reserved.
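The standard addition calculation itself is a short piece of arithmetic: the analyte response is regressed against the added standard concentration and the x-intercept (taken with opposite sign) gives the endogenous concentration. The sketch below uses invented responses and, in the spirit of the proposal above, normalizes each response to a non-co-eluting internal standard to correct for procedural variation; it is illustrative only.

```python
# Minimal sketch: standard addition with internal-standard normalization (hypothetical data).
import numpy as np

added = np.array([0.0, 10.0, 20.0, 30.0])             # added standard, ng/mL
analyte_area = np.array([4200., 7300., 10500., 13600.])
istd_area = np.array([9800., 10100., 9900., 10050.])  # non-co-eluting internal standard

response = analyte_area / istd_area                   # corrects for injection/procedural variation
slope, intercept = np.polyfit(added, response, 1)
endogenous = intercept / slope                        # concentration in the unspiked sample
print(f"estimated endogenous concentration: {endogenous:.1f} ng/mL")
```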
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an ever more important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise that might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
Mission Assurance Modeling and Simulation: A Cyber Security Roadmap
NASA Technical Reports Server (NTRS)
Gendron, Gerald; Roberts, David; Poole, Donold; Aquino, Anna
2012-01-01
This paper proposes a cyber security modeling and simulation roadmap to enhance mission assurance governance and establish risk reduction processes within constrained budgets. The term mission assurance stems from risk management work by Carnegie Mellon's Software Engineering Institute in the late 1990s. By 2010, the Defense Information Systems Agency revised its cyber strategy and established the Program Executive Officer-Mission Assurance. This highlights a shift from simply protecting data to balancing risk, and begins a necessary dialogue to establish a cyber security roadmap. The Military Operations Research Society has recommended a cyber community of practice, recognizing that there are too few professionals with both cyber and analytic experience. The authors characterize the limited body of knowledge in this symbiotic relationship. This paper identifies operational and research requirements for mission assurance M&S supporting defense and homeland security. M&S techniques are needed for enterprise oversight of cyber investments, test and evaluation, policy, training, and analysis.
Comparison of critical methods developed for fatty acid analysis: A review.
Wu, Zhuona; Zhang, Qi; Li, Ning; Pu, Yiqiong; Wang, Bing; Zhang, Tong
2017-01-01
Fatty acids are important nutritional substances and metabolites in living organisms. These acids are abundant in Chinese herbs, such as Brucea javanica, Notopterygium forbesii, Isatis tinctoria, Astragalus membranaceus, and Aconitum szechenyianum. This review illustrates the types of fatty acids and their significant roles in the human body. Many analytical methods are used for the qualitative and quantitative evaluation of fatty acids. Some of the methods used to analyze fatty acids in more than 30 kinds of plants, drugs, and other samples are presented in this paper. These analytical methods include gas chromatography, liquid chromatography, near-infrared spectroscopy, and NMR spectroscopy. The advantages and disadvantages of these techniques are described and compared. This review provides a valuable reference for establishing methods for fatty acid determination. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Graphite nanocomposites sensor for multiplex detection of antioxidants in food.
Ng, Khan Loon; Tan, Guan Huat; Khor, Sook Mei
2017-12-15
Butylated hydroxyanisole (BHA), butylated hydroxytoluene (BHT), and tert-butylhydroquinone (TBHQ) are synthetic antioxidants used in the food industry. Herein, we describe the development of a novel graphite nanocomposite-based electrochemical sensor for the multiplex detection and measurement of BHA, BHT, and TBHQ levels in complex food samples using a linear sweep voltammetry (LSV) technique. The newly established analytical method exhibited good sensitivity, limit of detection, limit of quantitation, and selectivity. The accuracy and reliability of the analytical results were verified through method validation and comparison with results from a liquid chromatography method, with which a linear correlation of more than 0.99 was achieved. The addition of sodium dodecyl sulfate as a supporting additive further enhanced the LSV response (anodic peak current, Ipa) of BHA and BHT by 2- and 20-fold, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z
2015-12-01
Many in the drinking water industry are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is a promising disinfectant that is usually used as a secondary disinfectant, but the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research objectively compared the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET), and amperometric titration) to determine the superior technique. The techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior technique was identified against pre-defined criteria. To discern the effectiveness of this technique, factors that might influence its performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, chronoamperometry showed the highest accuracy and precision, and the influencing factors studied did not diminish its performance, which remained adequate in all matrices. This study is a step towards proper disinfection monitoring, and it assists engineers with chlorine dioxide disinfection system planning and management.
Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette
2014-08-15
The purpose of the study was to perform a comparative analysis of the technical performance, respective costs, and environmental impact of two invasive analytical methods (HPLC and UV/visible-FTIR) and a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performance of the three techniques. Statistical inter-method correlation was assessed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques and HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Overall, Raman spectroscopy appears superior on technical, economic, and environmental grounds to the invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Hoffmann, Thomas; Dorrestein, Pieter C.
2015-11-01
Matrix deposition on agar-based microbial colonies for MALDI imaging mass spectrometry is often complicated by the complex media on which microbes are grown. This Application Note demonstrates how consecutive short spray pulses of a matrix solution can form an evenly closed matrix layer on dried agar. Compared with sieving dry matrix onto wet agar, this method supports analyte cocrystallization, which results in significantly more signals, higher signal-to-noise ratios, and improved ionization efficiency. The even matrix layer improves spot-to-spot precision of measured m/z values when using TOF mass spectrometers. With this technique, we established reproducible imaging mass spectrometry of myxobacterial cultures on nutrient-rich cultivation media, which was not possible with the sieving technique.
NASA Astrophysics Data System (ADS)
Parvathi, S. P.; Ramanan, R. V.
2018-06-01
An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and is then modified to include them. The modification is based on (i) backward analytical propagation, including the perturbations, of the state vector obtained from the iterative patched conic technique at the sphere of influence, and (ii) quantification of the deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend on numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides a realistic insight into the mission aspects. The proposed design is also an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.
Hauser-Davis, Rachel Ann; Lopes, Renato Matos; Mota, Fábio Batista; Moreira, Josino Costa
2017-06-01
Metalloproteomic studies in environmental scenarios are of significant value in elucidating metal uptake, trafficking, accumulation, and metabolism linked to biomolecules in biological systems. The field emerged in the early 2000s and has since become an interesting and growing area of interdisciplinary research, although the number of publications in Environmental Metalloproteomics is still very low compared to other metallomic areas. In this context, the evolution of Environmental Metalloproteomics over the last decades was evaluated herein through the use of bibliometric techniques, identifying variables that may aid researchers in this area to form collaborative networks with established scientists, such as the main authors, published articles, institutions, countries, and established collaborations involved in academic research on this subject. Results indicate a growing trend of publications over time, reflecting the interest of the scientific community in Environmental Metalloproteomics, but also demonstrate that research interactions in this field are still country- and organization-specific. Higher numbers of publications are observed from the late 2000s onwards, related to increasing technological advances in the area, such as the development of techniques combining atomic spectroscopy with biochemical or proteomic techniques. The retrieved publications also indicate that recent advances in the genomic, proteomic, and metallomic areas have allowed for extended applications of Environmental Metalloproteomics in non-model organisms. The results reported herein indicate that Environmental Metalloproteomics now seems to be reaching a more mature stage, in which analytical techniques are well established and can be routinely applied in environmental scenarios, benefitting researchers and allowing for further insights into this fascinating field. Copyright © 2017 Elsevier Inc. All rights reserved.
Interpretation and classification of microvolt T wave alternans tests
NASA Technical Reports Server (NTRS)
Bloomfield, Daniel M.; Hohnloser, Stefan H.; Cohen, Richard J.
2002-01-01
Measurement of microvolt-level T wave alternans (TWA) during routine exercise stress testing now is possible as a result of sophisticated noise reduction techniques and analytic methods that have become commercially available. Even though this technology is new, the available data suggest that microvolt TWA is a potent predictor of arrhythmia risk in diverse disease states. As this technology becomes more widely available, physicians will be called upon to interpret microvolt TWA tracings. This review seeks to establish uniform standards for the clinical interpretation of microvolt TWA tracings.
Operational experience in underwater photogrammetry
NASA Astrophysics Data System (ADS)
Leatherdale, John D.; John Turner, D.
Underwater photogrammetry has become established as a cost-effective technique for inspection and maintenance of platforms and pipelines for the offshore oil industry. A commercial service based in Scotland operates in the North Sea, USA, Brazil, West Africa and Australia. 70 mm cameras and flash units are built for the purpose and analytical plotters and computer graphics systems are used for photogrammetric measurement and analysis of damage, corrosion, weld failures and redesign of underwater structures. Users are seeking simple, low-cost systems for photogrammetric analysis which their engineers can use themselves.
NASA Astrophysics Data System (ADS)
Melia, F.; Frisch, D. H.
1985-06-01
Techniques to establish communication between earth and extraterrestrial intelligent beings are examined analytically, emphasizing that the success of searches for extraterrestrial intelligence (SETIs) depends on the selection by both sender and receiver of one of a few mutually helpful SETI strategies. An equation for estimating the probability that an SETI will result in the recognition of an ETI signal is developed, and numerical results for various SETI strategies are presented in tables. A minimum approach employing 10 40-m 20-kW dish antennas for a 30-yr SETI in a 2500-light-year disk is proposed.
Portal scatter to primary dose ratio of 4 to 18 MV photon spectra incident on heterogeneous phantoms
NASA Astrophysics Data System (ADS)
Ozard, Siobhan R.
Electronic portal imagers designed and used to verify the positioning of a cancer patient undergoing radiation treatment can also be employed to measure the in vivo dose received by the patient. This thesis investigates the ratio of the dose from patient-scattered particles to the dose from primary (unscattered) photons at the imaging plane, called the scatter to primary dose ratio (SPR). The composition of the SPR according to the origin of scatter is analyzed more thoroughly than in previous studies. A new analytical method for calculating the SPR is developed and experimentally verified for heterogeneous phantoms. A novel technique that applies the analytical SPR method for in vivo dosimetry with a portal imager is evaluated. Monte Carlo simulation was used to determine the imager dose from patient-generated electrons and photons that scatter one or more times within the object. The database of SPRs reported from this investigation is new, since the contribution from patient-generated electrons was neglected by previous Monte Carlo studies. The SPR from patient-generated electrons was found here to be as large as 0.03. The analytical SPR method relies on the established result that the scatter dose is uniform for an air gap between the patient and the imager that is greater than 50 cm. The method also applies the hypothesis that first-order Compton scatter alone is sufficient for scatter estimation. A comparison of analytical and measured SPRs for neck, thorax, and pelvis phantoms showed that the maximum difference was within +/-0.03 and the mean difference was less than +/-0.01 for most cases. This accuracy is comparable to similar analytical approaches that are limited to homogeneous phantoms. The analytical SPR method could replace lookup tables of measured scatter doses, which can require significant time to measure. In vivo doses were calculated by combining our analytical SPR method with the convolution/superposition algorithm. Our calculated in vivo doses agreed within +/-3% with the doses measured in the phantom. The present in vivo method was faster than other techniques that use convolution/superposition. Our method is a feasible and satisfactory approach that contributes to on-line patient dose monitoring.
Depth-resolved monitoring of analytes diffusion in ocular tissues
NASA Astrophysics Data System (ADS)
Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.
2007-02-01
Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantification of analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied and their permeability coefficients were calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. The results obtained suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.
NASA Astrophysics Data System (ADS)
Akinlalu, A. A.; Adegbuyiro, A.; Adiat, K. A. N.; Akeredolu, B. E.; Lateef, W. Y.
2017-06-01
The groundwater potential of the Oke-Ana area, southwestern Nigeria, has been evaluated using the integration of the electrical resistivity method, remote sensing, and geographic information systems. The effect of five hydrogeological indices, namely lineament density, drainage density, lithology, overburden thickness, and aquifer layer resistivity, on groundwater occurrence was established. A multi-criteria decision analysis technique was employed to assign a weight to each index using the concept of the analytical hierarchy process. The assigned weights were normalized and a consistency ratio was established. In order to evaluate the groundwater potential of Oke-Ana, sixty-seven (67) vertical electrical sounding points were occupied. Ten curve types were delineated in the study area, varying from simple three-layer A- and H-type curves to the more complex four-, five-, and six-layer AA, HA, KH, QH, AKH, HKH, KHA, and KHKH curves. Four subsurface geo-electric sequences of top soil, weathered layer, partially weathered/fractured basement, and fresh basement were delineated in the area. The analytical process assisted in classifying Oke-Ana into low, medium, and high groundwater potential zones. Validation of the model against well information and two aborted boreholes suggests 70% agreement.
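The weighting step described above follows the analytical hierarchy process; the sketch below illustrates, under assumed pairwise-comparison values that are not taken from the study, how normalized weights for the five hydrogeological indices and the consistency ratio could be derived.

```python
# Minimal sketch of the AHP weighting step: derive weights from a Saaty-scale
# pairwise-comparison matrix and check the consistency ratio.
# The matrix entries are illustrative assumptions, not the study's values.
import numpy as np

criteria = ["lineament density", "drainage density", "lithology",
            "overburden thickness", "aquifer resistivity"]

A = np.array([            # A[i, j] = importance of criterion i relative to j
    [1,   2,   3,   4,   5],
    [1/2, 1,   2,   3,   4],
    [1/3, 1/2, 1,   2,   3],
    [1/4, 1/3, 1/2, 1,   2],
    [1/5, 1/4, 1/3, 1/2, 1],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # normalized weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)     # consistency index
ri = 1.12                                # Saaty random index for n = 5
cr = ci / ri                             # consistency ratio (< 0.10 is acceptable)

for name, w in zip(criteria, weights):
    print(f"{name:22s} {w:.3f}")
print(f"consistency ratio = {cr:.3f}")
```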
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
New developments of X-ray fluorescence imaging techniques in laboratory
NASA Astrophysics Data System (ADS)
Tsuji, Kouichi; Matsuno, Tsuyoshi; Takimoto, Yuki; Yamanashi, Masaki; Kometani, Noritsugu; Sasaki, Yuji C.; Hasegawa, Takeshi; Kato, Shuichi; Yamada, Takashi; Shoji, Takashi; Kawahara, Naoki
2015-11-01
X-ray fluorescence (XRF) analysis is a well-established analytical technique with a long research history. Many applications have been reported in various fields, such as the environmental, archeological, biological, and forensic sciences as well as industry, because XRF has the unique advantage of being a nondestructive analytical tool with good precision for quantitative analysis. Recent advances in XRF analysis have been realized through the development of new x-ray optics and x-ray detectors. Advanced x-ray focusing optics enable the generation of a micro x-ray beam, leading to micro-XRF analysis and XRF imaging. A confocal micro-XRF technique has been applied for the visualization of elemental distributions inside samples. This technique was applied to liquid samples and to monitoring chemical reactions such as the corrosion of steel samples in NaCl solutions. In addition, principal component analysis was applied to reduce the background intensity in XRF spectra obtained during XRF mapping, leading to improved spatial resolution of confocal micro-XRF images. In parallel, the authors have proposed a wavelength dispersive XRF (WD-XRF) imaging spectrometer for fast elemental imaging. A new two-dimensional x-ray detector, the Pilatus detector, was applied for WD-XRF imaging. Fast XRF imaging in 1 s or even less was demonstrated for Euro coins and industrial samples. In this review paper, these recent advances in XRF imaging, especially in the laboratory setting, are introduced.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hien, P.D.
1994-12-31
In the more than ten years since the commissioning of the Dalat nuclear research reactor, a number of nuclear techniques have been developed and applied in Vietnam. Manufacturing of radioisotopes and nuclear instruments, development of isotope tracer and nuclear analytical techniques for environmental studies, and exploitation of filtered neutron beams have been major activities of reactor utilization. Efforts made during ten years of reactor operation have also resulted in establishing and sustaining the applications of nuclear techniques in medicine, industry, agriculture, etc. The successes achieved and lessons learned over the past ten years are discussed, illustrating the approaches taken for developing nuclear science in the conditions of a country having a very low national income and experiencing a transition from a centrally planned to a market-oriented economic system.
NASA Astrophysics Data System (ADS)
Frew, Russell; Cannavan, Andrew; Zandric, Zora; Maestroni, Britt; Abrahim, Aiman
2013-04-01
Traceability systems play a key role in assuring a safe and reliable food supply. Analytical techniques harnessing the spatial patterns in the distribution of stable isotope and trace element ratios can be used to determine the provenance of food. Such techniques offer the potential to enhance global trade by providing an independent means of verifying "paper" traceability systems and can also help to prove authenticity, to combat fraudulent practices, and to control adulteration, which are important issues for economic, religious, or cultural reasons. To address some of the challenges that developing countries face in attempting to implement effective food traceability systems, the IAEA, through its Joint FAO/IAEA Division on Nuclear Techniques in Food and Agriculture, has initiated a 5-year coordinated research project involving institutes in 15 developing and developed countries (Austria, Botswana, Chile, China, France, India, Lebanon, Morocco, Portugal, Singapore, Sweden, Thailand, Uganda, UK, USA). The objective is to help member state laboratories establish robust analytical techniques and databases, validated to international standards, to determine the provenance of food. Nuclear techniques such as stable isotope and multi-element analysis, along with complementary methods, will be applied for the verification of food traceability systems and of claims related to food origin, production, and authenticity. This integrated and multidisciplinary approach to strengthening capacity in food traceability will contribute to the effective implementation of holistic systems for food safety and control. The project focuses mainly on the development of techniques to confirm product authenticity, with several research partners also considering food safety issues. Research topics encompass determination of the geographical origin of a variety of commodities, including seed oils, rice, wine, olive oil, wheat, orange juice, fish, groundnuts, tea, pork, honey and coffee, the adulteration of milk with soy protein, chemical contamination of food products, and inhomogeneity in isotopic ratios in poultry and eggs as a means to determine production history. Analytical techniques include stable isotope ratio measurements (2H/1H, 13C/12C, 15N/14N, 18O/16O, 34S/32S, 87Sr/86Sr, 208Pb/207Pb/206Pb), elemental analysis, DNA fingerprinting, fatty acid and other biomolecule profiling, chromatography-mass spectrometry, and near infra-red spectroscopy.
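As a loose illustration of how such isotope fingerprints might be turned into a provenance check, the sketch below assigns a sample to the reference region with the closest mean delta values; the regions, delta values, and nearest-centroid rule are illustrative assumptions only, not part of the coordinated research project.

```python
# Minimal sketch: verify a geographical-origin claim by comparing a sample's
# stable-isotope fingerprint with reference fingerprints per region.
# Delta values and region names are illustrative assumptions.
import numpy as np

# Reference means of (delta13C, delta15N, delta18O) per claimed region (per mil)
reference = {
    "Region A": np.array([-27.1, 4.2, 24.5]),
    "Region B": np.array([-25.3, 6.8, 27.9]),
    "Region C": np.array([-29.0, 3.1, 22.0]),
}

def assign_origin(sample):
    """Return the reference region whose centroid is closest to the sample."""
    return min(reference, key=lambda r: np.linalg.norm(sample - reference[r]))

test_sample = np.array([-25.8, 6.5, 27.2])   # measured delta values of one sample
print("most consistent origin:", assign_origin(test_sample))
```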
Simulation and statistics: Like rhythm and song
NASA Astrophysics Data System (ADS)
Othman, Abdul Rahman
2013-04-01
Simulation has been introduced to solve problems formulated as systems. Using this technique, two kinds of problems can be addressed. First, problems that have an analytical solution but for which running an experiment is too costly in terms of money or lives. Second, problems that have no analytical solution at all. In the field of statistical inference the second kind is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques, such as the bootstrap and permutation methods, to form a pseudo sampling distribution that leads to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was and still is being used to verify analytical solutions in inference. It also discusses resampling techniques as simulation techniques and examines common misunderstandings about the two. The successful usages of both techniques are also explained.
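The bootstrap idea mentioned above can be sketched in a few lines: resampling the observed data with replacement builds a pseudo sampling distribution for a statistic whose distribution is awkward to derive analytically. The data values below are illustrative assumptions.

```python
# Minimal sketch of a nonparametric bootstrap: a pseudo sampling distribution
# for the sample median and a percentile confidence interval.
import numpy as np

rng = np.random.default_rng(1)
data = np.array([4.1, 5.6, 3.9, 7.2, 5.0, 6.3, 4.8, 5.9, 6.7, 4.4])

boot_medians = np.array([
    np.median(rng.choice(data, size=data.size, replace=True))
    for _ in range(10_000)
])

lo, hi = np.percentile(boot_medians, [2.5, 97.5])   # percentile 95% CI
print(f"sample median = {np.median(data):.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```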
NASA Astrophysics Data System (ADS)
Cucu, Daniela; Woods, Mike
2008-08-01
The paper presents a practical approach for testing laboratories to ensure the quality of their test results. It is based on experience gained in assessing a large number of testing laboratories, discussing with management and staff, reviewing results obtained in national and international PTs and ILCs, and exchanging information in the EA laboratory committee. According to EN ISO/IEC 17025, an accredited laboratory has to implement a programme to ensure the quality of its test results for each measurand. Pre-analytical, analytical, and post-analytical measures shall be applied in a systematic manner and shall include both quality control and quality assurance measures. When designing the quality assurance programme, a laboratory should consider pre-analytical activities (such as personnel training, selection and validation of test methods, and qualifying equipment), analytical activities (ranging from sampling and sample preparation to instrumental analysis), and post-analytical activities (such as decoding, calculation, use of statistical tests or packages, and management of results). Designed on different levels (analyst, quality manager, and technical manager) and including a variety of measures, the programme shall ensure the validity and accuracy of test results and the adequacy of the management system, prove the laboratory's competence in performing tests under accreditation, and, last but not least, show the comparability of test results.
Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.
Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun
2017-07-08
Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthma, and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME), and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.
Burdick, J D; Boni, R L; Fochtman, F W
1997-05-01
A simple solid phase extraction (SPE) technique combined with gas chromatography-mass spectrometry (GC/MS) operated in selected ion monitoring (SIM) mode is described for the quantitation of cocaine and cocaethylene in small samples (250 microliters) of rat whole blood. Use of (N-[2H3C])-cocaine and (N-[2H3C])-cocaethylene internal standards resulted in high sensitivity and selectivity for this analytical method. Analysis was performed using a Hewlett-Packard 5890 GC equipped with a 7673A Automatic Liquid Sampler linked to a Hewlett-Packard 5972 Mass Selective Detector. Separation of analytes was accomplished on a cross-linked methyl silicone gum capillary column (Ultra 1: 12 m x 0.2 mm i.d. x 0.33 microns). Linearity was established over a wide range of concentrations (5.0-2000.0 ng/mL) with good precision. Limits of detection (LOD) were 1.0 and 2.0 ng/mL for cocaine and cocaethylene, respectively. This analytical method was designed for use in pharmacokinetic experiments studying the formation of cocaethylene following ethanol pretreatment in rats administered cocaine.
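A brief sketch of the calibration and detection-limit arithmetic implied by such a validation is given below; the concentrations, analyte/internal-standard area ratios, and the LOD rule of thumb (3.3 times the residual standard deviation divided by the slope) are illustrative assumptions rather than the study's actual data or procedure.

```python
# Minimal sketch: fit an internal-standard calibration line over the working
# range and estimate an LOD from the residual standard error.
# All values are illustrative assumptions.
import numpy as np

conc  = np.array([5, 25, 100, 500, 1000, 2000], float)       # ng/mL
ratio = np.array([0.012, 0.060, 0.242, 1.21, 2.39, 4.82])    # analyte/IS area ratio

slope, intercept = np.polyfit(conc, ratio, 1)
pred = slope * conc + intercept
s_y  = np.sqrt(np.sum((ratio - pred) ** 2) / (conc.size - 2))  # residual SD
r    = np.corrcoef(conc, ratio)[0, 1]

print(f"slope = {slope:.4e}, intercept = {intercept:.4e}, r = {r:.4f}")
print(f"estimated LOD = {3.3 * s_y / slope:.2f} ng/mL")
```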
Huerta, B; Rodríguez-Mozaz, S; Barceló, D
2012-11-01
The presence of pharmaceuticals in the aquatic environment is an ever-increasing issue of concern as they are specifically designed to target specific metabolic and molecular pathways in organisms, and they may have the potential for unintended effects on nontarget species. Information on the presence of pharmaceuticals in biota is still scarce, but the scientific literature on the subject has established the possibility of bioaccumulation in exposed aquatic organisms through other environmental compartments. However, few studies have correlated both bioaccumulation of pharmaceutical compounds and the consequent effects. Analytical methodology to detect pharmaceuticals at trace quantities in biota has advanced significantly in the last few years. Nonetheless, there are still unresolved analytical challenges associated with the complexity of biological matrices, which require exhaustive extraction and purification steps, and highly sensitive and selective detection techniques. This review presents the trends in the analysis of pharmaceuticals in aquatic organisms in the last decade, recent data about the occurrence of these compounds in natural biota, and the environmental implications that chronic exposure could have on aquatic wildlife.
Aspects of Voyager photogrammetry
NASA Technical Reports Server (NTRS)
Wu, Sherman S. C.; Schafer, Francis J.; Jordan, Raymond; Howington, Annie-Elpis
1987-01-01
In January 1986, Voyager 2 took a series of pictures of Uranus and its satellites with the Imaging Science System (ISS) on board the spacecraft. Based on six stereo images from the ISS narrow-angle camera, a topographic map was compiled of the Southern Hemisphere of Miranda, one of Uranus' moons. Assuming a spherical figure, a 20-km surface relief is shown on the map. With three additional images from the ISS wide-angle camera, a control network of Miranda's Southern Hemisphere was established by analytical photogrammetry, producing 88 ground points for the control of multiple-model compilation on the AS-11AM analytical stereoplotter. Digital terrain data from the topographic map of Miranda have also been produced. By combining these data and the image data from the Voyager 2 mission, perspective views or even a movie of the mapped area can be made. The application of these newly developed techniques to Voyager 1 imagery, which includes a few overlapping pictures of Io and Ganymede, permits the compilation of contour maps or topographic profiles of these bodies on the analytical stereoplotters.
Matsche, Mark A; Arnold, Jill; Jenkins, Erin; Townsend, Howard; Rosemary, Kevin
2014-09-01
The imperiled status of Atlantic sturgeon (Acipenser oxyrinchus oxyrinchus), a large, long-lived, anadromous fish found along the Atlantic coast of North America, has prompted efforts at captive propagation for research and stock enhancement. The purpose of this study was to establish hematology and plasma chemistry reference intervals of captive Atlantic sturgeon maintained under different culture conditions. Blood specimens were collected from a total of 119 fish at 3 hatcheries: Lamar, PA (n = 36, ages 10-14 years); Chalk Point, MD (n = 40, siblings of Lamar); and Horn Point, Cambridge, MD (n = 43, mixed population from Chesapeake Bay). Reference intervals (using robust techniques), median, mean, and standard deviations were determined for WBC, RBC, thrombocytes, PCV, HGB, MCV, MCH, MCHC, and absolute counts for lymphocytes (L), neutrophils (N), monocytes, and eosinophils. Chemistry analytes included concentrations of total proteins, albumin, glucose, urea, calcium, phosphate, sodium, potassium, chloride, and globulins, AST, CK, and LDH activities, and osmolality. Mean concentrations of total proteins, albumin, and glucose were at or below the analytic range. Statistical comparisons showed significant differences among hatcheries for each remaining plasma chemistry analyte and for PCV, RBC, MCHC, MCH, eosinophil and monocyte counts, and N:L ratio throughout all 3 groups. Therefore, reference intervals were calculated separately for each population. Reference intervals for fish maintained under differing conditions should be established per population. © 2014 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
A study of digital gyro compensation loops. [data conversion routines and breadboard models
NASA Technical Reports Server (NTRS)
1975-01-01
The feasibility of replacing existing state-of-the-art analog gyro compensation loops with digital computations is discussed. This was accomplished by designing appropriate compensation loops for the dry tuned TDF gyro, selecting appropriate data conversion and processing techniques and algorithms, and breadboarding the design for laboratory evaluation. A breadboard design was established in which one axis of a Teledyne tuned-gimbal TDF gyro was caged digitally while the other was caged using conventional analog electronics. The digital loop was designed analytically to closely resemble the analog loop in performance. The breadboard was subjected to various static and dynamic tests in order to establish the relative stability characteristics and frequency responses of the digital and analog loops. Several variations of the digital loop configuration were evaluated. The results were favorable.
Feng, Juanjuan; Sun, Min; Bu, Yanan; Luo, Chuannan
2016-03-01
Stir bar sorptive extraction is an environmentally friendly microextraction technique based on a stir bar coated with various sorbents. A commercial stirrer is a good support, but it had not been used in stir bar sorptive extraction because it is difficult to modify. Here, a stirrer was modified with carbon nanoparticles by a simple carbon deposition process in a flame and characterized by scanning electron microscopy and energy-dispersive X-ray spectrometry. A three-dimensional porous coating was formed from the carbon nanoparticles. In combination with high-performance liquid chromatography, the stir bar was evaluated using five polycyclic aromatic hydrocarbons as model analytes. Conditions including extraction time and temperature, ionic strength, and desorption solvent were investigated by a factor-by-factor optimization method. The established method exhibited good linearity (0.01-10 μg/L) and low limits of quantification (0.01 μg/L). It was applied to detect the model analytes in environmental water samples. No analyte was detected in river water, whereas all five analytes were quantified in rain water. The recoveries of the five analytes in the two samples spiked at 2 μg/L were in the ranges of 92.2-106% and 93.4-108%, respectively. The results indicate that the carbon nanoparticle-coated stirrer is an efficient stir bar for the extraction and analysis of polycyclic aromatic hydrocarbons. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment
ERIC Educational Resources Information Center
Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James
2010-01-01
The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…
A Single-Molecule Barcoding System using Nanoslits for DNA Analysis
NASA Astrophysics Data System (ADS)
Jo, Kyubong; Schramm, Timothy M.; Schwartz, David C.
Single DNA molecule approaches are playing an increasingly central role in the analytical genomic sciences because single molecule techniques intrinsically provide individualized measurements of selected molecules, free from the constraints of bulk techniques, which blindly average noise and mask the presence of minor analyte components. Accordingly, a principal challenge that must be addressed by all single molecule approaches aimed at genome analysis is how to immobilize and manipulate DNA molecules for measurements that foster construction of large, biologically relevant data sets. To meet this challenge, this chapter discusses an integrated approach using microfabricated and nanofabricated devices for the manipulation of elongated DNA molecules within nanoscale geometries. Large DNA coils stretch via nanoconfinement when channel dimensions are within tens of nanometers. Importantly, stretched, often immobilized, DNA molecules spanning hundreds of kilobase pairs are required by all analytical platforms working with large genomic substrates, because imaging techniques must acquire sequence information from molecules that normally exist in free solution as unrevealing random coils resembling floppy balls of yarn. However, the very small dimensions required to foster molecular stretching make such devices impractical because they demand exotic fabrication technologies and costly materials and suffer poor operational efficiencies. In this chapter, these problems are addressed by discussion of a new approach to DNA presentation and analysis that establishes scalable nanoconfinement conditions through reduction of ionic strength; the stiffened DNA molecules can then be arrayed for analysis using easily fabricated devices that can also be mass produced. This new approach to DNA nanoconfinement is complemented by the development of a novel labeling scheme for reliable marking of individual molecules with fluorochrome labels, creating molecular barcodes, which are efficiently read using fluorescence resonance energy transfer techniques to minimize noise from unincorporated labels. This integrative approach to genomic analysis through nanoconfinement, named nanocoding, was demonstrated through the barcoding and mapping of bacterial artificial chromosome molecules, thereby providing the basis for a high-throughput platform competent for whole genome investigations.
van Elk, Michiel; Matzke, Dora; Gronau, Quentin F.; Guan, Maime; Vandekerckhove, Joachim; Wagenmakers, Eric-Jan
2015-01-01
According to a recent meta-analysis, religious priming has a positive effect on prosocial behavior (Shariff et al., 2015). We first argue that this meta-analysis suffers from a number of methodological shortcomings that limit the conclusions that can be drawn about the potential benefits of religious priming. Next we present a re-analysis of the religious priming data using two different meta-analytic techniques. A Precision-Effect Testing–Precision-Effect-Estimate with Standard Error (PET-PEESE) meta-analysis suggests that the effect of religious priming is driven solely by publication bias. In contrast, an analysis using Bayesian bias correction suggests the presence of a religious priming effect, even after controlling for publication bias. These contradictory statistical results demonstrate that meta-analytic techniques alone may not be sufficiently robust to firmly establish the presence or absence of an effect. We argue that a conclusive resolution of the debate about the effect of religious priming on prosocial behavior – and about theoretically disputed effects more generally – requires a large-scale, preregistered replication project, which we consider to be the sole remedy for the adverse effects of experimenter bias and publication bias. PMID:26441741
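For readers unfamiliar with the adjustment, the sketch below shows the core of a PET-PEESE analysis: effect sizes are regressed on their standard errors (PET) or sampling variances (PEESE) with inverse-variance weights, and the intercept estimates the effect size of an ideal, infinitely precise study. The effect sizes and standard errors used are illustrative assumptions, not the religious priming data.

```python
# Minimal sketch of PET-PEESE bias-corrected meta-analytic estimates.
# Effect sizes and standard errors are illustrative assumptions.
import numpy as np

d  = np.array([0.45, 0.30, 0.62, 0.15, 0.55, 0.08, 0.40, 0.22])   # study effect sizes
se = np.array([0.20, 0.15, 0.28, 0.10, 0.25, 0.08, 0.22, 0.12])   # their standard errors
w  = 1.0 / se**2                                                  # inverse-variance weights

def weighted_intercept(x, y, w):
    """Intercept of a weighted least-squares regression of y on x."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    return beta[0]

pet_estimate   = weighted_intercept(se,    d, w)   # PET: regress on standard error
peese_estimate = weighted_intercept(se**2, d, w)   # PEESE: regress on variance

print(f"PET estimate (SE -> 0):    {pet_estimate:.3f}")
print(f"PEESE estimate (var -> 0): {peese_estimate:.3f}")
```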
Coupling corona discharge for ambient extractive ionization mass spectrometry.
Hu, Bin; Zhang, Xinglei; Li, Ming; Peng, Xuejiao; Han, Jing; Yang, Shuiping; Ouyang, Yongzhong; Chen, Huanwen
2011-12-07
Unlike the extractive electrospray ionization (EESI) technique described elsewhere, a corona discharge instead of electrospray ionization has been utilized to charge a neutral solvent spray under ambient conditions for the generation of highly charged microdroplets, which impact a neutral sample plume for the extractive ionization of the analytes in raw samples without any sample pretreatment. Using the positive ion mode, molecular radical cations were easily generated for the detection of non-polar compounds (e.g., benzene, cyclohexane, etc.), while protonated molecular ions of polar compounds (e.g., acetonitrile, acetic ether) were readily produced for the detection. By dispensing the matrix in a relatively large space, this method tolerates highly complex matrices. For a given sample such as lily fragrances, more compounds were detected by the method established here than the EESI technique. An acceptable relative standard deviation (RSD 8.9%, n = 11) was obtained for the direct measurement of explosives (10 ppb) in waste water samples. The experimental data demonstrate that this method could simultaneously detect both polar and non-polar analytes with high sensitivity, showing promising applications for the rapid detection of a wide variety of compounds present in complex matrices.
Translational research in pediatrics III: bronchoalveolar lavage.
Radhakrishnan, Dhenuka; Yamashita, Cory; Gillio-Meina, Carolina; Fraser, Douglas D
2014-07-01
The role of flexible bronchoscopy and bronchoalveolar lavage (BAL) for the care of children with airway and pulmonary diseases is well established, with collected BAL fluid most often used clinically for microbiologic pathogen identification and cellular analyses. More recently, powerful analytic research methods have been used to investigate BAL samples to better understand the pathophysiological basis of pediatric respiratory disease. Investigations have focused on the cellular components contained in BAL fluid, such as macrophages, lymphocytes, neutrophils, eosinophils, and mast cells, as well as the noncellular components such as serum molecules, inflammatory proteins, and surfactant. Molecular techniques are frequently used to investigate BAL fluid for the presence of infectious pathologies and for cellular gene expression. Recent advances in proteomics allow identification of multiple protein expression patterns linked to specific respiratory diseases, whereas newer analytic techniques allow for investigations on surfactant quantification and function. These translational research studies on BAL fluid have aided our understanding of pulmonary inflammation and the injury/repair responses in children. We review the ethics and practices for the execution of BAL in children for translational research purposes, with an emphasis on the optimal handling and processing of BAL samples. Copyright © 2014 by the American Academy of Pediatrics.
Extending existing structural identifiability analysis methods to mixed-effects models.
Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D
2018-01-01
The concept of structural identifiability for state-space models is expanded to cover mixed-effects state-space models. Two methods applicable for the analytical study of the structural identifiability of mixed-effects models are presented. The two methods are based on previously established techniques for non-mixed-effects models; namely the Taylor series expansion and the input-output form approach. By generating an exhaustive summary, and by assuming an infinite number of subjects, functions of random variables can be derived which in turn determine the distribution of the system's observation function(s). By considering the uniqueness of the analytical statistical moments of the derived functions of the random variables, the structural identifiability of the corresponding mixed-effects model can be determined. The two methods are applied to a set of examples of mixed-effects models to illustrate how they work in practice. Copyright © 2017 Elsevier Inc. All rights reserved.
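The Taylor series route that the mixed-effects methods build on can be illustrated for an ordinary (non-mixed-effects) model: the Taylor coefficients of the observation function form an exhaustive summary whose uniqueness in the parameters establishes structural identifiability. The one-compartment model and symbols below are illustrative assumptions, not an example from the paper.

```python
# Minimal sketch of the Taylor series (exhaustive summary) approach for a
# simple non-mixed-effects model: dx/dt = -k*x, x(0) = D, observation y = x/V.
# Model and symbols are illustrative assumptions.
import sympy as sp

t, k, V, D = sp.symbols("t k V D", positive=True)

x = D * sp.exp(-k * t)          # analytical state solution
y = x / V                       # observation function

# First few Taylor coefficients of y(t) around t = 0 form the exhaustive summary
summary = [sp.simplify(sp.diff(y, t, n).subs(t, 0)) for n in range(3)]
print(summary)                  # [D/V, -D*k/V, D*k**2/V]

# With D known, c0 = D/V determines V uniquely and c1/c0 = -k determines k
# uniquely, so both parameters are structurally (globally) identifiable.
```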
13C-based metabolic flux analysis: fundamentals and practice.
Yang, Tae Hoon
2013-01-01
Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.
Lloyd, Lyrelle S; Adams, Ralph W; Bernstein, Michael; Coombes, Steven; Duckett, Simon B; Green, Gary G R; Lewis, Richard J; Mewis, Ryan E; Sleigh, Christopher J
2012-08-08
The characterization of materials by the inherently insensitive method of NMR spectroscopy plays a vital role in chemistry. Increasingly, hyperpolarization is being used to address the sensitivity limitation. Here, by reference to quinoline, we illustrate that the SABRE hyperpolarization technique, which uses para-hydrogen as the source of polarization, enables the rapid completion of a range of NMR measurements. These include the collection of (13)C, (13)C{(1)H}, and NOE data in addition to more complex 2D COSY, ultrafast 2D COSY and 2D HMBC spectra. The observations are made possible by the use of a flow probe and external sample preparation cell to re-hyperpolarize the substrate between transients, allowing repeat measurements to be made within seconds. The potential benefit of the combination of SABRE and 2D NMR methods for rapid characterization of low-concentration analytes is therefore established.
Mass spectrometry-based biomarker discovery: toward a global proteome index of individuality.
Hawkridge, Adam M; Muddiman, David C
2009-01-01
Biomarker discovery and proteomics have become synonymous with mass spectrometry in recent years. Although this conflation is an injustice to the many essential biomolecular techniques widely used in biomarker-discovery platforms, it underscores the power and potential of contemporary mass spectrometry. Numerous novel and powerful technologies have been developed around mass spectrometry, proteomics, and biomarker discovery over the past 20 years to globally study complex proteomes (e.g., plasma). However, very few large-scale longitudinal studies have been carried out using these platforms to establish the analytical variability relative to true biological variability. The purpose of this review is not to cover exhaustively the applications of mass spectrometry to biomarker discovery, but rather to discuss the analytical methods and strategies that have been developed for mass spectrometry-based biomarker-discovery platforms and to place them in the context of the many challenges and opportunities yet to be addressed.
Systematic Assessment of the Hemolysis Index: Pros and Cons.
Lippi, Giuseppe
2015-01-01
Preanalytical quality is as important as the analytical and postanalytical quality in laboratory diagnostics. After decades of visual inspection to establish whether or not a diagnostic sample may be suitable for testing, automated assessment of hemolysis index (HI) has now become available in a large number of laboratory analyzers. Although most national and international guidelines support systematic assessment of sample quality via HI, there is widespread perception that this indication has not been thoughtfully acknowledged. Potential explanations include concern of increased specimen rejection rate, poor harmonization of analytical techniques, lack of standardized units of measure, differences in instrument-specific cutoff, negative impact on throughput, organization and laboratory economics, and lack of a reliable quality control system. Many of these concerns have been addressed. Evidence now supports automated HI in improving quality and patient safety. These will be discussed. © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Xing; Hill, Thomas L.; Neild, Simon A.; Shaw, Alexander D.; Haddad Khodaparast, Hamed; Friswell, Michael I.
2018-02-01
This paper proposes a model updating strategy for localised nonlinear structures. It utilises an initial finite-element (FE) model of the structure and primary harmonic response data taken from low and high amplitude excitations. The underlying linear part of the FE model is first updated using low-amplitude test data with established techniques. Then, using this linear FE model, the nonlinear elements are localised, characterised, and quantified with primary harmonic response data measured under stepped-sine or swept-sine excitations. Finally, the resulting model is validated by comparing the analytical predictions with both the measured responses used in the updating and with additional test data. The proposed strategy is applied to a clamped beam with a nonlinear mechanism and good agreements between the analytical predictions and measured responses are achieved. Discussions on issues of damping estimation and dealing with data from amplitude-varying force input in the updating process are also provided.
Yan, Weiying; Colyer, Christa L
2006-11-24
Noncovalent interactions between fluorescent probe molecules and protein analyte molecules, which typically occur with great speed and minimal sample handling, form the basis of many high sensitivity analytical techniques. Understanding the nature of these interactions and the composition of the resulting complexes represents an important area of study that can be facilitated by capillary electrophoresis (CE). Specifically, we will present how frontal analysis (FA) and Hummel-Dreyer (HD) methods can be implemented with CE to determine association constants and stoichiometries of noncovalent complexes of the red luminescent squarylium dye Red-1c with bovine serum albumin (BSA) and beta-lactoglobulin A. By adjusting solution conditions, such as pH or ionic strength, it is possible to selectively modify the binding process. As such, conditions for optimal selectivity for labeling reactions can be established by capillary electrophoresis-frontal analysis (CE-FA) investigations.
Recent Advances in the Measurement of Arsenic, Cadmium, and Mercury in Rice and Other Foods
Punshon, Tracy
2015-01-01
Trace element analysis of foods is of increasing importance because of raised consumer awareness and the need to evaluate and establish regulatory guidelines for toxic trace metals and metalloids. This paper reviews recent advances in the analysis of trace elements in food, including challenges, state-of-the-art methods, and the use of spatially resolved techniques for localizing the distribution of As and Hg within rice grains. Total elemental analysis of foods is relatively well established, but the push for ever lower detection limits requires that methods be robust against potential matrix interferences, which can be particularly severe for food. Inductively coupled plasma mass spectrometry (ICP-MS) is the method of choice, allowing for multi-element and highly sensitive analyses. For arsenic, speciation analysis is necessary because the inorganic forms are more likely to be subject to regulatory limits. Chromatographic techniques coupled to ICP-MS are most often used for arsenic speciation, and a range of methods now exists for a variety of different arsenic species in different food matrices. Speciation and spatial analysis of foods, especially rice, can also be achieved with synchrotron techniques. Sensitive analytical techniques and methodological advances provide robust methods for the assessment of several metals in animal- and plant-based foods, in particular for arsenic, cadmium, and mercury in rice and arsenic speciation in foodstuffs. PMID:25938012
Crystal Growth of ZnSe and Related Ternary Compound Semiconductors by Vapor Transport in Low Gravity
NASA Technical Reports Server (NTRS)
Su, Ching-Hua; Ramachandran, N.
2013-01-01
Crystals of ZnSe and related ternary compounds, such as ZnSeS and ZnSeTe, will be grown by physical vapor transport in the Material Science Research Rack (MSRR) on the International Space Station (ISS). The objective of the project is to determine the relative contributions of gravity-driven fluid flows to the compositional distribution, incorporation of impurities and defects, and deviation from stoichiometry observed in crystals grown by vapor transport, as a result of buoyancy-driven convection and growth interface fluctuations caused by irregular fluid flows on Earth. The investigation consists of extensive ground-based experimental and theoretical research efforts and concurrent flight experimentation. The objectives of the ground-based studies are to (1) obtain the experimental data and conduct the analyses required to define the optimum growth parameters for the flight experiments, (2) perfect various characterization techniques to establish a standard procedure for material characterization, (3) quantitatively establish the characteristics of the crystals grown on Earth as a basis for subsequent comparative evaluations of the crystals grown in a low-gravity environment, and (4) develop the theoretical and analytical methods required for such evaluations. ZnSe and related ternary compounds have been grown by the vapor transport technique with real-time, in-situ, non-invasive monitoring. The grown crystals have been characterized extensively by various techniques to correlate the crystal properties with the growth conditions.
Hoffmann, Thomas; Dorrestein, Pieter C
2015-11-01
Matrix deposition on agar-based microbial colonies for MALDI imaging mass spectrometry is often complicated by the complex media on which microbes are grown. This Application Note demonstrates how consecutive short spray pulses of a matrix solution can form an evenly closed matrix layer on dried agar. Compared with sieving dry matrix onto wet agar, this method supports analyte cocrystallization, which results in significantly more signals, higher signal-to-noise ratios, and improved ionization efficiency. The even matrix layer improves spot-to-spot precision of measured m/z values when using TOF mass spectrometers. With this technique, we established reproducible imaging mass spectrometry of myxobacterial cultures on nutrient-rich cultivation media, which was not possible with the sieving technique.
Luces, Candace A.; Warner, Isiah M.
2014-01-01
Mixed mode separation using a combination of micellar electrokinetic chromatography (MEKC) and polyelectrolyte multilayer (PEM) coatings is herein reported for the separation of achiral and chiral analytes. Many analytes are difficult to separate by MEKC and PEM coatings alone. Therefore, the implementation of a mixed mode separation provides several advantages for overcoming the limitations of these well-established methods. In this study, it was observed that achiral separations using MEKC and PEM coatings individually resulted in partial resolution of 8 very similar aryl ketones when the molecular micelle (sodium poly(N-undecanoyl-l-glycinate) (poly-SUG)) concentration was varied from 0.25% – 1.00% (w/v) and the bilayer number varied from 2 – 4. However, when mixed mode separation was introduced, baseline resolution was achieved for all 8 analytes. In the case of chiral separations, temazepam, aminoglutethimide, benzoin, benzoin methyl ether and coumachlor were separated using the three separation techniques. For chiral separations, the chiral molecular micelle, sodium poly(N-undecanoyl-l-leucylvalinate) (poly-l-SULV), was employed at concentrations of 0.25–1.50% (w/v) for both MEKC and PEM coatings. Overall, the results revealed partial separation with MEKC and PEM coatings individually. However, mixed mode separation enabled baseline separation of each chiral mixture. The separation of achiral and chiral compounds from different compound classes demonstrates the versatility of this mixed mode approach. PMID:20155738
DOE Office of Scientific and Technical Information (OSTI.GOV)
Datta, Dipayan, E-mail: datta@uni-mainz.de; Gauss, Jürgen, E-mail: gauss@uni-mainz.de
2014-09-14
An analytic scheme is presented for the evaluation of first derivatives of the energy for a unitary group based spin-adapted coupled cluster (CC) theory, namely, the combinatoric open-shell CC (COSCC) approach within the singles and doubles approximation. The widely used Lagrange multiplier approach is employed for the derivation of an analytical expression for the first derivative of the energy, which, in combination with the well-established density-matrix formulation, is used for the computation of first-order electrical properties. Derivations of the spin-adapted lambda equations for determining the Lagrange multipliers and the expressions for the spin-free effective density matrices for the COSCC approach are presented. Orbital-relaxation effects due to the electric-field perturbation are treated via the Z-vector technique. We present calculations of the dipole moments for a number of doublet radicals in their ground states using restricted open-shell Hartree-Fock (ROHF) and quasi-restricted HF (QRHF) orbitals in order to demonstrate the applicability of our analytic scheme for computing energy derivatives. We also report calculations of the chlorine electric-field gradients and nuclear quadrupole-coupling constants for the CCl, CH2Cl, ClO2, and SiCl radicals.
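As a hedged schematic of the Lagrange-multiplier route to first derivatives described above (generic CC notation assumed, not the paper's exact expressions), the energy derivative reduces to a partial derivative of a stationary Lagrangian, which in the density-matrix formulation becomes a contraction of effective densities with differentiated integrals:

```latex
% Schematic only; D and Gamma denote effective one- and two-particle densities.
\mathcal{L} = E_{\mathrm{CC}}
  + \sum_{\mu} \lambda_{\mu}\,
    \langle \mu |\, e^{-\hat{T}} \hat{H} e^{\hat{T}} \,| 0 \rangle ,
\qquad
\frac{dE}{d\chi} = \frac{\partial \mathcal{L}}{\partial \chi}
  = \sum_{pq} D_{pq}\, \frac{\partial h_{pq}}{\partial \chi}
  + \sum_{pqrs} \Gamma_{pqrs}\,
    \frac{\partial \langle pq \| rs \rangle}{\partial \chi} .
```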
Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration
NASA Technical Reports Server (NTRS)
Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.
1993-01-01
Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions is examined.
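The abstract does not name a specific correlation metric, but one metric commonly used in test/analysis correlation work of this kind is the modal assurance criterion (MAC) between measured and analytical mode shapes; the sketch below is a generic illustration with invented mode shapes, not data from the Space Station model.

```python
import numpy as np

def mac(phi_test, phi_fem):
    """Modal assurance criterion between a measured and an analytical mode shape.
    Values near 1 indicate well-correlated shapes; values near 0, poor correlation."""
    num = np.abs(phi_test.conj() @ phi_fem) ** 2
    den = (phi_test.conj() @ phi_test) * (phi_fem.conj() @ phi_fem)
    return float(np.real(num / den))

# Hypothetical 4-DOF mode shapes from a modal test and an updated FE model.
phi_t = np.array([0.10, 0.45, 0.80, 1.00])
phi_a = np.array([0.12, 0.43, 0.78, 1.00])
print(mac(phi_t, phi_a))  # close to 1.0 for well-correlated shapes
```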
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2013-12-20
Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow several operations to be integrated, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions relating to different variants of solid-phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE) is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.
One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.
Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz
2009-07-15
The existing solid-phase microextraction (SPME) kinetic calibration technique, which uses the desorption of preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which has limited the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic the variability of the environment, such as temperature, turbulence, and the concentration of the analytes, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of the passive samplings in the flow-through system illustrated that the effect of the environmental variables was successfully compensated for with the kinetic calibration technique, and that all extracted analytes can be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and will also be useful for other microextraction techniques.
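A minimal sketch of the isotropic kinetic-calibration relation underlying this approach is given below; the compound-specific scaling step of the one-calibrant variant is represented only by a placeholder rate ratio, and all names and numbers are hypothetical rather than taken from the paper.

```python
# Sketch of SPME kinetic calibration under the isotropic assumption: the fraction
# of a preloaded calibrant lost by desorption mirrors the fraction of equilibrium
# uptake reached by the analyte during the same exposure.
# The one-calibrant idea is approximated here by a per-analyte rate ratio that
# would be determined beforehand in the laboratory (hypothetical value below).

def equilibrium_amount(n_extracted, q_remaining, q_preloaded, rate_ratio=1.0):
    """Estimate the amount extracted at equilibrium, n_e, from one sampling."""
    remaining_fraction = q_remaining / q_preloaded           # calibrant left on fiber
    analyte_fraction = 1.0 - remaining_fraction ** rate_ratio
    return n_extracted / analyte_fraction

def water_concentration(n_e, k_fs, v_fiber):
    """Convert the equilibrium amount to a water concentration via fiber partitioning."""
    return n_e / (k_fs * v_fiber)

# Hypothetical PAH sampling: 2 ng extracted, 60% of the calibrant still on the fiber.
n_e = equilibrium_amount(n_extracted=2.0, q_remaining=0.6, q_preloaded=1.0,
                         rate_ratio=1.2)
print(n_e)  # estimated equilibrium amount, ng
```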
Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed
2014-06-01
Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of the quantification of target analytes. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments and low-cost operation through extremely low volume or no solvent consumption. Micro-extraction techniques, such as micro-extraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis, a description of MEPS covering its formats (on- and off-line), sorbents, experimental protocols, factors that affect MEPS performance, and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.
Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...
2016-07-05
Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validation of analytical techniques highlighted. The majority of analytical standardization studies to-date has tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
Experimental and analytical determination of stability parameters for a balloon tethered in a wind
NASA Technical Reports Server (NTRS)
Redd, L. T.; Bennett, R. M.; Bland, S. R.
1973-01-01
Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.
100-B/C Target Analyte List Development for Soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
R.W. Ovink
2010-03-18
This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.
Development of a composite geodetic structure for space construction, phase 2
NASA Technical Reports Server (NTRS)
1981-01-01
Primary physical and mechanical properties were defined for the pultruded hybrid HMS/E-glass P1700 rod material used for the fabrication of geodetic beams. Key properties established were used in the analysis, design, fabrication, instrumentation, and testing of a geodetic parameter cylinder and a lattice cone closeout joined to a short cylindrical geodetic beam segment. The required structural fabrication techniques were demonstrated. Analytical procedures were refined and extended to include the effect of rod dimensions for the helical and longitudinal members on local buckling, and the effect of different flexural and extensional moduli on general instability buckling.
A measurement system for the atmospheric trace gases CH4 and CO
NASA Technical Reports Server (NTRS)
Condon, E. P.
1977-01-01
A system for measuring ambient clean air levels of the atmospheric trace gases methane and carbon monoxide is described. The analytical method consists of a gas chromatographic technique that incorporates sample preconcentration with catalytic conversion of CO to CH4 and subsequent flame ionization detection of these gases. The system has sufficient sensitivity and repeatability to make the precise measurements required to establish concentration profiles for CO and CH4 in the planetary boundary layer. A discussion of the bottle sampling program being conducted to obtain the samples for the concentration profiles is also presented.
A trajectory generation framework for modeling spacecraft entry in MDAO
NASA Astrophysics Data System (ADS)
D'Souza, Sarah N.; Sarigul-Klijn, Nesrin
2016-04-01
In this paper a novel trajectory generation framework was developed that optimizes trajectory event conditions for use in a Generalized Entry Guidance algorithm. The framework was developed to be adaptable via the use of high fidelity equations of motion and drag based analytical bank profiles. Within this framework, a novel technique was implemented that resolved the sensitivity of the bank profile to atmospheric non-linearities. The framework's adaptability was established by running two different entry bank conditions. Each case yielded a reference trajectory and set of transition event conditions that are flight feasible and implementable in a Generalized Entry Guidance algorithm.
Biologically inspired technologies using artificial muscles
NASA Technical Reports Server (NTRS)
Bar-Cohen, Yoseph
2005-01-01
One of the newest fields of biomimetics is electroactive polymers (EAP), also known as artificial muscles. To take advantage of these materials, efforts are being made worldwide to establish a strong infrastructure addressing the need for comprehensive analytical modeling of their response mechanism and to develop effective processing and characterization techniques. The field is still in its emerging state and robust materials are still not readily available; however, in recent years significant progress has been made and commercial products have already started to appear. This paper covers the current state-of-the-art and the challenges to making artificial muscles, as well as their potential biomimetic applications.
Analytical Chemistry: A Literary Approach.
ERIC Educational Resources Information Center
Lucy, Charles A.
2000-01-01
Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)
Control of birhythmicity: A self-feedback approach
NASA Astrophysics Data System (ADS)
Biswas, Debabrata; Banerjee, Tanmoy; Kurths, Jürgen
2017-06-01
Birhythmicity occurs in many natural and artificial systems. In this paper, we propose a self-feedback scheme to control birhythmicity. To establish the efficacy and generality of the proposed control scheme, we apply it on three birhythmic oscillators from diverse fields of natural science, namely, an energy harvesting system, the p53-Mdm2 network for protein genesis (the OAK model), and a glycolysis model (modified Decroly-Goldbeter model). Using the harmonic decomposition technique and energy balance method, we derive the analytical conditions for the control of birhythmicity. A detailed numerical bifurcation analysis in the parameter space establishes that the control scheme is capable of eliminating birhythmicity and it can also induce transitions between different forms of bistability. As the proposed control scheme is quite general, it can be applied for control of several real systems, particularly in biochemical and engineering systems.
EDXRF as an alternative method for multielement analysis of tropical soils and sediments.
Fernández, Zahily Herrero; Dos Santos Júnior, José Araújo; Dos Santos Amaral, Romilton; Alvarez, Juan Reinaldo Estevez; da Silva, Edvane Borges; De França, Elvis Joacir; Menezes, Rômulo Simões Cezar; de Farias, Emerson Emiliano Gualberto; do Nascimento Santos, Josineide Marques
2017-08-10
The quality assessment of tropical soils and sediments is still under discussion, with efforts being made on the part of governmental agencies to establish reference values. Energy dispersive X-ray fluorescence (EDXRF) is a potential analytical technique for quantifying diverse chemical elements in geological material without chemical treatment, primarily when it is performed at an appropriate metrological level. In this work, analytical curves were obtained by means of the analysis of geological reference materials (RMs), which allowed the researchers to draw a comparison among the sources of analytical uncertainty. After having determined the quality assurance of the analytical procedure, the EDXRF method was applied to determine chemical elements in soils from the state of Pernambuco, Brazil. The regression coefficients of the analytical curves used to determine Al, Ca, Fe, K, Mg, Mn, Ni, Pb, Si, Sr, Ti, and Zn were higher than 0.99. The quality of the analytical procedure was demonstrated at a 95% confidence level, in which the estimated analytical uncertainties agreed with those from the RMs' certificates of analysis. The analysis of diverse geological samples from Pernambuco indicated higher concentrations of Ni and Zn in sugarcane areas, with maximum values of 41 mg kg⁻¹ and 118 mg kg⁻¹, respectively, and in agricultural areas (41 mg kg⁻¹ and 127 mg kg⁻¹, respectively). The trace element Sr was mainly enriched in urban soils, with values of 400 mg kg⁻¹. According to the results, the EDXRF method was successfully implemented, providing some chemical tracers for the quality assessment of tropical soils and sediments.
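A minimal sketch of how an analytical curve of this kind can be fit and its regression coefficient checked is shown below; the intensities and concentrations are invented for illustration and do not come from the reference materials used in the study.

```python
import numpy as np

# Hypothetical EDXRF calibration for one element: certified concentrations of
# reference materials (mg/kg) versus measured fluorescence intensities (counts).
conc = np.array([10.0, 50.0, 100.0, 250.0, 500.0])
intensity = np.array([210.0, 1050.0, 2080.0, 5230.0, 10400.0])

slope, intercept = np.polyfit(conc, intensity, 1)     # linear analytical curve
pred = slope * conc + intercept
r2 = 1.0 - np.sum((intensity - pred) ** 2) / np.sum((intensity - intensity.mean()) ** 2)
print(f"slope={slope:.2f} counts per mg/kg, intercept={intercept:.1f}, R^2={r2:.4f}")

# An unknown sample is then quantified by inverting the calibration curve:
unknown_counts = 3150.0
print((unknown_counts - intercept) / slope)           # estimated mg/kg
```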
Eypert-Blaison, Céline; Romero-Hariot, Anita; Clerc, Frédéric; Vincent, Raymond
2018-03-01
From November 2009 to October 2010, the French general directorate for labor organized a large field-study using analytical transmission electron microscopy (ATEM) to characterize occupational exposure to asbestos fibers during work on asbestos containing materials (ACM). The primary objective of this study was to establish a method and to validate the feasibility of using ATEM for the analysis of airborne asbestos of individual filters sampled in various occupational environments. For each sampling event, ATEM data were compared to those obtained by phase-contrast optical microscopy (PCOM), the WHO-recommended reference technique. A total of 265 results were obtained from 29 construction sites where workers were in contact with ACM. Data were sorted depending on the combination of the ACM type and the removal technique. For each "ACM-removal technique" combination, ATEM data were used to compute statistical indicators on short, fine and WHO asbestos fibers. Moreover, exposure was assessed taking into account the use of respiratory protective devices (RPD). As in previous studies, no simple relationship was found between results by PCOM and ATEM counting methods. Some ACM, such as asbestos-containing plasters, generated very high dust levels, and some techniques generated considerable levels of dust whatever the ACM treated. On the basis of these observations, recommendations were made to measure and control the occupational exposure limit. General prevention measures to be taken during work with ACM are also suggested. Finally, it is necessary to continue acquiring knowledge, in particular regarding RPD and the dust levels measured by ATEM for the activities not evaluated during this study.
NASA Technical Reports Server (NTRS)
Migneault, Gerard E.
1987-01-01
Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.
Analytical Applications of Monte Carlo Techniques.
ERIC Educational Resources Information Center
Guell, Oscar A.; Holcombe, James A.
1990-01-01
Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
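As a simple, self-contained illustration of the kind of Monte Carlo technique the article surveys, the sketch below estimates a definite integral by random sampling; it is generic and not tied to the article's specific examples.

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Estimate the integral of f over [a, b] by uniform random sampling."""
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Example: the integral of x^2 on [0, 1] is 1/3; the estimate converges as n grows.
print(mc_integrate(lambda x: x * x, 0.0, 1.0))
```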
New approaches for metabolomics by mass spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vertes, Akos
Small molecules constitute a large part of the world around us, including fossil and some renewable energy sources. Solar energy harvested by plants and bacteria is converted into energy rich small molecules on a massive scale. Some of the worst contaminants of the environment and compounds of interest for national security also fall in the category of small molecules. The development of large scale metabolomic analysis methods lags behind the state of the art established for genomics and proteomics. This is commonly attributed to the diversity of molecular classes included in a metabolome. Unlike nucleic acids and proteins, metabolites do not have standard building blocks, and, as a result, their molecular properties exhibit a wide spectrum. This impedes the development of dedicated separation and spectroscopic methods. Mass spectrometry (MS) is a strong contender in the quest for a quantitative analytical tool with extensive metabolite coverage. Although various MS-based techniques are emerging for metabolomics, many of these approaches include extensive sample preparation that make large scale studies resource intensive and slow. New ionization methods are redefining the range of analytical problems that can be solved using MS. This project developed new approaches for the direct analysis of small molecules in unprocessed samples, as well as pushed the limits of ultratrace analysis in volume limited complex samples. The projects resulted in techniques that enabled metabolomics investigations with enhanced molecular coverage, as well as the study of cellular response to stimuli on a single cell level. Effectively individual cells became reaction vessels, where we followed the response of a complex biological system to external perturbation. We established two new analytical platforms for the direct study of metabolic changes in cells and tissues following external perturbation. For this purpose we developed a novel technique, laser ablation electrospray ionization (LAESI), for metabolite profiling of functioning cells and tissues. The technique was based on microscopic sampling of biological specimens by mid-infrared laser ablation followed by electrospray ionization of the plume and MS analysis. The two main shortcomings of this technique had been limited specificity due to the lack of a separation step, and limited molecular coverage, especially for nonpolar chemical species. To improve specificity and the coverage of the metabolome, we implemented the LAESI ion source on a mass spectrometer with ion mobility separation (IMS). In this system, the gas phase ions produced by the LAESI source were first sorted according to their collisional cross sections in a mobility cell. These separated ion packets were then subjected to MS analysis. By combining the atmospheric pressure ionization with IMS, we improved the metabolite coverage. Further enhancement of the non-polar metabolite coverage resulted from the combination of laser ablation with vacuum UV irradiation of the ablation plume. Our results indicated that this new ionization modality provided improved detection for neutral and non-polar compounds. Based on rapid progress in photonics, we had introduced another novel ion source that utilized the interaction of a laser pulse with silicon nanopost arrays (NAPA). In these nanophotonic ion sources, the structural features were commensurate with the wavelength of the laser light. The enhanced interaction resulted in high ion yields.
This ultrasensitive analytical platform enabled the MS analysis of single yeast cells. We extended these NAPA studies from yeast to other microorganisms, including green algae (Chlamydomonas reinhardtii) that capture energy from sunlight on a massive scale. Combining cellular perturbations, e.g., through environmental changes, with the newly developed single cell analysis methods enabled us to follow dynamic changes induced in the cells. In effect, we were able to use individual cells as a "laboratory," and approached the long-standing goal of establishing a "lab-in-a-cell." Model systems for these studies included cells of cyanobacteria (Anabaena), yeast (Saccharomyces cerevisiae), green algae (C. reinhardtii) and Arabidopsis thaliana.
NASA Astrophysics Data System (ADS)
Poggio, Andrew J.; Mayall, Brian H.
1989-04-01
The Lawrence Livermore National Laboratory (LLNL) is an acknowledged world center for analytical cytology. This leadership was recognized by the Regents of the University of California (UC), who in 1982 established and funded the Program for Analytical Cytology to facilitate the transfer of this technology from scientists at LLNL to their University colleagues, primarily through innovative collaborative research. This issue of Energy and Technology Review describes three of the forty projects that have been funded in this way, chosen to illustrate the potential medical application of the research. Analytical cytology is a relatively new field of biomedical research that is increasingly being applied in clinical medicine. It has been particularly important in unraveling the complexities of the human immune system and in quantifying the pathobiology of malignancy. Defined as the characterization and measurement of cells and cellular constituents for biological and medical purposes, analytical cytology bridges the gap between the quantitative discipline of molecular biology and the more qualitative disciplines of anatomy and pathology. It is itself multidisciplinary in nature. Two major approaches to analytical cytology are flow cytometry and image cytometry. In each of these research techniques, cells are measured one at a time in an automated device. In flow instruments, the cells are dispersed in fluid suspension and pass in single file through a beam of laser light to generate optical signals that are measured. In image cytometry, cells are dispersed on a slide and are imaged through a microscope onto an electronic imaging and analysis system that processes the cell image to extract measurements of interest.
An initial investigation into methods of computing transonic aerodynamic sensitivity coefficients
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1994-01-01
The primary accomplishments of the project are as follows: (1) Using the transonic small perturbation equation as a flowfield model, the project demonstrated that the quasi-analytical method could be used to obtain aerodynamic sensitivity coefficients for airfoils at subsonic, transonic, and supersonic conditions for design variables such as Mach number, airfoil thickness, maximum camber, angle of attack, and location of maximum camber. It was established that the quasi-analytical approach was an accurate method for obtaining aerodynamic sensitivity derivatives for airfoils at transonic conditions and usually more efficient than the finite difference approach. (2) The usage of symbolic manipulation software to determine the appropriate expressions and computer coding associated with the quasi-analytical method for sensitivity derivatives was investigated. Using the three dimensional fully conservative full potential flowfield model, it was determined that symbolic manipulation along with a chain rule approach was extremely useful in developing a combined flowfield and quasi-analytical sensitivity derivative code capable of considering a large number of realistic design variables. (3) Using the three dimensional fully conservative full potential flowfield model, the quasi-analytical method was applied to swept wings (i.e. three dimensional) at transonic flow conditions. (4) The incremental iterative technique has been applied to the three dimensional transonic nonlinear small perturbation flowfield formulation, an equivalent plate deflection model, and the associated aerodynamic and structural discipline sensitivity equations; and coupled aeroelastic results for an aspect ratio three wing in transonic flow have been obtained.
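Schematically, the quasi-analytical approach referred to above differentiates the discretized flow residual with respect to a design variable and then applies the chain rule to the aerodynamic output; the notation below is generic and assumed, not taken from the report:

```latex
% Generic sketch: R = discretized flow residual, Q = flow variables,
% X = design variable, C = aerodynamic coefficient of interest.
R\bigl(Q(X),X\bigr)=0
\;\Longrightarrow\;
\frac{\partial R}{\partial Q}\,\frac{dQ}{dX} = -\,\frac{\partial R}{\partial X},
\qquad
\frac{dC}{dX} = \frac{\partial C}{\partial X}
  + \frac{\partial C}{\partial Q}\,\frac{dQ}{dX}.
```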
Development Of Antibody-Based Fiber-Optic Sensors
NASA Astrophysics Data System (ADS)
Tromberg, Bruce J.; Sepaniak, Michael J.; Vo-Dinh, Tuan
1988-06-01
The speed and specificity characteristic of immunochemical complex formation have encouraged the development of numerous antibody-based analytical techniques. The scope and versatility of these established methods can be enhanced by combining the principles of conventional immunoassay with laser-based fiber-optic fluorimetry. This merger of spectroscopy and immunochemistry provides the framework for the construction of highly sensitive and selective fiber-optic devices (fluoroimmuno-sensors) capable of in-situ detection of drugs, toxins, and naturally occurring biochemicals. Fluoroimmuno-sensors (FIS) employ an immobilized reagent phase at the sampling terminus of a single quartz optical fiber. Laser excitation of antibody-bound analyte produces a fluorescence signal which is either directly proportional (as in the case of natural fluorophore and "antibody sandwich" assays) or inversely proportional (as in the case of competitive-binding assays) to analyte concentration. Factors which influence analysis time, precision, linearity, and detection limits include the nature (solid or liquid) and amount of the reagent phase, the method of analyte delivery (passive diffusion, convection, etc.), and whether equilibrium or non-equilibrium assays are performed. Data will be presented for optical fibers whose sensing termini utilize: (1) covalently-bound solid antibody reagent phases, and (2) membrane-entrapped liquid antibody reagents. Assays for large-molecular-weight proteins (antigens) and small-molecular-weight, carcinogenic, polynuclear aromatics (haptens) will be considered. In this manner, the influence of a system's chemical characteristics and measurement requirements on sensor design, and the consequence of various sensor designs on analytical performance, will be illustrated.
Thermoelectrically cooled water trap
Micheels, Ronald H [Concord, MA
2006-02-21
A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples, prior to analysis of these samples for chemical composition, by a variety of analytical techniques where water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in a portable gas monitoring instrumentation.
Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.
Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs
2018-01-01
While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
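For readers unfamiliar with the underlying primitive, the sketch below shows additive secret sharing, one simple building block used in secure multi-party computation; it is a toy illustration and not the protocol or code used in the pilots described above.

```python
import secrets

PRIME = 2_147_483_647  # field modulus for this toy example

def share(value, n_parties=3):
    """Split an integer into n additive shares that sum to value mod PRIME."""
    shares = [secrets.randbelow(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

# Two hospitals each share a patient count; the parties add shares locally, so the
# total is revealed without either input ever being disclosed in the clear.
a_shares, b_shares = share(1200), share(345)
sum_shares = [(x + y) % PRIME for x, y in zip(a_shares, b_shares)]
print(reconstruct(sum_shares))  # 1545
```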
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
Recent Development in Optical Chemical Sensors Coupling with Flow Injection Analysis
Ojeda, Catalina Bosch; Rojas, Fuensanta Sánchez
2006-01-01
Optical techniques for chemical analysis are well established, and sensors based on these techniques are now attracting considerable attention because of their importance in applications such as environmental monitoring, biomedical sensing, and industrial process control. On the other hand, flow injection analysis (FIA) is advisable for the rapid analysis of microliter-volume samples and can be interfaced directly to the chemical process. FIA has become a widespread automated analytical method for many reasons, mainly the simplicity and low cost of the setups, their versatility, and ease of assembly. In this paper, an overview of flow injection determinations using optical chemical sensors is provided, and instrumentation, sensor design, and applications are discussed. This work summarizes the most relevant manuscripts from 1980 to date referring to analysis using optical chemical sensors in FIA.
Analytical methods in multivariate highway safety exposure data estimation
DOT National Transportation Integrated Search
1984-01-01
Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting, and expectation maximization...
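As a brief illustration of one of the named techniques (iterative proportional fitting), the sketch below rescales a seed table to match known row and column totals; the margins are hypothetical and are not drawn from the report's exposure data.

```python
import numpy as np

def ipf(seed, row_totals, col_totals, iters=50):
    """Iterative proportional fitting of a 2-D table to target margins."""
    table = seed.astype(float).copy()
    for _ in range(iters):
        table *= (row_totals / table.sum(axis=1))[:, None]   # match row margins
        table *= (col_totals / table.sum(axis=0))[None, :]   # match column margins
    return table

seed = np.array([[1.0, 1.0], [1.0, 1.0]])   # uninformative seed table
rows = np.array([60.0, 40.0])               # e.g., a hypothetical vehicle-type margin
cols = np.array([70.0, 30.0])               # e.g., a hypothetical road-type margin
print(ipf(seed, rows, cols))
```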
Techniques for Forecasting Air Passenger Traffic
NASA Technical Reports Server (NTRS)
Taneja, N.
1972-01-01
The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.
Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta
2018-05-15
Analysis of clinical specimens by imaging techniques allows the content and distribution of trace elements on the surface of the examined sample to be determined. In order to obtain reliable results, the developed procedure should be based not only on a properly prepared sample and a properly performed calibration. It is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and the estimation of the uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles into chemical measurement, with emphasis on LA-ICP-MS, which is a comparative method that requires a rigorous approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
Recent advances in computational-analytical integral transforms for convection-diffusion problems
NASA Astrophysics Data System (ADS)
Cotta, R. M.; Naveira-Cotta, C. P.; Knupp, D. C.; Zotin, J. L. Z.; Pontes, P. C.; Almeida, A. P.
2017-10-01
A unifying overview of the Generalized Integral Transform Technique (GITT) as a computational-analytical approach for solving convection-diffusion problems is presented. This work is aimed at bringing together some of the most recent developments on both accuracy and convergence improvements of this well-established hybrid numerical-analytical methodology for partial differential equations. Special emphasis is given to novel algorithm implementations, all directly connected to enhancing the eigenfunction expansion basis, such as a single domain reformulation strategy for handling complex geometries, an integral balance scheme for dealing with multiscale problems, the adoption of convective eigenvalue problems in formulations with significant convection effects, and the direct integral transformation of nonlinear convection-diffusion problems based on nonlinear eigenvalue problems. Then, selected examples are presented that illustrate the improvement achieved in each class of extension, in terms of convergence acceleration and accuracy gain, which are related to conjugated heat transfer in complex or multiscale microchannel-substrate geometries, a multidimensional Burgers equation model, and diffusive metal extraction through polymeric hollow fiber membranes. Numerical results are reported for each application and, where appropriate, critically compared against the traditional GITT scheme without convergence enhancement schemes and against commercial or dedicated purely numerical approaches.
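At the core of the GITT referred to above is an eigenfunction-expansion transform/inverse pair; a generic one-dimensional form, with assumed notation, is:

```latex
% Generic GITT transform-inverse pair for a potential T(x,t) expanded in
% eigenfunctions \psi_i(x) of an auxiliary Sturm-Liouville problem with weight w(x).
\bar{T}_i(t) = \int_{0}^{L} w(x)\,\tilde{\psi}_i(x)\,T(x,t)\,dx
  \quad\text{(transform)},
\qquad
T(x,t) = \sum_{i=1}^{\infty} \tilde{\psi}_i(x)\,\bar{T}_i(t)
  \quad\text{(inverse)},
\qquad
\tilde{\psi}_i(x) = \frac{\psi_i(x)}{\sqrt{N_i}},\quad
N_i = \int_{0}^{L} w(x)\,\psi_i^{2}(x)\,dx .
```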
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
Imaging of oxygen and hypoxia in cell and tissue samples.
Papkovsky, Dmitri B; Dmitriev, Ruslan I
2018-05-14
Molecular oxygen (O2) is a key player in cell mitochondrial function, redox balance and oxidative stress, normal tissue function and many common disease states. Various chemical, physical and biological methods have been proposed for measurement, real-time monitoring and imaging of O2 concentration, states of decreased O2 (hypoxia) and related parameters in cells and tissue. Here, we review the established and emerging optical microscopy techniques that allow O2 levels to be visualized in cells and tissue samples, mostly under in vitro and ex vivo, but also under in vivo, settings. Particular examples include fluorescent hypoxia stains, fluorescent protein reporter systems, phosphorescent probes and nanosensors of different types. These techniques allow high-resolution mapping of O2 gradients in live or post-mortem tissue, in 2D or 3D, qualitatively or quantitatively. They enable control and monitoring of oxygenation conditions and their correlation with other biomarkers of cell and tissue function. Comparison of these techniques and corresponding imaging setups, their analytical capabilities and typical applications are given.
Demonstration of the feasibility of an integrated x ray laboratory for planetary exploration
NASA Technical Reports Server (NTRS)
Franco, E. D.; Kerner, J. A.; Koppel, L. N.; Boyle, M. J.
1993-01-01
The identification of minerals and elemental compositions is an important component in the geological and exobiological exploration of the solar system. X ray diffraction and fluorescence are common techniques for obtaining these data. The feasibility of combining these analytical techniques in an integrated x ray laboratory compatible with the volume, mass, and power constraints imposed by many planetary missions was demonstrated. Breadboard-level hardware was developed to cover the range of diffraction lines produced by minerals, clays, and amorphous materials, and to detect the x ray fluorescence emissions of elements from carbon through uranium. These breadboard modules were fabricated and used to demonstrate the ability to detect elements and minerals. Additional effort is required to establish the detection limits of the breadboard modules and to integrate the diffraction and fluorescence techniques into a single unit. It was concluded that this integrated x ray laboratory capability will be a valuable tool in the geological and exobiological exploration of the solar system.
NASA Astrophysics Data System (ADS)
Lin, Xiangyue; Peng, Minli; Lei, Fengming; Tan, Jiangxian; Shi, Huacheng
2017-12-01
Based on the assumptions of uniform corrosion and linear elastic expansion, an analytical model of cracking due to rebar corrosion expansion in concrete was established, which is able to account for the internal forces of the structure. Then, by means of the complex variable function theory and the series expansion technique established by Muskhelishvili, the corresponding stress component functions of the concrete around the reinforcement were obtained. A comparative analysis was also conducted between a numerical simulation model and the present model. The results show that the calculation results of both methods were consistent with each other, and the numerical deviation was less than 10%, proving that the analytical model established in this paper is reliable.
Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel
2017-01-01
Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. Environmental Protection Agency (EPA)'s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples.
An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy
ERIC Educational Resources Information Center
Collis, Peter
2012-01-01
Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…
NASA Technical Reports Server (NTRS)
Bozeman, Robert E.
1987-01-01
An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.
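The rotating spherical exponential density model mentioned above is, in its simplest form, a single exponential law to which the oblate-atmosphere and diurnal terms are then superposed; the notation below is assumed rather than quoted from the report:

```latex
% Base exponential density law: \rho_0 is the reference density at radius r_0
% and H is the density scale height; oblateness and diurnal corrections are
% added on top of this term.
\rho(r) = \rho_0 \, \exp\!\left(-\,\frac{r - r_0}{H}\right)
```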
Duct flow nonuniformities study for space shuttle main engine
NASA Technical Reports Server (NTRS)
Thoenes, J.
1985-01-01
To improve the Space Shuttle Main Engine (SSME) design and for future use in the development of next-generation rocket engines, a combined experimental/analytical study was undertaken with the goals of, first, establishing an experimental data base for the flow conditions in the SSME high pressure fuel turbopump (HPFTP) hot gas manifold (HGM) and, second, setting up a computer model of the SSME HGM flow field. Using the test data to verify the computer model, it should be possible in the future to computationally scan contemplated advanced design configurations and limit costly testing to the most promising designs. The effort of establishing and using the computer model is detailed. The comparison of computational results with the observed experimental data clearly demonstrates that computational fluid dynamics (CFD) techniques can be used successfully to predict the gross features of three dimensional fluid flow through configurations as intricate as the SSME turbopump hot gas manifold.
Stochastic Thermodynamics of a Particle in a Box.
Gong, Zongping; Lan, Yueheng; Quan, H T
2016-10-28
The piston system (particles in a box) is the simplest paradigmatic model in traditional thermodynamics. However, the recently established framework of stochastic thermodynamics (ST) fails to apply to this model system due to the embedded singularity in the potential. In this Letter, we study the ST of a particle in a box by adopting a novel coordinate transformation technique. Through comparing with the exact solution of a breathing harmonic oscillator, we obtain analytical results of work distribution for an arbitrary protocol in the linear response regime and verify various predictions of the fluctuation-dissipation relation. When applying to the Brownian Szilard engine model, we obtain the optimal protocol λ_{t}=λ_{0}2^{t/τ} for a given sufficiently long total time τ. Our study not only establishes a paradigm for studying ST of a particle in a box but also bridges the long-standing gap in the development of ST.
Backward bifurcation and optimal control of Plasmodium Knowlesi malaria
NASA Astrophysics Data System (ADS)
Abdullahi, Mohammed Baba; Hasan, Yahya Abu; Abdullah, Farah Aini
2014-07-01
A deterministic model for the transmission dynamics of Plasmodium knowlesi malaria with direct transmission is developed. The model is analyzed using dynamical systems techniques, and it shows that backward bifurcation occurs for some range of parameters. The model is extended to assess the impact of time-dependent preventive measures (biological and chemical control) against the mosquitoes, vaccination for susceptible humans, and treatment for infected humans. The existence of an optimal control is established analytically by the use of optimal control theory. Numerical simulations of the problem suggest that applying the four control measures can effectively reduce, if not eliminate, the spread of Plasmodium knowlesi in a community.
A new approach to the Schrödinger equation with rational potentials
NASA Astrophysics Data System (ADS)
Dong, Ming-de; Chu, Jue-Hui
1984-04-01
A new analytic theory is established for the Schrödinger equation with a rational potential, including a complete classification of the regular eigenfunctions into three different types, an exact method of obtaining wavefunctions, an explicit formulation of the spectral equation (3 x 3 determinant) etc. All representations are exhibited in a unifying way via function-theoretic methods and therefore given in explicit form, in contrast to the prevailing discussion appealing to perturbation or variation methods or continued-fraction techniques. The irregular eigenfunctions at infinity can be obtained analogously and will be discussed separately as another solvable case for singular potentials.
Screening Vaccine Formulations in Fresh Human Whole Blood.
Hakimi, Jalil; Aboutorabian, Sepideh; To, Frederick; Ausar, Salvador F; Rahman, Nausheen; Brookes, Roger H
2017-01-01
Monitoring the immunological functionality of vaccine formulations is critical for vaccine development. While the traditional approach using established animal models has been relatively effective, the use of animals is costly and cumbersome, and animal models are not always reflective of a human response. The development of a human-based approach would be a major step forward in understanding how vaccine formulations might behave in humans. Here, we describe a platform methodology using fresh human whole blood (hWB) to monitor adjuvant-modulated, antigen-specific responses to vaccine formulations, which is amenable to analysis by standard immunoassays as well as a variety of other analytical techniques.
Trace elemental analysis of Indian natural moonstone gems by PIXE and XRD techniques.
Venkateswara Rao, R; Venkateswarulu, P; Kasipathi, C; Sivajyothi, S
2013-12-01
A selected number of Indian Eastern Ghats natural moonstone gems were studied with a powerful nuclear analytical and non-destructive Proton Induced X-ray Emission (PIXE) technique. Thirteen elements, including V, Co, Ni, Zn, Ga, Ba and Pb, were identified in these moonstones and may be useful in interpreting the various geochemical conditions and the probable cause of their inceptions in the moonstone gemstone matrix. Furthermore, preliminary XRD studies of different moonstone patterns were performed. The PIXE technique is a powerful method for quickly determining the elemental concentration of a substance. A 3MeV proton beam was employed to excite the samples. The chemical constituents of moonstones from parts of the Eastern Ghats geological formations of Andhra Pradesh, India were determined, and gemological studies were performed on those gems. The crystal structure and the lattice parameters of the moonstones were estimated using X-Ray Diffraction studies, trace and minor elements were determined using the PIXE technique, and major compositional elements were confirmed by XRD. In the present work, the usefulness and versatility of the PIXE technique for research in geo-scientific methodology is established. © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. college and university systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
2013-01-01
Background Healthcare delivery is largely accomplished in and through conversations between people, and healthcare quality and effectiveness depend enormously upon the communication practices employed within these conversations. An important body of evidence about these practices has been generated by conversation analysis and related discourse analytic approaches, but there has been very little systematic reviewing of this evidence. Methods We developed an approach to reviewing evidence from conversation analytic and related discursive research through the following procedures:
• reviewing existing systematic review methods and our own prior experience of applying these
• clarifying distinctive features of conversation analytic and related discursive work which must be taken into account when reviewing
• holding discussions within a review advisory team that included members with expertise in healthcare research, conversation analytic research, and systematic reviewing
• attempting and then refining procedures through conducting an actual review which examined evidence about how people talk about difficult future issues including illness progression and dying
Results We produced a step-by-step guide which we describe here in terms of eight stages, and which we illustrate from our ‘Review of Future Talk’. The guide incorporates both established procedures for systematic reviewing, and new techniques designed for working with conversation analytic evidence. Conclusions The guide is designed to inform systematic reviews of conversation analytic and related discursive evidence on specific domains and topics. Whilst we designed it for reviews that aim at informing healthcare practice and policy, it is flexible and could be used for reviews with other aims, for instance those aiming to underpin research programmes and projects. We advocate systematically reviewing conversation analytic and related discursive findings using this approach in order to translate them into a form that is credible and useful to healthcare practitioners, educators and policy-makers. PMID:23721181
Sood, Akshay; Ghani, Khurshid R; Ahlawat, Rajesh; Modi, Pranjal; Abaza, Ronney; Jeong, Wooju; Sammon, Jesse D; Diaz, Mireya; Kher, Vijay; Menon, Mani; Bhandari, Mahendra
2014-08-01
Traditional evaluation of the learning curve (LC) of an operation has been retrospective. Furthermore, LC analysis does not permit patient safety monitoring. To prospectively monitor patient safety during the learning phase of robotic kidney transplantation (RKT) and determine when it could be considered learned using the techniques of statistical process control (SPC). From January through May 2013, 41 patients with end-stage renal disease underwent RKT with regional hypothermia at one of two tertiary referral centers adopting RKT. Transplant recipients were classified into three groups based on the robotic training and kidney transplant experience of the surgeons: group 1, robot trained with limited kidney transplant experience (n=7); group 2, robot trained and kidney transplant experienced (n=20); and group 3, kidney transplant experienced with limited robot training (n=14). We employed prospective monitoring using SPC techniques, including cumulative summation (CUSUM) and Shewhart control charts, to perform LC analysis and patient safety monitoring, respectively. Outcomes assessed included post-transplant graft function and measures of surgical process (anastomotic and ischemic times). CUSUM and Shewhart control charts are time trend analytic techniques that allow comparative assessment of outcomes following a new intervention (RKT) relative to those achieved with established techniques (open kidney transplant; target value) in a prospective fashion. CUSUM analysis revealed an initial learning phase for group 3, whereas groups 1 and 2 had no to minimal learning time. The learning phase for group 3 varied depending on the parameter assessed. Shewhart control charts demonstrated no compromise in functional outcomes for groups 1 and 2. Graft function was compromised in one patient in group 3 (p<0.05) secondary to reasons unrelated to RKT. In multivariable analysis, robot training was significantly associated with improved task-completion times (p<0.01). Graft function was not adversely affected by either the lack of robotic training (p=0.22) or kidney transplant experience (p=0.72). The LC and patient safety of a new surgical technique can be assessed prospectively using CUSUM and Shewhart control chart analytic techniques. These methods allow determination of the duration of mentorship and identification of adverse events in a timely manner. A new operation can be considered learned when outcomes achieved with the new intervention are at par with outcomes following established techniques. Statistical process control techniques allowed for robust, objective, and prospective monitoring of robotic kidney transplantation and can similarly be applied to other new interventions during the introduction and adoption phase. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.
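The CUSUM idea described above can be illustrated with a short sketch; the target value and case times below are hypothetical (e.g., anastomosis times in minutes), and this simplified one-sided CUSUM is a generic illustration rather than the authors' exact charting procedure.

```python
import numpy as np

def cusum_upper(observations, target, allowance):
    """One-sided upper CUSUM: accumulates excursions above target + allowance.

    A rising curve signals a learning phase (outcomes worse than the
    benchmark); a flat curve suggests performance on par with the
    established technique.
    """
    s = 0.0
    path = []
    for x in observations:
        s = max(0.0, s + (x - target - allowance))
        path.append(s)
    return np.array(path)

# Hypothetical times (min) for consecutive cases vs. an open-surgery benchmark
times = [52, 48, 47, 44, 40, 38, 37, 36, 35, 34]
chart = cusum_upper(times, target=35.0, allowance=2.0)
print(np.round(chart, 1))  # early cases accumulate; later cases flatten out
```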
Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert
2009-01-01
Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...
Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A
2007-01-01
The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
Therapeutic drug monitoring of flucytosine in serum using a SERS-active membrane system
NASA Astrophysics Data System (ADS)
Berger, Adam G.; White, Ian M.
2017-02-01
A need exists for near real-time therapeutic drug monitoring (TDM), in particular for antibiotics and antifungals in patient samples at the point-of-care. To truly fit the point-of-care need, techniques must be rapid and easy to use. Here we report a membrane system utilizing inkjet-fabricated surface enhanced Raman spectroscopy (SERS) sensors that allows sensitive and specific analysis despite the elimination of sophisticated chromatography equipment, expensive analytical instruments, and other systems relegated to the central lab. We utilize inkjet-fabricated paper SERS sensors as substrates for flucytosine (5FC) detection; the use of paper-based SERS substrates leverages the natural wicking ability and filtering properties of microporous membranes. We investigate the use of microporous membranes in the vertical flow assay to allow separation of the flucytosine from whole blood. The passive vertical flow assay serves as a valuable method for physical separation of target analytes from complex biological matrices. This work further establishes a platform for easy, sensitive, and specific TDM of 5FC from whole blood.
García-Jiménez, Sara; Erazo-Mijares, Miguel; Toledano-Jaimes, Cairo D; Monroy-Noyola, Antonio; Bilbao-Marcos, Fernando; Sánchez-Alemán, Miguel A; Déciga-Campos, Myrna
2016-01-01
The present study used analytical techniques to quantify biomarkers that are useful for the early detection of ethanol consumption in a college population. A group of 117 students newly admitted to the Universidad Autónoma del Estado de Morelos was analyzed. The enzymatic determination of aspartate aminotransferase, alanine aminotransferase, and gamma glutamyltransferase as metabolic markers of ethanol, as well as carbohydrate-deficient transferrin (CDT) detected by high-performance liquid chromatography (up to 1.8% CDT), allowed us to identify that 6% of the college population presented a potential risk of alcohol consumption. The use of the biochemical-analytical method, together with the psychological drug-use and risk-factor instrument established by the Universidad Autónoma del Estado de Morelos, permits us to identify students whose substance use puts their academic standing and degree completion at risk. Timely detection on admission to college makes it possible to monitor and support students at risk from substance abuse.
NASA Astrophysics Data System (ADS)
Macías-Díaz, J. E.
In the present manuscript, we introduce a finite-difference scheme to approximate solutions of the two-dimensional version of Fisher's equation from population dynamics, which is a model for which the existence of traveling-wave fronts bounded within (0,1) is a well-known fact. The method presented here is a nonstandard technique which, in the linear regime, approximates the solutions of the original model with a consistency of second order in space and first order in time. The theory of M-matrices is employed here in order to elucidate conditions under which the method is able to preserve the positivity and the boundedness of solutions. In fact, our main result establishes relatively flexible conditions under which the preservation of the positivity and the boundedness of new approximations is guaranteed. Some simulations of the propagation of a traveling-wave solution confirm the analytical results derived in this work; moreover, the experiments evince a good agreement between the numerical result and the analytical solutions.
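A minimal sketch of a Mickens-style nonstandard update for the 2D Fisher equation u_t = D∇²u + u(1 − u) is given below: treating the nonlinear loss term implicitly keeps every update bounded within [0, 1] whenever r = DΔt/h² ≤ 1/4. This is an illustration in the spirit of the abstract, not the authors' exact scheme.

```python
import numpy as np

def fisher_step(u, D, dt, h):
    """One nonstandard step for u_t = D*lap(u) + u*(1 - u).

    The reaction term u*(1-u) is split as u^n - u^n*u^{n+1}, which moves the
    loss term into the denominator and preserves 0 <= u <= 1 when
    r = D*dt/h**2 <= 0.25 (five-point Laplacian, periodic boundaries here).
    """
    r = D * dt / h**2
    assert r <= 0.25, "positivity/boundedness condition violated"
    nbr_sum = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
               np.roll(u, 1, 1) + np.roll(u, -1, 1))
    numerator = u * (1.0 - 4.0 * r) + r * nbr_sum + dt * u
    return numerator / (1.0 + dt * u)

# Small demonstration: a localized initial colony spreading as a front
n, h, D, dt = 64, 1.0, 1.0, 0.2
u = np.zeros((n, n))
u[28:36, 28:36] = 1.0
for _ in range(200):
    u = fisher_step(u, D, dt, h)
print(u.min(), u.max())  # remains within [0, 1]
```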
D'Amato, Marilena; Turrini, Aida; Aureli, Federica; Moracci, Gabriele; Raggi, Andrea; Chiaravalle, Eugenio; Mangiacotti, Michele; Cenci, Telemaco; Orletti, Roberta; Candela, Loredana; di Sandro, Alessandra; Cubadda, Francesco
2013-01-01
This article presents the methodology of the Italian Total Diet Study 2012-2014 aimed at assessing the dietary exposure of the general Italian population to selected nonessential trace elements (Al, inorganic As, Cd, Pb, methyl-Hg, inorganic Hg, U) and radionuclides (40K, 134Cs, 137Cs, 90Sr). The establishment of the TDS food list, the design of the sampling plan, and details about the collection of food samples, their standardized culinary treatment, pooling into analytical samples and subsequent sample treatment are described. Analytical techniques and quality assurance are discussed, with emphasis on the need for speciation data and for minimizing the percentage of left-censored data so as to reduce uncertainties in exposure assessment. Finally the methodology for estimating the exposure of the general population and of population subgroups according to age (children, teenagers, adults, and the elderly) and gender, both at the national level and for each of the four main geographical areas of Italy, is presented.
Hu, Yijie; Deng, Liqing; Chen, Jinwu; Zhou, Siyu; Liu, Shuang; Fu, Yufan; Yang, Chunxian; Liao, Zhihua; Chen, Min
2016-03-01
Purple sweet potato (Ipomoea batatas L.) is rich in anthocyanin pigments, which are valuable constituents of the human diet. Techniques to identify and quantify anthocyanins and their antioxidant potential are desirable for cultivar selection and breeding. In this study, we performed a quantitative and qualitative chemical analysis of 30 purple sweet potato (PSP) cultivars, using various assays to measure reducing power, radical-scavenging activities, and linoleic acid autoxidation inhibition activity. Grey relational analysis (GRA) was applied to establish relationships between the antioxidant activities and the chemical fingerprints, in order to identify key bioactive compounds. The results indicated that four peonidin-based anthocyanins and three cyanidin-based anthocyanins make significant contributions to antioxidant activity. We conclude that the analytical pipeline described here represents an effective method to evaluate the antioxidant potential of, and the contributing compounds present in, PSP cultivars. This approach may be used to guide future breeding strategies. Copyright © 2015. Published by Elsevier Ltd.
QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study
NASA Astrophysics Data System (ADS)
Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa
2016-10-01
Conceptual design is a subset of concept art wherein a new product idea is created rather than a visual representation to be used directly in a final product. The purpose is to understand the needs of conceptual design as used in engineering design and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services to enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD, translating customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out in order to reduce the CR portion of the HOQ. The analytic hierarchy process is employed to obtain the priority ratings of CRs, which are used in constructing the HOQ. This paper mainly discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the QFD-ANP integrated methodology helps to establish the importance ratings of the DRs.
Anderle, Heinz; Weber, Alfred
2016-03-01
Among "vintage" methods of protein determination, quantitative analytical refractometry has received far less attention than well-established pharmacopoeial techniques based on protein nitrogen content, such as Dumas combustion (1831) and Kjeldahl digestion (1883). Protein determination by quantitative refractometry dates back to 1903 and has been extensively investigated and characterized in the following 30 years, but has since vanished into a few niche applications that may not require the degree of accuracy and precision essential for pharmaceutical analysis. However, because high-resolution and precision digital refractometers have replaced manual instruments, reducing time and resource consumption, the method appears particularly attractive from an economic, ergonomic, and environmental viewpoint. The sample solution can be measured without dilution or other preparation procedures than the separation of the protein-free matrix by ultrafiltration, which might even be omitted for a constant matrix and excipient composition. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Streby, Ashleigh; Mull, Bonnie J; Levy, Karen; Hill, Vincent R
2015-05-01
Naegleria fowleri is a thermophilic free-living ameba found in freshwater environments worldwide. It is the cause of a rare but potentially fatal disease in humans known as primary amebic meningoencephalitis. Established N. fowleri detection methods rely on conventional culture techniques and morphological examination followed by molecular testing. Multiple alternative real-time PCR assays have been published for rapid detection of Naegleria spp. and N. fowleri. Four such assays were evaluated for the detection of N. fowleri from surface water and sediment. The assays were compared for thermodynamic stability, analytical sensitivity and specificity, detection limits, humic acid inhibition effects, and performance with seeded environmental matrices. Twenty-one ameba isolates were included in the DNA panel used for analytical sensitivity and specificity analyses. N. fowleri genotypes I and III were used for method performance testing. Two of the real-time PCR assays were determined to yield similar performance data for specificity and sensitivity for detecting N. fowleri in environmental matrices.
Streby, Ashleigh; Mull, Bonnie J.; Levy, Karen
2015-01-01
Naegleria fowleri is a thermophilic free-living ameba found in freshwater environments worldwide. It is the cause of a rare but potentially fatal disease in humans known as primary amebic meningoencephalitis. Established N. fowleri detection methods rely on conventional culture techniques and morphological examination followed by molecular testing. Multiple alternative real-time PCR assays have been published for rapid detection of Naegleria spp. and N. fowleri. Four such assays were evaluated for the detection of N. fowleri from surface water and sediment. The assays were compared for thermodynamic stability, analytical sensitivity and specificity, detection limits, humic acid inhibition effects, and performance with seeded environmental matrices. Twenty-one ameba isolates were included in the DNA panel used for analytical sensitivity and specificity analyses. N. fowleri genotypes I and III were used for method performance testing. Two of the real-time PCR assays were determined to yield similar performance data for specificity and sensitivity for detecting N. fowleri in environmental matrices. PMID:25855343
Principles and applications of Raman spectroscopy in pharmaceutical drug discovery and development.
Gala, Urvi; Chauhan, Harsh
2015-02-01
In recent years, Raman spectroscopy has become increasingly important as an analytical technique in various scientific areas of research and development. This is partly due to technological advancements in Raman instrumentation and partly due to the detailed fingerprinting that can be derived from Raman spectra. Its versatility of application, rapidity of data collection, and ease of analysis have made Raman spectroscopy an attractive analytical tool. The following review describes Raman spectroscopy and its application within the pharmaceutical industry. The authors explain the theory of Raman scattering and its variations in Raman spectroscopy. The authors also highlight how Raman spectra are interpreted, providing examples. Raman spectroscopy has a number of potential applications within drug discovery and development. It can be used to estimate the molecular activity of drugs and to establish a drug's physicochemical properties such as its partition coefficient. It can also be used in compatibility studies during the drug formulation process. Raman spectroscopy's immense potential should be further investigated in the future.
Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.
ERIC Educational Resources Information Center
Hercules, David M.; Hercules, Shirley H.
1984-01-01
Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)
Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason
2016-01-01
With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM3 ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713
Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason
2016-12-15
With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.
Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"
NASA Astrophysics Data System (ADS)
Pal, Sangita; Singha, Mousumi; Meena, Sher Singh
2018-04-01
The limited availability of analytical instruments for the methodical detection of known and unknown effluents imposes a serious hindrance on qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive but also time consuming, require maintenance, and need replacement of damaged essential parts, all of which are serious concerns. Moreover, for field studies and instant detection, installation of these instruments is not convenient at each and every place. Therefore, a technique based on pre-concentration of metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key to the immobilization technique, which is simple, user friendly, most effective, least expensive, and time efficient; its ease of transport (a 10 g - 20 g vial) to the experimental field/site has been demonstrated.
Approximate analytical relationships for linear optimal aeroelastic flight control laws
NASA Astrophysics Data System (ADS)
Kassem, Ayman Hamdy
1998-09-01
This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.
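The numerical counterpart of these analytical relationships is straightforward to set up: for a given state-space model and cost weights, the linear-quadratic regulator gain and the resulting closed-loop eigenvalues follow from the algebraic Riccati equation. The toy two-state model below is hypothetical and only illustrates how the closed-loop eigenvalues shift as one state weight is varied; it is not the aeroelastic model of the dissertation.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Hypothetical short-period-like model: x = [angle of attack, pitch rate]
A = np.array([[-0.8, 1.0],
              [-4.0, -1.2]])
B = np.array([[0.0],
              [2.5]])
R = np.array([[1.0]])

for q11 in (0.1, 1.0, 10.0):          # sweep one state weight
    Q = np.diag([q11, 0.5])
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)   # optimal state-feedback gain
    eigs = np.linalg.eigvals(A - B @ K)
    print(f"q11 = {q11:5.1f}  closed-loop eigenvalues: {np.round(eigs, 3)}")
```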
Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration
NASA Technical Reports Server (NTRS)
Merritt, D. A.; Brand, W. A.; Hayes, J. M.
1994-01-01
In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques and results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source, or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (σ = 0.00006 at.% 13C or 0.06% for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 ≤ σ ≤ 0.0002 at.% or 0.1 ≤ σ ≤ 0.2%).
Analytical technique characterizes all trace contaminants in water
NASA Technical Reports Server (NTRS)
Foster, J. N.; Lysyj, I.; Nelson, K. H.
1967-01-01
Properly programmed combination of advanced chemical and physical analytical techniques characterize critically all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the source of water pollution.
A Boltzmann machine for the organization of intelligent machines
NASA Technical Reports Server (NTRS)
Moed, Michael C.; Saridis, George N.
1989-01-01
In the present technological society, there is a major need to build machines that would execute intelligent tasks operating in uncertain environments with minimum interaction with a human operator. Although some designers have built smart robots utilizing heuristic ideas, there is no systematic approach to designing such machines in an engineering manner. Recently, cross-disciplinary research from the fields of computers, systems, AI, and information theory has served to set the foundations of the emerging area of the design of intelligent machines. Since 1977 Saridis has been developing an approach, defined as Hierarchical Intelligent Control, designed to organize, coordinate and execute anthropomorphic tasks by a machine with minimum interaction with a human operator. This approach utilizes analytical (probabilistic) models to describe and control the various functions of the intelligent machine structured by the intuitively defined principle of Increasing Precision with Decreasing Intelligence (IPDI) (Saridis 1979). This principle, even though it resembles the managerial structure of organizational systems (Levis 1988), has been derived on an analytic basis by Saridis (1988). The purpose here is to derive analytically a Boltzmann machine suitable for optimal connection of nodes in a neural net (Fahlman, Hinton, Sejnowski, 1985). This machine then serves to search for the optimal design of the organization level of an intelligent machine. In order to accomplish this, some mathematical theory of intelligent machines is first outlined. Then definitions of the variables associated with the principle, such as machine intelligence, machine knowledge, and precision, are made (Saridis, Valavanis 1988). A procedure to establish the Boltzmann machine on an analytic basis is then presented and illustrated by an example of designing the organization level of an intelligent machine. A new search technique, the Modified Genetic Algorithm, is presented and proved to converge to the minimum of a cost function. Finally, simulations show the effectiveness of a variety of search techniques for the intelligent machine.
NASA Astrophysics Data System (ADS)
Larsen, Erik H.
1998-02-01
Achievement of optimum selectivity, sensitivity and robustness in speciation analysis using high performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICP-MS) detection requires that each instrumental component is selected and optimized with a view to the ideal operating characteristics of the entire hyphenated system. An isocratic HPLC system, which employs an aqueous mobile phase with organic buffer constituents, is well suited for introduction into the ICP-MS because of the stability of the detector response and the high degree of analyte sensitivity attained. Anion and cation exchange HPLC systems, which meet these requirements, were used for the separation of selenium and arsenic species in crude extracts of biological samples. Furthermore, the signal-to-noise ratios obtained for these incompletely ionized elements in the argon ICP were further enhanced by a factor of four by continuously introducing carbon as methanol via the mobile phase into the ICP. Sources of error in the HPLC system (column overload), in the sample introduction system (memory effects from organic solvents) and in the ICP-MS (spectroscopic interferences) and their prevention are also discussed. The optimized anion and cation exchange HPLC-ICP-MS systems were used for arsenic speciation in contaminated ground water and in an in-house shrimp reference sample. For the purpose of verification, HPLC coupled with tandem mass spectrometry with electrospray ionization was additionally used for arsenic speciation in the shrimp sample. With this analytical technique the HPLC retention time in combination with mass analysis of the molecular ions and their collision-induced fragments provides almost conclusive evidence of the identity of the analyte species. The speciation methods are validated by establishing a mass balance of the analytes in each fraction of the extraction procedure, by recovery of spikes, and by employing and comparing independent techniques. The urgent need for reference materials certified for elemental species is stressed.
Foster, Katherine T; Beltz, Adriene M
2018-08-01
Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
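Of the three person-oriented techniques named above, multilevel modeling is the most widely supported in standard software. The sketch below fits a random-intercept model of a momentary outcome (e.g., craving) on a momentary predictor (e.g., negative affect) with observations nested within persons; the data are simulated, and the variable names are illustrative assumptions rather than any real AA dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_people, n_obs = 30, 40
rows = []
for pid in range(n_people):
    intercept = rng.normal(3.0, 1.0)   # person-specific baseline craving
    slope = 0.5                        # common within-person effect
    for _ in range(n_obs):
        affect = rng.normal(0, 1)
        craving = intercept + slope * affect + rng.normal(0, 0.8)
        rows.append({"person": pid, "affect": affect, "craving": craving})
data = pd.DataFrame(rows)

# Random-intercept multilevel model: craving ~ affect, grouped by person
model = smf.mixedlm("craving ~ affect", data, groups=data["person"]).fit()
print(model.summary())
```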
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poggio, A.J.; Mayall, B.H.
1989-04-01
The Lawrence Livermore National Laboratory (LLNL) is an acknowledged world center for analytical cytology. This leadership was recognized by the Regents of the University of California (UC), who in 1982 established and funded the Program for Analytical Cytology to facilitate the transfer of this technology from scientists at LLNL to their University colleagues, primarily through innovative collaborative research. This issue of Energy and Technology Review describes three of the forty projects that have been funded in this way; chosen to illustrate the potential medical application of the research. Analytical cytology is a relatively new field of biomedical research that is increasingly being applied in clinical medicine. It has been particularly important in unraveling the complexities of the human immune system and in quantifying the pathobiology of malignancy. Defined as the characterization and measurement of cells and cellular constituents for biological and medical purposes, analytical cytology bridges the gap between the quantitative discipline of molecular biology and the more qualitative disciplines of anatomy and pathology. It is itself multidisciplinary in nature. Two major approaches to analytical cytology are flow cytometry and image cytometry. In each of these research techniques, cells are measured one at a time in an automated device. In flow instruments, the cells are dispersed in fluid suspension and pass in single file through a beam of laser light to generate optical signals that are measured. In image cytometry, cells are dispersed on a slide and are imaged through a microscope onto an electronic imaging and analysis system that processes the cell image to extract measurements of interest.
Zakaria, Rosita; Allen, Katrina J; Koplin, Jennifer J; Roche, Peter; Greaves, Ronda F
2016-12-01
Through the introduction of advanced analytical techniques and improved throughput, the scope of dried blood spot testing utilising mass spectrometric methods, has broadly expanded. Clinicians and researchers have become very enthusiastic about the potential applications of dried blood spot based mass spectrometric applications. Analysts on the other hand face challenges of sensitivity, reproducibility and overall accuracy of dried blood spot quantification. In this review, we aim to bring together these two facets to discuss the advantages and current challenges of non-newborn screening applications of dried blood spot quantification by mass spectrometry. To address these aims we performed a key word search of the PubMed and MEDLINE online databases in conjunction with individual manual searches to gather information. Keywords for the initial search included; "blood spot" and "mass spectrometry"; while excluding "newborn"; and "neonate". In addition, databases were restricted to English language and human specific. There was no time period limit applied. As a result of these selection criteria, 194 references were identified for review. For presentation, this information is divided into: 1) clinical applications; and 2) analytical considerations across the total testing process; being pre-analytical, analytical and post-analytical considerations. DBS analysis using MS applications is now broadly applied, with drug monitoring for both therapeutic and toxicological analysis being the most extensively reported. Several parameters can affect the accuracy of DBS measurement and further bridge experiments are required to develop adjustment rules for comparability between dried blood spot measures and the equivalent serum/plasma values. Likewise, the establishment of independent reference intervals for dried blood spot sample matrix is required.
Advancements in oxygen generation and humidity control by water vapor electrolysis
NASA Technical Reports Server (NTRS)
Heppner, D. B.; Sudar, M.; Lee, M. C.
1988-01-01
Regenerative processes for the revitalization of manned spacecraft atmospheres or other manned habitats are essential for the realization of long-term space missions. These processes include oxygen generation through water electrolysis. One promising technique of water electrolysis is the direct conversion of the water vapor contained in the cabin air to oxygen. This technique is the subject of the present program on water vapor electrolysis development. The objectives were to incorporate technology improvements developed under other similar electrochemical programs and add new ones; design and fabricate a multi-cell electrochemical module and a testing facility; and demonstrate the improvements through testing. Each aspect of the water vapor electrolysis cell was reviewed. The materials of construction and sizing of each element were investigated analytically and sometimes experimentally. In addition, operational considerations such as temperature control in response to inlet conditions were investigated. Three specific quantitative goals were established.
NASA Astrophysics Data System (ADS)
Olabanji, S. O.; Ige, A. O.; Mazzoli, C.; Ceccato, D.; Ajayi, E. O. B.; De Poli, M.; Moschini, G.
2005-10-01
Accelerator-based technique of PIXE was employed for the determination of the elemental concentration of an industrial mineral, talc. Talc is a very versatile mineral in industries with several applications. Due to this, there is a need to know its constituents to ensure that the workers are not exposed to health risks. Besides, microscopic tests on some talc samples in Nigeria confirm that they fall within the BP British Pharmacopoeia standard for tablet formation. However, for these samples to become a local source of raw material for pharmaceutical grade talc, the precise elemental compositions should be established which is the focus of this work. Proton beam produced by the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy was used for the PIXE measurements. The results which show the concentration of different elements in the talc samples, their health implications and metabolic roles are presented and discussed.
Common aspects influencing the translocation of SERS to Biomedicine.
Gil, Pilar Rivera; Tsouts, Dionysia; Sanles-Sobrido, Marcos; Cabo, Andreu
2018-01-04
In this review, we introduce the reader to the analytical technique of surface-enhanced Raman scattering, motivated by the great potential we believe this technique has in biomedicine. We present the advantages and limitations of this technique relevant for bioanalysis in vitro and in vivo and show how this technique goes beyond the state of the art of traditional analytical, labelling and healthcare diagnosis technologies. Copyright © Bentham Science Publishers.
He, Qili; Su, Guoming; Liu, Keliang; Zhang, Fangcheng; Jiang, Yong; Gao, Jun; Liu, Lida; Jiang, Zhongren; Jin, Minwu; Xie, Huiping
2017-01-01
Hematologic and biochemical analytes of Sprague-Dawley rats are commonly used to determine effects that were induced by treatment and to evaluate organ dysfunction in toxicological safety assessments, but reference intervals have not been well established for these analytes. Reference intervals as presently defined for these analytes in Sprague-Dawley rats have not used internationally recommended statistical method nor stratified by sex. Thus, we aimed to establish sex-specific reference intervals for hematologic and biochemical parameters in Sprague-Dawley rats according to Clinical and Laboratory Standards Institute C28-A3 and American Society for Veterinary Clinical Pathology guideline. Hematology and biochemistry blood samples were collected from 500 healthy Sprague-Dawley rats (250 males and 250 females) in the control groups. We measured 24 hematologic analytes with the Sysmex XT-2100i analyzer, 9 biochemical analytes with the Olympus AU400 analyzer. We then determined statistically relevant sex partitions and calculated reference intervals, including corresponding 90% confidence intervals, using nonparametric rank percentile method. We observed that most hematologic and biochemical analytes of Sprague-Dawley rats were significantly influenced by sex. Males had higher hemoglobin, hematocrit, red blood cell count, red cell distribution width, mean corpuscular volume, mean corpuscular hemoglobin, white blood cell count, neutrophils, lymphocytes, monocytes, percentage of neutrophils, percentage of monocytes, alanine aminotransferase, aspartate aminotransferase, and triglycerides compared to females. Females had higher mean corpuscular hemoglobin concentration, plateletcrit, platelet count, eosinophils, percentage of lymphocytes, percentage of eosinophils, creatinine, glucose, total cholesterol and urea compared to males. Sex partition was required for most hematologic and biochemical analytes in Sprague-Dawley rats. We established sex-specific reference intervals, including corresponding 90% confidence intervals, for Sprague-Dawley rats. Understanding the significant discrepancies in hematologic and biochemical analytes between male and female Sprague-Dawley rats provides important insight into physiological effects in test rats. Establishment of locally sex-specific reference intervals allows a more precise evaluation of animal quality and experimental results of Sprague-Dawley rats in our toxicology safety assessment.
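A hedged sketch of the nonparametric rank-percentile approach follows; the values are simulated, and the bootstrap shown is one common way to obtain the 90% confidence intervals around the reference limits rather than necessarily the exact procedure of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulated analyte results for 250 healthy animals of one sex (e.g., ALT, U/L)
values = rng.lognormal(mean=3.6, sigma=0.25, size=250)

# Nonparametric reference interval: central 95% (2.5th and 97.5th percentiles)
lower, upper = np.percentile(values, [2.5, 97.5])

# 90% confidence intervals of each limit via bootstrap resampling
boots = np.array([np.percentile(rng.choice(values, size=values.size, replace=True),
                                [2.5, 97.5]) for _ in range(2000)])
lo_ci = np.percentile(boots[:, 0], [5, 95])
hi_ci = np.percentile(boots[:, 1], [5, 95])
print(f"reference interval: {lower:.1f} - {upper:.1f}")
print(f"90% CI of lower limit: {lo_ci.round(1)}, of upper limit: {hi_ci.round(1)}")
```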
Light aircraft crash safety program
NASA Technical Reports Server (NTRS)
Thomson, R. G.; Hayduk, R. J.
1974-01-01
NASA is embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.
Surface-Enhanced Raman Spectroscopy.
ERIC Educational Resources Information Center
Garrell, Robin L.
1989-01-01
Reviews the basis for the technique and its experimental requirements. Describes a few examples of the analytical problems to which surface-enhanced Raman spectroscopy (SERS) has been and can be applied. Provides a perspective on the current limitations and frontiers in developing SERS as an analytical technique. (MVL)
Jabłońska-Czapla, Magdalena
2015-01-01
Chemical speciation is a very important subject in environmental protection, toxicology, and chemical analytics due to the fact that the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analysis is so important. The paper also provides numerous examples of hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962
Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali
2016-03-01
The most important objectives frequently found in bio-analytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anti-cancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques for their measurement. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil from biological and pharmaceutical samples. The prospective study revealed that implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven to be effective, as it obviates problems related to tedious separation techniques, owing to protein binding and drastic interferences, from the complex matrices in real samples such as blood plasma and serum.
Kusonmano, Kanthida; Vongsangnak, Wanwipa; Chumnanpuen, Pramote
2016-01-01
Metabolome profiling of biological systems has the powerful ability to provide biological understanding of their metabolic functional states in response to environmental factors or other perturbations. Large amounts of metabolomics data have thus accumulated since the pre-metabolomics era. This is directly influenced by high-throughput analytical techniques, especially mass spectrometry (MS)- and nuclear magnetic resonance (NMR)-based techniques. In parallel, a significant number of informatics techniques for data processing, statistical analysis, and data mining have been developed. Tools and databases have been advanced for the metabolomics community that provide useful metabolomics information, e.g., chemical structures, mass spectrum patterns for peak identification, metabolite profiles, biological functions, dynamic metabolite changes, and biochemical transformations of thousands of small molecules. In this chapter, we aim to give an overview of metabolomics studies from the pre- to the post-metabolomics era and their impact on society. Focusing on the post-metabolomics era, we provide a conceptual framework of informatics techniques for metabolomics and show useful examples of techniques, tools, and databases for metabolomics data analysis, starting from preprocessing and moving toward functional interpretation. The framework of informatics techniques for metabolomics provided here can further be used as a scaffold for translational biomedical research, which can thus lead to the discovery of new metabolite biomarkers, potential metabolic targets, or key metabolic pathways for future disease therapy.
New analytical technique for carbon dioxide absorption solvents
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pouryousefi, F.; Idem, R.O.
2008-02-15
The densities and refractive indices of two binary systems (water + MEA and water + MDEA) and three ternary systems (water + MEA + CO2, water + MDEA + CO2, and water + MEA + MDEA) used for carbon dioxide (CO2) capture were measured over the range of compositions of the aqueous alkanolamine(s) used for CO2 absorption at temperatures from 295 to 338 K. Experimental densities were modeled empirically, while the experimental refractive indices were modeled using well-established models from the known values of their pure-component densities and refractive indices. The density and Gladstone-Dale refractive index models were then used to obtain the compositions of unknown samples of the binary and ternary systems by simultaneous solution of the density and refractive index equations. The results from this technique have been compared with HPLC (high-performance liquid chromatography) results, while a third independent technique (acid-base titration) was used to verify the results. The results show that the systems' compositions obtained from the simple and easy-to-use refractive index/density technique were very comparable to the expensive and laborious HPLC/titration techniques, suggesting that the refractive index/density technique can be used to replace existing methods for analysis of fresh or nondegraded, CO2-loaded, single and mixed alkanolamine solutions.
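The core of the technique, solving a density relation and a Gladstone-Dale refractive-index mixing rule simultaneously for two unknown mass fractions, can be sketched as below. The pure-component constants and the ideal volume-additive density rule are illustrative placeholders, not the measured values or fitted models reported in the study.

```python
import numpy as np
from scipy.optimize import fsolve

# Placeholder pure-component properties at ~298 K (illustrative only)
rho = {"H2O": 0.997, "MEA": 1.012, "MDEA": 1.038}   # g/mL
n   = {"H2O": 1.333, "MEA": 1.454, "MDEA": 1.469}   # refractive index

def mixture_props(w_mea, w_mdea):
    """Ideal-mixing density and Gladstone-Dale refractive index of the ternary."""
    w = {"H2O": 1.0 - w_mea - w_mdea, "MEA": w_mea, "MDEA": w_mdea}
    rho_mix = 1.0 / sum(w[i] / rho[i] for i in w)                          # volume-additive
    n_mix = 1.0 + rho_mix * sum(w[i] * (n[i] - 1.0) / rho[i] for i in w)   # Gladstone-Dale
    return rho_mix, n_mix

def residuals(x, rho_meas, n_meas):
    rho_mix, n_mix = mixture_props(*x)
    return [rho_mix - rho_meas, n_mix - n_meas]

# "Measured" sample generated from a known composition, then recovered
rho_meas, n_meas = mixture_props(0.15, 0.30)
w_mea, w_mdea = fsolve(residuals, x0=[0.1, 0.1], args=(rho_meas, n_meas))
print(f"recovered mass fractions: MEA = {w_mea:.3f}, MDEA = {w_mdea:.3f}")
```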
NASA Astrophysics Data System (ADS)
Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.
2002-04-01
Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound for the steganographic capacity of LSB based steganographic techniques for a given false probability of detection. In this paper we look at adaptive steganographic techniques. Adaptive steganographic techniques take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
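To make the contrast concrete, a minimal sketch of LSB embedding with a simple adaptive twist is given below: bits are only embedded in pixels whose local neighborhood is "busy" (high variance), a toy stand-in for the content-adaptive strategies discussed, not the specific schemes analyzed in the paper.

```python
import numpy as np

def embed_adaptive_lsb(cover, bits, threshold=25.0):
    """Embed message bits into the LSBs of 'busy' pixels only.

    A pixel qualifies if the variance of its 3x3 neighborhood exceeds the
    threshold, so flat regions (where LSB changes are easiest to detect
    statistically) are left untouched. Toy illustration only.
    """
    stego = cover.copy()
    h, w = cover.shape
    k = 0
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            if k >= len(bits):
                return stego, k
            block = cover[i - 1:i + 2, j - 1:j + 2].astype(float)
            if block.var() > threshold:
                stego[i, j] = (stego[i, j] & 0xFE) | bits[k]
                k += 1
    return stego, k

rng = np.random.default_rng(1)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
message = rng.integers(0, 2, size=200)
stego, embedded = embed_adaptive_lsb(cover, message)
print(f"embedded {embedded} of {len(message)} bits")
```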
WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL
The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...
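The analytic element idea is that the head (or discharge-potential) field is assembled by superposing closed-form solutions for individual elements such as uniform flow and wells. The sketch below is a textbook-style illustration of that superposition, not WhAEM's implementation; the well locations, pumping rates, and uniform-flow parameters are made up.

import numpy as np

def discharge_potential(x, y, wells, Q0=0.02, alpha=0.0):
    """Superposition of 2-D analytic elements: uniform flow of strength Q0
    in direction alpha plus point wells; wells = [(xw, yw, Qw), ...]."""
    phi = -Q0 * (x * np.cos(alpha) + y * np.sin(alpha))       # uniform flow element
    for xw, yw, Qw in wells:
        r = np.maximum(np.hypot(x - xw, y - yw), 1e-10)       # avoid log(0) at the well
        phi += Qw / (2.0 * np.pi) * np.log(r)                 # well element (Thiem-type)
    return phi

# Evaluate the superposed potential on a grid around two hypothetical wells
xx, yy = np.meshgrid(np.linspace(-500, 500, 201), np.linspace(-500, 500, 201))
phi = discharge_potential(xx, yy, wells=[(-103.0, 7.0, 300.0), (148.0, 52.0, 150.0)])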
Requirements for Calibration in Noninvasive Glucose Monitoring by Raman Spectroscopy
Lipson, Jan; Bernhardt, Jeff; Block, Ueyn; Freeman, William R.; Hofmeister, Rudy; Hristakeva, Maya; Lenosky, Thomas; McNamara, Robert; Petrasek, Danny; Veltkamp, David; Waydo, Stephen
2009-01-01
Background In the development of noninvasive glucose monitoring technology, it is highly desirable to derive a calibration that relies on neither person-dependent calibration information nor supplementary calibration points furnished by an existing invasive measurement technique (universal calibration). Method By appropriate experimental design and associated analytical methods, we establish the sufficiency of multiple factors required to permit such a calibration. Factors considered are the discrimination of the measurement technique, stabilization of the experimental apparatus, physics–physiology-based measurement techniques for normalization, the sufficiency of the size of the data set, and appropriate exit criteria to establish the predictive value of the algorithm. Results For noninvasive glucose measurements, using Raman spectroscopy, the sufficiency of the scale of data was demonstrated by adding new data into an existing calibration algorithm and requiring that (a) the prediction error should be preserved or improved without significant re-optimization, (b) the complexity of the model for optimum estimation not rise with the addition of subjects, and (c) the estimation for persons whose data were removed entirely from the training set should be no worse than the estimates on the remainder of the population. Using these criteria, we established guidelines empirically for the number of subjects (30) and skin sites (387) for a preliminary universal calibration. We obtained a median absolute relative difference for our entire data set of 30 mg/dl, with 92% of the data in the Clarke A and B ranges. Conclusions Because Raman spectroscopy has high discrimination for glucose, a data set of practical dimensions appears to be sufficient for universal calibration. Improvements based on reducing the variance of blood perfusion are expected to reduce the prediction errors substantially, and the inclusion of supplementary calibration points for the wearable device under development will be permissible and beneficial. PMID:20144354
Baselining PMU Data to Find Patterns and Anomalies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amidan, Brett G.; Follum, James D.; Freeman, Kimberly A.
This paper looks at the application of situational awareness methodologies with respect to power grid data. These methodologies establish baselines that look for typical patterns and atypical behavior in the data. The objectives of the baselining analyses are to provide: real-time analytics, the capability to look at historical trends and events, and reliable predictions of the near future state of the grid. Multivariate algorithms were created to establish normal baseline behavior and then score each moment in time according to its variance from the baseline. Detailed multivariate analytical techniques are described in this paper that produced ways to identify typical patterns and atypical behavior. In this case, atypical behavior is behavior that is unenvisioned. Visualizations were also produced to help explain the behavior that was identified mathematically. Examples are shown to help describe how to read and interpret the analyses and visualizations. Preliminary work has been performed on PMU data sets from BPA (Bonneville Power Administration) and EI (Eastern Interconnect). Actual results are not fully shown here because of confidentiality issues. Comparisons between atypical events found mathematically and actual events showed that many of the actual events are also atypical events; however, there are many atypical events that do not correlate to any actual events. Additional work needs to be done to help classify the atypical events into actual events, so that the importance of the events can be better understood.
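A common way to score each time point against a multivariate baseline (one reasonable choice, not necessarily the algorithm used in this work) is the Mahalanobis distance from the baseline mean and covariance; the snippet below applies it to synthetic stand-in data rather than real PMU measurements.

import numpy as np

def mahalanobis_scores(baseline, new_data):
    """Score each row of new_data by its Mahalanobis distance from the
    baseline distribution (mean and covariance estimated from baseline)."""
    mu = baseline.mean(axis=0)
    cov = np.cov(baseline, rowvar=False)
    cov_inv = np.linalg.pinv(cov)          # pseudo-inverse guards against singularity
    diff = new_data - mu
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return np.sqrt(d2)

# Synthetic "PMU-like" measurements (placeholders, not real grid data)
rng = np.random.default_rng(1)
baseline = rng.normal(size=(5000, 8))                     # typical behaviour
new = np.vstack([rng.normal(size=(100, 8)),
                 rng.normal(loc=4.0, size=(5, 8))])       # a few atypical moments
scores = mahalanobis_scores(baseline, new)
atypical = np.where(scores > np.percentile(scores, 99))[0]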
Najam-ul-Haq, M; Rainer, M; Szabó, Z; Vallant, R; Huck, C W; Bonn, G K
2007-03-10
At present, carbon nano-materials are being utilized in various procedures, especially in laser desorption/ionization-mass spectrometry (LDI-MS) for analyzing a range of analytes, which include peptides, proteins, metabolites, and polymers. Matrix-oriented LDI-MS techniques are very well established, with weak organic acids as energy-absorbing substances. Carbon materials, such as nano-tubes and fullerenes are being successfully applied in the small-mass range, where routine matrices have strong background signals. In addition, the role of carbon nano-materials is very well established in the fractionation and purification fields. Modified diamond powder and surfaces are utilized in binding peptides and proteins from complex biological fluids and analyzed by matrix-assisted laser desorption/ionization (MALDI) time-of-flight (TOF) mass spectrometry (MS). Polylysine-coated diamond is used for solid-phase extraction to pre-concentrate DNA oligonucleotides. Graphite is useful for desalting, pre-concentration, and as energy-absorbing material (matrix) in desorption/ionization. Carbon nano-tubes in their different derivatized forms are used as matrix materials for the analysis of a range of analytes, such as carbohydrates, amino acids, peptides, proteins, and some environmental samples by LDI-MS. Fullerenes are modified in different ways to bind serum entities analyzed through MALDI/TOF-MS and are subsequently utilized in their identifications. In addition, the fullerenes are a promising matrix in LDI-MS, but improvements are needed.
Metabolomic analysis-Addressing NMR and LC-MS related problems in human feces sample preparation.
Moosmang, Simon; Pitscheider, Maria; Sturm, Sonja; Seger, Christoph; Tilg, Herbert; Halabalaki, Maria; Stuppner, Hermann
2017-10-31
Metabolomics is a well-established field in fundamental clinical research with applications in different human body fluids. However, metabolomic investigations in feces are currently an emerging field. Fecal sample preparation is a demanding task due to the high complexity and heterogeneity of the matrix. To gain access to the information enclosed in human feces it is necessary to extract the metabolites and make them accessible to analytical platforms like NMR or LC-MS. In this study different pre-analytical parameters and factors were investigated, i.e., water content, different extraction solvents, influence of freeze-drying and homogenization, ratios of sample weight to extraction solvent, and their respective impact on metabolite profiles acquired by NMR and LC-MS. The results indicate that profiles are strongly biased by the selection of extraction solvent or drying of samples, which causes different metabolites to be lost, under- or overstated. Additionally, signal intensity and reproducibility of the measurement were found to be strongly dependent on sample pre-treatment steps: freeze-drying and homogenization lead to improved release of metabolites and thus increased signals, but at the same time induced variations and thus deteriorated reproducibility. We established the first protocol for extraction of human fecal samples and subsequent measurement with both complementary techniques, NMR and LC-MS. Copyright © 2017 Elsevier B.V. All rights reserved.
Guzmán-Delgado, Paula; Graça, José; Cabral, Vanessa; Gil, Luis; Fernández, Victoria
2016-06-01
Plant cuticles have been traditionally classified on the basis of their ultrastructure, with certain assumptions about chemical composition. However, the nature of the plant cuticle may be misinterpreted in the prevailing model, which was established more than 150 years ago. Using the adaxial leaf cuticle of Ficus elastica, a study was conducted with the aim of analyzing cuticular ultrastructure, chemical composition and the potential relationship between structure and chemistry. Gradual chemical extractions and diverse analytical and microscopic techniques were performed on isolated leaf cuticles of two different stages of development (i.e., young and mature leaves). Evidence for the presence of cutan in F. elastica leaf cuticles was gained after chemical treatments and tissue analysis by infrared spectroscopy and electron microscopy. Significant calcium, boron and silicon concentrations were also measured in the cuticle of this species. Such mineral elements, which are often found in plant cell walls, may play a structural role, and their presence in isolated cuticles further supports the interpretation of the cuticle as the most external region of the epidermal cell wall. The complex and heterogeneous nature of the cuticle, and constraints associated with current analytical procedures, may limit the chance of establishing a relationship between cuticle chemical composition and structure, also in relation to organ ontogeny. © 2016 Scandinavian Plant Physiology Society.
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and these are examined with a view to matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Resonance Ionization, Mass Spectrometry.
ERIC Educational Resources Information Center
Young, J. P.; And Others
1989-01-01
Discussed is an analytical technique that uses photons from lasers to resonantly excite an electron from some initial state of a gaseous atom through various excited states of the atom or molecule. Described are the apparatus, some analytical applications, and the precision and accuracy of the technique. Lists 26 references. (CW)
Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods
ERIC Educational Resources Information Center
Zhang, Ying
2011-01-01
Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…
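The first MASEM stage pools correlation matrices across studies. A minimal fixed-effects sketch is a sample-size-weighted, element-wise average, shown below with two invented studies; the multivariate (e.g., GLS-based) methods compared in the paper are more elaborate.

import numpy as np

def pool_correlations(corr_matrices, sample_sizes):
    """Sample-size-weighted pooling of study correlation matrices
    (simple element-wise fixed-effects; a minimal first stage of MASEM)."""
    R = np.array(corr_matrices, dtype=float)          # shape (k, p, p)
    n = np.array(sample_sizes, dtype=float)
    pooled = np.tensordot(n, R, axes=1) / n.sum()
    np.fill_diagonal(pooled, 1.0)
    return pooled, int(n.sum())

# Two hypothetical 3-variable studies (placeholder correlations and sample sizes)
R1 = np.array([[1.0, 0.30, 0.25], [0.30, 1.0, 0.40], [0.25, 0.40, 1.0]])
R2 = np.array([[1.0, 0.20, 0.35], [0.20, 1.0, 0.45], [0.35, 0.45, 1.0]])
pooled_R, total_n = pool_correlations([R1, R2], [150, 250])
# pooled_R would then be analyzed with an SEM package in the second stage.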
Turbine blade tip durability analysis
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.
1981-01-01
An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis and life-prediction techniques in the life assessment of hot section components was verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.
Analytical Challenges in Biotechnology.
ERIC Educational Resources Information Center
Glajch, Joseph L.
1986-01-01
Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)
An analytical and experimental evaluation of a Fresnel lens solar concentrator
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Allums, S. A.; Cosby, R. M.
1976-01-01
An analytical and experimental evaluation of line focusing Fresnel lenses with application potential in the 200 to 370 C range was studied. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile spreading problem and should enable improved analytical experimental correlation.
Deriving Earth Science Data Analytics Requirements
NASA Technical Reports Server (NTRS)
Kempler, Steven J.
2015-01-01
Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.
Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples
NASA Astrophysics Data System (ADS)
Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi
2016-10-01
The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/ml. However, to quantify analytes that are related to various stages of tumors including early detection requires detecting well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml and the prostate specific antigen level of the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg /mL of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumors, it is required to quantify various levels of analytes ranging from ng/ml to pg/ml. To accommodate these critical needs in the current diagnosis, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes (
An Analytical Hierarchy Process Model for the Evaluation of College Experimental Teaching Quality
ERIC Educational Resources Information Center
Yin, Qingli
2013-01-01
Taking into account the characteristics of college experimental teaching, through investigation and analysis, evaluation indices and an Analytical Hierarchy Process (AHP) model of experimental teaching quality have been established following the analytical hierarchy process method, and the evaluation indices have been given reasonable weights. An…
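In AHP, criterion weights are commonly obtained from the principal eigenvector of a pairwise comparison matrix, together with a consistency check. The sketch below uses Saaty's standard procedure on a hypothetical 3-criterion matrix; the comparison values are placeholders, not those of the cited evaluation model.

import numpy as np

RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

def ahp_weights(pairwise):
    """Principal-eigenvector weights and consistency ratio for an AHP
    pairwise comparison matrix (Saaty's eigenvector method)."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)       # consistency index
    cr = ci / RANDOM_INDEX[n]                  # consistency ratio (< 0.1 is acceptable)
    return w, cr

# Hypothetical pairwise comparison of three teaching-quality criteria
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights, cr = ahp_weights(A)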
Simulated In Situ Determination of Soil Profile Organic and Inorganic Carbon With LIBS and VisNIR
NASA Astrophysics Data System (ADS)
Bricklemyer, R. S.; Brown, D. J.; Clegg, S. M.; Barefield, J. E.
2008-12-01
There is a growing need for rapid, accurate, and inexpensive methods to measure and verify soil organic carbon (SOC) change for national greenhouse gas accounting and the development of a soil carbon trading market. Laser Induced Breakdown Spectroscopy (LIBS) and Visible and Near Infrared Spectroscopy (VisNIR) are complementary analytical techniques that have the potential to fill that need. The LIBS method provides precise elemental analysis of soils, but generally cannot distinguish between organic C and inorganic C. VisNIR has been established as a viable technique for measuring soil properties including SOC and inorganic carbon (IC). As part of the Big Sky Carbon Sequestration Regional Partnership, 240 intact core samples (3.8 x 50 cm) have been collected from six agricultural fields in north central Montana, USA. Each of these core samples was probed concurrently with LIBS and VisNIR at 2.5, 7.5, 12.5, 17.5, 22.5, 27.5, 35 and 45 cm (+/- 1.5 cm) depths. VisNIR measurements were taken using an Analytical Spectral Devices (ASD, Boulder, CO, USA) Agrispec spectrometer to determine the partition of SOC vs. IC in the samples. The LIBS scans were collected with the LANL LIBS Core Scanner Instrument, which collected the entire 200 - 900 nm plasma emission including the 247.8 nm carbon emission line. This instrument also collected the emission from the elements typically found in inorganic carbon (Ca and Mg) and organic carbon (H, O, and N). Subsamples of soil (~ 4 g) were taken from interrogation points for laboratory determination of SOC and IC. Using these analytical data, we constructed several full-spectrum multivariate VisNIR/LIBS calibration models for SOC and IC. These models were then applied to independent validation cores for model evaluation.
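Full-spectrum calibrations of this kind are often built with partial least squares (PLS) regression. The snippet below is a generic, hedged sketch using synthetic "spectra" and scikit-learn, not the study's actual VisNIR/LIBS data or model settings.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-ins: 100 "spectra" with 500 channels and a reference SOC value
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 500))
true_coeff = np.zeros(500)
true_coeff[50:60] = 0.8                                    # a few informative bands
y = X @ true_coeff + rng.normal(scale=0.5, size=100)

pls = PLSRegression(n_components=8)                        # number of latent variables (assumed)
y_cv = cross_val_predict(pls, X, y, cv=10)                 # cross-validated predictions
rmsecv = np.sqrt(np.mean((y - y_cv.ravel()) ** 2))         # cross-validation RMSE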
Multi-analytical study of techniques and palettes of wall paintings of the monastery of Žiča, Serbia
NASA Astrophysics Data System (ADS)
Holclajtner-Antunović, Ivanka; Stojanović-Marić, Milica; Bajuk-Bogdanović, Danica; Žikić, Radiša; Uskoković-Marković, Snežana
2016-03-01
The present multi-analytical study concentrates on establishing the painting techniques and the identity of the wall painting materials used by the artists from the 13th and 14th centuries to decorate the Žiča monastery, Serbia. For this purpose, we demonstrate that micro-Raman spectroscopy is an efficient, non-destructive method with high spatial resolution which gives molecular and crystal structural information of a wide variety of both inorganic and organic materials. It is shown that elementary composition revealed through scanning electron microscopy with energy dispersive X-ray spectroscopy and energy dispersive X-ray fluorescence spectroscopy is necessary in some cases to confirm the identity of pigments and binders identified by micro-Raman spectroscopy. It was found that a fresco technique, in combination with mainly natural earth pigments such as red ochre, yellow ochre and green earth, was used. Expensive natural pigment lapis lazuli was exclusively used for obtaining blue colour while pure vermilion was used by the artists from the first period of decorations at the beginning of the 13th century. A mixture of pigments was used for attaining different colour shades. For the gilding of saint's haloes, thin golden foil was deposited over the tin sheet. In order to get a desirable optical and aesthetical impression, the metallic leaves were deposited over the yellow ochre preparatory layer. Deposits of gypsum on wall paintings as well as traces of weddellite are degradation products formed as a result of exposing wall paintings to environmental conditions.
NASA Astrophysics Data System (ADS)
Ames, D. P.; Osorio-Murillo, C.; Over, M. W.; Rubin, Y.
2012-12-01
The Method of Anchored Distributions (MAD) is an inverse modeling technique that is well-suited for estimation of spatially varying parameter fields using limited observations and Bayesian methods. This presentation will discuss the design, development, and testing of a free software implementation of the MAD technique using the open source DotSpatial geographic information system (GIS) framework, R statistical software, and the MODFLOW groundwater model. This new tool, dubbed MAD-GIS, is built using a modular architecture that supports the integration of external analytical tools and models for key computational processes, including a forward model (e.g. MODFLOW, HYDRUS) and geostatistical analysis (e.g. R, GSLIB). The GIS-based graphical user interface provides a relatively simple way for new users of the technique to prepare the spatial domain, to identify observation and anchor points, to perform the MAD analysis using a selected forward model, and to view results. MAD-GIS uses the Managed Extensibility Framework (MEF) provided by the Microsoft .NET programming platform to support integration of different modeling and analytical tools at run-time through a custom "driver." Each driver establishes a connection with external programs through a programming interface, which provides the elements for communicating with the core MAD software. This presentation gives an example of adapting MODFLOW to serve as the external forward model in MAD-GIS for inferring the distribution functions of key MODFLOW parameters. Additional drivers for other models are being developed, and it is expected that the open source nature of the project will encourage the development of additional model drivers by third-party scientists.
NASA Astrophysics Data System (ADS)
Blackie, J. R.; Robinson, M.
2007-01-01
Dr J.S.G. McCulloch was deeply involved in the establishment of research catchments in East Africa and subsequently in the UK to investigate the hydrological consequences of changes in land use. Comparison of these studies provides an insight into how influential his inputs and direction have been in the progressive development of the philosophy, the instrumentation and the analytical techniques now employed in catchment research. There were great contrasts in the environments: tropical highland (high radiation, intense rainfall) vs. temperate maritime (low radiation and frontal storms), contrasting soils and vegetation types, as well as the differing social and economic pressures in developing and developed nations. Nevertheless, the underlying scientific philosophy was common to both, although techniques had to be modified according to local conditions. As specialised instrumentation and analytical techniques were developed for the UK catchments many were also integrated into the East African studies. Many lessons were learned in the course of these studies and from the experiences of other studies around the world. Overall, a rigorous scientific approach was developed with widespread applicability. Beyond the basics of catchment selection and the quantification of the main components of the catchment water balance, this involved initiating parallel process studies to provide information on specific aspects of catchment behaviour. This information could then form the basis for models capable of extrapolation from the observed time series to other periods/hydrological events and, ultimately, the capability of predicting the consequences of changes in catchment land management to other areas in a range of climates.
NASA Astrophysics Data System (ADS)
Rim, Jung H.
Accurate and fast determination of the activity of radionuclides in a sample is critical for nuclear forensics and emergency response. Radioanalytical techniques are well established for radionuclide measurement; however, they are slow and labor intensive, requiring extensive radiochemical separations and purification prior to analysis. With these limitations of current methods, there is great interest in a new technique to rapidly process samples. This dissertation describes a new analyte extraction medium called Polymer Ligand Film (PLF) developed to rapidly extract radionuclides. A Polymer Ligand Film is a polymer medium with ligands incorporated in its matrix that selectively and rapidly extract analytes from a solution. The main focus of the new technique is to shorten and simplify the procedure necessary to chemically isolate radionuclides for determination by alpha spectrometry or beta counting. Five different ligands were tested for plutonium extraction: bis(2-ethylhexyl) methanediphosphonic acid (H2DEH[MDP]), di(2-ethyl hexyl) phosphoric acid (HDEHP), trialkyl methylammonium chloride (Aliquat-336), 4,4'(5')-di-t-butylcyclohexano 18-crown-6 (DtBuCH18C6), and 2-ethylhexyl 2-ethylhexylphosphonic acid (HEH[EHP]). The ligands that were effective for plutonium extraction were further studied for uranium extraction. The plutonium recovery by PLFs showed a dependency on nitric acid concentration and ligand-to-total-mass ratio. H2DEH[MDP] PLFs performed best at 1:10 and 1:20 ratios: 50.44% and 47.61% of plutonium were extracted on the surface of PLFs with 1 M nitric acid for the 1:10 and 1:20 PLF, respectively. HDEHP PLF provided the best combination of alpha spectroscopy resolution and plutonium recovery with the 1:5 PLF when used with 0.1 M nitric acid. The overall analyte recovery was lower than that of electrodeposited samples, which typically have recoveries above 80%. However, PLF is designed to be a rapid, field-deployable screening technique, and consistency is more important than recovery. PLFs were also tested using blind quality control samples and the activities were accurately measured. It is important to point out that PLFs were consistently susceptible to analytes penetrating and depositing below the surface. The internal radiation within the body of the PLF is mostly contained and did not cause excessive self-attenuation and peak broadening in alpha spectroscopy. The analyte penetration issue was beneficial in the destructive analysis. H2DEH[MDP] PLF was tested with environmental samples to fully understand the capabilities and limitations of the PLF in relevant environments. The extraction system was very effective in extracting plutonium from environmental water collected from Mortandad Canyon at Los Alamos National Laboratory with minimal sample processing. Soil samples were tougher to process than the water samples. Analytes were first leached from the soil matrices using nitric acid before processing with PLF. This approach had limitations for extracting plutonium using PLF. The soil samples from Mortandad Canyon, which are about 1% iron by weight, were effectively processed with the PLF system. Even with certain limitations of the PLF extraction system, this technique was able to considerably decrease the sample analysis time. The entire environmental sample was analyzed within one to two days. The decrease in time can be attributed to the fact that PLF replaces column chromatography and electrodeposition with a single step for preparing alpha spectrometry samples.
The two-step process of column chromatography and electrodeposition takes a couple of days to a week to complete, depending on the sample. The decrease in time and the simplified procedure make this technique a unique solution for application to nuclear forensics and emergency response. A large number of samples can be quickly analyzed, and selected samples can be further analyzed with more sensitive techniques based on the initial data. The deployment of a PLF system as a screening method will greatly reduce the total analysis time required to gain meaningful isotopic data for nuclear forensics applications. (Abstract shortened by UMI.)
Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community
2016-01-01
Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the... more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement
Depth profile by Total IBA in perovskite active layers for solar cells
NASA Astrophysics Data System (ADS)
Barreiros, M. A.; Alves, L. C.; Brites, M. J.; Corregidor, V.
2017-08-01
In recent years the record efficiency of perovskite solar cells (PSCs) has been repeatedly updated and now exceeds 20%. However, it is difficult to make PSCs consistently. A definite correlation has been established between PSC performance and the perovskite film quality, which involves mainly morphology, crystallinity and composition. The manufacturing development of these devices depends on the characterisation methodologies, on the availability of suitable and reliable analytical techniques to assess the materials' composition and quality, and on the relationship of these results with the cell performance. Ion beam analytical (IBA) techniques combined with a micro-ion beam are powerful tools for materials characterisation and can provide valuable input for the understanding of perovskite films. Perovskite films based on CH3NH3PbI3 were prepared (from CH3NH3I and PbI2 precursors) in a planar architecture and in a mesoporous TiO2 scaffold. Proton and helium micro-beams at different energies were used in the analysis of PSC active layers, previously characterised by SEM-FEG (Scanning Electron Microscopy with a field emission gun) and XRD (X-ray diffraction). Self-consistent fitting of all the obtained PIXE (Particle Induced X-ray Emission) and RBS (Rutherford Backscattering Spectrometry) spectra through the Total IBA approach provided depth profiles of the perovskite, its precursors and TiO2, and assessed their distribution in the films. PbI2 presence and location in the active layer may hinder charge transport and strongly affect the cell performance. IBA techniques made it possible to identify regions of non-uniform surface coverage and homogeneous areas, and to establish the undesired presence of PbI2 and its quantitative depth profile in the planar architecture film. In the mesostructured perovskite film, a non-homogeneous distribution was observed, with the perovskite concentration decreasing down to the thin blocking layer. The good agreement between the best fits obtained in the Total IBA approach and the experimental data granted reliability to the depth profile results for the studied perovskite films.
A new concept of pencil beam dose calculation for 40-200 keV photons using analytical dose kernels.
Bartzsch, Stefan; Oelfke, Uwe
2013-11-01
The advent of widespread kV cone beam computed tomography in image guided radiation therapy and special therapeutic applications of keV photons, e.g., in microbeam radiation therapy (MRT), require accurate and fast dose calculations for photon beams with energies between 40 and 200 keV. Multiple photon scattering originating from Compton scattering and the strong dependence of the photoelectric cross section on the atomic number of the interacting tissue render these dose calculations by far more challenging than the ones established for corresponding MeV beams. That is why the analytical models of kV photon dose calculation developed so far fail to provide the required accuracy, and one has to rely on time-consuming Monte Carlo simulation techniques. In this paper, the authors introduce a novel analytical approach for kV photon dose calculations with an accuracy that is almost comparable to that of Monte Carlo simulations. First, analytical point dose and pencil beam kernels are derived for homogeneous media and compared to Monte Carlo simulations performed with the Geant4 toolkit. The dose contributions are systematically separated into contributions from the relevant orders of multiple photon scattering. Moreover, approximate scaling laws for the extension of the algorithm to inhomogeneous media are derived. The comparison of the analytically derived dose kernels in water showed an excellent agreement with the Monte Carlo method. Calculated values deviate less than 5% from Monte Carlo derived dose values, for doses above 1% of the maximum dose. The analytical structure of the kernels allows adaptation to arbitrary materials and photon spectra in the given energy range of 40-200 keV. The presented analytical methods can be employed in a fast treatment planning system for MRT. In convolution based algorithms dose calculation times can be reduced to a few minutes.
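In convolution-based dose engines, the dose distribution is typically obtained by convolving the in-plane fluence with a pencil beam kernel. The snippet below illustrates that step with a made-up Gaussian kernel; the analytical, scatter-order-resolved kernels derived in the paper are considerably more detailed and energy dependent.

import numpy as np
from scipy.signal import fftconvolve

def gaussian_kernel(size=51, sigma_mm=2.0, pixel_mm=0.5):
    """Toy radially symmetric pencil beam kernel (placeholder for the
    analytical multiple-scatter kernels discussed in the paper)."""
    ax = (np.arange(size) - size // 2) * pixel_mm
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma_mm**2))
    return k / k.sum()

# Uniform fluence through a 10 mm x 10 mm field on a 0.5 mm grid
fluence = np.zeros((200, 200))
fluence[90:110, 90:110] = 1.0
dose = fftconvolve(fluence, gaussian_kernel(), mode="same")  # relative dose map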
40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests
Code of Federal Regulations, 2012 CFR
2012-07-01
... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...
40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests
Code of Federal Regulations, 2011 CFR
2011-07-01
... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...
Analytical aids in land management planning
David R. Betters
1978-01-01
Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...
Theanponkrang, Somjai; Suginta, Wipa; Weingart, Helge; Winterhalter, Mathias; Schulte, Albert
2015-01-01
A new automated pharmacoanalytical technique for convenient quantification of redox-active antibiotics has been established by combining the benefits of a carbon nanotube (CNT) sensor modification with electrocatalytic activity for analyte detection with the merits of a robotic electrochemical device that is capable of sequential nonmanual sample measurements in 24-well microtiter plates. Norfloxacin (NFX) and ciprofloxacin (CFX), two standard fluoroquinolone antibiotics, were used in automated calibration measurements by differential pulse voltammetry (DPV), and linear ranges of 1-10 μM and 2-100 μM were achieved for NFX and CFX, respectively. The lowest detectable levels were estimated to be 0.3±0.1 μM (n=7) for NFX and 1.6±0.1 μM (n=7) for CFX. In standard solutions or tablet samples of known content, both analytes could be quantified with the robotic DPV microtiter plate assay, with recoveries within ±4% of 100%. Recoveries were equally good when NFX was evaluated in human serum samples spiked with NFX. The use of simple instrumentation, convenience in execution, and high effectiveness in analyte quantitation suggest the combination of automated microtiter plate voltammetry and CNT-supported electrochemical drug detection as a novel methodology for antibiotic testing in pharmaceutical and clinical research and quality control laboratories.
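DPV quantification of this kind usually reduces to a linear calibration of peak current against concentration, with the detection limit estimated from the blank noise and the slope (3-sigma criterion). The numbers in the sketch below are invented, not the reported NFX/CFX data.

import numpy as np

# Hypothetical calibration data: concentration (uM) vs. DPV peak current (uA)
conc = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
current = np.array([0.21, 0.40, 0.83, 1.22, 1.61, 2.05])

slope, intercept = np.polyfit(conc, current, 1)        # linear calibration fit
pred = slope * conc + intercept
r2 = 1 - np.sum((current - pred)**2) / np.sum((current - current.mean())**2)

sigma_blank = 0.02                                     # sd of blank signal (assumed)
lod = 3 * sigma_blank / slope                          # detection limit, 3-sigma criterion

# Quantify an unknown sample from its measured peak current
unknown_current = 1.05
unknown_conc = (unknown_current - intercept) / slope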
Determination of Ignitable Liquids in Fire Debris: Direct Analysis by Electronic Nose
Ferreiro-González, Marta; Barbero, Gerardo F.; Palma, Miguel; Ayuso, Jesús; Álvarez, José A.; Barroso, Carmelo G.
2016-01-01
Arsonists usually use an accelerant in order to start or accelerate a fire. The most widely used analytical method to determine the presence of such accelerants consists of a pre-concentration step of the ignitable liquid residues followed by chromatographic analysis. A rapid analytical method based on headspace-mass spectrometry electronic nose (E-Nose) has been developed for the analysis of Ignitable Liquid Residues (ILRs). The working conditions for the E-Nose analytical procedure were optimized by studying different fire debris samples. The optimized experimental variables were related to headspace generation, specifically, incubation temperature and incubation time. The optimal conditions were 115 °C and 10 min for these two parameters. Chemometric tools such as hierarchical cluster analysis (HCA) and linear discriminant analysis (LDA) were applied to the MS data (45–200 m/z) to establish the most suitable spectroscopic signals for the discrimination of several ignitable liquids. The optimized method was applied to a set of fire debris samples. In order to simulate post-burn samples several ignitable liquids (gasoline, diesel, citronella, kerosene, paraffin) were used to ignite different substrates (wood, cotton, cork, paper and paperboard). A full discrimination was obtained on using discriminant analysis. This method reported here can be considered as a green technique for fire debris analyses. PMID:27187407
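The chemometric step can be sketched with scikit-learn: linear discriminant analysis on the m/z intensity matrix, assessed by cross-validation. The data below are synthetic stand-ins for the 45-200 m/z headspace signals, and the class structure is an assumption for illustration only.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the m/z intensity matrix of fire-debris headspace samples
rng = np.random.default_rng(7)
n_per_class, n_mz = 20, 156                    # 156 channels ~ m/z 45-200
classes = ["gasoline", "diesel", "kerosene"]
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_mz))
               for i, _ in enumerate(classes)])
y = np.repeat(classes, n_per_class)

lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, y, cv=5).mean()   # cross-validated classification rate
scores = lda.fit(X, y).transform(X)                  # 2-D discriminant scores for plotting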
Trace analysis of high-purity graphite by LA-ICP-MS.
Pickhardt, C; Becker, J S
2001-07-01
Laser-ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been established as a very efficient and sensitive technique for the direct analysis of solids. In this work the capability of LA-ICP-MS was investigated for the determination of trace elements in high-purity graphite. Synthetic laboratory standards with a graphite matrix were prepared for the purpose of quantifying the analytical results. Doped trace elements at a concentration of 0.5 microg g(-1) in a laboratory standard were determined with an accuracy of +/- 1% to +/- 7% and a relative standard deviation (RSD) of 2-13%. Solution-based calibration was also used for quantitative analysis of high-purity graphite. It was found that such calibration led to analytical results for trace-element determination in graphite with accuracy similar to that obtained by use of synthetic laboratory standards for quantification of analytical results. Results from quantitative determination of trace impurities in a real reactor-graphite sample, using both quantification approaches, were in good agreement. Detection limits for all elements of interest were determined in the low ng g(-1) concentration range. Improvement of detection limits by a factor of 10 was achieved for analyses of high-purity graphite with LA-ICP-MS under wet plasma conditions, because of the lower background signal and increased element sensitivity.
Recent trends in analytical procedures in forensic toxicology.
Van Bocxlaer, Jan F
2005-12-01
Forensic toxicology is a very demanding discipline, heavily dependent on good analytical techniques. That is why new trends appear continuously. In the past years, LC-MS has revolutionized target compound analysis and has become the trend, also in toxicology. In LC-MS screening analysis, things are less straightforward and several approaches exist. One promising approach, based on accurate LC-TOF-MS mass measurements and elemental formula based library searches, is discussed. This way of screening has already proven its applicability, but at the same time it became obvious that a single accurate mass measurement lacks some specificity when using large compound libraries. CE, too, is a re-emerging approach. The increasingly polar and ionic molecules encountered make it a worthwhile addition to, e.g., LC, as illustrated for the analysis of GHB. A third recent trend is the use of MALDI mass spectrometry for small molecules. It is promising for its ease-of-use and high throughput. Unfortunately, reports of disappointment but also of accomplishment, e.g. the quantitative analysis of LSD as discussed here, alternate, and it remains to be seen whether MALDI will really establish itself. Indeed, not all new trends will prove themselves, but the mere fact that many appear in the world of analytical toxicology nowadays is, in itself, encouraging for the future of (forensic) toxicology.
NASA Astrophysics Data System (ADS)
Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.
2016-09-01
Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and builds on a big data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology for implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
Cortez, Juliana; Pasquini, Celio
2013-02-05
The ring-oven technique, originally applied to classical qualitative analysis from the 1950s to the 1970s, is revisited to be used in a simple though highly efficient and green procedure for analyte preconcentration prior to determination by the microanalytical techniques presently available. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is maintained in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center to a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, of about 2.0 cm diameter, is formed on the paper and contains most of the analytes originally present in the sample volume. Preconcentration coefficients of the analyte can reach 250-fold (on a m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure have been evaluated to concentrate Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour, employing the microanalytical technique of laser induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL(-1) and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)%, for Na, Fe, and Cu, respectively, were obtained in fuel ethanol. It is possible to anticipate the application of the technique, coupled to modern microanalytical and multianalyte techniques, to several analytical problems requiring analyte preconcentration and/or sample stabilization.
Net growth rate of continuum heterogeneous biofilms with inhibition kinetics.
Gonzo, Elio Emilio; Wuertz, Stefan; Rajal, Veronica B
2018-01-01
Biofilm systems can be modeled using a variety of analytical and numerical approaches, usually by making simplifying assumptions regarding biofilm heterogeneity and activity as well as effective diffusivity. Inhibition kinetics, albeit common in experimental systems, are rarely considered, and analytical approaches are either lacking or assume that the effective diffusivity of the substrate and the biofilm density remain constant. To address this knowledge gap, an analytical procedure to estimate the effectiveness factor (dimensionless substrate mass flux at the biofilm-fluid interface) was developed for a continuum heterogeneous biofilm with kinetics ranging from multiple limiting-substrate Monod kinetics to different types of inhibition kinetics. The simple perturbation technique, previously validated to quantify biofilm activity, was applied to systems where either the substrate or the inhibitor is the limiting component, and to cases where the inhibitor is a reaction product or the substrate also acts as the inhibitor. Explicit analytical equations are presented for estimation of the effectiveness factor and, therefore, calculation of the biomass growth rate or limiting substrate/inhibitor consumption rate, for a given biofilm thickness. The robustness of the new biofilm model was tested using kinetic parameters experimentally determined for the growth of Pseudomonas putida CCRC 14365 on phenol. Several additional cases have been analyzed, including examples where the effectiveness factor can reach values greater than unity, characteristic of systems with inhibition kinetics. Criteria to establish when the effectiveness factor can exceed unity in each of the cases studied are also presented.
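Substrate inhibition is often described with Haldane (Andrews) kinetics, whose rate passes through a maximum; this non-monotonicity is what allows effectiveness factors above unity when the bulk concentration lies beyond the rate peak. The sketch below evaluates such a rate expression with placeholder parameters, not the P. putida phenol constants used in the paper.

import numpy as np

def haldane_rate(S, mu_max=0.5, Ks=2.0, Ki=50.0):
    """Haldane (Andrews) substrate-inhibition kinetics, specific rate in 1/h."""
    return mu_max * S / (Ks + S + S**2 / Ki)

S = np.linspace(0.01, 500.0, 2000)        # substrate concentration, mg/L
mu = haldane_rate(S)
S_opt = S[np.argmax(mu)]                  # rate peaks near sqrt(Ks*Ki)

# Inside a biofilm the concentration drops below the bulk value S_b; if S_b lies
# beyond the rate peak, interior rates can exceed the rate at surface conditions,
# so the effectiveness factor (flux / rate at surface conditions) may exceed 1.
S_b = 200.0
ratio = haldane_rate(0.3 * S_b) / haldane_rate(S_b)   # > 1 illustrates the effect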
Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.
Lo, Y C; Armbruster, David A
2012-04-01
Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Historical literature citation intervals are often suboptimal because they may be based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for genders combined. Gender-specific or combined-gender intervals were adapted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adapt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference interval transference technique based on a method comparison study.
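The nonparametric approach of CLSI/IFCC C28-A defines the reference interval as the central 95% of reference results, i.e., the 2.5th and 97.5th percentiles. A minimal sketch with synthetic donor results (not the Hong Kong data) is shown below.

import numpy as np

def reference_interval(values, central=0.95):
    """Nonparametric reference interval: central 95% of reference results."""
    lower = (1.0 - central) / 2.0 * 100.0
    upper = 100.0 - lower
    return np.percentile(values, [lower, upper])

# Synthetic stand-in for one analyte measured in healthy donors (invented values)
rng = np.random.default_rng(3)
results = rng.normal(loc=5.2, scale=0.6, size=400)        # e.g. mmol/L
low, high = reference_interval(results)

# Gender-specific intervals would repeat this on the male and female subsets
# and be reported separately when the subsets differ meaningfully.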
Ultra-sensitive fluorescent imaging-biosensing using biological photonic crystals
NASA Astrophysics Data System (ADS)
Squire, Kenny; Kong, Xianming; Wu, Bo; Rorrer, Gregory; Wang, Alan X.
2018-02-01
Optical biosensing is a growing area of research known for its low limits of detection. Among optical sensing techniques, fluorescence detection is one of the most established and prevalent. Fluorescence imaging is an optical biosensing modality that exploits the sensitivity of fluorescence in an easy-to-use process. Fluorescence imaging allows a user to place a sample on a sensor and use an imager, such as a camera, to collect the results. The image can then be processed to determine the presence of the analyte. Fluorescence imaging is appealing because it can be performed with as little as a light source, a camera and a data processor, and is thus ideal for nontrained personnel without any expensive equipment. Fluorescence imaging sensors generally employ an immunoassay procedure to selectively trap analytes such as antigens or antibodies. When the analyte is present, the sensor fluoresces, thus transducing the chemical reaction into an optical signal capable of being imaged. Enhancement of this fluorescence leads to an enhancement in the detection capabilities of the sensor. Diatoms are unicellular algae with a biosilica shell called a frustule. The frustule is porous with periodic nanopores, making diatoms biological photonic crystals. Additionally, the porous nature of the frustule provides a large surface area capable of supporting multiple analyte binding sites. In this paper, we fabricate a diatom-based ultra-sensitive fluorescence imaging biosensor capable of detecting the antibody mouse immunoglobulin down to a concentration of 1 nM. The measured signal shows an enhancement of 6× when compared to sensors fabricated without diatoms.
Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F
2005-12-08
This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.
ERIC Educational Resources Information Center
Toh, Chee-Seng
2007-01-01
A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.
NASA Astrophysics Data System (ADS)
Jones, C. E.; Kato, S.; Nakashima, Y.; Yamazakii, S.; Kajii, Y. J.
2011-12-01
Biogenic volatile organic compounds (BVOCs) emitted from vegetation constitute the largest fraction (>90 %) of total global non-methane VOC supplied to the atmosphere, yet the chemical complexity of these emissions means that achieving comprehensive measurements of BVOCs, and in particular the less volatile terpenes, is not straightforward. As such, there is still significant uncertainty associated with the contribution of BVOCs to the tropospheric oxidation budget, and to atmospheric secondary organic aerosol (SOA) formation. The rate of BVOC emission from vegetation is regulated by environmental conditions such as light intensity and temperature, and thus can be highly variable, necessitating high time-resolution BVOC measurements. In addition, the numerous monoterpene and sesquiterpene isomers, which are indistinguishable by some analytical techniques, have greatly varying lifetimes with respect to atmospheric oxidants, and as such quantification of each individual isomer is fundamental to achieving a comprehensive characterisation of the impact of BVOCs upon the atmospheric oxidation capacity. However, established measurement techniques for these trace gases typically offer a trade-off between sample frequency and the level of speciation; detailed information regarding chemical composition may be obtained, but with reduced time resolution, or vice versa. We have developed a Fast-GC-FID technique for quantification of a range of monoterpene, sesquiterpene and oxygenated C10 BVOC isomers, which retains the separation capability of conventional gas chromatography, yet offers considerably improved sample frequency. Development of this system is ongoing, but currently a 20 m x 0.18 mm i.d resistively heated metal column is employed to achieve chromatographic separation of thirteen C10-C15 BVOCs, within a total cycle time of ~15 minutes. We present the instrument specifications and analytical capability, together with the first application of this Fast-GC technique for BVOC analysis, monitoring BVOC emissions from white spruce (Picea glauca) during plant chamber studies.
NASA Astrophysics Data System (ADS)
Yazdchi, K.; Salehi, M.; Shokrieh, M. M.
2009-03-01
By introducing a new simplified 3D representative volume element for wavy carbon nanotubes, an analytical model is developed to study the stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in a good agreement with corresponding results for straight nanotubes.
Quantifying risks with exact analytical solutions of derivative pricing distribution
NASA Astrophysics Data System (ADS)
Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin
2017-04-01
Derivative (i.e. option) pricing is essential for modern financial instruments. Despite previous efforts, the exact analytical forms of the derivative pricing distributions are still challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain the exact analytical solutions of the statistical distribution for bond and bond option pricing for the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing characterized by the distribution tail and their associations to value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
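For orientation, the Vasicek model admits a well-known closed-form zero-coupon bond price P = A(t,T) exp(-B(t,T) r); the snippet below evaluates it for illustrative parameters. The paper's path-integral treatment of the full pricing distribution and its tails goes beyond this expectation value, and the parameter choices here are assumptions.

import numpy as np

def vasicek_bond_price(r0, tau, a, theta, sigma):
    """Closed-form zero-coupon bond price under the Vasicek model
    dr = a*(theta - r)*dt + sigma*dW, for time to maturity tau."""
    B = (1.0 - np.exp(-a * tau)) / a
    A = np.exp((B - tau) * (a**2 * theta - sigma**2 / 2.0) / a**2
               - sigma**2 * B**2 / (4.0 * a))
    return A * np.exp(-B * r0)

# Illustrative parameters (placeholders, not calibrated values)
price = vasicek_bond_price(r0=0.03, tau=5.0, a=0.15, theta=0.05, sigma=0.01)
yield_to_maturity = -np.log(price) / 5.0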
Inspection method for the identification of TBT-containing antifouling paints.
Senda, Tetsuya; Miyata, Osamu; Kihara, Takeshi; Yamada, Yasujiro
2003-04-01
In order to ensure the effectiveness of the international convention which will prohibit the use of organotin compounds in antifouling paints applied to ships, it is essential to establish an inspection system to determine the presence of the prohibited compounds in the paint. In the present study, a method for the identification of organotin-containing antifouling paints using a two-stage analysis process is investigated. Firstly, X-ray fluorescence analysis (XRF) is utilized, which could be used at the place of ship surveys or port state control. Using a portable XRF instrument customized for ship inspection, the analysis is automatically executed and determines whether tin is present or not. If the presence of tin is confirmed by XRF, the sample is subsequently examined at an analytical laboratory using more rigorous analytical techniques, such as gas chromatography-mass spectrometry (GC-MS). A sampling device has been designed. It is a disc of approximately 10 mm diameter and has abrasive paper pasted to one of its flat surfaces. The device is pressed onto and then slid along a ship hull to lightly scrape off fragments of paint onto the abrasive paper. Preliminary field tests have revealed that sampling from a ship in dock yields successful collection of the paint for XRF analysis and that the resultant damage caused to the antifouling paint surface by the sampling technique was found to be negligible.
Higher-order differential phase shift keyed modulation
NASA Astrophysics Data System (ADS)
Vanalphen, Deborah K.; Lindsey, William C.
1994-02-01
Advanced modulation/demodulation techniques which are robust in the presence of phase and frequency uncertainties continue to be of interest to communication engineers. We are particularly interested in techniques which accommodate slow channel phase and frequency variations with minimal performance degradation and which alleviate the need for phase and frequency tracking loops in the receiver. We investigate the performance sensitivity to frequency offsets of a modulation technique known as binary Double Differential Phase Shift Keying (DDPSK) and compare it to that of classical binary Differential Phase Shift Keying (DPSK). We also generalize our analytical results to include nth-order, M-ary DPSK. The DDPSK (n = 2) technique was first introduced in the Russian literature circa 1972 and was studied more thoroughly in the late 1970s by Pent and Okunev. Here, we present an expression for the symbol error probability that is easy to derive and to evaluate numerically. We also present graphical results that establish when, as a function of signal energy-to-noise ratio and normalized frequency offset, binary DDPSK is preferable to binary DPSK with respect to performance in additive white Gaussian noise. Finally, we provide insight into the optimum receiver from a detection theory viewpoint.
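To make the comparison concrete, a small Monte Carlo sketch is given below: it simulates binary DPSK and binary double-differential PSK over an AWGN channel with a constant normalized frequency offset, detecting bits from the first or second phase difference of the received samples. The encoding convention, detector and parameter values are assumptions chosen for illustration, not the paper's analytical expressions.

```python
import numpy as np

rng = np.random.default_rng(0)

def ber(ebn0_db, df_norm, order=1, n_bits=200_000):
    """Monte Carlo bit error rate of binary DPSK (order=1) or binary DDPSK (order=2)
    in AWGN with a constant normalized frequency offset df_norm = delta_f * T."""
    ebn0 = 10.0 ** (ebn0_db / 10.0)
    bits = rng.integers(0, 2, n_bits)
    phase = np.pi * bits                       # information carried by the order-th phase difference
    for _ in range(order):                     # differential encoding applied 'order' times
        phase = np.cumsum(phase)
    k = np.arange(n_bits)
    tx = np.exp(1j * (phase + 2.0 * np.pi * df_norm * k))   # frequency offset = linear phase ramp
    noise = (rng.normal(size=n_bits) + 1j * rng.normal(size=n_bits)) * np.sqrt(0.5 / ebn0)
    r = tx + noise
    if order == 1:                             # decision statistic from the first phase difference
        stat = np.real(r[1:] * np.conj(r[:-1]))
    else:                                      # second difference cancels a constant frequency offset
        stat = np.real(r[2:] * r[:-2] * np.conj(r[1:-1]) ** 2)
    detected = (stat < 0).astype(int)          # a pi increment flips the sign of the statistic
    return np.mean(detected != bits[order:])

# at 10 dB Eb/N0 with a 5% symbol-rate frequency offset, DDPSK should be far less degraded
print(ber(10.0, 0.05, order=1), ber(10.0, 0.05, order=2))
```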
An Analytical Solution for Transient Thermal Response of an Insulated Structure
NASA Technical Reports Server (NTRS)
Blosser, Max L.
2012-01-01
An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
Analyte-induced spectral filtering in femtosecond transient absorption spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abraham, Baxter; Nieto-Pescador, Jesus; Gundlach, Lars
Here, we discuss the influence of spectral filtering by samples in femtosecond transient absorption measurements. Commercial instruments for transient absorption spectroscopy (TA) have become increasingly available to scientists in recent years and TA is becoming an established technique to measure the dynamics of photoexcited systems. Furthermore, we show that absorption of the excitation pulse by the sample can severely alter the spectrum and consequently the temporal pulse shape. This “spectral self-filtering” effect can lead to systematic errors and misinterpretation of data, most notably in concentration dependent measurements. Finally, the combination of narrow absorption peaks in the sample with ultrafast broadband excitation pulses is especially prone to this effect.
Phases and stability of non-uniform black strings
NASA Astrophysics Data System (ADS)
Emparan, Roberto; Luna, Raimon; Martínez, Marina; Suzuki, Ryotaku; Tanabe, Kentaro
2018-05-01
We construct solutions of non-uniform black strings in dimensions from D ≈ 9 all the way up to D = ∞, and investigate their thermodynamics and dynamical stability. Our approach employs the large-D perturbative expansion beyond the leading order, including corrections up to 1/D⁴. Combining both analytical techniques and relatively simple numerical solution of ODEs, we map out the ranges of parameters in which non-uniform black strings exist in each dimension and compute their thermodynamics and quasinormal modes with accuracy. We establish with very good precision the existence of Sorkin's critical dimension and we prove that not only the thermodynamic stability, but also the dynamic stability of the solutions changes at it.
Hyperasymptotics and quark-hadron duality violations in QCD
NASA Astrophysics Data System (ADS)
Boito, Diogo; Caprini, Irinel; Golterman, Maarten; Maltman, Kim; Peris, Santiago
2018-03-01
We investigate the origin of the quark-hadron duality-violating terms in the expansion of the QCD two-point vector correlation function at large energies in the complex q² plane. Starting from the dispersive representation for the associated polarization, the analytic continuation of the operator product expansion from the Euclidean to the Minkowski region is performed by means of a generalized Borel-Laplace transform, borrowing techniques from hyperasymptotics. We establish a connection between singularities in the Borel plane and quark-hadron duality-violating contributions. Starting with the assumption that for QCD at N_c = ∞ the spectrum approaches a Regge trajectory at large energy, we obtain an expression for quark-hadron duality violations at large, but finite N_c.
Manufacturing of novel low-cost adsorbent: Co-granulation of limestone and coffee waste.
Iakovleva, Evgenia; Sillanpää, Mika; Maydannik, Philipp; Liu, Jiang Tao; Allen, Stephen; Albadarin, Ahmad B; Mangwandi, Chirangano
2017-12-01
Limestone and coffee waste were used during the wet co-granulation process for the production of efficient adsorbents to be used in the removal of anionic and cationic dyes. The adsorbents were characterized using different analytical techniques such as XRD, SEM, FTIR, organic elemental analysis and the nitrogen adsorption method, together with wettability, strength and adsorption tests. The adsorption capacity of granules was determined by removal of methylene blue (MB) and orange II (OR) from single and mixed solutions. In the mixed solution, co-granules removed 100% of MB and 85% of OR. The equilibria were established after 6 and 480 h for MB and OR, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.
Optical trapping for analytical biotechnology.
Ashok, Praveen C; Dholakia, Kishan
2012-02-01
We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single cell or even at subcellular levels which has allowed an insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore we may probe and analyse the biological world when combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.
Nuclear and atomic analytical techniques in environmental studies in South America.
Paschoa, A S
1990-01-01
The use of nuclear analytical techniques for environmental studies in South America is selectively reviewed since the time of earlier works of Lattes with cosmic rays until the recent applications of the PIXE (particle-induced X-ray emission) technique to study air pollution problems in large cities, such as São Paulo and Rio de Janeiro. The studies on natural radioactivity and fallout from nuclear weapons in South America are briefly examined.
Green aspects, developments and perspectives of liquid phase microextraction techniques.
Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek
2014-02-01
Determination of analytes at trace levels in complex samples (e.g. biological samples or contaminated water or soils) is often required for environmental assessment and monitoring as well as for scientific research in the field of environmental pollution. A limited number of analytical techniques are sensitive enough for the direct determination of trace components in samples and, because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention was given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also, new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the stage of the procedure prior to chromatographic determination, are presented. © 2013 Published by Elsevier B.V.
Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A
2011-07-01
Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Chandrasekar, Thiravidamani; Raman, Natarajan
2016-07-01
A few novel Schiff base transition metal complexes of general formula [MLCl] (where L = the Schiff base obtained by the condensation reaction of the Knoevenagel condensate of curcumin with L-tryptophan, and M = Cu(II), Ni(II), Co(II) or Zn(II)) were prepared by template synthesis. They were characterized using UV-vis, IR and EPR spectral techniques, microanalytical techniques, magnetic susceptibility and molar conductivity. The geometry of the metal complexes was examined and assigned as square planar. DNA binding and viscosity studies revealed that the metal(II) complexes bound strongly to calf thymus DNA via an intercalation mechanism. The gel electrophoresis technique was used to investigate the DNA cleavage competence of the complexes, which were found to promote the cleavage of pBR322 DNA in the presence of the oxidant H2O2; this outcome indicates that the synthesized complexes show good nuclease activity. Moreover, the complexes were screened for antimicrobial activities. The results showed that the synthesized compounds were active against all the microbes under investigation.
On the power spectral density of quadrature modulated signals. [satellite communication
NASA Technical Reports Server (NTRS)
Yan, T. Y.
1981-01-01
The conventional (no-offset) quadriphase modulation technique suffers from the fact that hardlimiting will restore the frequency sidelobes removed by proper filtering. Thus, offset-keyed quadriphase modulation techniques are often proposed for satellite communication with bandpass hardlimiting. A unified theory is developed which is capable of describing the power spectral density before and after the hardlimiting process. Using the in-phase and quadrature-phase channels with arbitrary pulse shaping, analytical results are established for generalized quadriphase modulation. In particular, MSK, OPSK or the recently introduced overlapped raised cosine keying all fall into this general category. It is shown that for a linear communication channel, the power spectral density of the modulated signal remains unchanged regardless of the offset delay. Furthermore, if the in-phase and quadrature-phase channels have identical pulse shapes without offset, the spectrum after bandpass hardlimiting will be identical to that of the conventional QPSK modulation. Numerical examples are given for various modulation techniques. A case of different pulse shapes in the in-phase and quadrature-phase channels is also considered.
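As a numerical companion to this comparison, the sketch below evaluates the standard textbook baseband power spectral density expressions for conventional QPSK with rectangular pulses and for MSK; these closed forms are the usual ones found in communications texts and are used here as assumed illustrations, not as the paper's unified derivation.

```python
import numpy as np

def qpsk_psd(f, Eb, Tb):
    """Baseband PSD of conventional QPSK with rectangular pulses (textbook form)."""
    return 2.0 * Eb * np.sinc(2.0 * Tb * f) ** 2     # np.sinc(x) = sin(pi x)/(pi x)

def msk_psd(f, Eb, Tb):
    """Baseband PSD of MSK (textbook form); note the much faster sidelobe roll-off."""
    num = np.cos(2.0 * np.pi * Tb * f)
    den = 1.0 - 16.0 * (Tb * f) ** 2
    return (32.0 * Eb / np.pi**2) * (num / den) ** 2

# normalized frequency grid f*Tb, slightly offset to avoid the removable MSK singularity
f = np.linspace(0.0, 2.0, 1001) + 1e-6
S_qpsk = qpsk_psd(f, Eb=1.0, Tb=1.0)
S_msk = msk_psd(f, Eb=1.0, Tb=1.0)
```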
Performance analysis of cooperative virtual MIMO systems for wireless sensor networks.
Rafique, Zimran; Seet, Boon-Chong; Al-Anbuky, Adnan
2013-05-28
Multi-Input Multi-Output (MIMO) techniques can be used to increase the data rate for a given bit error rate (BER) and transmission power. Due to the small form factor, energy and processing constraints of wireless sensor nodes, a cooperative Virtual MIMO as opposed to True MIMO system architecture is considered more feasible for wireless sensor network (WSN) applications. Virtual MIMO with Vertical-Bell Labs Layered Space-Time (V-BLAST) multiplexing architecture has been recently established to enhance WSN performance. In this paper, we further investigate the impact of different modulation techniques, and analyze for the first time, the performance of a cooperative Virtual MIMO system based on V-BLAST architecture with multi-carrier modulation techniques. Through analytical models and simulations using real hardware and environment settings, both communication and processing energy consumptions, BER, spectral efficiency, and total time delay of multiple cooperative nodes each with single antenna are evaluated. The results show that cooperative Virtual-MIMO with Binary Phase Shift Keying-Wavelet based Orthogonal Frequency Division Multiplexing (BPSK-WOFDM) modulation is a promising solution for future high data-rate and energy-efficient WSNs.
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
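As an illustration of the AHP machinery behind such a framework, the sketch below derives criteria weights from a pairwise comparison matrix via the principal eigenvector and reports Saaty's consistency ratio; the example matrix and criteria names are hypothetical and not taken from the paper.

```python
import numpy as np

def ahp_weights(A):
    """Priority weights and consistency ratio (CR) from a pairwise comparison matrix A."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalized priority weights
    ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}.get(n, 1.45)
    return w, ci / ri                              # CR < 0.1 is conventionally acceptable

# hypothetical 3-criterion comparison (e.g., teaching quality, leadership, resources)
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 2.0],
     [1/5, 1/2, 1.0]]
weights, cr = ahp_weights(A)
```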
Big Data Analytics with Datalog Queries on Spark.
Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo
2016-01-01
There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
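To make the recursion problem concrete, the sketch below evaluates the classic transitive-closure Datalog program with the semi-naive fixpoint strategy in plain Python; per the abstract, BigDatalog's contribution is compiling and optimizing this kind of fixpoint onto Spark, which the sketch does not attempt.

```python
def transitive_closure(edges):
    """Semi-naive evaluation of:  tc(X,Y) :- edge(X,Y).   tc(X,Z) :- tc(X,Y), edge(Y,Z)."""
    edge = set(edges)
    tc = set(edge)          # initial facts
    delta = set(edge)       # facts newly derived in the previous iteration
    by_src = {}             # index edges by source node for the join tc(X,Y) ⋈ edge(Y,Z)
    for y, z in edge:
        by_src.setdefault(y, set()).add(z)
    while delta:
        new = set()
        for x, y in delta:                      # semi-naive: join only the delta
            for z in by_src.get(y, ()):
                if (x, z) not in tc:
                    new.add((x, z))
        tc |= new
        delta = new
    return tc

print(transitive_closure({(1, 2), (2, 3), (3, 4)}))   # adds (1,3), (2,4), (1,4)
```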
ERIC Educational Resources Information Center
Griffith, James
2002-01-01
Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…
Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I
2016-10-18
There is an increasing interest in using single particle-inductively coupled plasma mass spectrometry (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved signal. While sample dilution is often necessary to achieve the low analyte concentrations necessary for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here to also reduce the dissolved signal relative to the particulate, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.
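For readers new to the single-particle calculation, the sketch below shows one commonly used conversion from a per-event pulse signal to particle mass and spherical-equivalent diameter, using a dissolved calibration sensitivity, sample flow rate, dwell time and transport efficiency. The equation layout and the example numbers are assumptions following the widely cited single-particle literature, not this paper's procedure.

```python
import numpy as np

def particle_diameter_nm(counts, sens_counts_per_ug_L, flow_L_per_s, dwell_s,
                         transport_eff, density_g_cm3, mass_fraction=1.0):
    """Convert a per-event pulse intensity (counts) to analyte mass and a
    spherical-equivalent diameter. The dissolved calibration slope (counts per ug/L
    per dwell) is rescaled to counts per unit mass reaching the plasma."""
    counts_per_ug = sens_counts_per_ug_L / (flow_L_per_s * dwell_s * transport_eff)
    mass_ug = counts / counts_per_ug                      # analyte mass in the particle (ug)
    mass_g = mass_ug * 1e-6 / mass_fraction               # total particle mass (g)
    volume_cm3 = mass_g / density_g_cm3
    d_cm = (6.0 * volume_cm3 / np.pi) ** (1.0 / 3.0)
    return d_cm * 1e7                                     # diameter in nm

# hypothetical Ag nanoparticle event: 250 counts, 5% transport efficiency, 10 ms dwell
d = particle_diameter_nm(250, sens_counts_per_ug_L=500, flow_L_per_s=5.6e-6,
                         dwell_s=0.01, transport_eff=0.05, density_g_cm3=10.49)
```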
Indirect methods for reference interval determination - review and recommendations.
Jones, Graham R D; Haeckel, Rainer; Loh, Tze Ping; Sikaris, Ken; Streichert, Thomas; Katayev, Alex; Barth, Julian H; Ozarda, Yesim
2018-04-19
Reference intervals are a vital part of the information supplied by clinical laboratories to support interpretation of numerical pathology results such as are produced in clinical chemistry and hematology laboratories. The traditional method for establishing reference intervals, known as the direct approach, is based on collecting samples from members of a preselected reference population, making the measurements and then determining the intervals. An alternative approach is to perform analysis of results generated as part of routine pathology testing and using appropriate statistical techniques to determine reference intervals. This is known as the indirect approach. This paper from a working group of the International Federation of Clinical Chemistry (IFCC) Committee on Reference Intervals and Decision Limits (C-RIDL) aims to summarize current thinking on indirect approaches to reference intervals. The indirect approach has some major potential advantages compared with direct methods. The processes are faster, cheaper and do not involve patient inconvenience, discomfort or the risks associated with generating new patient health information. Indirect methods also use the same preanalytical and analytical techniques used for patient management and can provide very large numbers for assessment. Limitations to the indirect methods include possible effects of diseased subpopulations on the derived interval. The IFCC C-RIDL aims to encourage the use of indirect methods to establish and verify reference intervals, to promote publication of such intervals with clear explanation of the process used and also to support the development of improved statistical techniques for these studies.
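As a toy illustration of the indirect idea, the sketch below estimates a reference interval from routine results by assuming that the central bulk of the data is dominated by non-diseased subjects, estimating a Gaussian from robust quantiles, and reporting its central 95%. The indirect methods discussed by the working group (e.g., Bhattacharya-type and modern mixture approaches) model the diseased fraction far more carefully, so this is only a caricature of the concept.

```python
import numpy as np

def crude_indirect_ri(values):
    """Very simplified indirect reference interval: centre and spread of the presumed
    non-diseased bulk are estimated from the median and interquartile range, then the
    central 95% of the fitted Gaussian is reported. Not a validated indirect method."""
    x = np.asarray(values, dtype=float)
    q25, q50, q75 = np.percentile(x, [25, 50, 75])
    sd = (q75 - q25) / 1.349          # IQR of a Gaussian equals 1.349 standard deviations
    return q50 - 1.96 * sd, q50 + 1.96 * sd

# hypothetical routine results: mostly non-diseased values plus a small pathological tail
rng = np.random.default_rng(42)
data = np.concatenate([rng.normal(140, 5, 9500), rng.normal(120, 10, 500)])
low, high = crude_indirect_ri(data)
```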
NASA Astrophysics Data System (ADS)
Armigliato, A.
2008-07-01
In present and future CMOS technology, due to the ever-shrinking geometries of electronic devices, the availability of techniques capable of performing quantitative analyses of the relevant parameters (structural, chemical, mechanical) at the nanoscale is of paramount importance. The influence of these features on the electrical performance of nanodevices is a key issue for the nanoelectronics industry. In recent years, significant progress has been made in this field by a number of techniques, such as X-ray diffraction (in particular with the advent of synchrotron sources), ion-microbeam-based Rutherford backscattering and channeling spectrometry, and micro-Raman spectrometry. In addition, secondary ion mass spectrometry (SIMS) has achieved an important role in the determination of the dopant depth profile in ultra-shallow junctions (USJs) in silicon. However, the technique which features the ultimate spatial resolution (at the nanometer scale) is scanning transmission electron microscopy (STEM). This presentation reports on the nanoanalysis by STEM of two very important physical quantities which need to be controlled in the fabrication processes of nanodevices: the dopant profile in USJs and the lattice strain that is generated in the electrically active Si regions of isolation structures by the different technological steps. The former quantity is investigated by the so-called Z-contrast high-angle annular dark field (HAADF-STEM) method, whereas the mechanical strain can be two-dimensionally mapped by the convergent beam electron diffraction (CBED-STEM) method. A spatial resolution below one nanometer and of a few nanometers can be achieved in the two cases, respectively. To keep pace with scientific and technological progress, an increasingly wide array of analytical techniques is necessary; their complementary role in the solution of present and future characterization problems must be exploited. Presently, however, European laboratories with high-level expertise in materials characterization still operate in a largely independent way; this adversely affects the competitiveness of European science and industry at the international level. For this reason the European Commission has started an Integrated Infrastructure Initiative (I3) in the sixth Framework Programme (now continuing in FP7) and funded a project called ANNA (2006-2010). This acronym stands for European Integrated Activity of Excellence and Networking for Nano and Micro-Electronics Analysis. The consortium includes 12 partners from 7 European countries and is coordinated by the Fondazione B. Kessler (FBK) in Trento (Italy); CNR-IMM is one of the 12 partners. The aim of ANNA is to establish strong, long-term collaboration among the partners, so as to form an integrated multi-site analytical facility able to offer the European community a wide variety of top-level analytical expertise and services in the field of micro- and nano-electronics. These include X-ray diffraction and scattering, SIMS, electron microscopy, medium-energy ion scattering, and optical and electrical techniques.
The project will be focused on three main activities: Networking (standardization of samples and methodologies, establishment of accredited reference laboratories), Transnational Access to laboratories located in the partners' premises to perform specific analytical experiments (an example is given by the two STEM methodologies discussed above) and Joint Research activity, which is targeted at the improvement and extension of the methodologies through a continuous instrumental and technical development. It is planned that the European joint analytical laboratory will continue its activity beyond the end of the project in 2010.
Kim, Saewung; Guenther, Alex; Apel, Eric
2013-07-01
The physiological production mechanisms of some of the organics in plants, commonly known as biogenic volatile organic compounds (BVOCs), have been known for more than a century. Some BVOCs are emitted to the atmosphere and play a significant role in tropospheric photochemistry, especially in ozone and secondary organic aerosol (SOA) production, as a result of the interplay between BVOCs and atmospheric oxidants such as the hydroxyl radical (OH), ozone (O3) and NOX (NO + NO2). These findings have been drawn from comprehensive analysis of numerous field and laboratory studies that have characterized the ambient distribution of BVOCs and their oxidation products, and the reaction kinetics between BVOCs and atmospheric oxidants. These investigations are limited by the capacity for identifying and quantifying these compounds. This review highlights the major analytical techniques that have been used to observe BVOCs and their oxidation products, such as gas chromatography, mass spectrometry with hard and soft ionization methods, and optical techniques from laser-induced fluorescence (LIF) to remote sensing. In addition, we discuss how new analytical techniques can advance our understanding of BVOC photochemical processes. The principles, advantages, and drawbacks of the analytical techniques are discussed along with specific examples of how the techniques were applied in field and laboratory measurements. Since a number of thorough review papers for each specific analytical technique are available, we refer readers to these publications rather than providing thorough descriptions of each technique. Therefore, the aim of this review is for readers to grasp the advantages and disadvantages of various sensing techniques for BVOCs and their oxidation products and to provide guidance for choosing the optimal technique for a specific research task.
Analytical techniques for characterization of cyclodextrin complexes in the solid state: A review.
Mura, Paola
2015-09-10
Cyclodextrins are cyclic oligosaccharides able to form inclusion complexes with a variety of hydrophobic guest molecules, positively modifying their physicochemical properties. A thorough analytical characterization of cyclodextrin complexes is of fundamental importance to provide adequate support in the selection of the most suitable cyclodextrin for each guest molecule, and also in view of possible future patenting and marketing of drug-cyclodextrin formulations. The demonstration of the actual formation of a drug-cyclodextrin inclusion complex in solution does not guarantee its existence in the solid state as well. Moreover, the technique used to prepare the solid complex can strongly influence the properties of the final product. Therefore, an appropriate characterization of the drug-cyclodextrin solid systems obtained also has a key role in guiding the choice of the most effective preparation method, able to maximize host-guest interactions. The analytical characterization of drug-cyclodextrin solid systems and the assessment of actual inclusion complex formation is not a simple task and involves the combined use of several analytical techniques, whose results have to be evaluated together. The objective of the present review is to present a general overview of the principal analytical techniques which can be employed for a suitable characterization of drug-cyclodextrin systems in the solid state, highlighting their respective potential advantages and limits. The applications of each examined technique are described and discussed using pertinent examples from the literature. Copyright © 2015 Elsevier B.V. All rights reserved.
Golubović, Jelena; Protić, Ana; Otašević, Biljana; Zečević, Mira
2016-04-01
Quantitative structure-retention relationships (QSRR) are mathematically derived relationships between the chromatographic parameters determined for a representative series of analytes in given separation systems and the molecular descriptors accounting for the structural differences among the investigated analytes. An artificial neural network (ANN) is a technique of data analysis which sets out to emulate the human brain's way of working. The aim of the present work was to optimize the separation of six angiotensin receptor antagonists, the so-called sartans (losartan, valsartan, irbesartan, telmisartan, candesartan cilexetil and eprosartan), in a gradient-elution HPLC method. For this purpose, an ANN was used as a mathematical tool for establishing a QSRR model based on molecular descriptors of the sartans and varied instrumental conditions. The optimized model can be further used for prediction of an external congener of the sartans and analysis of the influence of the analyte structure, represented through molecular descriptors, on retention behaviour. Molecular descriptors included in modelling were electrostatic, geometrical and quantum-chemical descriptors: Connolly solvent-excluded volume, non-1,4 van der Waals energy, octanol/water distribution coefficient, polarizability, number of proton-donor sites and number of proton-acceptor sites. Varied instrumental conditions were gradient time, buffer pH and buffer molarity. The high prediction ability of the optimized network enabled complete separation of the analytes within a run time of 15.5 min under the following conditions: gradient time of 12.5 min, buffer pH of 3.95 and buffer molarity of 25 mM. The applied methodology showed the potential to predict the retention behaviour of an external analyte with properties within the training space. Connolly solvent-excluded volume, polarizability and number of proton-acceptor sites appeared to be the most influential parameters for the retention behaviour of the sartans. Copyright © 2015 Elsevier B.V. All rights reserved.
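A minimal sketch of such a QSRR model is given below, using a small feed-forward network (scikit-learn's MLPRegressor) to map the six descriptors plus the three instrumental conditions onto retention time. The array shapes, column ordering and network size are assumptions for illustration, and random placeholder data stand in for the paper's training set.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# assumed column order: Connolly volume, non-1,4 VdW energy, logD, polarizability,
# H-bond donor count, H-bond acceptor count, gradient time, buffer pH, buffer molarity
rng = np.random.default_rng(7)
X = rng.random((60, 9))            # placeholder descriptor/condition matrix
y = 2.0 + 12.0 * rng.random(60)    # placeholder retention times (min)

qsrr = make_pipeline(StandardScaler(),
                     MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=1))
qsrr.fit(X, y)
t_pred = qsrr.predict(X[:1])       # predicted retention time for one analyte/conditions row
```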
Zakaria, Rosita; Allen, Katrina J.; Koplin, Jennifer J.; Roche, Peter
2016-01-01
Introduction Through the introduction of advanced analytical techniques and improved throughput, the scope of dried blood spot testing utilising mass spectrometric methods, has broadly expanded. Clinicians and researchers have become very enthusiastic about the potential applications of dried blood spot based mass spectrometric applications. Analysts on the other hand face challenges of sensitivity, reproducibility and overall accuracy of dried blood spot quantification. In this review, we aim to bring together these two facets to discuss the advantages and current challenges of non-newborn screening applications of dried blood spot quantification by mass spectrometry. Methods To address these aims we performed a key word search of the PubMed and MEDLINE online databases in conjunction with individual manual searches to gather information. Keywords for the initial search included; “blood spot” and “mass spectrometry”; while excluding “newborn”; and “neonate”. In addition, databases were restricted to English language and human specific. There was no time period limit applied. Results As a result of these selection criteria, 194 references were identified for review. For presentation, this information is divided into: 1) clinical applications; and 2) analytical considerations across the total testing process; being pre-analytical, analytical and post-analytical considerations. Conclusions DBS analysis using MS applications is now broadly applied, with drug monitoring for both therapeutic and toxicological analysis being the most extensively reported. Several parameters can affect the accuracy of DBS measurement and further bridge experiments are required to develop adjustment rules for comparability between dried blood spot measures and the equivalent serum/plasma values. Likewise, the establishment of independent reference intervals for dried blood spot sample matrix is required. PMID:28149263
NASA Astrophysics Data System (ADS)
Bastrakova, I.; Klump, J. F.; McInnes, B.; Wyborn, L. A.; Brown, A.
2015-12-01
The International Geo-Sample Number (IGSN) provides a globally unique identifier for physical samples used to generate analytical data. This unique identifier provides the ability to link each physical sample to any analytical data undertaken on that sample, as well as to any publications derived from any data derived on the sample. IGSN is particularly important for geochemical and geochronological data, where numerous analytical techniques can be undertaken at multiple analytical facilities not only on the parent rock sample itself, but also on derived sample splits and mineral separates. Australia now has three agencies implementing IGSN: Geoscience Australia, CSIRO and Curtin University. All three have now combined into a single project, funded by the Australian Research Data Services program, to better coordinate the implementation of IGSN in Australia, in particular how these agencies allocate IGSN identifiers. The project will register samples from pilot applications in each agency including the CSIRO National Collection of Mineral Spectra database, the Geoscience Australia sample collection, and the Digital Mineral Library of the John De Laeter Centre for Isotope Research at Curtin University. These local agency catalogues will then be aggregated into an Australian portal, which will ultimately be expanded for all geoscience specimens. The development of this portal will also involve developing a common core metadata schema for the description of Australian geoscience specimens, as well as formulating agreed governance models for registering Australian samples. These developments aim to enable a common approach across Australian academic, research organisations and government agencies for the unique identification of geoscience specimens and any analytical data and/or publications derived from them. The emerging pattern of governance and technical collaboration established in Australia may also serve as a blueprint for similar collaborations internationally.
Orlandini, Serena; Pasquini, Benedetta; Caprini, Claudia; Del Bubba, Massimo; Pinzauti, Sergio; Furlanetto, Sandra
2015-11-01
A fast and selective CE method for the determination of zolmitriptan (ZOL) and its five potential impurities has been developed applying the analytical Quality by Design principles. Voltage, temperature, buffer concentration, and pH were investigated as critical process parameters that can influence the critical quality attributes, represented by critical resolution values between peak pairs, analysis time, and peak efficiency of ZOL-dimer. A symmetric screening matrix was employed for investigating the knowledge space, and a Box-Behnken design was used to evaluate the main, interaction, and quadratic effects of the critical process parameters on the critical quality attributes. Contour plots were drawn highlighting important interactions between buffer concentration and pH, and the gained information was merged into the sweet spot plots. Design space (DS) was established by the combined use of response surface methodology and Monte Carlo simulations, introducing a probability concept and thus allowing the quality of the analytical performances to be assured in a defined domain. The working conditions (with the interval defining the DS) were as follows: BGE, 138 mM (115-150 mM) phosphate buffer pH 2.74 (2.54-2.94); temperature, 25°C (24-25°C); voltage, 30 kV. A control strategy was planned based on method robustness and system suitability criteria. The main advantages of applying the Quality by Design concept consisted of a great increase of knowledge of the analytical system, obtained throughout multivariate techniques, and of the achievement of analytical assurance of quality, derived by probability-based definition of DS. The developed method was finally validated and applied to the analysis of ZOL tablets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
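The probability-based definition of the design space can be illustrated with a short Monte Carlo sketch: given a fitted response-surface model for a critical resolution, the probability of meeting the acceptance criterion is estimated while the operating parameters fluctuate around a candidate set point. The model coefficients, variability assumptions and the 1.5 resolution threshold below are placeholders, not the paper's fitted model.

```python
import numpy as np

rng = np.random.default_rng(1)

def predicted_resolution(conc_mM, pH, voltage_kV, temp_C):
    """Hypothetical fitted response-surface model for one critical resolution
    (coefficients are placeholders, not those obtained from the Box-Behnken design)."""
    return (0.02 * conc_mM + 1.1 * pH - 0.15 * pH**2
            + 0.01 * voltage_kV + 0.005 * temp_C - 1.0)

def prob_meeting_spec(conc_mM, pH, voltage_kV, temp_C, n=10_000, rs_min=1.5):
    """Monte Carlo probability that the resolution meets the acceptance criterion
    when the operating parameters vary randomly around their set points."""
    c = rng.normal(conc_mM, 2.0, n)     # assumed parameter variability
    p = rng.normal(pH, 0.05, n)
    v = rng.normal(voltage_kV, 0.2, n)
    t = rng.normal(temp_C, 0.5, n)
    return np.mean(predicted_resolution(c, p, v, t) >= rs_min)

p_ok = prob_meeting_spec(conc_mM=138, pH=2.74, voltage_kV=30, temp_C=25)
```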
Li, Zhidong; Marinova, Dora; Guo, Xiumei; Gao, Yuan
2015-01-01
Many steel-based cities in China were established between the 1950s and 1960s. After more than half a century of development and boom, these cities are starting to decline and industrial transformation is urgently needed. This paper focuses on evaluating the transformation capability of resource-based cities by building an evaluation model. Using Text Mining and the Document Explorer technique as a way of extracting text features, the 200 most frequently used words are derived from 100 publications related to steel- and other resource-based cities. The Expert Evaluation Method (EEM) and Analytic Hierarchy Process (AHP) techniques are then applied to select 53 indicators, determine their weights and establish an index system for evaluating the transformation capability of the pillar industry of China’s steel-based cities. Using real data and expert reviews, the improved Fuzzy Relation Matrix (FRM) method is applied to two case studies in China, namely Panzhihua and Daye, and the evaluation model is developed using Fuzzy Comprehensive Evaluation (FCE). The cities’ abilities to carry out industrial transformation are evaluated, with concerns expressed for the case of Daye. The findings have policy implications for the potential and required industrial transformation in the two selected cities and other resource-based towns. PMID:26422266
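The final scoring step can be sketched as a standard weighted fuzzy composition B = W ∘ R, combining AHP/EEM indicator weights with a fuzzy relation (membership) matrix over evaluation grades. The example weights, grades and memberships below are hypothetical, not the paper's 53-indicator system.

```python
import numpy as np

def fuzzy_comprehensive_evaluation(weights, R):
    """Weighted-average fuzzy composition B = W ∘ R.
    weights: indicator weights (e.g., from AHP), shape (n,)
    R: fuzzy relation matrix of memberships, shape (n, m) for m evaluation grades."""
    W = np.asarray(weights, dtype=float)
    W = W / W.sum()
    B = W @ np.asarray(R, dtype=float)
    return B / B.sum()          # normalized membership in each evaluation grade

# hypothetical example: 3 indicators, 4 grades (strong / medium / weak / poor)
R = [[0.5, 0.3, 0.2, 0.0],
     [0.2, 0.4, 0.3, 0.1],
     [0.1, 0.3, 0.4, 0.2]]
grade_membership = fuzzy_comprehensive_evaluation([0.5, 0.3, 0.2], R)
```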
Status of Fuel Development and Manufacturing for Space Nuclear Reactors at BWX Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmack, W.J.; Husser, D.L.; Mohr, T.C.
2004-02-04
New advanced nuclear space propulsion systems will soon seek a high temperature, stable fuel form. BWX Technologies Inc (BWXT) has a long history of fuel manufacturing. UO2, UCO, and UCx have been fabricated at BWXT for various US and international programs. Recent efforts at BWXT have focused on establishing the manufacturing techniques and analysis capabilities needed to provide a high quality, high power, compact nuclear reactor for use in space nuclear powered missions. To support the production of a space nuclear reactor, uranium nitride has recently been manufactured by BWXT. In addition, analytical chemistry and analysis techniques have been developed to provide verification and qualification of the uranium nitride production process. The fabrication of a space nuclear reactor will require the ability to place an unclad fuel form into a clad structure for assembly into a reactor core configuration. To this end, BWX Technologies has reestablished its capability for machining, GTA welding, and EB welding of refractory metals. Specifically, BWX Technologies has demonstrated GTA welding of niobium flat plate and EB welding of niobium and Nb-1Zr tubing. In performing these demonstration activities, BWX Technologies has established the necessary infrastructure to manufacture UO2, UCx, or UNx fuel, components, and complete reactor assemblies in support of space nuclear programs.
Remote software upload techniques in future vehicles and their performance analysis
NASA Astrophysics Data System (ADS)
Hossain, Irina
Updating software in vehicle Electronic Control Units (ECUs) will become a mandatory requirement for a variety of reasons, for example, to update or fix the functionality of an existing system, add new functionality, remove software bugs and keep up with ITS infrastructure. Software modules of advanced vehicles can be updated using the Remote Software Upload (RSU) technique. RSU employs an infrastructure-based wireless communication technique in which the software supplier sends the software to the targeted vehicle via a roadside Base Station (BS). However, security is critically important in RSU to avoid any disasters due to malfunctions of the vehicle or to protect the proprietary algorithms from hackers, competitors or people with malicious intent. In this thesis, a mechanism of secure software upload in advanced vehicles is presented which employs mutual authentication of the software provider and the vehicle using a pre-shared authentication key before sending the software. The software packets are sent encrypted with a secret key along with the Message Digest (MD). In order to increase the security level, it is proposed that the vehicle receive more than one copy of the software along with the MD in each copy. The vehicle will install the new software only when it receives more than one identical copy of the software. In order to validate the proposition, analytical expressions for the average number of packet transmissions required for a successful software update are determined. Different cases are investigated depending on the vehicle's buffer size and verification methods. The analytical and simulation results show that it is sufficient to send two copies of the software to the vehicle to thwart any security attack while uploading the software. The above-mentioned unicast method for RSU is suitable when software needs to be uploaded to a single vehicle. Since multicasting is the most efficient method of group communication, updating software in an ECU of a large number of vehicles could benefit from it. However, as with unicast RSU, the security requirements of multicast communication, i.e., authenticity, confidentiality and integrity of the transmitted software and access control of the group members, are challenging to meet. In this thesis, an infrastructure-based mobile multicasting scheme for RSU in vehicle ECUs is proposed, in which an ECU receives the software from a remote software distribution center using the roadside BSs as gateways. The Vehicular Software Distribution Network (VSDN) is divided into small regions administered by a Regional Group Manager (RGM). Two multicast Group Key Management (GKM) techniques are proposed based on the degree of trust placed in the BSs, named the Fully-trusted (FT) and Semi-trusted (ST) systems. Analytical models are developed to find the multicast session establishment latency and handover latency for these two protocols. The average latency to perform mutual authentication of the software vendor and a vehicle and to send the multicast session key during multicast session initialization, and the handoff latency during a multicast session, are calculated. Analytical and simulation results show that the link establishment latency per vehicle of our proposed schemes is in the range of a few seconds, with the ST system requiring a few milliseconds more than the FT system. The handoff latency is also in the range of a few seconds, and in some cases the ST system requires less handoff time than the FT system.
Thus, it is possible to build an efficient GKM protocol without placing too much trust in the BSs.
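The receive-and-verify step described for the unicast scheme can be sketched as follows: the ECU installs the software only if the two received copies are identical and each copy's message digest verifies under the pre-shared key. HMAC-SHA256 is an assumed concrete choice of keyed digest here; the abstract does not prescribe a specific algorithm.

```python
import hashlib
import hmac

def verify_two_copies(copy1: bytes, copy2: bytes, mac1: bytes, mac2: bytes, key: bytes) -> bool:
    """Install only if both received copies are identical and each copy's
    message digest (an HMAC-SHA256 here, an assumed choice) verifies under the shared key."""
    if copy1 != copy2:
        return False
    expected1 = hmac.new(key, copy1, hashlib.sha256).digest()
    expected2 = hmac.new(key, copy2, hashlib.sha256).digest()
    return hmac.compare_digest(expected1, mac1) and hmac.compare_digest(expected2, mac2)

# hypothetical usage: image and MACs as received over the two transmissions
key = b"pre-shared-authentication-key"
image = b"\x7fELF...firmware-bytes..."
mac = hmac.new(key, image, hashlib.sha256).digest()
ok = verify_two_copies(image, image, mac, mac, key)
```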
Accuracy of trace element determinations in alternate fuels
NASA Technical Reports Server (NTRS)
Greenbauer-Seng, L. A.
1980-01-01
NASA-Lewis Research Center's work on accurate measurement of trace levels of metals in various fuels is presented. The differences between laboratories and between analytical techniques, especially for concentrations below 10 ppm, are discussed, detailing the Atomic Absorption Spectrometry (AAS) and DC Arc Emission Spectrometry (dc arc) techniques used by NASA-Lewis. Also presented is the design of an Interlaboratory Study which considers the following factors: laboratory, analytical technique, fuel type, concentration and ashing additive.
MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION
The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...
Product identification techniques used as training aids for analytical chemists
NASA Technical Reports Server (NTRS)
Grillo, J. P.
1968-01-01
Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.
The generation of criteria for selecting analytical tools for landscape management
Marilyn Duffey-Armstrong
1979-01-01
This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...
Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hakkila, E.A.
1978-10-01
Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.
Independent Research and Independent Exploratory Development Annual Report Fiscal Year 1975
1975-09-01
[Garbled contents and publications fragment; the recoverable entry is: Wagner, N. K., "Analysis of Microelectronic Materials Using Auger Spectroscopy and Additional Advanced Analytical Techniques," NELC Technical Note 2904.]
Evolution of evaluation criteria in the College of American Pathologists Surveys.
Ross, J W
1988-04-01
This review of the evolution of evaluation criteria in the College of American Pathologists Survey and of theoretical grounds proposed for evaluation criteria explores the complex nature of the evaluation process. Survey professionals balance multiple variables to seek relevant and meaningful evaluations. These include the state of the art, the reliability of target values, the nature of available control materials, the perceived medical "nonusefulness" of the extremes of performance (good or poor), the extent of laboratory services provided, and the availability of scientific data and theory by which clinically relevant criteria of medical usefulness may be established. The evaluation process has consistently sought peer consensus, to stimulate improvement in the state of the art, to increase medical usefulness, and to monitor the state of the art. Recent factors that are likely to promote change from peer group evaluation to fixed criteria evaluation are the high degree of proficiency in the state of the art for many analytes, accurate target values, increased knowledge of biologic variation, and the availability of statistical modeling techniques simulating biologic and diagnostic processes as well as analytic processes.
Supercritical Fluid Chromatography--Theoretical Background and Applications on Natural Products.
Hartmann, Anja; Ganzera, Markus
2015-11-01
The use of supercritical fluid chromatography for natural product analysis, as well as the underlying theoretical mechanisms and instrumental requirements, is summarized in this review. A short introduction focusing on the historical development of this interesting separation technique is followed by remarks on current instrumental design, also describing possible detection modes and useable stationary phases. The overview of relevant applications is grouped based on their basic intention, be it (semi)preparative or purely analytical. They indicate that supercritical fluid chromatography is still primarily considered for the analysis of nonpolar analytes such as carotenoids, fatty acids, or terpenes. The low polarity of supercritical carbon dioxide, which is used with modifiers almost exclusively as a mobile phase today, combined with high efficiency and fast separations, might explain the popularity of supercritical fluid chromatography for the analysis of these compounds. Yet, it has been shown that more polar natural products (e.g., xanthones, flavonoids, alkaloids) are separable too, with the same (if not superior) selectivity and reproducibility as established approaches such as HPLC or GC. Georg Thieme Verlag KG Stuttgart · New York.
Morinaga, Osamu
2018-01-01
The scientific evaluation of crude drugs and kampo medicines (KMs) was demonstrated using the eastern blotting method with monoclonal antibodies (MAbs) against bioactive natural compounds. Scutellariae radix is one of the most important crude drugs used in KMs. Part of its pharmaceutical properties is due to the flavone glycoside baicalin (BI). A quantitative analysis method based on eastern blotting was developed for BI using an anti-BI MAb. A rapid, simple, sensitive, specific analytical system was subsequently established for BI with the eastern blotting technique using dot-blot and chemiluminescent methods. This system was useful as a high-throughput analytical method for the determination of BI in KMs as well as HPLC and enzyme-linked immunosorbent assay systems. Furthermore, an eastern blotting method was applied to the biological metabolic study of glycyrrhizic acid (GL), the major active constituent of licorice, for investigation of metabolites of GL such as 3-monoglucuronyl-glycyrrhetinic acid (3MGA) because licorice causes pseudoaldosteronism as a side effect. This approach may make it possible to determine the pathogenic agents of licorice-induced pseudoaldosteronism.
Comments on higher rank Wilson loops in N = 2∗
NASA Astrophysics Data System (ADS)
Liu, James T.; Zayas, Leopoldo A. Pando; Zhou, Shan
2018-01-01
For N = 2∗ theory with U(N) gauge group we evaluate expectation values of Wilson loops in representations described by a rectangular Young tableau with n rows and k columns. The evaluation reduces to a two-matrix model and we explain, using a combination of numerical and analytical techniques, the general properties of the eigenvalue distributions in various regimes of the parameters (N, λ, n, k), where λ is the 't Hooft coupling. In the large N limit we present analytic results for the leading and sub-leading contributions. In the particular cases of only one row or one column we reproduce previously known results for the totally symmetric and totally antisymmetric representations. We also extensively discuss the N = 4 limit of the N = 2∗ theory. While establishing these connections we clarify aspects of various orders of limits and how to relax them; we also find it useful to explicitly address details of the genus expansion. As a result, for the totally symmetric Wilson loop we find new contributions that improve the comparison with the dual holographic computation at one-loop order in the appropriate regime.
Analytical ultrasonics for characterization of metallurgical microstructures and transformations
NASA Technical Reports Server (NTRS)
Rosen, M.
1986-01-01
The application of contact (piezoelectric) and noncontact (laser generation and detection) ultrasonic techniques for dynamic investigation of precipitation hardening processes in aluminum alloys, as well as crystallization and phase transformation in rapidly solidified amorphous and microcrystalline alloys is discussed. From the variations of the sound velocity and attenuation the precipitation mechanism and kinetics were determined. In addition, a correlation was established between the observed changes in the velocity and attenuation and the mechanical properties of age-hardenable aluminum alloys. The behavior of the elastic moduli, determined ultrasonically, were found to be sensitive to relaxation, crystallization and phase decomposition phenomena in rapidly solidified metallic glasses. Analytical ultrasonics enables determination of the activation energies and growth parameters of the reactions. Therefrom theoretical models can be constructed to explain the changes in mechanical and physical properties upon heat treatment of glassy alloys. The composition dependence of the elastic moduli in amorphous Cu-Zr alloys was found to be related to the glass transition temperature, and consequently to the glass forming ability of these alloys. Dynamic ultrasonic analysis was found to be feasible for on-line, real-time, monitoring of metallurgical processes.
DNA barcode-based delineation of putative species: efficient start for taxonomic workflows
Kekkonen, Mari; Hebert, Paul D N
2014-01-01
The analysis of DNA barcode sequences with varying techniques for cluster recognition provides an efficient approach for recognizing putative species (operational taxonomic units, OTUs). This approach accelerates and improves taxonomic workflows by exposing cryptic species and decreasing the risk of synonymy. This study tested the congruence of OTUs resulting from the application of three analytical methods (ABGD, BIN, GMYC) to sequence data for Australian hypertrophine moths. OTUs supported by all three approaches were viewed as robust, but 20% of the OTUs were only recognized by one or two of the methods. These OTUs were examined for three criteria to clarify their status. Monophyly and diagnostic nucleotides were both uninformative, but information on ranges was useful as sympatric sister OTUs were viewed as distinct, while allopatric OTUs were merged. This approach revealed 124 OTUs of Hypertrophinae, a more than twofold increase from the currently recognized 51 species. Because this analytical protocol is both fast and repeatable, it provides a valuable tool for establishing a basic understanding of species boundaries that can be validated with subsequent studies. PMID:24479435
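As a hedged illustration of the congruence bookkeeping described above, the sketch below compares hypothetical per-specimen OTU assignments from the three methods (ABGD, BIN, GMYC) and flags specimen pairs on which the methods agree or conflict; the specimen names and assignments are invented, and the delimitation analyses themselves are not reproduced.

```python
from itertools import combinations

# Hypothetical OTU assignments per specimen from three delimitation methods.
assignments = {
    "spec1": {"ABGD": "A1", "BIN": "B1", "GMYC": "G1"},
    "spec2": {"ABGD": "A1", "BIN": "B1", "GMYC": "G1"},
    "spec3": {"ABGD": "A2", "BIN": "B1", "GMYC": "G2"},
    "spec4": {"ABGD": "A3", "BIN": "B2", "GMYC": "G3"},
}

def same_otu(s1, s2, method):
    """True if the given method places both specimens in the same OTU."""
    return assignments[s1][method] == assignments[s2][method]

# A specimen pair is "congruent" if all three methods agree on whether the
# two specimens belong to the same OTU; otherwise it needs further scrutiny.
methods = ["ABGD", "BIN", "GMYC"]
for s1, s2 in combinations(assignments, 2):
    verdicts = [same_otu(s1, s2, m) for m in methods]
    status = "congruent" if len(set(verdicts)) == 1 else "conflicting"
    print(s1, s2, dict(zip(methods, verdicts)), "->", status)
```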
Walorczyk, Stanisław; Drożdżyński, Dariusz; Kowalska, Jolanta; Remlein-Starosta, Dorota; Ziółkowski, Andrzej; Przewoźniak, Monika; Gnusowski, Bogusław
2013-08-15
A sensitive, accurate and reliable multiresidue method based on the application of gas chromatography-tandem quadrupole mass spectrometry (GC-QqQ-MS/MS) has been established for screening, identification and quantification of a large number of pesticide residues in produce. The method was accredited in compliance with the PN-EN ISO/IEC 17025:2005 standard and operated under a flexible scope as method PB-11. The flexible scope of accreditation allowed for minor modifications and extension of the analytical scope while using the same analytical technique. During the years 2007-2010, the method was used for the purpose of verification of organic crop production by multiresidue analysis for the presence of pesticides. A total of 528 samples of differing matrices such as fruits, vegetables, cereals, plant leaves and other green parts were analysed, of which 4.4% contained pesticide residues above the threshold value of 0.01 mg/kg. A total of 20 different pesticide residues were determined in the samples. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
López-Sánchez, M.; Mansilla-Plaza, L.; Sánchez-de-laOrden, M.
2017-10-01
Prior to field scale research, soil samples are analysed on a laboratory scale for electrical resistivity calibrations. Currently, there is a variety of field instruments to estimate the water content in soils using different physical phenomena. These instruments can be used to develop moisture-resistivity relationships on the same soil samples, which assures that measurements are performed on the same material and under the same conditions (e.g., humidity and temperature). A geometric factor, determined by the electrode locations, is applied in order to calculate the apparent electrical resistivity of the laboratory test cells. This geometric factor can be determined in three different ways: by means of an analytical approximation, by laboratory trials (experimental approximation), or by the analysis of a numerical model. The analytical approximation is not appropriate for complex cells or arrays, and both the experimental and numerical approximations can lead to inaccurate results. Therefore, we propose a novel approach to obtain a compromise solution between both techniques, providing a more precise determination of the geometric factor.
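For the simplest case mentioned above, the analytical approximation, the geometric factor of an idealized collinear four-electrode array on a homogeneous half-space has a closed form; the sketch below evaluates it for a Wenner configuration and converts a current/voltage reading to apparent resistivity. The spacing, current, and voltage are illustrative assumptions, and, as the abstract notes, such closed-form factors are not appropriate for complex laboratory cells.

```python
import numpy as np

def geometric_factor(AM, BM, AN, BN):
    """Analytical geometric factor (m) for a four-electrode array on a
    homogeneous half-space: k = 2*pi / (1/AM - 1/BM - 1/AN + 1/BN),
    with distances between current (A, B) and potential (M, N) electrodes."""
    return 2.0 * np.pi / (1.0 / AM - 1.0 / BM - 1.0 / AN + 1.0 / BN)

# Wenner array with spacing a: AM = a, BM = 2a, AN = 2a, BN = a  ->  k = 2*pi*a
a = 0.05  # electrode spacing, m (illustrative)
k = geometric_factor(a, 2 * a, 2 * a, a)

# Apparent resistivity from an injected current and measured potential drop.
I = 0.01    # injected current, A (illustrative)
dV = 0.35   # measured potential difference, V (illustrative)
rho_app = k * dV / I
print(f"k = {k:.4f} m, apparent resistivity = {rho_app:.1f} ohm·m")
```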
Islas, Gabriela; Hernandez, Prisciliano
2017-01-01
To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit the minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027
Theoretical analysis of exponential transversal method of lines for the diffusion equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salazar, A.; Raydan, M.; Campo, A.
1996-12-31
Recently a new approximate technique to solve the diffusion equation was proposed by Campo and Salazar. This new method is inspired by the Method of Lines (MOL), with some insight coming from the method of separation of variables. The proposed method, the Exponential Transversal Method of Lines (ETMOL), utilizes an exponential variation to improve accuracy in the evaluation of the time derivative. Campo and Salazar have implemented this method in a wide range of heat/mass transfer applications and have obtained surprisingly good numerical results. In this paper, the authors study the theoretical properties of ETMOL in depth. In particular, consistency, stability and convergence are established in the framework of the heat/mass diffusion equation. In most practical applications the method presents a very reduced truncation error in time, and its different versions are proven to be unconditionally stable in the Fourier sense. Convergence of the solutions is then established. The theory is corroborated by several analytical/numerical experiments.
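For orientation, the sketch below implements a plain, time-discretized (implicit Euler) treatment of the one-dimensional diffusion equation with a finite-difference Laplacian, i.e. a conventional baseline of the method-of-lines family; ETMOL's exponential refinement of the time derivative is not reproduced here, and the grid, time step, and test problem are assumptions chosen only for illustration.

```python
import numpy as np

# 1D diffusion u_t = alpha * u_xx on [0, 1] with u(0) = u(1) = 0, treated with
# a baseline time-discretized scheme: at each time level solve
# (I - dt*alpha*D2) u^{n+1} = u^n with a finite-difference Laplacian D2.
alpha, nx, dx = 1.0, 51, 1.0 / 50
dt, nsteps = 1e-3, 200

x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)              # initial condition

# Second-derivative matrix for the interior nodes (Dirichlet boundaries).
D2 = (np.diag(-2.0 * np.ones(nx - 2))
      + np.diag(np.ones(nx - 3), 1)
      + np.diag(np.ones(nx - 3), -1)) / dx**2
A = np.eye(nx - 2) - dt * alpha * D2

for _ in range(nsteps):
    u[1:-1] = np.linalg.solve(A, u[1:-1])   # implicit Euler step

# Compare against the exact solution u = exp(-pi^2 * alpha * t) * sin(pi * x).
exact = np.exp(-np.pi**2 * alpha * dt * nsteps) * np.sin(np.pi * x)
print("max error:", np.abs(u - exact).max())
```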
Light stable isotope analysis of meteorites by ion microprobe
NASA Technical Reports Server (NTRS)
Mcsween, Harry Y., Jr.
1994-01-01
The main goal was to develop the necessary secondary ion mass spectrometer (SIMS) techniques to use a Cameca ims-4f ion microprobe to measure light stable isotope ratios (H, C, O and S) in situ and in non-conducting mineral phases. The intended application of these techniques was the analysis of meteorite samples, although the techniques that have been developed are equally applicable to the investigation of terrestrial samples. The first year established techniques for the analysis of O isotope ratios (delta O-18 and delta O-17) in conducting mineral phases and the measurement of S isotope ratios (delta S-34) in a variety of sulphide phases. In addition, a technique was developed to measure delta S-34 values in sulphates, which are insulators. Other research undertaken in the first year resulted in SIMS techniques for the measurement of a wide variety of trace elements in carbonate minerals, with the aim of understanding the nature of alteration fluids in carbonaceous chondrites. In the second year we developed techniques for analyzing O isotope ratios in nonconducting mineral phases. These methods are potentially applicable to the measurement of other light stable isotopes such as H, C and S in insulators. Also, we further explored the analytical techniques used for the analysis of S isotopes in sulphides by analyzing troilite in a number of L and H ordinary chondrites. This was done to see if there were any systematic differences with petrological type.
NASA Technical Reports Server (NTRS)
Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.
1985-01-01
An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.
NASA Technical Reports Server (NTRS)
Su, Ching-Hua
2015-01-01
A low gravity material experiment will be performed in the Material Science Research Rack (MSRR) on the International Space Station (ISS). The flight experiment will conduct crystal growths of ZnSe and related ternary compounds, such as ZnSeS and ZnSeTe, by physical vapor transport (PVT). The main objective of the project is to determine the relative contributions of gravity-driven fluid flows to the compositional distribution, incorporation of impurities and defects, and deviation from stoichiometry observed in the grown crystals as a result of buoyancy-driven convection and growth interface fluctuations caused by irregular fluid flows on Earth. The investigation consists of extensive ground-based experimental and theoretical research efforts and concurrent flight experimentation. The objectives of the ground-based studies are to (1) obtain the experimental data and conduct the analyses required to define the optimum growth parameters for the flight experiments, (2) perfect various characterization techniques to establish the standard procedure for material characterization, (3) quantitatively establish the characteristics of the crystals grown on Earth as a basis for subsequent comparative evaluations of the crystals grown in a low-gravity environment and (4) develop theoretical and analytical methods required for such evaluations. ZnSe and related ternary compounds have been grown by the vapor transport technique with real-time, in situ, non-invasive monitoring techniques. The grown crystals have been characterized extensively by various techniques to correlate the grown crystal properties with the growth conditions. This talk will focus on the ground-based studies on the PVT crystal growth of ZnSe and related ternary compounds, especially the effects of different growth orientations related to gravity direction on the grown crystals.
Crystal Growth of Ternary Compound Semiconductors in Low Gravity Environment
NASA Technical Reports Server (NTRS)
Su, Ching-Hua
2014-01-01
A low gravity material experiment will be performed in the Material Science Research Rack (MSRR) on the International Space Station (ISS). There are two sections of the flight experiment: (I) crystal growth of ZnSe and related ternary compounds, such as ZnSeS and ZnSeTe, by physical vapor transport (PVT) and (II) melt growth of CdZnTe by directional solidification. The main objective of the project is to determine the relative contributions of gravity-driven fluid flows to the compositional distribution, incorporation of impurities and defects, and deviation from stoichiometry observed in the grown crystals as a result of buoyancy-driven convection and growth interface fluctuations caused by irregular fluid flows on Earth. The investigation consists of extensive ground-based experimental and theoretical research efforts and concurrent flight experimentation. This talk will focus on the ground-based studies on the PVT crystal growth of ZnSe and related ternary compounds. The objectives of the ground-based studies are to (1) obtain the experimental data and conduct the analyses required to define the optimum growth parameters for the flight experiments, (2) perfect various characterization techniques to establish the standard procedure for material characterization, (3) quantitatively establish the characteristics of the crystals grown on Earth as a basis for subsequent comparative evaluations of the crystals grown in a low-gravity environment and (4) develop theoretical and analytical methods required for such evaluations. ZnSe and related ternary compounds have been grown by the vapor transport technique with real-time, in situ, non-invasive monitoring techniques. The grown crystals have been characterized extensively by various techniques to correlate the grown crystal properties with the growth conditions.
Aronsson, T; Bjørnstad, P; Leskinen, E; Uldall, A; de Verdier, C H
1984-01-01
The aim of this investigation was primarily to assess analytical quality expressed as between-laboratory, within-laboratory, and total imprecision, not in order to detect laboratories with poor performance, but in the positive sense to provide data for improving critical steps in analytical methodology. The aim was also to establish the present state of the art in comparison with earlier investigations to see if improvement in analytical quality could be observed.
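As a hedged illustration of how between-laboratory, within-laboratory, and total imprecision might be estimated from such a survey, the sketch below applies a one-way random-effects ANOVA to a small, balanced, invented data set (laboratories by replicates); the layout and values are assumptions, not data from the investigation.

```python
import numpy as np

# Hypothetical balanced design: each row is one laboratory, each column one
# replicate measurement of the same control material (illustrative values).
data = np.array([
    [5.02, 5.05, 4.98],
    [5.10, 5.12, 5.08],
    [4.95, 4.93, 4.97],
    [5.01, 5.04, 5.00],
])
L, n = data.shape

lab_means = data.mean(axis=1)
ms_within = data.var(axis=1, ddof=1).mean()           # within-lab mean square
ms_between = n * lab_means.var(ddof=1)                # between-lab mean square

s2_within = ms_within
s2_between = max((ms_between - ms_within) / n, 0.0)   # random-effects component
s2_total = s2_within + s2_between

grand_mean = data.mean()
print("within-lab CV %: ", 100 * np.sqrt(s2_within) / grand_mean)
print("between-lab CV %:", 100 * np.sqrt(s2_between) / grand_mean)
print("total CV %:      ", 100 * np.sqrt(s2_total) / grand_mean)
```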
Safety and Suitability for Service Assessment Testing for Surface and Underwater Launched Munitions
2014-12-05
test efficiency that tend to associate the Analytical S3 Test Approach with large, complex munition systems and the Empirical S3 Test Approach with...the smaller, less complex munition systems. 8.1 ANALYTICAL S3 TEST APPROACH. The Analytical S3 test approach, as shown in Figure 3, evaluates...assets than the Analytical S3 Test approach to establish the safety margin of the system. This approach is generally applicable to small munitions
NASA Astrophysics Data System (ADS)
Nakadate, Hiromichi; Sekizuka, Eiichi; Minamitani, Haruyuki
We aimed to study the validity of a new analytical approach that reflected the phase from platelet activation to the formation of small platelet aggregates. We hoped that this new approach would enable us to use the particle-counting method with laser-light scattering to measure platelet aggregation in healthy controls and in diabetic patients without complications. We measured agonist-induced platelet aggregation for 10 min. Agonist was added to the platelet-rich plasma 1 min after measurement started. We compared the total scattered light intensity from small aggregates over a 10-min period (established analytical approach) and that over a 2-min period from 1 to 3 min after measurement started (new analytical approach). Consequently platelet aggregation in diabetics with HbA1c ≥ 6.5% was significantly greater than in healthy controls by both analytical approaches. However, platelet aggregation in diabetics with HbA1c < 6.5%, i.e. patients in the early stages of diabetes, was significantly greater than in healthy controls only by the new analytical approach, not by the established analytical approach. These results suggest that platelet aggregation as detected by the particle-counting method using laser-light scattering could be applied in clinical examinations by our new analytical approach.
Olivieri, Alejandro C
2005-08-01
Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.
Culture-Sensitive Functional Analytic Psychotherapy
ERIC Educational Resources Information Center
Vandenberghe, L.
2008-01-01
Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…
NASA Astrophysics Data System (ADS)
Bescond, Marc; Li, Changsheng; Mera, Hector; Cavassilas, Nicolas; Lannoo, Michel
2013-10-01
We present a one-shot current-conserving approach to model the influence of electron-phonon scattering in nano-transistors using the non-equilibrium Green's function formalism. The approach is based on the lowest order approximation (LOA) to the current and its simplest analytic continuation (LOA+AC). By means of a scaling argument, we show how both LOA and LOA+AC can be easily obtained from the first iteration of the usual self-consistent Born approximation (SCBA) algorithm. Both LOA and LOA+AC are then applied to model n-type silicon nanowire field-effect-transistors and are compared to SCBA current characteristics. In this system, the LOA fails to describe electron-phonon scattering, mainly because of the interactions with acoustic phonons at the band edges. In contrast, the LOA+AC still well approximates the SCBA current characteristics, thus demonstrating the power of analytic continuation techniques. The limits of validity of LOA+AC are also discussed, and more sophisticated and general analytic continuation techniques are suggested for more demanding cases.
Hahn, Seung-yong; Ahn, Min Cheol; Bobrov, Emanuel Saul; Bascuñán, Juan; Iwasa, Yukikazu
2010-01-01
This paper addresses adverse effects of dimensional uncertainties of an HTS insert assembled with double-pancake coils on spatial field homogeneity. Each DP coil was wound with Bi2223 tapes having dimensional tolerances more than an order of magnitude larger than those accepted for LTS wires used in conventional NMR magnets. The paper presents: 1) dimensional variations measured in two LTS/HTS NMR magnets, 350 MHz (LH350) and 700 MHz (LH700), both built and operated at the Francis Bitter Magnet Laboratory; and 2) an analytical technique and its application to elucidate the field impurities measured with the two LTS/HTS magnets. Field impurities computed with the analytical model and those measured with the two LTS/HTS magnets agree quite well, demonstrating that this analytical technique is applicable to the design of a DP-assembled HTS insert with improved field homogeneity for a high-field LTS/HTS NMR magnet. PMID:20407595
Shih, Tsung-Ting; Hsu, I-Hsiang; Chen, Shun-Niang; Chen, Ping-Hung; Deng, Ming-Jay; Chen, Yu; Lin, Yang-Wei; Sun, Yuh-Chang
2015-01-21
We employed a polymeric material, poly(methyl methacrylate) (PMMA), for fabricating a microdevice and then implanted the chlorine (Cl)-containing solid-phase extraction (SPE) functionality into the PMMA chip to develop an innovative on-chip dipole-assisted SPE technique. Instead of the ion-ion interactions utilized in on-chip SPE techniques, the dipole-ion interactions between the highly electronegative C-Cl moieties in the channel interior and the positively charged metal ions were employed to facilitate the on-chip SPE procedures. Furthermore, to avoid labor-intensive manual manipulation, a programmable valve manifold was designed as an interface combining the dipole-assisted SPE microchip and inductively coupled plasma-mass spectrometry (ICP-MS) to achieve the fully automated operation. Under the optimized operation conditions for the established system, the detection limits for each analyte ion were obtained based on three times the standard deviation of seven measurements of the blank eluent solution. The limits ranged from 3.48 to 20.68 ng L⁻¹, suggesting that this technique appears uniquely suited for determining the levels of heavy metal ions in natural water. Indeed, a series of validation procedures demonstrated that the developed method could be satisfactorily applied to the determination of trace heavy metals in natural water. Remarkably, the developed device was durable enough to be reused more than 160 times without any loss in its analytical performance. To the best of our knowledge, this is the first study reporting on the combination of a dipole-assisted SPE microchip and elemental analysis instrument for the online determination of trace heavy metal ions.
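A minimal sketch of the detection-limit criterion quoted above (three times the standard deviation of seven blank measurements, converted to concentration through a calibration slope) is given below; the counts, the slope, and the unit conversion are illustrative assumptions, not the paper's numbers.

```python
import numpy as np

# Seven replicate ICP-MS readings of the blank eluent (counts per second,
# illustrative values) and an external calibration slope (cps per ng/L).
blank_counts = np.array([120.0, 118.5, 121.2, 119.8, 120.6, 118.9, 121.0])
calibration_slope = 15.0  # cps per ng/L, assumed from a standard curve

sigma_blank = blank_counts.std(ddof=1)
lod_signal = 3.0 * sigma_blank                 # detection limit in signal units
lod_conc = lod_signal / calibration_slope      # converted to concentration

print(f"blank SD = {sigma_blank:.2f} cps, LOD ≈ {lod_conc:.2f} ng/L")
```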
NASA Astrophysics Data System (ADS)
Patel, Dhananjay; Singh, Vinay Kumar; Dalal, U. D.
2016-07-01
This work addresses analytical and numerical investigations of the transmission performance of an optical Single Sideband (SSB) modulation technique generated by a Mach Zehnder Modulator (MZM) with 90° and 120° hybrid couplers. It takes into account the problem of chromatic dispersion in single mode fibers in Passive Optical Networks (PON), which severely degrades the performance of the system. Considering the transmission length of the fiber, the SSB modulation generated by maintaining a phase shift of π/2 between the two electrodes of the MZM provides better receiver sensitivity. However, the power of higher-order harmonics generated due to the nonlinearity of the MZM is directly proportional to the modulation index, making the SSB look like a quasi-double sideband (DSB) and causing power fading due to chromatic dispersion. To eliminate one of the second-order harmonics, the SSB signal based on an MZM with a 120° hybrid coupler is simulated. An analytical model of conventional SSB using 90° and 120° hybrid couplers is established. The latter suppresses the unwanted (upper/lower) first-order and second-order (lower/upper) sidebands. For the analysis, a varying quadrature amplitude modulation (QAM) Orthogonal Frequency Division Multiplexing (OFDM) signal with a data rate of 5 Gb/s is upconverted using both of the SSB techniques and is transmitted over a distance of 75 km in Single Mode Fiber (SMF). The simulation results show that the SSB with the 120° hybrid coupler proves to be more immune to chromatic dispersion than the conventional SSB technique. This is consistent with the theoretical analysis presented in the article.
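As an illustrative, hedged sketch of the conventional 90° SSB scheme the abstract analyzes, the complex-envelope simulation below drives a dual-drive MZM with a π/2 phase offset between electrodes and a quadrature bias, then inspects the FFT of the output: one first-order sideband is suppressed while both second-order harmonics survive, which is the quasi-DSB behaviour the article discusses. All parameters (RF frequency, modulation index, sampling) are assumptions for illustration, not the article's values, and the 120° variant is not modelled here.

```python
import numpy as np

fm = 10e9            # RF modulating frequency, Hz (illustrative)
m = 0.6              # modulation index (illustrative)
fs = 32 * fm         # sampling rate for the complex envelope
N = 4096
t = np.arange(N) / fs

# Dual-drive MZM, complex baseband envelope of the optical field:
# one arm driven by m*cos(wm*t), the other by m*cos(wm*t - 90 deg) plus a
# 90 deg optical bias (quadrature point) -> conventional SSB generation.
drive1 = m * np.cos(2 * np.pi * fm * t)
drive2 = m * np.cos(2 * np.pi * fm * t - np.pi / 2)
E = 0.5 * (np.exp(1j * drive1) + np.exp(1j * (drive2 + np.pi / 2)))

spectrum = np.fft.fftshift(np.fft.fft(E) / N)
freqs = np.fft.fftshift(np.fft.fftfreq(N, 1 / fs))

def power_dB(f_target):
    """Magnitude (dB) of the spectral line nearest the requested offset."""
    idx = np.argmin(np.abs(freqs - f_target))
    return 20 * np.log10(np.abs(spectrum[idx]) + 1e-15)

for label, f in [("carrier", 0), ("+1", fm), ("-1", -fm), ("+2", 2 * fm), ("-2", -2 * fm)]:
    print(f"{label:>7s} line: {power_dB(f):7.1f} dB")
```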
Xiao, Xiaoyin; Miller, Lance L; Parchert, Kylea J; Hayes, Dulce; Hochrein, James M
2016-07-15
From allergies to plant reproduction, pollens have important impacts on the health of human and plant populations, yet identification of pollen grains remains difficult and time-consuming. Low-volatility flavonoids generated from pollens cannot be easily characterized and quantified with current analytical techniques. Here we present the novel use of atmospheric solids analysis probe mass spectrometry (ASAP-MS) for the characterization of flavonoids in pollens. Flavonoid patterns were generated for pollens collected from different plant types (trees and bushes) in addition to bee pollens from distinct geographic regions. Standard flavonoids (kaempferol and rhamnazin) and those produced from pollens were compared and assessed with ASAP-MS using low-energy collision MS/MS. Results for a semi-quantitative method for assessing the amount of a flavonoid in pollens are also presented. Flavonoid patterns for pollen samples were distinct with variability in the number and relative abundance of flavonoids in each sample. Pollens contained 2-5 flavonoids, and all but Kochia scoparia contained kaempferol or kaempferol isomers. We establish this method as a reliable and applicable technique for analyzing low-volatility compounds with minimal sample preparation. Standard curves were generated using 0.2-5 μg of kaempferol; from these experiments, it was estimated that there is approximately 2 mg of kaempferol present in 1 g of P. nigra italica pollen. Pollens can be characterized with a simple flavonoid pattern rather than analyzing the whole product pattern or the products-temperature profiles. ASAP-MS is a rapid analytical technique that can be used to distinguish between plant pollens and between bee pollens originating from different regions. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Takahashi, Tomoko; Thornton, Blair
2017-12-01
This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS) and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions where calibration curves are applicable to quantification of compositions of solid samples and their limitations are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and Saha equation, has been applied in a number of studies, requirements need to be satisfied for the calculation of chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract related information to compositions from all spectral data, are widely established methods and have been applied to various fields including in-situ applications in air and for planetary explorations. Artificial neural networks (ANNs), where non-linear effects can be modelled, have also been investigated as a quantitative method and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that the accuracy should be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square errors (NRMSEs), when comparing the accuracy obtained from different setups and analytical methods.
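As a hedged illustration of the multivariate route the review describes, the sketch below fits a partial least squares (PLS) calibration to synthetic "spectra" and reports the normalized root mean square error (NRMSE); the spectra, concentrations, number of latent variables, and the choice of normalizing by the reference range are all assumptions for demonstration, not taken from any LIBS dataset.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "spectra": 200 samples x 500 channels, where the emission line of
# an element scales linearly with its concentration plus background noise.
n_samples, n_channels = 200, 500
conc = rng.uniform(0.0, 10.0, n_samples)                 # wt.% (illustrative)
peak = np.exp(-0.5 * ((np.arange(n_channels) - 250) / 5.0) ** 2)
spectra = (conc[:, None] * peak[None, :]
           + 0.5 * rng.standard_normal((n_samples, n_channels)))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, conc, random_state=0)

pls = PLSRegression(n_components=3)
pls.fit(X_tr, y_tr)
y_pred = pls.predict(X_te).ravel()

rmse = np.sqrt(np.mean((y_pred - y_te) ** 2))
nrmse = rmse / (y_te.max() - y_te.min())   # normalized by the reference range
print(f"RMSE = {rmse:.3f} wt.%, NRMSE = {nrmse:.3%}")
```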
Kamruzzaman, Mohammed; Sun, Da-Wen; ElMasry, Gamal; Allen, Paul
2013-01-15
Many studies have been carried out in developing non-destructive technologies for predicting meat adulteration, but there has been no attempt at non-destructive detection and quantification of adulteration in minced lamb meat. The main goal of this study was to develop and optimize a rapid analytical technique based on near-infrared (NIR) hyperspectral imaging to detect the level of adulteration in minced lamb. An initial investigation was carried out using principal component analysis (PCA) to identify the most likely adulterant in minced lamb. Minced lamb meat samples were then adulterated with minced pork in the range 2-40% (w/w) at approximately 2% increments. Spectral data were used to develop a partial least squares regression (PLSR) model to predict the level of adulteration in minced lamb. A good prediction model was obtained using the whole spectral range (910-1700 nm) with a coefficient of determination (R(2)(cv)) of 0.99 and root-mean-square errors estimated by cross validation (RMSECV) of 1.37%. Four important wavelengths (940, 1067, 1144 and 1217 nm) were selected using weighted regression coefficients (Bw) and a multiple linear regression (MLR) model was then established using these important wavelengths to predict adulteration. The MLR model resulted in a coefficient of determination (R(2)(cv)) of 0.98 and RMSECV of 1.45%. The developed MLR model was then applied to each pixel in the image to obtain prediction maps to visualize the distribution of adulteration of the tested samples. The results demonstrated that laborious and time-consuming traditional analytical techniques could be replaced by spectral data in order to provide a rapid, low-cost and non-destructive testing technique for adulterant detection in minced lamb meat. Copyright © 2012 Elsevier B.V. All rights reserved.
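The sketch below is a hedged illustration of the final modelling step described above: a multiple linear regression restricted to a few pre-selected wavelength bands, evaluated by cross-validation. The synthetic spectra, the band indices standing in for the selected wavelengths (940, 1067, 1144, 1217 nm), and the adulteration levels are invented for demonstration; the wavelength-selection step itself (weighted regression coefficients) is not reproduced.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict, KFold

rng = np.random.default_rng(1)

# Synthetic hyperspectral data: 100 minced-meat samples x 237 bands, with the
# adulteration level (% w/w) driving reflectance at a few informative bands.
n_samples, n_bands = 100, 237
adulteration = rng.uniform(2.0, 40.0, n_samples)
spectra = 0.3 + 0.002 * rng.standard_normal((n_samples, n_bands))
informative = [20, 60, 110, 150]        # stand-ins for the selected wavelengths
for j, w in zip(informative, [0.004, -0.003, 0.002, 0.005]):
    spectra[:, j] += w * adulteration

X = spectra[:, informative]             # MLR uses only the selected bands
model = LinearRegression()
y_cv = cross_val_predict(model, X, adulteration,
                         cv=KFold(10, shuffle=True, random_state=1))

rmsecv = np.sqrt(np.mean((y_cv - adulteration) ** 2))
r2 = 1.0 - (np.sum((y_cv - adulteration) ** 2)
            / np.sum((adulteration - adulteration.mean()) ** 2))
print(f"R2(cv) = {r2:.3f}, RMSECV = {rmsecv:.2f} %")
```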
Methods for geochemical analysis
Baedecker, Philip A.
1987-01-01
The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.
Exhaled breath condensate – from an analytical point of view
Dodig, Slavica; Čepelak, Ivana
2013-01-01
Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process of EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations relating to the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous: concentrations of EBC constituents are low, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected, and EBC analysis can hopefully be used in clinical practice, both in diagnosis and in the longitudinal follow-up of patients, resulting in better disease outcomes. PMID:24266297
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Rui, E-mail: ryang73@ustc.edu; Gudipati, Murthy S., E-mail: gudipati@jpl.nasa.gov
2014-03-14
In this work, we report for the first time successful analysis of organic aromatic analytes embedded in D₂O ices by novel infrared (IR) laser ablation of a layered non-absorbing D₂O ice (spectator) containing the analytes and an ablation-active, IR-absorbing H₂O ice layer (actor) without the analyte. With these studies we have opened up a new method for the in situ analysis of solids containing analytes when covered with an IR laser-absorbing layer that can be resonantly ablated. This soft ejection method takes advantage of the tunability of two-step infrared laser ablation and ultraviolet laser ionization mass spectrometry, previously demonstrated in this lab to study chemical reactions of polycyclic aromatic hydrocarbons (PAHs) in cryogenic ices. The IR laser pulse, tuned to resonantly excite only the upper H₂O ice layer (actor), generates a shockwave upon impact. This shockwave penetrates the lower analyte-containing D₂O ice layer (spectator, a non-absorbing ice that cannot be ablated directly with the wavelength of the IR laser employed) and is reflected back, ejecting the contents of the D₂O layer into the vacuum, where they are intersected by a UV laser for ionization and detection by a time-of-flight mass spectrometer. Thus, energy is transmitted from the laser-absorbing actor layer into the non-absorbing spectator layer, resulting in its ablation. We found that isotope cross-contamination between layers was negligible. We also did not see any evidence for thermal or collisional chemistry of PAH molecules with H₂O molecules in the shockwave. We call this shockwave-mediated, surface-resonance-enhanced subsurface ablation technique "two-step laser ablation and ionization mass spectrometry of actor-spectator ice layers." This method has its roots in the well-established MALDI (matrix-assisted laser desorption and ionization) method. Our method offers more flexibility to optimize both processes: ablation and ionization. This new technique can thus be potentially employed to undertake in situ analysis of materials embedded in diverse media, such as cryogenic ices, biological samples, tissues, and minerals, by covering them with an IR-absorbing laser ablation medium, and to study the chemical composition and reaction pathways of the analyte in its natural surroundings.
NASA Astrophysics Data System (ADS)
Rappleye, Devin Spencer
The development of electroanalytical techniques in multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has been limited to single-analyte mixtures with a few exceptions. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even with an additional signal from another analyte, correlating signals to concentration and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.).
An analytical and experimental evaluation of the plano-cylindrical Fresnel lens solar concentrator
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Allums, S. L.; Cosby, R. M.
1976-01-01
Plastic Fresnel lenses for solar concentration are attractive because of their potential for low-cost mass production. An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range is reported. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based primarily on a 56 cm-wide lens with f-number 1.0. A sun-tracking heliostat provided a non-moving solar source. Measured data indicated more spreading at the profile base than analytically predicted. The measured and computed transmittances were 85 and 87%, respectively. Preliminary testing with a second lens (1.85 m) indicated that modified manufacturing techniques corrected the profile spreading problem.
2016-04-01
Disclaimer: The views expressed in this academic research paper are those of the author and do not reflect the official policy or position of the US... [Figure 2: Proposed MAT Rating Badges] ...establishing unit level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and
Adequacy of surface analytical tools for studying the tribology of ceramics
NASA Technical Reports Server (NTRS)
Sliney, H. E.
1986-01-01
Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.
ERIC Educational Resources Information Center
Arbaugh, J. B.; Hwang, Alvin
2013-01-01
Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…
Bidell, Markus P
2017-01-01
These three studies provide initial evidence for the development, factor structure, reliability, and validity of the Lesbian, Gay, Bisexual, and Transgender Development of Clinical Skills Scale (LGBT-DOCSS), a new interdisciplinary LGBT clinical self-assessment for health and mental health providers. Research participants were voluntarily recruited in the United States and United Kingdom and included trainees, clinicians, and educators from applied psychology, counseling, psychotherapy, and primary care medicine. Study 1 (N = 602) used exploratory and confirmatory factor analytic techniques, revealing an 18-item three-factor structure (Clinical Preparedness, Attitudinal Awareness, and Basic Knowledge). Study 2 established internal consistency for the overall LGBT-DOCSS (α = .86) and for each of the three subscales (Clinical Preparedness = .88, Attitudinal Awareness = .80, and Basic Knowledge = .83) and 2-week test-retest reliability (.87). In study 3 (N = 564), participant criteria (sexual orientation and education level) and four established scales that measured LGBT prejudice, assessment skills, and social desirability were used to support initial content and discriminant validity. Psychometric properties, limitations, and recommendations are discussed.
Analytical challenges for conducting rapid metabolism characterization for QIVIVE.
Tolonen, Ari; Pelkonen, Olavi
2015-06-05
For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is the tool of choice for the majority of organic, relatively lipophilic molecules, linked with an LC separation tool and simultaneous UV detection. However, additional techniques such as gas chromatography, radiometric measurements and NMR are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberts, Kenneth Paul
Capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) are widely used analytical separation techniques with many applications in chemical, biochemical, and biomedical sciences. Conventional analyte identification in these techniques is based on retention/migration times of standards, requiring a high degree of reproducibility, availability of reliable standards, and absence of coelution. Consequently, several new information-rich detection methods (also known as hyphenated techniques) are being explored that would be capable of providing unambiguous on-line identification of separating analytes in CE and HPLC. As further discussed, a number of such on-line detection methods have shown considerable success, including Raman, nuclear magnetic resonance (NMR), mass spectrometry (MS), and fluorescence line-narrowing spectroscopy (FLNS). In this thesis, the feasibility and potential of combining the highly sensitive and selective laser-based detection method of FLNS with analytical separation techniques are discussed and presented. A summary of previously demonstrated FLNS detection interfaced with chromatography and electrophoresis is given, and recent results from on-line FLNS detection in CE (CE-FLNS) and the new combination of HPLC-FLNS are shown.
Gómez-Caravaca, Ana M; Maggio, Rubén M; Cerretani, Lorenzo
2016-03-24
Today, virgin and extra-virgin olive oils (VOO and EVOO) are foods subject to a large number of analytical tests designed to ensure their quality and genuineness. Almost all official methods demand high use of reagents and manpower. Because of that, analytical development in this area is continuously evolving. Therefore, this review focuses on analytical methods for EVOO/VOO that use fast and smart approaches based on chemometric techniques in order to reduce time of analysis, reagent consumption, high-cost equipment and manpower. Experimental approaches of chemometrics coupled with fast analytical techniques such as UV-Vis spectroscopy, fluorescence, vibrational spectroscopies (NIR, MIR and Raman), NMR spectroscopy, and other more complex techniques like chromatography, calorimetry and electrochemical techniques applied to EVOO/VOO production and analysis are discussed throughout this work. The advantages and drawbacks of this association have also been highlighted. Chemometrics has been shown to be a powerful tool for the oil industry. In fact, it has been shown how chemometrics can be implemented along the different steps of EVOO/VOO production: raw material input control, monitoring during processing, and quality control of the final product. Copyright © 2016 Elsevier B.V. All rights reserved.
Code of Federal Regulations, 2010 CFR
2010-04-01
... TREASURY LIQUORS BEER Pilot Brewing Plants § 25.271 General. (a) Establishment. A person may establish and operate a pilot brewing plant off the brewery premises for research, analytical, experimental, or developmental purposes relating to beer or brewery operations. Pilot brewing plants will be established as...
Code of Federal Regulations, 2011 CFR
2011-04-01
... TREASURY LIQUORS BEER Pilot Brewing Plants § 25.271 General. (a) Establishment. A person may establish and operate a pilot brewing plant off the brewery premises for research, analytical, experimental, or developmental purposes relating to beer or brewery operations. Pilot brewing plants will be established as...
Noninvasive noble metal nanoparticle arrays for surface-enhanced Raman spectroscopy of proteins
NASA Astrophysics Data System (ADS)
Inya-Agha, Obianuju; Forster, Robert J.; Keyes, Tia E.
2007-02-01
Noble metal nanoparticle arrays are well-established substrates for surface-enhanced Raman spectroscopy (SERS). Their ability to enhance optical fields is based on the interaction of their surface valence electrons with incident electromagnetic radiation. In the array configuration, noble metal nanoparticles have been used to produce SER spectral enhancements of up to eight orders of magnitude (10⁸), making them useful for the trace analysis of physiologically relevant analytes such as proteins and peptides. Electrostatic interactions between proteins and metal surfaces result in the preferential adsorption of positively charged protein domains onto metal surfaces. This preferential interaction has the effect of disrupting the native conformation of the protein fold, with a concomitant loss of protein function. A major historic advantage of Raman microspectroscopy has been its non-invasive nature; protein denaturation on the metal surfaces required for SER spectroscopy renders it a much more invasive technique. Further, part of the analytical power of Raman spectroscopy lies in its use as a secondary conformation probe. The loss of protein structure which occurs on the metal surface results in secondary conformation readings which are not true to the actual native state of the analyte. This work presents a method for the chemical fabrication of noble metal SERS arrays with surface-immobilized layers which can protect protein native conformation without excessively mitigating the electromagnetic enhancements of spectra. Peptide analytes are used as model systems for proteins. Raman spectra of alpha lactalbumin on surfaces and when immobilized on these novel arrays are compared. We discuss the ability of the surface layer to protect protein structure whilst improving signal intensity.
Singular Hopf bifurcation in a differential equation with large state-dependent delay
Kozyreff, G.; Erneux, T.
2014-01-01
We study the onset of sustained oscillations in a classical state-dependent delay (SDD) differential equation inspired by control theory. Owing to the large delays considered, the Hopf bifurcation is singular and the oscillations rapidly acquire a sawtooth profile past the instability threshold. Using asymptotic techniques, we explicitly capture the gradual change from nearly sinusoidal to sawtooth oscillations. The dependence of the delay on the solution can be either linear or nonlinear, with at least quadratic dependence. In the former case, an asymptotic connection is made with the Rayleigh oscillator. In the latter, van der Pol’s equation is derived for the small-amplitude oscillations. SDD differential equations are currently the subject of intense research in order to establish or amend general theorems valid for constant-delay differential equation, but explicit analytical construction of solutions are rare. This paper illustrates the use of singular perturbation techniques and the unusual way in which solvability conditions can arise for SDD problems with large delays. PMID:24511255
Singular Hopf bifurcation in a differential equation with large state-dependent delay.
Kozyreff, G; Erneux, T
2014-02-08
We study the onset of sustained oscillations in a classical state-dependent delay (SDD) differential equation inspired by control theory. Owing to the large delays considered, the Hopf bifurcation is singular and the oscillations rapidly acquire a sawtooth profile past the instability threshold. Using asymptotic techniques, we explicitly capture the gradual change from nearly sinusoidal to sawtooth oscillations. The dependence of the delay on the solution can be either linear or nonlinear, with at least quadratic dependence. In the former case, an asymptotic connection is made with the Rayleigh oscillator. In the latter, van der Pol's equation is derived for the small-amplitude oscillations. SDD differential equations are currently the subject of intense research in order to establish or amend general theorems valid for constant-delay differential equation, but explicit analytical construction of solutions are rare. This paper illustrates the use of singular perturbation techniques and the unusual way in which solvability conditions can arise for SDD problems with large delays.
Demonstration of automated proximity and docking technologies
NASA Astrophysics Data System (ADS)
Anderson, Robert L.; Tsugawa, Roy K.; Bryan, Thomas C.
An autodock was demonstrated using straightforward techniques and real sensor hardware. A simulation testbed was established and validated. The sensor design was refined with improved optical performance and image-processing noise mitigation techniques, and the sensor is ready for production from off-the-shelf components. The autonomous spacecraft architecture is defined. The areas of sensors, docking hardware, propulsion, and avionics are included in the design. The Guidance, Navigation and Control architecture and requirements are developed. Modular structures suitable for automated control are used. The spacecraft system manager functions, including configuration, resource, and redundancy management, are defined. The requirements for an autonomous spacecraft executive are defined. High-level decision-making, mission planning, and mission contingency recovery are a part of this. The next step is to perform flight demonstrations. After the presentation, the following question was asked: How do you define validation? There are two components to the definition of validation: software simulation with formal and rigorous validation, and hardware and facility performance validated with respect to software already validated against an analytical profile.
Yazdi, Mohammad; Korhan, Orhan; Daneshvar, Sahand
2018-05-09
This study aimed at establishing fault tree analysis (FTA) using expert opinion to compute the probability of an event. To find the probability of the top event (TE), the probabilities of all the basic events (BEs) should be available when the FTA is drawn. When such failure data are not available, expert judgment can be used as an alternative. The fuzzy analytical hierarchy process, as a standard technique, is used to give a specific weight to each expert, and fuzzy set theory is used to aggregate the expert opinions. In this way, the probabilities of the BEs are computed and, consequently, the probability of the TE is obtained using Boolean algebra. Additionally, to reduce the probability of the TE in terms of three parameters (safety consequences, cost and benefit), the importance measurement technique and modified TOPSIS were employed. The effectiveness of the proposed approach is demonstrated with a real-life case study.
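To make the Boolean-algebra step concrete, the sketch below propagates basic-event probabilities (for example, values defuzzified from aggregated expert opinions) through AND/OR gates under the usual independence assumption; the tree structure and the probabilities are invented for illustration and are not the case study from the paper.

```python
from functools import reduce

def and_gate(probs):
    """P(all events occur) for independent events."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """P(at least one event occurs) for independent events."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical basic-event probabilities, e.g. defuzzified from aggregated
# expert opinions (illustrative values only).
BE = {"BE1": 0.02, "BE2": 0.05, "BE3": 0.01, "BE4": 0.03}

# Invented tree: TE = OR( AND(BE1, BE2), OR(BE3, BE4) )
intermediate1 = and_gate([BE["BE1"], BE["BE2"]])
intermediate2 = or_gate([BE["BE3"], BE["BE4"]])
p_top = or_gate([intermediate1, intermediate2])

print(f"P(intermediate AND gate) = {intermediate1:.4f}")
print(f"P(top event)             = {p_top:.4f}")
```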
Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oesterling, Patrick; Heine, Christian; Weber, Gunther H.
2012-05-04
Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.
Protein assay structured on paper by using lithography
NASA Astrophysics Data System (ADS)
Wilhelm, E.; Nargang, T. M.; Al Bitar, W.; Waterkotte, B.; Rapp, B. E.
2015-03-01
There are two main challenges in producing a robust, paper-based analytical device. The first is to create a hydrophobic barrier which, unlike the commonly used wax barriers, does not break if the paper is bent. The second is the creation of the (bio-)specific sensing layer, for which proteins have to be immobilized without diminishing their activity. We solve both problems using light-based fabrication methods that enable fast, efficient manufacturing of paper-based analytical devices. The first technique relies on silanization, by which we create a flexible hydrophobic barrier made of dimethoxydimethylsilane. The second technique demonstrated within this paper uses photobleaching to immobilize proteins by means of maskless projection lithography. Both techniques have been tested on a classical lithography setup using printed toner masks and on a system for maskless lithography. Using these setups we could demonstrate that the proposed manufacturing techniques can be carried out at low cost. The resolution of the paper-based analytical devices obtained with static masks was lower due to the lower mask resolution. Better results were obtained using advanced lithography equipment. In doing so we demonstrated that our technique enables fabrication of effective hydrophobic boundary layers with a thickness of only 342 μm. Furthermore, we showed that fluorescein-5-biotin can be immobilized on the non-structured paper and be employed for the detection of streptavidin-alkaline phosphatase. By carrying out this assay on a paper-based analytical device which had been structured using the silanization technique, we proved the biological compatibility of the suggested patterning technique.
Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C
2016-01-28
Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of organic persistent pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also commented in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Gardiner, Derek J.
1980-01-01
Reviews mainly quantitative analytical applications in the field of Raman spectrometry. Includes references to other reviews, new and analytically untested techniques, and novel sampling and instrument designs. Cites 184 references. (CS)
Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C
2013-01-01
The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, while mass spectrometry-based analytical techniques have limitations for this application, newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.
The HVT technique and the 'uncertainty' relation for central potentials
NASA Astrophysics Data System (ADS)
Grypeos, M. E.; Koutroulos, C. G.; Oyewumi, K. J.; Petridou, Th
2004-08-01
The quantum mechanical hypervirial theorems (HVT) technique is used to treat the so-called 'uncertainty' relation for quite a general class of central potential wells, including the (reduced) Poeschl-Teller and the Gaussian one. It is shown that this technique is quite suitable in deriving an approximate analytic expression in the form of a truncated power series expansion for the dimensionless product $P_{nl} \equiv \langle r^{2}\rangle_{nl}\langle p^{2}\rangle_{nl}/\hbar^{2}$, for every (deeply) bound state of a particle moving non-relativistically in the well, provided that a (dimensionless) parameter s is sufficiently small. Attention is also paid to a number of cases, among the limited existing ones, in which exact analytic or semi-analytic expressions for $P_{nl}$ can be derived. Finally, numerical results are given and discussed.
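For readability, the dimensionless product treated by the HVT expansion above can be written in display form. The coefficients c_k below are schematic placeholders for the potential-dependent terms of the truncated series; they are not given in the abstract:

```latex
P_{nl} \;\equiv\; \frac{\langle r^{2}\rangle_{nl}\,\langle p^{2}\rangle_{nl}}{\hbar^{2}}
\;\approx\; \sum_{k=0}^{K} c_{k}(n,l)\, s^{k}, \qquad s \ll 1 .
```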
Klee, Sonja; Derpmann, Valerie; Wißdorf, Walter; Klopotowski, Sebastian; Kersten, Hendrik; Brockmann, Klaus J; Benter, Thorsten; Albrecht, Sascha; Bruins, Andries P; Dousty, Faezeh; Kauppila, Tiina J; Kostiainen, Risto; O'Brien, Rob; Robb, Damon B; Syage, Jack A
2014-08-01
It is well documented since the early days of the development of atmospheric pressure ionization methods, which operate in the gas phase, that cluster ions are ubiquitous. This holds true for atmospheric pressure chemical ionization, as well as for more recent techniques, such as atmospheric pressure photoionization, direct analysis in real time, and many more. In fact, it is well established that cluster ions are the primary carriers of the net charge generated. Nevertheless, cluster ion chemistry has only been sporadically included in the numerous proposed ionization mechanisms leading to charged target analytes, which are often protonated molecules. This paper series, consisting of two parts, attempts to highlight the role of cluster ion chemistry with regard to the generation of analyte ions. In addition, the impact of the changing reaction matrix and the non-thermal collisions of ions en route from the atmospheric pressure ion source to the high vacuum analyzer region are discussed. This work addresses such issues as extent of protonation versus deuteration, the extent of analyte fragmentation, as well as highly variable ionization efficiencies, among others. In Part 1, the nature of the reagent ion generation is examined, as well as the extent of thermodynamic versus kinetic control of the resulting ion population entering the analyzer region.
Theanponkrang, Somjai; Suginta, Wipa; Weingart, Helge; Winterhalter, Mathias; Schulte, Albert
2015-01-01
A new automated pharmacoanalytical technique for convenient quantification of redox-active antibiotics has been established by combining the benefits of a carbon nanotube (CNT) sensor modification with electrocatalytic activity for analyte detection with the merits of a robotic electrochemical device that is capable of sequential nonmanual sample measurements in 24-well microtiter plates. Norfloxacin (NFX) and ciprofloxacin (CFX), two standard fluoroquinolone antibiotics, were used in automated calibration measurements by differential pulse voltammetry (DPV), and linear ranges of 1–10 μM and 2–100 μM were obtained for NFX and CFX, respectively. The lowest detectable levels were estimated to be 0.3±0.1 μM (n=7) for NFX and 1.6±0.1 μM (n=7) for CFX. In standard solutions or tablet samples of known content, both analytes could be quantified with the robotic DPV microtiter plate assay, with recoveries within ±4% of 100%. Recoveries were equally good when NFX was evaluated in human serum samples with added NFX. The use of simple instrumentation, convenience in execution, and high effectiveness in analyte quantitation suggest the merger of automated microtiter plate voltammetry with CNT-supported electrochemical drug detection as a novel methodology for antibiotic testing in pharmaceutical and clinical research and quality control laboratories. PMID:25670899
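The abstract reports calibration ranges, detection limits, and recoveries; a minimal sketch of how such figures are typically derived from DPV peak currents is given below. All numbers are hypothetical, and the 3·s(blank)/slope detection-limit convention is an assumption, not something stated in the abstract.

```python
import numpy as np

# Hypothetical DPV peak currents (µA) for norfloxacin calibration standards (µM),
# chosen to lie within the 1-10 µM linear range quoted in the abstract.
conc = np.array([1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
peak = np.array([0.21, 0.40, 0.83, 1.19, 1.62, 2.01])   # illustrative responses

# Least-squares calibration line: i_peak = slope * c + intercept.
slope, intercept = np.polyfit(conc, peak, 1)

# One common detection-limit convention (assumption): 3 x SD of blank / slope.
blank_sd = 0.02                        # hypothetical blank standard deviation (µA)
lod = 3.0 * blank_sd / slope

# Recovery check for a tablet sample of nominally known content.
nominal = 5.0                          # µM expected after dilution
measured = (1.03 - intercept) / slope  # back-calculated from its peak current
recovery_pct = 100.0 * measured / nominal

print(f"slope={slope:.3f} µA/µM, LOD≈{lod:.2f} µM, recovery={recovery_pct:.1f}%")
```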
NASA Astrophysics Data System (ADS)
Chen, CHAI; Yiik Diew, WONG
2017-02-01
This study provides an integrated strategy, encompassing microscopic simulation, safety assessment, and multi-attribute decision-making, to optimize traffic performance at downstream merging area of signalized intersections. A Fuzzy Cellular Automata (FCA) model is developed to replicate microscopic movement and merging behavior. Based on simulation experiment, the proposed FCA approach is able to provide capacity and safety evaluation of different traffic scenarios. The results are then evaluated through data envelopment analysis (DEA) and analytic hierarchy process (AHP). Optimized geometric layout and control strategies are then suggested for various traffic conditions. An optimal lane-drop distance that is dependent on traffic volume and speed limit can thus be established at the downstream merging area.
López-Vilariño, J M; Fernández-Martínez, G; Turnes-Carou, I; Muinategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D
2003-06-01
Behavior and contents of fluorine and chlorine in coal feedstock, combustion wastes (slag and fly ash) and emissions were studied in five conventional coal fired power plants and in a fluidized bed coal power plant. The halide levels found in the coals used were quite low. Mass balances and emission factors were calculated. The volatility of these elements makes the gaseous emission the main target among the residues. The influence of combustion parameters is not clearly established. Several analytical techniques (ion selective electrodes, capillary electrophoresis and ion chromatography) were employed to determine the halide concentrations in the different samples taken at the power plants studied (coal, slag, fly ash and flue gases).
Characterizing Covalently Sidewall-Functionalized SWCNTs by using 1H NMR Spectroscopy
Nelson, Donna J.; Kumar, Ravi
2013-01-01
Obtaining unambiguous evidence for covalent sidewall functionalization of single-walled carbon nanotubes (SWCNTs) has been a difficult task, especially for nanomaterials in which slight differences in functionality structure produce significant changes in molecular characteristics. Nuclear magnetic resonance (NMR) spectroscopy provides clear information about the structural skeleton of molecules attached to SWCNTs. In order to establish the generality of proton NMR as an analytical technique for characterizing covalently functionalized SWCNTs, we have obtained and analyzed proton NMR data of SWCNT-substituted benzenes across a variety of para substituents. Trends obtained for differences in proton NMR chemical shifts and the impact of o-, p-, and m-directing effects of electrophilic aromatic substituents on phenyl groups covalently bonded to SWCNTs are discussed. PMID:24009779
A TEM analysis of nanoparticulates in a Polar ice core
DOE Office of Scientific and Technical Information (OSTI.GOV)
Esquivel, E.V.; Murr, L.E
2004-03-15
This paper explores the prospect for analyzing nanoparticulates in age-dated ice cores representing times in antiquity to establish a historical reference for atmospheric particulate regimes. Analytical transmission electron microscope (TEM) techniques were utilized to observe representative ice-melt water drops dried down on carbon/formvar or similar coated grids. A 10,000-year-old Greenland ice core was melted, and representative water drops were transferred to coated grids in a clean room environment. Essentially, all particulates observed were aggregates and either crystalline or complex mixtures of nanocrystals. Especially notable was the observation of carbon nanotubes and related fullerene-like nanocrystal forms. These observations are similar to some aspects of contemporary airborne particulates, including carbon nanotubes and complex nanocrystal aggregates.
Lightweight engine containment. [Kevlar shielding
NASA Technical Reports Server (NTRS)
Weaver, A. T.
1977-01-01
Kevlar fabric styles and weaves were studied, as well as methods of application for advanced gas turbine engines. The Kevlar material was subjected to high speed impacts by simple projectiles fired from a rifle, as well as more complex shapes such as fan blades released from gas turbine rotors in a spin pit. Just-contained data were developed for a variety of weave and/or application techniques, and a comparative containment weight efficiency was established for Kevlar containment applications. The data generated during these tests are being incorporated into an analytical design system so that blade containment trade-off studies between Kevlar and metal case engine structures can be made. Laboratory tests and engine environment tests were performed to determine the survivability of Kevlar in a gas turbine environment.
Environmental exposure effects on composite materials for commercial aircraft
NASA Technical Reports Server (NTRS)
Gibbons, M. N.
1982-01-01
The database of composite material properties, as affected by the environments encountered in operating conditions both in flight and at ground terminals, is expanded. Absorbed moisture degrades the mechanical properties of graphite/epoxy laminates at elevated temperatures. Since airplane components are frequently exposed to atmospheric moisture, rain, and accumulated water, quantitative data are required to evaluate the amount of fluids absorbed under various environmental conditions and the subsequent effects on material properties. In addition, accelerated laboratory test techniques are developed that are reliably capable of predicting long-term behavior. An accelerated environmental exposure testing procedure is developed, and experimental results are correlated and compared with analytical results to establish the level of confidence for predicting composite material properties.
CIMAROSTI, HELENA; HENLEY, JEREMY M.
2012-01-01
It is well established that brain ischemia can cause neuronal death via different signaling cascades. The relative importance and interrelationships between these pathways, however, remain poorly understood. Here is presented an overview of studies using oxygen-glucose deprivation of organotypic hippocampal slice cultures to investigate the molecular mechanisms involved in ischemia. The culturing techniques, setup of the oxygen-glucose deprivation model, and analytical tools are reviewed. The authors focus on SUMOylation, a posttranslational protein modification recently implicated in ischemia by whole-animal studies, as an example of how these powerful tools can be applied and of how they could be of interest for investigating the molecular pathways underlying ischemic cell death. PMID:19029060
7 CFR 90.2 - General terms defined.
Code of Federal Regulations, 2011 CFR
2011-01-01
... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...
Analytical Applications of NMR: Summer Symposium on Analytical Chemistry.
ERIC Educational Resources Information Center
Borman, Stuart A.
1982-01-01
Highlights a symposium on analytical applications of nuclear magnetic resonance spectroscopy (NMR), discussing pulse Fourier transformation technique, two-dimensional NMR, solid state NMR, and multinuclear NMR. Includes description of ORACLE, an NMR data processing system at Syracuse University using real-time color graphics, and algorithms for…
Microgenetic Learning Analytics Methods: Workshop Report
ERIC Educational Resources Information Center
Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin
2016-01-01
Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…
Predictive modeling of complications.
Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P
2016-09-01
Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.
Active Control of Inlet Noise on the JT15D Turbofan Engine
NASA Technical Reports Server (NTRS)
Smith, Jerome P.; Hutcheson, Florence V.; Burdisso, Ricardo A.; Fuller, Chris R.
1999-01-01
This report presents the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the year from November 1997 to December 1998 on the Active Noise Control of Turbofan Engines research project funded by NASA Langley Research Center. The concept of implementing active noise control techniques with fuselage-mounted error sensors is investigated both analytically and experimentally. The analytical part of the project involves the continued development of an advanced modeling technique to provide prediction and design guidelines for application of active noise control techniques to large, realistic high bypass engines of the type on which active control methods are expected to be applied. Results from the advanced analytical model are presented that show the effectiveness of the control strategies, and the analytical results presented for fuselage error sensors show good agreement with the experimentally observed results and provide additional insight into the control phenomena. Additional analytical results are presented for active noise control used in conjunction with a wavenumber sensing technique. The experimental work is carried out on a running JT15D turbofan jet engine in a test stand at Virginia Tech. The control strategy used in these tests was the feedforward Filtered-X LMS algorithm. The control inputs were supplied by single and multiple circumferential arrays of acoustic sources equipped with neodymium iron cobalt magnets mounted upstream of the fan. The reference signal was obtained from an inlet mounted eddy current probe. The error signals were obtained from a number of pressure transducers flush-mounted in a simulated fuselage section mounted in the engine test cell. The active control methods are investigated when implemented with the control sources embedded within the acoustically absorptive material on a passively-lined inlet. The experimental results show that the combination of active control techniques with fuselage-mounted error sensors and passive control techniques is an effective means of reducing radiated noise from turbofan engines. Strategic selection of the location of the error transducers is shown to be effective for reducing the radiation towards particular directions in the farfield. An analytical model is used to predict the behavior of the control system and to guide the experimental design configurations, and the analytical results presented show good agreement with the experimentally observed results.
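The control strategy named in this abstract is the feedforward Filtered-X LMS algorithm. The sketch below is a generic single-channel illustration of that algorithm with synthetic signals and hypothetical primary/secondary path filters; it is not the multichannel engine controller described in the report.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic tonal reference (e.g., from an inlet-mounted sensor) plus noise.
n = 4000
x = np.sin(2 * np.pi * 0.05 * np.arange(n)) + 0.05 * rng.standard_normal(n)

S = np.array([0.6, 0.3, 0.1])        # hypothetical secondary path (source -> error sensor)
S_hat = S.copy()                     # its identified model (assumed perfect here)
P = np.array([0.0, 0.8, 0.4, 0.2])   # hypothetical primary path (disturbance -> error sensor)

L, mu = 16, 0.01                     # controller length and LMS step size
w = np.zeros(L)                      # adaptive FIR control filter
xbuf = np.zeros(L)                   # reference history for the controller
fxbuf = np.zeros(L)                  # filtered-reference history for the update
ybuf = np.zeros(len(S))              # control-output history for the secondary path
dbuf = np.zeros(len(P))              # reference history for the primary path
sxbuf = np.zeros(len(S_hat))         # reference history for the secondary-path model
err = np.zeros(n)

for k in range(n):
    xbuf = np.roll(xbuf, 1); xbuf[0] = x[k]
    dbuf = np.roll(dbuf, 1); dbuf[0] = x[k]
    d = P @ dbuf                     # disturbance arriving at the error sensor

    y = w @ xbuf                     # anti-noise command
    ybuf = np.roll(ybuf, 1); ybuf[0] = y
    e = d + S @ ybuf                 # residual measured at the error sensor
    err[k] = e

    # "Filtered-X" step: pass the reference through the secondary-path model.
    sxbuf = np.roll(sxbuf, 1); sxbuf[0] = x[k]
    fxbuf = np.roll(fxbuf, 1); fxbuf[0] = S_hat @ sxbuf

    w -= mu * e * fxbuf              # LMS weight update

print(f"mean |e|: first 200 = {np.abs(err[:200]).mean():.3f}, "
      f"last 200 = {np.abs(err[-200:]).mean():.3f}")
```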
LC-MS based screening and targeted profiling methods for complex plant: coffee a case study.
da Rosa, Jeane Santos; Freitas-Silva, Otniel; Pacheco, Sidney; de Oliveira Godoy, Ronoel Luiz; de Rezende, Claudia Moraes
2012-11-01
In recent years, thinking about human health has become inseparable from thinking about human food. Recent discoveries concern not only valuable biomolecules but also contaminants. Thus, the screening of substances in animal and vegetable matrices by analytical techniques focuses on the presence or absence of target substances. In both cases, the majority of these substances are present as traces or at very low levels. Contaminants may be naturally present in the food, introduced into it, or even formed in it as a consequence of food processing or cooking. Pesticides, mycotoxins, dioxins, acrylamide, Sudan red, melamine and now 4(5)-methylimidazole can, at present, be listed as some of the world's big problems related to food contaminants and adulterants. With the development of liquid chromatography coupled to mass spectrometry (LC-MS-MS) in the last few decades, analysis of some food contaminants at trace levels has become less laborious, more accurate and more precise. The multi-analyte capability of these techniques makes it possible to obtain many results in a single run. On the other hand, the European Union (2002/657/EC) established regulations for analytical methods that use mass spectrometry as the detection tool, showing the importance of this technique in food quality control. The EU criteria use identification points (IPs) that can be achieved basically with four product ions (including the molecular ion), or with fewer ions when high-resolution equipment is used. This kind of mass spectrometer makes the IP criteria more accessible, as exact mass information is a differentiating tool. In view of this, the aim of this review is to present the current scenario for mass spectrometry analysis of a complex vegetable food matrix such as roasted coffee, with emphasis on the needs and challenges regarding the LC-MS technique in order to meet and contribute to food safety standards in this complex matrix.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.
1990-08-01
This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been base line corrected. This ratio chromatogram provides a robust means for the identification and the quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and is used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated by using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
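A minimal sketch of the point-by-point ratio idea described above is given below, using synthetic chromatograms. The inverse-variance weighting shown (weights proportional to the squared denominator signal) is one plausible reading of "variance-weighted average", not necessarily the authors' exact implementation.

```python
import numpy as np

t = np.linspace(0, 10, 1000)

def peak(center, height, width=0.25):
    return height * np.exp(-0.5 * ((t - center) / width) ** 2)

rng = np.random.default_rng(1)
noise = lambda: 0.005 * rng.standard_normal(t.size)

# Two sequential, baseline-corrected chromatograms of the same analyte;
# the second injection is 1.5x more concentrated.
chrom1 = peak(4.0, 1.0) + noise()
chrom2 = peak(4.0, 1.5) + noise()

ratio = chrom2 / chrom1                  # point-by-point ratio chromatogram

# Use only the region of pure analyte elution, weighting each point by an
# approximate inverse variance (assumption: noise dominated by the denominator).
region = (t > 3.5) & (t < 4.5)
weights = chrom1[region] ** 2            # higher signal -> lower relative variance
ratio_value = np.average(ratio[region], weights=weights)

print(f"estimated concentration ratio ≈ {ratio_value:.3f} (true value 1.5)")
```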
Alternatives to current flow cytometry data analysis for clinical and research studies.
Gondhalekar, Carmen; Rajwa, Bartek; Patsekin, Valery; Ragheb, Kathy; Sturgis, Jennifer; Robinson, J Paul
2018-02-01
Flow cytometry has well-established methods for data analysis based on traditional data collection techniques. These techniques typically involved manual insertion of tube samples into an instrument that, historically, could only measure 1-3 colors. The field has since evolved to incorporate new technologies for faster and highly automated sample preparation and data collection. For example, the use of microwell plates on benchtop instruments is now a standard on virtually every new instrument, and so users can easily accumulate multiple data sets quickly. Further, because the user must carefully define the layout of the plate, this information is already defined when considering the analytical process, expanding the opportunities for automated analysis. Advances in multi-parametric data collection, as demonstrated by the development of hyperspectral flow-cytometry, 20-40 color polychromatic flow cytometry, and mass cytometry (CyTOF), are game-changing. As data and assay complexity increase, so too does the complexity of data analysis. Complex data analysis is already a challenge to traditional flow cytometry software. New methods for reviewing large and complex data sets can provide rapid insight into processes difficult to define without more advanced analytical tools. In settings such as clinical labs where rapid and accurate data analysis is a priority, rapid, efficient and intuitive software is needed. This paper outlines opportunities for analysis of complex data sets using examples of multiplexed bead-based assays, drug screens and cell cycle analysis - any of which could become integrated into the clinical environment. Copyright © 2017. Published by Elsevier Inc.
Nelson, Michael A; Bedner, Mary; Lang, Brian E; Toman, Blaza; Lippa, Katrice A
2015-11-01
Given the critical role of pure, organic compound primary reference standards used to characterize and certify chemical Certified Reference Materials (CRMs), it is essential that associated mass purity assessments be fit-for-purpose, represented by an appropriate uncertainty interval, and metrologically sound. The mass fraction purities (% g/g) of 25-hydroxyvitamin D (25(OH)D) reference standards used to produce and certify values for clinical vitamin D metabolite CRMs were investigated by multiple orthogonal quantitative measurement techniques. Quantitative ¹H-nuclear magnetic resonance spectroscopy (qNMR) was performed to establish traceability of these materials to the International System of Units (SI) and to directly assess the principal analyte species. The 25(OH)D standards contained volatile and water impurities, as well as structurally-related impurities that are difficult to observe by chromatographic methods or to distinguish from the principal 25(OH)D species by one-dimensional NMR. These impurities have the potential to introduce significant biases to purity investigations in which a limited number of measurands are quantified. Combining complementary information from multiple analytical methods, using both direct and indirect measurement techniques, enabled mitigation of these biases. Purities of 25(OH)D reference standards and associated uncertainties were determined using frequentist and Bayesian statistical models to combine data acquired via qNMR, liquid chromatography with UV absorbance and atmospheric pressure chemical ionization mass spectrometric detection (LC-UV, LC-APCI-MS), thermogravimetric analysis (TGA), and Karl Fischer (KF) titration.
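The abstract describes combining qNMR, LC-UV/MS, TGA, and KF results with frequentist and Bayesian models; the exact models are not given. As a minimal illustration of one simple frequentist combination, the sketch below takes hypothetical, independent purity estimates from different routes and pools them by inverse-variance weighting; all numbers are illustrative only.

```python
import numpy as np

# Hypothetical mass-fraction purity estimates (% g/g) and standard uncertainties
# from different measurement routes (e.g., qNMR; mass balance = 100% - impurities; LC-UV).
estimates = np.array([97.8, 98.3, 98.0])
u = np.array([0.4, 0.5, 0.6])

# Inverse-variance weighted mean: a simple way to combine independent results.
w = 1.0 / u**2
purity = np.sum(w * estimates) / np.sum(w)
u_combined = np.sqrt(1.0 / np.sum(w))

print(f"combined purity ≈ {purity:.2f} % g/g, standard uncertainty ≈ {u_combined:.2f} % g/g")
```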
Discourse-Centric Learning Analytics: Mapping the Terrain
ERIC Educational Resources Information Center
Knight, Simon; Littleton, Karen
2015-01-01
There is an increasing interest in developing learning analytic techniques for the analysis, and support of, high-quality learning discourse. This paper maps the terrain of discourse-centric learning analytics (DCLA), outlining the distinctive contribution of DCLA and outlining a definition for the field moving forwards. It is our claim that DCLA…
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...
Analyzing Matrices of Meta-Analytic Correlations: Current Practices and Recommendations
ERIC Educational Resources Information Center
Sheng, Zitong; Kong, Wenmo; Cortina, Jose M.; Hou, Shuofei
2016-01-01
Researchers have become increasingly interested in conducting analyses on meta-analytic correlation matrices. Methodologists have provided guidance and recommended practices for the application of this technique. The purpose of this article is to review current practices regarding analyzing meta-analytic correlation matrices, to identify the gaps…
Techniques for sensing methanol concentration in aqueous environments
NASA Technical Reports Server (NTRS)
Narayanan, Sekharipuram R. (Inventor); Chun, William (Inventor); Valdez, Thomas I. (Inventor)
2001-01-01
An analyte concentration sensor that is capable of fast and reliable sensing of analyte concentration in aqueous environments with high concentrations of the analyte. Preferably, the present invention is a methanol concentration sensor device coupled to a fuel metering control system for use in a liquid direct-feed fuel cell.
DOT National Transportation Integrated Search
2016-12-25
The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...
INVESTIGATING ENVIRONMENTAL SINKS OF MACROLIDE ANTIBIOTICS WITH ANALYTICAL CHEMISTRY
Possible environmental sinks (wastewater effluents, biosolids, sediments) of macrolide antibiotics (i.e., azithromycin, roxithromycin and clarithromycin) are investigated using state-of-the-art analytical chemistry techniques.
Wang, Pei; Yu, Zhiguo
2015-10-01
Near infrared (NIR) spectroscopy as a rapid and nondestructive analytical technique, integrated with chemometrics, is a powerful process analytical tool for the pharmaceutical industry and is becoming an attractive complementary technique for herbal medicine analysis. This review mainly focuses on the recent applications of NIR spectroscopy in species authentication of herbal medicines and their geographical origin discrimination.
Reference Intervals of Hematology and Clinical Chemistry Analytes for 1-Year-Old Korean Children
Lee, Hye Ryun; Roh, Eun Youn; Chang, Ju Young
2016-01-01
Background: Reference intervals need to be established according to age. We established reference intervals of hematology and chemistry from community-based healthy 1-yr-old children and analyzed their iron status according to the feeding methods during the first six months after birth. Methods: A total of 887 children who received a medical check-up between 2010 and 2014 at Boramae Hospital (Seoul, Korea) were enrolled. A total of 534 children (247 boys and 287 girls) were enrolled as reference individuals after the exclusion of data obtained from children with suspected iron deficiency. Hematology and clinical chemistry analytes were measured, and the reference value of each analyte was estimated by using parametric (mean±2 SD) or nonparametric methods (2.5-97.5th percentile). Iron, total iron-binding capacity, and ferritin were measured, and transferrin saturation was calculated. Results: As there were no differences in the mean values between boys and girls, we established the reference intervals for 1-yr-old children regardless of sex. The analysis of serum iron status according to feeding methods during the first six months revealed higher iron, ferritin, and transferrin saturation levels in children exclusively or mainly fed formula than in children exclusively or mainly fed breast milk. Conclusions: We established reference intervals of hematology and clinical chemistry analytes from community-based healthy children at one year of age. These reference intervals will be useful for interpreting results of medical check-ups at one year of age. PMID:27374715
Reference Intervals of Hematology and Clinical Chemistry Analytes for 1-Year-Old Korean Children.
Lee, Hye Ryun; Shin, Sue; Yoon, Jong Hyun; Roh, Eun Youn; Chang, Ju Young
2016-09-01
Reference intervals need to be established according to age. We established reference intervals of hematology and chemistry from community-based healthy 1-yr-old children and analyzed their iron status according to the feeding methods during the first six months after birth. A total of 887 children who received a medical check-up between 2010 and 2014 at Boramae Hospital (Seoul, Korea) were enrolled. A total of 534 children (247 boys and 287 girls) were enrolled as reference individuals after the exclusion of data obtained from children with suspected iron deficiency. Hematology and clinical chemistry analytes were measured, and the reference value of each analyte was estimated by using parametric (mean±2 SD) or nonparametric methods (2.5-97.5th percentile). Iron, total iron-binding capacity, and ferritin were measured, and transferrin saturation was calculated. As there were no differences in the mean values between boys and girls, we established the reference intervals for 1-yr-old children regardless of sex. The analysis of serum iron status according to feeding methods during the first six months revealed higher iron, ferritin, and transferrin saturation levels in children exclusively or mainly fed formula than in children exclusively or mainly fed breast milk. We established reference intervals of hematology and clinical chemistry analytes from community-based healthy children at one year of age. These reference intervals will be useful for interpreting results of medical check-ups at one year of age.
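Both records above describe estimating reference intervals either parametrically (mean ± 2 SD) or nonparametrically (2.5th-97.5th percentile). A minimal sketch of the two calculations on synthetic data for a hypothetical analyte:

```python
import numpy as np

rng = np.random.default_rng(2)
hemoglobin = rng.normal(12.0, 1.0, 534)   # hypothetical g/dL values for 534 children

# Parametric reference interval: mean ± 2 SD.
mean, sd = hemoglobin.mean(), hemoglobin.std(ddof=1)
lo_p, hi_p = mean - 2 * sd, mean + 2 * sd

# Nonparametric reference interval: central 95% (2.5th-97.5th percentiles).
lo_np, hi_np = np.percentile(hemoglobin, [2.5, 97.5])

print(f"parametric: {lo_p:.1f}-{hi_p:.1f} g/dL, nonparametric: {lo_np:.1f}-{hi_np:.1f} g/dL")
```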
Fazlollahtabar, Hamed
2010-12-01
Consumer expectations for automobile seat comfort continue to rise. With this said, it is evident that the current automobile seat comfort development process, which is only sporadically successful, needs to change. In this context, there has been growing recognition of the need to establish theoretical and methodological foundations for automobile seat comfort. On the other hand, seat producers need to know the comfort that customers require so that they can produce seats based on those interests. Current research methodologies apply qualitative approaches because of the anthropometric specifications involved. The most significant weakness of these approaches is the inexactness of the extracted inferences. Despite the qualitative nature of consumers' preferences, there are methods to transform the qualitative parameters into numerical values, which could help seat producers improve or enhance their products. Nonetheless, this approach would help the automobile manufacturer to source its seats from the best producer according to consumers' preferences. In this paper, a heuristic multi-criteria decision-making technique is applied to express consumer preferences as numerical values. This technique is a combination of the Analytic Hierarchy Process (AHP), the Entropy method, and the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). A case study is conducted to illustrate the applicability and the effectiveness of the proposed heuristic approach. Copyright © 2010 Elsevier Ltd. All rights reserved.
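The abstract combines AHP, entropy weighting, and TOPSIS. The sketch below shows only the entropy-weighted TOPSIS ranking step on a hypothetical decision matrix (the AHP pairwise-comparison weighting is omitted for brevity); criteria, scores, and producers are invented for illustration.

```python
import numpy as np

# Hypothetical decision matrix: rows = candidate seat producers, columns = criteria
# (all benefit-type here: comfort score, durability, delivery rating).
X = np.array([[7.0, 8.0, 6.0],
              [9.0, 6.0, 7.0],
              [6.0, 9.0, 8.0]])

# Entropy weights: criteria that vary more across alternatives receive more weight.
P = X / X.sum(axis=0)
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)
weights = (1 - entropy) / (1 - entropy).sum()

# TOPSIS: weighted, vector-normalised matrix, then distances to ideal/anti-ideal points.
V = weights * X / np.linalg.norm(X, axis=0)
ideal, anti = V.max(axis=0), V.min(axis=0)
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)

print("closeness to ideal:", np.round(closeness, 3),
      "-> best producer index:", int(closeness.argmax()))
```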
Terry, Jonathan G; Schmüser, Ilka; Underwood, Ian; Corrigan, Damion K; Freeman, Neville J; Bunting, Andrew S; Mount, Andrew R; Walton, Anthony J
2013-12-01
A novel technique for the production of nanoscale electrode arrays that uses standard microfabrication processes and micron-scale photolithography is reported here in detail. These microsquare nanoband edge electrode (MNEE) arrays have been fabricated with highly reproducible control of the key array dimensions, including the size and pitch of the individual elements and, most importantly, the width of the nanoband electrodes. The definition of lateral features to nanoscale dimensions typically requires expensive patterning techniques that are complex and low-throughput. However, the fabrication methodology used here relies on the fact that vertical dimensions (i.e. layer thicknesses) have long been manufacturable at the nanoscale using thin film deposition techniques that are well established in mainstream microelectronics. The authors report for the first time two aspects that highlight the particular suitability of these MNEE array systems for probe monolayer biosensing. The first is simulation, which shows the enhanced sensitivity to the redox reaction of the solution redox couple. The second is the enhancement of probe film functionalisation observed for the probe film model molecule, 6-mercapto-1-hexanol compared with microsquare electrodes. Such surface modification for specific probe layer biosensing and detection is of significance for a wide range of biomedical and other sensing and analytical applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wright, R.S.; Kong, E.J.; Bahner, M.A.
The paper discusses several projects to measure hydrocarbon emissions associated with the manufacture of fiberglass-reinforced plastics. The main purpose of the projects was to evaluate pollution prevention techniques to reduce emissions by altering raw materials, application equipment, and operator technique. Analytical techniques were developed to reduce the cost of these emission measurements. Emissions from a small test mold in a temporary total enclosure (TTE) correlated with emissions from full-size production molds in a separate TTE. Gravimetric mass balance measurements inside the TTE generally agreed to within +/-30% with total hydrocarbon (THC) measurements in the TTE exhaust duct.
The Coordinate Orthogonality Check (corthog)
NASA Astrophysics Data System (ADS)
Avitabile, P.; Pechinsky, F.
1998-05-01
A new technique referred to as the coordinate orthogonality check (CORTHOG) helps to identify how each physical degree of freedom contributes to the overall orthogonality relationship between analytical and experimental modal vectors on a mass-weighted basis. Using the CORTHOG technique together with the pseudo-orthogonality check (POC) clarifies where potential discrepancies exist between the analytical and experimental modal vectors. CORTHOG improves the understanding of the correlation (or lack of correlation) that exists between modal vectors. The CORTHOG theory is presented along with the evaluation of several cases to show the use of the technique.
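As context for the POC and its per-coordinate decomposition, the sketch below forms a mass-weighted pseudo-orthogonality matrix from synthetic analytical and "experimental" mode shapes, then lists the per-DOF terms of one entry before summation. It assumes a diagonal (lumped) mass matrix for simplicity and does not reproduce the actual CORTHOG scaling.

```python
import numpy as np

rng = np.random.default_rng(3)
ndof, nmodes = 6, 3

M = np.diag(rng.uniform(1.0, 2.0, ndof))          # hypothetical lumped mass matrix
m = np.diag(M)
msqrt = np.sqrt(m)

# Mass-orthonormal "analytical" mode shapes: phi_a.T @ M @ phi_a = I.
Q, _ = np.linalg.qr(msqrt[:, None] * rng.standard_normal((ndof, nmodes)))
phi_a = Q / msqrt[:, None]

# "Experimental" shapes: analytical shapes perturbed by measurement error.
phi_e = phi_a + 0.05 * rng.standard_normal((ndof, nmodes))

# Pseudo-orthogonality check: ideally identity on the diagonal, zero elsewhere.
POC = phi_e.T @ M @ phi_a

# Per-DOF contributions to one POC entry (i, j): the terms of the triple product
# before summation; large or wrong-signed terms flag problem coordinates.
i, j = 0, 1
contributions = phi_e[:, i] * m * phi_a[:, j]     # valid for a diagonal mass matrix
assert np.isclose(contributions.sum(), POC[i, j])

print("POC[0,1] =", round(POC[i, j], 4), "per-DOF terms:", np.round(contributions, 4))
```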
New test techniques and analytical procedures for understanding the behavior of advanced propellers
NASA Technical Reports Server (NTRS)
Stefko, G. L.; Bober, L. J.; Neumann, H. E.
1983-01-01
Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.
Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials
NASA Technical Reports Server (NTRS)
Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.
2004-01-01
A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogues. These simulants from terrestrial (i.e. tephra from Hawaii) or extraterrestrial (meteoritic) samples are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analysis breadth since each technique provides a unique analysis capability for certain classes of organic molecules.
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.
Flexible aircraft dynamic modeling for dynamic analysis and control synthesis
NASA Technical Reports Server (NTRS)
Schmidt, David K.
1989-01-01
The linearization and simplification of a nonlinear, literal model for flexible aircraft is highlighted. Areas of model fidelity that are critical if the model is to be used for control system synthesis are developed and several simplification techniques that can deliver the necessary model fidelity are discussed. These techniques include both numerical and analytical approaches. An analytical approach, based on first-order sensitivity theory is shown to lead not only to excellent numerical results, but also to closed-form analytical expressions for key system dynamic properties such as the pole/zero factors of the vehicle transfer-function matrix. The analytical results are expressed in terms of vehicle mass properties, vibrational characteristics, and rigid-body and aeroelastic stability derivatives, thus leading to the underlying causes for critical dynamic characteristics.
Monteforte, Marianne; Estandarte, Ana K; Chen, Bo; Harder, Ross; Huang, Michael H; Robinson, Ian K
2016-07-01
High-energy X-ray Bragg coherent diffraction imaging (BCDI) is a well established synchrotron-based technique used to quantitatively reconstruct the three-dimensional morphology and strain distribution in nanocrystals. The BCDI technique has become a powerful analytical tool for quantitative investigations of nanocrystals, nanotubes, nanorods and more recently biological systems. BCDI has however typically failed for fine nanocrystals in sub-100 nm size regimes - a size routinely achievable by chemical synthesis - despite the spatial resolution of the BCDI technique being 20-30 nm. The limitations of this technique arise from the movement of nanocrystals under illumination by the highly coherent beam, which prevents full diffraction data sets from being acquired. A solution is provided here to overcome this problem and extend the size limit of the BCDI technique, through the design of a novel stabilization method by embedding the fine nanocrystals into a silica matrix. Chemically synthesized FePt nanocrystals of maximum dimension 20 nm and AuPd nanocrystals in the size range 60-65 nm were investigated with BCDI measurement at beamline 34-ID-C of the APS, Argonne National Laboratory. Novel experimental methodologies to elucidate the presence of strain in fine nanocrystals are a necessary pre-requisite in order to better understand strain profiles in engineered nanocrystals for novel device development.
Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján
2016-02-04
Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for the state of the art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys comprehensively and in two parts the developments of automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operation advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME and their static and dynamic operation modes as well as their automation methodologies is given. The LPME techniques are classified according to the different approaches of protection of the extraction solvent using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent such as membranes and porous media are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.
A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics
NASA Technical Reports Server (NTRS)
Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan
2013-01-01
In the era of petascale computing, more scientific applications are being deployed on leadership scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not been shown to be adequate for many mission critical applications, particularly in the data post-processing stage. One example is that some scientific applications generate datasets composed of a vast number of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Including such dimensional knowledge in data organization can be beneficial to the efficiency of data post-processing, which is often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before storing to the storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR is able to enable efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with a critical climate modeling application, GEOS-5, the experimental results on the Jaguar supercomputer demonstrate an improvement of up to 73 times in read performance compared to the original I/O method.
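The sketch below is not the STAR implementation; it is only a minimal illustration of the aggregation idea the abstract describes: gathering many small, arbitrarily ordered elements into a single array indexed by time and space before writing, so that a query along the time dimension becomes one contiguous slice. The record layout and sizes are hypothetical.

```python
import numpy as np

# Hypothetical stream of small records (time_step, lat_index, lon_index, value),
# arriving in arbitrary order as a simulation writes them out.
rng = np.random.default_rng(4)
nt, nlat, nlon = 8, 4, 5
records = [(t, i, j, float(rng.standard_normal()))
           for t in range(nt) for i in range(nlat) for j in range(nlon)]
rng.shuffle(records)

# Aggregate into a (time, lat, lon) array before writing to storage, so a temporal
# query becomes one contiguous read instead of many scattered seeks.
cube = np.empty((nt, nlat, nlon))
for t, i, j, v in records:
    cube[t, i, j] = v

# "Temporal query": the full time series at one grid point.
series = cube[:, 2, 3]
print(series.shape)   # (8,)
```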
76 FR 52915 - Periodic Reporting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-24
... proposed changes in certain analytical methods used in periodic reporting. The proposed changes are... assignment of certain flat sorting operations; bias in mixed mail tallies; and Express Mail. Establishing... consider changes in the analytical methods approved for use in periodic reporting.\\1\\ \\1\\ Petition of the...
Strategic, Analytic and Operational Domains of Information Management.
ERIC Educational Resources Information Center
Diener, Richard AV
1992-01-01
Discussion of information management focuses on three main areas of activities and their interrelationship: (1) strategic, including establishing frameworks and principles of operations; (2) analytic, or research elements, including user needs assessment, data gathering, and data analysis; and (3) operational activities, including reference…
Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.
2016-09-16
The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21 year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Moreover, exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging applications of analytical techniques for nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.
Conceptual data sampling for breast cancer histology image classification.
Rezk, Eman; Awan, Zainab; Islam, Fahad; Jaoua, Ali; Al Maadeed, Somaya; Zhang, Nan; Das, Gautam; Rajpoot, Nasir
2017-10-01
Data analytics have become increasingly complicated as the amount of data has increased. One technique that is used to enable data analytics on large datasets is data sampling, in which a portion of the data is selected to preserve the data characteristics for use in data analytics. In this paper, we introduce a novel data sampling technique that is rooted in formal concept analysis theory. This technique is used to create samples reliant on the data distribution across a set of binary patterns. The proposed sampling technique is applied to classifying the regions of breast cancer histology images as malignant or benign. The performance of our method is compared to other classical sampling methods. The results indicate that our method is efficient and generates an illustrative sample of small size. It is also competitive with other sampling methods in terms of sample size and sample quality, as represented by classification accuracy and F1 measure. Copyright © 2017 Elsevier Ltd. All rights reserved.
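As a minimal illustration of sampling according to the data distribution across binary patterns (a formal-concept-analysis-flavoured view, not the authors' exact algorithm), the sketch below groups rows by their binary attribute pattern and draws a proportional sample from each group; the feature matrix is synthetic.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(5)

# Hypothetical binary feature matrix: rows = image regions, columns = binary attributes.
X = rng.integers(0, 2, size=(1000, 4))

# Group row indices by their exact binary pattern.
groups = defaultdict(list)
for idx, row in enumerate(X):
    groups[tuple(row)].append(idx)

# Draw a sample that preserves the pattern distribution (proportional allocation),
# keeping every observed pattern represented at least once.
sample_fraction = 0.1
sample = []
for pattern, members in groups.items():
    k = max(1, round(sample_fraction * len(members)))
    sample.extend(rng.choice(members, size=k, replace=False))

print(f"{len(groups)} patterns, sample size {len(sample)} of {len(X)} rows")
```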
NASA Technical Reports Server (NTRS)
Eckel, J. S.; Crabtree, M. S.
1984-01-01
Analytical and subjective techniques that are sensitive to the information transmission and processing requirements of individual communications-related tasks are used to assess workload imposed on the aircrew by A-10 communications requirements for civilian transport category aircraft. Communications-related tasks are defined to consist of the verbal exchanges between crews and controllers. Three workload estimating techniques are proposed. The first, an information theoretic analysis, is used to calculate bit values for perceptual, manual, and verbal demands in each communication task. The second, a paired-comparisons technique, obtains subjective estimates of the information processing and memory requirements for specific messages. By combining the results of the first two techniques, a hybrid analytical scale is created. The third, a subjective rank ordering of sequences of communications tasks, provides an overall scaling of communications workload. Recommendations for future research include an examination of communications-induced workload among the air crew and the development of simulation scenarios.
A Delayed Neutron Counting System for the Analysis of Special Nuclear Materials
NASA Astrophysics Data System (ADS)
Sellers, Madison Theresa
Nuclear forensic analysis is a modern science that uses numerous analytical techniques to identify and attribute nuclear materials in the event of a nuclear explosion, radiological terrorist attack or the interception of illicit nuclear material smuggling. The Canadian Department of National Defence has participated in recent international exercises that have highlighted the Nation's requirement to develop nuclear forensics expertise, protocol and capabilities, specifically pertaining to the analysis of special nuclear materials (SNM). A delayed neutron counting (DNC) system has been designed and established at the Royal Military College of Canada (RMC) to enhance the Government's SNM analysis capabilities. This analytical technique complements those already at RMC by providing a rapid and non-destructive method for the analysis of the fissile isotopes of both uranium (U) and plutonium (Pu). The SLOWPOKE-2 reactor at RMC produces a predominantly thermal neutron flux. These neutrons induce fission in the SNM isotopes 233U, 235U and 239Pu, releasing prompt fast neutrons, energy and radioactive fission fragments. Some of these fission fragments undergo beta-decay and subsequently emit neutrons, which can be recorded by an array of sensitive 3He detectors. The significant time period between the fission process and the release of these neutrons results in their identification as 'delayed neutrons'. The recorded neutron spectrum varies with time, and the count rate curve is unique to each fissile isotope. In-house software, developed by this project, can analyze this delayed neutron curve and provides the fissile mass in the sample. Extensive characterization of the DNC system has been performed with natural U samples with 235U content ranging from 2-7 μg. The system efficiency and dead time behaviour determined by the natural uranium sample analyses were validated by depleted uranium samples with similar quantities of 235U, resulting in a typical relative error of 3.6%. The system has accurately determined 235U content over three orders of magnitude with 235U amounts as low as 10 ng. The results have also been proven to be independent of small variations in total analyte volume and geometry, indicating that it is an ideal technique for the analysis of samples containing SNM in a variety of different matrices. The Analytical Sciences Group at RMC plans to continue DNC system development to include 233U and 239Pu analysis and mixtures of SNM isotopes. Keywords: delayed neutron counting, special nuclear materials, nuclear forensics.
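The thesis abstract describes fitting the measured delayed-neutron count-rate curve to obtain the fissile mass. The sketch below shows one simple way such a fit can work when the curve is modelled as a mass-scaled sum of delayed-neutron group exponentials; the group yields and decay constants are illustrative placeholders of the kind tabulated in the literature, not an endorsed evaluation, and the detector efficiency factor is hypothetical.

```python
import numpy as np

# Illustrative six-group delayed-neutron parameters: (relative yield, decay constant 1/s).
groups = np.array([[0.033, 0.0124], [0.219, 0.0305], [0.196, 0.111],
                   [0.395, 0.301],  [0.115, 1.14],   [0.042, 3.01]])

def count_rate(t, mass, eff=2.0e4):
    """Counts/s after irradiation: mass-scaled sum of group exponential decays."""
    yields, lams = groups[:, 0], groups[:, 1]
    return eff * mass * (yields[None, :] * np.exp(-lams[None, :] * t[:, None])).sum(axis=1)

# Simulate a measured decay curve for a 5 µg sample, with Poisson counting noise.
rng = np.random.default_rng(6)
t = np.linspace(10, 60, 51)                  # seconds after a fixed transfer delay
measured = rng.poisson(count_rate(t, mass=5.0))

# Because the model is linear in mass, the least-squares estimate is a projection
# of the measured curve onto the unit-mass response.
basis = count_rate(t, mass=1.0)
mass_est = (basis @ measured) / (basis @ basis)
print(f"estimated fissile mass ≈ {mass_est:.2f} µg (true 5.00)")
```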
Zhdanov, Michael S [Salt Lake City, UT]
2008-01-29
Mineral exploration needs a reliable method to distinguish between uneconomic mineral deposits and economic mineralization. A method and system includes a geophysical technique for subsurface material characterization, mineral exploration and mineral discrimination. The technique introduced in this invention detects induced polarization effects in electromagnetic data and uses remote geophysical observations to determine the parameters of an effective conductivity relaxation model using a composite analytical multi-phase model of the rock formations. The conductivity relaxation model and analytical model can be used to determine parameters related by analytical expressions to the physical characteristics of the microstructure of the rocks and minerals. These parameters are ultimately used for the discrimination of different components in underground formations, and in this way provide an ability to distinguish between uneconomic mineral deposits and zones of economic mineralization using geophysical remote sensing technology.
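The patent abstract refers to an effective conductivity relaxation model whose parameters are recovered from electromagnetic data. As an illustration only (the patent's composite analytical multi-phase model is not reproduced here), one widely used relaxation form in induced-polarization work is the Cole-Cole resistivity model:

```latex
\rho(\omega) \;=\; \rho_{0}\left[\,1 \;-\; \eta\!\left(1 - \frac{1}{1 + (\mathrm{i}\omega\tau)^{C}}\right)\right],
```

where rho_0 is the DC resistivity, eta the chargeability, tau the relaxation time and C the relaxation exponent; parameters of this general kind are the ones used to discriminate mineralization types.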
A comparison of measured and theoretical predictions for STS ascent and entry sonic booms
NASA Technical Reports Server (NTRS)
Garcia, F., Jr.; Jones, J. H.; Henderson, H. R.
1983-01-01
Sonic boom measurements have been obtained during the flights of STS-1 through 5. During STS-1, 2, and 4, entry sonic boom measurements were obtained, and ascent measurements were made on STS-5. The objectives of this measurement program were (1) to define the sonic boom characteristics of the Space Transportation System (STS), (2) to provide a realistic assessment of the validity of existing theoretical prediction techniques, and (3) to establish a level of confidence for predicting future STS configuration sonic boom environments. Detailed evaluation and reporting of the results of this program are in progress. This paper will address only the significant results, mainly those data obtained during the entry of STS-1 at Edwards Air Force Base (EAFB), and the ascent of STS-5 from Kennedy Space Center (KSC). The theoretical prediction technique employed in this analysis is the so-called Thomas Program. This prediction technique is a semi-empirical method that requires definition of the near-field signatures, detailed trajectory characteristics, and the prevailing meteorological characteristics as input. This analytical procedure then extrapolates the near-field signatures from the flight altitude to an altitude consistent with each measurement location.
Tungsten devices in analytical atomic spectrometry
NASA Astrophysics Data System (ADS)
Hou, Xiandeng; Jones, Bradley T.
2002-04-01
Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.
ERIC Educational Resources Information Center
Ramsey-Klee, Diane M.; Richman, Vivian
The purpose of this research is to develop content analytic techniques capable of extracting the differentiating information in narrative performance evaluations for enlisted personnel in order to aid in the process of selecting personnel for advancement, duty assignment, training, or quality retention. Four tasks were performed. The first task…
Cost and schedule analytical techniques development
NASA Technical Reports Server (NTRS)
1994-01-01
This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.
The analyst's participation in the analytic process.
Levine, H B
1994-08-01
The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.
Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V
2014-01-01
This work presents the results of a study on the applicability of modern analytical transmission electron microscopy methods for detection, identification and visualization of the localization of titanium oxide and cerium oxide nanoparticles in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis of images of the nanoparticles in the cells obtained in the bright-field mode of transmission electron microscopy, in dark-field scanning transmission electron microscopy and in high-angle annular dark-field scanning transmission electron microscopy was performed. For identification of the nanoparticles in the cells, two analytical techniques, energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy, were compared both for acquiring energy spectra from individual particles and for element mapping. It was shown that electron tomography can confirm that the nanoparticles are localized within the sample and are not covered by contamination. The capabilities and application areas of the different analytical transmission electron microscopy techniques for detection, visualization and identification of nanoparticles in biological samples are discussed.
Gonzalez-Dominguez, Alvaro; Duran-Guerrero, Enrique; Fernandez-Recamales, Angeles; Lechuga-Sancho, Alfonso Maria; Sayago, Ana; Schwarz, Monica; Segundo, Carmen; Gonzalez-Dominguez, Raul
2017-01-01
The analytical bias introduced by most of the commonly used techniques in metabolomics considerably hinders the simultaneous detection of all metabolites present in complex biological samples. To overcome this limitation, the combination of complementary approaches has emerged in recent years as the most suitable strategy for maximizing metabolite coverage. This review article presents a general overview of the most important analytical techniques usually employed in metabolomics: nuclear magnetic resonance, mass spectrometry and hybrid approaches. Furthermore, we emphasize the potential of integrating various tools in the form of metabolomic multi-platforms in order to achieve a deeper metabolome characterization, for which a review of the existing literature in this field is provided. This review is not intended to be exhaustive but, rather, to give a practical and concise guide for readers not familiar with analytical chemistry on the considerations involved in properly selecting the technique to be used in a metabolomic experiment in biomedical research. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Extended Analytic Device Optimization Employing Asymptotic Expansion
NASA Technical Reports Server (NTRS)
Mackey, Jonathan; Sehirlioglu, Alp; Dynsys, Fred
2013-01-01
Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions, including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed form solutions and generate better physical understanding of the conditions for when simplifying assumptions may be valid. In obtaining the analytic solutions a set of dimensionless parameters, which characterize all thermoelectric couples, is formulated and provides the limiting cases for validating assumptions. Presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior for i) thermal resistance of hot and cold shoes, ii) variable material properties with temperature, and iii) lateral heat transfer through leg sides.
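Under the classic simplifying assumptions listed above (constant properties, fixed shoe temperatures, insulated leg sides), couple performance collapses onto the dimensionless figure of merit ZT. The sketch below evaluates the textbook maximum-efficiency expression for that baseline case; it illustrates the classic result the paper relaxes, not the authors' asymptotic-expansion analysis.

```python
# Minimal sketch of the classic constant-property thermoelectric result:
# maximum conversion efficiency of a couple as a function of the mean
# dimensionless figure of merit ZT and the shoe temperatures.
import numpy as np

def max_efficiency(T_hot, T_cold, ZT_mean):
    """Textbook maximum efficiency for a constant-property couple."""
    eta_carnot = (T_hot - T_cold) / T_hot
    root = np.sqrt(1.0 + ZT_mean)
    return eta_carnot * (root - 1.0) / (root + T_cold / T_hot)

for zt in (0.5, 1.0, 2.0):                      # illustrative ZT values
    print(f"ZT = {zt}: eta_max = {max_efficiency(600.0, 300.0, zt):.3f}")
```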
A behavior-analytic critique of Bandura's self-efficacy theory
Biglan, Anthony
1987-01-01
A behavior-analytic critique of self-efficacy theory is presented. Self-efficacy theory asserts that efficacy expectations determine approach behavior and physiological arousal of phobics as well as numerous other clinically important behaviors. Evidence which is purported to support this assertion is reviewed. The evidence consists of correlations between self-efficacy ratings and other behaviors. Such response-response relationships do not unequivocally establish that one response causes another. A behavior-analytic alternative to self-efficacy theory explains these relationships in terms of environmental events. Correlations between self-efficacy rating behavior and other behavior may be due to the contingencies of reinforcement that establish a correspondence between such verbal predictions and the behavior to which they refer. Such a behavior-analytic account does not deny any of the empirical relationships presented in support of self-efficacy theory, but it points to environmental variables that could account for those relationships and that could be manipulated in the interest of developing more effective treatment procedures. PMID:22477956
Online analysis of chlorine stable isotopes in chlorinated ethylenes: an inter-laboratory study
NASA Astrophysics Data System (ADS)
Bernstein, Anat; Shouakar-Stash, Orfan; Hunkeler, Daniel; Sakaguchi-Söder, Kaori; Laskov, Christine; Aravena, Ramon; Elsner, Martin
2010-05-01
In the last decade, compound-specific stable isotope analysis of groundwater pollutants became an important tool for identifying different sources of the same pollutant and for tracking natural attenuation processes in the sub-surface. It has been shown that trends in the isotopic composition of the target compounds can shed light on in-situ processes that are otherwise difficult to track. Analytical methods for carbon, nitrogen and hydrogen were established and are by now frequently used for a variety of organic pollutants, yet the motivation for introducing analytical methods for additional isotopes is growing. This motivation is further enhanced because the advantages of using two or more stable isotopes for gaining better insight into degradation pathways are well accepted. One important element which demands the development of appropriate analytical methods is chlorine, which is found in various groups of organic pollutants, among them the chlorinated ethylenes. Chlorinated ethylenes are considered high-priority environmental pollutants, and the development of suitable chlorine isotope methods for this group of pollutants is highly desired. Ideally, stable isotope techniques should have the capability to determine the isotopic composition of an individual target compound in a non-pure mixture, without the requirement of a laborious off-line treatment. Indeed, in the last years two different concepts for on-line chlorine isotope analysis were introduced, using either a standard quadrupole GC/MS (Sakaguchi-Söder et al., 2007) or a GC/IRMS (Shouakar-Stash et al., 2006). We present a comparison of the performance of the two concepts, carried out in five different laboratories: Waterloo (GC/IRMS), Neuchâtel (GC/MS), Darmstadt (GC/MS), Tübingen (GC/MS) and Munich (GC/IRMS). This comparison was performed on pure trichloroethylene and dichloroethylene products of different manufacturers, as well as trichloroethylene and dichloroethylene samples that had been exposed to biodegradation. This study sets standards for further application of these techniques to distinguish sources and track degradation processes in the sub-surface.
[Multisite validation of CDT measurement by the %CDT TIA and the Tina Quant %CDT kits].
Boehrer, J L; Cano, Y; Capolaghi, B; Desch, G; Dosbaa, I; Estepa, L; Hennache, B; Schellenberg, F
2007-01-01
The measurement of CDT (carbohydrate-deficient transferrin) is an essential biological tool in the diagnosis and follow-up of alcohol abuse. It is also employed as a marker of abstinence for the restitution of driving licences. However, the precision of measurement and the between-laboratory homogeneity of the results are still debated. The ion exchange step followed by immunodetermination of CDT is available in two products, the Tina Quant %CDT (Roche, Mannheim, Germany) and the %CDT TIA (Bio-Rad, Hercules, United States). This multicentre study was undertaken: 1) to evaluate the analytical characteristics of these kits and the homogeneity of the results from one laboratory to another, independently of the method used; 2) to validate the differences between the proposed normal values of the two kits; 3) to study the possibility of using commercial control sera as external quality controls. Four analytical systems were included in the study (Roche Modular/Hitachi 717, Beckman Coulter Immage and LX20, Dade Behring BNII). Determinations were carried out on pools of sera, commercial control sera, kit controls, and 30 patient sera. The latter were also analyzed by capillary electrophoresis in order to establish correlations between the techniques. The calibrations were stable over a two-week period. The repeatability of measurements ranged from 3.1% to 24.7%, with a mean value lower than 10%. The commercial control sera provided reliable results, with values suited to routine quality control use. The results of the Bio-Rad applications were approximately 20% lower than those of the Roche application, which justifies the difference in the proposed normal values (2.6% versus 3%) and yielded an identical classification of the patients in at least 27 of the 30 samples. We conclude that the analytical quality of the compared techniques, even if it could be improved, is sufficient to guarantee good reliability of the results. An external quality control scheme could be established using the control sera that we tested.
Barker, John R; Martinez, Antonio
2018-04-04
Efficient analytical image charge models are derived for the full spatial variation of the electrostatic self-energy of electrons in semiconductor nanostructures that arises from dielectric mismatch using semi-classical analysis. The methodology provides a fast, compact and physically transparent computation for advanced device modeling. The underlying semi-classical model for the self-energy has been established and validated during recent years and depends on a slight modification of the macroscopic static dielectric constants for individual homogeneous dielectric regions. The model has been validated for point charges as close as one interatomic spacing to a sharp interface. A brief introduction to image charge methodology is followed by a discussion and demonstration of the traditional failure of the methodology to derive the electrostatic potential at arbitrary distances from a source charge. However, the self-energy involves the local limit of the difference between the electrostatic Green functions for the full dielectric heterostructure and the homogeneous equivalent. It is shown that high convergence may be achieved for the image charge method for this local limit. A simple re-normalisation technique is introduced to reduce the number of image terms to a minimum. A number of progressively complex 3D models are evaluated analytically and compared with high precision numerical computations. Accuracies of 1% are demonstrated. Introducing a simple technique for modeling the transition of the self-energy between disparate dielectric structures we generate an analytical model that describes the self-energy as a function of position within the source, drain and gated channel of a silicon wrap round gate field effect transistor on a scale of a few nanometers cross-section. At such scales the self-energies become large (typically up to ~100 meV) close to the interfaces as well as along the channel. The screening of a gated structure is shown to reduce the self-energy relative to un-gated nanowires.
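The basic ingredient of such models can be illustrated with the textbook single-interface case: a point charge in dielectric ε1 at distance d from a planar boundary with dielectric ε2 acquires a self-energy from its own image charge. The sketch below evaluates that simple limit in SI units; it is an assumed simplification for orientation only, not the authors' renormalised multi-interface series or transistor geometry.

```python
# Minimal sketch of the single-interface image-charge self-energy for a
# point charge in dielectric eps1 at distance d from a planar boundary
# with dielectric eps2 (textbook result, not the paper's full model).
import numpy as np

E_CHARGE = 1.602176634e-19      # C
EPS0 = 8.8541878128e-12         # F/m

def self_energy_meV(d_nm, eps1, eps2):
    """Self-energy (meV) of a charge a distance d from a planar interface."""
    d = d_nm * 1e-9
    w_joule = (E_CHARGE**2 * (eps1 - eps2)
               / (16.0 * np.pi * EPS0 * eps1 * (eps1 + eps2) * d))
    return w_joule / E_CHARGE * 1e3     # J -> meV

# Example: electron in silicon (eps1 ~ 11.7) near SiO2 (eps2 ~ 3.9); values assumed.
for d in (1.0, 2.0, 5.0):
    print(f"d = {d} nm: {self_energy_meV(d, 11.7, 3.9):.1f} meV")
```

The positive sign for eps1 > eps2 reflects the repulsive self-interaction that grows near the interface, consistent with the tens-of-meV scale quoted in the abstract.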
NASA Technical Reports Server (NTRS)
Bogan, Sam
2001-01-01
The first year included a study of the non-visible damage of composite overwrapped pressure vessels (COPVs) with B. Poe of the Materials Branch at NASA Langley. Early determinations showed a clear reduction in non-visible damage for thin COPVs when partially pressurized rather than unpressurized. Literature searches on thicker-walled COPVs revealed surface damage, but it was clearly visible. Analysis of current analytic modeling indicated that existing COPV models lacked sufficient thickness corrections to predict impact damage. After a comprehensive study of available published data and numerous numerical studies based on observed data from Langley, the analytic framework for modeling the behavior was found to be lacking, and both Poe and Bogan suggested that any short-term (3-year) result for Jove would be overly ambitious and that emphasis should be placed on transverse shear moduli studies. Transverse shear moduli determination is relevant to the study of fatigue, fracture and aging effects in composite structures. Based on the techniques developed by Daniel & Tsai, Bogan and Gates decided to verify the results for K3B and 8320. A detailed analytic and experimental plan was established and carried out that included variations in layup, width, thickness, and length, as well as loading-rate variations to determine their effects and the relaxation moduli. The additional axial loads during the torsion testing were studied, as was the placement of gages along the composite specimen. Of the proposed tasks, all of tasks 1 and 2 were completed, with presentations given at Langley, SEM conferences and ASME/AIAA conferences. Sensitivity issues with the technique, associated with the use of servohydraulic test systems for applying the torsional load to the composite specimen, limited the torsion range for predictable and repeatable transverse shear properties. Bogan and Gates decided to diverge in their research efforts, with Gates continuing the experimental testing at Langley and Bogan modeling the apparent non-linear behavior at the low torques and angles apparent from the tests.
Trace metal speciation in natural waters: Computational vs. analytical
Nordstrom, D. Kirk
1996-01-01
Improvements in the field sampling, preservation, and determination of trace metals in natural waters have made many analyses more reliable and less affected by contamination. The speciation of trace metals, however, remains controversial. Chemical model speciation calculations do not necessarily agree with voltammetric, ion exchange, potentiometric, or other analytical speciation techniques. When metal-organic complexes are important, model calculations are not usually helpful and on-site analytical separations are essential. Many analytical speciation techniques have serious interferences and only work well for a limited subset of water types and compositions. A combined approach to the evaluation of speciation could greatly reduce these uncertainties. The approach proposed would be to (1) compare and contrast different analytical techniques with each other and with computed speciation, (2) compare computed trace metal speciation with reliable measurements of solubility, potentiometry, and mean activity coefficients, and (3) compare different model calculations with each other for the same set of water analyses, especially where supplementary data on speciation already exist. A comparison and critique of analytical with chemical model speciation for a range of water samples would delineate the useful range and limitations of these different approaches to speciation. Both model calculations and analytical determinations have useful and different constraints on the range of possible speciation such that they can provide much better insight into speciation when used together. Major discrepancies in the thermodynamic databases of speciation models can be evaluated with the aid of analytical speciation, and when the thermodynamic models are highly consistent and reliable, the sources of error in the analytical speciation can be evaluated. Major thermodynamic discrepancies also can be evaluated by simulating solubility and activity coefficient data and testing various chemical models for their range of applicability. Until a comparative approach such as this is taken, trace metal speciation will remain highly uncertain and controversial.
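As a concrete illustration of what a chemical-model speciation calculation does, the sketch below solves the coupled mass-action and mass-balance equations for a single hypothetical 1:1 metal-ligand complex; the totals and stability constant are invented, and real speciation codes handle many coupled equilibria plus activity corrections.

```python
# Minimal sketch of an equilibrium speciation calculation for one metal M
# and one ligand L forming a 1:1 complex ML. All values are hypothetical.
from scipy.optimize import brentq

K_ML = 10 ** 8.5       # assumed stability constant [ML]/([M][L]), 1/M
M_total = 1e-7         # mol/L total dissolved metal
L_total = 5e-7         # mol/L total ligand

def mass_balance(m_free):
    """Residual of the mass-action law expressed via the metal balance."""
    ml = M_total - m_free          # complexed metal
    l_free = L_total - ml          # free ligand
    return K_ML * m_free * l_free - ml

m_free = brentq(mass_balance, 1e-20, M_total)   # root-find the free metal
ml = M_total - m_free
print(f"free M : {m_free:.3e} M ({100 * m_free / M_total:.1f}%)")
print(f"ML     : {ml:.3e} M ({100 * ml / M_total:.1f}%)")
```

The computed free-versus-complexed split is the kind of model output that would be compared against voltammetric or ion-exchange speciation measurements in the approach proposed above.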
Li, Qing; Li, Xiaoming; Stanton, Bonita; Fang, Xiaoyi; Zhao, Ran
2010-11-01
Multilevel analytical techniques are being applied in condom use research to ensure the validity of investigations of environmental/structural influences and of clustered data from venue-based sampling. The literature contains reports of consistent associations between perceived gatekeeper support and condom use among entertainment establishment-based female sex workers (FSWs) in Guangxi, China. However, the clustering inherent in the data (FSWs being clustered within establishments) has not been accounted for in most of the analyses. We used multilevel analyses to examine perceived features of gatekeepers and individual correlates of consistent condom use among FSWs and to validate the findings in the existing literature. We analyzed cross-sectional data from 318 FSWs from 29 entertainment establishments in Guangxi, China in 2004, with a minimum of 5 FSWs per establishment. The Hierarchical Linear Models program with Laplace estimation was used to estimate the parameters in models containing random effects and binary outcomes. About 11.6% of women reported consistent condom use with clients. The intraclass correlation coefficient indicated that 18.5% of the variance in condom use could be attributed to similarity between FSWs within the same establishments. Women's perceived gatekeeper support and education remained positively associated with condom use (P < 0.05), after controlling for other individual characteristics and clustering. After adjusting for data clustering, perceived gatekeeper support remains associated with consistent condom use with clients among FSWs in China. The results imply that combined interventions targeting both gatekeepers and individual FSWs may effectively promote consistent condom use.
Li, Qing; Li, Xiaoming; Stanton, Bonita; Fang, Xiaoyi; Zhao, Ran
2010-01-01
Background Multilevel analytical techniques are being applied in condom use research to ensure the validity of investigations of environmental/structural influences and of clustered data from venue-based sampling. The literature contains reports of consistent associations between perceived gatekeeper support and condom use among entertainment establishment-based female sex workers (FSWs) in Guangxi, China. However, the clustering inherent in the data (FSWs being clustered within establishments) has not been accounted for in most of the analyses. We used multilevel analyses to examine perceived features of gatekeepers and individual correlates of consistent condom use among FSWs and to validate the findings in the existing literature. Methods We analyzed cross-sectional data from 318 FSWs from 29 entertainment establishments in Guangxi, China in 2004, with a minimum of 5 FSWs per establishment. The Hierarchical Linear Models program with Laplace estimation was used to estimate the parameters in models containing random effects and binary outcomes. Results About 11.6% of women reported consistent condom use with clients. The intraclass correlation coefficient indicated that 18.5% of the variance in condom use could be attributed to similarity between FSWs within the same establishments. Women’s perceived gatekeeper support and education remained positively associated with condom use (P < 0.05), after controlling for other individual characteristics and clustering. Conclusions After adjusting for data clustering, perceived gatekeeper support remains associated with consistent condom use with clients among FSWs in China. The results imply that combined interventions targeting both gatekeepers and individual FSWs may effectively promote consistent condom use. PMID:20539262
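The reported intraclass correlation (18.5%) corresponds, on the latent logistic scale, to the establishment-level random-intercept variance divided by that variance plus π²/3. The sketch below shows this standard calculation with an assumed variance value chosen only to reproduce a similar ICC; it is not the study's fitted model.

```python
# Minimal sketch: intraclass correlation (ICC) for a two-level
# random-intercept logistic model using the latent-variable formulation,
# where the level-1 residual variance is fixed at pi^2/3.
import math

def icc_logistic(var_between):
    """ICC = between-cluster variance / (between-cluster variance + pi^2/3)."""
    level1_var = math.pi ** 2 / 3.0     # latent logistic residual variance
    return var_between / (var_between + level1_var)

sigma2_establishment = 0.75             # hypothetical random-intercept variance
print(f"ICC = {icc_logistic(sigma2_establishment):.3f}")   # roughly 0.19
```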
Pavement Performance : Approaches Using Predictive Analytics
DOT National Transportation Integrated Search
2018-03-23
Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...
Analytical techniques of pilot scanning behavior and their application
NASA Technical Reports Server (NTRS)
Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.
1986-01-01
The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
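One of the listed measures, the entropy rate of the scan pattern, can be computed from instrument-to-instrument transition probabilities. The sketch below uses an invented three-instrument transition matrix, not flight data, to show the first-order calculation (stationary distribution weighted by row entropies).

```python
# Minimal sketch: first-order entropy rate (bits per transition) of a
# pilot's instrument scan, from a transition-probability matrix. The
# matrix and instrument set are invented for illustration only.
import numpy as np

# Rows/columns: hypothetical instruments (attitude, airspeed, altimeter).
P = np.array([[0.10, 0.60, 0.30],
              [0.70, 0.10, 0.20],
              [0.50, 0.40, 0.10]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi = pi / pi.sum()

row_entropy = -(P * np.log2(P)).sum(axis=1)     # valid because all entries > 0
entropy_rate = float(pi @ row_entropy)
print("stationary distribution:", np.round(pi, 3))
print(f"entropy rate: {entropy_rate:.3f} bits per transition")
```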
Fujiyoshi, Tomoharu; Ikami, Takahito; Sato, Takashi; Kikukawa, Koji; Kobayashi, Masato; Ito, Hiroshi; Yamamoto, Atsushi
2016-02-19
The consequences of matrix effects in GC are a major issue of concern in pesticide residue analysis. The aim of this study was to evaluate the applicability of an analyte protectant generator in pesticide residue analysis using a GC-MS system. The technique is based on continuous introduction of ethylene glycol into the carrier gas. Ethylene glycol as an analyte protectant effectively compensated the matrix effects in agricultural product extracts. All peak intensities were increased by this technique without affecting the GC-MS performance. Calibration curves for ethylene glycol in the GC-MS system with various degrees of pollution were compared and similar response enhancements were observed. This result suggests a convenient multi-residue GC-MS method using an analyte protectant generator instead of the conventional compensation method for matrix-induced response enhancement adding the mixture of analyte protectants into both neat and sample solutions. Copyright © 2016 Elsevier B.V. All rights reserved.
Chemical Detection and Identification Techniques for Exobiology Flight Experiments
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.
2002-01-01
Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capabilities with minimal requirements for volume, weight and consumables. Advances in technology may be achieved by increasing the amount of information acquired by a given technique with greater analytical capabilities and by miniaturization of proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of detectors for the GC which provide sample identification independent of the GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).
MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications
Medina, Isabel; Cappiello, Achille; Careri, Maria
2018-01-01
Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017. PMID:29850370
Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.
Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S
2016-04-07
Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs), cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices, and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regressions. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
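Applying the recommended power model of variance amounts to estimating sigma ≈ a·signal^b from replicate standards and then using 1/sigma² weights in the calibration fit. The sketch below does this with synthetic data and NumPy; the parameter values and noise model are assumptions for illustration, not the paper's experimental data.

```python
# Minimal sketch: heteroskedastic calibration with a power model of
# variance, sd = a * signal^b, estimated from replicate standards and
# used as 1/sd weights in the calibration fit. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
conc = np.repeat([1.0, 2.0, 5.0, 10.0, 20.0, 50.0], 5)      # replicate standards
true_signal = 3.0 * conc + 1.0
signal = true_signal + rng.normal(scale=0.05 * true_signal)  # ~5% relative noise

# 1) Estimate the power variance model from replicate standard deviations.
levels = np.unique(conc)
means = np.array([signal[conc == c].mean() for c in levels])
sds = np.array([signal[conc == c].std(ddof=1) for c in levels])
b, log_a = np.polyfit(np.log(means), np.log(sds), 1)
a = np.exp(log_a)

# 2) Weighted calibration: np.polyfit expects weights of 1/sd.
predicted_sd = a * signal ** b
slope, intercept = np.polyfit(conc, signal, 1, w=1.0 / predicted_sd)
print(f"variance model: sd ~ {a:.3f} * signal^{b:.2f}")
print(f"weighted calibration: signal = {slope:.3f}*conc + {intercept:.3f}")
```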
Applications of surface analytical techniques in Earth Sciences
NASA Astrophysics Data System (ADS)
Qian, Gujie; Li, Yubiao; Gerson, Andrea R.
2015-03-01
This review covers a wide range of surface analytical techniques: X-ray photoelectron spectroscopy (XPS), scanning photoelectron microscopy (SPEM), photoemission electron microscopy (PEEM), dynamic and static secondary ion mass spectroscopy (SIMS), electron backscatter diffraction (EBSD), atomic force microscopy (AFM). Others that are relatively less widely used but are also important to the Earth Sciences are also included: Auger electron spectroscopy (AES), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). All these techniques probe only the very top sample surface layers (sub-nm to several tens of nm). In addition, we also present several other techniques i.e. Raman microspectroscopy, reflection infrared (IR) microspectroscopy and quantitative evaluation of minerals by scanning electron microscopy (QEMSCAN) that penetrate deeper into the sample, up to several μm, as all of them are fundamental analytical tools for the Earth Sciences. Grazing incidence synchrotron techniques, sensitive to surface measurements, are also briefly introduced at the end of this review. (Scanning) transmission electron microscopy (TEM/STEM) is a special case that can be applied to characterisation of mineralogical and geological sample surfaces. Since TEM/STEM is such an important technique for Earth Scientists, we have also included it to draw attention to the capability of TEM/STEM applied as a surface-equivalent tool. While this review presents most of the important techniques for the Earth Sciences, it is not an all-inclusive bibliography of those analytical techniques. Instead, for each technique that is discussed, we first give a very brief introduction about its principle and background, followed by a short section on approaches to sample preparation that are important for researchers to appreciate prior to the actual sample analysis. We then use examples from publications (and also some of our known unpublished results) within the Earth Sciences to show how each technique is applied and used to obtain specific information and to resolve real problems, which forms the central theme of this review. Although this review focuses on applications of these techniques to study mineralogical and geological samples, we also anticipate that researchers from other research areas such as Material and Environmental Sciences may benefit from this review.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kenik, E.A.
X-ray microanalysis in an analytical electron microscope is a proven technique for the measurement of solute segregation in alloys. Solute segregation under equilibrium or nonequilibrium conditions can strongly influence material performance. X-ray microanalysis in an analytical electron microscope provides an alternative technique to measure grain boundary segregation, as well as segregation to other defects not accessible to Auger analysis. The utility of the technique is demonstrated by measurements of equilibrium segregation to boundaries in an antimony containing stainless steel, including the variation of segregation with boundary character, and by measurements of nonequilibrium segregation to boundaries and dislocations in an ion-irradiated stainless steel.
Solid Lubrication Fundamentals and Applications. Chapter 2
NASA Technical Reports Server (NTRS)
Miyoshi, Kazuhisa
1998-01-01
This chapter describes powerful analytical techniques capable of sampling tribological surfaces and solid-film lubricants. Some of these techniques may also be used to determine the locus of failure in a bonded structure or coated substrate; such information is important when seeking improved adhesion between a solid-film lubricant and a substrate and when seeking improved performance and long life expectancy of solid lubricants. Many examples are given here and throughout the book on the nature and character of solid surfaces and their significance in lubrication, friction, and wear. The analytical techniques used include the latest spectroscopic methods.
Single-Cell Detection of Secreted Aβ and sAPPα from Human IPSC-Derived Neurons and Astrocytes.
Liao, Mei-Chen; Muratore, Christina R; Gierahn, Todd M; Sullivan, Sarah E; Srikanth, Priya; De Jager, Philip L; Love, J Christopher; Young-Pearse, Tracy L
2016-02-03
Secreted factors play a central role in normal and pathological processes in every tissue in the body. The brain is composed of a highly complex milieu of different cell types and few methods exist that can identify which individual cells in a complex mixture are secreting specific analytes. By identifying which cells are responsible, we can better understand neural physiology and pathophysiology, more readily identify the underlying pathways responsible for analyte production, and ultimately use this information to guide the development of novel therapeutic strategies that target the cell types of relevance. We present here a method for detecting analytes secreted from single human induced pluripotent stem cell (iPSC)-derived neural cells and have applied the method to measure amyloid β (Aβ) and soluble amyloid precursor protein-alpha (sAPPα), analytes central to Alzheimer's disease pathogenesis. Through these studies, we have uncovered the dynamic range of secretion profiles of these analytes from single iPSC-derived neuronal and glial cells and have molecularly characterized subpopulations of these cells through immunostaining and gene expression analyses. In examining Aβ and sAPPα secretion from single cells, we were able to identify previously unappreciated complexities in the biology of APP cleavage that could not otherwise have been found by studying averaged responses over pools of cells. This technique can be readily adapted to the detection of other analytes secreted by neural cells, which would have the potential to open new perspectives into human CNS development and dysfunction. We have established a technology that, for the first time, detects secreted analytes from single human neurons and astrocytes. We examine secretion of the Alzheimer's disease-relevant factors amyloid β (Aβ) and soluble amyloid precursor protein-alpha (sAPPα) and present novel findings that could not have been observed without a single-cell analytical platform. First, we identify a previously unappreciated subpopulation that secretes high levels of Aβ in the absence of detectable sAPPα. Further, we show that multiple cell types secrete high levels of Aβ and sAPPα, but cells expressing GABAergic neuronal markers are overrepresented. Finally, we show that astrocytes are competent to secrete high levels of Aβ and therefore may be a significant contributor to Aβ accumulation in the brain. Copyright © 2016 the authors 0270-6474/16/361730-17$15.00/0.
Schermeyer, Marie-Therese; Wöll, Anna K.; Eppink, Michel; Hubbuch, Jürgen
2017-01-01
High protein titers are gaining importance in biopharmaceutical industry. A major challenge in the development of highly concentrated mAb solutions is their long-term stability and often incalculable viscosity. The complexity of the molecule itself, as well as the various molecular interactions, make it difficult to describe their solution behavior. To study the formulation stability, long- and short-range interactions and the formation of complex network structures have to be taken into account. For a better understanding of highly concentrated solutions, we combined established and novel analytical tools to characterize the effect of solution properties on the stability of highly concentrated mAb formulations. In this study, monoclonal antibody solutions in a concentration range of 50–200 mg/ml at pH 5–9 with and without glycine, PEG4000, and Na2SO4 were analyzed. To determine the monomer content, analytical size-exclusion chromatography runs were performed. ζ-potential measurements were conducted to analyze the electrophoretic properties in different solutions. The melting and aggregation temperatures were determined with the help of fluorescence and static light scattering measurements. Additionally, rheological measurements were conducted to study the solution viscosity and viscoelastic behavior of the mAb solutions. The so-determined analytical parameters were scored and merged in an analytical toolbox. The resulting scoring was then successfully correlated with long-term storage (40 d of incubation) experiments. Our results indicate that the sensitivity of complex rheological measurements, in combination with the applied techniques, allows reliable statements to be made with respect to the effect of solution properties, such as protein concentration, ionic strength, and pH shift, on the strength of protein-protein interaction and solution colloidal stability. PMID:28617076
Devriendt, Floris; Moldovan, Darie; Verbeke, Wouter
2018-03-01
Prescriptive analytics extends predictive analytics by allowing an outcome to be estimated as a function of control variables, and thus by allowing the level of the control variables required to realize a desired outcome to be established. Uplift modeling is at the heart of prescriptive analytics and aims at estimating the net difference in an outcome resulting from a specific action or treatment that is applied. In this article, a structured and detailed literature survey on uplift modeling is provided by identifying and contrasting various groups of approaches. In addition, evaluation metrics for assessing the performance of uplift models are reviewed. An experimental evaluation on four real-world data sets provides further insight into their use. Uplift random forests are found to be consistently among the best performing techniques in terms of the Qini and Gini measures, although considerable variability in performance across the various data sets of the experiments is observed. In addition, uplift models are frequently observed to be unstable and to display strong variability in performance across different folds in the cross-validation experimental setup. This potentially threatens their actual use for business applications. Moreover, it is found that the available evaluation metrics do not provide an intuitively understandable indication of the actual use and performance of a model. Specifically, existing evaluation metrics do not facilitate a comparison of uplift models and predictive models, and they evaluate performance either at an arbitrary cutoff or over the full spectrum of potential cutoffs. In conclusion, we highlight the instability of uplift models and the need for an application-oriented approach to assessing uplift models as prime topics for further research.
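The core idea can be illustrated with the simple two-model approach, one of the families such surveys contrast: fit separate response models on treated and control groups and score uplift as the difference of predicted probabilities. The sketch below uses synthetic data and assumes scikit-learn is available; it is not the uplift random forest method highlighted in the article.

```python
# Minimal sketch of the "two-model" approach to uplift modeling: one
# response model for the treated group, one for the control group, with
# uplift scored as the difference of predicted response probabilities.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 4000
X = rng.normal(size=(n, 4))
treated = rng.integers(0, 2, size=n).astype(bool)
# Synthetic outcome: baseline effect of X[:, 0] plus a treatment effect driven by X[:, 1].
p = 1.0 / (1.0 + np.exp(-(0.5 * X[:, 0] + 1.0 * X[:, 1] * treated)))
y = rng.random(n) < p

model_t = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[treated], y[treated])
model_c = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[~treated], y[~treated])

uplift = model_t.predict_proba(X)[:, 1] - model_c.predict_proba(X)[:, 1]
print("mean predicted uplift:", round(float(uplift.mean()), 3))
print("uplift for 5 units   :", np.round(uplift[:5], 3))
```

Ranking units by this score, and accumulating the treated-versus-control response difference down that ranking, is what Qini-style evaluation curves then summarize.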
Pérez-Rodríguez, Michael; Pellerano, Roberto Gerardo; Pezza, Leonardo; Pezza, Helena Redigolo
2018-05-15
Tetracyclines are widely used for both the treatment and prevention of diseases in animals as well as for the promotion of rapid animal growth and weight gain. This practice may result in trace amounts of these drugs in products of animal origin, such as milk and eggs, posing serious risks to human health. The presence of tetracycline residues in foods can lead to the transmission of antibiotic-resistant pathogenic bacteria through the food chain. In order to ensure food safety and avoid exposure to these substances, national and international regulatory agencies have established tolerance levels for authorized veterinary drugs, including tetracycline antimicrobials. In view of that, numerous sensitive and specific methods have been developed for the quantification of these compounds in different food matrices. One will note, however, that the determination of trace residues in foods such as milk and eggs often requires extensive sample extraction and preparation prior to conducting instrumental analysis. Sample pretreatment is usually the most complicated step in the analytical process and covers both cleaning and pre-concentration. Optimal sample preparation can reduce analysis time and sources of error, enhance sensitivity, apart from enabling unequivocal identification, confirmation and quantification of target analytes. The development and implementation of more environmentally friendly analytical procedures, which involve the use of less hazardous solvents and smaller sample sizes compared to traditional methods, is a rapidly increasing trend in analytical chemistry. This review seeks to provide an updated overview of the main trends in sample preparation for the determination of tetracycline residues in foodstuffs. The applicability of several extraction and clean-up techniques employed in the analysis of foodstuffs, especially milk and egg samples, is also thoroughly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
The Genetic Basis of Mendelian Phenotypes: Discoveries, Challenges, and Opportunities
Chong, Jessica X.; Buckingham, Kati J.; Jhangiani, Shalini N.; Boehm, Corinne; Sobreira, Nara; Smith, Joshua D.; Harrell, Tanya M.; McMillin, Margaret J.; Wiszniewski, Wojciech; Gambin, Tomasz; Coban Akdemir, Zeynep H.; Doheny, Kimberly; Scott, Alan F.; Avramopoulos, Dimitri; Chakravarti, Aravinda; Hoover-Fong, Julie; Mathews, Debra; Witmer, P. Dane; Ling, Hua; Hetrick, Kurt; Watkins, Lee; Patterson, Karynne E.; Reinier, Frederic; Blue, Elizabeth; Muzny, Donna; Kircher, Martin; Bilguvar, Kaya; López-Giráldez, Francesc; Sutton, V. Reid; Tabor, Holly K.; Leal, Suzanne M.; Gunel, Murat; Mane, Shrikant; Gibbs, Richard A.; Boerwinkle, Eric; Hamosh, Ada; Shendure, Jay; Lupski, James R.; Lifton, Richard P.; Valle, David; Nickerson, Deborah A.; Bamshad, Michael J.
2015-01-01
Discovering the genetic basis of a Mendelian phenotype establishes a causal link between genotype and phenotype, making possible carrier and population screening and direct diagnosis. Such discoveries also contribute to our knowledge of gene function, gene regulation, development, and biological mechanisms that can be used for developing new therapeutics. As of February 2015, 2,937 genes underlying 4,163 Mendelian phenotypes have been discovered, but the genes underlying ∼50% (i.e., 3,152) of all known Mendelian phenotypes are still unknown, and many more Mendelian conditions have yet to be recognized. This is a formidable gap in biomedical knowledge. Accordingly, in December 2011, the NIH established the Centers for Mendelian Genomics (CMGs) to provide the collaborative framework and infrastructure necessary for undertaking large-scale whole-exome sequencing and discovery of the genetic variants responsible for Mendelian phenotypes. In partnership with 529 investigators from 261 institutions in 36 countries, the CMGs assessed 18,863 samples from 8,838 families representing 579 known and 470 novel Mendelian phenotypes as of January 2015. This collaborative effort has identified 956 genes, including 375 not previously associated with human health, that underlie a Mendelian phenotype. These results provide insight into study design and analytical strategies, identify novel mechanisms of disease, and reveal the extensive clinical variability of Mendelian phenotypes. Discovering the gene underlying every Mendelian phenotype will require tackling challenges such as worldwide ascertainment and phenotypic characterization of families affected by Mendelian conditions, improvement in sequencing and analytical techniques, and pervasive sharing of phenotypic and genomic data among researchers, clinicians, and families. PMID:26166479
Goodman, Corey W.; Major, Heather J.; Walls, William D.; Sheffield, Val C.; Casavant, Thomas L.; Darbro, Benjamin W.
2016-01-01
Chromosomal microarrays (CMAs) are routinely used in both research and clinical laboratories; yet, little attention has been given to the estimation of genome-wide true and false negatives during the assessment of these assays and how such information could be used to calibrate various algorithmic metrics to improve performance. Low-throughput, locus-specific methods such as fluorescence in situ hybridization (FISH), quantitative PCR (qPCR), or multiplex ligation-dependent probe amplification (MLPA) preclude rigorous calibration of various metrics used by copy number variant (CNV) detection algorithms. To aid this task, we have established a comparative methodology, CNV-ROC, which is capable of performing a high throughput, low cost, analysis of CMAs that takes into consideration genome-wide true and false negatives. CNV-ROC uses a higher resolution microarray to confirm calls from a lower resolution microarray and provides for a true measure of genome-wide performance metrics at the resolution offered by microarray testing. CNV-ROC also provides for a very precise comparison of CNV calls between two microarray platforms without the need to establish an arbitrary degree of overlap. Comparison of CNVs across microarrays is done on a per-probe basis and receiver operator characteristic (ROC) analysis is used to calibrate algorithmic metrics, such as log2 ratio threshold, to enhance CNV calling performance. CNV-ROC addresses a critical and consistently overlooked aspect of analytical assessments of genome-wide techniques like CMAs which is the measurement and use of genome-wide true and false negative data for the calculation of performance metrics and comparison of CNV profiles between different microarray experiments. PMID:25595567
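The threshold-calibration step, tuning a |log2 ratio| cutoff by ROC analysis against higher-resolution calls treated as truth on a per-probe basis, can be sketched generically. The example below uses synthetic labels and scikit-learn's roc_curve; the Youden-index rule for picking the cutoff is an assumed choice for illustration, not necessarily the CNV-ROC criterion.

```python
# Minimal sketch: calibrating a |log2 ratio| threshold by per-probe ROC
# analysis, with higher-resolution calls standing in as truth. Synthetic data.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(42)
n_probes = 5000
truth = rng.random(n_probes) < 0.05                      # probes inside true CNVs
log2_ratio = np.where(truth,
                      rng.normal(0.6, 0.25, n_probes),   # shifted CNV probes
                      rng.normal(0.0, 0.20, n_probes))   # copy-neutral probes

score = np.abs(log2_ratio)
fpr, tpr, thresholds = roc_curve(truth, score)
best = np.argmax(tpr - fpr)                              # Youden index (assumed rule)
print(f"AUC = {auc(fpr, tpr):.3f}")
print(f"calibrated |log2 ratio| cutoff = {thresholds[best]:.3f} "
      f"(TPR {tpr[best]:.2f}, FPR {fpr[best]:.2f})")
```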
Mercury-induced fragmentation of n-decane and n-undecane in positive mode ion mobility spectrometry.
Gunzer, F
2015-09-21
Ion mobility spectrometry is a well-known technique for trace gas analysis. Using soft ionization techniques, fragmentation of analytes is normally not observed, with the consequence that analyte spectra of single substances are quite simple, i.e. showing in general only one peak. If the concentration is high enough, an extra cluster peak involving two analyte molecules can often be observed. When investigating n-alkanes, different results regarding the number of peaks in the spectra have been obtained in the past using this spectrometric technique. Here we present results obtained when analyzing n-alkanes (n-hexane to n-undecane) with a pulsed electron source, which show no fragmentation or clustering at all. However, when investigating a mixture of mercury and an n-alkane, a situation quite typical in the oil and gas industry, a strong fragmentation and cluster formation involving these fragments has been observed exclusively for n-decane and n-undecane.
[Recent Development of Atomic Spectrometry in China].
Xiao, Yuan-fang; Wang, Xiao-hua; Hang, Wei
2015-09-01
As an important part of modern analytical techniques, atomic spectrometry occupies a central position in the analytical field, and its development reflects the continuous reform and innovation of analytical techniques. In the past fifteen years, atomic spectrometry has developed rapidly and been applied widely in many fields in China; this review surveys that development and its remarkable achievements. It covers several branches of atomic spectrometry, including atomic emission spectrometry (AES), atomic absorption spectrometry (AAS), atomic fluorescence spectrometry (AFS), X-ray fluorescence spectrometry (XRF), and atomic mass spectrometry (AMS). Emphasis is put on innovations in detection methods and their applications in related fields, including environmental samples, biological samples, food and beverages, and geological materials. There is also a brief introduction to the hyphenated techniques utilized in atomic spectrometry. Finally, the prospects of atomic spectrometry in China are forecast.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prentice, H. J.; Proud, W. G.
2006-07-28
A technique has been developed to determine experimentally the three-dimensional displacement field on the rear surface of a dynamically deforming plate. The technique combines speckle analysis with stereoscopy, using a modified angular-lens method: this incorporates split-frame photography and a simple method by which the effective lens separation can be adjusted and calibrated in situ. Whilst several analytical models exist to predict deformation in extended or semi-infinite targets, the non-trivial nature of the wave interactions complicates the generation and development of analytical models for targets of finite depth. By interrogating specimens experimentally to acquire three-dimensional strain data points, both analytical and numerical model predictions can be verified more rigorously. The technique is applied to the quasi-static deformation of a rubber sheet and dynamically to Mild Steel sheets of various thicknesses.
Mandal, Arundhoti; Singha, Monisha; Addy, Partha Sarathi; Basak, Amit
2017-10-13
MALDI-based mass spectrometry has, over the last three decades, become an important analytical tool. It is a gentle ionization technique, usually applicable to detecting and characterizing analytes with high molecular weights, such as proteins and other macromolecules. The earlier difficulty in detecting low-molecular-weight analytes, such as small organic molecules and metal ion complexes, with this technique arose from the cluster of matrix-derived peaks in the low-molecular-weight region. To detect such molecules and metal ion complexes, a four-pronged strategy has been developed. The strategies include the use of alternative matrix materials, the employment of new surface materials that require no matrix, the use of metabolites that directly absorb the laser light, and laser-absorbing label-assisted LDI-MS (popularly known as LALDI-MS). This review highlights developments in all these strategies, with special emphasis on LALDI-MS. © 2017 Wiley Periodicals, Inc.
Potvin, Christopher M; Zhou, Hongde
2011-11-01
The objective of this study was to demonstrate the effects of complex matrix effects caused by chemical materials on the analysis of key soluble microbial products (SMP) including proteins, humics, carbohydrates, and polysaccharides in activated sludge samples. Emphasis was placed on comparison of the commonly used standard curve technique with standard addition (SA), a technique that differs in that the analytical responses are measured for sample solutions spiked with known quantities of analytes. The results showed that using SA provided a great improvement in compensating for SMP recovery and thus improving measurement accuracy by correcting for matrix effects. Analyte recovery was found to be highly dependent on sample dilution, and changed due to extraction techniques, storage conditions and sample composition. Storage of sample extracts by freezing changed SMP concentrations dramatically, as did storage at 4°C for as little as 1d. Copyright © 2011 Elsevier Ltd. All rights reserved.
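The standard addition calculation itself is a linear extrapolation: fit the response against the spiked concentration and take the magnitude of the x-intercept as the analyte concentration in the measured solution (before any dilution correction). The numbers in the sketch below are invented for illustration.

```python
# Minimal sketch of the standard-addition calculation: linear fit of the
# measured response versus spiked analyte concentration, with the x-intercept
# magnitude giving the concentration in the (diluted) sample. Invented data.
import numpy as np

spiked = np.array([0.0, 5.0, 10.0, 20.0, 40.0])      # added analyte, mg/L
response = np.array([0.21, 0.33, 0.45, 0.70, 1.19])  # instrument signal

slope, intercept = np.polyfit(spiked, response, 1)
c_sample = intercept / slope                          # |x-intercept|
print(f"estimated concentration in measured solution: {c_sample:.2f} mg/L")
```

Because each spiked point is measured in the same matrix as the sample, the slope carries the matrix effect, which is why the approach compensates for recovery losses that a separate standard curve cannot capture.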
Benhammouda, Brahim; Vazquez-Leal, Hector
2016-01-01
This work presents an analytical solution of some nonlinear delay differential equations (DDEs) with variable delays. Such DDEs are difficult to treat numerically and cannot be solved by existing general-purpose codes. A new method of steps combined with the differential transform method (DTM) is proposed as a powerful tool to solve these DDEs. This method reduces the DDEs to ordinary differential equations that are then solved by the DTM. Furthermore, we show that the solutions can be improved by the Laplace-Padé resummation method. Two examples are presented to show the efficiency of the proposed technique. The main advantage of this technique is that it possesses a simple procedure based on a few straightforward steps and can be combined with any analytical method, other than the DTM, like the homotopy perturbation method.
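The method of steps itself is easy to illustrate for a constant delay: on each interval of length tau the delayed term is already known, so the DDE reduces to an ODE. The sketch below integrates y'(t) = -y(t - tau) with history y(t) = 1 numerically with SciPy as a stand-in; the paper instead solves the reduced equations analytically with the DTM and handles variable delays.

```python
# Minimal sketch of the method of steps for a constant-delay DDE,
# y'(t) = -y(t - tau), history y(t) = 1 for t <= 0. On each interval
# [k*tau, (k+1)*tau] the delayed term is known from the previous step,
# so the DDE reduces to an ordinary differential equation.
from scipy.integrate import solve_ivp

tau = 1.0
history = lambda t: 1.0                       # y(t) = 1 for t <= 0
segments = [history]                          # delayed-value lookup for each step
y_start = 1.0

for k in range(4):                            # integrate over four delay intervals
    delayed = segments[k]                     # y(t - tau), known from step k-1
    rhs = lambda t, y, d=delayed: [-d(t - tau)]
    sol = solve_ivp(rhs, (k * tau, (k + 1) * tau), [y_start],
                    dense_output=True, max_step=tau / 50)
    segments.append(lambda t, s=sol: float(s.sol(t)[0]))
    y_start = sol.y[0, -1]
    print(f"y({(k + 1) * tau:.1f}) = {y_start:.5f}")
```

On the first two intervals the exact solution is y(t) = 1 - t and then y(t) = t²/2 - 2t + 3/2, so the printed values y(1) = 0 and y(2) = -0.5 provide a quick check.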
Analytic double product integrals for all-frequency relighting.
Wang, Rui; Pan, Minghao; Chen, Weifeng; Ren, Zhong; Zhou, Kun; Hua, Wei; Bao, Hujun
2013-07-01
This paper presents a new technique for real-time relighting of static scenes with all-frequency shadows from complex lighting and highly specular reflections from spatially varying BRDFs. The key idea is to depict the boundaries of visible regions using piecewise linear functions, and convert the shading computation into double product integrals—the integral of the product of lighting and BRDF on visible regions. By representing lighting and BRDF with spherical Gaussians and approximating their product using Legendre polynomials locally in visible regions, we show that such double product integrals can be evaluated in an analytic form. Given the precomputed visibility, our technique computes the visibility boundaries on the fly at each shading point, and performs the analytic integral to evaluate the shading color. The result is a real-time all-frequency relighting technique for static scenes with dynamic, spatially varying BRDFs, which can generate more accurate shadows than the state-of-the-art real-time PRT methods.
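The reason such expansions make product integrals analytic is basis orthogonality: once both factors are expressed in Legendre coefficients, their product integral over the interval is a weighted sum of coefficient products. The 1D sketch below demonstrates this with arbitrary stand-in functions; the paper's actual formulation uses spherical Gaussians over visible spherical regions, which this does not reproduce.

```python
# Minimal 1D sketch: with Legendre expansions f = sum_k a_k P_k and
# g = sum_k b_k P_k on [-1, 1], orthogonality gives
#   integral of f*g over [-1, 1] = sum_k 2/(2k+1) * a_k * b_k,
# i.e. the double product integral reduces to an analytic coefficient sum.
import numpy as np
from numpy.polynomial import legendre as L

f = lambda x: np.exp(-3.0 * (x - 0.3) ** 2)     # stand-in "lighting" lobe (assumed)
g = lambda x: np.maximum(x, 0.0) ** 4           # stand-in "BRDF" lobe (assumed)

x = np.linspace(-1.0, 1.0, 2001)
deg = 12
a = L.legfit(x, f(x), deg)                      # Legendre coefficients of f
b = L.legfit(x, g(x), deg)                      # Legendre coefficients of g

k = np.arange(deg + 1)
analytic = np.sum(2.0 / (2.0 * k + 1.0) * a * b)
dx = x[1] - x[0]
reference = np.sum(f(x) * g(x)) * dx            # brute-force numerical check
print(f"coefficient-sum integral: {analytic:.6f}")
print(f"numerical reference     : {reference:.6f}")
```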
Analytics for Cyber Network Defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plantenga, Todd.; Kolda, Tamara Gibson
2011-06-01
This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.
Comments on higher rank Wilson loops in 𝒩 = 2*
Liu, James T.; Zayas, Leopoldo A. Pando; Zhou, Shan
2018-01-01
For 𝒩 = 2* theory with U(N) gauge group we evaluate expectation values of Wilson loops in representations described by a rectangular Young tableau with n rows and k columns. The evaluation reduces to a two-matrix model and we explain, using a combination of numerical and analytical techniques, the general properties of the eigenvalue distributions in various regimes of parameters (N, λ, n, k), where λ is the 't Hooft coupling. In the large N limit we present analytic results for the leading and sub-leading contributions. In the particular cases of only one row or one column we reproduce previously known results for the totally symmetric and totally antisymmetric representations. We also extensively discuss the 𝒩 = 4 limit of the 𝒩 = 2* theory. While establishing these connections we clarify aspects of various orders of limits and how to relax them; we also find it useful to explicitly address details of the genus expansion. As a result, for the totally symmetric Wilson loop we find new contributions that improve the comparison with the dual holographic computation at one loop order in the appropriate regime.
Meng, X; Ma, Q; Bai, H; Wang, Z; Han, C; Wang, C
2017-08-01
A comprehensive methodology for the simultaneous determination of 15 multiclass organic UV filters in sunscreen cosmetics was developed using high-performance liquid chromatography coupled with electrospray ionization tandem mass spectrometry (HPLC-ESI-MS/MS). Sunscreen cosmetics of various matrices, such as toning lotion, emulsion, cream and lipstick, were analysed. Ultrasound-assisted extraction (UAE) was utilized as the extraction technique for sample preparation. The 15 UV filters were chromatographically separated using two mobile phase systems on an XBridge C18 analytical column (150 × 2.1 mm I.D., 3.5 μm particle size) and quantified by HPLC-ESI-MS/MS. Quantitation was performed using the external calibration method. The established method was validated in terms of linearity, sensitivity, specificity, accuracy, stability, intra-day and inter-day precision, recovery and matrix effect. The method was also applied to the determination of UV filters in commercial sunscreen cosmetics. The experimental results demonstrated that the developed method is accurate, rapid and sensitive and can be used for the analytical control of sunscreen cosmetics. © 2016 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
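The external calibration step mentioned above reduces to fitting a calibration line for each analyte and back-calculating the sample concentration through the preparation factors. The sketch below uses hypothetical peak areas, standard levels and preparation factors, not the validated values from the study.

```python
import numpy as np

# Hypothetical external calibration for one UV filter (MRM peak areas vs. standards)
std_conc = np.array([5.0, 10.0, 50.0, 100.0, 500.0])            # ng/mL
std_area = np.array([1.1e4, 2.2e4, 1.1e5, 2.2e5, 1.1e6])

slope, intercept = np.polyfit(std_conc, std_area, 1)
r2 = np.corrcoef(std_conc, std_area)[0, 1] ** 2                  # linearity check

sample_area = 7.5e4
conc_extract = (sample_area - intercept) / slope                 # ng/mL in the injected extract

# Back-calculation to the cosmetic sample with hypothetical preparation factors
dilution_factor = 10          # extract diluted 10x before injection
extract_volume_mL = 10.0      # UAE extract volume
sample_mass_g = 0.5           # mass of cosmetic extracted
conc_sample = conc_extract * dilution_factor * extract_volume_mL / sample_mass_g   # ng/g
print(f"R2 = {r2:.4f}; UV filter content ~ {conc_sample:.0f} ng/g")
```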
Hosseini, Samira; Aeinehvand, Mohammad M; Uddin, Shah M; Benzina, Abderazak; Rothan, Hussin A; Yusof, Rohana; Koole, Leo H; Madou, Marc J; Djordjevic, Ivan; Ibrahim, Fatimah
2015-11-09
The application of microfluidic devices in diagnostic systems is well established in contemporary research. The large specific surface area of microspheres, on the other hand, has secured an important position for their use in bioanalytical assays. Herein, we report a combination of microspheres and a microfluidic disk in a unique hybrid platform for highly sensitive and selective detection of dengue virus. Surface-engineered polymethacrylate microspheres with carefully designed functional groups facilitate biorecognition in multiple ways. In order to maximize the utility of the microspheres' specific surface area in biomolecular interaction, the microfluidic disk was equipped with a micromixing system. The mixing mechanism (microballoon mixing) enhances the number of molecular encounters between spheres and target analyte by accessing the entire sample volume more effectively, which subsequently results in signal amplification. Significant reduction of incubation time, along with considerably lower detection limits, were the prime motivations for integrating the microspheres inside the microfluidic disk. Lengthy incubations of routine analytical assays were reduced from 2 hours to 5 minutes, while the developed system successfully detected a few units of dengue virus. These results make this hybrid microsphere-microfluidic approach to dengue detection a promising avenue for early detection of this fatal illness.
Harju, Kirsi; Rapinoja, Marja-Leena; Avondet, Marc-André; Arnold, Werner; Schär, Martin; Luginbühl, Werner; Kremp, Anke; Suikkanen, Sanna; Kankaanpää, Harri; Burrell, Stephen; Söderström, Martin; Vanninen, Paula
2015-01-01
A saxitoxin (STX) proficiency test (PT) was organized as part of the Establishment of Quality Assurance for the Detection of Biological Toxins of Potential Bioterrorism Risk (EQuATox) project. The aim of this PT was to evaluate existing methods and the European laboratories' capabilities for the analysis of STX and some of its analogues in real samples. Homogenized mussel material and algal cell materials containing paralytic shellfish poisoning (PSP) toxins were produced as reference sample matrices. The reference material was characterized using various analytical methods. Acidified algal extract samples at two concentration levels were prepared from a bulk culture of the PSP-toxin-producing dinoflagellate Alexandrium ostenfeldii. The homogeneity and stability of the prepared PT samples were studied and found to be fit for purpose. Thereafter, eight STX PT samples were sent to ten participating laboratories from eight countries. The PT offered the participating laboratories the possibility to assess their performance in the qualitative and quantitative detection of PSP toxins. Various techniques, such as official Association of Official Analytical Chemists (AOAC) methods, immunoassays, and liquid chromatography-mass spectrometry, were used for sample analyses. PMID:26602927
NASA Astrophysics Data System (ADS)
Kiyono, Ken; Tsujimoto, Yutaka
2016-07-01
We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
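For readers unfamiliar with the detrending operation analyzed here, the following is a minimal standard DFA implementation (first-order local detrending), provided only as an illustration; the signal, scales and parameters are arbitrary, and the frequency-response analysis itself is not reproduced.

```python
import numpy as np

def dfa(x, scales, order=1):
    """Root-mean-square fluctuation F(s) of standard detrended fluctuation analysis."""
    y = np.cumsum(x - np.mean(x))                  # integrated profile
    F = []
    for s in scales:
        n_seg = len(y) // s
        msq = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)   # local polynomial detrending
            msq.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(msq)))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.standard_normal(2**14)                     # white noise: expected exponent ~ 0.5
scales = np.array([16, 32, 64, 128, 256, 512])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"Estimated scaling exponent: {alpha:.2f}")
```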
Membrane wrinkling patterns and control with SMA and SMPC actuators
NASA Astrophysics Data System (ADS)
Lu, Mingyu; Li, Yunliang; Tan, Huifeng; Zhou, Limin
2009-07-01
Wrinkling is a main factor affecting the performance of membrane structures and is usually considered a failure, as it can cause a dramatic decrease in shape accuracy. The study of membrane wrinkling control therefore has both analytical and experimental significance. In this paper, a feasible membrane shape control method is presented. An expression for the wrinkle wavelength is established from a stress extremum principle, based on tension field theory and the von Kármán large-deflection formula, which explains how membrane wrinkles are generated and evolve. A control mechanism for membrane wrinkles is developed using shape memory alloy (SMA) and shape memory polymer composite (SMPC) actuators, which are attached to the boundaries of the membrane to produce contraction/expansion forces that adjust the shape of the membrane. The whole control process is monitored by a photogrammetric technique. Numerical simulations are also conducted using the ANSYS finite element software with a nonlinear post-buckling analysis. Both the experimental and numerical results show that the amplitudes of the wrinkles are effectively controlled by the SMA and SMPC actuators. The method introduced in this paper provides a foundation for shape control of membrane wrinkling and is important for future work on vibration control of space membrane structures.
Chemoviscosity modeling for thermosetting resins - I
NASA Technical Reports Server (NTRS)
Hou, T. H.
1984-01-01
A new analytical model for chemoviscosity variation during cure of thermosetting resins was developed. The model is derived by modifying the widely used WLF (Williams-Landel-Ferry) theory of polymer rheology. The major assumptions are that the rate of reaction is diffusion controlled and inversely proportional to the viscosity of the medium over the entire cure cycle. The resulting first-order nonlinear differential equation is solved numerically, and the model predictions compare favorably with experimental data for EPON 828/Agent U obtained on a Rheometrics System 4 rheometer. The model describes chemoviscosity over a range of six orders of magnitude under isothermal curing conditions. The extremely nonlinear chemoviscosity profile for a dynamic heating cure cycle is predicted as well. The model is also shown to predict changes in the glass transition temperature of the thermosetting resin during cure. The physical significance of this prediction is unclear at present, however, and further research is required. From a chemoviscosity simulation point of view, the technique of establishing an analytical model as described here is easily applied to any thermosetting resin. The model thus obtained can be used in real-time process control for fabricating composite materials.
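As a toy illustration of numerically integrating a diffusion-controlled chemoviscosity model of this general type, the sketch below assumes a hypothetical rate law: cure rate inversely proportional to viscosity, with viscosity growing exponentially with the degree of cure under isothermal conditions. It is not the published WLF-modified model, and the constants are not fitted to any resin.

```python
import numpy as np

# Hypothetical, illustrative constants (NOT the published model or EPON 828/Agent U data)
eta0 = 1.0       # initial viscosity, Pa.s
k = 5e-3         # cure-rate constant (viscosity-scaled), 1/s
C = 14.0         # exponential growth of viscosity with degree of cure
dt, t_end = 1.0, 6000.0   # time step and cure time, s

t = np.arange(0.0, t_end, dt)
alpha = np.zeros_like(t)            # degree of cure
eta = np.full_like(t, eta0)         # chemoviscosity

for i in range(len(t) - 1):
    dalpha_dt = k / eta[i]                         # diffusion-controlled: rate ~ 1/viscosity
    alpha[i + 1] = min(alpha[i] + dalpha_dt * dt, 1.0)
    eta[i + 1] = eta0 * np.exp(C * alpha[i + 1])   # viscosity rise with cure (isothermal)

print(f"Viscosity rise over the cure cycle: {eta[-1] / eta0:.1e}x")
```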
Sonic Boom: Six Decades of Research
NASA Technical Reports Server (NTRS)
Maglieri, Domenic J.; Bobbitt, Percy J.; Plotkin, Kenneth J.; Shepherd, Kevin P.; Coen, Peter G.; Richwine, David M.
2014-01-01
Sonic booms generated by aircraft traveling at supersonic speeds have been the subject of extensive aeronautics research for over 60 years. Hundreds of papers have been published that document the experimental and analytical research conducted during this period. The purpose of this publication is to assess and summarize this work and establish the state of the art for researchers just entering the field or for those interested in a particular aspect of the subject. This publication consists of ten chapters that cover the experimental and analytical aspects of sonic boom generation, propagation and prediction, with summary remarks provided at the end of each chapter. Aircraft maneuvers, sonic boom minimization, simulation techniques and devices, as well as human, structural, and other responses to sonic booms, are also discussed. The geometry and boom characteristics of various low-boom concepts, both large civil transports and smaller business-jet concepts, are included. The final chapter presents an assessment of civilian supersonic overland flight and highlights the need for continued research and a low-boom demonstrator vehicle. The studies referenced in this publication have been drawn from over 500 references.
To address accuracy and precision using methods from analytical chemistry and computational physics.
Kozmutza, Cornelia; Picó, Yolanda
2009-04-01
In this work, pesticides were determined by liquid chromatography-mass spectrometry (LC-MS). The occurrence of imidacloprid in 343 samples of oranges, tangerines, date plum, and watermelons from the Valencian Community (Spain) was investigated. Nine additional pesticides were chosen because they are recommended for orchard treatment together with imidacloprid. Mulliken population analysis was applied to present the charge distribution in imidacloprid. Partitioned energy terms and virial ratios were calculated for certain interacting molecules. A new technique based on the comparison of the decomposed total energy terms at various configurations is demonstrated in this work; the interaction ability could be established correctly in the studied case. An attempt is also made to address accuracy and precision, quantities that are well known in experimental measurements. If a precise theoretical description is achieved for the contributing monomers and for the interacting complex, some properties of the latter system can be predicted with good accuracy. Based on simple hypothetical considerations, we estimate the impact of applying such computations on reducing the amount of analytical work.
Vial, Jérôme; Pezous, Benoît; Thiébaut, Didier; Sassiat, Patrick; Teillet, Béatrice; Cahours, Xavier; Rivals, Isabelle
2011-01-30
GCxGC is now recognized as the most suitable analytical technique for the characterization of complex mixtures of volatile compounds; it is implemented worldwide in academic and industrial laboratories. However, in the context of comprehensive analysis of non-target analytes, going beyond visual examination of the color plots remains challenging for most users. We propose a strategy, the discriminant pixel approach, that aims at classifying chromatograms according to the chemical composition of the samples while determining the origin of the discrimination between different classes of samples. After data pre-processing and time-alignment, the discriminatory power of each chromatogram pixel for a given class was defined as its correlation with membership of that class. Using a peak-finding algorithm, the most discriminant pixels were then linked to chromatographic peaks. Finally, cross-checking with mass spectrometry data made it possible to establish relationships with compounds that could consequently be considered candidate class markers. This strategy was applied to a large experimental data set of 145 GCxGC-MS chromatograms of tobacco extracts corresponding to three distinct classes of tobacco. Copyright © 2010 Elsevier B.V. All rights reserved.
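The pixel-class correlation at the heart of the discriminant pixel approach can be sketched as below. The data here are synthetic placeholders, and the pre-processing, time-alignment and peak-finding steps described in the abstract are assumed to have been carried out already.

```python
import numpy as np

# Synthetic stand-in data: rows are aligned chromatograms, columns are pixels
rng = np.random.default_rng(1)
n_chrom, n_pixels = 145, 5000
X = rng.standard_normal((n_chrom, n_pixels))       # pixel intensities after pre-processing
labels = rng.integers(0, 3, n_chrom)               # three classes of tobacco

def discriminant_pixels(X, labels, target_class, top_k=20):
    """Correlation of each pixel intensity with membership of target_class."""
    member = (labels == target_class).astype(float)
    member -= member.mean()
    Xc = X - X.mean(axis=0)
    corr = (Xc * member[:, None]).sum(axis=0) / (
        np.sqrt((Xc ** 2).sum(axis=0)) * np.sqrt((member ** 2).sum()) + 1e-12
    )
    return np.argsort(np.abs(corr))[::-1][:top_k], corr

idx, corr = discriminant_pixels(X, labels, target_class=0)
print("Most discriminant pixels for class 0:", idx[:5])
```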
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaHaye, Nicole L.; Phillips, Mark C.; Duffin, Andrew M.
2016-01-01
Both laser-induced breakdown spectroscopy (LIBS) and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) are well-established analytical techniques with their own unique advantages and disadvantages. The combination of the two analytical methods is a very promising way to overcome the challenges faced by each method individually. We made a comprehensive comparison of local plasma conditions between nanosecond (ns) and femtosecond (fs) laser ablation (LA) sources in a combined LIBS and LA-ICP-MS system. The optical emission spectra and ICP-MS signal were recorded simultaneously for both ns- and fs-LA and figures of merit of the system were analyzed. Characterization of the plasma was conducted by evaluating temperature and density of the plume under various irradiation conditions using optical emission spectroscopy, and correlations to ns- and fs-LIBS and LA-ICP-MS signal were made. The present study is very useful for providing conditions for a multimodal system as well as giving insight into how laser ablation plume parameters are related to LA-ICP-MS and LIBS results for both ns- and fs-LA.
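Plasma excitation temperature from optical emission spectra is commonly estimated with a Boltzmann plot; the sketch below illustrates that generic approach with synthetic line data generated at an assumed temperature, not the lines or conditions used in this study.

```python
import numpy as np

k_B = 8.617e-5                      # Boltzmann constant, eV/K
T_true = 9000.0                     # assumed plasma temperature used to generate synthetic data

# Hypothetical emission lines: wavelength (nm), upper-level degeneracy g_k,
# transition probability A_ki (1/s), upper-level energy E_k (eV)
lam = np.array([390.0, 420.0, 450.0, 480.0])
g   = np.array([2, 4, 6, 4])
A   = np.array([5e7, 3e7, 8e7, 2e7])
E   = np.array([3.1, 3.8, 4.4, 5.0])

# Synthetic line intensities following a Boltzmann population (arbitrary scale)
I = (g * A / lam) * np.exp(-E / (k_B * T_true))

# Boltzmann plot: ln(I*lam/(g*A)) vs E_k has slope -1/(k_B*T)
y = np.log(I * lam / (g * A))
slope = np.polyfit(E, y, 1)[0]
print(f"Recovered excitation temperature: {-1.0 / (k_B * slope):.0f} K")
```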
Noegrohati, Sri; Hernadi, Elan; Asviastuti, Syanti
2018-06-01
Production of red-flesh dragon fruit (Hylocereus polyrhizus) is hampered by Colletotrichum sp. Pre-harvest application of an azoxystrobin and difenoconazole mixture is recommended; therefore, a selective and sensitive multi-residue analytical method is required for monitoring and evaluating the commodity's safety. LC-MS/MS is a well-established analytical technique for qualitative and quantitative determination in complex matrices. However, the method is hindered by interference from co-eluting co-extractives. This work evaluated the effect of pH by comparing acetate-buffered and citrate-buffered QuEChERS sample preparation with respect to their effectiveness in reducing matrix effects. Citrate-buffered QuEChERS produced a clean final extract with a relative matrix effect of 0.4%-0.7%. Method validation of the selected sample preparation followed by LC-MS/MS for whole dragon fruit, flesh and peel matrices fortified at 0.005, 0.01, 0.1 and 1 µg/g showed recoveries of 75%-119% and intermediate repeatability of 2%-14%. The expanded uncertainties were 7%-48%. Based on the international acceptance criteria, this method is valid.
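A relative matrix effect of the kind quoted above is commonly derived from the slopes of matrix-matched and solvent calibration curves. The sketch below shows that standard calculation with hypothetical calibration data; it is not the study's data, nor necessarily the exact formula the authors used.

```python
import numpy as np

# Hypothetical calibration data for one analyte (e.g., azoxystrobin)
conc = np.array([0.005, 0.01, 0.05, 0.1, 0.5, 1.0])           # ug/g
area_solvent = np.array([52, 104, 515, 1030, 5150, 10300.0])  # solvent standards
area_matrix  = np.array([52, 103, 517, 1035, 5180, 10350.0])  # matrix-matched standards

slope_solvent = np.polyfit(conc, area_solvent, 1)[0]
slope_matrix  = np.polyfit(conc, area_matrix, 1)[0]

# Common definition: ME% = (slope_matrix / slope_solvent - 1) * 100
me_percent = (slope_matrix / slope_solvent - 1.0) * 100.0
print(f"Matrix effect: {me_percent:+.1f}%")
```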
Quality control analytical methods: refractive index.
Allen, Loyd V
2015-01-01
There are numerous analytical methods that can be utilized in a compounding pharmacy for a quality-assurance program. Since the index of refraction of a liquid/solution is a physical constant, it can be used to assist in identification of a substance, establish its purity, and, in some instances, to determine the concentration of an analyte in solution. This article serves as an introduction to refractive index and some applications of its use in a compounding program.
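Since refractive index varies approximately linearly with solute concentration over modest ranges for many solutions, the concentration determination mentioned above usually amounts to a simple calibration curve. The values below are illustrative, not taken from the article.

```python
import numpy as np

# Illustrative refractive-index calibration for an aqueous solution at 20 °C
conc = np.array([0.0, 5.0, 10.0, 15.0, 20.0])                # % w/w standards
n_D  = np.array([1.3330, 1.3403, 1.3478, 1.3557, 1.3639])    # measured refractive indices

slope, intercept = np.polyfit(conc, n_D, 1)                   # n_D ~ slope*conc + intercept

n_sample = 1.3520                                             # refractometer reading of the unknown
conc_sample = (n_sample - intercept) / slope
print(f"Estimated concentration: {conc_sample:.1f} % w/w")
```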
Superabsorption of light via quantum engineering
Higgins, K. D. B.; Benjamin, S. C.; Stace, T. M.; Milburn, G. J.; Lovett, B. W.; Gauger, E. M.
2014-01-01
Almost 60 years ago Dicke introduced the term superradiance to describe a signature quantum effect: N atoms can collectively emit light at a rate proportional to N². Structures that superradiate must also have enhanced absorption, but the former always dominates in natural systems. Here we show that this restriction can be overcome by combining several well-established quantum control techniques. Our analytical and numerical calculations show that superabsorption can then be achieved and sustained in certain simple nanostructures, by trapping the system in a highly excited state through transition rate engineering. This opens the prospect of a new class of quantum nanotechnology with potential applications including photon detection and light-based power transmission. An array of quantum dots or a molecular ring structure could provide a suitable platform for an experimental demonstration. PMID:25146588
Efficient management design for swimming exercise treatment.
Kim, Kyunghun; Kyung, Taewon; Kim, Wonhyun; Shin, Chungsick; Song, Youngjae; Lee, Moo Yeol; Lee, Hyunwoo; Cho, Yongchan
2009-12-01
Exercise-mediated physical treatment has attracted much recent interest. In particular, swimming is a representative exercise treatment method recommended for patients experiencing muscular and cardiovascular diseases. The present study sought to design a swimming-based exercise treatment management system. A survey questionnaire was completed by participants to assess the prevalence of muscular and cardiovascular diseases among adult males and females participating in swimming programs at sport centers in metropolitan regions of the country. Using the Fuzzy Analytic Hierarchy Process (AHP) technique, weighted values of the indices were determined to maximize participant clarity. A patient management system model was devised using information technology. The favorable results are evidence of the validity of this approach. Additionally, the swimming-based exercise management system can be supplemented with analyses of weighted values that consider the connectivity between the established indices.
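For orientation on the weighting step, the sketch below shows the classical (crisp) AHP priority-weight calculation from a pairwise comparison matrix via the principal eigenvector, together with a consistency check. The comparison matrix is hypothetical, and the study's fuzzy AHP variant adds fuzzification on top of this basic procedure.

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparison matrix for three management indices
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # normalized priority weights

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix)
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
print("weights:", np.round(weights, 3), "consistency ratio:", round(CR, 3))
```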
Low Cost Solar Array Project: Composition Measurements by Analytical Photo Catalysis
NASA Technical Reports Server (NTRS)
Sutton, D. G.; Galvan, L.; Melzer, J.; Heidner, R. F., III
1979-01-01
The applicability of the photon catalysis technique for composition analysis of silicon samples is discussed. A detector for the impurities Al, Cr, Fe, Mn, Ti, V, Mo and Zr is evaluated. During the first reporting period, Al, Cr, Fe, and Mn were detected with the photon catalysis method. The best fluorescence lines to monitor were established, and initial sensitivities to each of these elements were determined by atomic absorption calibration. In the course of these tests, vapor pressure curves for these four pure substances were also mapped. Ti and Si were detected; the best lines to monitor were catalogued and vapor pressure curves were determined. Attempts to detect vanadium were unsuccessful due to the refractory nature of this element and the limited temperature range of the evaporator.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lombard, K.H.
1994-08-01
The objectives of this test plan are to show the value added by using bioremediation as an effective and environmentally sound method to remediate petroleum contaminated soils (PCS) by: demonstrating bioremediation as a permanent method for remediating soils contaminated with petroleum products; establishing the best operating conditions for maximizing bioremediation and minimizing volatilization for SRS PCS during different seasons; determining the minimum set of analyses and sampling frequency to allow efficient and cost-effective operation; determining best use of existing site equipment and personnel to optimize facility operations and conserve SRS resources; and, as an ancillary objective, demonstrating and optimizing new and innovative analytical techniques that will lower cost, decrease time, and decrease secondary waste streams for required PCS assays.