Science.gov

Sample records for analytical techniques based

  1. Analytical Method for Selecting a Rectification Technique for a Piezoelectric Generator based on Admittance Measurement

    NASA Astrophysics Data System (ADS)

    Mateu, Loreto; Zessin, Henrik; Spies, Peter

    2013-12-01

    AC-DC converters employed for harvesting power from piezoelectric transducers can be divided into linear (i.e. diode bridge) and non-linear (i.e. synchronized switch harvesting on inductor, SSHI) types. This paper presents an analytical technique, based on measurement of the impedance circle of the piezoelectric element, to determine whether a diode bridge or an SSHI converter harvests more of the available power at the piezoelectric element.

  2. A novel analytical approximation technique for highly nonlinear oscillators based on the energy balance method

    NASA Astrophysics Data System (ADS)

    Hosen, Md. Alal; Chowdhury, M. S. H.; Ali, Mohammad Yeakub; Ismail, Ahmad Faris

    In the present paper, a novel analytical approximation technique based on the energy balance method (EBM) is proposed to obtain approximate periodic solutions for generalized highly nonlinear oscillators. The natural frequency-amplitude relationships are obtained in a novel analytical way. The accuracy of the proposed method is investigated on three benchmark oscillatory problems, namely the simple relativistic oscillator, the stretched elastic wire oscillator (with a mass attached to its midpoint) and the Duffing-relativistic oscillator. For an initial oscillation amplitude A0 = 100, the maximal relative errors of the natural frequency found for the three oscillators are 2.1637%, 0.0001% and 1.201%, respectively, which are much lower than the errors obtained with existing methods. Remarkably, the approximate natural frequency remains in excellent agreement with the exact values over the whole range of large oscillation amplitudes. The very simple solution procedure and the high accuracy found in the three benchmark problems reveal the novelty, reliability and wider applicability of the proposed analytical approximation technique.
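
    The energy balance idea can be illustrated on the classical Duffing oscillator; the short derivation below is a generic textbook application of the energy balance method to u'' + u + εu³ = 0, offered only as orientation, not as the paper's own treatment of its relativistic and stretched-wire benchmark problems.

    ```latex
    % Energy balance method (EBM) sketch for the Duffing oscillator
    % \ddot{u} + u + \varepsilon u^{3} = 0, \qquad u(0) = A, \quad \dot{u}(0) = 0.
    % The conserved energy (Hamiltonian), evaluated at the initial condition, gives
    H = \tfrac{1}{2}\dot{u}^{2} + \tfrac{1}{2}u^{2} + \tfrac{1}{4}\varepsilon u^{4}
      = \tfrac{1}{2}A^{2} + \tfrac{1}{4}\varepsilon A^{4}.
    % Substituting the trial solution u = A\cos\omega t and collocating at \omega t = \pi/4:
    \tfrac{1}{4}\omega^{2}A^{2} + \tfrac{1}{4}A^{2} + \tfrac{1}{16}\varepsilon A^{4}
      = \tfrac{1}{2}A^{2} + \tfrac{1}{4}\varepsilon A^{4}
    \quad\Longrightarrow\quad
    \omega(A) = \sqrt{1 + \tfrac{3}{4}\varepsilon A^{2}}.
    ```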

  3. The Efficiency of South African Universities: A Study Based on the Analytical Review Technique.

    ERIC Educational Resources Information Center

    Taylor, B.; Harris, G.

    2002-01-01

    Examined the relative efficiency of 10 South African universities between 1994 and 1997 using the analytical review technique. Identified the relatively efficient, the relatively inefficient, and the least efficient institutions. Important efficiency factors include student population dimensions, quality and deployment of personnel resources, and allocation of…

  4. Novel Analytical Techniques Based on an Enhanced Electron Attachment Process - Final Report - 09/15/1996 - 06/15/2001

    SciTech Connect

    Pinnaduwage, Lal A.; Buchanan, Michelle V.

    2001-06-15

    Present analytical methodologies for the detection of chlorinated compounds important to DOE's environmental restoration program, such as DNAPLs [dense non-aqueous phase liquids - such as carbon tetrachloride, trichloroethylene (TCE), perchloroethylene (PCE)], polychlorinated biphenyls (PCB), and others, involve detection by negative-ion-based analytical techniques. These techniques exploit electron attachment to analyte molecules in their ground electronic states, and are limited to particular compounds with appropriate electron capture cross sections. For example, PCB contamination is detected by analysis of mixtures of chlorinated homologues of these biphenyls. Homologues with lower numbers of chlorines do not efficiently attach thermal electrons and thus are not detected by either electron capture chromatographic detectors or by negative ion chemical ionization mass spectrometry. We proposed three novel analytical techniques based on enhanced negative-ion formation via electron attachment to highly-excited electronic states of molecules. In one of the proposed techniques, the excited states of the (analyte) molecules are populated via laser excitation; the resulting negative ions are mass analyzed for identification. The other two proposed techniques utilize a specialized gas discharge for the formation of excited species (and low-energy electrons for attachment), and thus will provide a cost-effective method if successful. In one version, the negative ions will be mass analyzed -as in the laser-based technique- and in the other, the decrease in electron density due to excited state attachment will be monitored (electron capture detector mode). A plasma mixing scheme will be employed to excite the analyte molecules so that the excited states of the analyte molecules will not be destroyed by the discharge.

  5. A general, cryogenically-based analytical technique for the determination of trace quantities of volatile organic compounds in the atmosphere

    NASA Technical Reports Server (NTRS)

    Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.

    1985-01-01

    An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.

  6. Flexible microfluidic cloth-based analytical devices using a low-cost wax patterning technique.

    PubMed

    Nilghaz, Azadeh; Wicaksono, Dedy H B; Gustiono, Dwi; Abdul Majid, Fadzilah Adibah; Supriyanto, Eko; Abdul Kadir, Mohammed Rafiq

    2012-01-01

    This paper describes the fabrication of microfluidic cloth-based analytical devices (μCADs) using a simple wax patterning method on cotton cloth for performing colorimetric bioassays. Commercial cotton cloth fabric is proposed as a new inexpensive, lightweight, and flexible platform for fabricating two- (2D) and three-dimensional (3D) microfluidic systems. We demonstrated that the wicking property of the cotton microfluidic channel can be improved by scouring in soda ash (Na(2)CO(3)) solution which will remove the natural surface wax and expose the underlying texture of the cellulose fiber. After this treatment, we fabricated narrow hydrophilic channels with hydrophobic barriers made from patterned wax to define the 2D microfluidic devices. The designed pattern is carved on wax-impregnated paper, and subsequently transferred to attached cotton cloth by heat treatment. To further obtain 3D microfluidic devices having multiple layers of pattern, a single layer of wax patterned cloth can be folded along a predefined folding line and subsequently pressed using mechanical force. All the fabrication steps are simple and low cost since no special equipment is required. Diagnostic application of cloth-based devices is shown by the development of simple devices that wick and distribute microvolumes of simulated body fluids along the hydrophilic channels into reaction zones to react with analytical reagents. Colorimetric detection of bovine serum albumin (BSA) in artificial urine is carried out by direct visual observation of bromophenol blue (BPB) colour change in the reaction zones. Finally, we show the flexibility of the novel microfluidic platform by conducting a similar reaction in a bent pinned μCAD.

  8. ARPEFS as an analytic technique

    SciTech Connect

    Schach von Wittenau, A.E.

    1991-04-01

    Two modifications to the ARPEFS technique are introduced. These are studied using p(2 × 2)S/Cu(001) as a model system. The first modification is the acquisition of ARPEFS χ(k) curves at temperatures as low as our equipment will permit. While adding to the difficulty of the experiment, this modification is shown to almost double the signal-to-noise ratio of normal-emission p(2 × 2)S/Cu(001) χ(k) curves. This is shown by visual comparison of the raw data and by the improved precision of the extracted structural parameters. The second change is the replacement of manual fitting of the Fourier-filtered χ(k) curves by the use of the simplex algorithm for parameter determination. Again using p(2 × 2)S/Cu(001) data, this is shown to result in better agreement between experimental χ(k) curves and curves calculated from model structures. The improved ARPEFS is then applied to p(2 × 2)S/Ni(111) and (√3 × √3)R30°S/Ni(111). For p(2 × 2)S/Cu(001) we find a S-Cu bond length of 2.26 Å, with the S adatom 1.31 Å above the fourfold hollow site. The second Cu layer appears to be corrugated. Analysis of the p(2 × 2)S/Ni(111) data indicates that the S adatom adsorbs onto the FCC threefold hollow site 1.53 Å above the Ni surface. The S-Ni bond length is determined to be 2.13 Å, indicating an outwards shift of the first-layer Ni atoms. We are unable to assign a unique structure to (√3 × √3)R30°S/Ni(111). An analysis of the strengths and weaknesses of ARPEFS as an experimental and analytic technique is presented, along with a summary of problems still to be addressed.
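
    A rough sketch of the second modification (simplex parameter determination replacing manual fitting of Fourier-filtered χ(k) curves): the scattering model, parameter names and data below are hypothetical stand-ins, not the actual ARPEFS scattering calculation; only the use of the Nelder-Mead simplex minimizer reflects the approach described above.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def model_chi(k, path_length, amplitude, phase):
        # Placeholder single-scattering form standing in for the real ARPEFS model.
        return amplitude * np.sin(2.0 * k * path_length + phase)

    def misfit(params, k, chi_measured):
        # Sum-of-squares difference between the model and the Fourier-filtered data.
        path_length, amplitude, phase = params
        return np.sum((model_chi(k, path_length, amplitude, phase) - chi_measured) ** 2)

    # k grid (inverse Angstroms) and a synthetic "measured" curve for demonstration.
    k = np.linspace(4.0, 12.0, 200)
    chi_measured = model_chi(k, 2.26, 0.15, 0.4) + np.random.normal(0, 0.01, k.size)

    # Downhill-simplex (Nelder-Mead) minimization, used in place of manual fitting.
    result = minimize(misfit, x0=[2.0, 0.1, 0.0], args=(k, chi_measured),
                      method="Nelder-Mead")
    print("fitted path length (Angstrom):", result.x[0])
    ```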

  9. Mapping of different structures on large area of granite sample using laser-ablation based analytical techniques, an exploratory study

    NASA Astrophysics Data System (ADS)

    Novotný, K.; Kaiser, J.; Galiová, M.; Konečná, V.; Novotný, J.; Malina, R.; Liška, M.; Kanický, V.; Otruba, V.

    2008-10-01

    Laser-ablation based analytical techniques represent a simple way to perform fast chemical analysis of different materials. In this work, an exploratory study of multi-element (Ca, Al, Fe, Mn) mapping of a granite sample surface was performed by laser-induced breakdown spectroscopy (LIBS) and subsequently by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS). The operating parameters (e.g. pulse energy, ablation-crater size) were optimized for both techniques in order to achieve conditions appropriate for two-dimensional high-resolution compositional mapping of mineral microstructures over large sample areas. The sample was scanned with 100 × 100 individual sample points to map an area of 20 × 20 mm². The normalized signals were used to construct contour plots, which were colored according to the local distribution of the selected elements. The results of the two laser-based methods were compared and found to be similar.
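
    A minimal sketch of the mapping step described above (point-by-point signals on a 100 × 100 grid over 20 × 20 mm², normalized and rendered as colored contour plots); the intensities here are randomly generated placeholders and the normalization by total signal is only an assumed choice.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    # 100 x 100 sampling points covering a 20 x 20 mm area (0.2 mm step).
    x = np.linspace(0, 20, 100)
    y = np.linspace(0, 20, 100)

    # Hypothetical raw line intensities for one element (e.g. Ca) and the total signal.
    raw_ca = np.random.rand(100, 100)
    total = raw_ca + np.random.rand(100, 100) * 3.0

    # Normalize each point by its total signal to suppress shot-to-shot fluctuations.
    normalized_ca = raw_ca / total

    # Render the two-dimensional elemental distribution as a colored contour plot.
    plt.contourf(x, y, normalized_ca, levels=20, cmap="viridis")
    plt.colorbar(label="normalized Ca signal")
    plt.xlabel("x (mm)")
    plt.ylabel("y (mm)")
    plt.title("Elemental map (illustrative data)")
    plt.show()
    ```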

  10. Cost and schedule analytical techniques development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.

  11. New analytical techniques in food science.

    PubMed

    Ibañez, E; Cifuentes, A

    2001-09-01

    In this review, some of the latest analytical techniques that are being used for the study and characterization of food are examined. This work intends to provide an updated overview (including works published up to June 1999) on the principal applications of such techniques together with their main advantages and drawbacks in food analysis. Some future developments of these systems and their foreseeable application in food characterization are also discussed. The reviewed techniques are those based on spectroscopic, biological, separation, and electrochemical procedures. Moreover, some relevant facts on new systems for sample preparation and on-line couplings are also given.

  12. Accelerator-based analytical technique in the evaluation of some Nigeria’s natural minerals: Fluorite, tourmaline and topaz

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Ige, O. A.; Mazzoli, C.; Ceccato, D.; Akintunde, J. A.; De Poli, M.; Moschini, G.

    2005-10-01

    For the first time, the complementary accelerator-based analytical techniques of PIXE and electron microprobe analysis (EMPA) were employed for the characterization of some of Nigeria's natural minerals, namely fluorite, tourmaline and topaz. These minerals occur in different areas of Nigeria. They are mainly used as gemstones and for other scientific and technological applications and are therefore very important. There is a need to characterize them in order to assess the quality of these gemstones and to update the geochemical data on them, geared towards useful applications. PIXE analysis was carried out using the 1.8 MeV collimated proton beam from the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy. The novel results, which show many elements at different concentrations in these minerals, are presented and discussed.

  13. Improved seismic risk estimation for Bucharest, based on multiple hazard scenarios, analytical methods and new techniques

    NASA Astrophysics Data System (ADS)

    Toma-Danila, Dragos; Florinela Manea, Elena; Ortanza Cioflan, Carmen

    2014-05-01

    Bucharest, the capital of Romania (1,678,000 inhabitants in 2011), is one of the big cities in Europe most exposed to seismic damage. The major earthquakes affecting the city originate in the Vrancea region. The Vrancea intermediate-depth source generates, statistically, 2-3 shocks with moment magnitude >7.0 per century. Although the focal distance is greater than 170 km, the historical records (from the 1838, 1894, 1908, 1940 and 1977 events) reveal severe effects in the Bucharest area, e.g. intensity IX (MSK) for the 1940 event. During the 1977 earthquake, 1420 people were killed and 33 large buildings collapsed. The present-day building stock is vulnerable both due to construction (material, age) and soil conditions (high amplification generated within the weakly consolidated Quaternary deposits, whose thickness varies between 250 and 500 m throughout the city). Of 2563 old buildings evaluated by experts, 373 are considered likely to experience severe damage or collapse in the next major earthquake. The total number of residential buildings in 2011 was 113,900. In order to guide mitigation measures, different studies have tried to estimate the seismic risk of Bucharest in terms of buildings, population or economic damage probability. Unfortunately, most of them were based on incomplete sets of data, whether regarding the hazard or the building stock in detail. However, during the DACEA Project, the National Institute for Earth Physics, together with the Technical University of Civil Engineering Bucharest and the NORSAR Institute, managed to compile a database of buildings in southern Romania (according to the 1999 census), with 48 associated capacity and fragility curves. Until now, the developed real-time estimation system had not been implemented for Bucharest. This paper presents more than an adaptation of this system to Bucharest; first, we analyze the previous seismic risk studies from a SWOT perspective. This reveals that most of the studies don't use…
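
    The fragility curves mentioned above are conventionally written as lognormal functions of ground-motion intensity; the expression below is the generic form, stated here for context rather than as the DACEA project's specific parameterization.

    ```latex
    % Probability of reaching or exceeding damage state ds at intensity IM:
    P(DS \ge ds \mid IM) = \Phi\!\left( \frac{\ln(IM/\theta_{ds})}{\beta_{ds}} \right)
    % \Phi: standard normal CDF; \theta_{ds}: median capacity (in units of the chosen
    % intensity measure, e.g. PGA); \beta_{ds}: lognormal standard deviation of the curve.
    ```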

  14. Fabrication techniques for microfluidic paper-based analytical devices and their applications for biological testing: A review.

    PubMed

    Xia, Yanyan; Si, Jin; Li, Zhiyang

    2016-03-15

    Paper is increasingly recognized as a user-friendly and ubiquitous substrate for the construction of microfluidic devices. Microfluidic paper-based analytical devices (μPADs) provide an alternative technology for the development of affordable, portable, disposable and low-cost diagnostic tools for improving point-of-care testing (POCT) and disease screening in the developing world, especially in those countries with no or low infrastructure and limited trained medical and health professionals. In this review we present fabrication techniques for such microfluidic devices and their respective applications for biological detection as reported to date. These include: (i) fabrication techniques: examples of devices fabricated by using two-dimensional (2D) and three-dimensional (3D) methods; (ii) detection applications: biochemical, immunological and molecular detection incorporating efficient detection methods such as colorimetric, electrochemical, fluorescence, chemiluminescence (CL), electrochemiluminescence (ECL) and photoelectrochemical (PEC) detection, and so on. In addition, the main advantages, disadvantages and future trends for the devices are also discussed in this review.

  15. Analytical techniques for ambient sulfate aerosols

    SciTech Connect

    Johnson, S.A.; Graczyk, D.G.; Kumar, R.; Cunningham, P.T.

    1981-06-01

    Work done to further develop the infrared spectroscopic analytical method for the analysis of atmospheric aerosol particles, as well as some exploratory work on a new procedure for determining proton acidity in aerosol samples, is described. Earlier work had led to the successful use of infrared (ir) spectrophotometry for the analysis of nitrate, ammonium, and neutral and acidic sulfates in aerosol samples collected by an impactor on a Mylar-film substrate. In this work, a filter-extraction method was developed to prepare filter-collected aerosol samples for ir analysis. A study was made comparing the ir analytical results on filter-collected samples with impactor-collected samples. Also, the infrared analytical technique was compared in field studies with light-scattering techniques for aerosol analysis. A highly sensitive instrument for aerosol analysis using attenuated total internal reflection (ATR) infrared spectroscopy was designed, built, and tested. This instrument provides a measurement sensitivity much greater (by a factor of 6 for SO₄²⁻) than that obtainable using the KBr-pellet method. The instrument collects size- and time-resolved samples and is potentially capable of providing automated, near real-time aerosol analysis. Exploratory work on a novel approach to the determination of proton acidity in filter- or impactor-collected aerosol samples is also described. In this technique, the acidic sample is reacted with an excess of a tagged, vapor-phase base. The unreacted base is flushed off, and the amount of the tag retained by the sample is a direct measure of the proton acidity of the sample. The base was tagged with Ge, which can be conveniently determined by the x-ray fluorescence technique.

  16. Proteomics: analytical tools and techniques.

    PubMed

    MacCoss, M J; Yates, J R

    2001-09-01

    Scientists have long been interested in measuring the effects of different stimuli on protein expression and metabolism. Analytical methods are being developed for the automated separation, identification, and quantitation of all of the proteins within the cell. Soon, investigators will be able to observe the effects of an experiment on every protein (as opposed to a selected few). This review presents a discussion of recent technological advances in proteomics in addition to exploring current methodological limitations.

  17. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
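
    A minimal example of the Monte Carlo integration mentioned above, estimating a definite integral as the average of the integrand at uniformly sampled points; this is a generic illustration rather than one of the article's supercomputer simulations.

    ```python
    import numpy as np

    def monte_carlo_integral(f, a, b, n_samples=100_000, seed=0):
        # Estimate the integral of f on [a, b] as (b - a) times the mean of f
        # evaluated at uniformly distributed random points.
        rng = np.random.default_rng(seed)
        x = rng.uniform(a, b, n_samples)
        return (b - a) * np.mean(f(x))

    # Example: the integral of sin(x) from 0 to pi is exactly 2.
    estimate = monte_carlo_integral(np.sin, 0.0, np.pi)
    print(f"Monte Carlo estimate: {estimate:.4f} (exact value: 2.0)")
    ```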

  18. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    NASA Astrophysics Data System (ADS)

    Idris, N.; Ramli, M.; Mahidin; Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Due to its advantages over conventional analytical tools, the laser-induced breakdown spectroscopy (LIBS) technique is nowadays becoming an emerging analytical tool and is expected to become a mainstay of future analytical work. The technique is based on the use of optical emission from the laser-induced plasma for spectrochemical analysis of the constituents and content of the sampled object. The capability of the technique is examined in the analysis of trace elements in a coal sample. Coal is a difficult sample to analyze due to its complex chemical composition and physical properties. Coal inherently contains trace elements, including heavy metals; thus its mining, beneficiation and utilization pose hazards to the environment and to human beings. The LIBS apparatus used was composed of a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor I*Star intensified CCD of 1024×256 pixels. The emitted laser light was focused onto the coal sample with a +250 mm focusing lens. The plasma emission was collected by a fiber optic and sent to the spectrograph. The coal samples were taken from the Province of Aceh. As a result, several trace elements, including heavy metals (As, Mn, Pb), could clearly be observed, demonstrating the potential of the LIBS technique to detect trace elements in coal.
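
    As a loose illustration of the spectrochemical analysis step (picking out emission lines that rise above the plasma continuum), the sketch below locates peaks in a synthetic spectrum; the wavelength positions and detection parameters are invented for demonstration and are not assigned analytical lines from the study.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    # Synthetic emission spectrum: wavelength axis (nm) and a noisy flat continuum.
    wavelength = np.linspace(200.0, 500.0, 3000)
    spectrum = 0.05 * np.random.rand(wavelength.size)

    # Add three Gaussian emission lines at invented positions (demonstration only).
    for line_nm in (250.0, 330.0, 410.0):
        spectrum += np.exp(-0.5 * ((wavelength - line_nm) / 0.3) ** 2)

    # Locate peaks that rise clearly above the continuum background.
    peaks, _ = find_peaks(spectrum, prominence=0.5)
    print("candidate emission lines (nm):", np.round(wavelength[peaks], 1))
    ```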

  19. Trace Analytical Techniques for Nuclear Forensics

    SciTech Connect

    Halverson, J.E.

    1999-04-28

    Over the history of the Savannah River Site, the Savannah River Technology Center (SRTC) has developed high-sensitivity analytical capabilities in support of the Site's Environmental Monitoring Program and nuclear material protection process. Many of these techniques are applicable to the developing need for nuclear forensic analysis capabilities. Radiological and criticality control procedures, as well as clean room practices, are in place at the SRTC to minimize the potential for a radiological evidentiary sample to contaminate personnel and the facility, and to avoid contaminating the sample and thus rendering it useless to law enforcement agencies. Some of the trace analytical techniques available at the SRTC include ultra-low-level gamma and alpha spectrometry, high-sensitivity thermal ionization mass spectrometry, time-of-flight secondary ion mass spectrometry and trace organic analyses. These techniques have been tested during a planned domestic smuggling exercise and in the analysis of an unknown sample. In the event of an interdiction involving the illegal use or movement of radioactive material, forensic analyses will be used by U.S. law enforcement agencies (local, state or federal) in developing and building a legal case against the perpetrators. The Savannah River Technology Center (SRTC) at the U.S. Department of Energy's Savannah River Site, a former nuclear production site currently conducting nuclear material stabilization missions, located in Aiken, South Carolina, has a long history of performing trace analytical analyses for environmental monitoring. Many of these techniques are also applicable to nuclear forensic analyses. A summary of the trace analytical techniques used at the SRTC that are applicable to nuclear forensics is presented in this paper. Contamination control, both of the facilities and personnel involved in the analyses and of the sample itself, is a unique challenge for nuclear forensic analyses…

  1. Differentiation of lemon essential oil based on volatile and non-volatile fractions with various analytical techniques: a metabolomic approach.

    PubMed

    Mehl, Florence; Marti, Guillaume; Boccard, Julien; Debrus, Benjamin; Merle, Philippe; Delort, Estelle; Baroux, Lucie; Raymo, Vilfredo; Velazco, Maria Inés; Sommer, Horst; Wolfender, Jean-Luc; Rudaz, Serge

    2014-01-15

    Due to the importance of citrus lemon oil for industry, fast and reliable analytical methods that allow the authentication and/or classification of such oils according to their origin of production or extraction process are necessary. To evaluate the potential of the volatile and non-volatile fractions for classification purposes, the volatile compounds of cold-pressed lemon oils were analyzed using GC-FID/MS and FT-MIR, while the non-volatile residues were studied using FT-MIR, 1H-NMR and UHPLC-TOF-MS. Sixty-four lemon oil samples from Argentina, Spain and Italy were considered. Unsupervised and supervised multivariate analyses were sequentially performed on the various data blocks obtained by the above techniques. Successful data treatments led to statistically significant models that discriminated and classified cold-pressed lemon oils according to their geographic origin, as well as their production processes. Studying the loadings allowed important classes of discriminant variables to be highlighted, corresponding to putative or identified chemical functions and compounds.
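
    A hedged sketch of the unsupervised step mentioned above (multivariate analysis of a data block followed by inspection of origin clustering): the array shapes, labels and feature matrix below are illustrative placeholders, not the study's actual GC-FID/MS or NMR data blocks.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical data block: 64 lemon-oil samples x 500 spectral/chromatographic variables.
    X = np.random.rand(64, 500)
    origins = np.array(["Argentina"] * 20 + ["Spain"] * 22 + ["Italy"] * 22)

    # Autoscale the variables, then project the samples onto the first two principal components.
    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

    # Inspect whether samples group by geographic origin in the score space.
    for origin in np.unique(origins):
        centroid = scores[origins == origin].mean(axis=0)
        print(origin, "centroid on PC1/PC2:", np.round(centroid, 2))
    ```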

  2. Cost and Schedule Analytical Techniques Development

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This Final Report summarizes the activities performed by Science Applications International Corporation (SAIC) under contract NAS 8-40431 "Cost and Schedule Analytical Techniques Development Contract" (CSATD) during Option Year 3 (December 1, 1997 through November 30, 1998). This Final Report is in compliance with Paragraph 5 of Section F of the contract. This CSATD contract provides technical products and deliverables in the form of parametric models, databases, methodologies, studies, and analyses to the NASA Marshall Space Flight Center's (MSFC) Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) and other user organizations. Detailed Monthly Reports were submitted to MSFC in accordance with the contract's Statement of Work, Section IV "Reporting and Documentation". These reports spelled out each month's specific work performed, deliverables submitted, major meetings conducted, and other pertinent information. Therefore, this Final Report will summarize these activities at a higher level. During this contract Option Year, SAIC expended 25,745 hours in the performance of tasks called out in the Statement of Work. This represents approximately 14 full-time EPs. Included are the Huntsville-based team, plus SAIC specialists in San Diego, Ames Research Center, Tampa, and Colorado Springs performing specific tasks for which they are uniquely qualified.

  3. Nuclear analytical techniques in environmental studies.

    PubMed

    Jervis, R E

    1994-01-01

    Nuclear analytical techniques are particularly suitable for measuring trace components in a wide variety of environmental samples, and for that reason the techniques have made a significant contribution to environmental research. Presently, at a time when biosphere contamination and threats of global change in the atmosphere are of widespread concern, there exists an impressive array of specialized instrumental methods available to life scientists engaged in environmental studies; however, the nuclear techniques will probably continue to play a useful role in the future, notwithstanding the decreasing availability of the necessary facilities, such as research reactors and accelerators. Reasons for the particular suitability of radioanalytical techniques are reviewed and illustrated by examples of recent applications to solid wastes, biomonitoring, and urban aerosol source identification in this laboratory.

  4. Electroextraction and electromembrane extraction: Advances in hyphenation to analytical techniques

    PubMed Central

    Oedit, Amar; Ramautar, Rawi; Hankemeier, Thomas

    2016-01-01

    Electroextraction (EE) and electromembrane extraction (EME) are sample preparation techniques that both require an electric field that is applied over a liquid‐liquid system, which enables the migration of charged analytes. Furthermore, both techniques are often used to pre‐concentrate analytes prior to analysis. In this review an overview is provided of the body of literature spanning April 2012–November 2015 concerning EE and EME, focused on hyphenation to analytical techniques. First, the theoretical aspects of concentration enhancement in EE and EME are discussed to explain extraction recovery and enrichment factor. Next, overviews are provided of the techniques based on their hyphenation to LC, GC, CE, and direct detection. These overviews cover the compounds and matrices, experimental aspects (i.e. donor volume, acceptor volume, extraction time, extraction voltage, and separation time) and the analytical aspects (i.e. limit of detection, enrichment factor, and extraction recovery). Techniques that were either hyphenated online to analytical techniques or show high potential with respect to online hyphenation are highlighted. Finally, the potential future directions of EE and EME are discussed. PMID:26864699
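
    For context, the two figures of merit that the review's theory section relates to each other are usually defined as below; these are standard definitions (with acceptor and donor volumes V_a and V_d), not quoted from the paper.

    ```latex
    % Enrichment factor: final analyte concentration in the acceptor phase
    % relative to the initial concentration in the donor (sample) phase.
    EF = \frac{C_{a,\mathrm{final}}}{C_{d,\mathrm{initial}}}
    % Extraction recovery: fraction of analyte moles transferred to the acceptor.
    R\,(\%) = \frac{n_{a,\mathrm{final}}}{n_{d,\mathrm{initial}}} \times 100
            = EF \cdot \frac{V_{a}}{V_{d}} \times 100
    ```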

  5. Analytical detection techniques for droplet microfluidics--a review.

    PubMed

    Zhu, Ying; Fang, Qun

    2013-07-17

    In the last decade, droplet-based microfluidics has undergone rapid progress in the fields of single-cell analysis, digital PCR, protein crystallization and high-throughput screening. It has proved to be a promising platform for performing chemical and biological experiments with ultra-small volumes (picoliter to nanoliter) and ultra-high throughput. The ability to analyze the content of droplets qualitatively and quantitatively is playing an increasing role in the development and application of droplet-based microfluidic systems. In this review, we summarize the analytical detection techniques used in droplet systems and discuss the advantages and disadvantages of each technique through its applications. The analytical techniques mentioned in this paper include bright-field microscopy, fluorescence microscopy, laser-induced fluorescence, Raman spectroscopy, electrochemistry, capillary electrophoresis, mass spectrometry, nuclear magnetic resonance spectroscopy, absorption detection, chemiluminescence, and sample pretreatment techniques. The importance of analytical detection techniques in enabling new applications is highlighted. We also discuss the future development direction of analytical detection techniques for droplet-based microfluidic systems.

  6. Comparison of four analytical techniques based on atomic spectrometry for the determination of total tin in canned foodstuffs.

    PubMed

    Boutakhrit, K; Crisci, M; Bolle, F; Van Loco, J

    2011-02-01

    Different techniques for the determination of total tin in beverages and canned foods by atomic spectrometry were compared. The performance characteristics of inductively coupled plasma-mass spectrometry (ICP-MS), hydride generation-inductively coupled plasma-atomic emission spectrometry (HG-ICP-AES), electrothermal atomisation-atomic absorption spectrometry (ETA-AAS) and inductively coupled plasma-atomic emission spectrometry (ICP-AES) were determined in terms of linearity, precision, recovery, limit of detection, decision limit (CCα) and detection capability (CCβ) (Decision 2002/657/EC). Calibration ranges covered the ng l⁻¹ to mg l⁻¹ level. Limits of detection of 0.01, 0.05, 2.0 and 200 µg l⁻¹ were reached for ICP-MS, HG-ICP-AES, ETA-AAS and ICP-AES, respectively. Precision, calculated according to ISO 5725-2 for repeatability and within-laboratory reproducibility and expressed as relative standard deviation (RSD), ranged from 1.6% to 4.9%; and recovery, based on Decision 2002/657/EC, was found to be between 95% and 110%. Procedures for the mineralisation or extraction of total tin were compared. Wet digestion, sequentially with nitric acid and hydrogen peroxide, provided the best results. The influence of possible interferences present in canned food and beverages was studied, but no interference in the determination of tin was observed. Since the maximum levels for tin established by European Union legislation vary from 50 mg kg⁻¹ in canned baby foods and infant foods up to 200 mg kg⁻¹ in canned food, ICP-AES was chosen as the preferred technique for routine analysis thanks to its good precision, reliability and ease of use. The accuracy of this routine method was confirmed by participation in six proficiency test schemes with z-scores ranging from -1.9 to 0.6. Several canned foodstuff and beverage samples from a local market were analysed with this technique.
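
    The z-scores quoted for the proficiency tests follow the usual definition, given here for context rather than taken from the paper; |z| ≤ 2 is conventionally regarded as satisfactory.

    ```latex
    z = \frac{x_{\mathrm{lab}} - x_{\mathrm{assigned}}}{\sigma_{p}}
    % x_lab: laboratory result; x_assigned: assigned value of the test material;
    % sigma_p: standard deviation for proficiency assessment.
    ```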

  7. Insights into metals in individual fine particles from municipal solid waste using synchrotron radiation-based micro-analytical techniques.

    PubMed

    Zhu, Yumin; Zhang, Hua; Shao, Liming; He, Pinjing

    2015-01-01

    Excessive inter-contamination with heavy metals hampers the application of biological treatment products derived from mixed or mechanically sorted municipal solid waste (MSW). In this study, we investigated fine particles of <2 mm, which are a small fraction of MSW but, according to bulk detection techniques, constitute a significant component of the total heavy metal content. A total of 17 individual fine particles were evaluated using synchrotron radiation-based micro-X-ray fluorescence and micro-X-ray diffraction. We also discuss the association, speciation and source apportionment of the heavy metals. Metals were found to exist in a diffuse distribution with heterogeneous intensities and intense hot-spots of <10 μm within the fine particles. Zn-Cu, Pb-Fe and Fe-Mn-Cr showed significant correlations in terms of spatial distribution. The overlapping enrichment, spatial association, and mineral phases of the metals revealed the potential sources of the fine particles, either from size-reduced waste fractions (such as scraps of organic wastes or ceramics) or from the importation of other particles. The diverse sources of heavy metal pollutants within the fine particles suggest that separate collection and treatment of the biodegradable waste fraction (such as food waste) is a preferable means of facilitating the beneficial utilization of the stabilized products.
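
    A simple way to quantify the kind of spatial association reported above (e.g. Zn-Cu) is a pixel-wise Pearson correlation between two elemental maps; the sketch below uses randomly generated maps in place of the actual micro-XRF data.

    ```python
    import numpy as np

    # Hypothetical micro-XRF intensity maps for two elements over the same particle.
    zn_map = np.random.rand(64, 64)
    cu_map = 0.8 * zn_map + 0.2 * np.random.rand(64, 64)  # partially correlated by construction

    # Pixel-wise Pearson correlation between the flattened maps.
    r = np.corrcoef(zn_map.ravel(), cu_map.ravel())[0, 1]
    print(f"Zn-Cu spatial correlation coefficient: {r:.2f}")
    ```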

  8. Accelerator-based analytical technique in the study of some anti-diabetic medicinal plants of Nigeria

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Omobuwajo, O. R.; Ceccato, D.; Adebajo, A. C.; Buoso, M. C.; Moschini, G.

    2008-05-01

    Diabetes mellitus, a clinical syndrome characterized by hyperglycemia due to deficiency of insulin, is a disease involving the endocrine pancreas and causes considerable morbidity and mortality in the world. In Nigeria, many plants, especially those implicated in herbal recipes for the treatment of diabetes, have not been screened for their elemental constituents, while information on the phytochemistry of some of them is not available. There is therefore a need to document these constituents, as some of these plants are becoming increasingly important as herbal drugs or food additives. The accelerator-based technique PIXE, using the 1.8 MeV collimated proton beam from the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro (Padova), Italy, was employed in the determination of the elemental constituents of these anti-diabetic medicinal plants. Leaves of Gardenia ternifolia, Caesalpina pulcherrima and Solemostenon monostachys, the whole plant of Momordica charantia and the leaf and stem bark of Hunteria umbellata could be taken as vegetables, nutraceuticals, food additives and supplements in the management of diabetes. However, Hexabolus monopetalus root should be used under prescription.

  10. 40 CFR 141.27 - Alternate analytical techniques.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40, Protection of Environment, Vol. 22 (2010-07-01 ed.): Water Programs (continued), National Primary Drinking Water Regulations, Monitoring and Analytical Requirements, § 141.27 Alternate analytical techniques. (a) With the written permission of the State, concurred in…

  11. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite-current analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  13. Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.

    PubMed

    Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S

    2016-04-01

    Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs), cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices, and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during the development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regression. We recommend the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity.
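
    A minimal sketch of the recommended approach (estimate a power model of variance from replicate standard deviations, then use inverse-variance weights in the calibration regression); the concentrations, replicate counts and signals below are invented for illustration.

    ```python
    import numpy as np

    # Hypothetical calibration data: concentrations and replicate signals at each level.
    conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
    signals = np.array([[1.1, 0.9, 1.0], [2.1, 1.9, 2.0], [5.2, 4.8, 5.1],
                        [10.4, 9.7, 10.1], [20.8, 19.5, 20.2], [52.0, 48.5, 50.5]])

    mean_signal = signals.mean(axis=1)
    std_signal = signals.std(axis=1, ddof=1)

    # Fit the power model of variance, sigma = a * signal**b, via a log-log regression.
    b, log_a = np.polyfit(np.log(mean_signal), np.log(std_signal), 1)
    sigma_model = np.exp(log_a) * mean_signal ** b

    # Weighted linear calibration: numpy applies the weights to the residuals,
    # so pass 1/sigma (inverse standard deviation) rather than 1/sigma**2.
    slope, intercept = np.polyfit(conc, mean_signal, 1, w=1.0 / sigma_model)
    print(f"power-model exponent b = {b:.2f}; calibration: y = {slope:.3f}x + {intercept:.3f}")
    ```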

  14. Analytical techniques for direct identification of biosignatures and microorganisms

    NASA Astrophysics Data System (ADS)

    Cid, C.; Garcia-Descalzo, L.; Garcia-Lopez, E.; Postigo, M.; Alcazar, A.; Baquero, F.

    2012-09-01

    Rover missions to potentially habitable ecosystems require portable instruments that use minimal power, require no sample preparation, and provide suitably diagnostic information to an Earth-based exploration team. In exploration of terrestrial analogue environments of potentially habitable ecosystems it is important to screen rapidly for the presence of biosignatures and microorganisms and especially to identify them accurately. In this study, several analytical techniques for the direct identification of biosignatures and microorganisms in different Earth analogues of habitable ecosystems are compared.

  15. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  16. Ancient mercury-based plating methods: combined use of surface analytical techniques for the study of manufacturing process and degradation phenomena.

    PubMed

    Ingo, Gabriel Maria; Guida, Giuseppe; Angelini, Emma; Di Carlo, Gabriella; Mezzi, Alessio; Padeletti, Giuseppina

    2013-11-19

    Fire gilding and silvering are age-old mercury-based processes used to coat the surface of less precious substrates with thin layers of gold or silver. In ancient times, these methods were used to produce and decorate different types of artefacts, such as jewels, statues, amulets, and commonly used objects. Gilders performed these processes not only to decorate objects but also to simulate the appearance of gold or silver, sometimes fraudulently. From a technological point of view, the aim of these workmen over 2000 years ago was to make the precious metal coatings as thin and adherent as possible. This was in order to save expensive metals and to improve the resistance to the wear caused by continued use and circulation. Without knowledge of the chemical-physical processes, the ancient craftsmen systematically manipulated these metals to create functional and decorative artistic objects. The mercury-based methods were also used fraudulently in ancient times to produce objects such as jewels and coins that looked like they were made of silver or gold but actually had a less precious core. These coins were minted by counterfeiters but also by the official issuing authorities, the latter probably because of a lack of precious metals, reflecting periods of severe economic conditions. In this Account, we discuss some representative cases of gold- and silver-coated objects, focusing on unique and valuable works of art from the Roman and Dark Ages periods, such as St. Ambrogio's altar (825 AD), as well as commonly used objects. We carried out the investigations using surface analytical methods, such as selected-area X-ray photoelectron spectroscopy and scanning electron microscopy combined with energy-dispersive spectroscopy. We used these methods to investigate the surface and subsurface chemical features of these important examples of art and technology, interpreting some aspects of the manufacturing methods and disclosing degradation agents and mechanisms. These findings…

  18. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable piece of cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can be used to complement the results of high-energy-based techniques.

  19. Analytical pervaporation: a key technique in the enological laboratory.

    PubMed

    Luque de Castro, Maria D; Luque-García, Jose L; Mataix, Eva

    2003-01-01

    This paper reviews the use of analytical pervaporation (defined as the integration of 2 different analytical separation principles, evaporation and gas diffusion, in a single micromodule) coupled to flow-injection manifolds for the determination of analytes of interest in enology; the review discusses the advantages that these techniques can provide in wine analytical laboratories. Special attention is given to methods that enable the determination of either of 2 volatile analytes, or of one volatile analyte and one nonvolatile analyte by taking advantage of the versatility of the designed approaches. In a comparison of these methods with the official and/or standard methods, the results showed good agreement. In addition, the new methods offer improvements in linear determination range, quantitation limit, precision, rapidity, and potential for full automation. Thus, this review demonstrates that although the old technologies used in wine analytical laboratories may be supported by official and standard methods, they should be replaced by properly validated, new, and automated technologies.

  20. Regression techniques and analytical solutions to demonstrate intrinsic bioremediation

    SciTech Connect

    Buscheck, T.E.; Alcantar, C.M.

    1995-12-31

    It is now generally recognized that a major factor responsible for the attenuation and mass reduction of benzene, toluene, ethylbenzene, and xylenes (BTEX) in groundwater plumes is hydrocarbon biodegradation by indigenous microorganisms in aquifer material. The objective is to apply well-known regression techniques and analytical solutions to estimate the contribution of advection, dispersion, sorption, and biodecay to the overall attenuation of petroleum hydrocarbons. These calculations yield an apparent biodecay rate based on field data. This biodecay rate accounts for a significant portion of the overall attenuation in stable, dissolved hydrocarbon plumes.
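
    The apparent first-order biodecay rate referred to here is typically obtained by regressing ln(concentration) against downgradient distance along the plume centerline and then correcting for dispersion and retardation; a widely reproduced form of this relationship (as quoted in natural-attenuation guidance, not verbatim from this abstract) is shown below.

    ```latex
    \lambda = \frac{v_{c}}{4\,\alpha_{x}}
              \left[ \left( 1 + 2\,\alpha_{x}\,\frac{k}{v_{x}} \right)^{2} - 1 \right]
    % k/v_x: slope of the ln(C)-versus-distance regression along the centerline;
    % alpha_x: longitudinal dispersivity; v_c: retarded contaminant velocity.
    ```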

  1. Review of analytical techniques for arson residues.

    PubMed

    Pert, Alastair D; Baron, Mark G; Birkett, Jason W

    2006-09-01

    Arson is a serious crime that affects society through cost, property damage, and loss of life. It is important that the methods and technologies applied by fire investigators in the detection of evidence and subsequent analyses have a high degree of reliability and sensitivity, and be subject to rigorous quality control and assurance. There have been considerable advances in the field of arson investigation since the 1950s. Classification of ignitable liquids has been updated to include many new categories due to developments in the petroleum industry. Techniques such as steam or vacuum distillation and gas chromatography (GC) with flame ionization detection that may have been considered acceptable (even a benchmark) 40 years ago are nowadays generally disfavored, to the extent that their implementation may almost be considered as ignorance in the field. The advent of readily available mass spectrometric techniques has revolutionized the field of fire debris analysis, increasing considerably the degree of sensitivity and discrimination possible. Multi-dimensional GC, particularly GC x GC, while not yet widely applied, is rapidly gaining recognition as an important technique. This comprehensive review focuses on techniques and practices used in fire investigation, from scene investigation to analysis.

  2. Using Analytical Techniques to Interpret Financial Statements.

    ERIC Educational Resources Information Center

    Walters, Donald L.

    1986-01-01

    Summarizes techniques for interpreting the balance sheet and the statement of revenues, expenditures, and changes-in-fund-balance sections of the comprehensive annual financial report required of all school districts. Uses three tables to show intricacies involved and focuses on analyzing favorable and unfavorable budget variances. (MLH)

  3. 40 CFR 141.27 - Alternate analytical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 23 2011-07-01 2011-07-01 false Alternate analytical techniques. 141.27 Section 141.27 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS Monitoring and Analytical...

  4. 40 CFR 141.27 - Alternate analytical techniques.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 23 2014-07-01 2014-07-01 false Alternate analytical techniques. 141.27 Section 141.27 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS Monitoring and Analytical...

  5. A novel analytical technique suitable for the identification of plastics.

    PubMed

    Nečemer, Marijan; Kump, Peter; Sket, Primož; Plavec, Janez; Grdadolnik, Jože; Zvanut, Maja

    2013-01-01

    The enormous development and production of plastic materials in the last century resulted in increasing numbers of such objects. A simple and fast technique to classify different types of plastics could be used in many activities dealing with plastic materials, such as packaging of food and sorting of used plastic materials, and also, if the technique were non-destructive, for the conservation of plastic artifacts in museum collections, a relatively new field of interest since 1990. In our previous paper we introduced a non-destructive technique for fast identification of unknown plastics based on EDXRF spectrometry [1], using as a case study some plastic artifacts archived in the Museum in order to show the advantages of the non-destructive identification of plastic material. To validate our technique, it was necessary to compare its results with those of analytical techniques that are more established and so far rather widely applied in identifying the most common sorts of plastic materials.

  6. Analytical evaluation of two motion washout techniques

    NASA Technical Reports Server (NTRS)

    Young, L. R.

    1977-01-01

    Practical tools were developed which extend the state of the art of moving base flight simulation for research and training purposes. The use of visual and vestibular cues to minimize the actual motion of the simulator itself was a primary consideration. The investigation consisted of optimum programming of motion cues based on a physiological model of the vestibular system to yield 'ideal washout logic' for any given simulator constraints.

  7. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tool/technique requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  8. Cost and Schedule Analytical Techniques Development

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This Final Report summarizes the activities performed by Science Applications International Corporation (SAIC) for the base contract year from December 1, 1994 through November 30, 1995. The Final Report is in compliance with Paragraph 5 of Section F of the contract. This CSATD contract provides technical services and products to the NASA Marshall Space Flight Center's (MSFC) Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02). Detailed Monthly Progress Reports were submitted to MSFC in accordance with the contract's Statement of Work Section IV "Reporting and Documentation". These reports spelled out each month's specific work accomplishments, deliverables submitted, major meetings held, and other pertinent information. This Final Report will summarize these activities at a higher level.

  9. Analytic technique measures aromatics in soil and water

    SciTech Connect

    Roy, K.A.

    1990-12-01

    This paper reports on a technique for detecting aromatic compounds in soil and water. The technique traces its roots to a chemical reaction discovered in 1877. The reaction is an organic synthesis process that has been harnessed for the production of high-octane gasoline, synthetic rubber, plastics and synthetic detergents. More than a century later the same chemistry is used as the basis for an analytical technique that quantifies contamination caused by products.

  10. Waste minimization in analytical chemistry through innovative sample preparation techniques.

    SciTech Connect

    Smith, L. L.

    1998-05-28

    Because toxic solvents and other hazardous materials are commonly used in analytical methods, characterization procedures result in significant and costly amounts of waste. We are developing alternative analytical methods in the radiological and organic areas to reduce the volume or form of the hazardous waste produced during sample analysis. For the radiological area, we have examined high-pressure, closed-vessel microwave digestion as a way to minimize waste from sample preparation operations. Heated solutions of strong mineral acids can be avoided for sample digestion by using the microwave approach. Because reactivity increases with pressure, we examined the use of less hazardous solvents to leach selected contaminants from soil for subsequent analysis. We demonstrated the feasibility of this approach by extracting plutonium from a NET reference material using citric and tartaric acids with microwave digestion. Analytical results were comparable to traditional digestion methods, while hazardous waste was reduced by a factor of ten. We also evaluated the suitability of other natural acids, determined the extraction performance on a wider variety of soil types, and examined the extraction efficiency of other contaminants. For the organic area, we examined ways to minimize the wastes associated with the determination of polychlorinated biphenyls (PCBs) in environmental samples. Conventional methods for analyzing semivolatile organic compounds are labor intensive and require copious amounts of hazardous solvents. For soil and sediment samples, we have a method to analyze PCBs that is based on microscale extraction using benign solvents (e.g., water or hexane). The extraction is performed at elevated temperatures in stainless steel cells containing the sample and solvent. Gas chromatography-mass spectrometry (GC/MS) was used to quantitate the analytes in the isolated extract. More recently, we developed a method utilizing solid-phase microextraction (SPME) for natural

  11. Semi-analytic technique for analyzing mode-locked lasers

    SciTech Connect

    Usechak, N.G.; Agrawal, G.P.

    2005-03-21

    A semi-analytic tool is developed for investigating pulse dynamics in mode-locked lasers. It provides a set of rate equations for pulse energy, width, and chirp, whose solutions predict how these pulse parameters evolve from one round trip to the next and how they approach their final steady-state values. An actively mode-locked laser is investigated using this technique and the results are in excellent agreement with numerical simulations and previous analytical studies.

  12. Non-destructive micro-analytical differentiation of copper pigments in paint layers of works of art using laboratory-based techniques

    NASA Astrophysics Data System (ADS)

    Švarcová, Silvie; Čermáková, Zdeňka; Hradilová, Janka; Bezdička, Petr; Hradil, David

    2014-11-01

    An unambiguous identification of pigments in paint layers of works of art forms a substantial part of the description of a painting technique, which is essential for the evaluation of the work of art including determination of the period and/or region of its creation as well as its attribution to a workshop or an author. Copper pigments represent a significant group of materials used in historic paintings. Because of their substantial diversity and, on the other hand, similarity, their identification and differentiation is a challenging task. An analytical procedure for unambiguous determination of both mineral-type (azurite, malachite, posnjakite, atacamite, etc.) and verdigris-type (copper acetates) copper pigments in the paint layers is presented, including light microscopy under VIS and UV light, electron microscopy with elemental microanalysis, Fourier transformed infrared micro-spectroscopy (micro-FTIR), and X-ray powder micro-diffraction (micro-XRD). Micro-Raman measurements were largely hindered by fluorescence. The choice of the analytical methods meets the contemporary requirement of a detailed description of various components in heterogeneous and minute samples of paint layers without their destruction. It is beneficial to use the combination of phase sensitive methods such as micro-FTIR and micro-XRD, because it allows the identification of both mineral-type and verdigris-type copper pigments in one paint layer. In addition, preliminary results concerning the study of the loss of crystallinity of verdigris-type pigments in proteinaceous binding media and the effect of lead white and lead tin yellow as highly absorbing matrix on verdigris identification in paint layers are reported.

  13. Non-destructive micro-analytical differentiation of copper pigments in paint layers of works of art using laboratory-based techniques.

    PubMed

    Svarcová, Silvie; Cermáková, Zdeňka; Hradilová, Janka; Bezdička, Petr; Hradil, David

    2014-11-11

    An unambiguous identification of pigments in paint layers of works of art forms a substantial part of the description of a painting technique, which is essential for the evaluation of the work of art including determination of the period and/or region of its creation as well as its attribution to a workshop or an author. Copper pigments represent a significant group of materials used in historic paintings. Because of their substantial diversity and, on the other hand, similarity, their identification and differentiation is a challenging task. An analytical procedure for unambiguous determination of both mineral-type (azurite, malachite, posnjakite, atacamite, etc.) and verdigris-type (copper acetates) copper pigments in the paint layers is presented, including light microscopy under VIS and UV light, electron microscopy with elemental microanalysis, Fourier transformed infrared micro-spectroscopy (micro-FTIR), and X-ray powder micro-diffraction (micro-XRD). Micro-Raman measurements were largely hindered by fluorescence. The choice of the analytical methods meets the contemporary requirement of a detailed description of various components in heterogeneous and minute samples of paint layers without their destruction. It is beneficial to use the combination of phase sensitive methods such as micro-FTIR and micro-XRD, because it allows the identification of both mineral-type and verdigris-type copper pigments in one paint layer. In addition, preliminary results concerning the study of the loss of crystallinity of verdigris-type pigments in proteinaceous binding media and the effect of lead white and lead tin yellow as highly absorbing matrix on verdigris identification in paint layers are reported.

  14. Analytical technique characterizes all trace contaminants in water

    NASA Technical Reports Server (NTRS)

    Foster, J. N.; Lysyj, I.; Nelson, K. H.

    1967-01-01

    A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the source of water pollution.

  15. Analytical results from ground-water sampling using a direct-push technique at the Dover National Test Site, Dover Air Force Base, Delaware, June-July 2001

    USGS Publications Warehouse

    Guertal, William R.; Stewart, Marie; Barbaro, Jeffrey R.; McHale, Timothy J.

    2004-01-01

    A joint study by the Dover National Test Site and the U.S. Geological Survey was conducted from June 27 through July 18, 2001 to determine the spatial distribution of the gasoline oxygenate additive methyl tert-butyl ether and selected water-quality constituents in the surficial aquifer underlying the Dover National Test Site at Dover Air Force Base, Delaware. The study was conducted to support a planned enhanced bio-remediation demonstration and to assist the Dover National Test Site in identifying possible locations for future methyl tert-butyl ether remediation demonstrations. This report presents the analytical results from ground-water samples collected during the direct-push ground-water sampling study. A direct-push drill rig was used to quickly collect 115 ground-water samples over a large area at varying depths. The ground-water samples and associated quality-control samples were analyzed for volatile organic compounds and methyl tert-butyl ether by the Dover National Test Site analytical laboratory. Volatile organic compounds were above the method reporting limits in 59 of the 115 ground-water samples. The concentrations ranged from below detection limits to maximum values of 12.4 micrograms per liter of cis-1,2-dichloroethene, 1.14 micrograms per liter of trichloroethene, 2.65 micrograms per liter of tetrachloroethene, 1,070 micrograms per liter of methyl tert-butyl ether, 4.36 micrograms per liter of benzene, and 1.8 micrograms per liter of toluene. Vinyl chloride, ethylbenzene, p,m-xylene, and o-xylene were not detected in any of the samples collected during this investigation. Methyl tert-butyl ether was detected in 47 of the 115 ground-water samples. The highest methyl tert-butyl ether concentrations were found in the surficial aquifer from -4.6 to 6.4 feet mean sea level; however, methyl tert-butyl ether was detected as deep as -9.5 feet mean sea level. Increased methane concentrations and decreased dissolved oxygen concentrations were found in

  16. New and emerging analytical techniques for marine biotechnology.

    PubMed

    Burgess, J Grant

    2012-02-01

    Marine biotechnology is the industrial, medical or environmental application of biological resources from the sea. Since the marine environment is the most biologically and chemically diverse habitat on the planet, marine biotechnology has, in recent years, delivered a growing number of major therapeutic products, industrial and environmental applications, and analytical tools. These range from the use of a snail toxin to develop a pain control drug, metabolites from a sea squirt to develop an anti-cancer therapeutic, and marine enzymes to remove bacterial biofilms. In addition, well known and broadly used analytical techniques are derived from marine molecules or enzymes, including green fluorescence protein gene tagging methods and heat resistant polymerases used in the polymerase chain reaction. Advances in bacterial identification, metabolic profiling and physical handling of cells are being revolutionised by techniques such as mass spectrometric analysis of bacterial proteins. Advances in instrumentation and a combination of these physical advances with progress in proteomics and bioinformatics are accelerating our ability to harness biology for commercial gain. Single cell Raman spectroscopy and microfluidics are two emerging techniques which are also discussed elsewhere in this issue. In this review, we provide a brief survey and update of the most powerful and rapidly growing analytical techniques as used in marine biotechnology, together with some promising examples of less well known earlier stage methods which may make a bigger impact in the future.

  17. The analytical representation of viscoelastic material properties using optimization techniques

    NASA Astrophysics Data System (ADS)

    Hill, S. A.

    1993-02-01

    This report presents a technique to model viscoelastic material properties with a function of the form of the Prony series. Generally, the method employed to determine the function constants requires assuming values for the exponential constants of the function and then resolving the remaining constants through linear least-squares techniques. The technique presented here allows all the constants to be analytically determined through optimization techniques. This technique is employed in a computer program named PRONY and makes use of a commercially available optimization tool developed by VMA Engineering, Inc. The PRONY program was utilized to compare the technique against previously determined models for solid rocket motor TP-H1148 propellant and V747-75 Viton fluoroelastomer. In both cases, the optimization technique generated functions that modeled the test data with at least an order of magnitude better correlation. This technique has demonstrated the capability to use small or large data sets and to use data sets that have uniformly or nonuniformly spaced data pairs. The reduction of experimental data to accurate mathematical models is a vital part of most scientific and engineering research. This technique of regression through optimization can be applied to other mathematical models that are difficult to fit to experimental data through traditional regression techniques.
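
    A minimal sketch of this regression-through-optimization idea, using synthetic relaxation data and SciPy rather than the PRONY program or the VMA Engineering tool named above: both the Prony coefficients and the exponential time constants are left free in a nonlinear least-squares fit, instead of fixing the time constants and solving only a linear problem.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic stress-relaxation data (hypothetical): G(t) = G_inf + sum_i g_i * exp(-t / tau_i)
t = np.logspace(-2, 3, 60)
g_true = np.array([50.0, 20.0, 10.0, 5.0])      # G_inf, g1, g2, g3
tau_true = np.array([0.1, 10.0, 500.0])         # relaxation times
data = g_true[0] + g_true[1:] @ np.exp(-t[None, :] / tau_true[:, None])

def prony(params, t):
    g_inf, g1, g2, g3, tau1, tau2, tau3 = params
    g = np.array([g1, g2, g3])
    tau = np.array([tau1, tau2, tau3])
    return g_inf + g @ np.exp(-t[None, :] / tau[:, None])

def residuals(params):
    return prony(params, t) - data

# All seven constants (including the exponential time constants) are optimized.
x0 = [40.0, 10.0, 10.0, 10.0, 1.0, 50.0, 1000.0]
fit = least_squares(residuals, x0, bounds=(1e-6, np.inf))
print("fitted constants:", np.round(fit.x, 2))
```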

  18. Multi-methodological investigation of kunzite, hiddenite, alexandrite, elbaite and topaz, based on laser-induced breakdown spectroscopy and conventional analytical techniques for supporting mineralogical characterization

    NASA Astrophysics Data System (ADS)

    Rossi, Manuela; Dell'Aglio, Marcella; De Giacomo, Alessandro; Gaudiuso, Rosalba; Senesi, Giorgio Saverio; De Pascale, Olga; Capitelli, Francesco; Nestola, Fabrizio; Ghiara, Maria Rosaria

    2014-02-01

    Gem-quality alexandrite, hiddenite and kunzite, elbaite and topaz minerals were characterized through a multi-methodological investigation based on EMPA-WDS, LA-ICP-MS, and laser-induced breakdown spectroscopy (LIBS). With respect to the others, the latter technique enables simultaneous multi-elemental analysis without any sample preparation, as well as the detection of light elements such as Li, Be and B. The criteria for the choice of minerals were: (a) the presence of chromophore elements in minor contents and/or as traces; (b) the presence of light lithophile elements (Li, Be and B); (c) different crystal chemistry complexity. The results show that LIBS can be employed in mineralogical studies for the identification and characterization of minerals, and as a fast screening method to determine the chemical composition, including the chromophore and light lithophile elements.

  19. Analytical techniques of pilot scanning behavior and their application

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.

    1986-01-01

    The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
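
    As a rough illustration of some of the metrics listed above, the sketch below computes dwell percentages, an instrument transition matrix, and a first-order entropy rate from a fixation sequence; the instrument labels and the sequence are hypothetical, and the original oculometer analysis software is not reproduced.

```python
import numpy as np
from collections import Counter

# Hypothetical sequence of instrument fixations (one label per oculometer dwell).
scan = ["ADI", "ADI", "ALT", "ADI", "ASI", "ADI", "ALT", "ALT", "ADI", "ASI"]
instruments = sorted(set(scan))
idx = {name: i for i, name in enumerate(instruments)}

# Dwell percentages: share of fixation samples spent on each instrument.
counts = Counter(scan)
dwell_pct = {name: 100.0 * counts[name] / len(scan) for name in instruments}

# Transition matrix between successive fixations, row-normalized to probabilities.
trans = np.zeros((len(instruments), len(instruments)))
for a, b in zip(scan[:-1], scan[1:]):
    trans[idx[a], idx[b]] += 1
row_sums = trans.sum(axis=1, keepdims=True)
trans_prob = np.divide(trans, row_sums, out=np.zeros_like(trans), where=row_sums > 0)

# First-order entropy rate (bits per transition), weighting each row by how often it occurs.
row_weights = (row_sums / row_sums.sum()).ravel()
row_entropy = np.zeros(len(instruments))
for i, row in enumerate(trans_prob):
    p = row[row > 0]
    row_entropy[i] = -(p * np.log2(p)).sum() if p.size else 0.0
entropy_rate = float(row_weights @ row_entropy)

print("dwell percentages:", dwell_pct)
print("entropy rate (bits/transition):", round(entropy_rate, 3))
```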

  20. Hybridizing experimental, numerical, and analytical stress analysis techniques

    NASA Astrophysics Data System (ADS)

    Rowlands, Robert E.

    2001-06-01

    Good measurements enjoy the advantage of conveying what actually occurs. However, recognizing that vast amounts of displacement, strain and/or stress-related information can now be recorded at high resolution, effective and reliable means of processing the data become important. It can therefore be advantageous to combine measured results with analytical and computational methods. This presentation will describe such synergism and its applications to engineering problems, including static and transient analysis, notched and perforated composites, and fracture of composites and fiber-filled cement. Experimental methods of moiré, thermoelasticity and strain gages are emphasized. Numerical techniques utilized include pseudo finite-element and boundary-element concepts.

  1. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions in relation to different variants of solid phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE) is presented, including practical applications of these techniques and a critical discussion about their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during the sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. PMID:24238710

  2. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions in relation to different variants of solid phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE) is presented, including practical applications of these techniques and a critical discussion about their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during the sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees.

  3. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    PubMed

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

    Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in the synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, resulted in an increased marketing of peptide drug products. In this regard, the development of analytical methods for quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to their analysis by means of chromatographic or electrophoretic methods, many difficulties must be tackled to analyze them. Recent developments in analytical techniques place more and more emphasis on the use of green analytical techniques. This review will discuss the progress in, and challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples. PMID:25864956

  4. Toward greener analytical techniques for the absolute quantification of peptides in pharmaceutical and biological samples.

    PubMed

    Van Eeckhaut, Ann; Mangelings, Debby

    2015-09-10

    Peptide-based biopharmaceuticals represent one of the fastest growing classes of new drug molecules. New reaction types included in the synthesis strategies to reduce the rapid metabolism of peptides, along with the availability of new formulation and delivery technologies, resulted in an increased marketing of peptide drug products. In this regard, the development of analytical methods for quantification of peptides in pharmaceutical and biological samples is of utmost importance. From the sample preparation step to their analysis by means of chromatographic or electrophoretic methods, many difficulties must be tackled to analyze them. Recent developments in analytical techniques place more and more emphasis on the use of green analytical techniques. This review will discuss the progress in, and challenges observed during, green analytical method development for the quantification of peptides in pharmaceutical and biological samples.

  5. Web-based Visual Analytics for Extreme Scale Climate Science

    SciTech Connect

    Steed, Chad A; Evans, Katherine J; Harney, John F; Jewell, Brian C; Shipman, Galen M; Smith, Brian E; Thornton, Peter E; Williams, Dean N.

    2014-01-01

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  6. Analytical technique for satellite projected cross-sectional area calculation

    NASA Astrophysics Data System (ADS)

    Ben-Yaacov, Ohad; Edlerman, Eviatar; Gurfil, Pini

    2015-07-01

    Calculating the projected cross-sectional area (PCSA) of a satellite along a given direction is essential for implementing attitude control modes such as Sun pointing or minimum-drag. The PCSA may also be required for estimating the forces and torques induced by atmospheric drag and solar radiation pressure. This paper develops a new analytical method for calculating the PCSA, the concomitant torques and the satellite exposed surface area, based on the theory of convex polygons. A scheme for approximating the outer surface of any satellite by polygons is developed. Then, a methodology for calculating the projections of the polygons along a given vector is employed. The methodology also accounts for overlaps among projections, and is capable of providing the true PCSA in a computationally-efficient manner. Using the Space Autonomous Mission for Swarming and Geo-locating Nanosatellites mechanical model, it is shown that the new analytical method yields accurate results, which are similar to results obtained from alternative numerical tools.
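
    A much-simplified sketch of the projection step under stated assumptions: the satellite surface is approximated by planar triangles, each triangle's area is projected along the view direction, and only outward-facing triangles contribute. The overlap handling that the paper treats with convex-polygon theory is deliberately omitted, so this sketch overestimates the PCSA whenever faces shadow one another.

```python
import numpy as np

def projected_area(triangles, direction):
    """Sum the projected areas of outward-facing triangles along a view direction.

    triangles: (n, 3, 3) array of vertex coordinates, wound so that the
               right-hand rule gives outward normals (an assumption here).
    direction: 3-vector along which the cross-section is viewed.
    Overlaps between projected triangles are ignored in this sketch.
    """
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    v0, v1, v2 = triangles[:, 0], triangles[:, 1], triangles[:, 2]
    normals = np.cross(v1 - v0, v2 - v0)   # vector length equals twice the triangle area
    signed = normals @ d / 2.0             # signed projected area of each triangle
    return signed[signed > 0].sum()        # keep only faces pointing toward the viewer

# Check with a unit square in the y-z plane, split into two triangles:
# its projection along +x should have an area of 1.0.
face = np.array([
    [[0, 0, 0], [0, 1, 0], [0, 1, 1]],
    [[0, 0, 0], [0, 1, 1], [0, 0, 1]],
], dtype=float)
print(projected_area(face, [1.0, 0.0, 0.0]))  # -> 1.0
```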

  7. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    NASA Technical Reports Server (NTRS)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  8. New analytical technique for carbon dioxide absorption solvents

    SciTech Connect

    Pouryousefi, F.; Idem, R.O.

    2008-02-15

    The densities and refractive indices of two binary systems (water + MEA and water + MDEA) and three ternary systems (water + MEA + CO{sub 2}, water + MDEA + CO{sub 2}, and water + MEA + MDEA) used for carbon dioxide (CO{sub 2}) capture were measured over the range of compositions of the aqueous alkanolamine(s) used for CO{sub 2} absorption at temperatures from 295 to 338 K. Experimental densities were modeled empirically, while the experimental refractive indices were modeled using well-established models from the known values of their pure-component densities and refractive indices. The density and Gladstone-Dale refractive index models were then used to obtain the compositions of unknown samples of the binary and ternary systems by simultaneous solution of the density and refractive index equations. The results from this technique have been compared with HPLC (high-performance liquid chromatography) results, while a third independent technique (acid-base titration) was used to verify the results. The results show that the systems' compositions obtained from the simple and easy-to-use refractive index/density technique were very comparable to those from the expensive and laborious HPLC/titration techniques, suggesting that the refractive index/density technique can be used to replace existing methods for analysis of fresh or nondegraded, CO{sub 2}-loaded, single and mixed alkanolamine solutions.
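
    A minimal sketch of the inversion step described above, with placeholder property values rather than the paper's fitted models: the measured density and refractive index of a hypothetical water + MEA + CO2 sample are matched against an ideal-mixing density model and the Gladstone-Dale mixing rule, and the two unknown mass fractions are recovered by solving the two equations simultaneously.

```python
import numpy as np
from scipy.optimize import fsolve

# Hypothetical pure-component properties (placeholders, not the paper's fitted values).
rho_pure = {"water": 0.997, "MEA": 1.012, "CO2": 1.60}   # g/cm^3 (CO2 treated as dissolved species)
n_pure   = {"water": 1.333, "MEA": 1.454, "CO2": 1.20}

def density_model(w_mea, w_co2):
    # Simple ideal-mixing density (the paper fits an empirical model instead).
    w_water = 1.0 - w_mea - w_co2
    return 1.0 / (w_water / rho_pure["water"] + w_mea / rho_pure["MEA"] + w_co2 / rho_pure["CO2"])

def gladstone_dale(w_mea, w_co2):
    # Gladstone-Dale mixing rule: (n - 1) / rho = sum_i w_i * (n_i - 1) / rho_i
    w_water = 1.0 - w_mea - w_co2
    rho = density_model(w_mea, w_co2)
    specific_refraction = (w_water * (n_pure["water"] - 1) / rho_pure["water"]
                           + w_mea * (n_pure["MEA"] - 1) / rho_pure["MEA"]
                           + w_co2 * (n_pure["CO2"] - 1) / rho_pure["CO2"])
    return 1.0 + rho * specific_refraction

# Measured values for an "unknown" sample (synthetic, generated from known fractions).
rho_meas = density_model(0.30, 0.05)
n_meas = gladstone_dale(0.30, 0.05)

def equations(x):
    w_mea, w_co2 = x
    return [density_model(w_mea, w_co2) - rho_meas,
            gladstone_dale(w_mea, w_co2) - n_meas]

w_mea, w_co2 = fsolve(equations, x0=[0.2, 0.02])
print(f"recovered MEA mass fraction: {w_mea:.3f}, CO2 mass fraction: {w_co2:.3f}")
```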

  9. Applications of nuclear analytical techniques to environmental studies

    NASA Astrophysics Data System (ADS)

    Freitas, M. C.; Pacheco, A. M. G.; Marques, A. P.; Barros, L. I. C.; Reis, M. A.

    2001-07-01

    A few examples of application of nuclear-analytical techniques to biological monitors—natives and transplants—are given herein. Parmelia sulcata Taylor transplants were set up in a heavily industrialized area of Portugal—the Setúbal peninsula, about 50 km south of Lisbon—where indigenous lichens are rare. The whole area was 10×15 km around an oil-fired power station, and a 2.5×2.5 km grid was used. In north-western Portugal, native thalli of the same epiphytes (Parmelia spp., mostly Parmelia sulcata Taylor) and bark from olive trees (Olea europaea) were sampled across an area of 50×50 km, using a 10×10 km grid. This area is densely populated and features a blend of rural, urban-industrial and coastal environments, together with the country's second-largest metro area (Porto). All biomonitors have been analyzed by INAA and PIXE. Results were put through nonparametric tests and factor analysis for trend significance and emission sources, respectively.

  10. Economic Risk Analysis: Using Analytical and Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    O'Donnell, Brendan R.; Hickner, Michael A.; Barna, Bruce A.

    2002-01-01

    Describes the development and instructional use of a Microsoft Excel spreadsheet template that facilitates analytical and Monte Carlo risk analysis of investment decisions. Discusses a variety of risk assessment methods followed by applications of the analytical and Monte Carlo methods. Uses a case study to illustrate use of the spreadsheet tool…
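
    The record above refers to an Excel template; the sketch below is a rough Python analogue of the Monte Carlo part, with entirely hypothetical cash-flow assumptions: uncertain revenue and cost inputs are sampled, the net present value is computed for each trial, and the resulting distribution summarizes the risk of a negative return.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 100_000
discount_rate = 0.10
years = np.arange(1, 11)                      # 10-year project horizon

# Hypothetical uncertain inputs (distribution choices are placeholders).
capital_cost = rng.triangular(9e6, 10e6, 12e6, n_trials)            # $ spent at t = 0
revenue = rng.normal(3.0e6, 0.4e6, (n_trials, years.size))          # $/yr
opex = rng.normal(1.2e6, 0.2e6, (n_trials, years.size))             # $/yr

discount = (1.0 + discount_rate) ** years
npv = -capital_cost + ((revenue - opex) / discount).sum(axis=1)

print(f"mean NPV:             ${npv.mean():,.0f}")
print(f"5th-95th percentiles: ${np.percentile(npv, 5):,.0f} to ${np.percentile(npv, 95):,.0f}")
print(f"P(NPV < 0):           {(npv < 0).mean():.1%}")
```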

  11. Comparative study of analytical techniques for determining protein charge.

    PubMed

    Filoti, Dana I; Shire, Steven J; Yadav, Sandeep; Laue, Thomas M

    2015-07-01

    As interest in high-concentration protein formulations has increased, it has become apparent that routine, accurate protein charge measurements are necessary. There are several techniques for charge measurement, and a comparison of the methods is needed. The electrophoretic mobility, effective charge, and Debye-Hückel-Henry charge have been determined for bovine serum albumin and human serum albumin. Three different electrophoretic methods were used to measure the electrophoretic mobility: capillary electrophoresis, electrophoretic light scattering, and membrane-confined electrophoresis. In addition, the effective charge was measured directly using steady-state electrophoresis. Measurements made at different NaCl concentrations, pH values, and temperatures allow comparison with previous charge estimates based on electrophoresis, Donnan equilibrium, and pH titration. Similar charge estimates are obtained by all of the methods. The strengths and limitations of each technique are discussed, as are some general considerations about protein charge and charge determination.
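
    For orientation, a standard textbook route from a measured electrophoretic mobility to a Debye-Hückel-Henry charge is Henry's equation for a sphere of radius a (this is the general relation, not necessarily the paper's exact working equations):

```latex
% Henry's equation for a sphere of radius a gives the mobility as
%   mu = q * f(kappa a) / (6 * pi * eta * a * (1 + kappa a)),
% so the Debye-Hückel-Henry charge recovered from a measured mobility mu is
\[
  q_{\mathrm{DHH}} \;=\; \frac{6\pi\,\eta\,a\,(1+\kappa a)}{f(\kappa a)}\,\mu ,
\]
% where eta is the solvent viscosity, kappa the inverse Debye length, and
% f(kappa a) Henry's function, which varies from 1.0 (Hückel limit) to 1.5
% (Smoluchowski limit).
```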

  12. Degradation of glass artifacts: application of modern surface analytical techniques.

    PubMed

    Melcher, Michael; Wiesinger, Rita; Schreiner, Manfred

    2010-06-15

    A detailed understanding of the stability of glasses toward liquid or atmospheric attack is of considerable importance for preserving numerous objects of our cultural heritage. Glasses produced in the ancient periods (Egyptian, Greek, or Roman glasses), as well as modern glass, can be classified as soda-lime-silica glasses. In contrast, potash was used as a flux in medieval Northern Europe for the production of window panes for churches and cathedrals. The particular chemical composition of these potash-lime-silica glasses (low in silica and rich in alkali and alkaline earth components), in combination with increased levels of acidifying gases (such as SO(2), CO(2), NO(x), or O(3)) and airborne particulate matter in today's urban or industrial atmospheres, has resulted in severe degradation of important cultural relics, particularly over the last century. Rapid developments in the fields of microelectronics and computer sciences, however, have contributed to the development of a variety of nondestructive, surface analytical techniques for the scientific investigation and material characterization of these unique and valuable objects. These methods include scanning electron microscopy in combination with energy- or wavelength-dispersive spectrometry (SEM/EDX or SEM/WDX), secondary ion mass spectrometry (SIMS), and atomic force microscopy (AFM). In this Account, we address glass analysis and weathering mechanisms, exploring the possibilities (and limitations) of modern analytical techniques. Corrosion by liquid substances is well investigated in the glass literature. In a tremendous number of case studies, the basic reaction between aqueous solutions and the glass surfaces was identified as an ion-exchange reaction between hydrogen-bearing species of the attacking liquid and the alkali and alkaline earth ions in the glass, causing a depletion of the latter in the outermost surface layers. Although mechanistic analogies to liquid corrosion are obvious, atmospheric

  13. Coal liquefaction process streams characterization and evaluation. Novel analytical techniques for coal liquefaction: Fluorescence microscopy

    SciTech Connect

    Rathbone, R.F.; Hower, J.C.; Derbyshire, F.J.

    1991-10-01

    This study demonstrated the feasibility of using fluorescence and reflectance microscopy techniques for the examination of distillation resid materials derived from direct coal liquefaction. Resid, as defined here, is the 850°F+ portion of the process stream, and includes soluble organics, insoluble organics and ash. The technique can be used to determine the degree of hydrogenation and the presence of multiple phases occurring within a resid sample. It can also be used to infer resid reactivity. The technique is rapid, requiring less than one hour for sample preparation and examination, and thus has apparent usefulness for process monitoring. Additionally, the technique can distinguish differences in samples produced under various process conditions. It can, therefore, be considered a potentially useful technique for the process developer. Further development and application of this analytical method as a process development tool is justified based on these results.

  14. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  15. Analytical Electrochemistry: Theory and Instrumentation of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Johnson, Dennis C.

    1980-01-01

    Emphasizes trends in the development of six topics concerning analytical electrochemistry, including books and reviews (34 references cited), mass transfer (59), charge transfer (25), surface effects (33), homogeneous reactions (21), and instrumentation (31). (CS)

  16. An Analytic Technique for Investigating Mode-Locked Lasers

    SciTech Connect

    Usechak, N.G.; Agrawal, G.P.

    2005-09-30

    We present an analytic theory capable of predicting pulse parameters in mode-locked lasers in the presence of dispersion and nonlinearity. Excellent agreement is obtained between this approach and full numerical solutions.

  17. Assessment of analytical techniques for predicting solid propellant exhaust plumes

    NASA Technical Reports Server (NTRS)

    Tevepaugh, J. A.; Smith, S. D.; Penny, M. M.

    1977-01-01

    The calculation of solid propellant exhaust plume flow fields is addressed. Two major areas covered are: (1) the applicability of empirical data currently available to define particle drag coefficients, heat transfer coefficients, mean particle size and particle size distributions, and (2) thermochemical modeling of the gaseous phase of the flow field. Comparisons of experimentally measured and analytically predicted data are made. The experimental data were obtained for subscale solid propellant motors with aluminum loadings of 2, 10 and 15%. Analytical predictions were made using a fully coupled two-phase numerical solution. Data comparisons will be presented for radial distributions at plume axial stations of 5, 12, 16 and 20 diameters.

  18. Application of X-ray fluorescence analytical techniques in phytoremediation and plant biology studies

    NASA Astrophysics Data System (ADS)

    Nečemer, Marijan; Kump, Peter; Ščančar, Janez; Jaćimović, Radojko; Simčič, Jurij; Pelicon, Primož; Budnar, Miloš; Jeran, Zvonka; Pongrac, Paula; Regvar, Marjana; Vogel-Mikuš, Katarina

    2008-11-01

    Phytoremediation is an emerging technology that employs the use of higher plants for the clean-up of contaminated environments. Progress in the field is, however, handicapped by limited knowledge of the biological processes involved in plant metal uptake, translocation, tolerance and plant-microbe-soil interactions; therefore a better understanding of the basic biological mechanisms involved in plant/microbe/soil/contaminant interactions would allow further optimization of phytoremediation technologies. In view of the needs of global environmental protection, it is important that in phytoremediation and plant biology studies the analytical procedures for elemental determination in plant tissues and soil should be fast and cheap, with simple sample preparation, and of adequate accuracy and reproducibility. The aim of this study was therefore to present the main characteristics, sample preparation protocols and applications of X-ray fluorescence-based analytical techniques (energy dispersive X-ray fluorescence spectrometry—EDXRF, total reflection X-ray fluorescence spectrometry—TXRF and micro-proton induced X-ray emission—micro-PIXE). Element concentrations in plant leaves from metal polluted and non-polluted sites, as well as standard reference materials, were analyzed by the mentioned techniques, and additionally by instrumental neutron activation analysis (INAA) and atomic absorption spectrometry (AAS). The results were compared and critically evaluated in order to assess the performance and capability of X-ray fluorescence-based techniques in phytoremediation and plant biology studies. EDXRF is recommended as suitable for the analysis of large numbers of samples, because it is multi-elemental, requires only simple preparation of the sample material, and is analytically comparable to the most frequently used instrumental chemical techniques. The TXRF is compatible with FAAS in sample preparation, but relative to AAS it is fast

  19. Challenges of Using Learning Analytics Techniques to Support Mobile Learning

    ERIC Educational Resources Information Center

    Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide

    2015-01-01

    Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…

  20. Intracavity optogalvanic spectroscopy. An analytical technique for 14C analysis with subattomole sensitivity.

    PubMed

    Murnick, Daniel E; Dogru, Ozgur; Ilkmen, Erhan

    2008-07-01

    We show a new ultrasensitive laser-based analytical technique, intracavity optogalvanic spectroscopy, allowing extremely high sensitivity for detection of (14)C-labeled carbon dioxide. Capable of replacing large accelerator mass spectrometers, the technique quantifies attomoles of (14)C in submicrogram samples. Based on the specificity of narrow laser resonances coupled with the sensitivity provided by standing waves in an optical cavity and detection via impedance variations, limits of detection near 10(-15) (14)C/(12)C ratios are obtained. Using a 15-W (14)CO2 laser, a linear calibration with samples from 10(-15) to >1.5 x 10(-12) in (14)C/(12)C ratios, as determined by accelerator mass spectrometry, is demonstrated. Possible applications include microdosing studies in drug development, individualized subtherapeutic tests of drug metabolism, carbon dating and real time monitoring of atmospheric radiocarbon. The method can also be applied to detection of other trace entities.
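
    A minimal sketch of the kind of linear calibration mentioned above, with synthetic numbers only: the optogalvanic signal is regressed against AMS-determined 14C/12C ratios for a set of standards, and the fit is inverted to quantify an unknown sample.

```python
import numpy as np

# Hypothetical calibration standards: AMS-determined 14C/12C ratios vs. normalized OGE signal.
ratio_ams = np.array([1e-15, 5e-15, 1e-14, 5e-14, 1e-13, 5e-13, 1.5e-12])
oge_signal = np.array([0.9, 4.8, 10.5, 49.0, 101.0, 498.0, 1495.0])   # arbitrary units

# Linear calibration: signal = slope * ratio + intercept.
slope, intercept = np.polyfit(ratio_ams, oge_signal, 1)

# Invert the calibration to quantify an unknown sample.
unknown_signal = 62.0
unknown_ratio = (unknown_signal - intercept) / slope
print(f"estimated 14C/12C ratio: {unknown_ratio:.2e}")
```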

  1. Highly time-resolved evaluation technique of instantaneous amplitude and phase difference using analytic signals for multi-channel diagnostics.

    PubMed

    Ohshima, S; Kobayashi, S; Yamamoto, S; Nagasaki, K; Mizuuchi, T; Kado, S; Okada, H; Minami, T; Lee, H Y; Zang, L; Kenmochi, N; Kasajima, K; Ohtani, Y; Shi, N; Nagae, Y; Konoshima, S; Sano, F

    2014-11-01

    A fluctuation analysis technique using analytic signals is proposed. Analytic signals are suitable for characterizing a single mode with time-dependent amplitude and frequency, such as an MHD mode observed in fusion plasmas, since the technique can evaluate amplitude and frequency at a specific moment without the limitations of temporal and frequency resolution that are problematic in Fourier-based analyses. Moreover, a concept of instantaneous phase difference is newly introduced, and the error of the evaluated phase difference and its reduction using conditional/ensemble averaging are discussed. These techniques are applied to experimental data from the beam emission spectroscopic measurement in the Heliotron J device, demonstrating that the technique can describe the nonlinear evolution of MHD instabilities. This technique is widely applicable to other diagnostics that need to evaluate phase differences.
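
    A minimal sketch of the analytic-signal idea using two synthetic channels rather than real beam-emission data: the Hilbert transform turns each real signal into a complex analytic signal, from which instantaneous amplitude and frequency follow directly, and the instantaneous phase difference between channels is the angle of one analytic signal times the conjugate of the other.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1.0e6                                   # sampling rate (Hz), hypothetical
t = np.arange(0, 5e-3, 1.0 / fs)

# Two synthetic channels: an amplitude- and frequency-modulated mode plus noise,
# with a fixed 45-degree phase offset between the channels.
amp = 1.0 + 0.5 * np.sin(2 * np.pi * 300 * t)
freq = 20e3 + 2e3 * np.sin(2 * np.pi * 150 * t)
phase = 2 * np.pi * np.cumsum(freq) / fs
ch1 = amp * np.cos(phase) + 0.05 * np.random.randn(t.size)
ch2 = amp * np.cos(phase + np.pi / 4) + 0.05 * np.random.randn(t.size)

# Analytic signals via the Hilbert transform.
z1, z2 = hilbert(ch1), hilbert(ch2)
inst_amplitude = np.abs(z1)
inst_frequency = np.gradient(np.unwrap(np.angle(z1))) * fs / (2 * np.pi)
inst_phase_diff = np.angle(z1 * np.conj(z2))   # instantaneous phase of channel 1 minus channel 2

print("mean instantaneous amplitude:", inst_amplitude.mean())
print("mean instantaneous frequency (kHz):", inst_frequency.mean() / 1e3)
print("mean instantaneous phase difference (deg):", np.degrees(inst_phase_diff).mean())
```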

  2. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    SciTech Connect

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.; Riensche, Roderick M.; Franklin, Lyndsey; Pike, William A.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources making it easier to adapt the framework for many different data repositories.

  3. Facilitating the Analysis of Immunological Data with Visual Analytic Techniques

    PubMed Central

    Shih, David C.; Ho, Kevin C.; Melnick, Kyle M.; Rensink, Ronald A.; Kollmann, Tobias R.; Fortuno III, Edgardo S.

    2011-01-01

    Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrated the utility and flexibility of a VA approach in the analysis of biological datasets. Examples of these datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores analysis power to the hands of the analyst by allowing the analyst to engage in a real-time data exploration process. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach. PMID:21248691

  4. Facilitating the analysis of immunological data with visual analytic techniques.

    PubMed

    Shih, David C; Ho, Kevin C; Melnick, Kyle M; Rensink, Ronald A; Kollmann, Tobias R; Fortuno, Edgardo S

    2011-01-02

    Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrated the utility and flexibility of a VA approach in the analysis of biological datasets. Examples of these datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores analysis power to the hands of the analyst by allowing the analyst to engage in a real-time data exploration process. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.

  5. Analytic mind use and interpsychic communication: driving force in analytic technique, pathway to unconscious mental life.

    PubMed

    Diamond, Michael J

    2014-07-01

    Developed from established psychoanalytic knowledge among different psychoanalytic cultures concerning unconscious interpsychic communication, analysts' use of their receptive mental experience--their analytic mind use, including the somatic, unconscious, and less accessible derivatives--represents a significant investigative road to patients' unconscious mental life, particularly with poorly symbolized mental states. The author expands upon this tradition, exploring what happens when patients unconsciously experience and identify with the analyst's psychic functioning. The technical implications of the analyst's "instrument" are described, including the analyst's ego regression, creation of inner space, taking mind as object, bearing uncertainty and intense affect, and self-analysis. Brief case vignettes illustrate the structure and obstacles to this work. PMID:25074050

  6. Visual analytics techniques for large multi-attribute time series data

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.

    2008-01-01

    Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Often business people want to compare this year's monthly sales with last year's sales to make decisions. Data warehouse administrators (DBAs) want to know their daily data loading job performance. DBAs need to detect the outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: The color cell-based Visual Time Series Line Charts and Maps highlight significant changes over time in a long time series data and the new Visual Content Query facilitates finding the contents and histories of interesting patterns and anomalies, which leads to root cause identification. We have applied both methods to two real-world applications to mine enterprise data warehouse and customer credit card fraud data to illustrate the wide applicability and usefulness of these techniques.
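
    A rough sketch of a color cell-based time-series view in the spirit described above (not the authors' implementation): each cell is one day, rows are months, and color encodes the metric, so seasonal structure and planted outliers stand out at a glance. The data are synthetic.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Synthetic daily metric (e.g., data-loading time in minutes) for one year: 12 x 31 grid.
values = np.full((12, 31), np.nan)
days_in_month = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
for month, n_days in enumerate(days_in_month):
    values[month, :n_days] = 40 + 5 * rng.standard_normal(n_days)
values[6, 14] = 95  # planted outlier to show how anomalies pop out

fig, ax = plt.subplots(figsize=(10, 4))
im = ax.imshow(values, aspect="auto", cmap="RdYlGn_r")
ax.set_xlabel("Day of month")
ax.set_ylabel("Month")
ax.set_yticks(range(12))
ax.set_yticklabels(["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                    "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"])
fig.colorbar(im, ax=ax, label="Job duration (min)")
plt.show()
```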

  7. New analytical techniques for mycotoxins in complex organic matrices

    SciTech Connect

    Bicking, M.K.L.

    1982-01-01

    Air samples are collected for analysis from the Ames Solid Waste Recovery System. The high level of airborne fungi within the processing area is of concern due to the possible presence of toxic mycotoxins, and carcinogenic fungal metabolites. An analytical method has been developed to determine the concentration of aflatoxins B1, B2, G1, and G2 in the air of the plant which produces Refuse Derived Fuel (RDF). After extraction with methanol, some components in the matrix are precipitated by dissolving the samples in 30% acetonitrile/chloroform. An aliquot of this solution is injected onto a Styragel column where the sample components undergo simultaneous size exclusion and reverse phase partitioning. The Styragel column appears to have a useable lifetime of more than six months. After elution from Styragel, the sample is diverted to a second column containing Florisil which has been modified with oxalic acid and deactivated with water. Aflatoxins are eluted with 5% water/acetone. After removal of this solvent, the sample is dissolved in 150 µL of a spotting solvent and the entire sample applied to a thin layer chromatography (TLC) plate using a unique sample applicator developed here. The aflatoxins on the TLC plate are analyzed by laser fluorescence. A detection limit of 10 pg is possible for aflatoxin standards using a nitrogen laser as the excitation source. Sample concentrations are determined by comparing with an internal standard, a specially synthesized aflatoxin derivative. In two separate RDF samples, aflatoxin B1 was found at levels of 6.5 and 17.0 ppb. In a separate study, the spore pigment in Aspergillus flavus was isolated. The mass spectrum indicates a molecular weight in excess of 700. Only aliphatic hydrocarbons have been identified in the mass spectrum of products from a permanganate oxidation.

  8. New analytical techniques to facilitate preformulation screening in propellant systems.

    PubMed

    Mogalian, Erik; Kuehl, Philip J; Myrdal, Paul B

    2007-08-01

    The objective of these studies was to investigate the applicability of an online direct inject HPLC method for the preformulation screening of pharmaceutical agents in pressurized metered dose inhalers (MDIs). The technique was initially utilized for the solubility determination of solid solutes. This study explores the extension of the online direct inject method for the evaluation of drug stability in propellant systems as well as for the analysis of MDI vials crimped with metered valves. Through-life content analysis confirmed that a single vial may be repeatedly sampled, thus facilitating the stability evaluation of a single unit over time. The method was successfully used for evaluating the stability of a model drug, as a function of several different formulation configurations, with minimal sample numbers. Additionally, studies determined that after modifications were made to the injection coupler, the technique was also feasible for use with 50 and 100 microL metered valves; however, further modifications are necessary for 25 microL valves.

  9. Team mental models: techniques, methods, and analytic approaches.

    PubMed

    Langan-Fox, J; Code, S; Langfield-Smith, K

    2000-01-01

    Effective team functioning requires the existence of a shared or team mental model among members of a team. However, the best method for measuring team mental models is unclear. Methods reported vary in terms of how mental model content is elicited and analyzed or represented. We review the strengths and weaknesses of various methods that have been used to elicit, represent, and analyze individual and team mental models and provide recommendations for method selection and development. We describe the nature of mental models and review techniques that have been used to elicit and represent them. We focus on a case study on selecting a method to examine team mental models in industry. The processes involved in the selection and development of an appropriate method for eliciting, representing, and analyzing team mental models are described. The criteria for method selection were (a) applicability to the problem under investigation; (b) practical considerations - suitability for collecting data from the targeted research sample; and (c) theoretical rationale - the assumption that associative networks in memory are a basis for the development of mental models. We provide an evaluation of the method matched to the research problem and make recommendations for future research. The practical applications of this research include the provision of a technique for analyzing team mental models in organizations, the development of methods and processes for eliciting a mental model from research participants in their normal work environment, and a survey of available methodologies for mental model research.

  10. Mars Methane Analogue Mission (M3): Analytical Techniques and Operations

    NASA Astrophysics Data System (ADS)

    Cloutis, E.; Vrionis, H.; Qadi, A.; Bell, J. F.; Berard, G.; Boivin, A.; Ellery, A.; Jamroz, W.; Kruzelecky, R.; Mann, P.; Samson, C.; Stromberg, J.; Strong, K.; Tremblay, A.; Whyte, L.; Wing, B.

    2011-03-01

    The Mars Methane Analogue Mission (M3) project is designed to simulate a rover-based search for, and analysis of, methane sources on Mars at a serpentinite open pit mine in Quebec, using a variety of instruments.

  11. Analytic and interferometric techniques for the Laser Interferometer Space Antenna

    NASA Astrophysics Data System (ADS)

    Pollack, Scott E.

    The Laser Interferometer Space Antenna (LISA) is being designed to detect and study in detail gravitational waves from sources throughout the Universe such as massive black holes. The conceptual formulation of the LISA space-borne gravitational wave detector is now well developed. The interferometric measurements between the sciencecraft remain one of the most important technological and scientific design areas for the mission. Our work has concentrated on developing the interferometric technologies to create a LISA-like optical signal and to measure the phase of that signal using commercially available instruments. One of the most important goals of this research is to demonstrate the LISA phase timing and phase reconstruction for a LISA-like fringe signal, in the case of a high fringe rate and a low signal level. To this end we have constructed a table-top interferometer which produces LISA-like fringe signals. Over the past few years questions have been raised concerning the use of laser communications links between sciencecraft to transmit phase information crucial to the reduction of laser frequency noise in the LISA science measurement. The concern is that applying medium frequency phase modulations to the laser carrier could compromise the phase stability of the LISA fringe signal. We have modified our table-top interferometer by applying a phase modulation to the laser beam in order to evaluate the effects of such modulations on the LISA science fringe signal. We have demonstrated that the phase resolution of the science signal is not degraded by the presence of medium frequency phase modulations. Each spacecraft in LISA houses a proof mass which follows a geodesic through space. Disturbances that change the proof mass position, momentum, and acceleration will appear in the LISA data stream as additive quadratic functions. These data disturbances inhibit signal extraction and must be removed. Much of our analytical work has been focused on discussing the
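
    The proof-mass disturbances mentioned above enter the data stream as additive quadratic functions of time. As a minimal illustration of how such a term can be removed before signal extraction (a toy sketch, not the LISA analysis pipeline; all names and numbers are made up), a least-squares quadratic fit can be subtracted from the stream:

        # Toy sketch: fit and remove an additive quadratic disturbance by least squares.
        import numpy as np

        def remove_quadratic(t, data):
            """Fit a + b*t + c*t**2 to the data and return the residual stream."""
            coeffs = np.polyfit(t, data, deg=2)            # highest power first
            return data - np.polyval(coeffs, t), coeffs

        t = np.linspace(0.0, 1000.0, 5000)                 # time samples (arbitrary units)
        signal = 1e-12 * np.sin(2 * np.pi * 5e-3 * t)      # toy oscillatory science signal
        disturbance = 2e-11 + 3e-14 * t + 4e-17 * t**2     # additive quadratic from the proof mass
        cleaned, fitted = remove_quadratic(t, signal + disturbance)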

  12. Comparing the performance of analytical techniques for genetic parentage of half-sib progeny arrays.

    PubMed

    Croshaw, Dean A; Peters, Maureen B; Glenn, Travis C

    2009-10-01

    The prevalence of female multiple mating in natural populations is important for many questions in mating system evolution. Several statistical techniques use genetic data to estimate the number of fathers that contribute gametes to broods, but they have not been widely compared to assess the magnitude of differences in their performance. With a combination of new data and reanalysis of previously published data, we compared five analytical approaches: (1) allele-counting, (2) parental reconstruction in GERUD, (3) a Bayesian probability model to estimate the frequency of multiple mating (FMM), (4) computer simulations based on population allele frequencies in HAPLOTYPES and (5) Bayesian parental reconstruction in PARENTAGE. The results show that choice of analysis technique can significantly affect estimates of sire number. Estimates from GERUD conformed exactly to results obtained from strict exclusion of potential sires in an experimental context. However, estimates yielded by HAPLOTYPES and PARENTAGE sometimes exceeded the numbers from GERUD by as much as 120% and 55%, respectively. We recommend GERUD over these other approaches for most purposes because of its accuracy and consistency in this analysis. Our novel genetic data set allowed us to investigate the extent and frequency of multiple paternity in a marbled salamander (Ambystoma opacum) population in South Carolina, USA. A. opacum contrasted with other salamander species by having relatively low levels of multiple paternity (only 31-54% compared with 71-96%). Although A. opacum had the lowest level of multiple paternity under all analytical approaches used here, the magnitude of differences among species varied. PMID:19922695
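
    For readers unfamiliar with the first approach, allele counting amounts to stripping one maternal allele from each offspring genotype and bounding the sire number by the count of remaining paternal alleles. The toy sketch below (hypothetical genotypes, not the study's data, and far simpler than GERUD's reconstruction) shows the idea for a single locus:

        # Toy allele-counting lower bound on sire number for one mother's progeny array.
        from math import ceil

        def min_sires_at_locus(mother, offspring):
            """mother: (a, b) genotype; offspring: list of (a, b) genotypes."""
            paternal = set()
            for geno in offspring:
                alleles = list(geno)
                for allele in mother:              # strip one copy of a maternal allele
                    if allele in alleles:
                        alleles.remove(allele)
                        break
                paternal.update(alleles)           # whatever remains must be paternal
            return ceil(len(paternal) / 2)         # each diploid sire carries at most 2 alleles

        mother = (152, 160)
        progeny = [(152, 148), (160, 156), (152, 152), (148, 160), (164, 152)]
        print(min_sires_at_locus(mother, progeny)) # paternal alleles {148, 152, 156, 164} -> at least 2 sires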

  13. Integrating Organic Matter Structure with Ecosystem Function using Advanced Analytical Chemistry Techniques

    NASA Astrophysics Data System (ADS)

    Boot, C. M.

    2012-12-01

    Microorganisms are the primary transformers of organic matter in terrestrial and aquatic ecosystems. The structure of organic matter controls its bioavailability, and researchers have long sought to link the chemical characteristics of the organic matter pool to its lability. To date this effort has been primarily attempted using low resolution descriptive characteristics (e.g. organic matter content, carbon to nitrogen ratio, aromaticity, etc.). However, recent progress in linking these two important ecosystem components has been made using high resolution tools (e.g. nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry (MS)-based techniques). A series of experiments will be presented that highlight the application of high resolution techniques in a variety of terrestrial and aquatic ecosystems, with the focus on how these data explicitly provide the foundation for integrating organic matter structure into our concept of ecosystem function. The talk will highlight results from a series of experiments including: an MS-based metabolomics and fluorescence excitation emission matrix approach evaluating seasonal and vegetation based changes in dissolved organic matter (DOM) composition from arctic soils; Fourier transform ion cyclotron resonance (FTICR) MS and MS metabolomics analysis of DOM from three lakes in an alpine watershed; and the transformation of 13C-labeled glucose tracked with NMR during a rewetting experiment with Colorado grassland soils. These data will be synthesized to illustrate how the application of advanced analytical techniques provides novel insight into our understanding of organic matter processing in a wide range of ecosystems.

  14. Detection of irradiated chestnuts: preliminary study using three analytical techniques

    NASA Astrophysics Data System (ADS)

    Mangiacotti, Michele; Chiaravalle, Antonio Eugenio; Marchesani, Giuliana; De Sio, Antonio; Boniglia, Concetta; Bortolin, Emanuela; Onori, Sandro

    2009-07-01

    Irradiation of chestnuts has recently been considered as an alternative treatment to fumigation to reduce the considerable amount of the product normally lost during the post-harvest period. The treatment is allowed in countries such as Korea and, in view of a possible extension to European countries, to permit the legal controls as required by the directive 1999/2/EC [European Parliament and Council Directive, 1999/2/EC, on the approximation of the laws of the Member States concerning foods and food ingredients treated with ionising radiation. Official Journal of the European Communities. L 66/16 of 13.3.1999] and meet consumer consensus, reliable methods for detecting irradiated chestnuts have to be proposed. The aim of the present work was to test the efficacy of the European Standards EN 13751, EN 1788, EN 1787 and EN 13708 in detecting irradiated chestnuts. For this purpose, six sets of "Montella" chestnuts, a typical Italian variety recognized as a PGI (protected geographical indication), non-irradiated and irradiated at different doses in the 0.1-1 kGy range, were analysed by thermoluminescence (TL), photo-stimulated luminescence (PSL) (screening and calibrated PSL) and ESR techniques. PSL and TL analysis results revealed the low luminescence sensitivity of the chestnuts. Nevertheless, PSL screening data were in the intermediate band above the negative threshold (at all doses except at the lowest one) and TL analysis led to correct positive classifications even at the lowest dose tested (0.15 kGy). On the contrary, no radio-induced ESR signal could be registered with the irradiated samples of chestnut shell or pulp.

  15. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water industries are closely examining options to maintain a certain level of disinfectant residual through the entire distribution system. Chlorine dioxide is one of the promising disinfectants that is usually used as a secondary disinfectant, while the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurements (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) to determine the superior technique. The commonly available commercial analytical techniques were evaluated over a wide range of chlorine dioxide concentrations. In reference to pre-defined criteria, the superior analytical technique was determined. To discern the effectiveness of this superior technique, various factors, such as sample temperature, high ionic strength, and other interferences that might influence the performance, were examined. Among the four techniques, chronoamperometry showed the highest level of accuracy and precision. Furthermore, the various influencing factors studied did not diminish the technique's performance, which remained adequate in all matrices. This study is a step towards proper disinfection monitoring and it confidently assists engineers with chlorine dioxide disinfection system planning and management.

  16. Classroom Writing Tasks and Students' Analytic Text-Based Writing

    ERIC Educational Resources Information Center

    Matsumura, Lindsay Clare; Correnti, Richard; Wang, Elaine

    2015-01-01

    The Common Core State Standards emphasize students writing analytically in response to texts. Questions remain about the nature of instruction that develops students' text-based writing skills. In the present study, we examined the role that writing task quality plays in students' mastery of analytic text-based writing. Text-based writing tasks…

  17. Cost and Schedule Analytical Techniques Development: Option 2 Year

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This Final Report summarizes the activities performed by Science Applications International Corporation (SAIC) for the Option 2 Year from December 1, 1996 through November 30, 1997. The Final Report is in compliance with Paragraph 5 of Section F of the contract. This CSATD contract provides products and deliverables in the form of models, databases, methodologies, studies and analyses for the NASA Marshall Space Flight Center's (MSFC) Engineering Cost Office (PPO3), the Program Plans and Requirements Officer (PPO2), and other user organizations. Detailed Monthly Progress Reports were submitted to MSFC in accordance with the contract's Statement of Work, Section IV "Reporting and Documentation". These reports spelled out each month's specific work accomplishments, deliverables submitted, major meetings held, and other pertinent information. This Final Report will summarize these activities at a higher level. During this contract Option Year, SAIC expended 29,830 man-hours in the performance of tasks called out in the Statement of Work and reported on in this yearly Final Report. This represents approximately 16 full-time equivalents. Included are the basic Huntsville-based team, plus SAIC specialists in San Diego, Ames Research Center, Chicago, and Colorado Springs performing specific tasks for which they are uniquely qualified.

  18. Cost and Schedule Analytical Techniques Development: Option 1

    NASA Technical Reports Server (NTRS)

    1996-01-01

    This Final Report summarizes the activities performed by Science Applications International Corporation (SAIC) for the base contract year from December 1, 1995 through November 30, 1996. The Final Report is in compliance with Paragraph 5 of Section F of the contract. This CSATD contract provides technical services and products to the NASA Marshall Space Flight Center's (MSFC) Engineering Cost Office (PPO3) and the Program Plans and Requirements Officer (PPO2). Detailed Monthly Progress Reports were submitted to MSFC in accordance with the contract's Statement of Work Section IV "Reporting and Documentation". These reports spelled out each month's specific work accomplishments, deliverables submitted, major meetings held, and other pertinent information. This Final Report will summarize these activities at a higher level.

  19. Using morphometric and analytical techniques to characterize elephant ivory.

    PubMed

    Singh, Rina Rani; Goyal, Surendra Prakash; Khanna, Param Pal; Mukherjee, Pulok Kumar; Sukumar, Raman

    2006-10-16

    There is a need to characterize Asian elephant ivory and compare it with African ivory for controlling illegal trade and implementation of national and international laws. In this paper, we characterize ivory of Asian and African elephants using Schreger angle measurements, elemental analysis {X-ray fluorescence (XRF), inductively coupled plasma-atomic emission spectroscopy (ICP-AES), and inductively coupled plasma-mass spectrometry (ICP-MS)} and isotopic analysis. We recorded Schreger angle characteristics of elephant ivory at three different zones in ivory samples of African (n=12) and Asian (n=28) elephants. The Schreger angle ranged from 32° to 145° and from 30° to 153° in Asian and African ivory, respectively. Elemental analysis (for Asian and African ivory) by XRF, ICP-AES and ICP-MS provided preliminary data. We attempted to ascertain the source of origin of Asian elephant ivory, as has been done for African ivory, based on isotopes of carbon, nitrogen and strontium. We determined isotopic ratios of carbon (n=31) and nitrogen (n=31), corresponding to diet and rainfall, respectively. Reference ivory samples from five areas within India were analyzed using both collagen and powdered samples, and the latter were found more suitable for forensic analysis. In our preliminary analysis, delta13C values ranged from -25.6±0.15‰ to -13.6±0.15‰ and delta15N values from 3.5±0.15‰ to 10.2±0.15‰.

  20. Novel computational and analytic techniques for nonlinear systems applied to structural and celestial mechanics

    NASA Astrophysics Data System (ADS)

    Elgohary, Tarek Adel Abdelsalam

    In this Dissertation, computational and analytic methods are presented to address nonlinear systems with applications in structural and celestial mechanics. Scalar Homotopy Methods (SHM) are first introduced for the solution of general systems of nonlinear algebraic equations. The methods are applied to the solution of postbuckling and limit load problems of solids and structures as exemplified by simple plane elastic frames, considering only geometrical nonlinearities. In many problems, instead of simply adopting a root solving method, it is useful to study the particular problem in more detail in order to establish an especially efficient and robust method. Such a problem arises in satellite geodesy coordinate transformation where a new highly efficient solution, providing global accuracy with a non-iterative sequence of calculations, is developed. Simulation results are presented to compare the solution accuracy and algorithm performance for applications spanning the LEO-to-GEO range of missions. Analytic methods are introduced to address problems in structural mechanics and astrodynamics. Analytic transfer functions are developed to address the frequency domain control problem of flexible rotating aerospace structures. The transfer functions are used to design a Lyapunov stable controller that drives the spacecraft to a target position while suppressing vibrations in the flexible appendages. In astrodynamics, a Taylor series based analytic continuation technique is developed to address the classical two-body problem. A key algorithmic innovation for the trajectory propagation is that the classical averaged approximation strategy is replaced with a rigorous series based solution for exactly computing the acceleration derivatives. Evidence is provided to demonstrate that high precision solutions are easily obtained with the analytic continuation approach. For general nonlinear initial value problems (IVPs), the method of Radial Basis Functions time domain
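
    The dissertation's two-body recursion is not reproduced here, but the general idea of Taylor-series analytic continuation for an initial value problem can be illustrated on a toy scalar problem: local Taylor coefficients are generated by a recursion, the truncated series is evaluated a short step ahead, and the expansion is restarted at the new point. Everything in the sketch below is illustrative only.

        # Toy Taylor-series analytic continuation: propagate y' = y**2, y(0) = 1,
        # whose exact solution 1/(1 - t) has a singularity at t = 1, by re-expanding
        # a truncated local series at each step.
        def taylor_step(y0, h, order=20):
            """One continuation step for y' = y**2 using local Taylor coefficients c[k]."""
            c = [y0]
            for n in range(order):
                # y' = y**2  =>  (n + 1) * c[n+1] = sum_k c[k] * c[n-k]   (Cauchy product)
                c.append(sum(c[k] * c[n - k] for k in range(n + 1)) / (n + 1))
            return sum(ck * h**k for k, ck in enumerate(c))    # evaluate the local series at h

        t, y, h = 0.0, 1.0, 0.01
        while t < 0.90:                       # stay inside the singularity at t = 1
            y = taylor_step(y, h)
            t += h
        print(y, 1.0 / (1.0 - t))             # continued solution vs. exact value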

  1. An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Collis, Peter

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…

  2. A task-based analytical framework for ultrasonic beamformer comparison.

    PubMed

    Nguyen, Nghia Q; Prager, Richard W; Insana, Michael F

    2016-08-01

    A task-based approach is employed to develop an analytical framework for ultrasound beamformer design and evaluation. In this approach, a Bayesian ideal-observer provides an idealized starting point and a way to measure information loss in practical beamformer designs. Different approximations of this ideal strategy are shown to lead to popular beamformers in the literature, including the matched filter, minimum variance (MV), and Wiener filter (WF) beamformers. Analysis of the approximations indicates that the WF beamformer should outperform the MV approach, especially in low echo signal-to-noise conditions. The beamformers are applied to five typical tasks from the BI-RADS lexicon. Their performance is evaluated based on their ability to discriminate idealized malignant and benign features. The numerical results show the advantages of the WF over the MV technique in general, although performance varies predictably in some contrast-limited tasks because of the model modifications required for the MV algorithm to avoid ill-conditioning. PMID:27586736
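
    The paper's task-based formulations are not reproduced here, but the textbook forms of the two data-adaptive beamformers it compares are easy to state: the MV (Capon) weights are w = R^-1 a / (a^H R^-1 a), and one common Wiener-filter variant scales this solution by an echo-SNR-dependent gain. The sketch below is a generic illustration under those assumptions, with diagonal loading standing in for the conditioning fixes the abstract alludes to; the covariance, steering vector and signal power are placeholders.

        # Generic MV (Capon/MVDR) weights with diagonal loading, plus a Wiener-type
        # scaling; illustrative textbook forms, not the paper's task-based design.
        import numpy as np

        def mvdr_weights(R, a, loading=1e-3):
            """w = R^-1 a / (a^H R^-1 a), with diagonal loading for ill-conditioned R."""
            Rl = R + loading * np.trace(R).real / R.shape[0] * np.eye(R.shape[0])
            Ri_a = np.linalg.solve(Rl, a)
            return Ri_a / (a.conj() @ Ri_a)

        def wiener_scaled(R, a, sig_power, loading=1e-3):
            """Scale the MVDR solution by the Wiener gain snr / (1 + snr)."""
            Rl = R + loading * np.trace(R).real / R.shape[0] * np.eye(R.shape[0])
            Ri_a = np.linalg.solve(Rl, a)
            w_mv = Ri_a / (a.conj() @ Ri_a)
            snr = sig_power * (a.conj() @ Ri_a).real
            return (snr / (1.0 + snr)) * w_mv

        M = 8                                        # array channels
        a = np.ones(M, dtype=complex)                # steering vector toward the focal point
        R = np.eye(M, dtype=complex)                 # sample noise covariance (placeholder)
        w = wiener_scaled(R, a, sig_power=0.5)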

  3. Uncovering category specificity of genital sexual arousal in women: The critical role of analytic technique.

    PubMed

    Pulverman, Carey S; Hixon, J Gregory; Meston, Cindy M

    2015-10-01

    Based on analytic techniques that collapse data into a single average value, it has been reported that women lack category specificity and show genital sexual arousal to a large range of sexual stimuli including those that both match and do not match their self-reported sexual interests. These findings may be a methodological artifact of the way in which data are analyzed. This study examined whether using an analytic technique that models data over time would yield different results. Across two studies, heterosexual (N = 19) and lesbian (N = 14) women viewed erotic films featuring heterosexual, lesbian, and gay male couples, respectively, as their physiological sexual arousal was assessed with vaginal photoplethysmography. Data analysis with traditional methods comparing average genital arousal between films failed to detect specificity of genital arousal for either group. When data were analyzed with smoothing regression splines and a within-subjects approach, both heterosexual and lesbian women demonstrated different patterns of genital sexual arousal to the different types of erotic films, suggesting that sophisticated statistical techniques may be necessary to more fully understand women's genital sexual arousal response. Heterosexual women showed category-specific genital sexual arousal. Lesbian women showed higher arousal to the heterosexual film than the other films. However, within subjects, lesbian women showed significantly different arousal responses suggesting that lesbian women's genital arousal discriminates between different categories of stimuli at the individual level. Implications for the future use of vaginal photoplethysmography as a diagnostic tool of sexual preferences in clinical and forensic settings are discussed. PMID:26118962
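
    As a purely statistical illustration of why the analytic choice matters (toy data, not the study's measurements), the sketch below contrasts a collapsed mean, which can be nearly identical for two stimulus categories, with smoothing-spline fits that reveal different response trajectories over time:

        # Toy contrast between collapsed means and time-resolved spline fits.
        import numpy as np
        from scipy.interpolate import UnivariateSpline

        t = np.linspace(0.0, 180.0, 400)                                    # seconds into the film
        resp_a = 0.6 * (t / 180.0) + 0.05 * np.random.randn(t.size)         # rising response
        resp_b = 0.6 * (1.0 - t / 180.0) + 0.05 * np.random.randn(t.size)   # falling response

        print(resp_a.mean(), resp_b.mean())        # collapsed averages look alike

        s = len(t) * 0.05 ** 2                     # smoothing parameter ~ n * noise variance
        spline_a = UnivariateSpline(t, resp_a, s=s)
        spline_b = UnivariateSpline(t, resp_b, s=s)
        diff_curve = spline_a(t) - spline_b(t)     # time-resolved difference between categories
        print(np.abs(diff_curve).max())            # large even though the means agree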

  4. Analytical impact time and angle guidance via time-varying sliding mode technique.

    PubMed

    Zhao, Yao; Sheng, Yongzhi; Liu, Xiangdong

    2016-05-01

    To concretely provide a feasible solution for homing missiles with the precise impact time and angle, this paper develops a novel guidance law based on the nonlinear engagement dynamics. The guidance law is firstly designed with the prior assumption of a stationary target, followed by the practical extension to a moving target scenario. The time-varying sliding mode (TVSM) technique is applied to fulfill the terminal constraints, in which a specific TVSM surface is constructed with two unknown coefficients. One is tuned to meet the impact time requirement and the other one is targeted with a global sliding mode, so that the impact angle constraint as well as the zero miss distance can be satisfied. Because the proposed law possesses three guidance gains as design parameters, the intercept trajectory can be shaped according to the operational conditions and the missile's capability. To improve the tolerance of initial heading errors and broaden the application, a new frame of reference is also introduced. Furthermore, the analytical solutions of the flight trajectory, heading angle and acceleration command can be totally expressed for the prediction and offline parameter selection by solving a first-order linear differential equation. Numerical simulation results for various scenarios validate the effectiveness of the proposed guidance law and demonstrate the accuracy of the analytic solutions. PMID:26952314

  6. Optical microscopy as a comparative analytical technique for single-particle dissolution studies.

    PubMed

    Svanbäck, Sami; Ehlers, Henrik; Yliruusi, Jouko

    2014-07-20

    Novel, simple and cost-effective methods are needed to replace advanced chemical analytical techniques in small-scale dissolution studies. Optical microscopy of individual particles could provide such a method. The aim of the present work was to investigate and verify the applicability of optical microscopy as an analytical technique for drug dissolution studies. The evaluation was performed by comparing image and chemical analysis data of individual dissolving particles. It was shown that the data obtained by image analysis and UV-spectrophotometry produced practically identical dissolution curves, with average similarity and difference factors above 82 and below 4, respectively. The relative standard deviation for image analysis data, over the studied particle size range, varied between 1.9% and 3.8%. Consequently, it is proposed that image analysis can be used, on its own, as a viable analytical technique in single-particle dissolution studies. The possibility for significant reductions in sample preparation, operational cost, time and substance consumption gives optical detection a clear advantage over chemical analytical methods. Thus, image analysis could be an ideal and universal analytical technique for rapid small-scale dissolution studies.
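
    Assuming the reported similarity and difference factors are the standard f2 and f1 dissolution comparison metrics (this reading is an assumption, not stated explicitly in the abstract), they can be computed from two dissolution curves as follows; the example profiles are invented:

        # Standard f1 (difference) and f2 (similarity) factors for two dissolution curves.
        import numpy as np

        def difference_factor_f1(ref, test):
            ref, test = np.asarray(ref, float), np.asarray(test, float)
            return 100.0 * np.abs(ref - test).sum() / ref.sum()

        def similarity_factor_f2(ref, test):
            ref, test = np.asarray(ref, float), np.asarray(test, float)
            msd = np.mean((ref - test) ** 2)                    # mean squared difference
            return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

        uv_curve    = [12, 31, 55, 74, 86, 93, 97]              # % dissolved (UV spectrophotometry)
        image_curve = [11, 33, 54, 72, 87, 94, 96]              # % dissolved (image analysis)
        print(similarity_factor_f2(uv_curve, image_curve))      # ~89, i.e. above 82 as in the study
        print(difference_factor_f1(uv_curve, image_curve))      # ~2, i.e. below 4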

  7. Field analytical techniques for mercury in soils technology evaluation. Topical report, November 1994--March 1997

    SciTech Connect

    Solc, J.; Harju, J.A.; Grisanti, A.A.

    1998-02-01

    This report presents the evaluation of the four field analytical techniques for mercury detection in soils, namely (1) an anodic stripping voltammetry (ASV) technique developed and tested by General Electric Corporation; (2) a static headspace analysis (SHSA) technique developed and tested by Dr. Ralph Turner of Oak Ridge National Laboratory; (3) the BiMelyze® Mercury Immunoassay (Bio) developed and tested by BioNebraska, Inc.; and (4) a transportable x-ray fluorescence (XRF) instrument/technique developed and tested by Spectrace, Inc.

  8. Integration of datasets from different analytical techniques to assess the impact of nutrition on human metabolome

    PubMed Central

    Vernocchi, Pamela; Vannini, Lucia; Gottardi, Davide; Del Chierico, Federica; Serrazanetti, Diana I.; Ndagijimana, Maurice; Guerzoni, Maria E.

    2012-01-01

    Bacteria colonizing the human intestinal tract exhibit a high phylogenetic diversity that reflects their immense metabolic potential. The catalytic activity of gut microbes has an important impact on gastrointestinal (GI) functions and host health. The microbial conversion of carbohydrates and other food components leads to the formation of a large number of compounds that affect the host metabolome and have beneficial or adverse effects on human health. Metabolomics is a systems-biology approach focused on understanding the metabolic responses of living systems to physio-pathological stimuli by applying multivariate statistical analysis to data on human body fluids obtained with different instrumental techniques. A metabolomic approach based on an analytical platform should be able to separate, detect, characterize and quantify a wide range of metabolites and their metabolic pathways. This approach has recently been applied to study the metabolic changes triggered in the gut microbiota by specific diet components and diet variations, specific diseases, and probiotic and synbiotic food intake. This review describes the metabolomic data obtained by analyzing human fluids using different techniques, particularly Gas Chromatography Mass Spectrometry with Solid-phase Micro Extraction (GC-MS/SPME), Proton Nuclear Magnetic Resonance (1H-NMR) Spectroscopy and Fourier Transform Infrared (FTIR) Spectroscopy. This instrumental approach has good potential for the identification and detection of biomarkers of specific food intake and diseases. PMID:23248777

  9. Forecasting Device Effectiveness: Volume III. Analytic Assessment of Device Effectiveness Forecasting Technique. Final Report.

    ERIC Educational Resources Information Center

    Rose, Andrew M.; And Others

    This third of three volumes reports on analytic procedures conducted to address various aspects of the scalar properties of the Device Effectiveness Forecasting Technique (DEFT). DEFT, a series of microcomputer programs applied to data gathered from rating scales, is used to evaluate simulator devices used in U.S. Army weapons training. The…

  10. Analytical techniques for characterization of cyclodextrin complexes in the solid state: A review.

    PubMed

    Mura, Paola

    2015-09-10

    Cyclodextrins are cyclic oligosaccharides able to form inclusion complexes with a variety of hydrophobic guest molecules, positively modifying their physicochemical properties. A thorough analytical characterization of cyclodextrin complexes is of fundamental importance to provide adequate support in the selection of the most suitable cyclodextrin for each guest molecule, and also in view of possible future patenting and marketing of drug-cyclodextrin formulations. The demonstration of the actual formation of a drug-cyclodextrin inclusion complex in solution does not guarantee its existence also in the solid state. Moreover, the technique used to prepare the solid complex can strongly influence the properties of the final product. Therefore, an appropriate characterization of the drug-cyclodextrin solid systems obtained also has a key role in driving the choice of the most effective preparation method, able to maximize host-guest interactions. The analytical characterization of drug-cyclodextrin solid systems and the assessment of the actual inclusion complex formation are not simple tasks and involve the combined use of several analytical techniques, whose results have to be evaluated together. The objective of the present review is to present a general prospect of the principal analytical techniques which can be employed for a suitable characterization of drug-cyclodextrin systems in the solid state, evidencing their respective potential advantages and limits. The applications of each examined technique are described and discussed with pertinent examples from the literature.

  11. An analytic reconstruction method for PET based on cubic splines

    NASA Astrophysics Data System (ADS)

    Kastis, George A.; Kyriakopoulou, Dimitra; Fokas, Athanasios S.

    2014-03-01

    PET imaging is an important nuclear medicine modality that measures in vivo distribution of imaging agents labeled with positron-emitting radionuclides. Image reconstruction is an essential component in tomographic medical imaging. In this study, we present the mathematical formulation and an improved numerical implementation of an analytic, 2D, reconstruction method called SRT, the Spline Reconstruction Technique. This technique is based on the numerical evaluation of the Hilbert transform of the sinogram via an approximation in terms of 'custom made' cubic splines. It also imposes sinogram thresholding which restricts reconstruction only within object pixels. Furthermore, by utilizing certain symmetries it achieves a reconstruction time similar to that of FBP. We have implemented SRT in the software library called STIR and have evaluated this method using simulated PET data. We present reconstructed images from several phantoms. Sinograms have been generated at various Poisson noise levels and 20 realizations of noise have been created at each level. In addition to visual comparisons of the reconstructed images, the contrast has been determined as a function of noise level. Further analysis includes the creation of line profiles, when necessary, to determine resolution. Numerical simulations suggest that the SRT algorithm produces fast and accurate reconstructions at realistic noise levels. The contrast is over 95% in all phantoms examined and is independent of noise level.

  12. The analytical investigations of ancient pottery from Kaveripakkam, Vellore dist, Tamilnadu by spectroscopic techniques.

    PubMed

    Ravisankar, R; Naseerutheen, A; Annamalai, G Raja; Chandrasekaran, A; Rajalakshmi, A; Kanagasabapathy, K V; Prasad, M V R; Satpathy, K K

    2014-01-01

    Analytical investigations using Fourier Transform infrared spectroscopy (FT-IR), Powder X-ray Diffraction (PXRD), Thermal Analysis (TG-DTA), Scanning Electron Microscopy (SEM) and Energy Dispersive X-ray Fluorescence Spectrometry (EDXRF) were carried out on ancient pottery fragments from Kaveripakkam, in order to outline manufacturing skills, technological information, and the firing conditions and temperatures of the potteries. The whole set of data showed a firing temperature in the range of 800-900°C. The analytical characterization of the potsherds by different complementary techniques has allowed the identification of the raw materials and the technology applied by the ancient artisans.

  13. The analytical investigations of ancient pottery from Kaveripakkam, Vellore dist, Tamilnadu by spectroscopic techniques

    NASA Astrophysics Data System (ADS)

    Ravisankar, R.; Naseerutheen, A.; Raja Annamalai, G.; Chandrasekaran, A.; Rajalakshmi, A.; Kanagasabapathy, K. V.; Prasad, M. V. R.; Satpathy, K. K.

    2014-03-01

    Analytical investigations using Fourier Transform infrared spectroscopy (FT-IR), Powder X-ray Diffraction (PXRD), Thermal Analysis (TG-DTA), Scanning Electron Microscopy (SEM) and Energy Dispersive X-ray Fluorescence Spectrometry (EDXRF) were carried out on ancient pottery fragments from Kaveripakkam, in order to outline manufacturing skills, technological information, and the firing conditions and temperatures of the potteries. The whole set of data showed a firing temperature in the range of 800-900 °C. The analytical characterization of the potsherds by different complementary techniques has allowed the identification of the raw materials and the technology applied by the ancient artisans.

  14. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze the analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparisons of objects based on their data profile (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through, e.g., interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable for comparison of analytical methods or simply as a tool for optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate the analytical performance taking into account all indicators simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which simultaneously are used for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently, (2) a "distance" to the Reference laboratory and (3) a classification based on the concept of "peculiar points".
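
    The ordinal comparison itself is simple to sketch (this is an illustration of the idea, not the authors' software; laboratories, indicators and values are invented): one laboratory is ranked above another only if it is at least as good on every indicator, and laboratories that trade off indicators against each other remain incomparable.

        # Toy partial-order (dominance) comparison of laboratory data profiles;
        # each profile is (|bias from reference|, standard deviation, |skewness|),
        # smaller being better on every indicator.
        labs = {
            "Lab1": (0.2, 0.10, 0.1),
            "Lab2": (0.5, 0.20, 0.3),
            "Lab3": (0.1, 0.25, 0.2),
            "Ref":  (0.0, 0.05, 0.0),      # reference laboratory
        }

        def dominates(a, b):
            """True if profile a is at least as good as profile b on all indicators."""
            return all(x <= y for x, y in zip(a, b))

        for name, profile in labs.items():
            better = [m for m, p in labs.items() if m != name and dominates(profile, p)]
            incomparable = [m for m, p in labs.items() if m != name
                            and not dominates(profile, p) and not dominates(p, profile)]
            print(name, "| better than:", better, "| incomparable with:", incomparable)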

  15. Assessment of analytical techniques for predicting solid propellant exhaust plumes and plume impingement environments

    NASA Technical Reports Server (NTRS)

    Tevepaugh, J. A.; Smith, S. D.; Penny, M. M.

    1977-01-01

    An analysis of experimental nozzle, exhaust plume, and exhaust plume impingement data is presented. The data were obtained for subscale solid propellant motors with propellant Al loadings of 2, 10 and 15% exhausting to simulated altitudes of 50,000, 100,000 and 112,000 ft. Analytical predictions were made using a fully coupled two-phase method of characteristics numerical solution and a technique for defining thermal and pressure environments experienced by bodies immersed in two-phase exhaust plumes.

  16. Paper-based inkjet-printed microfluidic analytical devices.

    PubMed

    Yamada, Kentaro; Henares, Terence G; Suzuki, Koji; Citterio, Daniel

    2015-04-27

    Rapid, precise, and reproducible deposition of a broad variety of functional materials, including analytical assay reagents and biomolecules, has made inkjet printing an effective tool for the fabrication of microanalytical devices. A ubiquitous office device as simple as a standard desktop printer with its multiple ink cartridges can be used for this purpose. This Review discusses the combination of inkjet printing technology with paper as a printing substrate for the fabrication of microfluidic paper-based analytical devices (μPADs), which have developed into a fast-growing new field in analytical chemistry. After introducing the fundamentals of μPADs and inkjet printing, it touches on topics such as the microfluidic patterning of paper, tailored arrangement of materials, and functionalities achievable exclusively by the inkjet deposition of analytical assay components, before concluding with an outlook on future perspectives.

  17. SIFT - A Component-Based Integration Architecture for Enterprise Analytics

    SciTech Connect

    Thurman, David A.; Almquist, Justin P.; Gorton, Ian; Wynne, Adam S.; Chatterton, Jack

    2007-02-01

    Architectures and technologies for enterprise application integration are relatively mature, resulting in a range of standards-based and proprietary middleware technologies. In the domain of complex analytical applications, integration architectures are not so well understood. Analytical applications such as those used in scientific discovery, emergency response, financial and intelligence analysis exert unique demands on their underlying architecture. These demands make existing integration middleware inappropriate for use in enterprise analytics environments. In this paper we describe SIFT (Scalable Information Fusion and Triage), a platform designed for integrating the various components that comprise enterprise analytics applications. SIFT exploits a common pattern for composing analytical components, and extends an existing messaging platform with dynamic configuration mechanisms and scaling capabilities. We demonstrate the use of SIFT to create a decision support platform for quality control based on large volumes of incoming delivery data. The strengths of the SIFT solution are discussed, and we conclude by describing where further work is required to create a complete solution applicable to a wide range of analytical application domains.
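
    SIFT's actual interfaces are not reproduced here, but the composition pattern it builds on, analytical components chained over a messaging layer, can be sketched generically (queue names, transforms and data are all invented for the illustration):

        # Generic sketch of chaining analytical components over message queues;
        # this is an illustration of the pattern, not SIFT's API.
        from queue import Queue
        from threading import Thread

        def component(transform, inbox, outbox):
            """Consume messages, apply one analytical step, forward the result."""
            while True:
                msg = inbox.get()
                if msg is None:                 # shutdown marker
                    outbox.put(None)
                    break
                outbox.put(transform(msg))

        ingest, parsed, triaged = Queue(), Queue(), Queue()
        Thread(target=component, args=(lambda m: m.strip().split(","), ingest, parsed)).start()
        Thread(target=component,
               args=(lambda rec: ("ALERT" if float(rec[1]) > 100 else "ok", rec), parsed, triaged)).start()

        for line in ["pkg-1,42.0", "pkg-2,250.0"]:   # toy incoming delivery records
            ingest.put(line)
        ingest.put(None)

        while (out := triaged.get()) is not None:
            print(out)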

  18. A Meta-Analytic Review of School-Based Prevention for Cannabis Use

    ERIC Educational Resources Information Center

    Porath-Waller, Amy J.; Beasley, Erin; Beirness, Douglas J.

    2010-01-01

    This investigation used meta-analytic techniques to evaluate the effectiveness of school-based prevention programming in reducing cannabis use among youth aged 12 to 19. It summarized the results from 15 studies published in peer-reviewed journals since 1999 and identified features that influenced program effectiveness. The results from the set of…

  19. Role of Knowledge Management and Analytical CRM in Business: Data Mining Based Framework

    ERIC Educational Resources Information Center

    Ranjan, Jayanthi; Bhatnagar, Vishal

    2011-01-01

    Purpose: The purpose of the paper is to provide a thorough analysis of the concepts of business intelligence (BI), knowledge management (KM) and analytical CRM (aCRM) and to establish a framework for integrating all the three to each other. The paper also seeks to establish a KM and aCRM based framework using data mining (DM) techniques, which…

  20. A Meta-Analytic Review of the Role of Instructional Support in Game-Based Learning

    ERIC Educational Resources Information Center

    Wouters, Pieter; van Oostendorp, Herre

    2013-01-01

    Computer games can be considered complex learning environments in which players require instructional support to engage in cognitive processes such as selecting and actively organizing/integrating new information. We used meta-analytical techniques to test if instructional support enhances learning in game-based learning (k = 107, N[subscript adj]…

  1. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need for new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMO analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview of genetically modified crop development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, including "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart.

  2. Operator-based analytic theory of decoherence in NMR

    NASA Astrophysics Data System (ADS)

    Pandey, Manoj Kumar; Ramachandran, Ramesh

    2011-06-01

    The operator-based analytic description of polarization transfer in NMR spectroscopy is often fraught with difficulty due to (a) the dimension and (b) the non-commuting nature of the spin Hamiltonians. In this article, an analytic model is presented to elucidate the mechanism of polarization transfer between dilute spins I1 and I2 coupled to a reservoir of abundant S-spins in the solid state. Specifically, the factors responsible for the decoherence observed in double cross-polarization (DCP) experiments are outlined in terms of operators via effective Floquet Hamiltonians. The interplay between the various anisotropic interactions is thoroughly investigated by comparing the simulations from the analytic theory with exact numerical methods. The analytical theory presents a framework for incorporating multi-spin effects within a reduced subspace spanned by spins I1 and I2. The simulation results from the analytic model comprising eight spins are in excellent agreement with the numerical methods and present an attractive tool for understanding the phenomenon of decoherence in NMR.

  3. Modeling HTL of industrial workers using multiple regression and path analytic techniques.

    PubMed

    Smith, C R; Seitz, M R; Borton, T E; Kleinstein, R N; Wilmoth, J N

    1984-04-01

    This study compared path analytic with multiple regression analyses of hearing threshold levels (HTLs) on 258 adult textile workers evenly divided into low- and high-noise exposure groups. Demographic variables common in HTL studies were examined, with the addition of iris color, as well as selected two-way interactions. Variables of interest were similarly distributed in both groups. The results indicated that (1) different statistical procedures can lead to different conclusions even with the same HTL data for the same Ss; (2) conflicting conclusions may be artifacts of the analytic methodologies employed for data analysis; (3) a well-formulated theory under which path analytic techniques are employed may clarify somewhat the way a variable affects HTL values through its correlational connections with other antecedent variables included in the theoretical model; (4) multicollinearity among independent variables on which HTL is regressed usually presents a problem in unraveling exactly how each variable influences noise-induced hearing loss; and (5) because of the contradictory nature of its direct and indirect effects on HTL, iris color provides little, if any, explanatory assistance for modeling HTL.
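
    To make the third point concrete, a path-analytic decomposition splits an exposure's standardized effect on HTL into a direct path and an indirect path routed through an antecedent variable. The sketch below is a toy mediation example with simulated data and hypothetical variable names; it is not the study's model:

        # Toy path-analytic decomposition: direct and indirect (via an antecedent
        # variable) standardized effects of noise exposure on HTL. Simulated data only.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 258
        noise = rng.normal(size=n)                             # standardized noise exposure
        age = 0.3 * noise + rng.normal(scale=0.95, size=n)     # antecedent correlated with exposure
        htl = 0.5 * noise + 0.4 * age + rng.normal(scale=0.8, size=n)

        def std_coefs(y, X):
            """Standardized regression coefficients via least squares."""
            Xs = (X - X.mean(0)) / X.std(0)
            ys = (y - y.mean()) / y.std()
            return np.linalg.lstsq(Xs, ys, rcond=None)[0]

        a = std_coefs(age, noise[:, None])[0]                  # noise -> age path
        b_direct, b_age = std_coefs(htl, np.column_stack([noise, age]))
        indirect = a * b_age                                   # noise -> age -> HTL path
        print("direct:", b_direct, "indirect:", indirect, "total:", b_direct + indirect)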

  4. Evaluation of available analytical techniques for monitoring the quality of space station potable water

    NASA Technical Reports Server (NTRS)

    Geer, Richard D.

    1989-01-01

    To assure the quality of potable water (PW) on the Space Station (SS), a number of chemical and physical tests must be conducted routinely. After reviewing the requirements for potable water, both direct and indirect analytical methods are evaluated that could make the required tests and improvements compatible with Space Station operation. A variety of suggestions are made to improve the analytical techniques for SS operation. The most important recommendations are: (1) the silver/silver chloride electrode (SB) method of removing the I2/I(-) biocide from the water, since it may interfere with analytical procedures for PW and also its end uses; (2) the orbital reactor (OR) method of carrying out chemistry and electrochemistry in microgravity by using a disk-shaped reactor on an orbital table to impart an artificial G force to the contents, allowing solution mixing and separation of gases and liquids; and (3) a simple ultra-low-volume, highly sensitive electrochemical/conductivity detector for use with a capillary zone electrophoresis apparatus. It is also recommended, since several different conductivity and resistance measurements are made during the analysis of PW, that the bipolar pulse measuring circuit be used in all these applications for maximum compatibility and redundancy of equipment.

  5. Analytical techniques for characterization of cyclodextrin complexes in aqueous solution: a review.

    PubMed

    Mura, Paola

    2014-12-01

    Cyclodextrins are cyclic oligosaccharides endowed with a hydrophilic outer surface and a hydrophobic inner cavity, able to form inclusion complexes with a wide variety of guest molecules, positively affecting their physicochemical properties. In particular, in the pharmaceutical field, cyclodextrin complexation is mainly used to increase the aqueous solubility and dissolution rate of poorly soluble drugs, and to enhance their bioavailability and stability. Analytical characterization of host-guest interactions is of fundamental importance for fully exploiting the potential benefits of complexation, helping in the selection of the most appropriate cyclodextrin. The assessment of the actual formation of a drug-cyclodextrin inclusion complex and its full characterization is not a simple task and often requires the use of different analytical methods, whose results have to be combined and examined together. The purpose of the present review is to give, as much as possible, a general overview of the main analytical tools which can be employed for the characterization of drug-cyclodextrin inclusion complexes in solution, with emphasis on their respective potential merits, disadvantages and limits. Further, the applicability of each examined technique is illustrated and discussed by specific examples from the literature.

  6. New analytical expressions of the Rossiter-McLaughlin effect adapted to different observation techniques

    NASA Astrophysics Data System (ADS)

    Boué, G.; Montalto, M.; Boisse, I.; Oshagh, M.; Santos, N. C.

    2013-02-01

    The Rossiter-McLaughlin (hereafter RM) effect is a key tool for measuring the projected spin-orbit angle between stellar spin axes and the orbits of transiting planets. However, the measured radial velocity (RV) anomalies produced by this effect are not intrinsic and depend on both instrumental resolution and data reduction routines. Using inappropriate formulas to model the RM effect introduces biases, at least in the projected velocity Vsini⋆ compared to the spectroscopic value. Currently, only the iodine cell technique has been modeled, which corresponds to observations done by, e.g., the HIRES spectrograph of the Keck telescope. In this paper, we provide a simple expression of the RM effect specially designed to model observations done by the Gaussian fit of a cross-correlation function (CCF) as in the routines performed by the HARPS team. We derived a new analytical formulation of the RV anomaly associated with the iodine cell technique. For both formulas, we modeled the subplanet mean velocity vp and dispersion βp accurately, taking the rotational broadening of the subplanet profile into account. We compare our formulas adapted to the CCF technique with simulated data generated with the numerical software SOAP-T and find good agreement up to Vsini⋆ ≲ 20 km s-1. In contrast, the analytical models simulating the two different observation techniques can disagree by about 10σ in Vsini⋆ for large spin-orbit misalignments. It is thus important to apply the adapted model when fitting data. A public code implementing the expressions derived in this paper is available at http://www.astro.up.pt/resources/arome. A copy of the code is also available at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/550/A53

  7. Comparison of analytical techniques for the determination of aldehydes in test chambers.

    PubMed

    Salthammer, Tunga; Mentese, Sibel

    2008-11-01

    The level of carbonyl compounds in indoor air is crucial due to possible health effects and the high prevalence of their potential sources. Therefore, selecting a convenient and rapid analytical technique for the reliable detection of carbonyl compound concentrations is important. The acetyl acetone (acac) method is a widely used standard procedure for detecting gaseous formaldehyde. For measuring formaldehyde along with other carbonyl compounds, the DNPH-method is commonly applied. The recommended procedure for measuring volatile organic compounds (VOCs) is sampling on Tenax TA, followed by thermal desorption and GC/MS analysis. In this study, different analytical techniques for the quantification of formaldehyde, pentanal, and hexanal are critically compared. It was found that the acac- and DNPH-method are in very good agreement for formaldehyde. In contrast, the DNPH-method significantly underestimates indoor air concentrations of the higher aldehydes in comparison to sampling on Tenax TA, although both methods are strongly correlated. The reported results are part of the EURIMA-WKI study on levels of indoor air pollutants resulting from construction, building materials and interior decoration.

  8. Characterization and source identification of hydrocarbons in water samples using multiple analytical techniques.

    PubMed

    Wang, Zhendi; Li, K; Fingas, M; Sigouin, L; Ménard, L

    2002-09-20

    This paper describes a case study in which multiple analytical techniques were used to identify and characterize trace petroleum-related hydrocarbons and other volatile organic compounds in groundwater samples collected in a bedrock aquifer exploited for drinking water purposes. The objective of the study was to confirm the presence of gasoline and other petroleum products or other volatile organic pollutants in those samples in order to assess the respective implication of each of the potentially responsible parties to the contamination of the aquifer. In addition, the degree of contamination at different depths in the aquifer was also of interest. The analytical techniques used for analyses of water samples included gas chromatography-mass spectrometry (GC-MS) and capillary GC with flame-ionization detection, solid-phase microextraction and headspace GC-MS techniques. Chemical characterization results revealed the following: (1) The hydrocarbons in sample A (near-surface groundwater, 0-5 m) were clearly of two types, one being gasoline and the other a heavy petroleum product. The significant distribution of five target petroleum-characteristic alkylated polycyclic aromatic hydrocarbon homologues and biomarkers confirmed the presence of another heavy petroleum product. The concentrations of the TPHs (total petroleum hydrocarbons) and BTEX (collective name for benzene, toluene, ethylbenzene, and p-, m-, and o-xylenes) were determined to be 1070 and 155 microg/kg of water for sample A, respectively. (2) The deepest groundwater (sample B, collected at a depth ranging between 15 and 60 m) was also contaminated, but to a much lesser degree. The concentrations of the TPH and BTEX were determined to be only 130 and 2.6 microg/kg of water for sample B, respectively. (3) The presence of a variety of volatile chlorinated compounds to the groundwater was also clearly identified. PMID:12350112

  10. Big Data Analytics in Immunology: A Knowledge-Based Approach

    PubMed Central

    Zhang, Guang Lan

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and to bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflows. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflows. PMID:25045677

  11. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom and to bridge the knowledge gap and application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases combining well-annotated data with specialized analytical tools and integrating them into analytical workflows. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we enabled streamlining of the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflows. PMID:25045677

  13. General Analytical Schemes for the Characterization of Pectin-Based Edible Gelled Systems

    PubMed Central

    Haghighi, Maryam; Rezaei, Karamatollah

    2012-01-01

    Pectin-based gelled systems have gained increasing attention for the design of newly developed food products. For this reason, the characterization of such formulas is a necessity in order to present scientific data and to introduce an appropriate finished product to the industry. Various analytical techniques are available for the evaluation of the systems formulated on the basis of pectin and the designed gel. In this paper, general analytical approaches for the characterization of pectin-based gelled systems were categorized into several subsections including physicochemical analysis, visual observation, textural/rheological measurement, microstructural image characterization, and psychorheological evaluation. Three-dimensional trials to assess correlations among microstructure, texture, and taste were also discussed. Practical examples of advanced objective techniques, including experimental setups for small and large deformation rheological measurements and microstructural image analysis, were presented in more detail. PMID:22645484

  14. General analytical schemes for the characterization of pectin-based edible gelled systems.

    PubMed

    Haghighi, Maryam; Rezaei, Karamatollah

    2012-01-01

    Pectin-based gelled systems have gained increasing attention for the design of newly developed food products. For this reason, the characterization of such formulas is a necessity in order to present scientific data and to introduce an appropriate finished product to the industry. Various analytical techniques are available for the evaluation of the systems formulated on the basis of pectin and the designed gel. In this paper, general analytical approaches for the characterization of pectin-based gelled systems were categorized into several subsections including physicochemical analysis, visual observation, textural/rheological measurement, microstructural image characterization, and psychorheological evaluation. Three-dimensional trials to assess correlations among microstructure, texture, and taste were also discussed. Practical examples of advanced objective techniques, including experimental setups for small and large deformation rheological measurements and microstructural image analysis, were presented in more detail. PMID:22645484

  15. Lignocellulose-based analytical devices: bamboo as an analytical platform for chemical detection

    PubMed Central

    Kuan, Chen-Meng; York, Roger L.; Cheng, Chao-Min

    2015-01-01

    This article describes the development of lignocellulose-based analytical devices (LADs) for rapid bioanalysis in low-resource settings. LADs are constructed using either a single lignocellulose or a hybrid design consisting of multiple types of lignocellulose. LADs are simple, low-cost, easy to use, provide a rapid response, and do not require external instrumentation during operation. Here, we demonstrate the implementation of LADs for food and water safety (i.e., nitrite assay in hot-pot soup, bacterial detection in water, and resazurin assay in milk) and urinalysis (i.e., nitrite, urobilinogen, and pH assays in human urine). Notably, we created a unique, low-cost approach using simple chemicals that achieves sensitivity similar to that of commercially available immunochromatographic strips and provides on-site, rapid detection of, for instance, Escherichia coli (E. coli) in water. PMID:26686576

  16. Review of the various analytical techniques and algorithms for detection and quantification of TATP

    NASA Astrophysics Data System (ADS)

    Pacheco-Londono, Leonardo; Primera, Oliva M.; Ramirez, Michael; Ruiz, Orlando; Hernandez-Rivera, Samuel

    2005-05-01

    The objective of this research is to design and develop a multi-sensor capable of fast detection and to optimize, by pattern recognition, the techniques used for quantification of TATP. In particular, the long-range goal of the research is to use sensor fusion and sensor "talking" modalities to couple stand-off detectors with chemical point detectors for detection of airborne chemical agents and detection of Improvised Explosive Devices (IEDs). Vibrational spectroscopy techniques are very fast and can be used for real-time detection. Good results have been obtained with various target molecular (chemical) systems such as TATP, TNT and DNT. Samples of TATP were detected and quantified in air, in solution and in solid phase on surfaces by different techniques. FTIR spectroscopy and GC-MS were used to generate new analytical procedures for detection and analysis of the organic peroxide. These procedures were compared and taken to their limits by optimization with chemometrics, Partial Least Squares (PLS), and Discriminant Analysis (DA).
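
    As a hedged illustration of the chemometric quantification step mentioned above (not the authors' procedure or data), the sketch below fits a Partial Least Squares (PLS) model to synthetic spectra containing a single analyte band and predicts concentrations; all band positions and concentration values are invented for the example.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        wavenumbers = np.linspace(600, 1800, 300)

        def spectrum(conc):
            # synthetic spectrum: analyte band at 940 cm-1 plus sloped background and noise
            band = conc * np.exp(-(wavenumbers - 940.0) ** 2 / (2 * 15.0 ** 2))
            background = 0.2 + 1e-4 * wavenumbers + 0.05 * rng.normal(size=wavenumbers.size)
            return band + background

        y = rng.uniform(0.0, 10.0, 60)                  # concentrations in arbitrary units
        X = np.array([spectrum(c) for c in y])

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
        pls = PLSRegression(n_components=3).fit(X_train, y_train)
        print("predicted:", pls.predict(X_test[:3]).ravel().round(2))
        print("true:     ", y_test[:3].round(2))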

  17. Applications of MEMS-based biochemical analytical instrumentation

    SciTech Connect

    Morse, J. D., LLNL

    1997-05-21

    The MicroTechnology Center at Lawrence Livermore National Laboratory is developing a variety of MEMS-Based analytical instrumentation systems in support of programmatic needs, along with numerous external customers. Several of the applications of interest are in the area of biochemical identification and analysis. These applications range from DNA fragment analysis and collection in support of the Human Genome Project, to detection of viruses or biological warfare agents. Each of the applications of interest has focused in micro-machined MEMS technology for reduced cost, higher throughput, and faster results. Development of these analytical instrumentation systems will have long term benefits for the medical community as well. The following describes the technologies several specific applications.

  18. Review of recent advances in analytical techniques for the determination of neurotransmitters

    PubMed Central

    Perry, Maura; Li, Qiang; Kennedy, Robert T.

    2009-01-01

    Methods and advances for monitoring neurotransmitters in vivo or for tissue analysis of neurotransmitters over the last five years are reviewed. The review is organized primarily by neurotransmitter type. Transmitter and related compounds may be monitored by either in vivo sampling coupled to analytical methods or implanted sensors. Sampling is primarily performed using microdialysis, but low-flow push-pull perfusion may offer advantages of spatial resolution while minimizing the tissue disruption associated with higher flow rates. Analytical techniques coupled to these sampling methods include liquid chromatography, capillary electrophoresis, enzyme assays, sensors, and mass spectrometry. Methods for the detection of amino acid, monoamine, neuropeptide, acetylcholine, nucleoside, and soluable gas neurotransmitters have been developed and improved upon. Advances in the speed and sensitivity of these methods have enabled improvements in temporal resolution and increased the number of compounds detectable. Similar advances have enabled improved detection at tissue samples, with a substantial emphasis on single cell and other small samples. Sensors provide excellent temporal and spatial resolution for in vivo monitoring. Advances in application to catecholamines, indoleamines, and amino acids have been prominent. Improvements in stability, sensitivity, and selectivity of the sensors have been of paramount interest. PMID:19800472

  19. Analytical techniques for the detection and identification of chemical warfare materials from environmental samples

    SciTech Connect

    Beaudry, W.T.; Weimaster, J.F.

    1995-06-01

    The detection and identification of chemical warfare (CW) material in diverse and complex matrices has become increasingly important to support the environmental clean-up of military and industrial sites that were historically used in the research, production, use, storage and/or demilitarization of chemical weapons. Reliable and defensible identification of hazardous materials (HM) is necessary to comply with the increasingly stringent regulations imposed by local, state, and federal agencies which govern handling, treatment, storage, and disposal of HM. In addition, before sites can be reutilized, existing HM must be properly identified so that the proper methods of removal, treatment and disposal can be determined. An overview of sample preparation and analytical techniques for the detection and identification of CW materials is presented in this paper.

  20. Role of nuclear analytical probe techniques in biological trace element research

    SciTech Connect

    Jones, K.W.; Pounds, J.G.

    1985-01-01

    Many biomedical experiments require the qualitative and quantitative localization of trace elements with high sensitivity and good spatial resolution. The feasibility of measuring the chemical form of the elements, the time course of trace elements metabolism, and of conducting experiments in living biological systems are also important requirements for biological trace element research. Nuclear analytical techniques that employ ion or photon beams have grown in importance in the past decade and have led to several new experimental approaches. Some of the important features of these methods are reviewed here along with their role in trace element research, and examples of their use are given to illustrate potential for new research directions. It is emphasized that the effective application of these methods necessitates a closely integrated multidisciplinary scientific team. 21 refs., 4 figs., 1 tab.

  1. Balloon-based interferometric techniques

    NASA Technical Reports Server (NTRS)

    Rees, David

    1985-01-01

    A balloon-borne triple-etalon Fabry-Perot interferometer, observing the Doppler shifts of absorption lines caused by molecular oxygen and water vapor in the far red/near infrared spectrum of backscattered sunlight, has been used to evaluate a passive spaceborne remote sensing technique for measuring winds in the troposphere and stratosphere. There have been two successful high-altitude balloon flights of the prototype UCL instrument from the National Scientific Balloon Facility at Palestine, TX (May 80, Oct. 83). The results from these flights have demonstrated that an interferometer with adequate resolution, stability and sensitivity can be built. The wind data are of comparable quality to those obtained from operational techniques (balloon and rocket sonde, cloud-top drift analysis, and from the gradient wind analysis of satellite radiance measurements). However, the interferometric data can provide a regular global grid, over a height range from 5 to 50 km in regions of clear air. Between the middle troposphere (5 km) and the upper stratosphere (40 to 50 km), an optimized instrument can make wind measurements over the daylit hemisphere with an accuracy of about 3 to 5 m/sec (2 sigma). It is possible to obtain full height profiles between altitudes of 5 and 50 km, with 4 km height resolution, and a spatial resolution of about 200 km along the orbit track. Below an altitude of about 10 km, Fraunhofer lines of solar origin are possible targets of the Doppler wind analysis. Above an altitude of 50 km, the weakness of the backscattered solar spectrum (decreasing air density) is coupled with the low absorption cross-section of all atmospheric species in the spectral region up to 800 nm (where imaging photon detectors can be used), causing the along-the-track resolution (or error) to increase beyond values useful for operational purposes. Within the region of optimum performance (5 to 50 km), however, the technique is a valuable potential complement to existing wind

  2. Screening of synthetic PDE-5 inhibitors and their analogues as adulterants: analytical techniques and challenges.

    PubMed

    Patel, Dhavalkumar Narendrabhai; Li, Lin; Kee, Chee-Leong; Ge, Xiaowei; Low, Min-Yong; Koh, Hwee-Ling

    2014-01-01

    The popularity of phosphodiesterase type 5 (PDE-5) enzyme inhibitors for the treatment of erectile dysfunction has led to an increase in the prevalence of illicit sexual performance enhancement products. PDE-5 inhibitors, namely sildenafil, tadalafil and vardenafil, and their unapproved designer analogues are being increasingly used as adulterants in herbal products and health supplements marketed for sexual performance enhancement. To date, more than 50 unapproved analogues of prescription PDE-5 inhibitors have been reported as adulterants in the literature. To avoid detection of such adulteration by standard screening protocols, the perpetrators of such illegal products are investing time and resources to synthesize exotic analogues and devise novel means of adulteration. A comprehensive review of conventional and advanced analytical techniques to detect and characterize the adulterants is presented. The rapid identification and structural elucidation of unknown analogues as adulterants is greatly enhanced by the wide range of analytical techniques employed, including high performance liquid chromatography (HPLC), gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), nuclear magnetic resonance (NMR) spectroscopy, vibrational spectroscopy, liquid chromatography-Fourier transform ion cyclotron resonance-mass spectrometry (LC-FT-ICR-MS), liquid chromatography-hybrid triple quadrupole linear ion trap mass spectrometry with information-dependent acquisition, ultra high performance liquid chromatography-time of flight-mass spectrometry (UHPLC-TOF-MS), ion mobility spectroscopy (IMS) and immunoassay methods. The many challenges in detecting and characterizing such adulterants, and the need for a concerted effort to curb adulteration in order to safeguard public safety and interest, are discussed.

  4. A Web-based Geovisual Analytical System for Climate Studies

    NASA Astrophysics Data System (ADS)

    Sun, M.; Li, J.; Yang, C.; Schmidt, G. A.; Bambacus, M.; Cahalan, R.; Huang, Q.; Xu, C.; Noble, E.

    2012-12-01

    Climate studies involve petabytes of spatiotemporal datasets that are produced and archived at distributed computing resources. Scientists need an intuitive and convenient tool to explore the distributed spatiotemporal data. Geovisual analytical tools have the potential to provide such an intuitive and convenient method for scientists to access climate data, discover the relationships between various climate parameters, and communicate the results across different research communities. However, implementing a geovisual analytical tool for complex climate data in a distributed environment poses several challenges. This paper reports our efforts in developing a web-based geovisual analytical system to support the analysis of climate data generated by Intergovernmental Panel on Climate Change (IPCC), Fourth Assessment Report (AR4) models. Using the ModelE developed by NASA Goddard Institute for Space Studies (GISS) as an example, we demonstrate that the system is able to 1) manage large volume datasets over the Internet, 2) visualize 3D/4D spatiotemporal data, 3) broker various spatiotemporal statistical analyses for climate research, and 4) support interactive data analysis and knowledge discovery. This research also provides an example of how to manage, disseminate, and analyze Big Data in the 21st century.

  5. Nine-analyte detection using an array-based biosensor

    NASA Technical Reports Server (NTRS)

    Taitt, Chris Rowe; Anderson, George P.; Lingerfelt, Brian M.; Feldstein, Mark J.; Ligler, Frances S.

    2002-01-01

    A fluorescence-based multianalyte immunosensor has been developed for simultaneous analysis of multiple samples. While the standard 6 x 6 format of the array sensor has been used to analyze six samples for six different analytes, this same format has the potential to allow a single sample to be tested for 36 different agents. The method described herein demonstrates proof of principle that the number of analytes detectable using a single array can be increased simply by using complementary mixtures of capture and tracer antibodies. Mixtures were optimized to allow detection of closely related analytes without significant cross-reactivity. Following this facile modification of patterning and assay procedures, the following nine targets could be detected in a single 3 x 3 array: Staphylococcal enterotoxin B, ricin, cholera toxin, Bacillus anthracis Sterne, Bacillus globigii, Francisella tularensis LVS, Yersinia pestis F1 antigen, MS2 coliphage, and Salmonella typhimurium. This work maximizes the efficiency and utility of the described array technology, increasing only reagent usage and cost; production and fabrication costs are not affected.

  6. Algal Biomass Analysis by Laser-Based Analytical Techniques—A Review

    PubMed Central

    Pořízka, Pavel; Prochazková, Petra; Prochazka, David; Sládková, Lucia; Novotný, Jan; Petrilak, Michal; Brada, Michal; Samek, Ota; Pilát, Zdeněk; Zemánek, Pavel; Adam, Vojtěch; Kizek, René; Novotný, Karel; Kaiser, Jozef

    2014-01-01

    Algal biomass that is represented mainly by commercially grown algal strains has recently found many potential applications in various fields of interest. Its utilization has been found advantageous in the fields of bioremediation, biofuel production and the food industry. This paper reviews recent developments in the analysis of algal biomass with the main focus on the Laser-Induced Breakdown Spectroscopy, Raman spectroscopy, and partly Laser-Ablation Inductively Coupled Plasma techniques. The advantages of the selected laser-based analytical techniques are revealed and their fields of use are discussed in detail. PMID:25251409

  7. Setting analytical performance specifications based on outcome studies - is it possible?

    PubMed

    Horvath, Andrea Rita; Bossuyt, Patrick M M; Sandberg, Sverre; John, Andrew St; Monaghan, Phillip J; Verhagen-Kamerbeek, Wilma D J; Lennartz, Lieselotte; Cobbaert, Christa M; Ebert, Christoph; Lord, Sarah J

    2015-05-01

    The 1st Strategic Conference of the European Federation of Clinical Chemistry and Laboratory Medicine proposed a simplified hierarchy for setting analytical performance specifications (APS). The top two levels of the 1999 Stockholm hierarchy, i.e., evaluation of the effect of analytical performance on clinical outcomes and on clinical decisions, have been proposed to be replaced by one outcome-based model. This model can be supported by: (1a) direct outcome studies; and (1b) indirect outcome studies investigating the impact of the analytical performance of the test on clinical classifications or decisions and thereby on the probability of patient-relevant clinical outcomes. This paper reviews the need for outcome-based specifications, the most relevant types of outcomes to be considered, and the challenges and limitations faced when setting outcome-based APS. The methods of Models 1a and 1b are discussed, and examples are provided of how outcome data can be translated into APS using linked evidence and simulation or decision-analytic techniques. Outcome-based APS should primarily reflect the clinical needs of patients; should be tailored to the purpose, role and significance of the test in a well-defined clinical pathway; and should be defined at a level that achieves net health benefit for patients at reasonable costs. Whilst it is acknowledged that direct evaluations are difficult and may not be possible for all measurands, all other forms of setting APS should be weighed against that standard, and regarded as approximations. Better definition of the relationship between the analytical performance of tests and health outcomes can be used to set analytical performance criteria that aim to improve the clinical and cost-effectiveness of laboratory tests.
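
    A minimal simulation in the spirit of the indirect (Model 1b) approach described above, assuming a hypothetical analyte distribution and decision threshold: it estimates how different levels of analytical imprecision change the fraction of patients whose classification flips. This is only a sketch of the idea, not the conference's methodology.

        import numpy as np

        rng = np.random.default_rng(0)
        true_values = rng.normal(5.5, 1.2, 100_000)    # hypothetical analyte values (mmol/L)
        threshold = 6.0                                # hypothetical clinical decision limit

        for cv in (0.01, 0.03, 0.05, 0.10):            # candidate analytical imprecision (CV)
            measured = true_values * (1.0 + cv * rng.normal(size=true_values.size))
            flipped = np.mean((measured >= threshold) != (true_values >= threshold))
            print(f"CV {cv:4.0%}: {flipped:.2%} of results change classification")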

  8. [Clinical Application of Analytical and Medical Instruments Mainly Using MS Techniques].

    PubMed

    Tanaka, Koichi

    2016-02-01

    Analytical instruments for clinical use are commonly required to confirm the compounds and forms related to diseases with the highest possible sensitivity, quantitative performance, and specificity, with minimal invasiveness, within a short time, easily, and at a low cost. Advances in mass spectrometry (MS) have led to techniques that meet such requirements. Besides confirming known substances, MS can also serve, in ways not yet fully known to the public, as a tool to discover unknown phenomena and compounds. An example is clarifying the mechanisms of human diseases. The human body has approximately 100 thousand types of protein, and there may be more than several million types of proteins and their metabolites. Most of them have yet to be discovered, and their discovery may give birth to new academic fields and lead to the clarification of diseases, the development of new medicines, etc. For example, using the MS system developed under "Contribution to drug discovery and diagnosis by next generation of advanced mass spectrometry system," one of the 30 projects of the "Funding Program for World-Leading Innovative R&D on Science and Technology" (FIRST program), together with other individual basic technologies, we succeeded in discovering new disease biomarker candidates for Alzheimer's disease, cancer, etc. Further contributions of MS to clinical medicine can be expected through the development and improvement of new techniques, efforts to verify discoveries, and communication with the medical front. PMID:27311284

  9. Disc-based microarrays: principles and analytical applications.

    PubMed

    Morais, Sergi; Puchades, Rosa; Maquieira, Ángel

    2016-07-01

    The idea of using disk drives to monitor molecular biorecognition events on regular optical discs has received considerable attention during the last decade. CDs, DVDs, Blu-ray discs and other new optical discs are universal and versatile supports with the potential for the development of protein and DNA microarrays. In addition, the standard disk drives incorporated in personal computers can be used as compact and affordable optical reading devices. Consequently, CD technology, originating in the audio-video industry, has been used to develop analytical applications in health care, environmental monitoring, food safety and quality assurance. The review presents and critically evaluates the current state of the art of disc-based microarrays with illustrative examples, including past, current and future developments. Special mention is made of the analytical developments that use either chemically activated or raw standard CDs on which proteins, oligonucleotides, peptides, haptens or other biological probes are immobilized. The discs are also used to perform the assays and must maintain their readability with standard optical drives. The concept and principle of evolving disc-based microarrays and the evolution of disk drives as optical detectors are also described. The review concludes with the most relevant uses, ordered chronologically to provide an overview of the progress of CD technology applications in the life sciences. It also provides a selection of important references to the current literature. Graphical Abstract: High-density disc-based microarrays. PMID:26922341

  10. Analytic calculation of physiological acid-base parameters in plasma.

    PubMed

    Wooten, E W

    1999-01-01

    Analytic expressions for plasma total titratable base, base excess (ΔCB), strong-ion difference, change in strong-ion difference (ΔSID), change in Van Slyke standard bicarbonate (ΔVSSB), anion gap, and change in anion gap are derived as a function of pH, total buffer ion concentration, and conditional molar equilibrium constants. The behavior of these various parameters under respiratory and metabolic acid-base disturbances for constant and variable buffer ion concentrations is considered. For constant noncarbonate buffer concentrations, ΔSID = ΔCB = ΔVSSB, whereas these equalities no longer hold under changes in noncarbonate buffer concentration. The equivalence is restored if the reference state is changed to include the new buffer concentrations.
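
    For orientation only, the sketch below computes two of the familiar quantities from standard textbook relations (Henderson-Hasselbalch bicarbonate and the conventional anion gap); these are not the paper's derived expressions, and the input values are hypothetical.

        def bicarbonate(pH, pCO2_mmHg, pK=6.1, s=0.0307):
            """Henderson-Hasselbalch: [HCO3-] in mmol/L from pH and pCO2 (mmHg)."""
            return s * pCO2_mmHg * 10 ** (pH - pK)

        def anion_gap(na, cl, hco3):
            """Conventional anion gap in mmol/L (potassium omitted)."""
            return na - (cl + hco3)

        hco3 = bicarbonate(pH=7.30, pCO2_mmHg=30.0)
        print(f"HCO3- = {hco3:.1f} mmol/L")
        print(f"anion gap = {anion_gap(140.0, 104.0, hco3):.1f} mmol/L")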

  11. Analysis of hydroponic fertilizer matrixes for perchlorate: comparison of analytical techniques.

    PubMed

    Collette, Timothy W; Williams, Ted L; Urbansky, Edward T; Magnuson, Matthew L; Hebert, Gretchen N; Strauss, Steven H

    2003-01-01

    Seven retail hydroponic nitrate fertilizer products, two liquid and five solid, were comparatively analyzed for the perchlorate anion (ClO4-) by ion chromatography (IC) with suppressed conductivity detection, complexation electrospray ionization mass spectrometry (cESI-MS), normal Raman spectroscopy, and infrared spectroscopy using an attenuated total reflectance crystal (ATR-FTIR) coated with a thin film of an organometallic ion-exchange compound. Three of the five solid products were found by all techniques to contain perchlorate at the level of approximately 100-350 mg kg-1. The remaining products did not contain perchlorate above the detection level of any of the techniques. Comparative analysis using several analytical techniques that depend on different properties of perchlorate allows for a high degree of certainty in both the qualitative and quantitative determinations. This proved particularly useful for these samples, due to the complexity of the matrix. Analyses of this type, including multiple spectroscopic confirmations, may also be useful for other complicated matrixes (e.g., biological samples) or in forensic/regulatory frameworks where data are likely to be challenged. While the source of perchlorate in these hydroponic products is not known, the perchlorate-to-nitrate concentration ratio (w/w) in the aqueous extracts is generally consistent with the historical weight percent of water soluble components in caliche, a nitrate-bearing ore found predominantly in Chile. This ore, which is the only well-established natural source of perchlorate, is mined and used, albeit minimally, as a nitrogen source in some fertilizer products.

  12. Employing socially driven techniques for framing, contextualization, and collaboration in complex analytical threads

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin

    2015-05-01

    The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and

  13. Analytical phase-tracking-based strain estimation for ultrasound elasticity.

    PubMed

    Yuan, Lili; Pedersen, Peder C

    2015-01-01

    A new strain estimator for quasi-static elastography is presented, based on tracking of the analytical signal phase as a function of the external force. Two implementations are introduced: zero-phase search with moving window (SMW) and zero-phase band tracking using connected component labeling (CCL). Low analytical signal amplitude caused by local destructive interference is associated with large errors in the phase trajectories, and amplitude thresholding can thus be used to terminate the phase tracking along a particular path. Interpolation is then applied to estimate displacement along the eliminated path. The paper first describes a mathematical analysis based on 1-D multi-scatterer modeling, followed by a statistical study of the displacement and strain error. Simulation and experiment with an inhomogeneous phantom indicate that SMW and CCL are capable of reliably estimating tissue displacement and strain over a larger range of deformation than standard time-domain cross-correlation (SCC). Results also show that SMW is roughly 40 times faster than SCC with comparable or even better accuracy. CCL is slower than SMW, but more robust to noise. Simulation assessment at compression levels of 3% and 6% with an SNR of 20 dB demonstrates an average strain error of 10% for SMW and CCL, whereas SCC achieves 18%. PMID:25585402
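
    The core idea of phase-based displacement estimation can be sketched as follows (a simplification, not the SMW or CCL implementations of the paper): the phase difference between the analytical signals of pre- and post-compression echoes, divided by 2π times the centre frequency, gives the sub-sample time delay. The signals and parameters below are synthetic.

        import numpy as np
        from scipy.signal import hilbert

        fs, f0 = 40e6, 5e6                               # sampling and centre frequency (Hz)
        t = np.arange(0, 4e-6, 1 / fs)
        rng = np.random.default_rng(0)
        env = np.convolve(rng.normal(size=t.size), np.ones(30) / 30, mode="same")  # band-limited scatterers

        pre = env * np.cos(2 * np.pi * f0 * t)
        true_delay = 12e-9                               # displacement-induced delay (s)
        post = env * np.cos(2 * np.pi * f0 * (t - true_delay))

        # phase difference of the analytical signals -> time delay
        phase = np.angle(hilbert(pre) * np.conj(hilbert(post)))
        delay_est = np.median(phase) / (2 * np.pi * f0)
        print(f"true delay {true_delay * 1e9:.1f} ns, estimated {delay_est * 1e9:.1f} ns")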

  14. Multiplexed Paper Analytical Device for Quantification of Metals using Distance-Based Detection

    PubMed Central

    Cate, David M.; Noblitt, Scott D.; Volckens, John; Henry, Charles S.

    2015-01-01

    Exposure to metal-containing aerosols has been linked with adverse health outcomes for almost every organ in the human body. Commercially available techniques for quantifying particulate metals are time-intensive, laborious, and expensive; sample analysis often exceeds $100. We report a simple technique, based upon a distance-based detection motif, for quantifying metal concentrations of Ni, Cu, and Fe in airborne particulate matter using microfluidic paper-based analytical devices. Paper substrates are used to create sensors that are self-contained, self-timing, and require only a drop of sample for operation. Unlike other colorimetric approaches in paper microfluidics that rely on optical instrumentation for analysis, with distance-based detection the analyte is quantified visually based on the distance of a colorimetric reaction, similar to reading temperature on a thermometer. To demonstrate the effectiveness of this approach, Ni, Cu, and Fe were measured individually in single-channel devices; detection limits as low as 0.1, 0.1, and 0.05 µg were obtained for Ni, Cu, and Fe, respectively. Multiplexed analysis of all three metals was achieved with detection limits of 1, 5, and 1 µg for Ni, Cu, and Fe. We also extended the dynamic range for multi-analyte detection by printing concentration gradients of colorimetric reagents using an off-the-shelf inkjet printer. Analyte selectivity was demonstrated for common interferences. To demonstrate the utility of the method, Ni, Cu, and Fe were measured in samples of certified welding fume; levels measured with the paper sensors matched known values determined gravimetrically. PMID:26009988
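
    A hedged sketch of how a distance-based readout can be turned into a mass estimate: a calibration curve (here assumed to be linear in the logarithm of the deposited mass) is fitted to hypothetical distance readings and then inverted for an unknown sample. The numbers are illustrative, not the paper's calibration data.

        import numpy as np

        mass_ug = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])   # hypothetical deposited mass (µg)
        dist_mm = np.array([3.1, 5.0, 7.2, 10.8, 13.9, 17.1])  # hypothetical colour-bar length (mm)

        # assumed calibration model: distance = a*log10(mass) + b
        a, b = np.polyfit(np.log10(mass_ug), dist_mm, 1)

        def mass_from_distance(d_mm):
            return 10 ** ((d_mm - b) / a)

        print(f"fit: distance = {a:.2f}*log10(mass) + {b:.2f}")
        print(f"9.0 mm colour bar -> about {mass_from_distance(9.0):.1f} µg")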

  15. Development of analytical techniques for ultra trace amounts of nuclear materials in environmental samples using ICP-MS for safeguards

    PubMed

    Magara; Hanzawa; Esaka; Miyamoto; Yasuda; Watanabe; Usuda; Nishimura; Adachi

    2000-07-01

    The authors have begun to develop analytical techniques for ultra-trace amounts of nuclear materials and to prepare a clean chemistry laboratory for environmental sample analyses. The analytical techniques include bulk and particle analyses. For the bulk analysis, concentrations and isotopic ratios of U and/or Pu are determined by inductively coupled plasma mass spectrometry (ICP-MS) and thermal ionization mass spectrometry (TIMS). In the particle analysis, isotopic ratios of U and/or Pu in each particle will be measured by secondary ion mass spectrometry (SIMS). This paper outlines the development of these analytical techniques and describes the current status of the bulk analysis using ICP-MS.

  16. Techniques for Enhancing Web-Based Education.

    ERIC Educational Resources Information Center

    Barbieri, Kathy; Mehringer, Susan

    The Virtual Workshop is a World Wide Web-based set of modules on high performance computing developed at the Cornell Theory Center (CTC) (New York). This approach reaches a large audience, leverages staff effort, and poses challenges for developing interesting presentation techniques. This paper describes the following techniques with their…

  17. Evaluation of a wind-tunnel gust response technique including correlations with analytical and flight test results

    NASA Technical Reports Server (NTRS)

    Redd, L. T.; Hanson, P. W.; Wynne, E. C.

    1979-01-01

    A wind tunnel technique for obtaining gust frequency response functions for use in predicting the response of flexible aircraft to atmospheric turbulence is evaluated. The tunnel test results for a dynamically scaled, cable-supported aeroelastic model are compared with analytical and flight data. The wind tunnel technique, which employs oscillating vanes in the tunnel throat section to generate a sinusoidally varying flow field around the model, was evaluated by use of a 1/30-scale model of the B-52E airplane. Correlations between the wind tunnel results, flight test results, and analytical predictions for the response in the short-period and wing first elastic modes of motion are presented.
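
    For readers unfamiliar with the underlying data reduction, the sketch below estimates a gust frequency response function from an input signal (standing in for the vane-generated gust) and a response signal via cross-spectral densities; the second-order "aircraft mode" and all numerical values are assumptions, not the B-52E model data.

        import numpy as np
        from scipy import signal

        fs = 200.0                                       # sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        rng = np.random.default_rng(0)
        gust = rng.normal(size=t.size)                   # broadband stand-in for the gust input

        # hypothetical short-period-like mode: 2nd-order system at 1.2 Hz, damping ratio 0.3
        wn, zeta = 2 * np.pi * 1.2, 0.3
        sys = signal.TransferFunction([wn ** 2], [1.0, 2 * zeta * wn, wn ** 2])
        _, response, _ = signal.lsim(sys, gust, t)
        response += 0.1 * rng.normal(size=t.size)        # measurement noise

        f, Pxy = signal.csd(gust, response, fs=fs, nperseg=1024)
        _, Pxx = signal.welch(gust, fs=fs, nperseg=1024)
        H = Pxy / Pxx                                    # estimated gust frequency response function
        print(f"|H| peaks near {f[np.argmax(np.abs(H))]:.2f} Hz (resonance of the assumed mode)")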

  18. Recent developments in computer vision-based analytical chemistry: A tutorial review.

    PubMed

    Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J

    2015-10-29

    Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to its several significant advantages, such as simplicity of use, and the fact that it is easily combined with portable and widely distributed imaging devices, resulting in user-friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical chemistry (CVAC) procedures and systems from 2005 to 2015, a period of time in which 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed.
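
    To make the basic colour-based workflow concrete, here is a hedged sketch (not taken from the review): the mean intensity of one colour channel of a sensing spot is converted to an absorbance-like signal and calibrated linearly against concentration. The channel choice, intensities, and concentrations are invented for illustration.

        import numpy as np

        concentration = np.array([0.0, 0.5, 1.0, 2.0, 4.0])    # hypothetical standards (mmol/L)
        green = np.array([0.89, 0.81, 0.74, 0.62, 0.43])       # mean green-channel intensity (0-1)

        signal_cal = -np.log10(green)                           # absorbance-like colour signal
        m, c = np.polyfit(concentration, signal_cal, 1)         # linear calibration

        unknown = -np.log10(0.68)                               # green value of an unknown spot
        print(f"estimated concentration: {(unknown - c) / m:.2f} mmol/L")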

  19. Chemical and mineralogical characterizations of LD converter steel slags: A multi-analytical techniques approach

    SciTech Connect

    Waligora, J.; Bulteel, D.; Degrugilliers, P.; Damidot, D.; Potdevin, J.L.; Measson, M.

    2010-01-15

    The use of LD converter steel slags (from the Linz-Donawitz steelmaking process) as aggregates in road construction can in certain cases lead to dimensional damage due to a macroscopic swelling that is the consequence of chemical reactions. The aim of this study was to couple several analytical techniques in order to carefully undertake chemical and mineralogical characterizations of LD steel slags and identify the phases that are expected to be responsible for their instability. Optical microscopy, scanning electron microscopy and electron probe microanalyses revealed that LD steel slags mainly contain calcium silicates, dicalcium ferrites, iron oxides and lime. However, as a calcium silicate phase is heterogeneous, Raman microspectrometry and transmission electron microscopy had to be used to characterize it more precisely. Results showed that lime is present in two forms in slag grains: nodules observed in the matrix, whose size ranges from 20 to 100 µm, and micro-inclusions enclosed in the heterogeneous calcium silicate phase, whose size ranges from 1 to 3 µm. It was also established that, in the absence of magnesia, lime is expected to be the only phase responsible for LD steel slag instability. Nevertheless, the distribution of lime between nodules and micro-inclusions may play a major role and could explain why similar amounts of lime can induce different instabilities. Thus, it appears that the lime content of LD steel slags is not the only parameter that explains their instability.

  20. An analytical study of reduced-gravity liquid reorientation using a simplified marker and cell technique

    NASA Technical Reports Server (NTRS)

    Betts, W. S., Jr.

    1972-01-01

    A computer program called HOPI was developed to predict reorientation flow dynamics, wherein liquids move from one end of a closed, partially filled, rigid container to the other end under the influence of container acceleration. The program uses the simplified marker and cell numerical technique and, using explicit finite-differencing, solves the Navier-Stokes equations for an incompressible viscous fluid. The effects of turbulence are also simulated in the program. HOPI can consider curved as well as straight-walled boundaries. Both free-surface and confined flows can be calculated. The program was used to simulate five liquid reorientation cases. Three of these cases simulated actual NASA LeRC drop tower test conditions, while two cases simulated full-scale Centaur tank conditions. It was concluded that while HOPI can be used to analytically determine the fluid motion in a typical settling problem, there is a current need to optimize HOPI, both by reducing the computer usage time and by reducing the core storage required for a given problem size.

  1. Discrimination between biologically relevant calcium phosphate phases by surface-analytical techniques

    NASA Astrophysics Data System (ADS)

    Kleine-Boymann, Matthias; Rohnke, Marcus; Henss, Anja; Peppler, Klaus; Sann, Joachim; Janek, Juergen

    2014-08-01

    The spatially resolved phase identification of biologically relevant calcium phosphate phases (CPPs) in bone tissue is essential for the elucidation of bone remodeling mechanisms and for the diagnosis of bone diseases. Analytical methods with high spatial resolution for discriminating between such chemically similar phases are rare. Therefore, the applicability of state-of-the-art ToF-SIMS, XPS and EDX as chemically specific techniques was investigated. The eight CPPs hydroxyapatite (HAP), β-tricalcium phosphate (β-TCP), α-tricalcium phosphate (α-TCP), octacalcium phosphate (OCP), dicalcium phosphate dihydrate (DCPD), dicalcium phosphate (DCP), monocalcium phosphate (MCP) and amorphous calcium phosphate (ACP) were either high-purity commercial materials or synthesized in-house. The phase purity was proven by XRD analysis. All eight CPPs show different mass spectra, and the phases can be discriminated by applying principal component analysis to the mass spectrometric data. The Ca/P ratios of all phosphates were determined by XPS and EDX. With both methods some CPPs can be distinguished, but the obtained Ca/P ratios deviate systematically from their theoretical values. In any case, it is necessary to determine a calibration curve, or the corresponding ZAF values, from appropriate standards. In XPS, the O(1s) satellite signals are also correlated with the CPP composition. Angle-resolved and long-term XPS measurements of HAP clearly prove that there is no phosphate excess at the surface. Decomposition due to X-ray irradiation has not been observed.
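
    A minimal sketch of the discrimination step described above, with synthetic spectra rather than real ToF-SIMS data: spectra generated from three hypothetical phase-specific templates are normalised and projected onto principal components, where they separate by phase.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        mz = np.arange(50, 300)

        # three hypothetical phases, each with its own fragment-intensity template
        templates = {p: rng.random(mz.size) for p in ("HAP", "beta-TCP", "OCP")}
        spectra, labels = [], []
        for phase, template in templates.items():
            for _ in range(10):                         # ten noisy replicate spectra per phase
                spectra.append(template + 0.05 * rng.normal(size=mz.size))
                labels.append(phase)

        X = np.array(spectra)
        X = X / X.sum(axis=1, keepdims=True)            # normalise each spectrum to its total ion count
        scores = PCA(n_components=2).fit_transform(X)
        for phase in templates:
            sel = [i for i, lab in enumerate(labels) if lab == phase]
            print(phase, "mean PC1 score:", round(float(scores[sel, 0].mean()), 3))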

  2. Analytical modeling of glucose biosensors based on carbon nanotubes

    PubMed Central

    2014-01-01

    In recent years, carbon nanotubes have received widespread attention as promising carbon-based nanoelectronic devices. Due to their exceptional physical, chemical, and electrical properties, namely a high surface-to-volume ratio, their enhanced electron transfer properties, and their high thermal conductivity, carbon nanotubes can be used effectively as electrochemical sensors. The integration of carbon nanotubes with a functional group provides a good and solid support for the immobilization of enzymes. The determination of glucose levels using biosensors, particularly in the medical diagnostics and food industries, is gaining mass appeal. Glucose biosensors detect the glucose molecule by catalyzing glucose to gluconic acid and hydrogen peroxide in the presence of oxygen. This action provides high accuracy and a quick detection rate. In this paper, a single-wall carbon nanotube field-effect transistor biosensor for glucose detection is analytically modeled. In the proposed model, the glucose concentration is presented as a function of gate voltage. Subsequently, the proposed model is compared with existing experimental data. A good consensus between the model and the experimental data is reported. The simulated data demonstrate that the analytical model can be employed with an electrochemical glucose sensor to predict the behavior of the sensing mechanism in biosensors. PMID:24428818

  3. Evaluation of Various Hybrid Analytical Techniques for Resolving Arsenic Species in Soils and Natural Waters from Korea

    NASA Astrophysics Data System (ADS)

    Kim, Y.; Yoon, H.; Shin, M.; Yoon, C.; Woo, N.

    2005-12-01

    Arsenic detection in terms of its different chemical species is nowadays becoming an environmental concern. True separation of the different arsenic species in many environmental samples remains an analytical challenge. Achieving correct analytical results and reaching the lowest possible detection limit for each species is also desirable. However, arsenic species can be transformed rapidly by slight changes in redox conditions during experimental manipulation. The USEPA has documented standard analytical techniques for arsenic in many sample types; however, these previously documented techniques concern only the total arsenic concentration. Many analytical techniques for analyzing arsenic species have been suggested, and hybrid analytical techniques are regarded as essential for resolving arsenic species. In this study, various hybrid techniques using ICP were tested to separate arsenic species in different types of samples. An anion exchange cartridge (Accell Plus QMA, Waters) or column (PRP X-100, Hamilton) was used for separation and was coupled to (HG)-ICP-AES or (HG)-ICP-MS for detection. The hydride generation (HG) method is used to improve detection limits. In addition, it can prevent interference from ArCl at m/z 75, which is a significant problem when ICP-MS is applied to arsenic analysis. The anion exchange cartridge is only applicable to the separation of inorganic arsenic species, typically in acid mine drainage and groundwater samples in which arsenic exists mainly in inorganic form. The LC-(HG)-ICP-MS technique is well suited to natural surface water samples, which contain a small fraction of various arsenic species. Leachates from soil samples should be treated carefully because their high content of transition metals causes interferences. If there is a probability of leaching a considerable amount of organic matter and transition metals during soil extraction, the HG technique with a purification procedure and the use of a masking agent must be applied to

  4. Markov-CA model using analytical hierarchy process and multiregression technique

    NASA Astrophysics Data System (ADS)

    Omar, N. Q.; Sanusi, S. A. M.; Hussin, W. M. W.; Samat, N.; Mohammed, K. S.

    2014-06-01

    The unprecedented increase in population and the rapid rate of urbanisation have led to extensive land use changes. Cellular automata (CA) are increasingly used to simulate a variety of urban dynamics. This paper introduces a new CA based on an integration model built on multiple regression and multi-criteria evaluation to improve the representation of the CA transition rule. The multi-criteria evaluation is implemented by utilising data relating to the environmental and socioeconomic factors in the study area in order to produce suitability maps (SMs) using the analytical hierarchy process, a well-known method, before being integrated to generate suitability maps for the period from 1984 to 2010 based on different decision-making scenarios, which then condition the next step of CA generation. The suitability maps are compared in order to find the best maps based on their R2 values. This comparison can help stakeholders make better decisions. The resulting suitability map provides a predefined transition rule for the last step of the CA model. The approach used in this study highlights a mechanism for monitoring and evaluating land-use and land-cover changes in Kirkuk city, Iraq, owing to changes in the structures of governments, wars, and an economic blockade over the past decades. The present study asserts the high applicability and flexibility of the Markov-CA model. The results have shown that the model and its interrelated concepts perform rather well.
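
    The Markov half of a Markov-CA model can be illustrated with a small sketch (hypothetical land-use classes and transition probabilities, not the calibrated matrix of this study): a transition-probability matrix projects class proportions forward one period at a time.

        import numpy as np

        classes = ["urban", "agriculture", "barren"]            # illustrative classes
        # hypothetical transition probabilities P[i, j] = P(class i -> class j) per period
        P = np.array([[0.95, 0.03, 0.02],
                      [0.10, 0.85, 0.05],
                      [0.15, 0.10, 0.75]])

        state = np.array([0.20, 0.50, 0.30])                    # current area proportions
        for step in range(1, 4):
            state = state @ P                                   # one Markov projection step
            print(f"period {step}:", dict(zip(classes, np.round(state, 3))))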

  5. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; Padmaperuma, Asanga B.; Connatser, Raynella M.; Stankovikj, Filip; Meier, Dietrich; Paasikallio, Ville

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for the determination of hydroxyl groups, and a quantitative gas chromatography-mass spectrometry (GC-MS) method. Both the carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  6. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique

    PubMed Central

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Background: Military hospitals are responsible for preserving, restoring and improving the health of not only the armed forces, but also other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, to compare these hospitals with each other, and to rank them using the analytic hierarchy process (AHP) technique in 2013. Materials and Methods: This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritization, and Expert Choice 11.0 was used to analyze the collected data. Results: Among the JCI standards, access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) were the most important. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (weight = 0.238), SAU (weight = 0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. Conclusion: AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospital performance by identifying areas that need attention for quality improvement and by selecting strategies to improve service quality. PMID:25250364
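
    A minimal sketch of the AHP calculation underlying this kind of ranking is given below; the pairwise comparison matrix is hypothetical, and the principal-eigenvector method with Saaty's random indices is one standard way to obtain the weights (the study itself used Expert Choice):

      import numpy as np

      def ahp_priorities(pairwise):
          """Priority weights and consistency ratio for an AHP pairwise comparison
          matrix (principal-eigenvector method, Saaty's random indices)."""
          A = np.asarray(pairwise, dtype=float)
          n = A.shape[0]
          eigvals, eigvecs = np.linalg.eig(A)
          k = np.argmax(eigvals.real)
          w = np.abs(eigvecs[:, k].real)
          w /= w.sum()                          # normalised priority vector
          lam_max = eigvals[k].real
          ci = (lam_max - n) / (n - 1)          # consistency index
          ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]
          cr = ci / ri if ri else 0.0           # consistency ratio (< 0.1 acceptable)
          return w, cr

      # Hypothetical 3x3 comparison of three standard groups (illustrative only).
      A = [[1,   2,   3],
           [1/2, 1,   2],
           [1/3, 1/2, 1]]
      weights, cr = ahp_priorities(A)
      print("weights:", np.round(weights, 3), "CR:", round(cr, 3))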

  7. A Decision Analytic Approach to Exposure-Based Chemical Prioritization

    PubMed Central

    Mitchell, Jade; Pabon, Nicolas; Collier, Zachary A.; Egeghy, Peter P.; Cohen-Hubal, Elaine; Linkov, Igor; Vallero, Daniel A.

    2013-01-01

    The manufacture of novel synthetic chemicals has increased in volume and variety, but often the environmental and health risks are not fully understood in terms of toxicity and, in particular, exposure. While efforts to assess risks have generally been effective when sufficient data are available, the hazard and exposure data necessary to assess risks adequately are unavailable for the vast majority of chemicals in commerce. The US Environmental Protection Agency has initiated the ExpoCast Program to develop tools for rapid chemical evaluation based on potential for exposure. In this context, a model is presented in which chemicals are evaluated based on inherent chemical properties and behaviorally-based usage characteristics over the chemical’s life cycle. These criteria are assessed and integrated within a decision analytic framework, facilitating rapid assessment and prioritization for future targeted testing and systems modeling. A case study outlines the prioritization process using 51 chemicals. The results show a preliminary relative ranking of chemicals based on exposure potential. The strength of this approach is the ability to integrate relevant statistical and mechanistic data with expert judgment, allowing for an initial tier assessment that can further inform targeted testing and risk management strategies. PMID:23940664
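
    A minimal sketch of a weighted multi-criteria prioritization of this kind is shown below; the criteria, scores and weights are invented for illustration, and the simple additive aggregation is an assumption, not necessarily the decision analytic model used in the paper:

      import numpy as np

      # Hypothetical exposure-relevant criteria for three chemicals, each scored
      # 0-1 (higher = greater exposure potential). Names, scores and weights are
      # illustrative, not taken from the ExpoCast case study.
      chemicals = ["chem_A", "chem_B", "chem_C"]
      scores = np.array([
          # production volume, use in consumer products, environmental persistence
          [0.8, 0.3, 0.6],
          [0.4, 0.9, 0.2],
          [0.6, 0.5, 0.7],
      ])
      weights = np.array([0.5, 0.3, 0.2])      # elicited from expert judgment

      # Weighted-sum (simple additive) aggregation and ranking.
      priority = scores @ weights
      for name, p in sorted(zip(chemicals, priority), key=lambda x: -x[1]):
          print(f"{name}: exposure priority score = {p:.2f}")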

  8. Comparison of copper speciation in estuarine water measured using analytical voltammetry and supported liquid membrane techniques.

    PubMed

    Ndungu, Kuria; Hurst, Matthew P; Bruland, Kenneth W

    2005-05-01

    The supported liquid membrane (SLM) is a promising separation and preconcentration technique that is well-suited for trace metal speciation in natural waters. The technique is based on the selective complexation of metal ions by a hydrophobic ligand (carrier) dissolved in a water-immiscible organic solvent immobilized in a porous, inert membrane. This membrane separates two aqueous solutions: the test (or donor) solution and the strip (or acceptor) solution. The metal carrier complex is transported by diffusion across the membrane from the source to the strip solution where metal ions are back-extracted. The technique offers great potential to tune the selectivity by incorporating different complexing ligands in the membrane. An SLM was used to analyze the dissolved (<0.45 microm) copper speciation from two sites in the San Francisco Bay estuary: Dumbarton Bridge, [Cu]total approximately 27 nM, and San Bruno Shoals, [Cu]total approximately 23 nM. The sites were also characterized independently by differential pulse anodic stripping voltammetry (DPASV) using a Nafion-coated thin mercury film electrode (NCTMFE). The SLM employed 10 mM lasalocid, a naturally occurring carboxylic polyether ionophore, in nitrophenyl octyl ether (NPOE) as the membrane complexing ligand, supported by a microporous, polypropylene, hydrophobic membrane. This is the first study in which the SLM technique has been compared with an independent speciation technique in marine waters. Results of copper speciation measurements from Dumbarton Bridge, a site in South San Francisco Bay where copper speciation has been well-characterized in previous studies using various voltammetric techniques, indicated that only about 3% (0.9 nM) of the total dissolved copper was SLM labile. The corresponding DPASV labile copper fraction was <0.4% (<0.1 nM) of total dissolved copper. The concentration of total copper binding ligands measured by the membrane technique was 471 nM as compared to 354 nM measured by DPASV, more

  9. An overview of analytical techniques and methods for the study and preservation of artistic and archaeological bronzes.

    NASA Astrophysics Data System (ADS)

    Mazzeo, Rocco; Prati, Silvia; Quaranta, Marta; Sciutto, Giorgia

    The present review gives an overview of the type of information that can be gathered from the application of different non-invasive and micro-destructive analytical techniques. Typically, methods that require the withdrawal of a sample, such as metallography, SEM-EDS, AAS, FTIR and Py-GC-MS, are employed. Through their use, it is possible to identify the material constitution and to evaluate the degradation behavior and the state of conservation of excavated bronze artefacts. It is also underlined how a non-invasive approach might be used whenever no sampling is allowed, though some limitations should be considered. Furthermore, analytical techniques play an important role in characterising and evaluating the effectiveness of protective coatings and corrosion inhibitors before and after restoration procedures. An interesting aspect is the implication of science in the recognition of forgeries, when analytical studies provide evidence that can prove or disprove an object's authenticity.

  10. Geochemical applications of the tandem LA-ICP-MS/LIBS analytical technique

    NASA Astrophysics Data System (ADS)

    Guitreau, M.; Gonzalez, J. J.; Mukasa, S. B.; Colucci, M. T.

    2013-12-01

    Improvements in laser ablation for material sampling over the past few decades have led to the emergence of several applications of this in-situ technique to important geochemical measurements. The technique is commonly used for both elemental [1] and isotopic analyses [2], and has multiple advantages compared to dissolution techniques, notably higher spatial resolution, easier and faster sample preparation, and, for many applications, effectively non-destructive analysis. A significant advantage of this technique in geochemistry is full characterization of a sample (e.g., glass or mineral) using a single spot of limited size (i.e., 20-80 μm) to eliminate or minimize complexities due to potential chemical zonation. Major advances are being realized in the analysis of volcanic glasses for their elemental and volatile concentrations, as well as of zircon elemental and U-Pb isotopic compositions, using a new approach that combines the capabilities of the two most common laser ablation modalities: LA-ICP-MS/LIBS, which stands for Laser Ablation Inductively Coupled Plasma Mass Spectrometry/Laser Induced Breakdown Spectroscopy. LIBS is based on direct measurement of the optical emission originating from the laser-induced plasma [3], whereas LA-ICP-MS involves transport and excitation of the ablated aerosol in a secondary source (ICP) before it enters a mass spectrometer [4]. The two techniques complement each other well, as every laser pulse used for ablation provides both the optical plasma for emission spectroscopy and particles for ICP mass spectrometry. We will present data demonstrating that rare-earth element (REE) concentrations can be determined using LIBS in both zircon and volcanic glasses. In addition, we have promising, provisional hydrogen concentration data measured concurrently with the REE in volcanic glasses, which is not possible using only LA-ICP-MS.

  11. Rapid detection of transition metals in welding fumes using paper-based analytical devices.

    PubMed

    Cate, David M; Nanthasurasak, Pavisara; Riwkulkajorn, Pornpak; L'Orange, Christian; Henry, Charles S; Volckens, John

    2014-05-01

    Metals in particulate matter (PM) are considered a driving factor for many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed due to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content in welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed to several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable yet inexpensive alternative to traditional analytic methods for transition metals in welding fume PM. These sensors have the potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments. PMID:24515892

  13. Rapid Detection of Transition Metals in Welding Fumes Using Paper-Based Analytical Devices

    PubMed Central

    Volckens, John

    2014-01-01

    Metals in particulate matter (PM) are considered a driving factor for many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed due to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content in welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed to several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable yet inexpensive alternative to traditional analytic methods for transition metals in welding fume PM. These sensors have the potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments. PMID:24515892

  14. A physically based analytical model of flood frequency curves

    NASA Astrophysics Data System (ADS)

    Basso, S.; Schirmer, M.; Botter, G.

    2016-09-01

    Predicting magnitude and frequency of floods is a key issue in hydrology, with implications in many fields ranging from river science and geomorphology to the insurance industry. In this paper, a novel physically based approach is proposed to estimate the recurrence intervals of seasonal flow maxima. The method links the extremal distribution of streamflows to the stochastic dynamics of daily discharge, providing an analytical expression of the seasonal flood frequency curve. The parameters involved in the formulation embody climate and landscape attributes of the contributing catchment and can be estimated from daily rainfall and streamflow data. Only one parameter, which is linked to the antecedent wetness condition in the watershed, needs to be calibrated on the observed maxima. The performance of the method is discussed through a set of applications in four rivers featuring heterogeneous daily flow regimes. The model provides reliable estimates of seasonal maximum flows in different climatic settings and is able to capture diverse shapes of flood frequency curves emerging in erratic and persistent flow regimes. The proposed method exploits experimental information on the full range of discharges experienced by rivers. As a consequence, model performances do not deteriorate when the magnitude of events with return times longer than the available sample size is estimated. The approach provides a framework for the prediction of floods based on short data series of rainfall and daily streamflows that may be especially valuable in data scarce regions of the world.
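
    The paper derives an analytical flood frequency curve; as a simpler point of comparison, the sketch below builds an empirical curve from plotting positions (the discharge values are invented, and the Weibull plotting position is an assumed, conventional choice, not the authors' physically based model):

      import numpy as np

      # Illustrative seasonal maximum discharges (m^3/s) for 10 years; values are
      # hypothetical, not from the paper.
      annual_maxima = np.array([120., 95., 210., 160., 87., 300., 140., 175., 110., 250.])

      # Empirical flood frequency curve from Weibull plotting positions:
      # exceedance probability P = rank / (n + 1), return period T = 1 / P.
      sorted_q = np.sort(annual_maxima)[::-1]          # largest first
      ranks = np.arange(1, len(sorted_q) + 1)
      exceed_prob = ranks / (len(sorted_q) + 1)
      return_period = 1.0 / exceed_prob

      for q, t in zip(sorted_q, return_period):
          print(f"Q = {q:6.1f} m^3/s  ->  T = {t:5.2f} yr")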

  15. Analytical investigation of bilayer lipid biosensor based on graphene.

    PubMed

    Akbari, Elnaz; Buntat, Zolkafle; Shahraki, Elmira; Parvaz, Ramtin; Kiani, Mohammad Javad

    2016-01-01

    Graphene is an allotrope of carbon with a two-dimensional monolayer honeycomb structure. Owing to its special electrical, physical and optical characteristics, graphene is considered a more suitable candidate than other materials for sensor applications. It is also possible to use an electrolyte-gated graphene field-effect transistor (GFET) as a biosensor to identify alterations in charged lipid membrane properties. The current article aims to show how the thickness and electric charge of a membrane affect a monolayer graphene-based GFET, with emphasis on the conductance variation. It is proposed that the thickness and electric charge of the lipid bilayer (LLP and QLP) are functions of carrier density, and suitable control parameters are introduced to find the equations relating them. An artificial neural network algorithm as well as support vector regression have also been incorporated to obtain further models of the conductance characteristic. Comparison of the analytical models, the artificial neural network and the support vector regression with experimental data extracted from previous work shows acceptable agreement.

  16. Two Analyte Calibration From The Transient Response Of Potentiometric Sensors Employed With The SIA Technique

    SciTech Connect

    Cartas, Raul; Mimendia, Aitor; Valle, Manel del; Legin, Andrey

    2009-05-23

    Calibration models for multi-analyte electronic tongues have commonly been built using a set of sensors, at least one per analyte under study. Complex signals recorded with these systems are formed by the sensors' responses to the analytes of interest plus interferents, from which a multivariate response model is then developed. This work describes a data treatment method for the simultaneous quantification of two species in solution employing the signal from a single sensor. The approach takes advantage of the complex information recorded in one electrode's transient after insertion of the sample to build calibration models for both analytes. The raw signal from the electrode was first processed by a discrete wavelet transform to extract useful information and reduce its length, and then by artificial neural networks to fit a model. Two different potentiometric sensors were used as case studies to corroborate the effectiveness of the approach.
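
    A minimal sketch of the wavelet-compression-plus-neural-network idea is given below, using a synthetic transient as a stand-in for the recorded SIA signal; the wavelet family, decomposition level, network size and the form of the synthetic signal are all assumptions:

      import numpy as np
      import pywt
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(1)

      def synthetic_transient(c1, c2, n=256):
          """Toy potentiometric transient whose shape depends on two concentrations
          (purely synthetic stand-in for a recorded SIA signal)."""
          t = np.linspace(0, 1, n)
          signal = c1 * np.exp(-5 * t) + c2 * (1 - np.exp(-12 * t))
          return signal + rng.normal(0, 0.01, n)

      def dwt_features(signal, wavelet="db4", level=4):
          """Compress the transient with a discrete wavelet transform and keep the
          approximation coefficients as features."""
          coeffs = pywt.wavedec(signal, wavelet, level=level)
          return coeffs[0]

      # Small calibration set: random concentration pairs and their transients.
      concs = rng.uniform(0.1, 1.0, size=(200, 2))
      X = np.array([dwt_features(synthetic_transient(c1, c2)) for c1, c2 in concs])

      model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
      model.fit(X, concs)                    # two outputs: both analytes at once

      test = dwt_features(synthetic_transient(0.4, 0.7))
      print("predicted concentrations:", np.round(model.predict(test[None, :])[0], 2))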

  17. In Situ Analytical Strategy for Mars Combining X-Ray and Optical Techniques

    NASA Astrophysics Data System (ADS)

    Marshall, J.; Martin, J. P.; Mason, L. W.; Williamson, D. L.

    2004-03-01

    The “MICA” instrument combines XRD, XRF, and optical analytical methods for in situ analysis of Martian rocks. Optical analysis is critical in rock identification since neither XRD mineralogy nor XRF chemistry can be guaranteed to define lithology.

  18. Empirical technique to measure x-ray production and detection efficiencies in the analytical electron microscope

    SciTech Connect

    King, W.E.

    1985-01-01

    In the present work, a technique is proposed to experimentally measure the effective x-ray production and detection efficiency in pure element standards. This technique supplements and in some cases is preferable to the multi-element standard technique. Measurements of effective x-ray production and detection efficiencies are expected to be preferable to the standardless technique in cases where pure element samples can be prepared since the most uncertain parameters in the standardless technique are measured in the proposed technique.

  19. Electrochemiluminescence detection in microfluidic cloth-based analytical devices.

    PubMed

    Guan, Wenrong; Liu, Min; Zhang, Chunsun

    2016-01-15

    This work describes the first approach to combining microfluidic cloth-based analytical devices (μCADs) with electrochemiluminescence (ECL) detection. Wax screen-printing is employed to make cloth-based microfluidic chambers, which are patterned with carbon screen-printed electrodes (SPEs) to create truly disposable, simple, inexpensive sensors that can be read with a low-cost, portable charge-coupled device (CCD) imaging system. The two most commonly used ECL systems, tris(2,2'-bipyridyl)ruthenium(II)/tri-n-propylamine (Ru(bpy)3(2+)/TPA) and 3-aminophthalhydrazide/hydrogen peroxide (luminol/H2O2), are applied to demonstrate the quantitative ability of the ECL μCADs. In this study, the proposed devices successfully determined TPA over a linear range from 2.5 to 2500 μM with a detection limit of 1.265 μM. In addition, H2O2 can be detected in the linear range of 0.05-2.0 mM, with a detection limit of 0.027 mM. It has been shown that the ECL emission on the wax-patterned cloth device has acceptable sensitivity, stability and reproducibility. Finally, the applicability of cloth-based ECL is demonstrated for the determination of glucose in phosphate buffer solution (PBS) and artificial urine (AU) samples, with detection limits of 0.032 mM and 0.038 mM, respectively. It can be foreseen, therefore, that μCADs with ECL detection could provide a new sensing platform for point-of-care testing, public health, food safety detection and environmental monitoring in remote regions and in developing or developed countries. PMID:26319168
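
    As an illustration of how such a linear range and detection limit can be derived from calibration data, the sketch below fits a least-squares line and applies the common 3.3·s/slope detection-limit estimate; the intensity values are invented for illustration:

      import numpy as np

      # Hypothetical ECL calibration data: standard concentrations (uM) and
      # background-corrected ECL intensities (a.u.); values are illustrative only.
      conc = np.array([2.5, 10, 50, 100, 500, 1000, 2500], dtype=float)
      intensity = np.array([14, 52, 240, 470, 2400, 4800, 11900], dtype=float)

      # Least-squares calibration line: intensity = slope * conc + intercept.
      slope, intercept = np.polyfit(conc, intensity, 1)
      residuals = intensity - (slope * conc + intercept)
      s_y = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))   # std error of the fit

      # Common detection-limit estimate: LOD = 3.3 * s / slope.
      lod = 3.3 * s_y / slope
      print(f"slope = {slope:.2f} a.u./uM, intercept = {intercept:.1f} a.u.")
      print(f"estimated LOD = {lod:.2f} uM")

      # Quantify an unknown from its measured intensity.
      unknown_intensity = 820.0
      print(f"unknown = {(unknown_intensity - intercept) / slope:.1f} uM")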

  20. A smog chamber comparison of a microfluidic derivatisation measurement of gas-phase glyoxal and methylglyoxal with other analytical techniques

    NASA Astrophysics Data System (ADS)

    Pang, Xiaobing; Lewis, Alastair; Rickard, Andrew R.; Baeza-Romero, Maria Teresa; Adams, Thomas J.; Ball, Stephen M.; Goodall, Iain C. A.; Monks, Paul S.; Peppe, Salvatore; Ródenas García, Milagros; Sánchez, Pilar; Muñoz, Amalia

    2014-05-01

    A microfluidic lab-on-a-chip derivatisation technique has been developed to measure part-per-billion (ppbV) mixing ratios of gaseous glyoxal (GLY) and methylglyoxal (MGLY), and the method is compared with other techniques in a smog chamber experiment. The method uses o-(2,3,4,5,6-pentafluorobenzyl) hydroxylamine (PFBHA) as a derivatisation reagent and a microfabricated planar glass micro-reactor comprising an inlet, gas and fluid splitting and combining channels, mixing junctions, and a heated capillary reaction microchannel. The enhanced phase contact area-to-volume ratio and the high heat transfer rate in the micro-reactor result in a fast and highly efficient derivatisation reaction, generating an effluent stream ready for direct introduction to a gas chromatograph-mass spectrometer (GC-MS). A linear response for GLY was observed over a calibration range 0.7 to 400 ppbV, and for MGLY of 1.2 to 300 ppbV, when derivatised under optimal reaction conditions. The analytical performance shows good accuracy (6.6 % for GLY and 7.5 % for MGLY), suitable precision (< 12.0 %) and method detection limits (MDLs) (75 pptV for GLY and 185 pptV for MGLY) with a time resolution of 30 minutes. These MDLs are below or close to typical concentrations of these compounds observed in ambient air. The microfluidic derivatisation technique would be appropriate for ambient α-dicarbonyl measurements in a range of field environments based on its performance in a large-scale outdoor atmospheric simulation chamber (EUPHORE). The feasibility of the technique was assessed by applying the methodology to quantify α-dicarbonyls formed during the photo-oxidation of isoprene in the EUPHORE chamber. Good correlations were found between the microfluidic measurements and Fourier Transform InfraRed spectroscopy (FTIR), with a correlation coefficient (r2) of 0.84, Broad Band Cavity Enhanced Absorption Spectroscopy (BBCEAS) (r2 = 0.75), solid phase micro extraction (SPME) (r2 = 0.89), and a

  1. Fault detection and isolation of PEM fuel cell system based on nonlinear analytical redundancy. An application via parity space approach

    NASA Astrophysics Data System (ADS)

    Aitouche, A.; Yang, Q.; Ould Bouamama, B.

    2011-05-01

    This paper presents a procedure dealing with fault detection and isolation (FDI) using a nonlinear analytical redundancy (NLAR) technique applied to a proton exchange membrane (PEM) fuel cell system, based on its mathematical model. The model is proposed and simplified into a fifth-order state-space representation. The transient phenomena captured in the model include the compressor dynamics, the flow characteristics, mass and energy conservation, and manifold fluidic mechanics. Nonlinear analytical residuals are generated, based on the elimination of the unknown variables of the system by an extended parity space approach, to detect and isolate actuator and sensor faults. Finally, numerical simulation results are given corresponding to a fault signature matrix.
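
    A highly simplified residual check in the spirit of analytical redundancy is sketched below; the sensor names, model predictions and thresholds are hypothetical, and the full nonlinear parity-space formulation of the paper is not reproduced:

      import numpy as np

      def residuals(measured, predicted, thresholds):
          """Analytical-redundancy style residuals: compare each sensor reading with
          its model prediction and flag those whose deviation exceeds a threshold."""
          r = np.abs(np.asarray(measured) - np.asarray(predicted))
          return r, r > np.asarray(thresholds)

      # Hypothetical PEM fuel-cell sensor set (pressure in bar, flow in g/s,
      # voltage in V); numbers and thresholds are illustrative only.
      sensors    = ["supply_pressure", "air_flow", "stack_voltage"]
      measured   = [1.62, 0.085, 52.0]
      predicted  = [1.60, 0.080, 58.5]     # from the (assumed) state-space model
      thresholds = [0.05, 0.010, 2.0]

      r, fault = residuals(measured, predicted, thresholds)
      for name, ri, fi in zip(sensors, r, fault):
          print(f"{name:15s} residual = {ri:.3f}  fault = {bool(fi)}")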

  2. Analytical techniques for reduction of computational effort in reflector antenna analysis

    NASA Astrophysics Data System (ADS)

    Franceschetti, G.

    Techniques used for computing the radiation integral in reflector antenna analysis are briefly reviewed. The techniques discussed include numerical approaches, such as Monte Carlo multidimensional integration and the Ludwig method (1968), asymptotic solutions, expansion techniques, and the sampling approach. It is pointed out that none of the techniques discussed provides optimum results in the full angular range 0-180 deg, and consequently different techniques are generally used in different angular sectors.
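
    As a minimal illustration of the Monte Carlo multidimensional integration mentioned above, the sketch below estimates a toy two-dimensional integral with a plain-sampling error estimate; the integrand is an arbitrary stand-in, not an actual radiation integral:

      import numpy as np

      def monte_carlo_integrate(f, lower, upper, n_samples=100_000, seed=0):
          """Plain Monte Carlo estimate of a multidimensional integral of f over a
          rectangular domain, with a simple standard-error estimate."""
          rng = np.random.default_rng(seed)
          lower, upper = np.asarray(lower, float), np.asarray(upper, float)
          x = rng.uniform(lower, upper, size=(n_samples, lower.size))
          vals = f(x)
          volume = np.prod(upper - lower)
          estimate = volume * vals.mean()
          std_err = volume * vals.std(ddof=1) / np.sqrt(n_samples)
          return estimate, std_err

      # Toy 2-D integrand standing in for an aperture-field integral (illustrative only).
      f = lambda x: np.cos(x[:, 0]) * np.exp(-x[:, 1] ** 2)
      est, err = monte_carlo_integrate(f, lower=[0, 0], upper=[np.pi / 2, 1.0])
      print(f"integral ~ {est:.4f} +/- {err:.4f}")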

  3. Contactless conductivity detection for analytical techniques-developments from 2012 to 2014.

    PubMed

    Kubáň, Pavel; Hauser, Peter C

    2015-01-01

    The review covers the progress of capacitively coupled contactless conductivity detection over the 2 years leading up to mid-2014. During this period many new applications for conventional CE as well as for microchip separation devices have been reported; prominent areas have been clinical, pharmaceutical, forensic, and food analyses. Further progress has been made in the development of field portable instrumentation based on CE with contactless conductivity detection. Several reports concern the combination with sample pretreatment techniques, in particular electrodriven extractions. Accounts of arrays of contactless conductivity detectors have appeared, which have been created for quite different tasks requiring spatially resolved information. The trend of the use of contactless conductivity measurements for applications other than CE has continued.

  4. Loading of red blood cells with an analyte-sensitive dye for development of a long-term monitoring technique

    NASA Astrophysics Data System (ADS)

    Ritter, Sarah C.; Meissner, Kenith E.

    2012-03-01

    Measurement of blood analytes, such as pH and glucose, provide crucial information about a patient's health. Some such analytes, such as glucose in the case of diabetes, require long-term or near-continuous monitoring for proper disease management. However, current monitoring techniques are far from ideal: multiple-per-day finger stick tests are inconvenient and painful for the patient; implantable sensors have short functional life spans (i.e., 3-7 days). Red blood cells serve as an attractive alternative for carriers of analyte sensors. Once reintroduced to the blood stream, these carriers may continue to live for the remainder of their life span (120 days for humans). They are also biodegradable and biocompatible, thereby eliminating the immune system response common for many implanted devices. The proposed carrier system takes advantage of the ability of the red blood cells to swell in response to a decrease in the osmolarity of the extracellular solution. Just before the membranes lyse, they develop small pores on the scale of tens of nanometers. Analyte-sensitive dyes in the extracellular solution may then diffuse into the perforated red blood cells and become entrapped upon restoration of physiological temperature and osmolarity. Because the membranes contain various analyte transporters, intracellular analyte levels rapidly equilibrate to those of the extracellular solution. A fluorescent dye has been loaded inside of red blood cells using a preswelling technique. Alterations in preparation parameters have been shown to affect characteristics of the resulting dye-loaded red blood cells (e.g., intensity of fluorescence).

  5. Insulator-based DEP with impedance measurements for analyte detection

    DOEpatents

    Davalos, Rafael V.; Simmons, Blake A.; Crocker, Robert W.; Cummings, Eric B.

    2010-03-16

    Disclosed herein are microfluidic devices for assaying at least one analyte specie in a sample comprising at least one analyte concentration area in a microchannel having insulating structures on or in at least one wall of the microchannel which provide a nonuniform electric field in the presence of an electric field provided by off-chip electrodes; and a pair of passivated sensing electrodes for impedance detection in a detection area. Also disclosed are assay methods and methods of making.

  6. Inspiraling black-hole binary spacetimes: Challenges in transitioning from analytical to numerical techniques

    NASA Astrophysics Data System (ADS)

    Zlochower, Yosef; Nakano, Hiroyuki; Mundim, Bruno C.; Campanelli, Manuela; Noble, Scott; Zilhão, Miguel

    2016-06-01

    We explore how a recently developed analytical black-hole binary spacetime can be extended using numerical simulations to go beyond the slow-inspiral phase. The analytic spacetime solves the Einstein field equations approximately, with the approximation error becoming progressively smaller the more separated the binary. To continue the spacetime beyond the slow-inspiral phase, we need to transition to a fully numerical evolution. Such a transition was previously explored at smaller separations. Here, we perform this transition at a separation of D = 20 M (large enough that the analytical metric is expected to be accurate), and evolve for six orbits. We find that small constraint violations can have large dynamical effects, but these can be removed by using a constraint-damping system like the conformal and covariant formulation of the Z4 system. We find agreement between the subsequent numerical spacetime and the predictions of post-Newtonian theory for the waveform and inspiral rate that is within the post-Newtonian truncation error.

  7. Experimental, computational, and analytical techniques for diagnosing breast cancer using optical spectroscopy

    NASA Astrophysics Data System (ADS)

    Palmer, Gregory M.

    This dissertation presents the results of an investigation into experimental, computational, and analytical methodologies for diagnosing breast cancer using fluorescence and diffuse reflectance spectroscopy. First, the optimal experimental methodology for tissue biopsy studies was determined using an animal study. It was found that the use of freshly excised tissue samples preserved the original spectral line shape and magnitude of the fluorescence and diffuse reflectance. Having established the optimal experimental methodology, a clinical study investigating the use of fluorescence and diffuse reflectance spectroscopy for the diagnosis of breast cancer was undertaken. In addition, Monte Carlo-based models of diffuse reflectance and fluorescence were developed and validated to interpret these data. These models enable the extraction of physically meaningful information from the measured spectra, including absorber concentrations, and scattering and intrinsic fluorescence properties. The model was applied to the measured spectra, and using a support vector machine classification algorithm based on physical features extracted from the diffuse reflectance spectra, it was found that breast cancer could be diagnosed with a cross-validated sensitivity and specificity of 82% and 92%, respectively, which are substantially better than that obtained using a conventional, empirical algorithm. It was found that malignant tissues had lower hemoglobin oxygen saturation, were more scattering, and had lower beta-carotene concentration, relative to the non-malignant tissues. It was also found that the fluorescence model could successfully extract the intrinsic fluorescence line shape from tissue samples. One limitation of the previous study is that a priori knowledge of the tissue's absorbers and scatterers is required. To address this limitation, and to improve upon the method with which fiber optic probes are designed, an alternate approach was developed. This method used a
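
    A minimal sketch of a cross-validated support vector machine classifier on extracted physical features is given below; the feature names, synthetic data and kernel settings are assumptions for illustration, not the study's data or tuning:

      import numpy as np
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)

      # Synthetic stand-in for extracted physical features (e.g. hemoglobin
      # saturation, scattering, beta-carotene); values and class structure are
      # illustrative, not the study's data.
      n = 200
      malignant = rng.integers(0, 2, n)
      features = np.column_stack([
          0.8 - 0.2 * malignant + rng.normal(0, 0.1, n),   # oxygen saturation
          1.0 + 0.5 * malignant + rng.normal(0, 0.2, n),   # reduced scattering
          0.6 - 0.3 * malignant + rng.normal(0, 0.15, n),  # beta-carotene
      ])

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      scores = cross_val_score(clf, features, malignant, cv=5)
      print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")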

  8. Chemical and process mineralogical characterizations of spent lithium-ion batteries: an approach by multi-analytical techniques.

    PubMed

    Zhang, Tao; He, Yaqun; Wang, Fangfang; Ge, Linhan; Zhu, Xiangnan; Li, Hong

    2014-06-01

    Mineral processing is a critical step in any recycling process to achieve liberation, separation and concentration of the target parts. Developing effective recycling methods to recover all the valuable parts from spent lithium-ion batteries is therefore greatly needed. The aim of this study is to undertake careful chemical and process mineralogical characterizations of spent lithium-ion batteries by coupling several analytical techniques, to provide basic information for research on effective mechanical crushing and separation methods in the recycling process. The results show that the grades of Co, Cu and Al in spent lithium-ion batteries are fairly high, up to 17.62 wt.%, 7.17 wt.% and 21.60 wt.%, respectively. Spent lithium-ion batteries show good selective crushing behavior: the crushed products can be divided into three fractions, an Al-enriched fraction (+2 mm), a Cu- and Al-enriched fraction (-2+0.25 mm), and a Co- and graphite-enriched fraction (-0.25 mm). Mineral phase and chemical state analyses reveal that the electrode materials recovered from the -0.25 mm size fraction retain the original crystal forms and chemical states found in lithium-ion batteries, but the surface of the powders has been coated by a hydrocarbon species. Based on these results, a flowsheet to recycle spent LIBs is proposed.

  9. An analytical technique to extract surface information of negatively stained or heavy-metal shadowed organic materials within the TEM.

    PubMed

    Matsko, Nadejda B; Letofsky-Papst, Ilse; Albu, Mihaela; Mittal, Vikas

    2013-06-01

    Using a series of uranyl acetate stained or platinum-palladium shadowed organic samples, an empirical analytical method to extract surface information from energy-filtered transmission electron microscopy (EFTEM) images is described. The distribution of uranium or platinum-palladium atoms, which replicates the sample surface topography, has been mathematically extracted by dividing the image acquired in the valence bulk plasmon energy region (between 20 and 30 eV) by the image acquired at the carbon K ionization edge (between 284 and 300 eV). The resulting plasmon-to-carbon ratio (PCR) image may be interpreted as a precise metal replica of the sample surface. In contrast to conventional EFTEM elemental mapping, including an absolute quantification approach, this technique can be applied to 200-600 nm thick organic samples. A combination of conventional TEM and PCR imaging allows one to detect complementary transmission and topographical information, with nanometer precision, for the same area of carbon-based samples. The advantages and limitations of PCR imaging are highlighted. PMID:23570815
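
    The image-ratio step described above is straightforward to express numerically; the sketch below (array values are arbitrary counts for illustration) divides the plasmon-region image by the carbon K-edge image pixel by pixel:

      import numpy as np

      def pcr_image(plasmon_img, carbon_k_img, eps=1e-6):
          """Plasmon-to-carbon ratio (PCR) image: divide the low-loss (20-30 eV)
          EFTEM image by the carbon K-edge (284-300 eV) image, pixel by pixel.
          eps avoids division by zero in vacuum or hole regions."""
          plasmon = np.asarray(plasmon_img, dtype=float)
          carbon = np.asarray(carbon_k_img, dtype=float)
          return plasmon / np.maximum(carbon, eps)

      # Tiny synthetic example (values are arbitrary counts, for illustration only).
      plasmon = np.array([[120., 180.], [150., 90.]])
      carbon_k = np.array([[60., 60.], [75., 0.]])
      print(pcr_image(plasmon, carbon_k))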

  10. Analytical performance specifications based on how clinicians use laboratory tests. Experiences from a post-analytical external quality assessment programme.

    PubMed

    Thue, Geir; Sandberg, Sverre

    2015-05-01

    Analytical performance specifications can be based on three different models: the effect of analytical performance on clinical outcomes, the components of biological variation of the measurand, or the state of the art. Models 1 and 3 may to some degree be combined by using case histories presented to a large number of clinicians. The Norwegian Quality Improvement of Primary Care Laboratories (Noklus) has integrated such vignettes in its external quality assessment programme since 1991, focusing on typical clinical situations in primary care. Haemoglobin, erythrocyte sedimentation rate (ESR), HbA1c, glucose, u-albumin, creatinine/estimated glomerular filtration rate (eGFR), and International Normalised Ratio (INR) have been evaluated, focusing on critical differences in test results, i.e., a change from a previous result that will generate an "action" such as a change in treatment or follow-up of the patient. These critical differences, stated by physicians, can be translated into reference change values (RCVs), from which the assumed analytical performance can be calculated. In general, assessments of RCVs, and therefore performance specifications, vary both within and between groups of doctors, but with no or minor differences related to specialisation, age or sex of the general practitioner. In some instances state-of-the-art analytical performance could not meet clinical demands at 95% confidence, whereas clinical demands were met at 80% confidence in nearly all instances. RCVs from vignettes should probably not be used on their own as a basis for setting analytical performance specifications, since clinicians seem "uninformed" regarding important principles. They could rather be used as background for focus groups of "informed" physicians in discussions of performance specifications tailored to "typical" clinical situations.
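
    The link between critical differences and analytical performance can be made explicit with the standard reference change value formula, RCV = sqrt(2)·Z·sqrt(CV_A^2 + CV_I^2); the sketch below (the numerical CVs are illustrative, not from the paper) computes an RCV and the maximum analytical CV compatible with a stated critical difference at 95% and 80% confidence:

      import math

      def rcv(cv_analytical, cv_within_subject, z=1.96):
          """Reference change value (%) for a given analytical CV, within-subject
          biological CV and confidence level (z = 1.96 for 95%, ~1.28 for 80%)."""
          return math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_within_subject**2)

      def max_allowable_cv_a(critical_difference, cv_within_subject, z=1.96):
          """Analytical CV (%) implied by a clinician-stated critical difference,
          i.e. the largest CV_A for which the RCV stays at or below that difference."""
          term = (critical_difference / (math.sqrt(2) * z))**2 - cv_within_subject**2
          return math.sqrt(term) if term > 0 else float("nan")  # nan: demand unreachable

      # Illustrative numbers only: within-subject CV of 5 % and a clinician-stated
      # critical difference of 20 %.
      print(f"RCV at CV_A = 3 %: {rcv(3.0, 5.0):.1f} %")
      print(f"max CV_A for a 20 % difference (95 %): {max_allowable_cv_a(20.0, 5.0):.1f} %")
      print(f"max CV_A for a 20 % difference (80 %): {max_allowable_cv_a(20.0, 5.0, z=1.28):.1f} %")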

  11. Rapid Late Holocene glacier fluctuations reconstructed from South Georgia lake sediments using novel analytical and numerical techniques

    NASA Astrophysics Data System (ADS)

    van der Bilt, Willem; Bakke, Jostein; Werner, Johannes; Paasche, Øyvind; Rosqvist, Gunhild

    2016-04-01

    synchronous bi-polar Little Ice Age (LIA). In conclusion, our work shows the potential of novel analytical and numerical tools to improve the robustness and resolution of lake sediment-based paleoclimate reconstructions beyond the current state-of-the-art.

  12. Analytical techniques for the study of some parameters of multispectral scanner systems for remote sensing

    NASA Technical Reports Server (NTRS)

    Wiswell, E. R.; Cooper, G. R. (Principal Investigator)

    1978-01-01

    The author has identified the following significant results. The concept of average mutual information in the received spectral random process about the spectral scene was developed. Techniques amenable to implementation on a digital computer were also developed to make the required average mutual information calculations. These techniques required identification of models for the spectral response process of scenes. Stochastic modeling techniques were adapted for use. These techniques were demonstrated on empirical data from wheat and vegetation scenes.
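
    As a minimal illustration of estimating average mutual information between a scene signal and a received signal, the sketch below uses a simple histogram-based estimator on synthetic data; the signal model and bin count are assumptions, not the stochastic scene models developed in the report:

      import numpy as np

      def mutual_information(x, y, bins=32):
          """Histogram-based estimate (in bits) of the average mutual information
          between two jointly sampled signals."""
          joint, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = joint / joint.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nonzero = pxy > 0
          return float(np.sum(pxy[nonzero] * np.log2(pxy[nonzero] / (px @ py)[nonzero])))

      # Toy stand-in: a "scene" reflectance signal and a noisy "received" channel.
      rng = np.random.default_rng(0)
      scene = rng.normal(size=50_000)
      received = 0.8 * scene + rng.normal(scale=0.6, size=scene.size)
      print(f"estimated mutual information: {mutual_information(scene, received):.2f} bits")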

  13. The use of surface analytical techniques to measure the loadings of uranium and plutonium sorbed simultaneously from solution onto rocks

    SciTech Connect

    Berry, J.A.; Bishop, H.E.; Cowper, M.M.; Fozard, P.R.; McMillan, J.W.

    1995-12-31

    Small polished blocks of granite, diorite and dolerite were immersed in solutions containing uranium and plutonium at equal initial concentrations. The samples were analyzed by the advanced surface analytical techniques of secondary ion mass spectrometry (SIMS) and nuclear microprobe analysis. The results show that both actinides sorb onto the same minerals in the three rocks. However, SIMS data show that significantly more uranium was sorbed than plutonium.

  14. Implementation of IAEA /1/INT/054 Project in Nuclear Analytical Techniques Group of Argentina: Current State

    SciTech Connect

    Sara, Resnizky; Rita, Pla; Alba, Zaretzky

    2008-08-14

    This paper presents the implementation of the training received through the IAEA project 'Preparation of Reference Materials and Organization of Proficiency Tests Rounds' in the Nuclear Analytical Techniques (NAT) Group of CNEA. Special emphasis is placed on the activities related to the first proficiency test being carried out by the NAT Group.

  15. Petroleum-specific analytical and interpretive techniques for product identification and source allocation

    SciTech Connect

    Dahlen, D.T.; Uhler, A.D.; Sauer, T.C.; McCarthy, K.J.

    1995-12-31

    As a class, petroleum and refined petroleum products represent the single largest source of contamination of terrestrial and aquatic environments. Petroleum products, including crude oil and a wide spectrum of refined products, are accidentally released by a variety of mechanisms including surface spills, tanker accidents, storage tank leaks, and pipeline ruptures. In many cases, the petroleum and other hydrocarbon products discovered in impacted environments cannot be readily linked to a source, because the products are often complex mixtures of various hydrocarbon-based components. The identification of hydrocarbon products in the environment is further confounded by complex physical, chemical, and biological mechanisms that alter the fresh hydrocarbons through processes known collectively as weathering. The dilution, mixing, transport, and weathering of these hydrocarbon materials in the environment renders them very different in composition and appearance from their original source materials. This paper describes the use of EPA methods of analysis that have been modified for the measurement of a large suite of petrogenic organic compounds, enabling sensitive measurement of fresh and degraded petroleum in environmental samples ranging from volatile products such as gasoline to heavy crude oils. Petroleum-specific data are reduced using multivariate chemometric techniques, including hierarchical cluster analysis (HCA) and principal component analysis, to evaluate linkage between the samples and likely petroleum source products. Combined with high quality, petroleum-specific data, these analysis techniques offer unparalleled means of identifying materials, differentiating between similar products such as gasolines, and linking contaminated samples to sources. Examples of the use of the methods to deduce sources of gasoline, jet fuel, and diesel in terrestrial spills will be presented.
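
    A minimal sketch of the chemometric step (PCA plus hierarchical cluster analysis on compound-concentration fingerprints) is given below; the marker compounds and concentration values are synthetic stand-ins, not real petroleum data:

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)

      # Synthetic "fingerprints": concentrations of 12 marker compounds in 8 samples,
      # drawn from two hypothetical source products (values are illustrative only).
      source_a = rng.normal([5, 3, 8, 1, 4, 6, 2, 7, 3, 5, 1, 2], 0.3, size=(4, 12))
      source_b = rng.normal([2, 6, 3, 5, 1, 2, 7, 3, 6, 1, 4, 5], 0.3, size=(4, 12))
      profiles = np.vstack([source_a, source_b])

      scaled = StandardScaler().fit_transform(profiles)

      # Principal component analysis: project the fingerprints onto two components.
      scores = PCA(n_components=2).fit_transform(scaled)

      # Hierarchical cluster analysis (Ward linkage) to group samples by source.
      clusters = fcluster(linkage(scaled, method="ward"), t=2, criterion="maxclust")

      for i, (pc, cl) in enumerate(zip(scores, clusters)):
          print(f"sample {i}: PC1 = {pc[0]:6.2f}, PC2 = {pc[1]:6.2f}, cluster = {cl}")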

  16. Terracotta polychrome sculptures examined before and after their conservation work: contributions from non-invasive in situ analytical techniques.

    PubMed

    Colombo, C; Bevilacqua, F; Brambilla, L; Conti, C; Realini, M; Striova, J; Zerbi, G

    2011-08-01

    The potential of non-invasive in situ analytical techniques such as portable Raman, portable X-ray fluorescence, portable optical microscopy and fibre optics reflectance spectroscopy has been demonstrated by studying the painted layers of Renaissance terracotta polychrome sculptures belonging to the statuary of Santo Sepolcro Church in Milan. The results highlight the contribution of these techniques to compositional diagnostics, providing information on the pigments used in the external painted layers that is complete and, in some cases, better than that obtained from micro-destructive techniques. Moreover, a comparison with the results obtained with micro-destructive techniques before the last conservation work (2009) allowed the removal of the external painted layers during the conservation operations to be ascertained.

  17. Fibre based integral field unit constructional techniques

    NASA Astrophysics Data System (ADS)

    Murray, Graham J.

    2006-06-01

    Presented here is a selected overview of constructional techniques and principles that have been developed and implemented at the University of Durham in the manufacture of successful fibre-based integral field units. The information contained herein is specifically intended to highlight the constructional methods that have been devised to assemble an efficient fibre bundle. Potential pitfalls that need to be considered when embarking upon such a deceptively simple instrument are also discussed.

  18. Analysis of low molecular weight metabolites in tea using mass spectrometry-based analytical methods.

    PubMed

    Fraser, Karl; Harrison, Scott J; Lane, Geoff A; Otter, Don E; Hemar, Yacine; Quek, Siew-Young; Rasmussen, Susanne

    2014-01-01

    Tea is the second most consumed beverage in the world after water, and there are numerous reported health benefits from consuming tea, such as reducing the risk of cardiovascular disease and many types of cancer. Thus, there is much interest in the chemical composition of teas, for example, defining components responsible for the reported health benefits, defining quality characteristics such as product flavor, and monitoring for pesticide residues to comply with food safety import/export requirements. Covered in this review are some of the latest developments in mass spectrometry-based analytical techniques for measuring and characterizing low molecular weight components of tea, in particular primary and secondary metabolites. The methodology, more specifically the chromatography and detection mechanisms used in both targeted and non-targeted studies, and their main advantages and disadvantages, are discussed. Finally, we comment on the latest techniques that are likely to have significant benefit to analysts in the future, not merely in the area of tea research, but in the analytical chemistry of low molecular weight compounds in general. PMID:24499071

  19. Laboratory Techniques in Geology: Embedding Analytical Methods into the Undergraduate Curriculum

    NASA Astrophysics Data System (ADS)

    Baedke, S. J.; Johnson, E. A.; Kearns, L. E.; Mazza, S. E.; Gazel, E.

    2014-12-01

    Paid summer REU experiences successfully engage undergraduate students in research and encourage them to continue to graduate school and scientific careers. However, these programs accommodate only a limited number of students due to funding constraints, faculty time commitments, and limited access to needed instrumentation. At JMU, the Department of Geology and Environmental Science has embedded undergraduate research into the curriculum. Each student fulfilling a BS in Geology or a BA in Earth Science completes 3 credits of research, including a 1-credit course on scientific communication and 2 credits of research or internship, followed by a presentation of that research. Our department has successfully acquired many analytical instruments and now has an XRD, SEM/EDS, FTIR, handheld Raman, AA, ion chromatograph, and an IRMS. To give as many students as possible an overview of the scientific uses and operating methods for these instruments, we revived a laboratory methods course that includes theory and practical use of instrumentation at JMU, plus XRF sample preparation and analysis training at Virginia Tech during a 1-day field trip. In addition to practical training, projects included analytical concepts such as evaluating analytical vs. natural uncertainty, determining error on multiple measurements, signal-to-noise ratio, and evaluating data quality. State funding through the 4-VA program helped pay for analytical supplies and supported students in completing research projects over the summer or during the next academic year using instrumentation from the course. This course exemplifies an alternative path to broadening participation in undergraduate research and creating stronger partnerships between PUIs and research universities.

  20. Laser Remote Sensing: Velocimetry Based Techniques

    NASA Astrophysics Data System (ADS)

    Molebny, Vasyl; Steinvall, Ove

    Laser-based velocity measurement is an area of the field of remote sensing where the coherent properties of laser radiation are the most exposed. Much of the published literature deals with the theory and techniques of remote sensing. We restrict our discussion to current trends in this area, gathered from recent conferences and professional journals. Remote wind sensing and vibrometry are promising in their new scientific, industrial, military, and biomedical applications, including improving flight safety, precise weapon correction, non-contact mine detection, optimization of wind farm operation, object identification based on its vibration signature, fluid flow studies, and vibrometry-associated diagnosis.

  1. [Recent advancement of photonic-crystal-based analytical chemistry].

    PubMed

    Chen, Yun; Guo, Zhenpeng; Wang, Jinyi; Chen, Yi

    2014-04-01

    Photonic crystals are a class of novel materials with ordered structures, nanopores/channels and optical band gaps. They therefore have important applications in physics, chemistry, biological science and engineering. This review summarizes the recent advancement of photonic crystals in analytical chemistry applications over the past five years, with a focus on sensing and separation.

  2. Exploring the Efficacy of Behavioral Skills Training to Teach Basic Behavior Analytic Techniques to Oral Care Providers

    ERIC Educational Resources Information Center

    Graudins, Maija M.; Rehfeldt, Ruth Anne; DeMattei, Ronda; Baker, Jonathan C.; Scaglia, Fiorella

    2012-01-01

    Performing oral care procedures with children with autism who exhibit noncompliance can be challenging for oral care professionals. Previous research has elucidated a number of effective behavior analytic procedures for increasing compliance, but some procedures are likely to be too time consuming and expensive for community-based oral care…

  3. An Examination of Sampling Characteristics of Some Analytic Factor Transformation Techniques.

    ERIC Educational Resources Information Center

    Skakun, Ernest N.; Hakstian, A. Ralph

    Two population raw data matrices were constructed by computer simulation techniques. Each consisted of 10,000 subjects and 12 variables, and each was constructed according to an underlying factorial model consisting of four major common factors, eight minor common factors, and 12 unique factors. The computer simulation techniques were employed to…

  4. Video based lifting technique coding system.

    PubMed

    Hsiang, S M; Brogmus, G E; Martin, S E; Bezverkhny, I B

    1998-03-01

    Despite automation and improved working conditions, many materials in industry are still handled manually. Among the basic activities involved in manual materials handling, lifting is the one most frequently associated with low-back pain (LBP). Biomechanical analysis techniques have been used to better understand the risk factors associated with manual handling, but because these techniques require specialized equipment and highly trained personnel, and interfere with normal business operations, their usefulness is limited. A video-based lifting technique analysis system (the VidLiTeC(TM) System) is presented that provides quantifiable, non-invasive biomechanical analysis of the dynamic features of lifting with high inter-coder reliability and low sensitivity to absolute errors. Analyses of results from a laboratory experiment and from field-collected videotape are described that support the reliability, sensitivity, and accuracy claims of the VidLiTeC(TM) System. The VidLiTeC(TM) System allows technicians with minimal training and low-tech equipment (a camcorder) to collect large sets of lifting data without interfering with normal business operations. A reasonably accurate estimate of the peak compressive force on the L5/S1 joint can be made from the data collected. Such a system can be used to collect quantified data on lifting techniques that can be related to LBP reporting.

  5. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    NASA Astrophysics Data System (ADS)

    Singh Duksh, Yograj; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-05-01

    An equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the finite-difference time-domain (FDTD) numerical technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE.
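
    A minimal FDTD sketch for a single lossy transmission line is given below to illustrate the leapfrog update and the Courant stability condition; the per-unit-length parameters, line length and the crude source/load treatment are assumptions, and the coupled-bundle SWCNT model of the paper is not reproduced:

      import numpy as np

      # FDTD (leapfrog) solution of the 1-D telegrapher's equations for a single
      # lossy RLC line driven by a step source. Per-unit-length parameters and the
      # source/load treatment are illustrative assumptions.
      R = 10.0          # series resistance, ohm/m
      L = 2.0e-6        # series inductance, H/m
      C = 1.0e-10       # shunt capacitance, F/m
      length = 0.01     # line length, m
      nz = 200
      dx = length / nz
      dt = 0.9 * dx * np.sqrt(L * C)      # Courant condition: dt <= dx * sqrt(L*C)
      z0 = np.sqrt(L / C)                 # characteristic impedance (lossless approx.)

      V = np.zeros(nz + 1)                # node voltages
      I = np.zeros(nz)                    # branch currents, staggered half a cell
      c1, c2 = L / dt - R / 2.0, L / dt + R / 2.0

      n_steps = 1200
      for _ in range(n_steps):
          # Current update from the spatial voltage difference (semi-implicit in R).
          I = (c1 * I - (V[1:] - V[:-1]) / dx) / c2
          # Interior voltage update from the current divergence (shunt G = 0).
          V[1:-1] -= dt / (C * dx) * (I[1:] - I[:-1])
          # Boundaries: ideal 1 V step source; crude matched resistive load.
          V[0] = 1.0
          V[-1] = z0 * I[-1]

      print(f"dt = {dt * 1e12:.2f} ps, Z0 = {z0:.1f} ohm")
      print(f"far-end voltage after {n_steps} steps: {V[-1]:.3f} V")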

  6. A New Project-Based Lab for Undergraduate Environmental and Analytical Chemistry

    ERIC Educational Resources Information Center

    Adami, Gianpiero

    2006-01-01

    A new project-based lab was developed for third year undergraduate chemistry students based on real world applications. The experience suggests that the total analytical procedure (TAP) project offers a stimulating alternative for delivering science skills and developing a greater interest for analytical chemistry and environmental sciences and…

  7. Biomedical microelectromechanical systems (BioMEMS): Revolution in drug delivery and analytical techniques.

    PubMed

    Jivani, Rishad R; Lakhtaria, Gaurang J; Patadiya, Dhaval D; Patel, Laxman D; Jivani, Nurrudin P; Jhala, Bhagyesh P

    2016-01-01

    Advancement in microelectromechanical systems has facilitated the microfabrication of polymeric substrates and the development of a novel class of controlled drug delivery devices. These vehicles have specifically tailored three-dimensional physical and chemical features which, together, provide the capacity to target cells, stimulate unidirectional controlled release of therapeutics and augment permeation across barriers. Apart from drug delivery devices, microfabrication technologies offer exciting prospects for generating biomimetic gastrointestinal tract models. BioMEMS are capable of analysing biochemical liquid samples such as solutions of metabolites, macromolecules, proteins, nucleic acids, cells and viruses. This review summarizes the multidisciplinary applications of biomedical microelectromechanical systems in drug delivery and their potential in analytical procedures. PMID:26903763

  8. Biomedical microelectromechanical systems (BioMEMS): Revolution in drug delivery and analytical techniques

    PubMed Central

    Jivani, Rishad R.; Lakhtaria, Gaurang J.; Patadiya, Dhaval D.; Patel, Laxman D.; Jivani, Nurrudin P.; Jhala, Bhagyesh P.

    2013-01-01

    Advancement in microelectromechanical systems has facilitated the microfabrication of polymeric substrates and the development of a novel class of controlled drug delivery devices. These vehicles have specifically tailored three-dimensional physical and chemical features which, together, provide the capacity to target cells, stimulate unidirectional controlled release of therapeutics and augment permeation across barriers. Apart from drug delivery devices, microfabrication technologies offer exciting prospects for generating biomimetic gastrointestinal tract models. BioMEMS are capable of analysing biochemical liquid samples such as solutions of metabolites, macromolecules, proteins, nucleic acids, cells and viruses. This review summarizes the multidisciplinary applications of biomedical microelectromechanical systems in drug delivery and their potential in analytical procedures. PMID:26903763

  9. Visualizing metal ions in cells: an overview of analytical techniques, approaches, and probes

    PubMed Central

    Dean, Kevin M.; Qin, Yan; Palmer, Amy E.

    2012-01-01

    Quantifying the amount and defining the location of metal ions in cells and organisms are critical steps in understanding metal homeostasis and how dyshomeostasis causes or is a consequence of disease. A number of recent advances have been made in the development and application of analytical methods to visualize metal ions in biological specimens. Here, we briefly summarize these advances before focusing in more depth on probes for examining transition metals in living cells with high spatial and temporal resolution using fluorescence microscopy. PMID:22521452

  10. COMPARISON OF ANALYTICAL TECHNIQUES FOR MEASURING HYDROCARBON EMISSIONS FROM THE MANUFACTURE OF FIBERGLASS-REINFORCED PLASTICS

    EPA Science Inventory

    The paper discusses several projects to measure hydrocarbon emissions associated with the manufacture of fiberglass-reinforced plastics. The main purpose of the projects was to evaluate pollution prevention techniques to reduce emissions by altering raw materials, application equ...

  11. Transcription factor-based biosensors enlightened by the analyte

    PubMed Central

    Fernandez-López, Raul; Ruiz, Raul; de la Cruz, Fernando; Moncalián, Gabriel

    2015-01-01

    Whole cell biosensors (WCBs) have multiple applications for environmental monitoring, detecting a wide range of pollutants. WCBs depend critically on the sensitivity and specificity of the transcription factor (TF) used to detect the analyte. We describe the mechanism of regulation and the structural and biochemical properties of TF families that are used, or could be used, for the development of environmental WCBs. Focusing on the chemical nature of the analyte, we review TFs that respond to aromatic compounds (XylS-AraC, XylR-NtrC, and LysR), metal ions (MerR, ArsR, DtxR, Fur, and NikR) or antibiotics (TetR and MarR). Analyzing the structural domains involved in DNA recognition, we highlight the similarities in the DNA binding domains (DBDs) of these TF families. In contrast to the DBDs, the wide range of analytes detected by TFs results in a diversity of structures at the effector binding domain. The modular architecture of TFs opens the possibility of engineering TFs with hybrid DNA and effector specificities. Yet, the lack of a crisp correlation between structural domains and specific functions makes this a challenging task. PMID:26191047

  12. Neutron capture radiography: a technique for isotopic labelling and analytical imaging with a few stable isotopes.

    PubMed

    Thellier, Michel; Ripoll, Camille

    2006-06-19

    NCR (neutron capture radiography) may be used successfully for the imaging of one of the stable isotopes of a few chemical elements (especially 6Li and 10B, possibly also 14N, 17O, and others) and for labelling experiments using these stable isotopes. Other physical techniques compete with NCR. However, NCR can remain extremely useful in a certain number of cases, because it is usually more easily done and is less expensive than the other techniques.

  13. Development of an analytical technique for the detection of alteration minerals formed in bentonite by reaction with alkaline solutions

    NASA Astrophysics Data System (ADS)

    Sakamoto, H.; Shibata, M.; Owada, H.; Kaneko, M.; Kuno, Y.; Asano, H.

    A multibarrier system consisting of cement-based backfill, structures and support materials, and a bentonite-based buffer material has been studied for the TRU waste disposal concept being developed in Japan, the aim being to restrict the migration of radionuclides. Concern regarding bentonite-based materials in this disposal environment relates to long-term alteration under hyper-alkaline conditions due to the presence of cementitious materials. In tests simulating the interaction between bentonite and cement, formation of secondary minerals due to alteration reactions under the conditions expected for geological disposal of TRU waste (equilibrated water with cement at low liquid/solid ratio) has not been observed, although alteration was observed under extremely hyper-alkaline conditions with high temperatures. This was considered to be due to the fact that analysis of C-S-H gel formed at the interface as a secondary mineral was difficult using XRD, because of its low crystallinity and low content. This paper describes an analytical technique for the characterization of C-S-H gel using a heavy liquid separation method which separates C-S-H gel from Kunigel V1 bentonite (bentonite produced in Japan) based on the difference in specific gravity between the crystalline minerals constituting Kunigel V1 and the secondary C-S-H gel. For development of C-S-H gel separation methods, simulated alteration samples were prepared by mixing 990 mg of unaltered Kunigel V1 and 10 mg of C-S-H gel synthesized using pure chemicals at a ratio of Ca/Si = 1.2. The simulated alteration samples were dispersed in bromoform-methanol mixtures with specific gravities ranging from 2.00 to 2.57 g/cm3 and subjected to centrifuge separation to recover the light density fraction. Subsequent XRD analysis to identify the minerals was complemented by dissolution in 0.6 N hydrochloric acid to measure the Ca and Si contents. The primary peak (2θ = 29.4°, Cu Kα) and secondary peaks (2θ = 32.1

  14. Multiplex gas chromatography: A novel analytical technique for future planetary studies

    NASA Technical Reports Server (NTRS)

    Valentin, J. R.; Carle, G. C.; Phillips, J. B.

    1986-01-01

    Determination of molecular species comprised of the biogenic elements in the atmospheres of planets and moons of the solar system is one of the foremost requirements of the exobiologist studying chemical evolution and the origin of life. Multiplex chromatography is a technique in which many samples are pseudo-randomly introduced to the chromatograph without regard to the elution of preceding components. The resulting data are then reduced using mathematical techniques such as cross correlation or Fourier transforms. To demonstrate the utility of this technique for future solar system exploration, chemical modulators were developed. Several advantages were realized from this technique in combination with these modulators: improvement in detection limits of several orders of magnitude, improvement in the analysis of complex mixtures by selectively modulating some of the components present in the sample, an increase in the number of analyses that can be conducted in a given period of time, and a reduction in the amount of expendables needed to run an analysis. In order to apply this technique in a real application, methane in ambient air was monitored continuously over a period of one week. By using ambient air as its own carrier gas, all expendables other than power were eliminated.
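
    A minimal sketch of the demodulation step: when the injection sequence is pseudo-random, the detector signal is (approximately) the circular convolution of that sequence with the single-injection chromatogram, so circular cross-correlation with the known sequence recovers the chromatogram. The peaks and sequence below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-injection chromatogram: two Gaussian peaks (retention times in samples).
n = 512
t = np.arange(n)
chromatogram = 1.0 * np.exp(-0.5 * ((t - 120) / 6.0) ** 2) \
             + 0.4 * np.exp(-0.5 * ((t - 260) / 9.0) ** 2)

# Pseudo-random binary injection sequence: samples are introduced without waiting for elution.
prbs = rng.integers(0, 2, size=n).astype(float)

# Detector output: circular convolution of the injection sequence with the chromatogram
# (many overlapping elutions), plus noise.
detector = np.real(np.fft.ifft(np.fft.fft(prbs) * np.fft.fft(chromatogram)))
detector += rng.normal(scale=0.05, size=n)

# Demodulation: circular cross-correlation with the mean-removed injection sequence
# recovers the single-injection chromatogram up to a scale factor.
xcorr = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(prbs - prbs.mean()))))
recovered = xcorr / (prbs.size * prbs.var())

print("true peak positions:", 120, 260)
print("strongest recovered peak position:", int(np.argmax(recovered)))
```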

  15. Hydrocarbon microseepage mapping using signature based target detection techniques

    NASA Astrophysics Data System (ADS)

    Soydan, Hilal; Koz, Alper; Şebnem Düzgün, H.; Aydin Alatan, A.

    2015-10-01

    In this paper, we compare conventional methods for detecting hydrocarbon seepage anomalies with signature-based detection algorithms. The Crosta technique [1] is selected as the baseline for the experimental comparisons with the conventional approach. The Crosta technique utilizes the characteristic bands of the searched target for principal component transformation in order to determine the components characterizing the target of interest. The Desired Target Detection and Classification Algorithm (DTDCA), Spectral Matched Filter (SMF), and Normalized Correlation (NC) are employed for signature-based target detection. Signature-based target detection algorithms are applied to the whole spectrum, benefiting from the information stored in all spectral bands. The selected methods are applied to a multispectral Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) image of the study region, with an atmospheric correction prior to running the algorithms. ASTER provides multispectral bands covering the visible, shortwave, and thermal infrared regions, which serves as a useful tool for the interpretation of areas with hydrocarbon anomalies. The exploration area is the Gemrik Anticline, located in South East Anatolia, Adıyaman, Bozova Oil Field, where microseeps can be observed with almost no vegetation cover. The spectral signatures collected with an Analytical Spectral Devices Inc. (ASD) spectrometer from the reference valley [2] have been utilized as input to the signature-based detection algorithms. The experiments have indicated that DTDCA and SMF outperform the Crosta technique by locating the microseepage patterns along the migration pathways with better contrast. On the other hand, NC has not been able to map the searched target with a visible distinction. It is concluded that the signature-based algorithms can be more effective than the conventional methods for the detection of microseepage-induced anomalies.
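
    The spectral matched filter mentioned above can be sketched in a few lines: the background mean and covariance are estimated from the image cube, and each pixel is scored by its whitened projection onto the target signature. The cube and signature below are synthetic stand-ins, not ASTER data.

```python
import numpy as np

rng = np.random.default_rng(1)

bands, rows, cols = 9, 60, 60                 # hypothetical multispectral cube
background = rng.normal(size=(rows, cols, bands)) @ np.diag(np.linspace(1.0, 2.0, bands))
target_sig = np.linspace(0.5, 1.5, bands)     # hypothetical field-collected signature

cube = background.copy()
cube[30, 30] += 3.0 * target_sig              # implant one sub-pixel target

# Spectral Matched Filter: score(x) = s' C^-1 (x - m) / (s' C^-1 s)
pixels = cube.reshape(-1, bands)
mean = pixels.mean(axis=0)
cov = np.cov(pixels, rowvar=False)
cinv = np.linalg.inv(cov)
s = target_sig - mean
scores = ((pixels - mean) @ cinv @ s) / (s @ cinv @ s)
scores = scores.reshape(rows, cols)

print("highest-scoring pixel:", np.unravel_index(scores.argmax(), scores.shape))
```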

  16. Advanced Analytical Techniques for the Measurement of Nanomaterials in Food and Agricultural Samples: A Review

    PubMed Central

    Bandyopadhyay, Susmita; Peralta-Videa, Jose R.; Gardea-Torresdey, Jorge L.

    2013-01-01

    Abstract Nanotechnology offers substantial prospects for the development of state-of-the-art products and applications for agriculture, water treatment, and food industry. Profuse use of nanoproducts will bring potential benefits to farmers, the food industry, and consumers, equally. However, after end-user applications, these products and residues will find their way into the environment. Therefore, discharged nanomaterials (NMs) need to be identified and quantified to determine their ecotoxicity and the levels of exposure. Detection and characterization of NMs and their residues in the environment, particularly in food and agricultural products, have been limited, as no single technique or method is suitable to identify and quantify NMs. In this review, we have discussed the available literature concerning detection, characterization, and measurement techniques for NMs in food and agricultural matrices, which include chromatography, flow field fractionation, electron microscopy, light scattering, and autofluorescence techniques, among others. PMID:23483065

  17. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

    Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first-order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
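
    A sketch of the underlying idea: each first-order (Gauss-)Markov process has an exact discrete-time update x[k+1] = exp(-dt/tau)·x[k] + w[k], and summing several such processes shapes the power spectral density of the simulated frequency error. The correlation times and amplitudes below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

dt, nsteps = 1.0, 20000               # time step (s) and number of samples
# Hypothetical five first-order Markov processes: (correlation time tau, steady-state sigma).
processes = [(10.0, 1e-12), (100.0, 3e-12), (1e3, 1e-11), (1e4, 3e-11), (1e5, 1e-10)]

x = np.zeros(len(processes))
freq_error = np.empty(nsteps)
for k in range(nsteps):
    for i, (tau, sigma) in enumerate(processes):
        phi = np.exp(-dt / tau)
        # Exact discrete-time update of a first-order Gauss-Markov process.
        x[i] = phi * x[i] + sigma * np.sqrt(1.0 - phi ** 2) * rng.normal()
    freq_error[k] = x.sum()           # fractional frequency error = sum of the five processes

# A range or integrated-Doppler observable accumulates the running integral of this error.
clock_error = np.cumsum(freq_error) * dt
print("RMS fractional frequency error: %.2e" % freq_error.std())
print("accumulated clock error after %d s: %.2e s" % (nsteps, clock_error[-1]))
```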

  18. Cross-cultural patterns in emotion recognition: highlighting design and analytical techniques.

    PubMed

    Elfenbein, Hillary Anger; Mandal, Manas K; Ambady, Nalini; Harizuka, Susumu; Kumar, Surender

    2002-03-01

    This article highlights a range of design and analytical tools for studying the cross-cultural communication of emotion using forced-choice experimental designs. American, Indian, and Japanese participants judged facial expressions from all 3 cultures. A factorial experimental design is used, balanced n x n across cultures, to separate "absolute" cultural differences from "relational" effects characterizing the relationship between the emotion expressor and perceiver. Use of a response bias correction is illustrated for the tendency to endorse particular multiple-choice categories more often than others. Treating response bias also as an opportunity to gain insight into attributional style, the authors examined similarities and differences in response patterns across cultural groups. Finally, the authors examined patterns in the errors or confusions that participants make during emotion recognition and documented strong similarity across cultures.

  19. Modern analytical techniques in the assessment of the authenticity of Serbian honey.

    PubMed

    Milojković Opsenica, Dušanka; Lušić, Dražen; Tešić, Živoslav

    2015-12-01

    Food authenticity in a broader sense means fulfilling the chemical and physical criteria prescribed by the proposed legislation. In the case of honey authenticity, two aspects are of major concern: the manufacturing process and the labelling of final products in terms of their geographical and botanical origin. A reliable assessment of honey authenticity has been a long-term preoccupation of analytical chemists and it usually involves the use of several criteria and chemical markers, as well as a combination of analytical and statistical (chemometric) methods. This paper provides an overview of different criteria and modern methods for the assessment of honey authenticity in the case of a statistically significant number of authentic honey samples of several botanical types from various regions of Serbia.

  20. Modern analytical techniques in the assessment of the authenticity of Serbian honey.

    PubMed

    Milojković Opsenica, Dušanka; Lušić, Dražen; Tešić, Živoslav

    2015-12-01

    Food authenticity in a broader sense means fulfilling the chemical and physical criteria prescribed by the proposed legislation. In the case of honey authenticity, two aspects are of major concern: the manufacturing process and the labelling of final products in terms of their geographical and botanical origin. A reliable assessment of honey authenticity has been a long-term preoccupation of analytical chemists and it usually involves the use of several criteria and chemical markers, as well as a combination of analytical and statistical (chemometric) methods. This paper provides an overview of different criteria and modern methods for the assessment of honey authenticity in the case of a statistically significant number of authentic honey samples of several botanical types from various regions of Serbia. PMID:26751854

  1. Effect of Potassium on the Mechanisms of Biomass Pyrolysis Studied using Complementary Analytical Techniques.

    PubMed

    Le Brech, Yann; Ghislain, Thierry; Leclerc, Sébastien; Bouroukba, Mohammed; Delmotte, Luc; Brosse, Nicolas; Snape, Colin; Chaimbault, Patrick; Dufour, Anthony

    2016-04-21

    Complementary analytical methods have been used to study the effect of potassium on the pyrolysis mechanisms of cellulose and lignocellulosic biomasses. Thermogravimetry, calorimetry, high-temperature 1H NMR spectroscopy (in situ and real-time analysis of the fluid phase formed during pyrolysis), and water extraction of quenched char followed by size-exclusion chromatography coupled with mass spectrometry have been combined. Potassium impregnated in cellulose suppresses the formation of anhydrosugars, reduces the formation of mobile protons, and gives rise to a mainly exothermic signal. The evolution of mobile protons formed from K-impregnated cellulose has a very similar pattern to the evolution of the mass loss rate. This methodology has also been applied to analyze miscanthus, demineralized miscanthus, miscanthus re-impregnated with potassium after demineralization, raw oak, and Douglas fir. Hydrogen mobility and transfer are of high importance in the mechanisms of biomass pyrolysis. PMID:26990591

  2. Analytical-scale separations of lanthanides : a review of techniques and fundamentals.

    SciTech Connect

    Nash, K. L.; Jensen, M. P.

    1999-10-27

    Separations chemistry is at the heart of most analytical procedures to determine the rare earth content of both man-made and naturally occurring materials. Such procedures are widely used in mineral exploration, fundamental geology and geochemistry, material science, and in the nuclear industry. Chromatographic methods that rely on aqueous solutions containing complexing agents sensitive to the lanthanide cationic radius and cation-exchange phase transfer reactions (using a variety of different solid media) have enjoyed the greatest success for these procedures. In this report, the most important methods for completing such analyses are briefly summarized. The basic aqueous (and two-phase) solution chemistry that accounts for the separations that work well is considered in some detail, and explanations are offered for why others are less successful.

  3. Analytical and experimental evaluation of techniques for the fabrication of thermoplastic hologram storage devices

    NASA Technical Reports Server (NTRS)

    Rogers, J. W.

    1975-01-01

    The results of an experimental investigation on recording information on thermoplastic are given. A description was given of a typical fabrication configuration, the recording sequence, and the samples which were examined. There are basically three configurations which can be used for the recording of information on thermoplastic. The most popular technique uses corona which furnishes free charge. The necessary energy for deformation is derived from a charge layer atop the thermoplastic. The other two techniques simply use a dc potential in place of the corona for deformation energy.

  4. Artificial Intelligence based technique for BTS placement

    NASA Astrophysics Data System (ADS)

    Alenoghena, C. O.; Emagbetere, J. O.; Aibinu, A. M.

    2013-12-01

    The increase of base transceiver stations (BTS) in most urban areas can be traced to the drive by network providers to meet demand for coverage and capacity. In traditional network planning, the final decision on BTS placement is taken by a team of radio planners, and this decision is not foolproof against regulatory requirements. In this paper, an intelligence-based algorithm for optimal BTS site placement is proposed. The proposed technique takes neighbour and regulatory constraints into consideration objectively while determining the cell site. Its application leads to a quantitatively evaluated, unbiased decision-making process in BTS placement. Experimental data for a 2 km by 3 km territory were simulated to test the new algorithm; the results show 100% performance of the neighbour-constrained algorithm in BTS placement optimization. Results on the application of a GA with the neighbourhood constraint indicate that the choice of location can be unbiased and that optimization of facility placement for network design can be carried out.
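
    A minimal genetic-algorithm sketch of this kind of placement problem: candidate site sets are scored by covered demand with a penalty for violating a minimum inter-site (neighbour) spacing, then evolved by selection, one-point crossover, and mutation. The territory, demand grid, and GA parameters are all hypothetical, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical territory discretised to a demand grid (arbitrary units).
demand = rng.random((20, 30))
n_sites, radius, min_sep = 4, 5, 6          # in grid cells
pop_size, generations = 60, 200

def fitness(sites):
    """Total demand within `radius` of any site, penalised for spacing violations."""
    yy, xx = np.mgrid[0:demand.shape[0], 0:demand.shape[1]]
    covered = np.zeros(demand.shape, dtype=bool)
    for (r, c) in sites:
        covered |= (yy - r) ** 2 + (xx - c) ** 2 <= radius ** 2
    penalty = sum(np.hypot(a[0] - b[0], a[1] - b[1]) < min_sep
                  for i, a in enumerate(sites) for b in sites[i + 1:])
    return demand[covered].sum() - 10.0 * penalty

def random_individual():
    return [(rng.integers(demand.shape[0]), rng.integers(demand.shape[1]))
            for _ in range(n_sites)]

population = [random_individual() for _ in range(pop_size)]
for _ in range(generations):
    survivors = sorted(population, key=fitness, reverse=True)[:pop_size // 2]
    children = []
    while len(children) < pop_size - len(survivors):
        p1, p2 = rng.choice(len(survivors), 2, replace=False)
        cut = rng.integers(1, n_sites)
        child = survivors[p1][:cut] + survivors[p2][cut:]          # one-point crossover
        if rng.random() < 0.3:                                     # mutation: move one site
            child[rng.integers(n_sites)] = (rng.integers(demand.shape[0]),
                                            rng.integers(demand.shape[1]))
        children.append(child)
    population = survivors + children

best = max(population, key=fitness)
print("best BTS placement:", best, "fitness: %.2f" % fitness(best))
```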

  5. A Comparative Analysis of Two Order Analytic Techniques: Assessing Item Hierarchies in Real and Simulated Data.

    ERIC Educational Resources Information Center

    Chevalaz, Gerard M.; Tatsuoka, Kikumi K.

    Two order-theoretic techniques were presented and compared. The ordering theory of Krus and Bart (1974) and Takeya's item relational structure (IRS) analysis, as extended by Tatsuoka and Tatsuoka (1981), were used to extract the hierarchical item structure from three datasets. Directed graphs were constructed and both methods were assessed as to how…

  6. Classification of user interfaces for graph-based online analytical processing

    NASA Astrophysics Data System (ADS)

    Michaelis, James R.

    2016-05-01

    In the domain of business intelligence, user-oriented software for conducting multidimensional analysis via Online- Analytical Processing (OLAP) is now commonplace. In this setting, datasets commonly have well-defined sets of dimensions and measures around which analysis tasks can be conducted. However, many forms of data used in intelligence operations - deriving from social networks, online communications, and text corpora - will consist of graphs with varying forms of potential dimensional structure. Hence, enabling OLAP over such data collections requires explicit definition and extraction of supporting dimensions and measures. Further, as Graph OLAP remains an emerging technique, limited research has been done on its user interface requirements. Namely, on effective pairing of interface designs to different types of graph-derived dimensions and measures. This paper presents a novel technique for pairing of user interface designs to Graph OLAP datasets, rooted in Analytic Hierarchy Process (AHP) driven comparisons. Attributes of the classification strategy are encoded through an AHP ontology, developed in our alternate work and extended to support pairwise comparison of interfaces. Specifically, according to their ability, as perceived by Subject Matter Experts, to support dimensions and measures corresponding to Graph OLAP dataset attributes. To frame this discussion, a survey is provided both on existing variations of Graph OLAP, as well as existing interface designs previously applied in multidimensional analysis settings. Following this, a review of our AHP ontology is provided, along with a listing of corresponding dataset and interface attributes applicable toward SME recommendation structuring. A walkthrough of AHP-based recommendation encoding via the ontology-based approach is then provided. The paper concludes with a short summary of proposed future directions seen as essential for this research area.
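
    A minimal AHP sketch, assuming a hypothetical 3x3 pairwise comparison matrix of interface candidates: priorities are the normalized principal eigenvector, and the consistency ratio checks whether the SME judgements are usable.

```python
import numpy as np

# Hypothetical SME pairwise judgements for three candidate interfaces (Saaty 1-9 scale):
# A[i, j] = how strongly interface i is preferred over interface j for a given Graph OLAP measure.
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                        # AHP priority vector

# Consistency check: CI = (lambda_max - n) / (n - 1), compared against Saaty's random index.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = 0.58                                       # random index for n = 3
print("priorities:", np.round(weights, 3), "consistency ratio: %.3f" % (ci / ri))
```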

  7. Analytical method for the identification and assay of 12 phthalates in cosmetic products: application of the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques".

    PubMed

    Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine

    2012-08-31

    Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to the raw materials used, or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization mode (EI). The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, as well as precision and accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetic analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly adapted when it is not possible to make reconstituted sample matrices.
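
    A sketch of the quantification step described above (external calibration with an internal standard): fit the peak-area ratio against standard concentration, then invert for the unknown and apply the sample-preparation dilution. All numbers are hypothetical.

```python
import numpy as np

# Hypothetical calibration standards for one phthalate (µg/mL) and measured
# peak-area ratios analyte/internal-standard from the GC/MS runs.
conc = np.array([0.5, 1.0, 2.0, 3.0, 5.0])
area_ratio = np.array([0.102, 0.198, 0.405, 0.598, 1.010])

slope, intercept = np.polyfit(conc, area_ratio, 1)      # linear calibration y = a*c + b

# Quantify an unknown sample from its measured area ratio, then correct for
# the dilution applied during sample preparation.
unknown_ratio = 0.33
dilution_factor = 50.0                                   # hypothetical mL of solvent per g of cosmetic
conc_in_vial = (unknown_ratio - intercept) / slope       # µg/mL in the injected solution
content = conc_in_vial * dilution_factor                 # µg of phthalate per g of sample
print("phthalate content: %.1f µg/g" % content)
```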

  8. An uncovered XIII century icon: particular use of organic pigments and gilding techniques highlighted by analytical methods.

    PubMed

    Daveri, Alessia; Doherty, Brenda; Moretti, Patrizia; Grazia, Chiara; Romani, Aldo; Fiorin, Enrico; Brunetti, Brunetto Giovanni; Vagnini, Manuela

    2015-01-25

    The restoration of a panel painting depicting a Madonna and Child, listed as by an unknown Tuscan artist of the nineteenth century, permitted the hidden original version, a XIII century Medieval icon, to be uncovered. Its discovery provided the opportunity for an extensive in situ campaign of non-invasive analytical investigations by portable imaging and spectroscopic techniques (infrared, X-ray fluorescence and diffraction, UV-Vis absorption and emission), followed by targeted micro-destructive investigations (Raman and SEM-EDS). This approach permitted characterization of the original ground and paint layers by complementary techniques. Furthermore, this protocol allowed supplementary particularities of great interest to be highlighted. Namely, numerous original gilding techniques have been identified in diverse areas, including the use of surrogate gold (tin disulphide), orpiment as a further false gold, and an area with an original silver-rich layer. Moreover, pigments including azurite mixed with indigo have been non-invasively identified. Micro-invasive analyses also allowed the identification of organic colorants, namely an animal anthraquinone lake, kermes, and an unusual vegetal chalcone pigment, possibly safflower. The identification of the latter is extremely rare as a painting pigment and was achieved using an innovative adaptation of surface-enhanced Raman techniques on a cross-section. The resulting data contribute new hypotheses to the historic and artistic knowledge of materials and techniques utilized in XIII century icon paintings and ultimately provide scientific technical support for the recent restoration. PMID:25105261

  9. An uncovered XIII century icon: Particular use of organic pigments and gilding techniques highlighted by analytical methods

    NASA Astrophysics Data System (ADS)

    Daveri, Alessia; Doherty, Brenda; Moretti, Patrizia; Grazia, Chiara; Romani, Aldo; Fiorin, Enrico; Brunetti, Brunetto Giovanni; Vagnini, Manuela

    2015-01-01

    The restoration of a panel painting depicting a Madonna and Child, listed as by an unknown Tuscan artist of the nineteenth century, permitted the hidden original version, a XIII century Medieval icon, to be uncovered. Its discovery provided the opportunity for an extensive in situ campaign of non-invasive analytical investigations by portable imaging and spectroscopic techniques (infrared, X-ray fluorescence and diffraction, UV-Vis absorption and emission), followed by targeted micro-destructive investigations (Raman and SEM-EDS). This approach permitted characterization of the original ground and paint layers by complementary techniques. Furthermore, this protocol allowed supplementary particularities of great interest to be highlighted. Namely, numerous original gilding techniques have been identified in diverse areas, including the use of surrogate gold (tin disulphide), orpiment as a further false gold, and an area with an original silver-rich layer. Moreover, pigments including azurite mixed with indigo have been non-invasively identified. Micro-invasive analyses also allowed the identification of organic colorants, namely an animal anthraquinone lake, kermes, and an unusual vegetal chalcone pigment, possibly safflower. The identification of the latter is extremely rare as a painting pigment and was achieved using an innovative adaptation of surface-enhanced Raman techniques on a cross-section. The resulting data contribute new hypotheses to the historic and artistic knowledge of materials and techniques utilized in XIII century icon paintings and ultimately provide scientific technical support for the recent restoration.

  10. Use of the Relaxometry Technique for Quantification of Paramagnetic Ions in Aqueous Solutions and a Comparison with Other Analytical Methods

    PubMed Central

    Burato, Juliana Soares da Silva; Silva Lobo, Carlos Manuel; Colnago, Luiz Alberto

    2016-01-01

    We have demonstrated that the relaxometry technique is very efficient to quantify paramagnetic ions during in situ electrolysis measurements. Therefore, the goal of this work was to validate the relaxometry technique in the determination of the concentration of the ions contained in electrolytic solutions, Cu²⁺, Ni²⁺, Cr³⁺, and Mn²⁺, and compare it with other analytical methods. Two different NMR spectrometers were used: a commercial spectrometer with a homogeneous magnetic field and a home-built unilateral sensor with an inhomogeneous magnetic field. Without pretreatment, manganese ions do not have absorption bands in the UV-Visible region, but it is possible to quantify them using relaxometry (the limit of quantification is close to 10⁻⁵ mol L⁻¹). Therefore, since the technique does not require chemical indicators and is a cheap and robust method, it can be used as a replacement for some conventional quantification techniques. The relaxometry technique could be applied to evaluate the corrosion of metallic surfaces. PMID:27293437
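
    The calibration idea can be sketched simply: the longitudinal relaxation rate varies linearly with paramagnetic ion concentration, R1 = R1,water + r1·C, so a straight-line fit of 1/T1 versus concentration converts a measured T1 into a concentration. The T1 values and resulting relaxivity below are illustrative only.

```python
import numpy as np

# Hypothetical calibration: measured T1 (s) for known Mn2+ concentrations (mol/L).
conc = np.array([0.0, 1e-4, 2e-4, 5e-4, 1e-3])
t1 = np.array([2.50, 1.19, 0.78, 0.40, 0.22])

r1_rates = 1.0 / t1                               # relaxation rates R1 = 1/T1 (s^-1)
relaxivity, r1_water = np.polyfit(conc, r1_rates, 1)

# Concentration of an unknown electrolytic solution from its measured T1.
t1_unknown = 0.55
c_unknown = (1.0 / t1_unknown - r1_water) / relaxivity
print("relaxivity: %.0f L mol^-1 s^-1, estimated [Mn2+]: %.2e mol/L" % (relaxivity, c_unknown))
```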

  11. The influence of surface chemistry on GSR particles: using XPS to complement SEM/EDS analytical techniques

    NASA Astrophysics Data System (ADS)

    Schwoeble, A. J.; Strohmeier, Brian R.; Piasecki, John D.

    2010-06-01

    Gunshot residue particles (GSR) were examined using scanning electron microscopy/energy dispersive X-ray spectroscopy (SEM/EDS) to illustrate the size, shape, morphology, and elemental composition normally observed in particulate resulting from a discharged firearm. Determining the presence of lead (Pb), antimony (Sb), and barium (Ba), barring other elemental tags, fused together in a single particle with the correct morphology, is all that is required for the positive identification of GSR. X-ray photoelectron spectroscopy (XPS), however, can reveal more detailed information on surface chemistry than SEM/EDS. XPS is a highly surface-sensitive (<= ~10 nm), non-destructive, analytical technique that provides qualitative information for all elements except hydrogen and helium. Nanometer-scale sampling depth and its ability to provide unique chemical state information make XPS a potential technique for providing important knowledge on the surface chemistry of GSR that complements results obtained from SEM/EDS analysis.

  12. Analytical techniques for identification and study of organic matter in returned lunar samples

    NASA Technical Reports Server (NTRS)

    Burlingame, A. L.

    1974-01-01

    The results of geochemical research are reviewed. Emphasis is placed on the contribution of mass spectrometric data to the solution of specific structural problems. Information on the mass spectrometric behavior of compounds of geochemical interest is reviewed and currently available techniques of particular importance to geochemistry, such as gas chromatograph-mass spectrometer coupling, modern sample introduction methods, and computer application in high resolution mass spectrometry, receive particular attention.

  13. An analytical investigation of NOx control techniques for methanol fueled spark ignition engines

    NASA Technical Reports Server (NTRS)

    Browning, L. H.; Argenbright, L. A.

    1983-01-01

    A thermokinetic SI engine simulation was used to study the effects of simple nitrogen oxide control techniques on performance and emissions of a methanol fueled engine. As part of this simulation, a ring crevice storage model was formulated to predict UBF emissions. The study included spark retard, two methods of compression ratio increase and EGR. The study concludes that use of EGR in high turbulence, high compression engines will both maximize power and thermal efficiency while minimizing harmful exhaust pollutants.

  14. Computational Diagnostic Techniques for Electromagnetic Scattering: Analytical Imaging, Near Fields, and Surface Currents

    NASA Technical Reports Server (NTRS)

    Hom, Kam W.; Talcott, Noel A., Jr.; Shaeffer, John

    1997-01-01

    This paper presents three techniques, and the graphics implementations thereof, which can be used as diagnostic aids in the design and understanding of scattering structures: imaging, near fields, and surface current displays. The imaging analysis is a new bistatic k-space approach which has potential for much greater information than standard experimental approaches. The near-field and current analyses are implementations of standard theory, while the diagnostic graphics displays are implementations exploiting recent computer engineering workstation graphics libraries.

  15. Mass Spectrometry as a Powerful Analytical Technique for the Structural Characterization of Synthesized and Natural Products

    NASA Astrophysics Data System (ADS)

    Es-Safi, Nour-Eddine; Essassi, El Mokhtar; Massoui, Mohamed; Banoub, Joseph

    Mass spectrometry is an important tool for the identification and structural elucidation of natural and synthesized compounds. Its high sensitivity and the possibility of coupling liquid chromatography with mass spectrometry detection make it a technique of choice for the investigation of complex mixtures like raw natural extracts. The mass spectrometer is a universal detector that can achieve very high sensitivity and provide information on the molecular mass. More detailed information can be subsequently obtained by resorting to collision-induced dissociation tandem mass spectrometry (CID-MS/MS). In this review, the application of mass spectrometric techniques for the identification of natural and synthetic compounds is presented. The gas-phase fragmentation patterns of a series of four natural flavonoid glycosides, three synthesized benzodiazepines and two synthesized quinoxalinone derivatives were investigated using electrospray ionization mass spectrometry (ESI-MS) and tandem mass spectrometry techniques. Accurate masses were measured using a moderate-resolution quadrupole orthogonal time-of-flight (QqTOF-MS/MS) hybrid mass spectrometer. Confirmation of the molecular masses and the chemical structures of the studied compounds was achieved by exploring the gas-phase breakdown routes of the ionized molecules. This was rationalized by conducting low-energy collision CID-MS/MS analyses (product ion and precursor ion scans) using a conventional quadrupole-hexapole-quadrupole (QhQ) tandem mass spectrometer.

  16. U-Pb ages of zircon rims: A new analytical method using the air-abrasion technique

    USGS Publications Warehouse

    Aleinikoff, J.N.; Winegarden, D.L.; Walter, M.

    1990-01-01

    We present a new technique for directly dating, by conventional methods, the rims of zircons. Several circumstances, such as a xenocrystic or inherited component in igneous zircon and metamorphic overgrowths on igneous cores, can result in grains with physically distinct age components. Pneumatic abrasion has previously been shown by Krogh to remove overgrowths and damaged areas of zircon, leaving more resistant and isotopically less disturbed parts available for analysis. A new abrader design, which is capable of very gently grinding only the tips and interfacial edges of even needle-like grains, permits easy collection of the abraded material for dating. Five examples demonstrate the utility of the "dust-collecting" technique, including two studies that compare conventional, ion microprobe and abrader data. Common Pb may be strongly concentrated in the outermost zones of many zircons, and this Pb is not easily removed by leaching (even in weak HF). Thus, the benefit of removing only the outermost zones (and avoiding mixing of age components) is somewhat compromised by the much higher common Pb contents, which result in less precise age determinations. A very brief abrasion is therefore used to remove the high common Pb zones prior to collection of material for dating.
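
    For context, the age computation itself follows directly from the decay equation t = ln(1 + ²⁰⁶Pb*/²³⁸U)/λ₂₃₈; the sketch below uses the standard decay constants with hypothetical, common-Pb-corrected ratios for an abraded rim and a core.

```python
import numpy as np

# Uranium decay constants (yr^-1), Jaffey et al. (1971).
LAMBDA_238 = 1.55125e-10
LAMBDA_235 = 9.8485e-10

def age_206_238(ratio):
    """Age (Ma) from a radiogenic 206Pb*/238U ratio: t = ln(1 + ratio) / lambda238."""
    return np.log1p(ratio) / LAMBDA_238 / 1e6

def age_207_235(ratio):
    """Age (Ma) from a radiogenic 207Pb*/235U ratio."""
    return np.log1p(ratio) / LAMBDA_235 / 1e6

# Hypothetical common-Pb-corrected analyses of abraded rim material and the remaining core.
print("rim : %.0f Ma" % age_206_238(0.0672))
print("core: %.0f Ma" % age_206_238(0.1812))
```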

  17. Re-Paying Attention to Visitor Behavior: A Re-Analysis using Meta-Analytic Techniques.

    PubMed

    Castro, Yone; Botella, Juan; Asensio, Mikel

    2016-01-01

    The present study describes a meta-analytic review of museum visitors' behavior. Although there is a large number of visitor studies available, their cumulative importance has not been determined due to the lack of rigorous methods to determine common causes of visitors' behaviors. We analyzed Serrell's (1998) database of 110 studies, defining a number of variables that measure visitors' behaviors in exhibition spaces which exceeded the most typical and obvious ones. We defined four indexes of effect size and obtained their combined estimates: average time per feature [ATF● = 0.43 (0.49; 0.37)], percentage of diligent visitors [dv● = 30% (0.39; 0.23)], inverse of velocity [Iv● = 4.07 min/100m2 (4.55; 3.59)], and stops per feature [SF● = 0.35 (0.38; 0.33)], and we analyzed the role of relevant moderating variables. Key findings indicate, for example, that the visiting time for each display element relates to the size of the exhibition and its newness, and visitor walking speed is higher in large exhibit areas. The indexes obtained in this study can be understood as references to be used for comparison with new evaluations. They may help to predict people's behavior and appreciation of new exhibitions, identifying important problems in museum designs, and providing new research tools for this field. PMID:27319781

  18. A polarization-based Thomson scattering technique for burning plasmas

    NASA Astrophysics Data System (ADS)

    Parke, E.; Mirnov, V. V.; Den Hartog, D. J.

    2014-02-01

    The traditional Thomson scattering diagnostic is based on measurement of the wavelength spectrum of scattered light, where electron temperature measurements are inferred from thermal broadening of the spectrum. At sufficiently high temperatures, especially those predicted for ITER and other burning plasmas, relativistic effects cause a change in the degree of polarization (P) of the scattered light; for fully polarized incident laser light, the scattered light becomes partially polarized. The resulting reduction of polarization is temperature dependent and has been proposed by other authors as a potential alternative to the traditional spectral decomposition technique. Following the previously developed Stokes vector approach, we analytically calculate the degree of polarization for incoherent Thomson scattering. For the first time, we obtain exact results valid for the full range of incident laser polarization states, scattering angles, and electron temperatures. While previous work focused only on linear polarization, we show that circularly polarized incident light optimizes the degree of depolarization for a wide range of temperatures relevant to burning plasmas. We discuss the feasibility of a polarization based Thomson scattering diagnostic for ITER-like plasmas with both linearly and circularly polarized light and compare to the traditional technique.
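
    A sketch of the measurement principle: the degree of polarization is computed from the Stokes vector of the collected light, P = sqrt(Q² + U² + V²)/I, and the electron temperature is read off a precomputed P(Te) curve for the diagnostic geometry. The calibration values below are illustrative placeholders, not the exact relativistic results.

```python
import numpy as np

def degree_of_polarization(stokes):
    """P = sqrt(Q^2 + U^2 + V^2) / I for a measured Stokes vector (I, Q, U, V)."""
    i, q, u, v = stokes
    return np.sqrt(q ** 2 + u ** 2 + v ** 2) / i

# Hypothetical calibration curve P(Te) that would come from the full relativistic
# scattering calculation for a fixed scattering angle and incident polarization.
te_grid = np.array([1.0, 5.0, 10.0, 20.0, 40.0])      # keV
p_grid = np.array([0.995, 0.975, 0.952, 0.910, 0.840])

measured = (1.00, 0.93, 0.10, 0.05)                    # synthetic Stokes measurement
p_meas = degree_of_polarization(measured)
te_est = np.interp(p_meas, p_grid[::-1], te_grid[::-1])  # invert the monotone P(Te) curve
print("measured P = %.3f -> estimated Te = %.1f keV" % (p_meas, te_est))
```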

  19. An insight-based longitudinal study of visual analytics.

    PubMed

    Saraiya, Purvi; North, Chris; Lam, Vy; Duca, Karen A

    2006-01-01

    Visualization tools are typically evaluated in controlled studies that observe the short-term usage of these tools by participants on preselected data sets and benchmark tasks. Though such studies provide useful suggestions, they miss the long-term usage of the tools. A longitudinal study of a bioinformatics data set analysis is reported here. The main focus of this work is to capture the entire analysis process that an analyst goes through from a raw data set to the insights sought from the data. The study provides interesting observations about the use of visual representations and interaction mechanisms provided by the tools, and also about the process of insight generation in general. This deepens our understanding of visual analytics, guides visualization developers in creating more effective visualization tools in terms of user requirements, and guides evaluators in designing future studies that are more representative of insights sought by users from their data sets.

  20. Proposal for an analytical sequence aimed at establishing stucco's composition and technique used: research on samples collected in southern Switzerland

    NASA Astrophysics Data System (ADS)

    Cavallo, Giovanni; Moresi, Marco

    2005-06-01

    The paper presents the results obtained using different analytical techniques (optical and electron microscopy, infrared spectroscopy, powder X-ray diffraction, X-ray fluorescence, microanalysis) on stucco samples collected in churches and historical buildings in Canton Ticino and Canton Grigioni (southern Switzerland). The research is principally oriented towards establishing the most effective analytical sequence for characterizing the materials and techniques used in making stuccoes, in order to meet restoration requirements. Plastic decorations (stuccoes of the 17th and 18th centuries), imitation-marble vertical surfaces - stucco lustro - (19th century) and decorative elements in stucco lustro (17th century) were studied. The experimental data showed the same bottom layer for all the samples; different categories of stucco are distinguishable by observing the characteristics of the finishing layer. Petrographic examination and infrared spectroscopy constitute a suitable survey sequence for samples of millimetric size (a sampling criterion that is minimally invasive yet highly representative), given that it is usually necessary to separate mechanically the different parts of the same material, for example the bottom layer and the finishing layer, in order to detect organic compounds in each layer. More significant results can be obtained by employing electron microscopy and microanalysis on the same polished thin section used for optical examination. Mineralogical and chemical analyses by X-ray diffraction and X-ray fluorescence require larger samples, but they provide more complete and representative information, identifying compounds related to alteration processes and/or previous restoration interventions.

  1. Analytical technique to address terrorist threats by chemical weapons of mass destruction

    NASA Astrophysics Data System (ADS)

    Dempsey, Patrick M.

    1997-01-01

    Terrorism is no longer an issue without effect on the American mind. We now live with the same concerns and fears that have long been commonplace in other developed and third-world countries. Citizens of other countries have long lived with the specter of terrorism, and now the U.S. needs to be concerned about and prepared for terrorist activities. The terrorist has the ability to cause great destructive effects by focusing on unaware and unprepared civilian populations. Attacks can range from simple explosives to sophisticated nuclear, chemical and biological weapons. Intentional releases of hazardous chemicals or chemical warfare agents pose a great threat because of their ready availability and/or ease of production, and their ability to cause widespread damage. As this battlefront changes from defined conflicts and enemies to unnamed terrorists, we must implement the proper analytical tools to provide a fast and efficient response. Each chemical used in a terrorist weapon leaves behind a chemical signature that can be used to identify the materials involved and possibly lead investigators to the source and to those responsible. New tools providing fast and accurate detection of battlefield chemical and biological agent attacks are emerging. Gas chromatography/mass spectrometry (GC/MS) is one of these tools and has found increasing use by the military in responding to chemical agent attacks. As the technology becomes smaller and more portable, it can be used by law enforcement personnel to identify suspected terrorist releases and to help prepare the response: define contaminated areas for evacuation and safety concerns, identify the proper treatment of exposed or affected civilians, and suggest decontamination and cleanup procedures.

  2. A quality control technique based on UV-VIS absorption spectroscopy for tequila distillery factories

    NASA Astrophysics Data System (ADS)

    Barbosa Garcia, O.; Ramos Ortiz, G.; Maldonado, J. L.; Pichardo Molina, J.; Meneses Nava, M. A.; Landgrave, Enrique; Cervantes, M. J.

    2006-02-01

    A low-cost technique based on UV-VIS absorption spectroscopy is presented for the quality control of the spirit drink known as tequila. It is shown that such spectra offer enough information to discriminate a given spirit drink from a group of bottled commercial tequilas. The technique was applied to white tequilas. In contrast to reference analytical methods, such as chromatography, this technique requires neither special personnel training nor sophisticated instrumentation. By using hand-held instrumentation, the technique can be applied in situ during the production process.
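
    One way such discrimination is commonly done (a sketch under assumed, synthetic spectra, not the authors' exact procedure) is to build a PCA model from a library of authentic UV-VIS spectra and flag a new bottle by its distance to the reference cluster in score space.

```python
import numpy as np

rng = np.random.default_rng(4)

wavelengths = np.linspace(250, 450, 120)                # nm
def spectrum(center, width, amp, noise=0.01):
    """Synthetic absorption spectrum: one Gaussian band plus measurement noise."""
    return amp * np.exp(-0.5 * ((wavelengths - center) / width) ** 2) \
           + noise * rng.normal(size=wavelengths.size)

# Reference library: 20 authentic white-tequila spectra (hypothetical band parameters).
reference = np.array([spectrum(280 + rng.normal(0, 2), 25, 1.0) for _ in range(20)])

# PCA on the mean-centred library via SVD; keep the first two components.
mean = reference.mean(axis=0)
_, _, vt = np.linalg.svd(reference - mean, full_matrices=False)
pcs = vt[:2]
scores_ref = (reference - mean) @ pcs.T

def distance_to_reference(sample):
    """Distance of a sample's PCA scores to the reference cluster, in std units."""
    score = (sample - mean) @ pcs.T
    return np.sqrt((((score - scores_ref.mean(0)) / scores_ref.std(0)) ** 2).sum())

authentic = spectrum(281, 25, 1.0)
suspect = spectrum(310, 30, 0.8)                        # atypical band position/intensity
print("authentic distance: %.1f, suspect distance: %.1f" % (
      distance_to_reference(authentic), distance_to_reference(suspect)))
```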

  3. An accelerated photo-magnetic imaging reconstruction algorithm based on an analytical forward solution and a fast Jacobian assembly method

    NASA Astrophysics Data System (ADS)

    Nouizi, F.; Erkol, H.; Luk, A.; Marks, M.; Unlu, M. B.; Gulsen, G.

    2016-10-01

    We previously introduced photo-magnetic imaging (PMI), an imaging technique that illuminates the medium under investigation with near-infrared light and measures the induced temperature increase using magnetic resonance thermometry (MRT). Using a multiphysics solver combining photon migration and heat diffusion, PMI models the spatiotemporal distribution of temperature variation and recovers high-resolution optical absorption images from these temperature maps. In this paper, we present a new fast non-iterative reconstruction algorithm for PMI. This new algorithm uses analytic methods during the resolution of the forward problem and the assembly of the sensitivity matrix. We validate our new analytic-based algorithm against the first-generation finite element method (FEM) based reconstruction algorithm previously developed by our team. The validation is performed using first synthetic data and afterwards real MRT-measured temperature maps. Our new method accelerates the reconstruction process 30-fold when compared to a single iteration of the FEM-based algorithm.
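
    For orientation, the non-iterative inverse step in this family of methods amounts to a regularized linear solve: given an assembled sensitivity (Jacobian) matrix J and the measured temperature change ΔT, the absorption update is Δμa = (JᵀJ + λI)⁻¹ Jᵀ ΔT. The sketch below uses a synthetic Jacobian, not the authors' analytic forward model.

```python
import numpy as np

rng = np.random.default_rng(5)

n_meas, n_vox = 400, 100                 # MRT temperature samples, absorption voxels
# Synthetic sensitivity matrix with depth-decaying sensitivity (purely illustrative).
J = rng.random((n_meas, n_vox)) * np.exp(-np.linspace(0, 3, n_vox))

true_mua = np.zeros(n_vox)
true_mua[40:45] = 0.02                   # small absorption perturbation (mm^-1)
dT = J @ true_mua + 1e-4 * rng.normal(size=n_meas)   # simulated temperature change + noise

# One-step Tikhonov-regularized solve (non-iterative, as favoured when J is cheap to assemble).
lam = 1e-3 * np.trace(J.T @ J) / n_vox
recon = np.linalg.solve(J.T @ J + lam * np.eye(n_vox), J.T @ dT)

print("recovered peak voxel:", int(np.argmax(recon)), "value: %.4f mm^-1" % recon.max())
```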

  4. A pass planning method for multi-hit stretching of heavy forgings by integration of a semi-analytical technique and degrees-reduced finite element

    NASA Astrophysics Data System (ADS)

    Cui, Zhenshan; Chen, Wen; Sui, Dashan; Liu, Juan

    2013-05-01

    A pass planning method for the multi-pass, multi-hit stretching of heavy forgings is proposed, composed of a semi-analytical procedure and a degrees-reduced finite element code. The semi-analytical procedure is based on a kinematically admissible velocity field and the Markov variational principle, and can be applied to roughly calculate the deformed shape and working force in the stretch forging of work-pieces whose cross-sections have vertical and lateral lines of symmetry. Meanwhile, in order to obtain the distributions of metal flow, temperature, strain and stress in detail, a degrees-reduced thermo-mechanically coupled rigid finite element code is developed. In this code, the instantaneous deformation zone is extracted from the total domain and simulated for metal flow, while the total domain is used to simulate the evolution of the thermal field. Taking the semi-analytical method as the solver, the pass planning procedure for stretch forging is developed, and the degrees-reduced finite element code is used as a supplement to check the rationality of the planned pass schedule. An example is presented to demonstrate the application of the proposed technique.

  5. Characterization of Some Iraqi Archaeological Samples Using IBA, Analytical X-ray and Other Complementary Techniques

    NASA Astrophysics Data System (ADS)

    Shihab Al-Sarraj, Ziyad; Roumie, Mohamad; Damboos, Hassan I.

    2012-07-01

    The present work aimed at investigating the compositions and microstructures of archaeological samples dating back to various periods of the ancient Iraqi civilizations using PIXE, XRF, XRD, and SEM techniques. The objects selected for the study (ceramics, glaze, etc.) were diverse in size and nature; therefore, a limited number of samples were cut from them with a small diamond wheel. A conventional powder metallurgy method was then used to prepare the samples. Dried samples were coated with a thin layer of carbon and analyzed using the ion beam accelerator of the LAEC. Three other groups of samples were also prepared for analysis by X-ray fluorescence (XRF), X-ray diffraction (XRD), and scanning electron microscopy (SEM). The chemical composition results showed good agreement between the various techniques, as did the phase analyses, while the fine-structure analysis by optical and scanning electron microscopy revealed a microstructure with intensified densification in the final stage of sintering, accompanied by a quasi-homogeneous distribution of closed pores. This leads to the conclusion that the sintering temperature used by the ancient Iraqis was sufficient, probably in the range of 950-1200°C, and that the mixes and forming methods they used were suitable for obtaining well-sintered bodies with an even distribution of pores. A ring-shaped trace noticed in the SEM micrographs requires further work and study to explain its origin.

  6. Studies of ferroelectric heterostructure thin films and interfaces via in situ analytical techniques.

    SciTech Connect

    Auciello, O.; Dhote, A.; Gao, Y.; Gruen, D. M.; Im, J.; Irene, E. A.; Krauss, A. R.; Mueller, A. H.; Ramesh, R.

    1999-08-30

    The science and technology of ferroelectric thin films has experienced an explosive development during the last ten years. Low-density non-volatile ferroelectric random access memories (NVFRAMs) are now incorporated in commercial products such as "smart cards", while high-permittivity capacitors are incorporated in cellular phones. However, substantial work is still needed to develop materials integration strategies for high-density memories. We have demonstrated that the implementation of complementary in situ characterization techniques is critical to understanding film growth and interface processes, which play critical roles in film microstructure and properties. We are using uniquely integrated time-of-flight ion scattering and recoil spectroscopy (TOF-ISARS) and spectroscopic ellipsometry (SE) techniques to perform in situ, real-time studies of film growth processes at the high background gas pressures required to grow ferroelectric thin films. TOF-ISARS provides information on surface processes, while SE permits the investigation of buried interfaces as they are being formed. Recent studies of SrBi2Ta2O9 (SBT) and BaxSr1-xTiO3 (BST) film growth and interface processes are discussed.

  7. New Insights into Amino Acid Preservation in the Early Oceans using Modern Analytical Techniques

    NASA Astrophysics Data System (ADS)

    Parker, E. T.; Brinton, K. L.; Burton, A. S.; Glavin, D. P.; Dworkin, J. P.; Bada, J.

    2015-12-01

    Protein- and non-protein-amino acids likely occupied the oceans at the time of the origin and evolution of life. Primordial soup-, hydrothermal vent-, and meteoritic-processes likely contributed to this early chemical inventory. Prebiotic synthesis and carbonaceous meteorite studies suggest that non-protein amino acids were likely more abundant than their protein-counterparts. Amino acid preservation before abiotic and biotic destruction is key to biomarker availability in paleoenvironments and remains an important uncertainty. To constrain primitive amino acid lifetimes, a 1992 archived seawater/beach sand mixture was spiked with D,L-alanine, D,L-valine (Val), α-aminoisobutyric acid (α-AIB), D,L-isovaline (Iva), and glycine (Gly). Analysis by high performance liquid chromatography with fluorescence detection (HPLC-FD) showed that only D-Val and non-protein amino acids were abundant after 2250 days. The mixture was re-analyzed in 2012 using HPLC-FD and a triple quadrupole mass spectrometer (QqQ-MS). The analytical results 20 years after the inception of the experiment were strikingly similar to those after 2250 days. To confirm that viable microorganisms were still present, the mixture was re-spiked with Gly in 2012. Aliquots were collected immediately after spiking, and at 5- and 9-month intervals thereafter. Final HPLC-FD/QqQ-MS analyses were performed in 2014. The 2014 analyses revealed that only α-AIB, D,L-Iva, and D-Val remained abundant. The disappearance of Gly indicated that microorganisms still lived in the mixture and were capable of consuming protein amino acids. These findings demonstrate that non-protein amino acids are minimally impacted by biological degradation and thus have very long lifetimes under these conditions. Primitive non-protein amino acids from terrestrial synthesis, or meteorite in-fall, likely experienced greater preservation than protein amino acids in paleo-oceanic environments. Such robust molecules may have reached a steady

  8. New Insights into Amino Acid Preservation in the Early Oceans Using Modern Analytical Techniques

    NASA Technical Reports Server (NTRS)

    Parker, Eric T.; Brinton, Karen L.; Burton, Aaron S.; Glavin, Daniel P.; Dworkin, Jason P.; Bada, Jeffrey L.

    2015-01-01

    Protein- and non-protein-amino acids likely occupied the oceans at the time of the origin and evolution of life. Primordial soup-, hydrothermal vent-, and meteoritic-processes likely contributed to this early chemical inventory. Prebiotic synthesis and carbonaceous meteorite studies suggest that non-protein amino acids were likely more abundant than their protein-counterparts. Amino acid preservation before abiotic and biotic destruction is key to biomarker availability in paleoenvironments and remains an important uncertainty. To constrain primitive amino acid lifetimes, a 1992 archived seawater/beach sand mixture was spiked with D,L-alanine, D,L-valine (Val), alpha-aminoisobutyric acid (alpha-AIB), D,L-isovaline (Iva), and glycine (Gly). Analysis by high performance liquid chromatography with fluorescence detection (HPLC-FD) showed that only D-Val and non-protein amino acids were abundant after 2250 days. The mixture was re-analyzed in 2012 using HPLC-FD and a triple quadrupole mass spectrometer (QqQ-MS). The analytical results 20 years after the inception of the experiment were strikingly similar to those after 2250 days. To confirm that viable microorganisms were still present, the mixture was re-spiked with Gly in 2012. Aliquots were collected immediately after spiking, and at 5- and 9-month intervals thereafter. Final HPLC-FD/QqQ-MS analyses were performed in 2014. The 2014 analyses revealed that only alpha-AIB, D,L-Iva, and D-Val remained abundant. The disappearance of Gly indicated that microorganisms still lived in the mixture and were capable of consuming protein amino acids. These findings demonstrate that non-protein amino acids are minimally impacted by biological degradation and thus have very long lifetimes under these conditions. Primitive non-protein amino acids from terrestrial synthesis, or meteorite in-fall, likely experienced greater preservation than protein amino acids in paleo-oceanic environments. Such robust molecules may have reached a

  9. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying analytical procedures for geoscientists.
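
    A language-level sketch of the map/shuffle/reduce decomposition used by such frameworks, here aggregating synthetic gridded observations into per-cell means; in a real deployment the grouping and scheduling are handled by Hadoop/HBase rather than plain Python.

```python
from collections import defaultdict
from functools import reduce

# Synthetic multi-dimensional records: (latitude band, longitude band, variable value).
records = [(10, 20, 0.5), (10, 20, 0.7), (10, 25, 1.1), (15, 20, 0.9), (15, 20, 1.3)]

def mapper(record):
    """Emit a (grid-cell key, (sum, count)) pair; runs independently on each data chunk."""
    lat, lon, value = record
    return ((lat, lon), (value, 1))

def reducer(acc, item):
    """Combine partial (sum, count) pairs that share the same key."""
    return (acc[0] + item[0], acc[1] + item[1])

# Shuffle phase: group mapped pairs by key (done by the framework on a real cluster).
grouped = defaultdict(list)
for key, value in map(mapper, records):
    grouped[key].append(value)

# Reduce phase: per-cell mean of the observed variable.
cell_means = {}
for key, values in grouped.items():
    total, count = reduce(reducer, values)
    cell_means[key] = total / count
print(cell_means)
```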

  12. Development of analytical techniques to study H2S poisoning of PEMFCs and components

    SciTech Connect

    Brosha, Eric L; Rockward, Tommy; Uribe, Francisco A; Garzon, Fernando H

    2008-01-01

    Polymer electrolyte membrane fuel cells are sensitive to impurities that may be present in either the oxidizer or fuel. H2S, even at the ppb level, will have a dramatic and adverse effect on fuel cell performance. Not only is it important to know a particular material's affinity to adsorb H2S when considering materials for PEMFC applications; issues such as permeation and crossover rates also become extremely important. Several experimental methods have been developed to quantify H2S adsorption onto surfaces and to quantify H2S permeation through Nafion® membranes using readily available and inexpensive Ag/AgS ion probes. In addition to calculating the H2S uptake on commonly used XC-72 carbon supports and Pt/XC-72 catalysts, the H2S permeability through dry and humidified Nafion® PEMFC membranes was also studied using these specialized techniques. In each ion probe experiment performed, a sulfide anti-oxidant buffer solution was used to trap and concentrate trace quantities of H2S during the course of the measurement. Crossover experiments were conducted for up to 24 hours in order to achieve sulfide ion concentrations high enough to be precisely determined by subsequent titration with Pb(NO3)2. By using these techniques, we have confirmed H2S crossover in Nafion® membranes and have calculated preliminary rates of H2S crossover.
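
    As an illustration of how a crossover rate follows from the titration end point, the sketch below converts a hypothetical Pb(NO3)2 titre into a sulfide flux, assuming 1:1 Pb2+:S2- stoichiometry. All numerical values (titre volume, titrant concentration, membrane area) are invented for the example and are not results from this study.

        # Hypothetical worked example: sulfide trapped in the buffer over 24 h,
        # quantified by Pb(NO3)2 titration and converted to a crossover flux.
        titre_volume_L = 0.0012        # assumed titrant volume at the end point
        titrant_conc_M = 0.001         # assumed Pb(NO3)2 concentration, mol/L
        membrane_area_cm2 = 5.0        # assumed exposed membrane area
        duration_s = 24 * 3600.0

        mol_sulfide = titre_volume_L * titrant_conc_M            # mol S2- trapped
        flux = mol_sulfide / (membrane_area_cm2 * duration_s)    # mol cm^-2 s^-1
        print(f"H2S crossover flux ~ {flux:.3e} mol cm^-2 s^-1")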

  13. Web-Based Visual Analytics for Social Media

    SciTech Connect

    Best, Daniel M.; Bruce, Joseph R.; Dowson, Scott T.; Love, Oriana J.; McGrath, Liam R.

    2012-05-20

    Social media provides a rich source of data that reflects current trends and public opinion on a multitude of topics. The data can be harvested from Twitter, Facebook, blogs, and other social applications. The high rate of adoption of social media has created a domain with an ever-expanding volume of data, which makes it difficult to use the raw data for analysis. Information visual analytics is key to drawing out features of interest in social media. The Scalable Reasoning System is an application that couples a back-end server performing analysis algorithms with an intuitive front-end visualization to allow for investigation. We provide a componentized system that can be rapidly adapted to customer needs so that the information they are most interested in is brought to their attention through the application. To this end, we have developed a social media application for use by emergency operations for the city of Seattle that shows current weather and traffic trends, which are important for their tasks.

  14. Behavior-Based Budget Management Using Predictive Analytics

    SciTech Connect

    Troy Hiltbrand

    2013-03-01

    Historically, the mechanisms to perform forecasting have primarily used two common factors as a basis for future predictions: time and money. While time and money are very important aspects of determining future budgetary spend patterns, organizations represent a complex system of unique individuals with a myriad of associated behaviors, and all of these behaviors have bearing on how budget is utilized. When forecasting budgets, it becomes a guessing game how budget managers will behave under a given set of conditions. This becomes relatively messy when human nature is introduced, as different managers will react very differently under similar circumstances. While one manager becomes ultra-conservative during periods of financial austerity, another might be unfazed and continue to spend as they have in the past. Both might revert into a state of budgetary protectionism, masking what is truly happening at the budget holder level, in order to keep as much budget and influence as possible while at the same time sacrificing the greater good of the organization. To predict future outcomes more accurately, the models should consider not only time and money but also the other behavioral patterns that have been observed across the organization. The field of predictive analytics is poised to provide the tools and methodologies organizations need to do just this: capture and leverage behaviors of the past to predict the future.

  15. Conductivity-Based Detection Techniques in Nanofluidic Devices

    PubMed Central

    Harms, Zachary D.; Haywood, Daniel G.; Kneller, Andrew R.

    2016-01-01

    This review covers conductivity detection in fabricated nanochannels and nanopores. Improvements in nanoscale sensing are a direct result of advances in fabrication techniques, which produce devices with channels and pores with reproducible dimensions and in a variety of materials. Analytes of interest are detected by measuring changes in conductance as the analyte accumulates in the channel or passes transiently through the pore. These detection methods take advantage of phenomena enhanced at the nanoscale, such as ion current rectification, surface conductance, and dimensions comparable to the analytes of interest. The end result is the development of sensing technologies for a broad range of analytes, e.g., ions, small molecules, proteins, nucleic acids, and particles. PMID:25988434
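
    A minimal sketch of the transient (resistive-pulse) detection mode described above: translocation-like events are located by thresholding excursions of an ionic-current trace below its open-pore baseline. The trace, the injected blockades, and the five-sigma threshold are synthetic assumptions for illustration.

        # Toy resistive-pulse detection: find current blockades below a baseline threshold.
        import numpy as np

        rng = np.random.default_rng(0)
        baseline = 1.0                                    # normalized open-pore current
        current = baseline + 0.01 * rng.standard_normal(10000)
        for start in (2000, 5500, 8000):                  # inject three synthetic blockades
            current[start:start + 50] -= 0.2

        threshold = baseline - 5 * current.std()
        below = current < threshold
        edges = np.flatnonzero(np.diff(below.astype(int)))  # enter/exit transitions
        events = edges.reshape(-1, 2)                       # (start, end) sample indices
        print(len(events), "translocation-like events detected")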

  16. Integration of Environmental Analytical Chemistry with Environmental Law: The Development of a Problem-Based Laboratory.

    ERIC Educational Resources Information Center

    Cancilla, Devon A.

    2001-01-01

    Introduces an undergraduate level problem-based analytical chemistry laboratory course integrated with an environmental law course. Aims to develop an understanding among students on the use of environmental indicators for environmental evaluation. (Contains 30 references.) (YDS)

  17. Implementation of an analytical verification technique on three building energy-analysis codes: SUNCAT 2.4, DOE 2.1, and DEROB III

    SciTech Connect

    Wortman, D.; O'Doherty, B.; Judkoff, R.

    1981-01-01

    An analytical verification technique for building energy analysis codes has been developed. For this technique, building models are developed that can be both solved analytically and modeled using the analysis codes. The output of the codes is then compared with the analytical solutions. In this way, the accuracy of selected mechanisms in the codes can be verified. The procedure consists of several tests and was run on SUNCAT 2.4, DOE 2.1, and DEROB III. The results are presented and analyzed.
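
    The verification idea, comparing a code's output with a closed-form solution of the same building model, can be sketched as below. A trivial steady-state conduction case (Q = U * A * deltaT) stands in for the analytical benchmark, and simulated_q is a placeholder for a value that would come from SUNCAT, DOE-2, or DEROB.

        # Sketch of analytical verification: compare a code result against a closed-form benchmark.
        def analytical_heat_loss(u_value, area, t_inside, t_outside):
            # Steady-state conduction: Q = U * A * (T_in - T_out), in watts.
            return u_value * area * (t_inside - t_outside)

        benchmark_q = analytical_heat_loss(u_value=0.5, area=20.0, t_inside=21.0, t_outside=-5.0)
        simulated_q = 262.0      # placeholder: value reported by the building-energy code
        relative_error = abs(simulated_q - benchmark_q) / benchmark_q
        print(f"benchmark {benchmark_q:.1f} W, code {simulated_q:.1f} W, error {relative_error:.1%}")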

  18. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  19. Prediction of pressure drop in fluid tuned mounts using analytical and computational techniques

    NASA Astrophysics Data System (ADS)

    Lasher, William C.; Khalilollahi, Amir; Mischler, John; Uhric, Tom

    1993-11-01

    A simplified model for predicting pressure drop in fluid tuned isolator mounts was developed. The model is based on an exact solution to the Navier-Stokes equations and was made more general through the use of empirical coefficients. The values of these coefficients were determined by numerical simulation of the flow using the commercial computational fluid dynamics (CFD) package FIDAP.
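
    As a sketch of how an exact laminar-flow solution can be generalized with empirical coefficients, the snippet below starts from the Hagen-Poiseuille pressure drop and adds a dynamic-pressure correction term; the coefficient values c1 and c2 are illustrative assumptions, not those fitted to the FIDAP simulations.

        # Pressure drop model: exact laminar (Hagen-Poiseuille) term plus an empirical correction.
        import math

        def pressure_drop(flow_rate, length, radius, viscosity, density, c1=1.0, c2=0.0):
            # dP = c1 * 8*mu*L*Q/(pi*r^4) + c2 * 0.5*rho*v^2, with c1 and c2 fitted to CFD
            poiseuille = 8.0 * viscosity * length * flow_rate / (math.pi * radius**4)
            velocity = flow_rate / (math.pi * radius**2)
            return c1 * poiseuille + c2 * 0.5 * density * velocity**2

        dp = pressure_drop(flow_rate=1e-5, length=0.1, radius=2e-3,
                           viscosity=1e-3, density=1000.0, c1=1.1, c2=1.8)
        print(f"predicted pressure drop: {dp:.1f} Pa")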

  20. categoryCompare, an analytical tool based on feature annotations

    PubMed Central

    Flight, Robert M.; Harrison, Benjamin J.; Mohammad, Fahim; Bunge, Mary B.; Moon, Lawrence D. F.; Petruska, Jeffrey C.; Rouchka, Eric C.

    2014-01-01

    Assessment of high-throughput omics data initially focuses on relative or raw levels of a particular feature, such as an expression value for a transcript, protein, or metabolite. At a second level, analyses of annotations, including known or predicted functions and associations of each individual feature, attempt to distill biological context. Most currently available comparative and meta-analysis methods are dependent on the availability of identical features across data sets, and concentrate on determining features that are differentially expressed across experiments, some of which may be considered “biomarkers.” The heterogeneity of measurement platforms and the inherent variability of biological systems confound the search for robust biomarkers indicative of a particular condition. In many instances, however, multiple data sets show involvement of common biological processes or signaling pathways, even though individual features are not commonly measured or differentially expressed between them. We developed a methodology, categoryCompare, for cross-platform and cross-sample comparison of high-throughput data at the annotation level. We assessed the utility of the approach using hypothetical data, as well as by determining similarities and differences in the set of processes in two instances: (1) denervated skin vs. denervated muscle, and (2) colon from Crohn's disease vs. colon from ulcerative colitis (UC). The hypothetical data showed that in many cases comparing annotations gave superior results to comparing only at the gene level. Improved analytical results depended as well on the number of genes included in the annotation term, the amount of noise in relation to the number of genes expressing in unenriched annotation categories, and the specific method by which samples are combined. In the skin vs. muscle denervation comparison, the tissues demonstrated markedly different responses. The Crohn's vs. UC comparison showed gross similarities in inflammatory
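
    A minimal sketch of comparing at the annotation level rather than the gene level: each experiment's hit list is tested for enrichment in annotation categories with a hypergeometric test, and the significant categories are then intersected. The gene-to-category mapping and hit lists are invented; categoryCompare itself is an R/Bioconductor package, so this only illustrates the idea.

        # Toy annotation-level comparison: enrich categories per experiment, then intersect.
        from scipy.stats import hypergeom

        annotations = {                      # hypothetical category -> gene sets
            "inflammation": {"g1", "g2", "g3", "g4"},
            "axon_growth": {"g5", "g6", "g7"},
            "metabolism": {"g8", "g9"},
        }
        universe = {f"g{i}" for i in range(1, 21)}

        def enriched(hits, alpha=0.05):
            out = set()
            for name, members in annotations.items():
                k = len(hits & members)      # hits falling in the category
                p = hypergeom.sf(k - 1, len(universe), len(members), len(hits))
                if p < alpha:
                    out.add(name)
            return out

        exp1_hits = {"g1", "g2", "g3", "g10"}    # differentially expressed, experiment 1
        exp2_hits = {"g2", "g3", "g4", "g11"}    # different genes, experiment 2
        print("shared annotation categories:", enriched(exp1_hits) & enriched(exp2_hits))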

  1. Instruments measuring perceived racism/racial discrimination: review and critique of factor analytic techniques.

    PubMed

    Atkins, Rahshida

    2014-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated whether the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued, and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of unrotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis.

  2. Comparison of analytical techniques for occupational mortality studies with an empirical example. Doctoral thesis

    SciTech Connect

    Amandus, H.

    1982-07-23

    Seven techniques for analyzing the mortality of an occupational cohort in a follow-up study were compared. These were the standardized mortality ratio calculated by the life table and modified life table methods, the logistic model used in the case-control mode, and five multiplicative proportional hazard survival models. The methods were compared empirically using data on 3,726 U.S. male Appalachian bituminous coal miners whose vital status was traced over a 14-year period from 1962 to 1975 and who were examined by the U.S. Public Health Service between 1963 and 1965. The disease outcome considered in the analyses was death from a non-malignant respiratory disease. Exposure to coal mine dust was defined by the number of years worked underground prior to the examination and the ILO radiographic category of the profusion of small opacities peculiar to coalworkers' pneumoconiosis. Risk factors for nonmalignant respiratory disease that were considered were cigarette smoking, age, exposure to other dusts, urban-rural area and geographic region of residence, and race. All risk factors and exposure data were obtained from the examination at the beginning of the 14-year period of follow-up.
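
    The standardized mortality ratio compared in the thesis reduces to a short calculation: observed deaths in the cohort divided by the deaths expected when reference (standard-population) rates are applied to the cohort's person-years in each stratum. The person-years and rates below are invented for illustration.

        # Standardized mortality ratio (indirect standardization), toy numbers.
        observed_deaths = 42
        strata = [
            # (cohort person-years, reference death rate per person-year) for each age band
            (12000.0, 0.0008),
            (9500.0, 0.0015),
            (7000.0, 0.0030),
        ]
        expected = sum(person_years * rate for person_years, rate in strata)
        smr = observed_deaths / expected
        print(f"expected deaths {expected:.1f}, SMR = {smr:.2f}")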

  3. An environmental pressure index proposal for urban development planning based on the analytic network process

    SciTech Connect

    Gomez-Navarro, Tomas; Diaz-Martin, Diego

    2009-09-15

    This paper introduces a new approach to prioritize urban planning projects according to their environmental pressure in an efficient and reliable way. It is based on the combination of three procedures: (i) the use of environmental pressure indicators, (ii) the aggregation of the indicators into an Environmental Pressure Index by means of the Analytic Network Process method (ANP), and (iii) the interpretation of the information obtained from the experts during the decision-making process. The method has been applied to a proposal for urban development of La Carlota airport in Caracas (Venezuela). Three options are currently under evaluation: a Health Club, a Residential Area and a Theme Park. After a selection process the experts chose the following environmental pressure indicators as ANP criteria for the project life cycle: used land area, population density, energy consumption, water consumption and waste generation. By using goal-oriented questionnaires designed by the authors, the experts determined the importance of the criteria, the relationships among criteria, and the relationships between the criteria and the urban development alternatives. The resulting data showed that water consumption is the most important environmental pressure factor, and the Theme Park project is by far the urban development alternative which exerts the least environmental pressure on the area. The participating experts agreed that the technique proposed in this paper is useful and, for rank-ordering these alternatives, an improvement over traditional techniques such as environmental impact studies and life-cycle analysis.
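
    One concrete ANP step, turning the experts' judgments into limiting indicator weights, can be sketched by raising a column-stochastic weighted supermatrix to a high power. The 5x5 matrix below (one row and column per pressure indicator) is invented for illustration and does not reproduce the experts' actual judgments.

        # Sketch of the ANP limit-supermatrix step used to weight the pressure indicators.
        import numpy as np

        indicators = ["land area", "density", "energy", "water", "waste"]
        W = np.array([                 # hypothetical column-stochastic weighted supermatrix
            [0.10, 0.20, 0.15, 0.10, 0.20],
            [0.15, 0.10, 0.20, 0.15, 0.10],
            [0.20, 0.20, 0.15, 0.25, 0.20],
            [0.35, 0.30, 0.30, 0.30, 0.30],
            [0.20, 0.20, 0.20, 0.20, 0.20],
        ])
        limit = np.linalg.matrix_power(W, 50)   # powers converge for a primitive stochastic matrix
        priorities = limit[:, 0]                # any column of the limit matrix
        for name, p in zip(indicators, priorities):
            print(f"{name:10s} {p:.3f}")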

  4. An analytic study of near terminal area optimal sequencing and flow control techniques

    NASA Technical Reports Server (NTRS)

    Park, S. K.; Straeter, T. A.; Hogge, J. E.

    1973-01-01

    Optimal flow control and sequencing of air traffic operations in the near terminal area are discussed. The near terminal area model is based on the assumptions that the aircraft enter the terminal area along precisely controlled approach paths and that the aircraft are segregated according to their near terminal area performance. Mathematical models are developed to support the optimal path generation, sequencing, and conflict resolution problems.

  5. Observer-based controller for nonlinear analytical systems

    NASA Astrophysics Data System (ADS)

    Elloumi, S.; Belhouane, M. M.; Benhadj Braiek, N.

    2016-06-01

    In this paper, we propose to design a polynomial observer-based control for nonlinear systems and to determine sufficient linear matrix inequality (LMI) global stabilisation conditions of the polynomial controlled system augmented by its observer. The design of the observer-based control leverages Kronecker product notation and matrix power properties for the state-space description of polynomial systems. The stability study of the polynomial controlled system augmented by its observer is based on the Lyapunov stability direct method. Intensive simulations are performed to illustrate the validity and the effectiveness of the polynomial approach used to design the control.

  6. Analysis of size-fractionated coal combustion aerosols by PIXE and other analytical techniques

    NASA Astrophysics Data System (ADS)

    Maenhaut, W.; Røyset, O.; Vadset, M.; Kauppinen, E. I.; Lind, T. M.

    1993-04-01

    Particle-induced X-ray emission (PIXE) analysis, instrumental neutron activation analysis (INAA) and inductively coupled plasma mass spectrometry (ICP-MS) were used to study the chemical composition of size-fractionated in-stack fly-ash particles emitted during coal combustion. The samples were collected before the electrostatic precipitator at a gas temperature of 120°C during the combustion of Venezuelan coal in an 81 MW capacity circulating fluidized bed boiler. The sampling device consisted of a Berner low pressure impactor, which was operated with a cyclone precutter. The Nuclepore polycarbonate foils, which were used as collection surfaces in the low pressure impactor, were analyzed by the three techniques and the results of common elements were critically compared. The PIXE results were systematically lower than the INAA data and the percentage difference appeared to be stage-dependent, but virtually independent of the element. The discrepancies are most likely due to bounce-off effects, particle reentrainment and other sampling artifacts, which may cause a fraction of the aerosol particles to be deposited on the impaction foils outside the section analyzed by PIXE. However, by resorting to a "mixed internal standard" approach, accurate PIXE data are obtained. Significant discrepancies were also observed in the comparison between the ICP-MS and the INAA data. These are most likely due to incomplete dissolution of the particulate material, and in particular of the alumino-silicate fly-ash matrix, during the acid-digestion sample preparation step for ICP-MS. It is suggested that a comparison between ICP-MS data of acid-digested samples and INAA can advantageously be used to provide speciation information on the various elements. Selected examples of size distributions are presented and briefly discussed.

  7. An analytical solution to patient prioritisation in radiotherapy based on utilitarian optimisation.

    PubMed

    Ebert, M A; Li, W; Jennings, L

    2014-03-01

    The detrimental impact of a radiotherapy waiting list can in part be compensated by patient prioritisation. Such prioritisation is phrased as an optimisation problem where the probability of local control for the overall population is the objective to be maximised, and a simple analytical solution is derived. This solution is compared with a simulation of a waiting list for the same population of patients. It is found that the analytical solution can provide an optimal ordering of patients, though it cannot explicitly constrain optimal waiting times. The simulation-based solution was undertaken using both the analytical solution and a numerical optimisation routine for daily patient ordering. Both solutions provided very similar results, with the analytical approach reducing the calculation time of the numerical solution by several orders of magnitude. It is suggested that treatment delays due to resource limitations and resulting waiting lists be incorporated into treatment optimisation and that the derived analytical solution provides a mechanism for this to occur.
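
    The flavour of the analytical result, ordering patients so that the expected local control summed over the population is maximised, can be illustrated with a greedy sketch that treats first the patients whose tumour-control probability decays fastest per day of delay. The starting probabilities and decay rates are invented, and this simple rule stands in for, rather than reproduces, the paper's derivation.

        # Toy utilitarian prioritisation: schedule by rate of loss of local control per day waited.
        patients = [
            # (id, control probability if treated now, probability lost per day of delay)
            ("A", 0.80, 0.004),
            ("B", 0.65, 0.010),
            ("C", 0.90, 0.002),
            ("D", 0.70, 0.007),
        ]
        order = sorted(patients, key=lambda p: p[2], reverse=True)
        print("treatment order:", [p[0] for p in order])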

  8. Quantification of residual crystallinity in ball milled commercially sourced lactose monohydrate by thermo-analytical techniques and terahertz spectroscopy.

    PubMed

    Smith, Geoff; Hussain, Amjad; Bukhari, Nadeem Irfan; Ermolina, Irina

    2015-05-01

    The quantification of crystallinity is necessary in order to be able to control the milling process. The use of thermal analysis for this assessment presents certain challenges, particularly in the case of crystal hydrates. In this study, the residual crystallinity on ball milling of lactose monohydrate (LMH), for periods up to 90 min, was evaluated by thermo-analytical techniques (TGA, DSC) and terahertz spectroscopy (THz). In general, the results from one of the DSC analysis methods and the THz measurements agree, showing a monotonic decrease in relative residual crystallinity with milling time (∼80% reduction after 60 min of milling) and a slight increase at the 90 min time point. However, the estimates from TGA and two other methods of analyzing the DSC curve do not agree with the former techniques and show variability, with significantly higher estimates for crystallinity. It was concluded that the thermal techniques require more complex treatment of the data in the evaluation of changes in crystallinity of a milled material (in particular to account for the de-vitrification and mutarotation of the material that inevitably occur during the measurement cycle), while the analysis of THz data is more straightforward, with the measurement having no impact on the native state of the material. PMID:25784570
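
    A common way to express relative residual crystallinity from a DSC trace is to normalise the measured enthalpy of a crystalline event against that of the unmilled, fully crystalline reference. The sketch below uses this simple ratio with invented enthalpies and does not include the corrections for de-vitrification and mutarotation discussed above.

        # Relative residual crystallinity from DSC enthalpies (hypothetical values).
        delta_h_reference = 140.0     # J/g, unmilled lactose monohydrate
        milling_minutes = [0, 15, 30, 60, 90]
        delta_h_sample = [140.0, 105.0, 70.0, 28.0, 33.0]   # J/g, measured after milling

        for minutes, dh in zip(milling_minutes, delta_h_sample):
            crystallinity = 100.0 * dh / delta_h_reference
            print(f"{minutes:3d} min milled: {crystallinity:5.1f} % crystalline")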

  9. Reliable screening of various foodstuffs with respect to their irradiation status: A comparative study of different analytical techniques

    NASA Astrophysics Data System (ADS)

    Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho

    2013-10-01

    Cost-effective and time-efficient analytical techniques are required to screen large food lots according to their irradiation status. Gamma-irradiated (0-10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), direct epifluorescent filter technique/aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red peppers. Black pepper had intermediate results (700-5000 PCs), while paprika had low sensitivity (negative results) upon irradiation. The DEFT/APC technique also showed clear screening results through the changes in microbial profiles, where the best results were found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination in volatile profiles upon irradiation through principal component analysis. These methods show potential for application in the screening analysis of irradiated foods.
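
    The PSL decision rule quoted above (negative below 700 photon counts, intermediate between 700 and 5000, positive above 5000) maps directly onto a small classifier; the sample counts used below are invented.

        # Screening decision from photostimulated luminescence (PSL) photon counts.
        def psl_screen(photon_counts):
            if photon_counts < 700:
                return "negative (not irradiated)"
            if photon_counts <= 5000:
                return "intermediate (confirmatory test needed)"
            return "positive (irradiated)"

        for sample, counts in {"cinnamon": 250000, "black pepper": 2300, "paprika": 310}.items():
            print(f"{sample:12s} {counts:>7d} counts -> {psl_screen(counts)}")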

  10. Incorporating Students' Self-Designed, Research-Based Analytical Chemistry Projects into the Instrumentation Curriculum

    ERIC Educational Resources Information Center

    Gao, Ruomei

    2015-01-01

    In a typical chemistry instrumentation laboratory, students learn analytical techniques through a well-developed procedure. Such an approach, however, does not engage students in a creative endeavor. To foster the intrinsic motivation of students' desire to learn, improve their confidence in self-directed learning activities and enhance their…

  11. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very small amounts (only a few microliters) of organic solvents, or none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autonóma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  12. Advanced analytical techniques for the extraction and characterization of plant-derived essential oils by gas chromatography with mass spectrometry.

    PubMed

    Waseem, Rabia; Low, Kah Hin

    2015-02-01

    In recent years, essential oils have received growing interest because of the positive health effects of their characteristics, such as antibacterial, antifungal, and antioxidant activities. For the extraction of plant-derived essential oils, there is a need for advanced analytical techniques and innovative methodologies. An exhaustive study of hydrodistillation, supercritical fluid extraction, ultrasound- and microwave-assisted extraction, solid-phase microextraction, pressurized liquid extraction, pressurized hot water extraction, liquid-liquid extraction, liquid-phase microextraction, matrix solid-phase dispersion, and gas chromatography (one- and two-dimensional) hyphenated with mass spectrometry for the extraction of essential oils from various plant species and their analysis is provided in this review. Essential oils are composed mainly of terpenes and terpenoids with low-molecular-weight aromatic and aliphatic constituents that are particularly important for public health.

  13. Painted Fiberglass-Reinforced Contemporary Sculpture: Investigating Composite Materials, Techniques and Conservation Using a Multi-Analytical Approach.

    PubMed

    Salvadori, Barbara; Cantisani, Emma; Colombini, Maria Perla; Tognon, Cecilia Gaia Rachele

    2016-01-01

    A multi-analytical approach was used to study the constituent materials, manufacturing technique, and state of conservation of a contemporary sculpture. This sculpture, entitled Nuredduna, was created by Aligi Sassu in 1995 and is located in the "Bellariva garden" in Florence (Italy). Fourier transform infrared spectroscopy (FT-IR), optical and electron microscopy (OM and SEM-EDS), X-ray diffraction (XRD), and portable X-ray fluorescence (XRF) highlighted the multi-layered structure of the statue: fiberglass and an overlay of different layers (gel coat) applied with an unsaturated polyester resin to which aggregate materials and bromine compounds had been added. A top coat of black acrylic varnish, used as a finish, was also found. The combination of these materials with their different compositions, environmental impact, and even vandalism has negatively affected the state of conservation of Nuredduna, causing the loss of strata in its lower parts (legs and feet). PMID:26767643

  14. Electrospray Ionization Mass Spectrometry: A Technique to Access the Information beyond the Molecular Weight of the Analyte

    PubMed Central

    Banerjee, Shibdas; Mazumdar, Shyamalava

    2012-01-01

    Electrospray ionization (ESI) is a soft ionization technique extensively used for the production of gas-phase ions (without fragmentation) of large, thermally labile supramolecules. In the present review we describe the development of electrospray ionization mass spectrometry (ESI-MS) over the last 25 years in the study of various properties of different types of biological molecules. There have been extensive studies on the mechanism of formation of charged gaseous species by ESI. Several groups have investigated the origin and implications of the multiple charge states of proteins observed in the ESI mass spectra of proteins. The charged analytes produced by ESI can be fragmented by activating them in the gas phase, and thus tandem mass spectrometry has been developed, which provides very important insights into the structural properties of the molecule. The review highlights recent developments and emerging directions in this fascinating area of research. PMID:22611397
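
    One practical consequence of the multiple charging discussed above is that the neutral mass of a protein can be recovered from any two adjacent peaks of its ESI charge-state envelope. A minimal sketch follows; the m/z values are invented for a roughly 17 kDa protein.

        # Deconvolve a neutral mass from two adjacent ESI charge states
        # (m1 carries one more proton than m2, so m1 < m2).
        PROTON = 1.00728   # Da

        def neutral_mass(m1, m2):
            z = round((m2 - PROTON) / (m2 - m1))   # charge of the lower-m/z peak
            return z, z * (m1 - PROTON)

        z, mass = neutral_mass(m1=893.1, m2=942.6)   # hypothetical adjacent peaks
        print(f"charge {z}+, neutral mass ~ {mass:.0f} Da")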

  15. Investigating dissolved organic matter decomposition in northern peatlands using complementary analytical techniques

    NASA Astrophysics Data System (ADS)

    Tfaily, Malak M.; Hamdan, Rasha; Corbett, Jane E.; Chanton, Jeffrey P.; Glaser, Paul H.; Cooper, William T.

    2013-07-01

    The chemical transformations that govern storage, degradation, and loss of organic matter in northern peatlands are poorly characterized, despite the significance of these peat deposits as pivotal reservoirs in the global carbon cycle. One of the most challenging problems concerns the character of dissolved organic matter (DOM) in peat porewaters, particularly higher-molecular weight compounds that may function either as non-reactive sinks or reactive intermediates for organic byproducts of microbial decay. The complexity of these large molecules has defied attempts to characterize their molecular structure in bulk samples with a high degree of precision. We therefore determined the composition and reactivity of DOM from representative bog and fen sites in the Glacial Lake Agassiz Peatlands (GLAP) in northern Minnesota, USA. We applied four complementary techniques: electrospray ionization Fourier transform ion cyclotron resonance mass spectrometry (ESI-FT-ICR MS), proton nuclear magnetic resonance spectroscopy (1H NMR), specific UV absorbance (SUVA) and excitation-emission matrix (EEM) fluorescence spectroscopy. We observed that the vast majority (>80%) of molecular formulas that appear in the surface bog DOM are also present at 2.9 m depth, indicating that much of DOM in the bog is resistant to microbial degradation. In contrast to bog samples, a considerable number of new compounds with low O/C and high H/C elemental ratios were observed in the 3 m fen horizon relative to surface samples. These results indicate a more pronounced difference in the composition of surface and deep DOM in the fen. SUVA, determined at 254 nm, indicated significantly lower aromaticity in deep fen samples relative to deep bog samples. This trend was verified by 1H NMR. Aromatic and carbohydrate components represented up to 70% of deep bog DOM but comprised a much smaller proportion of deep fen DOM, which was dominated by functionalized and non-functionalized aliphatics. Molecular
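
    The O/C and H/C elemental ratios used above to contrast bog and fen DOM come directly from the assigned molecular formulas; the short bookkeeping sketch below computes these van Krevelen coordinates for a few invented formula assignments.

        # Van Krevelen coordinates (O/C, H/C) from assigned molecular formulas.
        import re

        def element_counts(formula):
            counts = {"C": 0, "H": 0, "O": 0, "N": 0, "S": 0}
            for elem, num in re.findall(r"([A-Z][a-z]?)(\d*)", formula):
                if elem in counts:
                    counts[elem] += int(num) if num else 1
            return counts

        formulas = ["C18H24O10", "C15H22O6", "C27H34O9N"]   # hypothetical FT-ICR MS assignments
        for f in formulas:
            c = element_counts(f)
            print(f"{f:12s} O/C = {c['O'] / c['C']:.2f}  H/C = {c['H'] / c['C']:.2f}")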

  16. Base flow separation: A comparison of analytical and mass balance methods

    NASA Astrophysics Data System (ADS)

    Lott, Darline A.; Stewart, Mark T.

    2016-04-01

    Base flow is the ground water contribution to stream flow. Many activities, such as water resource management, calibrating hydrological and climate models, and studies of basin hydrology, require good estimates of base flow. The base flow component of stream flow is usually determined by separating a stream hydrograph into two components, base flow and runoff. Analytical methods, mathematical functions or algorithms used to calculate base flow directly from discharge, are the most widely used base flow separation methods and are often used without calibration to basin or gage-specific parameters other than basin area. In this study, six analytical methods are compared to a mass balance method, the conductivity mass-balance (CMB) method. The base flow index (BFI) values for 35 stream gages are obtained from each of the seven methods, with each gage having at least two consecutive years of specific conductance data and 30 years of continuous discharge data. BFI is cumulative base flow divided by cumulative total discharge over the period of record of analysis. The BFI value is dimensionless and always varies from 0 to 1. Areas of basins used in this study range from 27 km2 to 68,117 km2. BFI was first determined for the uncalibrated analytical methods. The parameters of each analytical method were then calibrated to produce BFI values as close to the CMB-derived BFI values as possible. One of the methods, the power function (aQ^b + cQ) method, is inherently calibrated and was not recalibrated. The uncalibrated analytical methods have an average correlation coefficient of 0.43 when compared to CMB-derived values, and an average correlation coefficient of 0.93 when calibrated with the CMB method. Once calibrated, the analytical methods can closely reproduce the base flow values of a mass balance method. Therefore, it is recommended that analytical methods be calibrated against tracer or mass balance methods.
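
    The conductivity mass-balance separation used as the benchmark above has a compact closed form: base flow is total discharge scaled by where the measured specific conductance sits between the runoff and base flow end members. The sketch below applies it to a short invented record and computes the resulting BFI.

        # Conductivity mass-balance (CMB) base flow separation and base flow index (BFI).
        import numpy as np

        discharge = np.array([5.0, 9.0, 30.0, 22.0, 12.0, 7.0, 6.0])     # m3/s, toy record
        conductance = np.array([310.0, 260.0, 120.0, 150.0, 210.0, 280.0, 300.0])  # uS/cm
        sc_baseflow = 320.0   # end member: specific conductance of ground water
        sc_runoff = 80.0      # end member: specific conductance of surface runoff

        fraction = np.clip((conductance - sc_runoff) / (sc_baseflow - sc_runoff), 0.0, 1.0)
        baseflow = discharge * fraction
        bfi = baseflow.sum() / discharge.sum()
        print(f"BFI = {bfi:.2f}")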

  17. Applications of Fractal Analytical Techniques in the Estimation of Operational Scale

    NASA Technical Reports Server (NTRS)

    Emerson, Charles W.; Quattrochi, Dale A.

    2000-01-01

    The observational scale and the resolution of remotely sensed imagery are essential considerations in the interpretation process. Many atmospheric, hydrologic, and other natural and human-influenced spatial phenomena are inherently scale dependent and are governed by different physical processes at different spatial domains. This spatial and operational heterogeneity constrains the ability to compare interpretations of phenomena and processes observed in higher spatial resolution imagery to similar interpretations obtained from lower resolution imagery. This is a particularly acute problem, since long-term global change investigations will require high spatial resolution Earth Observing System (EOS), Landsat 7, or commercial satellite data to be combined with lower resolution imagery from older sensors such as Landsat TM and MSS. Fractal analysis is a useful technique for identifying the effects of scale changes on remotely sensed imagery. The fractal dimension of an image is a non-integer value between two and three, which indicates the degree of complexity in the texture and shapes depicted in the image. A true fractal surface exhibits self-similarity, a property of curves or surfaces where each part is indistinguishable from the whole, or where the form of the curve or surface is invariant with respect to scale. Theoretically, if the digital numbers of a remotely sensed image resemble an ideal fractal surface, then due to the self-similarity property, the fractal dimension of the image will not vary with scale and resolution, and the slope of the fractal dimension-resolution relationship would be zero. Most geographical phenomena, however, are not self-similar at all scales, but they can be modeled by a stochastic fractal in which the scaling properties of the image exhibit patterns that can be described by statistics such as area-perimeter ratios and autocovariances. Stochastic fractal sets relax the self-similarity assumption and measure many scales and
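
    A standard way to estimate the fractal dimension mentioned above is box counting: count occupied boxes at several box sizes and take the slope of log(count) against log(1/size). The sketch below applies it to a synthetic binary pattern; grey-level remotely sensed surfaces call for surface-based estimators such as the triangular prism method, so this is only the simplest illustration.

        # Box-counting estimate of fractal dimension for a binary image.
        import numpy as np

        def box_count(image, size):
            count = 0
            for i in range(0, image.shape[0], size):
                for j in range(0, image.shape[1], size):
                    if image[i:i + size, j:j + size].any():
                        count += 1
            return count

        rng = np.random.default_rng(1)
        image = rng.random((256, 256)) > 0.7            # synthetic occupied/empty pattern
        sizes = np.array([2, 4, 8, 16, 32])
        counts = np.array([box_count(image, s) for s in sizes])
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        print(f"estimated box-counting dimension: {slope:.2f}")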

  18. Analytical modeling of structure-soil systems for lunar bases

    NASA Technical Reports Server (NTRS)

    Macari-Pasqualino, Jose Emir

    1989-01-01

    The study of the behavior of granular materials in a reduced gravity environment and under low effective stresses became a subject of great interest in the mid-1960s when NASA's Surveyor missions to the Moon began the first extraterrestrial investigation and it was found that lunar soils exhibited properties quite unlike those on Earth. This subject gained interest during the years of the Apollo missions and more recently due to NASA's plans for future exploration and colonization of Moon and Mars. It has since been clear that a good understanding of the mechanical properties of granular materials under reduced gravity and at low effective stress levels is of paramount importance for the design and construction of surface and buried structures on these bodies. In order to achieve such an understanding it is desirable to develop a set of constitutive equations that describes the response of such materials as they are subjected to tractions and displacements. This presentation examines issues associated with conducting experiments on highly nonlinear granular materials under high and low effective stresses. The friction and dilatancy properties which affect the behavior of granular soils with low cohesion values are assessed. In order to simulate the highly nonlinear strength and stress-strain behavior of soils at low as well as high effective stresses, a versatile isotropic, pressure-sensitive, third-stress-invariant-dependent, cone-cap elasto-plastic constitutive model was proposed. The integration of the constitutive relations is performed via a fully implicit Backward Euler technique known as the Closest Point Projection Method. The model was implemented into a finite element code in order to study nonlinear boundary value problems associated with homogeneous as well as nonhomogeneous deformations at low as well as high effective stresses. The effect of gravity (self-weight) on the stress-strain-strength response of these materials is evaluated. The calibration

  19. Recent Applications of Carbon-Based Nanomaterials in Analytical Chemistry: Critical Review

    PubMed Central

    Scida, Karen; Stege, Patricia W.; Haby, Gabrielle; Messina, Germán A.; García, Carlos D.

    2011-01-01

    The objective of this review is to provide a broad overview of the advantages and limitations of carbon-based nanomaterials with respect to analytical chemistry. Aiming to illustrate the impact of nanomaterials on the development of novel analytical applications, developments reported in the 2005–2010 period have been included and divided into sample preparation, separation, and detection. Within each section, fullerenes, carbon nanotubes, graphene, and composite materials will be addressed specifically. Also included, although only briefly discussed, is a section highlighting nanomaterials with interesting catalytic properties that can be used in the design of future devices for analytical chemistry. PMID:21458626

  20. A measurement-based analytical approach to the bioluminescence tomography problem

    NASA Astrophysics Data System (ADS)

    Erkol, Hakan; Demirkiran, Aytac; Kipergil, Esra-Aytac; Uluc, Nasire; Unlu, Mehmet B.

    2014-03-01

    This work presents an analytical approach for the solution of the tissue diffusion equation based on boundary measurements. We consider a bioluminescent point source in both homogeneous and heterogeneous circular turbid media. The point source is described by the Dirac delta function. Analytical expressions for the strength and position of the point source are obtained by introducing boundary measurements and then applying appropriate boundary conditions. In addition, numerical simulations are performed for the position of the source. Calculations show that the analytical results are in good agreement with the numerical results.
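
    For the homogeneous case, the point-source solution of the diffusion equation that underlies this approach has the familiar closed form phi(r) = S exp(-mu_eff r) / (4 pi D r). The sketch below evaluates it at a few detector distances, with optical properties typical of soft tissue assumed purely for illustration.

        # Fluence from a bioluminescent point source in an infinite homogeneous medium
        # (diffusion approximation).
        import math

        mu_a = 0.1           # absorption coefficient, 1/mm (assumed)
        mu_s_prime = 1.0     # reduced scattering coefficient, 1/mm (assumed)
        D = 1.0 / (3.0 * (mu_a + mu_s_prime))
        mu_eff = math.sqrt(mu_a / D)
        source_strength = 1.0

        def fluence(r_mm):
            return source_strength * math.exp(-mu_eff * r_mm) / (4.0 * math.pi * D * r_mm)

        for r in (2.0, 5.0, 10.0):
            print(f"r = {r:4.1f} mm  phi = {fluence(r):.4e}")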

  1. Analytic Solutions for the Spectral Responses of RCA-Grating-Based Waveguide Devices

    NASA Astrophysics Data System (ADS)

    Zeng, Xiang-Kai; Wei, Lai

    2012-12-01

    Analytic solutions (ASs) for the spectral responses of waveguide devices with raised-cosine-apodized (RCA) gratings are presented. The waveguide devices include short- and long-period RCA gratings and RCA-grating-based interferometers such as Fabry-Perot, Mach-Zehnder and Michelson interferometers. Calculations based on the analytic solutions are demonstrated and compared with those based on the commonly preferred transfer matrix (TM) method, which confirms that the AS-based analysis has sufficient accuracy and several thousand times the efficiency of the TM method.

  2. A novel non-imaging optics based Raman spectroscopy device for transdermal blood analyte measurement

    PubMed Central

    Kong, Chae-Ryon; Barman, Ishan; Dingari, Narahara Chari; Kang, Jeon Woong; Galindo, Luis; Dasari, Ramachandra R.; Feld, Michael S.

    2011-01-01

    Due to its high chemical specificity, Raman spectroscopy has been considered to be a promising technique for non-invasive disease diagnosis. However, during Raman excitation, fewer than one in a million photons undergoes spontaneous Raman scattering, and such weak Raman scattered light often requires highly efficient collection for the analysis of biological tissues. We present a novel portable Raman spectroscopy instrument based on non-imaging optics and designed for enhanced light collection. While the instrument was demonstrated on transdermal blood glucose measurement, it can also be used for detection of other clinically relevant blood analytes such as creatinine, urea and cholesterol, as well as other tissue diagnosis applications. For enhanced light collection, a non-imaging optical element called a compound hyperbolic concentrator (CHC) converts the wide angular range of scattered photons (numerical aperture (NA) of 1.0) from the tissue into a limited range of angles accommodated by the acceptance angles of the collection system (e.g., an optical fiber with NA of 0.22). A CHC enables collimation of scattered light directions to within an extremely narrow range of angles while also maintaining practical physical dimensions. Such a design allows for the development of a very efficient and compact spectroscopy system for analyzing highly scattering biological tissues. Using the CHC-based portable Raman instrument in a clinical research setting, we demonstrate successful transdermal blood glucose predictions in human subjects undergoing oral glucose tolerance tests. PMID:22125761

  3. Downstream processing and chromatography based analytical methods for production of vaccines, gene therapy vectors, and bacteriophages

    PubMed Central

    Kramberger, Petra; Urbas, Lidija; Štrancar, Aleš

    2015-01-01

    Downstream processing of nanoplexes (viruses, virus-like particles, bacteriophages) is characterized by the complexity of the starting material, the number of purification methods to choose from, the regulations that set the frame for the final product, and the analytical methods for upstream and downstream monitoring. This review gives an overview of the nanoplex downstream challenges and of chromatography-based analytical methods for efficient monitoring of nanoplex production. PMID:25751122

  4. Analytical Devices Based on Direct Synthesis of DNA on Paper.

    PubMed

    Glavan, Ana C; Niu, Jia; Chen, Zhen; Güder, Firat; Cheng, Chao-Min; Liu, David; Whitesides, George M

    2016-01-01

    This paper addresses a growing need in clinical diagnostics for parallel, multiplex analysis of biomarkers from small biological samples. It describes a new procedure for assembling arrays of ssDNA and proteins on paper. This method starts with the synthesis of DNA oligonucleotides covalently linked to paper and proceeds to assemble microzones of DNA-conjugated paper into arrays capable of simultaneously capturing DNA, DNA-conjugated protein antigens, and DNA-conjugated antibodies. The synthesis of ssDNA oligonucleotides on paper is convenient and effective: for a 32-mer, 32% of the oligonucleotides cleaved and eluted from the paper substrate were full-length by HPLC. These ssDNA arrays can be used to detect fluorophore-linked DNA oligonucleotides in solution and, as the basis for DNA-directed assembly of arrays of DNA-conjugated capture antibodies on paper, to detect protein antigens by sandwich ELISAs. Paper-anchored ssDNA arrays with different sequences can be used to assemble paper-based devices capable of detecting DNA and antibodies in the same device and enable simple microfluidic paper-based devices.

  5. Using ArcGIS for correlating multi-technique micro-spatial analytical data: A case study of early solar system carbonates in a carbonaceous chondrite.

    NASA Astrophysics Data System (ADS)

    Tyra, M. A.; Brearley, A.

    2008-12-01

    Meteorites are rare and valuable extraterrestrial materials that are typically studied using multiple micro- and nanoanalytical techniques such as SEM, EPMA, SIMS, SXRF and FIB/TEM. Each of these techniques is frequently used to study the same thin section in detail. Management of the significant amounts of spatial and analytical data obtained at scales from the millimeter to the nanometer over a ~3 cm2 thin section is a major challenge. Here we demonstrate that a geographical information system (GIS), typically used for much larger-scale spatial data manipulation, can be used equally successfully to store and analyze spatially correlated petrographic and mineralogical data. The advantages of using GIS techniques at the microscale are multifold. For example, the researcher can query various types of analytical data with ease. Furthermore, posted geodatabase meteorite data can be analyzed by other researchers concurrently or years after a project has been completed. This facilitates comparisons between other meteorite samples of differing classification, within a classification, or samples of the same meteorite. Here we demonstrate the application of a GIS to correlate data obtained from a thin section of the ALH84051 CM1 meteorite, a carbonaceous chondrite that has experienced extensive aqueous alteration. Mosaicked images obtained by optical microscopy of the entire thin section are used as a base "map" and are overlain with SEM and CL images obtained at different magnifications, compositional data (EPMA), and other spatial data. The overall objectives of this study are to gain insights into the processes of aqueous alteration using carbonate mineral assemblages, morphology, abundance, and chemical composition (major, minor and trace elements). Future work will also include Mn-Cr chronometry and oxygen isotopic analysis using SIMS to examine carbonate emplacement and fluid evolution within the meteorite parent body.

  6. DCT-based cyber defense techniques

    NASA Astrophysics Data System (ADS)

    Amsalem, Yaron; Puzanov, Anton; Bedinerman, Anton; Kutcher, Maxim; Hadar, Ofer

    2015-09-01

    With the increasing popularity of video streaming services and multimedia sharing via social networks, there is a need to protect the multimedia from malicious use. An attacker may use steganography and watermarking techniques to embed malicious content, in order to attack the end user. Most of the attack algorithms are robust to basic image processing techniques such as filtering, compression, noise addition, etc. Hence, in this article two novel, real-time defense techniques are proposed: smart threshold and anomaly correction. Both techniques operate in the DCT domain and are applicable to JPEG images and H.264 I-frames. The defense performance was evaluated against a highly robust attack, and the perceptual quality degradation was measured by the well-known PSNR and SSIM quality assessment metrics. A set of defense techniques is suggested for improving the defense efficiency. For the most aggressive attack configuration, the combination of all the defense techniques results in 80% protection against cyber-attacks with a PSNR of 25.74 dB.
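
    A minimal sketch of operating in the DCT domain as described above: take the 8x8 block DCT of an image, zero coefficients whose magnitude falls below a threshold, and invert. The fixed threshold and plain coefficient zeroing are assumptions for illustration; they are not the paper's smart-threshold or anomaly-correction algorithms.

        # Block-DCT coefficient thresholding on 8x8 blocks (illustrative defense-style filter).
        import numpy as np
        from scipy.fftpack import dct, idct

        def dct2(block):
            return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

        def idct2(block):
            return idct(idct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

        rng = np.random.default_rng(0)
        image = rng.integers(0, 256, size=(64, 64)).astype(float)   # stand-in for an I-frame
        threshold = 4.0
        cleaned = np.empty_like(image)
        for i in range(0, 64, 8):
            for j in range(0, 64, 8):
                coeffs = dct2(image[i:i + 8, j:j + 8])
                coeffs[np.abs(coeffs) < threshold] = 0.0            # suppress small coefficients
                cleaned[i:i + 8, j:j + 8] = idct2(coeffs)
        print("max pixel change:", np.abs(cleaned - image).max())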

  7. A Vocal-Based Analytical Method for Goose Behaviour Recognition

    PubMed Central

    Steen, Kim Arild; Therkildsen, Ole Roland; Karstoft, Henrik; Green, Ole

    2012-01-01

    Since human-wildlife conflicts are increasing, the development of cost-effective methods for reducing damage or conflict levels is important in wildlife management. A wide range of devices to detect and deter animals causing conflict are used for this purpose, although their effectiveness is often highly variable, due to habituation to disruptive or disturbing stimuli. Automated recognition of behaviours could form a critical component of a system capable of altering the disruptive stimuli to avoid this. In this paper we present a novel method to automatically recognise goose behaviour based on vocalisations from flocks of free-living barnacle geese (Branta leucopsis). The geese were observed and recorded in a natural environment, using a shielded shotgun microphone. The classification used Support Vector Machines (SVMs), which had been trained with labeled data. Greenwood Function Cepstral Coefficients (GFCC) were used as features for the pattern recognition algorithm, as they can be adjusted to the hearing capabilities of different species. Three behaviours are classified based on this approach, and the method achieves a good recognition of foraging behaviour (86–97% sensitivity, 89–98% precision) and a reasonable recognition of flushing (79–86%, 66–80%) and landing behaviour (73–91%, 79–92%). The Support Vector Machine has proven to be a robust classifier for this kind of classification, as generality and non-linear capabilities are important. We conclude that vocalisations can be used to automatically detect behaviour of conflict wildlife species, and as such, may be used as an integrated part of a wildlife management system. PMID:22737037
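
    The recognition pipeline above (cepstral features fed to an SVM) can be sketched in a few lines with scikit-learn. Synthetic feature vectors stand in for the GFCCs, which are not available in standard Python libraries, so both the features and the class separation below are assumptions.

        # Sketch of the feature + SVM classification step (synthetic stand-ins for GFCCs).
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        n_per_class, n_coeffs = 200, 13
        behaviours = ["foraging", "flushing", "landing"]
        X = np.vstack([rng.normal(loc=k, scale=1.0, size=(n_per_class, n_coeffs))
                       for k in range(len(behaviours))])
        y = np.repeat(behaviours, n_per_class)

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
        model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        model.fit(X_train, y_train)
        print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")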

  9. Convergence and Robustness of Analytic Techniques for Qualitative Data. Final Report, July 1, 1978 to September 30, 1979.

    ERIC Educational Resources Information Center

    Borgen, Fred H.; Muchinsky, Paul M.

    The purposes of this study are to compare alternative statistical techniques for analysis of the kinds of qualitative data commonly obtained in educational settings. Distinctions between chi-square based measures for contingency tables and proportional reduction of error measures of the strength of association are reviewed, and it is argued that…

  10. Analytical Approaches Based on Gas Chromatography Mass Spectrometry (GC/MS) to Study Organic Materials in Artworks and Archaeological Objects.

    PubMed

    Bonaduce, Ilaria; Ribechini, Erika; Modugno, Francesca; Colombini, Maria Perla

    2016-02-01

    Gas chromatography/mass spectrometry (GC/MS), after appropriate wet chemical sample pre-treatments or pyrolysis, is one of the most commonly adopted analytical techniques in the study of organic materials from cultural heritage objects. Organic materials in archaeological contexts, in classical art objects, or in modern and contemporary works of art may be the same or belong to the same classes, but can also vary considerably, often presenting different ageing pathways and chemical environments. This paper provides an overview of the literature published in the last 10 years on the research based on the use of GC/MS for the analysis of organic materials in artworks and archaeological objects. The latest progress in advancing analytical approaches, characterising materials and understanding their degradation, and developing methods for monitoring their stability is discussed. Case studies from the literature are presented to examine how the choice of the working conditions and the analytical approaches is driven by the analytical and technical question to be answered, as well as the nature of the object from which the samples are collected. PMID:27572989

  12. The Impact of Chemical Abrasion on Trace Element Analysis of Zircon by In Situ Micro-Analytical Techniques

    NASA Astrophysics Data System (ADS)

    Romanoski, A.; Coint, N.; Cottle, J. M.; Hetherington, C. J.; Barnes, C. G.

    2011-12-01

    Introduction of the chemical abrasion technique has significantly increased the precision and accuracy of ID-TIMS U-Pb dating of zircon. The chemical abrasion technique, coupled with thermal annealing, removes inclusions and metamict domains from zircon, reducing the impact of Pb-loss and leading to more concordant analyses. In this study, zircon from the Red Bluff Granitic Suite (TX) (ID-TIMS age 1120 ± 35 Ma) has been thermally annealed and chemically abraded prior to SHRIMP-RG and LA-MC-ICP-MS analysis. Chemically abraded zircon gives a date of 1109 ± 22 Ma with an average of 3% discordancy. This compares with dates of 1137 ± 48 Ma with an average of 39% discordancy for non-abraded zircon from the same sample. The dates overlap within uncertainty, but the age from chemically abraded zircon has a lower population uncertainty. Other petrographic and analytical observations of the chemically abraded zircon include brighter CL intensity, lower REE abundances, more consistent (smaller scatter) negative Eu/Eu* anomalies, less scatter in the chondrite-normalized LREE values, and a slightly less-steep chondrite-normalized HREE slope. The data show that thermal annealing and chemical abrasion of zircon prior to analysis by in situ ion-beam or laser ablation techniques may result in better accuracy and greater concordance in U-Pb analysis of zircon. However, while improving the quality of some components of the trace element dataset (e.g. Eu anomalies), the process may prejudice the interpretation of zircon trace element data (e.g. chondrite-normalized HREE slopes).

  13. Applying knowledge compilation techniques to model-based reasoning

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    Researchers in the area of knowledge compilation are developing general purpose techniques for improving the efficiency of knowledge-based systems. In this article, an attempt is made to define knowledge compilation, to characterize several classes of knowledge compilation techniques, and to illustrate how some of these techniques can be applied to improve the performance of model-based reasoning systems.

  14. Flood alert system based on bayesian techniques

    NASA Astrophysics Data System (ADS)

    Gulliver, Z.; Herrero, J.; Viesca, C.; Polo, M. J.

    2012-04-01

    The problem of floods in the Mediterranean regions is closely linked to the occurrence of torrential storms in dry regions, where even the water supply relies on adequate water management. Like other Mediterranean basins in Southern Spain, the Guadalhorce River Basin is a medium-sized watershed (3856 km2) where recurrent yearly floods occur, mainly in autumn and spring periods, driven by cold front phenomena. The torrential character of the precipitation in such small basins, with a concentration time of less than 12 hours, produces flash flood events with catastrophic effects over the city of Malaga (600000 inhabitants). From this fact arises the need for specific alert tools which can forecast these kinds of phenomena. Bayesian networks (BN) have been emerging in the last decade as a very useful and reliable computational tool for water resources and for the decision making process. The joint use of Artificial Neural Networks (ANN) and BN has served us to recognize and simulate the two different types of hydrological behaviour in the basin: natural and regulated. This led to the establishment of causal relationships between precipitation, discharge from upstream reservoirs, and water levels at a gauging station. It was seen that a recurrent ANN model working at an hourly scale, considering daily precipitation and the two previous hourly values of reservoir discharge and water level, could provide R2 values of 0.86. The BN results slightly improve this fit and additionally provide uncertainty information for the prediction. In our current work to design a weather warning service based on Bayesian techniques, the first steps were carried out through an analysis of the correlations between the water level and rainfall at certain representative points in the basin, along with the upstream reservoir discharge. The lower correlation found between precipitation and water level emphasizes the highly regulated condition of the stream. The autocorrelations of the variables were also
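
    The regression step described above (predicting the current water level from precipitation and the two previous hourly values of reservoir discharge and water level) can be sketched as follows. The data are synthetic and a plain feed-forward network stands in for the recurrent ANN of the study; all names and values are illustrative assumptions.

```python
# Hypothetical sketch of a lagged-feature water-level regression.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 2000                                        # hourly records (synthetic)
precip = rng.gamma(0.3, 4.0, n)                 # precipitation forcing (arbitrary units)
discharge = np.convolve(precip, np.ones(6) / 6, mode="same") + rng.normal(0, 0.1, n)
level = 0.6 * discharge + 0.2 * precip + rng.normal(0, 0.1, n)

# Lagged design matrix: P(t), Q(t-1), Q(t-2), H(t-1), H(t-2) -> H(t)
X = np.column_stack([precip[2:], discharge[1:-1], discharge[:-2],
                     level[1:-1], level[:-2]])
y = level[2:]
split = int(0.8 * len(y))

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                   random_state=0))
model.fit(X[:split], y[:split])
print("R2 on held-out hours:", r2_score(y[split:], model.predict(X[split:])))
```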

  15. Biological Matrix Effects in Quantitative Tandem Mass Spectrometry-Based Analytical Methods: Advancing Biomonitoring

    PubMed Central

    Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd

    2015-01-01

    The ability to quantify levels of target analytes in biological samples accurately and precisely, in biomonitoring, involves the use of highly sensitive and selective instrumentation such as tandem mass spectrometers and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585

  16. Continuous Metabolic Monitoring Based on Multi-Analyte Biomarkers to Predict Exhaustion

    PubMed Central

    Kastellorizios, Michail; Burgess, Diane J.

    2015-01-01

    This work introduces the concept of multi-analyte biomarkers for continuous metabolic monitoring. The importance of using more than one marker lies in the ability to obtain a holistic understanding of the metabolism. This is showcased for the detection and prediction of exhaustion during intense physical exercise. The findings presented here indicate that when glucose and lactate changes over time are combined into multi-analyte biomarkers, their monitoring trends are more sensitive in the subcutaneous tissue, an implantation-friendly peripheral tissue, compared to the blood. This unexpected observation was confirmed in normal as well as type 1 diabetic rats. This study was designed to be of direct value to continuous monitoring biosensor research, where single analytes are typically monitored. These findings can be implemented in new multi-analyte continuous monitoring technologies for more accurate insulin dosing, as well as for exhaustion prediction studies based on objective data rather than the subject’s perception. PMID:26028477

  17. Continuous metabolic monitoring based on multi-analyte biomarkers to predict exhaustion.

    PubMed

    Kastellorizios, Michail; Burgess, Diane J

    2015-01-01

    This work introduces the concept of multi-analyte biomarkers for continuous metabolic monitoring. The importance of using more than one marker lies in the ability to obtain a holistic understanding of the metabolism. This is showcased for the detection and prediction of exhaustion during intense physical exercise. The findings presented here indicate that when glucose and lactate changes over time are combined into multi-analyte biomarkers, their monitoring trends are more sensitive in the subcutaneous tissue, an implantation-friendly peripheral tissue, compared to the blood. This unexpected observation was confirmed in normal as well as type 1 diabetic rats. This study was designed to be of direct value to continuous monitoring biosensor research, where single analytes are typically monitored. These findings can be implemented in new multi-analyte continuous monitoring technologies for more accurate insulin dosing, as well as for exhaustion prediction studies based on objective data rather than the subject's perception. PMID:26028477

  18. Continuous metabolic monitoring based on multi-analyte biomarkers to predict exhaustion.

    PubMed

    Kastellorizios, Michail; Burgess, Diane J

    2015-06-01

    This work introduces the concept of multi-analyte biomarkers for continuous metabolic monitoring. The importance of using more than one marker lies in the ability to obtain a holistic understanding of the metabolism. This is showcased for the detection and prediction of exhaustion during intense physical exercise. The findings presented here indicate that when glucose and lactate changes over time are combined into multi-analyte biomarkers, their monitoring trends are more sensitive in the subcutaneous tissue, an implantation-friendly peripheral tissue, compared to the blood. This unexpected observation was confirmed in normal as well as type 1 diabetic rats. This study was designed to be of direct value to continuous monitoring biosensor research, where single analytes are typically monitored. These findings can be implemented in new multi-analyte continuous monitoring technologies for more accurate insulin dosing, as well as for exhaustion prediction studies based on objective data rather than the subject's perception.

  19. Calculation Of Position And Velocity Of GLONASS Satellite Based On Analytical Theory Of Motion

    NASA Astrophysics Data System (ADS)

    Góral, W.; Skorupa, B.

    2015-09-01

    The presented algorithms for computation of orbital elements and positions of GLONASS satellites are based on the asymmetric variant of the generalized problem of two fixed centers. The analytical algorithm accounts for the disturbing acceleration due to the second (J2) and third (J3) zonal harmonics, and partially the fourth zonal harmonic, in the expansion of the Earth's gravitational potential. The other main disturbing accelerations - due to the attraction of the Moon and the Sun - are also computed analytically, where the geocentric position vectors of the Moon and the Sun are obtained by evaluating known analytical expressions for their motion. The given numerical examples show that the proposed analytical method for computation of position and velocity of GLONASS satellites can be an interesting alternative to the numerical methods presently used.

  20. Approaching near real-time biosensing: microfluidic microsphere based biosensor for real-time analyte detection.

    PubMed

    Cohen, Noa; Sabhachandani, Pooja; Golberg, Alexander; Konry, Tania

    2015-04-15

    In this study we describe a simple lab-on-a-chip (LOC) biosensor approach utilizing a well-mixed microfluidic device and a microsphere-based assay capable of performing near real-time diagnostics of clinically relevant analytes such as cytokines and antibodies. We were able to overcome the adsorption kinetics reaction rate-limiting mechanism, which is diffusion-controlled in standard immunoassays, by introducing the microsphere-based assay into a well-mixed yet simple microfluidic device with turbulent flow profiles in the reaction regions. The integrated microsphere-based LOC device performs dynamic detection of the analyte in a minimal amount of biological specimen by continuously sampling micro-liter volumes of sample per minute to detect dynamic changes in target analyte concentration. Furthermore, we developed a mathematical model for the well-mixed reaction to describe the near real-time detection mechanism observed in the developed LOC method. To demonstrate the specificity and sensitivity of the developed real-time monitoring LOC approach, we applied the device to clinically relevant analytes: the Tumor Necrosis Factor (TNF)-α cytokine and its clinically used inhibitor, anti-TNF-α antibody. Based on the results reported herein, the developed LOC device provides a continuous, sensitive and specific near real-time monitoring method for analytes such as cytokines and antibodies, reduces reagent volumes by nearly three orders of magnitude, and eliminates the washing steps required by standard immunoassays.
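
    As a hedged sketch of the kind of well-mixed binding model mentioned above, the snippet below integrates a generic 1:1 interaction between a dissolved analyte and receptor-coated microspheres, with a step change in analyte concentration to mimic dynamic sampling. Rate constants and concentrations are assumed values, not parameters from the paper.

```python
# Minimal sketch: well-mixed 1:1 binding kinetics with a concentration step.
import numpy as np
from scipy.integrate import solve_ivp

k_on, k_off = 1e5, 1e-3          # association (1/(M*s)) and dissociation (1/s), assumed
B_max = 1.0                      # normalised binding capacity of the microspheres

def analyte_conc(t):
    """Analyte concentration in the chip: steps from 1 nM to 5 nM at t = 600 s."""
    return 1e-9 if t < 600 else 5e-9

def rhs(t, y):
    b = y[0]                                             # bound fraction
    c = analyte_conc(t)
    return [k_on * c * (B_max - b) - k_off * b]

sol = solve_ivp(rhs, (0, 1800), [0.0], max_step=1.0, dense_output=True)
for t in (300, 900, 1800):
    print(f"t = {t:5d} s   bound fraction = {sol.sol(t)[0]:.3f}")
```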

  1. Sensor Based on Aptamer Folding to Detect Low-Molecular Weight Analytes.

    PubMed

    Osypova, Alina; Thakar, Dhruv; Dejeu, Jérôme; Bonnet, Hugues; Van der Heyden, Angéline; Dubacheva, Galina V; Richter, Ralf P; Defrancq, Eric; Spinelli, Nicolas; Coche-Guérente, Liliane; Labbé, Pierre

    2015-08-01

    Aptamers have emerged as promising biorecognition elements in the development of biosensors. The present work focuses on the application of quartz crystal microbalance with dissipation monitoring (QCM-D) for the enantioselective detection of a low molecular weight target molecule (less than 200 Da) by aptamer-based sensors. While QCM-D is a powerful technique for label-free, real-time characterization and quantification of molecular interactions at interfaces, the detection of small molecules interacting with immobilized receptors still remains a challenge. In the present study, we take advantage of the aptamer conformational changes upon the target binding that induces displacement of water acoustically coupled to the sensing layer. As a consequence, this phenomenon leads to a significant enhancement of the detection signal. The methodology is exemplified with the enantioselective recognition of a low molecular weight model compound, L-tyrosinamide (L-Tym). QCM-D monitoring of L-Tym interaction with the aptamer monolayer leads to an appreciable signal that can be further exploited for analytical purposes or thermodynamics studies. Furthermore, in situ combination of QCM-D with spectroscopic ellipsometry unambiguously demonstrates that the conformational change induces a nanometric decrease of the aptamer monolayer thickness. Since QCM-D is sensitive to the whole mass of the sensing layer including water that is acoustically coupled, a decrease in thickness of the highly hydrated aptamer layer induces a sizable release of water that can be easily detected by QCM-D. PMID:26122480

  2. Sensor Based on Aptamer Folding to Detect Low-Molecular Weight Analytes.

    PubMed

    Osypova, Alina; Thakar, Dhruv; Dejeu, Jérôme; Bonnet, Hugues; Van der Heyden, Angéline; Dubacheva, Galina V; Richter, Ralf P; Defrancq, Eric; Spinelli, Nicolas; Coche-Guérente, Liliane; Labbé, Pierre

    2015-08-01

    Aptamers have emerged as promising biorecognition elements in the development of biosensors. The present work focuses on the application of quartz crystal microbalance with dissipation monitoring (QCM-D) for the enantioselective detection of a low molecular weight target molecule (less than 200 Da) by aptamer-based sensors. While QCM-D is a powerful technique for label-free, real-time characterization and quantification of molecular interactions at interfaces, the detection of small molecules interacting with immobilized receptors still remains a challenge. In the present study, we take advantage of the aptamer conformational changes upon the target binding that induces displacement of water acoustically coupled to the sensing layer. As a consequence, this phenomenon leads to a significant enhancement of the detection signal. The methodology is exemplified with the enantioselective recognition of a low molecular weight model compound, L-tyrosinamide (L-Tym). QCM-D monitoring of L-Tym interaction with the aptamer monolayer leads to an appreciable signal that can be further exploited for analytical purposes or thermodynamics studies. Furthermore, in situ combination of QCM-D with spectroscopic ellipsometry unambiguously demonstrates that the conformational change induces a nanometric decrease of the aptamer monolayer thickness. Since QCM-D is sensitive to the whole mass of the sensing layer including water that is acoustically coupled, a decrease in thickness of the highly hydrated aptamer layer induces a sizable release of water that can be easily detected by QCM-D.

  3. A Microfluidic Paper-Based Analytical Device for Rapid Quantification of Particulate Chromium

    PubMed Central

    Rattanarat, Poomrat; Dungchai, Wijitar; Cate, David M.; Siangproh, Weena; Volckens, John; Chailapakul, Orawon; Henry, Charles S.

    2013-01-01

    Occupational exposure to Cr is concerning because of its myriad health effects. Assessing chromium exposure is also cost- and resource-intensive because the analysis typically uses sophisticated instrumental techniques like Inductively-Coupled Plasma-Mass Spectrometry (ICP-MS). Here, we report a novel, simple, inexpensive microfluidic paper-based analytical device (µPAD) for measuring total Cr in airborne particulate matter. In the µPAD, tetravalent cerium (Ce(IV)) was used in a pretreatment zone to oxidize all soluble Cr to Cr(VI). After elution to the detection zone, Cr(VI) reacts with 1,5-diphenylcarbazide (1,5-DPC), forming 1,5-diphenylcarbazone (DPCO) and Cr(III). The resulting Cr(III) forms a distinct purple-colored complex with the DPCO. As proof-of-principle, particulate matter (PM) collected on a sample filter was analyzed with the µPAD to quantify the mass of total Cr. A log-linear working range (0.23–3.75 µg; r2=0.998) between Cr mass and color intensity was obtained, with a detection limit of 0.12 µg. For validation, a certified reference material containing multiple competing metals was analyzed. Quantitative agreement was obtained between known Cr levels in the sample and the Cr measured using the µPAD. PMID:24120167
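
    A minimal sketch of the log-linear calibration described above is given below; the intensities, the unknown reading, and the fitted values are invented for illustration only.

```python
# Hypothetical log-linear calibration: colour intensity vs ln(Cr mass).
import numpy as np

mass_ug = np.array([0.23, 0.47, 0.94, 1.88, 3.75])      # standards (ug Cr)
intensity = np.array([21.0, 34.5, 48.0, 62.5, 75.0])    # mean colour intensity (made up)

slope, intercept = np.polyfit(np.log(mass_ug), intensity, 1)
r = np.corrcoef(np.log(mass_ug), intensity)[0, 1]
print(f"I = {slope:.1f} * ln(m) + {intercept:.1f}   (r^2 = {r**2:.3f})")

# Invert the fit to quantify an unknown spot from its measured intensity.
unknown_intensity = 55.0
est_mass = float(np.exp((unknown_intensity - intercept) / slope))
print(f"estimated Cr mass: {est_mass:.2f} ug")
```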

  4. Combination of analytical quality specifications based on biological within- and between-subject variation.

    PubMed

    Petersen, Per Hyltoft; Fraser, Callum G; Jørgensen, Lone; Brandslund, Ivan; Stahl, Marta; Gowans, Elizabeth; Libeer, Jean-Claude; Ricós, Carmen

    2002-11-01

    At a conference on 'Strategies to Set Global Analytical Quality Specifications in Laboratory Medicine' in Stockholm in 1999, a hierarchy of models to set analytical quality specifications was decided. The consensus agreement from the conference defined the highest level as 'evaluation of the effect of analytical performance on clinical outcomes in specific clinical settings' and the second level as 'data based on components of biological variation'. Here, the many proposals for analytical quality specifications based on biological variation are examined and the outcomes of the different models for maximum allowable combined analytical imprecision and bias are illustrated graphically. The following models were investigated. (1) The Cotlove et al. (1970) model defining analytical imprecision (%CVA) in relation to the within-subject biological variation (%CV(W-S)) as: %CVA ≤ 0.5 × %CV(W-S) (where %CV is the percentage coefficient of variation). (2) The Gowans et al. (1988) concept, which defines a functional relationship between analytical imprecision and bias for the maximum allowable combination of errors for the purpose of sharing common reference intervals. (3) The European Group for the Evaluation of Reagents and Analytical Systems in Laboratory Medicine (EGE Lab) Working Group concept, which combines the Cotlove model with the Gowans concept using the maximal acceptable bias. (4) The External Quality Assessment (EQA) Organizers Working Group concept, which is close to the EGE Lab Working Group concept, but follows the Gowans et al. concept of imprecision up to the limit defined by the model of Cotlove et al. (5) The 'three-level' concept classifying analytical quality into three levels: optimum, desirable and minimum. The figures created clearly demonstrated that the results obtained were determined by the basic assumptions made. When %CV(W-S) is small compared with the population-based coefficient of variation [%CV(P) = (%CV(W-S)² + %CV(B-S)²)^(1/2)], the EGE Lab
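
    For orientation, the snippet below evaluates the widely quoted "three-level" (optimum/desirable/minimum) specifications for allowable imprecision and bias derived from within-subject (CVw) and between-subject (CVb) biological variation; the desirable imprecision level corresponds to the Cotlove-type limit %CVA ≤ 0.5 × %CV(W-S). The numerical CV values used are illustrative assumptions, not data from the paper.

```python
# Worked example: three-level quality specifications from biological variation.
import math

def quality_specs(cv_w, cv_b):
    """Return optimum/desirable/minimum allowable CVa and bias (all in %)."""
    group_cv = math.sqrt(cv_w**2 + cv_b**2)   # %CV(P) combining both components
    specs = {}
    for level, k in (("optimum", 0.25), ("desirable", 0.50), ("minimum", 0.75)):
        # imprecision limit: k * CVw ; bias limit: (k/2) * sqrt(CVw^2 + CVb^2)
        specs[level] = {"max CVa": k * cv_w, "max bias": 0.5 * k * group_cv}
    return specs

# Example: a hypothetical analyte with CVw = 6% and CVb = 12%
for level, s in quality_specs(6.0, 12.0).items():
    print(f"{level:9s}  CVa <= {s['max CVa']:.2f}%   bias <= {s['max bias']:.2f}%")
```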

  5. A smog chamber comparison of a microfluidic derivatisation measurement of gas-phase glyoxal and methylglyoxal with other analytical techniques

    NASA Astrophysics Data System (ADS)

    Pang, X.; Lewis, A. C.; Rickard, A. R.; Baeza-Romero, M. T.; Adams, T. J.; Ball, S. M.; Daniels, M. J. S.; Goodall, I. C. A.; Monks, P. S.; Peppe, S.; Ródenas García, M.; Sánchez, P.; Muñoz, A.

    2014-02-01

    A microfluidic lab-on-a-chip derivatisation technique has been developed to measure part per billion (ppbV) mixing ratios of gaseous glyoxal (GLY) and methylglyoxal (MGLY), and the method is compared with other techniques in a smog chamber experiment. The method uses o-(2, 3, 4, 5, 6-pentafluorobenzyl) hydroxylamine (PFBHA) as a derivatisation reagent and a microfabricated planar glass micro-reactor comprising an inlet, gas and fluid splitting and combining channels, mixing junctions, and a heated capillary reaction microchannel. The enhanced phase contact area-to-volume ratio and the high heat transfer rate in the micro-reactor resulted in a fast and highly efficient derivatisation reaction, generating an effluent stream ready for direct introduction to a gas chromatograph-mass spectrometer (GC-MS). A linear response for GLY was observed over a calibration range 0.7 to 400 ppbV, and for MGLY of 1.2 to 300 ppbV, when derivatised under optimal reaction conditions. The analytical performance shows good accuracy (6.6% for GLY and 7.5% for MGLY), suitable precision (<12.0%) with method detection limits (MDLs) of 75 pptV for GLY and 185 pptV for MGLY, with a time resolution of 30 min. These MDLs are below or close to typical concentrations of these compounds observed in ambient air. The feasibility of the technique was assessed by applying the methodology to quantify α-dicarbonyls formed during the photo-oxidation of isoprene in the EUPHORE chamber. Good correlations were found between microfluidic measurements and Fourier Transform InfraRed spectroscopy (FTIR) with a correlation coefficient (r2) of 0.84, Broadband Cavity Enhanced Absorption Spectroscopy (BBCEAS) (r2 = 0.75), solid phase micro extraction (SPME) (r2 = 0.89), and a photochemical chamber box modelling calculation (r2 = 0.79) for GLY measurements. For MGLY measurements, the microfluidic technique showed good agreement with BBCEAS (r2 = 0.87), SPME (r2 = 0.76), and the modeling simulation (r2 = 0.83), FTIR

  6. A method based on stochastic resonance for the detection of weak analytical signal.

    PubMed

    Wu, Xiaojing; Guo, Weiming; Cai, Wensheng; Shao, Xueguang; Pan, Zhongxiao

    2003-12-23

    An effective method for the detection of weak analytical signals with a strong noise background is proposed based on the theory of stochastic resonance (SR). Compared with conventional SR-based algorithms, the proposed algorithm is simplified by changing only one parameter to realize the weak signal detection. Simulation studies revealed that the method performs well in the detection of analytical signals against a very high level of background noise and is suitable for detecting signals with different noise levels by changing the parameter. Applications of the method to experimental weak signals from X-ray diffraction and Raman spectra are also investigated. It is found that reliable results can be obtained.
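
    A minimal, hypothetical illustration of the stochastic-resonance idea is sketched below: a weak sinusoid buried in strong noise drives an overdamped bistable system, and a single system parameter is scanned while the output power at the signal frequency is compared with the background. The dynamics, parameter values, and SNR measure are generic textbook choices, not the algorithm of the paper.

```python
# Sketch: scan one parameter of a bistable system driven by a noisy weak signal.
import numpy as np

fs, T = 200.0, 200.0                         # sample rate (Hz), duration (s)
t = np.arange(0, T, 1 / fs)
f0 = 0.5                                     # weak signal frequency (Hz)
rng = np.random.default_rng(2)
drive = 0.08 * np.sin(2 * np.pi * f0 * t) + 1.5 * rng.standard_normal(t.size)

def bistable_response(drive, a, b=1.0, dt=1 / fs):
    """Euler integration of dx/dt = a*x - b*x**3 + drive(t)."""
    x = np.empty_like(drive)
    x[0] = 0.0
    for i in range(1, drive.size):
        x[i] = x[i - 1] + dt * (a * x[i - 1] - b * x[i - 1] ** 3 + drive[i - 1])
    return x

def peak_snr(sig):
    """Power at f0 relative to the median background power of the spectrum."""
    spec = np.abs(np.fft.rfft(sig - sig.mean())) ** 2
    freqs = np.fft.rfftfreq(sig.size, 1 / fs)
    return spec[np.argmin(np.abs(freqs - f0))] / np.median(spec)

print("raw drive   SNR:", round(peak_snr(drive), 1))
for a in (0.5, 1.0, 2.0):                    # the single tuning parameter
    print(f"a = {a:3.1f}   SNR:", round(peak_snr(bistable_response(drive, a)), 1))
```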

  7. Development and application of analytical techniques to chemistry of donor solvent liquefaction. Quarterly progress report, June 1979-December 1979

    SciTech Connect

    Squires, A.M.; Dorn, H.C.; Taylor, L.T.; Dillard, J.G.; Rony, P.R.

    1980-03-01

    It is clear from the limited results in this report that flow LC-NMR is a viable approach for rapid analysis of complex mixtures encountered in petroleum, shale, and coal products. Some of the initial results and implications of this feasibility study are summarized below: (1) The time savings gained by the flow LC-NMR approach is enormous when compared with normal chromatographic procedures of fraction collection, solvent evaporation, and preparation for spectroscopic examination. (2) The present results indicate limits of detection which require semi-preparative chromatographic loads. State-of-the-art NMR instrumentation should extend this approach to truly analytical columns for ¹H NMR. We are continuing this development at the present time. (3) The flow LC-NMR approach has been extended to ¹⁹F NMR. This complements the fluorine tagging work which is also a major part of the present contract. An additional advantage is the wider scope of chromatographic solvents which can be utilized. (4) Although the present study focused on relatively nonpolar solvent systems, this approach can be extended to more polar solvents which would allow ready examination of more polar constituents in coal products. (5) The flow LC-NMR approach is compatible with the other on-line LC detection techniques being developed at VPI (e.g., FT-IR, ICP, etc.).

  8. The Ophidia framework: toward cloud-based data analytics for climate change

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Elia, Donatello; Mancini, Marco; Mariello, Andrea; Mirto, Maria; Palazzo, Cosimo; Aloisio, Giovanni

    2015-04-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in the climate change domain. It provides parallel (server-side) data analysis, an internal storage model and a hierarchical data organization to manage large amounts of multidimensional scientific data. The Ophidia analytics platform provides several MPI-based parallel operators to manipulate large datasets (data cubes) and array-based primitives to perform data analysis on large arrays of scientific data. The most relevant data analytics use cases implemented in national and international projects target fire danger prevention (OFIDIA), interactions between climate change and biodiversity (EUBrazilCC), climate indicators and remote data analysis (CLIP-C), sea situational awareness (TESSA), and large-scale data analytics on CMIP5 data in NetCDF format, compliant with the Climate and Forecast (CF) convention (ExArch). Two use cases, from the EU FP7 EUBrazil Cloud Connect and the INTERREG OFIDIA projects, will be presented during the talk. In the former case (EUBrazilCC) the Ophidia framework is being extended to integrate scalable VM-based solutions for the management of large volumes of scientific data (both climate and satellite data) in a cloud-based environment to study how climate change affects biodiversity. In the latter (OFIDIA) the data analytics framework is being exploited to provide operational support for processing chains devoted to fire danger prevention. To tackle the project challenges, data analytics workflows consisting of about 130 operators perform, among others, parallel data analysis, metadata management, virtual file system tasks, maps generation, rolling of datasets, and import/export of datasets in NetCDF format. Finally, the entire Ophidia software stack has been deployed at CMCC on 24 nodes (16 cores/node) of the Athena HPC cluster. Moreover, a cloud-based release tested with OpenNebula is also available and running in the private

  9. Research and development of LANDSAT-based crop inventory techniques

    NASA Technical Reports Server (NTRS)

    Horvath, R.; Cicone, R. C.; Malila, W. A. (Principal Investigator)

    1982-01-01

    A wide spectrum of technology pertaining to the inventory of crops using LANDSAT without in situ training data is addressed. Methods considered include Bayesian-based through-the-season methods, estimation technology based on analytical profile fitting methods, and expert-based computer-aided methods. Although the research was conducted using U.S. data, the adaptation of the technology to the Southern Hemisphere, especially Argentina, was considered.

  10. Liquid Tunable Microlenses based on MEMS techniques

    PubMed Central

    Zeng, Xuefeng; Jiang, Hongrui

    2013-01-01

    The recent rapid development in microlens technology has provided many opportunities for miniaturized optical systems, and has found a wide range of applications. Of these microlenses, tunable-focus microlenses are of special interest as their focal lengths can be tuned using micro-scale actuators integrated with the lens structure. Realization of such tunable microlenses generally relies on microelectromechanical system (MEMS) technologies. Here, we review the recent progress in tunable liquid microlenses. The underlying physics relevant to these microlenses is first discussed, followed by a description of the three main categories of tunable microlenses involving MEMS techniques: mechanically driven, electrically driven, and those integrated within microfluidic systems. PMID:24163480

  11. Creating analytically divergence-free velocity fields from grid-based data

    NASA Astrophysics Data System (ADS)

    Ravu, Bharath; Rudman, Murray; Metcalfe, Guy; Lester, Daniel R.; Khakhar, Devang V.

    2016-10-01

    We present a method, based on B-splines, to calculate a C2 continuous analytic vector potential from discrete 3D velocity data on a regular grid. A continuous, analytically divergence-free velocity field can then be obtained from the curl of the potential. This field can be used to robustly and accurately integrate particle trajectories in incompressible flow fields. Based on the method of Finn and Chacon (2005) [10], this new method ensures that the analytic velocity field matches the grid values almost everywhere, with errors that are two to four orders of magnitude lower than those of existing methods. We demonstrate its application to three different problems (each in a different coordinate system) and provide details of the specifics required in each case. We show how the additional accuracy of the method results in qualitatively and quantitatively superior trajectories, which in turn allow more accurate identification of Lagrangian coherent structures.
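
    The sketch below illustrates the underlying idea in two dimensions rather than the 3D B-spline construction of the paper: gridded values of a scalar potential (here a stream function) are fitted with a C2 bicubic spline, and the velocity is taken from its analytic derivatives, so the divergence vanishes identically. The grid, the potential, and the test point are assumptions for illustration.

```python
# 2-D analogue: divergence-free velocity from a spline-fitted stream function.
import numpy as np
from scipy.interpolate import RectBivariateSpline

x = np.linspace(0, 2 * np.pi, 64)
y = np.linspace(0, 2 * np.pi, 64)
X, Y = np.meshgrid(x, y, indexing="ij")
psi_grid = np.sin(X) * np.sin(Y)                 # sample "grid-based data"

psi = RectBivariateSpline(x, y, psi_grid, kx=3, ky=3)

def velocity(px, py):
    """Analytically divergence-free velocity: u = dpsi/dy, v = -dpsi/dx."""
    u = psi.ev(px, py, dx=0, dy=1)
    v = -psi.ev(px, py, dx=1, dy=0)
    return u, v

# Check the divergence at an off-grid point with centred differences.
h, p = 1e-5, (1.234, 2.345)
div = ((velocity(p[0] + h, p[1])[0] - velocity(p[0] - h, p[1])[0]) +
       (velocity(p[0], p[1] + h)[1] - velocity(p[0], p[1] - h)[1])) / (2 * h)
print("numerical divergence at an off-grid point:", div)
```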

  12. Simultaneous determination of antazoline and naphazoline by the net analyte signal standard addition method and spectrophotometric technique.

    PubMed

    Asadpour-Zeynali, Karim; Ghavami, Raoof; Esfandiari, Roghayeh; Soheili-Azad, Payam

    2010-01-01

    A novel net analyte signal standard addition method (NASSAM) was used for the simultaneous determination of the drugs antazoline and naphazoline. The NASSAM can be applied for the determination of analytes in the presence of known interferents. The proposed method eliminates the calibration and prediction steps of multivariate calibration methods; the determination is carried out in a single step for each analyte. Unlike the H-point standard addition method, the accuracy of the predictions is independent of the shape of the analyte and interferent spectra. The net analyte signal concept was also used to calculate multivariate analytical figures of merit, such as LOD, selectivity, and sensitivity. The method was successfully applied to the simultaneous determination of antazoline and naphazoline in a commercial eye drop sample.

  13. Effects of Mountain Ultra-Marathon Running on ROS Production and Oxidative Damage by Micro-Invasive Analytic Techniques

    PubMed Central

    Mrakic-Sposta, Simona; Gussoni, Maristella; Moretti, Sarah; Pratali, Lorenza; Giardini, Guido; Tacchini, Philippe; Dellanoce, Cinzia; Tonacci, Alessandro; Mastorci, Francesca; Borghini, Andrea; Montorsi, Michela; Vezzoli, Alessandra

    2015-01-01

    Purpose Aiming to gain a detailed insight into the physiological mechanisms involved under extreme conditions, a group of experienced ultra-marathon runners performing the mountain Tor des Géants® ultra-marathon (330 km trail-run in Valle d’Aosta, 24000 m of positive and negative elevation changes) was monitored. ROS production rate, antioxidant capacity, oxidative damage and inflammation markers were assessed, adopting micro-invasive analytic techniques. Methods Forty-six male athletes (45.04±8.75 yr, 72.6±8.4 kg, 1.76±0.05 m) were tested. Capillary blood and urine were collected before (Pre-), in the middle of (Middle-) and immediately after (Post-) the race. Samples were analyzed for: Reactive Oxygen Species (ROS) production by Electron Paramagnetic Resonance; Antioxidant Capacity by Electrochemistry; oxidative damage (8-hydroxy-2-deoxy Guanosine: 8-OH-dG; 8-isoprostane: 8-isoPGF2α) and nitric oxide metabolites by enzymatic assays; inflammatory biomarkers (plasma and urine interleukin-6: IL-6-P and IL-6-U) by enzyme-linked immunosorbent assays (ELISA); Creatinine and Neopterin by HPLC; and hematologic (lactate, glucose and hematocrit) and urine parameters by standard analyses. Results Twenty-five athletes finished the race, while twenty-one dropped out of it. A significant increase (Post-Race vs Pre-) of the ROS production rate (2.20±0.27 vs 1.65±0.22 μmol.min-1), oxidative damage biomarkers (8-OH-dG: 6.32±2.38 vs 4.16±1.25 ng.mg-1 Creatinine and 8-isoPGF2α: 1404.0±518.30 vs 822.51±448.91 pg.mg-1 Creatinine), inflammatory state (IL-6-P: 66.42±36.92 vs 1.29±0.54 pg.mL-1 and IL-6-U: 1.33±0.56 vs 0.71±0.17 pg.mL-1) and lactate production (+190%), associated with a decrease of both antioxidant capacity (-7%) and renal function (i.e. Creatinine level +76%), was found. Conclusions The micro-invasive analytic methods used allowed us to perform most of the analyses before, during and immediately after the race directly in the field, bypassing the need of storing and

  14. Trends and Techniques for Space Base Electronics

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Wade, T. E.; Gassaway, J. D.

    1979-01-01

    Simulations of various phosphorus and boron diffusions in SOS were completed, and a sputtering system, furnaces, and photolithography-related equipment were set up. Double-layer metal experiments initially utilized wet chemistry techniques. By incorporating ultrasonic etching of the vias, premetal cleaning with a modified buffered HF, phosphorus-doped vapox, and extended sintering, yields of 98% were obtained using the standard test pattern. A two-dimensional modeling program was written for simulating short-channel MOSFETs with nonuniform substrate doping. A key simplifying assumption used is that the majority carriers can be represented by a sheet charge at the silicon dioxide-silicon interface. Although the program is incomplete, solution of the two-dimensional Poisson equation for the potential distribution was achieved. The status of other 2-D MOSFET simulation programs is summarized.
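
    As a generic illustration of the kind of calculation referred to (solving the two-dimensional Poisson equation for a potential distribution), the sketch below runs a simple Jacobi finite-difference iteration on a square grid. It is not the simulation program described in the report; the grid size, charge distribution, and boundary conditions are assumptions.

```python
# Generic 2-D Poisson sketch: Jacobi iteration of laplacian(phi) = -rho.
import numpy as np

n, h = 65, 1.0 / 64                   # grid points per side, grid spacing
phi = np.zeros((n, n))                # potential; Dirichlet phi = 0 on the boundary
rho = np.zeros((n, n))
rho[n // 2, n // 2] = 1.0 / h**2      # a point-like charge at the centre

for _ in range(5000):                 # plain Jacobi relaxation
    phi[1:-1, 1:-1] = 0.25 * (phi[2:, 1:-1] + phi[:-2, 1:-1] +
                              phi[1:-1, 2:] + phi[1:-1, :-2] +
                              h**2 * rho[1:-1, 1:-1])

print("potential at the charge location:", round(phi[n // 2, n // 2], 4))
```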

  15. Analytical Derivation: An Epistemic Game for Solving Mathematically Based Physics Problems

    ERIC Educational Resources Information Center

    Bajracharya, Rabindra R.; Thompson, John R.

    2016-01-01

    Problem solving, which often involves multiple steps, is an integral part of physics learning and teaching. Using the perspective of the epistemic game, we documented a specific game that is commonly pursued by students while solving mathematically based physics problems: the "analytical derivation" game. This game involves deriving an…

  16. Combining Multiple Measures of Students' Opportunities to Develop Analytic, Text-Based Writing Skills

    ERIC Educational Resources Information Center

    Correnti, Richard; Matsumura, Lindsay Clare; Hamilton, Laura S.; Wang, Elaine

    2012-01-01

    Guided by evidence that teachers contribute to student achievement outcomes, researchers have been reexamining how to study instruction and the classroom opportunities teachers create for students. We describe our experience measuring students' opportunities to develop analytic, text-based writing skills. Utilizing multiple methods of data…

  17. The Effect of Brain Based Learning on Academic Achievement: A Meta-Analytical Study

    ERIC Educational Resources Information Center

    Gozuyesil, Eda; Dikici, Ayhan

    2014-01-01

    This study's aim is to measure the effect sizes of the quantitative studies that examined the effectiveness of brain-based learning on students' academic achievement and to examine with the meta-analytical method if there is a significant difference in effect in terms of the factors of education level, subject matter, sampling size, and…

  18. Effects of Computer Based Learning on Students' Attitudes and Achievements towards Analytical Chemistry

    ERIC Educational Resources Information Center

    Akcay, Hüsamettin; Durmaz, Asli; Tüysüz, Cengiz; Feyzioglu, Burak

    2006-01-01

    The aim of this study was to compare the effects of computer-based learning and traditional method on students' attitudes and achievement towards analytical chemistry. Students from Chemistry Education Department at Dokuz Eylul University (D.E.U) were selected randomly and divided into three groups; two experimental (Eg-1 and Eg-2) and a control…

  19. Island Explorations: Discovering Effects of Environmental Research-Based Lab Activities on Analytical Chemistry Students

    ERIC Educational Resources Information Center

    Tomasik, Janice Hall; LeCaptain, Dale; Murphy, Sarah; Martin, Mary; Knight, Rachel M.; Harke, Maureen A.; Burke, Ryan; Beck, Kara; Acevedo-Polakovich, I. David

    2014-01-01

    Motivating students in analytical chemistry can be challenging, in part because of the complexity and breadth of topics involved. Some methods that help encourage students and convey real-world relevancy of the material include incorporating environmental issues, research-based lab experiments, and service learning projects. In this paper, we…

  20. Merging Old and New: An Instrumentation-Based Introductory Analytical Laboratory

    ERIC Educational Resources Information Center

    Jensen, Mark B.

    2015-01-01

    An instrumentation-based laboratory curriculum combining traditional unknown analyses with student-designed projects has been developed for an introductory analytical chemistry course. In the first half of the course, students develop laboratory skills and instrumental proficiency by rotating through six different instruments performing…

  1. Key Point Based Data Analysis Technique

    NASA Astrophysics Data System (ADS)

    Yang, Su; Zhang, Yong

    In this paper, a new framework for data analysis based on the "key points" in data distribution is proposed. Here, the key points contain three types of data points: bridge points, border points, and skeleton points, where our main contribution is the bridge points. For each type of key points, we have developed the corresponding detection algorithm and tested its effectiveness with several synthetic data sets. Meanwhile, we further developed a new hierarchical clustering algorithm SPHC (Skeleton Point based Hierarchical Clustering) to demonstrate the possible applications of the key points acquired. Based on some real-world data sets, we experimentally show that SPHC performs better compared with several classical clustering algorithms including Complete-Link Hierarchical Clustering, Single-Link Hierarchical Clustering, KMeans, Ncut, and DBSCAN.

  2. An Analytical Technique to Elucidate Field Impurities From Manufacturing Uncertainties of a Double Pancake Type HTS Insert for High Field LTS/HTS NMR Magnets

    PubMed Central

    Hahn, Seung-yong; Ahn, Min Cheol; Bobrov, Emanuel Saul; Bascuñán, Juan; Iwasa, Yukikazu

    2010-01-01

    This paper addresses adverse effects of dimensional uncertainties of an HTS insert assembled with double-pancake coils on spatial field homogeneity. Each DP coil was wound with Bi2223 tapes having dimensional tolerances larger than one order of magnitude of those accepted for LTS wires used in conventional NMR magnets. The paper presents: 1) dimensional variations measured in two LTS/HTS NMR magnets, 350 MHz (LH350) and 700 MHz (LH700), both built and operated at the Francis Bitter Magnet Laboratory; and 2) an analytical technique and its application to elucidate the field impurities measured with the two LTS/HTS magnets. Field impurities computed with the analytical model and those measured with the two LTS/HTS magnets agree quite well, demonstrating that this analytical technique is applicable to design a DP-assembled HTS insert with an improved field homogeneity for a high-field LTS/HTS NMR magnet. PMID:20407595

  3. An Analytical Technique to Elucidate Field Impurities From Manufacturing Uncertainties of a Double Pancake Type HTS Insert for High Field LTS/HTS NMR Magnets.

    PubMed

    Hahn, Seung-Yong; Ahn, Min Cheol; Bobrov, Emanuel Saul; Bascuñán, Juan; Iwasa, Yukikazu

    2009-06-01

    This paper addresses adverse effects of dimensional uncertainties of an HTS insert assembled with double-pancake coils on spatial field homogeneity. Each DP coil was wound with Bi2223 tapes having dimensional tolerances larger than one order of magnitude of those accepted for LTS wires used in conventional NMR magnets. The paper presents: 1) dimensional variations measured in two LTS/HTS NMR magnets, 350 MHz (LH350) and 700 MHz (LH700), both built and operated at the Francis Bitter Magnet Laboratory; and 2) an analytical technique and its application to elucidate the field impurities measured with the two LTS/HTS magnets. Field impurities computed with the analytical model and those measured with the two LTS/HTS magnets agree quite well, demonstrating that this analytical technique is applicable to design a DP-assembled HTS insert with an improved field homogeneity for a high-field LTS/HTS NMR magnet.

  4. Rapid and alternative fabrication method for microfluidic paper based analytical devices.

    PubMed

    Malekghasemi, Soheil; Kahveci, Enver; Duman, Memed

    2016-10-01

    A major application of microfluidic paper-based analytical devices (µPADs) is in the field of point-of-care (POC) diagnostics. It is important for POC diagnostics to possess properties such as ease-of-use and low cost. However, µPADs typically need multiple instruments and fabrication steps. In this study, two different chemicals (Hexamethyldisilazane and Tetra-ethylorthosilicate) were used, and three different methods (heating, plasma treatment, and microwave irradiation) were compared to develop µPADs. Additionally, an inkjet-printing technique was used for generating a hydrophilic channel and printing certain chemical agents on different regions of a modified filter paper. A rapid and effective fabrication method to develop µPADs within 10 min was introduced using an inkjet-printing technique in conjunction with a microwave irradiation method. An environmental scanning electron microscope (ESEM) and x-ray photoelectron spectroscopy (XPS) were used for morphology characterization and for determining the surface chemical compositions of the modified filter paper, respectively. Contact angle measurements were used to confirm the hydrophobicity of the treated filter paper. The highest contact angle value (141°±1) was obtained using the microwave irradiation method over a period of 7 min, when the filter paper was modified by TEOS. Furthermore, using this method, the XPS results of the TEOS-modified filter paper revealed Si2p (23%) and Si-O bonds (81.55%), indicating the presence of Si-O-Si bridges and Si(OEt) groups, respectively. The ESEM results revealed changes in the porous structures of the papers and decreases in the pore sizes. Washburn assay measurements tested the efficiency of the generated hydrophilic channels, in which similar water penetration rates were observed in the TEOS-modified filter paper and unmodified (plain) filter paper. The validation of the developed µPADs was performed by utilizing the rapid urease test as a model test system. The detection limit of

  5. Rapid and alternative fabrication method for microfluidic paper based analytical devices.

    PubMed

    Malekghasemi, Soheil; Kahveci, Enver; Duman, Memed

    2016-10-01

    A major application of microfluidic paper-based analytical devices (µPADs) is in the field of point-of-care (POC) diagnostics. It is important for POC diagnostics to possess properties such as ease-of-use and low cost. However, µPADs typically need multiple instruments and fabrication steps. In this study, two different chemicals (Hexamethyldisilazane and Tetra-ethylorthosilicate) were used, and three different methods (heating, plasma treatment, and microwave irradiation) were compared to develop µPADs. Additionally, an inkjet-printing technique was used for generating a hydrophilic channel and printing certain chemical agents on different regions of a modified filter paper. A rapid and effective fabrication method to develop µPADs within 10 min was introduced using an inkjet-printing technique in conjunction with a microwave irradiation method. An environmental scanning electron microscope (ESEM) and x-ray photoelectron spectroscopy (XPS) were used for morphology characterization and for determining the surface chemical compositions of the modified filter paper, respectively. Contact angle measurements were used to confirm the hydrophobicity of the treated filter paper. The highest contact angle value (141°±1) was obtained using the microwave irradiation method over a period of 7 min, when the filter paper was modified by TEOS. Furthermore, using this method, the XPS results of the TEOS-modified filter paper revealed Si2p (23%) and Si-O bonds (81.55%), indicating the presence of Si-O-Si bridges and Si(OEt) groups, respectively. The ESEM results revealed changes in the porous structures of the papers and decreases in the pore sizes. Washburn assay measurements tested the efficiency of the generated hydrophilic channels, in which similar water penetration rates were observed in the TEOS-modified filter paper and unmodified (plain) filter paper. The validation of the developed µPADs was performed by utilizing the rapid urease test as a model test system. The detection limit of

  6. Characterization and source term assessments of radioactive particles from Marshall Islands using non-destructive analytical techniques

    NASA Astrophysics Data System (ADS)

    Jernström, J.; Eriksson, M.; Simon, R.; Tamborini, G.; Bildstein, O.; Marquez, R. Carlos; Kehl, S. R.; Hamilton, T. F.; Ranebo, Y.; Betti, M.

    2006-08-01

    Six plutonium-containing particles stemming from Runit Island soil (Marshall Islands) were characterized by non-destructive analytical and microanalytical methods. The composition and elemental distribution in the particles were studied with synchrotron radiation based micro X-ray fluorescence spectrometry. A scanning electron microscope equipped with an energy dispersive X-ray detector and a wavelength dispersive system, as well as a secondary ion mass spectrometer, were used to examine the particle surfaces. Based on the elemental composition, the particles were divided into two groups: particles with a pure Pu matrix, and particles where the plutonium is included in a Si/O-rich matrix and more heterogeneously distributed. All of the particles were identified as nuclear fuel fragments of exploded weapon components. Because they contain plutonium with a low 240Pu/239Pu atomic ratio, less than 0.065, which corresponds to weapons-grade plutonium or a detonation with a low fission yield, the particles were identified as originating from the safety test and low-yield tests conducted in the history of Runit Island. The Si/O-rich particles contained traces of 137Cs (239+240Pu/137Cs activity ratio higher than 2500), which indicated that a minor fission process occurred during the explosion. The average 241Am/239Pu atomic ratio in the six particles was 3.7 × 10⁻³ ± 0.2 × 10⁻³ (February 2006), which indicated that the plutonium in the different particles had a similar age.

  7. Latent practice profiles of substance abuse treatment counselors: do evidence-based techniques displace traditional techniques?

    PubMed

    Smith, Brenda D; Liu, Junqing

    2014-04-01

    As more substance abuse treatment counselors begin to use evidence-based treatment techniques, questions arise regarding the continued use of traditional techniques. This study aims to (1) assess whether there are meaningful practice profiles among practitioners reflecting distinct combinations of cognitive-behavioral and traditional treatment techniques; and (2) if so, identify practitioner characteristics associated with the distinct practice profiles. Survey data from 278 frontline counselors working in community substance abuse treatment organizations were used to conduct latent profile analysis. The emergent practice profiles illustrate that practitioners vary most in the use of traditional techniques. Multinomial regression models suggest that practitioners with less experience, more education, and less traditional beliefs about treatment and substance abuse are least likely to mix traditional techniques with cognitive-behavioral techniques. Findings add to the understanding of how evidence-based practices are implemented in routine settings and have implications for training and support of substance abuse treatment counselors.

  8. Acid-base titrations using microfluidic paper-based analytical devices.

    PubMed

    Karita, Shingo; Kaneta, Takashi

    2014-12-16

    Rapid and simple acid-base titration was accomplished using a novel microfluidic paper-based analytical device (μPAD). The μPAD was fabricated by wax printing and consisted of ten reservoirs for reaction and detection. The reaction reservoirs contained various amounts of a primary standard substance, potassium hydrogen phthalate (KHPth), whereas a constant amount of phenolphthalein was added to all the detection reservoirs. A sample solution containing NaOH was dropped onto the center of the μPAD and was allowed to spread to the reaction reservoirs, where the KHPth neutralized it. When the amount of NaOH exceeded that of the KHPth in a reaction reservoir, unneutralized hydroxide ion penetrated the corresponding detection reservoir, resulting in a color reaction from the phenolphthalein. Therefore, the number of detection reservoirs with no color change determined the concentration of the NaOH in the sample solution. The titration was completed within 1 min by visually determining the end point, which required neither instrumentation nor software. The volumes of the KHPth and phenolphthalein solutions added to the corresponding reservoirs were optimized to obtain reproducible and accurate results for the concentration of NaOH. The μPADs determined the concentration of NaOH over a range spanning two orders of magnitude, from 0.01 to 1 M. An acid sample, HCl, was also determined, using Na2CO3 as a primary standard substance instead of KHPth. Furthermore, the μPAD was applicable to the titrations of nitric acid, sulfuric acid, acetic acid, and ammonia solutions. The μPADs were stable for more than 1 month when stored in darkness at room temperature, although this was reduced to only 5 days under daylight conditions. The analysis of acidic hot spring water was also demonstrated in the field using the μPAD, and the results agreed well with those obtained by classic acid-base titration.
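
    A back-of-the-envelope sketch of how the reservoir count brackets the NaOH concentration is given below. The KHPth loadings, the sample volume reaching each reservoir, and the colour pattern are invented numbers, not values from the paper.

```python
# Hypothetical mapping from the colour pattern of the detection reservoirs
# to a bracketed NaOH concentration.
khpth_umol = [0.1, 0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8, 25.6, 51.2]  # per reservoir
sample_volume_ul = 5.0                        # sample volume reaching each reservoir

def bracket_naoh(coloured):
    """coloured[i] is True if detection reservoir i turned pink."""
    neutralised = [k for k, c in zip(khpth_umol, coloured) if not c]  # NaOH <= KHPth
    exceeded = [k for k, c in zip(khpth_umol, coloured) if c]         # NaOH >  KHPth
    lo = max(exceeded, default=0.0) / sample_volume_ul * 1e3          # lower bound, mM
    hi = min(neutralised, default=float("inf")) / sample_volume_ul * 1e3
    return lo, hi

# Example pattern: only the four reservoirs with the least KHPth turned pink.
pattern = [True, True, True, True, False, False, False, False, False, False]
lo, hi = bracket_naoh(pattern)
print(f"NaOH concentration between {lo:.0f} and {hi:.0f} mM")
```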

  9. Analytical calculation of intrinsic shielding effectiveness for isotropic and anisotropic materials based on measured electrical parameters

    NASA Astrophysics Data System (ADS)

    Kühn, M.; John, W.; Weigel, R.

    2014-11-01

    This contribution presents mechanisms for calculating the magnetic shielding effectiveness of material samples, based on measured electrical parameters. To this end, measurement systems for the electrical conductivity of highly and weakly conductive material samples, with respect to the direction of current flow, are presented and discussed. A definition of isotropic and anisotropic materials, with electrical circuit diagrams, is also given. For the prediction of shielding effectiveness for isotropic and anisotropic materials, several analytical models are presented, together with adaptations to obtain a near-field solution. All analytical models are validated with an adequate measurement system.
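
    For context, the sketch below evaluates the textbook plane-wave shielding-effectiveness expression for a homogeneous conductive sheet from a measured conductivity. It is not one of the near-field or anisotropic models of the contribution, only an example of the kind of closed-form calculation such models build on; the sheet thickness and conductivity are assumed values.

```python
# Textbook plane-wave SE of a conductive slab, computed from conductivity.
import numpy as np

mu0 = 4e-7 * np.pi
eta0 = 376.73                                 # free-space wave impedance (ohm)

def shielding_effectiveness_db(sigma, thickness, freq, mu_r=1.0):
    """SE (dB) of a conductive slab between free-space half spaces (plane wave)."""
    omega = 2 * np.pi * freq
    delta = np.sqrt(2.0 / (omega * mu0 * mu_r * sigma))   # skin depth
    gamma = (1 + 1j) / delta                              # propagation constant
    z_shield = (1 + 1j) / (sigma * delta)                 # shield intrinsic impedance
    k = eta0 / z_shield
    t = np.cosh(gamma * thickness) + 0.5 * (k + 1 / k) * np.sinh(gamma * thickness)
    return 20 * np.log10(np.abs(t))

# Example: a 35 um sheet with a measured conductivity of 1e6 S/m (assumed).
for f in (1e4, 1e6, 1e8):
    print(f"f = {f:9.0f} Hz   SE = {shielding_effectiveness_db(1e6, 35e-6, f):6.1f} dB")
```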

  10. Uses of Multivariate Analytical Techniques in Online and Blended Business Education: An Assessment of Current Practice and Recommendations for Future Research

    ERIC Educational Resources Information Center

    Arbaugh, J. B.; Hwang, Alvin

    2013-01-01

    Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…

  11. Accelerator based techniques for contraband detection

    NASA Astrophysics Data System (ADS)

    Vourvopoulos, George

    1994-05-01

    It has been shown that narcotics, explosives, and other contraband materials, contain various chemical elements such as H, C, N, O, P, S, and Cl in quantities and ratios that differentiate them from each other and from other innocuous substances. Neutrons and γ-rays have the ability to penetrate through various materials at large depths. They are thus able, in a non-intrusive way, to interrogate volumes ranging from suitcases to Sea-Land containers, and have the ability to image the object with an appreciable degree of reliability. Neutron induced reactions such as (n, γ), (n, n') (n, p) or proton induced γ-resonance absorption are some of the reactions currently investigated for the identification of the chemical elements mentioned above. Various DC and pulsed techniques are discussed and their advantages, characteristics, and current progress are shown. Areas where use of these methods is currently under evaluation are detection of hidden explosives, illicit drug interdiction, chemical war agents identification, nuclear waste assay, nuclear weapons destruction and others.

  12. Nondestructive atomic compositional analysis of BeMgZnO quaternary alloys using ion beam analytical techniques

    NASA Astrophysics Data System (ADS)

    Zolnai, Z.; Toporkov, M.; Volk, J.; Demchenko, D. O.; Okur, S.; Szabó, Z.; Özgür, Ü.; Morkoç, H.; Avrutin, V.; Kótai, E.

    2015-02-01

    The atomic composition was measured with less than 1-2 atom% uncertainty in ternary BeZnO and quaternary BeMgZnO alloys using a combination of nondestructive Rutherford backscattering spectrometry with a 1 MeV He+ analyzing ion beam and non-Rutherford elastic backscattering experiments with 2.53 MeV protons. An enhancement factor of 60 in the cross-section of Be for protons has been achieved to monitor Be atomic concentrations. Quantitative analysis of BeZnO and BeMgZnO systems is usually challenging owing to the difficulty of detecting the light element Be with satisfactory accuracy using common experimental tools. As shown, the applied ion beam technique, supported by detailed simulation of the ion stopping, backscattering, and detection processes, allows quantitative depth profiling and compositional analysis of wurtzite BeZnO/ZnO/sapphire and BeMgZnO/ZnO/sapphire layer structures with low uncertainty for both Be and Mg. In addition, the excitonic bandgaps of the layers were deduced from optical transmittance measurements. To augment the measured compositions and bandgaps of BeO and MgO co-alloyed ZnO layers, hybrid density functional bandgap calculations were performed while varying the Be and Mg contents. The theoretical vs. experimental bandgaps show a linear correlation over the entire bandgap range studied, from 3.26 eV to 4.62 eV. The analytical method employed should help facilitate bandgap engineering for potential applications, such as solar blind UV photodetectors and heterostructures for UV emitters and intersubband devices.

  13. Increasing throughput of surface plasmon resonance-based biosensors by multiple analyte injections.

    PubMed

    Mehand, Massinissa Si; De Crescenzo, Gregory; Srinivasan, Bala

    2012-04-01

    Surface plasmon resonance-based biosensors are now acknowledged as robust and reliable instruments to determine the kinetic parameters related to the interactions between biomolecules. These kinetic parameters are used in screening campaigns: there is a considerable interest in reducing the experimental time, thus improving the throughput of the surface plasmon resonance assays. Kinetic parameters are typically obtained by analyzing data from several injections of a given analyte at different concentrations over a surface where its binding partner has been immobilized. It has been already proven that an iterative optimization approach aiming at determining optimal analyte injections to be performed online can significantly reduce the experimentation time devoted to kinetic parameter determination, without any detrimental effect on their standard errors. In this study, we explore the potential of this iterative optimization approach to further reduce experiment duration by combining it with the simultaneous injection of two analytes. PMID:22434710

  14. Aptamer- and nucleic acid enzyme-based systems for simultaneous detection of multiple analytes

    DOEpatents

    Lu, Yi; Liu, Juewen

    2011-11-15

    The present invention provides aptamer- and nucleic acid enzyme-based systems for simultaneously determining the presence and optionally the concentration of multiple analytes in a sample. Methods of utilizing the system and kits that include the sensor components are also provided. The system includes a first reactive polynucleotide that reacts to a first analyte; a second reactive polynucleotide that reacts to a second analyte; a third polynucleotide; a fourth polynucleotide; a first particle, coupled to the third polynucleotide; a second particle, coupled to the fourth polynucleotide; and at least one quencher, for quenching emissions of the first and second quantum dots, coupled to the first and second reactive polynucleotides. The first particle includes a quantum dot having a first emission wavelength. The second particle includes a second quantum dot having a second emission wavelength different from the first emission wavelength. The third polynucleotide and the fourth polynucleotide are different.

  15. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and of nonlinearity between the property and the spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method was first used to obtain the net analyte signal of the calibration samples and of the unknown samples; the Euclidean distance between the net analyte signal of each unknown sample and those of the calibration samples was then calculated and used as a similarity index. According to the defined similarity index, a local calibration set was individually selected for each unknown sample. Finally, a local PLS regression model was built on the local calibration set of each unknown sample. The proposed method was applied to a set of near infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and of a conventional local regression algorithm based on spectral Euclidean distance.
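
    A minimal sketch of the workflow described above is given below, assuming one common way to estimate the net analyte signal (rank-one deflation of the analyte contribution followed by projection onto the orthogonal complement of the interference subspace). The synthetic data, the number of retained interference components and the neighbourhood size are placeholders, so this is an illustration of the idea rather than the authors' exact algorithm.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        def nas_projector(X_cal, y_cal, n_interf=5):
            """Projector onto the orthogonal complement of the estimated interference subspace.

            The analyte contribution is removed by rank-one deflation against the reference
            values, and the leading right singular vectors of the residual are taken to span
            the interference subspace (one common net-analyte-signal estimate).
            """
            Xc = X_cal - X_cal.mean(axis=0)
            yc = y_cal - y_cal.mean()
            s_hat = Xc.T @ yc / (yc @ yc)              # estimated analyte spectral direction
            X_interf = Xc - np.outer(yc, s_hat)        # spectra with the analyte part removed
            _, _, Vt = np.linalg.svd(X_interf, full_matrices=False)
            V = Vt[:n_interf].T
            P = np.eye(X_cal.shape[1]) - V @ V.T       # nas = P @ (spectrum - calibration mean)
            return P, X_cal.mean(axis=0)

        def local_nas_pls(X_cal, y_cal, x_unknown, k=30, n_lv=5, n_interf=5):
            """Predict one unknown sample with a local PLS model built on its NAS-nearest neighbours."""
            P, x_mean = nas_projector(X_cal, y_cal, n_interf)
            nas_cal = (X_cal - x_mean) @ P
            nas_unk = P @ (x_unknown - x_mean)
            dist = np.linalg.norm(nas_cal - nas_unk, axis=1)   # NAS-based similarity index
            neighbours = np.argsort(dist)[:k]                  # local calibration set
            pls = PLSRegression(n_components=n_lv)
            pls.fit(X_cal[neighbours], y_cal[neighbours])
            return float(pls.predict(x_unknown[None, :])[0, 0])

        # Synthetic demonstration data (placeholders for real NIR spectra and reference values).
        rng = np.random.default_rng(0)
        X_cal = rng.normal(size=(120, 200))
        y_cal = 2.0 * X_cal[:, 50] + rng.normal(scale=0.1, size=120)
        print(local_nas_pls(X_cal, y_cal, X_cal[0]))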

  16. Noise-immune cavity-enhanced analytical atomic spectrometry - NICE-AAS - A technique for detection of elements down to zeptogram amounts

    NASA Astrophysics Data System (ADS)

    Axner, Ove; Ehlers, Patrick; Hausmaninger, Thomas; Silander, Isak; Ma, Weiguang

    2014-10-01

    Noise-immune cavity-enhanced optical heterodyne molecular spectroscopy (NICE-OHMS) is a powerful technique for detection of molecular compounds in the gas phase that is based on a combination of two important concepts: frequency modulation spectroscopy (FMS), for reduction of noise, and cavity enhancement, for prolongation of the interaction length between the light and the sample. Due to its unique properties, it has demonstrated unparalleled detection sensitivity for molecular constituents in the gas phase. Despite this, it has so far not been used for detection of atoms, i.e. for elemental analysis. The present work presents an assessment of the expected performance of Doppler-broadened (Db) NICE-OHMS for analytical atomic spectrometry, here referred to as noise-immune cavity-enhanced analytical atomic spectrometry (NICE-AAS). After a description of the basic principles of Db-NICE-OHMS, the modulation and detection conditions for optimum performance are identified. Based on a previously demonstrated detection sensitivity of Db-NICE-OHMS of 5 × 10^-12 cm^-1 Hz^-1/2 (corresponding to a single-pass absorbance of 7 × 10^-11 over 10 s), the expected limits of detection (LODs) of Hg and Na by NICE-AAS are estimated. Hg is assumed to be detected directly in the gas phase, while Na is considered to be atomized in a graphite furnace (GF) prior to detection. It is shown that, in the absence of spectral interferences, contaminated sample compartments, and optical saturation, it should be feasible to detect Hg down to 10 zg/cm^3 (10 fg/m^3 or 10^-5 ng/m^3), which corresponds to 25 atoms/cm^3, and Na down to 0.5 zg (zg = zeptogram = 10^-21 g), representing 50 zg/mL (parts per sextillion, 1:10^21) in liquid solution (assuming a sample of 10 μL) or only 15 atoms injected into the GF, respectively. These LODs are several orders of magnitude lower (better) than for any laser-based absorption technique previously demonstrated under atmospheric
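
    To make the connection between the quoted absorption sensitivity and the atom-number figures concrete, the short calculation below converts a minimum detectable absorption coefficient into a number density via n_min = alpha_min / sigma_peak. The peak absorption cross-section used here is an assumed, order-of-magnitude placeholder (it is not taken from the paper), so the result is only meant to show that numbers of the order of tens of atoms per cm^3 follow from a 5 × 10^-12 cm^-1 sensitivity.

        # Minimal order-of-magnitude sketch: minimum detectable number density from
        # a minimum detectable absorption coefficient, n_min = alpha_min / sigma_peak.
        alpha_min = 5e-12      # cm^-1, detection sensitivity quoted in the abstract (per sqrt(Hz))
        sigma_peak = 2e-13     # cm^2, ASSUMED peak absorption cross-section (placeholder value)

        n_min = alpha_min / sigma_peak
        print(f"minimum detectable number density ~ {n_min:.0f} atoms/cm^3")
        # With the assumed cross-section this gives ~25 atoms/cm^3, the same order as the
        # figure quoted for Hg in the abstract.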

  17. Analytical modeling and experimental verification of vibration-based piezoelectric bimorph beam with a tip-mass for power harvesting

    NASA Astrophysics Data System (ADS)

    Wang, Hongjin; Meng, Qingfeng

    2013-03-01

    Power harvesting techniques that convert vibration energy into electrical energy through piezoelectric transducers show strong potential for powering smart wireless sensor devices in structural health monitoring applications. This paper presents an analytical model of the dynamic behavior of an electromechanical piezoelectric bimorph cantilever harvester connected to an AC-DC circuit, based on Euler-Bernoulli beam theory and the Hamiltonian theorem. A new cantilevered piezoelectric bimorph structure is proposed in which a plug-type connection between the support layer and the tip-mass ensures that the center of gravity of the tip-mass is collinear with that of the beam, so that brittle fracture of the piezoelectric layers is avoided even when vibrating with large amplitude. The tip-mass is represented by the inertial force and inertial moment acting at the end of the piezoelectric bimorph beam, based on D'Alembert's principle. An AC-DC converting circuit soldered to the piezoelectric elements is also taken into account. A completely new analytic expression for the global behavior of the electromechanical piezoelectric bimorph harvesting system with an AC-DC circuit under input base transverse excitation is derived. Moreover, an experimental energy harvester is fabricated, and the theoretical analysis and experimental results of the piezoelectric harvester under input base transverse displacement excitation are validated using measurements of the absolute tip displacement, electric voltage response, electric current response and harvested electric power.
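
    The distributed-parameter bimorph model with an AC-DC circuit derived in the paper is not reproduced here. Instead, the sketch below evaluates the steady-state response of the much simpler, commonly used lumped single-degree-of-freedom electromechanical harvester with a purely resistive load under harmonic base excitation; all parameter values and the sign convention are invented for illustration.

        import numpy as np

        # Lumped electromechanical harvester under harmonic base acceleration (illustrative values).
        m, c, k = 0.01, 0.05, 4.0e3         # kg, N s/m, N/m
        theta = 1.0e-3                      # N/V electromechanical coupling
        Cp, R = 50e-9, 1.0e5                # F piezo capacitance, ohm load resistance
        a_base = 9.81                       # m/s^2 base acceleration amplitude (~1 g)

        for f in (90.0, 100.0, 110.0):      # Hz, around the natural frequency sqrt(k/m)/2pi ~ 100.7 Hz
            w = 2 * np.pi * f
            # [ (k - w^2 m + j w c)   theta          ] [Z]   [-m a_base]
            # [ -j w theta            (1/R + j w Cp) ] [V] = [    0    ]
            Amat = np.array([[k - w**2 * m + 1j * w * c, theta],
                             [-1j * w * theta, 1.0 / R + 1j * w * Cp]])
            Z, V = np.linalg.solve(Amat, np.array([-m * a_base, 0.0]))
            print(f"{f:6.1f} Hz: |tip displacement| = {abs(Z)*1e3:.3f} mm, "
                  f"|voltage| = {abs(V):.2f} V, mean power = {abs(V)**2 / (2 * R) * 1e3:.3f} mW")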

  18. Studies of ferroelectric heterostructure thin films, interfaces, and device-related processes via in situ analytical techniques.

    SciTech Connect

    Aggarwal, S.; Auciello, O.; Dhote, A. M.; Gao, Y.; Gruen, D. M.; Im, J.; Irene, E. A.; Krauss, A. R.; Muller, A. H.; Ramesh, R.

    1999-06-29

    The science and technology of ferroelectric thin films has experienced explosive development during the last ten years. Low-density non-volatile ferroelectric random access memories (NVFRAMs) are now incorporated in commercial products such as ''smart cards'', while high-permittivity capacitors are incorporated in cellular phones. However, substantial work is still needed to develop materials integration strategies for high-density memories. We have demonstrated that the implementation of complementary in situ characterization techniques is critical to understanding film growth and device processes relevant to device development. We are using uniquely integrated time-of-flight ion scattering and recoil spectroscopy (TOF-ISARS) and spectroscopic ellipsometry (SE) techniques to perform in situ, real-time studies of film growth processes at the high background gas pressures required to grow ferroelectric thin films. TOF-ISARS provides information on surface processes, while SE permits the investigation of buried interfaces as they are being formed. Recent studies on SrBi{sub 2}Ta{sub 2}O{sub 9} (SBT) and Ba{sub x}Sr{sub 1{minus}x}TiO{sub 3} (BST) film growth and interface processes are discussed. Direct imaging of ferroelectric domains under applied electric fields can provide valuable information for understanding domain dynamics in ferroelectric films. We discuss results of piezoresponse scanning force microscopy (SFM) imaging for nanoscale studies of polarization reversal and retention loss in Pb(Zr{sub x}Ti{sub 1{minus}x})O{sub 3} (PZT)-based capacitors. Another powerful technique suitable for in situ, real-time characterization of film growth processes and of the operation of ferroelectric film-based devices is based on synchrotron X-ray scattering, which is currently being implemented at Argonne National Laboratory.

  19. Understanding wax screen-printing: a novel patterning process for microfluidic cloth-based analytical devices.

    PubMed

    Liu, Min; Zhang, Chunsun; Liu, Feifei

    2015-09-01

    In this work, we first introduce the fabrication of microfluidic cloth-based analytical devices (μCADs) using a wax screen-printing approach that is suitable for simple, inexpensive, rapid, low-energy-consumption and high-throughput preparation of cloth-based analytical devices. We have carried out a detailed study of the wax screen-printing of μCADs and have obtained some interesting results. Firstly, an analytical model is established for the spreading of molten wax in cloth. Secondly, a new wax screen-printing process has been proposed for fabricating μCADs, in which the melting of wax into the cloth is much faster (∼5 s) and the heating temperature is much lower (75 °C). Thirdly, the experimental results show that the patterning effects of the proposed wax screen-printing method depend to a certain extent on the type of screen, the wax melting temperature and the melting time. Under optimized conditions, the minimum printing width of the hydrophobic wax barrier and the hydrophilic channel is 100 μm and 1.9 mm, respectively. Importantly, the developed analytical model is also well validated by these experiments. Fourthly, the μCADs fabricated by the presented wax screen-printing method are used to perform a proof-of-concept assay of glucose or protein in artificial urine, with rapid, high-throughput detection taking place on a 48-chamber cloth-based device and read out visually. Overall, the developed cloth-based wax screen-printing and arrayed μCADs should provide a new research direction in the development of advanced sensor arrays for the detection of a series of analytes relevant to many diverse applications. PMID:26388382
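
    The analytical spreading model developed in the paper is not reproduced here. As a generic point of comparison only, the snippet below evaluates the classical Lucas-Washburn expression for capillary penetration of a liquid into a porous or fibrous medium, L(t) = sqrt(gamma * r * cos(theta) * t / (2 * eta)); all parameter values are invented, wax-like placeholders, not the paper's data.

        import math

        # Lucas-Washburn capillary penetration, L(t) = sqrt(gamma * r * cos(theta) * t / (2 * eta)).
        # All parameter values below are illustrative placeholders, not taken from the paper.
        gamma = 0.03      # N/m, surface tension of molten wax (assumed)
        eta = 0.01        # Pa s, viscosity of molten wax (assumed)
        r = 5e-6          # m, effective capillary (pore) radius of the cloth (assumed)
        theta = 0.0       # rad, contact angle (assumed perfectly wetting)

        for t in (1.0, 5.0, 10.0):   # s
            L = math.sqrt(gamma * r * math.cos(theta) * t / (2.0 * eta))
            print(f"t = {t:4.1f} s -> penetration length ~ {L*1e3:.2f} mm")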

  20. Understanding wax screen-printing: a novel patterning process for microfluidic cloth-based analytical devices.

    PubMed

    Liu, Min; Zhang, Chunsun; Liu, Feifei

    2015-09-01

    In this work, we first introduce the fabrication of microfluidic cloth-based analytical devices (μCADs) using a wax screen-printing approach that is suitable for simple, inexpensive, rapid, low-energy-consumption and high-throughput preparation of cloth-based analytical devices. We have carried out a detailed study of the wax screen-printing of μCADs and have obtained some interesting results. Firstly, an analytical model is established for the spreading of molten wax in cloth. Secondly, a new wax screen-printing process has been proposed for fabricating μCADs, in which the melting of wax into the cloth is much faster (∼5 s) and the heating temperature is much lower (75 °C). Thirdly, the experimental results show that the patterning effects of the proposed wax screen-printing method depend to a certain extent on the type of screen, the wax melting temperature and the melting time. Under optimized conditions, the minimum printing width of the hydrophobic wax barrier and the hydrophilic channel is 100 μm and 1.9 mm, respectively. Importantly, the developed analytical model is also well validated by these experiments. Fourthly, the μCADs fabricated by the presented wax screen-printing method are used to perform a proof-of-concept assay of glucose or protein in artificial urine, with rapid, high-throughput detection taking place on a 48-chamber cloth-based device and read out visually. Overall, the developed cloth-based wax screen-printing and arrayed μCADs should provide a new research direction in the development of advanced sensor arrays for the detection of a series of analytes relevant to many diverse applications.

  1. Studying pigments on painted plaster in Minoan, Roman and early Byzantine Crete. A multi-analytical technique approach.

    PubMed

    Westlake, Polly; Siozos, Panayiotis; Philippidis, Aggelos; Apostolaki, Chryssa; Derham, Brendan; Terlixi, Agni; Perdikatsis, Vasilios; Jones, Richard; Anglos, Demetrios

    2012-02-01

    Wall paintings spanning two millennia of Cretan painting history and technology were analysed in an effort to determine similarities in, and the evolution of, painting materials and technology. A multi-technique approach was employed that combined the use of (a) laser-induced breakdown spectroscopy (LIBS) and Raman microspectroscopy, based on mobile instrumentation, appropriate for rapid, routine-level object characterization, and (b) non-destructive X-ray diffractometry (XRD), performed directly on the wall painting fragment, which provides detailed information on the minerals constituting the paint. Elemental analysis data obtained through LIBS were compared with molecular and crystal structure information from Raman spectroscopy and XRD. Cross-sections from selected samples were also investigated by optical microscopy and by scanning electron microscopy coupled to micro-probe analysis and X-ray mapping, which enabled identification of several mineral components of the paint, confirming the results of the XRD analysis. In parallel, replica wall paintings, created with known pigments and binding media for reference purposes, were examined with optical microscopy and stain-tested for organic materials. The overall study shows that the LIBS and Raman techniques offer key advantages, such as instrument mobility and speed of data collection and interpretation, that are particularly important in on-site investigations. Thus, they are capable of providing important compositional information in an effective manner that enables quick surveying of wall paintings and permits targeted sample selection for further analysis by advanced laboratory techniques.

  2. Analytical capillary isotachophoresis: a routine technique for the analysis of lipoproteins and lipoprotein subfractions in whole serum.

    PubMed

    Schmitz, G; Borgmann, U; Assmann, G

    1985-02-22

    A capillary isotachophoretic separation technique was developed for lipoproteins in native serum which, compared with previous electrophoretic techniques, has negligible molecular sieve effects, does not need gel casting, is suitable for whole serum and has a high discriminative power for lipoprotein subfractions. The technique is based on pre-staining whole-serum lipoproteins for 30 min at 4 degrees C before separation of 0.5 microliter of the sample in a free-flow capillary system (0.5 mm I.D.) with a discontinuous buffer system. In normolipidaemic sera, high-density (HDL) and low-density lipoproteins (LDL) are separated into two major subpopulations according to their net electric mobility. The identification of these fractions was confirmed by substitution with ultracentrifugally isolated lipoproteins and by their complete absence from Tangier and abetalipoproteinaemic serum. Triglyceride-rich very low-density lipoproteins (VLDL) revealed a defined zone between the HDL and LDL subpopulations. Our preliminary results indicate that the separation of human whole-serum lipoproteins by capillary isotachophoresis is a promising method for the determination of lipoprotein subfractions.

  3. School-based friendship networks and children's physical activity: A spatial analytical approach.

    PubMed

    Macdonald-Wallis, Kyle; Jago, Russell; Page, Angie S; Brockman, Rowan; Thompson, Janice L

    2011-07-01

    Despite the known health benefits, the majority of children do not meet physical activity guidelines, and past interventions to increase physical activity have yielded little success. Social and friendship networks have been shown to influence obesity, smoking and academic achievement, and peer-led interventions have successfully reduced the uptake of adolescent smoking. However, the role of social networks in physical activity is not clear. This paper investigates the extent to which friendship networks influence children's physical activity and attempts to quantify the association using spatial analytical techniques to account for the social influence. Physical activity data were collected for 986 children, aged 10-11 years, from 40 schools in Bristol, UK. Data from 559 children were used for analysis. Mean accelerometer counts per minute (CPM) and mean minutes of moderate to vigorous physical activity per day (MVPA) were calculated as objective measures of physical activity. Children nominated up to 4 school-friends, and school-based friendship networks were constructed from these nominations. Networks were tested to assess whether physical activity showed spatial dependence (in terms of proximity in social space) using Moran's I statistic. Spatial autoregressive modelling was then used to assess the extent of spatial dependence, whilst controlling for other known predictors of physical activity. This model was compared with linear regression models for improvement in goodness-of-fit. Results indicated spatial autocorrelation of both mean MVPA (I = .346) and mean CPM (I = .284) in the data, indicating that children clustered in friendship groups with similar activity levels. Spatial autoregressive modelling of mean MVPA concurred that spatial dependence was present (ρ = .26, p < .001), and improved model fit by 31% over the linear regression model. These results demonstrate an association between physical activity levels of children and their

  4. FDI and Accommodation Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; de Miguel Catoira, Alberto; Sanz, Beatriz Ferreiro

    Dynamic backpropagation neural networks are applied extensively to closed-loop control FDI (fault detection and isolation) tasks. The process dynamics are mapped by a trained backpropagation NN, which is then applied to residual generation. Process supervision is then applied to discriminate between faults in process sensors and in process plant parameters. A rule-based expert system implements the decision-making task and the corresponding solution in terms of fault accommodation and/or reconfiguration. Results for a heat exchanger application show an efficient and robust FDI system that could be used as the core of a SCADA system or, alternatively, as a complementary supervision tool operating in parallel with the SCADA.
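
    As a generic illustration of residual-based FDI with a neural-network process model (not the authors' specific SCADA implementation), the sketch below trains a small regression network on fault-free input/output data, generates residuals as the difference between measured and predicted outputs, and flags samples whose residual exceeds a 3-sigma threshold. The "process" here is a made-up static function standing in for the mapped dynamics, and all parameters are placeholders.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)

        def process(u):
            """Stand-in for the real process (e.g. a heat exchanger outlet temperature)."""
            return 2.0 * u[:, 0] + np.sin(u[:, 1]) + 0.02 * rng.normal(size=len(u))

        # 1) Train the NN process model on fault-free data.
        U_train = rng.uniform(-1.0, 1.0, size=(2000, 2))
        model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=2000, random_state=0)
        model.fit(U_train, process(U_train))

        # 2) Residual generation and threshold from fault-free validation data.
        U_val = rng.uniform(-1.0, 1.0, size=(500, 2))
        res_val = process(U_val) - model.predict(U_val)
        threshold = 3.0 * res_val.std()

        # 3) Online supervision: a sensor bias fault is injected halfway through.
        U_test = rng.uniform(-1.0, 1.0, size=(200, 2))
        y_test = process(U_test)
        y_test[100:] += 0.5                       # simulated sensor bias fault
        residual = y_test - model.predict(U_test)
        fault_flags = np.abs(residual) > threshold
        print(f"threshold = {threshold:.3f}, flagged samples: {fault_flags.sum()} (fault injected at sample 100)")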

  5. Entry guidance with real-time planning of reference based on analytical solutions

    NASA Astrophysics Data System (ADS)

    Yu, Wenbin; Chen, Wanchun

    2015-05-01

    In this paper, first, we develop new analytical solutions to hypersonic gliding problem. In the derivation of these solutions, we propose an innovative method based on spectral decomposition for solving a special type of linear system with variable coefficients, where the system matrix can be expressed as the product of a scale function and a constant matrix. Next, we design an entry guidance based on these analytical solutions. In the guidance, the downrange analytical expression is used to plan the longitudinal reference profile satisfying the downrange requirement in real time. Two bank reversals are needed to eliminate the crossrange error. The first is planned by the crossrange analytical expression such that the second is at a specified point near the end of the flight. After the first bank reversal is performed, the second is slightly corrected using the trajectory simulation. Because the longitudinal reference profile and bank reversals are planned onboard, the entry guidance can handle various urgent tasks and deal well with large dispersions in the initial conditions, aerodynamic model and atmospheric model.
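
    The specific analytical gliding solutions of the paper are not reproduced here. The snippet below only illustrates the linear-algebra idea the abstract mentions: for a system x' = a(t) M x whose matrix is a scalar function times a constant matrix, the state-transition matrix is expm(M * integral of a), because the matrix commutes with itself at all times. The matrix M and the coefficient function a(t) are made up for the demonstration, and the result is checked against direct numerical integration.

        import numpy as np
        from scipy.linalg import expm
        from scipy.integrate import solve_ivp, quad

        # x'(t) = a(t) * M @ x(t): the exact propagator is Phi(t) = expm(M * integral_0^t a(s) ds).
        M = np.array([[0.0, 1.0],
                      [-2.0, -0.3]])              # constant matrix (illustrative)
        a = lambda t: 1.0 + 0.5 * np.sin(t)       # scalar coefficient function (illustrative)
        x0 = np.array([1.0, 0.0])
        T = 5.0

        # Closed-form propagation via the matrix exponential.
        I_a, _ = quad(a, 0.0, T)
        x_analytic = expm(M * I_a) @ x0

        # Reference: direct numerical integration of the ODE.
        sol = solve_ivp(lambda t, x: a(t) * (M @ x), (0.0, T), x0, rtol=1e-10, atol=1e-12)
        x_numeric = sol.y[:, -1]

        print("analytic:", x_analytic)
        print("numeric: ", x_numeric)     # the two agree to integration tolerance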

  6. Chemical speciation of arsenic-accumulating mineral in a sedimentary iron deposit by synchrotron radiation multiple X-ray analytical techniques.

    PubMed

    Endo, Satoshi; Terada, Yasuko; Kato, Yasuhiro; Nakai, Izumi

    2008-10-01

    The comprehensive characterization of As(V)-bearing iron minerals from the Gunma iron deposit, which were probably formed by biomineralization, was carried out utilizing multiple synchrotron radiation (SR)-based analytical techniques at beamline BL37XU of SPring-8. SR microbeam X-ray fluorescence (SR-mu-XRF) imaging showed a high level of arsenic accumulation in the iron ore as dots of ca. 20 μm. Based on SEM observations and SR X-ray powder diffraction (SR-XRD) analysis, it was found that arsenic is selectively accumulated in strengite (FePO4 x 2H2O) with a concentric morphology, which may be produced by a biologically induced process. Furthermore, X-ray absorption fine structure (XAFS) analysis showed that arsenic in strengite exists in the arsenate (AsO4(3-)) form and is coordinated by four oxygen atoms at 1.68 angstroms. The results suggest that strengite accumulates arsenic by isomorphous substitution of AsO4(3-) for PO4(3-), forming a partial solid solution of strengite and scorodite (FeAsO4 x 2H2O). The specific correlation between the distribution of As and the biominerals indicates that microorganisms seem to play an important role in the mineralization of strengite in combination with an arsenic-accumulating process.

  7. A SPICE model for a phase-change memory cell based on the analytical conductivity model

    NASA Astrophysics Data System (ADS)

    Yiqun, Wei; Xinnan, Lin; Yuchao, Jia; Xiaole, Cui; Jin, He; Xing, Zhang

    2012-11-01

    For the design of the peripheral circuits of a phase-change memory, it is necessary to have an accurate compact model of the phase-change memory cell for circuit simulation. Compared with existing models, the model presented in this work includes an analytical conductivity model, which is deduced from carrier transport theory instead of being fitted to measurements. In addition, this model includes an analytical temperature model based on the 1D heat-transfer equation and a phase-transition dynamic model based on the JMA (Johnson-Mehl-Avrami) equation to simulate the phase-change process. The above models for the phase-change memory cell are integrated using the Verilog-A language, and the results show that this model is able to simulate the I-V characteristics and the programming characteristics accurately.
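
    For reference, the snippet below evaluates the basic Johnson-Mehl-Avrami crystallized-fraction expression X(t) = 1 - exp(-(k t)^n) with an Arrhenius rate constant. The prefactor, activation energy, Avrami exponent, temperatures and time scales are generic placeholder values, not the parameters of the reported SPICE model.

        import numpy as np

        KB = 8.617e-5                       # eV/K, Boltzmann constant

        def jma_fraction(t, T, k0=1.0e25, Ea=2.1, n=3.0):
            """Crystallized fraction X(t) = 1 - exp(-(k*t)^n), with k = k0*exp(-Ea/(kB*T)).

            k0 [1/s], Ea [eV] and the Avrami exponent n are illustrative placeholders.
            """
            k = k0 * np.exp(-Ea / (KB * T))
            return 1.0 - np.exp(-(k * t) ** n)

        t = np.logspace(-4, 1, 6)           # s, annealing times
        for T in (420.0, 450.0, 480.0):     # K, annealing temperatures
            print(f"T = {T:.0f} K:", np.round(jma_fraction(t, T), 3))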

  8. Characterization and Source Term Assessments of Radioactive Particles from Marshall Islands Using Non-Destructive Analytical Techniques

    SciTech Connect

    Jernstrom, J; Eriksson, M; Simon, R; Tamborini, G; Bildstein, O; Carlos-Marquez, R; Kehl, S R; Betti, M; Hamilton, T

    2005-06-11

    A considerable fraction of the radioactivity entering the environment from different nuclear events is associated with particles. The impact of these events can only be fully assessed where there is some knowledge of the mobility of particle-bound radionuclides entering the environment. The behavior of particulate radionuclides depends on several factors, including the physical, chemical and redox state of the environment, the characteristics of the particles (e.g., chemical composition, crystallinity and particle size), and the oxidation state of the radionuclides contained in the particles. Six plutonium-containing particles stemming from Runit Island soil (Marshall Islands) were characterized using non-destructive analytical and microanalytical methods. By determining the activity of the {sup 239,240}Pu and {sup 241}Am isotopes from their gamma peaks, structural information related to the Pu matrix was obtained and the source term was revealed. The composition and elemental distribution in the particles were studied with synchrotron radiation based micro X-ray fluorescence (SR-{mu}-XRF) spectrometry. A scanning electron microscope equipped with an energy dispersive X-ray detector (SEM-EDX) and a secondary ion mass spectrometer (SIMS) were used to examine particle surfaces. Based on the elemental composition, the particles were divided into two groups: particles with a plain Pu matrix, and particles in which the plutonium is included in a Si/O-rich matrix and is more heterogeneously distributed. All of the particles were identified as fragments of initial weapons material. Since they contain plutonium with a low {sup 240}Pu/{sup 239}Pu atomic ratio, {approx}2-6%, which corresponds to weapons-grade plutonium, the source term was identified as being among the safety tests conducted in the history of Runit Island.

  9. Sample injection and electrophoretic separation on a simple laminated paper based analytical device.

    PubMed

    Xu, Chunxiu; Zhong, Minghua; Cai, Longfei; Zheng, Qingyu; Zhang, Xiaojun

    2016-02-01

    We describe a strategy for performing multistep operations on a simple laminated paper-based separation device by using electrokinetic flow to manipulate the fluids. A laminated crossed-channel paper-based separation device was fabricated by cutting a filter paper sheet followed by lamination. Multiple functional units, including sample loading, sample injection, and electrophoretic separation, were integrated on a single paper-based analytical device for the first time by applying potentials at the reservoirs for sample, sample waste, buffer, and buffer waste. As a proof-of-concept demonstration, a mixed sample solution containing carmine and sunset yellow was loaded in the sampling channel and then injected into the separation channel, followed by electrophoretic separation, by adjusting the potentials applied at the four terminals of the sampling and separation channels. The effects of buffer pH, buffer concentration, channel width, and separation time on the resolution of the electrophoretic separation were studied. This strategy may be used to perform multistep operations such as reagent dilution, sample injection, mixing, reaction, and separation on a single microfluidic paper-based analytical device, which is very attractive for building micro total analysis systems on microfluidic paper-based analytical devices.

  10. Parametric design-based modal damped vibrational piezoelectric energy harvesters with arbitrary proof mass offset: Numerical and analytical validations

    NASA Astrophysics Data System (ADS)

    Lumentut, Mikail F.; Howard, Ian M.

    2016-02-01

    This paper focuses on the development of novel numerical and analytical techniques for modal damped vibration energy harvesters with arbitrary proof mass offset. The key equations of the electromechanical finite element discretisation using the extended Lagrangian principle are derived and simplified to give matrix and scalar forms of the coupled system equations, indicating the most relevant numerical technique for power harvester research. To evaluate the performance of the numerical study, the analytical closed-form boundary value equations have been developed using the extended Hamiltonian principle. The electromechanical frequency response functions (EFRFs) derived from the two theoretical studies show excellent agreement with experimental studies. The benefit of the numerical technique is in providing effective and quick predictions for analysing parametric designs and the physical properties of piezoelectric materials. Although the analytical technique involves a more challenging process for analysing the complex smart structure, it provides a complementary study for validating the numerical technique.

  11. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.

  12. Adaptive-mesh-based algorithm for fluorescence molecular tomography using an analytical solution.

    PubMed

    Wang, Daifa; Song, Xiaolei; Bai, Jing

    2007-07-23

    Fluorescence molecular tomography (FMT) has become an important method for in-vivo imaging of small animals. It has been widely used for tumor genesis, cancer detection, metastasis, drug discovery, and gene therapy. In this study, an algorithm for FMT is proposed to obtain accurate and fast reconstruction by combining an adaptive mesh refinement technique and an analytical solution of diffusion equation. Numerical studies have been performed on a parallel plate FMT system with matching fluid. The reconstructions obtained show that the algorithm is efficient in computation time, and they also maintain image quality.
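
    The analytical solution referred to above is, in its simplest setting, the diffusion-approximation Green's function for a continuous-wave point source. The sketch below evaluates the infinite-homogeneous-medium form Phi(r) = exp(-mu_eff r) / (4 pi D r); it ignores the parallel-plate boundaries and matching fluid used in the actual system, and the optical properties are typical-looking but assumed values.

        import numpy as np

        def fluence_cw_point_source(r, mu_a=0.05, mu_s_prime=10.0):
            """CW diffusion Green's function in an infinite homogeneous medium.

            Phi(r) = exp(-mu_eff * r) / (4 * pi * D * r), with
            D = 1 / (3 * (mu_a + mu_s')) and mu_eff = sqrt(mu_a / D).
            mu_a, mu_s' in 1/cm, r in cm; values are tissue-like placeholders.
            """
            D = 1.0 / (3.0 * (mu_a + mu_s_prime))
            mu_eff = np.sqrt(mu_a / D)
            return np.exp(-mu_eff * r) / (4.0 * np.pi * D * r)

        r = np.array([0.5, 1.0, 2.0, 3.0])              # cm, source-detector distances
        print(np.round(fluence_cw_point_source(r), 4))  # fluence per unit source power (1/cm^2)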

  13. A chromism-based assay (CHROBA) technique for in situ detection of protein kinase activity.

    PubMed

    Tomizaki, Kin-ya; Jie, Xu; Mihara, Hisakazu

    2005-03-15

    A unique chromism-based assay technique (CHROBA) using photochromic spiropyran-containing peptides has been established for the first time for the detection of protein kinase A-catalyzed phosphorylation. This alternative method has the advantage of avoiding isolation and/or immobilization of the kinase substrates in order to remove excess reagents, including unreacted isotope-labeled ATP or fluorescently labeled anti-phosphoamino acid antibodies, from the reaction mixture. Such a protocol, based on the thermocoloration of the spiropyran moiety in the peptide, can offer not only an efficient screening method for potent kinase substrates but also a versatile analytical tool for monitoring other post-translational modification activities. PMID:15745830

  14. Ultrasensitive microchip based on smart microgel for real-time online detection of trace threat analytes.

    PubMed

    Lin, Shuo; Wang, Wei; Ju, Xiao-Jie; Xie, Rui; Liu, Zhuang; Yu, Hai-Rong; Zhang, Chuan; Chu, Liang-Yin

    2016-02-23

    Real-time online detection of trace threat analytes is critical for global sustainability, and the key challenge is how to efficiently convert and amplify analyte signals into simple readouts. Here we report an ultrasensitive microfluidic platform, incorporating a smart microgel, for real-time online detection of trace threat analytes. The microgel can swell in response to a specific stimulus in a flowing solution, resulting in efficient conversion of the stimulus signal into a significantly amplified flow-rate change; thus highly sensitive, fast, and selective detection can be achieved. We demonstrate this by incorporating an ion-recognizable microgel for detecting trace Pb(2+), and by connecting our platform to tap-water and wastewater pipelines for real-time online Pb(2+) detection to achieve timely pollution warning and termination. This work provides a generalizable platform for incorporating myriad stimuli-responsive microgels to achieve ever-better performance in the real-time online detection of various trace threat molecules, and may expand the scope of applications of detection techniques. PMID:26858435

  15. Ultrasensitive microchip based on smart microgel for real-time online detection of trace threat analytes

    PubMed Central

    Lin, Shuo; Wang, Wei; Ju, Xiao-Jie; Xie, Rui; Liu, Zhuang; Yu, Hai-Rong; Zhang, Chuan; Chu, Liang-Yin

    2016-01-01

    Real-time online detection of trace threat analytes is critical for global sustainability, and the key challenge is how to efficiently convert and amplify analyte signals into simple readouts. Here we report an ultrasensitive microfluidic platform, incorporating a smart microgel, for real-time online detection of trace threat analytes. The microgel can swell in response to a specific stimulus in a flowing solution, resulting in efficient conversion of the stimulus signal into a significantly amplified flow-rate change; thus highly sensitive, fast, and selective detection can be achieved. We demonstrate this by incorporating an ion-recognizable microgel for detecting trace Pb2+, and by connecting our platform to tap-water and wastewater pipelines for real-time online Pb2+ detection to achieve timely pollution warning and termination. This work provides a generalizable platform for incorporating myriad stimuli-responsive microgels to achieve ever-better performance in the real-time online detection of various trace threat molecules, and may expand the scope of applications of detection techniques. PMID:26858435

  16. Continuous shading and its fast update in fully analytic triangular-mesh-based computer generated hologram.

    PubMed

    Park, Jae-Hyeung; Kim, Seong-Bok; Yeom, Han-Ju; Kim, Hee-Jae; Zhang, HuiJun; Li, BoNi; Ji, Yeong-Min; Kim, Sang-Hoo; Ko, Seok-Bum

    2015-12-28

    The fully analytic mesh-based computer-generated hologram enables efficient and precise representation of a three-dimensional scene. The conventional method assigns a uniform amplitude inside each mesh, resulting in reconstruction of the three-dimensional scene with flat shading. In this paper, we report an extension of the conventional method to achieve continuous shading, in which the amplitude within each mesh varies continuously. The proposed method enables continuous shading while maintaining the fully analytic framework of the conventional method without any sacrifice in precision. The proposed method can also be extended to enable fast updates of the shading for different illumination directions and ambient-diffuse reflection ratios based on the Phong reflection model. The feasibility of the proposed method is confirmed by numerical and optical reconstruction of the generated hologram.
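
    The abstract refers to the Phong reflection model with an ambient-diffuse ratio for per-mesh amplitude. The snippet below shows only that elementary shading computation (ambient term plus Lambertian diffuse term per triangle normal), with made-up normals, light direction and reflection coefficients; it says nothing about the hologram synthesis itself.

        import numpy as np

        def ambient_diffuse_amplitude(normals, light_dir, k_ambient=0.2, k_diffuse=0.8):
            """Per-triangle amplitude from the ambient + diffuse terms of the Phong model.

            amplitude_i = k_ambient + k_diffuse * max(0, n_i . l), with unit normals n_i
            and unit light direction l; coefficients are illustrative placeholders.
            """
            l = np.asarray(light_dir, dtype=float)
            l /= np.linalg.norm(l)
            n = normals / np.linalg.norm(normals, axis=1, keepdims=True)
            return k_ambient + k_diffuse * np.clip(n @ l, 0.0, None)

        # Three example triangle normals and one illumination direction (all invented).
        normals = np.array([[0.0, 0.0, 1.0],
                            [0.0, 1.0, 1.0],
                            [1.0, 0.0, 0.0]])
        print(np.round(ambient_diffuse_amplitude(normals, light_dir=[0.0, 0.0, 1.0]), 3))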

  17. Analytical derivation: An epistemic game for solving mathematically based physics problems

    NASA Astrophysics Data System (ADS)

    Bajracharya, Rabindra R.; Thompson, John R.

    2016-06-01

    Problem solving, which often involves multiple steps, is an integral part of physics learning and teaching. Using the perspective of the epistemic game, we documented a specific game that is commonly pursued by students while solving mathematically based physics problems: the analytical derivation game. This game involves deriving an equation through symbolic manipulations and routine mathematical operations, usually without any physical interpretation of the processes. This game often creates cognitive obstacles in students, preventing them from using alternative resources or better approaches during problem solving. We conducted hour-long, semi-structured, individual interviews with fourteen introductory physics students. Students were asked to solve four "pseudophysics" problems containing algebraic and graphical representations. The problems required the application of the fundamental theorem of calculus (FTC), which is one of the most frequently used mathematical concepts in physics problem solving. We show that the analytical derivation game is necessary, but not sufficient, to solve mathematically based physics problems, specifically those involving graphical representations.

  18. Current Status of the IAP NASU Accelerator-Based Analytical Facility

    NASA Astrophysics Data System (ADS)

    Buhay, O. M.; Drozdenko, A. A.; Zakharets, M. I.; Ignat'ev, I. G.; Kramchenkov, A. B.; Miroshnichenko, V. I.; Ponomarev, A. G.; Storizhko, V. E.

    The accelerator-based analytical facility (AAF) of the Institute of Applied Physics of the National Academy of Sciences of Ukraine is described. The research facility is based on a compact single-ended machine with a maximum accelerating potential of 2 MV. The facility has five analytical end-stations: a scanning ion microprobe end-station with a spatial resolution of less than 2 μm, a high-resolution Rutherford backscattering spectrometry end-station with a magnetic spectrometer (ΔE/E < 1.5×10^-3), an end-station for elastic recoil detection analysis equipped with an electrostatic spectrometer (ΔE/E < 1.5×10^-3), an end-station for particle-induced gamma-ray spectroscopy, and an ion-induced luminescence end-station. Key specifications of the end-stations and their potential features are given.

  19. 75 FR 57016 - Maple Analytics, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-17

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Maple Analytics, LLC; Supplemental Notice That Initial Market-Based Rate... notice in the above-referenced proceeding of Maple Analytics, LLC's application for market-based...

  20. A Word-Based Compression Technique for Text Files.

    ERIC Educational Resources Information Center

    Vernor, Russel L., III; Weiss, Stephen F.

    1978-01-01

    Presents a word-based technique for storing natural language text in compact form. The compressed text consists of a dictionary and a text that is a combination of actual running text and pointers to the dictionary. This technique has shown itself to be effective for both text storage and retrieval. (VT)
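
    A minimal sketch of the dictionary-plus-pointers idea described above (not the authors' exact scheme) is:

        def compress(text):
            """Word-based compression: store each distinct word once in a dictionary
            and represent the running text as a sequence of integer pointers into it."""
            dictionary, pointers = [], []
            index = {}
            for word in text.split():
                if word not in index:
                    index[word] = len(dictionary)
                    dictionary.append(word)
                pointers.append(index[word])
            return dictionary, pointers

        def decompress(dictionary, pointers):
            return " ".join(dictionary[i] for i in pointers)

        text = "the quick brown fox jumps over the lazy dog and the quick cat"
        dictionary, pointers = compress(text)
        print(dictionary)           # each distinct word stored once
        print(pointers)             # running text as pointers into the dictionary
        assert decompress(dictionary, pointers) == text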

  1. Characterizing Subcore Heterogeneity: A New Analytical Model and Technique to Observe the Spatial Variation of Transverse Dispersion

    NASA Astrophysics Data System (ADS)

    Boon, Maartje; Niu, Ben; Krevor, Sam

    2015-04-01

    Transverse dispersion, the lateral spread of chemical components in an aqueous solution caused by small heterogeneities in a rock, plays an important role in spreading, mixing and reaction during flow through porous media. Conventionally, transverse dispersion has been determined with the use of an annular core device and concentration measurements of the effluent (Blackwell, 1962; Hassinger and Von Rosenberg, 1968) or concentration measurements at probe locations along the core (Han et al., 1985; Harleman and Rumer, 1963). Both methods were designed around an analytical model of the transport equations assuming a single constant transverse dispersion coefficient, which is used to analyse the experimental data. We have developed a new core flood test with the aim of characterising chemical transport and dispersion directly in three dimensions, (1) to produce higher-precision observations of transverse dispersion than has previously been possible and (2) so that the effects of rock heterogeneity on transport can also be observed and summarised using statistical descriptions, allowing a more nuanced picture of transport than description with a single transverse dispersion coefficient. The dispersion of a NaI aqueous solution injected into a Berea sandstone rock core was visualised in 3D with the use of a medical X-ray CT scanner. A device consisting of three annular regions was used for injection: water was injected into the central and outer annular regions, and a NaI aqueous solution was injected into the middle annular region. An analytical solution to the flow and transport equations for this new inlet configuration was derived to design the tests. The Berea sandstone core was 20 cm long and had a diameter of 7.62 cm. The core flood experiments were carried out at Peclet numbers of 0.5 and 2. At steady state, X-ray images were taken every 0.2 cm along the core. This resulted in a high-quality 3D digital data set of the concentration distribution
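
    The analytical solution derived for the three-region annular inlet is not reproduced here. For orientation only, the snippet below evaluates the textbook steady-state solution for purely transverse dispersion of a planar strip source of half-width a in a uniform flow field, C/C0 = 1/2 [erf((a - y)/(2 sqrt(D_T x / v))) + erf((a + y)/(2 sqrt(D_T x / v)))]; it neglects longitudinal dispersion and the cylindrical geometry of the experiment, and all parameter values are invented.

        import numpy as np
        from scipy.special import erf

        def strip_source_concentration(x, y, a=0.005, v=1.0e-5, DT=1.0e-9):
            """Steady-state transverse spreading of a strip source (half-width a, C0 = 1)
            in uniform flow along x, neglecting longitudinal dispersion.
            x, y, a in m; v in m/s; DT in m^2/s (all values illustrative)."""
            s = 2.0 * np.sqrt(DT * x / v)
            return 0.5 * (erf((a - y) / s) + erf((a + y) / s))

        y = np.linspace(-0.02, 0.02, 9)             # transverse positions across the core, m
        for x in (0.05, 0.10, 0.20):                # distance along the core, m
            print(f"x = {x:.2f} m:", np.round(strip_source_concentration(x, y), 3))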

  2. Improved Tissue-Based Analytical Test Methods for Orellanine, a Biomarker of Cortinarius Mushroom Intoxication.

    PubMed

    Anantharam, Poojya; Shao, Dahai; Imerman, Paula M; Burrough, Eric; Schrunk, Dwayne; Sedkhuu, Tsevelmaa; Tang, Shusheng; Rumbeiha, Wilson

    2016-01-01

    Orellanine (OR) toxin is produced by mushrooms of the genus Cortinarius which grow in North America and in Europe. OR poisoning is characterized by severe oliguric acute renal failure, with a mortality rate of 10%-30%. Diagnosis of OR poisoning currently hinges on a history of ingestion of Cortinarius mushrooms and histopathology of renal biopsies. A key step in the diagnostic approach is analysis of tissues for OR. Currently, tissue-based analytical methods for OR are nonspecific and lack sensitivity. The objectives of this study were: (1) to develop definitive HPLC and LC-MS/MS tissue-based analytical methods for OR; and (2) to investigate toxicological effects of OR in mice. The HPLC limit of quantitation was 10 µg/g. For fortification levels of 15 µg/g to 50 µg/g OR in kidney, the relative standard deviation was between 1.3% and 9.8%, and accuracy was within 1.5% to 7.1%. A matrix-matched calibration curve was reproduced in this range with a correlation coefficient (r) of 0.97-0.99. The limit of detection was 20 ng/g for LC-MS/MS. In OR-injected mice, kidney OR concentrations were 97 ± 51 µg/g on Day 0 and 17 ± 1 µg/g on termination Day 3. Splenic and liver injuries were novel findings in this mouse model. The new tissue-based analytical tests will improve diagnosis of OR poisoning, while the mouse model has yielded new data advancing knowledge on OR-induced pathology. PMID:27213453

  3. Improved Tissue-Based Analytical Test Methods for Orellanine, a Biomarker of Cortinarius Mushroom Intoxication

    PubMed Central

    Anantharam, Poojya; Shao, Dahai; Imerman, Paula M.; Burrough, Eric; Schrunk, Dwayne; Sedkhuu, Tsevelmaa; Tang, Shusheng; Rumbeiha, Wilson

    2016-01-01

    Orellanine (OR) toxin is produced by mushrooms of the genus Cortinarius which grow in North America and in Europe. OR poisoning is characterized by severe oliguric acute renal failure, with a mortality rate of 10%–30%. Diagnosis of OR poisoning currently hinges on a history of ingestion of Cortinarius mushrooms and histopathology of renal biopsies. A key step in the diagnostic approach is analysis of tissues for OR. Currently, tissue-based analytical methods for OR are nonspecific and lack sensitivity. The objectives of this study were: (1) to develop definitive HPLC and LC-MS/MS tissue-based analytical methods for OR; and (2) to investigate toxicological effects of OR in mice. The HPLC limit of quantitation was 10 µg/g. For fortification levels of 15 µg/g to 50 µg/g OR in kidney, the relative standard deviation was between 1.3% and 9.8%, and accuracy was within 1.5% to 7.1%. A matrix-matched calibration curve was reproduced in this range with a correlation coefficient (r) of 0.97–0.99. The limit of detection was 20 ng/g for LC-MS/MS. In OR-injected mice, kidney OR concentrations were 97 ± 51 µg/g on Day 0 and 17 ± 1 µg/g on termination Day 3. Splenic and liver injuries were novel findings in this mouse model. The new tissue-based analytical tests will improve diagnosis of OR poisoning, while the mouse model has yielded new data advancing knowledge on OR-induced pathology. PMID:27213453

  4. Improved Tissue-Based Analytical Test Methods for Orellanine, a Biomarker of Cortinarius Mushroom Intoxication.

    PubMed

    Anantharam, Poojya; Shao, Dahai; Imerman, Paula M; Burrough, Eric; Schrunk, Dwayne; Sedkhuu, Tsevelmaa; Tang, Shusheng; Rumbeiha, Wilson

    2016-05-21

    Orellanine (OR) toxin is produced by mushrooms of the genus Cortinarius which grow in North America and in Europe. OR poisoning is characterized by severe oliguric acute renal failure, with a mortality rate of 10%-30%. Diagnosis of OR poisoning currently hinges on a history of ingestion of Cortinarius mushrooms and histopathology of renal biopsies. A key step in the diagnostic approach is analysis of tissues for OR. Currently, tissue-based analytical methods for OR are nonspecific and lack sensitivity. The objectives of this study were: (1) to develop definitive HPLC and LC-MS/MS tissue-based analytical methods for OR; and (2) to investigate toxicological effects of OR in mice. The HPLC limit of quantitation was 10 µg/g. For fortification levels of 15 µg/g to 50 µg/g OR in kidney, the relative standard deviation was between 1.3% and 9.8%, and accuracy was within 1.5% to 7.1%. A matrix-matched calibration curve was reproduced in this range with a correlation coefficient (r) of 0.97-0.99. The limit of detection was 20 ng/g for LC-MS/MS. In OR-injected mice, kidney OR concentrations were 97 ± 51 µg/g on Day 0 and 17 ± 1 µg/g on termination Day 3. Splenic and liver injuries were novel findings in this mouse model. The new tissue-based analytical tests will improve diagnosis of OR poisoning, while the mouse model has yielded new data advancing knowledge on OR-induced pathology.

  5. A technique coupling the analyte electrodeposition followed by in-situ stripping with electrothermal atomic absorption spectrometry for analysis of samples with high NaCl contents

    NASA Astrophysics Data System (ADS)

    Čánský, Zdeněk; Rychlovský, Petr; Petrová, Zuzana; Matousek, J. P.

    2007-03-01

    A technique coupling analyte electrodeposition followed by in-situ stripping with electrothermal atomic absorption spectrometry has been developed for the determination of lead and cadmium in samples with high salt contents. To separate the analyte from the sample matrix, the analyte was quantitatively electrodeposited in situ on a platinum sampling capillary serving as the cathode (sample volume, 20 μL). The spent electrolyte containing the sample matrix was then withdrawn, the capillary with the deposited analyte was washed with deionized water, and the analyte was stripped into a chemically simple electrolyte (5 g/L NH4H2PO4) by reversing the polarity of the electrodeposition circuit. Electrothermal atomization using a suitably optimized temperature program followed. A fully automated manifold was designed for this coupled technique and the appropriate control software was developed. The operating conditions for the determination of Pb and Cd in samples with high contents of inorganic salts were optimized, the determination was characterized by its principal analytical parameters, and its applicability was verified by analyses of urine reference samples. The absolute limits of detection for lead and cadmium (3σ criterion) in a sample containing 30 g/L NaCl were 8.5 pg and 2.3 pg, respectively (peak absorbance), and the RSD values amounted to 1.6% and 1.9% for lead (at the 40 ng mL^-1 level) and cadmium (at the 4.0 ng mL^-1 level), respectively. These values (and also the measurement sensitivity) are superior to the results attained in conventional electrothermal atomic absorption spectrometric determination of Pb and Cd in pure solutions (5 g/L NH4H2PO4). The sensitivity of the Pb and Cd determination is not affected by NaCl concentrations up to 100 g/L, demonstrating efficient matrix removal during the electrodeposition step.
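
    The detection limits quoted above follow the usual 3σ convention. As a generic reminder (with made-up numbers, not the paper's data), a blank-based 3σ limit of detection can be computed as:

        import numpy as np

        # Generic 3-sigma limit of detection from blank replicates and a calibration slope.
        blank_signals = np.array([0.0021, 0.0018, 0.0024, 0.0019, 0.0022, 0.0020])  # absorbance (invented)
        slope = 0.0125                    # absorbance per (ng/mL), from a calibration line (invented)

        lod = 3.0 * blank_signals.std(ddof=1) / slope
        print(f"LOD = {lod:.3f} ng/mL")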

  6. Lynx: a knowledge base and an analytical workbench for integrative medicine.

    PubMed

    Sulakhe, Dinanath; Xie, Bingqing; Taylor, Andrew; D'Souza, Mark; Balasubramanian, Sandhya; Hashemifar, Somaye; White, Steven; Dave, Utpal J; Agam, Gady; Xu, Jinbo; Wang, Sheng; Gilliam, T Conrad; Maltsev, Natalia

    2016-01-01

    Lynx (http://lynx.ci.uchicago.edu) is a web-based database and a knowledge extraction engine. It supports annotation and analysis of high-throughput experimental data and generation of weighted hypotheses regarding genes and molecular mechanisms contributing to human phenotypes or conditions of interest. Since the last release, the Lynx knowledge base (LynxKB) has been periodically updated with the latest versions of the existing databases and supplemented with additional information from public databases. These additions have enriched the data annotations provided by Lynx and improved the performance of Lynx analytical tools. Moreover, the Lynx analytical workbench has been supplemented with new tools for reconstruction of co-expression networks and feature-and-network-based prioritization of genetic factors and molecular mechanisms. These developments facilitate the extraction of meaningful knowledge from experimental data and LynxKB. The Service Oriented Architecture provides public access to LynxKB and its analytical tools via user-friendly web services and interfaces.

  7. Lynx: a knowledge base and an analytical workbench for integrative medicine.

    PubMed

    Sulakhe, Dinanath; Xie, Bingqing; Taylor, Andrew; D'Souza, Mark; Balasubramanian, Sandhya; Hashemifar, Somaye; White, Steven; Dave, Utpal J; Agam, Gady; Xu, Jinbo; Wang, Sheng; Gilliam, T Conrad; Maltsev, Natalia

    2016-01-01

    Lynx (http://lynx.ci.uchicago.edu) is a web-based database and a knowledge extraction engine. It supports annotation and analysis of high-throughput experimental data and generation of weighted hypotheses regarding genes and molecular mechanisms contributing to human phenotypes or conditions of interest. Since the last release, the Lynx knowledge base (LynxKB) has been periodically updated with the latest versions of the existing databases and supplemented with additional information from public databases. These additions have enriched the data annotations provided by Lynx and improved the performance of Lynx analytical tools. Moreover, the Lynx analytical workbench has been supplemented with new tools for reconstruction of co-expression networks and feature-and-network-based prioritization of genetic factors and molecular mechanisms. These developments facilitate the extraction of meaningful knowledge from experimental data and LynxKB. The Service Oriented Architecture provides public access to LynxKB and its analytical tools via user-friendly web services and interfaces. PMID:26590263

  8. Analytical testing

    NASA Technical Reports Server (NTRS)

    Flannelly, W. G.; Fabunmi, J. A.; Nagy, E. J.

    1981-01-01

    Analytical methods for combining flight acceleration and strain data with shake test mobility data to predict the effects of structural changes on flight vibrations and strains are presented. This integration of structural dynamic analysis with flight performance is referred to as analytical testing. The objective of this methodology is to analytically estimate the results of flight testing contemplated structural changes with minimum flying and change trials. The category of changes to the aircraft includes mass, stiffness, absorbers, isolators, and active suppressors. Examples of applying the analytical testing methodology using flight test and shake test data measured on an AH-1G helicopter are included. The techniques and procedures for vibration testing and modal analysis are also described.
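
    The core idea of analytical testing, predicting the effect of a structural change from measured mobility (FRF) data without re-flying the aircraft, can be illustrated with the standard receptance-modification formula H_modified = (H^-1 + ΔB)^-1, where ΔB is the dynamic-stiffness change of the modification (for an added point mass Δm at one degree of freedom, ΔB = -ω² Δm at that degree of freedom). The two-degree-of-freedom system and the modification below are entirely invented, and the helicopter-specific procedures of the report are not reproduced.

        import numpy as np

        # Invented 2-DOF system standing in for "measured" receptances H(w) = (K - w^2 M + j w C)^-1.
        M = np.diag([1.0, 0.5])
        K = np.array([[3.0e4, -1.0e4],
                      [-1.0e4, 1.0e4]])
        C = 0.002 * K                                   # light proportional damping

        def receptance(w, M, K, C):
            return np.linalg.inv(K - w**2 * M + 1j * w * C)

        dm = 0.2                                        # kg added at the second DOF (the structural change)
        w = 2 * np.pi * 20.0                            # rad/s, one excitation frequency

        H = receptance(w, M, K, C)                      # "measured" baseline FRF matrix
        dB = np.zeros((2, 2), dtype=complex)
        dB[1, 1] = -w**2 * dm                           # dynamic-stiffness change of the added mass
        H_pred = np.linalg.inv(np.linalg.inv(H) + dB)   # predicted FRF after modification

        M_mod = M.copy(); M_mod[1, 1] += dm
        H_true = receptance(w, M_mod, K, C)             # FRF of the actually modified system

        print("predicted H[1,1]:", H_pred[1, 1])
        print("true      H[1,1]:", H_true[1, 1])        # identical, validating the prediction formula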

  9. An analytical coupled technique for solving nonlinear large-amplitude oscillation of a conservative system with inertia and static non-linearity.

    PubMed

    Razzak, Md Abdur; Alam, Md Shamsul

    2016-01-01

    Based on a new trial function, an analytical coupled technique (a combination of the homotopy perturbation method and the variational method) is presented to obtain approximate frequencies and the corresponding periodic solutions of the free vibration of a conservative oscillator having inertia and static nonlinearities. Expressions for the natural frequency-amplitude relationship are obtained in a novel analytical way. In some previous articles, the first- and second-order approximations were determined by the same method for such nonlinear oscillators, but the trial functions did not satisfy the initial conditions, which was a significant shortcoming of those articles. The new trial function of this paper overcomes that limitation. The first-order approximation is mainly considered in this paper. The main advantage of the present approach is that the first-order approximation gives better results than existing second-order harmonic balance methods. The present method is valid for large amplitudes of oscillation. The absolute relative error of the first-order approximate frequency in this paper is 0.00 % for the large amplitude A = 1000, whereas two different second-order harmonic balance methods give relative errors of 10.33 and 3.72 %. Thus the present method is suitable for solving the above-mentioned nonlinear oscillator.
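
    The specific oscillators with combined inertia and static nonlinearities treated in the paper are not reproduced here. As a simple point of reference for checking approximate frequency formulas, the snippet below computes the exact period of a conservative oscillator u'' + f(u) = 0 by quadrature, T(A) = 4 * integral_0^A du / sqrt(2 (V(A) - V(u))), using the pure-cubic case f(u) = u^3 as an example (for which the exact angular frequency is commonly quoted as approximately 0.847 A).

        import numpy as np
        from scipy.integrate import quad

        def exact_period(A, V):
            """Exact period of u'' + V'(u) = 0 with amplitude A (released from rest at u = A).

            T(A) = 4 * integral_0^A du / sqrt(2 * (V(A) - V(u))); the substitution
            u = A*sin(s) removes the integrable singularity at u = A.
            """
            integrand = lambda s: A * np.cos(s) / np.sqrt(2.0 * (V(A) - V(A * np.sin(s))))
            val, _ = quad(integrand, 0.0, np.pi / 2.0)
            return 4.0 * val

        V_cubic = lambda u: 0.25 * u**4            # potential of f(u) = u^3

        for A in (1.0, 10.0, 1000.0):
            T = exact_period(A, V_cubic)
            print(f"A = {A:8.1f}: exact omega = {2*np.pi/T:.4f}  (~0.8472*A = {0.8472*A:.4f})")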

  10. Polynomial optimization techniques for activity scheduling. Optimization based prototype scheduler

    NASA Technical Reports Server (NTRS)

    Reddy, Surender

    1991-01-01

    Polynomial optimization techniques for activity scheduling (optimization-based prototype scheduler) are presented in the form of viewgraphs. The following subject areas are covered: agenda; need and viability of polynomial-time techniques for SNC (Space Network Control); an intrinsic characteristic of the SN scheduling problem; expected characteristics of the schedule; optimization-based scheduling approach; single-resource algorithms; decomposition of multiple-resource problems; prototype capabilities, characteristics, and test results; computational characteristics; some features of the prototyped algorithms; and some related GSFC references.

  11. Analytical Microscopy

    SciTech Connect

    Not Available

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaic's Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  12. The detection of bulk explosives using nuclear-based techniques

    SciTech Connect

    Morgado, R.E.; Gozani, T.; Seher, C.C.

    1988-01-01

    In 1986 we presented a rationale for the detection of bulk explosives based on nuclear techniques that addressed the requirements of civil aviation security in the airport environment. Since then, efforts have intensified to implement a system based on thermal neutron activation (TNA), with new work developing in fast neutron and energetic photon reactions. In this paper we will describe these techniques and present new results from laboratory and airport testing. Based on preliminary results, we contended in our earlier paper that nuclear-based techniques did provide sufficiently penetrating probes and distinguishable detectable reaction products to achieve the FAA operational goals; new data have supported this contention. The status of nuclear-based techniques for the detection of bulk explosives presently under investigation by the US Federal Aviation Administration (FAA) is reviewed. These include thermal neutron activation (TNA), fast neutron activation (FNA), the associated particle technique, nuclear resonance absorption, and photoneutron activation. The results of comprehensive airport testing of the TNA system performed during 1987-88 are summarized. From a technical point of view, nuclear-based techniques now represent the most comprehensive and feasible approach for meeting the operational criteria of detection, false alarms, and throughput. 9 refs., 5 figs., 2 tabs.

  13. Evaluation of purification procedures of DNA from maize-meal samples by exploiting different analytical techniques for the assessment of DNA quality.

    PubMed

    Vanni, Adriano; Anfossi, Laura; Giovannoli, Cristina; Oddenino, Leila; Giraudi, Gianfranco

    2004-04-01

    Two different approaches generally applied to purify DNA extracted from cells were compared: precipitation by organic solvents and enzymatic treatments. We investigated various experimental protocols reported in the literature by evaluating DNA purity, integrity and yield. The reliability of the analytical techniques normally employed to assess DNA purity and quantity was studied, and conclusions were drawn by comparing the results obtained with the different analytical techniques. Enzymatic treatments proved unable to increase DNA purity while causing significant degradation. In contrast, optimised conditions for solvent precipitation gave a sharp increase in DNA purity while maintaining the initial DNA integrity. The application of the optimised protocol to maize-meal samples allowed us to achieve good PCR amplification even with those samples that gave poor amplification when following the protocol recommended by the Italian legislation in force for GMO detection in food.

  14. Application of glyph-based techniques for multivariate engineering visualization

    NASA Astrophysics Data System (ADS)

    Glazar, Vladimir; Marunic, Gordana; Percic, Marko; Butkovic, Zlatko

    2016-01-01

    This article presents a review of glyph-based techniques for engineering visualization as well as practical application for the multivariate visualization process. Two glyph techniques, Chernoff faces and star glyphs, uncommonly used in engineering practice, are described, applied to the selected data set, run through the chosen optimization methods and user evaluated. As an example of how these techniques function, a set of data for the optimization of a heat exchanger with a microchannel coil is adopted for visualization. The results acquired by the chosen visualization techniques are related to the results of optimization carried out by the response surface method and compared with the results of user evaluation. Based on the data set from engineering research and practice, the advantages and disadvantages of these techniques for engineering visualization are identified and discussed.

  15. Embedded Analytical Solutions Improve Accuracy in Convolution-Based Particle Tracking Models using Python

    NASA Astrophysics Data System (ADS)

    Starn, J. J.

    2013-12-01

    Particle tracking often is used to generate particle-age distributions that are used as impulse-response functions in convolution. A typical application is to produce groundwater solute breakthrough curves (BTC) at endpoint receptors such as pumping wells or streams. The commonly used semi-analytical particle-tracking algorithm based on the assumption of linear velocity gradients between opposing cell faces is computationally very fast when used in combination with finite-difference models. However, large gradients near pumping wells in regional-scale groundwater-flow models often are not well represented because of cell-size limitations. This leads to inaccurate velocity fields, especially at weak sinks. Accurate analytical solutions for velocity near a pumping well are available, and various boundary conditions can be imposed using image-well theory. Python can be used to embed these solutions into existing semi-analytical particle-tracking codes, thereby maintaining the integrity and quality-assurance of the existing code. Python (and associated scientific computational packages NumPy, SciPy, and Matplotlib) is an effective tool because of its wide ranging capability. Python text processing allows complex and database-like manipulation of model input and output files, including binary and HDF5 files. High-level functions in the language include ODE solvers to solve first-order particle-location ODEs, Gaussian kernel density estimation to compute smooth particle-age distributions, and convolution. The highly vectorized nature of NumPy arrays and functions minimizes the need for computationally expensive loops. A modular Python code base has been developed to compute BTCs using embedded analytical solutions at pumping wells based on an existing well-documented finite-difference groundwater-flow simulation code (MODFLOW) and a semi-analytical particle-tracking code (MODPATH). The Python code base is tested by comparing BTCs with highly discretized synthetic steady
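
    As a minimal illustration of the convolution step described above (not the USGS code base itself), the sketch below builds a smoothed particle-age distribution with Gaussian kernel density estimation and convolves it with an input history to produce a breakthrough curve; the particle ages and input history are invented for the example.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(0)
    ages = rng.lognormal(mean=3.0, sigma=0.5, size=5000)   # hypothetical particle ages, years

    t = np.arange(0.0, 100.0, 0.5)                          # time axis, years
    dt = t[1] - t[0]
    age_pdf = gaussian_kde(ages)(t)                         # smoothed age (impulse-response) distribution
    age_pdf /= age_pdf.sum() * dt                           # normalize to unit area on the grid

    # Hypothetical input history at the water table: a step input that shuts off at t = 40 yr
    c_in = np.where(t < 40.0, 1.0, 0.0)

    # Convolving the input history with the age distribution gives the BTC at the receptor
    btc = np.convolve(c_in, age_pdf)[: t.size] * dt
    print(f"peak receptor concentration: {btc.max():.3f} (relative units)")
    ```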

  16. Source term identification of environmental radioactive Pu/U particles by their characterization with non-destructive spectrochemical analytical techniques

    NASA Astrophysics Data System (ADS)

    Eriksson, M.; Osán, J.; Jernström, J.; Wegrzynek, D.; Simon, R.; Chinea-Cano, E.; Markowicz, A.; Bamford, S.; Tamborini, G.; Török, S.; Falkenberg, G.; Alsecz, A.; Dahlgaard, H.; Wobrauschek, P.; Streli, C.; Zoeger, N.; Betti, M.

    2005-04-01

    Six radioactive particles stemming from the Thule area (NW Greenland) were investigated by gamma-ray and L X-ray spectrometry based on radioactive disintegration, scanning electron microscopy coupled with energy-dispersive and wavelength-dispersive X-ray spectrometers, and synchrotron-radiation-based techniques such as microscopic X-ray fluorescence, microscopic X-ray absorption near-edge structure (μ-XANES), and combined X-ray absorption and fluorescence microtomography. Additionally, one particle from Mururoa atoll was examined by microtomography. From the results obtained, it was found that U and Pu were mixed in the particles. The U/Pu intensity ratios in the Thule particles varied between 0.05 and 0.36. The microtomography results showed that the U/Pu ratio was not homogeneously distributed. The 241Am/(238+239+240)Pu activity ratios varied between 0.13 and 0.17, indicating that the particles originate from different source terms. The oxidation states of U and Pu as determined by μ-XANES showed that U(IV) is the preponderant species; for Pu, two types of particles could be distinguished. One set had about 90% Pu(IV), while in the other the Pu(IV)/Pu(VI) ratio was about one third.

  17. Learning Analytics and Computational Techniques for Detecting and Evaluating Patterns in Learning: An Introduction to the Special Issue

    ERIC Educational Resources Information Center

    Martin, Taylor; Sherin, Bruce

    2013-01-01

    The learning sciences community's interest in learning analytics (LA) has been growing steadily over the past several years. Three recent symposia on the theme (at the American Educational Research Association 2011 and 2012 annual conferences, and the International Conference of the Learning Sciences 2012), organized by Paulo Blikstein, led…

  18. Transformation of an uncertain video search pipeline to a sketch-based visual analytics loop.

    PubMed

    Legg, Philip A; Chung, David H S; Parry, Matthew L; Bown, Rhodri; Jones, Mark W; Griffiths, Iwan W; Chen, Min

    2013-12-01

    Traditional sketch-based image or video search systems rely on machine learning concepts as their core technology. However, in many applications, machine learning alone is impractical since videos may not be sufficiently annotated semantically, there may be a lack of suitable training data, and the search requirements of the user may frequently change for different tasks. In this work, we develop a visual analytics system that overcomes the shortcomings of the traditional approach. We make use of a sketch-based interface to enable users to specify search requirements in a flexible manner without depending on semantic annotation. We employ active machine learning to train different analytical models for different types of search requirements. We use visualization to facilitate knowledge discovery at the different stages of visual analytics. This includes visualizing the parameter space of the trained model, visualizing the search space to support interactive browsing, visualizing candidate search results to support rapid interaction for active learning while minimizing the time spent watching videos, and visualizing aggregated information about the search results. We demonstrate the system for searching spatiotemporal attributes in sports video to identify key instances of team and player performance.

  19. National Cancer Institute Biospecimen Evidence-Based Practices: a novel approach to pre-analytical standardization.

    PubMed

    Engel, Kelly B; Vaught, Jim; Moore, Helen M

    2014-04-01

    Variable biospecimen collection, processing, and storage practices may introduce variability in biospecimen quality and analytical results. This risk can be minimized within a facility through the use of standardized procedures; however, analysis of biospecimens from different facilities may be confounded by differences in procedures and inferred biospecimen quality. Thus, a global approach to standardization of biospecimen handling procedures and their validation is needed. Here we present the first in a series of procedural guidelines that were developed and annotated with published findings in the field of human biospecimen science. The series of documents will be known as NCI Biospecimen Evidence-Based Practices, or BEBPs. Pertinent literature was identified via the National Cancer Institute (NCI) Biospecimen Research Database ( brd.nci.nih.gov ) and findings were organized by specific biospecimen pre-analytical factors and analytes of interest (DNA, RNA, protein, morphology). Meta-analysis results were presented as annotated summaries, which highlight concordant and discordant findings and the threshold and magnitude of effects when applicable. The detailed and adaptable format of the document is intended to support the development and execution of evidence-based standard operating procedures (SOPs) for human biospecimen collection, processing, and storage operations.

  20. A behavior-analytic account of depression and a case report using acceptance-based procedures

    PubMed Central

    Dougher, Michael J.; Hackbert, Lucianne

    1994-01-01

    Although roughly 6% of the general population is affected by depression at some time during their lifetime, the disorder has been relatively neglected by behavior analysts. The preponderance of research on the etiology and treatment of depression has been conducted by cognitive behavior theorists and biological psychiatrists and psychopharmacologists interested in the biological substrates of depression. These approaches have certainly been useful, but their reliance on cognitive and biological processes and their lack of attention to environment—behavior relations render them unsatisfactory from a behavior-analytic perspective. The purpose of this paper is to provide a behavior-analytic account of depression and to derive from this account several possible treatment interventions. In addition, case material is presented to illustrate an acceptance-based approach with a depressed client. PMID:22478195

  1. Ultrafast capillary electrophoresis isolation of DNA aptamer for the PCR amplification-based small analyte sensing

    PubMed Central

    Fiore, Emmanuelle; Dausse, Eric; Dubouchaud, Hervé; Peyrin, Eric; Ravelet, Corinne

    2015-01-01

    Here, we report a new homogeneous DNA amplification-based aptamer assay for small analyte sensing. The aptamer of adenosine, chosen as the model analyte, was split into two fragments able to assemble in the presence of the target. Primers were introduced at the extremities of one fragment in order to generate the amplifiable DNA component. The amount of amplifiable fragment was quantifiable by real-time polymerase chain reaction (RT-PCR) amplification and directly related to the adenosine concentration. This approach combines the very high separation efficiency and the homogeneous format (without immobilization) of capillary electrophoresis (CE) with the sensitivity of real-time PCR amplification. An ultrafast isolation of the target-bound split aptamer (60 s) was developed by designing a CE input/output scheme. This method was successfully applied to the determination of adenosine with a LOD of 1 μM. PMID:26322305

  2. Ultrafast capillary electrophoresis isolation of DNA aptamer for the PCR amplification-based small analyte sensing.

    PubMed

    Fiore, Emmanuelle; Dausse, Eric; Dubouchaud, Hervé; Peyrin, Eric; Ravelet, Corinne

    2015-01-01

    Here, we report a new homogeneous DNA amplification-based aptamer assay for small analyte sensing. The aptamer of adenosine, chosen as the model analyte, was split into two fragments able to assemble in the presence of the target. Primers were introduced at the extremities of one fragment in order to generate the amplifiable DNA component. The amount of amplifiable fragment was quantifiable by real-time polymerase chain reaction (RT-PCR) amplification and directly related to the adenosine concentration. This approach combines the very high separation efficiency and the homogeneous format (without immobilization) of capillary electrophoresis (CE) with the sensitivity of real-time PCR amplification. An ultrafast isolation of the target-bound split aptamer (60 s) was developed by designing a CE input/output scheme. This method was successfully applied to the determination of adenosine with a LOD of 1 μM.

  3. Ultrafast Capillary Electrophoresis Isolation of DNA Aptamer for the PCR Amplification-Based Small Analyte Sensing

    NASA Astrophysics Data System (ADS)

    Fiore, Emmanuelle; Dausse, Eric; Dubouchaud, Hervé; Peyrin, Eric; Ravelet, Corinne

    2015-08-01

    Here, we report a new homogeneous DNA amplification-based aptamer assay for small analyte sensing. The aptamer of adenosine, chosen as the model analyte, was split into two fragments able to assemble in the presence of the target. Primers were introduced at the extremities of one fragment in order to generate the amplifiable DNA component. The amount of amplifiable fragment was quantifiable by real-time polymerase chain reaction (RT-PCR) amplification and directly related to the adenosine concentration. This approach combines the very high separation efficiency and the homogeneous format (without immobilization) of capillary electrophoresis with the sensitivity of real-time PCR amplification. An ultrafast isolation of the target-bound split aptamer (60 s) was developed by designing a capillary electrophoresis input/output scheme. This method was successfully applied to the determination of adenosine with a LOD of 1 µM.

  4. Slow gas microflow past a sphere: Analytical solution based on moment equations

    NASA Astrophysics Data System (ADS)

    Torrilhon, Manuel

    2010-07-01

    The regularized 13-moment equations are solved analytically for the microflow of a gas past a sphere in the case of low Mach numbers. The result is given in fully explicit expressions and shows nontrivial behavior for all fluid fields including stress, heat flux, and temperature. Various aspects of the flow such as temperature polarization and total force are reproduced correctly for moderate Knudsen number. The analytical solution allows studying the rise of Knudsen layers and their interaction and coupling to the fluid variables in the bulk. Additionally, based on the regularized 13-moment equations system, hybrid boundary conditions are given for the standard Stokes equations in order to enable them to predict nonequilibrium effects in the flow past a sphere.

  5. Graphene-based materials: fabrication and application for adsorption in analytical chemistry.

    PubMed

    Wang, Xin; Liu, Bo; Lu, Qipeng; Qu, Qishu

    2014-10-01

    Graphene, a single layer of carbon atoms densely packed into a honeycomb crystal lattice with unique electronic, chemical, and mechanical properties, is the 2D allotrope of carbon. Owing to these remarkable properties, graphene and graphene-based materials are likely to find applications as sorbents in analytical chemistry. The current review focuses predominantly on the recent development of graphene-based materials and demonstrates their enhanced performance in the adsorption of organic compounds and metal ions, in solid-phase extraction, and in separation science, covering work mostly since 2012.

  6. A Rapid, Fluorescence-Based Field Screening Technique for Organic Species in Soil and Water Matrices.

    PubMed

    Russell, Amber L; Martin, David P; Cuddy, Michael F; Bednar, Anthony J

    2016-06-01

    Real-time detection of hydrocarbon contaminants in the environment presents analytical challenges because traditional laboratory-based techniques are cumbersome and not readily field portable. In the current work, a method for rapid and semi-quantitative detection of organic contaminants, primarily crude oil, in natural water and soil matrices has been developed. Detection limits in the parts-per-million and parts-per-billion ranges were achieved using visual and digital detection methods, respectively. The extraction technique was modified from standard methodologies used for hydrocarbon analysis and provides a straightforward separation that can remove interference from complex natural constituents. For water samples the method is semi-quantitative, with recoveries ranging from 70% to 130%, while measurements of soil samples are more qualitative due to the lower extraction efficiencies associated with the limitations of field-deployable procedures.

  7. Using the Fingerprinting Method to Customize RTLS Based on the AoA Ranging Technique

    PubMed Central

    Jachimczyk, Bartosz; Dziak, Damian; Kulesza, Wlodek J.

    2016-01-01

    Real-time Locating Systems (RTLSs) have the ability to precisely locate the position of things and people in real time. They are needed for security and emergency applications, but also for healthcare and home care appliances. This research aims to develop an analytical method for customizing RTLSs in order to improve localization performance in terms of precision. The proposed method is based on the Angle of Arrival (AoA) ranging technique and the fingerprinting method, along with an analytically defined AoA uncertainty and a localization uncertainty map. The presented solution addresses three main concerns: the geometry of the indoor space, the RTLS arrangement, and a statistical approach to the localization precision of a pair of location sensors using an AoA signal. An evaluation of the implemented customized RTLS validates the analytical model of the fingerprinting map, and the results of simulations and physical experiments verify the proposed method. The research confirms that the analytically established fingerprint map is a valid representation of the RTLS' performance in terms of precision. Furthermore, the research demonstrates the impact of workspace geometry and layout on the RTLS' performance, and shows how the size and shape of a workspace and the placement of the calibration point affect the fingerprint map. Finally, the performance investigation identifies the most effective arrangement of location sensors and its influence on localization precision. PMID:27314354
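
    A minimal sketch of the basic AoA localization step underlying the precision analysis above: with bearings measured at two location sensors, the target estimate is the intersection of the two bearing lines. The room geometry, sensor positions, and angular error are assumptions for illustration, not the authors' setup.

    ```python
    import numpy as np

    def aoa_fix(p1, theta1, p2, theta2):
        """Intersect bearings theta1, theta2 (radians, measured from the x-axis)
        observed at sensor positions p1, p2; returns the estimated target position."""
        d1 = np.array([np.cos(theta1), np.sin(theta1)])
        d2 = np.array([np.cos(theta2), np.sin(theta2)])
        # Solve p1 + s*d1 = p2 + t*d2 for the scalars s, t
        A = np.column_stack([d1, -d2])
        s, _ = np.linalg.solve(A, np.asarray(p2) - np.asarray(p1))
        return np.asarray(p1) + s * d1

    # Hypothetical sensors along one wall of a room, target at (4, 3) metres
    target = np.array([4.0, 3.0])
    s1, s2 = np.array([0.0, 0.0]), np.array([10.0, 0.0])
    th1 = np.arctan2(*(target - s1)[::-1])
    th2 = np.arctan2(*(target - s2)[::-1])
    # Add a small angular error to mimic AoA measurement uncertainty
    print(aoa_fix(s1, th1 + np.radians(1.0), s2, th2))      # close to (4, 3)
    ```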

  8. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.
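
    One of the data products named above, a rank-based probability-of-detection curve, can be sketched as follows; the detection scores, match flags, and target count are invented, and a real evaluation would match detections to ground truth spatially.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    n_targets = 20
    n_detections = 100

    # Hypothetical detector output: a confidence score per detection plus a flag saying
    # whether it matched a distinct ground-truth target.
    scores = rng.uniform(size=n_detections)
    is_true = np.zeros(n_detections, dtype=bool)
    is_true[rng.choice(n_detections, size=16, replace=False)] = True

    order = np.argsort(-scores)                       # rank detections by confidence
    pd_curve = np.cumsum(is_true[order]) / n_targets  # P(detection) vs. rank cutoff
    for rank in (10, 25, 50, 100):
        print(f"top {rank:3d} detections: Pd = {pd_curve[rank - 1]:.2f}")
    ```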

  9. Efficient Plant Supervision Strategy Using NN Based Techniques

    NASA Astrophysics Data System (ADS)

    Garcia, Ramon Ferreiro; Rolle, Jose Luis Calvo; Castelo, Francisco Javier Perez

    Most nonlinear type-one and type-two control systems suffer from a lack of detectability when model-based techniques are applied to FDI (fault detection and isolation) tasks. In general, all types of processes suffer from a lack of detectability, also because of the ambiguity in discriminating between the process, sensors, and actuators when isolating a given fault. This work presents a strategy to detect and isolate faults that combines massive neural-network-based functional approximation procedures with recursive rule-based techniques applied in a parity-space approach.
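
    A minimal sketch of the residual-generation idea behind such NN-based supervision, under the assumption of a simple static process and an invented sensor fault (not the authors' plant or rule base): a neural approximator of the healthy input-output map is trained, and a fault is flagged when the residual exceeds a threshold derived from fault-free data.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(1)
    u = rng.uniform(-1, 1, size=(2000, 2))                  # process inputs (fault-free run)
    y = np.tanh(u[:, 0]) + 0.5 * u[:, 1] ** 2 + 0.01 * rng.normal(size=2000)

    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    model.fit(u, y)

    threshold = 3.0 * np.std(y - model.predict(u))          # residual bound from healthy data

    # New measurement with a simulated +0.3 sensor bias fault
    u_new = np.array([[0.2, -0.4]])
    y_new = np.tanh(0.2) + 0.5 * 0.16 + 0.3
    residual = y_new - model.predict(u_new)[0]
    print("fault detected" if abs(residual) > threshold else "no fault")
    ```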

  10. Community-Based Mental Health and Behavioral Programs for Low-Income Urban Youth: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Farahmand, Farahnaz K.; Duffy, Sophia N.; Tailor, Megha A.; Dubois, David L.; Lyon, Aaron L.; Grant, Kathryn E.; Zarlinski, Jennifer C.; Masini, Olivia; Zander, Keith J.; Nathanson, Alison M.

    2012-01-01

    A meta-analytic review of 33 studies and 41 independent samples was conducted of the effectiveness of community-based mental health and behavioral programs for low-income urban youth. Findings indicated positive effects, with an overall mean effect of 0.25 at post-test. While this is comparable to previous meta-analytic intervention research with…

  11. The Efficacy of Problem-Based Learning in an Analytical Laboratory Course for Pre-Service Chemistry Teachers

    ERIC Educational Resources Information Center

    Yoon, Heojeong; Woo, Ae Ja; Treagust, David; Chandrasegaran, A. L.

    2014-01-01

    The efficacy of problem-based learning (PBL) in an analytical chemistry laboratory course was studied using a programme that was designed and implemented with 20 students in a treatment group over 10 weeks. Data from 26 students in a traditional analytical chemistry laboratory course were used for comparison. Differences in the creative thinking…

  12. Analytical bunch compression studies for a linac-based electron accelerator

    NASA Astrophysics Data System (ADS)

    Schreck, M.; Wesolowski, P.

    2015-10-01

    The current paper deals with analytical bunch compression studies for FLUTE, whose results are compared to simulations. FLUTE is a linac-based electron accelerator with a design energy of approximately 40 MeV, currently being constructed at the Karlsruhe Institute of Technology. One of the goals of FLUTE is to generate electron bunches with lengths in the femtosecond regime. In the first phase this will be accomplished using a magnetic bunch compressor, which forms the subject of the studies presented. The paper is divided into two parts. The first part deals with purely geometric investigations of the bunch compressor, in which space charge effects and the back-reaction of the bunches with coherent synchrotron radiation are neglected. The second part is dedicated to the treatment of space charge effects. The upshot is that the analytical results in both parts agree quite well with what is obtained from simulations. This paper shall form the basis for future analytical studies of the FLUTE bunch compressor and of bunch compression in general.
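
    For orientation, the standard first-order relation for magnetic chicane compression is sketched below; this is the textbook form, not an expression taken from the paper, and it omits the space charge and coherent synchrotron radiation effects treated in the second part.

    ```latex
    % Textbook first-order chicane compression relation (not taken from the paper):
    % an energy chirp h combined with the chicane's momentum compaction R_56 rescales
    % the longitudinal coordinate and hence the rms bunch length.
    \[
      z_f \simeq (1 + h R_{56})\, z_i + R_{56}\,\delta_u ,
      \qquad
      \sigma_{z,f} \simeq \sqrt{(1 + h R_{56})^2\,\sigma_{z,i}^2 + R_{56}^2\,\sigma_{\delta_u}^2},
    \]
    % where h = d\delta/dz is the correlated energy chirp imprinted by the linac,
    % \delta_u is the uncorrelated energy spread, and full compression occurs near
    % h R_{56} = -1.
    ```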

  13. Analytical solution of equations describing slow axonal transport based on the stop-and-go hypothesis

    NASA Astrophysics Data System (ADS)

    Kuznetsov, Andrey

    2011-06-01

    This paper presents an analytical solution for slow axonal transport in an axon. The governing equations for slow axonal transport are based on the stop-and-go hypothesis which assumes that organelles alternate between short periods of rapid movement on microtubules (MTs), short on-track pauses, and prolonged off-track pauses, when they temporarily disengage from MTs. The model includes six kinetic states for organelles: two for off-track organelles (anterograde and retrograde), two for running organelles, and two for pausing organelles. An analytical solution is obtained for a steady-state situation. To obtain the analytical solution, the governing equations are uncoupled by using a perturbation method. The solution is validated by comparing it with a high-accuracy numerical solution. Results are presented for neurofilaments (NFs), which are characterized by small diffusivity, and for tubulin oligomers, which are characterized by large diffusivity. The difference in transport modes between these two types of organelles in a short axon is discussed. A comparison between zero-order and first-order approximations makes it possible to obtain a physical insight into the effects of organelle reversals (when organelles change the type of a molecular motor they are attached to, an anterograde versus retrograde motor).

  14. Diode laser based water vapor DIAL using modulated pulse technique

    NASA Astrophysics Data System (ADS)

    Pham, Phong Le Hoai; Abo, Makoto

    2014-11-01

    In this paper, we propose a diode-laser-based differential absorption lidar (DIAL) for measuring lower-tropospheric water vapor profiles using the modulated pulse technique. The transmitter is based on a single-mode diode laser and a tapered semiconductor optical amplifier with a peak power of 10 W in the 800 nm absorption band, and the receiver telescope diameter is 35 cm. The selected wavelengths are compared to reference wavelengths in terms of random and systematic errors. The key component of the modulated pulse technique, a macropulse, is generated with a repetition rate of 10 kHz, and the modulation within the macropulse is coded according to a pseudorandom sequence with a 100 ns chip width. We evaluate both the single-pulse modulation and the pseudorandom-coded pulse modulation techniques. The water vapor profiles obtained with these modulation techniques are compared with real observation data from summer in Japan.
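
    A minimal sketch of the pseudorandom-coded demodulation idea, with illustrative parameters rather than the paper's transmitter settings: an m-sequence is generated with a linear-feedback shift register, and the range-resolved return is recovered by circular cross-correlation with the code.

    ```python
    import numpy as np

    def m_sequence(taps=(7, 6), nbits=7):
        """Generate a +/-1 maximal-length sequence of length 2**nbits - 1 via an LFSR."""
        state = [1] * nbits
        out = []
        for _ in range(2 ** nbits - 1):
            out.append(state[-1])
            fb = state[taps[0] - 1] ^ state[taps[1] - 1]
            state = [fb] + state[:-1]
        return 2 * np.array(out) - 1                         # map {0,1} -> {-1,+1}

    code = m_sequence()                                      # 127-chip code (e.g. 100 ns chips)
    true_profile = np.zeros(code.size)
    true_profile[[10, 35, 60]] = [1.0, 0.5, 0.2]             # hypothetical range-resolved return

    # Received signal: circular convolution of the code with the profile, plus noise
    received = np.real(np.fft.ifft(np.fft.fft(code) * np.fft.fft(true_profile)))
    received += 0.1 * np.random.default_rng(2).normal(size=code.size)

    # Demodulate by circular cross-correlation with the code
    recovered = np.real(np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code)))) / code.size
    print(np.argsort(recovered)[-3:])                        # indices near 10, 35, 60
    ```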

  15. Bond strength with custom base indirect bonding techniques.

    PubMed

    Klocke, Arndt; Shi, Jianmin; Kahl-Nieke, Bärbel; Bismayer, Ulrich

    2003-04-01

    Different types of adhesives for indirect bonding techniques have been introduced recently. But there is limited information regarding bond strength with these new materials. In this in vitro investigation, stainless steel brackets were bonded to 100 permanent bovine incisors using the Thomas technique, the modified Thomas technique, and light-cured direct bonding for a control group. The following five groups of 20 teeth each were formed: (1) modified Thomas technique with thermally cured base composite (Therma Cure) and chemically cured sealant (Maximum Cure), (2) Thomas technique with thermally cured base composite (Therma Cure) and chemically cured sealant (Custom I Q), (3) Thomas technique with light-cured base composite (Transbond XT) and chemically cured sealant (Sondhi Rapid Set), (4) modified Thomas technique with chemically cured base adhesive (Phase II) and chemically cured sealant (Maximum Cure), and (5) control group directly bonded with light-cured adhesive (Transbond XT). Mean bond strengths in groups 3, 4, and 5 were 14.99 +/- 2.85, 15.41 +/- 3.21, and 13.88 +/- 2.33 MPa, respectively, and these groups were not significantly different from each other. Groups 1 (mean bond strength 7.28 +/- 4.88 MPa) and 2 (mean bond strength 7.07 +/- 4.11 MPa) showed significantly lower bond strengths than groups 3, 4, and 5 and a higher probability of bond failure. Both the original (group 2) and the modified (group 1) Thomas technique were able to achieve bond strengths comparable to the light-cured direct bonded control group.

  16. Structural optimization for the avoidance of self-excited vibrations based on analytical models

    NASA Astrophysics Data System (ADS)

    Spelsberg-Korspeter, Gottfried

    2010-11-01

    Self-excited vibrations are a severe problem in many technical applications. In many cases they are caused by friction as for example in disk and drum brakes, clutches, saws and paper calenders. The goal to suppress self-excited vibrations can be reached by active and passive techniques, the latter ones being preferable due to the lower costs. Among design engineers it is known that breaking the symmetries of structures is sometimes helpful to avoid self-excited vibrations. This has been verified from an analytical point of view in a recent paper. The goal of the present paper is to use this analytical insight for a systematic structural optimization of rotors in frictional contact. The first system investigated is a simple discrete model of a rotor in frictional contact. As a continuous example a rotating beam in frictional contact is considered and optimized with respect to its bending stiffness. Finally a brake disk is optimized giving some attention to the feasibility of the modifications for the production process.

  17. Design optimization of thin-film/wafer-based tandem junction solar cells using analytical modeling

    NASA Astrophysics Data System (ADS)

    Davidson, Lauren; Toor, Fatima

    2016-03-01

    Several research groups are developing solar cells of varying designs and materials that are high efficiency as well as cost competitive with the single-junction silicon (Si) solar cells commercially produced today. One of these solar cell designs is a tandem junction solar cell comprised of perovskite (CH3NH3PbI3) and silicon (Si). Loper et al.1 created a 13.4% efficient tandem cell using a perovskite top cell and a Si bottom cell, and researchers are confident that the perovskite/Si tandem cell can be optimized to reach higher efficiencies without introducing expensive manufacturing processes. However, there is currently no commercially available software capable of modeling a tandem cell that combines a thin-film top cell with a wafer-based bottom cell. While PC1D2 and SCAPS3 are able to model tandem cells comprised solely of thin-film absorbers or solely of wafer-based absorbers, they produce convergence errors if a thin-film/wafer-based tandem cell, such as the perovskite/Si cell, is modeled. The Matlab-based analytical model presented in this work is capable of modeling a thin-film/wafer-based tandem solar cell. The model allows a user to adjust the top and bottom cell parameters, such as reflectivity, material bandgaps, donor and acceptor densities, and material thicknesses, in order to optimize the short-circuit current, open-circuit voltage, and quantum efficiency of the tandem solar cell. Using the Matlab-based analytical model, we were able to optimize a perovskite/Si tandem cell with an efficiency greater than 30%.
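
    A minimal sketch of the series-connection constraint such a tandem model must enforce, using an ideal single-diode approximation and hypothetical subcell parameters (not the authors' Matlab model): both subcells carry the same current, their voltages add, and the subcell with the lower photocurrent limits the tandem.

    ```python
    import numpy as np

    def subcell_voltage(j, jsc, j0, n=1.0, t_k=300.0):
        """Invert the ideal single-diode equation J = Jsc - J0*(exp(qV/nkT) - 1) for V."""
        vt = 8.617e-5 * t_k                                  # thermal voltage kT/q in volts
        return n * vt * np.log((jsc - j) / j0 + 1.0)

    # Hypothetical subcell parameters (current densities in mA/cm^2)
    top = dict(jsc=19.0, j0=1e-18)                           # perovskite top cell
    bot = dict(jsc=18.0, j0=1e-10)                           # silicon bottom cell

    j = np.linspace(0.0, min(top["jsc"], bot["jsc"]) - 1e-3, 2000)   # series current sweep
    v_tandem = subcell_voltage(j, **top) + subcell_voltage(j, **bot)
    p = j * v_tandem
    i_mpp = np.argmax(p)
    limiter = ("top", "bottom")[bot["jsc"] < top["jsc"]]
    print(f"tandem Voc ~ {v_tandem[0]:.2f} V, Pmpp ~ {p[i_mpp]:.1f} mW/cm^2 "
          f"(current-limited by the {limiter} cell)")
    ```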

  18. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    ERIC Educational Resources Information Center

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…
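
    A minimal sketch of the classical Johnson-Neyman computation for a two-way linear interaction (the curvilinear extensions derived in the article are not reproduced here); the regression estimates and covariances are hypothetical.

    ```python
    import numpy as np
    from scipy import stats

    def jn_boundaries(b1, b3, var_b1, var_b3, cov_b13, df, alpha=0.05):
        """Solve (b1 + b3*z)^2 = t_crit^2 * Var(b1 + b3*z) for the moderator z
        (a quadratic in z) to find the region-of-significance boundaries."""
        t2 = stats.t.ppf(1 - alpha / 2, df) ** 2
        a = b3 ** 2 - t2 * var_b3
        b = 2 * (b1 * b3 - t2 * cov_b13)
        c = b1 ** 2 - t2 * var_b1
        disc = b ** 2 - 4 * a * c
        if disc < 0:
            return None                                      # slope significant everywhere or nowhere
        return np.sort((-b + np.array([-1.0, 1.0]) * np.sqrt(disc)) / (2 * a))

    # Hypothetical output from the model y = b0 + b1*x + b2*z + b3*x*z
    print(jn_boundaries(b1=0.30, b3=0.25, var_b1=0.02, var_b3=0.01,
                        cov_b13=-0.005, df=146))
    ```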

  19. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  20. Thermal modeling for pulsed radiofrequency ablation: analytical study based on hyperbolic heat conduction.

    PubMed

    López Molina, Juan A; Rivera, María J; Trujillo, Macarena; Berjano, Enrique J

    2009-04-01

    The objectives of this study were to model the temperature evolution in biological tissue during pulsed radiofrequency (RF) heating, to employ the hyperbolic heat transfer equation (HHTE), which takes thermal wave behavior into account, and to compare the results to those obtained using the heat transfer equation based on Fourier theory (FHTE). A theoretical model was built based on an active spherical electrode completely embedded in the biological tissue, after which the HHTE and FHTE were solved analytically. We found three typical waveforms for the temperature evolution depending on the relation between the dimensionless duration of the RF pulse delta(a) and the expression square root of lambda(rho-1), with lambda the dimensionless thermal relaxation time of the tissue and rho the dimensionless position. In the case of a single RF pulse, the temperature at any location was the result of the overlapping of two different heat sources delayed by a duration delta(a) (each heat source being produced by an RF pulse of limitless duration). The most remarkable feature of the HHTE analytical solution was the presence of temperature peaks traveling through the medium at a finite speed. These peaks occurred not only during the RF power switch-on period but also during switch-off. Finally, a physical explanation for these temperature peaks is proposed based on the interaction of forward and reverse thermal waves. All-purpose analytical solutions for the FHTE and HHTE during pulsed RF heating of biological tissues were obtained, which can be used for any value of pulsing frequency and duty cycle.
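
    For reference, the standard dimensional forms of the Fourier and hyperbolic (Cattaneo-Vernotte) heat transfer equations are given below; the paper itself works in the dimensionless variables quoted above, so its notation differs.

    ```latex
    % Standard dimensional forms (Cattaneo-Vernotte), for orientation only; not the
    % paper's dimensionless formulation.
    \begin{align*}
      \text{FHTE:}\quad & \rho c \,\frac{\partial T}{\partial t} = k \nabla^2 T + q, \\
      \text{HHTE:}\quad & \tau \rho c \,\frac{\partial^2 T}{\partial t^2}
          + \rho c \,\frac{\partial T}{\partial t}
          = k \nabla^2 T + q + \tau \frac{\partial q}{\partial t},
    \end{align*}
    % with \tau the thermal relaxation time of the tissue and q the volumetric RF heat source.
    ```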

  1. [application of the analytical transmission electron microscopy techniques for detection, identification and visualization of localization of nanoparticles of titanium and cerium oxides in mammalian cells].

    PubMed

    Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V

    2014-01-01

    This work presents the results of a study on the applicability of modern analytical transmission electron microscopy methods for the detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis was performed of images of the nanoparticles in the cells obtained in the bright-field mode of transmission electron microscopy, under dark-field scanning transmission electron microscopy, and under high-angle annular dark-field scanning transmission electron microscopy. For the identification of nanoparticles in the cells, the analytical techniques of energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy were compared, both for acquiring energy spectra from individual particles and for element mapping. It was shown that electron tomography can be used to confirm that nanoparticles are localized within the sample and are not covered by contamination. The possibilities and fields of application of the different analytical transmission electron microscopy techniques for the detection, visualization and identification of nanoparticles in biological samples are discussed.

  2. Research of the Urban-Rural Integration Evaluation Indicator System Based on Analytic Hierarchy Process

    NASA Astrophysics Data System (ADS)

    Zhe, Wang

    Scientifically evaluating the development level of urban-rural integration is a key problem that needs to be solved in urban-rural integration research. Based on an analysis of the many factors influencing urban-rural integration, this article conducts empirical research on the evaluation indicator system by applying the analytic hierarchy process (AHP). By constructing the judgment matrix and conducting a consistency test, both the eigenvector corresponding to the judgment matrix and the specific index weights can be obtained.
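
    A minimal sketch of the AHP steps described above, computing principal-eigenvector weights and Saaty's consistency ratio for an invented four-criterion judgment matrix (not the article's indicator system).

    ```python
    import numpy as np

    def ahp_weights(judgment):
        """Return (weights, consistency ratio) for a pairwise judgment matrix."""
        judgment = np.asarray(judgment, dtype=float)
        n = judgment.shape[0]
        eigvals, eigvecs = np.linalg.eig(judgment)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()
        ci = (eigvals[k].real - n) / (n - 1)                 # consistency index
        ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random index (n <= 6)
        return w, (ci / ri if ri else 0.0)

    # Hypothetical 4x4 reciprocal judgment matrix on the 1-9 scale
    A = [[1,   3,   5,   7],
         [1/3, 1,   3,   5],
         [1/5, 1/3, 1,   3],
         [1/7, 1/5, 1/3, 1]]
    w, cr = ahp_weights(A)
    print(np.round(w, 3), f"CR = {cr:.3f}")                  # CR < 0.10 passes the consistency test
    ```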

  3. [FUNCTIONAL ANALYTIC PSYCHOTHERAPY: APPROACHES AND SCOPE OF BEHAVIOR THERAPY BASED ON CHANGES IN THE THERAPEUTIC CONTEXT].

    PubMed

    Muñoz-Martínez, Amanda M; Coletti, Juan Pablo

    2015-01-01

    Functional Analytic Psychotherapy (FAP) is a contextually based therapeutic approach. FAP is characterized by the use of the therapeutic relationship and the behaviors emitted within it to improve clients' daily life functioning. This therapeutic model is grounded in behavior-analytic principles and the philosophy of functional contextualism. FAP proposes that clients' in-session behaviors are functionally equivalent to those outside of session; therefore, when therapists respond contingently to clients' in-session behaviors, they promote and increase improvements in the natural setting. This article presents the main features of FAP, its philosophical roots, its achievements, and the research challenges involved in establishing FAP as an independent evidence-based treatment.

  4. General analytical procedure for determination of acidity parameters of weak acids and bases.

    PubMed

    Pilarski, Bogusław; Kaliszan, Roman; Wyrzykowski, Dariusz; Młodzianowski, Janusz; Balińska, Agata

    2015-01-01

    The paper presents a new convenient, inexpensive, and reagent-saving general methodology for the determination of pKa values for the components of mixtures of weak organic acids and bases of diverse chemical classes in aqueous solution, without the need to separate the individual analytes. The data obtained from simple pH-metric microtitrations are numerically processed into reliable pKa values for each component of the mixture. Excellent agreement has been obtained between the determined pKa values and reference literature data for the compounds studied.
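
    A minimal sketch of recovering a pKa from pH-metric titration points for a single monoprotic acid (the paper's procedure handles multi-component mixtures, which this example does not attempt); the titration data are simulated.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hh_ph(f, pka):
        """pH as a function of the titrated fraction f for a monoprotic weak acid
        (Henderson-Hasselbalch relation)."""
        return pka + np.log10(f / (1.0 - f))

    # Simulated microtitration data: fraction neutralized vs. measured pH (true pKa = 4.76)
    f_obs = np.linspace(0.05, 0.95, 19)
    ph_obs = hh_ph(f_obs, 4.76) + np.random.default_rng(3).normal(0.0, 0.02, f_obs.size)

    (pka_fit,), _ = curve_fit(hh_ph, f_obs, ph_obs, p0=[5.0])
    print(f"fitted pKa = {pka_fit:.2f}")
    ```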

  5. Laser-based direct-write techniques for cell printing

    PubMed Central

    Schiele, Nathan R; Corr, David T; Huang, Yong; Raof, Nurazhani Abdul; Xie, Yubing; Chrisey, Douglas B

    2016-01-01

    Fabrication of cellular constructs with spatial control of cell location (±5 μm) is essential to the advancement of a wide range of applications including tissue engineering, stem cell and cancer research. Precise cell placement, especially of multiple cell types in co- or multi-cultures and in three dimensions, can enable research possibilities otherwise impossible, such as the cell-by-cell assembly of complex cellular constructs. Laser-based direct writing, a printing technique first utilized in electronics applications, has been adapted to transfer living cells and other biological materials (e.g., enzymes, proteins and bioceramics). Many different cell types have been printed using laser-based direct writing, and this technique offers significant improvements when compared to conventional cell patterning techniques. The predominance of work to date has not been in application of the technique, but rather focused on demonstrating the ability of direct writing to pattern living cells, in a spatially precise manner, while maintaining cellular viability. This paper reviews laser-based additive direct-write techniques for cell printing, and the various cell types successfully laser direct-written that have applications in tissue engineering, stem cell and cancer research are highlighted. A particular focus is paid to process dynamics modeling and process-induced cell injury during laser-based cell direct writing. PMID:20814088

  6. Evaluation of the Analytical Performance of the Coulometry-Based Optium Omega Blood Glucose Meter

    PubMed Central

    Solnica, Bogdan; Kusnierz-Cabala, Beata; Slowinska-Solnica, Krystyna; Witek, Przemyslaw; Cempa, Agnieszka; Malecki, Maciej T

    2011-01-01

    Background The goal of diabetes treatment is maintaining near normoglycemia based on self-monitoring of blood glucose (SMBG). In this study, an evaluation of the analytical performance of the coulometry-based Optium Omega™ glucose meter designed for SMBG has been carried out. Methods The assessment of precision and between-lot variability was based on glucose measurements in ethylene-diaminetetraacetic acid venous blood samples. Glucose concentrations measured in 289 fresh capillary blood samples using the Omega glucose meter and the Biosen C_line analyzer were compared. Results Within-run imprecision coefficient of variation for the lower and higher glucose concentrations amounted to 5.09 and 2.1%, respectively. The relative lot-dependent differences found for the lower and higher glucose concentrations were equal to 6.8 and 2.6%, respectively. The glucose meter error calculated for various concentration ranges amounted from 2.22 to 4.48%. The glucose meter error met the accuracy criteria recommended by the International Organization for Standardization and the American Diabetes Association. The Passing-Bablok agreement test and error grid analysis with 96% of results in zone A indicated good concordance of results, including glucose concentrations below 100 mg/dl. Conclusions The evaluated Optium Omega glucose meter fits the analytical requirements for its use in blood glucose monitoring in diabetes patients. PMID:22226286

  7. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts directed at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way of approaching likely threat scenarios. Several tools, such as EASI and SAPE, are currently available and can be used immediately; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network-based methodological approach for modelling adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool is capable of analyzing the most critical path and quantifying the probability of effectiveness of the system as a performance measure.
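
    A minimal sketch of the path analysis described above, on a toy protection network with hypothetical detection probabilities (delay and response timing are omitted): adversary paths from the outside to the target are enumerated, and the most critical path is the one with the lowest cumulative detection probability.

    ```python
    # Directed graph: node -> list of (next node, detection probability of the protection
    # element crossed on that step); all numbers are hypothetical.
    graph = {
        "outside": [("fence", 0.3), ("gate", 0.7)],
        "fence":   [("yard", 0.2)],
        "gate":    [("yard", 0.5)],
        "yard":    [("door", 0.6), ("wall", 0.4)],
        "door":    [("target", 0.8)],
        "wall":    [("target", 0.5)],
        "target":  [],
    }

    def paths(node, goal, visited=()):
        """Enumerate all simple adversary paths from node to goal."""
        if node == goal:
            yield [node]
            return
        for nxt, _ in graph[node]:
            if nxt not in visited:
                for tail in paths(nxt, goal, visited + (node,)):
                    yield [node] + tail

    def cumulative_detection(path):
        """Probability that the adversary is detected at least once along the path."""
        p_miss = 1.0
        for a, b in zip(path, path[1:]):
            p_miss *= 1.0 - dict(graph[a])[b]
        return 1.0 - p_miss

    critical = min(paths("outside", "target"), key=cumulative_detection)
    print(" -> ".join(critical), f"P(detection) = {cumulative_detection(critical):.2f}")
    ```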

  8. Analysis of acoustic damping in duct terminated by porous absorption materials based on analytical models and finite element simulations

    NASA Astrophysics Data System (ADS)

    Guan Qiming

    Acoustic absorption materials are widely used today to dampen and attenuate noise, which exists almost everywhere and adversely affects daily life. In order to evaluate the absorption performance of such materials, it is necessary to determine their acoustic properties experimentally. Two experimental methods, the Standing Wave Ratio Method and the Transfer-Function Method, together known as the Impedance Tube Method, are based on two analytical models that have been used to evaluate and validate data obtained from acoustic impedance analyzers. This thesis first reviews the existing analytical models of these two experimental methods in the literature. A new analytical model is then developed based on the One-Microphone Method and the Three-Microphone Method, two novel experimental approaches. Comparisons are made among these analytical models, and their advantages and disadvantages are discussed.
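
    A minimal sketch of the two-microphone transfer-function evaluation in the standard ISO 10534-2 form (not the thesis's One- or Three-Microphone model), using a synthetic standing-wave field and hypothetical tube geometry to recover a known reflection coefficient and the normal-incidence absorption coefficient.

    ```python
    import numpy as np

    c = 343.0                      # speed of sound, m/s
    f = 1000.0                     # test frequency, Hz
    k = 2 * np.pi * f / c          # wavenumber
    x1 = 0.10                      # distance from the sample surface to the farther microphone, m
    s = 0.03                       # microphone spacing, m (mic 2 sits at x1 - s, closer to the sample)

    # Synthetic standing-wave field for a sample with a known reflection coefficient r0
    r0 = 0.6 * np.exp(1j * 0.4)
    p = lambda x: np.exp(1j * k * x) + r0 * np.exp(-1j * k * x)
    H12 = p(x1 - s) / p(x1)        # measured transfer function between the two microphones

    # Transfer-function method: recover r, then the absorption coefficient alpha
    r = (H12 - np.exp(-1j * k * s)) / (np.exp(1j * k * s) - H12) * np.exp(2j * k * x1)
    alpha = 1.0 - abs(r) ** 2
    print(f"recovered |r| = {abs(r):.3f} (true 0.600), alpha = {alpha:.3f}")
    ```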

  9. Dissolved Silver in Marine Waters: Reviewing Three Decades of Advances in Analytical Techniques and Understanding its Biogeochemical Cycling

    NASA Astrophysics Data System (ADS)

    Ndungu, K.; Flegal, A. R., Jr.

    2015-12-01

    Although billions of dollars have been spent over the past half-century to reduce contamination of U.S. waters, quantifying parts-per-billion reductions in surface water concentrations has since been relatively unsuccessful. The reasons for the failure to identify the benefits of these remediation efforts include: (i) historic (pre-1980) problems in accurately sampling and analyzing trace element concentrations at the parts-per-billion level, so that apparent temporal reductions in trace metal contamination reflected improved sampling and analytical accuracy rather than real decreases in those concentrations; and (ii) limited seasonal and long-term research. Silver in its ionic form is more toxic to aquatic organisms than any other metal except Hg. Because Ag is not naturally common in the environment, its elevated presence in water, sediment or biological tissues is usually indicative of anthropogenic influences. However, there are very few published data on Ag levels in either water or sediment. The published studies include Ag levels in a few U.S. estuarine waters, including detailed time-series studies of the San Francisco Estuary system by the WIGS lab at UC Santa Cruz. In the open ocean, Ag measurements are limited to a few studies in the North and South Pacific and the North and South Atlantic. However, as Gallon and Flegal recently noted, there are no available data on Ag concentrations from the Indian Ocean! Most of the dissolved Ag data from the Atlantic were produced in the WIGS lab at UC Santa Cruz. Analytical determination of Ag in seawater has come a long way since Murozumi reported the first dissolved Ag measurements from the N. Pacific in 1981 using isotope dilution MS after solvent extraction. In this presentation I will review analytical developments for Ag determination over the last three decades. I will also highlight the remaining data gaps and present new tentative data on dissolved Ag concentration and cycling in polar regions including the Antarctic (Amundsen Sea

  10. Knowledge-based geographic information systems (KBGIS): new analytic and data management tools

    SciTech Connect

    Albert, T.M.

    1988-11-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can enhance greatly the capabilities of a GIS, particularly in handling very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the US Geological Survey which incorporates AI techniques such as learning, expert systems, new data representation, and more. The system, which will be developed further and applied, is a prototype of the next generation of GIS's, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved.

  11. Knowledge-based geographic information systems (KBGIS): New analytic and data management tools

    USGS Publications Warehouse

    Albert, T.M.

    1988-01-01

    In its simplest form, a geographic information system (GIS) may be viewed as a data base management system in which most of the data are spatially indexed, and upon which sets of procedures operate to answer queries about spatial entities represented in the data base. Utilization of artificial intelligence (AI) techniques can enhance greatly the capabilities of a GIS, particularly in handling very large, diverse data bases involved in the earth sciences. A KBGIS has been developed by the U.S. Geological Survey which incorporates AI techniques such as learning, expert systems, new data representation, and more. The system, which will be developed further and applied, is a prototype of the next generation of GIS's, an intelligent GIS, as well as an example of a general-purpose intelligent data handling system. The paper provides a description of KBGIS and its application, as well as the AI techniques involved. © 1988 International Association for Mathematical Geology.

  12. Evaluation of generic types of drilling fluid using a risk-based analytic hierarchy process.

    PubMed

    Sadiq, Rehan; Husain, Tahir; Veitch, Brian; Bose, Neil

    2003-12-01

    The composition of drilling muds is based on a mixture of clays and additives in a base fluid. There are three generic categories of base fluid--water, oil, and synthetic. Water-based fluids (WBFs) are relatively environmentally benign, but drilling performance is better with oil-based fluids (OBFs). The oil and gas industry developed synthetic-based fluids (SBFs), such as vegetable esters, olefins, ethers, and others, which provide drilling performance comparable to OBFs, but with lower environmental and occupational health effects. The primary objective of this paper is to present a methodology to guide decision-making in the selection and evaluation of three generic types of drilling fluids using a risk-based analytic hierarchy process (AHP). In this paper a comparison of drilling fluids is made considering various activities involved in the life cycle of drilling fluids. This paper evaluates OBFs, WBFs, and SBFs based on four major impacts--operations, resources, economics, and liabilities. Four major activities--drilling, discharging offshore, loading and transporting, and disposing onshore--cause the operational impacts. Each activity involves risks related to occupational injuries (safety), general public health, environmental impact, and energy use. A multicriteria analysis strategy was used for the selection and evaluation of drilling fluids using a risk-based AHP. A four-level hierarchical structure is developed to determine the final relative scores, and the SBFs are found to be the best option.

  13. Evaluation of generic types of drilling fluid using a risk-based analytic hierarchy process.

    PubMed

    Sadiq, Rehan; Husain, Tahir; Veitch, Brian; Bose, Neil

    2003-12-01

    The composition of drilling muds is based on a mixture of clays and additives in a base fluid. There are three generic categories of base fluid--water, oil, and synthetic. Water-based fluids (WBFs) are relatively environmentally benign, but drilling performance is better with oil-based fluids (OBFs). The oil and gas industry developed synthetic-based fluids (SBFs), such as vegetable esters, olefins, ethers, and others, which provide drilling performance comparable to OBFs, but with lower environmental and occupational health effects. The primary objective of this paper is to present a methodology to guide decision-making in the selection and evaluation of three generic types of drilling fluids using a risk-based analytic hierarchy process (AHP). In this paper a comparison of drilling fluids is made considering various activities involved in the life cycle of drilling fluids. This paper evaluates OBFs, WBFs, and SBFs based on four major impacts--operations, resources, economics, and liabilities. Four major activities--drilling, discharging offshore, loading and transporting, and disposing onshore--cause the operational impacts. Each activity involves risks related to occupational injuries (safety), general public health, environmental impact, and energy use. A multicriteria analysis strategy was used for the selection and evaluation of drilling fluids using a risk-based AHP. A four-level hierarchical structure is developed to determine the final relative scores, and the SBFs are found to be the best option. PMID:15160901

  14. Advances in analytical chemistry

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.

    1991-01-01

    Implementation of computer programs based on multivariate statistical algorithms makes it possible to obtain reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.
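
    A minimal sketch of the kind of multivariate calibration referred to above, using synthetic spectra rather than the three applications described: partial least squares regression recovers an analyte concentration from long, noisy data vectors containing an interfering component.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(5)
    wavelengths = np.linspace(400, 700, 300)
    peak = lambda c, w: np.exp(-((wavelengths - c) / w) ** 2)

    conc = rng.uniform(0, 1, size=80)                        # analyte of interest
    interferent = rng.uniform(0, 1, size=80)                 # analyte we do not wish to control
    spectra = (conc[:, None] * peak(500, 20)
               + interferent[:, None] * peak(520, 30)
               + 0.02 * rng.normal(size=(80, 300)))          # long, noisy data vectors

    model = PLSRegression(n_components=3).fit(spectra[:60], conc[:60])
    pred = model.predict(spectra[60:]).ravel()
    print(f"RMSE on held-out samples: {np.sqrt(np.mean((pred - conc[60:]) ** 2)):.3f}")
    ```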

  15. An analytical platform for mass spectrometry-based identification and chemical analysis of RNA in ribonucleoprotein complexes.

    PubMed

    Taoka, Masato; Yamauchi, Yoshio; Nobe, Yuko; Masaki, Shunpei; Nakayama, Hiroshi; Ishikawa, Hideaki; Takahashi, Nobuhiro; Isobe, Toshiaki

    2009-11-01

    We describe here a mass spectrometry (MS)-based analytical platform of RNA, which combines direct nano-flow reversed-phase liquid chromatography (RPLC) on a spray tip column and a high-resolution LTQ-Orbitrap mass spectrometer. Operating RPLC under a very low flow rate with volatile solvents and MS in the negative mode, we could estimate highly accurate mass values sufficient to predict the nucleotide composition of a approximately 21-nucleotide small interfering RNA, detect post-transcriptional modifications in yeast tRNA, and perform collision-induced dissociation/tandem MS-based structural analysis of nucleolytic fragments of RNA at a sub-femtomole level. Importantly, the method allowed the identification and chemical analysis of small RNAs in ribonucleoprotein (RNP) complex, such as the pre-spliceosomal RNP complex, which was pulled down from cultured cells with a tagged protein cofactor as bait. We have recently developed a unique genome-oriented database search engine, Ariadne, which allows tandem MS-based identification of RNAs in biological samples. Thus, the method presented here has broad potential for automated analysis of RNA; it complements conventional molecular biology-based techniques and is particularly suited for simultaneous analysis of the composition, structure, interaction, and dynamics of RNA and protein components in various cellular RNP complexes.

  16. An analytical platform for mass spectrometry-based identification and chemical analysis of RNA in ribonucleoprotein complexes

    PubMed Central

    Taoka, Masato; Yamauchi, Yoshio; Nobe, Yuko; Masaki, Shunpei; Nakayama, Hiroshi; Ishikawa, Hideaki; Takahashi, Nobuhiro; Isobe, Toshiaki

    2009-01-01

    We describe here a mass spectrometry (MS)-based analytical platform of RNA, which combines direct nano-flow reversed-phase liquid chromatography (RPLC) on a spray tip column and a high-resolution LTQ-Orbitrap mass spectrometer. Operating RPLC under a very low flow rate with volatile solvents and MS in the negative mode, we could estimate highly accurate mass values sufficient to predict the nucleotide composition of a ∼21-nucleotide small interfering RNA, detect post-transcriptional modifications in yeast tRNA, and perform collision-induced dissociation/tandem MS-based structural analysis of nucleolytic fragments of RNA at a sub-femtomole level. Importantly, the method allowed the identification and chemical analysis of small RNAs in ribonucleoprotein (RNP) complex, such as the pre-spliceosomal RNP complex, which was pulled down from cultured cells with a tagged protein cofactor as bait. We have recently developed a unique genome-oriented database search engine, Ariadne, which allows tandem MS-based identification of RNAs in biological samples. Thus, the method presented here has broad potential for automated analysis of RNA; it complements conventional molecular biology-based techniques and is particularly suited for simultaneous analysis of the composition, structure, interaction, and dynamics of RNA and protein components in various cellular RNP complexes. PMID:19740761

  17. Determination of the fate of three Fusarium mycotoxins through wet-milling of maize using an improved HPLC analytical technique.

    PubMed

    Lauren, D R; Ringrose, M A

    1997-07-01

    The fate of three Fusarium mycotoxins, nivalenol (NIV), deoxynivalenol (DON) and zearalenone (ZEN), all common contaminants in New Zealand-grown maize, has been measured in fractions of maize after passage through a commercial wet-milling plant. Distribution of the three toxins follows a pattern reasonably expected from their physical solubility characteristics. The highly water-soluble mycotoxins, NIV and DON, were found at high concentrations (up to 8.8 mg/kg) in concentrated steep liquor (CSL) fractions, but at low levels (less than 0.3 mg/kg) in the solid (germ, fibre and gluten) fractions. The converse was true for ZEN, which is relatively insoluble in water. For ZEN, the maximum concentration found in CSL was 0.6 mg/kg compared with 2.2-4.8 mg/kg in germ, fibre and gluten fractions. Accordingly, an animal food byproduct composed mainly of pressed fibre and concentrated steep liquor was usually found to contain concentrations of all three mycotoxins above those existing in the input maize. A single sample of corn oil recovered during the study also had a high concentration (4.6 mg/kg) of ZEN. The analytical clean-up method used converts all trichothecenes present to parent alcohols, therefore results are indicative of total trichothecene content. HPLC analytical conditions suitable for the analysis of NIV and DON in complex process grain products are also described. PMID:9328527

  18. Identifying functional subdivisions in the human brain using meta-analytic activation modeling-based parcellation.

    PubMed

    Yang, Yong; Fan, Lingzhong; Chu, Congying; Zhuo, Junjie; Wang, Jiaojian; Fox, Peter T; Eickhoff, Simon B; Jiang, Tianzi

    2016-01-01

    Parcellation of the human brain into fine-grained units by grouping voxels into distinct clusters has been an effective approach for delineating specific brain regions and their subregions. Published neuroimaging studies employing coordinate-based meta-analyses have shown that the activation foci and their corresponding behavioral categories may contain useful information about the anatomical-functional organization of brain regions. Inspired by these developments, we proposed a new parcellation scheme called meta-analytic activation modeling-based parcellation (MAMP) that uses meta-analytically obtained information. The raw meta data, including the experiments and the reported activation coordinates related to a brain region of interest, were acquired from the Brainmap database. Using this data, we first obtained the "modeled activation" pattern by modeling the voxel-wise activation probability given spatial uncertainty for each experiment that featured at least one focus within the region of interest. Then, we processed these "modeled activation" patterns across the experiments with a K-means clustering algorithm to group the voxels into different subregions. In order to verify the reliability of the method, we employed our method to parcellate the amygdala and the left Brodmann area 44 (BA44). The parcellation results were quite consistent with previous cytoarchitectonic and in vivo neuroimaging findings. Therefore, the MAMP proposed in the current study could be a useful complement to other methods for uncovering the functional organization of the human brain.
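
    The clustering step described above can be illustrated with a minimal sketch, assuming the "modeled activation" patterns have already been computed as a voxel-by-experiment matrix; the data, shapes, and cluster count below are illustrative placeholders, not values from the study.

```python
# Minimal sketch of the MAMP clustering step: group voxels of a region of interest
# by their modeled-activation profiles across experiments using K-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_voxels, n_experiments = 500, 120
# Hypothetical voxel-by-experiment "modeled activation" matrix (probabilities in [0, 1]).
modeled_activation = rng.random((n_voxels, n_experiments))

k = 3  # candidate number of subregions
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(modeled_activation)

# Each voxel now carries a subregion label; mapping labels back to voxel coordinates
# would yield the parcellation image.
for subregion in range(k):
    print(f"subregion {subregion}: {int(np.sum(labels == subregion))} voxels")
```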

  19. Integration of Environmental Analytical Chemistry with Environmental Law: The Development of a Problem-Based Laboratory

    NASA Astrophysics Data System (ADS)

    Cancilla, Devon A.

    2001-12-01

    Environmental chemists face difficult challenges related to generating, interpreting, and communicating complex chemical data in a manner understandable by nonchemists. For this reason, it is essential that environmental chemistry students develop the skills necessary not only to collect and interpret complex data sets, but also to communicate their findings in a credible manner in nonscientific forums. Key to this requirement is an understanding of the quality assurance/quality control (QA/QC) elements used to support specific findings. This paper describes the development of a problem-based undergraduate environmental analytical chemistry laboratory and its integration with an undergraduate environmental law course. The course is designed to introduce students to the principles of performance-based analytical methods and the use of environmental indicators to perform environmental assessments. Conducting a series of chemical and toxicological tests, chemistry students perform an environmental assessment on the watershed of the mythical City of Rowan. Law students use these assessments to develop legal arguments under both the Safe Drinking Water Act and the Clean Water Act.

  20. Microfluidic paper-based analytical devices fabricated by low-cost photolithography and embossing of Parafilm®.

    PubMed

    Yu, Ling; Shi, Zhuan Zhuan

    2015-04-01

    Microfluidic paper-based analytical devices (μPADs) attract tremendous attention as an economical tool for in-field diagnosis, food safety and environmental monitoring. We innovatively fabricated 2D and 3D μPADs by photolithographically patterning microchannels on Parafilm® and subsequently embossing them onto paper. This truly low-cost approach, which requires neither a wax printer nor a cutter plotter, offers the opportunity for researchers from resource-limited laboratories to work on paper-based analytical devices.

  1. From Input to Output: Communication-Based Teaching Techniques.

    ERIC Educational Resources Information Center

    Tschirner, Erwin

    1992-01-01

    Communication-based teaching techniques are described that lead German language students from input to output in a stimulating and motivating learning environment. Input activities are most useful for presenting speech acts, vocabulary, and grammar; output activities, for fine-tuning those areas as well as for expanding students' productive…

  2. "Ayeli": Centering Technique Based on Cherokee Spiritual Traditions.

    ERIC Educational Resources Information Center

    Garrett, Michael Tlanusta; Garrett, J. T.

    2002-01-01

    Presents a centering technique called "Ayeli," based on Cherokee spiritual traditions as a way of incorporating spirituality into counseling by helping clients identify where they are in their journey, where they want to be, and how they can get there. Relevant Native cultural traditions and meanings are explored. (Contains 25 references.) (GCP)

  3. Paper-based analytical devices for electrochemical study of the breathing process of red blood cells.

    PubMed

    Lin, Xiang-Yun; Wu, Ling-Ling; Pan, Zhong-Qin; Shi, Chuan-Guo; Bao, Ning; Gu, Hai-Ying

    2015-04-01

    Herein we utilized the filter paper to physically trap red blood cells (RBC) to observe the breathing process of red blood cells based on the permeability of the filter paper. By integrating double-sided conductive carbon tape as the working electrodes, the device could be applied to monitor electrochemical responses of RBC for up to hundreds of minutes. The differential pulse voltammetry (DPV) peak currents increased under oxygen and decreased under nitrogen, indicating that RBC could take in and release oxygen. Further studies demonstrated that the RBC suspension could more effectively take in oxygen than the solution of hemoglobin and the supernatant of RBC, suggesting the natural advantage of RBC on oxygen transportation. This study implied that simple paper-based analytical devices might be effectively applied in the study of gas-participating reactions and biochemical detections.

  4. New analytical techniques for mycotoxins in complex organic matrices. [Aflatoxins B1, B2, G1, and G2

    SciTech Connect

    Bicking, M.K.L.

    1982-07-01

    Air samples are collected for analysis from the Ames Solid Waste Recovery System. The high level of airborne fungi within the processing area is of concern due to the possible presence of toxic mycotoxins and carcinogenic fungal metabolites. An analytical method has been developed to determine the concentration of aflatoxins B1, B2, G1, and G2 in the air of the plant which produces Refuse Derived Fuel (RDF). After extraction with methanol, some components in the matrix are precipitated by dissolving the sample in 30% acetonitrile/chloroform. An aliquot of this solution is injected onto a Styragel column where the sample components undergo simultaneous size exclusion and reverse phase partitioning. Additional studies have provided a more thorough understanding of solvent related non-exclusion effects on size exclusion gels. The Styragel column appears to have a usable lifetime of more than six months. After elution from Styragel, the sample is diverted to a second column containing Florisil which has been modified with oxalic acid and deactivated with water. Aflatoxins are eluted with 5% water/acetone. After removal of this solvent, the sample is dissolved in 150 μL of a spotting solvent and the entire sample applied to a thin layer chromatography (TLC) plate using a unique sample applicator developed here. The aflatoxins on the TLC plate are analyzed by laser fluorescence. A detection limit of 10 pg is possible for aflatoxin standards using a nitrogen laser as the excitation source. Sample concentrations are determined by comparing with an internal standard, a specially synthesized aflatoxin derivative. In two separate RDF samples, aflatoxin B1 was found at levels of 6.5 and 17.0 ppB. The analytical method has also proven useful in the analysis of contaminated corn and peanut meal samples. 42 figures, 8 tables.

  5. Video multiple watermarking technique based on image interlacing using DWT.

    PubMed

    Ibrahim, Mohamed M; Abdel Kader, Neamat S; Zorkany, M

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In the nonblind watermarking systems, the need of the original host file in the watermark recovery operation makes an overhead over the system resources, doubles memory capacity, and doubles communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570
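
    As a rough illustration of DWT-domain embedding (not the paper's full scheme, which adds video interlacing, Arnold scrambling, and multiple watermarks), a minimal non-blind sketch using PyWavelets might look like the following; the wavelet choice, embedding strength, and image sizes are assumptions.

```python
# Simplified DWT watermarking sketch: additive embedding in the level-3
# approximation band of a single grayscale frame, with non-blind extraction.
import numpy as np
import pywt

def embed(host, watermark, alpha=0.05):
    coeffs = pywt.wavedec2(host, "haar", level=3)
    cA3 = coeffs[0]
    wm = np.resize(watermark, cA3.shape)        # fit the watermark to the band size
    coeffs[0] = cA3 + alpha * wm                # additive embedding in the approximation band
    return pywt.waverec2(coeffs, "haar"), cA3

def extract(watermarked, original_cA3, alpha=0.05):
    cA3_w = pywt.wavedec2(watermarked, "haar", level=3)[0]
    return (cA3_w - original_cA3) / alpha       # non-blind recovery

host = np.random.rand(256, 256)                 # stand-in for one video frame (grayscale)
watermark = np.random.rand(32, 32)
marked, reference_band = embed(host, watermark)
recovered = extract(marked, reference_band)
```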

  7. Video Multiple Watermarking Technique Based on Image Interlacing Using DWT

    PubMed Central

    Ibrahim, Mohamed M.; Abdel Kader, Neamat S.; Zorkany, M.

    2014-01-01

    Digital watermarking is one of the important techniques to secure digital media files in the domains of data authentication and copyright protection. In the nonblind watermarking systems, the need of the original host file in the watermark recovery operation makes an overhead over the system resources, doubles memory capacity, and doubles communications bandwidth. In this paper, a robust video multiple watermarking technique is proposed to solve this problem. This technique is based on image interlacing. In this technique, three-level discrete wavelet transform (DWT) is used as a watermark embedding/extracting domain, Arnold transform is used as a watermark encryption/decryption method, and different types of media (gray image, color image, and video) are used as watermarks. The robustness of this technique is tested by applying different types of attacks such as: geometric, noising, format-compression, and image-processing attacks. The simulation results show the effectiveness and good performance of the proposed technique in saving system resources, memory capacity, and communications bandwidth. PMID:25587570

  8. Estimating flood-frequency curves with scarce data: a physically-based analytic approach

    NASA Astrophysics Data System (ADS)

    Basso, Stefano; Schirmer, Mario; Botter, Gianluca

    2016-04-01

    Predicting magnitude and frequency of floods is a key issue for hazard assessment and mitigation. While observations and statistical methods provide good estimates when long data series are available, their performances deteriorate with limited data. Moreover, the outcome of varying hydroclimatic drivers can hardly be evaluated by these methods. Physically-based approaches embodying mechanics of streamflow generation provide a valuable alternative that may improve purely statistical estimates and cope with human-induced alteration of climate and landscape. In this work, a novel analytic approach is proposed to derive seasonal flood-frequency curves, and to estimate the recurrence intervals of seasonal maxima. The method builds on a stochastic description of daily streamflows, arising from rainfall and soil moisture dynamics in the catchment. The limited number of parameters involved in the formulation embody climate and landscape attributes of the contributing catchment, and can be specified based on daily rainfall and streamflow data. The application to two case studies suggests the model's ability to provide reliable estimates of seasonal flood-frequency curves in different climatic settings, and to mimic shapes of flood-frequency curves emerging in persistent and erratic flow regimes. The method is especially valuable when only short data series are available (e.g. newly or temporarily gauged catchments, modified climatic or landscape features). Indeed, estimates provided by the model for high flow events characterized by recurrence times greater than the available sample size do not deteriorate significantly, as compared to performance of purely statistical methods. The proposed physically-based analytic approach represents a first step toward a probabilistic characterization of extremes based on climate and landscape attributes, which may be especially valuable to assess flooding hazard in data scarce regions and support the development of reliable mitigation

  9. Human motion planning based on recursive dynamics and optimal control techniques

    NASA Technical Reports Server (NTRS)

    Lo, Janzen; Huang, Gang; Metaxas, Dimitris

    2002-01-01

    This paper presents an efficient optimal control and recursive dynamics-based computer animation system for simulating and controlling the motion of articulated figures. A quasi-Newton nonlinear programming technique (super-linear convergence) is implemented to solve minimum torque-based human motion-planning problems. The explicit analytical gradients needed in the dynamics are derived using a matrix exponential formulation and Lie algebra. Cubic spline functions are used to make the search space for an optimal solution finite. Based on our formulations, our method is well conditioned and robust, in addition to being computationally efficient. To better illustrate the efficiency of our method, we present results of natural looking and physically correct human motions for a variety of human motion tasks involving open and closed loop kinematic chains.
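
    The optimization structure described above (spline-parameterized joint trajectories, inverse dynamics for torque, quasi-Newton search) can be sketched on a toy single-link pendulum; the recursive dynamics, Lie-algebra gradients, and articulated-figure models of the paper are not reproduced, and all numbers are illustrative.

```python
# Toy minimum-torque motion planning: cubic-spline trajectory, pendulum inverse
# dynamics, and a quasi-Newton (BFGS) search over the interior knot angles.
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.optimize import minimize

m, l, g = 1.0, 0.5, 9.81                    # link mass [kg], length [m], gravity
I = m * l**2                                # point-mass inertia about the joint
t_knots = np.linspace(0.0, 1.0, 6)          # spline knots over a 1 s motion
t_eval = np.linspace(0.0, 1.0, 101)
theta0, thetaT = 0.0, np.pi / 2             # boundary joint angles [rad]

def torque_cost(free_knots):
    # Interior knot angles are the decision variables; endpoints are fixed,
    # with zero end velocities enforced by the clamped spline.
    knots = np.concatenate(([theta0], free_knots, [thetaT]))
    traj = CubicSpline(t_knots, knots, bc_type="clamped")
    theta, theta_dd = traj(t_eval), traj(t_eval, 2)
    tau = I * theta_dd + m * g * l * np.sin(theta)   # inverse dynamics (pendulum)
    return float(np.mean(tau**2))                    # proxy for integrated squared torque

x0 = np.linspace(theta0, thetaT, 6)[1:-1]            # straight-line initial guess
res = minimize(torque_cost, x0, method="BFGS")       # quasi-Newton search
print(res.x)                                         # optimized interior knot angles
```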

  10. Effect of temperature on acid-base equilibria in separation techniques. A review.

    PubMed

    Gagliardi, Leonardo G; Tascon, Marcos; Castells, Cecilia B

    2015-08-19

    Studies on the theoretical principles of acid-base equilibria are reviewed and the influence of temperature on secondary chemical equilibria within the context of separation techniques, in water and also in aqueous-organic solvent mixtures, is discussed. In order to define the relationships between the retention in liquid chromatography or the migration velocity in capillary electrophoresis and temperature, the main properties of acid-base equilibria have to be taken into account for both, the analytes and the conjugate pairs chosen to control the solution pH. The focus of this review is based on liquid-liquid extraction (LLE), liquid chromatography (LC) and capillary electrophoresis (CE), with emphasis on the use of temperature as a useful variable to modify selectivity on a predictable basis. Simplified models were evaluated to achieve practical optimizations involving pH and temperature (in LLE and CE) as well as solvent composition in reversed-phase LC.
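
    A standard way to quantify how a protonation constant, and hence retention or migration behavior, shifts with temperature is the van 't Hoff relation; the sketch below is a generic illustration with assumed enthalpy values, not the specific treatment given in the review.

```python
# Van 't Hoff estimate of a pKa shift with temperature, assuming a
# temperature-independent reaction enthalpy. Values are illustrative only.
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def pKa_at_T(pKa_ref, T_ref, T, dH):
    """Shift a pKa from T_ref to T (both in K) for a reaction enthalpy dH (J/mol)."""
    lnK_ref = -pKa_ref * math.log(10.0)
    lnK = lnK_ref - (dH / R) * (1.0 / T - 1.0 / T_ref)
    return -lnK / math.log(10.0)

# Example: an acetic-acid-like analyte (pKa 4.76 at 25 degC, dH of about -0.4 kJ/mol) at 60 degC.
print(round(pKa_at_T(4.76, 298.15, 333.15, -400.0), 3))
```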

  11. Application and further development of diffusion based 2D chemical imaging techniques in the rhizosphere

    NASA Astrophysics Data System (ADS)

    Hoefer, Christoph; Santner, Jakob; Borisov, Sergey; Kreuzeder, Andreas; Wenzel, Walter; Puschenreiter, Markus

    2015-04-01

    Two dimensional chemical imaging of root processes refers to novel in situ methods to investigate and map solutes at a high spatial resolution (sub-mm). The visualization of these solutes reveals new insights in soil biogeochemistry and root processes. We derive chemical images by using data from DGT-LA-ICP-MS (Diffusive Gradients in Thin Films and Laser Ablation Inductively Coupled Plasma Mass Spectrometry) and POS (Planar Optode Sensors). Both technologies have shown promising results when applied in aqueous environment but need to be refined and improved for imaging at the soil-plant interface. Co-localized mapping using combined DGT and POS technologies and the development of new gel combinations are in our focus. DGTs are smart, thin (<0.4 mm) hydrogels containing a binding resin for the targeted analytes (e.g. trace metals, phosphate, sulphide or radionuclides). The measurement principle is passive and diffusion based. The analytes diffuse into the gel and are bound by the resin. Thereby, the resin acts as zero sink. After application, DGTs are retrieved, dried, and analysed using LA-ICP-MS. The data is then normalized by an internal standard (e.g. 13C), calibrated using in-house standards and chemical images of the target area are plotted using imaging software. POS are, similar to DGT, thin sensor foils containing a fluorophore coating depending on the target analyte. The measurement principle is based on excitation of the fluorophore by a specific wavelength and emission of the fluorophore depending on the presence of the analyte. The emitted signal is captured using optical filters and a DSLR camera. While DGT analysis is destructive, POS measurements can be performed continuously during the application. Both semi-quantitative techniques allow an in situ application to visualize chemical processes directly at the soil-plant interface. Here, we present a summary of results from rhizotron experiments with different plants in metal

  12. Analyst in the Nursery: An Application of Child Analytic Techniques in a Therapeutic Nursery. I. A Schematic Description.

    ERIC Educational Resources Information Center

    Kliman, Gilbert

    The Cornerstone Project is an application of child psychoanalytic techniques in synergy with therapeutic nursery education. The Cornerstone School and the method associated with it provide treatment for children ages three to six years within a nursery classroom group setting. A therapist works six or more hours per week in the classroom, during…

  13. Experiments on Adaptive Techniques for Host-Based Intrusion Detection

    SciTech Connect

    DRAELOS, TIMOTHY J.; COLLINS, MICHAEL J.; DUGGAN, DAVID P.; THOMAS, EDWARD V.; WUNSCH, DONALD

    2001-09-01

    This research explores four experiments of adaptive host-based intrusion detection (ID) techniques in an attempt to develop systems that can detect novel exploits. The technique considered to have the most potential is adaptive critic designs (ACDs) because of their utilization of reinforcement learning, which allows learning exploits that are difficult to pinpoint in sensor data. Preliminary results of ID using an ACD, an Elman recurrent neural network, and a statistical anomaly detection technique demonstrate an ability to learn to distinguish between clean and exploit data. We used the Solaris Basic Security Module (BSM) as a data source and performed considerable preprocessing on the raw data. A detection approach called generalized signature-based ID is recommended as a middle ground between signature-based ID, which has an inability to detect novel exploits, and anomaly detection, which detects too many events including events that are not exploits. The primary results of the ID experiments demonstrate the use of custom data for generalized signature-based intrusion detection and the ability of neural network-based systems to learn in this application environment.

  14. Implementation and robustness of an analytically based battery state of power

    NASA Astrophysics Data System (ADS)

    Wik, Torsten; Fridholm, Björn; Kuusisto, Hannes

    2015-08-01

    Today it is common practice to use simplified equivalent circuit models for predicting the short term behaviour of the voltage and current during charging and discharging battery cells. If the circuit parameters are assumed to be unchanged, the response for a given open circuit voltage (OCV) will be the solution to a linear ordinary differential equation. This means that for given voltage limits the maximum charge and discharge powers can be analytically derived. In advanced battery management units, such as those used for hybrid electric vehicles, it is central to know how much the battery can be charged or discharged within a certain range of time, which is one definition of state of power (SoP). Using the linearizing assumption we derive a method for an adaptive estimation of the state of power based on incremental analysis. The method is easy to implement and has two tuning parameters that are straightforward to relate to. Using frequency analysis the method is analytically proven to have very strong robustness properties. The risk of exceeding voltage limits by effectively applying the maximum charge or discharge currents is marginal in spite of large circuit parameter errors, unmodelled hysteresis, unknown OCV and static nonlinearities.
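
    Under the linearizing assumption described above, the maximum admissible current over a prediction horizon follows in closed form from a first-order Thevenin model; a minimal sketch, with illustrative parameter values and assuming the RC branch starts relaxed, is:

```python
# Closed-form maximum discharge current/power over a horizon dt for an
# OCV + R0 + (R1 || C1) equivalent circuit with constant parameters.
import math

def max_discharge(ocv, v_min, r0, r1, c1, dt):
    # Terminal voltage under a constant discharge current i, RC branch initially relaxed:
    #   v(dt) = ocv - i*r0 - i*r1*(1 - exp(-dt/(r1*c1)))
    # Setting v(dt) = v_min and solving for i gives the largest admissible current.
    r_eff = r0 + r1 * (1.0 - math.exp(-dt / (r1 * c1)))
    i_max = (ocv - v_min) / r_eff
    return i_max, v_min * i_max              # current [A], power at the voltage limit [W]

i_max, p_max = max_discharge(ocv=3.7, v_min=3.0, r0=0.010, r1=0.015, c1=2000.0, dt=10.0)
print(f"max current {i_max:.1f} A, max power {p_max:.1f} W")
```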

  15. [Retrieve phycocyanin concentrations based on semi-analytical model in the Dianchi Lake, China].

    PubMed

    Yin, Bin; Lü, Heng; Li, Yun-Mei; Wu, Chuan-Qing; Zhu, Li; Wang, Yan-Fei

    2011-02-01

    Phycocyanin (PC) in blue-green algae is usually used to detect the quantity of blue-green algae because of its characteristic absorption at 620 nm. A semi-analytical model for retrieving phycocyanin concentrations has been built, based on a nested semi-empirical band ratio algorithm, using data sets collected on September 19 and 20, 2009 from Dianchi Lake. The empirical relationship between the specific absorption coefficient at 620 nm [a(PC)* (620)] and the absorption coefficient at 620 nm [a(PC) (620)] reduces the impact of the variability of a(PC)* (620) in the model built by Simis. The new semi-analytical model performs well in retrieving phycocyanin concentrations, with a mean relative error (MRE) of 21.63% on the dataset collected on December 1, 2009 from Dianchi Lake. The error analysis shows that the main source of error is the seasonal change in the composition and concentrations of pigments in the blue-green algae.

  16. Evaluation of FTIR-based analytical methods for the analysis of simulated wastes

    SciTech Connect

    Rebagay, T.V.; Cash, R.J.; Dodd, D.A.; Lockrem, L.L.; Meacham, J.E.; Winkelman, W.D.

    1994-09-30

    Three FTIR-based analytical methods that have potential to characterize simulated waste tank materials have been evaluated. These include: (1) fiber optics, (2) modular transfer optic using light guides equipped with non-contact sampling peripherals, and (3) photoacoustic spectroscopy. Pertinent instrumentation and experimental procedures for each method are described. The results show that the near-infrared (NIR) region of the infrared spectrum is the region of choice for the measurement of moisture in waste simulants. Differentiation of the NIR spectrum, as a preprocessing step, will improve the analytical result. Preliminary data indicate that prominent combination bands of water and the first overtone band of the ferrocyanide stretching vibration may be utilized to measure water and ferrocyanide species simultaneously. Both near-infrared and mid-infrared spectra must be collected, however, to measure ferrocyanide species unambiguously and accurately. For ease of sample handling and the potential for field or waste tank deployment, the FTIR-Fiber Optic method is preferred over the other two methods. Modular transfer optic using light guides and photoacoustic spectroscopy may be used as backup systems and for the validation of the fiber optic data.
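
    The differentiation preprocessing mentioned above is commonly implemented as a Savitzky-Golay derivative of the NIR spectrum; the sketch below is a generic illustration with synthetic data, not the exact procedure used in the report.

```python
# Derivative preprocessing of an NIR spectrum via a Savitzky-Golay smoothing filter.
import numpy as np
from scipy.signal import savgol_filter

wavenumbers = np.linspace(4000.0, 10000.0, 1200)              # cm^-1, synthetic axis
spectrum = np.exp(-((wavenumbers - 6900.0) / 300.0) ** 2)     # synthetic NIR band

# First-derivative spectrum: 15-point window, 2nd-order polynomial fit.
first_deriv = savgol_filter(spectrum, window_length=15, polyorder=2, deriv=1)
print(first_deriv[:5])
```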

  17. Gel pad array chip for high throughput and multi-analyte microbead-based immunoassays.

    PubMed

    Zhu, Qingdi; Trau, Dieter

    2015-04-15

    We present here a gel pad array chip for high-throughput and multi-analyte microbead-based immunoassays. The chip is fabricated by photo-patterning of two polymeric gels, polyacrylamide gel and polyethylene glycol (PEG) gel, on a glass slide. The resulting chip consists of 40 polyacrylamide gel pad array units for the immobilization of microbeads and each gel pad array is surrounded with a PEG micropillar ring to confine the samples within the microarray. As a proof of concept, this chip was tested for quantitative immunoassays for two model cancer markers, human chorionic gonadotropin (hCG) and prostate specific antigen (PSA), in serum samples. Detection limits below the physiological threshold level for cancer diagnosis were achieved with good inter- and intra-chip reproducibility. Moreover, by using spatial encoded microbeads, simultaneous detection of both hCG and PSA on each gel pad array is achieved with single filter fluorescence imaging. This gel pad array chip is easy to use, easy to fabricate with low cost materials and minimal equipment and reusable. It could be a useful tool for common biolabs to customize their own microbead array for multi-analyte immunoassays.

  18. Scalable multi-variate analytics of seismic and satellite-based observational data.

    PubMed

    Yuan, Xiaoru; He, Xiao; Guo, Hanqi; Guo, Peihong; Kendall, Wesley; Huang, Jian; Zhang, Yongxian

    2010-01-01

    Over the past few years, large human populations around the world have been affected by an increase in significant seismic activities. For both conducting basic scientific research and for setting critical government policies, it is crucial to be able to explore and understand seismic and geographical information obtained through all scientific instruments. In this work, we present a visual analytics system that enables explorative visualization of seismic data together with satellite-based observational data, and introduce a suite of visual analytical tools. Seismic and satellite data are integrated temporally and spatially. Users can select temporal and spatial ranges to zoom in on specific seismic events, as well as to inspect changes both during and after the events. Tools for designing high dimensional transfer functions have been developed to enable efficient and intuitive comprehension of the multi-modal data. Spread-sheet style comparisons are used for data drill-down as well as presentation. Comparisons between distinct seismic events are also provided for characterizing event-wise differences. Our system has been designed for scalability in terms of data size, complexity (i.e. number of modalities), and varying form factors of display environments.

  19. Graphene-based terahertz photodetector by noise thermometry technique

    SciTech Connect

    Wang, Ming-Jye; Wang, Ji-Wun; Wang, Chun-Lun; Chiang, Yen-Yu; Chang, Hsian-Hong

    2014-01-20

    We report the characteristics of graphene-based terahertz (THz) photodetector based on noise thermometry technique by measuring its noise power at frequency from 4 to 6 GHz. Hot electron system in graphene microbridge is generated after THz photon pumping and creates extra noise power. The equivalent noise temperature and electron temperature increase rapidly in low THz pumping regime and saturate gradually in high THz power regime which is attributed to a faster energy relaxation process involved by stronger electron-phonon interaction. Based on this detector, a conversion efficiency around 0.15 from THz power to noise power in 4–6 GHz span has been achieved.

  20. Micromachined silicon-based analytical microinstruments for space science and planetary exploration

    SciTech Connect

    Grunthaner, F.J.; Stalder, R.E.; Boumsellek, S.; Van Zandt, T.R.; Kenny, T.W.; Hecht, M.H.; Ksendzov, A.; Homer, M.L.; Terhune, R.W.; Lane, A.L.

    1994-09-01

    For future planetary science missions, the authors are developing a series of microinstruments using the techniques of silicon-based micromachining. Conventional instruments such as chemical sensors, charged particle analyzers and mass spectrometers are reduced in size and effective volume to the dimension of cubic centimeters, while maintaining or enhancing performance. Using wafer/wafer bonding techniques, selective chemical etching, thin film growth, and high resolution lithography, complex three dimensional structures can be assembled. This paper discusses the design, implementation and performance of two new instruments: the Micromachined Bessel Box Auger Electron Spectrometer, and the Mars Soil Chemistry Experiment (MOx).

  1. Statistics and Machine Learning based Outlier Detection Techniques for Exoplanets

    NASA Astrophysics Data System (ADS)

    Goel, Amit; Montgomery, Michele

    2015-08-01

    Architectures of planetary systems are observable snapshots in time that can indicate formation and dynamic evolution of planets. The observable key parameters that we consider are planetary mass and orbital period. If planet masses are significantly less than their host star masses, then Keplerian Motion is defined as P^2 = a^3 where P is the orbital period in units of years and a is the semi-major axis in units of Astronomical Units (AU). Keplerian motion works on small scales such as the size of the Solar System but not on large scales such as the size of the Milky Way Galaxy. In this work, for confirmed exoplanets of known stellar mass, planetary mass, orbital period, and stellar age, we analyze Keplerian motion of systems based on stellar age to seek if Keplerian motion has an age dependency and to identify outliers. For detecting outliers, we apply several techniques based on statistical and machine learning methods such as probabilistic, linear, and proximity based models. In probabilistic and statistical models of outliers, the parameters of a closed-form probability distribution are learned in order to detect the outliers. Linear models use regression analysis based techniques for detecting outliers. Proximity based models use distance based algorithms such as k-nearest neighbour, clustering algorithms such as k-means, or density based algorithms such as kernel density estimation. In this work, we will use unsupervised learning algorithms with only the proximity based models. In addition, we explore the relative strengths and weaknesses of the various techniques by validating the outliers. The validation criterion for the outliers is that the ratio of planetary mass to stellar mass is less than 0.001. In this work, we present our statistical analysis of the outliers thus detected.
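
    A minimal sketch of a proximity-based outlier screen of this kind, followed by the stated validation rule (planet-to-star mass ratio below 0.001), might look like the following; the catalogue values, neighbour count, and score threshold are placeholders.

```python
# Distance-based outlier screen on (log period, log planet mass), then validation
# of flagged systems by the mass-ratio criterion quoted in the abstract.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical catalogue columns: orbital period [yr], planet and star masses [solar masses].
period   = np.array([1.00, 0.30, 11.9, 0.02, 5.20])
m_planet = np.array([3e-6, 9e-4, 1e-3, 2e-3, 5e-4])
m_star   = np.array([1.00, 1.10, 0.90, 0.80, 1.00])

X = np.column_stack([np.log10(period), np.log10(m_planet)])
dist, _ = NearestNeighbors(n_neighbors=3).fit(X).kneighbors(X)
score = dist[:, 1:].mean(axis=1)                  # mean distance to the nearest neighbours
outliers = score > np.percentile(score, 80)       # flag the most isolated systems

validated = outliers & (m_planet / m_star < 1e-3) # validation rule from the abstract
print(np.where(validated)[0])
```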

  2. MEMS-based power generation techniques for implantable biosensing applications.

    PubMed

    Lueke, Jonathan; Moussa, Walied A

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.

  3. MEMS-Based Power Generation Techniques for Implantable Biosensing Applications

    PubMed Central

    Lueke, Jonathan; Moussa, Walied A.

    2011-01-01

    Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient. PMID:22319362

  4. Novel techniques and the future of skull base reconstruction.

    PubMed

    Meier, Joshua C; Bleier, Benjamin S

    2013-01-01

    The field of endoscopic skull base surgery has evolved considerably in recent years fueled largely by advances in both imaging and instrumentation. While the indications for these approaches continue to be extended, the ability to reconstruct the resultant defects has emerged as a rate-limiting obstacle. Postoperative failures with current multilayer grafting techniques remain significant and may increase as the indications for endoscopic resections continue to expand. Laser tissue welding represents a novel method of wound repair in which laser energy is applied to a chromophore doped biologic solder at the wound edge to create a laser weld (fig. 1). These repairs are capable of withstanding forces far exceeding those exerted by intracranial pressure with negligible collateral thermal tissue injury. Recent clinical trials have demonstrated the safety and feasibility of endoscopic laser welding while exposing the limitations of first generation hyaluronic acid based solders. Novel supersaturated gel based solders are currently being tested in clinical trials and appear to possess significantly improved viscoelastic properties. While laser tissue welding remains an experimental technique, continued success with these novel solder formulations may catalyze the widespread adoption of this technique for skull base repair in the near future.

  5. Online state of health estimation on NMC cells based on predictive analytics

    NASA Astrophysics Data System (ADS)

    Berecibar, Maitane; Devriendt, Floris; Dubarry, Matthieu; Villarreal, Igor; Omar, Noshin; Verbeke, Wouter; Van Mierlo, Joeri

    2016-07-01

    Accurate on board state of health estimation is a key battery management system function to provide optimal management of the battery system under control. In this regard, this paper presents an extensive study and comparison of three commonly used supervised learning methods for state of health estimation in Graphite/Nickel Manganese Cobalt oxide cells. The three methods were based on the study of both incremental capacity and differential voltage curves. According to the ageing evolution of both curves, features were extracted and used as inputs for the estimation techniques. Ordinary Least Squares, Multilayer Perceptron and Support Vector Machine were used as the estimation techniques and accurate results were obtained while requiring a low computational effort. Moreover, this work allows a deep comparison of the different estimation techniques in terms of accuracy, online estimation and BMS applicability. In addition, estimation can be performed from partial charging and/or partial discharging, reducing the required maintenance time.
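
    The general pattern described above, extracting features from an incremental-capacity (dQ/dV) curve and regressing state of health with a supervised learner, can be sketched as follows; the synthetic data, feature choices, and use of ordinary least squares are illustrative and not taken from the paper.

```python
# Sketch: incremental-capacity (dQ/dV) peak features regressed onto state of health (SoH).
import numpy as np
from sklearn.linear_model import LinearRegression

def ic_features(voltage, capacity):
    dq_dv = np.gradient(capacity, voltage)      # incremental capacity (IC) curve
    i = int(np.argmax(dq_dv))
    return [dq_dv[i], voltage[i]]               # IC peak height and position

rng = np.random.default_rng(1)
v = np.linspace(3.0, 4.2, 200)
X, y = [], []
for soh in np.linspace(1.0, 0.8, 20):           # synthetic ageing series
    q = soh * 2.5 / (1.0 + np.exp(-(v - 3.7) / 0.05)) + 0.002 * rng.standard_normal(v.size)
    X.append(ic_features(v, q))
    y.append(soh)

model = LinearRegression().fit(X, y)            # the Ordinary Least Squares option

q_new = 0.9 * 2.5 / (1.0 + np.exp(-(v - 3.7) / 0.05))   # cell at roughly 90% SoH
print(model.predict([ic_features(v, q_new)]))            # expected value near 0.9
```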

  6. GraphReduce: Large-Scale Graph Analytics on Accelerator-Based HPC Systems

    SciTech Connect

    Sengupta, Dipanjan; Agarwal, Kapil; Song, Shuaiwen; Schwan, Karsten

    2015-09-30

    Recent work on real-world graph analytics has sought to leverage the massive amount of parallelism offered by GPU devices, but challenges remain due to the inherent irregularity of graph algorithms and limitations in GPU-resident memory for storing large graphs. We present GraphReduce, a highly efficient and scalable GPU-based framework that operates on graphs that exceed the device’s internal memory capacity. GraphReduce adopts a combination of both edge- and vertex-centric implementations of the Gather-Apply-Scatter programming model and operates on multiple asynchronous GPU streams to fully exploit the high degrees of parallelism in GPUs with efficient graph data movement between the host and the device.
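
    The Gather-Apply-Scatter programming model that GraphReduce builds on can be illustrated with a minimal single-machine example (PageRank); the out-of-core data movement and multi-stream GPU execution of the framework itself are not shown.

```python
# Gather-Apply-Scatter illustration on a toy directed graph (PageRank iterations).
import collections

edges = [(0, 1), (0, 2), (1, 2), (2, 0), (3, 2)]   # toy directed graph
n = 4
out_deg = collections.Counter(src for src, _ in edges)
rank = [1.0 / n] * n
d = 0.85                                            # damping factor

for _ in range(20):
    # Gather: accumulate contributions along in-edges.
    acc = [0.0] * n
    for src, dst in edges:
        acc[dst] += rank[src] / out_deg[src]
    # Apply: update every vertex value from its gathered sum.
    rank = [(1 - d) / n + d * acc[v] for v in range(n)]
    # Scatter: a full GAS engine would now push updated values along out-edges;
    # here the next gather simply re-reads `rank`.

print([round(r, 3) for r in rank])
```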

  7. Shear deformable deformation of carbon nanotubes based on a new analytical nonlocal Timoshenko beam model

    SciTech Connect

    Zhang, Jianming; Yang, Yang

    2015-03-10

    According to Hamilton’s principle, a new mathematical model and analytical solutions for the analytical nonlocal Timoshenko beam model (ANT) are established based on nonlocal elastic continuum theory when shear deformation and the nonlocal effect are considered. The new ANT equilibrium equations and boundary conditions are derived for bending analysis of carbon nanotubes (CNTs) with simply supported, clamped and cantilever boundary conditions. The ANT deflection solutions demonstrate that the CNT stiffness is enhanced by the presence of nonlocal stress effects. Furthermore, the new ANT model yields verifiable bending behaviors for a cantilever CNT with a point load at the free end, which depend on the strength of the nonlocal stress. Therefore, this new model gives a better prediction of the mechanical behavior of nanostructures.

  8. Electric conductive pattern element fabricated using commercial inkjet printer for paper-based analytical devices.

    PubMed

    Matsuda, Yu; Shibayama, Shobu; Uete, Keigo; Yamaguchi, Hiroki; Niimi, Tomohide

    2015-06-01

    Herein, we proposed the addition of an inkjet-printed conductive pattern to paper-based analytical devices (PADs) in order to expand their applications. An electric conductive pattern was easily, quickly, and inexpensively fabricated using a commercial inkjet printer. The addition of a printed electric element will enhance the applications of PADs without the loss of properties such as cost efficiency, disposability, and portability. In this study, we applied an inkjet-printed heater to a piece of paper and investigated its characteristics. The use of the heater as a valve, concentrator, and heat source for chemical reactions on PADs was investigated. Previously, these functions were difficult to realize with PADs. The inkjet-printed heater was used as a valve and concentrator through evaporation of the working fluid and solvent, and was also found to be useful for providing heat for chemical reactions. Thus, the combination of printed electric circuits and PADs has many potential applications.

  9. A WPS Based Architecture for Climate Data Analytic Services (CDAS) at NASA

    NASA Astrophysics Data System (ADS)

    Maxwell, T. P.; McInerney, M.; Duffy, D.; Carriere, L.; Potter, G. L.; Doutriaux, C.

    2015-12-01

    Faced with unprecedented growth in the Big Data domain of climate science, NASA has developed the Climate Data Analytic Services (CDAS) framework. This framework enables scientists to execute trusted and tested analysis operations in a high performance environment close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using trusted climate data analysis tools (ESMF, CDAT, NCO, etc.). The framework is structured as a set of interacting modules allowing maximal flexibility in deployment choices. The current set of module managers includes: Staging Manager: Runs the computation locally on the WPS server or remotely using tools such as celery or SLURM. Compute Engine Manager: Runs the computation serially or distributed over nodes using a parallelization framework such as celery or spark. Decomposition Manager: Manages strategies for distributing the data over nodes. Data Manager: Handles the import of domain data from long term storage and manages the in-memory and disk-based caching architectures. Kernel manager: A kernel is an encapsulated computational unit which executes a processor's compute task. Each kernel is implemented in python exploiting existing analysis packages (e.g. CDAT) and is compatible with all CDAS compute engines and decompositions. CDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be executed using either direct web service calls, a python script or application, or a javascript-based web application. Client packages in python or javascript contain everything needed to make CDAS requests. The CDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the
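
    Access through a WPS API typically starts with a standard OGC GetCapabilities request; the sketch below illustrates that generic pattern over HTTP with a placeholder endpoint, and does not assume the actual CDAS deployment address or its specific process names.

```python
# Generic OGC WPS GetCapabilities request (key-value-pair encoding) over HTTP.
import requests

# Placeholder endpoint; the real CDAS service URL and process identifiers are not assumed here.
endpoint = "https://example.nasa.gov/cdas/wps"
params = {"service": "WPS", "request": "GetCapabilities", "version": "1.0.0"}

response = requests.get(endpoint, params=params, timeout=30)
response.raise_for_status()
print(response.text[:500])   # XML capabilities document listing the available processes
```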

  10. Efficient techniques for wave-based sound propagation in interactive applications

    NASA Astrophysics Data System (ADS)

    Mehra, Ravish

    Sound propagation techniques model the effect of the environment on sound waves and predict their behavior from point of emission at the source to the final point of arrival at the listener. Sound is a pressure wave produced by mechanical vibration of a surface that propagates through a medium such as air or water, and the problem of sound propagation can be formulated mathematically as a second-order partial differential equation called the wave equation. Accurate techniques based on solving the wave equation, also called the wave-based techniques, are too expensive computationally and memory-wise. Therefore, these techniques face many challenges in terms of their applicability in interactive applications including sound propagation in large environments, time-varying source and listener directivity, and high simulation cost for mid-frequencies. In this dissertation, we propose a set of efficient wave-based sound propagation techniques that solve these three challenges and enable the use of wave-based sound propagation in interactive applications. Firstly, we propose a novel equivalent source technique for interactive wave-based sound propagation in large scenes spanning hundreds of meters. It is based on the equivalent source theory used for solving radiation and scattering problems in acoustics and electromagnetics. Instead of using a volumetric or surface-based approach, this technique takes an object-centric approach to sound propagation. The proposed equivalent source technique generates realistic acoustic effects and takes orders of magnitude less runtime memory compared to prior wave-based techniques. Secondly, we present an efficient framework for handling time-varying source and listener directivity for interactive wave-based sound propagation. The source directivity is represented as a linear combination of elementary spherical harmonic sources. This spherical harmonic-based representation of source directivity can support analytical, data

  11. An analytical method for 14C in environmental water based on a wet-oxidation process.

    PubMed

    Huang, Yan-Jun; Guo, Gui-Yin; Wu, Lian-Sheng; Zhang, Bing; Chen, Chao-Feng; Zhang, Hai-Ying; Qin, Hong-Juan; Shang-Guan, Zhi-Hong

    2015-04-01

    An analytical method for (14)C in environmental water based on a wet-oxidation process was developed. The method can be used to determine the activity concentrations of organic and inorganic (14)C in environmental water, or total (14)C, including in drinking water, surface water, rainwater and seawater. The wet-oxidation of the organic component allows the conversion of organic carbon to an inorganic form, and the extraction of the inorganic (14)C can be achieved by acidification and nitrogen purging. Environmental water with a volume of 20 L can be used for the wet-oxidation and extraction, and a detection limit of about 0.02 Bq/g(C) can be achieved for water with carbon content above 15 mg(C)/L, obviously lower than the natural level of (14)C in the environment. The collected carbon is sufficient for measurement with a low level liquid scintillation counter (LSC) for typical samples. Extraction or recovery experiments for inorganic carbon and organic carbon from typical materials, including analytical reagents of organic benzoquinone, sucrose, glutamic acid, nicotinic acid, humic acid, ethane diol, et cetera, were conducted with excellent results based on measurement on a total organic carbon analyzer and LSC. The recovery rate for inorganic carbon ranged between 98.7% and 99.0% with a mean of 98.9(± 0.1)%, and for organic carbon between 93.8% and 100.0% with a mean of 97.1(± 2.6)%. Verification and an uncertainty budget of the method are also presented for a representative environmental water. The method is appropriate for (14)C analysis in environmental water, and can be applied also to the analysis of liquid effluent from nuclear facilities. PMID:25590997

  13. Integrated separation of blood plasma from whole blood for microfluidic paper-based analytical devices.

    PubMed

    Yang, Xiaoxi; Forouzan, Omid; Brown, Theodore P; Shevkoplyas, Sergey S

    2012-01-21

    Many diagnostic tests in a conventional clinical laboratory are performed on blood plasma because changes in its composition often reflect the current status of pathological processes throughout the body. Recently, a significant research effort has been invested into the development of microfluidic paper-based analytical devices (μPADs) implementing these conventional laboratory tests for point-of-care diagnostics in resource-limited settings. This paper describes the use of red blood cell (RBC) agglutination for separating plasma from finger-prick volumes of whole blood directly in paper, and demonstrates the utility of this approach by integrating plasma separation and a colorimetric assay in a single μPAD. The μPAD was fabricated by printing its pattern onto chromatography paper with a solid ink (wax) printer and melting the ink to create hydrophobic barriers spanning through the entire thickness of the paper substrate. The μPAD was functionalized by spotting agglutinating antibodies onto the plasma separation zone in the center and the reagents of the colorimetric assay onto the test readout zones on the periphery of the device. To operate the μPAD, a drop of whole blood was placed directly onto the plasma separation zone of the device. RBCs in the whole blood sample agglutinated and remained in the central zone, while separated plasma wicked through the paper substrate into the test readout zones where analyte in plasma reacted with the reagents of the colorimetric assay to produce a visible color change. The color change was digitized with a portable scanner and converted to concentration values using a calibration curve. The purity and yield of separated plasma was sufficient for successful operation of the μPAD. This approach to plasma separation based on RBC agglutination will be particularly useful for designing fully integrated μPADs operating directly on small samples of whole blood.
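
    The final readout step, converting the digitized color change to a concentration via a calibration curve, can be sketched as follows; the intensities, concentrations, and the assumption of a linear calibration are illustrative only.

```python
# Colorimetric readout: fit a calibration curve and invert it for an unknown sample.
import numpy as np

# Calibration standards: analyte concentration vs. mean (inverted) grey intensity
# of the readout zone after scanning.
conc_std   = np.array([0.0, 1.0, 2.0, 4.0, 8.0])        # e.g. mmol/L
signal_std = np.array([5.0, 21.0, 38.0, 72.0, 140.0])   # arbitrary scanner units

slope, intercept = np.polyfit(conc_std, signal_std, 1)  # linear calibration fit

def to_concentration(signal):
    return (signal - intercept) / slope

print(round(to_concentration(55.0), 2))                 # unknown sample readout
```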

  14. Acid-base chemistry of white wine: analytical characterisation and chemical modelling.

    PubMed

    Prenesti, Enrico; Berto, Silvia; Toso, Simona; Daniele, Pier Giuseppe

    2012-01-01

    A chemical model of the acid-base properties is optimized for each white wine under study, together with the calculation of their ionic strength, taking into account the contributions of all significant ionic species (strong electrolytes and weak ones sensitive to the chemical equilibria). Coupling the HPLC-IEC and HPLC-RP methods, we are able to quantify up to 12 carboxylic acids, the most relevant substances responsible for the acid-base equilibria of wine. The analytical concentration of carboxylic acids and of other acid-base active substances was used as input, with the total acidity, for the chemical modelling step of the study based on the simultaneous treatment of overlapping protonation equilibria. New protonation constants were refined (L-lactic and succinic acids) with respect to our previous investigation on red wines. Attention was paid to the mixed solvent (ethanol-water mixture), ionic strength, and temperature to ensure a thermodynamic level of rigor in the study. Validation of the optimized chemical model is achieved by way of conductometric measurements and using a synthetic "wine" especially adapted for testing.
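
    The modelling step, computing pH from overlapping protonation equilibria given the measured acid concentrations, can be sketched in a highly simplified form; the sketch below treats the acids as monoprotic, ignores the ionic strength and ethanol-water solvent corrections handled in the paper, and uses illustrative concentrations and constants.

```python
# Simplified pH calculation for a mixture of weak monoprotic acids via the charge balance.
import math
from scipy.optimize import brentq

Kw = 1.0e-14
# (total concentration mol/L, first dissociation constant), illustrative values only
acids = [(0.030, 10**-3.04),   # tartaric acid, first dissociation only
         (0.020, 10**-3.40),   # malic acid, first dissociation only
         (0.005, 10**-3.86)]   # lactic acid

def charge_balance(h):
    # [H+] = sum of dissociated acid anions + [OH-]
    anions = sum(c * ka / (h + ka) for c, ka in acids)
    return h - Kw / h - anions

h = brentq(charge_balance, 1e-12, 1.0)
print("pH =", round(-math.log10(h), 2))
```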

  15. Acid-Base Chemistry of White Wine: Analytical Characterisation and Chemical Modelling

    PubMed Central

    Prenesti, Enrico; Berto, Silvia; Toso, Simona; Daniele, Pier Giuseppe

    2012-01-01

    A chemical model of the acid-base properties is optimized for each white wine under study, together with the calculation of their ionic strength, taking into account the contributions of all significant ionic species (strong electrolytes and weak ones sensitive to the chemical equilibria). Coupling the HPLC-IEC and HPLC-RP methods, we are able to quantify up to 12 carboxylic acids, the most relevant substances responsible for the acid-base equilibria of wine. The analytical concentration of carboxylic acids and of other acid-base active substances was used as input, with the total acidity, for the chemical modelling step of the study based on the simultaneous treatment of overlapping protonation equilibria. New protonation constants were refined (L-lactic and succinic acids) with respect to our previous investigation on red wines. Attention was paid to the mixed solvent (ethanol-water mixture), ionic strength, and temperature to ensure a thermodynamic level of rigor in the study. Validation of the optimized chemical model is achieved by way of conductometric measurements and using a synthetic “wine” especially adapted for testing. PMID:22566762

  16. A smog chamber comparison of a microfluidic derivatization measurement of gas-phase glyoxal and methylglyoxal with other analytical techniques

    NASA Astrophysics Data System (ADS)

    Pang, X.; Lewis, A. C.; Richard, A.; Baeza-Romero, M. T.; Adams, T. J.; Ball, S. M.; Daniels, M. J. S.; Goodall, I. C. A.; Monks, P. S.; Peppe, S.; Ródenas García, M.; Sánchez, P.; Muñoz, A.

    2013-06-01

    A microfluidic lab-on-a-chip derivatization technique has been developed to measure part per billion volume (ppbV) mixing ratios of gaseous glyoxal (GLY) and methylglyoxal (MGLY), and the method was compared with other techniques in a smog chamber experiment. The method uses o-(2,3,4,5,6-pentafluorobenzyl) hydroxylamine (PFBHA) as a derivatization reagent and a microfabricated planar glass micro-reactor comprising an inlet, gas and fluid splitting and combining channels, mixing junctions, and a heated capillary reaction microchannel. The enhanced phase contact area-to-volume ratio and the high heat transfer rate in the micro-reactor result in a fast and highly efficient derivatization reaction, generating an effluent stream ready for direct introduction to a gas chromatograph-mass spectrometer (GC-MS). A linear response for GLY was observed over a calibration range 0.7 to 400 ppbV, and for MGLY of 1.2 to 300 ppbV, when derivatized under optimal reaction conditions. The method detection limits (MDLs) were 80 pptV and 200 pptV for GLY and MGLY respectively, calculated as 3 times the standard deviation of the S/N of the blank sample chromatograms. These MDLs are below or close to typical concentrations in clean ambient air. The feasibility of the technique was assessed by applying the methodology under controlled conditions to quantify the α-dicarbonyls formed during the photo-oxidation of isoprene in a large scale outdoor atmospheric simulation chamber (EUPHORE). Good general agreement was seen between microfluidic measurements and Fourier Transform Infra Red (FTIR), Broad Band Cavity Enhanced Absorption Spectroscopy (BBCEAS) and a detailed photochemical chamber box modelling calculation for both GLY and MGLY. Less good agreement was found with Proton-Transfer Reaction Time-of-Flight Mass Spectrometry (PTR-ToF-MS) and Solid Phase Microextraction (SPME) derivatization methods for MGLY measurement.
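
    The detection-limit estimate quoted above can be sketched as follows; this uses the common variant in which three times the standard deviation of replicate blank responses is converted to a mixing ratio through the calibration slope, and all numbers are invented for illustration.

```python
# Method detection limit (MDL) estimate from replicate blank responses and a calibration slope.
import numpy as np

blank_peak_areas = np.array([120.0, 95.0, 110.0, 102.0, 131.0, 99.0])  # replicate blanks
calibration_slope = 5.0e3          # peak area per ppbV, from the linear calibration

mdl_ppbv = 3.0 * np.std(blank_peak_areas, ddof=1) / calibration_slope
print(f"MDL is roughly {mdl_ppbv * 1000:.0f} pptV")
```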

  17. Gabor-based fusion technique for Optical Coherence Microscopy.

    PubMed

    Rolland, Jannick P; Meemon, Panomsak; Murali, Supraja; Thompson, Kevin P; Lee, Kye-sung

    2010-02-15

    We recently reported on an Optical Coherence Microscopy technique whose innovation builds on a recently reported liquid-lens-based dynamic focusing optical probe with a 2 μm invariant lateral resolution by design throughout a 2 mm cubic full field of view [Murali et al., Optics Letters 34, 145-147, 2009]. In this paper we report on the image acquisition enabled by this optical probe when combined with an automatic data fusion method, developed and described here, to produce an in-focus high-resolution image throughout the imaging depth of the sample. An African frog tadpole (Xenopus laevis) was imaged with the novel probe and the Gabor-based fusion technique, demonstrating subcellular resolution across a 0.5 mm (lateral) x 0.5 mm (axial) region without the need, for the first time, for x-y translation stages, depth scanning, high-cost adaptive optics, or manual intervention. In vivo images of human skin are also presented.

  18. ATLAST ULE mirror segment performance analytical predictions based on thermally induced distortions

    NASA Astrophysics Data System (ADS)

    Eisenhower, Michael J.; Cohen, Lester M.; Feinberg, Lee D.; Matthews, Gary W.; Nissen, Joel A.; Park, Sang C.; Peabody, Hume L.

    2015-09-01

    The Advanced Technology Large-Aperture Space Telescope (ATLAST) is a concept for a 9.2 m aperture space-borne observatory operating across the UV/Optical/NIR spectra. The primary mirror for ATLAST is a segmented architecture with pico-meter class wavefront stability. Due to its extraordinarily low coefficient of thermal expansion, a leading candidate for the primary mirror substrate is Corning's ULE® titania-silicate glass. The ATLAST ULE® mirror substrates will be maintained at `room temperature' during on orbit flight operations minimizing the need for compensation of mirror deformation between the manufacturing temperature and the operational temperatures. This approach requires active thermal management to maintain operational temperature while on orbit. Furthermore, the active thermal control must be sufficiently stable to prevent time-varying thermally induced distortions in the mirror substrates. This paper describes a conceptual thermal management system for the ATLAST 9.2 m segmented mirror architecture that maintains the wavefront stability to less than 10 pico-meters/10 minutes RMS. Thermal and finite element models, analytical techniques, accuracies involved in solving the mirror figure errors, and early findings from the thermal and thermal-distortion analyses are presented.

  19. Attenuated Total Reflectance Fourier Transform Infrared Spectroscopy: An analytical technique to understand therapeutic responses at the molecular level

    PubMed Central

    Kalmodia, Sushma; Parameswaran, Sowmya; Yang, Wenrong; Barrow, Colin J.; Krishnakumar, Subramanian

    2015-01-01

    Rapid monitoring of the response to treatment in cancer patients is essential to predict the outcome of the therapeutic regimen early in the course of treatment. The conventional methods are laborious, time-consuming, subjective and lack the ability to study different biomolecules and their interactions simultaneously. Since the mechanisms of cancer and its response to therapy depend on molecular interactions and not on single biomolecules, an assay capable of studying molecular interactions as a whole is preferred. Fourier Transform Infrared (FTIR) spectroscopy has become a popular technique in the field of cancer therapy with an ability to elucidate molecular interactions. The aim of this study was to explore the utility of the FTIR technique along with multivariate analysis to understand whether the method has the resolution to identify the differences in the mechanism of therapeutic response. Towards achieving this aim, we utilized the mouse xenograft model of retinoblastoma and nanoparticle-mediated targeted therapy. The results indicate that the mechanism underlying the response differed between the treated and untreated groups, which can be elucidated by the unique spectral signatures generated by each group. The study establishes the efficiency of the non-invasive, label-free and rapid FTIR method in assessing the interactions of nanoparticles with cellular macromolecules towards monitoring the response to cancer therapeutics. PMID:26568521

  20. An Image Morphing Technique Based on Optimal Mass Preserving Mapping

    PubMed Central

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2013-01-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  1. An image morphing technique based on optimal mass preserving mapping.

    PubMed

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2007-06-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L(2) mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  2. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use the available online geocoding services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services search for an address only by address matching based on descriptive data. In addition, there are limitations in how search results are displayed. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions, and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides users with several capabilities, such as searching multi-part addresses, searching for places based on their location, non-point representation of results, and display of search results according to their priority.
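
    The fuzzy part of such a workflow can be sketched compactly: each place of interest yields a distance map, distances are mapped to [0, 1] "nearness" grades by a membership function, and the grades are combined with a fuzzy overlay (here a simple minimum, i.e. fuzzy AND). The membership function, grid and place coordinates below are illustrative assumptions, not the authors' implementation.

        import numpy as np

        def nearness(dist, d_half=500.0):
            """Fuzzy membership: 1 at the place itself, 0.5 at d_half metres away."""
            return 1.0 / (1.0 + (dist / d_half) ** 2)

        # Toy 100 x 100 grid (50 m cells) and two reference places (grid coordinates)
        ny, nx, cell = 100, 100, 50.0
        yy, xx = np.mgrid[0:ny, 0:nx]
        places = [(20, 30), (70, 80)]

        maps = [nearness(cell * np.hypot(yy - py, xx - px)) for py, px in places]
        combined = np.minimum.reduce(maps)          # fuzzy AND overlay
        best = np.unravel_index(np.argmax(combined), combined.shape)
        print("best cell:", best, "grade:", combined[best])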

  3. Investigation of preparation techniques for δ2H analysis of keratin materials and a proposed analytical protocol

    USGS Publications Warehouse

    Qi, H.; Coplen, T.B.

    2011-01-01

    Accurate hydrogen isotopic measurements of keratin materials have been a challenge due to exchangeable hydrogen in the sample matrix and the paucity of appropriate isotopic reference materials for calibration. We found that the most reproducible δ2HVSMOW-SLAP and mole fraction of exchangeable hydrogen, x(H)ex, of keratin materials were measured with equilibration at ambient temperature using two desiccators and two different equilibration waters with two sets of the keratin materials for 6 days. Following equilibration, drying the keratin materials in a vacuum oven for 4 days at 60 °C was most critical. The δ2H analysis protocol also includes interspersing isotopic reference waters in silver tubes among samples in the carousel of a thermal conversion elemental analyzer (TC/EA) reduction unit. Using this analytical protocol, δ2HVSMOW-SLAP values of the non-exchangeable fractions of USGS42 and USGS43 human-hair isotopic reference materials were determined to be –78.5 ± 2.3 ‰ and –50.3 ± 2.8 ‰, respectively. The measured x(H)ex values of keratin materials analyzed with steam equilibration and N2 drying were substantially higher than those previously published, and dry N2 purging was unable to remove absorbed moisture completely, even with overnight purging. The δ2H values of keratin materials measured with steam equilibration were about 10 ‰ lower than values determined with equilibration in desiccators at ambient temperatures when on-line evacuation was used to dry samples. With steam equilibrations the x(H)ex of commercial keratin powder was as high as 28 %. Using human-hair isotopic reference materials to calibrate other keratin materials, such as hoof or horn, can introduce bias in δ2H measurements because the amount of absorbed water and the x(H)ex values may differ from those of unknown samples. Correct δ2HVSMOW-SLAP values of the non-exchangeable fractions of unknown human-hair samples can be determined with atmospheric moisture
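
    Once the two human-hair reference materials have been assigned δ2H values for their non-exchangeable hydrogen, unknown hair samples analysed in the same sequence can be normalised to the VSMOW-SLAP scale by a two-point linear calibration. A minimal sketch using the USGS42 and USGS43 values quoted above; the raw instrument readings are made up for illustration.

        import numpy as np

        # Accepted non-exchangeable delta-2H of the reference hairs (permil, VSMOW-SLAP)
        true_usgs42, true_usgs43 = -78.5, -50.3
        # Hypothetical raw values measured in the same TC/EA sequence
        raw_usgs42, raw_usgs43 = -95.0, -66.2
        raw_unknown = -81.4

        slope = (true_usgs43 - true_usgs42) / (raw_usgs43 - raw_usgs42)
        intercept = true_usgs42 - slope * raw_usgs42
        print("normalised delta-2H:", round(slope * raw_unknown + intercept, 1), "permil")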

  4. Calcium isotope analytical technique for mafic rocks and its applications on constraining the source of Cenozoic ultra-potassic rocks in the Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Zhang, Z.; Xu, J.

    2013-12-01

    A Ca isotope analytical technique for mafic rocks has recently been developed and set up at our lab. A milligram-level amount of a mafic rock sample was digested, and a sub-portion of the solution containing about 100 μg Ca was spiked with a 42Ca-43Ca double spike and passed through the column chemistry. Generally the Ca recovery is almost 100% and the procedure blank is about 50-150 ng. Finally, about 5-10 μg of the collected Ca cut was measured on our Triton TIMS. The precision of the data was around 0.1 per mil, and the data we collected for standards are consistent with those reported by previous studies. There are two groups of Cenozoic ultra-potassic rocks that are widespread in the Tibetan Plateau: a northern group in the Songpan-Ganzi and Qiangtang Terranes and a southern group in the Lhasa Terrane. Previous petrological evidence, such as a relative enrichment in large ion lithophile elements (LILE), negative Ta, Nb and Ti anomalies, and high LREE/HREE ratios, supports that both groups are derived from sub-continental lithospheric mantle (SCLM). However, differences between these two groups of rocks do exist: the southern group has higher K2O, Rb, Zr and Th contents and a higher Rb/Ba, coupled with lower Al2O3, CaO, Na2O and Sr; the southern 87Sr/86Sr ratios are higher while the 143Nd/144Nd ratios are lower, etc. These differences suggest that the rocks could be derived from different mantle sources or produced by different geological processes. Ca isotopes are chosen in this study to better understand the source of the ultra-potassic rocks because Ca isotopes have been a good tracer of different geological reservoirs and the isotopic compositions of Ca may reflect different genetic processes. We propose that the ultra-potassic rocks in Tibet should have significant 40Ca enrichments due to the decay of 40K to 40Ca; therefore the variation of Ca isotopic compositions among these ultra-potassic rocks could be obvious. We believe that based on our calcium data together with earlier Sr, Nd, Pb data

  5. Evaluation of site-specific lateral inclusion zone for vapor intrusion based on an analytical approach.

    PubMed

    Yao, Yijun; Wu, Yun; Tang, Mengling; Wang, Yue; Wang, Jianjin; Suuberg, Eric M; Jiang, Lin; Liu, Jing

    2015-11-15

    In 2002, U.S. EPA proposed a general buffer zone of approximately 100 feet (30 m) laterally to determine which buildings to include in vapor intrusion (VI) investigations. However, this screening distance can be threatened by factors such as extensive surface pavements. Under such circumstances, EPA recommended investigating soil vapor migration distance on a site-specific basis. To serve this purpose, we present an analytical model (AAMLPH) as an alternative to estimate lateral VI screening distances at chlorinated compound-contaminated sites. Based on a previously introduced model (AAML), AAMLPH is developed by considering the effects of impervious surface cover and soil geology heterogeneities, providing predictions consistent with three-dimensional (3-D) numerically simulated results. By employing risk-based and contribution-based screening levels for subslab concentrations (50 and 500 μg/m3, respectively) and source-to-subslab attenuation factors (0.001 and 0.01, respectively), AAMLPH suggests that buildings greater than 30 m from a plume boundary can still be affected by VI in the presence of any two of the three factors, which are high source vapor concentration, shallow source and significant surface cover. This finding justifies the concern that EPA has expressed about the application of the 30 m lateral separation distance in the presence of physical barriers (e.g., asphalt covers or ice) at the ground surface. PMID:26057584
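
    The screening comparison itself is a small calculation: a predicted subslab vapour concentration (source concentration times a source-to-subslab attenuation factor) checked against a screening level. A hedged sketch using the two attenuation factors and screening levels quoted above; the source concentration is an arbitrary example, not a value from the paper.

        def subslab_exceeds(source_ug_m3, attenuation, screening_ug_m3):
            """True if the predicted subslab concentration exceeds the screening level."""
            return source_ug_m3 * attenuation > screening_ug_m3

        source = 2.0e5                       # example soil-gas source concentration, ug/m3
        for alpha, level in [(0.001, 50.0), (0.01, 500.0)]:
            print(alpha, level, subslab_exceeds(source, alpha, level))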

  6. Analytical modelling of monolayer graphene-based ion-sensitive FET to pH changes

    NASA Astrophysics Data System (ADS)

    Kiani, Mohammad Javad; Ahmadi, Mohammad Taghi; Karimi Feiz Abadi, Hediyeh; Rahmani, Meisam; Hashim, Amin; Che harun, Fauzan Khairi

    2013-04-01

    Graphene has attracted great interest because of unique properties such as high sensitivity, high mobility, and biocompatibility. It is also known as a superior candidate for pH sensing. Graphene-based ion-sensitive field-effect transistor (ISFET) is currently getting much attention as a novel material with organic nature and ionic liquid gate that is intrinsically sensitive to pH changes. pH is an important factor in enzyme stabilities which can affect the enzymatic reaction and broaden the number of enzyme applications. More accurate and consistent results of enzymes must be optimized to realize their full potential as catalysts accordingly. In this paper, a monolayer graphene-based ISFET pH sensor is studied by simulating its electrical measurement of buffer solutions for different pH values. Electrical detection model of each pH value is suggested by conductance modelling of monolayer graphene. Hydrogen ion (H+) concentration as a function of carrier concentration is proposed, and the control parameter ( Ƥ) is defined based on the electro-active ions absorbed by the surface of the graphene with different pH values. Finally, the proposed new analytical model is compared with experimental data and shows good overall agreement.

  7. Analytical modelling of monolayer graphene-based ion-sensitive FET to pH changes

    PubMed Central

    2013-01-01

    Graphene has attracted great interest because of unique properties such as high sensitivity, high mobility, and biocompatibility. It is also known as a superior candidate for pH sensing. Graphene-based ion-sensitive field-effect transistor (ISFET) is currently getting much attention as a novel material with organic nature and ionic liquid gate that is intrinsically sensitive to pH changes. pH is an important factor in enzyme stabilities which can affect the enzymatic reaction and broaden the number of enzyme applications. More accurate and consistent results of enzymes must be optimized to realize their full potential as catalysts accordingly. In this paper, a monolayer graphene-based ISFET pH sensor is studied by simulating its electrical measurement of buffer solutions for different pH values. Electrical detection model of each pH value is suggested by conductance modelling of monolayer graphene. Hydrogen ion (H+) concentration as a function of carrier concentration is proposed, and the control parameter (Ƥ) is defined based on the electro-active ions absorbed by the surface of the graphene with different pH values. Finally, the proposed new analytical model is compared with experimental data and shows good overall agreement. PMID:23590751

  8. Analytical sedimentology

    SciTech Connect

    Lewis, D.W. (Dept. of Geology); McConchie, D.M. (Centre for Coastal Management)

    1994-01-01

    Both a self-instruction manual and a "cookbook" guide to field and laboratory analytical procedures, this book provides an essential reference for non-specialists. With a minimum of mathematics and virtually no theory, it introduces practitioners to easy, inexpensive options for sample collection and preparation, data acquisition, analytic protocols, result interpretation and verification techniques. This step-by-step guide considers the advantages and limitations of different procedures, discusses safety and troubleshooting, and explains support skills like mapping, photography and report writing. It also offers managers, off-site engineers and others using sediment data a quick course in commissioning studies and making the most of the reports. This manual will answer the growing needs of practitioners in the field, either alone or accompanied by Practical Sedimentology, which surveys the science of sedimentology and provides a basic overview of the principles behind the applications.

  9. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

    Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However these approaches, which require a ...

  10. A Novel, Physics-Based Data Analytics Framework for Reducing Systematic Model Errors

    NASA Astrophysics Data System (ADS)

    Wu, W.; Liu, Y.; Vandenberghe, F. C.; Knievel, J. C.; Hacker, J.

    2015-12-01

    Most climate and weather models exhibit systematic biases, such as under-predicted diurnal temperatures in the WRF (Weather Research and Forecasting) model. General approaches to alleviate the systematic biases include improving model physics and numerics, improving data assimilation, and bias correction through post-processing. In this study, we developed a novel, physics-based data analytics framework in post-processing by taking advantage of ever-growing high-resolution (spatial and temporal) observational and modeling data. In the framework, a spatiotemporal PCA (Principal Component Analysis) is first applied to the observational data to filter out noise and information on scales that a model may not be able to resolve. The filtered observations are then used to establish regression relationships with archived model forecasts in the same spatiotemporal domain. The regressions along with the model forecasts predict the projected observations in the forecasting period. The pre-regression PCA procedure strengthens regressions and enhances predictive skills. We then combine the projected observations with the past observations to apply PCA iteratively to derive the final forecasts. This post-regression PCA reconstructs variances and scales of information that are lost in the regression. The framework was examined and validated with 24 days of 5-minute observational data and archives from the WRF model at 27 stations near Dugway Proving Ground, Utah. The validation shows significant bias reduction in the diurnal cycle of predicted surface air temperature compared to the direct output from the WRF model. Additionally, unlike other post-processing bias correction schemes, the data analytics framework does not require long-term historical data and model archives. A week or two of the data is enough to take into account changes in weather regimes. The program, written in Python, is also computationally efficient.
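
    A minimal sketch of the pre-regression PCA filter plus regression step described above, using scikit-learn. The synthetic station arrays, the number of retained components and the use of ordinary least squares are illustrative assumptions, and the iterative post-regression PCA step is omitted; the paper's production program is a separate Python code.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        # Illustrative arrays: rows are times, columns are the 27 stations
        obs_train = np.random.randn(2000, 27)                              # past observations
        fcst_train = obs_train + 1.5 + 0.3 * np.random.randn(2000, 27)     # biased archived forecasts
        fcst_new = np.random.randn(288, 27) + 1.5                          # forecasts to correct

        # 1. Spatiotemporal PCA on observations: keep large-scale, resolvable modes
        pca = PCA(n_components=5).fit(obs_train)
        obs_filtered = pca.inverse_transform(pca.transform(obs_train))

        # 2. Regress filtered observations on archived forecasts, then project new forecasts
        reg = LinearRegression().fit(fcst_train, obs_filtered)
        projected_obs = reg.predict(fcst_new)      # bias-corrected prediction of observations
        print(projected_obs.shape)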

  11. Analytical modeling of a single channel nonlinear fiber optic system based on QPSK.

    PubMed

    Kumar, Shiva; Shahi, Sina Naderi; Yang, Dong

    2012-12-01

    A first-order perturbation theory is used to develop analytical expressions for the power spectral density (PSD) of the nonlinear distortions due to intra-channel four-wave mixing (IFWM). For non-Gaussian pulses, the PSD cannot be calculated analytically. However, using the stationary phase approximation, we found that the convolutions become simple multiplications, and a simple analytical expression for the PSD of the nonlinear distortion is obtained. The PSD of the nonlinear distortion is combined with the amplified spontaneous emission (ASE) PSD to obtain the total variance and bit error ratio (BER). The analytically estimated BER is found to be in good agreement with numerical simulations.
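
    The last step of such an analysis, combining the nonlinear-distortion variance with the ASE variance and mapping the total to a BER, can be sketched in a few lines. The Gray-coded QPSK BER expression and the example variances below are generic textbook assumptions, not the paper's exact formulas.

        import numpy as np
        from scipy.special import erfc

        def qpsk_ber(signal_power, var_ase, var_nl):
            """Gray-coded QPSK bit error ratio with Gaussian-modelled ASE and
            IFWM distortion; variances are totals per polarisation."""
            sigma2 = 0.5 * (var_ase + var_nl)     # per-quadrature noise variance
            a = np.sqrt(signal_power / 2.0)       # per-quadrature signal amplitude
            return 0.5 * erfc(a / np.sqrt(2.0 * sigma2))

        # Illustrative numbers (linear units, arbitrary scale)
        print(qpsk_ber(signal_power=1.0, var_ase=0.02, var_nl=0.01))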

  12. Noninvasive in vivo glucose sensing using an iris based technique

    NASA Astrophysics Data System (ADS)

    Webb, Anthony J.; Cameron, Brent D.

    2011-03-01

    Physiological glucose monitoring is an important aspect of the treatment of individuals afflicted with diabetes mellitus. Although invasive techniques for glucose monitoring are widely available, it would be very beneficial to make such measurements in a noninvasive manner. In this study, a New Zealand White (NZW) rabbit animal model was utilized to evaluate a newly developed iris-based imaging technique for the in vivo measurement of physiological glucose concentration. The animals were anesthetized with isoflurane, and an insulin/dextrose protocol was used to control blood glucose concentration. To further restrict eye movement, a custom-developed ocular fixation device was used. During the experimental time frame, near-infrared illuminated iris images were acquired along with corresponding discrete blood glucose measurements taken with a handheld glucometer. Calibration was performed using an image-based Partial Least Squares (PLS) technique. Independent validation was also performed to assess model performance, along with Clarke Error Grid Analysis (CEGA). Initial validation results were promising and show that a high percentage of the predicted glucose concentrations fall within 20% of the reference values.
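
    The calibration step, PLS regression of glucose concentration on features extracted from the iris images, can be reproduced in outline with scikit-learn. The feature matrix below is a stand-in (for example, flattened intensity profiles from the imaged iris region), and the number of latent variables is an assumption; only the 20% agreement criterion mirrors the description above.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        # Illustrative data: rows are acquisitions, columns are image-derived features
        X_train = np.random.rand(60, 500)            # e.g. flattened iris intensity profiles
        y_train = np.random.uniform(60, 350, 60)     # reference glucose, mg/dL (glucometer)
        X_test = np.random.rand(20, 500)
        y_test = np.random.uniform(60, 350, 20)

        pls = PLSRegression(n_components=8).fit(X_train, y_train)
        y_pred = pls.predict(X_test).ravel()

        within_20pct = np.mean(np.abs(y_pred - y_test) / y_test <= 0.20)
        print(f"{100 * within_20pct:.0f}% of predictions within 20% of reference")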

  13. a Frequency Domain Based NUMERIC-ANALYTICAL Method for Non-Linear Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Narayanan, S.; Sekar, P.

    1998-04-01

    In this paper a multiharmonic balancing technique is used to develop algorithms to determine periodic orbits of non-linear dynamical systems with external, parametric and self-excitations. Essentially, in this method the non-linear differential equations are transformed into a set of non-linear algebraic equations in terms of the Fourier coefficients of the periodic solutions, which are solved using the Newton-Raphson technique. The method is developed such that both fast Fourier transform and discrete Fourier transform algorithms can be used. It is capable of treating all types of non-linearities and higher-dimensional systems. The stability of periodic orbits is investigated by obtaining the monodromy matrix. A path-following algorithm based on the predictor-corrector method is also presented to enable bifurcation analysis. The prediction is done with a cubic extrapolation technique with arc-length incrementation, while the correction is done using least-squares minimisation. The underdetermined system of equations is solved by singular value decomposition. The suitability of the method is demonstrated by obtaining the bifurcational behaviour of rolling contact vibrations modelled by the Hertz contact law.
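
    As a compact illustration of the multiharmonic balance idea, the sketch below finds a periodic orbit of a harmonically forced Duffing oscillator: the residual of the differential equation is evaluated in the time domain, transformed with the FFT, and its leading Fourier coefficients are driven to zero with a Newton-type solver (scipy's fsolve). The oscillator parameters and number of harmonics are illustrative; the monodromy-matrix stability analysis and the predictor-corrector continuation of the paper are omitted.

        import numpy as np
        from scipy.optimize import fsolve

        # Forced Duffing oscillator: x'' + c*x' + k*x + beta*x**3 = F*cos(w*t)
        c, k, beta, F, w, N = 0.2, 1.0, 1.0, 0.5, 1.2, 5     # illustrative parameters
        M = 256                                              # time samples per period
        t = np.linspace(0.0, 2 * np.pi / w, M, endpoint=False)
        n = np.arange(1, N + 1)[:, None]                     # harmonic indices

        def synth(coeffs, deriv=0):
            """Evaluate the truncated Fourier series (or a time derivative) on the grid."""
            a0, a, b = coeffs[0], coeffs[1:N + 1, None], coeffs[N + 1:, None]
            wn = n * w
            if deriv == 0:
                return a0 + np.sum(a * np.cos(wn * t) + b * np.sin(wn * t), axis=0)
            if deriv == 1:
                return np.sum(-a * wn * np.sin(wn * t) + b * wn * np.cos(wn * t), axis=0)
            return np.sum(-a * wn ** 2 * np.cos(wn * t) - b * wn ** 2 * np.sin(wn * t), axis=0)

        def residual(coeffs):
            x, xd, xdd = synth(coeffs), synth(coeffs, 1), synth(coeffs, 2)
            r = xdd + c * xd + k * x + beta * x ** 3 - F * np.cos(w * t)
            R = np.fft.rfft(r) / M                           # harmonic balance via FFT
            return np.concatenate(([R[0].real], R[1:N + 1].real, R[1:N + 1].imag))

        coeffs0 = np.zeros(2 * N + 1)
        coeffs0[1] = F / (k - w ** 2)                        # linear-response initial guess
        sol = fsolve(residual, coeffs0)
        print("fundamental amplitude:", np.hypot(sol[1], sol[N + 1]))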

  14. Neutron flux characterization of californium-252 Neutron Research Facility at the University of Texas - Pan American by nuclear analytical technique

    NASA Astrophysics Data System (ADS)

    Wahid, Kareem; Sanchez, Patrick; Hannan, Mohammad

    2014-03-01

    In the field of nuclear science, neutron flux is an intrinsic property of nuclear reaction facilities that is the basis for experimental irradiation calculations and analysis. In the Rio Grande Valley (Texas), the UTPA Neutron Research Facility (NRF) is currently the only neutron facility available for experimental research purposes. The facility comprises a 20-microgram californium-252 neutron source surrounded by a shielding cascade containing different irradiation cavities. Thermal and fast neutron flux values for the UTPA NRF have yet to be fully investigated and may be of particular interest to biomedical studies in low neutron dose applications. Though a variety of techniques exist for the characterization of neutron flux, neutron activation analysis (NAA) of metal and nonmetal foils is a commonly utilized experimental method because of its detection sensitivity and availability. The aim of our current investigation is to employ foil activation in the determination of neutron flux values for the UTPA NRF for further research purposes. Neutron spectrum unfolding of the acquired experimental data via specialized software and subsequent comparison for consistency with computational models lends confidence to the results.
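
    The foil-activation step reduces to the standard activation equation: the measured end-of-irradiation activity divided by the number of target atoms, the capture cross section and the saturation factor gives the flux. A hedged sketch for a thermal-flux measurement with an illustrative gold foil; the numbers are textbook-style placeholders, not results from the UTPA facility, and decay during cooling and counting is ignored.

        import numpy as np

        def neutron_flux(activity_bq, mass_g, molar_mass, abundance,
                         sigma_barn, half_life_s, t_irr_s):
            """Thermal flux (n cm^-2 s^-1) from foil activation:
            phi = A / (N * sigma * (1 - exp(-lambda * t_irr)))."""
            N_A = 6.022e23
            n_atoms = mass_g / molar_mass * N_A * abundance
            sigma = sigma_barn * 1e-24                     # barn -> cm^2
            lam = np.log(2.0) / half_life_s
            return activity_bq / (n_atoms * sigma * (1.0 - np.exp(-lam * t_irr_s)))

        # Illustrative Au-197(n,gamma)Au-198 foil irradiated for one day
        print(neutron_flux(activity_bq=2.0e3, mass_g=0.05, molar_mass=197.0,
                           abundance=1.0, sigma_barn=98.65,
                           half_life_s=2.7 * 24 * 3600, t_irr_s=24 * 3600))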

  15. Optical accelerometer based on grating interferometer with phase modulation technique.

    PubMed

    Zhao, Shuangshuang; Zhang, Juan; Hou, Changlun; Bai, Jian; Yang, Guoguang

    2012-10-10

    In this paper, an optical accelerometer based on grating interferometer with phase modulation technique is proposed. This device architecture consists of a laser diode, a sensing chip and an optoelectronic processing circuit. The sensing chip is a sandwich structure, which is composed of a grating, a piezoelectric translator and a micromachined silicon structure consisting of a proof mass and four cantilevers. The detected signal is intensity-modulated with phase modulation technique and processed with a lock-in amplifier for demodulation. Experimental results show that this optical accelerometer has acceleration sensitivity of 619 V/g and high-resolution acceleration detection of 3 μg in the linear region. PMID:23052079
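
    The read-out described above, an intensity signal demodulated with a lock-in amplifier, can be illustrated generically: the detector signal is multiplied by in-phase and quadrature references at the modulation frequency and low-pass filtered. The signal model, frequencies and filter below are illustrative assumptions, not the authors' circuit.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs, f_mod = 100_000.0, 1_000.0                 # sample rate and PZT modulation, Hz
        t = np.arange(0, 0.1, 1 / fs)
        phase = 0.6                                    # interferometric phase to recover, rad
        signal = 1.0 + 0.5 * np.cos(2 * np.pi * f_mod * t + phase) \
                 + 0.01 * np.random.randn(t.size)

        # Multiply by references and low-pass filter (software lock-in)
        b, a = butter(4, 50.0 / (fs / 2))
        i_out = filtfilt(b, a, signal * np.cos(2 * np.pi * f_mod * t))
        q_out = filtfilt(b, a, signal * -np.sin(2 * np.pi * f_mod * t))
        print("recovered phase:", np.arctan2(q_out[-1], i_out[-1]))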

  16. An osmolyte-based micro-volume ultrafiltration technique.

    PubMed

    Ghosh, Raja

    2014-12-01

    This paper discusses a novel, simple, and inexpensive micro-volume ultrafiltration technique for protein concentration, desalting, buffer exchange, and size-based protein purification. The technique is suitable for processing protein samples in a high-throughput mode. It utilizes a combination of capillary action, and osmosis for drawing water and other permeable species from a micro-volume sample droplet applied on the surface of an ultrafiltration membrane. A macromolecule coated on the permeate side of the membrane functions as the osmolyte. The action of the osmolyte could, if required, be augmented by adding a supersorbent polymer layer over the osmolyte. The mildly hydrophobic surface of the polymeric ultrafiltration membrane used in this study minimized sample droplet spreading, thus making it easy to recover the retained material after separation, without sample interference and cross-contamination. High protein recoveries were observed in the micro-volume ultrafiltration experiments described in the paper. PMID:25284741

  17. Analytical toxicology.

    PubMed

    Flanagan, R J; Widdop, B; Ramsey, J D; Loveland, M

    1988-09-01

    1. Major advances in analytical toxicology followed the introduction of spectroscopic and chromatographic techniques in the 1940s and early 1950s and thin layer chromatography remains important together with some spectrophotometric and other tests. However, gas- and high performance-liquid chromatography together with a variety of immunoassay techniques are now widely used. 2. The scope and complexity of forensic and clinical toxicology continues to increase, although the compounds for which emergency analyses are needed to guide therapy are few. Exclusion of the presence of hypnotic drugs can be important in suspected 'brain death' cases. 3. Screening for drugs of abuse has assumed greater importance not only for the management of the habituated patient, but also in 'pre-employment' and 'employment' screening. The detection of illicit drug administration in sport is also an area of increasing importance. 4. In industrial toxicology, the range of compounds for which blood or urine measurements (so called 'biological monitoring') can indicate the degree of exposure is increasing. The monitoring of environmental contaminants (lead, chlorinated pesticides) in biological samples has also proved valuable. 5. In the near future a consensus as to the units of measurement to be used is urgently required and more emphasis will be placed on interpretation, especially as regards possible behavioural effects of drugs or other poisons. Despite many advances in analytical techniques there remains a need for reliable, simple tests to detect poisons for use in smaller hospital and other laboratories.

  18. SVPWM Technique with Varying DC-Link Voltage for Common Mode Voltage Reduction in a Matrix Converter and Analytical Estimation of its Output Voltage Distortion

    NASA Astrophysics Data System (ADS)

    Padhee, Varsha

    converter. This conceivably aids the sizing and design of output passive filters. An analytical estimation method has been presented to achieve this purpose for an IMC. Knowledge of the fundamental component in the output voltage can be utilized to calculate its Total Harmonic Distortion (THD). The effectiveness of the proposed SVPWM algorithms and the analytical estimation technique is substantiated by simulations in MATLAB/Simulink and experiments on a laboratory prototype of the IMC. Proper comparison plots have been provided to contrast the performance of the proposed methods with the conventional SVPWM method. The behavior of output voltage distortion and CMV with variation in operating parameters like modulation index and output frequency has also been analyzed.
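
    Computing output-voltage THD once the fundamental component is known is a short calculation. A generic sketch on a synthetic waveform; in the paper the fundamental comes from the analytical estimation rather than from an FFT, and the waveform below is only an illustration.

        import numpy as np

        def thd(signal, fs, f1, n_harmonics=40):
            """Total harmonic distortion of `signal` relative to the fundamental f1 (Hz)."""
            spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
            freqs = np.fft.rfftfreq(len(signal), 1 / fs)
            def mag(f):                                 # magnitude at the nearest FFT bin
                return spectrum[np.argmin(np.abs(freqs - f))]
            fund = mag(f1)
            harm = np.sqrt(sum(mag(k * f1) ** 2 for k in range(2, n_harmonics + 1)))
            return harm / fund

        fs, f1 = 20_000.0, 50.0
        t = np.arange(0, 0.2, 1 / fs)
        v = np.sin(2 * np.pi * f1 * t) + 0.05 * np.sin(2 * np.pi * 5 * f1 * t)
        print(f"THD = {100 * thd(v, fs, f1):.1f}%")     # ~5% for this synthetic waveform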

  19. Development of analytical competencies and professional identities through school-based learning in Denmark

    NASA Astrophysics Data System (ADS)

    Andresen, Bent B.

    2015-12-01

    This article presents the main results of a case study on teachers' professional development in terms of competence and identity. The teachers involved in the study are allocated time by their schools to participate in professional "affinity group" meetings. During these meetings, the teachers gather and analyse school-based data about factors which persistently create and sustain challenges in effective student education (grade K-10). This process improves their understanding and undertaking of job-related tasks. The affinity group meetings also influence the teachers' professional identity. The research findings thus illustrate the fact that the analytical approach of affinity groups, based on the analysis of the difficulties in their daily job, provides good results in terms of competencies and identity perception. In general, as a result of meeting in affinity groups, adult learners develop professional competencies and identities which are considered crucial in rapidly changing schools characterised by an increased focus on, among other things, lifelong learning, social inclusion, school digitalisation, and information literacy. The research findings are thus relevant for ministries and school owners, teacher-trainers and supervisors, schools and other educational institutions, as well as teachers and their organisations worldwide.

  20. Nonlinear modelling of high-speed catenary based on analytical expressions of cable and truss elements

    NASA Astrophysics Data System (ADS)

    Song, Yang; Liu, Zhigang; Wang, Hongrui; Lu, Xiaobing; Zhang, Jing

    2015-10-01

    Due to the intrinsic nonlinear characteristics and complex structure of the high-speed catenary system, a modelling method is proposed based on the analytical expressions of nonlinear cable and truss elements. The calculation procedure for solving the initial equilibrium state is proposed based on the Newton-Raphson iteration method. The deformed configuration of the catenary system as well as the initial length of each wire can be calculated. The accuracy and validity of computing the initial equilibrium state are verified by comparison with the separate model method, the absolute nodal coordinate formulation and other methods in the previous literature. Then, the proposed model is combined with a lumped pantograph model, and a dynamic simulation procedure is proposed. The accuracy is guaranteed by multiple iterative calculations in each time step. The dynamic performance of the proposed model is validated by comparison with EN 50318, the results of finite element method software and a SIEMENS simulation report, respectively. Finally, the influence of the catenary design parameters (such as the reserved sag and pre-tension) on the dynamic performance is preliminarily analysed using the proposed model.

  1. Analytical modelling of a refractive index sensor based on an intrinsic micro Fabry-Perot interferometer.

    PubMed

    Vargas-Rodriguez, Everardo; Guzman-Chavez, Ana D; Cano-Contreras, Martin; Gallegos-Arellano, Eloisa; Jauregui-Vazquez, Daniel; Hernández-García, Juan C; Estudillo-Ayala, Julian M; Rojas-Laguna, Roberto

    2015-10-15

    In this work a refractive index sensor based on a combination of the non-dispersive sensing (NDS) and the Tunable Laser Spectroscopy (TLS) principles is presented. Here, in order to have one reference and one measurement channel a single-beam dual-path configuration is used for implementing the NDS principle. These channels are monitored with a couple of identical optical detectors which are correlated to calculate the overall sensor response, called here the depth of modulation. It is shown that this is useful to minimize drifting errors due to source power variations. Furthermore, a comprehensive analysis of a refractive index sensing setup, based on an intrinsic micro Fabry-Perot Interferometer (FPI) is described. Here, the changes over the FPI pattern as the exit refractive index is varied are analytically modelled by using the characteristic matrix method. Additionally, our simulated results are supported by experimental measurements which are also provided. Finally it is shown that by using this principle a simple refractive index sensor with a resolution in the order of 2.15 × 10(-4) RIU can be implemented by using a couple of standard and low cost photodetectors.
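
    The characteristic (transfer) matrix modelling mentioned above can be sketched for a single low-finesse Fabry-Perot cavity: the cavity contributes a 2x2 matrix, and the reflectance follows from the assembled matrix between the fibre and the exit medium. The cavity length, indices and wavelength grid below are illustrative values, not the parameters of the reported sensor.

        import numpy as np

        def fpi_reflectance(wavelength_nm, n_fibre=1.45, n_cavity=1.0, n_exit=1.33, d_nm=25_000):
            """Normal-incidence reflectance of a single-cavity FPI (fibre | cavity | exit
            medium) by the characteristic matrix method."""
            lam = np.asarray(wavelength_nm, dtype=float)
            delta = 2 * np.pi * n_cavity * d_nm / lam
            m11, m12 = np.cos(delta), 1j * np.sin(delta) / n_cavity
            m21, m22 = 1j * n_cavity * np.sin(delta), np.cos(delta)
            B = m11 + m12 * n_exit
            C = m21 + m22 * n_exit
            r = (n_fibre * B - C) / (n_fibre * B + C)
            return np.abs(r) ** 2

        lam = np.linspace(1500, 1600, 2000)
        for n_exit in (1.330, 1.335):                  # fringe contrast changes with exit index
            R = fpi_reflectance(lam, n_exit=n_exit)
            print(n_exit, round(R.min(), 5), round(R.max(), 5))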

  2. Analytical Modelling of a Refractive Index Sensor Based on an Intrinsic Micro Fabry-Perot Interferometer

    PubMed Central

    Vargas-Rodriguez, Everardo; Guzman-Chavez, Ana D.; Cano-Contreras, Martin; Gallegos-Arellano, Eloisa; Jauregui-Vazquez, Daniel; Hernández-García, Juan C.; Estudillo-Ayala, Julian M.; Rojas-Laguna, Roberto

    2015-01-01

    In this work a refractive index sensor based on a combination of the non-dispersive sensing (NDS) and the Tunable Laser Spectroscopy (TLS) principles is presented. Here, in order to have one reference and one measurement channel a single-beam dual-path configuration is used for implementing the NDS principle. These channels are monitored with a couple of identical optical detectors which are correlated to calculate the overall sensor response, called here the depth of modulation. It is shown that this is useful to minimize drifting errors due to source power variations. Furthermore, a comprehensive analysis of a refractive index sensing setup, based on an intrinsic micro Fabry-Perot Interferometer (FPI) is described. Here, the changes over the FPI pattern as the exit refractive index is varied are analytically modelled by using the characteristic matrix method. Additionally, our simulated results are supported by experimental measurements which are also provided. Finally it is shown that by using this principle a simple refractive index sensor with a resolution in the order of 2.15 × 10−4 RIU can be implemented by using a couple of standard and low cost photodetectors. PMID:26501277

  3. Entropy-based heavy tailed distribution transformation and visual analytics for monitoring massive network traffic

    NASA Astrophysics Data System (ADS)

    Han, Keesook J.; Hodge, Matthew; Ross, Virginia W.

    2011-06-01

    For monitoring network traffic, there is an enormous cost in collecting, storing, and analyzing network traffic datasets. Data mining based network traffic analysis is attracting growing interest in the cyber security community, but is computationally expensive for finding correlations between attributes in massive network traffic datasets. To lower the cost and reduce computational complexity, it is desirable to perform feasible statistical processing on effective reduced datasets instead of on the original full datasets. Because of the dynamic behavior of network traffic, traffic traces exhibit mixtures of heavy tailed statistical distributions or overdispersion. Heavy tailed network traffic characterization and visualization are important and essential tasks to measure network performance for Quality of Service. However, heavy tailed distributions are limited in their ability to characterize real-time network traffic due to the difficulty of parameter estimation. The Entropy-Based Heavy Tailed Distribution Transformation (EHTDT) was developed to convert the heavy tailed distribution into a transformed distribution to find the linear approximation. The EHTDT linearization has the advantage of being amenable to characterizing and aggregating overdispersion of network traffic in real time. Results of applying the EHTDT for innovative visual analytics to real network traffic data are presented.

  4. Personal exposure assessment to particulate metals using a paper-based analytical device

    NASA Astrophysics Data System (ADS)

    Cate, David; Volckens, John; Henry, Charles

    2013-03-01

    The development of a paper-based analytical device (PAD) for assessing personal exposure to particulate metals will be presented. Human exposure to metal aerosols, such as those that occur in the mining, construction, and manufacturing industries, has a significant impact on the health of our workforce, costing an estimated $10B in the U.S. and causing approximately 425,000 premature deaths worldwide each year. Occupational exposure to particulate metals affects millions of individuals in manufacturing, construction (welding, cutting, blasting), and transportation (combustion, utility maintenance, and repair services) industries. Despite these effects, individual workers are rarely assessed for their exposure to particulate metals, due mainly to the high cost and effort associated with personal exposure measurement. Current exposure assessment methods for particulate metals call for an 8-hour filter sample, after which the filter sample is transported to a laboratory and analyzed by inductively-coupled plasma (ICP). The time from sample collection to reporting is typically weeks and costs several hundred dollars per sample. To exacerbate the issue, method detection limits suffer because of sample dilution during digestion. The lack of sensitivity hampers task-based exposure assessment, for which sampling times may be tens of minutes. To address these problems, and as a first step towards using microfluidics for personal exposure assessment, we have developed PADs for measurement of Pb, Cd, Cr, Fe, Ni, and Cu in aerosolized particulate matter.

  5. Analytical modelling of a refractive index sensor based on an intrinsic micro Fabry-Perot interferometer.

    PubMed

    Vargas-Rodriguez, Everardo; Guzman-Chavez, Ana D; Cano-Contreras, Martin; Gallegos-Arellano, Eloisa; Jauregui-Vazquez, Daniel; Hernández-García, Juan C; Estudillo-Ayala, Julian M; Rojas-Laguna, Roberto

    2015-01-01

    In this work a refractive index sensor based on a combination of the non-dispersive sensing (NDS) and the Tunable Laser Spectroscopy (TLS) principles is presented. Here, in order to have one reference and one measurement channel a single-beam dual-path configuration is used for implementing the NDS principle. These channels are monitored with a couple of identical optical detectors which are correlated to calculate the overall sensor response, called here the depth of modulation. It is shown that this is useful to minimize drifting errors due to source power variations. Furthermore, a comprehensive analysis of a refractive index sensing setup, based on an intrinsic micro Fabry-Perot Interferometer (FPI) is described. Here, the changes over the FPI pattern as the exit refractive index is varied are analytically modelled by using the characteristic matrix method. Additionally, our simulated results are supported by experimental measurements which are also provided. Finally it is shown that by using this principle a simple refractive index sensor with a resolution in the order of 2.15 × 10(-4) RIU can be implemented by using a couple of standard and low cost photodetectors. PMID:26501277

  6. Vision based techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image database for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight, are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except in regions close to the focus of expansion (FOE). Closer to the FOE, the error in range increases since the magnitude of the disparity gets smaller, resulting in a low SNR.
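
    In the simplest passive-ranging geometry, a camera translating along its optical axis, range follows from how fast an image feature diverges from the focus of expansion: the time to contact is the image distance from the FOE divided by its rate of change, and range is that time multiplied by vehicle speed. A simplified sketch under those assumptions; the tracked-feature numbers are illustrative and the actual methodology is more general.

        def range_from_flow(r_pix, dr_dt_pix, speed_mps):
            """Range to a feature for a camera translating along its optical axis:
            Z = V * r / (dr/dt), with r the image distance from the FOE."""
            return speed_mps * r_pix / dr_dt_pix

        # Illustrative feature: 120 px from the FOE, diverging at 8 px/s, 5 m/s flight speed
        print(range_from_flow(r_pix=120.0, dr_dt_pix=8.0, speed_mps=5.0), "m")   # 75 m
        # Near the FOE dr/dt becomes very small, so range error grows, as noted above.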

  7. Changes in sample collection and analytical techniques and effects on retrospective comparability of low-level concentrations of trace elements in ground water

    USGS Publications Warehouse

    Ivahnenko, T.; Szabo, Z.; Gibs, J.

    2001-01-01

    Ground-water sampling techniques were modified to reduce random low-level contamination during collection of filtered water samples for determination of trace-element concentrations. The modified sampling techniques were first used in New Jersey by the US Geological Survey in 1994 along with inductively coupled plasma-mass spectrometry (ICP-MS) analysis to determine the concentrations of 18 trace elements at the one microgram-per-liter (μg/L) level in the oxic water of the unconfined sand and gravel Kirkwood-Cohansey aquifer system. The revised technique tested included a combination of the following: collection of samples (1) with flow rates of about 2 L per minute, (2) through acid-washed single-use disposable tubing and (3) a single-use disposable 0.45-μm pore size capsule filter, (4) contained within portable glove boxes, (5) in a dedicated clean sampling van, (6) only after turbidity stabilized at values less than 2 nephelometric turbidity units (NTU), when possible. Quality-assurance data, obtained from equipment blanks and split samples, indicated that trace element concentrations, with the exception of iron, chromium, aluminum, and zinc, measured in the samples collected in 1994 were not subject to random contamination at 1 μg/L. Results from samples collected in 1994 were compared to those from samples collected in 1991 from the same 12 PVC-cased observation wells using the available sampling and analytical techniques at that time. Concentrations of copper, lead, manganese and zinc were statistically significantly lower in samples collected in 1994 than in 1991. Sampling techniques used in 1994 likely provided trace-element data that represented concentrations in the aquifer with less bias than data from 1991, when samples were collected without the same degree of attention to sample handling. Copyright © 2001.

  8. Improved sampling and analytical techniques for characterization of very-low-level radwaste materials from commercial nuclear power stations

    SciTech Connect

    Robertson, D.E.; Robinson, P.J.

    1989-11-01

    This paper summarizes the unique sampling methods that were utilized in a recently completed project sponsored by the Electric Power Research Institute (EPRI) to perform accurate and precise radiological characterizations of several very-low-level radwaste materials from commercial nuclear power stations. The waste types characterized during this project included dry active waste (DAW), oil, secondary-side ion exchange resin, and soil. Special precautions were taken to ensure representative sampling of the DAW. This involved the initial direct, quantitative gamma spectrometric analyses of bulk quantities (208-liter drums) of DAW utilizing a specially constructed barrel scanner employing a collimated intrinsic germanium detector assembly. Subsamples of the DAW for destructive radiochemical analyses of the difficult-to-measure 10 CFR 61 radionuclides were then selected which had the same isotopic composition (to within ±25%) as that measured for the entire drum of DAW. The techniques for accomplishing this sampling are described. Oil samples were collected from the top, middle and bottom sections of 208-liter drums for radiochemical analyses. These samples were composited to represent the entire drum of oil. The accuracy of this type of sampling was evaluated by comparisons with direct, quantitative assays of a number of the drums using the barrel scanning gamma-ray spectrometer. The accuracy of sampling drums of spent secondary-side ion exchange resin was evaluated by comparing the radionuclide contents of grab samples taken from the tops of the drums with direct assays performed with the barrel scanner. The results of these sampling evaluations indicated that the sampling methods used were generally adequate for providing a reasonably representative subsample from bulk quantities of DAW, oil, and resin. The study also identified a number of potential pitfalls in sampling of these materials.

  9. Using Maps in Web Analytics to Evaluate the Impact of Web-Based Extension Programs

    ERIC Educational Resources Information Center

    Veregin, Howard

    2015-01-01

    Maps can be a valuable addition to the Web analytics toolbox for Extension programs that use the Web to disseminate information. Extension professionals use Web analytics tools to evaluate program impacts. Maps add a unique perspective through visualization and analysis of geographic patterns and their relationships to other variables. Maps can…

  10. Development of a Test to Evaluate Students' Analytical Thinking Based on Fact versus Opinion Differentiation

    ERIC Educational Resources Information Center

    Thaneerananon, Taveep; Triampo, Wannapong; Nokkaew, Artorn

    2016-01-01

    Nowadays, one of the biggest challenges of education in Thailand is the development and promotion of the students' thinking skills. The main purposes of this research were to develop an analytical thinking test for 6th grade students and evaluate the students' analytical thinking. The sample was composed of 3,567 6th grade students in 2014…

  11. An analytic linear accelerator source model for GPU-based Monte Carlo dose calculations.

    PubMed

    Tian, Zhen; Li, Yongbao; Folkerts, Michael; Shi, Feng; Jiang, Steve B; Jia, Xun

    2015-10-21

    Recently, there has been a lot of research interest in developing fast Monte Carlo (MC) dose calculation methods on graphics processing unit (GPU) platforms. A good linear accelerator (linac) source model is critical for both accuracy and efficiency considerations. In principle, an analytical source model should be preferred over a phase-space file-based model for GPU-based MC dose engines, in that data loading and CPU-GPU data transfer can be avoided. In this paper, we presented an analytical field-independent source model specifically developed for GPU-based MC dose calculations, associated with a GPU-friendly sampling scheme. A key concept called phase-space-ring (PSR) was proposed. Each PSR contained a group of particles that were of the same type, close in energy and resided in a narrow ring on the phase-space plane located just above the upper jaws. The model parameterized the probability densities of particle location, direction and energy for each primary photon PSR, scattered photon PSR and electron PSR. Models of one 2D Gaussian distribution or multiple Gaussian components were employed to represent the particle direction distributions of these PSRs. A method was developed to analyze a reference phase-space file and derive corresponding model parameters. To efficiently use our model in MC dose calculations on GPU, we proposed a GPU-friendly sampling strategy, which ensured that the particles sampled and transported simultaneously are of the same type and close in energy to alleviate GPU thread divergences. To test the accuracy of our model, dose distributions of a set of open fields in a water phantom were calculated using our source model and compared to those calculated using the reference phase-space files. For the high dose gradient regions, the average distance-to-agreement (DTA) was within 1 mm and the maximum DTA within 2 mm. For relatively low dose gradient regions, the root-mean-square (RMS) dose difference was within 1.1% and the maximum

  13. A Label-Free Porous Silicon Immunosensor for Broad Detection of Opiates in a Blind Clinical Study and Result Comparison to Commercial Analytical Chemistry Techniques

    PubMed Central

    Bonanno, Lisa M.; Kwong, Tai C.; DeLouise, Lisa A.

    2010-01-01

    In this work we evaluate for the first time the performance of a label-free porous silicon (PSi) immunosensor assay in a blind clinical study designed to screen authentic patient urine specimens for a broad range of opiates. The PSi opiate immunosensor achieved 96% concordance with liquid chromatography-mass spectrometry/tandem mass spectrometry (LC-MS/MS) results on samples that underwent standard opiate testing (n=50). In addition, successful detection of a commonly abused opiate, oxycodone, resulted in 100% qualitative agreement between the PSi opiate sensor and LC-MS/MS. In contrast, a commercial broad opiate immunoassay technique (CEDIA®) achieved 65% qualitative concordance with LC-MS/MS. Evaluation of important performance attributes including precision, accuracy, and recovery was completed on blank urine specimens spiked with test analytes. Variability of morphine detection as a model opiate target was < 9% both within-run and between-day at and above the cutoff limit of 300 ng ml−1. This study validates the analytical screening capability of label-free PSi opiate immunosensors in authentic patient samples and is the first semi-quantitative demonstration of the technology’s successful clinical use. These results motivate future development of PSi technology to reduce complexity and cost of diagnostic testing particularly in a point-of-care setting. PMID:21062030

  14. Antimisting kerosene: Base fuel effects, blending and quality control techniques

    NASA Technical Reports Server (NTRS)

    Yavrouian, A. H.; Ernest, J.; Sarohia, V.

    1984-01-01

    The problems associated with blending of the AMK additive with Jet A, and the base fuel effects on AMK properties, are addressed. The results from the evaluation of some of the quality control techniques for AMK are presented. The principal conclusions of this investigation are: significant compositional differences exist for base fuel (Jet A) within the ASTM specification D1655; higher aromatic content of the base fuel was found to be beneficial for polymer dissolution at ambient (20 °C) temperature; using static mixer technology, the antimisting additive (FM-9) is in-line blended with Jet A, producing AMK which has adequate fire-protection properties 15 to 20 minutes after blending; degradability of freshly blended and equilibrated AMK indicated that maximum degradability is reached after adequate fire protection is obtained; the results of AMK degradability as measured by filter ratio confirmed previous RAE data that power requirements to degrade freshly blended AMK are significantly higher than for equilibrated AMK; blending of the additive by using FM-9 concentrate in Jet A produces equilibrated AMK almost instantly; nephelometry offers a simple continuous monitoring capability and is used as a real-time quality control device for AMK; and trajectory (jet thrust) and pressure drop tests are useful laboratory techniques for evaluating AMK quality.

  15. Spectroscopic analysis of solar and cosmic X-ray spectra. 1: The nature of cosmic X-ray spectra and proposed analytical techniques

    NASA Technical Reports Server (NTRS)

    Walker, A. B. C., Jr.

    1975-01-01

    Techniques for the study of the solar corona are reviewed as an introduction to a discussion of modifications required for the study of cosmic sources. Spectroscopic analysis of individual sources and the interstellar medium is considered. The latter was studied via analysis of its effect on the spectra of selected individual sources. The effects of various characteristics of the ISM, including the presence of grains, molecules, and ionization, are first discussed, and the development of ISM models is described. The expected spectral structure of individual cosmic sources is then reviewed with emphasis on supernovae remnants and binary X-ray sources. The observational and analytical requirements imposed by the characteristics of these sources are identified, and prospects for the analysis of abundances and the study of physical parameters within them are assessed. Prospects for the spectroscopic study of other classes of X-ray sources are also discussed.

  16. HydroViewer: Utilizing Web-Based Hydrologic Data And Analytical Services

    NASA Astrophysics Data System (ADS)

    Ye, Z.; Djokic, D.; Armstrong, L.

    2011-12-01

    To conduct a hydrologic study in an area, the hydrologist needs to define the area of interest, collect the spatial, hydrological, and meteorological data for the area, and finally perform the desired analysis. Service-oriented architecture holds the promise that these activities can be performed using distributed services combined in a single lightweight application. Hydrologic time series data are collected, stored, and served by multiple agencies in different formats and are often difficult to find, acquire, and mobilize for analysis. CUAHSI's WaterOneFlow web service API provides functions for querying and collecting the temporal data in a consistent manner. Many agencies and universities publish their time series data using WaterOneFlow services and thus make them available to a broad range of users. ArcGIS server allows publishing of spatial data and mapping services and is widely used in government, academia, and industry. ArcGIS server can also be used to serve analytical services. By combining spatial services provided by ArcGIS server and temporal data provided through WaterOneFlow services, it is now possible to create web applications to explore the hydrologic time series data available in a given spatial area served by multiple agencies and to perform analysis on them. A Web application, HydroViewer, was developed using the ArcGIS Silverlight API to allow users to mobilize ArcGIS server and WaterOneFlow services in an integrated fashion. It performs the following tasks: (1) Delineating the watershed for a user-specified point of interest using an Arc Hydro-based watershed delineation service, (2) Exploring data collection sites and the data collected and served by different agencies for a given spatial area (watershed or viewing extent) and time domain, (3) Viewing/graphing the data collected at these sites, (4) Collecting the metadata for the data variables in the form of a data cart, (5) Downloading both spatial and time series data to create an Arc Hydro
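
    As an illustration of the kind of temporal-data query such an application issues behind the scenes, the sketch below pulls one time series from a WaterOneFlow endpoint with the third-party Python zeep SOAP client. The WSDL URL, site code, and variable code are placeholders, the exact argument list is governed by the service's WSDL, and this is not code from the paper.

```python
# Minimal sketch (not from the paper): query a WaterOneFlow 1.1 endpoint for
# one site/variable/time window using the zeep SOAP client.  The WSDL URL and
# the network:code identifiers below are placeholders.
from zeep import Client

WSDL = "http://example.org/cuahsi_1_1.asmx?WSDL"  # placeholder service endpoint

client = Client(WSDL)

# WaterOneFlow's GetValues call takes a location, a variable, start and end
# dates, and an (often empty) auth token, and returns a WaterML document.
response = client.service.GetValues(
    "NWISDV:10109000",   # example network:site code
    "NWISDV:00060",      # example network:variable code (daily discharge)
    "2011-01-01",
    "2011-12-31",
    "",                  # auth token (empty for public services)
)

print(str(response)[:500])  # WaterML XML; parse it to extract the time series
```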

  17. Development of a microfluidic paper-based analytical device for the determination of salivary aldehydes.

    PubMed

    Ramdzan, Adlin N; Almeida, M Inês G S; McCullough, Michael J; Kolev, Spas D

    2016-05-01

    A low-cost, disposable and easy-to-use microfluidic paper-based analytical device (μPAD) was developed for simple and non-invasive determination of total aldehydes in saliva, with the potential to be used in epidemiological studies to assess oral cancer risk. The μPAD is based on the colour reaction between aldehydes (e.g. acetaldehyde, formaldehyde), 3-methyl-2-benzothiazolinone hydrazone (MBTH) and iron(III) to form an intensely blue-coloured formazan dye. The newly developed μPAD has a 3D design with two overlapping paper layers. The first layer comprises 15 circular detection zones (8 mm in diameter), each impregnated with 8 μL of MBTH, while the second layer contains 15 reagent zones (4 mm in diameter). Two μL of iron(III) chloride are added to each of the second-layer zones after the addition of sample to the detection zones in the first layer. All hydrophilic zones of the μPAD are defined by wax printing using a commercial wax printer. Because the analytical reaction proceeds in two steps, the two paper layers are separated by a cellulose acetate interleaving sheet so that the aldehydes in the saliva sample first react with MBTH to form an azine; after the interleaving sheet is removed, the azine reacts with the iron(III)-oxidized form of MBTH to give the blue colour. After a high-resolution image of the detection zones is obtained with a flatbed scanner, the intensity of the blue colour within each detection zone is measured with ImageJ software. Under optimal conditions, the μPAD is characterised by a working range of 20.4-114.0 μM, a limit of detection of 6.1 μM, and a repeatability, expressed as RSD, of less than 12.7% (n = 5). There is no statistically significant difference at the 95% confidence level between the results obtained by the μPAD and the reference method (Student's t-test: 0.090 < 0.38). The optimized μPAD is stable for more than 41 days
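
    The colorimetric readout described above (mean colour intensity inside each circular detection zone of a scanned image) can also be reproduced outside ImageJ. The following is a minimal sketch, assuming a scanned RGB image and placeholder zone coordinates; it is not the authors' script, and the channel choice and any background correction would need to be set during calibration.

```python
# Minimal sketch (assumption: not the authors' ImageJ workflow) of measuring
# the mean colour intensity inside one circular uPAD detection zone.
import numpy as np
from PIL import Image

def zone_intensity(image_path, centre_xy, radius_px, channel=2):
    """Mean value of one RGB channel (0=R, 1=G, 2=B) inside a circular zone."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=float)
    h, w = img.shape[:2]
    yy, xx = np.ogrid[:h, :w]           # pixel coordinate grids
    cx, cy = centre_xy
    mask = (xx - cx) ** 2 + (yy - cy) ** 2 <= radius_px ** 2
    return img[..., channel][mask].mean()

# Placeholder usage: an 8 mm zone scanned at 300 dpi is roughly 94 px across,
# so a radius of about 47 px covers it.  File name and centre are hypothetical.
# value = zone_intensity("scan.png", centre_xy=(512, 384), radius_px=47)
```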

  18. Microfluidic paper-based analytical devices fabricated by low-cost photolithography and embossing of Parafilm®.

    PubMed

    Yu, Ling; Shi, Zhuan Zhuan

    2015-04-01

    Microfluidic paper-based analytical devices (μPADs) are attracting tremendous attention as an economical tool for in-field diagnosis, food safety and environmental monitoring. We fabricated 2D and 3D μPADs by a novel route of photolithographically patterning microchannels on Parafilm® and subsequently embossing them onto paper. This truly low-cost approach, independent of wax printers and cutter plotters, offers researchers in resource-limited laboratories the opportunity to work on paper-based analytical devices. PMID:25710591

  19. Finite analytic method based on mixed-form Richards' equation for simulating water flow in vadose zone

    NASA Astrophysics Data System (ADS)

    Zhang, Zaiyong; Wang, Wenke; Yeh, Tian-chyi Jim; Chen, Li; Wang, Zhoufeng; Duan, Lei; An, Kedong; Gong, Chengcheng

    2016-06-01

    In this paper, we develop a finite analytic method (FAMM), which combines the flexibility of numerical methods with the advantages of analytical solutions, to solve the mixed-form Richards' equation. This new approach minimizes the mass balance errors and truncation errors associated with most numerical approaches. We use numerical experiments to demonstrate that, when judged against analytical solutions, FAMM obtains more accurate numerical solutions and controls the global mass balance better than the modified Picard finite difference method (MPFD). In addition, FAMM is superior to the finite analytic method based on the head-based Richards' equation (FAMH). FAMM solutions are also compared with analytical solutions for wetting and drying processes in Brindabella Silty Clay Loam and Yolo Light Clay soils. Finally, we demonstrate that FAMM yields results comparable with those from MPFD and Hydrus-1D for simulating infiltration into other soils under wet and dry conditions. These numerical experiments further confirm that, as long as a hydraulic constitutive model captures the general behavior of other models, it can be used to yield flow fields comparable to those based on the other models.
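
    For context, the mixed (θ–h) form of Richards' equation that such schemes discretize is, in one vertical dimension (written here from the general literature, not reproduced from the paper):

    \[ \frac{\partial \theta(h)}{\partial t} = \frac{\partial}{\partial z}\left[ K(h)\left( \frac{\partial h}{\partial z} + 1 \right) \right], \]

    where θ is the volumetric water content, h is the pressure head, K(h) is the unsaturated hydraulic conductivity, z is the vertical coordinate (positive upward), and t is time. Retaining ∂θ/∂t directly, rather than expanding it into C(h) ∂h/∂t as in the head-based form, is what gives mixed-form schemes their favorable mass-balance behavior.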

  20. Water-based technique to produce porous PZT materials

    NASA Astrophysics Data System (ADS)

    Galassi, C.; Capiani, C.; Craciun, F.; Roncari, E.

    2005-09-01

    Water-based colloidal processing of PZT materials was investigated in order to reduce costs and employ more environmentally friendly manufacturing. The technique addressed was the production of porous thick samples by so-called “starch consolidation”. PZT “soft” compositions were used. The “starch consolidation” process allows the green body to be obtained by raising the temperature of a suspension of PZT powder, soluble starch and water cast into a metal mould. The influence of the processing parameters and composition on the morphology, pore volumes, pore size distributions and piezoelectric properties is investigated. Zeta potential determination and titration with different deflocculants were essential tools for adjusting the slurry formulation.