Sample records for multiple analytical techniques

  1. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve detection sensitivity, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  2. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve detection sensitivity, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. PMID:28109440

  3. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance to European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
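    The core idea behind such privacy-preserving analytics can be illustrated with additive secret sharing, the simplest multi-party computation building block: each party's value is split into random shares that sum to the original, so any single share reveals nothing. The sketch below is illustrative only (the hospital counts and three-party layout are hypothetical), not the protocol used in the pilots:

```python
import random

MOD = 2**31 - 1  # arithmetic is done modulo a fixed prime

def share(value, n_parties, modulus=MOD):
    """Split an integer into n additive shares that sum to value (mod modulus)."""
    shares = [random.randrange(modulus) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % modulus)
    return shares

def reconstruct(shares, modulus=MOD):
    return sum(shares) % modulus

# Each hospital secret-shares its patient count among three compute parties.
counts = [120, 85, 342]                      # hypothetical per-hospital values
all_shares = [share(c, 3) for c in counts]

# Party j sums the j-th share of every input -- it never sees a raw count.
party_sums = [sum(s[j] for s in all_shares) % MOD for j in range(3)]

total = reconstruct(party_sums)
print(total)  # 547, the joint total, with no hospital revealing its own count
```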

  4. Multiple Contact Dates and SARS Incubation Periods

    PubMed Central

    2004-01-01

    Many severe acute respiratory syndrome (SARS) patients have multiple possible incubation periods due to multiple contact dates. Multiple contact dates cannot be used in standard statistical analytic techniques, however. I present a simple spreadsheet-based method that uses multiple contact dates to calculate the possible incubation periods of SARS. PMID:15030684
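    The spreadsheet logic described here reduces to subtracting the latest and earliest candidate contact dates from the symptom-onset date, which brackets the possible incubation period. A minimal sketch (all dates are hypothetical):

```python
from datetime import date

def incubation_bounds(contact_dates, onset):
    """Shortest possible incubation runs from the latest contact,
    longest from the earliest contact."""
    shortest = (onset - max(contact_dates)).days
    longest = (onset - min(contact_dates)).days
    return shortest, longest

# A patient with three possible exposure dates and a known onset date.
contacts = [date(2003, 3, 1), date(2003, 3, 4), date(2003, 3, 7)]
onset = date(2003, 3, 12)
print(incubation_bounds(contacts, onset))  # (5, 11)
```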

  5. Demonstrations Using a Fabry-Perot. I. Multiple-Slit Interference

    ERIC Educational Resources Information Center

    Roychoudhuri, Chandrasekhar

    1975-01-01

    Describes a demonstration technique for showing multiple-slit interference patterns with the use of a Fabry-Perot etalon and a laser beam. A simple derivation of the analytical expression for such fringes is presented. (Author/CP)
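    For reference, the textbook expressions being compared are the N-slit interference pattern and the Fabry-Perot (Airy) transmission; these are the standard forms, not reproduced from the article itself:

```latex
% N-slit interference, phase difference \delta between adjacent slits:
I(\delta) = I_0 \left[ \frac{\sin(N\delta/2)}{\sin(\delta/2)} \right]^2

% Fabry-Perot etalon (Airy distribution), mirror reflectance R:
I_t(\delta) = \frac{I_0}{1 + F \sin^2(\delta/2)}, \qquad
F = \frac{4R}{(1-R)^2}, \qquad
\delta = \frac{4\pi n d \cos\theta}{\lambda}
```

    As R grows, the coefficient F sharpens the etalon fringes in the same way that increasing N sharpens the multiple-slit fringes, which is the analogy the demonstration exploits.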

  6. Permeation absorption sampler with multiple detection

    DOEpatents

    Zaromb, Solomon

    1990-01-01

    A system for detecting analytes in air or aqueous systems includes a permeation absorption preconcentrator sampler for the analytes and analyte detectors. The preconcentrator has an inner fluid-permeable container into which a charge of analyte-sorbing liquid is intermittently injected, and a fluid-impermeable outer container. The sample is passed through the outer container and around the inner container for trapping and preconcentrating the analyte in the sorbing liquid. The analyte can be detected photometrically by injecting with the sorbing material a reagent which reacts with the analyte to produce a characteristic color or fluorescence which is detected by illuminating the contents of the inner container with a light source and measuring the absorbed or emitted light, or by producing a characteristic chemiluminescence which can be detected by a suitable light sensor. The analyte can also be detected amperometrically. Multiple inner containers may be provided into which a plurality of sorbing liquids are respectively introduced for simultaneously detecting different analytes. Baffles may be provided in the outer container. A calibration technique is disclosed.

  7. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

    Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due, in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. However, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance the inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
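    The first step behind the multilevel-modeling approach mentioned above is separating stable between-person differences from within-person fluctuation (person-mean centering). A minimal sketch, with fabricated craving ratings for three people:

```python
import numpy as np

# Hypothetical AA data: craving ratings, 3 people x 5 prompts each.
person = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2])
craving = np.array([3., 4, 2, 5, 3, 7, 6, 8, 7, 9, 1, 2, 1, 3, 2])

# Between-person level: each individual's typical craving.
person_means = np.array([craving[person == p].mean() for p in range(3)])

# Within-person level: moment-to-moment deviations from one's own mean.
within = craving - person_means[person]

print(person_means)   # stable individual differences
print(within[:5])     # person 0's fluctuation around their own mean
```

    A multilevel model then predicts outcomes from both components at once; the p-technique instead factor-analyzes a single person's time series.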

  8. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    Ten urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red spectroscopy, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features which often go undetected due to limitations in the particular analytical procedure being used have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed, and the multiple technique approach has been evaluated as a whole.

  9. Computational overlay metrology with adaptive data analytics

    NASA Astrophysics Data System (ADS)

    Schmitt-Weaver, Emil; Subramony, Venky; Ullah, Zakir; Matsunobu, Masazumi; Somasundaram, Ravin; Thomas, Joel; Zhang, Linmiao; Thul, Klaus; Bhattacharyya, Kaustuve; Goossens, Ronald; Lambregts, Cees; Tel, Wim; de Ruiter, Chris

    2017-03-01

    With photolithography as the fundamental patterning step in the modern nanofabrication process, every wafer within a semiconductor fab will pass through a lithographic apparatus multiple times. With more than 20,000 sensors producing more than 700 GB of data per day across multiple subsystems, the combination of a light source and lithographic apparatus provides a massive amount of information for data analytics. This paper outlines how data analysis tools and techniques that extend insight into data traditionally considered unmanageably large, known as adaptive analytics, can be used to show how data collected before the wafer is exposed can detect small process-dependent wafer-to-wafer changes in overlay.

  10. A simulation-based evaluation of methods for inferring linear barriers to gene flow

    Treesearch

    Christopher Blair; Dana E. Weigel; Matthew Balazik; Annika T. H. Keeley; Faith M. Walker; Erin Landguth; Sam Cushman; Melanie Murphy; Lisette Waits; Niko Balkenhol

    2012-01-01

    Different analytical techniques used on the same data set may lead to different conclusions about the existence and strength of genetic structure. Therefore, reliable interpretation of the results from different methods depends on the efficacy and reliability of different statistical methods. In this paper, we evaluated the performance of multiple analytical methods to...

  11. Oscillations and Multiple Equilibria in Microvascular Blood Flow.

    PubMed

    Karst, Nathaniel J; Storey, Brian D; Geddes, John B

    2015-07-01

    We investigate the existence of oscillatory dynamics and multiple steady-state flow rates in a network with a simple topology and in vivo microvascular blood flow constitutive laws. Unlike many previous analytic studies, we employ the most biologically relevant models of the physical properties of whole blood. Through a combination of analytic and numeric techniques, we predict in a series of two-parameter bifurcation diagrams a range of dynamical behaviors, including multiple equilibria flow configurations, simple oscillations in volumetric flow rate, and multiple coexistent limit cycles at physically realizable parameters. We show that complexity in network topology is not necessary for complex behaviors to arise and that nonlinear rheology, in particular the plasma skimming effect, is sufficient to support oscillatory dynamics similar to those observed in vivo.

  12. Material property analytical relations for the case of an AFM probe tapping a viscoelastic surface containing multiple characteristic times

    PubMed Central

    López-Guerra, Enrique A

    2017-01-01

    We explore the contact problem of a flat-end indenter penetrating intermittently a generalized viscoelastic surface, containing multiple characteristic times. This problem is especially relevant for nanoprobing of viscoelastic surfaces with the highly popular tapping-mode AFM imaging technique. By focusing on the material perspective and employing a rigorous rheological approach, we deliver analytical closed-form solutions that provide physical insight into the viscoelastic sources of repulsive forces, tip–sample dissipation and virial of the interaction. We also offer a systematic comparison to the well-established standard harmonic excitation, which is the case relevant for dynamic mechanical analysis (DMA) and for AFM techniques where tip–sample sinusoidal interaction is permanent. This comparison highlights the substantial complexity added by the intermittent-contact nature of the interaction, which precludes the derivation of straightforward equations as is the case for the well-known harmonic excitations. The derivations offered have been thoroughly validated through numerical simulations. Despite the complexities inherent to the intermittent-contact nature of the technique, the analytical findings highlight the potential feasibility of extracting meaningful viscoelastic properties with this imaging method. PMID:29114450

  13. Does Independent Research with a Faculty Member Enhance Four-Year Graduation and Graduate/Professional Degree Plans? Convergent Results with Different Analytical Methods

    ERIC Educational Resources Information Center

    Kilgo, Cindy A.; Pascarella, Ernest T.

    2016-01-01

    This study examines the effects of undergraduate students participating in independent research with faculty members on four-year graduation and graduate/professional degree aspirations. We analyzed four-year longitudinal data from the Wabash National Study of Liberal Arts Education using multiple analytic techniques. The findings support the…

  14. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for whom the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  15. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  16. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  17. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to inference analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.

  18. Method for improving accuracy in full evaporation headspace analysis.

    PubMed

    Xie, Wei-Qi; Chai, Xin-Sheng

    2017-05-01

    We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by the solid content change in the samples has a great impact on the measurement accuracy of conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique is effective in minimizing this problem. The proposed full evaporation multiple headspace extraction analysis technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
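    Multiple headspace extraction rests on the fact that peak areas from successive extractions of the same vial decay geometrically, so the total analyte response is the sum of a geometric series. A numerical sketch of this generic MHE relation (the peak areas are invented, and this is the textbook calculation rather than the authors' exact procedure):

```python
import numpy as np

# Hypothetical peak areas from four successive headspace extractions.
areas = np.array([1000.0, 600.0, 360.0, 216.0])  # geometric, ratio q = 0.6

# MHE model: A_i = A_1 * q**(i-1), so ln(A_i) is linear in extraction number.
i = np.arange(len(areas))
slope, intercept = np.polyfit(i, np.log(areas), 1)
q = np.exp(slope)        # extraction ratio per step
A1 = np.exp(intercept)   # first-extraction area from the fit

# Total analyte response = sum of the geometric series A_1 / (1 - q).
total = A1 / (1.0 - q)
print(round(q, 3), round(total, 1))  # 0.6 2500.0
```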

  19. Performance Analysis of Blind Subspace-Based Signature Estimation Algorithms for DS-CDMA Systems with Unknown Correlated Noise

    NASA Astrophysics Data System (ADS)

    Zarifi, Keyvan; Gershman, Alex B.

    2006-12-01

    We analyze the performance of two popular blind subspace-based signature waveform estimation techniques proposed by Wang and Poor and Buzzi and Poor for direct-sequence code division multiple-access (DS-CDMA) systems with unknown correlated noise. Using the first-order perturbation theory, analytical expressions for the mean-square error (MSE) of these algorithms are derived. We also obtain simple high SNR approximations of the MSE expressions which explicitly clarify how the performance of these techniques depends on the environmental parameters and how it is related to that of the conventional techniques that are based on the standard white noise assumption. Numerical examples further verify the consistency of the obtained analytical results with simulation results.

  20. New techniques for the analysis of manual control systems. [mathematical models of human operator behavior

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.

    1971-01-01

    Studies are summarized on the application of advanced analytical and computational methods to the development of mathematical models of human controllers in multiaxis manual control systems. Specific accomplishments include the following: (1) The development of analytical and computer methods for the measurement of random parameters in linear models of human operators. (2) Discrete models of human operator behavior in a multiple display situation were developed. (3) Sensitivity techniques were developed which make possible the identification of unknown sampling intervals in linear systems. (4) The adaptive behavior of human operators following particular classes of vehicle failures was studied and a model structure proposed.

  1. An orientation soil survey at the Pebble Cu-Au-Mo porphyry deposit, Alaska

    USGS Publications Warehouse

    Smith, Steven M.; Eppinger, Robert G.; Fey, David L.; Kelley, Karen D.; Giles, S.A.

    2009-01-01

    Soil samples were collected in 2007 and 2008 along three traverses across the giant Pebble Cu-Au-Mo porphyry deposit. Within each soil pit, four subsamples were collected following recommended protocols for each of ten commonly used and proprietary leach/digestion techniques. The significance of geochemical patterns generated by these techniques was classified by visual inspection of plots showing individual element concentration by each analytical method along the 2007 traverse. A simple matrix of element versus method, populated with values based on the significance classification, provides a means of ranking the utility of methods and elements at this deposit. The interpretation of a complex multi-element dataset derived from multiple analytical techniques is challenging. An example of vanadium results from a single leach technique is used to illustrate several possible interpretations of the data.

  2. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  3. Developing semi-analytical solution for multiple-zone transient storage model with spatially non-uniform storage

    NASA Astrophysics Data System (ADS)

    Deng, Baoqing; Si, Yinbing; Wang, Jia

    2017-12-01

    Transient storage may vary along a stream due to stream hydraulic conditions and the characteristics of the storage zones. Analytical solutions of transient storage models in the literature have not covered spatially non-uniform storage. A novel integral transform strategy is presented that simultaneously transforms the concentrations in the stream and in the storage zones using a single set of eigenfunctions derived from the advection-diffusion equation of the stream. The semi-analytical solution of the multiple-zone transient storage model with spatially non-uniform storage is obtained by applying the generalized integral transform technique to all partial differential equations in the model. The derived semi-analytical solution is validated against field data from the literature, with good agreement between the computed and measured values. Some illustrative examples are formulated to demonstrate applications of the present solution. It is shown that solute transport can be greatly affected by variation in the mass exchange coefficient and in the ratio of cross-sectional areas. When the ratio of cross-sectional areas is large or the mass exchange coefficient is small, more reaches are recommended for calibrating the parameters.
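    For context, the single-zone transient storage model that multiple-zone formulations generalize is conventionally written as follows (standard OTIS-type notation from memory; the authors' multi-zone, spatially non-uniform extension differs in its details):

```latex
% Stream channel: advection, dispersion, and exchange with storage
\frac{\partial C}{\partial t} =
  -\frac{Q}{A}\frac{\partial C}{\partial x}
  + D \frac{\partial^2 C}{\partial x^2}
  + \alpha \,(C_s - C)

% Storage zone: first-order exchange with the stream
\frac{\partial C_s}{\partial t} = \alpha \, \frac{A}{A_s}\,(C - C_s)
```

    In a multiple-zone model the single exchange term is replaced by a sum over zones, \(\sum_k \alpha_k (C_{s,k} - C)\), with one storage equation per zone k; allowing \(\alpha_k\) and \(A_{s,k}\) to vary with x gives the spatially non-uniform case treated in the paper.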

  4. Microbiological Detection Systems for Molecular Analysis of Environmental Water and Soil Samples

    EPA Science Inventory

    Multiple detection systems are being targeted to track various species and genotypes of pathogens found in environmental samples with the overreaching goal of developing analytical separation and detection techniques for Salmonella enterica Serovars Typhi, Cryptosporidium parvum,...

  5. Visualization Techniques for Computer Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Justin M; Steed, Chad A; Patton, Robert M

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  6. Hybrid-dual-Fourier tomographic algorithm for a fast three-dimensional optical image reconstruction in turbid media

    NASA Technical Reports Server (NTRS)

    Alfano, Robert R. (Inventor); Cai, Wei (Inventor)

    2007-01-01

    A reconstruction technique for reducing the computational burden of 3D image processing, wherein the reconstruction procedure comprises an inverse and a forward model. The inverse model uses a hybrid dual Fourier algorithm that combines a 2D Fourier inversion with a 1D matrix inversion to provide high-speed inverse computations. The inverse algorithm uses a hybrid transform to provide fast Fourier inversion for data from multiple sources and multiple detectors. The forward model is based on an analytical cumulant solution of a radiative transfer equation. The accurate analytical form of the solution to the radiative transfer equation provides an efficient formalism for fast computation of the forward model.

  7. Focused Ion Beam Recovery and Analysis of Interplanetary Dust Particles (IDPs) and Stardust Analogues

    NASA Technical Reports Server (NTRS)

    Graham, G. A.; Bradley, J. P.; Bernas, M.; Stroud, R. M.; Dai, Z. R.; Floss, C.; Stadermann, F. J.; Snead, C. J.; Westphal, A. J.

    2004-01-01

    Meteoritics research is a major beneficiary of recent developments in analytical instrumentation [1,2]. Integrated studies in which multiple analytical techniques are applied to the same specimen are providing new insight about the nature of IDPs [1]. Such studies are dependent on the ability to prepare specimens that can be analyzed in multiple instruments. Focused ion beam (FIB) microscopy has revolutionized specimen preparation in materials science [3]. Although FIB has successfully been used for a few IDP and meteorite studies [1,4-6], it has yet to be widely utilized in meteoritics. We are using FIB for integrated TEM/NanoSIMS/synchrotron infrared (IR) studies [1].

  8. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    PubMed Central

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893

  9. Towards a minimally invasive sampling tool for high resolution tissue analytical mapping

    NASA Astrophysics Data System (ADS)

    Gottardi, R.

    2015-09-01

    Multiple spatial mapping techniques of biological tissues have been proposed over the years, but all present limitations either in terms of resolution, analytical capacity or invasiveness. Ren et al (2015 Nanotechnology 26 284001) propose in their most recent work the use of a picosecond infrared laser (PIRL) under conditions of ultrafast desorption by impulsive vibrational excitation (DIVE) to extract small amounts of cellular and molecular components, conserving their viability, structure and activity. The PIRL DIVE technique would then work as a nanobiopsy with minimal damage to the surrounding tissues, which could potentially be applied for high resolution local structural characterization of tissues in health and disease with the spatial limit determined by the laser focus.

  10. Determinations of rare earth element abundance and U-Pb age of zircons using multispot laser ablation-inductively coupled plasma mass spectrometry.

    PubMed

    Yokoyama, Takaomi D; Suzuki, Toshihiro; Kon, Yoshiaki; Hirata, Takafumi

    2011-12-01

    We have developed a new calibration technique for multielement determination and U-Pb dating of zircon samples using laser ablation-inductively coupled plasma mass spectrometry (ICPMS) coupled with galvanometric optics. With the galvanometric optics, laser ablation of two or more sample materials could be achieved within very short time intervals (~10 ms). The resulting sample aerosols released from different ablation pits or different solid samples were mixed and homogenized within the sample cell and then transported into the ICP ion source. Multiple-spot laser ablation enables spiking of analytes or internal standard elements directly into the solid samples, and therefore the standard addition calibration method can be applied for the determination of trace elements in solid samples. In this study, we measured the rare earth element (REE) abundances of two zircon samples (Nancy 91500 and Plešovice) by the standard addition technique, using direct spiking of analytes through multispot laser ablation of the glass standard material (NIST SRM 612). The resulting REE abundance data show good agreement with previously reported values within the analytical uncertainties achieved in this study (10% for most elements). Our experiments demonstrated that nonspectroscopic interferences on 14 REEs could be significantly reduced by the standard addition technique employed here. Another advantage of galvanometric devices is the accumulation of sample aerosol released from multiple spots. In this study we measured the U-Pb age of a zircon sample (LMR) using an accumulation of sample aerosols released from 10 separate ablation pits of small diameter (~8 μm). The resulting ²³⁸U-²⁰⁶Pb age for the LMR zircons was 369 ± 64 Ma, in good agreement with previously reported age data (367.6 ± 1.5 Ma). The data obtained here clearly demonstrate that the multiple-spot laser ablation-ICPMS technique can become a powerful approach for elemental and isotopic ratio measurements in solid materials.
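
    The standard addition calibration described above can be illustrated numerically: the analyte signal is regressed against the spiked concentration, and the unknown concentration is recovered from the x-intercept of the regression line. A sketch with synthetic data (hypothetical numbers, not the authors' measurements):

```python
import numpy as np

# Synthetic standard-addition experiment: signal = k * (C_x + C_added),
# so the fitted line crosses zero signal at -C_x.
k, c_x = 2.5, 4.0                       # hypothetical sensitivity and unknown conc.
c_added = np.array([0.0, 2.0, 4.0, 8.0])
signal = k * (c_x + c_added)

slope, intercept = np.polyfit(c_added, signal, 1)
c_recovered = intercept / slope         # equals C_x for an ideal linear response
print(round(c_recovered, 3))  # 4.0
```

    The same extrapolation logic applies whether the spike comes from solution additions or, as here, from co-ablating a spiked standard glass.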

  11. Learning Dashboards

    ERIC Educational Resources Information Center

    Charleer, Sven; Klerkx, Joris; Duval, Erik

    2014-01-01

    This article explores how information visualization techniques can be applied to learning analytics data to help teachers and students deal with the abundance of learner traces. We also investigate how the affordances of large interactive surfaces can facilitate a collaborative sense-making environment for multiple students and teachers to explore…

  12. Recent Advances in Bioprinting and Applications for Biosensing

    PubMed Central

    Dias, Andrew D.; Kingsley, David M.; Corr, David T.

    2014-01-01

    Future biosensing applications will require high performance, including real-time monitoring of physiological events, incorporation of biosensors into feedback-based devices, detection of toxins, and advanced diagnostics. Such functionality will necessitate biosensors with increased sensitivity, specificity, and throughput, as well as the ability to simultaneously detect multiple analytes. While these demands have yet to be fully realized, recent advances in biofabrication may allow sensors to achieve the high spatial sensitivity required, and bring us closer to achieving devices with these capabilities. To this end, we review recent advances in biofabrication techniques that may enable cutting-edge biosensors. In particular, we focus on bioprinting techniques (e.g., microcontact printing, inkjet printing, and laser direct-write) that may prove pivotal to biosensor fabrication and scaling. Recent biosensors have employed these fabrication techniques with success, and further development may enable higher performance, including multiplexing multiple analytes or cell types within a single biosensor. We also review recent advances in 3D bioprinting, and explore their potential to create biosensors with live cells encapsulated in 3D microenvironments. Such advances in biofabrication will expand biosensor utility and availability, with impact realized in many interdisciplinary fields, as well as in the clinic. PMID:25587413

  13. Multiple microbial activity-based measures reflect effects of cover cropping and tillage on soils

    USDA-ARS?s Scientific Manuscript database

    Agricultural producers, conservation professionals, and policy makers are eager to learn of soil analytical techniques and data that document improvement in soil health by agricultural practices such as no-till and incorporation of cover crops. However, there is considerable uncertainty within the r...

  14. A Review of Meta-Analysis Packages in R

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Hennessy, Emily A.; Tanner-Smith, Emily E.

    2017-01-01

    Meta-analysis is a statistical technique that allows an analyst to synthesize effect sizes from multiple primary studies. To estimate meta-analysis models, the open-source statistical environment R is quickly becoming a popular choice. The meta-analytic community has contributed to this growth by developing numerous packages specific to…
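
    The core computation behind such packages, inverse-variance pooling of study effect sizes, fits in a few lines; a sketch of the simple fixed-effect estimator (packages like metafor add random-effects models and many refinements):

```python
import numpy as np

def fixed_effect_pool(effects, variances):
    """Inverse-variance weighted mean effect and its standard error."""
    w = 1.0 / np.asarray(variances, dtype=float)
    pooled = np.sum(w * np.asarray(effects, dtype=float)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, se

# Three hypothetical study effect sizes with their sampling variances.
effect, se = fixed_effect_pool([0.30, 0.50, 0.40], [0.01, 0.04, 0.02])
print(round(effect, 3), round(se, 3))  # 0.357 0.076
```

    More precise studies (smaller variances) receive proportionally more weight, which is the defining property of this estimator.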

  15. Teaching Algebraic Equations to Middle School Students with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Baker, Joshua N.; Rivera, Christopher J.; Morgan, Joseph John; Reese, Noelle

    2015-01-01

    The purpose of this study was to replicate similar instructional techniques of Jimenez, Browder, and Courtade (2008) using a single-subject multiple-probe across participants design to investigate the effects of task analytic instruction coupled with semi-concrete representations to teach linear algebraic equations to middle school students with…

  16. Break-even Analysis: Tool for Budget Planning

    ERIC Educational Resources Information Center

    Lohmann, Roger A.

    1976-01-01

    Multiple funding creates special management problems for the administrator of a human service agency. This article presents a useful analytic technique adapted from business practice that can help the administrator draw up and balance a unified budget. Such a budget also affords reliable overview of the agency's financial status. (Author)
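
    The underlying arithmetic is the classic break-even relation: fixed costs divided by the per-unit contribution margin. A sketch with hypothetical agency figures:

```python
def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
    """Units of service at which revenue exactly covers total costs."""
    contribution = price_per_unit - variable_cost_per_unit
    if contribution <= 0:
        raise ValueError("price must exceed variable cost per unit")
    return fixed_costs / contribution

# Hypothetical figures: $120,000 fixed costs, $50 fee, $20 variable cost per unit.
print(break_even_units(120_000, 50, 20))  # 4000.0
```

    Below 4,000 units of service the hypothetical agency runs a deficit; above it, each additional unit contributes $30 toward surplus.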

  17. Analytic Methods for Evaluating Patterns of Multiple Congenital Anomalies in Birth Defect Registries.

    PubMed

    Agopian, A J; Evans, Jane A; Lupo, Philip J

    2018-01-15

    It is estimated that 20 to 30% of infants with birth defects have two or more birth defects. Among these infants with multiple congenital anomalies (MCA), co-occurring anomalies may represent either chance (i.e., unrelated etiologies) or pathogenically associated patterns of anomalies. While some MCA patterns have been recognized and described (e.g., known syndromes), others have not been identified or characterized. Elucidating these patterns may result in a better understanding of the etiologies of these MCAs. This article reviews the literature on analytic methods that have been used to evaluate patterns of MCAs, in particular those using birth defect registry data. A popular method for MCA assessment involves a comparison of the observed to expected ratio for a given combination of MCAs, or one of several modified versions of this comparison. Other methods include numerical taxonomy and other clustering techniques, multiple regression analysis, and log-linear analysis. Advantages and disadvantages of these approaches, as well as specific applications, are outlined. Despite the availability of multiple analytic approaches, relatively few MCA combinations have been assessed, so the selection of an analytic approach may depend on several considerations. The availability of large birth defect registries and computing resources that allow for automated, big-data strategies for prioritizing MCA patterns may open new avenues for better understanding the co-occurrence of birth defects. Birth Defects Research 110:5-11, 2018. © 2017 Wiley Periodicals, Inc.
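
    The observed-to-expected comparison can be sketched in its simplest form: under independence, the expected number of infants with both defects A and B is N·(nA/N)·(nB/N), and the O/E ratio flags combinations occurring more often than chance predicts. A simplified two-defect illustration with hypothetical counts (real registry analyses add covariate adjustment and multiple-testing control):

```python
def observed_to_expected(n_total, n_a, n_b, n_ab):
    """O/E ratio for co-occurrence of defects A and B, taking the
    expected count under independence: E = N * (n_a/N) * (n_b/N)."""
    expected = n_total * (n_a / n_total) * (n_b / n_total)
    return n_ab / expected

# Hypothetical registry: 100,000 births, defect A in 200 infants,
# defect B in 150, and both defects observed together in 12.
ratio = observed_to_expected(100_000, 200, 150, 12)
print(round(ratio, 1))  # 40.0
```

    An O/E ratio far above 1 suggests the pair is not co-occurring by chance and may merit evaluation as a pathogenically associated pattern.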

  18. A variance-decomposition approach to investigating multiscale habitat associations

    USGS Publications Warehouse

    Lawler, J.J.; Edwards, T.C.

    2006-01-01

    The recognition of the importance of spatial scale in ecology has led many researchers to take multiscale approaches to studying habitat associations. However, few of the studies that investigate habitat associations at multiple spatial scales have considered the potential effects of cross-scale correlations in measured habitat variables. When cross-scale correlations in such studies are strong, conclusions drawn about the relative strength of habitat associations at different spatial scales may be inaccurate. Here we adapt and demonstrate an analytical technique based on variance decomposition for quantifying the influence of cross-scale correlations on multiscale habitat associations. We used the technique to quantify the variation in nest-site locations of Red-naped Sapsuckers (Sphyrapicus nuchalis) and Northern Flickers (Colaptes auratus) associated with habitat descriptors at three spatial scales. We demonstrate how the method can be used to identify components of variation that are associated only with factors at a single spatial scale as well as shared components of variation that represent cross-scale correlations. Despite the fact that no explanatory variables in our models were highly correlated (r < 0.60), we found that shared components of variation reflecting cross-scale correlations accounted for roughly half of the deviance explained by the models. These results highlight the importance of both conducting habitat analyses at multiple spatial scales and of quantifying the effects of cross-scale correlations in such analyses. Given the limits of conventional analytical techniques, we recommend alternative methods, such as the variance-decomposition technique demonstrated here, for analyzing habitat associations at multiple spatial scales. © The Cooper Ornithological Society 2006.
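
    The variance-decomposition idea can be sketched with nested regressions: fitting scale A alone, scale B alone, and both together lets the explained variation be split into components unique to each scale plus a shared component, R²A + R²B − R²AB, that reflects cross-scale correlation. A toy illustration with synthetic correlated predictors (not the study's data; the paper partitions deviance rather than ordinary least-squares R²):

```python
import numpy as np

def r_squared(X, y):
    """R-squared of an ordinary least-squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(0)
n = 500
a = rng.normal(size=n)                  # habitat variable at scale A
b = 0.8 * a + 0.2 * rng.normal(size=n)  # scale-B variable, correlated across scales
y = a + b + rng.normal(size=n)          # response driven by both scales

r2_a = r_squared(a[:, None], y)
r2_b = r_squared(b[:, None], y)
r2_ab = r_squared(np.column_stack([a, b]), y)

pure_a = r2_ab - r2_b                   # variation unique to scale A
pure_b = r2_ab - r2_a                   # variation unique to scale B
shared = r2_a + r2_b - r2_ab            # cross-scale (shared) component
print(round(pure_a, 2), round(pure_b, 2), round(shared, 2))
```

    With strongly cross-correlated predictors the shared component dominates, mirroring the paper's finding that cross-scale correlation can account for much of the explained variation even when no pairwise correlation is extreme.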

  19. 4D Hyperspherical Harmonic (HyperSPHARM) Representation of Multiple Disconnected Brain Subcortical Structures

    PubMed Central

    Hosseinbor, A. Pasha; Chung, Moo K.; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matt; Alexander, Andrew L.; Davidson, Richard J.

    2014-01-01

    We present a novel surface parameterization technique using hyperspherical harmonics (HSH) to represent compact, multiple, disconnected brain subcortical structures as a single analytic function. The proposed hyperspherical harmonic representation (HyperSPHARM) has many advantages over the widely used spherical harmonic (SPHARM) parameterization technique. SPHARM requires flattening a 3D surface onto the 3D sphere, which can be time consuming for large surface meshes, and it cannot represent multiple disconnected objects with a single parameterization. HyperSPHARM, on the other hand, treats a 3D object, via a simple stereographic projection, as the surface of a 4D hypersphere with extremely large radius, thereby avoiding the computationally demanding flattening process. HyperSPHARM is shown to achieve a better reconstruction with only 5 basis functions, compared to SPHARM, which requires more than 441. PMID:24505716
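
    The stereographic step can be made concrete: a point x in R³ maps onto a 4D hypersphere of radius r by inverse stereographic projection, and for very large r the sphere's surface is locally almost flat near the mapped data. A minimal sketch using one standard convention for the projection (the paper's exact formulation may differ):

```python
import numpy as np

def inverse_stereographic(x, r):
    """Map a 3D point onto the 4D hypersphere of radius r centered at the
    origin (projection point at the pole (0, 0, 0, r))."""
    x = np.asarray(x, dtype=float)
    d2 = x @ x
    u = 2 * r**2 * x / (d2 + r**2)      # first three hypersphere coordinates
    w = r * (d2 - r**2) / (d2 + r**2)   # fourth coordinate
    return np.append(u, w)

p = inverse_stereographic([1.0, 2.0, 2.0], r=100.0)
print(np.linalg.norm(p))  # ≈ 100: the image lies exactly on the hypersphere
```

    One can verify algebraically that |u|² + w² = r² for any x, so the map needs no iterative flattening, which is the computational advantage the abstract describes.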

  20. Chemical and Biological Dynamics Using Droplet-Based Microfluidics.

    PubMed

    Dressler, Oliver J; Casadevall I Solvas, Xavier; deMello, Andrew J

    2017-06-12

    Recent years have witnessed an increased use of droplet-based microfluidic techniques in a wide variety of chemical and biological assays. Nevertheless, obtaining dynamic data from these platforms has remained challenging, as this often requires reading the same droplets (possibly thousands of them) multiple times over a wide range of intervals (from milliseconds to hours). In this review, we introduce the elemental techniques for the formation and manipulation of microfluidic droplets, together with the most recent developments in these areas. We then discuss a wide range of analytical methods that have been successfully adapted for analyte detection in droplets. Finally, we highlight a diversity of studies where droplet-based microfluidic strategies have enabled the characterization of dynamic systems that would otherwise have remained unexplorable.

  1. Stakeholder perspectives on decision-analytic modeling frameworks to assess genetic services policy.

    PubMed

    Guzauskas, Gregory F; Garrison, Louis P; Stock, Jacquie; Au, Sylvia; Doyle, Debra Lochner; Veenstra, David L

    2013-01-01

    Genetic services policymakers and insurers often make coverage decisions in the absence of complete evidence of clinical utility and under budget constraints. We evaluated genetic services stakeholder opinions on the potential usefulness of decision-analytic modeling to inform coverage decisions, and asked them to identify genetic tests for decision-analytic modeling studies. We presented an overview of decision-analytic modeling to members of the Western States Genetic Services Collaborative Reimbursement Work Group and state Medicaid representatives and conducted directed content analysis and an anonymous survey to gauge their attitudes toward decision-analytic modeling. Participants also identified and prioritized genetic services for prospective decision-analytic evaluation. Participants expressed dissatisfaction with current processes for evaluating insurance coverage of genetic services. Some participants expressed uncertainty about their comprehension of decision-analytic modeling techniques. All stakeholders reported openness to using decision-analytic modeling for genetic services assessments. Participants were most interested in application of decision-analytic concepts to multiple-disorder testing platforms, such as next-generation sequencing and chromosomal microarray. Decision-analytic modeling approaches may provide a useful decision tool to genetic services stakeholders and Medicaid decision-makers.

  2. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to characterise it better than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is no broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography–mass spectrometry and energy-dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis was among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well established that the instrumentation is commercially available, examining physical, mechanical, electrical and thermal properties, in addition to variations in composition, rather than methods intended solely to identify and quantify chemical species. Therefore, this topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy-based techniques, scanning probe-based techniques, and thermal analysis-based techniques. 
Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to understand the sample in greater detail than was possible before, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples, a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.

  3. MASS SPECTROMETRY IMAGING FOR DRUGS AND METABOLITES

    PubMed Central

    Greer, Tyler; Sturm, Robert; Li, Lingjun

    2011-01-01

    Mass spectrometric imaging (MSI) is a powerful analytical technique that provides two- and three-dimensional spatial maps of multiple compounds in a single experiment. This technique has been routinely applied to protein, peptide, and lipid molecules with much less research reporting small molecule distributions, especially pharmaceutical drugs. This review’s main focus is to provide readers with an up-to-date description of the substrates and compounds that have been analyzed for drug and metabolite composition using MSI technology. Additionally, ionization techniques, sample preparation, and instrumentation developments are discussed. PMID:21515430

  4. Visual analytics techniques for large multi-attribute time series data

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.

    2008-01-01

    Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Business people often want to compare this year's monthly sales with last year's sales to make decisions, and data warehouse administrators (DBAs) want to know their daily data-loading job performance and need to detect outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: the color cell-based Visual Time Series Line Charts and Maps, which highlight significant changes over time in long time series data, and the new Visual Content Query, which facilitates finding the contents and histories of interesting patterns and anomalies, leading to root cause identification. We have applied both methods to two real-world applications, mining enterprise data warehouse and customer credit card fraud data, to illustrate the wide applicability and usefulness of these techniques.

  5. Analytical application of solid contact ion-selective electrodes for determination of copper and nitrate in various food products and drinking water.

    PubMed

    Wardak, Cecylia; Grabarczyk, Malgorzata

    2016-08-02

    A simple, fast and cheap method for monitoring copper and nitrate in drinking water and food products using newly developed solid contact ion-selective electrodes is proposed. Determination of copper and nitrate was performed by application of multiple standard additions technique. The reliability of the obtained results was assessed by comparing them using the anodic stripping voltammetry or spectrophotometry for the same samples. In each case, satisfactory agreement of the results was obtained, which confirms the analytical usefulness of the constructed electrodes.
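
    The multiple standard additions technique used with such electrodes is often evaluated via a Gran-plot linearization: each addition of standard shifts the electrode potential E, and the Gran function G = (V0 + V)·10^(E/S) is linear in the added volume V, with an x-intercept that yields the unknown concentration. A sketch with synthetic potentials, assuming an ideal Nernstian response with known slope S (hypothetical numbers, not the paper's data):

```python
import numpy as np

V0, c_x = 50.0, 1e-4           # sample volume (mL) and unknown conc. (mol/L)
c_std, S = 1e-2, 29.58         # standard conc. (mol/L); Nernst slope for a 2+ ion (mV/decade)
V_add = np.array([0.0, 0.5, 1.0, 2.0])           # added standard volumes (mL)

# Simulate ideal Nernstian potentials after each addition (dilution included).
conc = (c_x * V0 + c_std * V_add) / (V0 + V_add)
E = 100.0 + S * np.log10(conc)                   # mV; 100.0 is an arbitrary E0

# Gran linearization: G is proportional to (c_x*V0 + c_std*V), i.e. linear in V.
G = (V0 + V_add) * 10 ** (E / S)
slope, intercept = np.polyfit(V_add, G, 1)
V_e = -intercept / slope                         # x-intercept (a negative volume)
c_recovered = -c_std * V_e / V0
print(f"{c_recovered:.2e}")  # 1.00e-04
```

    The extrapolated intercept converts directly to the sample concentration, which is why multiple additions give more robust results than a single spike.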

  6. Analysis of multiple mycotoxins in food.

    PubMed

    Hajslova, Jana; Zachariasova, Milena; Cajka, Tomas

    2011-01-01

    Mycotoxins are secondary metabolites of microscopic filamentous fungi. Given the widespread distribution of fungi in the environment, mycotoxins are considered to be among the most important natural contaminants in foods and feeds. To protect consumers' health and reduce economic losses, surveillance and control of mycotoxins in food and feed has become a major objective for producers, regulatory authorities, and researchers worldwide. In this context, the availability of reliable analytical methods applicable for this purpose is essential. Since the variety of chemical structures of mycotoxins makes it impossible to use one single technique for their analysis, a vast number of analytical methods have been developed and validated. Both the large variability of food matrices and growing demands for fast, cost-saving and accurate determination of multiple mycotoxins by a single method pose new challenges for analytical research. This effort is facilitated by technical developments in mass spectrometry that allow matrix effects to be reduced even when the sample clean-up step is omitted. The current state of the art, together with future trends, is presented in this chapter. Attention is focused mainly on instrumental methods; advances in biosensors and other bioanalytical screening approaches enabling analysis of multiple mycotoxins are not discussed in detail.

  7. Analytical Applications of Transport Through Bulk Liquid Membranes.

    PubMed

    Diaconu, Ioana; Ruse, Elena; Aboul-Enein, Hassan Y; Bunaciu, Andrei A

    2016-07-03

    This review discusses the results of research in the use of bulk liquid membranes in separation processes and preconcentration for analytical purposes. It includes some theoretical aspects, definitions, types of liquid membranes, and transport mechanism, as well as advantages of using liquid membranes in laboratory studies. These concepts are necessary to understand fundamental principles of liquid membrane transport. Due to the multiple advantages of liquid membranes several studies present analytical applications of the transport through liquid membranes in separation or preconcentration processes of metallic cations and some organic compounds, such as phenol and phenolic derivatives, organic acids, amino acids, carbohydrates, and drugs. This review presents coupled techniques such as separation through the liquid membrane coupled with flow injection analysis.

  8. ION COMPOSITION ELUCIDATION (ICE): A HIGH RESOLUTION MASS SPECTROMETRIC TECHNIQUE FOR IDENTIFYING COMPOUNDS IN COMPLEX MIXTURES

    EPA Science Inventory

    When tentatively identifying compounds in complex mixtures using mass spectral libraries, multiple matches or no plausible matches due to a high level of chemical noise or interferences can occur. Worse yet, most analytes are not in the libraries. In each case, Ion Composition El...

  9. Leading singularities and off-shell conformal integrals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drummond, James; Duhr, Claude; Eden, Burkhard

    2013-08-29

    The three-loop four-point function of stress-tensor multiplets in N=4 super Yang-Mills theory contains two so far unknown, off-shell, conformal integrals, in addition to the known, ladder-type integrals. In our paper we evaluate the unknown integrals, thus obtaining the three-loop correlation function analytically. The integrals have the generic structure of rational functions multiplied by (multiple) polylogarithms. We use the idea of leading singularities to obtain the rational coefficients, the symbol — with an appropriate ansatz for its structure — as a means of characterising multiple polylogarithms, and the technique of asymptotic expansion of Feynman integrals to obtain the integrals in certain limits. The limiting behaviour uniquely fixes the symbols of the integrals, which we then lift to find the corresponding polylogarithmic functions. The final formulae are numerically confirmed. Furthermore, we develop techniques that can be applied more generally, and we illustrate this by analytically evaluating one of the integrals contributing to the same four-point function at four loops. This example shows a connection between the leading singularities and the entries of the symbol.

  10. Characterization of Compton-scatter imaging with an analytical simulation method

    PubMed Central

    Jones, Kevin C; Redler, Gage; Templeton, Alistair; Bernard, Damian; Turian, Julius V; Chu, James C H

    2018-01-01

    By collimating the photons scattered when a megavoltage therapy beam interacts with the patient, a Compton-scatter image may be formed without the delivery of an extra dose. To characterize and assess the potential of the technique, an analytical model for simulating scatter images was developed and validated against Monte Carlo (MC). For three phantoms, the scatter images collected during irradiation with a 6 MV flattening-filter-free therapy beam were simulated. Images, profiles, and spectra were compared for different phantoms and different irradiation angles. The proposed analytical method simulates accurate scatter images up to 1000 times faster than MC. Minor differences between MC and analytical simulated images are attributed to limitations in the isotropic superposition/convolution algorithm used to analytically model multiple-order scattering. For a detector placed at 90° relative to the treatment beam, the simulated scattered photon energy spectrum peaks at 140–220 keV, and 40–50% of the photons are the result of multiple scattering. The high energy photons originate at the beam entrance. Increasing the angle between source and detector increases the average energy of the collected photons and decreases the relative contribution of multiple scattered photons. Multiple scattered photons cause blurring in the image. For an ideal 5 mm diameter pinhole collimator placed 18.5 cm from the isocenter, 10 cGy of deposited dose (2 Hz imaging rate for 1200 MU min−1 treatment delivery) is expected to generate an average 1000 photons per mm2 at the detector. For the considered lung tumor CT phantom, the contrast is high enough to clearly identify the lung tumor in the scatter image. Increasing the treatment beam size perpendicular to the detector plane decreases the contrast, although the scatter subject contrast is expected to be greater than the megavoltage transmission image contrast. 
With the analytical method, real-time tumor tracking may be possible through comparison of simulated and acquired patient images. PMID:29243663
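
    The quoted energies are consistent with single-scatter Compton kinematics: a photon of energy E scattered through angle θ emerges with E' = E / (1 + (E/511 keV)(1 − cos θ)), so photons scattered at 90° are capped below 511 keV regardless of the incident energy, and the lower observed spectral peak reflects the large multiple-scatter contribution. A quick numerical check (illustrative only):

```python
import math

def compton_scattered_energy(e_kev, theta_deg):
    """Energy (keV) of a photon of energy e_kev after Compton scattering
    through theta_deg; 511 keV is the electron rest energy."""
    cos_t = math.cos(math.radians(theta_deg))
    return e_kev / (1 + (e_kev / 511.0) * (1 - cos_t))

# Single 90-degree scatter for photons spanning a 6 MV beam spectrum:
for e in (300.0, 1000.0, 2000.0):
    print(round(compton_scattered_energy(e, 90.0), 1))
```

    A single 90° scatter of a ~300 keV photon lands near 189 keV, inside the 140-220 keV band quoted above, while even very energetic photons cannot exceed 511 keV at that angle.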

  11. Characterization of Compton-scatter imaging with an analytical simulation method

    NASA Astrophysics Data System (ADS)

    Jones, Kevin C.; Redler, Gage; Templeton, Alistair; Bernard, Damian; Turian, Julius V.; Chu, James C. H.

    2018-01-01

    By collimating the photons scattered when a megavoltage therapy beam interacts with the patient, a Compton-scatter image may be formed without the delivery of an extra dose. To characterize and assess the potential of the technique, an analytical model for simulating scatter images was developed and validated against Monte Carlo (MC). For three phantoms, the scatter images collected during irradiation with a 6 MV flattening-filter-free therapy beam were simulated. Images, profiles, and spectra were compared for different phantoms and different irradiation angles. The proposed analytical method simulates accurate scatter images up to 1000 times faster than MC. Minor differences between MC and analytical simulated images are attributed to limitations in the isotropic superposition/convolution algorithm used to analytically model multiple-order scattering. For a detector placed at 90° relative to the treatment beam, the simulated scattered photon energy spectrum peaks at 140-220 keV, and 40-50% of the photons are the result of multiple scattering. The high energy photons originate at the beam entrance. Increasing the angle between source and detector increases the average energy of the collected photons and decreases the relative contribution of multiple scattered photons. Multiple scattered photons cause blurring in the image. For an ideal 5 mm diameter pinhole collimator placed 18.5 cm from the isocenter, 10 cGy of deposited dose (2 Hz imaging rate for 1200 MU min-1 treatment delivery) is expected to generate an average 1000 photons per mm2 at the detector. For the considered lung tumor CT phantom, the contrast is high enough to clearly identify the lung tumor in the scatter image. Increasing the treatment beam size perpendicular to the detector plane decreases the contrast, although the scatter subject contrast is expected to be greater than the megavoltage transmission image contrast. 
With the analytical method, real-time tumor tracking may be possible through comparison of simulated and acquired patient images.

  12. Analysis of polymeric phenolics in red wines using different techniques combined with gel permeation chromatography fractionation.

    PubMed

    Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén

    2006-04-21

    A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.

  13. A new basaltic glass microanalytical reference material for multiple techniques

    USGS Publications Warehouse

    Wilson, Steve; Koenig, Alan; Lowers, Heather

    2012-01-01

    The U.S. Geological Survey (USGS) has been producing reference materials since the 1950s. Over 50 materials have been developed to cover bulk rock, sediment, and soils for the geological community. These materials are used globally in geochemistry, environmental, and analytical laboratories that perform bulk chemistry and/or microanalysis for instrument calibration and quality assurance testing. To answer the growing demand for higher spatial resolution and sensitivity, there is a need to create a new generation of microanalytical reference materials suitable for a variety of techniques, such as scanning electron microscopy/X-ray spectrometry (SEM/EDS), electron probe microanalysis (EPMA), laser ablation inductively coupled mass spectrometry (LA-ICP-MS), and secondary ion mass spectrometry (SIMS). As such, the microanalytical reference material (MRM) needs to be stable under the beam, be homogeneous at scales of better than 10–25 micrometers for the major to ultra-trace element level, and contain all of the analytes (elements or isotopes) of interest. Previous development of basaltic glasses intended for LA-ICP-MS has resulted in a synthetic basaltic matrix series of glasses (USGS GS-series) and a natural basalt series of glasses (BCR-1G, BHVO-2G, and NKT-1G). These materials have been useful for the LA-ICP-MS community but were not originally intended for use by the electron or ion beam community. A material developed from start to finish with intended use in multiple microanalytical instruments would be useful for inter-laboratory and inter-instrument platform comparisons. This article summarizes the experiments undertaken to produce a basalt glass reference material suitable for distribution as a multiple-technique round robin material. The goal of the analytical work presented here is to demonstrate that the elemental homogeneity of the new glass is acceptable for its use as a reference material. 
Because the round robin exercise is still underway, only nominal compositional ranges for each element are given in the article.

  14. Investigation of Stainless Steel Corrosion in Ultrahigh-Purity Water and Steam Systems by Surface Analytical Techniques

    NASA Astrophysics Data System (ADS)

    Dong, Xia; Iacocca, Ronald G.; Bustard, Bethany L.; Kemp, Craig A. J.

    2010-02-01

    Stainless steel pipes with different degrees of rouging and a Teflon®-coated rupture disc with severe corrosion were thoroughly investigated by combining multiple surface analytical techniques. The surface roughness and iron oxide layer thickness increase with increasing rouge severity, and the chromium oxide layer coexists with the iron oxide layer in samples with various degrees of rouging. Unlike the rouging observed for stainless steel pipes, the fast degradation of the rupture disc was caused by a crevice corrosion environment created by perforations in the protective Teflon coating. This failure analysis clearly shows the highly corrosive nature of ultrapure water used in the manufacture of pharmaceutical products, and demonstrates some of the unexpected corrosion mechanisms that can be encountered in these environments.

  15. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases designed to support new forms of unstructured or semi-structured data, as opposed to the more traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide the pieces of information needed to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and frame of reference for a systematic approach.
This talk will provide insights into big data analytics methods in the context of science within various communities, and offers different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community about which approaches are technically and scientifically feasible.
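The (iterative) map-reduce methods mentioned in the abstract follow a simple three-phase pattern that frameworks like Apache Hadoop distribute across a cluster. A minimal single-process sketch of that pattern in plain Python, using the canonical word-count example with toy data:

```python
from collections import defaultdict

def map_phase(records, mapper):
    """Apply the mapper to each record, yielding (key, value) pairs."""
    for record in records:
        yield from mapper(record)

def shuffle(pairs):
    """Group values by key, as the framework does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Apply the reducer to each key's list of values."""
    return {key: reducer(values) for key, values in groups.items()}

# Word count over toy documents.
docs = ["big data analytics", "data analysis", "big data"]
mapped = map_phase(docs, lambda doc: ((w, 1) for w in doc.split()))
counts = reduce_phase(shuffle(mapped), sum)
# counts["data"] == 3, counts["big"] == 2
```

In a real deployment the shuffle is what the framework provides: mappers and reducers run in parallel on different machines, and iterative variants (as in Twister) rerun the cycle until convergence.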

  16. The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science

    NASA Astrophysics Data System (ADS)

    Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.

    2017-12-01

    The Multi-Instrument Database (MIND) is the web-based home to a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, datasets stored in MIND are acquired using a variety of analytical techniques over multiple spatial and spectral scales, including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software). Georeferencing techniques transform individual instrument data maps into a layered co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.
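PCA, the technique the record names for guiding interpretation of large spectral datasets, reduces to a singular value decomposition of the mean-centered data matrix. A minimal sketch on synthetic stand-in data (not MIND data; the shapes and signal structure are illustrative assumptions):

```python
import numpy as np

def pca(spectra, n_components=2):
    """Project spectra (rows = measurements, columns = spectral channels)
    onto their leading principal components via SVD of the centered data."""
    centered = spectra - spectra.mean(axis=0)
    # Rows of Vt are the principal axes, ordered by decreasing variance.
    U, s, Vt = np.linalg.svd(centered, full_matrices=False)
    scores = centered @ Vt[:n_components].T        # coordinates in PC space
    explained = (s ** 2) / np.sum(s ** 2)          # variance ratio per axis
    return scores, explained[:n_components]

# Synthetic stand-in: 100 "spectra" of 50 channels, with most variance
# concentrated along a single spectral direction.
rng = np.random.default_rng(0)
direction = rng.normal(size=50)
spectra = rng.normal(size=(100, 1)) * direction + 0.1 * rng.normal(size=(100, 50))
scores, explained = pca(spectra, n_components=2)
# The first component captures most of the variance for data like this.
```

Plotting the first two score columns against sample labels is the usual way such projections are used to spot clusters of related spectra.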

  17. Active Control of Inlet Noise on the JT15D Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Smith, Jerome P.; Hutcheson, Florence V.; Burdisso, Ricardo A.; Fuller, Chris R.

    1999-01-01

    This report presents the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the year from November 1997 to December 1998 on the Active Noise Control of Turbofan Engines research project funded by NASA Langley Research Center. The concept of implementing active noise control techniques with fuselage-mounted error sensors is investigated both analytically and experimentally. The analytical part of the project involves the continued development of an advanced modeling technique to provide prediction and design guidelines for application of active noise control techniques to large, realistic high bypass engines of the type on which active control methods are expected to be applied. Results from the advanced analytical model are presented that show the effectiveness of the control strategies, and the analytical results presented for fuselage error sensors show good agreement with the experimentally observed results and provide additional insight into the control phenomena. Additional analytical results are presented for active noise control used in conjunction with a wavenumber sensing technique. The experimental work is carried out on a running JT15D turbofan jet engine in a test stand at Virginia Tech. The control strategy used in these tests was the feedforward Filtered-X LMS algorithm. The control inputs were supplied by single and multiple circumferential arrays of acoustic sources equipped with neodymium iron cobalt magnets mounted upstream of the fan. The reference signal was obtained from an inlet mounted eddy current probe. The error signals were obtained from a number of pressure transducers flush-mounted in a simulated fuselage section mounted in the engine test cell. The active control methods are investigated when implemented with the control sources embedded within the acoustically absorptive material on a passively-lined inlet. 
The experimental results show that the combination of active control techniques with fuselage-mounted error sensors and passive control techniques is an effective means of reducing radiated noise from turbofan engines. Strategic selection of the location of the error transducers is shown to be effective for reducing the radiation towards particular directions in the farfield. An analytical model is used to predict the behavior of the control system and to guide the experimental design configurations, and the analytical results presented show good agreement with the experimentally observed results.

  18. A New Look at Multiple Goal Pursuit: The Promise of a Person-Centered Approach

    ERIC Educational Resources Information Center

    Wormington, Stephanie Virgine; Linnenbrink-Garcia, Lisa

    2017-01-01

    The current study reviewed and synthesized studies employing a person-centered approach to studying achievement goals. Towards this end, a common labeling scheme was developed for goal profiles. Ten profile types were identified across studies and compared via meta-analytic techniques in terms of academic motivation, social/emotional well-being,…

  19. Using Word Clouds for Fast, Formative Assessment of Students' Short Written Responses

    ERIC Educational Resources Information Center

    Brooks, Bill J.; Gilbuena, Debra M.; Krause, Stephen J.; Koretsky, Milo D.

    2014-01-01

    Active learning in class helps students develop deeper understanding of chemical engineering principles. While the use of multiple-choice ConcepTests is clearly effective, we advocate for including student writing in learning activities as well. In this article, we demonstrate that word clouds can provide a quick analytical technique to assess…
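The frequency tallying that underlies a word cloud is simple to sketch with the Python standard library; the student responses and the stop-word list below are hypothetical, for illustration only:

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "is", "of", "to", "and", "in"}  # minimal illustrative list

def word_frequencies(responses):
    """Tally content words across short written responses; in a word
    cloud, these counts drive the font size of each word."""
    counter = Counter()
    for text in responses:
        words = re.findall(r"[a-z']+", text.lower())
        counter.update(w for w in words if w not in STOPWORDS)
    return counter

# Hypothetical student responses to a fluid-mechanics concept question.
responses = [
    "Pressure drops because velocity increases",
    "The pressure is lower where velocity is higher",
    "Velocity increases so pressure decreases",
]
freqs = word_frequencies(responses)
# freqs["pressure"] == 3 and freqs["velocity"] == 3
```

An instructor scanning the resulting cloud sees at a glance which concepts dominate the class's responses, which is the formative-assessment use the article advocates.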

  20. Comparing Methods for Assessing Forest Soil Net Nitrogen Mineralization and Net Nitrification

    Treesearch

    S. S. Jefts; I. J. Fernandez; L.E. Rustad; D. B. Dail

    2004-01-01

    A variety of analytical techniques are used to evaluate rates of nitrogen (N) mineralization and nitrification in soils. The diversity of methods takes on added significance in forest ecosystem research where high soil heterogeneity and multiple soil horizons can make comparisons over time and space even more complex than in agricultural Ap horizons. This study...

  1. Recent Advances in Paper-Based Sensors

    PubMed Central

    Liana, Devi D.; Raguse, Burkhard; Gooding, J. Justin; Chow, Edith

    2012-01-01

    Paper-based sensors are a new alternative technology for fabricating simple, low-cost, portable and disposable analytical devices for many application areas including clinical diagnosis, food quality control and environmental monitoring. The unique properties of paper which allow passive liquid transport and compatibility with chemicals/biochemicals are the main advantages of using paper as a sensing platform. Depending on the main goal to be achieved in paper-based sensors, the fabrication methods and the analysis techniques can be tuned to fulfill the needs of the end-user. Current paper-based sensors are focused on microfluidic delivery of solution to the detection site, whereas more advanced designs involve complex 3-D geometries based on the same microfluidic principles. Although paper-based sensors are very promising, they still suffer from limitations in accuracy and sensitivity. However, it is anticipated that in the future, with advances in fabrication and analytical techniques, there will be more new and innovative developments in paper-based sensors. These sensors could better meet the current objectives of a viable low-cost and portable device, in addition to offering high sensitivity and selectivity, and multiple analyte discrimination. This paper is a review of recent advances in paper-based sensors and covers the following topics: existing fabrication techniques, analytical methods and application areas. Finally, the present challenges and future outlooks are discussed. PMID:23112667

  2. Use of pressure manifestations following the water plasma expansion for phytomass disintegration.

    PubMed

    Maroušek, Josef; Kwan, Jason Tai Hong

    2013-01-01

    A prototype capable of generating underwater high-voltage discharges (3.5 kV) coupled with water plasma expansion was constructed. The level of phytomass disintegration caused by transmission of the pressure shockwaves (50-60 MPa) that follow this expansion was analyzed using gas adsorption techniques. The dynamics of the external surface area and the micropore volume across multiple pretreatment stages of maize silage and sunflower seeds were approximated with robust analytical techniques. The multiple-fold increase in reaction surface translated into up to a 15% increase in cumulative methane production and an overall acceleration of the anaerobic fermentation process. Disintegration of the sunflower seeds allowed up to 45% higher oil yields at the same operating pressure.

  3. Government/Industry Workshop on Payload Loads Technology

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The fully operational space shuttle, which will offer science the opportunity to explore near-Earth orbit and, eventually, interplanetary space on a nearly limitless basis, is discussed. This multiplicity of payload/experiment combinations and frequency of launches places many burdens on dynamicists to predict launch and landing environments accurately and efficiently. Two major problems are apparent in the attempt to design for these diverse environments: (1) balancing the design criteria (loads, etc.) between launch and orbit operations, and (2) developing analytical techniques that are reliable, accurate, efficient, and low-cost enough to meet the challenge of multiple launches and payloads. This paper deals with the key issues inherent in these problems, the key trades required, the basic approaches needed, and a summary of state-of-the-art techniques.

  4. Text-based Analytics for Biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Charles, Lauren E.; Smith, William P.; Rounds, Jeremiah

    The ability to prevent, mitigate, or control a biological threat depends on how quickly the threat is identified and characterized. Ensuring the timely delivery of data and analytics is an essential aspect of providing adequate situational awareness in the face of a disease outbreak. This chapter outlines an analytic pipeline for supporting an advanced early warning system that can integrate multiple data sources and provide situational awareness of potential and occurring disease situations. The pipeline includes real-time automated data analysis founded on natural language processing (NLP), semantic concept matching, and machine learning techniques, to enrich content with metadata related to biosurveillance. Online news articles are presented as an example use case for the pipeline, but the processes can be generalized to any textual data. In this chapter, the mechanics of a streaming pipeline are briefly discussed, as well as the major steps required to provide targeted situational awareness. The text-based analytic pipeline includes various processing steps, as well as identifying article relevance to biosurveillance (e.g., a relevance algorithm) and article feature extraction (who, what, where, why, how, and when).

  5. Multi-technique quantitative analysis and socioeconomic considerations of lead, cadmium, and arsenic in children's toys and toy jewelry.

    PubMed

    Hillyer, Margot M; Finch, Lauren E; Cerel, Alisha S; Dattelbaum, Jonathan D; Leopold, Michael C

    2014-08-01

    A wide spectrum and large number of children's toys and toy jewelry items were purchased from both bargain and retail vendors and analyzed for arsenic, cadmium, and lead metal content using multiple analytical techniques, including flame and furnace atomic absorption spectroscopy as well as X-ray fluorescence spectroscopy. Particularly dangerous for young children, metal concentrations in toys/toy jewelry were assessed for compliance with current Consumer Product Safety Commission (CPSC) regulations (F963-11). A conservative metric involving multiple analytical techniques was used to categorize compliance: one-technique confirmation of metal in excess of CPSC limits indicated a "suspect" item, while confirmation by two different techniques warranted a non-compliant designation. Sample matrix-based standard addition provided additional confirmation of non-compliant and suspect products. Results suggest that origin of purchase, rather than cost, is a significant factor in the risk assessment of these materials, with 57% of toys/toy jewelry items from bargain stores non-compliant or suspect compared to only 15% from retail outlets and 13% if only low-cost items from the retail stores are compared. While jewelry was found to be the most problematic product (73% of non-compliant/suspect samples), lead (45%) and arsenic (76%) were the most dominant toxins found in non-compliant/suspect samples. Using the greater Richmond area as a model, the discrepancy between bargain and retail children's products, along with growing numbers of bargain stores in low-income and urban areas, exemplifies an emerging socioeconomic public health issue. Copyright © 2014 Elsevier Ltd. All rights reserved.
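The conservative compliance metric described above (one technique over the limit makes an item "suspect"; confirmation by two techniques makes it "non-compliant") can be sketched as a small decision rule. The limit values below are placeholders, not the actual CPSC F963-11 thresholds:

```python
# Placeholder limits in ppm for illustration; NOT the actual CPSC F963-11 values.
LIMITS_PPM = {"Pb": 100, "Cd": 75, "As": 25}

def classify(item_results):
    """item_results maps metal -> {technique: measured ppm}.
    One technique over the limit -> 'suspect';
    two or more techniques over the limit -> 'non-compliant'."""
    worst = "compliant"
    for metal, by_technique in item_results.items():
        exceedances = sum(ppm > LIMITS_PPM[metal] for ppm in by_technique.values())
        if exceedances >= 2:
            return "non-compliant"
        if exceedances == 1:
            worst = "suspect"
    return worst

# Hypothetical item: only one technique confirms a lead exceedance.
toy = {"Pb": {"FAAS": 150, "XRF": 40}, "As": {"FAAS": 5, "XRF": 10}}
# classify(toy) == "suspect"
```

The point of the two-technique requirement is to avoid condemning an item on a single measurement that might reflect matrix effects in one method, which is also why the study used standard addition as a further check.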

  6. XRF, μ-XRD and μ-spectroscopic techniques for revealing the composition and structure of paint layers on polychrome sculptures after multiple restorations.

    PubMed

    Franquelo, M L; Duran, A; Castaing, J; Arquillo, D; Perez-Rodriguez, J L

    2012-01-30

    This paper presents the novel application of recently developed analytical techniques to the study of paint layers on sculptures that have been restored/repainted several times across centuries. Analyses were performed using portable XRF, μ-XRD and μ-Raman instruments. Other techniques, such as optical microscopy, SEM-EDX and μ-FTIR, were also used. Pigments and other materials including vermilion, minium, red lac, ivory black, lead white, barium white, zinc white (zincite), titanium white (rutile and anatase), lithopone, gold and brass were detected. Pigments from both ancient and modern times were found, owing to the different restorations/repaintings carried out. μ-Raman was very useful for characterising some pigments that were difficult to determine by μ-XRD. In some cases, pigment identification was only possible by combining results from the different analytical techniques used in this work. This is the first article devoted to the study of sculpture cross-section samples using laboratory-made μ-XRD systems. Copyright © 2011 Elsevier B.V. All rights reserved.

  7. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns Across Heterogeneous Space-Time Data

    NASA Astrophysics Data System (ADS)

    Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.

    2017-12-01

    Spatiotemporal (ST) analytics applied to spatio-temporal data sources from major providers such as USGS, NOAA, the World Bank, and the World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes at local and global levels. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses the challenges of properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16000+ attributes covering 200+ countries for over 50 years from over 30 major sources, and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and describe how others may freely access the tool.

  8. Bayesian meta-analytical methods to incorporate multiple surrogate endpoints in drug development process.

    PubMed

    Bujkiewicz, Sylwia; Thompson, John R; Riley, Richard D; Abrams, Keith R

    2016-03-30

    A number of meta-analytical methods have been proposed that aim to evaluate surrogate endpoints. Bivariate meta-analytical methods can be used to predict the treatment effect for the final outcome from the treatment effect estimate measured on the surrogate endpoint, while taking into account the uncertainty around the effect estimate for the surrogate endpoint. In this paper, extensions to multivariate models are developed aiming to include multiple surrogate endpoints, with the potential benefit of reducing the uncertainty when making predictions. In this Bayesian multivariate meta-analytic framework, the between-study variability is modelled as a product of univariate normal distributions. This formulation is particularly convenient for including multiple surrogate endpoints and flexible for modelling outcomes which can be surrogate endpoints to the final outcome and potentially to one another. Two models are proposed: first, using an unstructured between-study covariance matrix, by assuming the treatment effects on all outcomes are correlated; and second, using a structured between-study covariance matrix, by assuming treatment effects on some of the outcomes are conditionally independent. While the two models are developed for summary data at the study level, the individual-level association is taken into account by the use of Prentice's criteria (obtained from individual patient data) to inform the within-study correlations in the models. The modelling techniques are investigated using an example in relapsing remitting multiple sclerosis, where disability worsening is the final outcome, while relapse rate and MRI lesions are potential surrogates to the disability progression. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
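The core of the bivariate approach the abstract builds on — predicting the final-outcome treatment effect from an observed surrogate effect — is ordinary conditioning in a bivariate normal model. A sketch with illustrative numbers (this is the standard conditional-normal identity, not the authors' full Bayesian multivariate model):

```python
import numpy as np

def predict_final_effect(y1, mu, Sigma):
    """Conditional mean and variance of the final-outcome effect (index 1)
    given an observed surrogate effect y1 (index 0), under a bivariate
    normal model with mean mu and covariance Sigma."""
    cond_mean = mu[1] + Sigma[1, 0] / Sigma[0, 0] * (y1 - mu[0])
    cond_var = Sigma[1, 1] - Sigma[1, 0] ** 2 / Sigma[0, 0]
    return cond_mean, cond_var

# Illustrative numbers: a strong correlation between the surrogate and
# final-outcome effects shrinks the predictive variance well below the
# marginal variance Sigma[1, 1].
mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.8], [0.8, 1.0]])
mean, var = predict_final_effect(0.5, mu, Sigma)
# mean == 0.4, var == 0.36 (i.e., 1 - 0.8**2)
```

Adding a second surrogate tightens this further: conditioning on two correlated outcomes instead of one reduces the conditional variance, which is exactly the motivation the paper gives for moving to multivariate models.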

  9. Analyzing the Heterogeneous Hierarchy of Cultural Heritage Materials: Analytical Imaging.

    PubMed

    Trentelman, Karen

    2017-06-12

    Objects of cultural heritage significance are created using a wide variety of materials, or mixtures of materials, and often exhibit heterogeneity on multiple length scales. The effective study of these complex constructions thus requires the use of a suite of complementary analytical technologies. Moreover, because of the importance and irreplaceability of most cultural heritage objects, researchers favor analytical techniques that can be employed noninvasively, i.e., without having to remove any material for analysis. As such, analytical imaging has emerged as an important approach for the study of cultural heritage. Imaging technologies commonly employed, from the macroscale through the micro- to nanoscale, are discussed with respect to how the information obtained helps us understand artists' materials and methods, the cultures in which the objects were created, how the objects may have changed over time, and importantly, how we may develop strategies for their preservation.

  10. Exploring phlebotomy technique as a pre-analytical factor in proteomic analyses by mass spectrometry.

    PubMed

    Penn, Andrew M; Lu, Linghong; Chambers, Andrew G; Balshaw, Robert F; Morrison, Jaclyn L; Votova, Kristine; Wood, Eileen; Smith, Derek S; Lesperance, Maria; del Zoppo, Gregory J; Borchers, Christoph H

    2015-12-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is an emerging technology for blood biomarker verification and validation; however, the results may be influenced by pre-analytical factors. This exploratory study was designed to determine if differences in phlebotomy techniques would significantly affect the abundance of plasma proteins in an upcoming biomarker development study. Blood was drawn from 10 healthy participants using four techniques: (1) a 20-gauge IV with vacutainer, (2) a 21-gauge direct vacutainer, (3) an 18-gauge butterfly with vacutainer, and (4) an 18-gauge butterfly with syringe draw. The abundances of a panel of 122 proteins (117 proteins, plus 5 matrix metalloproteinase (MMP) proteins) were targeted by LC/MRM-MS. In addition, complete blood count (CBC) data were also compared across the four techniques. Phlebotomy technique significantly affected 2 of the 11 CBC parameters (red blood cell count, p = 0.010; hemoglobin concentration, p = 0.035) and only 12 of the targeted 117 proteins (p < 0.05). Of the five MMP proteins, only MMP7 was detectable and its concentration was not significantly affected by different techniques. Overall, most proteins in this exploratory study were not significantly influenced by phlebotomy technique; however, a larger study with additional patients will be required for confirmation.
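Testing whether a protein's abundance differs across the four draw techniques amounts to a one-way ANOVA across groups. A self-contained F-statistic sketch on hypothetical abundance data (the specific abundances and group sizes are invented for illustration):

```python
import numpy as np

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA: ratio of between-group to
    within-group mean squares, as used to test whether a protein's
    mean abundance differs across phlebotomy techniques."""
    all_obs = np.concatenate(groups)
    grand_mean = all_obs.mean()
    k = len(groups)                # number of techniques
    n = len(all_obs)               # total observations
    ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical abundances of one protein under four draw techniques,
# all drawn from the same distribution (no true technique effect).
rng = np.random.default_rng(1)
techniques = [rng.normal(loc=10.0, scale=1.0, size=10) for _ in range(4)]
F = one_way_anova_F(techniques)
# With no true difference, F stays near 1; a large F (small p-value)
# would flag the protein as technique-sensitive.
```

In practice one would convert F to a p-value against the F(k-1, n-k) distribution and, with 117 proteins tested, apply a multiple-comparison correction before declaring any protein technique-sensitive.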

  11. An analytical and experimental study of sound propagation and attenuation in variable-area ducts. [reducing aircraft engine noise

    NASA Technical Reports Server (NTRS)

    Nayfeh, A. H.; Kaiser, J. E.; Marshall, R. L.; Hurst, L. J.

    1978-01-01

    The performance of sound suppression techniques in ducts that produce refraction effects due to axial velocity gradients was evaluated. A computer code based on the method of multiple scales was used to calculate the influence of axial variations due to slow changes in the cross-sectional area, as well as transverse gradients due to the wall boundary layers. An attempt was made to verify the analytical model through direct comparison of experimental and computational results and the analytical determination of the influence of axial gradients on optimum liner properties. However, the analytical studies were unable to examine the influence of non-parallel ducts on the optimum liner conditions. For liner properties not close to optimum, the analytical predictions and the experimental measurements were compared. The circumferential variations of pressure amplitudes and phases at several axial positions were examined in straight and variable-area ducts, hard-wall and lined sections, with and without a mean flow. Reasonable agreement between the theoretical and experimental results was obtained.

  12. Improved hybridization of Fuzzy Analytic Hierarchy Process (FAHP) algorithm with Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW)

    NASA Astrophysics Data System (ADS)

    Zaiwani, B. E.; Zarlis, M.; Efendi, S.

    2018-03-01

    This research improves on an earlier hybridization of the Fuzzy Analytic Hierarchy Process (FAHP) with the Fuzzy Technique for Order Preference by Similarity to Ideal Solution (FTOPSIS), which selected the best bank chief inspector based on several qualitative and quantitative criteria with various priorities. To improve on that work, the FAHP algorithm was instead hybridized with the Fuzzy Multiple Attribute Decision Making - Simple Additive Weighting (FMADM-SAW) algorithm, applying FAHP to the weighting process and SAW to the ranking process, to determine employee promotion at a government institution. The improved average Efficiency Rate (ER) is 85.24%, which means this research succeeded in improving on the previous research's result of 77.82%. Keywords: Ranking and Selection, Fuzzy AHP, Fuzzy TOPSIS, FMADM-SAW.
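The SAW step used for ranking can be sketched directly: normalize each criterion column (divide by the column maximum for benefit criteria, or divide the column minimum by each value for cost criteria), then score each alternative by a weighted sum. The candidates, criteria, and weights below are hypothetical; in the study the weights come from the FAHP stage:

```python
def saw_rank(alternatives, weights, benefit):
    """Simple Additive Weighting: normalize each criterion column
    (benefit: x / max; cost: min / x), then rank alternatives by
    their weighted sum of normalized scores."""
    n_criteria = len(weights)
    cols = list(zip(*alternatives.values()))   # one tuple per criterion
    norms = []
    for j in range(n_criteria):
        col = cols[j]
        if benefit[j]:
            norms.append([x / max(col) for x in col])
        else:
            norms.append([min(col) / x for x in col])
    scores = {}
    for i, name in enumerate(alternatives):
        scores[name] = sum(weights[j] * norms[j][i] for j in range(n_criteria))
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical candidates scored on three criteria; the third is a cost
# criterion (lower is better). Weights would come from FAHP in practice.
candidates = {"A": (80, 70, 4), "B": (90, 60, 5), "C": (75, 85, 3)}
weights = (0.5, 0.3, 0.2)
ranking = saw_rank(candidates, weights, benefit=(True, True, False))
# ranking == ["C", "A", "B"]
```

SAW's appeal over TOPSIS for the ranking stage is exactly this simplicity: one normalization and one weighted sum, with no ideal/anti-ideal distance computation.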

  13. A comparison of experiment and theory for sound propagation in variable area ducts

    NASA Technical Reports Server (NTRS)

    Nayfeh, A. H.; Kaiser, J. E.; Marshall, R. L.; Hurst, C. J.

    1980-01-01

    An experimental and analytical program has been carried out to evaluate sound suppression techniques in ducts that produce refraction effects due to axial velocity gradients. The analytical program employs a computer code based on the method of multiple scales to calculate the influence of axial variations due to slow changes in the cross-sectional area as well as transverse gradients due to the wall boundary layers. Detailed comparisons between the analytical predictions and the experimental measurements have been made. The circumferential variations of pressure amplitudes and phases at several axial positions have been examined in straight and variable area ducts, with hard walls and lined sections, and with and without a mean flow. Reasonable agreement between the theoretical and experimental results has been found.

  14. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning, and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
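The searching step of the retrieval pipeline can be illustrated with brute-force cosine-similarity ranking over feature vectors; real large-scale systems replace the linear scan with approximate indexing (hashing, trees), but the interface is the same. All data below are synthetic stand-ins for image descriptors:

```python
import numpy as np

def retrieve(index_features, query_feature, top_k=3):
    """Rank indexed items by cosine similarity to the query feature.
    A brute-force stand-in for the approximate indexing structures
    (hashing, trees) used at real scale."""
    index_norm = index_features / np.linalg.norm(index_features, axis=1, keepdims=True)
    q = query_feature / np.linalg.norm(query_feature)
    sims = index_norm @ q                     # cosine similarity per item
    return np.argsort(-sims)[:top_k]          # indices of the best matches

rng = np.random.default_rng(42)
features = rng.normal(size=(1000, 128))       # hypothetical image descriptors
query = features[17] + 0.01 * rng.normal(size=128)  # near-duplicate of item 17
hits = retrieve(features, query)
# hits[0] == 17: the near-duplicate is retrieved first
```

The pipeline stages the review names map onto this sketch directly: feature representation produces the descriptor rows, feature indexing organizes them for sub-linear lookup, and searching performs the ranked query shown here.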

  15. Terrain modeling for microwave landing system

    NASA Technical Reports Server (NTRS)

    Poulose, M. M.

    1991-01-01

    A powerful analytical approach for evaluating terrain effects on a microwave landing system (MLS) is presented. The approach combines a multiplate model with an exhaustive ray tracing technique and an accurate formulation for estimating the electromagnetic fields due to the antenna array in the presence of terrain. Both uniform theory of diffraction (UTD) and impedance UTD techniques have been employed to evaluate these fields. Innovative techniques are introduced at each stage to make the model versatile enough to handle general terrain contours and to keep the computational requirements to a minimum. The model is applied to several terrain geometries, and the results are discussed.

  16. A modal parameter extraction procedure applicable to linear time-invariant dynamic systems

    NASA Technical Reports Server (NTRS)

    Kurdila, A. J.; Craig, R. R., Jr.

    1985-01-01

    Modal analysis has emerged as a valuable tool in many phases of the engineering design process. Complex vibration and acoustic problems in new designs can often be remedied through use of the method. Moreover, the technique has been used to enhance the conceptual understanding of structures by serving to verify analytical models. A new modal parameter estimation procedure is presented. The technique is applicable to linear, time-invariant systems and accommodates multiple input excitations. In order to provide a background for the derivation of the method, some modal parameter extraction procedures currently in use are described. Key features implemented in the new technique are elaborated upon.

  17. Measurement of third-order nonlinear susceptibility tensor in InP using extended Z-scan technique with elliptical polarization

    NASA Astrophysics Data System (ADS)

    Oishi, Masaki; Shinozaki, Tomohisa; Hara, Hikaru; Yamamoto, Kazunuki; Matsusue, Toshio; Bando, Hiroyuki

    2018-05-01

    The elliptical polarization dependence of the two-photon absorption coefficient β in InP has been measured by the extended Z-scan technique for thick materials in the wavelength range from 1640 to 1800 nm. The analytical formula of the Z-scan technique has been extended with consideration of multiple reflections. The Z-scan results have been fitted very well by the formula and β has been evaluated accurately. The three independent elements of the third-order nonlinear susceptibility tensor in InP have also been determined accurately from the elliptical polarization dependence of β.

  18. Development of a Multiplexed Liquid Chromatography Multiple-Reaction-Monitoring Mass Spectrometry (LC-MRM/MS) Method for Evaluation of Salivary Proteins as Oral Cancer Biomarkers*

    PubMed Central

    Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting

    2017-01-01

    Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. PMID:28235782

  19. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    PubMed

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in 
select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analysis is provided covering areas of separation conditions, validation aspects and applicable conclusion. A limited discussion is provided on relevant aspects of the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues of mass detection, matrix effect, chiral aspects etc. are provided for consideration during method development.

  20. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive: data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012

  1. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the task is challenging for geoscientists because processing the massive amount of data is both computing- and data-intensive: data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers, a MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built to support on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.
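
    The map-shuffle-reduce pattern that the framework builds on can be sketched in plain Python. The toy records, function names, and per-region mean are illustrative assumptions, not the authors' HBase/Hadoop implementation.

```python
from collections import defaultdict

# Toy stand-in for gridded geoscience observations: (region, value) pairs.
# In the paper's framework these would come from HBase, not a Python list.
records = [("arctic", 2.1), ("tropics", 0.4), ("arctic", 1.7), ("tropics", 0.6)]

def map_phase(record):
    """Emit (key, value) pairs; here the key is the region name."""
    region, value = record
    yield region, value

def reduce_phase(key, values):
    """Aggregate all values for one key; here a simple mean."""
    return key, sum(values) / len(values)

# Shuffle: group mapper output by key (the framework does this in real MapReduce).
grouped = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        grouped[key].append(value)

results = dict(reduce_phase(k, v) for k, v in grouped.items())
print(results)  # mean value per region
```

    Because mappers are independent per record and reducers independent per key, each phase parallelizes across distributed workers, which is the property the framework exploits.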

  2. Environmental applications for the analysis of chlorinated dibenzo-p-dioxins and dibenzofurans using mass spectrometry/mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reiner, E.J.; Schellenberg, D.H.; Taguchi, V.Y.

    1991-01-01

    A mass spectrometry/mass spectrometry-multiple reaction monitoring (MS/MS-MRM) technique for the analysis of all tetra- through octachlorinated dibenzo-p-dioxins (ClxDD, x = 4-8) and dibenzofurans (ClxDF, x = 4-8) has been developed at the Ministry of the Environment (MOE) utilizing a triple quadrupole mass spectrometer. Optimization of instrumental parameters using the analyte of interest in a direct insertion probe (DIP) resulted in sensitivities approaching those obtainable by high-resolution mass spectrometric (HRMS) methods. All congeners of dioxins and furans were detected in the femtogram range. Results on selected samples indicated that for some matrices, fewer chemical interferences were observed by MS/MS than by HRMS. The technique used to optimize the instrument for chlorinated dibenzo-p-dioxins (CDDs) and chlorinated dibenzofurans (CDFs) analysis is adaptable to other analytes.

  3. Electrospray ionization mass spectrometry: a technique to access the information beyond the molecular weight of the analyte.

    PubMed

    Banerjee, Shibdas; Mazumdar, Shyamalava

    2012-01-01

    The Electrospray Ionization (ESI) is a soft ionization technique extensively used for production of gas phase ions (without fragmentation) of thermally labile large supramolecules. In the present review we have described the development of Electrospray Ionization mass spectrometry (ESI-MS) during the last 25 years in the study of various properties of different types of biological molecules. There have been extensive studies on the mechanism of formation of charged gaseous species by the ESI. Several groups have investigated the origin and implications of the multiple charge states of proteins observed in the ESI-mass spectra of the proteins. The charged analytes produced by ESI can be fragmented by activating them in the gas-phase, and thus tandem mass spectrometry has been developed, which provides very important insights on the structural properties of the molecule. The review will highlight recent developments and emerging directions in this fascinating area of research.

  4. Space vehicle engine and heat shield environment review. Volume 1: Engineering analysis

    NASA Technical Reports Server (NTRS)

    Mcanelly, W. B.; Young, C. T. K.

    1973-01-01

    Methods for predicting the base heating characteristics of a multiple rocket engine installation are discussed. The environmental data are applied to the design of an adequate protection system for the engine components. The methods for predicting the base region thermal environment are categorized as: (1) scale model testing, (2) extrapolation of previous and related flight test results, and (3) semiempirical analytical techniques.

  5. Comparison of procedures for correction of matrix interferences in the analysis of soils by ICP-OES with CCD detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadler, D.A.; Sun, F.; Littlejohn, D.

    1995-12-31

    ICP-OES is a useful technique for multi-element analysis of soils. However, as a number of elements are present in relatively high concentrations, matrix interferences can occur and examples have been widely reported. The availability of CCD detectors has increased the opportunities for rapid multi-element, multi-wavelength determination of elemental concentrations in soils and other environmental samples. As the composition of soils from industrial sites can vary considerably, especially when taken from different pit horizons, procedures are required to assess the extent of interferences and correct the effects on a simultaneous multi-element basis. In single element analysis, plasma operating conditions can sometimes be varied to minimize or even remove multiplicative interferences. In simultaneous multi-element analysis, the scope for this approach may be limited, depending on the spectrochemical characteristics of the emitting analyte species. Matrix matching, by addition of major sample components to the analyte calibrant solutions, can be used to minimize inaccuracies. However, there are also limitations to this procedure when the sample composition varies significantly. Multiplicative interference effects can also be assessed by a "single standard addition" of each analyte to the sample solution, and the information obtained may be used to correct the analyte concentrations determined directly. Each of these approaches has been evaluated to ascertain the best procedure for multi-element analysis of industrial soils by ICP-OES with CCD detection at multiple wavelengths. Standard reference materials and field samples have been analyzed to illustrate the efficacy of each procedure.
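
    The "single standard addition" assessment mentioned above can be sketched as follows, assuming a linear detector response and negligible volume change on spiking; the numbers are illustrative, not from the study.

```python
def single_standard_addition(signal_sample, signal_spiked, conc_added):
    """Estimate analyte concentration from one standard addition.

    Assumes a linear detector response, I = k * C, measured in the same
    matrix before and after the spike, so the matrix-affected sensitivity
    k cancels:
        C_sample = C_added * I_sample / (I_spiked - I_sample)
    This is how a multiplicative matrix interference can be corrected
    without matrix-matched calibrants.
    """
    if signal_spiked <= signal_sample:
        raise ValueError("spiked signal must exceed sample signal")
    return conc_added * signal_sample / (signal_spiked - signal_sample)

# Example: spiking 2.0 mg/L of analyte raises the emission signal
# from 120 to 300 counts.
print(single_standard_addition(120.0, 300.0, 2.0))  # -> 1.333... mg/L
```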

  6. Chromatographic peak deconvolution of constitutional isomers by multiple-reaction-monitoring mass spectrometry.

    PubMed

    Trapp, Oliver

    2010-02-12

    Highly efficient and sophisticated separation techniques are available to analyze complex compound mixtures with superior sensitivities and selectivities, often enhanced by a 2nd dimension, e.g. another separation technique or spectroscopic and spectrometric techniques. For enantioselective separations, numerous chiral stationary phases (CSPs) exist to cover a broad range of chiral compounds. Despite these advances, enantioselective separations can become very challenging for mixtures of stereolabile constitutional isomers, because the on-column interconversion can lead to completely overlapping peak profiles. Typically, multidimensional separation techniques, e.g. multidimensional GC (MDGC), using an achiral 1st separation dimension and transferring selected analytes to a chiral 2nd separation, are the method of choice for such problems. However, this procedure is very time consuming, and only predefined sections of peaks can be transferred by column switching to the second dimension. Here we demonstrate, for stereolabile 1,2-dialkylated diaziridines, a technique to experimentally deconvolute overlapping gas chromatographic elution profiles of constitutional isomers based on multiple-reaction-monitoring MS (MRM-MS). The technique presented here takes advantage of different fragmentation probabilities and pathways to isolate the elution profiles of the configurational isomers. Copyright 2009 Elsevier B.V. All rights reserved.

  7. Technique for Predicting the RF Field Strength Inside an Enclosure

    NASA Technical Reports Server (NTRS)

    Hallett, M.; Reddell, J.

    1998-01-01

    This Memorandum presents a simple analytical technique for predicting the RF electric field strength inside an enclosed volume in which radio frequency radiation occurs. The technique was developed to predict the radio frequency (RF) field strength within a launch vehicle's fairing from payloads launched with their telemetry transmitters radiating, and to assess the impact of the radiation on the vehicle and payload. The RF field strength is shown to be a function of the surface materials and surface areas. The method accounts for RF energy losses within exposed surfaces, through RF windows, and within multiple layers of dielectric materials which may cover the surfaces. This Memorandum includes the rigorous derivation of all equations and presents examples and data to support the validity of the technique.
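
    The dependence of the enclosed field strength on radiated power, surface areas, and surface materials can be illustrated with a simple mean-field power balance; the steady-state formula and the numbers below are a generic sketch assumed for illustration, not the Memorandum's actual derivation.

```python
import math

ETA0 = 376.73  # free-space wave impedance, ohms

def mean_field_strength(p_radiated_w, surfaces):
    """Mean E-field (V/m) in an enclosure from a simple power balance.

    surfaces: list of (area_m2, absorptivity) pairs.  At steady state
    the radiated power equals the power absorbed by the walls,
        P = S * sum(alpha_i * A_i),
    and the equivalent plane-wave power density S gives
        E = sqrt(S * eta0).
    """
    absorbing_area = sum(area * alpha for area, alpha in surfaces)
    power_density = p_radiated_w / absorbing_area  # W/m^2
    return math.sqrt(power_density * ETA0)

# 5 W transmitter inside a fairing: mostly reflective metal plus a lossy liner.
e1 = mean_field_strength(5.0, [(12.0, 0.02), (2.0, 0.60)])
e2 = mean_field_strength(5.0, [(12.0, 0.02), (2.0, 0.90)])  # lossier liner
print(e1, e2)  # more absorption -> lower field strength
```

    The sketch reproduces the qualitative behavior the Memorandum describes: adding absorptive (dielectric-covered) surface area lowers the interior field strength.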

  8. Advanced analysis techniques for uranium assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication (f4) and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  9. Bessel Fourier orientation reconstruction: an analytical EAP reconstruction using multiple shell acquisitions in diffusion MRI.

    PubMed

    Hosseinbor, Ameer Pasha; Chung, Moo K; Wu, Yu-Chien; Alexander, Andrew L

    2011-01-01

    The estimation of the ensemble average propagator (EAP) directly from q-space DWI signals is an open problem in diffusion MRI. Diffusion spectrum imaging (DSI) is one common technique to compute the EAP directly from the diffusion signal, but it is burdened by the large sampling required. Recently, several analytical EAP reconstruction schemes for multiple q-shell acquisitions have been proposed. One in particular is Diffusion Propagator Imaging (DPI), which is based on Laplace's equation estimation of the diffusion signal for each shell acquisition. Viewed intuitively in terms of the heat equation, the DPI solution is obtained when the heat distribution between temperature measurements at each shell is at steady state. We propose a generalized extension of DPI, Bessel Fourier Orientation Reconstruction (BFOR), whose solution is based on heat-equation estimation of the diffusion signal for each shell acquisition. That is, the heat distribution between shell measurements is no longer at steady state. In addition to being analytical, the BFOR solution also includes an intrinsic exponential smoothing term. We illustrate the effectiveness of the proposed method by showing results on both synthetic and real MR datasets.
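
    The contrast the abstract draws between DPI and BFOR can be restated schematically; the symbols below are chosen here for illustration, and the exact boundary conditions are in the original papers.

```latex
% DPI models the diffusion signal E(q) between shells as harmonic,
% i.e. a steady-state heat distribution:
\nabla^{2} E(\mathbf{q}) = 0
% BFOR instead treats the signal as a transient heat flow, whose
% solution carries an intrinsic exponential smoothing factor:
\frac{\partial E(\mathbf{q}, t)}{\partial t} = D \, \nabla^{2} E(\mathbf{q}, t)
```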

  10. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    NASA Technical Reports Server (NTRS)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

    A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.

  11. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi-diagonal (EMD) code and its effective correlation properties, between intended and interfering subscribers, significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). Performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytic and simulation analysis, by referring to bit error rate (BER), signal to noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code, while using the SDD technique, provides high transmission capacity, reduces the receiver complexity, and provides better performance than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both up- and downlink transmission.
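
    The BER-versus-SNR analysis referred to above commonly uses the Gaussian approximation BER = (1/2)·erfc(sqrt(SNR/8)); the sketch below assumes that standard relation and illustrative SNR values rather than the paper's exact noise model.

```python
import math

def ber_from_snr(snr):
    """Bit error rate under the Gaussian approximation commonly used in
    SAC-OCDMA performance analysis: BER = 0.5 * erfc(sqrt(SNR / 8))."""
    return 0.5 * math.erfc(math.sqrt(snr / 8.0))

# The 10^-9 BER floor cited in the abstract corresponds to an SNR of
# roughly 144 (about 21.6 dB) under this approximation.
for snr in (50.0, 144.0, 200.0):
    print(f"SNR = {snr:6.1f}  ->  BER = {ber_from_snr(snr):.2e}")
```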

  12. Development of a Multiplexed Liquid Chromatography Multiple-Reaction-Monitoring Mass Spectrometry (LC-MRM/MS) Method for Evaluation of Salivary Proteins as Oral Cancer Biomarkers.

    PubMed

    Chen, Yi-Ting; Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting

    2017-05-01

    Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  13. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    PubMed

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that is currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
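
    The partition/train/refine/compare cycle described above can be sketched as follows; the synthetic records, threshold classifiers, and 70/30 split are illustrative assumptions, not from the symposium paper.

```python
import random

# Toy stand-in for livestock health records: (feature, outcome) pairs,
# where outcome 1 means the animal later required treatment.  Entirely
# synthetic, used only to illustrate the predictive analytic process.
random.seed(0)
outcomes = [random.random() < 0.3 for _ in range(200)]
data = [(random.gauss(1.0 if y else 0.0, 0.5), y) for y in outcomes]

# 1. Partition: hold out naive data for the final accuracy estimate.
split = int(0.7 * len(data))
train, test = data[:split], data[split:]

# 2. "Create algorithms": several candidate threshold classifiers.
def make_classifier(threshold):
    return lambda x: 1 if x > threshold else 0

candidates = {t: make_classifier(t) for t in (0.25, 0.5, 0.75)}

def accuracy(clf, rows):
    return sum(clf(x) == y for x, y in rows) / len(rows)

# 3. Refine on the training partition, then 4. compare on naive data.
best_t = max(candidates, key=lambda t: accuracy(candidates[t], train))
print(f"chosen threshold {best_t}, "
      f"holdout accuracy {accuracy(candidates[best_t], test):.2f}")
```

    Scoring the chosen classifier only on the held-out partition is what keeps the reported accuracy an honest estimate of performance on future animals.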

  14. Microphotographs of cyanobacteria documenting the effects of various cell-lysis techniques

    USGS Publications Warehouse

    Rosen, Barry H.; Loftin, Keith A.; Smith, Christopher E.; Lane, Rachael F.; Keydel, Susan P.

    2011-01-01

    Cyanotoxins are a group of organic compounds biosynthesized intracellularly by many species of cyanobacteria found in surface water. The United States Environmental Protection Agency has listed cyanotoxins on the Safe Drinking Water Act's Contaminant Candidate List 3 for consideration for future regulation to protect public health. Cyanotoxins also pose a risk to humans and other organisms in a variety of other exposure scenarios. Accurate and precise analytical measurements of cyanotoxins are critical to the evaluation of concentrations in surface water to address the human health and ecosystem effects. A common approach to total cyanotoxin measurement involves cell membrane disruption to release the cyanotoxins to the dissolved phase, followed by filtration to remove cellular debris. Several methods have been used historically; however, no standard protocols exist to ensure this process is consistent between laboratories before the dissolved phase is measured by an analytical technique for cyanotoxin identification and quantitation. No systematic evaluation has been conducted comparing the multiple laboratory sample processing techniques for physical disruption of cell membranes or cyanotoxin recovery. Surface water samples collected from lakes, reservoirs, and rivers containing mixed assemblages of organisms dominated by cyanobacteria, as well as laboratory cultures of species-specific cyanobacteria, were used as part of this study evaluating multiple laboratory cell-lysis techniques in partnership with the U.S. Environmental Protection Agency. Evaluated extraction techniques included boiling, autoclaving, sonication, chemical treatment, and freeze-thaw. Both treated and untreated samples were evaluated for cell membrane integrity microscopically via light, epifluorescence, and epifluorescence in the presence of a DNA stain.
The DNA stain, which does not permeate live cells with intact membrane structures, was used as an indicator for cyanotoxin release into the dissolved phase. Of the five techniques, sonication (at 70 percent) was most effective at complete cell destruction while QuikLyse (Trademarked) was least effective. Autoclaving, boiling, and sequential freeze-thaw were moderately effective in physical destruction of colonies and filaments.

  15. Transient well flow in leaky multiple-aquifer systems

    NASA Astrophysics Data System (ADS)

    Hemker, C. J.

    1985-10-01

    A previously developed eigenvalue analysis approach to groundwater flow in leaky multiple aquifers is used to derive exact solutions for transient well flow problems in leaky and confined systems comprising any number of aquifers. Equations are presented for the drawdown distribution in systems of infinite extent, caused by wells penetrating one or more of the aquifers completely and discharging each layer at a constant rate. Since the solution obtained may be regarded as a combined analytical-numerical technique, a type of one-dimensional modelling can be applied to find approximate solutions for several complicating conditions. Numerical evaluations are presented as time-drawdown curves and include effects of storage in the aquitard, unconfined conditions, partially penetrating wells and stratified aquifers. The outcome of calculations for relatively simple systems compares very well with published corresponding results. The proposed multilayer solution can be a valuable tool in aquifer test evaluation, as it provides the analytical expression required to enable the application of existing computer methods to the determination of aquifer characteristics.
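
    As context for the multilayer solutions discussed above, the confined single-aquifer limiting case is the classical Theis solution, s = (Q / 4πT)·W(u) with u = r²S / 4Tt; the series evaluation below is a standard textbook sketch, not the paper's eigenvalue method.

```python
import math

def well_function(u, terms=30):
    """Theis well function
    W(u) = -gamma - ln(u) + sum_{n>=1} (-1)^(n+1) u^n / (n * n!),
    evaluated by its convergent series (adequate for the small u
    typical of pumping tests)."""
    gamma = 0.5772156649015329  # Euler-Mascheroni constant
    total = -gamma - math.log(u)
    sign, factorial = 1.0, 1.0
    for n in range(1, terms + 1):
        factorial *= n
        total += sign * u**n / (n * factorial)
        sign = -sign
    return total

def theis_drawdown(Q, T, S, r, t):
    """Drawdown s (m) at radius r (m) and time t (s), for pumping rate
    Q (m^3/s), transmissivity T (m^2/s), storativity S (dimensionless)."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * math.pi * T) * well_function(u)

print(well_function(0.01))  # ~4.038
print(theis_drawdown(0.01, 1e-3, 1e-4, 50.0, 3600.0))  # drawdown after 1 h
```

    Multilayer leaky-aquifer solutions like the one in the record reduce, layer by layer after the eigenvalue decoupling, to integrals of this same form, which is why W(u) remains the basic building block of aquifer-test evaluation.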

  16. Signal-on electrochemical detection of antibiotics at zeptomole level based on target-aptamer binding triggered multiple recycling amplification.

    PubMed

    Wang, Hongzhi; Wang, Yu; Liu, Su; Yu, Jinghua; Guo, Yuna; Xu, Ying; Huang, Jiadong

    2016-06-15

    In this work, a signal-on electrochemical DNA sensor based on multiple amplification for ultrasensitive detection of antibiotics has been reported. In the presence of target, the ingeniously designed hairpin probe (HP1) is opened and the polymerase-assisted target recycling amplification is triggered, resulting in autonomous generation of secondary target. It is worth noting that the produced secondary target could not only hybridize with other HP1, but also displace the Helper from the electrode. Consequently, methylene blue labeled HP2 forms a "close" probe structure, and the increase of signal is monitored. The increasing current provides ultrasensitive electrochemical detection of antibiotics down to 1.3 fM. To the best of our knowledge, this work is the first report of multiple recycling amplification combined with a signal-on sensing strategy being utilized for the quantitative determination of antibiotics. It could further be used as a general strategy associated with other analytical techniques toward the detection of a wide spectrum of analytes. Thus, it holds great potential for the development of ultrasensitive biosensing platforms for applications in bioanalysis, disease diagnostics, and clinical biomedicine. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Miniaturized Temperature-Controlled Planar Chromatography (Micro-TLC) as a Versatile Technique for Fast Screening of Micropollutants and Biomarkers Derived from Surface Water Ecosystems and During Technological Processes of Wastewater Treatment.

    PubMed

    Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2017-07-01

There is increasing interest in the development of simple analytical systems enabling the fast screening of target components in complex samples. A number of newly invented protocols are based on quasi-separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, the quantification of target components can be performed mainly due to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography can serve as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea of Middle Pomerania in the northern part of Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates (under visible light, fluorescence, and fluorescence-quenching conditions, and using the visualization reagent phosphomolybdic acid) enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful, inexpensive, and can be considered a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular-mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly, green-chemistry analytical tool. The described analytical protocol can be complementary to those involving classical column chromatography (HPLC) or various planar microfluidic devices.

  18. Video Analytics Evaluation: Survey of Datasets, Performance Metrics and Approaches

    DTIC Science & Technology

    2014-09-01

    training phase and a fusion of the detector outputs. 6.3.1 Training Techniques 1. Bagging: The basic idea of Bagging is to train multiple classifiers...can reduce more noise interesting points. Person detection and background subtraction methods were used to create hot regions. The hot regions were...detection algorithms are incorporated with MHT to construct one integrated detector /tracker. 6.8 IRDS-CASIA team IRDS-CASIA proposed a method to solve a

  19. Reduction of interferences in graphite furnace atomic absorption spectrometry by multiple linear regression modelling

    NASA Astrophysics Data System (ADS)

    Grotti, Marco; Abelmoschi, Maria Luisa; Soggia, Francesco; Tiberiade, Christian; Frache, Roberto

    2000-12-01

The multivariate effects of Na, K, Mg and Ca as nitrates on the electrothermal atomisation of manganese, cadmium and iron were studied by multiple linear regression modelling. Since the models proved to efficiently predict the effects of the considered matrix elements over a wide range of concentrations, they were applied to correct the interferences occurring in the determination of trace elements in seawater after pre-concentration of the analytes. In order to obtain a statistically significant number of samples, a large volume of the certified seawater reference materials CASS-3 and NASS-3 was treated with Chelex-100 resin; the chelating resin was then separated from the solution and divided into several sub-samples, each of which was eluted with nitric acid and analysed by electrothermal atomic absorption spectrometry (for trace element determinations) and inductively coupled plasma optical emission spectrometry (for matrix element determinations). To minimise any systematic error other than that due to matrix effects, the accuracy of the pre-concentration step and the contamination levels of the procedure were checked by inductively coupled plasma mass spectrometric measurements. Analytical results obtained by applying the multiple linear regression models were compared with those obtained with other calibration methods, such as external calibration using acid-based standards, external calibration using matrix-matched standards, and the analyte addition technique. The empirical models proved to efficiently reduce the interferences occurring in the analysis of real samples, allowing a greater improvement in accuracy than the other calibration methods.
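The correction idea described above can be sketched as an ordinary least-squares fit of the relative analyte signal against the matrix-element concentrations; all numbers below are synthetic illustrations, not data from the study:

```python
import numpy as np

# Synthetic calibration set: relative analyte signal (vs. a matrix-free
# standard) measured at known Na, K, Mg, Ca concentrations (mg/L).
matrix = np.array([   # columns: Na, K, Mg, Ca
    [0.0,   0.0,  0.0,  0.0],
    [100.0, 0.0,  0.0,  0.0],
    [0.0, 100.0,  0.0,  0.0],
    [0.0,   0.0, 50.0,  0.0],
    [0.0,   0.0,  0.0, 50.0],
    [100.0, 100.0, 50.0, 50.0],
    [50.0,  50.0, 25.0, 25.0],
])
rel_signal = np.array([1.00, 0.92, 0.95, 0.88, 0.90, 0.70, 0.84])

# Multiple linear regression: rel_signal ~ b0 + b . conc  (least squares)
X = np.column_stack([np.ones(len(matrix)), matrix])
coef, *_ = np.linalg.lstsq(X, rel_signal, rcond=None)

def corrected(measured, conc):
    """Divide a measured value by the predicted relative signal to
    correct for the modelled matrix suppression."""
    pred = coef[0] + conc @ coef[1:]
    return measured / pred
```

Once the model is fitted, the matrix-element concentrations measured by ICP-OES in each real sample give the predicted suppression, and the raw GFAAS result is divided by it.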

  20. Modification and evaluation of a Barnes-type objective analysis scheme for surface meteorological data

    NASA Technical Reports Server (NTRS)

    Smith, D. R.

    1982-01-01

The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a Barnes-type scheme for the analysis of surface meteorological data. Modifications were introduced to the original version in order to increase its flexibility and ease of use. The code was rewritten for an interactive computer environment. Furthermore, a multiple-iteration technique suggested by Barnes was implemented for greater accuracy. PROAM was subjected to a series of experiments in order to evaluate its performance under a variety of analysis conditions. The tests include the use of a known analytic temperature distribution in order to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple-iteration technique increases the accuracy of the analysis. Furthermore, the tests verify appropriate values for the analysis parameters in resolving meso-beta-scale phenomena.

  1. Evaluating the decision accuracy and speed of clinical data visualizations.

    PubMed

    Pieczkiewicz, David S; Finkelstein, Stanley M

    2010-01-01

Clinicians face an increasing volume of biomedical data. Assessing the efficacy of systems that enable accurate and timely clinical decision making merits corresponding attention. This paper discusses the multiple-reader multiple-case (MRMC) experimental design and linear mixed models as means of assessing and comparing decision accuracy and latency (time) for decision tasks in which clinician readers must interpret visual displays of data. These experimental and statistical techniques, used extensively in radiology imaging studies, offer a number of practical and analytic advantages over more traditional quantitative methods such as percent-correct measurements and ANOVAs, and are recommended for their statistical efficiency and generalizability. An example analysis using readily available, free, and commercial statistical software is provided as an appendix. While these techniques are not appropriate for all evaluation questions, they can provide a valuable addition to the evaluative toolkit of medical informatics research.

  2. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions.
The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, sonic boom, and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load-carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin-wall theory; the composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
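The Kreisselmeier-Steinhauser aggregation mentioned above replaces the non-smooth max over multiple objectives or constraints with a smooth envelope that a gradient-based optimizer can handle; a minimal sketch (the rho value is an illustrative choice):

```python
import numpy as np

def ks_aggregate(f, rho=50.0):
    """Kreisselmeier-Steinhauser envelope: a smooth, conservative
    approximation of max(f) that lets a gradient-based optimizer treat
    many objectives/constraints as a single scalar function. Larger rho
    gives a tighter (but stiffer) envelope; shifting by max(f) before
    exponentiating avoids numerical overflow."""
    fmax = np.max(f)
    return fmax + np.log(np.sum(np.exp(rho * (f - fmax)))) / rho
```

For n aggregated values the envelope is bounded by max(f) <= KS(f) <= max(f) + ln(n)/rho, so increasing rho trades smoothness for tightness.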

  3. Value of Earth Observations: Key principles and techniques of socioeconomic benefits analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Macauley, M.; Bernknopf, R.

    2013-12-01

    Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples on the uses of Earth observations across a range of sectors can be complemented by the quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and social and economic sciences fields to build capacity, develop case studies, and refine analytic techniques in quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research, and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring socioeconomic impacts of Earth observations and how the measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.

  4. Electrospray Ionization Mass Spectrometry: A Technique to Access the Information beyond the Molecular Weight of the Analyte

    PubMed Central

    Banerjee, Shibdas; Mazumdar, Shyamalava

    2012-01-01

    The Electrospray Ionization (ESI) is a soft ionization technique extensively used for production of gas phase ions (without fragmentation) of thermally labile large supramolecules. In the present review we have described the development of Electrospray Ionization mass spectrometry (ESI-MS) during the last 25 years in the study of various properties of different types of biological molecules. There have been extensive studies on the mechanism of formation of charged gaseous species by the ESI. Several groups have investigated the origin and implications of the multiple charge states of proteins observed in the ESI-mass spectra of the proteins. The charged analytes produced by ESI can be fragmented by activating them in the gas-phase, and thus tandem mass spectrometry has been developed, which provides very important insights on the structural properties of the molecule. The review will highlight recent developments and emerging directions in this fascinating area of research. PMID:22611397
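The multiple charge states discussed above are what make mass determination of large molecules possible on a limited m/z range: any two adjacent peaks of a charge-state series fix both the charge and the molecular mass. A minimal sketch with illustrative values (not measured data):

```python
# A molecule of mass M carrying z protons appears at m/z = (M + z*mp)/z,
# so adjacent peaks with charges z and z+1 determine z and M directly.
MP = 1.00728  # proton mass, Da

def deconvolve(mz_hi, mz_lo):
    """mz_hi and mz_lo are adjacent series peaks with charges z and z+1
    (mz_hi > mz_lo). Returns (z, M)."""
    z = round((mz_lo - MP) / (mz_hi - mz_lo))
    return z, z * (mz_hi - MP)

# Illustrative peaks for a hypothetical ~17 kDa protein:
M_true = 16951.5
m10 = (M_true + 10 * MP) / 10   # z = 10 peak
m11 = (M_true + 11 * MP) / 11   # z = 11 peak
z, M = deconvolve(m10, m11)
```

In practice the whole series is deconvolved and the mass estimates from every adjacent pair are averaged, which is one reason ESI-MS mass measurements of proteins are so precise.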

  5. Numerical solution of potential flow about arbitrary 2-dimensional multiple bodies

    NASA Technical Reports Server (NTRS)

    Thompson, J. F.; Thames, F. C.

    1982-01-01

    A procedure for the finite-difference numerical solution of the lifting potential flow about any number of arbitrarily shaped bodies is given. The solution is based on a technique of automatic numerical generation of a curvilinear coordinate system having coordinate lines coincident with the contours of all bodies in the field, regardless of their shapes and number. The effects of all numerical parameters involved are analyzed and appropriate values are recommended. Comparisons with analytic solutions for single Karman-Trefftz airfoils and a circular cylinder pair show excellent agreement. The technique of application of the boundary-fitted coordinate systems to the numerical solution of partial differential equations is illustrated.

  6. High-throughput screening for new psychoactive substances (NPS) in whole blood by DLLME extraction and UHPLC-MS/MS analysis.

    PubMed

    Odoardi, Sara; Fisichella, Marco; Romolo, Francesco Saverio; Strano-Rossi, Sabina

    2015-09-01

The increasing number of new psychoactive substances (NPS) on the illicit market makes their identification in biological fluids and tissues a matter of great concern for clinical and forensic toxicology. Analytical methods able to detect the large number of substances that may be used are sought, considering also that many NPS are not detected by the standard immunoassays generally used for routine drug screening. The aim of this work was to develop a method for the screening of different classes of NPS (a total of 78 analytes, including cathinones, synthetic cannabinoids, phenethylamines, piperazines, ketamine and analogues, benzofurans, and tryptamines) in blood samples. The simultaneous extraction of analytes was performed by dispersive liquid/liquid microextraction (DLLME), a very rapid, cheap and efficient extraction technique that employs microliter volumes of organic solvents. Analyses were performed by a targeted ultra-high performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS) method in multiple reaction monitoring (MRM) mode. The method allowed the detection of the studied analytes with limits of detection (LODs) ranging from 0.2 to 2 ng/mL. The proposed DLLME method can be used as an alternative to classical liquid/liquid or solid-phase extraction techniques owing to its rapidity, low solvent consumption, low cost, and ability to simultaneously extract a large number of analytes from different chemical classes. The method was then applied to 60 authentic samples from forensic cases, demonstrating its suitability for the screening of a wide range of NPS. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. A State-of-the-Art Contamination Effects Research and Test Facility

    NASA Technical Reports Server (NTRS)

    Olson, Keith R.; Folgner, Kelsey A.; Barrie, James D.; Villahermosa, Randy M.

    2008-01-01

In the ongoing effort to better understand various spacecraft contamination phenomena, a new state-of-the-art contamination effects research and test facility was designed and recently brought on-line at The Aerospace Corporation's Space Materials Laboratory. This high-vacuum test chamber employs multiple in-situ analytical techniques, making it possible to study both the qualitative and quantitative aspects of contaminant film formation in the presence or absence of VUV radiation. Adsorption and desorption kinetics, "photo-fixing efficiency", transmission loss of uniform contaminant films, light scatter from non-uniform films, and film morphology have been studied in this facility. This paper describes this new capability in detail and presents data collected from several of the analytical instruments.

  8. New tools for investigating student learning in upper-division electrostatics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.

    Student learning in upper-division physics courses is a growing area of research in the field of Physics Education. Developing effective new curricular materials and pedagogical techniques to improve student learning in upper-division courses requires knowledge of both what material students struggle with and what curricular approaches help to overcome these struggles. To facilitate the course transformation process for one specific content area, upper-division electrostatics, this thesis presents two new methodological tools: (1) an analytical framework designed to investigate students' struggles with the advanced physics content and mathematically sophisticated tools/techniques required at the junior and senior level, and (2) a new multiple-response conceptual assessment designed to measure student learning and assess the effectiveness of different curricular approaches. We first describe the development and theoretical grounding of a new analytical framework designed to characterize how students use mathematical tools and techniques during physics problem solving. We apply this framework to investigate student difficulties with three specific mathematical tools used in upper-division electrostatics: multivariable integration in the context of Coulomb's law, the Dirac delta function in the context of expressing volume charge densities, and separation of variables as a technique to solve Laplace's equation. We find a number of common themes in students' difficulties around these mathematical tools including: recognizing when a particular mathematical tool is appropriate for a given physics problem, mapping between the specific physical context and the formal mathematical structures, and reflecting spontaneously on the solution to a physics problem to gain physical insight or ensure consistency with expected results. 
We then describe the development of a novel, multiple-response version of an existing conceptual assessment in upper-division electrostatics courses. The goal of this new version is to provide an easily-graded electrostatics assessment that can potentially be implemented to investigate student learning on a large scale. We show that student performance on the new multiple-response version exhibits a significant degree of consistency with performance on the free-response version, and that it continues to provide significant insight into student reasoning and student difficulties. Moreover, we demonstrate that the new assessment is both valid and reliable using data from upper-division physics students at multiple institutions. Overall, the work described in this thesis represents a significant contribution to the methodological tools available to researchers and instructors interested in improving student learning at the upper-division level.

  9. The composition-explicit distillation curve technique: Relating chemical analysis and physical properties of complex fluids.

    PubMed

    Bruno, Thomas J; Ott, Lisa S; Lovestead, Tara M; Huber, Marcia L

    2010-04-16

The analysis of complex fluids such as crude oils, fuels, vegetable oils and mixed waste streams poses significant challenges arising primarily from the multiplicity of components, the different properties of the components (polarity, polarizability, etc.) and matrix properties. We have recently introduced an analytical strategy that simplifies many of these analyses and provides the added potential of linking compositional information with physical property information. This aspect can be used to facilitate equation-of-state development for complex fluids. In addition to chemical characterization, the approach provides the ability to calculate thermodynamic properties for such complex heterogeneous streams. The technique is based on the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. The analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. By far the most widely used analytical technique with the ADC is gas chromatography. This has enabled us to study finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oil streams (used automotive and transformer oils). In this special issue of the Journal of Chromatography, specifically dedicated to extraction technologies, we describe the essential features of the advanced distillation curve metrology as an analytical strategy for complex fluids. Published by Elsevier B.V.

  10. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

In recent years, environmental concerns over ultra-trace levels of steroid estrogens in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The techniques highlighted in this review were gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, YES assay and E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its low detection limit, simplicity, rapidity and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Trace level detection of compounds related to the chemical weapons convention by 1H-detected 13C NMR spectroscopy executed with a sensitivity-enhanced, cryogenic probehead.

    PubMed

    Cullinan, David B; Hondrogiannis, George; Henderson, Terry J

    2008-04-15

Two-dimensional 1H-13C HSQC (heteronuclear single quantum correlation) and fast-HMQC (heteronuclear multiple quantum correlation) pulse sequences were implemented using a sensitivity-enhanced, cryogenic probehead for detecting compounds relevant to the Chemical Weapons Convention present in complex mixtures. The resulting methods demonstrated exceptional sensitivity for detecting the analytes at trace level concentrations. 1H-13C correlations of target analytes at ≤25 μg/mL were easily detected in a sample where the 1H solvent signal was approximately 58,000-fold more intense than the analyte 1H signals. The problem of overlapping signals typically observed in conventional 1H spectroscopy was essentially eliminated, while 1H and 13C chemical shift information could be derived quickly and simultaneously from the resulting spectra. The fast-HMQC pulse sequences generated magnitude-mode spectra suitable for detailed analysis in approximately 4.5 h and can be used in experiments to efficiently screen a large number of samples. The HSQC pulse sequences, on the other hand, required roughly twice the data acquisition time to produce suitable spectra. These spectra, however, were phase-sensitive, contained considerably more resolution in both dimensions, and proved to be superior for detecting analyte 1H-13C correlations. Furthermore, a HSQC spectrum collected with a multiplicity-edited pulse sequence provided additional structural information valuable for identifying target analytes. The HSQC pulse sequences are ideal for collecting high-quality data sets with overnight acquisitions and logically follow the use of fast-HMQC pulse sequences to rapidly screen samples for potential target analytes. 
Use of the pulse sequences considerably improves the performance of NMR spectroscopy as a complementary technique for the screening, identification, and validation of chemical warfare agents and other small-molecule analytes present in complex mixtures and environmental samples.

  12. A new concept of pencil beam dose calculation for 40-200 keV photons using analytical dose kernels.

    PubMed

    Bartzsch, Stefan; Oelfke, Uwe

    2013-11-01

The advent of widespread kV cone-beam computed tomography in image-guided radiation therapy and special therapeutic applications of keV photons, e.g., in microbeam radiation therapy (MRT), require accurate and fast dose calculations for photon beams with energies between 40 and 200 keV. Multiple photon scattering originating from Compton scattering and the strong dependence of the photoelectric cross section on the atomic number of the interacting tissue render these dose calculations far more challenging than those established for corresponding MeV beams. This is why the analytical models of kV photon dose calculation developed so far fail to provide the required accuracy, and one has to rely on time-consuming Monte Carlo simulation techniques. In this paper, the authors introduce a novel analytical approach for kV photon dose calculations with an accuracy that is almost comparable to that of Monte Carlo simulations. First, analytical point dose and pencil beam kernels are derived for homogeneous media and compared to Monte Carlo simulations performed with the Geant4 toolkit. The dose contributions are systematically separated into contributions from the relevant orders of multiple photon scattering. Moreover, approximate scaling laws for the extension of the algorithm to inhomogeneous media are derived. The comparison of the analytically derived dose kernels in water showed an excellent agreement with the Monte Carlo method. Calculated values deviate less than 5% from Monte Carlo derived dose values, for doses above 1% of the maximum dose. The analytical structure of the kernels allows adaptation to arbitrary materials and photon spectra in the given energy range of 40-200 keV. The presented analytical methods can be employed in a fast treatment planning system for MRT. In convolution-based algorithms, dose calculation times can be reduced to a few minutes.
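The convolution step at the heart of such a pencil-beam algorithm can be sketched as follows; the Gaussian kernel below is a placeholder for illustration only, not one of the analytically derived kernels of the paper:

```python
import numpy as np
from scipy.signal import fftconvolve

def gaussian_pencil_kernel(half_width_mm, sigma_mm):
    """Placeholder radially symmetric pencil-beam kernel on a 1 mm grid,
    normalized to unit integral (the paper derives energy-dependent
    analytical kernels instead of this Gaussian stand-in)."""
    x = np.arange(-half_width_mm, half_width_mm + 1, dtype=float)
    xx, yy = np.meshgrid(x, x)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma_mm**2))
    return k / k.sum()

def dose(fluence, kernel):
    """Dose = fluence map convolved with the pencil-beam kernel."""
    return fftconvolve(fluence, kernel, mode="same")

# 1 mm grid, 20 mm wide uniform field centred in a 101x101 mm plane
fl = np.zeros((101, 101))
fl[40:61, 40:61] = 1.0
d = dose(fl, gaussian_pencil_kernel(15, 3.0))
```

The FFT-based convolution is what makes the minutes-scale calculation times quoted in the abstract plausible: the cost is dominated by a few FFTs of the dose grid rather than by per-voxel particle transport.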

  13. Facilitating Multiple Intelligences through Multimodal Learning Analytics

    ERIC Educational Resources Information Center

    Perveen, Ayesha

    2018-01-01

This paper develops a theoretical framework for employing learning analytics in online education to trace multiple learning variations of online students, treating them as learners with multiple intelligences in the sense of Howard Gardner's (1983) theory. The study first emphasizes the need to facilitate students as…

  14. Experiences with Probabilistic Analysis Applied to Controlled Systems

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Giesy, Daniel P.

    2004-01-01

This paper presents a semi-analytic method for computing frequency-dependent means, variances, and failure probabilities for arbitrarily large-order closed-loop dynamical systems possessing a single uncertain parameter or multiple highly correlated uncertain parameters. The approach is shown not to suffer from the computational challenges associated with computing failure probabilities using conventional FORM/SORM techniques. The approach is demonstrated by computing the probabilistic frequency-domain performance of an optimal feed-forward disturbance rejection scheme.
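For a single uncertain parameter, frequency-dependent means and variances can be approximated by first-order (Taylor) propagation, which is one simple instance of the semi-analytic idea described above; the second-order system and parameter values below are illustrative assumptions, not the paper's example:

```python
import numpy as np

def gain(zeta, w):
    """|H(jw)| of a unit-frequency second-order system with damping ratio zeta."""
    return 1.0 / np.abs(1.0 - w**2 + 2j * zeta * w)

def semi_analytic_stats(w, zeta0, sigma, h=1e-6):
    """First-order (Taylor) mean and variance of the frequency-dependent
    gain for a single Gaussian uncertain parameter zeta ~ N(zeta0, sigma^2)."""
    g0 = gain(zeta0, w)
    dg = (gain(zeta0 + h, w) - gain(zeta0 - h, w)) / (2.0 * h)  # central difference
    return g0, (dg * sigma) ** 2

w = np.linspace(0.1, 2.0, 50)
mean, var = semi_analytic_stats(w, zeta0=0.2, sigma=0.01)
```

For small parameter uncertainty this agrees closely with Monte Carlo sampling while requiring only two extra frequency-response evaluations, which is the kind of computational saving the abstract contrasts with FORM/SORM.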

  15. The 2D analytic signal for envelope detection and feature extraction on ultrasound images.

    PubMed

    Wachinger, Christian; Klein, Tassilo; Navab, Nassir

    2012-08-01

    The fundamental property of the analytic signal is the split of identity, meaning the separation of qualitative and quantitative information in form of the local phase and the local amplitude, respectively. Especially the structural representation, independent of brightness and contrast, of the local phase is interesting for numerous image processing tasks. Recently, the extension of the analytic signal from 1D to 2D, covering also intrinsic 2D structures, was proposed. We show the advantages of this improved concept on ultrasound RF and B-mode images. Precisely, we use the 2D analytic signal for the envelope detection of RF data. This leads to advantages for the extraction of the information-bearing signal from the modulated carrier wave. We illustrate this, first, by visual assessment of the images, and second, by performing goodness-of-fit tests to a Nakagami distribution, indicating a clear improvement of statistical properties. The evaluation is performed for multiple window sizes and parameter estimation techniques. Finally, we show that the 2D analytic signal allows for an improved estimation of local features on B-mode images. Copyright © 2012 Elsevier B.V. All rights reserved.
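The split of identity is easiest to see in the 1D analytic signal, computed here with a Hilbert transform as commonly used for ultrasound envelope detection (the 2D extension discussed in the paper additionally covers intrinsically 2D structures); the signal parameters are illustrative:

```python
import numpy as np
from scipy.signal import hilbert

# The analytic signal x_a = x + i*H[x] separates an RF line into local
# amplitude |x_a| (envelope, quantitative) and local phase angle(x_a)
# (structural, brightness/contrast-independent).
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
envelope = 1.0 + 0.5 * np.sin(2 * np.pi * 5 * t)   # slow amplitude modulation
rf = envelope * np.cos(2 * np.pi * 100 * t)        # 100 Hz "carrier"

analytic = hilbert(rf)
detected = np.abs(analytic)             # local amplitude (B-mode-style envelope)
phase = np.unwrap(np.angle(analytic))   # local phase (structural information)
```

The detected envelope recovers the modulation without demodulation filters, and the slope of the unwrapped phase recovers the carrier frequency, illustrating the amplitude/phase split the abstract exploits.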

  16. Multiple Theoretical Lenses as an Analytical Strategy in Researching Group Discussions

    ERIC Educational Resources Information Center

    Berge, Maria; Ingerman, Åke

    2017-01-01

    Background: In science education today, there is an emerging focus on what is happening in situ, making use of an array of analytical traditions. Common practice is to use one specific analytical framing within a research project, but there are projects that make use of multiple analytical framings to further the understanding of the same data,…

  17. Detection and quantification of cocaine and benzoylecgonine in meconium using solid phase extraction and UPLC/MS/MS.

    PubMed

    Gunn, Josh; Kriger, Scott; Terrell, Andrea R

    2010-01-01

    The simultaneous determination and quantification of cocaine and its major metabolite, benzoylecgonine, in meconium using UPLC-MS/MS is described. Ultra-performance liquid chromatography (UPLC) is an emerging analytical technique that draws upon the principles of chromatography to run separations at higher flow rates for increased speed, while simultaneously achieving superior resolution and sensitivity. Extraction of cocaine and benzoylecgonine from the homogenized meconium matrix was achieved with a preliminary protein precipitation, or protein 'crash', employing cold acetonitrile, followed by a mixed-mode solid phase extraction (SPE). Following elution from the SPE cartridge, eluents were dried down under nitrogen, reconstituted in 200 microL of DI water:acetonitrile (ACN) (75:25), and injected onto the UPLC/MS/MS for analysis. The increased speed and separation efficiency afforded by UPLC allowed for the separation and subsequent quantification of both analytes in less than 2 min. Analytes were quantified using multiple reaction monitoring (MRM) and six-point calibration curves constructed in negative blood. Limits of detection for both analytes were 3 ng/g, and the lower limit of quantitation (LLOQ) was 30 ng/g.
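Quantitation against a six-point calibration curve, as described above, amounts to fitting a line relating spiked concentration to the measured MRM response and inverting it for unknowns. A minimal sketch with hypothetical calibration data (the concentrations and area ratios below are invented for illustration, not taken from the study):

```python
import numpy as np

# Hypothetical six-point calibration: spiked concentrations (ng/g) versus
# analyte/internal-standard peak-area ratios (values are illustrative).
conc = np.array([30, 100, 300, 600, 1000, 2000], dtype=float)  # LLOQ = 30 ng/g
ratio = np.array([0.031, 0.102, 0.295, 0.610, 0.985, 2.040])

# Unweighted linear fit: ratio = slope * conc + intercept.
slope, intercept = np.polyfit(conc, ratio, 1)

# Back-calculate an unknown sample from its measured area ratio.
unknown_ratio = 0.45
unknown_conc = (unknown_ratio - intercept) / slope
print(round(unknown_conc, 1))  # ng/g
```

In practice, bioanalytical methods often use weighted regression (e.g. 1/x or 1/x²) to handle heteroscedastic response across the calibration range; the unweighted fit here is only the simplest case.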

  18. Common aero vehicle autonomous reentry trajectory optimization satisfying waypoint and no-fly zone constraints

    NASA Astrophysics Data System (ADS)

    Jorris, Timothy R.

    2007-12-01

    To support the Air Force's Global Reach concept, a Common Aero Vehicle is being designed to support the Global Strike mission. "Waypoints" are specified for reconnaissance or multiple payload deployments and "no-fly zones" are specified for geopolitical restrictions or threat avoidance. Due to time critical targets and multiple scenario analysis, an autonomous solution is preferred over a time-intensive, manually iterative one. Thus, a real-time or near real-time autonomous trajectory optimization technique is presented to minimize the flight time, satisfy terminal and intermediate constraints, and remain within the specified vehicle heating and control limitations. This research uses the Hypersonic Cruise Vehicle (HCV) as a simplified two-dimensional platform to compare multiple solution techniques. The solution techniques include a unique geometric approach developed herein, a derived analytical dynamic optimization technique, and a rapidly emerging collocation numerical approach. This up-and-coming numerical technique is a direct solution method involving discretization then dualization, with pseudospectral methods and nonlinear programming used to converge to the optimal solution. This numerical approach is applied to the Common Aero Vehicle (CAV) as the test platform for the full three-dimensional reentry trajectory optimization problem. The culmination of this research is the verification of the optimality of this proposed numerical technique, as shown for both the two-dimensional and three-dimensional models. Additionally, user implementation strategies are presented to improve accuracy and enhance solution convergence. 
Thus, the contributions of this research are the geometric approach, the user implementation strategies, and the determination and verification of a numerical solution technique for the optimal reentry trajectory problem that minimizes time to target while satisfying vehicle dynamics and control limitations as well as heating, waypoint, and no-fly zone constraints.

  19. Multiple control strategies for prevention of avian influenza pandemic.

    PubMed

    Ullah, Roman; Zaman, Gul; Islam, Saeed

    2014-01-01

    We present the prevention of an avian influenza pandemic by adjusting multiple control functions in a human-to-human transmittable avian influenza model. First we show the existence of an optimal control for the problem; then, using both analytical and numerical techniques, we investigate cost-effective control effects for the prevention of disease transmission. To do this, we use three control functions: the effort to reduce the number of contacts with humans infected with mutant avian influenza, the antiviral treatment of infected individuals, and the effort to reduce the number of infected birds. We completely characterize the optimal control and compute a numerical solution of the optimality system using an iterative method.

  20. Modelling a suitable location for Urban Solid Waste Management using AHP method and GIS -A geospatial approach and MCDM Model

    NASA Astrophysics Data System (ADS)

    Iqbal, M.; Islam, A.; Hossain, A.; Mustaque, S.

    2016-12-01

    Multi-Criteria Decision Making (MCDM) is an advanced analytical approach for reaching an appropriate result or decision in a multiple-criteria environment. In current research, MCDM techniques provide a systematic process for arriving at a sound decision among conflicting criteria. In addition, present-day geospatial approaches (e.g., remote sensing and GIS) provide advanced technical means of collecting, processing, and analyzing diverse spatial data at once. GIS and remote sensing, together with MCDM techniques, can thus form an effective platform for solving complex decision-making problems, and the combination has been used effectively for site selection in urban solid waste management policy. The most popular MCDM technique is the Weighted Linear Combination (WLC) method, while the Analytic Hierarchy Process (AHP) is another popular and consistent technique used worldwide for dependable decision making. Consequently, the main objective of this study is to develop an AHP model as an MCDM technique, integrated with a Geographic Information System (GIS), to select a suitable landfill site for urban solid waste management. Here, the AHP technique is used as an MCDM tool to select the most suitable landfill location. To protect the urban environment in a sustainable way, municipal waste requires an appropriate landfill site chosen with the environmental, geological, social, and technical aspects of the region in mind. An MCDM model was generated from five criteria classes related to these aspects using the AHP method, and the results were input into GIS to produce the final suitability map for urban solid waste management. The final suitability analysis shows that 12.2% of the study area, corresponding to 22.89 km2, is suitable.
In this study, the Keraniganj sub-district of Dhaka district in Bangladesh was chosen as the study area; it is a densely populated area that currently has an unmanaged waste management system and, in particular, lacks suitable landfill sites for waste dumping.
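The AHP step used above derives criterion weights from a pairwise comparison matrix via its principal eigenvector, together with a consistency check. A minimal sketch for three hypothetical criteria (the comparison values, criterion names, and matrix size are illustrative, not taken from the study, which used five criteria classes):

```python
import numpy as np

# Hypothetical AHP pairwise comparison matrix for three criteria
# (e.g. environmental, geological, social), using Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0],   # environmental vs. geological, social
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Criterion weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency check: CR = CI / RI, with RI = 0.58 for n = 3 (Saaty).
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print(w, cr)  # CR < 0.1 indicates an acceptably consistent matrix
```

In a GIS workflow like the one described, each criterion raster would then be scored and combined with these weights (a weighted overlay) to produce the suitability map.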

  1. Thermal Studies of Ammonium Cyanide Reactions: A Model for Thermal Alteration of Prebiotic Compounds in Meteorite Parent Bodies

    NASA Technical Reports Server (NTRS)

    Hammer, P. G.; Locke, D. R.; Burton, A. S.; Callahan, M. P.

    2017-01-01

    Organic compounds in carbonaceous chondrites were likely transformed by a variety of parent body processes, including thermal and aqueous processing. Here, we used multiple analytical techniques to analyze the products of ammonium cyanide reactions heated at different temperatures and for different durations. The goal of this study is to better understand the effect of hydrothermal alteration on cyanide chemistry, which is believed to be responsible for the abiotic synthesis of purine nucleobases and their structural analogs detected in carbonaceous chondrites.

  2. Analytical techniques and instrumentation, a compilation

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Procedures for conducting materials tests and structural analyses of aerospace components are presented as a part of the NASA technology utilization program. Some of the subjects discussed are as follows: (1) failures in cryogenic tank insulation, (2) friction characteristics of graphite and graphite-metal combinations, (3) evaluation of polymeric products in thermal-vacuum environment, (4) erosion of metals by multiple impacts with water, (5) mass loading effects on vibrated ring and shell structures, (6) nonlinear damping in structures, and (7) method for estimating reliability of randomly excited structures.

  3. Extending the Distributed Lag Model framework to handle chemical mixtures.

    PubMed

    Bello, Ghalib A; Arora, Manish; Austin, Christine; Horton, Megan K; Wright, Robert O; Gennings, Chris

    2017-07-01

    Distributed Lag Models (DLMs) are used in environmental health studies to analyze the time-delayed effect of an exposure on an outcome of interest. Given the increasing need for analytical tools for evaluation of the effects of exposure to multi-pollutant mixtures, this study attempts to extend the classical DLM framework to accommodate and evaluate multiple longitudinally observed exposures. We introduce 2 techniques for quantifying the time-varying mixture effect of multiple exposures on an outcome of interest. Lagged WQS, the first technique, is based on Weighted Quantile Sum (WQS) regression, a penalized regression method that estimates mixture effects using a weighted index. We also introduce Tree-based DLMs, a nonparametric alternative for assessment of lagged mixture effects. This technique is based on the Random Forest (RF) algorithm, a nonparametric, tree-based estimation technique that has shown excellent performance in a wide variety of domains. In a simulation study, we tested the feasibility of these techniques and evaluated their performance in comparison to standard methodology. Both methods exhibited relatively robust performance, accurately capturing pre-defined non-linear functional relationships in different simulation settings. Further, we applied these techniques to data on perinatal exposure to environmental metal toxicants, with the goal of evaluating the effects of exposure on neurodevelopment. Our methods identified critical neurodevelopmental windows showing significant sensitivity to metal mixtures. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which petroleum is derived, is composed of organic compounds together with some trace elements. These compounds give an insight into the origin, thermal maturity, and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring such geochemical data. Owing to progress in the development of new analytical techniques, many hitherto unresolved petroleum exploration problems have been settled. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the analytical techniques that have helped in understanding the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods as applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  6. Sensor Data Qualification System (SDQS) Implementation Study

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Melcher, Kevin; Fulton, Christopher; Maul, William

    2009-01-01

    The Sensor Data Qualification System (SDQS) is being developed to provide a sensor fault detection capability for NASA's next-generation launch vehicles. In addition to traditional data qualification techniques (such as limit checks, rate-of-change checks, and hardware redundancy checks), SDQS can provide augmented capability through additional techniques that exploit analytical redundancy relationships to enable faster and more sensitive sensor fault detection. This paper documents the results of a study conducted to determine the best approach for implementing a SDQS network configuration that spans multiple subsystems, similar to those that may be implemented on future vehicles. The best approach is defined as the one that minimizes computational resource requirements without impairing the detection of sensor failures.

  7. High-voltage spark atomic emission detector for gas chromatography

    NASA Technical Reports Server (NTRS)

    Calkin, C. L.; Koeplin, S. M.; Crouch, S. R.

    1982-01-01

    A dc-powered, double-gap, miniature nanosecond spark source for emission spectrochemical analysis of gas chromatographic effluents is described. The spark is formed between two thoriated tungsten electrodes by the discharge of a coaxial capacitor. The spark detector is coupled to the gas chromatograph by a heated transfer line. The gas chromatographic effluent is introduced into the heated spark chamber where atomization and excitation of the effluent occurs upon breakdown of the analytical gap. A microcomputer-controlled data acquisition system allows the implementation of time-resolution techniques to distinguish between the analyte emission and the background continuum produced by the spark discharge. Multiple sparks are computer averaged to improve the signal-to-noise ratio. The application of the spark detector for element-selective detection of metals and nonmetals is reported.

  8. Multiple Scattering Effects on Pulse Propagation in Optically Turbid Media.

    NASA Astrophysics Data System (ADS)

    Joelson, Bradley David

    The effects of multiple scattering in an optically turbid medium are examined for an impulse solution to the radiative transfer equation for a variety of geometries and phase functions. In regions where the complexity of the phase function proved too cumbersome for analytic methods, Monte Carlo techniques were developed to describe the entire scalar radiance distribution. The determination of a general spread function is strongly dependent on geometry and on particular regions where limits can be placed on the variables of the problem. Hence, the general spread function is first simplified by considering optical regions which reduce the complexity of the variable dependence. First, in the small-angle limit we calculate some contracted spread functions along with their moments and then use Monte Carlo techniques to establish the limitations imposed by the small-angle approximation in planar geometry. The point spread function (PSF) for a spherical geometry is calculated for the full angular spread in the forward direction of ocean waters, using Monte Carlo methods at optically thin and moderate depths and analytic methods in the diffusion domain. The angular dependence of the PSF for various ocean waters is examined for a range of optical parameters. The analytic method used in the diffusion calculation is justified by examining the angular dependence of the radiance of an impulse solution in a planar geometry for a strongly forward-peaked Henyey-Greenstein phase function with an asymmetry factor approximately equal to that of the ocean phase functions. The Legendre moments of the radiance are examined in order to assess the viability of the diffusion approximation, which assumes a linearly anisotropic angular distribution for the radiance. A realistic lidar calculation is performed for a variety of ocean waters to determine the effects of multiple scattering on the determination of the speed of sound using the range-gated frequency spectrum of the lidar signal.
It is shown that the optical properties of the ocean help to ensure a single-scatter form for the frequency spectra of the lidar signal. These spectra can then be used to compute the speed of sound and the backscatter probability.

  9. Considerations in detecting CDC select agents under field conditions

    NASA Astrophysics Data System (ADS)

    Spinelli, Charles; Soelberg, Scott; Swanson, Nathaneal; Furlong, Clement; Baker, Paul

    2008-04-01

    Surface Plasmon Resonance (SPR) has become a widely accepted technique for real-time detection of interactions between receptor molecules and ligands. Antibody may serve as the receptor and can be attached to the gold surface of the SPR device, while candidate analyte fluids contact the detecting antibody. Minute, but detectable, changes in refractive indices (RI) indicate that analyte has bound to the antibody. A decade ago, an inexpensive, robust, miniature and fully integrated SPR chip, called SPREETA, was developed. University of Washington (UW) researchers subsequently developed a portable, temperature-regulated instrument, called SPIRIT, to simultaneously use eight of these three-channel SPREETA chips. A SPIRIT prototype instrument was tested in the field, coupled to a remote reporting system on a surrogate unmanned aerial vehicle (UAV). Two target protein analytes were released sequentially as aerosols with low analyte concentration during each of three flights and were successfully detected and verified. Laboratory experimentation with a more advanced SPIRIT instrument demonstrated detection of very low levels of several select biological agents that might be employed by bioterrorists. Agent detection under field-like conditions is more challenging, especially as analyte concentrations are reduced and complex matrices are introduced. Two different sample preconditioning protocols have been developed for select agents in complex matrices. Use of these preconditioning techniques has allowed laboratory detection in spiked heavy mud of Francisella tularensis at 10^3 CFU/ml, Bacillus anthracis spores at 10^3 CFU/ml, Staphylococcal enterotoxin B (SEB) at 1 ng/ml, and Vaccinia virus (a smallpox simulant) at 10^5 PFU/ml. Ongoing experiments are aimed at simultaneous detection of multiple agents in spiked heavy mud, using a multiplex preconditioning protocol.

  10. Importance of Preserving Cross-correlation in developing Statistically Downscaled Climate Forcings and in estimating Land-surface Fluxes and States

    NASA Astrophysics Data System (ADS)

    Das Bhowmik, R.; Arumugam, S.

    2015-12-01

    Multivariate downscaling techniques have exhibited superiority over univariate regression schemes in preserving cross-correlations between multiple variables (precipitation and temperature) from GCMs. This study focuses on two aspects: (a) developing analytical solutions for estimating the biases in cross-correlations that arise from univariate downscaling approaches, and (b) quantifying the uncertainty in land-surface states and fluxes due to biases in the cross-correlations of downscaled climate forcings. Both aspects are evaluated using climate forcings available from historical climate simulations and from CMIP5 hindcasts over the entire US. The analytical solution relates the univariate regression parameters, the coefficient of determination of the regression, and the covariance ratio between GCM and downscaled values. The analytical solutions are compared with downscaled univariate forcings by choosing the desired p-value (Type-1 error) for preserving the observed cross-correlation. To quantify the impacts of cross-correlation biases on estimates of streamflow and groundwater, we corrupt the downscaled climate forcings with different cross-correlation structures.
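The core issue can be demonstrated directly: Pearson correlation is invariant under affine (linear) maps, so when each variable is downscaled by its own univariate regression, the downscaled fields inherit the GCM's precipitation-temperature cross-correlation, and any mismatch with the observed cross-correlation persists as a bias. A small NumPy demonstration (all numbers illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated GCM precipitation/temperature with a cross-correlation of -0.3
# (illustrative; the observed cross-correlation might differ, e.g. -0.6).
n = 5000
cov = np.array([[1.0, -0.3], [-0.3, 1.0]])
p_gcm, t_gcm = rng.multivariate_normal([0.0, 0.0], cov, size=n).T

# Univariate downscaling: an independent linear (regression) map per variable.
p_ds = 2.0 * p_gcm + 5.0
t_ds = 0.7 * t_gcm - 1.0

r_gcm = np.corrcoef(p_gcm, t_gcm)[0, 1]
r_ds = np.corrcoef(p_ds, t_ds)[0, 1]
# Affine maps leave the Pearson correlation unchanged, so the downscaled
# fields carry the GCM cross-correlation; a GCM-vs-observed mismatch is a
# bias that independent univariate schemes cannot correct.
print(r_gcm, r_ds)
```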

  11. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  12. Microfabricated capillary electrophoresis chip and method for simultaneously detecting multiple redox labels

    DOEpatents

    Mathies, Richard A.; Singhal, Pankaj; Xie, Jin; Glazer, Alexander N.

    2002-01-01

    This invention relates to a microfabricated capillary electrophoresis chip for detecting multiple redox-active labels simultaneously using a matrix coding scheme and to a method of selectively labeling analytes for simultaneous electrochemical detection of multiple label-analyte conjugates after electrophoretic or chromatographic separation.

  13. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  14. A Quantile Regression Approach to Understanding the Relations Between Morphological Awareness, Vocabulary, and Reading Comprehension in Adult Basic Education Students

    PubMed Central

    Tighe, Elizabeth L.; Schatschneider, Christopher

    2015-01-01

    The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in Adult Basic Education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. PMID:25351773
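Quantile regression estimates a conditional quantile by minimizing the pinball (check) loss, which can be posed as a linear program. A minimal sketch on simulated heteroscedastic data (not the study's data), using SciPy's general LP solver rather than a dedicated quantile-regression package; the variable names and data-generating process are invented for illustration:

```python
import numpy as np
from scipy.optimize import linprog

def quantile_regression(X, y, tau):
    """Fit y ~ X at quantile tau by minimizing the pinball loss,
    posed as a linear program (a minimal sketch, not production code)."""
    n = len(y)
    Xd = np.column_stack([np.ones(n), X])
    p = Xd.shape[1]
    # Variables: beta (free), u >= 0, v >= 0 with y - Xd @ beta = u - v;
    # objective = tau * sum(u) + (1 - tau) * sum(v).
    c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
    A_eq = np.hstack([Xd, np.eye(n), -np.eye(n)])
    bounds = [(None, None)] * p + [(0, None)] * (2 * n)
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
    return res.x[:p]

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=400)
# Heteroscedastic data: the spread grows with x, so quantile slopes differ.
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 + 0.3 * x)

b_lo = quantile_regression(x[:, None], y, tau=0.1)
b_hi = quantile_regression(x[:, None], y, tau=0.9)
# The 0.9-quantile slope exceeds the 0.1-quantile slope, mirroring how a
# predictor's effect can vary across the outcome distribution.
print(b_lo[1], b_hi[1])
```

This is the same statistical idea the study exploits: fitting at several quantiles reveals whether a predictor (here the slope) matters more at the low or high end of the outcome distribution.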

  15. Development and validation of a sensitive and fast UPLC-MS/MS method for simultaneous determination of seven bioactive compounds in rat plasma after oral administration of Guizhi-gancao decoction.

    PubMed

    Ji, Bin; Zhuo, Limeng; Yang, Bin; Wang, Yang; Li, Lin; Yu, Miao; Zhao, Yunli; Yu, Zhiguo

    2017-04-15

    A rapid, sensitive, selective, and accurate UPLC-MS/MS method was developed and fully validated for the simultaneous determination of cinnamaldehyde, cinnamic acid, 2-methoxy cinnamic acid, glycyrrhizic acid, glycyrrhetinic acid, liquiritigenin, and isoliquiritin in rat plasma after oral administration of Guizhi-gancao decoction. Plasma samples were processed with a simple protein precipitation technique using acetonitrile, followed by chromatographic separation on a Thermo Hypersil GOLD C18 column. An 11.0 min linear gradient elution was used at a flow rate of 0.2 mL/min with a mobile phase of 0.1% acetic acid containing 0.2 mM ammonium acetate in water and acetonitrile. The analytes and the internal standard, schisandrin, were detected using both positive and negative ion electrospray ionization in multiple reaction monitoring mode. The developed method was validated for intra-day and inter-day accuracy and precision, whose values fell within the acceptable limits. The matrix effect was found to be minimal. Recovery efficiency of all the analytes was found to be >60%. Stability results showed that the analytes were stable under all tested conditions. This validated method was successfully used to study the pharmacokinetics of multiple compounds in rat plasma after oral administration of Guizhi-gancao decoction. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Graphene Nanoplatelet-Polymer Chemiresistive Sensor Arrays for the Detection and Discrimination of Chemical Warfare Agent Simulants.

    PubMed

    Wiederoder, Michael S; Nallon, Eric C; Weiss, Matt; McGraw, Shannon K; Schnee, Vincent P; Bright, Collin J; Polcha, Michael P; Paffenroth, Randy; Uzarski, Joshua R

    2017-11-22

    A cross-reactive array of semiselective chemiresistive sensors made of polymer-graphene nanoplatelet (GNP) composite-coated electrodes was examined for detection and discrimination of chemical warfare agents (CWA). The arrays employ a set of chemically diverse polymers to generate a unique response signature for multiple CWA simulants and background interferents. The developed sensors' signal remains consistent after repeated exposures to multiple analytes for up to 5 days, with a similar signal magnitude across different replicate sensors bearing the same polymer-GNP coating. An array of 12 sensors, each coated with a different polymer-GNP mixture, was exposed 100 times to a cycle of single-analyte vapors consisting of 5 chemically similar CWA simulants and 8 common background interferents. The collected data were vector normalized to reduce concentration dependency, z-scored to account for baseline drift and signal-to-noise ratio, and Kalman filtered to reduce noise. The processed data were dimensionally reduced with principal component analysis and analyzed with four different machine learning algorithms to evaluate discrimination capabilities. For the 5 similarly structured CWA simulants alone, 100% classification accuracy was achieved; for all analytes tested, 99% classification accuracy was achieved, demonstrating the CWA discrimination capabilities of the developed system. The novel sensor fabrication methods and data processing techniques are attractive for the development of sensor platforms for discrimination of CWA and other classes of chemical vapors.
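The processing chain described above (vector normalization, per-channel z-scoring, PCA, then classification) can be sketched on simulated sensor-array data. The signatures, noise level, and nearest-centroid classifier below are illustrative stand-ins, not the paper's actual data, Kalman filtering step, or machine-learning algorithms:

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated 12-sensor array: each of 3 analytes has a fixed response
# signature; magnitude varies with concentration (all values illustrative).
signatures = rng.uniform(0.2, 1.0, size=(3, 12))
labels = np.repeat(np.arange(3), 40)            # 40 exposures per analyte
conc = rng.uniform(0.5, 2.0, size=(120, 1))     # concentration factor
X = conc * signatures[labels] + rng.normal(scale=0.05, size=(120, 12))

# 1) Vector (L2) normalization reduces concentration dependence.
Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
# 2) Z-scoring each channel removes per-sensor baseline and scale.
Xz = (Xn - Xn.mean(axis=0)) / Xn.std(axis=0)
# 3) PCA via SVD: project onto the first two principal components.
_, _, Vt = np.linalg.svd(Xz, full_matrices=False)
scores = Xz @ Vt[:2].T
# 4) Nearest-centroid classification in PCA space (a simple stand-in
#    for the paper's machine-learning step).
centroids = np.array([scores[labels == c].mean(axis=0) for c in range(3)])
pred = np.argmin(((scores[:, None, :] - centroids) ** 2).sum(axis=-1), axis=1)
print((pred == labels).mean())
```

The key design point mirrors the abstract: normalization and z-scoring strip away concentration and baseline effects so that only the shape of the cross-sensor response pattern drives classification.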

  17. UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.

    PubMed

    Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun

    2013-12-01

    Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely used methods based on probabilistic modeling have drawbacks in terms of consistency across multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora such as the InfoVis/VAST paper data set and product review data sets.
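The nonnegative matrix factorization underlying UTOPIAN approximates a document-term matrix V by W·H with nonnegative factors, where the rows of H act as topics (nonnegative term loadings). A minimal sketch using the classic Lee-Seung multiplicative updates on a tiny synthetic corpus (UTOPIAN itself uses a semi-supervised NMF variant; the matrices below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Tiny synthetic document-term matrix: 2 ground-truth "topics" over a
# 6-term vocabulary, mixed into 8 documents (all values illustrative).
topics = np.array([[5, 4, 3, 0, 0, 0],
                   [0, 0, 0, 4, 5, 3]], dtype=float)
mix = rng.uniform(size=(8, 2))
V = mix @ topics + 0.01          # documents x terms, strictly positive

# NMF via Lee-Seung multiplicative updates: V ~ W @ H with W, H >= 0.
k = 2
W = rng.uniform(0.1, 1.0, size=(V.shape[0], k))
H = rng.uniform(0.1, 1.0, size=(k, V.shape[1]))
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-9)

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
# Rows of H are interpretable as topics; a semi-supervised formulation
# like UTOPIAN's additionally lets users pin, split, or merge these rows.
print(err)
```

One reason NMF suits interactive use, as the abstract notes, is determinism given an initialization: unlike sampled LDA runs, repeated factorizations from the same starting point reproduce the same topics.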

  18. Silica Modified with Polyaniline as a Potential Sorbent for Matrix Solid Phase Dispersion (MSPD) and Dispersive Solid Phase Extraction (d-SPE) of Plant Samples

    PubMed Central

    Sowa, Ireneusz; Wójciak-Kosior, Magdalena; Strzemski, Maciej; Sawicki, Jan; Staniak, Michał; Dresler, Sławomir; Szwerc, Wojciech; Mołdoch, Jarosław; Latalski, Michał

    2018-01-01

    Polyaniline (PANI) is one of the best known conductive polymers, with multiple applications. Recently, it has also been used in separation techniques, mostly as a component of composites for solid-phase microextraction (SPME). In the present paper, a sorbent obtained by in situ polymerization of aniline directly on silica gel particles (Si-PANI) was used for dispersive solid phase extraction (d-SPE) and matrix solid-phase dispersion (MSPD). The efficiency of both techniques was evaluated by quantitative analysis using high performance liquid chromatography with diode array detection (HPLC-DAD). The quality of the sorbent was verified by Raman spectroscopy and by microscopy combined with an automated procedure using computer image analysis. For the extraction experiments, triterpenes were chosen as model compounds. The optimal conditions were as follows: protonated Si-PANI impregnated with water, a 160/1 sorbent/analyte ratio, 3 min of extraction time, 4 min of desorption time, and a methanolic solution of ammonia for elution of the analytes. The proposed procedure was successfully used for the pretreatment of plant samples. PMID:29565297

  19. Solid-Phase Extraction (SPE): Principles and Applications in Food Samples.

    PubMed

    Ötles, Semih; Kartal, Canan

    2016-01-01

    Solid-Phase Extraction (SPE) is a sample preparation method practiced in numerous application fields because of its many advantages over traditional methods. SPE was invented as an alternative to liquid/liquid extraction and eliminated several of its disadvantages, such as the use of large amounts of solvent, extended operation times and procedural steps, potential sources of error, and high cost. Moreover, SPE can optionally be combined with other analytical methods and sample preparation techniques. The versatility of SPE makes it a useful tool for many purposes: isolation, concentration, purification, and clean-up are the main approaches in the practice of this method. Foods represent a complicated matrix and can occur in different physical states, such as solid, viscous, or liquid. The sample preparation step therefore plays a particularly important role in the determination of specific compounds in foods. SPE offers many opportunities not only for the analysis of a large diversity of food samples but also for optimization and further advances. This review aims to provide a comprehensive overview of the basic principles of SPE and its applications for many analytes in the food matrix.

  20. Photochemical Degradation of the Anticancer Drug Bortezomib by V-UV/UV (185/254 nm) Investigated by (1)H NMR Fingerprinting: A Way to Follow Aromaticity Evolution.

    PubMed

    Martignac, Marion; Balayssac, Stéphane; Gilard, Véronique; Benoit-Marquié, Florence

    2015-06-18

    We have investigated the removal of bortezomib, an anticancer drug prescribed in multiple myeloma, using the photochemical advanced oxidation process of V-UV/UV (185/254 nm). We used two complementary analytical techniques to follow the removal rate of bortezomib. Nuclear magnetic resonance (NMR) is a nonselective method requiring no prior knowledge of the structures of the byproducts and permits us to provide a spectral signature (fingerprinting approach). This untargeted method provides clues to the molecular structure changes and information on the degradation of the parent drug during the irradiation process. This holistic NMR approach could provide information for monitoring aromaticity evolution. We use liquid chromatography, coupled with high-resolution mass spectrometry (LC-MS), to correlate results obtained by (1)H NMR and for accurate identification of the byproducts, in order to understand the mechanistic degradation pathways of bortezomib. The results show that primary byproducts come from photoassisted deboronation of bortezomib at 254 nm. A secondary byproduct of pyrazinecarboxamide was also identified. We obtained a reliable correlation between these two analytical techniques.

  1. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis]

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  2. The Use of Meta-Analytic Statistical Significance Testing

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Pigott, Terri D.

    2015-01-01

    Meta-analysis multiplicity, the practice of conducting multiple tests of statistical significance within one review, is an underdeveloped area of the literature. We address this issue by considering how Type I errors can impact meta-analytic results, suggest how statistical power may be affected through the use of multiplicity corrections, and propose how…
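The multiplicity problem the abstract describes can be illustrated numerically: with k independent tests each run at significance level alpha, the family-wise Type I error rate grows as 1 - (1 - alpha)^k, and the classical Bonferroni correction divides alpha by k. A minimal sketch (illustrative only, not from the paper):

```python
# Family-wise Type I error inflation across k significance tests,
# and the Bonferroni-corrected per-test threshold.
alpha = 0.05
for k in (1, 5, 20):
    fwer = 1 - (1 - alpha) ** k      # P(at least one false positive)
    bonferroni_alpha = alpha / k     # corrected per-test significance level
    print(k, round(fwer, 4), bonferroni_alpha)
```

At k = 20 the chance of at least one spurious "significant" finding already exceeds 64%, which is why multiplicity corrections matter for meta-analytic reviews.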

  3. Matisse: A Visual Analytics System for Exploring Emotion Trends in Social Media Text Streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Drouhard, Margaret MEG G; Beaver, Justin M

    Dynamically mining textual information streams to gain real-time situational awareness is especially challenging with social media systems, where throughput and velocity properties push the limits of a static analytical approach. In this paper, we describe an interactive visual analytics system, called Matisse, that aids with the discovery and investigation of trends in streaming text. Matisse addresses the challenges inherent to text stream mining through the following technical contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) interactive coordinated visualizations, and (4) a flexible drill-down interaction scheme that accesses multiple levels of detail. In addition to positive/negative sentiment prediction, Matisse provides fine-grained emotion classification based on Valence, Arousal, and Dominance dimensions and a novel machine learning process. Information from the sentiment/emotion analytics is fused with raw data and summary information to feed temporal, geospatial, term frequency, and scatterplot visualizations using a multi-scale, coordinated interaction model. After describing these techniques, we conclude with a practical case study focused on analyzing the Twitter sample stream during the week of the 2013 Boston Marathon bombings. The case study demonstrates the effectiveness of Matisse at providing guided situational awareness of significant trends in social media streams by orchestrating computational power and human cognition.

  4. Big data in health care: using analytics to identify and manage high-risk and high-cost patients.

    PubMed

    Bates, David W; Saria, Suchi; Ohno-Machado, Lucila; Shah, Anand; Escobar, Gabriel

    2014-07-01

    The US health care system is rapidly adopting electronic health records, which will dramatically increase the quantity of clinical data that are available electronically. Simultaneously, rapid progress has been made in clinical analytics--techniques for analyzing large quantities of data and gleaning new insights from that analysis--which is part of what is known as big data. As a result, there are unprecedented opportunities to use big data to reduce the costs of health care in the United States. We present six use cases--that is, key examples--where some of the clearest opportunities exist to reduce costs through the use of big data: high-cost patients, readmissions, triage, decompensation (when a patient's condition worsens), adverse events, and treatment optimization for diseases affecting multiple organ systems. We discuss the types of insights that are likely to emerge from clinical analytics, the types of data needed to obtain such insights, and the infrastructure--analytics, algorithms, registries, assessment scores, monitoring devices, and so forth--that organizations will need to perform the necessary analyses and to implement changes that will improve care while reducing costs. Our findings have policy implications for regulatory oversight, ways to address privacy concerns, and the support of research on analytics. Project HOPE—The People-to-People Health Foundation, Inc.

  5. Metrological approaches to organic chemical purity: primary reference materials for vitamin D metabolites.

    PubMed

    Nelson, Michael A; Bedner, Mary; Lang, Brian E; Toman, Blaza; Lippa, Katrice A

    2015-11-01

    Given the critical role of pure, organic compound primary reference standards used to characterize and certify chemical Certified Reference Materials (CRMs), it is essential that associated mass purity assessments be fit-for-purpose, represented by an appropriate uncertainty interval, and metrologically sound. The mass fraction purities (% g/g) of 25-hydroxyvitamin D (25(OH)D) reference standards used to produce and certify values for clinical vitamin D metabolite CRMs were investigated by multiple orthogonal quantitative measurement techniques. Quantitative (1)H-nuclear magnetic resonance spectroscopy (qNMR) was performed to establish traceability of these materials to the International System of Units (SI) and to directly assess the principal analyte species. The 25(OH)D standards contained volatile and water impurities, as well as structurally related impurities that are difficult to observe by chromatographic methods or to distinguish from the principal 25(OH)D species by one-dimensional NMR. These impurities have the potential to introduce significant biases into purity investigations in which a limited number of measurands are quantified. Combining complementary information from multiple analytical methods, using both direct and indirect measurement techniques, enabled mitigation of these biases. Purities of 25(OH)D reference standards and associated uncertainties were determined using frequentist and Bayesian statistical models to combine data acquired via qNMR, liquid chromatography with UV absorbance and atmospheric pressure chemical ionization mass spectrometric detection (LC-UV, LC-APCI-MS), thermogravimetric analysis (TGA), and Karl Fischer (KF) titration.

  6. Pathways to Identity: Aiding Law Enforcement in Identification Tasks With Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruce, Joseph R.; Scholtz, Jean; Hodges, Duncan

    The nature of identity has changed dramatically in recent years and has grown in complexity. Identities are defined in multiple domains: biological and psychological elements strongly contribute, but biographical and cyber elements are also necessary to complete the picture. Law enforcement is beginning to adjust to these changes, recognizing the importance of identity in criminal justice. The SuperIdentity project seeks to aid law enforcement officials in their identification tasks through research into techniques for discovering identity traits, generation of statistical models of identity, and analysis of identity traits through visualization. We present use cases compiled through user interviews in multiple fields, including law enforcement, as well as the modeling and visualization tools designed to aid in those use cases.

  7. Practical Guidance for Conducting Mediation Analysis With Multiple Mediators Using Inverse Odds Ratio Weighting

    PubMed Central

    Nguyen, Quynh C.; Osypuk, Theresa L.; Schmidt, Nicole M.; Glymour, M. Maria; Tchetgen Tchetgen, Eric J.

    2015-01-01

    Despite the recent flourishing of mediation analysis techniques, many modern approaches are difficult to implement or applicable to only a restricted range of regression models. This report provides practical guidance for implementing a new technique utilizing inverse odds ratio weighting (IORW) to estimate natural direct and indirect effects for mediation analyses. IORW takes advantage of the odds ratio's invariance property and condenses information on the odds ratio for the relationship between the exposure (treatment) and multiple mediators, conditional on covariates, by regressing exposure on mediators and covariates. The inverse of the covariate-adjusted exposure-mediator odds ratio association is used to weight the primary analytical regression of the outcome on treatment. The treatment coefficient in such a weighted regression estimates the natural direct effect of treatment on the outcome, and indirect effects are identified by subtracting direct effects from total effects. Weighting renders treatment and mediators independent, thereby deactivating indirect pathways of the mediators. This new mediation technique accommodates multiple discrete or continuous mediators. IORW is easily implemented and is appropriate for any standard regression model, including quantile regression and survival analysis. An empirical example is given using data from the Moving to Opportunity (1994–2002) experiment, testing whether neighborhood context mediated the effects of a housing voucher program on obesity. Relevant Stata code (StataCorp LP, College Station, Texas) is provided. PMID:25693776
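The three IORW steps summarized above (regress exposure on mediators, weight exposed units by the inverse odds ratio, then run a weighted outcome regression) can be sketched on simulated data. Everything below is illustrative: the data-generating model, coefficients, and sample size are invented, and this is a toy version of the procedure rather than the paper's Stata implementation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20000
a = rng.binomial(1, 0.5, n)                  # binary treatment
m = 0.8 * a + rng.normal(size=n)             # mediator affected by treatment
y = 1.0 * a + 1.5 * m + rng.normal(size=n)   # outcome: direct + indirect paths

# Step 1: model exposure given the mediator (covariates omitted in this toy).
# Large C approximates an unpenalized logistic fit.
logit = LogisticRegression(C=1e6).fit(m.reshape(-1, 1), a)
beta_m = logit.coef_[0, 0]

# Step 2: inverse-odds-ratio weights; unexposed units get weight 1.
w = np.where(a == 1, np.exp(-beta_m * m), 1.0)

# Step 3: weighted regression of outcome on treatment -> natural direct effect;
# the indirect effect is recovered by subtraction from the total effect.
def wls_slope(x, y, w):
    X = np.column_stack([np.ones_like(x), x]) * np.sqrt(w)[:, None]
    return np.linalg.lstsq(X, y * np.sqrt(w), rcond=None)[0][1]

nde = wls_slope(a.astype(float), y, w)            # ~1.0 (direct path)
total = wls_slope(a.astype(float), y, np.ones(n)) # ~2.2 (1.0 + 1.5 * 0.8)
nie = total - nde                                 # ~1.2 (mediated path)
```

With this data-generating model the true direct effect is 1.0 and the true indirect effect is 1.5 × 0.8 = 1.2, so the weighted estimates can be checked against known targets.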

  8. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, though sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly devoid of discussion of Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of a growing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of the data analytics tool/technique requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  9. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  10. Determination of the plutonium content in a spent fuel assembly by passive and active interrogation using a differential die-away instrument

    NASA Astrophysics Data System (ADS)

    Henzl, V.; Croft, S.; Richard, J.; Swinhoe, M. T.; Tobin, S. J.

    2013-06-01

    In this paper, we present a novel approach to estimating the total plutonium content in a spent fuel assembly (SFA) that is based on combining information from a passive measurement of the total neutron count rate (PN) of the assayed SFA and a measure of its multiplication. While PN can be measured with essentially any non-destructive assay (NDA) technique capable of neutron detection, the measure of multiplication is, in our approach, determined by means of active interrogation using an instrument based on the Differential Die-Away technique (DDA). The DDA is an NDA technique developed within the U.S. Department of Energy's Next Generation Safeguards Initiative (NGSI) project focused on the utilization of NDA techniques to determine the elemental plutonium content in commercial nuclear SFAs [1]. This approach was adopted since DDA also allows determination of other SFA characteristics, such as burnup, initial enrichment, and cooling time, and also allows for detection of certain types of diversion of nuclear material. The quantification of total plutonium is obtained using an analytical correlation function in terms of the observed PN and active multiplication. Although somewhat similar approaches relating Pu content with PN have been adopted in the past, we demonstrate by extensive simulation of the fuel irradiation and NDA process that our analytical method is independent of explicit knowledge of the initial enrichment, burnup, and the absolute value of the SFA's reactivity (i.e., multiplication factor). We show that when tested with MCNPX™ simulations comprising the 64-SFA NGSI Spent Fuel Library-1, we were able to determine elemental plutonium content, using just a few calibration parameters, with an average variation in the prediction of around 1-2% across the wide dynamic range of irradiation history parameters used, namely initial enrichment (IE=2-5%), burnup (BU=15-60 GWd/tU) and cooling time (CT=1-80 y).
In this paper we describe the basic approach and the success obtained against synthetic data. We recognize that our synthetic data may not fully capture the rich behavior of actual irradiated fuel and the uncertainties of the practical measurements. However, this design study is based on a rather complete nuclide inventory and the correlations for Pu seem robust to variation of input. Thus it is concluded that the proposed method is sufficiently promising that further experimentally based work is desirable.

  11. Sparsity-aware tight frame learning with adaptive subspace recognition for multiple fault diagnosis

    NASA Astrophysics Data System (ADS)

    Zhang, Han; Chen, Xuefeng; Du, Zhaohui; Yang, Boyuan

    2017-09-01

    It is a challenging problem to design excellent dictionaries that sparsely represent diverse fault information and simultaneously discriminate different fault sources. Therefore, this paper describes and analyzes a novel multiple feature recognition framework which incorporates the tight frame learning technique with an adaptive subspace recognition strategy. The proposed framework consists of four stages. Firstly, by introducing the tight frame constraint into the popular dictionary learning model, the proposed tight frame learning model can be formulated as a nonconvex optimization problem, which is solved by alternately implementing a hard thresholding operation and a singular value decomposition. Secondly, noise is effectively eliminated through transform sparse coding techniques. Thirdly, the denoised signal is decoupled into discriminative feature subspaces by each tight frame filter. Finally, guided by elaborately designed fault-related sensitive indexes, latent fault feature subspaces can be adaptively recognized and multiple faults diagnosed simultaneously. Extensive numerical experiments are implemented to investigate the sparsifying capability of the learned tight frame as well as its comprehensive denoising performance. Most importantly, the feasibility and superiority of the proposed framework are verified through multiple fault diagnosis of motor bearings. Compared with state-of-the-art fault detection techniques, some important advantages have been observed: firstly, the proposed framework incorporates the physical prior with the data-driven strategy, so multiple fault features with similar oscillation morphology can be adaptively decoupled; secondly, the tight frame dictionary learned directly from the noisy observation can significantly promote the sparsity of fault features compared to analytical tight frames; thirdly, a satisfactory complete signal space description property is guaranteed, and thus the weak feature leakage problem of typical learning methods is avoided.
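The first-stage alternation described in the abstract (hard thresholding for the sparse code, singular value decomposition for the frame update) can be sketched in a toy form. The dimensions, data, and threshold below are invented, and this is a generic tight-frame-learning sketch rather than the authors' exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
Y = rng.normal(size=(16, 400))                  # training signal patches (toy data)

def hard_threshold(C, lam):
    """Keep only coefficients with magnitude above lam (sparse coding step)."""
    return np.where(np.abs(C) > lam, C, 0.0)

# Initialize the analysis operator as an orthonormal matrix (a tight frame).
W = np.linalg.qr(rng.normal(size=(16, 16)))[0]

for _ in range(20):
    C = hard_threshold(W @ Y, lam=1.0)          # sparse coding via hard thresholding
    U, _, Vt = np.linalg.svd(C @ Y.T)           # frame update: nearest operator
    W = U @ Vt                                  # satisfying W @ W.T = I
```

The SVD-based update keeps every iterate exactly tight (W W^T = I), which is what distinguishes this scheme from unconstrained dictionary learning.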

  12. Motion compensation via redundant-wavelet multihypothesis.

    PubMed

    Fowler, James E; Cui, Suxia; Wang, Yonghui

    2006-10-01

    Multihypothesis motion compensation has been widely used in video coding with previous attention focused on techniques employing predictions that are diverse spatially or temporally. In this paper, the multihypothesis concept is extended into the transform domain by using a redundant wavelet transform to produce multiple predictions that are diverse in transform phase. The corresponding multiple-phase inverse transform implicitly combines the phase-diverse predictions into a single spatial-domain prediction for motion compensation. The performance advantage of this redundant-wavelet-multihypothesis approach is investigated analytically, invoking the fact that the multiple-phase inverse involves a projection that significantly reduces the power of a dense-motion residual modeled as additive noise. The analysis shows that redundant-wavelet multihypothesis is capable of up to a 7-dB reduction in prediction-residual variance over an equivalent single-phase, single-hypothesis approach. Experimental results substantiate the performance advantage for a block-based implementation.
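The scale of the claimed gain can be illustrated with a deliberately simplified model: if the phase-diverse predictions carried independent additive noise, combining K of them would cut the residual variance by a factor of K, i.e. 10·log10(K) dB, which is about 7 dB at K = 5. This independent-noise averaging is only an analogy for the paper's projection argument, not its actual analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
block = rng.normal(size=10_000)       # "true" block being predicted (toy data)
K = 5                                 # number of phase-diverse hypotheses

# Each hypothesis predicts the block up to independent additive noise.
preds = block + rng.normal(0.0, 1.0, (K, block.size))

single = preds[0] - block             # single-hypothesis prediction residual
combined = preds.mean(axis=0) - block # combined (averaged) prediction residual
gain_db = 10 * np.log10(single.var() / combined.var())  # ~7 dB for K = 5
```

The measured gain hovers near 10·log10(5) ≈ 7 dB, matching the order of magnitude of the reduction the analysis derives for the redundant-wavelet projection.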

  13. The use of selective adsorbents in capillary electrophoresis-mass spectrometry for analyte preconcentration and microreactions: a powerful three-dimensional tool for multiple chemical and biological applications.

    PubMed

    Guzman, N A; Stubbs, R J

    2001-10-01

    Much attention has recently been directed to the development and application of online sample preconcentration and microreactions in capillary electrophoresis using selective adsorbents based on chemical or biological specificity. The basic principle involves two interacting chemical or biological systems with high selectivity and affinity for each other. These molecular interactions in nature usually involve noncovalent and reversible chemical processes. Properly bound to a solid support, an "affinity ligand" can selectively adsorb a "target analyte" found in a simple or complex mixture at a wide range of concentrations. As a result, the isolated analyte is enriched and highly purified. When this affinity technique, allowing noncovalent chemical interactions and biochemical reactions to occur, is coupled on-line to high-resolution capillary electrophoresis and mass spectrometry, a powerful tool of chemical and biological information is created. This paper describes the concept of biological recognition and affinity interaction on-line with high-resolution separation, the fabrication of an "analyte concentrator-microreactor", optimization conditions of adsorption and desorption, the coupling to mass spectrometry, and various applications of clinical and pharmaceutical interest.

  14. Assessment of Matrix Multiplication Learning with a Rule-Based Analytical Model--"A Bayesian Network Representation"

    ERIC Educational Resources Information Center

    Zhang, Zhidong

    2016-01-01

    This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It applied rule-based analytical and cognitive task analysis methods to break down the operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…
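The rule-based breakdown the abstract refers to can be made concrete: each entry of a matrix product is produced by a fixed sequence of operation rules (select a row, select a column, form the pairwise products, sum them), and a task analysis can log each rule application. A toy trace for a 2 × 2 case (illustrative only, not the study's model):

```python
# Break 2x2 matrix multiplication into traced operation rules.
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

C = [[0, 0], [0, 0]]
steps = []                                             # log of rule applications
for i in range(2):
    for j in range(2):
        products = [A[i][k] * B[k][j] for k in range(2)]  # rule: pairwise products
        C[i][j] = sum(products)                           # rule: sum the products
        steps.append(((i, j), products, C[i][j]))
```

Each logged step is exactly the kind of observable sub-skill a Bayesian network assessment model could condition on.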

  15. Determination of mycotoxins in foods: current state of analytical methods and limitations.

    PubMed

    Köppen, Robert; Koch, Matthias; Siegel, David; Merkel, Stefan; Maul, Ronald; Nehls, Irene

    2010-05-01

    Mycotoxins are natural contaminants produced by a range of fungal species. Their common occurrence in food and feed poses a threat to the health of humans and animals. This threat is caused either by the direct contamination of agricultural commodities or by a "carry-over" of mycotoxins and their metabolites into animal tissues, milk, and eggs after feeding of contaminated hay or corn. As a consequence of their diverse chemical structures and varying physical properties, mycotoxins exhibit a wide range of biological effects. Individual mycotoxins can be genotoxic, mutagenic, carcinogenic, teratogenic, and oestrogenic. To protect consumer health and to reduce economic losses, surveillance and control of mycotoxins in food and feed has become a major objective for producers, regulatory authorities and researchers worldwide. However, the variety of chemical structures makes it impossible to use one single technique for mycotoxin analysis. Hence, a vast number of analytical methods has been developed and validated. The heterogeneity of food matrices combined with the demand for a fast, simultaneous and accurate determination of multiple mycotoxins creates enormous challenges for routine analysis. The most crucial issues will be discussed in this review. These are (1) the collection of representative samples, (2) the performance of classical and emerging analytical methods based on chromatographic or immunochemical techniques, (3) the validation of official methods for enforcement, and (4) the limitations and future prospects of the current methods.

  16. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite-current analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  17. A Versatile Integrated Ambient Ionization Source Platform.

    PubMed

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-30

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending their applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can detect polar and nonpolar compounds at the same time through two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD enables mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  18. A Versatile Integrated Ambient Ionization Source Platform

    NASA Astrophysics Data System (ADS)

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-01

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending their applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can detect polar and nonpolar compounds at the same time through two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD enables mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  19. Decision analysis to complete diagnostic research by closing the gap between test characteristics and cost-effectiveness.

    PubMed

    Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik

    2009-12-01

    The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).

  20. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assign accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all techniques is a function of a number of parameters, such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  1. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.
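As a minimal illustration of the kind of multi-sensor filtering such a toolkit packages (not its actual algorithms, which include desensitized and sigma-point variants), a scalar Kalman filter can fuse two sensors with different noise levels by applying their measurement updates in sequence. All numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
steps, truth = 200, 10.0            # slowly varying state, held constant here
q, r1, r2 = 1e-4, 0.5, 1.0          # process noise and two sensor noise variances

x, p = 0.0, 1.0                     # state estimate and its variance
for _ in range(steps):
    p += q                          # predict step (static state model)
    # Fuse both sensors by running their updates back to back.
    for z, r in ((truth + rng.normal(0, r1 ** 0.5), r1),
                 (truth + rng.normal(0, r2 ** 0.5), r2)):
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # measurement update
        p *= (1 - k)                # variance shrinks with each fused sensor
```

Because each update adds the sensor's information (1/r) to the estimate, fusing both sensors drives the posterior variance below what either sensor achieves alone.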

  2. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  3. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  4. A portable fluorescent sensing system using multiple LEDs

    NASA Astrophysics Data System (ADS)

    Shin, Young-Ho; Barnett, Jonathan Z.; Gutierrez-Wing, M. Teresa; Rusch, Kelly A.; Choi, Jin-Woo

    2017-02-01

    This paper presents a portable fluorescent sensing system that utilizes different light-emitting diode (LED) excitation sources for multiple-target detection. To identify different analytes, excitation LEDs at three wavelengths (385 nm, 448 nm, and 590 nm) were used to selectively stimulate the target analytes. A highly sensitive silicon photomultiplier (SiPM) was used to detect the corresponding fluorescent signal from each analyte. Based on the unique fluorescent response of each analyte, it is possible to simultaneously differentiate one analyte from another in a mixture of target analytes. A portable system was designed and fabricated by integrating a display module, battery, data storage card, and sample loading tray into a compact 3D-printed jig. The portable sensor system was demonstrated for quantification and differentiation of microalgae (Chlorella vulgaris) and cyanobacteria (Spirulina) by measuring the fluorescent responses of chlorophyll a in microalgae and phycocyanin in cyanobacteria. The obtained results suggest that the developed portable sensor system could be used as a generic fluorescence sensing platform for on-site detection of multiple analytes of interest.
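
    As a sketch of how a multi-LED response pattern can separate co-occurring fluorophores, the differentiation described above can be illustrated with simple linear unmixing. The response matrix and concentrations below are hypothetical values, not calibration data from the paper:

    ```python
    import numpy as np

    # Hypothetical per-unit-concentration responses of two pigments
    # (chlorophyll a, phycocyanin) under three excitation LEDs
    # (385 nm, 448 nm, 590 nm); rows = LEDs, columns = analytes.
    R = np.array([[0.90, 0.10],
                  [0.80, 0.20],
                  [0.10, 0.70]])

    true_conc = np.array([2.0, 3.0])   # assumed pigment concentrations
    measured = R @ true_conc           # mixed fluorescence signal at each LED

    # Least-squares unmixing recovers the individual concentrations
    est, *_ = np.linalg.lstsq(R, measured, rcond=None)
    ```

    With more LEDs than analytes, the same least-squares step still applies, and the overdetermined system helps average out measurement noise.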

  5. Thermotropic Liquid Crystal-Assisted Chemical and Biological Sensors

    PubMed Central

    Honaker, Lawrence W.; Usol’tseva, Nadezhda; Mann, Elizabeth K.

    2017-01-01

    In this review article, we analyze recent progress in the application of liquid crystal-assisted advanced functional materials for sensing biological and chemical analytes. Multiple research groups demonstrate substantial interest in liquid crystal (LC) sensing platforms, generating an increasing number of scientific articles. We review trends in implementing LC sensing techniques and identify common problems related to the stability and reliability of the sensing materials as well as to experimental set-ups. Finally, we suggest possible means of bridging scientific findings to viable and attractive LC sensor platforms. PMID:29295530

  6. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water utilities are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is one of the promising disinfectants, usually used as a secondary disinfectant, whereas the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research objectively compared the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET), and amperometric titration) to determine the superior technique. The techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior technique was determined against pre-defined criteria. To discern the effectiveness of this technique, various factors that might influence its performance, such as sample temperature, high ionic strength, and other interferences, were examined. Among the four techniques, chronoamperometry showed a significant level of accuracy and precision, and the influencing factors studied did not diminish its performance, which remained adequate in all matrices. This study is a step toward proper disinfection monitoring, and it can confidently assist engineers with chlorine dioxide disinfection system planning and management.

  7. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs, and environmental effects of two invasive analytical methods (HPLC and UV/visible-FTIR) and a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performance of the three techniques. Statistical inter-method correlation was assessed using non-parametric rank correlation tests. The study's economic component combined calculations of equipment depreciation with the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations were found between each of the two spectroscopic techniques and HPLC. In addition, Raman spectroscopy surpassed the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Overall, Raman spectroscopy appears superior to the other, invasive analytical methods on technical, economic, and environmental grounds. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Analytical group decision making in natural resources: Methodology and application

    USGS Publications Warehouse

    Schmoldt, D.L.; Peterson, D.L.

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
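
    The analytic hierarchy process component above turns pairwise judgments into numerical priorities. A minimal sketch of that prioritization step, using an illustrative three-alternative comparison matrix rather than data from the workshop, is:

    ```python
    import numpy as np

    def ahp_priorities(pairwise):
        """Derive priority weights from a reciprocal pairwise comparison
        matrix via its principal eigenvector (the standard AHP step),
        plus Saaty's consistency index CI = (lambda_max - n) / (n - 1)."""
        A = np.asarray(pairwise, dtype=float)
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)            # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                           # normalize weights to sum to 1
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)
        return w, ci

    # Entry [i, j] encodes how strongly alternative i is preferred over j
    M = [[1.0, 3.0, 5.0],
         [1/3, 1.0, 2.0],
         [1/5, 1/2, 1.0]]
    weights, ci = ahp_priorities(M)
    ```

    A small CI (conventionally, a consistency ratio below 0.1) indicates that the group's pairwise judgments are acceptably consistent; large values flag the "irrational judgments" the abstract mentions.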

  9. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and then modified to include them. The modification is based on (i) backward analytical propagation, including the perturbations, of the state vector obtained at the sphere of influence from the iterative patched conic technique, and (ii) quantification of the deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend on numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides realistic insight into mission aspects. Moreover, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.

  10. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE PAGES

    Steed, Chad A.; Halsey, William; Dehoff, Ryan; ...

    2017-02-16

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  11. Falcon: Visual analysis of large, irregularly sampled, and multivariate time series data in additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A.; Halsey, William; Dehoff, Ryan

    Flexible visual analysis of long, high-resolution, and irregularly sampled time series data from multiple sensor streams is a challenge in several domains. In the field of additive manufacturing, this capability is critical for realizing the full potential of large-scale 3D printers. Here, we propose a visual analytics approach that helps additive manufacturing researchers acquire a deep understanding of patterns in log and imagery data collected by 3D printers. Our specific goals include discovering patterns related to defects and system performance issues, optimizing build configurations to avoid defects, and increasing production efficiency. We introduce Falcon, a new visual analytics system that allows users to interactively explore large, time-oriented data sets from multiple linked perspectives. Falcon provides overviews, detailed views, and unique segmented time series visualizations, all with adjustable scale options. To illustrate the effectiveness of Falcon at providing thorough and efficient knowledge discovery, we present a practical case study involving experts in additive manufacturing and data from a large-scale 3D printer. The techniques described are applicable to the analysis of any quantitative time series, though the focus of this paper is on additive manufacturing.

  12. Compact and cost effective instrument for detecting drug precursors in different environments based on fluorescence polarization

    NASA Astrophysics Data System (ADS)

    Antolín-Urbaneja, J. C.; Eguizabal, I.; Briz, N.; Dominguez, A.; Estensoro, P.; Secchi, A.; Varriale, A.; Di Giovanni, S.; D'Auria, S.

    2013-05-01

    Several techniques for detecting chemical drug precursors have been developed in the last decade. Most of them are able to identify molecules at very low concentrations under lab conditions. Other commercial devices can detect only a fixed number and type of target substances based on a single detection technique, offering no flexibility with respect to target compounds. The construction of compact, easy-to-use detection systems that screen for a large number of compounds and discriminate among them with a low false-alarm rate and a high probability of detection is still an open concern. Under the CUSTOM project, funded by the European Commission within FP7, a stand-alone portable sensing device based on multiple techniques is being developed. One of these techniques is based on LED-induced fluorescence polarization to detect ephedrine and benzyl methyl ketone (BMK) as a first approach. This technique is highly selective with respect to the target compounds due to the generation of properly engineered fluorescent proteins which are able to bind the target analytes, as in an "immune-type reaction". This paper deals with advances in the design, construction, and validation of the LED-induced fluorescence sensor to detect BMK analytes. The sensor includes an analysis module based on a high-performance LED and PMT detector, a fluidic system to dose suitable quantities of reagents, and several printed circuit boards, all fixed in a small structure (167 mm × 193 mm × 228 mm) capable of operating as a stand-alone application.

  13. Multiple injection mode with or without repeated sample injections: Strategies to enhance productivity in countercurrent chromatography.

    PubMed

    Müller, Marco; Wasmer, Katharina; Vetter, Walter

    2018-06-29

    Countercurrent chromatography (CCC) is an all-liquid separation technique typically used for the isolation and purification of natural compounds. The simplicity of the method makes it easy to scale up CCC separations from analytical to preparative and even industrial scale. However, scale-up of CCC separations requires two different instruments with different coil dimensions. Here we developed two variants of the CCC multiple injection mode as an alternative way to increase throughput and enhance the productivity of a CCC separation using only one instrument. The concept is based on the parallel injection of samples at different points in the CCC column system and their simultaneous separation using a single pump. The wiring of the CCC setup was modified by inserting a 6-port selection valve, multiple T-pieces, and sample loops. Furthermore, the introduction of storage sample loops enabled the CCC system to be used with repeated injection cycles. The setup and advantages of both multiple injection modes were demonstrated by the isolation of the furan fatty acid 11-(3,4-dimethyl-5-pentylfuran-2-yl)-undecanoic acid (11D5-EE) from an ethyl ester oil rich in 4,7,10,13,16,19-docosahexaenoic acid (DHA-EE). 11D5-EE was enriched in one step from 1.9% to 99% purity. The solvent consumption per isolated amount of analyte was reduced by ∼40% compared to increased-throughput CCC and by ∼5% in the repeated multiple injection mode, which also facilitated the isolation of the major compound (DHA-EE) in the sample. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. 4D Hyperspherical Harmonic (HyperSPHARM) Representation of Surface Anatomy: A Holistic Treatment of Multiple Disconnected Anatomical Structures

    PubMed Central

    Hosseinbor, A. Pasha; Chung, Moo K.; Koay, Cheng Guan; Schaefer, Stacey M.; van Reekum, Carien M.; Schmitz, Lara Peschke; Sutterer, Matt; Alexander, Andrew L.; Davidson, Richard J.

    2015-01-01

    Image-based parcellation of the brain often leads to multiple disconnected anatomical structures, which pose significant challenges for analyses of morphological shapes. Existing shape models, such as the widely used spherical harmonic (SPHARM) representation, assume topological invariance, so they are unable to simultaneously parameterize multiple disjoint structures. In such a situation, SPHARM has to be applied separately to each individual structure. We present a novel surface parameterization technique using 4D hyperspherical harmonics to represent multiple disjoint objects as a single analytic function, terming it HyperSPHARM. The underlying idea behind HyperSPHARM is to stereographically project an entire collection of disjoint 3D objects onto the 4D hypersphere and then parameterize them simultaneously with the 4D hyperspherical harmonics. Hence, HyperSPHARM allows for a holistic treatment of multiple disjoint objects, unlike SPHARM. In an imaging dataset of healthy adult human brains, we apply HyperSPHARM to the hippocampi and amygdalae. The HyperSPHARM representations are employed as a data smoothing technique, while the HyperSPHARM coefficients are utilized in a support vector machine setting for object classification. HyperSPHARM yields results nearly identical to SPHARM's, as shown in the paper. Its key advantage over SPHARM is computational: HyperSPHARM is more efficient because it can parameterize multiple disjoint structures with far fewer basis functions, and its stereographic projection obviates SPHARM's burdensome surface flattening. In addition, HyperSPHARM can handle any type of topology, unlike SPHARM, whose analysis is confined to topologically invariant structures. PMID:25828650

  15. DYGABCD: A program for calculating linear A, B, C, and D matrices from a nonlinear dynamic engine simulation

    NASA Technical Reports Server (NTRS)

    Geyser, L. C.

    1978-01-01

    A digital computer program, DYGABCD, was developed that generates linearized, dynamic models of simulated turbofan and turbojet engines. DYGABCD is based on an earlier computer program, DYNGEN, that is capable of calculating simulated nonlinear steady-state and transient performance of one- and two-spool turbojet engines or two- and three-spool turbofan engines. Most control design techniques require linear system descriptions. For multiple-input/multiple-output systems such as turbine engines, state space matrix descriptions of the system are often desirable. DYGABCD computes the state space matrices commonly referred to as the A, B, C, and D matrices required for a linear system description. The report discusses the analytical approach and provides a user's manual, FORTRAN listings, and a sample case.
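
    The linearization DYGABCD performs can be sketched generically: finite-difference Jacobians of a nonlinear model x' = f(x, u), y = g(x, u) about an operating point yield the A, B, C, and D matrices. The toy system below is purely illustrative, not the DYNGEN engine model:

    ```python
    import numpy as np

    def linearize(f, g, x0, u0, eps=1e-6):
        """Numerically linearize x' = f(x, u), y = g(x, u) about an
        operating point (x0, u0), returning state-space matrices
        A, B, C, D obtained from central-difference Jacobians."""
        x0 = np.asarray(x0, dtype=float)
        u0 = np.asarray(u0, dtype=float)

        def jacobian(fun, z0, wrt_first_arg, fixed):
            cols = []
            for i in range(len(z0)):
                dz = np.zeros(len(z0))
                dz[i] = eps
                if wrt_first_arg:
                    hi, lo = fun(z0 + dz, fixed), fun(z0 - dz, fixed)
                else:
                    hi, lo = fun(fixed, z0 + dz), fun(fixed, z0 - dz)
                cols.append((np.asarray(hi) - np.asarray(lo)) / (2.0 * eps))
            return np.column_stack(cols)

        A = jacobian(f, x0, True, u0)   # df/dx
        B = jacobian(f, u0, False, x0)  # df/du
        C = jacobian(g, x0, True, u0)   # dg/dx
        D = jacobian(g, u0, False, x0)  # dg/du
        return A, B, C, D

    # Toy nonlinear system with two states and one input
    f = lambda x, u: np.array([-x[0] ** 2 + u[0], x[0] * x[1]])
    g = lambda x, u: np.array([x[0] + u[0]])
    A, B, C, D = linearize(f, g, x0=[1.0, 2.0], u0=[0.5])
    ```

    For this toy system the analytic Jacobians at the operating point are A = [[-2, 0], [2, 1]], B = [[1], [0]], C = [[1, 0]], D = [[1]], which the finite-difference step reproduces to high accuracy.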

  16. Depth-resolved monitoring of analytes diffusion in ocular tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.

    2007-02-01

    Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantification of analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Analytes and drugs such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution were studied, and their permeability coefficients were calculated. Drug diffusion was monitored as a function of both time and depth. The obtained results suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.

  17. Machine learning and predictive data analytics enabling metrology and process control in IC fabrication

    NASA Astrophysics Data System (ADS)

    Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.

    2015-03-01

    Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV, and DSA), device architectures (FinFET, nanowire, graphene), and patterning scale (a few nanometers). These changes require tight control of processes and measurements to achieve the required device performance, and they challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been leveraged to accurately predict the dimensions of EUV resist patterns down to 18 nm half pitch from resist shrinkage patterns; these patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As a wafer goes through various processes, its associated cost multiplies, and it may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can therefore be very valuable in enabling timely, actionable decisions such as rework or scrap, and in feeding predicted information, or information derived from prediction, forward or backward to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.
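
    As a hedged illustration of the kind of predictive-metrology regression described above, the sketch below fits a closed-form ridge regression to simulated features. The data and feature names are invented, not the paper's EUV or electrical measurements:

    ```python
    import numpy as np

    # Simulated metrology features (e.g. shrunken-CD reading, dose, focus)
    # and the target dimension they should predict -- illustrative only.
    rng = np.random.default_rng(3)
    n = 300
    X = rng.normal(size=(n, 3))
    true_w = np.array([1.2, -0.7, 0.3])
    y = 18.0 + X @ true_w + rng.normal(scale=0.2, size=n)

    # Ridge regression in closed form: w = (Xb^T Xb + lam*I)^-1 Xb^T y
    Xb = np.column_stack([np.ones(n), X])      # prepend intercept column
    lam = 1e-2
    w = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ y)
    rmse = np.sqrt(np.mean((Xb @ w - y) ** 2))
    ```

    A production system would add regularization tuning, cross-validation, and likely a nonlinear learner, but the prediction-from-correlated-measurements idea is the same.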

  18. A convergent functional architecture of the insula emerges across imaging modalities.

    PubMed

    Kelly, Clare; Toro, Roberto; Di Martino, Adriana; Cox, Christine L; Bellec, Pierre; Castellanos, F Xavier; Milham, Michael P

    2012-07-16

    Empirical evidence increasingly supports the hypothesis that patterns of intrinsic functional connectivity (iFC) are sculpted by a history of evoked coactivation within distinct neuronal networks. This, together with evidence of strong correspondence among the networks defined by iFC and those delineated using a variety of other neuroimaging techniques, suggests a fundamental brain architecture detectable across multiple functional and structural imaging modalities. Here, we leverage this insight to examine the functional organization of the human insula. We parcellated the insula on the basis of three distinct neuroimaging modalities: task-evoked coactivation, intrinsic (i.e., task-independent) functional connectivity, and gray matter structural covariance. Clustering of these three different covariance-based measures revealed a convergent elemental organization of the insula that likely reflects a fundamental brain architecture governing both brain structure and function at multiple spatial scales. While not constrained to be hierarchical, our parcellation revealed a pseudo-hierarchical, multiscale organization that was consistent with previous clustering and meta-analytic studies of the insula. Finally, meta-analytic examination of the cognitive and behavioral domains associated with each of the obtained insular clusters elucidated the broad functional dissociations likely underlying the observed topography. To facilitate future investigations of insula function across healthy and pathological states, the insular parcels have been made freely available for download via http://fcon_1000.projects.nitrc.org, along with the analytic scripts used to perform the parcellations. Copyright © 2012 Elsevier Inc. All rights reserved.
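
    The parcellation step described above amounts to clustering voxels by their covariance or connectivity profiles. A minimal stand-in using plain k-means on synthetic profiles (not the actual imaging data, and with a simpler init than a real analysis would use) is:

    ```python
    import numpy as np

    def kmeans(X, k, iters=100):
        """Plain k-means with a simple deterministic init; a real analysis
        would use k-means++ or multiple restarts."""
        centers = X[:: max(len(X) // k, 1)][:k].copy()
        for _ in range(iters):
            # Assign each profile to its nearest center
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            labels = d2.argmin(axis=1)
            # Recompute centers as cluster means
            new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
            if np.allclose(new, centers):
                break
            centers = new
        return labels, centers

    # Synthetic "connectivity profiles": two well-separated groups of voxels
    rng = np.random.default_rng(1)
    profiles = np.vstack([rng.normal(0.0, 0.3, size=(50, 4)),
                          rng.normal(2.0, 0.3, size=(50, 4))])
    labels, centers = kmeans(profiles, k=2)
    ```

    In the study's setting the rows would be voxels and the columns coactivation, iFC, or structural-covariance features; the clustering machinery is unchanged.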

  19. Aspects of Voyager photogrammetry

    NASA Technical Reports Server (NTRS)

    Wu, Sherman S. C.; Schafer, Francis J.; Jordan, Raymond; Howington, Annie-Elpis

    1987-01-01

    In January 1986, Voyager 2 took a series of pictures of Uranus and its satellites with the Imaging Science System (ISS) on board the spacecraft. Based on six stereo images from the ISS narrow-angle camera, a topographic map was compiled of the Southern Hemisphere of Miranda, one of Uranus' moons. Assuming a spherical figure, a 20-km surface relief is shown on the map. With three additional images from the ISS wide-angle camera, a control network of Miranda's Southern Hemisphere was established by analytical photogrammetry, producing 88 ground points for the control of multiple-model compilation on the AS-11AM analytical stereoplotter. Digital terrain data from the topographic map of Miranda have also been produced. By combining these data and the image data from the Voyager 2 mission, perspective views or even a movie of the mapped area can be made. The application of these newly developed techniques to Voyager 1 imagery, which includes a few overlapping pictures of Io and Ganymede, permits the compilation of contour maps or topographic profiles of these bodies on the analytical stereoplotters.

  20. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    NASA Astrophysics Data System (ADS)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: among other gaps, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they cannot capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  1. A Quantile Regression Approach to Understanding the Relations Among Morphological Awareness, Vocabulary, and Reading Comprehension in Adult Basic Education Students.

    PubMed

    Tighe, Elizabeth L; Schatschneider, Christopher

    2016-07-01

    The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in adult basic education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82%-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. © Hammill Institute on Disabilities 2014.
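
    A small sketch of quantile regression itself, fit by subgradient descent on the pinball loss over synthetic data (not the study's ABE measures), shows how coefficients can be estimated at several points of the outcome distribution:

    ```python
    import numpy as np

    def fit_quantile(X, y, q, lr=0.05, epochs=4000):
        """Linear quantile regression via subgradient descent on the
        pinball (check) loss -- a bare-bones illustration, not a
        production solver (those typically use linear programming)."""
        Xb = np.column_stack([np.ones(len(y)), X])   # prepend intercept
        beta = np.zeros(Xb.shape[1])
        for _ in range(epochs):
            resid = y - Xb @ beta
            # Subgradient of the pinball loss with respect to beta
            grad = -Xb.T @ np.where(resid > 0, q, q - 1.0) / len(y)
            beta -= lr * grad
        return beta

    # Synthetic data: one predictor with a constant effect across quantiles
    rng = np.random.default_rng(1)
    x = rng.uniform(-1, 1, size=400)
    y = 2.0 + 1.5 * x + rng.normal(scale=0.3, size=400)
    betas = {q: fit_quantile(x[:, None], y, q) for q in (0.25, 0.5, 0.75)}
    ```

    With several predictors, comparing the fitted coefficients across quantiles is exactly how one would see a predictor (like morphological awareness above) matter more at the lower end of the outcome distribution than at the upper end.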

  2. Coupling Front-End Separations, Ion Mobility Spectrometry, and Mass Spectrometry For Enhanced Multidimensional Biological and Environmental Analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, Xueyun; Wojcik, Roza; Zhang, Xing

    Ion mobility spectrometry (IMS) is a widely used analytical technique for rapid molecular separations in the gas phase. IMS alone is useful, but its coupling with mass spectrometry (MS) and front-end separations has been extremely beneficial for increasing measurement sensitivity, peak capacity of complex mixtures, and the scope of molecular information in biological and environmental sample analyses. Multiple studies in disease screening and environmental evaluations have even shown these IMS-based multidimensional separations extract information not possible with each technique individually. This review highlights 3-dimensional separations using IMS-MS in conjunction with a range of front-end techniques, such as gas chromatography (GC), supercritical fluid chromatography (SFC), liquid chromatography (LC), solid phase extractions (SPE), capillary electrophoresis (CE), field asymmetric ion mobility spectrometry (FAIMS), and microfluidic devices. The origination, current state, various applications, and future capabilities for these multidimensional approaches are described to provide insight into the utility and potential of each technique.

  3. A Factor Analytic Model of Drug-Related Behavior in Adolescence and Its Impact on Arrests at Multiple Stages of the Life Course

    PubMed Central

    2016-01-01

    Objectives Recognizing the inherent variability of drug-related behaviors, this study develops an empirically-driven and holistic model of drug-related behavior during adolescence using factor analysis to simultaneously model multiple drug behaviors. Methods The factor analytic model uncovers latent dimensions of drug-related behaviors, rather than patterns of individuals. These latent dimensions are treated as empirical typologies which are then used to predict an individual’s number of arrests accrued at multiple phases of the life course. The data are robust enough to simultaneously capture drug behavior measures typically considered in isolation in the literature, and to allow for behavior to change and evolve over the period of adolescence. Results Results show that factor analysis is capable of developing highly descriptive patterns of drug offending, and that these patterns have great utility in predicting arrests. Results further demonstrate that while drug behavior patterns are predictive of arrests at the end of adolescence for both males and females, the impacts on arrests are longer lasting for females. Conclusions The various facets of drug behaviors have been a long-time concern of criminological research. However, the ability to model multiple behaviors simultaneously is often constrained by data that do not measure the constructs fully. Factor analysis is shown to be a useful technique for modeling adolescent drug involvement patterns in a way that accounts for the multitude and variability of possible behaviors, and in predicting future negative life outcomes, such as arrests. PMID:28435183
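
    The factor-analytic step of uncovering latent dimensions from multiple observed behaviors can be sketched with an eigendecomposition of a correlation matrix. The six indicators below are simulated from two planted latent dimensions; they are not the study's drug-behavior measures:

    ```python
    import numpy as np

    # Simulate six observed indicators driven by two latent dimensions
    rng = np.random.default_rng(7)
    n = 1000
    latent = rng.normal(size=(n, 2))
    loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                         [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
    X = latent @ loadings.T + rng.normal(scale=0.4, size=(n, 6))

    # Principal-component style extraction: eigendecompose the correlation
    # matrix and keep factors with eigenvalue > 1 (Kaiser criterion)
    corr = np.corrcoef(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    order = np.argsort(eigvals)[::-1]
    n_factors = int((eigvals > 1.0).sum())
    est_loadings = eigvecs[:, order[:n_factors]] * np.sqrt(eigvals[order[:n_factors]])
    ```

    The recovered loading pattern (which indicators load on which factor) is what would be read as an empirical typology; factor scores for each individual could then serve as predictors of later outcomes such as arrests. Dedicated factor-analysis routines add rotation and maximum-likelihood estimation on top of this core step.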

  4. Use of multiple colorimetric indicators for paper-based microfluidic devices.

    PubMed

    Dungchai, Wijitar; Chailapakul, Orawon; Henry, Charles S

    2010-08-03

    We report here the use of multiple indicators for a single analyte in paper-based microfluidic devices (microPADs) in an effort to improve the ability to visually discriminate between analyte concentrations. In existing microPADs, a single dye system is used for the measurement of a single analyte. In our approach, devices are designed to simultaneously quantify analytes using multiple indicators for each analyte, improving the accuracy of the assay. The use of multiple indicators for a single analyte allows different indicator colors to be generated at different analyte concentration ranges and makes the colors easier to discriminate visually. The principle of our devices is based on the oxidation of indicators by hydrogen peroxide produced by oxidase enzymes specific for each analyte. Each indicator reacts at different peroxide concentrations, and therefore analyte concentrations, giving an extended range of operation. To demonstrate the utility of our approach, the mixture of 4-aminoantipyrine and 3,5-dichloro-2-hydroxy-benzenesulfonic acid, o-dianisidine dihydrochloride, potassium iodide, acid black, and acid yellow was chosen as the indicators for simultaneous semi-quantitative measurement of glucose, lactate, and uric acid on a microPAD. Our approach was successfully applied to quantify glucose (0.5-20 mM), lactate (1-25 mM), and uric acid (0.1-7 mM) in clinically relevant ranges. The determination of glucose, lactate, and uric acid in control serum and urine samples was also performed to demonstrate the applicability of this device for biological sample analysis. Finally, results for the multi-indicator and single-indicator systems were compared using untrained readers to demonstrate the improvements in accuracy achieved with the new system. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation has been introduced to solve problems that take the form of systems. This technique can overcome two kinds of problem. First, a problem that has an analytical solution, but for which the cost of running a verifying experiment is high in terms of money and lives. Second, a problem that exists but has no analytical solution. In the field of statistical inference the second kind is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques, such as the bootstrap and permutation tests, to form pseudo-sampling distributions that lead to solutions of problems that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. It also discusses resampling techniques as simulation techniques, examines misunderstandings about the two techniques, and explains successful uses of both.
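
The resampling idea this abstract describes can be made concrete in a few lines. The following percentile-bootstrap function is an illustrative sketch, not code from the paper; all names and the example data are ours. It builds a pseudo-sampling distribution of a statistic by resampling one observed sample with replacement:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_resamples=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for a statistic.

    Resampling with replacement builds a pseudo-sampling distribution
    of the statistic, standing in where no analytical solution exists.
    """
    rng = random.Random(seed)
    n = len(data)
    # Resample the data n_resamples times and compute the statistic each time
    boot = sorted(stat([data[rng.randrange(n)] for _ in range(n)])
                  for _ in range(n_resamples))
    # Read the interval off the empirical (pseudo-sampling) distribution
    return boot[int(alpha / 2 * n_resamples)], boot[int((1 - alpha / 2) * n_resamples) - 1]
```

Swapping `stat` for any other estimator (a trimmed mean, a median) requires no new derivation, which is the appeal of resampling that the paper discusses.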

  6. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China since ancient times as an anticonvulsant, analgesic, sedative, anti-asthma, and anti-immune drug. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME), and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, the discovery of new therapeutic uses, and the understanding of the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  7. Scattering of focused ultrasonic beams by cavities in a solid half-space.

    PubMed

    Rahni, Ehsan Kabiri; Hajzargarbashi, Talieh; Kundu, Tribikram

    2012-08-01

    The ultrasonic field generated by a point-focused acoustic lens placed in a fluid medium adjacent to a solid half-space containing one or more spherical cavities is modeled. The semi-analytical distributed point source method (DPSM) is followed for the modeling. This technique properly takes into account the interactions among the cavities placed in the focused ultrasonic field, the fluid-solid interface, and the lens surface. The approximate analytical solution that is available in the literature for the single-cavity geometry is very restrictive and cannot handle multiple-cavity problems. Finite element solutions for such problems are also prohibitively time consuming at high frequencies. Solution of this problem is necessary to predict when two cavities placed in close proximity inside a solid can be distinguished by an acoustic lens placed outside the solid medium and when such distinction is not possible.

  8. Determination of gamma-aminobutyric acid in food matrices by isotope dilution hydrophilic interaction chromatography coupled to mass spectrometry.

    PubMed

    Zazzeroni, Raniero; Homan, Andrew; Thain, Emma

    2009-08-01

    The estimation of the dietary intake of gamma-aminobutyric acid (GABA) depends on knowledge of its concentration in food matrices. To this end, an isotope dilution liquid chromatography-mass spectrometry method has been developed employing the hydrophilic interaction chromatography technique for analyte separation. This approach enabled accurate quantification of GABA in apple, potato, soybeans, and orange juice without the need for a pre- or post-column derivatization reaction. A selective and precise analytical measurement was obtained with a triple quadrupole mass spectrometer operating in multiple reaction monitoring mode, using the method of standard additions with GABA-d(6) as an internal standard. The concentrations of GABA found in the matrices tested are 7 microg/g in apple, 342 microg/g in potato, 211 microg/g in soybeans, and 344 microg/mL in orange juice.
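
The method of standard additions used in this record reduces to a straight-line fit of signal versus spiked concentration, with the unspiked concentration read off as the magnitude of the x-intercept. A minimal sketch (illustrative only; the function name, variables, and example numbers are ours, not the paper's):

```python
def standard_additions(added, signal):
    """Concentration estimate by the method of standard additions.

    Least-squares fit of signal = a + b * added; the analyte
    concentration in the unspiked sample is a / b (the x-intercept
    magnitude), assuming a linear response across the spiked range.
    """
    n = len(added)
    mx, my = sum(added) / n, sum(signal) / n
    # Ordinary least-squares slope and intercept
    b = (sum((x - mx) * (y - my) for x, y in zip(added, signal))
         / sum((x - mx) ** 2 for x in added))
    a = my - b * mx
    return a / b
```

For example, a sample giving signal 10 that rises linearly to 22 under spikes of 2, 4, and 6 concentration units, `standard_additions([0, 2, 4, 6], [10, 14, 18, 22])`, yields an unspiked concentration of 5.0 units.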

  9. IT vendor selection model by using structural equation model & analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's competitiveness in the global marketplace. Improper selection and evaluation of potential vendors can degrade an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research develops a new hybrid model for the vendor selection process to support better decision making. The proposed model provides a suitable tool for assisting decision makers and managers in making the right decisions and selecting the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model will be applied to a real-life case study to assess its effectiveness. In addition, a what-if analysis technique will be used for model validation.

  10. Group decision making with the analytic hierarchy process in benefit-risk assessment: a tutorial.

    PubMed

    Hummel, J Marjan; Bridges, John F P; IJzerman, Maarten J

    2014-01-01

    The analytic hierarchy process (AHP) has been increasingly applied as a technique for multi-criteria decision analysis in healthcare. The AHP can aid decision makers in selecting the most valuable technology for patients, while taking into account multiple, and even conflicting, decision criteria. This tutorial illustrates the procedural steps of the AHP in supporting group decision making about new healthcare technology, including (1) identifying the decision goal, decision criteria, and alternative healthcare technologies to compare, (2) structuring the decision criteria, (3) judging the value of the alternative technologies on each decision criterion, (4) judging the importance of the decision criteria, (5) calculating group judgments, (6) analyzing the inconsistency in judgments, (7) calculating the overall value of the technologies, and (8) conducting sensitivity analyses. The AHP is illustrated via a hypothetical example, adapted from an empirical AHP analysis on the benefits and risks of tissue regeneration to repair small cartilage lesions in the knee.
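
Steps (4)-(7) of the tutorial rest on standard AHP arithmetic: priority weights are the principal eigenvector of a pairwise-comparison matrix, and judgment inconsistency is summarized by a consistency ratio. A minimal sketch of that generic calculation (our illustration, not the tutorial's code; the random-index values are Saaty's published table for small matrices):

```python
def ahp_priorities(M, iters=100):
    """Priority weights and consistency ratio for an AHP
    pairwise-comparison matrix M (positive, reciprocal entries)."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):                      # power iteration toward the
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)                              # principal eigenvector
        w = [x / s for x in v]
    # Principal eigenvalue estimate, consistency index, consistency ratio
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]         # Saaty random indices
    return w, ci / ri
```

Judgments with a consistency ratio above roughly 0.1 are conventionally revisited, which is the inconsistency analysis of step (6).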

  11. Translational research in pediatrics III: bronchoalveolar lavage.

    PubMed

    Radhakrishnan, Dhenuka; Yamashita, Cory; Gillio-Meina, Carolina; Fraser, Douglas D

    2014-07-01

    The role of flexible bronchoscopy and bronchoalveolar lavage (BAL) for the care of children with airway and pulmonary diseases is well established, with collected BAL fluid most often used clinically for microbiologic pathogen identification and cellular analyses. More recently, powerful analytic research methods have been used to investigate BAL samples to better understand the pathophysiological basis of pediatric respiratory disease. Investigations have focused on the cellular components contained in BAL fluid, such as macrophages, lymphocytes, neutrophils, eosinophils, and mast cells, as well as the noncellular components such as serum molecules, inflammatory proteins, and surfactant. Molecular techniques are frequently used to investigate BAL fluid for the presence of infectious pathologies and for cellular gene expression. Recent advances in proteomics allow identification of multiple protein expression patterns linked to specific respiratory diseases, whereas newer analytic techniques allow for investigations on surfactant quantification and function. These translational research studies on BAL fluid have aided our understanding of pulmonary inflammation and the injury/repair responses in children. We review the ethics and practices for the execution of BAL in children for translational research purposes, with an emphasis on the optimal handling and processing of BAL samples. Copyright © 2014 by the American Academy of Pediatrics.

  12. High Technology Service Value Maximization through an MCDM-Based Innovative e-Business Model

    NASA Astrophysics Data System (ADS)

    Huang, Chi-Yo; Tzeng, Gwo-Hshiung; Ho, Wen-Rong; Chuang, Hsiu-Tyan; Lue, Yeou-Feng

    The emergence of the Internet has thoroughly changed high technology marketing channels in the past decade, and e-commerce has become one of the most efficient channels through which high technology firms may skip intermediaries and reach end customers directly. However, defining an appropriate e-business model for commercializing new high technology products or services through the Internet is not easy. To overcome the above-mentioned problems, a novel analytic framework is proposed for defining an appropriate e-business model, based on the concept of expanding high technology customers’ competence sets by leveraging high technology service firms’ capabilities and resources, together with novel multiple criteria decision making (MCDM) techniques. An empirical example study of a silicon intellectual property (SIP) commercialization e-business model based on MCDM techniques is provided to verify the effectiveness of this novel analytic framework. The analysis successfully assisted a Taiwanese IC design service firm in defining an e-business model for maximizing its customers’ SIP transactions. In the future, the novel MCDM framework can be applied successfully to novel business model definitions in the high technology industry.

  13. Interlaboratory comparability, bias, and precision for four laboratories measuring constituents in precipitation, November 1982-August 1983

    USGS Publications Warehouse

    Brooks, M.H.; Schroder, L.J.; Malo, B.A.

    1985-01-01

    Four laboratories were evaluated in their analysis of identical natural and simulated precipitation water samples. Interlaboratory comparability was evaluated using analysis of variance coupled with Duncan's multiple range test, and linear-regression models describing the relations between individual laboratory analytical results for natural precipitation samples. Results of the statistical analyses indicate that certain pairs of laboratories produce different results when analyzing identical samples. Analyte bias for each laboratory was examined using analysis of variance coupled with Duncan's multiple range test on data produced by the laboratories from the analysis of identical simulated precipitation samples. Bias for a given analyte produced by a single laboratory has been indicated when the laboratory mean for that analyte is shown to be significantly different from the mean for the most-probable analyte concentrations in the simulated precipitation samples. Ion-chromatographic methods for the determination of chloride, nitrate, and sulfate have been compared with the colorimetric methods that were also in use during the study period. Comparisons were made using analysis of variance coupled with Duncan's multiple range test for means produced by the two methods. Analyte precision for each laboratory has been estimated by calculating a pooled variance for each analyte. Analyte estimated precisions have been compared using F-tests and differences in analyte precisions for laboratory pairs have been reported. (USGS)
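
The first step of the comparison described here, before any multiple range test, is a one-way analysis of variance across laboratories. A minimal sketch of that F statistic (illustrative only; Duncan's multiple range test itself requires tabulated critical values and is omitted):

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA: between-group mean square
    over within-group mean square, for k groups of measurements
    (e.g. one group of results per laboratory)."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    means = [sum(g) / len(g) for g in groups]
    # Between-group and within-group sums of squares
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ssw = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ssb / (k - 1)) / (ssw / (N - k))
```

A large F relative to the F distribution with (k - 1, N - k) degrees of freedom signals that at least one pair of laboratories differs, which is then localized by the multiple range test.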

  14. Electrochemical Detection of Multiple Bioprocess Analytes

    NASA Technical Reports Server (NTRS)

    Rauh, R. David

    2010-01-01

    An apparatus that includes a highly miniaturized thin-film electrochemical sensor array has been demonstrated as a prototype of instruments for simultaneous detection of multiple substances of interest (analytes) and measurement of acidity or alkalinity in bioprocess streams. Measurements of pH and of concentrations of nutrients and wastes in cell-culture media, made by use of these instruments, are to be used as feedback for optimizing the growth of cells or the production of desired substances by the cultured cells. The apparatus is designed to utilize samples of minimal volume so as to minimize any perturbation of monitored processes. The apparatus can function in a potentiometric mode (for measuring pH), an amperometric mode (detecting analytes via oxidation/reduction reactions), or both. The sensor array is planar and includes multiple thin-film microelectrodes covered with hydrous iridium oxide. The oxide layer on each electrode serves as both a protective and electrochemical transducing layer. In its transducing role, the oxide provides electrical conductivity for amperometric measurement or pH response for potentiometric measurement. The oxide on an electrode can also serve as a matrix for one or more enzymes that render the electrode sensitive to a specific analyte. In addition to transducing electrodes, the array includes electrodes for potential control. The array can be fabricated by techniques familiar to the microelectronics industry. The sensor array is housed in a thin-film liquid-flow cell that has a total volume of about 100 µL. The flow cell is connected to a computer-controlled subsystem that periodically draws samples from the bioprocess stream to be monitored. Before entering the cell, each 100-µL sample is subjected to tangential-flow filtration to remove particles.
In the present version of the apparatus, the electrodes are operated under the control of a potentiostat and are used to simultaneously measure pH and the concentration of glucose. It is anticipated that the development of procedures for trapping more enzymes in hydrous iridium oxide (and possibly in other electroactive metal oxides), and of means for imparting long-term stability to the transducer layers, should make it possible to monitor the concentrations of the products of many enzyme reactions, for example such key bioprocess analytes as amino acids, vitamins, lactose, and acetate.

  15. Resin embedded multicycle imaging (REMI): a tool to evaluate protein domains.

    PubMed

    Busse, B L; Bezrukov, L; Blank, P S; Zimmerberg, J

    2016-08-08

    Protein complexes associated with cellular processes comprise a significant fraction of all biology, but our understanding of their heterogeneous organization remains inadequate, particularly for physiological densities of multiple protein species. Towards resolving this limitation, we here present a new technique based on resin-embedded multicycle imaging (REMI) of proteins in-situ. By stabilizing protein structure and antigenicity in acrylic resins, affinity labels were repeatedly applied, imaged, removed, and replaced. In principle, an arbitrarily large number of proteins of interest may be imaged on the same specimen with subsequent digital overlay. A series of novel preparative methods were developed to address the problem of imaging multiple protein species in areas of the plasma membrane or volumes of cytoplasm of individual cells. For multiplexed examination of antibody staining we used straightforward computational techniques to align sequential images, and super-resolution microscopy was used to further define membrane protein colocalization. We give one example of a fibroblast membrane with eight multiplexed proteins. A simple statistical analysis of this limited membrane proteomic dataset is sufficient to demonstrate the analytical power contributed by additional imaged proteins when studying membrane protein domains.

  16. Comprehensive quantification of signal-to-noise ratio and g-factor for image-based and k-space-based parallel imaging reconstructions.

    PubMed

    Robson, Philip M; Grant, Aaron K; Madhuranthakam, Ananth J; Lattanzi, Riccardo; Sodickson, Daniel K; McKenzie, Charles A

    2008-10-01

    Parallel imaging reconstructions result in spatially varying noise amplification characterized by the g-factor, precluding conventional measurements of noise from the final image. A simple Monte Carlo based method is proposed for all linear image reconstruction algorithms, which allows measurement of signal-to-noise ratio and g-factor and is demonstrated for SENSE and GRAPPA reconstructions for accelerated acquisitions that have not previously been amenable to such assessment. Only a simple "prescan" measurement of noise amplitude and correlation in the phased-array receiver, and a single accelerated image acquisition are required, allowing robust assessment of signal-to-noise ratio and g-factor. The "pseudo multiple replica" method has been rigorously validated in phantoms and in vivo, showing excellent agreement with true multiple replica and analytical methods. This method is universally applicable to the parallel imaging reconstruction techniques used in clinical applications and will allow pixel-by-pixel image noise measurements for all parallel imaging strategies, allowing quantitative comparison between arbitrary k-space trajectories, image reconstruction, or noise conditioning techniques. (c) 2008 Wiley-Liss, Inc.
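
For any linear reconstruction, the pseudo multiple replica idea can be sketched directly: add synthetic noise with the measured receiver covariance to the acquired data many times, reconstruct each replica, and take the pixelwise standard deviation as the noise map. A simplified illustration (real-valued data and a generic `recon` callable for brevity; actual MR data are complex-valued, and the paper's validation is far more thorough):

```python
import numpy as np

def pseudo_replica_noise(recon, data, noise_cov, n_rep=200, seed=0):
    """Pixelwise noise standard deviation of a linear reconstruction,
    estimated by the pseudo multiple replica (Monte Carlo) method."""
    rng = np.random.default_rng(seed)
    # Cholesky factor colors white noise like the phased-array receiver
    L = np.linalg.cholesky(noise_cov)
    replicas = [recon(data + L @ rng.standard_normal(data.shape))
                for _ in range(n_rep)]
    return np.std(np.stack(replicas), axis=0)
```

Dividing the noise map of an unaccelerated reconstruction by that of the accelerated one (scaled by the square root of the acceleration factor) then gives the g-factor pixel by pixel.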

  17. A numerical approach to controller design for the ACES facility

    NASA Technical Reports Server (NTRS)

    Frazier, W. Garth; Irwin, R. Dennis

    1993-01-01

    In recent years the employment of active control techniques for improving the performance of systems involving highly flexible structures has become a topic of considerable research interest. Most of these systems are quite complicated, using multiple actuators and sensors, and possessing high order models. The majority of analytical controller synthesis procedures capable of handling multivariable systems in a systematic way require considerable insight into the underlying mathematical theory to achieve a successful design. This insight is needed in selecting the proper weighting matrices or weighting functions to cast what is naturally a multiple constraint satisfaction problem into an unconstrained optimization problem. Although designers possessing considerable experience with these techniques have a feel for the proper choice of weights, others may spend a significant amount of time attempting to find an acceptable solution. Another disadvantage of such procedures is that the resulting controller has an order greater than or equal to that of the model used for the design. Of course, the order of these controllers can often be reduced, but again this requires a good understanding of the theory involved.

  18. Resin embedded multicycle imaging (REMI): a tool to evaluate protein domains

    PubMed Central

    Busse, B. L.; Bezrukov, L.; Blank, P. S.; Zimmerberg, J.

    2016-01-01

    Protein complexes associated with cellular processes comprise a significant fraction of all biology, but our understanding of their heterogeneous organization remains inadequate, particularly for physiological densities of multiple protein species. Towards resolving this limitation, we here present a new technique based on resin-embedded multicycle imaging (REMI) of proteins in-situ. By stabilizing protein structure and antigenicity in acrylic resins, affinity labels were repeatedly applied, imaged, removed, and replaced. In principle, an arbitrarily large number of proteins of interest may be imaged on the same specimen with subsequent digital overlay. A series of novel preparative methods were developed to address the problem of imaging multiple protein species in areas of the plasma membrane or volumes of cytoplasm of individual cells. For multiplexed examination of antibody staining we used straightforward computational techniques to align sequential images, and super-resolution microscopy was used to further define membrane protein colocalization. We give one example of a fibroblast membrane with eight multiplexed proteins. A simple statistical analysis of this limited membrane proteomic dataset is sufficient to demonstrate the analytical power contributed by additional imaged proteins when studying membrane protein domains. PMID:27499335

  19. Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment

    ERIC Educational Resources Information Center

    Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James

    2010-01-01

    The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…

  20. Surface enhanced Raman spectroscopy based nanoparticle assays for rapid, point-of-care diagnostics

    NASA Astrophysics Data System (ADS)

    Driscoll, Ashley J.

    Nucleotide and immunoassays are important tools for disease diagnostics. Many of the current laboratory-based analytical diagnostic techniques require multiple assay steps and long incubation times before results are acquired. In the development of bioassays designed for detecting the emergence and spread of diseases in point-of-care (POC) and remote settings, more rapid and portable analytical methods are necessary. Nanoparticles provide simple and reproducible synthetic methods for the preparation of substrates that can be applied in colloidal assays, providing gains in kinetics due to miniaturization and plasmonic substrates for surface enhanced spectroscopies. Specifically, surface enhanced Raman spectroscopy (SERS) is finding broad application as a signal transduction method in immunological and nucleotide assays due to the production of narrow spectral peaks from the scattering molecules and the potential for simultaneous multiple analyte detection. The application of SERS to a no-wash, magnetic capture assay for the detection of West Nile Virus Envelope and Rift Valley Fever Virus N antigens is described. The platform utilizes colloid based capture of the target antigen in solution, magnetic collection of the immunocomplexes and acquisition of SERS spectra by a handheld Raman spectrometer. The reagents for a core-shell nanoparticle, SERS based assay designed for the capture of target microRNA implicated in acute myocardial infarction are also characterized. Several new, small molecule Raman scatterers are introduced and used to analyze the enhancing properties of the synthesized gold coated-magnetic nanoparticles. Nucleotide and immunoassay platforms have shown improvements in speed and analyte capture through the miniaturization of the capture surface and particle-based capture systems can provide a route to further surface miniaturization. 
A reaction-diffusion model of the colloidal assay platform is presented to understand the interplay of system parameters such as particle diameter, initial analyte concentration and dissociation constants. The projected sensitivities over a broad range of assay conditions are examined and the governing regime of particle systems reported. The results provide metrics in the design of more robust analytics that are of particular interest for POC diagnostics.

  1. Application of radar chart array analysis to visualize effects of formulation variables on IgG1 particle formation as measured by multiple analytical techniques

    PubMed Central

    Kalonia, Cavan; Kumru, Ozan S.; Kim, Jae Hyun; Middaugh, C. Russell; Volkin, David B.

    2013-01-01

    This study presents a novel method to visualize protein aggregate and particle formation data to rapidly evaluate the effect of solution and stress conditions on the physical stability of an IgG1 monoclonal antibody (mAb). Radar chart arrays were designed so that hundreds of Microflow Digital Imaging (MFI) solution measurements, evaluating different mAb formulations under varying stresses, could be presented in a single figure with minimal loss of data resolution. These MFI radar charts show measured changes in subvisible particle number, size, and morphology distribution as a change in the shape of polygons. Radar charts were also created to visualize mAb aggregate and particle formation across a wide size range by combining data sets from size exclusion chromatography (SEC), Archimedes resonant mass measurements, and MFI. We found that the environmental/mechanical stress condition (e.g., heat vs. agitation) was the most important factor in influencing the particle size and morphology distribution with this IgG1 mAb. Additionally, the presence of NaCl exhibited a pH- and stress-dependent behavior resulting in promotion or inhibition of mAb particle formation. This data visualization technique provides a comprehensive analysis of the aggregation tendencies of this IgG1 mAb in different formulations with varying stresses as measured by different analytical techniques. PMID:24122556
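
The radar-chart encoding described here amounts to normalizing each measurement to its scale maximum and placing it on its own spoke, so that one formulation becomes one polygon. A minimal geometric sketch of that layout (our illustration of the generic construction, not the authors' plotting code):

```python
import math

def radar_polygon(values, max_values):
    """Vertices of a radar-chart polygon: each metric (e.g. an MFI
    particle count in one size bin) is normalized to [0, 1] and
    placed on an equally spaced spoke around the origin."""
    n = len(values)
    pts = []
    for i, (v, vmax) in enumerate(zip(values, max_values)):
        r = v / vmax                     # normalized radius on spoke i
        theta = 2 * math.pi * i / n      # spoke angle
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts
```

Changes in polygon shape between stress conditions then read directly as shifts in the particle number, size, and morphology distribution, which is what lets hundreds of measurements share one figure.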

  2. Effect of posttranslational modifications on enzyme function and assembly.

    PubMed

    Ryšlavá, Helena; Doubnerová, Veronika; Kavan, Daniel; Vaněk, Ondřej

    2013-10-30

    The detailed examination of enzyme molecules by mass spectrometry and other techniques continues to identify hundreds of distinct PTMs. Recently, global analyses of enzymes using methods of contemporary proteomics revealed widespread distribution of PTMs on many key enzymes distributed in all cellular compartments. Critically, patterns of multiple enzymatic and nonenzymatic PTMs within a single enzyme are now functionally evaluated providing a holistic picture of a macromolecule interacting with low molecular mass compounds, some of them being substrates, enzyme regulators, or activated precursors for enzymatic and nonenzymatic PTMs. Multiple PTMs within a single enzyme molecule and their mutual interplays are critical for the regulation of catalytic activity. Full understanding of this regulation will require detailed structural investigation of enzymes, their structural analogs, and their complexes. Further, proteomics is now integrated with molecular genetics, transcriptomics, and other areas leading to systems biology strategies. These allow the functional interrogation of complex enzymatic networks in their natural environment. In the future, one might envisage the use of robust high throughput analytical techniques that will be able to detect multiple PTMs on a global scale of individual proteomes from a number of carefully selected cells and cellular compartments. This article is part of a Special Issue entitled: Posttranslational Protein modifications in biology and Medicine. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Methods in endogenous steroid profiling - A comparison of gas chromatography mass spectrometry (GC-MS) with supercritical fluid chromatography tandem mass spectrometry (SFC-MS/MS).

    PubMed

    Teubel, Juliane; Wüst, Bernhard; Schipke, Carola G; Peters, Oliver; Parr, Maria Kristina

    2018-06-15

    In various fields of endocrinology, the determination of steroid hormones synthesised by the human body plays an important role. Research on central neurosteroids has intensified within the last years, as they are discussed as biomarkers for various cognitive disorders. Their concentrations in cerebrospinal fluid (CSF) are considered to be regulated independently from peripheral fluids. For that reason, the challenging matrix CSF becomes a very interesting specimen for analysis. Concentrations are expected to be very low and the available amount of CSF is limited. Thus, a comprehensive method for very sensitive quantification of as large a set of analytes as possible in one analytical aliquot is desired. However, the high structural similarity across the selected panel of 51 steroids and steroid sulfates, including numerous isomers, makes chromatographic selectivity challenging to achieve. For decades, the analysis of endogenous steroids in various body fluids has mainly been performed by gas chromatography (GC) coupled to (tandem) mass spectrometry (MS(/MS)). Due to the structure of the steroids of interest, derivatisation is performed to meet the analytical requirements of GC-MS(/MS). Most laboratories use a two-step derivatisation in multi-analyte assays that was already published in the 1980s. However, for some steroids this elaborate procedure yields multiple isomeric derivatives. Thus, some laboratories utilize (ultra) high performance liquid chromatography ((U)HPLC)-MS/MS as an alternative, but even UHPLC is not able to separate some of the isomeric pairs. Supercritical fluid chromatography (SFC), as an orthogonal separation technique to GC and (U)HPLC, may help to overcome these issues. Within this project the two most promising methods for endogenous steroid profiling were investigated and compared: the "gold standard" GC-MS and the orthogonal separation technique SFC-MS/MS.
Different derivatisation procedures for gas chromatographic detection were explored and the formation of multiple derivatives described and confirmed. Taken together, none of the investigated derivatisation procedures provided acceptable results for further method development to meet the requirements of this project. SFC with its unique selectivity was able to overcome these issues and to distinguish all selected steroids, including (pro-)gestagens, androgens, corticoids, estrogens, and steroid sulfates with appropriate selectivity. Valued especially in the separation of enantiomeric analytes, SFC has shown its potential as alternative to GC. The successful separation of 51 steroids and steroid sulfates on different columns is presented to demonstrate the potential of SFC in endogenous steroid profiling. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Modal vector estimation for closely spaced frequency modes

    NASA Technical Reports Server (NTRS)

    Craig, R. R., Jr.; Chung, Y. T.; Blair, M.

    1982-01-01

    Techniques for obtaining improved modal vector estimates for systems with closely spaced frequency modes are discussed. In describing the dynamical behavior of a complex structure, modal parameters are often analyzed: undamped natural frequency, mode shape, modal mass, modal stiffness, and modal damping. From both analytical and experimental standpoints, identification of modal parameters is more difficult if the system has repeated, or even closely spaced, frequencies. The more complex the structure, the more likely it is to have closely spaced frequencies. This makes it difficult to determine valid mode shapes using single-shaker test methods. By employing band-selectable analysis (zoom) techniques and Kennedy-Pancu circle fitting or some multiple degree of freedom (MDOF) curve-fit procedure, the usefulness of the single-shaker approach can be extended.
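
Kennedy-Pancu fitting exploits the fact that a single mode's frequency response traces a circle in the Nyquist plane. The circle itself can be recovered with an algebraic least-squares fit; a minimal sketch using the Kasa formulation (our illustration of the underlying circle fit, not the paper's full procedure):

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit to points (x, y),
    e.g. real vs. imaginary FRF samples near a resonance.
    Returns center (cx, cy) and radius r."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # (x-cx)^2 + (y-cy)^2 = r^2 rearranges to the linear model
    # x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return cx, cy, float(np.sqrt(c + cx ** 2 + cy ** 2))
```

Once the circle is fitted, the sweep rate of the response angle with frequency locates the natural frequency and damping, which is the step the band-selectable (zoom) analysis sharpens.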

  5. Application of Soft Computing in Coherent Communications Phase Synchronization

    NASA Technical Reports Server (NTRS)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    2000-01-01

    The use of soft computing techniques in coherent communications phase synchronization provides an alternative to analytical or hard computing methods. This paper discusses a novel use of Adaptive Neuro-Fuzzy Inference Systems (ANFIS) for phase synchronization in coherent communications systems utilizing multiple phase shift keying (M-PSK) modulation. A brief overview of the M-PSK digital communications bandpass modulation technique is presented and its need for phase synchronization is discussed. We briefly describe the hybrid platform developed by Jang that incorporates fuzzy/neural structures, namely the Adaptive Neuro-Fuzzy Inference System (ANFIS). We then discuss the application of ANFIS to phase estimation for M-PSK. The modeling of both explicit and implicit phase estimation schemes for M-PSK symbols with unknown structure is discussed. Performance results from simulation of the above scheme are presented.

  6. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable; cluster analysis, an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses; and pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  7. Macroradical initiated polymerisation of acrylic and methacrylic monomers.

    PubMed

    Mijangos, Irene; Guerreiro, António; Piletska, Elena; Whitcombe, Michael J; Karim, Kal; Chianella, Iva; Piletsky, Sergey

    2009-10-01

    An approach has been developed for the grafting of monomers onto poly(trimethylolpropane trimethacrylate) (polyTRIM) particles using 2,2-diethyl dithiocarbamic acid benzyl ester (DDCABE) as an initiator. A set of polymers was prepared with this technique over different lengths of time and the kinetics of the reaction studied experimentally. It was found that the grafting of initiator to the polymeric support followed a second-order reaction, while the subsequent addition of monomers from solution onto the polyTRIM macroradicals followed a first-order reaction. The living nature of the iniferter-modified macroradicals permits easy consecutive grafting of multiple polymeric layers, allowing straightforward functionalisation of particles. However, the effectiveness of the grafted initiator decreased with each cycle of polymerisation. This technique can be used for a wide range of applications in analytical chemistry and biochemistry.
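    The reaction orders reported above can be recovered by the usual linearizations: for a first-order process ln C is linear in time with slope -k, while for a second-order process 1/C is linear in time with slope k. A small worked example with hypothetical rate constants (the numbers are illustrative, not data from the study):

    ```python
    import numpy as np

    # Hypothetical rate constants and initial concentration, for illustration.
    t = np.linspace(0, 10, 20)
    k1, k2, c0 = 0.3, 0.05, 2.0

    # First-order decay:  C(t) = C0 * exp(-k1*t)   ->  ln C linear in t
    # Second-order decay: 1/C(t) = 1/C0 + k2*t     ->  1/C linear in t
    c_first = c0 * np.exp(-k1 * t)
    c_second = 1.0 / (1.0 / c0 + k2 * t)

    # Recover the rate constants from the linearized forms
    k1_hat = -np.polyfit(t, np.log(c_first), 1)[0]
    k2_hat = np.polyfit(t, 1.0 / c_second, 1)[0]
    ```

    Plotting ln C (or 1/C) against time and checking which transform gives a straight line is the standard diagnostic for assigning the reaction order, as was done for the grafting kinetics here.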

  8. Investigations of Archaeological Glass Bracelets and Perfume Bottles Excavated in Ancient Ainos (Enez) by Multiple Analytical Techniques

    NASA Astrophysics Data System (ADS)

    Celik, S.; Akyuz, T.; Akyuz, S.; Ozel, A. E.; Kecel-Gunduz, S.; Basaran, S.

    2018-03-01

    Fragments of two perfume bottles belonging to the Hellenistic and Roman periods, and five bracelets belonging to the Roman, Byzantine, and Ottoman periods, excavated in the archaeological site of Enez during the excavations in 2000, have been investigated. The samples were analyzed using micro-Raman, FTIR, and energy dispersive X-ray fluorescence techniques, in order to study the ancient technology of glass production and to determine chemical compositions of the basic components and coloring elements of the glassware. All the investigated glasses can be characterized as low-magnesia-soda-lime silicate glasses, whose colors are induced by metal ions. The melting points of the investigated glasses are estimated to be quite close to each other and around 1000°C.

  9. Next Generation Space Surveillance System-of-Systems

    NASA Astrophysics Data System (ADS)

    McShane, B.

    2014-09-01

    International economic and military dependence on space assets is pervasive and ever-growing in an environment that is now congested, contested, and competitive. There are a number of natural and man-made risks that need to be monitored and characterized to protect and preserve the space environment and the assets within it. Unfortunately, today's space surveillance network (SSN) has gaps in coverage, is not resilient, and has a growing number of objects that get lost. Risks can be efficiently and effectively mitigated, gaps closed, resiliency improved, and performance increased within a next generation space surveillance network implemented as a system-of-systems with modern information architectures and analytic techniques. This also includes consideration for the newest SSN sensors (e.g. Space Fence) which are born Net-Centric out-of-the-box and able to seamlessly interface with the JSpOC Mission System, global information grid, and future unanticipated users. Significant opportunity exists to integrate legacy, traditional, and non-traditional sensors into a larger space system-of-systems (including command and control centers) for multiple clients through low cost sustainment, modification, and modernization efforts. Clients include operations centers (e.g. JSpOC, USSTRATCOM, CANSPOC), Intelligence centers (e.g. NASIC), space surveillance sensor sites (e.g. AMOS, GEODSS), international governments (e.g. Germany, UK), space agencies (e.g. NASA), and academic institutions. Each has differing priorities, networks, data needs, timeliness, security, accuracy requirements and formats. 
Enabling processes and technologies include: standardized and type-accredited methods for secure connections to multiple networks, machine-to-machine interfaces for near real-time data sharing and tip-and-cue activities, common data models for analytical processing across multiple radar and optical sensor types, an efficient way to automatically translate between differing client and sensor formats, a data warehouse of time-based space events, secure collaboration tools for international coalition space operations, and shared concepts of operations, tactics, techniques, and procedures.

  10. World Spatiotemporal Analytics and Mapping Project (WSTAMP): Discovering, Exploring, and Mapping Spatiotemporal Patterns across the World's Largest Open Source Data Sets

    NASA Astrophysics Data System (ADS)

    Stewart, R.; Piburn, J.; Sorokine, A.; Myers, A.; Moehl, J.; White, D.

    2015-07-01

    The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantic differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.

  11. Innovative Visualization Techniques applied to a Flood Scenario

    NASA Astrophysics Data System (ADS)

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids the user in collaboratively exploring, presenting and communicating visually complex and dynamic data. Here we present these concepts in the context of a 4 hour flood scenario from Lisbon in 2010, with data that includes measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain in more detail two of these that are not currently in common use for visualization of data: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for the colour legend, etc., and to provide a snapshot of the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it. 
The second technique, web-based linked views, includes multiple windows which interactively respond to the user's selections, so that when an object is selected and changed in one window, it automatically updates in all the other windows. These concepts can be part of a collaborative platform, where multiple people share and work together on the data via online access, which also allows its remote usage from a mobile platform. Storytelling augments analysis and decision-making capabilities by allowing users to assimilate complex situations and reach informed decisions, in addition to helping the public visualize information. In our visualization scenario, developed in the context of the VA-4D project for the European Space Agency (see http://www.ca3-uninova.org/project_va4d), we make use of the GAV (GeoAnalytics Visualization) framework, a web-oriented visual analytics application based on multiple interactive views. The final visualization that we produce includes multiple interactive views, including a dynamic multi-layer map surrounded by other visualizations such as bar charts, time graphs and scatter plots. The map provides flood and building information on top of a base city map (street maps and/or satellite imagery provided by online map services such as Google Maps, Bing Maps etc.). Damage over time for selected buildings, damage for all buildings at a chosen time period, and the correlation between damage and water depth can be analysed in the other views. This interactive web-based visualization that incorporates the ideas of storytelling, web-based linked views, and other visualization techniques, for a 4 hour flood event in Lisbon in 2010, can be found online at http://www.ncomva.se/flash/projects/esa/flooding/.

  12. Co-Creativity and Interactive Repair: Commentary on Berta Bornstein's "The Analysis of a Phobic Child".

    PubMed

    Harrison, Alexandra

    2014-01-01

    My comments focus on a consideration of three issues central to child psychoanalysis stimulated by rereading the classic paper by Berta Bornstein, "The Analysis of a Phobic Child: Some Problems of Theory and Technique in Child Analysis": (1) the importance of "co-creativity" and its use in analysis to repair disruptions in the mother-child relationship; (2) working analytically with the "inner world of the child"; and (3) the fundamental importance of multiple simultaneous meaning-making processes. I begin with a discussion of current thinking about the importance of interactive processes in developmental and therapeutic change and then lead to the concepts of "co-creativity" and interactive repair, elements that are missing in the "Frankie" paper. The co-creative process that I outline includes multiple contributions that Frankie and his caregivers brought to their relationships--his mother, his father, his nurse, and even his analyst. I then address the question of how child analysts can maintain a central focus on the inner world of the child while still taking into account the complex nature of co-creativity in the change process. Finally, I discuss insights into the multiple simultaneous meaning-making processes in the analytic relationship that effect therapeutic change, including what I call the "sandwich model," an attempt to organize this complexity so that it is more accessible to the practicing clinician. In terms of the specific case of Frankie, my reading of the case suggests that failure to repair disruptions in the mother-child relationship from infancy through the time of the analytic treatment was central to Frankie's problems. My hypothesis is that, rather than the content of his analyst's interpretations, what was helpful to Frankie in the analysis was the series of attempts at interactive repair in the analytic process. Unfortunately, the case report does not offer data to test this hypothesis. 
Indeed, one concluding observation from my reading of this classic case is how useful it would be for the contemporary analyst to pay attention to the multifaceted co-creative process in order to explain and foster the therapeutic change that can occur in analysis.

  13. Calibrating a novel multi-sensor physical activity measurement system.

    PubMed

    John, D; Liu, S; Sasaki, J E; Howe, C A; Staudenmayer, J; Gao, R X; Freedson, P S

    2011-09-01

    Advancing the field of physical activity (PA) monitoring requires the development of innovative multi-sensor measurement systems that are feasible in the free-living environment. The use of novel analytical techniques to combine and process these multiple sensor signals is equally important. This paper describes a novel multi-sensor 'integrated PA measurement system' (IMS), the lab-based methodology used to calibrate the IMS, techniques used to predict multiple variables from the sensor signals, and proposes design changes to improve the feasibility of deploying the IMS in the free-living environment. The IMS consists of hip and wrist acceleration sensors, two piezoelectric respiration sensors on the torso, and an ultraviolet radiation sensor to obtain contextual information (indoors versus outdoors) of PA. During lab-based calibration of the IMS, data were collected on participants performing a PA routine consisting of seven different ambulatory and free-living activities while wearing a portable metabolic unit (criterion measure) and the IMS. Data analyses on the first 50 adult participants are presented. These analyses were used to determine if the IMS can be used to predict the variables of interest. Finally, physical modifications for the IMS that could enhance the feasibility of free-living use are proposed and refinement of the prediction techniques is discussed.

  14. Optical levitation and translation of a microscopic particle by use of multiple beams generated by vertical-cavity surface-emitting laser array sources.

    PubMed

    Ogura, Yusuke; Shirai, Nobuhiro; Tanida, Jun

    2002-09-20

    An optical levitation and translation method for a microscopic particle by use of the resultant force induced by multiple light beams is studied. We show dependence of the radiation pressure force on the illuminating distribution by numerical calculation, and we find that the strongest axial force is obtained by a specific spacing period of illuminating beams. Extending the optical manipulation technique by means of vertical-cavity surface-emitting laser (VCSEL) array sources [Appl. Opt. 40, 5430 (2001)], we are the first, to our knowledge, to demonstrate levitation of a particle and its translation while levitated by using a VCSEL array. The vertical position of the target particle can be controlled in a range of a few tens of micrometers with an accuracy of 2 microm or less. The analytical and experimental results suggest that use of multiple beams is an effective method to levitate a particle with low total illumination power. Some issues on the manipulation method that uses multiple beams are discussed.

  15. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions in relation to different variants of solid-phase extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid-phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Molecular preservation in Late Cretaceous sauropod dinosaur eggshells.

    PubMed

    Schweitzer, M H; Chiappe, L; Garrido, A C; Lowenstein, J M; Pincus, S H

    2005-04-22

    Exceptionally preserved sauropod eggshells discovered in Upper Cretaceous (Campanian) deposits in Patagonia, Argentina, contain skeletal remains and soft tissues of embryonic Titanosaurid dinosaurs. To preserve these labile embryonic remains, the rate of mineral precipitation must have superseded post-mortem degradative processes, resulting in virtually instantaneous mineralization of soft tissues. If so, mineralization may also have been rapid enough to retain fragments of original biomolecules in these specimens. To investigate preservation of biomolecular compounds in these well-preserved sauropod dinosaur eggshells, we applied multiple analytical techniques. Results demonstrate organic compounds and antigenic structures similar to those found in extant eggshells.

  17. Molecular preservation in Late Cretaceous sauropod dinosaur eggshells

    PubMed Central

    Schweitzer, M.H; Chiappe, L; Garrido, A.C; Lowenstein, J.M; Pincus, S.H

    2005-01-01

    Exceptionally preserved sauropod eggshells discovered in Upper Cretaceous (Campanian) deposits in Patagonia, Argentina, contain skeletal remains and soft tissues of embryonic Titanosaurid dinosaurs. To preserve these labile embryonic remains, the rate of mineral precipitation must have superseded post-mortem degradative processes, resulting in virtually instantaneous mineralization of soft tissues. If so, mineralization may also have been rapid enough to retain fragments of original biomolecules in these specimens. To investigate preservation of biomolecular compounds in these well-preserved sauropod dinosaur eggshells, we applied multiple analytical techniques. Results demonstrate organic compounds and antigenic structures similar to those found in extant eggshells. PMID:15888409

  18. Digital robust active control law synthesis for large order flexible structure using parameter optimization

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, V.

    1988-01-01

    A generic procedure for the parameter optimization of a digital control law for a large-order flexible flight vehicle or large space structure modeled as a sampled-data system is presented. A linear quadratic Gaussian type cost function was minimized, while satisfying a set of constraints on the steady-state rms values of selected design responses, using a constrained optimization technique to meet multiple design requirements. Analytical expressions for the gradients of the cost function and the design constraints on mean square responses with respect to the control law design variables are presented.
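    The general pattern, minimizing a quadratic (LQG-type) cost subject to constraints on steady-state rms responses, can be sketched on a toy scalar sampled-data system with an off-the-shelf constrained optimizer. The plant numbers and the use of SciPy's SLSQP are illustrative assumptions, not the paper's procedure:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy scalar sampled-data plant x[k+1] = a*x[k] + b*u[k] + w[k]
    # with feedback u = -g*x and process noise variance s2.
    # All numbers are hypothetical, for illustration only.
    a, b, s2, q, r = 0.95, 0.5, 0.01, 1.0, 0.1

    def steady_state_var(g):
        ac = a - b * g                 # closed-loop pole
        return s2 / (1.0 - ac**2)      # stationary variance of x (|ac| < 1)

    def cost(g):
        # quadratic (LQG-type) cost on state variance and gain magnitude
        return q * steady_state_var(g[0]) + r * g[0]**2

    # design constraint: steady-state rms of x must not exceed 0.25
    cons = {"type": "ineq",
            "fun": lambda g: 0.25 - np.sqrt(steady_state_var(g[0]))}
    # bounds keep the closed loop stable: |a - b*g| < 1
    res = minimize(cost, x0=[0.5], constraints=cons,
                   bounds=[((a - 1) / b + 1e-3, (a + 1) / b - 1e-3)])
    g_opt = res.x[0]
    ```

    The paper's contribution is supplying analytical gradients of the cost and of the rms constraints; here the optimizer falls back on numerical gradients, which is exactly the expense the analytical expressions avoid.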

  19. Differentiation between Staphylococcus aureus and Staphylococcus epidermidis strains using Raman spectroscopy.

    PubMed

    Rebrošová, Katarína; Šiler, Martin; Samek, Ota; Růžička, Filip; Bernatová, Silvie; Ježek, Jan; Zemánek, Pavel; Holá, Veronika

    2017-08-01

    Raman spectroscopy is an analytical method with a broad range of applications across multiple scientific fields. We report on the possibility of differentiating between two important Gram-positive species commonly found in clinical material - Staphylococcus aureus and Staphylococcus epidermidis - using this rapid noninvasive technique. For this, we tested 87 strains, 41 of S. aureus and 46 of S. epidermidis, directly from colonies grown on a Mueller-Hinton agar plate using Raman spectroscopy. The method paves the way for separation of these two species even for high numbers of samples; therefore, it can potentially be used in clinical diagnostics.

  20. A Newton-Krylov method with an approximate analytical Jacobian for implicit solution of Navier-Stokes equations on staggered overset-curvilinear grids with immersed boundaries.

    PubMed

    Asgharzadeh, Hafez; Borazjani, Iman

    2017-02-15

    The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for nonlinear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate, but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs), to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for the matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against the Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation, depending on the grid (size) and the flow problem. 
In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with larger time steps and in approximately 30% fewer iterations, even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one to another depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide building preconditioners for other techniques to improve their performance in the future.
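The matrix-free Newton-Krylov idea itself (Krylov linear solves inside Newton iterations, with Jacobian-vector products approximated by finite differences of the residual) can be illustrated on a small 1-D nonlinear boundary-value problem. This is a generic sketch using SciPy's `newton_krylov`, not the paper's curvilinear immersed-boundary solver; the problem and grid size are illustrative:

```python
import numpy as np
from scipy.optimize import newton_krylov

# 1-D Bratu-type problem u'' + lam*exp(u) = 0, u(0) = u(1) = 0,
# discretized with central differences on n interior points.
n, lam = 50, 1.0
h = 1.0 / (n + 1)

def residual(u):
    # pad with the Dirichlet boundary values, then apply the
    # second-difference Laplacian and the nonlinear source term
    upad = np.concatenate(([0.0], u, [0.0]))
    lap = (upad[:-2] - 2 * upad[1:-1] + upad[2:]) / h**2
    return lap + lam * np.exp(u)

# Newton outer iteration with an LGMRES inner solve; the Jacobian is
# never formed explicitly - only J*v products are approximated.
u = newton_krylov(residual, np.zeros(n), method="lgmres", f_tol=1e-8)
```

As the abstract notes, the weakness of this plain matrix-free approach is the missing preconditioner; supplying one built from an analytical Jacobian is exactly the paper's remedy.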

  1. A Newton–Krylov method with an approximate analytical Jacobian for implicit solution of Navier–Stokes equations on staggered overset-curvilinear grids with immersed boundaries

    PubMed Central

    Asgharzadeh, Hafez; Borazjani, Iman

    2016-01-01

    The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for nonlinear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate, but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs), to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for the matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against the Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation, depending on the grid (size) and the flow problem. 
In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42–74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with larger time steps and in approximately 30% fewer iterations, even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one to another depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80–90% on the problems tested. The NKM with the analytical Jacobian can guide building preconditioners for other techniques to improve their performance in the future. PMID:28042172

  2. A Newton-Krylov method with an approximate analytical Jacobian for implicit solution of Navier-Stokes equations on staggered overset-curvilinear grids with immersed boundaries

    NASA Astrophysics Data System (ADS)

    Asgharzadeh, Hafez; Borazjani, Iman

    2017-02-01

    The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for non-linear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate, but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are one of the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs), to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for the matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against the Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation, depending on the grid (size) and the flow problem. 
In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with larger time steps and in approximately 30% fewer iterations, even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one to another depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide building preconditioners for other techniques to improve their performance in the future.

  3. One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.

    PubMed

    Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz

    2009-07-15

    The existing solid-phase microextraction (SPME) kinetic calibration technique, which uses the desorption of preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which has limited the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic environmental variability, such as in temperature, turbulence, and analyte concentration, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of the passive samplings in the flow-through system illustrated that the effect of the environmental variables was successfully compensated for with the kinetic calibration technique, and all extracted analytes could be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and will also be useful for other microextraction techniques.
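The identity at the heart of SPME kinetic calibration can be sketched as follows. The rate constant, amounts, and times below are hypothetical; the full one-calibrant technique additionally transfers the calibrant's rate constant to each analyte, which this minimal sketch omits:

```python
import math

def equilibrium_amount(n_extracted, q_remaining, q_preloaded):
    """Kinetic calibration: with isotropic first-order kinetics,
    n(t)/n_e = 1 - exp(-a*t) and Q(t)/q0 = exp(-a*t), hence
    n(t)/n_e = 1 - Q(t)/q0, so the equilibrium amount n_e follows
    from the fraction of preloaded calibrant lost during sampling."""
    desorbed_fraction = 1.0 - q_remaining / q_preloaded
    return n_extracted / desorbed_fraction

# forward-simulate one sampling to check the round trip
a, t = 0.12, 10.0          # hypothetical rate constant (1/min) and sampling time (min)
n_e_true, q0 = 50.0, 8.0   # hypothetical equilibrium and preloaded amounts (ng)
n = n_e_true * (1.0 - math.exp(-a * t))   # amount extracted
q = q0 * math.exp(-a * t)                 # calibrant remaining on the fiber
```

The recovered `equilibrium_amount(n, q, q0)` equals `n_e_true` exactly, which is why changes in temperature or turbulence (which change `a`) cancel out of the calibration.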

  4. Are Higher Education Institutions Prepared for Learning Analytics?

    ERIC Educational Resources Information Center

    Ifenthaler, Dirk

    2017-01-01

    Higher education institutions and involved stakeholders can derive multiple benefits from learning analytics by using different data analytics strategies to produce summative, real-time, and predictive insights and recommendations. However, are institutions and academic as well as administrative staff prepared for learning analytics? A learning…

  5. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step regarding the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target-analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments, and low-cost operation through extremely low or no solvent consumption. Microextraction techniques, such as microextraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS formats (on- and off-line), sorbents, and experimental protocols; factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  6. Decryption with incomplete cyphertext and multiple-information encryption in phase space.

    PubMed

    Xu, Xiaobin; Wu, Quanying; Liu, Jun; Situ, Guohai

    2016-01-25

    Recently, we have demonstrated that information encryption in phase space offers security enhancement over traditional encryption schemes operating in real space. However, there is an important issue with this technique: it increases the cost of data transmission and storage. To address this issue, here we investigate the problem of decryption using incomplete cyphertext. We show that the analytic solution under the traditional framework sets the lower limit of decryption performance. More importantly, we demonstrate that one needs only a small amount of cyphertext to recover the plaintext signal faithfully using compressive sensing, meaning that the amount of data that needs to be transmitted and stored can be significantly reduced. This leads to multiple-information encryption, so that we can use the system bandwidth more effectively. We also provide an optical experimental result to demonstrate the plaintext recovered in phase space.
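The recovery step can be illustrated with orthogonal matching pursuit, one standard greedy algorithm for reconstructing a sparse signal from few linear measurements. The tiny dictionary below is hand-built so the example stays deterministic; practical compressive-sensing systems use much larger random measurement ensembles or l1 minimization:

```python
import math

def solve(M, b):
    # Gaussian elimination with partial pivoting for a small dense system
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most correlated
    with the residual, then re-fit by least squares on the chosen support."""
    m, n = len(A), len(A[0])
    col = lambda j: [A[i][j] for i in range(m)]
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    support, r = [], y[:]
    for _ in range(k):
        j = max((j for j in range(n) if j not in support),
                key=lambda j: abs(dot(col(j), r)))
        support.append(j)
        G = [[dot(col(a), col(b)) for b in support] for a in support]   # Gram matrix
        coef = solve(G, [dot(col(a), y) for a in support])
        r = [y[i] - sum(coef[s] * A[i][support[s]] for s in range(len(support)))
             for i in range(m)]
    x = [0.0] * n
    for s, j in enumerate(support):
        x[j] = coef[s]
    return x

# hand-built 6x9 dictionary: 6 identity columns plus 3 normalized pair sums
s = 1.0 / math.sqrt(2.0)
A = [[0.0] * 9 for _ in range(6)]
for i in range(6):
    A[i][i] = 1.0
A[0][6] = A[1][6] = s
A[2][7] = A[3][7] = s
A[4][8] = A[5][8] = s
y = [0.0, 2.0, 0.0, 0.0, -1.0, 0.0]   # measurements of a 2-sparse signal
x_hat = omp(A, y, 2)
```

For this construction the true coefficients (2 on column 1, -1 on column 4) are recovered exactly after two greedy steps.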

  7. Application of stable isotope ratio analysis for biodegradation monitoring in groundwater

    USGS Publications Warehouse

    Hatzinger, Paul B.; Böhlke, John Karl; Sturchio, Neil C.

    2013-01-01

    Stable isotope ratio analysis is increasingly being applied as a tool to detect, understand, and quantify biodegradation of organic and inorganic contaminants in groundwater. An important feature of this approach is that it allows degradative losses of contaminants to be distinguished from those caused by non-destructive processes such as dilution, dispersion, and sorption. Recent advances in analytical techniques, and new approaches for interpreting stable isotope data, have expanded the utility of this method while also exposing complications and ambiguities that must be considered in data interpretations. Isotopic analyses of multiple elements in a compound, and multiple compounds in the environment, are being used to distinguish biodegradative pathways by their characteristic isotope effects. Numerical models of contaminant transport, degradation pathways, and isotopic composition are improving quantitative estimates of in situ contaminant degradation rates under realistic environmental conditions.
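The workhorse for quantifying degradative loss from isotope data is the Rayleigh equation; a minimal sketch follows, with illustrative delta values and an assumed enrichment factor (compound- and pathway-specific factors come from laboratory calibration):

```python
import math

def fraction_remaining(delta0, delta_t, epsilon):
    """Exact Rayleigh form: ln((1000+delta_t)/(1000+delta0)) = (alpha-1)*ln(f),
    with the enrichment factor epsilon = 1000*(alpha-1) in permil."""
    return math.exp(1000.0 / epsilon *
                    math.log((1000.0 + delta_t) / (1000.0 + delta0)))

def percent_biodegraded(delta0, delta_t, epsilon):
    # only destructive processes shift the ratio; dilution and sorption do not
    return 100.0 * (1.0 - fraction_remaining(delta0, delta_t, epsilon))
```

Forward-simulating a plume sample (70% degraded, epsilon = -5 permil) and inverting it recovers the degraded fraction, which is exactly the separation of destructive from non-destructive losses the abstract describes.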

  8. Coherent total internal reflection dark-field microscopy: label-free imaging beyond the diffraction limit.

    PubMed

    von Olshausen, Philipp; Rohrbach, Alexander

    2013-10-15

    Coherent imaging is barely applicable in life-science microscopy due to multiple interference artifacts. Here, we show how these interferences can be used to improve image resolution and contrast. We present a dark-field microscopy technique with evanescent illumination via total internal reflection that delivers high-contrast images of coherently scattering samples. By incoherent averaging of multiple coherent images illuminated from different directions we can resolve image structures that remain unresolved by conventional (incoherent) fluorescence microscopy. We provide images of 190 nm beads revealing resolution beyond the diffraction limit and slightly increased object distances. An analytical model is introduced that accounts for the observed effects and which is confirmed by numerical simulations. Our approach may be a route to fast, label-free, super-resolution imaging in live-cell microscopy.

  9. Multiple imputation as one tool to provide longitudinal databases for modelling human height and weight development.

    PubMed

    Aßmann, C

    2016-06-01

    Besides large efforts regarding field work, the provision of valid databases requires statistical and informational infrastructure to enable long-term access to longitudinal data sets on height, weight, and related issues. To foster use of longitudinal data sets within the scientific community, the provision of valid databases has to address data-protection regulations. It is, therefore, of major importance to prevent the identification of individuals from publicly available databases. To reach this goal, one possible strategy is to provide a synthetic database to the public, allowing for pretesting of strategies for data analysis. The synthetic databases can be established using multiple imputation tools. Given approval of the strategy, verification is based on the original data. Multiple imputation by chained equations is illustrated to facilitate the provision of synthetic databases, as it allows for capturing a wide range of statistical interdependencies. Missing values, typically occurring within longitudinal databases for reasons of item non-response, can also be addressed via multiple imputation when providing databases. The provision of synthetic databases using multiple imputation techniques is one possible strategy to ensure data protection, increase the visibility of longitudinal databases, and enhance their analytical potential.
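A minimal, deterministic sketch of the chained-equations idea: regress each variable on the others over complete cases, then fill its missing entries from the fit. A full MICE implementation additionally draws imputations from the fitted residual distribution and produces multiple completed datasets; both are omitted here so the sketch stays checkable:

```python
def fit_line(xs, ys):
    # ordinary least squares y = a + b*x on complete cases
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def impute_chained(height, weight, sweeps=5):
    """One chained-equations sweep pattern over two variables; None marks
    item non-response."""
    h, w = list(height), list(weight)
    for _ in range(sweeps):
        for target, other in ((w, h), (h, w)):
            pairs = [(o, t) for o, t in zip(other, target)
                     if o is not None and t is not None]
            a, b = fit_line([p[0] for p in pairs], [p[1] for p in pairs])
            for i, (t, o) in enumerate(zip(target, other)):
                if t is None and o is not None:
                    target[i] = a + b * o
    return h, w

heights = [150.0, 160.0, 170.0, 180.0, 190.0]
weights = [45.0, 50.0, None, 60.0, None]   # two non-responses
hh, ww = impute_chained(heights, weights)
```

Because the toy data lie exactly on a line (weight = 0.5 * height - 30), the imputed weights are recovered exactly.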

  10. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  11. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  12. Experimental and analytical determination of stability parameters for a balloon tethered in a wind

    NASA Technical Reports Server (NTRS)

    Redd, L. T.; Bennett, R. M.; Bland, S. R.

    1973-01-01

    Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.

  13. Conversion of multiple analyte cation types to a single analyte anion type via ion/ion charge inversion.

    PubMed

    Hassell, Kerry M; LeBlanc, Yves; McLuckey, Scott A

    2009-11-01

    Charge inversion ion/ion reactions can convert several cation types associated with a single analyte molecule to a single anion type for subsequent mass analysis. Specifically, analyte ions present with one of a variety of cationizing agents, such as an excess proton, excess sodium ion, or excess potassium ion, can all be converted to the deprotonated molecule, provided that a stable anion can be generated for the analyte. Multiply deprotonated species that are capable of exchanging a proton for a metal ion serve as the reagent anions for the reaction. This process is demonstrated here for warfarin and for a glutathione conjugate. Examples for several other glutathione conjugates are provided as supplementary material to demonstrate the generality of the reaction. In the case of glutathione conjugates, multiple metal ions can be associated with the singly-charged analyte due to the presence of two carboxylate groups. The charge inversion reaction involves the removal of the excess cationizing agent, as well as any metal ions associated with anionic groups to yield a singly deprotonated analyte molecule. The ability to convert multiple cation types to a single anion type is analytically desirable in cases in which the analyte signal is distributed among several cation types, as is common in the electrospray ionization of solutions with relatively high salt contents. For analyte species that undergo efficient charge inversion, such as glutathione conjugates, there is the additional potential advantage for significantly improved signal-to-noise ratios when species that give rise to 'chemical noise' in the positive ion spectrum do not undergo efficient charge inversion.
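The bookkeeping that makes the conversion analytically attractive is simple mass arithmetic: every singly charged cation type maps back to the same neutral mass, and hence to the same [M - H]- anion. A sketch with a hypothetical warfarin-like neutral mass; the cation masses are standard values:

```python
PROTON = 1.007276     # m/z contribution of H+ (Da)
SODIUM = 22.989221    # Na+ (Da)
POTASSIUM = 38.963158 # K+ (Da)

def neutral_mass(mz_cation, cation_mass):
    # singly charged adduct: m/z([M + X]+) = M + mass(X+)
    return mz_cation - cation_mass

def inverted_mz(mz_cation, cation_mass):
    # charge inversion funnels every cation type into the same [M - H]- anion
    return neutral_mass(mz_cation, cation_mass) - PROTON

# hypothetical analyte of neutral monoisotopic mass M (Da)
M = 308.105
protonated, sodiated, potassiated = M + PROTON, M + SODIUM, M + POTASSIUM
```

Signal that was split across three cation channels in the positive-ion spectrum is thus summed into a single anion channel, which is the signal-to-noise argument the abstract makes.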

  14. Mechanical behavior of regular open-cell porous biomaterials made of diamond lattice unit cells.

    PubMed

    Ahmadi, S M; Campoli, G; Amin Yavari, S; Sajadi, B; Wauthle, R; Schrooten, J; Weinans, H; Zadpoor, A A

    2014-06-01

    Cellular structures with highly controlled micro-architectures are promising materials for orthopedic applications that require bone-substituting biomaterials or implants. The availability of additive manufacturing techniques has enabled manufacturing of biomaterials made of one or multiple types of unit cells. The diamond lattice unit cell is one of the relatively new types of unit cells that are used in manufacturing of regular porous biomaterials. As opposed to many other types of unit cells, there is currently no analytical solution that could be used for prediction of the mechanical properties of cellular structures made of the diamond lattice unit cells. In this paper, we present new analytical solutions and closed-form relationships for predicting the elastic modulus, Poisson's ratio, critical buckling load, and yield (plateau) stress of cellular structures made of the diamond lattice unit cell. The mechanical properties predicted using the analytical solutions are compared with those obtained using finite element models. A number of solid and porous titanium (Ti6Al4V) specimens were manufactured using selective laser melting. A series of experiments were then performed to determine the mechanical properties of the matrix material and cellular structures. The experimentally measured mechanical properties were compared with those obtained using analytical solutions and finite element (FE) models. It has been shown that, for small apparent density values, the mechanical properties obtained using analytical and numerical solutions are in agreement with each other and with experimental observations. The properties estimated using an analytical solution based on the Euler-Bernoulli theory markedly deviated from experimental results for large apparent density values. The mechanical properties estimated using FE models and another analytical solution based on the Timoshenko beam theory better matched the experimental observations. Copyright © 2014 Elsevier Ltd. All rights reserved.
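The deviation of the Euler-Bernoulli prediction at large apparent density has a simple beam-level origin: stubby struts carry a non-negligible shear deflection that only Timoshenko theory includes. A single-strut sketch with an assumed square section and material constants, not the paper's lattice-level closed forms:

```python
def cantilever_compliance(L, t, E, nu=0.3, shear=True):
    """Tip deflection per unit load for a cantilever strut of length L with a
    square t x t section. Euler-Bernoulli keeps only the bending term
    L^3/(3*E*I); Timoshenko adds the shear term L/(kappa*G*A)."""
    I = t ** 4 / 12.0            # second moment of area
    A = t * t                    # cross-sectional area
    G = E / (2.0 * (1.0 + nu))   # shear modulus from E and Poisson's ratio
    kappa = 5.0 / 6.0            # shear correction factor, rectangular section
    c = L ** 3 / (3.0 * E * I)
    if shear:
        c += L / (kappa * G * A)
    return c
```

For a slender strut (L/t = 20) the shear term changes the compliance by well under 1%, whereas for a stubby strut (L/t = 2, i.e., high apparent density) it adds roughly 20%, mirroring why the Timoshenko-based solution matched the experiments better.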

  15. Multiplex, Rapid, and Sensitive Isothermal Detection of Nucleic-Acid Sequence by Endonuclease Restriction-Mediated Real-Time Multiple Cross Displacement Amplification.

    PubMed

    Wang, Yi; Wang, Yan; Zhang, Lu; Liu, Dongxin; Luo, Lijuan; Li, Hua; Cao, Xiaolong; Liu, Kai; Xu, Jianguo; Ye, Changyun

    2016-01-01

    We have devised a novel isothermal amplification technology, termed endonuclease restriction-mediated real-time multiple cross displacement amplification (ET-MCDA), which facilitated multiplex, rapid, specific and sensitive detection of nucleic-acid sequences at a constant temperature. The ET-MCDA integrated multiple cross displacement amplification strategy, restriction endonuclease cleavage and real-time fluorescence detection technique. In the ET-MCDA system, the functional cross primer E-CP1 or E-CP2 was constructed by adding a short sequence at the 5' end of CP1 or CP2, respectively, and the new E-CP1 or E-CP2 primer was labeled at the 5' end with a fluorophore and in the middle with a dark quencher. The restriction endonuclease Nb.BsrDI specifically recognized the short sequence and digested the newly synthesized double-stranded terminal sequences (5' end short sequences and their complementary sequences), which released the quenching, resulting in a gain of fluorescence signal. Thus, the ET-MCDA allowed real-time detection of single or multiple targets in only a single reaction, and positive results were observed in as little as 12 min, detecting down to 3.125 fg of genomic DNA per tube. Moreover, the analytical specificity and the practical application of the ET-MCDA were also successfully evaluated in this study. Here, we provided the details on the novel ET-MCDA technique and expounded the basic ET-MCDA amplification mechanism.

  16. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  17. Critical Assessment of Analytical Techniques in the Search for Biomarkers on Mars: A Mummified Microbial Mat from Antarctica as a Best-Case Scenario

    NASA Astrophysics Data System (ADS)

    Blanco, Yolanda; Gallardo-Carreño, Ignacio; Ruiz-Bermejo, Marta; Puente-Sánchez, Fernando; Cavalcante-Silva, Erika; Quesada, Antonio; Prieto-Ballesteros, Olga; Parro, Víctor

    2017-10-01

    The search for biomarkers of present or past life is one of the major challenges for in situ planetary exploration. Multiple constraints limit the performance and sensitivity of remote in situ instrumentation. In addition, the structure and the chemical and mineralogical composition of the sample may complicate the analysis and interpretation of the results. The aim of this work is to highlight the main constraints, performance, and complementarity of several techniques that have already been implemented or are planned to be implemented on Mars for detection of organic and molecular biomarkers on a best-case sample scenario. We analyzed a 1000-year-old desiccated and mummified microbial mat from Antarctica by Raman and IR (infrared) spectroscopies (near- and mid-IR), thermogravimetry (TG), differential thermal analysis, mass spectrometry (MS), and immunological detection with a life detector chip. In spite of the high organic content (ca. 20% wt/wt) of the sample, the Raman spectra only showed the characteristic spectral peaks of the remaining beta-carotene biomarker and faint peaks of phyllosilicates over a strong fluorescence background. IR spectra complemented the mineralogical information from Raman spectra and showed the main molecular vibrations of the humic acid functional groups. The TG-MS system showed the release of several volatile compounds attributed to biopolymers. An antibody microarray for detecting cyanobacteria (CYANOCHIP) detected biomarkers from the Chroococcales, Nostocales, and Oscillatoriales orders. The results highlight the limitations of each technique and suggest the necessity of complementary approaches in the search for biomarkers because some analytical techniques might be impaired by sample composition, presentation, or processing.

  18. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    PubMed

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.
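The desirability approach combines per-response desirabilities into one objective via a geometric mean; a minimal Derringer-Suich-style sketch for responses to be maximized (the bounds and weight are user-chosen, as in the reviewed applications):

```python
def desirability_max(y, low, high, weight=1.0):
    """One-sided desirability for a response to maximize: 0 below `low`,
    1 above `high`, and a power-law ramp in between."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** weight

def overall_desirability(ds):
    # geometric mean: any single unacceptable response (d = 0) zeroes the score
    p = 1.0
    for d in ds:
        p *= d
    return p ** (1.0 / len(ds))
```

Optimizing the overall desirability over the factor space (e.g., on an RSM model) then yields a single compromise operating point for all responses at once.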

  19. Lessons Learned from Deploying an Analytical Task Management Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.; Welch, Clara; Arceneaux, Joshua; Bulgatz, Dennis; Hunt, Mitch; Young, Stephen

    2007-01-01

    Defining requirements, missions, technologies, and concepts for space exploration involves multiple levels of organizations, teams of people with complementary skills, and analytical models and simulations. Analytical activities range from filling a To-Be-Determined (TBD) in a requirement to creating animations and simulations of exploration missions. In a program as large as returning to the Moon, there are hundreds of simultaneous analysis activities. A way to manage and integrate efforts of this magnitude is to deploy a centralized database that provides the capability to define tasks, identify resources, describe products, schedule deliveries, and generate a variety of reports. This paper describes a web-accessible task management system and explains the lessons learned during the development and deployment of the database. Through the database, managers and team leaders can define tasks, establish review schedules, assign teams, link tasks to specific requirements, identify products, and link the task data records to external repositories that contain the products. Data filters and spreadsheet export utilities provide a powerful capability to create custom reports. Import utilities provide a means to populate the database from previously filled form files. Within a four month period, a small team analyzed requirements, developed a prototype, conducted multiple system demonstrations, and deployed a working system supporting hundreds of users across the aerospace community. Open-source technologies and agile software development techniques, applied by a skilled team, enabled this impressive achievement. Topics in the paper cover the web application technologies, agile software development, an overview of the system's functions and features, dealing with increasing scope, and deploying new versions of the system.
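A centralized task database of the kind described can be sketched with an in-memory schema; the table layout, task titles, and requirement IDs below are hypothetical illustrations, not the deployed system's design:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE task (
    id INTEGER PRIMARY KEY,
    title TEXT NOT NULL,
    team TEXT,
    requirement TEXT,   -- external requirement the task traces to
    product_url TEXT,   -- link into the external product repository
    due DATE
);
""")
conn.executemany(
    "INSERT INTO task (title, team, requirement, due) VALUES (?, ?, ?, ?)",
    [("Fill TBD in thermal requirement", "Thermal", "REQ-101", "2007-03-01"),
     ("Lunar ascent animation", "Visualization", "REQ-207", "2007-04-15")],
)
# a filtered "report": task count per team, the kind of custom report the
# data filters and export utilities would produce
rows = conn.execute(
    "SELECT team, COUNT(*) FROM task GROUP BY team ORDER BY team"
).fetchall()
```

Linking `requirement` and `product_url` to external repositories is what lets a single query answer "which tasks trace to this requirement, and where are their products?".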

  20. ANALYTICAL SOLUTIONS OF THE ATMOSPHERIC DIFFUSION EQUATION WITH MULTIPLE SOURCES AND HEIGHT-DEPENDENT WIND SPEED AND EDDY DIFFUSIVITIES. (R825689C072)

    EPA Science Inventory

    Abstract

    Three-dimensional analytical solutions of the atmospheric diffusion equation with multiple sources and height-dependent wind speed and eddy diffusivities are derived in a systematic fashion. For homogeneous Neumann (total reflection), Dirichlet (total adsorpti...
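The Neumann and Dirichlet cases can be illustrated with the classical image-source construction for the crosswind-integrated Gaussian plume. This is a textbook special case with constant wind speed and an assumed dispersion parameter, not the height-dependent-coefficient solutions derived in the paper:

```python
import math

def crosswind_integrated_conc(z, H, sigma_z, Q=1.0, u=1.0, bc="neumann"):
    """Crosswind-integrated concentration at height z for a source at height H.
    sigma_z encodes the downwind distance and is taken as given here.
    A totally reflecting ground (Neumann, dC/dz = 0) adds an image source
    at -H; a totally absorbing ground (Dirichlet, C = 0) subtracts it."""
    g = lambda s: math.exp(-s * s / (2.0 * sigma_z ** 2))
    sign = 1.0 if bc == "neumann" else -1.0
    return Q / (u * math.sqrt(2.0 * math.pi) * sigma_z) * (g(z - H) + sign * g(z + H))
```

At the ground the Dirichlet solution vanishes identically, while the Neumann solution doubles the free-space contribution; superposing one such term per source handles the multiple-source case.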

  1. ANALYTICAL SOLUTIONS OF THE ATMOSPHERIC DIFFUSION EQUATION WITH MULTIPLE SOURCES AND HEIGHT-DEPENDENT WIND SPEED AND EDDY DIFFUSIVITIES. (R825689C048)

    EPA Science Inventory

    Abstract

    Three-dimensional analytical solutions of the atmospheric diffusion equation with multiple sources and height-dependent wind speed and eddy diffusivities are derived in a systematic fashion. For homogeneous Neumann (total reflection), Dirichlet (total adsorpti...

  2. A parallel computing engine for a class of time critical processes.

    PubMed

    Nabhan, T M; Zomaya, A Y

    1997-01-01

    This paper focuses on the efficient parallel implementation of numerically intensive systems over loosely coupled multiprocessor architectures. These analytical models are of significant importance to many real-time systems that have to meet severe time constraints. A parallel computing engine (PCE) has been developed in this work for the efficient simplification and near-optimal scheduling of numerical models over the different cooperating processors of the parallel computer. First, the analytical system is efficiently coded in its general form. The model is then simplified by using any available information (e.g., constant parameters). A task graph representing the interconnections among the different components (or equations) is generated. The graph can then be compressed to control the computation/communication requirements. The task scheduler employs a graph-based iterative scheme, based on the simulated annealing algorithm, to map the vertices of the task graph onto a Multiple-Instruction-stream Multiple-Data-stream (MIMD) type of architecture. The algorithm uses a nonanalytical cost function that properly considers the computation capability of the processors, the network topology, the communication time, and congestion possibilities. Moreover, the proposed technique is simple, flexible, and computationally viable. The efficiency of the algorithm is demonstrated by two case studies with good results.
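The mapping step can be sketched as a simulated-annealing loop over task-to-processor assignments. The cost function below (makespan plus cut communication) is a simple illustrative stand-in for the paper's nonanalytical cost function, which also models topology and congestion:

```python
import math
import random

def schedule_cost(assign, task_cost, comm, speed):
    """Illustrative cost: largest per-processor compute load plus the weight
    of task-graph edges cut by the assignment."""
    load = [0.0] * len(speed)
    for t, p in enumerate(assign):
        load[p] += task_cost[t] / speed[p]
    return max(load) + sum(w for (i, j), w in comm.items() if assign[i] != assign[j])

def anneal(task_cost, comm, speed, steps=3000, t0=5.0, cooling=0.995, seed=1):
    rng = random.Random(seed)
    n, nproc = len(task_cost), len(speed)
    assign = [rng.randrange(nproc) for _ in range(n)]
    cur = best = schedule_cost(assign, task_cost, comm, speed)
    best_assign, temp = list(assign), t0
    for _ in range(steps):
        t = rng.randrange(n)
        old = assign[t]
        assign[t] = rng.randrange(nproc)   # perturb: move one task
        new = schedule_cost(assign, task_cost, comm, speed)
        # Metropolis rule: accept improvements, occasionally accept uphill moves
        if new <= cur or rng.random() < math.exp((cur - new) / temp):
            cur = new
            if new < best:
                best, best_assign = new, list(assign)
        else:
            assign[t] = old                # reject: undo the move
        temp *= cooling
    return best_assign, best

tasks = [3.0, 1.0, 2.0, 2.0, 4.0]          # hypothetical task compute costs
comm = {(0, 1): 1.0, (2, 3): 2.0}          # task-graph edges and weights
speed = [1.0, 1.0]                         # two equal processors
best_assign, best = anneal(tasks, comm, speed)
```

The occasional acceptance of uphill moves is what lets the scheduler escape local minima that a greedy mapper would get stuck in.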

  3. A semi-analytical method for near-trapped mode and fictitious frequencies of multiple scattering by an array of elliptical cylinders in water waves

    NASA Astrophysics Data System (ADS)

    Chen, Jeng-Tzong; Lee, Jia-Wei

    2013-09-01

    In this paper, we focus on the water wave scattering by an array of four elliptical cylinders. The null-field boundary integral equation method (BIEM) is used in conjunction with degenerate kernels and eigenfunction expansions. The closed-form fundamental solution is expressed in terms of the degenerate kernel containing the Mathieu and the modified Mathieu functions in the elliptical coordinates. Boundary densities are represented by using the eigenfunction expansion. To avoid using the addition theorem to translate the Mathieu functions, the present approach can solve the water wave problem containing multiple elliptical cylinders in a semi-analytical manner by introducing the adaptive observer system. Regarding water wave problems, the phenomenon of numerical instability due to fictitious frequencies may appear when the BIEM/boundary element method (BEM) is used. Besides, the near-trapped mode for an array of four identical elliptical cylinders is observed in a special layout. Both physical (near-trapped mode) and mathematical (fictitious frequency) resonances simultaneously appear in the present paper for water wave scattering by an array of four identical elliptical cylinders. Two regularization techniques, the combined Helmholtz interior integral equation formulation (CHIEF) method and the Burton and Miller approach, are adopted to alleviate the numerical resonance due to fictitious frequency.

  4. Accounting for differences in the bioactivity and bioavailability of vitamers

    PubMed Central

    Gregory, Jesse F.

    2012-01-01

    Essentially all vitamins exist with multiple nutritionally active chemical species often called vitamers. Our quantitative understanding of the bioactivity and bioavailability of the various members of each vitamin family has increased markedly, but many issues remain to be resolved concerning the reporting and use of analytical data. Modern methods of vitamin analysis rely heavily on chromatographic techniques that generally allow the measurement of the individual chemical forms of vitamins. Typical applications of food analysis include the evaluation of shelf life and storage stability, monitoring of nutrient retention during food processing, developing food composition databases and data needed for food labeling, assessing dietary adequacy and evaluating epidemiological relationships between diet and disease. Although the usage of analytical data varies depending on the situation, important issues regarding how best to present and interpret the data in light of the presence of multiple vitamers are common to all aspects of food analysis. In this review, we will evaluate the existence of vitamers that exhibit differences in bioactivity or bioavailability, consider when there is a need to address differences in bioactivity or bioavailability of vitamers, and then consider alternative approaches and possible ways to improve the reporting of data. Major examples are taken from literature and experience with vitamin B6 and folate. PMID:22489223

  5. On the multiple zeros of a real analytic function with applications to the averaging theory of differential equations

    NASA Astrophysics Data System (ADS)

    García, Isaac A.; Llibre, Jaume; Maza, Susanna

    2018-06-01

    In this work we consider real analytic functions , where , Ω is a bounded open subset of , is an interval containing the origin, are parameters, and ε is a small parameter. We study the branching of the zero-set of at multiple points when the parameter ε varies. We apply the obtained results to improve the classical averaging theory for computing T-periodic solutions of λ-families of analytic T-periodic ordinary differential equations defined on , using the displacement functions defined by these equations. We call the coefficients in the Taylor expansion of in powers of ε the averaged functions. The main contribution consists in analyzing the role played by the multiple zeros of the first non-zero averaged function. The outcome is that these multiple zeros can be of two different classes depending on whether the zeros belong or not to the analytic set defined by the real variety associated to the ideal generated by the averaged functions in the Noetherian ring of all the real analytic functions at . We bound the maximum number of branches of isolated zeros that can bifurcate from each multiple zero z0. Sometimes these bounds depend on the cardinalities of minimal bases of the former ideal. Several examples illustrate our results and they are compared with the classical theory, branching theory and also under the light of singularity theory of smooth maps. The examples range from polynomial vector fields to Abel differential equations and perturbed linear centers.

  6. Addressing the targeting range of the ABILHAND-56 in relapsing-remitting multiple sclerosis: A mixed methods psychometric study.

    PubMed

    Cleanthous, Sophie; Strzok, Sara; Pompilus, Farrah; Cano, Stefan; Marquis, Patrick; Cohan, Stanley; Goldman, Myla D; Kresa-Reahl, Kiren; Petrillo, Jennifer; Castrillo-Viguera, Carmen; Cadavid, Diego; Chen, Shih-Yin

    2018-01-01

    ABILHAND, a manual ability patient-reported outcome instrument originally developed for stroke patients, has been used in multiple sclerosis clinical trials; however, psychometric analyses indicated the measure's limited measurement range and precision in higher-functioning multiple sclerosis patients. The purpose of this study was to identify candidate items to expand the measurement range of the ABILHAND-56, thus improving its ability to detect differences in manual ability in higher-functioning multiple sclerosis patients. A step-wise mixed methods design strategy was used, comprising two waves of patient interviews, a combination of qualitative (concept elicitation and cognitive debriefing) and quantitative (Rasch measurement theory) analytic techniques, and consultation interviews with three clinical neurologists specializing in multiple sclerosis. The original ABILHAND was well understood in this context of use. Eighty-two new manual ability concepts were identified. Draft supplementary items were generated and refined with patient and neurologist input. Rasch measurement theory psychometric analysis indicated that the supplementary items improved targeting to higher-functioning multiple sclerosis patients and improved measurement precision. The final pool of Early Multiple Sclerosis Manual Ability items comprises 20 items. The synthesis of qualitative and quantitative methods used in this study improves the ABILHAND's content validity to more effectively identify manual ability changes in early multiple sclerosis and potentially help determine treatment effect in higher-functioning patients in clinical trials.

  7. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  8. Visual Analytics for Heterogeneous Geoscience Data

    NASA Astrophysics Data System (ADS)

    Pan, Y.; Yu, L.; Zhu, F.; Rilee, M. L.; Kuo, K. S.; Jiang, H.; Yu, H.

    2017-12-01

    Geoscience data obtained from diverse sources have been routinely leveraged by scientists to study various phenomena. The principal data sources include observations and model simulation outputs. These data are characterized by spatiotemporal heterogeneity originating from the different instrument design specifications and/or computational model requirements used in data generation. Such inherent heterogeneity poses several challenges in exploring and analyzing geoscience data. First, scientists often wish to identify features or patterns co-located among multiple data sources to derive and validate certain hypotheses. Heterogeneous data make it a tedious task to search for such features in dissimilar datasets. Second, features of geoscience data are typically multivariate. It is challenging to tackle the high dimensionality of geoscience data and explore the relations among multiple variables in a scalable fashion. Third, there is a lack of transparency in traditional automated approaches, such as feature detection or clustering, in that scientists cannot intuitively interact with their analysis processes and interpret results. To address these issues, we present a new scalable approach that can assist scientists in analyzing voluminous and diverse geoscience data. We expose a high-level query interface that allows users to easily express customized queries to search for features of interest across multiple heterogeneous datasets. For identified features, we develop a visualization interface that enables interactive exploration and analytics in a linked-view manner. Specific visualization techniques, ranging from scatter plots to parallel coordinates, are employed in each view to allow users to explore various aspects of the features. The different views are linked and refreshed according to user interactions in any individual view. In this manner, a user can interactively and iteratively gain insight into the data through a variety of visual analytics operations. We demonstrate with use cases how scientists can combine the query and visualization interfaces in a customized workflow facilitating studies that use heterogeneous geoscience datasets.

  9. Electronic nose for detecting multiple targets

    NASA Astrophysics Data System (ADS)

    Chakraborty, Anirban; Parthasarathi, Ganga; Poddar, Rakesh; Zhao, Weiqiang; Luo, Cheng

    2006-05-01

    The discovery of high conductivity in doped polyacetylene in 1977 (garnering the 2000 Nobel Prize in Chemistry for the three discovering scientists) has attracted considerable interest in the application of polymers as semiconducting and conducting materials, owing to their promising potential to replace silicon and metals in building devices. Previous and current efforts in developing conducting polymer microsystems mainly focus on generating a device with a single function. When multiple micropatterns made of different conducting polymers are produced on the same substrate, many microsystems with multiple functions can be envisioned. For example, analogous to the mammalian olfactory system, which includes over 1,000 receptor genes for detecting various odors (e.g., beer, soda, etc.), a sensor consisting of multiple distinct conducting polymer sensing elements will be capable of detecting a number of analytes simultaneously. However, existing techniques present significant technical challenges (degradation, low throughput, low resolution, limited depth of field, and/or residual layers) in producing conducting polymer microstructures. To circumvent these challenges, an intermediate-layer lithography method developed in our group is used to generate multiple micropatterns made of different, commonly used conducting polymers: polypyrrole (PPy), poly(3,4-ethylenedioxy)thiophene (PEDOT), and polyaniline (PANI). The generated multiple micropatterns are further used in an "electronic nose" to detect water vapor, glucose, toluene, and acetone.

  10. Evaluating Moving Target Defense with PLADD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Stephen T.; Outkin, Alexander V.; Gearhart, Jared Lee

    This project evaluates the effectiveness of moving target defense (MTD) techniques using a new game we have designed, called PLADD, inspired by the game FlipIt [28]. PLADD extends FlipIt by incorporating what we believe are key MTD concepts. We have analyzed PLADD and proven the existence of a defender strategy that pushes a rational attacker out of the game, demonstrated how limited the strategies available to an attacker are in PLADD, and derived analytic expressions for the expected utility of the game’s players in multiple game variants. We have created an algorithm for finding a defender’s optimal PLADD strategy. We show that in the special case of achieving deterrence in PLADD, MTD is not always cost effective and that its optimal deployment may shift abruptly from not using MTD at all to using it as aggressively as possible. We believe our effort provides basic, fundamental insights into the use of MTD, but conclude that a truly practical analysis requires model selection and calibration based on real scenarios and empirical data. We propose several avenues for further inquiry, including (1) agents with adaptive capabilities more reflective of real world adversaries, (2) the presence of multiple, heterogeneous adversaries, (3) computational game theory-based approaches such as coevolution to allow scaling to the real world beyond the limitations of analytical analysis and classical game theory, (4) mapping the game to real-world scenarios, (5) taking player risk into account when designing a strategy (in addition to expected payoff), (6) improving our understanding of the dynamic nature of MTD-inspired games by using a martingale representation, defensive forecasting, and techniques from signal processing, and (7) using adversarial games to develop inherently resilient cyber systems.

  11. A two-dimensional composite grid numerical model based on the reduced system for oceanography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Y.F.; Browning, G.L.; Chesshire, G.

    The proper mathematical limit of a hyperbolic system with multiple time scales, the reduced system, is a system that contains no high-frequency motions and is well posed if suitable boundary conditions are chosen for the initial-boundary value problem. The composite grid method, a robust and efficient grid-generation technique that smoothly and accurately treats general irregular boundaries, is used to approximate the two-dimensional version of the reduced system for oceanography on irregular ocean basins. A change-of-variable technique that substantially increases the accuracy of the model and a method for efficiently solving the elliptic equation for the geopotential are discussed. Numerical results are presented for circular and kidney-shaped basins by using a set of analytic solutions constructed in this paper.

  12. Imaging-based molecular barcoding with pixelated dielectric metasurfaces

    NASA Astrophysics Data System (ADS)

    Tittl, Andreas; Leitis, Aleksandrs; Liu, Mingkai; Yesilkoy, Filiz; Choi, Duk-Yong; Neshev, Dragomir N.; Kivshar, Yuri S.; Altug, Hatice

    2018-06-01

    Metasurfaces provide opportunities for wavefront control, flat optics, and subwavelength light focusing. We developed an imaging-based nanophotonic method for detecting mid-infrared molecular fingerprints and implemented it for the chemical identification and compositional analysis of surface-bound analytes. Our technique features a two-dimensional pixelated dielectric metasurface with a range of ultrasharp resonances, each tuned to a discrete frequency; this enables molecular absorption signatures to be read out at multiple spectral points, and the resulting information is then translated into a barcode-like spatial absorption map for imaging. The signatures of biological, polymer, and pesticide molecules can be detected with high sensitivity, covering applications such as biosensing and environmental monitoring. Our chemically specific technique can resolve absorption fingerprints without the need for spectrometry, frequency scanning, or moving mechanical parts, thereby paving the way toward sensitive and versatile miniaturized mid-infrared spectroscopy devices.

  13. A review of microdialysis coupled to microchip electrophoresis for monitoring biological events

    PubMed Central

    Saylor, Rachel A.; Lunte, Susan M.

    2015-01-01

    Microdialysis is a powerful sampling technique that enables monitoring of dynamic processes in vitro and in vivo. The combination of microdialysis with chromatographic or electrophoretic separations, along with selective detection methods, yields a “separation-based sensor” capable of monitoring multiple analytes in near real time. Analysis of microdialysis samples requires techniques that are fast (<1 min), have low volume requirements (nL–pL), and, ideally, can be employed on-line. Microchip electrophoresis fulfills these requirements and also permits the possibility of integrating sample preparation and manipulation with detection strategies directly on-chip. Microdialysis coupled to microchip electrophoresis has been employed for monitoring biological events in vivo and in vitro. This review discusses technical considerations for coupling microdialysis sampling and microchip electrophoresis, including various interface designs, and current applications in the field. PMID:25637011

  14. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
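    A minimal sketch of the Monte Carlo integration technique mentioned above (a generic illustration, not code from the article):

```python
import random

def mc_integrate(f, a, b, n=100_000, seed=0):
    """Estimate the integral of f over [a, b] by uniform random sampling."""
    rng = random.Random(seed)
    total = sum(f(a + (b - a) * rng.random()) for _ in range(n))
    return (b - a) * total / n

# Estimate the integral of x^2 on [0, 1]; the exact value is 1/3.
estimate = mc_integrate(lambda x: x * x, 0.0, 1.0)
```

    The standard error of the estimate shrinks as 1/sqrt(n), independent of dimension, which is why such sampling procedures scale to problems where quadrature does not.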

  15. Practical guidance for conducting mediation analysis with multiple mediators using inverse odds ratio weighting.

    PubMed

    Nguyen, Quynh C; Osypuk, Theresa L; Schmidt, Nicole M; Glymour, M Maria; Tchetgen Tchetgen, Eric J

    2015-03-01

    Despite the recent flourishing of mediation analysis techniques, many modern approaches are difficult to implement or applicable to only a restricted range of regression models. This report provides practical guidance for implementing a new technique utilizing inverse odds ratio weighting (IORW) to estimate natural direct and indirect effects for mediation analyses. IORW takes advantage of the odds ratio's invariance property and condenses information on the odds ratio for the relationship between the exposure (treatment) and multiple mediators, conditional on covariates, by regressing exposure on mediators and covariates. The inverse of the covariate-adjusted exposure-mediator odds ratio association is used to weight the primary analytical regression of the outcome on treatment. The treatment coefficient in such a weighted regression estimates the natural direct effect of treatment on the outcome, and indirect effects are identified by subtracting direct effects from total effects. Weighting renders treatment and mediators independent, thereby deactivating indirect pathways of the mediators. This new mediation technique accommodates multiple discrete or continuous mediators. IORW is easily implemented and is appropriate for any standard regression model, including quantile regression and survival analysis. An empirical example is given using data from the Moving to Opportunity (1994-2002) experiment, testing whether neighborhood context mediated the effects of a housing voucher program on obesity. Relevant Stata code (StataCorp LP, College Station, Texas) is provided. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
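    A minimal numerical sketch of the IORW recipe summarized above, using simulated data with hypothetical effect sizes (the article provides Stata code; this is not that code, and the logistic fit is a bare-bones Newton-Raphson):

```python
import numpy as np

def fit_logistic(X, a, iters=25):
    """Logistic regression via Newton-Raphson; X includes an intercept column."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (a - p)
        hess = (X * (p * (1 - p))[:, None]).T @ X
        beta += np.linalg.solve(hess, grad)
    return beta

def wls(X, y, w):
    """Weighted least squares coefficients."""
    Xw = X * w[:, None]
    return np.linalg.solve(Xw.T @ X, Xw.T @ y)

rng = np.random.default_rng(0)
n = 20_000
C = rng.normal(size=n)                                 # baseline covariate
A = rng.binomial(1, 1 / (1 + np.exp(-0.5 * C)))        # binary exposure
M = 1.0 * A + 0.5 * C + rng.normal(size=n)             # mediator
Y = 2.0 * A + 1.5 * M + 0.5 * C + rng.normal(size=n)   # outcome (NDE=2.0, NIE=1.5)

ones = np.ones(n)
# Step 1: regress exposure on mediator(s) and covariates (logistic).
beta = fit_logistic(np.column_stack([ones, M, C]), A)
# Step 2: inverse odds ratio weights; exposed get 1/OR(M), unexposed get 1.
w = np.where(A == 1, np.exp(-beta[1] * M), 1.0)
# Step 3: weighted outcome regression deactivates the mediated pathway,
# so the treatment coefficient estimates the natural direct effect.
nde = wls(np.column_stack([ones, A, C]), Y, w)[1]
# Total effect from the unweighted regression; NIE = total - NDE.
total = wls(np.column_stack([ones, A, C]), Y, ones)[1]
nie = total - nde
```

    With these simulated effect sizes the total effect is 3.5, so the weighted regression should return a direct effect near 2.0 and leave an indirect effect near 1.5.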

  16. Electron Injections: A Study of Electron Acceleration by Multiple Dipolarizing Flux Bundles Using an Analytical Model

    NASA Astrophysics Data System (ADS)

    Gabrielse, C.; Angelopoulos, V.; Artemyev, A.; Runov, A.; Harris, C.

    2016-12-01

    We study energetic electron injections using an analytical model that self-consistently describes electric and magnetic field perturbations of transient, localized dipolarizing flux bundles (DFBs). Previous studies using THEMIS, Van Allen Probes, and the Magnetospheric Multiscale Mission have shown that injections can occur on short (minutes) or long (10s of minutes) timescales. These studies suggest that the short timescale injections correspond to a single DFB, whereas long timescale injections are likely caused by an aggregate of multiple DFBs, each incrementally heating the particle population. We therefore model the effects of multiple DFBs on the electron population using multi-spacecraft observations of the fields and particle fluxes to constrain the model parameters. The analytical model is the first of its kind to model multiple dipolarization fronts in order to better understand the transport and acceleration process throughout the plasma sheet. It can reproduce most injection signatures at multiple locations simultaneously, reaffirming earlier findings that multiple earthward-traveling DFBs can both transport and accelerate electrons to suprathermal energies, and can thus be considered the injections' primary driver.

  17. Thermoelectrically cooled water trap

    DOEpatents

    Micheels, Ronald H [Concord, MA

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples, prior to analysis of these samples for chemical composition by a variety of analytical techniques in which water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry, and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas-monitoring instrumentation.

  18. An analytical/numerical correlation study of the multiple concentric cylinder model for the thermoplastic response of metal matrix composites

    NASA Technical Reports Server (NTRS)

    Pindera, Marek-Jerzy; Salzar, Robert S.; Williams, Todd O.

    1993-01-01

    The utility of a recently developed analytical micromechanics model for the response of metal matrix composites under thermal loading is illustrated by comparison with the results generated using the finite-element approach. The model is based on the concentric cylinder assemblage consisting of an arbitrary number of elastic or elastoplastic sublayers with isotropic or orthotropic, temperature-dependent properties. The elastoplastic boundary-value problem of an arbitrarily layered concentric cylinder is solved using the local/global stiffness matrix formulation (originally developed for elastic layered media) and Mendelson's iterative technique of successive elastic solutions. These features of the model facilitate efficient investigation of the effects of various microstructural details, such as functionally graded architectures of interfacial layers, on the evolution of residual stresses during cool down. The available closed-form expressions for the field variables can readily be incorporated into an optimization algorithm in order to efficiently identify optimal configurations of graded interfaces for given applications. Comparison of residual stress distributions after cool down generated using finite-element analysis and the present micromechanics model for four composite systems with substantially different temperature-dependent elastic, plastic, and thermal properties illustrates the efficacy of the developed analytical scheme.

  19. Modeling and Analysis of Structural Dynamics for a One-Tenth Scale Model NGST Sunshield

    NASA Technical Reports Server (NTRS)

    Johnston, John; Lienard, Sebastien; Brodeur, Steve (Technical Monitor)

    2001-01-01

    New modeling and analysis techniques have been developed for predicting the dynamic behavior of the Next Generation Space Telescope (NGST) sunshield. The sunshield consists of multiple layers of pretensioned, thin-film membranes supported by deployable booms. Modeling the structural dynamic behavior of the sunshield is a challenging aspect of the problem due to the effects of membrane wrinkling. A finite element model of the sunshield was developed using an approximate engineering approach, the cable network method, to account for membrane wrinkling effects. Ground testing of a one-tenth scale model of the NGST sunshield was carried out to provide data for validating the analytical model. A series of analyses was performed to predict the behavior of the sunshield under the ground test conditions. Modal analyses were performed to predict the frequencies and mode shapes of the test article, and transient response analyses were completed to simulate impulse excitation tests. Comparison was made between analytical predictions and test measurements for the dynamic behavior of the sunshield. In general, the results show good agreement, with the analytical model correctly predicting the approximate frequency and mode shapes for the significant structural modes.
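    Modal analysis of a discrete structural model reduces to the generalized eigenvalue problem K φ = ω² M φ; a toy two-degree-of-freedom sketch (not the sunshield model) illustrates how frequencies and mode shapes are computed:

```python
import numpy as np

# Toy 2-DOF spring-mass chain (wall -- k -- m1 -- k -- m2), unit mass and stiffness.
k, m = 1.0, 1.0
K = np.array([[2 * k, -k],
              [-k,     k]])      # stiffness matrix
M = np.diag([m, m])              # mass matrix

# Solve K @ phi = omega^2 * M @ phi via the symmetric standard form
# M^{-1/2} K M^{-1/2} (M is diagonal here, so symmetry is preserved).
M_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(M)))
eigvals, vecs = np.linalg.eigh(M_inv_sqrt @ K @ M_inv_sqrt)
omegas = np.sqrt(eigvals)        # natural frequencies (rad/s), ascending
modes = M_inv_sqrt @ vecs        # corresponding mode shapes
```

    For this chain the exact squared frequencies are (3 ± √5)/2, which is a convenient analytical check on the numerical eigensolution.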

  20. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories - subjective and analytical - depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques are used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.
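    The three performance measures named above are straightforward to compute; a generic sketch with made-up discharge values (not the Iowa data):

```python
import numpy as np

def performance_measures(baseline, estimated):
    """Compare an estimated ice-affected streamflow record with the baseline record.

    Returns the average estimated discharge for the period, and the mean and
    sample standard deviation of the daily errors (estimated minus baseline).
    """
    baseline = np.asarray(baseline, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    errors = estimated - baseline
    return float(np.mean(estimated)), float(np.mean(errors)), float(np.std(errors, ddof=1))

# Hypothetical daily discharges (cubic feet per second) for an ice-affected period.
baseline  = [120.0, 110.0,  95.0, 90.0, 105.0]
estimated = [125.0, 105.0, 100.0, 85.0, 110.0]
avg_q, mean_err, sd_err = performance_measures(baseline, estimated)
```

    A small mean error with a large standard deviation indicates an estimate that is unbiased on average but inconsistent day to day, which is exactly the distinction the ranking in the paper exploits.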

  1. A Multilevel Multiset Time-Series Model for Describing Complex Developmental Processes

    PubMed Central

    Ma, Xin; Shen, Jianping

    2017-01-01

    The authors sought to develop an analytical platform where multiple sets of time series can be examined simultaneously. This multivariate platform capable of testing interaction effects among multiple sets of time series can be very useful in empirical research. The authors demonstrated that the multilevel framework can readily accommodate this analytical capacity. Given their intention to use the multilevel multiset time-series model to pursue complicated research purposes, their resulting model is relatively simple to specify, to run, and to interpret. These advantages make the adoption of their model relatively effortless as long as researchers have the basic knowledge and skills in working with multilevel growth modeling. With multiple potential extensions of their model, the establishment of this analytical platform for analysis of multiple sets of time series can inspire researchers to pursue far more advanced research designs to address complex developmental processes in reality. PMID:29881094

  2. Optical sensing of analytes in aqueous solutions with a multiple surface-plasmon-polariton-wave platform

    PubMed Central

    Swiontek, Stephen E.; Pulsifer, Drew P.; Lakhtakia, Akhlesh

    2013-01-01

    The commonly used optical sensor based on the surface-plasmon-polariton (SPP) wave phenomenon can sense just one chemical, because only one SPP wave can be guided by the interface of a metal and a dielectric material contained in the sensor. Multiple analytes could be detected, and/or the sensing reliability for a single analyte could be enhanced, if multiple SPP-wave modes could be excited on a single metal/dielectric interface. For that to happen, the partnering dielectric material must be periodically non-homogeneous. Using a chiral sculptured thin film (CSTF) as that material in an SPP-wave platform, we show that the angular locations of multiple SPP-wave modes shift when the void regions of the CSTF are infiltrated with a fluid. The sensitivities realized in the proof-of-concept experiments are comparable to state-of-research values. PMID:23474988

  3. A rapid and high-precision method for sulfur isotope δ(34)S determination with a multiple-collector inductively coupled plasma mass spectrometer: matrix effect correction and applications for water samples without chemical purification.

    PubMed

    Lin, An-Jun; Yang, Tao; Jiang, Shao-Yong

    2014-04-15

    Previous studies have indicated that prior chemical purification of samples, although complex and time-consuming, is essential in obtaining precise and accurate results for sulfur isotope ratios using multiple-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). In this study, we introduce a new, rapid and precise MC-ICP-MS method for sulfur isotope determination from water samples without chemical purification. The analytical work was performed on an MC-ICP-MS instrument with medium mass resolution (m/Δm ~ 3000). Standard-sample bracketing (SSB) was used to correct samples throughout the analytical sessions. Reference materials included an Alfa-S (ammonium sulfate) standard solution, ammonium sulfate provided by the authors' laboratory, and fresh seawater from the South China Sea. A range of matrix-matched Alfa-S standard solutions and ammonium sulfate solutions was used to investigate the matrix (salinity) effect (matrix was added in the form of NaCl). A seawater sample was used to confirm the reliability of the method. Using matrix-matched (salinity-matched) Alfa-S as the working standard, the measured δ(34)S value of AS (-6.73 ± 0.09‰) was consistent with the reference value (-6.78 ± 0.07‰) within the uncertainty, suggesting that this method could be recommended for the measurement of water samples without prior chemical purification. The δ(34)S determination for the unpurified seawater also yielded excellent results (21.03 ± 0.18‰) consistent with the reference value (20.99‰), thus confirming the feasibility of the technique. The data and the results indicate that it is feasible to use MC-ICP-MS and matrix-matched working standards to measure the sulfur isotopic compositions of water samples directly, without chemical purification. In comparison with existing MC-ICP-MS techniques, the new method is better suited for directly measuring δ(34)S values in water samples with complex matrices; therefore, it can significantly accelerate analytical turnover. Copyright © 2014 John Wiley & Sons, Ltd.
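    The δ(34)S values and the standard-sample bracketing correction follow simple formulas; a sketch with hypothetical isotope ratios (not the article's data):

```python
def delta_permil(r_sample, r_reference):
    """Delta notation: per-mil deviation of an isotope ratio from a reference ratio."""
    return (r_sample / r_reference - 1.0) * 1000.0

def ssb_correct(r_sample, r_std_before, r_std_after, delta_std=0.0):
    """Standard-sample bracketing: normalize a measured sample ratio by the mean
    of the bracketing standard measurements (to cancel instrumental drift), then
    place the result on the scale of the standard's accepted delta value."""
    r_std_mean = 0.5 * (r_std_before + r_std_after)
    return ((r_sample / r_std_mean) * (1.0 + delta_std / 1000.0) - 1.0) * 1000.0

# Hypothetical 34S/32S ratios: the standard drifts slightly around the sample run.
d34s = ssb_correct(0.045205, 0.044300, 0.044320, delta_std=0.0)
```

    Because the sample ratio is divided by the interpolated standard ratio, any slow multiplicative drift in instrumental mass bias cancels to first order.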

  4. Group specific internal standard technology (GSIST) for simultaneous identification and quantification of small molecules

    DOEpatents

    Adamec, Jiri; Yang, Wen-Chu; Regnier, Fred E

    2014-01-14

    Reagents and methods are provided that permit simultaneous analysis of multiple diverse small molecule analytes present in a complex mixture. Samples are labeled with chemically identical but isotopically distinct forms of the labeling reagent and analyzed using mass spectrometry. A single reagent simultaneously derivatizes multiple small molecule analytes having different reactive functional groups.

  5. LoyalTracker: Visualizing Loyalty Dynamics in Search Engines.

    PubMed

    Shi, Conglei; Wu, Yingcai; Liu, Shixia; Zhou, Hong; Qu, Huamin

    2014-12-01

    The huge amount of user log data collected by search engine providers creates new opportunities to understand user loyalty and defection behavior at an unprecedented scale. However, it also poses a great challenge to analyze the behavior and glean insights from such complex, large-scale data. In this paper, we introduce LoyalTracker, a visual analytics system to track user loyalty and switching behavior towards multiple search engines from the vast amount of user log data. We propose a new interactive visualization technique (flow view) based on a flow metaphor, which conveys a proper visual summary of the dynamics of user loyalty of thousands of users over time. Two other visualization techniques, a density map and a word cloud, are integrated to enable analysts to gain further insights into the patterns identified by the flow view. Case studies and interviews with domain experts are conducted to demonstrate the usefulness of our technique in understanding user loyalty and switching behavior in search engines.

  6. Identifying Nanoscale Structure-Function Relationships Using Multimodal Atomic Force Microscopy, Dimensionality Reduction, and Regression Techniques.

    PubMed

    Kong, Jessica; Giridharagopal, Rajiv; Harrison, Jeffrey S; Ginger, David S

    2018-05-31

    Correlating nanoscale chemical specificity with operational physics is a long-standing goal of functional scanning probe microscopy (SPM). We employ a data analytic approach combining multiple microscopy modes, using compositional information in infrared vibrational excitation maps acquired via photoinduced force microscopy (PiFM) together with electrical information from conductive atomic force microscopy. We study a model polymer blend comprising insulating poly(methyl methacrylate) (PMMA) and semiconducting poly(3-hexylthiophene) (P3HT). We show that PiFM spectra differ from FTIR spectra but can still be used to identify local composition. We use principal component analysis to extract statistically significant principal components and principal component regression to predict local current and identify local polymer composition. In doing so, we observe evidence of semiconducting P3HT within PMMA aggregates. These methods are generalizable to correlated SPM data and provide a meaningful technique for extracting complex compositional information that is impossible to measure with any one technique.
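    A minimal sketch of the principal-component-regression pipeline described above, with synthetic two-component "spectra" standing in for the PiFM maps (all names and effect sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic spectra: each sample mixes two Gaussian absorption bands plus noise,
# and the measured response (e.g., local current) depends on the mixing fraction.
n_samples, n_channels = 200, 64
channels = np.arange(n_channels)
comp_a = np.exp(-0.5 * ((channels - 20) / 4.0) ** 2)   # band at channel 20
comp_b = np.exp(-0.5 * ((channels - 45) / 4.0) ** 2)   # band at channel 45
frac = rng.uniform(0, 1, n_samples)                    # local composition
X = (np.outer(frac, comp_a) + np.outer(1 - frac, comp_b)
     + 0.02 * rng.normal(size=(n_samples, n_channels)))
y = 3.0 * frac + 0.05 * rng.normal(size=n_samples)     # response variable

# Dimensionality reduction: PCA via SVD of the mean-centered spectra.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T             # keep the two leading principal components

# Principal component regression: ordinary least squares on the PC scores.
G = np.column_stack([np.ones(n_samples), scores])
coef, *_ = np.linalg.lstsq(G, y, rcond=None)
y_hat = G @ coef
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
```

    Regressing on a few orthogonal scores rather than all 64 correlated channels stabilizes the fit, which is the same motivation the authors give for PCR over per-channel regression.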

  7. Coupling Front-End Separations, Ion Mobility Spectrometry, and Mass Spectrometry For Enhanced Multidimensional Biological and Environmental Analyses

    PubMed Central

    Zheng, Xueyun; Wojcik, Roza; Zhang, Xing; Ibrahim, Yehia M.; Burnum-Johnson, Kristin E.; Orton, Daniel J.; Monroe, Matthew E.; Moore, Ronald J.; Smith, Richard D.; Baker, Erin S.

    2017-01-01

    Ion mobility spectrometry (IMS) is a widely used analytical technique for rapid molecular separations in the gas phase. Though IMS alone is useful, its coupling with mass spectrometry (MS) and front-end separations is extremely beneficial for increasing measurement sensitivity, peak capacity of complex mixtures, and the scope of molecular information available from biological and environmental sample analyses. In fact, multiple disease screening and environmental evaluations have illustrated that the IMS-based multidimensional separations extract information that cannot be acquired with each technique individually. This review highlights three-dimensional separations using IMS-MS in conjunction with a range of front-end techniques, such as gas chromatography, supercritical fluid chromatography, liquid chromatography, solid-phase extractions, capillary electrophoresis, field asymmetric ion mobility spectrometry, and microfluidic devices. The origination, current state, various applications, and future capabilities of these multidimensional approaches are described in detail to provide insight into their uses and benefits. PMID:28301728

  8. Analytical methods in multivariate highway safety exposure data estimation

    DOT National Transportation Integrated Search

    1984-01-01

    Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting and the expectation maximizati...
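
    Of the techniques named, iterative proportional fitting is the most compact to illustrate: a seed table is alternately rescaled until its row and column totals match target margins. A minimal sketch with invented numbers:

```python
import numpy as np

def ipf(seed, row_targets, col_targets, max_iter=1000, tol=1e-9):
    """Iterative proportional fitting: alternately rescale rows and columns
    of a seed table until both sets of margins match the targets."""
    T = np.asarray(seed, dtype=float).copy()
    for _ in range(max_iter):
        T *= (row_targets / T.sum(axis=1))[:, None]   # fit row margins
        T *= (col_targets / T.sum(axis=0))[None, :]   # fit column margins
        if np.allclose(T.sum(axis=1), row_targets, atol=tol):
            break
    return T

# Invented example: a 2x2 exposure seed table adjusted to known totals.
seed = np.array([[1.0, 2.0],
                 [3.0, 4.0]])
fitted = ipf(seed, row_targets=np.array([30.0, 70.0]),
             col_targets=np.array([40.0, 60.0]))
```

    The fitted table keeps the interaction structure of the seed while reproducing the target margins, which is why IPF is useful for extending partial exposure data.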

  9. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques for forecasting air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis, and analytical. The differences between these methods arise, in part, from the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.

  10. Using multiple isotopes to understand the source of ingredients used in golden beverages

    NASA Astrophysics Data System (ADS)

    Wynn, J. G.

    2011-12-01

    Traditionally, beer contains four simple ingredients: water, barley, hops, and yeast. Each of the ingredients used in the brewing process contributes some combination of a number of "traditional" stable isotopes (i.e., isotopes of H, C, O, N and S) to the final product. As an educational exercise in an "Analytical Techniques in Geology" course, a group of students analyzed the isotopic composition of the gas, liquid and solid phases of a variety of beer samples collected from throughout the world (along with other beverages). The hydrogen and oxygen isotopic composition of the water closely followed the isotopic composition of local meteoric water at the source of the brewery, although there is a systematic offset from the global meteoric water line that may be due to the effects of CO2-H2O equilibration. The carbon isotopic composition of the CO2 reflected that of the solid residue (the source of carbon used as a fermentation substrate), but may potentially be modified by addition of gas-phase CO2 from an inorganic source. The carbon isotopic composition of the solid residue similarly tracks that of the fermentation substrate, and may indicate some alcohol fermented from added sugars in some cases. The nitrogen isotopic composition of the solid residue was relatively constant, and may track the source of nitrogen in the barley, hops and yeast. Each of the analytical methods used is a relatively standard technique in geological applications, which made this a "fun" exercise for those involved and gave the students hands-on experience with a variety of analytes from a non-traditional sample material.

  11. A shipboard comparison of analytic methods for ballast water compliance monitoring

    NASA Astrophysics Data System (ADS)

    Bradie, Johanna; Broeg, Katja; Gianoli, Claudio; He, Jianjun; Heitmüller, Susanne; Curto, Alberto Lo; Nakata, Akiko; Rolke, Manfred; Schillak, Lothar; Stehouwer, Peter; Vanden Byllaardt, Julie; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Promising approaches for indicative analysis of ballast water samples have been developed that require study in the field to examine their utility for determining compliance with the International Convention for the Control and Management of Ships' Ballast Water and Sediments. To address this gap, a voyage was undertaken on board the RV Meteor, sailing the North Atlantic Ocean from Mindelo (Cape Verde) to Hamburg (Germany) during June 4-15, 2015. Trials were conducted on local sea water taken up by the ship's ballast system at multiple locations along the trip, including open ocean, North Sea, and coastal water, to evaluate a number of analytic methods that measure the numeric concentration or biomass of viable organisms according to two size categories (≥ 50 μm in minimum dimension: 7 techniques, ≥ 10 μm and < 50 μm: 9 techniques). Water samples were analyzed in parallel to determine whether results were similar between methods and whether rapid, indicative methods offer comparable results to standard, time- and labor-intensive detailed methods (e.g. microscopy) and high-end scientific approaches (e.g. flow cytometry). Several promising indicative methods were identified that showed high correlation with microscopy, but allow much quicker processing and require less expert knowledge. This study is the first to concurrently use a large number of analytic tools to examine a variety of ballast water samples on board an operational ship in the field. Results are useful to identify the merits of each method and can serve as a basis for further improvement and development of tools and methodologies for ballast water compliance monitoring.

  12. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
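
    The distinction drawn above between incremental, rolling-window, and global analytic scope can be illustrated with a toy stream statistic; the class and window size below are illustrative only, not part of the reference architecture:

```python
from collections import deque

class StreamStats:
    """Maintains a global (incremental) mean and a rolling-window mean
    over a stream of numeric values, updated one value at a time."""
    def __init__(self, window):
        self.n = 0
        self.total = 0.0
        self.window = deque(maxlen=window)   # drops oldest value automatically

    def push(self, x):
        self.n += 1
        self.total += x
        self.window.append(x)

    def global_mean(self):
        return self.total / self.n           # scope: entire stream so far

    def rolling_mean(self):
        return sum(self.window) / len(self.window)   # scope: last `window` values

s = StreamStats(window=3)
for x in [1, 2, 3, 10, 10, 10]:
    s.push(x)
```

    After the six values above, the global mean is 6.0 while the rolling-window mean is 10.0: the same incoming data, summarized at two different analytic scopes, which is exactly the accommodation the architecture must make.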

  13. Guided Text Search Using Adaptive Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Symons, Christopher T; Senter, James K

    This research demonstrates the promise of augmenting interactive visualizations with semi-supervised machine learning techniques to improve the discovery of significant associations and insights in the search and analysis of textual information. More specifically, we have developed a system called Gryffin that hosts a unique collection of techniques that facilitate individualized investigative search pertaining to an ever-changing set of analytical questions over an indexed collection of open-source documents related to critical national infrastructure. The Gryffin client hosts dynamic displays of the search results via focus+context record listings, temporal timelines, term-frequency views, and multiple coordinated views. Furthermore, as the analyst interacts with the display, the interactions are recorded and used to label the search records. These labeled records are then used to drive semi-supervised machine learning algorithms that re-rank the unlabeled search records such that potentially relevant records are moved to the top of the record listing. Gryffin is described in the context of the daily tasks encountered at the US Department of Homeland Security's Fusion Center, with whom we are collaborating in its development. The resulting system is capable of addressing the analysts' information overload that can be directly attributed to the deluge of information that must be addressed in the search and investigative analysis of textual information.

  14. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    PubMed

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  15. Modelling by partial least squares the relationship between the HPLC mobile phases and analytes on phenyl column.

    PubMed

    Markopoulou, Catherine K; Kouskoura, Maria G; Koundourellis, John E

    2011-06-01

    Twenty-five descriptors and 61 structurally different analytes have been used in a partial least squares (PLS) projection to latent structures technique in order to study chromatographically their interaction mechanism on a phenyl column. According to the model, 240 different retention times of the analytes, expressed as the Y variable (log k), at different % MeOH mobile-phase concentrations have been correlated with their theoretically most important structural or molecular descriptors. The goodness of fit was estimated by the coefficient of multiple determination r² (0.919) and the root mean square error of estimation (RMSEE = 0.1283), with a predictive ability (Q²) of 0.901. The model was further validated using cross-validation (CV), validated by 20 response permutations (r² (0.0, 0.0146), Q² (0.0, -0.136)), and validated by external prediction. The contribution of certain mechanism interactions between the analytes, the mobile phase and the column, proportional or counterbalancing, is also studied. To evaluate the influence on Y of every variable in the PLS model, the VIP (variable importance in the projection) plot provides evidence that lipophilicity (expressed as Log D and Log P), polarizability, refractivity and the eluting power of the mobile phase are dominant in the retention mechanism on a phenyl column. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
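
    A minimal sketch of single-response PLS (the NIPALS algorithm) on synthetic descriptor data may help make the modelling step concrete. The descriptor count, coefficients, and noise level below are invented for the illustration, not taken from the paper:

```python
import numpy as np

def pls1(X, y, n_components):
    """Minimal PLS1 (NIPALS) for a single response: extract latent
    components, deflate, and assemble the regression coefficient vector."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xr, yr = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xr.T @ yr
        w /= np.linalg.norm(w)          # weight vector
        t = Xr @ w                      # scores
        tt = t @ t
        p = Xr.T @ t / tt               # X loadings
        qk = (yr @ t) / tt              # y loading
        Xr = Xr - np.outer(t, p)        # deflate X
        yr = yr - qk * t                # deflate y
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.inv(P.T @ W) @ q  # coefficients in original X space
    return x_mean, y_mean, B

rng = np.random.default_rng(1)
X = rng.normal(size=(61, 5))            # toy descriptors (e.g. logP-like)
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2] + 0.05 * rng.normal(size=61)

x_mean, y_mean, B = pls1(X, y, n_components=4)
pred = (X - x_mean) @ B + y_mean
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

    The fitted coefficients recover the descriptors that actually drive the toy retention response, mirroring how the VIP analysis in the paper singles out lipophilicity and eluting power.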

  16. Holographic Reciprocity Law Failure, with Applications to the Three-Dimensional Display of Medical Data

    NASA Astrophysics Data System (ADS)

    Johnson, Kristina Mary

    In 1973 the computerized tomography (CT) scanner revolutionized medical imaging. This machine can isolate and display, in two-dimensional cross-sections, internal lesions and organs previously impossible to visualize. The possibility of three-dimensional imaging, however, is not yet exploited by present tomographic systems. Using multiple-exposure holography, three-dimensional displays can be synthesized from two-dimensional CT cross-sections. A multiple-exposure hologram is an incoherent superposition of many individual holograms. Intuitively, it is expected that holograms recorded with equal energy will reconstruct images with equal brightness. It is found, however, that holograms recorded first are brighter than holograms recorded later in the superposition. This phenomenon is called Holographic Reciprocity Law Failure (HRLF). Computer simulations of latent image formation in multiple-exposure holography are one of the methods used to investigate HRLF. These simulations indicate that it is the time between individual exposures in the multiple-exposure hologram that is responsible for HRLF. This physical parameter introduces an asymmetry into the latent image formation process that favors the signal of previously recorded holograms over holograms recorded later in the superposition. The origin of this asymmetry lies in the dynamics of latent image formation, and in particular in the decay of single-atom latent image specks, which have lifetimes that are short compared to typical times between exposures. An analytical model is developed for a double exposure hologram that predicts a decrease in the brightness of the second exposure as compared to the first exposure as the time between exposures increases. These results are consistent with the computer simulations.
Experiments investigating the influence of this parameter on the diffraction efficiency of reconstructed images in a double exposure hologram are also found to be consistent with the computer simulations and analytical results. From this information, two techniques are presented that correct for HRLF, and succeed in reconstructing multiple holographic images of CT cross-sections with equal brightness. The multiple multiple-exposure hologram is a new hologram that increases the number of equally bright images that can be superimposed on one photographic plate.

  17. Multiple reaction monitoring with multistage fragmentation (MRM3) detection enhances selectivity for LC-MS/MS analysis of plasma free metanephrines.

    PubMed

    Wright, Michael J; Thomas, Rebecca L; Stanford, Phoebe E; Horvath, Andrea R

    2015-03-01

    LC-MS/MS with multiple reaction monitoring (MRM) is a powerful tool for quantifying target analytes in complex matrices. However, the technique lacks selectivity when plasma free metanephrines are measured. We propose the use of multistage fragmentation (MRM(3)) to improve the analytical selectivity of plasma free metanephrine measurement. Metanephrines were extracted from plasma with weak cation exchange solid-phase extraction before separation by hydrophilic interaction liquid chromatography. We quantified normetanephrine and metanephrine by either MRM or MRM(3) transitions m/z 166→134→79 and m/z 180→149→121, respectively. Over a 6-month period, approximately 1% (n = 21) of patient samples showed uncharacterized coeluting substances that interfered with the routine assay, resulting in an inability to report results. Quantification with MRM(3) removed these interferences and enabled measurement of the target compounds. For patient samples unaffected by interferences, Deming regression analysis demonstrated a correlation between MRM(3) and MRM methods of y = 1.00x - 0.00 nmol/L for normetanephrine and y = 0.99x + 0.03 nmol/L for metanephrine. Between the MRM(3) method and the median of all LC-MS/MS laboratories enrolled in a quality assurance program, the correlations were y = 0.97x + 0.03 nmol/L for normetanephrine and y = 1.03x - 0.04 nmol/L for metanephrine. Imprecision for the MRM(3) method was 6.2%-7.0% for normetanephrine and 6.1%-9.9% for metanephrine (n = 10). The lower limits of quantification for the MRM(3) method were 0.20 nmol/L for normetanephrine and 0.16 nmol/L for metanephrine. The use of MRM(3) technology improves the analytical selectivity of plasma free metanephrine quantification by LC-MS/MS while demonstrating sufficient analytical sensitivity and acceptable imprecision. © 2014 American Association for Clinical Chemistry.
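
    The method-comparison fits quoted above use Deming regression, which, unlike ordinary least squares, allows for measurement error in both methods. A minimal sketch under the usual equal-error-variance assumption, with simulated (invented) paired concentrations standing in for the MRM and MRM(3) results:

```python
import numpy as np

def deming(x, y):
    """Deming regression assuming equal error variance in both methods,
    as commonly used in method-comparison studies."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    sxx = np.sum((x - mx) ** 2)
    syy = np.sum((y - my) ** 2)
    sxy = np.sum((x - mx) * (y - my))
    slope = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4.0 * sxy ** 2)) / (2.0 * sxy)
    intercept = my - slope * mx
    return slope, intercept

rng = np.random.default_rng(2)
true = rng.uniform(0.1, 2.0, size=50)          # invented nmol/L concentrations
x = true + rng.normal(scale=0.02, size=50)     # "method A" measurements
y = true + rng.normal(scale=0.02, size=50)     # "method B" measurements
slope, intercept = deming(x, y)
```

    When the two methods agree, the slope comes out near 1 and the intercept near 0, which is the pattern reported for MRM(3) versus MRM.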

  18. Self-interaction of NPM1 modulates multiple mechanisms of liquid-liquid phase separation.

    PubMed

    Mitrea, Diana M; Cika, Jaclyn A; Stanley, Christopher B; Nourse, Amanda; Onuchic, Paulo L; Banerjee, Priya R; Phillips, Aaron H; Park, Cheon-Gil; Deniz, Ashok A; Kriwacki, Richard W

    2018-02-26

    Nucleophosmin (NPM1) is an abundant, oligomeric protein in the granular component of the nucleolus with roles in ribosome biogenesis. Pentameric NPM1 undergoes liquid-liquid phase separation (LLPS) via heterotypic interactions with nucleolar components, including ribosomal RNA (rRNA) and proteins that display multivalent arginine-rich linear motifs (R-motifs), and is integral to the liquid-like nucleolar matrix. Here we show that NPM1 can also undergo LLPS via homotypic interactions between its polyampholytic intrinsically disordered regions, a mechanism that opposes LLPS via heterotypic interactions. Using a combination of biophysical techniques, including confocal microscopy, SAXS, analytical ultracentrifugation, and single-molecule fluorescence, we describe how conformational changes within NPM1 control valency and switching between the different LLPS mechanisms. We propose that this newly discovered interplay between multiple LLPS mechanisms may influence the direction of vectorial pre-ribosomal particle assembly within, and exit from, the nucleolus as part of the ribosome biogenesis process.

  19. Rural Residents’ Perspectives on Multiple Morbidity Management and Disease Prevention

    PubMed Central

    Bardach, Shoshana H.; Schoenberg, Nancy E.; Tarasenko, Yelena N.; Fleming, Steven T.

    2013-01-01

    Middle-aged and older adults often experience several simultaneously occurring chronic conditions or “multiple morbidity” (MM). The task of both managing MM and preventing chronic conditions can be overwhelming, particularly in populations with high disease burdens, low socioeconomic status, and health care provider shortages. This article sought to understand Appalachian residents’ perspectives on MM management and prevention. Forty-one rural Appalachian residents aged 50 and above with MM were interviewed about disease management and colorectal cancer (CRC) prevention. Transcripts were examined for overall analytic categories and coded using techniques to enhance transferability and rigor. Participants indicate facing various challenges to prevention due, in part, to conditions within their rural environment. Patients and providers spend significant time and energy on MM management, often precluding prevention activities. This article discusses implications of MM management for CRC prevention and strategies to increase disease prevention among this rural, vulnerable population burdened by MM. PMID:23833393

  20. SER Analysis of MPPM-Coded MIMO-FSO System over Uncorrelated and Correlated Gamma-Gamma Atmospheric Turbulence Channels

    NASA Astrophysics Data System (ADS)

    Khallaf, Haitham S.; Garrido-Balsells, José M.; Shalaby, Hossam M. H.; Sampei, Seiichi

    2015-12-01

    The performance of multiple-input multiple-output free-space optical (MIMO-FSO) communication systems that adopt multipulse pulse position modulation (MPPM) techniques is analyzed. Both exact and approximate symbol-error rates (SERs) are derived for the cases of uncorrelated and correlated channels. The effects of background noise, receiver shot noise, and atmospheric turbulence are taken into consideration in our analysis. The random fluctuations of the received optical irradiance produced by the atmospheric turbulence are modeled by the widely used gamma-gamma statistical distribution. Uncorrelated MIMO channels are modeled by the α-μ distribution. A closed-form expression for the probability density function of the received optical irradiance is derived for the case of correlated MIMO channels. Using our analytical expressions, the degradation of system performance as the correlation coefficients between the MIMO channels increase is corroborated.
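
    The unit-mean gamma-gamma irradiance density mentioned above has the closed form f(I) = 2(αβ)^((α+β)/2)/(Γ(α)Γ(β)) · I^((α+β)/2−1) · K_{α−β}(2√(αβI)). A quick numerical sanity check of that density (the α and β values are illustrative, not those of the paper):

```python
import numpy as np
from scipy.special import gamma, kv

def gamma_gamma_pdf(I, alpha, beta):
    """Gamma-gamma irradiance density for unit mean irradiance."""
    ab = alpha * beta
    coeff = 2.0 * ab ** ((alpha + beta) / 2.0) / (gamma(alpha) * gamma(beta))
    return coeff * I ** ((alpha + beta) / 2.0 - 1.0) * kv(alpha - beta,
                                                          2.0 * np.sqrt(ab * I))

# Illustrative turbulence parameters (assumed, for the demo only).
alpha, beta = 4.0, 2.0
I = np.linspace(1e-6, 30.0, 200001)
f = gamma_gamma_pdf(I, alpha, beta)

# Trapezoidal checks: the density should integrate to ~1 with unit mean.
dI = np.diff(I)
area = float(np.sum(0.5 * (f[1:] + f[:-1]) * dI))
mean = float(np.sum(0.5 * ((I * f)[1:] + (I * f)[:-1]) * dI))
```

    Both checks confirm the normalization; in SER analyses this density is what gets averaged over when conditioning the error probability on the turbulent irradiance.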

  1. Real-Time Continuous Identification of Greenhouse Plant Pathogens Based on Recyclable Microfluidic Bioassay System.

    PubMed

    Qu, Xiangmeng; Li, Min; Zhang, Hongbo; Lin, Chenglie; Wang, Fei; Xiao, Mingshu; Zhou, Yi; Shi, Jiye; Aldalbahi, Ali; Pei, Hao; Chen, Hong; Li, Li

    2017-09-20

    The development of a real-time continuous analytical platform for pathogen detection is of great scientific importance for achieving better disease control and prevention. In this work, we report a rapid and recyclable microfluidic bioassay system constructed from oligonucleotide arrays for selective and sensitive continuous identification of DNA targets of fungal pathogens. We employ the thermal denaturation method to effectively regenerate the oligonucleotide arrays for multiple sample detection, which could considerably reduce the screening effort and costs. The combination of thermal denaturation and a laser-induced fluorescence detection technique enables real-time continuous identification of multiple samples (<10 min per sample). As a proof of concept, we have demonstrated that two DNA targets of fungal pathogens (Botrytis cinerea and Didymella bryoniae) can be sequentially analyzed using our rapid microfluidic bioassay system, which provides a new paradigm in the design of microfluidic bioassay systems and will be valuable for chemical and biomedical analysis.

  2. Investigating acoustic-induced deformations in a foam using multiple light scattering.

    PubMed

    Erpelding, M; Guillermic, R M; Dollet, B; Saint-Jalmes, A; Crassous, J

    2010-08-01

    We have studied the effect of an external acoustic wave on bubble displacements inside an aqueous foam. The signature of the acoustic-induced bubble displacements is found using a multiple light scattering technique, and occurs as a modulation on the photon correlation curve. Measurements for various sound frequencies and amplitudes are compared to analytical predictions and numerical simulations. These comparisons finally allow us to elucidate the nontrivial acoustic displacement profile inside the foam; in particular, we find that the acoustic wave creates a localized shear in the vicinity of the solid walls holding the foam, as a consequence of inertial contributions. This study of how bubbles "dance" inside a foam as a response to sound turns out to provide new insights on foam acoustics and sound transmission into a foam, foam deformation at high frequencies, and analysis of light scattering data in samples undergoing nonhomogeneous deformations.

  3. Performance analysis of fiber-based free-space optical communications with coherent detection spatial diversity.

    PubMed

    Li, Kangning; Ma, Jing; Tan, Liying; Yu, Siyuan; Zhai, Chao

    2016-06-10

    The performances of fiber-based free-space optical (FSO) communications over gamma-gamma distributed turbulence are studied for multiple aperture receiver systems. The equal gain combining (EGC) technique is considered as a practical scheme to mitigate the atmospheric turbulence. Bit error rate (BER) performances for binary-phase-shift-keying-modulated coherent detection fiber-based free-space optical communications are derived and analyzed for EGC diversity receptions through an approximation method. To show the net diversity gain of a multiple aperture receiver system, BER performances of EGC are compared with a single monolithic aperture receiver system with the same total aperture area (same average total incident optical power on the aperture surface) for fiber-based free-space optical communications. The analytical results are verified by Monte Carlo simulations. System performances are also compared for EGC diversity coherent FSO communications with or without considering fiber-coupling efficiencies.

  4. Modeling brook trout presence and absence from landscape variables using four different analytical methods

    USGS Publications Warehouse

    Steen, Paul J.; Passino-Reader, Dora R.; Wiley, Michael J.

    2006-01-01

    As a part of the Great Lakes Regional Aquatic Gap Analysis Project, we evaluated methodologies for modeling associations between fish species and habitat characteristics at a landscape scale. To do this, we created brook trout Salvelinus fontinalis presence and absence models based on four different techniques: multiple linear regression, logistic regression, neural networks, and classification trees. The models were tested in two ways: by application to an independent validation database and cross-validation using the training data, and by visual comparison of statewide distribution maps with historically recorded occurrences from the Michigan Fish Atlas. Although differences in the accuracy of our models were slight, the logistic regression model predicted with the least error, followed by multiple regression, then classification trees, then neural networks. These models will provide natural resource managers with a way to identify habitats requiring protection for the conservation of fish species.
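
    Of the four techniques compared, logistic regression for presence/absence data is the simplest to sketch. The following toy fit uses invented landscape predictors and labels, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented landscape predictors (e.g. standardized summer temperature and
# forest cover); presence is more likely where the linear score is high.
X = rng.normal(size=(300, 2))
score = X @ np.array([-1.5, 2.0]) + 0.3 * rng.normal(size=300)
y = (score > 0).astype(int)

def fit_logistic(X, y, lr=0.5, n_iter=3000):
    """Logistic regression fit by batch gradient ascent on the log-likelihood."""
    Xb = np.hstack([np.ones((len(X), 1)), X])    # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))
        w += lr * Xb.T @ (y - p) / len(y)        # gradient of log-likelihood
    return w

def predict_presence(w, X):
    Xb = np.hstack([np.ones((len(X), 1)), X])
    return (Xb @ w > 0).astype(int)              # 0.5 probability threshold

w = fit_logistic(X, y)
accuracy = np.mean(predict_presence(w, X) == y)
```

    Because the fitted model outputs a presence probability per site, the same machinery can generate the statewide distribution maps the study compared against atlas records.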

  5. Transshipment site selection using the AHP and TOPSIS approaches under fuzzy environment.

    PubMed

    Onüt, Semih; Soner, Selin

    2008-01-01

    Site selection is an important issue in waste management. Selection of the appropriate solid waste site requires consideration of multiple alternative solutions and evaluation criteria because of system complexity. Evaluation procedures involve several objectives, and it is often necessary to compromise among possibly conflicting tangible and intangible factors. For these reasons, multiple criteria decision-making (MCDM) has been found to be a useful approach to solve this kind of problem. Different MCDM models have been applied to solve this problem, but most of them are basically mathematical and ignore qualitative and often subjective considerations. It is easier for a decision-maker to describe a value for an alternative by using linguistic terms. In the fuzzy-based method, the rating of each alternative is described using linguistic terms, which can also be expressed as triangular fuzzy numbers. Furthermore, there have not been any studies focused on site selection in waste management using both fuzzy TOPSIS (technique for order preference by similarity to ideal solution) and AHP (analytical hierarchy process) techniques. In this paper, a fuzzy TOPSIS-based methodology is applied to solve the solid waste transshipment site selection problem in Istanbul, Turkey. The criteria weights are calculated by using the AHP.
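
    The AHP weighting step mentioned in the last sentence can be sketched directly: criteria weights are taken from the principal eigenvector of a pairwise-comparison matrix, with a consistency check on the judgments. The comparison values below are hypothetical, not those used for the Istanbul case:

```python
import numpy as np

def ahp_weights(A):
    """Criteria weights from an AHP pairwise-comparison matrix via the
    principal eigenvector; also returns Saaty's consistency ratio."""
    vals, vecs = np.linalg.eig(A)
    k = int(np.argmax(vals.real))               # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                # normalize to sum to 1
    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)           # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # random index
    cr = ci / ri if ri else 0.0                 # consistency ratio
    return w, cr

# Hypothetical pairwise judgments for three siting criteria
# (e.g. cost vs. distance vs. environmental impact) on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
w, cr = ahp_weights(A)
```

    A consistency ratio below 0.1 is conventionally taken to mean the pairwise judgments are coherent enough to use; the resulting weights would then feed the fuzzy TOPSIS ranking of candidate sites.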

  6. Merits and limitations of optimality criteria method for structural optimization

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Guptill, James D.; Berke, Laszlo

    1993-01-01

    The merits and limitations of the optimality criteria (OC) method for the minimum weight design of structures subjected to multiple load conditions under stress, displacement, and frequency constraints were investigated by examining several numerical examples. The examples were solved utilizing the Optimality Criteria Design Code that was developed for this purpose at NASA Lewis Research Center. This OC code incorporates OC methods available in the literature with generalizations for stress constraints, fully utilized design concepts, and hybrid methods that combine both techniques. Salient features of the code include multiple choices for Lagrange multiplier and design variable update methods, design strategies for several constraint types, variable linking, displacement and integrated force method analyzers, and analytical and numerical sensitivities. The performance of the OC method, on the basis of the examples solved, was found to be satisfactory for problems with few active constraints or with small numbers of design variables. For problems with large numbers of behavior constraints and design variables, the OC method appears to follow a subset of active constraints that can result in a heavier design. The computational efficiency of OC methods appears to be similar to some mathematical programming techniques.

  7. Interaction of rippled shock wave with flat fast-slow interface

    NASA Astrophysics Data System (ADS)

    Zhai, Zhigang; Liang, Yu; Liu, Lili; Ding, Juchun; Luo, Xisheng; Zou, Liyong

    2018-04-01

    The evolution of a flat air/sulfur-hexafluoride interface subjected to a rippled shock wave is investigated. Experimentally, the rippled shock wave is produced by diffracting a planar shock wave around solid cylinder(s), and the effects of the cylinder number and the spacing between cylinders on the interface evolution are considered. The flat interface is created by a soap film technique. The postshock flow and the evolution of the shocked interface are captured by a schlieren technique combined with a high-speed video camera. Numerical simulations are performed to provide more details of flows. The wave patterns of a planar shock wave diffracting around one cylinder or two cylinders are studied. The shock stability problem is analytically discussed, and the effects of the spacing between cylinders on shock stability are highlighted. The relationship between the amplitudes of the rippled shock wave and the shocked interface is determined in the single cylinder case. Subsequently, the interface morphologies and growth rates under different cases are obtained. The results show that the shock-shock interactions caused by multiple cylinders have significant influence on the interface evolution. Finally, a modified impulsive theory is proposed to predict the perturbation growth when multiple solid cylinders are present.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onüt, Semih; Soner, Selin

    Site selection is an important issue in waste management. Selection of the appropriate solid waste site requires consideration of multiple alternative solutions and evaluation criteria because of system complexity. Evaluation procedures involve several objectives, and it is often necessary to compromise among possibly conflicting tangible and intangible factors. For these reasons, multiple criteria decision-making (MCDM) has been found to be a useful approach to solve this kind of problem. Different MCDM models have been applied to solve this problem, but most of them are basically mathematical and ignore qualitative and often subjective considerations. It is easier for a decision-maker to describe a value for an alternative by using linguistic terms. In the fuzzy-based method, the rating of each alternative is described using linguistic terms, which can also be expressed as triangular fuzzy numbers. Furthermore, there have not been any studies focused on site selection in waste management using both fuzzy TOPSIS (technique for order preference by similarity to ideal solution) and AHP (analytical hierarchy process) techniques. In this paper, a fuzzy TOPSIS-based methodology is applied to solve the solid waste transshipment site selection problem in Istanbul, Turkey. The criteria weights are calculated by using the AHP.

  9. The NASA Reanalysis Ensemble Service - Advanced Capabilities for Integrated Reanalysis Access and Intercomparison

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2017-12-01

    NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) A full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations. The operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) A cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables. This near real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) A WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following: - New API that supports full temporal, spatial, and grid-based resolution services with sample queries - A Docker-ready RES application to deploy across platforms - Extended capabilities that enable single- and multiple-reanalysis area average, vertical average, re-gridding, standard deviation, and ensemble averages - Convenient, one-stop shopping for commonly used data products from multiple reanalyses including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly) - Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55 and NOAA/ESRL 20CR… - A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management - Supporting analytic services for NASA GMAO Forward Processing datasets - Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products 
(e.g., reanalysis, observational, visualization) - The ability to compute and visualize multiple reanalysis for ease of inter-comparisons - Automated tools to retrieve and prepare data collections for analytic processing
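    Among the operations the abstract lists, the area average admits a compact illustration. The sketch below is plain NumPy with an invented uniform temperature field; it does not use the actual CDSlib API, it only shows the cos-latitude weighting such an operation typically applies so that equal-area grid cells contribute equally:

```python
import numpy as np

def area_average(field, lats_deg):
    """Area-weighted mean of a (lat, lon) field using cos(latitude) weights."""
    w = np.cos(np.deg2rad(np.asarray(lats_deg)))   # weight per latitude row
    w2d = np.broadcast_to(w[:, None], field.shape)
    return float((field * w2d).sum() / w2d.sum())

lats = np.linspace(-90.0, 90.0, 19)                # 10-degree latitude grid
field = np.full((19, 36), 288.0)                   # invented uniform field (K)
avg = area_average(field, lats)                    # a uniform field averages to itself
```

    Pole rows receive (near-)zero weight, so the weighting matters as soon as the field varies with latitude.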

  10. Error determination of a successive correction type objective analysis scheme. [for surface meteorological data]

    NASA Technical Reports Server (NTRS)

    Smith, D. R.; Leslie, F. W.

    1984-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a successive correction type scheme for the analysis of surface meteorological data. The scheme is subjected to a series of experiments to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple pass technique increases the accuracy of the analysis. Furthermore, the tests suggest appropriate values for the analysis parameters in resolving disturbances for the data set used in this investigation.
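    The multiple-pass successive correction idea can be sketched in a few lines. The code below is a generic Cressman-style illustration with invented grid, observations, and influence radii; it is not the PROAM formulation itself. Each pass spreads observation-minus-background increments onto the grid with a distance weight, and later passes use a smaller radius to add detail:

```python
import numpy as np

def successive_correction(grid_xy, obs_xy, obs_val, first_guess, radii):
    analysis = np.asarray(first_guess, float).copy()
    obs_xy = np.asarray(obs_xy, float)
    obs_val = np.asarray(obs_val, float)
    for R in radii:                       # coarse-to-fine passes
        incr = np.zeros_like(analysis)
        wsum = np.zeros_like(analysis)
        for (ox, oy), val in zip(obs_xy, obs_val):
            d2 = (grid_xy[:, 0] - ox) ** 2 + (grid_xy[:, 1] - oy) ** 2
            bg = analysis[np.argmin(d2)]  # background at the obs (nearest node)
            w = np.clip((R * R - d2) / (R * R + d2), 0.0, None)  # Cressman weight
            incr += w * (val - bg)
            wsum += w
        analysis += np.where(wsum > 0, incr / np.maximum(wsum, 1e-12), 0.0)
    return analysis

grid = np.array([[float(i), 0.0] for i in range(11)])   # 11 nodes along x
ana = successive_correction(grid, [(5.0, 0.0)], [2.0], np.zeros(11), radii=[4.0, 2.0])
```

    After the first pass the analysis already matches the observation at the nearest node, so the second, smaller-radius pass contributes only residual detail; nodes outside every influence radius keep the first guess.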

  11. Miniature quadrupole mass spectrometer having a cold cathode ionization source

    DOEpatents

    Felter, Thomas E.

    2002-01-01

    An improved quadrupole mass spectrometer is described. The improvement lies in replacing the conventional hot-filament electron source with a cold cathode field emitter array (FEA), which in turn allows operating a small QMS at much higher internal pressures than are currently achievable. By eliminating the hot filament, such problems as thermally "cracking" delicate analyte molecules, outgassing of a "hot" filament, high power requirements, filament contamination by outgassed species, and spurious em fields are avoided altogether. In addition, the ability to produce FEAs using well-known and well-developed photolithographic techniques permits building a QMS having multiple redundancies of the ionization source at very low additional cost.

  12. Numerical analysis of the photo-injection time-of-flight curves in molecularly doped polymers

    NASA Astrophysics Data System (ADS)

    Tyutnev, A. P.; Ikhsanov, R. Sh.; Saenko, V. S.; Nikerov, D. V.

    2018-03-01

    We have performed a numerical analysis of charge carrier transport in a specific molecularly doped polymer using the multiple trapping model. The computations covered a wide range of applied electric fields, temperatures, and, most importantly, initial energies of the photo-injected one-sign carriers (in our case, holes). Special attention has been given to comparing time-of-flight curves measured by the photo-injection and radiation-induced techniques, which has revealed a problematic situation concerning the interpretation of the experimental data. Computational results have been compared with both analytical and experimental results available in the literature.

  13. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  14. Investigations of the historic textiles excavated from Ancient Ainos (Enez - Turkey) by multiple analytical techniques

    NASA Astrophysics Data System (ADS)

    Akyuz, Sevim; Akyuz, Tanil; Cakan, Banu; Basaran, Sait

    2014-09-01

    Some metal ornamented textile specimens and a textile button, excavated from Ancient Ainos (Enez - Turkey), have been investigated using FTIR and EDXRF spectrometry, for the purpose of material identification. FTIR spectral results indicated that textiles were made from partially degummed Bombyx mori silk. The IR spectral investigation of the textile button revealed that some cellulose fillings were used inside the button. The EDXRF analysis of the metal ornaments showed that they were silver plated copper. Surface morphology of the textiles and the metal ornaments were investigated by SEM images. It was shown that textile fibers were highly degraded.

  15. General statistical considerations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eberhardt, L L; Gilbert, R O

    From NAEG plutonium environmental studies program meeting; Las Vegas, Nevada, USA (2 Oct 1973). The high sampling variability encountered in environmental plutonium studies along with high analytical costs makes it very important that efficient soil sampling plans be used. However, efficient sampling depends on explicit and simple statements of the objectives of the study. When there are multiple objectives it may be difficult to devise a wholly suitable sampling scheme. Sampling for long-term changes in plutonium concentration in soils may also be complex and expensive. Further attention to problems associated with compositing samples is recommended, as is the consistent use of random sampling as a basic technique. (auth)

  16. Modeling fructose-load-induced hepatic de-novo lipogenesis by model simplification.

    PubMed

    Allen, Richard J; Musante, Cynthia J

    2017-01-01

    Hepatic de-novo lipogenesis is a metabolic process implemented in the pathogenesis of type 2 diabetes. Clinically, the rate of this process can be ascertained by use of labeled acetate and stimulation by fructose administration. A systems pharmacology model of this process is desirable because it facilitates the description, analysis, and prediction of this experiment. Due to the multiple enzymes involved in de-novo lipogenesis, and the limited data, it is desirable to use single functional expressions to encapsulate the flux between multiple enzymes. To accomplish this we developed a novel simplification technique which uses the available information about the properties of the individual enzymes to bound the parameters of a single governing 'transfer function'. This method should be applicable to any model with linear chains of enzymes that are well stimulated. We validated this approach with computational simulations and analytical justification in a limiting case. Using this technique we generated a simple model of hepatic de-novo lipogenesis in these experimental conditions that matched prior data. This model can be used to assess pharmacological intervention at specific points on this pathway. We have demonstrated this with prospective simulation of acetyl-CoA carboxylase inhibition. This simplification technique suggests how the constituent properties of an enzymatic chain of reactions gives rise to the sensitivity (to substrate) of the pathway as a whole.
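    The bounding idea can be illustrated on the simplest linear chain. The toy simulation below (invented rate constants, forward-Euler integration; not the paper's model) shows that a two-step chain A -> B -> C with k1 >> k2 behaves almost exactly like a single first-order "transfer function" A -> C governed by the slower rate, which is what licenses collapsing a well-stimulated chain into one expression:

```python
def simulate_chain(k1, k2, a0=1.0, dt=1e-3, t_end=2.0):
    """Two sequential first-order steps A -> B -> C, forward Euler."""
    a, b, c = a0, 0.0, 0.0
    for _ in range(int(t_end / dt)):
        da, db, dc = -k1 * a, k1 * a - k2 * b, k2 * b
        a, b, c = a + da * dt, b + db * dt, c + dc * dt
    return c

def simulate_single(k, a0=1.0, dt=1e-3, t_end=2.0):
    """Single first-order 'transfer function' A -> C."""
    a, c = a0, 0.0
    for _ in range(int(t_end / dt)):
        a, c = a - k * a * dt, c + k * a * dt
    return c

chain = simulate_chain(50.0, 0.5)   # fast first step, slow second step
single = simulate_single(0.5)       # one step at the slower rate
```

    The two product curves agree to within a few parts per thousand at t = 2, so the effective rate of the reduced model is bounded by (and here essentially equal to) the slowest constituent step.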

  17. Modeling fructose-load-induced hepatic de-novo lipogenesis by model simplification

    PubMed Central

    Allen, Richard J; Musante, Cynthia J

    2017-01-01

    Hepatic de-novo lipogenesis is a metabolic process implemented in the pathogenesis of type 2 diabetes. Clinically, the rate of this process can be ascertained by use of labeled acetate and stimulation by fructose administration. A systems pharmacology model of this process is desirable because it facilitates the description, analysis, and prediction of this experiment. Due to the multiple enzymes involved in de-novo lipogenesis, and the limited data, it is desirable to use single functional expressions to encapsulate the flux between multiple enzymes. To accomplish this we developed a novel simplification technique which uses the available information about the properties of the individual enzymes to bound the parameters of a single governing ‘transfer function’. This method should be applicable to any model with linear chains of enzymes that are well stimulated. We validated this approach with computational simulations and analytical justification in a limiting case. Using this technique we generated a simple model of hepatic de-novo lipogenesis in these experimental conditions that matched prior data. This model can be used to assess pharmacological intervention at specific points on this pathway. We have demonstrated this with prospective simulation of acetyl-CoA carboxylase inhibition. This simplification technique suggests how the constituent properties of an enzymatic chain of reactions gives rise to the sensitivity (to substrate) of the pathway as a whole. PMID:28469410

  18. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

    Background Inflammatory mediators can serve as biomarkers for the monitoring of the disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the level of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified using the LI-COR (R) detection system (fluorescence based) rather than chemiluminescence and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity up to 17 pg/ml and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily to the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints in sample sizes and/or budget. PMID:25022797

  19. Antimicrobial residues in animal waste and water resources proximal to large-scale swine and poultry feeding operations

    USGS Publications Warehouse

    Campagnolo, E.R.; Johnson, K.R.; Karpati, A.; Rubin, C.S.; Kolpin, D.W.; Meyer, M.T.; Esteban, J. Emilio; Currier, R.W.; Smith, K.; Thu, K.M.; McGeehin, M.

    2002-01-01

    Expansion and intensification of large-scale animal feeding operations (AFOs) in the United States has resulted in concern about environmental contamination and its potential public health impacts. The objective of this investigation was to obtain background data on a broad profile of antimicrobial residues in animal wastes and surface water and groundwater proximal to large-scale swine and poultry operations. The samples were measured for antimicrobial compounds using both radioimmunoassay and liquid chromatography/electrospray ionization-mass spectrometry (LC/ESI-MS) techniques. Multiple classes of antimicrobial compounds (commonly at concentrations of >100 μg/l) were detected in swine waste storage lagoons. In addition, multiple classes of antimicrobial compounds were detected in surface and groundwater samples collected proximal to the swine and poultry farms. This information indicates that animal waste used as fertilizer for crops may serve as a source of antimicrobial residues for the environment. Further research is required to determine if the levels of antimicrobials detected in this study are of consequence to human and/or environmental ecosystems. A comparison of the radioimmunoassay and LC/ESI-MS analytical methods documented that radioimmunoassay techniques were only appropriate for measuring residues in animal waste samples likely to contain high levels of antimicrobials. More sensitive LC/ESI-MS techniques are required in environmental samples, where low levels of antimicrobial residues are more likely.

  20. Determination of double bond conversion in dental resins by near infrared spectroscopy.

    PubMed

    Stansbury, J W; Dickens, S H

    2001-01-01

    This study determined the validity and practicality of near-infrared (NIR) spectroscopic techniques for measurement of conversion in dental resins. Conversion measurements by NIR and mid-IR were compared using two techniques: (1) the conversion of 3 mm thick photopolymerized Bis-GMA/TEGDMA resin specimens was determined by transmission NIR; specimens were then ground and reanalyzed in KBr pellet form by mid-IR. (2) As further verification, thin resin films were photocured and analyzed by mid-IR; multiple thin films were then compressed into a thick pellet for examination by NIR. Conversion values obtained by NIR and mid-IR techniques did not differ significantly. A correction for changing specimen thickness due to polymerization shrinkage was applied to the NIR conversion measurements, since an internal standard reference peak was not employed. The sensitivity of the NIR technique was superior to that of the mid-IR-based techniques. The nondestructive analysis of conversion in dental resins by NIR offers advantages of convenience, practical specimen dimensions, and precision compared with standard mid-IR analytical procedures. Because glass is virtually transparent in the NIR spectrum, this technique has excellent potential for use with filled dental resins as well.
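    The thickness correction the abstract mentions can be made concrete. Without an internal reference peak, the monomer absorbance must be scaled by specimen thickness before and after polymerization shrinkage; the sketch below uses invented absorbance and thickness values, not the paper's data:

```python
def degree_of_conversion(abs_uncured, t_uncured, abs_cured, t_cured):
    """Fractional double-bond conversion from thickness-normalized absorbances."""
    return 1.0 - (abs_cured / t_cured) / (abs_uncured / t_uncured)

# e.g. absorbance falls from 0.80 to 0.28 while thickness shrinks 3.00 -> 2.91 mm
dc = degree_of_conversion(0.80, 3.00, 0.28, 2.91)
```

    Ignoring the shrinkage (using 3.00 mm in both terms) would overstate conversion slightly, which is why the correction matters when no internal standard is available.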

  1. An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Collis, Peter

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…

  2. Investigation of the feasibility of an analytical method of accounting for the effects of atmospheric drag on satellite motion

    NASA Technical Reports Server (NTRS)

    Bozeman, Robert E.

    1987-01-01

    An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.

  3. The Effect of Multiple Intelligences Theory-Based Education on Academic Achievement: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Bas, Gökhan

    2016-01-01

    The main purpose of this study is to determine the effect of multiple intelligences theory (MIT)-based education on students' academic achievement. In this research, the meta-analytic method was adopted to determine this effect, and studies related to this subject carried out in Turkey were compiled. The effect sizes of the studies included in the…

  4. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    NASA Astrophysics Data System (ADS)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The amount of unstructured text on the Internet is growing tremendously. Text repositories come from Web 2.0, business intelligence, and social networking applications. It is also believed that 80-90% of future data growth will be in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well-known technique for discovering interesting patterns and trends, which are non-trivial knowledge, from massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics, and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing, and analyzing unstructured text data and visualizing the cleaned text data in multiple forms such as a Document-Term Matrix (DTM), Frequency Graph, Network Analysis Graph, Word Cloud, and Dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
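    Of the outputs listed, the Document-Term Matrix is the most mechanical: rows are documents, columns are vocabulary terms, cells hold term counts. A minimal stdlib sketch (naive lowercase-split tokenization, invented example documents; not the VisualUrText implementation):

```python
from collections import Counter

docs = ["text mining finds patterns",
        "text analytics visualizes text data"]

counts = [Counter(d.lower().split()) for d in docs]   # one bag-of-words per doc
vocab = sorted(set().union(*counts))                  # shared, ordered vocabulary
dtm = [[c[t] for t in vocab] for c in counts]         # rows: docs, cols: terms
```

    Real text-mining pipelines add stop-word removal, stemming, and weighting (e.g., tf-idf) on top of this raw-count matrix.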

  5. Fundamental Biological Research on the International Space Station

    NASA Technical Reports Server (NTRS)

    Souza, K. A.; Yost, Bruce; Fletcher, L.; Dalton, Bonnie P. (Technical Monitor)

    2000-01-01

    The fundamental Biology Program of NASA's Life Sciences Division is chartered with enabling and sponsoring research on the International Space Station (ISS) in order to understand the effects of the space flight environment, particularly microgravity, on living systems. To accomplish this goal, NASA Ames Research Center (ARC) has been tasked with managing the development of a number of biological habitats, along with their support systems infrastructure. This integrated suite of habitats and support systems is being designed to support research requirements identified by the scientific community. As such, it will support investigations using cells and tissues, avian eggs, insects, plants, aquatic organisms and rodents. Studies following organisms through complete life cycles and over multiple generations will eventually be possible. As an adjunct to the development of these basic habitats, specific analytical and monitoring technologies are being targeted for maturation to complete the research cycle by transferring existing or emerging analytical techniques, sensors, and processes from the laboratory bench to the ISS research platform.

  6. Comparison of real-time PCR methods for the detection of Naegleria fowleri in surface water and sediment.

    PubMed

    Streby, Ashleigh; Mull, Bonnie J; Levy, Karen; Hill, Vincent R

    2015-05-01

    Naegleria fowleri is a thermophilic free-living ameba found in freshwater environments worldwide. It is the cause of a rare but potentially fatal disease in humans known as primary amebic meningoencephalitis. Established N. fowleri detection methods rely on conventional culture techniques and morphological examination followed by molecular testing. Multiple alternative real-time PCR assays have been published for rapid detection of Naegleria spp. and N. fowleri. Four such assays were evaluated for the detection of N. fowleri from surface water and sediment. The assays were compared for thermodynamic stability, analytical sensitivity and specificity, detection limits, humic acid inhibition effects, and performance with seeded environmental matrices. Twenty-one ameba isolates were included in the DNA panel used for analytical sensitivity and specificity analyses. N. fowleri genotypes I and III were used for method performance testing. Two of the real-time PCR assays were determined to yield similar performance data for specificity and sensitivity for detecting N. fowleri in environmental matrices.

  7. Comparison of real-time PCR methods for the detection of Naegleria fowleri in surface water and sediment

    PubMed Central

    Streby, Ashleigh; Mull, Bonnie J.; Levy, Karen

    2015-01-01

    Naegleria fowleri is a thermophilic free-living ameba found in freshwater environments worldwide. It is the cause of a rare but potentially fatal disease in humans known as primary amebic meningoencephalitis. Established N. fowleri detection methods rely on conventional culture techniques and morphological examination followed by molecular testing. Multiple alternative real-time PCR assays have been published for rapid detection of Naegleria spp. and N. fowleri. Four such assays were evaluated for the detection of N. fowleri from surface water and sediment. The assays were compared for thermodynamic stability, analytical sensitivity and specificity, detection limits, humic acid inhibition effects, and performance with seeded environmental matrices. Twenty-one ameba isolates were included in the DNA panel used for analytical sensitivity and specificity analyses. N. fowleri genotypes I and III were used for method performance testing. Two of the real-time PCR assays were determined to yield similar performance data for specificity and sensitivity for detecting N. fowleri in environmental matrices. PMID:25855343

  8. Using business intelligence for efficient inter-facility patient transfer.

    PubMed

    Haque, Waqar; Derksen, Beth Ann; Calado, Devin; Foster, Lee

    2015-01-01

    In the context of inter-facility patient transfer, a transfer operator must be able to objectively identify a destination which meets the needs of a patient, while keeping in mind each facility's limitations. We propose a solution which uses Business Intelligence (BI) techniques to analyze data related to healthcare infrastructure and services, and provides a web-based system to identify optimal destination(s). The proposed inter-facility transfer system uses a single data warehouse with an Online Analytical Processing (OLAP) cube built on top that supplies analytical data to multiple reports embedded in web pages. The data visualization tool includes map-based navigation of the health authority as well as an interactive filtering mechanism which finds facilities meeting the selected criteria. The data visualization is backed by an intuitive data-entry web form which safely constrains the data, ensuring consistency and a single version of the truth. The overall time required to identify the destination for inter-facility transfers is reduced from hours to a few minutes with this interactive solution.
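    The kind of question an OLAP cube answers here is a roll-up: free capacity by (region, service), aggregated up to region totals. A hedged stdlib sketch with entirely invented facility data, standing in for the warehouse query rather than reproducing the paper's system:

```python
from collections import defaultdict

# (region, service, facility, free_beds) fact rows -- all values invented
facts = [
    ("North", "ICU", "Facility A", 2),
    ("North", "ICU", "Facility B", 0),
    ("North", "Surgery", "Facility A", 5),
    ("South", "ICU", "Facility C", 1),
]

by_region_service = defaultdict(int)   # fine-grained cube cells
by_region = defaultdict(int)           # roll-up along the service dimension
for region, service, _facility, free_beds in facts:
    by_region_service[(region, service)] += free_beds
    by_region[region] += free_beds
```

    A transfer operator's filter ("ICU bed in the North") then reduces to a lookup on the fine-grained cells rather than a scan of raw records.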

  9. [Sustainability analysis of an evaluation policy: the case of primary health care in Brazil].

    PubMed

    Felisberto, Eronildo; Freese, Eduardo; Bezerra, Luciana Caroline Albuquerque; Alves, Cinthia Kalyne de Almeida; Samico, Isabella

    2010-06-01

    This study analyzes the sustainability of Brazil's National Policy for the Evaluation of Primary Health Care, based on the identification and categorization of representative critical events in the institutionalization process. This was an evaluative study of two analytical units: Federal Management of Primary Health Care and State Health Secretariats, using multiple case studies with data collected through interviews and institutional documents, using the critical incidents technique. Events that were temporally classified as specific to implementation, sustainability, and mixed were categorized analytically as pertaining to memory, adaptation, values, and rules. Federal management and one of the State Health Secretariats showed medium-level sustainability, while the other State Secretariat showed strong sustainability. The results indicate that the events were concurrent and suggest a weighting process, since the adaptation of activities, adequacy, and stabilization of resources displayed a strong influence on the others. Innovations and the development of technical capability are considered the most important results for sustainability.

  10. Towards the authentication of European sea bass origin through a combination of biometric measurements and multiple analytical techniques.

    PubMed

    Farabegoli, Federica; Pirini, Maurizio; Rotolo, Magda; Silvi, Marina; Testi, Silvia; Ghidini, Sergio; Zanardi, Emanuela; Remondini, Daniel; Bonaldo, Alessio; Parma, Luca; Badiani, Anna

    2018-06-08

    The authenticity of fish products has become an imperative issue for authorities involved in the protection of consumers against fraudulent practices and in market stabilization. The present study aimed to provide a method for authentication of European sea bass (Dicentrarchus labrax) according to the requirements for seafood labels (Regulation 1379/2013/EU). Data on biometric traits, fatty acid profile, elemental composition, and isotopic abundance of wild and reared (intensively, semi-intensively, and extensively) specimens from 18 Southern European sources (n = 160) were collected and clustered in 6 sets of parameters, then subjected to multivariate analysis. Correct allocations of subjects according to their production method, origin, and stocking density were demonstrated with good approximation rates (94%, 92%, and 92%, respectively) using fatty acid profiles. Less satisfactory results were obtained using isotopic abundance, biometric traits, and elemental composition. The multivariate analysis also revealed that extensively reared subjects cannot be analytically discriminated from wild ones.

  11. Analysis of magnesium and copper in aluminum alloys with high repetition rate laser-ablation spark-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    He, Xiaoyong; Dong, Bo; Chen, Yuqi; Li, Runhua; Wang, Fujuan; Li, Jiaoyang; Cai, Zhigang

    2018-03-01

    In order to improve the analytical speed and performance of laser-ablation-based atomic emission spectroscopy, high repetition rate laser-ablation spark-induced breakdown spectroscopy (HRR LA-SIBS) was developed for the first time. Magnesium and copper in aluminum alloys were analyzed with this technique. In the experiments, the fundamental output of an acousto-optically Q-switched Nd:YAG laser operated at a 1 kHz repetition rate, with low pulse energy and 120 ns pulse width, was used to ablate the samples, and the plasma emission was enhanced by spark discharge. The spectra were recorded with a compact fiber spectrometer with a non-intensified charge-coupled device in non-gating mode. Parameters relevant to the analytical performance, such as capacitance, voltage, and laser pulse energy, were optimized. Under the current experimental conditions, calibration curves of magnesium and copper in aluminum alloys were built, and their limits of detection by HRR LA-SIBS were determined to be 14.0 and 9.9 ppm, respectively, 8-12-fold better than those achieved by HRR LA under similar experimental conditions without spark discharge. The analytical sensitivities are close to those obtained with conventional LIBS, but with improved analytical speed as well as the possibility of using a compact fiber spectrometer. Under high repetition rate operation, the noise level can be decreased and the analytical reproducibility markedly improved by averaging multiple measurements within a short time. High repetition rate operation of laser-ablation spark-induced breakdown spectroscopy is very helpful for improving analytical speed, and may find applications in fast elemental analysis, especially fast two-dimensional elemental mapping of solid samples.
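    The calibration-curve and limit-of-detection step described above is a standard linear fit. The sketch below uses made-up concentration/intensity pairs and an assumed blank standard deviation, purely to show the 3-sigma/slope convention, not the paper's measured values:

```python
import numpy as np

conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])        # ppm standards (invented)
intensity = np.array([2.0, 105.0, 201.0, 405.0, 797.0])  # line intensities (invented)

slope, intercept = np.polyfit(conc, intensity, 1)  # linear calibration curve
sigma_blank = 1.0                                  # assumed std of repeated blanks
lod_ppm = 3.0 * sigma_blank / slope                # 3-sigma limit of detection
```

    Averaging many shots at a high repetition rate shrinks sigma_blank, which is exactly how the HRR operation translates into a lower detection limit.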

  12. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Treesearch

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  13. Critical Assessment of Analytical Techniques in the Search for Biomarkers on Mars: A Mummified Microbial Mat from Antarctica as a Best-Case Scenario

    PubMed Central

    Blanco, Yolanda; Gallardo-Carreño, Ignacio; Ruiz-Bermejo, Marta; Puente-Sánchez, Fernando; Cavalcante-Silva, Erika; Quesada, Antonio; Prieto-Ballesteros, Olga

    2017-01-01

    Abstract The search for biomarkers of present or past life is one of the major challenges for in situ planetary exploration. Multiple constraints limit the performance and sensitivity of remote in situ instrumentation. In addition, the structure, chemical, and mineralogical composition of the sample may complicate the analysis and interpretation of the results. The aim of this work is to highlight the main constraints, performance, and complementarity of several techniques that have already been implemented or are planned to be implemented on Mars for detection of organic and molecular biomarkers on a best-case sample scenario. We analyzed a 1000-year-old desiccated and mummified microbial mat from Antarctica by Raman and IR (infrared) spectroscopies (near- and mid-IR), thermogravimetry (TG), differential thermal analysis, mass spectrometry (MS), and immunological detection with a life detector chip. In spite of the high organic content (ca. 20% wt/wt) of the sample, the Raman spectra only showed the characteristic spectral peaks of the remaining beta-carotene biomarker and faint peaks of phyllosilicates over a strong fluorescence background. IR spectra complemented the mineralogical information from Raman spectra and showed the main molecular vibrations of the humic acid functional groups. The TG-MS system showed the release of several volatile compounds attributed to biopolymers. An antibody microarray for detecting cyanobacteria (CYANOCHIP) detected biomarkers from Chroococcales, Nostocales, and Oscillatoriales orders. The results highlight limitations of each technique and suggest the necessity of complementary approaches in the search for biomarkers because some analytical techniques might be impaired by sample composition, presentation, or processing. Key Words: Planetary exploration—Life detection—Microbial mat—Life detector chip—Thermogravimetry—Raman spectroscopy—NIR—DRIFTS. Astrobiology 17, 984–996. PMID:29016195

  14. Critical Assessment of Analytical Techniques in the Search for Biomarkers on Mars: A Mummified Microbial Mat from Antarctica as a Best-Case Scenario.

    PubMed

    Blanco, Yolanda; Gallardo-Carreño, Ignacio; Ruiz-Bermejo, Marta; Puente-Sánchez, Fernando; Cavalcante-Silva, Erika; Quesada, Antonio; Prieto-Ballesteros, Olga; Parro, Víctor

    2017-10-01

    The search for biomarkers of present or past life is one of the major challenges for in situ planetary exploration. Multiple constraints limit the performance and sensitivity of remote in situ instrumentation. In addition, the structure, chemical, and mineralogical composition of the sample may complicate the analysis and interpretation of the results. The aim of this work is to highlight the main constraints, performance, and complementarity of several techniques that have already been implemented or are planned to be implemented on Mars for detection of organic and molecular biomarkers on a best-case sample scenario. We analyzed a 1000-year-old desiccated and mummified microbial mat from Antarctica by Raman and IR (infrared) spectroscopies (near- and mid-IR), thermogravimetry (TG), differential thermal analysis, mass spectrometry (MS), and immunological detection with a life detector chip. In spite of the high organic content (ca. 20% wt/wt) of the sample, the Raman spectra only showed the characteristic spectral peaks of the remaining beta-carotene biomarker and faint peaks of phyllosilicates over a strong fluorescence background. IR spectra complemented the mineralogical information from Raman spectra and showed the main molecular vibrations of the humic acid functional groups. The TG-MS system showed the release of several volatile compounds attributed to biopolymers. An antibody microarray for detecting cyanobacteria (CYANOCHIP) detected biomarkers from Chroococcales, Nostocales, and Oscillatoriales orders. The results highlight limitations of each technique and suggest the necessity of complementary approaches in the search for biomarkers because some analytical techniques might be impaired by sample composition, presentation, or processing. Key Words: Planetary exploration-Life detection-Microbial mat-Life detector chip-Thermogravimetry-Raman spectroscopy-NIR-DRIFTS. Astrobiology 17, 984-996.

  15. Reduction of multi-dimensional laboratory data to a two-dimensional plot: a novel technique for the identification of laboratory error.

    PubMed

    Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A

    2007-01-01

    The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
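    As an illustrative sketch only (the abstract includes no code, and generative topographic mapping implementations vary), the same idea of projecting multi-analyte records to two dimensions for visual anomaly screening can be shown with PCA as a linear stand-in; the data here are simulated, not patient records:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 14-analyte records (rows = patients); values are arbitrary stand-ins.
records = rng.normal(loc=100.0, scale=10.0, size=(500, 14))
records[0] += 80.0  # one deliberately anomalous record

# Center and project to 2D with PCA (a linear stand-in for GTM).
centered = records - records.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt[:2].T          # 2D coordinates for plotting

# Flag records unusually far from the centroid in the 2D map.
dist = np.linalg.norm(scores, axis=1)
threshold = np.median(dist) + 5 * np.median(np.abs(dist - np.median(dist)))
anomalies = np.flatnonzero(dist > threshold)
print(anomalies)
```

In a real deployment the flagged records would be reviewed before release rather than automatically rejected, since extreme but genuine results are clinically important.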

  16. Full-order optimal compensators for flow control: the multiple inputs case

    NASA Astrophysics Data System (ADS)

    Semeraro, Onofrio; Pralits, Jan O.

    2018-03-01

    Flow control has been the subject of numerous experimental and theoretical works. We analyze full-order, optimal controllers for large dynamical systems in the presence of multiple actuators and sensors. The full-order controllers do not require any preliminary model reduction or low-order approximation: this feature allows us to assess the optimal performance of an actuated flow without relying on any estimation process or further hypothesis on the disturbances. We start from the original technique proposed by Bewley et al. (Meccanica 51(12):2997-3014, 2016. https://doi.org/10.1007/s11012-016-0547-3), the adjoint of the direct-adjoint (ADA) algorithm. The algorithm is iterative and allows bypassing the solution of the algebraic Riccati equation associated with the optimal control problem, typically infeasible for large systems. In this numerical work, we extend the ADA iteration into a more general framework that includes the design of controllers with multiple, coupled inputs and robust controllers (H_{∞} methods). First, we demonstrate our results by showing the analytical equivalence between the full Riccati solutions and the ADA approximations in the multiple inputs case. In the second part of the article, we analyze the performance of the algorithm in terms of convergence of the solution, by comparing it with analogous techniques. We find an excellent scalability with the number of inputs (actuators), making the method a viable way for full-order control design in complex settings. Finally, the applicability of the algorithm to fluid mechanics problems is shown using the linearized Kuramoto-Sivashinsky equation and the Kármán vortex street past a two-dimensional cylinder.
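    The Riccati baseline that the ADA iteration is designed to avoid for large systems can still be computed directly for a small one; the following is a hedged illustration using SciPy's Riccati solver on a hypothetical two-state, two-input model, not the authors' flow solver:

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Small unstable linear system with two inputs (a toy stand-in for a
# discretized flow model; for large n this direct Riccati solve is the
# step the ADA iteration bypasses).
A = np.array([[0.0, 1.0], [2.0, -0.5]])
B = np.array([[0.0, 1.0], [1.0, 0.5]])   # two actuators
Q = np.eye(2)                            # state weight
R = np.eye(2)                            # input weight

P = solve_continuous_are(A, B, Q, R)     # algebraic Riccati solution
K = np.linalg.solve(R, B.T @ P)          # optimal feedback gain, u = -K x

# The closed loop is stable: all eigenvalues lie in the left half-plane.
eigs = np.linalg.eigvals(A - B @ K)
print(np.max(eigs.real))
```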

  17. Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.

    ERIC Educational Resources Information Center

    Hercules, David M.; Hercules, Shirley H.

    1984-01-01

    Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)

  18. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    PubMed

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges, including a class of complex molecules with the potential for multiple isomers and variable charge states, and the absence of established standards for acceptable analytical characterization of materials used in drug discovery. In addition, because of the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.

  19. Membrane-based lateral flow immunochromatographic strip with nanoparticles as reporters for detection: A review.

    PubMed

    Huang, Xiaolin; Aguilar, Zoraida P; Xu, Hengyi; Lai, Weihua; Xiong, Yonghua

    2016-01-15

    Membrane-based lateral flow immunochromatographic strip (LFICS) is widely used in various fields because of its simplicity, rapidity (detection within 10 min), and low cost. However, early designs of membrane-based LFICS for preliminary screening only provide qualitative ("yes/no" signal) or semi-quantitative results without quantitative information. These designs often suffer from low-signal intensity and poor sensitivity and are only capable of single analyte detection, not simultaneous multiple detections. The performance of existing techniques used for detection using LFICS has been considerably improved by incorporating different kinds of nanoparticles (NPs) as reporters. NPs can serve as alternative labels and improve analytical sensitivity or limit of detection of LFICS because of their unique properties, such as optical absorption, fluorescence spectra, and magnetic properties. The controlled manipulation of NPs allows simultaneous or multiple detections by using membrane-based LFICS. In this review, we discuss how colored (e.g., colloidal gold, carbon, and colloidal selenium NPs), luminescent (e.g., quantum dots, up-converting phosphor NPs, and dye-doped NPs), and magnetic NPs are integrated into membrane-based LFICS for the detection of target analytes. Gold NPs are also featured because of their wide applications. Different types and unique properties of NPs are briefly explained. This review focuses on examples of NP-based LFICS to illustrate novel concepts in various devices with potential applications as screening tools. This review also highlights the superiority of NP-based approaches over existing conventional strategies for clinical analysis, food safety, and environmental monitoring. This paper is concluded by a short section on future research trends regarding NP-based LFICS. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Nanoscale chemical imaging by photoinduced force microscopy

    PubMed Central

    Nowak, Derek; Morrison, William; Wickramasinghe, H. Kumar; Jahng, Junghoon; Potma, Eric; Wan, Lei; Ruiz, Ricardo; Albrecht, Thomas R.; Schmidt, Kristin; Frommer, Jane; Sanders, Daniel P.; Park, Sung

    2016-01-01

    Correlating spatial chemical information with the morphology of closely packed nanostructures remains a challenge for the scientific community. For example, supramolecular self-assembly, which provides a powerful and low-cost way to create nanoscale patterns and engineered nanostructures, is not easily interrogated in real space via existing nondestructive techniques based on optics or electrons. A novel scanning probe technique called infrared photoinduced force microscopy (IR PiFM) directly measures the photoinduced polarizability of the sample in the near field by detecting the time-integrated force between the tip and the sample. By imaging at multiple IR wavelengths corresponding to absorption peaks of different chemical species, PiFM has demonstrated the ability to spatially map nm-scale patterns of the individual chemical components of two different types of self-assembled block copolymer films. With chemical-specific nanometer-scale imaging, PiFM provides a powerful new analytical method for deepening our understanding of nanomaterials. PMID:27051870

  1. Evaluation of trade-offs in costs and environmental impacts for returnable packaging implementation

    NASA Astrophysics Data System (ADS)

    Jarupan, Lerpong; Kamarthi, Sagar V.; Gupta, Surendra M.

    2004-02-01

    The main thrust of returnable packaging today is to provide logistical services through transportation and distribution of products while being environmentally friendly. Returnable packaging and reverse logistics concepts have converged to mitigate the adverse effect of packaging materials entering the solid waste stream. Returnable packaging must be designed by considering the trade-offs between costs and environmental impact to satisfy manufacturers and environmentalists alike. The cost of returnable packaging entails such items as materials, manufacturing, collection, storage and disposal. Environmental impacts are explicitly linked with solid waste, air pollution, and water pollution. This paper presents a multi-criteria evaluation technique to assist decision-makers in evaluating the trade-offs in costs and environmental impact during the returnable packaging design process. The proposed evaluation technique involves a combination of multiple objective integer linear programming and the analytic hierarchy process. A numerical example is used to illustrate the methodology.
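    The analytic hierarchy process half of such a technique can be sketched as a principal-eigenvector weight computation; the pairwise comparison matrix and the three criteria below (cost, solid waste, air pollution) are hypothetical values for illustration only, not the paper's model:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria:
# cost, solid waste, air pollution (Saaty 1-9 scale; reciprocal matrix).
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   2.0],
    [1/5.0, 1/2.0, 1.0],
])

# AHP priorities: principal eigenvector, normalized to sum to 1.
vals, vecs = np.linalg.eig(A)
principal = np.real(vecs[:, np.argmax(np.real(vals))])
weights = principal / principal.sum()

# Consistency ratio CR = ((lambda_max - n)/(n - 1)) / RI, with RI = 0.58 for n = 3;
# judgments are conventionally acceptable when CR < 0.1.
lam_max = np.max(np.real(vals))
cr = ((lam_max - 3) / 2) / 0.58
print(weights, cr)
```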

  2. A deep learning-based reconstruction of cosmic ray-induced air showers

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Glombitza, J.; Walz, D.

    2018-01-01

    We describe a method of reconstructing air showers induced by cosmic rays using deep learning techniques. We simulate an observatory consisting of ground-based particle detectors with fixed locations on a regular grid. The detector's responses to traversing shower particles are signal amplitudes as a function of time, which provide information on transverse and longitudinal shower properties. In order to take advantage of convolutional network techniques specialized in local pattern recognition, we convert all information to the image-like grid of the detectors. In this way, multiple features, such as arrival times of the first particles and optimized characterizations of time traces, are processed by the network. The reconstruction quality of the cosmic ray arrival direction turns out to be competitive with an analytic reconstruction algorithm. The reconstructed shower direction, energy and shower depth show the expected improvement in resolution for higher cosmic ray energy.
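    The analytic reconstruction algorithm mentioned above is, in its simplest form, a plane-front fit of detector arrival times; the following sketch simulates a detector grid and recovers the direction cosines by least squares (all values are illustrative, not the paper's simulation):

```python
import numpy as np

C = 0.3  # speed of light in m/ns

rng = np.random.default_rng(1)

# Ground detector grid (meters) and a true shower axis direction.
xs, ys = np.meshgrid(np.arange(0, 1000, 100), np.arange(0, 1000, 100))
x, y = xs.ravel(), ys.ravel()
u_true, v_true = 0.3, -0.2           # direction cosines of the shower axis

# Plane shower front: arrival time t = (u*x + v*y)/c + t0, plus timing noise.
t = (u_true * x + v_true * y) / C + 50.0 + rng.normal(0, 0.5, x.size)

# Analytic least-squares fit for (u, v, t0).
Amat = np.column_stack([x / C, y / C, np.ones_like(x)])
(u, v, t0), *_ = np.linalg.lstsq(Amat, t, rcond=None)
print(u, v, t0)
```

A convolutional network consumes the same grid of signals but can additionally exploit amplitude and time-trace shape, which is where the reported resolution gains for energy and shower depth come from.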

  3. Bioinformatics and peptidomics approaches to the discovery and analysis of food-derived bioactive peptides.

    PubMed

    Agyei, Dominic; Tsopmo, Apollinaire; Udenigwe, Chibuike C

    2018-06-01

    There are emerging advancements in the strategies used for the discovery and development of food-derived bioactive peptides because of their multiple food and health applications. Bioinformatics and peptidomics are two computational and analytical techniques that have the potential to speed up the development of bioactive peptides from bench to market. Structure-activity relationships observed in peptides form the basis for bioinformatics and in silico prediction of bioactive sequences encrypted in food proteins. Peptidomics, on the other hand, relies on "hyphenated" (liquid chromatography-mass spectrometry-based) techniques for the detection, profiling, and quantitation of peptides. Together, bioinformatics and peptidomics approaches provide a low-cost and effective means of predicting, profiling, and screening bioactive protein hydrolysates and peptides from food. This article discusses the basis, strengths, and limitations of bioinformatics and peptidomics approaches currently used for the discovery and analysis of food-derived bioactive peptides.
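    The in silico side of this workflow can be illustrated with a toy database scan: searching a parent protein sequence for known bioactive fragments. Both the motif database and the protein string below are hypothetical stand-ins, not curated data:

```python
# Toy in silico screen: find occurrences of known bioactive peptide
# sequences (hypothetical examples) within a parent food protein.
bioactive_db = {
    "IPP": "ACE-inhibitory",
    "VPP": "ACE-inhibitory",
    "YGG": "opioid-like",
}

# Hypothetical parent protein sequence (illustration only).
protein = "RELEELNVPGEIVESLSSSEESITRINKQIPPLTQTPVVVPPFLQPE"

hits = []
for pep, activity in bioactive_db.items():
    start = protein.find(pep)
    while start != -1:                   # record every occurrence
        hits.append((pep, activity, start))
        start = protein.find(pep, start + 1)
print(hits)
```

Real prediction pipelines add scoring for enzymatic release (whether a protease could actually liberate the fragment) and quantitative structure-activity models rather than exact string matching.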

  4. Multiple wavelength interferometry for distance measurements of moving objects with nanometer uncertainty

    NASA Astrophysics Data System (ADS)

    Kuschmierz, R.; Czarske, J.; Fischer, A.

    2014-08-01

    Optical measurement techniques offer great opportunities in diverse applications, such as lathe monitoring and microfluidics. Doppler-based interferometric techniques enable simultaneous measurement of the lateral velocity and axial distance of a moving object. However, there is a complementarity between the unambiguous axial measurement range and the uncertainty of the distance. Therefore, we present an extended sensor setup, which provides an unambiguous axial measurement range of 1 mm while achieving uncertainties below 100 nm. Measurements at a calibration system are performed. When using a pinhole for emulating a single scattering particle, the tumbling motion of the rotating object is resolved with a distance uncertainty of 50 nm. For measurements at the rough surface, the distance uncertainty amounts to 280 nm due to a lower signal-to-noise ratio. Both experimental results are close to the respective Cramér-Rao bound, which is derived analytically for both surface and single particle measurements.
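    The tension between unambiguous range and uncertainty is conventionally resolved with a synthetic wavelength; the sketch below shows the standard two-wavelength arithmetic, with illustrative wavelengths chosen to land near the 1 mm range reported above (not the authors' actual laser lines):

```python
# Two-wavelength interferometry: combining lam1 and lam2 yields a
# synthetic wavelength Lambda = lam1*lam2/|lam1 - lam2|, extending the
# unambiguous distance range from lam/2 to Lambda/2. Values illustrative.
lam1 = 1550.0e-9   # m
lam2 = 1551.2e-9   # m

Lambda = lam1 * lam2 / abs(lam2 - lam1)
unambiguous_range = Lambda / 2
print(Lambda, unambiguous_range)
```

The price of the longer synthetic wavelength is a proportionally larger phase-noise-to-distance conversion, which is why such sensors still use the optical-wavelength phase for the fine, low-uncertainty measurement.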

  5. Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"

    NASA Astrophysics Data System (ADS)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    The limited availability of analytical instruments for the methodical detection of known and unknown effluents imposes a serious hindrance on qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography, and electro-analytical instruments are not only expensive but also time consuming, require maintenance, and need replacement of damaged essential parts, all of which are of serious concern. Moreover, for field studies and instant detection, installation of these instruments is not convenient at every site. Therefore, a technique based on pre-concentration of metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key to this immobilization technique, which is simple, user friendly, highly effective, inexpensive, and time efficient; it is also easy to carry (a 10-20 g vial) to the experimental field/site, as has been demonstrated.

  6. Approximate analytical relationships for linear optimal aeroelastic flight control laws

    NASA Astrophysics Data System (ADS)

    Kassem, Ayman Hamdy

    1998-09-01

    This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.

  7. Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration

    NASA Technical Reports Server (NTRS)

    Merritt, D. A.; Brand, W. A.; Hayes, J. M.

    1994-01-01

    In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques and results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source, or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (sigma = 0.00006 at.% 13C or 0.06% for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 < or = sigma < or = 0.0002 at.% or 0.1 < or = sigma < or = 0.2%).
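    The calibration arithmetic underlying all five standard-introduction techniques is the standard delta-notation conversion against a reference ratio; a minimal sketch, using the accepted VPDB 13C/12C ratio and a hypothetical measured ratio:

```python
# Delta notation relative to the VPDB standard:
# delta13C (permil) = (R_sample / R_std - 1) * 1000, with R = 13C/12C.
R_VPDB = 0.0112372          # 13C/12C of the VPDB reference standard

def delta13C(r_sample, r_std=R_VPDB):
    return (r_sample / r_std - 1.0) * 1000.0

def atom_percent_13C(r):
    """Convert an isotope ratio to atom % 13C (neglecting 14C)."""
    return 100.0 * r / (1.0 + r)

r = 0.0110  # hypothetical measured ratio
print(delta13C(r), atom_percent_13C(r))
```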

  8. Analytical technique characterizes all trace contaminants in water

    NASA Technical Reports Server (NTRS)

    Foster, J. N.; Lysyj, I.; Nelson, K. H.

    1967-01-01

    A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to investigating the source of water pollution.

  9. Opportunities for bead-based multiplex assays in veterinary diagnostic laboratories

    USDA-ARS?s Scientific Manuscript database

    Bead based multiplex assays (BBMA) also referred to as Luminex, MultiAnalyte Profiling or cytometric bead array (CBA) assays, are applicable for high throughput, simultaneous detection of multiple analytes in solution (from several, up to 50-500 analytes within a single, small sample volume). Curren...

  10. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  11. Direct Analysis of Samples of Various Origin and Composition Using Specific Types of Mass Spectrometry.

    PubMed

    Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek

    2017-07-04

    One of the major sources of error in chemical analysis using conventional, established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage, and in this review we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis without sample preparation, or with only limited pre-concentration steps. MALDI-MS, PTR-MS, SIFT-MS, and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity because of their interesting properties. The advantages and disadvantages of these techniques are presented, along with trends in the development of direct analysis using them.

  12. Synchronous in-field application of life-detection techniques in planetary analog missions

    NASA Astrophysics Data System (ADS)

    Amador, Elena S.; Cable, Morgan L.; Chaudry, Nosheen; Cullen, Thomas; Gentry, Diana; Jacobsen, Malene B.; Murukesan, Gayathri; Schwieterman, Edward W.; Stevens, Adam H.; Stockton, Amanda; Yin, Chang; Cullen, David C.; Geppert, Wolf

    2015-02-01

    Field expeditions that simulate the operations of robotic planetary exploration missions at analog sites on Earth can help establish best practices and are therefore a positive contribution to the planetary exploration community. There are many sites in Iceland that possess heritage as planetary exploration analog locations and whose environmental extremes make them suitable for simulating scientific sampling and robotic operations. We conducted a planetary exploration analog mission at two recent lava fields in Iceland, Fimmvörðuháls (2010) and Eldfell (1973), using a specially developed field laboratory. We tested the utility of in-field site sampling down selection and tiered analysis operational capabilities with three life detection and characterization techniques: fluorescence microscopy (FM), adenosine triphosphate (ATP) bioluminescence assay, and quantitative polymerase chain reaction (qPCR) assay. The study made use of multiple cycles of sample collection at multiple distance scales and field laboratory analysis using the synchronous life-detection techniques to heuristically develop the continuing sampling and analysis strategy during the expedition. Here we report the operational lessons learned and provide brief summaries of scientific data. The full scientific data report will follow separately. We found that rapid in-field analysis to determine subsequent sampling decisions is operationally feasible, and that the chosen life detection and characterization techniques are suitable for a terrestrial life-detection field mission. In-field analysis enables the rapid obtainment of scientific data and thus facilitates the collection of the most scientifically relevant samples within a single field expedition, without the need for sample relocation to external laboratories. The operational lessons learned in this study could be applied to future terrestrial field expeditions employing other analytical techniques and to future robotic planetary exploration missions.

  13. Common aspects influencing the translocation of SERS to Biomedicine.

    PubMed

    Gil, Pilar Rivera; Tsouts, Dionysia; Sanles-Sobrido, Marcos; Cabo, Andreu

    2018-01-04

    In this review, we introduce the reader to the analytical technique of surface-enhanced Raman scattering (SERS), motivated by the great potential we believe this technique has in biomedicine. We present the advantages and limitations of the technique relevant for bioanalysis in vitro and in vivo, and how it goes beyond the state of the art of traditional analytical, labelling, and healthcare diagnosis technologies. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  14. Multiple piezo-patch energy harvesters integrated to a thin plate with AC-DC conversion: analytical modeling and numerical validation

    NASA Astrophysics Data System (ADS)

    Aghakhani, Amirreza; Basdogan, Ipek; Erturk, Alper

    2016-04-01

    Plate-like components are widely used in numerous automotive, marine, and aerospace applications where they can be employed as host structures for vibration based energy harvesting. Piezoelectric patch harvesters can be easily attached to these structures to convert the vibrational energy to the electrical energy. Power output investigations of these harvesters require accurate models for energy harvesting performance evaluation and optimization. Equivalent circuit modeling of the cantilever-based vibration energy harvesters for estimation of electrical response has been proposed in recent years. However, equivalent circuit formulation and analytical modeling of multiple piezo-patch energy harvesters integrated to thin plates including nonlinear circuits has not been studied. In this study, equivalent circuit model of multiple parallel piezoelectric patch harvesters together with a resistive load is built in electronic circuit simulation software SPICE and voltage frequency response functions (FRFs) are validated using the analytical distributed-parameter model. Analytical formulation of the piezoelectric patches in parallel configuration for the DC voltage output is derived while the patches are connected to a standard AC-DC circuit. The analytic model is based on the equivalent load impedance approach for piezoelectric capacitance and AC-DC circuit elements. The analytic results are validated numerically via SPICE simulations. Finally, DC power outputs of the harvesters are computed and compared with the peak power amplitudes in the AC output case.
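    The equivalent-load-impedance idea can be sketched for a single patch with a purely resistive AC load: modeled as a current source in parallel with the patch capacitance, delivered power peaks near R = 1/(omega * C_p). Component values below are illustrative, not the paper's parameters:

```python
import numpy as np

# Single piezo patch near one vibration mode, modeled as a current
# source I0 in parallel with its capacitance C_p, feeding a resistor R.
# Load voltage amplitude: V = I0 * R / sqrt(1 + (omega*C_p*R)^2).
C_p = 60e-9                     # patch capacitance, F (illustrative)
f = 120.0                       # excitation frequency, Hz
omega = 2 * np.pi * f

R = np.logspace(3, 7, 400)      # candidate resistive loads, ohm
I0 = 1e-4                       # current-source amplitude, A
V = I0 * R / np.sqrt(1 + (omega * C_p * R) ** 2)
P = V ** 2 / (2 * R)            # average power delivered to the load

R_opt = R[np.argmax(P)]         # numerically found optimum
print(R_opt, 1 / (omega * C_p)) # compare with the analytic 1/(omega*C_p)
```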

  15. Method development aspects for the quantitation of pharmaceutical compounds in human plasma with a matrix-assisted laser desorption/ionization source in the multiple reaction monitoring mode.

    PubMed

    Kovarik, Peter; Grivet, Chantal; Bourgogne, Emmanuel; Hopfgartner, Gérard

    2007-01-01

    The present work investigates various method development aspects for the quantitative analysis of pharmaceutical compounds in human plasma using matrix-assisted laser desorption/ionization and multiple reaction monitoring (MALDI-MRM). Talinolol was selected as a model analyte. Liquid-liquid extraction (LLE) and protein precipitation were evaluated regarding sensitivity and throughput for the MALDI-MRM technique and its applicability without and with chromatographic separation. Compared to classical electrospray liquid chromatography/mass spectrometry (LC/ESI-MS) method development, with MALDI-MRM the tuning of the analyte in single MS mode is more challenging due to interfering matrix background ions. An approach is proposed using background subtraction. With LLE and using a 200 microL human plasma aliquot, acceptable precision and accuracy could be obtained in the range of 1 to 1000 ng/mL without any LC separation. Approximately 3 s were required for one analysis. A full calibration curve and its quality control samples (20 samples) can be analyzed within 1 min. Combining LC with the MALDI analysis allowed extending the linearity down to 50 pg/mL, while reducing the throughput potential only two-fold. Matrix effects are still a significant issue with MALDI but can be monitored in a similar way to that used for LC/ESI-MS analysis.

  16. Screening and identification of steroidal saponins from Anemarrhena asphodeloides employing UPLC tandem triple quadrupole linear ion trap mass spectrometry.

    PubMed

    Xia, Yong-Gang; Guo, Xin-Dong; Liang, Jun; Yang, Bing-You; Kuang, Hai-Xue

    2017-09-01

    This study presents a practical and valid strategy for the screening and structural characterization of Anemarrhena asphodeloides Bge steroidal saponins (SSs) using ultra-high performance liquid chromatography coupled with triple quadrupole linear ion trap mass spectrometry. The whole analytical protocol integrates a four-step procedure in the positive mode: (1) rational deduction of mass fragmentation pathways of A. asphodeloides SSs; (2) untargeted screening of potential A. asphodeloides SSs by multiple-ion monitoring-information-dependent acquisition-enhanced product ion (MIM-IDA-EPI) scans through reverse phase liquid chromatography; (3) comprehensive construction of an ammoniated precursor ion database by combining untargeted MIM-IDA-EPI scans and literature data; and (4) structural interpretation of targeted A. asphodeloides SSs using MIM-IDA-EPI and multiple reaction monitoring (MRM)-IDA-EPI with an energy-resolved technique. The protocol was used to analyze SSs in A. asphodeloides; of the 87 detected SSs that were unambiguously characterized or tentatively identified, 19 compounds were reported from A. asphodeloides for the first time and 13 were characterized as potential new compounds. Accuracy of the analytical procedure was demonstrated by structural identification of three SSs by NMR spectroscopy. The proposed scheme holds excellent promise for the structural prediction and interpretation of complex SSs from plant medicines by mass spectrometry. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Estimating habitat volume of living resources using three-dimensional circulation and biogeochemical models

    NASA Astrophysics Data System (ADS)

    Smith, Katharine A.; Schlag, Zachary; North, Elizabeth W.

    2018-07-01

    Coupled three-dimensional circulation and biogeochemical models predict changes in water properties that can be used to define fish habitat, including physiologically important parameters such as temperature, salinity, and dissolved oxygen. However, methods for calculating the volume of habitat defined by the intersection of multiple water properties are not well established for coupled three-dimensional models. The objectives of this research were to examine multiple methods for calculating habitat volume from three-dimensional model predictions, select the most robust approach, and provide an example application of the technique. Three methods were assessed: the "Step," "Ruled Surface," and "Pentahedron" methods, the latter of which was developed as part of this research. Results indicate that the analytical Pentahedron method is exact, computationally efficient, and preserves continuity in water properties between adjacent grid cells. As an example application, the Pentahedron method was implemented within the Habitat Volume Model (HabVol) using output from a circulation model with an Arakawa C-grid and physiological tolerances of juvenile striped bass (Morone saxatilis). This application demonstrates that the analytical Pentahedron method can be successfully applied to calculate habitat volume using output from coupled three-dimensional circulation and biogeochemical models, and it indicates that the Pentahedron method has wide application to aquatic and marine systems for which these models exist and physiological tolerances of organisms are known.
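    The coarse Step method can be sketched directly: count the grid cells whose predicted properties all fall inside the tolerance windows and sum their volumes (the Pentahedron method instead resolves the property intersection analytically within each cell). All fields and thresholds below are synthetic, not model output:

```python
import numpy as np

rng = np.random.default_rng(2)
shape = (20, 20, 10)                         # x, y, z cells
cell_vol = 100.0 * 100.0 * 1.0               # m^3 per cell (illustrative)

# Synthetic stand-ins for circulation/biogeochemical model fields.
temp = rng.uniform(10, 30, shape)            # deg C
salt = rng.uniform(0, 20, shape)             # psu
do = rng.uniform(1, 10, shape)               # dissolved oxygen, mg/L

# Habitat = intersection of all physiological tolerance windows.
habitat = (temp > 18) & (temp < 26) & (salt > 5) & (salt < 15) & (do > 3)
volume = habitat.sum() * cell_vol
print(volume)
```

The Step method's error comes from the all-or-nothing treatment of each cell; the Pentahedron method removes it by locating the property isosurfaces inside each cell before integrating.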

  18. Spectral relative standard deviation: a practical benchmark in metabolomics.

    PubMed

    Parsons, Helen M; Ekman, Drew R; Collette, Timothy W; Viant, Mark R

    2009-03-01

    Metabolomics datasets, by definition, comprise measurements of large numbers of metabolites. Both technical (analytical) and biological factors will induce variation within these measurements that is not consistent across all metabolites. Consequently, criteria are required to assess the reproducibility of metabolomics datasets that are derived from all the detected metabolites. Here we calculate spectrum-wide relative standard deviations (RSDs; also termed coefficients of variation, CVs) for ten metabolomics datasets, spanning a variety of sample types from mammals, fish, invertebrates and a cell line, and display them succinctly as boxplots. We demonstrate multiple applications of spectral RSDs for characterising technical as well as inter-individual biological variation: for optimising metabolite extractions, comparing analytical techniques, investigating matrix effects, and comparing biofluids and tissue extracts from single and multiple species for optimising experimental design. Technical variation within metabolomics datasets, recorded using one- and two-dimensional NMR and mass spectrometry, ranges from 1.6 to 20.6% (reported as the median spectral RSD). Inter-individual biological variation is typically larger, ranging from as low as 7.2% for tissue extracts from laboratory-housed rats to 58.4% for fish plasma. In addition, for some of the datasets we confirm that the spectral RSD values are largely invariant across different spectral processing methods, such as baseline correction, normalisation and binning resolution. In conclusion, we propose spectral RSDs and their median values contained herein as practical benchmarks for metabolomics studies.
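    The benchmark statistic itself is straightforward: for each metabolite feature, the RSD is the sample standard deviation divided by the mean, and the spectrum-wide summary is the median over features. A minimal sketch with hypothetical intensity data:

    ```python
    import numpy as np

    def spectral_rsds(data):
        """Per-metabolite RSD (%) across replicate samples.

        data: 2-D array, rows = replicate samples, columns = metabolite
        features. Returns one RSD value per feature.
        """
        return 100.0 * data.std(axis=0, ddof=1) / data.mean(axis=0)

    # Hypothetical intensities: 3 replicates x 2 metabolites
    data = np.array([[ 9.0,  98.0],
                     [11.0, 102.0],
                     [10.0, 100.0]])
    rsds = spectral_rsds(data)      # one RSD per metabolite
    median_rsd = np.median(rsds)    # the headline benchmark statistic
    ```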

  19. Light aircraft crash safety program

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Hayduk, R. J.

    1974-01-01

    NASA has embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.

  20. Surface-Enhanced Raman Spectroscopy.

    ERIC Educational Resources Information Center

    Garrell, Robin L.

    1989-01-01

    Reviews the basis for the technique and its experimental requirements. Describes a few examples of the analytical problems to which surface-enhanced Raman spectroscopy (SERS) has been and can be applied. Provides a perspective on the current limitations and frontiers in developing SERS as an analytical technique. (MVL)

  1. Arsenic, Antimony, Chromium, and Thallium Speciation in Water and Sediment Samples with the LC-ICP-MS Technique

    PubMed Central

    Jabłońska-Czapla, Magdalena

    2015-01-01

    Chemical speciation is a very important subject in environmental protection, toxicology, and chemical analytics because the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires increasingly advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of these elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analytics is so important. The paper also provides numerous examples of hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962

  2. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives that frequently arise in bioanalytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, and the therapeutic monitoring of toxic 5-fluorouracil (an important anticancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques. Therefore, in view of the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques for the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The review reveals that implementation of molecularly imprinted polymers as solid-phase materials for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates problems associated with tedious separation techniques, owing to protein binding and drastic interferences, in complex real-sample matrices such as blood plasma and serum.

  3. Analytical Pyrolysis-Chromatography: Something Old, Something New

    ERIC Educational Resources Information Center

    Bower, Nathan W.; Blanchet, Conor J. K.

    2010-01-01

    Despite a long history of use across multiple disciplines, analytical pyrolysis is rarely taught in undergraduate curricula. We briefly review some interesting applications and discuss the three types of analytical pyrolyzers available commercially. We also describe a low-cost alternative that can be used to teach the basic principles of…

  4. Adaptive steganography

    NASA Astrophysics Data System (ADS)

    Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.

    2002-04-01

    Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound on the steganographic capacity of LSB-based steganographic techniques for a given probability of false detection. In this paper we look at adaptive steganographic techniques, which take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique, demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
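    As a toy illustration of content-adaptive embedding (not the authors' scheme; the function names and threshold are hypothetical), one can restrict LSB embedding to "busy" pixels, computing the selection rule on LSB-masked values so the extractor can recompute exactly the same pixel set:

    ```python
    import numpy as np

    def _busy(a, b, threshold):
        # Selection rule computed on LSB-masked values, so embedding
        # (which only flips LSBs) cannot change which pixels qualify.
        return abs(int(a & 0xFE) - int(b & 0xFE)) >= threshold

    def adaptive_lsb_embed(img, bits, threshold=8):
        """Embed bits only in pixels whose masked difference from the left
        neighbour is large; smooth regions, where LSB changes are most
        detectable, are left untouched."""
        flat = img.copy().ravel()
        k = 0
        for i in range(1, flat.size):
            if k == len(bits):
                break
            if _busy(flat[i], flat[i - 1], threshold):
                flat[i] = (flat[i] & 0xFE) | bits[k]
                k += 1
        return flat.reshape(img.shape), k

    def adaptive_lsb_extract(stego, n, threshold=8):
        """Recover n bits by replaying the same selection rule."""
        flat = stego.ravel()
        out = []
        for i in range(1, flat.size):
            if len(out) == n:
                break
            if _busy(flat[i], flat[i - 1], threshold):
                out.append(int(flat[i]) & 1)
        return out

    rng = np.random.default_rng(7)
    cover = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
    msg = [1, 0, 1, 1, 0, 0, 1, 0]
    stego, embedded = adaptive_lsb_embed(cover, msg)
    recovered = adaptive_lsb_extract(stego, embedded)
    ```

    The masked-difference rule is what makes blind extraction possible here: flipping LSBs during embedding cannot alter which pixels are selected.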

  5. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. he WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...

  6. Insights from two industrial hygiene pilot e-cigarette passive vaping studies.

    PubMed

    Maloney, John C; Thompson, Michael K; Oldham, Michael J; Stiff, Charles L; Lilly, Patrick D; Patskan, George J; Shafer, Kenneth H; Sarkar, Mohamadi A

    2016-01-01

    While several reports have been published using research methods to estimate exposure risk to e-cigarette vapors in nonusers, only two have directly measured indoor air concentrations from vaping using validated industrial hygiene sampling methodology. Our first study was designed to measure indoor air concentrations of nicotine, menthol, propylene glycol, glycerol, and total particulates during the use of multiple e-cigarettes in a well-characterized room over a period of time. Our second study was a repeat of the first study, and it also evaluated levels of formaldehyde. Measurements were collected using active sampling, near-real-time, and direct measurement techniques. Air sampling incorporated industrial hygiene sampling methodology using analytical methods established by the National Institute for Occupational Safety and Health and the Occupational Safety and Health Administration. Active samples were collected over a 12-hr period, for 4 days. Background measurements were taken in the same room the day before and the day after vaping. Panelists (n = 185 Study 1; n = 145 Study 2) used menthol and non-menthol MarkTen prototype e-cigarettes. Vaping sessions (six, 1-hr) included 3 prototypes, with the total number of puffs ranging from 36-216 per session. Results of the active samples were below the limit of quantitation of the analytical methods. Near-real-time data were below the lowest concentration on the established calibration curves. Data from this study indicate that the majority of chemical constituents sampled were below quantifiable levels. Formaldehyde was detected at consistent levels during all sampling periods. These two studies found that indoor vaping of the MarkTen prototype e-cigarette does not produce chemical constituents at quantifiable levels above background using standard industrial hygiene collection techniques and analytical methods.

  7. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and they are examined here for how well analyst opportunities match recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  8. An overview of the characterization of occupational exposure to nanoaerosols in workplaces

    NASA Astrophysics Data System (ADS)

    Castellano, Paola; Ferrante, Riccardo; Curini, Roberta; Canepari, Silvia

    2009-05-01

    Currently, there is a lack of standardized sampling and metric methods that can be applied to measure the level of exposure to nanosized aerosols. Therefore, any attempt to characterize exposure to nanoparticles (NP) in a workplace must involve a multifaceted approach characterized by different sampling and analytical techniques to measure all relevant characteristics of NP exposure. Furthermore, as NP aerosols are always complex mixtures of multiple origins, sampling and analytical methods need to be improved to selectively evaluate the apportionment from specific sources to the final nanomaterials. An open question worldwide is how to relate specific toxic effects of NP to one or more of several different parameters (such as particle size, mass, composition, surface area, number concentration, aggregation or agglomeration state, water solubility, and surface chemistry). As the evaluation of occupational exposure to NP in workplaces needs dimensional and chemical characterization, the main problem is the choice of the sampling and dimensional separation techniques. Therefore, a convenient approach to allow a satisfactory risk assessment could be the concurrent use of different sampling and measuring techniques for particles with known toxicity in selected workplaces. Despite the lack of specific NP exposure limit values, exposure metrics appropriate to nanoaerosols are discussed in the Technical Report ISO/TR 27628:2007 with the aim of enabling occupational hygienists to characterize and monitor nanoaerosols in workplaces. Moreover, NIOSH has developed the document "Approaches to Safe Nanotechnology" (intended as an information exchange with NIOSH) in order to address current and future research needs for understanding the potential risks that nanotechnology may pose to workers.

  9. Clearance of the cervical spine in clinically unevaluable trauma patients.

    PubMed

    Halpern, Casey H; Milby, Andrew H; Guo, Wensheng; Schuster, James M; Gracias, Vicente H; Stein, Sherman C

    2010-08-15

    Meta-analytic cost-effectiveness analysis. Our goal was to compare the results of different management strategies for trauma patients in whom the cervical spine was not clinically evaluable due to impaired consciousness, endotracheal intubation, or painful distracting injuries. We performed a structured literature review related to cervical spine trauma, radiographic clearance techniques (plain radiography, flexion/extension, CT, and MRI), and complications associated with semirigid collar use. Meta-analytic techniques were used to pool data from multiple sources to calculate pooled mean estimates of the sensitivities and specificities of imaging techniques for cervical spinal clearance, as well as rates of complications from various clearance strategies and from empirical use of semirigid collars. A decision analysis model was used to compare outcomes and costs among these strategies. Slightly more than 7.5% of patients who are clinically unevaluable have cervical spine injuries, and 42% of these injuries are associated with spinal instability. The sensitivity of plain radiography or fluoroscopy for spinal clearance was 57% (95% CI: 57%-60%). Sensitivities for CT and MRI alone were 83% (82%-84%) and 87% (84%-89%), respectively. Complications associated with collar use ranged from 1.3% (2 days) to 7.1% (10 days) but were usually minor and short-lived. Quadriplegia resulting from spinal instability missed by a clearance test had enormous impacts on longevity, quality of life, and costs. These impacts overshadowed the effects of prolonged collar application, even when the incidence of quadriplegia was extremely low. As currently used, neuroimaging studies for cervical spinal clearance in clinically unevaluable patients are not cost-effective compared with empirical immobilization in a semirigid collar.
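    The pooling step can be sketched, under a simple fixed-effect assumption rather than the full meta-analytic machinery of the paper, as total true positives over total injured patients with a Wald confidence interval. The per-study counts below are hypothetical:

    ```python
    import math

    def pooled_sensitivity(true_pos, injured):
        """Fixed-effect pooled sensitivity across studies: total true
        positives over total injured patients, with a Wald 95% CI.
        (A simpler estimator than random-effects meta-analysis.)"""
        p = sum(true_pos) / sum(injured)
        n = sum(injured)
        se = math.sqrt(p * (1.0 - p) / n)
        return p, (p - 1.96 * se, p + 1.96 * se)

    # Hypothetical per-study counts for one imaging modality
    sens, ci = pooled_sensitivity(true_pos=[40, 52, 33], injured=[70, 90, 60])
    ```

    Real meta-analyses typically weight studies and model between-study heterogeneity (e.g., random effects); this sketch only shows the basic pooling idea behind the reported pooled sensitivities.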

  10. Quantitative diagnosis and prognosis framework for concrete degradation due to alkali-silica reaction

    NASA Astrophysics Data System (ADS)

    Mahadevan, Sankaran; Neal, Kyle; Nath, Paromita; Bao, Yanqing; Cai, Guowei; Orme, Peter; Adams, Douglas; Agarwal, Vivek

    2017-02-01

    This research is seeking to develop a probabilistic framework for health diagnosis and prognosis of aging concrete structures in nuclear power plants that are subjected to physical, chemical, environmental, and mechanical degradation. The proposed framework consists of four elements: monitoring, data analytics, uncertainty quantification, and prognosis. The current work focuses on degradation caused by ASR (alkali-silica reaction). Controlled concrete specimens with reactive aggregate are prepared to develop accelerated ASR degradation. Different monitoring techniques — infrared thermography, digital image correlation (DIC), mechanical deformation measurements, nonlinear impact resonance acoustic spectroscopy (NIRAS), and vibro-acoustic modulation (VAM) — are studied for ASR diagnosis of the specimens. Both DIC and mechanical measurements record the specimen deformation caused by ASR gel expansion. Thermography is used to compare the thermal response of pristine and damaged concrete specimens and generate a 2-D map of the damage (i.e., ASR gel and cracked area), thus facilitating localization and quantification of damage. NIRAS and VAM are two separate vibration-based techniques that detect nonlinear changes in dynamic properties caused by the damage. The diagnosis results from multiple techniques are then fused using a Bayesian network, which also helps to quantify the uncertainty in the diagnosis. Prognosis of ASR degradation is then performed based on the current state of degradation obtained from diagnosis, by using a coupled thermo-hydro-mechanical-chemical (THMC) model for ASR degradation. This comprehensive approach of monitoring, data analytics, and uncertainty-quantified diagnosis and prognosis will facilitate the development of a quantitative, risk-informed framework that will support continuous assessment and risk management of structural health and performance.

  11. Resonance Ionization, Mass Spectrometry.

    ERIC Educational Resources Information Center

    Young, J. P.; And Others

    1989-01-01

    Discussed is an analytical technique that uses photons from lasers to resonantly excite an electron from some initial state of a gaseous atom through various excited states of the atom or molecule. Described are the apparatus, some analytical applications, and the precision and accuracy of the technique. Lists 26 references. (CW)

  12. Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods

    ERIC Educational Resources Information Center

    Zhang, Ying

    2011-01-01

    Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…

  13. Turbine blade tip durability analysis

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.

    1981-01-01

    An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle, and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.

  14. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.

    We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.

  15. Analytical modeling of large amplitude free vibration of non-uniform beams carrying a both transversely and axially eccentric tip mass

    NASA Astrophysics Data System (ADS)

    Malaeke, Hasan; Moeenfard, Hamid

    2016-03-01

    The objective of this paper is to study large amplitude flexural-extensional free vibration of non-uniform cantilever beams carrying a both transversely and axially eccentric tip mass. The effect of variable axial force is also taken into account. Hamilton's principle is utilized to obtain the partial differential equations governing the nonlinear vibration of the system as well as the corresponding boundary conditions. A numerical finite difference scheme is proposed to find the natural frequencies and mode shapes of the system, which is validated specifically for a beam with linearly varying cross section. Using a single-mode approximation in conjunction with the Lagrange method, the governing equations are reduced to a set of two nonlinear ordinary differential equations in terms of the end displacement components of the beam, which are coupled due to the presence of the transverse eccentricity. These coupled temporal equations are then solved analytically using the multiple time scales perturbation technique. The obtained analytical results are compared with the numerical ones and excellent agreement is observed. The qualitative and quantitative knowledge resulting from this research is expected to enable the study of the effects of an eccentric tip mass and non-uniformity on the large amplitude flexural-extensional vibration of beams for improved dynamic performance.

  16. Gradient elution moving boundary electrophoresis enables rapid analysis of acids in complex biomass-derived streams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Munson, Matthew S.; Karp, Eric M.; Nimlos, Claire T.

    Biomass conversion processes such as pretreatment, liquefaction, and pyrolysis often produce complex mixtures of intermediates that are a substantial challenge to analyze rapidly and reliably. To characterize these streams more comprehensively and efficiently, new techniques are needed to track species through biomass deconstruction and conversion processes. Here, we present the application of an emerging analytical method, gradient elution moving boundary electrophoresis (GEMBE), to quantify a suite of acids in complex, biomass-derived streams from alkaline pretreatment of corn stover. GEMBE offers distinct advantages over common chromatography-spectrometry analytical approaches in terms of analysis time, sample preparation requirements, and cost of equipment. As demonstrated here, GEMBE is able to track 17 distinct compounds (oxalate, formate, succinate, malate, acetate, glycolate, protocatechuate, 3-hydroxypropanoate, lactate, glycerate, 2-hydroxybutanoate, 4-hydroxybenzoate, vanillate, p-coumarate, ferulate, sinapate, and acetovanillone). The lower limit of detection was compound dependent and ranged between 0.9 and 3.5 μmol/L. Results from GEMBE were similar to recent results from an orthogonal method based on GC×GC-TOF/MS. Altogether, GEMBE offers a rapid, robust approach to analyze complex biomass-derived samples, and given the ease and convenience of deployment, may offer an analytical solution for online tracking of multiple types of biomass streams.

  17. Failure of Standard Training Sets in the Analysis of Fast-Scan Cyclic Voltammetry Data.

    PubMed

    Johnson, Justin A; Rodeberg, Nathan T; Wightman, R Mark

    2016-03-16

    The use of principal component regression, a multivariate calibration method, in the analysis of in vivo fast-scan cyclic voltammetry data allows for separation of overlapping signal contributions, permitting evaluation of the temporal dynamics of multiple neurotransmitters simultaneously. To accomplish this, the technique relies on information about current-concentration relationships across the scan-potential window gained from analysis of training sets. The ability of the constructed models to resolve analytes depends critically on the quality of these data. Recently, the use of standard training sets obtained under conditions other than those of the experimental data collection (e.g., with different electrodes, animals, or equipment) has been reported. This study evaluates the analyte resolution capabilities of models constructed using this approach from both a theoretical and experimental viewpoint. A detailed discussion of the theory of principal component regression is provided to inform this discussion. The findings demonstrate that the use of standard training sets leads to misassignment of the current-concentration relationships across the scan-potential window. This directly results in poor analyte resolution and, consequently, inaccurate quantitation, which may lead to erroneous conclusions being drawn from experimental data. Thus, it is strongly advocated that training sets be obtained under the experimental conditions to allow for accurate data analysis.
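    A minimal sketch of the underlying calibration technique, principal component regression, on noiseless synthetic voltammograms (all names and data here are illustrative assumptions, not the authors' code): principal components are extracted from the centred training voltammograms, and concentrations are regressed on the component scores.

    ```python
    import numpy as np

    def pcr_fit(V, C, k):
        """Fit PCR: V (n_samples x n_potentials) training voltammograms,
        C (n_samples x n_analytes) known concentrations, k components kept."""
        muV, muC = V.mean(axis=0), C.mean(axis=0)
        # principal components from the SVD of the centred voltammograms
        _, _, Vt = np.linalg.svd(V - muV, full_matrices=False)
        P = Vt[:k].T                        # loadings (n_potentials x k)
        T = (V - muV) @ P                   # scores
        B, *_ = np.linalg.lstsq(T, C - muC, rcond=None)  # scores -> conc.
        return P, B, muV, muC

    def pcr_predict(V, P, B, muV, muC):
        return (V - muV) @ P @ B + muC

    # Noiseless synthetic check: two analytes with fixed current profiles
    rng = np.random.default_rng(0)
    S = rng.normal(size=(2, 50))            # per-analyte voltammogram shapes
    C_train = rng.uniform(0.5, 2.0, size=(10, 2))
    V_train = C_train @ S                   # currents linear in concentration
    P, B, muV, muC = pcr_fit(V_train, C_train, k=2)
    C_new = np.array([[1.0, 1.5]])
    pred = pcr_predict(C_new @ S, P, B, muV, muC)  # recovers [[1.0, 1.5]]
    ```

    Because the synthetic data are noiseless and exactly rank-2, prediction is exact; with real voltammograms, the match between training-set conditions and experimental conditions drives accuracy, which is precisely the failure mode the paper documents for standard training sets.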

  18. New robust bilinear least squares method for the analysis of spectral-pH matrix data.

    PubMed

    Goicoechea, Héctor C; Olivieri, Alejandro C

    2005-07-01

    A new second-order multivariate method has been developed for the analysis of spectral-pH matrix data, based on a bilinear least-squares (BLLS) model that achieves the second-order advantage and handles multiple calibration standards. A simulated Monte Carlo study of synthetic absorbance-pH data allowed comparison of the newly proposed BLLS methodology with constrained parallel factor analysis (PARAFAC) and with the combined multivariate curve resolution-alternating least-squares (MCR-ALS) technique under different conditions of sample-to-sample pH mismatch and analyte-background ratio. The results indicate an improved prediction ability for the new method. Experimental data generated by measuring absorption spectra of several calibration standards of ascorbic acid and samples of orange juice were subjected to second-order calibration analysis with PARAFAC, MCR-ALS, and the new BLLS method. The results indicate that the latter method provides the best analytical results with regard to analyte recovery in samples of complex composition requiring strict adherence to the second-order advantage. Linear dependencies appear when multivariate data are produced using the pH or a reaction time as one of the data dimensions, posing a challenge to classical multivariate calibration models. The algorithm discussed here is useful for such systems.

  19. Exploratory Bifactor Analysis: The Schmid-Leiman Orthogonalization and Jennrich-Bentler Analytic Rotations

    PubMed Central

    Mansolf, Maxwell; Reise, Steven P.

    2017-01-01

    Analytic bifactor rotations (Jennrich & Bentler, 2011, 2012) have been recently developed and made generally available, but are not well understood. The Jennrich-Bentler analytic bifactor rotations (bi-quartimin and bi-geomin) are an alternative to, and arguably an improvement upon, the less technically sophisticated Schmid-Leiman orthogonalization (Schmid & Leiman, 1957). We review the technical details that underlie the Schmid-Leiman and Jennrich-Bentler bifactor rotations, using simulated data structures to illustrate important features and limitations. For the Schmid-Leiman, we review the problem of inaccurate parameter estimates caused by the linear dependencies, sometimes called “proportionality constraints,” that are required to expand a p correlated factors solution into a (p+1) (bi)factor space. We also review the complexities involved when the data depart from perfect cluster structure (e.g., item cross-loading on group factors). For the Jennrich-Bentler rotations, we describe problems in parameter estimation caused by departures from perfect cluster structure. In addition, we illustrate the related problems of: (a) solutions that are not invariant under different starting values (i.e., local minima problems); and, (b) group factors collapsing onto the general factor. Recommendations are made for substantive researchers including examining all local minima and applying multiple exploratory techniques in an effort to identify an accurate model. PMID:27612521

  20. Gradient elution moving boundary electrophoresis enables rapid analysis of acids in complex biomass-derived streams

    DOE PAGES

    Munson, Matthew S.; Karp, Eric M.; Nimlos, Claire T.; ...

    2016-09-27

    Biomass conversion processes such as pretreatment, liquefaction, and pyrolysis often produce complex mixtures of intermediates that are a substantial challenge to analyze rapidly and reliably. To characterize these streams more comprehensively and efficiently, new techniques are needed to track species through biomass deconstruction and conversion processes. Here, we present the application of an emerging analytical method, gradient elution moving boundary electrophoresis (GEMBE), to quantify a suite of acids in complex, biomass-derived streams from alkaline pretreatment of corn stover. GEMBE offers distinct advantages over common chromatography-spectrometry analytical approaches in terms of analysis time, sample preparation requirements, and cost of equipment. As demonstrated here, GEMBE is able to track 17 distinct compounds (oxalate, formate, succinate, malate, acetate, glycolate, protocatechuate, 3-hydroxypropanoate, lactate, glycerate, 2-hydroxybutanoate, 4-hydroxybenzoate, vanillate, p-coumarate, ferulate, sinapate, and acetovanillone). The lower limit of detection was compound dependent and ranged between 0.9 and 3.5 μmol/L. Results from GEMBE were similar to recent results from an orthogonal method based on GC×GC-TOF/MS. Altogether, GEMBE offers a rapid, robust approach to analyze complex biomass-derived samples, and given the ease and convenience of deployment, may offer an analytical solution for online tracking of multiple types of biomass streams.

  1. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE PAGES

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.; ...

    2016-02-01

    We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.

  2. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  3. An analytical and experimental evaluation of a Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. A.; Cosby, R. M.

    1976-01-01

    An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range was performed. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile-spreading problem and should enable improved analytical-experimental correlation.

  4. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data Analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of requirements for data analytics tools/techniques that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  5. Net growth rate of continuum heterogeneous biofilms with inhibition kinetics.

    PubMed

    Gonzo, Elio Emilio; Wuertz, Stefan; Rajal, Veronica B

    2018-01-01

    Biofilm systems can be modeled using a variety of analytical and numerical approaches, usually by making simplifying assumptions regarding biofilm heterogeneity and activity as well as effective diffusivity. Inhibition kinetics, albeit common in experimental systems, are rarely considered, and analytical approaches are either lacking or assume that the effective diffusivity of the substrate and the biofilm density remain constant. To address this knowledge gap, an analytical procedure to estimate the effectiveness factor (dimensionless substrate mass flux at the biofilm-fluid interface) was developed for a continuum heterogeneous biofilm, with kinetics ranging from limiting-substrate Monod kinetics to different types of inhibition kinetics. The simple perturbation technique, previously validated to quantify biofilm activity, was applied to systems where either the substrate or the inhibitor is the limiting component, and to cases where the inhibitor is a reaction product or the substrate itself acts as the inhibitor. Explicit analytical equations are presented for estimating the effectiveness factor and, therefore, for calculating the biomass growth rate or the limiting substrate/inhibitor consumption rate, for a given biofilm thickness. The robustness of the new biofilm model was tested using kinetic parameters experimentally determined for the growth of Pseudomonas putida CCRC 14365 on phenol. Several additional cases were analyzed, including examples where the effectiveness factor can reach values greater than unity, characteristic of systems with inhibition kinetics. Criteria to establish when the effectiveness factor can exceed unity in each of the cases studied are also presented.
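    Why can the effectiveness factor exceed unity under inhibition kinetics? With substrate inhibition, the local reaction rate is non-monotonic in substrate concentration, so a diffusion-lowered concentration inside the biofilm can yield a *higher* local rate than at the surface. A sketch using the classic Haldane substrate-inhibition form (illustrative parameter values, not the fitted P. putida/phenol constants):

```python
def haldane_rate(s, mu_max=0.5, ks=2.0, ki=50.0):
    """Specific growth rate with substrate inhibition (Haldane kinetics):
    mu = mu_max * s / (ks + s + s^2/ki). Parameters are illustrative."""
    return mu_max * s / (ks + s + s ** 2 / ki)

# The rate rises, peaks near s* = sqrt(ks * ki) = 10, then falls again.
# If the bulk substrate level sits on the falling branch (s = 100 here),
# diffusion into the biofilm moves s toward the peak and raises the rate,
# which is exactly when the effectiveness factor can exceed unity.
rates = [haldane_rate(s) for s in (1.0, 10.0, 100.0)]
```

The same reasoning explains why the criteria in the paper depend on where the interface concentration sits relative to the rate maximum.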

  6. Transcutaneous Measurement of Blood Analyte Concentration Using Raman Spectroscopy

    NASA Astrophysics Data System (ADS)

    Barman, Ishan; Singh, Gajendra P.; Dasari, Ramachandra R.; Feld, Michael S.

    2008-11-01

    Diabetes mellitus is a chronic disorder, affecting nearly 200 million people worldwide. Acute complications, such as hypoglycemia, cardiovascular disease and retinal damage, may occur if the disease is not adequately controlled. As diabetes has no known cure, tight control of glucose levels is critical for the prevention of such complications. Given the necessity for regular monitoring of blood glucose, development of non-invasive glucose detection devices is essential to improve the quality of life in diabetic patients. The commercially available glucose sensors measure the interstitial fluid glucose by electrochemical detection. However, these sensors have severe limitations, primarily related to their invasive nature and lack of stability. This necessitates the development of a truly non-invasive glucose detection technique. NIR Raman spectroscopy, which combines the substantial penetration depth of NIR light with the excellent chemical specificity of Raman spectroscopy, provides an excellent tool to meet the challenges involved. Additionally, it enables simultaneous determination of multiple blood analytes. Our laboratory has pioneered the use of Raman spectroscopy for blood analyte detection in biological media. The preliminary success of our non-invasive glucose measurements both in vitro (such as in serum and blood) and in vivo has provided the foundation for the development of feasible clinical systems. However, successful application of this technology still faces a few hurdles, highlighted by the problems of tissue luminescence and selection of appropriate reference concentration. In this article we explore possible avenues to overcome these challenges so that prospective prediction accuracy of blood analytes can be brought to clinically acceptable levels.

  7. Ultra-sensitive fluorescent imaging-biosensing using biological photonic crystals

    NASA Astrophysics Data System (ADS)

    Squire, Kenny; Kong, Xianming; Wu, Bo; Rorrer, Gregory; Wang, Alan X.

    2018-02-01

    Optical biosensing is a growing area of research known for its low limits of detection. Among optical sensing techniques, fluorescence detection is one of the most established and prevalent. Fluorescence imaging is an optical biosensing modality that exploits the sensitivity of fluorescence in an easy-to-use process: a user places a sample on a sensor and uses an imager, such as a camera, to collect the results. The image can then be processed to determine the presence of the analyte. Fluorescence imaging is appealing because it can be performed with as little as a light source, a camera, and a data processor, making it ideal for untrained personnel and requiring no expensive equipment. Fluorescence imaging sensors generally employ an immunoassay procedure to selectively trap analytes such as antigens or antibodies. When the analyte is present, the sensor fluoresces, transducing the chemical reaction into an optical signal suitable for imaging. Enhancing this fluorescence improves the detection capabilities of the sensor. Diatoms are unicellular algae with a biosilica shell called a frustule. The frustule is porous with periodic nanopores, making diatoms biological photonic crystals. Additionally, the porous nature of the frustule provides a large surface area with multiple analyte binding sites. In this paper, we fabricate a diatom-based ultra-sensitive fluorescence imaging biosensor capable of detecting the antibody mouse immunoglobulin down to a concentration of 1 nM. The measured signal shows a 6× enhancement compared to sensors fabricated without diatoms.

  8. Direct tandem mass spectrometry for the simultaneous assay of opioids, cocaine and metabolites in dried urine spots.

    PubMed

    Otero-Fernández, Mara; Cocho, José Ángel; Tabernero, María Jesús; Bermejo, Ana María; Bermejo-Barrera, Pilar; Moreda-Piñeiro, Antonio

    2013-06-19

    A micro-analytical method based on spotting urine samples (20 μL) onto blood/urine spot collection cards followed by air-drying and extraction (dried urine spot, DUS) was developed and validated for the screening/confirmation assay of morphine, 6-monoacetylmorphine (6-MAM), codeine, cocaine and benzoylecgonine (BZE). Acetonitrile (3 mL) was found to be a useful solvent for target extraction from DUSs under orbital-horizontal stirring at 180 rpm for 10 min. Determinations were performed by direct electrospray ionization tandem mass spectrometry (ESI-MS/MS) under positive electrospray ionization conditions, using multiple reaction monitoring (MRM) with one precursor ion/product ion transition for the identification and quantification of each analyte (deuterated analogs of each target as internal standards). The limits of detection of the method were 0.26, 0.94, 1.5, 1.1, and 2.0 ng/mL for cocaine, BZE, codeine, morphine and 6-MAM, respectively; relative standard deviations of intra- and inter-day precision were lower than 8% and 11%, respectively, and intra- and inter-day analytical recoveries ranged from 94 ± 4% to 105 ± 3%. The small volume of urine required (20 μL), combined with the simplicity of the analytical technique, makes it a useful procedure for screening/quantifying drugs of abuse. The method was successfully applied to the analysis of urine from polydrug abusers. Copyright © 2013 Elsevier B.V. All rights reserved.
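    Quantification against deuterated internal standards, as used above, converts the analyte/internal-standard peak-area ratio to a concentration via a calibration line. A minimal sketch (the function shape is the standard internal-standard approach; the slope, intercept, and peak areas below are invented, not values from this study):

```python
def quantify(area_analyte, area_istd, slope, intercept=0.0):
    """Internal-standard quantification: the analyte/deuterated-IS peak-area
    ratio is mapped to concentration through a calibration line
    ratio = slope * conc + intercept."""
    ratio = area_analyte / area_istd
    return (ratio - intercept) / slope

# Hypothetical MRM peak areas for an analyte and its deuterated analog,
# with a hypothetical calibration slope in (area ratio) per (ng/mL).
conc = quantify(area_analyte=15000, area_istd=30000, slope=0.01)  # ng/mL
```

Because the deuterated analog experiences the same extraction losses and ionization suppression as the analyte, the ratio cancels much of the run-to-run variability, which is what makes direct ESI-MS/MS without chromatography workable here.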

  9. Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi

    2016-10-01

    The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/ml. However, quantifying analytes that are related to various stages of tumors, including early detection, requires detecting well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml, and the prostate-specific antigen level in the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/ml of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumor, it is required to quantify various levels of analytes ranging from ng/ml to pg/ml. To accommodate these critical needs in current diagnosis, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes (

  10. Maternal factors predicting cognitive and behavioral characteristics of children with fetal alcohol spectrum disorders.

    PubMed

    May, Philip A; Tabachnick, Barbara G; Gossage, J Phillip; Kalberg, Wendy O; Marais, Anna-Susan; Robinson, Luther K; Manning, Melanie A; Blankenship, Jason; Buckley, David; Hoyme, H Eugene; Adnams, Colleen M

    2013-06-01

    To provide an analysis of multiple predictors of cognitive and behavioral traits for children with fetal alcohol spectrum disorders (FASDs), multivariate correlation techniques were used with maternal and child data from epidemiologic studies in a community in South Africa. Data on 561 first-grade children with fetal alcohol syndrome (FAS), partial FAS (PFAS), or no FASD and their mothers were analyzed by grouping 19 maternal variables into categories (physical, demographic, childbearing, and drinking) and used in structural equation models (SEMs) to assess correlates of child intelligence (verbal and nonverbal) and behavior. A first SEM using only 7 maternal alcohol use variables to predict cognitive/behavioral traits was statistically significant (B = 3.10, p < .05) but explained only 17.3% of the variance. The second model incorporated multiple maternal variables and was statistically significant, explaining 55.3% of the variance. Significantly correlated with low intelligence and problem behavior were demographic variables (B = 3.83, p < .05) (low maternal education, low socioeconomic status [SES], and rural residence) and maternal physical characteristics (B = 2.70, p < .05) (short stature, small head circumference, and low weight). Childbearing history and alcohol use composites were not statistically significant in the final complex model and were overpowered by SES and maternal physical traits. Although other analytic techniques have amply demonstrated the negative effects of maternal drinking on intelligence and behavior, this highly controlled analysis of multiple maternal influences reveals that maternal demographics and physical traits make a significant enabling or disabling contribution to child functioning in FASD.

  11. The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.

    2015-12-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributive tasks on large set of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background on high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework into Python-based eco-systems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. 
Such workflows will be distributed across multiple sites - according to the datasets distribution - and will include intercomparison, ensemble, and outlier analysis. The two-level workflow solution envisioned in INDIGO (coarse grain for distributed tasks orchestration, and fine grain, at the level of a single data analytics cluster instance) will be presented and discussed.

  12. Analytical multiple scattering correction to the Mie theory: Application to the analysis of the lidar signal

    NASA Technical Reports Server (NTRS)

    Flesia, C.; Schwendimann, P.

    1992-01-01

    The contribution of multiple scattering to the lidar signal depends on the optical depth tau. Therefore, lidar analysis based on the assumption that multiple scattering can be neglected is limited to cases characterized by low values of the optical depth (tau less than or equal to 0.1), and hence excludes scattering from most clouds. Moreover, all inversion methods relating the lidar signal to number densities and particle sizes must be modified, since multiple scattering affects the direct analysis. The essential requirements for a realistic model of lidar measurements that includes multiple scattering and can be applied to practical situations are as follows. (1) What is needed is not only a correction term or a rough approximation describing the results of a particular experiment, but a general theory of multiple scattering tying together the relevant physical parameters we seek to measure. (2) An analytical generalization of the lidar equation is needed that can be applied in the case of a realistic aerosol. A purely analytical formulation is important in order to avoid the convergence and stability problems which, in a numerical approach, arise from the large number of events that must be taken into account in the presence of large optical depth and/or strong experimental noise.
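    For reference, the single-scattering lidar equation that such a generalization must extend can be written in a standard form (symbol conventions here are the usual ones, not necessarily the authors'):

```latex
P(z) = \frac{C}{z^{2}}\,\beta(z)\,
       \exp\!\left(-2\int_{0}^{z}\alpha(z')\,dz'\right),
\qquad
\tau(z) = \int_{0}^{z}\alpha(z')\,dz',
```

where $P(z)$ is the received power from range $z$, $C$ a system constant, $\beta$ the backscatter coefficient, and $\alpha$ the extinction coefficient. Multiple-scattering contributions add terms that grow with the optical depth $\tau$, which is why the single-scattering inversion breaks down beyond $\tau \approx 0.1$.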

  13. Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community

    DTIC Science & Technology

    2016-01-01

    Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the...more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement

  14. 40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  15. 40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  16. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  17. Space debris, asteroids and satellite orbits; Proceedings of the Fourth and Thirteenth Workshops, Graz, Austria, June 25-July 7, 1984

    NASA Technical Reports Server (NTRS)

    Kessler, D. J.; Gruen, E.; Sehnal, L.

    1985-01-01

    The workshops covered a variety of topics relevant to the identification, characterization and monitoring of near-earth solar system debris. Attention was given to man-made and naturally occurring microparticles, their hazards to present and future spacecraft, and ground- and space-based techniques for tracking both large and small debris. The studies are extended to solid fuel particulates in circular space. Asteroid rendezvous missions are discussed, including propulsion and instrumentation options, the possibility of encountering asteroids during Hohmann transfer flights to Venus and/or Mars, and the benefits of multiple encounters by one spacecraft. Finally, equipment and analytical models for generating precise satellite orbits are reviewed.

  18. A note on improved F-expansion method combined with Riccati equation applied to nonlinear evolution equations.

    PubMed

    Islam, Md Shafiqul; Khan, Kamruzzaman; Akbar, M Ali; Mastroberardino, Antonio

    2014-10-01

    The purpose of this article is to present an analytical method, namely the improved F-expansion method combined with the Riccati equation, for finding exact solutions of nonlinear evolution equations. The present method is capable of calculating all branches of solutions simultaneously, even if multiple solutions are very close and thus difficult to distinguish with numerical techniques. To verify the computational efficiency, we consider the modified Benjamin-Bona-Mahony equation and the modified Korteweg-de Vries equation. Our results reveal that the method is a very effective and straightforward way of formulating the exact travelling wave solutions of nonlinear wave equations arising in mathematical physics and engineering.
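    For context, the auxiliary Riccati equation underlying F-expansion-type methods is commonly normalized as below, with well-known solution branches depending on the sign of the parameter $\sigma$ (the paper's exact normalization may differ):

```latex
\varphi'(\xi) = \sigma + \varphi^{2}(\xi), \qquad
\varphi(\xi) =
\begin{cases}
-\sqrt{-\sigma}\,\tanh\!\left(\sqrt{-\sigma}\,\xi\right), & \sigma < 0,\\[2pt]
-\sqrt{-\sigma}\,\coth\!\left(\sqrt{-\sigma}\,\xi\right), & \sigma < 0,\\[2pt]
-\dfrac{1}{\xi}, & \sigma = 0,\\[2pt]
\sqrt{\sigma}\,\tan\!\left(\sqrt{\sigma}\,\xi\right), & \sigma > 0.
\end{cases}
```

Substituting a truncated series in $\varphi$ into the evolution equation and balancing powers of $\varphi$ yields the tanh-, coth- and tan-type travelling-wave solutions simultaneously, which is how the method captures all solution branches at once.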

  19. Performance and capacity analysis of Poisson photon-counting based Iter-PIC OCDMA systems.

    PubMed

    Li, Lingbin; Zhou, Xiaolin; Zhang, Rong; Zhang, Dingchen; Hanzo, Lajos

    2013-11-04

    In this paper, an iterative parallel interference cancellation (Iter-PIC) technique is developed for optical code-division multiple-access (OCDMA) systems relying on shot-noise-limited Poisson photon-counting reception. The novel semi-analytical tool of extrinsic information transfer (EXIT) charts is used for analysing both the bit error rate (BER) performance and the channel capacity of these systems, and the results are verified by Monte Carlo simulations. The proposed Iter-PIC OCDMA system is capable of achieving a two-orders-of-magnitude BER improvement and a 0.1 nat capacity improvement over conventional chip-level OCDMA systems at a coding rate of 1/10.

  20. A note on improved F-expansion method combined with Riccati equation applied to nonlinear evolution equations

    PubMed Central

    Islam, Md. Shafiqul; Khan, Kamruzzaman; Akbar, M. Ali; Mastroberardino, Antonio

    2014-01-01

    The purpose of this article is to present an analytical method, namely the improved F-expansion method combined with the Riccati equation, for finding exact solutions of nonlinear evolution equations. The present method is capable of calculating all branches of solutions simultaneously, even if multiple solutions are very close and thus difficult to distinguish with numerical techniques. To verify the computational efficiency, we consider the modified Benjamin–Bona–Mahony equation and the modified Korteweg-de Vries equation. Our results reveal that the method is a very effective and straightforward way of formulating the exact travelling wave solutions of nonlinear wave equations arising in mathematical physics and engineering. PMID:26064530

  1. Method for analyzing the mass of a sample using a cold cathode ionization source mass filter

    DOEpatents

    Felter, Thomas E.

    2003-10-14

    An improved quadrupole mass spectrometer (QMS) is described. The improvement lies in the substitution of a cold cathode field emitter array (FEA) for the conventional hot-filament electron source, which in turn allows operating a small QMS at much higher internal pressures than are currently achievable. By eliminating the hot filament, such problems as thermally "cracking" delicate analyte molecules, outgassing of a hot filament, high power requirements, filament contamination by outgassed species, and spurious electromagnetic fields are avoided altogether. In addition, the ability to produce FEAs using well-known and well-developed photolithographic techniques permits building a QMS having multiple redundancies of the ionization source at very low additional cost.

  2. Dynamic Testing of a Subscale Sunshield for the Next Generation Space Telescope (NGST)

    NASA Technical Reports Server (NTRS)

    Lienard, Sebastien; Johnston, John D.; Ross, Brian; Smith, James; Brodeur, Steve (Technical Monitor)

    2001-01-01

    The NGST sunshield is a lightweight, flexible structure consisting of multiple layers of pretensioned, thin-film membranes supported by deployable booms. The structural dynamic behavior of the sunshield must be well understood in order to predict its influence on observatory performance. Ground tests were carried out in a vacuum environment to characterize the structural dynamic behavior of a one-tenth scale model of the sunshield. Results from the tests will be used to validate analytical modeling techniques that can be used in conjunction with scaling laws to predict the performance of the full-sized structure. This paper summarizes the ground tests and presents representative results for the dynamic behavior of the sunshield.

  3. Methodologies for Optimum Capital Expenditure Decisions for New Medical Technology

    PubMed Central

    Landau, Thomas P.; Ledley, Robert S.

    1980-01-01

    This study deals with the development of a theory and an analytical model to support decisions regarding capital expenditures for complex new medical technology. Formal methodologies and quantitative techniques developed by applied mathematicians and management scientists can be used by health planners to develop cost-effective plans for the utilization of medical technology on a community or region-wide basis. In order to maximize the usefulness of the model, it was developed and tested against multiple technologies. The types of technologies studied include capital and labor-intensive technologies, technologies whose utilization rates vary with hospital occupancy rate, technologies whose use can be scheduled, and limited-use and large-use technologies.

  4. Manual-slide-engaged paper chip for parallel SERS-immunoassay measurement of clenbuterol from swine hair.

    PubMed

    Zheng, Tingting; Gao, Zhigang; Luo, Yong; Liu, Xianming; Zhao, Weijie; Lin, Bingcheng

    2016-02-01

    Clenbuterol (CL), as a feed additive, has been banned in many countries due to its potential threat to human health. For detection of CL, a fast, low-cost technique with high accuracy and specificity would be ideal for administrative on-field inspections. Among the attempts to develop a reliable detection tool for CL, a technique that combines surface-enhanced Raman spectroscopy (SERS) with immunoassay comes close to meeting these requirements. However, multiple steps of interaction between the CL analyte, antibody, and antigen are involved in this method, and with a conventional setup the operation of a SERS immunoassay is unwieldy. In this paper, to facilitate more manageable sample manipulation for SERS-immunoassay measurement, a 3D paper chip is proposed. A switch-on-chip multilayered (abbreviated SoCM-) microfluidic paper-based analysis device (μPAD) was fabricated to provide operators with manual switches controlling the interactions between different microfluids. In addition, antigen was anchored in a pattern on a detection slip fabricated on the main body of the SoCM-μPAD. With this architecture, the multistep interactions between the CL analyte in swine hair extract, the SERS probe-modified antibody, and the antigen were managed for on-chip SERS-immunoassay detection. This approach is attractive for fast, cheap, accurate, and on-site specific detection of CL from real samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Fast targeted analysis of 132 acidic and neutral drugs and poisons in whole blood using LC-MS/MS.

    PubMed

    Di Rago, Matthew; Saar, Eva; Rodda, Luke N; Turfus, Sophie; Kotsos, Alex; Gerostamoulos, Dimitri; Drummer, Olaf H

    2014-10-01

    The aim of this study was to develop an LC-MS/MS-based screening technique that covers a broad range of acidic and neutral drugs and poisons by combining a small sample volume and an efficient extraction technique with simple automated data processing. After protein precipitation of 100 μL of whole blood, 132 common acidic and neutral drugs and poisons, including non-steroidal anti-inflammatory drugs, barbiturates, anticonvulsants, antidiabetics, muscle relaxants, diuretics and superwarfarin rodenticides (47 quantitated, 85 reported as detected), were separated using a Shimadzu Prominence HPLC system with a C18 separation column (Kinetex XB-C18, 4.6 mm × 150 mm, 5 μm), using gradient elution with a mobile phase of 25 mM ammonium acetate buffer (pH 7.5)/acetonitrile. The drugs were detected using an ABSciex® API 2000 LC-MS/MS system (ESI+ and ESI−, MRM mode, two transitions per analyte). The method was fully validated in accordance with international guidelines. Quantification data obtained using one-point calibration compared favorably to those obtained using multiple calibrants. The presented LC-MS/MS assay has proven applicable for determination of the analytes in blood. The fast and reliable extraction method combined with automated processing offers high throughput and fast turnaround times for forensic and clinical toxicology. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
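    The claim that one-point calibration compares favorably to multi-point calibration is easy to illustrate: when the response is linear through the origin, a single calibrant's response factor and a least-squares line give nearly the same answer. A sketch with invented calibration data (not values from this study):

```python
# Hypothetical calibrants: concentrations (ng/mL) and instrument responses
# (peak-area ratios), deliberately close to linear through the origin.
cal_conc = [10.0, 50.0, 100.0, 500.0]
cal_resp = [0.21, 1.02, 2.05, 10.1]

def one_point(conc, resp, sample_resp):
    """Single-calibrant response factor: conc_sample = resp_sample * conc / resp."""
    return sample_resp * conc / resp

def least_squares(xs, ys, sample_resp):
    """Ordinary least-squares line through the calibrants, then invert it."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return (sample_resp - intercept) / slope

sample_resp = 4.0  # hypothetical area ratio of an unknown
c1 = one_point(50.0, 1.02, sample_resp)
c2 = least_squares(cal_conc, cal_resp, sample_resp)
```

With these numbers the two estimates agree to within about one percent; the agreement degrades if the calibration has a significant intercept or curvature, which is why the comparison had to be verified during validation.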

  6. Novel strategy for the determination of illegal adulterants in health foods and herbal medicines using high-performance liquid chromatography with high-resolution mass spectrometry.

    PubMed

    Wang, Zhe; Wu, Caisheng; Wang, Gangli; Zhang, Qingsheng; Zhang, Jinlan

    2015-03-01

    The detection, confirmation, and quantification of multiple illegal adulterants in health foods and herbal medicines using a single analytical method are a challenge. This paper reports a new strategy to meet this challenge by employing high-performance liquid chromatography coupled with high-resolution mass spectrometry and a mass spectral tree similarity filter technique. This analytical method can rapidly collect high-resolution, high-accuracy, optionally multistage mass data for compounds in samples. After preliminary screening by retention time and high-resolution mass spectral data, known illegal adulterants can be detected. The mass spectral tree similarity filter technique is then applied to rapidly confirm these adulterants and simultaneously discover unknown ones. By using full-scan mass spectra as the stem and data-dependent subsequent-stage mass spectra to form the branches, mass spectrometry data from detected compounds are converted into mass spectral trees. Known or unknown illegal adulterants in the samples are confirmed or discovered based on the similarity between their mass spectral trees and those of the references in a library, and they are finally quantified against standard curves. This new strategy was tested using 50 samples, and the illegal adulterants were rapidly and effectively detected, confirmed and quantified. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
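    The idea of comparing mass spectral trees can be sketched with a toy similarity score: represent each tree as fragment m/z lists per MS level, bin the m/z values to a tolerance, and average a per-level set overlap. This is an illustrative stand-in; the paper's actual similarity metric and tolerances are not specified here, and all m/z values are invented:

```python
def bin_mz(mz, tol=0.01):
    # Collapse m/z values onto a grid of width `tol` so close peaks match.
    return round(mz / tol)

def tree_similarity(tree_a, tree_b, tol=0.01):
    """Toy similarity between two 'mass spectral trees', each given as
    {ms_level: [fragment m/z, ...]}: the mean Jaccard overlap of the
    binned fragment sets at each level."""
    levels = set(tree_a) | set(tree_b)
    scores = []
    for lvl in levels:
        a = {bin_mz(mz, tol) for mz in tree_a.get(lvl, [])}
        b = {bin_mz(mz, tol) for mz in tree_b.get(lvl, [])}
        scores.append(len(a & b) / len(a | b) if a | b else 1.0)
    return sum(scores) / len(scores)

# Hypothetical reference tree and a query sharing most fragments:
ref = {1: [466.98], 2: [448.97, 424.98], 3: [406.96]}
qry = {1: [466.98], 2: [448.97, 380.99], 3: [406.96]}
score = tree_similarity(ref, qry)  # high, but penalized at MS2
```

A high score against a library tree supports confirming a known adulterant; a moderate score with matching stem but divergent branches is the kind of signal that flags a structurally related, previously unknown analog.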

  7. Gravitational lensing by an ensemble of isothermal galaxies

    NASA Technical Reports Server (NTRS)

    Katz, Neal; Paczynski, Bohdan

    1987-01-01

    Calculation of 28,000 models of gravitational lensing of a distant quasar by an ensemble of randomly placed galaxies, each having a singular isothermal mass distribution, is reported. The average surface mass density was 0.2 of the critical value in all models. It is found that the surface mass density averaged over the area of the smallest circle that encompasses the multiple images is 0.82, only slightly smaller than expected from a simple analytical model of Turner et al. (1984). The probability of getting multiple images is also as large as expected analytically. Gravitational lensing is dominated by the matter in the beam; i.e., by the beam convergence. The cases where the multiple imaging is due to asymmetry in mass distribution (i.e., due to shear) are very rare. Therefore, the observed gravitational-lens candidates for which no lensing object has been detected between the images cannot be a result of asymmetric mass distribution outside the images, at least in a model with randomly distributed galaxies. A surprisingly large number of large separations between the multiple images is found: up to 25 percent of multiple images have their angular separation 2 to 4 times larger than expected in a simple analytical model.
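    For reference, the textbook singular-isothermal-sphere results that underlie such models (standard formulas, not taken from this paper) are: the deflection angle is independent of impact parameter, and the image separation is set by the Einstein angle,

```latex
\hat{\alpha} = 4\pi \frac{\sigma_v^2}{c^2}, \qquad
\theta_E = 4\pi \frac{\sigma_v^2}{c^2}\,\frac{D_{ls}}{D_s}, \qquad
\Delta\theta = 2\,\theta_E ,
```

    where \(\sigma_v\) is the one-dimensional velocity dispersion of the galaxy and \(D_{ls}\), \(D_s\) are the lens-source and observer-source distances. The paper's finding of separations 2 to 4 times larger than \(2\theta_E\) reflects the collective effect of matter along the beam, which these single-lens formulas omit.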

  8. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated and we use Big Data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and builds on a Big Data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available Big Data sets and determining practical analytic, visualization, and predictive technologies.
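    The core alerting idea, flagging a sensor reading that deviates sharply from its recent history, can be illustrated with a minimal rolling z-score detector. This is a generic sketch, not the study's architecture; the window size and the 3-sigma threshold are arbitrary choices for illustration.

```python
from statistics import mean, stdev

def detect_anomalies(series, window=20, threshold=3.0):
    """Flag indices whose value deviates from the trailing-window mean
    by more than `threshold` sample standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies
```

    In a production pipeline this per-point check would be distributed across the data platform and paired with a predictive model that projects when the degradation trend will cross the critical threshold.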

  9. Ring-oven based preconcentration technique for microanalysis: simultaneous determination of Na, Fe, and Cu in fuel ethanol by laser induced breakdown spectroscopy.

    PubMed

    Cortez, Juliana; Pasquini, Celio

    2013-02-05

    The ring-oven technique, originally applied to classical qualitative analysis from the 1950s to the 1970s, is revisited to be used in a simple though highly efficient and green procedure for analyte preconcentration prior to its determination by the microanalytical techniques presently available. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is maintained in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center to a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, of about 2.0 cm diameter, is formed on the paper to contain most of the analytes originally present in the sample volume. Preconcentration coefficients of the analyte can reach 250-fold (on a m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure have been evaluated to concentrate Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour, employing the microanalytical technique of laser induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL(-1) and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)%, for Na, Fe, and Cu, respectively, were obtained in fuel ethanol. It is possible to anticipate the application of the technique, coupled to modern microanalytical and multianalyte techniques, to several analytical problems requiring analyte preconcentration and/or sample stabilization.

  10. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  11. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  12. Multiplexed Colorimetric Solid-Phase Extraction

    NASA Technical Reports Server (NTRS)

    Gazda, Daniel B.; Fritz, James S.; Porter, Marc D.

    2009-01-01

    Multiplexed colorimetric solid-phase extraction (MC-SPE) is an extension of colorimetric solid-phase extraction (C-SPE), an analytical platform that combines colorimetric reagents, solid phase extraction, and diffuse reflectance spectroscopy to quantify trace analytes in water. In C-SPE, analytes are extracted and complexed on the surface of an extraction membrane impregnated with a colorimetric reagent. The analytes are then quantified directly on the membrane surface using a handheld diffuse reflectance spectrophotometer. Importantly, the use of solid-phase extraction membranes as the matrix for impregnation of the colorimetric reagents creates a concentration factor that enables the detection of low concentrations of analytes in small sample volumes. In extending C-SPE to a multiplexed format, a filter holder that incorporates discrete analysis channels and a jig that facilitates the concurrent operation of multiple sample syringes have been designed, enabling the simultaneous determination of multiple analytes. Separate, single-analyte membranes, placed in a readout cartridge, create unique, analyte-specific addresses at the exit of each channel. Following sample exposure, the diffuse reflectance spectrum of each address is collected serially and the Kubelka-Munk function is used to quantify each water quality parameter via calibration curves. In a demonstration, MC-SPE was used to measure the pH of a sample and quantitate Ag(I) and Ni(II).
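    The quantification step mentioned above can be illustrated directly. The Kubelka-Munk function F(R) = (1 - R)^2 / (2R) is the standard transform relating diffuse reflectance to absorber concentration on a scattering substrate; the linear calibration helper and its parameters below are hypothetical additions for the sketch.

```python
def kubelka_munk(reflectance):
    """Convert diffuse reflectance R (0 < R <= 1) to the Kubelka-Munk
    function F(R) = (1 - R)^2 / (2R), which is approximately proportional
    to analyte concentration on the membrane surface."""
    if not 0 < reflectance <= 1:
        raise ValueError("reflectance must be in (0, 1]")
    return (1 - reflectance) ** 2 / (2 * reflectance)

def concentration_from_calibration(reflectance, slope, intercept=0.0):
    # Hypothetical linear calibration: F(R) = slope * C + intercept.
    return (kubelka_munk(reflectance) - intercept) / slope
```

    A perfectly reflecting address (R = 1) gives F(R) = 0, i.e., no analyte; darker addresses give larger F(R) and hence larger concentrations via the calibration curve.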

  13. Analytical and numerical techniques for predicting the interfacial stresses of wavy carbon nanotube/polymer composites

    NASA Astrophysics Data System (ADS)

    Yazdchi, K.; Salehi, M.; Shokrieh, M. M.

    2009-03-01

    By introducing a new simplified 3D representative volume element for wavy carbon nanotubes, an analytical model is developed to study the stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in a good agreement with corresponding results for straight nanotubes.

  14. Quantifying short-lived events in multistate ionic current measurements.

    PubMed

    Balijepalli, Arvind; Ettedgui, Jessica; Cornio, Andrew T; Robertson, Joseph W F; Cheung, Kin P; Kasianowicz, John J; Vaz, Canute

    2014-02-25

    We developed a generalized technique to characterize polymer-nanopore interactions via single channel ionic current measurements. Physical interactions between analytes, such as DNA, proteins, or synthetic polymers, and a nanopore cause multiple discrete states in the current. We modeled the transitions of the current to individual states with an equivalent electrical circuit, which allowed us to describe the system response. This enabled the estimation of short-lived states that are presently not characterized by existing analysis techniques. Our approach considerably improves the range and resolution of single-molecule characterization with nanopores. For example, we characterized the residence times of synthetic polymers that are three times shorter than those estimated with existing algorithms. Because the molecule's residence time follows an exponential distribution, we recover nearly 20-fold more events per unit time that can be used for analysis. Furthermore, the measurement range was extended from 11 monomers to as few as 8. Finally, we applied this technique to recover a known sequence of single-stranded DNA from previously published ion channel recordings, identifying discrete current states with subpicoampere resolution.
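    The exponential residence-time statistics the authors exploit can be shown with a small simulation. This sketch is not the paper's equivalent-circuit estimator; it only illustrates the memorylessness of the exponential distribution, which implies that events lost below an instrument dead time d satisfy E[t | t > d] = d + tau, so the mean lifetime can still be recovered from the surviving events.

```python
import random

def estimate_mean_residence_time(durations, dead_time=0.0):
    """Estimate the exponential mean from events longer than the dead
    time: by memorylessness, tau_hat = mean(observed) - dead_time."""
    observed = [t for t in durations if t > dead_time]
    return sum(observed) / len(observed) - dead_time

random.seed(1)
tau = 2.0
events = [random.expovariate(1 / tau) for _ in range(100_000)]
# Even if every event shorter than 1.0 time units is missed, the
# corrected estimate recovers tau.
est = estimate_mean_residence_time(events, dead_time=1.0)
```

    Recovering more short-lived events per unit time, as the paper's method does, tightens this estimate further because the usable sample grows roughly 20-fold.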

  15. Performance analysis of cooperative virtual MIMO systems for wireless sensor networks.

    PubMed

    Rafique, Zimran; Seet, Boon-Chong; Al-Anbuky, Adnan

    2013-05-28

    Multi-Input Multi-Output (MIMO) techniques can be used to increase the data rate for a given bit error rate (BER) and transmission power. Due to the small form factor, energy and processing constraints of wireless sensor nodes, a cooperative Virtual MIMO as opposed to True MIMO system architecture is considered more feasible for wireless sensor network (WSN) applications. Virtual MIMO with Vertical-Bell Labs Layered Space-Time (V-BLAST) multiplexing architecture has been recently established to enhance WSN performance. In this paper, we further investigate the impact of different modulation techniques, and analyze for the first time, the performance of a cooperative Virtual MIMO system based on V-BLAST architecture with multi-carrier modulation techniques. Through analytical models and simulations using real hardware and environment settings, both communication and processing energy consumptions, BER, spectral efficiency, and total time delay of multiple cooperative nodes, each with a single antenna, are evaluated. The results show that cooperative Virtual-MIMO with Binary Phase Shift Keying-Wavelet based Orthogonal Frequency Division Multiplexing (BPSK-WOFDM) modulation is a promising solution for future high data-rate and energy-efficient WSNs.

  16. Performance Analysis of Cooperative Virtual MIMO Systems for Wireless Sensor Networks

    PubMed Central

    Rafique, Zimran; Seet, Boon-Chong; Al-Anbuky, Adnan

    2013-01-01

    Multi-Input Multi-Output (MIMO) techniques can be used to increase the data rate for a given bit error rate (BER) and transmission power. Due to the small form factor, energy and processing constraints of wireless sensor nodes, a cooperative Virtual MIMO as opposed to True MIMO system architecture is considered more feasible for wireless sensor network (WSN) applications. Virtual MIMO with Vertical-Bell Labs Layered Space-Time (V-BLAST) multiplexing architecture has been recently established to enhance WSN performance. In this paper, we further investigate the impact of different modulation techniques, and analyze for the first time, the performance of a cooperative Virtual MIMO system based on V-BLAST architecture with multi-carrier modulation techniques. Through analytical models and simulations using real hardware and environment settings, both communication and processing energy consumptions, BER, spectral efficiency, and total time delay of multiple cooperative nodes, each with a single antenna, are evaluated. The results show that cooperative Virtual-MIMO with Binary Phase Shift Keying-Wavelet based Orthogonal Frequency Division Multiplexing (BPSK-WOFDM) modulation is a promising solution for future high data-rate and energy-efficient WSNs. PMID:23760087

  17. Characterizing plant cell wall derived oligosaccharides using hydrophilic interaction chromatography with mass spectrometry detection.

    PubMed

    Leijdekkers, A G M; Sanders, M G; Schols, H A; Gruppen, H

    2011-12-23

    Analysis of complex mixtures of plant cell wall derived oligosaccharides is still challenging and multiple analytical techniques are often required for separation and characterization of these mixtures. In this work it is demonstrated that hydrophilic interaction chromatography coupled with evaporative light scattering and mass spectrometry detection (HILIC-ELSD-MS(n)) is a valuable tool for identification of a wide range of neutral and acidic cell wall derived oligosaccharides. The separation potential for acidic oligosaccharides observed with HILIC is much better compared to other existing techniques, like capillary electrophoresis, reversed phase and porous-graphitized carbon chromatography. Important structural information, such as presence of methyl esters and acetyl groups, is retained during analysis. Separation of acidic oligosaccharides with equal charge yet with different degrees of polymerization can be obtained. The efficient coupling of HILIC with ELSD and MS(n)-detection enables characterization and quantification of many different oligosaccharide structures present in complex mixtures. This makes HILIC-ELSD-MS(n) a versatile and powerful additional technique in plant cell wall analysis. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Nondestructive analysis of automotive paints with spectral domain optical coherence tomography.

    PubMed

    Dong, Yue; Lawman, Samuel; Zheng, Yalin; Williams, Dominic; Zhang, Jinke; Shen, Yao-Chun

    2016-05-01

    We have demonstrated for the first time, to our knowledge, the use of optical coherence tomography (OCT) as an analytical tool for nondestructively characterizing the individual paint layer thickness of multiple layered automotive paints. A graph-based segmentation method was used for automatic analysis of the thickness distribution for the top layers of solid color paints. The thicknesses measured with OCT were in good agreement with those from optical microscopy and ultrasonic techniques, which are the current standards in the automobile industry. Because of its high axial resolution (5.5 μm), the OCT technique was shown to be able to resolve the thickness of individual paint layers down to 11 μm. With its high lateral resolution (12.4 μm), the OCT system was also able to measure the cross-sectional area of the aluminum flakes in a metallic automotive paint. The range of values measured was 300-1850  μm2. In summary, the proposed OCT is a noncontact, high-resolution technique that has the potential for inclusion as part of the quality assurance process in automobile coating.

  19. Fourier transform ion cyclotron resonance mass spectrometry

    NASA Astrophysics Data System (ADS)

    Marshall, Alan G.

    1998-06-01

    As for Fourier transform infrared (FT-IR) interferometry and nuclear magnetic resonance (NMR) spectroscopy, the introduction of pulsed Fourier transform techniques revolutionized ion cyclotron resonance mass spectrometry: increased speed (factor of 10,000), increased sensitivity (factor of 100), increased mass resolution (factor of 10,000, an improvement not shared by the introduction of FT techniques to IR or NMR spectroscopy), increased mass range (factor of 500), and automated operation. FT-ICR mass spectrometry is the most versatile technique for unscrambling and quantifying ion-molecule reaction kinetics and equilibria in the absence of solvent (i.e., the gas phase). In addition, FT-ICR MS has the following analytically important features: speed (~1 second per spectrum); ultrahigh mass resolution and ultrahigh mass accuracy for analysis of mixtures and polymers; attomole sensitivity; MS(n) with one spectrometer, including two-dimensional FT/FT-ICR/MS; positive and/or negative ions; multiple ion sources (especially MALDI and electrospray); biomolecular molecular weight and sequencing; LC/MS; and single-molecule detection up to 10(8) Dalton. Here, some basic features and recent developments of FT-ICR mass spectrometry are reviewed, with applications ranging from crude oil to molecular biology.
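    The mass measurement in ICR rests on the cyclotron relation f = qB / (2πm): each ion orbits at a frequency set only by its mass-to-charge ratio and the magnetic field, which is why frequency-domain (Fourier transform) detection yields such high mass resolution. A minimal calculation (standard physics, not specific to this review; the 9.4 T field in the example is an arbitrary illustrative value):

```python
from math import pi

ELEMENTARY_CHARGE = 1.602176634e-19  # C
DALTON = 1.66053906660e-27           # kg

def cyclotron_frequency(mz, b_field, charge=1):
    """Unperturbed ICR frequency in Hz, f = qB / (2*pi*m), for an ion of
    the given mass-to-charge ratio (Da per elementary charge)."""
    mass = mz * charge * DALTON
    return charge * ELEMENTARY_CHARGE * b_field / (2 * pi * mass)
```

    At 9.4 T an ion of m/z 500 orbits at roughly 289 kHz; halving m/z doubles the frequency, so a mass spectrum is read directly off the frequency spectrum of the induced image current.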

  20. Laser direct-write for fabrication of three-dimensional paper-based devices.

    PubMed

    He, P J W; Katis, I N; Eason, R W; Sones, C L

    2016-08-16

    We report the use of a laser-based direct-write (LDW) technique that allows the design and fabrication of three-dimensional (3D) structures within a paper substrate that enables implementation of multi-step analytical assays via a 3D protocol. The technique is based on laser-induced photo-polymerisation, and through adjustment of the laser writing parameters such as the laser power and scan speed we can control the depths of hydrophobic barriers that are formed within a substrate which, when carefully designed and integrated, produce 3D flow paths. So far, we have successfully used this depth-variable patterning protocol for stacking and sealing of multi-layer substrates, for assembly of backing layers for two-dimensional (2D) lateral flow devices and finally for fabrication of 3D devices. Since the 3D flow paths can also be formed via a single laser-writing process by controlling the patterning parameters, this is a distinct improvement over other methods that require multiple complicated and repetitive assembly procedures. This technique is therefore suitable for cheap, rapid and large-scale fabrication of 3D paper-based microfluidic devices.

  1. Incorporating Aptamers in the Multiple Analyte Profiling Assays (xMAP): Detection of C-Reactive Protein.

    PubMed

    Bernard, Elyse D; Nguyen, Kathy C; DeRosa, Maria C; Tayabali, Azam F; Aranda-Rodriguez, Rocio

    2017-01-01

    Aptamers are short oligonucleotide sequences used in detection systems because of their high affinity binding to a variety of macromolecules. With the introduction of aptamers over 25 years ago came the exploration of their use in many different applications as a substitute for antibodies. Aptamers have several advantages; they are easy to synthesize, can bind to analytes for which it is difficult to obtain antibodies, and in some cases bind better than antibodies. As such, aptamer applications have significantly expanded as an adjunct to a variety of different immunoassay designs. The Multiple-Analyte Profiling (xMAP) technology developed by Luminex Corporation commonly uses antibodies for the detection of analytes in small sample volumes through the use of fluorescently coded microbeads. This technology permits the simultaneous detection of multiple analytes in each sample tested and hence could be applied in many research fields. Although little work has been performed adapting this technology for use with aptamers, optimizing aptamer-based xMAP assays would dramatically increase the versatility of analyte detection. We report herein on the development of an xMAP bead-based aptamer/antibody sandwich assay for a biomarker of inflammation (C-reactive protein or CRP). Protocols for the coupling of aptamers to xMAP beads, validation of coupling, and for an aptamer/antibody sandwich-type assay for CRP are detailed. The optimized conditions, protocols and findings described in this research could serve as a starting point for the development of new aptamer-based xMAP assays.

  2. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.

    PubMed

    Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria

    2017-06-15

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
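    One of the adjustments mentioned above, correcting an observed positivity rate for a test's imperfect sensitivity and specificity, is classically done with the Rogan-Gladen estimator. A minimal sketch (the estimator is standard epidemiology; the clamping of the result to [0, 1] is a common practical choice, not something specified in this article):

```python
def adjusted_prevalence(observed_positive_rate, sensitivity, specificity):
    """Rogan-Gladen correction: recover the true positive fraction from
    an imperfect test's observed positivity rate.
    observed = sens * p + (1 - spec) * (1 - p), solved for p."""
    if sensitivity + specificity <= 1:
        raise ValueError("test must be more informative than chance")
    adjusted = (observed_positive_rate + specificity - 1) / (
        sensitivity + specificity - 1)
    return min(1.0, max(0.0, adjusted))
```

    For example, a test with 80% sensitivity and 90% specificity observing 27.5% positivity implies a true prevalence of 25%. The article's point is that no such single-test correction integrates multiple specimen types and multiple pathogens at once, which is what motivates an integrated analytic approach.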

  3. Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data

    PubMed Central

    Feikin, Daniel R.; Scott, J. Anthony G.; Zeger, Scott L.; Murdoch, David R.; O’Brien, Katherine L.; Deloria Knoll, Maria

    2017-01-01

    Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. PMID:28575372

  4. An Analytical Solution for Transient Thermal Response of an Insulated Structure

    NASA Technical Reports Server (NTRS)

    Blosser, Max L.

    2012-01-01

    An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
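    A drastically simplified stand-in for the ideas above: assuming the structure behaves as a first-order lumped system behind the insulation (an assumption made for this sketch; the paper's exact two-parameter solution is not reproduced here), the structural temperature under a square surface-temperature pulse follows a single exponential and peaks at the end of the pulse.

```python
from math import exp

def max_structural_temp(t0, t_pulse, pulse_duration, tau):
    """First-order lumped sketch: a structure initially at t0, exposed
    through insulation (time constant tau) to a square surface pulse of
    height t_pulse lasting pulse_duration. The peak occurs at pulse end:
    T_max = T_pulse + (T0 - T_pulse) * exp(-pulse_duration / tau)."""
    return t_pulse + (t0 - t_pulse) * exp(-pulse_duration / tau)
```

    This captures the qualitative behavior the paper formalizes: the peak depends only on the ratio of pulse duration to the insulation time constant (one of the non-dimensional groupings), approaching the surface temperature for long pulses and the initial temperature for short ones.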

  5. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  6. Tailoring noise frequency spectrum to improve NIR determinations.

    PubMed

    Xie, Shaofei; Xiang, Bingren; Yu, Liyan; Deng, Haishan

    2009-12-15

    Near infrared spectroscopy (NIR) contains excessive background noise and weak analytical signals caused by near infrared overtones and combinations. This makes it difficult to achieve quantitative determinations of low-concentration samples by NIR. A simple chemometric approach has been established to modify the noise frequency spectrum and thereby improve NIR determinations. The proposed method is to multiply a Savitzky-Golay-filtered NIR spectrum by a reference spectrum to which thermal noise has been added, before applying a second Savitzky-Golay filter. Since the Savitzky-Golay filter is a low-pass filter and cannot eliminate the low-frequency components of a NIR spectrum, one or even two consecutive Savitzky-Golay filtering steps alone cannot greatly improve NIR determinations. In contrast, significant improvement is achieved when the Savitzky-Golay-filtered NIR spectrum is processed with this multiplication step before the second Savitzky-Golay filter. The frequency range of the modified noise spectrum shifts toward the higher-frequency regime through the multiplication operation, so the second Savitzky-Golay filter provides better filtering efficiency and a more satisfactory result. The improvement of NIR determination by tailoring the noise frequency spectrum was demonstrated with both a simulated dataset and two measured NIR spectral datasets. The noise frequency spectrum tailoring technique is expected to find use chiefly in applications where quantitative determination of low-concentration samples is crucial.
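    The low-pass smoothing step this method builds on can be shown with the classic 5-point quadratic Savitzky-Golay coefficients. This sketch illustrates only the filtering behavior, not the authors' noise-spectrum multiplication trick; the edge handling (leaving the two boundary points unfiltered) is a simplification.

```python
# Classic 5-point quadratic Savitzky-Golay smoothing coefficients.
SG5 = [-3 / 35, 12 / 35, 17 / 35, 12 / 35, -3 / 35]

def savgol_smooth(signal):
    """Apply the 5-point Savitzky-Golay low-pass filter; the two points
    at each edge are left unfiltered for simplicity."""
    out = list(signal)
    for i in range(2, len(signal) - 2):
        out[i] = sum(c * signal[i + k - 2] for k, c in enumerate(SG5))
    return out
```

    A defining property of the filter, and the reason it suits spectroscopic data, is that it passes low-order polynomial features unchanged (a quadratic signal is reproduced exactly) while attenuating high-frequency noise, which is exactly why the abstract's trick of shifting noise energy toward higher frequencies makes the second filtering pass more effective.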

  7. A multiple hollow fibre liquid-phase microextraction method for the determination of halogenated solvent residues in olive oil.

    PubMed

    Manso, J; García-Barrera, T; Gómez-Ariza, J L; González, A G

    2014-02-01

    The present paper describes a method based on the extraction of analytes by multiple hollow fibre liquid-phase microextraction and detection by ion-trap mass spectrometry and electron capture detectors after gas chromatographic separation. The limits of detection are in the range of 0.13-0.67 μg kg(-1), five orders of magnitude lower than those reached with the European Commission Official method of analysis, with three orders of magnitude of linear range (from the quantification limits to 400 μg kg(-1) for all the analytes) and recoveries in fortified olive oils in the range of 78-104%. The main advantages of the analytical method are the absence of sample carryover (due to the disposable nature of the membranes), high enrichment factors in the range of 79-488, high throughput and low cost. The repeatability of the analytical method ranged from 8 to 15% for all the analytes, showing a good performance.

  8. Optical trapping for analytical biotechnology.

    PubMed

    Ashok, Praveen C; Dholakia, Kishan

    2012-02-01

    We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single cell or even at subcellular levels, which has allowed an insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure piconewton forces has found several applications such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore, we may probe and analyse the biological world by combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.

  9. Nuclear and atomic analytical techniques in environmental studies in South America.

    PubMed

    Paschoa, A S

    1990-01-01

    The use of nuclear analytical techniques for environmental studies in South America is selectively reviewed, from the early cosmic-ray work of Lattes to recent applications of the PIXE (particle-induced X-ray emission) technique to air pollution problems in large cities such as São Paulo and Rio de Janeiro. Studies on natural radioactivity and fallout from nuclear weapons in South America are briefly examined.

  10. Green aspects, developments and perspectives of liquid phase microextraction techniques.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2014-02-01

    Determination of analytes at trace levels in complex samples (e.g. biological samples or contaminated water or soils) is often required for environmental assessment and monitoring, as well as for scientific research in the field of environmental pollution. A limited number of analytical techniques are sensitive enough for the direct determination of trace components in samples; because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention was given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also presented are new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the stage prior to chromatographic determination. © 2013 Published by Elsevier B.V.

  11. Pre-analytic and analytic sources of variations in thiopurine methyltransferase activity measurement in patients prescribed thiopurine-based drugs: A systematic review.

    PubMed

    Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A

    2011-07-01

    Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.

  12. IGSN at Work in the Land Down Under: Exploiting an International Sample Identifier System to Enhance Reproducibility of Australian Geochemical and Geochronological Data.

    NASA Astrophysics Data System (ADS)

    Bastrakova, I.; Klump, J. F.; McInnes, B.; Wyborn, L. A.; Brown, A.

    2015-12-01

    The International Geo-Sample Number (IGSN) provides a globally unique identifier for physical samples used to generate analytical data. This unique identifier provides the ability to link each physical sample to any analytical data undertaken on that sample, as well as to any publications derived from those data. IGSN is particularly important for geochemical and geochronological data, where numerous analytical techniques can be undertaken at multiple analytical facilities, not only on the parent rock sample itself but also on derived sample splits and mineral separates. Australia now has three agencies implementing IGSN: Geoscience Australia, CSIRO and Curtin University. All three have now combined into a single project, funded by the Australian Research Data Services program, to better coordinate the implementation of IGSN in Australia, in particular how these agencies allocate IGSN identifiers. The project will register samples from pilot applications in each agency, including the CSIRO National Collection of Mineral Spectra database, the Geoscience Australia sample collection, and the Digital Mineral Library of the John De Laeter Centre for Isotope Research at Curtin University. These local agency catalogues will then be aggregated into an Australian portal, which will ultimately be expanded to all geoscience specimens. The development of this portal will also involve developing a common core metadata schema for the description of Australian geoscience specimens, as well as formulating agreed governance models for registering Australian samples. These developments aim to enable a common approach across Australian academic and research organisations and government agencies for the unique identification of geoscience specimens and any analytical data and/or publications derived from them.
The emerging pattern of governance and technical collaboration established in Australia may also serve as a blueprint for similar collaborations internationally.

  13. Evaluation of selected methods for determining streamflow during periods of ice effect

    USGS Publications Warehouse

    Melcher, Norwood B.; Walker, J.F.

    1992-01-01

    Seventeen methods for estimating ice-affected streamflow are evaluated for potential use with the U.S. Geological Survey streamflow-gaging station network. The methods evaluated were identified by written responses from U.S. Geological Survey field offices and by a comprehensive literature search. The methods selected and techniques used for applying the methods are described in this report. The methods are evaluated by comparing estimated results with data collected at three streamflow-gaging stations in Iowa during the winter of 1987-88. Discharge measurements were obtained at 1- to 5-day intervals during the ice-affected periods at the three stations to define an accurate baseline record. Discharge records were compiled for each method based on data available, assuming a 6-week field schedule. The methods are classified into two general categories, subjective and analytical, depending on whether individual judgment is necessary for method application. On the basis of results of the evaluation for the three Iowa stations, two of the subjective methods (discharge ratio and hydrographic-and-climatic comparison) were more accurate than the other subjective methods and approximately as accurate as the best analytical method. Three of the analytical methods (index velocity, adjusted rating curve, and uniform flow) could potentially be used at streamflow-gaging stations where the need for accurate ice-affected discharge estimates justifies the expense of collecting additional field data. One analytical method (ice-adjustment factor) may be appropriate for use at stations with extremely stable stage-discharge ratings and measuring sections. Further research is needed to refine the analytical methods. The discharge-ratio and multiple-regression methods produce estimates of streamflow for varying ice conditions using information obtained from the existing U.S. Geological Survey streamflow-gaging network.
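
    The discharge-ratio method named above lends itself to a compact illustration. The sketch below is a hedged, generic reading of that method (ratios of measured ice-affected discharge to open-water rating discharge, interpolated linearly in time and applied to the rating record); the day numbers and discharges are invented, and the report's actual computation may differ in detail.

```python
def interpolate_ratio(day, meas_days, ratios):
    # Linear interpolation of the discharge ratio between measurement days;
    # constant extrapolation outside the measured period.
    if day <= meas_days[0]:
        return ratios[0]
    if day >= meas_days[-1]:
        return ratios[-1]
    for (d0, r0), (d1, r1) in zip(zip(meas_days, ratios),
                                  zip(meas_days[1:], ratios[1:])):
        if d0 <= day <= d1:
            return r0 + (r1 - r0) * (day - d0) / (d1 - d0)

def discharge_ratio_estimates(rating_q, measurements):
    # rating_q: {day: open-water rating discharge}
    # measurements: {day: measured ice-affected discharge}
    meas_days = sorted(measurements)
    ratios = [measurements[d] / rating_q[d] for d in meas_days]
    return {d: rating_q[d] * interpolate_ratio(d, meas_days, ratios)
            for d in sorted(rating_q)}

# Invented example: a flat 100-unit rating with measurements on days 1 and 5.
rating = {1: 100.0, 2: 100.0, 3: 100.0, 4: 100.0, 5: 100.0}
meas = {1: 80.0, 5: 60.0}          # ratios 0.8 and 0.6
est = discharge_ratio_estimates(rating, meas)
print(round(est[3], 2))            # 70.0 (midway between the two ratios)
```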

  14. Big Data Analytics with Datalog Queries on Spark.

    PubMed

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2016-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
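
    The recursion that such systems must support can be illustrated with the textbook transitive-closure query. The sketch below is a generic semi-naive Datalog evaluator in plain Python, not BigDatalog's actual Spark compilation; it only shows the delta-driven iteration that Datalog-on-Spark systems parallelize as distributed jobs.

```python
# Semi-naive evaluation of the textbook recursive Datalog program:
#   tc(X, Y) :- arc(X, Y).
#   tc(X, Y) :- tc(X, Z), arc(Z, Y).
# Each round joins only the newly derived facts (the "delta") with the
# base relation: the classic optimization behind iterative Datalog engines.

def transitive_closure(arcs):
    arcs = set(arcs)
    tc = set(arcs)        # facts derived so far (base case)
    delta = set(arcs)     # facts new in the last round
    while delta:
        # tc(x, z), arc(z, y) -> tc(x, y), using only the delta for tc
        derived = {(x, y) for (x, z) in delta for (w, y) in arcs if z == w}
        delta = derived - tc   # keep only genuinely new facts
        tc |= delta
    return tc

print(sorted(transitive_closure([(1, 2), (2, 3), (3, 4)])))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```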

  15. Big Data Analytics with Datalog Queries on Spark

    PubMed Central

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2017-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics. PMID:28626296

  16. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be implemented experimentally or via pre-treatment of the digitized data. In many analytical curricula, instrumental techniques are given preference…

  17. Is Quality/Effectiveness An Empirically Demonstrable School Attribute? Statistical Aids for Determining Appropriate Levels of Analysis.

    ERIC Educational Resources Information Center

    Griffith, James

    2002-01-01

    Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…

  18. Predictive Big Data Analytics: A Study of Parkinson's Disease Using Large, Complex, Heterogeneous, Incongruent, Multi-Source and Incomplete Observations.

    PubMed

    Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W

    2016-01-01

    A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. 
We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson's disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson's disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%. The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer's, Huntington's, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications.
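
    Step (i), rebalancing imbalanced cohorts, can be illustrated with its simplest variant: random oversampling of the minority class. This is a hedged toy sketch; the cohort sizes are invented, and the PPMI work may use more sophisticated rebalancing than shown here.

```python
import random

# Toy cohort rebalancing by random oversampling of the minority class.
# The "pd"/"hc" labels and cohort sizes are illustrative only.

def oversample(group_a, group_b, seed=0):
    rng = random.Random(seed)          # seeded for reproducibility
    minority, majority = sorted([group_a, group_b], key=len)
    # Resample the minority with replacement until the sizes match.
    resampled = minority + [rng.choice(minority)
                            for _ in range(len(majority) - len(minority))]
    return resampled, majority

cases = [("pd", i) for i in range(20)]      # 20 patients (minority)
controls = [("hc", i) for i in range(100)]  # 100 healthy controls

balanced_minority, majority = oversample(cases, controls)
print(len(balanced_minority), len(majority))  # 100 100
```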

  19. Single Particle-Inductively Coupled Plasma Mass Spectroscopy Analysis of Metallic Nanoparticles in Environmental Samples with Large Dissolved Analyte Fractions.

    PubMed

    Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I

    2016-10-18

    There is an increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved signal. While sample dilution is often necessary to achieve the low analyte concentrations required for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate signal, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and to improve the quality and consistency of data generated from this relatively new analytical tool.
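
    Resolving particle events from the dissolved background in an SP-ICPMS time scan is commonly done with an iterative outlier criterion. The sketch below uses the conventional mean-plus-3-sigma cutoff on invented dwell-time counts; it illustrates the general data treatment behind such analyses, not this paper's specific dilution-series protocol.

```python
import statistics

# Iteratively flag readings above mean + k*stdev of the remaining
# background until no new spikes appear. The 3-sigma cutoff and the
# synthetic counts below are conventional/illustrative choices.

def split_particle_signal(intensities, k=3.0):
    background = list(intensities)
    particles = []
    while True:
        mu = statistics.mean(background)
        sigma = statistics.pstdev(background)
        cutoff = mu + k * sigma
        spikes = [v for v in background if v > cutoff]
        if not spikes:
            return background, particles
        particles.extend(spikes)
        background = [v for v in background if v <= cutoff]

# Invented time scan: a steady dissolved background with two particle events.
scan = [10, 11, 9, 12, 10, 500, 10, 11, 9, 10, 12,
        11, 10, 9, 10, 520, 11, 12, 10, 9, 11, 10]
dissolved, spikes = split_particle_signal(scan)
print(spikes)  # [500, 520]
```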

  20. [High-sensitive detection of multiple allergenic proteins in infant food with high-resolution mass spectrometry].

    PubMed

    Wu, Ci; Chen, Xi; Liu, Jianhui; Zhang, Xiaolin; Xue, Weifeng; Liang, Zhen; Liu, Mengyao; Cui, Yan; Huang, Daliang; Zhang, Lihua

    2017-10-08

    A novel method for the simultaneous detection of multiple kinds of allergenic proteins in infant food in parallel reaction monitoring (PRM) mode using liquid chromatography-tandem mass spectrometry (LC-MS/MS) was established. In this method, unique peptides with good stability and high sensitivity were used to quantify the corresponding allergenic proteins, allowing multiple kinds of allergenic proteins to be detected simultaneously with high sensitivity. The method was successfully applied to the detection of multiple allergenic proteins in infant food. For sample preparation, compared with the traditional acetone precipitation strategy, the in-situ filter-aided sample pretreatment (i-FASP) method gave both higher protein extraction efficiency and better resistance to interference. All allergenic proteins gave a good linear response with correlation coefficients (R²) ≥ 0.99, the linear range spanned up to four orders of magnitude, and the lowest detection limit was 0.028 mg/L, better than previously reported values. Finally, the method was conveniently used to detect the allergens in four imported infant food samples. These results demonstrate that this strategy provides a rapid and reliable analytical technique for allergen proteomics.
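
    Statements such as "linear response with correlation coefficients (R²) ≥ 0.99" rest on ordinary least-squares calibration. The sketch below fits an invented peptide calibration curve and computes R²; the standard concentrations and peak areas are illustrative only, not data from this paper.

```python
# Ordinary least-squares calibration fit with R^2, as used to validate
# the linearity of a quantification method. All numbers are invented.

def linear_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * xi + intercept)) ** 2
                 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

conc = [0.1, 0.5, 1.0, 5.0, 10.0]       # mg/L standards (illustrative)
area = [1.1, 5.2, 10.3, 50.9, 101.8]    # peptide peak areas (illustrative)
slope, intercept, r2 = linear_fit(conc, area)
print(round(slope, 2), round(r2, 4))
```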

  1. A microfluidic device integrating dual CMOS polysilicon nanowire sensors for on-chip whole blood processing and simultaneous detection of multiple analytes.

    PubMed

    Kuan, Da-Han; Wang, I-Shun; Lin, Jiun-Rue; Yang, Chao-Han; Huang, Chi-Hsien; Lin, Yen-Hung; Lin, Chih-Ting; Huang, Nien-Tsu

    2016-08-02

    The hemoglobin-A1c test, measuring the ratio of glycated hemoglobin (HbA1c) to hemoglobin (Hb) levels, has been a standard assay in diabetes diagnosis that removes the day-to-day glucose level variation. Currently, the HbA1c test is restricted to hospitals and central laboratories due to the laborious, time-consuming whole blood processing and bulky instruments. In this paper, we have developed a microfluidic device integrating dual CMOS polysilicon nanowire sensors (MINS) for on-chip whole blood processing and simultaneous detection of multiple analytes. The micromachined polymethylmethacrylate (PMMA) microfluidic device consisted of a serpentine microchannel with multiple dam structures designed for non-lysed cells or debris trapping, uniform plasma/buffer mixing and dilution. The CMOS-fabricated polysilicon nanowire sensors integrated with the microfluidic device were designed for the simultaneous, label-free electrical detection of multiple analytes. Our study first measured the Hb and HbA1c levels in 11 clinical samples via these nanowire sensors. The results were compared with those of standard Hb and HbA1c measurement methods (Hb: the sodium lauryl sulfate hemoglobin detection method; HbA1c: cation-exchange high-performance liquid chromatography) and showed comparable outcomes. Finally, we successfully demonstrated the efficacy of the MINS device's on-chip whole blood processing followed by simultaneous Hb and HbA1c measurement in a clinical sample. Compared to current Hb and HbA1c sensing instruments, the MINS platform is compact and can simultaneously detect two analytes with only 5 μL of whole blood, which corresponds to a 300-fold blood volume reduction. The total assay time, including the in situ sample processing and analyte detection, was just 30 minutes. 
Based on its on-chip whole blood processing and simultaneous multiple analyte detection functionalities with a lower sample volume requirement and shorter process time, the MINS device can be effectively applied to real-time diabetes diagnostics and monitoring in point-of-care settings.

  2. Imaging of enzyme activity using bio-LSI system enables simultaneous immunosensing of different analytes in multiple specimens.

    PubMed

    Hokuto, Toshiki; Yasukawa, Tomoyuki; Kunikata, Ryota; Suda, Atsushi; Inoue, Kumi Y; Ino, Kosuke; Matsue, Tomokazu; Mizutani, Fumio

    2016-06-01

    Electrochemical imaging is an excellent technique to characterize the activity of biomaterials such as enzymes and cells. A large-scale integration-based amperometric sensor (Bio-LSI) has been developed for the simultaneous and continuous detection of the concentration distribution of redox species generated by reactions of biomolecules. In this study, the Bio-LSI system was shown to be applicable to the simultaneous detection of different analytes in multiple specimens. Multiple specimens containing human immunoglobulin G (hIgG) and mouse IgG (mIgG) were introduced into each channel of the upper substrate across the antibody lines for hIgG and mIgG on the lower substrate. Hydrogen peroxide generated by the enzyme reaction of glucose oxidase captured at the intersections was simultaneously detected by the 400 microelectrodes of the Bio-LSI chip. The oxidation current increased with the concentration of hIgG, which could be detected in the range of 0.01-1.0 µg mL⁻¹. Simultaneous detection of hIgG and mIgG in multiple specimens was achieved by using line patterns of both antibodies. Therefore, the presence of different target molecules in multiple samples can be quantitatively and simultaneously visualized as a current image by the Bio-LSI system. Copyright © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Quantitative and qualitative sensing techniques for biogenic volatile organic compounds and their oxidation products.

    PubMed

    Kim, Saewung; Guenther, Alex; Apel, Eric

    2013-07-01

    The physiological production mechanisms of some of the organics in plants, commonly known as biogenic volatile organic compounds (BVOCs), have been known for more than a century. Some BVOCs are emitted to the atmosphere and play a significant role in tropospheric photochemistry, especially in ozone and secondary organic aerosol (SOA) production, as a result of interplay between BVOCs and atmospheric radicals such as the hydroxyl radical (OH), ozone (O3) and NOx (NO + NO2). These findings have been drawn from comprehensive analysis of numerous field and laboratory studies that have characterized the ambient distribution of BVOCs and their oxidation products, and reaction kinetics between BVOCs and atmospheric oxidants. These investigations are limited by the capacity for identifying and quantifying these compounds. This review highlights the major analytical techniques that have been used to observe BVOCs and their oxidation products, such as gas chromatography, mass spectrometry with hard and soft ionization methods, and optical techniques from laser-induced fluorescence (LIF) to remote sensing. In addition, we discuss how new analytical techniques can advance our understanding of BVOC photochemical processes. The principles, advantages, and drawbacks of the analytical techniques are discussed along with specific examples of how the techniques were applied in field and laboratory measurements. Since a number of thorough review papers on each specific analytical technique are available, readers are referred to those publications rather than given exhaustive descriptions of each technique here. The aim of this review is thus for readers to grasp the advantages and disadvantages of various sensing techniques for BVOCs and their oxidation products, and to provide guidance for choosing the optimal technique for a specific research task.

  4. Analytical techniques for characterization of cyclodextrin complexes in the solid state: A review.

    PubMed

    Mura, Paola

    2015-09-10

    Cyclodextrins are cyclic oligosaccharides able to form inclusion complexes with a variety of hydrophobic guest molecules, positively modifying their physicochemical properties. A thorough analytical characterization of cyclodextrin complexes is of fundamental importance to provide adequate support in selecting the most suitable cyclodextrin for each guest molecule, and also in view of possible future patenting and marketing of drug-cyclodextrin formulations. The demonstration that a drug-cyclodextrin inclusion complex actually forms in solution does not guarantee its existence in the solid state. Moreover, the technique used to prepare the solid complex can strongly influence the properties of the final product. Therefore, an appropriate characterization of the resulting drug-cyclodextrin solid systems also plays a key role in choosing the most effective preparation method, i.e. the one able to maximize host-guest interactions. The analytical characterization of drug-cyclodextrin solid systems and the assessment of actual inclusion complex formation is not a simple task; it involves the combined use of several analytical techniques, whose results have to be evaluated together. The objective of the present review is to give a general overview of the principal analytical techniques which can be employed for a suitable characterization of drug-cyclodextrin systems in the solid state, evidencing their respective potential advantages and limits. The applications of each examined technique are described and discussed with pertinent examples from the literature. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. Tests of multiplicative models in psychology: a case study using the unified theory of implicit attitudes, stereotypes, self-esteem, and self-concept.

    PubMed

    Blanton, Hart; Jaccard, James

    2006-01-01

    Theories that posit multiplicative relationships between variables are common in psychology. A. G. Greenwald et al. recently presented a theory that explicated relationships between group identification, group attitudes, and self-esteem. Their theory posits a multiplicative relationship between concepts when predicting a criterion variable. Greenwald et al. suggested analytic strategies to test their multiplicative model that researchers might assume are appropriate for testing multiplicative models more generally. The theory and analytic strategies of Greenwald et al. are used as a case study to show the strong measurement assumptions that underlie certain tests of multiplicative models. It is shown that the approach used by Greenwald et al. can lead to declarations of theoretical support when the theory is wrong as well as rejection of the theory when the theory is correct. A simple strategy for testing multiplicative models that makes weaker measurement assumptions than the strategy proposed by Greenwald et al. is suggested and discussed.
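
    The measurement issue at stake can be made concrete with a few lines of arithmetic: a product term X*Y is only invariant if X and Y are measured on ratio scales with true zero points. In the invented data below, merely recentering X (a permissible transformation for an interval scale) changes how strongly the product term correlates with a criterion. This is a generic illustration of the strong measurement assumptions discussed above, not a reanalysis of Greenwald et al.'s data.

```python
import statistics

# Pearson correlation between two equal-length sequences.
def corr(a, b):
    ma, mb = statistics.mean(a), statistics.mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a)
           * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

# Invented interval-scale scores and a criterion.
x = [1, 2, 3, 4, 5]
y = [2, 1, 4, 3, 5]
criterion = [3, 2, 6, 5, 9]

products = [xi * yi for xi, yi in zip(x, y)]
shifted = [(xi - 3) * yi for xi, yi in zip(x, y)]   # recenter X only

print(round(corr(products, criterion), 3))
print(round(corr(shifted, criterion), 3))  # noticeably different value
```

The two correlations differ even though recentering X changes nothing about the construct it measures, which is exactly why product-term tests require ratio-scale (or carefully justified) measurement assumptions.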

  6. Self-interaction of NPM1 modulates multiple mechanisms of liquid–liquid phase separation

    DOE PAGES

    Mitrea, Diana M.; Cika, Jaclyn A.; Stanley, Christopher B.; ...

    2018-02-26

    Nucleophosmin (NPM1) is an abundant, oligomeric protein in the granular component of the nucleolus with roles in ribosome biogenesis. Pentameric NPM1 undergoes liquid–liquid phase separation (LLPS) via heterotypic interactions with nucleolar components, including ribosomal RNA (rRNA) and proteins which display multivalent arginine-rich linear motifs (R-motifs), and is integral to the liquid-like nucleolar matrix. Here we show that NPM1 can also undergo LLPS via homotypic interactions between its polyampholytic intrinsically disordered regions, a mechanism that opposes LLPS via heterotypic interactions. Using a combination of biophysical techniques, including confocal microscopy, SAXS, analytical ultracentrifugation, and single-molecule fluorescence, we describe how conformational changes within NPM1 control valency and switching between the different LLPS mechanisms. We propose that this newly discovered interplay between multiple LLPS mechanisms may influence the direction of vectorial pre-ribosomal particle assembly within, and exit from, the nucleolus as part of the ribosome biogenesis process.

  7. Editorial for Special Issue on Herbal Medicines and Natural Products.

    PubMed

    Zhou, Zhi-Wei; Zhou, Shu-Feng

    2015-11-16

    Herbal medicines and natural products have been the most productive source of drug development, and a large body of evidence supports their application in the management of body function and the treatment of ailments. The multiple bioactive components in herbal medicines and natural products can explain the multi-target effects seen in their medical applications. The increasing use of state-of-the-art computational, molecular biological, and analytical chemistry techniques will promote the exploration of the pharmacological effects of previously inaccessible sources of herbal medicines and natural products. Notably, with increasing reports on safety issues regarding the medical use of herbal medicines and natural products, awareness of pharmacovigilance in this area needs to be strengthened. To prevent adverse drug reactions related to herbal medicines and natural products, physicians need to be aware of potential risks and to alert patients to them.

  8. Multidimensional Methods for the Formulation of Biopharmaceuticals and Vaccines

    PubMed Central

    Maddux, Nathaniel R.; Joshi, Sangeeta B.; Volkin, David B.; Ralston, John P.; Middaugh, C. Russell

    2013-01-01

    Determining and preserving the higher order structural integrity and conformational stability of proteins, plasmid DNA and macromolecular complexes such as viruses, virus-like particles and adjuvanted antigens is often a significant barrier to the successful stabilization and formulation of biopharmaceutical drugs and vaccines. These properties typically must be investigated with multiple lower resolution experimental methods, since each technique monitors only a narrow aspect of the overall conformational state of a macromolecular system. This review describes the use of empirical phase diagrams (EPDs) to combine large amounts of data from multiple high-throughput instruments and construct a map of a target macromolecule's physical state as a function of temperature, solvent conditions, and other stress variables. We present a tutorial on the mathematical methodology, an overview of some of the experimental methods typically used, and examples of some of the previous major formulation applications. We also explore novel applications of EPDs including potential new mathematical approaches as well as possible new biopharmaceutical applications such as analytical comparability, chemical stability, and protein dynamics. PMID:21647886

  9. Temperature Distribution in a Composite of Opaque and Semitransparent Spectral Layers

    NASA Technical Reports Server (NTRS)

    Siegel, Robert

    1997-01-01

    The analysis of radiative transfer becomes computationally complex for a composite when there are multiple layers and multiple spectral bands. A convenient analytical method is developed for combined radiation and conduction in a composite of alternating semitransparent and opaque layers. The semitransparent layers absorb, scatter, and emit radiation, and spectral properties with large scattering are included. The two-flux method is used, and its applicability is verified by comparison with a basic solution in the literature. The differential equation in the two-flux method is solved by deriving a Green's function. The solution technique is applied to analyze radiation effects in a multilayer zirconia thermal barrier coating with internal radiation shields for conditions in an aircraft engine combustor. The zirconia radiative properties are modeled by two spectral bands. Thin opaque layers within the coating are used to decrease radiant transmission that can degrade the zirconia's insulating ability. With radiation shields, the temperature distributions more closely approach the opaque limit, which provides the lowest metal wall temperatures.

  10. Monoclonal gammopathy with double M-bands: An atypical presentation on serum protein electrophoresis simulating biclonal gammopathy.

    PubMed

    Bora, Kaustubh; Das, Umesh; Barman, Bhupen; Ruram, Alice Abraham

    2017-01-01

    Monoclonal gammopathies, such as multiple myeloma, typically exhibit high levels of a monoclonal immunoglobulin (M-protein), produced by a clone of abnormally proliferating B-lymphocytes and/or plasma cells. The M-protein can be evaluated by serum protein electrophoresis (SPEP), which yields a single discrete band (M-band), usually in the γ-globulin region. Rarely, two M-bands appear simultaneously at different positions during SPEP - a condition known as biclonal gammopathy, which is a result of clonal expansion of two different neoplastic cell lines. Here, we describe an atypical case of IgA-λ multiple myeloma, where double M-bands (one in β- and the other in γ-globulin region) were found during SPEP simulating biclonal gammopathy, although it was monoclonal in nature. This peculiar presentation of double M-bands in monoclonal gammopathy was attributed to polymeric forms of IgA by systematic workup. Further, we discuss how true and apparent biclonality can be distinguished by inexpensive analytical techniques in resource-constrained settings.

  11. Synchronization of tunable asymmetric square-wave pulses in delay-coupled optoelectronic oscillators.

    PubMed

    Martínez-Llinàs, Jade; Colet, Pere; Erneux, Thomas

    2015-03-01

    We consider a model for two delay-coupled optoelectronic oscillators under positive delayed feedback as prototypical to study the conditions for synchronization of asymmetric square-wave oscillations, for which the duty cycle is not half of the period. We show that the scenario arising for positive feedback is much richer than with negative feedback. First, it allows for the coexistence of multiple in- and out-of-phase asymmetric periodic square waves for the same parameter values. Second, it is tunable: The period of all the square-wave periodic pulses can be tuned with the ratio of the delays, and the duty cycle of the asymmetric square waves can be changed with the offset phase while the total period remains constant. Finally, in addition to the multiple in- and out-of-phase periodic square waves, low-frequency periodic asymmetric solutions oscillating in phase may coexist for the same values of the parameters. Our analytical results are in agreement with numerical simulations and bifurcation diagrams obtained by using continuation techniques.

  12. Self-interaction of NPM1 modulates multiple mechanisms of liquid–liquid phase separation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitrea, Diana M.; Cika, Jaclyn A.; Stanley, Christopher B.

    Nucleophosmin (NPM1) is an abundant, oligomeric protein in the granular component of the nucleolus with roles in ribosome biogenesis. Pentameric NPM1 undergoes liquid–liquid phase separation (LLPS) via heterotypic interactions with nucleolar components, including ribosomal RNA (rRNA) and proteins which display multivalent arginine-rich linear motifs (R-motifs), and is integral to the liquid-like nucleolar matrix. Here we show that NPM1 can also undergo LLPS via homotypic interactions between its polyampholytic intrinsically disordered regions, a mechanism that opposes LLPS via heterotypic interactions. Using a combination of biophysical techniques, including confocal microscopy, SAXS, analytical ultracentrifugation, and single-molecule fluorescence, we describe how conformational changes within NPM1 control valency and switching between the different LLPS mechanisms. We propose that this newly discovered interplay between multiple LLPS mechanisms may influence the direction of vectorial pre-ribosomal particle assembly within, and exit from, the nucleolus as part of the ribosome biogenesis process.

  13. Strategies for Multi-Modal Analysis

    NASA Astrophysics Data System (ADS)

    Hexemer, Alexander; Wang, Cheng; Pandolfi, Ronald; Kumar, Dinesh; Venkatakrishnan, Singanallur; Sethian, James; Camera Team

    This section on soft materials is dedicated to discussing the extraction of the chemical distribution and spatial arrangement of constituent elements and functional groups at multiple length scales and, thus, the examination of collective dynamics, transport, and electronic ordering phenomena. Traditional measures of structure in soft materials have relied heavily on scattering- and imaging-based techniques due to their capacity to measure nanoscale dimensions and to monitor structure under conditions of dynamic stress loading. Special attention will focus on the application of resonant x-ray scattering, contrast-varied neutron scattering, analytical transmission electron microscopy, and their combinations. This session aims to bring together experts in the scattering and electron microscopy fields to discuss recent advances in selectively characterizing the structural architectures of complex soft materials, which often have multiple components spanning a wide range of length scales and multiple functionalities, and thus hopes to foster novel ideas for deciphering a higher level of structural complexity in soft materials in the future. CAMERA, Early Career Award.

  14. Novel and general approach to linear filter design for contrast-to-noise ratio enhancement of magnetic resonance images with multiple interfering features in the scene

    NASA Astrophysics Data System (ADS)

    Soltanian-Zadeh, Hamid; Windham, Joe P.

    1992-04-01

    Maximizing the minimum absolute contrast-to-noise ratios (CNRs) between a desired feature and multiple interfering processes, by linear combination of images in a magnetic resonance imaging (MRI) scene sequence, is attractive for MRI analysis and interpretation. A general formulation of the problem is presented, along with a novel solution utilizing the simple and numerically stable method of Gram-Schmidt orthogonalization. We derive explicit solutions for the case of two interfering features first, then for three interfering features, and, finally, using a typical example, for an arbitrary number of interfering features. For the case of two interfering features, we also provide simplified analytical expressions for the signal-to-noise ratios (SNRs) and CNRs of the filtered images. The technique is demonstrated through its application to simulated and acquired MRI scene sequences of a human brain with a cerebral infarction. For these applications, a 50 to 100% improvement in the smallest absolute CNR is obtained.
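    The Gram-Schmidt step at the heart of the method above can be sketched in a few lines: orthonormalize the interfering-feature signature vectors, project them out of the desired-feature signature, and use the normalized remainder as the linear filter weights, so the filtered image suppresses the interferers. This is a minimal illustration of the orthogonalization idea only, not the authors' full CNR-optimal formulation (which also accounts for noise); all names are hypothetical.

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def scale(u, c):
    return [c * a for a in u]

def sub(u, v):
    return [a - b for a, b in zip(u, v)]

def cnr_filter(desired, interferers):
    """Build linear filter weights via Gram-Schmidt: orthonormalize the
    interferer signatures, subtract their components from the desired
    signature, and normalize the remainder."""
    basis = []
    for s in interferers:
        v = s[:]
        for b in basis:                      # remove components along earlier basis vectors
            v = sub(v, scale(b, dot(v, b)))
        n = dot(v, v) ** 0.5
        if n > 1e-12:                        # skip linearly dependent signatures
            basis.append(scale(v, 1.0 / n))
    w = desired[:]
    for b in basis:                          # project interferer subspace out of the target
        w = sub(w, scale(b, dot(w, b)))
    n = dot(w, w) ** 0.5
    return scale(w, 1.0 / n)
```

The returned weights are orthogonal to every interferer signature, so a weighted sum of the scene images nulls the interfering features while retaining a component of the desired one.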

  15. Screening of groundwater remedial alternatives for brownfield sites: a comprehensive method integrated MCDA with numerical simulation.

    PubMed

    Li, Wei; Zhang, Min; Wang, Mingyu; Han, Zhantao; Liu, Jiankai; Chen, Zhezhou; Liu, Bo; Yan, Yan; Liu, Zhu

    2018-06-01

    Brownfield site pollution and remediation is an urgent environmental issue worldwide. The screening and assessment of remedial alternatives is especially complex owing to multiple criteria that involve technique, economy, and policy. To help decision-makers select remedial alternatives efficiently, the criteria framework developed by the U.S. EPA is improved, and a comprehensive method that integrates multiple-criteria decision analysis (MCDA) with numerical simulation is presented in this paper. The criteria framework is modified and classified into three categories: qualitative, semi-quantitative, and quantitative criteria. The MCDA method AHP-PROMETHEE (analytical hierarchy process-preference ranking organization method for enrichment evaluation) is used to determine the priority ranking of the remedial alternatives, and solute transport simulation is conducted to assess remedial efficiency. A case study is presented to demonstrate the screening method at a brownfield site in Cangzhou, northern China. The results show that this systematic method provides a reliable way to quantify the priority of remedial alternatives.
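    As an illustration of the PROMETHEE half of the ranking step described above, the net outranking flow can be sketched as follows. This is a minimal PROMETHEE II with the "usual" 0/1 preference function and criteria to be maximized; the paper's actual criteria, AHP-derived weights, and preference functions are not reproduced here, and all values are hypothetical.

```python
def promethee_ii(scores, weights):
    """PROMETHEE II net flows. scores[a][c] is alternative a's value on
    criterion c (higher is better); weights[c] are the criterion weights.
    Uses the 'usual' preference function: full preference whenever a > b."""
    n = len(scores)

    def pref(a, b):
        # weighted sum of criteria on which a strictly outranks b
        return sum(w for s_a, s_b, w in zip(scores[a], scores[b], weights)
                   if s_a > s_b)

    phi = []
    for a in range(n):
        plus = sum(pref(a, b) for b in range(n) if b != a) / (n - 1)   # leaving flow
        minus = sum(pref(b, a) for b in range(n) if b != a) / (n - 1)  # entering flow
        phi.append(plus - minus)                                       # net flow
    return phi
```

Alternatives are then ranked by descending net flow; a standard property of PROMETHEE II is that the net flows sum to zero.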

  16. Modeling of mode-locked fiber lasers

    NASA Astrophysics Data System (ADS)

    Shaulov, Gary

    This thesis presents the results of analytical and numerical simulations of mode-locked fiber lasers and their components: multiple quantum well saturable absorbers and nonlinear optical loop mirrors. Due to the growing interest in fiber lasers as a compact source of ultrashort pulses, there is a need to develop a full understanding of the advantages and limitations of the different mode-locking techniques. The mode-locked fiber laser study performed in this thesis can be used to optimize the design and performance of mode-locked fiber laser systems. A group at the Air Force Research Laboratory reported a fiber laser mode-locked by a multiple quantum well (MQW) saturable absorber, with stable pulses generated as short as 2 ps [21]. The laser cavity incorporates a chirped fiber Bragg grating as a dispersion element; our analysis showed that the laser operates in the soliton regime. Soliton perturbation theory was applied, and conditions for stable pulse operation were investigated. Properties of MQW saturable absorbers and their effect on cavity dynamics were studied, and the cases of fast and slow saturable absorbers were considered. Analytical and numerical results are in good agreement with experimental data. In the case of a laser cavity with a regular fiber Bragg grating, the properties of MQW saturable absorbers dominate the cavity dynamics. It was shown that despite the lack of a soliton shaping mechanism, there is a regime in parameter space where stable or quasi-stable solitary-wave solutions can exist. Further, a novel technique of fiber laser mode-locking by nonlinear polarization rotation was proposed. Polarization rotation of vector solitons was simulated in a birefringent nonlinear optical loop mirror (NOLM), and the switching characteristics of this device were studied. It was shown that the saturable absorber-like action of the NOLM allows mode-locked operation of two proposed fiber laser designs: a figure-eight-type and a sigma-type cavity.

  17. Interactive 3D geodesign tool for multidisciplinary wind turbine planning.

    PubMed

    Rafiee, Azarakhsh; Van der Male, Pim; Dias, Eduardo; Scholten, Henk

    2018-01-01

    Wind turbine site planning is a multidisciplinary task comprising several stakeholder groups from different domains with different priorities. An information system capable of integrating knowledge on the multiple aspects of a wind turbine plays a crucial role in providing a common picture to the involved groups. In this study, we have developed an interactive and intuitive 3D system (Falcon) for planning wind turbine locations. This system supports iterative design loops (wind turbine configurations), based on the emerging field of geodesign. The integration of GIS, a game engine, and analytical models has resulted in an interactive platform with real-time feedback on multiple wind turbine aspects that performs efficiently for different use cases and different environmental settings. The implementation of tiling techniques and open-standard web services supports flexible, on-the-fly loading and querying of different (massive) geospatial elements from different resources. This boosts data accessibility and interoperability, which are of high importance in a multidisciplinary process. The incorporation of the analytical models in Falcon makes the system independent of external tools for estimating environmental impacts and results in a unified platform for performing different environmental analyses at every stage of scenario design. Game engine techniques, such as collision detection, are applied in Falcon for the real-time implementation of different environmental models (e.g., noise and visibility). The interactivity and real-time performance of Falcon at any location in the whole country assist stakeholders in the seamless exploration of various scenarios and their resulting environmental effects, and provide scope for an interwoven discussion process. The flexible architecture of the system enables the effortless application of Falcon in other countries, conditional on input data availability. The embedded open web standards in Falcon result in smooth integration of different input data, which are increasingly available online through standardized access mechanisms. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Investigation of Air-Liquid Interface Rings in Buffer Preparation Vessels: the Role of Slip Agents.

    PubMed

    Shi, Ting; Ding, Wei; Kessler, Donald W; De Mas, Nuria; Weaver, Douglas G; Pathirana, Charles; Martin, Russell D; Mackin, Nancy A; Casati, Michael; Miller, Scott A; Pla, Itzcoatl A

    2016-01-01

    Air-liquid interface rings were observed on the side walls of stainless steel buffer vessels after certain downstream buffer preparations. These rings were resistant to regular cleaning-in-place procedures but could be removed by manual means. To investigate the root cause of this issue, multiple analytical techniques, including liquid chromatography with tandem mass spectrometry detection (LC-MS/MS), high-resolution accurate mass liquid chromatography with mass spectrometry, nuclear magnetic resonance, Fourier transform infrared spectroscopy, and scanning electron microscopy with energy-dispersive X-ray spectroscopy, were employed to characterize the chemical composition of the rings. The main component of the rings was determined to be slip agents, whose origin can be traced back to their presence on raw material packaging liners. Slip agents are commonly used in the plastics industry as additives to reduce the coefficient of friction during the manufacture of thin films. To mitigate the issue, an alternate liner with low slip agent content was identified and implemented at minimal additional cost. We have also proactively tested the packaging liners of other raw materials currently used in our downstream buffer preparation to ensure slip agent levels are appropriate. © PDA, Inc. 2016.

  19. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new class of analytes, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving into the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles, using ICP-MS but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. 
This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Impact desolvation of electrosprayed microdroplets--a new ionization method for mass spectrometry of large biomolecules.

    PubMed

    Aksyonov, S A; Williams, P

    2001-01-01

    Impact desolvation of electrosprayed microdroplets (IDEM) is a new method for producing gas-phase ions of large biomolecules. Analytes are dissolved in an electrolyte solution which is electrosprayed in vacuum, producing highly charged micron and sub-micron sized droplets (microdroplets). These microdroplets are accelerated through potential differences of approximately 5-10 kV to velocities of several km/s and allowed to impact a target surface. The energetic impacts vaporize the droplets and release desolvated gas-phase ions of the analyte molecules. Oligonucleotides (2- to 12-mer) and peptides (bradykinin, neurotensin) yield singly and doubly charged molecular ions with no detectable fragmentation. Because the extent of multiple charging is significantly less than in atmospheric pressure electrospray ionization, and the method produces ions largely free of adducts from solutions of high ionic strength, IDEM has some promise as a method for coupling to liquid chromatographic techniques and for mixture analysis. Ions are produced in vacuum at a flat equipotential surface, potentially allowing efficient ion extraction. Copyright 2001 John Wiley & Sons, Ltd.

  1. Evolution of evaluation criteria in the College of American Pathologists Surveys.

    PubMed

    Ross, J W

    1988-04-01

    This review of the evolution of evaluation criteria in the College of American Pathologists Survey, and of the theoretical grounds proposed for evaluation criteria, explores the complex nature of the evaluation process. Survey professionals balance multiple variables to seek relevant and meaningful evaluations. These include the state of the art, the reliability of target values, the nature of available control materials, the perceived medical "nonusefulness" of the extremes of performance (good or poor), the extent of laboratory services provided, and the availability of scientific data and theory by which clinically relevant criteria of medical usefulness may be established. The evaluation process has consistently sought peer consensus, to stimulate improvement in the state of the art, to increase medical usefulness, and to monitor the state of the art. Recent factors that are likely to promote a change from peer-group evaluation to fixed-criteria evaluation are the high degree of proficiency in the state of the art for many analytes, accurate target values, increased knowledge of biologic variation, and the availability of statistical modeling techniques simulating biologic and diagnostic processes as well as analytic processes.

  2. Design and testing of digitally manufactured paraffin Acrylonitrile-butadiene-styrene hybrid rocket motors

    NASA Astrophysics Data System (ADS)

    McCulley, Jonathan M.

    This research investigates the application of additive manufacturing techniques for fabricating hybrid rocket fuel grains composed of porous acrylonitrile-butadiene-styrene impregnated with paraffin wax. The digitally manufactured ABS substrate provides mechanical support for the paraffin fuel material and serves as an additional fuel component. The embedded paraffin provides an enhanced fuel regression rate while having no detrimental effect on the thermodynamic burn properties of the fuel grain. Multiple fuel grains with various ABS-to-paraffin mass ratios were fabricated and burned with nitrous oxide. Analytical predictions for end-to-end motor performance and fuel regression are compared against static test results. Baseline fuel grain regression calculations use an enthalpy balance energy analysis with the material and thermodynamic properties based on the mean paraffin/ABS mass fractions within the fuel grain. In support of these analytical comparisons, a novel method for propagating the fuel port burn surface was developed. In this modeling approach, the fuel cross-section grid is modeled as an image, with white pixels representing fuel and black pixels representing empty or burned grid cells.
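    The pixel-based burn-surface propagation described above can be caricatured as a simple cellular automaton: at each regression step, fuel pixels bordering the open port are consumed. This is only a sketch of the idea, assuming uniform regression and 4-connectivity; it does not reproduce the thesis's actual grid resolution or regression-rate law.

```python
def propagate_burn(grid, steps=1):
    """Propagate a burn surface through a fuel cross-section grid.
    grid: list of equal-length strings, 'F' = fuel, '.' = port/burned.
    Each step, every fuel cell 4-adjacent to an empty cell burns away."""
    cells = [list(row) for row in grid]
    h, w = len(cells), len(cells[0])
    for _ in range(steps):
        burn = []
        for y in range(h):
            for x in range(w):
                if cells[y][x] == 'F' and any(
                        0 <= y + dy < h and 0 <= x + dx < w
                        and cells[y + dy][x + dx] == '.'
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))):
                    burn.append((y, x))
        for y, x in burn:            # apply the whole step at once
            cells[y][x] = '.'
    return ["".join(row) for row in cells]
```

Starting from a grain with a central port, each call to `propagate_burn` widens the port by one pixel layer, mimicking uniform surface regression.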

  3. Automated multi-radionuclide separation and analysis with combined detection capability

    NASA Astrophysics Data System (ADS)

    Plionis, Alexander Asterios

    The radiological dispersal device (RDD) is a weapon of great concern to those agencies responsible for protecting the public from the modern age of terrorism. In order to effectively respond to an RDD event, these agencies need to possess the capability to rapidly identify the radiological agents involved in the incident and assess the uptake of each individual victim. Since medical treatment for internal radiation poisoning is radionuclide-specific, it is critical to identify and quantify the radiological uptake of each individual victim. This dissertation describes the development of automated analytical components that could be used to determine and quantify multiple radionuclides in human urine bioassays. This is accomplished through the use of extraction chromatography that is plumbed in-line with one of a variety of detection instruments. Flow scintillation analysis is used for 90Sr and 210Po determination, flow gamma analysis is used to assess 60Co and 137Cs, and inductively coupled plasma mass spectrometry is used to determine actinides. Detection limits for these analytes were determined for the appropriate technique and related to their implications for health physics.

  4. Analytical Challenge in Postmortem Toxicology Applied to a Human Body Found into a Lake after Three Years Immersion.

    PubMed

    Morini, Luca; Vignali, Claudia; Tricomi, Paolo; Groppi, Angelo

    2015-09-01

    The body of a 30-year-old woman was found in Lake Como at a depth of about 120 meters, in her own car, after 3 years of immersion. The aim of this study was to evaluate psychoactive drugs as well as alcohol biomarkers in biological matrices. The following analyses were initially performed: GC-MS systematic toxicological analysis on biological fluids and tissues; GC-MS analysis of drugs of abuse on pubic hair; direct ethanol metabolite determination in pubic hair by LC-MS/MS. After 7 years, the samples, which had been stored at -20°C, were re-analyzed and submitted to an LC-MS/MS targeted screening method using multiple reaction monitoring mode. These analyses detected citalopram (150-3000 ng/mL), desmethylcitalopram (50-2300 ng/mL), clotiapine (20-65 ng/mL), and ethyl glucuronide (97 pg/mg). The methods showed acceptable reproducibility, and the concentrations of citalopram and desmethylcitalopram calculated through the two analytical techniques did not significantly differ in biological fluids. © 2015 American Academy of Forensic Sciences.

  5. The effect of ion plated silver and sliding friction on tensile stress-induced cracking in aluminum oxide

    NASA Technical Reports Server (NTRS)

    Sliney, Harold E.; Spalvins, Talivaldis

    1991-01-01

    A Hertzian analysis of the effect of sliding friction on contact stresses in alumina is used to predict the critical load for crack generation. The results for uncoated alumina and alumina coated with ion plated silver are compared. Friction coefficient inputs to the analysis are determined experimentally with a scratch test instrument employing a 0.2 mm radius diamond stylus. A series of scratches were made at constant load increments on coated and uncoated flat alumina surfaces. Critical loads for cracking are detected by microscopic examination of cross sections of scratches made at various loads and friction coefficients. Acoustic emission (AE) and friction trends were also evaluated as experimental techniques for determining critical loads for cracking. Analytical predictions correlate well with micrographic evidence and with the lowest load at which AE is detected in multiple scratch tests. Friction/load trends are not good indicators of early crack formation. Lubrication with silver films reduced friction and thereby increased the critical load for crack initiation, in agreement with analytical predictions.

  6. The effect of ion-plated silver and sliding friction on tensile stress-induced cracking in aluminum oxide

    NASA Technical Reports Server (NTRS)

    Sliney, Harold E.; Spalvins, Talivaldis

    1993-01-01

    A Hertzian analysis of the effect of sliding friction on contact stresses in alumina is used to predict the critical load for crack generation. The results for uncoated alumina and alumina coated with ion plated silver are compared. Friction coefficient inputs to the analysis are determined experimentally with a scratch test instrument employing a 0.2 mm radius diamond stylus. A series of scratches were made at constant load increments on coated and uncoated flat alumina surfaces. Critical loads for cracking are detected by microscopic examination of cross sections of scratches made at various loads and friction coefficients. Acoustic emission (AE) and friction trends were also evaluated as experimental techniques for determining critical loads for cracking. Analytical predictions correlate well with micrographic evidence and with the lowest load at which AE is detected in multiple scratch tests. Friction/load trends are not good indicators of early crack formation. Lubrication with silver films reduced friction and thereby increased the critical load for crack initiation, in agreement with analytical predictions.

  7. An Empirical Study of Chronic Diseases in the United States: A Visual Analytics Approach to Public Health

    PubMed Central

    Raghupathi, Wullianallur; Raghupathi, Viju

    2018-01-01

    In this research we explore the current state of chronic diseases in the United States, using data from the Centers for Disease Control and Prevention and applying visualization and descriptive analytics techniques. Five main categories of variables are studied, namely chronic disease conditions, behavioral health, mental health, demographics, and overarching conditions. These are analyzed in the context of regions and states within the U.S. to discover possible correlations between variables in several categories. There are widespread variations in the prevalence of diverse chronic diseases, the number of hospitalizations for specific diseases, and the diagnosis and mortality rates for different states. Identifying such correlations is fundamental to developing insights that will help in the creation of targeted management, mitigation, and preventive policies, ultimately minimizing the risks and costs of chronic diseases. As the population ages and individuals suffer from multiple conditions, or comorbidity, it is imperative that the various stakeholders, including the government, non-governmental organizations (NGOs), policy makers, health providers, and society as a whole, address these adverse effects in a timely and efficient manner. PMID:29494555

  8. Accuracy of trace element determinations in alternate fuels

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1980-01-01

    NASA-Lewis Research Center's work on accurate measurement of trace levels of metals in various fuels is presented. The differences between laboratories and between analytical techniques, especially for concentrations below 10 ppm, are discussed, detailing the Atomic Absorption Spectrometry (AAS) and DC Arc Emission Spectrometry (dc arc) techniques used by NASA-Lewis. Also presented is the design of an interlaboratory study that considers the following factors: laboratory, analytical technique, fuel type, concentration, and ashing additive.

  9. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  10. Product identification techniques used as training aids for analytical chemists

    NASA Technical Reports Server (NTRS)

    Grillo, J. P.

    1968-01-01

    Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.

  11. Aptamer- and nucleic acid enzyme-based systems for simultaneous detection of multiple analytes

    DOEpatents

    Lu, Yi [Champaign, IL]; Liu, Juewen [Albuquerque, NM]

    2011-11-15

    The present invention provides aptamer- and nucleic acid enzyme-based systems for simultaneously determining the presence and optionally the concentration of multiple analytes in a sample. Methods of utilizing the system and kits that include the sensor components are also provided. The system includes a first reactive polynucleotide that reacts to a first analyte; a second reactive polynucleotide that reacts to a second analyte; a third polynucleotide; a fourth polynucleotide; a first particle, coupled to the third polynucleotide; a second particle, coupled to the fourth polynucleotide; and at least one quencher, for quenching emissions of the first and second quantum dots, coupled to the first and second reactive polynucleotides. The first particle includes a quantum dot having a first emission wavelength. The second particle includes a second quantum dot having a second emission wavelength different from the first emission wavelength. The third polynucleotide and the fourth polynucleotide are different.

  12. On the nonlinear dynamics of trolling-mode AFM: Analytical solution using multiple time scales method

    NASA Astrophysics Data System (ADS)

    Sajjadi, Mohammadreza; Pishkenari, Hossein Nejat; Vossoughi, Gholamreza

    2018-06-01

    Trolling mode atomic force microscopy (TR-AFM) has resolved many imaging problems through a considerable reduction of the liquid-resonator interaction forces in liquid environments. The present study develops a nonlinear model of the meniscus force exerted on the nanoneedle of TR-AFM and presents an analytical solution to the distributed-parameter model of the TR-AFM resonator using the multiple time scales (MTS) method. Based on the developed analytical solution, the frequency-response curves of the resonator operating in air and in liquid (for different penetration lengths of the nanoneedle) are obtained. The closed-form analytical solution and the frequency-response curves are validated by comparison with both the finite element solution of the governing partial differential equations and experimental observations. The effect of the excitation angle of the resonator on the horizontal oscillation of the probe tip and the effect of different parameters on the frequency response of the system are investigated.
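The MTS machinery can be illustrated on the simplest weakly nonlinear oscillator (a Duffing-type equation, used here as a stand-in for the TR-AFM resonator model, which is distributed-parameter and more involved):

```latex
% Weakly nonlinear oscillator: \ddot{x} + \omega_0^2 x + \epsilon\alpha x^3 = 0.
% Introduce fast and slow time scales and expand:
x(t) = x_0(T_0, T_1) + \epsilon\, x_1(T_0, T_1) + \mathcal{O}(\epsilon^2),
\qquad T_0 = t,\quad T_1 = \epsilon t,\quad \frac{d}{dt} = D_0 + \epsilon D_1 .
% Order \epsilon^0:
D_0^2 x_0 + \omega_0^2 x_0 = 0
\;\Rightarrow\; x_0 = A(T_1)\, e^{i\omega_0 T_0} + \mathrm{c.c.}
% Order \epsilon^1:
D_0^2 x_1 + \omega_0^2 x_1 = -2 D_0 D_1 x_0 - \alpha x_0^3 .
% Eliminating secular terms (coefficient of e^{i\omega_0 T_0}):
2 i \omega_0 A' + 3\alpha A^2 \bar{A} = 0 .
% With A = (a/2) e^{i\beta}: a' = 0 and \beta' = 3\alpha a^2 / (8\omega_0), so
\omega \approx \omega_0 + \epsilon\,\frac{3\alpha a^2}{8\omega_0} .
```

The amplitude-dependent frequency in the last line is what bends frequency-response curves of nonlinear resonators such as the one analyzed here.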

  13. Cross-reactivity profiles of legumes and tree nuts using the xMAP® multiplex food allergen detection assay.

    PubMed

    Cho, Chung Y; Oles, Carolyn; Nowatzke, William; Oliver, Kerry; Garber, Eric A E

    2017-10-01

    The homology between proteins in legumes and tree nuts makes it common for individuals with food allergies to be allergic to multiple legumes and tree nuts. This propensity for allergenic and antigenic cross-reactivity means that commonly employed commercial immunodiagnostic assays (e.g., dipsticks) for the detection of food allergens may not always accurately detect, identify, and quantitate legumes and tree nuts unless additional orthogonal analytical methods or secondary measures of analysis are employed. The xMAP® Multiplex Food Allergen Detection Assay (FADA) was used to determine the cross-reactivity patterns and the utility of multi-antibody antigenic profiling to distinguish between legumes and tree nuts. Pure legumes and tree nuts extracted using buffered detergent displayed a high level of cross-reactivity that decreased upon dilution or by using a buffer (UD buffer) designed to increase the stringency of binding conditions and reduce the occurrence of false positives due to plant-derived lectins. Testing for unexpected food allergens, or screening for multiple food allergens, often proceeds without knowing the identity of the allergen present, its concentration, or the degree of modification during processing. As such, the analytical response measured may represent multiple antigens of varying antigenicity (cross-reactivity). This problem of multiple potential analytes usually goes unresolved: either the focus narrows to the primary analyte (the antigen the antibody was raised against), or quantitative interpretation of the content of the analytical sample becomes problematic. The alternative solution offered here is the use of an antigenic profile as generated by the xMAP FADA using multiple antibodies (bead sets). By comparing the antigenic profile to standards, the allergen may be identified along with an estimate of the concentration present.
    Cluster analysis of the xMAP FADA data was also performed and agreed with the known phylogeny of the legumes and tree nuts being analyzed. Graphical abstract: The use of cluster analysis to compare the multi-antigen profiles of food allergens.
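The profile-matching idea above — identifying an allergen by comparing its whole multi-antibody response pattern to reference standards rather than relying on any single antibody — can be sketched as follows. All bead-set profiles and intensity values are invented for illustration; this is not real xMAP FADA data:

```python
import math

# Hypothetical multi-antibody (bead-set) response profiles; values stand in
# for median fluorescence intensities across five antibodies.
standards = {
    "peanut":    [520, 310, 45, 12, 8],
    "cashew":    [60, 480, 390, 25, 10],
    "pistachio": [55, 450, 420, 30, 9],   # close to cashew: related species
    "soy":       [400, 290, 60, 15, 300],
}

def cosine(u, v):
    # similarity of the overall shape of two response profiles
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

def identify(profile, refs):
    # rank standards by similarity of the whole antigenic profile
    return max(refs, key=lambda name: cosine(profile, refs[name]))

unknown = [62, 470, 395, 26, 10]
print(identify(unknown, standards))   # -> cashew
```

With real data the same comparison, fed into hierarchical clustering, yields dendrograms that can be checked against the known phylogeny, as the abstract reports.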

  14. Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakkila, E.A.

    1978-10-01

    Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.

  15. Independent Research and Independent Exploratory Development Annual Report Fiscal Year 1975

    DTIC Science & Technology

    1975-09-01

    Indexed excerpts reference an optical covert communications study using laser transceivers and Wagner, N. K., "Analysis of Microelectronic Materials Using Auger Spectroscopy and Additional Advanced Analytical Techniques," NELC Technical Note 2904.

  16. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft's aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
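The verification step — checking an analytical (exact) sensitivity against a finite-difference estimate — can be sketched on a toy response function; the function and design variable below are invented, not the report's aerodynamic model:

```python
import math

def f(x):
    # a smooth, made-up response of a single design variable x
    return math.sin(x) * math.exp(-0.5 * x)

def df_analytic(x):
    # exact derivative via the product rule
    return math.exp(-0.5 * x) * (math.cos(x) - 0.5 * math.sin(x))

def df_fd(x, h=1e-6):
    # central finite difference: O(h^2) truncation error, but each
    # evaluation costs an extra solve in a real aerodynamic analysis
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 1.2
print(abs(df_analytic(x0) - df_fd(x0)))   # tiny: the two agree
```

The computational-savings argument in the abstract comes from the analytical route avoiding the repeated perturbed flow solves that finite differencing requires.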

  17. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  18. Predicting failure to return to work.

    PubMed

    Mills, R

    2012-08-01

    The research question is: is it possible to predict, at the time of workers' compensation claim lodgement, which workers will have a prolonged return to work (RTW) outcome? This paper illustrates how a traditional analytic approach to the analysis of an existing large database can be insufficient to answer the research question, and suggests an alternative data management and analysis approach. This paper retrospectively analyses 9018 workers' compensation claims from two different workers' compensation jurisdictions in Australia (two data sets) over a 4-month period in 2007. De-identified data, submitted at the time of claim lodgement, were compared with RTW outcomes for up to 3 months. Analysis consisted of descriptive, parametric (analysis of variance and multiple regression), survival (proportional hazards) and data mining (partitioning) analysis. No significant associations were found on parametric analysis. Multiple associations were found between the predictor variables and RTW outcome on survival analysis, with marked differences between some sub-groups on partitioning, where diagnosis was found to be the strongest discriminator (particularly neck and shoulder injuries). There was a consistent trend for female gender to be associated with a prolonged RTW outcome. The supplied data were not sufficient to enable the development of a predictive model. If we want to predict early who will have a prolonged RTW in Australia, workers' compensation claim forms should be redesigned, data management improved and specialised analytic techniques used. © 2011 The Author. Internal Medicine Journal © 2011 Royal Australasian College of Physicians.
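The survival-analysis step described above can be sketched with a minimal Kaplan-Meier estimator for time-to-RTW durations with right-censoring; the claim durations and censoring flags below are invented for the sketch:

```python
# (days_to_rtw, returned): returned=False means the claim was censored
# (still open, or follow-up ended) at that time.
claims = [(5, True), (9, True), (14, True), (14, False), (30, True),
          (45, False), (60, True), (90, False)]

def kaplan_meier(data):
    """Return [(t, S(t))] at each event time; censored cases only
    shrink the risk set, they do not drop the survival curve."""
    data = sorted(data)
    n_at_risk = len(data)
    surv, out, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        events = sum(1 for d, e in data if d == t and e)
        ties = sum(1 for d, _ in data if d == t)
        if events:
            surv *= 1 - events / n_at_risk
            out.append((t, surv))
        n_at_risk -= ties
        i += ties
    return out

for t, s in kaplan_meier(claims):
    print(t, round(s, 3))
```

Comparing such curves between sub-groups (e.g., by diagnosis) is what the proportional-hazards analysis in the paper formalizes.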

  19. A new technique to transfer metallic nanoscale patterns to small and non-planar surfaces: Application to a fiber optic device for surface enhanced Raman scattering detection

    NASA Astrophysics Data System (ADS)

    Smythe, Elizabeth Jennings

    This thesis focuses on the development of a bidirectional fiber optic probe for the detection of surface enhanced Raman scattering (SERS). One facet of this fiber-based probe featured an array of coupled optical antennas, which we designed to enhance the Raman signal of nearby analytes. When this array interacted with an analyte, it generated SERS signals specific to the chemical composition of the sample; some of these SERS signals coupled back into the fiber. We used the other facet of the probe to input light into the fiber and collect the SERS signals that coupled into the probe. In this dissertation, the development of the probe is broken into three sections: (i) characterization of antenna arrays, (ii) fabrication of the probe, and (iii) device measurements. In the first section we present a comprehensive study of metallic antenna arrays. We carried out this study to determine the effects of antenna geometry, spacing, and composition on the surface plasmon resonance (SPR) of a coupled antenna array; the wavelength range and strength of the SPR are functions of the shape and interactions of the antennas. The SPR of the array ultimately amplified the Raman signal of analytes and produced a measurable SERS signal, thus determination of the optimal array geometries for SERS generation was an important first step in the development of the SERS fiber probe. We then introduce a new technique developed to fabricate the SERS fiber probes. This technique involves transferring antenna arrays (created by standard lithographic methods) from a large silicon substrate to a fiber facet. We developed this fabrication technique to bypass many of the limitations presented by previously developed methods for patterning unconventional substrates (i.e. small and/or non-planar substrates), such as focused ion-beam milling and soft lithography. In the third section of this thesis, we present SERS measurements taken with the fiber probe. 
We constructed a measurement system to couple light into the probe and filter out background noise; this allowed simultaneous detection of multiple chemicals. Antenna array enhancement factor (EF) calculations are shown; these allowed us to determine that the probe efficiently collected SERS signals.
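The enhancement-factor estimate mentioned above is commonly computed by normalizing the SERS intensity to the number of molecules on the patterned surface and the reference Raman intensity to the number in the probed volume; all numbers below are invented for the sketch:

```python
# EF = (I_SERS / N_surf) / (I_ref / N_vol): surface-normalized SERS signal
# over volume-normalized normal-Raman signal. Illustrative values only.
I_sers, N_surf = 4.0e4, 1.0e6    # counts; molecules adsorbed on the antennas
I_ref,  N_vol  = 2.0e3, 1.0e12   # counts; molecules in the probed volume

EF = (I_sers / N_surf) / (I_ref / N_vol)
print(f"{EF:.1e}")   # 2.0e+07
```

Reported SERS enhancement factors span many orders of magnitude, which is why the normalization convention must always be stated alongside the number.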

  20. Fast detection and visualization of minced lamb meat adulteration using NIR hyperspectral imaging and multivariate image analysis.

    PubMed

    Kamruzzaman, Mohammed; Sun, Da-Wen; ElMasry, Gamal; Allen, Paul

    2013-01-15

    Many studies have been carried out in developing non-destructive technologies for predicting meat adulteration, but there has been no endeavour for non-destructive detection and quantification of adulteration in minced lamb meat. The main goal of this study was to develop and optimize a rapid analytical technique based on near-infrared (NIR) hyperspectral imaging to detect the level of adulteration in minced lamb. An initial investigation was carried out using principal component analysis (PCA) to identify the most likely adulterant in minced lamb. Minced lamb meat samples were then adulterated with minced pork in the range 2-40% (w/w) at approximately 2% increments. Spectral data were used to develop a partial least squares regression (PLSR) model to predict the level of adulteration in minced lamb. A good prediction model was obtained using the whole spectral range (910-1700 nm) with a coefficient of determination (R(2)(cv)) of 0.99 and a root-mean-square error estimated by cross validation (RMSECV) of 1.37%. Four important wavelengths (940, 1067, 1144 and 1217 nm) were selected using weighted regression coefficients (Bw), and a multiple linear regression (MLR) model was then established using these important wavelengths to predict adulteration. The MLR model resulted in a coefficient of determination (R(2)(cv)) of 0.98 and an RMSECV of 1.45%. The developed MLR model was then applied to each pixel in the image to obtain prediction maps visualizing the distribution of adulteration in the tested samples. The results demonstrated that laborious and time-consuming traditional analytical techniques could be replaced by spectral data to provide a rapid, low-cost and non-destructive testing technique for adulterant detection in minced lamb meat. Copyright © 2012 Elsevier B.V. All rights reserved.
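The MLR step — regressing adulteration level on reflectance at a few selected wavelengths — amounts to solving the normal equations. A minimal sketch with two hypothetical wavelengths and synthetic, exactly linear spectra (not the paper's data or wavelengths):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def mlr_fit(X, y):
    # beta = (X'X)^{-1} X'y with an intercept column prepended
    Xa = [[1.0] + row for row in X]
    n = len(Xa[0])
    XtX = [[sum(r[i] * r[j] for r in Xa) for j in range(n)] for i in range(n)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xa, y)) for i in range(n)]
    return solve(XtX, Xty)

# reflectance at two hypothetical wavelengths; y = adulteration %
X = [[0.10, 0.40], [0.20, 0.35], [0.30, 0.30], [0.40, 0.20], [0.50, 0.15]]
y = [2.0 + 50 * a - 10 * b for a, b in X]   # exact linear relation by design
beta = mlr_fit(X, y)
print([round(v, 6) for v in beta])   # recovers [2.0, 50.0, -10.0]
```

Applying the fitted `beta` pixel-by-pixel to a hyperspectral image is what produces the adulteration prediction maps described in the abstract.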

  1. A general, cryogenically-based analytical technique for the determination of trace quantities of volatile organic compounds in the atmosphere

    NASA Technical Reports Server (NTRS)

    Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.

    1985-01-01

    An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.

  2. Airborne chemistry: acoustic levitation in chemical analysis.

    PubMed

    Santesson, Sabina; Nilsson, Staffan

    2004-04-01

    This review with 60 references describes a unique path to miniaturisation, that is, the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitation of small volumes of sample by means of a levitation technique can be used as a way to avoid solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 microL volume range and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.

  3. Robotics in scansorial environments

    NASA Astrophysics Data System (ADS)

    Autumn, Kellar; Buehler, Martin; Cutkosky, Mark; Fearing, Ronald; Full, Robert J.; Goldman, Daniel; Groff, Richard; Provancher, William; Rizzi, Alfred A.; Saranli, Uluc; Saunders, Aaron; Koditschek, Daniel E.

    2005-05-01

    We review a large multidisciplinary effort to develop a family of autonomous robots capable of rapid, agile maneuvers in and around natural and artificial vertical terrains such as walls, cliffs, caves, trees and rubble. Our robot designs are inspired by (but not direct copies of) biological climbers such as cockroaches, geckos, and squirrels. We are incorporating advanced materials (e.g., synthetic gecko hairs) into these designs and fabricating them using state-of-the-art rapid prototyping techniques (e.g., shape deposition manufacturing) that permit multiple iterations of design and testing with an effective integration path for the novel materials and components. We are developing novel motion control techniques to support dexterous climbing behaviors that are inspired by neuroethological studies of animals and descended from earlier frameworks that have proven analytically tractable and empirically sound. Our near-term behavioral targets call for vertical climbing on soft (e.g., bark) or rough surfaces and for ascents on smooth, hard, steep inclines (e.g., 60-degree slopes on metal or glass sheets) at one body length per second.

  4. Chemical analyses of fossil bone.

    PubMed

    Zheng, Wenxia; Schweitzer, Mary Higby

    2012-01-01

    The preservation of microstructures consistent with soft tissues, cells, and other biological components in demineralized fragments of dinosaur bone tens of millions of years old was unexpected, and counter to current hypotheses of tissue, cellular, and molecular degradation. Although the morphological similarity of these tissues to extant counterparts was unmistakable, after at least 80 million years exposed to geochemical influences, morphological similarity is insufficient to support an endogenous source. To test this hypothesis, and to characterize these materials at a molecular level, we applied multiple independent chemical, molecular, and microscopic analyses to identify the presence of original components produced by the extinct organisms. Microscopic techniques included field emission scanning electron microscopy, analytical transmission electron microscopy, transmitted light microscopy (LM), and fluorescence microscopy (FM). The chemical and molecular techniques include enzyme-linked immunosorbent assay, sodium dodecyl sulfate polyacrylamide gel electrophoresis, western blot (immunoblot), and attenuated total reflectance infrared spectroscopy. In situ analyses performed directly on tissues included immunohistochemistry and time-of-flight secondary ion mass spectrometry. The details of sample preparation and methodology are described in detail herein.

  5. Integrating multi-criteria evaluation techniques with geographic information systems for landfill site selection: a case study using ordered weighted average.

    PubMed

    Gorsevski, Pece V; Donevska, Katerina R; Mitrovski, Cvetko D; Frizado, Joseph P

    2012-02-01

    This paper presents a GIS-based multi-criteria decision analysis approach for evaluating the suitability for landfill site selection in the Polog Region, Macedonia. The multi-criteria decision framework considers environmental and economic factors which are standardized by fuzzy membership functions and combined by integration of analytical hierarchy process (AHP) and ordered weighted average (OWA) techniques. The AHP is used for the elicitation of attribute weights while the OWA operator function is used to generate a wide range of decision alternatives for addressing uncertainty associated with interaction between multiple criteria. The usefulness of the approach is illustrated by different OWA scenarios that report landfill suitability on a scale between 0 and 1. The OWA scenarios are intended to quantify the level of risk taking (i.e., optimistic, pessimistic, and neutral) and to facilitate a better understanding of patterns that emerge from decision alternatives involved in the decision making process. Copyright © 2011 Elsevier Ltd. All rights reserved.
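A toy version of the AHP + OWA combination described above, with an invented 3×3 pairwise-comparison matrix and invented standardized site scores (criterion weights approximated by power iteration rather than an exact eigendecomposition):

```python
def ahp_weights(P, iters=50):
    """Approximate the principal eigenvector of the pairwise matrix P
    by power iteration; the normalized eigenvector gives AHP weights."""
    n = len(P)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(P[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [v / s for v in w]
    return w

def owa(scores, order_weights):
    # OWA: weights attach to rank positions (best, second-best, ...),
    # not to particular criteria, which is how risk attitude is encoded.
    return sum(w * s for w, s in zip(order_weights, sorted(scores, reverse=True)))

# pairwise judgments: criterion 0 is 3x as important as 1, 5x as 2, etc.
P = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(P)

site = [0.9, 0.4, 0.7]                     # standardized suitability scores
optimistic = owa(site, [0.7, 0.2, 0.1])    # emphasizes the best criteria
pessimistic = owa(site, [0.1, 0.2, 0.7])   # emphasizes the worst criteria
print(round(optimistic, 3), round(pessimistic, 3))   # 0.81 vs 0.51
```

Sweeping the order weights between these extremes is what generates the spectrum of decision scenarios (optimistic, neutral, pessimistic) reported in the paper.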

  6. Application handbook for a Standardized Control Module (SCM) for DC-DC converters, volume 1

    NASA Astrophysics Data System (ADS)

    Lee, F. C.; Mahmoud, M. F.; Yu, Y.

    1980-04-01

    The standardized control module (SCM) was developed for application in the buck, boost and buck/boost DC-DC converters. The SCM used multiple feedback loops to provide improved input line and output load regulation, stable feedback control system, good dynamic transient response and adaptive compensation of the control loop for changes in open loop gain and output filter time constants. The necessary modeling and analysis tools to aid the design engineer in the application of the SCM to DC-DC Converters were developed. The SCM functional block diagram and the different analysis techniques were examined. The average time domain analysis technique was chosen as the basic analytical tool. The power stage transfer functions were developed for the buck, boost and buck/boost converters. The analog signal and digital signal processor transfer functions were developed for the three DC-DC Converter types using the constant on time, constant off time and constant frequency control laws.
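The averaged steady-state input-output relations for the three topologies the handbook covers are standard textbook results (ideal components, continuous conduction mode); a minimal sketch, with D the duty cycle:

```python
def vout(topology, vin, duty):
    """Ideal CCM steady-state output voltage for the three topologies."""
    if topology == "buck":
        return vin * duty                  # step-down: Vo = D * Vin
    if topology == "boost":
        return vin / (1 - duty)            # step-up: Vo = Vin / (1 - D)
    if topology == "buck/boost":
        return -vin * duty / (1 - duty)    # inverting: Vo = -D/(1-D) * Vin
    raise ValueError(topology)

print(vout("buck", 28.0, 0.5))        # 14.0
print(vout("boost", 28.0, 0.5))       # 56.0
print(vout("buck/boost", 28.0, 0.5))  # -28.0
```

The average time-domain analysis chosen in the report linearizes around exactly these operating points to obtain the small-signal power-stage transfer functions.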

  7. Imaging-based molecular barcoding with pixelated dielectric metasurfaces.

    PubMed

    Tittl, Andreas; Leitis, Aleksandrs; Liu, Mingkai; Yesilkoy, Filiz; Choi, Duk-Yong; Neshev, Dragomir N; Kivshar, Yuri S; Altug, Hatice

    2018-06-08

    Metasurfaces provide opportunities for wavefront control, flat optics, and subwavelength light focusing. We developed an imaging-based nanophotonic method for detecting mid-infrared molecular fingerprints and implemented it for the chemical identification and compositional analysis of surface-bound analytes. Our technique features a two-dimensional pixelated dielectric metasurface with a range of ultrasharp resonances, each tuned to a discrete frequency; this enables molecular absorption signatures to be read out at multiple spectral points, and the resulting information is then translated into a barcode-like spatial absorption map for imaging. The signatures of biological, polymer, and pesticide molecules can be detected with high sensitivity, covering applications such as biosensing and environmental monitoring. Our chemically specific technique can resolve absorption fingerprints without the need for spectrometry, frequency scanning, or moving mechanical parts, thereby paving the way toward sensitive and versatile miniaturized mid-infrared spectroscopy devices. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  8. Physical-geometric optics method for large size faceted particles.

    PubMed

    Sun, Bingqiang; Yang, Ping; Kattawar, George W; Zhang, Xiaodong

    2017-10-02

    A new physical-geometric optics method is developed to compute the single-scattering properties of faceted particles. It incorporates a general absorption vector to account accurately for inhomogeneous wave effects, and subsequently yields analytical formulas that are effective and computationally efficient for absorptive scattering particles. A bundle of rays incident on a certain facet can be traced as a single beam. For a beam incident on multiple facets, a systematic beam-splitting technique based on computer graphics is used to split the original beam into several sub-beams so that each sub-beam is incident on only an individual facet. The new beam-splitting technique significantly reduces the computational burden. The present physical-geometric optics method can be generalized to arbitrary faceted particles with either convex or concave shapes and with a homogeneous or an inhomogeneous (e.g., a particle with a core) composition. The single-scattering properties of irregular convex homogeneous and inhomogeneous hexahedra are simulated and compared to their counterparts from two other methods, including a numerically rigorous method.

  9. Optomechanically induced transparency with Bose–Einstein condensate in double-cavity optomechanical system

    NASA Astrophysics Data System (ADS)

    Liu, Li-Wei; Gengzang, Duo-Jie; An, Xiu-Jia; Wang, Pei-Yu

    2018-03-01

    We propose a novel technique for generating multiple optomechanically induced transparency (OMIT) of a weak probe field in a hybrid optomechanical system. The system consists of a cigar-shaped Bose–Einstein condensate (BEC) trapped inside each of two high-finesse Fabry-Pérot cavities. In the resolved-sideband regime, analytic solutions for the absorption and dispersion spectra are given. The tunneling strength between the two resonators and the coupling parameters of each BEC with its cavity field lead to the appearance of three distinct OMIT windows in the absorption spectrum. Furthermore, whether a BEC is present in each cavity is a key factor in determining the number of OMIT windows. The technique presented may have potential applications in quantum engineering and quantum information networks. Project supported by the National Natural Science Foundation of China (Grant Nos. 11564034, 11105062, and 21663026) and the Scientific Research Funds of College of Electrical Engineering, Northwest University, China (Grant No. xbmuyjrc201115).

  10. Application handbook for a Standardized Control Module (SCM) for DC-DC converters, volume 1

    NASA Technical Reports Server (NTRS)

    Lee, F. C.; Mahmoud, M. F.; Yu, Y.

    1980-01-01

    The standardized control module (SCM) was developed for application in the buck, boost and buck/boost DC-DC converters. The SCM used multiple feedback loops to provide improved input line and output load regulation, stable feedback control system, good dynamic transient response and adaptive compensation of the control loop for changes in open loop gain and output filter time constants. The necessary modeling and analysis tools to aid the design engineer in the application of the SCM to DC-DC Converters were developed. The SCM functional block diagram and the different analysis techniques were examined. The average time domain analysis technique was chosen as the basic analytical tool. The power stage transfer functions were developed for the buck, boost and buck/boost converters. The analog signal and digital signal processor transfer functions were developed for the three DC-DC Converter types using the constant on time, constant off time and constant frequency control laws.

  11. A scheme to calculate higher-order homogenization as applied to micro-acoustic boundary value problems

    NASA Astrophysics Data System (ADS)

    Vagh, Hardik A.; Baghai-Wadji, Alireza

    2008-12-01

    Current technological challenges in materials science and the high-tech device industry require the solution of boundary value problems (BVPs) involving regions of various scales, e.g. multiple thin layers, fibre-reinforced composites, and nano/micro pores. In most cases straightforward application of standard variational techniques to BVPs of practical relevance necessarily leads to unsatisfactorily ill-conditioned analytical and/or numerical results. To remedy the computational challenges associated with sub-sectional heterogeneities, various sophisticated homogenization techniques need to be employed. Homogenization refers to the systematic process of smoothing out sub-structural heterogeneities, leading to the determination of effective constitutive coefficients. Ordinarily, homogenization involves sophisticated averaging and asymptotic order analysis to obtain solutions. In the majority of cases only zero-order terms are constructed due to the complexity of the processes involved. In this paper we propose a constructive scheme for obtaining homogenized solutions involving higher-order terms, thus guaranteeing higher accuracy and greater robustness of the numerical results. We present
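As a concrete instance of "effective constitutive coefficients", the classical zeroth-order result for a layered (laminate) medium can be written down directly: transport parallel to the layers sees the arithmetic mean of the layer coefficients, transport across them the harmonic mean. The layer data below are illustrative:

```python
# Two-phase laminate: volume fractions f_i and conductivities k_i.
fractions = [0.5, 0.5]
k = [1.0, 10.0]

# arithmetic mean: layers act in parallel along the layering direction
k_parallel = sum(f * ki for f, ki in zip(fractions, k))             # 5.5

# harmonic mean: layers act in series across the layering direction
k_perpendicular = 1.0 / sum(f / ki for f, ki in zip(fractions, k))  # ~1.818

print(round(k_parallel, 3), round(k_perpendicular, 3))
```

The large gap between the two effective values for the same material illustrates why the homogenized coefficients are tensorial, and why higher-order corrections of the kind the paper proposes matter when the scale separation is imperfect.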

  12. Microchip electrophoresis with background electrolyte containing polyacrylic acid and high content organic solvent in cyclic olefin copolymer microchips for easily adsorbed dyes.

    PubMed

    Wei, Xuan; Sun, Ping; Yang, Shenghong; Zhao, Lei; Wu, Jing; Li, Fengyun; Pu, Qiaosheng

    2016-07-29

    Plastic microchips can significantly reduce fabrication costs, but the adsorption of some analytes limits their application. In this work, a background electrolyte containing an ionic polymer and a high content of organic solvent was adopted to eliminate analyte adsorption and achieve highly efficient separation in microchip electrophoresis. Two dyes, rhodamine 6G (Rh6G) and rhodamine B (RhB), were used as the model analytes. By using methanol as the organic solvent and polyacrylic acid (PAA) as a multifunctional additive, successful separation of the two dyes within 75 μm i.d. microchannels was realized. PAA plays multiple roles: viscosity regulator, selectivity modifier, and active additive counteracting analyte adsorption on the microchannel surface. A theoretical plate number of 7.0×10(5)/m was attained within an effective separation distance of 2 cm using a background electrolyte consisting of 80% methanol, 0.36% PAA and 30 mmol/L phosphate at pH 5.0. Under optimized conditions, the relative standard deviations of Rh6G and RhB detection (n=5) were no more than 1.5% for migration time and 2.0% for peak area, respectively. The limit of detection (S/N=3) was 0.1 nmol/L for Rh6G. The proposed technique was applied to the determination of both Rh6G and RhB in chilli powder and lipstick samples with satisfactory recoveries of 81.3-103.7%. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality

    PubMed Central

    Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M

    2008-01-01

    Background Kulldorff's spatial scan statistic and its software implementation – SaTScan – are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. Results We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. 
Finally, we propose a logical approach to proceed through the analysis of SaTScan results. Conclusion The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. Method We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit. PMID:18992163

  14. Geovisual analytics to enhance spatial scan statistic interpretation: an analysis of U.S. cervical cancer mortality.

    PubMed

Chen, Jin; Roth, Robert E; Naito, Adam T; Lengerich, Eugene J; MacEachren, Alan M

    2008-11-07

    Kulldorff's spatial scan statistic and its software implementation - SaTScan - are widely used for detecting and evaluating geographic clusters. However, two issues make using the method and interpreting its results non-trivial: (1) the method lacks cartographic support for understanding the clusters in geographic context and (2) results from the method are sensitive to parameter choices related to cluster scaling (abbreviated as scaling parameters), but the system provides no direct support for making these choices. We employ both established and novel geovisual analytics methods to address these issues and to enhance the interpretation of SaTScan results. We demonstrate our geovisual analytics approach in a case study analysis of cervical cancer mortality in the U.S. We address the first issue by providing an interactive visual interface to support the interpretation of SaTScan results. Our research to address the second issue prompted a broader discussion about the sensitivity of SaTScan results to parameter choices. Sensitivity has two components: (1) the method can identify clusters that, while being statistically significant, have heterogeneous contents comprised of both high-risk and low-risk locations and (2) the method can identify clusters that are unstable in location and size as the spatial scan scaling parameter is varied. To investigate cluster result stability, we conducted multiple SaTScan runs with systematically selected parameters. The results, when scanning a large spatial dataset (e.g., U.S. data aggregated by county), demonstrate that no single spatial scan scaling value is known to be optimal to identify clusters that exist at different scales; instead, multiple scans that vary the parameters are necessary. We introduce a novel method of measuring and visualizing reliability that facilitates identification of homogeneous clusters that are stable across analysis scales. 
Finally, we propose a logical approach to proceed through the analysis of SaTScan results. The geovisual analytics approach described in this manuscript facilitates the interpretation of spatial cluster detection methods by providing cartographic representation of SaTScan results and by providing visualization methods and tools that support selection of SaTScan parameters. Our methods distinguish between heterogeneous and homogeneous clusters and assess the stability of clusters across analytic scales. We analyzed the cervical cancer mortality data for the United States aggregated by county between 2000 and 2004. We ran SaTScan on the dataset fifty times with different parameter choices. Our geovisual analytics approach couples SaTScan with our visual analytic platform, allowing users to interactively explore and compare SaTScan results produced by different parameter choices. The Standardized Mortality Ratio and reliability scores are visualized for all the counties to identify stable, homogeneous clusters. We evaluated our analysis result by comparing it to that produced by other independent techniques including the Empirical Bayes Smoothing and Kafadar spatial smoother methods. The geovisual analytics approach introduced here is developed and implemented in our Java-based Visual Inquiry Toolkit.
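The reliability idea above — counties that remain in significant clusters across many differently parameterized scans are trustworthy — can be sketched as a simple score. This is an illustrative reconstruction, not the paper's exact metric; the county labels and run contents are hypothetical.

```python
# Sketch of a reliability score in the spirit of the paper: for each county,
# compute the fraction of SaTScan runs (varying the maximum spatial cluster
# size) in which it falls inside a significant high-risk cluster.

def reliability_scores(runs):
    """runs: list of sets, each the counties in significant clusters for one run."""
    counties = set().union(*runs)
    return {c: sum(c in run for run in runs) / len(runs) for c in counties}

runs = [
    {"A", "B", "C"},   # e.g., max cluster size = 50% of population at risk
    {"A", "B"},        # 25%
    {"A", "B", "D"},   # 10%
    {"A"},             # 5%
]
scores = reliability_scores(runs)
print(scores["A"])  # 1.0 -> stable across all analysis scales
print(scores["D"])  # 0.25 -> unstable, flagged in only one run
```

Counties with scores near 1 would be candidates for stable, homogeneous clusters; low scores mark scale-sensitive results.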

  15. High temperature ion channels and pores

    NASA Technical Reports Server (NTRS)

    Cheley, Stephen (Inventor); Gu, Li Qun (Inventor); Bayley, Hagan (Inventor); Kang, Xiaofeng (Inventor)

    2011-01-01

The present invention includes an apparatus, system and method for stochastic sensing of an analyte by a protein pore. The protein pore may be an engineered protein pore, such as an ion channel, operating at temperatures above 55 °C and even as high as near 100 °C. The analyte may be any reactive analyte, including chemical weapons, environmental toxins and pharmaceuticals. The analyte covalently bonds to the sensor element to produce a detectable signal, such as a change in electrical current. Detection of the signal allows identification of the analyte and determination of its concentration in a sample solution. Multiple analytes present in the same solution may also be detected.
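The readout principle in stochastic sensing — analyte binding events appear as discrete current blockades whose frequency tracks concentration — can be illustrated with a minimal event counter. The trace and threshold below are synthetic placeholders, not values from the patent.

```python
# Minimal sketch of stochastic-sensing readout: binding events in a pore
# current trace appear as blockades (drops below a threshold fraction of the
# open-pore current); counting them gives the event frequency.

def count_blockades(current, open_level, threshold):
    """Count excursions below threshold * open_level in a current trace."""
    events, blocked = 0, False
    for i in current:
        if i < open_level * threshold and not blocked:
            events += 1          # entering a new blockade
            blocked = True
        elif i >= open_level * threshold:
            blocked = False      # pore has reopened
    return events

trace = [100, 99, 60, 58, 100, 101, 55, 100, 100, 59, 61, 100]  # pA, synthetic
print(count_blockades(trace, open_level=100, threshold=0.8))  # 3 events
```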

  16. Leak location using the pattern of the frequency response diagram in pipelines: a numerical study

    NASA Astrophysics Data System (ADS)

    Lee, Pedro J.; Vítkovský, John P.; Lambert, Martin F.; Simpson, Angus R.; Liggett, James A.

    2005-06-01

    This paper presents a method of leak detection in a single pipe where the behaviour of the system frequency response diagram (FRD) is used as an indicator of the pipe integrity. The presence of a leak in a pipe imposes a pattern on the resonance peaks of the FRD that can be used as a clear indication of leakage. Analytical expressions describing the pattern of the resonance peaks are derived. Illustrations of how this pattern can be used to individually locate and size multiple leaks within the system are presented. Practical issues with the technique, such as the procedure for frequency response extraction, the impact of measurement position, noise- and frequency-dependent friction are also discussed.
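The core inversion — the leak imposes an oscillatory pattern on the resonance peak magnitudes, and the pattern encodes the leak position — can be sketched with a toy model. The cosine pattern and grid search below are simplifications for illustration, not the paper's derived analytical expressions.

```python
# Hedged sketch of FRD-based leak location: a leak modulates the magnitudes of
# successive resonance peaks, and the modulation "frequency" in peak index
# encodes the dimensionless leak position x_star along the pipe.
import math

def peak_pattern(x_star, n_peaks, depth=0.2):
    """Synthetic peak magnitudes for a leak at dimensionless position x_star."""
    return [1.0 - depth * math.cos(2 * math.pi * x_star * k)
            for k in range(1, n_peaks + 1)]

def locate_leak(peaks, candidates=200):
    """Recover x_star by matching candidate patterns (coarse grid search)."""
    best, best_err = None, float("inf")
    for j in range(1, candidates):
        x = j / (2 * candidates)          # search x_star in (0, 0.5)
        err = sum((p - q) ** 2
                  for p, q in zip(peaks, peak_pattern(x, len(peaks))))
        if err < best_err:
            best, best_err = x, err
    return best

peaks = peak_pattern(0.3, 12)            # leak at 30% of the pipe length
print(round(locate_leak(peaks), 2))      # 0.3
```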

  17. Highly charged ion secondary ion mass spectroscopy

    DOEpatents

    Hamza, Alex V.; Schenkel, Thomas; Barnes, Alan V.; Schneider, Dieter H.

    2001-01-01

    A secondary ion mass spectrometer using slow, highly charged ions produced in an electron beam ion trap permits ultra-sensitive surface analysis and high spatial resolution simultaneously. The spectrometer comprises an ion source producing a primary ion beam of highly charged ions that are directed at a target surface, a mass analyzer, and a microchannel plate detector of secondary ions that are sputtered from the target surface after interaction with the primary beam. The unusually high secondary ion yield permits the use of coincidence counting, in which the secondary ion stops are detected in coincidence with a particular secondary ion. The association of specific molecular species can be correlated. The unique multiple secondary nature of the highly charged ion interaction enables this new analytical technique.

  18. Optimal design application on the advanced aeroelastic rotor blade

    NASA Technical Reports Server (NTRS)

    Wei, F. S.; Jones, R.

    1985-01-01

    The vibration and performance optimization procedure using regression analysis was successfully applied to an advanced aeroelastic blade design study. The major advantage of this regression technique is that multiple optimizations can be performed to evaluate the effects of various objective functions and constraint functions. The databases obtained from the rotorcraft flight simulation program C81 and the Myklestad mode shape program are analytically determined as a function of each design variable. This approach has been verified for various blade radial ballast weight locations and blade planforms. This method can also be utilized to ascertain the effect of a particular cost function which is composed of several objective functions with different weighting factors for various mission requirements without any additional effort.
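The composite-cost idea — once regression surrogates express each objective as a cheap function of the design variables, re-weighting and re-minimizing costs nothing extra — can be sketched as follows. The quadratic surrogates and weights are illustrative placeholders, not fits from the C81 study.

```python
# Sketch of the weighted multi-objective step: with regression surrogates for
# each objective, a mission-dependent composite cost is re-minimized over the
# design variable at no extra simulation expense.

def composite_cost(x, weights):
    vibration = (x - 0.3) ** 2   # hypothetical surrogate: vibration index
    power = (x - 0.7) ** 2       # hypothetical surrogate: power required
    return weights[0] * vibration + weights[1] * power

def minimize(weights, grid=1000):
    xs = [i / grid for i in range(grid + 1)]
    return min(xs, key=lambda x: composite_cost(x, weights))

print(minimize((1.0, 1.0)))  # 0.5 -> equal weighting splits the difference
print(minimize((4.0, 1.0)))  # 0.38 -> vibration-dominated mission
```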

  19. Data extraction for complex meta-analysis (DECiMAL) guide.

    PubMed

    Pedder, Hugo; Sarri, Grammati; Keeney, Edna; Nunes, Vanessa; Dias, Sofia

    2016-12-13

    As more complex meta-analytical techniques such as network and multivariate meta-analyses become increasingly common, further pressures are placed on reviewers to extract data in a systematic and consistent manner. Failing to do this appropriately wastes time, resources and jeopardises accuracy. This guide (data extraction for complex meta-analysis (DECiMAL)) suggests a number of points to consider when collecting data, primarily aimed at systematic reviewers preparing data for meta-analysis. Network meta-analysis (NMA), multiple outcomes analysis and analysis combining different types of data are considered in a manner that can be useful across a range of data collection programmes. The guide has been shown to be both easy to learn and useful in a small pilot study.

  20. Optimization of paper bridge loading for 2-DE analysis in the basic pH region: application to the mitochondrial subproteome.

    PubMed

    Kane, Lesley A; Yung, Christina K; Agnetti, Giulio; Neverova, Irina; Van Eyk, Jennifer E

    2006-11-01

    Separation of basic proteins with 2-DE presents technical challenges involving protein precipitation, load limitations, and streaking. Cardiac mitochondria are enriched in basic proteins and difficult to resolve by 2-DE. We investigated two methods, cup and paper bridge, for sample loading of this subproteome into the basic range (pH 6-11) gels. Paper bridge loading consistently produced improved resolution of both analytical and preparative protein loads. A unique benefit of this technique is that proteins retained in the paper bridge after loading basic gels can be reloaded onto lower pH gradients (pH 4-7), allowing valued samples to be analyzed on multiple pH ranges.

  1. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte signal theory. The results have strong implications in the planning of multiway analytical experiments.
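The Monte Carlo noise-propagation strategy can be shown in its simplest (first-order, single-analyte) form, where the analytical answer is known: perturb the data repeatedly, record the spread of predicted concentrations, and take SEN = σ_noise/σ_predicted. This toy uses one signal vector, not a PARAFAC model; the signal values are arbitrary.

```python
# Monte Carlo sketch of the sensitivity estimate: for a univariate least-
# squares calibration with pure signal s, SEN = sigma_noise / sigma_pred
# should approach ||s||, the known analytical value.
import math, random

random.seed(0)
s = [0.2, 0.5, 0.8, 0.5, 0.2]       # pure analyte signal (arbitrary shape)
norm_s = math.sqrt(sum(v * v for v in s))
sigma, c_true = 0.01, 1.0

preds = []
for _ in range(5000):
    r = [c_true * v + random.gauss(0, sigma) for v in s]      # noisy sample
    preds.append(sum(a * b for a, b in zip(r, s)) / sum(v * v for v in s))

mean = sum(preds) / len(preds)
sd = math.sqrt(sum((p - mean) ** 2 for p in preds) / (len(preds) - 1))
print(round(sigma / sd, 2), round(norm_s, 2))  # the two estimates should agree
```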

  2. Culture-Sensitive Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, L.

    2008-01-01

    Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…

  3. Modeling of phonon scattering in n-type nanowire transistors using one-shot analytic continuation technique

    NASA Astrophysics Data System (ADS)

    Bescond, Marc; Li, Changsheng; Mera, Hector; Cavassilas, Nicolas; Lannoo, Michel

    2013-10-01

    We present a one-shot current-conserving approach to model the influence of electron-phonon scattering in nano-transistors using the non-equilibrium Green's function formalism. The approach is based on the lowest order approximation (LOA) to the current and its simplest analytic continuation (LOA+AC). By means of a scaling argument, we show how both LOA and LOA+AC can be easily obtained from the first iteration of the usual self-consistent Born approximation (SCBA) algorithm. Both LOA and LOA+AC are then applied to model n-type silicon nanowire field-effect-transistors and are compared to SCBA current characteristics. In this system, the LOA fails to describe electron-phonon scattering, mainly because of the interactions with acoustic phonons at the band edges. In contrast, the LOA+AC still well approximates the SCBA current characteristics, thus demonstrating the power of analytic continuation techniques. The limits of validity of LOA+AC are also discussed, and more sophisticated and general analytic continuation techniques are suggested for more demanding cases.
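Why a simple analytic continuation can rescue a failing low-order expansion is easy to show on a toy series. The [0/1] Padé form below is the generic "simplest analytic continuation" of a two-term series; it is an illustration of the principle, not the LOA+AC transport equations themselves.

```python
# Toy illustration: given only the first two terms a0 + a1*x of a series, the
# [0/1] Pade form a0 / (1 - (a1/a0)*x) often stays accurate where the
# truncated series fails.

def truncated(a0, a1, x):
    return a0 + a1 * x                  # "LOA-like" lowest-order result

def pade01(a0, a1, x):
    return a0 / (1 - (a1 / a0) * x)     # simplest analytic continuation

# Exact function 1/(1-x): a0 = 1, a1 = 1. At x = 0.9 the exact value is 10.
print(truncated(1.0, 1.0, 0.9))          # 1.9 -> badly underestimates
print(round(pade01(1.0, 1.0, 0.9), 6))   # 10.0 -> exact for this series
```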

  4. An Analytical Technique to Elucidate Field Impurities From Manufacturing Uncertainties of a Double Pancake Type HTS Insert for High Field LTS/HTS NMR Magnets

    PubMed Central

    Hahn, Seung-yong; Ahn, Min Cheol; Bobrov, Emanuel Saul; Bascuñán, Juan; Iwasa, Yukikazu

    2010-01-01

    This paper addresses adverse effects of dimensional uncertainties of an HTS insert assembled with double-pancake (DP) coils on spatial field homogeneity. Each DP coil was wound with Bi2223 tapes having dimensional tolerances more than one order of magnitude larger than those accepted for LTS wires used in conventional NMR magnets. The paper presents: 1) dimensional variations measured in two LTS/HTS NMR magnets, 350 MHz (LH350) and 700 MHz (LH700), both built and operated at the Francis Bitter Magnet Laboratory; and 2) an analytical technique and its application to elucidate the field impurities measured with the two LTS/HTS magnets. Field impurities computed with the analytical model and those measured with the two LTS/HTS magnets agree quite well, demonstrating that this analytical technique is applicable to design a DP-assembled HTS insert with an improved field homogeneity for a high-field LTS/HTS NMR magnet. PMID:20407595

  5. Methods for geochemical analysis

    USGS Publications Warehouse

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  6. Testing for Divergent Transmission Histories among Cultural Characters: A Study Using Bayesian Phylogenetic Methods and Iranian Tribal Textile Data

    PubMed Central

    Matthews, Luke J.; Tehrani, Jamie J.; Jordan, Fiona M.; Collard, Mark; Nunn, Charles L.

    2011-01-01

    Background Archaeologists and anthropologists have long recognized that different cultural complexes may have distinct descent histories, but they have lacked analytical techniques capable of easily identifying such incongruence. Here, we show how Bayesian phylogenetic analysis can be used to identify incongruent cultural histories. We employ the approach to investigate Iranian tribal textile traditions. Methods We used Bayes factor comparisons in a phylogenetic framework to test two models of cultural evolution: the hierarchically integrated system hypothesis and the multiple coherent units hypothesis. In the hierarchically integrated system hypothesis, a core tradition of characters evolves through descent with modification and characters peripheral to the core are exchanged among contemporaneous populations. In the multiple coherent units hypothesis, a core tradition does not exist. Rather, there are several cultural units consisting of sets of characters that have different histories of descent. Results For the Iranian textiles, the Bayesian phylogenetic analyses supported the multiple coherent units hypothesis over the hierarchically integrated system hypothesis. Our analyses suggest that pile-weave designs represent a distinct cultural unit that has a different phylogenetic history compared to other textile characters. Conclusions The results from the Iranian textiles are consistent with the available ethnographic evidence, which suggests that the commercial rug market has influenced pile-rug designs but not the techniques or designs incorporated in the other textiles produced by the tribes. We anticipate that Bayesian phylogenetic tests for inferring cultural units will be of great value for researchers interested in studying the evolution of cultural traits including language, behavior, and material culture. PMID:21559083
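The model-comparison step behind the hypothesis test can be sketched numerically: a Bayes factor is the ratio of the two hypotheses' marginal likelihoods, and phylogenetic software reports these on the log scale. The log marginal likelihood values below are made up for illustration; the 2·ln(BF) thresholds follow the conventional Kass-Raftery scale.

```python
# Sketch of a Bayes factor comparison from log marginal likelihoods, as in
# Bayesian phylogenetic model tests; 2*ln(BF) > 10 is conventionally read as
# "very strong" evidence for the first hypothesis.

def two_ln_bayes_factor(log_ml_h1, log_ml_h2):
    return 2 * (log_ml_h1 - log_ml_h2)

log_ml_multiple_units = -1520.4    # hypothetical: multiple coherent units
log_ml_integrated_core = -1533.1   # hypothetical: hierarchically integrated
stat = two_ln_bayes_factor(log_ml_multiple_units, log_ml_integrated_core)
print(round(stat, 1))  # 25.4 -> very strong support for multiple coherent units
```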

  7. Maternal Factors Predicting Cognitive and Behavioral Characteristics of Children with Fetal Alcohol Spectrum Disorders

    PubMed Central

    May, Philip A.; Tabachnick, Barbara G.; Gossage, J. Phillip; Kalberg, Wendy O.; Marais, Anna-Susan; Robinson, Luther K.; Manning, Melanie A.; Blankenship, Jason; Buckley, David; Hoyme, H. Eugene; Adnams, Colleen M.

    2013-01-01

    Objective To provide an analysis of multiple predictors of cognitive and behavioral traits for children with fetal alcohol spectrum disorders (FASD). Method Multivariate correlation techniques were employed with maternal and child data from epidemiologic studies in a community in South Africa. Data on 561 first grade children with fetal alcohol syndrome (FAS), partial FAS (PFAS), and without FASD and their mothers were analyzed by grouping 19 maternal variables into categories (physical, demographic, childbearing, and drinking) and employed in structural equation models (SEM) to assess correlates of child intelligence (verbal and non-verbal) and behavior. Results A first SEM utilizing only seven maternal alcohol use variables to predict cognitive/behavioral traits was statistically significant (B = 3.10, p < .05), but explained only 17.3% of the variance. The second model incorporated multiple maternal variables and was statistically significant, explaining 55.3% of the variance. Significantly correlated with low intelligence and problem behavior were demographic (B = 3.83, p < .05) (low maternal education, low socioeconomic status (SES), and rural residence) and maternal physical characteristics (B = 2.70, p < .05) (short stature, small head circumference, and low weight). Childbearing history and alcohol use composites were not statistically significant in the final complex model, and were overpowered by SES and maternal physical traits. Conclusions While other analytic techniques have amply demonstrated the negative effects of maternal drinking on intelligence and behavior, this highly controlled analysis of multiple maternal influences reveals that maternal demographics and physical traits make a significant enabling or disabling contribution to child functioning in FASD. PMID:23751886

  8. Eukaryotic membrane tethers revisited using magnetic tweezers.

    PubMed

    Hosu, Basarab G; Sun, Mingzhai; Marga, Françoise; Grandbois, Michel; Forgacs, Gabor

    2007-04-19

    Membrane nanotubes, under physiological conditions, typically form en masse. We employed magnetic tweezers (MTW) to extract tethers from human brain tumor cells and compared their biophysical properties with tethers extracted after disruption of the cytoskeleton and from a strongly differing cell type, Chinese hamster ovary cells. In this method, the constant force produced with the MTW is transduced to cells through super-paramagnetic beads attached to the cell membrane. Multiple sudden jumps in bead velocity were manifest in the recorded bead displacement-time profiles. These discrete events were interpreted as successive ruptures of individual tethers. Observation with scanning electron microscopy supported the simultaneous existence of multiple tethers. The physical characteristics, in particular, the number and viscoelastic properties of the extracted tethers were determined from the analytic fit to bead trajectories, provided by a standard model of viscoelasticity. Comparison of tethers formed with MTW and atomic force microscopy (AFM), a technique where the cantilever-force transducer is moved at constant velocity, revealed significant differences in the two methods of tether formation. Our findings imply that extreme care must be used to interpret the outcome of tether pulling experiments performed with single molecular techniques (MTW, AFM, optical tweezers, etc). First, the different methods may be testing distinct membrane structures with distinct properties. Second, as soon as a true cell membrane (as opposed to that of a vesicle) can attach to a substrate, upon pulling on it, multiple nonspecific membrane tethers may be generated. Therefore, under physiological conditions, distinguishing between tethers formed through specific and nonspecific interactions is highly nontrivial if at all possible.
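The jump-detection step described above — successive tether ruptures appear as sudden increases in bead velocity under constant force — can be sketched with a simple finite-difference detector. The displacement trace and the `ratio` threshold are synthetic illustrations, not the study's fitting procedure.

```python
# Sketch of rupture detection in a bead displacement-time profile: flag time
# steps where the velocity (finite difference of position) jumps by more than
# a factor `ratio` over the previous step.

def rupture_indices(positions, ratio=1.5):
    v = [b - a for a, b in zip(positions, positions[1:])]
    return [i for i in range(1, len(v))
            if v[i - 1] > 0 and v[i] / v[i - 1] > ratio]

# Bead pulled at constant force; two tethers rupture, each raising velocity.
x = [0.0, 1.0, 2.0, 4.0, 6.0, 8.0, 12.0, 16.0, 20.0]
print(rupture_indices(x))  # [2, 5] -> two discrete rupture events
```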

  9. Eukaryotic membrane tethers revisited using magnetic tweezers

    NASA Astrophysics Data System (ADS)

    Hosu, Basarab G.; Sun, Mingzhai; Marga, Françoise; Grandbois, Michel; Forgacs, Gabor

    2007-06-01

    Membrane nanotubes, under physiological conditions, typically form en masse. We employed magnetic tweezers (MTW) to extract tethers from human brain tumor cells and compared their biophysical properties with tethers extracted after disruption of the cytoskeleton and from a strongly differing cell type, Chinese hamster ovary cells. In this method, the constant force produced with the MTW is transduced to cells through super-paramagnetic beads attached to the cell membrane. Multiple sudden jumps in bead velocity were manifest in the recorded bead displacement-time profiles. These discrete events were interpreted as successive ruptures of individual tethers. Observation with scanning electron microscopy supported the simultaneous existence of multiple tethers. The physical characteristics, in particular, the number and viscoelastic properties of the extracted tethers were determined from the analytic fit to bead trajectories, provided by a standard model of viscoelasticity. Comparison of tethers formed with MTW and atomic force microscopy (AFM), a technique where the cantilever-force transducer is moved at constant velocity, revealed significant differences in the two methods of tether formation. Our findings imply that extreme care must be used to interpret the outcome of tether pulling experiments performed with single molecular techniques (MTW, AFM, optical tweezers, etc). First, the different methods may be testing distinct membrane structures with distinct properties. Second, as soon as a true cell membrane (as opposed to that of a vesicle) can attach to a substrate, upon pulling on it, multiple nonspecific membrane tethers may be generated. Therefore, under physiological conditions, distinguishing between tethers formed through specific and nonspecific interactions is highly nontrivial if at all possible.

  10. Finite element and analytical solutions for van der Pauw and four-point probe correction factors when multiple non-ideal measurement conditions coexist

    NASA Astrophysics Data System (ADS)

    Reveil, Mardochee; Sorg, Victoria C.; Cheng, Emily R.; Ezzyat, Taha; Clancy, Paulette; Thompson, Michael O.

    2017-09-01

    This paper presents an extensive collection of calculated correction factors that account for the combined effects of a wide range of non-ideal conditions often encountered in realistic four-point probe and van der Pauw experiments. In this context, "non-ideal conditions" refer to conditions that deviate from the assumptions on sample and probe characteristics made in the development of these two techniques. We examine the combined effects of contact size and sample thickness on van der Pauw measurements. In the four-point probe configuration, we examine the combined effects of varying the sample's lateral dimensions, probe placement, and sample thickness. We derive an analytical expression to calculate correction factors that account, simultaneously, for finite sample size and asymmetric probe placement in four-point probe experiments. We provide experimental validation of the analytical solution via four-point probe measurements on a thin film rectangular sample with arbitrary probe placement. The finite sample size effect is very significant in four-point probe measurements (especially for a narrow sample) and asymmetric probe placement only worsens such effects. The contribution of conduction in multilayer samples is also studied and found to be substantial; hence, we provide a map of the necessary correction factors. This library of correction factors will enable the design of resistivity measurements with improved accuracy and reproducibility over a wide range of experimental conditions.
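The baseline that the paper's correction-factor library modifies is the ideal colinear four-point probe relation for an infinite thin sheet, R_s = (π/ln 2)·(V/I); non-ideal geometry multiplies it by a correction factor. The voltages, currents, and the 0.85 factor below are placeholders, not values from the paper's tables.

```python
# Sketch of the four-point probe baseline: ideal infinite-sheet sheet
# resistance with an optional geometric correction factor f applied.
import math

def sheet_resistance(voltage, current, correction=1.0):
    """R_s in ohm/sq; correction < 1 models finite-size/placement effects."""
    return (math.pi / math.log(2)) * (voltage / current) * correction

v, i = 1.0e-3, 1.0e-3                              # 1 mV drop at 1 mA
ideal = sheet_resistance(v, i)                     # ~4.532 ohm/sq
narrow = sheet_resistance(v, i, correction=0.85)   # hypothetical narrow sample
print(round(ideal, 3), round(narrow, 3))
```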

  11. The Analog Revolution and Its On-Going Role in Modern Analytical Measurements.

    PubMed

    Enke, Christie G

    2015-12-15

    The electronic revolution in analytical instrumentation began when we first exceeded the two-digit resolution of panel meters and chart recorders and then took the first steps into automated control. It started with the first uses of operational amplifiers (op amps) in the analog domain 20 years before the digital computer entered the analytical lab. Their application greatly increased both accuracy and precision in chemical measurement and they provided an elegant means for the electronic control of experimental quantities. Later, laboratory and personal computers provided an unlimited readout resolution and enabled programmable control of instrument parameters as well as storage and computation of acquired data. However, digital computers did not replace the op amp's critical role of converting the analog sensor's output to a robust and accurate voltage. Rather it added a new role: converting that voltage into a number. These analog operations are generally the limiting portions of our computerized instrumentation systems. Operational amplifier performance in gain, input current and resistance, offset voltage, and rise time have improved by a remarkable 3-4 orders of magnitude since their first implementations. Each 10-fold improvement has opened the doors for the development of new techniques in all areas of chemical analysis. Along with some interesting history, the multiple roles op amps play in modern instrumentation are described along with a number of examples of new areas of analysis that have been enabled by their improvements.
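Why each order-of-magnitude improvement in op-amp open-loop gain mattered can be made concrete with the standard inverting-amplifier relation: closed-loop gain approaches the ideal -R2/R1 only as the open-loop gain A grows. The resistor values are illustrative.

```python
# Closed-loop gain of an inverting amplifier with finite open-loop gain A:
#   G = -(R2/R1) / (1 + (1 + R2/R1) / A)
# Each 10x improvement in A shrinks the departure from the ideal -R2/R1.

def inverting_gain(r1, r2, a_open):
    ideal = -r2 / r1
    return ideal / (1 + (1 + r2 / r1) / a_open)

for a in (1e3, 1e4, 1e5, 1e6):
    g = inverting_gain(1e3, 1e4, a)   # target gain: -10
    print(f"A = {a:g}: gain = {g:.4f}")
```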

  12. Finite element and analytical solutions for van der Pauw and four-point probe correction factors when multiple non-ideal measurement conditions coexist.

    PubMed

    Reveil, Mardochee; Sorg, Victoria C; Cheng, Emily R; Ezzyat, Taha; Clancy, Paulette; Thompson, Michael O

    2017-09-01

    This paper presents an extensive collection of calculated correction factors that account for the combined effects of a wide range of non-ideal conditions often encountered in realistic four-point probe and van der Pauw experiments. In this context, "non-ideal conditions" refer to conditions that deviate from the assumptions on sample and probe characteristics made in the development of these two techniques. We examine the combined effects of contact size and sample thickness on van der Pauw measurements. In the four-point probe configuration, we examine the combined effects of varying the sample's lateral dimensions, probe placement, and sample thickness. We derive an analytical expression to calculate correction factors that account, simultaneously, for finite sample size and asymmetric probe placement in four-point probe experiments. We provide experimental validation of the analytical solution via four-point probe measurements on a thin film rectangular sample with arbitrary probe placement. The finite sample size effect is very significant in four-point probe measurements (especially for a narrow sample) and asymmetric probe placement only worsens such effects. The contribution of conduction in multilayer samples is also studied and found to be substantial; hence, we provide a map of the necessary correction factors. This library of correction factors will enable the design of resistivity measurements with improved accuracy and reproducibility over a wide range of experimental conditions.

  13. Current Technical Approaches for the Early Detection of Foodborne Pathogens: Challenges and Opportunities.

    PubMed

    Cho, Il-Hoon; Ku, Seockmo

    2017-09-30

    The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.

  14. Electrochemical concentration measurements for multianalyte mixtures in simulated electrorefiner salt

    NASA Astrophysics Data System (ADS)

    Rappleye, Devin Spencer

    The development of electroanalytical techniques for multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization, and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has, with few exceptions, been limited to single-analyte mixtures. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on a molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even in the presence of a signal from another analyte, correlating signals to concentration, and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.)
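The abstract does not name the specific signal-to-concentration relation used. For reversible, diffusion-limited cyclic voltammetry, one standard choice is the Randles-Ševčík equation, sketched below under that assumption (the function name is illustrative, and all parameter values in the test are hypothetical, not from this work):

```python
import math

F = 96485.332   # Faraday constant, C/mol
R = 8.314462    # gas constant, J/(mol*K)

def randles_sevcik_conc(i_peak, n, area, diff_coeff, scan_rate, temp_k):
    """Analyte concentration (mol/cm^3) from a cyclic-voltammetry peak
    current via the Randles-Sevcik equation for a reversible,
    diffusion-limited couple:
        i_p = 0.4463 * n * F * A * C * sqrt(n * F * v * D / (R * T))
    Units: i_peak [A], area [cm^2], diff_coeff [cm^2/s], scan_rate [V/s],
    temp_k [K]."""
    return i_peak / (0.4463 * n * F * area *
                     math.sqrt(n * F * scan_rate * diff_coeff / (R * temp_k)))
```

Inverting a calibration relation like this is the simplest route from a voltammetric signal to concentration; the overlapping-signal problem described in the abstract is precisely what makes the single-analyte form insufficient on its own.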

  15. Analytical methods for gelatin differentiation from bovine and porcine origins and food products.

    PubMed

    Nhari, Raja Mohd Hafidz Raja; Ismail, Amin; Che Man, Yaakob B

    2012-01-01

    The use of gelatin in food products has been widely debated for several years, with concerns centering on the source of the gelatin, religion, and health. In response, various analytical methods have been introduced and developed to determine whether a gelatin is of porcine or bovine origin. These methods span a diverse range of equipment and techniques, including spectroscopy, chemical precipitation, chromatography, and immunochemical methods. Each technique can differentiate gelatins to a certain extent, with its own advantages and limitations. This review provides an overview of the analytical methods available for differentiating bovine and porcine gelatin, both as such and in food products, so that new methods can be developed. © 2011 Institute of Food Technologists®

  16. An analytical and experimental evaluation of the plano-cylindrical Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. L.; Cosby, R. M.

    1976-01-01

    Plastic Fresnel lenses for solar concentration are attractive because of their potential for low-cost mass production. An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range is reported. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based primarily on a 56 cm-wide lens with an f-number of 1.0. A sun-tracking heliostat provided a non-moving solar source. Measured data indicated more spreading at the profile base than analytically predicted. The measured and computed transmittances were 85% and 87%, respectively. Preliminary testing with a second lens (1.85 m) indicated that modified manufacturing techniques corrected the profile-spreading problem.
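A quick physical sanity check on transmittance figures like those above: at normal incidence, Fresnel reflection at the lens's two air interfaces alone sets an upper bound. The sketch below computes that bound from the refractive index (it deliberately ignores groove blocking, absorption, and scattering, which is why it comes out above the ~87% computed in the study):

```python
def fresnel_limited_transmittance(n):
    """Upper-bound transmittance of a lens at normal incidence,
    counting only Fresnel reflection losses at the two air
    interfaces; groove blocking, absorption, and scattering
    are ignored."""
    r = ((n - 1.0) / (n + 1.0)) ** 2   # reflectance per surface
    return (1.0 - r) ** 2              # two surfaces in series

# Acrylic (PMMA), n ~ 1.49, gives ~92%; the gap down to the ~87%
# computed transmittance is attributable to the neglected losses.
```

This kind of bound is useful when deciding whether a measured shortfall (here 85% vs. 87%) stems from the material itself or from manufacturing artifacts such as the profile spreading noted above.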

  17. Three-dimensional circulation dynamics of along-channel flow in stratified estuaries

    NASA Astrophysics Data System (ADS)

    Musiak, Jeffery Daniel

    Estuaries are vital because they are the major interface between humans and the oceans and provide valuable habitat for a wide range of organisms. It is therefore important to model estuarine circulation to gain a better understanding of the mechanics involved and of how people affect estuaries. To this end, this dissertation combines analysis of data collected in the Columbia River estuary (CRE) with novel data processing and modeling techniques to further the understanding of estuaries that are strongly forced by riverflow and tides. The primary hypothesis tested in this work is that the three-dimensional (3-D) variability in along-channel currents in a strongly forced estuary can be largely accounted for by including the lateral variations in density and bathymetry while neglecting the secondary, or lateral, flow. Of course, the forcing must also include riverflow and oceanic tides. Incorporating this simplification and the modeling ideas put forth by others, together with new modeling techniques and new ideas on estuarine circulation, will allow me to create a semi-analytical quasi-3-D profile model. This approach was chosen because it is of intermediate complexity between purely analytical models, which, if tractable, are too simple to be useful, and 3-D numerical models, which can have excellent resolution but require large amounts of time, computer memory, and computing power. Validation of the model will be accomplished using velocity and density data collected in the Columbia River estuary and by comparison with analytical solutions. Components of the modeling developed here include: (1) development of a 1-D barotropic model for tidal wave propagation in frictionally dominated systems with strong topography; this model can handle multiple tidal constituents and multiply connected channels. (2) Development and verification of a new quasi-3-D semi-analytical velocity profile model applicable to estuarine systems that are strongly forced by both oceanic tides and riverflow. This model includes diurnal and semi-diurnal tidal and non-linearly generated overtide circulation, as well as residual circulation driven by riverflow, baroclinic forcing, surface wind stress, and non-linear tidal forcing. (3) Demonstration that much of the lateral variation in along-channel currents is caused by variations in along-channel density forcing and bathymetry.

  18. On the first crossing distributions in fractional Brownian motion and the mass function of dark matter haloes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hiotelis, Nicos; Popolo, Antonino Del, E-mail: adelpopolo@oact.inaf.it, E-mail: hiotelis@ipta.demokritos.gr

    We construct an integral equation for the first crossing distributions for fractional Brownian motion in the case of a constant barrier and present an exact analytical solution. Additionally, we present first crossing distributions derived by simulating paths of fractional Brownian motion. We compare the results of the analytical solutions with both those of simulations and those of some approximate solutions that have been used in the literature. Finally, we present multiplicity functions for dark matter structures resulting from our analytical approach and compare them with those resulting from N-body simulations. We show that the results of the analytical solutions are in good agreement with those of path simulations but differ significantly from those derived from approximate solutions. Additionally, multiplicity functions derived from fractional Brownian motion are poor fits to those which result from N-body simulations. We also present comparisons with other models that exist in the literature and discuss different ways of improving the agreement between analytical results and N-body simulations.
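The path-simulation side of such a study can be sketched as follows: generate exact fractional Brownian motion sample paths by Cholesky factorisation of the fBm covariance, then record the first time each path reaches the constant barrier. Function names and grid choices here are illustrative, not the authors' code:

```python
import numpy as np

def fbm_paths(hurst, n_steps, n_paths, t_max=1.0, seed=0):
    """Exact fBm sample paths on a uniform grid via Cholesky
    factorisation of the fBm covariance
    Cov(B_H(s), B_H(t)) = 0.5 * (s^2H + t^2H - |t - s|^2H)."""
    rng = np.random.default_rng(seed)
    t = np.linspace(t_max / n_steps, t_max, n_steps)
    s, u = np.meshgrid(t, t)
    h2 = 2.0 * hurst
    cov = 0.5 * (s**h2 + u**h2 - np.abs(s - u)**h2)
    chol = np.linalg.cholesky(cov)
    return t, chol @ rng.standard_normal((n_steps, n_paths))

def first_crossing_times(t, paths, barrier):
    """Empirical first times each path reaches a constant barrier
    (NaN for paths that never cross within the horizon)."""
    hit = paths >= barrier                   # (n_steps, n_paths) booleans
    idx = hit.argmax(axis=0)                 # index of first True per path
    times = t[idx].astype(float)
    times[~hit.any(axis=0)] = np.nan         # mask paths that never cross
    return times
```

For hurst = 0.5 this reduces to ordinary Brownian motion, where the reflection principle gives the crossing probability in closed form, which provides a convenient check on the simulation (note the discrete grid slightly undercounts crossings).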

  19. Applications of Flow Cytometry to Clinical Microbiology†

    PubMed Central

    Álvarez-Barrientos, Alberto; Arroyo, Javier; Cantón, Rafael; Nombela, César; Sánchez-Pérez, Miguel

    2000-01-01

    Classical microbiology techniques are relatively slow in comparison to other analytical techniques, in many cases due to the need to culture the microorganisms. Furthermore, classical approaches are difficult with unculturable microorganisms. More recently, the emergence of molecular biology techniques, particularly those based on antibodies and nucleic acid probes combined with amplification techniques, has brought speed and specificity to microbiological diagnosis. Flow cytometry (FCM) allows single- or multiple-microbe detection in clinical samples in an easy, reliable, and fast way. Microbes can be identified on the basis of their distinctive cytometric parameters or by means of certain fluorochromes that can be used either independently or bound to specific antibodies or oligonucleotides. FCM has permitted the development of quantitative procedures to assess antimicrobial susceptibility and drug cytotoxicity in a rapid, accurate, and highly reproducible way. Furthermore, this technique allows the monitoring of in vitro antimicrobial activity and of antimicrobial treatments ex vivo. The most outstanding contribution of FCM is the possibility of detecting the presence of heterogeneous populations with different responses to antimicrobial treatments. Despite these advantages, the application of FCM in clinical microbiology is not yet widespread, probably due to the lack of access to flow cytometers or the lack of knowledge about the potential of this technique. One of the goals of this review is to attempt to mitigate this latter circumstance. We are convinced that in the near future, the availability of commercial kits should increase the use of this technique in the clinical microbiology laboratory. PMID:10755996

  20. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are lesser-known techniques that may also prove useful.
