Sample records for methodologies analytical techniques

  1. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite and current analytical electrochemistry including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  2. Selecting a software development methodology [of digital flight control systems]

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques: specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. For techniques that cannot be quantitatively assessed, qualitative judgements are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.

  3. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    PubMed

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
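    The control-chart analysis this abstract describes can be illustrated with an individuals (XmR) chart, a standard QI tool. The sketch below is a generic illustration, not the authors' actual analysis; the length-of-stay values used are hypothetical.

```python
def xmr_limits(values):
    """Centre line and control limits for an individuals (XmR) chart.

    Limits are mean +/- 2.66 * (average moving range), where 2.66 is the
    standard XmR constant (3 / d2, with d2 = 1.128 for subgroups of 2).
    """
    n = len(values)
    centre = sum(values) / n
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return centre, centre - 2.66 * mr_bar, centre + 2.66 * mr_bar

def out_of_control(values):
    """Indices of points outside the limits (a basic special-cause signal)."""
    centre, lcl, ucl = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]
```

    In the study's terms, each value might be a monthly mean ED length of stay; a point beyond the limits is the kind of special-cause signal that, per the abstract, appeared well before the trial's planned enrolment was complete.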

  4. Analytical technique characterizes all trace contaminants in water

    NASA Technical Reports Server (NTRS)

    Foster, J. N.; Lysyj, I.; Nelson, K. H.

    1967-01-01

    A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to investigating the sources of water pollution.

  5. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow several operations to be integrated, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and isolation of analytes from the sample matrix. This work presents information about novel methodological and instrumental solutions for different variants of solid phase extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. These methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, make it possible to reduce the number of errors during sample preparation prior to chromatographic analysis and to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Recent Methodology in Ginseng Analysis

    PubMed Central

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    Matching its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to hereafter as ginseng analysis, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis over the past half-decade, including emerging techniques and analytical trends. Ginseng analysis draws on all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  7. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria for use as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009): colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant methods along with their ability to meet current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in sample pretreatment and extraction and in the different instrumental approaches developed to address this challenge. Cosmetics are complex samples, and most of them require pretreatment before analysis. In recent years, research in this area has tended toward green extraction and microextraction techniques. Analytical methods have generally been based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; but some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  9. Closed-loop, pilot/vehicle analysis of the approach and landing task

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.; Anderson, M. R.

    1985-01-01

    Optimal-control-theoretic modeling and frequency-domain analysis constitute the methodology proposed for analytically evaluating the handling qualities of higher-order, manually controlled dynamic systems. Fundamental to the methodology is evaluating the interplay between pilot workload and closed-loop pilot/vehicle performance and stability robustness. The model-based metric for pilot workload is the required pilot phase compensation. Pilot/vehicle performance and loop stability are then evaluated using frequency-domain techniques. When these techniques were applied to flight-test data for thirty-two highly augmented fighter configurations, strong correlation was obtained between the analytical and experimental results.

  10. [Theoretical and methodological uses of research in Social and Human Sciences in Health].

    PubMed

    Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein

    2012-12-01

    The current article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/methodological reference", "study type/methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and instrumental use of the qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.

  11. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which it is derived, are composed of organic compounds together with some trace elements. These compounds give insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring these geochemical data. Owing to progress in the development of new analytical techniques, many long-standing petroleum exploration problems have been resolved. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. Various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the techniques that have helped in understanding the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed in this paper. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome them, many analytical techniques have been developed in recent years. Enrichment methods improve the sensitivity of detection, while HPLC and mass spectrometry methods facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed; multiple analytical techniques are compared, and the advantages and disadvantages of each are highlighted. © 2017 Elsevier Inc. All rights reserved.

  13. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications, and numerous biological functions are related to it. However, analytical challenges remain in glycoprotein analysis. To overcome them, many analytical techniques have been developed in recent years. Enrichment methods improve the sensitivity of detection, while HPLC and mass spectrometry methods facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed; multiple analytical techniques are compared, and the advantages and disadvantages of each are highlighted. PMID:28109440

  14. Multimodal system planning technique : an analytical approach to peak period operation

    DOT National Transportation Integrated Search

    1995-11-01

    The multimodal system planning technique described in this report is an improvement of the methodology used in the Dallas System Planning Study. The technique includes a spreadsheet-based process to identify the costs of congestion, construction, and...

  15. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by alerting and predicting when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and gives insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. The study uses data products that would commonly be analyzed at a site, and builds on a big data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology for implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
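    The abstract does not specify which predictive algorithms were used. As a generic illustration of the idea it describes (flagging sensor readings that deviate from a learned baseline), here is a minimal detector based on an exponentially weighted moving average (EWMA); all names, parameters and data are assumptions, not taken from the study.

```python
def ewma_anomalies(values, alpha=0.1, k=3.0, warmup=10):
    """Flag points whose residual against an EWMA baseline exceeds k sigma.

    `mean` and `var` are exponentially weighted estimates of the stream's
    level and spread; a point is only eligible for flagging after `warmup`
    samples, once the estimates have settled.
    """
    mean = values[0]
    var = 0.0
    flags = []
    for i, x in enumerate(values):
        resid = x - mean
        std = var ** 0.5
        flags.append(i >= warmup and std > 0.0 and abs(resid) > k * std)
        # update the baseline *after* testing the current point
        mean += alpha * resid
        var = (1.0 - alpha) * (var + alpha * resid * resid)
    return flags
```

    Fed a simulated telemetry stream, such a detector raises an alert when a reading departs sharply from recent behaviour, which is the kind of threshold-crossing signal the paper's architecture is meant to surface in near real-time.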

  16. Antimony in the environment as a global pollutant: a review on analytical methodologies for its determination in atmospheric aerosols.

    PubMed

    Smichowski, Patricia

    2008-03-15

    This review summarizes and discusses research on the determination of antimony and its predominant chemical species in atmospheric aerosols. Environmental matrices such as airborne particulate matter, fly ash and volcanic ash present a number of complex analytical challenges, requiring very sensitive analytical techniques and highly selective separation methodologies for speciation studies. Given the diversity of instrumental approaches and methodologies employed for the determination of antimony and its species in environmental matrices, the objective of this review is to briefly discuss the most relevant findings reported in recent years for this remarkable element and to identify future needs and trends. The survey includes 92 references and covers principally the literature published over the last decade.

  17. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    ERIC Educational Resources Information Center

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  18. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of organic persistent pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also discussed in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Treesearch

    Toddi A. Steelman; Branda Nowell; Deena Bayoumi; Sarah McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  20. The role of chromatographic and chiroptical spectroscopic techniques and methodologies in support of drug discovery for atropisomeric drug inhibitors of Bruton's tyrosine kinase.

    PubMed

    Dai, Jun; Wang, Chunlei; Traeger, Sarah C; Discenza, Lorell; Obermeier, Mary T; Tymiak, Adrienne A; Zhang, Yingru

    2017-03-03

    Atropisomers are stereoisomers resulting from hindered bond rotation. From the synthesis of pure atropisomers and the characterization of their interconversion thermodynamics to the investigation of biological stereoselectivity, the evaluation of drug candidates subject to atropisomerism creates special challenges in both early drug discovery and later drug development. In this paper, we demonstrate an array of analytical techniques and systematic approaches to study the atropisomerism of drug molecules and meet these challenges. Using a case study of Bruton's tyrosine kinase (BTK) inhibitor drug candidates at Bristol-Myers Squibb, we present the analytical strategies and methodologies used during drug discovery, including the detection of atropisomers, the determination of their relative composition, the identification of relative chirality, the isolation of individual atropisomers, the evaluation of interconversion kinetics, and the characterization of chiral stability in the solid state and in solution. In vivo and in vitro stereo-stability and stereo-selectivity were investigated, as well as the pharmacological significance of any changes in atropisomer ratios. Techniques applied in these studies include analytical and preparative enantioselective supercritical fluid chromatography (SFC), enantioselective high performance liquid chromatography (HPLC), circular dichroism (CD), and mass spectrometry (MS). Our experience illustrates how atropisomerism can be a very complicated issue in drug discovery and why a thorough understanding of this phenomenon is necessary to provide guidance for pharmaceutical development. Analytical techniques and methodologies facilitate key decisions during the discovery of atropisomeric drug candidates by characterizing time-dependent physicochemical properties that can have significant biological implications and relevance to pharmaceutical development plans. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Background for Joint Systems Aspects of AIR 6000

    DTIC Science & Technology

    2000-04-01

    Checkland’s Soft Systems Methodology [7, 8, 9]. The analytical techniques that are proposed for joint systems work are based on calculating probability...Supporting Global Interests 21 DSTO-CR-0155 SLMP Structural Life Management Plan SOW Stand-Off Weapon SSM Soft Systems Methodology UAV Uninhabited Aerial... Systems Methodology in Action, John Wiley & Sons, Chichester, 1990. [10] Pearl, Judea, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible

  2. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step: it isolates and concentrates desired components from complex matrices and greatly influences their reliable and accurate analysis and the resulting data quality. It is the most labor-intensive and error-prone process in analytical methodology and, therefore, may influence the analytical performance of target-analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments, and low-cost operation through extremely low or no solvent consumption. Microextraction techniques, such as microextraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS, namely its formats (on- and off-line), sorbents, and experimental protocols; the factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent applications of MEPS in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  3. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for the state of the art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys comprehensively and in two parts the developments of automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operation advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME and their static and dynamic operation modes as well as their automation methodologies is given. The LPME techniques are classified according to the different approaches of protection of the extraction solvent using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent such as membranes and porous media are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Critical review of dog detection and the influences of physiology, training, and analytical methodologies.

    PubMed

    Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M

    2018-08-01

    Detection dogs serve a plethora of roles within modern society and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs have involved considerable ambiguity, partly because the assessment of detection-dog effectiveness remains entrenched in a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs rest on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography-related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies serve as both an alternative and a complement to the detection dog industry, but the interrelationship between the two detection paradigms requires clarification. Taken together, these factors illustrate a need to address research gaps, to formalise the detection dog industry and its research process, and to consider analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of these factors in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Modern Instrumental Methods in Forensic Toxicology*

    PubMed Central

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  6. Electrochemical concentration measurements for multianalyte mixtures in simulated electrorefiner salt

    NASA Astrophysics Data System (ADS)

    Rappleye, Devin Spencer

The development of electroanalytical techniques for multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has been limited to single-analyte mixtures with a few exceptions. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on a molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even in the presence of an additional signal from another analyte, correlating signals to concentration, and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.)
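The signal-to-concentration correlation step described in this record is, at its core, a calibration problem. A minimal sketch, assuming a linear peak-current-versus-concentration response; all numbers are illustrative, not data from the cited work:

```python
# Least-squares calibration of peak current vs. concentration, then
# inversion to estimate an unknown concentration.
# All numbers below are illustrative, not measurements from the cited work.

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Standard additions of the analyte (wt%) and measured peak currents (mA)
conc = [0.5, 1.0, 1.5, 2.0, 2.5]
peak = [0.52, 1.01, 1.49, 2.02, 2.51]   # near-linear response

a, b = fit_line(conc, peak)
unknown_current = 1.75                   # hypothetical measured signal
estimated_conc = (unknown_current - a) / b
print(round(estimated_conc, 2))
```

In practice the calibration would be repeated per analyte, since the abstract's central difficulty is separating overlapping signals from two analytes before this inversion can be applied.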

  7. 75 FR 71131 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... impacts. To complete this task with scientific rigor, it will be necessary to collect high quality survey... instruments, methodologies, procedures, and analytical techniques for this task. Moreover, they have been pilot tested in 11 States. The tools and techniques were submitted for review, and were approved, by...

  8. Industrial Demand Module - NEMS Documentation

    EIA Publications

    2014-01-01

    Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Industrial Demand Module. The report catalogues and describes model assumptions, computational methodology, parameter estimation techniques, and model source code.

  9. Hybrid perturbation methods based on statistical time series models

    NASA Astrophysics Data System (ADS)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies: in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, and mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the dynamics missing from the previously integrated approximation. This combination improves the precision of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by combining three different orders of approximation of an analytical theory with a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three analytical components are the integration of the Kepler problem, a first-order analytical theory, and a second-order analytical theory, whereas the prediction technique is the same in all three cases, namely an additive Holt-Winters method.
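The statistical component named above, the additive Holt-Winters method, can be sketched in pure Python. The seasonal period, smoothing constants, and the synthetic "residual" series standing in for the missing dynamics are all illustrative assumptions, not values from the paper:

```python
import math

def holt_winters_additive(y, m, alpha, beta, gamma, h):
    """Additive Holt-Winters: returns h forecasts past the end of y.
    m is the seasonal period; alpha/beta/gamma are smoothing constants."""
    level = sum(y[:m]) / m
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)
    season = [y[i] - level for i in range(m)]
    for t, obs in enumerate(y):
        s = season[t % m]
        prev_level = level
        level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (obs - level) + (1 - gamma) * s
    return [level + (k + 1) * trend + season[(len(y) + k) % m]
            for k in range(h)]

# Synthetic "missing dynamics": a slow drift plus a periodic term, standing
# in for the residual between an analytical propagation and the true orbit.
true = lambda t: 1e-3 * t + 0.05 * math.sin(2 * math.pi * t / 12)
history = [true(t) for t in range(120)]
forecast = holt_winters_additive(history, m=12, alpha=0.3, beta=0.05,
                                 gamma=0.2, h=12)
errors = [abs(f - true(120 + k)) for k, f in enumerate(forecast)]
print(max(errors))
```

The hybrid propagator's final prediction is then the analytical approximation plus the forecast residual correction.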

  10. DE-CERTS: A Decision Support System for a Comparative Evaluation Method for Risk Management Methodologies and Tools

    DTIC Science & Technology

    1991-09-01

III. The Analytic Hierarchy Process: A. Introduction; B. The AHP Process; C. ... A. Introduction; B. Implementation of CERTS Using AHP: 1. Consistency; 2. User Interface ... the proposed technique into a Decision Support System. Expert Choice implements the Analytic Hierarchy Process (AHP), an approach to multi-criteria
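The AHP named in this record reduces a pairwise-comparison matrix to a priority vector (its principal eigenvector) plus a consistency check. A self-contained sketch; the 3-criterion judgment matrix is invented for illustration, and the random indices are Saaty's standard values:

```python
def ahp_priorities(A, iters=50):
    """Principal eigenvector of a pairwise-comparison matrix via power
    iteration, plus the consistency ratio CR = CI / RI."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # lambda_max estimated from A*w ~= lambda_max * w
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's random indices
    return w, ci / ri

# Illustrative judgments: criterion 1 moderately-to-strongly preferred.
A = [[1.0, 3.0, 5.0],
     [1/3, 1.0, 3.0],
     [1/5, 1/3, 1.0]]
weights, cr = ahp_priorities(A)
print([round(x, 3) for x in weights], round(cr, 3))
```

A CR below 0.1 is the conventional threshold for accepting the judgments as consistent, which is the "Consistency" step visible in the table of contents above.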

  11. International Natural Gas Model 2011, Model Documentation Report

    EIA Publications

    2013-01-01

    This report documents the objectives, analytical approach and development of the International Natural Gas Model (INGM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  12. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due, in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance the inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
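Of the three techniques, the p-technique is essentially factor analysis of a single person's occasions-by-variables matrix. A self-contained sketch using synthetic data (the latent "craving" process, loadings, and noise level are assumptions for illustration), extracting the first principal component of the within-person correlation matrix:

```python
import random

random.seed(42)

# One individual's ambulatory-assessment stream: a latent state drives
# three observed items. Loadings and noise level are illustrative.
T = 300
latent = [random.gauss(0, 1) for _ in range(T)]
true_loadings = [0.9, 0.8, 0.7]
data = [[b * latent[t] + random.gauss(0, 0.4) for b in true_loadings]
        for t in range(T)]

def corr_matrix(X):
    """Pearson correlation matrix of an occasions-by-variables table."""
    n, p = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(p)]
    sds = [(sum((row[j] - means[j]) ** 2 for row in X) / n) ** 0.5
           for j in range(p)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in X)
             / (n * sds[i] * sds[j]) for j in range(p)] for i in range(p)]

def first_principal_component(R, iters=200):
    """Leading eigenvector and eigenvalue of R by power iteration."""
    p = len(R)
    v = [1.0] * p
    for _ in range(iters):
        v = [sum(R[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    Rv = [sum(R[i][j] * v[j] for j in range(p)) for i in range(p)]
    return v, sum(v[i] * Rv[i] for i in range(p))

R = corr_matrix(data)
loadings, eigval = first_principal_component(R)
explained = eigval / len(R)  # proportion of variance on the first component
print(round(explained, 2))
```

Because all computation stays within one person's time series, the extracted structure is individualized, which is exactly the property the abstract argues averaging-based techniques discard.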

  13. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached by which protective action must begin. In keeping with the nuclear regulations and industry standards, satisfying these two requirements will ensure that the safety limit is not exceeded during the design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both the design basis event and the beyond design basis event. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, it is hard to find studies on a systematically integrated methodology for response time evaluation. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test.
The analysis technique also has the drawback of yielding extreme times that are not actually possible. Thus, the establishment of a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers from the sensor to the final actuation device on the instrument channel. When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device.
As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the proposed methodology in this paper plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
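In schematic terms, the analysis technique described above is a time-budget check along the critical signal path: allocate a response time to each component, sum the allocations, and compare against the analytical limit. A sketch with hypothetical component names and millisecond allocations (not actual APR1400/OPR1000 design values):

```python
# Hypothetical response-time allocations (ms) along one instrument channel,
# from sensor to final actuation device; not actual plant design values.
allocations_ms = {
    "pressure transmitter": 400,
    "signal conditioning": 50,
    "bistable processor": 150,
    "coincidence logic": 100,
    "final actuation device": 200,
}
analytical_limit_ms = 1200   # assumed response time from the safety analysis

total_ms = sum(allocations_ms.values())
margin_ms = analytical_limit_ms - total_ms
print(total_ms, margin_ms, total_ms <= analytical_limit_ms)
```

The complementary test technique then verifies the same budget empirically, either in one end-to-end test or by combining per-component tests that together cover the whole channel.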

  14. Analytical Methodologies for the Determination of Endocrine Disrupting Compounds in Biological and Environmental Samples

    PubMed Central

    Sosa-Ferrera, Zoraida; Mahugo-Santana, Cristina; Santana-Rodríguez, José Juan

    2013-01-01

    Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented. PMID:23738329

  15. Validation of Multilevel Constructs: Validation Methods and Empirical Findings for the EDI

    ERIC Educational Resources Information Center

    Forer, Barry; Zumbo, Bruno D.

    2011-01-01

    The purposes of this paper are to highlight the foundations of multilevel construct validation, describe two methodological approaches and associated analytic techniques, and then apply these approaches and techniques to the multilevel construct validation of a widely-used school readiness measure called the Early Development Instrument (EDI;…

  16. On the design of decoupling controllers for advanced rotorcraft in the hover case

    NASA Technical Reports Server (NTRS)

    Fan, M. K. H.; Tits, A.; Barlow, J.; Tsing, N. K.; Tischler, M.; Takahashi, M.

    1991-01-01

    A methodology for design of helicopter control systems is proposed that can account for various types of concurrent specifications: stability, decoupling between longitudinal and lateral motions, handling qualities, and physical limitations of the swashplate motions. This is achieved by synergistic use of analytical techniques (Q-parameterization of all stabilizing controllers, transfer function interpolation) and advanced numerical optimization techniques. The methodology is used to design a controller for the UH-60 helicopter in hover. Good results are achieved for decoupling and handling quality specifications.

  17. [Meta-analyses of quarks, baryons and mesons--a "Cochrane Collaboration" in particle physics].

    PubMed

    Sauerland, Stefan; Sauerland, Thankmar; Antes, Gerd; Barnett, R Michael

    2002-02-01

Within the last 20 years, meta-analysis has become an important research technique in medicine for integrating the results of independent studies. Meta-analytical techniques, however, are much older: in particle physics, the properties of huge numbers of particles have been assessed in meta-analyses for 50 years now. The Cochrane Collaboration's counterpart in physics is the Particle Data Group. This article compares methodological and organisational aspects of meta-analyses in medicine and physics. Several interesting parallels exist, especially with regard to methodology.
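A computational core shared by both communities is the fixed-effect (inverse-variance) weighted average; the Particle Data Group additionally inflates the pooled error by a scale factor when measurements scatter more than their quoted errors allow. A sketch with invented measurements:

```python
# Fixed-effect (inverse-variance) pooling of independent measurements,
# with the PDG-style scale factor S = sqrt(chi2 / (N - 1)).
# The measurement values below are invented for illustration.
measurements = [(10.2, 0.4), (9.8, 0.3), (10.5, 0.5), (10.0, 0.2)]  # (value, sigma)

weights = [1 / s ** 2 for _, s in measurements]
mean = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
err = (1 / sum(weights)) ** 0.5
chi2 = sum(((v - mean) / s) ** 2 for v, s in measurements)
scale = max(1.0, (chi2 / (len(measurements) - 1)) ** 0.5)
print(round(mean, 3), round(err, 3), round(scale, 2))
```

Here the scatter is consistent with the quoted errors, so the scale factor stays at 1; a scale factor above 1 would signal heterogeneity, the analogue of a random-effects concern in medical meta-analysis.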

  18. Improved Design of Tunnel Supports : Executive Summary

    DOT National Transportation Integrated Search

    1979-12-01

    This report focuses on improvement of design methodologies related to the ground-structure interaction in tunneling. The design methods range from simple analytical and empirical methods to sophisticated finite element techniques as well as an evalua...

  19. Residential Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Model Documentation - Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Residential Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, and FORTRAN source code.

  20. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent, a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that, alone or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final two years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided by whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level.
An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.

  1. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    PubMed

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that most influence results are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
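Parameter uncertainty is typically explored by Monte Carlo simulation (probabilistic sensitivity analysis): draw each parameter from its distribution, rerun the model, and report the distribution of results. A toy sketch of a two-strategy cost-effectiveness comparison; all distributions, values, and the willingness-to-pay threshold are invented for illustration:

```python
import random

random.seed(1)

# Toy decision model: incremental net monetary benefit of a new treatment.
# Parameter distributions below are illustrative, not from any real appraisal.
def draw_parameters():
    eff_gain = random.gauss(0.10, 0.03)     # incremental QALYs
    extra_cost = random.gauss(2000, 500)    # incremental cost
    return eff_gain, extra_cost

threshold = 30000   # willingness to pay per QALY
n = 10000
nmb = []
for _ in range(n):
    eff, cost = draw_parameters()
    nmb.append(threshold * eff - cost)      # incremental net monetary benefit

p_cost_effective = sum(1 for x in nmb if x > 0) / n
print(round(p_cost_effective, 3))
```

Structural and methodological uncertainty, by contrast, are usually handled by rerunning this whole loop under alternative model structures or analytic choices and comparing the resulting distributions.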

  2. World Energy Projection System Plus Model Documentation: Coal Module

    EIA Publications

    2011-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Coal Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  3. World Energy Projection System Plus Model Documentation: Transportation Module

    EIA Publications

    2017-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) International Transportation model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  4. World Energy Projection System Plus Model Documentation: Residential Module

    EIA Publications

    2016-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Residential Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  5. World Energy Projection System Plus Model Documentation: Refinery Module

    EIA Publications

    2016-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Refinery Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  6. World Energy Projection System Plus Model Documentation: Main Module

    EIA Publications

    2016-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Main Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  7. Transportation Sector Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Transportation Model (TRAN). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated by the model.

  8. World Energy Projection System Plus Model Documentation: Electricity Module

    EIA Publications

    2017-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Electricity Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  9. HPLC-PFD determination of priority pollutant PAHs in water, sediment, and semipermeable membrane devices

    USGS Publications Warehouse

    Williamson, K.S.; Petty, J.D.; Huckins, J.N.; Lebo, J.A.; Kaiser, E.M.

    2002-01-01

High performance liquid chromatography coupled with programmable fluorescence detection was employed for the determination of 15 priority pollutant polycyclic aromatic hydrocarbons (PPPAHs) in water, sediment, and semipermeable membrane devices (SPMDs). Chromatographic separation using this analytical method provides selectivity and sensitivity (ppt-level detection) and can serve as a non-destructive technique prior to subsequent analysis by other chromatographic and spectroscopic techniques. Extraction and sample cleanup procedures were also developed for water, sediment, and SPMDs using various chromatographic and wet chemical methods. The focus of this publication is to examine the enrichment techniques and the analytical methodologies used in the isolation, characterization, and quantitation of the 15 PPPAHs in different sample matrices.

  10. Size analysis of polyglutamine protein aggregates using fluorescence detection in an analytical ultracentrifuge.

    PubMed

    Polling, Saskia; Hatters, Danny M; Mok, Yee-Foong

    2013-01-01

    Defining the aggregation process of proteins formed by poly-amino acid repeats in cells remains a challenging task due to a lack of robust techniques for their isolation and quantitation. Sedimentation velocity methodology using fluorescence detected analytical ultracentrifugation is one approach that can offer significant insight into aggregation formation and kinetics. While this technique has traditionally been used with purified proteins, it is now possible for substantial information to be collected with studies using cell lysates expressing a GFP-tagged protein of interest. In this chapter, we describe protocols for sample preparation and setting up the fluorescence detection system in an analytical ultracentrifuge to perform sedimentation velocity experiments on cell lysates containing aggregates formed by poly-amino acid repeat proteins.

  11. Advances in Instrumental Analysis of Brominated Flame Retardants: Current Status and Future Perspectives

    PubMed Central

    2014-01-01

    This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482

  12. Measuring solids concentration in stormwater runoff: comparison of analytical methods.

    PubMed

    Clark, Shirley E; Siu, Christina Y S

    2008-01-15

Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains of how to compare these values with historical water-quality data for which the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two measures of suspended solids concentration, including the effects of the aliquot selection/collection method and of the particle size distribution (PSD). The results showed that SSC best represented the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also particle size information for the solids in stormwater runoff.
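The difference between the two determinations can be made concrete with a mass-balance sketch. Assume, purely for illustration, a 1 L sample containing 50 mg of fine solids plus 50 mg of fast-settling sand; SSC filters the whole sample, while a pipetted TSS aliquot captures only a fraction of the settled sand:

```python
# Illustrative mass balance for SSC vs. TSS; all values are assumed.
sample_volume_l = 1.0
fines_mg = 50.0               # stays suspended, captured by any aliquot
sand_mg = 50.0                # settles quickly; under-sampled by a pipette
sand_capture_fraction = 0.2   # assumed fraction of sand reaching the aliquot

# SSC: whole-sample analysis recovers everything.
ssc_mg_per_l = (fines_mg + sand_mg) / sample_volume_l

# TSS: a 100 mL aliquot, scaled back to a per-litre concentration.
aliquot_l = 0.1
aliquot_mass_mg = (fines_mg * aliquot_l
                   + sand_mg * sand_capture_fraction * aliquot_l)
tss_mg_per_l = aliquot_mass_mg / aliquot_l

print(ssc_mg_per_l, tss_mg_per_l)  # TSS biased low when coarse solids settle
```

This is why the study found TSS results to be PSD-dependent: the bias scales with the coarse fraction that never reaches the aliquot.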

  13. World Energy Projection System Plus Model Documentation: Greenhouse Gases Module

    EIA Publications

    2011-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Greenhouse Gases Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  14. World Energy Projection System Plus Model Documentation: Natural Gas Module

    EIA Publications

    2011-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) Natural Gas Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  15. World Energy Projection System Plus Model Documentation: District Heat Module

    EIA Publications

    2017-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) District Heat Model. It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  16. World Energy Projection System Plus Model Documentation: Industrial Module

    EIA Publications

    2016-01-01

This report documents the objectives, analytical approach and development of the World Energy Projection System Plus (WEPS+) World Industrial Model (WIM). It also catalogues and describes critical assumptions, computational methodology, parameter estimation techniques, and model source code.

  17. Decision analysis to complete diagnostic research by closing the gap between test characteristics and cost-effectiveness.

    PubMed

    Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik

    2009-12-01

    The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).

  18. Coprecipitation-assisted coacervative extraction coupled to high-performance liquid chromatography: An approach for determining organophosphorus pesticides in water samples.

    PubMed

    Mammana, Sabrina B; Berton, Paula; Camargo, Alejandra B; Lascalea, Gustavo E; Altamirano, Jorgelina C

    2017-05-01

An analytical methodology based on coprecipitation-assisted coacervative extraction coupled to HPLC-UV was developed for the determination of five organophosphorus pesticides (OPPs), including fenitrothion, guthion, parathion, methidathion, and chlorpyrifos, in water samples. It involves a green technique leading to an efficient and simple analytical methodology suitable for high-throughput analysis. Relevant physicochemical variables were studied and optimized against the analytical response of each OPP. Under optimized conditions, the resulting methodology was as follows: an aliquot of 9 mL of water sample was placed into a centrifuge tube, and 0.5 mL of 0.1 M sodium citrate (pH 4), 0.08 mL of 0.1 M Al2(SO4)3, and 0.7 mL of 0.1 M SDS were added and homogenized. After centrifugation the supernatant was discarded. A 700 μL aliquot of the obtained coacervate-rich phase was dissolved with 300 μL of methanol, and 20 μL of the resulting solution was analyzed by HPLC-UV. The resulting LODs ranged within 0.7-2.5 ng/mL, and the achieved RSD and recovery values were <8% (n = 3) and >81%, respectively. The proposed analytical methodology was successfully applied for the analysis of the five OPPs in water samples intended for human consumption from different locations in Mendoza. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
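A quick arithmetic check of what the workup above implies for preconcentration, assuming (illustratively) that the analyte transfers quantitatively into the 0.7 mL coacervate phase and taking the reported 81% worst-case recovery:

```python
# Theoretical enrichment factor for the coacervative extraction:
# 9 mL of sample is concentrated into 0.7 mL of coacervate, which is then
# diluted to 1.0 mL with methanol before injection. The 81% recovery is
# the lower bound reported in the abstract; quantitative phase transfer
# is an assumption made for this back-of-envelope estimate.
sample_ml = 9.0
final_ml = 0.7 + 0.3        # coacervate phase + methanol diluent
recovery = 0.81             # worst-case recovery

enrichment = recovery * sample_ml / final_ml
print(round(enrichment, 2))  # effective preconcentration of the analyte
```

An enrichment factor of this order is consistent with the method reaching low-ng/mL detection limits with UV detection.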

  19. Measurement and Modelling: Sequential Use of Analytical Techniques in a Study of Risk-Taking in Decision-Making by School Principals

    ERIC Educational Resources Information Center

    Trimmer, Karen

    2016-01-01

    This paper investigates reasoned risk-taking in decision-making by school principals using a methodology that combines sequential use of psychometric and traditional measurement techniques. Risk-taking is defined as when decisions are made that are not compliant with the regulatory framework, the primary governance mechanism for public schools in…

  20. POLLUTION PREVENTION AND ENHANCEMENT OF BIODEGRADABILITY VIA ISOMER ELIMINATION IN CONSUMER PRODUCTS

    EPA Science Inventory

    The purpose of this project is to develop novel methodologies for the analysis and detection of chiral environmental contaminants. Conventional analytical techniques do not discriminate between enantiomers. By using newly developed enantioselective methods, the environmental pers...

  1. One Controller at a Time (1-CAT): A MIMO design methodology

    NASA Technical Reports Server (NTRS)

    Mitchell, J. R.; Lucas, J. C.

    1987-01-01

    The One Controller at a Time (1-CAT) methodology for designing digital controllers for Large Space Structures (LSS's) is introduced and illustrated. The flexible mode problem is first discussed. Next, desirable features of a LSS control system design methodology are delineated. The 1-CAT approach is presented, along with an analytical technique for carrying out the 1-CAT process. Next, 1-CAT is used to design digital controllers for the proposed Space Based Laser (SBL). Finally, the SBL design is evaluated for dynamical performance, noise rejection, and robustness.

  2. Green aspects, developments and perspectives of liquid phase microextraction techniques.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2014-02-01

    The determination of analytes at trace levels in complex samples (e.g. biological samples or contaminated water or soils) is often required for environmental assessment and monitoring, as well as for scientific research in the field of environmental pollution. Only a limited number of analytical techniques are sensitive enough for the direct determination of trace components in such samples, so in many cases a preliminary step of analyte isolation/enrichment prior to analysis is required. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), are discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention is given to pro-ecological aspects; therefore the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also presented are new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the sample preparation stage prior to chromatographic determination. © 2013 Published by Elsevier B.V.

  3. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    PubMed Central

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest from the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey attempts to cover the state of the art from 2012 up to 2017. PMID:29850370

  4. LC-MS based analysis of endogenous steroid hormones in human hair.

    PubMed

    Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias

    2016-09-01

    The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Modern analytical methods for the detection of food fraud and adulteration by food category.

    PubMed

    Hong, Eunyoung; Lee, Sang Yoo; Jeong, Jae Yun; Park, Jung Min; Kim, Byung Hee; Kwon, Kisung; Chun, Hyang Sook

    2017-09-01

    This review provides current information on the analytical methods used to identify food adulteration in the six most adulterated food categories: animal origin and seafood, oils and fats, beverages, spices and sweet foods (e.g. honey), grain-based food, and others (organic food and dietary supplements). The analytical techniques (both conventional and emerging) used to identify adulteration in these six food categories involve sensory, physicochemical, DNA-based, chromatographic and spectroscopic methods, and have been combined with chemometrics, making these techniques more convenient and effective for the analysis of a broad variety of food products. Despite recent advances, the need remains for suitably sensitive and widely applicable methodologies that encompass all the various aspects of food adulteration. © 2017 Society of Chemical Industry.

  6. [Composition of chicken and quail eggs].

    PubMed

    Closa, S J; Marchesich, C; Cabrera, M; Morales, J C

    1999-06-01

    Reliable food composition data on lipids are needed to evaluate intakes as a risk factor in the development of heart disease. The proximate composition, cholesterol, and fatty acid content of chicken and quail eggs, as usually consumed or traded, were analysed. Proximate composition was determined using specific AOAC (1984) techniques; lipids were extracted by a modified Folch technique, and cholesterol and fatty acids were determined by gas chromatography. The results corroborate the stability of egg composition. The cholesterol content of quail eggs is similar to that of chicken eggs, but is almost half the value registered in Handbook 8. The differences may be attributed to the analytical methodology used to obtain them. This study provides data obtained with up-to-date analytical techniques, along with accessory information useful for food composition tables.

  7. Macroeconomic Activity Module - NEMS Documentation

    EIA Publications

    2016-01-01

    Documents the objectives, analytical approach, and development of the National Energy Modeling System (NEMS) Macroeconomic Activity Module (MAM) used to develop the Annual Energy Outlook for 2016 (AEO2016). The report catalogues and describes the module assumptions, computations, methodology, parameter estimation techniques, and mainframe source code.

  8. Quantitative Determination of Noa (Naturally Occurring Asbestos) in Rocks : Comparison Between Pcom and SEM Analysis

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Amodeo, Francesco; Giorgis, Ilaria; Vitaliti, Martina

    2017-04-01

    The quantification of NOA (Naturally Occurring Asbestos) in a rock or soil matrix is complex and subject to numerous errors. The purpose of this study is to compare two fundamental methodologies used for the analysis: the first uses Phase Contrast Optical Microscopy (PCOM), while the second uses Scanning Electron Microscopy (SEM). The two methods, although they provide the same result, namely the ratio of asbestos mass to total mass, have completely different characteristics, and both present pros and cons. Current legislation in Italy allows the use of SEM, XRD, FTIR, and PCOM (DM 6/9/94) for the quantification of asbestos in bulk materials and soils; the threshold beyond which the material is considered hazardous waste is an asbestos fibre concentration of 1000 mg/kg (DM 161/2012). The most widely used technology is SEM, which offers the best analytical sensitivity among these methods (120 mg/kg, DM 6/9/94). The fundamental differences among the analyses concern mainly: - the amount of sample portion analyzed - the representativeness of the sample - resolution - analytical precision - the uncertainty of the methodology - operator errors. Because of the quantification limits of XRD and FTIR (1%, DM 6/9/94), our Asbestos Laboratory (DIATI POLITO) has applied the PCOM methodology for more than twenty years and, in recent years, the SEM methodology for the quantification of asbestos content. The aim of our research is to compare the results obtained from PCOM analysis with those provided by SEM analysis on the basis of more than 100 natural samples, both from cores (tunnel-boring or exploratory drilling) and from tunnelling excavation. The results obtained show, in most cases, a good correlation between the two techniques. Of particular relevance is the fact that both techniques are reliable at very low asbestos contents, even below the analytical sensitivity.
This work highlights the comparison between the two techniques, emphasizing the strengths and weaknesses of the two procedures, and suggests that an integrated approach, together with the skills and experience of the operator, may be the best way forward to obtain a constructive improvement of analysis techniques.
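
    The correlation between paired PCOM and SEM results, and their agreement against the 1000 mg/kg limit, can be checked with a few lines of code; the paired values below are hypothetical, not the laboratory's data.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical paired asbestos results (mg/kg) on the same samples
pcom = [150, 420, 980, 1500, 2600]
sem = [130, 460, 900, 1620, 2500]
r = pearson_r(pcom, sem)

THRESHOLD = 1000  # mg/kg hazardous-waste limit (DM 161/2012)
# How often the two techniques agree on the hazardous/non-hazardous call
agree = sum((a > THRESHOLD) == (b > THRESHOLD) for a, b in zip(pcom, sem))  # 5 of 5
```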

  9. Challenges and perspectives in quantitative NMR.

    PubMed

    Giraudeau, Patrick

    2017-01-01

    This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.
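
    The basic quantitation underlying such methods, i.e. ratioing peak integrals against an internal standard of known concentration, can be sketched as follows; the integrals and proton counts are invented for the example.

```python
def qnmr_conc(i_analyte, n_analyte, i_std, n_std, c_std):
    """Analyte concentration from the integral ratio to an internal standard:
    C_a = C_std * (I_a / I_std) * (N_std / N_a), where N is the number of
    nuclei contributing to each integrated signal."""
    return c_std * (i_analyte / i_std) * (n_std / n_analyte)

# Hypothetical spectrum: analyte CH3 singlet (3 protons) integrating to 2.4
# against a 5.0 mM reference whose 9-proton signal is normalized to 1.0
conc = qnmr_conc(i_analyte=2.4, n_analyte=3, i_std=1.0, n_std=9, c_std=5.0)  # 36.0 mM
```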

  10. Qualitative carbonyl profile in coffee beans through GDME-HPLC-DAD-MS/MS for coffee preliminary characterization.

    PubMed

    Cordeiro, Liliana; Valente, Inês M; Santos, João Rodrigo; Rodrigues, José A

    2018-05-01

    In this work, an analytical methodology for the characterization of volatile carbonyl compounds in green and roasted coffee beans was developed. The methodology relies on a recent and simple sample preparation technique, gas diffusion microextraction, for the extraction of the samples' volatiles, followed by HPLC-DAD-MS/MS analysis. The experimental conditions, in terms of extraction temperature and extraction time, were studied. A carbonyl compound profile was obtained for both arabica and robusta coffee species (green and roasted samples). Twenty-seven carbonyl compounds were identified and further discussed, in light of the reported literature, in relation to different coffee characteristics: coffee ageing, organoleptic impact, presence of defective beans, authenticity, implications for human health, post-harvest coffee processing, and roasting. The applied methodology proved to be a powerful analytical tool for coffee characterization, as it measures marker compounds of different coffee characteristics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Commercial Demand Module - NEMS Documentation

    EIA Publications

    2017-01-01

    Documents the objectives, analytical approach and development of the National Energy Modeling System (NEMS) Commercial Sector Demand Module. The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  12. METHODOLOGY TO EVALUATE THE POTENTIAL FOR GROUND WATER CONTAMINATION FROM GEOTHERMAL FLUID RELEASES

    EPA Science Inventory

    This report provides analytical methods and graphical techniques to predict potential ground water contamination from geothermal energy development. Overflows and leaks from ponds, pipe leaks, well blowouts, leaks from well casing, and migration from injection zones can be handle...

  13. Analytical simulation and PROFAT II: a new methodology and a computer automated tool for fault tree analysis in chemical process industries.

    PubMed

    Khan, F I; Abbasi, S A

    2000-07-10

    Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has traditionally been a powerful technique for identifying hazards in nuclear installations and the power industry. As the systematic articulation of the fault tree involves assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in the chemical process industries. Based on this methodology we have developed a computer-automated tool. The details are presented in this paper.
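
    Under the usual independence assumption, the probability propagation through AND/OR gates that underlies fault tree evaluation reduces to a few lines; the miniature tree below is hypothetical and is not taken from the paper.

```python
def p_and(probs):
    """AND gate: all input events must occur (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def p_or(probs):
    """OR gate: at least one input event occurs (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical tree: top event = OR(AND(pump fails, relief valve stuck),
#                                   level sensor fault)
p_top = p_or([p_and([0.01, 0.05]), 0.002])  # ~0.0025
```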

  14. Selected analytical challenges in the determination of pharmaceuticals in drinking/marine waters and soil/sediment samples.

    PubMed

    Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr

    2016-03-20

    Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments, and pharmaceuticals have now become recognized as so-called 'emerging' contaminants. To date, many papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of the new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, numerous methodological challenges must still be overcome in order to fully understand the behavior of these chemicals in the environment. The aim of this paper, therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) that exist in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques, to overcome some of the problems that exist in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Characteristics of Effective Leadership Networks

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Azah, Vera Ndifor

    2016-01-01

    Purpose: The purpose of this paper is to inquire about the characteristics of effective school leadership networks and the contribution of such networks to the development of individual leaders' professional capacities. Design/methodology/approach: The study used path-analytic techniques with survey data provided by 450 school and district leaders…

  16. The utility of arginine-citrulline stable isotope tracer infusion technique in the assessment of nitric oxide production in MELAS syndrome

    USDA-ARS?s Scientific Manuscript database

    The correspondence titled “Analytical challenges in the assessment of NO synthesis from L-arginine in the MELAS syndrome” suggested challenges that can limit the utility of stable isotope infusion methodology in assessing NO production....

  17. The Wide-Field Imaging Interferometry Testbed: Enabling Techniques for High Angular Resolution Astronomy

    NASA Technical Reports Server (NTRS)

    Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.; hide

    2007-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field-of-view imaging interferometry, using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have already provided simple demonstrations of the methodology, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.

  18. Making Mass Spectrometry See the Light: The Promises and Challenges of Cryogenic Infrared Ion Spectroscopy as a Bioanalytical Technique

    PubMed Central

    Cismesia, Adam P.; Bailey, Laura S.; Bell, Matthew R.; Tesler, Larry F.; Polfer, Nicolas C.

    2016-01-01

    The detailed chemical information contained in the vibrational spectrum of a cryogenically cooled analyte would, in principle, make infrared (IR) ion spectroscopy a gold standard technique for molecular identification in mass spectrometry. Despite this immense potential, there are considerable challenges in both instrumentation and methodology to overcome before the technique is analytically useful. Here, we discuss the promise of IR ion spectroscopy for small molecule analysis in the context of metabolite identification. Experimental strategies to address sensitivity constraints, poor overall duty cycle, and speed of the experiment are intimately tied to the development of a mass-selective cryogenic trap. Therefore, the most likely avenues for success, in the authors' opinion, are presented here, alongside alternative approaches and some thoughts on data interpretation. PMID:26975370

  19. Combined use of optical and electron microscopic techniques for the measurement of hygroscopic property, chemical composition, and morphology of individual aerosol particles.

    PubMed

    Ahn, Kang-Ho; Kim, Sun-Man; Jung, Hae-Jin; Lee, Mi-Jung; Eom, Hyo-Jin; Maskey, Shila; Ro, Chul-Un

    2010-10-01

    In this work, an analytical method for the characterization of the hygroscopic property, chemical composition, and morphology of individual aerosol particles is introduced. The method, which is based on the combined use of optical and electron microscopic techniques, is simple and easy to apply. An optical microscopic technique was used to perform the visual observation of the phase transformation and hygroscopic growth of aerosol particles on a single particle level. A quantitative energy-dispersive electron probe X-ray microanalysis, named low-Z particle EPMA, was used to perform a quantitative chemical speciation of the same individual particles after the measurement of the hygroscopic property. To validate the analytical methodology, the hygroscopic properties of artificially generated NaCl, KCl, (NH4)2SO4, and Na2SO4 aerosol particles of micrometer size were investigated. The practical applicability of the analytical method for studying the hygroscopic property, chemical composition, and morphology of ambient aerosol particles is demonstrated.

  20. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, located in Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping considers multiple causative factors: sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology, and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high, and very high levels of vulnerability to coastal risk. The most vulnerable areas are situated in the east, at the Monika and Sablette beaches. This approach relies on the efficiency of GIS tools combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies for minimizing coastal risks.
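
    A crisp (non-fuzzy) simplification of the AHP weighting step can be sketched with the geometric-mean method; the pairwise judgements below are invented and only loosely inspired by the paper's criteria.

```python
import math

def ahp_weights(pairwise):
    """Criteria weights from an AHP pairwise comparison matrix,
    using the geometric-mean (row product) approximation."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical 3-criterion comparison (sea level rise vs. wave height
# vs. elevation); Saaty-scale judgements invented for the example
pairwise = [
    [1.0,     3.0,     5.0],
    [1.0 / 3, 1.0,     3.0],
    [1.0 / 5, 1.0 / 3, 1.0],
]
weights = ahp_weights(pairwise)  # descending, summing to 1.0
```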

  1. Diosgenin: Recent Highlights on Pharmacology and Analytical Methodology.

    PubMed

    Jesus, Mafalda; Martins, Ana P J; Gallardo, Eugenia; Silvestre, Samuel

    2016-01-01

    Diosgenin, a steroidal sapogenin, occurs abundantly in plants such as Dioscorea alata, Smilax china, and Trigonella foenum-graecum. This bioactive phytochemical is not only used as an important starting material for the preparation of several steroidal drugs in the pharmaceutical industry, but has also shown high potential and interest for the treatment of various types of disorders such as cancer, hypercholesterolemia, inflammation, and several types of infections. Due to its pharmacological and industrial importance, several extraction and analytical procedures have been developed and applied over the years to isolate, detect, and quantify diosgenin, not only in its natural sources and pharmaceutical compositions, but also in animal matrices for pharmacodynamic, pharmacokinetic, and toxicological studies. Among these, HPLC coupled to different detectors is the most commonly described analytical procedure for this compound, although alternative methods have also been published. Thus, the present review aims to provide collective information on the most recent pharmacological data on diosgenin and on the most relevant analytical techniques used to isolate, detect, and quantify this compound.

  3. Calculation of Shuttle Base Heating Environments and Comparison with Flight Data

    NASA Technical Reports Server (NTRS)

    Greenwood, T. F.; Lee, Y. C.; Bender, R. L.; Carter, R. E.

    1983-01-01

    The techniques, analytical tools, and experimental programs used initially to generate and later to improve and validate the Shuttle base heating design environments are discussed. In general, the measured base heating environments for STS-1 through STS-5 were in good agreement with the preflight predictions. However, some changes were made in the methodology after reviewing the flight data. The flight data is described, preflight predictions are compared with the flight data, and improvements in the prediction methodology based on the data are discussed.

  4. Application of an integrated multi-criteria decision making AHP-TOPSIS methodology for ETL software selection.

    PubMed

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    Currently, a wide range of ETL (Extract, Transform and Load) software packages is available, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming, and loading data into a data warehouse, which makes evaluating ETL software very difficult. However, choosing the right ETL software is critical to the success or failure of any Business Intelligence project. As there are many factors impacting the selection of ETL software, the process can be considered a complex multi-criteria decision making (MCDM) problem. In this study, a decision-making methodology employing two well-known MCDM techniques, namely the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. In this respect, AHP is used to analyze the structure of the ETL software selection problem and obtain the weights of the selected criteria. TOPSIS is then used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented.
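
    The TOPSIS ranking step described above can be sketched as follows; the candidate scores, weights, and criteria are invented for illustration and do not come from the study.

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS: score = closeness to the ideal solution.

    matrix  -- rows are alternatives, columns are criteria (raw scores)
    weights -- criteria weights summing to 1 (e.g. obtained from AHP)
    benefit -- per-criterion flag: True if larger values are better
    """
    m = len(weights)
    # vector-normalize each column, then apply the criteria weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(m)]
    v = [[weights[j] * row[j] / norms[j] for j in range(m)] for row in matrix]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    worst = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)  # distance to the ideal solution
        d_neg = math.dist(row, worst)  # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Hypothetical ETL candidates scored on cost (lower is better) and
# feature coverage (higher is better); weights are illustrative only
scores = topsis([[300, 7], [500, 9], [200, 5]], [0.4, 0.6], [False, True])
best = max(range(len(scores)), key=scores.__getitem__)
```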

  5. Reliability analysis of composite structures

    NASA Technical Reports Server (NTRS)

    Kan, Han-Pin

    1992-01-01

    A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
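
    One common way to carry out the numerical integration step described above is stress-strength interference: under normality assumptions, the failure probability is the integral of P(strength < x) weighted by the load density. The sketch below uses the trapezoidal rule with parameters invented for illustration.

```python
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def normal_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def failure_probability(mu_s, sd_s, mu_l, sd_l, n=20000):
    """P(strength < load): integrate P(S < x) over the load density by the
    trapezoidal rule (strength and load assumed independent normals)."""
    lo, hi = mu_l - 8.0 * sd_l, mu_l + 8.0 * sd_l
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * normal_cdf(x, mu_s, sd_s) * normal_pdf(x, mu_l, sd_l)
    return total * h

# Hypothetical laminate strength 100 +/- 10 (arbitrary stress units)
# under an applied load of 60 +/- 8
pf = failure_probability(mu_s=100.0, sd_s=10.0, mu_l=60.0, sd_l=8.0)
reliability = 1.0 - pf
```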

  6. Analytical methodologies for aluminium speciation in environmental and biological samples--a review.

    PubMed

    Bi, S P; Yang, X D; Zhang, F P; Wang, X L; Zou, G W

    2001-08-01

    It is recognized that aluminium (Al) is a potential environmental hazard. Acidic deposition has been linked to increased Al concentrations in natural waters. Elevated levels of Al might have serious consequences for biological communities. Of particular interest is the speciation of Al in aquatic environments, because Al toxicity depends on its forms and concentrations. In this paper, advances in analytical methodologies for Al speciation in environmental and biological samples during the past five years are reviewed. Specific problems of Al speciation are discussed and some important methods are highlighted, in sections devoted to hybrid techniques (HPLC or FPLC coupled with ET-AAS, ICP-AES, or ICP-MS), flow-injection analysis (FIA), nuclear magnetic resonance (27Al NMR), electrochemical analysis, and computer simulation. More than 130 references are cited.

  7. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, through quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to identify the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, the technique is challenging to implement in a quality control laboratory: it is laborious, time-consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects on method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, following the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
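The HACCP-style comparison described above reduces to coding each procedural step of an assay as a critical control point (CCP) or not, then ranking the methods by their CCP counts. A minimal sketch of that tally; the step lists and CCP flags below are invented for illustration, not the paper's actual data:

```python
# Hypothetical sketch of a HACCP-style method comparison: each assay is a
# list of (step, is_critical_control_point) pairs, and methods are ranked
# by CCP count. Step lists and flags are illustrative placeholders.
def count_ccps(steps):
    """Count the steps flagged as critical control points."""
    return sum(1 for _, is_ccp in steps if is_ccp)

methods = {
    "radioactive dot-blot": [("DNA extraction", True), ("probe labelling", True),
                             ("hybridization", True), ("film exposure", True)],
    "qPCR":                 [("DNA extraction", True), ("amplification", True),
                             ("standard curve", True)],
    "threshold analysis":   [("DNA extraction", True), ("reaction", True)],
}

ranking = sorted(methods, key=lambda m: count_ccps(methods[m]), reverse=True)
for m in ranking:
    print(f"{m}: {count_ccps(methods[m])} CCPs")
```

With these placeholder flags the ranking reproduces the ordering reported in the abstract: dot-blot with the most CCPs, then qPCR, then threshold analysis.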

  8. Dietary exposure to trace elements and radionuclides: the methodology of the Italian Total Diet Study 2012-2014.

    PubMed

    D'Amato, Marilena; Turrini, Aida; Aureli, Federica; Moracci, Gabriele; Raggi, Andrea; Chiaravalle, Eugenio; Mangiacotti, Michele; Cenci, Telemaco; Orletti, Roberta; Candela, Loredana; di Sandro, Alessandra; Cubadda, Francesco

    2013-01-01

This article presents the methodology of the Italian Total Diet Study 2012-2014 aimed at assessing the dietary exposure of the general Italian population to selected nonessential trace elements (Al, inorganic As, Cd, Pb, methyl-Hg, inorganic Hg, U) and radionuclides (40K, 134Cs, 137Cs, 90Sr). The establishment of the TDS food list, the design of the sampling plan, and details about the collection of food samples, their standardized culinary treatment, pooling into analytical samples, and subsequent sample treatment are described. Analytical techniques and quality assurance are discussed, with emphasis on the need for speciation data and for minimizing the percentage of left-censored data so as to reduce uncertainties in exposure assessment. Finally, the methodology for estimating the exposure of the general population and of population subgroups according to age (children, teenagers, adults, and the elderly) and gender, both at the national level and for each of the four main geographical areas of Italy, is presented.

  9. e-Research and Learning Theory: What Do Sequence and Process Mining Methods Contribute?

    ERIC Educational Resources Information Center

    Reimann, Peter; Markauskaite, Lina; Bannert, Maria

    2014-01-01

    This paper discusses the fundamental question of how data-intensive e-research methods could contribute to the development of learning theories. Using methodological developments in research on self-regulated learning as an example, it argues that current applications of data-driven analytical techniques, such as educational data mining and its…

  10. Performance modeling of automated manufacturing systems

    NASA Astrophysics Data System (ADS)

    Viswanadham, N.; Narahari, Y.

A unified and systematic treatment of modeling methodologies and analysis techniques for the performance evaluation of automated manufacturing systems is presented. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed, and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.
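As a concrete taste of the queueing paradigm named above, the standard closed-form results for an M/M/1 queue give the basic performance measures of a single machine fed by a Poisson job stream. The arrival and service rates below are illustrative, not from the book:

```python
# Minimal sketch of one analytical paradigm above: an M/M/1 queue modeling
# a single machine with Poisson arrivals and exponential service times,
# evaluated with the standard closed-form results. Rates are illustrative.
def mm1_metrics(arrival_rate, service_rate):
    """Return (utilization, mean jobs in system, mean time in system)."""
    rho = arrival_rate / service_rate              # utilization
    if rho >= 1.0:
        raise ValueError("unstable queue: arrival rate >= service rate")
    n_mean = rho / (1.0 - rho)                     # mean number in system
    t_mean = 1.0 / (service_rate - arrival_rate)   # mean sojourn time
    return rho, n_mean, t_mean                     # n_mean == arrival_rate * t_mean (Little's law)

rho, n_mean, t_mean = mm1_metrics(arrival_rate=4.0, service_rate=5.0)  # jobs/hour
print(f"utilization={rho:.2f}, mean jobs={n_mean:.2f}, mean time={t_mean:.2f} h")
```

The consistency check in the comment (Little's law, L = λW) is the kind of cross-validation these analytical models permit without simulation.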

  11. Oral Reading Fluency Growth: A Sample of Methodology and Findings. Research Brief 6

    ERIC Educational Resources Information Center

    Tindal, Gerald; Nese, Joseph F. T.

    2013-01-01

    For the past 20 years, the growth of students' oral reading fluency has been investigated by a number of researchers using curriculum-based measurement. These researchers have used varied methods (student samples, measurement procedures, and analytical techniques) and yet have converged on a relatively consistent finding: General education…

  12. Single-particle mineralogy of Chinese soil particles by the combined use of low-Z particle electron probe X-ray microanalysis and attenuated total reflectance-FT-IR imaging techniques.

    PubMed

    Malek, Md Abdul; Kim, Bowha; Jung, Hae-Jin; Song, Young-Chul; Ro, Chul-Un

    2011-10-15

Our previous work on the speciation of individual mineral particles of micrometer size by the combined use of attenuated total reflectance FT-IR (ATR-FT-IR) imaging and a quantitative energy-dispersive electron probe X-ray microanalysis technique (EPMA), low-Z particle EPMA, demonstrated that the combined use of these two techniques is a powerful approach to the single-particle mineralogy of externally heterogeneous minerals. In this work, this analytical methodology was applied to characterize six soil samples collected in arid areas of China, in order to identify the mineral types present in the samples. The six soil samples were collected from two types of soil, i.e., loess and desert soils, and overall 665 particles were analyzed on a single-particle basis. The six soil samples have different mineralogical characteristics, which were clearly differentiated in this work. Because this analytical methodology provides complementary information on the same individual particles (ATR-FT-IR imaging on mineral types; low-Z particle EPMA on morphology and elemental concentrations), more detailed information can be obtained than when either low-Z particle EPMA or ATR-FT-IR imaging is used alone, giving the approach great potential for the characterization of Asian dust and mineral dust particles. © 2011 American Chemical Society

  13. New insights into liquid chromatography for more eco-friendly analysis of pharmaceuticals.

    PubMed

    Shaaban, Heba

    2016-10-01

Greening the analytical methods used for the analysis of pharmaceuticals has received great interest, with the aim of eliminating or minimizing the amount of organic solvents consumed daily worldwide without loss of chromatographic performance. Traditional analytical LC techniques employed in pharmaceutical analysis consume tremendous amounts of hazardous solvents and consequently generate large amounts of waste. The monetary and ecological impact of using large amounts of solvents and of waste disposal has motivated the analytical community to search for alternatives that replace polluting analytical methodologies with clean ones. In this context, implementing the principles of green analytical chemistry (GAC) in analytical laboratories is highly desired. This review gives a comprehensive overview of different green LC pathways for implementing GAC principles in analytical laboratories and focuses on evaluating the greenness of LC analytical procedures. This review presents green LC approaches for eco-friendly analysis of pharmaceuticals in industrial, biological, and environmental matrices. Graphical Abstract: Green pathways of liquid chromatography for more eco-friendly analysis of pharmaceuticals.

  14. Newcomer adjustment during organizational socialization: a meta-analytic review of antecedents, outcomes, and methods.

    PubMed

    Bauer, Talya N; Bodner, Todd; Erdogan, Berrin; Truxillo, Donald M; Tucker, Jennifer S

    2007-05-01

    The authors tested a model of antecedents and outcomes of newcomer adjustment using 70 unique samples of newcomers with meta-analytic and path modeling techniques. Specifically, they proposed and tested a model in which adjustment (role clarity, self-efficacy, and social acceptance) mediated the effects of organizational socialization tactics and information seeking on socialization outcomes (job satisfaction, organizational commitment, job performance, intentions to remain, and turnover). The results generally supported this model. In addition, the authors examined the moderating effects of methodology on these relationships by coding for 3 methodological issues: data collection type (longitudinal vs. cross-sectional), sample characteristics (school-to-work vs. work-to-work transitions), and measurement of the antecedents (facet vs. composite measurement). Discussion focuses on the implications of the findings and suggestions for future research. 2007 APA, all rights reserved
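The pooling step in a meta-analysis like this one is commonly done on Fisher-z-transformed correlations with inverse-variance weights before path modeling. A minimal fixed-effect sketch; the study correlations and sample sizes below are invented for illustration:

```python
import math

# Illustrative sketch of meta-analytic pooling: sample correlations from
# several studies are combined with a fixed-effect model via Fisher's z
# transform. The (r, n) study values below are made up.
def pool_correlations(studies):
    """studies: list of (r, n) pairs. Returns the inverse-variance-weighted
    pooled correlation, back-transformed from Fisher's z scale."""
    num = den = 0.0
    for r, n in studies:
        z = 0.5 * math.log((1 + r) / (1 - r))  # Fisher z transform of r
        w = n - 3                              # inverse of Var(z) = 1/(n - 3)
        num += w * z
        den += w
    z_bar = num / den                          # weighted mean on the z scale
    return math.tanh(z_bar)                    # back-transform to r scale

r_pooled = pool_correlations([(0.30, 120), (0.25, 80), (0.40, 200)])
print(f"pooled r = {r_pooled:.3f}")
```

Larger studies dominate the pooled estimate through the n - 3 weights, which is why the result sits closest to the correlation from the n = 200 study.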

  15. A simple {sup 197}Hg RNAA procedure for the determination of mercury in urine, blood, and tissue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blotcky, A.J.; Rack, E.P.; Meade, A.G.

    1995-12-31

Mercury has been implicated as a causal agent in such central nervous system diseases as Alzheimer's and Parkinson's. Consequently, there has been increased interest in the determination of ultra-trace-level mercury in biological matrices, especially in tissue. While such nonnuclear techniques as cold vapor atomic absorption spectrometry and cold vapor atomic fluorescence spectrometry have been employed routinely for mercury determinations in urine and blood, there is a paucity of nonnuclear techniques for the determination of mercury in the low parts-per-billion range in biological tissue. As pointed out by Fardy and Warner, instrumental and radiochemical neutron activation analysis (INAA and RNAA) require no blank determinations, in contrast to nonnuclear analytical techniques employing digestion and/or chemical operations. Therefore, INAA and RNAA become the obvious choices for determination of ultra-trace levels of mercury in tissue. Most separation methods reported in the literature require different and separate methodologies for mercury determinations in urine, blood, or tissue. The purposes of this study are to develop a single methodology for the determination of low levels of mercury in all biological matrices by RNAA and to optimize the parameters necessary for an efficacious trace-level determination. Previously, few studies have taken into account the effects of Szilard-Chalmers reactions of the radioactivatable analyte within a biological matrix. It would also appear that little attention has been given to the optimum postirradiation carrier concentration of the analyte species. This study discusses these various considerations.

  16. Application of thin layer activation technique for surface wear studies in Zr based materials using charged particle induced nuclear reactions

    NASA Astrophysics Data System (ADS)

    Chowdhury, D. P.; Pal, Sujit; Parthasarathy, R.; Mathur, P. K.; Kohli, A. K.; Limaye, P. K.

    1998-09-01

The thin layer activation (TLA) technique has been developed for Zr-based alloy materials, e.g., zircaloy II, using 40 MeV α-particles from the Variable Energy Cyclotron Centre at Calcutta. A brief description of the methodology of the TLA technique for determining surface wear is presented. The sensitivity of the measurement of surface wear in zircaloy is found to be 0.22±0.05 μm. Surface wear was determined by the TLA technique in zircaloy material used in pressurised heavy water reactors, and the values were compared with those obtained by a conventional technique for analytical validation of the TLA technique.
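The wear measurement in TLA rests on comparing the decay-corrected residual activity of the activated surface layer against its initial activity. A hedged sketch of that calculation, assuming an idealized uniform activity-versus-depth profile (real calibrations use the measured profile, and the tracer and numbers below are illustrative, not from this study):

```python
import math

# Hedged sketch of the wear-depth calculation behind TLA: residual activity
# is first corrected for radioactive decay, and the remaining fraction is
# mapped to a worn-off depth. A uniform activity-vs-depth profile over the
# activated layer is assumed for simplicity; values are illustrative.
def wear_depth(A0, A_t, t_days, half_life_days, layer_um):
    """Estimate the worn-off depth (micrometres) from activity loss."""
    decay = math.exp(-math.log(2.0) * t_days / half_life_days)
    A_corrected = A_t / decay              # activity an unworn layer would show
    remaining_fraction = A_corrected / A0  # fraction of activated layer left
    return (1.0 - remaining_fraction) * layer_um

h = wear_depth(A0=1000.0, A_t=880.0, t_days=30.0,
               half_life_days=271.8,       # e.g. 57Co, a common TLA tracer
               layer_um=100.0)
print(f"estimated wear: {h:.2f} um")
```

Without the decay correction the 12% raw activity drop would be misread entirely as wear; correcting for one month of 57Co decay attributes most of it to decay and only about 5 μm to material loss.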

  17. Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches

    NASA Technical Reports Server (NTRS)

    Farassat, Fereidoun; Casper, Jay H.

    2006-01-01

In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need to develop further for this method to become useful. Nonetheless, the authors propose that the method based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.

  18. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

Microextraction techniques are a valuable tool in the analytical laboratory, since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or none at all. Since minimal amounts of reagents are involved and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples of the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support

  19. Laboratory Training Manual on the Use of Nuclear Techniques in Pesticide Research. Technical Reports Series No. 225.

    ERIC Educational Resources Information Center

    International Atomic Energy Agency, Vienna (Austria).

    Radiolabelled pesticides are used: in studies involving improved formulations of pesticides, to assist in developing standard residue analytical methodology, and in obtaining metabolism data to support registration of pesticides. This manual is designed to give the scientist involved in pesticide research the basic terms and principles for…

  20. Exploring the Micro-Social Geography of Children's Interactions in Preschool: A Long-Term Observational Study and Analysis Using Geographic Information Technologies

    ERIC Educational Resources Information Center

    Torrens, Paul M.; Griffin, William A.

    2013-01-01

    The authors describe an observational and analytic methodology for recording and interpreting dynamic microprocesses that occur during social interaction, making use of space--time data collection techniques, spatial-statistical analysis, and visualization. The scheme has three investigative foci: Structure, Activity Composition, and Clustering.…

  1. Model documentation: Electricity Market Module, Electricity Fuel Dispatch Submodule

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This report documents the objectives, analytical approach and development of the National Energy Modeling System Electricity Fuel Dispatch Submodule (EFD), a submodule of the Electricity Market Module (EMM). The report catalogues and describes the model assumptions, computational methodology, parameter estimation techniques, model source code, and forecast results generated through the synthesis and scenario development based on these components.

  2. A micromechanics-based strength prediction methodology for notched metal matrix composites

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1992-01-01

An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stress in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and postfatigue residual strengths when fiber matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus explicitly including matrix nonlinearity, fiber matrix debonding, and matrix cracking.
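The combined axial/shear fiber failure check described for the B/Al laminates can be illustrated with a generic quadratic-interaction criterion. This is a sketch of the general form such checks take, not the paper's exact criterion, and the stress and strength values below are placeholders:

```python
# Illustrative sketch (not the paper's exact criterion): a quadratic
# interaction of fiber axial and shear stresses, the general form used
# for combined-stress fiber failure checks. Strengths are placeholders.
def fiber_fails(sigma_axial, tau, axial_strength, shear_strength):
    """True when the combined-stress failure index reaches 1."""
    index = (sigma_axial / axial_strength) ** 2 + (tau / shear_strength) ** 2
    return index >= 1.0

# Below the failure envelope (all stresses in MPa):
print(fiber_fails(sigma_axial=3000.0, tau=100.0,
                  axial_strength=3500.0, shear_strength=300.0))
```

The key point the abstract makes is that the shear term matters for B/Al: a fiber well below its axial strength can still fail once matrix plasticity shifts load into shear.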

  3. A micromechanics-based strength prediction methodology for notched metal-matrix composites

    NASA Technical Reports Server (NTRS)

    Bigelow, C. A.

    1993-01-01

An analytical micromechanics-based strength prediction methodology was developed to predict failure of notched metal matrix composites. The stress-strain behavior and notched strength of two metal matrix composites, boron/aluminum (B/Al) and silicon-carbide/titanium (SCS-6/Ti-15-3), were predicted. The prediction methodology combines analytical techniques ranging from a three-dimensional finite element analysis of a notched specimen to a micromechanical model of a single fiber. In the B/Al laminates, a fiber failure criterion based on the axial and shear stress in the fiber accurately predicted laminate failure for a variety of layups and notch-length to specimen-width ratios with both circular holes and sharp notches when matrix plasticity was included in the analysis. For the SCS-6/Ti-15-3 laminates, a fiber failure criterion based on the axial stress in the fiber correlated well with experimental results for static and postfatigue residual strengths when fiber matrix debonding and matrix cracking were included in the analysis. The micromechanics-based strength prediction methodology offers a direct approach to strength prediction by modeling behavior and damage on a constituent level, thus explicitly including matrix nonlinearity, fiber matrix debonding, and matrix cracking.

  4. Trends in health sciences library and information science research: an analysis of research publications in the Bulletin of the Medical Library Association and Journal of the Medical Library Association from 1991 to 2007*

    PubMed Central

    Gore, Sally A.; Nordberg, Judith M.; Palmer, Lisa A.

    2009-01-01

    Objective: This study analyzed trends in research activity as represented in the published research in the leading peer-reviewed professional journal for health sciences librarianship. Methodology: Research articles were identified from the Bulletin of the Medical Library Association and Journal of the Medical Library Association (1991–2007). Using content analysis and bibliometric techniques, data were collected for each article on the (1) subject, (2) research method, (3) analytical technique used, (4) number of authors, (5) number of citations, (6) first author affiliation, and (7) funding source. The results were compared to a previous study, covering the period 1966 to 1990, to identify changes over time. Results: Of the 930 articles examined, 474 (51%) were identified as research articles. Survey (n = 174, 37.1%) was the most common methodology employed, quantitative descriptive statistics (n = 298, 63.5%) the most used analytical technique, and applied topics (n = 332, 70%) the most common type of subject studied. The majority of first authors were associated with an academic health sciences library (n = 264, 55.7%). Only 27.4% (n = 130) of studies identified a funding source. Conclusion: This study's findings demonstrate that progress is being made in health sciences librarianship research. There is, however, room for improvement in terms of research methodologies used, proportion of applied versus theoretical research, and elimination of barriers to conducting research for practicing librarians. PMID:19626146
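The tabulation behind figures such as "survey (n = 174, 37.1%)" is a straightforward count-and-percentage over coded article records. A minimal sketch with invented records, not the study's data:

```python
from collections import Counter

# Sketch of the tabulation step in a content analysis like the one above:
# each coded article record contributes to per-category counts and
# percentages. The records here are invented for illustration.
records = [
    {"method": "survey", "topic": "applied"},
    {"method": "survey", "topic": "applied"},
    {"method": "bibliometrics", "topic": "theoretical"},
    {"method": "experiment", "topic": "applied"},
]

def tabulate(records, field):
    """Return {category: (count, percent)} for one coded field."""
    counts = Counter(r[field] for r in records)
    n = len(records)
    return {k: (v, round(100.0 * v / n, 1)) for k, v in counts.items()}

print(tabulate(records, "method"))
print(tabulate(records, "topic"))
```

Running the same tabulation over each coded field (subject, method, analytical technique, and so on) yields exactly the kind of summary table this study reports.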

  5. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider nanoparticles as a new sort of analyte, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field-flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles by ICP-MS, but also by coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs of concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed.
This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.
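One data-reduction step in the single-particle ICP-MS approach mentioned above is converting the analyte mass detected in a single pulse into an equivalent spherical diameter. A hedged sketch assuming a pure, spherical particle; the pulse mass and density are illustrative:

```python
import math

# Hedged sketch of one data-reduction step in single-particle ICP-MS:
# the analyte mass detected in a single pulse is converted into an
# equivalent spherical diameter. A pure, spherical particle is assumed;
# the values below are illustrative.
def sp_diameter_nm(mass_fg, density_g_cm3, mass_fraction=1.0):
    """Equivalent spherical diameter (nm) from the analyte mass (fg)
    detected in one pulse; mass_fraction scales analyte to whole particle."""
    mass_g = (mass_fg / mass_fraction) * 1e-15        # total particle mass
    volume_cm3 = mass_g / density_g_cm3
    d_cm = (6.0 * volume_cm3 / math.pi) ** (1.0 / 3.0)  # invert V = (pi/6) d^3
    return d_cm * 1e7                                 # cm -> nm

# A ~60 nm gold particle (rho = 19.3 g/cm3) weighs about 2.2 fg:
print(f"{sp_diameter_nm(2.18, 19.3):.1f} nm")
```

The cube-root relationship is why number-based detection is so sensitive to small particles: an eightfold drop in pulse mass only halves the reported diameter.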

  6. Advances in analytical technologies for environmental protection and public safety.

    PubMed

    Sadik, O A; Wanekaya, A K; Andreescu, S

    2004-06-01

Due to the increased threat of chemical and biological agents being used by terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat chemical and biochemical toxins. In addition to the right mix of policies and the training of medical personnel to recognize symptoms of biochemical warfare agents, the major success in combating terrorism still lies in prevention, early detection, and an efficient and timely response using reliable analytical technologies and powerful therapies to minimize the effects in the event of an attack. The public and regulatory agencies expect reliable methodologies and devices for public security. Today's systems are too bulky or slow to meet the "detect-to-warn" needs of first responders such as soldiers and medical personnel. This paper presents the challenges in monitoring technologies for warfare agents and other toxins. It provides an overview of how advances in environmental analytical methodologies could be adapted to design reliable sensors for public safety and environmental surveillance. The paths to designing sensors that meet the needs of today's measurement challenges are analyzed using examples of novel sensors, autonomous cell-based toxicity monitoring, 'Lab-on-a-Chip' devices, and conventional environmental analytical techniques. Finally, in order to ensure that the public and legal authorities are provided with quality data to make informed decisions, guidelines are provided for assessing data quality and quality assurance using United States Environmental Protection Agency (US-EPA) methodologies.

  7. A Methodology for Determining Statistical Performance Compliance for Airborne Doppler Radar with Forward-Looking Turbulence Detection Capability

    NASA Technical Reports Server (NTRS)

    Bowles, Roland L.; Buck, Bill K.

    2009-01-01

The objective of the research developed and presented in this document was to statistically assess turbulence hazard detection performance employing airborne pulse Doppler radar systems. The FAA certification methodology for forward-looking airborne turbulence radars will require estimating the probabilities of missed and false hazard indications under operational conditions. Analytical approaches must be used due to the near impossibility of obtaining sufficient statistics experimentally. This report describes an end-to-end analytical technique for estimating these probabilities for Enhanced Turbulence (E-Turb) Radar systems under noise-limited conditions, for a variety of aircraft types, as defined in FAA TSO-C134. This technique provides one means, but not the only means, by which an applicant can demonstrate compliance with the FAA-directed ATDS Working Group performance requirements. Turbulence hazard algorithms were developed that derived predictive estimates of aircraft hazards from basic radar observables. These algorithms were designed to prevent false turbulence indications while accurately predicting areas of elevated turbulence risk to aircraft, passengers, and crew, and were successfully flight tested on a NASA B757-200 and a Delta Air Lines B737-800. Application of this methodology for calculating the probabilities of missed and false hazard indications, taking into account the effects of the various algorithms used, is demonstrated for representative transport aircraft and radar performance characteristics.
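The noise-limited missed/false indication analysis can be illustrated with the simplest possible model: a scalar hazard estimate corrupted by Gaussian noise and compared against an alert threshold. The hazard level, noise figure, and threshold below are invented for illustration, not TSO-C134 values:

```python
import math

# Illustrative sketch of noise-limited detection statistics: a scalar
# hazard estimate modeled as Gaussian noise (no hazard) or noise plus a
# mean hazard signal, compared against an alert threshold. All numbers
# are invented, not TSO-C134 values.
def q(x):
    """Gaussian tail probability P(Z > x) for standard normal Z."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def detection_probs(threshold, noise_sigma, hazard_mean):
    p_false = q(threshold / noise_sigma)                       # alert, no hazard present
    p_miss = 1.0 - q((threshold - hazard_mean) / noise_sigma)  # no alert, hazard present
    return p_false, p_miss

p_fa, p_miss = detection_probs(threshold=3.0, noise_sigma=1.0, hazard_mean=5.0)
print(f"P(false alert)={p_fa:.4f}, P(miss)={p_miss:.4f}")
```

Raising the threshold trades one error for the other, which is exactly the balance a certification analysis must quantify across operating conditions.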

  8. RLV Turbine Performance Optimization

    NASA Technical Reports Server (NTRS)

    Griffin, Lisa W.; Dorney, Daniel J.

    2001-01-01

A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.

  9. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference betweenmore » usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed more in the section on User-centered Evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. 
Researchers and practitioners in HCI and interested in visual analytics will find this information useful as well as a discussion on changes that need to be made to current HCI practices to make them more suitable to visual analytics. A history of analysis and analysis techniques and problems is provided as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. The understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics to conduct more formal user-centered evaluations. While these are time-consuming and expensive to carryout, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view work in user-centered evaluation as a less-than-exciting area to work, the opposite is true. First of all, the goal is user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term “users” in almost all situations there are a wide variety of users that all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing. 
There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and the researchers and developers.

  10. Design, Implementation, and Operational Methodologies for Sub-arcsecond Attitude Determination, Control, and Stabilization of the Super-pressure Balloon-Borne Imaging Telescope (SuperBIT)

    NASA Astrophysics Data System (ADS)

    Javier Romualdez, Luis

    Scientific balloon-borne instrumentation offers an attractive, competitive, and effective alternative to space-borne missions when considering the overall scope, cost, and development timescale required to design and launch scientific instruments. In particular, the balloon-borne environment provides a near-space regime that is suitable for a number of modern astronomical and cosmological experiments, where the atmospheric interference suffered by ground-based instrumentation is negligible at stratospheric altitudes. This work is centered around the analytical strategies and implementation considerations for the attitude determination and control of SuperBIT, a scientific balloon-borne payload capable of meeting the strict sub-arcsecond pointing and image stability requirements demanded by modern cosmological experiments. Broadly speaking, the designed stability specifications of SuperBIT, coupled with its observational efficiency, image quality, and accessibility, rival those of state-of-the-art astronomical observatories such as the Hubble Space Telescope. To this end, this work presents an end-to-end design methodology for precision pointing balloon-borne payloads such as SuperBIT within an analytical yet implementationally grounded context. Simulation models of SuperBIT are analytically derived to aid in pre-assembly trade-off and case studies that are pertinent to the dynamic balloon-borne environment. From these results, state estimation techniques and control methodologies are extensively developed, leveraging the analytical framework of simulation models and design studies. This pre-assembly design phase is physically validated during assembly, integration, and testing through implementation in real-time hardware and software, which bridges the gap between analytical results and practical application.
SuperBIT attitude determination and control is demonstrated throughout two engineering test flights that verify pointing and image stability requirements in flight, where the post-flight results close the overall design loop by suggesting practical improvements to pre-design methodologies. Overall, the analytical and practical results presented in this work, though centered around the SuperBIT project, provide generically useful and implementationally viable methodologies for high precision balloon-borne instrumentation, all of which are validated, justified, and improved both theoretically and practically. As such, the continuing development of SuperBIT, built from the work presented in this thesis, strives to further the potential for scientific balloon-borne astronomy in the near future.
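    The closed-loop pointing idea summarized above can be sketched, at a toy level, as a discrete PID controller acting on a rigid single-axis telescope model. Every number below (gains, inertia, initial error) is an illustrative assumption, not a SuperBIT parameter, and the real system's estimator and controller are far more elaborate:

```python
# Minimal discrete-time PID pointing loop on a 1-DOF rigid-body telescope
# axis. All gains, the inertia, and the initial error are hypothetical.

def simulate_pointing(kp=40.0, ki=10.0, kd=12.0, dt=0.01, steps=2000):
    inertia = 5.0              # kg*m^2, hypothetical single-axis value
    theta, omega = 0.01, 0.0   # start 0.01 rad off target, at rest
    integral = 0.0
    prev_err = -theta
    for _ in range(steps):
        err = -theta                       # target attitude is 0 rad
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        torque = kp * err + ki * integral + kd * deriv
        omega += (torque / inertia) * dt   # rigid-body dynamics, Euler step
        theta += omega * dt
    return abs(theta)

residual = simulate_pointing()
print(f"residual pointing error after 20 s: {residual:.2e} rad")
```

The integral term removes steady-state bias (e.g. from a constant disturbance torque) at the cost of a slow closed-loop pole, which is why the residual decays over tens of seconds rather than instantly.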

  11. Development of garlic bioactive compounds analytical methodology based on liquid phase microextraction using response surface design. Implications for dual analysis: Cooked and biological fluids samples.

    PubMed

    Ramirez, Daniela Andrea; Locatelli, Daniela Ana; Torres-Palazzolo, Carolina Andrea; Altamirano, Jorgelina Cecilia; Camargo, Alejandra Beatriz

    2017-01-15

    Organosulphur compounds (OSCs) present in garlic (Allium sativum L.) are responsible for several of its biological properties. Research on functional foods indicates the importance of quantifying these compounds in food matrices and biological fluids. For this purpose, this paper introduces a novel methodology based on dispersive liquid-liquid microextraction (DLLME) coupled to high performance liquid chromatography with ultraviolet detection (HPLC-UV) for the extraction and determination of organosulphur compounds in different matrices. The target analytes were allicin, (E)- and (Z)-ajoene, 2-vinyl-4H-1,2-dithiin (2-VD), diallyl sulphide (DAS) and diallyl disulphide (DADS). The microextraction technique was optimized using an experimental design, and the analytical performance was evaluated under optimum conditions. The desirability function presented an optimal value for 600 μL of chloroform as extraction solvent using acetonitrile as dispersant. The method proved to be reliable, precise and accurate. It was successfully applied to determine OSCs in cooked garlic samples as well as blood plasma and digestive fluids. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Computational Methodology for Absolute Calibration Curves for Microfluidic Optical Analyses

    PubMed Central

    Chang, Chia-Pin; Nagel, David J.; Zaghloul, Mona E.

    2010-01-01

    Optical fluorescence and absorption are two of the primary techniques used for analytical microfluidics. We provide a thorough yet tractable method for computing the performance of diverse optical micro-analytical systems. Sample sizes range from nano- to many micro-liters and concentrations from nano- to milli-molar. Equations are provided to trace quantitatively the flow of the fundamental entities, namely photons and electrons, and the conversion of energy from the source, through optical components, samples and spectral-selective components, to the detectors and beyond. The equations permit facile computations of calibration curves that relate the concentrations or numbers of molecules measured to the absolute signals from the system. This methodology provides the basis for both detailed understanding and improved design of microfluidic optical analytical systems. It saves prototype turn-around time, and is much simpler and faster to use than ray tracing programs. Over two thousand spreadsheet computations were performed during this study. We found that some design variations produce higher signal levels and, for constant noise levels, lower minimum detection limits. Improvements of more than a factor of 1,000 were realized. PMID:22163573
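    The photon-and-electron bookkeeping this abstract describes can be illustrated with a single radiometric chain for a fluorescence channel: source photons, Beer-Lambert absorption, quantum yield, collection and filter losses, and detector quantum efficiency. Every numerical value below is a placeholder assumption, not a figure from the paper:

```python
import math

# Illustrative radiometric chain for a microfluidic fluorescence measurement.
# All parameter values are hypothetical placeholders.

H = 6.626e-34   # Planck constant, J*s
C = 3.0e8       # speed of light, m/s

def detected_electrons(conc_molar,
                       source_power_w=1e-3,     # 1 mW excitation
                       wavelength_m=488e-9,
                       exposure_s=0.1,
                       path_length_cm=0.01,     # ~100 um channel
                       molar_absorptivity=8e4,  # L mol^-1 cm^-1
                       quantum_yield=0.9,
                       collection_eff=0.05,     # solid angle + optics
                       filter_transmission=0.8,
                       detector_qe=0.6):
    # number of excitation photons delivered during the exposure
    photons_in = source_power_w * exposure_s / (H * C / wavelength_m)
    # Beer-Lambert: fraction of excitation photons absorbed by the sample
    absorbed = 1.0 - 10 ** (-molar_absorptivity * conc_molar * path_length_cm)
    emitted = photons_in * absorbed * quantum_yield
    return emitted * collection_eff * filter_transmission * detector_qe

# Calibration curve: detected signal vs concentration from nM to mM
for conc in (1e-9, 1e-6, 1e-3):
    print(f"{conc:.0e} M -> {detected_electrons(conc):.3e} electrons")
```

Chaining such factors in a spreadsheet or script is exactly what makes absolute calibration curves cheap to recompute when a design variable (path length, filter, detector) changes.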

  13. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. The numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments, are described. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  14. Methodology for the systems engineering process. Volume 3: Operational availability

    NASA Technical Reports Server (NTRS)

    Nelson, J. H.

    1972-01-01

    A detailed description and explanation of the operational availability parameter is presented. The fundamental mathematical basis for operational availability is developed, and its relationship to a system's overall performance effectiveness is illustrated within the context of identifying specific availability requirements. Thus, in attempting to provide a general methodology for treating both hypothetical and existing availability requirements, the concept of an availability state, in conjunction with the more conventional probability-time capability, is investigated. In this respect, emphasis is focused upon a balanced analytical and pragmatic treatment of operational availability within the system design process. For example, several applications of operational availability to typical aerospace systems are presented, encompassing the techniques of Monte Carlo simulation, system performance availability trade-off studies, analytical modeling of specific scenarios, as well as the determination of launch-on-time probabilities. Finally, an extensive bibliography is provided to indicate further levels of depth and detail of the operational availability parameter.
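    The Monte Carlo simulation technique mentioned above can be illustrated with a minimal single-unit availability model: alternate exponentially distributed up and down periods and measure the fraction of time up, which should agree with the classical analytic expression A = MTBF / (MTBF + MTTR). The MTBF/MTTR figures are arbitrary examples:

```python
import random

# Monte Carlo sketch of steady-state operational availability for one
# repairable unit with exponential failure and repair times.
# MTBF and MTTR values are illustrative only.

def simulated_availability(mtbf=100.0, mttr=5.0, horizon=1e6, seed=42):
    rng = random.Random(seed)
    up_time, t = 0.0, 0.0
    while t < horizon:
        ttf = rng.expovariate(1.0 / mtbf)   # time to next failure
        ttr = rng.expovariate(1.0 / mttr)   # time to complete repair
        up_time += ttf
        t += ttf + ttr
    return up_time / t

a_sim = simulated_availability()
a_analytic = 100.0 / (100.0 + 5.0)          # A = MTBF / (MTBF + MTTR)
print(f"simulated {a_sim:.4f} vs analytic {a_analytic:.4f}")
```

For a single unit the analytic formula is exact, so simulation only pays off for the more complex scenarios the report describes (trade-off studies, launch-on-time probabilities) where closed forms are unavailable.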

  15. Two-factor theory – at the intersection of health care management and patient satisfaction

    PubMed Central

    Bohm, Josef

    2012-01-01

    Using data obtained from the 2004 Joint Canadian/United States Survey of Health, an analytic model using principles derived from Herzberg’s motivational hygiene theory was developed for evaluating patient satisfaction with health care. The analysis sought to determine whether survey variables associated with consumer satisfaction act as Herzberg factors and contribute to survey participants’ self-reported levels of health care satisfaction. To validate the technique, data from the survey were analyzed using logistic regression methods and then compared with results obtained from the two-factor model. The findings indicate a high degree of correlation between the two methods. The two-factor analytical methodology offers advantages due to its ability to identify whether a factor assumes a motivational or hygienic role and assesses the influence of a factor within select populations. Its ease of use makes this methodology well suited for assessment of multidimensional variables. PMID:23055755

  16. Two-factor theory - at the intersection of health care management and patient satisfaction.

    PubMed

    Bohm, Josef

    2012-01-01

    Using data obtained from the 2004 Joint Canadian/United States Survey of Health, an analytic model using principles derived from Herzberg's motivational hygiene theory was developed for evaluating patient satisfaction with health care. The analysis sought to determine whether survey variables associated with consumer satisfaction act as Herzberg factors and contribute to survey participants' self-reported levels of health care satisfaction. To validate the technique, data from the survey were analyzed using logistic regression methods and then compared with results obtained from the two-factor model. The findings indicate a high degree of correlation between the two methods. The two-factor analytical methodology offers advantages due to its ability to identify whether a factor assumes a motivational or hygienic role and assesses the influence of a factor within select populations. Its ease of use makes this methodology well suited for assessment of multidimensional variables.
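    The logistic-regression side of such a comparison can be sketched on synthetic data: fit self-reported satisfaction against two binary survey factors and read off the fitted effects. The data generator, coefficients, and factor roles below are invented and have no relation to the actual 2004 survey data:

```python
import math
import random

# Toy logistic regression of a binary satisfaction outcome on two binary
# survey factors. Synthetic data: factor1 is built to have the larger
# positive effect. Fitted by plain gradient ascent on the log-likelihood.

rng = random.Random(0)
data = []
for _ in range(1000):
    f1, f2 = rng.randint(0, 1), rng.randint(0, 1)
    logit = -0.5 + 1.5 * f1 + 1.0 * f2          # "true" model, invented
    y = 1 if rng.random() < 1 / (1 + math.exp(-logit)) else 0
    data.append((f1, f2, y))

w = [0.0, 0.0, 0.0]                             # intercept, factor1, factor2
for _ in range(1500):
    g = [0.0, 0.0, 0.0]
    for f1, f2, y in data:
        p = 1 / (1 + math.exp(-(w[0] + w[1] * f1 + w[2] * f2)))
        err = y - p                              # log-likelihood gradient term
        g[0] += err
        g[1] += err * f1
        g[2] += err * f2
    w = [wi + 0.002 * gi for wi, gi in zip(w, g)]

print(f"intercept {w[0]:+.2f}, factor1 {w[1]:+.2f}, factor2 {w[2]:+.2f}")
```

A Herzberg-style reading would then ask, per factor, whether presence raises satisfaction (motivator) or absence lowers it (hygiene), which the regression coefficients alone do not distinguish.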

  17. "Reagentless" flow injection determination of ammonia and urea using membrane separation and solid phase basification

    NASA Technical Reports Server (NTRS)

    Akse, J. R.; Thompson, J. O.; Sauer, R. L.; Atwater, J. E.

    1998-01-01

    Flow injection analysis instrumentation and methodology for the determination of ammonia and ammonium ions in aqueous solution are described. Using in-line solid phase basification beds containing crystalline media, the speciation of ammoniacal nitrogen is shifted toward the un-ionized form, which diffuses in the gas phase across a hydrophobic microporous hollow fiber membrane into a pure-water-containing analytical stream. The two streams flow in a countercurrent configuration on opposite sides of the membrane. The neutral pH of the analytical stream promotes the formation of ammonium cations, which are detected using specific conductance. The methodology provides a lower limit of detection of 10 micrograms/L and a dynamic concentration range spanning three orders of magnitude using a 315-microliter sample injection volume. Using immobilized urease to enzymatically promote the hydrolysis of urea to produce ammonia and carbon dioxide, the technique has been extended to the determination of urea.

  18. Nanometrology and its perspectives in environmental research.

    PubMed

    Kim, Hyun-A; Seo, Jung-Kwan; Kim, Taksoo; Lee, Byung-Tae

    2014-01-01

    Rapid increase in engineered nanoparticles (ENPs) in many goods has raised significant concern about their environmental safety. Proper methodologies are therefore needed to conduct toxicity and exposure assessment of nanoparticles in the environment. This study reviews several analytical techniques for nanoparticles; summarizes their principles, advantages, and disadvantages; reviews the state of the art; and offers perspectives on nanometrology in relation to ENP studies. Nanometrology is divided into five techniques with regard to the instrumental principle: microscopy, light scattering, spectroscopy, separation, and single particle inductively coupled plasma-mass spectrometry. Each analytical method has its own drawbacks, such as detection limit, ability to quantify or qualify ENPs, and matrix effects. More than two different analytical methods should be used to better characterize ENPs. In characterizing ENPs, researchers should understand nanometrology, its merits as well as its demerits, to properly interpret their experimental results. Challenges lie in nanometrology and in the pretreatment of ENPs from various matrices: extraction without dissolution or aggregation, and concentration of ENPs to satisfy instrumental detection limits.

  19. Comparing Classic and Interval Analytical Hierarchy Process Methodologies for Measuring Area-Level Deprivation to Analyze Health Inequalities.

    PubMed

    Cabrera-Barona, Pablo; Ghorbanzadeh, Omid

    2018-01-16

    Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas.

  20. Comparing Classic and Interval Analytical Hierarchy Process Methodologies for Measuring Area-Level Deprivation to Analyze Health Inequalities

    PubMed Central

    Cabrera-Barona, Pablo

    2018-01-01

    Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas. PMID:29337915
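    The classic AHP weighting step that both entries above build on can be sketched directly: derive indicator weights as the principal eigenvector of a pairwise comparison matrix and check its consistency index. The comparison matrix below is invented for illustration and is not the Quito index's actual expert judgement:

```python
# Classic AHP weighting for three deprivation indicators from a pairwise
# comparison matrix on the Saaty 1-9 scale. The matrix entries are
# hypothetical, not from the cited study.

def ahp_weights(M, iters=100):
    """Principal-eigenvector weights via power iteration, plus the
    consistency index CI = (lambda_max - n) / (n - 1)."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [vi / s for vi in v]            # renormalize to sum to 1
    lmax = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i]
               for i in range(n)) / n
    ci = (lmax - n) / (n - 1)
    return w, ci

# Indicator A judged moderately more important than B, strongly than C;
# the matrix is reciprocal: M[j][i] = 1 / M[i][j].
M = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   3.0],
     [2/10.0, 1/3.0, 1.0]]
weights, ci = ahp_weights(M)
print("weights:", [round(x, 3) for x in weights], "CI:", round(ci, 4))
```

The interval-AHP variant discussed in the paper replaces the point judgements in M with intervals, propagating the experts' uncertainty into interval-valued weights rather than the single vector computed here.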

  1. Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis

    PubMed Central

    Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.

    2011-01-01

    Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples, the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93, for the analysis of a certified reference material, using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184

  2. The Recovery Care and Treatment Center: A Database Design and Development Case

    ERIC Educational Resources Information Center

    Harris, Ranida B.; Vaught, Kara L.

    2008-01-01

    The advantages of active learning methodologies have been suggested and empirically shown by a number of IS educators. Case studies are one such teaching technique that offers students the ability to think analytically, apply material learned, and solve a real-world problem. This paper presents a case study designed to be used in a database design…

  3. Determinants of project success

    NASA Technical Reports Server (NTRS)

    Murphy, D. C.; Baker, B. N.; Fisher, D.

    1974-01-01

    The interactions of numerous project characteristics, with particular reference to project performance, were studied. Determinants of success are identified along with the accompanying implications for client organization, parent organization, project organization, and future research. Variables are selected which are found to have the greatest impact on project outcome, and the methodology and analytic techniques to be employed in identification of those variables are discussed.

  4. Endogenous glucocorticoid analysis by liquid chromatography-tandem mass spectrometry in routine clinical laboratories.

    PubMed

    Hawley, James M; Keevil, Brian G

    2016-09-01

    Liquid chromatography-tandem mass spectrometry (LC-MS/MS) is a powerful analytical technique that offers exceptional selectivity and sensitivity. Used optimally, LC-MS/MS provides accurate and precise results for a wide range of analytes at concentrations that are difficult to quantitate with other methodologies. Its implementation into routine clinical biochemistry laboratories has revolutionised our ability to analyse small molecules such as glucocorticoids. Whereas immunoassays can suffer from matrix effects and cross-reactivity due to interactions with structural analogues, the selectivity offered by LC-MS/MS has largely overcome these limitations. As many clinical guidelines are now beginning to acknowledge the importance of the methodology used to provide results, the advantages associated with LC-MS/MS are gaining wider recognition. With their integral role in both the diagnosis and management of hypo- and hyperadrenal disorders, coupled with their widespread pharmacological use, the accurate measurement of glucocorticoids is fundamental to effective patient care. Here, we provide an up-to-date review of the LC-MS/MS techniques used to successfully measure endogenous glucocorticoids, particular reference is made to serum, urine and salivary cortisol. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. A guide for the application of analytics on healthcare processes: A dynamic view on patient pathways.

    PubMed

    Lismont, Jasmien; Janssens, Anne-Sophie; Odnoletkova, Irina; Vanden Broucke, Seppe; Caron, Filip; Vanthienen, Jan

    2016-10-01

    The aim of this study is to guide healthcare instances in applying process analytics on healthcare processes. Process analytics techniques can offer new insights in patient pathways, workflow processes, adherence to medical guidelines and compliance with clinical pathways, but also bring along specific challenges which will be examined and addressed in this paper. The following methodology is proposed: log preparation, log inspection, abstraction and selection, clustering, process mining, and validation. It was applied on a case study in the type 2 diabetes mellitus domain. Several data pre-processing steps are applied and clarify the usefulness of process analytics in a healthcare setting. Healthcare utilization, such as diabetes education, is analyzed and compared with diabetes guidelines. Furthermore, we take a look at the organizational perspective and the central role of the GP. This research addresses four challenges: healthcare processes are often patient and hospital specific which leads to unique traces and unstructured processes; data is not recorded in the right format, with the right level of abstraction and time granularity; an overflow of medical activities may cloud the analysis; and analysts need to deal with data not recorded for this purpose. These challenges complicate the application of process analytics. It is explained how our methodology takes them into account. Process analytics offers new insights into the medical services patients follow, how medical resources relate to each other and whether patients and healthcare processes comply with guidelines and regulations. Copyright © 2016 Elsevier Ltd. All rights reserved.
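    The process-mining step of the proposed methodology (log preparation, then mining) can be illustrated with a minimal directly-follows computation on a synthetic patient event log, grouped by case. The event names below are invented for illustration, not taken from the study's diabetes data:

```python
from collections import Counter

# Minimal process-mining sketch: build a directly-follows relation from a
# synthetic (caseID, activity) event log. Activity names are hypothetical.

log = [
    ("p1", "GP visit"), ("p1", "lab test"), ("p1", "diabetes education"),
    ("p2", "GP visit"), ("p2", "diabetes education"),
    ("p3", "GP visit"), ("p3", "lab test"), ("p3", "GP visit"),
]

traces = {}
for case, activity in log:          # log preparation: group events per case
    traces.setdefault(case, []).append(activity)

dfg = Counter()                     # directly-follows relation with counts
for events in traces.values():
    for a, b in zip(events, events[1:]):
        dfg[(a, b)] += 1

for (a, b), n in sorted(dfg.items()):
    print(f"{a} -> {b}: {n}")
```

Real patient pathways produce many near-unique traces, which is exactly why the paper's abstraction, selection, and clustering steps precede mining: without them the directly-follows graph becomes an unreadable "spaghetti" model.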

  6. 1990 National Water Quality Laboratory Services Catalog

    USGS Publications Warehouse

    Pritt, Jeffrey; Jones, Berwyn E.

    1989-01-01

    PREFACE This catalog provides information about analytical services available from the National Water Quality Laboratory (NWQL) to support programs of the Water Resources Division of the U.S. Geological Survey. To assist personnel in the selection of analytical services, the catalog lists cost, sample volume, applicable concentration range, detection level, precision of analysis, and preservation techniques for samples to be submitted for analysis. Prices for services reflect operational costs, the complexity of each analytical procedure, and the costs to ensure analytical quality control. The catalog consists of five parts. Part 1 is a glossary of terminology; Part 2 lists the bottles, containers, solutions, and other materials that are available through the NWQL; Part 3 describes the field processing of samples to be submitted for analysis; Part 4 describes analytical services that are available; and Part 5 contains indices of analytical methodology and Chemical Abstract Services (CAS) numbers. Nomenclature used in the catalog is consistent with WATSTORE and STORET. The user is provided with laboratory codes and schedules that consist of groupings of parameters which are measured together in the NWQL. In cases where more than one analytical range is offered for a single element or compound, different laboratory codes are given. Book 5 of the series 'Techniques of Water Resources Investigations of the U.S. Geological Survey' should be consulted for more information about the analytical procedures included in the tabulations. This catalog supersedes U.S. Geological Survey Open-File Report 86-232 '1986-87-88 National Water Quality Laboratory Services Catalog', October 1985.

  7. Training the next generation analyst using red cell analytics

    NASA Astrophysics Data System (ADS)

    Graham, Meghan N.; Graham, Jacob L.

    2016-05-01

    We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized by the solution but by the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the areas of security, risk and intelligence training.

  8. Coalescence computations for large samples drawn from populations of time-varying sizes

    PubMed Central

    Polanski, Andrzej; Szczesna, Agnieszka; Garbulowski, Mateusz; Kimmel, Marek

    2017-01-01

    We present new results concerning probability distributions of times in the coalescence tree and expected allele frequencies for coalescent with large sample size. The obtained results are based on computational methodologies, which involve combining coalescence time scale changes with techniques of integral transformations and using analytical formulae for infinite products. We show applications of the proposed methodologies for computing probability distributions of times in the coalescence tree and their limits, for evaluation of accuracy of approximate expressions for times in the coalescence tree and expected allele frequencies, and for analysis of large human mitochondrial DNA dataset. PMID:28170404
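    As textbook background to the coalescent quantities the abstract generalizes, the constant-population case has closed-form expectations: while k lineages remain, the waiting time T_k has mean 2/(k(k-1)) in units of 2N generations, so the expected time to the most recent common ancestor is the telescoping sum 2(1 - 1/n). A small exact computation (this is the standard result, not the paper's variable-population-size methodology):

```python
from fractions import Fraction

# Constant-population coalescent: E[T_k] = 2/(k(k-1)) in coalescent units,
# so E[T_MRCA] = sum_{k=2}^{n} 2/(k(k-1)) = 2(1 - 1/n), approaching 2.

def expected_tmrca(n):
    return sum(Fraction(2, k * (k - 1)) for k in range(2, n + 1))

for n in (2, 10, 1000):
    print(f"n={n}: E[T_MRCA] = {float(expected_tmrca(n)):.4f} (limit 2)")
```

The striking consequence, relevant to the paper's large-sample focus, is that adding samples barely lengthens the expected tree depth: most coalescent events occur in the recent past, where numerical work concentrates.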

  9. S.S. Annunziata Church (L'Aquila, Italy) unveiled by non- and micro-destructive testing techniques

    NASA Astrophysics Data System (ADS)

    Sfarra, Stefano; Cheilakou, Eleni; Theodorakeas, Panagiotis; Paoletti, Domenica; Koui, Maria

    2017-03-01

    The present research work explores the potential of an integrated inspection methodology, combining Non-destructive testing and micro-destructive analytical techniques, for both the structural assessment of the S.S. Annunziata Church located in Roio Colle (L'Aquila, Italy) and the characterization of its wall paintings' pigments. The study started by applying passive thermal imaging for the structural monitoring of the church before and after the application of a consolidation treatment, while active thermal imaging was further used for assessing this consolidation procedure. After the earthquake of 2009, which seriously damaged the city of L'Aquila and its surroundings, part of the internal plaster fell off revealing the presence of an ancient mural painting that was subsequently investigated by means of a combined analytical approach involving portable VIS-NIR fiber optics diffuse reflectance spectroscopy (FORS) and laboratory methods, such as environmental scanning electron microscopy (ESEM) coupled with energy dispersive X-ray analysis (EDX), and attenuated total reflectance-Fourier transform infrared spectroscopy (ATR-FTIR). The results obtained from the thermographic analysis provided information concerning the two different construction phases of the Church, enabled the assessment of the consolidation treatment, and contributed to the detection of localized problems mainly related to the rising damp phenomenon and to biological attack. In addition, the results obtained from the combined analytical approach allowed the identification of the wall painting pigments (red and yellow ochre, green earth, and smalt) and provided information on the binding media and the painting technique possibly applied by the artist.
From the results of the present study, it is possible to conclude that the joint use of the above stated methods into an integrated methodology can produce the complete set of useful information required for the planning of the Church's restoration phase.

  10. A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...

    EPA Pesticide Factsheets

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co

  11. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    USGS Publications Warehouse

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. ?? 1988 International Association for Mathematical Geology.

  12. A rapid and sensitive analytical method for the determination of 14 pyrethroids in water samples.

    PubMed

    Feo, M L; Eljarrat, E; Barceló, D

    2010-04-09

    A simple, efficient and environmentally friendly analytical methodology is proposed for extracting and preconcentrating pyrethroids from water samples prior to gas chromatography-negative ion chemical ionization mass spectrometry (GC-NCI-MS) analysis. Fourteen pyrethroids were selected for this work: bifenthrin, cyfluthrin, lambda-cyhalothrin, cypermethrin, deltamethrin, esfenvalerate, fenvalerate, fenpropathrin, tau-fluvalinate, permethrin, phenothrin, resmethrin, tetramethrin and tralomethrin. The method is based on ultrasound-assisted emulsification-extraction (UAEE) of a water-immiscible solvent in an aqueous medium. Chloroform was used as extraction solvent in the UAEE technique. Target analytes were quantitatively extracted, achieving an enrichment factor of 200 when a 20 mL aliquot of pure water spiked with pyrethroid standards was extracted. The method was also evaluated with tap water and river water samples. Method detection limits (MDLs) ranged from 0.03 to 35.8 ng L(-1) with RSD values ≤3-25% (n=5). The coefficients of estimation of the calibration curves obtained following the proposed methodology were ≥0.998. Recovery values were in the range of 45-106%, showing satisfactory robustness of the method for analyzing pyrethroids in water samples. The proposed methodology was applied for the analysis of river water samples. Cypermethrin was detected at concentration levels ranging from 4.94 to 30.5 ng L(-1). Copyright 2010 Elsevier B.V. All rights reserved.
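
    The reported enrichment factor follows from the ratio of the aqueous sample volume to the final organic-phase volume, scaled by recovery. The 0.1 mL extract volume below is a hypothetical value chosen only because it reproduces the stated factor of 200 for a 20 mL aliquot; the abstract does not give the actual extract volume:

```python
def enrichment_factor(aqueous_ml, extract_ml, recovery=1.0):
    """Theoretical enrichment factor for liquid-phase microextraction:
    the ratio of sample volume to final extract volume, scaled by the
    fraction of analyte actually transferred."""
    return recovery * aqueous_ml / extract_ml

# A 20 mL aliquot concentrated into a (hypothetical) 0.1 mL chloroform
# phase at full recovery gives a factor of 200.
print(enrichment_factor(20.0, 0.1))  # 200.0
```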

  13. Analytical robustness of quantitative NIR chemical imaging for Islamic paper characterization

    NASA Astrophysics Data System (ADS)

    Mahgoub, Hend; Gilchrist, John R.; Fearn, Thomas; Strlič, Matija

    2017-07-01

    Recently, spectral imaging techniques such as Multispectral (MSI) and Hyperspectral Imaging (HSI) have gained importance in the field of heritage conservation. This paper explores the analytical robustness of quantitative chemical imaging for Islamic paper characterization by focusing on the effect of different measurement and processing parameters, i.e. acquisition conditions and calibration, on the accuracy of the collected spectral data. This provides a better understanding of a technique that can deliver a measure of change in collections through imaging. For the quantitative model, a special calibration target was devised using 105 samples from a well-characterized reference Islamic paper collection. Two material properties were of interest: starch sizing and cellulose degree of polymerization (DP). Multivariate data analysis methods were used to develop discrimination and regression models, which served as an evaluation methodology for the metrology of quantitative NIR chemical imaging. Spectral data were collected using a pushbroom HSI scanner (Gilden Photonics Ltd) in the 1000-2500 nm range with a spectral resolution of 6.3 nm, using a mirror scanning setup and halogen illumination. Data were acquired under different measurement conditions and acquisition parameters. Preliminary results showed the potential of the evaluation methodology, indicating that measurement parameters such as the use of different lenses and different scanning backgrounds may not have a great influence on the quantitative results. Moreover, the evaluation methodology allowed for the selection of the best pre-treatment method to be applied to the data.

  14. Analytical capillary isotachophoresis after 50 years of development: Recent progress 2014-2016.

    PubMed

    Malá, Zdena; Gebauer, Petr; Boček, Petr

    2017-01-01

    This review presents a survey of papers on analytical ITP published from 2014 through the first quarter of 2016. The 50th anniversary of ITP as a modern analytical method offers the opportunity to present a brief view of its beginnings and to discuss the present state of the art from the viewpoint of the history of its development. Reviewed papers from the field of theory and principles confirm the continuing importance of computer simulations in the discovery of new and unexpected phenomena. The strongly developing field of instrumentation and techniques shows novel channel methodologies including the use of porous media and new on-chip assays, where ITP is often included in a preseparative or even preparative function. A number of new analytical applications are reported, with ITP appearing almost exclusively in combination with other principles and methods. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Analytical model for real time, noninvasive estimation of blood glucose level.

    PubMed

    Adhyapak, Anoop; Sidley, Matthew; Venkataraman, Jayanti

    2014-01-01

    The paper presents an analytical model to estimate blood glucose level from measurements made noninvasively and in real time by an antenna strapped to a patient's wrist. The RIT ETA Lab research group has shown promising evidence that an antenna's resonant frequency can track changes in glucose concentration in real time. Based on an in-vitro study of blood samples from diabetic patients, the paper presents a modified Cole-Cole model that incorporates a factor representing the change in glucose level. A calibration technique based on the antenna's input impedance is discussed, and the results show good agreement with glucose meter readings. An alternative calibration methodology has been developed based on the shift in the antenna's resonant frequency, using an equivalent circuit model containing a shunt capacitor to represent the shift in resonant frequency with changing glucose levels. Work in progress includes optimization of the technique with a larger sample of patients.
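
    The standard single-relaxation Cole-Cole expression underlying such models is ε*(ω) = ε∞ + Δε / (1 + (jωτ)^(1−α)). The glucose-dependent scaling below is a placeholder for whichever modification the authors actually adopted, and the parameter values only loosely resemble water at microwave frequencies:

```python
import numpy as np

def cole_cole(freq_hz, eps_inf, eps_s, tau, alpha, glucose_scale=1.0):
    """Single-pole Cole-Cole relaxation. The `glucose_scale` factor on
    the dielectric increment is purely illustrative of the kind of
    modification the paper describes, not its actual form."""
    omega = 2 * np.pi * np.asarray(freq_hz)
    d_eps = (eps_s - eps_inf) * glucose_scale
    return eps_inf + d_eps / (1 + (1j * omega * tau) ** (1 - alpha))

# Illustrative parameters evaluated at 2.4 GHz:
eps = cole_cole(2.4e9, eps_inf=4.0, eps_s=78.0, tau=8.3e-12, alpha=0.02)
print(eps.real, -eps.imag)  # real permittivity and loss factor
```

A change in `glucose_scale` shifts the complex permittivity seen by the antenna, which in turn shifts its resonant frequency; that shift is what the calibration maps back to a glucose level.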

  16. Shuttle TPS thermal performance and analysis methodology

    NASA Technical Reports Server (NTRS)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mold line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  17. A Learning Framework for Control-Oriented Modeling of Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubio-Herrero, Javier; Chandan, Vikas; Siegel, Charles M.

    Buildings consume a significant amount of energy worldwide. Several building optimization and control use cases require models of energy consumption which are control oriented, have high predictive capability, impose minimal data pre-processing requirements, and have the ability to be adapted continuously to account for changing conditions as new data becomes available. Data driven modeling techniques that have been investigated so far, while promising in the context of buildings, have been unable to simultaneously satisfy all the requirements mentioned above. In this context, deep learning techniques such as Recurrent Neural Networks (RNNs) hold promise, empowered by advanced computational capabilities and big data opportunities. In this paper, we propose a deep learning based methodology for the development of control oriented models for building energy management and test it on data from a real building. Results show that the proposed methodology outperforms other data driven modeling techniques significantly. We perform a detailed analysis of the proposed methodology along dimensions such as topology, sensitivity, and downsampling. Lastly, we conclude by envisioning a building analytics suite empowered by the proposed deep framework, that can drive several use cases related to building energy management.

  18. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    PubMed

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

    A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects that are analytical procedures applied to furan determination in food samples. They are described by 10 variables relating to their analytical performance and environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. Assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools complements each other. The applicability of combining grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
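
    The TOPSIS half of the workflow is compact enough to sketch. The three "procedures", criteria and weights below are invented for illustration and are not taken from the paper:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS: vector-normalize the decision
    matrix, apply weights, measure Euclidean distance to the ideal and
    anti-ideal points, and score by relative closeness (higher = better)."""
    m = np.asarray(matrix, dtype=float)
    w = np.asarray(weights, dtype=float) / np.sum(weights)
    v = (m / np.linalg.norm(m, axis=0)) * w
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Three hypothetical procedures scored on LOD (lower is better), recovery
# (higher is better) and solvent use (lower is better); numbers are made up.
scores = topsis([[0.5, 95, 10], [2.0, 99, 50], [1.0, 90, 5]],
                weights=[0.4, 0.4, 0.2],
                benefit=[False, True, False])
print(scores.argmax())  # index of the preferred procedure
```

In the paper's workflow, SOM grouping first reduces the candidate set, and a ranking like this is then run on the surviving procedures.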

  19. Functionality of empirical model-based predictive analytics for the early detection of hemodynamic instability.

    PubMed

    Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C

    2014-01-01

    Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient’s pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (“SBM”) was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology or “QCP”) which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient’s physiologic state. This platform allows for the use of predictive analytic techniques to identify early changes in a patient’s condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP in which the output of computer simulations of a slowly evolving cardiac tamponade resulted in a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz) with the tamponade introduced at a point 420 minutes into the simulation sequence. The functionality of the SBM predictive analytics methodology to identify clinical deterioration was compared to the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP.
With the slow development of the tamponade, the SBM model estimates are seen to diverge from the simulated biosignals in the early stages of physiologic deterioration, while the variables are still within normal ranges. Thus, the SBM system was found to identify pathophysiologic conditions in a timeframe that would not have been detected in a usual clinical monitoring scenario. Conclusion. In this study the functionality of a multivariate machine learning predictive methodology that incorporates commonly monitored clinical information was tested using a computer model of human physiology. SBM and predictive analytics were able to differentiate a state of decompensation while the monitored variables were still within normal clinical ranges. This finding suggests that SBM could provide for early identification of clinical deterioration using predictive analytic techniques. Keywords: predictive analytics, hemodynamics, monitoring.
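
    SBM itself is a proprietary method, so the following is only a generic kernel-similarity sketch of the idea described above: reconstruct each multivariate observation from a memory of healthy reference states and watch the reconstruction residual grow as the patient state drifts, even while each individual variable remains plausible. All numbers and the Gaussian kernel are illustrative assumptions:

```python
import numpy as np

def sbm_estimate(memory, x, bandwidth=1.0):
    """Kernel-similarity sketch in the spirit of SBM: reconstruct the
    current observation x as a similarity-weighted blend of healthy
    reference states stored in `memory` (one state per row)."""
    d = np.linalg.norm(memory - x, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)
    w /= w.sum()
    return w @ memory

# Memory of "healthy" states for two channels (illustrative numbers):
memory = np.array([[70.0, 120.0], [72.0, 118.0], [68.0, 122.0], [71.0, 119.0]])
healthy = np.array([70.5, 119.5])       # consistent with the memory
deteriorating = np.array([85.0, 95.0])  # drifting away from all references

r_ok = np.linalg.norm(sbm_estimate(memory, healthy, 3.0) - healthy)
r_bad = np.linalg.norm(sbm_estimate(memory, deteriorating, 3.0) - deteriorating)
print(r_ok < r_bad)  # residual grows as the state leaves the healthy envelope
```

An alarm on the residual, rather than on fixed per-variable thresholds, is what allows detection before any single variable leaves its normal range.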

  20. Fluorescence Spectroscopy for the Monitoring of Food Processes.

    PubMed

    Ahmad, Muhammad Haseeb; Sahar, Amna; Hitzmann, Bernd

    Different analytical techniques have been used to examine the complexity of food samples. Among them, fluorescence spectroscopy cannot be ignored in developing rapid and non-invasive analytical methodologies. It is one of the most sensitive spectroscopic approaches employed in identification, classification, authentication, quantification, and optimization of different parameters during food handling, processing, and storage and uses different chemometric tools. Chemometrics helps to retrieve useful information from spectral data utilized in the characterization of food samples. This contribution discusses in detail the potential of fluorescence spectroscopy of different foods, such as dairy, meat, fish, eggs, edible oil, cereals, fruit, vegetables, etc., for qualitative and quantitative analysis with different chemometric approaches.

  1. [Enzymatic analysis of the quality of foodstuffs].

    PubMed

    Kolesnov, A Iu

    1997-01-01

    Enzymatic analysis is an independent and separate branch of enzymology and analytical chemistry. It has become one of the most important methodologies used in food analysis. Enzymatic analysis allows the quick, reliable determination of many food ingredients. Often these contents cannot be determined by conventional methods, or if methods are available, they are determined only with limited accuracy. Today, methods of enzymatic analysis are being increasingly used in the investigation of foodstuffs. Enzymatic measurement techniques are used in industry, scientific and food inspection laboratories for quality analysis. This article describes the requirements of an optimal analytical method: specificity, sample preparation, assay performance, precision, sensitivity, time requirement, analysis cost, safety of reagents.

  2. A Big Data Analytics Methodology Program in the Health Sector

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  3. Evaluative methodology for prioritizing transportation energy conservation strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pang, L.M.G.

    An analytical methodology was developed for the purpose of prioritizing a set of transportation energy conservation (TEC) strategies within an urban environment. Steps involved in applying the methodology consist of 1) defining the goals, objectives and constraints of the given urban community, 2) identifying potential TEC strategies, 3) assessing the impact of the strategies, 4) applying the TEC evaluation model, and 5) utilizing a selection process to determine the optimal set of strategies for implementation. This research provides an overview of 21 TEC strategies, a quick-response technique for estimating energy savings, a multiattribute utility theory approach for assessing subjective impacts, and a computer program for making the strategy evaluations, all of which assist in expediting the execution of the entire methodology procedure. The critical element of the methodology is the strategy evaluation model which incorporates a number of desirable concepts including 1) a comprehensive accounting of all relevant impacts, 2) the application of multiobjective decision-making techniques, 3) an approach to assure compatibility among quantitative and qualitative impact measures, 4) the inclusion of the decision maker's preferences in the evaluation procedure, and 5) the cost-effectiveness concept. Application of the methodology to Salt Lake City, Utah demonstrated its utility, ease of use and favorability by decision makers.
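
    A minimal sketch of the additive multiattribute utility step: each raw impact is mapped through a single-attribute utility function scaled to [0, 1] and the results are weight-summed. The strategies, weights and utility functions below are invented for illustration, not taken from the report:

```python
def additive_utility(impacts, weights, utility_fns):
    """Additive multiattribute utility: map each raw impact through its
    single-attribute utility function (scaled 0..1), then weight-sum."""
    return sum(w * u(x) for x, w, u in zip(impacts, weights, utility_fns))

# Hypothetical single-attribute utilities: energy saved (GJ/yr, more is
# better, saturating at 5000) and cost (k$, less is better, capped at 1000).
u_energy = lambda x: min(x / 5000.0, 1.0)
u_cost = lambda x: 1.0 - min(x / 1000.0, 1.0)

# Two made-up strategies scored as [energy saved, cost]:
ride_share = additive_utility([3000, 200], [0.6, 0.4], [u_energy, u_cost])
new_transit = additive_utility([4500, 900], [0.6, 0.4], [u_energy, u_cost])
print(ride_share > ride_share - 1)  # scores are comparable scalars
```

Encoding the decision maker's preferences in the weights and utility curves is what makes subjective impacts commensurable with quantitative ones.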

  4. Multivariate approaches for stability control of the olive oil reference materials for sensory analysis - part II: applications.

    PubMed

    Valverde-Som, Lucia; Ruiz-Samblás, Cristina; Rodríguez-García, Francisco P; Cuadros-Rodríguez, Luis

    2018-02-09

    The organoleptic quality of virgin olive oil depends on positive and negative sensory attributes. These attributes are related to volatile organic compounds and phenolic compounds that represent the aroma and taste (flavour) of the virgin olive oil. The flavour is the characteristic that can be measured by a taster panel. However, as for any analytical measuring device, the tasters, individually, and the panel, as a whole, should be harmonized and validated, and proper olive oil standards are needed. In the present study, multivariate approaches are put into practice, in addition to rules for building a multivariate control chart from chromatographic volatile fingerprinting and chemometrics. Fingerprinting techniques provide analytical information without identifying and quantifying the analytes. This methodology is used to monitor the stability of sensory reference materials. Similarity indices were calculated to build multivariate control charts for two certified olive oil reference materials, which were used as examples to monitor their stability. This methodology with chromatographic data could be applied in parallel with the 'panel test' sensory method to reduce the workload of sensory analysis. © 2018 Society of Chemical Industry.
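
    The paper's specific similarity indices are not given here; a common choice is the Pearson correlation between a sample fingerprint and the reference mean, with a lower control limit derived from replicate measurements. The fingerprints below are synthetic stand-ins for chromatographic traces:

```python
import numpy as np

def similarity_index(fingerprint, reference):
    """Pearson-correlation similarity between a chromatographic
    fingerprint and the reference-material mean fingerprint."""
    return float(np.corrcoef(fingerprint, reference)[0, 1])

rng = np.random.default_rng(1)
reference = rng.random(50)  # stand-in for the mean volatile fingerprint
replicates = [reference + rng.normal(scale=0.01, size=50) for _ in range(10)]
sims = np.array([similarity_index(r, reference) for r in replicates])

# Lower control limit at mean - 3 SD of the replicate similarities:
lcl = sims.mean() - 3 * sims.std()
degraded = reference + rng.normal(scale=0.3, size=50)  # drifted material
print(similarity_index(degraded, reference) < lcl)  # flags instability
```

A point falling below the lower control limit signals that the reference material has drifted and should no longer anchor the sensory panel.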

  5. New techniques for imaging and analyzing lung tissue.

    PubMed Central

    Roggli, V L; Ingram, P; Linton, R W; Gutknecht, W F; Mastin, P; Shelburne, J D

    1984-01-01

    The recent technological revolution in the field of imaging techniques has provided pathologists and toxicologists with an expanding repertoire of analytical techniques for studying the interaction between the lung and the various exogenous materials to which it is exposed. Analytical problems requiring elemental sensitivity or specificity beyond the range of that offered by conventional scanning electron microscopy and energy dispersive X-ray analysis are particularly appropriate for the application of these newer techniques. Electron energy loss spectrometry, Auger electron spectroscopy, secondary ion mass spectrometry, and laser microprobe mass analysis each offer unique advantages in this regard, but also possess their own limitations and disadvantages. Diffraction techniques provide crystalline structural information available through no other means. Bulk chemical techniques provide useful cross-checks on the data obtained by microanalytical approaches. It is the purpose of this review to summarize the methodology of these techniques, acknowledge situations in which they have been used in addressing problems in pulmonary toxicology, and comment on the relative advantages and disadvantages of each approach. It is necessary for an investigator to weigh each of these factors when deciding which technique is best suited for any given analytical problem; often it is useful to employ a combination of two or more of the techniques discussed. It is anticipated that there will be increasing utilization of these technologies for problems in pulmonary toxicology in the decades to come. PMID:6090115

  6. Characterization techniques for nano-electronics, with emphasis to electron microscopy. The role of the European Project ANNA

    NASA Astrophysics Data System (ADS)

    Armigliato, A.

    2008-07-01

    In the present and future CMOS technology, due to the ever shrinking geometries of the electronic devices, the availability of techniques capable of performing quantitative analyses of the relevant parameters (structural, chemical, mechanical) at the nanoscale is of paramount importance. The influence of these features on the electrical performances of the nanodevices is a key issue for the nanoelectronics industry. In recent years, a significant progress has been made in this field by a number of techniques, such as X-ray diffraction, in particular with the advent of synchrotron sources, ion-microbeam based Rutherford backscattering and channeling spectrometry, and micro Raman spectrometry. In addition, secondary ion mass spectrometry (SIMS) has achieved an important role in the determination of the dopant depth profile in ultra-shallow junctions (USJs) in silicon. However, the technique which features the ultimate spatial resolution (at the nanometer scale) is scanning transmission electron microscopy (STEM). This presentation reports on the nanoanalysis by STEM of two very important physical quantities which need to be controlled in the fabrication processes of nanodevices: the dopant profile in the USJs and the lattice strain that is generated in the Si electrically active regions of isolation structures by the different technological steps. The former quantity is investigated by the so-called Z-contrast high-angle annular dark field (HAADF-STEM) method, whereas the mechanical strain can be two-dimensionally mapped by the convergent beam electron diffraction (CBED-STEM) method. A spatial resolution lower than one nanometer and of a few nanometers can be achieved in the two cases, respectively. To keep pace with the scientific and technological progress an increasingly wide array of analytical techniques is necessary; their complementary role in the solution of present and future characterization problems must be exploited.
Presently, however, European laboratories with high-level expertise in materials characterization still operate in a largely independent way; this adversely affects the competitiveness of European science and industry at the international level. For this reason the European Commission has started an Integrated Infrastructure Initiative (I3) in the sixth Framework Programme (now continuing in FP7) and funded a project called ANNA (2006-2010). This acronym stands for European Integrated Activity of Excellence and Networking for Nano and Micro- Electronics Analysis. The consortium includes 12 partners from 7 European countries and is coordinated by the Fondazione B.Kessler (FBK) in Trento (Italy); CNR-IMM is one of the 12 partners. The aim of ANNA is to establish strong, long-term collaboration among the partners, so as to form an integrated multi-site analytical facility, able to offer to the European community a wide variety of top-level analytical expertise and services in the field of micro- and nano-electronics. They include X-ray diffraction and scattering, SIMS, electron microscopy, medium-energy ion scattering, optical and electrical techniques. The project will be focused on three main activities: Networking (standardization of samples and methodologies, establishment of accredited reference laboratories), Transnational Access to laboratories located in the partners' premises to perform specific analytical experiments (an example is given by the two STEM methodologies discussed above) and Joint Research activity, which is targeted at the improvement and extension of the methodologies through a continuous instrumental and technical development. It is planned that the European joint analytical laboratory will continue its activity beyond the end of the project in 2010.

  7. A System Analysis for Determining Alternative Technological Issues for the Future

    NASA Technical Reports Server (NTRS)

    Magistrale, V. J.; Small, J.

    1967-01-01

    A systems engineering methodology is provided, by which future technological ventures may be examined utilizing particular national, corporate, or individual value judgments. Three matrix analyses are presented. The first matrix is concerned with the effect of technology on population increase, war, poverty, health, resources, and prejudice. The second matrix explores an analytical technique for determining the relative importance of different areas of technology. The third matrix explores how an individual or corporate entity may determine how its capability may be used for future technological opportunities. No conclusions are presented since primary effort has been placed on the methodology of determining future technological issues.

  8. New methodology for the analysis of volatile organic compounds (VOCs) in bioethanol by gas chromatography coupled to mass spectrometry

    NASA Astrophysics Data System (ADS)

    Campos, M. S. G.; Sarkis, J. E. S.

    2018-03-01

    The present study presents a new analytical methodology for the determination of 11 compounds present in ethanol samples by gas chromatography coupled to mass spectrometry (GC-MS), using a medium polarity chromatography column composed of 6% cyanopropyl-phenyl and 94% dimethyl polysiloxane. The validation parameters were determined according to NBR ISO 17025:2005. The recovery rates of the studied compounds were 100.4% to 114.7%. The limits of quantification are between 2.4 mg.kg-1 and 5.8 mg.kg-1. The measurement uncertainty was estimated at approximately 8%.

  9. Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of data base management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.

  10. Research in health sciences library and information science: a quantitative analysis.

    PubMed Central

    Dimitroff, A

    1992-01-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas. PMID:1422504

  11. Analysis of low molecular weight metabolites in tea using mass spectrometry-based analytical methods.

    PubMed

    Fraser, Karl; Harrison, Scott J; Lane, Geoff A; Otter, Don E; Hemar, Yacine; Quek, Siew-Young; Rasmussen, Susanne

    2014-01-01

    Tea is the second most consumed beverage in the world after water and there are numerous reported health benefits as a result of consuming tea, such as reducing the risk of cardiovascular disease and many types of cancer. Thus, there is much interest in the chemical composition of teas, for example: defining components responsible for contributing to reported health benefits; defining quality characteristics such as product flavor; and monitoring for pesticide residues to comply with food safety import/export requirements. Covered in this review are some of the latest developments in mass spectrometry-based analytical techniques for measuring and characterizing low molecular weight components of tea, in particular primary and secondary metabolites. The methodologies, more specifically the chromatography and detection mechanisms used in both targeted and non-targeted studies, and their main advantages and disadvantages, are discussed. Finally, we comment on the latest techniques that are likely to have significant benefit to analysts in the future, not merely in the area of tea research, but in the analytical chemistry of low molecular weight compounds in general.

  12. Reactor safeguards system assessment and design. Volume I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varnado, G.B.; Ericson, D.M. Jr.; Daniel, S.L.

    1978-06-01

    This report describes the development and application of a methodology for evaluating the effectiveness of nuclear power reactor safeguards systems. Analytic techniques are used to identify the sabotage acts which could lead to release of radioactive material from a nuclear power plant, to determine the areas of a plant which must be protected to assure that significant release does not occur, to model the physical plant layout, and to evaluate the effectiveness of various safeguards systems. The methodology was used to identify those aspects of reactor safeguards systems which have the greatest effect on overall system performance and which, therefore, should be emphasized in the licensing process. With further refinements, the methodology can be used by the licensing reviewer to aid in assessing proposed or existing safeguards systems.

  13. Big Data Analytics Methodology in the Financial Industry

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  14. Advanced analytical techniques for the extraction and characterization of plant-derived essential oils by gas chromatography with mass spectrometry.

    PubMed

    Waseem, Rabia; Low, Kah Hin

    2015-02-01

    In recent years, essential oils have received growing interest because of the positive health effects of their characteristics such as antibacterial, antifungal, and antioxidant activities. For the extraction of plant-derived essential oils, there is a need for advanced analytical techniques and innovative methodologies. An exhaustive study of hydrodistillation, supercritical fluid extraction, ultrasound- and microwave-assisted extraction, solid-phase microextraction, pressurized liquid extraction, pressurized hot water extraction, liquid-liquid extraction, liquid-phase microextraction, matrix solid-phase dispersion, and gas chromatography (one- and two-dimensional) hyphenated with mass spectrometry for the extraction of essential oils from various plant species and their analysis is provided in this review. Essential oils are composed mainly of terpenes and terpenoids with low-molecular-weight aromatic and aliphatic constituents that are particularly important for public health. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Image charge models for accurate construction of the electrostatic self-energy of 3D layered nanostructure devices.

    PubMed

    Barker, John R; Martinez, Antonio

    2018-04-04

    Efficient analytical image charge models are derived for the full spatial variation of the electrostatic self-energy of electrons in semiconductor nanostructures that arises from dielectric mismatch using semi-classical analysis. The methodology provides a fast, compact and physically transparent computation for advanced device modeling. The underlying semi-classical model for the self-energy has been established and validated during recent years and depends on a slight modification of the macroscopic static dielectric constants for individual homogeneous dielectric regions. The model has been validated for point charges as close as one interatomic spacing to a sharp interface. A brief introduction to image charge methodology is followed by a discussion and demonstration of the traditional failure of the methodology to derive the electrostatic potential at arbitrary distances from a source charge. However, the self-energy involves the local limit of the difference between the electrostatic Green functions for the full dielectric heterostructure and the homogeneous equivalent. It is shown that high convergence may be achieved for the image charge method for this local limit. A simple re-normalisation technique is introduced to reduce the number of image terms to a minimum. A number of progressively complex 3D models are evaluated analytically and compared with high precision numerical computations. Accuracies of 1% are demonstrated. Introducing a simple technique for modeling the transition of the self-energy between disparate dielectric structures we generate an analytical model that describes the self-energy as a function of position within the source, drain and gated channel of a silicon wrap round gate field effect transistor on a scale of a few nanometers cross-section. At such scales the self-energies become large (typically up to ~100 meV) close to the interfaces as well as along the channel. 
The screening of a gated structure is shown to reduce the self-energy relative to un-gated nanowires.
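
    The classical image-charge construction that this methodology builds on can be sketched for the simplest geometry: a point charge at distance d from a single planar dielectric interface. The snippet below is a textbook illustration with illustrative material parameters, not the paper's renormalised multilayer model:

```python
import math

EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
E = 1.602176634e-19      # elementary charge, C

def image_self_energy(d, eps1, eps2, q=E):
    """Self-energy (J) of a charge q in a medium of relative permittivity
    eps1 at distance d (m) from a half-space of permittivity eps2.
    The image charge q' = q*(eps1 - eps2)/(eps1 + eps2) sits at distance
    2*d behind the interface; the self-energy is half the interaction
    energy of q with its image."""
    q_img = q * (eps1 - eps2) / (eps1 + eps2)
    return q_img * q / (16.0 * math.pi * EPS0 * eps1 * d)

# Electron in silicon (eps ~ 11.7) 1 nm from an oxide region (eps ~ 3.9):
U = image_self_energy(1e-9, 11.7, 3.9)
print(U / E * 1000, "meV")  # positive: the lower-permittivity region repels
```

    The positive sign for eps1 > eps2 reproduces the qualitative behaviour described in the abstract: the self-energy rises as a carrier approaches a lower-permittivity interface, reaching tens of meV at nanometre distances.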

  17. Environmental Quality Standards Research on Wastewaters of Army Ammunition Plants

    DTIC Science & Technology

    1978-06-01

    characterization of nitrocellulose wastewaters. We are grateful to LTC Leroy H. Reuter and LTC Robert P. Carnahan of the US Army Medical Research and...analytical methodology was required to characterize the wastes. The techniques used for fingerprinting (showing that the same compound exists although its...examination of the NC wastewaters has somewhat clarified the problem of characterizing the NC and has caused us to change or modify previous

  18. 2018 Ground Robotics Capabilities Conference and Exhibition

    DTIC Science & Technology

    2018-04-11

    Transportable Robot System (MTRS) Inc 1 Non -standard Equipment (approved) Explosive Ordnance Disposal Common Robotic System-Heavy (CRS-H) Inc 1 AROC: 3-Star...and engineering • AI risk mitigation methodologies and techniques are at best immature – E.g., V&V; Probabilistic software analytics; code level...controller to minimize potential UxS mishaps and unauthorized Command and Control (C2). • PSP-10 – Ensure that software systems which exhibit non

  19. Chemical speciation of individual airborne particles by the combined use of quantitative energy-dispersive electron probe X-ray microanalysis and attenuated total reflection Fourier transform-infrared imaging techniques.

    PubMed

    Song, Young-Chul; Ryu, JiYeon; Malek, Md Abdul; Jung, Hae-Jin; Ro, Chul-Un

    2010-10-01

    In our previous work, it was demonstrated that the combined use of attenuated total reflectance (ATR) FT-IR imaging and quantitative energy-dispersive electron probe X-ray microanalysis (ED-EPMA), named low-Z particle EPMA, had the potential for characterization of individual aerosol particles. Additionally, the speciation of individual mineral particles was performed on a single particle level by the combined use of the two techniques, demonstrating that simultaneous use of the two single particle analytical techniques is powerful for the detailed characterization of externally heterogeneous mineral particle samples and has great potential for characterization of atmospheric mineral dust aerosols. These single particle analytical techniques provide complementary information on the physicochemical characteristics of the same individual particles: low-Z particle EPMA on morphology and elemental concentrations, and ATR-FT-IR imaging on molecular species, crystal structures, functional groups, and physical states. In this work, this analytical methodology was applied to characterize an atmospheric aerosol sample collected in Incheon, Korea. Overall, 118 individual particles were observed to be primarily NaNO3-containing, Ca- and/or Mg-containing, silicate, and carbonaceous particles, although the internal mixing states of the individual particles proved complicated. This work demonstrates that more detailed physicochemical properties of individual airborne particles can be obtained using this approach than when either the low-Z particle EPMA or ATR-FT-IR imaging technique is used alone.

  20. Trends in health sciences library and information science research: an analysis of research publications in the Bulletin of the Medical Library Association and Journal of the Medical Library Association from 1991 to 2007.

    PubMed

    Gore, Sally A; Nordberg, Judith M; Palmer, Lisa A; Piorun, Mary E

    2009-07-01

    This study analyzed trends in research activity as represented in the published research in the leading peer-reviewed professional journal for health sciences librarianship. Research articles were identified from the Bulletin of the Medical Library Association and Journal of the Medical Library Association (1991-2007). Using content analysis and bibliometric techniques, data were collected for each article on the (1) subject, (2) research method, (3) analytical technique used, (4) number of authors, (5) number of citations, (6) first author affiliation, and (7) funding source. The results were compared to a previous study, covering the period 1966 to 1990, to identify changes over time. Of the 930 articles examined, 474 (51%) were identified as research articles. Survey (n = 174, 37.1%) was the most common methodology employed, quantitative descriptive statistics (n = 298, 63.5%) the most used analytical technique, and applied topics (n = 332, 70%) the most common type of subject studied. The majority of first authors were associated with an academic health sciences library (n = 264, 55.7%). Only 27.4% (n = 130) of studies identified a funding source. This study's findings demonstrate that progress is being made in health sciences librarianship research. There is, however, room for improvement in terms of research methodologies used, proportion of applied versus theoretical research, and elimination of barriers to conducting research for practicing librarians.

  1. Methodological triangulation in a study of social support for siblings of children with cancer.

    PubMed

    Murray, J S

    1999-10-01

    Triangulation is an approach to research that is becoming increasingly popular among nurse researchers. Five types of triangulation are used in nursing research: data, methodological, theoretical, researcher, and analytical triangulation. Methodological triangulation is an attempt to improve validity by combining various techniques in one study. In this article, an example of quantitative and qualitative triangulation is discussed to illustrate the procedures used and the results achieved. The secondary data used as an example are from a previous study that was conducted by the researcher and investigated nursing interventions used by pediatric oncology nurses to provide social support to siblings of children with cancer. Results show that methodological triangulation was beneficial in this study for three reasons. First, the careful comparison of quantitative and qualitative data added support for the social support variables under investigation. Second, the comparison showed more in-depth dimensions about pediatric oncology nurses providing social support to siblings of children with cancer. Finally, the use of methodological triangulation provided insight into revisions for the quantitative instrument.

  2. Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards

    PubMed Central

    Smith, Justin D.

    2013-01-01

    This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have precluded widespread implementation and acceptance of the SCED as a viable complementary methodology to the predominant group design. This article includes a description of the research design, measurement, and analysis domains distinctive to the SCED; a discussion of the results within the framework of contemporary standards and guidelines in the field; and a presentation of updated benchmarks for key characteristics (e.g., baseline sampling, method of analysis); overall, it provides researchers and reviewers with a resource for conducting and evaluating SCED research. The results of the systematic review of 409 studies suggest that recently published SCED research is largely in accordance with contemporary criteria for experimental quality. Analytic method emerged as an area of discord. Comparison of the findings of this review with historical estimates of the use of statistical analysis indicates an upward trend, but visual analysis remains the most common analytic method and also garners the most support amongst those entities providing SCED standards. Although consensus exists along key dimensions of single-case research design and researchers appear to be practicing within these parameters, there remains a need for further evaluation of assessment and sampling techniques and data analytic methods. PMID:22845874

  3. Recent advancements in nanoelectrodes and nanopipettes used in combined scanning electrochemical microscopy techniques.

    PubMed

    Kranz, Christine

    2014-01-21

    In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique from high-resolution electrochemical imaging via nanoscale probes to large scale mapping using arrays of microelectrodes. A major driving force in advancing the SECM methodology is based on developing more sophisticated probes beyond conventional micro-disc electrodes usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes particularly enabling combinations of SECM with other analytical measurement techniques to provide information beyond exclusively measuring electrochemical sample properties. Consequently, this critical review will focus on recent progress and new developments towards multifunctional imaging.

  4. Speciated arsenic in air: measurement methodology and risk assessment considerations.

    PubMed

    Lewis, Ari S; Reid, Kim R; Pollock, Margaret C; Campleman, Sharan L

    2012-01-01

    Accurate measurement of arsenic (As) in air is critical to providing a more robust understanding of arsenic exposures and associated human health risks. Although there is extensive information available on total arsenic in air, less is known on the relative contribution of each arsenic species. To address this data gap, the authors conducted an in-depth review of available information on speciated arsenic in air. The evaluation included the type of species measured and the relative abundance, as well as an analysis of the limitations of current analytical methods. Despite inherent differences in the procedures, most techniques effectively separated arsenic species in the air samples. Common analytical techniques such as inductively coupled plasma mass spectrometry (ICP-MS) and/or hydride generation (HG)- or quartz furnace (GF)-atomic absorption spectrometry (AAS) were used for arsenic measurement in the extracts, and provided some of the most sensitive detection limits. The current analysis demonstrated that, despite limited comparability among studies due to differences in seasonal factors, study duration, sample collection methods, and analytical methods, research conducted to date is adequate to show that arsenic in air is mainly in the inorganic form. Reported average concentrations of As(III) and As(V) ranged up to 7.4 and 10.4 ng/m3, respectively, with As(V) being more prevalent than As(III) in most studies. Concentrations of the organic methylated arsenic compounds are negligible (in the pg/m3 range). However because of the variability in study methods and measurement methodology, the authors were unable to determine the variation in arsenic composition as a function of source or particulate matter (PM) fraction. In this work, the authors include the implications of arsenic speciation in air on potential exposure and risks. 
The authors conclude that it is important to synchronize sample collection, preparation, and analytical techniques in order to generate data more useful for arsenic inhalation risk assessment, and a more robust documentation of quality assurance/quality control (QA/QC) protocols is necessary to ensure accuracy, precision, representativeness, and comparability.

  5. 2017 Workplace and Gender Relations Survey of Reserve Component Members: Statistical Methodology Report

    DTIC Science & Technology

    2018-04-30

    2017 Workplace and Gender Relations Survey of Reserve Component Members Statistical Methodology Report Additional copies of this report...Survey of Reserve Component Members Statistical Methodology Report Office of People Analytics (OPA) 4800 Mark Center Drive, Suite...RESERVE COMPONENT MEMBERS STATISTICAL METHODOLOGY REPORT Introduction The Office of People Analytics’ Center for Health and Resilience (OPA[H&R

  6. Modeling of classical swirl injector dynamics

    NASA Astrophysics Data System (ADS)

    Ismailov, Maksud M.

    The knowledge of the dynamics of a swirl injector is crucial in designing a stable liquid rocket engine. Since the swirl injector is a complex fluid flow device in itself, not much work has been conducted to describe its dynamics either analytically or by using computational fluid dynamics techniques. Even experimental observation has been limited to date. Thus far, there exists an analytical linear theory by Bazarov [1], which is based on long-wave disturbances traveling on the free surface of the injector core. This theory does not account for variation of the nozzle reflection coefficient as a function of disturbance frequency, and yields a response function which is strongly dependent on the so-called artificial viscosity factor. This causes uncertainty in designing an injector for the given operational combustion instability frequencies in the rocket engine. In this work, the author has studied alternative techniques to describe the swirl injector response, both analytically and computationally. In the analytical part, by using linear small perturbation analysis, the entire phenomenon of unsteady flow in swirl injectors is dissected into fundamental components: the phenomena of disturbance wave refraction and reflection, and vortex chamber resonance. This reveals the nature of the flow instability and the driving factors leading to maximum injector response. In the computational part, by employing the nonlinear boundary element method (BEM), the author sets the boundary conditions such that they closely simulate those in the analytical part. The simulation results then show distinct peak responses at frequencies that are coincident with the resonant frequencies predicted in the analytical part. Moreover, a cold flow test of the injector related to this study also shows a clear growth of instability with its maximum amplitude at the first fundamental frequency predicted both by analytical methods and BEM. 
It shall be noted however that Bazarov's theory does not predict the resonant peaks. Overall this methodology provides clearer understanding of the injector dynamics compared to Bazarov's. Even though the exact value of response is not possible to obtain at this stage of theoretical, computational, and experimental investigation, this methodology sets the starting point from where the theoretical description of reflection/refraction, resonance, and their interaction between each other may be refined to higher order to obtain its more precise value.

  7. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
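
    The prescribed statistical structure — an analytical failure model evaluated under parameter uncertainty to produce a failure-probability estimate — can be illustrated with a minimal Monte Carlo sketch. The crack-growth model and every distribution below are hypothetical stand-ins, not the PFA software or its calibrated inputs:

```python
import math, random

def cycles_to_failure(C, m, a0, ac, k):
    """Cycles for a crack to grow from a0 to ac (m) under the Paris law
    da/dN = C * (dK)^m, with stress-intensity range dK = k * sqrt(a)."""
    p = 1.0 - m / 2.0  # closed-form integral of dN = da / (C * (k*sqrt(a))^m)
    return (ac**p - a0**p) / (p * C * k**m)

def failure_probability(service_cycles, n_samples=20000, seed=1):
    """Monte Carlo estimate of P(failure before service_cycles) under
    hypothetical parameter uncertainties (illustrative values only)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        C = math.exp(rng.gauss(math.log(1e-11), 0.5))  # Paris coefficient
        m = rng.gauss(3.0, 0.1)                        # Paris exponent
        a0 = math.exp(rng.gauss(math.log(1e-4), 0.3))  # initial flaw size, m
        if cycles_to_failure(C, m, a0, 1e-2, 180.0) < service_cycles:
            failures += 1
    return failures / n_samples

print(failure_probability(1e6))  # estimated failure probability at 10^6 cycles
```

    In the PFA methodology proper, such prior distributions would additionally be updated with test and flight experience; the sketch shows only the forward propagation of parameter uncertainty into a failure probability.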

  8. Methodologies for Optimum Capital Expenditure Decisions for New Medical Technology

    PubMed Central

    Landau, Thomas P.; Ledley, Robert S.

    1980-01-01

    This study deals with the development of a theory and an analytical model to support decisions regarding capital expenditures for complex new medical technology. Formal methodologies and quantitative techniques developed by applied mathematicians and management scientists can be used by health planners to develop cost-effective plans for the utilization of medical technology on a community or region-wide basis. In order to maximize the usefulness of the model, it was developed and tested against multiple technologies. The types of technologies studied include capital and labor-intensive technologies, technologies whose utilization rates vary with hospital occupancy rate, technologies whose use can be scheduled, and limited-use and large-use technologies.

  9. Aeroservoelastic wind-tunnel investigations using the Active Flexible Wing Model: Status and recent accomplishments

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.; Perry, Boyd, III; Tiffany, Sherwood H.; Cole, Stanley R.; Buttrill, Carey S.; Adams, William M., Jr.; Houck, Jacob A.; Srinathkumar, S.; Mukhopadhyay, Vivek; Pototzky, Anthony S.

    1989-01-01

    The status of the joint NASA/Rockwell Active Flexible Wing Wind-Tunnel Test Program is described. The objectives are to develop and validate the analysis, design, and test methodologies required to apply multifunction active control technology for improving aircraft performance and stability. Major tasks include designing digital multi-input/multi-output flutter-suppression and rolling-maneuver-load alleviation concepts for a flexible full-span wind-tunnel model, obtaining an experimental data base for the basic model and each control concept and providing comparisons between experimental and analytical results to validate the methodologies. The opportunity is provided to improve real-time simulation techniques and to gain practical experience with digital control law implementation procedures.

  10. Design and fabrication of planar structures with graded electromagnetic properties

    NASA Astrophysics Data System (ADS)

    Good, Brandon Lowell

    Successfully integrating electromagnetic properties in planar structures offers numerous benefits to the microwave and optical communities. This work aims at formulating new analytic and optimized design methods, creating new fabrication techniques for achieving those methods, and matching appropriate implementation of methods to fabrication techniques. The analytic method consists of modifying an approach that realizes perfect antireflective properties from graded profiles. This method is shown for all-dielectric and magneto-dielectric grading profiles. The optimized design methods are applied to transformer (discrete) or taper (continuous) designs. From these methods, a subtractive and an additive manufacturing technique were established and are described. The additive method, dry powder dot deposition, enables three dimensionally varying electromagnetic properties in a structural composite. Combining the methods and fabrication is shown in two applied methodologies. The first uses dry powder dot deposition to design one dimensionally graded electromagnetic profiles in a planar fiberglass composite. The second simultaneously applies antireflective properties and adjusts directivity through a slab by means of subwavelength structures to achieve a flat antireflective lens. The end result of this work is a complete set of methods, formulations, and fabrication techniques to achieve integrated electromagnetic properties in planar structures.
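
    The antireflective benefit of a graded profile can be checked with a standard characteristic-matrix (transfer-matrix) calculation at normal incidence. The sketch below compares an abrupt air-to-substrate step against a linearly graded index profile; the indices, wavelength, and slice count are illustrative, not the designs of this work:

```python
import cmath, math

def reflectance(indices, thicknesses, n_in, n_sub, wavelength):
    """Normal-incidence reflectance of a layer stack via the
    characteristic (transfer) matrix method."""
    M = [[1.0, 0.0], [0.0, 1.0]]
    for n, d in zip(indices, thicknesses):
        delta = 2 * math.pi * n * d / wavelength
        layer = [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
                 [1j * n * cmath.sin(delta), cmath.cos(delta)]]
        M = [[M[0][0]*layer[0][0] + M[0][1]*layer[1][0],
              M[0][0]*layer[0][1] + M[0][1]*layer[1][1]],
             [M[1][0]*layer[0][0] + M[1][1]*layer[1][0],
              M[1][0]*layer[0][1] + M[1][1]*layer[1][1]]]
    B = M[0][0] + M[0][1] * n_sub
    C = M[1][0] + M[1][1] * n_sub
    r = (n_in * B - C) / (n_in * B + C)
    return abs(r) ** 2

wl = 10e-3     # 10 mm wavelength, illustrative microwave scale
n_sub = 2.0    # illustrative substrate index
# Abrupt air-to-substrate interface (no layers at all):
r_step = reflectance([], [], 1.0, n_sub, wl)
# Linearly graded profile, one wavelength thick, split into 20 slices:
N = 20
grad = [1.0 + (n_sub - 1.0) * (i + 0.5) / N for i in range(N)]
r_graded = reflectance(grad, [wl / N] * N, 1.0, n_sub, wl)
print(r_step, r_graded)  # the graded profile reflects far less
```

    Making the graded region thicker relative to the wavelength drives the reflectance down further, which is the design freedom graded-profile methods exploit.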

  11. Aeroelastic optimization methodology for viscous and turbulent flows

    NASA Astrophysics Data System (ADS)

    Barcelos Junior, Manuel Nascimento Dias

    2007-12-01

    In recent years, the development of faster computers and parallel processing allowed the application of high-fidelity analysis methods to the aeroelastic design of aircraft. However, these methods are restricted to the final design verification, mainly due to the computational cost involved in iterative design processes. Therefore, this work is concerned with the creation of a robust and efficient aeroelastic optimization methodology for inviscid, viscous and turbulent flows by using high-fidelity analysis and sensitivity analysis techniques. Most of the research in aeroelastic optimization, for practical reasons, treats the aeroelastic system as a quasi-static inviscid problem. In this work, as a first step toward the creation of a more complete aeroelastic optimization methodology for realistic problems, an analytical sensitivity computation technique was developed and tested for quasi-static aeroelastic viscous and turbulent flow configurations. Viscous and turbulent effects are included by using an averaged discretization of the Navier-Stokes equations, coupled with an eddy viscosity turbulence model. For quasi-static aeroelastic problems, the traditional staggered solution strategy has unsatisfactory performance when applied to cases where there is a strong fluid-structure coupling. Consequently, this work also proposes a solution methodology for aeroelastic and sensitivity analyses of quasi-static problems, which is based on the fixed point of an iterative nonlinear block Gauss-Seidel scheme. The methodology can also be interpreted as the solution of the Schur complement of the aeroelastic and sensitivity analyses linearized systems of equations. The methodologies developed in this work are tested and verified by using realistic aeroelastic systems.
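
    The fixed-point strategy — alternately solving one field's block using the latest state of the other until the coupled state stops changing — can be shown on a toy scalar system. Both response functions below are invented for illustration and merely stand in for the actual fluid and structural solves:

```python
import math

def gauss_seidel_coupled(f_fluid, f_struct, x0=0.0, y0=0.0,
                         tol=1e-10, max_iter=100):
    """Nonlinear block Gauss-Seidel: update each field using the latest
    value of the other until a fixed point of the coupled system is reached."""
    x, y = x0, y0
    for it in range(max_iter):
        x_new = f_fluid(y)       # 'fluid' solve given current structural state
        y_new = f_struct(x_new)  # 'structure' solve given the new fluid state
        if abs(x_new - x) + abs(y_new - y) < tol:
            return x_new, y_new, it + 1
        x, y = x_new, y_new
    raise RuntimeError("block Gauss-Seidel did not converge")

# Toy coupling: the aerodynamic load x depends on the deflection y, and
# the deflection depends on the load (both relations hypothetical).
load = lambda y: math.cos(y)   # fluid response to deflection
deflect = lambda x: 0.5 * x    # linear structural response to load
x, y, iters = gauss_seidel_coupled(load, deflect)
print(x, y, iters)  # fixed point of x = cos(0.5 * x)
```

    The staggered strategy criticized in the abstract corresponds to stopping after one such pass per time step; iterating to the fixed point is what restores robustness under strong fluid-structure coupling.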

  12. Educational Design as Conversation: A Conversation Analytical Perspective on Teacher Dialogue

    ERIC Educational Resources Information Center

    van Kruiningen, Jacqueline F.

    2013-01-01

    The aim of this methodological paper is to expound on and demonstrate the value of conversation-analytical research in the area of (informal) teacher learning. The author discusses some methodological issues in current research on interaction in teacher learning and makes a case for conversation-analytical research on interactional processes in…

  13. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, and assessed relevant clinical endpoints, and summarized methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
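
    Of the modeling approaches listed, a Markov state-transition cohort model is the easiest to sketch. The three-state model below (progression-free, progressed, dead) uses invented transition probabilities and utility weights purely for illustration; it is not drawn from any of the reviewed MM studies:

```python
# Three-state Markov cohort model: progression-free (PF), progressed (P), dead (D).
# Hypothetical per-cycle (e.g. yearly) transition probabilities:
P = {
    "PF": {"PF": 0.80, "P": 0.15, "D": 0.05},
    "P":  {"PF": 0.00, "P": 0.75, "D": 0.25},
    "D":  {"PF": 0.00, "P": 0.00, "D": 1.00},
}
utility = {"PF": 0.85, "P": 0.60, "D": 0.0}  # hypothetical quality-of-life weights

def run_cohort(n_cycles=50):
    """Track a cohort of size 1.0 through the states, accumulating
    undiscounted life-years and quality-adjusted life-years (QALYs)."""
    cohort = {"PF": 1.0, "P": 0.0, "D": 0.0}
    life_years = qalys = 0.0
    for _ in range(n_cycles):
        life_years += cohort["PF"] + cohort["P"]
        qalys += sum(cohort[s] * utility[s] for s in cohort)
        cohort = {t: sum(cohort[s] * P[s][t] for s in P) for t in ("PF", "P", "D")}
    return life_years, qalys

ly, q = run_cohort()
print(ly, q)  # life expectancy and QALYs per patient under this toy model
```

    A real decision-analytic comparison would also discount future outcomes, attach costs, and contrast treatment strategies; the sketch shows only the core cohort bookkeeping shared by the Markov studies in the review.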

  14. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies.

    PubMed

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-10-24

    In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals allowed reaching detection limits within the pM and nM ranges. Most of these developments proved their suitability in detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies is still behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is highly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the adverse effects of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
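    The core PFA idea, propagating uncertainty about analysis parameters through an analytical failure model to obtain a failure probability, can be caricatured with a toy stress-strength simulation. The distributions below are invented for illustration and are not the engineering models documented in the report.

```python
# Toy stress-strength sketch of probabilistic failure assessment: uncertain
# parameters are sampled and propagated through a simple limit-state model
# (failure occurs when stress exceeds strength). Distributions are illustrative.
import random

random.seed(1)

def failure_probability(n=20000):
    failures = 0
    for _ in range(n):
        strength = random.gauss(100.0, 10.0)  # assumed material strength (arbitrary units)
        stress = random.gauss(70.0, 12.0)     # assumed operating stress (arbitrary units)
        if stress > strength:
            failures += 1
    return failures / n

p_f = failure_probability()
```

    In the full PFA framework these sampled distributions would then be updated against test and flight experience rather than used directly.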

  16. Blade loss transient dynamics analysis, volume 2. Task 2: Theoretical and analytical development. Task 3: Experimental verification

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    The component element method was used to develop a transient dynamic analysis computer program which is essentially based on modal synthesis combined with a central finite difference numerical integration scheme. The methodology leads to a modular or building-block technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modelling, the program's logical structure, and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.

  17. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    PubMed

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques used to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, creating algorithms, refining algorithms, and finally comparing the accuracy of the created classifiers. Partitioning the datasets allows model building and refinement to occur prior to testing the predictive accuracy of the model with naive data. Many different classification algorithms are available for predictive use, and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that are currently collected, or that could be collected, on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
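    The partition-train-refine-compare workflow described above can be sketched with a toy dataset and two deliberately simple hand-rolled classifiers; real applications would use richer algorithms and actual livestock operation data.

```python
# Sketch of the predictive analytic workflow: partition, build/refine on the
# training split, then compare classifier accuracy on held-out (naive) data.
import random

random.seed(7)

# Toy records: (feature, label); the label correlates noisily with the feature.
data = [(random.random(), None) for _ in range(200)]
data = [(x, 1 if x + random.gauss(0, 0.2) > 0.5 else 0) for x, _ in data]

random.shuffle(data)
train, test = data[:150], data[150:]        # partition before any tuning

def majority_rule(train_rows):
    """Baseline: always predict the most common training label."""
    ones = sum(y for _, y in train_rows)
    guess = 1 if ones >= len(train_rows) / 2 else 0
    return lambda x: guess

def threshold_rule(train_rows):
    """Refined: pick the feature threshold that best separates training labels."""
    best_t, best_acc = 0.5, 0.0
    for t in [i / 20 for i in range(21)]:
        acc = sum((x > t) == bool(y) for x, y in train_rows) / len(train_rows)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return lambda x: int(x > best_t)

def accuracy(clf, rows):
    return sum(clf(x) == y for x, y in rows) / len(rows)

# Compare the classifiers on the naive (held-out) partition.
results = {name: accuracy(builder(train), test)
           for name, builder in [("majority", majority_rule),
                                 ("threshold", threshold_rule)]}
```

    Because the threshold is tuned only on the training split, its accuracy on the held-out split is an honest estimate of predictive performance, which is the point of partitioning.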

  18. Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.

    PubMed

    Lo, Y C; Armbruster, David A

    2012-04-01

    Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Historical literature citation intervals are often suboptimal because they may be based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP Evaluator software program, which is based on the CLSI/IFCC C28-A guideline and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for genders combined, and gender-specific or combined-gender intervals were adopted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adopt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference interval transference technique based on a method comparison study.
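    The 95% central range mentioned above can be estimated nonparametrically as the 2.5th and 97.5th percentiles of the reference results. The sketch below uses simulated values, not the Hong Kong donor data.

```python
# Nonparametric 95% central range (reference interval) from simulated results.
import random
import statistics

random.seed(3)

# Simulated reference population for one analyte, e.g. in mmol/L.
results = [random.gauss(5.0, 0.5) for _ in range(1000)]

# quantiles(..., n=40) returns cut points at 2.5%, 5%, ..., 97.5%.
cuts = statistics.quantiles(results, n=40)
lower, upper = cuts[0], cuts[-1]   # the 95% central range
```

    For a normally distributed analyte this range approximates mean ± 1.96 standard deviations, but the percentile approach makes no distributional assumption, which matters for skewed analytes.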

  19. Model Analytical Development for Physical, Chemical, and Biological Characterization of Momordica charantia Vegetable Drug

    PubMed Central

    Guimarães, Geovani Pereira; Santos, Ravely Lucena; Júnior, Fernando José de Lima Ramos; da Silva, Karla Monik Alves; de Souza, Fabio Santos

    2016-01-01

    Momordica charantia is a species cultivated throughout the world and widely used in folk medicine; its medicinal benefits, especially its pharmacological properties, including antimicrobial activities, are well documented. Analytical methods have been used to aid in the characterization of compounds derived from plant drug extracts and their products. This paper develops a methodological model to evaluate the integrity of the vegetable drug M. charantia at different particle sizes, using different analytical methods. M. charantia was collected in the semiarid region of Paraíba, Brazil. The herbal medicine raw material derived from the leaves and fruits, in different particle sizes, was analyzed using thermoanalytical techniques such as thermogravimetry (TG) and differential thermal analysis (DTA), pyrolysis coupled to gas chromatography/mass spectrometry (PYR-GC/MS), and nuclear magnetic resonance (1H NMR), in addition to determination of antimicrobial activity. The techniques differentiated the samples according to their different particle surface areas. DTA and TG were used to assess thermal and kinetic parameters, and PYR-GC/MS was used for chromatographic identification of degradation products through the pyrograms. The infusions obtained from the fruit and leaves of Momordica charantia presented antimicrobial activity. PMID:27579215

  20. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and its hallmark will be 'team science'.

  1. Bio-analytical applications of microbial fuel cell-based biosensors for onsite water quality monitoring.

    PubMed

    ElMekawy, A; Hegab, H M; Pant, D; Saint, C P

    2018-01-01

    Globally, sustainable provision of high-quality safe water is a major challenge of the 21st century. Various chemical and biological monitoring techniques are presently utilized to guarantee the availability of high-quality water. However, these techniques still face challenges including high costs, complex design, and onsite and online limitations. The recent technology of microbial fuel cell (MFC)-based biosensors holds outstanding potential for the rapid and real-time monitoring of water source quality. MFCs have the advantages of simplicity in design and efficiency for onsite sensing. Although some sensing applications of MFCs, e.g. biochemical oxygen demand sensors, have previously been studied, numerous research groups around the world have recently presented new practical applications of this technique, combining multidisciplinary scientific knowledge from the materials science, microbiology and electrochemistry fields. This review presents the most updated research on the utilization of MFCs as potential biosensors for monitoring water quality and considers the range of potentially toxic analytes that have so far been detected using this methodology. The advantages of MFCs over established technology are also considered, as well as future work required to establish their routine use. © 2017 The Society for Applied Microbiology.

  2. Insights from two industrial hygiene pilot e-cigarette passive vaping studies.

    PubMed

    Maloney, John C; Thompson, Michael K; Oldham, Michael J; Stiff, Charles L; Lilly, Patrick D; Patskan, George J; Shafer, Kenneth H; Sarkar, Mohamadi A

    2016-01-01

    While several reports have been published that used research methods to estimate nonusers' exposure risk to e-cigarette vapors, only two have directly measured indoor air concentrations from vaping using validated industrial hygiene sampling methodology. Our first study was designed to measure indoor air concentrations of nicotine, menthol, propylene glycol, glycerol, and total particulates during the use of multiple e-cigarettes in a well-characterized room over a period of time. Our second study was a repeat of the first study, and it also evaluated levels of formaldehyde. Measurements were collected using active sampling, near real-time, and direct measurement techniques. Air sampling incorporated industrial hygiene sampling methodology using analytical methods established by the National Institute for Occupational Safety and Health and the Occupational Safety and Health Administration. Active samples were collected over a 12-hr period, for 4 days. Background measurements were taken in the same room the day before and the day after vaping. Panelists (n = 185 Study 1; n = 145 Study 2) used menthol and non-menthol MarkTen prototype e-cigarettes. Vaping sessions (six, 1-hr) included 3 prototypes, with the total number of puffs ranging from 36-216 per session. Results of the active samples were below the limit of quantitation of the analytical methods. Near real-time data were below the lowest concentration on the established calibration curves. Data from this study indicate that the majority of chemical constituents sampled were below quantifiable levels. Formaldehyde was detected at consistent levels during all sampling periods. These two studies found that indoor vaping of the MarkTen prototype e-cigarette does not produce chemical constituents at quantifiable levels above background when measured using standard industrial hygiene collection techniques and analytical methods.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    A new field of research, visual analytics, has recently been introduced. This has been defined as "the science of analytical reasoning facilitated by visual interfaces." Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies to help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.

  4. Reported credibility techniques in higher education evaluation studies that use qualitative methods: A research synthesis.

    PubMed

    Liao, Hongjing; Hitchcock, John

    2018-06-01

    This synthesis study examined the reported use of credibility techniques in higher education evaluation articles that use qualitative methods. The sample included 118 articles published in six leading higher education evaluation journals from 2003 to 2012. Mixed methods approaches were used to identify key credibility techniques reported across the articles, document the frequency of these techniques, and describe their use and properties. Two broad sets of techniques were of interest: primary (i.e., basic) design techniques, such as sampling/participant recruitment strategies, data collection methods, and analytic details, and additional qualitative credibility techniques (e.g., member checking, negative case analyses, peer debriefing). The majority of evaluation articles reported use of primary techniques, although there was wide variation in the amount of supporting detail; most of the articles did not describe the use of additional credibility techniques. This suggests that editors of evaluation journals should encourage the reporting of qualitative design details and that authors should develop strategies yielding fuller methodological description. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Spectral radiation analyses of the GOES solar illuminated hexagonal cell scan mirror back

    NASA Technical Reports Server (NTRS)

    Fantano, Louis G.

    1993-01-01

    A ray tracing analytical tool has been developed for the simulation of spectral radiation exchange in complex systems. Algorithms account for heat source spectral energy, surface directional radiation properties, and surface spectral absorptivity properties. This tool has been used to calculate the effective solar absorptivity of the Geostationary Operational Environmental Satellite (GOES) scan mirror in the calibration position. The development and design of the Sounder and Imager instruments on board GOES are reviewed, and the problem of calculating the effective solar absorptivity associated with the GOES hexagonal cell configuration is presented. The analytical methodology, based on the Monte Carlo ray tracing technique, is described, and results are presented and verified by experimental measurements for selected solar incidence angles.
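    The Monte Carlo idea behind such an effective-absorptivity calculation can be caricatured with a geometry-free toy model: each ray is absorbed at a wall hit with probability equal to the surface absorptivity, and otherwise has some chance of being reflected out of the cavity. The probabilities below are invented and are not the GOES mirror model.

```python
# Toy Monte Carlo sketch: rays bounce in a cavity until absorbed or escaped.
# The effective absorptivity exceeds the surface absorptivity because
# reflected rays can be re-absorbed on later bounces (cavity trapping).
import random

random.seed(5)

def effective_absorptivity(alpha, p_escape, n_rays=100000):
    absorbed = 0
    for _ in range(n_rays):
        while True:
            if random.random() < alpha:      # absorbed at this wall hit
                absorbed += 1
                break
            if random.random() < p_escape:   # reflected out of the cavity
                break
    return absorbed / n_rays

a_eff = effective_absorptivity(alpha=0.2, p_escape=0.5)
```

    For these assumed numbers the analytic answer is alpha / (1 - (1 - alpha)(1 - p_escape)) = 1/3, so the simulation should converge near 0.333, illustrating why a cellular mirror back absorbs more than a flat surface of the same coating.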

  6. Marine anthropogenic radiotracers in the Southern Hemisphere: New sampling and analytical strategies

    NASA Astrophysics Data System (ADS)

    Levy, I.; Povinec, P. P.; Aoyama, M.; Hirose, K.; Sanchez-Cabeza, J. A.; Comanducci, J.-F.; Gastaud, J.; Eriksson, M.; Hamajima, Y.; Kim, C. S.; Komura, K.; Osvath, I.; Roos, P.; Yim, S. A.

    2011-04-01

    The Japan Agency for Marine Earth Science and Technology conducted in 2003-2004 the Blue Earth Global Expedition (BEAGLE2003) around the Southern Hemisphere Oceans, which was a rare opportunity to collect many seawater samples for anthropogenic radionuclide studies. We describe here sampling and analytical methodologies based on radiochemical separations of Cs and Pu from seawater, as well as radiometric and mass spectrometry measurements. Several laboratories took part in radionuclide analyses using different techniques. The intercomparison exercises and analyses of certified reference materials showed a reasonable agreement between the participating laboratories. The obtained data on the distribution of 137Cs and plutonium isotopes in seawater represent the most comprehensive results available for the Southern Hemisphere Oceans.

  7. Analysis of biologically active oxyprenylated phenylpropanoids in Tea tree oil using selective solid-phase extraction with UHPLC-PDA detection.

    PubMed

    Scotti, Luca; Genovese, Salvatore; Bucciarelli, Tonino; Martini, Filippo; Epifano, Francesco; Fiorito, Serena; Preziuso, Francesca; Taddeo, Vito Alessandro

    2018-05-30

    An efficient analytical strategy is reported, based on different extraction methods for biologically active, naturally occurring oxyprenylated umbelliferone and ferulic acid derivatives (7-isopentenyloxycoumarin, auraptene, umbelliprenin, boropinic acid, and 4'-geranyloxyferulic acid) and quantification by UHPLC with spectrophotometric (UV/Vis) detection from Tea tree oil. Absorption of the pure oil on Al2O3 (Brockmann activity II), followed by washing of the resulting solid with MeOH and treatment of the latter with CH2Cl2, proved to be the best extraction methodology in terms of yields of oxyprenylated secondary metabolites. Among the five O-prenylphenylpropanoids under investigation, auraptene and umbelliprenin were never detected, while 4'-geranyloxyferulic acid was the most abundant compound in all three extraction methods employed. The UHPLC analytical methodology set up in the present study proved to be an effective and versatile technique for the simultaneous characterization and quantification of prenyloxyphenylpropanoids in Tea tree oil, and it is applicable to other complex matrices from the plant kingdom. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Gas chromatography coupled to tunable pulsed glow discharge time-of-flight mass spectrometry for environmental analysis.

    PubMed

    Solà-Vázquez, Auristela; Lara-Gonzalo, Azucena; Costa-Fernández, José M; Pereiro, Rosario; Sanz-Medel, Alfredo

    2010-05-01

    A tuneable microsecond-pulsed direct current glow discharge (GD) time-of-flight mass spectrometer (MS(TOF)) developed in our laboratory was coupled to a gas chromatograph (GC) to obtain sequential collection of the mass spectra, at the different temporal regimes occurring in the GD pulses, during elution of the analytes. The capabilities of this set-up were explored using a mixture of volatile organic compounds of environmental concern: BrClCH, Cl(3)CH, Cl(4)C, BrCl(2)CH, Br(2)ClCH, Br(3)CH. The experimental parameters of the GC-pulsed GD-MS(TOF) prototype were optimized in order to appropriately separate and analyze the six selected organic compounds, and two GC carrier gases, helium and nitrogen, were evaluated. Mass spectra for all analytes were obtained in the prepeak, plateau and afterpeak temporal regimes of the pulsed GD. Helium offered the best elemental sensitivity, while nitrogen provided higher signal intensities for fragment and molecular peaks. The analytical performance characteristics were also determined for each analyte; absolute detection limits were on the order of nanograms. In a second step, headspace solid-phase microextraction (HS-SPME) was evaluated as a sample preparation and preconcentration technique for the quantification of the compounds under study, in order to achieve the analytical sensitivity required by European Union (EU) environmental legislation for trihalomethanes. The analytical figures of merit obtained using the proposed methodology showed rather good detection limits (between 2 and 13 microg L(-1), depending on the analyte). In fact, the developed methodology met the EU legislation requirements (the maximum level permitted in tap water for "total trihalomethanes" is 100 microg L(-1)). Real analyses of drinking water and river water were successfully carried out. To our knowledge, this is the first application of GC-pulsed GD-MS(TOF) to the analysis of real samples. Its ability to provide elemental, fragment, and molecular information for organic compounds is demonstrated.

  9. Simple and clean determination of tetracyclines by flow injection analysis

    NASA Astrophysics Data System (ADS)

    Rodríguez, Michael Pérez; Pezza, Helena Redigolo; Pezza, Leonardo

    2016-01-01

    An environmentally reliable analytical methodology was developed for direct quantification of tetracycline (TC) and oxytetracycline (OTC) using continuous flow injection analysis with spectrophotometric detection. The method is based on the diazo coupling reaction between the tetracyclines and diazotized sulfanilic acid in a basic medium, resulting in the formation of an intense orange azo compound that presents maximum absorption at 434 nm. Experimental design was used to optimize the analytical conditions. The proposed technique was validated over the concentration range of 1 to 40 μg mL-1, and was successfully applied to samples of commercial veterinary pharmaceuticals. The detection (LOD) and quantification (LOQ) limits were 0.40 and 1.35 μg mL-1, respectively. The samples were also analyzed by an HPLC method, and the results showed agreement with the proposed technique. The new flow injection method can be immediately used for quality control purposes in the pharmaceutical industry, facilitating monitoring in real time during the production processes of tetracycline formulations for veterinary use.
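    The abstract does not state how the LOD and LOQ were estimated; one common convention (ICH-style 3.3σ/S and 10σ/S derived from the calibration line) is sketched below with synthetic calibration data, purely for illustration.

```python
# LOD/LOQ from a least-squares calibration line using the common
# 3.3*sigma/S and 10*sigma/S convention (sigma = residual standard
# deviation of the regression, S = slope). Data are synthetic.
conc = [1, 5, 10, 20, 30, 40]                 # concentration, ug/mL
absb = [0.021, 0.100, 0.199, 0.402, 0.601, 0.799]  # absorbance

n = len(conc)
mx = sum(conc) / n
my = sum(absb) / n
slope = sum((x - mx) * (y - my) for x, y in zip(conc, absb)) / \
        sum((x - mx) ** 2 for x in conc)
intercept = my - slope * mx

# Residual standard deviation of the regression (n - 2 degrees of freedom).
resid = [y - (intercept + slope * x) for x, y in zip(conc, absb)]
sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5

lod = 3.3 * sigma / slope   # lowest detectable concentration
loq = 10 * sigma / slope    # lowest reliably quantifiable concentration
```

    Other estimators (e.g., blank-based standard deviation) give somewhat different values, which is why validation reports should state which convention was used.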

  10. Evaluation of analytical performance based on partial order methodology.

    PubMed

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify relative comparisons of objects on the basis of their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present work, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to comparison of analytical methods, or simply as a tool for optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate the analytical performance taking into account all indicators simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which are simultaneously used for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, (2) a "distance" to the reference laboratory and (3) a classification based on the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
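    The core ordinal comparison can be sketched directly: one laboratory's profile dominates another's only if it is at least as good on every indicator simultaneously, and laboratories that are each better on some indicator remain incomparable. The profiles below are invented for illustration.

```python
# Minimal partial-order comparison of laboratory profiles. Each profile is a
# tuple of indicator "distances" from the reference value (smaller = better),
# e.g. (|mean error|, sd, |skewness|). Values are invented.
labs = {
    "lab1": (0.1, 0.2, 0.1),
    "lab2": (0.3, 0.4, 0.2),
    "lab3": (0.2, 0.1, 0.3),
}

def dominates(a, b):
    """True if profile a is <= b on every indicator and strictly < on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# All ordered pairs (better, worse) in the partial order.
order = {(i, j) for i in labs for j in labs
         if i != j and dominates(labs[i], labs[j])}
```

    Here lab1 dominates lab2, while lab1 and lab3 stay incomparable because each is better on some indicator, exactly the kind of information a single linear score would throw away.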

  11. Research and development activities in unified control-structure modeling and design

    NASA Technical Reports Server (NTRS)

    Nayak, A. P.

    1985-01-01

    Results of work sponsored by JPL and other organizations to develop a unified control/structure modeling and design capability for large space structures are presented. Recent analytical results demonstrate the significant interdependence between structural and control properties. A new design methodology is suggested in which the structure, material properties, dynamic model, and control design are all optimized simultaneously. The development of a methodology for global design optimization is recommended as a long-term goal. It is suggested that this methodology be incorporated into computer-aided engineering programs, which eventually will be supplemented by an expert system to aid design optimization. Recommendations are also presented for near-term research activities at JPL. The key recommendation is to continue the development of integrated dynamic modeling/control design techniques, with special attention given to the development of structural models specially tailored to support design.

  12. [Evaluation of the methods for the determination of nitrites in baby foods according Mexican legislation].

    PubMed

    Morales Guerrero, Josefina C; García Zepeda, Rodrigo A; Flores Ruvalcaba, Edgar; Martínez Michel, Lorelei

    2012-09-01

    We evaluated the two methods accepted by the Mexican norm for the determination of nitrites in infant meat-based food with vegetables. We determined the content of nitrites in the infant food, in raw materials, and in products from the intermediate stages of production. A reagent blank and a reference sample were included in each analytical run. In addition, we determined the sensitivity, recovery percentage, and accuracy of each methodology. The infant food results indicated an important difference in the nitrite content determined under each methodology, due to the persistent presence of turbidity in the extracts. Different treatments were proposed to eliminate the turbidity, but these only managed to reduce it. The turbidity was attributed to carbohydrates, whose concentrations exhibited wide dispersion and were below the quantifiable limit under both methodologies; it is therefore not recommended to apply these techniques to food suspected to contain traces of nitrites.

  13. Ultrasound-assisted leaching-dispersive solid-phase extraction followed by liquid-liquid microextraction for the determination of polybrominated diphenyl ethers in sediment samples by gas chromatography-tandem mass spectrometry.

    PubMed

    Fontana, Ariel R; Lana, Nerina B; Martinez, Luis D; Altamirano, Jorgelina C

    2010-06-30

    Ultrasound-assisted leaching-dispersive solid-phase extraction followed by dispersive liquid-liquid microextraction (USAL-DSPE-DLLME) has been developed as a new analytical approach for extracting, cleaning up and preconcentrating polybrominated diphenyl ethers (PBDEs) from sediment samples prior to gas chromatography-tandem mass spectrometry (GC-MS/MS) analysis. First, PBDEs were leached from the sediment samples using acetone. This extract was cleaned up by DSPE using activated silica gel as the sorbent material. After clean-up, PBDEs were preconcentrated using the DLLME technique: 1 mL of the acetone extract (disperser solvent) and 60 microL of carbon tetrachloride (extraction solvent) were added to 5 mL of ultrapure water. Several variables that govern the proposed technique were studied and optimized. Under optimum conditions, the method detection limits (MDLs) of PBDEs, calculated as three times the signal-to-noise ratio (S/N), were within the range 0.02-0.06 ng g(-1). The relative standard deviations (RSDs) for five replicates were <9.8%. The calibration graphs were linear within the concentration ranges of 0.07-1000 ng g(-1) for BDE-47, 0.09-1000 ng g(-1) for BDE-100, 0.10-1000 ng g(-1) for BDE-99 and 0.19-1000 ng g(-1) for BDE-153, and the coefficients of estimation were ≥0.9991. Validation of the methodology was carried out by the standard addition method at two concentration levels (0.25 and 1 ng g(-1)) and by comparison with a reference Soxhlet technique. Recovery values were ≥80%, demonstrating satisfactory robustness of the analytical methodology for the determination of low PBDE concentrations in sediment samples. Copyright 2010 Elsevier B.V. All rights reserved.

  14. Analytical Utility of Campylobacter Methodologies

    USDA-ARS?s Scientific Manuscript database

    The National Advisory Committee on Microbiological Criteria for Foods (NACMCF, or the Committee) was asked to address the analytical utility of Campylobacter methodologies in preparation for an upcoming United States Food Safety and Inspection Service (FSIS) baseline study to enumerate Campylobacter...

  15. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
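    The simplest of the fitting techniques named above, a least-squares linear trend over a time series, can be sketched as follows; the monthly series is synthetic.

```python
# Least-squares linear trend fit for a time series (the simplest of the
# linear/quadratic/exponential models the standard covers). Data are synthetic.
ys = [10.2, 11.1, 11.9, 13.2, 13.8, 15.1]   # e.g. one observation per month
xs = list(range(len(ys)))                    # time index 0, 1, 2, ...

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx
trend = [intercept + slope * x for x in xs]  # fitted trend line
```

    Quadratic and exponential trends fit the same way after augmenting or transforming the regressors (e.g., fitting log(y) against x for an exponential model).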

  16. Addressing Climate Change in Long-Term Water Planning Using Robust Decisionmaking

    NASA Astrophysics Data System (ADS)

    Groves, D. G.; Lempert, R.

    2008-12-01

    Addressing climate change in long-term natural resource planning is difficult because future management conditions are deeply uncertain and the range of possible adaptation options is so extensive. These conditions pose challenges to standard optimization decision-support techniques. This talk will describe a methodology called Robust Decisionmaking (RDM) that can complement more traditional analytic approaches by utilizing screening-level water management models to evaluate large numbers of strategies against a wide range of plausible future scenarios. The presentation will describe a recent application of the methodology to evaluate climate adaptation strategies for the Inland Empire Utilities Agency in Southern California. This project found that RDM can provide a useful way of addressing climate change uncertainty and identifying robust adaptation strategies.
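    One robustness metric often used in RDM-style analyses is minimax regret across scenarios; whether this particular metric was used in the Inland Empire application is not stated here, and the strategies, scenarios, and costs below are invented for illustration.

```python
# Minimax-regret sketch: for each strategy, the regret in a scenario is its
# cost minus the best achievable cost in that scenario; a robust strategy
# minimizes the worst-case regret. All numbers are invented.
costs = {                        # cost of each strategy under each scenario
    "status_quo":   {"wet": 10, "dry": 60, "very_dry": 90},
    "conservation": {"wet": 25, "dry": 35, "very_dry": 50},
    "new_supply":   {"wet": 40, "dry": 45, "very_dry": 45},
}
scenarios = ["wet", "dry", "very_dry"]

best = {s: min(costs[strat][s] for strat in costs) for s in scenarios}
regret = {strat: max(costs[strat][s] - best[s] for s in scenarios)
          for strat in costs}
robust = min(regret, key=regret.get)   # smallest worst-case regret
```

    Unlike picking the strategy that is optimal under a single "best guess" future, this screening keeps strategies that perform acceptably across all plausible scenarios, which is the point of robustness-based planning under deep uncertainty.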

  17. Advances in bioanalytical techniques to measure steroid hormones in serum.

    PubMed

    French, Deborah

    2016-06-01

    Steroid hormones are measured clinically to determine whether a patient has a pathological process occurring in the adrenal gland or other hormone-responsive organs. They are very similar in structure, making them analytically challenging to measure. Additionally, these hormones span vast concentration differences in human serum, adding to the measurement complexity. GC-MS, followed by radioimmunoassay, was the gold standard methodology for measuring steroid hormones clinically, but it was replaced by immunoassay due to ease of use. LC-MS/MS has now become a popular alternative, owing to simpler sample preparation than GC-MS requires and greater specificity and sensitivity than immunoassay. This review will discuss these methodologies and some new developments that could simplify and improve steroid hormone analysis in serum.

  18. Quantitative SIMS Imaging of Agar-Based Microbial Communities.

    PubMed

    Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V

    2018-05-01

    After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
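
    The calibration step described above, normalizing ion intensity to an external quadratic regression curve, can be sketched generically; the function names and standard values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def quadratic_calibration(densities, intensities):
    """Fit an external quadratic regression curve I = a*d^2 + b*d + c
    relating known standard surface densities to measured ion intensity."""
    return np.polyfit(densities, intensities, 2)

def intensity_to_density(coeffs, intensity):
    """Invert the quadratic calibration for one measured intensity,
    returning the physically meaningful (non-negative real) root."""
    a, b, c = coeffs
    roots = np.roots([a, b, c - intensity])
    real = roots[np.isreal(roots)].real
    valid = real[real >= 0]
    return float(valid.min()) if valid.size else float("nan")
```

    Applying the inversion pixel by pixel would turn a normalized intensity image into the kind of two-dimensional surface density image the abstract describes.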

  19. Determination of trace level genotoxic impurities in small molecule drug substances using conventional headspace gas chromatography with contemporary ionic liquid diluents and electron capture detection.

    PubMed

    Ho, Tien D; Yehl, Peter M; Chetwyn, Nik P; Wang, Jin; Anderson, Jared L; Zhong, Qiqing

    2014-09-26

    Ionic liquids (ILs) were used as a new class of diluents for the analysis of two classes of genotoxic impurities (GTIs), namely, alkyl/aryl halides and nitro-aromatics, in small molecule drug substances by headspace gas chromatography (HS-GC) coupled with electron capture detection (ECD). This novel approach using ILs as contemporary diluents greatly broadens the applicability of HS-GC for the determination of high-boiling (≥ 130°C) analytes, including GTIs, with limits of detection (LOD) ranging from 5 to 500 parts-per-billion (ppb) of analyte in a drug substance. This represents up to a tens-of-thousands-fold improvement over traditional HS-GC diluents such as dimethyl sulfoxide (DMSO) and dimethylacetamide (DMAC). Various ILs were screened to determine their suitability as diluents for the HS-GC/ECD analysis. Increasing the HS oven temperatures resulted in varying responses for alkyl/aryl halides and a significant increase in response for all nitro-aromatic GTIs. Linear ranges of up to five orders of magnitude were found for a number of analytes. The technique was validated on two active pharmaceutical ingredients with excellent recovery. This simple and robust methodology offers a key advantage in the ease of method transfer from development laboratories to quality-control environments, since conventional validated chromatographic data systems and GC instruments can be used. For many analytes, it is a cost-effective alternative to more complex trace analytical methodologies such as LC/MS and GC/MS, and it significantly reduces the training needed for operation. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Candidate substances for space bioprocessing methodology and data specification for benefit evaluation

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Analytical and quantitative economic techniques are applied to the evaluation of the economic benefits of a wide range of substances for space bioprocessing. On the basis of expected clinical applications, as well as the size of the patient population that could be affected by the clinical applications, eight substances are recommended for further benefit evaluation. Results show that a transitional probability methodology can be used to model at least one clinical application for each of these substances. In each recommended case, the disease and its therapy are sufficiently well understood and documented, and the statistical data is available to operate the model and produce estimates of the impact of new therapy systems on the cost of treatment, morbidity, and mortality. Utilizing the morbidity and mortality information produced by the model, a standard economic technique called the Value of Human Capital is used to estimate the social welfare benefits that could be attributable to the new therapy systems.

  1. Resistance Curves in the Tensile and Compressive Longitudinal Failure of Composites

    NASA Technical Reports Server (NTRS)

    Camanho, Pedro P.; Catalanotti, Giuseppe; Davila, Carlos G.; Lopes, Claudio S.; Bessa, Miguel A.; Xavier, Jose C.

    2010-01-01

    This paper presents a new methodology to measure the crack resistance curves associated with fiber-dominated failure modes in polymer-matrix composites. These crack resistance curves not only characterize the fracture toughness of the material, but are also the basis for the identification of the parameters of the softening laws used in the analytical and numerical simulation of fracture in composite materials. The method proposed is based on the identification of the crack tip location by the use of Digital Image Correlation and the calculation of the J-integral directly from the test data using a simple expression derived for cross-ply composite laminates. It is shown that the results obtained using the proposed methodology yield crack resistance curves similar to those obtained using FEM-based methods in compact tension carbon-epoxy specimens. However, it is also shown that the Digital Image Correlation based technique can be used to extract crack resistance curves in compact compression tests for which FEM-based techniques are inadequate.

  2. Performance-based, cost- and time-effective pcb analytical methodology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alvarado, J. S.

    1998-06-11

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval.

  3. SAN RAFAEL PRIMITIVE AREA, CALIFORNIA.

    USGS Publications Warehouse

    Gower, H.D.

    1984-01-01

    No mineral-resource potential was identified during studies of the San Rafael Primitive Area, located at the southern end of the Coast Ranges of California. No petroleum has been produced from the area and there is little promise for the occurrence of energy resources. Limestone occurs in the area but is also found in abundance outside the area. Inasmuch as sampling and analytical techniques have improved significantly since this study was completed, a restudy of the area using new methodology is possibly warranted.

  4. Problem Definition Study on Techniques and Methodologies for Evaluating the Chemical and Toxicological Properties of Combustion Products of Gun Systems. Volume 1.

    DTIC Science & Technology

    1988-03-01

    methods that can resolve the various compounds are required. This chapter specifically focuses on analytical and sampling methodology used to determine...Salmonella typhimurium TA1538. Cancer Res. 35:2461-2468. Huy, N. D., R. Belleau, and P. E. Roy. 1975. Toxicity of marijuana and tobacco smoking in the...

  5. Dynamic Rod Worth Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, Y.A.; Chapman, D.M.; Hill, D.J.

    2000-12-15

    The dynamic rod worth measurement (DRWM) technique is a method of quickly validating the predicted bank worth of control rods and shutdown rods. The DRWM analytic method is based on three-dimensional, space-time kinetic simulations of the rapid rod movements. Its measurement data is processed with an advanced digital reactivity computer. DRWM has been used as the method of bank worth validation at numerous plant startups with excellent results. The process and methodology of DRWM are described, and the measurement results of using DRWM are presented.
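
    A digital reactivity computer of the kind mentioned rests on the inverse point-kinetics equations, which recover reactivity from a measured flux trace. The sketch below is a generic illustration of that principle, not the DRWM method itself, and the delayed-neutron constants are representative values assumed for the example:

```python
import numpy as np

# Representative six-group delayed-neutron data (illustrative values only)
BETA_I = np.array([2.1e-4, 1.4e-3, 1.3e-3, 2.6e-3, 7.5e-4, 2.7e-4])
LAMBDA_I = np.array([0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01])  # decay, 1/s
LAMBDA_GEN = 2.0e-5  # prompt-neutron generation time (s)

def inverse_point_kinetics(times, flux):
    """Estimate reactivity rho(t) from a measured flux trace using the
    inverse point-kinetics equations (the principle behind a digital
    reactivity computer)."""
    times = np.asarray(times, float)
    flux = np.asarray(flux, float)
    beta = BETA_I.sum()
    # Start precursors at their steady-state values for the initial flux
    C = BETA_I * flux[0] / (LAMBDA_GEN * LAMBDA_I)
    rho = np.zeros(len(times))
    dndt = np.gradient(flux, times)
    for k in range(len(times)):
        rho[k] = beta + (LAMBDA_GEN / flux[k]) * (dndt[k] - LAMBDA_I @ C)
        if k + 1 < len(times):
            dt = times[k + 1] - times[k]
            # Forward-Euler update of the precursor balance equations
            C = C + dt * (BETA_I * flux[k] / LAMBDA_GEN - LAMBDA_I * C)
    return rho
```

    For a steady flux the precursors stay at equilibrium and the computed reactivity is zero, which is the usual sanity check for such a computer.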

  6. Near-infrared spectroscopy for the detection and quantification of bacterial contaminations in pharmaceutical products.

    PubMed

    Quintelas, Cristina; Mesquita, Daniela P; Lopes, João A; Ferreira, Eugénio C; Sousa, Clara

    2015-08-15

    Accurate detection and quantification of microbiological contaminations remains an issue, mainly due to the lack of rapid and precise analytical techniques. Standard methods are expensive and time-consuming and are associated with high economic losses and public health threats. In the context of the pharmaceutical industry, the development of fast analytical techniques able to overcome these limitations is crucial, and spectroscopic techniques might constitute a reliable alternative. In this work we proved the ability of Fourier transform near-infrared spectroscopy (FT-NIRS) to detect and quantify bacteria (Bacillus subtilis, Escherichia coli, Pseudomonas fluorescens, Salmonella enterica, Staphylococcus epidermidis) from 10 to 10^8 CFU/mL in sterile saline solutions (NaCl 0.9%). Partial least squares discriminant analysis (PLSDA) models showed that FT-NIRS was able to discriminate between sterile and contaminated solutions for all bacteria as well as to identify the contaminant bacteria. Partial least squares (PLS) models allowed bacterial quantification, with limits of detection ranging from 5.1 to 9 CFU/mL for E. coli and B. subtilis, respectively. This methodology was successfully validated in three pharmaceutical preparations (contact lens solution, cough syrup and topical anti-inflammatory solution), proving that this technique possesses high potential to be routinely used for the detection and quantification of bacterial contaminations. Copyright © 2015 Elsevier B.V. All rights reserved.
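
    PLS calibration of the kind used for the quantification step can be sketched with a minimal numpy implementation of the NIPALS algorithm (a generic illustration under assumed data, not the authors' code):

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 regression via the NIPALS algorithm (numpy only).
    Returns mean-centering offsets and regression coefficients."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)          # weight vector
        t = Xc @ w                      # scores
        tt = t @ t
        p = Xc.T @ t / tt               # X loadings
        q_k = yc @ t / tt               # y loading
        Xc = Xc - np.outer(t, p)        # deflate X
        yc = yc - t * q_k               # deflate y
        W.append(w); P.append(p); q.append(q_k)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)  # coefficients in original X space
    return x_mean, y_mean, B

def pls1_predict(model, X):
    x_mean, y_mean, B = model
    return (np.asarray(X, float) - x_mean) @ B + y_mean
```

    In a spectroscopic setting the columns of X would be absorbances at each wavenumber and y the known CFU/mL of calibration standards; the number of components is normally chosen by cross-validation.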

  7. Uncovering category specificity of genital sexual arousal in women: The critical role of analytic technique.

    PubMed

    Pulverman, Carey S; Hixon, J Gregory; Meston, Cindy M

    2015-10-01

    Based on analytic techniques that collapse data into a single average value, it has been reported that women lack category specificity and show genital sexual arousal to a large range of sexual stimuli including those that both match and do not match their self-reported sexual interests. These findings may be a methodological artifact of the way in which data are analyzed. This study examined whether using an analytic technique that models data over time would yield different results. Across two studies, heterosexual (N = 19) and lesbian (N = 14) women viewed erotic films featuring heterosexual, lesbian, and gay male couples, respectively, as their physiological sexual arousal was assessed with vaginal photoplethysmography. Data analysis with traditional methods comparing average genital arousal between films failed to detect specificity of genital arousal for either group. When data were analyzed with smoothing regression splines and a within-subjects approach, both heterosexual and lesbian women demonstrated different patterns of genital sexual arousal to the different types of erotic films, suggesting that sophisticated statistical techniques may be necessary to more fully understand women's genital sexual arousal response. Heterosexual women showed category-specific genital sexual arousal. Lesbian women showed higher arousal to the heterosexual film than the other films. However, within subjects, lesbian women showed significantly different arousal responses suggesting that lesbian women's genital arousal discriminates between different categories of stimuli at the individual level. Implications for the future use of vaginal photoplethysmography as a diagnostic tool of sexual preferences in clinical and forensic settings are discussed. © 2015 Society for Psychophysiological Research.

  8. Value of Earth Observations: Key principles and techniques of socioeconomic benefits analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Macauley, M.; Bernknopf, R.

    2013-12-01

    Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples on the uses of Earth observations across a range of sectors can be complemented by the quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and social and economic sciences fields to build capacity, develop case studies, and refine analytic techniques in quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research, and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring socioeconomic impacts of Earth observations and how the measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.

  9. Hybrid Analytical and Data-Driven Modeling for Feed-Forward Robot Control †

    PubMed Central

    Reinhart, René Felix; Shareef, Zeeshan; Steil, Jochen Jakob

    2017-01-01

    Feed-forward model-based control relies on models of the controlled plant, e.g., in robotics on accurate knowledge of manipulator kinematics or dynamics. However, mechanical and analytical models do not capture all aspects of a plant’s intrinsic properties and there remain unmodeled dynamics due to varying parameters, unmodeled friction or soft materials. In this context, machine learning is an alternative suitable technique to extract non-linear plant models from data. However, fully data-based models suffer from inaccuracies as well and are inefficient if they include learning of well known analytical models. This paper thus argues that feed-forward control based on hybrid models comprising an analytical model and a learned error model can significantly improve modeling accuracy. Hybrid modeling here serves the purpose to combine the best of the two modeling worlds. The hybrid modeling methodology is described and the approach is demonstrated for two typical problems in robotics, i.e., inverse kinematics control and computed torque control. The former is performed for a redundant soft robot and the latter for a rigid industrial robot with redundant degrees of freedom, where a complete analytical model is not available for any of the platforms. PMID:28208697

  10. Hybrid Analytical and Data-Driven Modeling for Feed-Forward Robot Control.

    PubMed

    Reinhart, René Felix; Shareef, Zeeshan; Steil, Jochen Jakob

    2017-02-08

    Feed-forward model-based control relies on models of the controlled plant, e.g., in robotics on accurate knowledge of manipulator kinematics or dynamics. However, mechanical and analytical models do not capture all aspects of a plant's intrinsic properties and there remain unmodeled dynamics due to varying parameters, unmodeled friction or soft materials. In this context, machine learning is an alternative suitable technique to extract non-linear plant models from data. However, fully data-based models suffer from inaccuracies as well and are inefficient if they include learning of well known analytical models. This paper thus argues that feed-forward control based on hybrid models comprising an analytical model and a learned error model can significantly improve modeling accuracy. Hybrid modeling here serves the purpose to combine the best of the two modeling worlds. The hybrid modeling methodology is described and the approach is demonstrated for two typical problems in robotics, i.e., inverse kinematics control and computed torque control. The former is performed for a redundant soft robot and the latter for a rigid industrial robot with redundant degrees of freedom, where a complete analytical model is not available for any of the platforms.

  11. The acoustics of ducted propellers

    NASA Astrophysics Data System (ADS)

    Ali, Sherif F.

    The return of the propeller to long-haul commercial service may be rapidly approaching in the form of advanced "prop fans". It is believed that the advanced turboprop will considerably reduce operational cost. However, such aircraft will come into general use only if their noise levels meet the standards of community acceptability currently applied to existing aircraft. In this work a time-marching boundary-element technique is developed and used to study the acoustics of ducted propellers. The numerical technique developed in this work eliminates the inherent instability suffered by conventional approaches. The methodology is validated against other numerical and analytical results. The results show excellent agreement with the analytical solution and show no indication of unstable behavior. For the ducted propeller problem, the propeller is modeled by rotating source-sink pairs, and the duct is modeled by a rigid annular body of elliptical cross-section. Using this model and the developed technique, the effect of different parameters on the acoustic field is predicted and analyzed, including the effects of duct length, propeller axial location, and source Mach number. The results of this study show that installing a short duct around the propeller can reduce the noise that reaches an observer on a sideline.

  12. Phase-0/microdosing studies using PET, AMS, and LC-MS/MS: a range of study methodologies and conduct considerations. Accelerating development of novel pharmaceuticals through safe testing in humans - a practical guide.

    PubMed

    Burt, Tal; John, Christy S; Ruckle, Jon L; Vuong, Le T

    2017-05-01

    Phase-0 studies, including microdosing, also called Exploratory Investigational New Drug (eIND) or exploratory clinical trials, are a regulatory framework for first-in-human (FIH) trials. Common to these approaches is the use and implied safety of limited exposures to test articles. Use of sub-pharmacological doses in phase-0/microdose studies requires sensitive analytic tools such as accelerator mass spectrometer (AMS), Positron Emission Tomography (PET), and Liquid Chromatography Tandem Mass Spectrometry (LC-MS/MS) to determine drug disposition. Areas covered: Here we present a practical guide to the range of methodologies, design options, and conduct strategies that can be used to increase the efficiency of drug development. We provide detailed examples of relevant developmental scenarios. Expert opinion: Validation studies over the past decade demonstrated the reliability of extrapolation of sub-pharmacological to therapeutic-level exposures in more than 80% of cases, an improvement over traditional allometric approaches. Applications of phase-0/microdosing approaches include study of pharmacokinetic and pharmacodynamic properties, target tissue localization, drug-drug interactions, effects in vulnerable populations (e.g. pediatric), and intra-target microdosing (ITM). Study design should take into account the advantages and disadvantages of each analytic tool. Utilization of combinations of these analytic techniques increases the versatility of study designs and the power of data obtained.

  13. A radiative transfer model for remote sensing of laser induced fluorescence of phytoplankton in non-homogeneous turbid water

    NASA Technical Reports Server (NTRS)

    Venable, D. D.

    1983-01-01

    A semi-analytic Monte Carlo simulation methodology (SALMON) was discussed. This simulation technique is particularly well suited for addressing fundamental radiative transfer problems in oceanographic LIDAR (optical radar), and it also provides a framework for investigating the effects of environmental factors on LIDAR system performance. The simulation model was extended for airborne laser fluorosensors to allow for inhomogeneities in the vertical distribution of constituents in clear sea water. Results of the simulations for linearly varying step concentrations of chlorophyll are presented. The SALMON technique was also employed to determine how the LIDAR signals from an inhomogeneous medium differ from those from a homogeneous medium.

  14. New approaches to wipe sampling methods for antineoplastic and other hazardous drugs in healthcare settings.

    PubMed

    Connor, Thomas H; Smith, Jerome P

    2016-09-01

    At the present time, the method of choice for determining surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings and to discuss recent advances in this area. In addition, it provides some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodologies for the analysis of surface wipe samples for hazardous drugs are reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine and less costly, and provide a shorter response time than the classical analytical techniques now in use.

  15. A subjective framework for seat comfort based on a heuristic multi criteria decision making technique and anthropometry.

    PubMed

    Fazlollahtabar, Hamed

    2010-12-01

    Consumer expectations for automobile seat comfort continue to rise. With this said, it is evident that the current automobile seat comfort development process, which is only sporadically successful, needs to change. In this context, there has been growing recognition of the need for establishing theoretical and methodological foundations for automobile seat comfort. On the other hand, seat producers need to know the comfort consumers require in order to produce seats matching their interests. Current research methodologies apply qualitative approaches because of the anthropometric specifications involved. The most significant weakness of these approaches is the inexactness of the inferences they extract. Despite the qualitative nature of consumers' preferences, there are methods to transform the qualitative parameters into numerical values, which could help seat producers improve or enhance their products. This approach would also help automobile manufacturers procure their seats from the best producer with regard to consumers' opinions. In this paper, a heuristic multi-criteria decision-making technique is applied to express consumer preferences as numerical values. This technique is a combination of the Analytic Hierarchy Process (AHP), the entropy method, and the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). A case study is conducted to illustrate the applicability and effectiveness of the proposed heuristic approach. Copyright © 2010 Elsevier Ltd. All rights reserved.
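
    The entropy-weighting and TOPSIS steps of the combined technique can be sketched in a few lines of numpy (a generic illustration with hypothetical data; the AHP step, which derives weights from pairwise comparisons, is omitted here):

```python
import numpy as np

def entropy_weights(X):
    """Objective criteria weights from the Shannon entropy of the decision
    matrix (alternatives x criteria); X must be strictly positive."""
    P = X / X.sum(axis=0)
    E = -(P * np.log(P)).sum(axis=0) / np.log(X.shape[0])
    d = 1.0 - E                     # degree of divergence per criterion
    return d / d.sum()

def topsis(X, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.
    benefit: boolean per criterion, True where larger values are better."""
    R = X / np.linalg.norm(X, axis=0)           # vector normalization
    V = R * weights                             # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)              # closeness coefficient in [0, 1]
```

    An alternative that dominates on every criterion receives a closeness coefficient of 1; the highest-scoring row is the recommended choice.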

  16. Environmental Monitoring and the Gas Industry: Program Manager Handbook

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory D. Gillispie

    1997-12-01

    This document has been developed for the nontechnical gas industry manager who has responsibility for the development of waste or potentially contaminated soil and groundwater data, or who must make decisions based on such data for the management or remediation of these materials. It explores the use of common analytical chemistry instrumentation and associated techniques for identification of environmentally hazardous materials. Sufficient detail is given to familiarize the nontechnical reader with the principles behind the operation of each technique. The scope and realm of the techniques and their constituent variations are portrayed through a discussion of crucial details and, where appropriate, the depiction of real-life data. It is the author's intention to provide an easily understood handbook for gas industry management. Techniques which determine the presence, composition, and quantification of gas industry wastes are discussed. Greater focus is given to traditional techniques which have been the mainstay of modern analytical benchwork. However, with the continual advancement of instrumental principles and design, several techniques have been included which are likely to receive greater attention in future considerations for waste-related detection. Definitions and concepts inherent to a thorough understanding of the principles common to analytical chemistry are discussed. It is also crucial that gas industry managers understand the effects of the various actions which take place before, during, and after the actual sampling step. When a series of sample collection, storage, and transport activities occur, new or inexperienced project managers may overlook or misunderstand the importance of the sequence. Each step has an impact on the final results of the measurement process; errors in judgment or decision making can be costly. Specific techniques and methodologies for the collection, storage, and transport of environmental media samples are not described or discussed in detail in this handbook. However, the underlying philosophy regarding the importance of proper collection, storage, and transport practices, as well as pertinent references, are presented.

  17. Heavy hydrocarbon main injector technology program

    NASA Technical Reports Server (NTRS)

    Arbit, H. A.; Tuegel, L. M.; Dodd, F. E.

    1991-01-01

    The Heavy Hydrocarbon Main Injector Program was an analytical, design, and test program to demonstrate an injection concept applicable to an Isolated Combustion Compartment of a full-scale, high-pressure LOX/RP-1 engine. Several injector patterns were tested in a 3.4-in. combustor. Based on these results, features of the most promising injector design were incorporated into a 5.7-in. injector, which was then hot-fire tested. In turn, a preliminary design of a 5-compartment 2D combustor was based on this pattern. Additional subscale injector testing and analysis were also performed, with an emphasis on improving analytical techniques and acoustic cavity design methodology. Several of the existing 3.5-in. diameter injectors were hot-fire tested with and without acoustic cavities for spontaneous and dynamic stability characteristics.

  18. Critical Review on the Analytical Techniques for the Determination of the Oldest Statin-Atorvastatin-in Bulk, Pharmaceutical Formulations and Biological Fluids.

    PubMed

    Kokilambigai, K S; Seetharaman, R; Lakshmi, K S

    2017-11-02

    Statins are a group of medicines that can help to lower the level of low-density lipoprotein (LDL) cholesterol, the "bad cholesterol", in the blood. Having a high level of LDL cholesterol is potentially dangerous, as it can lead to a hardening and narrowing of the arteries (atherosclerosis) and cardiovascular disease (CVD). Atorvastatin is one of the oldest members of the statin family and is used in the treatment of dyslipidemia and the prevention of CVD. Atorvastatin was first made in August 1985, and from 1996 to 2012, under the trade name Lipitor, it was the world's best-selling drug. Numerous analytical methodologies are available for the quantification of atorvastatin and its content in pharmaceutical preparations and in biological fluids.

  19. Thermal and Chemical Characterization of Composite Materials. MSFC Center Director's Discretionary Fund Final Report, Project No. ED36-18

    NASA Technical Reports Server (NTRS)

    Stanley, D. C.; Huff, T. L.

    2003-01-01

The purpose of this research effort was to: (1) provide a concise and well-defined property profile of current and developing composite materials using thermal and chemical characterization techniques and (2) optimize analytical testing requirements of materials. This effort applied a diverse array of methodologies to ascertain composite material properties. Often, a single method or technique will provide useful, but nonetheless incomplete, information on material composition and/or behavior. To more completely understand and predict material properties, a broad-based analytical approach is required. By developing a database of information comprised of both thermal and chemical properties, material behavior under varying conditions may be better understood. This is even more important in the aerospace community, where new composite materials and those in the development stage have little reference data. For example, Fourier transform infrared (FTIR) spectroscopy spectral databases available for identification of vapor phase spectra, such as those generated during experiments, generally refer to well-defined chemical compounds. Because this method renders a unique thermal decomposition spectral pattern, even larger, more diverse databases, such as those found in solid and liquid phase FTIR spectroscopy libraries, cannot be used. By combining this and other available methodologies, a database specifically for new materials and materials being developed at Marshall Space Flight Center can be generated. In addition, characterizing materials using this approach will be extremely useful in the verification of materials and identification of anomalies in NASA-wide investigations.

  20. Chemical-mineralogical characterization of C&D waste recycled aggregates from São Paulo, Brazil.

    PubMed

    Angulo, S C; Ulsen, C; John, V M; Kahn, H; Cincotto, M A

    2009-02-01

This study presents a methodology for the characterization of construction and demolition (C&D) waste recycled aggregates based on a combination of analytical techniques (X-ray fluorescence (XRF), soluble ions, semi-quantitative X-ray diffraction (XRD), thermogravimetric analysis (TGA-DTG) and hydrochloric acid (HCl) selective dissolution). These combined analytical techniques allow for the estimation of the amount of cement paste, its most important hydrated and carbonated phases, as well as the amount of clay and micas. Details of the methodology are presented here and the results of three representative C&D samples taken from the São Paulo region in Brazil are discussed. The chemical compositions of mixed C&D aggregate samples were influenced mostly by particle size rather than by the visual classification of the C&D into red or grey material or by geographical origin. The amount of measured soluble salts in the C&D aggregates (0.15-25.4 mm) is lower than the usual limits for mortar and concrete production. The content of porous cement paste in the C&D aggregates is around 19.3% (w/w). However, this content is significantly lower than the 43% detected for the C&D powders (<0.15 mm). The clay content of the powders was also high, potentially resulting from soil intermixed with the C&D waste, as well as poorly burnt red ceramic. Since only about 50% of the measured CaO is combined with CO2, the powders have potential use as raw materials for the cement industry.

  1. Combining machine learning and matching techniques to improve causal inference in program evaluation.

    PubMed

    Linden, Ariel; Yarnold, Paul R

    2016-12-01

    Program evaluations often utilize various matching approaches to emulate the randomization process for group assignment in experimental studies. Typically, the matching strategy is implemented, and then covariate balance is assessed before estimating treatment effects. This paper introduces a novel analytic framework utilizing a machine learning algorithm called optimal discriminant analysis (ODA) for assessing covariate balance and estimating treatment effects, once the matching strategy has been implemented. This framework holds several key advantages over the conventional approach: application to any variable metric and number of groups; insensitivity to skewed data or outliers; and use of accuracy measures applicable to all prognostic analyses. Moreover, ODA accepts analytic weights, thereby extending the methodology to any study design where weights are used for covariate adjustment or more precise (differential) outcome measurement. One-to-one matching on the propensity score was used as the matching strategy. Covariate balance was assessed using standardized difference in means (conventional approach) and measures of classification accuracy (ODA). Treatment effects were estimated using ordinary least squares regression and ODA. Using empirical data, ODA produced results highly consistent with those obtained via the conventional methodology for assessing covariate balance and estimating treatment effects. When ODA is combined with matching techniques within a treatment effects framework, the results are consistent with conventional approaches. However, given that it provides additional dimensions and robustness to the analysis versus what can currently be achieved using conventional approaches, ODA offers an appealing alternative. © 2016 John Wiley & Sons, Ltd.
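    The balance check the authors describe can be sketched in a few lines. The data, the greedy nearest-neighbour matching, and the 0.1 balance threshold below are illustrative assumptions only, not the ODA procedure or the study's data:

    ```python
    import bisect
    import random
    import statistics

    random.seed(0)

    def standardized_difference(treated, control):
        """Standardized difference in means: (m_t - m_c) / pooled SD.
        Values below ~0.1 are conventionally read as adequate balance."""
        m_t, m_c = statistics.mean(treated), statistics.mean(control)
        s_t, s_c = statistics.stdev(treated), statistics.stdev(control)
        pooled_sd = ((s_t ** 2 + s_c ** 2) / 2) ** 0.5
        return (m_t - m_c) / pooled_sd

    # Hypothetical covariate (say, age) in a treated group and a control pool.
    treated = [random.gauss(52, 8) for _ in range(100)]
    control_pool = [random.gauss(45, 8) for _ in range(500)]

    # Greedy one-to-one nearest-neighbour matching on the covariate itself,
    # standing in for matching on an estimated propensity score.
    pool = sorted(control_pool)
    matched = []
    for t in treated:
        i = min(bisect.bisect_left(pool, t), len(pool) - 1)
        if i > 0 and abs(pool[i - 1] - t) < abs(pool[i] - t):
            i -= 1  # the left neighbour is closer
        matched.append(pool.pop(i))

    print(round(standardized_difference(treated, control_pool), 2))  # large imbalance
    print(round(standardized_difference(treated, matched), 2))       # near zero
    ```

    After matching, treatment effects would then be estimated on the matched sample; the paper's point is that ODA can replace both the balance metric and the effect estimator in this pipeline.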

  2. What We Do and Do Not Know about Teaching Medical Image Interpretation.

    PubMed

    Kok, Ellen M; van Geel, Koos; van Merriënboer, Jeroen J G; Robben, Simon G F

    2017-01-01

    Educators in medical image interpretation have difficulty finding scientific evidence as to how they should design their instruction. We review and comment on 81 papers that investigated instructional design in medical image interpretation. We distinguish between studies that evaluated complete offline courses and curricula, studies that evaluated e-learning modules, and studies that evaluated specific educational interventions. Twenty-three percent of all studies evaluated the implementation of complete courses or curricula, and 44% of the studies evaluated the implementation of e-learning modules. We argue that these studies have encouraging results but provide little information for educators: too many differences exist between conditions to unambiguously attribute the learning effects to specific instructional techniques. Moreover, concepts are not uniformly defined and methodological weaknesses further limit the usefulness of evidence provided by these studies. Thirty-two percent of the studies evaluated a specific interventional technique. We discuss three theoretical frameworks that informed these studies: diagnostic reasoning, cognitive schemas and study strategies. Research on diagnostic reasoning suggests teaching students to start with non-analytic reasoning and subsequently apply analytic reasoning, but little is known about how to train non-analytic reasoning. Research on cognitive schemas has investigated activities that help the development of appropriate cognitive schemas. Finally, research on study strategies supports the effectiveness of practice testing, but more study strategies could be applicable to learning medical image interpretation. Our commentary highlights the value of evaluating specific instructional techniques, but further evidence is required to optimally inform educators in medical image interpretation.

  3. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.
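    The model fitting the Standard calls for (linear versus exponential trends via least squares) can be illustrated with a minimal sketch. The time series is synthetic and the helper names are hypothetical, not from the Standard:

    ```python
    import math

    def fit_linear(t, y):
        """Ordinary least squares fit of y = a + b*t; returns the fitted function."""
        n = len(t)
        mt, my = sum(t) / n, sum(y) / n
        b = sum((ti - mt) * (yi - my) for ti, yi in zip(t, y)) \
            / sum((ti - mt) ** 2 for ti in t)
        a = my - b * mt
        return lambda x: a + b * x

    def fit_exponential(t, y):
        """Fit y = A*exp(B*t) by linear regression on log(y); requires y > 0."""
        log_fit = fit_linear(t, [math.log(yi) for yi in y])
        return lambda x: math.exp(log_fit(x))

    def sse(model, t, y):
        """Sum of squared residuals, used to compare candidate trend models."""
        return sum((model(ti) - yi) ** 2 for ti, yi in zip(t, y))

    # Synthetic monthly series that actually grows exponentially.
    t = list(range(12))
    y = [2.0 * math.exp(0.25 * ti) for ti in t]

    linear, exponential = fit_linear(t, y), fit_exponential(t, y)
    print(sse(exponential, t, y) < sse(linear, t, y))  # prints True
    ```

    Comparing residual sums of squares across the candidate model forms is one simple way to choose among the linear, quadratic, and exponential fits the Standard mentions.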

  4. Simulation of wind turbine wakes using the actuator line technique

    PubMed Central

    Sørensen, Jens N.; Mikkelsen, Robert F.; Henningson, Dan S.; Ivanell, Stefan; Sarmast, Sasan; Andersen, Søren J.

    2015-01-01

    The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today widely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison with experimental results on the wake characteristics of the flow around a three-bladed model wind turbine, the development of a simple analytical formula for determining the near-wake length behind a wind turbine and a detailed investigation of wake structures based on proper orthogonal decomposition analysis of numerically generated snapshots of the wake. PMID:25583862

  5. Analytical and Numerical Results for an Adhesively Bonded Joint Subjected to Pure Bending

    NASA Technical Reports Server (NTRS)

    Smeltzer, Stanley S., III; Lundgren, Eric

    2006-01-01

    A one-dimensional, semi-analytical methodology that was previously developed for evaluating adhesively bonded joints composed of anisotropic adherends and adhesives that exhibit inelastic material behavior is further verified in the present paper. A summary of the first-order differential equations and applied joint loading used to determine the adhesive response from the methodology are also presented. The method was previously verified against a variety of single-lap joint configurations from the literature that subjected the joints to cases of axial tension and pure bending. Using the same joint configuration and applied bending load presented in a study by Yang, the finite element analysis software ABAQUS was used to further verify the semi-analytical method. Linear static ABAQUS results are presented for two models, one with a coarse and one with a fine element meshing, that were used to verify convergence of the finite element analyses. Close agreement between the finite element results and the semi-analytical methodology was found for both the shear and normal stress responses of the adhesive bondline. Thus, the semi-analytical methodology was successfully verified using the ABAQUS finite element software and a single-lap joint configuration subjected to pure bending.

  6. The analytical calibration in (bio)imaging/mapping of the metallic elements in biological samples--definitions, nomenclature and strategies: state of the art.

    PubMed

    Jurowski, Kamil; Buszewski, Bogusław; Piekoszewski, Wojciech

    2015-01-01

    Nowadays, studies related to the distribution of metallic elements in biological samples are one of the most important issues. There are many articles dedicated to specific analytical atomic spectrometry techniques used for mapping/(bio)imaging the metallic elements in various kinds of biological samples. However, such literature lacks articles dedicated to reviewing calibration strategies and their problems, nomenclature, definitions, and the ways and methods used to obtain quantitative distribution maps. The aim of this article was to characterize the analytical calibration in the (bio)imaging/mapping of the metallic elements in biological samples, including (1) nomenclature; (2) definitions; and (3) selected, sophisticated examples of calibration strategies and analytical calibration procedures applied in the different analytical methods currently used to study an element's distribution in biological samples/materials, such as LA-ICP-MS, SIMS, EDS, XRF and others. The main emphasis was placed on the procedures and methodology of the analytical calibration strategy. Additionally, the aim of this work is to systematize the nomenclature for the calibration terms: analytical calibration, analytical calibration method, analytical calibration procedure and analytical calibration strategy. The authors also want to popularize a division of calibration methods that is different from those hitherto used. This article is the first work in the literature that refers to and emphasizes the many different and complex aspects of analytical calibration problems in studies related to (bio)imaging/mapping metallic elements in different kinds of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Validated analytical methodology for the simultaneous determination of a wide range of pesticides in human blood using GC-MS/MS and LC-ESI/MS/MS and its application in two poisoning cases.

    PubMed

    Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D

    2015-09-01

    Pesticides are frequently responsible for human poisoning and often the information on the involved substance is lacking. The great variety of pesticides that could be responsible for intoxication makes necessary the development of powerful and versatile analytical methodologies, which allows the identification of the unknown toxic substance. Here we developed a methodology for simultaneous identification and quantification in human blood of 109 highly toxic pesticides. The application of this analytical scheme would help in minimizing the cost of this type of chemical identification, maximizing the chances of identifying the pesticide involved. In the methodology that we present here, we use a liquid-liquid extraction, followed by one single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple quadrupole mass spectrometry, which is operated in the mode of multiple reaction monitoring. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  8. Force 2025 and Beyond Strategic Force Design Analytic Model

    DTIC Science & Technology

    2017-01-12

    depiction of the core ideas of our force design model. Figure 1: Description of Force Design Model. Figure 2 shows an overview of our methodology … the F2025B Force Design Analytic Model research conducted by TRAC-MTRY and the Naval Postgraduate School. Our research develops a methodology for … designs. We describe a data development methodology that characterizes the data required to construct a force design model using our approach. …

  9. Considering axiological integrity: a methodological analysis of qualitative evidence syntheses, and its implications for health professions education.

    PubMed

    Kelly, Martina; Ellaway, Rachel H; Reid, Helen; Ganshorn, Heather; Yardley, Sarah; Bennett, Deirdre; Dornan, Tim

    2018-05-14

    Qualitative evidence synthesis (QES) is a suite of methodologies that combine qualitative techniques with the synthesis of qualitative knowledge. They are particularly suited to medical education as these approaches pool findings from original qualitative studies, whilst paying attention to context and theoretical development. Although increasingly sophisticated use is being made of qualitative primary research methodologies in health professions education (HPE), the use of secondary qualitative reviews in HPE remains underdeveloped. This study examined QES methods applied to clinical humanism in healthcare as a way of advancing thinking around the use of QES in HPE in general. A systematic search strategy identified 49 reviews that fulfilled the inclusion criteria. Meta-study was used to develop an analytic summary of methodological characteristics, the role of theory, and the synthetic processes used in QES reviews. Fifteen reviews used a defined methodology, and 17 clearly explained the processes that led from data extraction to synthesis. Eight reviews adopted a specific theoretical perspective. Authors rarely described their reflexive relationship with their data. Epistemological positions tended to be implied rather than explicit. Twenty-five reviews included some form of quality appraisal, although it was often unclear how authors acted on its results. Reviewers under-reported qualitative approaches in their review methodologies, and tended to focus on elements such as systematicity and checklist quality appraisal that were more germane to quantitative evidence synthesis. A core concern was that the axiological (value) dimensions of the source materials were rarely considered, let alone accommodated, in the synthesis techniques used. QES can be used in HPE research but only with careful attention to maintaining axiological integrity.

  10. Recent Analytical Techniques Advances in the Carotenoids and Their Derivatives Determination in Various Matrixes.

    PubMed

    Giuffrida, Daniele; Donato, Paola; Dugo, Paola; Mondello, Luigi

    2018-04-04

    In the present perspective, different approaches to carotenoid analysis are discussed, providing a brief overview of the most advanced monodimensional and bidimensional liquid chromatographic methodologies applied to carotenoid analysis, followed by a discussion of recent advances in the bidimensional supercritical fluid chromatography × liquid chromatography approach with photodiode-array and mass spectrometry detection. Moreover, a discussion of online supercritical fluid extraction-supercritical fluid chromatography with tandem mass spectrometry detection, applied to the determination of carotenoids and apocarotenoids, is also provided.

  11. Efficient Solution of Three-Dimensional Problems of Acoustic and Electromagnetic Scattering by Open Surfaces

    NASA Technical Reports Server (NTRS)

    Turc, Catalin; Anand, Akash; Bruno, Oscar; Chaubell, Julian

    2011-01-01

    We present a computational methodology (a novel Nystrom approach based on the use of a non-overlapping patch technique and Chebyshev discretizations) for the efficient solution of problems of acoustic and electromagnetic scattering by open surfaces. Our integral equation formulations (1) incorporate, as an ansatz, the singular nature of open-surface integral-equation solutions, and (2) for the Electric Field Integral Equation (EFIE), use analytical regularizers that effectively reduce the number of iterations required by Krylov-subspace iterative linear-algebra solvers.

  12. Architecture for Business Intelligence in the Healthcare Sector

    NASA Astrophysics Data System (ADS)

    Lee, Sang Young

    2018-03-01

    The healthcare environment is growing to include not only traditional information systems but also business intelligence platforms. Executive leaders, consultants, and analysts no longer need to spend hours designing and developing typical reports or charts; the entire solution can be completed using business intelligence software. The current paper highlights the advantages of big data analytics and business intelligence in the healthcare industry. In this paper we focus our discussion on intelligent techniques and methodologies that have recently been used for business intelligence in healthcare.

  13. Characterization of Deposits on Glass Substrate as a Tool in Failure Analysis: The Orbiter Vehicle Columbia Case Study

    NASA Technical Reports Server (NTRS)

    Olivas, J. D.; Melroy, P.; McDanels, S.; Wallace, T.; Zapata, M. C.

    2006-01-01

    In connection with the accident investigation of the space shuttle Columbia, an analysis methodology utilizing well established microscopic and spectroscopic techniques was implemented for evaluating the environment to which the exterior fused silica glass was exposed. Through the implementation of optical microscopy, scanning electron microscopy, energy dispersive spectroscopy, transmission electron microscopy, and electron diffraction, details emerged regarding the manner in which a charred metallic deposited layer formed on top of the exposed glass. Due to the nature of the substrate and the materials deposited, the methodology allowed for a more detailed analysis of the vehicle breakup. By contrast, similar analytical methodologies on metallic substrates have proven challenging due to the strong potential for error resulting from substrate contamination. This information proved valuable not only to those involved in investigating the breakup of Columbia, but also provides a potential guide for investigating future high-altitude, high-energy accidents.

  14. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND: SIMulation of Research ANd Development Projects is a methodology developed to aid the engineering and management decision process in the selection of the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration for which the total cost exceeds the allocated budget. Other factors such as personnel and facilities may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology draws on analytical techniques from probability theory, the decision analysis of management science, and computer simulation in the selection of this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thus providing information for the management direction of the engineers, and it ranks the alternatives according to the preferences of the decision makers.
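    A toy version of the SIMRAND idea (simulate uncertain task returns, then rank budget-feasible subsets by expected value) might look like the following. The tasks, costs, and return ranges are invented for illustration and are not from the SIMRAND report:

    ```python
    import itertools
    import random

    random.seed(1)

    # Hypothetical candidate tasks: (name, cost, (low, high) uniform return range).
    tasks = [("A", 4, (2, 10)), ("B", 3, (1, 6)), ("C", 5, (4, 9)), ("D", 2, (0, 5))]
    BUDGET = 9
    TRIALS = 2000

    def simulated_mean_return(subset):
        """Monte Carlo estimate of the expected total return of a task subset."""
        total = 0.0
        for _ in range(TRIALS):
            total += sum(random.uniform(lo, hi) for _, _, (lo, hi) in subset)
        return total / TRIALS

    # Enumerate every subset that fits the budget, then rank by simulated return.
    feasible = [s for r in range(1, len(tasks) + 1)
                for s in itertools.combinations(tasks, r)
                if sum(cost for _, cost, _ in s) <= BUDGET]
    best = max(feasible, key=simulated_mean_return)

    print([name for name, _, _ in best])  # → ['A', 'C']
    ```

    A real SIMRAND run would replace the uniform returns with engineer-supplied distributions and the simple expected value with the decision makers' utility function.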

  15. Rapid monitoring of glycerol in fermentation growth media: Facilitating crude glycerol bioprocess development.

    PubMed

    Abad, Sergi; Pérez, Xavier; Planas, Antoni; Turon, Xavier

    2014-04-01

    Recently, the need for crude glycerol valorisation from the biodiesel industry has generated many studies for practical and economic applications. Amongst them, fermentations based on glycerol media for the production of high value metabolites are prominent applications. This has generated a need to develop analytical techniques which allow fast and simple glycerol monitoring during fermentation. The methodology should be fast and inexpensive to be adopted in research, as well as in industrial applications. In this study three different methods were analysed and compared: two common methodologies based on liquid chromatography and enzymatic kits, and the new method based on a DotBlot assay coupled with image analysis. The new methodology is faster and cheaper than the other conventional methods, with comparable performance. Good linearity, precision and accuracy were achieved in the lower range (10 or 15 g/L to depletion), the most common range of glycerol concentrations to monitor fermentations in terms of growth kinetics. Copyright © 2014 Elsevier B.V. All rights reserved.

  16. Defining and Measuring Engagement and Learning in Science: Conceptual, Theoretical, Methodological, and Analytical Issues

    ERIC Educational Resources Information Center

    Azevedo, Roger

    2015-01-01

    Engagement is one of the most widely misused and overgeneralized constructs found in the educational, learning, instructional, and psychological sciences. The articles in this special issue represent a wide range of traditions and highlight several key conceptual, theoretical, methodological, and analytical issues related to defining and measuring…

  17. X-ray micro-beam techniques and phase contrast tomography applied to biomaterials

    NASA Astrophysics Data System (ADS)

    Fratini, Michela; Campi, Gaetano; Bukreeva, Inna; Pelliccia, Daniele; Burghammer, Manfred; Tromba, Giuliana; Cancedda, Ranieri; Mastrogiacomo, Maddalena; Cedola, Alessia

    2015-12-01

    A deeper comprehension of the biomineralization (BM) process underpins developments in tissue engineering and regenerative medicine. Several in-vivo and in-vitro studies have been dedicated to this purpose via the application of 2D and 3D diagnostic techniques. Here, we develop a new methodology based on different complementary experimental techniques (X-ray phase contrast tomography, micro-X-ray diffraction and the micro-X-ray fluorescence scanning technique) coupled to new analytical tools. A qualitative and quantitative structural investigation, from the atomic to the micrometric length scale, is obtained for engineered bone tissues. The high spatial resolution achieved by X-ray scanning techniques allows us to monitor bone formation at the first-formed mineral deposit at the organic-mineral interface within a porous scaffold. This work aims at providing a full comprehension of the morphology and functionality of the biomineralization process, which is of key importance for developing new drugs for preventing and healing bone diseases and for the development of bio-inspired materials.

  18. Aircraft wing weight build-up methodology with modification for materials and construction techniques

    NASA Technical Reports Server (NTRS)

    York, P.; Labell, R. W.

    1980-01-01

    An aircraft wing weight estimating method based on a component buildup technique is described. A simplified analytically derived beam model, modified by a regression analysis, is used to estimate the wing box weight, utilizing a data base of 50 actual airplane wing weights. Factors representing materials and methods of construction were derived and incorporated into the basic wing box equations. Weight penalties to the wing box for fuel, engines, landing gear, stores and fold or pivot are also included. Methods for estimating the weight of additional items (secondary structure, control surfaces) have the option of using details available at the design stage (i.e., wing box area, flap area) or default values based on actual aircraft from the data base.
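    The buildup idea (an analytical box-weight term scaled by a material factor, plus additive penalty items) can be sketched as follows. The coefficient, material factor, and penalty values are invented placeholders, not the report's regression results:

    ```python
    # All numbers here are illustrative, not the actual regression coefficients.

    def wing_box_weight(span_ft, root_chord_ft, design_load_lb, material_factor):
        """Toy beam-model term: bending material grows with span and load,
        and shrinks with structural depth (proxied here by root chord)."""
        analytical_term = 0.005 * span_ft * design_load_lb / root_chord_ft
        return analytical_term * material_factor  # regression-style correction

    def total_wing_weight(box_weight_lb, penalties_lb):
        """Component buildup: box weight plus additive penalty items."""
        return box_weight_lb + sum(penalties_lb.values())

    box = wing_box_weight(span_ft=100, root_chord_ft=15,
                          design_load_lb=300_000, material_factor=0.85)
    penalties = {"fuel": 250.0, "landing_gear": 400.0, "fold_or_pivot": 0.0}
    print(round(total_wing_weight(box, penalties), 1))  # → 9150.0
    ```

    The method's default-versus-detail option maps naturally onto optional arguments: secondary-structure weights could be computed from wing box or flap areas when known, or pulled from data-base averages otherwise.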

  19. The phonetics of talk in interaction--introduction to the special issue.

    PubMed

    Ogden, Richard

    2012-03-01

    This overview paper provides an introduction to work on naturally-occurring speech data, combining techniques of conversation analysis with techniques and methods from phonetics. The paper describes the development of the field, highlighting current challenges and progress in interdisciplinary work. It considers the role of quantification and its relationship to a qualitative methodology. It presents the conversation analytic notion of sequence as a version of context, and argues that sequences of talk constrain relevant phonetic design, and so provide one account for variability in naturally occurring speech. The paper also describes the manipulation of speech and language on many levels simultaneously. All of these themes occur and are explored in more detail in the papers contained in this special issue.

  20. Eco-analytical Methodology in Environmental Problems Monitoring

    NASA Astrophysics Data System (ADS)

    Agienko, M. I.; Bondareva, E. P.; Chistyakova, G. V.; Zhironkina, O. V.; Kalinina, O. I.

    2017-01-01

    Among the problems common to all mankind, whose solutions influence the prospects of civilization, the monitoring of the ecological situation occupies a very important place. Solving this problem requires a specific methodology based on an eco-analytical comprehension of global issues. Eco-analytical methodology should help in searching for the optimum balance between environmental problems and accelerating scientific and technical progress. The fact that governments, corporations, scientists and nations focus on the production and consumption of material goods causes great damage to the environment. As a result, the activity of environmentalists has developed quite spontaneously, as a complement to productive activities. Therefore, the challenge that environmental problems pose for science is the formation of eco-analytical reasoning and the monitoring of global problems common to the whole of humanity. The aim is thus to find the optimal trajectory of industrial development that prevents irreversible damage to the biosphere that could halt the progress of civilization.

  1. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    PubMed

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium, and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective for the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. When can social media lead financial markets?

    PubMed

    Zheludev, Ilya; Smith, Robert; Aste, Tomaso

    2014-02-27

    Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes.
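    The core Information-Theory check (does sentiment carry information about future moves beyond chance?) can be sketched with a plug-in mutual-information estimate on synthetic data. The 80% sentiment-move coupling and both signals are invented for illustration, not the paper's data or estimator:

    ```python
    import math
    import random
    from collections import Counter

    random.seed(2)

    def mutual_information(xs, ys):
        """Plug-in estimate of I(X;Y) in bits for two discrete sequences."""
        n = len(xs)
        px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
        return sum((c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
                   for (x, y), c in pxy.items())

    # Hypothetical daily sentiment (+1/-1) and a next-day move that agrees with
    # sentiment 80% of the time; a shuffled copy serves as an uninformative null.
    sentiment = [random.choice([1, -1]) for _ in range(5000)]
    moves = [s if random.random() < 0.8 else -s for s in sentiment]
    null_signal = sentiment[:]
    random.shuffle(null_signal)

    informative = mutual_information(sentiment, moves)
    baseline = mutual_information(null_signal, moves)
    print(informative > baseline)  # prints True
    ```

    Comparing the estimate against a shuffled null is one simple way to make the "in excess of chance" claim concrete before any statistical validation.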

  3. When Can Social Media Lead Financial Markets?

    NASA Astrophysics Data System (ADS)

    Zheludev, Ilya; Smith, Robert; Aste, Tomaso

    2014-02-01

    Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes.

  4. When Can Social Media Lead Financial Markets?

    PubMed Central

    Zheludev, Ilya; Smith, Robert; Aste, Tomaso

    2014-01-01

    Social media analytics is showing promise for the prediction of financial markets. However, the true value of such data for trading is unclear due to a lack of consensus on which instruments can be predicted and how. Current approaches are based on the evaluation of message volumes and are typically assessed via retrospective (ex-post facto) evaluation of trading strategy returns. In this paper, we present instead a sentiment analysis methodology to quantify and statistically validate which assets could qualify for trading from social media analytics in an ex-ante configuration. We use sentiment analysis techniques and Information Theory measures to demonstrate that social media message sentiment can contain statistically-significant ex-ante information on the future prices of the S&P500 index and a limited set of stocks, in excess of what is achievable using solely message volumes. PMID:24572909

  5. Analytical research and development for the Whitney Programs. Automation and instrumentation. Computer automation of the Cary Model 17I spectrophotometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haugen, G.R.; Bystroff, R.I.; Downey, R.M.

    1975-09-01

    In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and a count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)4 (thd = 2,2,6,6-tetramethylheptane-3,5-dione); study of the vaporization of U(C8H8)2; determination of ethylenic unsaturation in polyimide resins; and semimicrodetermination of hydroxyl and amino groups with pyromellitic dianhydride (PMDA). (JGB)

  6. Analysis of glyoxal and related substances by means of high-performance liquid chromatography with refractive index detection.

    PubMed

    Zhang, Zhiyong; Zhao, Dishun; Xu, Baoyun

    2013-01-01

    A simple and rapid method is described for the analysis of glyoxal and related substances by high-performance liquid chromatography with a refractive index detector. The following chromatographic conditions were adopted: Aminex HPX-87H column, mobile phase consisting of 0.01N H2SO4, flow rate of 0.8 mL/min and temperature of 65°C. The application of the analytical technique developed in this study demonstrated that the aqueous reaction mixture produced by the oxidation of acetaldehyde with HNO3 was composed of glyoxal, acetaldehyde, acetic acid, formic acid, glyoxylic acid, oxalic acid, butanedione and glycolic acid. The method was validated by evaluating analytical parameters such as linearity, limits of detection and quantification, precision, recovery and robustness. The proposed methodology was successfully applied to the production of glyoxal.
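
    The validation parameters mentioned (limits of detection and quantification) are commonly estimated from the calibration line via the ICH-style formulas LOD = 3.3·sigma/S and LOQ = 10·sigma/S, with S the slope and sigma the residual standard deviation. A sketch with invented calibration data, not the paper's figures:

```python
def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

def lod_loq(x, y):
    """ICH-style limits: LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where sigma is
    the residual standard deviation of the calibration line and S its slope."""
    slope, intercept = linfit(x, y)
    residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    sigma = (sum(r * r for r in residuals) / (len(x) - 2)) ** 0.5
    return 3.3 * sigma / slope, 10 * sigma / slope

# Toy calibration: concentration (mg/mL) vs. detector response (arbitrary units).
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
resp = [1.1, 2.0, 4.1, 7.9, 16.2]
lod, loq = lod_loq(conc, resp)
```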

  7. Optimization of microwave-assisted extraction and supercritical fluid extraction of carbamate pesticides in soil by experimental design methodology.

    PubMed

    Sun, Lei; Lee, Hian Kee

    2003-10-03

    Orthogonal array design (OAD) was applied for the first time to optimize microwave-assisted extraction (MAE) and supercritical fluid extraction (SFE) conditions for the analysis of four carbamates (propoxur, propham, methiocarb, chlorpropham) from soil. The theory and methodology of a new OA16 (4(4)) matrix derived from an OA16 (2(15)) matrix were developed during the MAE optimization. An analysis of variance technique was employed as the data analysis strategy in this study. Determinations of analytes were completed using high-performance liquid chromatography (HPLC) with UV detection. Four carbamates were successfully extracted from soil with recoveries ranging from 85 to 105% with good reproducibility (approximately 4.9% RSD) under the optimum MAE conditions: 30 ml methanol, 80 degrees C extraction temperature, and 6-min microwave heating. An OA8 (2(7)) matrix was employed for the SFE optimization. The average recoveries and RSD of the analytes from spiked soil by SFE were 92 and 5.5%, respectively, except for propham (66.3+/-7.9%), under the following conditions: heating for 30 min at 60 degrees C under supercritical CO2 at 300 kg/cm2 modified with 10% (v/v) methanol. The composition of the supercritical fluid was demonstrated to be a crucial factor in the extraction. The addition of a small volume (10%) of methanol to CO2 greatly enhanced the recoveries of carbamates. A comparison of MAE with SFE was also conducted. The results indicated that >85% average recoveries were obtained by both optimized extraction techniques, and slightly higher recoveries of three carbamates (propoxur, propham and methiocarb) were achieved using MAE. SFE showed slightly higher recovery for chlorpropham (93 vs. 87% for MAE). The effects of time-aged soil on the extraction of analytes were examined and the results obtained by both methods were also compared.
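
    An OA8 (2(7)) matrix such as the one used for the SFE optimization is the standard two-level Taguchi L8 array. A sketch of its construction and of a simple main-effect calculation; the recovery values are invented for illustration:

```python
def oa8():
    """Construct the two-level OA8 (2^7) orthogonal array (Taguchi L8).
    Levels are coded 0/1; columns are XOR combinations of three basis bits,
    so every column is balanced and every pair of columns is orthogonal."""
    runs = []
    for i in range(8):
        a, b, c = (i >> 2) & 1, (i >> 1) & 1, i & 1
        runs.append([a, b, a ^ b, c, a ^ c, b ^ c, a ^ b ^ c])
    return runs

def main_effect(array, responses, col):
    """Difference of mean response between level 1 and level 0 of one column."""
    lo = [r for row, r in zip(array, responses) if row[col] == 0]
    hi = [r for row, r in zip(array, responses) if row[col] == 1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

arr = oa8()
# Hypothetical recoveries (%) for the 8 runs of an extraction screen.
recovery = [85, 88, 90, 93, 95, 97, 99, 102]
effect_col0 = main_effect(arr, recovery, 0)
```

In an OAD study like the one above, ANOVA would then be used to judge which of these main effects are significant.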

  8. Reassessing SERS enhancement factors: using thermodynamics to drive substrate design.

    PubMed

    Guicheteau, J A; Tripathi, A; Emmons, E D; Christesen, S D; Fountain, Augustus W

    2017-12-04

    Over the past 40 years fundamental and application research into Surface-Enhanced Raman Scattering (SERS) has been explored by academia, industry, and government laboratories. To date, however, SERS has achieved little commercial success as an analytical technique. Researchers are tackling a variety of paths to help break through the commercial barrier by addressing the reproducibility of both SERS substrates and SERS signals, as well as continuing to explore the underlying mechanisms. To this end, investigators use a variety of methodologies, typically studying strongly binding analytes such as aromatic thiols and azarenes, and report SERS enhancement factor calculations. However, a drawback of the traditional SERS enhancement factor calculation is that it does not yield enough information to understand substrate reproducibility, application potential with another analyte, or the driving factors behind the molecule-metal interaction. Our work at the US Army Edgewood Chemical Biological Center has focused on these questions, and we have shown that thermodynamic principles play a key role in the SERS response and are an essential factor in future designs of substrates and applications. This work will discuss the advantages and disadvantages of various experimental techniques used to report SERS enhancement with planar SERS substrates and present our alternative SERS enhancement value. We will report on three types of analysis scenarios that all yield different information concerning the effectiveness of the SERS substrate, practical application of the substrate, and finally the thermodynamic properties of the substrate. We believe that through this work a greater understanding of substrate design will be achieved, one that is based on both thermodynamic and plasmonic properties as opposed to plasmonic properties alone. This new understanding and potential change in substrate design will enable more applications for SERS-based methodologies, including targeting molecules that are traditionally not easily detected with SERS due to the perceived weak molecule-metal interaction of substrates.

  9. Hybrid computational and experimental approach for the study and optimization of mechanical components

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    1998-05-01

    Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational modeling, noninvasive optical techniques, and fringe-prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate its viability as an effective engineering tool for analysis and optimization.

  10. Stable isotope methodology in the pharmacokinetic studies of androgenic steroids in humans.

    PubMed

    Shinohara, Y; Baba, S

    1990-04-01

    The use of stable isotopically labeled steroids combined with gas chromatography/mass spectrometry (GC/MS) has found broad application in pharmacologic studies. Initially, stable isotopically labeled steroids served as the ideal analytic internal standard for GC/MS analysis; however, their in vivo use has expanded and has proven to be a powerful pharmacokinetic tool. We have successfully used stable isotope methodology to study the pharmacokinetics and bioavailability of androgens. The primary advantage of the technique is that endogenous and exogenous steroids with the same basic structure can be differentiated by using stable isotopically labeled analogs. The method was used to examine the pharmacokinetics of testosterone and testosterone propionate, and to clarify the influence of endogenous testosterone. Another advantage of the isotope methods is that steroidal drugs can be administered concomitantly in two formulations (e.g., solution and solid dosage). A single set of blood samples serves to describe the time course of the formulations being compared. This stable isotope coadministration technique was used to estimate the relative bioavailability of 17 alpha-methyltestosterone.

  11. A methodology to enhance electromagnetic compatibility in joint military operations

    NASA Astrophysics Data System (ADS)

    Buckellew, William R.

    The development and validation of an improved methodology to identify, characterize, and prioritize potential joint EMI (electromagnetic interference) interactions and identify and develop solutions to reduce the effects of the interference are discussed. The methodology identifies potential EMI problems using results from field operations, historical data bases, and analytical modeling. Operational expertise, engineering analysis, and testing are used to characterize and prioritize the potential EMI problems. Results can be used to resolve potential EMI during the development and acquisition of new systems and to develop engineering fixes and operational workarounds for systems already employed. The analytic modeling portion of the methodology is a predictive process that uses progressive refinement of the analysis and the operational electronic environment to eliminate noninterfering equipment pairs, defer further analysis on pairs lacking operational significance, and resolve the remaining EMI problems. Tests are conducted on equipment pairs to ensure that the analytical models provide a realistic description of the predicted interference.

  12. Methodological issues in microdialysis sampling for pharmacokinetic studies.

    PubMed

    de Lange, E C; de Boer, A G; Breimer, D D

    2000-12-15

    Microdialysis is an in vivo technique that permits monitoring of local concentrations of drugs and metabolites at specific sites in the body. Microdialysis has several characteristics that make it an attractive tool for pharmacokinetic research. About a decade ago the microdialysis technique entered the field of pharmacokinetic research, first in the brain and later also in peripheral tissues and blood. Within this period much has been learned about the proper use of this technique. Today, it has outgrown its early teething problems, and its potential and limitations have become more or less well defined. As microdialysis is a delicate technique for which experimental factors appear to be critical with respect to the validity of the experimental outcomes, several factors should be considered. These include the probe; the perfusion solution; the post-surgery interval in relation to surgical trauma, tissue integrity and repeated experiments; the analysis of microdialysate samples; and the quantification of microdialysate data. Provided that experimental conditions are optimized to give valid and quantitative results, microdialysis can provide numerous data points from a relatively small number of individual animals to determine detailed pharmacokinetic information. One of the added values of this technique compared with other in vivo pharmacokinetic techniques is that microdialysis reflects free concentrations in tissues and plasma. This provides the opportunity to assess drug transport equilibration across membranes such as the blood-brain barrier, which has already yielded new insights. With the progress of analytical methodology, especially with respect to low-volume/low-concentration measurements and the simultaneous measurement of multiple compounds, the applications and importance of the microdialysis technique in pharmacokinetic research will continue to increase.

  13. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    PubMed

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  14. Applying Advanced Analytical Approaches to Characterize the Impact of Specific Clinical Gaps and Profiles on the Management of Rheumatoid Arthritis.

    PubMed

    Ruiz-Cordell, Karyn D; Joubin, Kathy; Haimowitz, Steven

    2016-01-01

    The goal of this study was to add a predictive modeling approach to the meta-analysis of continuing medical education curricula to determine whether this technique can be used to better understand clinical decision making. Using the education of rheumatologists on rheumatoid arthritis management as a model, this study demonstrates how the combined methodology has the ability to not only characterize learning gaps but also identify those proficiency areas that have the greatest impact on clinical behavior. The meta-analysis included seven curricula with 25 activities. Learners who identified as rheumatologists were evaluated across multiple learning domains, using a uniform methodology to characterize learning gains and gaps. A performance composite variable (called the treatment individualization and optimization score) was then established as a target upon which predictive analytics were conducted. Significant predictors of the target included items related to the knowledge of rheumatologists and confidence concerning 1) treatment guidelines and 2) tests that measure disease activity. In addition, a striking demographic predictor related to geographic practice setting was also identified. The results demonstrate the power of advanced analytics to identify key predictors that influence clinical behaviors. Furthermore, the ability to provide an expected magnitude of change if these predictors are addressed has the potential to substantially refine educational priorities to those drivers that, if targeted, will most effectively overcome clinical barriers and lead to the greatest success in achieving treatment goals.

  15. RE-EVALUATION OF APPLICABILITY OF AGENCY SAMPLE HOLDING TIMES

    EPA Science Inventory

    Holding times are the length of time a sample can be stored after collection and prior to analysis without significantly affecting the analytical results. Holding times vary with the analyte, sample matrix, and analytical methodology used to quantify the analyte's concentration. ...

  16. Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.

    PubMed

    Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio

    2009-12-01

    Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.

  17. Design and development of molecularly imprinted polymers for the selective extraction of deltamethrin in olive oil: An integrated computational-assisted approach.

    PubMed

    Martins, Nuno; Carreiro, Elisabete P; Locati, Abel; Ramalho, João P Prates; Cabrita, Maria João; Burke, Anthony J; Garcia, Raquel

    2015-08-28

    This work firstly addresses the design and development of molecularly imprinted systems selective for deltamethrin aiming to provide a suitable sorbent for solid phase (SPE) extraction that will be further used for the implementation of an analytical methodology for the trace analysis of the target pesticide in spiked olive oil samples. To achieve this goal, a preliminary evaluation of the molecular recognition and selectivity of the molecularly imprinted polymers has been performed. In order to investigate the complexity of the mechanistic basis for template selective recognition in these polymeric matrices, the use of a quantum chemical approach has been attempted providing new insights about the mechanisms underlying template recognition, and in particular the crucial role of the crosslinker agent and the solvent used. Thus, DFT calculations corroborate the results obtained by experimental molecular recognition assays enabling one to select the most suitable imprinting system for MISPE extraction technique which encompasses acrylamide as functional monomer and ethylene glycol dimethacrylate as crosslinker. Furthermore, an analytical methodology comprising a sample preparation step based on solid phase extraction has been implemented using this "tailor made" imprinting system as sorbent, for the selective isolation/pre-concentration of deltamethrin from olive oil samples. Molecularly imprinted solid phase extraction (MISPE) methodology was successfully applied for the clean-up of spiked olive oil samples, with recovery rates up to 94%. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Diffusion orientation transform revisited.

    PubMed

    Canales-Rodríguez, Erick Jorge; Lin, Ching-Po; Iturria-Medina, Yasser; Yeh, Chun-Hung; Cho, Kuan-Hung; Melie-García, Lester

    2010-01-15

    Diffusion orientation transform (DOT) is a powerful imaging technique that allows the reconstruction of the microgeometry of fibrous tissues based on diffusion MRI data. The three main error sources involving this methodology are the finite sampling of the q-space, the practical truncation of the series of spherical harmonics and the use of a mono-exponential model for the attenuation of the measured signal. In this work, a detailed mathematical description that provides an extension to the DOT methodology is presented. In particular, the limitations implied by the use of measurements with a finite support in q-space are investigated and clarified, as well as the impact of the harmonic series truncation. Near- and far-field analytical patterns for the diffusion propagator are examined. The near-field pattern makes available the direct computation of the probability of return to the origin. The far-field pattern allows probing the limitations of the mono-exponential model, which suggests the existence of a limit of validity for DOT. In the regime from moderate to large displacement lengths, the isosurfaces of the diffusion propagator reveal aberrations in the form of artifactual peaks. Finally, the major contribution of this work is the derivation of analytical equations that facilitate the accurate reconstruction of some orientational distribution functions (ODFs) and skewness ODFs that are relatively immune to these artifacts. The new formalism was tested using synthetic and real data from a phantom of intersecting capillaries. The results support the hypothesis that the revisited DOT methodology could enhance the estimation of the microgeometry of fiber tissues.
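
    For reference, the mono-exponential (apparent diffusion coefficient) model whose limit of validity is probed above is conventionally written as:

```latex
% Mono-exponential model of the normalized signal attenuation assumed by DOT,
% along direction \mathbf{u} at wavenumber q, with effective diffusion time
% \tau and direction-dependent apparent diffusion coefficient D(\mathbf{u}).
E(q\mathbf{u}) \;=\; \frac{S(q\mathbf{u})}{S(0)}
             \;=\; \exp\!\bigl(-4\pi^{2} q^{2}\,\tau\, D(\mathbf{u})\bigr)
```

The far-field artifacts discussed in the abstract arise where the measured attenuation departs from this single-exponential decay.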

  19. Evaluation and performance of desorption electrospray ionization using a triple quadrupole mass spectrometer for quantitation of pharmaceuticals in plasma.

    PubMed

    Kennedy, Joseph H; Wiseman, Justin M

    2010-02-01

    The present work describes the methodology and investigates the performance of desorption electrospray ionization (DESI) combined with a triple quadrupole mass spectrometer for the quantitation of small drug molecules in human plasma. Amoxepine, atenolol, carbamazepine, clozapine, prazosin, propranolol and verapamil were selected as target analytes, while terfenadine was selected as the internal standard common to each of the analytes. Protein precipitation of human plasma using acetonitrile was utilized for all samples. Limits of detection were determined for all analytes in plasma and shown to be in the range 0.2-40 ng/mL. Quantitative analysis of amoxepine, prazosin and verapamil was performed over the range 20-7400 ng/mL and shown to be linear in all cases with R(2) >0.99. In most cases, the precision (relative standard deviation) and accuracy (relative error) of each method were less than or equal to 20%. The performance of the combined techniques made it possible to analyze each sample in 15 s, illustrating DESI tandem mass spectrometry (MS/MS) as a powerful tool for the quantitation of analytes in deproteinized human plasma. Copyright 2010 John Wiley & Sons, Ltd.
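
    Quantitation against a common internal standard, as above, reduces to fitting the analyte/IS response ratio versus concentration and checking linearity (R(2) > 0.99). A sketch with invented numbers spanning a 20-7400 ng/mL-style range, not the paper's data:

```python
def calibration_r2(conc, analyte_area, istd_area):
    """Fit the response ratio (analyte peak area / internal standard peak area)
    versus concentration; return slope, intercept and R^2 of the line."""
    y = [a / i for a, i in zip(analyte_area, istd_area)]
    n = len(conc)
    mx, my = sum(conc) / n, sum(y) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (yi - my) for x, yi in zip(conc, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (slope * x + intercept)) ** 2 for x, yi in zip(conc, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, intercept, 1 - ss_res / ss_tot

# Toy calibration standards: concentration (ng/mL) and peak areas.
conc = [20, 100, 500, 1000, 3000, 7400]
analyte = [0.9, 4.4, 22.0, 44.5, 133.0, 329.0]
istd = [10.0] * 6  # constant internal-standard response
slope, intercept, r2 = calibration_r2(conc, analyte, istd)
```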

  20. Chemical-mineralogical characterization of C and D waste recycled aggregates from Sao Paulo, Brazil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angulo, S.C.; Ulsen, C.; John, V.M.

    2009-02-15

    This study presents a methodology for the characterization of construction and demolition (C and D) waste recycled aggregates based on a combination of analytical techniques (X-ray fluorescence (XRF), soluble ions, semi-quantitative X-ray diffraction (XRD), thermogravimetric analysis (TGA-DTG) and hydrochloric acid (HCl) selective dissolution). These combined analytical techniques allow for the estimation of the amount of cement paste, its most important hydrated and carbonated phases, as well as the amount of clay and micas. Details of the methodology are presented here and the results of three representative C and D samples taken from the Sao Paulo region in Brazil are discussed. Chemical compositions of mixed C and D aggregate samples have mostly been influenced by particle size rather than the visual classification of C and D into red or grey and geographical origin. The amount of measured soluble salts in C and D aggregates (0.15-25.4 mm) is lower than the usual limits for mortar and concrete production. The content of porous cement paste in the C and D aggregates is around 19.3% (w/w). However, this content is significantly lower than the 43% detected for the C and D powders (<0.15 mm). The clay content of the powders was also high, potentially resulting from soil intermixed with the C and D waste, as well as poorly burnt red ceramic. Since only about 50% of the measured CaO is combined with CO2, the powders have potential use as raw materials for the cement industry.

  1. Correlated Raman micro-spectroscopy and scanning electron microscopy analyses of flame retardants in environmental samples: a micro-analytical tool for probing chemical composition, origin and spatial distribution.

    PubMed

    Ghosal, Sutapa; Wagner, Jeff

    2013-07-07

    We present correlated application of two micro-analytical techniques: scanning electron microscopy/energy dispersive X-ray spectroscopy (SEM/EDS) and Raman micro-spectroscopy (RMS) for the non-invasive characterization and molecular identification of flame retardants (FRs) in environmental dusts and consumer products. The SEM/EDS-RMS technique offers correlated, morphological, molecular, spatial distribution and semi-quantitative elemental concentration information at the individual particle level with micrometer spatial resolution and minimal sample preparation. The presented methodology uses SEM/EDS analyses for rapid detection of particles containing FR specific elements as potential indicators of FR presence in a sample followed by correlated RMS analyses of the same particles for characterization of the FR sub-regions and surrounding matrices. The spatially resolved characterization enabled by this approach provides insights into the distributional heterogeneity as well as potential transfer and exposure mechanisms for FRs in the environment that is typically not available through traditional FR analysis. We have used this methodology to reveal a heterogeneous distribution of highly concentrated deca-BDE particles in environmental dust, sometimes in association with identifiable consumer materials. The observed coexistence of deca-BDE with consumer material in dust is strongly indicative of its release into the environment via weathering/abrasion of consumer products. Ingestion of such enriched FR particles in dust represents a potential for instantaneous exposure to high FR concentrations. Therefore, correlated SEM/RMS analysis offers a novel investigative tool for addressing an area of important environmental concern.

  2. Laser-induced breakdown spectroscopy (LIBS), part II: review of instrumental and methodological approaches to material analysis and applications to different fields.

    PubMed

    Hahn, David W; Omenetto, Nicoló

    2012-04-01

    The first part of this two-part review focused on the fundamental and diagnostics aspects of laser-induced plasmas, only touching briefly upon concepts such as sensitivity and detection limits and largely omitting any discussion of the vast panorama of the practical applications of the technique. Clearly a true LIBS community has emerged, which promises to quicken the pace of LIBS developments, applications, and implementations. With this second part, a more applied flavor is taken, and its intended goal is summarizing the current state-of-the-art of analytical LIBS, providing a contemporary snapshot of LIBS applications, and highlighting new directions in laser-induced breakdown spectroscopy, such as novel approaches, instrumental developments, and advanced use of chemometric tools. More specifically, we discuss instrumental and analytical approaches (e.g., double- and multi-pulse LIBS to improve the sensitivity), calibration-free approaches, hyphenated approaches in which techniques such as Raman and fluorescence are coupled with LIBS to increase sensitivity and information power, resonantly enhanced LIBS approaches, signal processing and optimization (e.g., signal-to-noise analysis), and finally applications. An attempt is made to provide an updated view of the role played by LIBS in the various fields, with emphasis on applications considered to be unique. We finally try to assess where LIBS is going as an analytical field, where in our opinion it should go, and what should still be done for consolidating the technique as a mature method of chemical analysis. © 2012 Society for Applied Spectroscopy

  3. Methodological development of topographic correction in 2D/3D ToF-SIMS images using AFM images

    NASA Astrophysics Data System (ADS)

    Jung, Seokwon; Lee, Nodo; Choi, Myungshin; Lee, Jungmin; Cho, Eunkyunng; Joo, Minho

    2018-02-01

    Time-of-flight secondary-ion mass spectrometry (ToF-SIMS) is an emerging technique that provides chemical information directly from the surfaces of electronic materials, e.g., OLEDs and solar cells. It is a very versatile and highly sensitive mass spectrometric technique that provides surface molecular information, with its lateral distribution, as a two-dimensional (2D) molecular image. Extending the usefulness of ToF-SIMS, a 3D molecular image can be generated by acquiring multiple 2D images in a stack. These ToF-SIMS imaging techniques provide insight into the complex structures of unknown composition in electronic materials. One drawback of ToF-SIMS, however, is that it cannot represent topographic information in 2D and 3D mapping images. To overcome this limitation, topographic information from an ex-situ technique such as atomic force microscopy (AFM) has been combined with chemical information from SIMS, providing both chemical and physical information in one image. The key to combining the two different images obtained from ToF-SIMS and AFM is an image processing algorithm that performs resizing and alignment by comparing specific pixel information in each image. In this work, we present the methodological development of a semiautomatic alignment and 3D structure interpolation system for combining 2D/3D images obtained by ToF-SIMS and AFM measurements, which provides useful analytical information in a single representation.
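
    The resize-and-align step described above can be illustrated with a minimal toy (not the authors' algorithm): assuming both images have already been resampled to a common pixel grid, the remaining integer translation can be found by maximizing normalized cross-correlation over a small search window.

```python
import numpy as np

def align_by_correlation(ref, img, max_shift=3):
    """Find the integer (dy, dx) shift of `img` that best matches `ref`,
    scored by normalized cross-correlation over a small search window."""
    best_score, best_shift = -np.inf, (0, 0)
    a = ref - ref.mean()
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            b = shifted - shifted.mean()
            score = (a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
            if score > best_score:
                best_score, best_shift = score, (dy, dx)
    return best_shift

# demo: a bright square, and the same image displaced by (2, 1) pixels
ref = np.zeros((16, 16))
ref[4:8, 4:8] = 1.0
img = np.roll(np.roll(ref, 2, axis=0), 1, axis=1)
shift = align_by_correlation(ref, img)  # shift that maps img back onto ref
```

    In a real ToF-SIMS/AFM pipeline most of the effort lies in the resizing step that equalizes the two fields of view before a pixel-level match like this is attempted.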

  4. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback of py-GCMS, aside from its destructive nature, is that it is relatively time intensive in comparison to other techniques. Direct analysis in real time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as its rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles, to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of these results, DART-TOFMS may provide an additional tool for the forensic paint examiner.
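
    Principal component analysis of the kind used to compare the two datasets can be sketched with a few lines of linear algebra; the toy "spectra" and group structure below are invented for illustration and are not the study's data.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project samples (rows of X) onto the top principal components
    of the mean-centered data matrix, computed via SVD."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# toy "mass spectra": two groups of samples differing mainly in one peak
rng = np.random.default_rng(0)
group_a = rng.normal(0.0, 0.05, (5, 20)); group_a[:, 3] += 1.0
group_b = rng.normal(0.0, 0.05, (5, 20)); group_b[:, 7] += 1.0
scores = pca_scores(np.vstack([group_a, group_b]))
# the two groups separate along the first principal component
```

    A scores plot of the first two components is the usual way such discrimination is visualized.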

  5. A methodology for the assessment of manned flight simulator fidelity

    NASA Technical Reports Server (NTRS)

    Hess, Ronald A.; Malsbury, Terry N.

    1989-01-01

    A relatively simple analytical methodology for assessing the fidelity of manned flight simulators for specific vehicles and tasks is offered. The methodology is based upon an application of a structural model of the human pilot, including motion cue effects. In particular, predicted pilot/vehicle dynamic characteristics are obtained with and without simulator limitations. A procedure for selecting model parameters can be implemented, given a probable pilot control strategy. In analyzing a pair of piloting tasks for which flight and simulation data are available, the methodology correctly predicted the existence of simulator fidelity problems. The methodology permitted the analytical evaluation of a change in simulator characteristics and indicated that a major source of the fidelity problems was a visual time delay in the simulation.

  6. A review of modern instrumental techniques for measurements of ice cream characteristics.

    PubMed

    Bahram-Parvar, Maryam

    2015-12-01

    There is an increasing demand from the food industries and research institutes for means of measurement allowing the characterization of foods. Ice cream, as a complex food system, consists of a frozen matrix containing air bubbles, fat globules, ice crystals, and an unfrozen serum phase. Deficiencies in conventional methods for testing this product encourage the use of alternative techniques such as rheometry, spectroscopy, X-ray, electro-analytical techniques, ultrasound, and laser. Despite the development of novel instrumental applications in food science, use of some of them in ice cream testing remains limited, though it has shown promising results. Developing these novel methods should increase our understanding of the characteristics of ice cream and may allow online testing of the product. This review article discusses the potential of destructive and non-destructive methodologies for determining the quality and characteristics of ice cream and similar products. Copyright © 2015. Published by Elsevier Ltd.

  7. Volatile organic compounds: sampling methods and their worldwide profile in ambient air.

    PubMed

    Kumar, Anuj; Víden, Ivan

    2007-08-01

    The atmosphere is a particularly difficult analytical system because of the very low levels of the substances to be analysed; sharp variations in pollutant levels with time and location; and differences in wind, temperature and humidity. This makes the selection of an efficient sampling technique a key step toward reliable air analysis results. Generally, methods for sampling volatile organic compounds involve collection of whole air or preconcentration of samples on adsorbents. The methods differ from each other in sampling technique, type of sorbent, method of extraction and identification technique. In this review paper we discuss various important aspects of sampling volatile organic compounds by the widely used and more advanced sampling methods. Characteristics of the various adsorbents used for VOC sampling are also described. Furthermore, this paper makes an effort to comprehensively review the concentration levels of volatile organic compounds in major cities of the world, along with the methodologies used for analysis.

  8. Integrated HPTLC-based Methodology for the Tracing of Bioactive Compounds in Herbal Extracts Employing Multivariate Chemometrics. A Case Study on Morus alba.

    PubMed

    Chaita, Eliza; Gikas, Evagelos; Aligiannis, Nektarios

    2017-03-01

    Background - In drug discovery, bioassay-guided isolation is a well-established procedure and still the basic approach for the discovery of natural products with desired biological properties. In these procedures, however, the most laborious and time-consuming step is the isolation of the bioactive constituents. A prior identification of the compounds that contribute to the demonstrated activity of the fractions would enable the selection of proper chromatographic techniques and lead to targeted isolation. Objective - The development of an integrated HPTLC-based methodology for the rapid tracing of bioactive compounds during bioassay-guided processes, using multivariate statistics. Materials and Methods - The methanol extract of Morus alba was fractionated employing CPC. Subsequently, fractions were assayed for tyrosinase inhibition and analyzed with HPTLC. The PLS-R algorithm was applied in order to correlate the analytical data with the biological response of the fractions and identify the compounds with the highest contribution. Two methodologies were developed for the generation of the dataset: one based on manual peak picking and the second based on chromatogram binning. Results and Discussion - Both methodologies afforded comparable results and were able to trace the bioactive constituents (e.g. oxyresveratrol, trans-dihydromorin, 2,4,3'-trihydroxydihydrostilbene). The suggested compounds were compared, in terms of Rf values and UV spectra, with compounds isolated from M. alba using a typical bioassay-guided process. Conclusion - Chemometric tools supported the development of a novel HPTLC-based methodology for the tracing of tyrosinase inhibitors in M. alba extract. All steps of the experimental procedure implemented techniques that afford essential key elements for application in high-throughput screening procedures for drug discovery purposes. Copyright © 2017 John Wiley & Sons, Ltd.
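
    The chromatogram-binning variant mentioned above amounts to summing a 1-D trace into fixed-width bins to build one feature vector per fraction; a minimal sketch with invented Rf positions and intensities (not data from the study):

```python
def bin_signal(positions, intensities, bin_width):
    """Sum a 1-D signal (e.g. a densitometric HPTLC trace) into
    fixed-width bins along its position axis."""
    n_bins = int(max(positions) // bin_width) + 1
    out = [0.0] * n_bins
    for pos, inten in zip(positions, intensities):
        out[int(pos // bin_width)] += inten
    return out

# hypothetical peak positions (Rf) and intensities for one fraction
rf = [0.12, 0.17, 0.46, 0.83]
intensity = [10.0, 5.0, 20.0, 7.0]
features = bin_signal(rf, intensity, bin_width=0.1)
```

    Stacking such vectors for all fractions yields the X matrix that a PLS-R model can regress against the measured bioactivity.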

  9. Analysis of Combined Data from Heterogeneous Study Designs: A Methodological Proposal from the Patient Navigation Research program

    PubMed Central

    Roetzheim, Richard G.; Freund, Karen M.; Corle, Don K.; Murray, David M.; Snyder, Frederick R.; Kronman, Andrea C.; Jean-Pierre, Pascal; Raich, Peter C.; Holden, Alan E. C.; Darnell, Julie S.; Warren-Mears, Victoria; Patierno, Steven; PNRP Design and Analysis Committee

    2013-01-01

    Background - The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, each employing its own unique study design. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is, however, no agreed-upon prospective methodology for analyzing combined data arising from different study designs. Expert opinions were thus solicited from members of the PNRP Design and Analysis Committee. Purpose - To review possible methodologies for analyzing combined data arising from heterogeneous study designs. Methods - The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. Conclusions were based on simple consensus. The five approaches reviewed were: 1) analyzing and reporting each project separately; 2) combining data from all projects and performing an individual-level analysis; 3) pooling data from projects having similar study designs; 4) analyzing pooled data using a prospective meta-analytic technique; 5) analyzing pooled data utilizing a novel simulated group-randomized design. Results - Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and in their sensitivity to differing project sample sizes. Limitations - The conclusions reached were based on expert opinion and not derived from actual analyses performed. Conclusions - The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multi-site community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design.
Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs. PMID:22273587
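
    As a rough illustration of approach 4, fixed-effect (inverse-variance) pooling of per-project effect estimates might look like the following; the estimates and standard errors are hypothetical, not PNRP results.

```python
import math

def fixed_effect_pool(estimates, std_errors):
    """Inverse-variance (fixed-effect) pooling: weight each project's
    effect estimate by 1/SE^2; return the pooled estimate and its SE."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# three hypothetical project-level effects (e.g. log odds ratios)
est, se = fixed_effect_pool([0.30, 0.10, 0.25], [0.10, 0.15, 0.20])
```

    A prospective meta-analysis would fix such a pooling rule before any project data were analyzed, which is what distinguishes it from an after-the-fact literature synthesis.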

  10. Comparison Between Laser Scanning and Automated 3D Modelling Techniques to Reconstruct Complex and Extensive Cultural Heritage Areas

    NASA Astrophysics Data System (ADS)

    Fassi, F.; Fregonese, L.; Ackermann, S.; De Troia, V.

    2013-02-01

    In the Cultural Heritage field, there is a growing need to survey objects quickly, with the ability to repeat the measurements several times for deformation or degradation monitoring. In this paper, two significant cases, one architectural and one archaeological, are presented. In both, different constraints and emergency situations required finding the optimal solution for a quick and timely survey leading to a complete digital reconstruction of the object. In both cases, two survey methods were tested and used: a laser scanning approach, which yields high-resolution and complete scans within a short time, and a photogrammetric one, which reconstructs the object in three dimensions from images. In recent months, several methodologies, including free or low-cost techniques, have emerged. This kind of software allows the fully automatic three-dimensional reconstruction of objects from images, returning a dense point cloud and, in some cases, a surfaced mesh model. In this paper, some comparisons between the two methodologies mentioned above are presented, using real case studies as examples. The surveys were performed employing both photogrammetry and laser scanning techniques. The methodological and operational choices, depending on the required goal, the difficulties encountered during the survey with these methods, the execution time (the key parameter), and finally the obtained results are fully described and examined. On the final 3D models, an analytical comparison was made to analyse the differences, the tolerances, the possibilities for accuracy improvement and future developments.
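
    The final analytical comparison between the two models typically reduces to cloud-to-cloud deviations; a brute-force sketch with invented coordinates (a real survey would use a k-d tree and millions of points):

```python
import math

def cloud_to_cloud_distances(cloud_a, cloud_b):
    """For each point of cloud_a, the distance to its nearest
    neighbor in cloud_b (brute force, O(n*m))."""
    return [min(math.dist(p, q) for q in cloud_b) for p in cloud_a]

# toy example: a scan and a photogrammetric model offset by 2 cm in z
scan = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
model = [(x, y, z + 0.02) for (x, y, z) in scan]
deviations = cloud_to_cloud_distances(scan, model)
mean_dev = sum(deviations) / len(deviations)
```

    Histograms of such deviations are the usual way tolerances between a laser scan and a photogrammetric model are reported.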

  11. Simulation of wind turbine wakes using the actuator line technique.

    PubMed

    Sørensen, Jens N; Mikkelsen, Robert F; Henningson, Dan S; Ivanell, Stefan; Sarmast, Sasan; Andersen, Søren J

    2015-02-28

    The actuator line technique was introduced as a numerical tool to be employed in combination with large eddy simulations to enable the study of wakes and wake interaction in wind farms. The technique is today widely used for studying basic features of wakes as well as for making performance predictions of wind farms. In this paper, we give a short introduction to the wake problem and the actuator line methodology and present a study in which the technique is employed to determine the near-wake properties of wind turbines. The presented results include a comparison with experimental measurements of the wake characteristics of the flow around a three-bladed model wind turbine, the development of a simple analytical formula for determining the near-wake length behind a wind turbine, and a detailed investigation of wake structures based on proper orthogonal decomposition analysis of numerically generated snapshots of the wake. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
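
    The proper orthogonal decomposition step mentioned above can be sketched as an SVD of the fluctuation snapshot matrix; the synthetic "wake" data below are invented for illustration and are not the paper's simulations.

```python
import numpy as np

def pod_modes(snapshots):
    """POD of a snapshot matrix whose columns are flattened flow-field
    snapshots: subtract the temporal mean, take the SVD, and return the
    spatial modes plus the fraction of fluctuation energy per mode."""
    fluctuations = snapshots - snapshots.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(fluctuations, full_matrices=False)
    energy = s ** 2 / (s ** 2).sum()
    return U, energy

# synthetic snapshots: one dominant oscillating structure plus noise
rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
t = np.linspace(0.0, 2.0 * np.pi, 40)
field = np.outer(np.sin(2.0 * np.pi * x), np.cos(t))
field += 0.01 * rng.normal(size=field.shape)
modes, energy = pod_modes(field)  # the first mode dominates the energy
```

    Ranking modes by their energy fraction is what lets a few coherent structures summarize a large set of snapshots.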

  12. Strategy for determination of LOD and LOQ values--some basic aspects.

    PubMed

    Uhrovčík, Jozef

    2014-02-01

    The paper is devoted to the evaluation of limit of detection (LOD) and limit of quantification (LOQ) values in the concentration domain using four different approaches: the 3σ and 10σ approaches, the ULA2 approach, the PBA approach and the MDL approach. Brief theoretical analyses of all the above-mentioned approaches are given, together with directions for their practical use. Calculations and correct calibration design are exemplified using electrothermal atomic absorption spectrometry for the determination of lead in a drinking water sample. These validation parameters reached 1.6 μg L(-1) (LOD) and 5.4 μg L(-1) (LOQ) using the 3σ and 10σ approaches. To obtain relevant values of analyte concentration, the influence of calibration design and measurement methodology was examined. The preferred technique proved to be preconcentration of the analyte on the surface of the graphite cuvette (boost cycle). © 2013 Elsevier B.V. All rights reserved.
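
    The 3σ and 10σ approaches reduce to a short calculation once the blank standard deviation and the calibration slope are known; the blank readings and slope below are hypothetical and do not reproduce the paper's data.

```python
import statistics

def lod_loq(blank_signals, slope):
    """3σ/10σ limits: sigma is the standard deviation of replicate blank
    signals, converted to concentration units via the calibration slope."""
    sigma = statistics.stdev(blank_signals)
    return 3.0 * sigma / slope, 10.0 * sigma / slope

# hypothetical blank absorbances and slope (absorbance per µg/L)
blanks = [0.0021, 0.0019, 0.0024, 0.0018, 0.0022, 0.0020]
lod, loq = lod_loq(blanks, slope=0.0004)  # concentrations in µg/L
```

    By construction the LOQ is always 10/3 of the LOD in this approach; the other approaches named in the abstract (ULA2, PBA, MDL) use different statistics.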

  13. Development and optimization of an energy-regenerative suspension system under stochastic road excitation

    NASA Astrophysics Data System (ADS)

    Huang, Bo; Hsieh, Chen-Yu; Golnaraghi, Farid; Moallem, Mehrdad

    2015-11-01

    In this paper a vehicle suspension system with energy harvesting capability is developed, and an analytical methodology for the optimal design of the system is proposed. The optimization technique provides design guidelines for determining the stiffness and damping coefficients aimed at optimal performance in terms of ride comfort and energy regeneration. The corresponding performance metrics are the root mean square (RMS) of the sprung-mass acceleration and the expectation of the generated power. Actual road roughness is considered as a stochastic excitation defined by the ISO 8608:1995 standard road profiles and used in deriving the optimization method. An electronic circuit is proposed to provide variable damping in real time based on the optimization rule. A test bed is utilized, and experiments under different driving conditions are conducted to verify the effectiveness of the proposed method. The test results suggest that the analytical approach is credible in determining the optimality of system performance.
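
    The two performance metrics named above are simple functionals of sampled signals; the sample values and damping coefficient below are invented for illustration, and the power model assumes an idealized lossless viscous damper rather than the paper's circuit.

```python
import math

def rms(samples):
    """Root mean square of a sampled signal (e.g. sprung-mass acceleration)."""
    return math.sqrt(sum(v * v for v in samples) / len(samples))

def mean_power(damping_c, rel_velocities):
    """Average dissipated (harvestable) power of a viscous damper,
    P = c * v^2, over sampled relative velocities."""
    return damping_c * sum(v * v for v in rel_velocities) / len(rel_velocities)

accel = [0.1, -0.2, 0.15, -0.05, 0.12]      # m/s^2, hypothetical samples
rel_vel = [0.05, -0.08, 0.06, -0.04, 0.07]  # m/s, hypothetical samples
comfort_metric = rms(accel)
power = mean_power(1200.0, rel_vel)         # c = 1200 N*s/m, hypothetical
```

    The design trade-off is then between lowering `comfort_metric` and raising `power` as the stiffness and damping coefficients vary.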

  14. Pharmaceuticals in biota in the aquatic environment: analytical methods and environmental implications.

    PubMed

    Huerta, B; Rodríguez-Mozaz, S; Barceló, D

    2012-11-01

    The presence of pharmaceuticals in the aquatic environment is an ever-increasing issue of concern, as they are designed to target specific metabolic and molecular pathways in organisms and may therefore have unintended effects on nontarget species. Information on the presence of pharmaceuticals in biota is still scarce, but the scientific literature on the subject has established the possibility of bioaccumulation in exposed aquatic organisms through other environmental compartments. However, few studies have correlated the bioaccumulation of pharmaceutical compounds with its consequent effects. Analytical methodology to detect pharmaceuticals at trace quantities in biota has advanced significantly in the last few years. Nonetheless, there are still unresolved analytical challenges associated with the complexity of biological matrices, which require exhaustive extraction and purification steps, and highly sensitive and selective detection techniques. This review presents the trends in the analysis of pharmaceuticals in aquatic organisms in the last decade, recent data on the occurrence of these compounds in natural biota, and the environmental implications that chronic exposure could have for aquatic wildlife.

  15. Using object-oriented analysis to design a multi-mission ground data system

    NASA Technical Reports Server (NTRS)

    Shames, Peter

    1995-01-01

    This paper describes an analytical approach and descriptive methodology that is adapted from Object-Oriented Analysis (OOA) techniques. The technique is described and then used to communicate key issues of system logical architecture. The essence of the approach is to limit the analysis to only service objects, with the idea of providing a direct mapping from the design to a client-server implementation. Key perspectives on the system, such as user interaction, data flow and management, service interfaces, hardware configuration, and system and data integrity, are covered. A significant advantage of this service-oriented approach is that it permits mapping all of these different perspectives on the system onto a single common substrate. This services substrate is readily represented diagrammatically, thus making details of the overall design much more accessible.

  16. Advances in functional X-ray imaging techniques and contrast agents

    PubMed Central

    Chen, Hongyu; Rogalski, Melissa M.

    2012-01-01

    X-rays have been used for non-invasive high-resolution imaging of thick biological specimens since their discovery in 1895. They are widely used for structural imaging of bone, metal implants, and cavities in soft tissue. Recently, a number of new contrast methodologies have emerged which are expanding X-ray’s biomedical applications to functional as well as structural imaging. These techniques are promising to dramatically improve our ability to study in situ biochemistry and disease pathology. In this review, we discuss how X-ray absorption, X-ray fluorescence, and X-ray excited optical luminescence can be used for physiological, elemental, and molecular imaging of vasculature, tumours, pharmaceutical distribution, and the surface of implants. Imaging of endogenous elements, exogenous labels, and analytes detected with optical indicators will be discussed. PMID:22962667

  17. NASA's post-Challenger safety program - Themes and thrusts

    NASA Technical Reports Server (NTRS)

    Rodney, G. A.

    1988-01-01

    The range of managerial, technical, and procedural initiatives implemented by NASA's post-Challenger safety program is reviewed. The recommendations made by the Rogers Commission, the NASA post-Challenger review of Shuttle design, the Congressional investigation of the accident, the National Research Council, the Aerospace Safety Advisory Panel, and NASA internal advisory panels and studies are summarized. NASA safety initiatives regarding improved organizational accountability for safety, upgraded analytical techniques and methodologies for risk assessment and management, procedural initiatives in problem reporting and corrective-action tracking, ground processing, maintenance documentation, and improved technologies are discussed. Safety issues relevant to the planned Space Station are examined.

  18. An expert system for municipal solid waste management simulation analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hsieh, M.C.; Chang, N.B.

    1996-12-31

    Optimization techniques have usually been used to model complicated metropolitan solid waste management systems in the search for the best dynamic combination of waste recycling, facility siting, and system operation, where sophisticated and well-defined interrelationships are required in the modeling process. This paper instead applies Concurrent Object-Oriented Simulation (COOS), a new simulation software construction method, to bridge the gap between the physical system and its computer representation. A case study of the Kaohsiung solid waste management system in Taiwan illustrates the analytical methodology of COOS and its implementation in the creation of an expert system.

  19. Enabling Data-Driven Methodologies Across the Data Lifecycle and Ecosystem

    NASA Astrophysics Data System (ADS)

    Doyle, R. J.; Crichton, D.

    2017-12-01

    NASA has unlocked unprecedented scientific knowledge through exploration of the Earth, our solar system, and the larger universe. NASA is generating enormous amounts of data that are challenging traditional approaches to capturing, managing, analyzing and ultimately gaining scientific understanding from science data. New architectures, capabilities and methodologies are needed to span the entire observing system, from spacecraft to archive, while integrating data-driven discovery and analytic capabilities. NASA data have a definable lifecycle, from remote collection point to validated accessibility in multiple archives. Data challenges must be addressed across this lifecycle, to capture opportunities and avoid decisions that may limit or compromise what is achievable once data arrives at the archive. Data triage may be necessary when the collection capacity of the sensor or instrument overwhelms data transport or storage capacity. By migrating computational and analytic capability to the point of data collection, informed decisions can be made about which data to keep; in some cases, to close observational decision loops onboard, to enable attending to unexpected or transient phenomena. Along a different dimension than the data lifecycle, scientists and other end-users must work across an increasingly complex data ecosystem, where the range of relevant data is rarely owned by a single institution. To operate effectively, scalable data architectures and community-owned information models become essential. NASA's Planetary Data System is having success with this approach. Finally, there is the difficult challenge of reproducibility and trust. While data provenance techniques will be part of the solution, future interactive analytics environments must support an ability to provide a basis for a result: relevant data source and algorithms, uncertainty tracking, etc., to assure scientific integrity and to enable confident decision making. 
Advances in data science offer opportunities to gain new insights from space missions and their vast data collections. We are working to innovate new architectures, exploit emerging technologies, develop new data-driven methodologies, and transfer them across disciplines, while working across the dual dimensions of the data lifecycle and the data ecosystem.

  20. Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.

    PubMed

    Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian

    2017-09-12

    Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem-solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.

  1. INVESTIGATING ENVIRONMENTAL SINKS OF MACROLIDE ...

    EPA Pesticide Factsheets

    Possible environmental sinks (wastewater effluents, biosolids, sediments) of macrolide antibiotics (i.e., azithromycin, roxithromycin and clarithromycin) are investigated using state-of-the-art analytical chemistry techniques. The research focus of the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, the Office of Water, and ORD in the area of Water Quality. Within the subtasks are the various research projects being performed in support of this Task, with more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than parts per billion, ppb). IAG with USGS ends in FY05. APM 20 due in FY05. Subtask 2: Coordination of interagency research and public outreach activities for PPCPs. Participate on NSTC Health and Environment subcommittee working group on PPCPs. Web site maintenance and expansion, invited technical presentations, invited articles for peer-reviewed journals, interviews

  2. Moving alcohol prevention research forward-Part I: introducing a complex systems paradigm.

    PubMed

    Apostolopoulos, Yorghos; Lemke, Michael K; Barry, Adam E; Lich, Kristen Hassmiller

    2018-02-01

    The drinking environment is a complex system consisting of a number of heterogeneous, evolving and interacting components, which exhibit circular causality and emergent properties. These characteristics reduce the efficacy of commonly used research approaches, which typically do not account for the underlying dynamic complexity of alcohol consumption and the interdependent nature of diverse factors influencing misuse over time. We use alcohol misuse among college students in the United States as an example for framing our argument for a complex systems paradigm. A complex systems paradigm, grounded in socio-ecological and complex systems theories and computational modeling and simulation, is introduced. Theoretical, conceptual, methodological and analytical underpinnings of this paradigm are described in the context of college drinking prevention research. The proposed complex systems paradigm can transcend limitations of traditional approaches, thereby fostering new directions in alcohol prevention research. By conceptualizing student alcohol misuse as a complex adaptive system, computational modeling and simulation methodologies and analytical techniques can be used. Moreover, use of participatory model-building approaches to generate simulation models can further increase stakeholder buy-in, understanding and policymaking. A complex systems paradigm for research into alcohol misuse can provide a holistic understanding of the underlying drinking environment and its long-term trajectory, which can elucidate high-leverage preventive interventions. © 2017 Society for the Study of Addiction.

  3. Fast and Simple Discriminative Analysis of Anthocyanins-Containing Berries Using LC/MS Spectral Data.

    PubMed

    Yang, Heejung; Kim, Hyun Woo; Kwon, Yong Soo; Kim, Ho Kyong; Sung, Sang Hyun

    2017-09-01

    Anthocyanins are potent antioxidant agents that protect against many degenerative diseases; however, they are unstable because they are vulnerable to external stimuli, including temperature, pH and light. This vulnerability hinders the quality control of anthocyanin-containing berries using classical high-performance liquid chromatography (HPLC) analytical methodologies based on UV or MS chromatograms. To develop an alternative approach for the quality assessment and discrimination of anthocyanin-containing berries, we used MS spectral data acquired in a short analytical time rather than UV or MS chromatograms. Mixtures of anthocyanins were separated from other components in a short gradient time (5 min) owing to their higher polarity, and the representative MS spectrum was acquired from the region of the MS chromatogram corresponding to the mixture of anthocyanins. The chemometric data from the representative MS spectra contained reliable information for the identification and relative quantification of anthocyanins in berries with good precision and accuracy. This fast and simple methodology, consisting of a simple sample preparation method and a short gradient analysis, could be applied to reliably discriminate the species and geographical origins of different anthocyanin-containing berries. These features make the technique useful for the food industry. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    PubMed

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Low recovery yielding methods could give a false idea of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) media with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium.
From the different procedures assayed, only ultrasound assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: endosulfan isomers (α & β) and its metabolites (endosulfan sulfate, ether and diol) as well as for chlorpyrifos. In the first laboratory evaluation of these biobeds endosulfan was bioconverted up to 87% and chlorpyrifos more than 79% after 27 days. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Development and characterization of antibody reagents for detecting nanoparticles

    NASA Astrophysics Data System (ADS)

    Ravichandran, Supriya; Sullivan, Mark A.; Callahan, Linda M.; Bentley, Karen L.; Delouise, Lisa A.

    2015-11-01

    The increasing use of nanoparticles (NPs) in technological applications and in commercial products has escalated environmental health and safety concerns. The detection of NPs in the environment and in biological systems is challenged by limitations associated with commonly used analytical techniques. In this paper we report on the development and characterization of NP binding antibodies, termed NProbes. Phage display methodology was used to discover antibodies that bind NPs dispersed in solution. We present a proof-of-concept for the generation of NProbes and their use for detecting quantum dots and titanium dioxide NPs in vitro and in an ex vivo human skin model. Continued development and refinement of NProbes to detect NPs that vary in composition, shape, size, and surface coating will comprise a powerful tool kit that can be used to advance nanotechnology research particularly in the nanotoxicology and nanotherapeutics fields.
Electronic supplementary information (ESI) available: Figures and detailed methods of various techniques used. See DOI: 10.1039/c5nr04882f

  6. Need for evaluative methodologies in land use, regional resource and waste management planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croke, E. J.

    The transfer of planning methodology from the research community to the practitioner very frequently takes the form of analytical and evaluative techniques and procedures. In the end, these become operational in the form of data acquisition, management and display systems, computational schemes that are codified in the form of manuals and handbooks, and computer simulation models. The complexity of the socioeconomic and physical processes that govern environmental resource and waste management has reinforced the need for computer-assisted, scientifically sophisticated planning models that are fully operational, dependent on an attainable data base and accessible in terms of the resources normally available to practitioners of regional resource management, waste management, and land use planning. A variety of models and procedures that attempt to meet one or more of the needs of these practitioners are discussed.

  7. Measuring allostatic load in the workforce: a systematic review

    PubMed Central

    MAUSS, Daniel; LI, Jian; SCHMIDT, Burkhard; ANGERER, Peter; JARCZOK, Marc N.

    2014-01-01

    The Allostatic Load Index (ALI) has been used to establish associations between stress and health-related outcomes. This review summarizes the measurement and methodological challenges of allostatic load in occupational settings. Databases of Medline, PubPsych, and Cochrane were searched to systematically explore studies measuring ALI in working adults following the PRISMA statement. Study characteristics, biomarkers and methods were tabulated. Methodological quality was evaluated using a standardized checklist. Sixteen articles (2003–2013) met the inclusion criteria, with a total of 39 (range 6–17) different variables used to calculate ALI. Substantial heterogeneity was observed in the number and type of biomarkers used, the analytic techniques applied and study quality. Particularly, primary mediators were not regularly included in ALI calculation. Consensus on methods to measure ALI in working populations is limited. Research should include longitudinal studies using multi-systemic variables to measure employees at risk for biological wear and tear. PMID:25224337

  8. Electromagnetic interference of cardiac rhythmic monitoring devices to radio frequency identification: analytical analysis and mitigation methodology.

    PubMed

    Ogirala, Ajay; Stachel, Joshua R; Mickle, Marlin H

    2011-11-01

    Increasing density of wireless communication and the development of radio frequency identification (RFID) technology in particular have increased the susceptibility of patients equipped with cardiac rhythmic monitoring devices (CRMD) to environmental electromagnetic interference (EMI). Several organizations reported observing CRMD EMI from different sources. This paper focuses on mathematically analyzing the energy as perceived by the implanted device, i.e., voltage. Radio frequency (RF) energy transmitted by RFID interrogators is considered as an example. A simplified front-end equivalent circuit of a CRMD sensing circuitry is proposed for the analysis following extensive black-box testing of several commercial pacemakers and implantable defibrillators. After careful study of the mechanics of the CRMD signal processing in identifying the QRS complex of the heartbeat, a mitigation technique is proposed. The mitigation methodology introduced in this paper is logical in approach, simple to implement and is therefore applicable to all wireless communication protocols.

  9. Meta-analyses are no substitute for registered replications: a skeptical perspective on religious priming

    PubMed Central

    van Elk, Michiel; Matzke, Dora; Gronau, Quentin F.; Guan, Maime; Vandekerckhove, Joachim; Wagenmakers, Eric-Jan

    2015-01-01

    According to a recent meta-analysis, religious priming has a positive effect on prosocial behavior (Shariff et al., 2015). We first argue that this meta-analysis suffers from a number of methodological shortcomings that limit the conclusions that can be drawn about the potential benefits of religious priming. Next we present a re-analysis of the religious priming data using two different meta-analytic techniques. A Precision-Effect Testing–Precision-Effect-Estimate with Standard Error (PET-PEESE) meta-analysis suggests that the effect of religious priming is driven solely by publication bias. In contrast, an analysis using Bayesian bias correction suggests the presence of a religious priming effect, even after controlling for publication bias. These contradictory statistical results demonstrate that meta-analytic techniques alone may not be sufficiently robust to firmly establish the presence or absence of an effect. We argue that a conclusive resolution of the debate about the effect of religious priming on prosocial behavior – and about theoretically disputed effects more generally – requires a large-scale, preregistered replication project, which we consider to be the sole remedy for the adverse effects of experimenter bias and publication bias. PMID:26441741
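    The PET half of the PET-PEESE procedure mentioned above is a weighted least-squares regression of observed effect sizes on their standard errors, with the intercept serving as the bias-corrected effect estimate. A minimal generic sketch follows; the six (effect, SE) pairs are invented for illustration, not data from the cited meta-analysis:

```python
# PET (Precision-Effect Test): regress effect sizes on their standard
# errors with precision weights 1/SE^2. A positive slope signals
# small-study (publication) bias; the intercept estimates the effect
# after that bias is removed. All numbers below are made up.
effects = [0.42, 0.35, 0.28, 0.15, 0.10, 0.05]
ses     = [0.30, 0.25, 0.20, 0.12, 0.08, 0.05]

w = [1.0 / s ** 2 for s in ses]                 # precision weights
sw = sum(w)
xbar = sum(wi * x for wi, x in zip(w, ses)) / sw
ybar = sum(wi * y for wi, y in zip(w, effects)) / sw
slope = (sum(wi * (x - xbar) * (y - ybar)
             for wi, x, y in zip(w, ses, effects))
         / sum(wi * (x - xbar) ** 2 for wi, x in zip(w, ses)))
intercept = ybar - slope * xbar   # PET estimate of the corrected effect
```

    With these invented data the slope is clearly positive and the intercept sits near zero, which is exactly the pattern PET reads as "effect driven by publication bias" (PEESE repeats the fit with SE squared as the predictor).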

  10. Cost and Schedule Analytical Techniques Development

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This Final Report summarizes the activities performed by Science Applications International Corporation (SAIC) under contract NAS 8-40431 "Cost and Schedule Analytical Techniques Development Contract" (CSATD) during Option Year 3 (December 1, 1997 through November 30, 1998). This Final Report is in compliance with Paragraph 5 of Section F of the contract. This CSATD contract provides technical products and deliverables in the form of parametric models, databases, methodologies, studies, and analyses to the NASA Marshall Space Flight Center's (MSFC) Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) and other user organizations. Detailed Monthly Reports were submitted to MSFC in accordance with the contract's Statement of Work, Section IV "Reporting and Documentation". These reports spelled out each month's specific work performed, deliverables submitted, major meetings conducted, and other pertinent information. Therefore, this Final Report will summarize these activities at a higher level. During this contract Option Year, SAIC expended 25,745 hours in the performance of tasks called out in the Statement of Work. This represents approximately 14 full-time equivalents. Included are the Huntsville-based team, plus SAIC specialists in San Diego, Ames Research Center, Tampa, and Colorado Springs performing specific tasks for which they are uniquely qualified.

  11. An analytical procedure to assist decision-making in a government research organization

    Treesearch

    H. Dean Claxton; Giuseppe Rensi

    1972-01-01

    An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology are drawn from economics and mathematical programming. The major analytical aspects distinguishing research...

  12. Analytical and simulator study of advanced transport

    NASA Technical Reports Server (NTRS)

    Levison, W. H.; Rickard, W. W.

    1982-01-01

    An analytic methodology, based on the optimal-control pilot model, was demonstrated for assessing longitudinal-axis handling qualities of transport aircraft in final approach. Calibration of the methodology is largely in terms of closed-loop performance requirements, rather than specific vehicle response characteristics, and is based on a combination of published criteria, pilot preferences, physical limitations, and engineering judgment. Six longitudinal-axis approach configurations were studied covering a range of handling qualities problems, including the presence of flexible aircraft modes. The analytical procedure was used to obtain predictions of Cooper-Harper ratings, a scalar quadratic performance index, and rms excursions of important system variables.

  13. Single-step affinity purification of enzyme biotherapeutics: a platform methodology for accelerated process development.

    PubMed

    Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean

    2014-01-01

    Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ikonomou, M.G.; Crewe, N.F.; Fischer, M.

    It has been demonstrated that marine mammals accumulate high concentrations of lipophilic organochlorine contaminants in blubber. As predators at high trophic levels, they have also been used to evaluate contamination in the marine environment. Sampling of living marine mammals using a microsample (100 to 200 mg) biopsy dart technique offers a potentially invaluable means of assessing levels and types of persistent environmental pollutants from a sample in which age, sex and other genetic information can additionally be ascertained. The authors have explored analytical methodology based on a high-sensitivity detection system (HRGC/HRMS) which provides multi-residue determinations from biopsy dart microsamples. Lipid content and the concentrations of PCDDs, PCDFs and non-ortho and mono-ortho substituted PCBs were measured in 100 mg biopsy dart replicates taken from a killer whale carcass and in three strata of the blubber of that carcass. Statistically acceptable results were obtained from the dart replicates, which compared very well with those of the blubber strata. Analytical data from 100 mg extractions from an established in-house blubber CRM also compared well against a series of 2.5 g extractions of that CRM. The extraction and cleanup procedures used also allow for the determination of other organohalogen contaminants such as DDT and other pesticides, all the remaining PCBs, polychlorinated diphenylethers and brominated residues. The strengths and limitations of the analytical methodology and of the biopsy dart as a sampling tool and pollution predictor will be illustrated in terms of method accuracy and precision, glassware and procedural blanks associated with each extraction batch, and the incorporation of an in-house micro reference standard.

  15. LOX/hydrocarbon rocket engine analytical design methodology development and validation. Volume 1: Executive summary and technical narrative

    NASA Technical Reports Server (NTRS)

    Pieper, Jerry L.; Walker, Richard E.

    1993-01-01

    During the past three decades, an enormous amount of resources were expended in the design and development of Liquid Oxygen/Hydrocarbon and Hydrogen (LOX/HC and LOX/H2) rocket engines. A significant portion of these resources were used to develop and demonstrate the performance and combustion stability for each new engine. During these efforts, many analytical and empirical models were developed that characterize design parameters and combustion processes that influence performance and stability. Many of these models are suitable as design tools, but they have not been assembled into an industry-wide usable analytical design methodology. The objective of this program was to assemble existing performance and combustion stability models into a usable methodology capable of producing high performing and stable LOX/hydrocarbon and LOX/hydrogen propellant booster engines.

  16. Application of person-centered analytic methodology in longitudinal research: exemplars from the Women's Health Initiative Clinical Trial data.

    PubMed

    Zaslavsky, Oleg; Cochrane, Barbara B; Herting, Jerald R; Thompson, Hilaire J; Woods, Nancy F; Lacroix, Andrea

    2014-02-01

    Despite the variety of available analytic methods, longitudinal research in nursing has been dominated by use of a variable-centered analytic approach. The purpose of this article is to present the utility of person-centered methodology using a large cohort of American women 65 and older enrolled in the Women's Health Initiative Clinical Trial (N = 19,891). Four distinct trajectories of energy/fatigue scores were identified. Levels of fatigue were closely linked to age, socio-demographic factors, comorbidities, health behaviors, and poor sleep quality. These findings were consistent regardless of the methodological framework. Finally, we demonstrated that energy/fatigue levels predicted future hospitalization in non-disabled elderly. Person-centered methods provide unique opportunities to explore and statistically model the effects of longitudinal heterogeneity within a population. © 2013 Wiley Periodicals, Inc.

  17. Pumping tests in nonuniform aquifers - The radially symmetric case

    USGS Publications Warehouse

    Butler, J.J.

    1988-01-01

    Traditionally, pumping-test-analysis methodology has been limited to applications involving aquifers whose properties are assumed uniform in space. This work attempts to assess the applicability of analytical methodology to a broader class of units with spatially varying properties. An examination of flow behavior in a simple configuration consisting of pumping from the center of a circular disk embedded in a matrix of differing properties is the basis for this investigation. A solution describing flow in this configuration is obtained through Laplace-transform techniques using analytical and numerical inversion schemes. Approaches for the calculation of flow properties in conditions that can be roughly represented by this simple configuration are proposed. Possible applications include a wide variety of geologic structures, as well as the case of a well skin resulting from drilling or development. Of more importance than the specifics of these techniques for analysis of water-level responses is the insight into flow behavior during a pumping test that is provided by the large-time form of the derived solution. The solution reveals that drawdown during a pumping test can be considered to consist of two components that are dependent and independent of near-well properties, respectively. Such an interpretation of pumping-test drawdown allows some general conclusions to be drawn concerning the relationship between parameters calculated using analytical approaches based on curve-matching and those calculated using approaches based on the slope of a semilog straight line plot. The infinite-series truncation that underlies the semilog analytical approaches is shown to remove further contributions of near-well material to total drawdown. In addition, the semilog distance-drawdown approach is shown to yield an expression that is equivalent to the Thiem equation. 
These results allow some general recommendations to be made concerning observation-well placement for pumping tests in nonuniform aquifers. The relative diffusivity of material on either side of a discontinuity is shown to be the major factor in controlling flow behavior during the period in which the front of the cone of depression is moving across the discontinuity. Though resulting from an analysis of flow in an idealized configuration, the insights of this work into flow behavior during a pumping test are applicable to a wide class of nonuniform units. © 1988.
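    The Thiem equation referenced above relates steady-state drawdowns at two observation distances to transmissivity. As a reminder of its standard form (not the paper's derivation):

```latex
% Thiem equation: steady-state drawdown difference between observation
% wells at radii r_1 < r_2 from a well pumping at constant rate Q in a
% confined aquifer of transmissivity T.
s_1 - s_2 = \frac{Q}{2\pi T}\,\ln\!\left(\frac{r_2}{r_1}\right)
```

    Because only the ratio r_2/r_1 enters, a distance-drawdown plot on a log axis yields T directly from its slope, which is why the semilog distance-drawdown approach discussed above reduces to an expression equivalent to this equation.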

  18. Experimental and analytical investigation of inertial propulsion mechanisms and motion simulation of rigid multi-body mechanical systems

    NASA Astrophysics Data System (ADS)

    Almesallmy, Mohammed

    Methodologies are developed for dynamic analysis of mechanical systems, with emphasis on inertial propulsion systems. This work adopts the Lagrangian methodology, the most efficient classical computational technique, implemented here as the Equations of Motion Code (EOMC). The EOMC is applied to several simple dynamic mechanical systems for easier understanding of the method and to aid other investigators in developing equations of motion for any dynamic system. In addition, it is applied to a rigid multibody system, the Thomson IPS [Thomson 1986]. Furthermore, a simple symbolic algorithm is developed using Maple software, which can convert any nonlinear n-th-order ordinary differential equation (ODE) system into a 1st-order ODE system in a form ready for use in Matlab software. As a side issue, but an equally important one, we have begun corresponding with the U.S. Patent Office to persuade them that patent applications claiming gross linear motion based on inertial propulsion systems should be automatically rejected. The precedent is rejection of patent applications involving perpetual motion machines.
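    The order-reduction step described above, rewriting an n-th-order ODE as a 1st-order system before handing it to a numerical solver, is a standard technique. A minimal self-contained Python sketch (not the dissertation's Maple/Matlab pipeline) using the harmonic oscillator x'' = -x as the example equation:

```python
import math

# Order reduction: the 2nd-order ODE x'' = -x becomes the 1st-order
# system y0' = y1, y1' = -y0 with y0 = x and y1 = x'. Any n-th-order
# ODE reduces the same way, one state variable per derivative.
def f(t, y):
    return [y[1], -y[0]]

def rk4(f, t0, t1, y0, n=10000):
    """Classic 4th-order Runge-Kutta integration of y' = f(t, y)."""
    h = (t1 - t0) / n
    t, y = t0, list(y0)
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
        k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
        k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
        y = [yi + h / 6 * (a + 2 * b + 2 * c + d)
             for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]
        t += h
    return y

# With x(0) = 1, x'(0) = 0 the oscillator returns to its starting
# state after one full period 2*pi.
y_end = rk4(f, 0.0, 2 * math.pi, [1.0, 0.0])
```

    In practice the same first-order form plugs directly into Matlab's ode45 or SciPy's solve_ivp; the hand-rolled integrator here just keeps the sketch dependency-free.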

  19. Determination of iodopropynyl butylcarbamate in cosmetic formulations utilizing pulsed splitless injection, gas chromatography with electron capture detector.

    PubMed

    Palmer, Kevin B; LaFon, William; Burford, Mark D

    2017-09-22

    Current analytical methodology for iodopropynyl butylcarbamate (IPBC) analysis focuses on the use of liquid chromatography-mass spectrometry (LC-MS), but the high instrumentation and operator investment required has resulted in the need for a cost-effective alternative methodology. Past publications investigating gas chromatography with electron capture detector (GC-ECD) for IPBC quantitation proved largely unsuccessful, likely due to the preservative's limited thermal stability. The use of pulsed injection techniques commonly used for trace analysis of thermally labile pharmaceutical compounds was successfully adapted for IPBC analysis and utilizes the selectivity of GC-ECD analysis. System optimization and sample preparation improvements resulted in substantial performance and reproducibility gains. Cosmetic formulations preserved with IPBC (50–100 ppm) were solvated in toluene/isopropyl alcohol and quantified over the 0.3–1.3 μg/ml calibration range. The methodology was robust (relative standard deviation 4%), accurate (98% recovery), and sensitive (limit of detection 0.25 ng/ml) for use in routine testing of cosmetic formulation preservation. Copyright © 2017 Elsevier B.V. All rights reserved.
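    Quantitation over a calibration range of the kind described above reduces to an external-standard least-squares fit of detector response against concentration, then inversion for the unknown. A sketch with invented peak areas (the concentrations echo the paper's 0.3–1.3 μg/ml range, but the responses are not its data):

```python
# External-standard calibration: ordinary least-squares fit of peak
# area vs. standard concentration (ug/mL), then inversion to quantify
# an unknown sample. Areas are invented, perfectly linear numbers.
concs = [0.30, 0.55, 0.80, 1.05, 1.30]
areas = [350.0, 600.0, 850.0, 1100.0, 1350.0]

n = len(concs)
xbar = sum(concs) / n
ybar = sum(areas) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(concs, areas))
         / sum((x - xbar) ** 2 for x in concs))
intercept = ybar - slope * xbar

# Quantify an unknown sample from its measured peak area.
unknown_area = 900.0
c_unknown = (unknown_area - intercept) / slope
```

    Real validation would add replicate injections at each level to get the recovery and RSD figures the abstract reports; the fit itself is the part shown here.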

  20. Cost-Effectiveness of HBV and HCV Screening Strategies – A Systematic Review of Existing Modelling Techniques

    PubMed Central

    Geue, Claudia; Wu, Olivia; Xin, Yiqiao; Heggie, Robert; Hutchinson, Sharon; Martin, Natasha K.; Fenwick, Elisabeth; Goldberg, David

    2015-01-01

    Introduction: Studies evaluating the cost-effectiveness of screening for Hepatitis B Virus (HBV) and Hepatitis C Virus (HCV) are generally heterogeneous in terms of risk groups, settings, screening intervention, outcomes and the economic modelling framework. It is therefore difficult to compare cost-effectiveness results between studies. This systematic review aims to summarise and critically assess existing economic models for HBV and HCV in order to identify the main methodological differences in modelling approaches. Methods: A structured search strategy was developed and a systematic review carried out. A critical assessment of the decision-analytic models was carried out according to the guidelines and framework developed for assessment of decision-analytic models in Health Technology Assessment of health care interventions. Results: The overall approach to analysing the cost-effectiveness of screening strategies was found to be broadly consistent for HBV and HCV. However, modelling parameters and related structure differed between models, producing different results. More recent publications performed better against a performance matrix evaluating model components and methodology. Conclusion: When assessing screening strategies for HBV and HCV infection, the focus should be on more recent studies, which applied the latest treatment regimes and test methods and had better and more complete data on which to base their models. In addition to parameter selection and associated assumptions, careful consideration of dynamic versus static modelling is recommended. Future research may want to focus on these methodological issues. In addition, the ability to evaluate screening strategies for multiple infectious diseases (e.g., HCV and HIV at the same time) might prove important for decision makers. PMID:26689908

  1. Methods for investigating biosurfactants and bioemulsifiers: a review.

    PubMed

    Satpute, Surekha K; Banpurkar, Arun G; Dhakephalkar, Prashant K; Banat, Ibrahim M; Chopade, Balu A

    2010-06-01

    Microorganisms produce biosurfactants (BS)/bioemulsifiers (BE) with wide structural and functional diversity, which consequently results in the adoption of different techniques to investigate these diverse amphiphilic molecules. This review aims to compile information on different microbial screening methods, surface active product extraction procedures, and analytical terminologies used in this field. Different methods for screening microbial culture broth or cell biomass for surface active compound production are also presented and their possible advantages and disadvantages highlighted. In addition, the most common methods for purification, detection, and structure determination for a wide range of BS and BE are introduced. Simple techniques such as precipitation using acetone or ammonium sulphate, solvent extraction, ultrafiltration, ion exchange, dialysis, lyophilization, isoelectric focusing (IEF), and thin layer chromatography (TLC) are described. Other more elaborate techniques including high pressure liquid chromatography (HPLC), infra red (IR), gas chromatography-mass spectroscopy (GC-MS), nuclear magnetic resonance (NMR), fast atom bombardment mass spectroscopy (FAB-MS), protein digestion and amino acid sequencing are also elucidated. Various experimental strategies, including static light scattering and hydrodynamic characterization of micelles, are discussed. A combination of various analytical methods is often essential in this area of research, and a number of trial-and-error attempts are required to isolate, purify and characterize various surface active agents. This review introduces the various methodologies that are indispensable for studying biosurfactants and bioemulsifiers.

  2. New understanding of rhizosphere processes enabled by advances in molecular and spatially resolved techniques

    DOE PAGES

    Hess, Nancy J.; Pasa-Tolic, Ljiljana; Bailey, Vanessa L.; ...

    2017-04-12

    Understanding the role played by microorganisms within soil systems is challenged by the unique intersection of physics, chemistry, mineralogy and biology in fostering habitat for soil microbial communities. To address these challenges will require observations across multiple spatial and temporal scales to capture the dynamics and emergent behavior from complex and interdependent processes. The heterogeneity and complexity of the rhizosphere require advanced techniques that press the simultaneous frontiers of spatial resolution, analyte sensitivity and specificity, reproducibility, large dynamic range, and high throughput. Fortunately many exciting technical advancements are now available to inform and guide the development of new hypotheses. The aim of this Special issue is to provide a holistic view of the rhizosphere in the perspective of modern molecular biology methodologies that enabled a highly-focused, detailed view on the processes in the rhizosphere, including numerous, strong and complex interactions between plant roots, soil constituents and microorganisms. We discuss the current rhizosphere research challenges and knowledge gaps, as well as perspectives and approaches using newly available state-of-the-art toolboxes. These new approaches and methodologies allow the study of rhizosphere processes and properties, and rhizosphere as a central component of ecosystems and biogeochemical cycles.

  3. Use of economic evaluation guidelines: 2 years' experience in Canada.

    PubMed

    Baladi, J F; Menon, D; Otten, N

    1998-05-01

    Considerable effort has been expended in recent years in the development of methodology guidelines for the economic evaluation of pharmaceutical products, driven in part by the desire to improve the rigour and quality of economic evaluations and to help decision making. Canada was one of the first countries to develop such guidelines and to encourage their use. This paper examines the extent to which the economic evaluations submitted to the Canadian Coordinating Office for Health Technology Assessment in the last two years adhered to the Canadian guidelines. It reviews the analytic technique employed by twelve studies, as well as the comparator used, the perspective taken, the outcome measure selected, the cost items taken into consideration and the extent of the sensitivity analyses performed. It can be concluded that although the studies were of variable quality, the majority were well presented, complete and transparent, due in part to the guidelines. Except for the perspective of the analysis, the guidelines were, in many respects, adhered to and did not restrict investigators to specific methodologies or techniques. They were also instrumental in ensuring a minimum set of standards.

  4. Optimization of oncological {sup 18}F-FDG PET/CT imaging based on a multiparameter analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menezes, Vinicius O., E-mail: vinicius@radtec.com.br; Machado, Marcos A. D.; Queiroz, Cleiton C.

    2016-02-15

    Purpose: This paper describes a method to achieve consistent clinical image quality in {sup 18}F-FDG scans accounting for patient habitus, dose regimen, image acquisition, and processing techniques. Methods: Oncological PET/CT scan data for 58 subjects were evaluated retrospectively to derive analytical curves that predict image quality. Patient noise equivalent count rate and coefficient of variation (CV) were used as metrics in their analysis. Optimized acquisition protocols were identified and prospectively applied to 179 subjects. Results: The adoption of different schemes for three body mass ranges (<60 kg, 60–90 kg, >90 kg) allows improved image quality with both point spread function and ordered-subsets expectation maximization-3D reconstruction methods. The application of this methodology showed that CV improved significantly (p < 0.0001) in clinical practice. Conclusions: Consistent oncological PET/CT image quality on a high-performance scanner was achieved from an analysis of the relations existing between dose regimen, patient habitus, acquisition, and processing techniques. The proposed methodology may be used by PET/CT centers to develop protocols to standardize PET/CT imaging procedures and achieve better patient management and cost-effective operations.
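
    The coefficient of variation used above as an image-quality metric is simply the ratio of the standard deviation to the mean of voxel values in a uniform region of interest. A minimal sketch with invented ROI values (illustrative only, not data from the study):

```python
from statistics import mean, pstdev

def coefficient_of_variation(roi_values):
    """CV (%) of voxel values in a uniform region of interest:
    population standard deviation divided by the mean."""
    return 100.0 * pstdev(roi_values) / mean(roi_values)

# Hypothetical voxel samples from a uniform ROI (arbitrary units).
roi = [98.0, 103.0, 101.0, 97.0, 100.0, 101.0]
print(f"CV = {coefficient_of_variation(roi):.1f}%")  # prints CV = 2.0%
```

A lower CV indicates a smoother (less noisy) image; the study tracks this metric across dose regimens and body mass ranges.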

  5. New understanding of rhizosphere processes enabled by advances in molecular and spatially resolved techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, Nancy J.; Paša-Tolić, Ljiljana; Bailey, Vanessa L.

    Understanding the role played by microorganisms within soil systems is challenged by the unique intersection of physics, chemistry, mineralogy and biology in fostering habitat for soil microbial communities. Addressing these challenges will require observations across multiple spatial and temporal scales to capture the dynamics and emergent behavior of complex and interdependent processes. The heterogeneity and complexity of the rhizosphere require advanced techniques that press the simultaneous frontiers of spatial resolution, analyte sensitivity and specificity, reproducibility, large dynamic range, and high throughput. Fortunately, many exciting technical advancements are now available to inform and guide the development of new hypotheses. The aim of this Special Issue is to provide a holistic view of the rhizosphere from the perspective of modern molecular biology methodologies that enable a highly focused, detailed view of rhizosphere processes, including the numerous, strong and complex interactions between plant roots, soil constituents and microorganisms. We discuss current rhizosphere research challenges and knowledge gaps, as well as perspectives and approaches using newly available state-of-the-art toolboxes. These new approaches and methodologies allow the study of rhizosphere processes and properties, and of the rhizosphere as a central component of ecosystems and biogeochemical cycles.

  6. Studies on the Presence of Mycotoxins in Biological Samples: An Overview

    PubMed Central

    Escrivá, Laura; Font, Guillermina; Manyes, Lara

    2017-01-01

    Mycotoxins are fungal secondary metabolites with bioaccumulation levels leading to their carry-over into animal fluids, organs, and tissues. As a consequence, mycotoxin determination in biological samples from humans and animals has been reported worldwide. Since most mycotoxins show toxic effects at low concentrations, and considering the extremely low levels present in biological samples, the application of reliable detection methods is required. This review summarizes the information regarding studies involving mycotoxin determination in biological samples over the last 10 years. Relevant data on extraction methodology, detection techniques, sample size, limits of detection, and quantitation are presented herein. Briefly, liquid-liquid extraction followed by LC-MS/MS determination was the most common technique. The most analyzed mycotoxin was ochratoxin A, followed by zearalenone and deoxynivalenol (including their metabolites), enniatins, fumonisins, aflatoxins, and T-2 and HT-2 toxins. Moreover, the studies were classified by their purpose, mainly focused on the development of analytical methodologies, mycotoxin biomonitoring, and exposure assessment. The study of tissue distribution, bioaccumulation, carry-over, persistence and transference of mycotoxins, as well as toxicokinetics and ADME (absorption, distribution, metabolism and excretion), were other proposed goals for biological sample analysis. Finally, an overview of risk assessment was discussed. PMID:28820481

  7. Ultrasound-assisted emulsification microextraction for determination of 2,4,6-trichloroanisole in wine samples by gas chromatography tandem mass spectrometry.

    PubMed

    Fontana, Ariel R; Patil, Sangram H; Banerjee, Kaushik; Altamirano, Jorgelina C

    2010-04-28

    A fast and effective microextraction technique is proposed for preconcentration of 2,4,6-trichloroanisole (2,4,6-TCA) from wine samples prior to gas chromatography-tandem mass spectrometry (GC-MS/MS) analysis. The proposed technique is based on ultrasonication (US) to promote emulsification during the extraction stage. Several variables influencing the relative response of the target analyte were studied and optimized. Under optimal experimental conditions, 2,4,6-TCA was quantitatively extracted, achieving enhancement factors (EF) ≥ 400 and limits of detection (LODs) of 0.6-0.7 ng L(-1) with relative standard deviations (RSDs) ≤ 11.3% when a 10 ng L(-1) 2,4,6-TCA standard-wine sample blend was analyzed. The calibration graphs for white and red wine were linear within the range of 5-1000 ng L(-1), and coefficients of determination (r(2)) were ≥ 0.9995. Validation of the methodology was carried out by the standard addition method at two concentrations (10 and 50 ng L(-1)), achieving recoveries >80% and indicating satisfactory robustness of the method. The methodology was successfully applied for the determination of 2,4,6-TCA in different wine samples.
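
    The spike-recovery check used in validations like this one is a simple calculation: the analyte found in a spiked sample, minus the background in the unspiked sample, divided by the amount added. A minimal sketch with hypothetical readings (the values below are invented for illustration, not taken from the study):

```python
def spike_recovery(measured_spiked, measured_blank, added):
    """Percent recovery of a known spike:
    (found in spiked sample - background) / amount added * 100."""
    return 100.0 * (measured_spiked - measured_blank) / added

# Hypothetical readings (ng/L): wine with a 2 ng/L background,
# spiked with 10 ng/L of 2,4,6-TCA.
rec = spike_recovery(measured_spiked=11.4, measured_blank=2.0, added=10.0)
print(f"recovery = {rec:.0f}%")  # prints recovery = 94%
```

Recoveries near 100% (here, >80% at both spike levels in the study) indicate that the extraction is not losing or inflating the analyte.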

  8. Study of disulfide reduction and alkyl chloroformate derivatization of plasma sulfur amino acids using gas chromatography-mass spectrometry.

    PubMed

    Svagera, Zdeněk; Hanzlíková, Dagmar; Simek, Petr; Hušek, Petr

    2012-03-01

    Four disulfide-reducing agents, dithiothreitol (DTT), 2,3-dimercaptopropanesulfonate (DMPS), and the newly tested 2-mercaptoethanesulfonate (MESNA) and Tris(hydroxypropyl)phosphine (THP), were investigated in detail for release of sulfur amino acids in human plasma. After protein precipitation with trichloroacetic acid (TCA), the plasma supernatant was treated with methyl, ethyl, or propyl chloroformate via the well-proven derivatization-extraction technique and the products were subjected to gas chromatographic-mass spectrometric (GC-MS) analysis. All the tested agents proved to be rapid and effective reducing agents for the assay of plasma thiols. When compared with DTT, the novel reducing agents DMPS, MESNA, and THP provided much cleaner extracts and improved analytical performance. Quantification of homocysteine, cysteine, and methionine was performed using their deuterated analogues, whereas other analytes were quantified by means of 4-chlorophenylalanine. Precise and reliable assay of all examined analytes was achieved, irrespective of the chloroformate reagent used. Average relative standard deviations at each analyte level were ≤6%, quantification limits were 0.1-0.2 μmol L(-1), recoveries were 94-121%, and linearity was over three orders of magnitude (r(2) equal to 0.997-0.998). Validation performed with the THP agent and propyl chloroformate derivatization demonstrated the robustness and reliability of this simple sample-preparation methodology.

  9. Analytical procedures for the determination of fuel combustion products, anti-corrosive compounds, and de-icing compounds in airport runoff water samples.

    PubMed

    Sulej, Anna Maria; Polkowska, Żaneta; Astel, Aleksander; Namieśnik, Jacek

    2013-12-15

    The purpose of this study is to propose and evaluate new procedures for the determination of fuel combustion products, anti-corrosive compounds, and de-icing compounds in runoff water samples collected from airports located in different regions and characterized by different levels of activity, expressed as the number of flights and the number of passengers per year. The most difficult step in the analytical procedure used for the determination of PAHs, benzotriazoles and glycols is the sample preparation stage, owing to the diverse matrix composition and the possibility of interference from components with similar physicochemical properties. In this study, five different versions of sample preparation using extraction techniques such as LLE and SPE were tested. In all examined runoff water samples collected from the airports, the presence of PAH compounds and glycols was observed. In the majority of the samples, BT compounds were determined. Runoff water samples collected from the areas of Polish and British international airports as well as local airports had similar qualitative composition, but the quantitative composition of the analytes was very diverse. The new, validated analytical methodologies ensure that the information necessary for assessing the negative impact of airport activities on the environment can be obtained. © 2013 Elsevier B.V. All rights reserved.

  10. The evolution of analytical chemistry methods in foodomics.

    PubMed

    Gallo, Monica; Ferranti, Pasquale

    2016-01-08

    The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on modern instrumental platforms. Today, the development and optimization of integrated analytical approaches based on different techniques to study the chemical composition of a food at the molecular level may make it possible to define a 'food fingerprint', valuable for assessing the nutritional value, safety, quality, authenticity and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability and food production in response to worldwide environmental change. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies and advance our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review is to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Ultrasound-assisted low-density solvent dispersive liquid-liquid microextraction for the determination of 4 designer benzodiazepines in urine samples by gas chromatography-triple quadrupole mass spectrometry.

    PubMed

    Meng, Liang; Zhu, Binling; Zheng, Kefang; Fu, Shanlin

    2017-05-15

    A novel microextraction technique based on ultrasound-assisted low-density solvent dispersive liquid-liquid microextraction (UA-LDS-DLLME) was applied for the determination of 4 designer benzodiazepines (phenazepam, diclazepam, flubromazepam and etizolam) in urine samples by gas chromatography-triple quadrupole mass spectrometry (GC-QQQ-MS). Ethyl acetate (168 μL) was added to the urine samples after adjusting the pH to 11.3. The samples were sonicated in an ultrasonic bath for 5.5 min to form a cloudy suspension. After centrifugation at 10,000 rpm for 3 min, the supernatant extractant was withdrawn and injected into the GC-QQQ-MS for analysis. Parameters affecting the extraction efficiency were investigated and optimized by means of single-factor experiments and response surface methodology (Box-Behnken design). Under the optimum extraction conditions, recoveries of 73.8-85.5% were obtained for all analytes. The analytical method was linear for all analytes in the range from 0.003 to 10 μg/mL, with correlation coefficients ranging from 0.9978 to 0.9990. The LODs were estimated to be 1-3 ng/mL. The accuracy (expressed as mean relative error, MRE) was within ±5.8% and the precision (expressed as relative standard deviation, RSD) was less than 5.9%. The UA-LDS-DLLME technique has the advantage of a shorter extraction time and is suitable for simultaneous pretreatment of samples in batches. The combination of UA-LDS-DLLME with GC-QQQ-MS offers an alternative analytical approach for the sensitive detection of these designer benzodiazepines in urine matrix for clinical and medico-legal purposes. Copyright © 2017 Elsevier B.V. All rights reserved.
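
    Response-surface optimization of the kind performed here with a Box-Behnken design reduces, for each factor, to fitting a quadratic model to coded factor levels and locating its stationary point. A one-factor sketch using the standard -1/0/+1 coding (the recovery values are invented for illustration, not taken from the study):

```python
def quad_coeffs(y_minus, y_zero, y_plus):
    """Quadratic y = a + b*x + c*x**2 through responses measured
    at coded factor levels x = -1, 0, +1 (exact three-point fit)."""
    a = y_zero
    b = (y_plus - y_minus) / 2.0
    c = (y_plus - 2.0 * y_zero + y_minus) / 2.0
    return a, b, c

def stationary_point(b, c):
    """Coded factor level where the fitted response is extremal (peak if c < 0)."""
    return -b / (2.0 * c)

# Hypothetical extraction recoveries (%) at coded sonication times -1, 0, +1.
a, b, c = quad_coeffs(70.0, 84.0, 80.0)
x_opt = stationary_point(b, c)
print(round(x_opt, 3))  # prints 0.278
```

A full Box-Behnken analysis fits the same kind of quadratic (with interaction terms) across several factors simultaneously by least squares; the one-factor case above just shows the core idea.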

  12. SociAL Sensor Analytics: Measuring Phenomenology at Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corley, Courtney D.; Dowling, Chase P.; Rose, Stuart J.

    The objective of this paper is to present a system for interrogating immense social media streams through analytical methodologies that characterize topics and events critical to tactical and strategic planning. First, we propose a conceptual framework for interpreting social media as a sensor network. Time-series models and topic clustering algorithms are used to implement this concept into a functioning analytical system. Next, we address two scientific challenges: 1) to understand, quantify, and baseline phenomenology of social media at scale, and 2) to develop analytical methodologies to detect and investigate events of interest. This paper then documents computational methods and reports experimental findings that address these challenges. Ultimately, the ability to process billions of social media posts per week over a period of years enables the identification of patterns and predictors of tactical and strategic concerns at an unprecedented rate through SociAL Sensor Analytics (SALSA).

  13. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  14. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  15. 7 CFR 91.23 - Analytical methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 91.23 Section 91.23 Agriculture... SERVICES AND GENERAL INFORMATION Method Manuals § 91.23 Analytical methods. Most analyses are performed according to approved procedures described in manuals of standardized methodology. These standard methods...

  16. Analytical Chemistry Division annual progress report for period ending November 30, 1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyon, W.S.

    1978-03-01

    Activities for the year are summarized in sections on analytical methodology, mass and mass emission spectrometry, analytical services, bio-organic analysis, nuclear and radiochemical analysis, and quality assurance and safety. Presentations of research results in publications and reports are tabulated. (JRD)

  17. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace levels of steroid estrogen concentrations in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The techniques highlighted in this review were gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, the YES assay and the E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its lower detection limit, simplicity, rapidity and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. 76 FR 55804 - Dicamba; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-09

    ... Considerations A. Analytical Enforcement Methodology Adequate enforcement methodologies, Methods I and II--gas chromatography with electron capture detection (GC/ECD), are available to enforce the tolerance expression. The...

  19. A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales

    PubMed Central

    Ayton, Gary S.; Voth, Gregory A.

    2009-01-01

    A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic because one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center-of-mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well-known Gay-Berne ellipsoid-of-revolution liquid crystal model and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an "aggressive" CG methodology designed to model multi-component biological membranes at very large length and timescales. PMID:19281167

  20. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  1. Molecularly imprinted solid-phase extraction in the analysis of agrochemicals.

    PubMed

    Yi, Ling-Xiao; Fang, Rou; Chen, Guan-Hua

    2013-08-01

    The molecular imprinting technique is a highly predeterminative recognition technology. Molecularly imprinted polymers (MIPs) can be applied to the cleanup and preconcentration of analytes as the selective adsorbent of solid-phase extraction (SPE). In recent years, a new type of SPE has formed, molecularly imprinted polymer solid-phase extraction (MISPE), and has been widely applied to the extraction of agrochemicals. In this review, the mechanism of the molecular imprinting technique and the methodology of MIP preparations are explained. The extraction modes of MISPE, including offline and online, are discussed, and the applications of MISPE in the analysis of agrochemicals such as herbicides, fungicides and insecticides are summarized. It is concluded that MISPE is a powerful tool to selectively isolate agrochemicals from real samples with higher extraction and cleanup efficiency than commercial SPE and that it has great potential for broad applications.

  2. Recent Application of Solid Phase Based Techniques for Extraction and Preconcentration of Cyanotoxins in Environmental Matrices.

    PubMed

    Mashile, Geaneth Pertunia; Nomngongo, Philiswa N

    2017-03-04

    Cyanotoxins are toxic and are found in eutrophic, municipal, and residential water supplies. For this reason, their occurrence in drinking water systems has become a global concern. Therefore, monitoring, control, risk assessment, and prevention of these contaminants in the environmental bodies are important subjects associated with public health. Thus, rapid, sensitive, selective, simple, and accurate analytical methods for the identification and determination of cyanotoxins are required. In this paper, the sampling methodologies and applications of solid phase-based sample preparation methods for the determination of cyanotoxins in environmental matrices are reviewed. The sample preparation techniques mainly include solid phase micro-extraction (SPME), solid phase extraction (SPE), and solid phase adsorption toxin tracking technology (SPATT). In addition, advantages and disadvantages and future prospects of these methods have been discussed.

  3. FASP, an analytic resource appraisal program for petroleum play analysis

    USGS Publications Warehouse

    Crovelli, R.A.; Balay, R.H.

    1986-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented in a FORTRAN program termed FASP. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An established geologic model considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The program FASP produces resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and many laws of expectation and variance. © 1986.
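
    The conditional-probability structure described here (uncertainty about presence, and about amount given presence) has a closed form via the laws of total expectation and total variance: if hydrocarbons are present with probability p, and the amount given presence has mean μ and variance σ², then E[R] = pμ and Var[R] = pσ² + p(1-p)μ². A minimal sketch (the play numbers below are invented for illustration, not FASP output):

```python
def resource_mean_variance(p, mu, sigma2):
    """Mean and variance of R = (presence indicator) * (amount given presence).
    E[R] = p*mu by total expectation;
    Var[R] = p*sigma2 + p*(1-p)*mu**2 by total variance."""
    mean = p * mu
    var = p * sigma2 + p * (1.0 - p) * mu ** 2
    return mean, var

# Hypothetical play: 40% chance of oil; if present, mean 120 MMbbl,
# standard deviation 50 MMbbl (variance 2500).
m, v = resource_mean_variance(0.4, 120.0, 2500.0)
print(m, v)
```

The variance has two parts: the spread of the amount when oil is present, plus the all-or-nothing spread contributed by the presence uncertainty itself.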

  4. Reducing Conservatism of Analytic Transient Response Bounds via Shaping Filters

    NASA Technical Reports Server (NTRS)

    Kwan, Aiyueh; Bedrossian, Nazareth; Jan, Jiann-Woei; Grigoriadis, Karolos; Hua, Tuyen (Technical Monitor)

    1999-01-01

    Recent results show that the peak transient response of a linear system to bounded-energy inputs can be computed using the energy-to-peak gain of the system. However, the analytically computed peak response bound can be conservative for a class of bounded-energy signals, specifically pulse trains generated from jet firings encountered in space vehicles. In this paper, shaping filters are proposed as a methodology to reduce the conservatism of analytic peak response bounds. This methodology was applied to a realistic Space Station assembly operation subject to jet firings. The results indicate that shaping filters indeed reduce the predicted peak response bounds.
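
    The energy-to-peak gain mentioned above is computable in closed form for a stable scalar system x' = -a·x + b·u, y = c·x: the controllability gramian solves -2ap + b² = 0, so p = b²/(2a), and the gain is |c|·√p. This bounds the peak of y over all unit-energy inputs. A minimal sketch of the scalar formulas (not the authors' Space Station computation):

```python
import math

def energy_to_peak_gain(a, b, c):
    """Energy-to-peak (L2-to-Linf) gain of the stable scalar system
    x' = -a*x + b*u, y = c*x. The controllability gramian satisfies
    -2*a*p + b*b = 0, i.e. p = b*b / (2*a); gain = sqrt(c*p*c)."""
    assert a > 0, "system must be stable (pole at -a < 0)"
    p = b * b / (2.0 * a)
    return abs(c) * math.sqrt(p)

# Hypothetical system: a=2, b=1, c=3 -> gramian 0.25, gain 1.5.
print(energy_to_peak_gain(2.0, 1.0, 3.0))  # prints 1.5
```

In the matrix case the same recipe applies with the Lyapunov equation A·P + P·Aᵀ + B·Bᵀ = 0 and gain √(λmax(C·P·Cᵀ)); the conservatism the paper targets arises because pulse-train inputs do not come close to achieving this worst-case bound.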

  5. A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks.

    PubMed

    Luo, Shibo; Dong, Mianxiong; Ota, Kaoru; Wu, Jun; Li, Jianhua

    2015-12-17

    Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in the network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly to SDN-MNs, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and an Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. In order to quantify security levels, the Node Minimal Effort (NME) is defined to quantify attack cost and derive system security levels based on NME. Thirdly, to calculate the NME of an attack graph that takes the dynamic factors of SDN-MN into consideration, we use AHP integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) as the methodology. Finally, we offer a case study to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism.
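
    The AHP step that this mechanism relies on derives criterion weights as the normalized principal eigenvector of a pairwise-comparison matrix, which can be found by power iteration. A small sketch (the comparison values are invented for illustration, not taken from the paper):

```python
def ahp_weights(M, iters=100):
    """AHP priority weights: the normalized principal eigenvector of the
    reciprocal pairwise-comparison matrix M, via power iteration."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]  # renormalize so weights sum to 1
    return w

# Hypothetical 3-criterion comparison: criterion A is 3x as important as B,
# 5x as important as C; B is 2x as important as C.
M = [[1.0,     3.0, 5.0],
     [1.0/3.0, 1.0, 2.0],
     [1.0/5.0, 0.5, 1.0]]
w = ahp_weights(M)
print([round(x, 3) for x in w])
```

The resulting weights (roughly 0.65 / 0.23 / 0.12 for this matrix) would then feed the NME cost aggregation over the attack graph; a full AHP also checks the consistency ratio of M before trusting the weights.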

  6. A Security Assessment Mechanism for Software-Defined Networking-Based Mobile Networks

    PubMed Central

    Luo, Shibo; Dong, Mianxiong; Ota, Kaoru; Wu, Jun; Li, Jianhua

    2015-01-01

    Software-Defined Networking-based Mobile Networks (SDN-MNs) are considered the future of 5G mobile network architecture. With the evolving cyber-attack threat, security assessments need to be performed in the network management. Due to the distinctive features of SDN-MNs, such as their dynamic nature and complexity, traditional network security assessment methodologies cannot be applied directly to SDN-MNs, and a novel security assessment methodology is needed. In this paper, an effective security assessment mechanism based on attack graphs and an Analytic Hierarchy Process (AHP) is proposed for SDN-MNs. Firstly, this paper discusses the security assessment problem of SDN-MNs and proposes a methodology using attack graphs and AHP. Secondly, to address the diversity and complexity of SDN-MNs, a novel attack graph definition and attack graph generation algorithm are proposed. In order to quantify security levels, the Node Minimal Effort (NME) is defined to quantify attack cost and derive system security levels based on NME. Thirdly, to calculate the NME of an attack graph that takes the dynamic factors of SDN-MN into consideration, we use AHP integrated with the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS) as the methodology. Finally, we offer a case study to validate the proposed methodology. The case study and evaluation show the advantages of the proposed security assessment mechanism. PMID:26694409

  7. MASS SPECTROMETRY FOR RISK MANAGEMENT OF DRINKING WATER TREATMENT; II. DISINFECTION BY-PRODUCTS: HALOACETIC ACIDS

    EPA Science Inventory

    Risk management of drinking water relies on quality analytical data. Analytical methodology can often be adapted from environmental monitoring sources. However, risk management sometimes presents special analytical challenges because data may be needed from a source for which n...

  8. Assessment of Methodological Quality of Economic Evaluations in Belgian Drug Reimbursement Applications

    PubMed Central

    Simoens, Steven

    2013-01-01

    Objectives This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474

  9. Assessment of methodological quality of economic evaluations in Belgian drug reimbursement applications.

    PubMed

    Simoens, Steven

    2013-01-01

    This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation.

  10. Critical evaluation of methodology commonly used in sample collection, storage and preparation for the analysis of pharmaceuticals and illicit drugs in surface water and wastewater by solid phase extraction and liquid chromatography-mass spectrometry.

    PubMed

    Baker, David R; Kasprzyk-Hordern, Barbara

    2011-11-04

    The main aim of this manuscript is to provide a comprehensive and critical verification of the methodology commonly used for sample collection, storage and preparation in studies concerning the analysis of pharmaceuticals and illicit drugs in aqueous environmental samples using SPE-LC/MS techniques. This manuscript reports the results of investigations into several sample preparation parameters that, to the authors' knowledge, have not been reported or have received very little attention. These include: (i) the effect of evaporation temperature and (ii) of solvent on solid phase extraction (SPE) extracts; (iii) the effect of silanising glassware; (iv) recovery of analytes during vacuum filtration through glass fibre filters and (v) pre-LC-MS filter membranes. All of these parameters are vital to the development of efficient and reliable extraction techniques; an essential consideration given that target drug residues are often present in the aqueous environment at ng L⁻¹ levels. Also presented is the first comprehensive review of the stability of illicit drugs and pharmaceuticals in wastewater. Among the parameters studied are: time of storage, temperature and pH. Over 60 analytes were targeted, including stimulants, opioid and morphine derivatives, benzodiazepines, antidepressants, dissociative anaesthetics, drug precursors, human urine indicators and their metabolites. The lack of stability of analytes in raw wastewater was found to be significant for many compounds. For instance, 34% of the compounds studied showed a stability change >15% after only 12 h in raw wastewater stored at 2 °C; a very important finding given that wastewater is typically collected with the use of 24 h composite samplers. The stability of these compounds is also critical given the recent development of so-called 'sewage forensics' or 'sewage epidemiology', in which concentrations of target drug residues in wastewater are used to back-calculate drug consumption. 
Without an understanding of stability, under (or over) reporting of consumption estimations may take place. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.

    PubMed

    Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María

    2017-01-01

    This study aims at providing recommendations concerning the validation of analytical protocols by using routine samples. It is intended to provide a case study on how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work developed and validated two analytical procedures by GC-MS. A description is given of the validation of the two protocols by the analysis of more than 30 samples of water and sediments collected over nine months. The present work also estimates the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water matrix was estimated through a conventional approach. For the sediment matrix, however, the estimation of proportional/constant bias is also included owing to its inhomogeneity. Results for the sediment matrix are reliable, showing a range of 25-35% of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. The analysis of routine samples is rarely applied to assess the trueness of novel analytical methods, and up to now this methodology had not been focused on organochlorine compounds in environmental matrices.
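
A conventional uncertainty combination of the kind the abstract alludes to pools individual relative uncertainty components by root-sum-of-squares and expands with a coverage factor. The sketch below is illustrative only; the component values are invented and not taken from the study:

```python
import math

# Illustrative relative standard uncertainties (as fractions); values invented.
u_precision = 0.12    # intermediate-precision component
u_trueness = 0.10     # bias/recovery component
u_calibration = 0.08  # calibration-curve component

# Root-sum-of-squares combination, then expansion with coverage factor k = 2.
u_combined = math.sqrt(u_precision**2 + u_trueness**2 + u_calibration**2)
U_expanded = 2 * u_combined
print(round(100 * u_combined, 1), round(100 * U_expanded, 1))  # in percent
```

With components of this size the combined relative uncertainty lands in the 15-20% range, comparable to the 20-30% reported for the water matrix.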

  12. Identification of novel peptides for horse meat speciation in highly processed foodstuffs.

    PubMed

    Claydon, Amy J; Grundy, Helen H; Charlton, Adrian J; Romero, M Rosario

    2015-01-01

    There is a need for robust analytical methods to support enforcement of food labelling legislation. Proteomics is emerging as a complementary methodology to existing tools such as DNA and antibody-based techniques. Here we describe the development of a proteomics strategy for the determination of meat species in highly processed foods. A database of specific peptides for nine relevant animal species was used to enable semi-targeted species determination. This principle was tested for horse meat speciation, and a range of horse-specific peptides were identified as heat stable marker peptides for the detection of low levels of horse meat in mixtures with other species.

  13. Teleoperator system man-machine interface requirements for satellite retrieval and satellite servicing. Volume 1: Requirements

    NASA Technical Reports Server (NTRS)

    Malone, T. B.

    1972-01-01

    Requirements were determined analytically for the man-machine interface for a teleoperator system performing on-orbit satellite retrieval and servicing. Requirements are basically of two types: mission/system requirements, and design requirements or design criteria. Two types of teleoperator systems were considered: a free-flying vehicle and a shuttle-attached manipulator. No attempt was made to evaluate the relative effectiveness or efficiency of the two system concepts. The methodology used entailed an application of the Essex Man-Systems analysis technique as well as a complete familiarization with relevant work being performed at government agencies and by private industry.

  14. Digital system upset. The effects of simulated lightning-induced transients on a general-purpose microprocessor

    NASA Technical Reports Server (NTRS)

    Belcastro, C. M.

    1983-01-01

    Flight-critical computer-based control systems designed for advanced aircraft must exhibit ultrareliable performance in lightning-charged environments. Digital system upset can occur as a result of lightning-induced electrical transients, and a methodology was developed to test specific digital systems for upset susceptibility. Initial upset data indicate that there are several distinct upset modes and that the occurrence of upset is related to the relative synchronization of the transient input with the processing state of the digital system. A large upset test data base will aid in the formulation and verification of analytical upset reliability modeling techniques which are being developed.

  15. Population Studies of Intact Vitamin D Binding Protein by Affinity Capture ESI-TOF-MS

    PubMed Central

    Borges, Chad R.; Jarvis, Jason W.; Oran, Paul E.; Rogers, Stephen P.; Nelson, Randall W.

    2008-01-01

    Blood plasma proteins with molecular weights greater than approximately 30 kDa are refractory to comprehensive, high-throughput qualitative characterization of microheterogeneity across human populations. Analytical techniques for obtaining high mass resolution for targeted, intact protein characterization and, separately, high sample throughput exist, but efficient means of coupling these assay characteristics remain rather limited. This article discusses the impetus for analyzing intact proteins in a targeted manner across populations and describes the methodology required to couple mass spectrometric immunoassay with electrospray ionization mass spectrometry for the purpose of qualitatively characterizing a prototypical large plasma protein, vitamin D binding protein, across populations. PMID:19137103

  16. Skin microbiome: genomics-based insights into the diversity and role of skin microbes

    PubMed Central

    Kong, Heidi H.

    2011-01-01

    Recent advances in DNA sequencing methodology have enabled studies of human skin microbes that circumvent difficulties in isolating and characterizing fastidious microbes. Sequence-based approaches have identified greater diversity of cutaneous bacteria than studies using traditional cultivation techniques. However, improved sequencing technologies and analytical methods are needed to study all skin microbes, including bacteria, archaea, fungi, viruses, and mites, and how they interact with each other and their human hosts. This review discusses current skin microbiome research, with a primary focus on bacteria, and the challenges facing investigators striving to understand how skin micro-organisms contribute to health and disease. PMID:21376666

  17. Screening Vaccine Formulations in Fresh Human Whole Blood.

    PubMed

    Hakimi, Jalil; Aboutorabian, Sepideh; To, Frederick; Ausar, Salvador F; Rahman, Nausheen; Brookes, Roger H

    2017-01-01

    Monitoring the immunological functionality of vaccine formulations is critical for vaccine development. While the traditional approach using established animal models has been relatively effective, the use of animals is costly and cumbersome, and animal models are not always reflective of a human response. The development of a human-based approach would be a major step forward in understanding how vaccine formulations might behave in humans. Here, we describe a platform methodology using fresh human whole blood (hWB) to monitor adjuvant-modulated, antigen-specific responses to vaccine formulations, which is amenable to analysis by standard immunoassays as well as a variety of other analytical techniques.

  18. Rockfall vulnerability assessment for masonry buildings

    NASA Astrophysics Data System (ADS)

    Mavrouli, Olga

    2015-04-01

    The methodologies for quantitative risk assessment vary as a function of the application scale and the available data. For fragmental rockfalls, risk calculation requires data for the expected damage of the exposed elements due to potential rock block impacts with a range of trajectories, magnitudes and intensities. Although the procedures for the quantification of the rock block characteristics in terms of magnitude-frequency relationships are well established, there are few methodologies for the calculation of the vulnerability, and these are usually empirical or judgmental. The response of buildings to rock block impacts has so far been studied with analytical methods mainly for reinforced concrete buildings, and some fragility curves have been calculated with the results, indicating the potential damage for a range of rock block characteristics. Masonry buildings, as a common structural typology in mountainous areas, are in many cases impacted by rock blocks during rockfalls. Their response presents some peculiarities in comparison with reinforced-concrete structures given the non-homogeneity and variability of the compound materials (blocks and mortar), their orthotropy, low strength in tension, the statically indeterminate load-bearing system and the non-monolithic connections. To this end, analytical procedures which are specifically adapted to masonry structures should be used for the evaluation of the expected damage due to rock impacts. In this contribution we discuss the application of the analytical approach for the assessment of the expected damage in rockfall-prone areas and the simulation assumptions that can be made concerning the materials, geometry, loading and the relevant simplifications. The amount of uncertainty introduced during the analytical simulation is high due to the dispersion of the data for material mechanical properties and the construction techniques and quality, and thus a probabilistic assessment is suggested. 
The random nature of the rockfall as far as it concerns the magnitude and the intensity of the rock blocks can also be introduced using parametric analyses.

  19. Methodology of analysis of very weak acids by isotachophoresis with electrospray-ionization mass-spectrometric detection: Anionic electrolyte systems for the medium-alkaline pH range.

    PubMed

    Malá, Zdena; Gebauer, Petr

    2018-01-15

    This work describes for the first time a functional electrolyte system setup for anionic isotachophoresis (ITP) with electrospray-ionization mass-spectrometric (ESI-MS) detection in the neutral to medium-alkaline pH range. So far, no application has been published on the analysis of very weak acids by anionic ITP-MS, although there is a broad spectrum of potential analytes with pKa values in the range 5-10, where application of this technique promises interesting gains in both sensitivity and specificity. The problem so far was the lack of anionic ESI-compatible ITP systems in the mentioned pH range, as all typical volatile anionic system components are fully ionized at neutral and alkaline pH and thus too fast to serve as terminators. We propose an original solution to the problem based on the combination of two ITP methods: (i) use of the hydroxyl ion as a natural and ESI-compatible terminator, and (ii) use of configurations based on moving-boundary ITP. The former method ensures effective stacking of analytes by an alkaline terminator of sufficiently low mobility, and the latter offers increased flexibility for tuning of the separation window and selectivity according to actual needs. A theoretical description of the proposed model is presented and applied to the design of very simple functional electrolyte configurations. The properties of example systems are demonstrated by both computer simulation and experiments with a group of model analytes. Potential effects of carbon dioxide present in the solutions are demonstrated for particular systems. Experimental results confirm that the proposed methodology is well capable of performing sensitive and selective ITP-MS analyses of very weak acidic analytes (e.g. sulfonamides or chlorophenols). Copyright © 2017 Elsevier B.V. All rights reserved.

  20. 2016 Workplace and Gender Relations Survey of Active Duty Members: Statistical Methodology Report

    DTIC Science & Technology

    2017-03-01

    2016 Workplace and Gender Relations Survey of Active Duty Members: Statistical Methodology Report. Office of People Analytics (OPA), Defense Research, Surveys, and Statistics Center, 4800 Mark Center Drive.

  1. Does Metformin Reduce Cancer Risks? Methodologic Considerations.

    PubMed

    Golozar, Asieh; Liu, Shuiqing; Lin, Joeseph A; Peairs, Kimberly; Yeh, Hsin-Chieh

    2016-01-01

    The substantial burden of cancer and diabetes, and the association between the two conditions, has been a motivation for researchers to look for targeted strategies that can simultaneously affect both diseases and reduce their overlapping burden. In the absence of randomized clinical trials, researchers have taken advantage of the availability and richness of administrative databases and electronic medical records to investigate the effects of drugs on cancer risk among diabetic individuals. The majority of these studies suggest that metformin could potentially reduce cancer risk. However, the validity of this purported reduction in cancer risk is limited by several methodological flaws either in the study design or in the analysis. Whether metformin use decreases cancer risk relies heavily on the availability of valid data sources with complete information on confounders, accurate assessment of drug use, appropriate study design, and robust analytical techniques. The majority of the observational studies assessing the association between metformin and cancer risk suffer from methodological shortcomings, and efforts to address these issues have been incomplete. Future investigations on the association between metformin and cancer risk should clearly address the methodological issues due to confounding by indication, prevalent user bias, and time-related biases. Although the proposed strategies do not guarantee a bias-free estimate for the association between metformin and cancer, they will reduce the synthesis and reporting of erroneous results.

  2. Techniques used for the screening of hemoglobin levels in blood donors: current insights and future directions.

    PubMed

    Chaudhary, Rajendra; Dubey, Anju; Sonker, Atul

    2017-01-01

    Blood donor hemoglobin (Hb) estimation is an important donation test that is performed prior to blood donation. It serves the dual purpose of protecting the donor's health against anemia and ensuring good quality of blood components, which has an implication for recipients' health. Diverse cutoff criteria have been defined the world over depending on population characteristics; however, no testing methodology or sample requirement has been specified for Hb screening. Besides the technique, there are several physiological and methodological factors that affect the accuracy and reliability of Hb estimation. These include the anatomical source of the blood sample, the posture of the donor, the timing of the sample and several other biological factors. The qualitative copper sulfate gravimetric method has been the archaic time-tested method that is still used in resource-constrained settings. Portable hemoglobinometers are modern quantitative devices that have been further modified to reagent-free cuvettes. Furthermore, noninvasive spectrophotometry was introduced, mitigating pain to the blood donor and eliminating the risk of infection. Notwithstanding a tremendous evolution in terms of ease of operation, accuracy, mobility, rapidity and cost, a component of inherent variability persists, which may partly be attributed to pre-analytical variables. Hence, blood centers should pay due attention to validation of the test methodology, competency of operating staff and regular proficiency testing of the outputs. In this article, we have reviewed various regulatory guidelines, described the variables that affect the measurements and compared the validated technologies for Hb screening of blood donors along with enumeration of their merits and limitations.

  3. Analysis and control of high-speed wheeled vehicles

    NASA Astrophysics Data System (ADS)

    Velenis, Efstathios

    In this work we reproduce driving techniques to mimic expert race drivers and obtain the open-loop control signals that may be used by auto-pilot agents driving autonomous ground wheeled vehicles. Race drivers operate their vehicles at the limits of the acceleration envelope. An accurate characterization of the acceleration capacity of the vehicle is required. Understanding and reproduction of such complex maneuvers also require a physics-based mathematical description of the vehicle dynamics. While most of the modeling issues of ground-vehicles/automobiles are already well established in the literature, lack of understanding of the physics associated with friction generation results in ad-hoc approaches to tire friction modeling. In this work we revisit this aspect of the overall vehicle modeling and develop a tire friction model that provides physical interpretation of the tire forces. The new model is free of those singularities at low vehicle speed and wheel angular rate that are inherent in the widely used empirical static models. In addition, the dynamic nature of the tire model proposed herein allows the study of dynamic effects such as transients and hysteresis. The trajectory-planning problem for an autonomous ground wheeled vehicle is formulated in an optimal control framework aiming to minimize the time of travel and maximize the use of the available acceleration capacity. The first approach to solve the optimal control problem is using numerical techniques. Numerical optimization allows incorporation of a vehicle model of high fidelity and generates realistic solutions. Such an optimization scheme provides an ideal platform to study the limit operation of the vehicle, which would not be possible via straightforward simulation. In this work we emphasize the importance of online applicability of the proposed methodologies. 
This underlines the need for optimal solutions that require little computational cost and are able to incorporate real, unpredictable environments. A semi-analytic methodology is developed to generate the optimal velocity profile for minimum time travel along a prescribed path. The semi-analytic nature ensures minimal computational cost while a receding horizon implementation allows application of the methodology in uncertain environments. Extensions to increase fidelity of the vehicle model are finally provided.

  4. Response Surface Methods for Spatially-Resolved Optical Measurement Techniques

    NASA Technical Reports Server (NTRS)

    Danehy, P. M.; Dorrington, A. A.; Cutler, A. D.; DeLoach, R.

    2003-01-01

    Response surface methods (or methodology), RSM, have been applied to improve data quality for two vastly different spatially-resolved optical measurement techniques. In the first application, modern design of experiments (MDOE) methods, including RSM, are employed to map the temperature field in a direct-connect supersonic combustion test facility at NASA Langley Research Center. The laser-based measurement technique known as coherent anti-Stokes Raman spectroscopy (CARS) is used to measure temperature at various locations in the combustor. RSM is then used to develop temperature maps of the flow. Even though the temperature fluctuations at a single point in the flowfield have a standard deviation on the order of 300 K, RSM provides analytic fits to the data having 95% confidence interval half-width uncertainties in the fit as low as ±30 K. Methods of optimizing future CARS experiments are explored. The second application of RSM is to quantify the shape of a 5-meter diameter, ultra-light, inflatable space antenna at NASA Langley Research Center.
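
The core RSM step described above, fitting an analytic surface to noisy point measurements so that the fit uncertainty is far smaller than the single-point scatter, can be sketched with a quadratic least-squares surface. All numbers below are synthetic stand-ins, not CARS data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for CARS point measurements: temperature sampled at
# scattered (x, y) locations with large single-shot scatter (sd ~ 300 K).
n = 200
x = rng.uniform(-1, 1, n)
y = rng.uniform(-1, 1, n)
true_T = 1800 + 300 * x - 150 * y - 200 * x * y - 100 * x**2
T = true_T + rng.normal(0, 300, n)

# Quadratic response surface: T ~ b0 + b1*x + b2*y + b3*xy + b4*x^2 + b5*y^2
X = np.column_stack([np.ones(n), x, y, x * y, x**2, y**2])
beta, *_ = np.linalg.lstsq(X, T, rcond=None)

# Residual variance and 95% confidence-interval half-width of the fitted
# surface at an arbitrary point (z = 1.96 for ~95% coverage).
resid = T - X @ beta
s2 = resid @ resid / (n - X.shape[1])
XtX_inv = np.linalg.inv(X.T @ X)

def ci_halfwidth(xp, yp, z=1.96):
    v = np.array([1.0, xp, yp, xp * yp, xp**2, yp**2])
    return float(z * np.sqrt(s2 * v @ XtX_inv @ v))

print(ci_halfwidth(0.0, 0.0))  # much smaller than the 300 K point scatter
```

The averaging effect of the fit is what shrinks the ±300 K point scatter down to tens of kelvin in the fitted surface, mirroring the reduction reported in the abstract.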

  5. Current techniques in acid-chloride corrosion control and monitoring at The Geysers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirtz, Paul; Buck, Cliff; Kunzman, Russell

    1991-01-01

    Acid chloride corrosion of geothermal well casings, production piping and power plant equipment has resulted in costly corrosion damage, frequent curtailments of power plants and the permanent shut-in of wells in certain areas of The Geysers. Techniques have been developed to mitigate these corrosion problems, allowing continued production of steam from high-chloride wells with minimal impact on production and power generation facilities. The optimization of water and caustic steam scrubbing, steam/liquid separation and process fluid chemistry has led to effective and reliable corrosion mitigation systems currently in routine use at The Geysers. When properly operated, these systems can yield steam purities equal to or greater than those encountered in areas of The Geysers where chloride corrosion is not a problem. Developments in corrosion monitoring techniques, steam sampling and analytical methodologies for trace impurities, and computer modeling of the fluid chemistry have been instrumental in the success of this technology.

  6. Real-Time Leaky Lamb Wave Spectrum Measurement and Its Application to NDE of Composites

    NASA Technical Reports Server (NTRS)

    Lih, Shyh-Shiuh; Bar-Cohen, Yoseph

    1999-01-01

    Numerous analytical and theoretical studies of the behavior of leaky Lamb waves (LLW) in composite materials are documented in the literature. One of the key issues constraining the application of this method as a practical tool is the amount of data that needs to be acquired and the slow process involved in such experiments. Recently, a methodology that allows quasi-real-time acquisition of LLW dispersion data was developed. At each angle of incidence the reflection spectrum is available in real time from the experimental setup, and it can be used for rapid detection of defects. This technique can be used to rapidly acquire the various plate wave modes along various angles of incidence for the characterization of the material elastic properties. The experimental method and data acquisition technique are described in this paper. Experimental data were used to examine a series of flaws, including porosity and delaminations, and demonstrated the efficiency of the developed technique.

  7. Recent advances in computational-analytical integral transforms for convection-diffusion problems

    NASA Astrophysics Data System (ADS)

    Cotta, R. M.; Naveira-Cotta, C. P.; Knupp, D. C.; Zotin, J. L. Z.; Pontes, P. C.; Almeida, A. P.

    2017-10-01

    A unifying overview of the Generalized Integral Transform Technique (GITT) as a computational-analytical approach for solving convection-diffusion problems is presented. This work is aimed at bringing together some of the most recent developments on both accuracy and convergence improvements on this well-established hybrid numerical-analytical methodology for partial differential equations. Special emphasis is given to novel algorithm implementations, all directly connected to enhancing the eigenfunction expansion basis, such as a single-domain reformulation strategy for handling complex geometries, an integral balance scheme for dealing with multiscale problems, the adoption of convective eigenvalue problems in formulations with significant convection effects, and the direct integral transformation of nonlinear convection-diffusion problems based on nonlinear eigenvalue problems. Then, selected examples are presented that illustrate the improvement achieved in each class of extension, in terms of convergence acceleration and accuracy gain, which are related to conjugated heat transfer in complex or multiscale microchannel-substrate geometries, the multidimensional Burgers equation model, and diffusive metal extraction through polymeric hollow fiber membranes. Numerical results are reported for each application and, where appropriate, critically compared against the traditional GITT scheme without convergence enhancement schemes and against commercial or dedicated purely numerical approaches.
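
For orientation, the eigenfunction-expansion pair at the core of GITT can be written generically as follows; the notation (weight w, eigenfunctions psi_i over a domain V) is standard textbook notation for such expansions, not taken from the cited work:

```latex
% Integral transform pair built on a self-adjoint eigenvalue problem with
% weight w(x) and normalized eigenfunctions \tilde{\psi}_i over domain V:
\bar{T}_i(t) = \int_V w(\mathbf{x})\,\tilde{\psi}_i(\mathbf{x})\,T(\mathbf{x},t)\,\mathrm{d}V
\qquad \text{(transform)}

T(\mathbf{x},t) = \sum_{i=1}^{\infty} \tilde{\psi}_i(\mathbf{x})\,\bar{T}_i(t)
\qquad \text{(inverse)}

\tilde{\psi}_i(\mathbf{x}) = \frac{\psi_i(\mathbf{x})}{\sqrt{N_i}},
\qquad N_i = \int_V w(\mathbf{x})\,\psi_i^2(\mathbf{x})\,\mathrm{d}V
```

The enhancements the abstract lists (single-domain reformulation, convective eigenvalue problems, nonlinear eigenvalue problems) all amount to choosing a better basis \psi_i so that the inverse series converges with fewer terms.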

  8. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called “ultrafast” (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations, in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool witnessing an expanded scope of applications. The present review summarizes the principles and the main developments which have contributed to the success of this approach, and focuses on applications which have been recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  9. Analysis of combined data from heterogeneous study designs: an applied example from the patient navigation research program.

    PubMed

    Roetzheim, Richard G; Freund, Karen M; Corle, Don K; Murray, David M; Snyder, Frederick R; Kronman, Andrea C; Jean-Pierre, Pascal; Raich, Peter C; Holden, Alan Ec; Darnell, Julie S; Warren-Mears, Victoria; Patierno, Steven

    2012-04-01

    The Patient Navigation Research Program (PNRP) is a cooperative effort of nine research projects, with similar clinical criteria but with different study designs. To evaluate projects such as PNRP, it is desirable to perform a pooled analysis to increase power relative to the individual projects. There is no agreed-upon prospective methodology, however, for analyzing combined data arising from different study designs. Expert opinions were thus solicited from the members of the PNRP Design and Analysis Committee. To review possible methodologies for analyzing combined data arising from heterogeneous study designs. The Design and Analysis Committee critically reviewed the pros and cons of five potential methods for analyzing combined PNRP project data. The conclusions were based on simple consensus. The five approaches reviewed included the following: (1) analyzing and reporting each project separately, (2) combining data from all projects and performing an individual-level analysis, (3) pooling data from projects having similar study designs, (4) analyzing pooled data using a prospective meta-analytic technique, and (5) analyzing pooled data utilizing a novel simulated group-randomized design. Methodologies varied in their ability to incorporate data from all PNRP projects, to appropriately account for differing study designs, and to accommodate differing project sample sizes. The conclusions reached were based on expert opinion and not derived from actual analyses performed. The ability to analyze pooled data arising from differing study designs may provide pertinent information to inform programmatic, budgetary, and policy perspectives. Multisite community-based research may not lend itself well to the more stringent explanatory and pragmatic standards of a randomized controlled trial design. 
Given our growing interest in community-based population research, the challenges inherent in the analysis of heterogeneous study design are likely to become more salient. Discussion of the analytic issues faced by the PNRP and the methodological approaches we considered may be of value to other prospective community-based research programs.
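
Of the five approaches the committee reviewed, the prospective meta-analytic technique (approach 4) lends itself to a compact sketch: each project contributes an effect estimate and standard error, pooled by inverse-variance weighting. The numbers below are invented for illustration and are not PNRP results:

```python
import numpy as np

# Hypothetical per-project effect estimates (e.g. log odds ratios) and
# standard errors from studies with heterogeneous designs; values invented.
effects = np.array([0.30, 0.12, 0.45, 0.20, 0.05])
se = np.array([0.15, 0.10, 0.25, 0.12, 0.20])

# Fixed-effect inverse-variance pooling: precise projects get more weight.
w = 1.0 / se**2
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(round(float(pooled), 3), tuple(round(float(c), 3) for c in ci))
```

A real prospective meta-analysis would add a heterogeneity assessment (e.g. a random-effects variant) precisely because the study designs differ, which is the committee's central concern.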

  10. Baselining PMU Data to Find Patterns and Anomalies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amidan, Brett G.; Follum, James D.; Freeman, Kimberly A.

    This paper looks at the application of situational awareness methodologies with respect to power grid data. These methodologies establish baselines that look for typical patterns and atypical behavior in the data. The objectives of the baselining analyses are to provide: real-time analytics, the capability to look at historical trends and events, and reliable predictions of the near-future state of the grid. Multivariate algorithms were created to establish normal baseline behavior and then score each moment in time according to its variance from the baseline. Detailed multivariate analytical techniques are described in this paper that produced ways to identify typical patterns and atypical behavior. In this case, atypical behavior is behavior that is unenvisioned. Visualizations were also produced to help explain the behavior that was identified mathematically. Examples are shown to help describe how to read and interpret the analyses and visualizations. Preliminary work has been performed on PMU data sets from BPA (Bonneville Power Administration) and EI (Eastern Interconnect). Actual results are not fully shown here because of confidentiality issues. Comparisons between atypical events found mathematically and actual events showed that many of the actual events are also atypical events; however, there are many atypical events that do not correlate to any actual events. Additional work needs to be done to help classify the atypical events into actual events, so that the importance of the events can be better understood.
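
One common way to score each moment in time by its multivariate distance from a baseline, in the spirit of the approach described above, is a Mahalanobis distance against the baseline mean and covariance. The signals and values below are synthetic assumptions, not BPA/EI data, and the paper's own algorithms may differ:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a PMU feature matrix (rows = time points, columns =
# signals such as frequency, voltage magnitude, phase angle).
baseline = rng.multivariate_normal(
    [60.0, 1.0, 0.0], np.diag([1e-4, 1e-4, 1e-2]), 5000)

mu = baseline.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

def atypicality(row):
    """Mahalanobis distance of one time point from the baseline."""
    d = row - mu
    return float(np.sqrt(d @ cov_inv @ d))

typical = np.array([60.0, 1.0, 0.0])
event = np.array([59.9, 0.9, 0.5])  # a frequency/voltage excursion
print(atypicality(typical), atypicality(event))
```

Time points whose score exceeds a baseline-derived threshold are flagged as atypical; classifying which of those flags correspond to actual grid events is the open problem the abstract ends on.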

  11. Analytical group decision making in natural resources: Methodology and application

    USGS Publications Warehouse

    Schmoldt, D.L.; Peterson, D.L.

    2000-01-01

    Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and into techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
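
    The analytic hierarchy process component above rests on deriving priority weights from a pairwise-comparison matrix. A minimal sketch using the geometric-mean (row) approximation to Saaty's principal eigenvector; the judgment matrix below is hypothetical:

```python
import math

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix.

    pairwise[i][j] states how strongly criterion i is preferred over
    criterion j on Saaty's 1-9 scale; the geometric mean of each row,
    normalized, approximates the principal eigenvector.
    """
    gms = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gms)
    return [g / total for g in gms]

# Hypothetical workshop judgments over three research priorities A, B, C.
matrix = [
    [1.0, 3.0, 5.0],   # A vs A, B, C
    [1/3, 1.0, 3.0],   # B
    [1/5, 1/3, 1.0],   # C
]
w = ahp_weights(matrix)
print([round(x, 3) for x in w])  # weights sum to 1, A ranked highest
```

    The geometric-mean method is exact for perfectly consistent matrices and a close approximation otherwise; a full AHP implementation would also report a consistency ratio.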

  12. Development of a particle-trap preconcentration-soft ionization mass spectrometric technique for the quantification of mercury halides in air.

    PubMed

    Deeds, Daniel A; Ghoshdastidar, Avik; Raofie, Farhad; Guérette, Élise-Andrée; Tessier, Alain; Ariya, Parisa A

    2015-01-01

    Measurement of oxidized mercury, Hg(II), in the atmosphere poses a significant analytical challenge as Hg(II) is present at ultra-trace concentrations (picograms per cubic meter air). Current technologies are sufficiently sensitive to measure the total Hg present as Hg(II) but cannot determine the chemical speciation of Hg(II). We detail here the development of a soft ionization mass spectrometric technique coupled with preconcentration onto nano- or microparticle-based traps prior to analysis for the measurement of mercury halides in air. The current methodology has comparable detection limits (4-11 pg m(-3)) to previously developed techniques for the measurement of total inorganic mercury in air while allowing for the identification of HgX2 in collected samples. Both mercury chloride and mercury bromide have been sporadically detected in Montreal urban and indoor air using atmospheric pressure chemical ionization-mass spectrometry (APCI-MS). We discuss limitations and advantages of the current technique and discuss potential avenues for future research including quantitative trace measurements of a larger range of mercury compounds.

  13. Determination of glycols in air: development of sampling and analytical methodology and application to theatrical smokes.

    PubMed

    Pendergrass, S M

    1999-01-01

    Glycol-based fluids are used in the production of theatrical smokes in theaters, concerts, and other stage productions. The fluids are heated and dispersed in aerosol form to create the effect of a smoke, mist, or fog. There have been reports of adverse health effects such as respiratory irritation, chest tightness, shortness of breath, asthma, and skin rashes. Previous attempts to collect and quantify the aerosolized glycols used in fogging agents have been plagued by inconsistent results, both in the efficiency of collection and in the chromatographic analysis of the glycol components. The development of improved sampling and analytical methodology for aerosolized glycols was required to assess workplace exposures more effectively. An Occupational Safety and Health Administration versatile sampler tube was selected for the collection of ethylene glycol, propylene glycol, 1,3-butylene glycol, diethylene glycol, triethylene glycol, and tetraethylene glycol aerosols. Analytical methodology for the separation, identification, and quantitation of the six glycols using gas chromatography/flame ionization detection is described. Limits of detection of the glycol analytes ranged from 7 to 16 micrograms/sample. Desorption efficiencies for all glycol compounds were determined over the range of study and averaged greater than 90%. Storage stability results were acceptable after 28 days for all analytes except ethylene glycol, which was stable at ambient temperature for 14 days. Based on the results of this study, the new glycol method was published in the NIOSH Manual of Analytical Methods.

  14. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  15. Analytical methodology for sampling and analysing eight siloxanes and trimethylsilanol in biogas from different wastewater treatment plants in Europe.

    PubMed

    Raich-Montiu, J; Ribas-Font, C; de Arespacochaga, N; Roig-Torres, E; Broto-Puig, F; Crest, M; Bouchy, L; Cortina, J L

    2014-02-17

    Siloxanes and trimethylsilanol belong to a family of organic silicone compounds that are currently used extensively in industry. Those that are prone to volatilisation become minor compounds in biogas, adversely affecting energetic applications. However, no standard analytical methodologies are available for analysing biogas-based gaseous matrices. To this end, different sampling techniques (adsorbent tubes, impingers and tedlar bags) were compared using two different configurations: sampling directly from the biogas source or from a 200 L tedlar bag filled with biogas and homogenised. No significant differences were apparent between the two sampling configurations. The adsorbent tubes performed better than the tedlar bags and impingers, particularly for quantifying low concentrations. A method for the speciation of silicon compounds in biogas was developed using gas chromatography coupled with mass spectrometry working in dual scan/single ion monitoring mode. The optimised conditions could separate and quantify eight siloxane compounds (L2, L3, L4, L5, D3, D4, D5 and D6) and trimethylsilanol within fourteen minutes. Biogas from five wastewater treatment plants located in Spain, France and England was sampled and analysed using the developed methodology. The siloxane concentrations in the biogas samples were influenced by the anaerobic digestion temperature, as well as the nature and composition of the sewage inlet. Siloxanes D4 and D5 were the most abundant, ranging in concentration from 1.5 to 10.1 and 10.8 to 124.0 mg Nm(-3), respectively, and exceeding the tolerance limit of most energy conversion systems. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    ERIC Educational Resources Information Center

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…

  17. A Progressive Approach to Teaching Analytics in the Marketing Curriculum

    ERIC Educational Resources Information Center

    Liu, Yiyuan; Levin, Michael A.

    2018-01-01

    With the emerging use of analytics tools and methodologies in marketing, marketing educators have provided students training and experiences beyond the soft skills associated with understanding consumer behavior. Previous studies have only discussed how to apply analytics in course designs, tools, and related practices. However, there is a lack of…

  18. Analytical Methodology Used To Assess/Refine Observatory Thermal Vacuum Test Conditions For the Landsat 8 Data Continuity Mission

    NASA Technical Reports Server (NTRS)

    Fantano, Louis

    2015-01-01

    Thermal and Fluids Analysis Workshop, Silver Spring, MD, NCTS 21070-15. The Landsat 8 Data Continuity Mission, which is part of the United States Geological Survey (USGS), launched February 11, 2013. A Landsat environmental test requirement mandated that test conditions bound worst-case flight thermal environments. This paper describes a rigorous analytical methodology applied to assess/refine proposed thermal vacuum test conditions and the issues encountered in attempting to satisfy this requirement.

  19. Conducting Meta-Analyses Based on p Values

    PubMed Central

    van Aert, Robbie C. M.; Wicherts, Jelte M.; van Assen, Marcel A. L. M.

    2016-01-01

    Because of overwhelming evidence of publication bias in psychology, techniques to correct meta-analytic estimates for such bias are greatly needed. The methodology on which the p-uniform and p-curve methods are based has great promise for providing accurate meta-analytic estimates in the presence of publication bias. However, in this article, we show that in some situations, p-curve behaves erratically, whereas p-uniform may yield implausible estimates of negative effect size. Moreover, we show that (and explain why) p-curve and p-uniform result in overestimation of effect size under moderate-to-large heterogeneity and may yield unpredictable bias when researchers employ p-hacking. We offer hands-on recommendations on applying and interpreting results of meta-analyses in general and p-uniform and p-curve in particular. Both methods as well as traditional methods are applied to a meta-analysis on the effect of weight on judgments of importance. We offer guidance for applying p-uniform or p-curve using R and a user-friendly web application for applying p-uniform. PMID:27694466
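
    The intuition behind these methods can be sketched compactly. The following is a simplified illustration of the underlying logic, not the p-uniform or p-curve estimators the authors analyze: under the null of no true effect, statistically significant p-values are uniform on (0, α), so a Fisher-style combination of the rescaled values tests for the right skew that signals evidential value.

```python
import math

def evidential_value_chi2(p_values, alpha=0.05):
    """Fisher-style test of right skew among significant p-values.

    Under the null of no effect, significant p-values are uniform on
    (0, alpha), so pp = p / alpha is uniform on (0, 1). Fisher's statistic
    -2 * sum(ln(pp)) then follows a chi-square distribution with 2k degrees
    of freedom; large values indicate right skew, i.e. evidential value.
    """
    sig = [p for p in p_values if p < alpha]
    pp = [p / alpha for p in sig]
    chi2 = -2.0 * sum(math.log(v) for v in pp)
    return chi2, 2 * len(sig)

# Hypothetical set of significant p-values clustered near zero.
chi2, df = evidential_value_chi2([0.001, 0.004, 0.010, 0.021])
print(chi2 > 15.51)  # exceeds the 5% critical value for df = 8
```

    The actual p-uniform and p-curve procedures go further, estimating effect size by finding the effect under which the conditional p-values become uniform; this sketch shows only the shared conditional-uniformity idea.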

  20. Sex, Parity, and Scars: A Meta-analytic Review.

    PubMed

    McFadden, Clare; Oxenham, Marc F

    2018-01-01

    The ability to identify whether a female has been pregnant or has given birth has significant implications for forensic investigations and bioarcheological research. The meaning of "scars of parturition," their causes, and their significance are a matter of contention, with a substantial literature of re-evaluations and tests of the relationship between pelvic scarring and parity. The aim of this study was to use meta-analytic techniques (the methodological approach) to test whether pelvic scarring, namely dorsal pubic pitting and the preauricular groove, is a predictor of parity and sex. Meta-analyses indicated that neither dorsal pubic pitting nor the preauricular groove are predictors of parity status, while dorsal pubic pitting is a moderate predictor of sex. A weak relationship between dorsal pubic pitting and parity was identified, but this is believed to be a product of the moderate relationship with sex. This calls into question whether any causal relationship between parity and pelvic scarring exists. © 2017 American Academy of Forensic Sciences.

  1. Measurement of volatile organic compounds in human blood.

    PubMed Central

    Ashley, D L; Bonin, M A; Cardinali, F L; McCraw, J M; Wooten, J V

    1996-01-01

    Volatile organic compounds (VOCs) are an important public health problem throughout the developed world. Many important questions remain to be addressed in assessing exposure to these compounds. Because they are ubiquitous and highly volatile, special techniques must be applied in the analytical determination of VOCs. The analytical methodology chosen to measure toxicants in biological materials must be well validated and carefully carried out; poor quality assurance can lead to invalid results that can have a direct bearing on treating exposed persons. The pharmacokinetics of VOCs show that most of the internal dose of these compounds is quickly eliminated, but there is a fraction that is only slowly removed, and these compounds may bioaccumulate. VOCs are found in the general population at the high parts-per-trillion range, but some people with much higher levels have apparently been exposed to VOC sources away from the workplace. Smoking is the most significant confounder to internal dose levels of VOCs and must be considered when evaluating suspected cases of exposure. PMID:8933028

  2. QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study

    NASA Astrophysics Data System (ADS)

    Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa

    2016-10-01

    Conceptual design is a subset of concept art in which a new product idea is created, rather than a visual representation that would be used directly in a final product. The purpose is to understand the needs of conceptual design as used in engineering design and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services to enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD, translating customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out in order to reduce the CR portion of the HOQ. The analytic hierarchy process is employed to obtain the priority ratings of the CRs, which are used in constructing the HOQ. This paper mainly discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the integrated QFD-ANP methodology helps to establish the importance ratings of the DRs.

  3. Development of an analytical scheme for simazine and 2,4-D in soil and water runoff from ornamental plant nursery plots.

    PubMed

    Sutherland, Devon J; Stearman, G Kim; Wells, Martha J M

    2003-01-01

    The transport and fate of pesticides applied to ornamental plant nursery crops are not well documented. Methodology for analysis of soil and water runoff samples concomitantly containing the herbicides simazine (1-chloro-4,6-bis(ethylamino)-s-triazine) and 2,4-D ((2,4-dichlorophenoxy)acetic acid) was developed in this research to investigate the potential for runoff and leaching from ornamental nursery plots. Solid-phase extraction was used prior to analysis by gas chromatography and liquid chromatography. Chromatographic results were compared with determination by enzyme-linked immunoassay analysis. The significant analytical contributions of this research include (1) the development of a scheme using chromatographic mode sequencing for the fractionation of simazine and 2,4-D, (2) optimization of the homogeneous derivatization of 2,4-D using the methylating agent boron trifluoride in methanol as an alternative to in situ generation of diazomethane, and (3) the practical application of these techniques to field samples.

  4. A Variational Approach to the Analysis of Dissipative Electromechanical Systems

    PubMed Central

    Allison, Andrew; Pearce, Charles E. M.; Abbott, Derek

    2014-01-01

    We develop a method for systematically constructing Lagrangian functions for dissipative mechanical, electrical, and electromechanical systems. We derive the equations of motion for some typical electromechanical systems using deterministic principles that are strictly variational. We do not use any ad hoc features that are added on after the analysis has been completed, such as the Rayleigh dissipation function. We generalise the concept of potential, and define generalised potentials for dissipative lumped system elements. Our innovation offers a unified approach to the analysis of electromechanical systems where there are energy and power terms in both the mechanical and electrical parts of the system. Using our novel technique, we can take advantage of the analytic approach from mechanics, and we can apply these powerful analytical methods to electrical and to electromechanical systems. We can analyse systems that include non-conservative forces. Our methodology is deterministic, does not require any special intuition, and is thus suitable for automation via a computer-based algebra package. PMID:24586221

  5. Evaluation of protective shielding thickness for diagnostic radiology rooms: theory and computer simulation.

    PubMed

    Costa, Paulo R; Caldas, Linda V E

    2002-01-01

    This work presents the development and evaluation of modern techniques for calculating radiation protection barriers in clinical radiographic facilities. Our methodology uses realistic primary and scattered spectra. The primary spectra were computer simulated using a waveform generalization and a semiempirical model (the Tucker-Barnes-Chakraborty model). The scattered spectra were obtained from published data. An analytical function was used to produce attenuation curves from polychromatic radiation for specified kVp, waveform, and filtration. The results of this analytical function are given in ambient dose equivalent units. The attenuation curves were obtained by application of Archer's model to computer simulation data. The parameters for the best fit to the model using primary and secondary radiation data from different radiographic procedures were determined. They resulted in an optimized model for shielding calculation for any radiographic room. The shielding costs were about 50% lower than those calculated using the traditional method based on Report No. 49 of the National Council on Radiation Protection and Measurements.
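
    Archer's model, applied above, expresses broad-beam transmission through a barrier of thickness x as B(x) = [(1 + β/α)e^(αγx) − β/α]^(−1/γ), and inverts in closed form for the required thickness. A sketch with hypothetical fit parameters; real α, β, γ values come from fitting attenuation data as in the paper:

```python
import math

def archer_transmission(x, a, b, g):
    """Broad-beam transmission B(x) through thickness x (Archer's model)."""
    return ((1 + b / a) * math.exp(a * g * x) - b / a) ** (-1.0 / g)

def archer_thickness(B, a, b, g):
    """Invert Archer's model: barrier thickness giving transmission B."""
    return (1.0 / (a * g)) * math.log((B ** -g + b / a) / (1 + b / a))

# Hypothetical (alpha, beta, gamma) fit parameters; units of x follow the
# units in which alpha and beta were fitted.
a, b, g = 2.5, 15.0, 0.75
x = archer_thickness(0.01, a, b, g)          # thickness for 1% transmission
print(abs(archer_transmission(x, a, b, g) - 0.01) < 1e-9)  # round-trip check
```

    The closed-form inverse is what makes Archer's model convenient for shielding design: given a target transmission, the barrier thickness follows directly from the fitted parameters.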

  6. Enantioseparation by Capillary Electrophoresis Using Ionic Liquids as Chiral Selectors.

    PubMed

    Greño, Maider; Marina, María Luisa; Castro-Puyana, María

    2018-11-02

    Capillary electrophoresis (CE) is one of the most widely employed analytical techniques for achieving enantiomeric separations. Although many chiral selectors are commercially available for performing enantioseparations by CE, one of the most relevant topics in this field is the search for new selectors capable of providing high enantiomeric resolution. Chiral ionic liquids (CILs) have interesting characteristics that confer on them a high potential in chiral separations, although only some are commercially available. The aim of this article is to review all the works published on the use of CILs as chiral selectors in the development of enantioselective methodologies by CE, covering the period from 2006 (when the first research work on this topic was published) to 2017. The use of CILs as sole chiral selectors, as chiral selectors in dual systems, or as chiral ligands is considered. This review also provides detailed analytical information on the experimental conditions used to carry out enantioseparations in different fields, as well as on the separation mechanism involved.

  7. Quantitative analysis of Sudan dye adulteration in paprika powder using FTIR spectroscopy.

    PubMed

    Lohumi, Santosh; Joshi, Ritu; Kandpal, Lalit Mohan; Lee, Hoonsoo; Kim, Moon S; Cho, Hyunjeong; Mo, Changyeun; Seo, Young-Wook; Rahman, Anisur; Cho, Byoung-Kwan

    2017-05-01

    As adulteration of foodstuffs with Sudan dye, especially paprika- and chilli-containing products, has been reported with some frequency, this issue has become one focal point for addressing food safety. FTIR spectroscopy has been used extensively as an analytical method for quality control and safety determination for food products. Thus, the use of FTIR spectroscopy for rapid determination of Sudan dye in paprika powder was investigated in this study. A net analyte signal (NAS)-based methodology, named HLA/GO (hybrid linear analysis in the literature), was applied to FTIR spectral data to predict Sudan dye concentration. The calibration and validation sets were designed to evaluate the performance of the multivariate method. The obtained results had a high coefficient of determination (R²) of 0.98 and low root mean square error (RMSE) of 0.026% for the calibration set, and an R² of 0.97 and RMSE of 0.05% for the validation set. The model was further validated using a second validation set and through the figures of merit, such as sensitivity, selectivity, and limits of detection and quantification. The proposed technique of FTIR combined with HLA/GO is rapid, simple and low cost, making this approach advantageous when compared with the main alternative methods based on liquid chromatography (LC) techniques.

  8. Tennessee long-range transportation plan : project evaluation system

    DOT National Transportation Integrated Search

    2005-12-01

    The Project Evaluation System (PES) Report is an analytical methodology to aid programming efforts and prioritize multimodal investments. The methodology consists of both quantitative and qualitative evaluation criteria built upon the Guiding Princip...

  9. Towards automated human gait disease classification using phase space representation of intrinsic mode functions

    NASA Astrophysics Data System (ADS)

    Pratiher, Sawon; Patra, Sayantani; Pratiher, Souvik

    2017-06-01

    A novel analytical methodology for segregating healthy and neurological-disorder gait patterns is proposed by employing a set of oscillating components called intrinsic mode functions (IMFs). These IMFs are generated by empirical mode decomposition of the gait time series, and the Hilbert-transformed analytic signal representation forms the complex-plane trace of the elliptically shaped analytic IMFs. The area measure and the relative change in the centroid position of the polygon formed by the convex hull of these analytic IMFs are taken as the discriminative features. Classification accuracy of 79.31% with an ensemble-learning-based AdaBoost classifier validates the adequacy of the proposed methodology for a computer-aided diagnostic (CAD) system for gait pattern identification. The efficacy of several potential biomarkers, such as the bandwidth of the amplitude-modulation and frequency-modulation IMFs and the mean frequency from the Fourier-Bessel expansion of each analytic IMF, is also discussed for its potency in the diagnosis and classification of gait patterns.
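
    The discriminative features named above, the area and centroid of the convex hull of each analytic IMF's complex-plane trace, reduce to a standard computational-geometry step. A self-contained sketch of just that step (the EMD and Hilbert-transform stages would require a signal-processing library):

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def build(seq):
        out = []
        for p in seq:
            while len(out) >= 2:
                (x1, y1), (x2, y2) = out[-2], out[-1]
                if (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1) > 0:
                    break  # left turn: keep the current chain
                out.pop()
            out.append(p)
        return out[:-1]
    return build(pts) + build(pts[::-1])   # lower hull + upper hull

def hull_area_centroid(points):
    """Shoelace area and centroid of the hull polygon around a point cloud."""
    h = convex_hull(points)
    n = len(h)
    cross = [h[i][0] * h[(i + 1) % n][1] - h[(i + 1) % n][0] * h[i][1]
             for i in range(n)]
    area = 0.5 * sum(cross)
    cx = sum((h[i][0] + h[(i + 1) % n][0]) * cross[i] for i in range(n)) / (6 * area)
    cy = sum((h[i][1] + h[(i + 1) % n][1]) * cross[i] for i in range(n)) / (6 * area)
    return area, (cx, cy)

# Unit square as a sanity check: area 1, centroid at its centre.
area, (cx, cy) = hull_area_centroid([(0, 0), (1, 0), (1, 1), (0, 1), (0.5, 0.5)])
print(area, cx, cy)  # 1.0 0.5 0.5
```

    In the paper's pipeline, `points` would be the (real, imaginary) samples of one analytic IMF; the hull area and the shift of the centroid across IMFs serve as the classifier's features.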

  10. PCB congener analysis with Hall electrolytic conductivity detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edstrom, R.D.

    1989-01-01

    This work reports the development of an analytical methodology for the analysis of PCB congeners based on integrating relative retention data provided by other researchers. The retention data were transposed into a multiple retention marker system which provided good precision in the calculation of relative retention indices for PCB congener analysis. Analytical run times for the developed methodology were approximately one hour using a commercially available GC capillary column. A Tracor Model 700A Hall Electrolytic Conductivity Detector (HECD) was employed in the GC detection of Aroclor standards and environmental samples. Responses by the HECD provided good sensitivity and were reasonably predictable. Ten response factors were calculated based on the molar chlorine content of each homolog group. Homolog distributions were determined for Aroclors 1016, 1221, 1232, 1242, 1248, 1254, 1260, 1262 along with binary and ternary mixtures of the same. These distributions were compared with distributions reported by other researchers using electron capture detection as well as chemical ionization mass spectrometric methodologies. Homolog distributions acquired by the HECD methodology showed good correlation with the previously mentioned methodologies. The developed analytical methodology was used in the analysis of bluefish (Pomatomas saltatrix) and weakfish (Cynoscion regalis) collected from the York River, lower James River and lower Chesapeake Bay in Virginia. Total PCB concentrations were calculated and homolog distributions were constructed from the acquired data. Increases in total PCB concentrations were found in the analyzed fish samples during the fall of 1985 collected from the lower James River and lower Chesapeake Bay.
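
    The multiple-retention-marker system described above amounts to interpolating a congener's index between the bracketing markers observed in the same run. A minimal sketch with a hypothetical marker table; the real system assigns marker indices from the cited retention data:

```python
import bisect

def relative_retention_index(rt, markers):
    """Retention index by linear interpolation between bracketing markers.

    `markers` maps assigned index values to observed retention times for a
    multiple-retention-marker run; values here are illustrative only.
    """
    idx = sorted(markers)
    times = [markers[i] for i in idx]        # assumed monotonically increasing
    k = bisect.bisect_left(times, rt)
    if k == 0 or k == len(times):
        raise ValueError("retention time outside marker range")
    i0, i1 = idx[k - 1], idx[k]
    t0, t1 = times[k - 1], times[k]
    return i0 + (i1 - i0) * (rt - t0) / (t1 - t0)

# Hypothetical marker table: assigned index -> retention time (minutes).
markers = {100: 5.0, 200: 12.0, 300: 20.0}
print(relative_retention_index(8.5, markers))  # 150.0
```

    Anchoring indices to markers in each run is what gives the system its precision: drift in absolute retention times cancels because unknowns and markers shift together.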

  11. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  12. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also include data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data Analytics takes on a practitioners approach to applying expertise and skills to solve issues and gain subject knowledge. Data Science, is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal spatial differences, data types, formats, etc.) invite the need for data analytics skills that understand the science domain, and data preparation, reduction, and analysis techniques, from a practitioners point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. 
Thus, from the standpoint of advancing science: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, providing the opportunity to discover unobvious scientific relationships previously invisible to the scientific eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.

  13. Approaches in the determination of plant nutrient uptake and distribution in space flight conditions

    NASA Technical Reports Server (NTRS)

    Heyenga, A. G.; Forsman, A.; Stodieck, L. S.; Hoehn, A.; Kliss, M.

    2000-01-01

    The effective growth and development of vascular plants rely on the adequate availability of water and nutrients. Inefficiency in either the initial absorption, transportation, or distribution of these elements are factors which impinge on plant structure and metabolic integrity. The potential effect of space flight and microgravity conditions on the efficiency of these processes is unclear. Limitations in the available quantity of space-grown plant material and the sensitivity of routine analytical techniques have made an evaluation of these processes impractical. However, the recent introduction of new plant cultivating methodologies supporting the application of radionuclide elements and subsequent autoradiography techniques provides a highly sensitive investigative approach amenable to space flight studies. Experiments involving the use of gel based 'nutrient packs' and the radionuclides calcium-45 and iron-59 were conducted on the Shuttle mission STS-94. Uptake rates of the radionuclides between ground and flight plant material appeared comparable.

  14. Approaches in the Determination of Plant Nutrient Uptake and Distribution in Space Flight Conditions

    NASA Technical Reports Server (NTRS)

    Heyenga, A. G.; Forsman, A.; Stodieck, L. S.; Hoehn, A.; Kliss, Mark

    1998-01-01

    The effective growth and development of vascular plants rely on the adequate availability of water and nutrients. Inefficiency in either the initial absorption, transportation, or distribution of these elements are factors which may impinge on plant structure and metabolic integrity. The potential effect of space flight and microgravity conditions on the efficiency of these processes is unclear. Limitations in the available quantity of space-grown plant material and the sensitivity of routine analytical techniques have made an evaluation of these processes impractical. However, the recent introduction of new plant cultivating methodologies supporting the application of radionuclide elements and subsequent autoradiography techniques provides a highly sensitive investigative approach amenable to space flight studies. Experiments involving the use of gel based 'nutrient packs' and the nuclides Ca45 and Fe59 were conducted on the Shuttle mission STS-94. Uptake rates of the radionuclides between ground and flight plant material appeared comparable.

  15. Japanese migration in contemporary Japan: economic segmentation and interprefectural migration.

    PubMed

    Fukurai, H

    1991-01-01

    This paper examines the economic segmentation model in explaining 1985-86 Japanese interregional migration. The analysis takes advantage of statistical graphic techniques to illustrate the following substantive issues of interregional migration: (1) whether economic segmentation significantly influences Japanese regional migration and (2) the socioeconomic characteristics of prefectures with respect to both in- and out-migration. Analytic techniques include a linear structural relations (LISREL) methodology and statistical residual mapping. The residual dispersion patterns, for instance, suggest the extent to which socioeconomic and geopolitical variables explain migration differences by showing unique clusters of unexplained residuals. The analysis further points out that extraneous factors such as high residential land values, significant commuting populations, and region-specific cultures and traditions need to be incorporated into the economic segmentation model in order to assess the extent of the model's reliability in explaining the pattern of interprefectural migration.

  16. Evaluation of trade-offs in costs and environmental impacts for returnable packaging implementation

    NASA Astrophysics Data System (ADS)

    Jarupan, Lerpong; Kamarthi, Sagar V.; Gupta, Surendra M.

    2004-02-01

    The main thrust of returnable packaging these days is to provide logistical services through the transportation and distribution of products while remaining environmentally friendly. Returnable packaging and reverse logistics concepts have converged to mitigate the adverse effect of packaging materials entering the solid waste stream. Returnable packaging must be designed by considering the trade-offs between costs and environmental impact so as to satisfy manufacturers and environmentalists alike. The cost of returnable packaging covers items such as materials, manufacturing, collection, storage, and disposal. Environmental impacts are explicitly linked with solid waste, air pollution, and water pollution. This paper presents a multi-criteria evaluation technique to assist decision-makers in evaluating the trade-offs in costs and environmental impact during the returnable packaging design process. The proposed evaluation technique combines multiple-objective integer linear programming with the analytic hierarchy process. A numerical example is used to illustrate the methodology.
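
    The analytic hierarchy process (AHP) half of such a technique can be sketched briefly. The following is an illustrative sketch, not the paper's model: the criteria, the pairwise judgement values, and the random index are all assumed for the example; priorities are taken from the principal eigenvector of the comparison matrix.

```python
import numpy as np

# Illustrative AHP sketch (not the paper's model): criteria, the pairwise
# judgements, and the random index RI are assumed for the example.
# Criteria: cost, solid waste, air pollution (Saaty's 1-9 scale).
A = np.array([
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   2.0],
    [1/5.0, 1/2.0, 1.0],
])

# Priority weights = principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
RI = 0.58          # Saaty's random index for n = 3
CR = CI / RI

print("weights:", np.round(w, 3), "consistency ratio:", round(CR, 4))
```

    A consistency ratio below roughly 0.1 is conventionally taken to mean the pairwise judgements are acceptably coherent; the resulting weights can then feed the multiple-objective integer program as objective coefficients.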

  17. A Global Optimization Methodology for Rocket Propulsion Applications

    NASA Technical Reports Server (NTRS)

    2001-01-01

    While the response surface (RS) method is an effective method in engineering optimization, its accuracy is often affected by the use of a limited number of data points for model construction. In this chapter, the issues related to the accuracy of RS approximations and possible ways of improving the RS model with appropriate treatments, including the iteratively re-weighted least squares (IRLS) technique and radial-basis neural networks, are investigated. A main interest is to identify ways of giving the RS method the added capability of selectively improving accuracy in regions of importance. An example is to target the high-efficiency region of a fluid machinery design space so that the predictive power of the RS model is maximized where it matters most. Analytical models based on polynomials, with a controlled level of noise, are used to assess the performance of these techniques.
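
    The IRLS idea can be sketched on a one-dimensional polynomial response surface. This is a generic illustration, not the chapter's implementation: the quadratic coefficients, noise level, outlier positions, and Huber weighting scheme are all assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D "response surface": a quadratic with noise plus a few
# outliers standing in for unreliable design points.
x = np.linspace(-1.0, 1.0, 40)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.05, x.size)
y[[5, 20, 33]] += 2.0                      # inject outliers

X = np.vander(x, 3, increasing=True)       # columns: 1, x, x^2

# IRLS with Huber weights: points with large residuals are down-weighted
# so the polynomial RS model is not dragged toward the outliers.
beta = np.linalg.lstsq(X, y, rcond=None)[0]  # ordinary least-squares start
for _ in range(20):
    r = y - X @ beta
    s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale estimate
    c = 1.345 * s                               # Huber tuning constant
    wts = np.where(np.abs(r) <= c, 1.0, c / np.abs(r))
    sw = np.sqrt(wts)
    beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]

print("recovered coefficients:", np.round(beta, 3))  # true values: 1, 2, -3
```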

  18. Information management: considering adolescents' regulation of parental knowledge.

    PubMed

    Marshall, Sheila K; Tilton-Weaver, Lauree C; Bosdet, Lara

    2005-10-01

    Employing Goffman's [(1959). The presentation of self in everyday life. New York: Doubleday and Company] notion of impression management, adolescents' conveyance of information about their whereabouts and activities to parents was assessed using two methodologies. First, a two-wave panel design with a sample of 121 adolescents was used to test a model of information management incorporating two forms of information regulation (lying and willingness to disclose), adolescents' perception of their parents' knowledge about their activities, and adolescent misconduct. Path analysis was used to examine the model for two forms of misconduct as outcomes: substance use and antisocial behaviours. Fit indices indicate that the path models were all good fits to the data. Second, 96 participants' responses to semi-structured questions were analyzed using a qualitative analytic technique. Findings reveal that adolescents withhold or divulge information in coordination with their parents, employ impression management techniques, and try to balance safety issues with preservation of the parent-adolescent relationship.

  19. On a PLIF quantification methodology in a nonlinear dye response regime

    NASA Astrophysics Data System (ADS)

    Baj, P.; Bruce, P. J. K.; Buxton, O. R. H.

    2016-06-01

    A new technique for planar laser-induced fluorescence (PLIF) calibration is presented in this work. It accounts, in particular, for the nonlinear dye response at high concentrations, attenuation of the illumination light, and the influence of secondary fluorescence. An analytical approximation of a generic solution of the Beer-Lambert law is provided and utilized for effective concentration evaluation. These features make the technique particularly well suited for high-concentration measurements, or those with a large range of concentration values, c, present (i.e. a high dynamic range of c). The method is applied to data gathered in a water flume experiment where a stream of a fluorescent dye (rhodamine 6G) was released into a grid-generated turbulent flow. Based on these results, it is shown that illumination attenuation and secondary fluorescence introduce a significant error into the data quantification (up to 15% and 80%, respectively, for the case considered in this work) unless properly accounted for.
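
    The role of the Beer-Lambert law in such a correction can be illustrated with a minimal marching scheme. This is a hedged sketch, not the authors' calibration procedure: the absorption coefficient, the synthetic dye profile, and the unit fluorescence constant are assumptions, and secondary fluorescence is ignored.

```python
import numpy as np

# Sketch: correct a PLIF signal for illumination attenuation by marching
# along the laser ray. eps and the dye profile c_true are assumed values.
eps = 0.5                        # absorption per (concentration x length)
ds = 0.01                        # step size along the ray
s = np.arange(0.0, 1.0, ds)
c_true = 0.8 * np.exp(-((s - 0.5) / 0.15) ** 2)   # synthetic dye plume

# Beer-Lambert: I(s) = I0 * exp(-eps * integral_0^s c ds'), discretised
# with an exclusive cumulative sum so I[i] is the intensity entering cell i.
cum = np.concatenate(([0.0], np.cumsum(c_true)[:-1])) * ds
I = np.exp(-eps * cum)

# In the linear dye-response regime the recorded signal is F = I * c
# (fluorescence constant taken as 1), so attenuation distorts F downstream.
F = I * c_true

# March along the ray: invert the local signal, then attenuate the
# illumination estimate before moving to the next cell.
I_est = 1.0
c_est = np.empty_like(c_true)
for i in range(s.size):
    c_est[i] = F[i] / I_est
    I_est *= np.exp(-eps * c_est[i] * ds)

print(f"max illumination attenuation along the ray: {1 - I.min():.1%}")
```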

  20. Carbene footprinting accurately maps binding sites in protein-ligand and protein-protein interactions

    NASA Astrophysics Data System (ADS)

    Manzi, Lucio; Barrow, Andrew S.; Scott, Daniel; Layfield, Robert; Wright, Timothy G.; Moses, John E.; Oldham, Neil J.

    2016-11-01

    Specific interactions between proteins and their binding partners are fundamental to life processes. The ability to detect protein complexes, and map their sites of binding, is crucial to understanding basic biology at the molecular level. Methods that employ sensitive analytical techniques such as mass spectrometry have the potential to provide valuable insights with very little material and on short time scales. Here we present a differential protein footprinting technique employing an efficient photo-activated probe for use with mass spectrometry. Using this methodology the location of a carbohydrate substrate was accurately mapped to the binding cleft of lysozyme, and in a more complex example, the interactions between a 100 kDa, multi-domain deubiquitinating enzyme, USP5, and a diubiquitin substrate were located to different functional domains. The much improved properties of this probe make carbene footprinting a viable method for rapid and accurate identification of protein binding sites utilizing benign, near-UV photoactivation.

  1. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site-Working towards a toolbox for better assessment.

    PubMed

    Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and in extrapolating soil concentrations to air concentrations. The Environmental Protection Agency's (EPA) Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than the others. Samples were collected using both traditional discrete ("grab") sampling and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM), or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations where it was not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating that its results may be more variable than those of other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos yielded no clear conclusions regarding the preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than those in discrete samples.

  2. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site—Working towards a toolbox for better assessment

    PubMed Central

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and in extrapolating soil concentrations to air concentrations. The Environmental Protection Agency’s (EPA) Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than the others. Samples were collected using both traditional discrete (“grab”) sampling and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM), or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations where it was not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating that its results may be more variable than those of other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos yielded no clear conclusions regarding the preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than those in discrete samples. PMID:28759607

  3. Belgian guidelines for economic evaluations: second edition.

    PubMed

    Thiry, Nancy; Neyt, Mattias; Van De Sande, Stefaan; Cleemput, Irina

    2014-12-01

    The aim of this study was to present the updated methodological guidelines for economic evaluations of healthcare interventions (drugs, medical devices, and other interventions) in Belgium. The update of the guidelines was performed by three Belgian health economists following feedback from users of the former guidelines and personal experience. The updated guidelines were discussed with a multidisciplinary team consisting of other health economists, assessors of reimbursement request files, representatives of Belgian databases, and representatives of the drugs and medical devices industry. The final document was validated by three external validators who were not involved in the previous discussions. The guidelines give methodological guidance on the following components of an economic evaluation: literature review, perspective of the evaluation, definition of the target population, choice of the comparator, analytic technique and study design, calculation of costs, valuation of outcomes, definition of the time horizon, modeling, handling uncertainty, and discounting. We present a reference case that can be considered the minimal requirement for Belgian economic evaluations of health interventions. These guidelines will improve the methodological quality, transparency and uniformity of the economic evaluations performed in Belgium. They will also provide support to the researchers and assessors performing or evaluating economic evaluations.

  4. [Epistemological/methodological contributions to the fortification of an emancipatory con(science)].

    PubMed

    Ferreira, Marcelo José Monteiro; Rigotto, Raquel Maria

    2014-10-01

    This article conducts a critical and reflective analysis of the paths of elaboration, systematization and communication of research results in conjunction with colleges, social movements and individuals in the territory under scrutiny. To this end, the article embraces as its core analytical theme the process of shared production of knowledge, both in the epistemological-methodological field and with respect to its social destination. The case study was adopted as the methodology, preceded by the use of focus groups and in-depth interviews as techniques. To analyze the qualitative material, discourse analysis was adopted in line with the assumptions of in-depth hermeneutics. The results are presented in two stages: first, the new possibilities for a paradigmatic reorientation are discussed, arising from the permanent and procedural interlocution with the empirical field and its different contexts and actors. Second, the distinct ways of appropriating the knowledge produced in dialogue with the social movements and the individuals in the territory are analyzed in the praxiological dimension. It concludes by highlighting alternative and innovative paths towards an edifying academic practice, one which stresses solidarity and is sensitive to the vulnerable population and its requests.

  5. The SIMRAND methodology: Theory and application for the simulation of research and development projects

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks is to be undertaken to achieve the goal of the R&D project. To aid in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of preference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND I computer program then combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
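
    The flavour of such a Monte Carlo comparison of alternative paths can be sketched as follows. This is an illustrative toy, not the SIMRAND program itself: the two paths, the triangular cost distributions standing in for expert-assessed CDFs, and the risk-averse utility form are all invented.

```python
import random

random.seed(1)

# Two hypothetical alternative paths to the same project goal; each task's
# cost is uncertain and drawn from a (low, mode, high) triangular
# distribution elicited, in the SIMRAND spirit, from technical experts.
paths = {
    "path_A": [(2.0, 4.0, 9.0), (1.0, 2.0, 3.0)],
    "path_B": [(3.0, 5.0, 6.0)],
}

def utility(cost):
    # Risk-averse cardinal utility over total cost (illustrative form).
    return -cost ** 1.5

def expected_utility(tasks, n=20000):
    # Monte Carlo estimate of expected utility for one alternative path.
    total = 0.0
    for _ in range(n):
        cost = sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks)
        total += utility(cost)
    return total / n

scores = {name: expected_utility(tasks) for name, tasks in paths.items()}
best = max(scores, key=scores.get)
print("preferred path:", best, scores)
```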

  6. On-line coupling of supercritical fluid extraction and chromatographic techniques.

    PubMed

    Sánchez-Camargo, Andrea Del Pilar; Parada-Alfonso, Fabián; Ibáñez, Elena; Cifuentes, Alejandro

    2017-01-01

    This review summarizes and discusses recent advances and applications of on-line supercritical fluid extraction coupled to liquid chromatography, gas chromatography, and supercritical fluid chromatography. Supercritical fluids, owing to their exceptional physical properties, provide unique opportunities not only during the extraction step but also in the separation process. Although supercritical fluid extraction is especially suitable for the recovery of non-polar organic compounds, the technique can also be successfully applied to the extraction of polar analytes with the aid of modifiers. The supercritical fluid extraction process can be performed following "off-line" or "on-line" approaches, and their main features are contrasted herein. The parameters affecting the supercritical fluid extraction process are explained, and a "decision tree" is presented for the first time in this review as a guide for method development. The general principles (instrumental and methodological) of the different on-line couplings of supercritical fluid extraction with chromatographic techniques are described, and the advantages and shortcomings of supercritical fluid extraction as a hyphenated technique are discussed. In addition, an update of the most recent applications (from 2005 up to now) of these couplings is presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Trace elemental analysis of Indian natural moonstone gems by PIXE and XRD techniques.

    PubMed

    Venkateswara Rao, R; Venkateswarulu, P; Kasipathi, C; Sivajyothi, S

    2013-12-01

    A selected number of Indian Eastern Ghats natural moonstone gems were studied with Proton Induced X-ray Emission (PIXE), a powerful, non-destructive nuclear analytical technique. Thirteen elements, including V, Co, Ni, Zn, Ga, Ba and Pb, were identified in these moonstones and may be useful in interpreting the various geochemical conditions and the probable cause of their inceptions in the moonstone gemstone matrix. Preliminary XRD studies of different moonstone patterns were also performed. The PIXE technique is a powerful method for quickly determining the elemental concentration of a substance; a 3 MeV proton beam was employed to excite the samples. The chemical constituents of moonstones from parts of the Eastern Ghats geological formations of Andhra Pradesh, India were determined, and gemological studies were performed on those gems. The crystal structure and lattice parameters of the moonstones were estimated using X-ray diffraction; trace and minor elements were determined using the PIXE technique, and major compositional elements were confirmed by XRD. In the present work, the usefulness and versatility of the PIXE technique for research in geo-scientific methodology is established. © 2013 Elsevier Ltd. All rights reserved.

  8. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  9. MS-based analytical methodologies to characterize genetically modified crops.

    PubMed

    García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro

    2011-01-01

    The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations that have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need of new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMOs analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview on genetically modified crops development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops are critically discussed, and include "-omics" approaches and target-based approaches. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered to be essential to corroborate (or not) the equivalence of the GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.

  10. Durability predictions of adhesively bonded composite structures using accelerated characterization methods

    NASA Technical Reports Server (NTRS)

    Brinson, H. F.

    1985-01-01

    The utilization of adhesive bonding for composite structures is briefly assessed, and the need for a method to determine damage initiation and propagation for such joints is outlined. Methods currently in use to analyze both adhesive joints and fiber-reinforced plastics are mentioned, and it is indicated that all of them require as input the mechanical properties of the polymeric adhesive and the composite matrix material. The mechanical properties of polymers are viscoelastic and sensitive to environmental effects. A method to analytically characterize environmentally dependent linear and nonlinear viscoelastic properties is given. The methodology can be used to extrapolate short-term data to long-term design lifetimes; that is, it can be used for long-term durability predictions. Experimental results for neat adhesive resins, polymers used as composite matrices, and unidirectional composite laminates are given, and the data are fitted well by the analytical durability methodology. Finally, suggestions are outlined for the development of an analytical methodology for the durability prediction of adhesively bonded composite structures.

  11. Combining numerical simulations with time-domain random walk for pathogen risk assessment in groundwater

    NASA Astrophysics Data System (ADS)

    Cvetkovic, V.; Molin, S.

    2012-02-01

    We present a methodology that combines numerical simulations of groundwater flow and advective transport in heterogeneous porous media with analytical retention models for computing the infection risk probability from pathogens in aquifers. The methodology is based on the analytical results presented in [1,2] for utilising colloid filtration theory in a time-domain random walk (TDRW) framework. It is shown that in uniform flow, the numerical simulations of advection yield results comparable to those of the analytical TDRW model for generating advection segments. It is also shown that spatial variability of the attachment rate may be significant; however, it appears to affect risk in a different manner depending on whether the flow is uniform or radially converging. Although numerous issues remain open regarding pathogen transport in aquifers on the field scale, the methodology presented here may be useful for screening purposes and may also serve as a basis for future studies that would include greater complexity.
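
    The core of such a screening calculation, an advective travel time drawn per particle combined with exponential colloid-filtration retention, can be sketched under invented parameters (the attachment rate and the lognormal travel-time distribution below are assumptions, not values from the study):

```python
import math
import random

random.seed(7)

# Assumed parameters for illustration only.
k_att = 0.5            # attachment/filtration rate, 1/day
mu, sigma = 2.0, 0.6   # lognormal travel-time parameters (days)

def surviving_fraction(n=50000):
    # Each particle draws an advective travel time tau (one TDRW advection
    # segment); colloid filtration removes pathogens at rate k_att along the
    # way, so exp(-k_att * tau) survive to reach the receptor.
    total = 0.0
    for _ in range(n):
        tau = random.lognormvariate(mu, sigma)
        total += math.exp(-k_att * tau)
    return total / n

f = surviving_fraction()
print(f"mean surviving fraction at the receptor: {f:.4f}")
```

    The surviving fraction would then feed a dose-response model to convert arriving pathogen numbers into an infection risk probability.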

  12. A shipboard comparison of analytic methods for ballast water compliance monitoring

    NASA Astrophysics Data System (ADS)

    Bradie, Johanna; Broeg, Katja; Gianoli, Claudio; He, Jianjun; Heitmüller, Susanne; Curto, Alberto Lo; Nakata, Akiko; Rolke, Manfred; Schillak, Lothar; Stehouwer, Peter; Vanden Byllaardt, Julie; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Promising approaches for indicative analysis of ballast water samples have been developed that require study in the field to examine their utility for determining compliance with the International Convention for the Control and Management of Ships' Ballast Water and Sediments. To address this gap, a voyage was undertaken on board the RV Meteor, sailing the North Atlantic Ocean from Mindelo (Cape Verde) to Hamburg (Germany) during June 4-15, 2015. Trials were conducted on local sea water taken up by the ship's ballast system at multiple locations along the trip, including open ocean, North Sea, and coastal water, to evaluate a number of analytic methods that measure the numeric concentration or biomass of viable organisms according to two size categories (≥ 50 μm in minimum dimension: 7 techniques, ≥ 10 μm and < 50 μm: 9 techniques). Water samples were analyzed in parallel to determine whether results were similar between methods and whether rapid, indicative methods offer comparable results to standard, time- and labor-intensive detailed methods (e.g. microscopy) and high-end scientific approaches (e.g. flow cytometry). Several promising indicative methods were identified that showed high correlation with microscopy, but allow much quicker processing and require less expert knowledge. This study is the first to concurrently use a large number of analytic tools to examine a variety of ballast water samples on board an operational ship in the field. Results are useful to identify the merits of each method and can serve as a basis for further improvement and development of tools and methodologies for ballast water compliance monitoring.

  13. Microbiological concerns and methodological approaches related to bacterial water quality in spaceflight

    NASA Technical Reports Server (NTRS)

    Pyle, Barry H.; Mcfeters, Gordon A.

    1992-01-01

    A number of microbiological issues are of critical importance to crew health and system performance in spacecraft water systems. This presentation reviews an array of these concerns, including factors that influence water treatment and disinfection in spaceflight such as biofilm formation and the physiological responses of bacteria in clean water systems. Factors associated with spaceflight, such as aerosol formation under conditions of microgravity, are also discussed within the context of airborne infections such as Legionellosis. Finally, a spectrum of analytical approaches is reviewed to provide an evaluation of methodological alternatives that have been suggested or used to detect microorganisms of interest in water systems. These range from classical approaches employing colony formation on specific microbiological growth media to direct (i.e. microscopic) and indirect (e.g. electrochemical) methods, as well as the use of molecular approaches and gene probes. These techniques are critically evaluated for their potential utility in determining microbiological water quality through the detection of microorganisms under the ambient environmental stress inherent in spaceflight water systems.

  14. Methodological pitfalls in the analysis of contraceptive failure.

    PubMed

    Trussell, J

    1991-02-01

    Although the literature on contraceptive failure is vast and is expanding rapidly, our understanding of the relative efficacy of methods is quite limited because of defects in the research design and in the analytical tools used by investigators. Errors in the literature range from simple arithmetical mistakes to outright fraud. In many studies the proportion of the original sample lost to follow-up is so large that the published results have little meaning. Investigators do not routinely use life table techniques to control for duration of exposure; many employ the Pearl index, which suffers from the same problem as does the crude death rate as a measure of mortality. Investigators routinely calculate 'method' failure rates by eliminating 'user' failures from the numerator (pregnancies) but fail to eliminate 'imperfect' use from the denominator (exposure); as a consequence, these 'method' rates are biased downward. This paper explores these and other common biases that snare investigators and establishes methodological guidelines for future research.
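
    The numerical consequences of these biases are easy to demonstrate. In the hypothetical numbers below (all invented), the Pearl index, the downward-biased 'method' rate obtained by trimming only the numerator, and a simple life-table-style cumulative probability are computed side by side.

```python
# Hypothetical study data (invented for illustration).
pregnancies = 12
woman_months = 4800          # total months of exposure in the study

# Pearl index: pregnancies per 100 woman-years of exposure.
pearl = pregnancies / woman_months * 1200
print(f"Pearl index: {pearl:.1f} per 100 woman-years")

# Biased 'method' rate: user failures removed from the numerator while the
# denominator still contains the months of imperfect use, so the rate
# understates failure during perfect use -- the bias described above.
user_failures = 7
biased_method_pearl = (pregnancies - user_failures) / woman_months * 1200
print(f"biased 'method' rate: {biased_method_pearl:.2f} per 100 woman-years")

# Life-table style: cumulative probability of failure over 12 months from a
# constant per-month hazard, which controls for duration of exposure.
hazard = pregnancies / woman_months
cum_12m = 1 - (1 - hazard) ** 12
print(f"12-month cumulative failure probability: {cum_12m:.4f}")
```

    Unlike the Pearl index, the life-table figure does not drift downward as long-duration, low-risk users accumulate exposure, which is the analogy to the crude death rate drawn above.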

  15. Intermatrix Synthesis as a rapid, inexpensive and reproducible methodology for the in situ functionalization of nanostructured surfaces with quantum dots

    NASA Astrophysics Data System (ADS)

    Bastos-Arrieta, Julio; Muñoz, Jose; Stenbock-Fermor, Anja; Muñoz, Maria; Muraviev, Dmitri N.; Céspedes, Francisco; Tsarkova, Larisa A.; Baeza, Mireia

    2016-04-01

    The Intermatrix Synthesis (IMS) technique has proven to be a valid methodology for the in situ incorporation of quantum dots (QDs) into a wide range of nanostructured surfaces for the preparation of advanced hybrid nanomaterials. This communication reports recent advances in the application of IMS to the synthesis of CdS QDs with favourable distribution on sulfonated poly(ether ether ketone) (SPEEK) membrane thin films (TFs), multiwall carbon nanotubes (MWCNTs) and nanodiamonds (NDs). The synthetic route takes advantage of the ion-exchange functionality of the reactive surfaces for the loading of the QD precursor and the consequent appearance of QDs by precipitation. The benefits of such modified nanomaterials were studied using the CdS-QDs@MWCNTs hybrid nanomaterial, which has been used as a conducting filler for the preparation of electrochemical nanocomposite sensors that present electrocatalytic properties. Finally, the optical properties of the QDs contained on the MWCNTs could enable a new procedure for the analytical detection of nanostructured carbon allotropes in water.

  16. Conventional and Accelerated-Solvent Extractions of Green Tea (Camellia sinensis) for Metabolomics-based Chemometrics

    PubMed Central

    Kellogg, Joshua J.; Wallace, Emily D.; Graf, Tyler N.; Oberlies, Nicholas H.; Cech, Nadja B.

    2018-01-01

    Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent as an efficient methodology for metabolomics studies. PMID:28787673

  17. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis]

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  18. Integrated corridor management analysis, modeling and simulation (AMS) methodology.

    DOT National Transportation Integrated Search

    2008-03-01

    This AMS Methodologies Document provides a discussion of potential ICM analytical approaches for the assessment of generic corridor operations. The AMS framework described in this report identifies strategies and procedures for tailoring AMS general ...

  19. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    DOT National Transportation Integrated Search

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  20. Analytical methodology for safety validation of computer controlled subsystems. Volume 1 : state-of-the-art and assessment of safety verification/validation methodologies

    DOT National Transportation Integrated Search

    1995-09-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  1. Machine-learning techniques for geochemical discrimination of 2011 Tohoku tsunami deposits

    PubMed Central

    Kuwatani, Tatsu; Nagata, Kenji; Okada, Masato; Watanabe, Takahiro; Ogawa, Yasumasa; Komai, Takeshi; Tsuchiya, Noriyoshi

    2014-01-01

    Geochemical discrimination has recently been recognised as a potentially useful proxy for identifying tsunami deposits in addition to classical proxies such as sedimentological and micropalaeontological evidence. However, difficulties remain because it is unclear which elements best discriminate between tsunami and non-tsunami deposits. Herein, we propose a mathematical methodology for the geochemical discrimination of tsunami deposits using machine-learning techniques. The proposed method can determine the appropriate combinations of elements and the precise discrimination plane that best discerns tsunami deposits from non-tsunami deposits in high-dimensional compositional space through the use of data sets of bulk composition that have been categorised as tsunami or non-tsunami sediments. We applied this method to the 2011 Tohoku tsunami and to background marine sedimentary rocks. After an exhaustive search of all 262,144 (= 2^18) combinations of the 18 analysed elements, we observed several tens of combinations with discrimination rates higher than 99.0%. The analytical results show that elements such as Ca and several heavy-metal elements are important for discriminating tsunami deposits from marine sedimentary rocks. These elements are considered to reflect the formation mechanism and origin of the tsunami deposits. The proposed methodology has the potential to aid in the identification of past tsunamis by using other tsunami proxies. PMID:25399750
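The exhaustive combination search described in this record can be sketched in a few lines. This is a toy illustration, not the paper's code: the element list, the compositions, and the nearest-centroid discriminant (standing in for the paper's discrimination plane) are all invented for demonstration.

```python
from itertools import combinations

# Toy bulk compositions (values invented): each row maps element -> concentration,
# labelled 1 for tsunami deposit, 0 for background marine sediment. The real
# study analysed 18 elements, giving 2^18 = 262,144 candidate combinations.
ELEMENTS = ["Ca", "Sr", "Pb", "Zn"]
samples = [
    ({"Ca": 9.1, "Sr": 0.8, "Pb": 0.30, "Zn": 0.9}, 1),
    ({"Ca": 8.7, "Sr": 0.9, "Pb": 0.28, "Zn": 1.0}, 1),
    ({"Ca": 2.1, "Sr": 0.2, "Pb": 0.05, "Zn": 0.3}, 0),
    ({"Ca": 2.4, "Sr": 0.3, "Pb": 0.07, "Zn": 0.2}, 0),
]

def centroid(rows, elems):
    return [sum(r[e] for r in rows) / len(rows) for e in elems]

def discrimination_rate(elems):
    """Fraction of samples classified correctly by a nearest-centroid rule
    (a stand-in for the paper's discrimination plane)."""
    tsu = [s for s, y in samples if y == 1]
    non = [s for s, y in samples if y == 0]
    c1, c0 = centroid(tsu, elems), centroid(non, elems)
    hits = 0
    for s, y in samples:
        v = [s[e] for e in elems]
        d1 = sum((a - b) ** 2 for a, b in zip(v, c1))
        d0 = sum((a - b) ** 2 for a, b in zip(v, c0))
        hits += int((d1 < d0) == (y == 1))
    return hits / len(samples)

# Exhaustive search over every non-empty element combination.
best = max(
    (combo for k in range(1, len(ELEMENTS) + 1)
     for combo in combinations(ELEMENTS, k)),
    key=discrimination_rate,
)
print(best, discrimination_rate(best))
```

With real data, the same loop simply runs over all 2^18 subsets and records every combination whose rate exceeds the chosen threshold.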

  2. A novel methodology for estimating upper limits of major cost drivers for profitable conceptual launch system architectures

    NASA Astrophysics Data System (ADS)

    Rhodes, Russel E.; Byrd, Raymond J.

    1998-01-01

    This paper presents a "back of the envelope" technique for fast, timely, on-the-spot assessment of affordability (profitability) of commercial space transportation architectural concepts. The tool presented here is not intended to replace conventional, detailed costing methodology. The process described enables "quick look" estimations and assumptions to effectively determine whether an initial concept (with its attendant cost estimating line items) provides focus for major leapfrog improvement. The Cost Charts Users Guide provides a generic sample tutorial, building an approximate understanding of the basic launch system cost factors and their representative magnitudes. This process will enable the user to develop a net "cost (and price) per payload-mass unit to orbit" incorporating a variety of significant cost drivers, supplemental to basic vehicle cost estimates. If acquisition cost and recurring cost factors (as a function of cost per payload-mass unit to orbit) do not meet the predetermined system-profitability goal, the concept in question will be clearly seen as non-competitive. Multiple analytical approaches, and applications of a variety of interrelated assumptions, can be examined in a quick, on-the-spot cost approximation analysis, as this tool has inherent flexibility. The technique will allow determination of concept conformance to system objectives.
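The quick-look arithmetic behind a net cost-per-payload-mass figure can be illustrated as below. Every number is an invented placeholder, not a value from the Cost Charts Users Guide.

```python
# Back-of-the-envelope cost per payload-mass unit to orbit. Every figure below
# is an invented placeholder, not a value from the Cost Charts Users Guide.
payload_kg = 20_000               # payload delivered per flight
acquisition_cost = 2.0e9          # vehicle acquisition cost, amortised over life
amortisation_flights = 120        # flights over which acquisition is spread
recurring_per_flight = 30e6       # ops, propellant, refurbishment per flight

cost_per_flight = acquisition_cost / amortisation_flights + recurring_per_flight
cost_per_kg = cost_per_flight / payload_kg
print(round(cost_per_kg, 2))      # compare against the profitability goal
```

If the resulting figure exceeds the predetermined system-profitability goal, the concept is immediately seen as non-competitive, which is the screening role the paper describes.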

  3. Development of a methodology for strategic environmental assessment: application to the assessment of golf course installation policy in Taiwan.

    PubMed

    Chen, Ching-Ho; Wu, Ray-Shyan; Liu, Wei-Lin; Su, Wen-Ray; Chang, Yu-Min

    2009-01-01

    Some countries, including Taiwan, have adopted strategic environmental assessment (SEA) to assess and modify proposed policies, plans, and programs (PPPs) in the planning phase for pursuing sustainable development. However, Taiwan's system previously included only some sketchy steps focusing on policy assessment. This study aims to develop a methodology for SEA in Taiwan to enhance the effectiveness associated with PPPs. The proposed methodology comprises an SEA procedure involving PPP management and assessment in various phases, a sustainable assessment framework, and an SEA management system. The SEA procedure is devised based on the theoretical considerations by systems thinking and the regulative requirements in Taiwan. The positive and negative impacts on ecology, society, and economy are simultaneously considered in the planning (including policy generation and evaluation), implementation, and control phases of the procedure. This study used the analytic hierarchy process, Delphi technique, and systems analysis to develop a sustainable assessment framework. An SEA management system was built based on geographic information system software to process spatial, attribute, and satellite image data during the assessment procedure. The proposed methodology was applied in the SEA of golf course installation policy in 2001 as a case study, which was the first SEA in Taiwan. Most of the 82 existing golf courses in 2001 were installed on slope lands and caused a serious ecological impact. Assessment results indicated that 15 future golf courses installed on marginal lands (including buffer zones, remedied lands, and wastelands) were acceptable because the comprehensive environmental (ecological, social, and economic) assessment value was better based on environmental characteristics and management regulations of Taiwan.
The SEA procedure in the planning phase for this policy was completed, but the implementation phase never began because the related legislative procedure could not be arranged owing to a few senators' resistance. A self-review of the control phase was carried out in 2006 using this methodology. Installation permits for 12 courses on slope lands were terminated after 2001, and 27 future courses could then be installed on marginal lands. The assessment value of this policy using the data on ecological, social, and economic conditions from 2006 was higher than that using the data from 2001. The analytical results illustrate that the proposed methodology can effectively and efficiently assist the relevant authorities with SEA.

  4. Building a Three-Dimensional Nano-Bio Interface for Aptasensing: An Analytical Methodology Based on Steric Hindrance Initiated Signal Amplification Effect.

    PubMed

    Du, Xiaojiao; Jiang, Ding; Hao, Nan; Qian, Jing; Dai, Liming; Zhou, Lei; Hu, Jianping; Wang, Kun

    2016-10-04

    The development of novel detection methodologies in electrochemiluminescence (ECL) aptasensor fields with simplicity and ultrasensitivity is essential for constructing biosensing architectures. Herein, an unprecedented, facile, specific, and sensitive methodology was developed for quantitative detection of microcystin-LR (MC-LR) based on a three-dimensional boron and nitrogen codoped graphene hydrogel (BN-GH)-assisted steric hindrance amplifying effect between the aptamer and target analytes. The recognition reaction was monitored by quartz crystal microbalance (QCM) to validate the possible steric hindrance effect. First, the BN-GHs were synthesized via a self-assembled hydrothermal method and then applied as the Ru(bpy)₃²⁺ immobilization platform for further loading the biomolecule aptamers due to their nanoporous structure and large specific surface area. Interestingly, we discovered for the first time that, without the aid of conventional double-stranded DNA configuration, such three-dimensional nanomaterials can directly amplify the steric hindrance effect between the aptamer and target analytes to a detectable level, and this facile methodology could serve as the basis for an exquisite assay. With MC-LR as a model, this novel ECL biosensor showed a high sensitivity and a wide linear range. This strategy supplies a simple and versatile platform for specific and sensitive determination of a wide range of aptamer-related targets, implying that three-dimensional nanomaterials would play a crucial role in engineering and developing novel detection methodologies for ECL aptasensing fields.

  5. Violent Video Game Effects on Aggression, Empathy, and Prosocial Behavior in Eastern and Western Countries: A Meta-Analytic Review

    ERIC Educational Resources Information Center

    Anderson, Craig A.; Shibuya, Akiko; Ihori, Nobuko; Swing, Edward L.; Bushman, Brad J.; Sakamoto, Akira; Rothstein, Hannah R.; Saleem, Muniba

    2010-01-01

    Meta-analytic procedures were used to test the effects of violent video games on aggressive behavior, aggressive cognition, aggressive affect, physiological arousal, empathy/desensitization, and prosocial behavior. Unique features of this meta-analytic review include (a) more restrictive methodological quality inclusion criteria than in past…

  6. Development of a model web-based system to support a statewide quality consortium in radiation oncology.

    PubMed

    Moran, Jean M; Feng, Mary; Benedetti, Lisa A; Marsh, Robin; Griffith, Kent A; Matuszak, Martha M; Hess, Michael; McMullen, Matthew; Fisher, Jennifer H; Nurushev, Teamour; Grubb, Margaret; Gardner, Stephen; Nielsen, Daniel; Jagsi, Reshma; Hayman, James A; Pierce, Lori J

    A database in which patient data are compiled allows analytic opportunities for continuous improvements in treatment quality and comparative effectiveness research. We describe the development of a novel, web-based system that supports the collection of complex radiation treatment planning information from centers that use diverse techniques, software, and hardware for radiation oncology care in a statewide quality collaborative, the Michigan Radiation Oncology Quality Consortium (MROQC). The MROQC database seeks to enable assessment of physician- and patient-reported outcomes and quality improvement as a function of treatment planning and delivery techniques for breast and lung cancer patients. We created tools to collect anonymized data based on all plans. The MROQC system representing 24 institutions has been successfully deployed in the state of Michigan. Since 2012, dose-volume histogram and Digital Imaging and Communications in Medicine-radiation therapy plan data and information on simulation, planning, and delivery techniques have been collected. Audits indicated >90% accurate data submission and spurred refinements to data collection methodology. This model web-based system captures detailed, high-quality radiation therapy dosimetry data along with patient- and physician-reported outcomes and clinical data for a radiation therapy collaborative quality initiative. The collaborative nature of the project has been integral to its success. Our methodology can be applied to setting up analogous consortiums and databases. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.

  7. Combining geographic information system, multicriteria evaluation techniques and fuzzy logic in siting MSW landfills

    NASA Astrophysics Data System (ADS)

    Gemitzi, Alexandra; Tsihrintzis, Vassilios A.; Voudrias, Evangelos; Petalas, Christos; Stravodimos, George

    2007-01-01

    This study presents a methodology for siting municipal solid waste landfills, coupling geographic information systems (GIS), fuzzy logic, and multicriteria evaluation techniques. Both exclusionary and non-exclusionary criteria are used. Factors, i.e., non-exclusionary criteria, are divided into two distinct groups which do not have the same level of trade off. The first group comprises factors related to the physical environment, which cannot be expressed in terms of monetary cost and, therefore, do not easily trade off. The second group includes those factors related to human activities, i.e., socioeconomic factors, which can be expressed as financial cost, thus showing a high level of trade off. GIS are used for geographic data acquisition and processing. The analytical hierarchy process (AHP) is the multicriteria evaluation technique used, enhanced with fuzzy factor standardization. Besides assigning weights to factors through the AHP, control over the level of risk and trade off in the siting process is achieved through a second set of weights, i.e., order weights, applied to factors in each factor group, on a pixel-by-pixel basis, thus taking into account the local site characteristics. The method has been applied to Evros prefecture (NE Greece), an area of approximately 4,000 km². The siting methodology results in two intermediate suitability maps, one related to environmental and the other to socioeconomic criteria. Combination of the two intermediate maps results in the final composite suitability map for landfill siting.
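The fuzzy factor standardization and weighted combination step described in this record can be sketched as follows. The 0-255 suitability scale is the common GIS-MCE convention; the factor thresholds and weights below are invented for illustration, not taken from the study.

```python
# Linear fuzzy standardisation of siting factors onto the 0-255 suitability
# scale common in GIS multicriteria evaluation, followed by a weighted linear
# combination for one pixel. Thresholds and weights are invented placeholders.
def fuzz_linear(x, lo, hi):
    """0 below lo, 255 above hi, linear in between (monotone-increasing factor)."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 255.0
    return 255.0 * (x - lo) / (hi - lo)

def wlc(scores, weights):
    """Weighted linear combination of standardised factor scores."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# e.g. distance to streams (m) and distance to roads (m) for one pixel
pixel = [fuzz_linear(800, 500, 2000), fuzz_linear(1200, 0, 1500)]
print(round(wlc(pixel, [0.6, 0.4]), 1))
```

A full implementation would additionally apply order weights per factor group on each pixel, which is how the study controls risk and trade off locally.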

  8. Evaluation and Selection of Best Priority Sequencing Rule in Job Shop Scheduling using Hybrid MCDM Technique

    NASA Astrophysics Data System (ADS)

    Kiran Kumar, Kalla; Nagaraju, Dega; Gayathri, S.; Narayanan, S.

    2017-05-01

    Priority sequencing rules provide guidance on the order in which jobs are to be processed at a workstation. Different priority rules in job shop scheduling give different orders of scheduling, and more experimentation needs to be conducted before a final choice of the best priority sequencing rule is made. Hence, a comprehensive selection method is essential from a managerial decision-making perspective. This paper considers seven different priority sequencing rules in job shop scheduling, evaluated against a set of eight criteria. The aim of this work is to demonstrate a methodology for evaluating and selecting the best priority sequencing rule using a hybrid multi-criteria decision-making (MCDM) technique, i.e., the analytical hierarchy process (AHP) combined with the technique for order preference by similarity to ideal solution (TOPSIS). The criteria weights are calculated using AHP, whereas the relative closeness values of all priority sequencing rules are computed with TOPSIS using data acquired from the shop floor of a manufacturing firm. Finally, the priority sequencing rules are ranked from most to least preferred. The methodology presented in this paper helps the management of a workstation choose the best priority sequencing rule among the available alternatives for processing jobs with maximum benefit.
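A minimal sketch of the hybrid AHP-TOPSIS calculation is shown below, assuming a toy 3x3 pairwise-comparison matrix and a 3x3 decision matrix; the real study uses seven rules and eight criteria with shop-floor data.

```python
import math

# Hypothetical pairwise-comparison matrix for three criteria (Saaty scale);
# the study itself uses eight criteria and seven sequencing rules.
P = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]

# AHP weights via the geometric-mean approximation of the principal eigenvector.
gm = [math.prod(row) ** (1 / len(row)) for row in P]
w = [g / sum(gm) for g in gm]

# Hypothetical decision matrix: rows = sequencing rules, columns = criteria
# (all treated as benefit-type for simplicity).
D = [[7, 9, 6],
     [8, 6, 7],
     [5, 7, 9]]

# TOPSIS: vector-normalise, weight, then rank by closeness to the ideal point.
norms = [math.sqrt(sum(D[i][j] ** 2 for i in range(3))) for j in range(3)]
V = [[w[j] * D[i][j] / norms[j] for j in range(3)] for i in range(3)]
ideal = [max(V[i][j] for i in range(3)) for j in range(3)]
anti = [min(V[i][j] for i in range(3)) for j in range(3)]

def closeness(row):
    dp, dn = math.dist(row, ideal), math.dist(row, anti)
    return dn / (dp + dn)

scores = [closeness(row) for row in V]
ranking = sorted(range(3), key=lambda i: -scores[i])
print(ranking, [round(s, 3) for s in scores])
```

The ranking by relative closeness is exactly the "most to least preferred" ordering the paper reports; only the matrices change with real data.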

  9. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR is its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experimental evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
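For context, single-analyte qHNMR quantitation against a calibrant is commonly written in the following form (notation assumed here: I = integral area, N = number of protons contributing to the signal, M = molar mass, m = weighed mass, P = purity; "cal" denotes the calibrant):

```latex
P_x \;=\; \frac{I_x}{I_{\mathrm{cal}}} \cdot \frac{N_{\mathrm{cal}}}{N_x} \cdot
          \frac{M_x}{M_{\mathrm{cal}}} \cdot \frac{m_{\mathrm{cal}}}{m_x} \cdot P_{\mathrm{cal}}
```

Because the 1H signal response is proportional only to the number of protons, the calibrant need not be structurally related to the analyte, which underlies the review's point that identical reference materials are not required.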

  10. Mind the gaps - the epidemiology of poor-quality anti-malarials in the malarious world - analysis of the WorldWide Antimalarial Resistance Network database

    PubMed Central

    2014-01-01

    Background Poor quality medicines threaten the lives of millions of patients and are alarmingly common in many parts of the world. Nevertheless, the global extent of the problem remains unknown. Accurate estimates of the epidemiology of poor quality medicines are sparse and are influenced by sampling methodology and diverse chemical analysis techniques. In order to understand the existing data, the Antimalarial Quality Scientific Group at WWARN built a comprehensive, open-access, global database and linked Antimalarial Quality Surveyor, an online visualization tool. Analysis of the database is described here, the limitations of the studies and data reported, and their public health implications discussed. Methods The database collates customized summaries of 251 published anti-malarial quality reports in English, French and Spanish by time and location since 1946. It also includes information on assays to determine quality, sampling and medicine regulation. Results No publicly available reports for 60.6% (63) of the 104 malaria-endemic countries were found. Out of 9,348 anti-malarials sampled, 30.1% (2,813) failed chemical/packaging quality tests with 39.3% classified as falsified, 2.3% as substandard and 58.3% as poor quality without evidence available to categorize them as either substandard or falsified. Only 32.3% of the reports explicitly described their definitions of medicine quality and just 9.1% (855) of the samples collected in 4.6% (six) surveys were conducted using random sampling techniques. Packaging analysis was only described in 21.5% of publications and up to twenty wrong active ingredients were found in falsified anti-malarials. Conclusions There are severe neglected problems with anti-malarial quality but there are important caveats to accurately estimate the prevalence and distribution of poor quality anti-malarials. 
The lack of reports in many malaria-endemic areas, inadequate sampling techniques and inadequate chemical analytical methods and instrumental procedures emphasize the need to interpret medicine quality results with caution. The available evidence demonstrates the need for more investment to improve both sampling and analytical methodology and to achieve consensus in defining different types of poor quality medicines. PMID:24712972

  11. Absorption into fluorescence. A method to sense biologically relevant gas molecules

    NASA Astrophysics Data System (ADS)

    Strianese, Maria; Varriale, Antonio; Staiano, Maria; Pellecchia, Claudio; D'Auria, Sabato

    2011-01-01

    In this work we present an innovative optical sensing methodology based on the use of biomolecules as molecular gating nano-systems. Here, as an example, we report on the detection of analytes related to climate change. In particular, we focused our attention on the detection of nitric oxide (NO) and oxygen (O2). Our methodology builds on the possibility of modulating the excitation intensity of a fluorescent probe used as a transducer and a sensor molecule whose absorption is strongly affected by the binding of an analyte of interest used as a filter. The two simple conditions that have to be fulfilled for the method to work are: (a) the absorption spectrum of the sensor placed inside the cuvette, and acting as the recognition element for the analyte of interest, should strongly change upon the binding of the analyte and (b) the fluorescence dye transducer should exhibit an excitation band which overlaps with one or more absorption bands of the sensor. The absorption band of the sensor affected by the binding of the specific analyte should overlap with the excitation band of the transducer. The high sensitivity of fluorescence detection combined with the use of proteins as highly selective sensors makes this method a powerful basis for the development of a new generation of analytical assays. Proof-of-principle results showing that cytochrome c peroxidase (CcP) for NO detection and myoglobin (Mb) for O2 detection can be successfully used by exploiting our new methodology are reported. The proposed technology can be easily expanded to the determination of different target analytes.
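The gating principle described in this record can be reduced to a one-line Beer-Lambert attenuation: the sensor protein acts as an inner filter on the transducer's excitation light, so the fluorescence signal falls as the sensor's absorbance rises on analyte binding. The absorbance values below are invented placeholders, not measurements from the paper.

```python
# The sensing principle in miniature: excitation light surviving an absorbing
# filter of absorbance A scales as 10**(-A) (Beer-Lambert), so fluorescence
# reports analyte binding. Both absorbance values are invented placeholders.
def relative_fluorescence(a_filter):
    """Fraction of excitation light surviving a filter of absorbance A."""
    return 10 ** (-a_filter)

a_free = 0.10    # sensor absorbance at the excitation band, no analyte bound
a_bound = 0.80   # sensor absorbance with analyte bound

signal_change = relative_fluorescence(a_free) - relative_fluorescence(a_bound)
print(round(signal_change, 3))
```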

  12. A Six Sigma Trial For Reduction of Error Rates in Pathology Laboratory.

    PubMed

    Tosuner, Zeynep; Gücin, Zühal; Kiran, Tuğçe; Büyükpinarbaşili, Nur; Turna, Seval; Taşkiran, Olcay; Arici, Dilek Sema

    2016-01-01

    A major target of quality assurance is the minimization of error rates in order to enhance patient safety. Six Sigma is a method used in industry that targets zero error (3.4 errors per million events). The five main principles of Six Sigma are defining, measuring, analysing, improving and controlling. Using this methodology, the causes of errors can be examined and process improvement strategies can be identified. The aim of our study was to evaluate the utility of Six Sigma methodology in error reduction in our pathology laboratory. The errors encountered between April 2014 and April 2015 were recorded by the pathology personnel. Error follow-up forms were examined by the quality control supervisor, administrative supervisor and the head of the department. Using Six Sigma methodology, the rate of errors was measured monthly and the distribution of errors at the pre-analytic, analytic and post-analytic phases was analysed. Improvement strategies were reviewed in the monthly intradepartmental meetings and units with high error rates were monitored. Fifty-six (52.4%) of 107 recorded errors in total were at the pre-analytic phase. Forty-five errors (42%) were recorded as analytical and 6 errors (5.6%) as post-analytical. Two of the 45 errors were major irrevocable errors. The error rate was 6.8 per million in the first half of the year and 1.3 per million in the second half, decreasing by 79.77%. The Six Sigma trial in our pathology laboratory reduced error rates mainly in the pre-analytic and analytic phases.
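The Six Sigma bookkeeping in this record can be sketched with the reported phase counts. The phase tallies come from the abstract; the opportunity count (total events) is an assumed denominator purely to illustrate the defects-per-million-opportunities (DPMO) arithmetic, not the laboratory's actual workload.

```python
# Six Sigma bookkeeping for the error tallies reported above. Phase counts are
# from the record; total_events is an assumed denominator for illustration.
errors = {"pre-analytic": 56, "analytic": 45, "post-analytic": 6}
total_errors = sum(errors.values())
total_events = 1_000_000

# Defects per million opportunities (DPMO), the core Six Sigma metric.
dpmo = total_errors / total_events * 1_000_000

# Phase distribution, as tracked monthly in the study's measure step.
shares = {k: round(100 * v / total_errors, 1) for k, v in errors.items()}
print(dpmo, shares)
```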

  13. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data Analytics applications have made successful strides in the business world where co-analyzing extremely large sets of independent variables have proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of increasing number and volume of Earth science data has become more prevalent ushered by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals. Goals of which are very different than those in business, requiring different tools and techniques. A sampling of use cases have been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  14. Development of methodologies and procedures for identifying STS users and uses

    NASA Technical Reports Server (NTRS)

    Archer, J. L.; Beauchamp, N. A.; Macmichael, D. C.

    1974-01-01

    A study was conducted to identify new uses and users of the new Space Transportation System (STS) within the domestic government sector. The study develops a series of analytical techniques and well-defined functions structured as an integrated planning process to assure efficient and meaningful use of the STS. The purpose of the study is to provide NASA with the following functions: (1) to realize efficient and economic use of the STS and other NASA capabilities, (2) to identify new users and uses of the STS, (3) to contribute to organized planning activities for both current and future programs, and (4) to aid in analyzing uses of NASA's overall capabilities.

  15. Liquid Chromatography-Tandem Mass Spectrometry: An Emerging Technology in the Toxicology Laboratory.

    PubMed

    Zhang, Yan Victoria; Wei, Bin; Zhu, Yu; Zhang, Yanhua; Bluth, Martin H

    2016-12-01

    In the last decade, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has seen enormous growth in routine toxicology laboratories. LC-MS/MS offers significant advantages over other traditional testing, such as immunoassay and gas chromatography-mass spectrometry methodologies. Major strengths of LC-MS/MS include improvement in specificity, flexibility, and sample throughput when compared with other technologies. Here, the basic principles of LC-MS/MS technology are reviewed, followed by advantages and disadvantages of this technology compared with other traditional techniques. In addition, toxicology applications of LC-MS/MS for simultaneous detection of large panels of analytes are presented. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mensah, P.F.; Stubblefield, M.A.; Pang, S.S.

    Thermal characterization of a prepreg fabric used as the bonding material to join composite pipes has been modeled and solved using a finite difference modeling (FDM) numerical analysis technique for one-dimensional heat transfer through the material. Temperature distributions within the composite pipe joint are predicted. The prepreg material has temperature-dependent thermal properties. Thus the resulting boundary value equations are nonlinear and analytical solutions cannot be obtained. This characterization is pertinent in determining the temperature profile in the prepreg layer during the manufacturing process for optimization purposes. In addition, the temperature profile is needed in order to assess the effects of induced thermal stress in the joint. The methodology employed in this analysis compares favorably with data from experimentation.
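An explicit finite-difference scheme of the kind this record describes can be sketched as below for 1-D transient conduction with a temperature-dependent conductivity k(T). All material values and the k(T) form are invented placeholders, not the prepreg's actual properties.

```python
# Explicit finite-difference sketch of 1-D transient conduction with a
# temperature-dependent conductivity k(T). All values are invented placeholders.
def k(T):
    return 0.2 + 0.001 * T      # W/(m K), rising mildly with temperature

rho_cp = 1.5e6                  # volumetric heat capacity, J/(m^3 K)
L, n = 0.01, 21                 # 10 mm layer, 21 nodes
dx = L / (n - 1)
dt = 0.01                       # s; satisfies the explicit stability limit here
T = [25.0] * n                  # initial temperature, deg C
T_hot = 180.0                   # cure temperature applied at both faces

for _ in range(20_000):         # 200 s of simulated time
    T[0] = T[-1] = T_hot        # Dirichlet boundary conditions
    Tn = T[:]
    for i in range(1, n - 1):
        ke = 0.5 * (k(T[i]) + k(T[i + 1]))   # face conductivities by averaging
        kw = 0.5 * (k(T[i]) + k(T[i - 1]))
        Tn[i] = T[i] + dt / (rho_cp * dx * dx) * (
            ke * (T[i + 1] - T[i]) - kw * (T[i] - T[i - 1]))
    T = Tn

print(round(T[n // 2], 1))      # midplane temperature after 200 s
```

Evaluating k(T) inside the update loop is what makes the scheme handle the nonlinearity that rules out an analytical solution; the time step must stay below dx^2*rho_cp/(2*k_max) for stability.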

  17. Numerical Modeling of Pot-Hole Subsidence Due to Shallow Underground Coal Mining in Structurally Disturbed Ground

    NASA Astrophysics Data System (ADS)

    Lokhande, Ritesh D.; Murthy, V. M. S. R.; Singh, K. B.; Verma, Chandan Prasad; Verma, A. K.

    2018-04-01

    Stability analysis of underground mining is generally complex and difficult to carry out through analytical solutions, more so in the case of pot-hole subsidence prediction. Thus, application of a numerical modeling technique for simulating and finding a solution is preferred. This paper reports the development of a methodology for simulating pot-hole subsidence using FLAC3D. This study is restricted to geologically disturbed areas where the presence of faults was the dominating factor in the occurrence of pot-hole subsidence. The results demonstrate that the variation in the excavation geometry and the properties of immediate roof rocks play a vital role in the occurrence of pot-hole subsidence.

  18. Evidence for consciousness-related anomalies in random physical systems

    NASA Astrophysics Data System (ADS)

    Radin, Dean I.; Nelson, Roger D.

    1989-12-01

    Speculations about the role of consciousness in physical systems are frequently observed in the literature concerned with the interpretation of quantum mechanics. While only three experimental investigations can be found on this topic in physics journals, more than 800 relevant experiments have been reported in the literature of parapsychology. A well-defined body of empirical evidence from this domain was reviewed using meta-analytic techniques to assess methodological quality and overall effect size. Results showed effects conforming to chance expectation in control conditions and unequivocal non-chance effects in experimental conditions. This quantitative literature review agrees with the findings of two earlier reviews, suggesting the existence of some form of consciousness-related anomaly in random physical systems.

  19. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  20. New tools for investigating student learning in upper-division electrostatics

    NASA Astrophysics Data System (ADS)

    Wilcox, Bethany R.

    Student learning in upper-division physics courses is a growing area of research in the field of Physics Education. Developing effective new curricular materials and pedagogical techniques to improve student learning in upper-division courses requires knowledge of both what material students struggle with and what curricular approaches help to overcome these struggles. To facilitate the course transformation process for one specific content area --- upper-division electrostatics --- this thesis presents two new methodological tools: (1) an analytical framework designed to investigate students' struggles with the advanced physics content and mathematically sophisticated tools/techniques required at the junior and senior level, and (2) a new multiple-response conceptual assessment designed to measure student learning and assess the effectiveness of different curricular approaches. We first describe the development and theoretical grounding of a new analytical framework designed to characterize how students use mathematical tools and techniques during physics problem solving. We apply this framework to investigate student difficulties with three specific mathematical tools used in upper-division electrostatics: multivariable integration in the context of Coulomb's law, the Dirac delta function in the context of expressing volume charge densities, and separation of variables as a technique to solve Laplace's equation. We find a number of common themes in students' difficulties around these mathematical tools including: recognizing when a particular mathematical tool is appropriate for a given physics problem, mapping between the specific physical context and the formal mathematical structures, and reflecting spontaneously on the solution to a physics problem to gain physical insight or ensure consistency with expected results. 
We then describe the development of a novel, multiple-response version of an existing conceptual assessment in upper-division electrostatics courses. The goal of this new version is to provide an easily-graded electrostatics assessment that can potentially be implemented to investigate student learning on a large scale. We show that student performance on the new multiple-response version exhibits a significant degree of consistency with performance on the free-response version, and that it continues to provide significant insight into student reasoning and student difficulties. Moreover, we demonstrate that the new assessment is both valid and reliable using data from upper-division physics students at multiple institutions. Overall, the work described in this thesis represents a significant contribution to the methodological tools available to researchers and instructors interested in improving student learning at the upper-division level.

  1. Analytical aspects of plant metabolite profiling platforms: current standings and future aims.

    PubMed

    Seger, Christoph; Sturm, Sonja

    2007-02-01

    Over the past years, metabolic profiling has been established as a comprehensive systems biology tool. Mass spectrometry or NMR spectroscopy-based technology platforms combined with unsupervised or supervised multivariate statistical methodologies allow a deep insight into the complex metabolite patterns of plant-derived samples. Within this review, we provide a thorough introduction to the analytical hard- and software requirements of metabolic profiling platforms. Methodological limitations are addressed, and the metabolic profiling workflow is exemplified by summarizing recent applications ranging from model systems to more applied topics.

  2. Rating of Dynamic Coefficient for Simple Beam Bridge Design on High-Speed Railways

    NASA Astrophysics Data System (ADS)

    Diachenko, Leonid; Benin, Andrey; Smirnov, Vladimir; Diachenko, Anastasia

    2018-06-01

    The aim of the work is to improve the methodology for the dynamic computation of simple beam spans under the impact of high-speed trains. The research uses mathematical simulation combining numerical and analytical methods of structural mechanics. The article analyses the parameters governing the effect of high-speed trains on simple beam bridge spans and proposes a technique for determining the dynamic index applied to the live load. The reliability of the proposed methodology is confirmed by numerical simulation of high-speed trains passing over spans at different speeds. The proposed algorithm for dynamic computation is based on the connection between the maximum acceleration of the span in the resonant mode of vibration and the main factors of the stress-strain state. The methodology yields both maximum and minimum values of the main internal forces in the structure, which makes endurance checks possible. Notably, the dynamic increments of the different components of the stress-strain state (bending moments, shear force and vertical deflections) differ; this calls for a differentiated approach to the dynamic coefficients when performing design verification for limit states of groups I and II. Practical importance: the methodology for determining the dynamic coefficients allows the dynamic calculation and the main internal forces in simple beam spans to be obtained without numerical simulation and direct dynamic analysis, which significantly reduces design labour costs.
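
The resonance condition underlying this record can be illustrated with a short sketch. It uses the conventional relation v_res = d·n0/i for the speeds at which a regular axle spacing d excites a span's first natural frequency n0 (the EN 1991-2 style criterion); the span parameters below are hypothetical, not taken from the paper:

```python
# Resonant speeds at which a regular axle spacing d (m) excites a span's
# first natural frequency n0 (Hz): v_res = d * n0 / i for i = 1, 2, 3, ...
def resonant_speeds(d_m, n0_hz, harmonics=4):
    """Return resonant train speeds in km/h for harmonics i = 1..harmonics."""
    return [d_m * n0_hz / i * 3.6 for i in range(1, harmonics + 1)]

# Hypothetical 25 m coach length and a 4 Hz simply supported span:
speeds = resonant_speeds(25.0, 4.0)
print([round(v, 1) for v in speeds])  # km/h, first harmonic is the fastest
```

Only resonant speeds inside the design speed range matter in practice; higher harmonics fall at progressively lower speeds.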

  3. At-Line Cellular Screening Methodology for Bioactives in Mixtures Targeting the α7-Nicotinic Acetylcholine Receptor.

    PubMed

    Otvos, Reka A; Mladic, Marija; Arias-Alpizar, Gabriela; Niessen, Wilfried M A; Somsen, Govert W; Smit, August B; Kool, Jeroen

    2016-06-01

    The α7-nicotinic acetylcholine receptor (α7-nAChR) is a ligand-gated ion channel expressed in different regions of the central nervous system (CNS). The α7-nAChR has been associated with Alzheimer's disease, epilepsy, and schizophrenia, and is therefore extensively studied as a drug target for the treatment of these diseases. Natural extracts are important sources of new compounds in drug discovery. Since natural extracts are complex mixtures, identification of the bioactives demands analytical techniques that separate a bioactive from inactive compounds. This study describes a screening methodology for identifying bioactive compounds in mixtures acting on the α7-nAChR. The methodology developed combines liquid chromatography (LC), coupled via a split, with both an at-line calcium (Ca(2+))-flux assay and high-resolution mass spectrometry (MS). This allows evaluation of α7-nAChR responses after LC separation, while parallel MS enables compound identification. The methodology was optimized for the analysis of agonists and positive allosteric modulators, and was successfully applied to screening of the hallucinogenic mushroom Psilocybe mckennaii. The crude mushroom extract was analyzed using both reversed-phase and hydrophilic interaction liquid chromatography. Matching the retention times and peak shapes of the bioactives with data from the parallel MS measurements allowed rapid pinpointing of accurate masses corresponding to the bioactives. © 2016 Society for Laboratory Automation and Screening.

  4. The current preference for the immuno-analytical ELISA method for quantitation of steroid hormones (endocrine disruptor compounds) in wastewater in South Africa.

    PubMed

    Manickum, Thavrin; John, Wilson

    2015-07-01

    The availability of national test centers offering a routine service for the analysis and quantitation of selected steroid hormones [natural estrogens (17-β-estradiol, E2; estrone, E1; estriol, E3), synthetic estrogen (17-α-ethinylestradiol, EE2), androgen (testosterone), and progestogen (progesterone)] in wastewater matrix was investigated; the corresponding internationally used chemical- and immuno-analytical test methods were reviewed. The enzyme-linked immunosorbent assay (ELISA, an immuno-analytical technique) was also assessed for its suitability as a routine test method to quantitate the levels of these hormones at a sewage/wastewater treatment plant (WTP) (Darvill, Pietermaritzburg, South Africa) over a 2-year period. The method performance and other relevant characteristics of the immuno-analytical ELISA method were compared with conventional chemical-analytical methodology, such as gas/liquid chromatography-mass spectrometry (GC/LC-MS) and GC/LC-tandem mass spectrometry (MS/MS), for quantitation of the steroid hormones in wastewater and environmental waters. The national immuno-analytical ELISA technique was found to be sensitive (LOQ 5 ng/L, LOD 0.2-5 ng/L), accurate (mean recovery 96%), precise (RSD 7-10%), and cost-effective for screening and quantitation of these steroid hormones in wastewater and environmental water matrices. A survey of the most current international literature indicates fairly equal use of the LC-MS/MS, GC-MS/MS (chemical-analytical), and ELISA (immuno-analytical) test methods for screening and quantitation of the target steroid hormones in both water and wastewater. Internationally, the observed sensitivity, based on LOQ (ng/L) for the steroid estrogens E1, E2, and EE2, is, in decreasing order: LC-MS/MS (0.08-9.54) > GC-MS (1) > ELISA (5) (chemical-analytical > immuno-analytical). 
At the national level, the routine, unoptimized chemical-analytical LC-MS/MS method was found to lack the sensitivity required to meet environmental requirements for steroid hormone quantitation. Further optimization of the sensitivity of the chemical-analytical LC-tandem mass spectrometry methods, especially for wastewater screening, is required in South Africa. Risk assessment studies showed that it was not practical to propose standards or allowable limits for the steroid estrogens E1, E2, EE2, and E3; the use of predicted-no-effect concentrations of the steroid estrogens appears appropriate for their risk assessment in relation to aquatic organisms. For raw water sources, drinking water, and raw and treated wastewater, the use of bioassays with trigger values is a useful screening option for deciding whether further examination of specific endocrine activity is warranted, or whether concentrations of such activity are of low priority with respect to human health concerns. Improved quantitation limits for immuno-analytical methods such as ELISA, and standardization of the method for measuring E2 equivalents (EEQs) used to express biological (e.g., estrogenic) activity, are areas for future EDC research.
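
The estradiol equivalents (EEQs) mentioned above are conventionally computed by weighting each measured estrogen concentration by its estrogenic potency relative to E2. A minimal sketch of that convention follows; the potency factors and concentrations are illustrative only, since real estrogenicity equivalence factors depend on the bioassay used:

```python
# Estradiol-equivalent concentration: EEQ = sum(c_i * EEF_i), where EEF_i is
# the estrogenic potency of compound i relative to 17-beta-estradiol (E2).
def eeq(concentrations_ng_l, potency_factors):
    """Both arguments are dicts keyed by compound name; returns EEQ in ng/L."""
    return sum(c * potency_factors[name] for name, c in concentrations_ng_l.items())

# Illustrative (hypothetical) values only:
eefs = {"E2": 1.0, "E1": 0.2, "EE2": 1.2}
sample = {"E2": 2.0, "E1": 5.0, "EE2": 0.5}
print(eeq(sample, eefs))  # 2.0*1.0 + 5.0*0.2 + 0.5*1.2 = 3.6 ng/L
```

Standardizing the EEF values used in this sum is precisely the harmonization problem the record identifies for future research.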

  5. Thermal and Chemical Characterization of Non-Metallic Materials Using Coupled Thermogravimetric Analysis and Infrared Spectroscopy

    NASA Technical Reports Server (NTRS)

    Huff, Timothy L.

    2002-01-01

    Thermogravimetric analysis (TGA) is widely employed in the thermal characterization of non-metallic materials, yielding valuable information on the decomposition characteristics of a sample over a wide temperature range. However, a potential wealth of chemical information is lost in the process, as the gases evolved during thermal decomposition escape through the exhaust line. Fourier transform infrared (FT-IR) spectroscopy is a powerful analytical technique for identifying chemical constituents in any material state; in this application, the gas phase. By linking the two techniques, gases evolved during the TGA run are directed into an appropriately equipped infrared spectrometer for chemical speciation. Consequently, both thermal decomposition and chemical characterization of a material may be obtained in a single sample run. In practice, a heated transfer line connects the two instruments while a purge gas stream carries the evolved gases into the FT-IR. The purge gas can be either high-purity air or an inert gas such as nitrogen, allowing oxidative and pyrolytic processes, respectively, to be examined. The FT-IR data are collected in real time, allowing continuous monitoring of chemical compositional changes over the course of thermal decomposition. Using this coupled technique, an array of diverse materials has been examined, including composites, plastics, rubber, fiberglass epoxy resins, polycarbonates, silicones, lubricants and fluorocarbon materials. The benefit of combining these two methodologies is of particular importance in the aerospace community, where newly developed materials have little reference data available. By providing both thermal and chemical data simultaneously, a more definitive and comprehensive characterization of the material is possible. 
Additionally, this procedure has been found to be a viable screening technique for certain materials, with the generated data useful in the selection of other appropriate analytical procedures for further material characterization.

  6. Ozone Modulation/Membrane Introduction Mass Spectrometry for Analysis of Hydrocarbon Pollutants in Air

    NASA Astrophysics Data System (ADS)

    Atkinson, D. B.

    2001-12-01

    Modulation of volatile hydrocarbons in two-component mixtures is demonstrated using an ozonolysis pretreatment with membrane introduction mass spectrometry (MIMS). The MIMS technique allows selective introduction of volatile and semivolatile analytes into a mass spectrometer via processes known collectively as pervaporation [Kotiaho and Cooks, 1992]. A semipermeable polymer membrane acts as an interface between the sample (vapor or solution) and the vacuum of the mass spectrometer. The technique has been shown to allow sensitive analysis of hydrocarbons and other non-polar volatile organic compounds (VOCs) in air samples [Cisper et al., 1995]. The methodology has the advantages of no sample pretreatment and short analysis time, which are promising for online monitoring applications, but the chief disadvantage of lacking a separation step for the different analytes in a mixture. Several approaches have been investigated to overcome this problem, including selective chemical ionization [Bier and Cooks, 1987] and multivariate calibration techniques [Ketola et al., 1999]. A new approach is reported for the quantitative measurement of VOCs in complex matrices. The method seeks to reduce the complexity of the mass spectra observed in hydrocarbon mixture analysis by selective pretreatment of the analyte mixture. In the current investigation, the rapid reaction of ozone with alkenes is exploited: the resulting oxygenated compounds are suppressed by the MIMS system. This has the effect of removing signals due to unsaturated analytes from the composite mass spectra, and comparison of the spectra before and after the ozone treatment reveals the nature of the parent compounds. In preliminary investigations, ozone reacted completely with cyclohexene in a mixture of cyclohexene and cyclohexane, and with β-pinene in a mixture of toluene and β-pinene, suppressing the ion signals from the olefins. 
A slight attenuation of the cyclohexane and toluene in those mixtures was also observed. Despite this problem, the hydrocarbon signal response can be calibrated and the method can be used for quantitative analysis of volatile hydrocarbon compounds in air samples. This methodology should augment the efficiency of the MIMS approach in online and onsite monitoring of VOC emissions. Bier, M.R., and R.G. Cooks, Membrane Interface for Selective Introduction of Volatile Compounds Directly into The Ionization Chamber of a Mass Spectrometer, Anal. Chem., 59 (4), 597, 1987. Cisper, M.E., C.G. Gill, L.E. Townsend, and P.H. Hemberger, On-Line Detection of Volatile Organic Compounds in Air at Parts-per-Trillion Levels by Membrane Introduction Mass Spectrometry, Anal. Chem., 67 (8), 1413-1417, 1995. Ketola, R.A., M. Ojala, and J. Heikkonen, A Non-linear Asymmetric Error Function-based Least Mean Square Approach for the Analysis of Multicomponent Mass Spectra Measured by Membrane Inlet Mass Spectrometry, Rapid Commun. Mass Spectrom., 13 (8), 654, 1999. Kotiaho, T., and R.G. Cooks, Membrane Introduction Mass Spectrometry in Environmental Analysis, in: J.J. Breen, M. J. Dellarco, (Eds), Pollution in Industrial processes, 126 pp., ACS Symp. Ser., Washington, D.C. 508, 1992.
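
The before/after comparison described in this record amounts to a difference of mass spectra: ions whose intensity collapses after ozonolysis are attributed to unsaturated parents. A minimal sketch of that bookkeeping, with hypothetical m/z intensities (not data from the study):

```python
# Compare MIMS spectra recorded before and after ozone pretreatment: ions that
# largely disappear after ozonolysis are attributed to unsaturated analytes.
def ozone_difference(before, after, drop_fraction=0.8):
    """Return m/z values whose intensity drops by >= drop_fraction after ozone."""
    flagged = []
    for mz, i0 in before.items():
        i1 = after.get(mz, 0.0)
        if i0 > 0 and (i0 - i1) / i0 >= drop_fraction:
            flagged.append(mz)
    return sorted(flagged)

# Hypothetical spectra of a cyclohexane/cyclohexene mixture:
pre = {56: 40.0, 67: 85.0, 82: 60.0, 84: 95.0}   # 67 and 82 ~ cyclohexene ions
post = {56: 38.0, 67: 5.0, 82: 4.0, 84: 90.0}    # olefin signals suppressed
print(ozone_difference(pre, post))  # → [67, 82]
```

The slight attenuation of saturated analytes noted in the abstract would appear here as small, sub-threshold intensity drops that the calibration step must absorb.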

  7. Incorporating Information Literacy Skills into Analytical Chemistry: An Evolutionary Step

    ERIC Educational Resources Information Center

    Walczak, Mary M.; Jackson, Paul T.

    2007-01-01

    The American Chemical Society (ACS) has recently moved to incorporate various information literacy skills into the teaching of analytical chemistry. The methodology has been found to be highly effective, giving students a better understanding of the subject.

  8. A Modern Approach to College Analytical Chemistry.

    ERIC Educational Resources Information Center

    Neman, R. L.

    1983-01-01

    Describes a course which emphasizes all facets of analytical chemistry, including sampling, preparation, interference removal, selection of methodology, measurement of a property, and calculation/interpretation of results. Includes special course features (such as cooperative agreement with an environmental protection center) and course…

  9. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works use different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques for PM characterization described in the literature, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. This work summarizes the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques, such as atomic spectrometry based and X-ray based techniques, organic and carbonaceous analysis techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among techniques is a function of a number of parameters, such as the relevant physical properties of the particles, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Using design of experiments to optimize derivatization with methyl chloroformate for quantitative analysis of the aqueous phase from hydrothermal liquefaction of biomass.

    PubMed

    Madsen, René Bjerregaard; Jensen, Mads Mørk; Mørup, Anders Juul; Houlberg, Kasper; Christensen, Per Sigaard; Klemmer, Maika; Becker, Jacob; Iversen, Bo Brummerstedt; Glasius, Marianne

    2016-03-01

    Hydrothermal liquefaction is a promising technique for the production of bio-oil. The process produces an oil phase, a gas phase, a solid residue, and an aqueous phase. Gas chromatography coupled with mass spectrometry is used to analyze the complex aqueous phase; small organic acids and nitrogen-containing compounds are of particular interest. The efficient derivatization reagent methyl chloroformate was used to make analysis of the complex aqueous phase from hydrothermal liquefaction of dried distillers grains with solubles possible. A circumscribed central composite design was used to optimize the responses of both derivatized and nonderivatized analytes, which included small organic acids, pyrazines, phenol, and cyclic ketones. Response surface methodology was used to visualize significant factors and identify optimized derivatization conditions (volumes of methyl chloroformate, NaOH solution, methanol, and pyridine). Twenty-nine analytes comprising small organic acids, pyrazines, phenol, and cyclic ketones were quantified; an additional three analytes were pseudoquantified using standards with similar mass spectra. Calibration curves with high correlation coefficients were obtained, in most cases R² > 0.991. Method validation covered repeatability, and spike recoveries were obtained for all 29 analytes. The 32 analytes were quantified in samples from the commissioning of a continuous flow reactor and in samples from experiments recirculating the aqueous phase. The results indicated when the steady-state condition of the flow reactor was reached, as well as the effects of recirculation. The validated method will be especially useful for investigating the effect of small organic acids on the hydrothermal liquefaction process.
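
Response surface methodology, as used in this record, fits a second-order polynomial to the responses measured at the designed points and reads the optimum off the fitted surface. A minimal two-factor sketch on synthetic, noise-free data (not the paper's derivatization data):

```python
import numpy as np

# Fit a two-factor quadratic response surface
#   y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
# by ordinary least squares over a central-composite-style design.
def fit_quadratic_surface(x1, x2, y):
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Circumscribed CCD points in coded units: 4 factorial, 4 axial, 3 center.
x1 = np.array([-1.0, -1.0, 1.0, 1.0, -1.414, 1.414, 0.0, 0.0, 0.0, 0.0, 0.0])
x2 = np.array([-1.0, 1.0, -1.0, 1.0, 0.0, 0.0, -1.414, 1.414, 0.0, 0.0, 0.0])
y = 10 + 2 * x1 - 1.5 * x2 - 0.8 * x1**2 - 0.5 * x2**2 + 0.3 * x1 * x2

coef = fit_quadratic_surface(x1, x2, y)
print(np.round(coef, 3))  # recovers [10, 2, -1.5, -0.8, -0.5, 0.3]
```

With real data the fitted coefficients carry noise, and significance of each term is judged before locating the stationary point of the surface.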

  11. Predicting Malignant and Paramalignant Pleural Effusions by Combining Clinical, Radiological and Pleural Fluid Analytical Parameters.

    PubMed

    Herrera Lara, Susana; Fernández-Fabrellas, Estrella; Juan Samper, Gustavo; Marco Buades, Josefa; Andreu Lapiedra, Rafael; Pinilla Moreno, Amparo; Morales Suárez-Varela, María

    2017-10-01

    The usefulness of clinical, radiological and pleural fluid analytical parameters for diagnosing malignant and paramalignant pleural effusion has not been clearly established. This study therefore aimed to identify possible predictor variables for diagnosing malignancy in pleural effusion of unknown aetiology. Clinical, radiological and pleural fluid analytical parameters were obtained from consecutive patients with pleural effusion of unknown aetiology, who were classified into three groups according to their final diagnosis: malignant, paramalignant and benign pleural effusion. The CHAID (chi-square automatic interaction detector) methodology was used to estimate, through decision trees, the implication of the clinical, radiological and analytical variables in daily practice. Among the 71 patients (malignant, n = 31; paramalignant, n = 15; benign, n = 25), smoking habit, dyspnoea, weight loss, radiological characteristics (mass, node, adenopathies and pleural thickening) and pleural fluid analytical parameters (pH and glucose) distinguished malignant and paramalignant pleural effusions (all p < 0.05). Decision tree 1 classified 77.8% of malignant and paramalignant pleural effusions in step 2. Decision tree 2 classified 83.3% of malignant pleural effusions in step 2, 73.3% of paramalignant pleural effusions and 91.7% of benign ones. These data suggest that the identified predictor values, applied in tree diagrams requiring no extraordinary measures, correctly identify malignant, paramalignant and benign effusions at a higher rate than currently available techniques and are useful in routine clinical practice. Future studies are still needed to further improve the classification of patients.
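
CHAID grows its decision trees by choosing, at each node, the predictor whose cross-tabulation with the outcome gives the most significant chi-square statistic. The sketch below shows only that core split-selection idea on hypothetical data; full CHAID additionally merges predictor categories and applies Bonferroni-adjusted p-values, which are omitted here:

```python
from collections import Counter

# Pearson chi-square statistic of a categorical predictor vs. a class label,
# the core split-selection criterion in CHAID (significance testing omitted).
def chi_square(xs, ys):
    n = len(xs)
    row, col, cell = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    stat = 0.0
    for x, rx in row.items():
        for y, cy in col.items():
            expected = rx * cy / n
            observed = cell.get((x, y), 0)
            stat += (observed - expected) ** 2 / expected
    return stat

def best_split(predictors, ys):
    """Pick the predictor column with the largest chi-square vs. the label."""
    return max(predictors, key=lambda name: chi_square(predictors[name], ys))

# Hypothetical screening data: 'mass' tracks the label perfectly, 'cough' not.
label = ["malig", "malig", "malig", "benign", "benign", "benign"]
cols = {"mass":  ["yes", "yes", "yes", "no", "no", "no"],
        "cough": ["yes", "no", "yes", "yes", "no", "yes"]}
print(best_split(cols, label))  # → mass
```

In the study's setting the candidate predictors would be variables such as smoking habit, weight loss, or pleural fluid pH, and the winning split becomes the next branch of the tree.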

  12. Green approach using monolithic column for simultaneous determination of coformulated drugs.

    PubMed

    Yehia, Ali M; Mohamed, Heba M

    2016-06-01

    Green chemistry and sustainability are now embraced across the majority of pharmaceutical companies and research labs. Researchers' attention has been drawn toward implementing the principles of green analytical chemistry in more eco-friendly analytical methodologies. Solvents play a dominant role in determining the greenness of an analytical procedure; by using safer solvents, the greenness profile of a methodology can be improved remarkably. In this context, a green chromatographic method has been developed and validated for the simultaneous determination of phenylephrine, paracetamol, and guaifenesin in their ternary pharmaceutical mixture. The chromatographic separation was carried out using a monolithic column with green solvents as the mobile phase. The monolithic column allows efficient separation at higher flow rates, resulting in short analysis times. A two-factor, three-level experimental design was used to optimize the chromatographic conditions. The greenness profile of the proposed methodology was assessed using the eco-scale as a green metric, and the method was found to be an excellent green method with regard to the usage and production of hazardous chemicals and solvents, energy consumption, and amount of waste produced. The proposed method improved the environmental impact without compromising the analytical performance criteria and could be used as a safer alternative for the routine analysis of the studied drugs. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
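
The eco-scale metric used in this record scores a method as 100 minus penalty points accumulated for reagents, instrument energy, occupational hazard, and waste, with scores above 75 conventionally rated "excellent green analysis". A minimal sketch of that arithmetic; the penalty values below are illustrative, not the paper's:

```python
# Analytical Eco-Scale: start from an ideal score of 100 and subtract penalty
# points for reagents, instrument energy, occupational hazards and waste.
def eco_scale(penalties):
    score = 100 - sum(penalties.values())
    if score > 75:
        verdict = "excellent green analysis"
    elif score > 50:
        verdict = "acceptable green analysis"
    else:
        verdict = "inadequate green analysis"
    return score, verdict

# Illustrative penalty points for a hypothetical HPLC method:
penalties = {"solvents": 8, "energy": 2, "occupational hazard": 0, "waste": 5}
print(eco_scale(penalties))  # → (85, 'excellent green analysis')
```

Because solvent penalties dominate for most chromatographic methods, swapping hazardous mobile-phase components directly raises the score, which is the paper's central argument for green solvents on a monolithic column.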

  13. Trends in analytical methodologies for the determination of alkylphenols and bisphenol A in water samples.

    PubMed

    Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D

    2017-04-15

    In the last decade, the impact of alkylphenols and bisphenol A on the aquatic environment has been widely evaluated because of their extensive use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs), which can affect the hormonal systems of humans and wildlife even at low concentrations. Because these pollutants enter the environment through water, the most affected compartment, analytical methods that allow their determination in aqueous samples at low levels are mandatory. This review considers the most significant advances in analytical methodologies for the determination of alkylphenols and bisphenol A in waters (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and to approaches proposed to reduce time and reagent consumption in accordance with green chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, with wastewater and surface water the most investigated. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Assessment of analytical quality in Nordic clinical chemistry laboratories using data from contemporary national programs.

    PubMed

    Aronsson, T; Bjørnstad, P; Leskinen, E; Uldall, A; de Verdier, C H

    1984-01-01

    The aim of this investigation was primarily to assess analytical quality expressed as between-laboratory, within-laboratory, and total imprecision, not in order to detect laboratories with poor performance, but in the positive sense to provide data for improving critical steps in analytical methodology. The aim was also to establish the present state of the art in comparison with earlier investigations to see if improvement in analytical quality could be observed.
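
Within-laboratory, between-laboratory, and total imprecision of the kind assessed in this record are conventionally estimated as variance components from a one-way analysis of variance over replicate results. A minimal sketch on synthetic, balanced data (equal replicates per laboratory assumed; the values are hypothetical):

```python
from statistics import mean

# One-way ANOVA variance components: each laboratory reports r replicates.
def imprecision(results_by_lab):
    labs = list(results_by_lab.values())
    k, r = len(labs), len(labs[0])            # k labs, r replicates each
    lab_means = [mean(lab) for lab in labs]
    grand = mean(lab_means)
    # Within-lab variance: replicate scatter pooled about each lab's mean.
    s2_within = sum((x - m) ** 2
                    for lab, m in zip(labs, lab_means)
                    for x in lab) / (k * (r - 1))
    # Between-lab component from the mean square of the lab means.
    ms_between = r * sum((m - grand) ** 2 for m in lab_means) / (k - 1)
    s2_between = max((ms_between - s2_within) / r, 0.0)
    return s2_within, s2_between, s2_within + s2_between

data = {"lab A": [10.0, 10.2, 9.8],
        "lab B": [10.5, 10.7, 10.6],
        "lab C": [9.9, 10.1, 10.0]}
s2w, s2b, s2t = imprecision(data)
print(round(s2w, 4), round(s2b, 4), round(s2t, 4))
```

Reporting the square roots of these components as standard deviations (or CVs) gives the within-, between-, and total-laboratory imprecision figures used in external quality assessment.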

  15. Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.

    PubMed

    Grdinić, Vladimir; Vuković, Jadranka

    2004-05-28

    A complete prevalidation, as a basic prevalidation strategy for the quality control and standardization of analytical procedures, is presented. The fast and simple prevalidation methodology, based on mathematical/statistical evaluation of a reduced number of experiments (N ≤ 24), was elaborated, and guidelines as well as algorithms are given in detail. The strategy was produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which a linear calibration model, which occurs very often in practice, is the most appropriate fit to the experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of an analytical procedure. The complete prevalidation process comprises characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values and extraction of prevalidation parameters. Moreover, a system of diagnosis for each prevalidation step is suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from the great number of analytical procedures available. The favourable metrological characteristics of this procedure, obtained as prevalidation figures of merit, confirm prevalidation as a valuable concept for the preliminary evaluation of the quality of analytical procedures.
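
For the linear calibration model central to this record, typical figures of merit are the slope, intercept, residual standard deviation, and detection/quantitation limits. A minimal sketch on synthetic absorbance-vs-concentration data (hypothetical numbers, not the tannin data; the LOD/LOQ formulas are the common ICH Q2-style 3.3·s/slope and 10·s/slope estimates, not necessarily the paper's):

```python
from statistics import mean

# Least-squares calibration line y = a + b*x with residual standard deviation,
# plus ICH-style detection and quantitation limits (3.3*s/b and 10*s/b).
def calibration(x, y):
    n = len(x)
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                              # slope
    a = my - b * mx                            # intercept
    residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    s = (sum(r * r for r in residuals) / (n - 2)) ** 0.5
    return {"slope": b, "intercept": a, "s_res": s,
            "LOD": 3.3 * s / b, "LOQ": 10 * s / b}

# Hypothetical calibration: absorbance vs. concentration (mg/L).
conc = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
absorb = [0.002, 0.101, 0.198, 0.301, 0.399, 0.502]
cal = calibration(conc, absorb)
print(round(cal["slope"], 4), round(cal["intercept"], 4))
```

Outlier recognition and homogeneity testing, as described in the abstract, would operate on these same residuals before the figures of merit are accepted.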

  16. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selecting appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose using semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.

  18. Synthesis of qualitative linguistic research--a pilot review integrating and generalizing findings on doctor-patient interaction.

    PubMed

    Nowak, Peter

    2011-03-01

    There is a broad range of qualitative linguistic research (sequential analysis) on doctor-patient interaction that has had only a marginal impact on clinical research and practice. At least in part this is due to the lack of qualitative research synthesis in the field, as the available research summaries are not systematic in their methodology. This paper proposes a synthesis methodology for qualitative, sequential-analytic research on doctor-patient interaction. The presented methodology is not new, but specifies the standard methodology of qualitative research synthesis for sequential-analytic research. The pilot review synthesizes twelve studies on German-speaking doctor-patient interactions, identifies 45 verbal actions of doctors and organizes them into a systematics of eight interaction components. Three interaction components ("Listening", "Asking for information", and "Giving information") appear central, covering two thirds of the identified action types. The pilot review demonstrates that sequential-analytic research can be synthesized in a consistent and meaningful way, thus providing a more comprehensive and unbiased integration of research. Future synthesis of qualitative research in the area of health communication is much needed; qualitative research synthesis can support the development of quantitative research and of educational materials for medical and patient training. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  19. The Literature Review of Analytical Support to Defence Transformation: Lessons Learned from Turkish Air Force Transformation Activities

    DTIC Science & Technology

    2010-04-01

    available [11]. Additionally, Table 3 is a guide for the DMAIC methodology, including 29 different methods [12] (RTO-MP-SAS-081). ...Table 3: DMAIC Methodology (5-phase methodology): Define, Measure, Analyze, Improve, Control; tools include the Project Charter, Prioritization Matrix, and 5 Whys Analysis... Methodology scope [13]: DMAIC, PDCA. Develop performance priorities: this is a preliminary stage that precedes specific improvement projects, and the aim

  20. QUESP and QUEST revisited - fast and accurate quantitative CEST experiments.

    PubMed

    Zaiss, Moritz; Angelovski, Goran; Demetriou, Eleni; McMahon, Michael T; Golay, Xavier; Scheffler, Klaus

    2018-03-01

    Chemical exchange saturation transfer (CEST) NMR or MRI experiments allow detection of low concentrated molecules with enhanced sensitivity via their proton exchange with the abundant water pool. Be it endogenous metabolites or exogenous contrast agents, an exact quantification of the actual exchange rate is required to design optimal pulse sequences and/or specific sensitive agents. Refined analytical expressions allow deeper insight and improvement of accuracy for common quantification techniques. The accuracy of standard quantification methodologies, such as quantification of exchange rate using varying saturation power or varying saturation time, is improved especially for the case of nonequilibrium initial conditions and weak labeling conditions, meaning the saturation amplitude is smaller than the exchange rate (γB₁ < k). The improved analytical 'quantification of exchange rate using varying saturation power/time' (QUESP/QUEST) equations allow for more accurate exchange rate determination, and provide clear insights on the general principles to execute the experiments and to perform numerical evaluation. The proposed methodology was evaluated on the large-shift regime of paramagnetic chemical-exchange-saturation-transfer agents using simulated data and data of the paramagnetic Eu(III) complex of DOTA-tetraglycineamide. The refined formulas yield improved exchange rate estimation. General convergence intervals of the methods that would apply for smaller shift agents are also discussed. Magn Reson Med 79:1708-1721, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
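    For intuition, the classic linearized QUESP analysis (the "omega plot", in which 1/MTR is linear in 1/ω₁²) can be sketched in a few lines. This uses a simplified steady-state CEST model, not the refined expressions derived in the paper; all parameter values are illustrative.

```python
import math

def omega_plot_k(w1_vals, mtr_vals):
    """Exchange rate from the linearized QUESP ("omega plot"):
    1/MTR = R1/(f*k) + (R1*k/f) * (1/w1^2), so k = sqrt(slope/intercept)."""
    xs = [1.0 / (w * w) for w in w1_vals]
    ys = [1.0 / m for m in mtr_vals]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return math.sqrt(slope / intercept)

# Synthetic data from the simplified steady-state model,
# with k = 500 /s, proton fraction f = 1e-3, R1 = 1 /s.
k, f, R1 = 500.0, 1e-3, 1.0
w1_vals = [100.0, 200.0, 400.0, 800.0]        # saturation amplitudes (rad/s)
mtr_vals = [(f * k / R1) * w * w / (w * w + k * k) for w in w1_vals]
print(round(omega_plot_k(w1_vals, mtr_vals), 1))  # → 500.0
```

    On noise-free data the fit recovers the exchange rate exactly; with real measurements, the weak-labeling regime (γB₁ < k) is where the refined QUESP/QUEST corrections matter most.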

  1. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water industries are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is a promising disinfectant usually used as a secondary disinfectant, and the selection of the proper monitoring analytical technique to ensure disinfection and regulatory compliance has been debated within the industry. This research objectively compared the performance of commercially available analytical techniques used for chlorine dioxide measurement (namely chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) to determine the superior technique. The techniques were evaluated over a wide range of chlorine dioxide concentrations and, against pre-defined criteria, the superior technique was identified. To discern its effectiveness, various factors that might influence performance, such as sample temperature, high ionic strength and other interferences, were examined. Among the four techniques, chronoamperometry demonstrated a significant level of accuracy and precision. Furthermore, the influencing factors studied did not diminish its performance; it remained adequate in all matrices. This study is a step towards proper disinfection monitoring, and it can confidently assist engineers with chlorine dioxide disinfection system planning and management.

  2. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) and a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three techniques. Statistical inter-method correlation was assessed using non-parametric rank correlation tests. The study's economic component combined calculations of equipment depreciation with the estimated cost of an analytical quality control (AQC) unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between each of the two spectroscopic techniques and HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Raman spectroscopy thus appears superior on technical, economic and environmental grounds to the invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Passenger rail vehicle safety assessment methodology. Volume II, Detailed analyses and simulation results.

    DOT National Transportation Integrated Search

    2000-04-01

    This report presents detailed analytic tools and results on dynamic response which are used to develop the safe dynamic performance limits of commuter passenger vehicles. The methodology consists of determining the critical parameters and characteris...

  4. Xpey' Relational Environments: an analytic framework for conceptualizing Indigenous health equity.

    PubMed

    Kent, Alexandra; Loppie, Charlotte; Carriere, Jeannine; MacDonald, Marjorie; Pauly, Bernie

    2017-12-01

    Both health equity research and Indigenous health research are driven by the goal of promoting equitable health outcomes among marginalized and underserved populations. However, the two fields often operate independently, without collaboration. As a result, Indigenous populations are underrepresented in health equity research relative to the disproportionate burden of health inequities they experience. In this methodological article, we present Xpey' Relational Environments, an analytic framework that maps some of the barriers and facilitators to health equity for Indigenous peoples. Health equity research needs to include a focus on Indigenous populations and Indigenized methodologies, a shift that could fill gaps in knowledge with the potential to contribute to 'closing the gap' in Indigenous health. With this in mind, the Equity Lens in Public Health (ELPH) research program adopted the Xpey' Relational Environments framework to add a focus on Indigenous populations to our research on the prioritization and implementation of health equity. The analytic framework introduced an Indigenized health equity lens to our methodology, which facilitated the identification of social, structural and systemic determinants of Indigenous health. To test the framework, we conducted a pilot case study of one of British Columbia's regional health authorities, which included a review of core policies and plans as well as interviews and focus groups with frontline staff, managers and senior executives. ELPH's application of Xpey' Relational Environments serves as an example of the analytic framework's utility for exploring and conceptualizing Indigenous health equity in BC's public health system. Future applications of the framework should be embedded in Indigenous research methodologies.

  5. Surrogate matrix and surrogate analyte approaches for definitive quantitation of endogenous biomolecules.

    PubMed

    Jones, Barry R; Schultz, Gary A; Eckstein, James A; Ackermann, Bradley L

    2012-10-01

    Quantitation of biomarkers by LC-MS/MS is complicated by the presence of endogenous analytes. This challenge is most commonly overcome by calibration using an authentic standard spiked into a surrogate matrix devoid of the target analyte. A second approach involves use of a stable-isotope-labeled standard as a surrogate analyte to allow calibration in the actual biological matrix. For both methods, parallelism between calibration standards and the target analyte in biological matrix must be demonstrated in order to ensure accurate quantitation. In this communication, the surrogate matrix and surrogate analyte approaches are compared for the analysis of five amino acids in human plasma: alanine, valine, methionine, leucine and isoleucine. In addition, methodology based on standard addition is introduced, which enables a robust examination of parallelism in both surrogate analyte and surrogate matrix methods prior to formal validation. Results from additional assays are presented to introduce the standard-addition methodology and to highlight the strengths and weaknesses of each approach. For the analysis of amino acids in human plasma, comparable precision and accuracy were obtained by the surrogate matrix and surrogate analyte methods. Both assays were well within tolerances prescribed by regulatory guidance for validation of xenobiotic assays. When stable-isotope-labeled standards are readily available, the surrogate analyte approach allows for facile method development. By comparison, the surrogate matrix method requires greater up-front method development; however, this deficit is offset by the long-term advantage of simplified sample analysis.
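    The standard-addition methodology introduced above reduces to a simple linear extrapolation: spike known amounts of standard into the biological matrix, fit the response line, and read the endogenous concentration off the x-intercept. A minimal sketch follows; the function names, response factor and concentrations are illustrative, not taken from the study.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = m*x + b (closed form)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    m = sxy / sxx
    b = my - m * mx
    return m, b

def standard_addition(added, signals):
    """Endogenous concentration from a standard-addition series:
    extrapolate the fitted line to signal = 0; the magnitude of the
    x-intercept (intercept/slope) is the unknown concentration."""
    m, b = fit_line(added, signals)
    return b / m

# Synthetic example: true endogenous level 12 uM, response 2.5 units/uM.
added = [0.0, 5.0, 10.0, 20.0]
signals = [2.5 * (12.0 + a) for a in added]
print(round(standard_addition(added, signals), 3))  # → 12.0
```

    Deviations from linearity across the spike levels are precisely the parallelism failures that must be ruled out before formal validation.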

  6. Robotic voltammetry with carbon nanotube-based sensors: a superb blend for convenient high-quality antimicrobial trace analysis.

    PubMed

    Theanponkrang, Somjai; Suginta, Wipa; Weingart, Helge; Winterhalter, Mathias; Schulte, Albert

    2015-01-01

    A new automated pharmacoanalytical technique for convenient quantification of redox-active antibiotics has been established by combining the benefits of a carbon nanotube (CNT) sensor modification with electrocatalytic activity for analyte detection with the merits of a robotic electrochemical device capable of sequential nonmanual sample measurements in 24-well microtiter plates. Norfloxacin (NFX) and ciprofloxacin (CFX), two standard fluoroquinolone antibiotics, were used in automated calibration measurements by differential pulse voltammetry (DPV); linear ranges of 1-10 μM and 2-100 μM were achieved for NFX and CFX, respectively. The lowest detectable levels were estimated to be 0.3±0.1 μM (n=7) for NFX and 1.6±0.1 μM (n=7) for CFX. In standard solutions or tablet samples of known content, both analytes could be quantified with the robotic DPV microtiter plate assay, with recoveries within ±4% of 100%. Recoveries were equally good when NFX was evaluated in human serum samples with added NFX. The simple instrumentation, convenient execution, and high effectiveness in analyte quantitation suggest the combination of automated microtiter plate voltammetry and CNT-supported electrochemical drug detection as a novel methodology for antibiotic testing in pharmaceutical and clinical research and quality control laboratories.
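    The calibration workflow behind figures of this kind (fit a line to a DPV peak-current series, then estimate a detection limit) can be sketched as below. The data points and the 3·s_res/slope convention are illustrative assumptions, not the study's exact procedure.

```python
import math

def calibration(conc, current):
    """Least-squares calibration line current = m*conc + b, plus a
    detection limit estimated as 3*s_res/m, where s_res is the standard
    deviation of the fit residuals (a common 3-sigma convention)."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(current) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    m = sum((x - mx) * (y - my) for x, y in zip(conc, current)) / sxx
    b = my - m * mx
    resid = [y - (m * x + b) for x, y in zip(conc, current)]
    s_res = math.sqrt(sum(r * r for r in resid) / (n - 2))
    return m, b, 3 * s_res / m

# Illustrative NFX-like series over a 1-10 uM range (numbers invented).
conc  = [1, 2, 4, 6, 8, 10]          # uM
peaks = [0.52, 1.01, 2.03, 2.97, 4.05, 4.99]  # peak currents, arbitrary units
m, b, lod = calibration(conc, peaks)
```

    With replicate blank measurements available, 3·σ_blank/slope would be the more orthodox estimator, which is closer to how the reported 0.3 μM and 1.6 μM limits would be obtained.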

  7. An Affordance-Based Framework for Human Computation and Human-Computer Collaboration.

    PubMed

    Crouser, R J; Chang, R

    2012-12-01

    Visual Analytics is "the science of analytical reasoning facilitated by visual interactive interfaces". The goal of this field is to develop tools and methodologies for approaching problems whose size and complexity render them intractable without the close coupling of both human and machine analysis. Researchers have explored this coupling in many venues: VAST, Vis, InfoVis, CHI, KDD, IUI, and more. While there have been myriad promising examples of human-computer collaboration, there exists no common language for comparing systems or describing the benefits afforded by designing for such collaboration. We argue that this area would benefit significantly from consensus about the design attributes that define and distinguish existing techniques. In this work, we have reviewed 1,271 papers from many of the top-ranking conferences in visual analytics, human-computer interaction, and visualization. From these, we have identified 49 papers that are representative of the study of human-computer collaborative problem-solving, and provide a thorough overview of the current state-of-the-art. Our analysis has uncovered key patterns of design hinging on human and machine-intelligence affordances, and also indicates unexplored avenues in the study of this area. The results of this analysis provide a common framework for understanding these seemingly disparate branches of inquiry, which we hope will motivate future work in the field.

  8. New robust bilinear least squares method for the analysis of spectral-pH matrix data.

    PubMed

    Goicoechea, Héctor C; Olivieri, Alejandro C

    2005-07-01

    A new second-order multivariate method has been developed for the analysis of spectral-pH matrix data, based on a bilinear least-squares (BLLS) model that achieves the second-order advantage and handles multiple calibration standards. A simulated Monte Carlo study of synthetic absorbance-pH data allowed comparison of the newly proposed BLLS methodology with constrained parallel factor analysis (PARAFAC) and with the combined multivariate curve resolution-alternating least-squares (MCR-ALS) technique under different conditions of sample-to-sample pH mismatch and analyte-background ratio. The results indicate improved prediction ability for the new method. Experimental data generated by measuring absorption spectra of several calibration standards of ascorbic acid and samples of orange juice were subjected to second-order calibration analysis with PARAFAC, MCR-ALS, and the new BLLS method. The results indicate that the latter method provides the best analytical results with regard to analyte recovery in samples of complex composition requiring strict adherence to the second-order advantage. Linear dependencies appear when multivariate data are produced using pH or reaction time as one of the data dimensions, posing a challenge to classical multivariate calibration models; the algorithm discussed here is useful for such systems.
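    For intuition, a one-component bilinear fit can be written as two alternating linear least-squares solves. The toy sketch below uses noise-free rank-1 data and is illustrative only; it omits the multi-component, multi-standard machinery (and the constraints) of the actual BLLS method.

```python
def bilinear_als(D, iters=50):
    """One-component bilinear least-squares decomposition D ≈ x y^T by
    alternating updates; each half-step is a closed-form linear solve."""
    nr, nc = len(D), len(D[0])
    y = [1.0] * nc
    x = [0.0] * nr
    for _ in range(iters):
        yy = sum(v * v for v in y)
        x = [sum(D[i][j] * y[j] for j in range(nc)) / yy for i in range(nr)]
        xx = sum(v * v for v in x)
        y = [sum(D[i][j] * x[i] for i in range(nr)) / xx for j in range(nc)]
    return x, y

# Synthetic spectra-vs-pH matrix built from one spectral profile and one
# pH profile (pure rank-1, noise-free for clarity).
spec = [0.1, 0.5, 1.0, 0.4]   # absorbance at 4 wavelengths
ph   = [0.2, 0.6, 0.9]        # species fraction at 3 pH values
D = [[s * p for p in ph] for s in spec]
x, y = bilinear_als(D)
err = max(abs(D[i][j] - x[i] * y[j]) for i in range(4) for j in range(3))
# err ≈ 0 for noise-free rank-1 data
```

    The decomposition is only defined up to a scale swap between x and y, which is why real second-order algorithms add normalization and initialization steps.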

  9. Analytical Methodologies for Detection of Gamma-Valerolactone, Delta-Valerolactone, Acephate and Azinphos Methyl and Their Associated Metabolites in Complex Biological Matrices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zink, E.; Clark, R.; Grant, K.

    2005-01-01

    Non-invasive biomonitoring for chemicals of interest in law enforcement and similar monitoring of pesticides, together with their metabolites, can not only save money but also lead to faster medical attention for individuals exposed to these chemicals. This study describes methods developed for the analysis of gamma-valerolactone (GVL), delta-valerolactone (DVL), acephate, and azinphos methyl in saliva and serum. Liquid chromatography/mass spectrometry (LC/MS) operated in the negative and positive ion mode and gas chromatography/mass spectrometry (GC/MS) were used to analyze GVL and DVL. Although both analytical techniques worked well, lower detection limits were obtained with GC/MS. The lactones and their corresponding sodium salts were spiked into both saliva and serum. The lactones were isolated from saliva or serum using newly developed extraction techniques and then subsequently analyzed using GC/MS. The sodium salts of the lactones are nonvolatile and require derivatization prior to analysis by this method. N-methyl-N-(t-butyldimethylsilyl)-trifluoroacetamide (MTBSTFA) was ultimately selected as the reagent for derivatization because the acidic conditions required for reactions with diazomethane caused the salts to undergo intramolecular cyclization to the corresponding lactones. In vitro studies were conducted using rat liver microsomes to determine other metabolites associated with these compounds. Azinphos methyl and acephate are classified as organophosphate pesticides, and are known to be cholinesterase inhibitors in humans and insects, causing neurotoxicity. For this reason they have both exposure and environmental impact implications. These compounds were spiked into serum and saliva and prepared for analysis by GC/MS. Continuation of this research would include analysis by GC/MS under positive ion mode to determine the parent ions of the unknown metabolites. Further research is planned through an in vivo analysis of the lactones and pesticides. These methodologies could be extended for further analysis of other similar compounds.

  10. Analytical Methodologies for Detection of Gamma-valerolactone, Delta-valerolactone, Acephate, and Azinphos Methyl and their Associated Metabolites in Complex Biological Matrices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zink, Erika M.; Clark, Ryan J.; Grant, Karen E.

    2005-01-01

    Non-invasive biomonitoring for chemicals of interest in law enforcement and similar monitoring of pesticides together with their metabolites can not only save money but also lead to faster medical attention for individuals exposed to these chemicals. This study describes methods developed for the analysis of gamma-valerolactone (GVL), delta-valerolactone (DVL), acephate, and azinphos methyl in saliva and serum. Liquid chromatography/mass spectrometry (LC/MS) operated in the negative ion mode and in the positive ion mode and gas chromatography/mass spectrometry (GC/MS) were used to analyze GVL and DVL. Although both analytical techniques worked well, lower detection limits were obtained with GC/MS. The lactones and their corresponding sodium salts were spiked into both saliva and serum. The lactones were isolated from saliva or serum using newly developed extraction techniques and then subsequently analyzed using GC/MS. The sodium salts of the lactones are nonvolatile and require derivatization prior to analysis by this method. N-methyl-N-(t-butyldimethylsilyl)-trifluoroacetamide (MTBSTFA) was ultimately selected as the reagent for derivatization because the acidic conditions required for reactions with diazomethane caused the salts to undergo intramolecular cyclization to the corresponding lactones. In vitro studies were conducted using rat liver microsomes to determine other metabolites associated with these compounds. Azinphos methyl and acephate are classified as organophosphate pesticides, and are known to be cholinesterase inhibitors in humans and insects, causing neurotoxicity. For this reason they have both exposure and environmental impact implications. These compounds were spiked into serum and saliva and prepared for analysis by GC/MS. Continuation of this research would include analysis by GC/MS under positive ion mode to determine the parent ions of the unknown metabolites. Further research is planned through an in vivo analysis of the lactones and pesticides. These methodologies could be extended for further analysis of other similar compounds as well as chemical and biological warfare agents.

  11. Robust approximate optimal guidance strategies for aeroassisted orbital transfer missions

    NASA Astrophysics Data System (ADS)

    Ilgen, Marc R.

    This thesis presents the application of game theoretic and regular perturbation methods to the problem of determining robust approximate optimal guidance laws for aeroassisted orbital transfer missions with atmospheric density and navigated state uncertainties. The optimal guidance problem is reformulated as a differential game problem with the guidance law designer and Nature as opposing players. The resulting equations comprise the necessary conditions for the optimal closed loop guidance strategy in the presence of worst case parameter variations. While these equations are nonlinear and cannot be solved analytically, the presence of a small parameter in the equations of motion allows the method of regular perturbations to be used to solve the equations approximately. This thesis is divided into five parts. The first part introduces the class of problems to be considered and presents results of previous research. The second part then presents explicit semianalytical guidance law techniques for the aerodynamically dominated region of flight. These guidance techniques are applied to unconstrained and control constrained aeroassisted plane change missions and Mars aerocapture missions, all subject to significant atmospheric density variations. The third part presents a guidance technique for aeroassisted orbital transfer problems in the gravitationally dominated region of flight. Regular perturbations are used to design an implicit guidance technique similar to the second variation technique, but one that removes the need for numerically computing an optimal trajectory prior to flight. This methodology is then applied to a set of aeroassisted inclination change missions. In the fourth part, the explicit regular perturbation solution technique is extended to include the class of guidance laws with partial state information.
This methodology is then applied to an aeroassisted plane change mission using inertial measurements and subject to uncertainties in the initial value of the flight path angle. A summary of performance results for all these guidance laws is presented in the fifth part of this thesis along with recommendations for further research.

  12. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and then modified to include them. The modification is based on (i) backward analytical propagation, including the perturbations, of the state vector obtained from the iterative patched conic technique at the sphere of influence, and (ii) quantification of the deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are then changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend on numerical integration; all computations are carried out using closed-form expressions. The improved design is very close to the numerical design, and design analysis using the proposed technique provides realistic insight into the mission aspects. The proposed design is also an excellent initial guess for numerical refinement and helps arrive at four distinct design options for a given opportunity.

  13. Continuous state-space representation of a bucket-type rainfall-runoff model: a case study with the GR4 model using state-space GR4 (version 1.0)

    NASA Astrophysics Data System (ADS)

    Santos, Léonard; Thirel, Guillaume; Perrin, Charles

    2018-04-01

    In many conceptual rainfall-runoff models, the water balance differential equations are not explicitly formulated. These differential equations are solved sequentially by splitting the equations into terms that can be solved analytically with a technique called operator splitting. As a result, only the solutions of the split equations are used to present the different models. This article provides a methodology to make the governing water balance equations of a bucket-type rainfall-runoff model explicit and to solve them continuously. This is done by setting up a comprehensive state-space representation of the model. By representing it in this way, the operator splitting, which makes the structural analysis of the model more complex, could be removed. In this state-space representation, the lag functions (unit hydrographs), which are frequent in rainfall-runoff models and make the resolution of the representation difficult, are first replaced by a so-called Nash cascade and then solved with a robust numerical integration technique. To illustrate this methodology, the GR4J model is taken as an example. The substitution of the unit hydrographs with a Nash cascade, even if it modifies the model behaviour when solved using operator splitting, does not modify it when the state-space representation is solved using an implicit integration technique. Indeed, the flow time series simulated by the new representation of the model are very similar to those simulated by the classic model. The use of a robust numerical technique that approximates a continuous-time model also improves the lag parameter consistency across time steps and provides a more time-consistent model with time-independent parameters.
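    The sequential structure the authors exploit can be illustrated with a toy Nash cascade solved by implicit Euler: because reservoir i receives only the outflow of reservoir i-1, the implicit system is lower-triangular and can be solved store by store without iteration. The parameter values below are illustrative; this is a sketch of the numerical idea, not the GR4J code.

```python
def nash_cascade_step(S, inflow, k, dt):
    """One implicit-Euler step of a Nash cascade: dS_i/dt = q_{i-1} - k*S_i
    with outflow q_i = k*S_i. Solving reservoirs in order makes each
    update a scalar linear equation in S_i."""
    q_in = inflow
    S_new = []
    for Si in S:
        Si1 = (Si + dt * q_in) / (1.0 + dt * k)  # implicit in S_i
        S_new.append(Si1)
        q_in = k * Si1                            # outflow feeds next store
    return S_new, q_in  # q_in is now the cascade outflow

# Route a rainfall pulse through a 3-reservoir cascade.
S = [0.0, 0.0, 0.0]
k, dt = 0.5, 0.1
out = []
for step in range(200):
    rain = 1.0 if step < 10 else 0.0
    S, q = nash_cascade_step(S, rain, k, dt)
    out.append(q)
```

    The implicit update conserves mass exactly (storage change equals dt times inflow minus outflow each step) and remains stable for any dt, which is the robustness property sought when replacing unit hydrographs with a Nash cascade.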

  14. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

    Highlights:
    • Prevailing C&D waste quantification methodologies are identified and compared.
    • One specific methodology cannot fulfill all waste quantification scenarios.
    • A relevance tree for appropriate quantification methodology selection is proposed.
    • More attention should be paid to civil and infrastructural works.
    • Classified information is suggested for making an effective waste management plan.
    Abstract: Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C&D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.

  15. Analysis of environmental contamination resulting from catastrophic incidents: part 2. Building laboratory capability by selecting and developing analytical methodologies.

    PubMed

    Magnuson, Matthew; Campisano, Romy; Griggs, John; Fitz-James, Schatzi; Hall, Kathy; Mapp, Latisha; Mullins, Marissa; Nichols, Tonya; Shah, Sanjiv; Silvestri, Erin; Smith, Terry; Willison, Stuart; Ernst, Hiba

    2014-11-01

    Catastrophic incidents can generate a large number of samples of analytically diverse types, including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface residue. Such samples may arise not only from contamination from the incident but also from the multitude of activities surrounding the response to the incident, including decontamination. This document summarizes a range of activities to help build laboratory capability in preparation for sample analysis following a catastrophic incident, including selection and development of fit-for-purpose analytical methods for chemical, biological, and radiological contaminants. Fit-for-purpose methods are those which have been selected to meet project specific data quality objectives. For example, methods could be fit for screening contamination in the early phases of investigation of contamination incidents because they are rapid and easily implemented, but those same methods may not be fit for the purpose of remediating the environment to acceptable levels when a more sensitive method is required. While the exact data quality objectives defining fitness-for-purpose can vary with each incident, a governing principle of the method selection and development process for environmental remediation and recovery is based on achieving high throughput while maintaining high quality analytical results. This paper illustrates the result of applying this principle, in the form of a compendium of analytical methods for contaminants of interest. The compendium is based on experience with actual incidents, where appropriate and available. This paper also discusses efforts aimed at adaptation of existing methods to increase fitness-for-purpose and development of innovative methods when necessary. 
The contaminants of interest are primarily those potentially released through catastrophes resulting from malicious activity. However, the same techniques discussed could also have application to catastrophes resulting from other incidents, such as natural disasters or industrial accidents. Further, the high sample throughput enabled by the techniques discussed could be employed for conventional environmental studies and compliance monitoring, potentially decreasing costs and/or increasing the quantity of data available to decision-makers. Published by Elsevier Ltd.

  16. Uncover the mantle: rediscovering Gregório Lopes palette and technique with a study on the painting "Mater Misericordiae"

    NASA Astrophysics Data System (ADS)

    Antunes, Vanessa; Candeias, António; Oliveira, Maria J.; Carvalho, Maria L.; Dias, Cristina Barrocas; Manhita, Ana; Francisco, Maria J.; Costa, Sónia; Lauw, Alexandra; Manso, Marta

    2016-11-01

    Gregório Lopes (c. 1490-1550) was one of the most prominent painters of the Renaissance and Mannerism in Portugal. The painting "Mater Misericordiae", made for the Sesimbra Holy House of Mercy circa 1535-1538, is one of the artist's most significant works and his only painting on this theme; it is also one of the most significant Portuguese paintings of the sixteenth century. A recent restoration provided the first opportunity for a material study of the painting, using a multianalytical methodology incorporating portable energy-dispersive X-ray fluorescence spectroscopy, scanning electron microscopy-energy-dispersive spectroscopy, micro-X-ray diffraction, micro-Raman spectroscopy and high-performance liquid chromatography coupled to diode array and mass spectrometry detectors. The analytical study was complemented by infrared reflectography, allowing study of the underdrawing technique, and by dendrochronology to confirm the date of the wooden panels (1535-1538). The results were compared with previous studies of the painter's workshop, and significant differences and similarities were found in the materials and techniques used.

  17. Trends in hard X-ray fluorescence mapping: environmental applications in the age of fast detectors.

    PubMed

    Lombi, E; de Jonge, M D; Donner, E; Ryan, C G; Paterson, D

    2011-06-01

    Environmental samples are extremely diverse but share a tendency toward heterogeneity and complexity. This heterogeneity poses methodological challenges when investigating biogeochemical processes. In recent years, the development of analytical tools capable of probing element distribution and speciation at the microscale has allowed this challenge to be addressed. Of these tools, laterally resolved synchrotron techniques such as X-ray fluorescence mapping are key methods for the in situ investigation of micronutrients and inorganic contaminants in environmental samples. This article demonstrates how recent advances in X-ray fluorescence detector technology are bringing new possibilities to environmental research. Fast detectors are helping to circumvent major issues such as X-ray beam damage of hydrated samples, as dwell times during scanning are reduced. They are also reducing beamtime requirements, making particularly time-consuming techniques such as micro X-ray fluorescence (μXRF) tomography increasingly feasible. This article focuses on μXRF mapping of nutrients and metalloids in environmental samples, and suggests that the current divide between mapping and speciation techniques will be increasingly blurred by the development of combined approaches.

  18. Harnessing psychoanalytical methods for a phenomenological neuroscience

    PubMed Central

    Cusumano, Emma P.; Raz, Amir

    2014-01-01

    Psychoanalysis proffers a wealth of phenomenological tools to advance the study of consciousness. Techniques for elucidating the structures of subjective life are sorely lacking in the cognitive sciences; as such, experiential reporting techniques must rise to meet both complex theories of brain function and increasingly sophisticated neuroimaging technologies. Psychoanalysis may offer valuable methods for bridging the gap between first-person and third-person accounts of the mind. Using systematic observational approaches alongside unstructured narrative interactions, psychoanalysts help patients articulate their experience and bring unconscious mental contents into awareness. Like seasoned meditators or phenomenologists, individuals who have undergone analysis are experts in discerning and describing their subjective experience, making them ideal candidates for neurophenomenology. Moreover, analytic techniques may provide a means of guiding untrained experimental participants to greater awareness of their mental continuum, as well as of gathering subjective reports about fundamental yet elusive aspects of experience, including selfhood, temporality, and inter-subjectivity. Mining psychoanalysis for its methodological innovations provides a fresh turn for the neuropsychoanalysis movement and cognitive science as a whole, showcasing the integrity of analysis alongside the irreducibility of human experience. PMID:24808869

  19. Comparison of two novel in-syringe dispersive liquid-liquid microextraction techniques for the determination of iodide in water samples using spectrophotometry.

    PubMed

    Kaykhaii, Massoud; Sargazi, Mona

    2014-01-01

    Two new, rapid methodologies have been developed and applied successfully for the determination of trace levels of iodide in real water samples. Both techniques are based on a combination of in-syringe dispersive liquid-liquid microextraction (IS-DLLME) and micro-volume UV-Vis spectrophotometry. In the first technique, iodide is oxidized with nitrous acid to the colorless anion ICl2(-) at a high concentration of hydrochloric acid; Rhodamine B is then added, and by means of one-step IS-DLLME the ion pair formed is extracted into toluene and measured spectrophotometrically, with acetone as the dispersive solvent. The second method is based on IS-DLLME of iodide as a colored ion pair with the iodide/1,10-phenanthroline-iron(II) chelate cation, extracted into nitrobenzene with methanol as the dispersive solvent. Optimal extraction conditions were determined for both approaches, and the methods were compared in terms of analytical parameters such as precision, accuracy, speed, and limit of detection. Both methods were successfully applied to the determination of iodide in tap and river water samples. Copyright © 2013 Elsevier B.V. All rights reserved.
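    The figures of merit compared above (precision, accuracy, limit of detection) follow from a simple linear calibration. A minimal sketch, using entirely illustrative absorbance values and the common 3-sigma blank criterion for the LOD:

```python
import numpy as np

# Hypothetical calibration data for a spectrophotometric ion-pair method:
# iodide standards (mg/L) vs. measured absorbance (values are illustrative only).
conc = np.array([0.0, 0.1, 0.2, 0.4, 0.8, 1.6])
absorbance = np.array([0.002, 0.051, 0.103, 0.198, 0.402, 0.795])

# Linear least-squares fit in the Beer-Lambert region: A = m*c + b
m, b = np.polyfit(conc, absorbance, 1)

# Limit of detection from the 3*sigma criterion, using the standard
# deviation of replicate blank absorbances (again illustrative values).
blanks = np.array([0.001, 0.003, 0.002, 0.002, 0.001])
lod = 3 * blanks.std(ddof=1) / m

print(f"slope = {m:.3f} AU/(mg/L), LOD = {lod:.4f} mg/L")
```

    The same slope and blank statistics would also give the limit of quantification (10-sigma criterion) if needed.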

  20. Best available techniques (BATs) for oil spill response in the Mediterranean Sea: calm sea and presence of economic activities.

    PubMed

    Guidi, Giambattista; Sliskovic, Merica; Violante, Anna Carmela; Vukic, Luka

    2016-01-01

    An oil spill is the accidental or intentional discharge of petroleum products into the environment due to human activities. Although oil spills account for only a small percentage of total world oil pollution, they are its most visible form, and their impact on ecosystems and on economic activities can be severe. Oil spill cleanup is a difficult and expensive activity, and many techniques are available for it. In previous work, a methodology based on several types of criteria for selecting the most satisfactory technique was proposed, and the relative importance of each impact criterion was evaluated using Saaty's Analytic Hierarchy Process (AHP). After a review of the best available techniques (BATs) for oil spill response, this work suggests criteria for BAT selection when oil spills occur in the Mediterranean Sea under well-defined circumstances: calm sea and the presence of economic activities in the affected area. A group of experts with different specializations evaluated the alternative BATs by means of the AHP method, taking their respective advantages and disadvantages into account.
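    The AHP evaluation described above reduces to computing a priority vector from a pairwise comparison matrix and checking its consistency. A minimal sketch with a hypothetical 3x3 matrix (all judgment values are invented for illustration):

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three response techniques
# on Saaty's 1-9 scale; entries below the diagonal are reciprocals.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector = principal eigenvector, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CI = (lambda_max - n)/(n - 1); Saaty's random
# index RI = 0.58 for n = 3; CR < 0.1 is conventionally acceptable.
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 0.58
print("weights:", np.round(w, 3), "CR =", round(cr, 4))
```

    In a full BAT selection, each criterion would get such a weight vector, and alternatives would then be scored against the weighted criteria.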

  1. 31P-Magnetization Transfer Magnetic Resonance Spectroscopy Measurements of In Vivo Metabolism

    PubMed Central

    Befroy, Douglas E.; Rothman, Douglas L.; Petersen, Kitt Falk; Shulman, Gerald I.

    2012-01-01

    Magnetic resonance spectroscopy offers a broad range of noninvasive analytical methods for investigating metabolism in vivo. Of these, the magnetization-transfer (MT) techniques permit the estimation of the unidirectional fluxes associated with metabolic exchange reactions. Phosphorus (31P) MT measurements can be used to examine the bioenergetic reactions of the creatine-kinase system and the ATP synthesis/hydrolysis cycle. Observations from our group and others suggest that the inorganic phosphate (Pi) → ATP flux in skeletal muscle may be modulated by certain conditions, including aging, insulin resistance, and diabetes, and may reflect inherent alterations in mitochondrial metabolism. However, such effects on the Pi → ATP flux are not universally observed under conditions in which mitochondrial function, assessed by other techniques, is impaired, and recent articles have raised concerns about the absolute magnitude of the measured reaction rates. As the application of 31P-MT techniques becomes more widespread, this article reviews the methodology and outlines our experience with its implementation in a variety of models in vivo. Also discussed are potential limitations of the technique, complementary methods for assessing oxidative metabolism, and whether the Pi → ATP flux is a viable biomarker of metabolic function in vivo. PMID:23093656
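    The unidirectional Pi → ATP flux discussed above is conventionally obtained from the steady-state saturation-transfer relation; a back-of-the-envelope sketch with illustrative numbers (not values from the article):

```python
# With the gamma-ATP resonance saturated, the Pi magnetization falls
# from M0 to a steady state Mss, giving the pseudo-first-order rate
# constant k = (1 - Mss/M0) / T1app, and the unidirectional flux as
# k * [Pi]. All numbers below are illustrative only.
M0, Mss = 1.0, 0.8   # relative Pi magnetization without/with saturation
T1app = 4.0          # apparent Pi T1 during saturation, s
Pi_conc = 4.0        # Pi concentration, mmol/L

k = (1 - Mss / M0) / T1app   # s^-1
flux = k * Pi_conc * 60      # mmol/L/min
print(f"k = {k:.3f} s^-1, Pi->ATP flux = {flux:.1f} mmol/L/min")
```

    The concerns the review raises about absolute reaction rates amount to questioning the inputs to exactly this arithmetic (e.g., exchange with NMR-invisible pools biasing Mss or T1app).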

  2. Foodomics: MS-based strategies in modern food science and nutrition.

    PubMed

    Herrero, Miguel; Simó, Carolina; García-Cañas, Virginia; Ibáñez, Elena; Cifuentes, Alejandro

    2012-01-01

    Modern research in food science and nutrition is moving from classical methodologies to advanced analytical strategies in which MS-based techniques play a crucial role. In this context, Foodomics has recently been defined as a new discipline that studies the food and nutrition domains through the application of advanced omics technologies, in which MS techniques are considered indispensable. Applications of Foodomics include the genomic, transcriptomic, proteomic, and/or metabolomic study of foods for compound profiling, authenticity, and/or biomarker detection related to food quality or safety; the development of new transgenic foods; studies of food contaminants and whole toxicity; and new investigations of food bioactivity, the effects of food on human health, etc. This review does not intend to provide an exhaustive survey of the many works published so far on food analysis using MS techniques. Rather, the aim of the present work is to provide an overview of the different MS-based strategies that have been (or can be) applied in the new field of Foodomics, discussing their advantages and drawbacks. In addition, some ideas about the foreseeable development and applications of MS techniques in this new discipline are provided. Copyright © 2011 Wiley Periodicals, Inc.

  3. A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses

    PubMed Central

    Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert

    2011-01-01

    Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325

  4. Selecting Health Care Improvement Projects: A Methodology Integrating Cause-and-Effect Diagram and Analytical Hierarchy Process.

    PubMed

    Testik, Özlem Müge; Shaygan, Amir; Dasdemir, Erdi; Soydan, Guray

    It is often vital to identify, prioritize, and select quality improvement projects in a hospital, yet better decision making requires a methodology that utilizes the opinions of experts with different points of view. The proposed methodology uses the cause-and-effect diagram to identify improvement projects and construct a project hierarchy for a problem. The right improvement projects are then prioritized and selected using an analytical hierarchy process weighting scheme that aggregates the experts' opinions. An approach for collecting data from experts and a graphical display for summarizing the obtained information are also provided. The methodology was implemented for improving a hospital appointment system. The 2 top-ranked major project categories for improvement were system- and accessibility-related causes (45%) and capacity-related causes (28%), respectively. Within each major project category, subprojects were then ranked to select the improvement needs. The methodology is useful in cases where an aggregate decision based on experts' opinions is expected. Some suggestions for practical implementation are provided.
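    Aggregating the opinions of several experts, as described above, is commonly done in AHP with the element-wise geometric mean of the individual judgment matrices; a minimal sketch with two invented 2x2 judgment matrices:

```python
import numpy as np

# Two hypothetical experts' pairwise judgments for two subprojects,
# pooled by the element-wise geometric mean (a standard way to
# aggregate group opinions while preserving matrix reciprocity).
expert1 = np.array([[1.0, 4.0], [1/4, 1.0]])
expert2 = np.array([[1.0, 2.0], [1/2, 1.0]])
group = np.sqrt(expert1 * expert2)  # geometric mean of 2 matrices

# Priorities from normalized row geometric means (exact for 2x2
# reciprocal matrices, and a common AHP approximation in general).
gm = np.prod(group, axis=1) ** (1 / group.shape[1])
weights = gm / gm.sum()
print(np.round(group, 3), np.round(weights, 3))
```

    With more experts, the pooled entry is the n-th root of the product of the n individual judgments.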

  5. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    PubMed

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One advantage of planar chromatography over its column counterpart is that each TLC run can be performed on a fresh stationary phase, making it possible to fractionate or separate complex samples with a heavy biological matrix load. In the present studies, the components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by freeze-drying. Low-molecular-mass compounds, with polarity ranging from estetrol to progesterone, derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from the surface waters of Middle Pomerania in northern Poland can easily be observed on the resulting micro-TLC chromatograms. This approach can serve as a simple and inexpensive complementary method for the fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions; these data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we demonstrated that a micro-TLC-based analytical approach can be an effective method for the search for internal standard (IS) substances. In general, the described methodology can be applied to the fast fractionation or screening of a whole range of target substances, as well as to chemo-taxonomic studies and the fingerprinting of complex mixtures present in biological or environmental samples. Owing to its low consumption of eluent (usually 0.3-1 mL/run), mainly composed of water-alcohol binary mixtures, the method can be considered an environmentally friendly, green-chemistry analytical tool, supplementary to protocols involving column chromatography or planar micro-fluidic devices. Copyright © 2011 Elsevier Ltd. All rights reserved.
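    The retention measurements mentioned above rest on elementary planar-chromatography quantities; a minimal sketch (with invented migration distances) of the retardation factor Rf and the derived RM value often used in retention and lipophilicity studies:

```python
import math

# Retardation factor from spot and solvent-front migration distances
# measured on the plate (distances below are illustrative only).
solvent_front_mm = 45.0
spot_migration_mm = 27.0
rf = spot_migration_mm / solvent_front_mm

# RM transform commonly used to linearize retention vs. eluent
# composition: RM = log10(1/Rf - 1).
rm = math.log10(1 / rf - 1)
print(f"Rf = {rf:.2f}, RM = {rm:.3f}")
```
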

  6. Response Surface Methods For Spatially-Resolved Optical Measurement Techniques

    NASA Technical Reports Server (NTRS)

    Danehy, P. M.; Dorrington, A. A.; Cutler, A. D.; DeLoach, R.

    2003-01-01

    Response surface methods (or methodology), RSM, have been applied to improve data quality for two vastly different spatially-resolved optical measurement techniques. In the first application, modern design of experiments (MDOE) methods, including RSM, are employed to map the temperature field in a direct-connect supersonic combustion test facility at NASA Langley Research Center. The laser-based measurement technique known as coherent anti-Stokes Raman spectroscopy (CARS) is used to measure temperature at various locations in the combustor, and RSM is then used to develop temperature maps of the flow. Even though the temperature fluctuations at a single point in the flowfield have a standard deviation on the order of 300 K, RSM provides analytic fits to the data with 95% confidence interval half-width uncertainties as low as +/- 30 K. Methods of optimizing future CARS experiments are explored. The second application of RSM is to quantify the shape of a 5-meter diameter, ultra-lightweight, inflatable space antenna at NASA Langley Research Center. Photogrammetry is used to simultaneously measure the shape of the antenna at approximately 500 discrete spatial locations. RSM allows an analytic model to be developed that describes the shape of the majority of the antenna with an uncertainty of 0.4 mm, with 95% confidence. This model would allow a quantitative comparison between the actual shape of the antenna and the original design shape, and accurately determining this shape also allows confident interpolation between the measured points. Such a model could, for example, be used for ray tracing of radio-frequency waves up to 95 GHz to predict the performance of the antenna.
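    The central idea above, fitting a smooth analytic surface so that point-wise noise averages out, can be sketched with an ordinary least-squares quadratic response surface (all numbers are synthetic, chosen only to mimic the ~300 K single-point scatter quoted in the abstract):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic spatially-resolved measurements: a smooth temperature field
# sampled at 200 scattered (x, y) points with large point-wise noise.
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
true_T = 1800 - 300 * x**2 - 150 * y**2 + 100 * x * y
T = true_T + rng.normal(0, 300, x.size)  # sd ~300 K per point

# Full quadratic response surface:
# T ~ b0 + b1*x + b2*y + b3*x^2 + b4*x*y + b5*y^2
X = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
beta, *_ = np.linalg.lstsq(X, T, rcond=None)

# The fitted surface pools all points, so its error against the true
# field is far below the 300 K single-point scatter.
fit_rmse = np.sqrt(np.mean((X @ beta - true_T) ** 2))
print("intercept:", round(beta[0], 1), "RMSE of surface vs truth:", round(fit_rmse, 1))
```

    This is the statistical mechanism behind the quoted +/- 30 K fit uncertainty: roughly, the surface's uncertainty scales with the point scatter divided by the square root of points per fitted parameter.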

  7. COMPARISON OF ANALYTICAL METHODS FOR THE MEASUREMENT OF NON-VIABLE BIOLOGICAL PM

    EPA Science Inventory

    The paper describes a preliminary research effort to develop a methodology for the measurement of non-viable biologically based particulate matter (PM), analyzing for mold, dust mite, and ragweed antigens and endotoxins. Using a comparison of analytical methods, the research obj...

  8. Cognitive-Developmental and Behavior-Analytic Theories: Evolving into Complementarity

    ERIC Educational Resources Information Center

    Overton, Willis F.; Ennis, Michelle D.

    2006-01-01

    Historically, cognitive-developmental and behavior-analytic approaches to the study of human behavior change and development have been presented as incompatible alternative theoretical and methodological perspectives. This presumed incompatibility has been understood as arising from divergent sets of metatheoretical assumptions that take the form…

  9. Nanomaterials in consumer products: a challenging analytical problem.

    PubMed

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens, and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits vs. risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards associated with the exposure levels; this requires transversal studies and a range of different competences. From an analytical point of view, the identification, quantification, and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments required for their detection and characterization. This paper focuses on analytical techniques suitable for the detection, characterization, and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, matching complementary information across different metrics and ideally achieved through validated procedures. More work should be done to produce standardized materials, to set up methodologies for determining number-based size distributions, and to obtain quantitative data about NPs in such complex matrices.

  10. Nanomaterials in consumer products: a challenging analytical problem

    NASA Astrophysics Data System (ADS)

    Contado, Catia

    2015-08-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens, and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits versus risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards associated with the exposure levels; this requires transversal studies and a range of different competences. From an analytical point of view, the identification, quantification, and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments required for their detection and characterization. This paper focuses on analytical techniques suitable for the detection, characterization, and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, matching complementary information across different metrics and ideally achieved through validated procedures. More work should be done to produce standardized materials, to set up methodologies for determining number-based size distributions, and to obtain quantitative data about NPs in such complex matrices.

  11. Nanomaterials in consumer products: a challenging analytical problem

    PubMed Central

    Contado, Catia

    2015-01-01

    Many products used in everyday life are made with the assistance of nanotechnologies. Cosmetics, pharmaceuticals, sunscreens, and powdered foods are only a few examples of end products containing nano-sized particles (NPs), generally added to improve product quality. To correctly evaluate the benefits vs. risks of engineered nanomaterials, and consequently to legislate in favor of consumer protection, it is necessary to know the hazards associated with the exposure levels; this requires transversal studies and a range of different competences. From an analytical point of view, the identification, quantification, and characterization of NPs in food matrices and in cosmetic or personal care products pose significant challenges, because NPs are usually present at low concentration levels and the matrices in which they are dispersed are complex and often incompatible with the analytical instruments required for their detection and characterization. This paper focuses on analytical techniques suitable for the detection, characterization, and quantification of NPs in food and cosmetic products, and reports their recent application in characterizing specific metal and metal-oxide NPs in these two important industrial and market sectors. What clearly emerges from this research is the need for a characterization of NPs that is as complete as possible, matching complementary information across different metrics and ideally achieved through validated procedures. More work should be done to produce standardized materials, to set up methodologies for determining number-based size distributions, and to obtain quantitative data about NPs in such complex matrices. PMID:26301216

  12. Bioinspired Methodology for Artificial Olfaction

    PubMed Central

    Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve

    2008-01-01

    Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409

  13. Analytical methodology for determination of helicopter IFR precision approach requirements. [pilot workload and acceptance level

    NASA Technical Reports Server (NTRS)

    Phatak, A. V.

    1980-01-01

    A systematic analytical approach to the determination of helicopter IFR precision approach requirements is formulated. The approach is based upon the hypothesis that the pilot acceptance level, or opinion rating, of a given system is inversely related to the degree of pilot involvement in the control task. A nonlinear simulation of the helicopter approach-to-landing task, incorporating appropriate models for the UH-1H aircraft, the environmental disturbances, and the human pilot, was developed as a tool for evaluating the pilot acceptance hypothesis. The simulated pilot model is generic in nature and includes analytical representations of the human information acquisition, processing, and control strategies. Simulation analyses in the flight director mode indicate that the pilot model used is reasonable. Results of the simulation are used to identify candidate pilot workload metrics and to test the well-known performance-workload relationship. A pilot acceptance analytical methodology is formulated as a basis for further investigation, development, and validation.

  14. Depth-resolved monitoring of analytes diffusion in ocular tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.

    2007-02-01

    Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantifying analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Different analytes and drugs, such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution, were studied, and their permeability coefficients were calculated. Drug diffusion was monitored both as a function of time and as a function of depth. The obtained results suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.
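    Permeability coefficients of the kind computed above are often estimated in OCT diffusion studies as the depth traversed by the agent divided by the time the OCT signal takes to stabilize; a minimal sketch with illustrative numbers (not values from this study):

```python
# Simple "thickness over time" permeability estimate: P = z / t, where
# z is the tissue depth the agent crosses and t the diffusion time
# inferred from the OCT signal. All numbers below are illustrative.
thickness_cm = 0.04        # e.g., ~400 um of corneal stroma
diffusion_time_s = 800.0   # time for the agent front to cross it

P = thickness_cm / diffusion_time_s  # permeability coefficient, cm/s
print(f"P = {P:.1e} cm/s")
```

    Depth-resolved variants repeat this ratio at each imaged depth, which is what makes OCT attractive here compared with bulk diffusion-cell measurements.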

  15. Design/Analysis of the JWST ISIM Bonded Joints for Survivability at Cryogenic Temperatures

    NASA Technical Reports Server (NTRS)

    Bartoszyk, Andrew; Johnston, John; Kaprielian, Charles; Kuhn, Jonathan; Kunt, Cengiz; Rodini, Benjamin; Young, Daniel

    1990-01-01

    A major design and analysis challenge for the JWST ISIM structure is the thermal survivability of metal/composite bonded joints below the cryogenic temperature of 30 K (-405 F). Current bonded joint concepts include internal invar plug fittings, external saddle titanium/invar fittings, and composite gusset/clip joints, all bonded to M55J/954-6 and T300/954-6 hybrid composite tubes (75 mm square). Analytical experience and design work on metal/composite bonded joints at temperatures below that of liquid nitrogen are limited, and important analysis tools, material properties, and failure criteria for composites at cryogenic temperatures are sparse in the literature. Compounding this challenge is the difficulty of testing for these required tools and properties at cryogenic temperatures. To gain confidence in analyzing and designing the ISIM joints, a comprehensive joint development test program has been planned and is currently running. The test program is designed to produce the required analytical tools and to develop a composite failure criterion for bonded joint strengths at cryogenic temperatures. Finite element analysis is used to design simple test coupons that simulate the anticipated stress states in the flight joints; subsequently, the test results are used to correlate the analysis technique for the final design of the bonded joints. In this work, we present an overview of the analysis and test methodology, current results, and working joint designs based on the developed techniques and properties.

  16. Quantifying Trace Amounts of Aggregates in Biopharmaceuticals Using Analytical Ultracentrifugation Sedimentation Velocity: Bayesian Analyses and F Statistics.

    PubMed

    Wafer, Lucas; Kloczewiak, Marek; Luo, Yin

    2016-07-01

    Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected, sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique.
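    The F-statistics mentioned above are the standard nested-model test: does adding a sedimenting species reduce the residual sum of squares by more than noise alone would explain? A minimal sketch with invented fit statistics (not values from the article):

```python
# Nested-model F test. rss_simple/rss_full are residual sums of squares
# from fits with k and k+1 species; p_* count fitted parameters; n is
# the number of data points. All numbers below are illustrative.
n = 500
rss_simple = 2.60   # k-species model
rss_full = 2.45     # (k+1)-species model
p_simple, p_full = 4, 6

F = ((rss_simple - rss_full) / (p_full - p_simple)) / (rss_full / (n - p_full))
print(f"F = {F:.2f}")
# Compare F against the critical value F_0.05(2, 494) ~ 3.0: if F
# exceeds it, the extra species improves the fit beyond noise, and a
# peak that fails this test can be rigorously excluded as artifactual.
```
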

  17. Review of methodological and experimental LIBS techniques for coal analysis and their application in power plants in China

    NASA Astrophysics Data System (ADS)

    Zhao, Yang; Zhang, Lei; Zhao, Shu-Xia; Li, Yu-Fang; Gong, Yao; Dong, Lei; Ma, Wei-Guang; Yin, Wang-Bao; Yao, Shun-Chun; Lu, Ji-Dong; Xiao, Lian-Tuan; Jia, Suo-Tang

    2016-12-01

    Laser-induced breakdown spectroscopy (LIBS) is an emerging analytical spectroscopy technique. This review presents the main recent developments in China regarding the implementation of LIBS for coal analysis. The paper focuses on progress over the past few years in the fundamentals, data pretreatment, calibration models, and experimental issues of LIBS and its application to coal analysis. Many important domestic studies on coal quality analysis have been conducted. For example, a proposed novel hybrid quantification model provides more reproducible quantitative analytical results, obtaining average absolute errors (AREs) of 0.42%, 0.05%, 0.07%, and 0.17% for carbon, hydrogen, volatiles, and ash, respectively, and 0.07 MJ/kg for the heating value. Atomic/ionic emission lines and molecular bands, such as CN and C2, have been employed to generate more accurate results, achieving an ARE of 0.26% and a limit of detection (LOD) of 0.16% for the prediction of unburned carbon in fly ashes. Both laboratory and on-line LIBS apparatuses have been developed for field application in coal-fired power plants. We consider that both the accuracy and the repeatability of the elemental and proximate analysis of coal have increased significantly, and further efforts will be devoted to realizing the large-scale commercialization of coal quality analyzers in China.
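    The ARE figures of merit quoted above are straightforward averages of absolute prediction errors against reference values; a minimal sketch with invented certified and predicted carbon concentrations:

```python
import numpy as np

# Average absolute error between LIBS-predicted and certified
# concentrations for a set of reference coals (values are illustrative,
# not data from the cited studies).
certified = np.array([62.1, 58.4, 70.3, 66.0])   # carbon content, wt%
predicted = np.array([62.5, 58.0, 70.9, 65.7])   # calibration model output

are = np.mean(np.abs(predicted - certified))
print(f"ARE = {are:.3f} wt%")
```

    Repeating this over replicate measurements of one sample, rather than across samples, gives the repeatability figure the review also tracks.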

  18. A framework for inference about carnivore density from unstructured spatial sampling of scat using detector dogs

    USGS Publications Warehouse

    Thompson, Craig M.; Royle, J. Andrew; Garner, James D.

    2012-01-01

    Wildlife management often hinges upon an accurate assessment of population density. Although undeniably useful, many of the traditional approaches to density estimation such as visual counts, livetrapping, or mark–recapture suffer from a suite of methodological and analytical weaknesses. Rare, secretive, or highly mobile species exacerbate these problems through the reality of small sample sizes and movement on and off study sites. In response to these difficulties, there is growing interest in the use of non-invasive survey techniques, which provide the opportunity to collect larger samples with minimal increases in effort, as well as the application of analytical frameworks that are not reliant on large sample size arguments. One promising survey technique, the use of scat detecting dogs, offers a greatly enhanced probability of detection while at the same time generating new difficulties with respect to non-standard survey routes, variable search intensity, and the lack of a fixed survey point for characterizing non-detection. In order to account for these issues, we modified an existing spatially explicit, capture–recapture model for camera trap data to account for variable search intensity and the lack of fixed, georeferenced trap locations. We applied this modified model to a fisher (Martes pennanti) dataset from the Sierra National Forest, California, and compared the results (12.3 fishers/100 km2) to more traditional density estimates. We then evaluated model performance using simulations at 3 levels of population density. Simulation results indicated that estimates based on the posterior mode were relatively unbiased. We believe that this approach provides a flexible analytical framework for reconciling the inconsistencies between detector dog survey data and density estimation procedures.

  19. Hybrid experimental/analytical models of structural dynamics - Creation and use for predictions

    NASA Technical Reports Server (NTRS)

    Balmes, Etienne

    1993-01-01

    An original complete methodology for the construction of predictive models of damped structural vibrations is introduced. A consistent definition of normal and complex modes is given which leads to an original method to accurately identify non-proportionally damped normal mode models. A new method to create predictive hybrid experimental/analytical models of damped structures is introduced, and the ability of hybrid models to predict the response to system configuration changes is discussed. Finally a critical review of the overall methodology is made by application to the case of the MIT/SERC interferometer testbed.

  20. Reference values of elements in human hair: a systematic review.

    PubMed

    Mikulewicz, Marcin; Chojnacka, Katarzyna; Gedrange, Thomas; Górecki, Henryk

    2013-11-01

    A systematic review of reference values for elements in human hair that accounts for methodological approach has been lacking. In the absence of universally accepted and implemented reference ranges, hair mineral analysis has not yet become a reliable and useful method for assessing the nutritional status and exposure of individuals. Systematic review of reference values of elements in human hair. PubMed, ISI Web of Knowledge, Scopus. Humans, hair mineral analysis, elements or minerals, reference values, original studies. Fifty-two studies were screened and assessed for eligibility; five papers were ultimately included in the review. The studies report reference ranges for the content of elements in hair: macroelements, microelements, toxic elements and other elements. Reference ranges were elaborated for different populations between 2000 and 2012. The analytical methodology differed, in particular in sample preparation, digestion and analysis (ICP-AES, ICP-MS). Consequently, the hair mineral levels reported as reference values varied. Standard procedures must be established, hair mineral analysis further validated, and detailed methodology reported. Only then will it be possible to provide meaningful reference ranges and realize the potential of hair mineral analysis as a medical diagnostic technique.

  1. Robust Feedback Control of Flow Induced Structural Radiation of Sound

    NASA Technical Reports Server (NTRS)

    Heatwole, Craig M.; Bernhard, Robert J.; Franchek, Matthew A.

    1997-01-01

    A significant component of the interior noise of aircraft and automobiles is a result of turbulent boundary layer excitation of the vehicular structure. In this work, active robust feedback control of the noise due to this non-predictable excitation is investigated. Both an analytical model and experimental investigations are used to determine the characteristics of the flow induced structural sound radiation problem. The problem is shown to be broadband in nature with large system uncertainties associated with the various operating conditions. Furthermore, the delay associated with sound propagation is shown to restrict the use of microphone feedback. State-of-the-art control methodologies, μ-synthesis and adaptive feedback control, are evaluated and shown to have limited success for solving this problem. A robust frequency domain controller design methodology is developed for the problem of sound radiated from turbulent flow driven plates. The control design methodology uses frequency domain sequential loop shaping techniques. System uncertainty, sound pressure level reduction performance, and actuator constraints are included in the design process. Using this design method, phase lag was added using non-minimum phase zeros such that the beneficial plant dynamics could be used. This general control approach has application to lightly damped vibration and sound radiation problems where there are high bandwidth control objectives requiring a low controller DC gain and low controller order.

  2. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation was introduced to solve problems posed as systems. The technique overcomes two kinds of problem. First, a problem that has an analytical solution but for which running a physical experiment would be too costly in money or lives. Second, a problem that has no analytical solution at all. In the field of statistical inference the second kind is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques such as the bootstrap and permutation tests to form a pseudo sampling distribution that leads to the solution of a problem that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, used to verify analytical solutions in inference. It also treats resampling techniques as simulation techniques, examines common misunderstandings about the two, and describes successful applications of both.
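
    The resampling idea described above can be made concrete with a short percentile-bootstrap sketch; the data, resample count, and seed below are arbitrary illustrations:

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_resamples=2000,
                 alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for a statistic whose
    sampling distribution has no convenient analytical form."""
    rng = random.Random(seed)
    n = len(sample)
    # Draw n_resamples pseudo-samples with replacement and record the
    # statistic of each, forming the pseudo sampling distribution.
    replicates = sorted(
        stat([rng.choice(sample) for _ in range(n)])
        for _ in range(n_resamples)
    )
    low = replicates[int((alpha / 2) * n_resamples)]
    high = replicates[int((1 - alpha / 2) * n_resamples) - 1]
    return low, high

data = [4.1, 5.2, 6.3, 4.8, 5.9, 5.1, 4.4, 6.0]
low, high = bootstrap_ci(data)
```

    The interval brackets the sample mean without any distributional assumption, which is exactly the appeal of resampling when no analytical solution exists.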

  3. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China as an anticonvulsant, analgesic, sedative, anti-asthmatic, and anti-immune drug since ancient times. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME) and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  4. Big data analytics in healthcare: promise and potential.

    PubMed

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however there remain challenges to overcome.

  5. Analysis of Environmental Contamination resulting from Catastrophic Incidents: Part two: Building Laboratory Capability by Selecting and Developing Analytical Methodologies

    EPA Science Inventory

    Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...

  6. New analytical methodology for analysing S(IV) species in low-pH solutions by a one-stage titration method (bichromatometry) with a clear colour change: a potential replacement for the state-of-the-art method, iodometry, for low-pH analysis owing to higher accuracy

    PubMed Central

    Galfi, Istvan; Virtanen, Jorma; Gasik, Michael M.

    2017-01-01

    A new, faster and more reliable analytical methodology for analysing S(IV) species in low-pH solutions by bichromatometry is proposed. For decades the state-of-the-art methodology has been iodometry, which remains a well-justified method for neutral solutions but suffers from various side reactions in low-pH media that increase inaccuracy. In contrast, the new methodology has no side reactions in low-pH media, requires only one titration step, and produces a clear colour change if S(IV) species are present in the solution. The method is validated using model solutions with known concentrations and applied to the analysis of gaseous SO2 purged from low-pH samples. The results indicate that bichromatometry can accurately determine SO2 in liquid samples with pH even below 0, conditions relevant to metallurgical industrial processes. PMID:29145479
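
    The titration arithmetic behind such an analysis is straightforward once the stoichiometry is fixed. A minimal sketch, assuming the standard dichromate/sulfite redox equation; the example volumes and concentrations are made up for illustration:

```python
def s_iv_concentration(v_titrant_ml, c_titrant_mol_l, v_sample_ml):
    """S(IV) molarity from a dichromate titration endpoint.

    Stoichiometry: Cr2O7^2- + 3 SO3^2- + 8 H+ -> 2 Cr^3+ + 3 SO4^2- + 4 H2O,
    i.e. each mole of dichromate oxidizes three moles of S(IV).
    """
    mol_dichromate = c_titrant_mol_l * v_titrant_ml / 1000.0
    mol_s_iv = 3.0 * mol_dichromate
    return mol_s_iv / (v_sample_ml / 1000.0)

# e.g. 8.00 mL of 0.0100 M K2Cr2O7 consumed for a 50.0 mL sample
conc = s_iv_concentration(8.00, 0.0100, 50.0)  # -> 4.8e-3 mol/L
```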

  7. Structural Sizing Methodology for the Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) System

    NASA Technical Reports Server (NTRS)

    Jones, Thomas C.; Dorsey, John T.; Doggett, William R.

    2015-01-01

    The Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) is a versatile long-reach robotic manipulator that is currently being tested at NASA Langley Research Center. TALISMAN is designed to be highly mass-efficient and multi-mission capable, with applications including asteroid retrieval and manipulation, in-space servicing, and astronaut and payload positioning. The manipulator uses a modular, periodic, tension-compression design that lends itself well to analytical modeling. Given the versatility of application for TALISMAN, a structural sizing methodology was developed that could rapidly assess mass and configuration sensitivities for any specified operating work space, applied loads and mission requirements. This methodology allows the systematic sizing of the key structural members of TALISMAN, which include the truss arm links, the spreaders and the tension elements. This paper summarizes the detailed analytical derivations and methodology that support the structural sizing approach and provides results from some recent TALISMAN designs developed for current and proposed mission architectures.

  8. Meeting future information needs for Great Lakes fisheries management

    USGS Publications Warehouse

    Christie, W.J.; Collins, John J.; Eck, Gary W.; Goddard, Chris I.; Hoenig, John M.; Holey, Mark; Jacobson, Lawrence D.; MacCallum, Wayne; Nepszy, Stephen J.; O'Gorman, Robert; Selgeby, James

    1987-01-01

    Description of information needs for management of Great Lakes fisheries is complicated by recent changes in biology and management of the Great Lakes, development of new analytical methodologies, and a transition in management from a traditional unispecies approach to a multispecies/community approach. A number of general problems with the collection and management of data and information for fisheries management need to be addressed (i.e. spatial resolution, reliability, computerization and accessibility of data, design of sampling programs, standardization and coordination among agencies, and the need for periodic review of procedures). Problems with existing data collection programs include size selectivity and temporal trends in the efficiency of fishing gear, inadequate creel survey programs, bias in age estimation, lack of detailed sea lamprey (Petromyzon marinus) wounding data, and data requirements for analytical techniques that are underutilized by managers of Great Lakes fisheries. The transition to multispecies and community approaches to fisheries management will require policy decisions by the management agencies, adequate funding, and a commitment to develop programs for collection of appropriate data on a long-term basis.

  9. Computation of turbulent boundary layers employing the defect wall-function method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Brown, Douglas L.

    1994-01-01

    In order to decrease overall computational time requirements of spatially-marching parabolized Navier-Stokes finite-difference computer code when applied to turbulent fluid flow, a wall-function methodology, originally proposed by R. Barnwell, was implemented. This numerical effort increases computational speed and calculates reasonably accurate wall shear stress spatial distributions and boundary-layer profiles. Since the wall shear stress is analytically determined from the wall-function model, the computational grid near the wall is not required to spatially resolve the laminar-viscous sublayer. Consequently, a substantially increased computational integration step size is achieved resulting in a considerable decrease in net computational time. This wall-function technique is demonstrated for adiabatic flat plate test cases from Mach 2 to Mach 8. These test cases are analytically verified employing: (1) Eckert reference method solutions, (2) experimental turbulent boundary-layer data of Mabey, and (3) finite-difference computational code solutions with fully resolved laminar-viscous sublayers. Additionally, results have been obtained for two pressure-gradient cases: (1) an adiabatic expansion corner and (2) an adiabatic compression corner.
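
    The idea of a wall function is to supply the near-wall velocity profile analytically instead of resolving it on the grid. A minimal sketch using the classical law of the wall; the constants κ = 0.41 and B = 5.0 and the sublayer matching point are conventional textbook values, and Barnwell's defect wall-function formulation differs in detail:

```python
import math

KAPPA, B = 0.41, 5.0  # von Karman constant and log-law intercept

def u_plus(y_plus):
    """Law-of-the-wall velocity in wall units: linear viscous sublayer
    below y+ ~ 11, logarithmic layer above (matching point assumed)."""
    if y_plus < 11.0:
        return y_plus                       # viscous sublayer: u+ = y+
    return math.log(y_plus) / KAPPA + B     # log layer

def wall_shear_stress(rho, u_tau):
    """tau_w = rho * u_tau^2, recovered analytically once the wall
    function supplies the friction velocity u_tau -- the reason the
    grid need not resolve the laminar-viscous sublayer."""
    return rho * u_tau**2
```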

  10. Portable laser-induced breakdown spectroscopy/diffuse reflectance hybrid spectrometer for analysis of inorganic pigments

    NASA Astrophysics Data System (ADS)

    Siozos, Panagiotis; Philippidis, Aggelos; Anglos, Demetrios

    2017-11-01

    A novel, portable spectrometer combining two analytical techniques, laser-induced breakdown spectroscopy (LIBS) and diffuse reflectance spectroscopy, was developed with the aim of providing an enhanced instrumental and methodological approach to the analysis of pigments in objects of cultural heritage. Technical details about the hybrid spectrometer and its operation are presented and examples are given relevant to the analysis of paint materials. Both LIBS and diffuse reflectance spectra in the visible and part of the near infrared, corresponding to several neat mineral pigment samples, were recorded and the complementary information was used to effectively distinguish different types of pigments even when they had similar colour or elemental composition. The spectrometer was also employed in the analysis of different paints on the surface of an ancient pottery sherd, demonstrating the capabilities of the proposed hybrid diagnostic approach. Despite its instrumental simplicity and compact size, the spectrometer is capable of supporting analytical campaigns relevant to archaeological, historical or art historical investigations, particularly when quick data acquisition is required in the context of surveys of large numbers of objects and samples.

  11. Applications of flight control system methods to an advanced combat rotorcraft

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Fletcher, Jay W.; Morris, Patrick M.; Tucker, George T.

    1989-01-01

    Advanced flight control system design, analysis, and testing methodologies developed at the Ames Research Center are applied in an analytical and flight test evaluation of the Advanced Digital Optical Control System (ADOCS) demonstrator. The primary objectives are to describe the knowledge gained about the implications of digital flight control system design for rotorcraft, and to illustrate the analysis of the resulting handling-qualities in the context of the proposed new handling-qualities specification for rotorcraft. Topics covered in-depth are digital flight control design and analysis methods, flight testing techniques, ADOCS handling-qualities evaluation results, and correlation of flight test results with analytical models and the proposed handling-qualities specification. The evaluation of the ADOCS demonstrator indicates desirable response characteristics based on equivalent damping and frequency, but undesirably large effective time delays (exceeding 240 ms in all axes). Piloted handling-qualities are found to be desirable or adequate for all low, medium, and high pilot gain tasks; but handling-qualities are inadequate for ultra-high gain tasks such as slope and running landings.

  12. Analysis of polar and non-polar VOCs from ambient and source matrices: Development of a new canister autosampler which meets TO-15 QA/QC criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnett, M.L.W.; Neal, D.; Uchtman, R.

    1997-12-31

    Approximately 108 of the Hazardous Air Pollutants (HAPs) specified in the 1990 Clean Air Act Amendments are classified as volatile organic compounds (VOCs). Of the 108 VOCs, nearly 35% are oxygenated or polar compounds. While more than one sample introduction technique exists for the analysis of these air toxics, SUMMA® canister sampling is suitable for the most complete range of analytes. A broad concentration range of polar and non-polar species can be analyzed from canisters. A new canister autosampler, the Tekmar AUTOCan™ Elite autosampler, has been developed which incorporates the autosampler and concentrator into a single unit. Analysis of polar and non-polar VOCs has been performed. This paper demonstrates adherence to the technical acceptance objectives outlined in the TO-15 methodology including initial calibration, daily calibration, blank analysis, method detection limits and laboratory control samples. The analytical system consists of a Tekmar AUTOCan™ Elite autosampler interfaced to a Hewlett Packard® 5890/5972 MSD.
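
    One of the TO-15 acceptance criteria mentioned above, the method detection limit, is conventionally computed from replicate low-level spikes. A hedged sketch: the replicate values are fabricated, and the default t-value of 3.143 is the standard one-tailed 99% Student's t for seven replicates (6 degrees of freedom):

```python
def method_detection_limit(replicates, t_value=3.143):
    """MDL in the usual EPA style: one-tailed 99% Student's t times the
    standard deviation of replicate low-level spike analyses."""
    n = len(replicates)
    mean = sum(replicates) / n
    variance = sum((x - mean) ** 2 for x in replicates) / (n - 1)
    return t_value * variance ** 0.5

# Seven hypothetical replicate spike results, ppbv
spikes_ppbv = [0.52, 0.48, 0.55, 0.50, 0.47, 0.53, 0.49]
mdl = method_detection_limit(spikes_ppbv)
```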

  13. Differential change in integrative psychotherapy: a re-analysis of a change-factor based RCT in a naturalistic setting.

    PubMed

    Holtforth, Martin Grosse; Wilm, Katharina; Beyermann, Stefanie; Rhode, Annemarie; Trost, Stephanie; Steyer, Rolf

    2011-11-01

    General Psychotherapy (GPT; Grawe, 1997) is a research-informed psychotherapy that combines cognitive-behavioral and process-experiential techniques and that assumes motivational clarification and problem mastery as central mechanisms of change. To isolate the effect of motivational clarification, GPT was compared to a treatment that proscribed motivational clarification (General Psychotherapy Minus Clarification, GPT-C) in a randomized-controlled trial with 67 diagnostically heterogeneous outpatients. Previous analyses demonstrated equal outcomes and some superiority for highly avoidant patients in GPT. Re-analyses using causal-analytic methods confirmed equal changes, but also showed superior effects for GPT in highly symptomatic patients. Results are discussed regarding theory, methodological limitations, and implications for research and practice.

  14. Temperature-programmed technique accompanied with high-throughput methodology for rapidly searching the optimal operating temperature of MOX gas sensors.

    PubMed

    Zhang, Guozhu; Xie, Changsheng; Zhang, Shunping; Zhao, Jianwei; Lei, Tao; Zeng, Dawen

    2014-09-08

    A combinatorial high-throughput temperature-programmed method to obtain the optimal operating temperature (OOT) of gas sensor materials is demonstrated here for the first time. A material library consisting of SnO2, ZnO, WO3, and In2O3 sensor films was fabricated by screen printing. Temperature-dependent conductivity curves were obtained by scanning this gas sensor library from 300 to 700 K in different atmospheres (dry air, formaldehyde, carbon monoxide, nitrogen dioxide, toluene and ammonia), giving the OOT of each sensor formulation as a function of the carrier and analyte gases. A comparative study of the temperature-programmed method and a conventional method showed good agreement in measured OOT.
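
    The search for the optimal operating temperature reduces to locating the maximum of a response-versus-temperature curve. A minimal sketch, assuming the common R_air/R_gas response definition for reducing gases on n-type metal-oxide films; the resistance data are invented for illustration:

```python
def optimal_operating_temperature(temps_k, resistance_air, resistance_gas):
    """Return the temperature that maximizes sensor response, here
    defined as R_air / R_gas (a common convention for reducing gases
    on n-type metal-oxide films)."""
    responses = [ra / rg for ra, rg in zip(resistance_air, resistance_gas)]
    best = max(range(len(temps_k)), key=lambda i: responses[i])
    return temps_k[best], responses[best]

temps = [300, 400, 500, 600, 700]            # K, programmed sweep
r_air = [9.0e6, 5.0e6, 2.0e6, 1.0e6, 0.8e6]  # ohms in dry air
r_gas = [8.0e6, 2.5e6, 0.4e6, 0.5e6, 0.7e6]  # ohms in analyte gas
oot, peak_response = optimal_operating_temperature(temps, r_air, r_gas)
```

    Running the same scan for each film in the material library and each carrier/analyte pair yields the full OOT map in a single temperature-programmed sweep.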

  15. [The socio-hygienic monitoring as an integral system for health risk assessment and risk management at the regional level].

    PubMed

    Kuzmin, S V; Gurvich, V B; Dikonskaya, O V; Malykh, O L; Yarushin, S V; Romanov, S V; Kornilkov, A S

    2013-01-01

    The information and analytical framework for introducing health risk assessment and risk management methodologies in the Sverdlovsk Region is the system of socio-hygienic monitoring. Risk management techniques have been developed and proposed that guide the choice of the most cost-effective and efficient actions for improving the sanitary and epidemiologic situation at the level of a region, municipality, or business entity of the Russian Federation. To assess the efficiency of planning and of health risk management activities, common methodological approaches and the economic methods of cost-effectiveness and cost-benefit analysis, provided in method recommendations adopted in the Russian Federation, are applied.
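
    The cost-effectiveness comparisons mentioned above follow a simple arithmetic core. A sketch of the two standard ratios; all figures are hypothetical:

```python
def cost_effectiveness_ratio(cost, effect):
    """Cost per unit of health effect for a single action."""
    return cost / effect

def incremental_cer(cost_new, effect_new, cost_base, effect_base):
    """ICER: extra cost per extra unit of effect when replacing the
    baseline action with the new one -- the quantity actually compared
    when ranking candidate risk-management actions."""
    return (cost_new - cost_base) / (effect_new - effect_base)

# Hypothetical: new action costs 120k and averts 40 cases;
# baseline costs 50k and averts 25 cases.
icer = incremental_cer(120_000, 40, 50_000, 25)
```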

  16. Multilayer limb quasi-static electromagnetic modeling with experiments for Galvanic coupling type IBC.

    PubMed

    Pun, S H; Gao, Y M; Mou, P A; Mak, P U; Vai, M I; Du, M

    2010-01-01

    Intra-body communication (IBC) is a new, emerging, short-range, human-body-based communication methodology. It is a technique for networking devices on the human body by exploiting the conducting properties of human tissues. For the rapidly developing areas of body area networks (BAN) and body sensor networks (BSN), IBC is believed to offer advantages in power consumption, electromagnetic radiation, immunity to external electromagnetic noise, security, and freedom from spectrum restrictions. In this article, the authors propose an improved mathematical model for IBC on a human limb that includes both the electrical properties and the proportions of human tissues. By solving the model analytically for a four-layer system (skin, fat, muscle, and bone) and conducting an in-vivo experiment, a comparison between model and measurement has been made.

  17. A comparison of results of empirical studies of supplementary search techniques and recommendations in review methodology handbooks: a methodological review.

    PubMed

    Cooper, Chris; Booth, Andrew; Britten, Nicky; Garside, Ruth

    2017-11-28

    The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged. Numerous studies have demonstrated their potential in identifying studies or study data that would have been missed by bibliographic database searching alone. What is less certain is how supplementary search methods actually work, how they are applied, and the consequent advantages, disadvantages and resource implications of each search method. The aim of this study is to compare current practice in using supplementary search methods with methodological guidance. Four methodological handbooks informing systematic review practice in the UK were read and audited to establish current methodological guidance. Studies evaluating the use of supplementary search methods were identified by searching five bibliographic databases. Studies were included if they (1) reported practical application of a supplementary search method (descriptive) or (2) examined the utility of a supplementary search method (analytical) or (3) identified/explored factors that impact on the utility of a supplementary method, when applied in practice. Thirty-five studies were included in this review in addition to the four methodological handbooks. Studies were published between 1989 and 2016, and dates of publication of the handbooks ranged from 1994 to 2014. Five supplementary search methods were reviewed: contacting study authors, citation chasing, handsearching, searching trial registers and web searching. There is reasonable consistency between recommended best practice (handbooks) and current practice (methodological studies) as it relates to the application of supplementary search methods. The methodological studies provide useful information on the effectiveness of the supplementary search methods, often seeking to evaluate aspects of the method to improve effectiveness or efficiency. In this way, the studies advance the understanding of the supplementary search methods. Further research is required, however, so that a rational choice can be made about which supplementary search strategies should be used, and when.

  18. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. 
Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
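
    The coupling of a quadratic response surface with Monte Carlo dispersion can be sketched compactly. Everything below, including coefficient values, operating point, and noise level, is an illustrative assumption, not data from the study:

```python
import random

def regression_rate(o_f, pressure, coef):
    """Quadratic response surface with a single two-factor interaction,
    mirroring the 'single and two-factor interaction' form described."""
    b0, b1, b2, b11, b22, b12 = coef
    return (b0 + b1 * o_f + b2 * pressure
            + b11 * o_f**2 + b22 * pressure**2 + b12 * o_f * pressure)

def monte_carlo_dispersion(coef, o_f=2.0, pressure=3.0, sigma=0.05,
                           n=5000, seed=1):
    """Propagate input uncertainty through the surface: perturb the
    operating point with Gaussian noise and collect dispersed rates."""
    rng = random.Random(seed)
    draws = [regression_rate(o_f + rng.gauss(0, sigma),
                             pressure + rng.gauss(0, sigma), coef)
             for _ in range(n)]
    mean = sum(draws) / n
    return mean, min(draws), max(draws)

coef = (0.5, 0.10, 0.05, -0.02, -0.01, 0.03)  # placeholder coefficients
mean_rate, lo, hi = monte_carlo_dispersion(coef)
```

    Sweeping the nominal operating point over the feasible mixture/operating region and keeping the draw with the highest mean rate is then the optimization-under-uncertainty step.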

  19. Combinations of techniques that effectively change health behavior: evidence from Meta-CART analysis.

    PubMed

    Dusseldorp, Elise; van Genugten, Lenneke; van Buuren, Stef; Verheijden, Marieke W; van Empelen, Pepijn

    2014-12-01

    Many health-promoting interventions combine multiple behavior change techniques (BCTs) to maximize effectiveness. Although, in theory, BCTs can amplify each other, the available meta-analyses have not been able to identify specific combinations of techniques that provide synergistic effects. This study overcomes some of the shortcomings in the current methodology by applying classification and regression trees (CART) to meta-analytic data in a special way, referred to as Meta-CART. The aim was to identify particular combinations of BCTs that explain intervention success. A reanalysis of data from Michie, Abraham, Whittington, McAteer, and Gupta (2009) was performed. These data included effect sizes from 122 interventions targeted at physical activity and healthy eating, and the coding of the interventions into 26 BCTs. A CART analysis was performed using the BCTs as predictors and treatment success (i.e., effect size) as outcome. A subgroup meta-analysis using a mixed effects model was performed to compare the treatment effect in the subgroups found by CART. Meta-CART identified the following most effective combinations: Provide information about behavior-health link with Prompt intention formation (mean effect size ḡ = 0.46), and Provide information about behavior-health link with Provide information on consequences and Use of follow-up prompts (ḡ = 0.44). Least effective interventions were those using Provide feedback on performance without using Provide instruction (ḡ = 0.05). Specific combinations of BCTs increase the likelihood of achieving change in health behavior, whereas other combinations decrease this likelihood. Meta-CART successfully identified these combinations and thus provides a viable methodology in the context of meta-analysis.
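
    The core of one CART step applied to meta-analytic data can be sketched in a few lines: greedily choose the BCT indicator whose split most reduces the weighted squared error of the effect sizes. The data and weights below are fabricated for illustration and are not from the Michie et al. dataset:

```python
def best_split(X, g, w):
    """One greedy CART step: pick the binary BCT indicator whose
    presence/absence split gives the largest drop in weighted squared
    error of effect sizes g (weights w, e.g. inverse variances)."""
    def wmean(idx):
        total = sum(w[i] for i in idx)
        return sum(w[i] * g[i] for i in idx) / total

    def wsse(idx):
        m = wmean(idx)
        return sum(w[i] * (g[i] - m) ** 2 for i in idx)

    all_idx = range(len(g))
    base = wsse(all_idx)
    best = None
    for j in range(len(X[0])):
        absent = [i for i in all_idx if X[i][j] == 0]
        present = [i for i in all_idx if X[i][j] == 1]
        if not absent or not present:
            continue
        gain = base - wsse(absent) - wsse(present)
        if best is None or gain > best[1]:
            best = (j, gain, wmean(absent), wmean(present))
    return best

# Rows: interventions; columns: presence of three hypothetical BCTs
X = [[1, 1, 0], [1, 1, 1], [1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0]]
g = [0.45, 0.47, 0.15, 0.12, 0.08, 0.44]   # effect sizes
w = [30, 25, 40, 35, 20, 28]               # precision weights
feature, gain, mean_without, mean_with = best_split(X, g, w)
```

    Applying the step recursively to each resulting subgroup, then running a subgroup meta-analysis in the leaves, is the essence of the Meta-CART procedure.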

  20. Hierarchical Analytical Approaches for Unraveling the Composition of Proprietary Mixtures

    EPA Pesticide Factsheets

    The compositions of commercial mixtures, including pesticide inert ingredients, aircraft deicers, and aqueous film-forming foam (AFFF) formulations, and, by analogy, fracking fluids, are proprietary. Quantitative analytical methodologies can be developed for mixture components only once their identities are known. Because proprietary mixtures may contain both volatile and non-volatile components, a hierarchy of analytical methods is often required for the full identification of all proprietary mixture components.
