Sample records for analytical techniques needed

  1. Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.

    PubMed

    Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs

    2018-01-01

    While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance with European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
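
    A minimal sketch of one standard building block of secure multi-party computation, additive secret sharing, which underlies the kind of privacy-preserving aggregate described above. It illustrates the general principle only, not the protocols used in the paper's pilots; the modulus, party count, and hospital example are invented.

    ```python
    # Additive secret sharing: each input is split into random shares that sum
    # to the secret modulo a public prime; no single share reveals anything.
    import secrets

    P = 2**61 - 1  # public modulus (illustrative choice)

    def share(value, n_parties=3):
        """Split `value` into n_parties additive shares summing to it mod P."""
        shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % P)
        return shares

    def reconstruct(shares):
        return sum(shares) % P

    # Three hospitals secret-share their private patient counts.
    inputs = [120, 345, 78]
    all_shares = [share(v) for v in inputs]

    # Party i locally adds the i-th share of every input; raw values stay hidden.
    local_sums = [sum(s[i] for s in all_shares) % P for i in range(3)]

    # Only the aggregate statistic is ever reconstructed.
    print("joint sum:", reconstruct(local_sums))  # 543
    ```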

  2. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), ³¹P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. ³¹P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  3. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), ³¹P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. ³¹P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  4. Sensor failure detection for jet engines using analytical redundance

    NASA Technical Reports Server (NTRS)

    Merrill, W. C.

    1984-01-01

    Analytical redundant sensor failure detection, isolation and accommodation techniques for gas turbine engines are surveyed. Both the theoretical technology base and demonstrated concepts are discussed. Also included is a discussion of current technology needs and ongoing Government sponsored programs to meet those needs.

  5. Analytical techniques for steroid estrogens in water samples - A review.

    PubMed

    Fang, Ting Yien; Praveena, Sarva Mangala; deBurbure, Claire; Aris, Ahmad Zaharin; Ismail, Sharifah Norkhadijah Syed; Rasdi, Irniza

    2016-12-01

    In recent years, environmental concerns over ultra-trace levels of steroid estrogens in water samples have increased because of their adverse effects on human and animal life. Special attention to the analytical techniques used to quantify steroid estrogens in water samples is therefore increasingly important. The objective of this review was to present an overview of both instrumental and non-instrumental analytical techniques available for the determination of steroid estrogens in water samples, evidencing their respective potential advantages and limitations using the Need, Approach, Benefit, and Competition (NABC) approach. The analytical techniques highlighted in this review were gas chromatography mass spectrometry (GC-MS), liquid chromatography mass spectrometry (LC-MS), enzyme-linked immunosorbent assay (ELISA), radioimmunoassay (RIA), the yeast estrogen screen (YES) assay, and the human breast cancer cell line proliferation (E-screen) assay. The complexity of water samples and their low estrogenic concentrations necessitate the use of highly sensitive instrumental analytical techniques (GC-MS and LC-MS) and non-instrumental analytical techniques (ELISA, RIA, the YES assay, and the E-screen assay) to quantify steroid estrogens. Both instrumental and non-instrumental analytical techniques have their own advantages and limitations. However, the non-instrumental ELISA technique, thanks to its low detection limit, simplicity, rapidity, and cost-effectiveness, currently appears to be the most reliable for determining steroid estrogens in water samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi

    2016-10-01

    The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/ml. However, quantifying analytes that are related to various stages of tumors, including early detection, requires detecting well below the current limit of the ELISA test. For example, Interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml, and the prostate specific antigen level in the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/ml of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumor, it is required to quantify various levels of analytes ranging from ng/ml to pg/ml. To accommodate these critical needs in current diagnosis, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes (...

  7. The contribution of Raman spectroscopy to the analytical quality control of cytotoxic drugs in a hospital environment: eliminating the exposure risks for staff members and their work environment.

    PubMed

    Bourget, Philippe; Amin, Alexandre; Vidal, Fabrice; Merlette, Christophe; Troude, Pénélope; Baillet-Guffroy, Arlette

    2014-08-15

    The purpose of the study was to perform a comparative analysis of the technical performance, respective costs and environmental effect of two invasive analytical methods (HPLC and UV/visible-FTIR) as compared to a new non-invasive analytical technique (Raman spectroscopy). Three pharmacotherapeutic models were used to compare the analytical performances of the three analytical techniques. Statistical inter-method correlation analysis was performed using non-parametric rank correlation tests. The study's economic component combined calculations relative to the depreciation of the equipment and the estimated cost of an AQC unit of work. In all cases, the analytical validation parameters of the three techniques were satisfactory, and strong correlations between the two spectroscopic techniques vs. HPLC were found. In addition, Raman spectroscopy was found to be superior to the other techniques on numerous key criteria, including complete safety for operators and their occupational environment, a non-invasive procedure, no need for consumables, and a low operating cost. Finally, Raman spectroscopy appears superior for technical, economic and environmental objectives, as compared with the other, invasive analytical methods. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty: the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally, several examples of the application of emulation techniques are described.

  9. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
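
    The "pluggable analytics" requirement above lends itself to a small illustration: analytics of different scope (global/incremental vs. rolling-window) can share one update interface so the front end can swap them freely. A hedged sketch; the class and registry names are invented, not part of the paper's reference architecture.

    ```python
    # Two stream analytics with different scopes behind a common interface.
    from collections import deque

    class IncrementalMean:
        """Global scope: running mean over the entire stream."""
        def __init__(self):
            self.n, self.total = 0, 0.0
        def update(self, x):
            self.n += 1
            self.total += x
            return self.total / self.n

    class RollingMean:
        """Rolling-window scope: mean over the last `size` items."""
        def __init__(self, size=100):
            self.window = deque(maxlen=size)
        def update(self, x):
            self.window.append(x)
            return sum(self.window) / len(self.window)

    ANALYTICS = {"global_mean": IncrementalMean, "rolling_mean": RollingMean}

    def run(stream, name, **kwargs):
        analytic = ANALYTICS[name](**kwargs)   # pluggable selection
        for x in stream:
            yield analytic.update(x)           # each result feeds a visualization

    print(list(run([1, 2, 3, 4], "rolling_mean", size=2)))  # [1.0, 1.5, 2.5, 3.5]
    ```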

  10. The forensic validity of visual analytics

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.

    2008-01-01

    The wider use of visualization and visual analytics in wide-ranging fields has led to the need for visual analytics capabilities to be legally admissible, especially when applied to digital forensics. This brings the need to consider legal implications when performing visual analytics, an issue not traditionally examined in visualization and visual analytics techniques and research. While digital data is generally admissible under the Federal Rules of Evidence [10][21], a comprehensive validation of the digital evidence is considered prudent. A comprehensive validation requires validation of the digital data under rules for authentication, hearsay, the best evidence rule, and privilege. Additional issues with digital data arise when exploring digital data related to admissibility and the validity of what information was examined, to what extent, and whether the analysis process was sufficiently covered by a search warrant. For instance, a search warrant generally covers very narrow requirements as to what law enforcement is allowed to examine and acquire during an investigation. When searching a hard drive for child pornography, how admissible is evidence of an unrelated crime, e.g. drug dealing? This is further complicated by the concept of "in plain view": when performing an analysis of a hard drive, what would be considered "in plain view"? The purpose of this paper is to discuss the issues of digital forensics and the related issues as they apply to visual analytics; identify how visual analytics techniques fit into the digital forensics analysis process; discuss how visual analytics techniques can improve the legal admissibility of digital data; and identify what research is needed to further improve this process. The goal of this paper is to open up consideration of legal ramifications among the visualization community; the author is not a lawyer, and the discussions are not meant to be inclusive of all differences in laws between states and countries.

  11. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, which are examined towards matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  12. A multi-analyte biosensor for the simultaneous label-free detection of pathogens and biomarkers in point-of-need animal testing.

    PubMed

    Ewald, Melanie; Fechner, Peter; Gauglitz, Günter

    2015-05-01

    For the first time, a multi-analyte biosensor platform has been developed using the label-free 1-lambda-reflectometry technique. This platform is the first that performs multi-analyte measurements without using imaging techniques. It is designed to be portable and cost-effective and therefore allows for point-of-need testing or on-site field testing, with possible applications in diagnostics. This work highlights the application possibilities of this platform in the field of animal testing, but is also relevant and transferable to human diagnostics. The performance of the platform has been evaluated using relevant reference systems such as a biomarker (C-reactive protein) and serology (anti-Salmonella antibodies), as well as a panel of real samples (animal sera). A comparison of the working range and limit of detection shows no loss of performance in transferring the separate assays to the multi-analyte setup. Moreover, the new multi-analyte platform allows for discrimination between sera of animals infected with different Salmonella subtypes.

  13. Predictive modeling of complications.

    PubMed

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  14. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  15. A graph algebra for scalable visual analytics.

    PubMed

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
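
    The two atomic operators named in the abstract, selection and aggregation, can be sketched over a node-attributed graph. A toy illustration under assumed semantics (induced subgraph for selection; attribute-grouped supernodes with inter-group edge counts for aggregation); the paper's algebra is richer.

    ```python
    import networkx as nx

    def select(G, predicate):
        """Selection: induced subgraph on nodes whose attributes satisfy a predicate."""
        return G.subgraph(n for n, d in G.nodes(data=True) if predicate(d)).copy()

    def aggregate(G, key):
        """Aggregation: collapse nodes sharing attribute `key` into supernodes;
        edge weights count the collapsed inter-group edges."""
        H = nx.Graph()
        group = {n: d[key] for n, d in G.nodes(data=True)}
        for u, v in G.edges():
            gu, gv = group[u], group[v]
            if gu != gv:
                w = H.get_edge_data(gu, gv, {"weight": 0})["weight"]
                H.add_edge(gu, gv, weight=w + 1)
        return H

    G = nx.Graph()
    G.add_nodes_from([(1, {"dept": "A"}), (2, {"dept": "A"}),
                      (3, {"dept": "B"}), (4, {"dept": "B"})])
    G.add_edges_from([(1, 2), (1, 3), (2, 4), (3, 4)])

    print(select(G, lambda d: d["dept"] == "A").edges())  # the A-A edge (1, 2)
    print(aggregate(G, "dept").edges(data=True))          # ('A', 'B', weight=2)
    ```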

  16. The Analytical Bibliographer and the Conservator.

    ERIC Educational Resources Information Center

    Koda, Paul S.

    1979-01-01

    Discusses areas where the work of analytical bibliographers and conservators overlaps and diverges in relation to their techniques and inquiries in handling physical books. Special attention is paid to their attitudes to physical details, the ways they record information, ethical questions, and the need for a common language. (Author)

  17. User-Centered Evaluation of Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean C.

    Visual analytics systems are becoming very popular. More domains now use interactive visualizations to analyze the ever-increasing amount and heterogeneity of data. More novel visualizations are being developed for more tasks and users. We need to ensure that these systems can be evaluated to determine that they are both useful and usable. A user-centered evaluation for visual analytics needs to be developed for these systems. While many of the typical human-computer interaction (HCI) evaluation methodologies can be applied as is, others will need modification. Additionally, new functionality in visual analytics systems needs new evaluation methodologies. There is a difference between usability evaluations and user-centered evaluations. Usability looks at the efficiency, effectiveness, and user satisfaction of users carrying out tasks with software applications. User-centered evaluation looks more specifically at the utility provided to the users by the software. This is reflected in the evaluations done and in the metrics used. In the visual analytics domain this is very challenging, as users are most likely experts in a particular domain, the tasks they do are often not well defined, the software they use needs to support large amounts of different kinds of data, and often the tasks last for months. These difficulties are discussed further in the section on user-centered evaluation. Our goal is to provide a discussion of user-centered evaluation practices for visual analytics, including existing practices that can be carried out and new methodologies and metrics that need to be developed and agreed upon by the visual analytics community. The material provided here should be of use for both researchers and practitioners in the field of visual analytics. Researchers and practitioners in HCI who are interested in visual analytics will find this information useful as well, along with a discussion of changes that need to be made to current HCI practices to make them more suitable to visual analytics. A history of analysis and analysis techniques and problems is provided, as well as an introduction to user-centered evaluation and various evaluation techniques for readers from different disciplines. The understanding of these techniques is imperative if we wish to support analysis in the visual analytics software we develop. Currently the evaluations that are conducted and published for visual analytics software are very informal and consist mainly of comments from users or potential users. Our goal is to help researchers in visual analytics conduct more formal user-centered evaluations. While these are time-consuming and expensive to carry out, the outcomes of these studies will have a defining impact on the field of visual analytics and help point the direction for future features and visualizations to incorporate. While many researchers view user-centered evaluation as a less-than-exciting area in which to work, the opposite is true. First of all, the goal of user-centered evaluation is to help visual analytics software developers, researchers, and designers improve their solutions and discover creative ways to better accommodate their users. Working with the users is extremely rewarding as well. While we use the term "users," in almost all situations there is a wide variety of users that all need to be accommodated. Moreover, the domains that use visual analytics are varied and expanding. Just understanding the complexities of a number of these domains is exciting. Researchers are trying out different visualizations and interactions as well. And of course, the size and variety of data are expanding rapidly. User-centered evaluation in this context is rapidly changing. There are no standard processes and metrics, and thus those of us working on user-centered evaluation must be creative in our work with both the users and with the researchers and developers.

  18. DEVELOPMENT AND VALIDATION OF ANALYTICAL METHODS FOR ENUMERATION OF FECAL INDICATORS AND EMERGING CHEMICAL CONTAMINANTS IN BIOSOLIDS

    EPA Science Inventory

    In 2002 the National Research Council (NRC) issued a report which identified a number of issues regarding biosolids land application practices and pointed out the need for improved and validated analytical techniques for regulated indicator organisms and pathogens. They also call...

  19. Introducing Chemometrics to the Analytical Curriculum: Combining Theory and Lab Experience

    ERIC Educational Resources Information Center

    Gilbert, Michael K.; Luttrell, Robert D.; Stout, David; Vogt, Frank

    2008-01-01

    Beer's law is an ideal technique that works only in certain situations. A method for dealing with more complex conditions needs to be integrated into the analytical chemistry curriculum. For that reason, the capabilities and limitations of two common chemometric algorithms, classical least squares (CLS) and principal component regression (PCR),…
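
    Since the abstract contrasts CLS and PCR, a compact numerical sketch may help: CLS needs the pure-component spectra, while PCR calibrates from mixture spectra alone, which is what makes it usable when simple Beer's law conditions break down. The synthetic spectra and all values below are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    wl = np.linspace(0, 1, 50)
    S = np.vstack([np.exp(-((wl - 0.3) / 0.05) ** 2),   # pure spectrum, component 1
                   np.exp(-((wl - 0.7) / 0.05) ** 2)])  # pure spectrum, component 2

    C_cal = rng.uniform(0, 1, (20, 2))                  # calibration concentrations
    A_cal = C_cal @ S + rng.normal(0, 0.01, (20, 50))   # mixtures (Beer's law + noise)
    a_new = np.array([0.4, 0.8]) @ S                    # "unknown" sample

    # CLS: solve a = c @ S for c, which requires knowing the pure spectra S.
    c_cls, *_ = np.linalg.lstsq(S.T, a_new, rcond=None)

    # PCR: project centered spectra onto 2 principal components, then regress.
    A_mean, C_mean = A_cal.mean(axis=0), C_cal.mean(axis=0)
    _, _, Vt = np.linalg.svd(A_cal - A_mean, full_matrices=False)
    T = (A_cal - A_mean) @ Vt[:2].T                     # scores
    B, *_ = np.linalg.lstsq(T, C_cal - C_mean, rcond=None)
    c_pcr = ((a_new - A_mean) @ Vt[:2].T) @ B + C_mean

    print("CLS:", c_cls.round(2), "PCR:", c_pcr.round(2))  # both near [0.4, 0.8]
    ```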

  20. Identification and confirmation of chemical residues by chromatography-mass spectrometry and other techniques

    USDA-ARS?s Scientific Manuscript database

    A quantitative answer cannot exist in an analysis without a qualitative component to give enough confidence that the result meets the analytical needs for the analysis (i.e. the result relates to the analyte and not something else). Just as a quantitative method must typically undergo an empirical ...

  1. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally, an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  2. On Establishing Big Data Wave Breakwaters with Analytics (Invited)

    NASA Astrophysics Data System (ADS)

    Riedel, M.

    2013-12-01

    The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, reaching from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the more traditional relational databases. More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities, and offers different views of how approaches to correlation and causality offer complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.

  3. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that encompass the science domain and data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from an "advancing science" point of view: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA Cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the science eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.

  4. Geophysical technique for mineral exploration and discrimination based on electromagnetic methods and associated systems

    DOEpatents

    Zhdanov, Michael S. [Salt Lake City, UT]

    2008-01-29

    Mineral exploration needs a reliable method to distinguish between uneconomic mineral deposits and economic mineralization. A method and system includes a geophysical technique for subsurface material characterization, mineral exploration and mineral discrimination. The technique introduced in this invention detects induced polarization effects in electromagnetic data and uses remote geophysical observations to determine the parameters of an effective conductivity relaxation model using a composite analytical multi-phase model of the rock formations. The conductivity relaxation model and analytical model can be used to determine parameters related by analytical expressions to the physical characteristics of the microstructure of the rocks and minerals. These parameters are ultimately used for the discrimination of different components in underground formations, and in this way provide an ability to distinguish between uneconomic mineral deposits and zones of economic mineralization using geophysical remote sensing technology.
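
    For concreteness, the induced-polarization relaxation behavior this patent exploits is often described in the literature with the Cole-Cole model; a hedged sketch follows with illustrative parameter values (the patent's composite multi-phase conductivity model is more elaborate).

    ```python
    import numpy as np

    def cole_cole_resistivity(freq, rho0=100.0, m=0.2, tau=0.1, c=0.5):
        """Cole-Cole complex resistivity: rho0 = DC resistivity (ohm*m),
        m = chargeability, tau = relaxation time (s), c = frequency exponent."""
        iwt = (1j * 2 * np.pi * freq * tau) ** c
        return rho0 * (1 - m * (1 - 1 / (1 + iwt)))

    for f in np.logspace(-2, 4, 7):   # 0.01 Hz .. 10 kHz
        r = cole_cole_resistivity(f)
        # The phase peak near 1/(2*pi*tau) is the IP signature used to help
        # discriminate mineralization from barren rock.
        print(f"{f:10.2f} Hz  |rho| = {abs(r):7.2f}  phase = {np.degrees(np.angle(r)):7.3f} deg")
    ```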

  5. Precise Heat Control: What Every Scientist Needs to Know About Pyrolytic Techniques to Solve Real Problems

    NASA Technical Reports Server (NTRS)

    Devivar, Rodrigo

    2014-01-01

    The performance of a material is greatly influenced by its thermal and chemical properties. Analytical pyrolysis, when coupled to a GC-MS system, is a powerful technique that can unlock the thermal and chemical properties of almost any substance and provide vital information. At NASA, we depend on precise thermal analysis instrumentation for understanding aerospace travel. Our analytical techniques allow us to test materials in the laboratory prior to an actual field test; whether the field test is miles up in the sky or miles underground, the properties of any involved material must be fully studied and understood in the laboratory.

  6. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management

    PubMed Central

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-01-01

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs. PMID:27983713

  7. Problem Formulation in Knowledge Discovery via Data Analytics (KDDA) for Environmental Risk Management.

    PubMed

    Li, Yan; Thomas, Manoj; Osei-Bryson, Kweku-Muata; Levy, Jason

    2016-12-15

    With the growing popularity of data analytics and data science in the field of environmental risk management, a formalized Knowledge Discovery via Data Analytics (KDDA) process that incorporates all applicable analytical techniques for a specific environmental risk management problem is essential. In this emerging field, there is limited research dealing with the use of decision support to elicit environmental risk management (ERM) objectives and identify analytical goals from ERM decision makers. In this paper, we address problem formulation in the ERM understanding phase of the KDDA process. We build a DM³ ontology to capture ERM objectives and to infer analytical goals and associated analytical techniques. A framework to assist decision making in the problem formulation process is developed. It is shown how the ontology-based knowledge system can provide structured guidance to retrieve relevant knowledge during problem formulation. The importance of not only operationalizing the KDDA approach in a real-world environment but also evaluating the effectiveness of the proposed procedure is emphasized. We demonstrate how ontology inferencing may be used to discover analytical goals and techniques by conceptualizing Hazardous Air Pollutants (HAPs) exposure shifts based on a multilevel analysis of the level of urbanization (and related economic activity) and the degree of Socio-Economic Deprivation (SED) at the local neighborhood level. The HAPs case highlights not only the role of complexity in problem formulation but also the need for integrating data from multiple sources and the importance of employing appropriate KDDA modeling techniques. Challenges and opportunities for KDDA are summarized with an emphasis on environmental risk management and HAPs.
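
    The ontology-driven inference the paper describes can be caricatured as a rule chain from ERM objective to analytical goal to candidate techniques. The mappings below are invented placeholders, not the DM³ ontology's actual content; they only illustrate the kind of structured guidance meant.

    ```python
    # Toy "objective -> goal -> techniques" inference chain.
    OBJECTIVE_TO_GOAL = {
        "estimate HAPs exposure shift": "prediction",
        "rank neighborhoods by SED risk": "ranking",
    }
    GOAL_TO_TECHNIQUES = {
        "prediction": ["multilevel regression", "gradient boosting"],
        "ranking": ["multi-criteria decision analysis"],
    }

    def infer(objective):
        goal = OBJECTIVE_TO_GOAL[objective]
        return goal, GOAL_TO_TECHNIQUES[goal]

    goal, techniques = infer("estimate HAPs exposure shift")
    print(f"analytical goal: {goal}; candidate techniques: {techniques}")
    ```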

  8. An analytical-numerical approach for parameter determination of a five-parameter single-diode model of photovoltaic cells and modules

    NASA Astrophysics Data System (ADS)

    Hejri, Mohammad; Mokhtari, Hossein; Azizian, Mohammad Reza; Söder, Lennart

    2016-04-01

    Parameter extraction for the five-parameter single-diode model of solar cells and modules from experimental data is a challenging problem. These parameters are evaluated from a set of nonlinear equations that cannot be solved analytically. On the other hand, a numerical solution of such equations needs a suitable initial guess to converge to a solution. This paper presents a new set of approximate analytical solutions for the parameters of a five-parameter single-diode model of photovoltaic (PV) cells and modules. The proposed solutions provide a good initial point, which guarantees convergence of the numerical analysis. The proposed technique needs only a few data points from the PV current-voltage characteristics, i.e. the open circuit voltage Voc, the short circuit current Isc, and the maximum power point current and voltage Im, Vm, making it a fast and low-cost parameter determination technique. The accuracy of the presented theoretical I-V curves is verified by experimental data.
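
    To make the role of the analytical initial guess concrete, here is a sketch of the implicit five-parameter single-diode equation with common textbook-style starting estimates; these closed forms are assumptions for illustration, not the paper's derived expressions, and the module values are invented.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    # Datasheet-style inputs for an illustrative 36-cell module
    Voc, Isc, Vm, Im = 21.1, 3.8, 17.1, 3.5
    Vt = 0.0257 * 36               # thermal voltage x cells in series, ~25 C
    n, Rs, Rsh = 1.3, 0.3, 300.0   # initial guesses to be refined numerically

    # Simple analytical starting points: photocurrent from Isc, I0 from Voc
    Iph = Isc * (1 + Rs / Rsh)
    I0 = (Iph - Voc / Rsh) / np.expm1(Voc / (n * Vt))

    def current(V):
        """Solve I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh for I."""
        f = lambda I: Iph - I0 * np.expm1((V + I * Rs) / (n * Vt)) - (V + I * Rs) / Rsh - I
        return brentq(f, -Isc, 2 * Isc)

    print("I(0)   ~ Isc:", round(current(0.0), 3))
    print("I(Voc) ~ 0  :", round(current(Voc), 5))
    print("I(Vm):", round(current(Vm), 3), "- refine n, Rs, Rsh until this matches Im")
    ```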

  9. Computing sensitivity and selectivity in parallel factor analysis and related multiway techniques: the need for further developments in net analyte signal theory.

    PubMed

    Olivieri, Alejandro C

    2005-08-01

    Sensitivity and selectivity are important figures of merit in multiway analysis, regularly employed for comparison of the analytical performance of methods and for experimental design and planning. They are especially interesting in the second-order advantage scenario, where the latter property allows for the analysis of samples with a complex background, permitting analyte determination even in the presence of unsuspected interferences. Since no general theory exists for estimating the multiway sensitivity, Monte Carlo numerical calculations have been developed for estimating variance inflation factors, as a convenient way of assessing both sensitivity and selectivity parameters for the popular parallel factor (PARAFAC) analysis and also for related multiway techniques. When the second-order advantage is achieved, the existing expressions derived from net analyte signal theory are only able to adequately cover cases where a single analyte is calibrated using second-order instrumental data. However, they fail for certain multianalyte cases, or when third-order data are employed, calling for an extension of net analyte theory. The results have strong implications in the planning of multiway analytical experiments.
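
    In the spirit of the Monte Carlo noise-propagation approach described above, sensitivity can be estimated for a toy second-order (bilinear) system by repeatedly perturbing the data and watching how the concentration estimate scatters. The profiles, noise level, and simple known-profile estimator are all invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    sx = np.exp(-((np.arange(30) - 10) / 4) ** 2)   # mode-1 profile (e.g. excitation)
    sy = np.exp(-((np.arange(40) - 25) / 6) ** 2)   # mode-2 profile (e.g. emission)
    D = np.outer(sx, sy)                            # unit-concentration signal
    c_true, sigma = 1.0, 0.01

    preds = []
    for _ in range(2000):
        X = c_true * D + rng.normal(0, sigma, D.shape)
        preds.append((X * D).sum() / (D * D).sum())  # least-squares estimate

    # Noise-propagation sensitivity: instrumental noise per unit of
    # concentration uncertainty.
    print("Monte Carlo SEN:", sigma / np.std(preds))
    print("expected value :", np.sqrt((D * D).sum()))  # ||D||_F in this simple case
    ```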

  10. Advanced analytical electron microscopy for alkali-ion batteries

    DOE PAGES

    Qian, Danna; Ma, Cheng; Meng, Ying Shirley; ...

    2015-06-26

    Lithium-ion batteries are a leading candidate for electric vehicle and smart grid applications. However, further optimizations of the energy/power density, coulombic efficiency and cycle life are still needed, and this requires a thorough understanding of the dynamic evolution of each component and their synergistic behaviors during battery operation. With the capability of resolving the structure and chemistry at an atomic resolution, advanced analytical transmission electron microscopy (AEM) is an ideal technique for this task. The present review paper focuses on recent contributions of this important technique to the fundamental understanding of the electrochemical processes of battery materials. A detailed review of both static (ex situ) and real-time (in situ) studies will be given, and issues that still need to be addressed will be discussed.

  11. Analytical Plan for Roman Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strachan, Denis M.; Buck, Edgar C.; Mueller, Karl T.

    Roman glasses that have been in the sea or underground for about 1800 years can serve as the independent "experiment" that is needed for validation of codes and models that are used in performance assessment. Two sets of Roman-era glasses have been obtained for this purpose. One set comes from the sunken vessel the Iulia Felix; the second from recently excavated glasses from a Roman villa in Aquileia, Italy. The specimens contain glass artifacts and attached sediment or soil. In the case of the Iulia Felix glasses, quite a lot of analytical work has been completed at the University of Padova, but from an archaeological perspective. The glasses from Aquileia have not been so carefully analyzed, but they are similar to other Roman glasses. Both glass and sediment or soil need to be analyzed and are the subject of this analytical plan. The glasses need to be analyzed with the goal of validating the model used to describe glass dissolution. The sediment and soil need to be analyzed to determine the profile of elements released from the glass. This latter need represents a significant analytical challenge because of the trace quantities that need to be analyzed. Both pieces of information will yield important information useful in the validation of the glass dissolution model and the chemical transport code(s) used to determine the migration of elements once released from the glass. In this plan, we outline the analytical techniques that should be useful in obtaining the needed information and suggest a useful starting point for this analytical effort.

  12. Analytical challenges for conducting rapid metabolism characterization for QIVIVE.

    PubMed

    Tolonen, Ari; Pelkonen, Olavi

    2015-06-05

    For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is a tool of choice for the majority of relatively lipophilic organic molecules, linked with an LC separation tool and simultaneous UV detection. However, additional techniques such as gas chromatography, radiometric measurements and NMR are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, there are some partially opposing needs: detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and a throughput capacity of the in vitro set-up and the analytical tool as high as possible. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, but simultaneously attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  13. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has been developed over several decades to ensure patients' safety, moving from a statistical quality control focus on the analytical phase to total laboratory processes. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million and sigma metric equations. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been shown by our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
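
    The sigma metric equation the abstract refers to is short enough to show directly; the TEa, bias, and CV values below are illustrative, and the 1.5-sigma shift used to convert sigma to defects per million is a common industrial convention rather than anything specific to this paper.

    ```python
    import math

    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """sigma = (TEa - |bias|) / CV, all expressed in percent."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    def defects_per_million(sigma, shift=1.5):
        """One-sided defect rate per million opportunities, using the
        conventional 1.5-sigma long-term shift."""
        return 0.5 * math.erfc((sigma - shift) / math.sqrt(2)) * 1e6

    s = sigma_metric(tea_pct=10.0, bias_pct=2.0, cv_pct=2.0)       # -> 4.0
    print(f"sigma = {s:.1f}, DPM ~ {defects_per_million(s):.0f}")  # ~6210
    ```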

  14. Characterization of carrier erythrocytes for biosensing applications

    NASA Astrophysics Data System (ADS)

    Bustamante López, Sandra C.; Meissner, Kenith E.

    2017-09-01

    Erythrocyte abundance, mobility, and carrying capacity make them attractive as a platform for blood analyte sensing as well as for drug delivery. Sensor-loaded erythrocytes, dubbed erythrosensors, could be reinfused into the bloodstream, excited noninvasively through the skin, and used to provide measurement of analyte levels in the bloodstream. Several techniques to load erythrocytes, thus creating carrier erythrocytes, exist. However, their cellular characteristics remain largely unstudied. Changes in cellular characteristics lead to removal from the bloodstream. We hypothesize that erythrosensors need to maintain native erythrocytes' (NEs) characteristics to serve as a long-term sensing platform. Here, we investigate two loading techniques and the properties of the resulting erythrosensors. For loading, hypotonic dilution requires a hypotonic solution while electroporation relies on electrical pulses to perforate the erythrocyte membrane. We analyze the resulting erythrosensor signal, size, morphology, and hemoglobin content. Although the resulting erythrosensors exhibit morphological changes, their size was comparable with NEs. The hypotonic dilution technique was found to load erythrosensors much more efficiently than electroporation, and the sensors were loaded throughout the volume of the erythrosensors. Finally, both techniques resulted in significant loss of hemoglobin. This study points to the need for continued development of loading techniques that better preserve NE characteristics.

  15. Modern analytical methods for the detection of food fraud and adulteration by food category.

    PubMed

    Hong, Eunyoung; Lee, Sang Yoo; Jeong, Jae Yun; Park, Jung Min; Kim, Byung Hee; Kwon, Kisung; Chun, Hyang Sook

    2017-09-01

    This review provides current information on the analytical methods used to identify food adulteration in the six most adulterated food categories: animal origin and seafood, oils and fats, beverages, spices and sweet foods (e.g. honey), grain-based food, and others (organic food and dietary supplements). The analytical techniques (both conventional and emerging) used to identify adulteration in these six food categories involve sensory, physicochemical, DNA-based, chromatographic and spectroscopic methods, and have been combined with chemometrics, making these techniques more convenient and effective for the analysis of a broad variety of food products. Despite recent advances, the need remains for suitably sensitive and widely applicable methodologies that encompass all the various aspects of food adulteration. © 2017 Society of Chemical Industry.

  16. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    NASA Technical Reports Server (NTRS)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Because of the lack of published material on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available and still needed to support ESDA.

  17. Status and Needs Research for On-line Monitoring of VOCs Emissions from Stationary Sources

    NASA Astrophysics Data System (ADS)

    Zhou, Gang; Wang, Qiang; Zhong, Qi; Zhao, Jinbao; Yang, Kai

    2018-01-01

    Based on the atmospheric volatile organic compound (VOC) pollution control requirements of the twelfth five-year plan and the current status of monitoring and management at home and abroad, the instrumental architecture and technical characteristics of continuous emission monitoring systems (CEMS) for VOC emissions from stationary sources are investigated. Technological development needs for on-line monitoring of VOC emissions from stationary sources in China are proposed, covering system sampling and pretreatment technology as well as analytical measurement techniques.

  18. [Composition of chicken and quail eggs].

    PubMed

    Closa, S J; Marchesich, C; Cabrera, M; Morales, J C

    1999-06-01

    Qualified food composition data on lipids are needed to evaluate intakes as a risk factor in the development of heart disease. The proximal composition, cholesterol, and fatty acid content of chicken and quail eggs, as usually consumed or traded, were analysed. Proximal composition was determined using specific AOAC (1984) techniques; lipids were extracted by a modified Folch technique, and cholesterol and fatty acids were determined by gas chromatography. The results corroborate the stability of egg composition. The cholesterol content of quail eggs is similar to that of chicken eggs, but it is almost half the value registered in Handbook 8. Differences may be attributed to the analytical methodology used to obtain them. This study provides data obtained with up-to-date analytical techniques, plus accessory information useful for food composition tables.

  19. Comparison of three chromatographic techniques for the detection of mitragynine and other indole and oxindole alkaloids in mitragyna speciosa (Kratom) plants

    USDA-ARS?s Scientific Manuscript database

    Leaves of the Southeast Asian plant Mitragyna speciosa (kratom) are used to suppress pain and mitigate opioid withdrawal syndromes. The potential threat of abuse and ready availability of this uncontrolled psychoactive plant material in the U.S. have led to the need for improved analytical technique...

  20. Techniques of Differentiation and Integration, Mathematics (Experimental): 5297.27.

    ERIC Educational Resources Information Center

    Forrester, Gary B.

    This guidebook on minimum course content was designed for students who have mastered the skills and concepts of analytic geometry. It is a short course in the basic techniques of calculus recommended for the student who has need of these skills in other courses such as beginning physics, economics or statistics. The course does not intend to teach…

  1. Analytical reverse time migration: An innovation in imaging of infrastructures using ultrasonic shear waves.

    PubMed

    Asadollahi, Aziz; Khazanovich, Lev

    2018-04-11

    The emergence of ultrasonic dry point contact (DPC) transducers that emit horizontal shear waves has enabled efficient collection of high-quality data in the context of a nondestructive evaluation of concrete structures. This offers an opportunity to improve the quality of evaluation by adapting advanced imaging techniques. Reverse time migration (RTM) is a simulation-based reconstruction technique that offers advantages over conventional methods, such as the synthetic aperture focusing technique. RTM is capable of imaging boundaries and interfaces with steep slopes and the bottom boundaries of inclusions and defects. However, this imaging technique requires a massive amount of memory and its computation cost is high. In this study, both bottlenecks of the RTM are resolved when shear transducers are used for data acquisition. An analytical approach was developed to obtain the source and receiver wavefields needed for imaging using reverse time migration. It is shown that the proposed analytical approach not only eliminates the high memory demand, but also drastically reduces the computation time from days to minutes. Copyright © 2018 Elsevier B.V. All rights reserved.
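
    At the heart of RTM is the zero-lag cross-correlation imaging condition, which is easy to show in isolation: the image is the time sum of the source wavefield and the back-propagated receiver wavefield at each point. The arrays below are random stand-ins; the paper's contribution is obtaining these wavefields analytically for shear-wave DPC transducers rather than by costly simulation.

    ```python
    import numpy as np

    nx, nz, nt = 64, 64, 200
    rng = np.random.default_rng(2)
    src = rng.normal(size=(nt, nz, nx))   # S(t, z, x): forward-propagated source
    rcv = rng.normal(size=(nt, nz, nx))   # R(t, z, x): back-propagated receivers

    # Imaging condition: I(z, x) = sum_t S(t, z, x) * R(t, z, x)
    image = np.einsum("tzx,tzx->zx", src, rcv)
    print(image.shape)  # (64, 64); peaks mark reflectors where the fields coincide
    ```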

  2. Practical guidelines for the characterization and quality control of pure drug nanoparticles and nano-cocrystals in the pharmaceutical industry.

    PubMed

    Peltonen, Leena

    2018-06-16

    The number of poorly soluble drug candidates is increasing, and this is reflected in the research interest in drug nanoparticles and (nano-)cocrystals; improved solubility is the most important application of these nanosystems. In order to confirm the functionality of these nanoparticles throughout their lifecycle, the repeatability of the formulation processes, the performance of the formed systems in a pre-determined way, and system stability, a thorough physicochemical understanding with the aid of the necessary analytical techniques is needed. Even very minor deviations in, for example, particle size or size distribution at the nanoscale can alter the product bioavailability, and the effect is even more dramatic with the smallest particle size fractions. Small particle size also sets special requirements for the analytical techniques. In this review, the most important physicochemical properties of drug nanocrystals and nano-cocrystals are presented, and suitable analytical techniques, with their pros and cons, are described with extra emphasis on the practical point of view. Copyright © 2018. Published by Elsevier B.V.

  3. AI based HealthCare Platform for Real Time, Predictive and Prescriptive Analytics using Reactive Programming

    NASA Astrophysics Data System (ADS)

    Kaur, Jagreet; Singh Mann, Kulwinder, Dr.

    2018-01-01

    AI in healthcare is needed to bring real, actionable, individualized insights in real time to patients and doctors to support treatment decisions. We need a patient-centred platform for integrating EHR data, patient data, prescriptions, monitoring, and clinical research data. This paper proposes a generic architecture for enabling an AI-based healthcare analytics platform using the open-source technologies Apache Beam, Apache Flink, Apache Spark, Apache NiFi, Kafka, Tachyon, GlusterFS, and the NoSQL stores Elasticsearch and Cassandra. This paper will show the importance of applying AI-based predictive and prescriptive analytics techniques in the health sector. The system will be able to extract useful knowledge that helps in decision making and medical monitoring in real time through intelligent process analysis and big data processing.

  4. Rethinking Visual Analytics for Streaming Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crouser, R. Jordan; Franklin, Lyndsey; Cook, Kris

    In the age of data science, the use of interactive information visualization techniques has become increasingly ubiquitous. From online scientific journals to the New York Times graphics desk, the utility of interactive visualization for both storytelling and analysis has become ever more apparent. As these techniques have become more readily accessible, the appeal of combining interactive visualization with computational analysis continues to grow. Arising out of a need for scalable, human-driven analysis, the primary objective of visual analytics systems is to capitalize on the complementary strengths of human and machine analysis, using interactive visualization as a medium for communication between the two. These systems leverage developments from the fields of information visualization, computer graphics, machine learning, and human-computer interaction to support insight generation in areas where purely computational analyses fall short. Over the past decade, visual analytics systems have generated remarkable advances in many historically challenging analytical contexts. These include areas such as modeling political systems [Crouser et al. 2012], detecting financial fraud [Chang et al. 2008], and cybersecurity [Harrison et al. 2012]. In each of these contexts, domain expertise and human intuition are a necessary component of the analysis. This intuition is essential to building trust in the analytical products, as well as supporting the translation of evidence into actionable insight. In addition, each of these examples also highlights the need for scalable analysis. In each case, it is infeasible for a human analyst to manually assess the raw information unaided, and the communication overhead to divide the task between a large number of analysts makes simple parallelism intractable. Regardless of the domain, visual analytics tools strive to optimize the allocation of human analytical resources, and to streamline the sensemaking process on data that is massive, complex, incomplete, and uncertain in scenarios requiring human judgment.

  5. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Astrophysics Data System (ADS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-05-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences given are based on a survey of use cases, also providing detailed insights into a few of them.

  6. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    NASA Technical Reports Server (NTRS)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution will outline initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences given are based on a survey of use cases, also providing detailed insights into a few of them.

  7. SOIL AND SEDIMENT SAMPLING METHODS

    EPA Science Inventory

    The EPA Office of Solid Waste and Emergency Response's (OSWER) Office of Superfund Remediation and Technology Innovation (OSRTI) needs innovative methods and techniques to solve new and difficult sampling and analytical problems found at the numerous Superfund sites throughout th...

  8. Instrumentation for analytical scale supercritical fluid chromatography.

    PubMed

    Berger, Terry A

    2015-11-20

    Analytical scale supercritical fluid chromatography (SFC) is largely a sub-discipline of high performance liquid chromatography (HPLC), in that most of the hardware and software can be used for either technique. The aspects that separate the two techniques stem from the use of carbon dioxide (CO2) as the main component of the mobile phase in SFC. The high compressibility and low viscosity of CO2 mean that pumps and autosamplers designed for HPLC either need to be modified or an alternate means of dealing with compressibility needs to be found. The inclusion of a back pressure regulator and a high pressure flow cell for any UV-Vis detector are also necessary. Details of the various approaches, problems and solutions are described. Characteristics such as adiabatic vs. isothermal compressibility, thermal gradients, and refractive index issues are dealt with in detail. Copyright © 2015 Elsevier B.V. All rights reserved.
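
    For reference, the isothermal and adiabatic compressibilities contrasted in this abstract are the standard thermodynamic quantities (general definitions, not formulas from this paper); for CO2 near its critical region both are large and strongly pressure-dependent, which is why HPLC pumps need modification:

    ```latex
    \kappa_T = -\frac{1}{V}\left(\frac{\partial V}{\partial p}\right)_T,
    \qquad
    \kappa_S = -\frac{1}{V}\left(\frac{\partial V}{\partial p}\right)_S,
    \qquad
    \frac{\kappa_T}{\kappa_S} = \gamma = \frac{c_p}{c_v}
    ```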

  9. Suitability of analytical methods to measure solubility for the purpose of nanoregulation.

    PubMed

    Tantra, Ratna; Bouwmeester, Hans; Bolea, Eduardo; Rey-Castro, Carlos; David, Calin A; Dogné, Jean-Michel; Jarman, John; Laborda, Francisco; Laloy, Julie; Robinson, Kenneth N; Undas, Anna K; van der Zande, Meike

    2016-01-01

    Solubility is an important physicochemical parameter in nanoregulation. If a nanomaterial is completely soluble, then from a risk assessment point of view its disposal can be treated much in the same way as "ordinary" chemicals, which will simplify testing and characterisation regimes. This review assesses potential techniques for the measurement of nanomaterial solubility and evaluates their performance against a set of analytical criteria (based on satisfying the requirements governed by the cosmetics regulation as well as the need to quantify the concentration of free (hydrated) ions). Our findings show that no universal method exists. A complementary approach is thus recommended, comprising an atomic spectrometry-based method in conjunction with an electrochemical (or colorimetric) method. This article shows that although some techniques are more commonly used than others, a huge research gap remains, related to the need to ensure data reliability.

  10. Analytical toxicology.

    PubMed

    Flanagan, R J; Widdop, B; Ramsey, J D; Loveland, M

    1988-09-01

    1. Major advances in analytical toxicology followed the introduction of spectroscopic and chromatographic techniques in the 1940s and early 1950s, and thin-layer chromatography remains important together with some spectrophotometric and other tests. However, gas and high-performance liquid chromatography together with a variety of immunoassay techniques are now widely used. 2. The scope and complexity of forensic and clinical toxicology continues to increase, although the compounds for which emergency analyses are needed to guide therapy are few. Exclusion of the presence of hypnotic drugs can be important in suspected 'brain death' cases. 3. Screening for drugs of abuse has assumed greater importance not only for the management of the habituated patient, but also in 'pre-employment' and 'employment' screening. The detection of illicit drug administration in sport is also an area of increasing importance. 4. In industrial toxicology, the range of compounds for which blood or urine measurements (so-called 'biological monitoring') can indicate the degree of exposure is increasing. The monitoring of environmental contaminants (lead, chlorinated pesticides) in biological samples has also proved valuable. 5. In the near future a consensus as to the units of measurement to be used is urgently required, and more emphasis will be placed on interpretation, especially as regards possible behavioural effects of drugs or other poisons. Despite many advances in analytical techniques there remains a need for reliable, simple tests to detect poisons for use in smaller hospital and other laboratories.

  11. Analytical determination of selenium in medical samples, staple food and dietary supplements by means of total reflection X-ray fluorescence spectroscopy

    NASA Astrophysics Data System (ADS)

    Stosnach, Hagen

    2010-09-01

    Selenium is essential for many aspects of human health and, thus, the object of intensive medical research. This demands the use of analytical techniques capable of analysing selenium at low concentrations with high accuracy, in widely varying matrices and sometimes the smallest sample amounts. In connection with the increasing importance of selenium, there is a need for rapid and simple on-site (or near-to-site) selenium analysis in staple foods like wheat at processing and production sites, as well as for the analysis of this element in dietary supplements. Common analytical techniques like electrothermal atomic absorption spectroscopy (ETAAS) and inductively-coupled plasma mass spectrometry (ICP-MS) are capable of analysing selenium in medical samples with detection limits in the range from 0.02 to 0.7 μg/l. Since in many cases less complicated and less expensive analytical techniques are required, TXRF has been tested regarding its suitability for selenium analysis in different medical, staple food and dietary supplement samples, applying very simple sample preparation techniques. The reported results indicate that the accurate analysis of selenium in all sample types is possible. The detection limits of TXRF are in the range from 7 to 12 μg/l for medical samples and 0.1 to 0.2 mg/kg for staple foods and dietary supplements. Although this sensitivity is low compared to established techniques, it is sufficient for the physiological concentrations of selenium in the investigated samples.

  12. Analytical methods for determination of mycotoxins: a review.

    PubMed

    Turner, Nicholas W; Subrahmanyam, Sreenath; Piletsky, Sergey A

    2009-01-26

    Mycotoxins are small (MW approximately 700), toxic chemical products formed as secondary metabolites by a few fungal species that readily colonise crops and contaminate them with toxins in the field or after harvest. Ochratoxins and aflatoxins are mycotoxins of major significance, and hence there has been significant research on a broad range of analytical and detection techniques that could be useful and practical. Due to the variety of structures of these toxins, it is impossible to use one standard technique for analysis and/or detection. Practical requirements for high-sensitivity analysis and the need for a specialist laboratory setting create challenges for routine analysis. Several existing analytical techniques, which offer flexible and broad-based methods of analysis and in some cases detection, are discussed in this manuscript. There are a number of methods used, of which many are lab-based, but to our knowledge there seems to be no single technique that stands out above the rest, although analytical liquid chromatography, commonly linked with mass spectrometry, is likely to be the most popular. This review discusses (a) sample pre-treatment methods such as liquid-liquid extraction (LLE), supercritical fluid extraction (SFE), and solid phase extraction (SPE); (b) separation methods such as thin-layer chromatography (TLC), high performance liquid chromatography (HPLC), gas chromatography (GC), and capillary electrophoresis (CE); and (c) others such as ELISA. Further, current trends, advantages and disadvantages, and future prospects of these methods are discussed.

  13. Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey

    State-of-the-art visual analytics models and frameworks mostly assume a static snapshot of the data, while in many cases it is a stream with constant updates and changes. Exploration of streaming data poses unique challenges as machine-level computations and abstractions need to be synchronized with the visual representation of the data and the temporally evolving human insights. In the visual analytics literature, we lack a thorough characterization of streaming data and analysis of the challenges associated with task abstraction, visualization design, and adaptation of the role of human-in-the-loop for exploration of data streams. We aim to fill this gap by conducting a survey of the state-of-the-art in visual analytics of streaming data for systematically describing the contributions and shortcomings of current techniques and analyzing the research gaps that need to be addressed in the future. Our contributions are: i) problem characterization for identifying challenges that are unique to streaming data analysis tasks, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space for dynamic data and the role of the human-in-the-loop, and iii) reflections on the design-trade-offs for streaming visual analytics techniques and their practical applicability in real-world application scenarios.

  14. Analytical solutions for determining residual stresses in two-dimensional domains using the contour method

    PubMed Central

    Kartal, Mehmet E.

    2013-01-01

    The contour method is one of the most prevalent destructive techniques for residual stress measurement. Up to now, the method has involved the use of the finite-element (FE) method to determine the residual stresses from the experimental measurements. This paper presents analytical solutions, obtained for a semi-infinite strip and a finite rectangle, which can be used to calculate the residual stresses directly from the measured data; thereby, eliminating the need for an FE approach. The technique is then used to determine the residual stresses in a variable-polarity plasma-arc welded plate and the results show good agreement with independent neutron diffraction measurements. PMID:24204187

  15. Prediction of light aircraft interior noise

    NASA Technical Reports Server (NTRS)

    Howlett, J. T.; Morales, D. A.

    1976-01-01

    At the present time, predictions of aircraft interior noise depend heavily on empirical correction factors derived from previous flight measurements. However, to design for acceptable interior noise levels and to optimize acoustic treatments, analytical techniques which do not depend on empirical data are needed. This paper describes a computerized interior noise prediction method for light aircraft. An existing analytical program (developed for commercial jets by Cockburn and Jolly in 1968) forms the basis of some modal analysis work which is described. The accuracy of this modal analysis technique for predicting low-frequency coupled acoustic-structural natural frequencies is discussed along with trends indicating the effects of varying parameters such as fuselage length and diameter, structural stiffness, and interior acoustic absorption.
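
    As a rough feel for the low-frequency acoustic modes such a prediction method must capture, the sketch below lists rigid-walled rectangular-cavity natural frequencies; the cabin dimensions are invented, and a real fuselage calls for the coupled acoustic-structural modal analysis described above, not this simplified stand-in.

    ```python
    import itertools
    import math

    def cavity_modes(Lx, Ly, Lz, c=343.0, nmax=3):
        """Natural frequencies (Hz) of a rigid-walled rectangular cavity:
        f = (c/2) * sqrt((nx/Lx)**2 + (ny/Ly)**2 + (nz/Lz)**2).
        A crude stand-in for a cabin interior; coupled acoustic-structural
        modes require the full modal analysis."""
        modes = []
        for n in itertools.product(range(nmax + 1), repeat=3):
            if n == (0, 0, 0):
                continue  # skip the trivial uniform-pressure "mode"
            f = (c / 2.0) * math.sqrt(sum((ni / Li) ** 2
                                          for ni, Li in zip(n, (Lx, Ly, Lz))))
            modes.append((n, f))
        return sorted(modes, key=lambda m: m[1])

    # Hypothetical light-aircraft cabin: 3.0 m long, 1.2 m wide, 1.2 m high.
    for mode, f in cavity_modes(3.0, 1.2, 1.2)[:5]:
        print(mode, f"{f:.1f} Hz")
    ```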

  16. Challenges in Modern Anti-Doping Analytical Science.

    PubMed

    Ayotte, Christiane; Miller, John; Thevis, Mario

    2017-01-01

    The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may in the future result in a change in the strategy applied by the World Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.

  17. Incorporating Ecosystem Goods and Services in Environmental Planning - Definitions, Classification and Operational Approaches

    DTIC Science & Technology

    2013-07-01

    water resource project planning and management; the authors also seek to identify any research needs to accommodate that goal. This technical note and...review of the state of the science of EGS and highlights the types of analytical tools, techniques, and considerations that would be needed within a...

  18. An Overview of the Analysis of Trace Organics in Water.

    ERIC Educational Resources Information Center

    Trussell, Albert R.; Umphres, Mark D.

    1978-01-01

    Summarized are current analytical techniques used to classify, isolate, resolve, identify, and quantify organic compounds present in drinking water. A variety of methods are described, then drawbacks and advantages are listed, and research needs and future trends are noted. (CS)

  19. Position Paper: NOx Measurement

    ERIC Educational Resources Information Center

    Hauser, Thomas R.; Shy, Carl M.

    1972-01-01

    Doubts about the accuracy of measured concentrations of nitrogen dioxide (NO2) in ambient air have led the Environmental Protection Agency to reassess both the analytical technique and the extent to which nitrogen oxides (NOx) control will need to satisfy federal laws. (BL)

  20. Ambient Air Monitoring for Sulfur Compounds

    ERIC Educational Resources Information Center

    Forrest, Joseph; Newman, Leonard

    1973-01-01

    A literature review of analytical techniques available for the study of compounds at low concentrations points up some of the areas where further research is needed. Compounds reviewed are sulfur dioxide, sulfuric acid, ammonium sulfate and bisulfate, metal sulfates, hydrogen sulfide, and organic sulfides. (BL)

  1. Analytical Glycobiology at High Sensitivity: Current Approaches and Directions

    PubMed Central

    Novotny, Milos V.; Alley, William R.; Mann, Benjamin F.

    2013-01-01

    This review summarizes the analytical advances made during the last several years in the structural and quantitative determinations of glycoproteins in complex biological mixtures. The main analytical techniques used in the fields of glycomics and glycoproteomics involve different modes of mass spectrometry and their combinations with capillary separation methods such as microcolumn liquid chromatography and capillary electrophoresis. The need for high-sensitivity measurements has been emphasized in the oligosaccharide profiling used in the field of biomarker discovery through MALDI mass spectrometry. High-sensitivity profiling of both glycans and glycopeptides from biological fluids and tissue extracts has been aided significantly through lectin preconcentration and the use of affinity chromatography. PMID:22945852

  2. Good quantification practices of flavours and fragrances by mass spectrometry.

    PubMed

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanding list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analyte methods. In this article, we present the experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  3. Applying Sequential Analytic Methods to Self-Reported Information to Anticipate Care Needs.

    PubMed

    Bayliss, Elizabeth A; Powers, J David; Ellis, Jennifer L; Barrow, Jennifer C; Strobel, MaryJo; Beck, Arne

    2016-01-01

    Identifying care needs for newly enrolled or newly insured individuals is important under the Affordable Care Act. Systematically collected patient-reported information can potentially identify subgroups with specific care needs prior to service use. We conducted a retrospective cohort investigation of 6,047 individuals who completed a 10-question needs assessment upon initial enrollment in Kaiser Permanente Colorado (KPCO), a not-for-profit integrated delivery system, through the Colorado State Individual Exchange. We used responses from the Brief Health Questionnaire (BHQ) to develop a predictive model for being in the top 25 percent of cost for receiving care, then applied cluster analytic techniques to identify different high-cost subpopulations. Per-member, per-month cost was measured from 6 to 12 months following BHQ response. BHQ responses significantly predictive of high-cost care included self-reported health status, functional limitations, medication use, presence of 0-4 chronic conditions, self-reported emergency department (ED) use during the prior year, and lack of prior insurance. Age, gender, and deductible-based insurance product were also predictive. The largest possible range of predicted probabilities of being in the top 25 percent of cost was 3.5 percent to 96.4 percent. Within the top cost quartile, examples of potentially actionable clusters of patients included those with high morbidity, prior utilization, depression risk, and financial constraints; those with high morbidity, previously uninsured, with few financial constraints; and relatively healthy, previously insured individuals with medication needs. Applying sequential predictive modeling and cluster analytic techniques to patient-reported information can identify subgroups of individuals within heterogeneous populations who may benefit from specific interventions to optimize initial care delivery.
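
    A minimal sketch of the two-stage idea (predict membership in the top cost quartile, then cluster the predicted high-cost members into subgroups) is shown below. The synthetic features, data sizes, and model choices are assumptions for illustration only, not the study's actual BHQ variables or fitted model.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import LogisticRegression
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(42)

    # Hypothetical stand-ins for BHQ-style responses: health status,
    # functional limitations, medication use, chronic conditions,
    # prior-year ED use, prior insurance.
    X = rng.normal(size=(6047, 6))
    top_quartile_cost = (X[:, 0] + 0.5 * X[:, 4] + rng.normal(size=6047)) > 1.0

    # Stage 1: predictive model for being in the top 25 percent of cost.
    model = LogisticRegression().fit(X, top_quartile_cost)
    p_high_cost = model.predict_proba(X)[:, 1]

    # Stage 2: cluster the predicted high-cost members to surface
    # potentially actionable subgroups.
    high_risk = X[p_high_cost > np.quantile(p_high_cost, 0.75)]
    km = KMeans(n_clusters=3, n_init=10, random_state=0)
    labels = km.fit_predict(StandardScaler().fit_transform(high_risk))
    print(np.bincount(labels))  # subgroup sizes
    ```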

  4. Use of data sources, analytical techniques, and public involvement : MPO environmental justice report

    DOT National Transportation Integrated Search

    2001-01-01

    In the wake of new Federal guidelines on environmental justice that amplify Title VI of the Civil Rights Act, growing attention has been placed on the need to incorporate environmental justice principles into the processes and products of transportat...

  5. Analysis of high-aspect-ratio jet-flap wings of arbitrary geometry

    NASA Technical Reports Server (NTRS)

    Lissaman, P. B. S.

    1973-01-01

    An analytical technique to compute the performance of an arbitrary jet-flapped wing is developed. The solution technique is based on the method of Maskell and Spence in which the well-known lifting-line approach is coupled with an auxiliary equation providing the extra function needed in jet-flap theory. The present method is generalized to handle straight, uncambered wings of arbitrary planform, twist, and blowing (including unsymmetrical cases). An analytical procedure is developed for continuous variations in the above geometric data with special functions to exactly treat discontinuities in any of the geometric and blowing data. A rational theory for the effect of finite wing thickness is introduced as well as simplified concepts of effective aspect ratio for rapid estimation of performance.

  6. Visual analytics techniques for large multi-attribute time series data

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Keim, Daniel A.

    2008-01-01

    Time series data commonly occur when variables are monitored over time. Many real-world applications involve the comparison of long time series across multiple variables (multi-attributes). Often business people want to compare this year's monthly sales with last year's sales to make decisions. Data warehouse administrators (DBAs) want to know their daily data loading job performance, and need to detect outliers early enough to act upon them. In this paper, two new visual analytic techniques are introduced: the color cell-based Visual Time Series Line Charts and Maps, which highlight significant changes over time in long time series data, and the new Visual Content Query, which facilitates finding the contents and histories of interesting patterns and anomalies, leading to root cause identification. We have applied both methods to two real-world applications, mining enterprise data warehouse and customer credit card fraud data, to illustrate the wide applicability and usefulness of these techniques.
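
    The cell-based chart idea can be approximated with a simple month-by-year grid of colored cells, as in the sketch below; the data are synthetic and the layout is only a rough stand-in for the paper's Visual Time Series Line Charts and Maps.

    ```python
    import matplotlib.pyplot as plt
    import numpy as np

    # Hypothetical monthly sales for two years, one colored cell per month,
    # so year-over-year changes stand out at a glance.
    rng = np.random.default_rng(7)
    sales = rng.gamma(shape=5.0, scale=10.0, size=(2, 12))

    fig, ax = plt.subplots(figsize=(8, 2))
    im = ax.imshow(sales, aspect="auto", cmap="RdYlGn")
    ax.set_yticks([0, 1], labels=["last year", "this year"])
    ax.set_xticks(range(12), labels=["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                                     "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"])
    fig.colorbar(im, label="monthly sales")
    plt.show()
    ```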

  7. XPS Protocol for the Characterization of Pristine and Functionalized Single Wall Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Sosa, E. D.; Allada, R.; Huffman, C. B.; Arepalli, S.

    2009-01-01

    Recent interest in developing new applications for carbon nanotubes (CNT) has fueled the need to use accurate macroscopic and nanoscopic techniques to characterize and understand their chemistry. X-ray photoelectron spectroscopy (XPS) has proved to be a useful analytical tool for nanoscale surface characterization of materials including carbon nanotubes. Recent nanotechnology research at NASA Johnson Space Center (NASA-JSC) helped to establish a characterization protocol for quality assessment for single wall carbon nanotubes (SWCNTs). Here, a review of some of the major factors of the XPS technique that can influence the quality of analytical data, suggestions for methods to maximize the quality of data obtained by XPS, and the development of a protocol for XPS characterization as a complementary technique for analyzing the purity and surface characteristics of SWCNTs are presented. The XPS protocol is then applied to a number of experiments including impurity analysis and the study of chemical modifications for SWCNTs.

  8. Application of analytical methods in authentication and adulteration of honey.

    PubMed

    Siddiqui, Amna Jabbar; Musharraf, Syed Ghulam; Choudhary, M Iqbal; Rahman, Atta-Ur-

    2017-02-15

    Honey is synthesized from flower nectar and has been famous for its tremendous therapeutic potential since ancient times. Many factors influence the basic properties of honey including the nectar-providing plant species, bee species, geographic area, and harvesting conditions. Quality and composition of honey is also affected by many other factors, such as overfeeding of bees with sucrose, harvesting prior to maturity, and adulteration with sugar syrups. Due to the complex nature of honey, it is often challenging to authenticate the purity and quality by using common methods such as physicochemical parameters, and more specialized procedures need to be developed. This article reviews the literature (between 2000 and 2016) on the use of analytical techniques, mainly NMR spectroscopy, for authentication of honey, its botanical and geographical origin, and adulteration by sugar syrups. NMR is a powerful technique and can be used as a fingerprinting technique to compare various samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Synthesis of Human Factors Research on Older Drivers and Highway Safety. Volume I: Older Driver Research Synthesis

    DOT National Transportation Integrated Search

    1997-10-01

    The overall goals in this project were to perform literature reviews and syntheses, using meta-analytic techniques, where appropriate, for a broad and comprehensive body of research findings on older driver needs and (diminished) capabilities, and a ...

  10. SYNTHESIS OF HUMAN FACTORS RESEARCH ON OLDER DRIVERS AND HIGHWAY SAFETY, Volume I: Older Driver Research Synthesis

    DOT National Transportation Integrated Search

    1999-11-23

    The overall goals in this project were to perform literature reviews and syntheses, using meta-analytic techniques, where appropriate, for a broad and comprehensive body of research findings on older driver needs and (diminished) capabilities, and a ...

  11. Problems and Issues in Meta-Analysis.

    ERIC Educational Resources Information Center

    George, Carrie A.

    Single studies, by themselves, rarely explain the effect of treatments or interventions definitively in the social sciences. Researchers created meta-analysis in the 1970s to address this need. Since then, meta-analytic techniques have been used to support certain treatment modalities and to influence policymakers. Although these techniques…

  12. Using predictive analytics and big data to optimize pharmaceutical outcomes.

    PubMed

    Hernandez, Inmaculada; Zhang, Yuting

    2017-09-15

    The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.

  13. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    PubMed

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market and their ingredients must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and the analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009), comprising colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with the possibilities of fulfilment of the current regulatory issues. The cosmetic legislation is frequently being updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and to the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. Recently, research covering this aspect has tended towards the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; but some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. An Application of Trimethylsilyl Derivatives with Temperature Programmed Gas Chromatography to the Senior Analytical Laboratory.

    ERIC Educational Resources Information Center

    Kelter, Paul B.; Carr, James D.

    1983-01-01

    Describes an experiment designed to teach temperature-programmed gas chromatography (TPGC) techniques and the importance of derivatizing many classes of substrates to be separated. Includes equipment needed, procedures for making trimethylsilyl derivatives, applications, sample calculations, and typical results. The procedure required one three-hour…

  15. Nanometrology and its perspectives in environmental research.

    PubMed

    Kim, Hyun-A; Seo, Jung-Kwan; Kim, Taksoo; Lee, Byung-Tae

    2014-01-01

    Rapid increase in engineered nanoparticles (ENPs) in many goods has raised significant concern about their environmental safety. Proper methodologies are therefore needed to conduct toxicity and exposure assessment of nanoparticles in the environment. This study reviews several analytical techniques for nanoparticles, summarizing their principles, advantages and disadvantages, reviewing the state of the art, and offering perspectives on nanometrology in relation to ENP studies. Nanometrology is divided into five techniques with regard to the instrumental principle: microscopy, light scattering, spectroscopy, separation, and single particle inductively coupled plasma-mass spectrometry. Each analytical method has its own drawbacks, such as detection limit, ability to quantify or qualify ENPs, and matrix effects. More than two different analytical methods should be used to better characterize ENPs. In characterizing ENPs, researchers should understand nanometrology, its demerits as well as its merits, to properly interpret their experimental results. Challenges lie in the nanometrology and pretreatment of ENPs from various matrices: in the extraction without dissolution or aggregation, and in the concentration of ENPs to satisfy the instrumental detection limit.

  16. Heat transfer with hockey-stick steam generator. [LMFBR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moody, E; Gabler, M J

    1977-11-01

    The hockey-stick modular design concept is a good answer to future needs for reliable, economic LMFBR steam generators. The concept was successfully demonstrated in the 30 Mwt MSG test unit; scaled-up versions are currently in fabrication for CRBRP usage, and further scaling has been accomplished for PLBR applications. Design and performance characteristics are presented for the three generations of hockey-stick steam generators. The key features of the design are presented based on extensive analytical effort backed up by extensive ancillary test data. The bases for and actual performance evaluations are presented with emphasis on the CRBRP design. The design effort on these units has resulted in the development of analytical techniques that are directly applicable to steam generators for any LMFBR application. In conclusion, the hockey-stick steam generator concept has been proven to perform both thermally and hydraulically as predicted. The heat transfer characteristics are well defined, and proven analytical techniques are available, as are personnel experienced in their use.

  17. Review of methods used for identification of biothreat agents in environmental protection and human health aspects.

    PubMed

    Mirski, Tomasz; Bartoszcze, Michał; Bielawska-Drózd, Agata; Cieślik, Piotr; Michalski, Aleksander J; Niemcewicz, Marcin; Kocik, Janusz; Chomiczewski, Krzysztof

    2014-01-01

    Modern threats of bioterrorism force the need to develop methods for rapid and accurate identification of dangerous biological agents. Currently, there are many types of methods used in this field of studies that are based on immunological or genetic techniques, or constitute a combination of both methods (immuno-genetic). There are also methods that have been developed on the basis of physical and chemical properties of the analytes. Each group of these analytical assays can be further divided into conventional methods (e.g. simple antigen-antibody reactions, classical PCR, real-time PCR), and modern technologies (e.g. microarray technology, aptamers, phosphors, etc.). Nanodiagnostics constitute another group of methods that utilize the objects at a nanoscale (below 100 nm). There are also integrated and automated diagnostic systems, which combine different methods and allow simultaneous sampling, extraction of genetic material and detection and identification of the analyte using genetic, as well as immunological techniques.

  18. Recent Advances in Paper-Based Sensors

    PubMed Central

    Liana, Devi D.; Raguse, Burkhard; Gooding, J. Justin; Chow, Edith

    2012-01-01

    Paper-based sensors are a new alternative technology for fabricating simple, low-cost, portable and disposable analytical devices for many application areas including clinical diagnosis, food quality control and environmental monitoring. The unique properties of paper which allow passive liquid transport and compatibility with chemicals/biochemicals are the main advantages of using paper as a sensing platform. Depending on the main goal to be achieved in paper-based sensors, the fabrication methods and the analysis techniques can be tuned to fulfill the needs of the end-user. Current paper-based sensors are focused on microfluidic delivery of solution to the detection site whereas more advanced designs involve complex 3-D geometries based on the same microfluidic principles. Although paper-based sensors are very promising, they still suffer from certain limitations in accuracy and sensitivity. However, it is anticipated that in the future, with advances in fabrication and analytical techniques, there will be more new and innovative developments in paper-based sensors. These sensors could better meet the current objectives of a viable low-cost and portable device in addition to offering high sensitivity and selectivity, and multiple analyte discrimination. This paper is a review of recent advances in paper-based sensors and covers the following topics: existing fabrication techniques, analytical methods and application areas. Finally, the present challenges and future outlooks are discussed. PMID:23112667

  19. Current applications of high-resolution mass spectrometry for the analysis of new psychoactive substances: a critical review.

    PubMed

    Pasin, Daniel; Cawley, Adam; Bidny, Sergei; Fu, Shanlin

    2017-10-01

    The proliferation of new psychoactive substances (NPS) in recent years has resulted in the development of numerous analytical methods for the detection and identification of known and unknown NPS derivatives. High-resolution mass spectrometry (HRMS) has been identified as the method of choice for broad screening of NPS in a wide range of analytical contexts because of its ability to measure accurate masses using data-independent acquisition (DIA) techniques. Additionally, it has shown promise for non-targeted screening strategies that have been developed in order to detect and identify novel analogues without the need for certified reference materials (CRMs) or comprehensive mass spectral libraries. This paper reviews the applications of HRMS for the analysis of NPS in forensic drug chemistry and analytical toxicology. It provides an overview of the sample preparation procedures in addition to data acquisition, instrumental analysis, and data processing techniques. Furthermore, it gives an overview of the current state of non-targeted screening strategies with discussion on future directions and perspectives of this technique. Graphical Abstract Missing the bullseye - a graphical representation of non-targeted screening. Image courtesy of Christian Alonzo.

  20. A generic standard additions based method to determine endogenous analyte concentrations by immunoassays to overcome complex biological matrix interference.

    PubMed

    Pang, Susan; Cowen, Simon

    2017-12-13

    We describe a novel generic method to derive the unknown endogenous concentration of an analyte within complex biological matrices (e.g. serum or plasma), based upon the relationship between the immunoassay signal response of a biological test sample spiked with known analyte concentrations and the log-transformed estimated total concentration. If the estimated total analyte concentration is correct, a portion of the sigmoid on a log-log plot is very close to linear, allowing the unknown endogenous concentration to be estimated using a numerical method. This approach obviates conventional relative quantification using an internal standard curve and the need for calibrant diluent, and takes into account the individual matrix interference on the immunoassay by spiking the test sample itself. The technique is based on standard additions for chemical analytes. Unknown endogenous analyte concentrations within even 2-fold diluted human plasma may be determined reliably using as few as four reaction wells.
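
    A numerical sketch of this standard-additions idea: scan candidate endogenous concentrations and keep the one that makes the log-log spike response most linear. The response model, spike levels, and search grid below are invented for illustration; the authors' exact numerical method may differ.

    ```python
    import numpy as np

    def estimate_endogenous(spikes, signals, c_grid):
        """For each candidate endogenous level c0, fit log(signal) against
        log(c0 + spike); on the quasi-linear part of the sigmoid the correct
        c0 gives the straightest line, so return the c0 with the smallest
        least-squares residual."""
        best_c0, best_resid = None, np.inf
        for c0 in c_grid:
            x, y = np.log(c0 + spikes), np.log(signals)
            _, resid, *_ = np.polyfit(x, y, 1, full=True)
            r = resid[0] if resid.size else 0.0
            if r < best_resid:
                best_c0, best_resid = c0, r
        return best_c0

    # Simulated example: true endogenous level 5 units, four reaction wells,
    # signal ~ (total concentration)**0.8 in the quasi-linear region.
    spikes = np.array([0.0, 2.0, 5.0, 10.0])
    signals = 100.0 * (5.0 + spikes) ** 0.8
    print(estimate_endogenous(spikes, signals, np.linspace(0.5, 20, 400)))
    ```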

  1. Protein molecular data from ancient (>1 million years old) fossil material: pitfalls, possibilities and grand challenges.

    PubMed

    Schweitzer, Mary Higby; Schroeter, Elena R; Goshe, Michael B

    2014-07-15

    Advances in resolution and sensitivity of analytical techniques have provided novel applications, including the analyses of fossil material. However, the recovery of original proteinaceous components from very old fossil samples (defined as >1 million years (1 Ma) from previously named limits in the literature) is far from trivial. Here, we discuss the challenges to recovery of proteinaceous components from fossils, and the need for new sample preparation techniques, analytical methods, and bioinformatics to optimize and fully utilize the great potential of information locked in the fossil record. We present evidence for survival of original components across geological time, and discuss the potential benefits of recovery, analyses, and interpretation of fossil materials older than 1 Ma, both within and outside of the fields of evolutionary biology.

  2. Characterization and measurement of natural gas trace constituents. Volume 1. Arsenic. Final report, June 1989-October 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, S.S.; Attari, A.

    1995-01-01

    The discovery of arsenic compounds, as alkylarsines, in natural gas prompted this research program to develop reliable measurement techniques needed to assess the efficiency of removal processes for these environmentally sensitive substances. These techniques include sampling, speciation, quantitation and on-line instrumental methods for monitoring the total arsenic concentration. The current program has yielded many products, including calibration standards, arsenic-specific sorbents, sensitive analytical methods and instrumentation. Four laboratory analytical methods have been developed and successfully employed for arsenic determination in natural gas. These methods use GC-AED and GC-MS instruments to speciate alkylarsines, and peroxydisulfate extraction with FIAS, a special carbon sorbent with XRF, and an IGT-developed sorbent with GFAA for total arsenic measurement.

  3. Developing the role of big data and analytics in health professional education.

    PubMed

    Ellaway, Rachel H; Pusic, Martin V; Galbraith, Robert M; Cameron, Terri

    2014-03-01

    As we capture more and more data about learners, their learning, and the organization of their learning, our ability to identify emerging patterns and to extract meaning grows exponentially. The insights gained from the analyses of these large amounts of data are only helpful to the extent that they can be the basis for positive action such as knowledge discovery, improved capacity for prediction, and anomaly detection. Big Data involves the aggregation and melding of large and heterogeneous datasets, while education analytics involves looking for patterns in educational practice or performance in single or aggregate datasets. Although it seems likely that the use of education analytics and Big Data techniques will have a transformative impact on health professional education, there is much yet to be done before they can become part of mainstream health professional education practice. If health professional education is to be accountable for how its programs are run and developed, then health professional educators will need to be ready to deal with the complex and compelling dynamics of analytics and Big Data. This article provides an overview of these emerging techniques in the context of health professional education.

  4. Recent advances in analytical methods for the determination of 4-alkylphenols and bisphenol A in solid environmental matrices: A critical review.

    PubMed

    Salgueiro-González, N; Castiglioni, S; Zuccato, E; Turnes-Carou, I; López-Mahía, P; Muniategui-Lorenzo, S

    2018-09-18

    The problem of endocrine disrupting compounds (EDCs) in the environment has become a worldwide concern in recent decades. Besides their toxicological effects at low concentrations and their widespread use in industrial and household applications, these pollutants pose a risk for non-target organisms and also for public safety. Analytical methods to determine these compounds at trace levels in different matrices are urgently needed. This review critically discusses trends in analytical methods for well-known EDCs like alkylphenols and bisphenol A in solid environmental matrices, including sediment and aquatic biological samples (from 2006 to 2018). Information about extraction, clean-up and determination is covered in detail, including analytical quality parameters (QA/QC). Conventional and novel analytical techniques are compared, with their advantages and drawbacks. Ultrasound assisted extraction followed by solid phase extraction clean-up is the most widely used procedure for sediment and aquatic biological samples, although softer extraction conditions have been employed for the latter. The use of liquid chromatography followed by tandem mass spectrometry has greatly increased in the last five years. The majority of these methods have been employed for the analysis of river sediments and bivalve molluscs because of their usefulness in aquatic ecosystem (bio)monitoring programs. Green, simple, fast analytical methods are now needed to determine these compounds in complex matrices. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Big data in health care: using analytics to identify and manage high-risk and high-cost patients.

    PubMed

    Bates, David W; Saria, Suchi; Ohno-Machado, Lucila; Shah, Anand; Escobar, Gabriel

    2014-07-01

    The US health care system is rapidly adopting electronic health records, which will dramatically increase the quantity of clinical data that are available electronically. Simultaneously, rapid progress has been made in clinical analytics--techniques for analyzing large quantities of data and gleaning new insights from that analysis--which is part of what is known as big data. As a result, there are unprecedented opportunities to use big data to reduce the costs of health care in the United States. We present six use cases--that is, key examples--where some of the clearest opportunities exist to reduce costs through the use of big data: high-cost patients, readmissions, triage, decompensation (when a patient's condition worsens), adverse events, and treatment optimization for diseases affecting multiple organ systems. We discuss the types of insights that are likely to emerge from clinical analytics, the types of data needed to obtain such insights, and the infrastructure--analytics, algorithms, registries, assessment scores, monitoring devices, and so forth--that organizations will need to perform the necessary analyses and to implement changes that will improve care while reducing costs. Our findings have policy implications for regulatory oversight, ways to address privacy concerns, and the support of research on analytics. Project HOPE—The People-to-People Health Foundation, Inc.

  6. Problems of the Randomization Test for AB Designs

    ERIC Educational Resources Information Center

    Manolov, Rumen; Solanas, Antonio

    2009-01-01

    N = 1 designs imply repeated registrations of the behaviour of the same experimental unit and the measurements obtained are often few due to time limitations, while they are also likely to be sequentially dependent. The analytical techniques needed to enhance statistical and clinical decision making have to deal with these problems. Different…

  7. Drug screening in clinical or forensic toxicology: are there differences?

    PubMed

    Gerostamoulos, Dimitri; Beyer, Jochen

    2010-09-01

    Legal and medical practitioners need to remember that, with respect to drug analysis, there are two distinct disciplines in analytical toxicology concerned with human biological matrices, namely clinical and forensic toxicology. Both fields use similar analytical techniques designed to detect and quantify drugs, chemicals and poisons in fluids or tissues. In clinical toxicology, analytical results help to specify the appropriate treatment of a poisoned or intoxicated patient. In forensic toxicology, the results often play a vital role in determining the possible impairment or behavioural changes in an individual, or the contribution of drugs or poisons to death in a medico-legal investigation. This column provides an overview of the similarities and differences inherent in clinical and forensic toxicology.

  8. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    NASA Astrophysics Data System (ADS)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none at all. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  9. High-freezing-point fuel studies

    NASA Technical Reports Server (NTRS)

    Tolle, F. F.

    1980-01-01

    Considerable progress in developing the experimental and analytical techniques needed to design airplanes to accommodate fuels with less stringent low temperature specifications is reported. A computer technique for calculating fuel temperature profiles in full tanks was developed. The computer program is being extended to include the case of partially empty tanks. Ultimately, the completed package is to be incorporated into an aircraft fuel tank thermal analyser code to permit the designer to fly various thermal exposure patterns, study fuel temperatures versus time, and determine holdup.
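
    A toy version of this kind of fuel-temperature calculation is sketched below as a lumped-capacitance heat balance stepped forward in time; the heat-transfer coefficient, fuel properties, and mission profile are assumed values, and the actual program models full-tank temperature profiles in far more detail.

    ```python
    import numpy as np

    def bulk_fuel_temperature(t_air_profile, dt, T0=15.0,
                              UA=250.0, m=8000.0, cp=2000.0):
        """Lumped-capacitance estimate of bulk fuel temperature in a full tank:
        dT/dt = UA * (T_air - T) / (m * cp), integrated with Euler steps.
        UA (W/K), fuel mass m (kg) and specific heat cp (J/kg/K) are assumed."""
        T = np.empty(len(t_air_profile) + 1)
        T[0] = T0
        for i, t_air in enumerate(t_air_profile):
            T[i + 1] = T[i] + dt * UA * (t_air - T[i]) / (m * cp)
        return T

    # A simple thermal exposure pattern: 6 hours of cruise at -56.5 C, 60 s steps.
    profile = np.full(360, -56.5)
    temps = bulk_fuel_temperature(profile, dt=60.0)
    print(f"bulk fuel after 6 h: {temps[-1]:.1f} C")
    ```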

  10. Integrated study plan for space bioprocessing (phase 1)

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Current economic evaluation and analytical techniques are applied to decision problems faced by the space bioprocessing program, enabling NASA decision makers to choose candidate substances after ranking them according to their potential economic benefit. The focus is on determining the appropriate evaluation techniques necessary to obtain measures of the potential economic benefits which result from the pursuit of various space bioprocessing endeavors. The diseases whose treatment would be impacted by a successful outcome of space bioprocessing are identified, and data and other input needs are specified for each candidate substance.

  11. Model and Analytic Processes for Export License Assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.

    2011-09-29

    This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent, a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that, alone or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could support micro-level assessment (e.g., improving individual license assessments) or macro-level assessment. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher on uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests developing a Bayes Net for micro-level analysis and continuing to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, metrics are needed for assessing intent in export control, including the risks and consequences associated with all aspects of export control.
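
    As a minimal illustration of the micro-level direction, the sketch below applies Bayes' rule to a single hypothetical evidence flag in a license assessment. The variable, the flag, and all probabilities are invented for illustration and are not taken from the report.

```python
# Minimal sketch of a micro-level Bayes-style license assessment.
# All structure, variables, and probabilities are illustrative assumptions,
# not values from the NA-22 report.

# Prior probability that a license application involves proliferation intent
P_INTENT = 0.02

# Likelihoods of an "anomalous end-user profile" flag given intent / no intent
P_FLAG_GIVEN_INTENT = 0.70
P_FLAG_GIVEN_BENIGN = 0.05

def posterior_intent(flag_observed: bool) -> float:
    """Update P(intent) from one binary evidence node via Bayes' rule."""
    p_flag = (P_FLAG_GIVEN_INTENT * P_INTENT
              + P_FLAG_GIVEN_BENIGN * (1 - P_INTENT))
    if flag_observed:
        return P_FLAG_GIVEN_INTENT * P_INTENT / p_flag
    return (1 - P_FLAG_GIVEN_INTENT) * P_INTENT / (1 - p_flag)

if __name__ == "__main__":
    print(f"P(intent | flag)    = {posterior_intent(True):.3f}")   # ~0.222
    print(f"P(intent | no flag) = {posterior_intent(False):.4f}")  # ~0.0064
```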

  12. A FMEA clinical laboratory case study: how to make problems and improvements measurable.

    PubMed

    Capunzo, Mario; Cavallo, Pierpaolo; Boccia, Giovanni; Brunetti, Luigi; Pizzuti, Sante

    2004-01-01

    The authors have experimented with the application of the Failure Mode and Effect Analysis (FMEA) technique in a clinical laboratory. The FMEA technique makes it possible to: a) evaluate and measure the hazards of a process malfunction, b) decide where to implement improvement actions, and c) measure the outcome of those actions. A small sample of analytes was studied: the causes of possible malfunctions of the analytical process were determined, and a risk probability index (RPI) with a value between 1 and 1,000 was calculated for each. Improvement actions were implemented only for cases with RPI > 400; these reduced RPI values by 25% to 70% with a cost increase of < 1%. The FMEA technique can be applied to the processes of a clinical laboratory, even a small one, and offers a high potential for improvement. Nevertheless, such activity needs thorough planning because it is complex, even if the laboratory already operates an ISO 9000 Quality Management System.
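
    To make the screening rule concrete, the sketch below computes the RPI the way FMEA conventionally does, as severity x occurrence x detectability on 1-10 scales (which yields the 1-1,000 range quoted above). The factor decomposition and the example ratings are assumptions, since the abstract states only the range and the RPI > 400 threshold.

```python
# Sketch of the FMEA risk probability index (RPI) screening described above.
# The severity/occurrence/detectability decomposition and 1-10 scales follow
# FMEA convention; the failure modes and ratings below are hypothetical.

def rpi(severity: int, occurrence: int, detectability: int) -> int:
    """RPI = S x O x D, each factor rated 1-10, giving a 1-1,000 range."""
    for factor in (severity, occurrence, detectability):
        if not 1 <= factor <= 10:
            raise ValueError("each FMEA factor must be rated 1-10")
    return severity * occurrence * detectability

failure_modes = {
    "mislabeled specimen":    (9, 5, 10),  # hypothetical ratings
    "calibrator lot drift":   (6, 4, 6),
    "pipetting volume error": (7, 3, 4),
}

for mode, factors in failure_modes.items():
    score = rpi(*factors)
    action = "improvement action required" if score > 400 else "monitor"
    print(f"{mode:24s} RPI={score:4d} -> {action}")
```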

  13. [Gene doping: gene transfer and possible molecular detection].

    PubMed

    Argüelles, Carlos Francisco; Hernández-Zamora, Edgar

    2007-01-01

    The use of illegal substances in sports to enhance athletic performance during competition has caused international sports organizations such as the IOC and WADA to take anti-doping measures. A new doping method known as gene doping is defined as "the non-therapeutic use of genes, genetic elements and/or cells that have the capacity to enhance athletic performance". However, gene doping in sports is not easily identified and can have serious consequences. Molecular biology techniques are needed to distinguish between a "normal" and an "altered" genome. Further, we need to develop new analytic methods and molecular biology techniques in anti-doping laboratories, and design programs that prevent the non-therapeutic use of genes.

  14. Accelerator-based analytical technique in the evaluation of some Nigeria’s natural minerals: Fluorite, tourmaline and topaz

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Ige, O. A.; Mazzoli, C.; Ceccato, D.; Akintunde, J. A.; De Poli, M.; Moschini, G.

    2005-10-01

    For the first time, the complementary accelerator-based analytical techniques of PIXE and electron microprobe analysis (EMPA) were employed for the characterization of some of Nigeria's natural minerals, namely fluorite, tourmaline and topaz. These minerals occur in different areas of Nigeria. They are mainly used as gemstones and for other scientific and technological applications, and are therefore very important. There is a need to characterize them to establish the quality of these gemstones and to update the geochemical data on them for useful applications. PIXE analysis was carried out using the 1.8 MeV collimated proton beam from the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy. The novel results, which show many elements at different concentrations in these minerals, are presented and discussed.

  15. Use of the Threshold of Toxicological Concern (TTC) approach for deriving target values for drinking water contaminants.

    PubMed

    Mons, M N; Heringa, M B; van Genderen, J; Puijker, L M; Brand, W; van Leeuwen, C J; Stoks, P; van der Hoek, J P; van der Kooij, D

    2013-03-15

    Ongoing pollution and improving analytical techniques reveal more and more anthropogenic substances in drinking water sources, and occasionally in treated water as well. In fact, the complete absence of any trace pollutant in treated drinking water is an illusion, as current analytical techniques are capable of detecting very low concentrations. Most of the substances detected lack toxicity data from which to derive safe levels and have not yet been regulated. Although the concentrations in treated water usually do not have adverse health effects, their presence is still undesired because of customer perception. This raises the questions of how sensitive analytical methods need to become for water quality screening, at what levels water suppliers need to take action, and how effective treatment methods need to be to remove contaminants sufficiently. Therefore, in the Netherlands a clear and consistent approach called 'Drinking Water Quality for the 21st century (Q21)' has been developed within the joint research program of the drinking water companies. Target values for anthropogenic drinking water contaminants were derived using the recently introduced Threshold of Toxicological Concern (TTC) approach. The target values for individual genotoxic and steroid endocrine chemicals were set at 0.01 μg/L. For all other organic chemicals the target values were set at 0.1 μg/L. The target values for the total sum of genotoxic chemicals, the total sum of steroid hormones, and the total sum of all other organic compounds were set at 0.01, 0.01 and 1.0 μg/L, respectively. The Dutch Q21 approach is further supplemented by the standstill principle and effect-directed testing. The approach is helpful in defining the goals and limits of future treatment process designs and of analytical methods to further improve and ensure the quality of drinking water, without going to unnecessary lengths. Copyright © 2013 Elsevier Ltd. All rights reserved.
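
    The Q21 target values quoted above translate directly into a screening rule. A minimal sketch, with illustrative class labels and measurements:

```python
# Sketch of the Dutch Q21 target-value scheme as stated in the abstract.
# The target values are from the abstract; class labels and the example
# measurements are illustrative.

Q21_INDIVIDUAL_TARGETS_UG_L = {
    "genotoxic": 0.01,
    "steroid_endocrine": 0.01,
    "other_organic": 0.1,
}
Q21_SUM_TARGETS_UG_L = {
    "genotoxic": 0.01,
    "steroid_endocrine": 0.01,
    "other_organic": 1.0,
}

def exceeds_q21(concentrations_by_class):
    """Flag Q21 exceedances for {class: [measured ug/L, ...]} data."""
    flags = []
    for cls, values in concentrations_by_class.items():
        if any(v > Q21_INDIVIDUAL_TARGETS_UG_L[cls] for v in values):
            flags.append(f"{cls}: individual target exceeded")
        if sum(values) > Q21_SUM_TARGETS_UG_L[cls]:
            flags.append(f"{cls}: sum target exceeded")
    return flags

# One 'other organic' compound at 0.15 ug/L breaches its 0.1 ug/L target.
print(exceeds_q21({"other_organic": [0.15, 0.08], "genotoxic": [0.004]}))
```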

  16. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    PubMed

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations, such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Techniques for identifying the sources of uncertainty that most influence results are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
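
    For parameter uncertainty in particular, the standard workhorse is probabilistic sensitivity analysis: sample each uncertain parameter from a distribution, rerun the model, and summarize the resulting spread of outcomes. The sketch below does this for a toy two-strategy cost-effectiveness comparison; the model structure, distributions, and numbers are invented for illustration and are not from the guide.

```python
# Minimal Monte Carlo sketch of propagating parameter uncertainty through
# a decision-analytic model (a generic two-strategy cost-effectiveness
# comparison; all distributions and values are illustrative).
import random

def run_model(p_event, cost_event, effect_gain):
    """Toy model: incremental cost and effect of a preventive strategy."""
    inc_cost = 1500 - p_event * cost_event  # intervention cost minus averted costs
    inc_effect = p_event * effect_gain      # QALYs gained by averting events
    return inc_cost, inc_effect

random.seed(1)
icers = []
for _ in range(10_000):
    p_event = random.betavariate(20, 80)       # uncertain event probability
    cost_event = random.gammavariate(4, 1000)  # uncertain event cost
    effect_gain = random.gauss(0.10, 0.02)     # uncertain QALY gain
    d_cost, d_effect = run_model(p_event, cost_event, effect_gain)
    if d_effect > 0:
        icers.append(d_cost / d_effect)

icers.sort()
print(f"median ICER: {icers[len(icers)//2]:.0f} per QALY")
print(f"95% interval: {icers[int(0.025*len(icers))]:.0f}"
      f" to {icers[int(0.975*len(icers))]:.0f}")
```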

  17. System safety education focused on industrial engineering

    NASA Technical Reports Server (NTRS)

    Johnston, W. L.; Morris, R. S.

    1971-01-01

    An educational program, designed to equip students with the specific skills needed to become safety specialists, is described. The discussion concentrates on the application, selection, and utilization of various system safety analytical approaches. Emphasis is also placed on the management of a system safety program, its relationship with other disciplines, and new developments and applications of system safety techniques.

  18. Using Exploratory Spatial Data Analysis to Leverage Social Indicator Databases: The Discovery of Interesting Patterns

    ERIC Educational Resources Information Center

    Anselin, Luc; Sridharan, Sanjeev; Gholston, Susan

    2007-01-01

    With the proliferation of social indicator databases, the need for powerful techniques to study patterns of change has grown. In this paper, the utility of spatial data analytical methods such as exploratory spatial data analysis (ESDA) is suggested as a means to leverage the information contained in social indicator databases. The principles…

  19. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

    CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has limited the necessary collaboration and, more importantly, cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient daily accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.

  20. Comparison between different techniques applied to quartz CPO determination in granitoid mylonites

    NASA Astrophysics Data System (ADS)

    Fazio, Eugenio; Punturo, Rosalda; Cirrincione, Rosolino; Kern, Hartmut; Wenk, Hans-Rudolph; Pezzino, Antonino; Goswami, Shalini; Mamtani, Manish

    2016-04-01

    Since the second half of the last century, several techniques have been adopted to resolve the crystallographic preferred orientation (CPO) of the major minerals constituting crustal and mantle rocks. To this end, many efforts have been made to increase the accuracy of the analytical devices as well as to progressively reduce the time needed to perform microstructural analysis. Many of these microstructural studies deal with quartz CPO because of the wide occurrence of this mineral phase in crustal rocks as well as its rather simple chemical composition. In the present work, four different techniques were applied to define the CPOs of dynamically recrystallized quartz domains from naturally deformed rocks collected from a ductile crustal-scale shear zone, in order to compare their advantages and limitations. The selected Alpine shear zone is located in the Aspromonte Massif (Calabrian Peloritani Orogen, southern Italy) and comprises granitoid lithotypes. The adopted methods span from the "classical" universal stage (US) to the image-analysis technique (CIP), electron backscatter diffraction (EBSD), and time-of-flight neutron diffraction (TOF). When compared, the bulk texture pole figures obtained by means of these different techniques show a good correlation. Advances in the analytical techniques used for microstructural investigations are outlined by discussing the quartz CPO results presented in this study.

  1. The role of analytical chemistry in Niger Delta petroleum exploration: a review.

    PubMed

    Akinlua, Akinsehinwa

    2012-06-12

    Petroleum, and the organic matter from which it is derived, is composed of organic compounds together with some trace elements. These compounds give an insight into the origin, thermal maturity and paleoenvironmental history of petroleum, which are essential elements in petroleum exploration. Analytical techniques are the main tools for acquiring the geochemical data. Progress in the development of new analytical techniques has resolved many long-standing petroleum exploration problems. Analytical chemistry has played a significant role in the development of the petroleum resources of the Niger Delta. The various analytical techniques that have aided the success of petroleum exploration in the Niger Delta are discussed, as are the techniques that have helped to understand the petroleum system of the basin. Recent and emerging analytical methodologies, including green analytical methods applicable to petroleum exploration, particularly in the Niger Delta petroleum province, are also discussed in this paper. Analytical chemistry is an invaluable tool in finding Niger Delta oils. Copyright © 2011 Elsevier B.V. All rights reserved.

  2. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  3. Liquid air cycle engines

    NASA Technical Reports Server (NTRS)

    Rosevear, Jerry

    1992-01-01

    Given here is a definition of Liquid Air Cycle Engines (LACE) and existing relevant technologies. Heat exchanger design and fabrication techniques, the handling of liquid hydrogen to achieve the greatest heat sink capabilities, and air decontamination to prevent heat exchanger fouling are discussed. It was concluded that technology needs to be extended in the areas of design and fabrication of heat exchangers to improve reliability along with weight and volume reductions. Catalysts need to be improved so that conversion can be achieved with lower quantities and lower volumes. Packaging studies need to be investigated both analytically and experimentally. Recycling with slush hydrogen needs further evaluation with experimental testing.

  4. Analytical Chemistry: A retrospective view on some current trends.

    PubMed

    Niessner, Reinhard

    2018-04-01

    In a retrospective view, some current trends in Analytical Chemistry are outlined and connected to work published more than a hundred years ago in the same field. For example, gravimetric microanalysis after specific precipitation, once the sole basis for chemical analysis, has been transformed into a mass-sensitive transducer in combination with compound-specific receptors. Molecular spectroscopy, still practising the classical absorption/emission techniques for detecting elements or molecules, is experiencing a shift towards Raman spectroscopy, which allows analysis of a multitude of additional features. Chemical sensors are now used to perform a vast number of analytical measurements. Paper-based devices in particular (dipsticks, microfluidic pads) are celebrating a revival, as they can potentially revolutionize medicine in the developing world. Industry 4.0 will lead to a further increase in sensor applications. Prior separation and enrichment of analytes from complicated matrices remain the backbone of a successful analysis, despite increasing attempts to avoid clean-up. Continuous separation techniques will become a key element for 24/7 production of goods with certified quality. Attempts to obtain instantaneous and specific chemical information by optical or electrical transduction will need highly selective receptors in large quantities. Further understanding of ligand-receptor complex structures is the key to the successful generation of artificial bio-inspired receptors. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Antimony in the environment as a global pollutant: a review on analytical methodologies for its determination in atmospheric aerosols.

    PubMed

    Smichowski, Patricia

    2008-03-15

    This review summarizes and discusses the research carried out on the determination of antimony and its predominant chemical species in atmospheric aerosols. Environmental matrices such as airborne particulate matter, fly ash and volcanic ash present a number of complex analytical challenges, as very sensitive analytical techniques and highly selective separation methodologies are required for speciation studies. Given the diversity of instrumental approaches and methodologies employed for the determination of antimony and its species in environmental matrices, the objective of this review is to briefly discuss the most relevant findings reported in recent years for this remarkable element and to identify future needs and trends. The survey includes 92 references and principally covers the literature published over the last decade.

  6. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    PubMed Central

    Suvarapu, Lakshmi Narayana; Baek, Sung-Ok

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed. PMID:26236539

  7. An overview of the characterization of occupational exposure to nanoaerosols in workplaces

    NASA Astrophysics Data System (ADS)

    Castellano, Paola; Ferrante, Riccardo; Curini, Roberta; Canepari, Silvia

    2009-05-01

    Currently, there is a lack of standardized sampling and metric methods that can be applied to measure the level of exposure to nanosized aerosols. Therefore, any attempt to characterize exposure to nanoparticles (NP) in a workplace must involve a multifaceted approach with different sampling and analytical techniques to measure all relevant characteristics of NP exposure. Furthermore, as NP aerosols are always complex mixtures of multiple origins, sampling and analytical methods need to be improved to selectively evaluate the apportionment from specific sources to the final nanomaterials. A worldwide open question is how to relate specific toxic effects of NP to one or more of several different parameters (such as particle size, mass, composition, surface area, number concentration, aggregation or agglomeration state, water solubility and surface chemistry). As the evaluation of occupational exposure to NP in workplaces needs dimensional and chemical characterization, the main problem is the choice of the sampling and dimensional separation techniques. A convenient approach to allow a satisfactory risk assessment could therefore be the simultaneous use of different sampling and measuring techniques for particles with known toxicity in selected workplaces. Despite the lack of specific NP exposure limit values, exposure metrics appropriate to nanoaerosols are discussed in the Technical Report ISO/TR 27628:2007, with the aim of enabling occupational hygienists to characterize and monitor nanoaerosols in workplaces. Moreover, NIOSH has developed the document Approaches to Safe Nanotechnology (intended as an information exchange with NIOSH) to address current and future research needs for understanding the potential risks that nanotechnology may pose to workers.

  8. Advanced analytical techniques for the extraction and characterization of plant-derived essential oils by gas chromatography with mass spectrometry.

    PubMed

    Waseem, Rabia; Low, Kah Hin

    2015-02-01

    In recent years, essential oils have received growing interest because of the positive health effects of their novel characteristics, such as antibacterial, antifungal, and antioxidant activities. The extraction of plant-derived essential oils requires advanced analytical techniques and innovative methodologies. This review provides an exhaustive study of hydrodistillation, supercritical fluid extraction, ultrasound- and microwave-assisted extraction, solid-phase microextraction, pressurized liquid extraction, pressurized hot water extraction, liquid-liquid extraction, liquid-phase microextraction, and matrix solid-phase dispersion for the extraction of essential oils from various plant species, and of gas chromatography (one- and two-dimensional) hyphenated with mass spectrometry for their analysis. Essential oils are composed mainly of terpenes and terpenoids with low-molecular-weight aromatic and aliphatic constituents that are particularly important for public health. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, which will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide the capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.

  10. Transient well flow in vertically heterogeneous aquifers

    NASA Astrophysics Data System (ADS)

    Hemker, C. J.

    1999-11-01

    A solution for the general problem of computing well flow in vertically heterogeneous aquifers is found by an integration of both analytical and numerical techniques. The radial component of flow is treated analytically; the drawdown is a continuous function of the distance to the well. The finite-difference technique is used for the vertical flow component only. The aquifer is discretized in the vertical dimension and the heterogeneous aquifer is considered to be a layered (stratified) formation with a finite number of homogeneous sublayers, where each sublayer may have different properties. The transient part of the differential equation is solved with Stehfest's algorithm, a numerical inversion technique of the Laplace transform. The well is of constant discharge and penetrates one or more of the sublayers. The effect of wellbore storage on early drawdown data is taken into account. In this way drawdowns are found for a finite number of sublayers as a continuous function of radial distance to the well and of time since the pumping started. The model is verified by comparing results with published analytical and numerical solutions for well flow in homogeneous and heterogeneous, confined and unconfined aquifers. Instantaneous and delayed drainage of water from above the water table are considered, combined with the effects of partially penetrating and finite-diameter wells. The model is applied to demonstrate that the transient effects of wellbore storage in unconfined aquifers are less pronounced than previous numerical experiments suggest. Other applications of the presented solution technique are given for partially penetrating wells in heterogeneous formations, including a demonstration of the effect of decreasing specific storage values with depth in an otherwise homogeneous aquifer. The presented solution can be a powerful tool for the analysis of drawdown from pumping tests, because hydraulic properties of layered heterogeneous aquifer systems with partially penetrating wells may be estimated without the need to construct transient numerical models. A computer program based on the hybrid analytical-numerical technique is available from the author.
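
    The numerical Laplace inversion at the heart of this hybrid scheme, Stehfest's algorithm, is compact enough to sketch. Below is a generic implementation, tested on a transform with a known inverse rather than on the well-flow kernel itself:

```python
# Sketch of Stehfest's algorithm, the numerical Laplace-transform inversion
# used in the hybrid analytical-numerical solution above. Generic version;
# the test transform is illustrative.
from math import factorial, log, exp

def stehfest_coefficients(n):
    """Stehfest weights V_i for an even number of terms n."""
    half = n // 2
    v = []
    for i in range(1, n + 1):
        s = 0.0
        for k in range((i + 1) // 2, min(i, half) + 1):
            s += (k**half * factorial(2 * k)
                  / (factorial(half - k) * factorial(k) * factorial(k - 1)
                     * factorial(i - k) * factorial(2 * k - i)))
        v.append((-1) ** (half + i) * s)
    return v

def invert_laplace(F, t, n=12):
    """Approximate f(t) = L^{-1}[F](t) as (ln2/t) * sum V_i F(i ln2/t)."""
    ln2_t = log(2.0) / t
    v = stehfest_coefficients(n)
    return ln2_t * sum(vi * F((i + 1) * ln2_t) for i, vi in enumerate(v))

# F(s) = 1/(s+1) has the exact inverse f(t) = exp(-t)
for t in (0.5, 1.0, 2.0):
    print(t, invert_laplace(lambda s: 1.0 / (s + 1.0), t), exp(-t))
```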

  11. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  12. Meeting future information needs for Great Lakes fisheries management

    USGS Publications Warehouse

    Christie, W.J.; Collins, John J.; Eck, Gary W.; Goddard, Chris I.; Hoenig, John M.; Holey, Mark; Jacobson, Lawrence D.; MacCallum, Wayne; Nepszy, Stephen J.; O'Gorman, Robert; Selgeby, James

    1987-01-01

    Description of information needs for management of Great Lakes fisheries is complicated by recent changes in biology and management of the Great Lakes, development of new analytical methodologies, and a transition in management from a traditional unispecies approach to a multispecies/community approach. A number of general problems with the collection and management of data and information for fisheries management need to be addressed (i.e. spatial resolution, reliability, computerization and accessibility of data, design of sampling programs, standardization and coordination among agencies, and the need for periodic review of procedures). Problems with existing data collection programs include size selectivity and temporal trends in the efficiency of fishing gear, inadequate creel survey programs, bias in age estimation, lack of detailed sea lamprey (Petromyzon marinus) wounding data, and data requirements for analytical techniques that are underutilized by managers of Great Lakes fisheries. The transition to multispecies and community approaches to fisheries management will require policy decisions by the management agencies, adequate funding, and a commitment to develop programs for collection of appropriate data on a long-term basis.

  13. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis]

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  14. An FT-Raman study of softwood, hardwood, and chemically modified black spruce MWLs

    Treesearch

    Umesh P. Agarwal; James D. McSweeny; Sally A. Ralph

    1999-01-01

    Raman spectroscopy is being increasingly used to carry out in situ analysis of wood and other lignocellulosics. To obtain useful information from the spectra, the vibrational bands need to be assigned in terms of contributions from various chemical components and component sub-structures. In addition, so that the technique can be better applied as an analytical...

  15. Process analytical technologies (PAT) in freeze-drying of parenteral products.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael

    2009-01-01

    Quality by Design (QbD) aims at assuring quality by proper design and control, utilizing appropriate Process Analytical Technologies (PAT) to monitor critical process parameters during processing and to ensure that the product meets the desired quality attributes. This review provides a comprehensive list of process monitoring devices that can be used to monitor critical process parameters, and focuses on a critical review of the viability of the proposed PAT schemes. R&D needs in PAT for freeze-drying are also addressed, with particular emphasis on batch techniques that can be used on all dryers independent of dryer scale.

  16. Targeted Analyte Detection by Standard Addition Improves Detection Limits in MALDI Mass Spectrometry

    PubMed Central

    Eshghi, Shadi Toghi; Li, Xingde; Zhang, Hui

    2014-01-01

    Matrix-assisted laser desorption/ionization has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements averaging 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications. PMID:22877355
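
    The rationale of TAD can be reproduced with a toy signal model: an endogenous peptide below the peak-detection threshold becomes measurable once roughly one LOD of the same analyte is spiked in, because the now-visible peak's area exceeds the spike-only expectation. All numbers below are illustrative, not from the paper.

```python
# Toy numerical sketch of the TAD rationale (all values illustrative):
# an unspiked low-abundance peptide sits below the peak-detection threshold,
# but after spiking ~1 LOD of the same analyte, the now-visible peak's area
# exceeds the spike-only expectation, revealing the endogenous component.

NOISE = 1.0                 # baseline noise, arbitrary units
PEAK_THRESHOLD = 3 * NOISE  # a peak must clear S/N = 3 to be registered
K = 0.1                     # hypothetical linear response, units per fmol

def tad_detect(endogenous_fmol, spike_fmol):
    total_signal = K * (endogenous_fmol + spike_fmol)
    if total_signal < PEAK_THRESHOLD:
        return False                 # no peak registered at all
    # Peak registered: compare its area to the spike-only expectation.
    excess = total_signal - K * spike_fmol
    return excess > NOISE            # endogenous contribution resolved

endogenous = 15.0   # fmol; K*15 = 1.5 < 3.0, invisible without the spike
print(tad_detect(endogenous, spike_fmol=0.0))   # False: below original LOD
print(tad_detect(endogenous, spike_fmol=30.0))  # True: spike ~ original LOD
```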

  17. Targeted analyte detection by standard addition improves detection limits in matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Toghi Eshghi, Shadi; Li, Xingde; Zhang, Hui

    2012-09-18

    Matrix-assisted laser desorption/ionization (MALDI) has proven an effective tool for fast and accurate determination of many molecules. However, the detector sensitivity and chemical noise compromise the detection of many invaluable low-abundance molecules from biological and clinical samples. To address this limitation, we developed a targeted analyte detection (TAD) technique. In TAD, the target analyte is selectively elevated by spiking a known amount of that analyte into the sample, thereby raising its concentration above the noise level, where we take advantage of the improved sensitivity to detect the presence of the endogenous analyte in the sample. We assessed TAD on three peptides in simple and complex background solutions with various exogenous analyte concentrations in two MALDI matrices. TAD successfully improved the limit of detection (LOD) of target analytes when the target peptides were added to the sample at a concentration close to the optimum concentration. The optimum exogenous concentration was estimated through a quantitative method to be approximately equal to the original LOD for each target. Also, we showed that TAD could achieve LOD improvements averaging 3-fold in a simple and 2-fold in a complex sample. TAD provides a straightforward assay to improve the LOD of generic target analytes without the need for costly hardware modifications.

  18. Big Data: Are Biomedical and Health Informatics Training Programs Ready?

    PubMed Central

    Hersh, W.; Ganesh, A. U. Jai

    2014-01-01

    Objectives: The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? Methods: We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. Results: The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one's area of work, bioscience or health care; and Communication - being able to understand the needs of people and organizations, and to articulate results back to them. Conclusions: Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in "deep analytical talent" as well as those who need knowledge to support such individuals. PMID:25123740

  19. Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches

    NASA Technical Reports Server (NTRS)

    Farassat, Fereidoun; Casper, Jay H.

    2006-01-01

    In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytical method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need further development before the fully numerical method becomes useful. Nonetheless, the authors propose that methods based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be pursued. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.
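
    For reference, the FW-H equation mentioned above is commonly written in the aeroacoustics literature in the following differential form (notation varies slightly between authors; this is the textbook version, not necessarily the exact form used in the works surveyed):

```latex
% Standard differential form of the Ffowcs Williams-Hawkings equation
\left(\frac{1}{c^{2}}\frac{\partial^{2}}{\partial t^{2}}-\nabla^{2}\right)\!\left[p'(\mathbf{x},t)\,H(f)\right]
  =\frac{\partial}{\partial t}\!\left[\rho_{0}v_{n}\,\delta(f)\right]
  -\frac{\partial}{\partial x_{i}}\!\left[\ell_{i}\,\delta(f)\right]
  +\frac{\partial^{2}}{\partial x_{i}\,\partial x_{j}}\!\left[T_{ij}\,H(f)\right]
```

    Here f = 0 defines the (possibly penetrable) data surface, H and δ are the Heaviside and Dirac functions, v_n is the local normal velocity of the surface, ℓ_i the loading per unit area, and T_ij the Lighthill stress tensor; the three right-hand terms are the thickness, loading, and quadrupole sources, respectively.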

  20. Big Data: Are Biomedical and Health Informatics Training Programs Ready? Contribution of the IMIA Working Group for Health and Medical Informatics Education.

    PubMed

    Otero, P; Hersh, W; Jai Ganesh, A U

    2014-08-15

    The growing volume and diversity of health and biomedical data indicate that the era of Big Data has arrived for healthcare. This has many implications for informatics, not only in terms of implementing and evaluating information systems, but also for the work and training of informatics researchers and professionals. This article addresses the question: What do biomedical and health informaticians working in analytics and Big Data need to know? We hypothesize a set of skills that we hope will be discussed among academic and other informaticians. The set of skills includes: Programming - especially with data-oriented tools, such as SQL and statistical programming languages; Statistics - working knowledge to apply tools and techniques; Domain knowledge - depending on one's area of work, bioscience or health care; and Communication - being able to understand needs of people and organizations, and articulate results back to them. Biomedical and health informatics educational programs must introduce concepts of analytics, Big Data, and the underlying skills to use and apply them into their curricula. The development of new coursework should focus on those who will become experts, with training aiming to provide skills in "deep analytical talent" as well as those who need knowledge to support such individuals.

  1. Commodity-Free Calibration

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Commodity-free calibration is a reaction rate calibration technique that does not require the addition of any commodities. This technique is a specific form of the reaction rate technique, where all of the necessary reactants, other than the sample being analyzed, are either inherent in the analyzing system or specifically added or provided to the system for a reason other than calibration. After introduction, the component of interest is exposed to other reactants or flow paths already present in the system. The instrument detector records one of the following to determine the rate of reaction: the increase in the response of the reaction product, a decrease in the signal of the analyte response, or a decrease in the signal from the inherent reactant. With this data, the initial concentration of the analyte is calculated. This type of system can analyze and calibrate simultaneously, reduce the risk of false positives and exposure to toxic vapors, and improve accuracy. Moreover, having an excess of the reactant already present in the system eliminates the need to add commodities, which further reduces cost, logistic problems, and potential contamination. Also, the calculations involved can be simplified by comparison to those of the reaction rate technique. We conducted tests with hypergols as an initial investigation into the feasibility of the technique.
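
    A generic reaction-rate sketch shows how an initial concentration can be recovered from a time trace alone: if the analyte signal decays pseudo-first-order against a reactant already present in excess, a log-linear fit returns both the rate constant and the initial response. The kinetics and numbers below are illustrative; this is not the hypergol test procedure itself.

```python
# Generic sketch of the idea behind reaction-rate calibration: fit the
# decaying analyte response S(t) = S0 * exp(-k t) and read off S0, the
# response at introduction, without a separate calibration run.
import math

def fit_first_order(times, signals):
    """Least-squares fit of ln(signal) = ln(S0) - k*t; returns (S0, k)."""
    n = len(times)
    xs, ys = times, [math.log(s) for s in signals]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    intercept = ybar - slope * xbar
    return math.exp(intercept), -slope

# Synthetic detector trace: S(t) = 5.0 * exp(-0.3 t), lightly rounded
t = [0.0, 1.0, 2.0, 3.0, 4.0]
s = [5.00, 3.70, 2.74, 2.03, 1.51]
s0, k = fit_first_order(t, s)
print(f"initial response S0 = {s0:.2f}, rate constant k = {k:.3f} 1/s")
```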

  2. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of the ESDA use cases, as well as provide an early analysis of the data analytics tools/techniques requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  3. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  4. Technique and final cause in psychoanalysis: four ways of looking at one moment.

    PubMed

    Lear, Jonathan

    2009-12-01

    This paper argues that if one considers just a single clinical moment there may be no principled way to choose among different approaches to psychoanalytic technique. One must in addition take into account what Aristotle called the final cause of psychoanalysis, which this paper argues is freedom. However, freedom is itself an open-ended concept with many aspects that need to be explored and developed from a psychoanalytic perspective. This paper considers one analytic moment from the perspectives of the techniques of Paul Gray, Hans Loewald, the contemporary Kleinians and Jacques Lacan. It argues that, if we are to evaluate these techniques, we must take into account the different conceptions of freedom they are trying to facilitate.

  5. Optical Drug Monitoring: Photoacoustic Imaging of Nanosensors to Monitor Therapeutic Lithium In Vivo

    PubMed Central

    Cash, Kevin J.; Li, Chiye; Xia, Jun; Wang, Lihong V.; Clark, Heather A.

    2015-01-01

    Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes. PMID:25588028

  6. Optical drug monitoring: photoacoustic imaging of nanosensors to monitor therapeutic lithium in vivo.

    PubMed

    Cash, Kevin J; Li, Chiye; Xia, Jun; Wang, Lihong V; Clark, Heather A

    2015-02-24

    Personalized medicine could revolutionize how primary care physicians treat chronic disease and how researchers study fundamental biological questions. To realize this goal, we need to develop more robust, modular tools and imaging approaches for in vivo monitoring of analytes. In this report, we demonstrate that synthetic nanosensors can measure physiologic parameters with photoacoustic contrast, and we apply that platform to continuously track lithium levels in vivo. Photoacoustic imaging achieves imaging depths that are unattainable with fluorescence or multiphoton microscopy. We validated the photoacoustic results that illustrate the superior imaging depth and quality of photoacoustic imaging with optical measurements. This powerful combination of techniques will unlock the ability to measure analyte changes in deep tissue and will open up photoacoustic imaging as a diagnostic tool for continuous physiological tracking of a wide range of analytes.

  7. Analytical Electrochemistry: Methodology and Applications of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Heineman, William R.; Kissinger, Peter T.

    1980-01-01

    Reports developments involving the experimental aspects of finite-current analytical electrochemistry, including electrode materials (97 cited references), hydrodynamic techniques (56), spectroelectrochemistry (62), stripping voltammetry (70), voltammetric techniques (27), polarographic techniques (59), and miscellany (12). (CS)

  8. Decision-analytic modeling studies: An overview for clinicians using multiple myeloma as an example.

    PubMed

    Rochau, U; Jahn, B; Qerimi, V; Burger, E A; Kurzthaler, C; Kluibenschaedl, M; Willenbacher, E; Gastl, G; Willenbacher, W; Siebert, U

    2015-05-01

    The purpose of this study was to provide a clinician-friendly overview of decision-analytic models evaluating different treatment strategies for multiple myeloma (MM). We performed a systematic literature search to identify studies evaluating MM treatment strategies using mathematical decision-analytic models. We included studies that were published as full-text articles in English, assessed relevant clinical endpoints, and summarized their methodological characteristics (e.g., modeling approaches, simulation techniques, health outcomes, perspectives). Eleven decision-analytic modeling studies met our inclusion criteria. Five different modeling approaches were adopted: decision-tree modeling, Markov state-transition modeling, discrete event simulation, partitioned-survival analysis and area-under-the-curve modeling. Health outcomes included survival, number-needed-to-treat, life expectancy, and quality-adjusted life years. Evaluated treatment strategies included novel agent-based combination therapies, stem cell transplantation and supportive measures. Overall, our review provides a comprehensive summary of modeling studies assessing treatment of MM and highlights decision-analytic modeling as an important tool for health policy decision making. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Data to decisions.

    PubMed

    White, B J; Amrine, D E; Larson, R L

    2018-04-14

    Big data are frequently used in many facets of business and agronomy to enhance knowledge needed to improve operational decisions. Livestock operations collect data of sufficient quantity to perform predictive analytics. Predictive analytics can be defined as a methodology and suite of data evaluation techniques to generate a prediction for specific target outcomes. The objective of this manuscript is to describe the process of using big data and the predictive analytic framework to create tools to drive decisions in livestock production, health, and welfare. The predictive analytic process involves selecting a target variable, managing the data, partitioning the data, then creating algorithms, refining algorithms, and finally comparing accuracy of the created classifiers. The partitioning of the datasets allows model building and refining to occur prior to testing the predictive accuracy of the model with naive data to evaluate overall accuracy. Many different classification algorithms are available for predictive use and testing multiple algorithms can lead to optimal results. Application of a systematic process for predictive analytics using data that is currently collected or that could be collected on livestock operations will facilitate precision animal management through enhanced livestock operational decisions.
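
    The partition/train/compare workflow described above maps directly onto standard machine-learning tooling. The sketch below illustrates it with scikit-learn on synthetic stand-in data; the library choice, candidate algorithms, and the disease-outcome framing are illustrative assumptions, not details from the paper.

```python
# Sketch of the partition -> train -> refine -> compare workflow described
# above, using scikit-learn on a synthetic stand-in for livestock records.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Target variable: e.g., whether an animal develops a disease outcome.
X, y = make_classification(n_samples=2000, n_features=12, n_informative=6,
                           random_state=0)

# Partition: model building on one split, naive data held out for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Create several candidate classifiers, then compare accuracy on naive data.
candidates = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name:20s} accuracy on naive data: {acc:.3f}")
```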

  10. Determination of nonylphenol and short-chained nonylphenol ethoxylates in drain water from an agricultural area.

    PubMed

    Zgoła-Grześkowiak, Agnieszka; Grześkowiak, Tomasz; Rydlichowski, Robert; Łukaszewski, Zenon

    2009-04-01

    Water samples from agricultural drains were tested for the presence of nonylphenol and nonylphenol mono- and diethoxylates. These analytes are biodegradation products of long-chained nonylphenol ethoxylates, which are used as additives in pesticide formulations. Quantification was performed by HPLC with fluorescence detection after isolation using multi-capillary polytetrafluoroethylene (PTFE) trap extraction. This newly developed technique gave recoveries of about 90% for these analytes in synthetic samples, and several percent lower recoveries in real samples. No additional sample clean-up was needed before chromatographic analysis. The limit of quantitation for all the analytes was 0.1 microg L(-1). Nonylphenol, nonylphenol monoethoxylate and nonylphenol diethoxylate were detected at concentrations ranging from 0.5 to 6.0 microg L(-1), from 0.2 to 0.7 microg L(-1), and from below 0.02 to 0.4 microg L(-1), respectively. Concentrations of nonylphenol and its derivatives were higher in samples taken in spring than in summer.

  11. Trends in analytical techniques applied to particulate matter characterization: A critical review of fundaments and applications.

    PubMed

    Galvão, Elson Silva; Santos, Jane Meri; Lima, Ana Teresa; Reis, Neyval Costa; Orlando, Marcos Tadeu D'Azeredo; Stuetz, Richard Michael

    2018-05-01

    Epidemiological studies have shown the association of airborne particulate matter (PM) size and chemical composition with health problems affecting the cardiorespiratory and central nervous systems. PM also acts as cloud condensation nuclei (CCN) or ice nuclei (IN), taking part in the cloud formation process, and can therefore impact the climate. Several works have used different analytical techniques for PM chemical and physical characterization to supply information to source apportionment models that help environmental agencies assess accountability for damages. Despite the numerous analytical techniques described in the literature for PM characterization, laboratories are normally limited to the techniques available in-house, which raises the question of whether a given technique is suitable for the purpose of a specific experimental work. The aim of this work is to summarize the main available technologies for PM characterization, serving as a guide for readers to find the most appropriate technique(s) for their investigation. Elemental analysis techniques such as atomic spectrometry-based and X-ray-based techniques, organic and carbonaceous techniques, and surface analysis techniques are discussed, illustrating their main features as well as their advantages and drawbacks. We also discuss the trends in analytical techniques used over the last two decades. The choice among all these techniques is a function of a number of parameters, such as the relevant particle physical properties, sampling and measuring time, access to available facilities, and the costs associated with equipment acquisition, among other considerations. An analytical guide map is presented as a guideline for choosing the most appropriate technique for the analytical information required. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Analytical Validation of a Portable Mass Spectrometer Featuring Interchangeable, Ambient Ionization Sources for High Throughput Forensic Evidence Screening

    NASA Astrophysics Data System (ADS)

    Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.

    2017-06-01

    Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand, while also allowing on-site forensic investigation, is portable mass spectrometric (MS) instrumentation, particularly that which enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and these are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.
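
    For readers unfamiliar with the reliability metrics mentioned, the false positive and false negative rates reduce to simple ratios over the screening outcome counts. A minimal sketch with invented counts:

```python
# Simple sketch of the reliability metrics mentioned above: false positive
# and false negative rates from screening counts (counts are illustrative).
def error_rates(tp, fp, tn, fn):
    fpr = fp / (fp + tn)   # contraband reported where none was present
    fnr = fn / (fn + tp)   # contraband missed
    return fpr, fnr

fpr, fnr = error_rates(tp=188, fp=3, tn=94, fn=12)
print(f"false positive rate: {fpr:.1%}, false negative rate: {fnr:.1%}")
```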

  13. Methodological challenges and analytic opportunities for modeling and interpreting Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    2016-01-01

    Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for the uniform handling and analysis of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data, we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy, and their hallmark will be 'team science'.

  14. Automated Predictive Big Data Analytics Using Ontology Based Semantics.

    PubMed

    Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A

    2015-10-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise that might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
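
    As a loose illustration of the idea only (the paper's Analytics Ontology and the SCALATION API are not reproduced here), a rule base mapping dataset metadata to candidate techniques might look like the following sketch; all names, keys and rules are hypothetical.

    ```python
    # Hypothetical metadata-driven model selection, loosely inspired by
    # ontology-based approaches; not the Analytics Ontology or SCALATION.

    def select_technique(meta: dict) -> str:
        """Pick a modeling technique from simple dataset descriptors."""
        if meta["response"] == "categorical":
            return "logistic_regression" if meta.get("n_classes", 2) == 2 else "random_forest"
        if meta.get("temporal", False):
            return "arima"
        if meta["n_features"] > meta["n_rows"]:
            return "ridge_regression"   # p > n calls for regularization
        return "ordinary_least_squares"

    print(select_technique({"response": "numeric", "temporal": False,
                            "n_features": 12, "n_rows": 500}))
    # -> ordinary_least_squares
    ```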

  15. Automated Predictive Big Data Analytics Using Ontology Based Semantics

    PubMed Central

    Nural, Mustafa V.; Cotterell, Michael E.; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A.

    2017-01-01

    Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise that might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954

  16. 3-MCPD in food other than soy sauce or hydrolysed vegetable protein (HVP).

    PubMed

    Baer, Ines; de la Calle, Beatriz; Taylor, Philip

    2010-01-01

    This review gives an overview of current knowledge about 3-monochloropropane-1,2-diol (3-MCPD) formation and detection. Although 3-MCPD is often mentioned with regard to soy sauce and acid-hydrolysed vegetable protein (HVP), and much research has been done in that area, the emphasis here is placed on other foods. This contaminant can be found in a great variety of foodstuffs and is difficult to avoid in our daily nutrition. Despite its low concentration in most foods, its carcinogenic properties are of general concern. Its formation is a multivariate problem influenced by factors such as heat, moisture and sugar/lipid content, depending on the type of food and respective processing employed. Understanding the formation of this contaminant in food is fundamental to not only preventing or reducing it, but also developing efficient analytical methods of detecting it. Considering the differences between 3-MCPD-containing foods, and the need to test for the contaminant at different levels of food processing, one would expect a variety of analytical approaches. In this review, an attempt is made to provide an up-to-date list of available analytical methods and to highlight the differences among these techniques. Finally, the emergence of 3-MCPD esters and analytical techniques for them are also discussed here, although they are not the main focus of this review.

  17. Application of Semi-analytical Satellite Theory orbit propagator to orbit determination for space object catalog maintenance

    NASA Astrophysics Data System (ADS)

    Setty, Srinivas J.; Cefola, Paul J.; Montenbruck, Oliver; Fiedler, Hauke

    2016-05-01

    Catalog maintenance for Space Situational Awareness (SSA) demands an accurate and computationally lean orbit propagation and orbit determination technique to cope with the ever increasing number of observed space objects. As an alternative to established numerical and analytical methods, we investigate the accuracy and computational load of the Draper Semi-analytical Satellite Theory (DSST). The standalone version of the DSST was enhanced with additional perturbation models to improve its recovery of short periodic motion. The accuracy of DSST is, for the first time, compared to a numerical propagator with high-fidelity force models for a comprehensive grid of low, medium, and high altitude orbits with varying eccentricity and different inclinations. Furthermore, the run-time of both propagators is compared as a function of propagation arc, output step size and gravity field order to assess their performance for a full range of relevant use cases. For use in orbit determination, a robust performance of DSST is demonstrated even in the case of sparse observations, which is most sensitive to mismodeled short periodic perturbations. Overall, DSST is shown to exhibit adequate accuracy at favorable computational speed for the full set of orbits that need to be considered in space surveillance. Along with the inherent benefits of a semi-analytical orbit representation, DSST provides an attractive alternative to the more common numerical orbit propagation techniques.
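
    As a much-simplified illustration of the accuracy/run-time trade-off being measured (not DSST itself, whose theory is far richer), the sketch below propagates an unperturbed circular orbit both analytically and numerically and reports the integrator's cost and error against the exact solution; all values and tolerances are arbitrary.

    ```python
    # Analytical vs numerical two-body propagation for a circular orbit.
    # Illustrative only: a stand-in for the semi-analytical/numerical
    # comparison described in the abstract, not the DSST algorithm.
    import time
    import numpy as np
    from scipy.integrate import solve_ivp

    MU = 3.986004418e14              # Earth's GM, m^3/s^2
    r0 = 7.0e6                       # circular orbit radius, m
    v0 = np.sqrt(MU / r0)            # circular speed
    t_end = 86400.0                  # propagate for one day

    def two_body(t, y):
        r = y[:3]
        return np.concatenate([y[3:], -MU * r / np.linalg.norm(r) ** 3])

    tic = time.perf_counter()
    sol = solve_ivp(two_body, (0.0, t_end), [r0, 0.0, 0.0, 0.0, v0, 0.0],
                    rtol=1e-9, atol=1e-6)
    toc = time.perf_counter() - tic

    theta = np.sqrt(MU / r0 ** 3) * t_end        # analytical: uniform motion
    r_exact = r0 * np.array([np.cos(theta), np.sin(theta), 0.0])
    err = np.linalg.norm(sol.y[:3, -1] - r_exact)
    print(f"numerical: {toc:.2f} s, final position error {err:.2f} m")
    ```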

  18. ESIP Earth Sciences Data Analytics (ESDA) Cluster - Work in Progress

    NASA Technical Reports Server (NTRS)

    Kempler, Steven

    2015-01-01

    The purpose of this poster is to promote a common understanding of the usefulness of, and activities that pertain to, data analytics and, more broadly, the data scientist; to facilitate collaborations to better understand the cross usage of heterogeneous datasets and to provide accommodating data analytics expertise, now and as the needs evolve into the future; and to identify gaps that, once filled, will further collaborative activities. The cluster's objectives are to: provide a forum for academic discussions that give ESIP members a better understanding of the various aspects of Earth science data analytics; bring in guest speakers to describe external efforts and further teach us about the broader use of data analytics; perform activities that compile use cases generated from specific community needs to cross-analyze heterogeneous data, compile sources of analytics tools (in particular, to satisfy the needs of the above data users), examine gaps between needs and sources, examine gaps between needs and community expertise, and document the specific data analytics expertise needed to perform Earth science data analytics; and seek graduate data analytics/data science student internship opportunities.

  19. Comparison of commercial analytical techniques for measuring chlorine dioxide in urban desalinated drinking water.

    PubMed

    Ammar, T A; Abid, K Y; El-Bindary, A A; El-Sonbati, A Z

    2015-12-01

    Most drinking water utilities are closely examining options to maintain a certain level of disinfectant residual throughout the entire distribution system. Chlorine dioxide is one of the promising disinfectants, usually used as a secondary disinfectant, whereas the selection of the proper analytical technique for monitoring it, to ensure disinfection and regulatory compliance, has been debated within the industry. This research endeavored to objectively compare the performance of commercially available analytical techniques used for chlorine dioxide measurements (namely, chronoamperometry, DPD (N,N-diethyl-p-phenylenediamine), Lissamine Green B (LGB WET) and amperometric titration) to determine the superior technique. The commonly available commercial analytical techniques were evaluated over a wide range of chlorine dioxide concentrations, and the superior technique was determined against pre-defined criteria. Among the four techniques, chronoamperometry showed the best accuracy and precision. To discern the effectiveness of this technique, various factors that might influence its performance, such as sample temperature, high ionic strength, and other interferences, were examined; none of them diminished the technique's performance, which remained adequate in all matrices. This study is a step towards proper disinfection monitoring, and it assists engineers with chlorine dioxide disinfection system planning and management.

  20. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-05-10

    Smelling is one of the five senses and plays an important role in our everyday lives. Volatile compounds are, for example, characteristic of foods, where some of them can be perceived by humans because of their aroma. They have a great influence on consumers' decisions to use a product or not. Where a product has an offensive, strong aroma, many consumers might not appreciate it; on the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aromas of their food products are characterized by analytical means to provide a basis for further optimization. Much research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product, leading to the development of new products that are more acceptable to consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques; this work focuses on the latter. Although the requirements are simple and low-cost technology is an attractive research target in this domain, most of the data are generated with very high-resolution analytical instruments. Data gathered with different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we address not only the application of chemometrics for aroma analysis but also the use of different analytical instruments in this field, highlighting the research needed for future focus.

  1. Value of Earth Observations: Key principles and techniques of socioeconomic benefits analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Friedl, L.; Macauley, M.; Bernknopf, R.

    2013-12-01

    Internationally, multiple organizations are placing greater emphasis on the societal benefits that governments, businesses, and NGOs can derive from applications of Earth-observing satellite observations, research, and models. A growing set of qualitative, anecdotal examples of the uses of Earth observations across a range of sectors can be complemented by quantitative substantiation of the socioeconomic benefits. In turn, the expanding breadth of environmental data available and the awareness of their beneficial applications to inform decisions can support new products and services by companies, agencies, and civil society. There are, however, significant efforts needed to bridge the Earth sciences and the social and economic sciences to build capacity, develop case studies, and refine analytic techniques for quantifying socioeconomic benefits from the use of Earth observations. Some government programs, such as the NASA Earth Science Division's Applied Sciences Program, have initiated activities in recent years to quantify the socioeconomic benefits from applications of Earth observations research and to develop multidisciplinary models for organizations' decision-making activities. A community of practice has conducted workshops, developed impact analysis reports, published a book, developed a primer, and pursued other activities to advance analytic methodologies and build capacity. This paper will present an overview of measuring the socioeconomic impacts of Earth observations and how those measures can be translated into a value of Earth observation information. It will address key terms, techniques, principles and applications of socioeconomic impact analyses. It will also discuss activities to pursue a research agenda on analytic techniques, develop a body of knowledge, and promote broader skills and capabilities.

  2. Toward rigorous idiographic research in prevention science: comparison between three analytic strategies for testing preventive intervention in very small samples.

    PubMed

    Ridenour, Ty A; Pineo, Thomas Z; Maldonado Molina, Mildred M; Hassmiller Lich, Kristen

    2013-06-01

    Psychosocial prevention research lacks evidence from intensive within-person lines of research to understand idiographic processes related to development and response to intervention. Such data could be used to fill gaps in the literature and expand the study design options for prevention researchers, including lower-cost yet rigorous studies (e.g., for program evaluations), pilot studies, designs to test programs for low-prevalence outcomes, selective/indicated/adaptive intervention research, and understanding of differential response to programs. This study compared three competing analytic strategies designed for this type of research: autoregressive moving average (ARIMA) modeling, mixed model trajectory analysis, and P-technique. Illustrative time series data were from a pilot study of an intervention for nursing home residents with diabetes (N = 4) designed to improve control of blood glucose. A within-person, intermittent baseline design was used. Intervention effects were detected using each strategy for the aggregated sample and for individual patients. The P-technique model most closely replicated observed glucose levels. ARIMA and P-technique models were most similar in terms of estimated intervention effects and modeled glucose levels. However, ARIMA and P-technique also were more sensitive to missing data, outliers and number of observations. Statistical testing suggested that results generalize both to other persons as well as to idiographic, longitudinal processes. This study demonstrated the potential contributions of idiographic research in prevention science as well as the need for simulation studies to delineate the research circumstances when each analytic approach is optimal for deriving the correct parameter estimates.
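
    For readers who want to try the ARIMA-with-intervention strategy on a single time series, a minimal sketch follows, using simulated data rather than the study's glucose series; the model structure (AR(1) plus an intervention regressor) is one plausible reading of the approach, not the authors' exact specification.

    ```python
    # Hedged sketch of an interrupted time-series analysis for one patient:
    # simulated AR(1) data with a level shift at the intervention point.
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(0)
    n = 60
    treated = (np.arange(n) >= 30).astype(float)   # 0 = baseline, 1 = treated

    e = rng.normal(0.0, 5.0, n)                    # AR(1) noise around the level
    noise = np.zeros(n)
    for i in range(1, n):
        noise[i] = 0.5 * noise[i - 1] + e[i]
    y = 180.0 - 20.0 * treated + noise             # "true" drop of 20 units

    fit = ARIMA(y, exog=treated, order=(1, 0, 0)).fit()
    print(fit.params)   # the exog coefficient estimates the intervention effect
    ```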

  3. Toward Rigorous Idiographic Research in Prevention Science: Comparison Between Three Analytic Strategies for Testing Preventive Intervention in Very Small Samples

    PubMed Central

    Pineo, Thomas Z.; Maldonado Molina, Mildred M.; Lich, Kristen Hassmiller

    2013-01-01

    Psychosocial prevention research lacks evidence from intensive within-person lines of research to understand idiographic processes related to development and response to intervention. Such data could be used to fill gaps in the literature and expand the study design options for prevention researchers, including lower-cost yet rigorous studies (e.g., for program evaluations), pilot studies, designs to test programs for low-prevalence outcomes, selective/indicated/adaptive intervention research, and understanding of differential response to programs. This study compared three competing analytic strategies designed for this type of research: autoregressive moving average (ARIMA) modeling, mixed model trajectory analysis, and P-technique. Illustrative time series data were from a pilot study of an intervention for nursing home residents with diabetes (N=4) designed to improve control of blood glucose. A within-person, intermittent baseline design was used. Intervention effects were detected using each strategy for the aggregated sample and for individual patients. The P-technique model most closely replicated observed glucose levels. ARIMA and P-technique models were most similar in terms of estimated intervention effects and modeled glucose levels. However, ARIMA and P-technique also were more sensitive to missing data, outliers and number of observations. Statistical testing suggested that results generalize both to other persons as well as to idiographic, longitudinal processes. This study demonstrated the potential contributions of idiographic research in prevention science as well as the need for simulation studies to delineate the research circumstances when each analytic approach is optimal for deriving the correct parameter estimates. PMID:23299558

  4. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of the Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. In the design process, the design is first obtained using the iterative patched conic technique without the perturbations and then modified to include them. The modification is based on (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence, including the perturbations, and (ii) quantification of deviations in the orbital elements at the periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. Design analysis using the proposed technique provides a realistic insight into the mission aspects. The proposed design is also an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.

  5. Isolation and analysis of ginseng: advances and challenges

    PubMed Central

    Wang, Chong-Zhi

    2011-01-01

    Ginseng occupies a prominent position in the list of best-selling natural products in the world. Because of its complex constituents, multidisciplinary techniques are needed to validate the analytical methods that support ginseng’s use worldwide. In the past decade, rapid development of technology has advanced many aspects of ginseng research. The aim of this review is to illustrate the recent advances in the isolation and analysis of ginseng, and to highlight their new applications and challenges. Emphasis is placed on recent trends and emerging techniques. The current article reviews the literature between January 2000 and September 2010. PMID:21258738

  6. Dynamic characterization, monitoring and control of rotating flexible beam-mass structures via piezo-embedded techniques

    NASA Technical Reports Server (NTRS)

    Lai, Steven H.-Y.

    1992-01-01

    A variational principle and a finite element discretization technique were used to derive the dynamic equations for a high-speed rotating flexible beam-mass system embedded with piezoelectric materials. The dynamic equation thus obtained allows the development of finite element models which accommodate both the original structural element and the piezoelectric element. The solutions of the finite element models provide the system dynamics needed to design a sensing system. The characterization of the gyroscopic effect and damping capacity of smart rotating devices is addressed. Several simulation examples are presented to validate the analytical solution.

  7. Automated measurement of respiratory gas exchange by an inert gas dilution technique

    NASA Technical Reports Server (NTRS)

    Sawin, C. F.; Rummel, J. A.; Michel, E. L.

    1974-01-01

    A respiratory gas analyzer (RGA) has been developed wherein a mass spectrometer is the sole transducer required for measurement of respiratory gas exchange. The mass spectrometer maintains all signals in absolute phase relationships, precluding the need to synchronize flow and gas composition as required in other systems. The RGA system was evaluated by comparison with the Douglas bag technique. The RGA system established the feasibility of the inert gas dilution method for measuring breath-by-breath respiratory gas exchange. This breath-by-breath analytical capability permits detailed study of transient respiratory responses to exercise.
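
    A minimal sketch of the underlying dilution principle, with invented numbers (the flight RGA's actual algorithm is not described in this abstract): a known constant tracer flow is diluted into the unknown expired stream, so the measured tracer fraction recovers the total flow, from which oxygen uptake follows; the Haldane correction is omitted here for brevity.

    ```python
    # Inert gas dilution principle, hypothetical values throughout.
    q_tracer = 0.2            # L/min of inert tracer gas injected (assumed)
    f_tracer = 0.004          # tracer fraction measured in the mixed stream

    q_total = q_tracer / f_tracer        # dilution recovers total flow
    q_exp = q_total - q_tracer           # expired flow alone

    f_io2, f_eo2 = 0.2093, 0.17          # inspired / mixed-expired O2 fractions
    vo2 = q_exp * (f_io2 - f_eo2)        # simplified O2 uptake (no Haldane term)
    print(f"expired flow {q_exp:.1f} L/min, VO2 ~ {vo2:.2f} L/min")
    ```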

  8. Analytic integration of real-virtual counterterms in NNLO jet cross sections I

    NASA Astrophysics Data System (ADS)

    Aglietti, Ugo; Del Duca, Vittorio; Duhr, Claude; Somogyi, Gábor; Trócsányi, Zoltán

    2008-09-01

    We present analytic evaluations of some integrals needed to give explicitly the integrated real-virtual counterterms, based on a recently proposed subtraction scheme for next-to-next-to-leading order (NNLO) jet cross sections. After an algebraic reduction of the integrals, integration-by-parts identities are used for the reduction to master integrals and for the computation of the master integrals themselves by means of differential equations. The results are written in terms of one- and two-dimensional harmonic polylogarithms, once an extension of the standard basis is made. We expect that the techniques described here will be useful in computing other integrals emerging in calculations in perturbative quantum field theories.

  9. Literature Review of the Extraction and Analysis of Trace Contaminants in Food

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Audrey Martin; Alcaraz, Armando

    2010-06-15

    There exists a serious concern that chemical warfare agents (CWA) may be used in a terrorist attack against military or civilian populations. While many precautions have been taken on the military front (e.g. protective clothing, gas masks), such precautions are not suited for widespread application to civilian populations. Thus, defense of the civilian population, also applicable to the military population, has focused on prevention and early detection. Early detection relies on accurate and sensitive analytical methods to detect and identify CWA in a variety of matrices. Once a CWA is detected, the analytical needs take on a forensic application – are there any chemical signatures present in the sample that could indicate its source? These signatures could include byproducts of the reaction, unreacted starting materials, degradation products, or impurities. Therefore, it is important that the analytical method used can accurately identify such signatures, as well as the CWA itself. Contained herein is a review of the open literature describing the detection of CWA in various matrices and the detection of trace toxic chemicals in food. Several relevant reviews have been published in the literature,1-5 including a review of analytical separation techniques for CWA by Hooijschuur et al.1 The current review is not meant to reiterate the published manuscripts; it is focused mainly on extraction procedures, as well as the detection of VX and its hydrolysis products, as VX is closely related to Russian VX, which is not prevalent in the literature. The review is divided by the detection technique used; as such, extraction techniques are included with each detection method.

  10. Quantitative Electron Probe Microanalysis: State of the Art

    NASA Technical Reports Server (NTRS)

    Carpernter, P. K.

    2005-01-01

    Quantitative electron-probe microanalysis (EPMA) has improved due to better instrument design and X-ray correction methods. Design improvements in the electron column and X-ray spectrometer have resulted in measurement precision that exceeds analytical accuracy. Wavelength-dispersive spectrometers (WDS) have layered-dispersive diffraction crystals with improved light-element sensitivity. Newer energy-dispersive spectrometers (EDS) have Si-drift detector elements, thin-window designs, and digital processing electronics with X-ray throughput approaching that of WDS systems. Using these systems, digital X-ray mapping coupled with spectrum imaging is a powerful compositional mapping tool. Improvements in analytical accuracy are due to better X-ray correction algorithms, mass absorption coefficient data sets, and analysis methods for complex geometries. ZAF algorithms have been superseded by Phi(pz) algorithms that better model the depth distribution of primary X-ray production. Complex thin-film and particle geometries are treated using Phi(pz) algorithms, and results agree well with Monte Carlo simulations. For geological materials, X-ray absorption dominates the corrections and depends on the accuracy of mass absorption coefficient (MAC) data sets. However, few MACs have been experimentally measured, and the use of fitted coefficients continues due to the general success of the analytical technique. A polynomial formulation of the Bence-Albee alpha-factor technique, calibrated using Phi(pz) algorithms, is used to critically evaluate accuracy issues; accuracy can approach 2% relative and is limited by measurement precision for ideal cases, but for many elements the analytical accuracy is unproven. The EPMA technique has improved to the point where it is frequently used instead of the petrographic microscope for reconnaissance work. Examples of stagnant research areas are WDS detector design, characterization of calibration standards, and the need for more complete treatment of the continuum X-ray fluorescence correction.
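
    A minimal sketch of an alpha-factor (Bence-Albee style) iteration from measured k-ratios to concentrations is shown below; the alpha values are invented placeholders rather than calibrated factors, and forcing the concentrations to normalize is a simplification of real practice.

    ```python
    # Bence-Albee style alpha-factor iteration, illustrative values only.
    import numpy as np

    k = np.array([0.40, 0.35, 0.15])        # measured k-ratios vs pure standards
    alpha = np.array([[1.00, 1.20, 0.90],   # alpha[i, j]: effect of element j on i
                      [0.85, 1.00, 1.10],
                      [1.05, 0.95, 1.00]])  # invented placeholder values

    C = k / k.sum()                         # first guess: normalized k-ratios
    for _ in range(50):
        beta = alpha @ C                    # composition-weighted correction
        C_new = k * beta
        C_new /= C_new.sum()                # normalization is a simplification
        if np.max(np.abs(C_new - C)) < 1e-8:
            break
        C = C_new
    print(np.round(C, 4))                   # converged concentration estimates
    ```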

  11. Depth-resolved monitoring of analytes diffusion in ocular tissues

    NASA Astrophysics Data System (ADS)

    Larin, Kirill V.; Ghosn, Mohamad G.; Tuchin, Valery V.

    2007-02-01

    Optical coherence tomography (OCT) is a noninvasive imaging technique with high in-depth resolution. We employed the OCT technique for monitoring and quantification of analyte and drug diffusion in the cornea and sclera of rabbit eyes in vitro. Analytes and drugs such as metronidazole, dexamethasone, ciprofloxacin, mannitol, and glucose solution were studied, and their permeability coefficients were calculated. Drug diffusion monitoring was performed as a function of time and as a function of depth. The results obtained suggest that the OCT technique might be used for analyte diffusion studies in connective and epithelial tissues.

  12. Immunoanalysis Methods for the Detection of Dioxins and Related Chemicals

    PubMed Central

    Tian, Wenjing; Xie, Heidi Qunhui; Fu, Hualing; Pei, Xinhui; Zhao, Bin

    2012-01-01

    With the development of biotechnology, approaches based on antibodies, such as enzyme-linked immunosorbent assay (ELISA), active aryl hydrocarbon immunoassay (Ah-I) and other multi-analyte immunoassays, have been utilized as alternatives to the conventional techniques based on gas chromatography and mass spectrometry for the analysis of dioxin and dioxin-like compounds in environmental and biological samples. These screening methods have been verified as rapid, simple and cost-effective. This paper provides an overview of the development and application of antibody-based approaches, such as ELISA, Ah-I, and multi-analyte immunoassays, covering sample extraction and cleanup, antigen design, antibody preparation and immunoanalysis. However, in order to meet the requirements for on-site fast detection and relative quantification of dioxins in the environment, further optimization is needed to make these immunoanalytical methods more sensitive and easier to use. PMID:23443395

  13. On-line soft sensing in upstream bioprocessing.

    PubMed

    Randek, Judit; Mandenius, Carl-Fredrik

    2018-02-01

    This review provides an overview and a critical discussion of novel possibilities for applying soft sensors to on-line monitoring and control of industrial bioprocesses. Focus is on bio-product formation in the upstream process, but integration with other parts of the process is also addressed. The term soft sensor refers to the combination of analytical hardware data (from sensors, analytical devices, instruments and actuators) with mathematical models that create new real-time information about the process. In particular, the review assesses these possibilities from an industrial perspective, including sensor performance, information value and production economy. The capabilities of existing analytical on-line techniques are scrutinized in view of their usefulness in soft sensor setups and in relation to typical needs in bioprocessing in general. The review concludes with specific recommendations for further development of soft sensors for the monitoring and control of upstream bioprocessing.
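
    As a bare-bones illustration of the soft-sensor idea (combining on-line hardware signals with a data-driven model to estimate an unmeasured variable), the sketch below trains a linear estimator of biomass from two assumed on-line signals; the variables, units, and simulated data are hypothetical.

    ```python
    # Minimal soft-sensor sketch: estimate biomass from on-line signals.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200                                      # historical training points
    base_rate = rng.uniform(0.0, 2.0, n)         # mL/min of base added (assumed)
    co2_offgas = rng.uniform(0.5, 4.0, n)        # % CO2 in off-gas (assumed)
    biomass = 1.5 * base_rate + 2.0 * co2_offgas + rng.normal(0, 0.3, n)

    X = np.column_stack([np.ones(n), base_rate, co2_offgas])
    coef, *_ = np.linalg.lstsq(X, biomass, rcond=None)   # "train" the soft sensor

    x_now = np.array([1.0, 1.2, 2.8])            # current on-line readings
    print(f"soft-sensor biomass estimate: {coef @ x_now:.2f} g/L")
    ```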

  14. Big data in medical informatics: improving education through visual analytics.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2014-01-01

    A continuous effort to improve healthcare education today is driven by the need to create competent health professionals able to meet healthcare demands. Little research has reported how manipulation of educational data can help improve healthcare education. The emerging research field of visual analytics has the advantage of combining big data analysis and manipulation techniques, information and knowledge representation, and human cognitive strength to perceive and recognise visual patterns. The aim of this study was therefore to explore novel ways of representing curriculum and educational data using visual analytics. Three approaches to visualization and representation of educational data are presented. Five competencies addressed in courses at the undergraduate medical program level were found to correspond inaccurately to higher education board competencies. Different visual representations appear to have potential to affect the ability to perceive entities and connections in curriculum data.

  15. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and of process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and discusses in detail the gaps between the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
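
    A minimal sketch of the kind of Monte Carlo baseline/prediction model described is given below; the driver distributions and weights are invented for illustration and do not reflect the actual ACSI structure.

    ```python
    # Monte Carlo baseline for a weighted customer satisfaction index,
    # with hypothetical driver distributions and weights.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    quality = rng.triangular(60, 80, 95, n)       # driver scores, 0-100 scales
    value = rng.triangular(50, 70, 90, n)
    expectations = rng.triangular(55, 75, 92, n)

    csi = 0.5 * quality + 0.3 * value + 0.2 * expectations   # invented weights
    lo, hi = np.percentile(csi, [5, 95])
    print(f"baseline CSI {csi.mean():.1f}, 90% interval [{lo:.1f}, {hi:.1f}]")
    ```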

  16. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  17. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  18. Combustion Fundamentals Research

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Increased emphasis is placed on fundamental and generic research at Lewis Research Center, with fewer systems development efforts. This is especially true in combustion research, where the study of combustion fundamentals has grown significantly in order to better address the perceived long-term technical needs of the aerospace industry. The main thrusts for this combustion fundamentals program area are as follows: analytical models of combustion processes, model verification experiments, fundamental combustion experiments, and advanced numeric techniques.

  19. A new basaltic glass microanalytical reference material for multiple techniques

    USGS Publications Warehouse

    Wilson, Steve; Koenig, Alan; Lowers, Heather

    2012-01-01

    The U.S. Geological Survey (USGS) has been producing reference materials since the 1950s. Over 50 materials have been developed to cover bulk rock, sediment, and soils for the geological community. These materials are used globally in geochemistry, environmental, and analytical laboratories that perform bulk chemistry and/or microanalysis for instrument calibration and quality assurance testing. To answer the growing demand for higher spatial resolution and sensitivity, there is a need to create a new generation of microanalytical reference materials suitable for a variety of techniques, such as scanning electron microscopy/X-ray spectrometry (SEM/EDS), electron probe microanalysis (EPMA), laser ablation inductively coupled mass spectrometry (LA-ICP-MS), and secondary ion mass spectrometry (SIMS). As such, the microanalytical reference material (MRM) needs to be stable under the beam, be homogeneous at scales of better than 10–25 micrometers for the major to ultra-trace element level, and contain all of the analytes (elements or isotopes) of interest. Previous development of basaltic glasses intended for LA-ICP-MS has resulted in a synthetic basaltic matrix series of glasses (USGS GS-series) and a natural basalt series of glasses (BCR-1G, BHVO-2G, and NKT-1G). These materials have been useful for the LA-ICP-MS community but were not originally intended for use by the electron or ion beam community. A material developed from start to finish with intended use in multiple microanalytical instruments would be useful for inter-laboratory and inter-instrument platform comparisons. This article summarizes the experiments undertaken to produce a basalt glass reference material suitable for distribution as a multiple-technique round robin material. The goal of the analytical work presented here is to demonstrate that the elemental homogeneity of the new glass is acceptable for its use as a reference material. Because the round robin exercise is still underway, only nominal compositional ranges for each element are given in the article.

  20. Simulation and statistics: Like rhythm and song

    NASA Astrophysics Data System (ADS)

    Othman, Abdul Rahman

    2013-04-01

    Simulation was introduced to solve problems that take the form of systems, and it overcomes two kinds of difficulty. First, a problem may have an analytical solution, but the cost of solving it experimentally may be high in terms of money or lives. Second, a problem may exist that has no analytical solution. In the field of statistical inference the second problem is often encountered. With the advent of high-speed computing devices, a statistician can now use resampling techniques, such as the bootstrap and permutation tests, to form pseudo sampling distributions that lead to solutions of problems that cannot be solved analytically. This paper discusses how Monte Carlo simulation was, and still is, being used to verify analytical solutions in inference. It also discusses resampling techniques as simulation techniques, and examines common misunderstandings about these two techniques along with successful usages of both.
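
    As a concrete example of the resampling idea, the sketch below builds a percentile-bootstrap confidence interval for a median, a statistic whose sampling distribution is awkward to derive analytically; the data are simulated.

    ```python
    # Percentile bootstrap for the median of a skewed sample.
    import numpy as np

    rng = np.random.default_rng(7)
    sample = rng.exponential(scale=2.0, size=50)      # skewed data

    boot = np.array([
        np.median(rng.choice(sample, size=sample.size, replace=True))
        for _ in range(10_000)
    ])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"median {np.median(sample):.2f}, 95% bootstrap CI [{lo:.2f}, {hi:.2f}]")
    ```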

  1. Analytical Techniques and Pharmacokinetics of Gastrodia elata Blume and Its Constituents.

    PubMed

    Wu, Jinyi; Wu, Bingchu; Tang, Chunlan; Zhao, Jinshun

    2017-07-08

    Gastrodia elata Blume (G. elata), commonly called Tianma in Chinese, is an important and notable traditional Chinese medicine (TCM), which has been used in China since ancient times as an anticonvulsant, analgesic, sedative, anti-asthma and anti-immune drug. The aim of this review is to provide an overview of the abundant efforts of scientists in developing analytical techniques and performing pharmacokinetic studies of G. elata and its constituents, including sample pretreatment methods, analytical techniques, absorption, distribution, metabolism, excretion (ADME) and factors influencing its pharmacokinetics. Based on the reported pharmacokinetic property data of G. elata and its constituents, it is hoped that more studies will focus on the development of rapid and sensitive analytical techniques, discovering new therapeutic uses and understanding the specific in vivo mechanisms of action of G. elata and its constituents from the pharmacokinetic viewpoint in the near future. The present review discusses analytical techniques and pharmacokinetics of G. elata and its constituents reported from 1985 onwards.

  2. Nitrate biosensors and biological methods for nitrate determination.

    PubMed

    Sohail, Manzar; Adeloju, Samuel B

    2016-06-01

    The inorganic nitrate (NO3‾) anion is present under a variety of both natural and artificial environmental conditions. Nitrate is ubiquitous in environmental, food, industrial and physiological systems, and is mostly present as the hydrated anion of a corresponding dissolved salt. Due to the significant environmental and toxicological effects of nitrate, its determination and monitoring in environmental and industrial waters are often necessary. A wide range of analytical techniques is available for nitrate determination in various sample matrices. This review discusses biosensors available for nitrate determination using the enzyme nitrate reductase (NaR). We conclude that nitrate determination using biosensors is an excellent non-toxic alternative to the other available analytical methods. Over the last fifteen years, biosensing technology for nitrate analysis has progressed very well; however, there is a need to expedite the development of nitrate biosensors as a suitable alternative to non-enzymatic techniques through the use of different polymers, nanostructures, mediators and strategies to overcome oxygen interference. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Modelling a suitable location for Urban Solid Waste Management using AHP method and GIS -A geospatial approach and MCDM Model

    NASA Astrophysics Data System (ADS)

    Iqbal, M.; Islam, A.; Hossain, A.; Mustaque, S.

    2016-12-01

    Multi-criteria decision making (MCDM) is an advanced analytical method for reaching an appropriate result or decision in a multiple-criterion environment, and it has become a progressive analytical process for resolving conflicting criteria in current research. Geospatial approaches (e.g., remote sensing and GIS) are another advanced technical means of collecting, processing and analyzing various spatial data at the same time. GIS and remote sensing, together with MCDM techniques, can form an effective platform for solving complex decision-making problems, and this combination is used very effectively in site selection for solid waste management in urban planning. The most popular MCDM technique is weighted linear combination (WLC), while the analytic hierarchy process (AHP) is another popular and consistent technique used worldwide for dependable decision making. The main objective of this study is therefore to develop an AHP model, as an MCDM technique, within a Geographic Information System (GIS) to select a suitable landfill site for urban solid waste management. To protect the urban environment in a sustainable way, municipal waste needs an appropriate landfill site that accounts for the environmental, geological, social and technical aspects of the region. An MCDM model was generated from five criteria classes related to these environmental, geological, social and technical aspects using the AHP method, and the resulting layers were combined in GIS to produce the final suitability map. The final suitable locations cover 12.2% of the area, corresponding to 22.89 km2 of the total study area. The study area is Keraniganj sub-district of Dhaka district in Bangladesh, a densely populated area that currently has an unmanaged waste management system and, especially, lacks suitable landfill sites for waste dumping.
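
    A minimal sketch of the AHP weighting step used in such studies follows: criterion weights are taken from the principal eigenvector of a pairwise comparison matrix, with a consistency check. The comparison values below are invented for illustration, not those of the study.

    ```python
    # AHP: weights from a pairwise comparison matrix plus consistency ratio.
    import numpy as np

    # Saaty-scale pairwise comparisons for 4 criteria (reciprocal matrix)
    A = np.array([[1,   3,   5,   7],
                  [1/3, 1,   3,   5],
                  [1/5, 1/3, 1,   3],
                  [1/7, 1/5, 1/3, 1]])

    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    w = np.abs(vecs[:, i].real)
    w /= w.sum()                              # criterion weights

    n = A.shape[0]
    ci = (vals.real[i] - n) / (n - 1)         # consistency index
    cr = ci / 0.90                            # random index RI = 0.90 for n = 4
    print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))
    ```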

  4. Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment

    ERIC Educational Resources Information Center

    Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James

    2010-01-01

    The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…

  5. Autonomous driving in NMR.

    PubMed

    Perez, Manuel

    2017-01-01

    The automatic analysis of NMR data has been a much-desired endeavour for the last six decades, as is the case with any other analytical technique. This need for automation has only grown as advances in hardware, pulse sequences and automation have opened new research areas to NMR and increased the throughput of data. Fully automatic analysis is a worthy, albeit hard, challenge, but in a world of artificial intelligence, instant communication and big data, it seems that this particular fight is happening with only one technique at a time (be it NMR, MS, IR, UV or any other), when the reality of most laboratories is that several types of analytical instrumentation are present. Data aggregation, verification and elucidation using complementary techniques (e.g. MS and NMR) is a desirable outcome to pursue, although a time-consuming one if performed manually; hence, automation is required to perform the heavy lifting for users and make the approach attractive for scientists. Many of the decisions and workflows that could be implemented under automation will depend on two-way communication with databases that understand analytical data, because it is desirable not only to query these databases but also to grow them in as automatic a manner as possible. How these databases are designed and set up, and how the data inside are classified, will determine what workflows can be implemented. Copyright © 2016 John Wiley & Sons, Ltd.

  6. A study to define an in-flight dynamics measurement and data applications program for space shuttle payloads

    NASA Technical Reports Server (NTRS)

    Rader, W. P.; Barrett, S.; Payne, K. R.

    1975-01-01

    Data measurement and interpretation techniques were defined for application to the first few space shuttle flights, so that the dynamic environment could be sufficiently well established to be used to reduce the cost of future payloads through more efficient design and environmental test techniques. It was concluded that: (1) initial payloads must be given comprehensive instrumentation coverage to obtain detailed definition of acoustics, vibration, and interface loads, (2) analytical models of selected initial payloads must be developed and verified by modal surveys and flight measurements, (3) acoustic tests should be performed on initial payloads to establish realistic test criteria for components and experiments in order to minimize unrealistic failures and retest requirements, (4) permanent data banks should be set up to establish statistical confidence in the data to be used, (5) a more unified design/test specification philosophy is needed, (6) additional work is needed to establish a practical testing technique for simulation of vehicle transients.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, D.W.; Boparai, A.S.; Bowers, D.L.

    This report summarizes the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 2000 (October 1999 through September 2000). This annual progress report, which is the seventeenth in this series for the ACL, describes effort on continuing projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The ACL operates within the ANL system as a full-cost-recovery service center, but it has a mission that includes a complementary research and development component: The Analytical Chemistry Laboratory will provide high-quality, cost-effective chemical analysis and related technical support to solve research problems of our clients--Argonne National Laboratory, the Department of Energy, and others--and will conduct world-class research and development in analytical chemistry and its applications. The ACL handles a wide range of analytical problems that reflects the diversity of research and development (R&D) work at ANL. Some routine or standard analyses are done, but the ACL operates more typically in a problem-solving mode in which development of methods is required or adaptation of techniques is needed to obtain useful analytical data. The ACL works with clients and commercial laboratories if a large number of routine analyses are required. Much of the support work done by the ACL is very similar to applied analytical chemistry research work.

  8. Uranium determination in natural water by the fissiontrack technique

    USGS Publications Warehouse

    Reimer, G.M.

    1975-01-01

    The fission track technique, utilizing the neutron-induced fission of uranium-235, provides a versatile analytical method for the routine analysis of uranium in liquid samples of natural water. A detector is immersed in the sample and both are irradiated. The fission track density observed in the detector is directly proportional to the uranium concentration. The specific advantages of this technique are: (1) only a small quantity of sample, typically 0.1-1 ml, is needed; (2) no sample concentration is necessary; (3) it is capable of providing analyses with a lower reporting limit of 1 μg per liter; and (4) the actual time spent on an analysis can be only a few minutes. This paper discusses and describes the method. © 1975.
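
    Because the track density is directly proportional to uranium concentration, an unknown can be read off against a co-irradiated standard; a tiny worked example with invented numbers:

    ```python
    # Fission-track proportionality: unknown vs co-irradiated standard.
    rho_std = 1200.0      # tracks/cm^2 in the detector for the standard (assumed)
    c_std = 10.0          # standard concentration, ug/L (assumed)
    rho_sample = 870.0    # tracks/cm^2 counted for the unknown sample (assumed)

    c_sample = c_std * rho_sample / rho_std   # direct proportionality
    print(f"uranium concentration ~ {c_sample:.1f} ug/L")
    ```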

  9. Electrodynamic balance-mass spectrometry of single particles as a new platform for atmospheric chemistry research

    NASA Astrophysics Data System (ADS)

    Birdsall, Adam W.; Krieger, Ulrich K.; Keutsch, Frank N.

    2018-01-01

    New analytical techniques are needed to improve our understanding of the intertwined physical and chemical processes that affect the composition of aerosol particles in the Earth's atmosphere, such as gas-particle partitioning and homogeneous or heterogeneous chemistry, and their ultimate relation to air quality and climate. We describe a new laboratory setup that couples an electrodynamic balance (EDB) to a mass spectrometer (MS). The EDB stores a single laboratory-generated particle in an electric field under atmospheric conditions for an arbitrarily long time. The particle is then transferred via gas flow to an ionization region that vaporizes and ionizes the analyte molecules before MS measurement. We demonstrate the feasibility of the technique by tracking the evaporation of polyethylene glycol molecules and finding agreement with a kinetic model. Fitting data to the kinetic model also allows determination of vapor pressures to within a factor of 2. This EDB-MS system can be used to study fundamental chemical and physical processes involving particles that are difficult to isolate and study with other techniques. The results of such measurements can be used to improve our understanding of atmospheric particles.
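
    One common diffusion-limited result that such kinetic fits exploit is the "r-squared law", under which the squared particle radius decays linearly in time with a slope proportional to the vapor pressure. The sketch below recovers a vapor pressure from simulated radius data; the physical constants are assumed placeholders, not values from the paper.

    ```python
    # Vapor pressure from single-particle evaporation via the r^2 law:
    #   d(r^2)/dt = -2 D M p_vap / (rho R T)
    import numpy as np

    R, T = 8.314, 298.0        # gas constant J/(mol K), temperature K
    D = 5e-6                   # vapor diffusivity in air, m^2/s (assumed)
    M = 0.4                    # molar mass, kg/mol (assumed)
    rho = 1100.0               # particle density, kg/m^3 (assumed)

    t = np.linspace(0.0, 2000.0, 20)                             # s
    p_true = 1e-3                                                # Pa
    r2 = (10e-6) ** 2 - 2 * D * M * p_true / (rho * R * T) * t   # simulated data

    slope = np.polyfit(t, r2, 1)[0]            # r^2 decays linearly in time
    p_est = -slope * rho * R * T / (2 * D * M)
    print(f"recovered vapor pressure: {p_est:.2e} Pa")
    ```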

  10. A subjective framework for seat comfort based on a heuristic multi criteria decision making technique and anthropometry.

    PubMed

    Fazlollahtabar, Hamed

    2010-12-01

    Consumer expectations for automobile seat comfort continue to rise. It is therefore evident that the current automobile seat comfort development process, which is only sporadically successful, needs to change. In this context, there has been growing recognition of the need to establish theoretical and methodological foundations for automobile seat comfort. Seat producers, in turn, need to know the comfort consumers require so that they can produce seats based on consumer interests. Current research methodologies apply qualitative approaches because of the anthropometric specifications involved; the most significant weakness of these approaches is the inexactness of the inferences they yield. Despite the qualitative nature of consumer preferences, there are methods for transforming qualitative parameters into numerical values, which could help seat producers improve or enhance their products. This approach would also help automobile manufacturers source seats from the best producer according to consumer opinion. In this paper, a heuristic multi-criteria decision making technique is applied to express consumer preferences as numerical values. The technique combines the analytic hierarchy process (AHP), the entropy method, and the technique for order preference by similarity to an ideal solution (TOPSIS). A case study is conducted to illustrate the applicability and effectiveness of the proposed heuristic approach. Copyright © 2010 Elsevier Ltd. All rights reserved.
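
    To make the TOPSIS step concrete, a minimal sketch follows; the decision matrix, the weights, and the assumption that all criteria are benefit-type are invented for illustration.

    ```python
    # TOPSIS ranking of alternatives; illustrative data only.
    import numpy as np

    X = np.array([[7., 9., 6.],      # alternatives (rows) x criteria (cols)
                  [8., 7., 8.],
                  [6., 8., 9.]])
    w = np.array([0.5, 0.3, 0.2])    # weights, e.g. from AHP and/or entropy

    V = w * X / np.linalg.norm(X, axis=0)     # normalize columns, then weight
    ideal, anti = V.max(axis=0), V.min(axis=0)
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    closeness = d_neg / (d_pos + d_neg)       # higher = closer to the ideal
    print("ranking, best first:", np.argsort(-closeness))
    ```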

  11. Technique for Calculating Solution Derivatives With Respect to Geometry Parameters in a CFD Code

    NASA Technical Reports Server (NTRS)

    Mathur, Sanjay

    2011-01-01

    A solution has been developed to the challenges of computation of derivatives with respect to geometry, which is not straightforward because these are not typically direct inputs to the computational fluid dynamics (CFD) solver. To overcome these issues, a procedure has been devised that can be used without having access to the mesh generator, while still being applicable to all types of meshes. The basic approach is inspired by the mesh motion algorithms used to deform the interior mesh nodes in a smooth manner when the surface nodes, for example, are in a fluid structure interaction problem. The general idea is to model the mesh edges and nodes as constituting a spring-mass system. Changes to boundary node locations are propagated to interior nodes by allowing them to assume their new equilibrium positions, for instance, one where the forces on each node are in balance. The main advantage of the technique is that it is independent of the volumetric mesh generator, and can be applied to structured, unstructured, single- and multi-block meshes. It essentially reduces the problem down to defining the surface mesh node derivatives with respect to the geometry parameters of interest. For analytical geometries, this is quite straightforward. In the more general case, one would need to be able to interrogate the underlying parametric CAD (computer aided design) model and to evaluate the derivatives either analytically, or by a finite difference technique. Because the technique is based on a partial differential equation (PDE), it is applicable not only to forward mode problems (where derivatives of all the output quantities are computed with respect to a single input), but it could also be extended to the adjoint problem, either by using an analytical adjoint of the PDE or a discrete analog.
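
    With uniform spring stiffness, the equilibrium of such a spring-mass network reduces to Laplacian smoothing: each interior node relaxes to the average of its neighbors. The sketch below illustrates this on a tiny structured 2-D mesh after a boundary perturbation (a stand-in for a geometry-parameter change); it is a simplified instance of the general technique, not the article's implementation.

    ```python
    # Spring-analogy mesh deformation via Jacobi (Laplacian) relaxation.
    import numpy as np

    nx, ny = 11, 11
    x, y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
    y_before = y.copy()

    # Perturb the bottom boundary, standing in for a geometry change
    y[0, :] += 0.05 * np.sin(np.pi * x[0, :])

    # Interior nodes settle to the uniform-spring equilibrium
    for _ in range(500):
        y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] +
                                y[1:-1, 2:] + y[1:-1, :-2])

    print("max interior node motion:", np.abs(y - y_before)[1:-1, 1:-1].max())
    ```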

  12. The new statistics: why and how.

    PubMed

    Cumming, Geoff

    2014-01-01

    We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research integrity. These include prespecification of studies whenever possible, avoidance of selection and other inappropriate data-analytic practices, complete reporting, and encouragement of replication. Second, in response to renewed recognition of the severe flaws of null-hypothesis significance testing (NHST), we need to shift from reliance on NHST to estimation and other preferred techniques. The new statistics refers to recommended practices, including estimation based on effect sizes, confidence intervals, and meta-analysis. The techniques are not new, but adopting them widely would be new for many researchers, as well as highly beneficial. This article explains why the new statistics are important and offers guidance for their use. It describes an eight-step new-statistics strategy for research with integrity, which starts with formulation of research questions in estimation terms, has no place for NHST, and is aimed at building a cumulative quantitative discipline.
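
    As a small example of the estimation approach being advocated, the sketch below reports a mean difference with a 95% confidence interval and a standardized effect size (Cohen's d) instead of a bare p-value; the data are simulated.

    ```python
    # Effect-size estimation with a confidence interval, per "new statistics".
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    a = rng.normal(100, 15, 40)      # control group (simulated)
    b = rng.normal(108, 15, 40)      # treatment group (simulated)

    diff = b.mean() - a.mean()
    df = a.size + b.size - 2
    sp = np.sqrt(((a.size - 1) * a.var(ddof=1) +
                  (b.size - 1) * b.var(ddof=1)) / df)   # pooled SD
    se = sp * np.sqrt(1 / a.size + 1 / b.size)
    ci = diff + np.array([-1, 1]) * stats.t.ppf(0.975, df) * se
    d = diff / sp                                        # Cohen's d
    print(f"diff {diff:.1f}, 95% CI [{ci[0]:.1f}, {ci[1]:.1f}], d = {d:.2f}")
    ```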

  13. Macro elemental analysis of food samples by nuclear analytical technique

    NASA Astrophysics Data System (ADS)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analytical technique compared with other detection methods, which makes it applicable for food inspection. The macro elements calcium and potassium are important nutrients required by the human body for optimal physiological function, so the Ca and K content of various foods needs to be determined. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with two other analytical techniques, neutron activation analysis (NAA) and atomic absorption spectrometry (AAS); the comparison served to cross-check the results and to overcome the limitations of the three methods. The results showed that Ca values obtained by EDXRF and AAS were not significantly different (p = 0.9687), and the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; EDXRF can therefore be used as an alternative method for the determination of Ca and K in food samples.
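
    A minimal sketch of such a cross-method comparison (hypothetical paired values; the paper's exact tests are not specified beyond the reported p-values and correlations):

```python
import numpy as np
from scipy import stats

# Hypothetical paired Ca results (mg/100 g) for the same food samples measured
# by EDXRF and AAS; the comparison reduces to a paired significance test plus
# a Pearson correlation between the two methods.
edxrf = np.array([120.1, 98.4, 135.2, 110.7, 127.9])
aas = np.array([119.5, 99.0, 134.8, 111.2, 128.4])

t_stat, p_value = stats.ttest_rel(edxrf, aas)   # H0: no systematic difference
r, _ = stats.pearsonr(edxrf, aas)               # agreement between methods
print(f"paired t-test p = {p_value:.4f}, Pearson r = {r:.4f}")
```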

  14. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques integrate several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE), and magnetic solid phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Particular attention is therefore paid to possible uses of novel, selective stationary phases in extraction techniques, inter alia polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, make it possible to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Advanced Signal Processing Analysis of Laser-Induced Breakdown Spectroscopy Data for the Discrimination of Obsidian Sources

    DTIC Science & Technology

    2012-02-09

    Obsidian artifacts can be matched to different sources [12,13], but the analytical techniques traditionally needed for such analysis (XRD, INAA, and ICP-MS) are time consuming and require expensive instrumentation. This report describes advanced signal processing analysis of laser-induced breakdown spectroscopy (LIBS) data for the discrimination of obsidian sources, using partial least-squares discriminant analysis (PLSDA) with the SIMPLS solving method [33] and a leave-one-sample-out (LOSO) experiment design.
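
    A minimal sketch of PLS-DA with leave-one-sample-out cross-validation (synthetic stand-in spectra; note that scikit-learn's PLSRegression implements a NIPALS-type algorithm rather than the SIMPLS solver cited in the report):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.preprocessing import label_binarize

# Synthetic stand-in for LIBS spectra from three obsidian sources: 30 spectra
# x 200 channels, with a few source-specific emission features added so the
# classes are separable.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 200))
y = np.repeat([0, 1, 2], 10)
X[y == 1, :10] += 1.0
X[y == 2, 10:20] += 1.0
Y = label_binarize(y, classes=[0, 1, 2])   # one-hot targets for PLS-DA

correct = 0
for train, test in LeaveOneOut().split(X):           # LOSO cross-validation
    pls = PLSRegression(n_components=5).fit(X[train], Y[train])
    correct += int(pls.predict(X[test]).argmax() == y[test][0])
print(f"LOSO accuracy: {correct / len(X):.2f}")
```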

  16. Two-dimensional convolute integers for analytical instrumentation

    NASA Technical Reports Server (NTRS)

    Edwards, T. R.

    1982-01-01

    As new analytical instruments and techniques emerge with increased dimensionality, a corresponding need is seen for data processing logic which can appropriately address the data. Two-dimensional measurements reveal enhanced unknown mixture analysis capability as a result of the greater spectral information content over two one-dimensional methods taken separately. It is noted that two-dimensional convolute integers are merely an extension of the work by Savitzky and Golay (1964). It is shown that these low-pass, high-pass, and band-pass digital filters are truly two-dimensional and that they can be applied in a manner identical to their one-dimensional counterpart, that is, as a weighted nearest-neighbor moving average with zero phase shift, using convolute integer (universal number) weighting coefficients.
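
    A sketch of the two-dimensional convolute-integer construction (our own least-squares derivation of the smoothing kernel, following the Savitzky-Golay approach):

```python
import numpy as np
from scipy.ndimage import convolve

def sg2d_kernel(half, order=2):
    """Two-dimensional Savitzky-Golay smoothing kernel: least-squares fit of
    a 2-D polynomial over a (2*half+1)^2 window. The row of the pseudo-inverse
    for the constant term gives the zero-phase moving-average weights."""
    ij = [(i, j) for i in range(order + 1) for j in range(order + 1 - i)]
    offs = np.arange(-half, half + 1)
    A = np.array([[x**i * y**j for (i, j) in ij]
                  for y in offs for x in offs])
    # coefficient of the constant term = fitted value at the window center
    w = np.linalg.pinv(A)[0]
    return w.reshape(2 * half + 1, 2 * half + 1)

noisy = np.random.default_rng(0).normal(size=(64, 64))
smooth = convolve(noisy, sg2d_kernel(half=2, order=2), mode="nearest")
```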

  17. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.

  18. One-calibrant kinetic calibration for on-site water sampling with solid-phase microextraction.

    PubMed

    Ouyang, Gangfeng; Cui, Shufen; Qin, Zhipei; Pawliszyn, Janusz

    2009-07-15

    The existing solid-phase microextraction (SPME) kinetic calibration technique, which uses the desorption of preloaded standards to calibrate the extraction of the analytes, requires that the physicochemical properties of the standard be similar to those of the analyte, which has limited the application of the technique. In this study, a new method, termed the one-calibrant kinetic calibration technique, which can use the desorption of a single standard to calibrate all extracted analytes, was proposed. The theoretical considerations were validated by passive water sampling in the laboratory and rapid water sampling in the field. To mimic environmental variability, such as temperature, turbulence, and analyte concentration, the flow-through system for the generation of standard aqueous polycyclic aromatic hydrocarbon (PAH) solutions was modified. The experimental results of the passive samplings in the flow-through system illustrated that the effect of the environmental variables was successfully compensated by the kinetic calibration technique, and all extracted analytes could be calibrated through the desorption of a single calibrant. On-site water sampling with rotated SPME fibers also illustrated the feasibility of the new technique for rapid on-site sampling of hydrophobic organic pollutants in water. This technique will accelerate the application of the kinetic calibration method and will also be useful for other microextraction techniques.
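
    A compact way to state the kinetic calibration idea is the in-fiber standardization relation (a sketch under the standard assumption that absorption and desorption are isotropic; notation assumed):

```latex
% In-fiber standardization: desorption of the preloaded standard mirrors the
% uptake of the analyte, so a single calibrant can correct all analytes for
% incomplete (pre-equilibrium) extraction.
\[
  \frac{n(t)}{n_e} + \frac{Q(t)}{q_0} = 1
  \qquad\Longrightarrow\qquad
  n_e = \frac{n(t)}{1 - Q(t)/q_0}
\]
% n(t): amount of analyte extracted at time t;  n_e: equilibrium amount;
% Q(t): amount of standard remaining on the fiber;  q_0: preloaded amount.
```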

  19. Therapeutic drug monitoring of flucytosine in serum using a SERS-active membrane system

    NASA Astrophysics Data System (ADS)

    Berger, Adam G.; White, Ian M.

    2017-02-01

    A need exists for near real-time therapeutic drug monitoring (TDM), in particular for antibiotics and antifungals in patient samples at the point-of-care. To truly fit the point-of-care need, techniques must be rapid and easy to use. Here we report a membrane system utilizing inkjet-fabricated surface-enhanced Raman spectroscopy (SERS) sensors that allows sensitive and specific analysis while eliminating sophisticated chromatography equipment, expensive analytical instruments, and other systems relegated to the central lab. We utilize inkjet-fabricated paper SERS sensors as substrates for the detection of flucytosine (5FC); the use of paper-based SERS substrates leverages the natural wicking ability and filtering properties of microporous membranes. We investigate the use of microporous membranes in a vertical flow assay to allow separation of the flucytosine from whole blood. The passive vertical flow assay serves as a valuable method for physical separation of target analytes from complex biological matrices. This work further establishes a platform for easy, sensitive, and specific TDM of 5FC from whole blood.

  20. Inline roasting hyphenated with gas chromatography-mass spectrometry as an innovative approach for assessment of cocoa fermentation quality and aroma formation potential.

    PubMed

    Van Durme, Jim; Ingels, Isabel; De Winne, Ann

    2016-08-15

    Today, the cocoa industry is in great need of faster and more robust analytical techniques to objectively assess incoming cocoa quality. In this work, inline roasting hyphenated with a cooled injection system coupled to a gas chromatograph-mass spectrometer (ILR-CIS-GC-MS) has been explored for the first time to assess the fermentation quality and/or overall aroma formation potential of cocoa. This innovative approach resulted in the in-situ formation of relevant cocoa aroma compounds. After comparison with data obtained by headspace solid phase microextraction (HS-SPME-GC-MS) on conventionally roasted cocoa beans, ILR-CIS-GC-MS data on unroasted cocoa beans showed similar formation trends of important cocoa aroma markers as a function of fermentation quality. The latter approach requires only small aliquots of unroasted cocoa beans, can be automated, requires no sample preparation, needs relatively short analytical times (<1 h) and is highly reproducible. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
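
    A minimal sketch of the kind of model fitting the Standard describes (linear, quadratic, and exponential fits to illustrative time-series data; library choices are ours):

```python
import numpy as np
from scipy.optimize import curve_fit

# Fitting the three model families named in the Standard to illustrative
# time-series data (here, a noisy linear trend).
t = np.arange(20, dtype=float)
y = 5.0 + 0.8 * t + np.random.default_rng(2).normal(0, 1.0, t.size)

lin = np.polyfit(t, y, 1)                         # linear trend
quad = np.polyfit(t, y, 2)                        # quadratic trend
exp_p, _ = curve_fit(lambda t, a, b: a * np.exp(b * t), t, y, p0=(5.0, 0.05))

print("linear slope:", lin[0])
print("quadratic coeffs:", quad)
print("exponential rate:", exp_p[1])
```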

  2. Temperature control of the Mariner class spacecraft - A seven mission summary.

    NASA Technical Reports Server (NTRS)

    Dumas, L. N.

    1973-01-01

    Mariner spacecraft have completed five missions of scientific investigation of the planets. Two additional missions are planned. A description of the thermal design of these seven spacecraft is given herein. The factors which have influenced the thermal design include the mission requirements and constraints, the flight environment, certain programmatic considerations and the experience gained as each mission is completed. These factors are reviewed and the impact of each on thermal design and developmental techniques is assessed. It is concluded that the flight success of these spacecraft indicates that adequate temperature control has been obtained, but that improvements in design data, hardware performance and analytical techniques are needed.

  3. Microextraction by packed sorbent: an emerging, selective and high-throughput extraction technique in bioanalysis.

    PubMed

    Pereira, Jorge; Câmara, José S; Colmsjö, Anders; Abdel-Rehim, Mohamed

    2014-06-01

    Sample preparation is an important analytical step for the isolation and concentration of desired components from complex matrices, and it greatly influences their reliable and accurate analysis and the resulting data quality. It is the most labor-intensive and error-prone process in analytical methodology and may therefore influence the analytical performance of target-analyte quantification. Many conventional sample preparation methods are relatively complicated, involving time-consuming procedures and requiring large volumes of organic solvents. Recent trends in sample preparation include miniaturization, automation, high-throughput performance, on-line coupling with analytical instruments, and low-cost operation through extremely low or no solvent consumption. Microextraction techniques, such as microextraction by packed sorbent (MEPS), have these advantages over the traditional techniques. This paper gives an overview of the MEPS technique, including the role of sample preparation in bioanalysis; a description of MEPS, namely its formats (on- and off-line), sorbents, and experimental protocols; the factors that affect MEPS performance; and the major advantages and limitations of MEPS compared with other sample preparation techniques. We also summarize recent MEPS applications in bioanalysis. Copyright © 2014 John Wiley & Sons, Ltd.

  4. The use of surface-enhanced Raman scattering for detecting molecular evidence of life in rocks, sediments, and sedimentary deposits.

    PubMed

    Bowden, Stephen A; Wilson, Rab; Cooper, Jonathan M; Parnell, John

    2010-01-01

    Raman spectroscopy is a versatile analytical technique capable of characterizing the composition of both inorganic and organic materials. Consequently, it is frequently suggested as a payload on many planetary landers. However, only approximately 1 in every 10^6 photons is Raman scattered; therefore, the detection of trace quantities of an analyte dispersed in a sample matrix can be difficult to achieve. To overcome this, surface-enhanced Raman scattering (SERS) and surface-enhanced resonance Raman scattering (SERRS) both provide greatly enhanced signals (enhancements between 10^5 and 10^9) through the analyte's interaction with locally generated surface plasmons, which occur at a "roughened" or nanostructured metallic surface (e.g., Cu, Au, and Ag). Both SERS and SERRS may therefore provide a viable technique for trace analysis of samples. In this paper, we describe the development of SERS assays for analyzing trace amounts of compounds present in the solvent extracts of sedimentary deposits. These assays were used to detect biological pigments present in an Arctic microoasis (a small locale of elevated biological productivity) and its detrital regolith, characterize the pigmentation of microbial mats around hydrothermal springs, and detect fossil organic matter in hydrothermal deposits. These field study examples demonstrate that SERS technology is sufficiently mature to be applied to many astrobiological analog studies on Earth. Many current and proposed imaging systems intended for remote deployment already possess the instrumental components needed for SERS. The addition of wet chemistry sample processing facilities to these instruments could yield field-deployable analytical instruments with a broadened analytical window for detecting organic compounds with a biological or geological origin.

  5. On-line focusing of flavin derivatives using Dynamic pH junction-sweeping capillary electrophoresis with laser-induced fluorescence detection.

    PubMed

    Britz-McKibbin, Philip; Otsuka, Koji; Terabe, Shigeru

    2002-08-01

    Simple yet effective methods to enhance concentration sensitivity are needed for capillary electrophoresis (CE) to become a practical method for analyzing trace levels of analytes in real samples. In this report, the development of a novel on-line preconcentration technique combining the dynamic pH junction and sweeping modes of focusing is applied to the sensitive and selective analysis of three flavin derivatives: riboflavin, flavin mononucleotide (FMN) and flavin adenine dinucleotide (FAD). Picomolar (pM) detectability of flavins by CE with laser-induced fluorescence (LIF) detection is demonstrated through effective focusing of large sample volumes (up to 22% of the capillary length) using a dual pH junction-sweeping focusing mode. This results in greater than a 1,200-fold improvement in sensitivity relative to conventional injection methods, giving a limit of detection (S/N = 3) of approximately 4.0 pM for FAD and FMN. Flavin focusing is examined in terms of the dependence of analyte mobility on buffer pH, borate complexation, and SDS interaction. Dynamic pH junction-sweeping extends on-line focusing to both neutral (hydrophobic) and weakly acidic (hydrophilic) species and is considered useful in cases where either conventional sweeping or dynamic pH junction techniques used alone are less effective for certain classes of analytes. Enhanced focusing performance by this hyphenated method was demonstrated by a greater than 4-fold reduction in flavin bandwidth, as compared with either sweeping or dynamic pH junction alone, reflected by analyte detector bandwidths <0.20 cm. Novel on-line focusing strategies are required to improve sensitivity in CE, which may be applied toward more effective biochemical analysis methods for diverse types of analytes.

  6. Simulated In Situ Determination of Soil Profile Organic and Inorganic Carbon With LIBS and VisNIR

    NASA Astrophysics Data System (ADS)

    Bricklemyer, R. S.; Brown, D. J.; Clegg, S. M.; Barefield, J. E.

    2008-12-01

    There is a growing need for rapid, accurate, and inexpensive methods to measure and verify soil organic carbon (SOC) change for national greenhouse gas accounting and the development of a soil carbon trading market. Laser Induced Breakdown Spectroscopy (LIBS) and Visible and Near Infrared Spectroscopy (VisNIR) are complementary analytical techniques that have the potential to fill that need. The LIBS method provides precise elemental analysis of soils, but generally cannot distinguish between organic C and inorganic C. VisNIR has been established as a viable technique for measuring soil properties including SOC and inorganic carbon (IC). As part of the Big Sky Carbon Sequestration Regional Partnership, 240 intact core samples (3.8 x 50 cm) were collected from six agricultural fields in north central Montana, USA. Each of these core samples was probed concurrently with LIBS and VisNIR at 2.5, 7.5, 12.5, 17.5, 22.5, 27.5, 35 and 45 cm (+/- 1.5 cm) depths. VisNIR measurements were taken using an Analytical Spectral Devices (ASD, Boulder, CO, USA) Agrispec spectrometer to determine the partition of SOC vs. IC in the samples. The LIBS scans were collected with the LANL LIBS Core Scanner Instrument, which captured the entire 200 - 900 nm plasma emission including the 247.8 nm carbon emission line. This instrument also collected the emission from the elements typically found in inorganic carbon (Ca and Mg) and organic carbon (H, O, and N). Subsamples of soil (~ 4 g) were taken from the interrogation points for laboratory determination of SOC and IC. Using these analytical data, we constructed several full-spectrum multivariate VisNIR/LIBS calibration models for SOC and IC. These models were then applied to independent validation cores for model evaluation; a minimal sketch of this kind of calibration is given below.
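
    The sketch uses PLS regression on synthetic stand-in spectra and SOC values; the study's actual preprocessing and model selection are not specified here:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in: 240 samples x 500 spectral channels, with a "true" SOC
# signal carried by the first few channels plus measurement noise.
rng = np.random.default_rng(0)
spectra = rng.normal(size=(240, 500))
soc = spectra[:, :5].sum(axis=1) + rng.normal(0, 0.1, 240)

# Full-spectrum multivariate calibration, evaluated on held-out cores.
X_cal, X_val, y_cal, y_val = train_test_split(spectra, soc, random_state=0)
pls = PLSRegression(n_components=8).fit(X_cal, y_cal)
print("validation R^2:", pls.score(X_val, y_val))
```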

  7. Experimental and analytical determination of stability parameters for a balloon tethered in a wind

    NASA Technical Reports Server (NTRS)

    Redd, L. T.; Bennett, R. M.; Bland, S. R.

    1973-01-01

    Experimental and analytical techniques for determining stability parameters for a balloon tethered in a steady wind are described. These techniques are applied to a particular 7.64-meter-long balloon, and the results are presented. The stability parameters of interest appear as coefficients in linearized stability equations and are derived from the various forces and moments acting on the balloon. In several cases the results from the experimental and analytical techniques are compared and suggestions are given as to which techniques are the most practical means of determining values for the stability parameters.

  8. Hypersonic airframe structures: Technology needs and flight test requirements

    NASA Technical Reports Server (NTRS)

    Stone, J. E.; Koch, L. C.

    1979-01-01

    Hypersonic vehicles, that may be produced by the year 2000, were identified. Candidate thermal/structural concepts that merit consideration for these vehicles were described. The current status of analytical methods, materials, manufacturing techniques, and conceptual developments pertaining to these concepts were reviewed. Guidelines establishing meaningful technology goals were defined and twenty-eight specific technology needs were identified. The extent to which these technology needs can be satisfied, using existing capabilities and facilities without the benefit of a hypersonic research aircraft, was assessed. The role that a research aircraft can fill in advancing this technology was discussed and a flight test program was outlined. Research aircraft thermal/structural design philosophy was also discussed. Programs, integrating technology advancements with the projected vehicle needs, were presented. Program options were provided to reflect various scheduling and cost possibilities.

  9. Analytical Chemistry: A Literary Approach.

    ERIC Educational Resources Information Center

    Lucy, Charles A.

    2000-01-01

    Provides an anthology of references to descriptions of analytical chemistry techniques from history, popular fiction, and film which can be used to capture student interest and frame discussions of chemical techniques. (WRM)

  10. Advances in analytical technologies for environmental protection and public safety.

    PubMed

    Sadik, O A; Wanekaya, A K; Andreescu, S

    2004-06-01

    Due to the increased threat of chemical and biological agents being used by terrorist organizations, a significant effort is underway to develop tools that can be used to detect and effectively combat chemical and biochemical toxins. In addition to the right mix of policies and the training of medical personnel to recognize the symptoms of biochemical warfare agents, the major success in combating terrorism still lies in prevention, early detection, and an efficient and timely response using reliable analytical technologies and powerful therapies for minimizing the effects in the event of an attack. The public and regulatory agencies expect reliable methodologies and devices for public security. Today's systems are too bulky or slow to meet the "detect-to-warn" needs of first responders such as soldiers and medical personnel. This paper presents the challenges in monitoring technologies for warfare agents and other toxins. It provides an overview of how advances in environmental analytical methodologies could be adapted to design reliable sensors for public safety and environmental surveillance. The paths to designing sensors that meet the needs of today's measurement challenges are analyzed using examples of novel sensors, autonomous cell-based toxicity monitoring, 'Lab-on-a-Chip' devices and conventional environmental analytical techniques. Finally, in order to ensure that the public and legal authorities are provided with quality data to make informed decisions, guidelines are provided for assessing data quality and quality assurance using United States Environmental Protection Agency (US-EPA) methodologies.

  11. Analysis of laser light-scattering interferometric devices for in-line diagnostics of moving particles

    NASA Astrophysics Data System (ADS)

    Naqwi, Amir A.; Durst, Franz

    1993-07-01

    Dual-beam laser measuring techniques are now being used, not only for velocimetry, but also for simultaneous measurements of particle size and velocity in particulate two-phase flows. However, certain details of these optical techniques, such as the effect of Gaussian beam profiles on the accuracy of the measurements, need to be further explored. To implement innovative improvements, a general analytic framework is needed in which performances of various dual-beam instruments could be quantitatively studied and compared. For this purpose, the analysis of light scattering in a generalized dual-wave system is presented in this paper. The present simulation model provides a basis for studying effects of nonplanar beam structures of incident waves, taking into account arbitrary modes of polarization. A polarizer is included in the receiving optics as well. The peculiar aspects of numerical integration of scattered light over circular, rectangular, and truncated circular apertures are also considered.

  12. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies.

    PubMed

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-10-24

    In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have reached detection limits within the pM and nM ranges. Most of these developments have proved their suitability for detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies still lags behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is strongly affected by the sample matrix: the developed analytical nanosystems may fail in real samples because of the negative influence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. A survey of nested grid techniques and their potential for use within the MASS weather prediction model

    NASA Technical Reports Server (NTRS)

    Koch, Steven E.; Mcqueen, Jeffery T.

    1987-01-01

    A survey of various one- and two-way interactive nested grid techniques used in hydrostatic numerical weather prediction models is presented and the advantages and disadvantages of each method are discussed. The techniques for specifying the lateral boundary conditions for each nested grid scheme are described in detail. Averaging and interpolation techniques used when applying the coarse mesh grid (CMG) and fine mesh grid (FMG) interface conditions during two-way nesting are discussed separately. The survey shows that errors are commonly generated at the boundary between the CMG and FMG due to boundary formulation or specification discrepancies. Methods used to control this noise include application of smoothers, enhanced diffusion, or damping-type time integration schemes to model variables. The results from this survey provide the information needed to decide which one-way and two-way nested grid schemes merit future testing with the Mesoscale Atmospheric Simulation System (MASS) model. An analytically specified baroclinic wave will be used to conduct systematic tests of the chosen schemes since this will allow for objective determination of the interfacial noise in the kind of meteorological setting for which MASS is designed. Sample diagnostic plots from initial tests using the analytic wave are presented to illustrate how the model-generated noise is ascertained. These plots will be used to compare the accuracy of the various nesting schemes when incorporated into the MASS model.

  14. Earth materials research: Report of a Workshop on Physics and Chemistry of Earth Materials

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The report concludes that an enhanced effort of earth materials research is necessary to advance the understanding of the processes that shape the planet. In support of such an effort, there are new classes of experiments, new levels of analytical sensitivity and precision, and new levels of theory that are now applicable in understanding the physical and chemical properties of geological materials. The application of these capabilities involves the need to upgrade and make greater use of existing facilities as well as the development of new techniques. A concomitant need is for a sample program involving their collection, synthesis, distribution, and analysis.

  15. Analytical Applications of Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    Guell, Oscar A.; Holcombe, James A.

    1990-01-01

    Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
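
    A minimal example of the Monte Carlo integration idea (synthetic integrand; the article itself covers a broader range of applications such as sampling, annealing, and explicit simulation):

```python
import numpy as np

# Classic Monte Carlo quadrature: estimate the definite integral of f on
# [0, 1] from N uniform random samples; the standard error shrinks as
# 1/sqrt(N).
rng = np.random.default_rng(0)
f = lambda x: np.exp(-x**2)
N = 100_000
samples = f(rng.uniform(0.0, 1.0, N))
estimate = samples.mean()
stderr = samples.std(ddof=1) / np.sqrt(N)
print(f"integral ~ {estimate:.5f} +/- {stderr:.5f}")  # exact: 0.74682...
```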

  16. Evaluation of new laser spectrometer techniques for in-situ carbon monoxide measurements

    NASA Astrophysics Data System (ADS)

    Zellweger, C.; Steinbacher, M.; Buchmann, B.

    2012-10-01

    Long-term time series of the atmospheric composition are essential for environmental research and thus require compatible, multi-decadal monitoring activities. The current data quality objectives of the World Meteorological Organization (WMO) for carbon monoxide (CO) in the atmosphere are very challenging to meet with the measurement techniques that have been used until recently. During the past few years, new spectroscopic techniques came to market with promising properties for trace gas analytics. The current study compares three instruments that have recently become commercially available (since 2011) with the best currently available technique (Vacuum UV Fluorescence) and provides a link to previous comparison studies. The instruments were investigated for their performance regarding repeatability, reproducibility, drift, temperature dependence, water vapour interference and linearity. Finally, all instruments were examined during a short measurement campaign to assess their applicability for long-term field measurements. It could be shown that the new techniques perform considerably better compared to previous techniques, although some issues, such as temperature influence and cross sensitivities, need further attention.

  17. Thermoelectrically cooled water trap

    DOEpatents

    Micheels, Ronald H [Concord, MA]

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples prior to analysis of these samples for chemical composition by a variety of analytical techniques in which water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry, and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas monitoring instrumentation.

  18. Screening of synthetic PDE-5 inhibitors and their analogues as adulterants: analytical techniques and challenges.

    PubMed

    Patel, Dhavalkumar Narendrabhai; Li, Lin; Kee, Chee-Leong; Ge, Xiaowei; Low, Min-Yong; Koh, Hwee-Ling

    2014-01-01

    The popularity of phosphodiesterase type 5 (PDE-5) enzyme inhibitors for the treatment of erectile dysfunction has led to an increase in the prevalence of illicit sexual performance enhancement products. PDE-5 inhibitors, namely sildenafil, tadalafil and vardenafil, and their unapproved designer analogues are being increasingly used as adulterants in herbal products and health supplements marketed for sexual performance enhancement. To date, more than 50 unapproved analogues of prescription PDE-5 inhibitors have been reported as adulterants in the literature. To avoid detection of such adulteration by standard screening protocols, the perpetrators of such illegal products are investing time and resources to synthesize exotic analogues and devise novel means of adulteration. A comprehensive review of conventional and advanced analytical techniques to detect and characterize the adulterants is presented. The rapid identification and structural elucidation of unknown analogues as adulterants is greatly enhanced by the myriad of analytical techniques employed, including high performance liquid chromatography (HPLC), gas chromatography-mass spectrometry (GC-MS), liquid chromatography-mass spectrometry (LC-MS), nuclear magnetic resonance (NMR) spectroscopy, vibrational spectroscopy, liquid chromatography-Fourier transform ion cyclotron resonance-mass spectrometry (LC-FT-ICR-MS), liquid chromatography-hybrid triple quadrupole linear ion trap mass spectrometry with information-dependent acquisition, ultra high performance liquid chromatography-time of flight-mass spectrometry (UHPLC-TOF-MS), ion mobility spectrometry (IMS) and immunoassay methods. The many challenges in detecting and characterizing such adulterants, and the need for a concerted effort to curb adulteration in order to safeguard public safety and interest, are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Application of stochastic multiattribute analysis to assessment of single walled carbon nanotube synthesis processes.

    PubMed

    Canis, Laure; Linkov, Igor; Seager, Thomas P

    2010-11-15

    The unprecedented uncertainty associated with engineered nanomaterials greatly expands the need for research regarding their potential environmental consequences. However, decision-makers such as regulatory agencies, product developers, or other nanotechnology stakeholders may not find the results of such research directly informative of decisions intended to mitigate environmental risks. To help interpret research findings and prioritize new research needs, there is an acute need for structured decision-analytic aids that are operable in a context of extraordinary uncertainty. Whereas existing stochastic decision-analytic techniques explore uncertainty only in decision-maker preference information, this paper extends model uncertainty to technology performance. As an illustrative example, the framework is applied to the case of single-wall carbon nanotubes. Four different synthesis processes (arc, high pressure carbon monoxide, chemical vapor deposition, and laser) are compared based on five salient performance criteria. A probabilistic rank ordering of preferred processes is determined using outranking normalization and a linear-weighted sum for different weighting scenarios including completely unknown weights and four fixed-weight sets representing hypothetical stakeholder views. No single process pathway dominates under all weight scenarios, but it is likely that some inferior process technologies could be identified as low priorities for further research.
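
    A sketch of the stochastic weight treatment under completely unknown weights (hypothetical scores; the paper's outranking normalization step is replaced here by pre-normalized values):

```python
import numpy as np

# With criteria weights unknown, sample weight vectors uniformly from the
# simplex (Dirichlet(1,...,1)) and record how often each synthesis process
# ranks first under a linear-weighted sum. Scores are hypothetical
# normalized performance values on five criteria (higher = better).
scores = np.array([[0.7, 0.2, 0.9, 0.4, 0.6],   # arc
                   [0.5, 0.8, 0.4, 0.7, 0.5],   # HiPco
                   [0.6, 0.6, 0.5, 0.8, 0.7],   # CVD
                   [0.8, 0.3, 0.7, 0.3, 0.4]])  # laser

rng = np.random.default_rng(0)
W = rng.dirichlet(np.ones(scores.shape[1]), size=10_000)
winners = (W @ scores.T).argmax(axis=1)
p_first = np.bincount(winners, minlength=len(scores)) / len(W)
print(dict(zip(["arc", "HiPco", "CVD", "laser"], p_first.round(3))))
```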

  20. Evaluation of selected methods for determining streamflow during periods of ice effect

    USGS Publications Warehouse

    Melcher, Norwood B.; Walker, J.F.

    1992-01-01

    Seventeen methods for estimating ice-affected streamflow are evaluated for potential use with the U.S. Geological Survey streamflow-gaging station network. The methods evaluated were identified by written responses from U.S. Geological Survey field offices and by a comprehensive literature search. The methods selected and the techniques used for applying the methods are described in this report. The methods are evaluated by comparing estimated results with data collected at three streamflow-gaging stations in Iowa during the winter of 1987-88. Discharge measurements were obtained at 1- to 5-day intervals during the ice-affected periods at the three stations to define an accurate baseline record. Discharge records were compiled for each method based on data available, assuming a 6-week field schedule. The methods are classified into two general categories, subjective and analytical, depending on whether individual judgment is necessary for method application. On the basis of the results of the evaluation for the three Iowa stations, two of the subjective methods (discharge ratio and hydrographic-and-climatic comparison) were more accurate than the other subjective methods and approximately as accurate as the best analytical method. Three of the analytical methods (index velocity, adjusted rating curve, and uniform flow) could potentially be used at streamflow-gaging stations where the need for accurate ice-affected discharge estimates justifies the expense of collecting additional field data. One analytical method (ice-adjustment factor) may be appropriate for use at stations with extremely stable stage-discharge ratings and measuring sections. Further research is needed to refine the analytical methods. The discharge-ratio and multiple-regression methods produce estimates of streamflow for varying ice conditions using information obtained from the existing U.S. Geological Survey streamflow-gaging network.
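
    A sketch of the discharge-ratio idea as we read it from its description (hypothetical numbers; the USGS procedure has additional refinements):

```python
import numpy as np

# At each winter measurement, compute K = measured discharge / open-water
# rating discharge, interpolate K in time between measurements, and apply it
# to the continuous rating-derived record to estimate ice-affected flow.
meas_days = np.array([0, 35, 70])             # days of field measurements
K_meas = np.array([1.0, 0.55, 0.70])          # measured/rating ratios

days = np.arange(0, 71)
Q_rating = 100.0 - 0.3 * days                 # hypothetical rating output
K = np.interp(days, meas_days, K_meas)        # interpolated ice correction
Q_est = K * Q_rating                          # ice-affected discharge estimate
```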

  1. Accuracy of selected techniques for estimating ice-affected streamflow

    USGS Publications Warehouse

    Walker, John F.

    1991-01-01

    This paper compares the accuracy of selected techniques for estimating streamflow during ice-affected periods. The techniques are classified into two categories, subjective and analytical, depending on the degree of judgment required. Discharge measurements were made at three streamflow-gauging sites in Iowa during the 1987-88 winter and used to establish a baseline streamflow record for each site. Using data based on a simulated six-week field-trip schedule, selected techniques were used to estimate discharge during the ice-affected periods. For the subjective techniques, three hydrographers independently compiled each record. Three measures of performance are used to compare the estimated streamflow records with the baseline streamflow records: the average discharge for the ice-affected period, and the mean and standard deviation of the daily errors. Based on average ranks for the three performance measures and the three sites, the analytical and subjective techniques are essentially comparable. For two of the three sites, Kruskal-Wallis one-way analysis of variance detects significant differences among the three hydrographers for the subjective methods, indicating that the subjective techniques are less consistent than the analytical techniques. The results suggest that analytical techniques may be viable tools for estimating discharge during periods of ice effect, and should be developed further and evaluated for sites across the United States.

  2. Experimental issues related to frequency response function measurements for frequency-based substructuring

    NASA Astrophysics Data System (ADS)

    Nicgorski, Dana; Avitabile, Peter

    2010-07-01

    Frequency-based substructuring is a very popular approach for the generation of system models from component measured data. Analytically, the approach has been shown to produce accurate results. However, implementation with actual test data can cause difficulties and degrade the accuracy of the system response prediction. In order to produce good results, extreme care is needed in the measurement of the drive point and transfer impedances of the structure, and all the conditions for a linear time-invariant system must be observed. Several studies have been conducted to show the sensitivity of the technique to the small variations that often occur during typical testing of structures. These variations have been observed in actual tested configurations and have been substantiated with analytical models built to replicate the problems typically encountered. The use of analytically simulated issues helps to clearly isolate the effects of typical measurement difficulties often observed in test data. This paper presents some of these common problems and provides guidance and recommendations for data to be used in this modeling approach.

  3. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    NASA Astrophysics Data System (ADS)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. The platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  4. Analytical N beam position monitor method

    NASA Astrophysics Data System (ADS)

    Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.

    2017-11-01

    Measurement and correction of focusing errors is of great importance for the performance and machine protection of circular accelerators. Furthermore, the LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also placed on the speed of optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β-function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.
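
    For reference, the established three-BPM relation that the analytical N-BPM method builds on can be written as follows (a sketch; notation and model-scaling convention assumed from the beta-from-phase literature):

```latex
% Three-BPM beta-from-phase: the measured beta function at BPM 1 follows from
% the measured phase advances phi_12 and phi_13 to the next two BPMs, scaled
% by the design model. The N-BPM method generalizes this to many BPM
% combinations with analytic propagation of random and systematic errors.
\[
  \beta_1^{\mathrm{meas}}
    = \beta_1^{\mathrm{model}}\,
      \frac{\cot\varphi_{12}^{\mathrm{meas}} - \cot\varphi_{13}^{\mathrm{meas}}}
           {\cot\varphi_{12}^{\mathrm{model}} - \cot\varphi_{13}^{\mathrm{model}}}
\]
```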

  5. Analytical methods in multivariate highway safety exposure data estimation

    DOT National Transportation Integrated Search

    1984-01-01

    Three general analytical techniques which may be of use in extending, enhancing, and combining highway accident exposure data are discussed. The techniques are log-linear modelling, iterative proportional fitting, and the expectation maximization algorithm.
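
    Of these, iterative proportional fitting is the most self-contained to illustrate; a minimal two-way sketch follows (hypothetical margins):

```python
import numpy as np

def ipf(seed, row_targets, col_targets, tol=1e-9, max_iter=1000):
    """Iterative proportional fitting: rescale a seed table until its margins
    match the target row and column totals (two-way case)."""
    T = seed.astype(float).copy()
    for _ in range(max_iter):
        T *= (row_targets / T.sum(axis=1))[:, None]   # match row margins
        T *= col_targets / T.sum(axis=0)              # match column margins
        if np.allclose(T.sum(axis=1), row_targets, atol=tol):
            return T
    return T

seed = np.array([[1.0, 2.0], [3.0, 4.0]])
print(ipf(seed, row_targets=np.array([30.0, 70.0]),
          col_targets=np.array([40.0, 60.0])))
```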

  6. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  7. Review of the quantification techniques for polycyclic aromatic hydrocarbons (PAHs) in food products.

    PubMed

    Bansal, Vasudha; Kumar, Pawan; Kwon, Eilhann E; Kim, Ki-Hyun

    2017-10-13

    There is a growing need for accurate detection of trace-level PAHs in food products due to the numerous detrimental effects caused by their contamination (e.g., toxicity, carcinogenicity, and teratogenicity). This review discusses up-to-date knowledge of the measurement techniques available for PAHs contained in food and related products, with the aim of helping to reduce their deleterious impacts on human health through accurate quantification. The main part of the review is dedicated to the opportunities and practical options for the treatment of various food samples and for the accurate quantification of PAHs contained in those samples. Basic information regarding all available analytical measurement techniques for PAHs in food samples is also evaluated with respect to their performance in terms of quality assurance.

  8. An assessment of the Nguyen and Pinder method for slug test analysis. [In situ estimates of ground water contamination

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, J.J. Jr.; Hyder, Z.

    The Nguyen and Pinder method is one of four techniques commonly used for analysis of response data from slug tests. Limited field research has raised questions about the reliability of the parameter estimates obtained with this method. A theoretical evaluation of this technique reveals that errors were made in the derivation of the analytical solution upon which the technique is based. Simulation and field examples show that the errors result in parameter estimates that can differ from actual values by orders of magnitude. These findings indicate that the Nguyen and Pinder method should no longer be a tool in the repertoire of the field hydrogeologist. If data from a slug test performed in a partially penetrating well in a confined aquifer need to be analyzed, recent work has shown that the Hvorslev method is the best alternative among the commonly used techniques.

  9. Techniques for Forecasting Air Passenger Traffic

    NASA Technical Reports Server (NTRS)

    Taneja, N.

    1972-01-01

    The basic techniques of forecasting the air passenger traffic are outlined. These techniques can be broadly classified into four categories: judgmental, time-series analysis, market analysis and analytical. The differences between these methods exist, in part, due to the degree of formalization of the forecasting procedure. Emphasis is placed on describing the analytical method.

  10. Bioassays as one of the Green Chemistry tools for assessing environmental quality: A review.

    PubMed

    Wieczerzak, M; Namieśnik, J; Kudłak, B

    2016-09-01

    For centuries, mankind has contributed to irreversible environmental changes, but due to the modern science of recent decades, scientists are able to assess the scale of this impact. The introduction of laws and standards to ensure environmental cleanliness requires comprehensive environmental monitoring, which should also meet the requirements of Green Chemistry. The broad spectrum of Green Chemistry principle applications should also include all of the techniques and methods of pollutant analysis and environmental monitoring. The classical methods of chemical analyses do not always match the twelve principles of Green Chemistry, and they are often expensive and employ toxic and environmentally unfriendly solvents in large quantities. These solvents can generate hazardous and toxic waste while consuming large volumes of resources. Therefore, there is a need to develop reliable techniques that would not only meet the requirements of Green Analytical Chemistry, but they could also complement and sometimes provide an alternative to conventional classical analytical methods. These alternatives may be found in bioassays. Commercially available certified bioassays often come in the form of ready-to-use toxkits, and they are easy to use and relatively inexpensive in comparison with certain conventional analytical methods. The aim of this study is to provide evidence that bioassays can be a complementary alternative to classical methods of analysis and can fulfil Green Analytical Chemistry criteria. The test organisms discussed in this work include single-celled organisms, such as cell lines, fungi (yeast), and bacteria, and multicellular organisms, such as invertebrate and vertebrate animals and plants. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. An analytical study of reduced-gravity liquid reorientation using a simplified marker and cell technique

    NASA Technical Reports Server (NTRS)

    Betts, W. S., Jr.

    1972-01-01

    A computer program called HOPI was developed to predict reorientation flow dynamics, wherein liquids move from one end of a closed, partially filled, rigid container to the other end under the influence of container acceleration. The program uses the simplified marker and cell numerical technique and, using explicit finite-differencing, solves the Navier-Stokes equations for an incompressible viscous fluid. The effects of turbulence are also simulated in the program. HOPI can consider curved as well as straight-walled boundaries, and both free-surface and confined flows can be calculated. The program was used to simulate five liquid reorientation cases: three simulated actual NASA LeRC drop tower test conditions, while two simulated full-scale Centaur tank conditions. It was concluded that while HOPI can be used to analytically determine the fluid motion in a typical settling problem, there is a current need to optimize it, both by reducing the computer time and by reducing the core storage required for a problem of a given size.

  12. Simultaneous analysis of heparan sulfate, chondroitin/dermatan sulfates, and hyaluronan disaccharides by glycoblotting-assisted sample preparation followed by single-step zwitter-ionic-hydrophilic interaction chromatography.

    PubMed

    Takegawa, Yasuhiro; Araki, Kayo; Fujitani, Naoki; Furukawa, Jun-ichi; Sugiyama, Hiroaki; Sakai, Hideaki; Shinohara, Yasuro

    2011-12-15

    Glycosaminoglycans (GAGs) play important roles in cell adhesion and growth, maintenance of extracellular matrix (ECM) integrity, and signal transduction. To fully understand the biological functions of GAGs, there is a growing need for sensitive, rapid, and quantitative analysis of GAGs. The present work describes a novel analytical technique that enables high throughput cellular/tissue glycosaminoglycomics for all three families of uronic acid-containing GAGs, hyaluronan (HA), chondroitin sulfate (CS)/dermatan sulfate (DS), and heparan sulfate (HS). A one-pot purification and labeling procedure for GAG Δ-disaccharides was established by chemo-selective ligation of disaccharides onto high density hydrazide beads (glycoblotting) and subsequent labeling by fluorescence. The 17 most common disaccharides (eight comprising HS, eight CS/DS, and one comprising HA) could be separated with a single chromatography for the first time by employing a zwitter-ionic type of hydrophilic-interaction chromatography column. These novel analytical techniques were able to precisely characterize the glycosaminoglycome in various cell types including embryonal carcinoma cells and ocular epithelial tissues (cornea, conjunctiva, and limbus).

  13. Toward improved understanding and control in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hieftje, Gary M.

    1989-01-01

    As with most papers which attempt to predict the future, this treatment will begin with a coverage of past events. It will be shown that progress in the field of analytical atomic spectrometry has occurred through a series of steps which involve the addition of new techniques and the occasional displacement of established ones. Because it is difficult or impossible to presage true breakthroughs, this manuscript will focus on how such existing methods can be modified or improved to greatest advantage. The thesis will be that rational improvement can be accomplished most effectively by understanding fundamentally the nature of an instrumental system, a measurement process, and a spectrometric technique. In turn, this enhanced understanding can lead to closer control, from which can spring improved performance. Areas where understanding is now lacking and where control is most greatly needed will be identified and a possible scheme for implementing control procedures will be outlined. As we draw toward the new millennium, these novel procedures seem particularly appealing; new high-speed computers, the availability of expert systems, and our enhanced understanding of atomic spectrometric events combine to make future prospects extremely bright.

  14. MRI of human hair.

    PubMed

    Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael

    2009-06-01

    Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.
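
    The T2* estimate comes from fitting the echo-time decay; a minimal log-linear sketch (synthetic echo series, values assumed):

```python
import numpy as np

# Log-linear fit of a multi-echo decay S(TE) = S0 * exp(-TE / T2*), the kind
# of quantitative evaluation described (synthetic echo series).
TE = np.array([0.5, 1.0, 2.0, 3.0, 4.0])   # echo times, ms
S = 90.0 * np.exp(-TE / 2.6) \
    * (1 + np.random.default_rng(0).normal(0, 0.01, TE.size))

slope, intercept = np.polyfit(TE, np.log(S), 1)
print(f"T2* ~ {-1.0 / slope:.2f} ms, S0 ~ {np.exp(intercept):.1f}")
```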

  15. Strategies for Distinguishing Abiotic Chemistry from Martian Biochemistry in Samples Returned from Mars

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Burton, A. S.; Callahan, M. P.; Elsila, J. E.; Stern, J. C.; Dworkin, J. P.

    2012-01-01

    A key goal in the search for evidence of extinct or extant life on Mars will be the identification of chemical biosignatures including complex organic molecules common to all life on Earth. These include amino acids, the monomer building blocks of proteins and enzymes, and nucleobases, which serve as the structural basis of information storage in DNA and RNA. However, many of these organic compounds can also be formed abiotically as demonstrated by their prevalence in carbonaceous meteorites [1]. Therefore, an important challenge in the search for evidence of life on Mars will be distinguishing between abiotic chemistry of either meteoritic or martian origin from any chemical biosignatures from an extinct or extant martian biota. Although current robotic missions to Mars, including the 2011 Mars Science Laboratory (MSL) and the planned 2018 ExoMars rovers, will have the analytical capability needed to identify these key classes of organic molecules if present [2,3], return of a diverse suite of martian samples to Earth would allow for much more intensive laboratory studies using a broad array of extraction protocols and state-of-the-art analytical techniques for bulk and spatially resolved characterization, molecular detection, and isotopic and enantiomeric compositions that may be required for unambiguous confirmation of martian life. Here we will describe current state-of-the-art laboratory analytical techniques that have been used to characterize the abundance and distribution of amino acids and nucleobases in meteorites, Apollo samples, and comet-exposed materials returned by the Stardust mission with an emphasis on their molecular characteristics that can be used to distinguish abiotic chemistry from biochemistry as we know it. The study of organic compounds in carbonaceous meteorites is highly relevant to Mars sample return analysis, since exogenous organic matter should have accumulated in the martian regolith over the last several billion years and the analytical techniques previously developed for the study of extraterrestrial materials can be applied to martian samples.

  16. CLOSED-LOOP STRIPPING ANALYSIS (CLSA) OF ...

    EPA Pesticide Factsheets

    Synthetic musk compounds have been found in surface water, fish tissues, and human breast milk. Current techniques for separating these compounds from fish tissues require tedious sample clean-up procedures. A simple method for the determination of these compounds in fish tissues has been developed. Closed-loop stripping of saponified fish tissues in a 1-L Wheaton purge-and-trap vessel is used to strip compounds with high vapor pressures, such as synthetic musks, from the matrix onto a solid sorbent (Abselut Nexus). This technique is useful for screening biological tissues that contain lipids for musk compounds. Analytes are desorbed from the sorbent trap sequentially with polar and nonpolar solvents, concentrated, and directly analyzed by high resolution gas chromatography coupled to a mass spectrometer operating in the selected ion monitoring mode. In this paper, we analyzed two homogenized samples of whole fish tissues spiked with synthetic musk compounds using closed-loop stripping analysis (CLSA) and pressurized liquid extraction (PLE). The analytes were not recovered quantitatively, but the extraction yield was sufficiently reproducible for at least semi-quantitative purposes (screening). The method was less expensive to implement and required significantly less sample preparation than the PLE technique. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water,

  17. Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.

    PubMed

    Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A

    2018-01-01

    Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis commonly include identifying weaknesses of opposing teams and assessing the performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization to depict, e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.

  18. Development of techniques for advanced optical contamination measurement with internal reflection spectroscopy, phase 1, volume 1

    NASA Technical Reports Server (NTRS)

    Hayes, J. D.

    1972-01-01

    The feasibility of monitoring volatile contaminants in a large space simulation chamber using techniques of internal reflection spectroscopy was demonstrated analytically and experimentally. The infrared spectral region was selected as the operational spectral range in order to provide unique identification of the contaminants along with sufficient sensitivity to detect trace contaminant concentrations. It was determined theoretically that a monolayer of the contaminants could be detected and identified using optimized experimental procedures. This ability was verified experimentally. Procedures were developed to correct the attenuated total reflectance spectra for thick sample distortion. However, by using two different element designs the need for such correction can be avoided.

  19. Pooling sheep faecal samples for the assessment of anthelmintic drug efficacy using McMaster and Mini-FLOTAC in gastrointestinal strongyle and Nematodirus infection.

    PubMed

    Kenyon, Fiona; Rinaldi, Laura; McBean, Dave; Pepe, Paola; Bosco, Antonio; Melville, Lynsey; Devin, Leigh; Mitchell, Gillian; Ianniello, Davide; Charlier, Johannes; Vercruysse, Jozef; Cringoli, Giuseppe; Levecke, Bruno

    2016-07-30

    In small ruminants, faecal egg counts (FECs) and reduction in FECs (FECR) are the most common methods for the assessment of the intensity of gastrointestinal (GI) nematode infections and anthelmintic drug efficacy, respectively. The main limitation of these methods is the time and cost to conduct FECs on a representative number of individual animals. A cost-saving alternative would be to examine pooled faecal samples; however, little is known regarding whether pooling can give representative results. In the present study, we compared the FECR results obtained by both an individual and a pooled examination strategy across different pool sizes and analytical sensitivities of the FEC techniques. A survey was conducted on 5 sheep farms in Scotland, where anthelmintic resistance is known to be widespread. Lambs were treated with fenbendazole (4 groups), levamisole (3 groups), ivermectin (3 groups) or moxidectin (1 group). For each group, individual faecal samples were collected from 20 animals, at baseline (D0) and 14 days after (D14) anthelmintic administration. Faecal samples were analyzed as pools of 3-5, 6-10, and 14-20 individual samples. Both individual and pooled samples were screened for GI strongyle and Nematodirus eggs using two FEC techniques with three different levels of analytical sensitivity, including Mini-FLOTAC (analytical sensitivity of 10 eggs per gram of faeces (EPG)) and McMaster (analytical sensitivity of 15 or 50 EPG). For both Mini-FLOTAC and McMaster (analytical sensitivity of 15 EPG), there was a perfect agreement in classifying the efficacy of the anthelmintic as 'normal', 'doubtful' or 'reduced' regardless of pool size. When using the McMaster method (analytical sensitivity of 50 EPG), anthelmintic efficacy was often falsely classified as 'normal' or assessment was not possible due to zero FECs at D0, and this became more pronounced when the pool size increased. In conclusion, pooling ovine faecal samples holds promise as a cost-saving and efficient strategy for assessing GI nematode FECR. However, for the assessment of FECR one will need to consider the baseline FEC, pool size and analytical sensitivity of the method. Copyright © 2016. Published by Elsevier B.V.
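
    As a rough illustration of the efficacy arithmetic behind these classifications, the sketch below (Python) computes the percentage reduction in faecal egg counts between D0 and D14 from group mean EPGs and applies WAAVP-style cut-offs; the thresholds and counts are illustrative assumptions, not values taken from this study.

        def fecr(mean_epg_d0, mean_epg_d14):
            """Percentage faecal egg count reduction (FECR) from D0 to D14."""
            if mean_epg_d0 == 0:
                raise ValueError("Zero baseline FEC: efficacy cannot be assessed.")
            return 100.0 * (1.0 - mean_epg_d14 / mean_epg_d0)

        def classify(reduction):
            # WAAVP-style cut-offs, assumed here purely for illustration
            if reduction >= 95.0:
                return "normal"
            if reduction < 90.0:
                return "reduced"
            return "doubtful"

        r = fecr(mean_epg_d0=450.0, mean_epg_d14=60.0)  # e.g. one pool of lambs
        print(f"FECR = {r:.1f}% -> {classify(r)}")      # FECR = 86.7% -> reduced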

  20. An Example of a Hakomi Technique Adapted for Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Collis, Peter

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a model of therapy that lends itself to integration with other therapy models. This paper aims to provide an example to assist others in assimilating techniques from other forms of therapy into FAP. A technique from the Hakomi Method is outlined and modified for FAP. As, on the whole, psychotherapy…

  1. Investigation of the feasibility of an analytical method of accounting for the effects of atmospheric drag on satellite motion

    NASA Technical Reports Server (NTRS)

    Bozeman, Robert E.

    1987-01-01

    An analytic technique for accounting for the joint effects of Earth oblateness and atmospheric drag on close-Earth satellites is investigated. The technique is analytic in the sense that explicit solutions to the Lagrange planetary equations are given; consequently, no numerical integrations are required in the solution process. The atmospheric density in the technique described is represented by a rotating spherical exponential model with superposed effects of the oblate atmosphere and the diurnal variations. A computer program implementing the process is discussed and sample output is compared with output from program NSEP (Numerical Satellite Ephemeris Program). NSEP uses a numerical integration technique to account for atmospheric drag effects.
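
    The core of such a density model is the exponential law rho(h) = rho0 * exp(-(h - h0)/H). A minimal Python sketch with illustrative reference values (the model described above additionally superposes rotation, atmospheric oblateness and diurnal-variation effects on this term):

        import math

        def density(h_km, rho0=3.614e-13, h0_km=700.0, scale_height_km=88.667):
            """Atmospheric density (kg/m^3); reference values are placeholders."""
            return rho0 * math.exp(-(h_km - h0_km) / scale_height_km)

        print(f"{density(400.0):.3e} kg/m^3")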

  2. Current and future technology in radial and axial gas turbines

    NASA Technical Reports Server (NTRS)

    Rohlik, H. E.

    1983-01-01

    Design approaches and flow analysis techniques currently employed by aircraft engine manufacturers are assessed. Studies were performed to define the characteristics of aircraft and engines for civil missions of the 1990's and beyond. These studies, coupled with experience in recent years, identified the critical technologies needed to meet long range goals in fuel economy and other operating costs. Study results, recent and current research and development programs, and an estimate of future design and analytic capabilities are discussed.

  3. Structural Glycomic Analyses at High Sensitivity: A Decade of Progress

    NASA Astrophysics Data System (ADS)

    Alley, William R.; Novotny, Milos V.

    2013-06-01

    The field of glycomics has recently advanced in response to the urgent need for structural characterization and quantification of complex carbohydrates in biologically and medically important applications. The recent success of analytical glycobiology at high sensitivity reflects numerous advances in biomolecular mass spectrometry and its instrumentation, capillary and microchip separation techniques, and microchemical manipulations of carbohydrate reactivity. The multimethodological approach appears to be necessary to gain an in-depth understanding of very complex glycomes in different biological systems.

  4. Structural Glycomic Analyses at High Sensitivity: A Decade of Progress

    PubMed Central

    Alley, William R.; Novotny, Milos V.

    2014-01-01

    The field of glycomics has recently advanced in response to the urgent need for structural characterization and quantification of complex carbohydrates in biologically and medically important applications. The recent success of analytical glycobiology at high sensitivity reflects numerous advances in biomolecular mass spectrometry and its instrumentation, capillary and microchip separation techniques, and microchemical manipulations of carbohydrate reactivity. The multimethodological approach appears to be necessary to gain an in-depth understanding of very complex glycomes in different biological systems. PMID:23560930

  5. Dynamic Loading and Characterization of Fiber-Reinforced Composites

    NASA Astrophysics Data System (ADS)

    Sierakowski, Robert L.; Chaturvedi, Shive K.

    1997-02-01

    Emphasizing polymer-based fiber-reinforced composites, this book is designed to provide readers with a significant understanding of the complexities involved in characterizing dynamic events and the corresponding response of advanced fiber composite materials and structures. These elements include dynamic loading devices, material property characterization, and analytical and experimental techniques to assess the damage and failure modes associated with various dynamic loading events. Concluding remarks presented throughout the text summarize key points and raise issues related to important needed research.

  6. MODELING MICROBUBBLE DYNAMICS IN BIOMEDICAL APPLICATIONS*

    PubMed Central

    CHAHINE, Georges L.; HSIAO, Chao-Tsung

    2012-01-01

    Controlling microbubble dynamics to produce desirable biomedical outcomes when and where necessary, and to avoid deleterious effects, requires advanced knowledge, which can be achieved only through a combination of experimental and numerical/analytical techniques. The present communication presents a multi-physics approach to study the dynamics, combining viscous-inviscid effects, liquid and structure dynamics, and multi-bubble interaction. While complex numerical tools are developed and used, the study aims at identifying the key parameters influencing the dynamics, which need to be included in simpler models. PMID:22833696

  7. Measuring solids concentration in stormwater runoff: comparison of analytical methods.

    PubMed

    Clark, Shirley E; Siu, Christina Y S

    2008-01-15

    Stormwater suspended solids typically are quantified using one of two methods: aliquot/subsample analysis (total suspended solids [TSS]) or whole-sample analysis (suspended solids concentration [SSC]). Interproject comparisons are difficult because of inconsistencies in the methods and in their application. To address this concern, the suspended solids content has been measured using both methodologies in many current projects, but the question remains about how to compare these values with historical water-quality data where the analytical methodology is unknown. This research was undertaken to determine the effect of analytical methodology on the relationship between these two methods of determination of the suspended solids concentration, including the effect of aliquot selection/collection method and of particle size distribution (PSD). The results showed that SSC was best able to represent the known sample concentration and that the results were independent of the sample's PSD. Correlations between the results and the known sample concentration could be established for TSS samples, but they were highly dependent on the sample's PSD and on the aliquot collection technique. These results emphasize the need to report not only the analytical method but also the particle size information on the solids in stormwater runoff.

  8. Comparison of ICP-OES and MP-AES in determining soil nutrients by the Mehlich 3 method

    NASA Astrophysics Data System (ADS)

    Tonutare, Tonu; Penu, Priit; Krebstein, Kadri; Rodima, Ako; Kolli, Raimo; Shanskiy, Merrit

    2014-05-01

    Accurate, routine testing of nutrients in soil samples is critical to understanding potential soil fertility. Different factors must be taken into account when selecting the best analytical technique for soil laboratory analysis. Several techniques can provide an adequate detection range for the same analyte; in such cases the choice of technique will depend on factors such as sample throughput, required infrastructure, ease of use, chemicals used, the need for a gas supply, and operating costs. The Mehlich 3 extraction method is widely used for the determination of plant-available nutrient element contents in agricultural soils. Depending on the laboratory, ICP and AAS techniques are used for the determination of Ca, K and Mg from the soil extract, with flame photometry used for K in some laboratories. ICP or Vis spectrometry is used for the determination of extracted P. The excellent sensitivity and wide working range for all extracted elements make ICP a nearly ideal method, as long as the sample throughput is large enough to justify the initial capital outlay. Another advantage of ICP techniques is their multiplex character (simultaneous acquisition of all wavelengths). Depending on the element, detection limits are in the range 0.1-1000 μg/L. For smaller laboratories with low sample-throughput requirements, the use of AAS is more common. Flame AAS is a fast, relatively cheap and easy technique for elemental analysis. The disadvantages of the method are single-element analysis and the use of a flammable gas such as C2H2 and, for some elements, the oxidant gas N2O. Detection limits of elements for AAS lie between 1 and 1000 μg/L. MP-AES offers a unique alternative to both AAS and ICP-OES techniques with its detection power and speed of analysis. MP-AES is a quite new, simple and relatively inexpensive multielemental technique, which uses a self-sustained atmospheric-pressure microwave plasma (MP) running on nitrogen gas generated by a nitrogen generator. Therefore there is no need for argon or flammable (C2H2) gases or cylinder handling, and the running costs of the equipment are low. Detection limits of elements for MP-AES lie between those of AAS and ICP. The objective of this study was to compare the results of soil analysis using two multielemental analytical methods, ICP-OES and MP-AES. In the experiment, different soil types with various textures, organic matter contents and pH were used. Soil samples of Albeluvisols, Leptosols, Cambisols, Regosols and Histosols were used for the study. The plant-available nutrients were estimated by Mehlich 3 extraction. The ICP-OES analyses were performed at the Estonian Agricultural Research Centre and the MP-AES analyses in the Department of Soil Science and Agrochemistry at the Estonian University of Life Sciences. The detection limits and limits of quantification of Ca, K, Mg and P in the extracts are calculated and reported.
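
    For context, detection and quantification limits of the kind reported here are conventionally estimated from replicate blank measurements and the calibration slope (LOD = 3*s_blank/slope, LOQ = 10*s_blank/slope). A short sketch with invented numbers, not data from this study:

        import statistics

        blank_signal = [0.11, 0.13, 0.10, 0.12, 0.14, 0.11, 0.12, 0.13, 0.10, 0.12]
        slope = 0.045  # signal per ug/L, from a calibration curve (assumed)

        s_blank = statistics.stdev(blank_signal)
        lod = 3 * s_blank / slope    # ug/L
        loq = 10 * s_blank / slope   # ug/L
        print(f"LOD = {lod:.2f} ug/L, LOQ = {loq:.2f} ug/L")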

  9. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    NASA Astrophysics Data System (ADS)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS) and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions where calibration curves are applicable to quantification of compositions of solid samples and their limitations are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and Saha equation, has been applied in a number of studies, requirements need to be satisfied for the calculation of chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract composition-related information from all spectral data, are widely established methods and have been applied to various fields, including in-situ applications in air and for planetary explorations. Artificial neural networks (ANNs), where non-linear effects can be modelled, have also been investigated as a quantitative method and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that the accuracy should be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square errors (NRMSEs), when comparing the accuracy obtained from different setups and analytical methods.
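
    A minimal sketch of the recommended figure of merit; normalising the RMSE by the range of the reference data is one common convention (normalising by the mean is another), and the compositions below are invented:

        import numpy as np

        def nrmse(reference, predicted):
            rmse = np.sqrt(np.mean((reference - predicted) ** 2))
            return rmse / (reference.max() - reference.min())

        ref = np.array([1.2, 3.4, 5.0, 7.8, 9.1])   # certified compositions (wt%)
        est = np.array([1.0, 3.6, 4.7, 8.1, 9.4])   # LIBS estimates (wt%)
        print(f"NRMSE = {nrmse(ref, est):.3f}")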

  10. Big data analytics in hyperspectral imaging for detection of microbial colonies on agar plates (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Yoon, Seung-Chul; Park, Bosoon; Lawrence, Kurt C.

    2017-05-01

    Various types of optical imaging techniques measuring light reflectivity and scattering can detect microbial colonies of foodborne pathogens on agar plates. Until recently, these techniques were developed to provide solutions for hypothesis-driven studies, which focused on developing tools and batch/offline machine learning methods with well-defined sets of data. These have relatively high accuracy and rapid response times because the tools and methods are often optimized for the collected data. However, they often need to be retrained or recalibrated when new untrained data and/or features are added. A big-data-driven technique is more suitable for online learning of new/ambiguous samples and for mining unknown or hidden features. Although big data research in hyperspectral imaging is emerging in remote sensing, and many tools and methods have been developed so far in many other applications such as bioinformatics, the tools and methods still need to be evaluated and adjusted in applications where the conventional batch machine learning algorithms were dominant. The primary objective of this study is to evaluate appropriate big data analytic tools and methods for online learning and mining of foodborne pathogens on agar plates. After the tools and methods are successfully identified, they will be applied to rapidly search the big color and hyperspectral image data of microbial colonies collected in-house over the past 5 years and find the most probable colony or group of colonies in the collected big data. The meta-data, such as collection time, and any unstructured data (e.g. comments) will also be analyzed and presented with the output results. The expected results will be a novel, big data-driven technology to correctly detect and recognize microbial colonies of various foodborne pathogens on agar plates.

  11. Artificial intelligence in medicine.

    PubMed Central

    Ramesh, A. N.; Kambhampati, C.; Monson, J. R. T.; Drew, P. J.

    2004-01-01

    INTRODUCTION: Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment and predicting outcomes in many clinical scenarios. METHODS: Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of different artificial intelligence techniques is presented in this paper along with a review of important clinical applications. RESULTS: The proficiency of artificial intelligence techniques has been explored in almost every field of medicine. The artificial neural network was the most commonly used analytical tool, whilst other artificial intelligence techniques such as fuzzy expert systems, evolutionary computation and hybrid intelligent systems have all been used in different clinical settings. DISCUSSION: Artificial intelligence techniques have the potential to be applied in almost every field of medicine. There is a need for appropriately designed further clinical trials before these emergent techniques find application in the real clinical setting. PMID:15333167

  12. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample so that near real-time analysis (a few minutes) is possible. This technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation, thereby preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  13. Artificial intelligence in medicine.

    PubMed

    Ramesh, A N; Kambhampati, C; Monson, J R T; Drew, P J

    2004-09-01

    Artificial intelligence is a branch of computer science capable of analysing complex medical data. Its potential to exploit meaningful relationships within a data set can be used in diagnosis, treatment and predicting outcomes in many clinical scenarios. Medline and internet searches were carried out using the keywords 'artificial intelligence' and 'neural networks (computer)'. Further references were obtained by cross-referencing from key articles. An overview of different artificial intelligence techniques is presented in this paper along with a review of important clinical applications. The proficiency of artificial intelligence techniques has been explored in almost every field of medicine. The artificial neural network was the most commonly used analytical tool, whilst other artificial intelligence techniques such as fuzzy expert systems, evolutionary computation and hybrid intelligent systems have all been used in different clinical settings. Artificial intelligence techniques have the potential to be applied in almost every field of medicine. There is a need for appropriately designed further clinical trials before these emergent techniques find application in the real clinical setting.

  14. Trends in data processing of comprehensive two-dimensional chromatography: state of the art.

    PubMed

    Matos, João T V; Duarte, Regina M B O; Duarte, Armando C

    2012-12-01

    The operation of advanced chromatographic systems, namely comprehensive two-dimensional (2D) chromatography coupled to multidimensional detectors, yields a great deal of data that needs special care during processing in order to characterize and quantify the analytes under study as completely as possible. The aim of this review is to identify the main trends, research needs and gaps in the techniques for data processing of multidimensional data sets obtained from comprehensive 2D chromatography. The following topics have been identified as the most promising for new developments in the near future: data acquisition and handling, peak detection and quantification, measurement of overlap of 2D peaks, and data analysis software for 2D chromatography. The rationale supporting most of the data processing techniques is based on the generalization of one-dimensional (1D) chromatography, although some algorithms, such as the inverted watershed algorithm, use the 2D chromatographic data as such. However, for processing more complex N-way data there is a need for more sophisticated techniques. Apart from applying other concepts from 1D chromatography, which have not yet been tested for 2D chromatography, there is still room for improvements and developments in algorithms and software for dealing with comprehensive 2D chromatographic data. Copyright © 2012 Elsevier B.V. All rights reserved.
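
    To make the peak-detection topic concrete, the sketch below picks local maxima above a threshold in a 2D chromatogram stored as a matrix. This is a naive stand-in for illustration only; production algorithms such as the inverted watershed mentioned above handle peak overlap and merging far more carefully.

        import numpy as np
        from scipy.ndimage import maximum_filter

        def find_peaks_2d(chrom, threshold, size=3):
            """Return (row, col) indices of local maxima above a threshold."""
            local_max = (chrom == maximum_filter(chrom, size=size)) & (chrom > threshold)
            return np.argwhere(local_max)

        rng = np.random.default_rng(0)
        chrom = rng.random((100, 60))        # stand-in for real 2D data
        chrom[40, 20] += 5.0                 # inject an obvious "peak"
        print(find_peaks_2d(chrom, threshold=3.0))   # -> [[40 20]]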

  15. Does leaf chemistry differentially affect breakdown in tropical vs temperate streams? Importance of standardized analytical techniques to measure leaf chemistry

    Treesearch

    Marcelo Ardón; Catherine M. Pringle; Susan L. Eggert

    2009-01-01

    Comparisons of the effects of leaf litter chemistry on leaf breakdown rates in tropical vs temperate streams are hindered by incompatibility among studies and across sites of analytical methods used to measure leaf chemistry. We used standardized analytical techniques to measure chemistry and breakdown rate of leaves from common riparian tree species at 2 sites, 1...

  16. Fractals and Spatial Methods for Mining Remote Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina; Emerson, Charles; Quattrochi, Dale

    2003-01-01

    The rapid increase in digital remote sensing and GIS data raises a critical problem -- how can such an enormous amount of data be handled and analyzed so that useful information can be derived quickly? Efficient handling and analysis of large spatial data sets is central to environmental research, particularly in global change studies that employ time series. Advances in large-scale environmental monitoring and modeling require not only high-quality data, but also reliable tools to analyze the various types of data. A major difficulty facing geographers and environmental scientists in environmental assessment and monitoring is that spatial analytical tools are not easily accessible. Although many spatial techniques have been described recently in the literature, they are typically presented in an analytical form and are difficult to transform into a numerical algorithm. Moreover, these spatial techniques are not necessarily designed for remote sensing and GIS applications, and research must be conducted to examine their applicability and effectiveness in different types of environmental applications. This poses a chicken-and-egg problem: on one hand we need more research to examine the usability of the newer techniques and tools, yet on the other hand, this type of research is difficult to conduct if the tools to be explored are not accessible. Another problem fundamental to environmental research is the set of issues related to spatial scale. The scale issue is especially acute in the context of global change studies because of the need to integrate remote-sensing and other spatial data that are collected at different scales and resolutions. Extrapolation of results across broad spatial scales remains the most difficult problem in global environmental research. There is a need for basic characterization of the effects of scale on image data, and the techniques used to measure these effects must be developed and implemented to allow for a multiple-scale assessment of the data before any useful process-oriented modeling involving scale-dependent data can be conducted. Through the support of research grants from NASA, we have developed a software module called ICAMS (Image Characterization And Modeling System) to address the need to develop innovative spatial techniques and make them available to the broader scientific community. ICAMS provides new spatial techniques, such as fractal analysis, geostatistical functions, and multiscale analysis, that are not easily available in commercial GIS/image processing software. By bundling newer spatial methods in a user-friendly software module, researchers can begin to test and experiment with the new spatial analysis methods, and they can gauge scale effects using a variety of remote sensing imagery. In the following, we describe briefly the development of ICAMS and present application examples.
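
    As one concrete example of the kind of spatial technique such a module bundles, a box-counting fractal dimension estimator for a binary image can be sketched as below; this simple binary version is illustrative only and is not claimed to be the estimator ICAMS implements.

        import numpy as np

        def box_count_dimension(mask, sizes=(2, 4, 8, 16, 32)):
            """Estimate the fractal dimension of a binary image by box counting."""
            counts = []
            for s in sizes:
                h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
                blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(blocks.any(axis=(1, 3)).sum())
            # slope of log(count) versus log(1/size) estimates the dimension
            return np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)[0]

        mask = np.zeros((256, 256), dtype=bool)
        mask[::2, :] = True                  # toy pattern, not real imagery
        print(f"D ~= {box_count_dimension(mask):.2f}")   # ~2.0 for this pattern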

  17. Reduction of multi-dimensional laboratory data to a two-dimensional plot: a novel technique for the identification of laboratory error.

    PubMed

    Kazmierczak, Steven C; Leen, Todd K; Erdogmus, Deniz; Carreira-Perpinan, Miguel A

    2007-01-01

    The clinical laboratory generates large amounts of patient-specific data. Detection of errors that arise during pre-analytical, analytical, and post-analytical processes is difficult. We performed a pilot study, utilizing a multidimensional data reduction technique, to assess the utility of this method for identifying errors in laboratory data. We evaluated 13,670 individual patient records collected over a 2-month period from hospital inpatients and outpatients. We utilized those patient records that contained a complete set of 14 different biochemical analytes. We used two-dimensional generative topographic mapping to project the 14-dimensional record to a two-dimensional space. The use of a two-dimensional generative topographic mapping technique to plot multi-analyte patient data as a two-dimensional graph allows for the rapid identification of potentially anomalous data. Although we performed a retrospective analysis, this technique has the benefit of being able to assess laboratory-generated data in real time, allowing for the rapid identification and correction of anomalous data before they are released to the physician. In addition, serial laboratory multi-analyte data for an individual patient can also be plotted as a two-dimensional plot. This tool might also be useful for assessing patient wellbeing and prognosis.
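
    Generative topographic mapping is not available in common Python libraries, so the sketch below substitutes PCA as the simplest 14-D to 2-D projection to illustrate the idea of flagging records that fall far from the bulk of the data; it is not the authors' method, and all numbers are synthetic.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(1)
        records = rng.normal(size=(13670, 14))   # stand-in for 14 analytes/record
        records[0] += 8.0                        # simulate one anomalous record

        xy = PCA(n_components=2).fit_transform(records)
        dist = np.linalg.norm(xy - xy.mean(axis=0), axis=1)
        print("records to review:", np.argsort(dist)[-5:])  # most extreme points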

  18. Analytical Chemistry of Surfaces: Part II. Electron Spectroscopy.

    ERIC Educational Resources Information Center

    Hercules, David M.; Hercules, Shirley H.

    1984-01-01

    Discusses two surface techniques: X-ray photoelectron spectroscopy (ESCA) and Auger electron spectroscopy (AES). Focuses on fundamental aspects of each technique, important features of instrumentation, and some examples of how ESCA and AES have been applied to analytical surface problems. (JN)

  19. Assessment of skin exposure to nickel, chromium and cobalt by acid wipe sampling and ICP-MS.

    PubMed

    Lidén, Carola; Skare, Lizbet; Lind, Birger; Nise, Gun; Vahter, Marie

    2006-05-01

    There is a great need to accurately assess skin exposure to contact allergens. We have developed a technique for assessment of skin exposure to nickel, chromium and cobalt using acid wipe sampling with cellulose wipes containing 1% nitric acid. Chemical analysis was performed by inductively coupled plasma mass spectrometry (ICP-MS). The recovery of nickel, chromium and cobalt from arms and palms was 93%. The analytical result is expressed in terms of mass per unit area (μg/cm2). The developed acid wipe sampling technique is suitable for determination of nickel, chromium and cobalt deposited on the skin. The technique may be used in workplace studies, in studies of individuals in the general population, in dermatitis patients, in identification of risk groups, as well as in developing preventive strategies and in follow-up after intervention.

  20. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  1. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  2. Engineering Rugged Field Assays to Detect Hazardous Chemicals Using Spore-Based Bacterial Biosensors.

    PubMed

    Wynn, Daniel; Deo, Sapna; Daunert, Sylvia

    2017-01-01

    Bacterial whole cell-based biosensors have been genetically engineered to achieve selective and reliable detection of a wide range of hazardous chemicals. Although whole-cell biosensors demonstrate many advantages for field-based detection of target analytes, there are still some challenges that need to be addressed. Most notably, their often modest shelf life and need for special handling and storage make them challenging to use in situations where access to reagents, instrumentation, and expertise are limited. These problems can be circumvented by developing biosensors in Bacillus spores, which can be engineered to address all of these concerns. In its sporulated state, a whole cell-based biosensor has a remarkably long life span and is exceptionally resistant to environmental insult. When these spores are germinated for use in analytical techniques, they show no loss in performance, even after long periods of storage under harsh conditions. In this chapter, we will discuss the development and use of whole cell-based sensors, their adaptation to spore-based biosensors, their current applications, and future directions in the field. © 2017 Elsevier Inc. All rights reserved.

  3. Algorithmic implementation of particle-particle ladder diagram approximation to study strongly-correlated metals and semiconductors

    NASA Astrophysics Data System (ADS)

    Prayogi, A.; Majidi, M. A.

    2017-07-01

    In condensed-matter physics, strongly-correlated systems refer to materials that exhibit a variety of fascinating properties and ordered phases, depending on temperature, doping, and other factors. Such unique properties most notably arise due to strong electron-electron interactions, and in some cases due to interactions involving other quasiparticles as well. Electronic correlation effects are sufficiently non-trivial that one may need an accurate approximation technique with quite heavy computation, such as Quantum Monte Carlo, in order to capture particular material properties arising from such effects. Meanwhile, less accurate techniques may come with lower numerical cost, but their ability to capture particular properties may depend strongly on the choice of approximation. Among the many-body techniques derivable from Feynman diagrams, we aim to formulate an algorithmic implementation of the ladder diagram approximation to capture the effects of electron-electron interactions. We wish to investigate how these correlation effects influence the temperature-dependent properties of strongly-correlated metals and semiconductors. As we are interested in the temperature-dependent properties of the system, the ladder diagram method needs to be applied in the Matsubara frequency domain to obtain the self-consistent self-energy. However, at the end we also need to compute dynamical properties such as the density of states (DOS) and optical conductivity, which are defined in the real frequency domain. For this purpose, we need to perform an analytic continuation procedure. At the end of this study, we will test the technique by observing the occurrence of the metal-insulator transition in strongly-correlated metals and the renormalization of the band gap in strongly-correlated semiconductors.
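
    For reference, the fermionic Matsubara frequencies on which such a self-energy is evaluated are omega_n = (2n+1)*pi*k_B*T; a minimal sketch, assuming units with k_B = 1:

        import numpy as np

        def matsubara_frequencies(n_max, T):
            """Fermionic Matsubara frequencies omega_n = (2n+1)*pi*T (k_B = 1)."""
            n = np.arange(-n_max, n_max)
            return (2 * n + 1) * np.pi * T

        print(matsubara_frequencies(3, T=0.1))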

  4. Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"

    NASA Astrophysics Data System (ADS)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    Limited availability of analytical instruments for the methodical detection of known and unknown effluents imposes a serious hindrance on qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography and electro-analytical instruments are not only expensive but also time consuming, require maintenance and need replacement of damaged essential parts, all of which are serious concerns. Moreover, for field studies and instant detection, installation of these instruments is not convenient at every location. Therefore, a technique based on pre-concentration of metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key to this immobilization technique, which is simple, user friendly, highly effective, inexpensive and time efficient; it is easy to carry (10 g - 20 g vial) to the experimental field/site, as has been demonstrated.

  5. Approximate analytical relationships for linear optimal aeroelastic flight control laws

    NASA Astrophysics Data System (ADS)

    Kassem, Ayman Hamdy

    1998-09-01

    This dissertation introduces new methods to uncover functional relationships between design parameters of a contemporary control design technique and the resulting closed-loop properties. Three new methods are developed for generating such relationships through analytical expressions: the Direct Eigen-Based Technique, the Order of Magnitude Technique, and the Cost Function Imbedding Technique. Efforts concentrated on the linear-quadratic state-feedback control-design technique applied to an aeroelastic flight control task. For this specific application, simple and accurate analytical expressions for the closed-loop eigenvalues and zeros in terms of basic parameters such as stability and control derivatives, structural vibration damping and natural frequency, and cost function weights are generated. These expressions explicitly indicate how the weights augment the short period and aeroelastic modes, as well as the closed-loop zeros, and by what physical mechanism. The analytical expressions are used to address topics such as damping, nonminimum phase behavior, stability, and performance with robustness considerations, and design modifications. This type of knowledge is invaluable to the flight control designer and would be more difficult to formulate when obtained from numerical-based sensitivity analysis.
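
    A numerical counterpart to the analytical study is readily sketched: solve the continuous algebraic Riccati equation for a two-state linear-quadratic state-feedback design and inspect the closed-loop eigenvalues as the weights vary. The short-period-like matrices below are illustrative assumptions, not values taken from the dissertation.

        import numpy as np
        from scipy.linalg import solve_continuous_are

        A = np.array([[-0.7, 1.0],
                      [-4.0, -1.2]])          # assumed short-period-like dynamics
        B = np.array([[0.0],
                      [-8.0]])
        Q = np.diag([10.0, 1.0])              # state weights
        R = np.array([[1.0]])                 # control weight

        P = solve_continuous_are(A, B, Q, R)
        K = np.linalg.solve(R, B.T @ P)       # optimal gain, u = -K x
        print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))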

  6. Isotope-ratio-monitoring gas chromatography-mass spectrometry: methods for isotopic calibration

    NASA Technical Reports Server (NTRS)

    Merritt, D. A.; Brand, W. A.; Hayes, J. M.

    1994-01-01

    In trial analyses of a series of n-alkanes, precise determinations of 13C contents were based on isotopic standards introduced by five different techniques, and the results were compared. Specifically, organic-compound standards were coinjected with the analytes and carried through chromatography and combustion with them; or CO2 was supplied from a conventional inlet and mixed with the analyte in the ion source; or CO2 was supplied from an auxiliary mixing volume and transmitted to the source without interruption of the analyte stream. Additionally, two techniques were investigated in which the analyte stream was diverted and CO2 standards were placed on a near-zero background. All methods provided accurate results. Where applicable, methods not involving interruption of the analyte stream provided the highest performance (σ = 0.00006 at.% 13C, or 0.06‰, for 250 pmol C as CO2 reaching the ion source), but great care was required. Techniques involving diversion of the analyte stream were immune to interference from coeluting sample components and still provided high precision (0.0001 ≤ σ ≤ 0.0002 at.% or 0.1 ≤ σ ≤ 0.2‰).

  7. Analytical technique characterizes all trace contaminants in water

    NASA Technical Reports Server (NTRS)

    Foster, J. N.; Lysyj, I.; Nelson, K. H.

    1967-01-01

    A properly programmed combination of advanced chemical and physical analytical techniques can critically characterize all trace contaminants in both the potable and the waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the sources of water pollution.

  8. Immuno Nanosensor for the Ultrasensitive Naked Eye Detection of Tuberculosis.

    PubMed

    Mohd Bakhori, Noremylia; Yusof, Nor Azah; Abdullah, Jaafar; Wasoh, Helmi; Md Noor, Siti Suraiya; Ahmad Raston, Nurul Hanun; Mohammad, Faruq

    2018-06-14

    In the present study, an approach for the ultrasensitive and affordable naked-eye detection and diagnosis of tuberculosis (TB) utilizing plasmonic enzyme-linked immunosorbent assay (ELISA) via antibody-antigen interaction was investigated. Here, the biocatalytic cycle of the intracellular enzymes links to the formation and successive growth of gold nanoparticles (GNPs) for ultrasensitive detection. The formation of differently colored solutions by the plasmonic nanoparticles in the presence of enzyme labels links directly to the existence or non-existence of the TB analytes in the sample solutions. For disease detection, the adapted protocol is based mainly on the conventional ELISA procedure involving catalase-labeled antibodies, i.e., the enzymes consume hydrogen peroxide, and GNPs are subsequently produced upon the addition of gold(III) chloride. The amount of hydrogen peroxide remaining in the solution determines whether the GNP solution formed is blue or red, which serves as a confirmation for the naked-eye detection of TB analytes. However, the conventional ELISA method only shows tonal colors that need a high concentration of analyte to achieve high confidence levels for naked-eye detection. In this research, we also proposed the incorporation of the protein biomarker Mycobacterium tuberculosis ESAT-6-like protein esxB (CFP-10) as a means of TB detection using plasmonic ELISA. With this technique, the CFP-10 detection limit can be lowered to 0.01 µg/mL by the naked eye. Further, our developed technique was successfully tested and confirmed with sputum samples from patients diagnosed with positive TB, thereby providing enough evidence for the utilization of our technique in the early diagnosis of TB disease.

  9. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

    Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
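
    A minimal sketch of the first technique (multilevel modelling) on synthetic AA-style data: repeated momentary ratings nested within persons, with a random intercept per person. All variable names and values are invented for illustration.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(42)
        n_people, n_obs = 30, 40
        person = np.repeat(np.arange(n_people), n_obs)
        stress = rng.normal(size=n_people * n_obs)
        craving = (2.0 + 0.5 * stress                              # fixed effects
                   + rng.normal(scale=1.0, size=n_people)[person]  # person intercepts
                   + rng.normal(scale=0.8, size=n_people * n_obs)) # momentary noise

        df = pd.DataFrame({"person": person, "stress": stress, "craving": craving})
        fit = smf.mixedlm("craving ~ stress", df, groups=df["person"]).fit()
        print(fit.summary())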

  10. Analytical techniques and method validation for the measurement of selected semivolatile and nonvolatile organofluorochemicals in air.

    PubMed

    Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M

    2004-09-01

    The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potentially biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute for Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). Total fluorine results may be used to determine whether the individual compounds quantified provide a suitable mass balance of total airborne organofluorochemicals based on known fluorine content. Improvements in precision and/or recovery, as well as some additional testing, would be needed to meet all NIOSH validation criteria. This study provided valuable information about the accuracy of this method for organofluorochemical exposure assessment.
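
    A quick check of the quoted quantitation ranges: the air concentration is simply the mass trapped on the OVS tube divided by the sampled air volume (60 L = 0.060 m3).

        def air_conc_mg_m3(mass_ug, volume_l=60.0):
            """Air concentration (mg/m^3) from trapped mass (ug) and air volume (L)."""
            return (mass_ug / 1000.0) / (volume_l / 1000.0)

        print(air_conc_mg_m3(0.06))   # 0.001 mg/m^3, low end for LC/MS analytes
        print(air_conc_mg_m3(30.0))   # 0.5 mg/m^3, high end for GC/MS analytes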

  11. DGT Passive Sampling for Quantitative in Situ Measurements of Compounds from Household and Personal Care Products in Waters.

    PubMed

    Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C

    2017-11-21

    Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems means reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative and time-weighted average (TWA) measurement of these chemicals in waters. The novel technique is developed for HPCPs, including preservatives, antioxidants and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of binding gels in acetonitrile gave good and consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5-9.5), ionic strength (0.001-0.1 M) and dissolved organic matter (0-20 mg L-1), making it suitable for applications across a wide range of environments. Deployment time and diffusion layer thickness dependence experiments confirmed that DGT-accumulated chemical masses are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant. Results were compared with conventional grab-sampling and 24-h-composited samples from autosamplers. DGT provided TWA concentrations over up to 18 days of deployment, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated advantages of the DGT technique: it gives in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
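
    Behind the TWA measurement is the standard DGT equation, C = M·Δg/(D·A·t), with M the analyte mass accumulated by the binding gel, Δg the diffusion layer thickness, D the diffusion coefficient in the gel, A the exposure window area and t the deployment time. A sketch with illustrative values, not data from this study:

        def dgt_concentration(mass_ng, dg_cm, D_cm2_s, area_cm2, time_s):
            """TWA solution concentration in ng/mL (= ng/cm^3) from the DGT equation."""
            return mass_ng * dg_cm / (D_cm2_s * area_cm2 * time_s)

        c = dgt_concentration(mass_ng=150.0, dg_cm=0.094, D_cm2_s=4.5e-6,
                              area_cm2=3.14, time_s=7 * 24 * 3600)
        print(f"C_DGT = {c:.2f} ng/mL over a 7-day deployment")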

  12. Fabrication of Nanostructured Mesoporous Germanium for Application in Laser Desorption Ionization Mass Spectrometry.

    PubMed

    Abdelmaksoud, Hazem H; Guinan, Taryn M; Voelcker, Nicolas H

    2017-02-15

    Surface-assisted laser desorption/ionization mass spectrometry (SALDI-MS) is a high-throughput analytical technique ideally suited for small-molecule detection from different bodily fluids (e.g., saliva, urine, and blood plasma). Many SALDI-MS substrates require complex fabrication processes and further surface modifications. Furthermore, some substrates show instability upon exposure to ambient conditions and need to be kept under special inert conditions. We have successfully optimized mesoporous germanium (meso-pGe) using bipolar electrochemical etching and efficiently applied meso-pGe as a SALDI-MS substrate for the detection of illicit drugs, for example in the context of workplace testing, roadside testing, and anti-addictive drug compliance. Argon plasma treatment improved the efficiency of meso-pGe as a SALDI-MS substrate and eliminated the need for surface functionalization. The resulting substrate allowed precise tuning of the surface geometry by altering the etching parameters, and showed outstanding performance for illicit drug detection, with a limit of detection for cocaine of 1.7 ng/mL in Milli-Q water and as low as 5.3 ng/mL in spiked saliva. The meso-pGe substrate demonstrated stability over 56 days stored under ambient conditions. This proof-of-principle study demonstrates that meso-pGe can be reproducibly fabricated and applied as an analytical SALDI-MS substrate, which opens the door for further analytical and forensic high-throughput applications.

  13. Direct Analysis of Samples of Various Origin and Composition Using Specific Types of Mass Spectrometry.

    PubMed

    Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek

    2017-07-04

    One of the major sources of error that occur during chemical analysis utilizing the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage, and in this review we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis, without sample preparation or with only limited pre-concentration steps. MALDI-MS, PTR-MS, SIFT-MS and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity owing to their interesting properties. The advantages and disadvantages of these techniques are presented, as are the trends in the development of direct analysis using the aforementioned techniques.

  14. Common aspects influencing the translocation of SERS to Biomedicine.

    PubMed

    Gil, Pilar Rivera; Tsouts, Dionysia; Sanles-Sobrido, Marcos; Cabo, Andreu

    2018-01-04

    In this review, we introduce the reader to the analytical technique of surface-enhanced Raman scattering, motivated by the great potential we believe this technique has in biomedicine. We present the advantages and limitations of this technique relevant for bioanalysis in vitro and in vivo, and how this technique goes beyond the state of the art of traditional analytical, labelling and healthcare diagnosis technologies. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  15. [The water content reference material of water saturated octanol].

    PubMed

    Wang, Haifeng; Ma, Kang; Zhang, Wei; Li, Zhanyuan

    2011-03-01

    The national standards for biofuels specify technical specifications and analytical methods. A water content certified reference material based on water-saturated octanol was developed in order to satisfy the needs of instrument calibration and method validation, and to assure the accuracy and consistency of results in water content measurements of biofuels. Three analytical methods based on different principles were employed to certify the water content of the reference material: Karl Fischer coulometric titration, Karl Fischer volumetric titration and quantitative nuclear magnetic resonance. Consistency between the coulometric and volumetric titrations was achieved through improvement of the methods. The accuracy of the certified result was improved by the introduction of the new method of quantitative nuclear magnetic resonance. Finally, the certified value of the reference material is 4.76% with an expanded uncertainty of 0.09%.
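
    For orientation, a certified value and its expanded uncertainty are commonly obtained by combining the method means GUM-style with a coverage factor k = 2; the sketch below uses invented method means and an assumed homogeneity/stability contribution, so only the final values quoted above are real.

        import statistics

        method_means = [4.74, 4.77, 4.78]   # KF coulometry, KF volumetry, qNMR (invented)
        value = statistics.mean(method_means)
        u_char = statistics.stdev(method_means) / len(method_means) ** 0.5
        u_other = 0.04                      # homogeneity/stability contribution (assumed)
        u_c = (u_char ** 2 + u_other ** 2) ** 0.5
        print(f"certified value ~ {value:.2f} %, U(k=2) ~ {2 * u_c:.2f} %")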

  16. A guide for the application of analytics on healthcare processes: A dynamic view on patient pathways.

    PubMed

    Lismont, Jasmien; Janssens, Anne-Sophie; Odnoletkova, Irina; Vanden Broucke, Seppe; Caron, Filip; Vanthienen, Jan

    2016-10-01

    The aim of this study is to guide healthcare institutions in applying process analytics to healthcare processes. Process analytics techniques can offer new insights into patient pathways, workflow processes, adherence to medical guidelines, and compliance with clinical pathways, but they also bring specific challenges, which are examined and addressed in this paper. The following methodology is proposed: log preparation, log inspection, abstraction and selection, clustering, process mining, and validation. It was applied to a case study in the type 2 diabetes mellitus domain. Several data pre-processing steps are applied and clarify the usefulness of process analytics in a healthcare setting. Healthcare utilization, such as diabetes education, is analyzed and compared with diabetes guidelines. Furthermore, we examine the organizational perspective and the central role of the GP. This research addresses four challenges: healthcare processes are often patient- and hospital-specific, which leads to unique traces and unstructured processes; data are not recorded in the right format, with the right level of abstraction and time granularity; an overflow of medical activities may cloud the analysis; and analysts need to deal with data not recorded for this purpose. These challenges complicate the application of process analytics, and it is explained how our methodology takes them into account. Process analytics offers new insights into the medical services patients follow, how medical resources relate to each other, and whether patients and healthcare processes comply with guidelines and regulations. Copyright © 2016 Elsevier Ltd. All rights reserved.
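
    As an illustration of the log-preparation and log-inspection steps named above, the following is a minimal sketch in Python/pandas; the column names and toy events are hypothetical, not drawn from the study's diabetes data.

```python
import pandas as pd

# Hypothetical event log: one row per care activity per patient (columns assumed).
log = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2, 3, 3, 3],
    "activity":   ["GP visit", "lab test", "education", "GP visit", "education",
                   "GP visit", "lab test", "GP visit"],
    "timestamp":  pd.to_datetime([
        "2015-01-05", "2015-01-12", "2015-02-01", "2015-01-07", "2015-03-01",
        "2015-01-02", "2015-01-20", "2015-02-15"]),
})

# Log preparation: order events within each patient pathway by time.
log = log.sort_values(["patient_id", "timestamp"])

# Log inspection: derive one trace (ordered activity sequence) per patient.
traces = log.groupby("patient_id")["activity"].apply(tuple)

# Unique traces show how unstructured the process is; in healthcare almost
# every patient can have a distinct trace, motivating abstraction/clustering.
variant_counts = traces.value_counts()
print(variant_counts)
```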

  17. Light aircraft crash safety program

    NASA Technical Reports Server (NTRS)

    Thomson, R. G.; Hayduk, R. J.

    1974-01-01

    NASA has embarked upon research and development tasks aimed at providing the general aviation industry with a reliable crashworthy airframe design technology. The goals of the NASA program are: reliable analytical techniques for predicting the nonlinear behavior of structures; significant design improvements of airframes; and simulated full-scale crash test data. The analytical tools will include both simplified procedures for estimating energy absorption characteristics and more complex computer programs for analysis of general airframe structures under crash loading conditions. The analytical techniques being developed both in-house and under contract are described, and a comparison of some analytical predictions with experimental results is shown.

  18. Towards a 3D modelling of the microwave photo-induced load in CPW technology

    NASA Astrophysics Data System (ADS)

    Gary, Rene; Arnould, Jean-Daniel; Vilcot, Anne

    2005-09-01

    The study of optical control concerns both the optical and the microwave behaviour of the plasma photo-induced in a semiconductor illuminated by a laser beam. The presented study is based on the need to be able to foresee the microwave response of CPW microwave devices for different optical powers and different kinds of optical fibers, single-mode or multimode. The optical part has been treated analytically by solving the diffusion equation for the photo-induced carriers using the Hankel transform in three dimensions. The added value of this technique is its precision and speed. For the electromagnetic part we have chosen to use the CST Microwave Studio software, which solves Maxwell's equations numerically with a Finite Integration Technique (FIT). To this end we have had to model the photo-induced load through a locally changed conductivity depending directly on the excess carrier distribution. In the final paper, the first part will deal with the analytical computation of the photo-induced excess carriers in a silicon substrate using the Hankel transform under continuous illumination. Then the explanation of the model will be based on the need for a 3-dimensional model that can be described in electromagnetic software. Finally, simulation results for simple CPW devices such as stubs will be compared to measurements. In conclusion, we will show that the model is suitable for designing more complex devices and that it can be simplified when low precision is sufficient.
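
    The analytical route described (solving the carrier diffusion equation via the Hankel transform) can be sketched numerically for the simpler steady-state radial case; all parameter values below are illustrative assumptions, not the paper's, and the paper's full 3D treatment is more involved.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.special import j0

# Illustrative parameters (assumed, not the paper's values)
D   = 36e-4      # ambipolar diffusion coefficient, m^2/s
tau = 10e-6      # carrier lifetime, s
w   = 50e-6      # Gaussian beam radius, m
G0  = 1e27       # peak generation rate, m^-3 s^-1
L2  = D * tau    # squared diffusion length

# Steady-state radial diffusion D*lap(n) - n/tau + g = 0 becomes algebraic in
# Hankel space: n~(k) = tau * g~(k) / (1 + L^2 k^2), where the transform of the
# Gaussian g(r) = G0 exp(-r^2/w^2) is g~(k) = G0 (w^2/2) exp(-k^2 w^2 / 4).
k = np.linspace(0, 10 / w, 4000)
n_k = tau * G0 * (w**2 / 2) * np.exp(-(k * w) ** 2 / 4) / (1 + L2 * k**2)

# Inverse Hankel transform: n(r) = int_0^inf n~(k) J0(k r) k dk
r = np.linspace(0, 5 * w, 200)
n_r = np.array([trapezoid(n_k * j0(k * ri) * k, k) for ri in r])
print(f"peak excess-carrier density: {n_r[0]:.3e} m^-3")
```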

  19. Surface-Enhanced Raman Spectroscopy.

    ERIC Educational Resources Information Center

    Garrell, Robin L.

    1989-01-01

    Reviews the basis for the technique and its experimental requirements. Describes a few examples of the analytical problems to which surface-enhanced Raman spectroscopy (SERS) has been and can be applied. Provides a perspective on the current limitations and frontiers in developing SERS as an analytical technique. (MVL)

  20. Arsenic, Antimony, Chromium, and Thallium Speciation in Water and Sediment Samples with the LC-ICP-MS Technique

    PubMed Central

    Jabłońska-Czapla, Magdalena

    2015-01-01

    Chemical speciation is a very important subject in the environmental protection, toxicology, and chemical analytics due to the fact that toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question why the speciation analytics is so important. The paper also provides numerous examples of the hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962

  1. A Critical Review on Clinical Application of Separation Techniques for Selective Recognition of Uracil and 5-Fluorouracil.

    PubMed

    Pandey, Khushaboo; Dubey, Rama Shankar; Prasad, Bhim Bali

    2016-03-01

    The most important objectives that are frequently found in bioanalytical chemistry involve applying tools to relevant medical/biological problems and refining these applications. Developing a reliable sample preparation step for the medical and biological fields is another primary objective in analytical chemistry, in order to extract and isolate the analytes of interest from complex biological matrices. The main inborn errors of metabolism (IEM) diagnosable through uracil analysis, as well as the therapeutic monitoring of toxic 5-fluorouracil (an important anticancer drug) in dihydropyrimidine dehydrogenase-deficient patients, require ultra-sensitive, reproducible, selective, and accurate analytical techniques. Therefore, keeping in view the diagnostic value of uracil and 5-fluorouracil measurements, this article reviews several analytical techniques involved in the selective recognition and quantification of uracil and 5-fluorouracil in biological and pharmaceutical samples. The study revealed that the implementation of a molecularly imprinted polymer as a solid-phase material for sample preparation and preconcentration of uracil and 5-fluorouracil has proven effective, as it obviates the problems of tedious separation techniques arising from protein binding and drastic interferences in complex real-sample matrices such as blood plasma and serum.

  2. An automated ranking platform for machine learning regression models for meat spoilage prediction using multi-spectral imaging and metabolic profiling.

    PubMed

    Estelles-Lopez, Lucia; Ropodi, Athina; Pavlidis, Dimitris; Fotopoulou, Jenny; Gkousari, Christina; Peyrodie, Audrey; Panagou, Efstathios; Nychas, George-John; Mohareb, Fady

    2017-09-01

    Over the past decade, analytical approaches based on vibrational spectroscopy, hyperspectral/multispectral imaging, and biomimetic sensors have gained popularity as rapid and efficient methods for assessing food quality, safety, and authentication, and as a sensible alternative to expensive and time-consuming conventional microbiological techniques. Due to the multi-dimensional nature of the data generated by such analyses, the output needs to be coupled with a suitable statistical approach or machine-learning algorithm before the results can be interpreted. Choosing the optimum pattern recognition or machine learning approach for a given analytical platform is often challenging and involves a comparative analysis between various algorithms in order to achieve the best possible prediction accuracy. In this work, "MeatReg", a web-based application, is presented that automates the procedure of identifying the best machine learning method for comparing data from several analytical techniques, to predict the counts of the microorganisms responsible for meat spoilage regardless of the packaging system applied. In particular, up to seven regression methods were applied: ordinary least squares regression, stepwise linear regression, partial least squares regression, principal component regression, support vector regression, random forest, and k-nearest neighbours. "MeatReg" was tested with minced beef samples stored under aerobic and modified atmosphere packaging and analysed with an electronic nose, HPLC, FT-IR, GC-MS, and a multispectral imaging instrument. Populations of total viable counts, lactic acid bacteria, pseudomonads, Enterobacteriaceae, and B. thermosphacta were predicted. As a result, recommendations were obtained as to which analytical platforms are suitable to predict each type of bacteria and which machine learning methods to use in each case. The developed system is accessible via the link: www.sorfml.com. Copyright © 2017 Elsevier Ltd. All rights reserved.
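
    A minimal sketch of the kind of automated model ranking "MeatReg" performs, written here with scikit-learn on synthetic data; stepwise regression has no direct scikit-learn counterpart and is omitted, and all data and hyperparameters are placeholders rather than the study's settings.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor

# Stand-in for spectral data: X ~ FT-IR/multispectral features, y ~ log counts.
X, y = make_regression(n_samples=120, n_features=50, noise=5.0, random_state=0)

models = {
    "OLS":  LinearRegression(),
    "PLSR": PLSRegression(n_components=10),
    "PCR":  make_pipeline(StandardScaler(), PCA(n_components=10), LinearRegression()),
    "SVR":  make_pipeline(StandardScaler(), SVR(C=10.0)),
    "RF":   RandomForestRegressor(n_estimators=200, random_state=0),
    "kNN":  make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=5)),
}

# Rank candidate regressors by cross-validated RMSE, as an automated platform would.
for name, model in models.items():
    rmse = -cross_val_score(model, X, y, cv=5,
                            scoring="neg_root_mean_squared_error")
    print(f"{name:5s} RMSE = {rmse.mean():.2f} +/- {rmse.std():.2f}")
```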

  3. The Role of a Physical Analysis Laboratory in a 300 mm IC Development and Manufacturing Centre

    NASA Astrophysics Data System (ADS)

    Kwakman, L. F. Tz.; Bicais-Lepinay, N.; Courtas, S.; Delille, D.; Juhel, M.; Trouiller, C.; Wyon, C.; de la Bardonnie, M.; Lorut, F.; Ross, R.

    2005-09-01

    To remain competitive, IC manufacturers have to accelerate the development of the most advanced (CMOS) technology and to deliver high-yielding products with the best cycle times and at competitive pricing. With the increase in technology complexity, the need for physical characterization support also increases; however, many of the existing techniques are no longer adequate to effectively support 65-45 nm technology node developments. New and improved techniques are definitely needed to better characterize the often marginal processes, but these should not significantly impact fabrication costs or cycle time. Hence, characterization and metrology challenges in state-of-the-art IC manufacturing are both technical and economical in nature. TEM microscopy is needed for high-quality, high-volume analytical support, but several physical and practical hurdles have to be overcome. The success rate of FIB-SEM based failure analysis drops as defects often are too small to be detected and fault isolation becomes more difficult in nano-scale device structures. To remain effective and efficient, SEM and OBIRCH techniques have to be improved or complemented with other more effective methods. Chemical analysis of novel materials and critical interfaces requires improvements in the fields of, e.g., SIMS and ToF-SIMS. Techniques that previously were used only sporadically, like EBSD and XRD, have become a 'must' to properly support back-end process development. On the bright side, thanks to major technical advances, techniques that previously were practiced only at the laboratory level can now be used effectively for at-line fab metrology: Voltage Contrast based defectivity control, XPS based gate dielectric metrology, and XRD based control of copper metallization processes are practical examples. In this paper, the capabilities and shortcomings of several techniques and the corresponding equipment are presented, with practical illustrations of their use in our Crolles facilities.

  4. Adaptive steganography

    NASA Astrophysics Data System (ADS)

    Chandramouli, Rajarathnam; Li, Grace; Memon, Nasir D.

    2002-04-01

    Steganalysis techniques attempt to differentiate between stego-objects and cover-objects. In recent work we developed an explicit analytic upper bound for the steganographic capacity of LSB-based steganographic techniques at a given probability of false detection. In this paper we look at adaptive steganographic techniques, which take explicit steps to escape detection. We explore different techniques that can be used to adapt message embedding to the image content or to a known steganalysis technique. We investigate the advantages of adaptive steganography within an analytical framework. We also give experimental results with a state-of-the-art steganalysis technique, demonstrating that adaptive embedding results in a significant number of bits embedded without detection.
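
    To make the idea of content-adaptive embedding concrete, here is a toy Python sketch that embeds message bits only where local image variance is high; it is a generic illustration under assumed parameters, not the authors' scheme or their capacity bound.

```python
import numpy as np

rng = np.random.default_rng(1)

def embed_adaptive_lsb(cover: np.ndarray, bits: np.ndarray, var_thresh: float = 50.0):
    """Embed message bits in the LSBs of high-variance pixels only.

    A toy content-adaptive scheme: busy regions hide LSB changes better,
    so smooth regions (local variance below var_thresh) are left untouched.
    """
    stego = cover.copy()
    # Local variance over 3x3 neighbourhoods (crude but sufficient here).
    pad = np.pad(cover.astype(float), 1, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(pad, (3, 3))
    local_var = windows.var(axis=(-1, -2))
    ys, xs = np.nonzero(local_var > var_thresh)
    n = min(len(bits), len(ys))
    # Clear the least significant bit and overwrite it with the message bit.
    stego[ys[:n], xs[:n]] = (stego[ys[:n], xs[:n]] & 0xFE) | bits[:n]
    return stego, n

cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
bits = rng.integers(0, 2, size=500, dtype=np.uint8)
stego, embedded = embed_adaptive_lsb(cover, bits)
print(f"embedded {embedded} of {len(bits)} bits")
```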

  5. Microextraction techniques at the analytical laboratory: an efficient way for determining low amounts of residual insecticides in soils

    NASA Astrophysics Data System (ADS)

    Viñas, Pilar; Navarro, Tania; Campillo, Natalia; Fenoll, Jose; Garrido, Isabel; Cava, Juana; Hernandez-Cordoba, Manuel

    2017-04-01

    Microextraction techniques allow sensitive measurements of pollutants to be carried out with instrumentation commonly available in the analytical laboratory. This communication reports our studies focused on the determination of pyrethroid insecticides in polluted soils. These chemicals are synthetic analogues of pyrethrum widely used for pest control in agricultural and household applications. Because of their properties, pyrethroids tend to adsorb strongly onto soil particles and organic matter. Although they are considered pesticides of low toxicity to humans, long-term exposure to them may damage the immune and neurological systems. The procedure studied here is based on dispersive liquid-liquid microextraction (DLLME) and permits the determination of fifteen pyrethroid compounds (allethrin, resmethrin, tetramethrin, bifenthrin, fenpropathrin, cyhalothrin, acrinathrin, permethrin, λ-cyfluthrin, cypermethrin, flucythrinate, fenvalerate, esfenvalerate, τ-fluvalinate, and deltamethrin) in soil samples using gas chromatography with mass spectrometry (GC-MS). The analytes were first extracted from the soil samples (4 g) by treatment with 2 mL of acetonitrile, 2 mL of water, and 0.5 g of NaCl. The enriched organic phase (approximately 0.8 mL) was separated by centrifugation, and this solution was used as the dispersant in a DLLME process. The analytes did not need to be derivatized before injection into the chromatographic system, owing to their volatility and thermal stability. The different pyrethroids were identified by their retention times and mass spectra, considering the m/z values of the different fragments and their relative abundances. The detection limits were in the 0.2-23 ng g-1 range, depending on the analyte and the sample under analysis. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) and to the Spanish MINECO (Project CTQ2015-68049-R) for financial support.
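
    Detection limits like those quoted above are often derived from a low-level calibration line; the sketch below uses one common convention (LOD = 3.3 s/slope, an ICH-style rule, not necessarily the authors' procedure) on invented calibration numbers.

```python
import numpy as np

# Hypothetical low-level calibration of a pyrethroid by GC-MS:
# spiked concentration (ng/g) versus peak area (arbitrary units).
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([12.0, 118.0, 229.0, 551.0, 1102.0, 2196.0])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
s_resid = residuals.std(ddof=2)          # residual standard deviation

# One common convention: LOD = 3.3 s / slope, LOQ = 10 s / slope.
lod = 3.3 * s_resid / slope
loq = 10.0 * s_resid / slope
print(f"slope = {slope:.1f} area per ng/g, LOD = {lod:.2f} ng/g, LOQ = {loq:.2f} ng/g")
```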

  6. Primary particle diameter differentiation and bimodality identification by five analytical methods using gold nanoparticle size distributions synthesized by pulsed laser ablation in liquids

    NASA Astrophysics Data System (ADS)

    Letzel, Alexander; Gökce, Bilal; Menzel, Andreas; Plech, Anton; Barcikowski, Stephan

    2018-03-01

    For a known material, the size distribution of a nanoparticle colloid is a crucial parameter that defines its properties. However, measured size distributions are not easy to interpret, as one has to consider weighting (e.g. by light absorption, scattering intensity, volume, surface, or number) and the way the size information was gained. The radius of a suspended nanoparticle can be given as, e.g., sphere-equivalent, hydrodynamic, Feret, or radius of gyration. In this study, gold nanoparticles in water are synthesized by pulsed-laser ablation (LAL) and fragmentation (LFL) in liquids and characterized by various techniques (scanning transmission electron microscopy (STEM), small-angle X-ray scattering (SAXS), analytical disc centrifugation (ADC), dynamic light scattering (DLS), and UV-vis spectroscopy with Mie-Gans theory) to study the comparability of the different analytical techniques and determine the method that is preferable for a given task related to laser-generated nanoparticles. In particular, laser-generated colloids are known to be bimodal and/or polydisperse, but bimodality is sometimes not analytically resolved in the literature. In addition, the frequently reported small size shifts of the primary particle mode around 10 nm need an evaluation of their statistical significance in relation to the analytical method. Closely related to earlier studies on SAXS, different colloids are mixed in defined proportions and their size as a function of the nominal mixing ratio is analyzed. It is found that the derived particle size is independent of the nominal mixing ratio if the colloid size fractions do not overlap considerably. Conversely, the obtained size for colloids with overlapping size fractions depends strongly on the nominal mixing ratio, since most methods cannot distinguish between such fractions. Overall, SAXS and ADC are very accurate methods for particle size analysis. Further, the ability of the different methods to determine the nominal mixing ratio of size fractions is studied experimentally.
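
    The bimodality question can be made concrete by fitting a two-mode lognormal mixture to a number-weighted size histogram; the sketch below does this on synthetic data, and the modes, widths, and weights are illustrative assumptions rather than measured distributions.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import lognorm

rng = np.random.default_rng(0)

# Synthetic laser-generated colloid: primary mode near 10 nm plus a coarser
# mode near 40 nm (illustrative values only).
d = np.concatenate([rng.lognormal(np.log(10), 0.25, 6000),
                    rng.lognormal(np.log(40), 0.35, 2000)])
counts, edges = np.histogram(d, bins=80, range=(1, 120), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

def bimodal(x, w, mu1, s1, mu2, s2):
    """Weighted sum of two number-weighted lognormal modes."""
    return (w * lognorm.pdf(x, s1, scale=np.exp(mu1))
            + (1 - w) * lognorm.pdf(x, s2, scale=np.exp(mu2)))

p0 = [0.7, np.log(8), 0.3, np.log(50), 0.3]          # rough starting guess
popt, _ = curve_fit(bimodal, centers, counts, p0=p0)
w, mu1, s1, mu2, s2 = popt
print(f"mode 1: {np.exp(mu1):.1f} nm (weight {w:.2f}); mode 2: {np.exp(mu2):.1f} nm")
```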

  7. WHAEM: PROGRAM DOCUMENTATION FOR THE WELLHEAD ANALYTIC ELEMENT MODEL

    EPA Science Inventory

    The Wellhead Analytic Element Model (WhAEM) demonstrates a new technique for the definition of time-of-travel capture zones in relatively simple geohydrologic settings. The WhAEM package includes an analytic element model that uses superposition of (many) analytic solutions to gen...

  8. Evaluation of three new laser spectrometer techniques for in-situ carbon monoxide measurements

    NASA Astrophysics Data System (ADS)

    Zellweger, C.; Steinbacher, M.; Buchmann, B.

    2012-07-01

    Long-term time series of the atmospheric composition are essential for environmental research and thus require compatible, multi-decadal monitoring activities. However, the current data quality objectives of the World Meteorological Organization (WMO) for carbon monoxide (CO) in the atmosphere are very challenging to meet with the measurement techniques that were in use until recently. During the past few years, new spectroscopic techniques with promising properties for trace gas analysis have come onto the market. The current study compares three instruments that have recently become commercially available (since 2011) with the hitherto best available technique (vacuum UV fluorescence) and provides a link to previous comparison studies. The instruments were investigated for their performance regarding repeatability, reproducibility, drift, temperature dependence, water vapour interference, and linearity. Finally, all instruments were examined during a short measurement campaign to assess their applicability for long-term field measurements. It could be shown that the new techniques provide considerably better performance than previous techniques, although some issues such as temperature influence and cross sensitivities need further attention.

  9. An Analysis of Earth Science Data Analytics Use Cases

    NASA Technical Reports Server (NTRS)

    Shie, Chung-Lin; Kempler, Steve

    2014-01-01

    The increase in the number, volume, and sources of globally available Earth science data measurements and datasets has afforded Earth scientists and applications researchers unprecedented opportunities to study our Earth in ever more sophisticated ways. In fact, the NASA Earth Observing System Data Information System (EOSDIS) archives have doubled from 2007 to 2014, to 9.1 PB (Ramapriyan, 2009; and https://earthdata.nasa.gov/about/system-performance). In addition, other US agencies, international programs, field experiments, ground stations, and citizen scientists provide a plethora of additional sources for studying Earth. Co-analyzing huge amounts of heterogeneous data to glean non-obvious information is a daunting task. Earth science data analytics (ESDA) is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. It can include Data Preparation, Data Reduction, and Data Analysis. Through work associated with the Earth Science Information Partners (ESIP) Federation, a collection of Earth science data analytics use cases has been collected and analyzed for the purpose of extracting the types of Earth science data analytics employed and the requirements for data analytics tools and techniques yet to be implemented, based on use case needs. The ESIP-generated use case template, ESDA use cases, use case types, and a preliminary use case analysis (a work in progress) will be presented.

  10. Government/Industry Workshop on Payload Loads Technology

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A fully operational space shuttle is discussed, which will offer science the opportunity to explore near-Earth orbit and finally interplanetary space on a nearly limitless basis. This multiplicity of payload/experiment combinations and frequency of launches places many burdens on dynamicists to predict launch and landing environments accurately and efficiently. Two major problems are apparent in the attempt to design for the diverse environments: (1) balancing the design criteria (loads, etc.) between launch and orbit operations, and (2) developing analytical techniques that are reliable, accurate, efficient, and low cost to meet the challenge of multiple launches and payloads. This paper deals with the key issues inherent in these problems, the key trades required, the basic approaches needed, and a summary of state-of-the-art techniques.

  11. Fluid quantity gaging

    NASA Technical Reports Server (NTRS)

    Mord, Allan J.; Snyder, Howard A.; Kilpatrick, Kathleen A.; Hermanson, Lynn A.; Hopkins, Richard A.; Vangundy, Donald A.

    1988-01-01

    A system for measuring the mass of liquid in a tank on orbit with 1 percent accuracy was developed and demonstrated. An extensive tradeoff study identified adiabatic compression as the only gaging technique that is independent of gravity or its orientation, and of the size and distribution of bubbles in the tank. This technique is applicable to all Earth-storable and cryogenic liquids of interest for Space Station use, except superfluid helium, and can be applied to tanks of any size, shape, or internal structure. An accuracy of 0.2 percent was demonstrated in the laboratory, and a detailed analytical model was developed and verified by testing. A flight system architecture is presented that allows meeting the needs of a broad range of space fluid systems without custom development for each user.
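
    The adiabatic-compression principle reduces to a short calculation: for a small, fast (adiabatic) volume perturbation of the ullage gas, dP/P = -γ dV/V, so the gas volume, and hence the liquid volume and mass, follows from the measured pressure response. A sketch with invented numbers:

```python
# Adiabatic-compression gaging principle (illustrative values only): perturb
# the tank volume by a known dV and measure the pressure response of the
# ullage gas. For an adiabatic ideal gas, dP/P = -gamma * dV/V_gas.
gamma  = 1.40        # ratio of specific heats of the pressurant gas
P      = 200e3       # ullage pressure, Pa
V_tank = 1.00        # tank volume, m^3
dV     = -1.0e-4     # imposed volume change (small compression), m^3
dP     = 93.3        # measured pressure rise, Pa (hypothetical reading)

V_gas = -gamma * P * dV / dP           # ullage gas volume, m^3
V_liq = V_tank - V_gas                 # liquid volume by difference
rho   = 800.0                          # liquid density, kg/m^3 (assumed known)
print(f"gas {V_gas:.3f} m^3, liquid {V_liq:.3f} m^3, mass {rho * V_liq:.1f} kg")
```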

  12. Resonance Ionization, Mass Spectrometry.

    ERIC Educational Resources Information Center

    Young, J. P.; And Others

    1989-01-01

    Discussed is an analytical technique that uses photons from lasers to resonantly excite an electron from some initial state of a gaseous atom through various excited states of the atom or molecule. Described are the apparatus, some analytical applications, and the precision and accuracy of the technique. Lists 26 references. (CW)

  13. Meta-Analytic Structural Equation Modeling (MASEM): Comparison of the Multivariate Methods

    ERIC Educational Resources Information Center

    Zhang, Ying

    2011-01-01

    Meta-analytic Structural Equation Modeling (MASEM) has drawn interest from many researchers recently. In doing MASEM, researchers usually first synthesize correlation matrices across studies using meta-analysis techniques and then analyze the pooled correlation matrix using structural equation modeling techniques. Several multivariate methods of…
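
    A minimal sketch of the univariate first stage (pooling a single correlation across studies via Fisher's z, weighted by n - 3); the multivariate methods the thesis compares generalize this to whole correlation matrices. The study values below are invented.

```python
import numpy as np

# Correlations between the same pair of variables reported by three
# hypothetical studies, with their sample sizes.
r = np.array([0.30, 0.42, 0.35])
n = np.array([120, 85, 200])

# Fisher z-transform, pool with inverse-variance weights (n - 3), and
# back-transform: the univariate first stage of many MASEM approaches.
z = np.arctanh(r)
z_pooled = np.sum((n - 3) * z) / np.sum(n - 3)
r_pooled = np.tanh(z_pooled)
print(f"pooled correlation: {r_pooled:.3f}")
# In a full MASEM, every element of the correlation matrix is pooled (ideally
# jointly, to respect their dependence) and the pooled matrix is then fit
# with a structural equation model.
```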

  14. Turbine blade tip durability analysis

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.

    1981-01-01

    An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot-section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle, and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed, and the proposed life-prediction theories are evaluated.

  15. Analytical techniques for mechanistic characterization of EUV photoresists

    NASA Astrophysics Data System (ADS)

    Grzeskowiak, Steven; Narasimhan, Amrit; Murphy, Michael; Ackerman, Christian; Kaminsky, Jake; Brainard, Robert L.; Denbeaux, Greg

    2017-03-01

    Extreme ultraviolet (EUV, 13.5 nm) lithography is the prospective technology for high volume manufacturing by the microelectronics industry. Significant strides towards achieving adequate EUV source power and availability have been made recently, but a limited rate of improvement in photoresist performance still delays the implementation of EUV. Many fundamental questions remain to be answered about the exposure mechanisms of even the relatively well-understood chemically amplified EUV photoresists. Moreover, several groups around the world are developing revolutionary metal-based resists whose EUV exposure mechanisms are even less understood. Here, we describe several evaluation techniques that help elucidate the EUV exposure mechanisms of chemically amplified and metal-based resists. EUV absorption coefficients are determined experimentally by measuring the transmission through a resist coated on a silicon nitride membrane. Photochemistry can be evaluated by monitoring small outgassing reaction products, providing insight into the reactivity of photoacid generators or metal-based resists. Spectroscopic techniques such as thin-film Fourier transform infrared (FTIR) spectroscopy can measure the chemical state of a photoresist system before and after EUV exposure. Additionally, electrolysis can be used to study the interaction between photoresist components and low-energy electrons. Collectively, these techniques improve our current understanding of the photomechanisms of several EUV photoresist systems, which is needed to develop the new, better-performing materials required for high volume manufacturing.
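
    The membrane-transmission measurement maps onto the Beer-Lambert relation T = exp(-αd), so the absorption coefficient α follows from a linear fit of -ln T versus film thickness; the numbers below are invented for illustration, not measured resist data.

```python
import numpy as np

# Hypothetical EUV transmission of a resist film on a SiN membrane at several
# coating thicknesses (the bare-membrane contribution already divided out).
thickness_nm = np.array([20.0, 40.0, 60.0, 80.0])
transmission = np.array([0.904, 0.818, 0.740, 0.669])

# Beer-Lambert: T = exp(-alpha * d), so -ln(T) is linear in d with slope alpha.
alpha_per_nm, offset = np.polyfit(thickness_nm, -np.log(transmission), 1)
print(f"absorption coefficient ~ {alpha_per_nm * 1e3:.2f} per um")
```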

  16. Analytical Challenges in Biotechnology.

    ERIC Educational Resources Information Center

    Glajch, Joseph L.

    1986-01-01

    Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)

  17. An analytical and experimental evaluation of a Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. A.; Cosby, R. M.

    1976-01-01

    An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range was conducted. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based on a 56 cm wide, f/1.0 lens. A Sun-tracking heliostat provided a nonmoving solar source. Measured data indicated more spreading at the profile base than analytically predicted, resulting in a peak concentration 18 percent lower than the computed peak of 57. The measured and computed transmittances were 85 and 87 percent, respectively. Preliminary testing with a subsequent lens indicated that modified manufacturing techniques corrected the profile-spreading problem and should enable improved analytical-experimental correlation.

  18. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to be able to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of the data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  19. A strategy to determine operating parameters in tissue engineering hollow fiber bioreactors

    PubMed Central

    Shipley, RJ; Davidson, AJ; Chan, K; Chaudhuri, JB; Waters, SL; Ellis, MJ

    2011-01-01

    The development of tissue engineering hollow fiber bioreactors (HFB) requires the optimal design of the geometry and operation parameters of the system. This article provides a strategy for specifying operating conditions for the system based on mathematical models of oxygen delivery to the cell population. Analytical and numerical solutions of these models are developed based on Michaelis–Menten kinetics. Depending on the minimum oxygen concentration required to culture a functional cell population, together with the oxygen uptake kinetics, the strategy dictates the model needed to describe mass transport so that the operating conditions can be defined. If cmin ≫ Km, we capture oxygen uptake using zero-order kinetics and proceed analytically. This enables operating equations to be developed that allow the user to choose the medium flow rate, lumen length, and ECS depth to provide a prescribed value of cmin. When cmin is comparable to Km, we use numerical techniques to solve the full Michaelis–Menten kinetics and present operating data for the bioreactor. The strategy presented utilizes both analytical and numerical approaches and can be applied to any cell type with known oxygen transport properties and uptake kinetics. PMID:21370228
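
    For the regime where cmin is comparable to Km, the full Michaelis–Menten problem must be solved numerically; below is a sketch of a simplified 1-D steady diffusion-uptake analogue using SciPy's boundary-value solver. All parameter values are illustrative assumptions, not the paper's HFB geometry.

```python
import numpy as np
from scipy.integrate import solve_bvp

# Steady 1-D reaction-diffusion across a cell layer of depth h:
# D c'' = Vmax c / (Km + c), with c(0) = c0 and zero flux at x = h.
D    = 2.0e-9      # oxygen diffusivity in the cell layer, m^2/s
Vmax = 1.5e-2      # maximal volumetric uptake rate, mol m^-3 s^-1
Km   = 5.0e-3      # Michaelis constant, mol m^-3
c0   = 0.2         # oxygen concentration at the lumen-side face, mol m^-3
h    = 200e-6      # cell layer depth, m

def rhs(x, y):
    c, dc = y
    return np.vstack([dc, (Vmax / D) * c / (Km + c)])

def bc(ya, yb):
    return np.array([ya[0] - c0, yb[1]])   # fixed c at x=0, no flux at x=h

x = np.linspace(0, h, 50)
y0 = np.vstack([np.full_like(x, c0), np.zeros_like(x)])
sol = solve_bvp(rhs, bc, x, y0)
print(f"minimum oxygen concentration in the layer: {sol.y[0].min():.4f} mol/m^3")
```

    Comparing this minimum against the cmin required by the cell population is exactly the kind of check the strategy uses to set operating parameters.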

  20. [Sample preparation and bioanalysis in mass spectrometry].

    PubMed

    Bourgogne, Emmanuel; Wagner, Michel

    2015-01-01

    The quantitative analysis of compounds of clinical interest of low molecular weight (<1000 Da) in biological fluids is currently, in most cases, performed by liquid chromatography-mass spectrometry (LC-MS). Analysis of these compounds in biological fluids (plasma, urine, saliva, hair...) is a difficult task requiring sample preparation. Sample preparation is a crucial part of chemical/biological analysis and is, in a sense, considered the bottleneck of the whole analytical process. The main objectives of sample preparation are the removal of potential interferences, analyte preconcentration, and converting (if needed) the analyte into a form more suitable for detection or separation. Without chromatographic separation, endogenous compounds and co-eluting products may affect the performance of a quantitative mass spectrometry method. This work focuses on three distinct parts. First, quantitative bioanalysis is defined, together with the different matrices and sample preparation techniques currently used in mass spectrometry bioanalysis of small molecules of clinical interest in biological fluids. Second, the goals of sample preparation are described. Finally, sample preparation strategies are discussed, whether applied directly ("dilute and shoot") or after precipitation.

  1. Assessing the Value of Structured Analytic Techniques in the U.S. Intelligence Community

    DTIC Science & Technology

    2016-01-01

    Analytic Techniques, and Why Do Analysts Use Them? SATs are methods of organizing and stimulating thinking about intelligence problems. These methods... thinking; and imaginative thinking techniques encourage new perspectives, insights, and alternative scenarios. Among the many SATs in use today, the... more transparent, so that other analysts and customers can better understand how the judgments were reached. SATs also facilitate group involvement

  2. 40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  3. 40 CFR Table 4 to Subpart Zzzz of... - Requirements for Performance Tests

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be greater... ASTM D6348-03,c provided in ASTM D6348-03 Annex A5 (Analyte Spiking Technique), the percent R must be...

  4. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  5. A validation of event-related FMRI comparisons between users of cocaine, nicotine, or cannabis and control subjects.

    PubMed

    Murphy, Kevin; Dixon, Veronica; LaGrave, Kathleen; Kaufman, Jacqueline; Risinger, Robert; Bloom, Alan; Garavan, Hugh

    2006-07-01

    Noninvasive brain imaging techniques are a powerful tool for researching the effects of drug abuse on brain activation measures. However, because many drugs have direct vascular effects, the validity of techniques that depend on blood flow measures as a reflection of neuronal activity may be called into question. This may be of particular concern in event-related functional magnetic resonance imaging (fMRI), where current analytic techniques search for a specific shape in the hemodynamic response to neuronal activity. To investigate possible alterations in task-related activation as a result of drug abuse, fMRI scans were conducted on subjects in four groups as they performed a simple event-related finger-tapping task: users of cocaine, nicotine, or cannabis and control subjects. Activation measures, as determined by two different analytic methods, did not differ between the groups. A comparison between an intravenous saline and an intravenous cocaine condition in cocaine users found a similar null result. Further in-depth analyses of the shape of the hemodynamic responses in each group also showed no differences. This study demonstrates that drug groups may be compared with control subjects using event-related fMRI without the need for any post hoc procedures to correct for possible drug-induced cardiovascular alterations. Thus, fMRI activation differences reported between these drug groups can be more confidently interpreted as reflecting neuronal differences.
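
    Event-related analyses of the kind validated here typically regress the data on stimulus onsets convolved with an assumed hemodynamic response shape; the sketch below builds the widely used double-gamma form (the SPM-style parameters are a common convention, not necessarily the shape used in this study).

```python
import numpy as np
from scipy.stats import gamma

def canonical_hrf(t):
    """SPM-style double-gamma hemodynamic response (a common convention):
    a positive peak around 5-6 s minus a scaled undershoot around 15-16 s."""
    peak = gamma.pdf(t, a=6, scale=1.0)
    undershoot = gamma.pdf(t, a=16, scale=1.0)
    return peak - undershoot / 6.0

t = np.arange(0, 32, 0.1)
hrf = canonical_hrf(t)

# A design regressor: hypothetical event onsets convolved with the HRF.
onsets = np.zeros(t.size)
onsets[[50, 150, 250]] = 1.0                      # events at 5, 15 and 25 s
regressor = np.convolve(onsets, hrf)[:t.size] * 0.1   # dt scaling
print(f"HRF peaks at t = {t[np.argmax(hrf)]:.1f} s")
# A drug-altered vasculature would change the true response shape and hence
# the fit to this fixed regressor, which is what the study tested for.
```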

  6. Measurement of absolute regional lung air volumes from near-field x-ray speckles.

    PubMed

    Leong, Andrew F T; Paganin, David M; Hooper, Stuart B; Siew, Melissa L; Kitchen, Marcus J

    2013-11-18

    Propagation-based phase contrast x-ray (PBX) imaging yields high-contrast images of the lung in which airways that overlap in projection coherently scatter the x-rays, giving rise to a speckled intensity due to interference effects. Our previous work has shown that total and regional changes in lung air volume can be accurately measured from two-dimensional (2D) absorption or phase contrast images when the subject is immersed in a water-filled container. In this paper we demonstrate how the phase contrast speckle patterns can be used to measure absolute regional lung air volumes directly from 2D PBX images, without the need for a water-filled container. We justify this technique analytically and via simulation using the transport-of-intensity equation, and calibrate the technique using our existing methods for measuring lung air volume. Finally, we show the full capabilities of this technique for measuring regional differences in lung aeration.

  7. Implementing Operational Analytics using Big Data Technologies to Detect and Predict Sensor Anomalies

    NASA Astrophysics Data System (ADS)

    Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.

    2016-09-01

    Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provides insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and we use Big Data technologies, predictive algorithms, and operational analytics to process the data and predict sensor degradations. The study uses data products that would commonly be analyzed at a site, and builds on a Big Data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology for implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available Big Data sets and determining practical analytic, visualization, and predictive technologies.
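
    As a toy version of such a detector, a rolling z-score against a trailing baseline can flag when a simulated sensor metric drifts past a threshold; the data, window, and threshold below are all invented, and a production pipeline would use distributed Big Data tooling rather than a single pandas series.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)

# Simulated notional-sensor health metric: noisy baseline with a slow
# degradation trend injected after sample 600 (illustrative only).
n = 1000
metric = rng.normal(0.0, 1.0, n)
metric[600:] += np.linspace(0, 6, n - 600)       # creeping degradation
s = pd.Series(metric)

# Rolling z-score against a trailing baseline window: a simple streaming
# detector of the kind an operational-analytics pipeline might deploy.
window = 200
baseline_mean = s.rolling(window).mean().shift(1)
baseline_std = s.rolling(window).std().shift(1)
z = (s - baseline_mean) / baseline_std

threshold = 4.0
first_alert = z[z > threshold].index.min()
print(f"degradation alert first raised at sample {first_alert}")
```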

  8. Ring-oven based preconcentration technique for microanalysis: simultaneous determination of Na, Fe, and Cu in fuel ethanol by laser induced breakdown spectroscopy.

    PubMed

    Cortez, Juliana; Pasquini, Celio

    2013-02-05

    The ring-oven technique, originally applied to classical qualitative analysis from the 1950s to the 1970s, is revisited as a simple though highly efficient and green procedure for analyte preconcentration prior to determination by the microanalytical techniques presently available. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is maintained in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center to a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, about 2.0 cm in diameter, is formed on the paper and contains most of the analytes originally present in the sample volume. Preconcentration coefficients of the analyte can reach 250-fold (on a m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure were evaluated by concentrating Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour, employing the microanalytical technique of laser induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL(-1) and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)% for Na, Fe, and Cu, respectively, were obtained in fuel ethanol. It is possible to anticipate the application of the technique, coupled to modern microanalytical and multianalyte techniques, to several analytical problems requiring analyte preconcentration and/or sample stabilization.

  9. Some aspects of analytical chemistry as applied to water quality assurance techniques for reclaimed water: The potential use of X-ray fluorescence spectrometry for automated on-line fast real-time simultaneous multi-component analysis of inorganic pollutants in reclaimed water

    NASA Technical Reports Server (NTRS)

    Ling, A. C.; Macpherson, L. H.; Rey, M.

    1981-01-01

    The potential use of isotopically excited energy-dispersive X-ray fluorescence (XRF) spectrometry for automated on-line, fast, real-time (5 to 15 minutes), simultaneous multicomponent (up to 20) trace (1 to 10 parts per billion) analysis of inorganic pollutants in reclaimed water was examined. Three anionic elements (chromium(VI), arsenic, and selenium) were studied. The inherent lack of sensitivity of XRF spectrometry for these elements mandates the use of a preconcentration technique, and various methods were examined, including several direct and indirect evaporation methods, ion exchange membranes, selective and nonselective precipitation, and complexation processes. It is shown that XRF spectrometry itself is well suited for automated on-line quality assurance, and can provide a nondestructive (thus allowing sample storage and repeat analysis) and particularly convenient analytical method. Further, the use of an isotopically excited energy-dispersive unit (50 mCi Cd-109 source) coupled with a suitable preconcentration process can provide sufficient sensitivity to achieve the currently mandated minimum levels of detection without the need for high-power X-ray generating tubes.

  10. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  11. An Experiential Research-Focused Approach: Implementation in a Nonlaboratory-Based Graduate-Level Analytical Chemistry Course

    ERIC Educational Resources Information Center

    Toh, Chee-Seng

    2007-01-01

    A project is described which incorporates nonlaboratory research skills in a graduate level course on analytical chemistry. This project will help students to grasp the basic principles and concepts of modern analytical techniques and also help them develop relevant research skills in analytical chemistry.

  12. Analytical and numerical techniques for predicting the interfacial stresses of wavy carbon nanotube/polymer composites

    NASA Astrophysics Data System (ADS)

    Yazdchi, K.; Salehi, M.; Shokrieh, M. M.

    2009-03-01

    By introducing a new simplified 3D representative volume element for wavy carbon nanotubes, an analytical model is developed to study the stress transfer in single-walled carbon nanotube-reinforced polymer composites. Based on the pull-out modeling technique, the effects of waviness, aspect ratio, and Poisson's ratio on the axial and interfacial shear stresses are analyzed in detail. The results of the present analytical model are in good agreement with the corresponding results for straight nanotubes.

  13. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

    This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for the characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to characterise it better than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving the increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis was among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques that are sufficiently well established that the instrumentation is commercially available, and that examine physical properties (including mechanical, electrical, and thermal properties) in addition to variations in composition, rather than methods solely intended to identify and quantify chemical species. The review will therefore address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy-based techniques, scanning probe-based techniques, and thermal analysis based techniques. Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to understand the sample in greater detail than was possible before, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples, a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.

  14. Green's function calculations for semi-infinite carbon nanotubes

    NASA Astrophysics Data System (ADS)

    John, D. L.; Pulfrey, D. L.

    2006-02-01

    In the modeling of nanoscale electronic devices, the non-equilibrium Green's function technique is gaining increasing popularity. One complication in this method is the need for computation of the self-energy functions that account for the interactions between the active portion of a device and its leads. In the one-dimensional case, these functions may be computed analytically. In higher dimensions, a numerical approach is required. In this work, we generalize earlier methods that were developed for tight-binding Hamiltonians, and present results for the case of a carbon nanotube.
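
    For the one-dimensional case mentioned above, the lead self-energy is known in closed form, and the same quantity can be obtained from the fixed-point iteration that generalizes to higher dimensions; below is a sketch for a semi-infinite tight-binding chain (site energy 0, hopping t), which is a textbook special case rather than the paper's nanotube calculation.

```python
import numpy as np

# Retarded surface Green's function of a semi-infinite 1-D tight-binding
# chain; the lead self-energy is Sigma(E) = t^2 * g(E).
def surface_g_analytic(E, t=1.0, eta=1e-3):
    z = E + 1j * eta
    root = np.sqrt(z**2 - 4 * t**2 + 0j)
    g = (z - root) / (2 * t**2)
    if g.imag > 0:                      # pick the retarded (decaying) branch
        g = (z + root) / (2 * t**2)
    return g

# Numerical route (the kind needed in higher dimensions): iterate the
# fixed-point relation g = 1 / (z - t^2 g) to convergence.
def surface_g_iterative(E, t=1.0, eta=1e-3, tol=1e-12, max_iter=200_000):
    z = E + 1j * eta
    g = 1.0 / (z - 1j)                  # arbitrary starting guess
    for _ in range(max_iter):
        g_new = 1.0 / (z - t**2 * g)
        if abs(g_new - g) < tol:
            return g_new
        g = g_new
    return g

E, t = 0.5, 1.0
print(f"analytic  Sigma = {t**2 * surface_g_analytic(E):.6f}")
print(f"iterative Sigma = {t**2 * surface_g_iterative(E):.6f}")
```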

  15. Magic Angle Spinning NMR Metabolomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhi Hu, Jian

    Nuclear magnetic resonance (NMR) spectroscopy is a non-destructive, quantitative, reproducible, untargeted, and unbiased method that requires no or minimal sample preparation, and it is one of the leading analytical tools for metabolomics research [1-3]. The ease of quantification and the lack of any need for prior knowledge of the compounds present in a sample make NMR advantageous over other techniques [1,4]. 1H NMR is especially attractive because protons are present in virtually all metabolites and their NMR sensitivity is high, enabling the simultaneous identification and monitoring of a wide range of low molecular weight metabolites.

  16. Analytical Micromechanics Modeling Technique Developed for Ceramic Matrix Composites Analysis

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2005-01-01

    Ceramic matrix composites (CMCs) promise many advantages for next-generation aerospace propulsion systems. Specifically, carbon-reinforced silicon carbide (C/SiC) CMCs enable higher operational temperatures and provide potential component weight savings by virtue of their high specific strength. These attributes may provide systemwide benefits. Higher operating temperatures lessen or eliminate the need for cooling, thereby reducing both fuel consumption and the complex hardware and plumbing required for heat management. This, in turn, lowers system weight, size, and complexity, while improving efficiency, reliability, and service life, resulting in overall lower operating costs.

  17. Optimization of on-line hydrogen stable isotope ratio measurements of halogen- and sulfur-bearing organic compounds using elemental analyzer–chromium/high-temperature conversion isotope ratio mass spectrometry (EA-Cr/HTC-IRMS)

    USGS Publications Warehouse

    Gehre, Matthias; Renpenning, Julian; Geilmann, Heike; Qi, Haiping; Coplen, Tyler B.; Kümmel, Steffen; Ivdra, Natalija; Brand, Willi A.; Schimmelmann, Arndt

    2017-01-01

    Conclusions: The optimized EA-Cr/HTC reactor design can be implemented in existing analytical equipment using commercially available material and is universally applicable for both heteroelement-bearing and heteroelement-free organic-compound classes. The sensitivity and simplicity of the on-line EA-Cr/HTC-IRMS technique provide a much needed tool for routine hydrogen-isotope source tracing of organic contaminants in the environment. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Architecture for Business Intelligence in the Healthcare Sector

    NASA Astrophysics Data System (ADS)

    Lee, Sang Young

    2018-03-01

    The healthcare environment is growing to include not only traditional information systems but also business intelligence platforms. For executive leaders, consultants, and analysts, there is no longer a need to spend hours designing and developing typical reports or charts; an entire solution can be completed using business intelligence software. The current paper highlights the advantages of big data analytics and business intelligence in the healthcare industry. In this paper we focus our discussion on intelligent techniques and methodologies which have recently been used for business intelligence in healthcare.

  19. Critical review of dog detection and the influences of physiology, training, and analytical methodologies.

    PubMed

    Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M

    2018-08-01

    Detection dogs serve a plethora of roles within modern society and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs have involved ambiguity. This is partially because the assessment of the effectiveness of detection dogs remains entrenched within a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs rest on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies provide both an alternative and a complement to the detection dog industry; however, the interrelationship between these two detection paradigms requires clarification. These factors, when their relative contributions are considered, illustrate a need to address research gaps, formalise the detection dog industry and research process, and take into consideration analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of the factors involved in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Inter-laboratory comparison of X-ray fluorescence analyses of eruptive products of El Chichón Volcano, Chiapas, Mexico

    USGS Publications Warehouse

    Tilling, Robert I.; Bornhorst, Theodore J.; Taggart, Joseph E.; Rose, William I.; McGee, James J.

    1987-01-01

    An inter-laboratory comparison has been made of X-ray fluorescence analyses of 10 samples of lava and pumices from El Chichón Volcano, Chiapas, Mexico. Some determinations of major-element constituents agree within analytical uncertainty, whereas others exhibit significant bias. Analyses carried out at the Michigan Technological University (MTU) laboratory are systematically lower in MgO (26–48%), total Fe (5–18%), and CaO (4–15%), and higher in K2O (0–15%), than analyses made at the U.S. Geological Survey (USGS) Denver laboratory. These differences are ascribed in part to a complex combination of calibration assumptions and mineralogical and particle-size effects inherent in the use of pressed rock-powder pellets in the analytical procedure of the MTU laboratory. Other, as yet unknown, differences in sample preparation and/or analytical technique may also be important; effects related to natural sample inhomogeneity are believed to be insignificant. The inter-laboratory differences in the analytical data complicated an accurate assessment of whether El Chichón magmas have changed composition during the past 300 000 years. Knowledge of such change is needed for understanding the petrogenetic history and for related studies such as the evaluation of volcanic hazards.

  1. An Analytical Solution for Transient Thermal Response of an Insulated Structure

    NASA Technical Reports Server (NTRS)

    Blosser, Max L.

    2012-01-01

    An analytical solution was derived for the transient response of an insulated aerospace vehicle structure subjected to a simplified heat pulse. This simplified problem approximates the thermal response of a thermal protection system of an atmospheric entry vehicle. The exact analytical solution is solely a function of two non-dimensional parameters. A simpler function of these two parameters was developed to approximate the maximum structural temperature over a wide range of parameter values. Techniques were developed to choose constant, effective properties to represent the relevant temperature and pressure-dependent properties for the insulator and structure. A technique was also developed to map a time-varying surface temperature history to an equivalent square heat pulse. Using these techniques, the maximum structural temperature rise was calculated using the analytical solutions and shown to typically agree with finite element simulations within 10 to 20 percent over the relevant range of parameters studied.
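
    The exact solution itself is not reproduced here, but a minimal lumped-parameter sketch (all property and pulse values assumed) captures the modeled situation, a structure behind insulation driven by a square surface-temperature pulse, and checks the numerical march against the first-order closed form:

        import numpy as np

        # Assumed values: structure heat capacity C and insulation conductance G,
        # both per unit area, plus a square pulse of height dTs and width tp.
        C = 5.0e3                 # J/(m^2 K)
        G = 20.0                  # W/(m^2 K)
        dTs, tp = 800.0, 600.0    # K, s

        dt = 0.1
        T, T_max = 0.0, 0.0       # structure temperature rise above initial
        for t in np.arange(0.0, 3000.0, dt):
            Ts = dTs if t < tp else 0.0
            T += dt * G * (Ts - T) / C      # explicit Euler step
            T_max = max(T_max, T)

        # For this first-order model the peak occurs at the end of the pulse:
        # T_max = dTs * (1 - exp(-G*tp/C)); the two values should agree closely.
        print(T_max, dTs * (1.0 - np.exp(-G * tp / C)))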

  2. Optical trapping for analytical biotechnology.

    PubMed

    Ashok, Praveen C; Dholakia, Kishan

    2012-02-01

    We describe the exciting advances of using optical trapping in the field of analytical biotechnology. This technique has opened up opportunities to manipulate biological particles at the single-cell or even subcellular level, which has given insight into the physical and chemical mechanisms of many biological processes. The ability of this technique to manipulate microparticles and measure pico-Newton forces has found several applications, such as understanding the dynamics of biological macromolecules, cell-cell interactions and the micro-rheology of both cells and fluids. Furthermore, we may probe and analyse the biological world when combining trapping with analytical techniques such as Raman spectroscopy and imaging. Copyright © 2011 Elsevier Ltd. All rights reserved.
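
    Pico-Newton force measurement in a trap typically rests on calibrating the trap stiffness; a minimal sketch of the standard equipartition calibration on simulated bead positions (the stiffness, temperature and displacement below are assumed values):

        import numpy as np

        rng = np.random.default_rng(0)
        kB, T = 1.380649e-23, 298.0      # Boltzmann constant (J/K), temperature (K)

        # Simulated bead positions (m) in a harmonic trap; in practice these come
        # from quadrant-photodiode or video tracking of the trapped particle.
        k_true = 5e-6                    # assumed trap stiffness, N/m
        x = rng.normal(0.0, np.sqrt(kB * T / k_true), 100_000)

        # Equipartition: <x^2> = kB*T/k, so k = kB*T / var(x)
        k_est = kB * T / np.var(x)

        # Hookean force at a 50 nm displacement, expressed in piconewtons
        F_pN = k_est * 50e-9 * 1e12
        print(f"stiffness ~ {k_est:.2e} N/m, force at 50 nm ~ {F_pN:.2f} pN")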

  3. Nuclear and atomic analytical techniques in environmental studies in South America.

    PubMed

    Paschoa, A S

    1990-01-01

    The use of nuclear analytical techniques for environmental studies in South America is selectively reviewed, from the early cosmic-ray work of Lattes to the recent applications of the PIXE (particle-induced X-ray emission) technique to study air pollution problems in large cities such as São Paulo and Rio de Janeiro. The studies on natural radioactivity and fallout from nuclear weapons in South America are briefly examined.

  4. Laser ablation/ionization characterization of solids: Second interim progress report of the strategic environmental research development program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, W.P.; Bushaw, B.A.; McCarthy, M.I.

    1996-10-01

    The Department of Energy is undertaking the enormous task of remediating defense wastes and environmental insults which have occurred over 50 years of nuclear weapons production. It is abundantly clear that significant technology advances are needed to characterize, process, and store highly radioactive waste and to remediate contaminated zones. In addition to the processing and waste form issues, analytical technologies needed for the characterization of solids, and for monitoring storage tanks and contaminated sites, do not exist or are currently expensive labor-intensive tasks. This report describes progress in developing sensitive, rapid, and widely applicable laser-based mass spectrometry techniques for analysis of mixed chemical wastes and contaminated soils.

  5. Molecular signature of complex regional pain syndrome (CRPS) and its analysis.

    PubMed

    König, Simone; Schlereth, Tanja; Birklein, Frank

    2017-10-01

    Complex Regional Pain Syndrome (CRPS) is a rare, but often disabling, pain disease. Biomarkers are lacking, but several inflammatory substances have been associated with the pathophysiology. This review outlines the current knowledge with respect to target biomolecules and the analytical tools available to measure them. Areas covered: Targets include cytokines, neuropeptides and resolvins; analysis strategies are thus needed for different classes of substances such as proteins, peptides, lipids and small molecules. Traditional methods like immunoassays remain important alongside state-of-the-art high-resolution mass spectrometry techniques and 'omics' approaches. Expert commentary: Future biomarker studies need larger cohorts, which will improve subgrouping of patients according to their presumed pathophysiology, and highly standardized workflows from sampling to analysis.

  6. On the Development of a Deterministic Three-Dimensional Radiation Transport Code

    NASA Technical Reports Server (NTRS)

    Rockell, Candice; Tweed, John

    2011-01-01

    Since astronauts on future deep space missions will be exposed to dangerous radiations, there is a need to accurately model the transport of radiation through shielding materials and to estimate the received radiation dose. In response to this need a three dimensional deterministic code for space radiation transport is now under development. The new code GRNTRN is based on a Green's function solution of the Boltzmann transport equation that is constructed in the form of a Neumann series. Analytical approximations will be obtained for the first three terms of the Neumann series and the remainder will be estimated by a non-perturbative technique. This work discusses progress made to date and exhibits some computations based on the first two Neumann series terms.
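
    A toy sketch of the Neumann-series idea for an operator equation x = b + Kx; the 4x4 random contraction below merely stands in for the transport kernel, and the partial sums are seen to approach the exact solution:

        import numpy as np

        rng = np.random.default_rng(1)
        K = 0.1 * rng.random((4, 4))     # contraction, so the series converges
        b = rng.random(4)

        exact = np.linalg.solve(np.eye(4) - K, b)

        # Partial sums x_n = b + K b + ... + K^n b; the code keeps the first
        # few terms explicitly, mirroring the analytic treatment of the
        # leading Neumann terms described above.
        x, term = b.copy(), b.copy()
        for n in range(1, 4):
            term = K @ term
            x += term
            print(n, np.linalg.norm(exact - x))   # error shrinks with each term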

  7. Green aspects, developments and perspectives of liquid phase microextraction techniques.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2014-02-01

    Determination of analytes at trace levels in complex samples (e.g. biological samples or contaminated water or soils) is often required for environmental assessment and monitoring as well as for scientific research in the field of environmental pollution. A limited number of analytical techniques are sensitive enough for the direct determination of trace components in samples; because of that, a preliminary step of analyte isolation/enrichment prior to analysis is required in many cases. In this work the newest trends and innovations in liquid phase microextraction, such as single-drop microextraction (SDME), hollow fiber liquid-phase microextraction (HF-LPME), and dispersive liquid-liquid microextraction (DLLME), have been discussed, including their critical evaluation and possible application in analytical practice. The described modifications of extraction techniques deal with system miniaturization and/or automation, the use of ultrasound and physical agitation, and electrochemical methods. Particular attention was given to pro-ecological aspects; therefore, the possible use of novel, non-toxic extracting agents, inter alia ionic liquids, coacervates, surfactant solutions and reverse micelles, in liquid phase microextraction techniques has been evaluated in depth. Also presented are new methodological solutions and the related instruments and devices for the efficient liquid phase microextraction of analytes, which have found application at the stage of the procedure prior to chromatographic determination. © 2013 Published by Elsevier B.V.
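
    Two figures of merit recur throughout this microextraction literature, the enrichment factor and the extraction recovery; a minimal sketch using the standard definitions with invented concentrations and volumes (not values from this review):

        # EF = C_acceptor / C_sample;  R% = EF * (V_acceptor / V_sample) * 100
        c_sample, c_acceptor = 2.0, 180.0    # ng/mL before and after extraction
        v_sample, v_acceptor = 10.0, 0.05    # mL donor and acceptor volumes

        ef = c_acceptor / c_sample
        recovery = ef * (v_acceptor / v_sample) * 100.0
        print(f"EF = {ef:.0f}, recovery = {recovery:.0f}%")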

  8. Pre-analytic and analytic sources of variations in thiopurine methyltransferase activity measurement in patients prescribed thiopurine-based drugs: A systematic review.

    PubMed

    Loit, Evelin; Tricco, Andrea C; Tsouros, Sophia; Sears, Margaret; Ansari, Mohammed T; Booth, Ronald A

    2011-07-01

    Low thiopurine S-methyltransferase (TPMT) enzyme activity is associated with increased thiopurine drug toxicity, particularly myelotoxicity. Pre-analytic and analytic variables for TPMT genotype and phenotype (enzyme activity) testing were reviewed. A systematic literature review was performed, and diagnostic laboratories were surveyed. Thirty-five studies reported relevant data for pre-analytic variables (patient age, gender, race, hematocrit, co-morbidity, co-administered drugs and specimen stability) and thirty-three for analytic variables (accuracy, reproducibility). TPMT is stable in blood when stored for up to 7 days at room temperature, and 3 months at -30°C. Pre-analytic patient variables do not affect TPMT activity. Fifteen drugs studied to date exerted no clinically significant effects in vivo. Enzymatic assay is the preferred technique. Radiochemical and HPLC techniques had intra- and inter-assay coefficients of variation (CVs) below 10%. TPMT is a stable enzyme, and its assay is not affected by age, gender, race or co-morbidity. Copyright © 2011. Published by Elsevier Inc.

  9. Big Data Analytics with Datalog Queries on Spark.

    PubMed

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2016-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
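
    The recursion that BigDatalog optimizes can be emulated in plain Spark by iterating joins to a fixpoint; a sketch of naive transitive-closure evaluation in PySpark (schema and data invented), the kind of driver-side loop the paper's compilation and optimization techniques are designed to replace with far more efficient plans:

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("tc-sketch").getOrCreate()

        # edge(src, dst) facts; the Datalog program being emulated is:
        #   tc(X,Y) <- edge(X,Y).
        #   tc(X,Y) <- tc(X,Z), edge(Z,Y).
        edges = spark.createDataFrame([(1, 2), (2, 3), (3, 4)], ["src", "dst"])
        hop = edges.toDF("mid", "dst2")          # renamed copy for the join

        tc = edges
        while True:
            new = (tc.join(hop, tc.dst == hop.mid)
                     .select(tc.src, hop.dst2.alias("dst")))
            grown = tc.union(new).distinct()
            if grown.count() == tc.count():      # fixpoint: nothing new derived
                break
            tc = grown

        tc.orderBy("src", "dst").show()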

  10. Big Data Analytics with Datalog Queries on Spark

    PubMed Central

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2017-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics. PMID:28626296

  11. Assessing the service quality of Iran military hospitals: Joint Commission International standards and Analytic Hierarchy Process (AHP) technique

    PubMed Central

    Bahadori, Mohammadkarim; Ravangard, Ramin; Yaghoubi, Maryam; Alimohammadzadeh, Khalil

    2014-01-01

    Background: Military hospitals are responsible for preserving, restoring and improving the health of not only armed forces, but also other people. According to the military organizations' strategy of being a leader and pioneer in all areas, providing quality health services is one of the main goals of the military health care organizations. This study aimed to evaluate the service quality of selected military hospitals in Iran based on the Joint Commission International (JCI) standards, comparing these hospitals with each other and ranking them using the analytic hierarchy process (AHP) technique in 2013. Materials and Methods: This was a cross-sectional and descriptive study conducted on five military hospitals, selected using the purposive sampling method, in 2013. Required data were collected using checklists of accreditation standards and the nominal group technique. The AHP technique was used for prioritization. Furthermore, Expert Choice 11.0 was used to analyze the collected data. Results: Among JCI standards, the standards of access to care and continuity of care (weight = 0.122), quality improvement and patient safety (weight = 0.121) and leadership and management (weight = 0.117) had the greatest importance, respectively. Furthermore, in the overall ranking, BGT (weight = 0.369), IHM (0.238), SAU (0.202), IHK (weight = 0.125) and SAB (weight = 0.066) ranked first to fifth, respectively. Conclusion: AHP is an appropriate technique for measuring the overall performance of hospitals and their quality of services. It is a holistic approach that takes all hospital processes into consideration. The results of the present study can be used to improve hospital performance through identifying areas in need of focus for quality improvement and selecting strategies to improve service quality. PMID:25250364
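
    For reference, a minimal sketch of how AHP turns a pairwise comparison matrix into priority weights plus a consistency check (the 3x3 matrix is invented, not the study's data; RI = 0.58 is Saaty's random index for n = 3):

        import numpy as np

        # Pairwise comparisons on Saaty's 1-9 scale: A[i, j] is how much more
        # important criterion i is than criterion j.
        A = np.array([[1.0, 2.0, 3.0],
                      [1/2, 1.0, 2.0],
                      [1/3, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        i = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, i].real)
        w /= w.sum()                             # priority weights

        n = A.shape[0]
        CI = (eigvals[i].real - n) / (n - 1)     # consistency index
        CR = CI / 0.58                           # consistency ratio (< 0.1 is OK)
        print("weights:", np.round(w, 3), " CR:", round(CR, 3))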

  12. Data Filtering in Instrumental Analyses with Applications to Optical Spectroscopy and Chemical Imaging

    ERIC Educational Resources Information Center

    Vogt, Frank

    2011-01-01

    Most measurement techniques have some limitations imposed by a sensor's signal-to-noise ratio (SNR). Thus, in analytical chemistry, methods for enhancing the SNR are of crucial importance and can be ensured experimentally or established via pre-treatment of digitized data. In many analytical curricula, instrumental techniques are given preference…
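
    One widely used digital pre-treatment for SNR enhancement is Savitzky-Golay smoothing; a minimal sketch on a synthetic spectral peak (all signal and noise parameters invented):

        import numpy as np
        from scipy.signal import savgol_filter

        rng = np.random.default_rng(2)
        x = np.linspace(-1.0, 1.0, 401)
        clean = np.exp(-0.5 * (x / 0.08) ** 2)         # idealized spectral peak
        noisy = clean + rng.normal(0.0, 0.05, x.size)  # added detector noise

        smoothed = savgol_filter(noisy, window_length=21, polyorder=3)

        def snr(sig):
            # peak height over residual noise, relative to the known clean signal
            return clean.max() / np.std(sig - clean)

        print(f"SNR before: {snr(noisy):.1f}, after: {snr(smoothed):.1f}")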

  13. Is Quality/Effectiveness An Empirically Demonstrable School Attribute? Statistical Aids for Determining Appropriate Levels of Analysis.

    ERIC Educational Resources Information Center

    Griffith, James

    2002-01-01

    Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…

  14. Need total sulfur content? Use chemiluminescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kubala, S.W.; Campbell, D.N.; DiSanzo, F.P.

    Regulations issued by the United States Environmental Protection Agency require petroleum refineries to reduce or control the amount of total sulfur present in their refined products. These legislative requirements have led many refineries to search for online instrumentation that can produce accurate and repeatable total sulfur measurements within allowed levels. Several analytical methods currently exist to measure total sulfur content. They include X-ray fluorescence (XRF), microcoulometry, lead acetate tape, and pyrofluorescence techniques. Sulfur-specific chemiluminescence detection (SSCD) has recently received much attention due to its linearity, selectivity, sensitivity, and equimolar response. However, its use has been largely confined to the area of gas chromatography. This article focuses on the special design considerations and analytical utility of an SSCD system developed to determine total sulfur content in gasoline. The system exhibits excellent linearity and selectivity, the ability to detect low minimum levels, and an equimolar response to various sulfur compounds. 2 figs., 2 tabs.

  15. Singlet oxygen-based electrosensing by molecular photosensitizers

    NASA Astrophysics Data System (ADS)

    Trashin, Stanislav; Rahemi, Vanoushe; Ramji, Karpagavalli; Neven, Liselotte; Gorun, Sergiu M.; de Wael, Karolien

    2017-07-01

    Enzyme-based electrochemical biosensors are an inspiration for the development of (bio)analytical techniques. However, the instability and reproducibility of the reactivity of enzymes, combined with the need for chemical reagents for sensing remain challenges for the construction of useful devices. Here we present a sensing strategy inspired by the advantages of enzymes and photoelectrochemical sensing, namely the integration of aerobic photocatalysis and electrochemical analysis. The photosensitizer, a bioinspired perfluorinated Zn phthalocyanine, generates singlet-oxygen from air under visible light illumination and oxidizes analytes, yielding electrochemically-detectable products while resisting the oxidizing species it produces. Compared with enzymatic detection methods, the proposed strategy uses air instead of internally added reactive reagents, features intrinsic baseline correction via on/off light switching and shows C-F bonds-type enhanced stability. It also affords selectivity imparted by the catalytic process and nano-level detection, such as 20 nM amoxicillin in μl sample volumes.

  16. Closed Loop Requirements and Analysis Management

    NASA Technical Reports Server (NTRS)

    Lamoreaux, Michael; Verhoef, Brett

    2015-01-01

    Effective systems engineering involves the use of analysis in the derivation of requirements and verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur in between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during the change evaluation processes, and analyses are leveraged during the design verification process. Recommendations on concept validation case studies are also discussed.

  17. Determination of gamma-aminobutyric acid in food matrices by isotope dilution hydrophilic interaction chromatography coupled to mass spectrometry.

    PubMed

    Zazzeroni, Raniero; Homan, Andrew; Thain, Emma

    2009-08-01

    The estimation of the dietary intake of gamma-aminobutyric acid (GABA) depends upon knowledge of its concentration values in food matrices. To this end, an isotope dilution liquid chromatography-mass spectrometry method has been developed employing the hydrophilic interaction chromatography technique for analyte separation. This approach enabled accurate quantification of GABA in apple, potato, soybeans, and orange juice without the need for a pre- or post-column derivatization reaction. A selective and precise analytical measurement has been obtained with a triple quadrupole mass spectrometer operating in multiple reaction monitoring, using the method of standard additions and GABA-d(6) as an internal standard. The concentrations of GABA found in the matrices tested are 7 microg/g of apple, 342 microg/g of potatoes, 211 microg/g of soybeans, and 344 microg/mL of orange juice.
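
    A minimal sketch of the method of standard additions used here: spike equal aliquots with increasing amounts of analyte, fit signal versus added concentration, and read the unknown concentration from the magnitude of the x-intercept (the aliquot data below are invented):

        import numpy as np

        added = np.array([0.0, 100.0, 200.0, 300.0])   # added GABA, ug/g
        signal = np.array([1.05, 1.36, 1.68, 1.99])    # MRM response ratio

        slope, intercept = np.polyfit(added, signal, 1)
        c_unknown = intercept / slope                  # |x-intercept|
        print(f"GABA in sample ~ {c_unknown:.0f} ug/g")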

  18. Forensic collection of trace chemicals from diverse surfaces with strippable coatings.

    PubMed

    Jakubowski, Michael J; Beltis, Kevin J; Drennan, Paul M; Pindzola, Bradford A

    2013-11-07

    Surface sampling for chemical analysis plays a vital role in environmental monitoring, industrial hygiene, homeland security and forensics. The standard surface sampling tool, a simple cotton gauze pad, is failing to meet the needs of the community as analytical techniques become more sensitive and the variety of analytes increases. In previous work, we demonstrated the efficacy of non-destructive, conformal, spray-on strippable coatings for chemical collection from simple glass surfaces. Here we expand that work by presenting chemical collection at a low spiking level (0.1 g m⁻²) from a diverse array of common surfaces - painted metal, engineering plastics, painted wallboard and concrete - using strippable coatings. The collection efficiency of the strippable coatings is compared to that of gauze pads and far exceeds it. Collection from concrete, a particular challenge for wipes like gauze, averaged 73% over eight chemically diverse compounds for the strippable coatings, whereas gauze averaged 10%.

  19. Applying predictive analytics to develop an intelligent risk detection application for healthcare contexts.

    PubMed

    Moghimi, Fatemeh Hoda; Cheung, Michael; Wickramasinghe, Nilmini

    2013-01-01

    Healthcare is an information-rich industry where successful outcomes require the processing of multi-spectral data and sound decision making. The exponential growth of data and big data issues, coupled with a rapid increase of service demands in healthcare contexts today, requires a robust framework enabled by IT (information technology) solutions as well as real-time service handling in order to ensure superior decision making and successful healthcare outcomes. Such a context is appropriate for the application of real-time intelligent risk detection decision support systems using predictive analytic techniques such as data mining. To illustrate the power and potential of data science technologies in healthcare decision making scenarios, the use of an intelligent risk detection (IRD) model is proffered for the context of Congenital Heart Disease (CHD) in children, an area which requires complex high-risk decisions that need to be made expeditiously and accurately in order to ensure successful healthcare outcomes.

  20. Single-Step Reagentless Laser Scribing Fabrication of Electrochemical Paper-Based Analytical Devices.

    PubMed

    de Araujo, William R; Frasson, Carolina M R; Ameku, Wilson A; Silva, José R; Angnes, Lúcio; Paixão, Thiago R L C

    2017-11-20

    A single-step laser scribing process is used to pattern nanostructured electrodes on paper-based devices. The facile and low-cost technique eliminates the need for chemical reagents or controlled conditions. This process involves the use of a CO2 laser to pyrolyze the surface of the paperboard, producing a conductive porous non-graphitizing carbon material composed of graphene sheets and composites with aluminosilicate nanoparticles. The new electrode material was extensively characterized, and it exhibits high conductivity and an enhanced active/geometric area ratio; it is thus well-suited for electrochemical purposes. As a proof-of-concept, the devices were successfully employed for different analytical applications in the clinical, pharmaceutical, food, and forensic fields. The scalable and green fabrication method associated with the features of the new material is highly promising for the development of portable electrochemical devices. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Singlet oxygen-based electrosensing by molecular photosensitizers

    PubMed Central

    Trashin, Stanislav; Rahemi, Vanoushe; Ramji, Karpagavalli; Neven, Liselotte; Gorun, Sergiu M.; De Wael, Karolien

    2017-01-01

    Enzyme-based electrochemical biosensors are an inspiration for the development of (bio)analytical techniques. However, the instability and reproducibility of the reactivity of enzymes, combined with the need for chemical reagents for sensing remain challenges for the construction of useful devices. Here we present a sensing strategy inspired by the advantages of enzymes and photoelectrochemical sensing, namely the integration of aerobic photocatalysis and electrochemical analysis. The photosensitizer, a bioinspired perfluorinated Zn phthalocyanine, generates singlet-oxygen from air under visible light illumination and oxidizes analytes, yielding electrochemically-detectable products while resisting the oxidizing species it produces. Compared with enzymatic detection methods, the proposed strategy uses air instead of internally added reactive reagents, features intrinsic baseline correction via on/off light switching and shows C-F bonds-type enhanced stability. It also affords selectivity imparted by the catalytic process and nano-level detection, such as 20 nM amoxicillin in μl sample volumes.

  2. Single Particle-Inductively Coupled Plasma Mass Spectroscopy Analysis of Metallic Nanoparticles in Environmental Samples with Large Dissolved Analyte Fractions.

    PubMed

    Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I

    2016-10-18

    There is increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved. While sample dilution is often necessary to achieve the low analyte concentrations necessary for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.
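
    Once particle pulses are resolved from the dissolved background, particulate mass and size follow from a dissolved-standard calibration; a minimal sketch assuming spherical Ag particles (the sensitivity and pulse intensities are invented):

        import numpy as np

        counts = np.array([1200.0, 950.0, 2100.0])  # background-subtracted pulses
        sens = 2000.0                               # counts per fg Ag (assumed)
        rho = 10.49                                 # Ag density, g/cm^3

        mass_g = counts / sens * 1e-15              # pulse -> particle mass (g)
        d_cm = (6.0 * mass_g / (np.pi * rho)) ** (1.0 / 3.0)
        print(np.round(d_cm * 1e7, 1), "nm")        # equivalent spherical diameter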

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Audrey Noreen

    Homeland security relies heavily on analytical chemistry to identify suspicious materials and persons. Traditionally this role has focused on attribution, determining the type and origin of an explosive, for example. But as technology advances, analytical chemistry can and will play an important role in the prevention and preemption of terrorist attacks. More sensitive and selective detection techniques can allow suspicious materials and persons to be identified even before a final destructive product is made. The work presented herein focuses on the use of commercial and novel detection techniques for application to the prevention of terrorist activities. Although drugs are not commonly thought of when discussing terrorism, narcoterrorism has become a significant threat in the 21st century. The role of the drug trade in the funding of terrorist groups is prevalent; thus, reducing the trafficking of illegal drugs can play a role in the prevention of terrorism by cutting off much needed funding. To do so, sensitive, specific, and robust analytical equipment is needed to quickly identify a suspected drug sample no matter what matrix it is in. Single Particle Aerosol Mass Spectrometry (SPAMS) is a novel technique that has previously been applied to biological and chemical detection. The current work applies SPAMS to drug analysis, identifying the active ingredients in single component, multi-component, and multi-tablet drug samples in a relatively non-destructive manner. In order to do so, a sampling apparatus was created to allow particle generation from drug tablets with on-line introduction to the SPAMS instrument. Rules trees were developed to automate the identification of drug samples on a single particle basis. A novel analytical scheme was also developed to identify suspect individuals based on chemical signatures in human breath. Human breath was sampled using an RTube™ and the trace volatile organic compounds (VOCs) were preconcentrated using solid phase microextraction (SPME) and identified using gas chromatography - mass spectrometry (GC-MS). Modifications to the sampling apparatus allowed for increased VOC collection efficiency, and reduced the time of sampling and analysis by over 25%. The VOCs are present in breath due to either endogenous production, or exposure to an external source through absorption, inhalation, or ingestion. Detection of these exogenous chemicals can provide information on the prior location and activities of the subject. Breath samples collected before and after exposure in a hardware store and nail salon were analyzed to investigate the prior location of a subject; breath samples collected before and after oral exposure to terpenes and terpenoid compounds, pseudoephedrine, and inhalation exposure to hexamine and other explosive related compounds were analyzed to investigate the prior activity of a subject. The elimination of such compounds from the body was also monitored. In application, this technique may provide an early warning system to identify persons of interest in the prevention and preemption stages of homeland security.

  4. Analytical applications of microbial fuel cells. Part II: Toxicity, microbial activity and quantification, single analyte detection and other uses.

    PubMed

    Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo

    2015-01-15

    Microbial fuel cells were rediscovered twenty years ago and are now a very active research area. The reasons behind this new activity are the relatively recent discovery of electrogenic or electroactive bacteria and the vision of two important practical applications: wastewater treatment coupled with clean energy production, and power supply systems for isolated low-power sensor devices. Although some analytical applications of MFCs were proposed earlier (such as biochemical oxygen demand sensing), only lately has a myriad of new uses of this technology been presented by research groups around the world, combining both biological-microbiological and electroanalytical expertise. This is the second part of a review of MFC applications in the area of analytical sciences. In Part I, a general introduction to biologically based analytical methods including bioassays, biosensors, MFC designs and operating principles, as well as perhaps the main and earliest application, use as a BOD sensor, was reviewed. In Part II, other proposed uses are presented and discussed. Like other microbially based analytical systems, MFCs are satisfactory systems for measuring and integrating complex parameters that are difficult or impossible to measure otherwise, such as water toxicity (where the toxic effect on aquatic organisms needs to be integrated). We explore here the methods proposed to measure toxicity, microbial metabolism, and, being of special interest to space exploration, life sensors. Also, some methods with higher specificity, proposed to detect a single analyte, are presented. Different possibilities to increase selectivity and sensitivity by using molecular biology or other modern techniques are also discussed here. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Quantitative and qualitative sensing techniques for biogenic volatile organic compounds and their oxidation products.

    PubMed

    Kim, Saewung; Guenther, Alex; Apel, Eric

    2013-07-01

    The physiological production mechanisms of some of the organics in plants, commonly known as biogenic volatile organic compounds (BVOCs), have been known for more than a century. Some BVOCs are emitted to the atmosphere and play a significant role in tropospheric photochemistry, especially in ozone and secondary organic aerosol (SOA) production, as a result of the interplay between BVOCs and atmospheric radicals such as hydroxyl radical (OH), ozone (O3) and NOX (NO + NO2). These findings have been drawn from comprehensive analysis of numerous field and laboratory studies that have characterized the ambient distribution of BVOCs and their oxidation products, and reaction kinetics between BVOCs and atmospheric oxidants. These investigations are limited by the capacity for identifying and quantifying these compounds. This review highlights the major analytical techniques that have been used to observe BVOCs and their oxidation products, such as gas chromatography, mass spectrometry with hard and soft ionization methods, and optical techniques from laser-induced fluorescence (LIF) to remote sensing. In addition, we discuss how new analytical techniques can advance our understanding of BVOC photochemical processes. The principles, advantages, and drawbacks of the analytical techniques are discussed along with specific examples of how the techniques were applied in field and laboratory measurements. Since a number of thorough review papers on each specific analytical technique are available, readers are referred to these publications rather than provided with exhaustive descriptions of each technique here. The aim of this review is thus for readers to grasp the advantages and disadvantages of various sensing techniques for BVOCs and their oxidation products, and to provide guidance for choosing the optimal technique for a specific research task.

  6. Analytical techniques for characterization of cyclodextrin complexes in the solid state: A review.

    PubMed

    Mura, Paola

    2015-09-10

    Cyclodextrins are cyclic oligosaccharides able to form inclusion complexes with a variety of hydrophobic guest molecules, positively modifying their physicochemical properties. A thorough analytical characterization of cyclodextrin complexes is of fundamental importance to provide adequate support in the selection of the most suitable cyclodextrin for each guest molecule, and also in view of possible future patenting and marketing of drug-cyclodextrin formulations. The demonstration of the actual formation of a drug-cyclodextrin inclusion complex in solution does not guarantee its existence in the solid state as well. Moreover, the technique used to prepare the solid complex can strongly influence the properties of the final product. Therefore, an appropriate characterization of the drug-cyclodextrin solid systems obtained also has a key role in guiding the choice of the most effective preparation method, able to maximize host-guest interactions. The analytical characterization of drug-cyclodextrin solid systems and the assessment of actual inclusion complex formation is not a simple task and involves the combined use of several analytical techniques, whose results have to be evaluated together. The objective of the present review is to give a general overview of the principal analytical techniques which can be employed for a suitable characterization of drug-cyclodextrin systems in the solid state, highlighting their respective potential advantages and limits. The applications of each examined technique are described and discussed through pertinent examples from the literature. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Genome wide approaches to identify protein-DNA interactions.

    PubMed

    Ma, Tao; Ye, Zhenqing; Wang, Liguo

    2018-05-29

    Transcription factors are DNA-binding proteins that play key roles in many fundamental biological processes. Unraveling their interactions with DNA is essential to identify their target genes and understand the regulatory network. Genome-wide identification of their binding sites became feasible thanks to recent progress in experimental and computational approaches. ChIP-chip, ChIP-seq, and ChIP-exo are three widely used techniques to demarcate genome-wide transcription factor binding sites. This review aims to provide an overview of these three techniques, including their experimental procedures, computational approaches, and popular analytic tools. ChIP-chip, ChIP-seq, and ChIP-exo have been the major techniques to study genome-wide in vivo protein-DNA interactions. Due to the rapid development of next-generation sequencing technology, array-based ChIP-chip is deprecated and ChIP-seq has become the most widely used technique to identify transcription factor binding sites genome-wide. The newly developed ChIP-exo further improves the spatial resolution to the single nucleotide. Numerous tools have been developed to analyze ChIP-chip, ChIP-seq and ChIP-exo data. However, different programs may employ different mechanisms or underlying algorithms, and thus each will inherently include its own set of statistical assumptions and biases. Choosing the most appropriate analytic program for a given experiment therefore needs careful consideration. Moreover, most programs only have a command-line interface, so their installation and usage require basic computational expertise in Unix/Linux. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  8. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand for analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider the nanoparticles as a new sort of analytes, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as an element-specific detector. Emerging techniques based on the detection of single nanoparticles by using ICP-MS, but also coulometry, are on their way to gaining a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. Perspectives on making big data analytics work for oncology.

    PubMed

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5V hallmarks of big data. This data comprises a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitute a mix of structured (tabulated) and unstructured (electronic documents) sources that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables)≫n (samples) inference problem of statistical learning is challenged in the big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, the Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter them, ranging from incorporating prior knowledge and using information-theoretic techniques to modern ensemble machine learning approaches, or combinations of these. We will particularly discuss the pros and cons of different approaches to improve mining of big data in oncology. Copyright © 2016 Elsevier Inc. All rights reserved.
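
    As a concrete instance of the Yule-Simpson reversal mentioned above, the classic kidney-stone data of Charig et al. (1986) show one treatment winning within each stratum yet losing on the pooled counts:

        import pandas as pd

        df = pd.DataFrame({
            "treatment": ["A", "A", "B", "B"],
            "stratum":   ["small", "large", "small", "large"],
            "success":   [81, 192, 234, 55],
            "n":         [87, 263, 270, 80],
        })

        # Stratified success rates: A beats B for both small and large stones.
        print(df.assign(rate=df.success / df.n))

        # Pooled rates: A (0.78) now trails B (0.83) because the strata mix differs.
        pooled = df.groupby("treatment")[["success", "n"]].sum()
        print(pooled.success / pooled.n)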

  10. Accuracy of trace element determinations in alternate fuels

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1980-01-01

    NASA-Lewis Research Center's work on the accurate measurement of trace levels of metals in various fuels is presented. The differences between laboratories and between analytical techniques, especially for concentrations below 10 ppm, are discussed, detailing the Atomic Absorption Spectrometry (AAS) and DC Arc Emission Spectrometry (dc arc) techniques used by NASA-Lewis. Also presented is the design of an interlaboratory study which considers the following factors: laboratory, analytical technique, fuel type, concentration and ashing additive.

  11. The Same Fish: Creating Space for Therapeutic Relationships, Play, and Development in a School for Children with Special Needs.

    PubMed

    Alston, Richard; Sosland, Rachel; Tuohy, Anne; Weiler, Nori Anna; Zeitlin, Diane

    2015-01-01

    This paper represents an attempt to describe psychoanalytically informed work applied in a school setting with children with special needs. While many therapists at the Parkside School are trained in analytic techniques and principles, these ideas have not traditionally been applied to children with language-based learning difficulties. Over the years, we have found that analytic ideas such as transference, countertransference, projective identification, containment, and attachment are especially salient to our understanding of these very complex children. Despite being in a school--a nontraditional setting for psychoanalysis--children are seen in individual and group therapy, often more than once a week. We believe that therapeutic relationships and play (sometimes bringing a child to a place of being able to play) are especially mutative with children with language-based learning challenges. Play and relationship provide a holding environment that, over time, allows for the reorganization of a child's often immature developmental capacities into a sense of agency that captures more clearly a child's innate potential. This article includes case studies of children with complex language-based learning difficulties, including autism spectrum disorders.

  12. Determining optimal parameters in magnetic spacecraft stabilization via attitude feedback

    NASA Astrophysics Data System (ADS)

    Bruni, Renato; Celani, Fabio

    2016-10-01

    The attitude control of a spacecraft using magnetorquers can be achieved by a feedback control law which has four design parameters. However, the practical determination of appropriate values for these parameters is a critical open issue. We propose here an innovative systematic approach for finding these values: they should be those that minimize the convergence time to the desired attitude. This is a particularly difficult optimization problem, for several reasons: 1) such time cannot be expressed in analytical form as a function of parameters and initial conditions; 2) design parameters may range over very wide intervals; 3) convergence time depends also on the initial conditions of the spacecraft, which are not known in advance. To overcome these difficulties, we present a solution approach based on derivative-free optimization. These algorithms do not need the objective function in analytical form: they only need to evaluate it at a number of points. We also propose a fast probing technique to identify which regions of the search space have to be explored densely. Finally, we formulate a min-max model to find robust parameters, namely design parameters that minimize convergence time under the worst initial conditions. Results are very promising.
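
    A minimal sketch of the derivative-free step: here a cheap surrogate with a known optimum stands in for the black-box closed-loop attitude simulation, whose evaluation is the actual cost in practice, and scipy's differential evolution searches wide bounds without gradients. A min-max variant would simply wrap the objective in a maximum over sampled initial conditions:

        import numpy as np
        from scipy.optimize import differential_evolution

        def settling_time(params):
            # Stand-in for "run the closed-loop simulation and return the
            # convergence time"; a smooth bowl with its optimum at (0.3, 40),
            # illustrative only.
            k1, k2 = params
            return 200.0 + 500.0 * (k1 - 0.3) ** 2 + 0.05 * (k2 - 40.0) ** 2

        bounds = [(1e-3, 10.0), (1.0, 1e3)]   # deliberately wide intervals
        res = differential_evolution(settling_time, bounds, seed=0, tol=1e-8)
        print(res.x, res.fun)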

  13. Analytical and experimental study of the acoustics and the flow field characteristics of cavitating self-resonating water jets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chahine, G.L.; Genoux, P.F.; Johnson, V.E. Jr.

    1984-09-01

    Waterjet nozzles (STRATOJETS) have been developed which achieve passive structuring of cavitating submerged jets into discrete ring vortices, and which possess cavitation incipient numbers six times higher than obtained with conventional cavitating jet nozzles. In this study we developed analytical and numerical techniques and conducted experimental work to gain an understanding of the basic phenomena involved. The achievements are: (1) a thorough analysis of the acoustic dynamics of the feed pipe to the nozzle; (2) a theory for bubble ring growth and collapse; (3) a numerical model for jet simulation; (4) an experimental observation and analysis of candidate second-generation low-sigma STRATOJETS. From this study we can conclude that intensification of bubble ring collapse and design of highly resonant feed tubes can lead to improved drilling rates. The models here described are excellent tools to analyze the various parameters needed for STRATOJET optimizations. Further analysis is needed to introduce such important factors as viscosity, nozzle-jet interaction, and ring-target interaction, and to develop the jet simulation model to describe the important fine details of the flow field at the nozzle exit.

  14. Application of Soft Computing in Coherent Communications Phase Synchronization

    NASA Technical Reports Server (NTRS)

    Drake, Jeffrey T.; Prasad, Nadipuram R.

    2000-01-01

    The use of soft computing techniques in coherent communications phase synchronization provides an alternative to analytical or hard computing methods. This paper discusses a novel use of Adaptive Neuro-Fuzzy Inference Systems (ANFIS) for phase synchronization in coherent communications systems utilizing Multiple Phase Shift Keying (M-PSK) modulation. A brief overview of the M-PSK digital communications bandpass modulation technique is presented and its requisite need for phase synchronization is discussed. We briefly describe the hybrid platform developed by Jang that incorporates fuzzy/neural structures, namely the Adaptive Neuro-Fuzzy Inference System (ANFIS). We then discuss the application of ANFIS to phase estimation for M-PSK. The modeling of both explicit and implicit phase estimation schemes for M-PSK symbols with unknown structure is discussed. Performance results from simulation of the above scheme are presented.

  15. Human motion planning based on recursive dynamics and optimal control techniques

    NASA Technical Reports Server (NTRS)

    Lo, Janzen; Huang, Gang; Metaxas, Dimitris

    2002-01-01

    This paper presents an efficient optimal control and recursive dynamics-based computer animation system for simulating and controlling the motion of articulated figures. A quasi-Newton nonlinear programming technique (super-linear convergence) is implemented to solve minimum torque-based human motion-planning problems. The explicit analytical gradients needed in the dynamics are derived using a matrix exponential formulation and Lie algebra. Cubic spline functions are used to make the search space for an optimal solution finite. Based on our formulations, our method is well conditioned and robust, in addition to being computationally efficient. To better illustrate the efficiency of our method, we present results of natural looking and physically correct human motions for a variety of human motion tasks involving open and closed loop kinematic chains.

  16. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  17. Product identification techniques used as training aids for analytical chemists

    NASA Technical Reports Server (NTRS)

    Grillo, J. P.

    1968-01-01

    Laboratory staff assistants are trained to use data and observations of routine product analyses performed by experienced analytical chemists when analyzing compounds for potential toxic hazards. Commercial products are used as examples in teaching the analytical approach to unknowns.

  18. Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakkila, E.A.

    1978-10-01

    Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.

  19. Independent Research and Independent Exploratory Development Annual Report Fiscal Year 1975

    DTIC Science & Technology

    1975-09-01

    [Garbled OCR of the report's contents and publications listing; recoverable entries include a coding study, an optical communications study using laser transceivers, and Wagner, N. K., "Analysis of Microelectronic Materials Using Auger Spectroscopy and Additional Advanced Analytical Techniques," NELC Technical Note 2904.]

  20. Multidisciplinary design optimization using multiobjective formulation techniques

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Pagaldipti, Narayanan S.

    1995-01-01

    This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. Accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
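
    Semi-analytical sensitivities are typically validated just as the report describes, by comparison against finite differences; a minimal sketch with a toy response standing in for the aerodynamic analysis and its hand-derived gradient:

        import numpy as np

        def f(x):            # surrogate response, e.g. a drag-like metric
            return x[0] ** 2 * np.sin(x[1]) + 3.0 * x[1]

        def grad_f(x):       # "analytical" gradient derived by hand
            return np.array([2.0 * x[0] * np.sin(x[1]),
                             x[0] ** 2 * np.cos(x[1]) + 3.0])

        x0, h = np.array([1.2, 0.7]), 1e-6
        fd = np.array([(f(x0 + h * e) - f(x0 - h * e)) / (2.0 * h)
                       for e in np.eye(2)])
        print(np.max(np.abs(fd - grad_f(x0))))   # agreement to ~1e-9 or better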

  1. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  2. Need for evaluative methodologies in land use, regional resource and waste management planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croke, E. J.

    The transfer of planning methodology from the research community to the practitioner very frequently takes the form of analytical and evaluative techniques and procedures. In the end, these become operational in the form of data acquisition, management and display systems, computational schemes that are codified in the form of manuals and handbooks, and computer simulation models. The complexity of the socioeconomic and physical processes that govern environmental resource and waste management has reinforced the need for computer-assisted, scientifically sophisticated planning models that are fully operational, dependent on an attainable data base, and accessible in terms of the resources normally available to practitioners of regional resource management, waste management, and land use planning. A variety of models and procedures that attempt to meet one or more of the needs of these practitioners are discussed.

  3. Advanced Video Analysis Needs for Human Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Campbell, Paul D.

    1994-01-01

    Evaluators of human task performance in space missions make use of video as a primary source of data. Extraction of relevant human performance information from video is often a labor-intensive process requiring a large amount of time on the part of the evaluator. Based on the experiences of several human performance evaluators, needs were defined for advanced tools which could aid in the analysis of video data from space missions. Such tools should increase the efficiency with which useful information is retrieved from large quantities of raw video. They should also provide the evaluator with new analytical functions which are not present in currently used methods. Video analysis tools based on the needs defined by this study would also have uses in U.S. industry and education. Evaluation of human performance from video data can be a valuable technique in many industrial and institutional settings where humans are involved in operational systems and processes.

  4. Paid carers' experiences of caring for mechanically ventilated children at home: implications for services and training.

    PubMed

    Maddox, Christina; Pontin, David

    2013-06-01

    UK survival rates for long-term mechanically ventilated children have increased, and paid carers are trained to care for them at home; however, there is limited literature on carers' training needs and experience of sharing care. Using a qualitative abductive design, we purposively sampled experienced carers to generate data via diaries, semi-structured interviews, and researcher reflexive notes. Research ethics approval was granted by NHS and University committees. Five analytical themes emerged - Parent as Expert; Role Definition Tensions; Training and Continuing Learning Needs; Mixed Emotions; and Support Mechanisms - highlighting the challenges of working in family homes for carers and their associated learning needs. Further work on preparing carers to share feelings with parents, using burnout prevention techniques, and building confidence is suggested. Carers highlight the lack of clinical supervision during their night-working hours. One solution may be to provide access to registered nurse support when working out-of-office hours.

  5. A general, cryogenically-based analytical technique for the determination of trace quantities of volatile organic compounds in the atmosphere

    NASA Technical Reports Server (NTRS)

    Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.

    1985-01-01

    An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.

  6. Airborne chemistry: acoustic levitation in chemical analysis.

    PubMed

    Santesson, Sabina; Nilsson, Staffan

    2004-04-01

    This review, with 60 references, describes a unique path to miniaturisation: the use of acoustic levitation in analytical and bioanalytical chemistry applications. Levitation of small volumes of sample avoids solid walls around the sample, thus circumventing the main problem of miniaturisation, the unfavourable surface-to-volume ratio. Different techniques for sample levitation have been developed and improved. Of the levitation techniques described, acoustic or ultrasonic levitation fulfils all requirements for analytical chemistry applications. This technique has previously been used to study the properties of molten materials and the equilibrium shape and stability of liquid drops. Temperature and mass transfer in levitated drops have also been described, as have crystallisation and microgravity applications. The airborne analytical system described here is equipped with different and exchangeable remote detection systems. The levitated drops are normally in the 100 nL-2 microL volume range, and additions to the levitated drop can be made in the pL-volume range. The use of levitated drops in analytical and bioanalytical chemistry offers several benefits. Several remote detection systems are compatible with acoustic levitation, including fluorescence imaging detection, right-angle light scattering, Raman spectroscopy, and X-ray diffraction. Applications include liquid/liquid extractions, solvent exchange, analyte enrichment, single-cell analysis, cell-cell communication studies, precipitation screening of proteins to establish nucleation conditions, and crystallisation of proteins and pharmaceuticals.

  7. Single Particle Analysis by Combined Chemical Imaging to Study Episodic Air Pollution Events in Vienna

    NASA Astrophysics Data System (ADS)

    Ofner, Johannes; Eitenberger, Elisabeth; Friedbacher, Gernot; Brenner, Florian; Hutter, Herbert; Schauer, Gerhard; Kistler, Magdalena; Greilinger, Marion; Lohninger, Hans; Lendl, Bernhard; Kasper-Giebl, Anne

    2017-04-01

    The aerosol composition of a city like Vienna is characterized by a complex interaction of local emissions and atmospheric input on regional and continental scales. Identifying the major aerosol constituents for basic source apportionment and air quality purposes requires a substantial analytical effort. Exceptional episodic air pollution events strongly change the typical aerosol composition of a city like Vienna on time-scales of a few hours to several days. Analyzing the chemistry of particulate matter from these events is often hampered by the sampling time, and the related sample amount, necessary to apply the full range of bulk analytical methods needed for chemical characterization. Additionally, morphological and single-particle features are hardly accessible. Chemical imaging has evolved into a powerful tool for image-based chemical analysis of complex samples. As a complement to bulk analytical methods, chemical imaging offers a new route to studying air pollution events, resolving major aerosol constituents together with single-particle features at high temporal resolution and with small sample volumes. The analysis of chemical imaging datasets is assisted by multivariate statistics, with the benefit of image-based chemical structure determination for direct aerosol source apportionment. A novel approach in chemical imaging is combined chemical imaging, or so-called multisensor hyperspectral imaging, involving elemental imaging (electron microscopy-based energy-dispersive X-ray imaging), vibrational imaging (Raman micro-spectroscopy), and mass spectrometric imaging (Time-of-Flight Secondary Ion Mass Spectrometry), with subsequent combined multivariate analytics. Combined chemical imaging of precipitated aerosol particles is demonstrated with the following examples of air pollution events in Vienna: the transformation of Saharan dust by the impact of the city of Vienna is discussed and compared to samples obtained at a high alpine background site (Sonnblick Observatory, Saharan dust event of April 2016). Further, chemical imaging of biological aerosol constituents from an autumnal pollen outbreak in Vienna, with background samples from nearby locations from November 2016, demonstrates the advantages of the chemical imaging approach. Additionally, the chemical fingerprint of an exceptional air pollution event from a local emission source, caused by the demolition of a building in Vienna, illustrates the need for multisensor imaging and, especially, for the combined approach. The chemical images obtained are correlated with bulk analytical results, and the benefits of combining bulk analytics with combined chemical imaging of exceptional episodic air pollution events are discussed.
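
    A compact way to picture the "combined multivariate analytics" step is principal component analysis on per-pixel fused channels. The Python sketch below stacks hypothetical EDX, Raman, and SIMS channel images and extracts the first two component score images; the channel counts and data are invented stand-ins, not the study's actual sensors:

    ```python
    import numpy as np

    # Hypothetical per-pixel fusion of multisensor images: EDX, Raman and
    # SIMS channels are stacked and PCA (via SVD) exposes correlated
    # chemical structure across sensors.
    rng = np.random.default_rng(4)
    h, w = 64, 64
    edx = rng.random((h, w, 3))      # e.g. three elemental maps
    raman = rng.random((h, w, 2))    # e.g. two band-intensity maps
    sims = rng.random((h, w, 2))     # e.g. two mass-channel maps
    cube = np.concatenate([edx, raman, sims], axis=2).reshape(-1, 7)
    cube -= cube.mean(axis=0)        # mean-center each fused channel
    _, s, vt = np.linalg.svd(cube, full_matrices=False)
    scores = (cube @ vt[:2].T).reshape(h, w, 2)  # first two score images
    print(scores.shape, s[:2])
    ```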

  8. Culture-Sensitive Functional Analytic Psychotherapy

    ERIC Educational Resources Information Center

    Vandenberghe, L.

    2008-01-01

    Functional analytic psychotherapy (FAP) is defined as behavior-analytically conceptualized talk therapy. In contrast to the technique-oriented educational format of cognitive behavior therapy and the use of structural mediational models, FAP depends on the functional analysis of the moment-to-moment stream of interactions between client and…

  9. Modeling of phonon scattering in n-type nanowire transistors using one-shot analytic continuation technique

    NASA Astrophysics Data System (ADS)

    Bescond, Marc; Li, Changsheng; Mera, Hector; Cavassilas, Nicolas; Lannoo, Michel

    2013-10-01

    We present a one-shot current-conserving approach to model the influence of electron-phonon scattering in nano-transistors using the non-equilibrium Green's function formalism. The approach is based on the lowest order approximation (LOA) to the current and its simplest analytic continuation (LOA+AC). By means of a scaling argument, we show how both LOA and LOA+AC can easily be obtained from the first iteration of the usual self-consistent Born approximation (SCBA) algorithm. Both LOA and LOA+AC are then applied to model n-type silicon nanowire field-effect transistors and are compared to SCBA current characteristics. In this system, the LOA fails to describe electron-phonon scattering, mainly because of the interactions with acoustic phonons at the band edges. In contrast, the LOA+AC still approximates the SCBA current characteristics well, demonstrating the power of analytic continuation techniques. The limits of validity of LOA+AC are also discussed, and more sophisticated and general analytic continuation techniques are suggested for more demanding cases.
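
    As a rough illustration of the resummation idea, the sketch below treats the LOA as the first-order term of a series in the electron-phonon coupling and applies the simplest analytic continuation, a [0/1] Padé-style geometric resummation. Whether this matches the paper's exact continuation is an assumption on our part, and the numbers are toy values:

    ```python
    def loa_current(i0, i1):
        """Lowest-order approximation (LOA): ballistic current i0 plus the
        first-order electron-phonon correction i1 (one SCBA iteration)."""
        return i0 + i1

    def loa_ac_current(i0, i1):
        """Simplest analytic continuation of the two-term series: a [0/1]
        Pade-style resummation i0 / (1 - i1/i0). It agrees with the LOA to
        first order but behaves better when the correction is sizeable."""
        return i0 / (1.0 - i1 / i0)

    # toy values: ballistic current with a 30% phonon suppression
    i0, i1 = 1.0e-6, -0.3e-6
    print(loa_current(i0, i1), loa_ac_current(i0, i1))
    ```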

  10. An Analytical Technique to Elucidate Field Impurities From Manufacturing Uncertainties of a Double-Pancake-Type HTS Insert for High Field LTS/HTS NMR Magnets

    PubMed Central

    Hahn, Seung-yong; Ahn, Min Cheol; Bobrov, Emanuel Saul; Bascuñán, Juan; Iwasa, Yukikazu

    2010-01-01

    This paper addresses the adverse effects of dimensional uncertainties of an HTS insert assembled with double-pancake (DP) coils on spatial field homogeneity. Each DP coil was wound with Bi2223 tape having dimensional tolerances more than an order of magnitude larger than those accepted for LTS wires used in conventional NMR magnets. The paper presents: 1) dimensional variations measured in two LTS/HTS NMR magnets, 350 MHz (LH350) and 700 MHz (LH700), both built and operated at the Francis Bitter Magnet Laboratory; and 2) an analytical technique and its application to elucidate the field impurities measured with the two LTS/HTS magnets. Field impurities computed with the analytical model agree quite well with those measured with the two LTS/HTS magnets, demonstrating that this analytical technique is applicable to the design of a DP-assembled HTS insert with improved field homogeneity for a high-field LTS/HTS NMR magnet. PMID:20407595

  11. Methods for geochemical analysis

    USGS Publications Warehouse

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia; Denver, Colorado; and Menlo Park, California. The Division has expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  12. Electrochemical concentration measurements for multianalyte mixtures in simulated electrorefiner salt

    NASA Astrophysics Data System (ADS)

    Rappleye, Devin Spencer

    The development of electroanalytical techniques for multianalyte molten salt mixtures, such as those found in used nuclear fuel electrorefiners, would enable in situ, real-time concentration measurements. Such measurements are beneficial for process monitoring, optimization, and control, as well as for international safeguards and nuclear material accountancy. Electroanalytical work in molten salts has, with few exceptions, been limited to single-analyte mixtures. This work builds upon the knowledge of molten salt electrochemistry by performing electrochemical measurements on a molten eutectic LiCl-KCl salt mixture containing two analytes, developing techniques for quantitatively analyzing the measured signals even in the presence of a signal from another analyte, correlating signals to concentration, and identifying improvements in experimental and analytical methodologies. (Abstract shortened by ProQuest.)

  13. Analytical methods for gelatin differentiation from bovine and porcine origins and food products.

    PubMed

    Nhari, Raja Mohd Hafidz Raja; Ismail, Amin; Che Man, Yaakob B

    2012-01-01

    The use of gelatin in food products has been widely debated for several years, with the debate centring on the source of the gelatin, religion, and health. As a result, various analytical methods have been introduced and developed to differentiate whether gelatin is made from porcine or bovine sources. These methods comprise a diverse range of equipment and techniques, including spectroscopy, chemical precipitation, chromatography, and immunochemical methods. Each technique can differentiate gelatins to a certain extent, with its own advantages and limitations. This review provides an overview of the analytical methods available for the differentiation of bovine and porcine gelatin, both as such and in food products, so that new methods can be developed.

  14. An analytical and experimental evaluation of the plano-cylindrical Fresnel lens solar concentrator

    NASA Technical Reports Server (NTRS)

    Hastings, L. J.; Allums, S. L.; Cosby, R. M.

    1976-01-01

    Plastic Fresnel lenses for solar concentration are attractive because of their potential for low-cost mass production. An analytical and experimental evaluation of line-focusing Fresnel lenses with application potential in the 200 to 370 C range is reported. Analytical techniques were formulated to assess the solar transmission and imaging properties of a grooves-down lens. Experimentation was based primarily on a 56 cm-wide lens with f-number 1.0. A sun-tracking heliostat provided a non-moving solar source. Measured data indicated more spreading at the profile base than analytically predicted. The measured and computed transmittances were 85 and 87%, respectively. Preliminary testing with a second lens (1.85 m) indicated that modified manufacturing techniques corrected the profile-spreading problem.

  15. Adequacy of surface analytical tools for studying the tribology of ceramics

    NASA Technical Reports Server (NTRS)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are less well-known techniques that may also prove useful.

  16. Uses of Multivariate Analytical Techniques in Online and Blended Business Education: An Assessment of Current Practice and Recommendations for Future Research

    ERIC Educational Resources Information Center

    Arbaugh, J. B.; Hwang, Alvin

    2013-01-01

    Seeking to assess the analytical rigor of empirical research in management education, this article reviews the use of multivariate statistical techniques in 85 studies of online and blended management education over the past decade and compares them with prescriptions offered by both the organization studies and educational research communities.…

  17. Bioanalytical Applications of Fluorescence Line-Narrowing and Non-Line-Narrowing Spectroscopy Interfaced with Capillary Electrophoresis and High-Performance Liquid Chromatography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Kenneth Paul

    Capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) are widely used analytical separation techniques with many applications in the chemical, biochemical, and biomedical sciences. Conventional analyte identification in these techniques is based on the retention/migration times of standards, requiring a high degree of reproducibility, the availability of reliable standards, and the absence of coelution. Consequently, several new information-rich detection methods (also known as hyphenated techniques) are being explored that would be capable of providing unambiguous on-line identification of separating analytes in CE and HPLC. As further discussed, a number of such on-line detection methods have shown considerable success, including Raman, nuclear magnetic resonance (NMR), mass spectrometry (MS), and fluorescence line-narrowing spectroscopy (FLNS). In this thesis, the feasibility and potential of combining the highly sensitive and selective laser-based detection method of FLNS with analytical separation techniques are discussed and presented. A summary of previously demonstrated FLNS detection interfaced with chromatography and electrophoresis is given, and recent results from on-line FLNS detection in CE (CE-FLNS), and the new combination of HPLC-FLNS, are shown.

  18. Chemometric applications to assess quality and critical parameters of virgin and extra-virgin olive oil. A review.

    PubMed

    Gómez-Caravaca, Ana M; Maggio, Rubén M; Cerretani, Lorenzo

    2016-03-24

    Today, virgin and extra-virgin olive oils (VOO and EVOO) are foods subject to a large number of analytical tests designed to ensure their quality and genuineness. Almost all official methods demand heavy use of reagents and manpower; because of that, analytical development in this area is continuously evolving. This review therefore focuses on analytical methods for EVOO/VOO that use fast and smart approaches based on chemometric techniques in order to reduce analysis time, reagent consumption, expensive equipment, and manpower. Experimental approaches coupling chemometrics with fast analytical techniques such as UV-Vis spectroscopy, fluorescence, vibrational spectroscopies (NIR, MIR, and Raman), and NMR spectroscopy, as well as more complex techniques like chromatography, calorimetry, and electrochemical techniques, applied to EVOO/VOO production and analysis, are discussed throughout this work. The advantages and drawbacks of this association are also highlighted. Chemometrics has been shown to be a powerful tool for the oil industry; indeed, it can be implemented all along the different steps of EVOO/VOO production: raw material input control, monitoring during processing, and quality control of the final product.
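
    A typical instance of the chemometrics-plus-fast-spectroscopy pairing described above is partial least squares (PLS) regression on spectral data. The sketch below uses scikit-learn with synthetic data standing in for NIR spectra and a quality parameter; it is an illustrative example, not a method taken from the review:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Hypothetical data: rows are oil samples, columns are NIR absorbances;
    # y stands in for a quality parameter such as free acidity.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))
    y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)

    pls = PLSRegression(n_components=5)
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()  # cross-validated
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))          # prediction error
    print(f"RMSECV: {rmsecv:.3f}")
    ```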

  19. Electromigrative separation techniques in forensic science: combining selectivity, sensitivity, and robustness.

    PubMed

    Posch, Tjorben Nils; Pütz, Michael; Martin, Nathalie; Huhn, Carolin

    2015-01-01

    In this review, we introduce the advantages and limitations of electromigrative separation techniques in forensic toxicology. We present a summary of illustrative studies and our own experience in the field, together with established methods from the German Federal Criminal Police Office, rather than a complete survey. We focus on the analytical implications of the analytes' physicochemical characteristics (e.g. polarity, stereoisomers) and on analytical challenges including matrix tolerance, separation from compounds present in large excess, sample volumes, and orthogonality. For these aspects, we aim to reveal the specific advantages over more traditional methods. Both detailed studies and profiling and screening studies are taken into account. Care was taken to document almost exclusively well-validated methods that stand out for the analytical challenge discussed. Special attention was paid to aspects exclusive to electromigrative separation techniques, including the use of the mobility axis, the potential for on-site instrumentation, and the capillary format for immunoassays. The review concludes with an introductory guide to method development for the different separation modes, presenting typical buffer systems as starting points for different analyte classes. The objective of this review is to provide orientation for users in separation science who are considering using capillary electrophoresis in their laboratories in the future.

  20. Protein assay structured on paper by using lithography

    NASA Astrophysics Data System (ADS)

    Wilhelm, E.; Nargang, T. M.; Al Bitar, W.; Waterkotte, B.; Rapp, B. E.

    2015-03-01

    There are two main challenges in producing a robust, paper-based analytical device. The first is to create a hydrophobic barrier which, unlike the commonly used wax barriers, does not break if the paper is bent. The second is the creation of the (bio-)specific sensing layer, for which proteins have to be immobilized without diminishing their activity. We solve both problems using light-based fabrication methods that enable fast, efficient manufacturing of paper-based analytical devices. The first technique relies on silanization, by which we create a flexible hydrophobic barrier made of dimethoxydimethylsilane. The second technique demonstrated within this paper uses photobleaching to immobilize proteins by means of maskless projection lithography. Both techniques have been tested on a classical lithography setup using printed toner masks and on a system for maskless lithography. Using these setups, we demonstrated that the proposed manufacturing techniques can be carried out at low cost. The resolution of the paper-based analytical devices obtained with static masks was lower, due to the lower mask resolution; better results were obtained using the advanced lithography equipment. We thereby demonstrated that our technique enables the fabrication of effective hydrophobic boundary layers with a thickness of only 342 μm. Furthermore, we showed that fluorescein-5-biotin can be immobilized on the non-structured paper and employed for the detection of streptavidin-alkaline phosphatase. By carrying out this assay on a paper-based analytical device that had been structured using the silanization technique, we proved the biological compatibility of the suggested patterning technique.

  1. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for the determination of persistent organic pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and their metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower solvent consumption and accelerate extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also discussed, even though they are scarcely used for the determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster, and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, while liquid-phase microextraction techniques are mostly used for liquid samples. Likewise, the green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed.

  2. Raman Spectrometry.

    ERIC Educational Resources Information Center

    Gardiner, Derek J.

    1980-01-01

    Reviews mainly quantitative analytical applications in the field of Raman spectrometry. Includes references to other reviews, new and analytically untested techniques, and novel sampling and instrument designs. Cites 184 references. (CS)

  3. Petermann I and II spot size: Accurate semi analytical description involving Nelder-Mead method of nonlinear unconstrained optimization and three parameter fundamental modal field

    NASA Astrophysics Data System (ADS)

    Roy Choudhury, Raja; Roy Choudhury, Arundhati; Kanti Ghose, Mrinal

    2013-01-01

    A semi-analytical model with three optimizing parameters and a novel non-Gaussian function as the fundamental modal field solution has been proposed to predict various propagation parameters of graded-index fibers accurately, with less computational burden than numerical methods. In our semi-analytical formulation, the core parameter U, whose evaluation is usually uncertain, noisy, or even discontinuous, is optimized by the Nelder-Mead method of nonlinear unconstrained minimization, an efficient and compact direct-search method that needs no derivative information. Three optimizing parameters are included in the formulation of the fundamental modal field of an optical fiber to make it more flexible and accurate than other available approximations. Employing a variational technique, Petermann I and II spot sizes have been evaluated for triangular- and trapezoidal-index fibers with the proposed fundamental modal field. It is demonstrated that the results of the proposed solution match the numerical results identically over a wide range of normalized frequencies. This approximation can also be used in the study of doped and nonlinear fiber amplifiers.
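
    The derivative-free optimization step can be reproduced with SciPy's Nelder-Mead implementation. In this sketch the objective is a hypothetical stand-in for the paper's variational functional of the three modal-field parameters:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Stand-in for the variational functional of the fiber problem; the
    # true objective couples the core parameter U and two shape parameters.
    def objective(p):
        u, a, b = p
        return (u - 1.8) ** 2 + (a - 0.5) ** 2 + 0.1 * (b - 2.0) ** 2

    res = minimize(objective, x0=np.ones(3), method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-8})
    print(res.x)   # derivative-free optimum of the three parameters
    ```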

  4. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago, a so-called "ultrafast" (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlation in a single scan. During the intervening years, the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool with an expanding scope of applications. The present review summarizes the principles and the main developments that have contributed to the success of this approach, and focuses on applications that have recently been demonstrated in various areas of analytical chemistry - from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  5. Multivariate approaches for stability control of the olive oil reference materials for sensory analysis - part II: applications.

    PubMed

    Valverde-Som, Lucia; Ruiz-Samblás, Cristina; Rodríguez-García, Francisco P; Cuadros-Rodríguez, Luis

    2018-02-09

    The organoleptic quality of virgin olive oil depends on positive and negative sensory attributes. These attributes are related to the volatile organic compounds and phenolic compounds that make up the aroma and taste (flavour) of the oil. Flavour is the characteristic measured by a taster panel; however, as with any analytical measuring device, the tasters, individually, and the panel, as a whole, must be harmonized and validated, and proper olive oil standards are needed. In the present study, multivariate approaches are put into practice, together with rules for building a multivariate control chart from chromatographic volatile fingerprints and chemometrics. Fingerprinting techniques provide analytical information without identifying and quantifying the individual analytes. This methodology is used to monitor the stability of sensory reference materials. Similarity indices were calculated to build multivariate control charts for two certified reference olive oils, used as examples to monitor their stability. This methodology based on chromatographic data could be applied in parallel with the 'panel test' sensory method to reduce the workload of sensory analysis.
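
    A minimal sketch of the similarity-index control-chart idea, assuming a correlation-based similarity index (the paper's exact index and control limits may differ); simulated fingerprints stand in for chromatograms of the reference oils:

    ```python
    import numpy as np

    def similarity(ref, sample):
        """Correlation-based similarity index between two fingerprints
        recorded on a common time axis (one common choice of index)."""
        return np.corrcoef(ref, sample)[0, 1]

    rng = np.random.default_rng(1)
    t = np.arange(500)
    ref = np.exp(-0.5 * ((t - 200) / 15.0) ** 2)   # reference fingerprint
    # replicate measurements of the certified reference oil over time
    sims = [similarity(ref, ref + rng.normal(scale=0.01, size=t.size))
            for _ in range(20)]
    center, spread = np.mean(sims), np.std(sims, ddof=1)
    lcl = center - 3 * spread       # lower control limit; a drift below
    print(round(center, 4), round(lcl, 4))  # it flags loss of stability
    ```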

  6. Development of Equivalent Material Properties of Microbump for Simulating Chip Stacking Packaging

    PubMed Central

    Lee, Chang-Chun; Tzeng, Tzai-Liang; Huang, Pei-Chen

    2015-01-01

    A three-dimensional integrated circuit (3D-IC) structure with a significant scale mismatch causes difficulty in analytic model construction. This paper proposes a simulation technique that introduces an equivalent material representing the microbumps and their surrounding wafer-level underfill (WLUF). The mechanical properties of this equivalent material, including Young's modulus (E), Poisson's ratio, shear modulus, and coefficient of thermal expansion (CTE), are obtained directly by applying either a tensile load or a constant displacement, and by increasing the temperature during simulations, respectively. Analytic results indicate that at least eight microbumps at the outermost region of the chip-stacking structure need to be modeled individually to obtain an accurate stress/strain contour in the region of concern. In addition, a factorial experimental design with analysis of variance is proposed to optimize chip-stacking structure reliability with four factors: chip thickness, substrate thickness, CTE, and E-value. Analytic results show that the most significant factor is the CTE of the WLUF, which affects microbump reliability and structural warpage under temperature cycling loads and the high-temperature bonding process. WLUF with low CTE and high E-value is recommended to enhance the assembly reliability of the 3D-IC architecture. PMID:28793495

  7. A three-dimensional analytical model to simulate groundwater flow during operation of recirculating wells

    NASA Astrophysics Data System (ADS)

    Huang, Junqi; Goltz, Mark N.

    2005-11-01

    The potential for using pairs of so-called horizontal flow treatment wells (HFTWs) to effect in situ capture and treatment of contaminated groundwater has recently been demonstrated. To apply this new technology, design engineers need to be able to simulate the relatively complex groundwater flow patterns that result from HFTW operation. In this work, a three-dimensional analytical solution for steady flow in a homogeneous, anisotropic, contaminated aquifer is developed to efficiently calculate the interflow of water circulating between a pair of HFTWs and map the spatial extent of contaminated groundwater flowing from upgradient that is captured. The solution is constructed by superposing the solutions for the flow fields resulting from operation of partially penetrating wells. The solution is used to investigate the flow resulting from operation of an HFTW well pair and to quantify how aquifer anisotropy, well placement, and pumping rate impact capture zone width and interflow. The analytical modeling method presented here provides a fast and accurate technique for representing the flow field resulting from operation of HFTW systems, and represents a tool that can be useful in designing in situ groundwater contamination treatment systems.
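
    The superposition idea at the heart of the solution can be illustrated compactly. The sketch below is a simplified plan-view (2D) analogue in Python, not the paper's 3D partially penetrating solution: the complex potentials of a uniform regional flow, an extraction well, and an injection well are summed, and contours of the stream function then trace interflow and the capture zone. All parameter values are hypothetical:

    ```python
    import numpy as np

    Q = 50.0            # circulation rate per unit thickness [m^2/d]
    U = 0.5             # regional specific discharge [m/d]
    z_ext, z_inj = -10 + 0j, 10 + 0j   # extraction / injection positions

    def potential(z):
        """Complex potential: uniform flow plus a sink (extraction) and a
        source (injection). Sign conventions differ between texts."""
        return (-U * z
                + Q / (2 * np.pi) * np.log(z - z_ext)
                - Q / (2 * np.pi) * np.log(z - z_inj))

    # The imaginary part is the stream function; contouring it on a grid
    # maps interflow between the wells and the upgradient capture zone.
    ygrid = np.linspace(-40.0, 40.0, 5)
    print(potential(1e-3 + 1j * ygrid).imag)
    ```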

  8. Evaluation Criteria for Micro-CAI: A Psychometric Approach

    PubMed Central

    Wallace, Douglas; Slichter, Mark; Bolwell, Christine

    1985-01-01

    The increased use of microcomputer-based instructional programs has resulted in a greater need for third-party evaluation of the software. This in turn has prompted the development of micro-CAI evaluation tools. The present project sought to develop a prototype instrument to assess the impact of CAI program presentation characteristics on students. Data analysis and scale construction were conducted using standard item reliability analyses and factor analytic techniques. Adequate subscale reliabilities and factor structures were found, suggesting that a psychometric approach to CAI evaluation may possess some merit. Efforts to assess the utility of the resultant instrument are currently underway.
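
    As an illustration of the item-reliability side of such analyses, here is a minimal Python sketch computing Cronbach's alpha for a hypothetical six-item subscale (the data and item count are invented; the original instrument's items are not specified here):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix,
        the standard internal-consistency statistic in scale construction."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1.0) * (1.0 - item_var / total_var)

    rng = np.random.default_rng(5)
    latent = rng.normal(size=(200, 1))                # trait being measured
    scores = latent + rng.normal(scale=0.7, size=(200, 6))  # 6-item scale
    print(round(cronbach_alpha(scores), 2))
    ```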

  9. Biologically inspired technologies using artificial muscles

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph

    2005-01-01

    One of the newest fields of biomimetics is electroactive polymers (EAPs), also known as artificial muscles. To take advantage of these materials, efforts are being made worldwide to establish a strong infrastructure addressing the need for comprehensive analytical modeling of their response mechanisms and to develop effective processing and characterization techniques. The field is still in an emerging state and robust materials are not yet readily available; however, in recent years significant progress has been made and commercial products have already started to appear. This paper covers the current state-of-the-art and the challenges to making artificial muscles, as well as their potential biomimetic applications.

  10. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    PubMed

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water, with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic compound (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption-phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While the static sampling techniques exhibited extraction yields (approx. 10-20%) sufficient for reliable use down to approx. 100 ng L⁻¹, the enrichment techniques displayed extraction yields of up to 80%, resulting in MDLs down to the picogram-per-liter range. RSDs for all techniques were below 27%. The choice among these three instrumental modes of operation was the most influential parameter for extraction yields and MDLs. Individual methods within each class differed less, and the smallest differences were observed between sorption-phase materials for the individual enrichment techniques. The option of selecting specialized sorption-phase materials may, however, be more important when analyzing analytes with different properties, such as high polarity or the capability of specific molecular interactions. [Graphical abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.]
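
    The MDL figures quoted above are typically derived from replicate low-level measurements. Here is a minimal sketch of the classic single-laboratory estimate (MDL = one-sided 99% Student's t times the replicate standard deviation); the replicate values are invented:

    ```python
    import numpy as np
    from scipy import stats

    # Replicate low-level spike results (invented values, ng/L):
    reps = np.array([102.0, 95.0, 110.0, 98.0, 105.0, 99.0, 101.0])
    s = reps.std(ddof=1)                         # replicate std deviation
    t99 = stats.t.ppf(0.99, df=len(reps) - 1)    # one-sided 99% Student t
    print(f"MDL ~= {t99 * s:.1f} ng/L")          # classic EPA-style value
    ```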

  11. Automated Solid Phase Extraction (SPE) LC/NMR Applied to the Structural Analysis of Extractable Compounds from a Pharmaceutical Packaging Material of Construction.

    PubMed

    Norwood, Daniel L; Mullis, James O; Davis, Mark; Pennino, Scott; Egert, Thomas; Gonnella, Nina C

    2013-01-01

    The structural analysis (i.e., identification) of organic chemical entities leached into drug product formulations has traditionally been accomplished with techniques involving the combination of chromatography with mass spectrometry. These include gas chromatography/mass spectrometry (GC/MS) for volatile and semi-volatile compounds, and various forms of liquid chromatography/mass spectrometry (LC/MS or HPLC/MS) for semi-volatile and relatively non-volatile compounds. GC/MS and LC/MS techniques are complementary for structural analysis of leachables and potentially leachable organic compounds produced via laboratory extraction of pharmaceutical container closure/delivery system components and corresponding materials of construction. Both hyphenated analytical techniques possess the separating capability, compound specific detection attributes, and sensitivity required to effectively analyze complex mixtures of trace level organic compounds. However, hyphenated techniques based on mass spectrometry are limited by the inability to determine complete bond connectivity, the inability to distinguish between many types of structural isomers, and the inability to unambiguously determine aromatic substitution patterns. Nuclear magnetic resonance spectroscopy (NMR) does not have these limitations; hence it can serve as a complement to mass spectrometry. However, NMR technology is inherently insensitive and its ability to interface with chromatography has been historically challenging. This article describes the application of NMR coupled with liquid chromatography and automated solid phase extraction (SPE-LC/NMR) to the structural analysis of extractable organic compounds from a pharmaceutical packaging material of construction. The SPE-LC/NMR technology combined with micro-cryoprobe technology afforded the sensitivity and sample mass required for full structure elucidation. Optimization of the SPE-LC/NMR analytical method was achieved using a series of model compounds representing the chemical diversity of extractables. This study demonstrates the complementary nature of SPE-LC/NMR with LC/MS for this particular pharmaceutical application. The identification of impurities leached into drugs from the components and materials associated with pharmaceutical containers, packaging components, and materials has historically been done using laboratory techniques based on the combination of chromatography with mass spectrometry. Such analytical techniques are widely recognized as having the selectivity and sensitivity required to separate the complex mixtures of impurities often encountered in such identification studies, including both the identification of leachable impurities as well as potential leachable impurities produced by laboratory extraction of packaging components and materials. However, while mass spectrometry-based analytical techniques have limitations for this application, newer analytical techniques based on the combination of chromatography with nuclear magnetic resonance spectroscopy provide an added dimension of structural definition. This article describes the development, optimization, and application of an analytical technique based on the combination of chromatography and nuclear magnetic resonance spectroscopy to the identification of potential leachable impurities from a pharmaceutical packaging material. The complementary nature of the analytical techniques for this particular pharmaceutical application is demonstrated.

  12. The HVT technique and the 'uncertainty' relation for central potentials

    NASA Astrophysics Data System (ADS)

    Grypeos, M. E.; Koutroulos, C. G.; Oyewumi, K. J.; Petridou, Th

    2004-08-01

    The quantum mechanical hypervirial theorems (HVT) technique is used to treat the so-called 'uncertainty' relation for quite a general class of central potential wells, including the (reduced) Pöschl-Teller and the Gaussian one. It is shown that this technique is quite suitable for deriving an approximate analytic expression, in the form of a truncated power series expansion, for the dimensionless product $P_{nl} \equiv \langle r^2 \rangle_{nl} \langle p^2 \rangle_{nl} / \hbar^2$, for every (deeply) bound state of a particle moving non-relativistically in the well, provided that a (dimensionless) parameter s is sufficiently small. Attention is also paid to a number of cases, among the limited existing ones, in which exact analytic or semi-analytic expressions for $P_{nl}$ can be derived. Finally, numerical results are given and discussed.
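
    Where exact expectation values are known, the product $P_{nl}$ can be checked directly. A tiny numeric sanity check in Python, using the 3D isotropic harmonic-oscillator ground state (a standard textbook case, not one of the paper's wells):

    ```python
    # Check of P = <r^2><p^2>/hbar^2 for a case with a known exact answer:
    # the 3D harmonic-oscillator ground state, where <r^2> = 3*hbar/(2*m*w)
    # and <p^2> = 3*m*w*hbar/2, giving P = 9/4 exactly.
    hbar = m = w = 1.0
    r2 = 3.0 * hbar / (2.0 * m * w)
    p2 = 3.0 * m * w * hbar / 2.0
    print(r2 * p2 / hbar**2)   # 2.25
    ```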

  13. 7 CFR 90.2 - General terms defined.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...

  14. Analytical Applications of NMR: Summer Symposium on Analytical Chemistry.

    ERIC Educational Resources Information Center

    Borman, Stuart A.

    1982-01-01

    Highlights a symposium on analytical applications of nuclear magnetic resonance spectroscopy (NMR), discussing pulse Fourier transformation technique, two-dimensional NMR, solid state NMR, and multinuclear NMR. Includes description of ORACLE, an NMR data processing system at Syracuse University using real-time color graphics, and algorithms for…

  15. Microgenetic Learning Analytics Methods: Workshop Report

    ERIC Educational Resources Information Center

    Aghababyan, Ani; Martin, Taylor; Janisiewicz, Philip; Close, Kevin

    2016-01-01

    Learning analytics is an emerging discipline and, as such, benefits from new tools and methodological approaches. This work reviews and summarizes our workshop on microgenetic data analysis techniques using R, held at the second annual Learning Analytics Summer Institute in Cambridge, Massachusetts, on 30 June 2014. Specifically, this paper…

  16. Dynamic mobility applications analytical needs assessment.

    DOT National Transportation Integrated Search

    2012-07-01

    Dynamic Mobility Applications Analytical Needs Assessment was a one-year project (July 2011 to July 2012) to develop a strategy for assessing the potential impact of twenty-eight applications for improved mobility across national transportation syste...

  17. Active Control of Inlet Noise on the JT15D Turbofan Engine

    NASA Technical Reports Server (NTRS)

    Smith, Jerome P.; Hutcheson, Florence V.; Burdisso, Ricardo A.; Fuller, Chris R.

    1999-01-01

    This report presents the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the year from November 1997 to December 1998 on the Active Noise Control of Turbofan Engines research project funded by NASA Langley Research Center. The concept of implementing active noise control techniques with fuselage-mounted error sensors is investigated both analytically and experimentally. The analytical part of the project involves the continued development of an advanced modeling technique to provide prediction and design guidelines for application of active noise control techniques to large, realistic high bypass engines of the type on which active control methods are expected to be applied. Results from the advanced analytical model are presented that show the effectiveness of the control strategies, and the analytical results presented for fuselage error sensors show good agreement with the experimentally observed results and provide additional insight into the control phenomena. Additional analytical results are presented for active noise control used in conjunction with a wavenumber sensing technique. The experimental work is carried out on a running JT15D turbofan jet engine in a test stand at Virginia Tech. The control strategy used in these tests was the feedforward Filtered-X LMS algorithm. The control inputs were supplied by single and multiple circumferential arrays of acoustic sources equipped with neodymium iron cobalt magnets mounted upstream of the fan. The reference signal was obtained from an inlet mounted eddy current probe. The error signals were obtained from a number of pressure transducers flush-mounted in a simulated fuselage section mounted in the engine test cell. The active control methods are investigated when implemented with the control sources embedded within the acoustically absorptive material on a passively-lined inlet. The experimental results show that the combination of active control techniques with fuselage-mounted error sensors and passive control techniques is an effective means of reducing radiated noise from turbofan engines. Strategic selection of the location of the error transducers is shown to be effective for reducing the radiation towards particular directions in the farfield. An analytical model is used to predict the behavior of the control system and to guide the experimental design configurations, and the analytical results presented show good agreement with the experimentally observed results.
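
    The Filtered-X LMS algorithm named above is straightforward to prototype. Below is a minimal single-channel FxLMS loop in Python; the path coefficients, step size, and signal model are hypothetical stand-ins (the engine tests used multichannel source arrays and measured paths):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, L = 20000, 32
    prim = np.array([0.0, 0.0, 0.5, -0.4, 0.2])  # hypothetical primary path
    sec = np.array([0.0, 0.8, 0.3])              # hypothetical secondary path
    x = rng.normal(size=n)                 # reference (inlet probe) signal
    d = np.convolve(x, prim)[:n]           # fan noise at the error sensor
    xf = np.convolve(x, sec)[:n]           # reference filtered by sec. path
    w, mu = np.zeros(L), 1e-3              # adaptive filter and step size
    xbuf, fbuf, ybuf, err = np.zeros(L), np.zeros(L), np.zeros(len(sec)), []
    for k in range(n):
        xbuf = np.roll(xbuf, 1); xbuf[0] = x[k]
        fbuf = np.roll(fbuf, 1); fbuf[0] = xf[k]
        y = w @ xbuf                       # control-source drive
        ybuf = np.roll(ybuf, 1); ybuf[0] = y
        e = d[k] - sec @ ybuf              # residual after secondary path
        w += mu * e * fbuf                 # FxLMS weight update
        err.append(e)
    print(np.std(err[:1000]), np.std(err[-1000:]))  # before vs after
    ```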

  18. A rapid method for estimation of Pu-isotopes in urine samples using high volume centrifuge.

    PubMed

    Kumar, Ranjeet; Rao, D D; Dubla, Rupali; Yadav, J R

    2017-07-01

    The conventional radio-analytical technique used for the estimation of Pu isotopes in urine samples involves anion exchange/TEVA column separation followed by alpha spectrometry. This sequence of analysis takes nearly 3-4 days to complete. Excreta analysis results are often required urgently, particularly in repeat and incident/emergency situations, so there is a need to reduce the analysis time for the estimation of Pu isotopes in bioassay samples. This paper gives the details of the standardization of a rapid method for the estimation of Pu isotopes in urine samples using a multi-purpose centrifuge and TEVA resin, followed by alpha spectrometry. The rapid method involves oxidation of urine samples and co-precipitation of plutonium along with calcium phosphate, followed by sample preparation using a high-volume centrifuge and separation of Pu using TEVA resin. The Pu fraction was electrodeposited and its activity estimated by alpha spectrometry, using ²³⁶Pu tracer recovery. Ten routine urine samples of radiation workers were analyzed, and consistent radiochemical tracer recovery was obtained in the range 47-88%, with a mean of 64.4% and a standard deviation of 11.3%. With this newly standardized technique, the whole analytical procedure is completed within 9 h (one working day).

  19. An Evaluation of Fractal Surface Measurement Methods for Characterizing Landscape Complexity from Remote-Sensing Imagery

    NASA Technical Reports Server (NTRS)

    Lam, Nina Siu-Ngan; Qiu, Hong-Lie; Quattrochi, Dale A.; Emerson, Charles W.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The rapid increase in digital data volumes from new and existing sensors creates a need for efficient analytical tools for extracting information. We developed an integrated software package called ICAMS (Image Characterization and Modeling System) to provide specialized spatial analytical functions for interpreting remote sensing data. This paper evaluates three fractal dimension measurement methods - isarithm, variogram, and triangular prism - along with the spatial autocorrelation measurement methods Moran's I and Geary's C, all of which have been implemented in ICAMS. A modified triangular prism method was proposed and implemented. Results from analyzing 25 simulated surfaces with known fractal dimensions show that both the isarithm and triangular prism methods can accurately measure a range of fractal surfaces. The triangular prism method is most accurate at estimating the fractal dimension of surfaces with higher spatial complexity, but it is sensitive to contrast stretching. The variogram method is a comparatively poor estimator for all of the surfaces, particularly those with higher fractal dimensions. Like the fractal techniques, the spatial autocorrelation techniques are found to be useful for measuring complex images but not images with low dimensionality. These fractal measurement methods can be applied directly to unclassified images and could serve as a tool for change detection and data mining.
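
    Of the measures named above, Moran's I is the simplest to sketch. A minimal Python implementation using rook (4-neighbour) contiguity on a 2D array; weights schemes and normalizations vary, so this is one common convention rather than the ICAMS implementation:

    ```python
    import numpy as np

    def morans_i(img):
        """Moran's I for a 2D array with rook (4-neighbour) contiguity."""
        z = img - img.mean()
        num = (z[:, :-1] * z[:, 1:]).sum() + (z[:-1, :] * z[1:, :]).sum()
        joins = z[:, :-1].size + z[:-1, :].size  # number of neighbour pairs
        return (img.size / joins) * num / (z ** 2).sum()

    rng = np.random.default_rng(0)
    print(morans_i(rng.normal(size=(64, 64))))      # ~0: spatial noise
    trend = np.add.outer(np.arange(64.0), np.arange(64.0))
    print(morans_i(trend))                          # ~1: smooth surface
    ```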

  20. A method for the direct injection and analysis of small volume human blood spots and plasma extracts containing high concentrations of organic solvents using reversed-phase 2D UPLC/MS.

    PubMed

    Rainville, Paul D; Simeone, Jennifer L; Root, Dan S; Mallet, Claude R; Wilson, Ian D; Plumb, Robert S

    2015-03-21

    The emergence of microsampling techniques holds great potential to improve pharmacokinetic data quality, reduce animal usage, and save costs in safety assessment studies. The analysis of these samples presents new challenges for bioanalytical scientists, both in terms of sample processing and analytical sensitivity. The use of two-dimensional LC/MS with at-column dilution for the direct analysis of highly organic extracts prepared from biological fluids, such as dried blood spots and plasma, is demonstrated. This technique negates the need to dry down and reconstitute, or to dilute samples with water/aqueous buffer solutions, prior to injection onto a reversed-phase LC system. A mixture of model drugs, including bromhexine, triprolidine, enrofloxacin, and procaine, was used to test the feasibility of the method. Finally, an LC/MS assay for the probe pharmaceutical rosuvastatin was developed from dried blood spots and protein-precipitated plasma. The assays showed acceptable recovery, accuracy, and precision according to US FDA guidelines. The resulting analytical method increased assay sensitivity by up to forty-fold compared with conventional methods, by maximizing the amount loaded onto the system and the MS response for rosuvastatin from small-volume samples.

  1. Concurrence of big data analytics and healthcare: A systematic review.

    PubMed

    Mehta, Nishita; Pandit, Anil

    2018-06-01

    The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of the literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges to its adoption, and to identify strategies to overcome those challenges. A systematic search of articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore, and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English-language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles meeting the inclusion criteria were analyzed. The analyses found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources, including government, laboratories, pharma companies, data aggregators, and medical journals; (3) natural language processing (NLP) is the most widely used Big Data analytical technique in healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations, and reduction of the cost of care; and (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review reveals a paucity of information on evidence of real-world use of Big Data analytics in healthcare, because the usability studies have taken only a qualitative approach that describes potential benefits without quantifying them. Also, the majority of the studies were from developed countries, which highlights the need to promote research on healthcare Big Data analytics in developing countries.

  2. Closing the brain-to-brain loop in laboratory testing.

    PubMed

    Plebani, Mario; Lippi, Giuseppe

    2011-07-01

    The delivery of laboratory services was described 40 years ago and defined through the foremost concept of the "brain-to-brain turnaround time loop". This concept comprises several processes, including the final step, which is the action undertaken on the patient based on laboratory information. Unfortunately, the need for systematic feedback to improve the value of laboratory services has been poorly understood and, even more worryingly, poorly applied in daily laboratory practice. Currently, major problems arise from the unavailability of consensually accepted quality specifications for the extra-analytical phases of laboratory testing. This, in turn, does not allow clinical laboratories to calculate a budget for the "patient-related total error". The definition and use of the term "total error" refers only to the analytical phase, and would be better termed "total analytical error" to avoid confusion and misinterpretation. According to the hierarchical approach to classifying strategies for setting analytical quality specifications, the "assessment of the effect of analytical performance on specific clinical decision-making" sits at the top and should therefore be applied as much as possible to direct analytical efforts towards effective goals. In addition, an increasing number of laboratories worldwide are adopting risk management strategies such as FMEA, FRACAS, LEAN, and Six Sigma, since these techniques allow the identification of the most critical steps in the total testing process and reduce the patient-related risk of error. As a matter of fact, an increasing number of laboratory professionals recognize the importance of understanding and monitoring every step in the total testing process, including the appropriateness of the test request as well as the appropriate interpretation and utilization of test results.

  3. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    PubMed

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e. noise) into patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and the recently described moving sum of outliers (movSO) of patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and average of normals (AoN) approaches. The power to detect an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it generally had the worst capability for detecting an increased CVa. On the other hand, the movSD and movSO approaches were able to detect an increased CVa at significantly lower ANPed, particularly for measurands with a relatively small ratio of biological variation to CVa. In conclusion, the movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risk of an increase in analytical imprecision is attenuated for these measurands, as the increased imprecision adds only marginally to the total variation and is less likely to impact clinical care.
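
    A minimal sketch of the movSD idea: compute the standard deviation over a sliding block of consecutive patient results and flag when it exceeds an empirical control limit. The block size, limit rule, and data below are invented for illustration:

    ```python
    import numpy as np

    def moving_sd(results, block=100):
        """Standard deviation over a sliding block of patient results; an
        upward shift signals increased analytical imprecision (CVa)."""
        return np.array([np.std(results[i - block:i], ddof=1)
                         for i in range(block, len(results) + 1)])

    rng = np.random.default_rng(2)
    in_control = rng.normal(100, 5, 2000)    # SD 5: stable assay
    limit = moving_sd(in_control).max()      # empirical upper control limit
    degraded = rng.normal(100, 8, 2000)      # SD rises to 8: imprecision up
    sd = moving_sd(np.concatenate([in_control, degraded]))
    print(int(np.argmax(sd > limit)))        # window index where alarm fires
    ```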

  4. LIBS Analysis for Coal

    NASA Astrophysics Data System (ADS)

    Romero, Carlos E.; De Saro, Robert

    Coal is a non-uniform material with large inherent variability in composition and in other important properties, such as calorific value and ash fusion temperature. This quality variability matters greatly when coal is used as fuel in steam generators, since it affects boiler operation and control, maintenance and availability, and the extent and treatment of the environmental pollution associated with coal combustion. On-line/in situ monitoring of coal before it is fed into a boiler is a necessity. Only a few analytical techniques, such as X-ray fluorescence and prompt gamma neutron activation analysis, are available commercially with enough speed and sophistication of data collection for continuous coal monitoring. However, there is still a need for a better on-line/in situ technique that has higher selectivity, sensitivity, accuracy and precision, and that is safer and has lower installation and operating costs than the other options. Laser-induced breakdown spectroscopy (LIBS) is ideal for coal monitoring in boiler applications: it needs no sample preparation; it is accurate, precise and fast; and it can detect all of the elements of concern to the coal-fired boiler industry. LIBS data can also be combined with advanced data processing techniques to provide the real-time information required by boiler operators today. This chapter summarizes the development of LIBS for on-line/in situ coal applications in utility boilers.

  5. Ratio of sequential chromatograms for quantitative analysis and peak deconvolution: Application to standard addition method and process monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.

    1990-08-01

    This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and of an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
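
    A minimal sketch of the ratio-value calculation follows. The variance weighting shown comes from simple error propagation for r = c2/c1 under equal, independent baseline noise on both chromatograms, which is one plausible reading of the variance-weighted average the abstract describes; all names are illustrative.

        import numpy as np

        def ratio_value(c1, c2, region, noise_sd):
            # Point-by-point ratio of two baseline-corrected chromatograms
            # over a region of pure analyte elution, averaged with weights
            # 1/var(r); propagation for r = c2/c1 with equal noise gives
            # var(r) ~ r^2 * noise_sd^2 * (1/c1^2 + 1/c2^2).
            a = np.asarray(c1, dtype=float)[region]
            b = np.asarray(c2, dtype=float)[region]
            r = b / a
            var_r = r**2 * noise_sd**2 * (1.0 / a**2 + 1.0 / b**2)
            w = 1.0 / var_r
            return float(np.sum(w * r) / np.sum(w))

        # A ratio value of 1.25 would indicate a 25% increase in analyte
        # concentration between the two sequential injections.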

  6. Discourse-Centric Learning Analytics: Mapping the Terrain

    ERIC Educational Resources Information Center

    Knight, Simon; Littleton, Karen

    2015-01-01

    There is an increasing interest in developing learning analytic techniques for the analysis, and support, of high-quality learning discourse. This paper maps the terrain of discourse-centric learning analytics (DCLA), outlining its distinctive contribution and proposing a definition for the field moving forwards. It is our claim that DCLA…

  7. 40 CFR 136.6 - Method modifications and analytical requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...

  8. Analyzing Matrices of Meta-Analytic Correlations: Current Practices and Recommendations

    ERIC Educational Resources Information Center

    Sheng, Zitong; Kong, Wenmo; Cortina, Jose M.; Hou, Shuofei

    2016-01-01

    Researchers have become increasingly interested in conducting analyses on meta-analytic correlation matrices. Methodologists have provided guidance and recommended practices for the application of this technique. The purpose of this article is to review current practices regarding analyzing meta-analytic correlation matrices, to identify the gaps…

  9. Techniques for sensing methanol concentration in aqueous environments

    NASA Technical Reports Server (NTRS)

    Narayanan, Sekharipuram R. (Inventor); Chun, William (Inventor); Valdez, Thomas I. (Inventor)

    2001-01-01

    An analyte concentration sensor that is capable of fast and reliable sensing of analyte concentration in aqueous environments with high concentrations of the analyte. Preferably, the present invention is a methanol concentration sensor device coupled to a fuel metering control system for use in a liquid direct-feed fuel cell.

  10. Big data analytics : predicting traffic flow regimes from simulated connected vehicle messages using data analytics and machine learning.

    DOT National Transportation Integrated Search

    2016-12-25

    The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...

  11. INVESTIGATING ENVIRONMENTAL SINKS OF MACROLIDE ANTIBIOTICS WITH ANALYTICAL CHEMISTRY

    EPA Science Inventory

    Possible environmental sinks (wastewater effluents, biosolids, sediments) of macrolide antibiotics (i.e., azithromycin, roxithromycin and clarithromycin)are investigated using state-of-the-art analytical chemistry techniques.

  12. Investigation of finite element: ABC methods for electromagnetic field simulation. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chatterjee, A.; Volakis, John L.; Nguyen, J.

    1994-01-01

    The mechanics of wave propagation in the presence of obstacles is of great interest in many branches of engineering and applied mathematics like electromagnetics, fluid dynamics, geophysics, seismology, etc. Such problems can be broadly classified into two categories: the bounded domain or the closed problem and the unbounded domain or the open problem. Analytical techniques have been derived for the simpler problems; however, the need to model complicated geometrical features, complex material coatings and fillings, and to adapt the model to changing design parameters have inevitably tilted the balance in favor of numerical techniques. The modeling of closed problems presents difficulties primarily in proper meshing of the interior region. However, problems in unbounded domains pose a unique challenge to computation, since the exterior region is inappropriate for direct implementation of numerical techniques. A large number of solutions have been proposed but only a few have stood the test of time and experiment. The goal of this thesis is to develop an efficient and reliable partial differential equation technique to model large three dimensional scattering problems in electromagnetics.

  13. The analysis of cable forces based on natural frequency

    NASA Astrophysics Data System (ADS)

    Suangga, Made; Hidayat, Irpan; Juliastuti; Bontan, Darwin Julius

    2017-12-01

    A cable is a flexible structural member that is effective at resisting tensile forces. Cables are used in a variety of structures that exploit this characteristic to create efficient tension members. The state of the cable forces in a cable-supported structure is an important indicator of whether the structure is in good condition. Several methods have been developed to measure cable forces on site. The vibration technique, which uses the correlation between natural frequency and cable force, is a simple method for determining in situ cable forces; however, it needs accurate information on the boundary conditions, cable mass, and cable length. The natural frequency of the cable is determined by applying the FFT (Fast Fourier Transform) technique to an acceleration record of the cable. Based on the natural frequency obtained, the cable forces can then be determined analytically or with a finite element program. This research focuses on the vibration technique for determining cable forces, on understanding the effect of the cable's physical parameters, and on the influence of modelling techniques on the natural frequency and cable forces.
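
    For a cable idealized as a taut string with pinned ends, the n-th natural frequency is f_n = (n/2L)*sqrt(T/m), so the tension follows directly from an FFT-identified frequency, the length L, and the mass per unit length m. The sketch below makes that idealization explicit; real cables need the boundary-condition, sag, and bending-stiffness corrections the abstract alludes to.

        import numpy as np

        def dominant_frequency(accel, fs):
            # Peak of the FFT amplitude spectrum of an acceleration record
            # sampled at fs Hz; the DC bin is skipped.
            spec = np.abs(np.fft.rfft(accel - np.mean(accel)))
            freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
            return freqs[1:][np.argmax(spec[1:])]

        def cable_tension(f_n, n, length, mass_per_length):
            # Taut-string model: f_n = (n / 2L) * sqrt(T / m)
            #              =>    T   = 4 * m * L^2 * (f_n / n)^2
            return 4.0 * mass_per_length * length**2 * (f_n / n)**2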

  14. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable the estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are used to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided for quantitative estimation of material properties.

  15. Species authentication and geographical origin discrimination of herbal medicines by near infrared spectroscopy: A review.

    PubMed

    Wang, Pei; Yu, Zhiguo

    2015-10-01

    Near infrared (NIR) spectroscopy as a rapid and nondestructive analytical technique, integrated with chemometrics, is a powerful process analytical tool for the pharmaceutical industry and is becoming an attractive complementary technique for herbal medicine analysis. This review mainly focuses on the recent applications of NIR spectroscopy in species authentication of herbal medicines and their geographical origin discrimination.

  16. A Boltzmann machine for the organization of intelligent machines

    NASA Technical Reports Server (NTRS)

    Moed, Michael C.; Saridis, George N.

    1989-01-01

    In the present technological society, there is a major need to build machines that can execute intelligent tasks in uncertain environments with minimum interaction with a human operator. Although some designers have built smart robots using heuristic ideas, there is no systematic approach for designing such machines in an engineering manner. Recently, cross-disciplinary research from the fields of computers, systems, AI and information theory has laid the foundations of the emerging area of the design of intelligent machines. Since 1977, Saridis has been developing an approach, defined as Hierarchical Intelligent Control, designed to organize, coordinate and execute anthropomorphic tasks by a machine with minimum interaction with a human operator. This approach utilizes analytical (probabilistic) models to describe and control the various functions of the intelligent machine, structured by the intuitively defined principle of Increasing Precision with Decreasing Intelligence (IPDI) (Saridis 1979). This principle, even though it resembles the managerial structure of organizational systems (Levis 1988), has been derived on an analytic basis by Saridis (1988). The purpose is to derive analytically a Boltzmann machine suitable for optimal connection of nodes in a neural net (Fahlman, Hinton, Sejnowski, 1985). This machine will then serve to search for the optimal design of the organization level of an intelligent machine. To accomplish this, some mathematical theory of intelligent machines is first outlined. Definitions are then given for the variables associated with the principle, such as machine intelligence, machine knowledge, and precision (Saridis, Valavanis 1988). A procedure to establish the Boltzmann machine on an analytic basis is then presented and illustrated by an example of designing the organization level of an intelligent machine. A new search technique, the Modified Genetic Algorithm, is presented and proved to converge to the minimum of a cost function. Finally, simulations show the effectiveness of a variety of search techniques for the intelligent machine.

  17. Method optimization and quality assurance in speciation analysis using high performance liquid chromatography with detection by inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Larsen, Erik H.

    1998-02-01

    Achieving optimum selectivity, sensitivity and robustness in speciation analysis using high performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICP-MS) detection requires that each instrumental component be selected and optimized with a view to the ideal operating characteristics of the entire hyphenated system. An isocratic HPLC system, which employs an aqueous mobile phase with organic buffer constituents, is well suited for introduction into the ICP-MS because of the stability of the detector response and the high degree of analyte sensitivity attained. Anion and cation exchange HPLC systems that meet these requirements were used for the separation of selenium and arsenic species in crude extracts of biological samples. Furthermore, the signal-to-noise ratios obtained for these incompletely ionized elements in the argon ICP were enhanced by a factor of four by continuously introducing carbon, as methanol, via the mobile phase into the ICP. Sources of error in the HPLC system (column overload), in the sample introduction system (memory effects from organic solvents) and in the ICP-MS (spectroscopic interferences), and their prevention, are also discussed. The optimized anion and cation exchange HPLC-ICP-MS systems were used for arsenic speciation in contaminated ground water and in an in-house shrimp reference sample. For verification, HPLC coupled with tandem mass spectrometry with electrospray ionization was additionally used for arsenic speciation in the shrimp sample. With this analytical technique, the HPLC retention time in combination with mass analysis of the molecular ions and their collision-induced fragments provides almost conclusive evidence of the identity of the analyte species. The speciation methods are validated by establishing a mass balance of the analytes in each fraction of the extraction procedure, by recovery of spikes, and by employing and comparing independent techniques. The urgent need for reference materials certified for elemental species is stressed.

  18. Application and further development of diffusion based 2D chemical imaging techniques in the rhizosphere

    NASA Astrophysics Data System (ADS)

    Hoefer, Christoph; Santner, Jakob; Borisov, Sergey; Kreuzeder, Andreas; Wenzel, Walter; Puschenreiter, Markus

    2015-04-01

    Two-dimensional chemical imaging of root processes refers to novel in situ methods to investigate and map solutes at high spatial resolution (sub-mm). The visualization of these solutes reveals new insights into soil biogeochemistry and root processes. We derive chemical images using data from DGT-LA-ICP-MS (Diffusive Gradients in Thin Films and Laser Ablation Inductively Coupled Plasma Mass Spectrometry) and POS (Planar Optode Sensors). Both technologies have shown promising results when applied in aqueous environments but need to be refined and improved for imaging at the soil-plant interface. Our focus is on co-localized mapping using combined DGT and POS technologies and on the development of new gel combinations. DGTs are smart, thin (<0.4 mm) hydrogels containing a binding resin for the targeted analytes (e.g. trace metals, phosphate, sulphide or radionuclides). The measurement principle is passive and diffusion based: the analytes diffuse into the gel and are bound by the resin, which acts as a zero sink. After application, DGTs are retrieved, dried, and analysed using LA-ICP-MS. The data are then normalized by an internal standard (e.g. 13C), calibrated using in-house standards, and chemical images of the target area are plotted using imaging software. POS are, similar to DGT, thin sensor foils carrying a fluorophore coating chosen for the target analyte. The measurement principle is based on excitation of the fluorophore at a specific wavelength and emission of the fluorophore depending on the presence of the analyte. The emitted signal is captured using optical filters and a DSLR camera. While DGT analysis is destructive, POS measurements can be performed continuously during the application. Both semi-quantitative techniques allow in situ application to visualize chemical processes directly at the soil-plant interface. Here, we present a summary of results from rhizotron experiments with different plants in metal-contaminated and agricultural soils.

  19. Remote sensing for oceanography: Past, present, future

    NASA Technical Reports Server (NTRS)

    Mcgoldrick, L. F.

    1984-01-01

    Oceanic dynamics has traditionally been investigated by sampling from instruments in situ, yielding quantitative measurements that are intermittent in both space and time; the ocean is undersampled. The need to obtain proper sampling of the averaged quantities treated in analytical and numerical models is at present the most significant limitation on advances in physical oceanography. Within the past decade, many electromagnetic techniques developed for the study of the Earth and planets have been applied to the study of the ocean. Satellites now promise nearly total coverage of the world's oceans using only a few days to a few weeks of observations. A review of early and present techniques applied to satellite oceanography is presented, together with a description of some future systems to be launched into orbit during the remainder of this century. Both scientific and technological capabilities are discussed.

  20. Imaging-based molecular barcoding with pixelated dielectric metasurfaces

    NASA Astrophysics Data System (ADS)

    Tittl, Andreas; Leitis, Aleksandrs; Liu, Mingkai; Yesilkoy, Filiz; Choi, Duk-Yong; Neshev, Dragomir N.; Kivshar, Yuri S.; Altug, Hatice

    2018-06-01

    Metasurfaces provide opportunities for wavefront control, flat optics, and subwavelength light focusing. We developed an imaging-based nanophotonic method for detecting mid-infrared molecular fingerprints and implemented it for the chemical identification and compositional analysis of surface-bound analytes. Our technique features a two-dimensional pixelated dielectric metasurface with a range of ultrasharp resonances, each tuned to a discrete frequency; this enables molecular absorption signatures to be read out at multiple spectral points, and the resulting information is then translated into a barcode-like spatial absorption map for imaging. The signatures of biological, polymer, and pesticide molecules can be detected with high sensitivity, covering applications such as biosensing and environmental monitoring. Our chemically specific technique can resolve absorption fingerprints without the need for spectrometry, frequency scanning, or moving mechanical parts, thereby paving the way toward sensitive and versatile miniaturized mid-infrared spectroscopy devices.

  1. "Tangible as tissue": Arnold Gesell, infant behavior, and film analysis.

    PubMed

    Curtis, Scott

    2011-09-01

    From 1924 to 1948, developmental psychologist Arnold Gesell regularly used photographic and motion picture technologies to collect data on infant behavior. The film camera, he said, records behavior "in such coherent, authentic and measurable detail that ... the reaction patterns of infant and child become almost as tangible as tissue." This essay places his faith in the fidelity and tangibility of film, as well as his use of film as evidence, in the context of developmental psychology's professed need for legitimately scientific observational techniques. It also examines his use of these same films as educational material to promote his brand of scientific child rearing. But his analytic techniques - his methods of extracting data from the film frames - are the key to understanding the complex relationship between his theories of development and his chosen research technology.

  2. Recent analytical developments for powder characterization

    NASA Astrophysics Data System (ADS)

    Brackx, E.; Pages, S.; Dugne, O.; Podor, R.

    2015-07-01

    Powders and divided solid materials are widely represented as finished or intermediate products in industries as varied as foodstuffs, cosmetics, construction, pharmaceuticals, electronics, and energy. Their optimal use requires mastery of the transformation process, based on knowledge of the different phenomena concerned (sintering, chemical reactivity, purity, etc.). Modelling and understanding these phenomena require the prior acquisition of data sets and characteristics that are more or less challenging to obtain. The goal of this study is to present the use of different physico-chemical characterization techniques adapted to uranium-containing powders, analyzed either in the raw state or after a specific preparation (ionic polishing). The new developments discussed concern dimensional characterization of grains and pores by image analysis, chemical surface characterization, and powder chemical reactivity characterization. The examples discussed are from fabrication process materials used in the nuclear fuel cycle.

  3. Analytical techniques for measuring hydrocarbon emissions from the manufacture of fiberglass-reinforced plastics. Report for June 1995--March 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, R.S.; Kong, E.J.; Bahner, M.A.

    The paper discusses several projects to measure hydrocarbon emissions associated with the manufacture of fiberglass-reinforced plastics. The main purpose of the projects was to evaluate pollution prevention techniques to reduce emissions by altering raw materials, application equipment, and operator technique. Analytical techniques were developed to reduce the cost of these emission measurements. Emissions from a small test mold in a temporary total enclosure (TTE) correlated with emissions from full-size production molds in a separate TTE. Gravimetric mass balance measurements inside the TTE generally agreed to within +/-30% with total hydrocarbon (THC) measurements in the TTE exhaust duct.

  4. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
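
    The idea can be sketched as follows: if a set B with analytically known probability p_B is guaranteed to contain the failure set, then P(F) = p_B * P(F | B), and sample points only need to be drawn and analysed inside B. The function names and the binomial error estimate are illustrative, not the paper's exact formulation.

        import numpy as np

        def mc_estimate(is_failure, sampler, n):
            # Plain Monte Carlo: fraction of sampled parameter points that
            # fail, with a binomial standard error.
            fails = sum(bool(is_failure(sampler())) for _ in range(n))
            p = fails / n
            return p, np.sqrt(p * (1.0 - p) / n)

        def conditional_mc(is_failure, sample_in_B, p_B, n):
            # Conditional sampling inside a bounding set B whose probability
            # p_B is known analytically: P(F) = p_B * P(F | B). Because
            # P(F | B) >> P(F) when B is tight, far fewer samples are needed
            # for the same relative accuracy.
            p_cond, se_cond = mc_estimate(is_failure, sample_in_B, n)
            return p_B * p_cond, p_B * se_cond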

  5. Data mining to support simulation modeling of patient flow in hospitals.

    PubMed

    Isken, Mark W; Rajagopalan, Balaji

    2002-04-01

    Spiraling health care costs in the United States are driving institutions to continually address the challenge of optimizing the use of scarce resources. One of the first steps towards optimizing resources is to utilize capacity effectively. For hospital capacity planning problems such as the allocation of inpatient beds, computer simulation is often the method of choice. One of the more difficult aspects of using simulation models for such studies is the creation of a manageable set of patient types to include in the model. The objective of this paper is to demonstrate the potential of using data mining techniques, specifically clustering techniques such as K-means, to help guide the development of patient type definitions for purposes of building computer simulation or analytical models of patient flow in hospitals. Using data from a hospital in the Midwest, this study brings forth several important issues that researchers need to address when applying clustering techniques in general, and specifically to hospital data.
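
    A minimal sketch of the clustering step follows, using scikit-learn's K-means on standardized per-patient features; the feature choice, cluster count, and data are illustrative assumptions, not the study's actual variables.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Hypothetical features per patient: length of stay (days), age,
        # number of unit transfers, admission hour.
        X = np.array([[3.2, 67, 1, 14],
                      [0.9, 23, 0, 2],
                      [7.5, 71, 3, 11],
                      [1.1, 30, 0, 3]])

        X_std = StandardScaler().fit_transform(X)  # K-means is scale-sensitive
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_std)
        print(km.labels_)  # cluster index per patient -> candidate patient type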

  6. Comprehensive Analysis of LC/MS Data Using Pseudocolor Plots

    NASA Astrophysics Data System (ADS)

    Crutchfield, Christopher A.; Olson, Matthew T.; Gourgari, Evgenia; Nesterova, Maria; Stratakis, Constantine A.; Yergey, Alfred L.

    2013-02-01

    We have developed new applications of the pseudocolor plot for the analysis of LC/MS data. These applications include spectral averaging, analysis of variance, differential comparison of spectra, and qualitative filtering by compound class. These applications have been motivated by the need to better understand LC/MS data generated from analysis of human biofluids. The examples presented use data generated to profile steroid hormones in urine extracts from a Cushing's disease patient relative to a healthy control, but are general to any discovery-based scanning mass spectrometry technique. In addition to new visualization techniques, we introduce a new metric of variance: the relative maximum difference from the mean. We also introduce the concept of substructure-dependent analysis of steroid hormones using precursor ion scans. These new analytical techniques provide an alternative approach to traditional untargeted metabolomics workflow. We present an approach to discovery using MS that essentially eliminates alignment or preprocessing of spectra. Moreover, we demonstrate the concept that untargeted metabolomics can be achieved using low mass resolution instrumentation.
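
    One plausible reading of the "relative maximum difference from the mean" metric is sketched below: per m/z channel, the largest absolute deviation of any replicate spectrum from the channel mean, scaled by that mean. The exact definition in the paper may differ.

        import numpy as np

        def relative_max_diff_from_mean(spectra):
            # spectra: rows = replicate scans, columns = m/z channels.
            s = np.asarray(spectra, dtype=float)
            mean = s.mean(axis=0)
            with np.errstate(divide="ignore", invalid="ignore"):
                return np.abs(s - mean).max(axis=0) / mean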

  7. Facile characterization of polymer fractions from waste electrical and electronic equipment (WEEE) for mechanical recycling.

    PubMed

    Taurino, Rosa; Pozzi, Paolo; Zanasi, Tania

    2010-12-01

    In view of the environmental problems involved in the management of WEEE, and hence in the recycling of its post-consumer plastics, there is a pressing need for rapid measurement technologies for the simple identification of the various commercial plastic materials and of their several contaminants, to improve the recycling of such wastes. This research is focused on the characterization and recycling of two types of plastics, namely plastic from personal computers (grey plastic) and plastic from televisions (black plastic). Various analytical techniques were used to monitor the compositions of the WEEE. First, the chemical structure of each plastic material was identified by Fourier transform infrared (FTIR) spectroscopy and differential scanning calorimetry (DSC). Polymeric contaminants of these plastics, in particular brominated flame retardants (BFRs), were detected using different techniques in the grey plastics only. These techniques are useful for the rapid, correct and economical identification of large volumes of WEEE plastics. Copyright © 2010 Elsevier Ltd. All rights reserved.

  8. Model reduction by trimming for a class of semi-Markov reliability models and the corresponding error bound

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Palumbo, Daniel L.

    1991-01-01

    Semi-Markov processes have proved to be an effective and convenient tool for constructing models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a modeling and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs have been written to help the reliability analyst produce models of complex systems. Trimming is easy to implement and its error bound easy to compute; hence, the method lends itself to inclusion in an automatic model generator.

  9. Real-Time Leaky Lamb Wave Spectrum Measurement and Its Application to NDE of Composites

    NASA Technical Reports Server (NTRS)

    Lih, Shyh-Shiuh; Bar-Cohen, Yoseph

    1999-01-01

    Numerous analytical and theoretical studies of the behavior of leaky Lamb waves (LLW) in composite materials have been documented in the literature. One of the key issues constraining the application of this method as a practical tool is the amount of data that needs to be acquired and the slow process involved in such experiments. Recently, a methodology that allows quasi-real-time acquisition of LLW dispersion data was developed. At each angle of incidence the reflection spectrum is available in real time from the experimental setup, and it can be used for rapid detection of defects. This technique can be used to rapidly acquire the various plate wave modes along various angles of incidence for the characterization of the material's elastic properties. The experimental method and data acquisition technique are described in this paper. Experimental data were used to examine a series of flaws, including porosity and delaminations, and demonstrated the efficiency of the developed technique.

  10. Cell-free DNA and next-generation sequencing in the service of personalized medicine for lung cancer

    PubMed Central

    Bennett, Catherine W.; Berchem, Guy; Kim, Yeoun Jin; El-Khoury, Victoria

    2016-01-01

    Personalized medicine has emerged as the future of cancer care to ensure that patients receive individualized treatment specific to their needs. In order to provide such care, molecular techniques that enable oncologists to diagnose, treat, and monitor tumors are necessary. In the field of lung cancer, cell-free DNA (cfDNA) shows great potential as a less invasive liquid biopsy technique, and next-generation sequencing (NGS) is a promising tool for the analysis of tumor mutations. In this review, we outline the evolution of cfDNA and NGS and discuss the progress of using them in a clinical setting for patients with lung cancer. We also present an analysis of the role of cfDNA as a liquid biopsy technique and NGS as an analytical tool in studying EGFR and MET, two frequently mutated genes in lung cancer. Ultimately, we hope that using cfDNA and NGS for cancer diagnosis and treatment will become standard for patients with lung cancer and across the field of oncology. PMID:27589834

  11. The Coordinate Orthogonality Check (corthog)

    NASA Astrophysics Data System (ADS)

    Avitabile, P.; Pechinsky, F.

    1998-05-01

    A new technique referred to as the coordinate orthogonality check (CORTHOG) helps to identify how each physical degree of freedom contributes to the overall orthogonality relationship between analytical and experimental modal vectors on a mass-weighted basis. Using the CORTHOG technique together with the pseudo-orthogonality check (POC) clarifies where potential discrepancies exist between the analytical and experimental modal vectors. CORTHOG improves the understanding of the correlation (or lack of correlation) that exists between modal vectors. The CORTHOG theory is presented along with the evaluation of several cases to show the use of the technique.
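
    The POC is the mass-weighted product of the experimental and analytical mode shape matrices; the sketch below also decomposes one POC entry into per-DOF terms for a lumped (diagonal) mass matrix, which is a simplified reading of the CORTHOG idea of locating the physical DOFs that drive a discrepancy. Names and the diagonal-mass assumption are illustrative.

        import numpy as np

        def poc(phi_exp, phi_ana, M):
            # Pseudo-orthogonality check: columns of phi_exp and phi_ana are
            # experimental and analytical mode shapes; M is the mass matrix.
            return phi_exp.T @ M @ phi_ana

        def corthog_terms(phi_exp, phi_ana, M, j, k):
            # Per-DOF contributions to POC entry (j, k) for a diagonal mass
            # matrix; the terms sum to poc(...)[j, k], so unusually large or
            # wrong-signed terms point at the physical DOFs responsible for
            # poor (or spurious) orthogonality.
            m = np.diag(M)
            return phi_exp[:, j] * m * phi_ana[:, k]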

  12. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  13. Analytical Protocols for Analysis of Organic Molecules in Mars Analog Materials

    NASA Technical Reports Server (NTRS)

    Mahaffy, Paul R.; Brinkerhoff, W.; Buch, A.; Demick, J.; Glavin, D. P.

    2004-01-01

    A range of analytical techniques and protocols that might be applied to in situ investigations of martian fines, ices, and rock samples are evaluated by analysis of organic molecules in Mars analogs. These simulants, from terrestrial (i.e. tephra from Hawaii) or extraterrestrial (meteoritic) samples, are examined by pyrolysis gas chromatography mass spectrometry (GCMS), organic extraction followed by chemical derivatization GCMS, and laser desorption mass spectrometry (LDMS). The combination of techniques imparts analytical breadth, since each technique provides a unique analysis capability for certain classes of organic molecules.

  14. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of each chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.

  15. Student Attitudes toward Learning Analytics in Higher Education: "The Fitbit Version of the Learning World".

    PubMed

    Roberts, Lynne D; Howell, Joel A; Seaman, Kristen; Gibson, David C

    2016-01-01

    Increasingly, higher education institutions are exploring the potential of learning analytics to predict student retention, understand learning behaviors, and improve student learning through personalized feedback and support. The technical development of learning analytics has outpaced consideration of the ethical issues surrounding their use. Of particular concern is the absence of the student voice in decision-making about learning analytics. We explored higher education students' knowledge, attitudes, and concerns about big data and learning analytics through four focus groups (N = 41). Thematic analysis of the focus group transcripts identified six key themes. The first theme, "Uninformed and Uncertain," represents students' lack of knowledge about learning analytics prior to the focus groups. Following the provision of information, the viewing of videos, and discussion of learning analytics scenarios, three further themes ("Help or Hindrance to Learning," "More than a Number," and "Impeding Independence") represented students' perceptions of the likely impact of learning analytics on their learning. "Driving Inequality" and "Where Will it Stop?" represent ethical concerns raised by the students about the potential for inequity, bias and invasion of privacy, and the need for informed consent. A key tension to emerge was how "personal" vs. "collective" purposes or principles can intersect with "uniform" vs. "autonomous" activity. The findings highlight the need to engage students in the decision-making process about learning analytics.

  16. Flexible aircraft dynamic modeling for dynamic analysis and control synthesis

    NASA Technical Reports Server (NTRS)

    Schmidt, David K.

    1989-01-01

    The linearization and simplification of a nonlinear, literal model for flexible aircraft is highlighted. Areas of model fidelity that are critical if the model is to be used for control system synthesis are developed and several simplification techniques that can deliver the necessary model fidelity are discussed. These techniques include both numerical and analytical approaches. An analytical approach, based on first-order sensitivity theory is shown to lead not only to excellent numerical results, but also to closed-form analytical expressions for key system dynamic properties such as the pole/zero factors of the vehicle transfer-function matrix. The analytical results are expressed in terms of vehicle mass properties, vibrational characteristics, and rigid-body and aeroelastic stability derivatives, thus leading to the underlying causes for critical dynamic characteristics.

  17. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of the solvent and sample required by classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can all be increased. Moreover, significant improvements can be made by automation, which remains a hot topic in analytical chemistry. This two-part review comprehensively surveys developments in the automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, their operational advantages, and their potential are further described and discussed. In this first part, an introduction to LPME, its static and dynamic operation modes, and its automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are reviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. A Lightweight I/O Scheme to Facilitate Spatial and Temporal Queries of Scientific Data Analytics

    NASA Technical Reports Server (NTRS)

    Tian, Yuan; Liu, Zhuo; Klasky, Scott; Wang, Bin; Abbasi, Hasan; Zhou, Shujia; Podhorszki, Norbert; Clune, Tom; Logan, Jeremy; Yu, Weikuan

    2013-01-01

    In the era of petascale computing, more scientific applications are being deployed on leadership-scale computing platforms to enhance scientific productivity. Many I/O techniques have been designed to address the growing I/O bottleneck on large-scale systems by handling massive scientific data in a holistic manner. While such techniques have been leveraged in a wide range of applications, they have not been shown to be adequate for many mission-critical applications, particularly in the data post-processing stage. For example, some scientific applications generate datasets composed of a vast number of small data elements that are organized along many spatial and temporal dimensions but require sophisticated data analytics on one or more dimensions. Including such dimensional knowledge in the data organization can benefit the efficiency of data post-processing, but it is often missing from existing I/O techniques. In this study, we propose a novel I/O scheme named STAR (Spatial and Temporal AggRegation) to enable high performance data queries for scientific analytics. STAR is able to dive into the massive data, identify the spatial and temporal relationships among data variables, and accordingly organize them into an optimized multi-dimensional data structure before storage. This technique not only facilitates the common access patterns of data analytics, but also further reduces the application turnaround time. In particular, STAR enables efficient data queries along the time dimension, a practice common in scientific analytics but not yet supported by existing I/O techniques. In our case study with a critical climate modeling application, GEOS-5, experimental results on the Jaguar supercomputer demonstrate an improvement of up to 73 times in read performance compared to the original I/O method.

  19. 100-B/C Target Analyte List Development for Soil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  20. State of practice and emerging application of analytical techniques of nuclear forensic analysis: highlights from the 4th Collaborative Materials Exercise of the Nuclear Forensics International Technical Working Group (ITWG)

    DOE PAGES

    Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.

    2016-09-16

    The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, and each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize the exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging applications of analytical techniques of nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.

  1. State of practice and emerging application of analytical techniques of nuclear forensic analysis: highlights from the 4th Collaborative Materials Exercise of the Nuclear Forensics International Technical Working Group (ITWG)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.

    The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, and each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize the exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging applications of analytical techniques of nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.

  2. Conceptual data sampling for breast cancer histology image classification.

    PubMed

    Rezk, Eman; Awan, Zainab; Islam, Fahad; Jaoua, Ali; Al Maadeed, Somaya; Zhang, Nan; Das, Gautam; Rajpoot, Nasir

    2017-10-01

    Data analytics has become increasingly complicated as the amount of data has increased. One technique used to enable data analytics on large datasets is data sampling, in which a portion of the data is selected that preserves the data characteristics for use in data analytics. In this paper, we introduce a novel data sampling technique that is rooted in formal concept analysis theory. This technique is used to create samples reliant on the data distribution across a set of binary patterns. The proposed sampling technique is applied to classifying the regions of breast cancer histology images as malignant or benign. The performance of our method is compared to that of other classical sampling methods. The results indicate that our method is efficient and generates an illustrative sample of small size. It is also competitive with other sampling methods in terms of sample size and sample quality, as represented by classification accuracy and F1 measure. Copyright © 2017 Elsevier Ltd. All rights reserved.
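
    The paper grounds its sample selection in formal concept analysis; the sketch below only mimics the "preserve the distribution across binary patterns" aspect by stratifying rows on their exact binary pattern and drawing proportionally from each stratum. It is an assumed simplification, not the authors' algorithm.

        import numpy as np
        from collections import defaultdict

        def pattern_stratified_sample(X_bin, frac=0.1, seed=0):
            # Group row indices by their exact binary attribute pattern,
            # then draw proportionally (at least one row) from each group
            # so every pattern survives into the sample.
            rng = np.random.default_rng(seed)
            groups = defaultdict(list)
            for i, row in enumerate(np.asarray(X_bin)):
                groups[tuple(row)].append(i)
            picked = []
            for idx in groups.values():
                k = max(1, round(frac * len(idx)))
                picked.extend(rng.choice(idx, size=k, replace=False))
            return sorted(int(i) for i in picked)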

  3. Analytic and subjective assessments of operator workload imposed by communications tasks in transport aircraft

    NASA Technical Reports Server (NTRS)

    Eckel, J. S.; Crabtree, M. S.

    1984-01-01

    Analytical and subjective techniques that are sensitive to the information transmission and processing requirements of individual communications-related tasks are used to assess the workload imposed on the aircrew by A-10 communications requirements for civilian transport category aircraft. Communications-related tasks are defined as the verbal exchanges between crews and controllers. Three workload estimating techniques are proposed. The first, an information-theoretic analysis, is used to calculate bit values for the perceptual, manual, and verbal demands of each communication task. The second, a paired-comparisons technique, obtains subjective estimates of the information processing and memory requirements for specific messages. By combining the results of the first two techniques, a hybrid analytical scale is created. The third, a subjective rank ordering of sequences of communications tasks, provides an overall scaling of communications workload. Recommendations for future research include an examination of communications-induced workload among the aircrew and the development of simulation scenarios.
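
    The information-theoretic step can be illustrated with Shannon's formula: a message drawn from an ensemble with probabilities p_i carries H = -sum(p_i * log2 p_i) bits. How the report mapped perceptual, manual, and verbal demands onto such probabilities is not reproduced here; the example below is generic.

        import math

        def message_bits(probs):
            # Shannon information content (entropy) of one message ensemble.
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # A readback chosen from 8 equally likely alternatives carries 3 bits.
        print(message_bits([1 / 8] * 8))  # -> 3.0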

  4. Tracking Matrix Effects in the Analysis of DNA Adducts of Polycyclic Aromatic Hydrocarbons

    PubMed Central

    Klaene, Joshua J.; Flarakos, Caroline; Glick, James; Barret, Jennifer T.; Zarbl, Helmut; Vouros, Paul

    2015-01-01

    LC-MS using electrospray ionization is currently the method of choice in bio-organic analysis, covering a wide range of applications in a broad spectrum of biological media. The technique is noted for its high sensitivity, but one major limitation that hinders achievement of its optimal sensitivity is signal suppression due to matrix interferences introduced by co-extracted compounds during the sample preparation procedure. The analysis of DNA adducts of common environmental carcinogens is particularly sensitive to such matrix effects, as sample preparation is a multistep process that involves "contamination" of the sample through the addition of enzymes and other reagents for digestion of the DNA in order to isolate the analyte(s). This problem is further exacerbated by the need to reach low levels of quantitation (LOQ at the ppb level) while also working with limited (2-5 μg) quantities of sample. We report here on a systematic investigation of the ion signal suppression contributed by each individual step of the sample preparation associated with the analysis of DNA adducts of polycyclic aromatic hydrocarbons (PAHs), using as a model analyte dG-BaP, the deoxyguanosine adduct of benzo[a]pyrene (BaP). The individual matrix contribution of each of these sources to the analyte signal was systematically addressed, as were any interactive effects. The information was used to develop a validated analytical protocol for the target biomarker at levels typically encountered in vivo, using as little as 2 μg of DNA, and applied to a dose-response study using a metabolically competent cell line. PMID:26607319

  5. Predicting Malignant and Paramalignant Pleural Effusions by Combining Clinical, Radiological and Pleural Fluid Analytical Parameters.

    PubMed

    Herrera Lara, Susana; Fernández-Fabrellas, Estrella; Juan Samper, Gustavo; Marco Buades, Josefa; Andreu Lapiedra, Rafael; Pinilla Moreno, Amparo; Morales Suárez-Varela, María

    2017-10-01

    The usefulness of clinical, radiological and pleural fluid analytical parameters for diagnosing malignant and paramalignant pleural effusion has not been clearly established. Hence, this study aimed to identify possible predictor variables for diagnosing malignancy in pleural effusion of unknown aetiology. Clinical, radiological and pleural fluid analytical parameters were obtained from consecutive patients who had suffered pleural effusion of unknown aetiology. They were classified into three groups according to their final diagnosis: malignant, paramalignant and benign pleural effusion. The CHAID (Chi-square automatic interaction detector) methodology was used to estimate the implication of the clinical, radiological and analytical variables in daily practice through decision trees. Across 71 patients with malignant (n = 31), paramalignant (n = 15) and benign (n = 25) effusions, smoking habit, dyspnoea, weight loss, radiological characteristics (mass, node, adenopathies and pleural thickening) and pleural fluid analytical parameters (pH and glucose) distinguished malignant and paramalignant pleural effusions (all with p < 0.05). Decision tree 1 classified 77.8% of malignant and paramalignant pleural effusions in step 2. Decision tree 2 classified 83.3% of malignant pleural effusions in step 2, 73.3% of paramalignant pleural effusions and 91.7% of benign ones. The data herein suggest that the identified predictor variables, applied to tree diagrams requiring no extraordinary measures, achieve a higher rate of correct identification of malignant, paramalignant and benign effusions than techniques available today, and proved most useful for routine clinical practice. Future studies are still needed to further improve the classification of patients.
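
    CHAID splits nodes using chi-square tests and is not shipped with scikit-learn; the sketch below substitutes a CART DecisionTreeClassifier only to illustrate the stepwise tree-classification idea on hypothetical encoded predictors. The data, encoding, and depth are assumptions, not the study's dataset.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Hypothetical predictors: [smoking, dyspnoea, weight_loss,
        # radiological_mass, pleural_pH, glucose_mg_dl];
        # labels: 0 benign, 1 paramalignant, 2 malignant.
        X = np.array([[1, 1, 1, 1, 7.20, 45],
                      [0, 0, 0, 0, 7.45, 90],
                      [1, 1, 0, 1, 7.30, 60],
                      [0, 1, 1, 0, 7.38, 70]])
        y = np.array([2, 0, 2, 1])

        tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
        print(tree.predict([[1, 1, 1, 1, 7.25, 50]]))  # predicted class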

  6. Tungsten devices in analytical atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Hou, Xiandeng; Jones, Bradley T.

    2002-04-01

    Tungsten devices have been employed in analytical atomic spectrometry for approximately 30 years. Most of these atomizers can be electrically heated up to 3000 °C at very high heating rates, with a simple power supply. Usually, a tungsten device is employed in one of two modes: as an electrothermal atomizer with which the sample vapor is probed directly, or as an electrothermal vaporizer, which produces a sample aerosol that is then carried to a separate atomizer for analysis. Tungsten devices may take various physical shapes: tubes, cups, boats, ribbons, wires, filaments, coils and loops. Most of these orientations have been applied to many analytical techniques, such as atomic absorption spectrometry, atomic emission spectrometry, atomic fluorescence spectrometry, laser excited atomic fluorescence spectrometry, metastable transfer emission spectroscopy, inductively coupled plasma optical emission spectrometry, inductively coupled plasma mass spectrometry and microwave plasma atomic spectrometry. The analytical figures of merit and the practical applications reported for these techniques are reviewed. Atomization mechanisms reported for tungsten atomizers are also briefly summarized. In addition, less common applications of tungsten devices are discussed, including analyte preconcentration by adsorption or electrodeposition and electrothermal separation of analytes prior to analysis. Tungsten atomization devices continue to provide simple, versatile alternatives for analytical atomic spectrometry.

  7. Further Investigations of Content Analytic Techniques for Extracting the Differentiating Information Contained in the Narrative Sections of Performance Evaluations for Navy Enlisted Personnel. Technical Report No. 75-1.

    ERIC Educational Resources Information Center

    Ramsey-Klee, Diane M.; Richman, Vivian

    The purpose of this research is to develop content analytic techniques capable of extracting the differentiating information in narrative performance evaluations for enlisted personnel in order to aid in the process of selecting personnel for advancement, duty assignment, training, or quality retention. Four tasks were performed. The first task…

  8. Cost and schedule analytical techniques development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period of 3 Aug. 1991 - 30 Nov. 1994. Accomplishments summarized cover the REDSTAR data base, NASCOM hard copy data base, NASCOM automated data base, NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.

  9. Assessing the Importance of Treatment Goals in Patients with Psoriasis: Analytic Hierarchy Process vs. Likert Scales.

    PubMed

    Gutknecht, Mandy; Danner, Marion; Schaarschmidt, Marthe-Lisa; Gross, Christian; Augustin, Matthias

    2018-02-15

    To define treatment benefit, the Patient Benefit Index contains a weighting of patient-relevant treatment goals using the Patient Needs Questionnaire, which includes a 5-point Likert scale ranging from 0 ("not important at all") to 4 ("very important"). These treatment goals have been assigned to five health dimensions. The importance of each dimension can be derived by averaging the importance ratings on the Likert scales of the associated treatment goals. As the use of a Likert scale does not allow for a relative assessment of importance, the objective of this study was to estimate relative importance weights for health dimensions and associated treatment goals in patients with psoriasis by using the analytic hierarchy process, and to compare these weights with the weights resulting from the Patient Needs Questionnaire. Furthermore, patients' judgments on the difficulty of the methods were investigated. Dimensions of the Patient Benefit Index and their treatment goals were mapped into a hierarchy of criteria and sub-criteria to develop the analytic hierarchy process questionnaire. Adult patients with psoriasis starting a new anti-psoriatic therapy in the outpatient clinic of the Institute for Health Services Research in Dermatology and Nursing at the University Medical Center Hamburg (Germany) were recruited and completed both methods (analytic hierarchy process, Patient Needs Questionnaire). Ratings of treatment goals on the Likert scales (Patient Needs Questionnaire) were summarized within each dimension to assess the importance of the respective health dimension/criterion. Following the analytic hierarchy process approach, consistency in judgments was assessed using a standardized measurement (the consistency ratio). At the analytic hierarchy process level of criteria, 78 of 140 patients achieved the accepted consistency. Using the analytic hierarchy process, the dimension "improvement of physical functioning" was most important, followed by "improvement of social functioning". In the Patient Needs Questionnaire results, these dimensions were ranked in second and fifth position, whereas "strengthening of confidence in the therapy and in a possible healing", which was least important in the analytic hierarchy process ranking, was ranked most important. In both methods, "improvement of psychological well-being" and "reduction of impairments due to therapy" were equally ranked in positions three and four. By contrast, at the level of sub-criteria, the analytic hierarchy process and the Patient Needs Questionnaire predominantly produced similar rankings of treatment goals. From the patients' point of view, the Likert scales (Patient Needs Questionnaire) were easier to complete than the analytic hierarchy process pairwise comparisons. Patients with psoriasis assign different importance to different health dimensions and associated treatment goals. In choosing a method to assess the importance of health dimensions and/or treatment goals, it needs to be considered that the resulting importance weights may differ depending on the method used. In this study, however, the observed discrepancies in importance weights of the health dimensions were most likely caused by the different methodological approaches, which assess the importance of health dimensions either via treatment goals (Patient Needs Questionnaire) or directly (analytic hierarchy process).
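
    The analytic hierarchy process step can be sketched as follows: priority weights are the principal eigenvector of the reciprocal pairwise comparison matrix, and consistency is judged by the consistency ratio CR = CI/RI with CI = (lambda_max - n)/(n - 1), where CR <= 0.1 is the conventional acceptance threshold. The example matrix is hypothetical.

        import numpy as np

        # Saaty's random consistency indices for matrix sizes 1..10.
        RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
              6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

        def ahp_weights(A):
            # Principal-eigenvector weights and consistency ratio for a
            # reciprocal pairwise comparison matrix A (valid for n >= 3,
            # since RI is zero below that).
            A = np.asarray(A, dtype=float)
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = int(np.argmax(eigvals.real))
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()                        # weights sum to 1
            ci = (eigvals[k].real - n) / (n - 1)
            return w, ci / RI[n]

        # Three hypothetical dimensions compared on Saaty's 1-9 scale.
        w, cr = ahp_weights([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
        print(w, cr)  # CR <= 0.1 indicates acceptable consistency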

  10. The analyst's participation in the analytic process.

    PubMed

    Levine, H B

    1994-08-01

    The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.

  11. [Application of the analytical transmission electron microscopy techniques for detection, identification and visualization of localization of nanoparticles of titanium and cerium oxides in mammalian cells].

    PubMed

    Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V

    2014-01-01

    This work presents the results of a study on the applicability of modern analytical transmission electron microscopy methods for the detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis of images of the nanoparticles in the cells obtained in the bright-field mode of transmission electron microscopy, by dark-field scanning transmission electron microscopy, and by high-angle annular dark-field scanning transmission electron microscopy was performed. For identification of nanoparticles in the cells, two analytical techniques, energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy, were compared, both for acquiring energy spectra from individual particles and for element mapping. It was shown that electron tomography can confirm that nanoparticles are localized within the sample rather than in surface contamination. The possibilities and fields of application of the different analytical transmission electron microscopy techniques for detection, visualization and identification of nanoparticles in biological samples are discussed.

  12. An Overview on the Importance of Combining Complementary Analytical Platforms in Metabolomic Research.

    PubMed

    Gonzalez-Dominguez, Alvaro; Duran-Guerrero, Enrique; Fernandez-Recamales, Angeles; Lechuga-Sancho, Alfonso Maria; Sayago, Ana; Schwarz, Monica; Segundo, Carmen; Gonzalez-Dominguez, Raul

    2017-01-01

    The analytical bias introduced by most of the commonly used techniques in metabolomics considerably hinders the simultaneous detection of all metabolites present in complex biological samples. To overcome this limitation, the combination of complementary approaches has emerged in recent years as the most suitable strategy for maximizing metabolite coverage. This review article presents a general overview of the most important analytical techniques usually employed in metabolomics: nuclear magnetic resonance, mass spectrometry and hybrid approaches. Furthermore, we emphasize the potential of integrating various tools in the form of metabolomic multi-platforms in order to get a deeper metabolome characterization, for which a revision of the existing literature in this field is provided. This review is not intended to be exhaustive but, rather, to give a practical and concise guide, for readers not familiar with analytical chemistry, to the considerations to take into account for the proper selection of the technique to be used in a metabolomic experiment in biomedical research. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  13. Extended Analytic Device Optimization Employing Asymptotic Expansion

    NASA Technical Reports Server (NTRS)

    Mackey, Jonathan; Sehirlioglu, Alp; Dynys, Fred

    2013-01-01

    Analytic optimization of a thermoelectric junction often introduces several simplifying assumptions including constant material properties, fixed known hot and cold shoe temperatures, and thermally insulated leg sides. In fact all of these simplifications will have an effect on device performance, ranging from negligible to significant depending on conditions. Numerical methods, such as Finite Element Analysis or iterative techniques, are often used to perform more detailed analysis and account for these simplifications. While numerical methods may stand as a suitable solution scheme, they are weak in gaining physical understanding and only serve to optimize through iterative searching techniques. Analytic and asymptotic expansion techniques can be used to solve the governing system of thermoelectric differential equations with fewer or less severe assumptions than the classic case. Analytic methods can provide meaningful closed form solutions and generate better physical understanding of the conditions for when simplifying assumptions may be valid. In obtaining the analytic solutions a set of dimensionless parameters, which characterize all thermoelectric couples, is formulated and provides the limiting cases for validating assumptions. The presentation includes optimization of both classic rectangular couples as well as practically and theoretically interesting cylindrical couples using optimization parameters physically meaningful to a cylindrical couple. Solutions incorporate the physical behavior for i) thermal resistance of hot and cold shoes, ii) variable material properties with temperature, and iii) lateral heat transfer through leg sides.
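    For reference, the "classic case" that these extended solutions relax can be summarized by the standard constant-property textbook result (not taken from the presentation itself), with Seebeck coefficient α, electrical conductivity σ, thermal conductivity κ, and shoe temperatures T_h and T_c:

```latex
% Classic constant-property result: junction figure of merit and the
% maximum conversion efficiency of a thermoelectric couple.
\[
  Z = \frac{\alpha^{2}\sigma}{\kappa}, \qquad
  \eta_{\max} = \underbrace{\frac{T_h - T_c}{T_h}}_{\text{Carnot}}
  \cdot \frac{\sqrt{1 + Z\bar{T}} - 1}{\sqrt{1 + Z\bar{T}} + T_c/T_h},
  \qquad \bar{T} = \tfrac{1}{2}\,(T_h + T_c).
\]
```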

  14. Big (Bio)Chemical Data Mining Using Chemometric Methods: A Need for Chemists.

    PubMed

    Tauler, Roma; Parastar, Hadi

    2018-03-23

    This review aims to demonstrate the ability of multivariate chemometric methods to analyze Big (Bio)Chemical Data (BBCD) and to show some of the more important challenges of modern analytical research. In this review, the capabilities and versatility of chemometric methods will be discussed in light of the BBCD challenges that are being encountered in chromatographic, spectroscopic and hyperspectral imaging measurements, with an emphasis on their application to omics sciences. In addition, insights and perspectives on how to address the analysis of BBCD are provided, along with a discussion of the procedures necessary to obtain more reliable qualitative and quantitative results. In this review, the importance of Big Data and of their relevance to (bio)chemistry are first discussed. Then, analytical tools which can produce BBCD are presented, as well as some basics needed to understand the prospects and limitations of chemometric techniques applied to BBCD. Finally, the significance of combining chemometric approaches with BBCD analysis in different chemical disciplines is highlighted with some examples. In this paper, we have tried to cover some of the applications of big data analysis in the (bio)chemistry field. However, this coverage is not exhaustive. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
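    As a concrete illustration of the multivariate workhorse behind much of this, the sketch below runs a principal component analysis via singular value decomposition on a synthetic sample-by-wavelength matrix; the data and dimensions are invented for the example.

```python
# PCA via SVD of a mean-centered data matrix: the basic compression step
# chemometrics applies to large spectroscopic or hyperspectral data sets.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 60, 500           # e.g., 60 spectra, 500 wavelengths
scores_true = rng.normal(size=(n_samples, 2))
loadings_true = rng.normal(size=(2, n_channels))
X = scores_true @ loadings_true + 0.05 * rng.normal(size=(n_samples, n_channels))

Xc = X - X.mean(axis=0)                   # mean-center each channel
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)

scores = U[:, :2] * s[:2]                 # sample coordinates in PC space
print("variance explained by 2 PCs:", explained[:2].sum())
```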

  15. Antimicrobial drug residues in milk and meat: causes, concerns, prevalence, regulations, tests, and test performance.

    PubMed

    Mitchell, J M; Griffiths, M W; McEwen, S A; McNab, W B; Yee, A J

    1998-06-01

    This paper presents a historical review of antimicrobial use in food animals, the causes of residues in meat and milk, the types of residues found, their regulation in Canada, tests used for their detection, and test performance parameters, with an emphasis on immunoassay techniques. The development of residue detection methods began shortly after the introduction of antimicrobials to food animal production in the late 1940s. From initial technical concerns expressed by the dairy industry to the present public health and international trade implications, there has been an ongoing need for reliable, sensitive, and economical methods for the detection of antimicrobial residues in food animal products such as milk and meat. Initially there were microbial growth inhibition tests, followed by more sensitive and specific methods based on receptor binding, immunochemical, and chromatographic principles. An understanding of basic test performance parameters and their implications is essential when choosing an analytical strategy for residue testing. While each test format has its own attributes, no single test will meet all the required analytical needs. Therefore, the use of a tiered or integrated system employing assays designed for screening and confirmation is necessary to ensure that foods containing violative residues are not introduced into the food chain.
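    The basic test performance parameters mentioned above can be made concrete with a small sketch computing sensitivity, specificity, and predictive values from a hypothetical 2x2 table of screening results against a confirmatory method (the counts are invented):

```python
# Test performance parameters from a 2x2 table of screening-test counts
# scored against a confirmatory reference method.
def test_performance(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)          # fraction of violative samples detected
    specificity = tn / (tn + fp)          # fraction of compliant samples passed
    ppv = tp / (tp + fp)                  # positive predictive value
    npv = tn / (tn + fn)                  # negative predictive value
    return sensitivity, specificity, ppv, npv

# e.g., 90 true positives, 30 false positives, 10 false negatives,
# 870 true negatives in 1000 milk samples
print(test_performance(90, 30, 10, 870))
```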

  16. Accounting for differences in the bioactivity and bioavailability of vitamers

    PubMed Central

    Gregory, Jesse F.

    2012-01-01

    Essentially all vitamins exist as multiple nutritionally active chemical species, often called vitamers. Our quantitative understanding of the bioactivity and bioavailability of the various members of each vitamin family has increased markedly, but many issues remain to be resolved concerning the reporting and use of analytical data. Modern methods of vitamin analysis rely heavily on chromatographic techniques that generally allow the measurement of the individual chemical forms of vitamins. Typical applications of food analysis include the evaluation of shelf life and storage stability, monitoring of nutrient retention during food processing, developing food composition databases and data needed for food labeling, assessing dietary adequacy, and evaluating epidemiological relationships between diet and disease. Although the usage of analytical data varies depending on the situation, important issues regarding how best to present and interpret the data in light of the presence of multiple vitamers are common to all aspects of food analysis. In this review, we will evaluate the existence of vitamers that exhibit differences in bioactivity or bioavailability, consider when there is a need to address differences in bioactivity or bioavailability of vitamers, and then consider alternative approaches and possible ways to improve the reporting of data. Major examples are taken from the literature and from experience with vitamin B6 and folate. PMID:22489223

  17. Treating Offenders with Mental Illness: A Research Synthesis

    PubMed Central

    Morgan, Robert D.; Flora, David B.; Kroner, Daryl G.; Mills, Jeremy F.; Varghese, Femina; Steffan, Jarrod S.

    2011-01-01

    The purpose of this research synthesis was to examine treatment effects across studies of service providers to offenders with mental illness. Meta-analytic techniques were applied to 26 empirical studies obtained from a review of 12,154 research documents. Outcomes of interest in this review included measures of both psychiatric and criminal functioning. Although the meta-analytic results are based on a small sample of available studies, they suggest that interventions for offenders with mental illness effectively reduced symptoms of distress, improved offenders' ability to cope with their problems, and resulted in improved behavioral markers, including institutional adjustment and behavioral functioning. Furthermore, interventions specifically designed to meet the psychiatric and criminal justice needs of offenders with mental illness have been shown to produce significant reductions in psychiatric and criminal recidivism. Finally, this review highlighted the admission policies and treatment strategies (e.g., use of homework) that produced the most positive benefits. Results of this research synthesis are directly relevant for service providers in both criminal justice and mental health systems (e.g., psychiatric hospitals) as well as community settings, by informing, for the first time, treatment strategies based on empirical evidence. In addition, the implications of these results for policy makers tasked with the responsibility of designating services for this special-needs population are highlighted. PMID:22471384

  18. Characterization techniques for nano-electronics, with emphasis to electron microscopy. The role of the European Project ANNA

    NASA Astrophysics Data System (ADS)

    Armigliato, A.

    2008-07-01

    In present and future CMOS technology, owing to the ever-shrinking geometries of electronic devices, the availability of techniques capable of performing quantitative analyses of the relevant parameters (structural, chemical, mechanical) at the nanoscale is of paramount importance. The influence of these features on the electrical performance of nanodevices is a key issue for the nanoelectronics industry. In recent years, significant progress has been made in this field by a number of techniques, such as X-ray diffraction, in particular with the advent of synchrotron sources, ion-microbeam-based Rutherford backscattering and channeling spectrometry, and micro-Raman spectrometry. In addition, secondary ion mass spectrometry (SIMS) has achieved an important role in the determination of dopant depth profiles in ultra-shallow junctions (USJs) in silicon. However, the technique which features the ultimate spatial resolution (at the nanometer scale) is scanning transmission electron microscopy (STEM). This presentation reports on the STEM nanoanalysis of two very important physical quantities which need to be controlled in the fabrication processes of nanodevices: the dopant profile in USJs and the lattice strain that is generated in the electrically active Si regions of isolation structures by the different technological steps. The former quantity is investigated by the so-called Z-contrast high-angle annular dark field (HAADF-STEM) method, whereas the mechanical strain can be two-dimensionally mapped by the convergent beam electron diffraction (CBED-STEM) method. A spatial resolution below one nanometer and of a few nanometers can be achieved in the two cases, respectively. To keep pace with scientific and technological progress, an increasingly wide array of analytical techniques is necessary; their complementary role in the solution of present and future characterization problems must be exploited. Presently, however, European laboratories with high-level expertise in materials characterization still operate in a largely independent way; this adversely affects the competitiveness of European science and industry at the international level. For this reason the European Commission started an Integrated Infrastructure Initiative (I3) in the Sixth Framework Programme (now continuing in FP7) and funded a project called ANNA (2006-2010). This acronym stands for European Integrated Activity of Excellence and Networking for Nano- and Micro-Electronics Analysis. The consortium includes 12 partners from 7 European countries and is coordinated by the Fondazione B. Kessler (FBK) in Trento (Italy); CNR-IMM is one of the 12 partners. The aim of ANNA is to establish strong, long-term collaboration among the partners, so as to form an integrated multi-site analytical facility able to offer the European community a wide variety of top-level analytical expertise and services in the field of micro- and nano-electronics. These include X-ray diffraction and scattering, SIMS, electron microscopy, medium-energy ion scattering, and optical and electrical techniques.
The project is focused on three main activities: Networking (standardization of samples and methodologies, establishment of accredited reference laboratories), Transnational Access to laboratories located on the partners' premises to perform specific analytical experiments (an example is given by the two STEM methodologies discussed above), and Joint Research activity, which is targeted at the improvement and extension of the methodologies through continuous instrumental and technical development. It is planned that the European joint analytical laboratory will continue its activity beyond the end of the project in 2010.

  19. Application of Laser Mass Spectrometry to Art and Archaeology

    NASA Technical Reports Server (NTRS)

    Gulian, Lisa E.; Callahan, Michael P.; Muliadi, Sarah; Owens, Shawn; McGovern, Patrick E.; Schmidt, Catherine M.; Trentelman, Karen A.; deVries, Mattanjah S.

    2011-01-01

    REMPI laser mass spectrometry is a combination of resonance enhanced multiphoton ionization spectroscopy and time-of-flight mass spectrometry. This technique enables the collection of mass-specific optical spectra as well as of optically selected mass spectra. Analytes are jet-cooled by entrainment in a molecular beam, and this low-temperature gas-phase analysis has the benefit of excellent vibronic resolution. Utilizing this method, mass spectrometric analysis of historically relevant samples can be simplified and improved: optical selection of targets eliminates the need for chromatography, while knowledge of a target's gas-phase spectroscopy allows for facile differentiation of molecules that are considered spectroscopically indistinguishable in the aqueous phase. These two factors allow smaller sample sizes than commercial MS instruments, which in turn means less damage to objects of antiquity. We have explored methods to optimize REMPI laser mass spectrometry as an analytical tool for archaeology using theobromine and caffeine as molecular markers in Mesoamerican pottery, and are expanding this approach to the field of art to examine laccaic acid in shellacs.

  20. Analysis of short-chain fatty acids in human feces: A scoping review.

    PubMed

    Primec, Maša; Mičetić-Turk, Dušanka; Langerholc, Tomaž

    2017-06-01

    Short-chain fatty acids (SCFAs) play a crucial role in maintaining homeostasis in humans; therefore, the importance of good and reliable analytical detection of SCFAs has grown considerably in the past few years. The aim of this scoping review is to show the trends in the development of different methods of SCFA analysis in feces, based on the literature published in the last eleven years in all major indexing databases. The search criteria included analytical quantification techniques for SCFAs in different human clinical and in vivo studies. SCFA analysis is still predominantly performed using gas chromatography (GC), followed by high performance liquid chromatography (HPLC), nuclear magnetic resonance (NMR) and capillary electrophoresis (CE). Performances, drawbacks and advantages of these methods are discussed, especially in light of choosing a proper pretreatment, as feces is a complex biological material. Further optimization to develop a simple, cost-effective and robust method for routine use is needed. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Dietary exposure to trace elements and radionuclides: the methodology of the Italian Total Diet Study 2012-2014.

    PubMed

    D'Amato, Marilena; Turrini, Aida; Aureli, Federica; Moracci, Gabriele; Raggi, Andrea; Chiaravalle, Eugenio; Mangiacotti, Michele; Cenci, Telemaco; Orletti, Roberta; Candela, Loredana; di Sandro, Alessandra; Cubadda, Francesco

    2013-01-01

    This article presents the methodology of the Italian Total Diet Study 2012-2014 aimed at assessing the dietary exposure of the general Italian population to selected nonessential trace elements (Al, inorganic As, Cd, Pb, methyl-Hg, inorganic Hg, U) and radionuclides (40K, 134Cs, 137Cs, 90Sr). The establishment of the TDS food list, the design of the sampling plan, and details about the collection of food samples, their standardized culinary treatment, pooling into analytical samples and subsequent sample treatment are described. Analytical techniques and quality assurance are discussed, with emphasis on the need for speciation data and for minimizing the percentage of left-censored data so as to reduce uncertainties in exposure assessment. Finally the methodology for estimating the exposure of the general population and of population subgroups according to age (children, teenagers, adults, and the elderly) and gender, both at the national level and for each of the four main geographical areas of Italy, is presented.

  2. Conducting Meta-Analyses Based on p Values

    PubMed Central

    van Aert, Robbie C. M.; Wicherts, Jelte M.; van Assen, Marcel A. L. M.

    2016-01-01

    Because of overwhelming evidence of publication bias in psychology, techniques to correct meta-analytic estimates for such bias are greatly needed. The methodology on which the p-uniform and p-curve methods are based has great promise for providing accurate meta-analytic estimates in the presence of publication bias. However, in this article, we show that in some situations, p-curve behaves erratically, whereas p-uniform may yield implausible estimates of negative effect size. Moreover, we show that (and explain why) p-curve and p-uniform result in overestimation of effect size under moderate-to-large heterogeneity and may yield unpredictable bias when researchers employ p-hacking. We offer hands-on recommendations on applying and interpreting results of meta-analyses in general and p-uniform and p-curve in particular. Both methods as well as traditional methods are applied to a meta-analysis on the effect of weight on judgments of importance. We offer guidance for applying p-uniform or p-curve using R and a user-friendly web application for applying p-uniform. PMID:27694466
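    A rough sketch of the p-uniform estimation idea follows, assuming normally distributed effect estimates and selection of significant results at α = .05; the function and toy inputs are illustrative only, and the authors provide R code and a web application for actual analyses.

```python
# p-uniform sketch: among statistically significant studies, the p-values
# conditional on significance are uniform at the true effect size. The
# estimator finds the effect size delta at which -sum(log q) equals its
# expected value (the number of studies).
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def p_uniform_estimate(y, se, alpha=0.05):
    """y: observed effects of significant studies; se: standard errors."""
    y, se = np.asarray(y, float), np.asarray(se, float)
    z_crit = norm.ppf(1 - alpha / 2)       # one-sided selection threshold
    y_crit = z_crit * se                   # critical effect per study

    def loss(delta):
        num = norm.sf((y - delta) / se)    # P(Y > y; delta)
        den = norm.sf((y_crit - delta) / se)
        q = num / den                      # conditional "pp-values"
        return -np.sum(np.log(q)) - len(y)

    return brentq(loss, -2, 2)             # root search on a plausible range

# toy usage: three significant studies
print(p_uniform_estimate([0.45, 0.60, 0.52], [0.15, 0.20, 0.18]))
```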

  3. QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study

    NASA Astrophysics Data System (ADS)

    Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa

    2016-10-01

    Conceptual design is a subset of concept art wherein a new idea of a product is created rather than a visual representation to be used directly in a final product. The purpose is to understand the needs of conceptual design as used in engineering design and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services to enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD; it translates customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out in order to reduce the CR portion of the HOQ. The analytic hierarchy process is employed to obtain the priority ratings of CRs, which are used in constructing the HOQ. This paper mainly discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the integrated QFD-ANP methodology helps establish the importance ratings of the DRs.

  4. Feasibility of UV-VIS-Fluorescence spectroscopy combined with pattern recognition techniques to authenticate a new category of plant food supplements.

    PubMed

    Boggia, Raffaella; Turrini, Federica; Anselmo, Marco; Zunin, Paola; Donno, Dario; Beccaro, Gabriele L

    2017-07-01

    Bud extracts, also named "gemmoderivatives", are a new category of natural products obtained by macerating fresh meristematic tissues of trees and plants. In the European Community these botanical remedies are classified as plant food supplements. Nowadays these products are still poorly studied, even though they are widely used and commercialized. Several analytical tools for the quality control of these very expensive supplements are urgently needed in order to avoid mislabelling and fraud. In fact, besides the usual quality controls common to other botanical dietary supplements, these extracts should be checked in order to quickly detect whether the cheaper adult parts of the plants have been deceptively used in place of the corresponding buds, whose harvest period and production are extremely limited. This study aims to provide a screening analytical method based on UV-VIS-fluorescence spectroscopy coupled with multivariate analysis for rapid, inexpensive and non-destructive quality control of these products.

  5. Using business intelligence for efficient inter-facility patient transfer.

    PubMed

    Haque, Waqar; Derksen, Beth Ann; Calado, Devin; Foster, Lee

    2015-01-01

    In the context of inter-facility patient transfer, a transfer operator must be able to objectively identify a destination which meets the needs of a patient, while keeping in mind each facility's limitations. We propose a solution which uses Business Intelligence (BI) techniques to analyze data related to healthcare infrastructure and services, and provides a web-based system to identify optimal destination(s). The proposed inter-facility transfer system uses a single data warehouse with an Online Analytical Processing (OLAP) cube built on top that supplies analytical data to multiple reports embedded in web pages. The data visualization tool includes map-based navigation of the health authority as well as an interactive filtering mechanism which finds facilities meeting the selected criteria. The data visualization is backed by an intuitive data-entry web form which safely constrains the data, ensuring consistency and a single version of the truth. With this interactive solution, the overall time required to identify the destination for inter-facility transfers is reduced from hours to a few minutes.
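    The core filtering step can be illustrated with a toy sketch (hypothetical schema and data, not the authors' warehouse): given the services a patient requires, keep only facilities offering them all, then rank by distance.

```python
# Toy facility-filtering step: candidate destinations must offer every
# required service; nearest candidates are listed first.
import pandas as pd

facilities = pd.DataFrame({
    "facility": ["A", "B", "C"],
    "services": [{"ICU", "CT"}, {"CT"}, {"ICU", "CT", "MRI"}],
    "distance_km": [12, 5, 40],
})

needed = {"ICU", "CT"}
candidates = facilities[facilities["services"].apply(needed.issubset)]
print(candidates.sort_values("distance_km"))
```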

  6. Can we use high precision metal isotope analysis to improve our understanding of cancer?

    PubMed

    Larner, Fiona

    2016-01-01

    High precision natural isotope analyses are widely used in geosciences to trace elemental transport pathways. The use of this analytical tool is increasing in nutritional and disease-related research. In recent months, a number of groups have shown the potential this technique has in providing new observations for various cancers when applied to trace metal metabolism. The deconvolution of isotopic signatures, however, relies on mathematical models and geochemical data, which are not representative of the system under investigation. In addition to relevant biochemical studies of protein-metal isotopic interactions, technological development both in terms of sample throughput and detection sensitivity of these elements is now needed to translate this novel approach into a mainstream analytical tool. Following this, essential background healthy population studies must be performed, alongside observational, cross-sectional disease-based studies. Only then can the sensitivity and specificity of isotopic analyses be tested alongside currently employed methods, and important questions such as the influence of cancer heterogeneity and disease stage on isotopic signatures be addressed.

  7. Real-time analysis of healthcare using big data analytics

    NASA Astrophysics Data System (ADS)

    Basco, J. Antony; Senthilkumar, N. C.

    2017-11-01

    Big Data Analytics (BDA) provides a tremendous advantage where revolutionary performance is needed in handling large amounts of data covering the four characteristics of volume, velocity, variety and veracity. BDA has the ability to handle such dynamic data, providing operational effectiveness and exceptionally beneficial output in several day-to-day applications for various organizations. Healthcare is one of the sectors which generate data constantly, covering all four characteristics, with outstanding growth. There are several challenges in processing patient records, which involve a variety of structured and unstructured formats. Introducing BDA into healthcare (HBDA) means dealing with sensitive patient-driven information, mostly in unstructured formats comprising prescriptions, reports, data from imaging systems, etc.; these challenges can be overcome by big data with enhanced efficiency in fetching and storing data. In this project, datasets resembling Electronic Medical Records (EMRs) produced by numerous medical devices and mobile applications are ingested into MongoDB using the Hadoop framework, with an improved processing technique to improve the outcome of processing patient records.
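    A minimal sketch of the ingestion-and-query idea, using the standard pymongo client with an invented database, collection, and document schema (the paper's actual Hadoop-based pipeline is not reproduced here):

```python
# Ingest EMR-style JSON documents into MongoDB and run a simple
# aggregation. Requires a running MongoDB instance; connection string,
# collection, and field names are illustrative.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
records = client.hbda.emr                      # hypothetical database/collection

records.insert_many([
    {"patient_id": 1, "source": "imaging", "text": "...", "hr": 72},
    {"patient_id": 2, "source": "prescription", "text": "...", "hr": 88},
])

# average heart rate per record source: the kind of query a real-time
# dashboard might issue
pipeline = [{"$group": {"_id": "$source", "avg_hr": {"$avg": "$hr"}}}]
for row in records.aggregate(pipeline):
    print(row)
```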

  8. Determination of nanoparticle size distribution together with density or molecular weight by 2D analytical ultracentrifugation

    PubMed Central

    Carney, Randy P.; Kim, Jin Young; Qian, Huifeng; Jin, Rongchao; Mehenni, Hakim; Stellacci, Francesco; Bakr, Osman M.

    2011-01-01

    Nanoparticles are finding many research and industrial applications, yet their characterization remains a challenge. Their cores are often polydisperse and coated by a stabilizing shell that varies in size and composition. No single technique can characterize both the size distribution and the nature of the shell. Advances in analytical ultracentrifugation allow for the extraction of the sedimentation (s) and diffusion coefficients (D). Here we report an approach to transform the s and D distributions of nanoparticles in solution into precise molecular weight (M), density (ρP) and particle diameter (dp) distributions. M for mixtures of discrete nanocrystals is found within 4% of the known quantities. The accuracy and the density information we achieve on nanoparticles are unparalleled. A single experimental run is sufficient for full nanoparticle characterization, without the need for standards or other auxiliary measurements. We believe that our method is of general applicability and we discuss its limitations. PMID:21654635
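    Assuming ideal spherical particles, the transformation from measured s and D to diameter, particle density, and molar mass can be sketched from the Stokes-Einstein and sedimentation relations; constants are in SI units and the inputs below are invented:

```python
# Recover diameter, particle density, and molar mass of a spherical
# particle from its sedimentation coefficient s and diffusion
# coefficient D, in the spirit of the 2D AUC analysis described above.
import math

K_B, N_A = 1.380649e-23, 6.02214076e23

def sphere_from_s_and_D(s, D, T=293.15, eta=1.002e-3, rho_f=998.2):
    d = K_B * T / (3 * math.pi * eta * D)      # Stokes-Einstein diameter (m)
    rho_p = rho_f + 18 * eta * s / d**2        # particle density (kg/m^3)
    M = N_A * math.pi * d**3 * rho_p / 6       # molar mass (kg/mol)
    return d, rho_p, M

# toy usage: s = 20 Svedberg (2.0e-12 s), D = 4e-11 m^2/s in water at 20 C
print(sphere_from_s_and_D(2.0e-12, 4.0e-11))
```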

  9. Trace metal speciation in natural waters: Computational vs. analytical

    USGS Publications Warehouse

    Nordstrom, D. Kirk

    1996-01-01

    Improvements in the field sampling, preservation, and determination of trace metals in natural waters have made many analyses more reliable and less affected by contamination. The speciation of trace metals, however, remains controversial. Chemical model speciation calculations do not necessarily agree with voltammetric, ion exchange, potentiometric, or other analytical speciation techniques. When metal-organic complexes are important, model calculations are not usually helpful and on-site analytical separations are essential. Many analytical speciation techniques have serious interferences and only work well for a limited subset of water types and compositions. A combined approach to the evaluation of speciation could greatly reduce these uncertainties. The approach proposed would be to (1) compare and contrast different analytical techniques with each other and with computed speciation, (2) compare computed trace metal speciation with reliable measurements of solubility, potentiometry, and mean activity coefficients, and (3) compare different model calculations with each other for the same set of water analyses, especially where supplementary data on speciation already exist. A comparison and critique of analytical with chemical model speciation for a range of water samples would delineate the useful range and limitations of these different approaches to speciation. Both model calculations and analytical determinations have useful and different constraints on the range of possible speciation such that they can provide much better insight into speciation when used together. Major discrepancies in the thermodynamic databases of speciation models can be evaluated with the aid of analytical speciation, and when the thermodynamic models are highly consistent and reliable, the sources of error in the analytical speciation can be evaluated. Major thermodynamic discrepancies also can be evaluated by simulating solubility and activity coefficient data and testing various chemical models for their range of applicability. Until a comparative approach such as this is taken, trace metal speciation will remain highly uncertain and controversial.

  10. Damage of composite structures: Detection technique, dynamic response and residual strength

    NASA Astrophysics Data System (ADS)

    Lestari, Wahyu

    2001-10-01

    Reliable and accurate health monitoring techniques can prevent catastrophic failures of structures. Conventional damage detection methods are based on visual or localized experimental methods and very often require prior information concerning the vicinity of the damage or defect. The structure must also be readily accessible for inspection, and the techniques are labor intensive. In comparison to these methods, health-monitoring techniques based on the structural dynamic response offer unique information on failure of structures. However, systematic relations between the experimental data and the defect are not available, and frequently the number of vibration modes needed for an accurate identification of defects is much higher than the number of modes that can be readily identified in experiments. This motivated us to develop an experimental-data-based detection method with systematic relationships between the experimentally identified information and the analytical or mathematical model representing the defective structure. The developed technique uses changes in vibrational curvature modes and natural frequencies. To avoid misinterpretation of the identified information, we also need to understand the effects of defects on the structural dynamic response prior to developing health-monitoring techniques. In this thesis work we focus on two types of defects in composite structures, namely delamination and edge-notch-like defects. Effects of nonlinearity due to the presence of a defect and due to axial stretching are studied for beams with delamination. Once defects are detected in a structure, the next concern is determining their effects on the strength of the structure and its residual stiffness under dynamic loading. In this thesis, the energy release rate due to dynamic loading in a delaminated structure is studied, which will be a foundation toward determining the residual strength of the structure.
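    The curvature-mode idea can be illustrated with a toy sketch: damage appears as a localized change in the second spatial derivative of a mode shape. The mode shapes below are synthetic and the "damage" is an invented local perturbation, not the thesis's measured data.

```python
# Curvature-based damage localization on a synthetic beam mode shape:
# the damage index peaks where the curvature change is largest.
import numpy as np

x = np.linspace(0.0, 1.0, 201)                 # beam coordinate
healthy = np.sin(np.pi * x)                    # first bending mode
damaged = healthy + 0.002 * np.exp(-((x - 0.35) / 0.02) ** 2)  # local perturbation

dx = x[1] - x[0]
curv_h = np.gradient(np.gradient(healthy, dx), dx)
curv_d = np.gradient(np.gradient(damaged, dx), dx)

damage_index = np.abs(curv_d - curv_h)
print("suspected damage near x =", x[np.argmax(damage_index)])
```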

  11. Towards sustainable infrastructure management: knowledge-based service-oriented computing framework for visual analytics

    NASA Astrophysics Data System (ADS)

    Vatcha, Rashna; Lee, Seok-Won; Murty, Ajeet; Tolone, William; Wang, Xiaoyu; Dou, Wenwen; Chang, Remco; Ribarsky, William; Liu, Wanqiu; Chen, Shen-en; Hauser, Edd

    2009-05-01

    Infrastructure management (and its associated processes) is complex to understand and perform, and it is thus hard to make efficient, effective and informed decisions. The management involves a multi-faceted operation that requires robust data fusion, visualization and decision making. In order to protect and build sustainable critical assets, we present our on-going multi-disciplinary large-scale project that establishes the Integrated Remote Sensing and Visualization (IRSV) system with a focus on supporting bridge structure inspection and management. This project involves specific expertise from civil engineers, computer scientists, geographers, and real-world practitioners from industry, local and federal government agencies. IRSV is being designed to accommodate the essential needs from the following aspects: 1) better understanding and enforcement of complex inspection processes that can bridge the gap between evidence gathering and decision making through the implementation of an ontological knowledge engineering system; 2) aggregation, representation and fusion of complex multi-layered heterogeneous data (i.e. infrared imaging, aerial photos and ground-mounted LIDAR etc.) with domain application knowledge to support a machine-understandable recommendation system; 3) robust visualization techniques with large-scale analytical and interactive visualizations that support users' decision making; and 4) integration of these needs through the flexible Service-oriented Architecture (SOA) framework to compose and provide services on-demand. IRSV is expected to serve as a management and data visualization tool for construction deliverable assurance and infrastructure monitoring both periodically (annually, monthly, even daily if needed) as well as after extreme events.

  12. An integrative framework for sensor-based measurement of teamwork in healthcare

    PubMed Central

    Rosen, Michael A; Dietz, Aaron S; Yang, Ting; Priebe, Carey E; Pronovost, Peter J

    2015-01-01

    There is a strong link between teamwork and patient safety. Emerging evidence supports the efficacy of teamwork improvement interventions. However, the availability of reliable, valid, and practical measurement tools and strategies is commonly cited as a barrier to long-term sustainment and spread of these teamwork interventions. This article describes the potential value of sensor-based technology as a methodology to measure and evaluate teamwork in healthcare. The article summarizes the teamwork literature within healthcare, including team improvement interventions and measurement. Current applications of sensor-based measurement of teamwork are reviewed to assess the feasibility of employing this approach in healthcare. The article concludes with a discussion highlighting current application needs and gaps and relevant analytical techniques to overcome the challenges to implementation. Compelling studies exist documenting the feasibility of capturing a broad array of team input, process, and output variables with sensor-based methods. Implications of this research are summarized in a framework for development of multi-method team performance measurement systems. Sensor-based measurement within healthcare can unobtrusively capture information related to social networks, conversational patterns, physical activity, and an array of other meaningful information without having to directly observe or periodically survey clinicians. However, trust and privacy concerns present challenges that need to be overcome through engagement of end users in healthcare. Initial evidence exists to support the feasibility of sensor-based measurement to drive feedback and learning across individual, team, unit, and organizational levels. Future research is needed to refine methods, technologies, theory, and analytical strategies. PMID:25053579

  13. Pavement Performance : Approaches Using Predictive Analytics

    DOT National Transportation Integrated Search

    2018-03-23

    Acceptable pavement condition is paramount to road safety. Using predictive analytics techniques, this project attempted to develop models that provide an assessment of pavement condition based on an array of indicators that include pavement distress,...

  14. Analytical techniques of pilot scanning behavior and their application

    NASA Technical Reports Server (NTRS)

    Harris, R. L., Sr.; Glover, B. J.; Spady, A. A., Jr.

    1986-01-01

    The state of the art of oculometric data analysis techniques and their applications in certain research areas such as pilot workload, information transfer provided by various display formats, crew role in automated systems, and pilot training are documented. These analytical techniques produce the following data: real-time viewing of the pilot's scanning behavior, average dwell times, dwell percentages, instrument transition paths, dwell histograms, and entropy rate measures. These types of data are discussed, and overviews of the experimental setup, data analysis techniques, and software are presented. A glossary of terms frequently used in pilot scanning behavior and a bibliography of reports on related research sponsored by NASA Langley Research Center are also presented.
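    Two of the metrics named above, dwell percentages and instrument transition paths, reduce to simple bookkeeping over a fixation sequence; the instrument names and dwell times below are invented for the example.

```python
# Dwell percentages and instrument transition counts from a sequence of
# (instrument, dwell time in seconds) fixations.
from collections import Counter

fixations = [("ADI", 1.2), ("ASI", 0.4), ("ADI", 0.9),
             ("ALT", 0.6), ("ADI", 1.1), ("ASI", 0.5)]

total = sum(t for _, t in fixations)
dwell_pct = {inst: 100 * sum(t for i, t in fixations if i == inst) / total
             for inst, _ in fixations}

seq = [inst for inst, _ in fixations]
transitions = Counter(zip(seq, seq[1:]))       # instrument-to-instrument paths

print("dwell %:", dwell_pct)
print("transition counts:", transitions)
```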

  15. Evaluation of the matrix effect on gas chromatography--mass spectrometry with carrier gas containing ethylene glycol as an analyte protectant.

    PubMed

    Fujiyoshi, Tomoharu; Ikami, Takahito; Sato, Takashi; Kikukawa, Koji; Kobayashi, Masato; Ito, Hiroshi; Yamamoto, Atsushi

    2016-02-19

    The consequences of matrix effects in GC are a major issue of concern in pesticide residue analysis. The aim of this study was to evaluate the applicability of an analyte protectant generator in pesticide residue analysis using a GC-MS system. The technique is based on continuous introduction of ethylene glycol into the carrier gas. Ethylene glycol as an analyte protectant effectively compensated for the matrix effects in agricultural product extracts. All peak intensities were increased by this technique without affecting GC-MS performance. Calibration curves obtained with ethylene glycol on GC-MS systems with various degrees of contamination were compared, and similar response enhancements were observed. This result suggests a convenient multi-residue GC-MS method using an analyte protectant generator instead of the conventional compensation method for matrix-induced response enhancement, in which a mixture of analyte protectants is added to both neat and sample solutions. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Chemical Detection and Identification Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.

    2002-01-01

    Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capabilities with minimal requirements of volume, weight, and consumables. Advances in technology may be achieved by increasing the amount of information acquired by a given technique with greater analytical capabilities and by miniaturization of proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography. These efforts to expand the analytical capabilities of GC technology are focused on the development of detectors for the GC which provide sample identification independent of GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).

  17. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    PubMed Central

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey attempts to cover the state of the art from 2012 up to 2017. PMID:29850370

  18. Decryption with incomplete cyphertext and multiple-information encryption in phase space.

    PubMed

    Xu, Xiaobin; Wu, Quanying; Liu, Jun; Situ, Guohai

    2016-01-25

    Recently, we have demonstrated that information encryption in phase space offers security enhancement over traditional encryption schemes operating in real space. However, there is also an important issue with this technique: it increases the cost of data transmission and storage. To address this issue, here we investigate the problem of decryption using incomplete cyphertext. We show that the analytic solution under the traditional framework sets the lower limit of decryption performance. More importantly, we demonstrate that one needs only a small amount of cyphertext to recover the plaintext signal faithfully using compressive sensing, meaning that the amount of data that needs to be transmitted and stored can be significantly reduced. This enables multiple-information encryption, so that the system bandwidth can be used more effectively. We also provide an optical experimental result to demonstrate the plaintext recovered in phase space.
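    The compressive-sensing recovery step can be sketched generically (this is not the paper's optical phase-space setup): recover a sparse signal from fewer random measurements than unknowns, here with scikit-learn's orthogonal matching pursuit.

```python
# Generic compressive sensing demo: a k-sparse signal of length n is
# recovered from m < n random linear measurements.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                       # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x                                  # "incomplete cyphertext" analogue

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k).fit(A, y)
print("max recovery error:", np.abs(omp.coef_ - x).max())
```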

  19. Condensation nucleation light scattering detection with ion chromatography for direct determination of glyphosate and its metabolite in water.

    PubMed

    You, Jing; Koropchak, John A

    2003-03-14

    An ion chromatography-condensation nucleation light scattering detection (IC-CNLSD) method was successfully used to directly analyze glyphosate, a polar pesticide, and aminomethylphosphonic acid, the major metabolite of glyphosate, in water without the need for pre-treatment or derivatization. CNLSD gave an LOD of 53 ng/ml for glyphosate, which is much lower than the maximum contaminant level of 700 ng/ml for drinking water issued by the US Environmental Protection Agency. Spiked analytes in different matrices were tested. A diluted commercial herbicide containing glyphosate was also evaluated. Compared to other reported methods, the IC-CNLSD method requires no sample derivatization, pre-concentration, or mobile-phase conductivity suppression. It is simple, fast and inexpensive. IC-CNLSD is an ideal direct detection technique for such pesticides lacking chromophores or fluorophores.

  20. Immunoaffinity capillary electrophoresis as a powerful strategy for the quantification of low-abundance biomarkers, drugs, and metabolites in biological matrices

    PubMed Central

    Guzman, Norberto A.; Blanc, Timothy; Phillips, Terry M.

    2009-01-01

    In the last few years, there has been a greater appreciation by the scientific community of how separation science has contributed to the advancement of biomedical research. Despite past contributions in facilitating several biomedical breakthroughs, separation sciences still urgently need the development of improved methods for the separation and detection of biological and chemical substances. The challenging task of quantifying small molecules and biomolecules found in low abundance in complex matrices (e.g., serum) is a particular area in need of new high-efficiency techniques. The tandem or on-line coupling of highly selective antibody capture agents with the high resolving power of CE is being recognized as a powerful analytical tool for the enrichment and quantification of ultra-low-abundance analytes in complex matrices. This development will have a significant impact on the identification and characterization of many putative biomarkers and on biomedical research in general. Immunoaffinity CE (IACE) technology is rapidly emerging as the most promising method for the analysis of low-abundance biomarkers; its power comes from a three-step procedure: (i) bioselective adsorption and (ii) subsequent recovery of compounds from an immobilized affinity ligand, followed by (iii) separation of the enriched compounds. This technology is highly suited to automation and can be engineered as a multiplex instrument capable of routinely performing hundreds of assays per day. Furthermore, a significant enhancement in sensitivity can be achieved for the purified and enriched affinity-targeted analytes. Thus, a compound that exists in a complex biological matrix at a concentration far below its LOD is easily brought to well within its range of quantification. The present review summarizes several applications of IACE, as well as a chronological description of the improvements made in the fabrication of the analyte concentrator-microreactor device leading to the development of a multidimensional biomarker analyzer. PMID:18646282

  1. A Development Strategy for Creating a Suite of Reference Materials for the in-situ Microanalysis of Non-conventional Raw Materials

    NASA Astrophysics Data System (ADS)

    Renno, A. D.; Merchel, S.; Michalak, P. P.; Munnik, F.; Wiedenbeck, M.

    2010-12-01

    Recent economic trends regarding the supply of rare metals readily justify scientific research into non-conventional raw materials, where a particular need is a better understanding of the relationship between mineralogy, microstructure and the distribution of key metals within ore deposits (geometallurgy). Achieving these goals will require extensive use of in-situ microanalytical techniques capable of spatially resolving material heterogeneities, which can be key to understanding better resource utilization. The availability of certified reference materials (CRMs) is an essential prerequisite for (1) validating new analytical methods, (2) demonstrating data quality to contracting authorities, (3) supporting method development and instrument calibration, and (4) establishing traceability between new analytical approaches and existing data sets. This need has led to the granting of funding by the European Union and the German Free State of Saxony for a program to develop such reference materials. This effort will apply the following strategies during the selection of the phases: (1) use exclusively synthetic minerals, thereby providing large volumes of homogeneous starting material; (2) focus on matrices which are capable of incorporating many 'important' elements while avoiding exotic compositions which would not be optimal matrix matches; (3) emphasise those phases which remain stable during the various microanalytical procedures. This initiative will assess the homogeneity of the reference materials at sampling sizes ranging between 50 and 1 µm; it is also intended to document crystal-structural homogeneity, as this may potentially impact specific analytical methods. As far as possible, both definitive methods and methods involving matrix corrections will be used for determining the compositions of the individual materials. A critical challenge will be the validation of the determination of analyte concentrations at sub-µg sampling masses. It is planned to cooperate with those who are interested in the development of such reference materials, and we invite them to take part in round-robin exercises.

  2. Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.

    PubMed

    Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S

    2016-04-07

    Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate whether heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs); cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices; and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regressions. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
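    The recommended workflow, fitting a power model of variance from replicates and then weighting the calibration by the inverse predicted standard deviation, can be sketched as follows on synthetic heteroskedastic calibration data:

```python
# Weighted calibration with a power model of variance, sd = a * mean^b,
# fitted from replicate standard deviations on synthetic data.
import numpy as np

rng = np.random.default_rng(1)
conc = np.repeat([1.0, 2, 5, 10, 20, 50], 8)
true = 3.0 * conc
signal = true + rng.normal(scale=0.05 * true**0.9)   # heteroskedastic noise

# fit the power model on the replicate groups (log-log regression)
levels = np.unique(conc)
means = np.array([signal[conc == c].mean() for c in levels])
sds = np.array([signal[conc == c].std(ddof=1) for c in levels])
b, log_a = np.polyfit(np.log(means), np.log(sds), 1)

# weighted linear calibration: numpy expects w = 1/sigma per point
sigma = np.exp(log_a) * signal**b
slope, intercept = np.polyfit(conc, signal, 1, w=1.0 / sigma)
print(f"slope={slope:.3f} intercept={intercept:.3f} b={b:.2f}")
```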

  3. Student Attitudes toward Learning Analytics in Higher Education: “The Fitbit Version of the Learning World”

    PubMed Central

    Roberts, Lynne D.; Howell, Joel A.; Seaman, Kristen; Gibson, David C.

    2016-01-01

    Increasingly, higher education institutions are exploring the potential of learning analytics to predict student retention, understand learning behaviors, and improve student learning through providing personalized feedback and support. The technical development of learning analytics has outpaced consideration of ethical issues surrounding their use. Of particular concern is the absence of the student voice in decision-making about learning analytics. We explored higher education students' knowledge, attitudes, and concerns about big data and learning analytics through four focus groups (N = 41). Thematic analysis of the focus group transcripts identified six key themes. The first theme, “Uninformed and Uncertain,” represents students' lack of knowledge about learning analytics prior to the focus groups. Following the provision of information, viewing of videos, and discussion of learning analytics scenarios, three further themes, “Help or Hindrance to Learning,” “More than a Number,” and “Impeding Independence,” represented students' perceptions of the likely impact of learning analytics on their learning. “Driving Inequality” and “Where Will it Stop?” represent ethical concerns raised by the students about the potential for inequity, bias and invasion of privacy and the need for informed consent. A key tension to emerge was how “personal” vs. “collective” purposes or principles can intersect with “uniform” vs. “autonomous” activity. The findings highlight the need to engage students in the decision-making process about learning analytics. PMID:28066285

  4. Improving entrepreneurial opportunity recognition through web content analytics

    NASA Astrophysics Data System (ADS)

    Bakar, Muhamad Shahbani Abu; Azmi, Azwiyati

    2017-10-01

    The ability to recognize and develop an opportunity into a venture defines an entrepreneur. Research in opportunity recognition has been robust and focuses mostly on explaining the processes involved in opportunity recognition. Factors such as prior knowledge and cognitive and creative capabilities are shown to affect opportunity recognition in entrepreneurs. Prior knowledge in areas such as customer problems, ways to serve the market, and technology has been shown in various studies to be a factor that facilitates entrepreneurs in identifying and recognizing opportunities. Findings from research also show that experienced entrepreneurs search and scan for information to discover opportunities. Searching and scanning for information has also been shown to help novice entrepreneurs who lack prior knowledge to narrow this gap and enable them to better identify and recognize opportunities. There is less focus in research on finding empirically proven techniques and methods to develop and enhance opportunity recognition in student entrepreneurs. This is important as the country pushes for more graduate entrepreneurs who can drive the economy. This paper discusses the Opportunity Recognition Support System (ORSS), an information support system to help student entrepreneurs in particular to identify and recognize business opportunities. The ORSS aims to provide the necessary knowledge for student entrepreneurs to be able to better identify and recognize opportunities. Applying design research, theories in opportunity recognition are used to identify the requirements for the support system, and the requirements in turn dictate its design. The paper proposes the use of web content mining and analytics as the two core components and techniques for the support system. Web content mining can mine the vast knowledge repositories available on the internet, and analytics can provide entrepreneurs with further insights into the information needed to recognize opportunities in a given market or industry.
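    One plausible first step for the web-content-mining component, sketched on a toy corpus with scikit-learn's TF-IDF vectorizer (the pages and terms are invented; the paper does not prescribe this exact tooling):

```python
# Extract characteristic terms from fetched pages with TF-IDF, a common
# first step before deeper opportunity-related analytics.
from sklearn.feature_extraction.text import TfidfVectorizer

pages = [
    "delivery startups face rising logistics costs in rural markets",
    "customers complain about slow delivery and poor tracking",
    "new sensor technology cuts cold-chain logistics spoilage",
]

vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(pages)

terms = vec.get_feature_names_out()
for row in tfidf.toarray():
    top = row.argsort()[-3:][::-1]        # three highest-weighted terms per page
    print([terms[i] for i in top])
```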

  5. Diving into the analysis of time-depth recorder and behavioural data records: A workshop summary

    NASA Astrophysics Data System (ADS)

    Womble, Jamie N.; Horning, Markus; Lea, Mary-Anne; Rehberg, Michael J.

    2013-04-01

    Directly observing the foraging behavior of animals in the marine environment can be extremely challenging, if not impossible, as such behavior often takes place beneath the surface of the ocean and in extremely remote areas. In lieu of direct observation, data from time-depth recorders and other behavioral data recording devices are commonly used to describe and quantify the behavior of fish, squid, seabirds, sea turtles, pinnipeds, and cetaceans. However, definitions of behavioral units and analytical approaches often vary substantially, which may influence results and limit our ability to compare behaviors of interest across taxonomic groups and geographic regions. A workshop was convened in association with the Fourth International Symposium on Bio-logging in Hobart, Tasmania, on 8 March 2011, with the goal of providing a forum for the presentation, review, and discussion of various methods and approaches used to describe and analyze time-depth recorder and associated behavioral data records. The international meeting brought together 36 participants from 14 countries and a diversity of backgrounds, including scientists from academia and government, graduate students, post-doctoral fellows, and developers of electronic tagging technology and analysis software. The specific objectives of the workshop were to host a series of invited presentations followed by discussion sessions focused on (1) identifying behavioral units and metrics that are suitable for empirical studies, (2) reviewing analytical approaches and techniques that can be used to objectively classify behavior, and (3) identifying cases in which temporal autocorrelation structure is useful for identifying behaviors of interest. Outcomes of the workshop included highlighting the need to better define behavioral units and to devise more standardized processing and analytical techniques in order to ensure that results are comparable across studies and taxonomic groups.
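
    As a concrete example of why shared behavioral units matter, the minimal sketch below segments a time-depth record into dives with a simple surface threshold. The 1 m threshold and 1 Hz sampling rate are arbitrary assumptions, which is exactly the kind of unstandardized choice the workshop flagged.

        def find_dives(depths, threshold=1.0, dt=1.0):
            """Return (start_s, end_s, max_depth) for each maximal run below threshold."""
            dives, start = [], None
            for i, d in enumerate(depths):
                if d > threshold and start is None:
                    start = i
                elif d <= threshold and start is not None:
                    seg = depths[start:i]
                    dives.append((start * dt, i * dt, max(seg)))
                    start = None
            if start is not None:                       # record ends mid-dive
                dives.append((start * dt, len(depths) * dt, max(depths[start:])))
            return dives

        print(find_dives([0, 0.5, 3, 12, 18, 9, 0.4, 0, 2, 6, 1.2, 0.3]))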

  6. Hollow-fiber flow field-flow fractionation with multi-angle laser scattering detection for aggregation studies of therapeutic proteins.

    PubMed

    Reschiglian, P; Roda, B; Zattoni, A; Tanase, M; Marassi, V; Serani, S

    2014-02-01

    The rapid development of protein-based pharmaceuticals highlights the need for robust analytical methods to ensure their quality and stability. Among proteins used in pharmaceutical applications, an important and ever-increasing role is played by monoclonal antibodies and large proteins, which are often modified to enhance their activity or stability when used as drugs. The bioactivity and stability of these proteins are closely related to the maintenance of their complex structure, which, however, is influenced by many external factors that can cause degradation and/or aggregation. The presence of aggregates in these drugs can reduce their bioactivity and bioavailability and induce immunogenicity. Choosing the proper analytical method for the analysis of aggregates is fundamental to understanding their size range, their amount, and whether they are present in the sample as a result of aggregation or as an artifact of the method itself. Size exclusion chromatography is one of the most important techniques for the quality control of pharmaceutical proteins; however, its application is limited to relatively low molar mass aggregates. Among the techniques for the size characterization of proteins, field-flow fractionation (FFF) represents a competitive choice because of its soft mechanism, due to the absence of a stationary phase, and its applicability over a broader size range, from nanometer- to micrometer-sized analytes. In this paper, the microcolumn variant of FFF, hollow-fiber flow FFF, was coupled online with multi-angle light scattering, and a method for the characterization of aggregates with high reproducibility and a low limit of detection was demonstrated using an avidin derivative as a model sample.

  7. Applications of surface analytical techniques in Earth Sciences

    NASA Astrophysics Data System (ADS)

    Qian, Gujie; Li, Yubiao; Gerson, Andrea R.

    2015-03-01

    This review covers a wide range of surface analytical techniques: X-ray photoelectron spectroscopy (XPS), scanning photoelectron microscopy (SPEM), photoemission electron microscopy (PEEM), dynamic and static secondary ion mass spectroscopy (SIMS), electron backscatter diffraction (EBSD), atomic force microscopy (AFM). Others that are relatively less widely used but are also important to the Earth Sciences are also included: Auger electron spectroscopy (AES), low energy electron diffraction (LEED) and scanning tunnelling microscopy (STM). All these techniques probe only the very top sample surface layers (sub-nm to several tens of nm). In addition, we also present several other techniques i.e. Raman microspectroscopy, reflection infrared (IR) microspectroscopy and quantitative evaluation of minerals by scanning electron microscopy (QEMSCAN) that penetrate deeper into the sample, up to several μm, as all of them are fundamental analytical tools for the Earth Sciences. Grazing incidence synchrotron techniques, sensitive to surface measurements, are also briefly introduced at the end of this review. (Scanning) transmission electron microscopy (TEM/STEM) is a special case that can be applied to characterisation of mineralogical and geological sample surfaces. Since TEM/STEM is such an important technique for Earth Scientists, we have also included it to draw attention to the capability of TEM/STEM applied as a surface-equivalent tool. While this review presents most of the important techniques for the Earth Sciences, it is not an all-inclusive bibliography of those analytical techniques. Instead, for each technique that is discussed, we first give a very brief introduction about its principle and background, followed by a short section on approaches to sample preparation that are important for researchers to appreciate prior to the actual sample analysis. We then use examples from publications (and also some of our known unpublished results) within the Earth Sciences to show how each technique is applied and used to obtain specific information and to resolve real problems, which forms the central theme of this review. Although this review focuses on applications of these techniques to study mineralogical and geological samples, we also anticipate that researchers from other research areas such as Material and Environmental Sciences may benefit from this review.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenik, E.A.

    X-ray microanalysis in an analytical electron microscope is a proven technique for the measurement of solute segregation in alloys. Solute segregation under equilibrium or nonequilibrium conditions can strongly influence material performance. X-ray microanalysis in an analytical electron microscope provides an alternative technique for measuring grain boundary segregation, as well as segregation to other defects not accessible to Auger analysis. The utility of the technique is demonstrated by measurements of equilibrium segregation to boundaries in an antimony-containing stainless steel, including the variation of segregation with boundary character, and by measurements of nonequilibrium segregation to boundaries and dislocations in an ion-irradiated stainless steel.

  9. Solid Lubrication Fundamentals and Applications. Chapter 2

    NASA Technical Reports Server (NTRS)

    Miyoshi, Kazuhisa

    1998-01-01

    This chapter describes powerful analytical techniques capable of sampling tribological surfaces and solid-film lubricants. Some of these techniques may also be used to determine the locus of failure in a bonded structure or coated substrate; such information is important when seeking improved adhesion between a solid-film lubricant and a substrate and when seeking improved performance and long life expectancy of solid lubricants. Many examples are given here and throughout the book on the nature and character of solid surfaces and their significance in lubrication, friction, and wear. The analytical techniques used include the latest spectroscopic methods.

  10. Langmuir probe analysis in electronegative plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bredin, Jerome, E-mail: jerome.bredin@lpp.polytechnique.fr; Chabert, Pascal; Aanesland, Ane

    2014-12-15

    This paper compares two methods to analyze Langmuir probe data obtained in electronegative plasmas. The techniques are developed to allow investigations in plasmas where the electronegativity α₀ = n₋/nₑ (the ratio between the negative ion and electron densities) varies strongly. The first technique uses an analytical model to express the Langmuir probe current-voltage (I-V) characteristic and its second derivative as a function of the electron and ion densities (nₑ, n₊, n₋), temperatures (Tₑ, T₊, T₋), and masses (mₑ, m₊, m₋). The analytical curves are fitted to the experimental data by adjusting these variables and parameters. To reduce the number of fitted parameters, the ion masses are assumed constant within the source volume, and quasi-neutrality is assumed everywhere. In this theory, Maxwellian distributions are assumed for all charged species. We show that this data analysis can predict the various plasma parameters within 5–10%, including the ion temperatures when α₀ > 100. However, the method is tedious, time consuming, and requires a precise measurement of the energy distribution function. A second technique is therefore developed for easier access to the electron and ion densities, but it does not give access to the ion temperatures. Here, only the measured I-V characteristic is needed. The electron density, temperature, and ion saturation current for positive ions are determined by classical probe techniques. The electronegativity α₀ and the ion densities are deduced via an iterative method, since these variables are coupled via the modified Bohm velocity. For both techniques, a Child-law sheath model for cylindrical probes has been developed and is presented to emphasize the importance of this model for small cylindrical Langmuir probes.
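
    A minimal sketch of the iterative step of the second technique: the Bohm velocity depends on the electronegativity, which in turn depends on the positive-ion density inferred from the ion saturation current, so the two are solved by fixed-point iteration. The modified Bohm-speed expression, the 0.61 sheath-edge factor, and all numerical inputs below are textbook-style assumptions of this sketch, not values from the paper.

        import math

        E = 1.602e-19   # elementary charge [C]

        def iterate_alpha(I_sat, A_probe, n_e, Te_eV, M_ion, gamma=15.0, tol=1e-6):
            """Return (alpha, n_plus) by fixed-point iteration; gamma = Te/T-."""
            alpha = 0.0
            for _ in range(200):
                u_bohm = math.sqrt(E * Te_eV / M_ion) \
                         * math.sqrt((1 + alpha) / (1 + alpha * gamma))
                n_plus = I_sat / (0.61 * E * A_probe * u_bohm)
                alpha_new = max(n_plus - n_e, 0.0) / n_e    # quasi-neutrality
                if abs(alpha_new - alpha) < tol:
                    break
                alpha = alpha_new
            return alpha, n_plus

        # Hypothetical numbers: a 1 mm^2 probe in an O2-like plasma.
        print(iterate_alpha(I_sat=2e-6, A_probe=1e-6, n_e=1e15, Te_eV=2.0, M_ion=5.3e-26))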

  11. Preparing Earth Data Scientists for 'The Sexiest Job of the 21st Century'

    NASA Technical Reports Server (NTRS)

    Kempler, Steven

    2014-01-01

    What exactly do Earth Data Scientists do, and what do they need to know to do it? There is not one simple answer, but there are many complex answers. Data science, and data analytics, are new and nebulous, and take on different characteristics depending on the subject matter being analyzed, the maturity of the research, and whether the employed subject-specific analytics is descriptive, diagnostic, discoveritive, predictive, or prescriptive in nature. In addition, in a thus far business-driven paradigm shift, university curricula teaching data analytics pertaining to Earth science have, as a whole, lagged behind and/or varied in approach. This presentation attempts to break down and identify the many activities that Earth Data Scientists, as a profession, encounter, as well as provide case studies of specific Earth Data Scientist and data analytics efforts. I will also address the educational preparation that best equips future Earth Data Scientists to further Earth science heterogeneous data research and applications analysis. The goal of this presentation is to describe the actual need for Earth Data Scientists and the practical skills needed to perform Earth science data analytics, thus hoping to initiate discussion addressing a baseline set of expertise for educating future Earth Data Scientists.

  12. Mercury-induced fragmentation of n-decane and n-undecane in positive mode ion mobility spectrometry.

    PubMed

    Gunzer, F

    2015-09-21

    Ion mobility spectrometry is a well-known technique for trace gas analysis. Using soft ionization techniques, fragmentation of analytes is normally not observed, with the consequence that analyte spectra of single substances are quite simple, i.e. showing in general only one peak. If the concentration is high enough, an extra cluster peak involving two analyte molecules can often be observed. When investigating n-alkanes, different results regarding the number of peaks in the spectra have been obtained in the past using this spectrometric technique. Here we present results obtained when analyzing n-alkanes (n-hexane to n-undecane) with a pulsed electron source, which show no fragmentation or clustering at all. However, when investigating a mixture of mercury and an n-alkane, a situation quite typical in the oil and gas industry, a strong fragmentation and cluster formation involving these fragments has been observed exclusively for n-decane and n-undecane.

  13. [Recent Development of Atomic Spectrometry in China].

    PubMed

    Xiao, Yuan-fang; Wang, Xiao-hua; Hang, Wei

    2015-09-01

    As an important part of modern analytical techniques, atomic spectrometry occupies a decisive position in the analytical field as a whole. The development of atomic spectrometry also reflects the continuous reform and innovation of analytical techniques. In the past fifteen years, atomic spectrometry has experienced rapid development and been applied widely in many fields in China. This review surveys that development and its remarkable achievements. It covers several branches of atomic spectrometry, including atomic emission spectrometry (AES), atomic absorption spectrometry (AAS), atomic fluorescence spectrometry (AFS), X-ray fluorescence spectrometry (XRF), and atomic mass spectrometry (AMS). Emphasis is put on innovations in detection methods and their applications in related fields, including environmental samples, biological samples, food and beverages, and geological materials. There is also a brief introduction to the hyphenated techniques utilized in atomic spectrometry. Finally, the prospects of atomic spectrometry in China are forecast.

  14. Three-Dimensional Dynamic Deformation Measurements Using Stereoscopic Imaging and Digital Speckle Photography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prentice, H. J.; Proud, W. G.

    2006-07-28

    A technique has been developed to determine experimentally the three-dimensional displacement field on the rear surface of a dynamically deforming plate. The technique combines speckle analysis with stereoscopy, using a modified angular-lens method: this incorporates split-frame photography and a simple method by which the effective lens separation can be adjusted and calibrated in situ. Whilst several analytical models exist to predict deformation in extended or semi-infinite targets, the non-trivial nature of the wave interactions complicates the generation and development of analytical models for targets of finite depth. By interrogating specimens experimentally to acquire three-dimensional strain data points, both analytical and numerical model predictions can be verified more rigorously. The technique is applied to the quasi-static deformation of a rubber sheet and dynamically to mild steel sheets of various thicknesses.
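
    The sketch below shows, in toy form, the two ingredients such a measurement combines: locating a speckle subset by normalised cross-correlation (in-plane displacement) and converting stereo disparity to depth by triangulation. The focal length, baseline, and synthetic speckle image are assumptions for illustration, not the paper's optical setup.

        import numpy as np

        def ncc_shift(ref, img):
            """Brute-force integer-pixel location of `ref` inside `img` by NCC."""
            h, w = ref.shape
            best, argbest = -2.0, (0, 0)
            r = (ref - ref.mean()) / ref.std()
            for dy in range(img.shape[0] - h + 1):
                for dx in range(img.shape[1] - w + 1):
                    sub = img[dy:dy + h, dx:dx + w]
                    score = np.mean((sub - sub.mean()) / sub.std() * r)
                    if score > best:
                        best, argbest = score, (dy, dx)
            return argbest

        def depth_from_disparity(d_pixels, f_pixels=2000.0, baseline_m=0.1):
            return f_pixels * baseline_m / d_pixels

        rng = np.random.default_rng(0)
        scene = rng.random((64, 64))                 # synthetic speckle image
        patch = scene[20:36, 24:40]                  # 16x16 speckle subset
        print(ncc_shift(patch, scene))               # -> (20, 24)
        print(depth_from_disparity(25.0))            # -> 8.0 m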

  15. Laser desorption ionization mass spectrometry: Recent progress in matrix-free and label-assisted techniques.

    PubMed

    Mandal, Arundhoti; Singha, Monisha; Addy, Partha Sarathi; Basak, Amit

    2017-10-13

    Over the last three decades, MALDI-based mass spectrometry has become an important analytical tool. It is a gentle ionization technique, usually applied to detect and characterize analytes with high molecular weights, such as proteins and other macromolecules. The earlier difficulty in detecting low-molecular-weight analytes, such as small organic molecules and metal-ion complexes, arose from the cluster of matrix-derived peaks in the low-molecular-weight region. To detect such molecules and metal-ion complexes, a four-pronged strategy has been developed: the use of alternative matrix materials, the employment of new surface materials that require no matrix, the use of metabolites that directly absorb the laser light, and laser-absorbing label-assisted LDI-MS (popularly known as LALDI-MS). This review highlights the developments in all these strategies, with special emphasis on LALDI-MS. © 2017 Wiley Periodicals, Inc.

  16. Interference by the activated sludge matrix on the analysis of soluble microbial products in wastewater.

    PubMed

    Potvin, Christopher M; Zhou, Hongde

    2011-11-01

    The objective of this study was to demonstrate the complex matrix effects caused by chemical constituents of activated sludge on the analysis of key soluble microbial products (SMP), including proteins, humics, carbohydrates, and polysaccharides. Emphasis was placed on comparing the commonly used standard curve technique with standard addition (SA), a technique that differs in that the analytical responses are measured for sample solutions spiked with known quantities of analytes. The results showed that SA greatly improved the compensation for SMP recovery, and thus measurement accuracy, by correcting for matrix effects. Analyte recovery was found to be highly dependent on sample dilution, and changed with extraction technique, storage conditions, and sample composition. Storage of sample extracts by freezing changed SMP concentrations dramatically, as did storage at 4°C for as little as one day. Copyright © 2011 Elsevier Ltd. All rights reserved.
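
    A worked miniature of the SA calculation referred to above: the sample is spiked with known analyte amounts, the response is fitted linearly, and the native concentration is read off the x-intercept. The numbers are invented for illustration.

        import numpy as np

        spikes = np.array([0.0, 10.0, 20.0, 30.0])       # added analyte, mg/L
        response = np.array([0.21, 0.35, 0.50, 0.64])    # instrument signal

        slope, intercept = np.polyfit(spikes, response, 1)
        native_conc = intercept / slope                  # |x-intercept| of the fit
        print(f"estimated native SMP concentration: {native_conc:.1f} mg/L")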

  17. A new multi-step technique with differential transform method for analytical solution of some nonlinear variable delay differential equations.

    PubMed

    Benhammouda, Brahim; Vazquez-Leal, Hector

    2016-01-01

    This work presents an analytical solution of some nonlinear delay differential equations (DDEs) with variable delays. Such DDEs are difficult to treat numerically and cannot be solved by existing general-purpose codes. A new method of steps combined with the differential transform method (DTM) is proposed as a powerful tool to solve these DDEs. The method of steps reduces the DDEs to ordinary differential equations, which are then solved by the DTM. Furthermore, we show that the solutions can be improved by the Laplace-Padé resummation method. Two examples are presented to show the efficiency of the proposed technique. The main advantage of this technique is its simple procedure, based on a few straightforward steps, and the fact that it can be combined with any analytical method other than the DTM, such as the homotopy perturbation method.
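
    To make the method-of-steps reduction concrete, the sketch below solves the standard constant-delay test equation y'(t) = -y(t - 1) with history y(t) = 1 for t ≤ 0: on each interval the delayed term is a known function, so the DDE collapses to an ODE. Note the assumptions: each segment ODE is integrated numerically with SciPy rather than by the paper's DTM, and the test equation has a constant rather than variable delay, so this illustrates only the reduction step.

        from scipy.integrate import solve_ivp

        tau, n_steps = 1.0, 4
        history = lambda t: 1.0                      # y(t) for t <= 0
        segments, y0 = [history], 1.0

        for k in range(n_steps):
            delayed = segments[k]                    # y(t - tau) is already known here
            rhs = lambda t, y, d=delayed: [-d(t - tau)]
            sol = solve_ivp(rhs, (k * tau, (k + 1) * tau), [y0],
                            dense_output=True, max_step=0.05)
            segments.append(lambda t, s=sol.sol: float(s(t)[0]))
            y0 = float(sol.y[0, -1])

        print(f"y(4) = {y0:.4f}")                    # exact value: -5/24 = -0.2083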

  18. Analytic double product integrals for all-frequency relighting.

    PubMed

    Wang, Rui; Pan, Minghao; Chen, Weifeng; Ren, Zhong; Zhou, Kun; Hua, Wei; Bao, Hujun

    2013-07-01

    This paper presents a new technique for real-time relighting of static scenes with all-frequency shadows from complex lighting and highly specular reflections from spatially varying BRDFs. The key idea is to depict the boundaries of visible regions using piecewise linear functions, and convert the shading computation into double product integrals—the integral of the product of lighting and BRDF on visible regions. By representing lighting and BRDF with spherical Gaussians and approximating their product using Legendre polynomials locally in visible regions, we show that such double product integrals can be evaluated in an analytic form. Given the precomputed visibility, our technique computes the visibility boundaries on the fly at each shading point, and performs the analytic integral to evaluate the shading color. The result is a real-time all-frequency relighting technique for static scenes with dynamic, spatially varying BRDFs, which can generate more accurate shadows than the state-of-the-art real-time PRT methods.
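
    The essence of the analytic integration step can be seen in one dimension: once the product of lighting and BRDF stand-ins is expanded in Legendre polynomials, its integral over the orthogonality interval falls out of the degree-0 coefficient alone, since every higher-order polynomial integrates to zero there. The Gaussian stand-ins and fit degree below are assumptions of this sketch, not the paper's spherical-Gaussian machinery.

        import numpy as np
        from numpy.polynomial import legendre as L

        lighting = lambda x: np.exp(-4.0 * (x - 0.2) ** 2)   # lighting stand-in
        brdf     = lambda x: np.exp(-9.0 * (x + 0.1) ** 2)   # BRDF stand-in

        x = np.linspace(-1.0, 1.0, 400)
        coeffs = L.legfit(x, lighting(x) * brdf(x), deg=10)

        # Every P_n with n >= 1 integrates to zero over [-1, 1], so the
        # integral of the fitted series is just 2 * c_0.
        analytic = 2.0 * coeffs[0]
        xs = np.linspace(-1.0, 1.0, 20001)
        numeric = np.mean(lighting(xs) * brdf(xs)) * 2.0     # dense Riemann check
        print(analytic, numeric)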

  19. Analytics for Cyber Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plantenga, Todd.; Kolda, Tamara Gibson

    2011-06-01

    This report provides a brief survey of analytics tools considered relevant to cyber network defense (CND). Ideas and tools come from fields such as statistics, data mining, and knowledge discovery. Some analytics are considered standard mathematical or statistical techniques, while others reflect current research directions. In all cases the report attempts to explain the relevance to CND with brief examples.

  20. Visual analysis of online social media to open up the investigation of stance phenomena

    PubMed Central

    Kucher, Kostiantyn; Schamp-Bjerede, Teri; Kerren, Andreas; Paradis, Carita; Sahlgren, Magnus

    2015-01-01

    Online social media are a perfect text source for stance analysis. Stance in human communication is concerned with speaker attitudes, beliefs, feelings and opinions. Expressions of stance are associated with the speakers' view of what they are talking about and what is up for discussion and negotiation in the intersubjective exchange. Taking stance is thus crucial for the social construction of meaning. Increased knowledge of stance can be useful for many application fields such as business intelligence, security analytics, or social media monitoring. In order to process large amounts of text data for stance analyses, linguists need interactive tools to explore the textual sources as well as the processed data based on computational linguistics techniques. Both original texts and derived data are important for refining the analyses iteratively. In this work, we present a visual analytics tool for online social media text data that can be used to open up the investigation of stance phenomena. Our approach complements traditional linguistic analysis techniques and is based on the analysis of utterances associated with two stance categories: sentiment and certainty. Our contributions include (1) the description of a novel web-based solution for analyzing the use and patterns of stance meanings and expressions in human communication over time; and (2) specialized techniques used for visualizing analysis provenance and corpus overview/navigation. We demonstrate our approach by means of text media on a highly controversial scandal with regard to expressions of anger and provide an expert review from linguists who have been using our tool. PMID:29249903

  2. Surface enhanced Raman spectroscopy (SERS) from a molecule adsorbed on a nanoscale silver particle cluster in a holographic plate

    NASA Astrophysics Data System (ADS)

    Jusinski, Leonard E.; Bahuguna, Ramen; Das, Amrita; Arya, Karamjeet

    2006-02-01

    Surface enhanced Raman spectroscopy has become a viable technique for the detection of single molecules. Its high sensitivity is due to the very large (up to 14 orders of magnitude) enhancement in the Raman cross section when a molecule is adsorbed on a metal nanoparticle cluster. We report here SERS (Surface Enhanced Raman Spectroscopy) experiments performed by adsorbing analyte molecules on nanoscale silver particle clusters within the gelatin layer of commercially available holographic plates which have been developed and fixed. The Ag particles range in size between 5 and 30 nanometers (nm). Sample preparation was performed by immersing the prepared holographic plate in an analyte solution for a few minutes. We report the production of SERS signals from Rhodamine 6G (R6G) molecules at nanomolar concentration. These measurements demonstrate a fast, low-cost, reproducible technique for producing SERS substrates in a matter of minutes, compared with the conventional procedure of preparing Ag clusters from colloidal solutions, which can take up to a full day. In addition, colloidal aggregate preparations are not consistent in shape, contain additional interfering chemicals, and do not generate consistent SERS enhancement. Colloidal solutions require the addition of KCl or NaCl to increase the ionic strength and allow aggregation and cluster formation; we find no need to add KCl or NaCl to create SERS-active clusters in the holographic gelatin matrix. These holographic plates, prepared using simple, conventional procedures, can be stored in an inert environment and preserve SERS activity for several weeks after preparation.

  3. The MSCA Program: Developing Analytic Unicorns

    ERIC Educational Resources Information Center

    Houghton, David M.; Schertzer, Clint; Beck, Scott

    2018-01-01

    Marketing analytics students who can communicate effectively with decision makers are in high demand. These "analytic unicorns" are hard to find. The Master of Science in Customer Analytics (MSCA) degree program at Xavier University seeks to fill that need. In this paper, we discuss the process of creating the MSCA program. We outline…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Part, Florian; Zecha, Gudrun; Causon, Tim

    Highlights: First review on detection of nanomaterials in complex waste samples; focus on nanoparticles in solid, liquid and gaseous waste samples; summary of currently applicable methods for nanowaste detection and characterisation; limitations and challenges of characterising nanoparticles in waste.

    Abstract: Engineered nanomaterials (ENMs) are already extensively used in diverse consumer products. Along the life cycle of a nano-enabled product, ENMs can be released and subsequently accumulate in the environment. Material flow models also indicate that a variety of ENMs may accumulate in waste streams. Therefore, a new type of waste, so-called nanowaste, is generated when end-of-life ENMs and nano-enabled products are disposed of. In terms of the precautionary principle, environmental monitoring of end-of-life ENMs is crucial to allow assessment of the potential impact of nanowaste on our ecosystem. Trace analysis and quantification of nanoparticulate species is very challenging because of the variety of ENM types used in products and the low concentrations of nanowaste expected in complex environmental media. In this paper, the challenges of nanowaste characterisation and the analytical techniques applicable to nanowaste analysis are summarised, and recent case studies focusing on the characterisation of ENMs in waste streams are discussed. Most studies aim to investigate the fate of nanowaste during incineration, particularly via aerosol measurements, whereas detailed studies on the potential release of nanowaste during waste recycling processes are currently not available. In terms of suitable analytical methods, separation techniques coupled to spectrometry-based methods are promising tools to detect nanowaste and determine particle size distributions in liquid waste samples. Standardised leaching protocols can be applied to generate soluble fractions from solid wastes, while micro- and ultrafiltration can be used to enrich nanoparticulate species. Imaging techniques combined with X-ray-based methods are powerful tools for determining particle size and morphology and for screening elemental composition. However, quantification of nanowaste is currently hampered by the difficulty of differentiating engineered from naturally occurring nanoparticles. A promising approach to these challenges might be the application of nanotracers with unique optical properties or elemental or isotopic fingerprints. At present, there is also a need to develop and standardise analytical protocols for nanowaste sampling, separation and quantification. In general, more experimental studies are needed to examine the fate and transport of ENMs in waste streams and to derive reliable transfer coefficients and material flow models.

  5. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached within which protective action must begin. In keeping with nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, whether an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond-design-basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has a drawback: it is difficult to demonstrate the completeness of the timing test. The analysis technique has the complementary demerit of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers the instrument channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the proposed methodology plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
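
    The analysis half of the combined technique reduces to an allocation-and-sum check along the critical signal path, as in the minimal sketch below; the component names, times, and the 1000 ms requirement are hypothetical, not APR1400/OPR1000 design values.

        critical_path_ms = {
            "pressure transmitter (ramp-tested)": 300,
            "signal processing equipment":        150,
            "trip logic":                          50,
            "final actuation device":             200,
        }

        requirement_ms = 1000   # assumed analytical response time from the safety analysis

        total = sum(critical_path_ms.values())
        margin = requirement_ms - total
        print(f"total {total} ms vs requirement {requirement_ms} ms "
              f"({'OK' if margin >= 0 else 'VIOLATION'}, margin {margin} ms)")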

  6. Characterizing odors from cattle feedlots with different odor techniques

    USDA-ARS?s Scientific Manuscript database

    Odors from cattle feedlots negatively affect local communities. The purpose of this study was to characterize odors and odorants using different odor sampling techniques. Odors were characterized with field olfactometers (Nasal Ranger®), sensory techniques (GC-O) and analytical techniques (sorbent t...

  7. Note: Model identification and analysis of bivalent analyte surface plasmon resonance data.

    PubMed

    Tiwari, Purushottam Babu; Üren, Aykut; He, Jin; Darici, Yesim; Wang, Xuewen

    2015-10-01

    Surface plasmon resonance (SPR) is a widely used, affinity based, label-free biophysical technique to investigate biomolecular interactions. The extraction of rate constants requires accurate identification of the particular binding model. The bivalent analyte model involves coupled non-linear differential equations. No clear procedure to identify the bivalent analyte mechanism has been established. In this report, we propose a unique signature for the bivalent analyte model. This signature can be used to distinguish the bivalent analyte model from other biphasic models. The proposed method is demonstrated using experimentally measured SPR sensorgrams.
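
    For reference, the coupled nonlinear ODEs of the bivalent analyte model are integrated below in one common formulation (analyte A bridges one or two surface ligand sites B, and the SPR response is taken as proportional to the total bound analyte AB + AB2). The rate constants, analyte concentration, and site capacity are illustrative assumptions, and this is the forward model only, not the paper's identification signature.

        import numpy as np
        from scipy.integrate import solve_ivp

        ka1, kd1 = 1e5, 1e-2     # 1/(M s), 1/s : first binding step
        ka2, kd2 = 1e-3, 1e-3    # per-site, 1/s : second (bridging) step
        C, Bmax = 100e-9, 1.0    # analyte concentration [M], site capacity

        def rhs(t, y):
            ab, ab2 = y
            b = Bmax - ab - 2.0 * ab2          # free ligand sites
            return [ka1 * C * b - kd1 * ab - ka2 * ab * b + kd2 * ab2,
                    ka2 * ab * b - kd2 * ab2]

        sol = solve_ivp(rhs, (0.0, 600.0), [0.0, 0.0], dense_output=True)
        t = np.linspace(0.0, 600.0, 7)
        print(np.round(sol.sol(t)[0] + sol.sol(t)[1], 4))   # association-phase signal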

  8. Analytical Electrochemistry: Theory and Instrumentation of Dynamic Techniques.

    ERIC Educational Resources Information Center

    Johnson, Dennis C.

    1980-01-01

    Emphasizes trends in the development of six topics concerning analytical electrochemistry, including books and reviews (34 references cited), mass transfer (59), charge transfer (25), surface effects (33), homogeneous reactions (21), and instrumentation (31). (CS)

  9. Dissolving Bubbles in Glass

    NASA Technical Reports Server (NTRS)

    Weinberg, M. C.; Oronato, P. I.; Uhlmann, D. R.

    1984-01-01

    An analytical expression is used to calculate the time it takes for stationary bubbles of oxygen and carbon dioxide to dissolve in a glass melt. The technique is based on an analytical expression for bubble radius as a function of time, with the effects of surface tension included.

  10. On the petrological, geochemical, and geophysical characterization of a returned Mars surface sample and the impact of biological sterilization on the analyses

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A study was conducted: to identify those experiments that could and should be done on a returned Martian sample in order to characterize its inorganic properties; to evaluate, insofar as can be done, the effects of potential biological sterilization of the sample by heating prior to its return; to identify particular analytical techniques needing further improvement in order to make optimum use of a returned sample; and to identify experiments to be done on simulants, with and without sterilization, that better define the limits of information available about the planet from analyses of returned samples.

  11. Scattering Manipulation and Camouflage of Electrically Small Objects through Metasurfaces

    NASA Astrophysics Data System (ADS)

    Vellucci, S.; Monti, A.; Toscano, A.; Bilotti, F.

    2017-03-01

    In this paper, we discuss the intriguing possibility of tailoring the scattering response of an electrically small object for camouflage and illusion applications using metasurfaces. As a significant example, we focus our attention on the cylindrical geometry and derive the analytical conditions needed to camouflage the geometrical and electrical characteristics of dielectric and metallic cylinders coated with ideal metasurfaces. A closed-form expression of the camouflaging metasurface depending on the cylinder's characteristics is derived. Furthermore, the frequency behavior and the limitations of this technique are discussed with the aid of relevant examples. In order to overcome these limitations, a solution based on the use of lossy metasurfaces is proposed.

  12. Identification of novel peptides for horse meat speciation in highly processed foodstuffs.

    PubMed

    Claydon, Amy J; Grundy, Helen H; Charlton, Adrian J; Romero, M Rosario

    2015-01-01

    There is a need for robust analytical methods to support enforcement of food labelling legislation. Proteomics is emerging as a complementary methodology to existing tools such as DNA and antibody-based techniques. Here we describe the development of a proteomics strategy for the determination of meat species in highly processed foods. A database of specific peptides for nine relevant animal species was used to enable semi-targeted species determination. This principle was tested for horse meat speciation, and a range of horse-specific peptides were identified as heat stable marker peptides for the detection of low levels of horse meat in mixtures with other species.

  13. Stress corrosion cracking properties of 15-5PH steel

    NASA Technical Reports Server (NTRS)

    Rosa, Ferdinand

    1993-01-01

    Unexpected occurrences of failures due to stress corrosion cracking (SCC) of structural components indicate a need for improved characterization of materials and more advanced analytical procedures for reliably predicting structural performance. Accordingly, the purpose of this study was to determine the stress corrosion susceptibility of 15-5PH steel over a wide range of applied strain rates in a highly corrosive environment. The environment selected for this investigation was a highly acidified sodium chloride (NaCl) aqueous solution. The alloy selected for the study was 15-5PH steel in the H900 condition. The slow strain rate technique was selected to test the metal specimens.

  14. Mechanical behavior of precipitation hardenable steels exposed to highly corrosive environment

    NASA Technical Reports Server (NTRS)

    Rosa, Ferdinand

    1994-01-01

    Unexpected occurrences of failures due to stress corrosion cracking (SCC) of structural components indicate a need for improved characterization of materials and more advanced analytical procedures for reliably predicting structural performance. Accordingly, the purpose of this study was to determine the stress corrosion susceptibility of 15-5PH steel over a wide range of applied strain rates in a highly corrosive environment. The environment selected for this investigation was a 3.5 percent NaCl aqueous solution. The material selected for the study was 15-5PH steel in the H900 condition. The slow strain rate technique was used to test the metallic specimens.

  15. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  16. Skin microbiome: genomics-based insights into the diversity and role of skin microbes

    PubMed Central

    Kong, Heidi H.

    2011-01-01

    Recent advances in DNA sequencing methodology have enabled studies of human skin microbes that circumvent difficulties in isolating and characterizing fastidious microbes. Sequence-based approaches have identified greater diversity of cutaneous bacteria than studies using traditional cultivation techniques. However, improved sequencing technologies and analytical methods are needed to study all skin microbes, including bacteria, archaea, fungi, viruses, and mites, and how they interact with each other and their human hosts. This review discusses current skin microbiome research, with a primary focus on bacteria, and the challenges facing investigators striving to understand how skin micro-organisms contribute to health and disease. PMID:21376666

  17. Surface Characterization.

    ERIC Educational Resources Information Center

    Fulghum, J. E.; And Others

    1989-01-01

    This review is divided into the following analytical methods: ion spectroscopy, electron spectroscopy, scanning tunneling microscopy, atomic force microscopy, optical spectroscopy, desorption techniques, and X-ray techniques. (MVL)

  18. An Artificial Neural Networks Method for Solving Partial Differential Equations

    NASA Astrophysics Data System (ADS)

    Alharbi, Abir

    2010-09-01

    While many analytical and numerical techniques already exist for solving PDEs, this paper introduces an approach using artificial neural networks. The approach combines a standard numerical method, finite differences, with a Hopfield neural network; the resulting method is denoted Hopfield-finite-difference (HFD). The architecture of the nets, the energy function, the updating equations, and the algorithms are developed for the method. The HFD method has been used successfully to approximate the solution of classical PDEs, such as the wave, heat, Poisson, and diffusion equations, and of a system of PDEs. The software Matlab is used to obtain the results in both tabular and graphical form. The results are similar in accuracy to those obtained by standard numerical methods. In terms of speed, the parallel nature of Hopfield nets makes them easier to implement on fast parallel computers, while some numerical methods need extra effort for parallelization.
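
    For orientation, the sketch below implements only the classical explicit finite-difference half of the HFD combination (the heat equation u_t = u_xx on [0, 1] with zero boundaries, under the usual stability condition r = Δt/Δx² ≤ 1/2); the Hopfield-network energy-minimisation half described in the paper is omitted.

        import numpy as np

        nx, r, steps = 51, 0.4, 400
        dx = 1.0 / (nx - 1)
        dt = r * dx * dx

        x = np.linspace(0.0, 1.0, nx)
        u = np.sin(np.pi * x)                # initial condition

        for _ in range(steps):
            u[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
            u[0] = u[-1] = 0.0               # Dirichlet boundaries

        t = steps * dt
        exact = np.exp(-np.pi ** 2 * t) * np.sin(np.pi * x)
        print(abs(u - exact).max())          # small discretisation error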

  19. On-chip collection of particles and cells by AC electroosmotic pumping and dielectrophoresis using asymmetric microelectrodes.

    PubMed

    Melvin, Elizabeth M; Moore, Brandon R; Gilchrist, Kristin H; Grego, Sonia; Velev, Orlin D

    2011-09-01

    The recent development of microfluidic "lab on a chip" devices requiring sample sizes <100 μL has given rise to the need to concentrate dilute samples and trap analytes, especially for surface-based detection techniques. We demonstrate a particle collection device capable of concentrating micron-sized particles in a predetermined area by combining AC electroosmosis (ACEO) and dielectrophoresis (DEP). The planar asymmetric electrode pattern uses ACEO pumping to induce equal, quadrilateral flow directed towards a stagnant region in the center of the device. A number of system parameters affecting particle collection efficiency were investigated including electrode and gap width, chamber height, applied potential and frequency, and number of repeating electrode pairs and electrode geometry. The robustness of the on-chip collection design was evaluated against varying electrolyte concentrations, particle types, and particle sizes. These devices are amenable to integration with a variety of detection techniques such as optical evanescent waveguide sensing.

  20. Comparison of soil pollution concentrations determined using AAS and portable XRF techniques.

    PubMed

    Radu, Tanja; Diamond, Dermot

    2009-11-15

    Past mining activities in the area of Silvermines, Ireland, have resulted in heavily polluted soils. The possibility of pollution spreading to surrounding areas through dust blow-off poses a potential threat to local communities. Conventional environmental soil and dust analysis techniques are slow and laborious; consequently, there is a need for fast and accurate analytical methods that can provide real-time, in situ pollution mapping. Laboratory-based aqua regia digestion of soil samples collected in the area, followed by atomic absorption spectrophotometry (AAS) analysis, confirmed very high pollution levels, especially of Pb, As, Cu, and Zn. In parallel, samples were analyzed using portable NITON X-ray fluorescence (XRF) instruments, both radioisotope-powered and miniature-tube-powered, and their performance was compared. Overall, the portable XRF instruments gave excellent correlation with the laboratory-based reference AAS method.
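
    The method-comparison step amounts to correlating paired measurements and fitting a line, as in this minimal sketch; the Pb values below are invented for illustration, not the Silvermines data.

        import numpy as np

        aas = np.array([120.0, 540.0, 980.0, 2100.0, 4300.0])   # reference Pb, mg/kg
        xrf = np.array([135.0, 510.0, 1020.0, 2230.0, 4150.0])  # portable XRF Pb, mg/kg

        r = np.corrcoef(aas, xrf)[0, 1]
        slope, intercept = np.polyfit(aas, xrf, 1)
        print(f"r = {r:.3f}, XRF = {slope:.2f} * AAS + {intercept:.1f}")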
